Although I wasn’t able to participate in this chat directly, the moderators kindly allowed me to write the summary, which meant I had the opportunity to experience the idea exchange, albeit post chattum.
The chat kicked off with Marisa Constantinides (@Marisa_C) asking Anthony Ash (@ashowksi) and Angelos Bollas (@angelos_bollas) about their experiences with the Experimental Practice aspect of the Cambridge Delta. Anthony had looked at the use of Cuisenaire Rods for eliciting lexis and generating ideas. Angelos focused on the use of Dogme in the classroom.
Marisa then posed the following question to get the discussion under way:
“Don’t you think teachers are into the experimentation business anyway?”
David Petrie (@teflgeek) quickly responded that “to some extent” every lesson is indeed an experiment. However, whether or not that is true, the real question, as Marisa put it, is “how to be more organised about it.”
Shaun Wilden (@Shaunwilden) and Angelos both seemed to feel that not all teachers take an experimental approach to teaching. As Angelos put it, he and many other teachers did the “same things every day” until he covered Experimental Practice in Delta.
Based on Angelos’ assertion, Marisa suggested that “perhaps teachers who have some training or find out things online get involved in class trialling?”
Statistics and Research
The chat then developed towards action research and in-class research. Marisa pointed out that “ELT has difficulties with numerical, quantitative studies and data” and that “class research does not yield what we would call statistically significant results.” This makes Marisa “very wary and negative” towards Evidence-Based Teaching posts.
Patrick Andrews (@patrickelt) suggested that the quality of the research might be linked directly to the quality of reflection on experimental practice.
What followed was a range of relevant and important questions, including:
- How could we be more systematic?
- Does it need to be statistically significant?
- How do you define results?
- What constitutes a successful experiment?
- Why do ELT programmes lack modules on qualitative/quantitative research?
Although these questions were all thought-provoking, Shaun and Patrick posed two more excellent questions which flew in the face of the quantitative experimental approach:
- Does an experiment, say on using rods, need to be so in depth?
- Is there not a need to learn from experience as well as from systematic research?
Whether you agree with Shaun that Experimental Practice doesn’t need to be so formal, or you are on the other side of the fence, you will need to consider this point from David Petrie:

You need to develop a set of objective measures for every aspect of teaching.
Planning Experimental Practice & In-Class Research
Moving the chat along, Marisa asked how to go about planning successful in-class research and experiments.
Angelos suggested that the first step should be “to read the available literature” in the given area. Cherry M P (@cherrymp) was very supportive of this suggestion, in that not only does it mean you will be informed before carrying out your experiment but also that it provides you with insight into previously executed experiments, which you could then “replicate” in your own teaching context. Perhaps the results would be significantly different from the study you read due to the difference in environment.
Marisa highlighted that the Oxford ELT Journal is a source of accessible research for ELT professionals out there interested in reading up on the literature before experimenting.
Marisa also quickly pointed out that many academic journals are “dense” and “inaccessible” to most teachers in the field. However, David noted that:
If teachers develop a habit of reading any journal, then it will become “accessible”
The Action Research Cycle
On the basis of the research-focused direction the chat was taking, David Petrie brought in the concept of the Action Research Cycle, which is portrayed in the image below:
It was suggested that the difference between doing action research and blogging about one’s practice is only a question of audience, with bloggers having a wider audience.
Sophia Khan (@SophiaKhan4) highlighted that there are plenty of classroom-based researchers who are also bloggers and who “build bridges between research and practice”, such as Scott Thornbury.
Angelos then asked the participants if any were currently doing any action research or experimenting with different practices.
David responded that he does “every now and again” when he has time “between teaching engagements.”
As part of action research, Marisa suggested that what teachers could investigate is “learner reaction to an activity or method.” However, she quickly noted that this might not always be obvious or measurable. Perhaps someone can suggest a way to make this measurable?
Angelos noted that while Marisa’s suggestion is a great one, the research would depend a lot on the teacher’s ability “to set the new practice” rather than the practice itself.
Paul Gallantry (@pjgallantry) suggested that measuring learners’ reactions to new practices and methods could be extended to measuring teachers’ reactions to those methods and practices.
Jenny Ankenbauer (@jankenb2) suggested that a good way “to determine efficacy” in action research is “to collect feedback” from learners at set intervals. The assumption is that this would give the teacher a fairer insight into how the experiment is being received by the learners.
There was also the question of whether bloggers were more qualified to carry out action research in comparison to, say, their colleagues whose research remains within the four walls of the school.
However, Angelos posed an insightful question:
Is it necessary to repeat the same experiment over and over with different students?
Perhaps this is helpful in order to gain a fairer picture, or perhaps it would depend on the criteria of the experiment?
In response to this, David said that he sometimes gets his learners to do the experimenting themselves, with questions such as “Does this exam strategy work for you? Are you getting higher scores?” Marisa also supported the idea of repeating research.
Marisa then posed the question of how you go about collecting data and information. Sue Annan (@SueAnnan) suggested making “an observation instrument”, such as a questionnaire, which Angelos Bollas suggested could be used with both learners and colleagues.
Cherry M P made an excellent observation, in that “how” you collect data is closely linked to “what” data you want to collect, and provided the example that when looking at satisfaction, you need to have direct feedback from the learners.
Marisa then asked whether recording audio or video might be helpful in the data collection process, to which several participants responded positively.
David, however, pointed out that regardless of whether the recording is audio or video, a lot of action research takes “teacher focus off the learners and on to something else.”
This was a very thought-provoking chat, with many ideas about what to consider when experimenting and researching in the classroom, as well as some suggestions on how best to go about it.
If you have put any of the ideas here into practice or you have an experience you would like to share, why don’t you leave a comment below or tweet about it using the hashtag #ELTChat.