Two related themes emerged for me from the Methods in Chemistry Education Research meeting last week: confidence and iteration.
Let’s start where we finished: Georgios Tsaparlis’ presentation gave an overview of his career studying problem solving. This work emerged from Johnstone’s remarkable findings on working memory and mental demand (M-demand).1,2 Johnstone proposed a simple rule: if the demands of a task were within the capacity of working memory, students would be able to process the task; if not, they would find it difficult. This proposal was borne out by plots of performance against complexity (demand), which showed a substantial drop in performance at the point where M-demand exceeded working memory capacity, and these findings seeded a remarkable amount of subsequent research.
However, things are seldom as simple as they seem and Tsaparlis’ work involved studying this relationship in different areas of chemistry – looking, for example, at how students solve problems in organic chemistry compared to physical chemistry, and the effect of the type of question. Each study was an iteration, an investigation of another aspect of this multi-dimensional jigsaw, aiming to make a little bit more sense each time. Sometimes the results led to an ill-fitting piece, with data being consigned to a desk drawer for a few years until further study allowed it to be explored in a new light. Towards the end of this arc of work, he began to move away from linear modelling, where we look at the strength of individual aspects on an overall hypothesis, to more complex models such as the “random walk”. It is another iteration.
The point to make here is that no single study said: this is how students solve equilibrium questions. Rather, each study added a little more to a particular model of how students solve problems. Indeed, Keith Taber outlined in his Ethics workshop the importance of context and situation in publishing results. Things are rarely definitive and usually context dependent.
For me this is reassuring. Just like Johnstone’s “ON-OFF” findings for working memory, there is a fear that one either is able to do education research or one isn’t; in responding to Suzanne Fergus’ pre-meeting prompts, which guided her talk on writing research questions, several participants indicated that “confidence” was one of the barriers to getting involved in education research. I remember talking to an eminent chemistry professor who said something along the lines of “never look back!” – to publish what you know to be your best understanding (and instrumentation) at a given time, accepting that further studies and analysis might lead to more understanding.
While this advice probably wasn’t intended to be as carefree as I present it here, there will always be one more publication, one better approach, one alternative understanding. The task, then, is to continually inform and develop our understanding of what it is we wish to understand. The action research cycles outlined more formally by Orla Kelly in her presentation facilitate this, although of course one might complete several cycles before considering publication. But I think iterations happen naturally as any research study progresses. Graham Scott illustrated this nicely in his presentation, with later publications adding further depth to earlier ones. Stewart Kirton discussed building this iteration into the design of research instruments.
Our task as education researchers, then, is to ensure that we are publishing to the best of our understanding and with good intent – that we believe what we are saying at a particular time is an honest overview of our understanding of our study, at that time, in a given context.
Our task as practitioners is to move on from the duality of things that “work” and things that “don’t work”. The education literature is not unique in tending to publish more positive results than negative ones, so when looking for the “best” way to do something, a keen reader may soon become overwhelmed, and even frustrated with the literature for its lack of clarity. A task for those attending MICER, then, is not necessarily translating research into practice – a common call – but rather communicating a greater awareness of the process of education research, along with how to meaningfully interpret outcomes so that they may be used – or not – in our teaching of chemistry.
1. J. Chem. Educ., 1984, 61, 847.
2. Education in Chemistry, 1986, 23, 80–84.