I was invited to contribute to the TEAM conference in March but couldn’t make it in person, so I sent this screencast. It outlines my current thinking about the value of using a complex learning framework for laboratory education and what that means in terms of supporting student learning.
One of the difficulties students often raise is that the lab report they are required to produce is different for one section (not looking at anyone…) than it is for others. I think it is a fair comment. In Scotland and Ireland, students complete four-year undergraduate Bachelor courses, and the first year in these courses is usually a highly organised, well-oiled machine which is consistent in format across the year (it would be similar in nature to the “Gen Chem” courses internationally). So when a student enters Year 2, I think it must be quite a shock to find out about different sections, and that different sections have their own cultures.
One thing we have done this year is to agree on a common template for reports. Yes, I know Physical like to do it this way and Organic like it that way (Inorganic chemists don’t seem too fussy). Our agreed template tries to accommodate these differences by mentioning particular emphases in each component of the report, although not without compromise. The intention is that as students move from one section to another through their year, the feedback they get on a particular component of the report in one section in November is useful to them when they are doing a report for another section in March. Or rather, the clarity about the value of that feedback is better.
Once we had this in the bag, other things fell into place. The poster shows what is now in every Chemistry 2 laboratory manual and outside the laboratory. As well as the report assessment, we’ve harmonised how we treat pre-labs and what the expectations are in the lab. But we’ve also made clear (I hope!) how the current programme builds on Year 1 work, as well as outlining what is next. A key point is that each section (Inorganic, Physical, Organic) in the year is described in terms of its main focus (outcomes), showing students what the similarities and differences are. I think that this kind of information, which is often implicit, is useful to share with students. More importantly, it keeps staff focussed on considering the practical course as one course rather than three courses.
As I begin to think about next year’s manuals, I’ll happily hear any comments or suggestions!
Yesterday’s post discussed our recent work in thinking about how to build experimental design into the teaching laboratory. This post is related, but aims to think about the overall laboratory teaching curriculum.
I’ve been thinking about this and have Tina Overton’s mantra ringing in my head: what do we want the students to be at the end of it? So, what do we want students to be at the end of a practical curriculum? I think many of us will have varying answers, but there are a couple of broad themes, which we can assemble thanks to the likes of Tamir (1976), Kirschner and Meester (1988), and Carnduff and Reid (2003, or Reid and Shah, 2007, more accessible).
Tamir considers the aims of practical work should include – take a deep breath – skills (e.g., manipulative, inquiry, investigative, organizational, communicative), concepts (e.g., data, hypothesis, theoretical model, taxonomic category), cognitive abilities (e.g., critical thinking, problem solving, application, analysis, synthesis, evaluation, decision making, creativity), understanding the nature of science (e.g., the scientific enterprise, the scientists and how they work, the existence of a multiplicity of scientific methods, the interrelationships between science and technology and among various disciplines of science) and attitudes (e.g., curiosity, interest, risk taking, objectivity, precision, perseverance, satisfaction, responsibility, consensus and collaboration, confidence in scientific knowledge, self-reliance, liking science).1
Kirschner and Meester list the aims as being: to formulate hypotheses, to solve problems, to use knowledge and skills in unfamiliar situations, to design simple experiments to test hypotheses, to use laboratory skills in performing (simple) experiments, to interpret experimental data, to describe clearly the experiment and to remember the central idea of an experiment over a significantly long period of time.2
And Reid presents the desired outcomes in terms of four skill types: skills relating to learning chemistry, practical skills, scientific skills, and general (meaning transferable) skills.3
So we can see some commonalities, but each has a slightly different perspective. In trying to grapple with the aims of practical work, and to think about how they are introduced across a curriculum, I came up with the diagram below a few years ago, recently modified for the Scottish system (we have 5 years instead of 4). This model especially focuses on the concept of “nature of science”, which I consider to be the overarching aim of practical work, encompassing the concept of “syntactical knowledge” described in yesterday’s post.
The intention is that each year of the curriculum adds a new layer. Each year incorporates the year below, but includes a new dimension. So students in Year 3 will be exposed to Experimental Design (Familiar), but they’ll still be developing skills and exploring models/hypotheses.
I’ve shown this model to students at various stages, and they seem to like it. The sense of progression is obvious, and it is clear what the additional demand will be. In fact their reaction this year was so positive that it struck me that we should really share our curriculum design model (whatever it may be) with students, so there is clarity about expectation and demand. So I will include this model in lab manuals in future years. That way, it’s not just that each year is “harder” (or, as is often the case, not harder at all, just longer experiments) but the exact focus is identified. Students can see the ultimate target of the final year project, although I think that perhaps we should, with Tina in mind again, have something on the top platform stating the desired attributes on graduation.
I’d be interested in opinions on this model. One challenge it raises is how to make labs in the earlier years more interesting, and I think the intentional incorporation of interesting chemistry, decision making, and documenting skill development will help in that regard. Thoughts?!
- Tamir, P., The role of the laboratory in science teaching. University of Iowa: 1976.
- Kirschner, P. A.; Meester, M. A. M., The laboratory in higher science education: Problems, premises and objectives. Higher Education 1988, 17 (1), 81-98.
- (a) Carnduff, J.; Reid, N., Enhancing undergraduate chemistry laboratories: pre-laboratory and post-laboratory exercises. Royal Society of Chemistry: 2003; (b) Reid, N.; Shah, I., The role of laboratory work in university chemistry. Chemistry Education Research and Practice 2007, 8 (2), 172-185.
A great dilemma lies at the heart of practical education. We wish to introduce students to the nature and practices of scientific enquiry, as it might be carried out by scientists. Learning by mimicking these processes, it is argued, will imbue our students with an understanding of scientific approaches, and thus they will learn the practices of science. Often such approaches can be given within a particular real-life context, which can be motivating. I know this argument well, and indeed have advocated this framework.1
However, problems emerge. Let’s consider two.
The first is that these approaches often conflate learning how to do a particular technique with applying that technique to a particular scenario. In other words, students are expected to learn how to do something while simultaneously applying it in an unfamiliar scenario. This should set off your cognitive load alarm bells. People may argue that students learned how to use the UV/vis spectrometer in the previous semester when doing the Beer-Lambert law, so they should be able to use it now for this kinetics experiment. But my experience is that students don’t transfer those skills well, and unless you’ve really focussed on teaching them the actual technique (as opposed to using the technique in a particular study), relying on previous experimental experience is unwise.
Let’s park the cognitive load issue for a moment, and consider a deeper issue. In his wonderful essay, which should be compulsory reading for anyone setting foot in a teaching lab, Paul Kirschner discusses at length the epistemology of practical education (epistemology meaning the way knowledge is acquired).2 He writes that we need to distinguish between teaching science and doing science. Drawing on the work of Woolnough and Allsop,3 and of Anderson,4 he describes the substantive structure of science – the body of knowledge making up science – and the syntactical structure of science – the habits and skills of those who practice science. Anderson’s short book is a wonderful read: he describes this distinction as “science” and “sciencing”. In teaching about the syntactical structure, or “sciencing”, Kirschner argues with some force that a mistake is made if we aim to use science practical work to reassert the substantive knowledge; we should instead be explicitly teaching the process of sciencing – how these habits and skills are developed.
So: the previous two paragraphs have tried to summarise two issues that arise when laboratory education incorporates inquiry approaches: it often imposes unrealistic demands on students by requiring them to learn about a technique and apply it to an unfamiliar scenario simultaneously; and its focus is on doing science as if it were a realistic scenario, rather than on teaching how science is done.
An example in practice
How can such confusion manifest in practice? In our teaching labs, our Year 3 students used to complete several routine practicals, and then in their final few weeks complete an investigation. This approach has a lot going for it. Students get used to more advanced techniques in the first few expository experiments, and then, being familiar with Advanced Things, can launch into their investigation: an experiment they need to scope out, design, and conduct. As their last formal laboratory exercise, this would be a good connection to their research project in Year 5.
In practice, it was a bloodbath. Students found it inordinately difficult to take on experimental design, and had little concept of the scope of the experiment, or of whether what they were doing was on the right path. I think it is instructive to relate these observed problems to the issues described above. We had not taught students how to use the techniques in the scenario in which they would be required, and we had spent a long time telling them to verify known scientific facts, but not much about the actual processes involved in making these verifications.
Change was needed.
A few years ago at the Variety in Chemistry Education meeting in Edinburgh, Martin Pitt gave a 5-minute talk about a practice he had adopted: he gave students a chance to do a practical a second time. He found that even though everything else was the same, students in the second iteration were much more familiar with the equipment, had much greater understanding of the concept, and got much better quality data. This talk appealed to me very much at the time because (a) I was so impressed Martin was brave enough to attempt this (one can imagine the coffee room chat) and (b) it linked in very nicely with my emerging thought at the time about cognitive load.
So Martin is one piece of the jigsaw. A second piece takes us back to Kirschner’s essay. Must we, Michael? Yes, we must. At the end, Kirschner presents some strategies for practice. The essay is a tour de force, but compared to its main body, these strategies seem a bit meek. However, there, just above the footnotes, he describes the divergent laboratory approach, a compromise between the typical recipe approach (highly structured) and the experimental approach (highly unstructured):
“The last approach can be regarded as a realistic compromise between the experimental and the academic laboratories and is called the divergent laboratory (Lerch, 1971). In the divergent lab, there should be parts of the experiment that are predetermined and standard for all students, but there should be many possible directions in which the experiment can develop after the initial stage. It provides the student with tasks similar to those encountered in an open ended or project (experimental) lab within a framework that is compatible with the various restrictions imposed as a result of the wider system of instructional organisation.”
Martin’s simple experiment had shown that by allowing students time and space to consider an experiment, they demonstrated greater understanding of the experiment and a better ability to gather experimental data. The divergent laboratory approach is one with a solid but pragmatic grounding in education literature. So here is the plan:
Students complete a recipe laboratory as usual. They learn the approaches, the types of data that are obtained, the quirks of the experiment. We call this Part 1: it is highly structured, and has the purpose of teaching students how to gather that data as well as get some baseline information for…
…for a subsequent exploration. Instead of finishing this experiment and moving on to another recipe, students continue with this experiment. But instead of following a recipe now, they move on to some other aspect. We call this Part 2 (naming isn’t our strong point). This investigative component allows them to explore some additional aspect of the system they have been studying, or to apply what they have learned in the defined component to some new scenario. The key thing is that the students have learned how to do what they are doing and the scope of that experiment, and then move to apply it to a new scenario. We repeat this three times throughout the students’ time with us so that they become used to experimental design in a structured way. A problem with the old investigation model was that students eventually got some sense of what was needed, but never had the feedback loop to try it out again.
We call this approach unfinished recipes. We are giving students the start; the overall structure and scope, but the end depends on where they take it, how much they do, what variation they consider. There is still a lot of work to do (designing these experiments is hard). But lots of progress has been made. Students are designing experiments and approaches without direct recipes. They are learning sciencing. A colleague told me today that the turnaround has been remarkable – students are working in labs, are happy and know what they are doing.
YES THIS IS THE PHYSICAL CHEMISTRY LABORATORY NOW…
I’m very lucky to have the support of two fantastic demonstrators who were involved in the design of this approach, and a lab technician who is patient with my last-minute whims, as well as colleagues involved in designing the unfinished recipes.
- McDonnell, C.; O’Connor, C.; Seery, M. K., Developing practical chemistry skills by means of student-driven problem based learning mini-projects. Chemistry Education Research and Practice 2007, 8 (2), 130-139.
- Kirschner, P. A., Epistemology, practical work and academic skills in science education. Science & Education 1992, 1 (3), 273-299.
- Woolnough, B. E.; Allsop, T., Practical work in science. Cambridge University Press: 1985.
- Anderson, R. O., The experience of science: A new perspective for laboratory teaching. Teachers College Press, Columbia University: New York, 1976.
One of the first challenges that emerge when considering teaching in laboratories is to define the kind of environment we are teaching in, and what that means for student learning. Laboratories differ significantly from lectures in terms of environment. Lectures tend to follow a well-established pattern – highly organised material is presented to learners in a fixed setting. While modern lectures incorporate some kind of activity, the focus is usually on whatever material is being presented, and learners rarely have to draw on any additional knowledge or skills outside what is under immediate consideration. Furthermore, learners have time (and often tutorials) after lectures to reconsider the information presented in lectures.
Laboratory learning is much more complex for a variety of reasons. One is physical – the very space students are in when completing laboratory work can vary significantly depending on the type of experiment they are completing. A second is that the number of stakeholders involved increases: teaching assistants, technical staff, and additional academic staff each have a role to play in the delivery of the laboratory course.
Here, we will consider a further aspect: the complexity experienced by students. We can consider the laboratory as a complex learning environment (van Merrienboer, 2003), an environment with the following aims:
(i) Complex learning aims at the integration of knowledge, skills, and attitudes.
Learning in the laboratory involves three domains. The cognitive domain relates to the intellectual knowledge associated with experimental work, such as the underlying concepts of an experiment, the procedures involved for a piece of apparatus, or the ability to apply scientific reasoning to results observed. Students are required to draw on this knowledge as they work through their experiment. The psychomotor domain relates to the physical actions required in completing an experiment such as motor skills and coordination of tasks. Students are required to have basic proficiency in these tasks, and as they progress in capability, be able to adapt their approach when completing tasks in response to particular conditions. Finally, the affective domain considers the students’ emotional relationship with their experimental work such as their motivation to do well or the internalisation of the value of the task to their learning.
Because of the nature of laboratory learning, these three domains are active at the same time, and students have to draw on a range of aspects to work in this environment. Carrying out any experimental task involves drawing on knowledge about what that task is, including safety considerations, while actively completing the task, and doing so within the context of whatever their personal attitude toward the laboratory is. Managing learning within this complex environment requires a teasing out of the various factors involved, and an understanding of how best to address each one in turn, so that students are offered the chance to develop the capacity to integrate the tasks into the whole and carry out the work satisfactorily. Because of the time boundaries imposed on laboratory work, this is one of the greatest challenges we face in laboratory teaching.
(ii) Complex learning involves the coordination of qualitatively different constituent skills.
As well as bringing together learning from different domains, within the context of laboratory skills, students will need to be able to complete multiple component tasks as part of one overall task. An analogy is learning to drive. The process of driving requires knowledge of the use of each of the pedals, the gear stick, the steering wheel, etc., which can each be individually practised when not driving. In the process of driving, the driver needs to be able to coordinate the various individual tasks simultaneously. Parallels can be made with the chemistry laboratory, where students will need to complete several component tasks in the process of doing one overall task. This is difficult, and requires that the student is capable of each of the constituent tasks in advance of being required to complete the composite task.
(iii) Complex learning requires the transfer of what is learned to real settings
Preparing a laboratory programme which enables students to experience the challenges of drawing together constituent components described in (i) and (ii), above, lays the foundation for the third challenge for laboratory learning: the ability to transfer what is known to unfamiliar situations encountered in real situations. The context of what is “real” needs to be carefully managed within the curriculum – students embarking on an undergraduate research project will likely encounter real problems, but in the formal laboratory curriculum, care is needed to distinguish between simulated problems (where the teacher knows the preferred solution pathway) and actual problems, where the pathway is not clear. Given the number of complexities regarding learning discussed, it would clearly be a folly to require students to begin to consider real settings before teaching the pre-requisite capabilities of integrating knowledge, skills, and attitudes and coordination of tasks, described above. The laboratory curriculum therefore needs to be designed so that these capabilities are developed progressively, so that students develop the capacity to translate their learning to real situations.
- van Merrienboer, J. J. G.; Kirschner, P. A.; Kester, L., Taking the load off a learner’s mind: Instructional design for complex learning. Educational Psychologist 2003, 38 (1), 5-13.
Much of my work over the last year has focussed on pre-labs. In our research, we are busy exploring the role of pre-labs and their impact on learning in the laboratory. In practice, I am very busy making a seemingly endless amount of pre-lab videos for my own teaching.
These research and practice worlds collided when I wanted to answer the question: what makes for a good pre-lab? It’s taken a year of reading and writing and re-reading and re-writing to come up with some sensible answer, which is now published as a review.
There are dozens of articles about pre-labs, and the first task was to categorise these: what are others doing, and why are they doing it? We came up with a few themes, including the most common pair: to introduce the theory behind a lab and to introduce experimental techniques. So far so obvious. Time and again these reports – we gathered over 60, though there are likely more our search didn’t capture – highlighted that pre-labs had benefits, including unintended ones (such as a clear increase in confidence about doing labs).
Why were pre-labs showing this benefit? This was addressed more rarely in reports. Some work, including a nice recent CERP paper, described the use of an underpinning framework on which to base the design of pre-labs, meaning that the outcomes could be considered within that framework (in that case: self-regulation theory). But we were looking for something more over-arching; a framework for the design considerations of pre-labs that took account of the unique environment of learning in the laboratory.
We have opted to use the complex learning framework for thinking about learning in laboratories, for various reasons. It is consistent with cognitive load theory, which is an obvious basis for preparative work. It describes the learning scenario as one where several strands of activity are drawn together, and is ‘complex’ because this act of drawing together requires significant effort (and support). And it offers a clear basis for deciding the nature of the information that should be provided in advance of the learning scenario. Overall, it seemed a sensible ‘fit’ for thinking about laboratory learning, and especially for preparing for this learning.
What makes for a good pre-lab?
We drew together the learning from the many reports in the pre-lab literature with the tenets of the complex learning framework to derive some guidelines for those thinking about developing pre-laboratory activities. These are shown in the figure. A particular advantage of the complex learning framework is the distinction between supportive and procedural information, which aims to get to the nitty-gritty of the kind of content that should be incorporated into a pre-lab activity. Casual readers should note that “procedural” here is a little more nuanced than the “procedure” we think about in chemistry. We’ve elaborated a lot on this.
I hope that this review is useful – it has certainly been a learning experience writing it. The pre-print of the review is now available at http://dx.doi.org/10.1039/C7RP00140A and the final formatted version should follow shortly.
I’ve spent the last two weeks in Australia thanks to a trip to the Royal Australian Chemical Institute 100th Annual Congress in Melbourne. I attended the Chemistry Education symposium.
So what is keeping chemistry educators busy around this part of the world? There are a lot of similarities, but some differences. While we wrestle with the ripples of TEF and the totalitarian threat of learning gains, around here the acronym of fear is TLO: threshold learning outcomes. As I understand it, these are legally binding statements that university courses will ensure students graduate with the stated outcomes. Institutions are required to demonstrate that these learning outcomes are part of their programmes and to identify the level to which they are assessed. This all sounds very good, except individuals on the ground are now focussing on identifying where these outcomes are being addressed. Given that they are quite granular, this appears to be a huge undertaking and is raising questions like: where and to what extent is teamwork assessed in a programme?
This process does appear to have promoted a big interest in broader learning outcomes, with lots of talks on how to incorporate transferable skills into the curriculum, and some very nice research into students’ awareness of their skills. Badges are of interest here and may be a useful way to document these learning outcomes in a way that doesn’t need a specific mark. Labs were often promoted as a way of addressing these learning outcomes, but I do wonder how much we can use labs for learning beyond their surely core purpose of teaching practical chemistry.
Speaking of labs, there was some nice work on preparing for laboratory work and on incorporating context into laboratory work. There was (to me) a contentious proposal that there be a certain number of laboratory activities (such as titrations) that are considered core to a chemist’s repertoire, and that graduation should not be allowed until competence in those core activities be demonstrated. Personally I think chemistry is a broader church than that, and it will be interesting to watch that one progress. A round-table discussion spent a good bit of time talking about labs in light of future pressures of funding and space; and it does seem that we are still not quite clear about what the purpose of labs are. Distance education – which Australia has a well-established head start in – was also discussed, and I was really glad to hear someone with a lot of experience in this say that it is possible to generate a community with online learners, but that it takes a substantial personal effort. The lab discussion continued to the end, with a nice talk on incorporating computational thinking into chemistry education, with suggestions on how already reported lab activities might be used to achieve this.
Of course it is the personal dimension that is the real benefit of these meetings, and it was great to meet some faces old and new. Gwen Lawrie wasn’t on the program, as the announcement of her award of the Education Division Medal was kept secret for as long as possible. I could listen to Gwen all day, and her talk had the theme “Chasing Rainbows”, which captured so eloquently what it means to be a teacher-researcher in chemistry education, and in a landscape that continues to change. [Gwen’s publications are worth trawling] Gwen’s collaborator Madeline Schultz (a Division Citation winner) spoke about both TLOs and about the reflections of respected practitioners on their approaches to teaching chemistry – an interesting study using a lens of pedagogical content knowledge. From Curtin, I (re-)met Mauro Mocerino (whom I heard speak in Europe an age ago on clickers), who spoke here of his long-standing work on training demonstrators. Also from that parish, it was a pleasure to finally meet Dan Southam. I knew Dan only through others, as a man “who gets things done”, so it was lovely to meet him in his capacity as Chair of the Division and this symposium, and to see that his appellation rang true. And it was nice to meet Elizabeth Yuriev, who does lovely work exploring how students approach physical chemistry problems and on helping students with problem-solving strategies.
There were lots of other good conversations and friendly meetings, demonstrating that chemistry educators are a nice bunch regardless of location. I wasn’t the only international interloper; Aishling Flaherty from University of Limerick was there to spread her good work on demonstrator training – an impressive programme she has developed and is now trialling in a different university and a different country. And George Bodner spoke of much of his work in studying how students learn organic chemistry, and in particular the case of “What to do about Parker”. The memory of Prof Bodner sitting at the back of my talk looking at my slides through a telescopic eye piece is a happy one that will stay with me for a long time. Talk of organic chemistry reminds me of a presentation about the app Chirality-2, which covers lots of aspects of revising organic chemistry, and looked really great.
My slightly extended trip was because I had the good fortune to visit the research group of Prof Tina Overton, who moved to Melbourne a few years ago, joining native Chris Thompson in growing the chemistry education group at Monash. It was an amazing experience immersing in a vibrant and active research group, who are working on things ranging from student critical thinking, chemists’ career aspirations, awareness of transferable skills, and the process and effect of transforming an entire laboratory curriculum. I learned a lot as I always do from Tina and am extremely grateful for her very generous hosting. I leave Australia now, wondering if I can plan a journey in 2018 for ICCE in Sydney.
I have been reading quite a lot about why we do practical work. Laboratory work is a core component of the chemistry (science) curriculum but its ubiquity means that we rarely stop to consider its purpose explicitly. This leads to many problems. An interesting quote summarises one:
One of the interesting things about laboratories is that there has never been definite consensus about those serious purposes. Perhaps that is why they have remained popular: they can be thought to support almost any aim of teaching.1
Even within institutions, where there might be some prescription of the purpose in broad terms, different faculty involved in the laboratory may have different emphases, and subsequently the message about what the purpose of practical work is differs depending on who is running the lab on a given day.2
This matters for various reasons. The first is that if there is confusion about the purpose of practical work, then everyone involved will place their attention onto the part that they think is most important. Academics will likely consider overarching goals, with students developing scientific skills and nature of science aspects.3 Demonstrators will think about teaching how to use instruments or complete techniques. Students will follow the money, and focus on the assessment, usually the lab report, which means their time is best utilised by getting the results as quickly as possible and getting out of the lab.4 Everybody’s priority is different because the purposes were never made clear. As in crystalline.
The second reason for thinking about purposes is that without an explicit consideration of what the purposes of practical work are, it is difficult to challenge those purposes and consider their value. How many lab manuals open with a line similar to: “The purpose of these practicals is to reaffirm theory taught in lectures…”? The notion that practicals somehow supplement material taught in lectures has long come in for criticism, and has little basis in evidence. Laboratories are generally quite inefficient places to “teach” theory. Woolnough and Allsop argued vehemently for cutting the “Gordian Knot” between theory and practical, arguing that practical settings offer their own unique purpose and, rather than being subservient to theory work, complement it.5 Kirschner picks this argument up, describing science education in terms of substantive structure and syntactical structure. The former deals with the knowledge base of science, the latter with the acts of how we do science.6 Anderson had earlier distinguished between “science” and “sciencing”.7
Discussion therefore needs to focus on what this syntactical structure is – what is “sciencing”? Here, the literature is vast, and often contradictory. To make a start, we look to Johnstone who, with his usual pragmatism, distinguished between aims of practical work (what we set out to do) and objectives of practical work (what the students achieve).8 With this in mind, we can begin to have some serious discussion about what we want practical work to achieve in our curricula.
1. White, R. T., The link between the laboratory and learning. International Journal of Science Education 1996, 18 (7), 761-774.
2. Boud, D.; Dunn, J.; Hegarty-Hazel, E., Teaching in laboratories. Society for Research into Higher Education & NFER-Nelson: Guildford, Surrey, UK, 1986.
3. Bretz, S. L.; Fay, M.; Bruck, L. B.; Towns, M. H., What faculty interviews reveal about meaningful learning in the undergraduate chemistry laboratory. Journal of Chemical Education 2013, 90 (3), 281-288.
4. (a) DeKorver, B. K.; Towns, M. H., General chemistry students’ goals for chemistry laboratory coursework. Journal of Chemical Education 2015, 92 (12), 2031-2037; (b) DeKorver, B. K.; Towns, M. H., Upper-level undergraduate chemistry students’ goals for their laboratory coursework. Journal of Research in Science Teaching 2016, 53 (8), 1198-1215.
5. Woolnough, B. E.; Allsop, T., Practical work in science. Cambridge University Press: 1985.
6. Kirschner, P. A., Epistemology, practical work and academic skills in science education. Science & Education 1992, 1 (3), 273-299.
7. Anderson, R. O., The experience of science: A new perspective for laboratory teaching. Teachers College Press, Columbia University: New York, 1976.
8. Johnstone, A. H.; Al-Shuaili, A., Learning in the laboratory; some thoughts from the literature. University Chemistry Education 2001, 5 (2), 42-51.
Marcy Towns’ lovely paper from 2015 described the use of digital badges in higher education chemistry, specifically for the assessment of laboratory skills. This work was important. The concept of badges had been around for a while; when I first came across them while doing an MSc in e-learning back in 2010, laboratory work seemed an obvious place to use them. But while the technology promised a lot, there weren’t feasible systems in place to do it. And what exactly would you badge, anyway? And would undergraduate students really take a digital badge seriously?
Towns’ work was important for several reasons. On a systemic level, it demonstrated that issuing badges to a very large cohort of students in undergraduate laboratories was feasible. At Purdue, they used an in-house system called Passport to manage the badging process, from submission of videos, to viewing online for assessment, to the subsequent issuing of the badges. But more importantly for me, Towns’ work reasserted the notion of laboratory skills and competencies as something worth assessing in their own right. Skills weren’t being implicitly assessed via the quality of results or the yield of a reaction; instead, technique was assessed directly, by videoing a student as they demonstrated it. This is not something you come across very often in the education literature (some notable exceptions are pointed out in our paper).
This work answered two of the three questions I had about badges – there are systems in place (although as I discovered, Blackboard just about manages to do this, with a lot of creaking). And there is scope for badging laboratory skills – the concept of badging is built on demonstrable evidence, and videoing techniques is part of this. Whether students take badging seriously I think still needs to be answered. My own sense is that there will need to be a critical mass of badges – an obvious ecosystem where it is clear to students how they can progress, and our own work in this regard is extending into more advanced techniques.
Incorporating peer review
One of the great insights Towns and her students shared at a workshop at BCCE last summer was the notion of narration in demonstrating techniques. Early videos in their initial pilot studies were eerily silent, and it was difficult for them to know what the students’ understanding was as they completed a technique – why were they doing things in a particular way? So they built narration into the requirements of the demonstration. I think this is one of those things that is obvious in hindsight, but knowing it up front in our own implementation was invaluable.
We opted for a system where students would video each other rather than be videoed by a demonstrator, and where the narration would in effect be one peer telling the other how they were doing the technique. To facilitate a review of this at the end, the demonstrators in the project came up with the idea of a peer observation sheet (they designed them too – bonus points!). The whole set-up was designed to encourage dialogue – genuine interactions discussing the experimental technique, allowing for feedback based on the guidelines presented in the peer observation sheets. These acted as a framework on which the lab was run. Lord knows chemists like instructions to follow.
Feedback is then given in situ, and indeed if the student demonstrating feels after discussion that they would like to video the technique again, they can. This notion of quality, or exemplary work, is underscored by the exemplars provided to students in advance: pre-laboratory videos dedicated to correct display of technique. The whole framework is based on Sadler’s design for feedback, discussed… in the paper!
We’ve documented our on-going work on the project blog and the paper summarising the design and analysis of evaluation is now available in CERP. It is part of the special issue on transferable skills in the curriculum which will be published in the Autumn, primarily as we felt it developed digital literacy skills in addition to the laboratory work; students were required to submit a link to the video they hosted online, rather than the video itself. This is giving them control over their digital footprint.
Resources – digital badges, peer observation sheets, links to exemplar videos – are all freely available on the project website. I really think there is great scope for badges, and look forward to where this project will go next!
As part of our ongoing development of an electronic laboratory manual at Edinburgh, I decided this year to incorporate discussion boards to support students doing physical chemistry labs. It’s always a shock, and a bit upsetting, to hear students say that they spent very long periods of time on lab reports. The idea behind the discussion board was to support them as they were writing these reports, so that they could use the time they spent on them in a more focussed way.
The core aim is to avoid the horror stories of students spending 18 hours on a report, because if they are spending that time on it, much of it must be figuring out what the hell it is they are meant to be doing. Ultimately, a lab report is a presentation of some data, usually graphically, and some discussion of the calculations based on that data. That shouldn’t take that long.
The system set-up was easy. I had asked around and heard some good suggestions for external sites that did this well (I can’t remember it now, but one suggested by colleagues in physics allowed questions to be up-voted). But I didn’t anticipate so many questions that I would only be able to answer the most pressing, and I didn’t want “another login”, so I just opted for Blackboard’s native discussion board. Each experiment got its own forum, along with a forum for general organisational issues.
A postgrad demonstrator advised me to allow the posts to be made anonymously, and that seemed sensible. Nothing was being graded, and I didn’t want any reticence about asking questions. Even anonymously, some students apologised for asking what they deemed “silly” questions, but as in classroom scenarios, these were often the most insightful. Students were told to use the forum for questions, and initially, any questions by email were politely redirected to the board. In cases close to submission deadlines, I copied the essential part of the question, and pasted it to the board with a response. But once reports began to be due, the boards became actively used. I made sure in the first weekend to check in too, as this was likely going to be the time that students would be working on their reports.
The boards were extensively used. About 60 of our third years do phys chem labs at a time, and they viewed the boards over 5500 times in a six-week period. Half of these views were on a new kinetics experiment, which tells me as organiser that I need to review it. Second years have only just begun labs, and already 140 of them viewed the board 2500 times in a two-week period. The number of posts is of course nowhere near this, suggesting that most viewers are “lurkers”, and probably that most queries are common ones. Since students can post anonymously, I have no data on what proportion of students were viewing the boards. Perhaps it is one person going in lots, but given the widespread viewership across all experiments, my guess is it isn’t. The boards were also accessible to demonstrators (who correct all the reports), but I’ve no idea if they looked at them.
The reception from students has been glowing, so much so that it is the surprise “win” of the semester. (Hey, look over here at all these videos I made… No? Okay then!) Students have reported at school council, staff student liaison committees, anecdotally to me and other staff that they really like and appreciate the boards. Which of course prompts introspection.
Why do they like them? One could say that of course students will like them, I’m telling them the answer. And indeed, in many cases, I am. The boards were set up to provide clear guidance on what is needed and expected in lab reports. So if I am asked questions, of course I provide clear guidance. That mightn’t always be the answer, but it will certainly be a very clear direction to students on what they should do. But in working through questions and answers, I stumbled across an additional aspect.
One more thing
Everyone’s favourite detective was famous for saying: “oh, just one more thing”. I’ve found in the lab that students are keen and eager to know what purpose their experiment has in the bigger context – where it might be used in research, something of interest in it beyond the satisfaction of proving, once again, some fundamental physical constant. In honesty, it is a failing on our part, and of the “traditional” approach, that we don’t use this opportunity to inspire. So sometimes in responding to questions I would add in additional components to think about – one more thing – something to further challenge student thought, or to demonstrate where the theory or technique associated with an experiment we were doing is used in research elsewhere. My high point was when I came across a paper that used exactly our technique and experiment, published in RSC Advances this year. This then sparked the idea of how we can develop these labs further, the subject of another post.
Again I have no idea if students liked this or followed up these leads. But it did ease my guilt a little that I might not be just offering a silver spoon. It’s a hard balance to strike, but I am certainly going to continue with discussion boards for labs while I work it out.
In a small laboratory off the M25 is a man named Bob. And Bob is a genius at designing and completing reactions on a very small scale. Bob is greatly helped by Dr Kay Stephenson, Mary Owen and Emma Warwick.
I was invited to go down to CLEAPSS to see Bob in action, and to try out for myself some of the microscale chemistry he has been developing. I was interested because of a general interest in laboratory experiments and how we can expand our repertoire. But I found out a lot more than just smaller versions of laboratory experiments.
Safety and cost considerations first piqued Bob’s interest in microscale. The traditional laboratory Hofmann voltameter costs about £250, but the microscale version, including ingenious three-way taps to syringe out the separated gases, costs about £50. Thoughts about how to reduce copper oxide safely led him to a procedure that avoids the traditional problems with explosions. There’s also a very neat version using iron oxide, incorporating a magnet to show that iron forms.
Bob promised to show me 93 demonstrations in a morning (“scaled back from 94!”) and I worried on my way there that I would have to put on my polite smile after a while. But actually time flew, and as we worked through the (less than 93) experiments, I noticed something very obvious. This isn’t just about safety and cost. It has deep grounding in the scholarship of teaching and learning too.
What I remember from the session is not the apparatus, but the chemistry. Practical chemistry is difficult because we have to worry about setting up apparatus, and this can act as a distraction from the chemistry involved. However, the minimal, and often entirely absent, apparatus meant that we were just doing and observing chemistry. This particularly struck me when we were looking at conductivity measurements, using a simple meter made with carbon fibre rods (from a kite shop). This, along with several other experiments, used an ingenious idea of instruction sheets within polypropylene pockets (Bob has thought a lot about contact angles). The reaction beaker becomes a drop of water, and it is possible to explore some lovely chemistry: pH indicator colours, conductivity, precipitation reactions, producing paramagnetic compounds, all in this way. It’s not all introductory chemistry; we discussed a possible experiment for my third year physical chemists, and there is lots to do for a general chemistry first year lab, including a fabulously simple colourimeter.
One of the reasons chemistry is difficult to learn is because we have multiple ways of representing it. We can describe things as we view them, at the macroscopic scale: a white precipitate forms when we precipitate out chloride ions with silver ions. We can describe things at the atomic scale, describing the ionic movement leading to the above precipitation. And we can use symbolism, for example representing the ions in a diagram, or talking about the solubility product equation. When students learn chemistry, moving between these “domains” is an acknowledged difficulty. These three domains were described by Alex Johnstone, and we now describe this as Johnstone’s triangle.
One of my observations from the many experiments I carried out with Bob was that we can begin to see these reactions happening. The precipitation reactions took place over about 30 seconds as the ions from a salt at each side migrated through the droplet. Conductivity was introduced into the assumed un-ionised water droplet by shoving in a grain or two of salt. We are beginning to jump across representations visually. Therefore what has me excited about these techniques is not just laboratory work, but activities to stimulate student chatter about what they are observing and why. The beauty of the plastic sheets is that they can just be wiped off quickly with a paper towel before continuing on.
Bob knew I was a schoolboy chemist at heart. “Put down that book on phenomenology” I’m sure I heard him say, before he let me pop a flame with hydrogen and reignite it with oxygen produced from his modified electrolysis apparatus (I mean who doesn’t want to do this?!). I left the room fist-bumping the air after a finale of firing my own rocket, coupled with a lesson in non-Newtonian liquids. And lots of ideas to try. And a mug.
I want a CLEAPSS set to be developed in time for Christmas. In the meantime, you can find lots of useful materials at: http://science.cleapss.org.uk/.
The following was a pre-conference piece submitted to a Royal Society conference on assessment in practical science.
A modern laboratory education curriculum should embrace digital technologies with assessment protocols that enable students to showcase their skills and competences. With regards to assessment, such a curriculum should:
- incorporate the digital domain for all aspects related to experimental work; preparations, activities, reflections;
- provide a robust and valid assessment framework but with flexibility for individuality;
- emphasise to students the role of documenting evidence in demonstrating skills and competences by means of micro-accreditation, such as digital badges.
This paper summarises how some digital technologies can address the above points.
How can research into the use of digital technology in the assessment of experimental science improve the validity of assessment in the short, medium and long term?
Re-shifting the emphasis of assessment by means of e-assessment
Our use of digital technologies in everyday life has increased substantially in the last two decades. In contrast, laboratory education has remained stubbornly paper-based, with laboratory notebooks at the core of assessment protocols. This emphasis on a post-hoc report of work done, rather than a consideration of the work itself, means that the value of laboratory work has been distorted in favour of the process of preparing laboratory reports. Experimental work, and the demonstration of experimental skills and competences, is of secondary importance.
There are good reasons why emphasis has historically been on the laboratory report instead of laboratory work. Directly assessing experimental work, and indeed any input students have to the planning and completion of experimental work, is subjective. Issues also arise if laboratory work is completed in groups, for either pedagogic or resource reasons. Assigning individual marks is fraught with difficulty.
Digital technologies can provide a basis to address many of the concerns regarding validity that the above issues raise, and provide an opportunity to reposition what is considered to be important in terms of the goals and purpose of experimental science.
The completion of experimental work typically involves:
- Preparation: planning and preparing for work and making decisions on experimental approaches to be taken;
- Action: learning how to carry out work competently, demonstrating competence in experimental approaches, and accurately recording data and/or observations;
- Reflection: drawing conclusions from data, reporting of findings, and evaluation of approaches taken.
Incorporating the digital domain for all aspects of experimental work
Wikis and electronic laboratory notebooks are online document editing spaces that enable individual contributions to be documented and reviewed. Such platforms have been shown to allow the documentation of student thoughts and contributions to work, and as such they provide an excellent basis for recording the entire process (preparation, action, reflection) the student engages with while completing experimental work. Preparation can include a description of what equipment will be used and why, or thoughts on the purpose of the experiment. Action can be documented by recording experimental work completed, with the inclusion of data or observations in a variety of multimedia formats (text/photos/video/audio). Reflection can allow for a richer form of the typical lab report. In practice this means asking students to consider and review their experimental approach, so that the emphasis shifts away from the “right answer” (an often-cited criticism of students coming through a school laboratory curriculum) and towards a consideration of the approach taken.
Using traceability as a basis for validity of assessment
Validity is a core concern for a national high-stakes examination. Research to date on wikis has pointed to the advantages offered, including that student contributions are date-stamped and each individual contribution is logged; overall contributions to work can be tracked. Rubrics have been used effectively to assess student laboratory skills, although compiling rubrics requires a considerable investment in documenting the desired goals and expectations of any particular curriculum experiment or inquiry so that they can be easily assessed. A more flexible approach to documenting science work using wikis and electronic lab notebooks allows scope for individuality within an overall framework of requirements. However, this is an area that needs considerable and ongoing research.
There is a body of research discussing the use of virtual laboratories to mimic student experimentation, as they provide for more controlled and hence more robust assessment protocols. These should be resisted, as they remove students’ exposure to the situational and psychomotor demands that being in the laboratory brings. While virtual laboratories may play some role in summative assessment – for example in decision making – they will likely act as a distraction from the changes needed to engage with and document real hands-on work, as they again shift the focus of experimental science away from actual laboratory work.
Emphasis on experimental science and documenting competences
An advantage of refocusing on the documentation of processes is that students have an opportunity to showcase their own experimental skills. Digital badges have emerged as a way to accredit these, in what is known as “micro-accreditation”. Digital badges mimic the idea of Guides and Scouts badges by acknowledging achievements and competences in a particular domain. Examples could include badging students’ individual experimental skills (for example badges for pipetting, titrating, etc.) and higher-level badges, where students would need to draw on a range of competences already awarded and apply them to a particular scenario (for example an overall analysis where students would need to design the approach and draw on their technical competence in pipetting and titration). This enables students to document their own progress in an ongoing way, and allows them to reflect on any activities needed to complete a full set of badges on offer. This is an exciting area as it offers significant scope for expansion across the curriculum. Mobile learning platforms will open up new and interesting ways to develop these approaches.
Changing from paper-based to electronic media is not without difficulties. In terms of short, medium, and long-term objectives, an initial focus should begin with promoting the possibilities of documenting scientific work in school through the use of multimedia. This will develop a culture and expertise around the use of technical skills, and work towards a medium-term goal of developing a basis for documenting work in an online platform instead of on paper – emphasising the value of documenting evidence of processes. This can be complemented with the development of a suite of digital badges associated with expected experimental techniques and protocols. In the long term, this allows the consideration of assessment of laboratory work via wikis and electronic lab notebooks, using appropriate rubrics, which allow students to genuinely and accurately showcase their competence in experimental science in a much more meaningful and engaging way.
Contracts have been signed, so I am happy to say that I am writing a book on chemistry laboratory education as part of the RSC’s new Advances in Chemistry Education series, due for publication in mid-2017.
I’ve long had an interest in lab education, since stumbling across David McGarvey’s “Experimenting with Undergraduate Practicals” in University Chemistry Education (now CERP). Soon after, I met Stuart Bennett, now retired, from the Open University at a European summer school. Stuart spoke about lab education and its potential affordances in the curriculum. He was an enormous influence on my thinking in chemistry education, and on practical work in particular. We’d later co-author a chapter on lab education for a book for new lecturers in chemistry published by the RSC (itself a good example of the benefits of European collaboration). My first piece of published education research was based on laboratory work: a report in CERP on the implementation of mini-projects in the chemistry curriculum, completed with good friends and colleagues Claire Mc Donnell and Christine O’Connor. So I’ve been thinking about laboratory work for a long time.
Why a book?
A question I will likely be asking with increasing despair over the coming months is: why am I writing a book? To reaffirm to myself as much as anything else, and to remind me if I get lost on the way, the reasons are pretty straightforward.
My career decisions and personal interests over the last few years have meant that I have moved my focus entirely to chemistry education. Initially this involved sneaking in some reading between the covers of J. Mat. Chem. when I was meant to be catching up on metal oxide photocatalysis. But as time went on and thanks to the support of others involved in chemistry education, this interest became stronger. I eventually decided to make a break with chemistry and move into chemistry education research. (One of the nicest things for me personally about joining Edinburgh was that this interest was ultimately validated.)
So while my knowledge of the latest chemistry research is limited mainly to Chemistry World reports, one thing I do know well is the chemistry education research literature. And there is a lot of literature on laboratory education. But as I read it and try to keep on top of it, it is apparent that much of it falls into themes, and that by rethinking these themes a little and taking a curriculum design approach, some guiding principles for laboratory education can be drawn up. A compilation of such principles, offered as a roadmap or plan for laboratory education, might then be useful to others.
And this is what I hope to offer. The book is purposefully targeted at anyone responsible for taking a traditional university-level chemistry laboratory course and looking to change it. In reality, such change is an enormous task and, being pragmatic, needs to happen in phases. It’s tempting, then, to tweak and change bits based on some innovation presented at a conference or seen in a paper. But there needs to be an overall design for the entire student experience, so that incremental changes sum to a consistent whole. Furthermore, by offering a roadmap or overall design, I hope to empower the members of staff responsible for such change by giving them the evidence they may need to rationalise changes to colleagues. Everyone has an opinion on laboratory education! The aim is to provide evidence-based design approaches.
My bookshelves are groaning with excellent books on laboratory education. I first came across Teaching in Laboratories by Boud, Dunn and Hegarty-Hazel back in the days when I stumbled across McGarvey’s article. I still refer to it: even though it was published in 1986, it still carries a lot of useful material. Woolnough and Allsop’s Practical Work in Science is also excellent, crystal clear on the role and value of laboratory education and its distinction from the lecture-based curriculum. Hegarty-Hazel also edited The Student Laboratory and the Science Curriculum. Roger Anderson’s book The Experience of Science was published before I was born.
I have bought these now out of print books and several more second hand for less than the cost of a cup of coffee. I have learned lots from them, but am mindful that (justifiably) well-known and comprehensive as they are, they are now out of print and our university laboratories have not seen much change in the forty years since Anderson.
I am very conscious of this as I structure my own book. I suspect that books covering science laboratories at both secondary and tertiary level may simply be too broad. So this book focusses exclusively on chemistry in higher education.
Secondly, the book is very clearly directed at those implementing a new approach, those involved in change. Ultimately it is their drive, energy, and input that decides the direction of any changes that occur. I hope that by speaking directly to them, with a clear rationale and an approach based on up-to-date literature, it may ease the workload somewhat for those looking to rethink laboratory education in their curricula. Now I just need to actually write it.
Summer writing goals pic.twitter.com/8kYyotsKjQ
— Shit Academics Say (@AcademicsSay) July 25, 2016
I’ve compiled this list for another purpose and thought it might be useful to share here.
The following are publications I can find* from UK corresponding authors on chemistry education research, practice, and laboratory work relevant to HE since the beginning of 2015. There are lots of interesting finds and useful articles. Most are laboratory experiments and activities; some refer to teaching practice or underlying principles.
I don’t imagine this is a fully comprehensive list, so do let me know what’s missing. It’s in approximate chronological order from beginning of 2015.
- Surrey (Lygo-Baker): Teaching polymer chemistry
- Reading (Strohfeldt): PBL medicinal chemistry practical
- AstraZeneca and Huddersfield (Hill and Sweeney): A flow chart for reaction work-up
- Bath (Chew): Lab experiment: coffee grounds to biodiesel
- Nottingham (Galloway): PeerWise for revision
- Hertfordshire (Fergus): Context examples of recreational drugs for spectroscopy and introductory organic chemistry
- Overton (was Hull): Dynamic problem based learning
- Durham (Hurst, now at York): Lab Experiment: Rheology of PVA gels
- Reading (Cranwell): Lab experiment: Sonogashira reaction
- Edinburgh (Seery): Flipped chemistry trial
- Oaklands (Smith): Synthesis of fullerenes from graphite
- Manchester (O’Malley): Virtual labs for physical chemistry MOOC
- Edinburgh (Seery): Review of flipped lectures in HE chemistry
- Manchester (Wong): Lab experiment: Paternò–Büchi and kinetics
- Southampton (Coles): Electronic lab notebooks in upper level undergraduate lab
- UCL (Tomaszewski): Information literacy, searching
- St Andrews & Glasgow (Smellie): Lab experiment: Solvent extraction of copper
- Imperial (Rzepa): Lab experiment: Asymmetric epoxidation in the lab and molecular modelling; electronic lab notebooks
- Reading (Cranwell): Lab experiment: Wolff Kishner reaction
- Imperial (Rzepa): Using crystal structure databases
- Leeds (Mistry): Inquiry based organic lab in first year – students design work up
- Manchester (Turner): Molecular modelling activity
- Imperial (Haslam & Brechtelsbauer): Lab experiment: vapour pressure with an isosteniscope
- Imperial (Parkes): Making a battery from household products
- Durham (Bruce and Robson): A corpus for writing chemistry
- Who will it be…?!
*For those interested, the Web of Science search details are reproduced below. Results were filtered to remove non-UK papers, conference proceedings and editorials.
ADDRESS: ((united kingdom OR UK OR Scotland OR Wales OR England OR (Northern Ireland))) AND TOPIC: (chemistry) AND YEAR PUBLISHED: (2016 OR 2015)
Literature on laboratory education over the last four decades (and more, I’m sure) has a lot to say on the role of practical work in undergraduate curricula. Indeed Baird Lloyd (1992) surveys opinions on the role of practical work in North American General Chemistry syllabi over the course of the 20th century and opens with this delicious quote, apparently offered by a student in 1928 in a $10 competition:
Chemistry laboratory is so intimately connected with the science of chemistry, that, without experimentation, the true spirit of the science cannot possibly be acquired.
I love this quote because it captures so nicely the sense that laboratory work is at the heart of chemistry teaching – its implicit role in the teaching of chemistry is unquestionable. And although it has been questioned, repeatedly, over the following decades, not many today would advocate a chemistry syllabus that did not contain laboratory work.
I feel another aspect of our consideration of chemistry labs is often unchallenged, and needs to be. That is the notion that chemistry laboratories are in some way a proving ground for what students come across in lectures; that they provide an opportunity for students to visualise and see for themselves what the teacher or lecturer was talking about. Or, more laudably, to “discover” a particular relationship for themselves by following a controlled experiment. Didn’t believe in class that an acid and an alcohol make an ester? Well, now you are in labs, you can prove it. Can’t imagine that vapour pressure increases with temperature? Then come on in – we have just the practical for you. Faraday said that he was never able to make a fact his own without seeing it. But then again, he was a great demonstrator.
A problem with this on an operational level, especially at university, and especially in the physical chemistry laboratory, is that it is near impossible to schedule practicals so that they follow on from the introduction of theory in class. This leads to the annual complaint from students that they can’t do the practical because they haven’t done the theory. Your students are saying this; if you haven’t heard them, you need to tune your surveys.
It’s an entirely understandable sentiment from students, because we situate practicals as a subsidiary of lectures. But this is a false relationship, for a variety of reasons. The first is that if you accept a model whereby you teach students chemistry content in lectures, why is there a need to supplement this teaching with a re-teaching of a sub-set of topics, arbitrarily chosen based on the whim of a lab course organiser and the size of a department’s budget? Secondly, although we aim to re-teach, or hit home some major principle again in lab work, we don’t really assess that. We might grade students’ lab reports and give feedback, but it is not relevant to them, as they won’t need to know it again in that context. The lab report is done. And finally, the model completely undermines the true role of practical work and the value it can offer the curriculum.
A different model
When we design lecture courses, we don’t really give much thought to the labs that will go with them. Lecture course content has evolved rapidly to keep up to date with new chemistry; lab development is much slower. So why not the other way around? Why not design lab courses independent of lectures? Lecture courses are one area of the curriculum – typically where its content is learned; laboratory courses are another. And what might their role be?
Woolnough and Allsop (1985) make a clear and convincing argument for cutting the “Gordian knot” between theory and practice, and instead advocate a syllabus that has three aims:
- developing practical skills and techniques.
- being a problem-solving chemist.
- getting a “feel for phenomena”.
The detail of how this can be done is the subject of their book, but involves a syllabus that has “exercises, investigations, and experiences”. To me these amount to the “process” of chemistry. On a general level, I think this approach is worth consideration as it has several impacts on teaching and learning in practice.
Impacts on teaching and learning
Cutting the link between theory and practice means that there is no longer a need to examine students’ understanding of chemistry concepts by proxy. Long introductions, much hated by students, which aim to get the student to understand the theory behind the topic at hand by rephrasing what is given to them in a lab manual, are obsolete. A properly designed syllabus removes the need for students to have had lectures in a particular topic before a lab course. Pre-lab questions can move away from being about random bits of theory and focus on the relationships in the experiment. There is no need for pointless post-lab questions that try to squeeze in a bit more theory.
Instead, students will need to approach the lab with some kind of model for what is happening. This does not need to be the actual equations they learn in lectures; with some thought, they may be able to draw on prior knowledge to inform that model. Of course, the practical will likely involve some aspect of what they cover or will cover in lectures, but at the stage of doing the practical, it is the fundamental relationship they are considering and exploring. Approaching the lab with a model of a relationship (clearly I am in phys chem labs here!) and exploring that relationship better reflects the nature of science, and focusses students’ attention on the study in question. Group discussions and sharing data become more meaningful. Perhaps labs could even inform future lectures rather than rely on past ones! A final advantage is the reassertion of practical skills and techniques as a valuable aspect of laboratory work.
A key point here is that the laboratory content is appropriate for the level of the curriculum, just as it is when we design lectures. This approach is not advocating random discovery – quite the opposite. But free of the bond with associated lectures, there is scope to develop a much more coherent, independent, and more genuinely complementary laboratory course.
Baird W. Lloyd, The 20th Century General Chemistry Laboratory: its various faces, J. Chem. Ed., 1992, 69(11), 866-869.
Brian Woolnough and Terry Allsop (1985) Practical Work in Science, Cambridge University Press.
How do we prepare students for practical skills they conduct in the laboratory?
Practical skills involve psychomotor development, as they typically involve handling chemicals, glassware, and instrumentation. But how do we prepare students for this work, and do we give them enough time to develop these skills?
Farmer and Frazer analysed 126 school experiments (from the old O-level Nuffield syllabus) with a view to categorising practical skills, and came up with some interesting results. Acknowledging that some psychomotor tasks include a cognitive component (they give the example of manipulating the air-hole collar of a Bunsen burner while judging the nature of the flame for the task at hand), they identified 65 psychomotor tasks and 108 cognitive tasks in the experiments studied. Some of these psychomotor tasks are defined as having a key role, in that the success of the experiment depends on the successful completion of that task, reducing the number of psychomotor tasks to 44. Many of these key tasks were required in only a few experiments, so the set was further reduced to the frequent key tasks – those occurring in more than 10 experiments. The 14 frequent key tasks subsequently identified are described in their table below.
Thus of the 65 psychomotor tasks listed, only 14 are defined as frequent key tasks, limiting the opportunities for pupils to develop the skills associated with completing them. Indeed the paper goes on to show that, in an assessment of 100 pupils, ability to correctly complete the practical tasks was very poor, which the authors attribute to the design of the syllabus and the limited opportunity to do practical work.
This article prompts me to think again: how do we prepare students for the laboratory skills aspect of practical work? I think the most common approach is to demonstrate immediately in advance of the student completing the practical, explaining the technique or the apparatus and its operation. However, demonstration puts students in the mode of observer; they are watching someone else complete an activity, rather than conceptualising their own completion. It also relies on the quality of the demonstrator, and is subject to local hazards, such as time available, ability to see and hear the demonstration, and so on. Therefore, there may be benefit in shifting this demonstration to pre-lab, allowing students time to become accustomed to a technique and its nuances.
Such pre-labs need to be carefully designed, and actively distinguished from any pre-lab information focussing on theory, which has a separate purpose. At Edinburgh, two strategies are planned.
The first is on the development of core introductory laboratory skills: titrations involving pipetting and buretting; preparing standard solutions, including using a balance; and setting up Quickfit glassware to complete a distillation. Pre-lab information is provided to students in the form of videos demonstrating each technique, with key steps in each procedure highlighted in the video. Students will be required to demonstrate each of the three procedures to their peers in the laboratory, while a peer uses a checklist to ensure that all aspects of the task are completed appropriately. The purpose here is to incorporate preparation, demonstration, and peer review into the learning of core lab skills, as well as to set in mind early in students’ university careers the correct approach and the appropriate glassware to use for basic laboratory techniques. The approach includes students videoing their peers as part of the review process using mobile phones, and the video recording will subsequently be used as evidence for issuing students with a digital badge for that technique (more on that at the project homepage).
The second approach is to develop the laboratory manual beyond its traditional textbook format to be an electronic laboratory manual, with pre-lab demonstrations included. More on that project to come soon.
In designing pre-lab activities for skills development, the aim is to move beyond “just demonstrating” and to get students thinking through the approaches they will take. The reason for this is guided by work done by Beasley in the late 1970s. Beasley drew from the literature of physical education to consider the development of psychomotor skills in chemistry. He studied the concept of mental practice as a technique to help students prepare for the laboratory. Mental practice is based on the notion that physical activity requires mental thought, and thus mentally or introspectively rehearsing an activity prompts neural and muscular responses. Students were assigned to groups where they conducted no preparation, physical preparation, mental preparation, or both physical and mental preparation. They were tested before and after completing a lab on volumetric analysis. Beasley reported that students entering college from school were not proficient in completing volumetric analysis, based on the accuracy of their results. Furthermore, there was no significant difference in the post-test scores of the treatment groups (which were all better than those of students who did no preparation), suggesting that mental preparation was as effective as physical preparation.
A. Farmer and M. J. Frazer, Practical Skills in School Chemistry, Education in Chemistry, 1985, 22, 138.
W. Beasley, The Effect of Physical and Mental Practice of Psychomotor Skills on Chemistry Student Laboratory Performance, Journal of Research in Science Teaching, 1979, 16(5), 473.
J. B. Oxendine, Physical education, in R. B. Singer (Ed.), The Psychomotor Domain: Movement Behavior, Philadelphia: Lea and Febiger, 1972.
S. De Meo, Teaching Chemical Technique: A Review of the Literature, Journal of Chemical Education, 2001, 78(3), 373.
S. De Meo, Gazing at the Hand: A Foucaultian View of the Teaching of Manipulative Skills to Introductory Chemistry Students in the United States and the Potential for Transforming Laboratory Instruction, Curriculum Inquiry, 2005, 35, 3.
There is something about reading old educational literature that is simultaneously reaffirming and depressing. Reading a point of view from over three decades ago that confirms your current thinking belies the notion that you are jumping on “the latest fad”, while the fact that it is still an issue for discussion three decades later makes you wonder about the glacial rate of change in educational approaches.
Education in Chemistry published a series of articles on practical work by Alex Johnstone. This article from 1982 sets the scene:
It is not uncommon in undergraduate laboratories to see students working as fast as possible just to get finished, with little or no thought about what they are doing.
Students, he argues, see practical work as “an intellectual non-event”. Johnstone used this article to elaborate on his hypothesis that practical classes presented a significant challenge to students, because the idea being taught is simultaneously needed at the start of the practical to organise the information presented in the class. In the lab, they need to recall the theory of the lab, remember how to use apparatus or read new written instructions about apparatus, develop new lab skills, listen to verbal instructions, and process whatever experimental data the experiment produces. It is difficult to discern which of this information is immediately important, and which is incidental.
The result is that students will follow the laboratory procedure like a recipe, without any intellectual engagement. Or they might take unnecessary care so that they won’t regret any experimental actions later – for example using an analytical balance when only a rough estimate was needed. Or they might go all British Bake Off Technical Challenge and just look around and copy others, without knowing why they are doing what they are mimicking.
Johnstone makes an interesting point which I don’t think I have seen elsewhere. When we lecture, we start off from a single point, elaborating with examples and developing connections. However, in practical work, students are exposed to all information at once, and must navigate their own way through to find the main point, often obscured. He used the idea of a pyramid and inverted pyramid to model these approaches.
How then can this information overload be alleviated? Johnstone provides some examples, including
- making the point of the experiment clear;
- considering the arrangement of the instruction manual so that it is clear what information is preliminary, peripheral, and/or preparatory;
- ensuring the experiment has not acquired confusing or irrelevant aspects – this resonated with me from experience: some experiments involve students making a series of dilutions and then performing experiments with this series. Students spend so much time considering the process of dilution (preparatory) that it becomes the main focus, rather than the main purpose of the experiment. This involves thinking about the goals of the experiment. If it is required that students should know how to prepare a dilution series (of course it is), then that should have primary prominence elsewhere;
- ensuring necessary skills are acquired before exposure to investigative experiments.
A new manual
In 1990, Michael Byrne, from what was then Newcastle Polytechnic, decided to put Johnstone’s ideas into practice and reported on a new first year manual with the following design considerations:
- Experiments included a brief introduction with key information needed to understand instruments and the principle behind the experiment.
- Objectives were explicitly stated, indicating exactly what the student should be able to do as a result of carrying out the experiment.
- The language of procedures was simplified and laid out in the sequence of what the student was meant to do.
- Information was provided on how to present results and what the discussion should cover. Students were prompted as to how they should query their results.
These ideas aren’t exactly what we would consider radical now, but students were tested after five weeks and those using the revised manual scored significantly higher in tests about techniques and results of experiments than those who had the old manuals.
In the 1990s, Johnstone continued his attempts to effect change in our approach to practical teaching. In 1990, he discussed the use of student diaries to record views on lab experiences during students’ second year at university. The diary consisted of short response questions that the student completed after each practical. The responses were analysed iteratively, so that the experiments could be grouped into layers of criticism. Unsurprisingly, physical chemistry experiments came out as the most unpopular. My poor subject.
Why were they unpopular? Johnstone analysed the “load” of each experiment – the amount of information they had to process, recall, digest and interrelate in the three hour period. I’ve reproduced his table below:
The total load of the physical chemistry experiments is much greater than that of the inorganic or organic labs, primarily due to theoretical aspects. Furthermore, those physical lab experiments which were subject to most criticism had a theory load of 40, compared to the average physical lab theory load of 33. The result of this was evident in the student diaries: comments such as “not learned anything” and “no satisfaction” indicate that the students had not engaged with the experiment in any way.
Practical measures for practical work
In 1991, in the final article of this trio, Johnstone reported how he addressed the issues arising from the study of load in the manuals. Five strategies were considered:
- Noise reduction in the lab (noise meaning extraneous information): clearly labelled solutions, provided without need for further dilution unless required as part of experiment, highlighting where a particular procedure may differ from what students have previously experienced.
- Noise reduction in the manual: clearer layout of manual, with consistent icons, diagrams of types of balance beside instructions, etc. (this is 1991, people…)
- Allowing for practice: design of the overall practical course so that required skills are progressively developed.
- Time for thought: requiring students to prepare some of the procedure in advance of the lab as part of their pre-practical work – e.g. amounts to be measured out, etc.
- Time to piece it together: arranging the lab programme so that skills are developed in one week, used in a second week, and applied in a third week in a more open-ended lab that took about half an hour of lab time.
Johnstone’s trio of papers shows an impressive sequence: developing a theory as to why something has gone wrong, testing that theory with some analysis, and grounding an educational approach in the findings. It’s one of the reasons I admire his work so much.
Michael S. Byrne, More effective practical work, Education in Chemistry, 1990, 27, 12-13.
A. H. Johnstone and A. J. B. Wham, The demands of practical work, Education in Chemistry, 1982, 19, 71-73.
A. H. Johnstone and K. M. Letton, Investigating undergraduate laboratory work, Education in Chemistry, 1990, 27, 9-11.
A. H. Johnstone and K. M. Letton, Practical measures for practical work, Education in Chemistry, 1991, 28, 81-83.
“Although the majority of scientific workers utilize photography for illustrative purposes, a survey of the literature shows that only a limited number fully appreciate its usefulness as a means for recording data.”
So wrote GE Matthew and JI Crabtree in a 1927 article in the Journal of Chemical Education. Photography has come on since then, when they had to caution readers on the properties and limitations of photographic emulsion for quantitative purposes. Now it is much simpler, and there are many applications of photography using a mobile phone camera and a suitable app. I’ve summarised five of these below.
1. Colorimetric Analysis
This is a paper I have written about before. It essentially allows a Beer-Lambert plot to be performed from a mobile phone picture of a series of solutions of different concentrations of a coloured dye. The practical is extended to Lucozade. The original paper suggests the use of PC imaging software, but good results can be obtained with a mobile phone RGB colour determination app such as RGB Camera. Some more detail on that process is in the earlier blog post.
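For anyone curious how the calibration arithmetic works, here is a minimal sketch in Python. The channel values and concentrations are invented for illustration (a real run would use the values reported by the RGB app); the key step is converting a channel intensity into an absorbance-like quantity via A = −log10(I/I0):

```python
import math

# Hypothetical green-channel intensities read from a phone RGB app for a
# blank (water) and four dye dilutions -- illustrative numbers only.
blank_green = 250
greens = [200, 160, 128, 102]        # channel value per solution
concs = [0.02, 0.04, 0.06, 0.08]     # assumed concentrations, mol/dm3

# Channel-based "absorbance": A = -log10(I / I0), treating the channel
# value as a stand-in for transmitted intensity.
absorb = [-math.log10(g / blank_green) for g in greens]

# Least-squares fit gives the gradient of the Beer-Lambert plot.
n = len(concs)
mx, my = sum(concs) / n, sum(absorb) / n
slope = sum((x - mx) * (y - my) for x, y in zip(concs, absorb)) / \
        sum((x - mx) ** 2 for x in concs)
intercept = my - slope * mx

# An unknown (e.g. diluted Lucozade) is then read off the line.
unknown_A = -math.log10(140 / blank_green)
unknown_conc = (unknown_A - intercept) / slope
print(round(slope, 2), round(unknown_conc, 3))
```

The same few lines work for any colour channel; in practice you pick the channel complementary to the solution colour, as that gives the largest response.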
2. Colorimetry for chemical kinetics
This recently published paper extends the idea outlined above and uses colorimetry for kinetic analysis. The experiment is the hydrolysis of crystal violet with hydroxide ions. The set-up is similar to that described above, except the camera is set to acquire images every 10 s automatically. The authors describe the analysis protocol well, and suggest a mechanism for reducing data analysis time. The app mentioned here (for Android) is Camera FV-5 Lite. The supplementary information has detailed student instructions for image analysis. A very clever idea.
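As a sketch of the analysis step (using simulated absorbances rather than the paper’s data), the pseudo-first-order treatment amounts to fitting ln(A) against time:

```python
import math

# Simulated absorbances for crystal violet fading, sampled every 10 s,
# assuming pseudo-first-order decay with k = 0.005 s^-1 (illustrative
# values, not the paper's data).
k_true = 0.005
times = [10 * i for i in range(10)]   # s
A0 = 0.80
absorb = [A0 * math.exp(-k_true * t) for t in times]

# For a pseudo-first-order reaction, ln(A) vs t is linear with slope
# -k_obs, so a least-squares fit recovers the rate constant.
lnA = [math.log(a) for a in absorb]
n = len(times)
mt, ml = sum(times) / n, sum(lnA) / n
slope = sum((t - mt) * (y - ml) for t, y in zip(times, lnA)) / \
        sum((t - mt) ** 2 for t in times)
k_obs = -slope
print(round(k_obs, 4))
```

With real images, each absorbance would come from the RGB values of a frame, exactly as in the Beer-Lambert example; the fit itself is unchanged.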
3. A variation on flame photometry for testing for sodium in sea water and coconut water
This is a really excellent idea, where the flame test is monitored by recording a video on the mobile phone. Stock saline solutions from 20 to 160 mg/dm3 were prepared and used to build a calibration curve. The flame colour was recorded on video. To do this, the phone was fixed approximately 40 cm from the flame, with a white background 40 cm in the opposite direction. Distilled water was sprayed into the flame to record the blank, followed by the calibration solutions and the analytes (sea water and coconut water). The videos were replayed to find the point at which the light was most intense. Again, the authors go into quite complex PC imaging analysis; a simpler option would be to pause the video at the point of greatest intensity and take a phone screenshot for analysis. RGB data can be obtained, subtracting the baseline (distilled water). I want to do this one!!
4. Determining amino-acid content in tea leaf extract – using microfluidic analysis
Students prepare a microfluidic device using a wax pen on filter paper. A 2% ninhydrin solution is prepared (full details in paper) and this is used as the sensor on the filter paper. Tea is boiled and extracted, and small drops are added to the microfluidic wells. After a picture is taken, the RGB data allow for analysis of the glutamic acid present. Again the authors suggest desktop software, but there is no reason why an app can’t be used. The set-up involves ninhydrin and tin(II) chloride, so is probably best for university students.
The microfluidic device is interesting though for students at all levels. Essentially any pattern can be drawn on filter paper with a wax pen. The paper is heated to 135 °C for 30 seconds, and the wax melts through the paper, creating hydrophobic walls. The main author also has a just published RSC Advances paper where the microfluidic devices are prepared using an inkjet printer, and used as a glucose assay, so this is right on the cutting edge.
5. More microfluidics: analysis of Cu2+ and Fe2+ using colorimetry
Another paper on microfluidics, but this one more applicable for the school classroom. Microfluidic arrays are prepared by cutting designs into Parafilm sheets and enclosing them between paper, and then aluminium foil, before passing through a laminator. Analysis as before is by RGB determination of the spots formed, again the paper’s SI gives a good overview of the analysis protocol for students.
A nice paper in J Chem Ed just out (behind the paywall, but contacting the author usually works). It introduces the concept of paper-based diagnostics for the analysis of small amounts of sample. Essentially, a small piece of paper cut out using a decorative paper punch is pre-treated with indicator and allowed to dry. Once dried, this can then be used to test a sample for acidity or whatever else is of interest. A drop of analyte is transferred to the paper using a swab or capillary tube.
The paper and supplementary information provide elaborate details of context-based forensic scenarios, along with instructor and student directions (which are detailed but need a good edit). There’s lots of good stuff too on the scientific inquiry. The indicators test for “cyanide” – thankfully just hydroxide ions – and creatine. (I’m not sure I agree with the principle of pretending it’s cyanide, but that’s another matter.)
Developing the idea:
It struck me that it could be interesting to develop this idea further, and get students at upper level/early undergraduate to develop a more elaborate matrix for testing a particular range of analytes – e.g. anions. Any suggestions welcome. What could be adsorbed and dried onto paper that would give a colour change with a drop of analyte? Clever chemists: hear the call!
The powers that be have decided that the oxidation reaction using Cr(VI) is now too dangerous for our school students to carry out, and have proposed to replace this reaction with the oxidation of phenylmethanol (benzyl alcohol) to benzoic acid. Some details about this reaction are below.*
Oxidation of Phenylmethanol
Phenylmethanol (1 mL) is added to a 100 mL conical flask. A solution of potassium permanganate (25 mL, 0.2 M) and sodium carbonate (0.5 g) are added and the solution is heated at about 60 °C for 20 minutes on a water bath. The solution turns brown on heating. After cooling slightly, a few drops of conc. HCl are added until the solution is acidified, i.e. until no more fizzing is apparent. The solution clarifies, leaving a brown residue of manganese dioxide. Finally, sodium sulfite (a few drops of saturated solution) is added until the solution clears. Benzoic acid precipitates as the solution cools. This is filtered off using a Hirsch funnel and weighed when dry (0.6 g, 50%).
About the reaction
The reaction involves oxidation with MnO4–. The mechanism is not well understood, but the permanganate provides the oxygen atoms necessary for the oxidation. The oxidation state of Mn changes over the course of the reaction. The initial solution of permanganate will be dark purple. As it oxidises the phenylmethanol, it is itself reduced to Mn4+, precipitating out as MnO2 – the brown powder visible as the reaction progresses. Sodium carbonate is added so that the reaction proceeds under the necessary alkaline conditions. The Mn4+ is then reduced to soluble Mn2+ using sodium sulfite, which allows the solid product to be easily isolated by filtration and the filtrate to be disposed of safely. Benzoic acid is sparingly soluble in water, and as the reaction cools, it will precipitate out.
Purpose of each reagent
- KMnO4: This is the oxidising agent, which will oxidise phenylmethanol to benzoic acid. In the process, it is itself reduced.
- Na2CO3: This increases the pH of solution, so that it is alkaline. This is necessary for the reaction to proceed as indicated.
- HCl: This is used to acidify the solution. In doing so it neutralises the sodium carbonate. HCl also protonates the benzoate anion, forming benzoic acid, so that it will precipitate out of solution when the reaction cools.
- Na2SO3: This is used to reduce the MnO2 precipitate so that it can be solubilised.
The density of phenylmethanol is 1.05 g/mL, so the mass used in the reaction is 1.05 g. As the molecular mass of phenylmethanol is 108.1 g/mol, this means that 9.7 x 10-3 mol of reactant were used, and hence 9.7 x 10-3 mol of product (122.1 g/mol) are expected. This corresponds to a theoretical yield of 1.19 g. Using the above conditions, a yield of 0.6 g (50%) was obtained.
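The yield arithmetic above can be checked in a few lines of Python:

```python
# Reproducing the theoretical-yield arithmetic for the oxidation of
# phenylmethanol (benzyl alcohol) to benzoic acid.
volume_ml = 1.0    # phenylmethanol used, mL
density = 1.05     # g/mL
M_alcohol = 108.1  # g/mol, phenylmethanol
M_acid = 122.1     # g/mol, benzoic acid
actual_g = 0.6     # isolated benzoic acid, g

mass_g = volume_ml * density      # 1.05 g
mol = mass_g / M_alcohol          # ~9.7e-3 mol (1:1 stoichiometry)
theo_g = mol * M_acid             # ~1.19 g theoretical yield
percent = 100 * actual_g / theo_g # ~50%, as quoted in the procedure
print(round(theo_g, 2), round(percent, 1))
```

The 1:1 stoichiometry is the only assumption here: one mole of alcohol gives one mole of acid, with the permanganate supplying the oxygen.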
Melting point of benzoic acid: 122 °C.
*Thanks to Gráinne Hargaden, Claire Mc Donnell, Maria Sheehan, Marie Walsh.