The last post discussed an advanced physical chemistry lab, and in this one I want to summarise more concrete plans for how we can move an early undergraduate lab online.
The key thing for us at the early undergraduate stage is teaching chemical technique, and getting students to think about recording data and drawing conclusions from experiments. An important factor is that students at this stage expect to be learning about technique: coming into university from school, their perception is that they want to learn chemical techniques, and lots of them.
A problem I have summarised in previous posts as the “swipey-wipe” effect rears its head here. A typical online lab involves students doing a simulation of sorts, getting information from that simulation, and writing up the report. While this likely has some benefit in terms of data processing, I am not sure how much it teaches about experimental technique, because a lot of simulations don’t reflect the tangible reality of laboratory work.
Anyway, with resolution rather than confusion on my mind for this post, I’m going to get straight to the template I envisage. This is heavily influenced by Joi Walker’s poster presented at CLEAR20 last week, as well as Lukas Kroos and Nicole Graulich’s poster on decision-making in experimental procedures, using video of those procedures (both posters are available on the CLEAR website). All academic credit goes to the work reported by those researchers, and I have linked to papers by Joi below.
Joi Walker’s work is based on getting students to make an academic argument and support it with evidence. So instead of getting students to play with a simulation and generate data off the bat, I am proposing a lab template that:
- Presents students with some data from a described experiment. There should be sufficient data there for students to make a claim.
- Students’ first task is to make a claim, based on the evidence presented. If you zoom in really close on Joi’s poster, you’ll see a whiteboard showing how she structures that (what is the guiding question; what is the claim; what is the argument presented; what is the evidence for that argument).
- Once students have made their claim, their next task is to decide what further evidence they need to support the claim. It is at this point that we might unleash a simulation on students. I personally prefer video (or at least photos of the real set-up), and Lukas’ poster from CLEAR shows a really exciting way of making that interactive – I am greedy for more on that work.
- We somehow provide students with data based on their requested further experiment. Students then use the videos provided to describe the experimental procedure – how they would do the experiment in reality if they were in the lab. This is a way to get students to meaningfully engage with any procedure videos etc. we share.
- Students combine the data from the “experiment” with the data they were originally provided, and review their claim. Their discussion outlines whether or not their claim was supported by the additional evidence.
- The writing up and presentation of the work needs some further thought (by me). Joi shared some lovely peer-review work, and Marcy Towns, who spoke at CLEAR, outlined a similar approach based around “Claim-Evidence-Reasoning”, which also involved peer review (a recording of that presentation will be online next week).
I think this approach ticks a lot of boxes for me. It removes the “game” aspect of running a simulation to get some data just for the sake of it, and instead turns it into a meaningful activity, with in-depth consideration by students of experimental procedure, a taster of experimental design, and practice in making judgements on data. It’s also feasible. (Of course, once we get students back in the laboratory after teaching them online, we will be packing in intensive laboratory competencies.)
Example in practice
To run through feasibility, I am thinking of some of our early-year experiments. Obviously this is more suited to general/physical chemistry labs, but one I had in mind was the typical iodine clock kinetics experiment. We could share some data based on a set of initial concentrations, either enough for students to deduce the order of reaction or leaving just one more piece of the jigsaw. They could make a claim and seek further data to either confirm the order or fill in the gap. It would be easy to integrate existing videos we have about the iodine clock, and easy to auto-generate data based on student requests (a sketch of what that might look like is below).
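As a minimal sketch of what “auto-generating data” could mean here, the snippet below simulates the time to the colour change for the persulfate iodine clock at student-chosen initial concentrations. The rate constant, the reaction orders, the thiosulfate concentration and the noise level are all illustrative assumptions for the sake of the example, not values from our actual experiment.

```python
import random

# Assumed rate law for the persulfate iodine clock:
#   rate = k * [I-]**a * [S2O8 2-]**b
# Orders (a = b = 1) and the rate constant below are illustrative only.
K = 0.011            # L mol^-1 s^-1 (made-up value for the sketch)
ORDER_IODIDE = 1
ORDER_PERSULFATE = 1

def clock_time(c_iodide, c_persulfate, c_thiosulfate=0.001, noise=0.05):
    """Simulated time (s) to the blue-black colour change.

    The colour appears once the thiosulfate is used up, so to a first
    approximation t = [S2O3 2-]0 / (2 * initial rate). Gaussian noise
    mimics experimental scatter.
    """
    rate = K * c_iodide**ORDER_IODIDE * c_persulfate**ORDER_PERSULFATE
    t = c_thiosulfate / (2 * rate)
    return round(t * random.gauss(1, noise), 1)

# A student "requests" runs at chosen initial iodide concentrations:
for c in (0.05, 0.10, 0.20):
    print(f"[I-] = {c:.2f} M  ->  t = {clock_time(c, c_persulfate=0.04)} s")
```

Under those assumed first-order dependences, doubling the iodide concentration roughly halves the clock time, which is exactly the kind of pattern students could test against their claim.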
The reporting and assessment side of things needs a bit more thought, but I’m certainly happier about this level of laboratory than I was when I finished writing the advanced lab post! In fact it really appeals because it means we can use this process to review our current early-year lab offering and improve it to include argumentation in future iterations, in person or online.
Check out Joi Walker’s publications on ADI:
Sampson, V., Grooms, J., & Walker, J. P. (2011). Argument-driven inquiry as a way to help students learn how to participate in scientific argumentation and craft written arguments: An exploratory study. Science Education, 95(2), 217-257.
Walker, J. P., Sampson, V., & Zimmerman, C. O. (2011). Argument-driven inquiry: An introduction to a new instructional model for use in undergraduate chemistry labs. Journal of Chemical Education, 88(8), 1048-1056.