Rounding up the peer review and digital badge project

Marcy Towns’ lovely paper from 2015 described the use of digital badges in higher education chemistry, specifically for the assessment of laboratory skills. The concept of badges had been around for a while. When I first came across them while doing an MSc in E-Learning back in 2010, laboratory work seemed an obvious place to use them. But while the technology promised a lot, there weren’t feasible systems in place to do it. And what exactly would you badge, anyway? And would undergraduate students really take a digital badge seriously?

Towns’ work was important for several reasons. On a systems level, it demonstrated that issuing badges to a very large cohort of students in undergraduate laboratories was feasible. At Purdue, they used an in-house system called Passport to manage the badging process, from submission of videos, to viewing online for assessment, to the subsequent issuing of the badges. But more importantly for me, Towns’ work reasserted the notion of laboratory skills and competencies as something worth assessing in their own right. Skills were not implicitly assessed via the quality of results or the yield in a reaction; instead, the technique itself – videoed as the student demonstrated it – was directly assessed. This is not something you come across very often in the education literature (some notable exceptions are pointed out in our paper).

This work answered two of the three questions I had about badges – there are systems in place (although, as I discovered, Blackboard just about manages to do this, with a lot of creaking), and there is scope for badging laboratory skills – the concept of badging is built on demonstrable evidence, and videoing techniques is part of this. Whether students take badging seriously, I think, still needs to be answered. My own sense is that there will need to be a critical mass of badges – an obvious ecosystem where it is clear to students how they can progress – and our own work in this regard is extending into more advanced techniques.

Incorporating peer review

One of the great insights Towns and her students shared at a workshop at BCCE last summer was the notion of narration in demonstrating techniques. Early videos in their initial pilot studies were eerily silent, and it was difficult for them to know what the students’ understanding was as they completed a technique – why were they doing things in a particular way? So they built narration into the requirements of the demonstration. I think this is one of those things that is obvious in hindsight, but knowing it up front in our own implementation was invaluable.

We opted for a system where students would video each other rather than be videoed by a demonstrator, with the narration being, in effect, one peer telling the other how they were doing the technique. To facilitate a review of this at the end, the demonstrators on the project came up with the idea of a peer observation sheet (they designed them too – bonus points!). The whole set-up was designed to encourage dialogue – genuine interactions discussing the experimental technique, allowing for feedback based on the guidelines presented in the peer observation sheets. These acted as a framework on which the lab was run. Lord knows chemists like instructions to follow.

Feedback is then given in situ, and indeed if the student demonstrating feels after discussion that they would like to video the technique again, they can. This notion of quality, or exemplary work, is underscored by the exemplars provided to students in advance: pre-laboratory videos dedicated to the correct display of each technique. This whole framework is based on Sadler’s design for feedback, discussed… in the paper!

We’ve documented our ongoing work on the project blog, and the paper summarising the design and evaluation is now available in CERP. It is part of the special issue on transferable skills in the curriculum, which will be published in the autumn, primarily because we felt the project developed digital literacy skills in addition to the laboratory work; students were required to submit a link to a video they hosted online, rather than the video itself, giving them control over their digital footprint.


Resources – digital badges, peer observation sheets, links to exemplar videos – are all freely available on the project website.  I really think there is great scope for badges, and look forward to where this project will go next!



Using the Columbo approach on Discussion Boards

As part of our ongoing development of an electronic laboratory manual at Edinburgh, I decided this year to incorporate discussion boards to support students doing physical chemistry labs. It’s always a shock, and a bit upsetting, to hear students say that they spent very long periods of time on lab reports. The idea behind the discussion board was to support them as they were working on these reports, so that they could use that time in a more focussed way.

The core aim is to avoid the horror stories of students spending 18 hours on a report, because if they are spending that long, much of the time must go on figuring out what the hell it is they are meant to be doing. Ultimately, a lab report is a presentation of some data, usually graphically, and some discussion of the calculations based on that data. That shouldn’t take so long.

Setting Up

The system set-up was easy. I had asked around and heard some good suggestions for external sites that did this well (I can’t remember the name now, but one suggested by colleagues in physics allowed questions to be up-voted). But I didn’t anticipate so many questions that I would need to answer only the most pressing, and I didn’t want “another login”, so I just opted for Blackboard’s native discussion board. Each experiment got its own forum, along with a forum for general organisational issues.


A postgrad demonstrator advised me to allow posts to be made anonymously, and that seemed sensible. Nothing was being graded, and I didn’t want any reticence about asking questions. Even anonymously, some students apologised for asking what they deemed “silly” questions, but as in classroom scenarios, these were often the most insightful. Students were told to use the forum for questions, and initially any questions sent by email were politely redirected to the board. In cases close to submission deadlines, I copied the essential part of the question and pasted it to the board with a response. Once reports began to fall due, the boards became actively used. I made sure to check in over the first weekend too, as this was likely to be the time students would be working on their reports.

The boards were extensively used. About 60 of our third years do physical chemistry labs at a time, and they viewed the boards over 5500 times in a six-week period. Half of these views were on a new kinetics experiment, which tells me as organiser that I need to review it. The second years have just begun labs, and already 140 of them viewed the board 2500 times in a two-week period. The number of posts is of course nowhere near this, suggesting that most views are from “lurkers”, and that most queries are common ones. Since students can post anonymously, I have no data on what proportion of students were viewing the boards. Perhaps it is one person going in lots, but given the widespread viewership across all experiments, my guess is it isn’t. The boards were also accessible to demonstrators (who correct all the reports), but I’ve no idea if they looked at them.


The reception from students has been glowing, so much so that it is the surprise “win” of the semester. (Hey, look over here at all these videos I made… No? Okay then!) Students have reported at school council, staff student liaison committees, anecdotally to me and other staff that they really like and appreciate the boards. Which of course prompts introspection.

Why do they like them? One could say that of course students will like them, I’m telling them the answer. And indeed, in many cases, I am. The boards were set up to provide clear guidance on what is needed and expected in lab reports. So if I am asked questions, of course I provide clear guidance. That mightn’t always be the answer, but it will certainly be a very clear direction to students on what they should do. But in working through questions and answers, I stumbled across an additional aspect.

One more thing

Me, when asked an electrochemistry question

Everyone’s favourite detective was famous for saying: “oh, just one more thing”. I’ve found in the lab that students are very keen to know what purpose their experiment has in the bigger context – where it might be used in research, something of interest in it beyond the satisfaction of proving, once again, some fundamental physical constant. In honesty, it is a failing on our part, and in the “traditional” approach, that we don’t use this opportunity to inspire. So sometimes in responding to questions, I would add in additional components to think about – one more thing – something to further challenge student thought, or to demonstrate where the associated theory or technique in some experiment we were doing is used in research elsewhere. My high point was when I came across an experiment that used exactly our technique and experiment, published in RSC Advances this year. This then sparked the idea of how we can develop these labs more – the subject of another post.

Again, I have no idea if students liked this or followed up these leads. But it did ease my guilt a little that I might not just be offering a silver spoon. It’s a hard balance to strike, but I am certainly going to continue with discussion boards for labs while I work it out.



A tour around Johnstone’s Triangle

In a small laboratory off the M25 is a man named Bob. And Bob is a genius at designing and completing reactions on a very small scale. Bob is greatly helped by Dr Kay Stephenson, Mary Owen and Emma Warwick.

I was invited to go down to CLEAPPS to see Bob in action, and to try out for myself some of the microscale chemistry he has been developing. I was interested to see it because of a general interest in laboratory experiments and how we can expand our repertoire. But I found out a lot more than just smaller versions of laboratory experiments.

Safety and cost considerations first piqued Bob’s interest in microscale. The traditional laboratory Hofmann voltameter costs about £250, but the microscale version, including ingenious three-way taps to syringe out the separated gases, costs about £50. Thoughts about how to reduce copper oxide safely led him to a procedure that avoids the traditional problems with explosions. There’s also a very neat version using iron oxide, incorporating the use of a magnet to show that iron forms.

Electrochemical production of chlorine leading to subsequent production of iodine and bromine. Copper crystals form on the electrode.

Bob promised to show me 93 demonstrations in a morning (“scaled back from 94!”) and I worried on my way there that I would have to put on my polite smile after a while. But actually time flew, and as we worked through the (less than 93) experiments, I noticed something very obvious. This isn’t just about safety and cost. It has deep grounding in the scholarship of teaching and learning too.

Cognitive Load

What I remember from the session is not the apparatus, but the chemistry. Practical chemistry is difficult because we have to worry about setting up apparatus, and this can act as a distraction from the chemistry involved. However, the minimal, and often complete absence of, apparatus meant that we were just doing and observing chemistry. This particularly struck me when we were looking at conductivity measurements, using a simple meter made with carbon-fibre rods (from a kite shop). This, along with several other experiments, used an ingenious idea of instruction sheets within polypropylene pockets (Bob has thought a lot about contact angles). The reaction beaker becomes a drop of water, and it is possible to explore some lovely chemistry this way: pH indicator colours, conductivity, precipitation reactions, producing paramagnetic compounds. It’s not all introductory chemistry; we discussed a possible experiment for my third-year physical chemists, and there is lots to do for a general chemistry first-year lab, including a fabulously simple colourimeter.

Designing a universal indicator.

Johnstone’s Triangle

One of the reasons chemistry is difficult to learn is that we have multiple ways of representing it. We can describe things as we view them, at the macroscopic scale: a white precipitate forms when we precipitate out chloride ions with silver ions. We can describe things at the atomic scale, describing the ionic movement leading to the above precipitation. And we can use symbolism, for example representing the ions in a diagram, or talking about the solubility product equation. When students learn chemistry, moving between these “domains” is an acknowledged difficulty. These three domains were described by Alex Johnstone, and we now describe this as Johnstone’s triangle.

Johnstone’s triangle (from U. Iowa Chemistry)

One of my observations from the many experiments I carried out with Bob was that we can begin to see these reactions happening. The precipitation reactions took place over about 30 seconds as the ions from the salts at each side migrated through the droplet. Conductivity was introduced into the assumed un-ionised water droplet by shoving in a grain or two of salt. We are beginning to jump across representations visually. Therefore what has me excited about these techniques is not just laboratory work, but activities to stimulate student chatter about what they are observing and why. The beauty of the plastic sheets is that they can be wiped off quickly with a paper towel before continuing on.

Reaction of ammonia gas (centre) with several solutions including HCl with universal indicator (top right) and copper chloride (bottom right)

Bob knew I was a schoolboy chemist at heart. “Put down that book on phenomenology” I’m sure I heard him say, before he let me pop a flame with hydrogen and reignite it with oxygen produced from his modified electrolysis apparatus (I mean who doesn’t want to do this?!). I left the room fist-bumping the air after a finale of firing my own rocket, coupled with a lesson in non-Newtonian liquids. And lots of ideas to try. And a mug.

I want a CLEAPPS set to be developed in time for Christmas. In the meantime, you can find lots of useful materials at:


Using digital technology to assess experimental science

The following was a pre-conference piece submitted to a Royal Society conference on assessment in practical science.


A modern laboratory education curriculum should embrace digital technologies with assessment protocols that enable students to showcase their skills and competences. With regards to assessment, such a curriculum should:

  • incorporate the digital domain for all aspects related to experimental work: preparation, activities, reflection;
  • provide a robust and valid assessment framework but with flexibility for individuality;
  • emphasise to students the role of documenting evidence in demonstrating skills and competences by means of micro-accreditation, such as digital badges.

This paper summarises how some digital technologies can address the above points.

How can research into the use of digital technology in the assessment of experimental science improve the validity of assessment in the short, medium and long term?

Re-shifting the emphasis of assessment by means of e-assessment

Our use of digital technologies in everyday life has increased substantially in the last two decades. In contrast, laboratory education has remained stubbornly paper-based, with laboratory notebooks at the core of assessment protocols. This emphasis on a post-hoc report of work done, rather than a consideration of the work itself, means that the value of laboratory work has been distorted in favour of the process of preparing laboratory reports. Experimental work, and the demonstration of experimental skills and competences, is of secondary importance.

There are good reasons why the emphasis has historically been on the laboratory report instead of the laboratory work. Directly assessing experimental work, and indeed any input students have into the planning and completion of experimental work, is subjective. Issues also arise if laboratory work is completed in groups, for either pedagogic or resource reasons: assigning individual marks is fraught with difficulty.

Digital technologies can provide a basis to address many of the concerns regarding validity that the above issues raise, and provide an opportunity to reposition what is considered to be important in terms of the goals and purpose of experimental science.

The completion of experimental work typically involves:

  • Preparation: planning and preparing for work and making decisions on experimental approaches to be taken;
  • Action: learning how to carry out work competently, demonstrating competence in experimental approaches, and accurately recording data and/or observations;
  • Reflection: drawing conclusions from data, reporting of findings, and evaluation of approaches taken.

Incorporating the digital domain for all aspects of experimental work

Wikis and electronic laboratory notebooks are online document-editing spaces that enable individual contributions to be documented and reviewed. Such platforms have been shown to allow the documentation of students’ thoughts and contributions to work, and as such they provide an excellent basis for recording the entire process (preparation, action, reflection) the student engages with while completing experimental work. Preparation can include a description of what equipment will be used and why, or thoughts on the purpose of the experiment. Action can be documented by recording experimental work completed, with the inclusion of data or observations in a variety of multimedia formats (text/photos/video/audio). Reflection can allow for a richer form of the typical lab report. In practice this means asking students to consider and review their experimental approach, so that the emphasis shifts away from the “right answer” (an often-cited criticism of students coming through a school laboratory curriculum) and more towards a consideration of the approach taken.

Using traceability as a basis for validity of assessment

Validity is a core concern for a national high-stakes examination. Research to date on wikis has pointed to the advantages offered, including that student contributions are date-stamped and each individual contribution is logged, so overall contributions to work can be tracked. Rubrics have been used effectively to assess student laboratory skills, although compiling rubrics requires a considerable investment in order to document the desired goals and expectations of any particular curriculum experiment or inquiry so that they can be easily assessed. The value of a more flexible approach to documenting science work using wikis and electronic lab notebooks is that it allows scope for individuality within an overall framework of requirements. However, this is an area that needs considerable and ongoing research.

There is a body of research discussing the use of virtual laboratories for mimicking student experimentation, as they provide for more controlled, and hence more robust, assessment protocols. These should be resisted, as they remove students’ exposure to the situational and psychomotor demands that being in the laboratory brings. While virtual laboratories may play some role in summative assessment – for example in decision making – they will likely act as a distraction from the changes required to engage with and document real hands-on work, as they will again shift the focus of experimental science away from actual laboratory work.

Emphasis on experimental science and documenting competences

An advantage of refocusing on the documentation of processes is that students have an opportunity to showcase their own experimental skills. Digital badges have emerged as a way to accredit these, in what is known as “micro-accreditation”. Digital badges mimic the idea of Guides and Scouts badges by acknowledging achievements and competences in a particular domain. Examples could include badging students’ experimental skills (for example, badges for pipetting, titrating, etc.) and higher-level badges, where students would need to draw on a range of competences already awarded and apply them to a particular scenario (for example, an overall analysis where students would need to design the approach and draw on their technical competency in pipetting and titration). This enables students to document their own progress in an ongoing way, and allows them to reflect on the activities needed to complete a full set of badges on offer. This is an exciting area as it offers significant scope for expansion across the curriculum. Mobile learning platforms will open up new and interesting ways to develop these approaches.
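For the technically curious: most badging platforms implement this micro-accreditation using the Open Badges specification, in which each awarded badge is a small "assertion" record linking a recipient, a badge class, and (crucially for lab skills) a piece of evidence such as the student's technique video. As a rough sketch only – the URLs, salt, and helper function below are hypothetical, not from any particular platform – an assertion might be assembled like this:

```python
import hashlib
import json

def make_badge_assertion(email: str, salt: str, badge_url: str,
                         assertion_url: str, evidence_url: str,
                         issued_on: str) -> dict:
    """Build a minimal Open Badges 2.0 style assertion (illustrative sketch).

    The recipient's email is salted and hashed, so the badge can be
    published openly without exposing the address itself.
    """
    digest = hashlib.sha256((email + salt).encode()).hexdigest()
    return {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": assertion_url,
        "recipient": {"type": "email", "hashed": True,
                      "salt": salt, "identity": "sha256$" + digest},
        "badge": badge_url,        # URL of the badge class, e.g. "Pipetting"
        "verification": {"type": "hosted"},
        "issuedOn": issued_on,
        "evidence": evidence_url,  # link to the student's technique video
    }

# Hypothetical example: a pipetting badge backed by a hosted video.
assertion = make_badge_assertion(
    email="student@example.ac.uk",
    salt="deadsea",
    badge_url="https://example.ac.uk/badges/pipetting.json",
    assertion_url="https://example.ac.uk/assertions/123.json",
    evidence_url="https://video.example.ac.uk/pipetting-demo",
    issued_on="2016-09-01T00:00:00+00:00",
)
print(json.dumps(assertion, indent=2))
```

The point of the `evidence` field is exactly the point made above: the badge is not a gold star but a claim backed by demonstrable, inspectable evidence.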


Changing from paper-based to electronic media is not without difficulties. In terms of short-, medium-, and long-term objectives, an initial focus should be on promoting the possibilities of documenting scientific work in school through the use of multimedia. This will develop a culture and expertise around the use of technical skills, and work towards a medium-term goal of documenting work on an online platform instead of on paper – emphasising the value of documenting evidence of processes. This can be complemented with the development of a suite of digital badges associated with expected experimental techniques and protocols. In the long term, this allows for the assessment of laboratory work via wikis and electronic lab notebooks, using appropriate rubrics, which allow students to genuinely and accurately showcase their competence in experimental science in a much more meaningful and engaging way.


Video: e-portfolios for documenting learning in the 21st century

Here is a talk I gave to school teachers at an NCCA event in Dublin Castle earlier this year. It discusses:

  1. Why and how we should enable students to create digital artefacts to document their learning;
  2. Practical approaches to doing this – discussing digital badges and wikis in particular.


#ViCEPHEC16 – curly arrows and labs

The annual Variety in Chemistry Education/Physics Higher Education conference was on this week in Southampton. Some notes and thoughts are below.

Curly arrows

Physicists learned a lot about curly arrows at this conference. Nick Greeves‘ opening keynote spoke about the development of ChemTube3D – a stunning achievement of over 1000 HTML pages, mostly developed by undergraduate students. News for those who know the site is that 3D curly arrow mechanisms are now part of the reaction mechanism visualisations – really beautiful visualisations of changing orbitals as a reaction proceeds, for 30+ reactions – along with lovely visualisations of MOFs, direct links to and from various textbooks, and an app at the prototype stage. Nick explained that this has all been developed with small amounts of money from various agencies, including the HEA Physical Sciences Centre.

Mike Casey from UCD spoke about a resource at a much earlier stage of development; an interactive mechanism tutor. Students can choose a reaction type and then answer the question by drawing the mechanism – based on their answer they receive feedback. Version 2 is on the way with improved feedback, but I wondered if this feedback might include a link to the appropriate place in Chemtube3D, so that students could watch the associated visualisation as part of the feedback.

In the same session Robert Campbell spoke about his research on how A-level students answer organic chemistry questions. My understanding is that students tend to use rules of mechanisms (e.g. a primary alkyl halide means it’s always SN2) without understanding the reason why, hence promoting rote learning. In a nice project situated in the context of cognitive load theory, Rob used Livescribe technology to investigate students’ reasoning. I’m looking forward to seeing this research in print.

Rob’s future work alluded to the video worked answers described by Stephen Barnes, also for A-level students. These demonstrated a simple but clever approach: using questions resembling A-level standard, asking students to complete them, providing video worked examples so students could self-assess, and then getting them to reflect on how they can improve. David Read mentioned that this model aligns with the work of Sadler – worth a read.

Laboratory work

Selfishly, I was really happy to see lots of talks about labs on the programme. Ian Bearden gave the physics keynote, speaking about opening up the laboratory course – meaning the removal of prescriptive procedures, allowing students to develop their own. Moving away from pure recipe is of course music to this audience’s ears, and the talk was very well received. But you can’t please everyone – I would have loved to hear much more about what was done and the data involved, rather than the opening half of the talk on the rationale for doing so. A short discussion prompted this tweet from Felix Janeway, something we can agree on! But I will definitely be exploring this work more. Ian also mentioned that this approach is part of physics modules taught to trainee teachers, which sounded a very good idea.

Jennifer Evans spoke about the prevalence of pre-labs in UK institutions, following on from the Carnduff and Reid study in 2003. Surprisingly, many institutions don’t have any form of pre-lab work. It will be interesting to get a sense of what pre-lab work involves – is it theory or practice? Theory and practice were mentioned in a study from Oxford presented by Ruiqi Yu, an undergraduate student. This showed mixed messages on the purpose of practical work, surely something the academy needs to agree on once and for all. There was also quite a nice poster from Oxford involving a simulation designed to teach experimental design, accessible at this link. This was also built by an undergraduate student. Cate Cropper from Liverpool gave a really useful talk on tablets in labs – exploring the nitty-gritty of how they might work. Finally on labs, Jenny Slaughter gave an overview of the Bristol ChemLabs, which is neatly summarised in this EiC article, although the link to the HEA document has broken.

Other bites

  • Gwen Lawrie (via Skype) and Glenn Hurst spoke about professional development; Gwen mentioned this site she has developed with Madeline Schultz and others to inform lecturers about PCK. Glenn spoke about a lovely project on training PhD students for laboratory teaching – details here.  This reminds me of Barry Ryan‘s work at DIT.
  • Kristy Turner gave an overview of the School Teacher Fellow model at Manchester, allowing her to work both at school and university, with obvious benefits for both. Kristy looked forward to an army of Kristys, which would indeed be formidable, albeit quite scary. Even without that, the conference undoubtedly benefits from the presence of school teachers, as Rob’s talk, mentioned above, demonstrates.
  • Rachel Koramoah gave a really great workshop on qualitative data analysis. Proving the interest in chemistry education research, this workshop filled up quickly. The post-it note method was demonstrated, which was interesting and which I will certainly explore more, but I hope to tease out a bit more detail on the data reduction step. That is the benefit of this model – the participants reduce the data for you – but I worry that this might in turn lead to the loss of valuable data.
  • Matthew Mears gave a great byte on the value of explicit signposting to textbooks using the R-D-L approach: Read (assign a reading); Do (assign questions to try); Learn (assign questions to confirm understanding). Matt said setting it up takes about 30 minutes, and he has seen marked improvements in student performance in comparison to other sections of the course.
  • David Nutt won the best poster prize. His poster showed the results of eye-tracking experiments demonstrating the value (or not) of an in-screen presenter – very interesting results which I look forward to seeing in print.

The conference organisation was brilliant and thanks to Paul Duckmanton and Charles (Minion) Harrison for leading the organisation. Lots of happy but tired punters left on Friday afternoon.

I couldn’t attend everything, and other perspectives on the meeting, with links etc., can be found at the links below. From Twitter, Barry Ryan’s presentation on NearPod seemed popular, along with the continuing amazingness of my colleagues in the Edinburgh Physics Education Research Group. One of their talks, by Anna Wood, is available online.


Getting ready to badge and looking for interested partners

Over the summer we have been working on a lab skills badging project. Lots of detail is on the project home site, but briefly this is what it’s about:

  • Experimental skills are a crucial component of student laboratory learning, but we rarely assess them, or even check them, formally. For schools, there is a requirement to show that students are doing practical work.
  • By implementing a system whereby students review particular lab techniques in advance of labs, demonstrate them to a peer while being videoed, review the technique with the peer using a checklist, and upload the video for assessment, we intend that students will be able to learn and perform each technique to a high standard.
  • The video can form part of a student’s electronic portfolio, which they may wish to share in future (see this article for more on that).
  • The process is suitable for digital badging – awarding of an electronic badge acknowledging competency in a particular skill (think scout badges for… tying knots…).

Marcy Towns has a nice paper on this for pipetting and we are going to trial it for this and some other lab techniques.

Looking for interested parties to trial it out

I am looking for school teachers who would like to try this method out. It can be used to document any lab technique or procedure you like. You don’t necessarily need an exemplar video, but a core requirement is that you want to document students’ laboratory work formally, and acknowledge achievement in this work with a digital badge. We will provide the means to issue the badge, and exemplar videos if you need them, assuming they are within our stock. Interested teachers will be responsible for local implementation and assessment of quality (i.e. making the call on whether a badge is issued).

Yes I need help with badge design

This will be part of a larger project and there will be some research on the value and impact of the digital badges, drawing from implementation case studies. This will be discussed with individuals, depending on their own local circumstances.

So if you are interested, let’s badge! You can contact me at: to follow up.


What is the “education literature”?

Over on the Education in Chemistry blog, Paul MacLellan wrote an excellent article on the reasons teachers don't engage with education research, which is well worth a read. Speaking a few years ago, I used the analogy of a paddle steamer when talking about the penetration of education research in HE. The paddles throw up some splashes as the boat sails over the vast quantity of water below. These splashes were meant to represent how many of us engage with research – taking on what we hear on the grapevine, Twitter, or CPD. It isn't systematic.

I’ve spent a lot of time wondering about whether I should expect my colleagues to read education research, and on balance, I don’t think I should. The reason stems from the argument made about different audiences by Keith Taber in MacLellan’s article, and quantified by the featured commenter under his post. And I think we need some clarity about what we mean by education research literature.

Primary, secondary, and tertiary literature

New research in any field is iterative. We know a certain amount, and someone does some research to add a little more to our understanding. In publishing research findings, we tend to summarise what was known before to situate the work in context, and then add on the new bit. As Taber points out, education has the unique position of aiming to address two audiences: like any field, it addresses the other education researchers in that field; but it also has a second audience of practitioners, who may wish to change some aspect of their teaching and are looking for "what works". The trouble with the mixed audience is that the language and semantics suited to each are very different, leaving the practitioner feeling very frustrated. The featured comment under MacLellan's blog documents this very well. The practitioner looking to improve faces a difficult challenge: they use a search engine with decent keywords and have to try to pick out some paper that will offer them nuggets. It really is a needle in a haystack (or a splash of water from the river boat…).

If asked for advice, I would rather suggest that such practitioners refer to secondary or tertiary literature. Secondary literature aims to summarise the research findings in a particular field. While it is still written with an audience of researchers from the field in mind, these reviews typically group the results of several individual studies into themes or overarching concepts, which can be useful to practitioners wishing to see "what might work" in their own context. I recall the value of MacArthur and Jones' review on clickers; my own review of flipped lectures in chemistry is another example of this type.

The audience shifts more fully when we move to tertiary literature. While there are still likely two audiences for education research, the emphasis of tertiary literature is on introducing the field to a wider audience of interested readers. Books summarising teaching approaches are typically grounded in well-documented research, but unlike secondary sources they are written for those wishing to find out about a topic from an introductory level, and the language is considerate of that wider audience. Think of Taber's books on misconceptions, and the impact they have had. More recently, the web has offered us new forms of tertiary literature – blogs are becoming a popular way to disseminate the usefulness of research to a wider audience, and summaries such as that recently published by Tina Overton on factors to consider in teaching approaches can help introduce overarching research findings without the reader having to penetrate the original education research studies.

So should my colleagues read education research? I still don’t think so. A tourist to a new city wouldn’t read academic articles on transport infrastructure and architecture – they would just read the tourist guide. Of course it can be inspiring to read a case study or see what students in an individual situation experienced. But I would rather recommend secondary and tertiary sources to them if they are going to spend any valuable time reading.

And that means, in chemistry education’s case, we need a lot more of these types of publications. A recent J Chem Ed editorial suggested that they are thinking about promoting this type of publication, and any movement in that direction is welcome.


Planning a new book on laboratory education

Contracts have been signed, so I am happy to say that I am writing a book on chemistry laboratory education as part of the RSC's new Advances in Chemistry Education series, due for publication in mid-2017.

I’ve long had an interest in lab education, since stumbling across David McGarvey’s “Experimenting with Undergraduate Practicals” in University Chemistry Education (now CERP). Soon after, I met Stuart Bennett, now retired from the Open University, at a European summer school. Stuart spoke about lab education and its potential affordances in the curriculum. He was an enormous influence on my thinking in chemistry education, and on practical work in particular. We’d later co-author a chapter on lab education for an RSC book for new lecturers in chemistry (itself a good example of the benefits of European collaboration). My first piece of published education research was based on laboratory work: a report in CERP on the implementation of mini-projects in the chemistry curriculum, completed with good friends and colleagues Claire Mc Donnell and Christine O’Connor. So I’ve been thinking about laboratory work for a long time.

Why a book?

A question I will likely be asking with increasing despair over the coming months is: why am I writing a book? To reaffirm to myself as much as anything else, and to remind me if I get lost on the way, the reasons are pretty straightforward.

My career decisions and personal interests over the last few years have meant that I have moved my focus entirely to chemistry education. Initially this involved sneaking in some reading between the covers of J. Mat. Chem. when I was meant to be catching up on metal oxide photocatalysis. But as time went on and thanks to the support of others involved in chemistry education, this interest became stronger. I eventually decided to make a break with chemistry and move into chemistry education research. (One of the nicest things for me personally about joining Edinburgh was that this interest was ultimately validated.)

So while my knowledge of the latest chemistry research is limited mainly to Chemistry World reports, one thing I do know well is the chemistry education research literature. And there is a lot of literature on laboratory education. As I read it and try to keep on top of it, it is apparent that much of it falls into themes; with a bit of rethinking of these themes, and by taking a curriculum design approach, some guiding principles for laboratory education can be drawn up. A compilation of such principles, offered in the context of a roadmap or plan for laboratory education, might be useful to others.

And this is what I hope to offer. The book is purposefully targeted at anyone responsible for taking a traditional university-level chemistry laboratory course and looking to change it. In reality, such change is an enormous task and, pragmatically, needs to happen in phases. It’s tempting, then, to tweak and change bits based on an innovation presented at a conference or seen in a paper. But there needs to be an overall design for the entire student experience, so that incremental changes sum to a consistent whole. Furthermore, by offering a roadmap or overall design, I hope to empower the members of staff responsible for such change by giving them the evidence they may need to rationalise changes to colleagues. Everyone has an opinion on laboratory education! The aim is to provide evidence-based design approaches.

My bookshelves are groaning with excellent books on laboratory education. I first came across Teaching in Laboratories by Boud, Dunn and Hegarty-Hazel back in the days when I stumbled across McGarvey’s article. I still refer to it: even though it was published in 1986, it still carries a lot of useful material. Woolnough and Allsop’s Practical Work in Science is also excellent; crystal clear on the role and value of laboratory education and its distinction from the lecture-based curriculum. Hegarty-Hazel also edited The Student Laboratory and the Science Curriculum. Roger Anderson’s book The Experience of Science was published before I was born.

I have bought these now out-of-print books, and several more, second hand for less than the cost of a cup of coffee. I have learned a lot from them, but am mindful that, justifiably well-known and comprehensive as they are, they are out of print, and our university laboratories have not seen much change in the forty years since Anderson.

I am very conscious of this as I structure my own book. I can speculate that books covering science laboratories at both secondary and tertiary level may be too broad, so this book focusses exclusively on chemistry in higher education.

Secondly, the book is very clearly directed at those implementing a new approach – those involved in change. Ultimately it is their drive, energy, and input that decide the direction of the changes that will occur. I hope that by speaking directly to them, with a clear rationale and an approach based on up-to-date literature, it may ease the workload somewhat for those looking to rethink laboratory education in their curricula. Now I just need to actually write it.


Alex Johnstone’s 10 Educational Commandments

My thanks to Prof Tina Overton for alerting me to the fact that these exist. I subsequently happened across them in this article detailing an interview with Prof Johnstone (1), and thought they would be useful to share.

Ten Educational Commandments 

1. What is learned is controlled by what you already know and understand.

2. How you learn is controlled by how you learned in the past (related to learning style but also to your interpretation of the “rules”).

3. If learning is to be meaningful, it has to link on to existing knowledge and skills, enriching both (2).

4. The amount of material to be processed in unit time is limited (3).

5. Feedback and reassurance are necessary for comfortable learning, and assessment should be humane.

6. Cognisance should be taken of learning styles and motivation.

7. Students should consolidate their learning by asking themselves about what goes on in their own heads— metacognition.

8. There should be room for problem solving in its fullest sense (4).

9. There should be room to create, defend, try out, hypothesise.

10. There should be opportunity given to teach (you don’t really learn until you teach) (5).

Johnstone told his interviewer that he didn’t claim any originality for the statements, which his students called the 10 educational commandments. Rather he merely brought together well known ideas from the literature. But, and importantly for this fan, Johnstone said that they have been built into his own research and practice, using them as “stars to steer by”.


  1. Cardellini, L. J. Chem. Educ., 2000, 77(12), 1571.
  2. Johnstone, A. H. Chemical Education Research and Practice in Europe (CERAPIE) 2000, 1, 9–15; online at
  3. Johnstone, A. H. J. Chem. Educ. 1993, 70, 701–705
  4. Johnstone, A. H. In Creative Problem Solving in Chemistry; Wood, C. A., Ed.; Royal Society of Chemistry: London, 1993.
  5. Sirhan, G.; Gray, C.; Johnstone, A. H.; Reid, N. Univ. Chem. Educ. 1999, 3, 43–46.


ChemEd Journal Publications from UK since 2015

I’ve compiled this list for another purpose and thought it might be useful to share here. 

The following are publications I can find* from UK corresponding authors on chemistry education research, practice, and laboratory work relevant to HE since the beginning of 2015. There are lots of interesting finds and useful articles. Most are laboratory experiments and activities; some refer to teaching practice or underlying principles.

I don’t imagine this is a fully comprehensive list, so do let me know what’s missing. It’s in approximate chronological order from beginning of 2015.

  1. Surrey (Lygo-Baker): Teaching polymer chemistry
  2. Reading (Strohfeldt): PBL medicinal chemistry practical
  3. Astra Zeneca and Huddersfield (Hill and Sweeney): A flow chart for reaction work up
  4. Bath (Chew): Lab experiment: coffee grounds to biodiesel
  5. Nottingham (Galloway): PeerWise for revision
  6. Hertfordshire (Fergus): Context examples of recreational drugs for spectroscopy and introductory organic chemistry 
  7. Overton (was Hull): Dynamic problem based learning
  8. Durham (Hurst, now at York): Lab Experiment: Rheology of PVA gels
  9. Reading (Cranwell): Lab experiment: Sonogashira reaction
  10. Edinburgh (Seery): Flipped chemistry trial
  11. Oaklands (Smith): Synthesis of fullerenes from graphite
  12. Manchester (O’Malley): Virtual labs for physical chemistry MOOC  
  13. Edinburgh (Seery): Review of flipped lectures in HE chemistry
  14. Manchester (Wong): Lab experiment: Paterno-Buchi and kinetics
  15. Southampton (Coles): Electronic lab notebooks in upper level undergraduate lab
  16. UCL (Tomaszewski): Information literacy, searching
  17. St Andrews & Glasgow (Smellie): Lab experiment: Solvent extraction of copper
  18. Imperial (Rzepa): Lab experiment: Asymmetric epoxidation in the lab and molecular modelling; electronic lab notebooks
  19. Reading (Cranwell): Lab experiment: Wolff Kishner reaction
  20. Imperial (Rzepa): Using crystal structure databases
  21. Leeds (Mistry): Inquiry based organic lab in first year – students design work up
  22. Manchester (Turner): Molecular modelling activity
  23. Imperial (Haslam & Brechtelsbauer): Lab experiment: vapour pressure with an isoteniscope
  24. Imperial (Parkes): Making a battery from household products
  25. Durham (Bruce and Robson): A corpus for writing chemistry
  26. Who will it be…?!

*For those interested, the Web of Science search details are reproduced below. Results were filtered to remove non-UK papers, conference proceedings and editorials.

ADDRESS: ((united kingdom OR UK OR Scotland OR Wales OR England OR (Northern Ireland))) AND TOPIC: (chemistry) AND YEAR PUBLISHED: (2016 OR 2015)




Practical work: theory or practice?

Literature on laboratory education over the last four decades (and more, I’m sure) has a lot to say on the role of practical work in undergraduate curricula. Indeed Baird Lloyd (1992) surveys opinions on the role of practical work in North American General Chemistry syllabi over the course of the 20th century and opens with this delicious quote, apparently offered by a student in 1928 in a $10 competition:

Chemistry laboratory is so intimately connected with the science of chemistry, that, without experimentation, the true spirit of the science cannot possibly be acquired. 

I love this quote because it captures so nicely the sense that laboratory work is at the heart of chemistry teaching – its implicit role in the teaching of chemistry is unquestionable. And although that role has been questioned repeatedly over the following decades, not many today would advocate a chemistry syllabus that did not contain laboratory work.

I feel another aspect of our consideration of chemistry labs is often unchallenged, and needs to be: the notion that chemistry laboratories are in some way a proving ground for what students come across in lectures. That they provide an opportunity for students to visualise and see for themselves what the teacher or lecturer was talking about. Or, more laudably, to “discover” a particular relationship for themselves by following a controlled experiment. Didn’t believe it in class that an acid and an alcohol make an ester? Well, now you are in labs, you can prove it. Can’t imagine that vapour pressure increases with temperature? Then come on in – we have just the practical for you. Faraday said that he was never able to make a fact his own without seeing it. But then again, he was a great demonstrator.

A problem with this on an operational level, especially at university, and especially in the physical chemistry laboratory, is that it is near impossible to schedule practicals so that they follow the introduction of the theory in class. This leads to the annual complaint from students that they can’t do the practical because they haven’t done the theory. Your students are saying this; if you haven’t heard them, you need to tune your surveys.

It’s an entirely understandable sentiment from students, because we situate practicals as a subsidiary of lectures. But this is a false relationship for a variety of reasons. First, if you accept a model whereby you teach students chemistry content in lectures, why is there a need to supplement this teaching with a re-teaching of a subset of topics, arbitrarily chosen on the whim of a lab course organiser and the size of a department’s budget? Secondly, although we aim to re-teach, or hit home some major principle again in lab work, we don’t really assess that. We might grade students’ lab reports and give feedback, but it is not relevant to them, as they won’t need to know it again in that context. The lab report is done. And finally, the model completely undermines the true role of practical work and the value it can offer the curriculum.

A different model

When we design lecture courses, we don’t really give much thought to the labs that will go with them. Lecture course content has evolved rapidly to keep up to date with new chemistry; lab development is much slower. So why not the other way around? Why not design lab courses independent of lectures? Lecture courses are one area of the curriculum – typically where the content is learned; laboratory courses are another. And what might their role be?

Woolnough and Allsop (1985), who make a clear and convincing argument for cutting the “Gordian knot” between theory and practice, instead advocate a syllabus that has three aims:

  1. developing practical skills and techniques.
  2. being a problem-solving chemist.
  3. getting a “feel for phenomena”.

The detail of how this can be done is the subject of their book, but involves a syllabus that has “exercises, investigations, and experiences”. To me these amount to the “process” of chemistry. On a general level, I think this approach is worth consideration as it has several impacts on teaching and learning in practice.

Impacts on teaching and learning

Cutting the link between theory and practice means that there is no longer a need to examine students’ understanding of chemistry concepts by proxy. Long introductions, much hated by students, which aim to get the student to understand the theory behind the topic at hand by rephrasing what is given to them in a lab manual, are obsolete. A properly designed syllabus removes the need for students to have had lectures in a particular topic before a lab course. Pre-lab questions can move away from being about random bits of theory and focus on the relationships in the experiment. There is no need for pointless post-lab questions that try to squeeze in a bit more theory.

Instead, students will need to approach the lab with some kind of model for what is happening. This does not need to be the actual equations they learn in lectures; with some thought, they may be able to draw on prior knowledge to inform that model. Of course, the practical will likely involve some aspect of what they cover, or will cover, in lectures, but at the stage of doing the practical it is the fundamental relationship they are considering and exploring. Approaching the lab with a model of a relationship (clearly I am in phys chem labs here!) and exploring that relationship better reflects the nature of science, and focusses students’ attention on the study in question. Group discussions and sharing data become more meaningful. Perhaps labs could even inform future lectures rather than rely on past ones! A final advantage is the reassertion of practical skills and techniques as a valuable aspect of laboratory work.

A key point here is that the laboratory content is appropriate for the level of the curriculum, just as it is when we design lectures. This approach is not advocating random discovery – quite the opposite. But free of the bond with associated lectures, there is scope to develop a much more coherent, independent, and more genuinely complementary laboratory course.


Baird W. Lloyd, The 20th Century General Chemistry Laboratory: its various faces, J. Chem. Ed., 1992, 69(11), 866-869.

Brian Woolnough and Terry Allsop (1985) Practical Work in Science, Cambridge University Press.



Developing practical skills in the chemistry laboratory

How do we prepare students for the practical skills they use in the laboratory?

Practical skills involve psychomotor development, as they typically involve handling chemicals, glassware, and instrumentation. But how do we prepare students for this work, and do we give them enough time to develop these skills?

Farmer and Frazer analysed 126 school experiments (from the old O-level Nuffield syllabus) with a view to categorising practical skills, and came up with some interesting results.[1] Acknowledging that some psychomotor tasks include a cognitive component (they give the example of manipulating the air-hole collar of a Bunsen burner while judging the nature of the flame for a particular task at hand), they identified 65 psychomotor tasks and 108 cognitive tasks in the experiments studied. Some of these psychomotor tasks are defined as having a key role, in that the success of the experiment depends on their successful completion, reducing the number of psychomotor skills to 44. Many of these key tasks were required in only a few experiments, so the set was again reduced to the frequent key tasks – those occurring in more than 10 experiments. The 14 frequent key tasks subsequently identified are described in their table below.

Data from Education in Chemistry (Ref 1)

Thus of the 65 psychomotor skills listed, only 14 are defined as frequent key tasks, limiting the opportunities for pupils to develop the skills associated with completing them. Indeed this paper goes on to demonstrate that in an assessment of 100 pupils, there was very poor demonstration of ability in correctly completing the practical tasks, which they attribute to the design of the syllabus and the limited opportunity to do practical work.

This article prompts me to think again: how do we prepare students for the laboratory skills aspect of practical work? I think the most common approach is to demonstrate immediately in advance of the student completing the practical, explaining the technique or the apparatus and its operation. However, demonstration puts students in the mode of observer; they are watching someone else complete an activity, rather than conceptualising their own completion. It also relies on the quality of the demonstrator, and is subject to local hazards, such as time available, ability to see and hear the demonstration, and so on. Therefore, there may be benefit in shifting this demonstration to pre-lab, allowing students time to become accustomed to a technique and its nuances.

Such pre-labs need to be carefully designed, and actively distinguished from any pre-lab information focussing on theory, which has a separate purpose. At Edinburgh, two strategies are planned.

The first is on the development of core introductory laboratory skills: titrations involving pipetting and buretting; preparing standard solutions, including using a balance; and setting up Quickfit glassware to complete a distillation. Pre-lab information is provided to students in the form of videos demonstrating each technique, with key steps in each procedure highlighted in the video. Students will be required to demonstrate each of the three procedures to their peers in the laboratory, while a peer uses a checklist to ensure that all aspects of the task are completed appropriately. The purpose here is to incorporate preparation, demonstration, and peer review into the learning of core lab skills, as well as to set in mind, early in students’ university careers, the correct approach and the appropriate glassware for basic laboratory techniques. The approach includes students videoing their peers as part of the review process using mobile phones; the video recording will subsequently be used as evidence for issuing students with a digital badge for that technique (more on that at the project homepage).

The second approach is to develop the laboratory manual beyond its traditional textbook format to be an electronic laboratory manual, with pre-lab demonstrations included. More on that project to come soon.

In designing pre-lab activities for skills development, the aim is to move beyond “just demonstrating” and to get students thinking through the approaches they will take. The reason for this is guided by work done by Beasley in the late 1970s. Beasley drew on the literature of physical education to consider the development of psychomotor skills in chemistry, studying mental practice as a technique to help students prepare for the laboratory.[2] Mental practice is based on the notion that physical activity requires mental thought, and thus mentally or introspectively rehearsing an activity prompts neural and muscular responses.[3] Students were assigned to groups that conducted no preparation, physical preparation, mental preparation, or both physical and mental preparation, and were tested before and after completing a lab on volumetric analysis. Beasley reported that students entering college from school were not proficient in volumetric analysis, based on the accuracy of their results. Furthermore, there was no significant difference in the post-test scores of the treatment groups (which were all better than those of students who did no preparation), suggesting that mental preparation was as effective as physical preparation.

Those interested in reading more on this topic may enjoy two reviews by Stephen DeMeo; one in J. Chem. Ed.[4] and an elegant piece “Gazing at the hand” in Curriculum Inquiry.[5]


[1] A. Farmer and M. J. Frazer, Practical Skills in School Chemistry, Education in Chemistry, 1985, 22, 138.

[2] W. Beasley, The Effect of Physical and Mental Practice of Psychomotor Skills on Chemistry Student Laboratory Performance, Journal of Research in Science Teaching, 1979, 16(5), 473.

[3] J. B. Oxendine, Physical education. In Singer, R. B. (Ed.), The psychomotor domain: Movement behavior. Philadelphia: Lea and Febiger, 1972.

[4] S. De Meo, Teaching Chemical Technique: A review of the literature, Journal of Chemical Education, 2001 78(3), 373.

[5] S. De Meo, Gazing at the Hand: A Foucaultian View of the Teaching of Manipulative Skills to Introductory Chemistry Students in the United States and the Potential for Transforming Laboratory Instruction, Curriculum Inquiry, 2005, 35, 3.


Reflections on #micer16

Several years ago at the Variety in Chemistry Education conference, there was a rather sombre after-dinner conversation on whether the meeting would continue on in subsequent years. Attendance numbers were low and the age profile was favouring the upper half of the bell-curve.

Last year at Variety I registered just before the deadline and got what I think was the last space, and worried about whether my abstract would be considered. The meeting was packed full of energetic participants interested in teaching, from all over the UK and Ireland and at various stages of their careers. A swell in numbers is of course expected from the merger with the Physics Higher Education Conference, but the combination of the two is definitely (from this chemist’s perspective) greater than the sum of its parts.

Participants at #micer16

What happened in the meantime would be worthy of a PhD study. How did the fragile strings that were just holding people together in this disparate, struggling community not snap, but instead strengthen to bring in many newcomers? A complex web of new connections has grown. Though I watched it happen, I am not sure how it happened. I suspect it was a confluence of many factors: the efforts of the RSC at a time when chemistry was at a low point; the determination of the regular attendees to keep supporting the meeting, knowing its inherent value; the ongoing support of people like Stuart Bennett, Dave McGarvey, Stephen Breuer, Bill Byers, and others; and of course the endless energy of Tina Overton and the crew at the Physical Sciences Centre at Hull.

Whatever the process, we are very lucky to have a vibrant community of people willing to push and challenge and innovate in our teaching of chemistry. That community is willing, and expected, to play a vital role in the development of teaching approaches. This requires design and evaluation of these approaches: a consideration of how they work in our educational context. And that, in turn, requires the knowledge of how to design research studies and complete evaluations. Readers will note that Variety now particularly welcomes evidence-based approaches.

Most of us in this community are chemists, and the language of education research can be new and difficult to navigate. Thus a meeting such as MICER, held last week, aimed to introduce and/or develop approaches in education research. The speakers were excellent – but having selected them, I knew they would be! Participants left, from what I could see in person and on social media, energised and enthused about the summer ahead and possible projects.

But we will all return to our individual departments, with the rest of the job to do, and soon enthusiasm gives way to pragmatism, as other things get in the way. It can be difficult to continue to develop expertise and competence in chemistry education research without a focus. The community needs to continue to support itself, and seek support from elsewhere.

How might this happen?

Support from within the community can happen by contacting someone you met at a conference and asking them to be a “critical friend”. Claire Mc Donnell introduced me to this term, and indeed was my critical friend. This is someone you trust to talk with about your work, share ideas and approaches, and read drafts. It is a mutual relationship, and I have found it extremely beneficial, both from the perspective of having someone sensible to talk to, and from a metacognitive perspective: talking it out makes me think about it more.

The community can organise informal and formal journal clubs. Is there a particular paper you liked – how did the authors complete a study and what did they draw from it? Why not discuss it with someone, or better still in the open?

Over the next while I am hoping to crystallise these ideas and continue the conversations on how we do chemistry education research. I very much hope you can join me and be an active participant; indeed a proactive participant. So that there is an independent platform, I have set up the website and welcome anyone interested in being involved to get in touch about how we might plan activities or even a series of activities. I hope to see you there.


The oldest chemsoc in the world

In 1935, John Kendall, Professor of Chemistry at the University of Edinburgh, wrote to the Journal of Chemical Education with what one can imagine was more than a hint of glee. He reported the existence of a list of names, written in Joseph Black’s hand, under the heading “List of the Members of the Chemical Society“. These names had previously been thought to be those of people from the city of Edinburgh with an interest in chemistry, or indeed drawn from the general population of the country.

List of names from Kendall, J Chem Ed, 1935, 12, 565.

Kendall had the simple but ingenious idea of checking the register of students of the University of Edinburgh from this period, and in a productive 15 minutes, he identified 53 of the 59 names immediately. He wrote that “the right names tumbled out of the register just like ripe apples from a tree when shaken”. The remaining names could be accounted for by transcription errors. Thus the list of names written in Joseph Black’s hand was considered to comprise members of Black’s class, and therefore to represent the original Chemical Society of the University of Edinburgh. Prior to this, the oldest society was considered to be the Chemical Society of Philadelphia (1792). The nationality of 19 on the list is given (those who were medical graduates): three were Scottish, three were English, and the remaining 13 were Irish; step forward Bicker McDonald.

How do we know this group functioned as a society? Kendall gives an update on the story in 1953, in his book “Great Discoveries by Young Chemists“. He received a letter from Rev PJ McLaughlin in 1947. The good Reverend had discovered a folio that had been given to the Royal Irish Academy, Dublin, in 1846. This contained a collection of “Dissertations read before the Chemical Society instituted in the beginning of the Year 1785”. Edinburgh is mentioned within, and the names of the 32 contributors match some of those on Black’s list. Kendall requested that the volume be loaned to the Royal Society of Edinburgh, where upon examination, he realised it was the first book of Proceedings of the Chemical Society of the University of Edinburgh. Published in 1785, it has the honour of being the world’s first chemistry journal, preceding Annales de Chimie by 5 years.

Kendall requested that the book of proceedings be returned to Edinburgh, and on 25 November 1947, the Council of the Royal Irish Academy agreed to return the folio to its original home. Kendall concludes with a paragraph that makes me think we would have got on quite well:

And now, my readers, if you possess friends with a common interest in chemistry, don’t you think it would be well worth while to start a chemical society of your own for the discussion of topics of current importance? You might even keep a record of the papers presented by members at the meetings of such a society, and this record might help chemical historians of the year 2100 to appreciate the scientific problems of today.

Practical measures for practical work

There is something about reading old educational literature that is simultaneously reaffirming and depressing. Reading a point of view from over three decades ago that confirms your current thinking belies the notion that you are jumping on “the latest fad”, while the fact that it is still an issue for discussion three decades later makes you wonder about the glacial rate of change in educational approaches.

Education in Chemistry published a series of articles on practical work by Alex Johnstone. This article from 1982 sets the scene:

It is not uncommon in undergraduate laboratories to see students working as fast as possible just to get finished, with little or no thought about what they are doing.

Students, he argues, see practical work as “an intellectual non-event”. Johnstone used this article to elaborate on his hypothesis that practical classes present a significant challenge to students, because the very idea being taught is simultaneously needed at the start of the practical in order to organise the information presented there. In the lab, students need to recall the theory behind the experiment, remember how to use apparatus or read new written instructions about apparatus, develop new lab skills, listen to verbal instructions, and process whatever data the experiment produces. It is difficult to discern which of this information is immediately important and which is incidental.

Information overload in a lab environment (from Education in Chemistry, 1982)

The result is that students will follow the laboratory procedure like a recipe, without any intellectual engagement. Or they might take unnecessary care so that they won’t regret any experimental actions later – for example using an analytical balance when only a rough estimate was needed. Or they might go all British Bake Off Technical Challenge and just look around and copy others, without knowing why they are doing what they are mimicking.

Johnstone makes an interesting point which I don’t think I have seen elsewhere. When we lecture, we start off from a single point, elaborating with examples and developing connections. In practical work, however, students are exposed to all the information at once and must navigate their own way through to find the main point, which is often obscured. He used the idea of a pyramid and an inverted pyramid to model these two approaches.

Different approaches in class work and practical work (Johnstone and Wham, from Education in Chemistry, 1982)

Teaching strategy

How then can this information overload be alleviated? Johnstone provides some examples, including

  • making the point of the experiment clear;
  • considering the arrangement of the instruction manual so that it is clear which information is preliminary, peripheral, and/or preparatory;
  • ensuring the experiment has not acquired confusing or irrelevant aspects – this resonated with my own experience: some experiments involve students making a series of dilutions and then performing measurements with that series. Students spend so much time on the process of dilution (preparatory) that it becomes the main focus, rather than the main purpose of the experiment. This means thinking about the goals of the experiment: if students are required to know how to prepare a dilution series (of course they are), then that should have primary prominence elsewhere;
  • ensuring necessary skills are acquired before exposure to investigative experiments.

A new manual

In 1990, Michael Byrne, from what was then Newcastle Polytechnic, decided to put Johnstone’s ideas into practice and reported on a new first year manual with the following design considerations:

  1. Experiments included a brief introduction with key information needed to understand instruments and the principle behind the experiment.
  2. Objectives were explicitly stated, indicating exactly what the student should be able to do as a result of carrying out the experiment.
  3. The language of procedures was simplified and laid out in the sequence of what the student was meant to do.
  4. Information was provided on how to present results and what the discussion should cover. Students were prompted as to how they should query their results.

These ideas aren’t exactly what we would consider radical now, but students were tested after five weeks and those using the revised manual scored significantly higher in tests about techniques and results of experiments than those who had the old manuals.

Tenacious J

In the 1990s, Johnstone continued his attempts to effect change in our approach to practical teaching. In 1990, he discussed the use of student diaries to record students’ views on their lab experiences during their second year at university. The diary consisted of short response questions that the student completed after each practical. The responses were analysed iteratively, so that the experiments could be grouped into layers of criticism. Unsurprisingly, physical chemistry experiments came out as the most unpopular. My poor subject.

Why were they unpopular? Johnstone analysed the “load” of each experiment – the amount of information students had to process, recall, digest and interrelate in the three-hour period. I’ve reproduced his table below:

Load caused by experiments, Johnstone and Letton, Education in Chemistry 1990

The total load of the physical chemistry experiments is much greater than that of the inorganic or organic labs, primarily due to theoretical aspects. Furthermore, those physical experiments which attracted the most criticism had a theory load of 40, compared with the average physical lab theory load of 33. The result was evident in the student diaries: comments such as “not learned anything” and “no satisfaction” indicate that the students had not engaged with the experiment in any way.
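Johnstone’s load bookkeeping is easy to illustrate in a few lines of code. The sketch below totals component loads per lab type and flags theory-heavy experiments; the component values are invented for demonstration and are not the figures from the 1990 paper (only the theory loads of 33 and 40 mentioned above are taken from the text).

```python
# Illustrative sketch of Johnstone-style "load" bookkeeping.
# Component values are invented for demonstration, not the 1990 figures.
experiment_loads = {
    "organic":   {"theory": 15, "skills": 20, "recall": 10},
    "inorganic": {"theory": 18, "skills": 18, "recall": 12},
    "physical":  {"theory": 33, "skills": 15, "recall": 14},
}

def total_load(components):
    """Sum the component loads for one experiment type."""
    return sum(components.values())

# Flag lab types whose theory load exceeds a threshold (Johnstone noted
# the most-criticised physical experiments had a theory load of 40,
# against an average physical theory load of 33).
THEORY_THRESHOLD = 30

for name, components in experiment_loads.items():
    flag = " <- theory-heavy" if components["theory"] > THEORY_THRESHOLD else ""
    print(f"{name}: total load {total_load(components)}{flag}")
```

The point of the exercise, as in the paper, is not the absolute numbers but the comparison: once load is tallied per component, the source of the overload (here, theory) becomes visible.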

Practical measures for practical work

In 1991, in the final article of this trio, Johnstone reported how he addressed the issues arising from the study of load in the manuals. Five strategies were considered:

  1. Noise reduction in the lab (noise meaning extraneous information): clearly labelled solutions, provided without need for further dilution unless required as part of experiment, highlighting where a particular procedure may differ from what students have previously experienced.
  2. Noise reduction in the manual: clearer layout of manual, with consistent icons, diagrams of types of balance beside instructions, etc. (this is 1991, people…)
  3. Allowing for practice: design of the overall practical course so that required skills are progressively developed.
  4. Time for thought: requiring students to prepare some of the procedure in advance of the lab as part of their pre-practical work – e.g. amounts to be measured out, etc.
  5. Time to piece it together: arranging the lab programme so that skills are developed in one week, used in a second week, and applied in a third week in a more open-ended lab that took about half an hour of lab time.

Johnstone’s trio of papers shows an impressive sequence: developing a theory as to why something has gone wrong, testing that theory with some analysis, and grounding an educational approach in the findings. It’s one of the reasons I admire his work so much.


Michael S. Byrne, More effective practical work, Education in Chemistry, 1990, 27, 12-13.

A. H. Johnstone, A. J. B. Wham, The demands of practical work, Education in Chemistry, 1982, 19, 71-73.

A. H. Johnstone, K. M. Letton, Investigating undergraduate laboratory work, Education in Chemistry, 1990, 27, 9-11.

A. H. Johnstone, K. M. Letton, Practical measures for practical work, Education in Chemistry, 1991, 28, 81-83.

Finding out about Learning Analytics

I’m attending the JISC Learning Analytics network meeting (information), which is giving a good overview of the emerging development of learning analytics and its integration into higher education. Learning analytics aims to harness data about students’ interactions and engagement with a course – whatever can be measured – and use it in an intelligent way to inform and empower students about their own academic journey. Of course, one of the major questions being discussed here is: what data is relevant? This was something I explored when developing a model to help tutors predict student performance and identify at-risk students (see CERP, 2009, 10, 227), but things have moved on, and the discipline of learning analytics looks to automate much of the data gathering and to provide sensible reporting to both staff and individual students.

There was an interesting talk from Gary Tindell at the University of East London, describing the roll-out over time of a learning analytics platform, which might be of interest to others considering integrating one into their own institution. He identified five phases:

  • Phase 1: collecting data on student attendance via a swipe-card system. This data can be broken down by school, module, event, and student. An attendance reporting app was subsequently developed (assuming app here means a web app). It identifies students whose attendance falls below a 75% threshold and flags interventions via the student retention team. Unsurprisingly, there was a correlation between student attendance and module performance.
  • Phase 2: a student engagement app for personal tutors. This pulls together data on student attendance, module activity, use of library, e-book activity, coursework submission, assessment profile, etc., and aims to provide tutors with a broader profile of student engagement.
  • Phase 3: development of an app that integrates all this data and calculates a level of student engagement, based on a weighting system, for identifying at-risk students (those at risk of leaving). The weighting can be changed depending on what is considered most important. It also allows students to see their level of engagement compared with their cohort.
  • Phase 4: the research phase – the intention is to use data to inform the weightings applied in the student engagement app. Initial analysis found the highest correlations for attendance and average module marks. More interestingly, however, multiple regressions suggest all engagement measures are significant. They have developed a quadrant-based model that ranks students from low engagers to high engagers and provides an indicator of student performance. One of the key measures is previous student performance – but is that a student engagement measure?
  • Phase 5 – currently in progress: developing three different visualisations of student engagement, which:
  1. compare individual engagement with the UEL school and course;
  2. show the student where they are located in terms of student engagement;
  3. indicate the distance a student must travel to progress to another quadrant.
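The weighting idea in Phases 3 and 4 can be sketched in a few lines. The measure names, weights, and at-risk threshold below are my own illustrative assumptions, not UEL’s actual scheme; the point is only the mechanism of a tunable weighted sum over normalised engagement measures.

```python
# Sketch of a weighted engagement score of the kind described in Phases
# 3-4. Measure names, weights, and the threshold are illustrative
# assumptions, not the actual UEL scheme.
WEIGHTS = {
    "attendance": 0.4,             # fraction of sessions attended (0-1)
    "vle_activity": 0.2,           # normalised online module activity (0-1)
    "library_use": 0.1,            # normalised library/e-book use (0-1)
    "coursework_submission": 0.3,  # fraction of coursework submitted (0-1)
}

def engagement_score(measures):
    """Weighted sum of normalised engagement measures; result in 0-1."""
    return sum(WEIGHTS[k] * measures.get(k, 0.0) for k in WEIGHTS)

def at_risk(measures, threshold=0.5):
    """Flag a student for intervention if their score falls below threshold."""
    return engagement_score(measures) < threshold

student = {"attendance": 0.6, "vle_activity": 0.3,
           "library_use": 0.2, "coursework_submission": 0.8}
print(engagement_score(student), at_risk(student))  # 0.56, not at risk
```

Changing the `WEIGHTS` dictionary is exactly the tuning described in Phase 4: the regression results feed back into the weights until the score best predicts the outcomes the institution cares about.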

The next steps in the project aim to answer the following questions:

  • Can we accurately predict student performance based on a metric?
  • Can providing students with information on their level of engagement really change study patterns?

It’s the last point that particularly interests me.




Rethinking my views on LGBT in STEM

I’ve never been sure where I stand with the association of sexuality and professional status. I’ve always leaned towards the side of considering it a private matter, not something to be either concealed or promoted. It is a personal identifier, not a professional one. Yet I’ve agreed with the notion that, as one clever person put it to me recently, if it helps someone, then it is a good thing. Maybe I’ve just been reluctant to be the one helping.

Recently there was an LGBT-focussed seminar day, where scientists and engineers came together to present their work and discuss issues around being LGBT in STEM. While I didn’t attend, I did follow the tweets and it has prompted some thought and several incidental conversations since. I wondered why all of these extremely clever people thought this to be an issue of great importance, while I was sitting on the fence.

I came out in the last year of my PhD; the imminent void facing me probably prompted a desire to remove at least one future concern. My main memory from that time is the terrible sadness I felt when a very lovely colleague asked me why I hadn’t said so before. They were bringing another colleague to a gay club every Sunday and I could have joined them. The utter futility of all those years of secrecy seemed such a waste.

So when I started my post-doc, I was an out person. On my first day, there was a query about a partner and who she was, and I mentioned a he, and all was well. I’m not sure whether it is on purpose, but ever since I’ve opted to tell the most trusted gossip I can find early on, and any awkward pronoun conversations are avoided evermore. I even managed (accidentally) to get it into my interview presentation at Edinburgh.

So why don’t I see it as an issue? It’s probably because it’s never been an issue for me. My memory of not being out has faded as it is half a lifetime ago. My experience has only been in academia, which has always been an actively supportive environment. That’s my frame of reference.

There can be no doubt that there are issues around supporting LGBT people in a university environment. It is well documented that at school level there are significant stress, trauma, and associated mental health problems for LGBT pupils. And there are workplace studies that discuss the reticence of employees and the discomfort of coming out in an unsupportive environment. So while there is not much research for the UK (although see here for a good article), it would be folly to think that, in between these two phases of life, there are not worrying issues for our students at university.

And it is sad to think that at a time when they should be most free to express and explore – and to focus on my thermodynamics notes – some of my students are instead worrying about how to distract their peers from conversations about what happened at the weekend.

So back to that fence: is it my role to advocate? What does that even mean? I am uncomfortable about the term ‘role model’. Conversations in the last while have certainly changed my thinking, in that I am now leaning towards the feeling that doing something is a good idea, but now I’m not quite sure what that something should be.

Some useful links from the LGBT STEMinar:
Dave Smith’s keynote:
Elena Rdz-Falcon’s keynote:
Good blog post on this seminar: 
Tweets from the day are under the hashtag #LGBTSteminar


PhD Studentship in Chemistry Education

3 Year PhD Studentship in Chemistry Education

Supervisor: Dr Michael Seery, School of Chemistry, University of Edinburgh

“Learning analytics to enhance the student learning experience in chemistry”

Learning analytics is an emerging discipline considering the measurement, collection and analysis of data about student learning with a view to improving their learning experience. This project involves the design and development of a learning analytics system for chemistry so that students can continually monitor and reflect on their progress, with a view to actively improving their understanding throughout their studies. The project builds on work on using students’ prior chemistry learning to examine future performance (Chem. Ed. Res. Pract., 2009, 10, 227-232) and assessing the impact of learning resources on student performance (Brit. J. Ed. Tech., 2012, 43(4), 667–677).

The project is fully-funded for 36-months starting in September 2016, covering UK/EU tuition fees and an annual stipend at the EPSRC standard rate (in the region of £14,200 in academic year 2016-17).

Applications are sought from UK/EU candidates with an excellent track record in chemistry or chemistry education and an interest in higher education and the use of technology in education. Experience with e-learning software and statistical processing packages is desirable but not essential. Applicants must have, or expect to receive by the start date, a 1st class or an upper 2nd class honours degree (or equivalent).

To apply, applicants should send the following to

  • A cover letter detailing interest in the position and any relevant experience.
  • A curriculum vitae.

The successful candidate will be required to apply through the EUCLID system as outlined at

Deadline: 1st Feb 2016


Date for #chemed diaries: Methods in Chemistry Education Research 20/5/16

Methods in Chemistry Education Research #micer16

Burlington House, 20th May 2016, 11 am – 4 pm

This one day conference is being organised in response to a growing demand and interest in chemistry education research in the UK. The meeting will focus in particular on methods: the practicalities of how to do chemistry education research. Invited speakers with experience of completing and publishing discipline-based education research will give talks on particular methods, relating them to a theoretical framework and the research question, and discussing the practicalities of gathering data. Approaches to publication will be outlined. Each talk will be followed by an extended structured discussion so that attendees have time to explore further issues that arise. The meeting is being organised with the journal Chemistry Education Research and Practice, which is free to access at the URL:

Attendance is free thanks to the support of the RSC’s Chemistry Education Research Group and Tertiary Education Group.

Session 1 (sponsored by the Chemistry Education Research Group)
In the first session, speakers will discuss their approach to education research with an emphasis on particular theoretical frameworks (e.g. grounded theory) and how this framework influences their method in addressing a research question.


Session 2 (sponsored by the Tertiary Education Group)
In the second session, speakers will discuss their approach with an emphasis on gathering data (e.g. focus groups), the reasons for these approaches in the context of the research question, and the considerations in interpreting this data.

Further information and a final list of speakers will be circulated in early 2016.
