Links to Back Issues of University Chemistry Education

I don’t know if I am missing something, but I have found it hard to locate past issues of University Chemistry Education, the predecessor to CERP.  They are not linked on the RSC journal page. CERP arose out of a merger between U Chem Ed and CERAPIE, and it is the CERAPIE articles that are hosted in the CERP back issues. Confused? Yes. (More on all of this here)

Anyway, in searching and hunting for old U Chem Ed articles, I cracked the code of the links and have compiled links to back issues below. They are full of goodness. (The very last article published in UCE was the very first chemistry education paper I read – David McGarvey’s “Experimenting with Undergraduate Practicals“.)

Links to Back Issues

Contents of all issues: http://www.rsc.org/images/date_index_tcm18-7050.pdf 

1997 – Volume 1:

1 – remains elusive… It contains Johnstone’s “And some fell on good ground” so I know it is out there… Edit: cracked it – they are available by article:

1998 – Volume 2:

1 – http://www.rsc.org/images/Vol_2_No1_tcm18-7034.pdf

2 – http://www.rsc.org/images/Vol_2_No2_tcm18-7035.pdf

1999 – Volume 3:

1 – http://www.rsc.org/images/Vol_3_No1_tcm18-7036.pdf

2 – http://www.rsc.org/images/Vol_3_No2_tcm18-7037.pdf

2000 – Volume 4:

1 – http://www.rsc.org/images/Vol_4_No1_tcm18-7038.pdf

2 – http://www.rsc.org/images/Vol_4_No2_tcm18-7039.pdf

2001 – Volume 5:

1 – http://www.rsc.org/images/Vol_5_No1_tcm18-7040.pdf

2 – http://www.rsc.org/images/Vol_5_No2_tcm18-7041.pdf

2002 – Volume 6:

1 – http://www.rsc.org/images/Vol_6_No1_tcm18-7042.pdf

2 – http://www.rsc.org/images/Vol_6_No2_tcm18-7043.pdf

2003 – Volume 7:

1 – http://www.rsc.org/images/Vol_7_No1_tcm18-7044.pdf

2 – http://www.rsc.org/images/Vol_7_No2_tcm18-7045.pdf

2004 – Volume 8:

1 – http://www.rsc.org/images/Vol_8_No1_tcm18-7046.pdf

2 – http://www.rsc.org/images/Vol_8_No2_tcm18-7047.pdf

I’ve downloaded these all now in case of future URL changes. Yes I was a librarian in another life.
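(For fellow archivists: the tcm IDs in the URLs above simply increase by one per issue, so batch-downloading volumes 2–8 takes only a few lines. A minimal Python sketch, assuming the URLs above still resolve – the local filenames are my own invention:)

```python
import urllib.request

# Volumes 2-8 follow the pattern Vol_<vol>_No<issue>_tcm18-<id>.pdf,
# with the tcm id incrementing by one per issue (7034 = Vol 2, No 1).
tcm = 7034
for vol in range(2, 9):
    for issue in (1, 2):
        url = f"http://www.rsc.org/images/Vol_{vol}_No{issue}_tcm18-{tcm}.pdf"
        local_name = f"UCE_Vol{vol}_No{issue}.pdf"  # hypothetical local filename
        print("Fetching", url)
        urllib.request.urlretrieve(url, local_name)
        tcm += 1
```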


Dialogue in lectures

This is not a post on whether the lecture is A Good Thing or not. Lectures happen. PERIOD!

A paper by Anna Wood and colleagues at the Edinburgh PER group, along with a subsequent talk by Anna at Moray House has gotten me thinking a lot over the last year about dialogue and its place in all of our interactions with students. The literature on feedback is replete with discussion on dialogue, sensibly so. The feedback cycle could be considered (simplistically) as a conversation: the student says something to the teacher in their work; the teacher says something back to the student in their feedback. There’s some conversation going on. Feedback literature talks about how this conversation continues, but what about the bit before this conversation begins?

Not a monologue

The spark from Wood’s work for me was that lectures are not a monologue. She is considering active lectures in particular, but cites Bamford* who gives a lovely overview of the nature of conversation in lectures in general. Bamford presents the case that lectures are not a monologue, but are a conversation. Just as in the feedback example above, two people are conversing with each other, although not verbally. In a lecture, the lecturer might ask: “Is that OK?”. An individual student might respond inwardly “Yes, I am getting this” or “No, I haven’t a freaking clue what is going on and when is coffee”. A dialogue happened. Wood’s paper discusses these vicarious interactions – a delicious phrase describing the process of having both sides of the conversation; an internal dialogue of sorts. She describes how this dialogue continues in active lectures, but sadly there is only one Ross Galloway, so let’s think about how this conversation might continue in lectures given by us mere mortals. How can we help and inform these vicarious interactions?

Developing a conversation

A problem you will by now have identified is that “Is that OK?” and its retort isn’t much of a conversation. So how can we continue it? My intention is to consider conversation starters in lectures that foster a sense in each individual student that they are having a personal conversation with the lecturer at points during the lecture, and that incorporate guides for the student to continue this conversation after the lecture, up to the point that they submit their work, prompting the feedback conversation we started with above.

In Wood’s talk, she mentioned specific examples. The lecturer would ask something like: “Is 7 J a reasonable answer?” A problem with “Is that OK?” is that it is too broad. It’s difficult to follow up the conversation specifically as it likely ends with yes or no.

How about a lecturer asks: “Why is this smaller than…?” You’re a student, and you’re listening. Why is it smaller? Do you know? Yes? No? Is it because…? Regardless of your answer, you are waiting for the response. You think you know the answer, or you know you don’t.

If we are to take dialogue seriously, then the crucial bit is what happens next. Eric Mazur will rightly tell us that we should allow discussion with peers about this, but we are mortals, and want to get on with the lecture. So how about the conversation goes something like this:

“Why is this smaller than…?”

[pause]

You are a student: you will have an answer. You know, you think you know, you don’t know, or you don’t know what’s going on. Either way, you will have some response.

The lecturer continues:

“For those of you who think…”

The lecturer responds with a couple of scenarios. The conversation continues beyond a couplet.

Did you think of one of these scenarios? If so, the lecturer is talking to you. Yes, I did think that, and now I have it confirmed that I am right. Or: yes, I did think that, so why is it wrong?

The lecturer can continue:

“While it makes sense to think that, have a look in reference XYZ for a bit more detail”.

The lecturer thus concludes this part of the conversation. A dialogue has happened, and each student knows either that they have a good idea of what is going on, that they don’t but know where to follow up this issue, or that they haven’t a clue what is going on. Whichever the case, there is some outcome, and some action prompted. Indeed one could argue that this prompted action (refer to reference) is a bridge between the lecture and tutorial – I checked this reference but don’t understand – and so the conversation continues there.

This all seems very obvious, and maybe everyone else does this and didn’t tell me. My lectures tend to have lots of “Is that OK?” type questions, but I do like this idea of a much more purposeful design to structuring the conversation with a large class. I should say that this is entirely without research base beyond what I have read, but I think it would be very empowering for students to think that a lecturer is aiming to have a conversation with them.


*Bamford is cited in Wood’s paper and I got most of it on Google Books.


Revising functional groups with lightbulb feedback

I’m always a little envious when people tell me they were students of chemistry at Glasgow during Alex Johnstone’s time there. A recent read from the Education in Chemistry back-catalogue has turned me a shade greener. Let me tell you about something wonderful.

The concept of working memory is based on the notion that we can process a finite number of new bits in one instance, originally thought to be about 7, now about 4. What these ‘bits’ are depends on what we know. So a person who only knows a little chemistry will look at a complex organic molecule and see lots of carbons, hydrogens, etc joined together. Remembering it (or even discussing its structure/reactivity) would be very difficult – there are too many bits. A more advanced learner may be able to identify functional groups, where a group is an assembly of atoms in a particular pattern; ketones for example being an assembly of three carbons and an oxygen, with particular bonding arrangements. This reduces the number of bits.

Functional groups are important for organic chemists as they will determine the reactivity of the molecule, and a first challenge for novices is to be able to identify the functional groups. In order to help students practise this, Johnstone developed an innovative approach (this was 1982): an electronic circuit board.

Functional Group Board: black dots represent points where students needed to wire from name to example of functional group

The board was designed so that it was covered with a piece of paper listing all functional groups of interest on either side, and then an array of molecules in the middle, with functional groups circled. Students were asked to connect a lead from the functional group name to a matching functional group, and if they were correct, a lightbulb would flash.

A lightbulb would flash. Can you imagine the joy?!

Amide back-up card

If the bulb didn’t flash, “back-up cards” were available so that students could review any groups they had connected incorrectly, before being directed back to the board.

The board was made available to students in laboratory sessions, and they were just directed to play with it in groups to stimulate discussion (and so as “not to frighten them away with yet another test”). Thus students were able to test out their knowledge, and if incorrect they had resources to review and re-test. Needless to say the board was very popular with students, such that more complex sheets were developed for medical students.

Because this is 1982 and pre-… well, everything, Johnstone offers instructions for building the board, developed with the departmental electrician. Circuit instructions for a 50 × 60 cm board were given, along with details of mounting various plans of functional groups onto the pegboard for assembly. I want one!
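(If, like me, you lack a departmental electrician, the board’s logic is easy to play with in software. Here is a toy sketch in Python of how I read the design – a correct connection “flashes the bulb”, a wrong one sends you to the back-up card. The group–example pairs are my own illustrations, not Johnstone’s originals:)

```python
# Toy analogue of the matching board: connect a functional group name to an
# example molecule; a correct match "flashes the bulb", a wrong one points
# to the relevant back-up card. (Pairs are illustrative, not Johnstone's.)
FUNCTIONAL_GROUPS = {
    "ketone": "propanone",
    "alcohol": "ethanol",
    "carboxylic acid": "ethanoic acid",
    "amide": "ethanamide",
}

def connect(group: str, molecule: str) -> None:
    if FUNCTIONAL_GROUPS.get(group) == molecule:
        print("*flash*")  # the lightbulb moment
    else:
        print(f"No light - review the {group} back-up card, then try again.")

connect("ketone", "propanone")  # *flash*
connect("amide", "ethanol")     # off to the back-up card
```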

 

Reference

A. H. Johnstone, K. M. Letton, J. C. Speakman, Recognising functional groups, Education in Chemistry, 1982, 19, 16-19. RSC members can view archives of Education in Chemistry via the Historical Collection.


What is the purpose of practical work?

I have been reading quite a lot about why we do practical work. Laboratory work is a core component of the chemistry (science) curriculum but its ubiquity means that we rarely stop to consider its purpose explicitly. This leads to many problems. An interesting quote summarises one:

One of the interesting things about laboratories is that there has never been definite consensus about those serious purposes. Perhaps that is why they have remained popular: they can be thought to support almost any aim of teaching.1

Even within institutions, where there might be some prescription of what the purpose is in broad terms, different faculty involved in the laboratory may have different emphases, and subsequently the message about what the purpose of practical work is differs depending on who is running the lab on a given day.2

This matters for various reasons. The first is that if there is confusion about the purpose of practical work, then everyone involved will place their attention onto the part that they think is most important. Academics will likely consider overarching goals, with students developing scientific skills and nature of science aspects.3 Demonstrators will think about teaching how to use instruments or complete techniques. Students will follow the money, and focus on the assessment, usually the lab report, which means their time is best utilised by getting the results as quickly as possible and getting out of the lab.4 Everybody’s priority is different because the purposes were never made clear. As in crystalline.

The second reason for thinking about purposes is that without an explicit consideration of what the purposes of practical work are, it is difficult to challenge these purposes and consider their value. How many lab manuals open up with a line similar to: “The purpose of these practicals is to reaffirm theory taught in lectures…”? The notion that the purpose of practicals is in somehow supplementing taught material in lectures has long come in for criticism, and has little basis in evidence. Laboratories are generally quite inefficient places to “teach” theory. Woolnough and Allsop argued vehemently for cutting the “Gordian Knot” between theory and practical, arguing that practical settings offered their own unique purpose that, rather than being subservient to theory work, complemented it.5 Kirschner picks this argument up, describing science education in terms of substantive structure and syntactical structure. The former deals with the knowledge base of science, the latter with the acts of how we do science.6 Anderson had earlier distinguished between “science” and “sciencing”.7

Discussion therefore needs to focus on what this syntactical structure is – what is “sciencing”? Here, the literature is vast, and often contradictory. To make a start, we look to Johnstone who, with his usual pragmatism, distinguished between aims of practical work (what we set out to do) and objectives of practical work (what the students achieve).8 With this in mind, we can begin to have some serious discussion about what we want practical work to achieve in our curricula.


References

  1.  White, R. T., The link between the laboratory and learning. International Journal of Science Education 1996, 18 (7), 761-774.
  2. Boud, D.; Dunn, J.; Hegarty-Hazel, E., Teaching in laboratories. Society for Research into Higher Education & NFER-Nelson Guildford, Surrey, UK: 1986.
  3. Bretz, S. L.; Fay, M.; Bruck, L. B.; Towns, M. H., What faculty interviews reveal about meaningful learning in the undergraduate chemistry laboratory. Journal of Chemical Education 2013, 90 (3), 281-288.
  4. (a) DeKorver, B. K.; Towns, M. H., General Chemistry Students’ Goals for Chemistry Laboratory Coursework. Journal of Chemical Education 2015, 92 (12), 2031-2037; (b) DeKorver, B. K.; Towns, M. H., Upper-level undergraduate chemistry students’ goals for their laboratory coursework. Journal of Research in Science Teaching 2016, 53 (8), 1198-1215.
  5. Woolnough, B. E.; Allsop, T., Practical work in science. Cambridge University Press: 1985.
  6. Kirschner, P. A., Epistemology, practical work and academic skills in science education. Science & Education 1992, 1 (3), 273-299.
  7. Anderson, R. O., The experience of science: A new perspective for laboratory teaching. Teachers College Press, Columbia University: New York, 1976.
  8. Johnstone, A. H.; Al-Shuaili, A., Learning in the laboratory; some thoughts from the literature. University Chemistry Education 2001, 5 (2), 42-51.

 


Why do academics use technology in teaching?

This week is All Aboard week in Ireland, aimed at “Building Confidence in Digital Skills for Learning”. I am speaking today in the gorgeous city of Galway on this topic, and came across this paper in a recent BJET which gives some useful context. It summarises interviews with 33 Australian academics from various disciplines, on the topic of why they used technology in assessment. While the particular lens is on assessment, I think there are some useful things to note for those espousing the incorporation of technology generally.

Four themes emerge from the interviews

The first is that there is a perceived cost-benefit analysis at play; the cost of establishing an assessment process (e.g. quizzes) was perceived to be offset by the benefit that it would offer, such as reducing workload in the long-run. However, some responses suggest that this economic bet didn’t pay off, and that lack of time meant that academics often took quick solutions or those they knew about, such as multiple choice quizzes.

The second theme is that technology was adopted because it is considered contemporary and innovative; this suggests a sense of inevitability about using tools simply because they are there. A (mildly upsetting) quote from an interview is given:

“It would have been nice if we could have brainstormed what we wanted students to achieve, rather than just saying ‘well how can ICT be integrated within a subject?’”

The third theme was one around the intention to shape students’ behaviour – providing activities to guide them through learning. There was a sense that this was expected and welcomed by students.

Finally, at the point of implementation, significant support was required, which often wasn’t forthcoming, and because of this, and other factors, intentions had to be compromised.

The authors use these themes to make some points about the process of advocating and supporting those integrating technology. I like their point about “formative development” – rolling out things over multiple iterations and thus lowering the stakes. Certainly my own experience (in hindsight!) reflects the benefit of this.

One other aspect of advocacy that isn’t mentioned but I think could be is to provide a framework upon which you hang your approaches. Giving students quizzes “coz it helps them revise” probably isn’t a sufficient framework, and nor is “lecture capture coz we can”. I try to use the framework of cognitive load theory as a basis for a lot of what I do, so that I have some justification for when things are supported or not, depending on where I expect students to be at in their progression. It’s a tricky balance, but I think such a framework at least prompts consideration of an overall approach rather than a piecemeal one.

There’s a lovely graphic from All Aboard showing lots of technologies, and as an awareness tool it is great. But there is probably a huge amount to be done in terms of digital literacy, regarding not just the how, but also the why, of integrating technology into our teaching approaches.

All Aboard map of digital technologies – click to go to the All Aboard webpage

 


On learning “gains”

I have heard the term learning gain on and off but considered that it was just an awkward phrase to describe learning, a pleonasm that indicated some generally positive direction for those needing reassurance. And I am sure I have read about the concept of learning gain in the past but never took too much notice, likely too busy checking Twitter to focus on the subject at hand. But it was through Twitter this week that I was released from my gains-free bubble, and enough neurons aligned for me to grasp that learning gains is actually A Thing.

And what a horrendous Thing it is. HEFCE have a website dedicated to it and define it as “an attempt to measure the improvement in knowledge, skills, work-readiness and personal development made by students during their time spent in higher education”. If that isn’t clear, then how about this: a 125-page document published by RAND for HEFCE defines learning gain (LG) as

LG = B – A’

They put it like this too – an equation on a separate line so that the scientists can nod along but the humanities people can just skip to the next piece of text. Good scientists should note that the variables should be italicised. Neither route offers any solace. This seemingly placid text masks one of the most remarkably awful ideas – gosh darn it THE most remarkable fuckwittery – I have come across in a long time.

Unable to monitor very much about teaching and learning in higher education, some genius has dreamt up this clanger. Rather than wondering why it is difficult (impossible) to achieve this task, the approach instead is to quantify with a number how much individual students have learned each year. A student will be measured at the start of the year, and again at the end of the year, and the difference is… well it’s obvious from the formula – there’ll be gainz. Each student’s own gain will be individual, so these poor lab rats exposed to this drudge are going to have to sit in an interview in a few years’ time and explain to someone what this garbage is about. Not only that, but this insidious beast is going to creep into every aspect of students’ learning. Degree transcripts will list a learning gain of 3.4 in career awareness and 2.9 in self-sufficiency. You were on a student society? Add 0.5 to your learning gain. Is this who we are now? How many centuries have universities existed, and now, at our supposed pinnacle, we have been reduced to this tripe. Don’t trust experts, Michael Gove said. Lots of very clever people are involved in this, and while I hope their intentions are good, I don’t trust the motives behind it. Writing an introduction to a special issue of HERD on measurement in universities, the editors cited Blaise:

A measurement culture takes root as if it is education because we are no longer sufficiently engaged with what a good education is. (HT)

HEFCE have funded several universities to wheel out creaking wooden horses, under the guise of pilot studies. It’s a cruel move, throwing some breadcrumbs at an education sector starved of research money to do anything. A horrible irony it is that, in the name of ensuring value for money for taxpayers, money is being spent on this. Don’t those involved see the next act in this tragedy? Institutional comparisons of average learning gains, far removed from the individual it was supposed to be personalised to, are easily computed. And numbers, whether they have meaning or not, are very powerful.

So when your student is sitting with you at the end of the year, and you have to measure their career awareness improvement, and your institution is under pressure as its gains are not as good as their protein-fuelled competitor, most likely a private operator in the dystopian near future the Tories are setting up, you might think again about their ability to discuss career pathways. Worse still is the difficulty in measuring a subject gain, or the even greater idiocy of using a proxy to measure something that can’t be meaningfully expressed as a number in the first place.

Another problem with measuring anything half as complicated as this, as a century of predictive studies has found, is that it is very easy to measure the large changes and the little changes, but the mass of students in the middle is mostly a muddle. And with LG = B – A’, there will be lots of muddle in the middle. What about the very bright student who actually knew most of the content at the start of the year because of prior schooling, but was shy and wanted to settle in for the first year of university? What about the one who enjoys just getting by and will focus in Year 3, but in the meantime has to be subject to self-efficacy tests to demonstrate improvement? What about the geniuses? What about first-in-family students? What about the notion that university is more than a number, and the very darn fact that these are not things that we should try to measure?

I want my students to graduate mature and happy and adult and functioning members of society. If they learn some thermodynamics along the way, then that’s great. I can measure one of these things. It will be a sorry state of affairs if we have to measure others, and our future graduates, burdened with this heap of gunge won’t thank us for it.


Rounding up the peer review and digital badge project

Marcy Towns’ lovely paper from 2015 described the use of digital badges in higher education chemistry, specifically for assessment of laboratory skills. This work was important. The concept of badges had been around for a while. When I first came across them while doing an MSc in E-Learning back in 2010, laboratory work seemed an obvious place to use them. But while the technology promised a lot, there weren’t feasible systems in place to do it. And what exactly would you badge, anyway? And would undergraduate students really take a digital badge seriously?

Towns’ work was important for several reasons. On a systems level, it demonstrated that issuing badges to a very large cohort of students in undergraduate laboratories was feasible. At Purdue, they used an in-house system called Passport to manage the badging process, from submission of videos to viewing online for assessment, and subsequent issuing of the badges. But more importantly for me, Towns’ work reasserted the notion of laboratory skills and competencies as something worth assessing in their own right. Skills weren’t being implicitly assessed via the quality of results or the yield of a reaction; with this approach – videoing a student while they demonstrate a technique – the skill itself was directly assessed. This is not something you come across very often in the education literature (some notable exceptions are pointed out in our paper).

This work answered two of the three questions I had about badges – there are systems in place (although as I discovered, Blackboard just about manages to do this, with a lot of creaking).  And there is scope for badging laboratory skills – the concept of badging is built on demonstrable evidence, and videoing techniques is part of this. Whether students take badging seriously I think still needs to be answered. My own sense is that there will need to be a critical mass of badges – an obvious ecosystem where it is clear to students how they can progress, and our own work in this regard is extending into more advanced techniques.

Incorporating peer review

One of the great insights Towns and her students shared at a workshop at BCCE last summer was the notion of narration in demonstrating techniques. Early videos in their initial pilot studies were eerily silent, and it was difficult for them to know what the students’ understanding was as they completed a technique – why were they doing things in a particular way? So they built narration into the requirements of the demonstration. I think this is one of those things that is obvious in hindsight, but knowing it up front in our own implementation was invaluable.

We opted to have a system where students would video each other rather than be videoed by a demonstrator, with the narration being, in effect, one peer telling the other how they were doing the technique. To facilitate a review of this at the end, the demonstrators in the project came up with the idea of a peer observation sheet (they designed them too – bonus points!). The whole set-up was designed to encourage dialogue – genuine interactions discussing the experimental technique, and allowing for feedback on this based on the guidelines presented in the peer observation sheets. These acted as a framework on which the lab was run. Lord knows chemists like instructions to follow.

Feedback is then given in situ, and indeed if the student demonstrating feels after discussion that they would like to video the technique again, they can. This notion of quality, or exemplary work, is underscored by the exemplars provided to students in advance: pre-laboratory videos dedicated to correct display of technique. This whole framework is based around Sadler’s design for feedback, discussed… in the paper!

We’ve documented our ongoing work on the project blog, and the paper summarising the design and evaluation is now available in CERP. It is part of the special issue on transferable skills in the curriculum which will be published in the autumn, primarily as we felt it developed digital literacy skills in addition to the laboratory work; students were required to submit a link to the video they hosted online, rather than the video itself. This gives them control over their digital footprint.

Resources

Resources – digital badges, peer observation sheets, links to exemplar videos – are all freely available on the project website.  I really think there is great scope for badges, and look forward to where this project will go next!


Using the Columbo approach on Discussion Boards

As part of our ongoing development of an electronic laboratory manual at Edinburgh, I decided this year to incorporate discussion boards to support students doing physical chemistry labs. It’s always a shock, and a bit upsetting, to hear students say that they spent very long periods of time on lab reports. The idea behind the discussion board was to support them as they were doing these reports, so that they could use the time they were working on them in a more focussed way.

The core aim is to avoid the horror stories of students spending 18 hours on a report, because if they are spending that time on it, much of it must be figuring out what the hell it is they are meant to be doing. Ultimately, a lab report is a presentation of some data, usually graphically, and some discussion of the calculations based on that data. That shouldn’t take that long.

Setting Up

The system set-up was easy. I had asked around and heard some good suggestions for external sites that did this well (I can’t remember it now, but one suggested by colleagues in physics allowed questions to be up-voted). But I didn’t anticipate so many questions that I would only be able to answer the most pressing, I didn’t want “another login”, and so I just opted for Blackboard’s native discussion board. Each experiment got its own forum, along with a forum for general organisation issues.

Use

A postgrad demonstrator advised me to allow the posts to be made anonymously, and that seemed sensible. Nothing was being graded, and I didn’t want any reticence about asking questions. Even anonymously, some students apologised for asking what they deemed “silly” questions, but as in classroom scenarios, these were often the most insightful. Students were told to use the forum for questions, and initially, any questions by email were politely redirected to the board. In cases close to submission deadlines, I copied the essential part of the question, and pasted it to the board with a response. But once reports began to be due, the boards became actively used. I made sure in the first weekend to check in too, as this was likely going to be the time that students would be working on their reports.

The boards were extensively used. About 60 of our third years do phys chem labs at a time, and they viewed the boards over 5500 times in a 6-week period. Half of these views were on a new kinetics experiment, which tells me as organiser that I need to review that. The second years have only just begun labs, and already 140 of them have viewed the board 2500 times in a two-week period. The number of posts is of course nowhere near this, suggesting that most viewers are “lurkers”, and probably most queries are common. Since students can post anonymously, I have no data on what proportion of students were viewing the boards. Perhaps it is one person going in lots, but given the widespread viewership across all experiments, my guess is it isn’t. The boards were also accessible to demonstrators (who correct all the reports), but I’ve no idea if they looked at them.

Reception

The reception from students has been glowing, so much so that it is the surprise “win” of the semester. (Hey, look over here at all these videos I made… No? Okay then!) Students have reported at school council, staff student liaison committees, anecdotally to me and other staff that they really like and appreciate the boards. Which of course prompts introspection.

Why do they like them? One could say that of course students will like them, I’m telling them the answer. And indeed, in many cases, I am. The boards were set up to provide clear guidance on what is needed and expected in lab reports. So if I am asked questions, of course I provide clear guidance. That mightn’t always be the answer, but it will certainly be a very clear direction to students on what they should do. But in working through questions and answers, I stumbled across an additional aspect.

One more thing

Me, when asked an electrochemistry question

Everyone’s favourite detective was famous for saying: “oh, just one more thing”. I’ve found in the lab that students are very keen and eager to know what purpose their experiment has in the bigger context, where it might be used in research, something of interest in it beyond the satisfaction of proving, once again, some fundamental physical constant. And in honesty, it is a failing on our part and in the “traditional” approach that we don’t use this opportunity to inspire. So sometimes in responding to questions, I would add in additional components to think about – one more thing – something to further challenge student thought, or to demonstrate where the associated theory or technique in some experiment we were doing is used in research elsewhere. My high point was when I came across an experiment that used exactly our technique and experiment, published in RSC Advances this year. This then sparked the idea of how we can develop these labs more, the subject of another post.

Again I have no idea if students liked this or followed up these leads. But it did ease my guilt a little that I might not be just offering a silver spoon. It’s a hard balance to strike, but I am certainly going to continue with discussion boards for labs while I work it out.


A tour around Johnstone’s Triangle

In a small laboratory off the M25 is a man named Bob. And Bob is a genius at designing and completing reactions on a very small scale. Bob is greatly helped by Dr Kay Stephenson, Mary Owen and Emma Warwick.

I was invited to go down to CLEAPSS to see Bob in action, and try out for myself some of the microscale chemistry he has been developing. I was interested to see it because of a general interest in laboratory experiments and how we can expand our repertoire. But I found out a lot more than just smaller versions of laboratory experiments.

Safety and cost considerations first piqued Bob’s interest in microscale. The traditional laboratory Hofmann voltameter costs about £250, but the microscale version, including ingenious three-way taps to syringe out the separated gases, costs about £50. Thoughts about how to do a reduction of copper oxide safely led him to use a procedure that avoided traditional problems with explosions. There’s also a very neat version using iron oxide, incorporating the use of a magnet to show that iron forms.

Electrochemical production of chlorine leading to subsequent production of iodine and bromine. Copper crystals form on the electrode.

Bob promised to show me 93 demonstrations in a morning (“scaled back from 94!”) and I worried on my way there that I would have to put on my polite smile after a while. But actually time flew, and as we worked through the (less than 93) experiments, I noticed something very obvious. This isn’t just about safety and cost. It has deep grounding in the scholarship of teaching and learning too.

Cognitive Load

What I remember from the session is not the apparatus, but the chemistry. Practical chemistry is difficult because we have to worry about setting up apparatus, and this can act as a distraction from the chemistry involved. However, the minimal – and often absent – apparatus meant that we were just doing and observing chemistry. This particularly struck me when we were looking at conductivity measurements, using a simple meter made with carbon fibre rods (from a kite shop). This, along with several other experiments, used an ingenious idea of instruction sheets within polypropylene pockets (Bob has thought a lot about contact angles). The reaction beaker becomes a drop of water, and it is possible to explore some lovely chemistry: pH indicator colours, conductivity, precipitation reactions, producing paramagnetic compounds, all in this way. It’s not all introductory chemistry; we discussed a possible experiment for my third year physical chemists and there is lots to do for a general chemistry first year lab, including a fabulously simple colourimeter.

Designing a universal indicator.

Johnstone’s Triangle

One of the reasons chemistry is difficult to learn is that we have multiple ways of representing it. We can describe things as we view them at the macroscopic scale – a white precipitate forms when we precipitate out chloride ions with silver ions. We can describe things at the atomic scale, describing the ionic movement leading to the above precipitation. And we can use symbolism, for example representing the ions in a diagram, or talking about the solubility product equation. When students learn chemistry, moving between these “domains” is an acknowledged difficulty. These three domains were described by Alex Johnstone, and we now describe this as Johnstone’s triangle.

Johnstone’s triangle (from U. Iowa Chemistry)

One of my observations from the many experiments I carried out with Bob was that we can begin to see these reactions happening. The precipitation reactions took place over about 30 seconds as the ions from salts at each side migrated through the droplet. Conductivity was introduced into the (assumed un-ionised) water droplet by shoving in a grain or two of salt. We are beginning to jump across representations visually. Therefore what has me excited about these techniques is not just laboratory work, but activities to stimulate student chatter about what they are observing and why. The beauty of the plastic sheets is that they can just be wiped off quickly with a paper towel before continuing on.

Reaction of ammonia gas (centre) with several solutions including HCl with universal indicator (top right) and copper chloride (bottom right)

Bob knew I was a schoolboy chemist at heart. “Put down that book on phenomenology” I’m sure I heard him say, before he let me pop a flame with hydrogen and reignite it with oxygen produced from his modified electrolysis apparatus (I mean who doesn’t want to do this?!). I left the room fist-bumping the air after a finale of firing my own rocket, coupled with a lesson in non-Newtonian liquids. And lots of ideas to try. And a mug.

I want a CLEAPSS set to be developed in time for Christmas. In the meantime, you can find lots of useful materials at: http://science.cleapss.org.uk/.


Using digital technology to assess experimental science

The following was a pre-conference piece submitted to a Royal Society conference on assessment in practical science.

Summary

A modern laboratory education curriculum should embrace digital technologies with assessment protocols that enable students to showcase their skills and competences. With regards to assessment, such a curriculum should:

  • incorporate the digital domain for all aspects related to experimental work: preparations, activities, reflections;
  • provide a robust and valid assessment framework but with flexibility for individuality;
  • emphasise to students the role of documenting evidence in demonstrating skills and competences by means of micro-accreditation, such as digital badges.

This paper summarises how some digital technologies can address the above points.

How can research into the use of digital technology in the assessment of experimental science improve the validity of assessment in the short, medium and long term?

Re-shifting the emphasis of assessment by means of e-assessment

Our use of digital technologies in everyday life has increased substantially in the last two decades. In contrast, laboratory education has remained stubbornly paper-based, with laboratory notebooks at the core of assessment protocols. This emphasis on a post-hoc report of work done, rather than a consideration of the work itself, means that the value of laboratory work has been distorted in favour of the process of preparing laboratory reports. Experimental work, and the demonstration of experimental skills and competences, is of secondary importance.

There are good reasons why emphasis has historically been on the laboratory report instead of laboratory work. Directly assessing experimental work, and indeed any input students have to the planning and completion of experimental work, is subjective. Issues also arise if laboratory work is completed in groups, for either pedagogic or resource reasons. Assigning individual marks is fraught with difficulty.

Digital technologies can provide a basis to address many of the concerns regarding validity that the above issues raise, and provide an opportunity to reposition what is considered to be important in terms of the goals and purpose of experimental science.

The completion of experimental work typically involves:

  • Preparation: planning and preparing for work and making decisions on experimental approaches to be taken;
  • Action: learning how to carry out work competently, demonstrating competence in experimental approaches, and accurately recording data and/or observations;
  • Reflection: drawing conclusions from data, reporting of findings, and evaluation of approaches taken.

Incorporating the digital domain for all aspects of experimental work

Wikis and electronic laboratory notebooks are online document-editing spaces that enable individual contributions to be documented and reviewed. Such platforms have been shown to allow the documentation of student thoughts and contributions to work, and as such they provide an excellent basis for recording the entire process (preparation, action, reflection) the student engages with while completing experimental work. Preparation can include a description of what equipment will be used and why, or thoughts on the purpose of the experiment. Action can be documented by recording experimental work completed, with the inclusion of data or observations in a variety of multimedia formats (text/photos/video/audio). Reflection can allow for a richer form of the typical lab report. In practice this means asking students to consider and review their experimental approach, so that the emphasis shifts away from the “right answer” (an often-cited criticism of students coming through a school laboratory curriculum) and more towards a consideration of the approach taken.

Using traceability as a basis for validity of assessment

Validity is a core concern for a national high-stakes examination. Research to date on wikis has pointed to the advantages offered, including that student contributions are date stamped, and each individual contribution is logged. Overall contributions to work can be tracked. Rubrics have been effectively used to assess student laboratory skills, although the compilation of rubrics needs a considerable investment in order to document the desired goals and expectations of any particular curriculum experiment or inquiry so that they can be easily assessed. The value of a more flexible approach to documenting science work using wikis and electronic lab notebooks allows scope for individuality within an overall framework of requirements. However this is an area that needs considerable and ongoing research.

There is a body of research discussing the use of virtual laboratories for mimicking student experimentation, as they provide for more controlled and hence robust assessment protocols. These should be resisted, as they remove students’ exposure to the situational and psychomotor demands that being in the laboratory brings. While virtual laboratories may play some role in summative assessment – for example in decision making – they will likely act as a distraction from the necessary changes required to engage with and document real hands-on work, as they will again shift the focus of experimental science away from actual laboratory work.

Emphasis on experimental science and documenting competences

An advantage of refocusing on the documentation of processes is that there is an opportunity for students to showcase their own experimental skills. Digital badges have emerged as a way to accredit these, in what is known as “micro-accreditation”. Digital badges mimic the idea of Guides and Scouts badges by acknowledging achievements and competences in a particular domain. Examples could include badging students’ experimental skills (for example, badges for pipetting, titrating, etc.) and higher-level badges, for which students would need to draw on a range of competences already awarded and apply them to a particular scenario (for example, an overall analysis where students would need to design the approach and draw on their technical competency in pipetting and titration). This enables students to document their own progress in an ongoing way, and allows them to reflect on any activities needed to complete a full set of badges on offer. This is an exciting area as it offers significant expansion across the curriculum. Mobile learning platforms will open up new and interesting ways to develop these approaches.
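To make the micro-accreditation idea concrete: a badge is, underneath, just a small package of metadata tying an earner to criteria and evidence. A rough sketch in Python, loosely in the spirit of the Open Badges specification – every name, field and URL here is hypothetical rather than a real issuing system:

```python
# Sketch of the metadata behind one skill badge and its award ("assertion").
# Loosely modelled on the Open Badges idea; all values are hypothetical.
pipetting_badge = {
    "name": "Pipetting",
    "description": "Correct use of a volumetric pipette, demonstrated and narrated on video.",
    "criteria": "https://example.edu/badges/pipetting/criteria",
    "issuer": "Department of Chemistry, Example University",
}

assertion = {
    "badge": pipetting_badge,
    "recipient": "student@example.edu",
    "evidence": "https://example.edu/videos/abc123",  # link to the student's video
    "issuedOn": "2017-05-30",
}

# A higher-level badge could then simply require a set of prior assertions:
prerequisites_for_overall_analysis = {"Pipetting", "Titrating"}
```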

Conclusions

Changing from paper-based to electronic media is not without difficulties. In terms of short-, medium- and long-term objectives, an initial focus should begin with promoting the possibilities of documenting scientific work in school through the use of multimedia. This will develop a culture and expertise around the use of technical skills, and work towards a medium-term goal of developing a basis for documenting work in an online platform instead of on paper – emphasising the value of documenting evidence of processes. This can be complemented with the development of a suite of digital badges associated with expected experimental techniques and protocols. In the long term, this allows the consideration of assessment of laboratory work via wikis and electronic lab notebooks, using appropriate rubrics, which allow students to genuinely and accurately showcase their competence in experimental science in a much more meaningful and engaging way.


Video: e-portfolios for documenting learning in the 21st century

Here is a talk I gave to school teachers at an NCCA event in Dublin Castle earlier this year. It discusses:

  1. Why and how should we enable students to create digital artefacts to document their learning?
  2. Practical approaches to doing this – discussing digital badges and wikis in particular.
[Embedded video of the talk, hosted on Vimeo]


#ViCEPHEC16 – curly arrows and labs

The annual Variety in Chemistry Education/Physics Higher Education conference was on this week in Southampton. Some notes and thoughts are below.

Curly arrows

Physicists learned a lot about curly arrows at this conference. Nick Greeves’ opening keynote spoke about the development of ChemTube3D – a stunning achievement: over 1000 HTML pages, mostly developed by UG students. The news for those who know the site is that 3D curly arrow mechanisms are now part of the reaction mechanism visualisations – really beautiful visualisations of changing orbitals as a reaction proceeds, for 30+ reactions – along with lovely visualisations of MOFs, direct links to/from various textbooks, and an app at the prototype stage. Nick explained that this has all been developed with small amounts of money from various agencies, including the HEA Physical Sciences Centre.

Mike Casey from UCD spoke about a resource at a much earlier stage of development; an interactive mechanism tutor. Students can choose a reaction type and then answer the question by drawing the mechanism – based on their answer they receive feedback. Version 2 is on the way with improved feedback, but I wondered if this feedback might include a link to the appropriate place in Chemtube3D, so that students could watch the associated visualisation as part of the feedback.

In the same session Robert Campbell spoke about his research on how A-level students answer organic chemistry questions. My understanding is that students tend to use rules of mechanisms (e.g. a primary alkyl halide means it’s always SN2) without understanding the reason why, hence promoting rote learning. In a nice project situated in the context of cognitive load theory, Rob used Livescribe technology to investigate students’ reasoning. Looking forward to seeing this research in print.

Rob’s future work alluded to considering the video worked answers described by Stephen Barnes, also for A-level students. These demonstrated a simple but clever approach: using questions resembling A-level standard, asking students to complete them, providing video worked examples so students could self-assess, and then getting them to reflect on how they can improve. David Read mentioned that this model aligned with the work of Sadler, worth a read.

Laboratory work

Selfishly, I was really happy to see lots of talks about labs on the programme. Ian Bearden was the physics keynote, and he spoke about opening up the laboratory course – meaning the removal of prescriptive procedures, allowing students to develop their own. Moving away from pure recipe is of course music to this audience’s ears and the talk was very well received. But you can’t please everyone – I would have loved to hear much more about what was done and the data involved, rather than the opening half of the talk about the rationale for doing so. A short discussion prompted this tweet from Felix Janeway, something we can agree on! But I will definitely be exploring this work more. Ian mentioned that this approach is also part of physics modules taught to trainee teachers, which sounded like a very good idea.

Jennifer Evans spoke about the prevalence of pre-labs in UK institutions, following on from the Carnduff and Reid study in 2003. Surprisingly, many don’t have any form of pre-lab work. It will be interesting to get a sense of what pre-lab work involves – is it theory or practice? Theory and practice were mentioned in a study from Oxford presented by Ruiqi Yu, an undergraduate student. This showed mixed messages on the purpose of practical work, surely something the academy needs to agree on once and for all. There was also quite a nice poster from Oxford involving a simulation designed to teach experimental design, accessible at this link. This was also built by an undergraduate student. Cate Cropper from Liverpool gave a really useful talk on tablets in labs – exploring the nitty gritty of how they might work. Finally on labs, Jenny Slaughter gave an overview of Bristol ChemLabS, which is neatly summarised in this EiC article, although the link to the HEA document has broken.

Other bites

  • Gwen Lawrie (via Skype) and Glenn Hurst spoke about professional development; Gwen mentioned this site she has developed with Madeline Schultz and others to inform lecturers about PCK. Glenn spoke about a lovely project on training PhD students for laboratory teaching – details here.  This reminds me of Barry Ryan‘s work at DIT.
  • Kristy Turner gave an overview of the School Teacher Fellow model at Manchester, allowing her to work both at school and university, with obvious benefits for both. Kristy looked forward to an army of Kristys, which would indeed be formidable, albeit quite scary. Even without that, the conference undoubtedly benefits from the presence of school teachers, as Rob’s talk, mentioned above, demonstrates.
  • Rachel Koramoah gave a really great workshop on qualitative data analysis. Proving the interest in chemistry education research, this workshop filled up quickly. The post-it note method was demonstrated, which was interesting and which I will certainly explore more, but I hope to tease out a bit more detail on the data reduction step. This is the benefit of this model – the participants reduce the data for you – but I worry that this might in turn lead to loss of valuable data.
  • Matthew Mears gave a great byte on the value of explicit signposting to textbooks using the R-D-L approach: Read (assign a reading); Do (assign questions to try); Learn (assign questions to confirm understanding). Matt said setting it up takes about 30 minutes and he has seen marked improvements in student performance in comparison to other sections of the course.
  • David Nutt won the best poster prize. His poster showed the results of eye-tracking experiments to demonstrate the value or not of an in-screen presenter. Very interesting results which I look forward to seeing in print.

The conference organisation was brilliant and thanks to Paul Duckmanton and Charles (Minion) Harrison for leading the organisation. Lots of happy but tired punters left on Friday afternoon.

I couldn’t attend everything, and other perspectives on the meeting with links etc can be found at links below. From Twitter, Barry Ryan’s presentation on NearPod seemed popular, along with the continuing amazingness of my colleagues in the Edinburgh Physics Education Research Group. One of their talks, by Anna Wood, is available online.


Getting ready to badge and looking for interested partners

Over the summer we have been working on a lab skills badging project. Lots of detail is on the project home site, but briefly this is what it’s about:

  • Experimental skills are a crucial component of student laboratory learning, but we rarely assess them, or even check them, formally. For schools, there is a requirement to show that students are doing practical work.
  • By implementing a system whereby students review particular lab techniques in advance of labs, demonstrate them to a peer while being videoed, review the technique with the peer using a checklist, and upload the video for assessment, we intend that students will be able to learn and perform the technique to a high standard.
  • The video can form part of students’ electronic portfolios, which they may wish to share in future (see this article for more on that).
  • The process is suitable for digital badging – awarding of an electronic badge acknowledging competency in a particular skill (think scout badges for… tying knots…).

Marcy Towns has a nice paper on this for pipetting and we are going to trial it for this and some other lab techniques.

Looking for interested parties to trial it out

I am looking for school teachers who would like to try this method out. It can be used to document any lab technique or procedure you like. You don’t necessarily need an exemplar video, but a core requirement is that you want to document students’ laboratory work formally, and acknowledge achievement in this work with a digital badge. We will provide the means to offer the badge, and exemplar videos if you need them, assuming they are within our stock. Interested teachers will be responsible for local implementation and assessment of quality (i.e. making the call on whether a badge is issued).

Yes I need help with badge design

This will be part of a larger project and there will be some research on the value and impact of the digital badges, drawing from implementation case studies. This will be discussed with individuals, depending on their own local circumstances.

So if you are interested, let’s badge! You can contact me at: michael.seery@ed.ac.uk to follow up.


What is the “education literature”?

Over on the Education in Chemistry blog, Paul MacLellan wrote an excellent article on reasons teachers don’t engage with education research, which is well worth a read. Speaking a few years ago, I used the analogy of a paddle-boat steamer when talking about the penetration of education research in HE. The paddles throw up some splashes as it sails over the vast quantity of water below. These splashes were meant to represent how many engage with research – taking on what they hear on the grapevine, Twitter, or CPD. It isn’t systematic.

I’ve spent a lot of time wondering about whether I should expect my colleagues to read education research, and on balance, I don’t think I should. The reason stems from the argument made about different audiences by Keith Taber in MacLellan’s article, and quantified by the featured commenter under his post. And I think we need some clarity about what we mean by education research literature.

Primary, secondary, and tertiary literature

New research in any field is iterative. We know a certain amount, and someone does some research to add a little more to our understanding. In publishing these research findings, we tend to summarise what was known before to situate the work in context, and add on the new bit. As Taber points out, education has the unique position of aiming to address two audiences: like any field, it addresses other education researchers in that field; but it also has a second audience of practitioners, who may wish to change some aspect of their teaching and are looking for “what works”. The trouble with the mixed audience is that the language and semantics for each are very different, leaving the practitioner feeling very frustrated. The featured comment under MacLellan’s blog documents this very well. The practitioner looking to improve faces a difficult challenge: they use some search engine with decent keywords and have to try to pick out some paper that will offer them nuggets. It really is a needle in a haystack (or a splash of water from the river boat…).

If asked for advice, I would rather suggest that such practitioners refer instead to secondary or tertiary literature. Secondary literature aims to summarise the research findings in a particular field. While it is still written with an audience of researchers from the field in mind, these reviews typically group the results of several individual studies into themes or overarching concepts, which can be useful to practitioners who wish to see “what might work” in their own context. MacArthur and Jones’ review on clickers (whose value I recall well) and my own review of flipped lectures in chemistry are examples of this type.

The audience shifts more fully when we move to tertiary literature. While there are still likely two audiences for education research, the emphasis of tertiary literature is on introducing the field to a wider readership. Books summarising teaching approaches are typically grounded in well-documented research, but unlike secondary sources, they are written for those wishing to find out about a topic at an introductory level, and the language is considerate of that wider audience. Think of Taber’s books on misconceptions, and the impact they have had. More recently, the web has offered us new forms of tertiary literature – blogs are becoming more popular as a way to disseminate the usefulness of research to a wider audience, and summaries such as that recently published by Tina Overton on factors to consider in teaching approaches can help introduce overarching research findings without the reader having to penetrate the original education research studies.

So should my colleagues read education research? I still don’t think so. A tourist to a new city wouldn’t read academic articles on transport infrastructure and architecture – they would just read the tourist guide. Of course it can be inspiring to read a case study or see what students in an individual situation experienced. But I would rather recommend secondary and tertiary sources to them if they are going to spend any valuable time reading.

And that means, in chemistry education’s case, we need a lot more of these types of publications. A recent J Chem Ed editorial suggested that they are thinking about promoting this type of publication, and any movement in that direction is welcome.

Planning a new book on laboratory education

Contracts have been signed, so I am happy to say that I am writing a book on chemistry laboratory education as part of the RSC’s new Advances in Chemistry Education series, due for publication in mid-2017.

I’ve long had an interest in lab education, since stumbling across David McGarvey’s “Experimenting with Undergraduate Practicals” in University Chemistry Education (now CERP). Soon after, I met Stuart Bennett, now retired from the Open University, at a European summer school. Stuart spoke about lab education and its potential affordances in the curriculum. He was an enormous influence on my thinking in chemistry education, and on practical work in particular. We’d later co-author a chapter on lab education for a book for new lecturers in chemistry published by the RSC (itself a good example of the benefits of European collaboration). My first piece of published education research was also based on laboratory work: a report in CERP on the implementation of mini-projects in the chemistry curriculum, completed with good friends and colleagues Claire Mc Donnell and Christine O’Connor. So I’ve been thinking about laboratory work for a long time.

Why a book?

A question I will likely be asking with increasing despair over the coming months is: why am I writing a book? To reaffirm to myself as much as anything else, and to remind me if I get lost on the way, the reasons are pretty straightforward.

My career decisions and personal interests over the last few years have meant that I have moved my focus entirely to chemistry education. Initially this involved sneaking in some reading between the covers of J. Mat. Chem. when I was meant to be catching up on metal oxide photocatalysis. But as time went on and thanks to the support of others involved in chemistry education, this interest became stronger. I eventually decided to make a break with chemistry and move into chemistry education research. (One of the nicest things for me personally about joining Edinburgh was that this interest was ultimately validated.)

So while my knowledge of the latest chemistry research is limited mainly to Chemistry World reports, one thing I do know well is the chemistry education research literature. And there is a lot of literature on laboratory education. As I read it and try to keep on top of it, it is apparent that much of it falls into themes, and that by rethinking these themes a little and taking a curriculum design approach, some guiding principles for laboratory education can be drawn up. A compilation of such principles, offered as a roadmap or plan for laboratory education, might be useful to others.

And this is what I hope to offer. The book is purposefully targeted at anyone responsible for a traditional university-level chemistry laboratory course who is looking to change it. In reality, such change is an enormous task and, pragmatically, needs to happen in phases. It’s tempting, then, to tweak and change bits based on some innovation presented at a conference or seen in a paper. But there needs to be an overall design for the entire student experience, so that incremental changes sum to a consistent whole. Furthermore, by offering a roadmap or overall design, I hope to empower the members of staff who may be responsible for such change by giving them the evidence they may need to rationalise changes to colleagues. Everyone has an opinion on laboratory education! The aim is to provide evidence-based design approaches.

My bookshelves are groaning with excellent books on laboratory education. I first came across Teaching in Laboratories by Boud, Dunn, and Hegarty-Hazel back in the days when I stumbled across McGarvey’s article. I still refer to it: even though it was published in 1986, it still carries a lot of useful material. Woolnough and Allsop’s Practical Work in Science is also excellent; crystal clear on the role and value of laboratory education and its distinction from the lecture-based curriculum. Hegarty-Hazel also edited The Student Laboratory and the Science Curriculum. Roger Anderson’s book The Experience of Science was published before I was born.

I have bought these now out-of-print books, and several more, second hand for less than the cost of a cup of coffee. I have learned lots from them, but am mindful that, well-known and comprehensive as they (justifiably) are, they are out of print, and our university laboratories have not seen much change in the forty years since Anderson.

I am very conscious of this as I structure my own book. Perhaps books covering science laboratories at both secondary and tertiary level are too broad in scope; so this book focusses exclusively on chemistry and on higher education.

Secondly, the book is very clearly directed at those implementing a new approach – those involved in change. Ultimately it is their drive, energy, and input that decide the direction of the changes that will occur. I hope that by speaking directly to them, with a clear rationale and an approach based on up-to-date literature, it may ease the workload somewhat for those looking to rethink laboratory education in their curricula. Now I just need to actually write it.

Alex Johnstone’s 10 Educational Commandments

My thanks to Prof Tina Overton for alerting me to the fact that these exist. I subsequently happened across them in this article detailing an interview with Prof Johnstone (1), and thought they would be useful to share.

Ten Educational Commandments 

1. What is learned is controlled by what you already know and understand.

2. How you learn is controlled by how you learned in the past (related to learning style but also to your interpretation of the “rules”).

3. If learning is to be meaningful, it has to link on to existing knowledge and skills, enriching both (2).

4. The amount of material to be processed in unit time is limited (3).

5. Feedback and reassurance are necessary for comfortable learning, and assessment should be humane.

6. Cognisance should be taken of learning styles and motivation.

7. Students should consolidate their learning by asking themselves about what goes on in their own heads— metacognition.

8. There should be room for problem solving in its fullest sense (4).

9. There should be room to create, defend, try out, hypothesise.

10. There should be opportunity given to teach (you don’t really learn until you teach) (5).

Johnstone told his interviewer that he didn’t claim any originality for the statements, which his students called the 10 educational commandments. Rather he merely brought together well known ideas from the literature. But, and importantly for this fan, Johnstone said that they have been built into his own research and practice, using them as “stars to steer by”.

References

  1. Cardellini, L., J. Chem. Educ., 2000, 77(12), 1571.
  2. Johnstone, A. H., Chemical Education Research and Practice in Europe (CERAPIE), 2000, 1, 9–15; online at http://www.uoi.gr/cerp/2000_January/contents.html.
  3. Johnstone, A. H., J. Chem. Educ., 1993, 70, 701–705.
  4. Johnstone, A. H., in Creative Problem Solving in Chemistry, Wood, C. A. (Ed.), Royal Society of Chemistry: London, 1993.
  5. Sirhan, G., Gray, C., Johnstone, A. H. and Reid, N., Univ. Chem. Educ., 1999, 3, 43–46.

ChemEd Journal Publications from UK since 2015

I’ve compiled this list for another purpose and thought it might be useful to share here. 

The following are publications I can find* from UK corresponding authors on chemistry education research, practice, and laboratory work relevant to HE since the beginning of 2015. There are lots of interesting finds and useful articles. Most are laboratory experiments and activities; some refer to teaching practice or underlying principles.

I don’t imagine this is a fully comprehensive list, so do let me know what’s missing. It’s in approximate chronological order from the beginning of 2015.

  1. Surrey (Lygo-Baker): Teaching polymer chemistry
  2. Reading (Strohfeldt): PBL medicinal chemistry practical
  3. Astra Zeneca and Huddersfield (Hill and Sweeney): A flow chart for reaction work up
  4. Bath (Chew): Lab experiment: coffee grounds to biodiesel
  5. Nottingham (Galloway): PeerWise for revision
  6. Hertfordshire (Fergus): Context examples of recreational drugs for spectroscopy and introductory organic chemistry 
  7. Overton (was Hull): Dynamic problem based learning
  8. Durham (Hurst, now at York): Lab Experiment: Rheology of PVA gels
  9. Reading (Cranwell): Lab experiment: Sonogashira reaction
  10. Edinburgh (Seery): Flipped chemistry trial
  11. Oaklands (Smith): Synthesis of fullerenes from graphite
  12. Manchester (O’Malley): Virtual labs for physical chemistry MOOC  
  13. Edinburgh (Seery): Review of flipped lectures in HE chemistry
  14. Manchester (Wong): Lab experiment: Paterno-Buchi and kinetics
  15. Southampton (Coles): Electronic lab notebooks in upper level undergraduate lab
  16. UCL (Tomaszewski): Information literacy, searching
  17. St Andrews & Glasgow (Smellie): Lab experiment: Solvent extraction of copper
  18. Imperial (Rzepa): Lab experiment: Asymmetric epoxidation in the lab and molecular modelling; electronic lab notebooks
  19. Reading (Cranwell): Lab experiment: Wolff Kishner reaction
  20. Imperial (Rzepa): Using crystal structure databases
  21. Leeds (Mistry): Inquiry based organic lab in first year – students design work up
  22. Manchester (Turner): Molecular modelling activity
  23. Imperial (Haslam & Brechtelsbauer): Lab experiment: vapour pressure with an isosteniscope
  24. Imperial (Parkes): Making a battery from household products
  25. Durham (Bruce and Robson): A corpus for writing chemistry
  26. Who will it be…?!

*For those interested, the Web of Science search details are reproduced below. Results were filtered to remove non-UK papers, conference proceedings and editorials.

ADDRESS: ((united kingdom OR UK OR Scotland OR Wales OR England OR (Northern Ireland))) AND TOPIC: (chemistry) AND YEAR PUBLISHED: (2016 OR 2015)

Refined by: WEB OF SCIENCE CATEGORIES: ( EDUCATION EDUCATIONAL RESEARCH OR EDUCATION SCIENTIFIC DISCIPLINES )
Timespan: All years. Indexes: SCI-EXPANDED, SSCI, A&HCI, CPCI-S, CPCI-SSH, BKCI-S, BKCI-SSH, ESCI, CCR-EXPANDED, IC.

 

Related Posts:

Practical work: theory or practice?

Literature on laboratory education over the last four decades (and more, I’m sure) has a lot to say on the role of practical work in undergraduate curricula. Indeed Baird Lloyd (1992) surveys opinions on the role of practical work in North American General Chemistry syllabi over the course of the 20th century and opens with this delicious quote, apparently offered by a student in 1928 in a $10 competition:

Chemistry laboratory is so intimately connected with the science of chemistry, that, without experimentation, the true spirit of the science cannot possibly be acquired. 

I love this quote because it captures so nicely the sense that laboratory work is at the heart of chemistry teaching – its implicit role in the teaching of chemistry is unquestionable. And although that role has been questioned a lot, repeatedly, over the following decades, not many today would advocate a chemistry syllabus that did not contain laboratory work.

I feel another aspect of our consideration of chemistry labs often goes unchallenged, and needs to be: the notion that chemistry laboratories are in some way a proving ground for what students come across in lectures; that they provide an opportunity for students to visualise and see for themselves what the teacher or lecturer was talking about; or, more laudably, to even “discover” a particular relationship for themselves by following a controlled experiment. Didn’t believe it in class that an acid and an alcohol make an ester? Well, now you are in labs, you can prove it. Can’t imagine that vapour pressure increases with temperature? Then come on in – we have just the practical for you. Faraday said that he was never able to make a fact his own without seeing it. But then again, he was a great demonstrator.

A problem with this on an operational level, especially at university, and especially in the physical chemistry laboratory, is that it is near impossible to schedule practicals so that they follow the introduction of the theory in class. This leads to the annual complaint from students that they can’t do the practical because they haven’t done the theory. Your students are saying this; if you haven’t heard them, you need to tune your surveys.

It’s an entirely understandable sentiment from students, because we situate practicals as a subsidiary of lectures. But this is a false relationship, for a variety of reasons. The first is that if you accept a model whereby you teach students chemistry content in lectures, why is there a need to supplement this teaching with a re-teaching of a sub-set of topics, arbitrarily chosen on the whim of a lab course organiser and the size of a department’s budget? Secondly, although we aim to re-teach, or hit home some major principle again in lab work, we don’t really assess that. We might grade students’ lab reports and give feedback, but it is not relevant to them, as they won’t need to know it again in that context. The lab report is done. And finally, the model completely undermines the true role of practical work and the value it can offer the curriculum.

A different model

When we design lecture courses, we don’t really give much thought to the labs that will go with them. Lecture course content has evolved rapidly to keep up to date with new chemistry; lab development is much slower. So why not work the other way around? Why not design lab courses independently of lectures? Lecture courses are one area of the curriculum – typically where the content is learned; laboratory courses are another. And what might their role be?

Woolnough and Allsop (1985), who make a clear and convincing argument for cutting the “Gordian knot” between theory and practice, instead advocate a syllabus that has three aims:

  1. developing practical skills and techniques.
  2. being a problem-solving chemist.
  3. getting a “feel for phenomena”.

The detail of how this can be done is the subject of their book, but it involves a syllabus built on “exercises, investigations, and experiences”. To me these amount to the “process” of chemistry. On a general level, I think this approach is worth considering, as it has several impacts on teaching and learning in practice.

Impacts on teaching and learning

Cutting the link between theory and practice means there is no longer a need to examine students’ understanding of chemistry concepts by proxy. Long introductions, much hated by students, which aim to get the student to understand the theory behind the topic at hand by rephrasing what is given in a lab manual, become obsolete. A properly designed syllabus removes the need for students to have had lectures on a particular topic before a lab course. Pre-lab questions can move away from random bits of theory and focus on the relationships in the experiment. There is no need for pointless post-lab questions that try to squeeze in a bit more theory.

Instead, students will need to approach the lab with some kind of model of what is happening. This does not need to be the actual equations they learn in lectures; with some thought, they may be able to draw on prior knowledge to inform that model. Of course, the practical will likely involve using some aspect of what they cover, or will cover, in lectures, but at the stage of doing the practical it is the fundamental relationship they are considering and exploring. Approaching the lab with a model of a relationship (clearly I am thinking of phys chem labs here!) and exploring that relationship better reflects the nature of science, and focusses students’ attention on the study in question. Group discussions and sharing data become more meaningful. Perhaps labs could even inform future lectures rather than rely on past ones! A final advantage is the reassertion of practical skills and techniques as a valuable aspect of laboratory work.

A key point here is that the laboratory content is appropriate for the level of the curriculum, just as it is when we design lectures. This approach is not advocating random discovery – quite the opposite. But free of the bond with associated lectures, there is scope to develop a much more coherent, independent, and more genuinely complementary laboratory course.

References

Baird W. Lloyd, The 20th Century General Chemistry Laboratory: its various faces, J. Chem. Educ., 1992, 69(11), 866–869.

Brian Woolnough and Terry Allsop (1985), Practical Work in Science, Cambridge University Press.

1928 quote

Related Posts:

Developing practical skills in the chemistry laboratory

How do we prepare students for the practical skills they use in the laboratory?

Practical skills require psychomotor development, as they typically involve handling chemicals, glassware, and instrumentation. But how do we prepare students for this work, and do we give them enough time to develop these skills?

Farmer and Frazer analysed 126 school experiments (from the old O-level Nuffield syllabus) with a view to categorising practical skills, and came up with some interesting results.[1] Acknowledging that some psychomotor tasks include a cognitive component (they give the example of manipulating the air-hole collar of a Bunsen burner while judging the nature of the flame for the task at hand), they identified 65 psychomotor tasks and 108 cognitive tasks across the experiments studied. Some of these psychomotor tasks were defined as having a key role, in that the success of the experiment depends on their successful completion, reducing the set of psychomotor skills to 44. Many of these key tasks were required in only a few experiments, so the set was reduced again to the frequent key tasks – those occurring in more than 10 experiments. The 14 frequent key tasks subsequently identified are described in their table below.

Data from Education in Chemistry (Ref 1)

Thus, of the 65 psychomotor skills listed, only 14 are defined as frequent key tasks, limiting the opportunities for pupils to develop the skills associated with the others. Indeed, the paper goes on to demonstrate that in an assessment of 100 pupils there was very poor demonstration of ability in correctly completing the practical tasks, which the authors attribute to the design of the syllabus and the limited opportunity to do practical work.
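(As an aside, the winnowing from 65 tasks to 44 and then to 14 is essentially a frequency filter over the key tasks. A toy sketch in Python, with invented task data, shows the idea:)

# Toy data: task name -> (key to experiment success?, number of experiments using it)
tasks = {
    "heating a test tube": (True, 31),
    "titrating with a burette": (True, 12),
    "folding a filter paper": (True, 9),
    "reading a thermometer": (False, 40),
}

# Keep only the key tasks, then only those occurring in more than 10 experiments
key_tasks = {name: n for name, (is_key, n) in tasks.items() if is_key}
frequent_key_tasks = [name for name, n in key_tasks.items() if n > 10]
print(frequent_key_tasks)  # ['heating a test tube', 'titrating with a burette']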

This article prompts me to think again: how do we prepare students for the laboratory-skills aspect of practical work? I think the most common approach is to demonstrate immediately in advance of the students completing the practical, explaining the technique or the apparatus and its operation. However, demonstration puts students in the mode of observer; they are watching someone else complete an activity rather than conceptualising their own completion of it. It also relies on the quality of the demonstrator, and is subject to local hazards such as the time available, the ability to see and hear the demonstration, and so on. There may therefore be benefit in shifting this demonstration to pre-lab, allowing students time to become accustomed to a technique and its nuances.

Such pre-labs need to be carefully designed, and actively distinguished from any pre-lab information focussing on theory, which has a separate purpose. At Edinburgh, two strategies are planned.

The first is on the development of core introductory laboratory skills: titrations involving pipetting and buretting; preparing standard solutions, including using a balance; and setting up Quickfit glassware to complete a distillation. Pre-lab information is provided to students in the form of videos demonstrating each technique, with the key steps in each procedure highlighted in the video. Students will be required to demonstrate each of the three procedures to their peers in the laboratory, while a peer uses a checklist to ensure that all aspects of the task are completed appropriately. The purpose here is to incorporate preparation, demonstration, and peer review into the learning of core lab skills, as well as to set in mind, early in students’ university careers, the correct approach and the appropriate glassware to use for basic laboratory techniques. The approach includes students videoing their peers as part of the review process using mobile phones, and the video recording will subsequently be used as evidence for issuing students with a digital badge for that technique (more on that at the project homepage).

The second approach is to develop the laboratory manual beyond its traditional textbook format to be an electronic laboratory manual, with pre-lab demonstrations included. More on that project to come soon.

In designing pre-lab activities for skills development, the aim is to move beyond “just demonstrating” and to get students thinking through the approaches they will take. The reason for this is guided by work done by Beasley in the late 1970s. Beasley drew on the literature of physical education to consider the development of psychomotor skills in chemistry, studying the concept of mental practice as a technique to help students prepare for the laboratory.[2] Mental practice is based on the notion that physical activity requires mental thought, and thus mentally or introspectively rehearsing an activity prompts neural and muscular responses.[3] Students were assigned to groups that conducted no preparation, physical preparation, mental preparation, or both physical and mental preparation, and were tested before and after completing a lab on volumetric analysis. Beasley reported that students entering college from school were not proficient in completing volumetric analysis, based on the accuracy of their results. Furthermore, there was no significant difference in the post-test scores of the treatment groups (all of which were better than those of students who did no preparation), suggesting that mental preparation was as effective as physical preparation.

Those interested in reading more on this topic may enjoy two reviews by Stephen DeMeo: one in J. Chem. Educ.[4] and an elegant piece, “Gazing at the Hand”, in Curriculum Inquiry.[5]

References

[1] A. Farmer and M. J. Frazer, Practical Skills in School Chemistry, Education in Chemistry, 1985, 22, 138.

[2] W. Beasley, The Effect of Physical and Mental Practice of Psychomotor Skills on Chemistry Student Laboratory Performance, Journal of Research in Science Teaching, 1979, 16(5), 473.

[3] J. B. Oxendine, Physical Education, in Singer, R. B. (Ed.), The Psychomotor Domain: Movement Behavior, Philadelphia: Lea and Febiger, 1972.

[4] S. DeMeo, Teaching Chemical Technique: A Review of the Literature, Journal of Chemical Education, 2001, 78(3), 373.

[5] S. DeMeo, Gazing at the Hand: A Foucaultian View of the Teaching of Manipulative Skills to Introductory Chemistry Students in the United States and the Potential for Transforming Laboratory Instruction, Curriculum Inquiry, 2005, 35(3).

Reflections on #micer16

Several years ago at the Variety in Chemistry Education conference, there was a rather sombre after-dinner conversation about whether the meeting would continue in subsequent years. Attendance numbers were low, and the age profile favoured the upper half of the bell curve.

Last year at Variety I registered just before the deadline and got what I think was the last space, and worried about whether my abstract would be considered. The meeting was packed full of energetic participants interested in teaching, from all over the UK and Ireland and at various stages of their careers. A swell in numbers is of course expected from the merger with the Physics Higher Education Conference, but the combination of the two is definitely (from this chemist’s perspective) greater than the sum of its parts.

Participants at #micer16

What happened in the meantime would be worthy of a PhD study. How did the fragile strings that were just holding together this disparate, struggling community not snap, but instead strengthen to bring in many newcomers? A complex web of new connections has grown. Though I watched it happen, I am not sure how it happened. I suspect it was a confluence of many factors: the efforts of the RSC at a time when chemistry was at a low point; the determination of the regular attendees to keep supporting the meeting, knowing its inherent value; the ongoing support of people like Stuart Bennett, Dave McGarvey, Stephen Breuer, Bill Byers, and others; and of course the endless energy of Tina Overton and the crew at the Physical Sciences Centre at Hull.

Whatever the process, we are very lucky to have a vibrant community of people willing to push and challenge and innovate in our teaching of chemistry. That community is willing – and expected – to play a vital role in the development of teaching approaches. This requires the design and evaluation of those approaches, and a consideration of how they work in our educational context. And that, in turn, requires knowing how to design the research studies and complete the evaluations. Readers will note that Variety now particularly welcomes evidence-based approaches.

Most of us in this community are chemists, and the language of education research can be new and difficult to navigate. Thus a meeting such as MICER, held last week, aimed to introduce and develop approaches to education research. The speakers were excellent – but having selected them, I knew they would be! Participants left, from what I could see in person and on social media, energised and enthused about the summer ahead and possible projects.

But we will all return to our individual departments, with the rest of the job to do, and soon enthusiasm gives way to pragmatism, as other things get in the way. It can be difficult to continue to develop expertise and competence in chemistry education research without a focus. The community needs to continue to support itself, and seek support from elsewhere.

How might this happen?

Support from within the community can happen by contacting someone you met at a conference and asking them to be a “critical friend”. Claire Mc Donnell introduced me to this term, and indeed was my critical friend. This is someone you trust to talk with about your work, share ideas and approaches, and read drafts. It is a mutual relationship, and I have found it extremely beneficial, both from the perspective of having someone sensible to talk to and from a metacognitive one: talking it out makes me think about it more.

The community can also organise informal and formal journal clubs. Is there a particular paper you liked? How did the authors complete the study, and what did they draw from it? Why not discuss it with someone – or, better still, in the open?

Over the next while I am hoping to crystallise these ideas and continue the conversations on how we do chemistry education research. I very much hope you can join me and be an active participant; indeed a proactive participant. So that there is an independent platform, I have set up the website http://micerportal.wordpress.com/ and welcome anyone interested in being involved to get in touch about how we might plan activities or even a series of activities. I hope to see you there.
