My ten favourite #chemed articles of 2015

This post is a sure-fire way to lose friends… but I’m going to pick ten papers published this year that I found interesting and/or useful. This is not to say they are the ten best; everyone will have their own ten “best” based on their own contexts.

Caveats done, here are 10 papers on chemistry education research that stood out for me this year:

0. Text messages to explore students’ study habits (Ye, Oueini, Dickerson, and Lewis, CERP)

I was excited to see Scott Lewis speak at the Conference That Shall Not Be Named during the summer, as I really love his work. This paper outlines an interesting way to find out about student study habits: text-message prompts. Students received periodic text messages asking whether they had studied in the previous 48 hours. The method is ingenious. Results are discussed in terms of a cluster analysis that grouped students by study habit (didn’t study as much; used textbook/practiced problems; did online homework/reviewed notes). There is lots of good stuff here for those interested in how students study and in supporting independent study time. Lewis often publishes with Jennifer Lewis, and together their papers are master-classes in quantitative data analysis. (This candidate for my top ten was so obvious that I left it out of the original draft, so it is now a top 11…)
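(If you’re curious what that kind of cluster analysis involves, here is a minimal toy sketch. The clustering method, the variables, and the data below are my own assumptions for illustration; they are not the paper’s.)

```python
# Toy sketch: grouping students by self-reported study behaviours.
# All values are invented; k-means is one common clustering choice,
# not necessarily the method used in the paper.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# One row per student; columns are the fraction of text-message
# prompts where each (hypothetical) behaviour was reported:
# studied at all, textbook/practice problems, online homework/notes.
responses = rng.random((60, 3))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(responses)

# The cluster centres characterise each study-habit profile.
for label, centre in enumerate(kmeans.cluster_centers_):
    print(f"cluster {label}: {np.round(centre, 2)}")
```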

1. What do students learn in the laboratory? (Galloway and Lowery-Bretz, CERP)

This paper reports on an investigation that used video cameras mounted on students to record their work in a chemistry lab, with students interviewed soon afterwards. While we can see what students physically do while they are in the lab (psychomotor learning), it is harder to measure their cognitive and affective experiences. This study set about measuring these, in the context of what each student considered to be meaningful learning. The paper is important for understanding the learning that goes on in the laboratory (or doesn’t, in the case of recipe labs), but I liked it most for its use of video in data collection.

2. Johnstone’s triangle in physical chemistry (Becker, Stanford, Towns, and Cole, CERP).

We are familiar with the importance of Johnstone’s triangle, but much of the research points to introductory chemistry, or the US “Gen Chem”. In this paper, consideration is given to whether and how students relate the macro, micro, and symbolic levels in thermodynamics, a subject that relies heavily on the symbolic (mathematical) level. That reliance is probably due in no small part to the emphasis most textbooks place on it. The research looked at ways that classroom interactions can develop translation across all the levels and, most interestingly, identified a sequence of instructor interactions that improved students’ coordination of the three dimensions of the triplet. There is a lot of good stuff for teachers of introductory thermodynamics here.

3. The all-seeing eye of prior knowledge (Boddey and de Berg, CERP).

My own interest in prior knowledge as a gauge for future learning means I greedily pick up anything that discusses it in detail, and this paper does so well. It looked at the impact of completing a bridging course on students who had no previous chemistry, comparing them with those who had school chemistry. The study takes that typical analysis further by interviewing students; the interviews are used to tease out different levels of prior knowledge, with the ability to apply knowledge proving most important for improving exam performance.

4. Flipped classes compared to active classes (Flynn, CERP).

I read a lot of papers on flipped lectures this year while preparing a review on the topic, and this was by far the most comprehensive. Flipping is examined in small and large classes and, crucially, any impact or improvement is assessed by comparison with an already active classroom. A detailed model for implementing flipped lectures, linking before-, during-, and after-class activities, is presented, and the whole piece is set in the context of curriculum design. This is dissemination of good practice at its best.

5. Defining problem solving strategies (Randles and Overton, CERP).

This paper gained a lot of attention at the time of publication, as it compares the problem-solving strategies of different groups in chemistry: undergraduates, academics, and industrialists. Beyond the headline, though, I liked it particularly for its method. It is based on grounded theory, and the introductory sections give a very good overview of how this was achieved, which I think will be informative to many. Table 2, which pairs codes with example quotes, is especially useful.

6. How do students experience labs? (Kable et al., IJSE)

This is a large-scale project with a long gestation. The ultimate aim is to develop a laboratory experience survey, and in particular a survey for individual laboratory experiments, with a view to their iterative improvement. Three factors are related to students’ positive experience of laboratory work: motivation (interest and responsibility), assessment, and resources. The survey probes students’ responses to these (some, like quality of resources, give surprising results). It is useful for anyone thinking about tweaking laboratory instruction and looking for somewhere to start.

7. Approaches to learning and success in chemistry (Sinapuelas and Stacy, JRST)

Set in the context of the transition from school to university, this work describes four levels of learning approach (gathering facts, learning procedures, confirming understanding, applying ideas). I like these categories as they are a bit more nuanced, and perhaps less judgemental, than surface versus deep learning. Approach level correlates with exam performance, and the paper discusses the use of learning resources to encourage students to move from learning procedures (level 2) to confirming understanding (level 3). There are in-depth descriptions characterising each level, and these will be informative to anyone thinking about how to support students’ independent study.
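(To make the correlation claim concrete, here is a toy example of the kind of analysis that could be run. The ordinal 1–4 coding and the invented scores are my assumptions, not the paper’s data.)

```python
# Toy sketch: correlating an ordinal learning-approach level
# (1 = gathering facts ... 4 = applying ideas) with exam scores.
# All numbers are invented for illustration.
from scipy.stats import spearmanr

approach_level = [1, 2, 2, 3, 3, 3, 4, 4, 1, 2, 3, 4]
exam_score = [48, 55, 52, 61, 66, 63, 78, 82, 45, 58, 70, 75]

# Spearman's rho suits ordinal categories better than Pearson's r.
rho, p_value = spearmanr(approach_level, exam_score)
print(f"rho = {rho:.2f}, p = {p_value:.3f}")
```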

8. Exploring retention (Shedlosky-Shoemaker and Fautch, JCE).

This article examines psychological factors that may explain why some students do not complete their degree. Students switching degrees tend to have higher self-doubt (in general, rather than just for chemistry) and performance anxiety, while motivation did not appear to distinguish between those switching or leaving a course and those staying. The study is useful for those interested in transition, as it challenges some common conceptions about student experiences and motivations, and suggests that much more personal factors are at play.

9. Rethinking central ideas in chemistry (Talanquer, JCE).

Talanquer publishes regularly and operates on a different intellectual plane to most of us. While I can’t say I understand every argument he makes, he always provokes thought. In this commentary, he discusses the central ideas of introductory chemistry (atoms, elements, bonds, etc), and proposes alternative central ideas (chemical identity, mechanisms, etc). It’s one of a series of articles by several authors (including Talanquer himself) that continually challenge the approach we currently take to chemistry. It’s difficult to say whether this will ever become more than a thought experiment though…

10. Newcomers to education literature (Seethaler, JCE).

If you have ever wished to explain to a scientist colleague how education research “works”, this paper might be of use. It considers five things scientists should know about education research: what papers can tell you (and their limitations), theoretical bases in education research, a little on misconceptions and concept inventories, describing learning, and tools of the trade. At three pages it is short, and necessarily leaves a lot out, but it is a nice primer.

Finally

The craziest graphical abstract of the year must go to Fung’s camera set-up. And believe me, the competition was intense.



Why I love the lecture (at academic conferences)

There is a narrative that goes like this: most educators promote active learning. Educators present at conferences. Therefore they should use active learning approaches at conference talks. Practice what they preach, and all that.

I disagree. I love a good lecture. Good lectures can be memorable and informative. Yes, that was me stifling back a tear when Martyn Poliakoff gave his Nyholm lecture at Variety. Yes, that is me falling in love with chemistry again every time I hear AP de Silva talk. And yes, that was me punching the air at the final Gordon CERP talk by [redacted] at [redacted].

Requesting audience activity at conferences confuses learning by students on a module, with identified learning outcomes, with learning by academics, who define their own learning outcomes when they look at the book of abstracts. Worse still, it confuses learning by novices with learning by experts. As experts, we are in a position to go to a lecture and immediately scoop up information that is relevant and useful to us. We have the prior knowledge and expertise to call upon to place quite complicated information in context. That’s what being an expert is. The purpose of the presentation is to place the work in the context of the speaker’s overall research programme; to bring what might be several publications under one umbrella and present it as a narrative. Argued with good data. Links to publications for more information. Hopefully with a few jolly anecdotes along the way.

Audience participation is a folly. Consider an education talk where the speaker requests the audience to have a chat about something that’s being discussed and predict what’s next, or offer ideas. Academics are blessed with many talents, but we’re not social beasts. The little chat is prefaced with social niceties as we try to get over the fact that we have to speak to other humans, followed by some discussion on what we’re meant to be talking about as we were too busy checking our Twitter feed to see what people said about our talk earlier. Of course, some amazing gems might come out in the feedback to the presenter. But are they really things the presenter isn’t aware of? Was it worth the time? I don’t think so.

I say this with hand on heart, as I have given a conference lecture that relied on audience participation. It was a lecture on the flipped lecture, and I agreed with the conference organisers that it would be fun to immerse the audience in a flipped experience. The slot made sense: a Friday morning keynote, after the conference dinner the night before. It was great fun and we had some great discussion – but was there anything that came back from the audience that I couldn’t have discussed in my talk? Probably not. It was very popular (thanks, Twitter) and I did learn lots, but that’s not the purpose of the conference. Speakers aren’t there to learn; they’re there to inform. Especially keynote speakers – hey, we paid your fees, y’know! Now let’s all discuss this over coffee.

(c) The New Yorker


Teaching Fellowship 2013

My work on enabling students to prepare for lectures has gathered some momentum again this year with the awarding of a Teaching Fellowship for a project involving second year students.

Most work to date has involved Year 1 students, focussing on introducing core concepts in advance of a lecture. These pre-lecture activities are probably best described in a previously published Education in Chemistry article.

This new project extends the concept to second years and expands the amount of information presented in advance of the lecture. The idea is that by providing much of the “content delivery” of the lecture in advance, the lecture hour can be devoted to more in-depth discussion, problem solving, and so on. As well as development of the material, a formal evaluation will be conducted, as outlined below. The project aims to evaluate the impact of the “inverted lecture” or “flipped lecture”, for which there is currently very scant literature.

There’s a lot to consider in implementing this study. I ran a pilot last year to identify what issues came up in the implementation phase. Some positive observations came through: students liked having control of the learning materials and engaging with them at their own pace, and there was a good level of in-class activity. Giving students gapped notes or something to complete while watching the pre-lecture videos helped them focus on extracting and organising the relevant information, rather than viewing passively. On the downside, administration problems last year meant that many students weren’t registered in Week 1 (or indeed several weeks in), so material had to be loaded simultaneously on an open-access platform. As well as being a hassle, this meant I couldn’t monitor access statistics. This year, the module is delayed until mid-semester. And while I could gauge the “buzz” in a lecture last year as students were working through problems, I had no real idea of how each student was progressing until the mid-semester test; this year, I am hoping clickers or after-lecture quizzes will highlight problems as we go from week to week.

The evaluation element aims to study students’ cognitive engagement in the lecture. They will be “interrupted” as they work through a problem and asked four short questions drawn from another study, which validated this instrument as a measure of cognitive engagement (more details on the instrument itself will follow in a future post). I wish to show that as students work through their in-lecture tasks, having watched the pre-lecture videos, they are cognitively engaged with the material and the task at hand. This information will be coupled with access data for the resources, quiz scores, and student interviews to build up a profile of how the flipped lecture works for middle-stage undergraduate students. I also wish to develop a “How To” pack for lecturers considering implementing a similar strategy in their own teaching (assuming the study shows it is worthwhile).
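(As a toy illustration of how those data streams might be combined: the table names, columns, and values below are my own assumptions, not the project’s actual data.)

```python
# Toy sketch: merging pre-lecture video access data with weekly quiz
# scores to build a per-student engagement profile. All names and
# numbers are invented for illustration.
import pandas as pd

access = pd.DataFrame({
    "student_id": [1, 1, 2, 3],
    "week": [1, 2, 1, 1],
    "views": [3, 1, 0, 2],
})
quizzes = pd.DataFrame({
    "student_id": [1, 2, 3],
    "week": [1, 1, 1],
    "score": [72, 35, 58],
})

# One row per student per week, keeping students who appear in only
# one of the two data sources.
profile = access.merge(quizzes, on=["student_id", "week"], how="outer")

# Flag students who score poorly despite not watching the videos:
# candidates for follow-up before the mid-semester test.
flagged = profile[(profile["views"].fillna(0) == 0) & (profile["score"] < 40)]
print(flagged)
```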

Wish me luck!
