Planning a new book on laboratory education

Contracts have been signed, so I am happy to say that I am writing a book on chemistry laboratory education as part of the RSC’s new Advances in Chemistry Education series, due for publication in mid-2017.

I’ve long had an interest in lab education, since stumbling across David McGarvey’s “Experimenting with Undergraduate Practicals” in University Chemistry Education (now CERP). Soon after, I met Stuart Bennett, now retired from the Open University, at a European summer school. Stuart spoke about lab education and its potential affordances in the curriculum. He was an enormous influence on my thinking in chemistry education, and on practical work in particular. We’d later co-author a chapter on lab education for a book for new lecturers in chemistry published by the RSC (itself a good example of the benefits of European collaboration). My first piece of published education research was based on laboratory work: a report in CERP on the implementation of mini-projects in the chemistry curriculum, completed with good friends and colleagues Claire Mc Donnell and Christine O’Connor. So I’ve been thinking about laboratory work for a long time.

Why a book?

A question I will likely be asking with increasing despair over the coming months is: why am I writing a book? To reaffirm to myself as much as anything else, and to remind me if I get lost on the way, the reasons are pretty straightforward.

My career decisions and personal interests over the last few years have meant that I have moved my focus entirely to chemistry education. Initially this involved sneaking in some reading between the covers of J. Mat. Chem. when I was meant to be catching up on metal oxide photocatalysis. But as time went on and thanks to the support of others involved in chemistry education, this interest became stronger. I eventually decided to make a break with chemistry and move into chemistry education research. (One of the nicest things for me personally about joining Edinburgh was that this interest was ultimately validated.)

So while my knowledge of the latest chemistry research is limited mainly to Chemistry World reports, one thing I do know well is the chemistry education research literature. And there is a lot of literature on laboratory education. But as I read it and try to keep on top of it, it is apparent that much of it falls into themes, and that by rethinking these themes and taking a curriculum design approach, some guiding principles for laboratory education can be drawn up. A compilation of such principles, offered in the context of a roadmap or plan for laboratory education, might be useful to others.

And this is what I hope to offer. The book will be purposefully targeted at anyone responsible for a traditional university-level chemistry laboratory course who is looking to change it. In reality, such change is an enormous task and, being pragmatic, needs to happen in phases. It’s tempting then to tweak bits and change bits based on some innovation presented at a conference or seen in a paper. But there needs to be an overall design for the entire student experience, so that incremental changes add up to a consistent whole. Furthermore, by offering a roadmap or overall design, I hope to empower the members of staff who may be responsible for such change by giving them the evidence they may need to rationalise changes to colleagues. Everyone has an opinion on laboratory education! The aim is to provide evidence-based design approaches.

My bookshelves are groaning with excellent books on laboratory education. I first came across Teaching in Laboratories by Boud, Dunn and Hegarty-Hazel back in the days when I stumbled across McGarvey’s article. I still refer to it; even though it was published in 1986, it still carries a lot of useful material. Woolnough and Allsop’s Practical Work in Science is also excellent: crystal clear on the role and value of laboratory education and its distinction from the lecture-based curriculum. Hegarty-Hazel also edited The Student Laboratory and the Science Curriculum. Roger Anderson’s book The Experience of Science was published before I was born.

I have bought these now out-of-print books, and several more, second hand for less than the cost of a cup of coffee. I have learned a lot from them, but I am mindful that, well known and comprehensive as they (justifiably) are, they are out of print, and our university laboratories have not seen much change in the forty years since Anderson.

I am very conscious of this as I structure my own book. One speculation is that books covering science laboratories at both secondary and tertiary level may be too broad, so this book focusses exclusively on chemistry in higher education.

Secondly, the book is very clearly directed at those implementing a new approach, those involved in change. Ultimately it is their drive, energy, and input that decide the direction of the changes that will occur. I hope that by speaking directly to them, with a clear rationale and an approach based on up-to-date literature, the book may ease the workload somewhat for those looking to rethink laboratory education in their curricula. Now I just need to actually write it.


Alex Johnstone’s 10 Educational Commandments

My thanks to Prof Tina Overton for alerting me to the fact that these exist. I subsequently happened across them in this article detailing an interview with Prof Johnstone (1), and thought they would be useful to share.

Ten Educational Commandments 

1. What is learned is controlled by what you already know and understand.

2. How you learn is controlled by how you learned in the past (related to learning style but also to your interpretation of the “rules”).

3. If learning is to be meaningful, it has to link on to existing knowledge and skills, enriching both (2).

4. The amount of material to be processed in unit time is limited (3).

5. Feedback and reassurance are necessary for comfortable learning, and assessment should be humane.

6. Cognisance should be taken of learning styles and motivation.

7. Students should consolidate their learning by asking themselves about what goes on in their own heads— metacognition.

8. There should be room for problem solving in its fullest sense (4).

9. There should be room to create, defend, try out, hypothesise.

10. There should be opportunity given to teach (you don’t really learn until you teach) (5).

Johnstone told his interviewer that he didn’t claim any originality for the statements, which his students called the 10 educational commandments; rather, he merely brought together well-known ideas from the literature. But, and importantly for this fan, Johnstone said that they have been built into his own research and practice, and that he uses them as “stars to steer by”.

References

  1. Cardellini, L., J. Chem. Educ., 2000, 77(12), 1571.
  2. Johnstone, A. H., Chemical Education Research and Practice in Europe (CERAPIE), 2000, 1, 9–15; online at http://www.uoi.gr/cerp/2000_January/contents.html.
  3. Johnstone, A. H., J. Chem. Educ., 1993, 70, 701–705.
  4. Johnstone, A. H., in Creative Problem Solving in Chemistry, Wood, C. A. (ed.), Royal Society of Chemistry: London, 1993.
  5. Sirhan, G., Gray, C., Johnstone, A. H. and Reid, N., Univ. Chem. Educ., 1999, 3, 43–46.


ChemEd Journal Publications from UK since 2015

I’ve compiled this list for another purpose and thought it might be useful to share here. 

The following are publications I can find* from UK corresponding authors on chemistry education research, practice, and laboratory work relevant to HE since the beginning of 2015. There are lots of interesting finds and useful articles. Most are laboratory experiments and activities; some refer to teaching practice or underlying principles.

I don’t imagine this is a fully comprehensive list, so do let me know what’s missing. It’s in approximate chronological order from beginning of 2015.

  1. Surrey (Lygo-Baker): Teaching polymer chemistry
  2. Reading (Strohfeldt): PBL medicinal chemistry practical
  3. AstraZeneca and Huddersfield (Hill and Sweeney): A flow chart for reaction work up
  4. Bath (Chew): Lab experiment: coffee grounds to biodiesel
  5. Nottingham (Galloway): PeerWise for revision
  6. Hertfordshire (Fergus): Context examples of recreational drugs for spectroscopy and introductory organic chemistry 
  7. Overton (was Hull): Dynamic problem based learning
  8. Durham (Hurst, now at York): Lab Experiment: Rheology of PVA gels
  9. Reading (Cranwell): Lab experiment: Sonogashira reaction
  10. Edinburgh (Seery): Flipped chemistry trial
  11. Oaklands (Smith): Synthesis of fullerenes from graphite
  12. Manchester (O’Malley): Virtual labs for physical chemistry MOOC  
  13. Edinburgh (Seery): Review of flipped lectures in HE chemistry
  14. Manchester (Wong): Lab experiment: Paternò-Büchi and kinetics
  15. Southampton (Coles): Electronic lab notebooks in upper level undergraduate lab
  16. UCL (Tomaszewski): Information literacy, searching
  17. St Andrews & Glasgow (Smellie): Lab experiment: Solvent extraction of copper
  18. Imperial (Rzepa): Lab experiment: Asymmetric epoxidation in the lab and molecular modelling; electronic lab notebooks
  19. Reading (Cranwell): Lab experiment: Wolff Kishner reaction
  20. Imperial (Rzepa): Using crystal structure databases
  21. Leeds (Mistry): Inquiry based organic lab in first year – students design work up
  22. Manchester (Turner): Molecular modelling activity
  23. Imperial (Haslam & Brechtelsbauer): Lab experiment: vapour pressure with an isoteniscope
  24. Imperial (Parkes): Making a battery from household products
  25. Durham (Bruce and Robson): A corpus for writing chemistry
  26. Who will it be…?!

*For those interested, the Web of Science search details are reproduced below. Results were filtered to remove non-UK papers, conference proceedings and editorials.

ADDRESS: ((united kingdom OR UK OR Scotland OR Wales OR England OR (Northern Ireland))) AND TOPIC: (chemistry) AND YEAR PUBLISHED: (2016 OR 2015)

Refined by: WEB OF SCIENCE CATEGORIES: ( EDUCATION EDUCATIONAL RESEARCH OR EDUCATION SCIENTIFIC DISCIPLINES )
Timespan: All years. Indexes: SCI-EXPANDED, SSCI, A&HCI, CPCI-S, CPCI-SSH, BKCI-S, BKCI-SSH, ESCI, CCR-EXPANDED, IC.

 


Practical work: theory or practice?

Literature on laboratory education over the last four decades (and more, I’m sure) has a lot to say on the role of practical work in undergraduate curricula. Indeed Baird Lloyd (1992) surveys opinions on the role of practical work in North American General Chemistry syllabi over the course of the 20th century and opens with this delicious quote, apparently offered by a student in 1928 in a $10 competition:

Chemistry laboratory is so intimately connected with the science of chemistry, that, without experimentation, the true spirit of the science cannot possibly be acquired. 

I love this quote because it captures so nicely the sense that laboratory work is at the heart of chemistry teaching – its implicit role in the teaching of chemistry is unquestionable. And although it has been questioned a lot, repeatedly, over the following decades, not many today would advocate a chemistry syllabus that did not contain laboratory work.

I feel another aspect of our consideration of chemistry labs is often unchallenged, and needs to be. That is the notion that chemistry laboratories are in some way a proving ground for what students come across in lectures. That they provide an opportunity for students to visualise and see for themselves what the teacher or lecturer was talking about. Or, more laudably, to even “discover” a particular relationship for themselves by following a controlled experiment. Didn’t believe in class that an acid and an alcohol make an ester? Well now you are in labs, you can prove it. Can’t imagine that vapour pressure increases with temperature? Then come on in – we have just the practical for you. Faraday said that he was never able to make a fact his own without seeing it. But then again, he was a great demonstrator.

A problem with this on an operational level, especially at university, and especially in the physical chemistry laboratory, is that it is near impossible to schedule practicals so that they follow on from the introduction of theory in class. This leads to the annual complaint from students that they can’t do the practical because they haven’t done the theory. Your students are saying this; if you haven’t heard them, you need to tune your surveys.

It’s an entirely understandable sentiment from students, because we situate practicals as a subsidiary of lectures. But this is a false relationship for a variety of reasons. The first is that if you accept a model whereby you teach students chemistry content in lectures, why is there a need to supplement this teaching with a re-teaching of a sub-set of topics, arbitrarily chosen based on the whim of a lab course organiser and the size of a department’s budget? Secondly, although we aim to re-teach, or hit home some major principle again in lab work, we don’t really assess that. We might grade students’ lab reports and give feedback, but it is not relevant to them as they won’t need to know it again in that context. The lab report is done. And finally, the model completely undermines the true role of practical work and the value it can offer the curriculum.

A different model

When we design lecture courses, we don’t really give much thought to the labs that will go with them. Lecture course content has evolved rapidly to keep up to date with new chemistry; lab development is much slower. So why not the other way around? Why not design lab courses independent of lectures? Lecture courses are one area of the curriculum to learn – typically the content of the curriculum; laboratory courses are another. And what might the role here be?

Woolnough and Allsop (1985), who make a clear and convincing argument for cutting the “Gordian knot” between theory and practice, instead advocate a syllabus that has three aims:

  1. developing practical skills and techniques.
  2. being a problem-solving chemist.
  3. getting a “feel for phenomena”.

The detail of how this can be done is the subject of their book, but involves a syllabus that has “exercises, investigations, and experiences”. To me these amount to the “process” of chemistry. On a general level, I think this approach is worth consideration as it has several impacts on teaching and learning in practice.

Impacts on teaching and learning

Cutting the link between theory and practice means that there is no longer a need to examine students’ understanding of chemistry concepts by proxy. Long introductions, much hated by students, which aim to get the student to understand the theory behind the topic at hand by rephrasing what is given to them in a lab manual, are obsolete. A properly designed syllabus removes the need for students to have had lectures in a particular topic before a lab course. Pre-lab questions can move away from being about random bits of theory and focus on the relationships in the experiment. There is no need for pointless post-lab questions that try to squeeze in a bit more theory.

Instead, students will need to approach the lab with some kind of model for what is happening. This does not need to be the actual equations they learn in lectures. Some thought means they may be able to draw on prior knowledge to inform that model. Of course, the practical will likely involve using some aspect of what they cover or will cover in lectures, but at the stage of doing the practical, it is the fundamental relationship they are considering and exploring. Approaching the lab with a model of a relationship (clearly I am in phys chem labs here!) and exploring that relationship better reflects the nature of science, and focusses students’ attention on the study in question. Group discussions and sharing data are more meaningful. Perhaps labs could even inform future lectures rather than rely on past ones! A final advantage is the reassertion of practical skills and techniques as a valuable aspect of laboratory work.

A key point here is that the laboratory content is appropriate for the level of the curriculum, just as it is when we design lectures. This approach is not advocating random discovery – quite the opposite. But free of the bond with associated lectures, there is scope to develop a much more coherent, independent, and more genuinely complementary laboratory course.

References

Baird W. Lloyd, The 20th Century General Chemistry Laboratory: its various faces, J. Chem. Educ., 1992, 69(11), 866–869.

Brian Woolnough and Terry Allsop (1985) Practical Work in Science, Cambridge University Press.



Reflections on #micer16

Several years ago at the Variety in Chemistry Education conference, there was a rather sombre after-dinner conversation on whether the meeting would continue on in subsequent years. Attendance numbers were low and the age profile was favouring the upper half of the bell-curve.

Last year at Variety I registered before the deadline and got what I think was the last space, and worried about whether my abstract would be considered. The meeting was packed full of energetic participants interested in teaching from all over the UK and Ireland, at various stages of their careers. A swell in numbers is of course expected from the merger with the Physics Higher Education Conference, but the combination of the two is definitely (from this chemist’s perspective) greater than the sum of its parts.

Participants at #micer16

What happened in the meantime would be worthy of a PhD study. How did the fragile strings that were just holding people together in this disparate, struggling community not snap, but instead strengthen to bring in many newcomers? A complex web of new connections has grown. While I watched it happen, I am not sure how it happened. I suspect it is a confluence of many factors: the efforts of the RSC at a time when chemistry was at a low point; the determination of the regular attendees to keep supporting it, knowing its inherent value; the ongoing support of people like Stuart Bennett, Dave McGarvey, Stephen Breuer, Bill Byers, and others; and of course the endless energy of Tina Overton and the crew at the Physical Sciences Centre at Hull.

Whatever the process, we are very lucky to have a vibrant community of people willing to push and challenge and innovate in our teaching of chemistry. And that community is willing, and is expected, to play a vital role in the development of teaching approaches. This requires design and evaluation of these approaches: a consideration of how they work in our educational context. And this requires the knowledge of how to design these research studies and complete these evaluations. Readers will note that Variety now particularly welcomes evidence-based approaches.

Most of us in this community are chemists, and the language of education research can be new, and difficult to navigate. Thus a meeting such as MICER held last week aimed to introduce and/or develop approaches in education research. The speakers were excellent, but having selected them I knew they would be! Participants left, from what I could see and saw on social media, energised and enthused about the summer ahead and possible projects.

But we will all return to our individual departments, with the rest of the job to do, and soon enthusiasm gives way to pragmatism, as other things get in the way. It can be difficult to continue to develop expertise and competence in chemistry education research without a focus. The community needs to continue to support itself, and seek support from elsewhere.

How might this happen?

Support from within the community can happen by contacting someone you met at a conference and asking them to be a “critical friend”. Claire Mc Donnell introduced me to this term and indeed was my critical friend. This is someone you trust to talk about your work with, to share ideas and approaches with, and to read drafts of your work. It is a mutual relationship, and I have found it extremely beneficial, both from the perspective of having someone sensible to talk to and from a metacognitive perspective. Talking it out makes me think about it more.

The community can organise informal and formal journal clubs. Is there a particular paper you liked – how did the authors complete a study and what did they draw from it? Why not discuss it with someone, or better still in the open?

Over the next while I am hoping to crystallise these ideas and continue the conversations on how we do chemistry education research. I very much hope you can join me and be an active participant; indeed a proactive participant. So that there is an independent platform, I have set up the website http://micerportal.wordpress.com/ and welcome anyone interested in being involved to get in touch about how we might plan activities or even a series of activities. I hope to see you there.


Significant omission from my top 10 #chemed post!

0. Text messages to explore students’ study habits (Ye, Oueini, Dickerson, and Lewis, CERP)

I was excited to see Scott Lewis speak at the Conference That Shall Not Be Named during the summer as I really love his work. This paper outlines an interesting way to find out about student study habits, using text-message prompts. Students received periodic text messages asking them if they have studied in the past 48 hours. The method is ingenious. Results are discussed in terms of cluster analysis (didn’t study as much, used textbook/practiced problems, and online homework/reviewed notes). There is lots of good stuff here for those interested in students’ study and supporting independent study time. Lewis often publishes with Jennifer Lewis, and their papers are master-classes in quantitative data analysis. (Note this candidate for my top ten was so obvious I left it out in the original draft, so now it is a top 11…)

I’ve now included this in the original post.



My ten favourite #chemed articles of 2015

This post is a sure-fire way to lose friends… but I’m going to pick 10 papers that were published this year that I found interesting and/or useful. This is not to say they are ten of the best; everyone will have their own 10 “best” based on their own contexts.

Caveats done, here are 10 papers on chemistry education research that stood out for me this year:

0. Text messages to explore students’ study habits (Ye, Oueini, Dickerson, and Lewis, CERP)

I was excited to see Scott Lewis speak at the Conference That Shall Not Be Named during the summer as I really love his work. This paper outlines an interesting way to find out about student study habits, using text-message prompts. Students received periodic text messages asking them if they have studied in the past 48 hours. The method is ingenious. Results are discussed in terms of cluster analysis (didn’t study as much, used textbook/practiced problems, and online homework/reviewed notes). There is lots of good stuff here for those interested in students’ study and supporting independent study time. Lewis often publishes with Jennifer Lewis, and together their papers are master-classes in quantitative data analysis. (Note this candidate for my top ten was so obvious I left it out in the original draft, so now it is a top 11…)

1. What do students learn in the laboratory (Galloway and Lowery-Bretz, CERP)?

This paper reports on an investigation using video cameras worn by students to record their work in a chemistry lab. Students were interviewed soon after the lab. While we can see what students physically do while they are in the lab (psychomotor learning), it is harder to measure cognitive and affective experiences. This study set about trying to measure these, in the context of what the student considered to be meaningful learning. The paper is important for understanding the learning that is going on in the laboratory (or not, in the case of recipe labs), but I liked it most for the use of video in the collection of data.

2. Johnstone’s triangle in physical chemistry (Becker, Stanford, Towns, and Cole, CERP).

We are familiar with the importance of Johnstone’s triangle, but a lot of the research points to introductory chemistry, or the US “Gen Chem”. In this paper, consideration is given to understanding whether and how students relate macro, micro, and symbolic levels in thermodynamics, a subject that relies heavily on the symbolic (mathematical). The reliance on the symbolic is probably due in no small part to the emphasis most textbooks place on it. The research looked at ways that classroom interactions can develop the translation across all levels and, most interestingly, a sequence of instructor interactions that showed an improvement in coordination of the three dimensions of the triplet. There is a lot of good stuff for teachers of introductory thermodynamics here.

3. The all-seeing eye of prior knowledge (Boddey and de Berg, CERP).

My own interest in prior knowledge as a gauge for future learning means I greedily pick up anything that discusses it in further detail. And this paper does that well. It looked at the impact of completing a bridging course on students who had no previous chemistry, comparing them with those who had school chemistry. However, this study takes that typical analysis further and interviewed students. The interviews are used to tease out different levels of prior knowledge, with the ability to apply knowledge being supreme in improving exam performance.

4.  Flipped classes compared to active classes (Flynn, CERP).

I read a lot of papers on flipped lectures this year in preparing a review on the topic. This was by far the most comprehensive. Flipping is examined in small and large classes, and crucially any impact or improvement is discussed by comparing with an already active classroom. A detailed model for implementation of flipped lectures linking before, during, and after class activities is presented, and the whole piece is set in the context of curriculum design. This is dissemination of good practice at its best.

5. Defining problem solving strategies (Randles and Overton, CERP).

This paper gained a lot of attention at the time of publication, as it compares problem solving strategies of different groups in chemistry; undergraduates, academics, and industrialists. Beyond the headline though, I liked it particularly for its method – it is based on grounded theory, and the introductory sections give a very good overview on how this was achieved, which I think will be informative to many. Table 2 in particular demonstrates coding and example quotes which is very useful.

6. How do students experience labs? (Kable and colleagues, IJSE)

This is a large scale project with a long gestation – the ultimate aim is to develop a laboratory experience survey, and in particular a survey for individual laboratory experiments, with a view to their iterative improvement. Three factors – motivation (interest and responsibility), assessment, and resources – are related to students’ positive experience of laboratory work. The survey probes students’ responses to these (some, like quality of resources, give surprising results). It is useful for anyone thinking about tweaking laboratory instruction and looking for somewhere to start.

7. Approaches to learning and success in chemistry (Sinapuelas and Stacy, JRST)

Set in the context of transition from school to university, this work describes the categorisation of four levels of learning approaches (gathering facts, learning procedures, confirming understanding, applying ideas). I like these categories as they are a bit more nuanced, and perhaps less judgemental, than surface vs deep learning. The approach level correlates with exam performance. The paper discusses the use of learning resources to encourage students to move from learning procedures (level 2) to confirming understanding (level 3). There are in-depth descriptions characterising each level, and these will be informative to anyone thinking about how to support students’ independent study.

8. Exploring retention (Shedlosky-Shoemaker and Fautch, JCE).

This article categorises some psychological factors aiming to explain why some students do not complete their degree. Students switching degrees tend to have higher self-doubt (in general rather than just for chemistry) and performance anxiety. Motivation did not appear to distinguish between those switching or leaving a course and those staying. The study is useful for those interested in transition, as it challenges some common conceptions about student experiences and motivations. It appears to suggest that much more personal factors are at play.

9. Rethinking central ideas in chemistry (Talanquer, JCE).

Talanquer publishes regularly and operates on a different intellectual plane to most of us. While I can’t say I understand every argument he makes, he always provokes thought. In this commentary, he discusses the central ideas of introductory chemistry (atoms, elements, bonds, etc), and proposes alternative central ideas (chemical identity, mechanisms, etc). It’s one of a series of articles by several authors (including Talanquer himself) that continually challenge the approach we currently take to chemistry. It’s difficult to say whether this will ever become more than a thought experiment though…

10. Newcomers to education literature (Seethaler, JCE).

If you have ever wished to explain to a scientist colleague how education research “works”, this paper might be of use. It considers 5 things scientists should know about education research: what papers can tell you (and their limitations), theoretical bases in education research, a little on misconceptions and content inventories, describing learning, and tools of the trade. It’s a short article at three pages long, so necessarily leaves a lot of information out. But it is a nice primer.

Finally

The craziest graphical abstract of the year must go to Fung’s camera set up. And believe me, the competition was intense.



This week I’m reading… Changing STEM education

Summer is a great time for Good Intentions and Forward Planning… with that in mind I’ve been reading about what way we teach chemistry, how we know it’s not the best approach, and what might be done to change it.

Is changing the curriculum enough?

Bodner (1992) opens his discussion on reform in chemistry education by noting that the “recent concern”, way back in 1992, is not unique. He states that there are repeated cycles of concern about science education over the 20th century, followed by long periods of complacency. Scientists and educators usually respond in three ways:

  1. restructure the curriculum,
  2. attract more young people to science,
  3. try to change science teaching at primary and secondary level.

However, Bodner proposes that the problem is not in attracting people to science at the early stages, but in keeping them on when they reach university, and that we at third level have much to learn from our colleagues at primary and secondary level. Instead of changing the curriculum (the topics taught), his focus is on changing the way the curriculum is taught. In an era when textbooks (and one presumes now, the internet) have all the information one wants, the information dissemination component of a lecture is redundant. Bodner makes a case that students can perform quite well on a question involving equilibrium without understanding its relationship to other concepts taught in the same course, instead advocating an active learning classroom centred around discussion and explanation: dialogue between lecturer and students. He even offers a PhD thesis to back up his argument (a paper with a great title, derived from this, is here: PDF).

Are we there yet?

One of the frustrations I’m sure many who have been around the block a few times feel is that the pace of change is so slow (read: glacial). Eighteen years after Bodner’s paper, Talanquer and Pollard (2010) criticize the chemistry curriculum at universities as “fact-based and encyclopedic, built upon a collection of isolated topics… detached from the practices, ways of thinking, and applications of both chemistry research and chemistry education research in the 21st century.” Their paper in CERP presents an argument for teaching “how we think instead of what we know”.

They describe their Chemistry XXI curriculum, which presents an introductory chemistry curriculum in eight units, each titled by a question. For example, Unit 1 is “How do we distinguish substances?”, consisting of four modules (1 to 2 weeks of work): “searching for differences, modelling matter, comparing masses, determining composition.” The chemical concepts mapping onto these include the particulate model of matter, mole and molar mass, and elemental composition.

Assessment of this approach is by a variety of means, including small group in-class activities. An example is provided for a component on physical and electronic properties of metals and non-metals; students are asked to design an LED, justifying their choices. I think this fits nicely into the discursive ideas Bodner mentions. Summative assessment is based on answering questions in a context-based scenario (picture from Talanquer and Pollard, CERP, 2010).

In what is a very valuable addition to this discussion, learning progression levels are included, allowing student understanding of concepts and ideas to be monitored as it progressively develops. It’s a paper that’s worth serious consideration and deserves more widespread awareness.

Keep on Truckin’

Finally in our trio is Martin Goedhart’s chapter in the recently published book Chemistry Education. Echoing the basis provided by Talanquer and Pollard, he argues that the traditional disciplines of analytical, organic, inorganic, physical, and biochemistry were reflective of what chemists were doing in research and practice. However, the interdisciplinary nature of our subject demands new divisions; Goedhart proposes three competency areas: synthesis, analysis, and modelling. For example, in analysis the overall aim is “acquiring information about the composition and structure of substances and mixtures”. The key competencies are “sampling, using instruments, data interpretation”, with knowledge areas including instruments, methods and techniques, sample prep, etc. As an example of how the approach differs, he states that students should be able to select appropriate techniques for their analysis; our current emphasis is on the catalogue of facts about how each technique works. I think this echoes Talanquer’s point about shifting the emphasis from what we know to how we think.


My Education in Chemistry blog posts

A lot of my bloggery is now on the Education in Chemistry blog, and I will keep a running table of contents of them here.


A future direction for clickers?


Clickers are routinely used to survey a class on their understanding of topics or test their knowledge with quizzes, and as technology has developed, there have been clever ways of doing this (See: The Rise and Rise…). One issue that arises is that, as lecturers, we don’t have a convenient way to know what individual students think, or what their answer is.

Enter this recent paper from BJET, An Augmented Lecture Feedback System to support Learner and Teacher Communication. This paper describes a clicker-based system, but instead of (or as well as) a lecturer viewing a chart of responses, the lecturer sees the response hover over the student’s head. I know it’s early in the year, so I will let you read that sentence again.

The system works by way of the lecturer wearing glasses that scan the room and display each student’s response as it is entered. The technology (while very clever) is still very rudimentary, and no-one in their right mind would want to look like this in their classroom, but as Google Glass or equivalent takes off, who knows what possibilities there will be in the coming decade.

I think it’s an interesting paper for showing a different aspect of lecturer-student interaction in the class. Quite what you do when you see that some students are incorrect is up to individual teaching scenarios.

The authors have a video explaining the paper in more detail.



Lack of literature on flipped lecture rooms

Compiling literature on flipped/inverted classrooms for higher education isn’t easy. A lot of returns are of the “I couldn’t believe my ears!” type of blog, which is fine for what it is, but not an academic study. Yet more literature, typically of the Chronicle or Educause type, tends to say flipped classrooms are great and that they lead on to MOOCs (as in the case of this recent C&EN piece), with a subsequent discussion on MOOCs, or ties flipped classrooms in with Peer Instruction, with a discussion on peer instruction. In these cases, and especially so for PI, this is the intention of the writer, so it is not a criticism. But it makes it hard to say what value flipped lectures have in their own right.

I want to think well of flipped lectures, and have piloted some myself, the concept being an extension of the pre-lecture activities work that I have spent a lot of time on. While looking for methodologies to rob for a future study of my own, I had a look in the literature. The study most people seem to refer to is an article published in 2000 in the Journal of Economic Education which described the implementation of the inverted lecture. The paper is a nice one in that it describes the implementation well, with the views of students and instructors represented. But there is not much beyond surveying students in terms of considering effectiveness. I come from the school of thought that says if you throw oranges at students in a lecture and survey them, they will say it helped their learning, so I’m surprised that this study is referred to by evangelists in the flipped lecture area. The course site is still available, and while it looks a little dated, it does seem to align nicely with what the Ed Techs would consider good instructional design (resources, support, social area, etc).

A more recent study is one in Physical Review Special Topics: Physics Education Research. While it appears this is more of the pre-lecture type of activity rather than a flipped lecture (i.e. there are still some lectures involved), the lecture room seems quite active. This study found that students who completed the pre-lecture work did better in exams than those who didn’t.

Not much else in my initial trawl. I’ll keep looking, as of course people might have done this and not called it flipped or inverting the lecture. Of course part of this is that education research takes time, and perhaps in the next few years, we will see lots of flipped lecture room literature.

 


The Application of Technology to Enhance Chemistry Education

Call for Papers

Contributions are invited for a themed, peer-reviewed issue of CERP on The Application of Technology to Enhance Chemistry Education which is scheduled for publication Autumn 2013. Guest Editors: Michael K Seery and Claire McDonnell.

Topics for contribution may include but are not limited to:

  •  Blended learning to support ‘traditional’ instruction (e.g. online resources, wikis, blogs, e-portfolios)
  • In-class technology (e.g. clickers, iPads or equivalent)
  • Online learning (e.g. distance learning initiatives, online collaborative learning, active and interactive eLearning, computer simulations of practical work, modelling software for online learning)
  • Cognitive considerations for online learning (e.g. designing online resources)
  • E-assessment (e.g. formative assessment strategies, automated feedback)
  • Reviews and Perspectives (‘State of play’ of current trends, historical perspective)

Contributions should align with the principles and criteria specified in the recent CERP editorial (Chem. Educ. Res. Pract., 2012, 13, 4-7). To summarise, there is a requirement that papers provide an argument for some new knowledge supported by careful analysis of evidence; either by reviewing the existing literature, analysing carefully collected research data or rigorously evaluating innovative practice.

Submission of Manuscripts

Manuscripts should be submitted in the format required by the journal using the ScholarOne online manuscript submission platform available through the journal homepage http://www.rsc.org/CERP/. Enquiries concerning the suitability of possible contributions should be sent directly by email to: Michael Seery michael.seery@dit.ie and/or Claire McDonnell: claire.mcdonnell@dit.ie.

Important Dates

Manuscripts should be submitted by 4th January 2013 to be eligible for consideration in the theme issue, subject to authors being able to address revisions without too much delay. Manuscripts received after the deadline can still be considered for the theme issue, but the usual peer review process will not be compromised to reach decisions on publication, and if such articles are accepted for publication too late to be included in the theme issue then they would be included instead in a subsequent issue.

As with other CERP contributions, articles intended for the theme issue will be published as advance articles online as soon as they have been typeset and proofs have been checked, ahead of publication in the theme issue itself.


Showing Worked Examples in Blackboard Quizzes

I’ve been thinking of ways to include worked examples and hints in Blackboard VLE quizzes. Cognitive Load theory has something called the Worked Example effect, whereby learners who receive direct instruction in the form of worked examples perform better than those who don’t. The reason is attributed to providing novice learners with an approach to solving a problem that they can replicate, thus alleviating the working memory load while solving a problem. There’s some more on worked examples here.

The question then was how to provide a worked example (or a hint, a slightly less informative way to guide students) in Blackboard quizzes. I want to have them at the point where students can click on them as they need them, rather than having to leave the quiz and go off somewhere else to get help. I did this in this trial with Javascript buttons. The video below goes through how it looks and the mechanics of it.
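
For anyone curious about the mechanics before watching the video, a minimal sketch of the general idea is below: a button placed in the question text whose click handler toggles a hidden block containing the hint or worked example. This is an illustrative sketch only, not the exact code from the trial: the element IDs, wording, and styling are invented, and it assumes your version of Blackboard lets you paste raw HTML (including the onclick handler) into the question text via the editor’s source view.

    <!-- Hypothetical sketch: a click-to-reveal hint pasted into a Blackboard question's
         HTML source. The IDs and hint text below are invented for illustration. -->
    <button type="button"
            onclick="var h = document.getElementById('hint-q1');
                     h.style.display = (h.style.display === 'none') ? 'block' : 'none';">
      Show/hide hint
    </button>
    <div id="hint-q1" style="display:none; border:1px solid #999; padding:0.5em;">
      Hint: write down the rate equation first, then substitute the concentrations
      (in mol dm<sup>-3</sup>) before solving for the rate constant.
    </div>

The same pattern extends from a hint to a full worked example: place the worked solution inside the hidden block, so that students reveal it only at the point where they decide they need it.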


8th Variety in Irish Chemistry Teaching Meeting – DIT 10th May

The Chemistry Education Research Team wish to invite you to the 8th Variety in Irish Chemistry Teaching Meeting which will be held in DIT Kevin St on Thursday 10th May 2012. The meeting is sponsored by the RSC Education Division Ireland.

Programme and Call for Abstracts

The aim of the meeting is to allow those teaching chemistry at third level to share “what works” – useful ideas and effective practice from their own teaching.

The keynote speaker is Dr David McGarvey, Keele University, who was the 2011 RSC Higher Education Teaching Award winner.

A call for abstracts is now open for short oral presentations (10 – 15 minutes) on any topic related to teaching and learning chemistry. The deadline for abstracts (150 words maximum) is April 5th 2012.

Attendance is free, but registration is required. Registration forms for those intending to attend/present can be downloaded here and should be submitted by April 5th 2012 by email to michael.seery@dit.ie

Workshop

An optional workshop will be held on Thursday morning (10.30 am – 12.30 pm) on the topic “Using Technology in Chemistry Teaching and Learning” and will cover the following topics: “Podcasting and Screencasting”, “Using Wikis in Chemistry Education”, and “E-assessment”. The cost of the workshop is €10.


My experiences of teaching online: A case study


My paper on taking a module that was taught in class and moving it online has been published in CERP (free to access). The paper aims to share my own experiences of teaching a module online so that others considering this approach might find some information of use.

The paper is set against a background of what I consider to be a general disaffection for online teaching among staff and students. This is apparent from surveys by the DRHEA—which reports that the main use of VLEs is as content repositories; the UK HEA (pdf)—where students ranked “e-learning” as the least enjoyable and least effective method of teaching; and a large-scale US study which reports a disappointing level of criticality in considering the effectiveness of online engagement.

The rationale for moving the module online is presented. It was found from practice that the online version of the module opened up new possibilities, especially in the domain of transferable skills. A table of learning outcomes, and how they are aligned with assessment is given. Implementation of the module online followed Gilly Salmon’s Five-Stage model, which was useful in this case because the online delivery was supported primarily by discussion boards. Notes and reflections from my experience of implementation are incorporated.

Finally, evaluation aims to capture what went well and what could be improved—both from my own perspective and that of students. One of the great benefits was observing a growing sense of independence among the students, and their ability to move beyond structured problems to being able to tackle unfamiliar ones. Some suggestions about encouraging engagement from all students are presented.

If you read it, I hope you enjoy the paper. It has certainly been an interesting module to deliver over the last number of years. The fifth version of the online delivery begins in a few weeks!


Class Sizes and Student Learning

A recent discussion on an ALT email circulation raised the interesting question of whether there was a threshold for class sizes, above which the student learning experience diminished. Unfortunately, what followed was lots of “in my experience” Higginbotham-esque replies (with the exception of details of an interesting internal survey at NUIG), despite the original query specifically requesting evidence-based information.

You up there—in the blue and white jumper—what do you think the answer is?

A clackety-clack into Google Scholar throws up some interesting results on this topic. Unsurprisingly, the general trend is that increasing class size diminishes students’ educational experience, although the extent to which this happens seems to be lukewarm. There are two issues to consider: what is being measured to reflect something like “educational experience”, and what the discipline is.

What students think

In this regard, an interesting paper that caught my eye was one that considered the effect of class sizes in various disciplines (Cheng, 2011). This work dismisses student grades in favour of three evaluation scores derived from students: student learning, instructor recommendations, and course recommendations. Student learning was scored based on a student response to a 5-point Likert scale question “I learned a great deal from this course”. (Many of you, including myself, may be tempted to run screaming for the hills at this point. What would students know?! Cheng does make the point that she is not saying that this measure is superior to student outcomes, just a different measure. She refers to Pike’s (1996) interesting paper on student self-reporting for a discussion on this. Also, Hamermesh’s paper (2005) is worth a read for the title alone—in short, good looking lecturers get better ratings.)

Overall Data

Anyway, Cheng has amassed an impressive data set. “In total, the data span 24 departments, 2110 courses, 1914 instructors, and 10,357 observations from Fall 2004 to Spring 2009.” Before considering subject, on an overall level, Cheng found that for each of her three ratings, ratings fell as class sizes increased (although the smallest class sizes received both the lowest and highest marks). Cheng has further used her data to generate a model to predict how student “learning” (**measured as outlined above**), instructor and course recommendations would change, such that for an increase of 50 in class size, these ratings would decrease by 1.4%, 1.3%, and 1.1% respectively. Of course, some disciplines will have smaller class sizes or may require more class-tutor interaction, so Cheng has drilled down into each discipline and determined whether it is negatively or positively affected, or indeterminately affected (i.e. mixed results).

Subject Specific

In the sciences, chemistry, biology, physics and maths were unaffected by increasing class size in this model, as were history, philosophy, and visual arts. Almost half of the disciplines surveyed were inconclusive; some showed negative effects: some engineering disciplines, political science, social science. No discipline benefits from increasing enrollment.

Chemistry

Cheng considers that theoretical subjects such as the sciences may have a low correlation with class size, but rather depend on other factors, such as quality of instructor or student effort. While I think there are flaws, or at least limitations, to this study (as Cheng acknowledges), it does open up interesting questions. The one I am interested in is the culture of teaching chemistry, which is fiercely traditional. That this data suggests an increasing class size would have little effect on the ratings measured here in a chemistry class would in turn suggest that its teaching is still very much based on a teacher-centred philosophy. Clickers, anyone?

References

  • Cheng, D. A. Effects of class size on alternative educational outcomes across disciplines, Economics of Education Review, 2011, 30, 980–990.
  • Hamermesh, D., & Parker, A. Beauty in the classroom: Instructors’ pulchritude and putative pedagogical productivity. Economics of Education Review, 2005, 24, 369–376.
  • Pike, G. R. Limitations of using students’ self-reports of academic development as proxies for traditional achievement measures, Research in Higher Education, 1996, 37, 89–114.


Chemistry Education Research and Practice

I still remember the pleasant surprise of discovering that there was a journal dedicated to the teaching of chemistry in higher education. Sometime in late 2005, I Googled something about assessment in chemistry, and out came a result: “Assessment in Chemistry and the Role of Examinations”, a great paper questioning the value of our assessment system. The same issue had an article, “Experimenting with Undergraduate Practicals”, which was hugely influential in my own consideration of the role of lab education. My love affair with University Chemistry Education—which would later become Chemistry Education Research and Practice after a merger in 2005—began.

Early Themes

A Wordle of the titles of the papers published in U. Chem. Ed. from 1997 – 2004 is shown below. These show that the issues we are still tackling today—critical thinking, effective assessment, embedding transferable skills—have been around the block a few times in the chemistry education community!

Wordle of University Chemistry Education Titles Volumes 1 – 8.

The contents of the very first issue are telling in that regard. Among its articles are Alex Johnstone’s classic ‘…And some fell on good ground’, about prior knowledge and cognitive load in learning chemistry, the basis of my own research over the last three years. Tina Overton’s Creating Critical Chemists had themes of group work and discussion, an interesting prelude to her hugely influential work on problem based learning in chemistry. The then editor of U. Chem. Ed., John Garratt, wrote a paper entitled Virtual investigations: ways to accelerate experience, which discussed the use of pre-lab online exercises as preparation for in-lab work and included a set of aims of practical work that informed the debate around laboratory education. All these years later, these issues are still at the core of our discussions on chemistry education.

The new journal must have quickly gained an audience outside the UK. Apart from letters, Volume 3(1) in 1999 saw three non-UK-based submissions: Brian Murphy (IE), then of IT Sligo, on assessment of IT skills in chemistry (my 2012 CERP paper is on this topic!); Onno de Jong (NL) on how to go about researching chemical education (de Jong has done a lot of work on contextualising chemistry); and George Bodner (US) on an action research study of assessment. International submissions continued at a healthy pace.

Towards CERP

Soon after the establishment of University Chemistry Education in 1997 came the development of Chemical Education Research and Practice in Europe (CERAPIE), edited by Georgios Tsaparlis, in 2000. Like U. Chem. Ed., CERAPIE quickly attracted an international audience, and dropped “in Europe” from its title in 2003. In 2005, U. Chem. Ed. and CERP merged to form a new journal published by the Royal Society of Chemistry, co-edited by Stephen Breuer, who had been editor of U. Chem. Ed. since 2001, and Georgios Tsaparlis. From 2007, the journal was included in the ISI Citation Index, a hugely important step in the development of the journal.

According to ISI, 181 papers have been published in CERP since 2007. Among these, 61 (1/3) have been from the US, 20 from England, 14 from Australia, 13 from Germany, 9 from Ireland, and 7 from Scotland. The top 10 most cited articles are (note that these are biased by age!):

  1. Donald Treagust’s (AUS) excellent work on two-tier diagnostic assessment
  2. Norman Reid’s (SCO) seminal paper on the role of laboratory work in chemistry
  3. Lewis and Lewis’ (US) work on predicting at-risk students in General chemistry
  4. One of the early papers on clickers in chemistry by Loretta Jones (I have written about that here)
  5. One in Hofstein’s important series discussing laboratory education
  6. Absolutely ground breaking work done here at DIT :) on project based learning in the lab
  7. Cooper and Sandi-Urena’s work on metacognition in chemistry (check out last issue of 2011 for an update to this work)
  8. Mark Buntine and Justin Read’s work on undergraduate practical development ACELL
  9. Stolk and de Jong’s paper on context based education in teacher training
  10. Domin’s work on conceptual development in a PBL laboratory setting

The Future

Issue 4, 2011 saw the retirement of both Stephen Breuer and Georgios Tsaparlis as editors of CERP. I think the chemistry education community has much to be grateful to them for, as they have provided a platform for practitioners and researchers in chemical education to share and debate ideas for more than a decade. For new and continuing lecturers, it is a great resource for stimulating a consideration of how we teach chemistry.

Interestingly, this last issue under their stewardship had themes which were very similar to those mentioned in the first issue of U. Chem. Ed.—technology in education, including laboratory education; conceptualising chemical concepts; and developing critical thinking through enquiry. The new editor, Keith Taber, has a big task ahead of him in continuing the work of this great journal. His own association with the journal goes back to 2000, when he wrote an article on teaching chemistry with a consideration of prior knowledge.

I am planning a follow-up article to consider some of the themes highlighted in CERP in more detail. If you’d like to be involved, or have any particular favourites, let me know!


Implementation of Research Based Teaching Strategies

The traditional, almost folkloric, approach to teaching science is in stark contrast to the evidence-based approach scientists take in their everyday research. The quote by Joel Michael* highlights the contrast:

As scientists, we would never think of writing a grant proposal without a thorough knowledge of the relevant literature, nor would we go into the laboratory to actually do an experiment without knowing about the most current methodologies being employed in the field. Yet, all too often, when we go into the classroom to teach, we assume that nothing more than our expert knowledge of the discipline and our accumulated experiences as students and teachers are required to be a competent teacher. But this makes no more sense in the classroom than it would in the laboratory!

In discussing the implementation of innovative teaching techniques, this post draws on the work of Charles Henderson, who spoke at a conference earlier this year on his analysis of the impact of physics education research on the physics community in the US. I think there are lessons for chemists from this work. (The underlying assumption here is that moving from traditional methods of teaching based on information transmission to student-centred or active teaching improves student learning. This position is, I think, consolidated by a significant body of research.)

Change Mechanisms

The decision by a lecturer to use what Henderson calls Research-Based Instructional Strategies (RBIS) follows five stages, described by Rogers: (1) knowledge or awareness about the innovation; (2) persuasion about its effectiveness; (3) deciding to use the innovation; (4) implementing the innovation; and (5) confirmation to continue its use.

Awareness of RBIS obviously underlies this process. A 2008 survey by Henderson and Dancy of 722 physics faculty showed that 87% were familiar with at least 1 of 24 identified RBIS applicable to introductory physics, with 48% reporting that they used at least one in their teaching. Time was reported as the most common reason why faculty did not implement more RBIS in their teaching.

A subsequent study by Henderson examined the individual stages of the implementation process in more detail and found that:

  • 12% of faculty had no awareness
  • 16% had knowledge but did not implement (Stages 1–2, above)
  • 23% discontinued after trying (Stages 3–4)
  • 26% continued use at a low level (Stage 5, 1–2 RBIS)
  • 23% continued use at a high level (Stage 5, >3 RBIS)

Innovation Bottleneck

Henderson uses his data to demonstrate that, on the whole, the physics education community does a good job of disseminating RBIS to the community of educators. Just 12% of faculty had no awareness, and 1/6 of those who did made no attempt to implement any. Therefore it can be argued that the fall-off in innovation is at a later stage in the change process. Hence efforts to encourage innovation should aim to address the one third of those with awareness who discontinue after a trial, and to help those with a low level of continuance build on their success. These groups may be a more suitable focus for consideration, in terms of percentage, as well as the fact that they were willing to give an innovation a go, when compared to those who had knowledge but did not try any innovation.

Teasing this out appears to be difficult. The decision to continue seems to come down to personal characteristics, such as a desire to find out more, and gender (females were twice as likely to continue as males, but the paper does dispel some traditional conceptions about who is innovative and who isn’t!).

However, in terms of practical measures that can be made the following are listed:

  • Practice literature can present an overly rosy picture of implementation. Therefore, when someone tries it and hits an unexpected hurdle (student resistance and complaints, concerns over breadth of content, outcomes not as expected), there is a sense that it isn’t working, and the innovation is discontinued. It is therefore important that practice literature gives a full and honest account of implementation.
  • Implementation can be modified to the person’s own circumstances, and in the modification the effectiveness of the innovation can be lost. Pitfalls and important issues should therefore be highlighted at the dissemination stage (workshops, talks, etc.).
  • There is evidence that if an innovation is supported by the designer during the implementation phase, the innovation is more successfully implemented.

Now, who wants to do this analysis for UK/Ireland chemistry?!

References

Charles Henderson, Melissa H. Dancy and Magdalena Niewiadomska-Bugaj (2010) Variables that Correlate with Faculty Use of Research-Based Instructional Strategies, in Proceedings of the 2010 Physics Education Research Conference, 169–172.

Charles Henderson & Dancy, M. (2009) The Impact of Physics Education Research on the Teaching of Introductory Quantitative Physics in the United States, Physical Review Special Topics: Physics Education Research, 5 (2), 020107.

*Thanks to my colleague Claire Mc Donnell for giving me this quote: Joel Michael, Advances in Physiology Education, (2006) 30, 159-167.

Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press.


Using Pre-Lecture Resources in your Teaching


Much of my study on educational research this year has focussed on Pre-Lecture Resources, working with Dr Roisin Donnelly at DIT’s Learning Teaching and Technology Centre and my colleague Dr Claire Mc Donnell. I’ve turned into something of an evangelist for pre-lecture resources, so in order to spread the good word, I have prepared this resource guide for others thinking of using a similar strategy. I’d love to hear from anyone who has considered or is using a similar approach. The guide accompanies a presentation at the 12th Annual Showcase of Learning and Teaching Innovations, DIT, Jan 2011. Click on the image to access “Using Pre-Lecture Resources in your Teaching”.




Teaching Fellowship Launch Presentation


The DIT 2010-2011 Teaching Fellowships were launched on 23rd September 2010, and each recipient of a Fellowship gave a presentation on the work they plan to do. It was really nice to see what others are planning; there was a lot of variety and a lot of overlap at the same time. My presentation – the main thrust of which was summarised in another post – is embedded below. All of the presentations can be viewed from the LTTC website.


The video is streamed from the HEAnet server using the Embedded Video plugin (for the information of any WordPress junkies out there).

