Lessons from running webinars

We are now coming up to the halfway point of the webinar series I launched this year. Webinars run roughly monthly, on the theme of chemistry education research. I’d never hosted webinars before, so it has been interesting, and, when the technology decides not to work, heart-stopping. Useful responses to a post (plea) requesting ideas and guidance are listed here; I think I have incorporated most of the suggestions.

CERGinar 2017–2018 Series

Some thoughts on format

What’s been a real pleasure has been the opportunity to hear speakers I love give a talk. This year, because I was testing the water, I chose speakers whom I have heard before, whom I know will do a good job, and, somewhat selfishly, whom I want to hear again. This led to a list of 42 names scrawled on my office noticeboard, and picking just a few of these was really tough.

Alison Flynn set us off with a stellar talk that ran the spectrum from methods of doing the research right through to implementation in teaching. This was really popular, and that breadth helped address the difficulty of catering to a wide range of audience types. Keith Taber made us think more about methodologies: are experimental approaches appropriate, and what are their limitations? Nathaniel Grove picked up on the format set by Alison, again looking at methods and then at implications, and this seems to be a formula that works. In both cases, a natural break in proceedings offered the chance for a mid-presentation set of questions. That echoes something I have learned from MICER: people love to discuss. Opportunities for discussion compete with wanting to squeeze as much out of the speakers as possible, and the balance is a fine one; for an hour slot, though, 45/15 seems to work out. Nathan’s talk included a guest chair, Niki Kaiser. This was really useful: I could focus on technical matters while Niki asked questions, and it also makes the whole thing less “my” webinar series and more one of the community.

How to choose speakers?

As well as the criterion (this time around) of having seen all the speakers present, there was the difficulty of choosing just a few from my list of favourites. Donald Wink is the next speaker in the series. He gave a talk at Gordon CERP last year which was stellar – probably the best talk I heard in a year of many conferences. It was one of those talks where you stop taking notes and just listen, trying to absorb as much as possible. His clarity in discussing case studies deserves a very wide audience. Then we have Nicole Graulich, who won best poster at Gordon CERP, meaning she got to give a short talk at the end of the conference; I was left wanting to hear much more. Ginger is doing some amazing work around students’ writing, and Vicente… well, we all want to hear Vicente. Both of these are again Gordon speakers. I thought this range of speakers represented some well-established figures, some newer to a wider audience, different aspects of chemistry, and a balance of gender. But I’m sure I can choose another set that will fulfil those criteria.

On and on?

Chemistry education research, as a young discipline in the UK, has two difficulties as I see it. One: there is no money. Two: because there is no money, people do a lot of this work in their spare time or squeezed into a very busy day job. That means things like this tend to get squeezed too, and it becomes difficult for people to attend. The purpose of these webinars was to act as a proxy for the academic seminars our colleagues will be used to in chemistry departments, except focussed on education. I have to say I thought that attendance (because of point two) would be very low, but it has been way above expectations, with lots of discussion in the chat area.

I’d be interested in hearing from people as to whether we should continue with a new series in the autumn, and any ideas for format and speakers. In the meantime, do register for Prof Donald Wink’s seminar on 21st February. You won’t be disappointed.

 

How to do a literature review when studying chemistry education

It’s the time of the year to crank up the new projects. One challenge when aiming to do education research is finding some relevant literature. Often we become familiar with something of interest because we heard someone talk about it or we read about it somewhere. But this may mean that we don’t have many references or further reading that we can use to continue to explore the topic in more detail.

So I am going to show how I generally do literature searches. I hope that my approach will show you how you can source a range of interesting and useful papers relevant to the topic you are studying, as well as identify some of the key papers that have been written about this topic. What I tend to find is that there is never any shortage of literature, regardless of the topic you are interested in, and finding the key papers is a great way to get overviews of that topic.

Where to search?

For literature in education, there are three general places to search. Each has advantages and disadvantages.

For those with access (in a university), Web of Science will search databases which, despite the name, include the Social Sciences and Arts and Humanities indexes. Its advantage is that it is easy to narrow searches down to very specific terms, time periods, and research topics, meaning you can quickly source a list of relevant literature. Its disadvantage is that it doesn’t cover a lot of material that may be relevant but doesn’t pass the criteria for inclusion in the database (peer review, a particular review process, etc.). So, for example, Education in Chemistry articles do not appear here (EiC is a periodical and its articles are not peer reviewed), and CERP articles only began to appear about 10 years ago, thanks to the efforts of the immediate past editors. CERP is there now, but the point is that there are a lot of discipline-based journals (e.g. Australian Journal of Education in Chemistry) that publish good material that isn’t in this database.

The second place to look is ERIC (Education Resources Information Center) – a US database that is very comprehensive. It includes a much wider range of materials such as conference abstracts, although you can limit to peer review. I find ERIC very good, although it can link to more obscure material that can be hard to access.

Finally, there is Google Scholar. This is great: everyone knows how to use it, it links to PDFs of documents where they are available, and it is very fast. The downside is that it is really hard to narrow your search and you get an awful lot of irrelevant hits, though it can be useful if you have very specific search terms. Google Scholar also shows you who cited a work, which is useful, and this is more extensive than Web of Science’s equivalent feature because Google, like ERIC, looks at everything, not just what is listed in a curated database. Google is also good at surfacing books, which you may be able to view in part.

A practice search

I am going to do a literature search for something I am currently interested in: how chemistry students approach studying. I’m interested in devising ways to improve how we assist students with study tasks, and so I want to look to the literature to find out how other people have done this. For the purpose of this exercise, we will see that “study” is a really hard thing to look for because of the many meanings of the word. I intend it to mean how students interact with their academic work, but of course “study” is very widely used in scientific discourse and beyond.

It’s important to write down as precisely as you can what it is you are interested in, because the first challenge when you open up the search database is to choose your search terms.

Let’s start with Web of Science. I’ve said I’m interested in studying chemistry, so what if I put in

study AND chem*

where chem* is my wildcard for the various derivatives that may be used – e.g. chemistry, chemical.


Well, we can see that’s not much use: we get over 1 million hits! By the time I’d gone through those, my students would have graduated. The problem, of course, is that ‘study’ has a general meaning of investigation as well as the specific one we mean here.

Let’s go back. What am I interested in? I am interested in how students approach study. So how might authors phrase this? They might talk about “study approaches”, “study methods”, “study strategies”, “study habits”, or “study skills”, or… well, there are probably a few more, but that will be enough to be getting on with.

(“study approach*” or “study method*” or “study strateg*” or “study habit*” or “study skill*”) AND Chem*

So I will enter the search term as shown. Note the quotation marks: these filter results to those that mention the two words in sequence. Any record matching one of these phrases AND containing a mention of chem* will be returned. Of course this rules out phrasings like “approaches to study”, but we have to start somewhere.
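To make the matching rules concrete, here is a toy sketch in Python of the logic behind this kind of query – quoted phrases matched in sequence, * as a word-stem wildcard, the two parts AND-ed together. The titles are invented and this is plain regex matching against a local list, not a real database API.

```python
import re

def to_regex(term):
    # Turn a search term into a regex: '*' becomes a word-stem wildcard
    # and spaces match any run of whitespace, so "study strateg*"
    # matches "study strategies".
    parts = [re.escape(p).replace(r'\*', r'\w*') for p in term.split()]
    return re.compile(r'\b' + r'\s+'.join(parts) + r'\b', re.IGNORECASE)

def matches(title, phrases, stem):
    # A record is returned if ANY quoted phrase matches AND the stem matches.
    phrase_hit = any(to_regex(p).search(title) for p in phrases)
    stem_hit = to_regex(stem).search(title) is not None
    return phrase_hit and stem_hit

titles = [
    "Study strategies of first-year chemistry undergraduates",
    "A case study approach to curriculum design in biology",
    "Study habits and chemical misconceptions",
]
phrases = ["study approach*", "study method*", "study strateg*",
           "study habit*", "study skill*"]
print([t for t in titles if matches(t, phrases, "chem*")])
```

Note that, as in the real search, “case study approach” satisfies the phrase part of the query; the second title only drops out because it contains no chem* term.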

Search2

How does this look? Over 500 hits. OK, better than a million-plus, but we can see that some of the hits are not relevant at all.

search2 results

In Web of Science, we can filter by category – a very useful feature. So I will refine my results to only those in the education category.

refine search2

This returns about 80 hits. Much better. Before we continue, I am going to exclude conference proceedings. The reason for this is that very often you can’t access the text of these and they clutter up the results. So I will exclude these in the same way as I refined for education papers above, except in this case selecting ‘exclude’. We’re now down to 60 hits, which is a nice enough number for an afternoon’s work.

Thinning

Let’s move on to the next phase: an initial survey of your findings. For this phase you need some form of meticulous record keeping, or you will end up repeating your efforts at some future date. It’s also worth remembering what we are looking for: approaches people have used to develop chemistry students’ study skills – in my case, in chemistry and in higher education. It is VERY easy to get distracted here and move on to the next phase without completing this one; trust me, that will just mean you have to do the initial trawl all over again at some stage.

This trawl involves scanning titles and then abstracts to see if the articles are of interest. The first one in the list looks irrelevant, but clicking on it suggests that it is indeed about chemistry, so it’s worth logging for now. I use the marked list feature, but you might choose a spreadsheet or a notebook – just be consistent! Scrolling through, we can very quickly spot the hits that aren’t relevant. You can see here that including “study approach” in our search terms generates quite a few false hits, because it is picked up by articles mentioning the term “case study approach”.

I’ve shown some of the hits I marked as of interest below. I reduced my number down to 21. This involved scanning quickly through each abstract to see if it mentioned anything meaningful about study skills (their measurement or promotion); if it did, it went in.

marked list 1

Snowballing

You’ll see in the marked list that some papers have been cited by other papers. It’s likely (though not guaranteed) that if someone else found a paper interesting, you might too, so clicking on the citing articles will bring up other, more recent articles, which you can thin out in the same way. Another way to generate more sources is to scan through the papers (especially the introductions) to see which papers influenced the authors of those you are interested in; you’ll often find there are a common few. Both these processes can “snowball”, so that you generate quite a healthy number of papers to read. (Reading will be covered another time…) You can see now why about 50 initial hits is optimal. This step is a bit slow, but being methodical is the key!
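Forward snowballing is essentially a breadth-first traversal of the citation graph. Here is a minimal sketch; the papers and the `cited_by` map are entirely hypothetical – in practice this information comes from the database’s “cited by” links.

```python
from collections import deque

# Toy citation graph: paper -> papers that cite it (all names hypothetical).
cited_by = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": [],
    "D": [],
}

def snowball(seeds, cited_by, max_rounds=2):
    """Breadth-first 'snowball': starting from seed papers, repeatedly
    collect papers that cite anything already collected, up to
    max_rounds hops from a seed."""
    found = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        paper, depth = frontier.popleft()
        if depth >= max_rounds:
            continue  # stop expanding beyond the round limit
        for citing in cited_by.get(paper, []):
            if citing not in found:
                found.add(citing)
                frontier.append((citing, depth + 1))
    return found

print(sorted(snowball(["A"], cited_by)))  # → ['A', 'B', 'C', 'D']
```

Capping the number of rounds (`max_rounds`) is what stops the snowball growing without limit – the same judgement you apply manually when the reading list gets long enough.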

A point to note: it may be useful to read fully one or two papers – especially those which appear to be cited a lot – before going into the thinning/snowballing phases as this can help give an overall clarity and context to the area, and might mean you are more informed about thinning/snowballing.

A practice search – Google Scholar

What about Google Scholar? For those outside universities who don’t have access to Web of Science, this is a good alternative. I enter my search terms using the Advanced search option, accessed by the drop-down arrow in the search box. You’ll see I am still using quotation marks to return exact phrase matches, but there are limitations: for example, strategy won’t return strategies – Google isn’t as clever here. So, depending on the hit rate, you may wish to be more comprehensive with your terms.

Google

With Google, over 300 hits are returned, but there isn’t a simple way to filter them. You can sort by relevance (as Google scores it) or by date, and you can filter by date. The first one in the list, by Prosser and Trigwell, is quite a famous one on university teachers’ conceptions of teaching, and not directly of interest here – although one could argue that we should define our own conception of teaching before thinking about how to promote particular study approaches to students. But I’m looking for more direct hits. With so many results, this is going to involve a pretty quick trawl; opening hits in new tabs means I can keep the original list open. Another hit links to a book – one advantage of Google searching – although getting a sense of what might be covered usually means going to the table of contents, and only a portion of a book may be accessible. The trawl again involves thinning and snowballing; the latter is, I think, much more important in Google, and as mentioned it scopes a much broader citing set.

Searching with ERIC

Finally, let’s repeat with ERIC. Putting in the improved Web of Science term returns 363 hits (or 114 if I select peer-reviewed only).

Eric search

ERIC allows you to filter by journal, and you can see here that it picks up journals that wouldn’t be shown in Web of Science and would be lost or buried in Google. You can also filter by date, by author, and by education level (although the latter should be treated with some caution). ProQuest is a thesis database, so it links to postgraduate theses (subscription required, though you could contact the supervisor).

eric filters

The same process of thinning and snowballing can be applied. ERIC is a little frustrating in that you have to click into each link to find out anything about it, whereas the others show you, for example, the number of citations up front. Also, for snowballing, ERIC does not link to citing articles, instead linking to the journal webpage, which means a few extra clicks. But for a free database, it is really good.

Which is the best?

It’s interesting to note that in the full search I did using these three platforms, each one threw up some results that the others didn’t. I like Web of Science, but that’s because it’s what I am used to. ERIC is impressive in its scope – you can get information on a lot of education-related publications, although getting access to some of the more obscure ones might be difficult. Google is very easy and quick, but for a comprehensive search I think it is a bit of a blunt instrument. Happy searching!

A new review on pre-labs in chemistry

Much of my work over the last year has focussed on pre-labs. In our research, we are busy exploring the role of pre-labs and their impact on learning in the laboratory. In practice, I am very busy making a seemingly endless amount of pre-lab videos for my own teaching.

These research and practice worlds collided when I wanted to answer the question: what makes for a good pre-lab? It’s taken a year of reading and writing and re-reading and re-writing to come up with some sensible answer, which is now published as a review.

There are dozens of articles about pre-labs, and the first task was to categorise these: what are others doing, and why are they doing it? We came up with a few themes, including the most common pair: to introduce the theory behind a lab and to introduce experimental techniques. So far, so obvious. Time and again these reports – we gathered over 60, though there are likely more our search didn’t capture – highlighted that pre-labs had benefit, including unintended benefits (such as a clear increase in students’ confidence about doing labs).

Why were pre-labs showing this benefit? This was addressed more rarely in reports. Some work, including a nice recent CERP paper, described the use of an underpinning framework on which to base the design of pre-labs, meaning the outcomes could be considered within that framework (in that case, self-regulation theory). But we were looking for something more… over-arching: a framework for the design considerations of pre-labs that took account of the unique environment of learning in the laboratory.

We opted for the complex learning framework as a basis for thinking about learning in laboratories, for various reasons. It is consistent with cognitive load theory, an obvious foundation for preparative work. It describes the learning scenario as one where several strands of activity are drawn together, and is ‘complex’ because this act of drawing together requires significant effort (and support). And it offers a clear basis for deciding the nature of the information that should be provided in advance of the learning scenario. Overall, it seemed a sensible fit for thinking about laboratory learning, and especially for preparing for that learning.

What makes for a good pre-lab?

We drew together the learning from the many reports in the pre-lab literature with the tenets of the complex learning framework to derive some guidelines for those thinking about developing pre-laboratory activities. These are shown in the figure. A particular advantage of the complex learning framework is the distinction between supportive and procedural information, which gets to the nitty-gritty of the kind of content that should be incorporated into a pre-lab activity. Casual readers should note that “procedural” here is a little more nuanced than the “procedure” we usually think about in chemistry; we elaborate on this at length.

I hope that this review is useful – it has certainly been a learning experience writing it. The pre-print of the review is now available at http://dx.doi.org/10.1039/C7RP00140A and the final formatted version should follow shortly.

5 guidelines for developing pre-labs

A talk on integrating technology into teaching, learning, and assessment

While in Australia, I was invited to present a talk to the Monash Education Academy on using technology in education. They recorded it and the video is below. The talk had a preamble about a theme of “personalisation” that I am increasingly interested in (thanks especially to some work done by the Physics Education Research Group here at Edinburgh), and then discussed:

  1. Preparing for lectures and flipping
  2. Discussion Boards
  3. Using video for assessment

A view from Down Under

Melbourne Seventh City of Empire, part of the “Brave New World: Australia 1930s” Exhibition at National Gallery of Victoria

I’ve spent the last two weeks in Australia, thanks to a trip to the Royal Australian Chemical Institute’s 100th Annual Congress in Melbourne, where I attended the Chemistry Education symposium.

So what is keeping chemistry educators busy in this part of the world? There are a lot of similarities, but some differences. While we wrestle with the ripples of TEF and the totalitarian threat of learning gains, around here the acronym of fear is TLO: threshold learning outcomes. As I understand it, these are legally binding statements that university courses will ensure students graduate with the stated outcomes. Institutions are required to demonstrate that these learning outcomes are part of their programmes and to identify the level to which they are assessed. This all sounds very good, except that individuals on the ground are now focussing on identifying where these outcomes are being addressed. Given that the outcomes are quite granular, this appears to be a huge undertaking, raising questions like: where, and to what extent, is teamwork assessed in a programme?

Melbourne from the Shrine

This process does appear to have promoted a big interest in broader learning outcomes, with lots of talks on how to incorporate transferable skills into the curriculum, and some very nice research into students’ awareness of their skills. Badges are of interest here and may be a useful way to document these learning outcomes in a way that doesn’t need a specific mark. Labs were often promoted as a way of addressing these learning outcomes, but I do wonder how much we can use labs for learning beyond their surely core purpose of teaching practical chemistry.

Speaking of labs, there was some nice work on preparing for laboratory work and on incorporating context into it. There was a (to me) contentious proposal that a certain number of laboratory activities (such as titrations) be considered core to a chemist’s repertoire, and that graduation should not be allowed until competence in those core activities is demonstrated. Personally I think chemistry is a broader church than that, and it will be interesting to watch that one progress. A round-table discussion spent a good bit of time talking about labs in light of future pressures on funding and space; it does seem that we are still not quite clear about what the purpose of labs is. Distance education – in which Australia has a well-established head start – was also discussed, and I was really glad to hear someone with a lot of experience in it say that it is possible to build a community with online learners, but that it takes substantial personal effort. The lab discussion continued to the end, with a nice talk on incorporating computational thinking into chemistry education, with suggestions on how already-reported lab activities might be used to achieve this.

Gwen Lawrie delivers her Award Address

Of course it is the personal dimension that is the real benefit of these meetings, and it was great to meet faces old and new. Gwen Lawrie wasn’t on the program, as the announcement of her award of the Education Division Medal was kept secret for as long as possible. I could listen to Gwen all day, and her talk, on the theme “Chasing Rainbows”, captured eloquently what it means to be a teacher-researcher in chemistry education, in a landscape that continues to change. [Gwen’s publications are worth trawling.] Gwen’s collaborator Madeline Schultz (a Division Citation winner) spoke about TLOs and about respected practitioners’ reflections on their approaches to teaching chemistry – an interesting study using a lens of pedagogical content knowledge. From Curtin, I (re-)met Mauro Mocerino (whom I heard speak in Europe an age ago on clickers), who spoke here of his long-standing work on training demonstrators. Also from that parish, it was a pleasure to finally meet Dan Southam. I knew Dan only through others, as a man “who gets things done”, so it was lovely to meet him in his capacity as Chair of the Division and of this symposium, and to see that his appellation rang true. And it was nice to meet Elizabeth Yuriev, who does lovely work exploring how students approach physical chemistry problems and helping students with problem-solving strategies.

Dinner Date

There were lots of other good conversations and friendly meetings, demonstrating that chemistry educators are a nice bunch regardless of location. I wasn’t the only international interloper: Aishling Flaherty from the University of Limerick was there to spread her good work on demonstrator training – an impressive programme she has developed and is now trialling in a different university and a different country. And George Bodner spoke about much of his work studying how students learn organic chemistry, in particular the case of “What to do about Parker”. The memory of Prof Bodner sitting at the back of my talk, looking at my slides through a telescopic eyepiece, is a happy one that will stay with me for a long time. Talk of organic chemistry reminds me of a presentation describing the app Chirality-2, which covers lots of aspects of revising organic chemistry and looked really great.

The Pioneer, National Gallery of Victoria

My slightly extended trip was because I had the good fortune to visit the research group of Prof Tina Overton, who moved to Melbourne a few years ago, joining native Chris Thompson in growing the chemistry education group at Monash. It was an amazing experience to immerse myself in a vibrant and active research group, working on topics ranging from student critical thinking, chemists’ career aspirations, and awareness of transferable skills to the process and effect of transforming an entire laboratory curriculum. I learned a lot, as I always do from Tina, and am extremely grateful for her very generous hosting. I leave Australia wondering if I can plan a journey in 2018 for ICCE in Sydney.

From Hokusai exhibition, NGV. My interpretation of students managing in a complex learning environment

Reflections on #MICER17

Two related themes emerged for me from the Methods in Chemistry Education Research meeting last week: confidence and iteration.

Let’s start where we finished: Georgios Tsaparlis’ presentation gave an overview of his career studying problem solving. This work emerged from Johnstone’s remarkable findings on working memory and mental demand (M-demand).1,2 Johnstone proposed a simple relationship: if the requirements of a task were within the capacity of working memory, students would be able to process the task; if not, they would find it difficult. This proposal was borne out by plots of performance against complexity (demand), which showed a substantial drop at the point where M-demand exceeded working memory capacity, and these findings seeded a remarkable amount of subsequent research.

However, things are seldom as simple as they seem and Tsaparlis’ work involved studying this relationship in different areas of chemistry – looking, for example, at how students solve problems in organic chemistry compared to physical chemistry, and the effect of the type of question. Each study was an iteration, an investigation of another aspect of this multi-dimensional jigsaw, aiming to make a little bit more sense each time. Sometimes the results led to an ill-fitting piece, with data being consigned to a desk drawer for a few years until further study allowed it to be explored in a new light. Towards the end of this arc of work, he began to move away from linear modelling, where we look at the strength of individual aspects on an overall hypothesis, to more complex models such as the “random walk”. It is another iteration.

The point to make here is that there was no single study that said: this is how students solve equilibrium questions. Rather, each study added a little more to a particular model of that understanding. Indeed, Keith Taber outlined in his ethics workshop the importance of context and situation in publishing results. Things are rarely definitive and usually context-dependent.

For me this is reassuring. Just like Johnstone’s “ON-OFF” findings for working memory, there is a fear that one is either able to do education research or one isn’t; in responding to Suzanne Fergus’ pre-meeting prompts, which guided her talk on writing research questions, a few participants indicated that “confidence” was one of the barriers to getting involved in education research. I remember an eminent chemistry professor telling me something along the lines of “never look back!” – to just publish what you know to be your best understanding (and instrumentation) at a given time, accepting that further studies and analysis might lead to more understanding.

While this probably wasn’t intended to be as carefree as I leverage it here, there will always be one more publication, one better approach, one alternative understanding. The task then is to continually inform and develop our understanding of what it is we wish to understand. The action research cycles outlined more formally by Orla Kelly in her presentation facilitate this, although of course one might complete several cycles before considering publication. But I think iterations happen naturally as any research study progresses. Graham Scott illustrated this nicely in his presentation; later publications adding further depth to earlier ones. Stewart Kirton discussed building this iteration onto the design of research instruments.

Our task as education researchers, then, is to ensure that we are publishing to the best of our understanding and with good intent – that we believe what we are saying at a particular time is an honest overview of our understanding of our study, at that time, in a given context.

Our task as practitioners is to move on from the duality of things that “work” and things that “don’t work”. The education literature is not unique in tending to publish more positive results than not, so when looking for the “best” way to do something, a keen reader may soon become overwhelmed, and even frustrated with the literature for its lack of clarity. A task for those attending MICER, then, is not necessarily translating research into practice – a common call – but rather communicating a greater awareness of the process of education research, along with how to meaningfully interpret outcomes so that they may be used – or not – in our teaching of chemistry.

 

We are grateful to Chemistry Education Research and Practice for their support of this event, along with the RSC interest groups: Tertiary Education Group and Chemistry Education Research Group.

Methods in Chemistry Education Research 2017 Welcome Slide

[1] J. Chem. Educ., 1984, 61, 847

[2] Education in Chemistry, 1986, 23, 80-84

Wikipedia and writing

Academics have a complicated relationship with Wikipedia. There’s a somewhat reluctant acknowledgement that Wikipedia is an enormously used resource, but as the graphical abstract accompanying this recent J Chem Ed article1 shows, WE ARE NOT TOO HAPPY ABOUT IT. Others have embraced the fact that Wikipedia is a well-used resource, and used this to frame writing assignments as part of chemistry coursework.2-4  There is also some very elegant work on teasing out understanding of students’ perceptions of Wikipedia for organic chemistry coursework.5

Graphical abstract of M. D. Mandler, Journal of Chemical Education, 2017, 94, 271-272.
Graphical abstract of M. D. Mandler, Journal of Chemical Education, 2017, 94, 271-272.

Inspired by a meeting with our University’s Wikimedian in Residence, I decided to try my hand at creating a Wikipedia article. The article was about a little-known chemist who hadn’t been written about before and, I’d say, is generally unknown. I found her name listed on the Women in Red page, which is outside the scope of this post, save to say: go look at that page.

Writing the article was interesting, and some implications from a teaching perspective are listed:

  1. If there isn’t a Wikipedia article, writing a summary overview is quite a lot of work.

One of the great things about Wikipedia is of course that it offers a nice summary of the thing you are interested in, which then prompts you to go and look up other material that you can then pretend to have found originally. But what if there isn’t a Wikipedia article? Where do you start? Googling and gathering some information is part of this, but there is a step before, or at least coincident with, this: scoping out the content of what you want to summarise. This will involve reading enough to begin an overview plan, and then searching to find information to fill out that plan. In chemistry, the order of searching will likely go Google > Google Scholar > databases like Web of Science > Google Books… Because of my context, I also got stuck into the RSC’s Historical Collection (a terribly under-promoted, amazing resource). In any case, there is some good work to do here on developing information literacy (which in a formal education setting would probably need to be structured).

  2. #citethescheise

I was encouraged in writing to cite my work well, linking to original and verifiable sources. I am long enough in the game to know this, and may be known to advise novice academic writers to “referencify” their work for journals; the academic genre is one where we expect lots of superscript numbers to make a text look like it is well informed. Wikipedia has a very particular style where essentially every fact needs a citation. This is something I did reasonably well, but I was very pleasantly surprised to see that someone else looked quite closely at these (new articles are reviewed by people who make amendments/changes). I know this because in my case I cited a modern J Mat Chem paper which offered an example of where the original contribution of my chemist had been cited about a century later, in 2016 (notability is a requirement on Wikipedia, so I had this in mind). This reference had been checked, with the relevant line from it added to the citation. It was reassuring to know that someone took the time to consider the references in this amount of detail.

From a teaching point of view, we try in lab reports and theses to encourage students to verify claims or opinions with data or literature. This seems like very good training for that. The point was also made to me that it teaches students to explore the veracity of what they read on Wikipedia, by considering the sources quoted.

  3. Learning to write

Wikipedia is an encyclopaedia (duh) and as such it has a particular style. I actually found it very difficult to write initially and went through quite a few drafts in Word with a view to keeping my piece pretty clinical and free of personal opinion. Asking students to write Wikipedia articles will undoubtedly improve their writing in that style; I’m not too sure yet how beneficial that is. I feel the greater benefits are in information searching and citing, and in scoping out a narrative, but that is probably a personal bias. Edit: fair point made in this tweet: https://twitter.com/lirazelf/status/865124724166320128

  4. Writing to learn

Whatever about developing writing skills, I certainly learned a lot about my subject as well as much more context about the particular topic. Quite a lot of what I read didn’t make it into the final article (as it might have, for example if I were writing an essay). But as we know from preparing lecture notes, preparing a succinct summary of something means that you have to know a lot more than the summary you are presenting.

Why Wikipedia?

In challenging the arguments about Wikipedia such as those indicated in the graphical abstract above, I do like the idea of students getting to know and understand how the site works by interacting with it. Wikipedia usage is here to stay and I do think there is a strong argument for using it in academic writing and information literacy assignments. One very nice outcome is that something real and tangibly useful is being created, and there is a sense of contributing. Writing for something that is going to go live to the world means that it isn’t “just another exercise”. And Wikipedia articles tend to come to the top of Google searches (mine was there less than an hour after publishing).

Search view at 16:12 (left) after publishing, and at 16:43 (right)

I’m interested now in looking at Wikipedia writing, certainly in informal learning scenarios. A particular interest is going to be exploring how it develops information literacy skills and how we structure this with students.

My page, I’m sure you are dying to know, is: https://en.wikipedia.org/wiki/Mildred_May_Gostling.

Lots of useful points about Wikipedia here – see Did You Know.

With thanks to Ewan McAndrew and Anne-Marie Scott.

Referencifying 

  1. M. D. Mandler, Journal of Chemical Education, 2017, 94, 271-272.
  2. C. L. Moy, J. R. Locke, B. P. Coppola and A. J. McNeil, Journal of Chemical Education, 2010, 87, 1159-1162.
  3. E. Martineau and L. Boisvert, Journal of Chemical Education, 2011, 88, 769-771.
  4. M. A. Walker and Y. Li, Journal of Chemical Education, 2016, 93, 509-515.
  5. G. V. Shultz and Y. Li, Journal of Chemical Education, 2016, 93, 413-422.

 

Links to Back Issues of University Chemistry Education

I don’t know if I am missing something, but I have found it hard to locate past issues of University Chemistry Education, the predecessor to CERP.  They are not linked on the RSC journal page. CERP arose out of a merger between U Chem Ed and CERAPIE, and it is the CERAPIE articles that are hosted in the CERP back issues. Confused? Yes. (More on all of this here)

Anyway, in searching and hunting for old U Chem Ed articles, I have cracked the code of links and compiled links to back issues below. They are full of goodness. (The very last article published in UCE was the very first chemistry education paper I read – David McGarvey’s “Experimenting with Undergraduate Practicals“.)

Links to Back Issues

Contents of all issues: http://www.rsc.org/images/date_index_tcm18-7050.pdf 

1997 – Volume 1:

1 – remains elusive… It contains Johnstone’s “And some fell on good ground” so I know it is out there… Edit: cracked it – they are available by article:

1998 – Volume 2:

1 – http://www.rsc.org/images/Vol_2_No1_tcm18-7034.pdf

2 – http://www.rsc.org/images/Vol_2_No2_tcm18-7035.pdf

1999 – Volume 3:

1 – http://www.rsc.org/images/Vol_3_No1_tcm18-7036.pdf

2 – http://www.rsc.org/images/Vol_3_No2_tcm18-7037.pdf

2000 – Volume 4:

1 – http://www.rsc.org/images/Vol_4_No1_tcm18-7038.pdf

2 – http://www.rsc.org/images/Vol_4_No2_tcm18-7039.pdf

2001 – Volume 5:

1 – http://www.rsc.org/images/Vol_5_No1_tcm18-7040.pdf

2 – http://www.rsc.org/images/Vol_5_No2_tcm18-7041.pdf

2002 – Volume 6:

1 – http://www.rsc.org/images/Vol_6_No1_tcm18-7042.pdf

2 – http://www.rsc.org/images/Vol_6_No2_tcm18-7043.pdf

2003 – Volume 7:

1 – http://www.rsc.org/images/Vol_7_No1_tcm18-7044.pdf

2 – http://www.rsc.org/images/Vol_7_No2_tcm18-7045.pdf

2004 – Volume 8:

1 – http://www.rsc.org/images/Vol_8_No1_tcm18-7046.pdf

2 – http://www.rsc.org/images/Vol_8_No2_tcm18-7047.pdf

I’ve now downloaded all of these in case of future URL changes. Yes, I was a librarian in another life.

UCE logo

Dialogue in lectures

This is not a post on whether the lecture is A Good Thing or not. Lectures happen. PERIOD!

A paper by Anna Wood and colleagues at the Edinburgh PER group, along with a subsequent talk by Anna at Moray House has gotten me thinking a lot over the last year about dialogue and its place in all of our interactions with students. The literature on feedback is replete with discussion on dialogue, sensibly so. The feedback cycle could be considered (simplistically) as a conversation: the student says something to the teacher in their work; the teacher says something back to the student in their feedback. There’s some conversation going on. Feedback literature talks about how this conversation continues, but what about the bit before this conversation begins?

Not a monologue

The spark from Wood’s work for me was that lectures are not a monologue. She is considering active lectures in particular, but cites Bamford* who gives a lovely overview of the nature of conversation in lectures in general. Bamford presents the case that lectures are not a monologue, but are a conversation. Just as in the feedback example above, two people are conversing with each other, although not verbally. In a lecture, the lecturer might ask: “Is that OK?”. An individual student might respond inwardly “Yes, I am getting this” or “No, I haven’t a freaking clue what is going on and when is coffee”. A dialogue happened. Wood’s paper discusses these vicarious interactions – a delicious phrase describing the process of having both sides of the conversation; an internal dialogue of sorts. She describes how this dialogue continues in active lectures, but sadly there is only one Ross Galloway, so let’s think about how this conversation might continue in lectures given by us mere mortals. How can we help and inform these vicarious interactions?

Developing a conversation

A problem you will by now have identified is that the conversation – “Is that OK?” and a silent retort – isn’t much of a conversation. So how can we continue it? My intention is to consider conversation starters in lectures that foster a sense in each individual student that they are having a personal conversation with the lecturer at points during the lecture, and that incorporate guides for the student to continue this conversation after the lecture, up to the point that they submit their work, prompting the conversation we started with above.

In Wood’s talk, she mentioned specific examples. The lecturer would ask something like: “Is 7 J a reasonable answer?” A problem with “Is that OK?” is that it is too broad. It’s difficult to follow up the conversation specifically as it likely ends with yes or no.

How about a lecturer asks: “Why is this smaller than…?” You’re a student, and you’re listening. Why is it smaller? Do you know? Yes? No? Is it because…? Regardless of your answer, you are waiting for the response. You think you know the answer, or you know you don’t.

If we are to take dialogue seriously, then the crucial bit is what happens next. Eric Mazur will rightly tell us that we should allow discussion with peers about this, but we are mortals, and want to get on with the lecture. So how about the conversation goes something like this:

“Why is this smaller than…?”

[pause]

You are a student: you will have an answer: You know, you think you know, you don’t know, you don’t know what’s going on. You will have some response.

The lecturer continues:

“For those of you who think…”

The lecturer responds with a couple of scenarios. The conversation continues beyond a couplet.

Did you think of one of these scenarios? If so the lecturer is talking to you. Yes I did think that and I have it confirmed now I am right. Or: yes I did think that, why is that wrong?

The lecturer can continue:

“While it makes sense to think that, have a look in reference XYZ for a bit more detail”.

The lecturer thus concludes this part of the conversation. A dialogue has happened, and each student knows where they stand: they have a good idea of what is going on; they don’t, but know where to follow up this issue; or they haven’t a clue what is going on. Whichever the case, there is some outcome, and some action prompted. Indeed one could argue that this prompted action (refer to reference) is a bridge between the lecture and the tutorial – I checked this reference but don’t understand – and so the conversation continues there.

This all seems very obvious, and maybe everyone else does this and didn’t tell me. My lectures tend to have lots of “Is that OK?” type questions, but I do like this idea of a much more purposeful design to structuring the conversation with a large class. I should say that this is entirely without research base beyond what I have read, but I think it would be very empowering for students to think that a lecturer is aiming to have a conversation with them.


*Bamford is cited in Wood’s paper and I got most of it on Google Books.

Revising functional groups with lightbulb feedback

I’m always a little envious when people tell me they were students of chemistry at Glasgow during Alex Johnstone’s time there. A recent read from the Education in Chemistry back-catalogue has turned me a shade greener. Let me tell you about something wonderful.

The concept of working memory is based on the notion that we can process a finite number of new bits in one instance, originally thought to be about 7, now about 4. What these ‘bits’ are depends on what we know. So a person who only knows a little chemistry will look at a complex organic molecule and see lots of carbons, hydrogens, etc. joined together. Remembering it (or even discussing its structure/reactivity) would be very difficult – there are too many bits. A more advanced learner may be able to identify functional groups, where a group is an assembly of atoms in a particular pattern; ketones, for example, being an assembly of three carbons and an oxygen, with particular bonding arrangements. This reduces the number of bits.

Functional groups are important for organic chemists as they determine the reactivity of the molecule, and a challenge for novices is first being able to identify them. In order to help students practise this, Johnstone developed an innovative approach (this was 1982): an electronic circuit board.

Functional Group Board: Black dots represent points where students needed to wire from the name to an example of the functional group

The board was designed so that it was covered with a piece of paper listing all functional groups of interest on either side, and then an array of molecules in the middle, with functional groups circled. Students were asked to connect a lead from the functional group name to a matching functional group, and if they were correct, a lightbulb would flash.

A lightbulb would flash. Can you imagine the joy?!

Amide backup card

If the bulb didn’t flash, “back-up cards” were available so that students could review any groups they had connected incorrectly, and they were then directed back to the board.

The board was made available to students in laboratory sessions, and they were just directed to play with it in groups to stimulate discussion (and so as “not to frighten them away with yet another test”). Thus students were able to test out their knowledge, and if incorrect they had resources to review and re-test. Needless to say the board was very popular with students, such that more complex sheets were developed for medical students.

Because this is 1982 and pre-… well, everything, Johnstone offers instructions for building the board, developed with the departmental electrician. Circuit instructions for a 50 × 60 cm board were given, along with details of mounting various plans of functional groups onto the pegboard for assembly. I want one!
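For the curious, the board’s check-and-feedback logic can be sketched in a few lines of code. This is just a toy illustration of the idea, not a reconstruction of the 1982 circuit; the group names, examples, and card text below are hypothetical placeholders, not taken from the paper.

```python
# Toy sketch of the functional group board's logic: wiring a correct pair
# "closes the circuit" and flashes the bulb; a wrong pair sends the student
# to a back-up card and then back to the board. All data is illustrative.

BOARD = {  # functional group name -> circled example molecule
    "ketone": "propanone",
    "ester": "ethyl ethanoate",
    "amide": "ethanamide",
    "alcohol": "ethanol",
}

BACKUP_CARDS = {  # review material shown after an incorrect connection
    "amide": "An amide has a carbonyl carbon bonded to nitrogen, C(=O)N.",
}

def connect(group_name: str, example: str) -> str:
    """Simulate plugging a lead from a group name to an example molecule."""
    if BOARD.get(group_name) == example:
        return "lightbulb flashes"
    # incorrect: direct the student to the back-up card, then back to the board
    card = BACKUP_CARDS.get(group_name, "see the back-up card")
    return f"no flash: {card} Then try the board again."
```

The lightbulb is the whole point, of course: immediate, low-stakes feedback, with the back-up card closing the loop for incorrect attempts.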

 

Reference

A. H. Johnstone, K. M. Letton, J. C. Speakman, Recognising functional groups, Education in Chemistry, 1982, 19, 16-19. RSC members can view archives of Education in Chemistry via the Historical Collection.

What is the purpose of practical work?

I have been reading quite a lot about why we do practical work. Laboratory work is a core component of the chemistry (science) curriculum but its ubiquity means that we rarely stop to consider its purpose explicitly. This leads to many problems. An interesting quote summarises one:

One of the interesting things about laboratories is that there has never been definite consensus about those serious purposes. Perhaps that is why they have remained popular: they can be thought to support almost any aim of teaching.1

Even within institutions, where there might be some prescription of the purpose in broad terms, different faculty involved in the laboratory may have different emphases, and subsequently the message about the purpose of practical work differs depending on who is running the lab on a given day.2

This matters for various reasons. The first is that if there is confusion about the purpose of practical work, then everyone involved will place their attention onto the part that they think is most important. Academics will likely consider overarching goals, with students developing scientific skills and nature of science aspects.3 Demonstrators will think about teaching how to use instruments or complete techniques. Students will follow the money, and focus on the assessment, usually the lab report, which means their time is best utilised by getting the results as quickly as possible and getting out of the lab.4 Everybody’s priority is different because the purposes were never made clear. As in crystalline.

The second reason that thinking about purposes matters is that without an explicit consideration of what the purposes of practical work are, it is difficult to challenge these purposes and consider their value. How many lab manuals open with a line similar to: “The purpose of these practicals is to reaffirm theory taught in lectures…”? The notion that the purpose of practicals is somehow to supplement taught material in lectures has long come in for criticism, and has little basis in evidence. Laboratories are generally quite inefficient places to “teach” theory. Woolnough and Allsop argued vehemently for cutting the “Gordian Knot” between theory and practical work, arguing that practical settings offered their own unique purpose that, rather than being subservient to theory work, complemented it.5 Kirschner picks this argument up, describing science education in terms of substantive structure and syntactical structure. The former deals with the knowledge base of science, the latter with the acts of how we do science.6 Anderson had earlier distinguished between “science” and “sciencing”.7

Discussion therefore needs to focus on what this syntactical structure is – what is “sciencing”? Here, the literature is vast, and often contradictory. To make a start, we look to Johnstone who, with his usual pragmatism, distinguished between aims of practical work (what we set out to do) and objectives of practical work (what the students achieve).8 With this in mind, we can begin to have some serious discussion about what we want practical work to achieve in our curricula.

Links to source

References

  1.  White, R. T., The link between the laboratory and learning. International Journal of Science Education 1996, 18 (7), 761-774.
  2. Boud, D.; Dunn, J.; Hegarty-Hazel, E., Teaching in laboratories. Society for Research into Higher Education & NFER-Nelson Guildford, Surrey, UK: 1986.
  3. Bretz, S. L.; Fay, M.; Bruck, L. B.; Towns, M. H., What faculty interviews reveal about meaningful learning in the undergraduate chemistry laboratory. Journal of Chemical Education 2013, 90 (3), 281-288.
  4. (a) DeKorver, B. K.; Towns, M. H., General Chemistry Students’ Goals for Chemistry Laboratory Coursework. Journal of Chemical Education 2015, 92 (12), 2031-2037; (b) DeKorver, B. K.; Towns, M. H., Upper-level undergraduate chemistry students’ goals for their laboratory coursework. Journal of Research in Science Teaching 2016, 53 (8), 1198-1215.
  5. Woolnough, B. E.; Allsop, T., Practical work in science. Cambridge University Press: 1985.
  6. Kirschner, P. A., Epistemology, practical work and academic skills in science education. Science & Education 1992, 1 (3), 273-299.
  7. Anderson, R. O., The experience of science: A new perspective for laboratory teaching. Teachers College Press, Columbia University: New York, 1976.
  8. Johnstone, A. H.; Al-Shuaili, A., Learning in the laboratory; some thoughts from the literature. University Chemistry Education 2001, 5 (2), 42-51.

 

Why do academics use technology in teaching?

This week is All Aboard week in Ireland, aimed at “Building Confidence in Digital Skills for Learning”. I am speaking today in the gorgeous city of Galway on this topic, and came across this paper in a recent BJET which gives some useful context. It summarises interviews with 33 Australian academics from various disciplines on the topic of why they used technology in assessment. While the particular lens is on assessment, I think there are some useful things to note for those espousing the incorporation of technology generally.

Four themes emerge from the interviews

The first is that there is a perceived cost-benefit analysis at play; the cost of establishing an assessment process (e.g. quizzes) was perceived to be offset by the benefit that it would offer, such as reducing workload in the long-run. However, some responses suggest that this economic bet didn’t pay off, and that lack of time meant that academics often took quick solutions or those they knew about, such as multiple choice quizzes.

The second theme is that technology was adopted because it is considered contemporary and innovative; this suggests a sense of inevitability of using tools as they are there. A (mildly upsetting) quote from an interview is given:

“It would have been nice if we could have brainstormed what we wanted students to achieve, rather than just saying “well how can ICT be integrated within a subject?”

The third theme was one around the intention to shape students’ behaviour – providing activities to guide them through learning. There was a sense that this was expected and welcomed by students.

Finally, at the point of implementation, significant support was required, which often wasn’t forthcoming, and because of this, and other factors, intentions had to be compromised.

The authors use these themes to make some points about the process of advocating and supporting those integrating technology. I like their point about “formative development” – rolling out things over multiple iterations and thus lowering the stakes. Certainly my own experience (in hindsight!) reflects the benefit of this.

One other aspect of advocacy that isn’t mentioned but I think could be is to provide a framework upon which you hang your approaches. Giving students quizzes “coz it helps them revise” probably isn’t a sufficient framework, and nor is “lecture capture coz we can”. I try to use the framework of cognitive load theory as a basis for a lot of what I do, so that I have some justification for when things are supported or not, depending on where I expect students to be at in their progression. It’s a tricky balance, but I think such a framework at least prompts consideration of an overall approach rather than a piecemeal one.

There’s a lovely graphic from All Aboard showing lots of technologies, and as an awareness tool it is great. But there is probably a huge amount to be done in terms of digital literacy, regarding both the how, but also the why, of integrating technology into our teaching approaches.

Click link to go to All Aboard webpage

 

#ViCEPHEC16 – curly arrows and labs

The annual Variety in Chemistry Education/Physics Higher Education conference was on this week in Southampton. Some notes and thoughts are below.

Curly arrows

Physicists learned a lot about curly arrows at this conference. Nick Greeves‘ opening keynote spoke about the development of ChemTube3D – a stunning achievement of over 1000 HTML pages, mostly developed by UG students. News for those who know the site is that 3D curly arrow mechanisms are now part of the reaction mechanism visualisations – really beautiful visualisations of changing orbitals as a reaction proceeds for 30+ reactions – along with lovely visualisations of MOFs, direct links to/from various textbooks, and an app at the prototype stage. Nick explained that this has all been developed with small amounts of money from various agencies, including the HEA Physical Sciences Centre.

Mike Casey from UCD spoke about a resource at a much earlier stage of development: an interactive mechanism tutor. Students can choose a reaction type and then answer the question by drawing the mechanism – based on their answer they receive feedback. Version 2 is on the way with improved feedback, but I wondered if this feedback might include a link to the appropriate place in ChemTube3D, so that students could watch the associated visualisation as part of the feedback.

In the same session Robert Campbell spoke about his research on how A-level students answer organic chemistry questions. My understanding is that students tend to use rules of mechanisms (e.g. a primary alkyl halide means it’s always SN2) without understanding why, hence promoting rote learning. In a nice project situated in the context of cognitive load theory, Rob used Livescribe technology to investigate students’ reasoning. Looking forward to seeing this research in print.

Rob’s future work alluded to considering the video worked answers described by Stephen Barnes, also for A-level students. These demonstrated a simple but clever approach; using questions resembling A-level standard, asking students to complete them, providing video worked examples so students could self-assess, and then getting them to reflect on how they can improve. David Read mentioned that this model aligned with the work of Sadler, worth a read.

Laboratory work

Selfishly, I was really happy to see lots of talks about labs on the programme. Ian Bearden gave the physics keynote, and he spoke about opening the laboratory course – meaning removing prescriptive procedures and allowing students to develop their own. Moving away from pure recipe is of course music to this audience’s ears and the talk was very well received. But you can’t please everyone – I would have loved to hear much more about what was done and the data involved, rather than the opening half of the talk about the rationale for doing so. A short discussion prompted this tweet from Felix Janeway, something we can agree on! But I will definitely be exploring this work more. Ian also mentioned that this approach is also part of physics modules taught to trainee teachers, which sounded a very good idea.

Jennifer Evans spoke about the prevalence of pre-labs in UK institutions, following on from the Carnduff and Reid study in 2003. Surprisingly, many institutions don’t have any form of pre-lab work. It will be interesting to get a sense of what pre-lab work involves – is it theory or practice? Theory and practice were mentioned in a study from Oxford presented by Ruiqi Yu, an undergraduate student. This showed mixed messages on the purpose of practical work, surely something the academy needs to agree on once and for all. There was also quite a nice poster from Oxford involving a simulation designed to teach experimental design, accessible at this link; this was also built by an undergraduate student. Cate Cropper from Liverpool gave a really useful talk on tablets in labs – exploring the nitty gritty of how they might work. Finally on labs, Jenny Slaughter gave an overview of Bristol ChemLabS, which is neatly summarised in this EiC article, although the link to the HEA document has broken.

Other bites

  • Gwen Lawrie (via Skype) and Glenn Hurst spoke about professional development; Gwen mentioned this site she has developed with Madeline Schultz and others to inform lecturers about PCK. Glenn spoke about a lovely project on training PhD students for laboratory teaching – details here.  This reminds me of Barry Ryan‘s work at DIT.
  • Kristy Turner gave an overview of the School Teacher Fellow model at Manchester, allowing her to work both at school and university, with obvious benefits for both. Kristy looked forward to an army of Kristys, which would indeed be formidable, albeit quite scary. Even without that, the conference undoubtedly benefits from the presence of school teachers, as Rob’s talk, mentioned above, demonstrates.
  • Rachel Koramoah gave a really great workshop on qualitative data analysis. Proving the interest in chemistry education research, this workshop filled up quickly. The post-it note method was demonstrated, which was interesting and which I will certainly explore more, but I hope to tease out a bit more detail on the data reduction step. This is the benefit of this model – the participants reduce the data for you – but I worry that this might in turn lead to loss of valuable data.
  • Matthew Mears gave a great byte on the value of explicit signposting to textbooks using the R-D-L approach: Read (assign a reading); Do (Assign questions to try); Learn (assign questions to confirm understanding). Matt said setting it up takes about 30 minutes and he has seen marked improvements in student performance in comparison to other sections of the course.
  • David Nutt won the best poster prize. His poster showed the results of eye-tracking experiments to demonstrate the value or not of an in-screen presenter. Very interesting results which I look forward to seeing in print.

The conference organisation was brilliant and thanks to Paul Duckmanton and Charles (Minion) Harrison for leading the organisation. Lots of happy but tired punters left on Friday afternoon.

I couldn’t attend everything, and other perspectives on the meeting can be found at the links below. From Twitter, Barry Ryan’s presentation on NearPod seemed popular, along with the continuing amazingness of my colleagues in the Edinburgh Physics Education Research Group. One of their talks, by Anna Wood, is available online.

Getting ready to badge and looking for interested partners

Over the summer we have been working on a lab skills badging project. Lots of detail is on the project home site, but briefly this is what it’s about:

  • Experimental skills are a crucial component of student laboratory learning, but we rarely assess them, or even check them, formally. For schools, there is a requirement to show that students are doing practical work.
  • By implementing a system whereby students review particular lab techniques in advance of labs, demonstrate them to a peer while being videoed, review the technique with a peer using a checklist, and upload the video for assessment, we intend that students will be able to learn and perform the technique to a high standard.
  • The video can form part of a student’s electronic portfolio that they may wish to share in future (see this article for more on that).
  • The process is suitable for digital badging – awarding of an electronic badge acknowledging competency in a particular skill (think scout badges for… tying knots…).

Marcy Towns has a nice paper on this for pipetting and we are going to trial it for this and some other lab techniques.

Looking for interested parties to trial it out

I am looking for school teachers who would like to try this method out. It can be used to document any lab technique or procedure you like. You don’t necessarily need an exemplar video, but a core requirement is that you want to document students’ laboratory work formally, and acknowledge achievement in this work with a digital badge. We will provide the means to offer the badge, and exemplar videos if you need them, assuming they are within our stock. Interested teachers will be responsible for local implementation and assessment of quality (i.e. making the call on whether a badge is issued).

Yes I need help with badge design

This will be part of a larger project and there will be some research on the value and impact of the digital badges, drawing from implementation case studies. This will be discussed with individuals, depending on their own local circumstances.

So if you are interested, let’s badge! You can contact me at: michael.seery@ed.ac.uk to follow up.

What is the “education literature”?

Over on the Education in Chemistry blog, Paul MacLellan wrote an excellent article on reasons teachers don’t engage with education research, which is well worth a read. Speaking a few years ago, I used the analogy of a paddle steamer when talking about the penetration of education research in HE. The paddles throw up some splashes as the boat sails over the vast quantity of water below. These splashes were meant to represent how many engage with research – taking on what they hear on the grapevine, Twitter, or CPD. It isn’t systematic.

I’ve spent a lot of time wondering about whether I should expect my colleagues to read education research, and on balance, I don’t think I should. The reason stems from the argument made about different audiences by Keith Taber in MacLellan’s article, and quantified by the featured commenter under his post. And I think we need some clarity about what we mean by education research literature.

Primary, secondary, and tertiary literature

New research in any field is iterative. We know a certain amount, and someone does some research to add a little more to our understanding. In publishing these research findings, we tend to summarise what was known before to situate the work in context, and add on the new bit. As Taber points out, education is in the unique position of aiming to address two audiences: like any field, it addresses other education researchers; but it also has a second audience, practitioners who may wish to change some aspect of their teaching and are looking for “what works”. The trouble with the mixed audience is that the language and semantics for each are very different, leaving the practitioner feeling very frustrated. The featured comment under MacLellan’s blog documents this very well. The practitioner looking to improve faces a difficult challenge: they use a search engine with decent keywords and have to try to pick out a paper that will offer them nuggets. It really is a needle in a haystack (or a splash of water from the river boat…).

If asked for advice, I think I would rather suggest that such practitioners instead refer to secondary or tertiary literature. Secondary literature aims to summarise the research findings in a particular field. While it is still written with an audience of researchers from the field in mind, these reviews typically group the results of several individual studies into themes or overarching concepts, which can be useful to practitioners who wish to see “what might work” in their own context. MacArthur and Jones’ review on clickers, which I recall finding valuable, and my own review of flipped lectures in chemistry are examples of this type.

The audience shifts more fully when we move to tertiary literature. While there are still likely two audiences for education research, the emphasis of tertiary literature is on introducing the field to a wider audience of interested readers. Books summarising teaching approaches are typically grounded in well-documented research, but unlike secondary sources they are written for those wishing to find out about a topic at an introductory level, and the language is considerate of the wider audience. Think of Taber’s books on misconceptions, and the impact they have had. More recently, the web has offered us new forms of tertiary literature: blogs are becoming a popular way to disseminate the usefulness of research to a wider audience, and summaries such as that recently published by Tina Overton on factors to consider in teaching approaches can help introduce overarching research findings, without the reader having to penetrate the original education research studies.

So should my colleagues read education research? I still don’t think so. A tourist to a new city wouldn’t read academic articles on transport infrastructure and architecture – they would just read the tourist guide. Of course it can be inspiring to read a case study or see what students in an individual situation experienced. But I would rather recommend secondary and tertiary sources to them if they are going to spend any valuable time reading.

And that means, in chemistry education’s case, that we need a lot more of these types of publication. A recent J. Chem. Ed. editorial suggested that the journal is thinking about promoting this type of publication, and any movement in that direction is welcome.

Planning a new book on laboratory education

Contracts have been signed, so I am happy to say that I am writing a book on chemistry laboratory education as part of the RSC’s new Advances in Chemistry Education series, due for publication in mid-2017.

I’ve long had an interest in lab education, since stumbling across David McGarvey’s “Experimenting with Undergraduate Practicals” in University Chemistry Education (now CERP). Soon after, I met Stuart Bennett, now retired, of the Open University at a European summer school. Stuart spoke about lab education and its potential affordances in the curriculum. He was an enormous influence on my thinking in chemistry education, and in practical work in particular. We’d later co-author a chapter on lab education for a book for new lecturers in chemistry published by the RSC (itself a good example of the benefits of European collaboration). My first piece of published education research was based on laboratory work: a report in CERP on the implementation of mini-projects in a chemistry curriculum, completed with good friends and colleagues Claire Mc Donnell and Christine O’Connor. So I’ve been thinking about laboratory work for a long time.

Why a book?

A question I will likely be asking with increasing despair over the coming months is: why am I writing a book? To reaffirm to myself as much as anything else, and to remind me if I get lost on the way, the reasons are pretty straightforward.

My career decisions and personal interests over the last few years have meant that I have moved my focus entirely to chemistry education. Initially this involved sneaking in some reading between the covers of J. Mat. Chem. when I was meant to be catching up on metal oxide photocatalysis. But as time went on and thanks to the support of others involved in chemistry education, this interest became stronger. I eventually decided to make a break with chemistry and move into chemistry education research. (One of the nicest things for me personally about joining Edinburgh was that this interest was ultimately validated.)

So while my knowledge of the latest chemistry research is limited mainly to Chemistry World reports, one thing I do know well is the chemistry education research literature. And there is a lot of literature on laboratory education. As I read it and try to keep on top of it, it is apparent that much of it falls into themes, and that, with a bit of rethinking of these themes and by taking a curriculum design approach, some guiding principles for laboratory education can be drawn up. A compilation of such principles, offered in the context of a roadmap or plan for laboratory education, might be useful to others.

And this is what I hope to offer. The book is purposefully targeted at anyone responsible for a traditional university-level chemistry laboratory course who is looking to change it. In reality, such change is an enormous task and, being pragmatic, needs to happen in phases. It’s tempting, then, to tweak and change bits based on some innovation presented at a conference or seen in a paper. But there needs to be an overall design for the entire student experience, so that incremental changes sum to a consistent whole. Furthermore, by offering a roadmap or overall design, I hope to empower the members of staff responsible for such change by giving them the evidence they may need to rationalise changes to colleagues. Everyone has an opinion on laboratory education! The aim is to provide evidence-based design approaches.

My bookshelves are groaning with excellent books on laboratory education. I first came across Teaching in Laboratories by Boud, Dunn and Hegarty-Hazel back in the days when I stumbled across McGarvey’s article. I still refer to it: even though it was published in 1986, it still carries a lot of useful material. Woolnough and Allsop’s Practical Work in Science is also excellent; crystal clear on the role and value of laboratory education and its distinction from the lecture-based curriculum. Hegarty-Hazel also edited The Student Laboratory and the Science Curriculum. Roger Anderson’s book The Experience of Science was published before I was born.

I have bought these now out-of-print books, and several more, second hand for less than the cost of a cup of coffee. I have learned lots from them, but I am mindful that, justifiably well known and comprehensive as they are, they are out of print, and our university laboratories have not seen much change in the forty years since Anderson.

I am very conscious of this as I structure my own book. I can speculate that books covering science laboratories at both secondary and tertiary level may have been too broad. So this book focusses exclusively on chemistry in higher education.

Secondly, the book is very clearly directed at those implementing a new approach – those involved in change. Ultimately it is their drive, energy, and input that decide the direction of the changes that will occur. I hope that by speaking directly to them, with a clear rationale and an approach based on up-to-date literature, it may ease the workload somewhat for those looking to rethink laboratory education in their curricula. Now I just need to actually write it.

Alex Johnstone’s 10 Educational Commandments

My thanks to Prof Tina Overton for alerting me to the fact that these exist. I subsequently happened across them in this article detailing an interview with Prof Johnstone (1), and thought they would be useful to share.

Ten Educational Commandments 

1. What is learned is controlled by what you already know and understand.

2. How you learn is controlled by how you learned in the past (related to learning style but also to your interpretation of the “rules”).

3. If learning is to be meaningful, it has to link on to existing knowledge and skills, enriching both (2).

4. The amount of material to be processed in unit time is limited (3).

5. Feedback and reassurance are necessary for comfortable learning, and assessment should be humane.

6. Cognisance should be taken of learning styles and motivation.

7. Students should consolidate their learning by asking themselves about what goes on in their own heads— metacognition.

8. There should be room for problem solving in its fullest sense (4).

9. There should be room to create, defend, try out, hypothesise.

10. There should be opportunity given to teach (you don’t really learn until you teach) (5).

Johnstone told his interviewer that he didn’t claim any originality for the statements, which his students called the 10 educational commandments; rather, he merely brought together well-known ideas from the literature. But, importantly for this fan, Johnstone said that they have been built into his own research and practice, and that he uses them as “stars to steer by”.

References

  1. Cardellini, L. J. Chem. Educ., 2000, 77, 1571.
  2. Johnstone, A. H. Chemical Education Research and Practice in Europe (CERAPIE), 2000, 1, 9–15; online at http://www.uoi.gr/cerp/2000_January/contents.html.
  3. Johnstone, A. H. J. Chem. Educ., 1993, 70, 701–705.
  4. Johnstone, A. H. In Creative Problem Solving in Chemistry; Wood, C. A., Ed.; Royal Society of Chemistry: London, 1993.
  5. Sirhan, G.; Gray, C.; Johnstone, A. H.; Reid, N. Univ. Chem. Educ., 1999, 3, 43–46.

ChemEd Journal Publications from UK since 2015

I’ve compiled this list for another purpose and thought it might be useful to share here. 

The following are publications I can find* from UK corresponding authors on chemistry education research, practice, and laboratory work relevant to HE since the beginning of 2015. There are lots of interesting finds and useful articles. Most are laboratory experiments and activities; some refer to teaching practice or underlying principles.

I don’t imagine this is a fully comprehensive list, so do let me know what’s missing. It’s in approximate chronological order from beginning of 2015.

  1. Surrey (Lygo-Baker): Teaching polymer chemistry
  2. Reading (Strohfeldt): PBL medicinal chemistry practical
  3. Astra Zeneca and Huddersfield (Hill and Sweeney): A flow chart for reaction work up
  4. Bath (Chew): Lab experiment: coffee grounds to biodiesel
  5. Nottingham (Galloway): PeerWise for revision
  6. Hertfordshire (Fergus): Context examples of recreational drugs for spectroscopy and introductory organic chemistry 
  7. Overton (was Hull): Dynamic problem based learning
  8. Durham (Hurst, now at York): Lab Experiment: Rheology of PVA gels
  9. Reading (Cranwell): Lab experiment: Sonogashira reaction
  10. Edinburgh (Seery): Flipped chemistry trial
  11. Oaklands (Smith): Synthesis of fullerenes from graphite
  12. Manchester (O’Malley): Virtual labs for physical chemistry MOOC  
  13. Edinburgh (Seery): Review of flipped lectures in HE chemistry
  14. Manchester (Wong): Lab experiment: Paterno-Buchi and kinetics
  15. Southampton (Coles): Electronic lab notebooks in upper level undergraduate lab
  16. UCL (Tomaszewski): Information literacy, searching
  17. St Andrews & Glasgow (Smellie): Lab experiment: Solvent extraction of copper
  18. Imperial (Rzepa): Lab experiment: Asymmetric epoxidation in the lab and molecular modelling; electronic lab notebooks
  19. Reading (Cranwell): Lab experiment: Wolff Kishner reaction
  20. Imperial (Rzepa): Using crystal structure databases
  21. Leeds (Mistry): Inquiry based organic lab in first year – students design work up
  22. Manchester (Turner): Molecular modelling activity
  23. Imperial (Haslam & Brechtelsbauer): Lab experiment: vapour pressure with an isoteniscope
  24. Imperial (Parkes): Making a battery from household products
  25. Durham (Bruce and Robson): A corpus for writing chemistry
  26. Who will it be…?!

*For those interested, the Web of Science search details are reproduced below. Results were filtered to remove non-UK papers, conference proceedings and editorials.

ADDRESS:((united kingdom OR UK OR Scotland OR Wales OR England OR (Northern Ireland))) AND TOPIC: (chemistry) AND YEAR PUBLISHED: (2016 OR 2015)

Refined by: WEB OF SCIENCE CATEGORIES: ( EDUCATION EDUCATIONAL RESEARCH OR EDUCATION SCIENTIFIC DISCIPLINES )
Timespan: All years. Indexes: SCI-EXPANDED, SSCI, A&HCI, CPCI-S, CPCI-SSH, BKCI-S, BKCI-SSH, ESCI, CCR-EXPANDED, IC.


Practical work: theory or practice?

Literature on laboratory education over the last four decades (and more, I’m sure) has a lot to say about the role of practical work in undergraduate curricula. Indeed, Baird Lloyd (1992) surveys opinions on the role of practical work in North American General Chemistry syllabi over the course of the 20th century, and opens with this delicious quote, apparently offered by a student in 1928 in a $10 competition:

Chemistry laboratory is so intimately connected with the science of chemistry, that, without experimentation, the true spirit of the science cannot possibly be acquired. 

I love this quote because it captures so nicely the sense that laboratory work is at the heart of chemistry teaching – its implicit role in the teaching of chemistry is unquestionable. And although that role has been questioned repeatedly over the following decades, not many today would advocate a chemistry syllabus that did not contain laboratory work.

I feel another aspect of our consideration of chemistry labs is often unchallenged, and needs to be. That is the notion that chemistry laboratories are in some way a proving ground for what students come across in lectures: that they provide an opportunity for students to visualise and see for themselves what the teacher or lecturer was talking about; or, more laudably, to “discover” a particular relationship for themselves by following a controlled experiment. Didn’t believe it in class that an acid and an alcohol make an ester? Well, now you are in labs, you can prove it. Can’t imagine that vapour pressure increases with temperature? Then come on in – we have just the practical for you. Faraday said that he was never able to make a fact his own without seeing it. But then again, he was a great demonstrator.

A problem with this on an operational level, especially at university, and especially in the physical chemistry laboratory, is that it is near impossible to schedule practicals so that they follow on from the introduction of the theory in class. This leads to the annual complaint from students that they can’t do the practical because they haven’t done the theory. Your students are saying this; if you haven’t heard them, you need to tune your surveys.

It’s an entirely understandable sentiment from students, because we situate practicals as a subsidiary of lectures. But this is a false relationship, for a variety of reasons. The first is that, if you accept a model whereby you teach students chemistry content in lectures, why is there a need to supplement this teaching with a re-teaching of a sub-set of topics, arbitrarily chosen on the whim of a lab course organiser and the size of a department’s budget? Secondly, although we aim to re-teach, or hit home some major principle again in lab work, we don’t really assess it. We might grade students’ lab reports and give feedback, but it is not relevant to them, as they won’t need to know it again in that context: the lab report is done. And finally, the model completely undermines the true role of practical work and the value it can offer the curriculum.

A different model

When we design lecture courses, we don’t really give much thought to the labs that will go with them. Lecture course content has evolved rapidly to keep up to date with new chemistry; lab development is much slower. So why not go the other way? Why not design lab courses independent of lectures? Lecture courses are one area of the curriculum to learn from – typically the content of the curriculum; laboratory courses are another. And what might their role be?

Woolnough and Allsop (1985) make a clear and convincing argument for cutting the “Gordian knot” between theory and practice, and instead advocate a syllabus that has three aims:

  1. developing practical skills and techniques.
  2. being a problem-solving chemist.
  3. getting a “feel for phenomena”.

The detail of how this can be done is the subject of their book, but it involves a syllabus built on “exercises, investigations, and experiences”. To me these amount to the “process” of chemistry. On a general level, I think this approach is worth considering, as it has several impacts on teaching and learning in practice.

Impacts on teaching and learning

Cutting the link between theory and practice means that there is no longer a need to examine students’ understanding of chemistry concepts by proxy. Long introductions, much hated by students, which aim to get the student to understand the theory behind the topic at hand by rephrasing what is given in the lab manual, become obsolete. A properly designed syllabus removes the need for students to have had lectures on a particular topic before a lab course. Pre-lab questions can move away from being about random bits of theory and focus on the relationships in the experiment. There is no need for pointless post-lab questions that try to squeeze in a bit more theory.

Instead, students will need to approach the lab with some kind of model of what is happening. This does not need to be the actual equations they learn in lectures; with some thought, they may be able to draw on prior knowledge to inform that model. Of course, the practical will likely involve some aspect of what they cover, or will cover, in lectures, but at the stage of doing the practical it is the fundamental relationship they are considering and exploring. Approaching the lab with a model of a relationship (clearly I am in phys chem labs here!) and exploring that relationship better reflects the nature of science, and focusses students’ attention on the study in question. Group discussions and sharing data become more meaningful. Perhaps labs could even inform future lectures rather than rely on past ones! A final advantage is the reassertion of practical skills and techniques as a valuable aspect of laboratory work.

A key point here is that the laboratory content is appropriate for the level of the curriculum, just as it is when we design lectures. This approach is not advocating random discovery – quite the opposite. But free of the bond with associated lectures, there is scope to develop a much more coherent, independent, and more genuinely complementary laboratory course.

References

Baird W. Lloyd, The 20th Century General Chemistry Laboratory: its various faces, J. Chem. Ed., 1992, 69(11), 866-869.

Brian Woolnough and Terry Allsop (1985) Practical Work in Science, Cambridge University Press.
