Promoting teaching focussed academics

I read with interest this series of blog posts on promotion in academia, discussing external promotions (having to move beyond your institution to get promoted), internal promotions, and using the former to achieve the latter.


There is an additional layer of considerations for promotion of teaching focussed academics working in disciplinary departments (as opposed to education departments). The first is whether teaching focussed academics should be promoted on the basis of their work in teaching. The argument against is that if research is the traditional metric, then one who does not do research should not be promoted, and certainly not to professorial rank. This is still a prevalent view, although one that is changing somewhat. In the UK, there are not many Professors of Chemistry Education, for example; I can think of a handful. In the last year there have been quite a few examples of people being promoted to Associate Professor (Reader) level. But it is still early days.

One of the difficulties is that even if an institution wishes to promote someone to senior level, what criteria do they use? Typical criteria of funding and, to a certain extent, publication record are more difficult to apply. A general finger-in-the-wind idea is whether the candidate has an international reputation, which seems sensible enough, but then I might say that. In a world of professional social media, a reputation and an online reputation are becoming confused. So instead, institutions might look for candidates to be Senior or Principal Fellow of the HEA, as an externally judged metric of reputation. This requires quite a commitment on the part of the applicant, and reflects back on what the promotable action is – teaching quality itself, or the impact of that on others within and beyond the institution. For professorship, my own institution seeks things like external awards, student-nominated awards, contribution to university policy, development of a MOOC (what?!), authorship of an influential textbook, publications, invites to major conferences, or PFHEA. Perhaps I am blinkered, but one feels that if this level of criteria were applied across the board for all promotions, we would have very few professors generally.

One of the ironies of teaching focussed promotions is that it somewhat focusses on shiny new things that appear above the baseline of just doing the regular teaching grind. Things have to be excellent and innovative and while in a way that is good to encourage creativity, I do fear a little for students who are exposed to some crazier ideas so that the (ir)responsible academic can write about their snap-bang-whizz in appropriate promotion documentation.

A second concern is that publish-and-be-damned is even more enticing for education focussed academics looking for evidence. Promotion panels are unlikely to know the difference between reputable education publications with some merit in the field and The Secret Diary of Chemistry Education, promising all the latest news in the field. An education publication is an education publication, and to a passing eye, one might appear much like another. One would hope that at least in this instance, an external reviewer might comment on that.

Finally, there is the issue of external reviewers. If someone is going for promotion on the basis of their teaching, and internally the university finds it difficult to judge quality, how can an external person do so? It will come back to perception, and perceptions are going to be influenced by bias. Over the last few years, I have been asked to write review letters for a whole variety of teaching focussed academic promotions, from junior to professorial levels. One thing is very common: the difficulty in moving beyond perception and basing a reference on something tangible, because tangible evidence is often lacking, often for the reasons above. So while the horizon isn’t clear and the pathway isn’t quite mapped out yet, recording and documenting evidence will be useful to support applications when the clouds clear. I’m open to suggestions as to what that might entail…

Reflecting on #MICER18

This week I ran the third Methods in Chemistry Education Research meeting (MICER18). It was a really interesting and useful day – we had a good range of speakers and lots of discussion; certainly the scope of the meeting this year was the most ambitious so far. As the meeting is beginning to settle into a space on the calendar, I’ve been thinking a lot about how to take it forward.

Micer Timeline

For me MICER operates on three levels. At one level this meeting has a very simple purpose – to share approaches that can be used when doing educational research, and especially applied to chemistry education research. This is achieved by asking speakers to give talks on how to undertake particular approaches, accompanied by activities and discussion – effectively a series of workshops. Over the series (2016, 2017, 2018) we have covered things like doing interviews, thematic analysis, using Likert scales, designing questionnaires, statistical approaches, thinking about theoretical frameworks and ethics, and the holy grail of writing a decent research question. The purpose is to give insight into the language and processes around educational research for the audience of people with a scientific background, who are moving towards the light. The emphasis on sharing methods of how things were done, rather than what happens as a result of doing things, means that this meeting can eke out a little bit of light beyond the shadow of the annual Variety in Chemistry Education meeting.

A second level is about identity. Most people working as a discipline-based researcher in the U.K. and Ireland will likely be doing this as a kind of part-time hobby, in the few gaps available in a full-time teaching position. Lack of funding means that the discipline is amateur; people are doing things with not much time and less money. If we are to professionalise, people need to feel confident in saying that they are a “chemistry education researcher”, loud and proud. This is very difficult to do if you don’t feel professionally grounded in that discipline. To gain that confidence, there needs to be a community in which they can situate themselves, and a sense of personal expertise to allow them to make the claim. By sharing methods and approaches, and demonstrating that there is a community, the meeting aims to help raise this confidence. This year we included a “reports from practice” section; hearing from people who are just like us doing CER in their own situation. It was wonderful, and a real highlight of the day, a kind of showcasing of what real live chemistry education researchers look like. I was also struck this year by the number of people in the room who spoke about projects they are working on, or where they were situating themselves on the spectrum of evaluation, reflection, and research. This highlights to me that the landscape is shifting slowly. But there needs to be considerable support; financial yes, but also in terms of identity. This is something the RSC needs to grapple with firmly.

Finally, the entire education landscape is shifting. The growing emphasis on teaching at third level means that those situated in teaching and scholarship roles are thrust into a political ping pong. At school level there is an expectation that teachers will engage with education research with little support or guidance, save for some grassroots heroes. We aimed to address this head on this year with a keynote talk about this very landscape; one which I think well characterised it and also offered clues as to how we might navigate it. We also included a talk on managing student projects; standing firm in a world where in some institutions, the presence of such projects is contentious.

MICER19

In setting out on the MICER journey, I was only ever really concerned with (and indeed thought about) the first level. I knew from conversations that people wanted to know about the how of doing education research. The additional levels have grown, partly thrust upon us as a nascent community, partly necessary for us to be a community. But there is a danger that in aiming to do everything, the meeting tries to do too much, and as a result, does not achieve the sum of its parts. It is only a one-day meeting after all, and I am left wondering whether we should refocus our thoughts on the first part, and the others will work themselves out elsewhere (come on RSC!). 

I’m planning to send out a feedback survey and hope to use that to guide the focus of future meetings. There are other less lofty considerations; the meeting was full by February and likely needs a bigger venue. The range of attendees is broadening. While the registration cost is cheap, getting to London is expensive. Getting funding to support the meeting is getting more difficult; there is a limit to the hit that the supporting interest group budgets can manage.

But the interest in, and outcomes of, the meeting mean that I think the effort will be worthwhile.

 

A memorable teaching scenario for #Chemedcarnival

Katherine Haxton has challenged us to write a blog post on a memorable teaching situation.

When I was 19 I attended a Scout training weekend as part of a course to become a canoe instructor. I had been canoeing for several years under the instruction of canoe instructor Keith, who was also a former scout leader. Keith is tall and patient and a scientist and has a clipped English accent with excellent projection. “Lean downstream, Mick” he’d boom up and down rivers, while I’d lean upstream, and capsize.

All this made Keith very exotic. Coming from a small country village where everyone mumbled and there were no scientists, much less well-spoken English ones, he was very different. We adored him. He had rare qualities of being The Adult but never condescending, paternal but never patronising. We mimicked him constantly (out of earshot) but woe betide anyone who even hinted a bad word about him.

So the country fellows went to the training course to learn how to be leaders. The teachers on the course seemed tough and scary and, well, from Dublin, which is to say they were under the influence most of the time. When we arrived we found out that Keith was going to give one of the sessions on this weekend of the course. This is more than half a life ago, but I recall the excitement that news brought. We were used to him in the context of our own canoeing, going down (and upside-down) the rivers of Wicklow, but now we would see him somewhere different. We knew him and these chaps from Dublin didn’t. He was ours and we were loaning him to the mob and he would be amazing.

He was amazing. The session was about safety, and the kinds of decisions that need to be made quickly when on rivers. He was clear and authoritative and we sat and listened in total silence. Bursting with pride. Everything he said made sense.

But then; one of the Dublin fellows shouted out in the silence: “But Keith – that’s WRONG!” You can guess our horror. Keith listened, and responded, and moved on. And then; another interruption!

A game was afoot. Even for simple country chaps, it was clear that the whole lesson had been structured, with planned interruptions prepared well in advance. Tension eased, we all played along, throwing out ideas and suggestions and discussing various scenarios and decisions.

It is a “teaching moment” that has always stuck with me. There aren’t many lessons from half a life ago that I remember so well. A few months ago I was in the newly refurbished National Gallery of Ireland, and wandered into a wing featuring some new artists. I came across this portrait of our hero. The link explains more.

ROBINSON, N-770

Lessons from running webinars

We are now coming up to half way for the webinar series I launched this year. Webinars run monthly, thereabouts, and are on the theme of chemistry education research. I’ve never hosted webinars before so it has been interesting, and when the technology decides not to work, heart-stopping. Useful responses to a post (plea) requesting ideas/guidance are listed here. I think I have incorporated most of the suggestions.

CERGinar 2017 - 2018 Series

Some thoughts on format

What’s been a real pleasure has been the opportunity to hear speakers I love give a talk. This year, because I was testing the water, I chose speakers who I have heard and who I know will do a good job, and somewhat selfishly that I want to hear again. This led to a list of 42 names scrawled on my office noticeboard, and picking just a few of these was really tough.

Alison Flynn set us off with a stellar talk that ran the spectrum from methods of doing the research right through to implementation in teaching. This was really popular and meant that it addressed the difficulty of the breadth of audience types. Keith Taber made us think more about methodologies… are experimental approaches appropriate, and what are their limitations? Nathaniel Grove picked up on the format set by Alison, again looking at methods and then looking at implications, and this seems to be a formula that works. In both cases, this meant that a natural break in proceedings was a chance to have a mid-presentation set of questions. And that echoes something I have learned from MICER: people love to discuss. Opportunities for discussion compete with wanting to squeeze as much out of the speakers as possible, and the balance is fine-tuned. For an hour slot, though, 45/15 seems to work out. Nathaniel’s talk included the guest chair Niki Kaiser; this was really useful as it meant I could focus on technical matters while Niki asked questions, and it also means the whole thing is less “my” webinar series and more one of the community.

How to choose speakers?

As well as the criterion (this time around) of having seen all the speakers present, there was the difficulty of choosing just a few from my list of favourites. Donald Wink is the next speaker in the series. He gave a talk at Gordon CERP last year, which was stellar, probably the best talk I heard in a year of many conferences. It was one of those talks where you stop taking notes and just listen to try to absorb as much as possible. His clarity on discussing case studies is one that I think deserves a very wide audience. Then, we have Nicole Graulich, who won best poster at Gordon CERP, meaning she got to give a short talk at the end of the conference. I was left wanting to hear much more. Ginger is doing some amazing work around student writing, and Vicente… well we all want to hear Vicente. Both of these are again Gordon speakers. I thought that this range of speakers represented some well established figures, some newer to a wider audience, different aspects of chemistry, and a balance of gender. But I’m sure I can choose another set that will fulfil those criteria.

On and on?

Chemistry education research, as a young discipline in the UK, has two difficulties as I see it. One: there is no money. And two: as there is no money, people do a lot of this work in their spare time or squeezed into a very busy day job. That means that things like this tend to get squeezed, and it becomes difficult for people to attend. The purpose of these webinars was to act as a proxy for the academic seminars our colleagues will be used to in chemistry departments, except focussed on education.  I have to say I thought that attendance (because of point 2) would be very low, but it has been way above expectations, with lots of discussion in the chat area.

I’d be interested in hearing from people as to whether we should continue with a new series in the Autumn, and proposed ideas for format/speakers. In the meantime, do register for Prof Donald Wink’s seminar, 21st Feb. You won’t be disappointed.

 

The Likert Debate

David Read blew my cover on Sunday night with a tweet mentioning my averaging of Likert data in our recent work on badges. If there is ever a way to get educational researchers to look up from their sherry on a Sunday evening, this is it.

Averaging Likert scales is fraught with problems. The main issue is that a Likert response is ordinal, meaning that when you reply to a rating by selecting 1, 2, 3, 4, 5 – these are labels. Their appearance as numbers doesn’t make them numbers, and Likevangels note correctly that unlike the numbers 1, 2, 3… the spaces between the labels 1, 2, 3… do not have to be the same. In other words, if I ask you to rate how much you enjoyed the new season of Endeavour and gave you options 1, 2, 3, 4, 5, where 1 is not at all and 5 is very much so, you might choose 4, but that might be just because, while it was near-perfect TV, it wasn’t quite – there were a few things that bothered you (including that new Fancy chap) – so you are holding back from a 5. If you could, you might say 4.9…

But someone else might say well it was just better than average, but only just mind. That new fella wasn’t a great actor but, hell, it is Endeavour, so you put 4 but really if you could you would say 3.5.

So both respondents are choosing 4, but the range of thought represented by that 4 is quite broad.

I can’t dispute this argument, but my own feeling on the matter is that this is a problem with Likert scales rather than a problem with their subsequent analysis. Totting up all of the responses in each category, we would still get two responses in the ‘4’ column, and those two responses would still represent quite a broad range of sentiments. Also, while I understand the ordinal argument, I do feel that, on balance, when respondents are asked to select between 1 and 5, there is an implied scale incorporated. One could of course emphasise the scale by adding in more points, but how many would be needed before the ordinal issue dissipates? A scale of 1 to 10? 1 – 100? Of course you could be very clever by doing what Bretz does with the Meaningful Learning in the Lab questionnaire and ask students to use a sliding scale which returns a number (Qualtrics allows for this more unusual question type). Regardless, it is still a rating influenced by the perception of the respondent.
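The label-versus-number distinction can be made concrete with a small sketch in Python, using entirely invented responses (not data from any study mentioned here). Frequency counts and the median make no assumption about the spacing between labels; the mean does:

```python
from collections import Counter
from statistics import mean, median

# Hypothetical responses to a single 1-5 Likert item
responses = [4, 4, 5, 3, 4, 2, 5, 4, 3, 4]

# Ordinal-safe summaries: a tally per label, and the median
print(Counter(responses))   # counts of each label
print(median(responses))    # 4

# Interval-assuming summary: the mean treats the gaps between
# 1-2, 2-3, ... as equal, which ordinal labels need not be
print(mean(responses))      # 3.8
```

Both the respondent holding back from a 5 and the one rounding up from 3.5 land in the same ‘4’ tally either way; the choice of summary only determines how loudly the interval assumption is made.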

Our badges paper tried to avoid being led by data by first exploring how the responses shifted in a pre-post questionnaire, so as to get some “sense” of the changes qualitatively. We saw a large decrease in 1s and 2s, and a large increase in 4s and 5s. Perhaps it is enough to say that; we followed the lead of Towns, whose methodology we based our own on, in performing a pre-post comparison with a t-test. But like any statistic, the devil is in the detail; the statistic is just the summary. Conan Doyle wrote that “You can never foretell what any one man will do, but you can say with precision what an average number will be up to. Individuals vary, but percentages remain constant. So says the statistician.”
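As a rough sketch of what such a pre-post comparison can look like in code – with invented numbers, not the data from the badges paper, and using Welch’s two-sample t statistic rather than necessarily the exact test used in that analysis:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical pre/post responses to one 1-5 Likert item
pre  = [2, 2, 3, 1, 2, 3, 2, 4, 2, 3]
post = [4, 4, 5, 3, 4, 4, 5, 4, 3, 4]

t = welch_t(post, pre)
print(round(t, 2))  # ~4.71 for these invented numbers
```

The t value summarises the shift, but the distribution of responses – fewer 1s and 2s, more 4s and 5s – is what gives it any qualitative meaning; the statistic really is just the summary.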

There is a bigger problem with Likert scales. They are just so darned easy to ask. It’s easy to dream up lots of things we want to know and stick it in a Likert question. Did you enjoy Endeavour? Average response: 4.2. It’s easy. But what does it tell us? It depends on what the respondent’s perception of enjoy is. Recently I’ve been testing out word associations rather than Likert. I want to know how students feel at a particular moment in laboratory work. Rather than asking them how nervous they feel or how confident they feel, I ask them to choose a word from a selection of a range of possible feelings. It’s not ideal, but it’s a move away from a survey of a long list of Likert questions.

When is a conference real?

Respected Dr Dr Mrs Seery, we hope that you can come to our conference in somewhere you’ve never heard of and tell us about your interesting and exciting work in Pre-Lecture Resources for Reducing Cognitive Load at our Conference on Chemistry and Chemical Engineering in Sub-Oceanic Conditions. Please reply.

Most of us now receive daily invites to conferences around the world – oh the travel we could do! – and the usual fare is a greeting like that above; a dodgy mail merge of incorrect title, a paper title you have published, and a conference theme that bears no resemblance to the topic you work on. But the targeting is getting cleverer, and there are now quite a few Chemistry Education “conferences” doing the rounds.

These conferences are organised to make money. The model is that academics are invited to speak at conferences, and they, like all attendees, will pay to attend. The organisers know nothing about the topic, and the conference will not have any coherent theme, but the organisers will have delivered on their promise to host a conference, and gather all the money raised in the process as profit. Academics will provide free labour by presenting at the conference, perhaps peer-reviewing, being members of the “Advisory Board”… It all mirrors an actual conference very closely, but of course the problem is that the themes of these “conferences” are so broad that little meaningful discussion could take place. So how do you know what is real or not?

Three key places to look are: (1) who is on the advisory board; (2) whether there is a professional body underpinning the conference; and (3) what the conference themes are.

Organisers and Advisory Board

If you are going to a conference on chemistry education, and the advisory board is populated by a Professor of Forestry from Transylvania, then an alarm bell should ring. Are the names familiar? If you were to Google some of them, would you come up with some CER publications? Pictured are the Local Organising Committee and International Advisory Board of the very real 25th ICCE conference happening in Sydney in 2018. A cursory glance at this list for anyone involved in chemistry education would show that these are people with a genuine investment in the discipline.

Sadly, this check on authenticity is becoming more difficult, as academics are bound by a singular characteristic: we love doing things for free. So when you get an email that asks you if you want to be part of an organising committee for a conference whose title interests you, well why not? If you don’t look into it too much and you’ve always wanted to go to the Mongolian mountains for a hike, this fits the bill. Before you know it you are profiled on the conference website and credence is added to the meeting because of your affiliation.

Professional Body

A second thing to check is if there is a professional body underpinning the conference. The very real 25th ICCE conference happening in Sydney in 2018 is being organised under the auspices of IUPAC, as have all of the conferences in the ICCE series, and the national chemistry body, RACI. This lends an air of authority to the meeting – these are professional bodies who are interested in the promotion of chemistry education, rather than just out to make money.


Conference Themes

But what if there is a conference that is out to make a profit but means well and wants to host a good conference on a particular theme, where it has identified a gap? This isn’t illegal or morally wrong. We can use the conference themes to get a sense of how invested the organisers are in organising a conference about a topic that will bring a lot of like minded individuals together. I’ve pasted below an image from a tweet from the organisers of the “8th Edition of International Conference on Chemistry Education and Research” (note the ‘and’).

Conference themes from the organisers’ tweet

Exercising Judgement

It is in the interest of organisers of conferences such as these to spread the net widely; the more themes they cover, the more people will likely match. But of course, the broader the net, the more useless the meeting will be. It is worth exercising some judgement by considering the three points above. Even the conference title needs consideration: including an “and” is very popular as it allows a much broader range of topics while sounding like another very well established conference. Compare:

  • 25th IUPAC International Conference on Chemistry Education (ICCE2018)

with

  • 8th Edition of International Conference on Chemistry Education and Research

At a passing glance in a busy email reading session, both look similar.

Many readers of this will likely have received an invitation from the “Journal of Chemistry: Education, Research and Practice” and perhaps confuse it with the journal Chemistry Education Research and Practice. In this case, punctuation reveals very different intentions.

Take care, and if you do go to one of these “conferences”, I hope the scenery is nice!

Harmony in the Chemistry Lab

One of the difficulties students often raise is that the lab report they are required to produce is different for one section (not looking at anyone…) than it is for others. I think it is a fair comment. In Scotland and Ireland, students complete four year undergraduate Bachelor courses, and the first year in these courses is usually a highly organised, well-oiled machine which is consistent in format across the year (it would be similar in nature to the “Gen Chem” courses internationally). So when a student enters Year 2, I think it must be quite a shock to find out about different sections, and that different sections have their own cultures.

Chemistry 2 Laboratory Web

One thing we have done this year is to agree on a common template for lab reports. Yes, I know Physical like to do it this way and Organic like it that way (Inorganic chemists don’t seem too fussy). Our agreed template tries to accommodate these differences by mentioning particular emphases in each component of the report, although not without compromise. The intention is that as students move from one section to another through their year, feedback they get on a particular component of the report in one section in November is useful to them when they are doing a report for another section in March. Or rather, the clarity about the value of that feedback is better.

Once we had this in the bag, other things fell into place. The poster shows what is now in every Chemistry 2 laboratory manual and outside the laboratory. As well as the report assessment, we’ve harmonised how we treat pre-labs and what the expectations are in the lab. But we’ve also made clear (I hope!) how the current programme builds on Year 1 work, as well as outlining what is next. A key point is that each section (Inorganic, Physical, Organic) in the year is described in terms of the main focus (outcomes), showing students what the similarities and differences are. I think that this kind of information, which is often implicit, is useful to extend to students. More importantly, it keeps staff focussed on considering the practical course as one course rather than three courses.

As I begin to think about next year’s manuals, I’ll happily hear any comments or suggestions!

 

A model for the (chemistry) practical curriculum

Yesterday’s post discussed our recent work in thinking about how to build experimental design into the teaching laboratory. This post is related, but aims to think about the overall laboratory teaching curriculum.

I’ve been thinking about this and have Tina Overton’s mantra ringing in my head: what do we want the students to be at the end of it? So, what do we want students to be at the end of a practical curriculum? I think many of us will have varying answers, but there are a couple of broad themes, which we can assemble thanks to the likes of Tamir (1976), Kirschner and Meester (1988), and Carnduff and Reid (2003, or Reid and Shah, 2007, more accessible).

Tamir considers the aims of practical work should include – take a deep breath – skills (e.g., manipulative, inquiry, investigative, organizational, communicative), concepts (e.g., data, hypothesis, theoretical model, taxonomic category), cognitive abilities (e.g., critical thinking, problem solving, application, analysis, synthesis, evaluation, decision making, creativity), understanding the nature of science (e.g., the scientific enterprise, the scientists and how they work, the existence of a multiplicity of scientific methods, the interrelationships between science and technology and among various disciplines of science) and attitudes (e.g., curiosity, interest, risk taking, objectivity, precision, perseverance, satisfaction, responsibility, consensus and collaboration, confidence in scientific knowledge, self-reliance, liking science).1

Kirschner and Meester list the aims as being: to formulate hypotheses, to solve problems, to use knowledge and skills in unfamiliar situations, to design simple experiments to test hypotheses, to use laboratory skills in performing (simple) experiments, to interpret experimental data, to describe clearly the experiment and to remember the central idea of an experiment over a significantly long period of time.2

And Reid presents the desired outcomes in terms of four skill types: skills relating to learning chemistry, practical skills, scientific skills, and general (meaning transferable) skills.3

So we can see some commonalities, but each has a slightly different perspective. In trying to grapple with the aims of practical work, and think about how they are introduced across a curriculum, I came up with the diagram below a few years ago, recently modified for the Scottish system (we have 5 years instead of 4). This model especially focuses on the concept of “nature of science”, which I consider is the overarching desire for practical work, encompassing the concept of “syntactical knowledge” described in yesterday’s post.

5 year curriculum overview

The intention is that each year of the curriculum adds on a new layer. Each year incorporates the year below, but includes a new dimension. So students in Year 3 will become exposed to Experimental Design (Familiar), but they’ll still be developing skills and exploring models/hypotheses.

I’ve shown this model to students at various stages, and they seem to like it. The sense of progression is obvious, and it is clear what the additional demand will be. In fact their reaction this year was so positive that it struck me that we should really share our curriculum design model (whatever it may be) with students, so there is clarity about expectation and demand. So I will include this model in lab manuals in future years. That way, it’s not just that each year is “harder” (or as is often the case, not harder at all, just longer experiments) but the exact focus is identified. They can see their ultimate target of the final year project, although I think that perhaps we should, with Tina in mind again, have something on the top platform, stating the desired attributes on graduation.

I’d be interested in opinions on this model. One challenge it raises is how to make labs in the earlier years more interesting, and I think the intentional incorporation of interesting chemistry, decision making, and documenting skill development will help in that regard. Thoughts?!

References

  1. Tamir, P. The role of the laboratory in science teaching; University of Iowa: 1976.
  2. Kirschner, P. A.; Meester, M. A. M., The laboratory in higher science education: Problems, premises and objectives. Higher Education 1988, 17 (1), 81-98.
  3. (a) Carnduff, J.; Reid, N., Enhancing undergraduate chemistry laboratories: pre-laboratory and post-laboratory exercises. Royal Society of Chemistry: 2003; (b) Reid, N.; Shah, I., The role of laboratory work in university chemistry. Chemistry Education Research and Practice 2007, 8 (2), 172-185.

Rethinking laboratory education: unfinished recipes

A great dilemma lies at the heart of practical education. We wish to introduce students to the nature and practices of scientific enquiry, as it might be carried out by scientists. Learning by mimicking these processes, it is argued, will imbue our students with an understanding of scientific approaches, and thus they will learn the practices of science. Often such approaches can be given within a particular real-life context, which can be motivating. I know this argument well, and indeed have advocated this framework.1

However, problems emerge. Let’s consider two.

The first is that these approaches often conflate learning how to do a particular technique with applying that technique to a particular scenario. In other words, students are expected to learn how to do something while simultaneously knowing how to do it in an unfamiliar scenario. This should set off your cognitive load alarm bells. Now I know people may argue that students learned how to use the UV/vis spectrometer in the previous semester when doing the Beer-Lambert law, so they should be able to use it now for this kinetics experiment. But my experience is that students don’t transfer those skills well, and unless you’ve really focussed on teaching them the actual technique (as opposed to using the technique in a particular study), relying on previous experimental experience is unreliable.

Let’s park the cognitive load issue for a moment, and consider a deeper issue. In his wonderful essay, which should be compulsory reading for anyone setting foot in a teaching lab, Paul Kirschner discusses at length the epistemology of practical education (epistemology meaning the way knowledge is acquired).2 He writes that we need to distinguish between teaching science and doing science. Drawing on the work of Woolnough and Allsop,3 and Anderson,4 he describes the substantive structure of science – the body of knowledge making up science – and the syntactical structure of science – the habits and skills of those who practise science. Anderson’s short book is a wonderful read: he describes this distinction as “science” and “sciencing”. In teaching about the syntactical structure, or “sciencing”, Kirschner argues with some force that a mistake is made if we aim to use science practical work to reassert the substantive knowledge; we should instead be explicitly teaching the process of sciencing – how these habits and skills are developed.

So: the previous two paragraphs have tried to summarise two issues that arise when one considers laboratory education that incorporates inquiry approaches. Such approaches often impose unrealistic demands on students, requiring them to learn a technique and apply it to an unfamiliar scenario simultaneously; and their focus is on doing science as if it were a realistic scenario, rather than teaching how science is done.

An example in practice

How can such confusion manifest in practice? In our teaching labs, our Year 3 students used to complete several routine practicals, and then in their final few weeks complete an investigation. This approach has a lot going for it. Students get used to more advanced techniques in the first few expository experiments, and then, being familiar with Advanced Things, can launch into their investigation: an experiment they needed to scope out, design, and conduct. As their last formal laboratory exercise, this would be a good connection to their research project in Year 5.

In practice, it was a bloodbath. Students found it inordinately difficult to take on experimental design, and had little concept of the scope of the experiment, or of whether what they were doing was on the right path. I think it is instructive to relate these observed problems to the issues described above. We had not taught students how to use the techniques in the scenario in which they would be required, and we had spent a long time telling them to verify known scientific facts, but not much about the actual processes involved in making these verifications.

Change was needed.

A few years ago at the Variety in Chemistry Education meeting in Edinburgh, Martin Pitt gave a 5-minute talk about a practice he had adopted: he gave students a chance to do a practical a second time. He found that even though everything else was the same, students in the second iteration were much more familiar with the equipment, had much greater understanding of the concept, and got much better quality data. This talk appealed to me very much because (a) I was so impressed that Martin was brave enough to attempt this (one can imagine the coffee room chat) and (b) it linked in very nicely with my emerging thinking at the time about cognitive load.

So Martin is one piece of the jigsaw. A second is back to Kirschner’s essay. Must we Michael? Yes, we must. At the end, Kirschner presents some strategies for practice. The essay is a tour de force, but compared to its main body, these strategies seem a bit meek. However, there, just above the footnotes, he describes the divergent laboratory approach, a compromise between the typical recipes (highly structured) and the experimental approach (highly unstructured):

“The last approach can be regarded as a realistic compromise between the experimental and the academic laboratories and is called the divergent laboratory (Lerch, 1971). In the divergent lab, there should be parts of the experiment that are predetermined and standard for all students, but there should be many possible directions in which the experiment can develop after the initial stage. It provides the student with tasks similar to those encountered in an open ended or project (experimental) lab within a framework that is compatible with the various restrictions imposed as a result of the wider system of instructional organisation.”

Unfinished Recipes

Martin’s simple experiment had shown that when students are allowed the time and space to revisit an experiment, they demonstrate greater understanding of it and a better ability to gather experimental data. The divergent laboratory approach is one with a solid but pragmatic grounding in the education literature. So here is the plan:

Students complete a recipe laboratory as usual. They learn the approaches, the types of data that are obtained, the quirks of the experiment. We call this Part 1: it is highly structured, and has the purpose of teaching students how to gather that data as well as get some baseline information for…

…for a subsequent exploration. Instead of finishing this experiment and moving on to another recipe, students continue with this experiment. But instead of following a recipe now, they move on to some other aspect. We call this Part 2 (naming isn’t our strong point). This investigative component allows them to explore some additional aspect of the system they have been studying, or to apply what they have learned in the defined component to some new scenario. The key thing is that the students have learned how to do what they are doing, and the scope of that experiment, before moving to apply it to a new scenario. We repeat this three times throughout the students’ time with us so that they become used to experimental design in a structured way. A problem with the old investigation model was that students eventually got some sense of what was needed, but never had the feedback loop to try it out again.

We call this approach unfinished recipes. We give students the start – the overall structure and scope – but the end depends on where they take it, how much they do, and what variation they consider. There is still a lot of work to do (designing these experiments is hard). But lots of progress has been made. Students are designing experiments and approaches without direct recipes. They are learning sciencing. A colleague told me today that the turnaround has been remarkable – students are working in labs, are happy, and know what they are doing.

YES THIS IS THE PHYSICAL CHEMISTRY LABORATORY NOW…

Thanks

I’m very lucky to have the support of two fantastic demonstrators who were involved in the design of this approach, a lab technician who is patient with my last-minute whims, and colleagues involved in designing the unfinished recipes.

References

  1. McDonnell, C.; O’Connor, C.; Seery, M. K., Developing practical chemistry skills by means of student-driven problem based learning mini-projects. Chemistry Education Research and Practice 2007, 8 (2), 130-139.
  2. Kirschner, P. A., Epistemology, practical work and academic skills in science education. Science & Education 1992, 1 (3), 273-299.
  3. Woolnough, B. E.; Allsop, T., Practical work in science. Cambridge University Press: 1985.
  4. Anderson, R. O., The experience of science: A new perspective for laboratory teaching. Teachers College Press, Columbia University: New York, 1976.

 

Bibliography for researching women in chemistry c1900

Some references to 19 petitioners to Chemical Society and others

This list was compiled for the purpose of creating/editing Wikipedia articles about Women in Chemistry. You can read a bit about the rationale for this here and more about the Women in Red project here.

Bibliography notes

  • Wikipedia link given if known;
  • Some of these are described in RSC “Faces of Chemistry” – links given;
  • CWTL = “Chemistry was their life” – main biographic reference; see also the index to that book;
  • Creese 1991, tends to focus on scientific contribution in the context of the time and is also good for who they worked for/with and Appendix details publications (available at https://www.jstor.org/stable/4027231);
  • BHC 2003 – by the Rayner-Canhams, so information similar to CWTL, but often a little more detailed in the context of admission to professional/learned Societies (Available at: http://www.scs.illinois.edu/~mainzv/HIST/bulletin_open_access/v28-2/v28-2%20p110-119.pdf);
  • EiC2004 – article on Ida Freund with special focus on her educational initiatives and some personal anecdotes (Education in Chemistry, 2004, 136-137 – PDF available at this link);
  • EiC2006 – women of Bedford college (Education in Chemistry, 2006, 77-79 – PDF available at this link);
  • CiB1999 – Detailed overview of Gertrude Walsh and Edith Usherwood (Lady Ingold) (Chemistry in Britain, 1999, 45 – 46);
  • CiB1991 – Story of the 1904 petition letter, passing mention to three main players involved (Chemistry in Britain, 1991, 233-238);
  • Brock 2011 – Section on Women Chemists, with lots of detail on Edith Usherwood;
  • 1st WW – British Women Chemists and the First World War – details of Taylor and Whiteley

Bibliography (please let me know of any useful additions)

E(lizabeth) Eleanor Field CWTL pp152-153

BHC 2003, p115

 

Emily C Fortey CWTL pp203-204

Creese 1991, p291

BHC 2003, p115

 

Grace C Toynbee (Mrs Percy Frankland) https://en.wikipedia.org/wiki/Grace_Frankland

CWTL pp424-425

BHC 2003, p116

 

Ida Freund https://en.wikipedia.org/wiki/Ida_Freund

http://www.rsc.org/diversity/175-faces/all-faces/ida-freund

CWTL pp226-229

Creese 1991; p287

BHC 2003, p114

EiC2004

CiB1991

 

Mildred M Mills (Mildred Gostling) https://en.wikipedia.org/wiki/Mildred_May_Gostling

CWTL pp429-430

BHC 2003, p115

 

Hilda J Hartle CWTL 479-481

BHC 2003, p114

http://www.scs.illinois.edu/~mainzv/HIST/bulletin_open_access/v36-1/v36-1%20p35-42.pdf

Edith E Humphrey https://en.wikipedia.org/wiki/Edith_Humphrey

CWTL 148-150

BHC 2003, p116

EiC2006

http://www.scs.illinois.edu/~mainzv/HIST/awards/Citations/Vorlesung_Alfred_Werner_CGZ08.ppt (PPT file in German showing Humphrey’s thesis)

Dorothy Marshall https://en.wikipedia.org/wiki/Dorothy_Marshall (Stub)

CWTL pp229-230

BHC 2003, p115

 

Margaret Seward (Mrs McKillop) CWTL 105-107

BHC 2003, p115

 

Ida Smedley https://en.wikipedia.org/wiki/Ida_Maclean

http://www.rsc.org/diversity/175-faces/all-faces/dr-ida-smedley

CWTL pp58-61, also 179-180

Creese 1991, p282-284 (See also Wheldale and Homer)

BHC 2003, p114

EiC2004

CiB1991

 

Alice Emily Smith CWTL pp298-299

Creese 1991, p292

BHC 2003, p116

http://www.scs.illinois.edu/~mainzv/HIST/bulletin_open_access/v26-2/v26-2%20p134-142.pdf

Millicent Taylor https://en.wikipedia.org/wiki/Clara_Millicent_Taylor (“neutrality disputed”)

CWTL pp200-202

BHC 2003, p115

1st WW

http://www.scs.illinois.edu/~mainzv/HIST/bulletin_open_access/v36-2/v36-2%20p68-74.pdf

M. Beatrice Thomas CWTL pp230-232

Creese 1991; p287

BHC 2003, p114

EiC2004

 

Martha A Whiteley https://en.wikipedia.org/wiki/Martha_Annie_Whiteley

http://www.rsc.org/diversity/175-faces/all-faces/dr-martha-annie-whiteley

CWTL pp122-124

Creese 1991, p289, see also p293, p297

BHC 2003, p114

EiC2004

CiB1991

1st WW

http://www.scs.illinois.edu/~mainzv/HIST/bulletin_open_access/num20/num20%20p42-45.pdf (full bio, with individual picture and group picture)

Sibyl T Widdows CWTL pp160-161

BHC 2003, p115

 

Katherine I Williams CWTL pp200-203

Creese 1991, p291

BHC 2003, p115

 

 

References to other women chemists from this time (incomplete)

 

Muriel Wheldale https://en.wikipedia.org/wiki/Muriel_Wheldale_Onslow

Creese 1991, p284

Annie Homer Creese 1991, p284
Edith Gertrude Willcock Creese 1991, p285
Marjory Stephenson https://en.wikipedia.org/wiki/Marjory_Stephenson

Creese 1991, p285

Eleanor Balfour Sidgwick https://en.wikipedia.org/wiki/Eleanor_Mildred_Sidgwick

Creese 1991, p286

Emily Aston Creese 1991, p288
Frances Micklethwait https://en.wikipedia.org/wiki/Frances_Micklethwait

Creese 1991, p288, see also 293

Ida Homfray Creese 1991, p289
Effie Marsden Creese 1991, p289
Harriette Chick https://en.wikipedia.org/wiki/Harriette_Chick

Creese 1991, p290

Eva Hibbert Creese 1991, p292
Mary Stephen Leslie EiC2006
Violet Trew EiC2006
Helen Archbold (Mrs Porter) https://en.wikipedia.org/wiki/Helen_Porter

EiC2006

Rosalind Henley EiC2006
Edith Usherwood CiB1999

Brock 2011, pp218 – 230

Gertrude Walsh CiB1999

Seeking thoughts about running a webinar

Later this month I am hosting a webinar, hopefully first in a series. The speaker is Alison Flynn, who will be talking about organic mechanisms. Registration is available here: https://rsccerg.wordpress.com/2017/09/13/webinar-announcement-prof-alison-flynn-25-october/

How do you run a webinar? I have given webinars and remember that it was a bit like speaking into a void as you can’t get a sense from the room as to how much people are enjoying your talk… Instead you just keep talking, hoping that the internet is still working, and that someone on the other end of the line is listening. It is a bit of a bizarre experience first time out.

There are some strategies for avoiding this: occasional polling, or using the chat box. I have mixed feelings about leaving the chat box open during the talk. On the one hand, it is really nice to see comments come through – essentially text versions of the body language and perceptions you might gain from speaking to a live audience. But they can also be distracting, albeit in a positive way: someone might post an interesting comment – for example an aside to the presentation that you weren’t going to discuss – but now that it is raised you are wondering in that moment whether to address it. It can mean you lose your train of thought. So my inclination is to turn comments off, or else turn them on for defined periods, although ultimately I intend to leave the decision to the speakers.

Anyway, I am looking for thoughts on what experiences people have had either as a presenter or attendee that can help create a positive webinar. Comments below or on Twitter please :)

And do join us for #CERGinar on 25th October. Alison is a fantastic speaker.

10 thoughts on VICEPHEC

1. I enjoyed VICEPHEC this year. I like meeting friends and colleagues and hearing about what people are doing.

2. Everybody has a different view on what VICEPHEC is. The two parent organisations need to outline some overarching guidelines as to what VICEPHEC is (and isn’t).

3. These guidelines can then frame abstract calls and conference themes, with local hosts free to offer initiatives such as the (reportedly excellent) Labsolutely Fabulous.

4. I detected several instances of quite pointed commentary this year disregarding/dismissing any sense of evaluation of output or serious data. In my view this is anti-intellectual.

5. Sharing good ideas is a valuable part of the meeting; but we have an ethical responsibility to consider evaluation. Do you want to be the next “Learning Styles”?

6. Evaluation does not necessitate diving into the pedagogical glossary. But let’s not dismiss those who choose to do this. After all, Variety is in the name.

7. But should we change the name? I think the combined meeting should have a new name. It is only physicists and chemists for historical reasons.

8. Sponsorship is welcome and beneficial. But we need to keep clear boundaries between sponsors and the academic programme. See 2.

9. Disagreement and debate within a community is healthy. But let’s do it respectfully. We are all on the same side.

10. MICER is a very different and much more niche affair than VICEPHEC. If I thought for a minute that MICER meant that talks at VICEPHEC became evaluation-free, I’d shut up shop.

While I have you… MICER18 is on 14th May 2018 :)

The Laboratory as a Complex Learning Environment

One of the first challenges that emerge when considering teaching in laboratories is to define the kind of environment we are teaching in, and what that means for student learning. Laboratories differ significantly from lectures in terms of environment. Lectures tend to follow a well-established pattern – highly organised material is presented to learners in a fixed setting. While modern lectures incorporate some kind of activity, the focus is usually on whatever material is being presented, and learners rarely have to draw on any additional knowledge or skills outside what is under immediate consideration. Furthermore, learners have time (and often tutorials) after lectures to reconsider the information presented in lectures.

Laboratory learning is much more complex for a variety of reasons. One is physical – the very space students are in when completing laboratory work can vary significantly depending on the type of experiment they are completing. A second is that the number of stakeholders involved increases: teaching assistants, technical staff, and additional academic staff each have a role to play in the delivery of the laboratory course.

Here, we will consider a further aspect: the complexity experienced by students. We can consider the laboratory as a complex learning environment (van Merrienboer, 2003), an environment with the following characteristics:

(i) Complex learning aims at the integration of knowledge, skills, and attitudes.

Learning in the laboratory involves three domains. The cognitive domain relates to the intellectual knowledge associated with experimental work, such as the underlying concepts of an experiment, the procedures for using a piece of apparatus, or the ability to apply scientific reasoning to observed results. Students are required to draw on this knowledge as they work through their experiment. The psychomotor domain relates to the physical actions required in completing an experiment, such as motor skills and the coordination of tasks. Students are required to have basic proficiency in these tasks and, as they progress in capability, to be able to adapt their approach when completing tasks in response to particular conditions. Finally, the affective domain considers the students’ emotional relationship with their experimental work, such as their motivation to do well or the internalisation of the value of the task to their learning.

Because of the nature of laboratory learning, these three domains are active at the same time, and students have to draw on a range of aspects to work in this environment. Carrying out any experimental task involves drawing on knowledge about what that task is, including safety considerations, while actively completing the task, and doing so within the context of whatever their personal attitude towards the laboratory is. Managing learning within this complex environment requires a teasing out of the various factors involved, and an understanding of how best to address each one in turn, so that students are offered the chance to develop the capacity to integrate the tasks into the whole, and to carry out the work satisfactorily. Because of the time boundaries imposed on laboratory work, this is one of the greatest challenges we face in laboratory teaching.

(ii) Complex learning involves the coordination of qualitatively different constituent skills.

As well as bringing together learning from different domains, students will need, in the context of laboratory skills, to be able to complete multiple component tasks as part of one overall task. An analogy is learning to drive. The process of driving requires knowledge of the use of each of the pedals, the gear stick, the steering wheel, etc., which can each be individually practised when not driving. In the process of driving, the driver needs to coordinate the various individual tasks simultaneously. Parallels can be made with the chemistry laboratory, where students will need to complete several component tasks in the process of doing one overall task. This is difficult, and requires that the student is capable of each of the constituent tasks in advance of being required to complete the composite task.

(iii) Complex learning requires the transfer of what is learned to real settings

Preparing a laboratory programme which enables students to experience the challenges of drawing together the constituent components described in (i) and (ii) above lays the foundation for the third challenge for laboratory learning: the ability to transfer what is known to unfamiliar situations encountered in real settings. The context of what is “real” needs to be carefully managed within the curriculum – students embarking on an undergraduate research project will likely encounter real problems, but in the formal laboratory curriculum, care is needed to distinguish between simulated problems (where the teacher knows the preferred solution pathway) and actual problems, where the pathway is not clear. Given the complexities of learning discussed above, it would clearly be folly to require students to consider real settings before teaching the prerequisite capabilities of integrating knowledge, skills, and attitudes and coordinating tasks, described above. The laboratory curriculum therefore needs to be designed so that these capabilities are developed progressively, and students develop the capacity to translate their learning to real situations.

 

Reference

van Merrienboer, J. J. G.; Kirschner, P. A.; Kester, L., Taking the load off a learner’s mind: Instructional design for complex learning. Educational Psychologist 2003, 38 (1), 5-13.

 

How to do a literature review when studying chemistry education

It’s the time of the year to crank up the new projects. One challenge when aiming to do education research is finding some relevant literature. Often we become familiar with something of interest because we heard someone talk about it or we read about it somewhere. But this may mean that we don’t have many references or further reading that we can use to continue to explore the topic in more detail.

So I am going to show how I generally do literature searches. I hope that my approach will show you how you can source a range of interesting and useful papers relevant to the topic you are studying, as well as identify some of the key papers that have been written about this topic. What I tend to find is that there is never any shortage of literature, regardless of the topic you are interested in, and finding the key papers is a great way to get overviews of that topic.

Where to search?

For literature in education, there are three general areas to search. Each has advantages and disadvantages.

For those with access (in university), Web of Science will search databases which, despite the name, include the Social Sciences and Arts and Humanities indexes. Its advantage is that it is easy to narrow down searches to very specific terms, times, and research topics, meaning you can quickly source a list of relevant literature. Its disadvantage is that it doesn’t search a lot of material that may be relevant but that doesn’t pass the criteria for inclusion in the database (peer review, a particular review process, etc.). So, for example, Education in Chemistry articles do not appear here (EiC is a periodical, and its articles are not peer reviewed), and CERP articles only appeared about 10 years ago, thanks to the efforts of the immediate past editors. CERP is there now, but the point is that there are a lot of discipline-based journals (e.g. Australian Journal of Education in Chemistry) that publish good material that isn’t in this database.

The second place to look is ERIC (Education Resources Information Center) – a US database that is very comprehensive. It includes a much wider range of materials such as conference abstracts, although you can limit to peer review. I find ERIC very good, although it can link to more obscure material that can be hard to access.

Finally, there is Google Scholar. This is great: everyone knows how to use it, PDFs of documents are linked where available, and it is very fast. The downside is that it is really hard to narrow your search, and you get an awful lot of irrelevant hits. But it can be useful if you have very specific search terms. Google Scholar also shows you who cited the work, which is useful, and more extensive than Web of Science’s equivalent feature, as Google, like ERIC, looks at everything, not just what is listed in the database. Google is also good at getting into books, which you may be able to view.

A practice search

I am going to do a literature search for something I am currently interested in: how chemistry students approach studying. I’m interested in devising ways to improve how we assist students with study tasks, and so I want to look to the literature to find out how other people have done this. For the purpose of this exercise, we will see that “study” is a really hard thing to look for because of the many meanings of the word. I intend it to mean how students interact with their academic work, but of course “study” is very widely used in scientific discourse and beyond.

It’s important to write down as precisely as you can what it is you are interested in, because the first challenge when you open up the search database is to choose your search terms.

Let’s start with Web of Science. So I’ve said I’m interested in studying chemistry. So what if I put in

study AND chem*

where chem* is a wildcard covering the various derivatives that can be used – e.g. chemical, chemistry.


Well, we can see that’s not much use: we get over 1 million hits! By the time I go through those, my students will have graduated. The problem, of course, is that ‘study’ has a general meaning of investigation as well as the specific one that we mean here.

Let’s go back. What am I interested in? I am interested in how students approach study. So how might authors phrase this? Well, they might talk about “study approaches”, or “study methods”, or “study strategies”, or “study habits”, or “study skills”, or… well, there are probably a few more, but that will be enough to get on with.

(“study approach*” or “study method*” or “study strateg*” or “study habit*” or “study skill*”) AND Chem*

So I will enter the search term as shown. Note that I use quotation marks; this filters results to those which mention the quoted words in sequence. Any result matching one of these phrases, AND mentioning chem*, will be returned. Of course this rules out “approaches to study”, but we have to start somewhere.
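As an aside, the logic of these quoted phrases and wildcards can be sketched in a few lines of Python. This is a toy illustration only (it is not how Web of Science matches internally), but it shows the mechanics: a trailing * matches any word ending, quoted words must appear in sequence, and it also shows why a term like “study approach*” will still pick up false hits such as “case study approach”.

```python
import re

def matches(title, phrases, and_term):
    """Toy matcher: True if the title contains any quoted phrase
    (trailing * matches any word ending) AND the wildcard term."""
    text = title.lower()

    def to_pattern(term):
        # "study strateg*" -> r"\bstudy strateg\w*" ; "chem*" -> r"\bchem\w*"
        if term.endswith("*"):
            return r"\b" + re.escape(term[:-1]) + r"\w*"
        return r"\b" + re.escape(term) + r"\b"

    phrase_hit = any(re.search(to_pattern(p), text) for p in phrases)
    return phrase_hit and re.search(to_pattern(and_term), text) is not None

phrases = ["study approach*", "study method*", "study strateg*",
           "study habit*", "study skill*"]

print(matches("Study strategies in first-year chemistry", phrases, "chem*"))    # True
print(matches("A case study approach to chemical kinetics", phrases, "chem*"))  # True: a false hit
print(matches("Study habits of biology undergraduates", phrases, "chem*"))      # False
```

The titles here are made up for illustration; the middle one shows the “case study approach” problem that turns up later when thinning the results.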


How does this look? Over 500. OK, better than a million+, but we can see that some of the hits are not relevant at all.


In Web of Science, we can filter by category – a very useful feature. So I will refine my results to only those in the education category.


This returns about 80 hits. Much better. Before we continue, I am going to exclude conference proceedings. The reason for this is that very often you can’t access the text of these and they clutter up the results. So I will exclude these in the same way as I refined for education papers above, except in this case selecting ‘exclude’. We’re now down to 60 hits, which is a nice enough number for an afternoon’s work.

Thinning

Let’s move on to the next phase – an initial survey of your findings. For this phase, you need to keep meticulous records, or you will end up repeating your efforts at some future date. It’s also worth remembering what we are looking for: approaches people have used to develop chemistry students’ study skills. In my case, I am interested in chemistry and in higher education. It is VERY easy to get distracted here and move on to the next phase without completing this one; trust me, that will just mean you have to do the initial trawl all over again at some stage.

This trawl involves scanning titles and then abstracts to see if the articles are of interest. The first one in the list looks irrelevant, but clicking on it suggests that it is indeed about chemistry, so it’s worth logging for now. I use the marked list feature, but you might choose to use a spreadsheet or a notebook. Just make sure it is consistent! Scrolling through the hits, we can very quickly see those that aren’t relevant. You can see here that including “study approach” in our search terms is going to generate quite a few false hits, because it is picked up by articles mentioning the term “case study approach”.

I’ve shown some of the hits I marked as of interest below. I reduced the number down to 21. This involved scanning quickly through each abstract, seeing if it mentioned anything meaningful about study skills (the measurement or promotion of study skills), and if it did, it went in.


Snowballing

You’ll see in the marked list that some papers have been cited by other papers. It’s likely (though not absolutely so) that if someone else found this paper interesting, then you might too. Therefore clicking on the citing articles will bring up other more recent articles, and you can thin those out in the same way. Another way to generate more sources is to scan through the papers (especially the introductions) to see which papers influenced the authors of the papers you are interested in. You’ll often find there are a common few. Both these processes can “snowball” so that you generate quite a healthy number of papers to read. Reading will be covered another time… You can see now why about 50 initial hits is optimum. This step is a bit slow. But being methodical is the key!

A point to note: it may be useful to read fully one or two papers – especially those which appear to be cited a lot – before going into the thinning/snowballing phases as this can help give an overall clarity and context to the area, and might mean you are more informed about thinning/snowballing.

A practice search – Google Scholar

What about Google Scholar? For those outside universities who don’t have access to Web of Science, this is a good alternative. I enter my search terms using the Advanced search option, accessed by the drop-down arrow in the search box. You’ll see I am still using quotation marks to return exact matches, but again there are limitations: for example, strategy won’t return strategies, and Google isn’t as clever. So depending on the hit rate, you may wish to be more comprehensive.


With Google, over 300 hits are returned, but there isn’t a simple way to filter them. You can sort by relevance, according to how Google scores that, or by date, and you can filter by date. The first one in the list, by Prosser and Trigwell, is quite a famous one on university teachers’ conceptions of teaching, and not directly of interest here – although of course one could argue that we should define our own conception of teaching before we think about how we are going to promote particular study approaches to students. But I’m looking for more direct hits here. With so many hits, this is going to involve a pretty quick trawl through the responses. Opening hits in new tabs means I can keep the original list open. Another hit links to a book – one advantage of Google search – although getting a sense of what might be covered usually means going to the table of contents. A problem with books, though, is that only a portion may be accessible. The trawl again involves thinning and snowballing; the latter is, I think, much more important in Google, which, as mentioned, scopes a much broader citing set.

Searching with ERIC

Finally, let’s repeat with ERIC. Putting in the improved Web of Science term returns 363 hits (or 114 if I select peer-reviewed only).

Eric search

ERIC allows you to filter by journal, and you can see here that it is picking up journals that wouldn’t be shown in Web of Science, and would be lost or unlikely in Google. You can also filter by date, by author, and by level (although the latter should be treated with some caution). Proquest is a thesis database, so would link to postgraduate theses (subscription required, but you could contact the supervisor).

eric filters

The same process of thinning and snowballing can be applied. ERIC is a little frustrating as you have to click into each result to find out anything about it, whereas the others mentioned show you, for example, the number of citations. Also, for snowballing, ERIC does not link directly to citing articles, instead linking to the journal webpage, which means a few extra clicks. But for a free database, it is really good.

Which is the best?

It’s interesting to note that in the full search I did using these three platforms, each one threw up some results that the others didn’t. I like Web of Science, but then it’s what I am used to. ERIC is impressive in its scope – you can get information on a lot of education-related publications, although getting access to some of the more obscure ones might be difficult. Google is very easy and quick, but for a comprehensive search I think it is a bit of a blunt instrument. Happy searching!

What does active learning look like in college science classrooms?

A new review addressing this topic was recently published. I love reviews (someone else does all the hard work and you just have to read their summary!) and this one does a good job of categorising many of the approaches under the “active” umbrella. There are some limitations (for me) in their analysis, but the categorisation is useful nonetheless.

Most interestingly, the authors present a framework for considering active learning, which has two components. The first is perhaps obvious: considering active learning means that you must first have an overall approach (i.e. are you teacher- or student-centred, constructivist, etc.); then a strategy – a basis for why you will design particular activities in the classroom; and finally the activities themselves.

The authors then draw on social interdependence theory (no, I hadn’t heard of it either), which identifies whether there is positive, negative, or no benefit from cooperating with others in attaining a goal. This is interesting. They then place the various activities on a grid, depending on whether there is benefit from interdependence (positive or none) and what kind of peer interactions an active learning strategy might employ. The grid they come up with is shown, and they highlight:
– the difference between one and two-stage polling (clicker questions vs more formal peer-instruction)
– there are a lot of activities that depend on only ‘loose’ peer interactions; interactions which are short lived and do not involve the same peers.

active learning social interdependence

Types of active learning

The review is useful as it offers a smorgasbord of the kinds of activities people undertake. These are categorised into four main headings:
1. Individual non-polling activities such as the minute-paper, writing exercises, individually solving a problem, concept maps, building models…
2. In-class polling activities formalised in terms of peer-instruction (question, vote, discussion with peer, revote) and sequence of questions, and non-formalised such as one-off voting, poll followed by written answer…
3. Whole class discussions involving an activity, a facilitated discussion, and questions/answers…
4. In-class activities such as POGIL, lecture-tutorials, PBL activities, and jigsaws (different groups do different parts of a problem and then it is all brought together).

Defining active learning

The authors define active learning as: “active learning engages students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert. It emphasises higher order thinking and often involves group work.”

I think it is a useful piece of work, and certainly the social interdependence piece is interesting. But I do wonder how they searched for information. They give details on this, but they seem to be missing a whole tranche of work around the flipped lecture movement, for example, and it would have been interesting to read about how active learning scenarios were facilitated in terms of curriculum design and what kinds of things were done in class once that information was available (or indeed relied on that process). Also, they describe how they got papers but make a very worrying (in my view) statement about not looking at discipline-specific journals, such as CERP. Yet a CERP paper is referenced. In fairness, they do not claim to be comprehensive, but this kind of thing makes me wonder about the extent of the literature surveyed. But that is a minor gripe, and I think it is well worth a read.

Reference

Arthurs, L. A., & Kreager, B. Z. (2017). An integrative review of in-class activities that enable active learning in college science classroom settings. International Journal of Science Education, 1-19.

A new review on pre-labs in chemistry

Much of my work over the last year has focussed on pre-labs. In our research, we are busy exploring the role of pre-labs and their impact on learning in the laboratory. In practice, I am very busy making a seemingly endless number of pre-lab videos for my own teaching.

These research and practice worlds collided when I wanted to answer the question: what makes for a good pre-lab? It’s taken a year of reading and writing and re-reading and re-writing to come up with some sensible answer, which is now published as a review.

There are dozens of articles about pre-labs, and the first task was to categorise these – what others are doing and why they are doing it. We came up with a few themes, including the most common pair: to introduce the theory behind a lab and to introduce experimental techniques. So far so obvious. Time and again these reports – we gathered over 60, but there are likely more our search didn’t capture – highlighted that pre-labs had benefit, including unintended benefits (such as a clear increase in confidence about doing labs).

Why were pre-labs showing this benefit? This was addressed more rarely in reports. Some work, including a nice recent CERP paper, described the use of an underpinning framework (in that case, self-regulation theory) on which to base the design of pre-labs, meaning that the outcomes could be considered within that framework. But we were looking for something more… over-arching; a framework for the design considerations of pre-labs that took account of the unique environment of learning in the laboratory.

We have opted to use the complex learning framework as a basis for thinking about learning in laboratories, for various reasons. It is consistent with cognitive load theory, which is an obvious foundation for preparative work. It describes the learning scenario as one where several strands of activity are drawn together, and is ‘complex’ because this act of drawing together requires significant effort (and support). And it offers a clear basis for the nature of the information that should be provided in advance of the learning scenario. Overall, it seemed a sensible ‘fit’ for thinking about laboratory learning, and especially for preparing for this learning.

What makes for a good pre-lab?

We drew together the learning from the many reports in the pre-lab literature with the tenets of the complex learning framework to derive some guidelines for those thinking about developing pre-laboratory activities. These are shown in the figure. A particular advantage of the complex learning framework is the distinction between supportive and procedural information, which gets to the nitty-gritty of the kind of content that should be incorporated into a pre-lab activity. Casual readers should note that “procedural” as used here is a little more nuanced than just the “procedure” we think about in chemistry. We’ve elaborated a lot on this.

I hope that this review is useful – it has certainly been a learning experience writing it. The pre-print of the review is now available at http://dx.doi.org/10.1039/C7RP00140A and the final formatted version should follow shortly.

5 guidelines for developing prelabs

A talk on integrating technology into teaching, learning, and assessment

While in Australia, I was invited to present a talk to the Monash Education Academy on using technology in education. They recorded it and the video is below. The talk had a preamble about a theme of “personalisation” that I am increasingly interested in (thanks especially to some work done by the Physics Education Research Group here at Edinburgh), and then discussed:

  1. Preparing for lectures and flipping
  2. Discussion Boards
  3. Using video for assessment

A view from Down Under

Melbourne Seventh City of Empire, part of the “Brave New World: Australia 1930s” Exhibition at National Gallery of Victoria

I’ve spent the last two weeks in Australia thanks to a trip to the Royal Australian Chemical Institute 100th Annual Congress in Melbourne. I attended the Chemistry Education symposium.

So what is keeping chemistry educators busy in this part of the world? There are a lot of similarities, but some differences. While we wrestle with the ripples of TEF and the totalitarian threat of learning gains, around here the acronym of fear is TLO: threshold learning outcomes. As I understand it, these are legally binding statements that university courses must ensure students graduate with the stated outcomes. Institutions are required to demonstrate that these learning outcomes are part of their programmes and to identify the level to which they are assessed. This all sounds very good, except individuals on the ground are now focussed on identifying where these outcomes are being addressed. Given that they are quite granular, this appears to be a huge undertaking and is raising questions like: where, and to what extent, is teamwork assessed in a programme?

Melbourne from the Shrine

This process does appear to have promoted a big interest in broader learning outcomes, with lots of talks on how to incorporate transferable skills into the curriculum, and some very nice research into students’ awareness of their skills. Badges are of interest here and may be a useful way to document these learning outcomes in a way that doesn’t need a specific mark. Labs were often promoted as a way of addressing these learning outcomes, but I do wonder how much we can use labs for learning beyond their surely core purpose of teaching practical chemistry.

Speaking of labs, there was some nice work on preparing for laboratory work and on incorporating context into laboratory work. There was (to me) a contentious proposal that there be a certain number of laboratory activities (such as titrations) that are considered core to a chemist’s repertoire, and that graduation should not be allowed until competence in those core activities is demonstrated. Personally I think chemistry is a broader church than that, and it will be interesting to watch that one progress. A round-table discussion spent a good bit of time talking about labs in light of future pressures on funding and space; it does seem that we are still not quite clear about what the purpose of labs is. Distance education – in which Australia has a well-established head start – was also discussed, and I was really glad to hear someone with a lot of experience in this say that it is possible to generate a community with online learners, but that it takes a substantial personal effort. The lab discussion continued to the end, with a nice talk on incorporating computational thinking into chemistry education, with suggestions on how already-reported lab activities might be used to achieve this.

Gwen Lawrie delivers her Award Address

Of course it is the personal dimension that is the real benefit of these meetings, and it was great to meet some faces old and new. Gwen Lawrie wasn’t on the program, as the announcement of her award of the Education Division Medal was kept secret for as long as possible. I could listen to Gwen all day, and her talk, on the theme “Chasing Rainbows”, captured so eloquently what it means to be a teacher-researcher in chemistry education, in a landscape that continues to change. [Gwen’s publications are worth trawling.] Gwen’s collaborator Madeline Schultz (a Division Citation Winner) spoke both about TLOs and about respected practitioners’ reflections on their approaches to teaching chemistry – an interesting study using a lens of pedagogical content knowledge. From Curtin, I (re-)met Mauro Mocerino (whom I heard speak in Europe an age ago on clickers), who spoke here of his long-standing work on training demonstrators. Also from that parish, it was a pleasure to finally meet Dan Southam. I knew Dan only through others, as a man “who gets things done”, so it was lovely to meet him in his capacity as Chair of the Division and this symposium, and to see that his appellation rang true. And it was nice to meet Elizabeth Yuriev, who does lovely work exploring how students approach physical chemistry problems and on helping students with problem-solving strategies.

Dinner Date

There were lots of other good conversations and friendly meetings, demonstrating that chemistry educators are a nice bunch regardless of location. I wasn’t the only international interloper; Aishling Flaherty from the University of Limerick was there to spread her good work on demonstrator training – an impressive programme she has developed and is now trialling in a different university and a different country. And George Bodner spoke of much of his work studying how students learn organic chemistry, and in particular the case of “What to do about Parker”. The memory of Prof Bodner sitting at the back of my talk looking at my slides through a telescopic eyepiece is a happy one that will stay with me for a long time. Talk of organic chemistry reminds me of a presentation describing the app Chirality-2, which covers lots of aspects of revising organic chemistry and looked really great.

The Pioneer, National Gallery of Victoria

My slightly extended trip was because I had the good fortune to visit the research group of Prof Tina Overton, who moved to Melbourne a few years ago, joining native Chris Thompson in growing the chemistry education group at Monash. It was an amazing experience immersing myself in a vibrant and active research group, working on topics including student critical thinking, chemists’ career aspirations, awareness of transferable skills, and the process and effect of transforming an entire laboratory curriculum. I learned a lot, as I always do from Tina, and am extremely grateful for her very generous hosting. I leave Australia now wondering if I can plan a journey in 2018 for ICCE in Sydney.

From Hokusai exhibition, NGV. My interpretation of students managing in a complex learning environment