How to do a literature review when studying chemistry education

It’s the time of the year to crank up the new projects. One challenge when aiming to do education research is finding some relevant literature. Often we become familiar with something of interest because we heard someone talk about it or we read about it somewhere. But this may mean that we don’t have many references or further reading that we can use to continue to explore the topic in more detail.

So I am going to show how I generally do literature searches. I hope that my approach will show you how you can source a range of interesting and useful papers relevant to the topic you are studying, as well as identify some of the key papers that have been written about this topic. What I tend to find is that there is never any shortage of literature, regardless of the topic you are interested in, and finding the key papers is a great way to get overviews of that topic.

Where to search?

For literature in education, there are three general areas to search. Each has advantages and disadvantages.

For those with access (in university), Web of Science will search databases which, despite the name, include the Social Sciences and Arts and Humanities indexes. Its advantage is that it is easy to narrow searches down to very specific terms, times, and research topics, meaning you can quickly source a list of relevant literature. Its disadvantage is that it doesn’t search a lot of material that may be relevant but that doesn’t meet the criteria for inclusion in the database (peer review, a particular review process, etc.). So, for example, Education in Chemistry articles do not appear here (EiC is a periodical and its articles are not peer reviewed), and CERP articles only began to appear about 10 years ago, thanks to the efforts of the immediate past editors. CERP is there now, but the point is that there are a lot of discipline-based journals (e.g. the Australian Journal of Education in Chemistry) that publish good material that isn’t in this database.

The second place to look is ERIC (Education Resources Information Center), a US database that is very comprehensive. It includes a much wider range of materials, such as conference abstracts, although you can limit results to peer-reviewed items. I find ERIC very good, although it can link to more obscure material that can be hard to access.

Finally, there is Google Scholar. This is great as everyone knows how to use it, it links to PDFs of documents where they are available, and it is very fast. The downside is that it is really hard to narrow your search, and you get an awful lot of irrelevant hits. But it can be useful if you have very specific search terms. Google Scholar also shows you who cited a work, which is useful, and this feature is more extensive than Web of Science’s equivalent because Google, like ERIC, looks at everything, not just what is listed in the database. Google is also good at getting into books, which you may be able to view in part.

A practice search

I am going to do a literature search for something I am currently interested in: how chemistry students approach studying. I’m interested in devising ways to improve how we assist students with study tasks, and so I want to look to the literature to find out how other people have done this. For the purpose of this exercise, we will see that “study” is a really hard thing to look for because of the many meanings of the word. I intend it to mean how students interact with their academic work, but of course “study” is very widely used in scientific discourse and beyond.

It’s important to write down as precisely as you can what it is you are interested in, because the first challenge when you open up the search database is to choose your search terms.

Let’s start with Web of Science. I’ve said I’m interested in studying chemistry, so what if I put in

study AND chem*

where chem* is my default term for the various derivatives that can be used – e.g. chemical.


Well, we can see that’s not much use – we get over 1 million hits! By the time I go through those, my students will have graduated. The problem, of course, is that ‘study’ has a general meaning of investigation as well as the specific one we mean here.

Let’s go back. What am I interested in? I am interested in how students approach study. So how might authors phrase this? Well, they might talk about “study approaches”, or “study methods”, or “study strategies”, or “study habits”, or “study skills”, or… well, there are probably a few more, but that will be enough to get on with.

(“study approach*” or “study method*” or “study strateg*” or “study habit*” or “study skill*”) AND Chem*

So I will enter the search term as shown. Note that I use quotation marks; these filter results to those which mention the two words in sequence. Any record matching one of these phrases AND mentioning chem* will be returned in my results. Of course this rules out “approaches to study”, but we have to start somewhere.
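If, like me, you end up re-running variants of a query like this across several databases, it can save retyping to generate the string rather than type it each time. A minimal sketch in Python (the helper is my own convenience, not a feature of any of the databases):

import textwrap

# The phrase stems chosen above; add or remove variants as you refine.
stems = ["study approach*", "study method*", "study strateg*",
         "study habit*", "study skill*"]

def build_query(phrases, subject="chem*"):
    """Quote each phrase, join the variants with 'or', then AND the subject term."""
    phrase_part = " or ".join(f'"{p}"' for p in phrases)
    return f"({phrase_part}) AND {subject}"

print(textwrap.fill(build_query(stems)))
# Reproduces the search string above, ready to paste into a database.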


How does this search do? Over 500 hits. OK, better than a million-plus, but we can see that some of the hits are not relevant at all.


In Web of Science, we can filter by category – a very useful feature. So I will refine my results to only those in the education category.


This returns about 80 hits. Much better. Before we continue, I am going to exclude conference proceedings. The reason for this is that very often you can’t access the text of these, and they clutter up the results. So I will exclude them in the same way as I refined for education papers above, except in this case selecting ‘exclude’. We’re now down to 60 hits, which is a nice enough number for an afternoon’s work.

Thinning

Let’s move on to the next phase – an initial survey of your findings. For this phase, you need to operate some form of meticulous record keeping, or you will end up repeating your efforts at some future date. It’s also worth remembering what we are looking for: approaches people have used to develop chemistry students’ study skills. In my case I am interested in chemistry and in higher education. It is VERY easy to get distracted here and move on to the next phase without completing this one; trust me, that will just mean you have to do the initial trawl all over again at some stage.

This trawl involves scanning titles and then abstracts to see if the articles are of interest. The first one in the list looks irrelevant, but clicking on it suggests that it is indeed about chemistry, so it’s worth logging for now. I use the marked list feature, but you might choose to use a spreadsheet or a notebook. Just make sure it is consistent! Scrolling through the hits, we can very quickly see those that aren’t relevant. You can see that including “study approach” in our search terms is going to generate quite a few false hits, because it is picked up by articles mentioning the term “case study approach”.

I’ve shown some of the hits I marked as of interest below. I have reduced my number down to 21. This really involved scanning quickly through each abstract to see if it mentioned anything meaningful about study skills (their measurement or promotion); if it did, it went in.

[Screenshot: marked list of papers selected as of interest]

Snowballing

You’ll see in the marked list that some papers have been cited by other papers. It’s likely (though not guaranteed) that if someone else found a paper interesting, then you might too. Clicking on the citing articles will therefore bring up other, more recent articles, and you can thin those out in the same way. Another way to generate more sources is to scan through the papers (especially the introductions) to see which papers influenced the authors of the papers you are interested in; you’ll often find there are a common few. Both these processes can “snowball”, so that you generate quite a healthy number of papers to read. Reading will be covered another time… You can see now why about 50 initial hits is optimum. This step is a bit slow, but being methodical is the key! A rough sketch of the process is below.
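Mechanically, snowballing is just a breadth-first walk over the citation graph, stopping when your reading pile is big enough. A sketch in Python, where the two lookup functions are hypothetical stand-ins for the ‘cited by’ lists and reference lists you gather by hand from the databases:

from collections import deque

def snowball(seeds, citing_papers, cited_papers, max_papers=50):
    """Breadth-first 'snowball' over the citation graph.

    seeds: papers already judged relevant from the initial trawl.
    citing_papers / cited_papers: hypothetical functions mapping a paper
    to the papers that cite it (forward) and that it cites (backward).
    """
    seen = set(seeds)
    queue = deque(seeds)
    found = []
    while queue and len(found) < max_papers:
        paper = queue.popleft()
        found.append(paper)  # meticulous record keeping!
        for neighbour in list(citing_papers(paper)) + list(cited_papers(paper)):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return found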

A point to note: it may be useful to read one or two papers fully – especially those which appear to be cited a lot – before going into the thinning/snowballing phases, as this can give overall clarity and context to the area, and might mean you are better informed when thinning and snowballing.

A practice search – Google Scholar

What about Google Scholar? For those outside universities who don’t have access to Web of Science, this is a good alternative. I enter my search terms using the Advanced search, accessed by the drop-down arrow in the search box. You’ll see I am still using quotation marks to return exact matches, but again there are limitations to this – for example, “strategy” won’t return “strategies”, and Google isn’t as clever about wildcards. So depending on the hit rate, you may wish to be more comprehensive.


With Google, over 300 hits are returned, but there isn’t a simple way to filter them. You can sort by relevance (according to how Google scores that) or by date, and you can filter by date. The first one in the list, by Prosser and Trigwell, is quite a famous one on university teachers’ conceptions of teaching, and not directly of interest here – although of course one could argue that we should define our own conception of teaching before we think about how we are going to promote particular study approaches to students. But I’m looking for more direct hits here. With so many hits, this is going to involve a pretty quick trawl through the responses; opening hits in new tabs means I can keep the original list open. Another hit links to a book – one advantage of Google searching – although getting a sense of what might be covered usually means going to the table of contents, and only a portion of a book may be accessible. The trawl again involves thinning and snowballing; the latter is, I think, much more important in Google and, as mentioned, draws on a much broader citing set.

Searching with ERIC

Finally, let’s repeat the search with ERIC. Putting in the improved Web of Science term returns 363 hits (or 114 if I select peer-reviewed only).


ERIC allows you to filter by journal, and it picks up journals that wouldn’t be shown in Web of Science and would be buried or missed in Google. You can also filter by date, by author, and by level (although the latter should be treated with some caution). ProQuest is a thesis database, so ERIC would also link to postgraduate theses (subscription required, but you could contact the supervisor).


The same process of thinning and snowballing can be applied. ERIC is a little frustrating, as you have to click into each record to find out anything about it, whereas the others mentioned show you, for example, the number of citations up front. Also, for snowballing, ERIC does not show you a link to citing articles, instead linking to the journal webpage, which means a few clicks. But for a free database, it is really good.
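One more point in its favour: ERIC also publishes a web API, so a search like this can be scripted. The endpoint and parameter names below are my best recollection and may have changed, so treat this as a sketch and check the current ERIC API documentation before relying on it:

import requests

# Assumed endpoint and parameters -- verify against the current ERIC API docs.
query = '("study skills" OR "study habits") AND chemistry'
resp = requests.get(
    "https://api.ies.ed.gov/eric/",
    params={"search": query, "format": "json", "rows": 20},
)
resp.raise_for_status()
for doc in resp.json()["response"]["docs"]:
    print(doc.get("title"))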

Which is the best?

It’s interesting to note that in the full search I did using these three platforms, each one threw up some results that the others didn’t. I like Web of Science, but that’s what I am used to. ERIC is impressive in its scope – you can get information on a lot of education-related publications, although getting access to some of the more obscure ones might be difficult. Google is very easy and quick, but for a comprehensive search I think it is a bit of a blunt instrument. Happy searching!

Links to Back Issues of University Chemistry Education

I don’t know if I am missing something, but I have found it hard to locate past issues of University Chemistry Education, the predecessor to CERP. They are not linked on the RSC journal page. CERP arose out of a merger between U Chem Ed and CERAPIE, and it is the CERAPIE articles that are hosted in the CERP back issues. Confused? Yes. (More on all of this here.)

Anyway, in searching and hunting for old U Chem Ed articles, I have cracked the code of the links and compiled links to back issues below. They are full of goodness. (The very last article published in UCE was the very first chemistry education paper I read – David McGarvey’s “Experimenting with Undergraduate Practicals“.)

Links to Back Issues

Contents of all issues: http://www.rsc.org/images/date_index_tcm18-7050.pdf 

1997 – Volume 1:

1 – remains elusive… It contains Johnstone’s “And some fell on good ground” so I know it is out there… Edit: cracked it – they are available by article:

1998 – Volume 2:

1 – http://www.rsc.org/images/Vol_2_No1_tcm18-7034.pdf

2 – http://www.rsc.org/images/Vol_2_No2_tcm18-7035.pdf

1999 – Volume 3:

1 – http://www.rsc.org/images/Vol_3_No1_tcm18-7036.pdf

2 – http://www.rsc.org/images/Vol_3_No2_tcm18-7037.pdf

2000 – Volume 4:

1 – http://www.rsc.org/images/Vol_4_No1_tcm18-7038.pdf

2 – http://www.rsc.org/images/Vol_4_No2_tcm18-7039.pdf

2001 – Volume 5:

1 – http://www.rsc.org/images/Vol_5_No1_tcm18-7040.pdf

2 – http://www.rsc.org/images/Vol_5_No2_tcm18-7041.pdf

2002 – Volume 6:

1 – http://www.rsc.org/images/Vol_6_No1_tcm18-7042.pdf

2 – http://www.rsc.org/images/Vol_6_No2_tcm18-7043.pdf

2003 – Volume 7:

1 – http://www.rsc.org/images/Vol_7_No1_tcm18-7044.pdf

2 – http://www.rsc.org/images/Vol_7_No2_tcm18-7045.pdf

2004 – Volume 8:

1 – http://www.rsc.org/images/Vol_8_No1_tcm18-7046.pdf

2 – http://www.rsc.org/images/Vol_8_No2_tcm18-7047.pdf

I’ve downloaded all of these now in case of future URL changes. Yes, I was a librarian in another life.


This week I’m reading… Changing STEM education

Summer is a great time for Good Intentions and Forward Planning… With that in mind, I’ve been reading about the way we teach chemistry, how we know it’s not the best approach, and what might be done to change it.

Is changing the curriculum enough?

Bodner (1992) opens his discussion on reform in chemistry education by noting that the “recent concern” of the time, way back in 1992, was not unique. He states that there were repeated cycles of concern about science education over the 20th century, followed by long periods of complacency. Scientists and educators usually respond in three ways:

  1. restructure the curriculum,
  2. attract more young people to science,
  3. try to change science teaching at primary and secondary level.

However, Bodner proposes that the problem is not in attracting people to science at the early stages, but keeping them on when they reach university, and that we at third level have much to learn from our colleagues at primary and secondary level. Instead of changing the curriculum (the topics taught), his focus is on changing the way the curriculum is taught. In an era when textbooks (and, one presumes now, the internet) have all the information one wants, the information-dissemination component of a lecture is redundant. Bodner makes the case that students can perform quite well on a question involving equilibrium without understanding its relationship to other concepts taught in the same course, and instead advocates an active learning classroom centred around discussion and explanation – dialogue between lecturers and students. He even offers a PhD thesis to back up his argument (a paper, with a great title, derived from it is here: PDF).

Are we there yet?

One of the frustrations I’m sure many who have been around the block a few times feel is that the pace of change is so slow (read: glacial). Eighteen years after Bodner’s paper, Talanquer and Pollard (2010) criticize the chemistry curriculum at universities as “fact-based and encyclopedic, built upon a collection of isolated topics… detached from the practices, ways of thinking, and applications of both chemistry research and chemistry education research in the 21st century.” Their paper in CERP presents an argument for teaching “how we think instead of what we know”.

They describe their Chemistry XXI curriculum, which presents an introductory chemistry curriculum in eight units, each titled by a question. For example, Unit 1 is “How do we distinguish substances?”, consisting of four modules (1 to 2 weeks of work): “searching for differences, modelling matter, comparing masses, determining composition.” The chemical concepts mapping onto these include the particulate model of matter, mole and molar mass, and elemental composition.

Assessment of this approach is by a variety of means, including small-group in-class activities. An example is provided for a component on physical and electronic properties of metals and non-metals: students are asked to design an LED, justifying their choices. I think this fits nicely with the discursive ideas Bodner mentions. Summative assessment is based on answering questions in a context-based scenario (an example is shown in the paper).

In what is a very valuable addition to this discussion, learning progression levels are included, allowing student understanding of concepts and ideas to be monitored as it progressively develops. It’s a paper that’s worth serious consideration and deserves more widespread awareness.

Keep on Truckin’

Finally in our trio is Martin Goedhart’s chapter in the recently published book Chemistry Education. Echoing the basis provided by Talanquer and Pollard, he argues that the traditional disciplines of analytical, organic, inorganic, physical, and biochemistry were reflective of what chemists were doing in research and practice. However, the interdisciplinary nature of our subject demands new divisions, and Goedhart proposes three competency areas: synthesis, analysis, and modelling. For example, in analysis the overall aim is “acquiring information about the composition and structure of substances and mixtures”. The key competencies are “sampling, using instruments, data interpretation”, with knowledge areas including instruments, methods and techniques, sample preparation, etc. As an example of how the approach differs, he states that students should be able to select appropriate techniques for their analysis; our current emphasis is on the catalogue of facts about how each technique works. I think this echoes Talanquer’s point about shifting the emphasis from what we know to how we think.

The role of prior knowledge

One of the main challenges in teaching first year university students is that they have a great variety of backgrounds. A quick survey of any year one class will likely yield students who have come straight from school, students returning to education, and students who have taken a circuitous route through pre-university courses. Even the main block of students coming directly from school is a diverse group. Different school systems mean students can cover different content, and even that assumes they take the subject at all. Those challenged with teaching first years know this better than those who take modules only in later years, when the group of students becomes somewhat more homogeneous.

All of this makes it difficult for the first year lecturer to approach their task, facing students whose chemistry knowledge ranges, in our case, from none at one extreme to a significant part of the syllabus they will be taking at the other. As well as trying to navigate a syllabus that doesn’t isolate the novice learners or bore those who have “done it before”, there is a conceptual basis for worrying about this as well. Back in the 1950s, David Ausubel stated that the most important single factor influencing learning is what the learner already knows. Learners integrate new information with existing knowledge, so that an increasingly complex understanding of a topic can develop. Unfortunately for novice learners, substantial unfamiliarity emphasises the attraction of rote learning, meaning that new information learned for the purpose of an exam and not integrated with some prior knowledge will likely be quickly forgotten, never mind understood.

In my own work (CERP, 2009, 10, 227-232), I analysed the performance of first year students in their end-of-module exam, and demonstrated that there was a consistent, significant difference between the grades achieved by students who had taken chemistry at school and those who hadn’t. This difference disappeared from Year 2 onwards, suggesting that the group had levelled out somewhat by the end of the year. The response to this was to introduce pre-lecture activities so that students could at least familiarise themselves with some key terminology prior to lectures – a small attempt to generate some prior knowledge. Even this minor intervention led to the disappearance of any significant difference in exam scores (see BJET, 2012, 43(4), 667–677).

I was reminded of all of this work by an excellent recent paper in Chemistry Education Research and Practice from Kevin de Berg and Kerrie Boddey which advances the idea further. Teaching nursing students in Australia, the researchers categorised the students as having completed senior school chemistry (SC), having completed a 3-day bridging course (BC), or having not studied chemistry since the junior years of school (PC, for poor chemistry). Statistical analysis showed some unsurprising results: the students with school chemistry scored highest (mean: 67%), followed by the bridging course students (54%), and lastly the students with poor chemistry (47%). The difference between the BC and PC students was not significant, however, although there were more lower-performing students in the latter group.
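To make the statistical comparison concrete, this is the kind of two-group test involved. The marks below are invented purely for illustration, not data from the paper:

# Illustrative only: invented marks, NOT data from de Berg and Boddey.
from scipy import stats

bc_marks = [48, 52, 55, 60, 47, 58, 51, 62]  # bridging-course students
pc_marks = [39, 45, 50, 41, 55, 44, 48, 52]  # no recent chemistry

t, p = stats.ttest_ind(bc_marks, pc_marks)
print(f"t = {t:.2f}, p = {p:.3f}")
# If p > 0.05, the difference between the group means would not be
# deemed statistically significant at the conventional level.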

These results align well with prior work on the value of prior knowledge in chemistry and the limited but positive impact of bridging courses. For me, this paper is valuable for its qualitative analysis. Students were interviewed about their experiences of learning chemistry, and those who had completed the bridging course spoke about its value. For example, the authors quote Bella (please note, qualitative researchers, the authors’ clever and helpful use of names beginning with B for Bridging Course students):

I think if I had actually gone straight just to class that first day not knowing anything, I don’t think I would have done half as well as what I would having known it.

Additionally, such was the perceived value of the bridging course, those students who had no prior chemistry felt left out. Paula states how she thinks it would have helped:

Familiarity in advance. Just, so you’re prepared. So you get the sort of basic, the basic framework of it all. So then, I’d sort of, got a head start and not be so overwhelmed…

Another interesting feature highlighted by students was the unique language of chemistry. Students spoke of entering their first chemistry class as being “like stepping into another world” and the language being “absolute Greek”. Additionally, the conceptual domains proved challenging, with students overwhelmed by formulae and by trying to visualise the molecular world. Pam states:

Maybe with anatomy, you can see more, you know, the things we’re cuttin’ up into pieces and looking inside. With chemistry, you can’t see it, so you gotta imagine that in your head and it’s hard tryin’ to imagine it, without actually physically touching it.

In a comprehensive review of prior knowledge, Dochy (1999) described “an overview of research that only lunatics would doubt“: prior knowledge was the most significant element in learning. Indeed, the review goes further when it considers strategies that teachers can use when teaching students from differing backgrounds. I like this quote from Glaser and DeCorte, cited elsewhere by Dochy:

Indeed, new learning is exceedingly difficult when prior informal as well as formal knowledge is not used as a springboard for future learning. It has also become more and more obvious, that in contrast to the traditional measures of aptitude, the assessment of prior knowledge and skill is not only a much more precise predictor of learning, but provides in addition a more useful basis for instruction and guidance.

This, for me, points a way forward. Much of the work on prior knowledge has concentrated on assessing differences in performance as a function of prior knowledge, or on surveying the impact of amelioration strategies that aim to bring students up to the same level before teaching commences (such as bridging courses, pre-lecture activities, etc.). But what about a more individualised learning path, whereby a student’s starting point is taken from where their current understanding is? This would be beneficial to all learners – those without chemistry and also those with it; the latter group could be challenged on any misconceptions in their prior knowledge. With technology getting cleverer, this is an exciting time to consider such an approach.

How To Disagree

Sometime last year, a comment was left on one of my articles which said that I was being simplistic and my argument was childish. It was the first and only nasty comment I have got, and it took me by surprise. After the sting, what I wanted to know was why he thought my argument was childish, so that I could consider something in my argument that I hadn’t before. I even wrote to him (oh, naiveté), but the email response just reiterated the main point rather forcefully. The net result was that neither of us really got anything from the conversation.

Hierarchy of Disagreement; By Rocket000 [Public domain], via Wikimedia Commons
Back in 2008, Paul Graham wrote an essay, “How to Disagree“. Graham (a computer scientist) wrote it because, with the increased use of the web, “there’s a lot more disagreeing going on.” Someone has made a diagram based on it and put it on Wikipedia. The essay itself is short and, as you can see from the diagram, it essentially outlines different levels of argument, ranging from name-calling to refuting the central point of the argument.

I think it has broader appeal too. In getting students to write, and argue, it can be hard to get them past personal opinion and on to using evidence to back up their point of view. I think this diagram is useful to show that, although perhaps it leaves out the ethical component – distinguishing between refuting a central point because the refutation is backed up by a consensus of evidence, and refuting it because doing so suits your point of view and you have found a piece of evidence to use to this effect.

Designing e-resources – the transient information effect

One thing I have often wondered about when considering videos/animations/audio files is that, unlike something written on a piece of paper, the information in a multimedia presentation moves on quite quickly. With paper, one can get a sense of the whole, see the sequence, and refer back quickly to what went before. With audio or video, information – often quite complex information – is presented at the pace of the speaker, and it takes a bit of effort to go back and review a segment (i.e. more than a glance of the eye). Therefore, for technical topics like chemistry, is there an issue with using multimedia generally?

This idea was discussed in a recent paper by John Sweller and others.* They cite a raft of studies which show that animations do not have any beneficial impact on learning. In this paper, they conducted two experiments. The context is that, with animations, learners need to simultaneously remember what was just presented along with what is being presented. Depending on the pace, previous information may be forgotten, and it cannot be recalled as easily as static graphics on a piece of paper can be re-read. Segmentation and user control have previously been shown to aid novice learners; effective segmentation means that the amount of information in any segment is within working memory limits.

The first experiment showed children an origami task – it was important that the task included technical elements (obviously, watching TV involves long sections of transient information that can be easily processed; the difficulty arises when technical elements are included). The task involved 24 steps. Students were shown either video or a series of static images, either in short segments or as a continuous presentation. The post-test scores showed that students who watched animations scored better than those who had static images, but only if the animation was presented in short segments.

The second experiment considered the length of verbal statements. Students were given instruction on how to read a temperature-time graph, along with five worked examples. The information was presented to four groups as: (1) longer audio text; (2) longer visual text; (3) shorter audio text; (4) shorter visual text. The instruction lasted 330 seconds in each case, but the amount of explanatory text on the slides differed (long vs short visual text), and in the audio-only cases the explanatory text was removed from the slides and presented as long or short audio segments. After instruction, students were given a post-test. The scores showed that the longer visual text was preferable (a reverse modality effect), and the shorter audio was preferable to the longer audio. These results suggest that short spoken statements can be easily held in auditory working memory, leaving visual working memory free to process the graphics on the slide; written information crowds the visual working memory space, reducing capacity to process, while long auditory information is itself difficult to process.

In terms of designing e-resources, these experiments suggest that animations can have a beneficial impact on learning, but should be presented to novice learners in short segments, to allow time to process them. Audio commentary is beneficial, but again short segments are more useful.

With regard to animations, I think interactivity is important, as it allows the user to click through at their own pace rather than just watching passively. Perhaps a table of contents listing slides might help with audio statements, making it possible for users to click back to a slide they wish to revisit. But it is a pertinent reminder not to go off on long audio meanders, as is our wont. A rough way to check a script for this is sketched below.
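One crude way to guard against those meanders, if you script your commentary first, is to flag segments that outgrow a word budget. A minimal sketch; the 40-word budget is an arbitrary assumption of mine, not a figure from the paper:

import re

def segment_script(script, max_words=40):
    """Split a narration script into short segments at sentence boundaries.

    max_words is an arbitrary budget, not a value from Wong et al.
    """
    sentences = re.split(r"(?<=[.!?])\s+", script.strip())
    segments, current, count = [], [], 0
    for sentence in sentences:
        n = len(sentence.split())
        if current and count + n > max_words:
            segments.append(" ".join(current))
            current, count = [], 0
        current.append(sentence)
        count += n
    if current:
        segments.append(" ".join(current))
    return segments

Anything that comes back as one long segment is a candidate for splitting into its own slide or audio clip.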

______________

*A Wong, W Leahy, N Marcus, J Sweller, Cognitive Load Theory, the transient information effect and e-learning, Learning and Instruction, 2012, 22, 449-475.

LGBT Students Perceptions of Campus Life

Journal Club #7: S. J. Ellis, Diversity and inclusivity at university: a survey of the experiences of lesbian, gay, bisexual and trans (LGBT) students in the UK, Higher Education, 2009, 57, 723 – 739.

A recent conversation led me on the trail of looking to see what literature exists on issues around the transition from school to college for LGBT students. The answer, briefly, is very little. Most studies on LGBT issues in college are based in the US, and most of these appear to relate to staff rather than students. One paper that came close is this one, examining campus climate in 42 UK universities by surveying students.

The entire topic is fraught with difficulties. As the paper states, it is difficult to determine if there is under-representation (or a greater dropout rate) of LGBT students, as we don’t have any population information – either for each year’s class or for the population as a whole. Therefore, it is difficult to state whether there is a problem at all. However, given the data at second level regarding greater levels of truancy, early exit, and underachievement, it seems logical that there is at the very least cause for monitoring at third level, although LGBT students are not typically included under the “widening participation” umbrella. In addition, college is often the first time students are not living at home, free from the constraints of second level and parents.

The study sampled 291 self-selected LGBT students, mostly undergraduate, from 42 HE institutions in the UK. The survey consisted of 25 questions covering four themes: actual harassment, perceptions of campus climate, campus climate and outness, and LGBT inclusiveness.

Actual Harassment/Discrimination

About a quarter of students surveyed had experienced actual harassment, consisting of derogatory remarks, threats of violence or verbal abuse, pressure to conceal, and, in a small number of cases, actual physical assault. This was mostly perpetrated by other students, with a very small number of academic staff (4.4%) and other staff involved. Some particularly unpleasant scenarios are reported, perpetrated by members of religious groups/societies. In another case, involving staff, presenters at an LGBT stand were asked not to make the stand “too overt”. Remarks in this category came most frequently from other people, followed by friends, then academic staff.

Perception of Campus Climate

Just under 2 in 5 students surveyed thought that anti-LGBT attitudes existed to some extent on campus, although only a small number felt that an LGBT person was likely to be harassed on campus. Male respondents were significantly more likely to think this, suggesting that most harassment is directed at this group.

Campus Climate and Outness

4 out of 5 students agreed that they felt comfortable being out on campus, although half concealed their sexual identity from certain groups to avoid the threat of harassment. 2 in 5 avoided revealing it to an academic staff member or tutor for fear of negative consequences.

LGBT Inclusiveness

Opinion was evenly split as to whether universities addressed campus issues relating to inclusiveness. Fewer than 1 in 5 agreed that LGBT issues were adequately dealt with in the curriculum.

Summary and discussion

Overall the study is reasonably positive, with data suggesting that homophobia is not overwhelming, though it is prevalent. The question that arises for me is to what extent universities – as institutions and in terms of lecturing/tutoring staff – should be involved in representing LGBT issues: in terms of dedicated LGBT support, in terms of curricula, and in terms of creating a more accepting, more mature culture. I do feel that institutions pass the buck somewhat in allowing LGBT societies to take up much of the support work needed. In reality, change will only come, and more research on this topic will only emerge, if LGBT students are included under the diversity umbrella, and hence have a link with the funding of institutions.

Flipped Classroom Debate

Educational innovations are a bit like political parties; people tend to advocate them passionately or dismiss them disparagingly, with both groups relying on legacy rather than evidence for their views.

Flipped Graphic from http://ctl.utexas.edu/teaching/flipping_a_class/what_is_flipped which gives a useful overview

This week, the flipped classroom hit the headlines, with a USA Today article that presented a study of flipping the “STEM classroom”. The preliminary results found that flipping the classroom made no difference to problem solving, attitude to learning, or exam performance. It was remiss of the journalist not to pick up what others later did: that the staff-student ratio at the institution in question was 1:9, and therefore any innovation is likely to struggle to improve results. (Edit: the authors of the study have since stated that the USA Today article does not accurately reflect their opinion.) In contrast, the New York Times this week carried a more comprehensive, largely positive opinion piece on the flipped classroom at school level.

Whatever your view, it can only be a good thing that teaching and teaching methods are being openly discussed. The problem for advocates of the flipped classroom is that there is little evidence so far on its effectiveness, and I think some of the more passionate advocates may actually be doing the innovation a disservice by promising more than can be delivered, often without any real evidence. Nevertheless, one can’t sit at one’s lectern waiting for the evidence to come in before changing from the status quo. It is difficult in education to amass this evidence without trial and error, so trials are necessary.

Thanks to the articles above, I joined the Flipped Classroom Ning this week, and I am impressed by the enthusiasm of a lot of educators wanting to improve their students’ learning. There is a growing chorus of people who have tried it, like I have, and see its potential, like I do. I imagine the coming year will be an interesting one in the education literature.

What do students think of flipped lecturing? On to this week’s Journal Club:

#7: J. D. Smith, Student attitudes toward flipping the general chemistry classroom, Chemistry Education Research and Practice, 2013, 14, 607-614. (free to access)

Smith created 200 mini-lectures, ranging in length from 1:08 to 17:02, with an average length of 7:10. The list of lecture titles is in the supplementary information. Over the two-year trial, two different pre-lecture assessments were used. In the first year, students were given questions to help them gauge their own level of understanding, with no grade attached; there was low take-up. In the second year, clickers were used in the classroom (after initial clarifying questions) to gauge understanding of the pre-lecture content. Then followed in-class work: problem solving, discussing, teasing out issues and difficulties. The author writes

Generally, much more time was available for explanation, interaction, and conveyance of insight than had been in the past.

The students were surveyed on their experience of the flipped classroom model. They agreed (97%) that the online material was useful, and didn’t indicate any preference between streaming and downloadable content. Some students wished to be able to annotate lectures. Students felt the length of the lectures was appropriate, but didn’t think they should be any longer. About half the students felt that shifting the work outside the class was a burden to them, but agreed that this made the class time less boring/more engaging, and more useful to them. Students watched the pre-recorded lectures on average three times, using them for class preparation, assisting with homework, and preparing for tests. They found the lecture explanations more useful than the textbook explanations. In class, students were neutral on whether the in-class questions helped them gauge their own understanding, but felt the in-class problem solving helped them prepare for homework and tests.

Discussion

I like this paper because it is a reflection of what really happens when most of us try out something new (we don’t all have $200,000 grants to pilot something). Here, an educator has done a huge amount of work in preparing a suite of materials for his students to use, incorporated it into the flipped class model, and made significant attempts to see how it went. I find the student responses encouraging.  It also highlights to me how important it is to consider how the time in class is used.

  1. Have you tried or are you considering trying flipped teaching? If so, what do hope to achieve? If not, why not?
  2. What kind of topics do you think lend to the flipped class model?
  3. Have you any thoughts on what can be done in the class hour (specifically)?

Off to prepare some screencasts…

Exam scheduling: semester or end of year?

Journal Club #6: G. Di Pietro, Bulletin of Economic Research, 2012, 65(1), 65 – 81. [Link]

It is my experience of academic discourse that when a change is proposed, those advocating the change rely on “gut instinct” and “common sense”, while those opposing it seek evidence from the literature. My own institution is currently planning a significant change to the academic calendar, and while thinking about this, I came across this paper.

The author examines whether an institution’s reform, involving moving from semester exams to end-of-year exams, had a negative impact on student performance. The system under study had two semesters, with exams in January and June, and the reform meant that there would be no January exams, just exams for the entire year at the end of the year (the way it used to be, I hear the chorus).

Reasons for the reform included the desire not to overburden first years with exams in January, and to allow students more time to digest their material. The author doesn’t hold back in stating that he believes the reasons for the reform were administrative and financial.

The study involved taking the mid-term results from modules and comparing these with the exam results. Assuming that mid-term performance stayed constant before and after the reform, the difference between the mid-term mark and the exam mark, compared before and after the reform, provides a measure of the impact of the reform on student grades.
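In other words, this is a difference-in-differences style comparison. In my notation (not the paper’s), with $E$ the mean exam mark and $M$ the mean mid-term mark:

\[
\Delta = \left(E_{\text{after}} - M_{\text{after}}\right) - \left(E_{\text{before}} - M_{\text{before}}\right)
\]

A negative $\Delta$ means that exam marks fell relative to the mid-term baseline after the reform.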


The results demonstrate that there was a drop in student performance when the exams moved out of the semesters to the end of the year, with students scoring 4 points lower (nearly half a grade).

The author concludes with a statement that sounds a note of caution for those considering changing calendars (DIT colleagues take note!):

These findings may have important policy implications. Changes in examination arrangements should ideally be tested for their impact on student performance before they are introduced. Many changes in higher education are driven not by student learning considerations, but by other reasons such as financial and administrative convenience.

Discussion

Do you have any feelings regarding when modules should be examined?

 

Atomic Desire: Teleological Explanations in Chemistry Education

Journal Club 5: V. Talanquer, When Atoms Want, Journal of Chemical Education, dx.doi.org/10.1021/ed400311x

How many of us have said something like the following when explaining why atoms form ions of a certain charge?

Sodium atoms want to lose one electron so that they can have a full electron shell.

Vicente Talanquer writes an interesting piece in the Journal of Chemical Education this month on the (over)use of teleological explanations in science and chemistry education. A teleological explanation is one which uses the consequence of an event (becoming more stable) to explain why the event happened (loss of an electron), conferring a desire on the part of the sodium atom. I have to confess, I do this all the time, especially at introductory level.

Talanquer examined whether students chose teleological explanations to explain observations (they did, overwhelmingly), and whether they chose teleological explanations over causal explanations. For example, a causal explanation for our dear sodium atom losing an electron would be:

Sodium atoms have one electron in a valence orbital with a higher energy than available valence orbitals in other atoms

Students chose teleological explanations over causal explanations across the range of questions posed. Finally, students across a range of levels (introductory to graduate) preferred teleological explanations.

So what?

The preference for teleological explanations is in part assigned to the fact that they offer a more easily understood explanation for an observed phenomenon, and as such learners are more likely to grasp onto them when asked to explain the observation. This probably explains why they are prevalent in introductory teaching. However, Talanquer argues that their overuse might prevent students from examining a phenomenon more deeply, meaning that concepts such as equilibrium become difficult to understand. Talanquer writes:

Teleological explanations are problematic in education because they provide a cognitively cheap way of satisfying a need for explanation without having to engage in more complex mechanistic reasoning.

In relying on teleological explanations, then, an event occurs simply because the substance involved wants it to. This framing eliminates the more nuanced view of a process occurring, often in competition with other processes, which may cost more energetically, or be slower, and therefore be less likely to occur. This probabilistic view is lost in the simplicity of the original explanation.

What to do?

I think these explanations have a value, but perhaps as a first step in engaging learners with the fact that a particular observation is a routine occurrence (e.g. group 1 metals form ions with a 1+ charge). However, when we move on to more in-depth discussion, it’s important to bring learners along so that they don’t over-rely on their simplistic understanding. I think this probably aligns well with Bloom’s Taxonomy.

Discussion

What do you think?
1. Do you use teleological explanations?
2. What value do you think they have, and do you agree with the arguments presented in the paper, that they may be overused?

Do you use lecture handouts, and when?

The aim of the “Journal Club” is to present a summary of a journal article and discuss it in the comments below or on social meeja. The emphasis is not on discussing the paper itself (e.g. methodology etc) but more what the observations or outcomes reported can tell us about our own practice. Get involved! It’s friendly. Be nice. And if you wish to submit your own summary of an article you like, please do. (Paper on author’s website).

4. EJ Marsh and HE Sink, Access to Handouts of Presentation Slides During Lecture: Consequences for Learning, 2010, Applied Cognitive Psychology, 24, 691–706.

As class sizes get bigger and photocopying notes becomes more time-consuming, I thought it would be interesting to look at this study on whether and when to give students handouts for lectures. The authors devised two scenarios: students are given the handouts at the start of the lecture, or students are given the handouts after the lecture is over (this would drive me INSANE if I were a student!).

The authors argue that cognitive load theory has something good to say about both options. Providing material in advance helps students encode the lecture information more readily. Providing the material afterwards means that students have to work a bit harder during lectures, but this work can be a benefit to learning (“desirable difficulties”).

They constructed scenarios whereby students watched a lecture, either with or without handouts, and examined students’ scores in a test soon after and one week after the lecture, to study the difference in short- and longer-term recall. A separate prior study found that 50% of staff preferred to give handouts before a lecture, with 21% saying they never distributed their notes; 74% of students preferred notes before the lecture.

Results

The authors first examined the number of words written by students who did and did not have handouts. Unsurprisingly, those without handouts wrote twice as much as those with them. When this text was analysed, it was found that the bulk of the extra text written by the no-handout group was text from the slides. Interestingly, there was no difference between the two groups in the amount of text written that was not on the slides – it was the same for both groups.

Performance in the tests, both immediately afterwards and one week after, related to the amount of time students spent reviewing the material, but not to whether students had handouts in the original lecture. A caveat here is that students who did not have handouts spent slightly longer on lecture review. The authors summarise this observation by saying that the end result for both groups was the same, although it took the no-handout group more effort to get there. I think this is an interesting discussion point.

A second experiment, which tested free recall soon after the lecture, found that students with handouts performed slightly better (significant at the 0.05 level).

Discussion

This is a small study, but it throws some interesting light on common myths that appear on both sides of the argument. Giving handouts in lectures did not significantly enhance additional note taking by students in the time they had available to amend the notes they were provided. Similarly, requiring students to write out the text did not improve memory of the material. The students who did not have handouts simply had to do a bit more work to achieve the same grade as those who were given handouts.

1. Do you use handouts? Do you give them out before/after? Why?
2. What’s your opinion on the “extra work” by students result – do you think it is a good thing that students without handouts achieve the same score by virtue of spending more time reviewing the material?

In other words, do I need to go to the photocopier tomorrow?!

When we grade, do we mean what we say?

The aim of the “Journal Club” is to present a summary of a journal article and discuss it in the comments below or on social meeja. The emphasis is not on discussing the paper itself (e.g. methodology etc) but more what the observations or outcomes reported can tell us about our own practice. Get involved! It’s friendly. Be nice. And if you wish to submit your own summary of an article you like, please do. If you can’t access the paper in question, emailing the corresponding author usually works (This article is available online at the author’s homepage. PDF).

Comments on this article are open until Friday 4th October.

#3: HL Petcovic, H Fynewever, C Henderson, JM Mutambuki, JA Barney, Faculty Grading of Quantitative Problems: A mismatch between values and practice, Research in Science Education, 2013, 43, 437-455.

One of my own lecturers used to remark that he didn’t care how good our workings were; if the final answer was wrong, we’d get no marks. What use is a bridge, he’d ask, if the engineers calculated its span to be too short? We weren’t engineers.

After two weeks of giving out about students, this week I am looking at a paper that probes what lecturers say they expect in student answers, and whether there is a mismatch between these expectations and how they grade. To examine this, the authors constructed a question with two example answers. The first answer (ANS A) is a detailed, well-explained answer that contains some errors, but the errors cancel out and give the right answer. The second answer (ANS B) is brief and does not show workings, but gives the correct answer. Ten chemistry academics were asked about their grading practices, given these answers, and asked to mark them.

In terms of scoring the solutions, eight of the ten academics scored the incorrect-workings answer (ANS A) higher than the correct-no-workings answer (ANS B); the remaining two scored them equally. The average scores were 7.8 versus 5.1. This preference for ANS A was much stronger than among academics in physics and earth sciences, who were evenly split on whether ANS A scored higher than ANS B.

What do we say we want?

In the interviews, the authors drew up a list of values attributed to instructors in terms of what they wished to see in an answer. Value 1 was that instructors wished to see reasoning in answers to know if the student understands (and to offer specific feedback). All chemistry academics expressed this value.

Value 2 was that instructors wished to find evidence in order to deduct points for incorrect answers. This was interesting, as nine of the ten chemists used this as a reason to deduct points from ANS A, since that student had shown their work, whereas five chemists were reluctant to deduct marks from ANS B because they could not be sure whether that student had made the same mistakes, as they did not show their workings.

Seven chemists expressed Value 3, a tendency to project correct thinking onto ambiguous solutions, assuming that the student writing ANS B must have had the correct thought process, since there was no evidence of a mistake.

Finally, the chemists alone had a fourth value, which was not found as much with the earth scientists or at all with the physicists – a desire to see organisation, units, significant figures; a generally methodical approach.

There is evidently a mismatch between the values expressed. Value 1 (want reasoning) and Value 4 (want a methodical approach) would appear to conflict with Value 2 (need evidence to deduct) and Value 3 (projecting correct thought). Most chemists expressed several values, and where they expressed conflicting values, the authors deduced a burden of proof: which set of values the academics (implicitly) rated higher. Six chemists placed the burden of proof on the student: “I can’t see if the student knew how to do this or just copied it.” The remainder placed the burden on themselves: “I don’t want to take any credit off but will tell him directly that he should give more detail.”

Message to students

Students, of course, are likely to take their messages from how we grade rather than how we say we will grade. If students are graded with the burden of proof on the instructor, they are more likely to do well if they do not expose much reasoning in their answers; if instead they are required to show reasoning and demonstrate understanding, answers without workings are likely to score poorly. Therefore, while we often say that we want to see workings, reasoning, and scientific argument, unless we follow through on that, we are rewarding students who call our bluff in this regard!

Discussion

I think this is an interesting paper, and it’s made me think back over how I mark student work. I imagine I would be in the burden-of-proof-on-the-instructor camp, seeing that as implicitly fair to students, but perhaps I need to get a bit tougher and demand fully detailed reasoning in student answers.

  1. Can you identify with any of the four values the authors outline in this paper?
  2. For introductory students, do you have an “induction” of how to illustrate reasoning in answering questions and problems?

Interested to hear what you think.

Journal Club #2: Approaching Open Ended Problems

The aim of the “Journal Club” is to present a summary of a journal article and discuss it in the comments below or on social meeja. The emphasis is not on discussing the paper itself (e.g. methodology etc) but more what the observations or outcomes reported can tell us about our own practice. Get involved! It’s friendly. Be nice. And if you wish to submit your own summary of an article you like, please do. If you can’t access the paper in question, emailing the corresponding author usually works (CERP is free to access).

Comments on this article are open until Friday 27th September.

#2 T Overton, N Potter, C Lang, A study of approaches to solving open-ended problems in chemistry, Chemistry Education Research and Practice, 2013, doi: 10.1039/C3RP00028A

There is a lot of literature that promotes the use of contextualization in our teaching, to generate interest and promote engagement in a topic. This is often coupled with more open-ended activities, reflecting real-life situations. There is also a lot of literature that advocates that teachers should be cognisant of working memory, providing structure to student learning and reducing “noise” as much as possible. I see these as conflicting viewpoints, and have struggled over the last few years with where the balance between providing enough of the “carrot” of context and the “stick” of structure lies.

Tina Overton’s paper on student approaches to open-ended problems is useful (and unusual) in this regard; the opening section presents a synopsis of several studies on approaches to problem solving when the problem is structured or algorithmic. But what happens when students are given a problem where the data is incomplete, the method is unfamiliar, or the outcomes may be open (Johnstone, 1993)? Three examples of such problems are given. One of these is:

 Many commercial hair-restorers claim to stimulate hair growth. If human hair is composed mainly of the protein α-keratin, estimate the rate of incorporation of amino acid per follicle per second.

I have to be honest and say that this question scares the hair out of me, made my hair stand on end, and other hair-related puns, but as an “expert” in the discipline, I think I would have an idea where to start (I hazard a rough estimate below). The research involved listening to students talk through how they approached the problem, and these approaches were coded. They ranged from approaches that could lead to a feasible solution (making estimations, knowing what approach to take, having a logical reasoning or approach, making sensible assumptions) to approaches that are unlikely to succeed (seeking an algorithmic approach, being distracted by context, lacking knowledge). When participants were grouped by common approaches, three distinct groupings emerged.

  1. The first were those who used predominantly positive (scientific) approaches to the problem. They could clarify the aims and identify the data needed and the assumptions that could be made. (10/27)
  2. The second were those who couldn’t attempt the problem; they didn’t know where to start in tackling it and didn’t have the basic knowledge required to call upon. They weren’t able to take a scientific approach to solving the problem. (10/27)
  3. Finally, there were students whose prior knowledge confused them (e.g. writing out rate equations); although they tried to tackle the problem, they were unsuccessful. (7/27)

The study participants were drawn from all years, but the authors state that the groups identified above did not correlate with stage of degree.

Discussion questions

This study interests me a lot. The headline is that a majority of this (small) sample of students had difficulty with open-ended, highly contextualised problems. I am making a sweeping statement, but I would hazard a guess that students in this institution are exposed to more open-ended problems than average. However, some students displayed significant expertise in approaching the problems, and frankly the responses recorded are very impressive. Questions I think worth teasing out are:

  1. Do you use open-ended problems of this nature in teaching? Why/why not?
  2. There is clearly a gap in approaches. The middle group are caught in between, making good efforts to solve the problem, but calling on irrelevant prior knowledge. If you were planning to incorporate open-ended problem solving, how could this gap be addressed in curriculum design?

In regard to the second question, I’m trying to think of something built around the worked example/fading approach used for simple algorithmic problems. Would something similar have a role here, or does that approach necessitate a change in the categorisation of the problem, back to closed, defined…?

Love to hear your thoughts. Remember the “Be nice” part…

Planning for Journal Club #2

Several people expressed an interest in being involved in a journal club. I’m not quite sure how a journal club works, but I suppose, using a book club analogy, something is proposed in advance, it’s read, and we share comments on what we think. I think commentary focussed on the topic of the paper and its implementation in practice is most beneficial. So to keep some momentum, I’m planning that the next paper to look at will be Tina Overton’s nice piece of work on student approaches to open-ended problem solving (freely available on the CERP website).

I will aim to post my thoughts on it in the coming days, and will then leave comments open for a week if you want to add your own. If it takes off and there is a bit of interest, I’m happy to move to an independent space outside my website, but I will keep it here for the moment until that happy day. In the meantime, I welcome suggestions for journal articles. To keep a broad church, I am leaning towards articles of a general nature that are accessible to all educators.

Journal Club #1: Metacognitive Learning Strategies in Science

The aim of the “Journal Club” is to present a summary of a journal article and discuss it in the comments below or on social meeja. The emphasis is not on discussing the paper itself (e.g. methodology etc) but more what the observations or outcomes reported can tell us about our own practice. Get involved! It’s friendly. Be nice. And if you wish to submit your own summary of an article you like, please do. If you can’t access the paper in question, emailing the corresponding author usually works (email details given on journal page linked below).

#1: E Cook, E Kennedy and S McGuire, Effect of Teaching Metacognitive Learning Strategies on Performance in General Chemistry Courses, Journal of Chemical Education, 2013, 90, 961-7.

I am a little biased in choosing this paper, as I heard Saundra McGuire speak on this topic at the Gordon CERP conference two years ago. This is one inspirational lady! The paper opens by noting that there are many teacher-focussed interventions that work on increasing retention and success; this article, by contrast, describes teaching students how to help themselves in learning and applying new information. The intervention is simple: a single 50-minute lecture on developing metacognitive learning strategies was delivered to freshman chemistry students. Analysis of results shows a significant improvement in their grades relative to those who hadn’t sat in on this lecture.

The timing of the intervention lecture was just after an early-semester test. The authors argue that giving students who have done well in school tests information on study skills before they have completed any college tests is folly, as the students are of the (correct) opinion that their study methods to date have been effective. However, after their first college test, students may be more receptive to thinking about how they study, especially if they haven’t performed as well as they usually did at school.

The lecture itself (available in the supplementary information) uses Bloom’s taxonomy to show students the levels of learning, and which study approaches apply to each level. This keeps it simple and logical, which I think is an attractive element of the work. Armed with an understanding of the different levels of learning, a study cycle is proposed. When questioned, students reported that, having been shown the Bloom framework, they understood the differences between school and college learning requirements. The (revised) levels of Bloom’s taxonomy are outlined below:

| Bloom’s level (revised) | Typical activity | Level | Typical study strategy | Rationale |
| --- | --- | --- | --- | --- |
| Creating | Generating, producing information into new patterns | Postgrad | | |
| Evaluation | Making judgments based on criteria | | | |
| Analysis | Breaking components apart and relating them to each other | Undergrad | Working through problems without examples; working in groups | Working through problems and hearing how others think about a problem aids understanding |
| Application | Using knowledge to solve problems, carry out procedures | | | |
| Comprehension | Restating in your own words, paraphrasing or summarising | School | Previewing lecture material; paraphrasing/rewriting lecture notes | Helps students organise new information and connect it to what they know |
| Knowledge | Memorizing information, recalling but perhaps not understanding | | | |

Based on this, a “Preview, Attend, Review” study cycle is proposed, so that students are exposed to the class material three times within a short period (24 h). This is followed by a study session, consisting of four steps:

  1. Set a goal (1 – 2 mins) – what is the aim of the study session?
  2. Study with focus (40 – 50 mins) – interact with material: mind maps, summarize, process, etc
  3. Reward (10 – 15 min break)
  4. Review (5 min) – go over what was just studied.

The paper includes detailed statistical analysis of how this intervention improved student grades. I like it because it is realistic – it only takes one lecture slot, so it can be delivered relatively easily in a busy semester – and it is student-friendly: a welcome contrast to the plethora of study skills books and strategies that are vague and generic. This seems to me quite focussed in explaining to students why thinking about how they study is important, and how they might go about improving their study skills and learning. In the mix, metacognitive skills are being incorporated into the curriculum by stealth.

What do you think?