Pedagogy, Trends in E-Learning

Why do academics use technology in teaching?

This week is All Aboard week in Ireland, aimed at “Building Confidence in Digital Skills for Learning”. I am speaking today in the gorgeous city of Galway on this topic, and came across this paper in a recent BJET which gives some useful context. It summarises interviews with 33 Australian academics from various disciplines on the topic of why they used technology in assessment. While the particular lens is on assessment, I think there are some useful things to note for those espousing the incorporation of technology generally.

Four themes emerge from the interviews.

The first is that there is a perceived cost-benefit analysis at play: the cost of establishing an assessment process (e.g. quizzes) was perceived to be offset by the benefits it would offer, such as reducing workload in the long run. However, some responses suggest that this economic bet didn’t pay off, and that lack of time meant academics often opted for quick solutions, or those they already knew about, such as multiple choice quizzes.

The second theme is that technology was adopted because it is considered contemporary and innovative; this suggests a sense of inevitability about using tools simply because they are there. A (mildly upsetting) quote from an interview is given:

“It would have been nice if we could have brainstormed what we wanted students to achieve, rather than just saying ‘well, how can ICT be integrated within a subject?’”

The third theme was one around the intention to shape students’ behaviour – providing activities to guide them through learning. There was a sense that this was expected and welcomed by students.

Finally, at the point of implementation, significant support was required, which often wasn’t forthcoming, and because of this, and other factors, intentions had to be compromised.

The authors use these themes to make some points about the process of advocating and supporting those integrating technology. I like their point about “formative development” – rolling out things over multiple iterations and thus lowering the stakes. Certainly my own experience (in hindsight!) reflects the benefit of this.

One other aspect of advocacy that isn’t mentioned but I think could be is to provide a framework upon which you hang your approaches. Giving students quizzes “coz it helps them revise” probably isn’t a sufficient framework, and nor is “lecture capture coz we can”. I try to use the framework of cognitive load theory as a basis for a lot of what I do, so that I have some justification for when things are supported or not, depending on where I expect students to be at in their progression. It’s a tricky balance, but I think such a framework at least prompts consideration of an overall approach rather than a piecemeal one.

There’s a lovely graphic from All Aboard showing lots of technologies, and as an awareness tool it is great. But there is probably a huge amount to be done in terms of digital literacy, regarding both the how and the why of integrating technology into our teaching approaches.

Click link to go to All Aboard webpage

 

Laboratory

Rounding up the peer review and digital badge project

Marcy Towns’ lovely paper from 2015 described the use of digital badges in higher education chemistry, specifically for the assessment of laboratory skills. The concept of badges had been around for a while. When I first came across them while doing an MSc in E-Learning back in 2010, laboratory work seemed an obvious place to use them. But while the technology promised a lot, there weren’t feasible systems in place to do it. And what exactly would you badge, anyway? And would undergraduate students really take a digital badge seriously?

Towns’ work was important for several reasons. On a practical level, it demonstrated that issuing badges to a very large cohort of students in undergraduate laboratories was feasible. At Purdue, they used an in-house system called Passport to manage the badging process, from submission of videos to viewing online for assessment and subsequent issuing of the badges. But more importantly for me, Towns’ work reasserted the notion of laboratory skills and competencies as something worth assessing in their own right. Skills weren’t being implicitly assessed via the quality of results or the yield of a reaction; the technique itself, videoed as the student demonstrated it, was directly assessed. This is not something you come across very often in the education literature (some notable exceptions are pointed out in our paper).

This work answered two of the three questions I had about badges: there are systems in place (although, as I discovered, Blackboard just about manages to do this, with a lot of creaking), and there is scope for badging laboratory skills – the concept of badging is built on demonstrable evidence, and videoing techniques is part of this. Whether students take badging seriously, I think, still needs to be answered. My own sense is that there will need to be a critical mass of badges – an obvious ecosystem where it is clear to students how they can progress – and our own work in this regard is extending into more advanced techniques.

Incorporating peer review

One of the great insights Towns and her students shared at a workshop at BCCE last summer was the notion of narration in demonstrating techniques. Early videos in their initial pilot studies were eerily silent, and it was difficult for them to know what the students’ understanding was as they completed a technique – why were they doing things in a particular way? So they built narration into the requirements of the demonstration. I think this is one of those things that is obvious in hindsight, but knowing it up front in our own implementation was invaluable.

We opted for a system where students would video each other rather than be videoed by a demonstrator, with the narration in effect being one peer telling the other how they were doing the technique. To facilitate a review of this at the end, the demonstrators in the project came up with the idea of a peer observation sheet (they designed them too – bonus points!). The whole setup was to encourage dialogue – genuine interactions discussing the experimental technique, allowing for feedback based on the guidelines presented in the peer observation sheets. These acted as a framework on which the lab was run. Lord knows chemists like instructions to follow.

Feedback is then given in situ, and indeed if the student demonstrating feels after discussion that they would like to video the technique again, they can. This notion of quality, or exemplary work, is underscored by the exemplars provided to students in advance: pre-laboratory videos dedicated to the correct display of technique. This whole framework is based around Sadler’s design for feedback, discussed… in the paper!

We’ve documented our ongoing work on the project blog, and the paper summarising the design and evaluation is now available in CERP. It is part of the special issue on transferable skills in the curriculum, which will be published in the autumn, primarily because we felt the project developed digital literacy skills in addition to the laboratory work; students were required to submit a link to the video they hosted online, rather than the video itself, giving them control over their digital footprint.

Resources

Resources – digital badges, peer observation sheets, links to exemplar videos – are all freely available on the project website.  I really think there is great scope for badges, and look forward to where this project will go next!


Chemistry, Pedagogy, Royal Society of Chemistry

#ViCEPHEC16 – curly arrows and labs

The annual Variety in Chemistry Education/Physics Higher Education conference was on this week in Southampton. Some notes and thoughts are below.

Curly arrows

Physicists learned a lot about curly arrows at this conference. Nick Greeves’ opening keynote spoke about the development of ChemTube3D – a stunning achievement of over 1000 HTML pages, mostly developed by undergraduate students. For those who know the site, the news is that 3D curly arrow mechanisms are now part of the reaction mechanism visualisations – really beautiful visualisations of changing orbitals as a reaction proceeds, for 30+ reactions – along with lovely visualisations of MOFs, direct links to/from various textbooks, and an app at the prototype stage. Nick explained that all of this has been developed with small amounts of money from various agencies, including the HEA Physical Sciences Centre.

Mike Casey from UCD spoke about a resource at a much earlier stage of development: an interactive mechanism tutor. Students choose a reaction type and then answer the question by drawing the mechanism; based on their answer, they receive feedback. Version 2 is on the way with improved feedback, but I wondered if this feedback might include a link to the appropriate place in ChemTube3D, so that students could watch the associated visualisation as part of the feedback.

In the same session, Robert Campbell spoke about his research on how A-level students answer organic chemistry questions. My understanding is that students tend to use rules of mechanisms (e.g. a primary alkyl halide means it’s always SN2) without understanding the reason why, hence promoting rote learning. In a nice project situated in the context of cognitive load theory, Rob used Livescribe technology to investigate students’ reasoning. I am looking forward to seeing this research in print.

Rob’s future work alluded to considering the video worked answers described by Stephen Barnes, also for A-level students. These demonstrated a simple but clever approach: using questions resembling A-level standard, asking students to complete them, providing video worked examples so students could self-assess, and then getting them to reflect on how they can improve. David Read mentioned that this model aligns with the work of Sadler, worth a read.

Laboratory work

Selfishly, I was really happy to see lots of talks about labs on the programme. Ian Bearden gave the physics keynote, speaking about opening the laboratory course – meaning removing prescriptive instructions and allowing students to develop their own procedures. Moving away from pure recipe is of course music to this audience’s ears, and the talk was very well received. But you can’t please everyone – I would have loved to hear much more about what was done and the data involved, rather than the opening half of the talk about the rationale for doing so. A short discussion prompted this tweet from Felix Janeway, something we can agree on! But I will definitely be exploring this work more. Ian also mentioned that this approach is part of physics modules taught to trainee teachers, which sounded like a very good idea.

Jennifer Evans spoke about the prevalence of pre-labs in UK institutions, following on from the Carnduff and Reid study in 2003. Surprisingly, many institutions don’t have any form of pre-lab work. It will be interesting to get a sense of what pre-lab work involves – is it theory or practice? Theory and practice were mentioned in a study from Oxford presented by Ruiqi Yu, an undergraduate student, which showed mixed messages on the purpose of practical work – surely something the academy needs to agree on once and for all. There was also quite a nice poster from Oxford involving a simulation designed to teach experimental design, accessible at this link; this too was built by an undergraduate student. Cate Cropper from Liverpool gave a really useful talk on tablets in labs, exploring the nitty-gritty of how they might work. Finally on labs, Jenny Slaughter gave an overview of the Bristol ChemLabS programme, which is neatly summarised in this EiC article, although the link to the HEA document has broken.

Other bites

  • Gwen Lawrie (via Skype) and Glenn Hurst spoke about professional development; Gwen mentioned this site she has developed with Madeline Schultz and others to inform lecturers about PCK. Glenn spoke about a lovely project on training PhD students for laboratory teaching – details here.  This reminds me of Barry Ryan‘s work at DIT.
  • Kristy Turner gave an overview of the School Teacher Fellow model at Manchester, allowing her to work both at school and university, with obvious benefits for both. Kristy looked forward to an army of Kristys, which would indeed be formidable, albeit quite scary. Even without that, the conference undoubtedly benefits from the presence of school teachers, as Rob’s talk, mentioned above, demonstrates.
  • Rachel Koramoah gave a really great workshop on qualitative data analysis. Proving the interest in chemistry education research, this workshop filled up quickly. The post-it note method was demonstrated, which was interesting and which I will certainly explore more, though I hope to tease out a bit more detail on the data reduction step. That step is the benefit of this model – the participants reduce the data for you – but I worry that it might in turn lead to loss of valuable data.
  • Matthew Mears gave a great bite on the value of explicit signposting to textbooks using the R-D-L approach: Read (assign a reading), Do (assign questions to try), Learn (assign questions to confirm understanding). Matt said setting it up takes about 30 minutes, and he has seen marked improvements in student performance in comparison to other sections of the course.
  • David Nutt won the best poster prize. His poster showed the results of eye-tracking experiments to demonstrate the value (or not) of an in-screen presenter. Very interesting results, which I look forward to seeing in print.

The conference organisation was brilliant and thanks to Paul Duckmanton and Charles (Minion) Harrison for leading the organisation. Lots of happy but tired punters left on Friday afternoon.

I couldn’t attend everything, and other perspectives on the meeting, with links etc., can be found below. From Twitter, Barry Ryan’s presentation on Nearpod seemed popular, along with the continuing amazingness of my colleagues in the Edinburgh Physics Education Research Group. One of their talks, by Anna Wood, is available online.

Chemistry, Pedagogy

Getting ready to badge and looking for interested partners

Over the summer we have been working on a lab skills badging project. Lots of detail is on the project home site, but briefly this is what it’s about:

  • Experimental skills are a crucial component of student laboratory learning, but we rarely assess them, or even check them, formally. For schools, there is a requirement to show that students are doing practical work.
  • By implementing a system whereby students review particular lab techniques in advance of labs, demonstrate them to a peer while being videoed, review the technique with the peer using a checklist, and upload the video for assessment, we intend that students will be able to learn and perform the techniques to a high standard.
  • The video can form part of a student’s electronic portfolio that they may wish to share in future (see this article for more on that).
  • The process is suitable for digital badging – awarding of an electronic badge acknowledging competency in a particular skill (think scout badges for… tying knots…).

Marcy Towns has a nice paper on this for pipetting, and we are going to trial it for this and some other lab techniques.

Looking for interested parties to trial it out

I am looking for school teachers who would like to try this method out. It can be used to document any lab technique or procedure you like. You don’t necessarily need an exemplar video, but a core requirement is that you want to document students’ laboratory work formally, and acknowledge achievement in this work with a digital badge. We will provide the means to offer the badge, and exemplar videos if you need them, assuming they are within our stock. Interested teachers will be responsible for local implementation and assessment of quality (i.e. making the call on whether a badge is issued).

Yes I need help with badge design

This will be part of a larger project and there will be some research on the value and impact of the digital badges, drawing from implementation case studies. This will be discussed with individuals, depending on their own local circumstances.

So if you are interested, let’s badge! You can contact me at: michael.seery@ed.ac.uk to follow up.