I’ve been thinking of ways to include worked examples and hints in Blackboard VLE quizzes. Cognitive Load Theory describes something called the Worked Example Effect, whereby learners who receive direct instruction in the form of worked examples perform better than those who don’t. The effect is attributed to giving novice learners an approach to solving a problem that they can replicate, alleviating the load on working memory while they solve it. There’s some more on worked examples here.
I think the start of my teaching career coincided with the rise of the VLE. Early on, I remember being told about these new learning environments and the array of tools that would help student learning. Encouraged, in the nicest possible way, to upload material and use the institution’s expensive new toy, many lecturers complied and uploaded course materials, support papers, practice questions and so on. In this ideal world, the students couldn’t have had more learning resources at their fingertips. Learning was going to happen.
In reality, this has not been the case. The DRHEA e-learning audit (2009) reveals some disappointing figures across the Dublin region. Students log into their VLE regularly, but mostly to access course materials (lecture notes). This makes VLEs a very expensive version of Dropbox or any other online repository.
This is also reflected in the UK. In my own subject (chemistry) and in physics, the Higher Education Academy Physical Sciences Centre review of student learning experience showed that e-learning came bottom of the pile when students were asked to say which teaching method was most effective and most enjoyable.
For most lecturers, e-learning is not part of their day-to-day practice, perhaps because of a lack of confidence, probably because of a lack of awareness. Mention e-learning, and the discussion quickly moves to whether to use PowerPoint and whether those notes should go online. There may also be subtle fears of replacement – that if learning can happen online, perhaps it can happen without lecturers at all! (Of course, anyone who has taught online knows the truth here!) And as the DRHEA survey shows, when academics engage with the VLE, it tends to be by mimicking what they do in lectures, rather than supporting what is done in lectures.
Institutions, bless them, are concerned with e-learning from a perspective of usage and branding – how does their toy compare with next door’s. There have also been subtle and not-so-subtle undertones about how e-learning can provide cost savings in the future, which is a naive viewpoint. Institutions need to be protected from themselves. If, as a community, we don’t identify valuable uses to incorporate into our practice, institutions will want to fill the vacuum, just as they did previously by pushing content online. Lecture capture, a spectacular waste of taxpayers’ money, is looming large and is already catching on in the UK. It looks good, makes for good PR and students “love” it. The fact that there is little or no evidence that it helps with learning is disregarded. As a community of educators, we should be concerned about this “innovation” being pushed on us. [I recommend reading this for a fuller discussion of lecture capture.]
Students, well, bless them too. Students are clever, articulate, funny and they are our future. But they are also sometimes a bit stupid. Students will always want more – more notes online, more resources, more quizzes, more self-study questions, more, more and more! In the relaxed days several months before exams, they mean well and plan to engage with all of this material. But all the evidence points to the fact that students rarely engage with the material until it is too late, just before exams. At this stage, they find the content – often not even re-purposed for an online environment (a substitute for what they already have rather than a supplement to help them understand it) – useless for their learning.
Finally, we have my very good friends, the learning development officers, who try various strategies, sometimes against all the odds, to assist lecturers in incorporating e-learning into their teaching. Locally, their help has been of great value to me, but reading about e-learning on blogs and on The Twitter Machine, there is a sense that the ideas and conversations within the learning development community do not reflect what is happening on the ground. There is perhaps a false sense of advancement, buffered from the great unwashed of PowerPoint debaters by early adopters and innovations in the literature. This can lead to a disconnect in language – acronyms, gadgets and tech jargon – which undermines the confidence of lecturers who may wish to change. The term “learning technologist” does not help, as it immediately imposes a (false) divide between learning and e-learning.
So, what to do? The high participation rates in VLEs indicate that this is a place where learning opportunities can be provided. Students are hungry to engage, if material is there. One of my favourite authors in the literature on e-learning for practitioners is Gilly Salmon (Gill-e-Salmon?). A core component of her approach is for practitioners to ask themselves: “What is the pedagogic rationale for implementing any proposed change?“. I think this is a very powerful position – it speaks in language all perspectives can understand, or at least appreciate (institutions, I am looking at you). Lecturers, identifying problems or issues in some teaching practices, can consider how to integrate a change, perhaps harnessing technology, into their teaching. Because there is a need – an underlying rationale, even – the implementation has a value and a role to play in the module delivery. Lecturers may refer to it, and better still integrate it into their class work. Students are then presented with bespoke learning materials with the specific purpose of supporting their learning at a particular stage of the module. Instead of just re-presenting lecture information all over again, there is a reason, at particular stages in the module, to interact with these resources – they have a value. Learning development officers can offer their considerable expertise in supporting lecturers in developing the resources, so that they are fit for purpose. And institutions are happy because students are happy and access statistics look good. In our own work here at DIT, we have enjoyed some success at the micro-level employing this approach – moving away from mass content upload (“shovelware”) towards specific learning resources tailored for and incorporated into specific modules. It takes time and is harder work, but the value of what is produced is greater for all.
Prompted by my visit to Edinburgh next week for the “More Effective Lectures” workshop, I have compiled several blog posts and bits and pieces of other writing into a Resource Pack that I hope might be useful to other practitioners, entitled “Podcasting and Screencasting for Supporting Lectures”. The resource is a PDF file and is available at this link: Podcasting and Screencasting for Supporting Lectures, or click on the image below. The resource covers:
Introduction to the use of podcasts/screencasts in education
Overview of the design of e-resources
Tips for preparing podcasts and screencasts
Tools of the trade: Audacity, Camtasia and Articulate
This is an Articulate interaction which incorporates video demonstrations of the various aspects of the iodine clock experiment, with a quiz towards the end. It could be used as a pre-lab activity, where students print out their responses to the quiz and bring them to the lab, or alternatively the quiz could be linked to the VLE via SCORM. Click on the image to access the resource:
Funding from NDLR and DIT gratefully acknowledged.
The DIT 2010-2011 Teaching Fellowships were launched on 23rd September 2010, and each recipient of a Fellowship gave a presentation on the work they plan to do. It was really nice to see what others plan to do; there was a lot of variety and a lot of overlap at the same time. My presentation – the main thrust of which was summarised in another post – is embedded below. All of the presentations can be viewed from the LTTC website.
The video is streamed from the HEAnet server using the Embedded Video plugin (for the information of any WordPress junkies out there).
This article considers the concept of pooling experimental data in the teaching lab. It’s a very simple but effective technique that I have used, both in low-tech and high(er)-tech versions. Some observations from my own experiences as well as some examples from the literature are presented.
What is pooling data?
Pooling data, in the sense I mean here, is the aggregation of all students’ lab data on a particular experiment in one lab session. If ten pairs of students are completing an experiment, the pooled data would consist of a table of every pair’s data, together with averaged data. In the low-tech version, the data are tabulated and averaged on the whiteboard, with students taking away their own results as well as the class average for analysis in their report. In the high-tech version, the same analysis is done, but the results are entered into a spreadsheet which is uploaded onto the VLE. This also allows a living graph to be drawn during the lab session as students enter their data. The approach of course requires that all students, or a significant proportion, are completing the same experiment at the same time – something that might be difficult in upper-level undergraduate chemistry, but should be quite feasible in years 1 and 2.
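The spreadsheet logic behind the high-tech version is trivial, which is rather the point. As a minimal sketch (the pair names, temperatures and timings below are invented purely for illustration):

```python
# Hypothetical pooled data: each pair records a reaction time (s) at the
# same three temperatures. All names and values are invented.
import statistics

pooled = {
    "Pair A": [212.0, 104.5, 51.8],
    "Pair B": [208.3, 101.2, 53.0],
    "Pair C": [220.1, 108.7, 49.9],
}
temperatures = [20, 30, 40]  # degrees C

# Class average at each temperature - the extra row students take away
# alongside their own results.
averages = [
    statistics.mean(times[i] for times in pooled.values())
    for i in range(len(temperatures))
]

for t, avg in zip(temperatures, averages):
    print(f"{t} C: class mean = {avg:.1f} s")
```

In practice a shared spreadsheet on the VLE does the same job with an `AVERAGE()` row, and charting the pooled column as students type gives the living graph.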
Why pool data?
The advantages of pooling lab data, from my own experience of doing it, are many. The first is that it opens up a dialogue across the group. Very often in a lab group, pairs work independently without considering that others in the lab are doing the same experiment, and hence are a useful resource for comparing results. Entering the data in a public space means that students consider their results in light of others’: is it around the same, how does it differ, should it be repeated? This addresses issues of confidence in experimental approach. It also allows for a discussion on the nature of errors in the lab. Differences between human error and instrumental variance, for example, can be easily identified: the latter will result in different individual values but the same trend (e.g. the same slope of a graph); the former will show differences in the trends of the data. The process also flags to students the extent of confidence they can have in their value, and illustrates the point of variance very clearly. Pairs can discuss across the group how and why their values differ. It is a nice experience to watch a group of students standing at a board considering differences in values, rather than rushing to see if their plot gives them the “right answer”.
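The slope-versus-offset distinction can be made concrete with a quick fit per pair. A sketch with invented numbers (ordinary least squares, plain Python): two pairs that differ only by a constant offset share a slope, while a procedural error shows up as a genuinely different trend.

```python
# Invented readings y at points x, purely for illustration.
# Pair 1 and Pair 2 differ by a constant offset (instrumental variance:
# same slope); Pair 3 has a different trend (a procedural/human error).
def slope(xs, ys):
    """Ordinary least-squares slope of y against x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

x = [1, 2, 3, 4]
pairs = {
    "Pair 1": [2.0, 4.0, 6.0, 8.0],   # slope 2.0
    "Pair 2": [2.5, 4.5, 6.5, 8.5],   # slope 2.0, offset +0.5
    "Pair 3": [2.0, 3.0, 4.0, 5.0],   # slope 1.0 - different trend
}
for name, y in pairs.items():
    print(name, round(slope(x, y), 2))
```

Laying the three fitted slopes side by side on the board is exactly the conversation-starter described above: which differences are benign, and which mean someone should repeat a run?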
If a living graph approach is used, it provides a platform towards the end of the practical activity both for discussion of experimental error (which can be illustrated visually on the scatter plot containing all data), as discussed above, and for initiating explanations of how data treatment should begin. In my own laboratory, students are required to stay the full time: the introduction is completed prior to the lab, while the procedure, data recording and the beginnings of data analysis are all completed during the lab. This means that the analysis can take place in the presence of an instructor, which lessens the black-hole effect that many students see lab report writing to be. The range of data available also allows meaningful statistical analysis to be performed by students.
One final advantage of pooling data, which is a slight variation of the method described so far, is that pooling data may allow a larger scale experiment to be done – for example the construction of a phase diagram – where all pairs do some experiments that contribute to the total data set. This method is effective in the discovery laboratory teaching method, see some of the references for more details.
Examples from my practice
There is no limitation on the nature of experiments that can be used in this way, provided a consistent value is to be measured by the lab group as a whole. It is most useful, though, for obtaining data that are to be treated graphically. I have used it in a range of quantitative experiments based on the Beer-Lambert law, thermochemistry experiments, and kinetics measurements. Some of these involve data acquisition that doesn’t allow for direct comparison, which in itself allows for discussion. For example, in a simple calorimeter calibration experiment, depending on the hot and cold water temperatures students use to calibrate, they will obtain different readings. This opens up the floor for a discussion on heat capacity. Kinetics measurements (e.g. clock experiments) allow for a discussion on how confident one can be in a time measurement – and again on being clear about when a time measurement is made in the reporting of the procedure (a perennial question is “how does my procedure differ from the lab procedure?” – this kind of thing provides a great illustration of how!). I should say that students are initially reluctant to commit their data to the public scrutiny of their colleagues, but once the first entries are made, all inhibitions are lost.
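In the Beer-Lambert case, pooling lets the whole class share one calibration line. A sketch, assuming invented concentrations and absorbances (the readings and the unknown are not real data): average the pooled readings at each standard, fit A = k·c through the origin, then read an unknown off the class line.

```python
# Hypothetical Beer-Lambert calibration: at each standard concentration
# (mol/L), several pairs' absorbance readings are pooled. Values invented.
pooled_A = {
    0.010: [0.152, 0.148, 0.150],
    0.020: [0.301, 0.298, 0.305],
    0.030: [0.449, 0.452, 0.447],
}

# Average the pooled readings, then fit A = k*c through the origin
# (k = molar absorptivity x path length) by least squares.
concs = sorted(pooled_A)
means = [sum(pooled_A[c]) / len(pooled_A[c]) for c in concs]
k = sum(c * a for c, a in zip(concs, means)) / sum(c * c for c in concs)

# Read an unknown off the class calibration line.
unknown_A = 0.225
print(f"k = {k:.1f} L/mol, unknown = {unknown_A / k:.4f} mol/L")
```

The same fit done on a single pair’s three points would be far noisier; the pooled version makes the benefit of the larger data set immediately visible.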
Examples from literature
Either my searching isn’t great, or it’s just too “simple” to report in the literature, but I haven’t seen many examples. Ricci and Ditzler (1991) discuss pooling lab data in a discovery approach to chemistry learning. While this has a slightly different emphasis from the approach described here, the essentials are the same: students compile a range of data on either the same or different measurements, and use the advantage of a large number of samples to deduce conclusions. The authors provide examples from a measurements experiment (mass of a penny based on year minted), stoichiometry experiments (mass of silver halide precipitate based on halide), electron configuration (colour of transition metal complexes), and molecular structure (using IR frequencies). Olsen (2008) uses pooled data to illustrate statistical concepts (central tendency and distribution) in the laboratory, using the combustion of magnesium and the determination of the molecular formula of the oxide as the basis for the data acquisition. McGarvey (2009) describes the use of data pooling in his teaching laboratory:
“Live data pooling, supported by projection of tabulated/graphical data on a large screen in the laboratory, provides the added dimension of enabling students to actually see the data develop before their eyes as the practical class proceeds; they can also see how their classmates are progressing and how their own results compare with those of fellow students. A large data set also lends itself to more meaningful data (and error) analysis and it is also easier to assess adherence to (and deviations from) theoretical predictions with a large data set, which is more instructive for students when carrying out data processing and analysis.”
McGarvey, DJ (2009) Enhancing undergraduate chemistry practicals using live data pooling, Eurovariety in Chemistry Education, Manchester. Proceedings at this link.
Olsen, RJ (2008) Using Pooled Data and Data Visualization To Introduce Statistical Concepts in the General Chemistry Laboratory, J. Chem. Ed., 85(4), 544.
Ricci, RW and Ditzler, MA (1991) A Laboratory-Centered Approach to Teaching General Chemistry, J. Chem. Ed., 68(3), 228.
This is a screencast of a presentation to be given at Chem Ed Ireland 2010. It covers:
what uses class websites might have
overview of setting up Google Sites
example of a chemistry class website on Google Sites
uses for podcasting in education
overview of Audacity
This handout accompanies the presentation. Lots more information on setting up a class website using Google Sites is available at the Becoming an eTeacher Resource. Module 5 of this resource focusses on podcasting using Audacity. Some other links are given below.
Becoming an eTeacher – Five module resource on setting up and populating Google Sites for class work
In the first week of our Trends in E-Learning module, we’ve been looking at the “VLE is dead” debate. The seed for discussion was Martin Weller’s blog post (now over two years old), which makes the valid point that there are several independent third-party (free) applications out there that address most if not all of the needs a VLE does, and do it a lot better, because each individual application is that company’s core business.
I’ve been thinking about my own relationship with VLEs as a practising lecturer, as a student, and as someone who has, if I may say, above-average PC and web literacy compared with your typical academic (as well as boosting my own ego, this is important, as I’ll mention later). We’ve had a useful discussion over the week (in our closed VLE discussion board) which has given me the opportunity to hone my thoughts.
It is dead.
I like Martin Weller. He writes with a sense of pragmatism and Feet On The Ground, and seems to be both someone who thinks a lot about these things and someone who teaches, which gives him an edge over a lot of commentators in my book. The VLE was born from a need to create an online workspace for students, to make files available and to communicate with students effectively (Phipps, Cormier and Stiles 2008). In reality, in my experience it is mainly two things: a content repository for lecture notes and supplemental notes, and a way of administering a course through the mail/announcement/discussion-board communication tools to students registered on the module. Whether its use as a content repository is a good thing is for another debate, but if this is the main use, why do we need one at all? Lecturers could provide a webpage with links to their presentations. This, coupled with having the email addresses of all the students, makes these two uses obsolete.
VLEs are closed, walled gardens, with the lecturer as gatekeeper to the information within. What’s in the VLE is therefore considered important, because the lecturer put it there. There are two points to tease out here. The first is that if the lecturer defines the information the students should know, there is pressure on him (I’m male) to keep that content up there and up to date, making sure a range of issues is covered. He spends a lot of time working through the content of the web, picking out information from trusted sites, academic papers and interesting presentations, as well as links to core texts, and placing it online in a nicely arranged manner so students can come into the garden and pick whatever roses of information they want. But when students want to learn some information for themselves, they have no experience in sourcing information or checking validity, because sourcing information, to them, has meant logging in and accessing the file the lecturer sourced. Secondly, it is a moot point whether students access much or any of this information at all, unless it is intrinsically related to assessment. If they need it at a future stage, post-module, they can’t get it, because they are no longer allowed into that garden. This has been my own experience as a student in modules I have completed in the past.
The alternative, therefore, is that information can be placed on a website or referral area for all the resources a module needs. Slides could be posted on Slideshare; wiki discussions and class activities on PBworks; screencasts on Screenr.com, using the free monthly allowance of screencast.com, or of course YouTube; podcasts on iTunes; pictures on Flickr; discussions on an open discussion forum; assessment on… well, that needs fine-tuning, but Google will come up with something soon, I’m sure. Or the whole shebang could be placed on Google Sites, Facebook or some of the other giants that are getting a taste of this market. What’s the difference between this and a VLE in the traditional sense? Well, in this case, neither access nor content is restricted. This area becomes more of a first referral site – a place to start looking – scaffolding learners’ embrace of the information source that is the internet through the language of tags, ratings and credibility. No need for expensive custom VLEs. An additional advantage for lecturers is that they don’t have to struggle with the terrible interface of VLEs, instead using the simplicity and beauty of something like WordPress, the mass appeal of YouTube and the versatility of compiling interesting information on Delicious.
It isn’t dead.
But wait! I like James Clay too. Full of useful tips and advice and an Eagerness To Share good practice, he has been the one I follow who makes most sense about what a VLE actually is and how it can be used. His podcast #40 is really excellent, and I recommend anyone interested in a short synopsis of what can be done with VLEs listen to the second half of it, where he outlines a five-stage plan for using a VLE. The message coming out of this is: let’s not get too hung up on what a VLE is, but on what we can do with it. His five stages range from uploading content, resources and assignments, through interactivity with feedback and discussion and sharing of thoughts, to running a module online.
One of the comments on Martin Weller’s post, above, makes the point argued by Grainne Conole that the VLE walled garden provides a “trusted brand”. In addition, while I might personally be comfortable using an array of sites and tools, I know a lot of my colleagues wouldn’t be, and it would be difficult at an institutional level to provide support for the variety of tools and sites each lecturer might individually choose. It might also be difficult for students to know which bit of information is where. The two great practical advantages of the institutional VLE are that students are added by the institution’s registration procedure, and that the gradebook feature allows students’ privacy with respect to individual grades to be protected. To go it alone would involve a lot of work on the part of individual lecturers at what would be a very busy time of year. While usage at the moment is probably underwhelming, through progressive staff training and development staff could be introduced to the “stages” of using a VLE, so that over time the true potential could be realised.
Is it dead or not?
What do we want to use a VLE for? In the end, it is to help students learn. So I suppose it doesn’t really matter what we use as long as we are aiming towards that goal. I don’t like the walled garden nature of a VLE. Practically and psychologically, it reinforces an objectivist approach in assuming the lecturer has all the knowledge and students will absorb it all from the VLE. But I do like the structure a VLE can provide, and as a student I like this too – knowing I can go to a particular place to find resources on a topic. The ID and gradebook features are also beneficial.
When I was a student, I worked as a gardener in a beautiful 19th-century garden. The main section was the Radial Garden, a walled section with a very formal layout of beds and highly manicured lawns. As you walked through this section, you passed through a gate into a less formal, though still structured, section, and then through a third set of gates, passing through the wall into the Pleasure Grounds, a beautiful informal area with specimen trees that seemed to go on for ever. The difference between the Radial Garden and the Pleasure Grounds was stark, with the middle section acting as a transition. Both extremes were equally beautiful, equally of interest to gardeners. Perhaps this is a method of introducing material to learners online. Provide them with the structure and formality of a formal VLE setting, but as the module progresses, let the students go and explore. Let them outside, and have them report back what they find useful. Build this knowledge into the course structure, incorporating their thoughts and your feedback, so that content knowledge is developed in a shared way. It sounds Utopian, but I think there is something there for consideration.
Phipps, L, Cormier, D and Stiles, M (2008) Reflecting on the virtual learning systems – extinction or evolution?, Educational Developments, 9.2.