From tomorrow, 9th May, the 2014 Spring ConfChem begins. ConfChem is an online conference, and the theme of this one is “Flipped Lectures”. The conference abstract is below. Each week, two papers are discussed, and my paper “Student Engagement with Flipped Chemistry Lectures” is first up! Do join in with the conference over the coming weeks. You can find the conference papers and joining instructions at: http://confchem.ccce.divched.org/2014SpringConfChem
Across educational practice there has been significant attention on the flipped classroom, an innovative pedagogical method used by educators from K-12 through to college and university. There are many different approaches to implementing a flipped classroom: some educators pre-record lectures of themselves presenting material, others use screencasts to convey information to students before class in order to facilitate more peer-to-peer learning, and some teachers use a flipped classroom approach that does not involve videos at all. Ultimately, the shift in learning is focused on changing the classroom from passive to active.
The purpose of the symposium is to present papers on the flipped classroom and its development of flipped learning. Although some authors are invited to discuss the technical aspects of the flipped classroom, the focus of our symposium will be on how teachers use the face-to-face class time gained by moving away from a completely lecture-based classroom. Please join the discussion during this symposium as we explore the wide variety of approaches with the authors and other members of the chemical education and flipped classroom communities.
A question always likely to provoke a strong response is whether PowerPoint should be used in lectures. Those advocating its use point to a more organised lecture, where the structure has been thought out in advance. Those against it say that PowerPoint makes it too easy to put too much content into lectures and accompanying handouts. I don’t have a Yes/No opinion, because I think it depends very much on the person and what they do.
While raiding the archives of Education in Chemistry (EiC), I came across some old but interesting articles on the topic of lecturing. Before I was born, Alex Johnstone wrote an article, “Attention Breaks in Lectures”, in EiC. In it, he outlines a study he undertook in which student attention was monitored by observers in 90 chemistry lectures. Attention drops—doodling, looking around, yawning, chatting—were recorded. Interestingly, the course being taught was delivered twice in one day, so a comparison could be drawn between groups. Not surprisingly, the average performance of a group was found to correlate with the level of attention paid. The twelve lecturers giving the course varied in style. Lapses in attention were more common with lecturers who did not vary their style than with those who did, through activities such as models, experiments, problem-solving sessions, etc. The general pattern was that lapses in attention occurred at the start of the lecture and again about 10–18 minutes in, with further lapses over the duration of the lecture. Attention span dropped during the lecture, so that by the end, attention span was 3–4 minutes.
When I was starting university, Johnstone wrote another article for EiC. In this article, “Lectures – a learning experience?”, Johnstone stated that the average lecturer delivers approximately 5000 words in a lecture, with a student recording about 500 of these. The article reports what students chose to write down and why they felt some information was more important. The study found that students recorded about 90% of what was on the blackboard, with inaccuracies more common for diagrams and equations. Lecturer corrections, demonstrations and examples of applications typically went unrecorded. Students’ note-taking styles did not vary, even if the lecturing style or content was different. Students themselves ranked lecturers in terms of effectiveness, and those marked “ineffective” tended to have a higher word count per lecture—they covered more but, from the perspective of the student, made less sense.
For me, the question is a lot bigger than “To PowerPoint or not”…
AH Johnstone and F Percival, Attention breaks in lectures, Education in Chemistry, 1976, 13, 49-50.
AH Johnstone and WY Su, Lectures – a learning experience? Education in Chemistry, 1994, 31, 75-76.
The magazine Education in Chemistry has launched a blog, and I am a guest contributor. My first article covers the topic of Flipped Lectures, and along with the information in the post itself, some really useful tips and interesting discussion points came up in the discussion. The first post begins below and you can continue it on the new blog itself!
As with most new blogs, I’ll begin with: “Hello world!” I’m delighted to be a contributor to the new Education in Chemistry blog and I’m looking forward to sharing ideas and hearing back from the chemistry education community. My own interests are based around the use of technology in education and the school to university transition, so I suppose that’s what I hope to share on this blog. I have a tendency to go on a bit, so I am imposing a 600-word limit on my posts, which is about the length of a cup of tea. That’s 97 words so far; go on, have that biscuit… continue reading
Last semester, I trialled a flipped lecture in my Year 2 module on Chemical Thermodynamics. This work will be presented at DIT’s Learning and Teaching Showcase in January, and an accompanying flyer is below. Overall, it has been a very positive experience, and as well as my own and student perceptions, there appears to be good evidence that there is a high level of engagement with the material in the classroom (which was the original motivation for introducing this method in place of the traditional lecture). I’m definitely going to keep going with this method for this module.
A previous post showed how to include hints and worked examples in quiz questions in the now old Blackboard, BB 8. With the new Blackboard 9, is this still possible?
It seems the hint option (i.e. an alert box) still works. By including the code shown below via the HTML editor option, a button is added to the question text:
<p>(This is the Question text) What is the answer to 1 + 2?</p>
<p><input type="button" value="Hint" onclick="alert('This is a hint')" /></p>
Note that the question text can be subsequently edited. The hint text “This is a hint” in the above code then appears as a popup alert (the box will resize to accommodate more text):
This allows for primitive hinting – it is difficult to include more detailed information in this template.
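One possible workaround for this last point – a sketch only, since Blackboard’s HTML editor may strip scripts or styles depending on how it is configured, and the id “hint1” here is arbitrary – is to toggle a hidden div instead of firing an alert, which allows a formatted hint of any length:

```html
<p>(This is the Question text) What is the answer to 1 + 2?</p>
<p><input type="button" value="Hint"
  onclick="document.getElementById('hint1').style.display = 'block';" /></p>
<!-- Hidden until the button is clicked; can hold formatted text of any length -->
<div id="hint1" style="display: none; border: 1px solid #999; padding: 6px;">
  <p>This is a longer hint, which can include <em>formatting</em>,
  images, or a link to a worked example.</p>
</div>
```

The same caveat as with the alert approach applies: test after saving, as the editor may rewrite the markup when the question is subsequently edited.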
A recent review by Kay gives a brief overview of video lectures which provides some useful information [see here for open access version]. Perhaps more interesting though are a list of questions at the end of the paper which Kay feels remain unanswered. I’ve listed them below, and thrown in a few of my own thoughts. I’d be interested in hearing what others think.
1. What is the optimum length for video podcasts and does that depend on the nature of content?
This is a common question, but I think it depends not on the nature of the content, but on what you want students to do with the content. Sitting watching a video for 10 minutes while doing nothing will seem like a very long time, but I think using a 10-minute video to work through a worksheet or other activity won’t feel as long…?
2. Are summaries more effective than full-lecture video podcasts?
McGarr has summarised podcasts as substitutional or supplemental, as often mentioned on this site. The question really is: is it worth repeating a passive learning experience in video form? Neither a summary nor a full lecture is likely to be much use if there isn’t an additional learning benefit to students in addressing a problem or using the material in some way.
3. Are worked-examples better addressed through the use of video podcasts than in lectures?
I am a big fan of worked examples, and I think they are useful learning and revision tools. I think this is one area where video podcasts have huge potential. In a lecture environment, students probably don’t have time both to work through the example and to process the discipline-specific material in each step. Moving this online (or at least providing additional worked examples online) helps in this regard, as students can work through them at their own pace.
4. Can administrative tasks be adequately addressed using video podcasts?
Bleurgh. Well maybe…? No… Bleurgh.
I have toyed with some induction material on vodcast… maybe there are other options?
5. Could video podcasts be used to give feedback to students?
David McGarvey at Keele spoke about this at the Irish Variety and gave some really concrete examples of how video feedback had an enormous impact on the quality of students’ final submission of work. I don’t know if there is much published on this.
6. What type of content or concepts are best suited to the video podcast format and is there still a role for audio podcasts?
Audio podcasts never seem to have taken off in science, and I think it is because of the nature of the content: equations, graphical material, etc. It’s interesting to note the recent work by Sweller on audio alone versus audio and visual, summarised here, which found a preference for the latter.
One thing I have often wondered about when considering videos/animations/audio files is that, unlike something written on a piece of paper, the information in a multimedia presentation moves on quite quickly. With paper, one can get a sense of the whole, see the sequence, and refer back quickly to what went before. With audio or video, information—often quite complex information—is presented at the pace of the speaker, and it takes a bit of effort to go back and review a segment (i.e. it can’t be done at a glance). Therefore, for technical topics like chemistry, is there an issue with using multimedia generally?
This idea was discussed in a recent paper by John Sweller and others.* They cite a raft of studies showing that animations do not have any beneficial impact on learning. In this study, they conducted two experiments. The context is that with animations, learners need to simultaneously remember what was just presented along with what is being presented. Depending on the pace, previous information may be forgotten, and cannot be recalled as easily as static graphics on a piece of paper. Segmentation and user control have previously been shown to aid novice learners. Effective segmentation means that the amount of information in any segment is within working memory limits.
The first experiment showed children an origami task – it was important that the task included technical elements, since everyday video (watching TV, for example) involves long sections of content that can be easily processed; the difficulty comes with the inclusion of technical elements. The task involved 24 steps. Students were shown either video or a series of static images, either in short segments or as a continuous presentation. The post-test scores showed that students who watched animations scored better than those who had static images, but only if the animation was in short sections.
The second experiment considered the length of verbal statements. Students were given instruction on how to read a temperature–time graph, along with five worked examples. The information was presented to four groups as: (1) longer audio text; (2) longer visual text; (3) shorter audio text; (4) shorter visual text. Each presentation lasted 330 seconds, but the amount of explanatory text on the slides differed (long vs short visual text), and in the audio-only cases the explanatory text was removed from the slides and presented as long or short audio segments. After instruction, students were given a post-test. The scores showed that the longer visual text was preferable to the longer audio text (a reverse modality effect), while the shorter audio was preferable to the longer audio. These results demonstrate that short spoken statements can be easily held in auditory working memory, allowing visual memory to process the graphics on the slide. Written information crowds the visual working memory space, reducing capacity to process, but long auditory information is also difficult to process.
In terms of designing e-resources, these experiments suggest that animations can have a beneficial impact on learning, but should be presented in short segments to novice learners, to allow time to process. Audio commentary is beneficial, but again short segments are more useful.
With regards to animations, I think interactivity is important, as it allows the user to click through at their own pace rather than just watching passively. Perhaps table of contents listing slides might help with audio statements, leaving it possible for users to click back on a slide they wish to revisit. But it is a pertinent reminder not to go off on long audio meanders as is our wont.
*A Wong, W Leahy, N Marcus, J Sweller, Cognitive Load Theory, the transient information effect and e-learning, Learning and Instruction, 2012, 22, 449-475.
The College of Sciences and Health is hosting a series of lunchtime sessions to disseminate examples of learning, teaching and assessment methods being employed on programmes across the College. These events are designed to give colleagues an opportunity to learn from each other, and to share examples of practice collected through the annual monitoring process.
On Wednesday, 11th December 2013, 1:15pm to 2:45pm:
1:15 – 1:45 – Steve Meaney, School of Biological Sciences, Broadcast Video
1:45 – 2:15 – Bryan Duggan, School of Computing, Free tools for recording video
2:15 – 2:45 – Michael Seery, School of Chemical and Pharmaceutical Sciences, Video to Support Lecture and Lab Teaching
I attended a useful workshop today on the Blackboard Retention Centre, which I was aware of before, but never really looked at. However, it looks like it could be quite useful.
Perhaps the reason it doesn’t look immediately useful is that the initial options set up automatically by the retention centre are too broad. When I first noticed the alerts, I had about 750. If I have two groups of students doing labs at different times, but am too lazy to set up groups, then half the class flags an alert.
Therefore, my first task will be to turn off all the alerts and customise new, much more specific ones.
What do I want? After the first week of flipping, a handful of students didn’t access the screencast or do the quiz. I trawled through the data, identified these students, and sent them all an email. The next week, uptake was better, so this strategy worked. It was time-consuming though, so the first rule I create in the retention centre will be to flag any student who doesn’t do the specific weekly quiz each week. I don’t think I can flag whether a student hasn’t accessed the weekly video, but I can flag any student who hasn’t logged in in the previous seven days. Such students obviously can’t have watched the video or done the quiz. From a pastoral point of view, flagging students who score below average in the quiz means I can contact them and see where they need help.
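For illustration, the three rules above amount to simple filters over student activity data. The sketch below applies them to a hypothetical export (the field names are invented for the example; this is not Blackboard’s own API):

```javascript
// Hypothetical weekly export of student activity (invented field names).
const students = [
  { name: "A", lastLoginDaysAgo: 2,  didWeeklyQuiz: true,  quizScore: 8 },
  { name: "B", lastLoginDaysAgo: 10, didWeeklyQuiz: false, quizScore: 0 },
  { name: "C", lastLoginDaysAgo: 1,  didWeeklyQuiz: true,  quizScore: 3 },
];

const classAverage =
  students.reduce((sum, s) => sum + s.quizScore, 0) / students.length;

// Rule 1: flag anyone who hasn't done this week's quiz.
const missedQuiz = students.filter((s) => !s.didWeeklyQuiz);

// Rule 2: flag anyone who hasn't logged in for seven days.
const notLoggedIn = students.filter((s) => s.lastLoginDaysAgo > 7);

// Rule 3 (pastoral): flag anyone scoring below the class average.
const belowAverage = students.filter((s) => s.quizScore < classAverage);

console.log(missedQuiz.map((s) => s.name));   // [ 'B' ]
console.log(notLoggedIn.map((s) => s.name));  // [ 'B' ]
console.log(belowAverage.map((s) => s.name)); // [ 'B', 'C' ]
```

Each filtered list corresponds to one group of students to contact, which is essentially what the retention centre’s rule categories automate.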
The nice thing is that the rules we create bundle students together into particular categories, and with a click of the mouse you can email that group. It looks like it will be a very efficient process.
We also considered a way of uploading the spreadsheet of in-class attendance, and including that in the retention alerts.
All in all, a very useful workshop! The video with introductory details about the retention centre is here:
Educational innovations are a bit like political parties; people tend to advocate them passionately or dismiss them disparagingly, with both groups relying on legacy rather than evidence for their views.
This week, the flipped classroom hit the headlines, with a USA Today article that presented a study of flipping the “STEM classroom”. The preliminary results found that flipping the classroom made no difference to problem solving, attitude to learning, or exam performance. It was remiss of the journalist not to pick up what others later did: that the staff–student ratio at the institution in question was 1:9, and therefore any innovation is likely to struggle to improve results (Edit: the authors of the study have since stated that the USA Today article does not accurately reflect their opinion). In contrast, the New York Times also carried a piece this week which was a more comprehensive, largely positive, opinion piece on the flipped classroom at school level.
Whatever your view, it can only be a good thing that teaching and teaching methods are being openly discussed. The problem for advocates of the flipped classroom is that there is little evidence so far on its effectiveness, and I think some more passionate advocates may actually be doing the innovation a disservice by promising more than can be delivered, often without any real evidence. Nevertheless, one can’t sit behind the lectern waiting for the evidence to arrive before changing from the status quo. It is difficult in education to amass this evidence without trial and error, so trials are necessary.
Thanks to the articles above, I joined the Flipped Classroom Ning this week, and I am impressed by the enthusiasm of a lot of educators wanting to improve their students’ learning. There is a growing chorus of people who have tried it, like I have, and see its potential, like I do. I imagine the coming year will be an interesting one in the education literature.
What do students think of flipped lecturing? On to this week’s Journal Club:
#7: J. D. Smith, Student attitudes toward flipping the general chemistry classroom, Chemistry Education Research and Practice, 2013, 14, 607-614. (free to access)
Smith created 200 mini-lectures ranging in length from 1:08 to 17:02, with an average length of 7:10. The list of lecture titles is in the supplementary information. Over the two-year trial, two different pre-lecture assessments were used. In the first year, students were given questions to help them gauge their own level of understanding, with no grade; take-up was low. In the second year, clickers were used in the classroom (after initial clarifying questions) to gauge understanding of pre-lecture content. Then followed in-class work: problem solving, discussion, teasing out issues and difficulties. The author writes:
Generally, much more time was available for explanation, interaction, and conveyance of insight than had been in the past.
The students were surveyed on their experience of the flipped classroom model. They agreed (97%) that the online material was useful, and didn’t indicate any preference between streaming and downloadable content. Some students wished to be able to annotate lectures. Students felt the length of lectures was appropriate, but didn’t think they should be any longer. About half the students felt that shifting the work outside the class was a burden to them, but agreed that this made the class time less boring/more engaging, and more useful to them. Students watched the pre-recorded lectures on average three times, using them for class preparation, assisting with homework, and preparing for tests. Students found the lecture explanations more useful than the textbook explanations. In class, students were neutral on whether the in-class questions aided them to gauge their own understanding, but felt the in-class problem solving helped them prepare for homework and tests.
I like this paper because it is a reflection of what really happens when most of us try out something new (we don’t all have $200,000 grants to pilot something). Here, an educator has done a huge amount of work in preparing a suite of materials for his students to use, incorporated it into the flipped class model, and made significant attempts to see how it went. I find the student responses encouraging. It also highlights to me how important it is to consider how the time in class is used.
Have you tried, or are you considering trying, flipped teaching? If so, what do you hope to achieve? If not, why not?
What kind of topics do you think lend themselves to the flipped class model?
Have you any thoughts on what can be done in the class hour (specifically)?
My work on enabling students to prepare for lectures has gathered some momentum again this year with the awarding of a Teaching Fellowship for a project involving second year students.
Most work to date has involved Year 1 students, focussing on introducing core concepts in advance of a lecture. These pre-lecture activities are probably best described in an Education in Chemistry article previously published.
This new project extends the concept to second years, and expands the amount of information presented in advance of the lecture. The idea is that by providing much of the “content delivery” of the lecture in advance, the lecture hour can be devoted to more in-depth discussion, problem solving, etc. As well as development of the material, a formal evaluation will be conducted, as outlined below. The project aims to evaluate the impact of the “inverted lecture” or “flipped lecture”, for which there is currently very scant literature.
There’s a lot to consider in implementing this study. I ran a pilot last year to identify what issues came up in the implementation phase. Some positive observations came through; students liked having control of the learning materials and engaging with them at their own pace. There was a good level of in-class activity. Giving students gapped notes or something to complete while watching the pre-lecture activities helped them focus on extracting and organising relevant information from the pre-lecture videos, rather than viewing passively. On the downside, administration problems last year meant that many students weren’t registered in Week 1 (or indeed several weeks in), so material had to be simultaneously loaded on an open-access platform. As well as being a hassle, this meant I couldn’t monitor access statistics. This year, the module is delayed until mid-semester. While I could gauge the “buzz” in a lecture last year as students were working through problems, I had no real idea of how each student was progressing until the mid-semester test. This year, I am hoping clickers or after-lecture quizzes will highlight problems as we go from week to week.
The evaluation element aims to study students’ cognitive engagement in the lecture. They will be “interrupted” as they work through a problem and asked four short questions drawn from another study, which validated this instrument as a measure of cognitive engagement (more details on the instrument itself will be in a future post). I wish to show that as students work through their in-lecture tasks, having watched the pre-lecture videos, they are cognitively engaged with the material and task at hand. This information will be coupled with access data for the resources, quiz scores, and student interviews to build up a profile of how the flipped lecture works for middle-stage undergraduate students. I also wish to develop a “How To” pack for lecturers considering implementing a similar strategy in their own teaching (assuming the study shows it is worthwhile).
It’s hard to believe we are at the end of another academic year. It doesn’t seem long ago since I was welcoming new first years in and giving my final year induction talk to the incoming anxious, but eager fourth years. But here we are already in mid-June, which also means the end of my Teaching Fellowship on pre-lecture resources.
Looking back on the year, I’m incredibly proud of the work on pre-lecture resources. While the techy bits indulge my nerd-side, their impact on my teaching style will be long lasting. The initial excitement of monitoring their impact quantitatively on student grades was encouraging, but a more influential output is that my concept of what a lecture can be is evolving. I am becoming more and more certain about the role of discussion in class, which the pre-lectures facilitated. Things as elaborate as problem-based learning and as simple as “think-pair-share” all have discussion at their core. I’ve tried with my Learny-Teachy hat on in the past to get students engaged and interacting with me as a lecturer; but almost by accident, the pre-lecture resources got them interacting with each other. Since observing this and the positive impact in the classroom, it’s something that is going to be embedded in my teaching method in the future.
The project is formally finished, although it will of course be tweaked and adjusted for use next year. We, in the Chemistry Education Research Team, are moving on to an exciting new project this month. We’re going back to our roots. Our first collaborative project was on context-based laboratory mini-projects (you can read the paper here), and we are returning to that theme now to develop a suite of context-based laboratory and lecture resources and e-resources, supported by the Royal Society of Chemistry. It’s a big, quite ambitious project, and we are starting into a busy summer working on it. But like all of these things, lessons formal and informal will trickle into our own experiences as educators, and ultimately into student learning experiences. I’m looking forward to getting stuck in.
This post aims to summarise some literature on the use of worked examples in the teaching of problem solving in chemistry. Crippen, drawing from the work of Sweller and others, has summarised worked examples as follows (taken from Crippen, 2010, below):
Worked examples are sample problems which have already been solved and provide the learner with a model representation about how to think through complex items (Mwangi & Sweller, 1998). They are intentionally similar in content and structure to the quiz items under consideration for the current study. Worked examples are not scripted, but provide the learner with a knowledge base to understand concepts by demonstrating the necessary steps taken to arrive at a defined solution. They are an especially effective technique for increasing the problem solving skills of novice learners (Kalyuga, Chandler, Tuovinen, & Sweller, 2001), but can also assist in the same way with a more comprehensive audience of learners (Ward & Sweller, 1990). Worked examples also provide an efficient use of limited cognitive resources needed for schema acquisition preferable to means-ends analysis problem solving methods (Sweller, 1988). Since worked examples are opened only when prompted by the user (learner), we consider this action a self-regulatory behavior.
A Valid and Reliable Instrument for Cognitive Complexity Rating Assignment of Chemistry Exam Items, K Knaus, K Murphy, A Bleckling, T Holme, Journal of Chemical Education, 2011, 88, 554-560. DOI
While not about worked examples per se, this paper ties in to the area in two important respects – in considering cognitive load of questions in general and in considering the cognitive components of a question, and attributing a load or rating to these.
An example of how a problem could be rated is provided; it considers the concepts and skills and their relative difficulty, and the level of interactivity between the concepts and skills. The rating is found by using a rubric to score the values obtained from this analysis.
Correlation was found between the rating of questions and student performance (r = .498), and between the rating of questions and student mental effort ratings (r = .492).
Applying cognitive theory to chemistry instruction: the case for worked examples, KJ Crippen and DW Brooks, Chemistry Education Research and Practice, 2009, 10, 35 – 41. DOI
Perspective of literature around key concepts in feedback, worked examples, scaffolding, etc based on cognitive load theory applied to chemistry.
Argues that “Instruction that places a heavy emphasis on learning from open-ended problems, those often touted as motivational, is inefficient and ineffective because of its cognitive resource requirements. Instruction that requires students to learn from and interact with structured worked examples of closed-ended problems is consistent with contemporary models of human learning and produces efficient and effective results.”
Distinguishes between engaging in an activity and deliberate practice in learning how to do an activity. In the former, there is no (cognitive) room for criticism, evaluation of feedback, etc. In the latter, “the situation is constructed specifically so that one can allocate energy for practice and dealing with feedback, especially disconfirming feedback.”
Defines worked examples as those which contain “a) a problem formulation, b) solution steps and c) the final solution”.
Impact of web-based worked examples and self-explanation on performance, problem-solving, and self-efficacy, KJ Crippen and BL Earl, Computers and Education, 2007, 49, 809. DOI
This paper describes the use of worked examples and self-explanation prompts to improve problem-solving capacity in chemistry. Worked examples are used in the context of cognitive load theory, as they can reduce the cognitive load in solving new problems, allowing learners to focus on problem solving. The paper distinguishes between worked-example instruction for inexperienced learners and problem-solving practice for experienced students. The work is based on a prior study which demonstrated that students make extensive use of worked examples and self-explanation prompts and find them helpful.
A quiz is available to students on a weekly basis matching their lecture content, with correct and incorrect results and a grade provided at the end of each week. The control group were provided with the quiz questions only; the experimental groups were additionally provided with three worked examples (designed in accordance with literature guidelines) and/or self-explanation prompts.
The results were limited by the small sample size, but showed little difference in exam score between students who were provided with worked examples and those in the control group. The authors suggest that worked examples may be of use in questions that are well structured. The addition of a self-explanation prompt did result in an improvement in score relative to the control group, and the authors argue that this additional component provides students with context for interpreting examples and activating learning strategies.
The effects of feedback protocol on self-regulated learning in a web-based worked example learning environment, Computers and Education, 2010, 55, 1470. DOI
Subsequent paper to that described above (2007)
Aims to examine worked example feedback protocols that will best enhance achievement and motivation
Describes literature on the different types of learners: (1) aim is to successfully learn how to complete a task; (2) aim is to avoid misunderstanding or making an error; (3) aim is to outperform others; (4) aim is to avoid embarrassment compared to others. The results are discussed in the context of these learner types.
As above, learners were presented with questions and the option of selecting a worked example/self-explanation prompt.
Feedback protocols varied between different groups of the sample of 184 students: students received either their quiz score and the class average (norm-referenced) or their quiz score compared to their cumulative attempts (self-referenced).
Results were inconclusive, and the authors propose a mixture of norm- and self-referenced feedback (e.g. allowing students to toggle between the two).
A subsequent paper on this topic is: Scaffolding motivation through the use of worked examples, Journal of Interactive Learning Research, in press.
The process was quite an innovative one, and I actually got quite a lot out of it. Submissions for consideration for the award could not be more than five minutes long. For mine, I made an Articulate presentation that modelled the resources the students see, to try to give a sense of how students would have interacted with them. The submission was required to show how the idea was innovative, how it was implemented and how it worked.
Short-listed applicants were then invited to the Helix two weeks ago, where we started off with an “innovation speed-dating” session, meeting each of the five judges in turn and explaining the concept to them. I was a little bit cynical about this, but the calibre of the judges was extremely impressive, and I found myself wishing I had spent more time preparing for it! It was a really great way for the judges to find out more about the idea, as each one had their own angle of interest. The judges were introduced to us at the start of the final, and included a representative from academia (at the coalface), a representative from ILTA (pedagogy-technology), a representative from industry (innovation), a representative from the HEA (policy) and last year’s winner (standard) – and for each of these I considered the term in brackets to be the bit I would focus on in that speed-date session. This meant that it was possible to cover a lot of information about the innovation in a short amount of time (35 minutes). I got a lot out of this session: in discussing the idea with the judges, they prompted a lot of ideas from their perspectives that I hadn’t considered.
The speed-dating was followed by an interview with the entire panel, which focussed on the core concepts – why was the resource introduced, how did it go, what did I learn and what would I change. Again, some useful things came out of this – most importantly, that the pre-final resources might be more useful for dissemination than the finished ones, as individuals could tailor them to their own situations. This is something I plan to do now, which I never would have thought of.
The other great output was talking to the other finalists. They had some really great, impressive innovations, and all shared a passion for considering how best to use technology effectively in our teaching and learning. I got so many ideas just from talking to them and having a look at their work, which I think will be summarised on Jennifer Burke’s website soon.
All in all, a very positive day! Many thanks go to the organisers at ILTA for what was obviously a huge amount of work behind the scenes, and to the judges for giving up their time. I wholly recommend that anyone considering applying for next year’s award do so – the process is a very beneficial one in terms of personal development.
I love my Sony camcorder, but unfortunately the video output it produces is in MPEG-2 format. This will play in something like Windows Media Player, but you can’t import it into Camtasia Studio. This video shows the workflow I go through to convert imported videos using Any Video Converter, and the subsequent sound/picture editing that might be considered depending on your scenario.
I’ve had quite a few discussions about how to integrate SCORM resources generated through Articulate Presenter with Blackboard, so this is a video summarising what I do. I’m not an expert, so the advice here is based on trial and error – please feel free to correct me! Best to watch in 720 HD.
I’ll be giving a webinar as part of the fantastic Sligo IT webinar series this Wednesday at lunchtime. You can register and find out more here: http://www.eventbrite.com/event/1135441135. The webinar will cover some of the work I’ve done during my Teaching Fellowship in the area of pre-lecture resources. It’ll be my first webinar – I’m quite nervous about it, but looking forward to the instant interaction with the audience as I give the talk!
This presentation will outline the use of online pre-lecture resources to support in-lecture material. The design rationale is to develop a cyclical approach between online resources and lectures, so that the two are mutually dependent. The aim of the resources is to introduce students to some key ideas and terminology prior to the lecture, so that their working memory during the lecture can focus on application and integration, rather than on familiarisation with new terminology. The resources had a short quiz associated with them, which was linked to the gradebook in the VLE. Some design principles behind developing these (and any) e-learning resources will be presented, along with the implementation strategy and some analysis of the effect of using these resources with my own students.
Much of my study on educational research this year has focussed on pre-lecture resources, working with Dr Roisin Donnelly at DIT’s Learning Teaching and Technology Centre and my colleague Dr Claire Mc Donnell. I’ve turned into something of an evangelist for pre-lecture resources, so in order to spread the good word, I have prepared this resource guide for others thinking of using a similar strategy. I’d love to hear from anyone who has considered this approach or is using something similar. The guide accompanies a presentation at the 12th Annual Showcase of Learning and Teaching Innovations, DIT, Jan 2011. Click on the image to access “Using Pre-Lecture Resources in your teaching”.
Prompted by my visit to Edinburgh next week for the “More Effective Lectures” workshop, I have compiled several blog posts and bits and pieces of other writing into a resource pack that I hope might be useful to other practitioners, entitled “Podcasting and screencasting for supporting lectures”. The resource is a PDF file and is available at this link: Podcasting and Screencasting for Supporting Lectures, or click on the image below. The resource covers:
Introduction to the use of podcasts/screencasts in education
Overview of the design of e-resources
Tips for preparing podcasts and screencasts
Tools of the trade: Audacity, Camtasia and Articulate