8th Variety in Irish Chemistry Teaching Meeting – DIT 10th May

The Chemistry Education Research Team wish to invite you to the 8th Variety in Irish Chemistry Teaching Meeting which will be held in DIT Kevin St on Thursday 10th May 2012. The meeting is sponsored by the RSC Education Division Ireland.

Programme and Call for Abstracts

The aim of the meeting is to allow those teaching chemistry at third level to share “what works” – useful ideas and effective practice from their own teaching.

The keynote speaker is Dr David McGarvey, University of Keele, who was the 2011 RSC Higher Education Teaching Award winner.

A call for abstracts is now open for short oral presentations (10 – 15 minutes) on any topic related to teaching and learning chemistry. The deadline for abstracts (150 words maximum) is April 5th 2012.

Attendance is free, but registration is required. Registration forms for those intending to attend/present can be downloaded here and should be submitted by April 5th 2012 by email to michael.seery@dit.ie

Workshop

An optional workshop will be held on Thursday morning (10.30 am – 12.30 pm) on “Using Technology in Chemistry Teaching and Learning” and will cover the following topics: “Podcasting and Screencasting”, “Using Wikis in Chemistry Education”, and “E-assessment”. The cost of the workshop is €10.

Class Sizes and Student Learning

A recent discussion on an ALT email list raised the interesting question of whether there was a threshold for class sizes, above which the student learning experience diminished. Unfortunately, what followed was lots of “in my experience” Higginbotham-esque replies (with the exception of details of an interesting internal survey at NUIG), despite the original query specifically requesting evidence-based information.

You up there—in the blue and white jumper—what do you think the answer is?

A clackety-clack into Google Scholar throws up some interesting results on this topic. Unsurprisingly, the general trend is that increasing class size diminishes students’ educational experience, although the effect appears to be fairly modest. There are two issues to consider: what is being measured to reflect something like “educational experience”, and which discipline is being considered.

What students think

In this regard, an interesting paper that caught my eye was one that considered the effect of class size across various disciplines (Cheng, 2011). This work dismisses student grades in favour of three evaluation scores derived from students: student learning, instructor recommendations, and course recommendations. Student learning was scored based on a student response to a 5-point Likert scale question, “I learned a great deal from this course”. (Many of you, including myself, may be tempted to run screaming for the hills at this point. What would students know?! Cheng does make the point that she is not saying that this measure is superior to student outcomes, just a different measure. She refers to Pike’s (1996) interesting paper on student self-reporting for a discussion on this. Also, Hamermesh’s paper (2005) is worth a read for the title alone—in short, good-looking lecturers get better ratings.)

Overall Data

Anyway, Cheng has amassed an impressive data set. “In total, the data span 24 departments, 2110 courses, 1914 instructors, and 10,357 observations from Fall 2004 to Spring 2009.” Before considering individual subjects, at an overall level Cheng found that each of her three ratings fell as class size increased (although the smallest class sizes received both the lowest and the highest marks). Cheng then used her data to generate a model predicting how student “learning” (measured as outlined above), instructor recommendations and course recommendations would change: for an increase of 50 in class size, these ratings would decrease by 1.4%, 1.3%, and 1.1% respectively. Of course, some disciplines will have smaller class sizes or may require more class-tutor interaction, so Cheng has drilled down into each discipline and determined whether it is negatively affected, positively affected, or indeterminately affected (i.e. mixed results).
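
To put the scale of these predicted effects in context, here is a minimal sketch of the arithmetic, assuming the per-50-student figures quoted above can simply be scaled linearly; that extrapolation is my own simplification for illustration, not Cheng’s model, and the names in the snippet are purely illustrative.

```python
# Illustrative sketch only: linearly scaling the per-50-student decreases
# quoted above (1.4%, 1.3%, 1.1%). Cheng's actual analysis is a regression
# model; this just shows what those headline figures imply.

DECREASE_PER_50_STUDENTS = {
    "student learning": 1.4,
    "instructor recommendation": 1.3,
    "course recommendation": 1.1,
}

def predicted_drop(extra_students: int) -> dict:
    """Predicted percentage drop in each rating for a given class-size increase."""
    return {rating: round(pct * extra_students / 50, 2)
            for rating, pct in DECREASE_PER_50_STUDENTS.items()}

# Example: growing a class by 100 students
print(predicted_drop(100))
# {'student learning': 2.8, 'instructor recommendation': 2.6, 'course recommendation': 2.2}
```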

Subject Specific

In the sciences, chemistry, biology, physics and maths were unaffected by increasing class size in this model, as were history, philosophy, and visual arts. Almost half of the disciplines surveyed were inconclusive, while some showed negative effects: some engineering disciplines, political science, and social science. No discipline benefited from increasing enrolment.

Chemistry

Cheng suggests that ratings in theoretical subjects such as the sciences may have a low correlation with class size, depending instead on other factors such as quality of instructor or student effort. While I think there are flaws, or at the very least limitations, to this study (as Cheng acknowledges), it does open up interesting questions. The one I am interested in is the culture of teaching chemistry, which is fiercely traditional. That the data suggest increasing class size would have little effect on the ratings measured here in a chemistry class implies that its teaching is still very much based on a teacher-centred philosophy. Clickers, anyone?

References

  • Cheng, D. A. Effects of class size on alternative educational outcomes across disciplines, Economics of Education Review, 2011, 30, 980–990.
  • Hamermesh, D., & Parker, A. Beauty in the classroom: Instructors’ pulchritude and putative pedagogical productivity, Economics of Education Review, 2005, 24, 369–376.
  • Pike, G. R. Limitations of using students’ self-reports of academic development as proxies for traditional achievement measures, Research in Higher Education, 1996, 37, 89–114.

E-learning (dis)traction

I think the start of my teaching career coincided with the rise of the VLE. Early on, I remember being told about these new learning environments and the array of tools that would help student learning. Encouraged, in the nicest possible way, to upload material and use the institution’s expensive new toy, many lecturers complied and uploaded course materials, support papers, practice questions and so on. In this ideal world, the students couldn’t have had more learning resources at their fingertips. Learning was going to happen.

In reality, this has not been the case. The DRHEA e-learning audit (2009) reveals some disappointing figures across the Dublin region. Students regularly log into their VLE, but mostly to access course materials (lecture notes). This makes VLEs a very expensive version of Dropbox or any other online repository.

This is also reflected in the UK. In my own subject (chemistry) and in physics, the Higher Education Academy Physical Sciences Centre review of student learning experience showed that e-learning came bottom of the pile when students were asked to say which teaching method was most effective and most enjoyable.

A Distraction

For most lecturers, e-learning is not part of their day to day practice, perhaps because of lack of confidence, probably because of lack of awareness. Mention e-learning, and the discussion quickly moves to whether to use PowerPoint and whether those notes should go online. There may also be subtle fears of replacement – that if learning can happen online, perhaps it can happen without lecturers at all! (Of course, anyone who has taught online knows the truth here!). And as the DRHEA survey shows, if academics engage with the VLE, it tends to be in the form of mimicking what they do in lectures, rather than supporting what is done in lectures.

Institutions, bless them, are concerned with e-learning from a perspective of usage and branding – how does their toy compare with the one next door? There have also been subtle and not-so-subtle undertones about how e-learning can provide cost savings in the future, which is a naive viewpoint. Institutions need to be protected from themselves. If, as a community, we don’t identify valuable uses of e-learning to incorporate into our practice, institutions will want to fill the vacuum, just as was done previously with pushing content online. Lecture capture, a spectacular waste of taxpayers’ money, is looming large and is already catching on in the UK. It looks good, makes for good PR and students “love” it. The fact that there is little or no evidence to show that it helps with learning is disregarded. As a community of educators, we should be concerned about this “innovation” being pushed on us [I recommend reading this for a fuller discussion of lecture capture].

Students, well bless them too. Students are clever, articulate, funny and they are our future. But they are also sometimes a bit stupid. Students will always want more – more notes online, more resources, more quizzes, more self-study questions, more, more and more! In the relaxed days several months before exams, they mean well and plan to engage with all of this material. But all the evidence points to the fact that students rarely engage with the material until it is too late, just before exams. At this stage, they find the nature of the content, often not even re-purposed for an online environment (a substitute for what they already have rather than a supplement to help them understand it), useless for their learning.

Finally, we have my very good friends, the learning development officers, who try various strategies, sometimes against all the odds, to assist lecturers in incorporating e-learning into their teaching. Locally, their help has been of great value to me, but reading about e-learning on blogs and on The Twitter Machine, there is a sense that the ideas and conversations within the learning development community do not reflect what is happening on the ground. There is perhaps a false sense of advancement, buffered from the great unwashed of PowerPoint debaters by early adopters and innovations in the literature. This can lead to a disconnect in language – acronyms, gadgets and tech jargon – which results in a lack of confidence among lecturers who may wish to change. The term “learning technologist” does not help, as it immediately imposes a (false) divide between learning and e-learning.

Gaining traction?

So, what to do? The high participation rates in VLEs indicate that this is a place where learning opportunities can be provided. Students are hungry to engage, if material is there. One of my favourite authors in the literature on e-learning for practitioners is Gilly Salmon (Gill-e-Salmon?). A core component of her approach is for practitioners to ask themselves: “What is the pedagogic rationale for implementing any proposed change?”. I think this is a very powerful position – it speaks in language all perspectives can understand, or at least appreciate (institutions, I am looking at you). Lecturers, identifying problems or issues in some teaching practices, can consider how to integrate a change, perhaps harnessing technology, into their teaching. Because there is a need, an underlying rationale even, the implementation has a value and a role to play in the module delivery. Lecturers may refer to it, and better still integrate it into their class work. Students are then presented with specific, often bespoke learning materials with the explicit purpose of supporting their learning at a particular stage of the module. Instead of just re-presenting lecture information all over again, there is a reason, at particular stages in the module, to interact with these resources – they have a value. Learning development officers can offer their considerable expertise in supporting lecturers in developing the resources, so that they are fit for purpose. And institutions are happy because students are happy and the access statistics look good.

In our own work here at DIT, we have enjoyed some success at the micro-level employing this approach – moving away from mass content upload (“shovelware”) towards specific learning resources tailored for and incorporated into specific modules. It takes time and is harder work, but the value of what is produced is greater for all.

Now, I feel better after that.

Academic Workload

Performance indicators developed for a business model won't sit well in an academic setting – unless we redefine what we want from an academic setting.

Continuing his Marshall Attack, Prof Higginbotham moved his bishop to b3. He pushed himself up from the large couch and stretched in front of the fire. Time for a lecture. Grabbing some notes from a small table beside the sherry, he went to the window of his ivory tower. A few students were already waiting below; he could see a few more scuttling across the quad. Late, as usual. He opened the window and cast the papers containing the day’s knowledge down to his charges, who eagerly caught them, some chasing the papers in the light breeze. Closing the window, he returned, leaving the one sheet he had held onto on the small table by the sherry, where Marjorie, the departmental secretary, would collect it and photocopy it for next year. He refilled his sherry glass, and moved to resume the game. What a busy day! It would be time for lunch soon.

Academic workload is a contentious topic at the moment, after the recent Public Accounts Committee grilling of the University Presidents, and their FAS moment. At one stage of this committee hearing, Roisin Shorthall TD asked how many hours an academic worked. I think the question of the number of hours is over-simplified. I can understand the need for accountability, but higher education is a very complex system, and a counting of hours does not reflect anything except the making of an easy bar chart. Face-time with students is one component (and unfortunately an ever-dwindling component) of what an academic actually does. The question should not be about the number of hours spent by an academic on the job – there are lots of studies to show it far exceeds a typical working week – but rather the quality of that time. There are two factors to consider here – the amount of time spent by academics on administration and the fallacy of requiring accountability.

Ned Costello (of the IUA) recently commented at a PAC hearing that a typical (university) lecturer spent roughly 40:40:20 on teaching:research:administration. Now it’s impossible to come up with a generic ratio like this, and even though Ned said something not very nice about us in the Institutes, I can imagine he was pressed for a number. 40-40-20 is always a good one, having a nice 4-4-2 ring to it that appeals to former sports-people turned commentators. The HE sector is top-heavy with administration, meaning there are more administrators than lecturers (imagine that happening in health – we’d go mad…). So as well as having more than half the staff on non-teaching duties, teaching staff are required to spend a chunk of their time on administration duties. The problem for practitioners on the ground is that this 20% is constantly pushing and bloating, reducing the time that can be spent on the jobs that academics should be doing – teaching (which involves formal lecturing, lab/workshop teaching and informal aspects not recorded but of equal or more value – talking with students, continuous assessment, feedback) and engaging in research/consultancy/etc.

This is especially so in the area of research. Most academics can talk of research grant applications that require lengthy descriptions telling the funding agency what results will be obtained, of research student log books kept to prove you met your students while the research was being completed, and of complicated research account statements to prove you spent the money wisely. Then there are the several forms and requirements for getting a student to a viva voce. The system is self-fuelling: administrators want their stamp on everything, so there are more forms to fill out, more time spent on administration and ultimately more administrators to keep track of the forms. On paper it makes sense – there is a nice paper trail and at the end of it all, it is “accountable”. But at what expense? Wouldn’t these hours be better spent on the research itself, rather than accounting for it? Taking autonomy away from people and institutions means that they will spend more time proving that they are doing the job they are meant to be doing rather than doing the job itself!

The process of requiring individuals to be accountable will ultimately result in them doing less work

This is the fallacy of requiring accountability in a type of employment which does not easily facilitate bean-counting. And while all of the above is annoying, it is small fry compared to what is coming our way. HE will be run on a business model where the expense per lecturer will be offset against the income – ultimately a cost-benefit analysis. There is an excellent article on this model in the Wall St Journal. While individual differences between income and expense may be explained locally (Lecturer X does a lot of in-class tutorials with small groups, hence the high “expense”), these numbers will ultimately be compiled by school, college and institution, where these local explanations are lost. The easiest way for institutions to be in the black will be to have very large classes and summative assessment, which goes against pretty much every good teaching-practice concept developed in the last thirty years. In practice, this will push more academics to formally record what they have been doing outside Roisin Shorthall’s supposed 15-hour workload – meeting students, correcting continuous assessment, discussing feedback, mentoring projects, preparing induction, professional development and so on [and on]. Institutions will eventually have to reject the recording of this time, as it will very quickly exceed 40 hours, so these activities will dwindle. In other words, the process of requiring individuals to be accountable will ultimately result in them doing less work. But at least we’ll know then how much work they do…

The simple solution is to ease off on over-monitoring. The system has internal, often informal checks. Students are quick (and right) to complain about lecturers who they feel are not delivering the goods. It’s impossible to conduct research effectively without publishing or presenting work to peers in your discipline, who will be far more critical than any of my very nice administration colleagues could be. In short, all this time spent saying how busy we are could be much better utilised. The increase in fees in the UK and the re-implementation of fees in Ireland will be the oxygen to the spark that is igniting around flexible models of education provision, where private companies are already way ahead of the starting line, ready to run. George Siemens recently wrote that “higher education is not in control of its fate as it has failed to develop the capacity to be self-reliant in times of change”. Instead of bean-counting the past, institutions should be pushing forward alternative means of provision so that they have the money to survive the future.

Photo Credit

http://online.wsj.com/article/SB10001424052748703735804575536322093520994.html

Why it’s time for @vonprond to go

Oh dear. I had to wipe up some porridge this morning as I read this week’s article in the Irish Times from DCU’s former president. The “PowerPoint or Not” story has made it to the top: it is his article of the week. The article, which is essentially based on a 2009 Chronicle of Higher Education piece and a press release from Southern Methodist University, offers some personal anecdotes and some general hand-waving about removing PowerPoint from lectures and just, you know, teaching naked. Yes, I said the Irish Times.

I’m sure DCU’s Learning Innovation Unit must be collectively weeping this morning. Their website is full of resources for different teaching methods and information about seminars, which I’m sure FvP was informed about, and probably used in extolling the virtues of teaching innovation at DCU during various speeches as President. He chose to ignore all that, and preferred to, you know, just teach, guys! I’ve found my Higginbotham – I was right about the unusual name. I’m not getting into the senseless PowerPoint-or-not debate, except to say that it doesn’t matter what type of football boots a Newcastle player wears; whether he is a good player or not boils down to a certain amount of innate talent and a lot of practice based on best training principles.

The bigger issue for me, though, is that this is still what passes for considered comment in the discourse on academic teaching quality at third level. Here we have someone who hasn’t taught for at least ten years advising others in his profession how to teach. He’s not basing this on any research or any evidence – just on the fact that it feels good, and, you know, he was a great lecturer. And while I would never usually say no to an article on teaching making the Irish Times, given the week Higher Education has had (it even got another President blogging again), I would have thought something on the workload of academics would be more timely.

I was surprised when Prof von Prondzynski’s column in the Irish Times returned in the autumn. While having someone in the job of University President lent interesting insight to some of his columns, now that he has finished in that position I don’t understand the decision to keep him on. Perhaps they thought he was a good bet for the Maynooth job? His column today really emphasises that he no longer has much to offer; like a retired Taoiseach, he no longer knows what goes on at Cabinet. He should take the other Drumcondra man’s lead, and bow out of Irish media gracefully. I wish him the best in his new position. And for God’s sake will someone call Tom Collins and see if he can do a column for next Tuesday.

University Rankings – Ireland Changes 2010

Say what you will about university rankings, they are used in media and political circles and, along with the recent OECD report, will provide an interesting context to the Hunt Report “debate”. I think it is naive to suggest that such large falls/gains mean a university is significantly better or worse than it was a year before – the large changes are more likely due to the change in methodology in the case of THE (who departed from QS and established a new ranking this year with Thomson Reuters): their weighting for staff-student ratio is down, the weighting for learning environment is up and the weighting on “subjective opinion polls” is significantly reduced. The 2010 and 2009 figures are listed below:

        QS/THE 2009    THE/Thomson Reuters 2010    QS 2010
DIT     326            347                         395
DCU     279            313                         330
UCD     89             94                          114
TCD     43             76                          52
NUIG    243            299                         232
UCC     207            243                         184
UL      401-500        >400                        451-500
NUIM    401-500        >400                        401-450

        QS 2010        QS/THE 2009    Change
DIT     395            326            -69
DCU     330            279            -51
UCD     114            89             -25
TCD     43             52             +9
NUIG    232            243            +11
UCC     184            207            +23
UL      451-500        401-500
NUIM    401-450        401-500

Source: THE/QS 2009, QS 2010: http://www.topuniversities.com/country-guides/ireland

See the THE article for details of their 2010 ranking methodology.
