Finding out about Learning Analytics

I’m attending the JISC Learning Analytics network meeting (information), which is giving a good overview of the emerging development of learning analytics and its integration into higher education. Learning analytics aims to harness data about students’ interactions and engagement with a course – whatever can be measured – and use it in an intelligent way to inform and empower students about their own academic journey. Of course, one of the major questions being discussed here is: what data is relevant? This was something I explored when looking at developing a model to help tutors predict student performance and identify at-risk students (see CERP: 2009, 10, 227), but things have moved on, and the discipline of learning analytics now looks to automate much of the data gathering and provide sensible data reporting to both staff and individual students.

There was an interesting talk from Gary Tindell at the University of East London describing the rollout over time of a learning analytics platform, which might be of interest to others considering integrating one into their own institution. He identified five phases:

  • Phase 1: collecting data on student attendance via a swipe-card system. This data can be broken down by school, module, event and student. The university subsequently developed an attendance-reporting app (assuming app here means a web app). This app identifies students whose attendance falls below a 75% threshold and flags them for intervention via the student retention team. Unsurprisingly, there was a correlation between student attendance and module performance.
  • Phase 2: a student engagement app for personal tutors. This pulls together data on student attendance, module activity, use of the library, e-book activity, coursework submission, assessment profile etc., and aims to provide tutors with a broader profile of student engagement.
  • Phase 3: development of an app that integrates all this data and calculates a level of student engagement based on a weighting system, for identifying at-risk students (those at risk of leaving). The weightings can be changed depending on what is considered most important. It also allows students to see their level of engagement compared with their cohort.
  • Phase 4: research phase – the intention is to use data to inform the weightings applied in the student engagement app. Initial analysis found the highest correlations for attendance and average module marks. More interestingly, however, multiple regressions suggest all engagement measures are significant. They have developed a quadrant-based model that classifies students from low engagers to high engagers and provides an indicator of student performance. One of the key measures is previous student performance – but is that a student engagement measure?
  • Phase 5: currently in progress – developing three different sets of visualisations of student engagement:
  1. Compares an individual’s engagement with their UEL school and course
  2. Provides student with an indication of where they are located in terms of student engagement
  3. Provides an indication of the distance to travel for a student to be able to progress to another quadrant.
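The weighting system in Phase 3 and the quadrant model in Phase 4 might be sketched roughly as below. To be clear, the measure names, weights and cut-offs here are all my own illustrative assumptions, not UEL’s actual model:

```python
# Illustrative sketch of a weighted engagement score and a quadrant model.
# All measure names, weights and thresholds are invented for illustration;
# they are NOT the actual UEL weightings.

WEIGHTS = {  # tunable, depending on what is considered most important
    "attendance": 0.4,
    "vle_activity": 0.2,
    "library_use": 0.1,
    "ebook_activity": 0.1,
    "coursework_submission": 0.2,
}

def engagement_score(measures: dict) -> float:
    """Weighted sum of normalised (0-1) engagement measures."""
    return sum(WEIGHTS[k] * measures.get(k, 0.0) for k in WEIGHTS)

def quadrant(score: float, avg_mark: float,
             score_cut: float = 0.5, mark_cut: float = 50.0) -> str:
    """Place a student in one of four engagement/performance quadrants."""
    eng = "high engagement" if score >= score_cut else "low engagement"
    perf = "high performance" if avg_mark >= mark_cut else "low performance"
    return f"{eng} / {perf}"

# A hypothetical student record, each measure normalised to 0-1:
student = {"attendance": 0.6, "vle_activity": 0.8, "library_use": 0.2,
           "ebook_activity": 0.1, "coursework_submission": 1.0}
score = engagement_score(student)      # 0.63 with these weights
print(quadrant(score, avg_mark=62.0))  # high engagement / high performance
```

In a system like the one described, the regression results from Phase 4 would presumably feed back into the `WEIGHTS` dictionary, and the “distance to travel” visualisation in Phase 5 would be the gap between a student’s score and the nearest quadrant boundary.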

The next steps in the project aim to answer the following questions:

– Can we accurately predict student performance based on a metric?

– Can providing students with information on their level of engagement really change their study patterns?

It’s the last point that particularly interests me.