The following is a summary of, and some reflection upon, a CETIS report (Harmelen and Workman, 2012) titled Analytics for Learning and Teaching. One of the reasons for reading it is current consideration of reworking and resubmitting an unsuccessful OLT application around learning analytics.

This report is the third in a series of reports on analytics produced by JISC CETIS, i.e. some UK folk who have been associated with e-learning/educational technology for some time. I'm only vaguely aware of the details of who and what they are, but they have made useful contributions in the past. Add to this a topic that is of current interest to me, and my hopes are high for this one.

Summary

I’m disappointed. I had thought this would engage with analytics for “learning and teaching”. Instead it’s a general overview of the institutional requirements and considerations for analytics, an overview that would have worked almost as well for a completely different industry sector. Sure, there was a lot of university-specific information, but most of it was at a management level. Very little actually dealt with analytics for learning and teaching.

In the conclusion section it’s stated

Most importantly, for learning analytics (as opposed to academic analytics) there is a strong link between pedagogy and analytics.

There’s almost nothing in the report that supports or links to this statement. Just after this quote it says

Any wise institution will examine where it wishes to move to pedagogically, and consider if any proposed learning analytics solution will hinder or enhance its pedagogic development.

A statement that scares me and seems to really represent the top-down, systems-based approach taken in the report. Is there anything worse for learning and teaching than a university deciding on an institutional pedagogical approach?

That said, I’ve got a couple of useful points and some additional literature references to follow up. So not wasted.

Executive summary

Starts by focusing on the use of analytics for learning and teaching, and not on the “optimisation of activities around learning and teaching, for example, student recruitment”.

Lists the following as exemplars

  • Identify at risk students so positive interventions can be taken.
    Which of course is something that appears to be the focus of analytics projects, but not at a learning and teaching level. It’s central divisions that are engaged with this.
  • Provide recommendations to students about activities.
  • Detect the need for and measure the results of pedagogic improvements.
  • Tailor course offerings.
    I wonder how much evidence they have of this?
  • Identify teachers who are performing well and those that need help.
    Mmm, wonder if this crosses a line or two.
  • Assist in the student recruitment process.
    Mmm, doesn’t this contradict the point above?

Conclusions the paper draws

  • UK higher ed under-uses learning and academic analytics.
  • Risks to the adoption of analytics: measurements that aren’t useful; staff and students lack the knowledge to use analytics.
  • Risk to institutions in delaying introduction of analytics.
    Positioned as the “most compelling risk to amortise”, which sounds a bit alarmist and faddish. A bit of FOMO?
  • Institutions vary in analytics readiness and maturity.
  • There are different scales of analytics projects.
  • Small limited-scope analytics projects are a good way to start – these enable institutions to develop staff skills and raise the profile of analytics.
    This perhaps could be an argument for the grant project?
  • There are commercial solutions that will work, but they should be evaluated.

At this stage, this is not necessarily sounding all that promising.

Overview: Analytics in education

Returns to a broader view of analytics: “we examine the use of analytics in education with a bias towards providing information that may help decision makers in thinking about analytics in their institutions.”

Does emphasise that “analytics exists as part of a socio-technical system where human decision-making and consequent actions are as much a part of any successful analytics solution as the technical components”.

Varieties of analytics

Quotes two useful definitions

Analytics is “the use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues.” [Bichsel 2012].
Others have a broader view of analytics. Davenport et al [2010] characterise analytics as answering questions that generate both information and insight (table 1)

Table 1 from Davenport et al follows

  • Information: What happened? (past); What is happening now? (present); What will happen? (future)
  • Insight: How and why did it happen? (past); What’s the next best action? (present); What’s the best/worst that can happen? (future)

The Bichsel (2012) quote is useful because it combines the range of data, statistical analysis and models, i.e. it explicitly includes the data. In learning analytics, and especially educational data mining, the focus tends to be on the fancy, whiz-bang algorithms that make great predictions. Simply providing students and staff with access to the data seems to be ignored. This is a topic for another post.

Okay, back to learning analytics

Learning analytics is the application of analytic techniques to analyse educational data, including data about learner and teacher activities, to identify patterns of behaviour and provide actionable information to improve learning and learning-related activities.

Interestingly, this is back to analysing the data; it doesn’t include simply providing the data.

Includes the Long and Siemens table. Repeats the exemplar list from above, including student recruitment.

Positions educational data mining as one of a range of techniques that can be applied to learning and academic analytics.

Has a Google Trends image showing how learning analytics has taken off from 2007 on, while academic analytics and educational data mining have stayed constant. This would appear to provide good evidence for “learning analytics” becoming a fashion/fad, i.e. the tendency for every university to talk about learning analytics as anything involving data, while researchers limit it to just data used for learning and teaching. Some research examining the use of the term over this time might reveal this.

The first reference to analytics in education is a 1995 student retention and performance experiment (Sanjeev and Zytkow, 1995). The 2007 explosion in interest stems from EDUCAUSE.

Anticipated trajectory for analytics in the UK

  1. Some early implementation.
  2. Poised on the verge of rapid growth in late 2012.
  3. Mainstream adoption within two to five years.
  4. Use of LA will provide a differentiator in student learning services.
  5. Catch-up by change-averse institutions.

Definitions

Apparently [van Barneveld et al 2012] is entirely on terminology.

Lists a wide variety of definitions of learning analytics. Some useful contrasts. I like the Diaz and Brown (2012) definition that is explained as 3 goals, the last of which is

Enables interventions and decision making about learning by instructors and students.

This strikes me as the most important. Ferguson’s (2012) segmentation approach based on the question the approach is trying to answer is somewhat similar.

Brown (2011) identifies the following component activities

  • Data collection.
  • Analysis.
  • Student learning.
  • Audience.
  • Interventions.

Illustrative examples

Covers four examples

  • Purdue University’s course signals project.
    Over three years: a 10% increase in A and B grades, and a 6.41% decrease in D and F grades and withdrawals.
  • Desire2Learn analytics.
    One LMS vendor’s approach to something like Course Signals. However, the point is made here that a single model like that in Course Signals doesn’t capture the full variety within an institution, so Desire2Learn use an ensemble technique with multiple predictive models (a rough sketch of the ensemble idea follows this list). Attention is paid to visualisation and providing actionable information.
  • An enrolment prediction system (this is recruitment again).
    Baylor University’s use of SAS Enterprise Miner for “predictive modelling applied to admissions, retention and fundraising”.
  • Recommendation engine for library books.
    The OU in the UK’s “Recommendations Improve the Search Experience” (RISE) project. A database of library users and the courses they are doing is used to provide recommendations through the library search facilities.
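
The report doesn’t describe how such an ensemble is built, so purely to make the idea concrete for myself, here’s a minimal sketch (assuming scikit-learn; the features and data are invented) where a few simple models are trained and their risk estimates averaged, rather than relying on a single institution-wide model.

```python
# Minimal sketch of the ensemble idea behind tools like Desire2Learn's analytics
# (not their actual implementation). Features and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Hypothetical per-student features: logins per week, quiz average, forum posts.
X = np.array([[12, 0.85, 9], [3, 0.40, 1], [7, 0.65, 4],
              [1, 0.30, 0], [10, 0.90, 6], [2, 0.55, 2]])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = failed or withdrew, 0 = passed

# Train several simple models rather than one institution-wide model.
models = [LogisticRegression(), DecisionTreeClassifier(max_depth=2), GaussianNB()]
for model in models:
    model.fit(X, y)

# Average the predicted probabilities into a single "at risk" score.
new_student = np.array([[4, 0.50, 1]])
risk = np.mean([m.predict_proba(new_student)[0, 1] for m in models])
print(f"Estimated risk for the new student: {risk:.2f}")
```

The averaging is the point of interest here: no single model has to fit every course or cohort.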

Uses

Examines various taxonomies/frameworks for understanding how analytics could be used. Most move beyond teaching and learning. Reports on some results of an EDUCAUSE survey. Then moves on to look at some specific examples.

Sensemaking

i.e. learning analytics as a way to understand connections, anticipate trajectories and act effectively. Draws on Klein et al (2006) and Siemens (2011), but doesn’t really give examples.

Identifying students at risk

Some approaches (not sure this is a useful distinction).

  • Causal (i.e. correlation exists) models – that identify links between certain data attributes and student performance.
  • Predictive models – using SNA to measure connectedness (a rough sketch follows this list).
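
The report doesn’t expand on the SNA option, but as a rough sketch of the connectedness idea (using networkx on an invented set of forum replies), something like degree centrality could be computed and then fed into a predictive model as one feature.

```python
# Rough sketch of measuring student "connectedness" from forum interactions
# with social network analysis; the edge list is invented for illustration.
import networkx as nx

# Each pair is a reply between two students in a discussion forum.
replies = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
           ("dave", "alice"), ("erin", "frank")]

G = nx.Graph()
G.add_edges_from(replies)

# Degree centrality as a crude connectedness measure; the least connected
# students could be flagged for a closer look.
centrality = nx.degree_centrality(G)
for student, score in sorted(centrality.items(), key=lambda kv: kv[1]):
    print(f"{student}: {score:.2f}")
```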

Driving pedagogic change

The multiplicity of uncontrolled variables makes it unlikely that analytics can decide which is the best pedagogy. However, it’s suggested that analytics can drive pedagogic change:

  • Decide the pedagogy you want to use.
  • Consider how learning analytics can support that pedagogy.
    Given that most e-learning systems are not designed with any particular pedagogy in mind, I wonder how possible this is.

Illustrates this with a table from Greller and Drachsler (2012). Must follow up on this.

Use of activity data

Mostly focuses on resource management. Where’s the learning and teaching?

Adoption

Describes various success factors and essentially boils down to committed leaders, skilled staff and flexible technology.

Investment

Quotes figures from the RISE project (£68,000) and surveys from the US (around US$400,000 p.a.).

Adoption process

A cycle from Siemens? (A toy sketch of how the stages might chain together follows the list.)

  • Collection of data guided by purpose.
  • Storage of data.
  • Cleaning of data.
  • Integration of data into coherent data sets.
  • Analysis.
  • Reporting and visualisation.
  • Actions enabled.
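
To make the cycle concrete for myself, here’s a toy sketch of how the stages might chain together. This isn’t from the report; storage and integration are skipped, and the data, threshold and function names are all invented.

```python
# Toy illustration of the cycle: each stage is a small function so the flow
# from collection through to action is explicit. All data is invented.
import pandas as pd

def collect():
    # Collection of data guided by purpose (here: spotting inactive students).
    return pd.DataFrame({"student": ["alice", "bob", "carol", None],
                         "logins":  [15, 2, 8, 3]})

def clean(df):
    # Cleaning: drop records with missing identifiers.
    return df.dropna(subset=["student"])

def analyse(df):
    # Analysis: flag students whose login count is well below the mean.
    threshold = df["logins"].mean() * 0.5
    return df.assign(at_risk=df["logins"] < threshold)

def report(df):
    # Reporting/visualisation stand-in: print the flagged students.
    print(df[df["at_risk"]])

def act(df):
    # Actions enabled: e.g. queue an email nudge for each flagged student.
    return [f"email nudge to {s}" for s in df.loc[df["at_risk"], "student"]]

data = analyse(clean(collect()))
report(data)
print(act(data))
```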

Presents a general model for institutional analytics with linkages to institutional strategy.

However, this particular point was left out until another section

King et al [2012] describe how it was recognised that departmental acceptance of the early warning system would depend on departments being able to choose performance indicators suited to their needs. Indicators can be chosen by departmental staff; data choices are kept simple, where for each selectable indicator, staff members can see a title, a description and a settable severity level.
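
King et al don’t provide a schema, but as an illustrative guess (all field names are hypothetical), a department-selectable indicator might be as simple as:

```python
# Hypothetical sketch of a department-selectable performance indicator as
# described by King et al: a title, a description and a settable severity level.
from dataclasses import dataclass

@dataclass
class Indicator:
    title: str
    description: str
    severity: int          # settable by departmental staff, e.g. 1 (low) to 3 (high)
    enabled: bool = False  # departments choose which indicators apply to them

indicators = [
    Indicator("No VLE login", "Student has not logged in for 14 days", severity=2),
    Indicator("Missed submission", "Assignment not submitted by the due date", severity=3),
]

# A department switches on only the indicators suited to its needs.
indicators[1].enabled = True
active = [i.title for i in indicators if i.enabled]
print(active)
```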

Analytics maturity

Reports on Davenport and Harris’ (2007) five stages of organisational progress to full adoption of analytics, and Davenport et al’s (2010) five assets and capabilities for analytics.

I’m getting a little tired of all these neat frameworks and stage models.

Though I do like this sentiment

Again, this is very much the position advanced in this paper, that localised analytics advances awareness and builds experience and readiness to move on to greater levels of analytic maturity.

Human, social and managerial aspects of adoption

Returns to Davenport again. Mentions the need for people with analytics skills at numerous levels within the organisation…including “there may also be a need to foster skills within evidence-based pedagogy and the use of analytics to design courses, student interventions and other aspects of learning.”

Build or buy

This old furphy. Does anyone think it’s one or the other these days? You will need to do both.

Includes a table from Campbell and Oblinger (2007) identifying potential data sources. Almost all of which are around a course. Few are actually within it.

Goes on to talk about various commercial systems.

Risks

Develops a list of “several overlapping categories”

  • Risks that inappropriate measurements are revealed through analytics.
  • Analytics specific technical risks and more general technical risks (e.g. system failure).
  • Social and managerial risks (e.g. they don’t support it, or they privilege it over other decision making).
  • Human resources risks, including not having enough a) statistical and analytic capacity, b) implementation resource, c) understanding of the role and applicability of analytics.
  • Risks of students/staff not being able to interpret analytics.
  • Risk of ineffectual interventions.
  • Legal risks (e.g. data protection and FOI).
  • Risk of a “big brother” approach to education.
  • Delayed introduction leading to missed opportunities.

References

Harmelen, M. van, & Workman, D. (2012). Analytics for Learning and Teaching. CETIS Analytics Series (Vol. 1, pp. 1–40). Bolton: JISC CETIS.
