Sensemaking – #ascilite

Live blogging of a workshop run by Associate Professor Gregor Kennedy – early work from MM mentioned audit trails, something that Reeves and Hedberg (2003) criticise as being hard to impossible to interpret without the students themselves explaining.

This was talked about as early skepticism, which disappeared with the arrival of big data.

Comment: But perhaps the skepticism has just been swamped by the fad.

A fair bit of time on workshop activities

How are learning analytics used

In order of prevalence

  • Detect at-risk students – the majority here
  • Teaching and learning research and evaluation
  • Student feedback for adaptive learning
  • Track students’ skills development within curricula

Sensemaking – fundamental issues

Process of analytics: measure, parse, analyse, interpret, report

Note: This treats analytics as simply information, not as a foundation for action as suggested by IRAC.

Each of the steps requires decisions to be made (sketched in code below):

  • metric selection
  • granularity of analysis
  • analysis sophistication
  • meaning making – behaviour != cognition
  • timely representation
  • provision to multiple audiences
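
To make this concrete for myself, here is a minimal sketch (my own illustration, not from the workshop) of the measure/parse/analyse/interpret/report process, with a couple of the decision points – metric, granularity, audience – surfaced as parameters. The log format, field names and thresholds are all hypothetical.

```python
# Sketch only: hypothetical log format, metrics and thresholds.
from collections import Counter
from datetime import datetime

# "Measure": raw clickstream events as an LMS might log them.
raw_log = [
    "2013-12-01T09:00:01,alice,forum_view",
    "2013-12-01T09:05:12,alice,quiz_attempt",
    "2013-12-01T10:15:40,bob,forum_view",
]

def parse(lines):
    """Parse: turn raw log lines into structured events."""
    events = []
    for line in lines:
        timestamp, student, action = line.split(",")
        events.append({"time": datetime.fromisoformat(timestamp),
                       "student": student,
                       "action": action})
    return events

def analyse(events, metric="count", granularity="student"):
    """Analyse: the decisions are which metric and at what granularity.
    Only a simple event count is implemented in this sketch."""
    key = "student" if granularity == "student" else "action"
    return Counter(e[key] for e in events)

def interpret(counts, at_risk_threshold=2):
    """Interpret: a crude, made-up rule flagging low-activity students."""
    return {student: ("at risk?" if n < at_risk_threshold else "ok")
            for student, n in counts.items()}

def report(flags, audience="teacher"):
    """Report: the decision is which audience sees which representation."""
    for student, status in flags.items():
        print(f"[{audience}] {student}: {status}")

report(interpret(analyse(parse(raw_log))))
```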

Behaviour != cognition

Basic level analytics data record the behavioural responses of users

Some – free-text responses – can have a cognitive component

Otherwise, the cognitive component is absent.

Thus easy to answer what, but not why.

metric selection

Dashboard views provide an aggregated student or class view. The aggregation is done in a way that may or may not be known to the viewer.

Typical metrics

  • How many times did they do something
  • How much time did they spend.
  • Some sort of standardised score – assessment etc.
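
A rough sketch (mine, not the presenter’s) of computing the three typical metrics just listed, over hypothetical per-student records. Uses only the Python standard library; the numbers are made up.

```python
# Sketch only: records and values are made up.
from statistics import mean, pstdev

# (student, number of logins, minutes online, assessment mark)
records = [
    ("alice", 14, 320, 72),
    ("bob",    3,  45, 58),
    ("carol",  9, 210, 81),
]

# 1. How many times did they do something (here: logins).
counts = {student: logins for student, logins, _, _ in records}

# 2. How much time did they spend (here: minutes online).
time_spent = {student: minutes for student, _, minutes, _ in records}

# 3. Some sort of standardised score – here a z-score of the assessment mark.
marks = [mark for *_, mark in records]
mu, sigma = mean(marks), pstdev(marks)
z_scores = {student: round((mark - mu) / sigma, 2)
            for student, _, _, mark in records}

print(counts, time_spent, z_scores)
```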

granularity

At what level do you collect and analyse data

  • Every click
  • Key components of a task – particular aspects specific to a learning task
  • Key aspects of your online subject
  • Key aspects of your online course

Top down and bottom up

Computer science – bottom up – data mining for meaningful patterns.

L&T folk – top down – pedagogical models and specific learning designs

Hard to do it only one way. Usually a combination of both is required.
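
A rough illustration (again mine, not from the workshop) of what the bottom-up, data-mining side can look like: cluster students on behavioural counts and see what patterns fall out, with the top-down pedagogical interpretation coming afterwards. Assumes numpy and scikit-learn are available; the students and counts are hypothetical.

```python
# Sketch only: feature names and counts are made up.
import numpy as np
from sklearn.cluster import KMeans

students = ["alice", "bob", "carol", "dave", "erin"]
# Columns: forum posts, quiz attempts, resource views.
features = np.array([
    [12, 5, 40],
    [ 1, 1,  6],
    [ 9, 4, 35],
    [ 0, 2,  8],
    [11, 6, 50],
])

# Bottom-up: let the algorithm find groupings in the behavioural data.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Top-down would then mean interpreting these clusters against a pedagogical
# model or the learning design, rather than taking them at face value.
for student, label in zip(students, labels):
    print(student, "-> cluster", label)
```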

The IMS white paper on learning measurement for analytics was identified as an example of someone starting to do both.

Note: this model might be useful for the 2009 extension paper.

The affordances of the tool also influence the analysis.

Has an iterative model of analysis, moving from the macro level down to the specific.
