What follows will eventually be a summary of my contribution to the ASCILITE’2017 panel titled “Are learning analytics leading us towards a utopian or dystopian future, and what can we as practitioners do to influence this?”. Below you’ll find a summary of my prediction and the argument that underpins it, suggestions for more reading, the slides, and references.

Argument

The argument made in the following is that Vision #3 from the LACE Visions for the Future of Learning Analytics captures a likely future for learning analytics in Australian Higher Education. That is,

In 2025, analytics are rarely used in education

I don’t necessarily agree with some of the features of this particular vision. However, it does include predictions of many problems, poor implementation, and limited use of learning analytics. Based on how Australian higher education currently approaches the use of digital technology in learning and teaching, this future appears likely (to me).

This is because the approach universities take to implementing learning analytics will focus on the development/adoption of institutional learning analytics tool(s) and on encouraging/managing the adoption of those tools within the institution. An approach that tends toward what Cavallo (2004) described as “explicitly top-down and hierarchical, and implicitly view[ing] education as a series of depersonalised, decontextualised steps carried out by willing, receptive, non-transforming agents” (p. 96). An approach that assumes there is one learning analytics tool that can be scaled across the institution. One size fits all.

It’s an approach that fails to engage effectively with what Gašević et al. (2015) describe as a significant tension between course (unit) specific models and general models. A tension that echoes the reusability paradox (Wiley, n.d.) in that general models “represent a cost effective &…efficient approach” (Gašević et al., 2015, p. 83), but at the cost of pedagogical value. In learning and teaching, one size does NOT fit all.

In terms of what can be done, the suggestion is to focus on an approach designed to help find the right size for each context. An approach that engages effectively with the significant tension of the reusability paradox and works toward maximising pedagogical value.

In terms of specifics, the following offers some early suggestions. First, avoid adopting a deficit model of teachers around both digital technology and learning and teaching. Adopt alternative ontological perspectives as the basis for planning. Focus on creating an environment and digital technology platforms that encourage the co-development of contextually specific, embedded and protean learning analytics interventions. Preferably, link this with activities that provide teaching staff with the opportunity to “experience powerful personal experiences” (Cavallo, 2004, p. 102) around how teaching as design, combined with learning analytics, can respond to their problems and desires.

In addition, the approach should focus on enabling and encouraging teacher DIY learning analytics. DIY learning analytics involves teachers customising learning analytics in different ways to fit their context. Not only is this a way to increase the pedagogical value of learning analytics, it may be the only way to achieve learning analytics at scale. As Gunn et al. (2005) write,

…only when the…end users of technology add their requirements, experience and professional practice that mainstream integration is achieved (p. 190)

More reading

Some suggestions for more reading include:

  • How to organise a child’s birthday party is a YouTube video sharing a learning story that examines how different perspectives influence how such a task is organised.
  • Cavallo (2004) makes the case for the limitations of the traditional approach in the context of schooling.
  • Gunn et al (2005) make the case that supporting teachers in repurposing learning objects is essential to ensuring adoption and sustainability of learning objects.
  • Jones and Clark (2014) outline the two different mindsets and illustrate the difference in the context of learning analytics.
  • Jones et al (2017) report on an example of teacher DIY learning analytics (originally described in Jones and Clark, 2014) and draw some implications for the institutional implementation of learning analytics.
  • Learning analytics, complex adaptive systems and meso-level practitioners: A way forward offers early plans for using an alternative ontology to address the question of learning analytics within higher education.

Slides

View the slides below or download the PowerPoint slides.

References

Cavallo, D. (2004). Models of growth – Towards fundamental change in learning environments. BT Technology Journal, 22(4), 96–112.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43–62). New York: Springer.

Gašević, D., Dawson, S., Rogers, T., & Gašević, D. (2015). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68–84. https://doi.org/10.1016/j.iheduc.2015.10.002

Gunn, C., Woodgate, S., & O’Grady, W. (2005). Repurposing learning objects: a sustainable alternative? ALT-J, 13(3), 189–200.

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054.

Tyack, D., & Cuban, L. (1995). Tinkering towards utopia: A century of public school reform. Cambridge, MA: Harvard University Press.

Wiley, D. (n.d.). The Reusability Paradox. Retrieved from http://cnx.org/content/m11898/latest/

Yoo, Y., Boland, R. J., Lyytinen, K., & Majchrzak, A. (2012). Organizing for Innovation in the Digitized World. Organization Science, 23(5), 1398–1408.