The following is the final version of a short paper that’s been accepted for ASCILITE 2013. It’s our first attempt to formulate and present the IRAC framework for analysing and designing learning analytics applications. This presentation from last week expands on the IRAC framework a little and touches on some of the future work.
David Jones
University of Southern Queensland
Colin Beer, Damien Clark
Office of Learning and Teaching
CQUniversity
Abstract
It is an unusual Australian university that is not currently expending time and resources in an attempt to harness learning analytics. This rush, like prior management fads, is likely to face significant challenges with adoption, let alone the more difficult challenge of translating possible insights from learning analytics into action that improves learning and teaching. This paper draws on a range of prior research to develop four questions – the IRAC framework – that can be used to improve the analysis and design of learning analytics tools and interventions. Use of the IRAC framework is illustrated through the analysis of three learning analytics tools currently under development. This analysis highlights how learning analytics projects tend to focus on limited understandings of only some aspects of the IRAC framework, and suggests that this narrow focus will limit their potential impact.
Keywords: learning analytics; IRAC; e-learning; EPSS; educational data mining; complex adaptive systems
Introduction
The adoption of learning analytics within Australian universities is trending towards a management fashion or fad. Given the wide array of challenges facing Australian higher education, the lure of evidence-based decision making has made the quest to implement some form of learning analytics “stunningly obvious” (Siemens & Long, 2011, p. 31). After all, learning analytics is increasingly being seen as “essential for penetrating the fog that has settled over much of higher education” (Siemens & Long, 2011, p. 40). The rush toward learning analytics is illustrated by its transition from not even a glimmer on the Australian and New Zealand higher education technology horizon in 2010 (Johnson, Smith, Levine, & Haywood, 2010) to predictions of its adoption in one year or less in 2012 (Johnson, Adams, & Cummins, 2012) and again in 2013 (Johnson et al., 2013). It is in situations like this – where an innovation has achieved a sufficiently high public profile – that the rush to join the bandwagon can swamp deliberative, mindful behaviour (Swanson & Ramiller, 2004). If institutions are going to successfully harness learning analytics to address the challenges facing the higher education sector, then it is important to move beyond slavish adoption of the latest fashion and aim for mindful innovation.
This paper describes the formulation and use of the IRAC framework as a tool to aid the mindful implementation of learning analytics. The IRAC framework consists of four broad categories of questions – Information, Representation, Affordances and Change – that can be used to scaffold analysis of the complex array of often competing considerations associated with the institutional implementation of learning analytics. The design of the IRAC framework draws upon bodies of literature including Electronic Performance Support Systems (EPSS) (Gery, 1991), the design of cognitive artefacts (Norman, 1993), and Decision Support Systems (Arnott & Pervan, 2005). In turn, the considerations within each of the four questions are further informed by a broad array of research from fields including learning analytics, educational data mining, complex adaptive systems, ethics and many more. It is suggested that the considered use of the IRAC framework to analyse learning analytics implementations in a particular context, for specific tasks, will result in designs that are more likely to be integrated into, and to improve, learning and teaching practices.
Learning from the past
The IRAC framework is based on the assumption that the real value and impact of learning analytics arises from its integration into the “tools and processes of teaching and learning” (Elias, 2011, p. 5). It is from this perspective that the notion of Electronic Performance Support Systems (EPSS) is seen as providing useful insights, as EPSS embody a “perspective on designing systems that support learning and/or performing” (Hannafin, McCarthy, Hannafin, & Radtke, 2001, p. 658). EPSS are computer-based systems intended to “provide workers with the help they need to perform certain job tasks, at the time they need that help, and in a form that will be most helpful” (Reiser, 2001, p. 63). This captures the notion of the performance zone defined by Gery (1991) as the metaphorical area where all of the necessary information, skills, and dispositions come together to ensure successful task completion. For Villachica, Stone and Endicott (2006) the performance zone “emerges with the intersection of representations appropriate to the task, appropriate to the person, and containing critical features of the real world” (p. 540). This definition of the performance zone is a restatement of Dickelman’s (1995) three design principles for cognitive artefacts drawn from Norman’s (1993) book “Things That Make Us Smart”. In this book, Norman (1993) argues “that technology can make us smart” (p. 3) through our ability to create artefacts that expand our capabilities. At the same time, however, Norman (1993) argues that the “machine-centered view of the design of machines and, for that matter, the understanding of people” (p. 9) results in technology that “more often interferes and confuses than aids and clarifies” (p. 9). This is a danger faced in the current rush toward learning analytics.
The notions of EPSS, the performance zone and Norman’s (1993) insights into the design of cognitive artefacts – along with insights from other literature – provide the four questions that form the IRAC framework. The IRAC framework is intended to be applied with a particular context and a particular task in mind. A nuanced appreciation of context is at the heart of mindful innovation with Information Technology (Swanson & Ramiller, 2004). Olmos and Corrin (2012), amongst others, reinforce the importance of learning analytics starting with “a clear understanding of the questions to be answered” (p. 47) or the task to be achieved. When used this way, it is suggested that the IRAC framework will help focus attention on factors that will improve the implementation and impact of learning analytics. The four questions at the core of the IRAC framework, along with some of the associated factors, are:
- Is all the relevant Information and only the relevant information available?
While there is an “information explosion”, the information we collect is usually about “those things that are easiest to identify and count or measure” but which may have “little or no connection with those factors of greatest importance” (Norman, 1993, p. 13). This leads to Verhulst’s observation (cited in Bollier & Firestone, 2010) that “big data is driven more by storage capabilities than by superior ways to ascertain useful knowledge” (p. 14). There are various other aspects of information to consider. For instance, is the information required technically and ethically available for use? How is the information to be cleaned, analysed and manipulated? Is the information sufficient to fulfil the needs of the task? In particular, does the information captured provide a reasonable basis upon which to “contribute to the understanding of student learning in a complex social context such as higher education” (Lodge & Lewis, 2012, p. 563)?
- Does the Representation of the information aid the task being undertaken?
A bad representation will turn a problem into a reflective challenge, while an appropriate representation can transform the same problem into a simple, straightforward task (Norman, 1993). Representation has a profound impact on design work (Hevner, March, Park, & Ram, 2004), particularly on the way in which tasks and problems are conceived (Boland, 2002). In order to maintain performance, it is necessary for people to be “able to learn, use, and reference necessary information within a single context and without breaks in the natural flow of performing their jobs” (Villachica et al., 2006, p. 540). Olmos and Corrin (2012) suggest that there is a need to better understand how visualisations of complex information can be used to aid analysis. Considerations here focus on how easy it is to understand the implications and limitations of the findings provided by learning analytics.
- Are there appropriate Affordances for action?
A poorly designed or constructed artefact can greatly hinder its use (Norman, 1993). For an application of information technology to have a positive impact on individual performance it must be utilised and be a good fit for the task it supports (Goodhue & Thompson, 1995). Human beings tend to use objects in “ways suggested by the most salient perceived affordances, not in ways that are difficult to discover” (Norman, 1993, p. 106). The nature of such affordances is not inherent to the artefact; rather, it is co-determined by the properties of the artefact in relation to the properties of the individual, including the goals of that individual (Young, Barab, & Garrett, 2000). Glassey (1998) observes that through the provision of “the wrong end-user tools and failing to engage and enable end users” even the best implemented data warehouses “sit abandoned” (p. 62). Tutty, Sheard and Avram (2008) suggest there is evidence that institutional quality measures not only inhibit change, “they may actually encourage inferior teaching approaches” (p. 182). The consideration for affordances is whether or not the tool and the surrounding environment provide support for action that is appropriate to the context, the individuals and the task.
- How will the information, representation and the affordances be Changed?
The idea of evolutionary development has been central to the theory of decision support systems (DSS) since its inception in the early 1970s (Arnott & Pervan, 2005). Rather than being implemented in a linear or parallel fashion, development occurs through continuous action cycles involving significant user participation (Arnott & Pervan, 2005). Beyond the systems, there is a need for the information being captured to change. Buckingham Shum (2012) identifies the risk that research and development based on data already being gathered will tend to perpetuate the existing dominant approaches from which the data was generated. Bollier and Firestone (2010) observe that once “people know there is an automated system in place, they may deliberately try to game it” (p. 6). Universities are complex systems (Beer, Jones, & Clark, 2012) requiring reflective and adaptive approaches that seek to identify and respond to emergent behaviour in order to stimulate increased interaction and communication (Boustani et al., 2010). Potential considerations here include: who is able to implement change? Which, if any, of the three prior questions can be changed? How radical can those changes be? Is a diversity of change possible?
It is proposed that the lens provided by the IRAC framework can help increase the mindfulness of innovation arising from learning analytics. In particular, it can move consideration beyond the existing over-emphasis on the first two questions and raise awareness of the last two. This shift in emphasis appears necessary to increase the use and effectiveness of learning analytics. The IRAC framework can also provide suggestions for future directions. In the final section, the paper seeks to illustrate the value of the IRAC framework by using it to compare and contrast three nascent learning analytics tools against each other and against contemporary practice.
Looking to the future
The Student Support Indexing system (SSI) mirrors many other contemporary learning analytics tools with its focus on the task of improving retention through intervention. Like similar systems, it draws upon LMS clickstream information in combination with data from other context-specific student information systems and continuously indexes potential student risk. Only a few such systems, such as S3 (Essa & Ayad, 2012), provide the ability to change the underlying formula in response to a particular context. SSI also represents the information in tabular form, separate from the learning context. SSI does provide common affordances for intervention and tracking, which appear to assist in the development of a shared understanding of student support needs across teaching and student support staff. Initial findings are positive, with teaching staff appreciating the aggregation of information from various institutional systems in conjunction with basic affordances for facilitating and tracking interventions. In its current pilot form, the SSI provides little in terms of change, and it is hoped that in future iterations the underlying processes for indexing student risk and for tracking and monitoring student interventions can be represented in more contextually appropriate ways.
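To make concrete the kind of indexing at issue here, the following is a minimal sketch of a weighted risk index combining clickstream counts with student-record data. The field names, weights and threshold are hypothetical illustrations only, not the SSI’s actual formula or data model.

```python
# Hypothetical sketch of a weighted student risk index.
# Field names, weights and the threshold are illustrative only;
# they are not the SSI's actual formula or data model.

def risk_index(student):
    """Combine clickstream and student-record signals into a single score."""
    score = 0.0
    # No LMS activity in the last week suggests disengagement.
    if student["clicks_last_week"] == 0:
        score += 2.0
    # Assessment items past their due date without a submission.
    score += 3.0 * student["assessments_overdue"]
    # Attrition-related indicators drawn from the student information system.
    if student["first_in_family"]:
        score += 1.5
    if student["gpa"] is not None and student["gpa"] < 4.0:
        score += 1.0
    return score

def needs_intervention(student, threshold=4.0):
    """Flag students whose index exceeds a (tunable) threshold."""
    return risk_index(student) >= threshold

example = {"clicks_last_week": 0, "assessments_overdue": 1,
           "first_in_family": True, "gpa": 3.5}
print(risk_index(example), needs_intervention(example))  # 7.5 True
```

For the IRAC framework the interesting question is less the formula itself than Change: whether teaching staff can adjust such weights and thresholds to their context, the capability that distinguishes systems like S3 from the current SSI pilot.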
The Moodle Activity Viewer (MAV) currently serves a similar task to traditional LMS reporting functionality and draws on much the same LMS clickstream information to represent student usage of course website activities and resources. MAV’s distinction in terms of representation is that it visualises student activity as a heat map overlaid directly onto the course website. MAV, like many contemporary learning analytics applications, offers little in the way of affordances. Perhaps the key distinction of MAV is that it is implemented as a browser-based add-on that depends on an LMS-independent server. This architectural design offers greater capacity for change because it avoids the administrative and technical complexity of LMS module development (Leony, Pardo, Valentín, Quinones, & Kloos, 2012) and the associated governance constraints. It is this capability for change that is seen as the great strength of MAV, offering the potential to overcome its limited affordances and a foundation for future research.
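The essence of the heat-map representation can be sketched as a mapping from per-link click counts to background colours that a browser add-on could inject into the course page. The colour scale and CSS output below are illustrative assumptions, not MAV’s actual implementation or its protocol for talking to the LMS-independent server.

```python
# Illustrative mapping of per-link usage counts to heat-map colours.
# The white-to-red scale and the CSS output are assumptions for this sketch,
# not MAV's actual implementation.

def heat_colour(count, max_count):
    """Map a click count onto a white-to-red scale as a CSS rgb() value."""
    intensity = 0.0 if max_count == 0 else count / max_count
    green_blue = int(255 * (1 - intensity))  # higher usage -> deeper red
    return f"rgb(255, {green_blue}, {green_blue})"

def css_rules(link_counts):
    """Produce per-link CSS rules an add-on could overlay on the course page."""
    max_count = max(link_counts.values(), default=0)
    return {link_id: f"background-color: {heat_colour(count, max_count)};"
            for link_id, count in link_counts.items()}

print(css_rules({"forum-1": 120, "quiz-2": 30, "resource-3": 0}))
```

Because this kind of styling is applied by the browser add-on rather than by the LMS itself, changing the scale, the data source or the representation does not require touching the LMS, which is the architectural basis of MAV’s capacity for change.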
BIM is a Moodle plugin that manages the use of student-selected, externally hosted blogs as reflective journals. The posts written by students form the information used by BIM, moving beyond the limitations (see Lodge & Lewis, 2012) associated with an over-reliance on clickstream information. Since BIM aims to support a particular learning design – reflective journals – it enables exploration of process analytics (Lockyer, Heathcote, & Dawson, 2013); in particular, how process analytics can be leveraged to support the implementation of affordances for automated assessment, scaffolding of student reflective writing, and encouraging connections between students and staff. Like MAV, the work on BIM is also exploring approaches to avoid the constraints on change imposed by existing LMS and organisational approaches.
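As an illustration of the kind of information BIM works with, the sketch below aggregates externally hosted student blogs and derives simple process indicators from the posts. It assumes the third-party feedparser library; the feed URLs and the seven-day “recent post” rule are hypothetical illustrations, not BIM’s actual behaviour.

```python
# Sketch of aggregating externally hosted student blog feeds, in the spirit of BIM.
# Assumes the third-party feedparser library; URLs and the seven-day rule are
# hypothetical illustrations, not BIM's actual logic.
import time
import feedparser

student_feeds = {
    "s0123456": "https://example.blogspot.com/feeds/posts/default",  # hypothetical
    "s0654321": "https://example.wordpress.com/feed/",               # hypothetical
}

def journal_activity(feeds, recent_days=7):
    """Count posts per student and flag anyone without a recent reflection."""
    cutoff = time.time() - recent_days * 86400
    report = {}
    for student, url in feeds.items():
        entries = feedparser.parse(url).entries
        recent = [e for e in entries
                  if getattr(e, "published_parsed", None)
                  and time.mktime(e.published_parsed) >= cutoff]
        report[student] = {"total_posts": len(entries),
                           "recent_posts": len(recent),
                           "needs_prompt": len(recent) == 0}
    return report

for student, stats in journal_activity(student_feeds).items():
    print(student, stats)
```

Indicators of this kind, derived from what students actually write and when, are the raw material for the process analytics and for the assessment, scaffolding and connection affordances discussed above.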
The IRAC framework arose from a concern that most existing learning analytics applications were falling outside the performance zone and were thus unlikely to successfully and sustainably improve learning and teaching. Existing initiatives focus heavily on information, its analysis and how it is represented, and not enough on technological affordances for action or on the agility to change and adapt. Drawing on earlier work from the EPSS and related literature, we have proposed the IRAC framework as a guide to help locate the performance zone for learning analytics. The next step with the IRAC framework is a more detailed identification and description of its four components. Following this, we intend to use the framework to analyse the extant learning analytics literature and to guide the development and evaluation of learning analytics applications such as SSI, MAV and BIM.
References
Arnott, D., & Pervan, G. (2005). A critical analysis of decision support systems research. Journal of Information Technology, 20(2), 67–87.
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings of ascilite Wellington 2012 (pp. 78–87). Wellington, NZ.
Boland, R. J. (2002). Design in the punctuation of management action. In R. Boland & F. Collopy (Eds.), Managing as designing (pp. 106–112). Stanford, CA: Stanford University Press.
Bollier, D., & Firestone, C. (2010). The promise and peril of big data. Washington DC: The Aspen Institute.
Boustani, M. A., Munger, S., Gulati, R., Vogel, M., Beck, R. A., & Callahan, C. M. (2010). Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system. Clinical Interventions in Aging, 5, 141–148.
Buckingham Shum, S. (2012). Learning analytics (UNESCO IITE Policy Brief). Moscow: UNESCO Institute for Information Technologies in Education.
Dickelman, G. (1995). Things that help us perform: Commentary on ideas from Donald A. Norman. Performance Improvement Quarterly, 8(1), 23–30.
Elias, T. (2011). Learning Analytics: Definitions, Processes and Potential.
Essa, A., & Ayad, H. (2012). Student success system: Risk analytics and data visualization using ensembles of predictive models. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge – LAK ’12 (pp. 2–5). Vancouver: ACM Press.
Gery, G. J. (1991). Electronic Performance Support Systems: How and why to remake the workplace through the strategic adoption of technology. Tolland, MA: Gery Performance Press.
Glassey, K. (1998). Seducing the End User. Communications of the ACM, 41(9), 62–69.
Goodhue, D., & Thompson, R. (1995). Task-technology fit and individual performance. MIS Quarterly, 19(2), 213–236.
Hannafin, M., McCarthy, J., Hannafin, K., & Radtke, P. (2001). Scaffolding performance in EPSSs: Bridging theory and practice. In World Conference on Educational Multimedia, Hypermedia and Telecommunications (pp. 658–663).
Hevner, A., March, S., Park, J., & Ram, S. (2004). Design science in information systems research. MIS Quarterly, 28(1), 75–105.
Johnson, L., Adams Becker, S., Cummins, M., Freeman, A., Ifenthaler, D., & Vardaxis, N. (2013). Technology Outlook for Australian Tertiary Education 2013-2018: An NMC Horizon Project Regional Analysis. Austin, Texas.
Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. Austin, Texas: The New Media Consortium.
Johnson, L., Smith, R., Levine, A., & Haywood, K. (2010). The horizon report: 2010 Australia-New Zealand Edition. Austin, Texas.
Leony, D., Pardo, A., Valentín, L. de la F., Quinones, I., & Kloos, C. D. (2012). Learning analytics in the LMS: Using browser extensions to embed visualizations into a Learning Management System. In R. Vatrapu, W. Halb, & S. Bull (Eds.), TaPTA. Saarbrücken: CEUR-WS.org.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459.
Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings of ascilite Wellington 2012 (pp. 560–564). Wellington, NZ.
Norman, D. A. (1993). Things that make us smart: Defending human attributes in the age of the machine. Reading, MA: Addison-Wesley.
Olmos, M., & Corrin, L. (2012). Learning analytics: a case study of the process of design of visualizations. Journal of Asynchronous Learning Networks, 16(3), 39–49.
Reiser, R. (2001). A history of instructional design and technology: Part II: A history of instructional design. Educational Technology Research and Development, 49(2), 57–67.
Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5), 30–40.
Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 28(4), 553–583.
Tutty, J., Sheard, J., & Avram, C. (2008). Teaching in the current higher education environment: perceptions of IT academics. Computer Science Education, 18(3), 171–185.
Villachica, S., Stone, D., & Endicott, J. (2006). Performance support systems. In J. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 539–566). San Francisco, CA: John Wiley & Sons.
Young, M. F., Barab, S. A., & Garrett, S. (2000). Agent as detector: An ecological psychology perspective on learning by perceiving-acting systems. In D. Jonassen & S. Land (Eds.), Theoretical foundations of learning environments (pp. 143–173). Mahwah, New Jersey: Lawrence Erlbaum Associates.