Learning Analytics: engaging with and changing learning and teaching

The following is an attempt to build a bit more on an earlier idea around the use of learning analytics. It’s an attempt to frame a different approach to the use of learning analytics and to share these ideas in preparation for a potential project.

In part, the project is based on the assumption that the current predominant applications of learning analytics are

  1. By management as a tool to enable “data-based” decision making.
  2. By students, through tools that allow them to reflect on their learning.
  3. By researchers.

And that, as identified by Dawson, Heathcote & Poole (2010), there is a

lack of research regarding the application of academic analytics to inform the design, delivery and future evaluations of individual teaching practices

i.e. while the existing applications of learning analytics by/for management, researchers and students are important and should continue, there is a need to explore how learning analytics can be used by teaching staff to inform and improve their practice.

The theoretical bases of the current project idea are, in summary:

  1. Drawing on Seely Brown & Duguid’s (1991) ideas around how “abstractions detached from practice distort or obscure intricacies of that practice”, there is value in examining what learning analytics might do by focusing on an in-depth engagement with actual academic practice, to better enable exploration, understanding and innovation around the application of learning analytics to individual teaching practices.
  2. The quality of student learning outcomes is influenced by the conceptions of learning and teaching, and the perceptions of the teaching environment held by teaching staff (Trigwell, 2001; Prosser et al, 2003; Richardson, 2005; Ramsden et al, 2007).
  3. Learning analytics can be useful in revealing different and additional insights about what is going on within a course (and in other courses).
  4. Transforming the insights from learning analytics to informed pedagogical action is, for the majority of academics, complex and labour intensive (Dawson, et al, 2010).
  5. Distributed leadership – built on foundations of distributed cognition and activity theory – seeks to distribute power (the ability to get things done) through a collegial sharing of knowledge, of practice, and reflection within a socio-cultural context. (Spillane et al, 2004; Parrish et al 2008).
  6. Encouraging academics to engage in reflection on their teaching is an effective way to enhance teaching practice and eventually student learning (Kreber and Castleden, 2009).

Consequently, the project seeks to engage groups of academics in cycles of participatory action research in which they are encouraged and enabled to explore and reflect on the courses they have taught, using various learning analytics tools and other lenses. In preparation for this, a range of existing analytics tools and forms of analysis will be applied to the courses. In response to the cycles, the tools/analyses may be modified or new ones created. In particular, the tools will be modified/developed to make it easier for academics to transform the information provided by learning analytics into informed pedagogical action.

In particular, the project will explore how the tools can be modified to enable the sharing of knowledge, practice and reflection between the participants and, eventually, the broader academic community. The aim is to break down course-based silos and make it easier for academics to see what other staff have done and with what impact.

References

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116-128. doi:10.1108/09513541011020936

Kreber, C., & Castleden, H. (2009). Reflection on teaching and epistemological structure: Reflective and critically reflective processes in ‘pure/soft’ and ‘pure/hard’ fields. Higher Education, 57(4), 509-531.

Parrish, D., Lefoe, G., Smigiel, H., & Albury, R. (2008). The GREEN Resource: The development of leadership capacity in higher education. Wollongong: CEDIR, University of Wollongong.

Prosser, M., Ramsden, P., et al. (2003). Dissonance in experience of teaching and its relation to the quality of student learning. Studies in Higher Education, 28(1), 37-48.

Ramsden, P., Prosser, M., et al. (2007). University teachers’ experiences of academic leadership and their approaches to teaching. Learning and Instruction, 17(2), 140-155.

Richardson, J. (2005). Students’ approaches to learning and teachers’ approaches to teaching in higher education. Educational Psychology, 25(6), 673-680.

Seely Brown, J., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2(1), 40-57.

Spillane, J., Halverson, R., & Diamond, J. (2004). Towards a theory of leadership practice: A distributed perspective. Journal of Curriculum Studies, 36(1), 3-34.

Trigwell, K. (2001). Judging university teaching. The International Journal for Academic Development, 6(1), 65-73.

Harnessing learning analytics to inform/improve learning and teaching

The following is an early attempt to formulate a method by which learning analytics could be used to make a “Golf GTI”. The context for this is an attempt to develop an application for an OLT grant.

Standing on the shoulders of giants

This work has to build on earlier work, especially work funded by the same/related organisations. The two most obvious examples of earlier work involved @shaned07 and many talented collaborators:

  1. Investigating the application of IT generated data as an indicator of learning and teaching performance (Dawson and McWilliam, 2008); and
  2. “Seeing” networks: Visualising and evaluating student learning networks (Dawson, Bakharia, Lockyer, & Heathcote, 2011).

This work has made a range of findings and recommendations for further work. This proposal aims to build directly on that foundation.

The aim

The overall aim of this project is to further address the “lack of research regarding the application of academic analytics to inform the design, delivery and future evaluations of individual teaching practices” (Dawson, Heathcote, & Poole, 2010).

In achieving this, the project will:

  • Explore and validate established indicators of student learning performance (Dawson and McWilliam, 2008).
    This project will be based on work that is doing exactly this (which is in turn based on the work of others). The nature of this project, however, will almost certainly help identify new and interesting indicators and further test established indicators.
  • Investigate and supply the data, indicators, tools, and knowledge required to better assist teaching staff in informing the design, delivery, and evaluation of individual teaching practice (Dawson, Bakharia, Lockyer, & Heathcote, 2011).
    “The transformation of user-data from analysis to informed pedagogical action is for the vast majority of academic teaching staff, a complex and potentially labour intensive process” (Dawson, Heathcote, & Poole, 2010). The project will draw heavily on existing work, but also seek to make new contributions.
  • Explore how and with what impact learning analytics tools and insights are used to inform pedagogical interventions.

One approach

Action research – a methodology generally adopted to generate and evaluate innovations within a particular context – seems an appropriate fit for a project aiming to explore the application of learning analytics to inform the design, delivery and evaluation of individual teaching practices. Hence one idea for this project would be to base it on a range of cycles of action research.

The cycles – intended to be run in parallel across the institutions involved – would generally aim to

  • Be focused on a small group of academics (perhaps 5-6) responsible for teaching a range of courses.
    Given the difficulty involved in understanding the data and insights revealed by analytics and translating these into action, initial cycle(s) could be targeted at academics with appropriate content knowledge. For example, a group of academics from statistics, psychology and education would have a mix of knowledge that could help in interpreting the data and in recommending useful theories and approaches that might inform changes to learning.

    Arguably this might also increase the variety of ideas for expanding learning analytics, opportunities for further publication and research, and the chance that the academics from initial rounds could become part of the support team for subsequent cycles.

  • Include a small team from the project with expertise around learning analytics (perhaps including academic participants in prior AR cycles).
    The aim of this team is both to help the academics involved understand and explore the learning analytics insights and tools that are currently available, and to aid in exploring and developing new insights and tools.
  • Draw upon existing tools, patterns and experience around learning analytics to identify areas of interest in each of the courses.
    Where possible these cycles would draw heavily on an established foundation of learning analytics tools and indicators. One of the aims of this project is to learn more about how these existing tools and indicators are used and what impact they have on learning and teaching.
  • Collaboratively identify, with the aid of analytics, areas and processes for intervention and research within those courses.
  • Implement those interventions.
  • Evaluate and reflect on the results.
  • Start again.

The idea is that the act of using analytics to make interventions in a range of real teaching contexts with a diversity of teaching staff and courses will reveal new and interesting insights.

Outcomes

After a year or two of these cycles, it is assumed that the outcomes of the project could include:

  • Insights into whether, how and with what impact learning analytics insights and tools were used to inform the design, delivery and evaluation of individual teaching practices.
  • Validation of existing and identification of new predictors of student learning performance.
  • A range of tools, support resources, and recommendations to aid in the use of learning analytics to inform the design, delivery and evaluation of individual teaching practice.
  • Increased use of these tools and insights to inform the design, delivery and evaluation of individual teaching practice.

Foundations of the approach

The following is a mixed-bag of theories and references that give a vague idea of the perspectives informing this project. There is no narrative in the following.

From this post

Key enabling factors for knowledge creation is knowledge sharing and integration [36,54]. Research in organizational learning has emphasized the value of practice; people acquire and share knowledge in socially situated work. Learning in the organization occurs in the interplay between tacit and explicit knowledge while it crosses boundaries of groups, departments, and organizations as people participate in work [17,54]. The process should be situated in shared practice with a joint, collective purpose [12,14,15].

and also this from Seely Brown and Duguid

We, by contrast, suggest that practice is central to understanding work. Abstractions detached from practice distort or obscure intricacies of that practice. Without a clear understanding of those intricacies and the role they play, the practice itself cannot be well understood, engendered (through training), or enhanced (through innovation).

Given this focus, it does not appear surprising when Green et al (2009) report that “many academic staff continue to employ inappropriate, teacher-centered, content focused strategies”.

There is a significant body of literature that identifies the conceptions of learning and teaching held by academics and links those conceptions to the quality of student learning outcomes (Kember and Kwan 2000; Biggs 2001; Trigwell 2001; Norton, Richardson et al. 2005; Eley 2006; Gonzalez 2009).

Trigwell (2001) suggests that focusing more holistically on the combination of elements – especially on the teachers’ conceptions of teaching and a focus on students – makes the differences between teaching qualities more discernible and judgements easier. A focus on the strategies and technologies used by a teacher ignores the influence that their conceptions can have on how such strategies and technologies are used. Approaches to staff development that focus on the provision of prescribed skills and teaching recipes result, in many cases, in participants querying the feasibility of presented methods, defending methods they are already using, using new methods mechanically, or modifying methods intended to facilitate student learning into didactic transmission modes (Gibbs 1995; Trigwell 1995). A focus on strategies also ignores the likelihood that contextual factors also influence the appropriateness and implementation of strategies and techniques. Even a teacher with a student-centred conception of learning will adopt alternate strategies if the context is not appropriate.

The relationship between conceptions of learning and teaching has implications for educational change (Tutty, Sheard et al. 2008). Change towards more sophisticated forms of teaching is only possible if the pedagogue’s conceptions of teaching are addressed first (Ho, Watkins et al. 2001). There is little evidence to show that pedagogues’ conceptions of teaching will develop with increasing teaching experience or from formal training (Richardson 2005). Pedagogues’ approaches to teaching change slowly, with some change coming only after a sustained training process (Postareff, Lindblom-Ylanne et al. 1997). Given that it appears most university pedagogues hold content-centred conceptions of learning and teaching and that the majority of e-learning appears focused on distributing content, there appears to be a need to change the conceptions held by pedagogues.

Changing pedagogues’ conceptions of teaching, however, is a necessary but not sufficient condition for improved student learning. While pedagogues are likely to adopt teaching approaches that are consistent with their conceptions of teaching, there may be differences between espoused theories and theories in use (Leveson 2004). While pedagogues may hold higher-level views of teaching, other contextual factors may prevent the use of those conceptions (Leveson 2004). Environmental, institutional, or other issues may impel pedagogues to teach in a way that is against their preferred approach (Samuelowicz and Bain 2001). While conceptions of teaching influence approaches to teaching, other factors such as institutional influence and the nature of students, curriculum and discipline may also influence teaching approaches (Kember and Kwan 2000). Prosser and Trigwell (1997) found that pedagogues with a student-focused approach were more likely to report that their departments valued teaching, that their class sizes were not too large, and that they had control over what was taught and how it was taught. Other contextual factors that frustrate pedagogues’ intended approaches to teaching may include senior staff with traditional teacher-focused conceptions raising issues about standards and curriculum coverage, and students who induce teachers to adopt a more didactic approach (Richardson 2005).

Efforts to improve teaching have often failed because the complexity of teaching has been underestimated, and such attempts should consider the integrated system of relationships that constitute the teaching experience as a whole (Leveson 2004). One important complicating influence is the differences that have been found between discipline areas (Lindblom-Ylanne, Trigwell et al. 2006), which suggest a need to understand teaching from both a general and a discipline-specific perspective (Leveson 2004). Beliefs about teaching vary markedly across different disciplines, and these variations are related to pedagogues’ beliefs about the nature of the discipline they are teaching (Richardson 2005).

Rhetorical claims espousing e-learning seek to appeal to pedagogues’ vision with an emphasis on innovation at the expense of reflection on pedagogues’ thinking and practices (Convery 2009). The unrealistic expectations of e-learning inhibit pragmatic attempts by pedagogues to integrate technology into classroom contexts and contribute to pedagogues being blamed for the failure of technology to fulfil its promise (Convery 2009).

Any “one technology and pedagogy for all” approach is pretty much doomed to flap and then crash (Salmon 2005). Apart from the above reasons, there are also those associated with the technology. Technological artifacts often generate new, unforeseen behaviours that may deviate from initial intentions; it is likely that secondary changes in patterns and behaviours will occur that will not be predictable (Westera 2004). E-learning practice cannot remain static because e-learning pedagogies are evolving through the continual emergence of new modes of practice and enhanced technological tools (Nichols and Anderson 2005).

More thinking to consider

The above is influenced by my perspectives on what works in terms of changing/improving learning and teaching within universities. I need to re-read some of my earlier writings, including the following.

References

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council. Retrieved from http://www.olt.gov.au/system/files/resources/grants_cg_report_itgenerated_qut_feb09.pdf

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116-128. doi:10.1108/09513541011020936

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final report. Canberra. Retrieved from http://research.uow.edu.au/content/groups/public/@web/@learnnet/documents/doc/uow115678.pdf

A #pstn story, so far

What follows is some reflection on, and forward thinking about, the Pre-Service Teacher Network (#pstn) project. Bits of this may end up in a paper destined for the PLE Conference 2012.

My story

2010 was an interesting year for me. After almost two decades as a University academic, first within Information Technology and then in e-learning support, I started a one-year pre-service teacher qualification with the intent of becoming a high-school Information Technology and Mathematics teacher. Even before I started the qualification I began reconstructing my PLN to better suit my new career. Like @laurenforner, I learned more of direct value to me from my PLN and the broader Internet than through my formal studies. That said, my formal studies did provide the motivation and a foundation for my PLN-based learning. I was sold on the benefits of a project like #pstn.

By the end of 2010 the wheel had turned and I was set to return to life as a University academic, this time within a Faculty of Education teaching a 3rd year course, “ICTs and Pedagogy”. Around this time I became aware of @sthcrft’s and @laurenforner’s plans for #pstn and saw benefits for the students (120 studying online, and 25, 52, and 57 students at different campuses) in my course. On the downside, timing prevented any significant modification of the existing course, meaning that #pstn/Twitter, blogs, and social bookmarking (Diigo) were introduced as optional activities. As the term draws to an end, Diigo has been by far the most widely adopted, with 100+ students joining the course Diigo group. Only a small handful of students have started using Twitter, with a similar number starting an individual blog. Participation in #pstn has been very limited.

Why?

Some time ago, Geoghegan (1994) – drawing on the work of Moore (2002) – suggested that instructional technology’s limited uptake within higher education was due to a significant difference between the promoters/innovators of technology and the pragmatic majority, and an ignorance of that divide. This divide seems to exist between the creators of #pstn and our target audience; see the following table, which combines characteristics from Geoghegan (1994) with observation of my students.

Attribute | #pstn developers | Students
Social media experience | Long-term users of Twitter | If any social media experience, typically regular use of Facebook, generally for personal rather than professional purposes
Focus | Helping PSTs build networks to aid their transition into the workforce | Passing their current courses, for which #pstn and other social media are not a requirement
Risk taking | Willing to take risks | Averse to taking risks
Experimentation | Willing and keen to experiment | Prefer proven and known applications
Need for assistance | Self-sufficient | Require support

It could be argued that where #pstn and the use of Twitter have worked in this project has been in those instances where the chasm was small to non-existent (e.g. students already using Twitter having more in common with the #pstn developers), or where it has been more effectively bridged (e.g. @rellypops’ experience). At USQ the chasm wasn’t bridged well, a perception reinforced by the odd message of confusion tweeted by #pstn participants. The above discussion hasn’t considered the other people involved with #pstn, the mentors. Typically the mentors would have much more in common with the #pstn developers, but participation remained low, perhaps due to the limited student participation or a sense of confusion about how to effectively participate.

Another contributing factor arises from the nature of the formal education context: the accepted nature of university study. Lectures, tutorials, assignments etc. are common, accepted practices. The use of Twitter and social media, however, is not familiar, and for some is an example of formal education encroaching into the personal realm. But technical factors also play a role. A fail whale during a tutorial introducing on-campus students to Twitter does not create a good first impression. Nor does a University filtering system that blocks the Twitter URL shortener, drastically limiting the value of Twitter to on-campus students.

Moving forward

At USQ, the plan is to bridge this chasm by embedding #pstn into the course, the assessment, and the course support structures. The formation of, and engagement with, a PLN will become an assessable component of at least one assignment and will be explained as a major source of inspiration for students as they start planning the ICT-rich lessons they will be required to teach later in the term. The intent is that this will provide students with what Geoghegan (1994) describes as “a compelling reason to adopt”. As @rellypops’ experience shows, this type of approach can achieve widespread adoption. This will be supplemented with a range of weekly activities that marry the #pstn and course experiences with appropriate levels of support. Particular thought, however, needs to be given to how (and if) PSTs can be encouraged and enabled to make various transitions: from a user of Facebook for personal reasons to a user of Twitter and other social media for learning and professional reasons, and from a user of Twitter for assessment reasons to a user of Twitter for reasons of value to the student.

An obvious further extension of #pstn is to actively give voice to the experience and perceptions of the #pstn participants.

Can learning analytics make a Golf GTI

Or, an attempt to share some thinking about the idea behind an – almost obligatory – application for external funding.

The car analogy

A few weeks ago one of my neighbours up the road had left the lights of his ageing Mitsubishi Magna on. They were on all night. As an older car – arguably of questionable quality – the Magna allowed him to get out of the car with the lights still on. I believe he got out of the car during daylight and it wasn’t obvious to him that his lights were still on. The car, his tool for driving, didn’t warn him of this problem.

On the other hand, one of the cars my family owns is a Honda Jazz. It’s almost as old as the Magna, but arguably Honda have put in a bit more thought. If you remove the keys from the ignition and the headlights are still on, the car starts an incessant and annoying beeping. The car “has a bit of smarts” built in. It warns the driver that there’s a problem.

The other car we own is a VW Golf GTI (see photo). I love this car for a variety of reasons. One of the very minor reasons is that if you remove the key from the ignition and the headlights are still on, the car turns the lights off. It also has automatic windscreen wipers that do a pretty good job. If it rains, they start.

The “LMS” is like a Magna

The Learning Management Systems used by most Universities remind me a great deal of the Magna (perhaps a Model T Ford is a better example). They don’t contain a lot of smarts. If something is going to happen within the LMS, then either the students or academics using the LMS have to do it. The LMS doesn’t provide much assistance for the people using it when they fail to pick up on a problem, much like the Magna didn’t warn my neighbour that his headlights were still on.

Using analytics to produce a Golf GTI

Some colleagues and I are currently throwing around an idea to use analytics to make the LMS (in our case Moodle) – or perhaps the broader institutional learning environment – more like a Golf GTI than a Magna.

Perhaps the original or best known example of this is the Signals work at Purdue University. In part, we’d be looking to replicate something like this, but this is only part of the story. We’d also be aiming to identify how a range of the other patterns that have been identified through analytics can be leveraged to modify the LMS to be a more pro-active member of the distributed cognitive system that is learning and teaching within a university (i.e. make the LMS more like the Globe Theatre). The theory being that if the quality of cognitive processes within the institutional learning systems is better, then the (student learning) outcomes should also be better.
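
As a purely illustrative sketch (not a description of the project’s actual tooling), the simplest version of this “warning light” behaviour is a rule run over an extract of LMS clickstream data that flags students whose recent activity has dropped away. Everything below – the table layout, column names and thresholds – is an assumption made for the sake of the example.

```python
# A minimal, hypothetical sketch of a "Golf GTI"-style warning: flag students whose
# LMS activity has dropped away so the system, not the teacher, notices first.
# The DataFrame layout (user_id, course_id, event_time) is an assumption for this
# example, not Moodle's actual logging schema.
import pandas as pd

def flag_disengaged(clicks: pd.DataFrame, as_of: pd.Timestamp,
                    window_days: int = 14, min_hits: int = 5) -> pd.DataFrame:
    """Return students with fewer than `min_hits` clicks in the last `window_days` days."""
    recent = clicks[clicks["event_time"] >= as_of - pd.Timedelta(days=window_days)]
    hits = (recent.groupby(["course_id", "user_id"]).size()
                  .rename("recent_hits").reset_index())
    # Students with no recent clicks at all won't appear in `hits`, so start from the
    # full set of enrolled students and fill the gaps with zero.
    enrolled = clicks[["course_id", "user_id"]].drop_duplicates()
    merged = enrolled.merge(hits, on=["course_id", "user_id"], how="left")
    merged["recent_hits"] = merged["recent_hits"].fillna(0)
    return merged[merged["recent_hits"] < min_hits].sort_values("recent_hits")

# Usage (hypothetical extract of clickstream data):
#   clicks = pd.read_csv("moodle_clicks.csv", parse_dates=["event_time"])
#   alerts = flag_disengaged(clicks, pd.Timestamp("2012-05-01"))
```

In practice the flagged list would feed an alert to the teacher, or – GTI-style – trigger an action within the LMS itself, rather than simply being printed.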

This is one approach to responding to the learning analytics challenge identified by Dawson et al (2008, p. 222)

no longer simply to generate data and make it available, but rather to readily and accurately interpret data and translate such findings to practice

How might it happen?

Much of what happens around analytics is driven from the top-down, and there’s a place for that. An alternative that I’m keen to explore with this project is what happens when the question of analytics, the LMS and distributed cognition is examined from the perspective of the teaching staff and the students. What different questions and tools might be useful? This perspective drives the following initial suggested process:

  1. Continue the identification and examination of various patterns in the usage data.
  2. Identify a small set of courses – initial project participants – and
    1. Identify the issues and aims they have for their courses.
    2. Explore whether there are insights from analytics and potential actions they (and the LMS) can take to address the issues/aims.
    3. Modify the LMS environment as a result and observe what happens.
    4. Return to “a”.
  3. Broaden the release of these changes to other courses and observe what happens.

Well, that’s an initial stab. More work to do. More reading to do.

A CRM for the LMS?

I’m definitely in need of a “customer relationship management” (CRM) system for the Learning Management System (LMS). Any suggestions?

I’m back teaching at University for the first time in a while. For the course I’m teaching there are 200+ students for whom I’m the “teacher”. 170 of these are “online” students, which means they rarely come on campus for lectures/tutorials. Supporting these students means covering questions about both the course and the administrative processes around it. Keeping track of 200+ students and the conversations we’ve had is hard when the interactions are spread across email, discussion forums, LMS instant messaging, the phone and other systems and media. Not to mention student information being spread across the student records system, the LMS, the assignment submission system, the prac placement system etc. And being busy with other tasks doesn’t help recall.

In the last couple of days a student has had to remind me about a prior email conversation because I’d forgotten about it and hadn’t actively searched through the various archives.

I’m likely to be in this situation for the foreseeable future, so I need to think about ways to prevent this from happening again. Not to mention the benefits that could arise from knowing the students better.
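
To make the gap concrete, the sort of minimal record-keeping I’m after might look something like the sketch below: a single view of a student’s interactions merged from several sources. This is a hypothetical illustration, not a recommendation of any particular CRM; the file names and fields are invented.

```python
# Not a real CRM, just a sketch of the record-keeping that is missing: one place
# that merges interactions from several sources per student. File names and
# fields (student_id, timestamp, channel, summary) are hypothetical.
import csv
from collections import defaultdict
from datetime import datetime

def load_interactions(paths):
    """Build a per-student history from CSV logs exported from each system."""
    history = defaultdict(list)
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                row["timestamp"] = datetime.fromisoformat(row["timestamp"])
                history[row["student_id"]].append(row)
    for interactions in history.values():          # oldest first, per student
        interactions.sort(key=lambda r: r["timestamp"])
    return history

# Usage (hypothetical exports):
#   history = load_interactions(["email_log.csv", "forum_log.csv", "phone_notes.csv"])
#   for item in history["s0123456"]:
#       print(item["timestamp"], item["channel"], item["summary"])
```

Even something that crude would have saved me the embarrassment of the forgotten email conversation.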

On the inertia of systems

For various reasons I am feeling frustrated with the inertia of systems.

I work at a place where the Managed Operating Environment must retain an old version of the Internet Explorer web browser because its massive, expensive, not-so-user-friendly Enterprise Resource Planning system has a web interface that isn’t compatible with more recent web browsers.

Continuing with the software perspective, I see software supporting e-learning that similarly can’t keep up with modern web development practices because its entire development model is based on earlier practices, making it extremely difficult to adopt processes and services that would radically improve the experience of teachers and students.

More importantly and fundamentally I see an education system burdened by an accepted way of working that is disenfranchising kids. A system that is actively turning kids off education, and worse, almost turning them off learning. This is not something that has happened recently. It’s possible to see generations of families showing symptoms of this malaise.

And the grand solutions of the day are all examples of first-order change. They accept the present system and just try to improve it, e.g. the “Teach for Australia” article in one of the Australian newspapers this weekend: we’ll put “better qualified” people into schools for a couple of years, and that will solve the problem.

And I’m complicit in this first-order change. I’m helping enable this inertia.

Mainly because it’s easier. It’s hard enough to do tasks well with existing systems, let alone do something different within existing systems.

I worry about university programs and courses that are closely tied to professional groups or governments, because they help create an almost crushing inertia. How do you prepare future educators – whose primary focus often tends to be how to survive the first five years in the “system” (which the majority don’t) – when you think the entire system is broken?

And on that positive note, it’s off to think about what I’m going to teach next week.

Enabling and analysing cross-institutional patterns in learning analytics: A foundation for research and action

The following is a slightly edited version of an internal grant application that has just been funded. It is a small step towards actually moving the Indicators project a little further toward the targets we talked about so many years ago. My co-investigators on this grant are Colin Beer and Professor Patrick Danaher.

One of the big first steps will be to locate this project within all the work on show at LAK12.

Summary of the project

The widespread adoption of Learning Management Systems (LMS) within universities is generating large collections of data. Few institutions, however, are actively using these data to inform either research into what is happening around e-learning or action to improve e-learning. This design-based research project aims to lay a foundation that will enable cross-institutional use of learning analytics by both researchers and teaching staff. It will use this foundation to analyse and compare patterns within these data sets across two universities and demonstrate how these patterns can generate both sustainable research and action, thereby filling a significant gap in the current literature.

Objectives

The project aims to build upon initial learning analytics work at CQUniversity (Beer, Clark, & Jones, 2010; Beer, Jones, & Clark, 2009; Clark, Beer, & Jones, 2010) and the related joint CQU-USQ DEHub funded project (Rossi et al., 2011-2012). The specific objectives of this project are to:

  1. Develop design principles that contribute to the on-going theorisation of learning analytics and the task of enabling effective inter-institutional projects using learning analytics to inform both research and action in order to improve learning and teaching.
  2. Analyse USQ learning analytics to identify unique patterns and to test for the presence at USQ of patterns identified at other institutions.

Through these aims it is intended that the project will lay the groundwork for a range of additional research projects – including competitive external grant applications – aimed at improving learning and teaching at USQ. It is intended that such work would not be limited to those directly involved with this grant application.

Background

The adoption and implementation of a Learning Management System (LMS) have become the almost universal approach to e-learning at universities (Jones & Muldoon, 2007). Kolowich (2012) reports on survey data from the USA that suggest more than 90% of colleges and universities, regardless of type, have adopted an institutional LMS. While these technologies have become pervasive, there is little evidence of their effectiveness in improving learning outcomes (Phillips et al., 2011). One possible solution to this problem is learning analytics.

Learning analytics has been described (Johnson, Smith, Willis, Levine, & Haywood, 2011, p. 6) as:

…a variety of data-gathering tools and analytic techniques to study student engagement, performance, and progress in practice, with the goal of using what is learned to revise curricula, teaching, and assessment in real time…. Learning analytics aims to mobilize the power of data-mining tools in the service of learning, and embracing the complexity, diversity, and abundance of information that dynamic learning environments can generate.

Learning analytics has been established as especially useful for institutional management as an enabler of data-driven decision-making, responding to pressures for accountability and student success (Campbell et al., 2007). It can, however, also be used by researchers, teaching and teaching support staff, and students. Siemens and Long (2011) propose that a particular strength of learning analytics is its ability to bridge the research/practice divide. One opportunity for bridging this divide is offered by research that identifies the existence of patterns and correlations (Objective 2) within usage data. For example, the graph in Figure 1 shows one pattern found by this project. It shows the average number of hits on a course website by over 38,000 distance education student/courses. The hits are grouped by age and plotted against the final grade in the course. It shows that students who pass the course and are older than 31 use the LMS significantly more than younger students. This pattern does not meet typical expectations and thereby opens up opportunities for research and new insights that can influence teaching practice. Such work is starting to help universities address the learning analytics challenge identified by Dawson et al (2008, p. 222):

no longer simply to generate data and make it available, but rather to readily and accurately interpret data and translate such findings to practice

Figure 1. LMS usage by age (n=38,000+ student/courses).
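
For the curious, a pattern like the one in Figure 1 can be produced with a fairly small amount of analysis once the data has been extracted. The sketch below is illustrative only; it assumes a flat export with one row per student/course and columns for age, final grade and hit count, and the column names and age bands are invented here.

```python
# A rough, illustrative sketch of how a pattern like Figure 1 might be extracted.
# It assumes a flat export with one row per student/course and columns for age,
# final grade and hit count; the column names and age bands are assumptions.
import pandas as pd

usage = pd.read_csv("lms_usage_extract.csv")   # hypothetical extract

usage["age_group"] = pd.cut(usage["age"],
                            bins=[0, 20, 25, 30, 40, 120],
                            labels=["<21", "21-25", "26-30", "31-40", "41+"])

# Average hits on the course site, grouped by age band and final grade.
pattern = (usage.groupby(["age_group", "grade"], observed=True)["hits"]
                .mean()
                .unstack("grade")
                .round(1))
print(pattern)
```

The hard part, as the rest of this proposal argues, is not producing such tables but enabling the extraction across institutions and translating the patterns into research and action.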

While there is increasing interest in and activity around learning analytics, Siemens and Long (2011) argue that learning analytics “is still in the early stages of implementation and experimentation”. The 2011 Horizon report describes work around learning analytics as still being in the very early stages of development, with much of the work being conceptual, and being four to five years away from widespread adoption (Johnson et al., 2011). This early work in learning analytics has identified a range of problems facing the effective use of learning analytics, including: the ephemerality of the data maintained by LMSs (Greenland, 2011); challenges to institutional processes around data ownership (Arnold, 2010); privacy and profiling (Campbell et al., 2007); failure to capture the true messiness of learning (Campbell, 2012; Siemens & Long, 2011); and, other problems around sharing data within and between universities. Effective cross-institutional use of learning analytics requires a foundation that is able to solve these and other associated problems (Objective 1).

Figure 2 represents one conceptual view of this project and its aims. The enabling task (Objective 1) aims to overcome the barriers – institutional, technical, privacy, diversity in the data, etc. – confronting learning analytics work to provide a foundation for further work. In particular, this work seeks to make theoretical contributions that can be used by ourselves and others to address these problems. With this foundation it is then possible to analyse, compare, and contrast available data to identify a range of patterns (Objective 2) that can be used by various stakeholders to develop a range of research projects and teaching interventions.
The neatness of the diagram (e.g., a single foundation and all stakeholders drawing on the same patterns) hides the complexity inherent in learning analytics, the diversity of stakeholder requirements and the ongoing, emergent and collaborative nature of using learning analytics to inform both research and action.

Figure 2. Representation of project aims.

Significance of the research

While there is significant and growing interest in learning analytics there has been limited work that is cross-institutional. In particular, there has been little work – beyond that proclaiming the existence of difficulties – that actively seeks to develop explicit guidance (Objective 1) for enacting cross-institutional learning analytics. In addition, much of the work around learning analytics has emphasised data mining and the production of reports for management. This research project focuses explicitly on providing patterns and insights (Objective 2) intended to serve the requirements of researchers, teachers, and teaching support staff and to contribute strongly to the applicants’ research agenda.

Research method

This project arises from the dual observations of:

  1. the potential of learning analytics to offer new tools, methods and insight for both research and action around learning and teaching; and,
  2. the ongoing uncertainty about how best to navigate the complex issues and relationships surrounding the organisational and inter-organisational implementation and use of learning analytics.

Given these observations, this project will use a design-based research (DBR) methodology to achieve the two stated objectives. Wang and Hannafin (2005, p. 6) define design-based research as:

… a systematic but flexible methodology aimed to improve educational practices through iterative analysis, design, development, and implementation, based on collaboration among researchers and practitioners in real-world settings, and leading to contextually sensitive design principles and theories.

Design-based research offers the means for this project to target both research and action through collaboration and engagement with a real-world setting. Design-based research is built upon a pragmatic paradigm of inquiry to deal with the complexity inherent in e-learning research (Phillips, McNaught, & Kennedy, 2012). Design-based research often uses the iterative process represented in Figure 3.

Figure 3. Design-based research process (adapted from Reeves, 2006, p. 59).

Drawing on a combination of methods (interviews, collaborations, data mining and statistical analysis) this project will iterate through a DBR process generating insights, solutions and theoretical principles required to provide an enabling learning analytics foundation for cross-institutional, collaborative research and action.

Anticipated outcomes

  1. Creation of a learning analytics foundation at USQ that will facilitate additional projects around research and action into understanding and improving learning and teaching at USQ.
  2. Development of design principles informing the creation and use of such a foundation.
  3. At least two research papers – one based on further statistical analysis of CQUni data, one based on the comparison of patterns between the institutions – with the potential for others.
  4. Preparation of external research grant applications (e.g., ARC Linkage scheme).

References

Arnold, K. E. (2010). Signals: Applying Academic Analytics. Educause Quarterly, 33(1).

Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. Curriculum, technology and transformation for an unknown future. Proceedings of ASCILITE Sydney 2010 (pp. 75-86). Sydney. Retrieved from http://ascilite.org.au/conferences/sydney10/procs/Beer-full.pdf

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning, adoption, activity, grades and external factors. Same places, different spaces. Proceedings ASCILITE Auckland 2009. Auckland, New Zealand. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/beer.pdf

Campbell, G. (2012). Here I Stand. Retrieved April 2, 2012, from https://sas.elluminate.com/site/external/jwsdetect/playback.jnlp?psid=2012-03-01.1231.M.0728C08DFE8BF0EB7323E19A1BC114.vcr&sid=2008104

Campbell, J., DeBlois, P., & Oblinger, D. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), 40-42.

Clark, K., Beer, C., & Jones, D. (2010). Academic involvement with the LMS: An exploratory study. In C. Steel, M. Keppell, P. Gerbic, & S. Housego (Eds.), Curriculum, technology & transformation for an unknown future. Proceedings ascilite Sydney 2010 (pp. 487-496). Retrieved from http://ascilite.org.au/conferences/sydney10/procs/Kenclark-full.pdf

Dawson, S., McWilliam, E., & Tan, J. P. L. (2008). Teaching smarter: How mining ICT data can inform and improve learning and teaching practice. Proceedings ASCILITE Melbourne 2008. Melbourne. Retrieved from http://www.ascilite.org.au/conferences/melbourne08/procs/dawson.pdf

Greenland, S. (2011). Using log data to investigate the impact of (a) synchronous learning tools on LMS interaction. In G. Williams, P. Statham, N. Brown, & B. Cleland (Eds.), ASCILITE 2011 (pp. 469-474). Hobart, Australia. Retrieved from http://www.leishman-associates.com.au/ascilite2011/downloads/papers/Greenland-concise.pdf

Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon Report. Austin, Texas: The New Media Consortium. Retrieved from http://csn.edu/PDFFiles/OTS/Website documents/getconnected/HR2011.pdf

Jones, D., & Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In R. J. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers (Eds.), Proceedings ascilite Singapore 2007 (pp. 450-459). Singapore. Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/jones-d.pdf

Kolowich, S. (2012). Traditional colleges aim to boost LMS usage. Inside Higher Ed. Retrieved April 1, 2012, from http://www.insidehighered.com/news/2012/03/21/traditional-colleges-aim-boost-lms-usage

Phillips, R., Maor, D., Cumming-Potvin, W., Roberts, P., Herrington, J., Preston, G., Moore, E., et al. (2011). Learning analytics and study behaviour: A pilot study. In G. Williams, P. Statham, N. Brown, & B. Cleland (Eds.), ASCILITE 2011 (pp. 997-1007). Hobart, Tasmania.

Phillips, R., McNaught, C., & Kennedy, G. (2012). Evaluating e-learning: Guiding research and practice. Milton Park, UK: Routledge.

Reeves, T. (2006). Design research from a technology perspective. In J. van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational Design Research (pp. 52-66). Milton Park, UK: Routledge.

Rossi, D., Beer, C., Janse van Rensburg, H. M., Harreveld, R. E., Danaher, P. A., & Singh, M. J. G. (2011-2012). Learning interactions: A cross-institutional multi-disciplinary analysis of learner-learner and learner-teacher and learner-content interactions in online learning contexts.

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5).

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5-23.