Enabling academics to apply learning analytics to individual pedagogical practice: how and with what impacts?

Thanks to @cj13 for the heads up about the EDUCAUSE analytics sprint. In the midst of moving, conferences, the end/start of term and grant writing I had missed it. Found it interesting that the first thing that struck my eye was a link to this discussion titled “Faculty need how-to information for the data they do have”. It’s interesting because the grant application I’m writing is directly aimed at this general area, though we perhaps have a slightly different take on the problem.

As it happens, I’m about to reframe the “outcomes and rationale” section of the grant. So, rather than lose the existing content I’m going to post it below to share the thoughts and see what interesting connections arise. Some early thoughts on the project are here and we’re aiming for OLT funding.

The project team for this application includes myself, Prof Lorelle Burton (USQ), Dr Angela Murphy (CQUni), Prof Bobby Harreveld (CQUni), Colin Beer (CQUni), Damien Clark (CQUni), and last, but by no means least, Dr Shane Dawson (UBC).

Abstract

Learning analytics is the use of tools and techniques to gather data about the learning process and the use of that data to design, develop and evaluate learning and teaching practice. A significant challenge for learning analytics is the complexity of transforming the data it reveals into informed pedagogical action. This project will investigate how and with what impacts learning analytics can be used to inform individual pedagogical practice. Using Participatory Action Research the project will support groups of academics from two participating universities in using learning analytics to improve their learning and teaching. From this the project will generate insight into how best to use learning analytics to inform pedagogical practice, the impacts of such action, and the types of tools and organisational policies that enable this practice. These insights will be made available through an evolving, online knowledge base and appropriate design guidelines, and encapsulated in a number of supporting tools for an open source LMS.

Rationale

The Society for Learning Analytics Research (SoLAR) defines learning analytics as (Siemens et al., 2011, p. 4):

the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

Interest in learning analytics has been growing rapidly over recent years, with Ferguson (2012) suggesting it is amongst the fastest-growing areas of research within technology-enhanced learning, driven by a combination of technological, pedagogical and political/economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development it is likely to see widespread adoption within higher education in the next 2-3 years. The analysis in the Horizon Report’s technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics in the “one year or less” adoption horizon for the first time, suggesting that Australian universities are particularly interested in “finding new ways to measure student performance, ideally in real-time” (Johnson, Adams, & Cummins, 2012, p. 1).

The rise of learning analytics as a field of research is timely given the broadening participation agenda of the Australian higher education sector. Commonwealth government targets set in response to the Bradley review (Bradley et al., 2008) are ambitious, and are in line with the move from elite to mass higher education globally, particularly in OECD countries (Santiago et al., 2008). For 40% of Australians aged 25-34 to hold (or be progressing towards) bachelor degrees, we will need to enrol and graduate more students. Many of these students will be from non-traditional backgrounds, and would have been considered ‘high-risk’ students in the past. Learning analytics can help us understand the circumstances under which those students are most likely to succeed. But can learning analytics also help guide teachers to make the coalface pedagogical decisions needed to support the success of this larger and more diverse body of students?

To date much of the work on learning analytics in higher education has centred on identifying students at risk of failure and addressing short-term issues to prevent that failure (Johnson & Cummins, 2012). The dominant use of learning analytics within higher education has largely been by management (Dawson et al., 2011) or support staff. The larger promise of learning analytics lies in its use “to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today” (Johnson & Cummins, 2012, p. 23). If correctly applied and interpreted, this practice has implications not only for student performance, but also for the perceptions of learning, teaching and assessment held by educators (Johnson & Cummins, 2012). The ability to correctly apply and interpret the findings of learning analytics in practice is, however, difficult, time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008). The challenge is being able “to readily and accurately interpret the data and translate such findings into practice” (Dawson & McWilliam, 2008, p. 12).

Outcomes

This project seeks to address this challenge by exploring how and with what impacts educators can be enabled and encouraged to effectively and appropriately use learning analytics to inform individual pedagogical practice. In doing so, the project aims to develop the following outcomes:

  1. An online knowledge base that guides educators and institutions in the use of learning analytics to inform individual pedagogical practice.
  2. Enhancements to at least 12 courses across the two participating institutions through the application of learning analytics.
  3. A “harnessing analytics model” that explains how and with what impacts learning analytics can be used to inform individual pedagogical practice, including identification of enablers, barriers and critical success factors.
  4. Design guidelines explaining how to modify e-learning information systems to better enable the application of learning analytics to inform pedagogical practice.
  5. Enhanced and new learning analytics tools for the Moodle LMS based on those design guidelines and integrated with the knowledge base.
  6. Further testing and enhancement of existing trends, correlations and patterns evident in usage data, and identification of new ones.

How

The project aims to develop these outcomes by directly helping groups of teaching academics harness learning analytics to observe and intervene in their teaching. This direct engagement in practice will take the form of cycles of Participatory Action Research (PAR) at the University of Southern Queensland and CQUniversity. The use of PAR will provide the “opportunity for codeveloping processes with people rather than for people” (McIntyre, 2008, p. xii). The PAR cycles will be supported by, and contribute to the evolution of, the knowledge base and will inform the enhancements to learning analytics tools for Moodle – the LMS at both institutions and at over 18 Australian universities. The institution- and LMS-specific interventions developed during the PAR cycles will be reviewed, tested and abstracted into broader models and guidelines that can be used at other institutions and within other e-learning tools. This sharing of insight between context-specific outcomes and broader knowledge will be supported by the active involvement of learning analytics experts from other institutions and the project reference group.
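To make the kind of Moodle learning analytics tool mentioned above a little more concrete, here’s a rough sketch (my own illustration, not part of the application text) of the sort of simple indicator such a tool might surface for a teaching academic: weekly clicks per student, flagging anyone whose most recent week is well below their own earlier average. It assumes a hypothetical pandas-readable CSV export of a course’s activity log with userid, courseid and a unix time column – the file name, column names and course id are all made up for illustration.

```python
# Rough sketch of a simple "engagement dip" indicator from LMS click data.
# Assumes a hypothetical CSV export of a Moodle activity log with columns:
#   userid, courseid, time (unix timestamp)
import pandas as pd

log = pd.read_csv("moodle_activity_log.csv")   # hypothetical export
log["week"] = pd.to_datetime(log["time"], unit="s").dt.to_period("W")

# Clicks per student per week for one (made up) course id
clicks = (log[log["courseid"] == 1234]
          .groupby(["userid", "week"])
          .size()
          .unstack(fill_value=0))

# Flag students whose latest week is well below their own average so far
latest = clicks.columns.max()
baseline = clicks.drop(columns=latest).mean(axis=1)
flagged = clicks[clicks[latest] < 0.5 * baseline]

print(flagged.index.tolist())   # candidates for a closer, human look
```

The point is not the arithmetic – it’s trivial – but that the number only becomes useful when the academic who knows the course decides what, if anything, to do about it.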

References

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian Higher Education. Canberra. Retrieved from http://www.deewr.gov.au/HigherEducation/Review/Documents/PDF/Higher Education Review_one document_02.pdf

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final report 2011. Canberra. Retrieved from http://research.uow.edu.au/content/groups/public/@web/@learnnet/documents/doc/uow115678.pdf

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as indicators of learning and teaching performance. Melbourne: Australian Learning and Teaching Council.

Ferguson, R. (2012). The State of Learning Analytics in 2012: A Review and Future Challenges. Milton Keynes, UK. Retrieved from http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: The New Media Consortium.

McIntyre, A. (2008). Participatory Action Research. Thousand Oaks, CA: SAGE Publications.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., & Ferguson, R. (2011). Open Learning Analytics: An integrated & modularized platform. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf

Starting the re-design of EDC3100

Today marks the formal start of the process of re-designing the course EDC3100, ICTs and Pedagogy. This afternoon is the first meeting with the LITE team (Learning Innovation Teaching Enhancement) and I need to get my head around the process and think about what might make sense to do.

Where are the really good examples?

Just about every teacher preparation program in the world is likely to have some sort of course designed to engage pre-service teachers in the use of ICTs. An important step in this process will be identifying the good examples and stealing ideas from – that is, being inspired by – their example.

So, if you know of some good examples, leave a pointer in the comments below.

What follows is some initial thinking.

Faculty focus

Somewhat necessarily the re-design needs to meet organisational requirements. Here’s what I can make of those.

A major focus is on “ensuring compliance with AITSL and QCT professional standards”, and an important component of this is encouraging the embedding of ICTs within the pedagogy of the courses.

If there is a course within the faculty that should have ICTs embedded within it, EDC3100 should be it. In fact, there’s an argument to be made that EDC3100 should be leading this push.

Apparently there is a “tool” to help embed ICTs within courses in a way that aligns with the professional standards. It appears that this is a table in a Word document that helps map activities and resources from the course against the standards. Given my prior thinking around alignment and mapping projects, I’m not convinced that this sort of approach is a really big help, as it never becomes integrated into everyday practice. It’s a good first move, but has known problems.

I will have to engage with the tool and will aim to do so publicly. But first I’ll have to ask permission to make the tool public.

Current problems

What follows is an ad hoc list of the problems I see with the course (there is some overlap between them).

  • Big assignment.
    A 70% final assignment places too much pressure on students and encourages surface learning.
  • Limited feedback.
    The 29% first assignment is due around week 7, meaning that students get little feedback on their learning in the first half of the course.
  • Traditional pedagogies and walking the walk.
    In a course that talks about the use of ICTs to transform learning and teaching, the learning and teaching in the course has been fairly traditional.
  • Under engagement of students in using ICTs.
    This is a 3rd year course; the students have generally done a fair bit of teaching on prac and covered a fair amount of theory about teaching in other courses. I’m not sure the course currently encourages the students to engage with a broad array of technologies. This lack of awareness of technologies limits what they can use in their teaching. There’s a similar query around the depth of engagement; there’s value in gaining deep experience/knowledge with a tool.
  • The students don’t see themselves as, or act as teachers.
    Mentioned briefly here. There’s a lot of factors here, but the nature of the assessment and the pedagogy doesn’t help.
  • The breadth of sectors.
    Students in this course range from early childhood through to VET teachers. The absence of a direct connection to their context/interest impacts on engagement/motivation etc.
  • The students are generally confined to the course.
    Recent changes in the assessment have addressed this to some extent, but too much of what the students do in this course is still confined to performing for the teachers of the course within the confines of the University environment. That’s not necessarily reflective of the changes being wrought in society by ICTs.
  • Do the synopsis, rationale and objectives fit and serve a purpose?
    It’s probably sensible to assume that if we’re mapping this course against AITSL and QCT standards, then the objectives, synopsis and rationale of the course should have some connection with those standards. After all, that’s fundamental to this alignment theory thing.
  • The professional standards mean nothing to the students.
    If these are seen as important, then they should be more prevalent through the course.
  • The constraints of the LMS and institutional eportfolios.
    From some perspectives, the institution and its perspective of e-learning seems somewhat stuck in the “industrial e-learning” paradigm. The LMS and the institutional policies seem to want to keep online learning constrained to the institutional systems. I’d like to break out of that (see the bit above about walking the walk).
  • Multiple campuses.
    There’s a tendency for students at the different campuses to be separated into groups. This does provide some benefits, but also some tensions.
  • The tyranny of 250+ students.
    Being flexible and innovative with 10 students is much easier than with 250+. Size does matter, both in terms of individual workload and because of institutional risk aversion and the subsequent pressures that brings. Many of the systems and processes around this course don’t work very well with 250+ students, leading to a lot of busy work.

An early vision

What I’d like to do will be impacted upon by a range of constraints including organisational factors, time etc. What follows is an early vision based solely on my prejudices and unsullied by practical concerns or reflection.

  • Students create and maintain their own personal portfolio (on a blog etc.) where they do most of their work during the semester.
  • Assessment may be some variation on contract marking (e.g. Dave Cormier’s approach here) or some other approach that gives the student the ability and responsibility to identify purposeful assessment for them.
  • Perhaps have some sort of badge system (i.e. criteria/standards) outlining the type of learning the students have to demonstrate with an existing set of activities they could do, along with the possibility of creating their own.
    e.g. the professional standards might fit in here “Use ICTs to empower students from diverse backgrounds….etc”
  • Incorporate some form of self and peer assessment into all of this.
  • Have some significant flexibility so that students could choose to focus more on some of the standards. e.g. a special ed teacher might want to focus a bit more on “diverse backgrounds etc” than a more traditional classroom teacher.
  • Have at least some of these tasks require collaboration and also the creation of publicly useful artifacts.
  • Have no fixed sequence in the course, instead structure what resources/activities there are against the standards/statements and allow the students to pick and choose how they attack it.
  • Move the academics’ role in this course away from giving lectures, running tutorials and marking assignments towards participating in the network and responding to student needs.
  • Have some form of technology that supports both the students and the staff in all of this.
    As a man with a hammer (as in “if you have a hammer, everything looks like a nail”) I can see BIM being able to do this with a few tweaks. Seems a logical extension.

I’ll wake up now and think about something that won’t kill me with overwork and threaten both the students and the institution with its difference.

Is there a link between managerialisation and learning analytics?

Another task for today is to put some finishing touches on an application for an OLT grant around learning analytics. The project – some early thinking shown here – is coming together nicely, but could always be better. Some good feedback from Rob Phillips reminded me about what the Horizon Report for Australian Tertiary Education (Johnson et al., 2012, p. 1) has to say about analytics in Oz higher ed:

Experts all over the globe see learning analytics as important, but the Horizon.au advisory board see this technology in 2012 as more imminent than both the New Zealand experts and the global higher education group…In fact, this report marks the first time learning analytics has been voted into the near-term horizon, indicating that Australia is well-positioned for leadership in this area. Secondary research supported the conclusion that Australian institutions are particularly interested in finding new ways to measure student performance, ideally in real-time.

I’m wondering whether my impression that Australian universities are amongst the most “managerialised” of tertiary institutions is a factor in the imminent arrival of learning analytics?

In the context of this question, I found this quote from the Oz Horizon report particularly interesting (Johnson et al., 2012, p. 4) – my emphasis added:

The Australian and the New Zealand panels both agreed the pervasive resistance of academics to the personal adoption and use of new technologies or techniques themselves is a continuing barrier to institutional leadership with any technology.

I can’t help wondering if the causality here is the wrong way around. Is it perhaps the attempts at and methods employed by “institutional leadership with any technology” that is creating the “pervasive resistance of academics”?

That’s certainly what I argue in a series of posts that form the skeleton for an ASCILITE paper submission.

Continuing on from the Oz Horizon report (Johnson et al., 2012, p. 4):

Both felt strongly that for students to learn how to effectively use technology, their teachers and mentors must find ways to embrace and creatively integrate it into their own work.

This is the problem I see again and again in higher education IT, especially examples of “institutional leadership with any technology”. Often the only way for academics to “embrace and creatively integrate it into their own work” is to change what they do to suit the constraints and limited capabilities of the type of “enterprise IT” that is rolled out within Universities.

i.e. they feel like change is being done to them, rather than for or with them.

In terms of analytics, it seems that the data/insights provided by analytics are seen by management as a way to potentially bypass the academics. Through the patterns revealed by analytics, management can see what students are doing (or not) and subsequently address it through institutional approaches that bypass the academics or direct them to do something.

Of course, this is all based on the assumption that the patterns learning analytics reveal actually mean something, or can be interpreted effectively and appropriately without the sort of deep contextual knowledge that “good” teachers have.

References

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. Austin, Texas: The New Media Consortium. Retrieved from http://www.nmc.org/publications/2012-technology-outlook-au

Redesigning the weekly ramble

The reflections on my couple of days at the PLE Conference in Melbourne will have to wait a little longer. Semester starts on Monday and I have some course preparation to engage with. The following is an attempt to capture some thinking about the re-design of the Weekly Ramble.

The old ramble

Essentially a conversation around a collection of resources and activities relevant to the topics for a given week in the EDC3100 course. Designed quickly before the start of last semester, now being re-designed quickly at the beginning of this semester. The idea being that the resources/activities would serve as springboards for further student exploration.

Last semester the rambles were implemented using the Moodle Book module, with one book per ramble per week. A very quick and dirty analysis of the student feedback reveals the following:

  1. Many of the students loved the change away from online lectures which, in their experience, were little more than reading of the slides. The rambles had them being more active.
  2. Many of the students hated the rambles. Their major concerns seemed to be the uncertainty about how much to do, but also the apparent loss of that verbal connection the lecture gave.
  3. Both sets of students tended to mention the difficulty of finding again that great idea or resource they’d seen previously in one of the rambles.

The last point arose from the nature of the Book module, how I used it, and the absence of a decent search facility within Moodle (Aside: this strikes me as a pretty big hole in functionality).
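Until something better exists inside Moodle, one crude workaround is to search the ramble content outside the LMS. The sketch below is just that – a grep-style keyword search over a hypothetical CSV export of the Book chapters. The file name and the column names (book, chaptertitle, content) are made up for illustration; nothing is assumed about Moodle itself beyond the chapter content being HTML.

```python
# Crude keyword search over a hypothetical CSV export of ramble Book chapters.
# Expected (made up) columns: book, chaptertitle, content (chapter HTML).
import csv
import re
import sys

def strip_tags(html):
    """Very rough removal of HTML tags so we search the visible text only."""
    return re.sub(r"<[^>]+>", " ", html)

def search_rambles(path, term):
    term = term.lower()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = strip_tags(row["content"]).lower()
            if term in text or term in row["chaptertitle"].lower():
                print(f'{row["book"]} :: {row["chaptertitle"]}')

if __name__ == "__main__":
    # e.g. python search_rambles.py ramble_chapters.csv "digital citizenship"
    search_rambles(sys.argv[1], sys.argv[2])
```

Hardly a replacement for proper search within Moodle, but enough to answer the “which ramble was that resource in again?” questions from students.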

What follows are my current ideas for solutions to these problems.

From one to many books/activities

Rather than a single book activity for each ramble, the plan is to break it up into very separate activities with (hopefully) meaningful titles, enabling folk to see the content of the ramble from the course site. Perhaps this will evolve into using groupings etc. to give different activities to different groups of students.

Why am I using the book module at all? Why not simply use a range of Moodle activities? Part of the reason is that I doubt I’ve truly yet got the Moodle model. It’s quite a bit different from what I’ve used previously. But it’s also a desire to have a conversation, to give some context to the students before launching them into activities.

Explicit direction

How it all fits together needs to be a bit clearer. The “why” and “so what” questions for the activities need to be clearer for the students. Both through me being more explicit about it, and through encouraging them to answer those questions themselves.

Investigate encouraging teacher identity

A part of making it clearer and having the students engage in this thinking is to move them beyond seeing themselves as students. Even with the rhetoric of “pre-service teacher” it seems that many of them think of themselves as students. And particularly pragmatic students at that.

I’m wondering if, within the constraints of the experience of University life, it is possible to encourage them to identify more as teachers, than students? I also wonder if this is successful whether it will actually encourage changes in practice/behaviour from them? I think this is going to be difficult in that I’m not sure that the assessment (e.g. a 70% final assignment) and other characteristics of this course are conducive to encouraging this mindshift.

Perhaps I should start referring to them as teachers. Wonder how that will go down? Will it annoy some that I’m using that title for people who haven’t been officially accredited? Reminds me of Dr Karl’s habit of calling everyone “Dr”. Is getting the pre-service teachers to think of themselves as teachers the right identity anyway?

Continue with the “Ramble followups” – perhaps with some structure

To address the feeling of loss associated with having no online lectures/tutorials, we started a ramble followup: a gathering in a Wimba room at the end of each week to discuss any questions the students had. While only lightly utilised, it seemed to address some of the concerns. There is perhaps some value in adding a bit more structure to these sessions, especially in terms of encouraging identity formation.

Greater encouragement with external connections

Given the #pleconf experience and the subsequent mini-explosion of connections I have to useful and interesting people and resources for pre-service teachers, it seems sensible to try a bit harder and a bit smarter at encouraging the teachers in this course to connect with other teachers. Perhaps it is establishing these networks that will truly get them thinking as teachers.

Why do (social) networks matter in teaching & learning?

After a week of increasingly intermittent engagements with Twitter I stumbled back into the Twitterverse this afternoon, and one of the first things I saw was this post from @marksmithers. It is Mark’s response to the call for help from @courosa for his keynote at the Melbourne PLE conference next week. Alec’s question is:

Why do (social) networks matter in teaching & learning?

What follows is my response.

Apparent serendipity

It is largely serendipitous that I am posting this. Without Mark being in my network, and me happening to dip back into that network today, I would’ve probably missed this thread altogether. I echo Mark’s thought that, with a network of the appropriate make-up (the balance between similarity and diversity is so difficult to achieve), answers to questions crop up in the network as you need them.

For example, I’ve just about finished teaching this course for the first time. There were a number of times during the course when perusing my Twitter feed would highlight some really good resource or example of a topic I had to “teach” that week.

Teaching by learning

Which brings me to a slight disagreement with Mark, though it is achieved by the typically academic practice of arguing about definitions. Mark wrote:

I’m going to ignore the ‘teaching’ word and just concentrate on the ‘learning’ word because that is far more important and far more enabled by the network.

I’m a fan of Downes’ basic theory of teaching and learning:

to teach is to model and demonstrate, to learn is to practice and reflect

The courses I’m currently teaching are focused on the inter-connection between Information and Communication Technologies (ICTs), networks and pedagogy (teaching/learning). So I am trying to “teach” these courses by showing the students how I learn about using ICTs and networks to teach them. The main approach to doing this is visibly engaging with and constructing networks. This includes reflections on my teaching posted on this blog, comments on Twitter, bookmarks shared via Diigo etc.

I don’t do this as much as I’d like. It’s difficult, but without these (social) networks it would be far more difficult to share this activity with the students.

Teaching as making connections

The flipside of Downes’ definition is “to learn is to practice and reflect”. Having students engage in (social) networks while they are learning is a great way of making this visible. Something I’m struggling with, and hoping to increase significantly over time.

I’ve often thought Erica McWilliam’s concept of the “meddler in the middle” (as opposed to the “sage on the stage” or the “guide on the side”) might be an apt metaphor for this, at least as I conceptualise it potentially working in a networked “classroom” (which is really not separated at all from the broader world). i.e. with students actively engaged with (social) networks, their practice and reflection – perhaps their knowledge – becomes visible. This enables a teacher to meddle in their network, but also more broadly in the whole class’s network, by encouraging students to engage in activities that lead them to make new connections.

Perhaps more importantly, it opens up the possibility of other students and people from outside the class to encourage students to engage in activities that lead them to make new connections.

Reducing isolation

Earlier this term I observed a student in my course struggling with a particular task in a tutorial, one that most other students had completed. As I watched, the student continued struggling, not making connections with the knowledge needed to complete the task or with other students. I did intervene, but I do wonder just how many of the 250+ students in this course had moments like this?

Following on from the above, I believe/hope that by making students’ (social) networks more visible it is possible to reduce this sense of isolation.