As part of the USQ Technology Demonstrator Project (a bit more here) we’ll soon be able to play with the Moodle Activity Viewer. As described by the VC, the Technology Demonstrator Project entails
The demonstrator process is 90 days and is a trial of a product that will improve an educator’s professional practice and ultimately motivate and provide significant enhancement to the student learning journey.
The process develops a case study which in turn is evaluated by the institution to determine if there is sufficient value to continue or perhaps scale up the project. As part of the process I need to “articulate what it is you hope to achieve/demonstrate by using MAV”.
The following provides some background/rationale/aim on the project and MAV. It concludes with an initial suggestion for how MAV might be used.
Rationale and aim
In short, it’s difficult to form a good understanding of which resources and activities students are engaging with (or not) on a Moodle course site. In particular, it’s difficult to form a good understanding of how they are engaging within those resources and activities. Making it easier for teaching staff to visualise and explore student engagement with resources and activities will help improve their understanding of student engagement. This improved understanding could lead to re-thinking course and activity design. It could enhance the “student learning journey”.
It’s hard to visualise what’s happening
Digital technologies are opaque. Turkle (1995) talks about how what is going on within these technologies is hidden from the user. This is a problem that confronts university teaching staff using a Learning Management System. Identifying which resources and activities within a course website students are engaging with, which they are not, and which students are engaging can take a significant amount of time.
For example, testing at USQ in 2014 (for this presentation) found that once you knew which reports to run on Moodle, you had to step through a number of different reports. Many of those reports involved waiting minutes (in 2016 the speed is better) at a blank page while the server responded to the request. Even after that delay, you can’t focus only on student activity (staff activity is included) and it won’t work for all modules. In addition, the visualisation provided is limited to tabular data – like the following.
Other limitations of the standard reports include being unable to:
- Identify how many students, rather than how many clicks, have accessed each resource/activity (the sketch below illustrates this distinction).
- Identify which students have/haven’t accessed each resource/activity.
- Generate the same report within an activity/resource to understand how students have engaged within that activity/resource.
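To make the clicks-versus-students distinction concrete, here is a rough sketch of how the two counts differ when computed from a log of course activity. This is not MAV’s or Moodle’s code; the CSV file and its `userid`/`activity` columns are assumptions for illustration only.

```python
# Rough sketch only: "clicks" vs "distinct students" per activity,
# computed from a hypothetical CSV export of a Moodle course log.
# Assumed columns: "userid" and "activity" (real Moodle logs differ).
import csv
from collections import defaultdict

clicks = defaultdict(int)       # total clicks per activity
students = defaultdict(set)     # distinct students per activity

with open("course_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        activity = row["activity"]
        clicks[activity] += 1
        students[activity].add(row["userid"])

for activity in clicks:
    print(f"{activity}: {clicks[activity]} clicks, "
          f"{len(students[activity])} students")
```

One very active student can generate hundreds of clicks, so the two numbers can tell quite different stories about engagement.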
Michael de Raadt has developed the Heatmap block for Moodle (inspired by MAV) which addresses many of the limitations of the standard Moodle reports. However, it does not (yet) enable the generation of an activity report within an activity/resource.
The alternative – Moodle Activity Viewer (MAV)
This particular project will introduce and scaffold the use of the Moodle Activity Viewer (MAV) by USQ staff. The following illustrates MAV’s advantages.
MAV modifies any standard Moodle page by overlaying a heat map on it. The following image shows part of a 2013 course site of mine with the addition of MAV’s heatmap. The “hotter” (more red) a link is coloured, the more times it has been clicked upon. In addition, the number of clicks on each link is added in brackets.
Switching a MAV option will modify the heatmap to show the number of students, rather than clicks. If you visit this page, you will see an image of the entire course site with a MAV heatmap showing the number of students.
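To give a sense of how such an overlay might work, here is a toy sketch of colouring a link by how “hot” it is and appending the bracketed count. This is not MAV’s actual implementation; the colour scale, function names and the unit label are illustrative assumptions only.

```python
# Toy illustration only (not MAV's code): colour a link by how "hot" it is
# and append the click/student count in brackets, as the MAV overlay does.
def heat_colour(count, max_count):
    """Interpolate from yellow (cold) to red (hot)."""
    if max_count == 0:
        return "#ffff00"
    ratio = min(count / max_count, 1.0)
    green = int(255 * (1 - ratio))   # less green => more red
    return f"#ff{green:02x}00"

def label(link_text, count, by_students=False):
    unit = "students" if by_students else "clicks"
    return f"{link_text} ({count} {unit})"

# Example: a forum link visited by 87 students, out of a maximum of 120
print(heat_colour(87, 120), label("Introduce yourself forum", 87, by_students=True))
```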
The current major advantage of MAV is that the heatmap works on any standard Moodle links that appear on any Moodle page. This means you can view a specific resource (e.g. a Moodle Book resource) or an activity (e.g. a discussion forum) and use the MAV heatmap to understand student engagement within that activity.
The following image (click on it to see larger versions) shows the MAV heatmap on a discussion forum from the 2013 course site above. This forum is the “introduce yourself” activity for the course. It shows that the most visited forum post was my introduction, visited by 87 students. Most of the other introductions were visited by significantly fewer students.
This illustrates a potential flaw in this activity’s design: students aren’t reading many of the other introductions, perhaps suggesting a need to redesign the activity.
Using MAV
At CQU, MAV is installed and teaching staff can choose to use it, or not. I’m unaware of how much shared discussion occurs around what MAV reveals. However, given that I’ve co-authored a paper titled “TPACK as shared practice: Toward a research agenda” (Jones, Heffernan, & Albion, 2015), I am interested in exploring whether MAV can be leveraged in a way that is more situated, social and distributed. Hence the following approach, which is all very tentative and initial. Suggestions welcome.
The approach is influenced by the Visitor and Resident Mapping approach developed by Dave White and others. We (I believe I can speak for my co-authors) found using an adapted version of the mapping process for this paper to be very useful.
- Identify a group of teaching staff and have them identify courses of interest. Staff from within a program or other related group of courses would be one approach, but a diverse group of courses might help challenge assumptions.
- Prepare colour print outs of their course sites, both with and without the MAV heatmap.
- Gather them in a room at a set time and ask them to bring along laptops (or run it in a computer lab).
- Ask them to mark up the clear (no MAV heatmap) print out of their course site to represent their current thoughts on student engagement. This could include:
  - Introducing them to the ideas of heatmaps and engagement.
  - Some group discussion about why and with what students might engage.
  - Development of shared predictions.
- A show and tell of their highlighted maps.
- Hand out the MAV heatmap versions of their course site and ask them to analyse and compare. Perhaps including:
  - Specific tasks for them to respond to:
    - How closely aligned is the MAV map with your prediction?
    - What are the major differences?
    - Why do you think that might be?
    - What else would you like to know to better explain the differences?
  - A show and tell of the answers.
- Show the use of MAV live on a course site, showing:
  - changing between the number of clicks and the number of students
  - focusing on specific groups of students
  - generating heatmaps on particular activities/resources and what that might reveal
- Based on these capabilities, engage in some group generation of questions that MAV might be able to help answer.
- Walk through the process of installing MAV on their computer(s) (if required).
- Allow time for them to start using MAV to answer questions that interest them.
- What did you find? Group discussion around what people found, what worked, what didn’t, etc., including discussion of what might need to be changed about their course/learning design.
- Final reflections and evaluation.