Assembling the heterogeneous elements for (digital) learning

Month: December 2012

Backup for BIM 2.0

What follows is a journal of the attempt to bring BIM 2.0’s backup functionality into line with the new approach in Moodle 2.x.

Done. Appears to be all working. Will work on restore next and do some testing.


First up is trying to understand the developer docs on the new backup process. What follows is an attempt to both summarise/understand those docs and explain what changes I’ve made to BIM 2.0. The Backup 2.0 general architecture documents are also used.

What I believe it all boils down to is the ability to convert the database structure for a BIM activity into an XML file/structure. The aim here will be to keep the XML structure produced as close to that produced by 1.9 as possible.

Steps required

  1. Preparation – knowing what to backup
    Much of this is done in the “BIM data” section below.
    1. Draw the DB schema.
    2. Identify where the user information is located in the schema.
    3. Determine correct order of backup.
    4. Identify attributes and elements.
      All “id” fields should be attributes.
    5. Identify not needed elements.
      Any field except those in parent elements should be included.
    6. Identify the file areas used.
      Text fields and attachments appear to fit into this category. This appears to be a bit new in Moodle 2.
    7. Annotating important bits
      e.g. the ID fields.
  2. Remove the old backup stuff.
    Basically backuplib.php in the bim directory.
  3. Tell Moodle that BIM 2.0 is supporting backups.
    Add the following to mod/bim/lib.php: [code lang="php"]case FEATURE_BACKUP_MOODLE2: return true;[/code]
  4. Set up the directory for the code
    create mod/bim/backup/moodle2
  5. Set up and test the backup process (which won’t work at the moment).
    The backup documentation includes a simple script that speeds up the develop/test cycle for backups. Put that in place and run it. Breaks as expected.
  6. Start putting in the code
    1. create empty mod/bim/backup/moodle2/backup_bim_settingslib.php
    2. backup_bim_stepslib.php – another empty file
    3. backup_bim_activity_task.class.php – the file where the above two are used. For now just some skeleton code with empty methods.
  7. Run the backup again.
    Which runs without error as expected.
  8. Create the bim.xml file – as an empty file
    • Put some empty skeleton code into backup_bim_stepslib.php.
    • Call the method from the steps file in backup_bim_activity_task.class.php.
  9. Now to define each of the elements – essentially a translation of the provided code using the description of the bim data below. This produces an empty backup file for bim.
  10. Define the tree of data following the skeleton code.
  11. Connecting it all to the database
    A fairly simple set of method calls building on the above. Tested and all seems to be working. Woo hoo!
  12. Annotating IDs
    This appears to be related to signposting user (and other) information, something I missed the first time.
    For BIM, the relevant fields to annotate are user and group.
  13. Annotating files
    Not sure about this section. Need to read some more and update.
  14. Encode references to URLs?
    Done as per example.
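A sketch of the task class from steps 6 and 7 follows. This is only a skeleton based on the pattern in the Moodle 2 backup documentation (the standard modules are the usual examples); the step name and the link-encoding regexp are assumptions that would need checking against the actual BIM code.

```php
// mod/bim/backup/moodle2/backup_bim_activity_task.class.php - skeleton only,
// following the pattern in the Moodle 2 backup docs.
require_once($CFG->dirroot . '/mod/bim/backup/moodle2/backup_bim_stepslib.php');
require_once($CFG->dirroot . '/mod/bim/backup/moodle2/backup_bim_settingslib.php');

class backup_bim_activity_task extends backup_activity_task {

    protected function define_my_settings() {
        // No particular settings for the bim activity.
    }

    protected function define_my_steps() {
        // The single structure step that generates bim.xml.
        $this->add_step(new backup_bim_activity_structure_step('bim_structure', 'bim.xml'));
    }

    // Rewrite links to this activity so restore can re-encode them.
    static public function encode_content_links($content) {
        global $CFG;
        $base = preg_quote($CFG->wwwroot, '/');
        $search = '/(' . $base . '\/mod\/bim\/view.php\?id\=)([0-9]+)/';
        return preg_replace($search, '$@BIMVIEWBYID*$2@$', $content);
    }
}
```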

BIM Data

The following is based on this 2010 post documenting the development work on the backup process for BIM 1.0. With some extra work based on the preparation information from above.

The bim data hierarchy (bullet points represent table names)

  • bim
    id (attr)
    course (not needed) **** CHECK IF THIS IS INCLUDED ****
    intro (????file area???)

    • bim_group_allocation
      id (attr)
      bim (not needed)
    • bim_questions
      id (attr)
      bim (not needed)
      title (???? file area ???? )
      body (??? file area????)
    • bim_student_feeds
      id (attr)
      bim (not needed)
    • bim_marking
      id (attr)
      bim (not needed)
      question (this is an id back into bim_questions)
      link (???file area??)
      title (??file area??)
      post (?? file area?? )
      comments (?? file area??)

In BIM 1.0 the user data includes: student feeds, marking and group allocation.
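Translated into the Backup 2.0 API, the hierarchy above might look something like the following for the bim and bim_group_allocation tables. The non-id field names here are guesses – the real lists should come from mod/bim/db/install.xml.

```php
// backup_bim_stepslib.php - a sketch only; field lists are assumed.
class backup_bim_activity_structure_step extends backup_activity_structure_step {

    protected function define_structure() {
        // Will user data (feeds, marking, allocations) be included?
        $userinfo = $this->get_setting_value('userinfo');

        // Define each element: id as an attribute, parent keys excluded.
        $bim = new backup_nested_element('bim', array('id'),
            array('name', 'intro', 'introformat'));

        $allocations = new backup_nested_element('group_allocations');
        $allocation = new backup_nested_element('group_allocation',
            array('id'), array('groupid', 'userid'));

        // Build the tree following the hierarchy above.
        $bim->add_child($allocations);
        $allocations->add_child($allocation);

        // Connect the elements to the database tables.
        $bim->set_source_table('bim', array('id' => backup::VAR_ACTIVITYID));
        if ($userinfo) {
            $allocation->set_source_table('bim_group_allocation',
                array('bim' => backup::VAR_PARENTID));
        }

        // Annotate the ids that reference users and groups...
        $allocation->annotate_ids('user', 'userid');
        $allocation->annotate_ids('group', 'groupid');

        // ...and the file areas (intro uses the standard area).
        $bim->annotate_files('mod_bim', 'intro', null);

        // Return the root element wrapped in the activity structure.
        return $this->prepare_activity_structure($bim);
    }
}
```

bim_questions, bim_student_feeds and bim_marking would each get their own nested elements, sources and annotations following the same pattern, with the file-area annotations resolving the ???? markers above.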

Major (Moodle) requirements for BIM 2.0

The next step in the development of BIM 2.0 is identifying the list of major (Moodle) requirements that need to be implemented. BIM is a Moodle activity module. Moodle has a range of expectations that such modules are meant to meet. The following is an attempt to identify what needs to be done.

It has resulted in a renewed effort to use the github issue list to record and manage what needs to be done. Not only have I started adding issues for BIM 2.0, I’ve also been through the old issues and decided which apply to BIM 2.0.

In short, some major work to be done to get backup/restore migrated. Some minor tweaks (it appears) to get gradebook integration working. Logging is working as is.

Summary of the requirements

A summary of what was found follows. It includes some compulsory/important requirements, and also some that would be nice future additions.

What has changed?

Now to find out what has changed in the requirements that have to be addressed now.

Backup and restore

This has definitely been changed. It’s listed in the migrating CONTRIB code document.

backuplib.php is now replaced with a backup directory. It also appears to be a more OO-based approach. Some major re-work to be done here. Will leave this to another post.


Gradebook integration

This isn’t working. Any attempt to turn on the BIM gradebook integration generates an error on line 313 of lib.php due to a problem with a database insert:

Debug info: Column ‘grademax’ cannot be null
INSERT INTO mdl_grade_items

The question will be whether this is a problem in BIM or evidence that the Gradebook API has changed significantly.

According to the Gradebook API there should be a mod/bim/grade.php file. Certainly not one in BIM 1.0. But then the forum module doesn’t have one either and yet it does use the gradebook, so it would appear to be optional.

grademax can be changed in the gradebook, but the help text located there suggests it should be set on the activity settings page. i.e. I need to add the ability to set grademax on the BIM config screen.

This has identified that the problem is that the existing BIM code does not provide a value for the grademax field for the gradebook. It appears that the Moodle 2.x code requires that this not be null.

Actually, the BIM 1.0 code doesn’t seem to have this set. A mystery change? Perhaps some boilerplate with a search and replace I put in place when setting up BIM 2.0? Moodle 1.9 doesn’t seem to have required a grademax value. So what does grademax imply?

Common sense would seem to imply the maximum value that can be entered into the gradebook for this component. BIM currently asks for maximums for each question, so a grademax could be calculated. The problem is that BIM only uses the maximums to generate a warning, it doesn’t enforce it. If the gradebook enforces grademax, then this could create some dissonance with BIM’s operation.

As it happens the hard coding of grademax to 10 results in gradebook integration. Or at least the activity being added to the gradebook. When I try to release some results (which includes adding marks to gradebook) I get a coding error which I’ll need to fix. Have added this to the to do list.

Will leave working on this until later.

It also suggests that in lib.php the bim_supports function should report that it supports FEATURE_GRADE_HAS_GRADE. I’ll add that for now.
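Pulling the above together, a hedged sketch of what the grade item update might look like with grademax supplied. The function name follows the standard modname_grade_item_update convention from the Gradebook API; whether BIM's actual code is structured this way is an assumption.

```php
// mod/bim/lib.php - a sketch only.
function bim_grade_item_update($bim, $grades = null) {
    global $CFG;
    require_once($CFG->libdir . '/gradelib.php');

    $params = array(
        'itemname'  => $bim->name,
        'gradetype' => GRADE_TYPE_VALUE,
        'grademax'  => 10,   // hard coded for now, per the text above;
                             // should eventually come from the BIM settings
        'grademin'  => 0,
    );
    return grade_update('mod/bim', $bim->course, 'mod', 'bim',
        $bim->id, 0, $grades, $params);
}

// And in bim_supports():
//     case FEATURE_GRADE_HAS_GRADE: return true;
```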

There are also a few examples that provide some extra code missing from BIM. Will add that as well.


Logging

The logging API in Moodle is likely to be replaced in a little while as part of an increasing emphasis on logging, analytics etc. The new work includes some references which could be used to inform a rethinking of BIM logging. This is one of my areas of interest.

But at the moment, the current BIM logging is working. At least there are BIM entries being added into the dummy course that I’ve been testing with.

Bug fix and to do for BIM

After a short Xmas break it’s time to continue work on getting BIM 2.0 up and going. In this post I’m trying to continue the work from a week or so ago. The main aim is to fix a bug with the manage marking page.

Status: The manage marking bug has been fixed. Mostly related to further migration work from Moodle 1.9 to Moodle 2.x.

The manage marking bug

The bug is summarised nicely by the following screen shot from the last post.

Manage marking has an error

There appears to be a problem with one of the data structures that results in BIM crashing and burning. There’s some evidence of an earlier attempt to investigate this, so time to revisit prior posts on BIM development. This post identifies the location of the problem.

The problems are all related to the changes in the database API from 1.9 to 2.x.

These are fixed. get_all_marker_stats is working; however, the display of the data also needs fixing – replacing the flexible table with an HTML table.

To do

  • Table of unregistered students is showing some number (student id?) that shouldn’t be there.
  • It isn’t showing the left hand column.
    A broken div

Unregistered students

A few of the pages display a table of students who have not registered their blogs. This needs to be updated to html_table.

  1. Find where it is shown.
    Done. It’s shown using bim_setup_details_table with the last parameter being unregistered – once in the marker code and twice in the coordinator code.
  2. Identify the fix
    • replace the keyed-data additions with something like
      [code lang="php"]$table->data[] = array( $row['username'], $row['name'], $row['email'], $row['register'] );[/code]
    • replace $table->print_html() with
      [code lang="php"]echo html_writer::table( $table );[/code]
  3. Fix each of those.
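For reference, the whole replacement pattern might look something like this. The $rows variable and its keys are taken from the snippet above; the header labels are assumptions.

```php
// Moodle 2 replacement for the old flexible/print_html table code.
// $rows is assumed to be the array of student records used above.
$table = new html_table();
$table->head = array('Username', 'Name', 'Email', 'Registered');  // assumed labels
foreach ($rows as $row) {
    $table->data[] = array($row['username'], $row['name'],
                           $row['email'], $row['register']);
}
echo html_writer::table($table);
```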

Help text for Manage Marking

The problem with manage marking seems to have delayed the provision of the help text. Need to add that in.

Only the one, but there does appear to be some scope to provide more detailed help messages throughout.

To do list

This post has a list of what was working and not with the coordinator interface and a later post updates some of this. Need to revisit these and start a list in basecamp.

Misc to do

  • Manage marking
    • view students with the missing status appears to be showing a student who has 1 question that has been marked. What is the meaning of the MISSING status?
  • Re-visit the use of tables and how implemented.
  • Help messages
    • Check out other help icons in coordinator views.
    • Think about providing more detailed help by sprinkling help icons throughout all of the views.
    • Look into how some of the older help text can be reused.

BIM: another restart?

The following is essentially an activity log/diary or the first steps of getting back into work on BIM. I’m hoping to have it ready to work with some course redesign I’m working on, but timelines may make that difficult.

The aim of this is to get the current version of BIM for Moodle 2.x up and running with Moodle 2.4+. The next step will be to determine what work needs to be completed on BIM and what new features might be useful.

In summary, it’s surprisingly functional as is, much more than I remembered.

Download and install Moodle 2.4

Moodle 2.4+ downloaded from here

Stick it in an m24 directory under xampp and follow the instructions.

All installed.

Installing bim2

And now to get bim2 off github. Mm, 8 months since I worked on the code. Not good.

[code lang="bash"]mkdir bim
cd bim
git clone
mv BIM/* .
mv BIM/.git .
rm -rf BIM[/code]

Task: I really need to look into the naming of that folder and using of git so there’s no need to play with the file structure.
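On that task: git clone can name both the branch and the target directory, so the file shuffling above isn't needed. A minimal sketch – illustrated with a throwaway local repository so it can be run anywhere; the real invocation would use the GitHub URL in place of src.

```shell
# git clone can take the branch and the target directory directly,
# removing the need for the mv shuffle above. The real command would be
# something like:  git clone -b bim2 <repository-url> bim
# Illustrated with a throwaway local repository:
tmp=$(mktemp -d)
cd "$tmp"
git -c init.defaultBranch=main init -q src
git -C src -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"
git -C src branch bim2
git clone -q -b bim2 src bim    # branch bim2, into a directory named "bim"
git -C bim rev-parse --abbrev-ref HEAD    # prints: bim2
```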

Visit the local Moodle website, picks up BIM ready to install. Oops, error.

Plugin “mod_bim” is defective or outdated, can not continue, sorry.
Debug info: Missing mandatory en language pack.
Error code: detectedbrokenplugin

That’s because I didn’t clone the bim2 branch:

[code lang="bash"]sudo git clone -b bim2[/code]

And that has updated successfully. Now does it actually work?

Testing it out

Ohh, pretty new interface for Moodle 2.4. Looks like the BIM icon will need to be updated to work with the slightly bigger and different design for the module icons. (Click on the following images to see bigger versions.)

After adding the activity you need to enter the basic configuration details

Add some questions that the students will blog in response to.

What about allocating markers to mark the influx of posts?

No users allocated to the course, so nothing there. Nice to see I’d thought of this condition. Time to allocate some students and teaching staff. So students and staff are now enrolled in the course. Can I manage marking now?

Not yet. I need to create some groups for the course. Markers aren’t allocated individual students within BIM. They are allocated groups.

So with groups allocated, I can allocate a marker. Can I manage the markers? The coordinating teacher can see a list of all the markers and what they have (or haven’t) marked yet.

Oops, that’s the first error in the code. Will have to revisit that.

Can I see the students I have to mark as a marker? This is the overview. It shows which of my students have registered their blogs (and for which I can mark something) and which haven’t yet.

Now, let’s see if I can do some marking.

Not really because none of the posts from this single student have been allocated to one of the set questions. I’ll need to allocate one of his posts to a question using the “allocate question” screen.

Now I should be able to mark that allocated question

Student perspective

So, does it work from the student’s perspective? Does the activity show up when they login to the course?

Can they register their blog?

Does it actually work as expected?

What’s next?

Time for a road trip. So no progress for a few days, after that it will be revisiting what outstanding tasks are left to make this truly useful. Gradebook integration is probably the top of the list. Backup/restore may be the next step.

Why Moneyball is the wrong analogy for learning analytics

Learning analytics is one of the research areas I’m interested in. Consequently, I’ve read and listened to a bit about learning analytics over recent times. In that time I’ve often heard Moneyball used as an example or analogy for learning analytics.

I can see the reason for this. It’s a good example of how data can inform decision making in a field many people (especially those in America) are familiar with. Having a best selling book that’s turned into a Brad Pitt movie doesn’t hurt either. But I think it’s the wrong analogy for learning analytics.

Moneyball by Kei!, on Flickr
Creative Commons Attribution-Share Alike 2.0 Generic License  by  Kei! 


As it happens I’ve been reading Nate Silver’s book The Signal and the Noise: Why most predictions fail but some don’t over recent weeks and I’ll use it to make my case. Silver has had success in applying “analytics” to make predictions in both baseball and US politics, and in the book he talks to experts from a range of fields about predictions. Through this process he concluded

I came to realize that prediction in the era of Big Data was not going very well

One of the reasons he gives is

Baseball, for instance, is an exceptional case. It happens to be an especially rich and revealing exception

Why? Well one reason is given when talking about economics, a discipline with a poor track record when it comes to predictions.

This isn’t baseball, where the game is always played by the same rules.

If you don’t play by the rules set down in baseball you are going to get pulled up. What are the rules for learning? How can you be sure that each of the students are aware of the rules, interpreted them the same way, and are playing by them?

A little further on in Silver’s book comes this

The third major challenge for economic forecasters is that their raw data isn’t much good

If the raw data isn’t much good, any predictions you make based on that data are going to have some flaws.

How good is the data in learning? Well, in the face-to-face classroom it’s next to non-existent. At least in the hard, quantitative, consistent form required for most learning analytics. If it’s e-learning, well the data is currently limited to usage logs from the LMS, which are at best a vague indicator of what’s going on.

Intelligent Tutoring Systems tend to solve these problems by having a fixed set of rules (a model) of learning and learners in a particular area. These rules, however, would appear to limit the adoption of the system. How many other contexts can these rules be applied to? Can you actually create such rules for all contexts?

I’m not convinced you can. Especially when broader trends are pushing for an increasingly diverse set of students, but also when learning is seen as a broader, more open and individual happening. Are there “rules” for learning that are broadly applicable?

What the economics analogy suggests for learning analytics

Is learning analytics about prediction? I’d argue that largely it is. You want to understand what is happening and make predictions about what will happen next. If the learner isn’t going to learn, you want to know that and be able to intervene. You want to make predictions that enable intervention.

The lack of success in prediction in economics suggests that the future of learning analytics may not be bright. At least if it relies on the same models and assumptions as economics. So what needs to change?

Beyond the early adopters of online instruction: Motivating the reluctant majority

The following is a summary and some reflection on Hixon et al (2012). I’m particularly interested in this topic due to my belief (based on 20 years of experience and observation) that most institutional approaches to change in learning and teaching have only been successful in moving the same 10% of staff. A 10% that didn’t need a lot of help in the first place.

Thoughts and to do

Fairly disappointed with this paper. Didn’t engage at all with the perspective of Geoghegan (1994) who took the implications from the adopters categories a lot further and questioned some of the fundamental assumptions.

Misc thoughts, questions and to do

  • Is there anyone doing interesting research/thinking around the inherent diversity in academics and the inherent consistency in what passes for institutional e-learning?
    Looking at the work referencing Geoghegan would probably be a good start.
  • How/what does the increasingly universal adoption of e-learning in Oz Universities imply for the adopter categories and from there how e-learning is supported and the subsequent quality of it?
  • Can learning analytics of LMS usage identify/support the adopter categories? Or at the very least some difference between staff?
  • Look at the conceptual paper (Barczyk et al, 2010) that informed the survey

In the following, where I remember, my comments are emphasised. Other text is a summary of Hixon et al (2012)


Now that most of the innovators and early adopters of online instruction are comfortably teaching online, many institutions are facing challenges as they prepare the next wave of online instructors. This research examines how faculty in this “next wave” (the majority of adopters) differ from the innovators and early adopters of online instruction. A specific online course development program is described and the experiences of the “majority” in the program are examined in relation to the experiences of previous participants (the innovators and early adopters).

There is probably a refinement to be made here. There are a number of universities in Australia where the majority of, if not all, courses are taught online. These institutions already have the “next wave” online. The problem though is that the quality of the online experience leaves a great deal to be desired.


More folk have to move online. This study was designed to help inform best practices “in bringing the ‘majority’ online”.

Based on the Distance Education Mentoring program at a Midwestern university – a cohort-based mentoring program to help faculty develop an online course. Over four years of operation it’s been noted that faculty participants are changing. Looks at Rogers’ Diffusion of Innovation theory to understand the changes. A few paragraphs explaining DOI, getting to the categories of adopters. Moves on to some literature using it to explore technological innovations.

Interestingly, they don’t reference Geoghegan (1994) who used Moore’s extension of these to make some points along these lines. Interesting largely because Geoghegan is one of my theoretical/literature “hammers”.

Approaches to development of online courses

Posits two approaches

  1. faculty-driven approach;
    An approach that can work if faculty have the skills.
  2. collaborative approach.
    Seen as the solution to overcome the difficulties (especially pedagogical) of the faculty-driven approach.

I’d argue that the same observation (lack of skills) can be made with the collaborative approach. I’ve been in situations where the “collaborators” haven’t had the necessary skills either. What isn’t explicitly noted in the above is that both normally assume that course development is separate from teaching. A course is redesigned before/after teaching.

Struggles faced by academics from the literature include

  • learning the necessary skills;
    A contributing factor here is the really poor quality of the technical tools. Some of that difficulty arises because the tools are from another context and don’t match the local context.
  • adapting the pedagogic strategies for the online environment;
    At the same time workload formulas, room allocation, legal requirements, policies and student prejudices mitigate against the adoption of those pedagogical strategies.
  • conceptualising their course for the new environment;
  • finding increased time required to develop quality online courses.
    Wonder if the faculty had developed “quality” face-to-face courses? What’s the source of this difficulty online?

The assumption here is that it is course development that must be collaborative. What would it look like, how would it happen and what would be the impacts if the course delivery process was collaborative? i.e. don’t assume that faculty skill-development and course redesign only occurs before the course is taught. What happens if as I’m teaching the course I make changes and am able to learn more about what works. More importantly, that the organisational e-learning systems/policies/processes can learn more.


The mentoring program “is design[ed] to educate and certify faculty members in the principles of instructional design”. Each faculty participant (protege) has a mentor from outside the discipline. Uses the Quality Matters rubric. More detail given on how it works.

Courses are taught and then evaluated and given a pass/fail based on the rubric.

By the 3rd year of the program, noticed participants “are hesitant, or even resistant, to consider new approaches and technologies, or even to teach online”. Which is argued to fit the idea that they are “the majority”. Would be interesting to dig further into this. Was this “majority” required to participate?

Program changed in fourth year. More structure. More defined structure in the online course they complete. Submission by specific deadlines. Formal meeting schedule. A contract required to be signed. Mmm, not a great fan of that. I wonder if they actually looked at what Rogers and others have said about the characteristics of the majority and if the changes to the program were based on those insights? e.g. their communication networks tend to be vertically oriented, so wouldn’t having a mentor from outside the discipline be a poor match?

Research questions

  1. In what ways do faculty members participating in the 4th offering differ from prior offerings?
  2. In what ways do the experiences/perceptions of the 4th years differ?

Survey questionnaire developed to ask: skill development, mentoring relationship, its effectiveness, perceptions of teaching as a result, program satisfaction, general beliefs about online education. Wouldn’t connecting this to DOI have been sensible?

47 of the 92 proteges completed the survey: 27% of year 1, 52% of year 2, 57% of year 3, 58% of year 4.

Responses from years 1-3 were combined for comparison. But they’ve said they noticed differences in year 3?


Those in the first three years had been teaching longer than the 4th years. Similarly, the earlier group had higher ranks. Oh dear, it appears the 4th years might be “junior” academics fighting to establish themselves as researchers, forced to engage with the program.

Year 4 less likely to identify as early adopters of technology. Continuing stereotypes would have required age to be mentioned by now, wouldn’t it? No significant difference in age.

Both groups were equally looking forward to the program.

Year 4 group reported more benefit from the online course. Well, this measures the changes in the course, rather than the people. Both groups were similarly satisfied with the program.


References

Barczyk, C., Buckenmeyer, J., & Feldman, L. (2010). Mentoring professors: A model for developing quality online instructors and courses in higher education. International Journal on E-Learning, 9(1), 7–26.

Geoghegan, W. (1994). Whatever happened to instructional technology? In S. Bapna, A. Emdad, & J. Zaveri (Eds.), (pp. 438–447). Baltimore, MD: IBM.

Hixon, E., Buckenmeyer, J., Barczyk, C., Feldman, L., & Zamojski, H. (2012). Beyond the early adopters of online instruction: Motivating the reluctant majority. The Internet and Higher Education, 15(2), 102–107. doi:10.1016/j.iheduc.2011.11.005

Developing personal learning networks for open and social learning

The following is a summary and touch of reflection on Couros (2010) and is another step in thinking about the design/implementation of a course I’m working on.

Thoughts and to do

As expected a good overview/rationale for the type of approach I’m interested in exploring with EDC3100. Some interesting departures to think about. For example,

  • Alec’s course had 16 registered students, mine will have 200+ (first semester), perhaps another 80+ (second semester) and possibly have to be taught by someone else in semester 3.
  • Alec’s much more effective and engaged with his PLN than I am.
  • Alec’s course is post-graduate, mine is under graduate.

to do

  • Look at Tabak’s (2004) concept of distributed scaffolding.
  • Engage in an analysis of the learning environment available for EDC3100. Is Moodle appropriate? Would a self-hosted WordPress be a better fit? Having 200, rather than 20, registered students is an argument for Moodle (perhaps).
  • Think about the question about whether to be overly explicit in terms of what students should post to the blog, or take the more open approach.
  • How many of the student blogs are still active today?
  • Over time it appears there’s been a move away from the Wiki assessment, I wonder why that is?
  • Is it still difficult/different to read social media?


Tells the story of EC&I831 an open access, graduate level, educational technology course at the University of Regina in 2008. 8 non-registered participants for every official student. Experience provided insight into the potential for leveraging PLNs in open access and distance education.


Course title – “Open, Connected Social”. Fully online course developed using FOSS and freely available services. The design demonstrates “open teaching methodologies: educational practices inspired by the open source movement, complementary learning theory and emerging theories of knowledge”. Students built PLNs to “collaboratively explore, negotiate and develop authentic and sustainable knowledge networks”. Couros (2010, p. 110) writes

It is my hope in writing this chapter that I capture and document relevant reflections and activities to provide starting points for those considering open teaching as educational innovation

That’s what I’m looking for Alec.

Three sections

  1. key theoretical foundations;
  2. the course experience; and
  3. discoveries related to the role of PLNs, techniques for developing and leveraging PLNs in DE courses and the role of emerging technologies.

Theoretical foundations

  1. The open movement
    Educators participating in FOSS communities had strong tendencies toward: collaboration, sharing, openness in classroom activities and professional collaborations. Technology was a barrier, but Web 2.0 etc addressed this. Now they could easily create, share, collaborate. Added is the greater availability of educational relevant content. So much so that

    The dilemma of the educator shifted quickly from a perceived lack of choice and accessibility to having to acquire the skills necessary to choose wisely from increased options.

  2. Complementary learning theories
    Influences include:

    • social cognitive theory – suggests it is the combination of behavioural, cognitive and environmental factors that influence human behaviour. People learn through observations of others. Self-efficacy is important.
    • social constructivism and
      Related to the above. Sociocultural context and social interaction are important for knowledge construction. Tabak’s (2004) concept of distributed scaffolding – an emerging approach to learning design.
    • adult learning theory.
      Adults learn differently from kids, which results in principles such as: adults be involved in planning/evaluating their instruction; experience (including mistakes) provides the basis for learning activities; interest is generated from subjects that have immediate relevance to their job/life; learning is problem-centred rather than content-oriented.
  3. Connectivism
    Heavily influenced by theories of social constructivism, network theory and chaos theory. Digital technologies become important to learning. Stresses metaskills of evaluating and managing information and the importance of pattern recognition as a learning strategy.
  4. Open teaching
    Defined by Couros (2010, p. 115) as

    Open teaching is described as the facilitation of learning experiences that are open, transparent, collaborative, and social. Open teachers are advocates of a free and open knowledge society, and support their students in the critical consumption, production, connection, and synthesis of knowledge through the shared development of learning networks.

    Typical activities of open teachers include

    • Use of FOSS tools where possible and beneficial.
    • Integration of free/open content into L&T.
    • Promotion of copyleft content licences
    • Help students understand copyright law.
    • Help students build PLNs for collaborative and sustained learning.
    • Development of learning environments that are reflective, responsive, student-centered and incorporate diverse learning strategies.
    • Modelling openness, transparency, connectedness and responsible copyright etc. use.
    • Advocacy for the participation and development of collaborative gift cultures in education and society.

    That last one is interesting

The course

20 registered students. Mostly practicing teachers or educational administrators. Normally there is a maximum of 16 students (I wish). Development funded by a $30,000 government grant. Important: this funding was not spent on the design and development side, but instead on hiring learning assistants who “were hired as social connectors, and their primary responsibilities were to support students in the development of PLNs” (Couros, 2010, p. 117).

In terms of selecting a learning environment, WebCT, Moodle, and Ning were rejected; Wikispaces (hosted) was adopted. The site ran from 2007-2010 and has since moved to a WordPress site (by the looks).

Course facilitation model

  • Three major assessments guided the activities
    1. Personal blog/digital portfolio
      Students were responsible for developing a digital space to document their learning through readings and activities. For many these became showcases and acted as distributed communication portals. Most remain active beyond the end date.
    2. Collaborative development of an educational technology wiki resource
      Wiki with collaborative content.

      I’m wondering how collaborative this process was? A group or a network (a la Downes).

    3. Student-chosen major digital project.
      A range of projects (produced videos, instructional resources, social networking activities, participation in global collaborative projects, development of private social networks etc.) developing resources specific to each student’s professional context.

    It’s changed a bit and is described somewhat on the assessments page.

  • Tools and interactions
    1. Synchronous events
      Two each week, 1.5–2 hours. The first focused on content knowledge in the form of invited presenters; Connect/Elluminate was used, along with the associated recordings. The second was a hands-on session for technical skills and pedagogical possibilities.

      A combination involving Skype became the preferred method for video conferencing. How is explained here.

    2. Asynchronous activities
      • Reading, reviewing and critiquing course readings in blogs.
      • Sharing resources through social bookmarking.
      • Creating screencasts, tutorials etc. for personal learning and that of others.

      And a bunch of others including reading blogs, participation in open professional development, posting content to open sites, microblogging, collaborative wiki design and collaborative design of lesson plans. Most were unplanned.

PLNs in Distance education

The first session in the course was closed and explanatory. The author’s PLN became important to support the model. Which does raise the question of how someone without the author’s PLN might go.

Conceptualising PLN

Mentions the absence of a definition and the need to discern PLE from PLN. Offers two images (click on these to see the originals): the old and new style “PLN”. A discussion picked up a bit more online here. In short, it appears that the PLE is the tools, processes etc. that allow management of learning. The definition used for PLN:

personal learning networks are the sum of all social capital and connections that result in the development and facilitation of a personal learning environment.

TypicalTeacherNetwork by courosa, on Flickr (Creative Commons Attribution-Noncommercial-Share Alike 2.0 Generic License)
Networked Teacher Diagram - Update by courosa, on Flickr (Creative Commons Attribution-Noncommercial-Share Alike 2.0 Generic License)

PLNs for teaching and learning

Strategies deemed effective in the course

  • Immerse yourself.
    i.e. use and understand the social media tools, how they can be used, and how the students can use them.
  • Learn to read social media.
    Social media is read much differently than traditional media. Tools aren’t great. Need to use what’s available.
  • Strengthen your PLN
    Creating content and commenting on the work of others is important.
  • Know your connections
    Being aware of the skills/backgrounds of your PLN allows you to identify who can help students.
  • PLNs central to learning
    Courses/communities hosted in the institutional LMS die. The community in this course lived on.

Final thoughts

Two questions often asked after conference presentations on this:

  1. How did you get away with this?
    Institutional support for open teaching is essential. Colleagues are “constructively critical of technology, but strongly supportive of innovation in teaching and learning”.
  2. Where did you find the time to teach this way?
    Good teaching always requires more time.


Couros, A. (2010). Developing personal learning networks for open and social learning. In G. Veletsianos (Ed.), Emerging technologies in distance education (pp. 109–128). AU Press.

Tabak, I. (2004). Synergy: A complement to emerging patterns of distributed scaffolding. The Journal of the Learning Sciences, 13(3), 305–335.

Understanding management students' reflective practice through blogging

The following is a summary and perhaps some reflection upon Osman and Koh (2013). It’s part of the thinking and reading behind the re-design of the ICTs and pedagogy course I help teach to pre-service teachers.


65 business (MBA/Egyptian) students participated in collaborative blogging over 5 weeks. Analysis (content analysis for critical thinking and theory/practice links) supports the potential of blogs as a tool “for reflection and learning in practitioner-oriented courses”. Implications for the design of blogging tasks are discussed.

Thoughts and todo

Provides some empirical evidence for the use of blogs for reflection and connecting theory and practice. Though the findings are generally what people would expect.

The task for these students was somewhat like a forced connection: you must post one topic and comment on two others. I wonder whether, if this had been more open/flexible/student-controlled, more contributions would have arisen? Perhaps only if appropriate support/connections were made.

To do

  • Look at Ho and Richards (1993) for a framework specific to journals of student teachers.
  • Look at framework of Greenlaw and DeLoach (2003) and also Osman & Duffy (2010)
  • Look at Osman & Duffy (2010) for the idea that theories are not actively taken up by students and remain detached areas of knowledge, not integrated into decision-making.
  • Loving et al (2007) another framework for evaluating evaluation.


Problems in practitioner courses in combining theory and practice. Need to encourage reflection etc.

Reflection has some plusses. Uses Moon’s (1999, p. 155) definition

a mental process [with] purpose and/or outcome in which manipulation of meaning is applied to relatively complicated or unstructured ideas in learning or to problems for which there is no obvious solution

Blogs are a recent tool for this; the benefits of blogs from the literature are listed and referenced:

  • empowering students by giving a voice and venue for self-expression.
  • increasing sense of ownership, engagement and interest in learning.
  • may facilitate enriched opportunities for communication, challenge, cognitive conflict, deeper thinking and knowledge construction.

But there is a scarcity of studies that investigate “empirically”. Many rely on self-report data or anecdotal evidence. Few studies critically examine the quality of students’ reflection, especially in management education. Few provide explanations and thus limit guidelines that support the design of blogging tasks to facilitate reflection and learning.

Literature review

Starts with references to the problem MBA programs have of combining academic rigor and practical application, a problem that teaching programs have had for a long time. Critical reflection is seen as a way to bridge this.

The rest is broken into the following sections

  1. Blogging and reflection.
    Individual journals are a common approach. Privacy provides a sheltered place but limits sharing/collaboration etc. Blogging provides some affordances that address this. Value is accepted by enthusiasts, but limited analysis. Some studies mentioned. A few using coding frameworks are mentioned. One shows blogging has a positive impact on reflection, but peer comments have a negative impact.
  2. Critical thinking through blogging.
    Critical thinking defined as “development of a habit of continuous reflection and questioning”. Few studies of blogging looking at critical thinking.
  3. Fostering theory-practice linkages in management education.
    Explains the use of Kolb’s learning cycle in this study.

Research questions

  1. How critically evaluative were the reflections of graduate business students when they engaged in blogging?
  2. In their reflections, to what extent did these students link theory and practice? What phases of Kolb’s experiential learning cycle did these students focus on?


Students blogged for last 5 weeks of 10 week term. 20% of assessment for the task. Guidelines kept to a minimum. Graded on the quantity and quality of postings. Students introduced to reflective practice and Kolb’s cycle prior.

Blogging groups (max 8) were self-assigned and access to the blogs was limited to the group. During the first week the instructor moderated blog posts. This was discontinued as it was seen as a threat to student ownership.

Students had the opportunity to opt out of having their blog contributions analysed for the research. 54 provided signed consent.

Blog archives coded by two independent coders.


The assessment task required students to initiate one topic and comment on two posts submitted by other groups per week. 65 students were thus expected to make 325 posts and 650 comments over the five weeks. In the end there were 144 topics and 399 comments: students only posted 543 times, 44% less than anticipated.
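
The 44% figure checks out with a little arithmetic (a quick sketch; the per-week quotas are taken from the task description above):

```javascript
// Expected vs actual contributions over the 5-week blogging task:
// each of the 65 students was to post 1 topic and 2 comments per week.
const students = 65;
const weeks = 5;
const expectedTopics = students * 1 * weeks;    // 325
const expectedComments = students * 2 * weeks;  // 650
const actualContributions = 144 + 399;          // 543 topics + comments
const shortfallPct = Math.round(
  (1 - actualContributions / (expectedTopics + expectedComments)) * 100
);
console.log(shortfallPct); // 44
```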

RQ #1 – how critically evaluative were posts

A peak at simplistic alternatives/argument (30%) and basic analysis (26%). About 14% theoretical inference i.e. building arguments around theory. Apparently this was the expected level.

No significant differences between posts and comments.

RQ #2 – To what extent did students link theory and practice

Focused on higher level critical thinking posts. Used a “Kolb-based framework”.

Significant differences between posts in types of reflection. Students seemed more comfortable considering theory with experience, observation or experimentation.


Results support use of blogging as a tool to encourage reflection. Mmm, not sure it’s innate to the technology, though the affordance is there.

Few posts were off task, but I think that’s probably a result of asking those questions in other areas. The authors compare this with content-oriented posts in discussion forums only being 40–50% of posts. Again, possibly the design of blogs in this context suggested it’s not the place to raise non-content questions. The authors do point out that this was a blended context, while the discussion forum references were totally online. And they pick up my point.

Surprising level of students reflecting on their learning via blogs. Mostly positive, but a prominent concern was a desire for feedback, especially from the instructor. Suggests some reasons: novelty of reflection requiring reassurance; a by-product of culture.

Some suggestions around student confusion because of reasons from the literature: what to write in a blog post, low self-efficacy re: the worthiness of their contribution; difficulty generating topics.

In this study students wanted instructor to post discussion questions. i.e. the instructor needs to be more active in scaffolding struggling students.

Guidelines for designing blogging tasks

The article closes with the following list of guidelines (p. 30)

  1. Explain the importance of reflection as a vehicle for learning and continued professional development.
  2. Provide different forms of scaffolding. Many students are new to reflection and critical thinking as a more formal activity. In addition to giving them a framework and guidelines to inform their reflections, examples that illustrate quality reflection and critical thinking might be necessary. Students in this context seemed to especially need help with building theory based arguments, evaluating theories, and addressing ethical concerns for business issues.
  3. Give prompts to encourage reflections. Some students are often apprehensive about initiating reflections.
  4. Promote reflection and critical thinking over longer durations. A reflection task that extends for part of the semester might not be sufficient to adequately develop students’ reflective and critical thinking skills.
  5. Relate students’ reflections to class topic so that students see the value of reflection as an integral and legitimate ingredient of learning.
  6. Provide technical orientation at the beginning of the session. Although we assume that our students are tech savvy, they might not be.

Nothing too surprising there; it’s what I’ve done in the past and will aim to do next year.


Osman, G., & Koh, J. H. L. (2013). Understanding management students’ reflective practice through blogging. The Internet and Higher Education, 16, 23–31. doi:10.1016/j.iheduc.2012.07.001

Ho, B., & Richards, J. C. (1993). Reflective thinking through teacher journal writing: Myths and realities. Prospect, 8, 7–24.

Greenlaw, S. A., & DeLoach, S. B. (2003). Teaching critical thinking with electronic discussion. The Journal of Economic Education, 34(1), 36–52.

Loving, C. C., Schroeder, C., Kang, R., Shimek, C., & Herbert, B. (2007). Blogs: Enhancing links in a professional learning community of science and mathematics teachers. Contemporary Issues in Technology and Teacher Education, 7(3), 178–198.

Osman, G., & Duffy, T. (2010). Scaffolding critical discourse in online problem-based scenarios: The role of articulation and evaluative feedback. In M. B. Nunes, & M. McPherson (Eds.), IADIS International Conference e-Learning 2010: Vol 1 (pp. 156–160). International Association for Development of the Information Society.

Can/will learning analytics challenge the current QA mentality of university teaching

Ensuring the quality of the student learning experience has become an increasingly important task for Australian universities. Experience over the last 10 years and some recent reading suggests there are some limitations to how this is currently being done. New innovations/fashions like learning analytics appear likely to reinforce these limitations, rather than actually make significant progress. I’m wondering whether the current paradigm/mindset that underpins university quality assurance (QA) processes can be challenged by learning analytics.

The black box approach to QA

In their presentation at ascilite’2012, Melinda Lewis and Jason Lodge included the following slide.

ascilite'2012 Lodge & Lewis

The point I took from this image and the associated discussion was that the Quality Assurance approach used by universities treats the students as a black box. I’d go a step further and suggest that it is the course (or unit, or subject) as the aggregation of student opinion, satisfaction and results that is treated as the black box.

For example, I know of an academic organisational unit (faculty, school, department, not sure what it’s currently called) that provides additional funding to the teaching staff of a course if they achieve a certain minimum response rate on end of term course evaluations and exceed a particular mean level of response on 3 Likert scale questions. The quality of the course, and subsequent reward, is being based on a hugely flawed measure of the quality. A measure of quality that doesn’t care or know what happens within a course, just what students say at the end of the course. Grade distribution (i.e. you don’t have too many fails or too many top results) is the other black box measure.

If you perform particularly badly on these indicators then you and your course will be scheduled for revision. A situation where a bunch of experts work with you to redesign the course curriculum, learning experiences etc. To help you produce the brand new, perfect black box of a course. These experts will have no knowledge of what went on in prior offerings of the course and they will disappear long before the course is offered again.

Increasingly institutions are expected to be able to demonstrate that they are paying attention to the quality of the student learning experience. This pressure has led to the creation of organisational structures, institutional leaders and experts, policies and processes that all enshrine this black box approach to QA. It creates a paradigm, a certain way of looking at the world that de-values alternatives. It creates inertia.

Learning analytics reinforcing the black box

Lodge and Lewis (2012, p. 561) suggest

The real power and potential of learning analytics is not just to save “at risk” students but also to lead to tangible improvements in the quality of the student learning experience.


The problem is that almost every university in Australia is currently embarking on a learning analytics project. Almost without exception those projects have as their focus “at risk” students: attrition and retention. Some of these projects have multi-million dollar budgets. Given changing funding models and the Australian Government’s push to increase the diversity and percentage of Australians with higher education qualifications, this focus is not surprising.

It’s also not surprising that many of these projects appear to be reinforcing the current black box approach to quality assurance. Data warehouses are being built to enable people and divisions not directly involved with actually teaching the courses to identify “at risk” students and implement policies and processes that keep them around.

At its best these projects will not impact on the actual learning experience; the interventions will occur outside of the course context. At worst, these projects will negatively impact the learning experience as already overworked teaching staff are made to jump through additional hoops to respond to the insights gained by the “at risk” learning analytics.

How to change this?

The argument we put forward in a recent presentation was that the institutional implementation of learning analytics needs to focus on “doing it with academics/students” rather than on doing it “for” and “to” academics/students. The argument here is that the “for” and “to” paths for learning analytics continues the tradition of treating the course as a black box. On the other hand, the “with” path requires direct engagement with academics within the course context to explore and learn how and with what impacts learning analytics can help improve the quality of the student learning experience.

In the presentation Trigwell’s (2001) model of factors that impact upon the learning of a student was used to illustrate the difference. The following is a representation of that model.

Trigwell's model of teaching

Do it to the academics/students

In terms of learning analytics, this path will involve some people within the institution developing some systems, processes and policies that identify problems and define how those problems are to be addressed. For example, a data warehouse and its dashboards will highlight those students at risk. Another group at the institution will contact the students or perhaps their teachers. i.e. there will be changes at the institutional context level that essentially bypass the thinking and planning of the teacher and go direct to the teaching context. It’s done to them.

Doing it to

The course level is largely ignored and if it is considered then courses are treated as black boxes.

Do it for the academics/students

In this model a group – perhaps the IT division or the central L&T folk – will make changes to the context by selecting some tools for the LMS, some dashboards in the data warehouse etc. that are deemed to be useful for the academics and students. They might even run some professional development activities, perhaps even invite a big name in the field to come and give a talk about learning analytics and learning design. i.e. the changes are done for the academics/students in the hope that this will change their thinking and planning.

Doing it for

The trouble is that this approach is typically informed by a rose-coloured view of how teaching/learning occurs in a course (e.g. very, very few academics actively engage in learning design in developing their courses); ignores the diversity of academics, students and learning; and forgets that we don’t really know how learning analytics can be used to understand student learning and how we might intervene.

The course is still treated as a black box.

Do it with the academics/students

Doing it with

In this model, a group of people (including academics/students) work together to explore and learn how learning analytics can be applied. It starts with the situated context and looks for ways in which what we know can be harnessed effectively by academics within that context. It assumes that we don’t currently know how to do this and that by working within the specifics of the course context we can learn how and identify interesting directions.

The course is treated as an open box.

This is the approach which our failed OLT application was trying to engage in. We’re thinking about going around again, if you’re interested then let me know.

The challenge of analytics to strategy

This post was actually sparked today by reading this article titled “Does analytics make us smart or stupid?” in which someone from an analytics vendor uses McLuhan’s Tetrad to analyse the possible changes that arise from analytics. In particular, it was this proposition

With access to comprehensive data sets and an ability to leave no stone unturned, execution becomes the most troublesome business uncertainty. Successful adaptation to changing conditions will drive competitive advantage more than superior planning. While not disappearing altogether, strategy is likely to combine with execution to become a single business function.

This seems to resonate with the idea that perhaps the black box approach to the course might be challenged by learning analytics. The “to” and “for” paths are much more closely tied with traditional views of QA which are in turn largely based on the idea of strategy and top-down management practices. Perhaps learning analytics can be the spark that turns this QA approach away from the black box approach toward one focused more on execution, on what happens within the course.

I’m not holding my breath.


Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks : Putting the learning back into learning analytics. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings ascilite Wellington 2012 (pp. 560–564). Wellington, NZ.

Trigwell, K. (2001). Judging university teaching. The International Journal for Academic Development, 6(1), 65–73.

Tertiary course design is very poor, and we solve it by "blame the teacher"

The following is inspired by this tweet

which links to this newspaper article titled “Tertiary course design ‘very poor'”. An article certain to get a rise out of me because it continues the “blame the teacher” refrain common to certain types of central L&T people:

After 33 years of working in higher education in all parts in NZ, the US and UK, the one thing we’ve become very clear about in curriculum design is that our people in higher education need to actually be educated as educators to work at that level

This seems to imply then that all of the courses taught by those with teaching qualifications should be beacons of quality learning experiences. My observations of courses at a number of universities taught by graduates of higher education teaching certificates and by those in Faculties of Education would seem to indicate otherwise. Not to mention reports of “ticking the box” from colleagues at top universities required to complete graduate certificates in higher education teaching. i.e. they have to complete the certificate to have a job, so they complete it. They are successful products of formal education, they know how to successfully jump through the required hoops.

This is not to suggest there is no value in these courses. But it’s not the solution to the problem. It’s not even the best way to build knowledge of teaching and learning amongst academics.

The following figure is from Richardson (2005)

Integrated model of teachers' approaches to teaching

The finding from this research is that there can be significant differences between the espoused theories of teaching and learning and the theories in use (Leveson, 2004). Teachers can know all the “best” learning theory but not use it in their teaching. While teachers may hold higher-level views of teaching, other contextual factors may prevent the use of those conceptions (Leveson, 2004). Environmental, institutional, or other issues may impel teachers to teach in a way that is against their preferred approach (Samuelowicz & Bain, 2001). Prosser and Trigwell (1997) found that teachers with a student-focused approach were more likely to report that their departments valued teaching, that their class sizes were not too large, and that they had control over what was taught and how it was taught. In examining conceptions of e-learning held by academic staff, Gonzalez (2009) found that institutional factors and the nature of the students were the most relevant contextual factors influencing teaching.

Now, consider the world of Australian (and New Zealand?) Universities as we move into 2013. Do you think the environmental factors have gotten any better in terms of enabling teachers to teach in the ways they want? An increasing focus on retention, an increasingly diverse intake of students, decreasing funding, increasing use of e-learning, decreasing quality of institutional e-learning systems, increasing casualisation of the academic work-force, research versus teaching, increasing managerialisation and increasingly rapid rounds of restructuring….are any of these factors destined to encourage quality approaches to teaching and learning?

My argument is that given this environment, even if you could get every academic at a university to have a formal qualification in learning and teaching, there wouldn’t be any significant increase in the quality of student learning because the environment would limit any chance of action and only encourage academics to “tick” the qualifications box.

On the other hand, if the teaching and learning environment at a university wasn’t focused on the efficient performance of a set of plans (which limit learning) and instead focused on encouraging and enabling academics and the system to learn more about teaching and learning within their specific context…….


Leveson, L. (2004). Encouraging better learning through better teaching: a study of approaches to teaching in accounting. Accounting Education, 13(4), 529–549.

Prosser, M., & Trigwell, K. (1997). Relations between perceptions of the teaching environment and approaches to teaching. British Journal of Educational Psychology, 67(1), 25–35.

Richardson, J. (2005). Students’ approaches to learning and teachers’ approaches to teaching in higher education. Educational Psychology, 25(6), 673–680.

Samuelowicz, K., & Bain, J. (2001). Revisiting academics’ beliefs about teaching and learning. Higher Education, 41(3), 299–325.

#ascilite2012 technical support and the tail wagging the dog

I’m slowly recovering from a week at conferences. First, ASCILITE’2012 (#ascilite2012) and second the SoLAR Southern Flare Conference (#FlareAus). I was going to spend the week before preparing, but marking and other tasks intervened. This meant I spent much of the week preparing presentations which meant a couple of late nights and limited social collaboration. Add in a couple of early flights and I’m a little tired and frustrated. This may come through in the following.

Perhaps the biggest frustration this week was the audio-visual support at #ascilite2012. This is summed up nicely by the following quote from the “Information for Presenters” page from the conference website

All authors are required to email their final PowerPoint presentation (with all embedded images and videos in the same folder) by no later than 20 November 2012.

Just to be clear on the point, the conference started on the 25th of November. That’s right, the expectation was that we’d have our presentations completed 5 days before the conference started.

This probably wasn’t going to happen for most people. So a follow up option was provided (from the same page)

We would prefer that presenters use the equipment we provide in the venue as each venue will have mac and pc capability, therefore we ask for your presentation before hand, or at least 5 hours before your presentation at the event.

According to an overheard comment from one of the people organising the presentation support, this is how all conferences work.

Sorry, but no.

Tail wagging the dog

To me this is a perfect example of the tail represented by technology and the technologists wagging the dog.

Edu Doggy

For at least the last 10 years I’ve been taking laptops to conferences. For me – and many others I know – the process is to work on the presentations until the very last minute, due to two main factors. First, we’re busy. I didn’t get a chance to work directly on my presentation for #ascilite2012 until I left home to travel to Wellington. I didn’t really get into my #FlareAus presentation until the night before. Second, we like to incorporate insights, comments and events from the conference. In the hour or so before my #ascilite2012 presentation, the conference chair introduced the idea of FOMO to describe MOOCs and other hypes, and Neil Selwyn decried the absence of a focus on the present in educational technology research. Both points resonated strongly with my presentation. I had to work these into the presentation.

Theoretically, this necessary change was not possible.

Which is somewhat ironic given that the aim of the presentation, the paper and my thesis was to argue that university e-learning suffers from exactly the same problem.

Especially when the #ascilite2012 call for papers is talking about

Recent waves of global uncertainty coupled with local crises and government reforms are reshaping the tertiary education landscape.

Doing it with academics not possible

An extension to this proposition is that since the people, process and product of university e-learning are inflexible, university e-learning by definition is either done “to” or “for” the academics. i.e. the tail wags the dog. The practice of e-learning is constrained by the people, process and product. This prevents university e-learning being done “with” the academics, i.e. as a learning process. This was the theme picked up in our #FlareAus presentation.

The proposal being that learning analytics within universities will largely be done “to” and “for” academics, rather than “with”. Subsequent to this will be a whole range of pitfalls and eventually the likely end result that learning analytics will become yet another fashion, fad or band-wagon.

Evidence of workarounds

Just as I chose to ignore the requirements of the audio-visual folk at #ascilite2012, there was evidence at #FlareAus of people working around the requirements/constraints of university e-learning.

The presentation from Abelardo Pardo used the client-side (browser) approach to working around the inflexibility of the LMS (Moodle). i.e. staff install a browser plugin that identifies when a particular LMS web page arrives in the browser and adds something useful to the page.
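
The client-side approach might be sketched along these lines (an illustrative guess at the general pattern only: the URL pattern, function name and injected text are invented, not taken from the actual plugin):

```javascript
// Decide whether a given LMS page should be augmented and, if so, with what.
// The URL pattern and message are illustrative assumptions only.
function augmentationFor(url) {
  if (/\/mod\/forum\//.test(url)) {
    return 'Analytics summary for this forum would be injected here';
  }
  return null; // leave all other pages untouched
}

// In a real browser plugin or userscript this would run on page load, e.g.:
//   const extra = augmentationFor(location.href);
//   if (extra) {
//     document.body.insertAdjacentHTML('beforeend', `<div>${extra}</div>`);
//   }
```

The attraction of this pattern is that nothing on the institutional server needs to change; the augmentation lives entirely in the academic’s browser.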

Susan Tull presented on the University of Canterbury’s LearnTrak system (more detail in this EDUCAUSE Review article). LearnTrak is a customised version of GISMO which is a “Graphical Interactive Student Monitoring Tool for Moodle”. Susan’s presentation was before mine at #FlareAus. I liked the idea because they were working with their academics to provide a system that worked for them. That responded to local needs. At least that was the impression.

GISMO apparently takes the Moodle plugin approach, but it appears that it does break away from Moodle’s interface fairly quickly in order to present a fairly detailed collection of reports, mostly charts.

Both these approaches have their limitations. But I am now wondering if there is a vein (rich or otherwise) of research opportunities in developing better and different approaches to breaking the inflexibility of the product and the process of university e-learning. This might become a theme.

Enabling academics to apply learning analytics to individual pedagogical practice: how and with what impacts?

The following is an excerpt from an unsuccessful 2012 second round OLT grant. We’re currently pondering what the next step is with the idea.

A recent presentation at the Southern Solar Flare conference places the following idea in a broader context of learning analytics and how universities are implementing it.

Project Rationale

The Society for Learning Analytics Research (SoLAR) defines learning analytics as (Siemens et al., 2011, p. 4)

..the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

One example of learning analytics is the Social Networks Adapting Pedagogical Practice (SNAPP) tool developed by an ALTC project (Dawson, Bakharia, Lockyer, & Heathcote, 2011) to visualise interaction patterns in course discussion forums to support a focus on learner isolation, creativity and community formation. While SNAPP’s network diagrams were found to be effective in promoting reflection on teaching activities, teachers had difficulty understanding the relationship between their pedagogical approaches and the insights revealed by SNAPP (Dawson et al., 2011, p. 4). Being able to interpret the patterns provided by learning analytics and apply them to practice is difficult, time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008). The challenge is being able “to readily and accurately interpret the data and translate such findings into practice” (Dawson & McWilliam, 2008, p. 12). This project aims to address this challenge.

Ferguson (2012) identifies learning analytics as one of the fastest-growing areas of research within technology-enhanced learning with interest being driven by a combination of technological, pedagogical, political and economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development it is likely to see widespread adoption within the higher education sector in the next 2-3 years. The recent Horizon technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics – for the first time anywhere in the world – into the “one year or less” time-frame for adoption. This finding suggests that with the imminent and widespread adoption of learning analytics, Australian universities may be particularly interested in “finding new ways to measure student performance, ideally in real-time” (Johnson et al., 2012, p. 1).

This interest is perhaps driven in part by the broadening participation agenda of the Commonwealth government and the targets it set in response to the Bradley review (Bradley, Noonan, Nugent, & Scales, 2008). For 40% of Australians aged 25-34 to hold (or be progressing towards) bachelor degrees, we will need to enrol and graduate more students. Many of these students will be from non-traditional backgrounds and would have been considered ‘high-risk’ students in the past. Learning analytics can help us understand the circumstances under which those students are most likely to succeed. But can learning analytics also help guide teachers to make the coalface pedagogical decisions to support the success of this larger and more diverse body of students?

To date much of the work on learning analytics in higher education has centred on identifying students at risk of failure and addressing short-term issues to prevent that failure (Johnson & Cummins, 2012). The dominant users of learning analytics within higher education have been university administrators (Dawson et al., 2011) or support staff. The larger promise of learning analytics is when it is used “to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today” (Johnson & Cummins, 2012, p. 23). If correctly applied and interpreted this practice has implications not only for student performance, but also for the perceptions of learning, teaching and assessment held by educators (Johnson & Cummins, 2012). There is, however, “a dearth of studies that have investigated the relationship between learning analytics and data requirements that would better assist teachers in the design and evaluation of learning and teaching practice” (Dawson et al., 2011, p. 4). Consequently, it remains uncertain exactly how to best enable teachers to apply learning analytics to inform their individual pedagogical practices, and, if successful, what impacts such practices would have on the perceptions of learning, teaching and assessment held by those teachers, and ultimately on student performance. This project seeks to provide answers to these “How?” and “With what impacts?” questions.

Project outcomes

This project’s six outcomes are summarised below and described in more detail following. (A course, aka unit or subject, is the smallest stand-alone offering; OKB refers to the Online Knowledge Base.)

Course modifications
Format: Modified courses; case studies of changes disseminated via presentations, the web (OKB) and publications.
Audience: Students and staff in the courses; other teaching staff and institutional policy makers.
Contribution: Improvements to student learning and changes to teacher perceptions; data about the impacts of using learning analytics to inform teaching; concrete examples of using learning analytics that can be duplicated and modified in other contexts.

Modifications to Moodle tools
Format: Changes made available to the broader Moodle community by contribution back to the Moodle code base.
Audience: The 17 Australian universities using Moodle and the broader Moodle community.
Contribution: Improved tools leading to greater use of learning analytics by teaching staff using Moodle; concrete examples that improve understanding of the tool design guidelines.

Tool design guidelines
Format: Design guidelines and theory disseminated via presentations, the web (OKB) and publications.
Audience: Tool developers.
Contribution: Better tools in other LMS; a foundation for further research.

Harnessing analytics models
Format: Models for how both teaching staff and institutions can enable the use of learning analytics to inform pedagogical practice, disseminated via presentations, the web (OKB) and publications.
Audience: Teaching staff; policy makers and institutional leaders; teaching support staff.
Contribution: Aid to teaching staff and institutions in harnessing learning analytics, contributing to more widespread, effective use; a foundation for further research.

Refinements to learning analytics patterns
Format: New or modified patterns disseminated via presentations, the web (OKB) and publications.
Audience: Teaching staff; learning analytics researchers.
Contribution: Improved understanding of what is happening and suggestions for possible interventions; a foundation for further research.

Online knowledge base (OKB)
Format: A website providing a variety of learning paths through the project’s outcomes, resources, discussions, and processes.
Audience: Teaching staff; policy makers and institutional leaders.
Contribution: Widespread and varied dissemination of project outcomes.
Course modifications

At the centre of this project are two cycles of Participatory Action Research (PAR) with teaching academics at the University of Southern Queensland (USQ) and CQUniversity (CQUni). The aim of these cycles will be to work with the academics to explore how learning analytics can be used to inform their pedagogical practice – the methods they use for learning and teaching. Learning analytics tools and methods will be used to examine prior offerings of courses taught by these academics, share and explore perceptions of learning and teaching, identify potential course modifications, and examine the outcomes of those modifications. The modifications made will be supported and evaluated by a range of additional means. The modifications, the reasons for them, and their impact will inform other project outcomes and will be available as case studies disseminated via various means.

Modifications to Moodle tools

USQ and CQUni are two of the 17 Australian universities using Moodle as their institutional Learning Management System (LMS). Moodle provides an array of current and newly developed learning analytics tools of varying capabilities and qualities. Informed by prior research, the project’s theoretical framework, and the insights gained during the project’s two PAR cycles, a range of modifications will be made to these tools with the intent of better enabling the use of learning analytics to inform and share pedagogical practice. NetSpot – an e-learning services company that hosts Moodle for 10 Australian universities – will make the necessary changes to these Moodle tools. All changes will be made available to the broader Moodle community.

Tool design guidelines

The rationale for and changes made to the Moodle-based learning analytics tools will be captured in design guidelines. The design guidelines are intended to make this knowledge available to the developers of other e-learning systems and thereby enable them to make improvements to their systems to better enable the use of learning analytics to inform pedagogical practice. The design guidelines will be made available via the OKB and will also be published in peer-reviewed outlets.

Harnessing analytics models

Changes to pedagogical practice will not arise simply because of the availability of new tools. A significant body of research has found that changes in approaches to teaching are influenced by disciplinary characteristics, conceptions of teaching, situational factors and perceptions of the teaching environment (Richardson, 2005). In addition, enabling increased use of learning analytics is likely to modify these factors and their relationship. Through its PAR cycles the project will explore these changes and work with academics and institutional leaders to identify factors that constrain and enable the on-going use of learning analytics to inform pedagogical practice. The results of this work will be combined with extant literature to develop a range of design models intended to aid institutional policy makers, teaching support staff and teaching staff in deciding how best to enable the use of learning analytics to inform pedagogical practice in their context.

Refinements to learning analytics patterns

Learning analytics often involves the transformation of large amounts of usage data into useful patterns, indicators or visualisations; a common example is the correlation between increasing levels of LMS activity and increasing grades (Dawson & McWilliam, 2008, p. 2). These patterns are often used to inform decision-making. However, the patterns identified are directly influenced by what is being looked for (e.g. the emphasis on student retention in learning analytics work focuses attention on patterns identifying success factors) and by the contexts being explored. For example, Beer, Jones and Clark (2009) found that the increasing activity/increasing grades correlation did not exist for certain groups of students and for courses with certain characteristics.
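As a concrete, if deliberately simplified, illustration of such a pattern, the sketch below computes the Pearson correlation between per-student LMS activity counts and final grades. The data is invented purely for illustration; real analyses would of course involve far more records and careful attention to the context caveats noted above.

```typescript
// Pearson correlation coefficient between two equal-length samples.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const meanX = xs.reduce((a, b) => a + b, 0) / n;
  const meanY = ys.reduce((a, b) => a + b, 0) / n;
  let cov = 0, varX = 0, varY = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - meanX;
    const dy = ys[i] - meanY;
    cov += dx * dy;   // accumulate covariance numerator
    varX += dx * dx;  // accumulate variance numerators
    varY += dy * dy;
  }
  return cov / Math.sqrt(varX * varY);
}

// Hypothetical data: LMS clicks per student vs. final grade (%).
const clicks = [12, 40, 55, 73, 90, 110];
const grades = [48, 55, 62, 68, 74, 85];
console.log(pearson(clicks, grades).toFixed(2));
```

With this invented data the correlation is strongly positive, mirroring the commonly reported pattern; the Beer, Jones and Clark (2009) finding is a reminder that the same computation over a different student subgroup may yield no correlation at all.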

During its PAR cycles, this project will be working directly with a diverse collection of teaching academics to achieve their purposes with learning analytics. This set of different perspectives will lead to the identification of new and the refinement of existing learning analytics patterns. These patterns will be distributed through the OKB, traditional peer-reviewed publications, and, where appropriate, incorporation into the learning analytics tools.

Online knowledge base

The Online Knowledge Base (OKB) fulfils two functions. First, it will be the primary means of communication and collaboration between the project team, reference group and project participants, both within this and continuing work. Second, the OKB will serve as one of the major components of the project’s dissemination strategy. Both these functions will rely heavily on social media and related software (e.g. Mendeley, Diigo, blogs, Twitter etc.). The use of these tools will in turn be aggregated and curated to form an online knowledge base. The OKB will provide access to all project outcomes, both as formal reports and as a variety of learning paths designed explicitly for different stakeholders.

Value and need for the project

The need for this project is well established in the learning analytics literature, including the findings and recommendations from two prior learning analytics related ALTC grants (Dawson et al., 2011; Dawson & McWilliam, 2008). Samples from that literature showing the need for this project include:

  • Learning analytics is still at the early stages of implementation and experimentation (Siemens & Long, 2011).
  • The analytical tools provided by most LMS are poorly utilised (Dawson & McWilliam, 2008) in large part because the “tools and presentation format are complex and far removed from the specific learning context” (Dawson & McWilliam, 2008, p. 8).
  • Australian academics have limited understanding of what is available from these tools and how that data is related to pedagogical practice (Dawson & McWilliam, 2008).
  • There is a dearth of studies examining the use and impact of learning analytics to inform the design, delivery and future evaluations of individual teaching practices (Dawson et al., 2011; Dawson, Heathcote, & Poole, 2010).

This project builds directly upon prior work including prior ALTC grants (Dawson et al., 2011; Dawson & McWilliam, 2008), the development of Moodle learning analytics tools, and broader learning analytics research. The project is, through the project team and the members of the reference group, directly connected to the broader learning analytics communities. By working with a diverse range of teaching staff to explore the use of learning analytics to inform pedagogical practice the project will fill an identified need and produce outcomes of immediate use to a significant portion of the Australian higher education sector. Beyond immediate use the project’s outcomes create a platform for on-going work.


The project will take place over 24 months and comprises four main stages, summarised below.

Stage 1: Formation and initial design (Jan 2013 – Jul 2013)
Outcomes: Project established; ethics approval granted; recruitment of research assistant; initial version of OKB; initial tool design guidelines and enhancements to LMS tools; project evaluation plan.
Evaluation & dissemination: Two reference group meetings; appointment of, and initial meetings with, the project evaluator; presentations at CQU and USQ; identification of PAR cycle #1 participants and institutional influencers; promotion of the OKB and broadening of connections with the learning analytics community.

Stage 2: PAR cycle #1 (Aug 2013 – Feb 2014)
Outcomes: Modifications to 4 courses; changes to the OKB; enhancements to the tool design guidelines and LMS tools; initial draft of the harnessing analytics guidelines; refinements to LA patterns.
Evaluation & dissemination: On-going engagement and self-evaluation by PAR participants, critical friends, influencers and other team members; invited presentations integrated into the OKB; reference group meeting; work with the project evaluator; contribution of tool modifications to the Moodle community.

Stage 3: PAR cycle #2 (Jan 2014 – Sep 2014)
Outcomes: Modifications to a further 8 courses; changes to the OKB; further enhancements to the tool design guidelines and LMS tools; enhancements to the harnessing analytics guidelines; refinements to LA patterns.
Evaluation & dissemination: On-going engagement and self-evaluation by PAR participants, critical friends, influencers and other team members; invited presentations integrated into the OKB; initial publications; reference group meeting; work with the project evaluator; contribution of tool modifications to the Moodle community.

Stage 4: Project finalisation (Oct 2014 – Dec 2014)
Outcomes: Final enhancements to all project outcomes; final project report; publications and presentations.
Evaluation & dissemination: Summative evaluation by the project evaluator; promotion of the OKB with final project outcomes; planning for future work.

Methodology and framework

The project will be using a combination of Participatory Action Research (PAR) and design-based research (DBR). This combination is used in part because of the close connection between the two methods – e.g. Wang and Hannafin (2005, p. 6) suggest that DBR is “akin” but slightly different to PAR – but also because of two slightly different project tasks. Firstly, the project aims, through the lens of situated cognition, to engage fully in the specifics of two institutional contexts for the purposes of helping individual academics address their needs. PAR is the best fit for this purpose and is seen as being better known across disciplines than DBR. To support the use of PAR, the project will adopt the lessons learned by Fraser & Harvey (2008) in a previous ALTC-funded project. These include: supported reflection sessions, the provision of theoretical sparks, and the pairing of academic participants with institutional influencers. The project also has a second task in that it must formulate artefacts (e.g. the knowledge base and enhanced Moodle tools) and design theory (e.g. the harnessing analytics model and the tool design guidelines) useful to the broader community. Design-based research both draws on and is conducted in order to generate design theory (Wang & Hannafin, 2005). Design-based research also involves the use of “multiple research methods in real-world learning environments” (Wang & Hannafin, 2005, p. 20).

In terms of theoretical frameworks, the project’s design draws upon situated cognition, distributed cognition and the role of conceptions and reflection in changing teaching and learning. A brief explanation of each of these and its application to the project design follows.

Brown, Collins and Duguid (1989) argue that the tendency for education, training and technology design to focus on abstract representations detached from practice actually distorts the intricacies of that practice. This distortion hinders how well practice can be understood, engendered, or enhanced. It hinders learning. The design and development of many e-learning systems suffer from this limited understanding of the intricacies of practice involved in modern e-learning. Dawson et al.’s (2011) observation that university administrators have been the dominant users of learning analytics in higher education indicates a similar problem with learning analytics. The use of Participatory Action Research will provide the “opportunity for codeveloping processes with people rather than for people” (McIntyre, 2008, p. xii). By situating the project within a joint, collective purpose, PAR will improve the understanding and learning gained about how learning analytics can be used to inform pedagogical practice.

The majority of e-learning systems provide direct support for the implementation of technical tasks such as posting a message to a discussion forum. The difficult cognitive task of combining these technical tasks to create an effective and appropriate pedagogical design is left almost entirely to the teacher. Dawson and McWilliam (2008) identify this problem with most LMS learning analytics tools, whose presentation formats are too complex and far removed from the specific learning context.

Hollan, Hutchins and Kirsh (2000) describe how distributed cognition expands what is considered cognitive beyond an individual to encompass interactions between people, their environment and the tools therein. Boland, Ramkrishnan and Te’eni (1994, p. 459) define a distributed cognition system as one that “supports interpretation and dialogue among a set of inquirers by providing richer forms of self-reflection and communication”. This project will make enhancements to learning analytics tools that make such tools an effective part of a distributed cognition system. A particular focus of the enhancements will be on reducing the difficulty of the task and offering greater support to teacher self-reflection and collaboration.

There is a significant body of literature that has established a link between the conceptions of learning and teaching held by academics and the quality of student learning outcomes (cf. Richardson, 2005). It has also been found that environmental, institutional, or other issues may impel academics to teach in a way that is against their preferred approach (Richardson, 2005). There is a similarly widely acknowledged view that reflection on teaching contributes to changes in conceptions of teaching that lead to enhanced teaching practice and possibly improved student learning (Kreber & Castleden, 2009). Through its use of situated cognition and participatory action research this project aims to develop significant insight into the factors that constrain and enable adoption of learning analytics. By working within a PAR process with teaching academics and their accompanying institutional influencers the project aims to respond to these factors. Lastly, the combination of PAR with distributed cognition will encourage teaching staff to engage in a range of reflective processes that could lead to changes in their conceptions of learning and teaching and subsequently the quality of student learning outcomes.


References

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning: adoption, activity, grades and external factors. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 60–70). Auckland, New Zealand. Retrieved from

Boland, R., Ramkrishnan, V., & Te’eni, D. (1994). Designing information technology to support distributed cognition. Organization Science, 5(3), 456–475.

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian Higher Education. Canberra. Retrieved from

Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: visualising and evaluating student learning networks. Final Report 2011. Canberra. Retrieved from

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116–128. doi:10.1108/09513541011020936

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council. Retrieved from

Ferguson, R. (2012). The State of Learning Analytics in 2012: A Review and Future Challenges. Milton Keynes, UK. Retrieved from

Fraser, S., & Harvey, M. (2008). Leadership and assessment: Strengthening the nexus. Final report. Strawberry Hills: Australian Learning and Teaching Council. Retrieved from

Hollan, J., Hutchins, E., & Kirsh, D. (2000). Distributed cognition: Toward a new foundation for human-computer interaction research. ACM Transactions on Computer-Human Interaction, 7(2), 174–196.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. New Media Consortium. Austin, Texas. Retrieved from

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition (p. 42). Austin, Texas.

Kreber, C., & Castleden, H. (2009). Reflection on teaching and epistemological structure: reflective and critically reflective processes in “pure/soft” and “pure/hard” fields. Higher Education, 57(4), 509–531.

McIntyre, A. (2008). Participatory Action Research. Thousand Oaks, CA: SAGE Publications.

Richardson, J. (2005). Students’ approaches to learning and teachers’ approaches to teaching in higher education. Educational Psychology, 25(6), 673–680.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., & Ferguson, R. (2011). Open Learning Analytics: an integrated & modularized platform. Retrieved from

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5). Retrieved from

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5–23.
