COVID-19 and the subsequent #pivotonline have higher education paying a lot more attention to the use of digital and online technology for learning and teaching (digital education). COVID-19 has made digital education necessary. COVID-19 has made any form of education – and just about anything else – more difficult. For everyone. COVID-19 and its impact are rewriting what higher education will be after. COVID-19 is raising hopes and fears that what will come after will be (positively?) transformative. Not beholden to previous conceptions and corporate mores.

Most of that’s beyond me. Too big to consider. Too far beyond my control and my personal limitations. Hence I’ll retreat to my limited experience, practices, and conceptions. Exploring those more familiar and possibly understandable landscapes in order to reveal something that might be useful for the brave new post-COVID-19 world of university digital education. A world that I’m not confident has any hope of being positively transformed. Regardless of what the experts, prognosticators, futurists and vendors are selling. But I’m well-known for being a pessimist.

Echoing Phipps and Lanclos (2019) I believe that making changes in digital education needs to be grounded in “an understanding of the practices that staff undertake and the challenges they face” (p. 68). Some colleagues and I have started identifying our practices and challenges by documenting the workarounds we’ve used and developed. Alter (2014) defines workarounds as

a goal-driven adaptation, improvisation, or other change to one or more aspects of an existing work system in order to overcome, bypass, or minimize the impact of obstacles, exceptions, anomalies, mishaps, established practices, management expectations, or structural constraints that are perceived as preventing that work system or its participants from achieving a desired level of efficiency, effectiveness, or other organizational or personal goals (p. 1044)

Workarounds are a useful lens because they highlight areas of disconnect between what is needed and what is provided. Alter (2014) suggests that this Theory of Workarounds could be used to understand these disconnects and leverage that understanding to drive re-design. This resonates with Biggs’ (2001) notion of quality feasibility: a practice that actively seeks to understand the impediments to quality teaching and to remove them.

The challenge I faced was whether I could remember a reasonable percentage of the workarounds I’ve used in 20+ years.

Enter the following list of eight Online Course Quality Indicators, also available as a PDF download and tested in Joosten, Cusatis & Harness (2019) (HT: @plredmond and OLDaily). My interest here isn’t in the validity/value of this type of approach (of which I have my doubts). Instead, my interest is that the eight indicators offer a prompt for the type of considerations to which a conscientious teacher might pay attention. The type of considerations that will point out limitations within institutional support for (digital) education and generate workarounds.

Initial findings

So far I’ve remembered 53 workarounds. Detail provided below. The following table maps workarounds against the quality indicators. The biggest category is Doesn’t fit, i.e. workarounds that didn’t seem to fit the quality indicators. Perhaps this suggests that the quality indicators were designed to analyse the outcome of teacher work (an online course), rather than provide insight into the practices teachers undertake to produce that outcome.

Peer interaction and content interaction are the indicators with the next highest number of workarounds. Though I have collapsed both content interaction and richness indicators into content interaction.

| Quality Indicator | # of workarounds |
| --- | --- |
| Instructor interaction | |
| Peer interaction | |
| Content interaction / Richness | |
| Doesn’t fit | |
53 is a fair number. But perhaps not surprising given my original discipline is information technology and part of my working life has been spent designing LMS-like functionality.

What’s disappointing is that a number of these workarounds are duplicates solving the same fundamental problem, the only difference being the institutional and technological context. For example, a number of the workarounds are focused on helping with:

  1. Production and maintenance of well-designed, rich course content.
  2. Increasing the quantity and quality of what teachers know about students’ backgrounds and activity.

What does that say about higher education, digital education, and me?

Proper reflection and analysis will have to wait for another time. But evidence of difficulties in at least two fundamental practices seems important. Or, perhaps it’s just showing how blinkered and obsessive my interest is.

There are some questions about whether the following are actually workarounds. In particular, some of the fairly specific learning activities weren’t designed to change an existing part of the institutional context. No part of the institutional context provided for those learning activities, largely because they were so specific to the learning intent that the institution could never have supported them. Most institutions now have lists of digital tools approved for use in learning and teaching, but typically the specificity of the learning need means no appropriate tool has made it onto the list.

What does this say about the reusability paradox and institutional approaches to digital education?

Workarounds and quality indicators

The following steps through each of the quality indicators and uses them as inspiration to answer the above question. For each workaround, links to additional detail are provided and initial thoughts given.


Systems Emergencies

One attempt at an authentic real world experience was the Systems Emergency assessment item for Systems Administration (Jones, 1995). Each student had to run a program on their computer. A program that would break their computer. Simulating an authentic error. The students had to draw on what they’d learned during the course to diagnose the problem, fix it and complete a report.

Is this a workaround? It’s so specific to a particular course and a particular pedagogical choice there is no institutional system that it is replacing.
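The original fault-injection program isn’t described in detail, but the idea can be sketched. The following is a hypothetical, sandboxed illustration (the file name, the "fault", and the whole scenario are invented for this sketch, not the actual assignment code): create a sample config file, then deliberately corrupt it, leaving students to diagnose why the service built on it fails.

```python
import os
import tempfile

def inject_fault(sandbox_dir):
    """Create a sample config file, then 'break' it by corrupting an entry.

    Students receive the sandbox and must find and repair the fault.
    """
    path = os.path.join(sandbox_dir, "resolv.conf")
    # Write a plausible, working configuration first.
    with open(path, "w") as f:
        f.write("nameserver 10.0.0.1\n")
    # The injected fault: overwrite with an invalid IPv4 address,
    # so name resolution configured from this file would fail.
    with open(path, "w") as f:
        f.write("nameserver 256.0.0.1\n")
    return path

sandbox = tempfile.mkdtemp()
broken = inject_fault(sandbox)
with open(broken) as f:
    print(f.read().strip())  # → nameserver 256.0.0.1
```

The real assignment worked on the student's own Linux system rather than a sandbox directory, which is what made the emergency feel authentic.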

Open Learning Computing Platform

A better example from the same course went by the acronym OLCP (open learning computing platform) (Jones, 1994). The recommended computer systems almost all distance education students were using (Windows 3.1/95) were not up to the requirements of the course (Systems Administration). To work around this limitation we distributed a version of Linux (Jones, 1996a), eventually relying on commercial distributions. Without Linux the course couldn’t be taught.

Personal Blogs, not ePortfolios

Arguably, my predilection for requiring students to use their choice of public blogging engines, rather than institutional ePortfolio tools, was also driven by a desire for authenticity. Not to mention my skepticism about the value of institutional ePortfolio systems (which got me in trouble one time). Initially, individual student blogs were an extension of journaling (introduced in Sys Admin) and an encouragement to engage in open reflection and discussion. Intended to mirror good practice for IT professionals and first used in a Web Engineering course in 2002. Later evolving into the BAM and BIM tools to encourage reflection for assessment purposes and to encourage the development of a professional learning network.

Alignment and curriculum mapping

In terms of alignment of assessments and learning activities, I’ve used – and more often seen other people use – bespoke Word documents and spreadsheets to map courses and programs. Mainly because institutions had no practice of encouraging such mapping, let alone systems to support it (e.g. this from 2009). A lot more attention and importance has since been placed on mapping, but it generally remains an area of bespoke documents and spreadsheets. Perceived shortfalls that led to some design work on alternatives.
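The essence of those bespoke spreadsheets can be sketched in a few lines. This is a hypothetical illustration (the outcome and assessment labels are invented): record which assessment items claim to address each learning outcome, then flag outcomes with no coverage, which is the main question a mapping exercise answers.

```python
# Map each course learning outcome to the assessment items addressing it.
course_map = {
    "LO1: configure a server": ["Assignment 1", "Exam"],
    "LO2: diagnose system faults": ["Assignment 2"],
    "LO3: communicate with stakeholders": [],
}

# An outcome with no assessment items is a gap in the alignment.
uncovered = [lo for lo, items in course_map.items() if not items]
print(uncovered)  # → ['LO3: communicate with stakeholders']
```

A spreadsheet does the same job with rows and columns; the point is that the data structure is trivial, which is partly why bespoke documents persist in the absence of institutional systems.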


Moodle Course design

Designing a well-organised course site that is easy to navigate, with manageable sections and a logical and consistent form, is no easy task given the nature of most LMS. My first foray into this (before 2012 I was using an LMS I developed) added the following design features using bits of HTML.

A “Jump-to: Topic” navigation bar to my Moodle course sites to avoid the scroll of death.

Addition of non-topic based navigation to the top of the Moodle site to provide a sensible grouping of resources (Course background & content) that didn’t fit with the default Moodle design.

Addition of topic-based photos to generate visual interest, perhaps a bit of dual coding with the topic, and encourage some further exploration.

A “Right now” section at the top manually updated each week (along with the banner image) of term to orient students to the current focus.
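The "Jump to" navigation bar above was just a hand-crafted HTML fragment pasted into the course site. A minimal sketch of generating such a fragment (topic names and anchor ids here are invented, and the real anchors would need to match Moodle's section ids):

```python
# Hypothetical topic list; real Moodle sections use ids like #section-1.
topics = ["Orientation", "ICT and Pedagogy", "Design", "Evaluation"]

# Build one anchor link per topic, separated by pipes.
links = " | ".join(
    f'<a href="#section-{i}">{name}</a>'
    for i, name in enumerate(topics, start=1)
)
nav_bar = f"<div class='jumpnav'>Jump to: {links}</div>"
print(nav_bar)
```

In practice this was done by hand in the Moodle HTML editor rather than generated, which is exactly the kind of manual labour that makes it a workaround rather than a feature.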

Moodle Activity Viewer

Since the in-built Moodle reports aren’t that good, and because I really wanted to understand how students were engaging with my Moodle sites, I designed the Moodle Activity Viewer. It scratched an itch by enabling an analysis of student activity.
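At its core, the kind of analysis involved can be sketched as counting clicks and distinct students per course resource from an activity log. This is an illustrative sketch only (the log format and names are invented, not the Moodle log schema or the Activity Viewer's actual code):

```python
from collections import Counter

# Hypothetical activity log: (student, resource clicked) pairs.
log = [
    ("alice", "Assignment 1"),
    ("bob", "Assignment 1"),
    ("alice", "Week 1 readings"),
    ("alice", "Assignment 1"),
]

# Total clicks per resource.
clicks = Counter(resource for _, resource in log)

# Distinct students per resource.
students = {r: len({u for u, res in log if res == r}) for r in clicks}

print(clicks["Assignment 1"], students["Assignment 1"])  # → 3 2
```

The Activity Viewer's distinctive move was overlaying figures like these directly onto the course page as a heat map, rather than burying them in a separate report.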

Evernote to search a course site?

One of the on-going challenges with using Moodle was the absence of a search engine, a fairly widespread and important part of navigating any website. I did consider a number of different options and ended up trying out a kludge with Evernote. But only for one offering.

Modifying the Moodle search book block

Hosting course content on a WordPress blog

In 2012 I took over a Masters course titled Network and Global Learning. Given the focus of the course, hosting the learning environment in a closed LMS site didn’t seem appropriate. Instead, I decided to try it as an open course. It ended up as a WordPress site and has since been taken over by another academic…at least for one offering. Looks like it probably ended up back in the LMS.

Diigo for course revision

Given NGL was hosted on a course blog, this raised questions about how to take notes about what wasn’t working and ponder options for re-design. In Word, this could be done with the comments feature. For the Web I used Diigo to produce annotations like the following.

Card Interface

Late 2018 saw me stepping backwards to Blackboard 9.1. A very flexible system for structuring a site, but incredibly hard to make look good without a lot of knowledge. How to enable lots of people to organise their course sites effectively? Enter the Card Interface, which easily converts a standard Blackboard content page into a contemporary, visual user interface.