Assembling the heterogeneous elements for (digital) learning

The LMS Product – limitations and an alternative

What follows is the first draft of the “Product” section for an ASCILITE paper (the overview for the paper) I hope to finish by tomorrow… just a bit of wishful thinking. Much of it has appeared on this blog previously; I am now trying to wrangle it into a formal publication, with all the limitations (e.g. space) that brings with it.

It’s a first draft, so comments and suggestions more than welcome.


One of the defining characteristics of the industrial e-learning paradigm is its reliance on the Learning Management System (LMS) as the product for organizational e-learning. Despite the associated complexities and risks, almost every university seems compelled to have an LMS (Coates, James, & Baldwin, 2005). The LMS is an example of an integrated or monolithic information system, a type of system that brings with it a set of advantages and disadvantages. On the plus side, an integrated system offers cost efficiencies and other benefits through standardization; at the same time, such systems constrain flexibility, competitiveness and autonomy, and increase rigidity (Light, Holland, & Wills, 2001; Lowe & Locke, 2008). Such systems are best suited to circumstances where there is commonality between organizations and requirements are stable, with low uncertainty. This does not seem a good description of tertiary e-learning, either over the last 10 years or the next 10. This section looks at two of the repercussions of this mismatch – 1) organizations and people must adapt to the system; and 2) the single-vendor limitation – before describing the alternative principles from the ISDT.

The first repercussion of an integrated system is captured by this comment (Sturgess & Nouwens, 2004, n.p.):

we should seek to change people’s behaviour because information technology systems are difficult to change.

The comment comes from a technical staff member participating in CQUni’s 2003 LMS selection process. Far from being isolated, it captures the accepted industry best-practice recommendation to implement integrated systems in their “vanilla” form because local changes are too expensive (Robey, Ross, & Boudreau, 2002). Maintaining a vanilla implementation constrains what is possible with the system, limiting change, innovation and differentiation, and is perhaps a contributing factor in the poor pedagogical outcomes observed in industrial e-learning.

For example, in 2007 an instructional designer working on a redesign of a CQUni Nutrition course, informed by constructive alignment, was stymied by the limitations of the Blackboard LMS: Blackboard could not support the number of group-based discussion forums required by the new course design. Normally, with an integrated system, the pedagogical approach would have to be changed to fit the confines of the system. Instead, the course site was supplemented with one of the Webfuse discussion forums, which allowed the original educational design to be fulfilled. Academic staff teaching large first-year courses using the Webfuse BAM functionality faced a similar situation when CQUni adopted Moodle. Since Moodle did not provide similar functionality, these staff would be forced to change their pedagogical approach to fit the capabilities of the integrated system.

The regular forced migration to a new version of an LMS is the extreme example of the organization being forced to change in response to the technology, rather than the technology being fitted to the organization’s needs. It is not uncommon to hear of universities forced to adopt a new LMS because the vendor has ceased supporting their current system. The cost, complexity and disruption caused by an LMS migration contribute to this “stable systems drag” (Truex, Baskerville, & Klein, 1999) as the institution seeks a long period of “vanilla” use to recoup the cost.

Another characteristic of an integrated system is that the quality of the tools available is limited to those provided by a single vendor or community. For example, a key component of the recent disquiet about the Curt Bonk MOOC hosted within a Blackboard LMS was the poor quality of the Blackboard discussion forum (see Lane, 2012). Reservations about the quality and functionality of the wiki and blog tools within Moodle are also fairly common. LMS-based tools also tend not to fare well in comparisons with specialist tools – for example, when LMS-based blog tools are compared with tools like WordPress. In addition, integrated systems tend to support only one version of any given tool, leading to situations where users pine for the previous version of a tool because it suited their needs better.

The ISDT formulated from the experience of developing Webfuse proposes 13 principles for the form and function of the product for emergent e-learning. These principles are divided into three groups:

  1. Integrated and independent services.
    Rather than a system or platform, Webfuse was positioned as glue: it was used to “fuse” widely different services and tools into an integrated whole. Webfuse was an example of a best-of-breed system, a type of system that provides more flexibility and responsiveness to contextual needs (Light, Holland, & Wills, 2001). For example, when the existing discussion forum tool was found to be limited, a new discussion forum tool was selected and integrated into Webfuse. At the same time, the old discussion forum tool was retained and could be used by those for whom it was an appropriate fit. While new tools could be added as required, the interface used by staff and students remained essentially the same, and there was no need for expensive system migrations.
  2. Adaptive and inclusive architecture.
    Almost all LMSs support some form of plugin architecture through which external developers can create new tools and services for the LMS. This architecture, however, is generally limited to tools written specifically for that LMS and its architecture, thereby limiting what tools can be integrated. The Webfuse “architecture” was designed to support the idea of software wrappers (Sneed, 2000), enabling the inclusion of a much broader array of applications.
  3. Scaffolding, context-sensitive conglomerations.
    Most e-learning tools provide a collection of configuration options that can be used in a variety of ways. Effective use of these tools requires a combination of skills from a broad array of disciplines and significant contextual knowledge that the majority of academic staff do not possess. The most obvious example is the overall design of a course website. Webfuse had a default course-site conglomeration that combined a range of institutional data sources and Webfuse tools to automatically create a course site. A key aspect of the Webfuse wrappers placed around integrated tools was the addition of institution-specific information and services. There are significant, unexplored opportunities in adding scaffolding to e-learning tools that enables distributed cognition.

Writing about the need for universities to embrace diversity, Thomas (2012) talks of Procrustes, who

would stretch and sever the limbs of his guests to fit the size of his bed. We, too, are continuing to stretch and shape our higher education to a particular standard to the detriment of students and society alike.

In terms of e-learning, that “particular standard” is defined by the products we are using to implement industrial e-learning.


Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management, 11(1), 19-36.

Lane, L. M. (2012). Leaving an open online course.

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: A comparative analysis. Business Process Management Journal, 7(3), 216-224.

Lowe, A., & Locke, J. (2008). Enterprise resource planning and the post bureaucratic organization. Information Technology & People, 21(4), 375-400.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Sneed, H. (2000). Encapsulation of legacy software: A technique for reusing legacy software components. Annals of Software Engineering, 9(1-4), 293-313.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3).

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation.

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.




  1. Spot on David. But you could also say that the LMS is a fridge being used as a sideboard. The user just knows that if they put the fridge on its side they have a nice flat surface to put stuff on. Meanwhile they’ve put all their groceries on a table somewhere, and the groceries are busy going off in the heat.

    To put the analogy out of its misery: the user doesn’t know what they want the product for, and so they end up using it for the wrong purposes. Because of this, the product designers don’t know how to improve their product.

    When university leadership finally comes up with an articulated vision for what online learning environments should be like for their students, then we might get products that fit – or at least a minimal set of features that works really well, plus the ability to integrate with other products that do particular features best.

    • Thanks for the comment Mark.

      I agree with your point about the user not knowing what they want. But I don’t think that university leadership is the place to look for that vision. Any vision from on high tends to suffer the same problem the LMS creates: trying to standardise something that is inherently diverse.

      One of the following sections of this paper – which I have yet to write – focuses on process and will argue that an alternative (which I think is better) is a distributed leadership/cognition or communities-of-practice type bottom-up approach, where the vision of what e-learning can be arises from the “coal-face” and appropriate networks/groups of skilled people.

      That’s what I have to start on now. Will be interesting to see what you think.

  2. I’m not sure that the user doesn’t know, though. Higher education institutions contain multiple different users with legitimately different needs. Not everyone needs bulk content storage; not everyone needs a complex collaboration environment.

    But everyone gets the exact same model of fridge. So then some people buy fridge magnets and use it as a place to leave notes, and some use it as a place to keep the milk and nothing much else, and some explore fully its capacity to store all kinds of things and match the kitchen.

    In other words, the institutional LMS is not open to much customisation at the system level, so it’s customised by workarounds, avoidance and diversion at the individual user level.

    Broadly, it looks as though this has resulted in a decline in the creative and collaborative capacity of the built-in social tools, because designers know that educators can find better options online for free. But institutions are puzzled and hurt at the implementation stage when they’ve bought a big fridge and someone sees it primarily as a flat surface on which to put their microwave.

    • Kate, thanks for the comment and pulling me up on the “user doesn’t know”.

      By this I don’t mean that there aren’t potential users of e-learning who know what they want. And I certainly don’t mean that there is some expert in central L&T, the IT division, the education faculty, or senior management who knows what everyone wants.

      I don’t think there is any one person who knows the best answer. In fact, as you point out, I don’t think there is one best answer. There are lots of different answers.

      What I do mean is that it is only through an appropriate network of people (including all of those I mentioned a couple of paragraphs above) working collaboratively and responding to the lessons and insights being gathered as they engage in the practice of e-learning, that we’ll start creating some really interesting answers to the question of “what we want”.

      The biggest problem with the LMS for me, is that it gets in the way of that capability of responding to what is learned.

  3. I would say that some individual users in universities have a very good idea of what they want and need. Most individual users in universities don’t have a good idea of what they want or need. In fact, most just wish the issue would go away so that they can concentrate on which days they are going to ‘work from home’. Which of course it won’t.

    The users that decide what the university gets in the way of an LMS have absolutely no idea what the university wants or needs.

  4. How can I resist adding my two cents worth here…

    As a whitegoods vendor, we very regularly get vast requirements documents detailing the specific features of each fridge that a potential client believes will set them up to meet the needs of their various stakeholders, and we do our best to respond forthrightly as to whether our kind of fridge is going to do the job.

    This does bring with it some issues though, namely:

    1. Often one gets the feeling, as a fridge salesman, that many of these requirements have indeed been developed with significant input from ‘the coal face’. While this is in many ways a good thing, it can also lead to a voluminous list of requirements which reflects the *current* fridge, which works just fine thanks (in fact all this new fridge nonsense is just a waste, since I’m really only interested in cooking, which I can do with a new or old fridge…)

    2. Following on from this, as a fridge salesman there are often new and interesting things that we see in the exciting world of fridge development which, although you’ve never seen them before, really could enhance your cooking strategy. Sadly, the highly prescriptive list of requirements provided to fridge sales staff often stifles this potential through its level of detail and a focus on process rather than outcome (i.e. stating that the fridge “must include a button which initiates the defrost feature of the fridge when required” rather than saying it “must provide means to ensure that any excess frost does not accumulate”). This often leaves us in the sad situation of saying to ourselves “well, that’s a pretty daft way of going about it given the other options out there today, but if it’s really what you want…”.

    3. I’ve noticed over the last four years that very few customers are coming through the door looking for just a fridge any more – they are looking for an ‘integrated kitchen solution’ which cools, cooks, cleans and does a bunch of other stuff into the bargain. All in the one unit. And supported by one whitegoods store. This puts a significant amount of pressure on smaller vendors to be able to deliver a machine of such mythical kitchen omnipotence, and so we are forced to either sell a partial solution, drop out of the game, or team up with a larger supplier who can actually help us put together something which meets these ever-expanding needs…

    What would be lovely in the whitegoods trade would be to have a potential customer come to us and say ‘Help us work out what it is we *think* we really need. Let’s get our smart people with your smart people and work out what the road ahead could look like, acknowledging the past but driving towards the future. Let’s challenge whether we even need fridges any more, or if they can be consigned to the history books with the introduction of improved anti-spoilage technology. And whatever we do, let’s do whatever we can to move away from the ‘great white monoliths’ of the past, and build towards a far more interoperable, flexible kitchen design. And if, and I do mean IF, it looks like there’s the potential for us to implement something of value together, then let’s do it. If not, at least we’re not just building more of the same old same old which isn’t really satisfying anyone.’ I can almost guarantee that there would be no more work for either you, the client, or us, the vendor, in comparison to the effort we jointly perform now in creating, completing and assessing hundred-plus-page documents which often, sadly, end up as little more than dusty doorstops.

    Right, must get back to fixing this leaky old Westinghouse before my Vienettas melt…

    • What you’re suggesting makes so much sense to me, having been directly involved in creating the feature list that does exactly describe “just like our old fridge but not actually leaking on the floor”. And I feel that we did miss an opportunity to have exactly the kind of transformative cross-professional conversation that you describe, Mark.

      Why do we do it? I think it’s because we have mandatory institutional procurement practices that turn the whole thing from an imaginative conversation about working together into an awkward competitive speed dating process, with the next fridge salesman waiting outside the door.

      By the time the choice is made, it’s too late: everyone’s become committed to the vision enshrined in the 100-page doorstop, and configuration and implementation come down to whether you want the door hung on the left or the right.

      Part of the problem, I think, is the vendor model itself. It worked when we were all just shopping for fridges. The vendor and the product were inseparable.

      But now we’ve moved into interoperable kitchen design, what we need is designers, people whose expertise lies in helping institutions understand the emerging technical solutions in the context of current educational sector constraints and demands. In Australia, for example, these would be people who can talk about analytics specifically in terms of the reporting that Australian higher education institutions will have to produce in the next 3-5 years.

      So how do we get beyond the current RFP set up to create imaginative processes for ed tech companies and educational institutions to choose each other?

  5. I recently wrote a post for WCET (followed by a webcast on similar subject) that supports the arguments here – we are (and need to be) moving from monolithic LMS market to learning platform market, and institutional decision-making needs to change.

    I think Kate captures one of the barriers to making this change very well – institutional procurement practices – although I would argue that they are not as mandatory as they often appear. From my experience as a consultant, they are really customary procurement practices, based more on the self-importance of IT or purchasing organizations than on actual policies or regulations.

    For example, there might be a policy requiring some form of scoring to be objective, but that scoring does not necessarily need to be at the feature level. Feature-level scoring is the biggest cause of RFP bloat and “our old fridge but not actually leaking on the floor”, followed by fridge-salesman pain. When pushing schools to describe needs instead of features and allowing the fridge salesmen to show new approaches to meeting those needs, the biggest pushback I tend to see is IT or purchasing wanting to use their historical template, or a cynical view from committee members that we need to tell the damn fridge salesmen exactly what we need or we’ll get stuck with a monstrosity just like last time.

    The other pushback is a very shallow understanding of group decision-making at most institutions. Voting and scoring are not always (or even usually) the best mechanisms for group decisions. They provide the veneer of objectivity while abdicating the real responsibility of group deliberations and building consensus.

    In other words, I completely agree that the purchasing process is a major problem, but it doesn’t have to be this way – it is organizational resistance and control issues that often prevent change.

    One other point (of agreement): the definition of future capabilities should be an iterative process, with users explaining current pain points and desires, and fridge salesmen showing new concepts and potential visions of the future. Any formal purchasing process should either include this back-and-forth with a demo/discussion phase to help define requirements, or preferably should start after this iteration has had some time to work, through pilot projects, etc.

    The work done at UMassOnline by Patrick Masson and team is a good example of this more flexible approach, and it was done at a public institution.

    Very good post and discussion.

  6. I think Mark Drechsler hits all of the right points. Software selection (not just of the LMS) is driven by feature comparisons: “Does the fridge have an ice dispenser?” “Can the ice dispenser dispense cubes and crushed ice?” “Can the ice dispenser also dispense water?” “Hot and cold water?” As more and more features are added to fridges, these features essentially become standards that must be in place for a system (fridge or LMS) to even be called a system (fridge or LMS). What fridge doesn’t make ice? And why do you need hot water from your “refrigerator”?

    At UMassOnline we’ve been advocating for a new approach to RFPs that would eliminate the increasingly granular feature-by-feature comparisons and focus on functionality, that is, what instructors and students can do with those features. I don’t have a good fridge analogy, so I will stick with a tried-and-true example I frequently use.

    Often in an LMS RFP you will find feature requirements around discussion boards: can you sort by thread, author, date, etc.; can you assign permissions (view/edit); can you do assessments, etc. To all of these feature requirements respondents can check “Yes” (which provides really no differentiation between any product or provider), or they can try to offer some examples and insights into how one might use their discussion forums to sort, assign and assess learning activities (assuming the evaluators share the learning objectives afforded through those examples). Rather than listing features, UMassOnline asked evaluators to provide user stories. User stories are first-person (short) narratives that describe what you want to do (not what you want to use) and also provide a test scenario for assessing whether you can indeed do that. They usually follow a simple outline: “As a [user/stakeholder/actor], I want to [desired condition], so that I can [acceptance criteria].” Each of these user stories (over 300 from seven campuses, in UMassOnline’s case) was then submitted to the providers. The respondents were then asked to provide a “testing script” for each user story. The testing scripts were essentially instructions on how to accomplish the desired condition and meet the acceptance criteria.

    So imagine I teach a course on fiction and I want my class to team up to write a single description of one character. I then want them to review one another’s to construct one full description. I expect them to draft something, review each other’s work, comment on the drafts, collaborate on new versions, revise, etc., even critique/assess each other’s contributions.

    What features would I include in my RFP to allow this? With user stories, instead of features, I could simply say, “As an instructor, I want to create small groups so they can do peer-to-peer assessment.” To accomplish the above learning activity, you might use any number of tools/features, e.g. a discussion forum, email, track changes in MS Word, a wiki, a blog, etc. The tools do not matter; it’s what you do with the tools that makes the application useful.

    For the RFP responses, providers would submit the instructions on how to set up a group (in a discussion forum, wiki, blog, or/and email list, etc.), limit participation to that group, allow monitoring, commenting, editing, and assessments, etc. Then the evaluators would access the system, follow the instructions and see if they could indeed achieve what they desired: letting a small group of students co-create a character narrative.
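    The story–script–evaluation loop described above can be sketched as plain data structures. This is an illustrative sketch only: the class names, fields and steps are my own assumptions for showing the shape of the process, not UMassOnline’s actual tooling.

```python
# Sketch of the user-story RFP process: a campus submits a story, a vendor
# answers with a testing script, and an evaluator records whether following
# the script actually achieved the story's goal.
from dataclasses import dataclass, field
from typing import List


@dataclass
class UserStory:
    """First-person requirement: 'As ..., I want to ..., so that I can ...'."""
    role: str
    desired_condition: str
    acceptance_criteria: str

    def __str__(self) -> str:
        return (f"As {self.role}, I want to {self.desired_condition}, "
                f"so that I can {self.acceptance_criteria}.")


@dataclass
class TestingScript:
    """A vendor's step-by-step instructions for satisfying one user story."""
    story: UserStory
    steps: List[str] = field(default_factory=list)


def evaluate(script: TestingScript, achieved: bool) -> dict:
    """An evaluator walks through the steps and records the verdict."""
    return {"story": str(script.story),
            "steps": len(script.steps),
            "achieved": achieved}


# The worked example from the discussion above, in this form:
story = UserStory("an instructor", "create small groups",
                  "do peer-to-peer assessment")
script = TestingScript(story, [
    "Create a group set for the course",
    "Restrict a discussion forum to each group",
    "Enable peer review on the group assignment",
])
result = evaluate(script, achieved=True)
```

    The point of the sketch is that the story names an outcome, not a feature; the vendor-supplied steps are free to use whatever tool achieves it, and the evaluation hangs off the acceptance criteria alone.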

    This approach removes the feature/product bias and the tendency of evaluators to go with what is familiar or what they already know (have), while allowing providers the opportunity to differentiate their product’s innovative approach to teaching and learning. Evaluators shouldn’t care about the technology; they should care about the techniques, i.e. the affordances a tool enables. You can learn more here:

    • Thanks to everyone for all the comments, I’m busy trying to finish this paper so my comments will be brief and will not show a deep engagement with the perspectives raised in the comments. Instead, I’ll briefly mention what I think is distinctive about the alternative I’m suggesting.

      That is, the very idea of an RFP is a teleological process, and assumes that a process of in-depth analysis can find the right answer for the organisation. Whether the analysis is done by the organisation or by the experienced “fridge salesman”, it’s still based on this techno-rational, almost instructivist, ideal.

      The alternative I try, very badly, to summarise towards the end of the follow-up post to this one is that the process around the “support” of institutional e-learning systems needs to be based on a more emergent/agile/ateleological process – perspectives that have a fairly strong socio-constructivist, situated-cognition foundation.

      The problem I have with the LMS isn’t how organisations select them; it’s that the nature of the LMS helps prevent organisations from adopting these ateleological processes.
