Assembling the heterogeneous elements for (digital) learning

The network challenge to the LMS mindset

It’s been an intellectually draining few days with a range of visitors to the institution talking about different topics. It started on Monday with George Siemens giving a talk titled “The future of the University: Learning in distributed networks” (video and audio from the talk available here). A key point – at least one I remember – was that the structure of universities follows the structure of information, and the current trend is toward distributed networks. On Tuesday, Mark Drechsler ran a session on extending Moodle using LTI, which raises some interesting questions about how the LMS mindset will handle the challenge of a distributed network mindset.

LMS != distributed network

The LMS is an integrated, enterprise system sourced from a single vendor. The institution as a whole decides which of the available systems it will adopt and then implements it on a server. The students and staff of that institution must now make use of that system. Changes to that system are controlled by a governance structure that may or may not approve the addition or removal of functionality. Only a designated IT group (outsourced or local) is actually able to make the change. The “network” has a single node.

The typical mindset encouraged by an LMS when designing learning is not “what is the best way to engage student learning?” It’s “what is the best way to engage student learning within the constraints of the functionality currently provided by the LMS?” I wrote more about the limitations of this model in Jones (2012) and almost incessantly over the last few years. Chapter 2 of my PhD thesis covers a fair bit of this argument.

Over recent years most institutions have realised that a single-node network consisting of the LMS isn’t sufficient for their needs. The network has gained a few new nodes such as a lecture capture system, a content repository, an eportfolio system and a range of others. However, this “network” of services isn’t really a distributed network, in that only institution-approved processes can add to it. Neither I as an academic, nor one of my students, can decide to add a service that is integrated into this institutional network.

Sure, we can use standard hyperlinks to link off to Google Docs or any one of the huge array of external services out there. An extreme example is my kludge for using BIM this year, where I’m hosting a version of BIM on my laptop because, for various reasons (including many of my own making), BIM couldn’t be installed into the institutional version of Moodle in time.

The trouble is that these kludges are not members of the distributed learning systems network associated with the institution. The most obvious indicator of this is the amount of manual work I need to engage in to get information about students from the institutional system into my local install of BIM and then to get information out of my local install of BIM back into the institutional ecosystem.

To have seamless integration into the institutional LMS network requires going through the institutional governance structure. Now there are good reasons for this, but many of them arise from the problem of the LMS not being a network. Some examples include:

  • a “bad” addition to the LMS could bring the system down for everyone;

    If the LMS were a network, then this wouldn’t happen. The additions would be on another node, so that if an addition were “bad” only that node would be impacted. If nodes could be added by individuals, then only that individual’s applications would be impacted.

  • not enough people are going to use the addition;

    To make it worthwhile to integrate something into the system, there has to be the likelihood that a large number of people are going to use it. Otherwise it’s not worth the effort. The cost of adding something to an integrated system is high. With a network approach the cost of adding a new feature should be low enough to make it economical for only one person to use it.

  • who’s going to help people use the new addition;

    Since a large number of people have to be able to use the system, this raises the question of who is going to support those people. In a network approach, there isn’t this need. In fact, I may decide I don’t want other academics using the service I’ve added.

  • the inertia problem;

    The other major impact of this high cost of integrating a tool into the LMS is inertia. The cost of making changes and the cost of negative impacts mean great care must be taken with changes. This makes rapid, ongoing improvement difficult, leading to inertia. Small-scale improvements suffer from a starvation problem.

  • the trust problem;

    Since it’s a high-cost, high-risk situation, only a very limited group of people (i.e. central IT) is allowed to make changes, and only after approval by another limited group of people (the governance folk).

  • vanilla implementation.

    All of the above leads to vanilla implementations. It’s too difficult to manage the above, so let’s implement the system as is. I’ve heard stories of institutions moving away from more flexible systems (e.g. Moodle) back toward more constrained commercial systems because it removes what element of choice there is. If there’s no choice, then there’s no need for complex discussions. It’s easier to be vanilla.

The LTI Challenge

The Learning Tools Interoperability standard, or more precisely its integration into various LMSs, offers a challenge to this LMS mindset. LTI offers the possibility – at least for some – of turning all this into more of a network than an integrated system. The following will illustrate what I mean. What I wonder is: how well will the existing governance structures around institutional LMSs – with their non-distributed-network mindset – respond to this possibility?

Will they

  1. Recognise it as a significant advantage and engage in exploring how they can effectively encourage and support this shift?
  2. Shut it down because it doesn’t match the LMS mindset?


In the very near future, BIM will be installed into the institutional Moodle install for use by others. I have always feared this step because – due to the reasons expressed above – once BIM is installed I will not be able to modify it quickly.

LTI apparently offers a solution to this via the following approach:

  1. I set up a version of Moodle on one of the freely available hosted services.

    This would be my install of Moodle, equivalent to what I run on my laptop. No-one else would rely on this version. I could make changes to it without affecting anyone. It’s a separate node in the network relied upon by my course. I can install a version of BIM on it and modify it to my heart’s content, confident that no-one else will be impacted by changes.

  2. Install the Moodle LTI Provider module on my version of Moodle.
  3. Set up a course on my version of Moodle, create a BIM activity and add it to the LTI provider module.

    This allows any other LTI enabled system to connect to and use this BIM activity as if it were running within that system, when it is actually running on my version of Moodle. Of course, this is only possible when they have the appropriate URL and secret.

  4. Go to the institutional version of Moodle and the course in which my students are enrolled and add an “External Tool” (the Moodle name for an LTI consumer) that connects to BIM running on my version of Moodle.

    From the student (and other staff) perspective, using this version of BIM would essentially look the same as using the version of BIM on the institutional Moodle.
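The key to step 4 is that the institutional Moodle (the LTI consumer) signs each launch request with OAuth 1.0 HMAC-SHA1 using the shared key and secret from step 3. As a rough sketch of what happens under the hood – the URL, key and secret below are made-up placeholders, and the launch URL is assumed to carry no query string for simplicity – signing a basic LTI 1.x launch looks something like this:

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote


def sign_lti_launch(launch_url, consumer_key, shared_secret, params):
    """Add OAuth 1.0 HMAC-SHA1 signing fields to an LTI 1.x launch.

    This mirrors what the consumer (the institutional Moodle's
    "External Tool") does before POSTing the launch to the provider.
    """
    fields = dict(params)
    fields.update({
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_timestamp": str(int(time.time())),
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_version": "1.0",
    })
    # OAuth signature base string: method, URL and the sorted,
    # percent-encoded parameters, each part percent-encoded again.
    encoded = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}"
        for k, v in sorted(fields.items())
    )
    base_string = "&".join(
        ["POST", quote(launch_url, safe=""), quote(encoded, safe="")]
    )
    # Signing key = consumer secret plus an (empty) token secret.
    key = quote(shared_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1)
    fields["oauth_signature"] = base64.b64encode(digest.digest()).decode()
    return fields  # POSTed as form fields to the provider's launch URL


launch = sign_lti_launch(
    "https://my-moodle.example.com/local/ltiprovider/tool.php",  # placeholder
    "bim-demo-key", "bim-demo-secret",                           # placeholders
    {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "bim-activity-1",
        "user_id": "s123456",
        "roles": "Learner",
    },
)
```

The provider recomputes the same signature with its copy of the secret and rejects the launch if they differ, which is why the URL-plus-secret pair from step 3 is all the two Moodles need to trust each other.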

LTI allows the institutional LMS to become a network. A network to which I can add nodes that are actually part of the network in terms of sharing information easily. It’s a network where I control the node I added, meaning it no longer suffers from the constraints of the institutional LMS.

The downsides and the likely institutional response

This is not an approach currently in the reach of many academics. It’s not an approach required by many academics. But that’s the beauty of a network over an integrated system: you don’t need to be constrained by the lowest common denominator. Different requirements can be addressed differently.

In terms of technical support, there would be none, i.e. you couldn’t expect the institutional helpdesk to be able to help diagnose problems with my Moodle install. I would have to take on the role of troubleshooting and ensure that students, if they have problems, aren’t asking the helpdesk.

Perhaps more difficult are questions around student data. I got in trouble last year for using a Google spreadsheet to organise students into groups due to students entering their information onto a system not owned by the institution (even though the student email system is outsourced to Google). I imagine having some student information within a BIM activity on an externally hosted server that hasn’t been officially vetted and approved by the institution would be seen as problematic. In fact, I seem to recollect a suggestion that we should not be using any old Web 2.0 tool in our teaching without clearing it first with the institutional IT folk.

Which brings me back to my questions above: will the organisational folk

  1. Recognise the LTI-enabled network capability as a significant advantage and engage in exploring how they can effectively encourage and support this shift?
  2. Shut LTI down (or at least restrict it) because it doesn’t match the LMS mindset?

    How long before the LTI consumer module in the institutional LMS is turned off?

LTI seems to continue what I see as the inexorable trend toward a more networked approach, or, as framed earlier, toward enabling the best-of-breed approach to developing these systems. LTI enables the loose coupling of systems. Interesting times ahead.


Jones, D. (2012). The life and death of Webfuse : principles for learning and leading into the future. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings ascilite Wellington 2012 (pp. 414–423). Wellington, NZ.




  1. beerc

    G’day DJ

    Interesting post. I’ve been watching the LTI space with interest (and hope). The concept of loose coupling appeals to me and much of our work in recent years is based around it. The reason loose coupling with enterprise systems appeals to me is that you can often bypass the usual constraints associated with enterprise systems. For example, it’s a long, arduous task to develop a Moodle plugin that does something useful in your local context. Then you have to battle to have the Moodle plugin tested, installed, and so on and so forth. None of this aligns with the agile approach to development that is often required to make something truly useful in a given context.

    It just seems rather comical to me that we are trying to facilitate quality education using rigid systems and processes that simply cannot adapt or evolve to match the current context, much less the actual context that is always changing. So while I appreciate and look forward to LTI and the possibilities it affords, I can’t help thinking that the bigger problem is with the rigidity of enterprise thinking. Here we are nearly ten years out from the Web2.0 boom and higher education is talking about a distributed LMS?


    • What I think will be really interesting is how long it will take “the rigidity of enterprise thinking” to recognise that there’s a potential challenge to their mindset. My understanding is that the ability to add an external (LTI) tool in Moodle is standard with Moodle 2.2 or later. It’s just there.

      Would love to do research explaining this to people from different backgrounds and roles and see how they see the capability and what they plan to do with it.

  2. David,

    Thanks for the post – one question for clarification though. You mention that one of the themes in Siemens’ presentation was that the “structure of universities follows the structure of information”. Is this correct, or is it more correct that the “structure of LEARNING follows the structure of information”, and that the “structure of universities SHOULD follow the structure of information” (regardless of whether they are doing this in reality or not)?

    As I said in the session, I won’t consider LTI-driven integrations as having really arrived until adding in an external tool is as simple as installing an app on an iPhone (and yes, I realise the metaphor doesn’t bear close scrutiny from a technical perspective before anyone lays into me about it). Right now, using an external tool via LTI is often an exercise in confusion, guesswork and frustration (at least in my experience), and something that most academic staff wouldn’t go near. That’s not to mention the aspects lacking in the current LTI 1.x specification (the lack of logging data pushed back to the LMS being one showstopper for me, which I’m not sure is on the roadmap for 2.0 or not), nor those folks out there ‘extending’ LTI for some tools so that they will only work on ‘specific’ LMS platforms…

    All that aside, even if the standard and the interface both realise their potential, some of the policy, risk and institutional control challenges will be far harder to solve. That raises the question, in some ways, of whether these challenges are just an indication of a HE sector that has not evolved at the same rate as how we learn has evolved. I know Allan disagrees with me on this one – hoping he will weigh in to this argument as well 😉



    • re: the point Siemens made, I can’t remember the exact quote. I did try a quick search through the video but couldn’t find it. George did RT the post, so I’m hoping that if there were any major misrepresentation of his thoughts he would have corrected (or will correct) it. In the end, I think the point about moving to a more networked structure for both the institution and its technologies would fit.

      I tried quickly to get my local Moodle install using LTI to talk to itself. It didn’t work, with a not-all-that-clear error message, which I haven’t had time to explore. So I can see your point about it not being a widespread solution. But it is perhaps enough for a range of more advanced/ornery folk to explore.

      In terms of the HE sector’s evolution, I’m in your boat. To some extent I think the dominant “pedagogies” I see in HE, combined with the dominant governance strategies, indicate a sector a long way behind. There are pockets of people doing brilliant stuff, but they are the exception.

  3. I’ve done a good amount of work with LTI, particularly working with vendors to be their first LTI consumer. What I’ve found is that providers often do not understand or care about FERPA guidelines here in the US. During the evaluation period of many LTI providers we’ve found significant problems, for example: leaking student/course roster information to unauthorized users, or student submission data open to anyone.

    We limit the addition of LTI tools centrally. For LTI providers within the University we streamline the process. I’m not sure the typical end user is capable of evaluating the security of edtech tools.

    • Thanks Andy for sharing your experience. While my sentiment above may come across as anti-LMS/governance, there are legal and other reasons why adding a node to the network needs a level of checks and balances. Getting the balance between these checks and the benefits that arise from lightweight network addition will be interesting to observe. Especially if you think about different possible futures where students are able to make linkages. For example, a school following UMW’s “domain of one’s own” model, where students might have services that make sense to share within the institutional ecosystem. LTI and what comes next do seem to be a move toward that sort of possibility.

  4. I see only positives with LTI. It is a win for faculty and students. A whole new set of tools are showing up that extend the LMS and provide things that faculty have been wanting to do for years. Now they integrate with the LMS, provide single sign-on, and enable data flow between the two. I teach with Canvas, and it allows faculty to locate LTI apps and add them to their own courses without having to go through the LMS administrator. Some schools may freak out about it, but in the end it only makes sense. Sure, we need to make sure that student data is protected, but in my experience using LTI tools I see only positive things.

    • Thanks for the positive comments. I hope it’s clear from the above that, for me as faculty, LTI is a godsend. I only wish I’d explored it earlier this year. It would have saved me a lot of time and energy. Moodle currently allows the same freedom for faculty. But I do fear it being turned off sometime soon as the implications are thought through by those in the organisational governance structure.

  5. alishahrazad

    Great post David, interesting timing for me to find this post because I just had a great conversation with an HE admin (using Canvas) who is very happy with the LTI framework and its flexibility at their institution, and has an open mind about the checks & balances you mention above (a favorable institutional response in my opinion).

    I think another critical barrier to having a distributed network of “learning” tools and activities alongside the LMS is the lack of standardization for learning-related data. Every tool out there has its own way to track results, student/employee information, and provide analyses.

    The Experience API (also known as the Tin Can API) can be that missing link. I’d encourage those interested who haven’t done so already to learn more from the US DoD’s ADL (Advanced Distributed Learning) team.
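For the curious, the gist of xAPI is that every tool reports the same simple actor–verb–object statements to a Learning Record Store, regardless of what the tool is. A minimal statement looks like the sketch below; the actor and activity id are made-up placeholders, while the verb IRI is one of those published by ADL:

```python
import json

# Minimal Experience API (Tin Can) statement: actor, verb, object.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",             # placeholder
        "mbox": "mailto:student@example.com",  # placeholder
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://my-moodle.example.com/bim/activity/1",  # placeholder
        "definition": {"name": {"en-US": "BIM reflective blogging activity"}},
    },
}

# What a tool would POST to an LRS's statements endpoint.
payload = json.dumps(statement)
```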


    • G’day Ali,

      Thanks for the comment. There seems to be a definite trend in the comments here that the Canvas folk are getting it. Perhaps institutions that adopt Canvas also “get” the network mindset. My experience has been at Universities where the traditional enterprise IT approach (i.e. not the network mindset) has been predominant. It would be interesting to see if such a difference existed.

      With LTI and Tin Can it would definitely seem that the learning standards community is heading more toward the network mindset. So while the tools may be lagging a bit it certainly seems like they are heading in the right direction. Which brings me to the question of the mindsets of IT governance and whether they will keep up.

