Assembling the heterogeneous elements for (digital) learning

Category: webfuse Page 1 of 6

TAM, #moodle, online assignment submission and strategic implementation

The following expands on a mention in the last post: one set of thoughts about why/how we might extend/reuse/build upon some prior research using the Technology Acceptance Model (TAM) and its application to understanding the use (or not) of Online Assignment Submission and Management (OASM) in higher education.

In summary,

  1. In 2005, we published a couple of papers (Behrens et al, 2005; Jones et al, 2005) in which the Technology Acceptance Model (TAM) (Davis, 1989; Venkatesh et al, 2003; Venkatesh & Bala, 2008) was used to explore why an increasingly widely used online assignment submission and management tool (OASIS) was successful.
  2. In 2009/2010, that system was replaced as part of a project to adopt a single LMS (Moodle) that was designed to provide “appropriate support for staff and students to access and use ICT effectively in learning and teaching” (Tickle et al, 2009, p. 1040).
  3. There have been mixed messages about the success of that project. For example, Tynan et al (2009) suggest this

    It is probable that since the institution had undergone a large review and renewal of technology in the learning management system where processes to support academics were put in place and where academics were included in decision making and empowered to change and upskill, negative attitudes towards the general impact of technology were not an issue for staff. One can hypothesise that these issues were principally resolved.

    In contrast, Rossi and Luck (2011) report on a range of issues with the transition, including a significant loss of functionality in terms of online assignment submission and management.

    There have also been some significant anecdotal comments about issues with the Moodle OASM functionality with large classes.

  4. It’s now three or four years since the implementation of Moodle. Now would appear to be a good time to explore the usage of the Moodle OASM functionality and the perceptions of teaching staff, enabling some comparison with the earlier findings from the 2005 work. Especially given findings such as the following

    The study concludes that staff perceptions have indeed changed and whilst more staff are using online systems for assignment submission, marking and feedback, many do not have a positive attitude towards it. This could be explained by the increased prevalence of available systems and tools alongside their mandated presence.

    while at the same time students are “wholeheartedly in support” of OASM.

There’s a lot to unpack here. An initial list includes

  • The original OASM system was entirely optional. There was no perception that OASM was compulsory or mandatory in the early noughties. As Huber (2013) suggests there is a growing trend toward the expectation or the explicit mandating of OASM. TAM research suggests that optional and mandatory adoption decisions have different impacts/factors.

    Is OASM now seen as mandatory?

  • What is the actual use of OASM?

    We can use “learning analytics” to examine the trends in adoption and use, as we did in 2009 (Beer et al, 2009). The image below compares the use of “evaluating students” features between the institution’s prior LMSes, Blackboard and Webfuse. “Evaluating students” includes both OASM and quizzes. The purple and green lines indicate the maximum/minimum adoption rates expected for this feature from Malikowski et al (2007).

  • The impact of prior use.

    As Huber (2013) suggests, OASM is increasingly widespread. Obviously, prior experience with OASM will influence perceptions (versions of TAM suggest this as well). But will there be a difference between people who have used the Webfuse OASM system and those who have used other systems?

  • What are the factors that impact perceptions?

    The free text responses we focused on in earlier papers are useful for identifying what it is that people like (or don’t) about these systems. This could be interesting.

  • How specific do we get?

    Huber (2013) draws on surveys that are much more explicit in exploring different uses of OASM. The TAM survey is more generic and open ended. What’s the right mix?

The life and death of Webfuse: What's wrong with industrial e-learning and how to fix it

The following is a collection of presentation resources (i.e. the slides) for an ASCILITE’2012 presentation of this paper. The paper and presentation are a summary of the outcomes of my PhD work. The thesis goes into much more detail.


Drawing on the 14-year life and death of an integrated online learning environment used by tens of thousands of people, this paper argues that many of the principles and practices underpinning industrial e-learning – the current dominant institutional model – are inappropriate. The paper illustrates how industrial e-learning limits the outcomes of tertiary e-learning and the ability of universities to respond to uncertainty and effectively explore the future of learning. It limits their ability to learn. The paper proposes one alternate set of successfully implemented principles and practices as being more appropriate for institutions seeking to learn for the future and lead in a climate of change.


The slides are available on Slideshare and should show up below. These slides are the extended version, prior to the cutting required to fit within the 20 minute time limit.


Arnott, D. (2006). Cognitive biases and decision support systems development: a design science approach. Information Systems Journal, 16, 55–78.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney.

Brews, P., & Hunt, M. (1999). Learning to plan and planning to learn: Resolving the planning school/learning school debate. Strategic Management, 20(10), 889–913.

Cecez-Kecmanovic, D., Janson, M., & Brown, A. (2002). The rationality framework for a critical study of information systems. Journal of Information Technology, 17, 215–227.

Central Queensland University. (2004). Faculty teaching and learning report. Rockhampton, Australia.

Davenport, T. (1998). Putting the Enterprise into the Enterprise System. Harvard Business Review, 76(4), 121–131.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), (pp. 43–59). New York: Springer.

Dillard, J., & Yuthas, K. (2006). Enterprise resource planning systems and communicative action. Critical Perspectives on Accounting, 17(2-3), 202–223.

Fleming, P., & Spicer, A. (2003). Working at a cynical distance: Implications for power, subjectivity and resistance. Organization, 10(1), 157–179.

Haywood, T. (2002). Defining moments: Tension between richness and reach. In W. Dutton & B. Loader (Eds.), (pp. 39–49). London: Routledge.

Hutchins, E. (1991). Organizing work by adaptation. Organization Science, 2(1), 14–39.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20–39.

Jamieson, K., & Hyland, P. (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. Adelaide, Australia.

Jones, D. (1996). Solving Some Problems of University Education: A Case Study. In R. Debreceny & A. Ellis (Eds.), Proceedings of AusWeb’96 (pp. 243–252). Gold Coast, QLD: Southern Cross University Press.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In P. Barker & S. Rebelsky (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002 (pp. 884–889). Denver, Colorado: AACE.

Jones, D. (2003). Course Barometers: Lessons gained from the widespread use of anonymous online formative evaluation. QUT, Brisbane.

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In A. Christie, B. Vaughan, & P. James (Eds.), Making New Connections, ascilite’1996 (pp. 331–345). Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemens & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398–406). Chesapeake, VA: AACE.

Jones, N., & O’Shea, J. (2004). Challenging hierarchies: The impact of e-learning. Higher Education, 48, 379–395.

Katz, R. (2003). Balancing Technology and Tradition: The Example of Course Management Systems. EDUCAUSE Review, 38(4), 48–59.

Kurtz, C., & Snowden, D. (2007). Bramble Bushes in a Thicket: Narrative and the intangibles of learning networks. In M. Gibbert & T. Durand (Eds.), . Blackwell.

Laurillard, D. (2002). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. London: Routledge.

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216–224.

March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71–87.

Mintzberg, H. (1989). Mintzberg on Management, Inside our Strange World of Organisations. New York: Free Press.

Morgan, Glenda. (2003). Faculty use of course management systems. Educause Centre for Applied Research.

Morgan, Glenn. (1992). Marketing discourse and practice: Towards a critical analysis. In M. Alvesson & H. Willmott (Eds.), (pp. 136–158). London: SAGE.

Pozzebon, M., Titah, R., & Pinsonneault, A. (2006). Combining social shaping of technology and communicative action theory for understanding rhetorical closure in IT. Information Technology & People, 19(3), 244–271.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17–46.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60–75.

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Seely-Brown, J., & Hagel, J. (2005). From push to pull: The next frontier of innovation. The McKinsey Quarterly. McKinsey & Company.

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3).

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012.

Truex, Duane, Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53–79.

Truex, Duanne, Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117–123.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Underwood, J., & Dillon, G. (2011). Chasing dreams and recognising realities: teachers’ responses to ICT. Technology, Pedagogy and Education, 20(3), 317–330. doi:10.1080/1475939X.2011.610932

Wagner, E., Scott, S., & Galliers, R. (2006). The creation of “best practice” software: Myth, reality and ethics. Information and Organization, 16(3), 251–275.

Weick, K., & Quinn, R. (1999). Organizational change and development. Annual Review of Psychology, 50, 361–386.

A triumph of the explicit over the tacit and the subsequent loss of learning

I’ve spent the last week dealing with a range of institutional systems for the submission and processing of assignments, results etc. I’m likely to spend at least another week or two trudging through the inexplicable holes, dead-ends, and busy work such systems create. Hence the need for a break. While walking through the local “Japanese Gardens” back to the office I stumbled across a possible explanation. Or at least a catchy phrase to represent that explanation and provide an opportunity to revisit and share some recent reading.

These ill-fitting systems are illustrative of the triumph of the explicit over the tacit (and implicit) that is embodied in the type of business-like processes and policies in use in the modern Australian university. This triumph is the biggest barrier to widespread improvement and innovation in learning and teaching at those institutions because it limits institutional learning.

Japanese Gardens

For example, the design of these information systems is based on the traditional Software Development Life Cycle, where some poor sod had to develop the set of requirements which were then dutifully turned into software by the IT department or some vendor (even worse, because the requirements become even less important as the focus shifts to what the vendor’s system can do). The requirements have to be made explicit so that the IT department can prove to unhappy users, who find a system they can’t use, that the system is exactly what the users asked for.

Beyond this there is a need to make explicit various formal university policies (and then unintentionally hide them on Intranets). There needs to be an explicit model for everything, and everyone has to follow the same explicit model. After all, consistency is quality (isn’t it?). Managers are happy when they see evidence of consistent following of these processes, when in reality every person and their dog is complaining bitterly about the constraints and inappropriateness of these models and actively searching for any way around them.

The attempt to capture all insight and knowledge about the system and its requirements and make it explicit has failed. And worse the people, policies and processes put in place are largely incapable of recognising this. Let alone being able to do something about it. But the explicit has triumphed.

By trying to appear rational and capable of making everything explicit, these processes and policies are sacrificing the tacit. Not only does this triumph make the institution ignorant of the reality of the lived experience of its staff and students, it sacrifices any ability to learn and innovate.

This is no great original insight. Many folk have observed similar previously.

Ciborra (2002)

In this respect, science-based, method-driven approaches can be misleading. Contrary to their promise, they are deceivingly abstract and removed from practice.

…how to go beyond appearances and the illusionary realness of those management and systems concepts in common currency and how to value the apparitions of the ordinary in the life of information systems and organisations.

The development of organisational information systems, processes and policies aims to abstract away the realities of context and achieve a neat, tidy, rational model. I see a great similarity between this and what Seely Brown, Collins and Duguid (1989) suggest

Many methods of didactic education assume a separation between knowing and doing, treating knowledge as an integral, self-sufficient substance, theoretically independent of the situations in which it is learned and used.

You can see evidence of this in the observation that the people who develop such systems are generally not involved in the day to day situation in which those systems are used. Not to mention that many of the system owners aren’t directly involved in the day to day use of such systems. The administrative staff put in place to double check consistent following of the standard process never actually have to complete the process themselves. They just make sure everyone else does. The “knowers” and “doers” are separate.

It’s not uncommon for the “knowers” to talk disparagingly of the “doers”. Anyone who’s attended a meeting where IT management and academic management get together will have stories of this. So, not only do they not engage in the doing, they don’t value the insights that arise from the “doing”.

In getting into situated learning, Seely Brown et al (1989) continue

The activity in which knowledge is developed and deployed, it is now argued, is not separable from or ancillary to learning and cognition. Nor is it neutral. Rather, it is an integral part of what is learned.

And this, I believe, isn’t limited to the development of information systems and formal university policies and processes. This triumph of the explicit over the tacit directly informs much of the practice of central learning and teaching policies and processes. The very institutional instruments that are meant to inform and improve the quality of learning and teaching.

I’m going to suggest that the harnessing of Information and Communication Technologies (ICTs) and the widespread improvement of the quality of learning and teaching within Australian Universities is being significantly held back because of this uncritical acceptance of supposedly rational methods that result in the triumph of the explicit over the implicit. Even worse, this triumph is a big drag on the ability of these institutions to learn how to be better and to innovate.

Ciborra (2002)

let us drop the old methodologies, in order to be better able to see the new dimensions the technology is going to disclose to us

The capacity to integrate unique ideas and practical design solutions at the end-user level turns out to be more important than the adoption of structured approaches to systems development or industry analysis

The power of bricolage, improvisation and hacking is that these activities are highly situated; they exploit, in full, the local context and resources at hand, while often pre-planned ways of operating appear to be derooted, and less effective because they do not fit the contingencies of the moment. Also, almost by definition these activities are highly idiosyncratic, and tend to be invisible both because they are marginalised and because they unfold in a way that is small in scope. The results are modes of operating that are latent and not easy to imitate. Thus, the smart bricolage or the good hack cannot be easily replicated outside the cultural bed from which it has emerged.

Throw away your best practices, your annual plans, quality assurance etc (at least a bit) and allow the space and resources for bricolage. Allow the harnessing of the tacit.


Ciborra, C. (2002). The Labyrinths of Information: Challenging the Wisdom of Systems. Oxford, UK: Oxford University Press.

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

One example of industrial e-learning as "on the web" not "of the web"

The following arises from some recent experiences with the idea of “minimum course sites” and this observation from @cogdog in this blog post

I have no idea if this is off base, but frankly it is a major (to me) difference of doing things ON the web (e.g. putting stuff inside LMSes) and doing things OF the web.

It’s also a query to see if anyone knows of an institution that has implemented a search engine across its institutional e-learning systems in a way that effectively allows users to search for resources in a course-centric way.

The symptom

There’s a push from my current institution’s central L&T folk to develop a minimum course site standard: some minimum set of services, buttons etc. that will achieve the nirvana of consistency. Everything will be the same.

The main espoused reason as to why this is a good thing is that the students have been asking for it. There has been consistent feedback from students that none of the course sites are the same.

The problem

Of course, the real problem isn’t that students want everything to be the same. The real problem is that they can’t find what they are looking for. Sure, if everything was the same then they might have some ideas about where to find things, but that has problems including:

  • The idea that every course site at a university can be structured the same is a misunderstanding of the diversity inherent in courses. Especially as people try to move away from traditional models such as lecture/tutorial etc.
  • The idea that one particular structure will be understandable/appropriate to all people is also questionable.
  • Even if all the sites are consistent and this works, it won’t solve the problem of the student who is working on a question about “universal design” and wants to find where that was mentioned amongst the many artefacts in the course site.

The solution

The idea that the solution to this problem is to waste huge amounts of resources in the forlorn attempt to achieve some vaguely acceptable minimum standard that is broadly applicable seems to be a perfect example of “doing things ON the web, rather than doing things OF the web”.

I can’t remember the last time I visited a large website and attempted to find some important information by navigating through the site structure. Generally, I – like I expect most people – come to a large site almost directly to the content I am interested in either through a link provided by someone or via a search engine.

Broader implications

To me the idea of solving this problem through minimum standards is a rather large indication of the shortcomings of industrial e-learning. Industrial e-learning is the label I’ve applied to the current common paradigm of e-learning adopted by most universities. It’s techno-rational in its foundations and involves the planned management of large enterprise systems (be they open source or not). I propose that “industrial e-learning” is primarily capable of, and concerned with, “doing things ON the web, rather than doing things OF the web”.

Some potential contributing factors might include:

  1. Existing mindsets.
    At this institution, many of the central L&T folk come from a tradition of print-based distance education where consistency of appearance was a huge consideration. Many of these folk are perhaps not “of the web”.
  2. Limitations of the tools.
    It doesn’t appear that Moodle has a decent search engine, which is not surprising given the inspiration of its design and its stated intent of not being an information repository.
  3. The nature of industrial e-learning, its product and process.
    A key characteristic of industrial e-learning is a process that goes something like this
    1. Spend a long time objectively selecting an appropriate tool.
    2. Use that tool for a long time to recoup the cost of moving to the new tool.
    3. Aim to keep the tool as vanilla as possible to reduce problems with upgrades from the vendor.
      This applies to open source systems as much as proprietary systems.
    4. Employ people to help others learn how to best use the system to achieve their ends.
      Importantly, the staff employed are generally not there to help others learn how to “best achieve their ends”; the focus definitely tends to be on how to “best use the system to achieve their ends”.
    5. Any changes to the system have to be requested through a long-winded process that involves consensus amongst most people and the approval of the people employed in point 4.

    This means that industrial e-learning is set up to do things the way the chosen systems work. If you have to do something that isn’t directly supported by the system, it’s very, very hard. e.g. add a search engine to Moodle.

All of these make it very hard for industrial e-learning to be “doing things OF the web”.
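The course-centric search the post is asking for can be sketched as a toy inverted index. This is purely illustrative: the class, course codes and page names below are all invented, and a real implementation would sit behind the institution's authentication and index actual course artefacts.

```python
from collections import defaultdict


class CourseSearchIndex:
    """Toy inverted index over course-site pages (hypothetical sketch)."""

    def __init__(self):
        # term -> set of (course, page) pairs containing that term
        self.index = defaultdict(set)

    def add_page(self, course, page, text):
        """Index every whitespace-separated term on a course page."""
        for term in text.lower().split():
            self.index[term].add((course, page))

    def search(self, course, query):
        """Course-centric search: only return pages from one course's site."""
        hits = None
        for term in query.lower().split():
            matches = {p for (c, p) in self.index.get(term, set()) if c == course}
            hits = matches if hits is None else hits & matches
        return sorted(hits or set())


# Hypothetical course codes and pages, for illustration only.
idx = CourseSearchIndex()
idx.add_page("EDED20491", "week-03", "universal design for learning overview")
idx.add_page("EDED20491", "assessment", "portfolio task and universal design rubric")
idx.add_page("COIT11222", "week-01", "java syntax basics")
print(idx.search("EDED20491", "universal design"))
```

With an index like this, the student hunting for “universal design” goes straight to the relevant pages of their course site, regardless of how that site happens to be structured.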

Reducing meaningless freedom and a Mahara feature request

Note: An update to this post included at the end.

I’m currently finalising results for a course with 250+ students spread across multiple campuses and online. The final large assignment – worth 70% of the final mark – requires that students create a portfolio (are people still using the term “eportfolio”?) in Mahara and submit it via the institutional assignment submission system.

Due to the nature of the portfolio content and concerns about the privacy of school students (it’s a pre-service teacher course) the portfolio cannot be opened up to everyone. Not to mention the fact that many of the students retain this fear that someone is going to copy their work. So, students have to create this multi-page, multi-resource portfolio in Mahara and make sure that certain people can access the portfolio.

With 250+ students it was always going to be the case that a decent handful would have problems, even with reasonable instructions. And it is this decent handful that is creating extra workload for the teaching staff. Additional workload that could be avoided if a principle we formulated in early work around assignment submission – reduce meaningless freedom – was applied to Mahara.

The following outlines that principle and outlines a feature request for Mahara that might help.

Reduce meaningless freedom

Online assignment submission was one of the first applications of online learning we explored back in the mid-1990s in our courses with large numbers of distance education students (Jones and Jamieson, 1997). The early systems were not that well designed and increased the workload on the marker (sorry Kieren). However, they did provide a range of improvements over the traditional physical assignment submission process.

From this experience, we developed the principle of “reduce meaningless freedom”. Here’s how it was described in Jones (1999)

An important lesson from the on-going development of online assignment submission is to reduce the amount of “meaningless freedom” available to students. Early systems relied on students submitting assignments via email attachments. The freedom to choose file formats, mail programs and types of attachments significantly increased the amount of work required to mark assignments. Moving to a Web-based system where student freedom is reduced to choosing which file to upload was a significant improvement.

The problem is that when marking large numbers of assignments, you want to get into a routine. You can only get into a routine if certain important aspects are the same (e.g. file formats, file names, the ability to access a Mahara portfolio). The trouble is that if there is any flexibility in a process that a large number of people must complete, they will complete it in different ways (including not completing it properly).

This is not a problem that can be solved by improving the instructions or applying a checklist. With a large enough number of people, there will be people who can’t follow those instructions or ignore the checklist.

Consequently, the information system has to be designed to remove any freedom to vary from the process. It shouldn’t remove all freedom, just those that aren’t important to the outcome of the process but will increase workload in processing. For example, we want the students to be able to express their creativity with their Mahara portfolios. We just don’t want them to have the freedom to submit a URL for a portfolio that we can’t access.

The system should remove, or at least limit, this freedom.
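As a concrete (hypothetical) sketch of designing out meaningless freedom, a web-based submission form can refuse, at upload time, anything the marking routine can’t handle. The extension whitelist and size limit below are invented for illustration, not taken from any actual system.

```python
import os

# Hypothetical course policy: only formats the marking routine can open.
ALLOWED_EXTENSIONS = {".doc", ".docx", ".pdf"}
MAX_SIZE_BYTES = 10 * 1024 * 1024  # 10 MiB


def validate_submission(filename, size_bytes):
    """Return a list of problems; an empty list means the upload is accepted.

    The point is that the system enforces the routine-friendly constraints,
    rather than relying on instructions or checklists that some students
    will inevitably miss.
    """
    ext = os.path.splitext(filename.lower())[1]
    errors = []
    if ext not in ALLOWED_EXTENSIONS:
        errors.append(f"file type '{ext or 'none'}' not accepted")
    if size_bytes > MAX_SIZE_BYTES:
        errors.append("file too large")
    return errors


print(validate_submission("essay.pdf", 500_000))
print(validate_submission("essay.pages", 500_000))
```

The student keeps every freedom that matters (what they write, how they structure it) and loses only the freedom that creates marking busy-work.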

Mahara feature request

The problem here is that for people new to Mahara, it is very difficult to check who can access a complex, multi-page portfolio. I do know that there is a way to give access to such a portfolio: you create a collection, add the pages to that collection, and create a secret URL for the collection. This is the process we described to students. The trouble is that the students have to choose to follow this process, and they are free not to.

You could be hard about this and be very explicit: “If you don’t do this you will fail!”. But that doesn’t create the positive learning environment I’d like to have in my courses, and it fails to recognise that our tools should be helping us achieve our goals.

What would be useful is if Mahara had a “show/check access” feature, where a person creating a Mahara portfolio could submit the URL and Mahara would generate a report of who could access which components of that portfolio. It would recurse through all the Mahara links accessible from that URL and report on who could access those links.

Having this as a feature that people have to choose to use still involves some freedom. To remove that freedom a bit more, this process could run in the background and the outcome could be made visible via the Mahara interface. For example, when editing a page that contains links to other parts of Mahara, the interface could add an appropriate label explaining who can access those links.
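For illustration, the kind of report such a feature might produce can be sketched as a simple graph walk over pages and their access lists. This is an assumption-laden toy, not Mahara’s actual data model or API: pages, roles and links are represented as plain dictionaries.

```python
def access_report(pages, links, start):
    """Walk a portfolio from `start`, reporting who can see each reachable page.

    `pages` maps page -> set of roles allowed to view it; `links` maps
    page -> list of linked pages. Hypothetical structures, for illustration.
    """
    report, stack, seen = {}, [start], set()
    while stack:
        page = stack.pop()
        if page in seen:
            continue
        seen.add(page)
        report[page] = sorted(pages.get(page, set()))
        stack.extend(links.get(page, []))
    return report


# A made-up three-page portfolio where one page was never shared.
pages = {
    "home":     {"public", "marker"},
    "evidence": {"marker"},
    "private":  set(),  # forgot to share -- the marker can't see this page
}
links = {"home": ["evidence", "private"], "evidence": []}
print(access_report(pages, links, "home"))
```

A report like this, surfaced in the editing interface, would flag the “private” page (empty access list) before the student ever submits the URL.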

An update

Thanks to a reply on Twitter, it has been revealed to me that Mahara already has this feature: its share facility does show what is accessible. Kudos to the Mahara developers. Now, I don’t follow the person who replied on Twitter, and I’m pretty sure they don’t follow me. So this is a nice bit of learning thanks to Twitter and hash tags.

This begs the question as to why I wasn’t already aware of this. After all, I’m responsible for this course and somewhat computer literate. A significant part of the answer has to be the limitations of my approach to learning about Mahara. But other contributing factors would include that this feature is neither explicitly obvious from using Mahara nor in the preparation/resources provided by my current institution.

One perspective is that there is too much freedom in the way the institution allows the use of Mahara in courses; it should remove the freedom that let me get this far into a semester without being aware of this feature. But perhaps it can also be addressed by making the feature more explicit/obvious in Mahara?

Then there is the whole robustness versus resilience perspective as argued by Dave Snowden.


Jones, D., & Jamieson, B. (1997). Three Generations of Online Assignment Management. In R. Kevill, R. Oliver, & R. Phillips (Eds.), (pp. 317-323). Perth, Australia.

Jones, D. (1999). Solving some problems with university education: Part II., Proceedings of AUSWEB’99, Balina, Australia.

People and e-learning – limitations and an alternative

So, here is the last of three sections examining the limitations of industrial e-learning and suggesting an alternative. Time to write the conclusion, read the paper over again and cut it down to size.


The characteristics of the product and process of industrial e-learning (e.g. the focus on long periods of stable use and the importance of efficient use of the chosen LMS) are directly reinforced by, and directly impact, the people and roles involved with tertiary e-learning. This section briefly examines just four examples of this impact, including:

  1. The negative impact of organizational hierarchies on communication and knowledge sharing.
    The logical decomposition inherent in teleological design creates numerous, often significant, organizational boundaries between the people involved with e-learning. Such boundaries are seen as inhibiting the ability to integrate knowledge across the organization. The following comments from Rossi and Luck (2011, p. 68) partially illustrate this problem:

    During training sessions … several people made suggestions and raised issues with the structure and use of Moodle. As these suggestions and issues were not recorded and the trainers did not feed them back to the programmers … This resulted in frustration for academic staff when teaching with Moodle for the first time as the problems were not fixed before teaching started.

  2. Chinese whispers.
    Within an appropriate governance structure, the need for changes to an LMS would typically flow up from the users to a central committee made up of senior leaders from the faculties, Information Technology and central learning and teaching, normally with some representation from teaching staff and students. As it traverses this long communication chain, the original need becomes like a message in a game of Chinese Whispers, interpreted through the experiences and biases of those involved. This leads to the impression reported by Rossi and Luck (2011, p. 69)

    The longer the communication chain, the less likely it was that academic users’ concerns would be communicated correctly to the people who could fix the problems.

    The cost of traversing this chain of communication means it is typically not worth the effort of raising small-scale changes.

    Not to mention killing creativity, a point that just came through my Twitter feed thanks to @kyliebudge.

  3. Mixed purposes.
    Logical decomposition also encourages different organizational units to focus on their part of the problem and lose sight of the whole picture. An IT division evaluated on its ability to minimize cost and maximize availability is not likely to want to support technologies in which it has limited expertise. This is one explanation for why the leader of an IT division would direct the division’s representatives on an LMS selection panel to ensure that the panel selected the LMS implemented in Java. Or for a decision to use the latest version of the Oracle DBMS (the DBMS supported by the IT division) to support the new Moodle installation, even though it had not been tested with Moodle and best practice advice was to avoid Oracle. That decision led to weeks at the start of the “go live” term during which Moodle was largely unavailable.
  4. The perils of senior leadership.
    Having the support and engagement of a senior leader at an institution is often seen as a critical success factor for an LMS implementation. But when the successful completion of the project is tied to the leader’s progression within the leadership hierarchy it can create the situation where the project will be deemed a success, regardless of the outcome.

As an alternative, the Webfuse system relied on a multi-skilled, integrated development and support team. This meant that the small team was responsible for training, helpdesk support, and systems development. The helpdesk person handling a user’s problem was typically also a Webfuse developer who was empowered to make small changes without formal governance approval. Behrens (2009, p. 127) quotes a manager in CQU’s IT division describing the types of changes made to Webfuse as “not even on the priority radar” due to traditional IT management techniques. The developers were also located within the faculty, so they interacted with academic staff in the corridors and the staff room. This context created an approach to the support of an e-learning system with all the hallmarks of social constructivism, situated cognition, or a community of practice. It was the type of collaborative and supportive environment identified by Tickle et al (2009), one in which academics learn through attempts to solve genuine educational problems, rather than being shown how to adapt their needs to the constraints of the LMS.


Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60-75. Retrieved from

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from

Introducing the alternative

The last couple of posts have attempted to (in the confines of an #ascilite12 paper) summarise some constraints with the dominant product and process models used in industrial e-learning and suggest an alternative. The following – which probably should have been posted first – describes how and where this alternative comes from.

As all this is meant to go into an academic paper, the following starts with a discussion about “research methods” before moving onto describing some of the reasons why this alternative approach might have some merit.

As with the prior posts, this is all still first draft stuff.

Research methods and limitations

From the initial stages of its design, the Webfuse system was intended to be a vehicle for both practice (it hosted over 3000 course sites from 1997-2009) and research. Underpinning the evolution of Webfuse was an on-going cycle of action research that sought to continually improve the system through insights from theory and observation of use. This commenced in 1996 and continued, at varying levels of intensity, through to 2009 when the system ceased directly supporting e-learning. This work has contributed in varying ways to over 25 peer-reviewed publications. Webfuse has also been studied by other researchers investigating institutional adoption of e-learning systems (Danaher, Luck, & McConachie, 2005) and shadow systems in the context of ERP implementation (Behrens, 2009; Behrens & Sedera, 2004).

Starting in 2001, the design of Webfuse became the focus of a PhD thesis (Jones, 2011) that made two contributions towards understanding e-learning implementation within universities: the Ps Framework and an Information Systems Design Theory (ISDT). The Ps Framework arose out of an analysis of existing e-learning implementation practices and serves as a tool for comparing alternate approaches (Jones, Vallack, & Fitzgerald-Hood, 2008). The formulated ISDT – An ISDT for emergent university e-learning systems – offers guidance for e-learning implementation that brings a number of proposed advantages over industrial e-learning. These contributions to knowledge arose from an action research process that combined broad theoretical knowledge – the principles of the ISDT are supported by insights from a range of kernel theories – with empirical evidence arising from the design and support of a successful e-learning system. Rather than present the complete ISDT – due primarily to space constraints – this paper focuses on how three important components of e-learning can be re-conceptualised through the principles of the ISDT.

The ISDT – and the sub-set of principles presented in this paper – seeks to provide theoretical guidance about how to develop and support information systems for university e-learning that are capable of responding to the dominant characteristics (diversity, uncertainty and rapid change) of university e-learning. This is achieved through a combination of product (principles of form and function) and process (principles of implementation) that focuses on developing a deep and evolving understanding of the context and use of e-learning. Being able to use that understanding to make rapid changes to the system ultimately encourages and enables adoption and on-going adaptation. The ISDT suggests that any instantiation built following it will support e-learning in a way that: is specific to the institutional context; results in greater quality, quantity and variety of adoption; and improves the differentiation and competitive advantage of the host institution.

As with all research, the work described within this paper has a number of limitations that should be kept in mind when considering its findings. Through its use of action research, this work suffers, to varying degrees, from the same limitations as all action research. Baskerville and Wood-Harper (1996) identify these limitations as: (1) lack of impartiality of the researcher; (2) lack of discipline; (3) being mistaken for consulting; and (4) context-dependency leading to difficulty in generalizing findings. These limitations have been addressed within this study through a variety of means, including: a history of peer-reviewed publications throughout the process; use of objective data sources; the generation of theory; and an on-going process of testing. Consequently, the resulting ISDT and the principles described here have not been “proven”. That was not the aim of this work. Instead, the intent was to gather sufficient empirical and theoretical support to build and propose a coherent and useful alternative to industrial e-learning. The question of proof and further testing of the ISDT in similar and different contexts provides – as in all research aiming to generate theory – an avenue for future research.

On the value of Webfuse

This section aims to show that there is some value in considering Webfuse. It seeks to summarise the empirical support for the ISDT and the principles described here by presenting evidence that the development of Webfuse led to a range of features specific to the institution and to greater levels of adoption. It is important to note that from 1997 through 2005 Webfuse was funded and controlled by one of five faculties at CQUniversity. Webfuse did not become a system controlled by a central IT division until 2005/2006 as a result of organizational restructures. During the life-span of Webfuse CQU adopted three different official, institutional LMS: WebCT (1999), Blackboard (2004), and Moodle (2010).

Specific to the context

During the period from 1999 through 2002 the “Webfuse faculty” saw a significant increase in the complexity of its teaching model, including the addition of numerous international campuses situated within capital cities and a doubling in student numbers, primarily through full-fee paying overseas students. By 2002, the “Webfuse faculty” was teaching 30% of all students at the University. Due to the significant increase in the complexity of teaching in this context, a range of teaching management and support services were integrated into Webfuse, including: staff and student “portals”, an online assignment submission and management system, a results upload application, an informal review of grade system, a timetable generator, a student photo gallery, an academic misconduct database, an email merge facility, and an assignment extension system.

The value of these systems to the faculty is illustrated by this quote from the Faculty annual report for 2003 cited by Danaher, Luck & McConachie (2005, p. 39)

[t]he best thing about teaching and learning in this faculty in 2003 would be the development of technologically progressive academic information systems that provide better service to our students and staff and make our teaching more effective. Webfuse and MyInfocom development has greatly assisted staff to cope with the complexities of delivering courses across a large multi-site operation.

By 2003 the faculties not using Webfuse were actively negotiating to enable their staff to have access to these services. In 2009 alone, over 12,000 students and 1100 staff made use of them. Even though no longer officially supported, a few of these services continue to be used by the university in the middle of 2012.

Quotes from staff using the Webfuse systems reported in various publications (Behrens, 2009; Behrens, Jamieson, Jones, & Cranston, 2005; Jones, Cranston, Behrens, & Jamieson, 2005) also provide some insights into how well Webfuse supported the specific context at CQUni.

my positive experience with other Infocom systems gives me confidence that OASIS would be no different. The systems team have a very good track record that inspires confidence

The key to easy use of OASIS is that it is not a off the shelf product that is sooooo generic that it has lost its way as a course delivery tool.

I remember talking to [a Webfuse developer] and saying how I was having these problems with uploading our final results into [the Enterprise Resource Planning (ERP) system] for the faculty. He basically said, “No problem, we can get our system to handle that”…and ‘Hey presto!’ there was this new piece of functionality added to the system … You felt really involved … You didn’t feel as though you had to jump through hoops to get something done.

Beyond context specific systems supporting the management of learning and teaching, Webfuse also included a number of context specific learning and teaching innovations. A short list of examples includes:

  • the course barometer;
    Based on an innovation (Svensson, Andersson, Gadd, & Johnsson, 1999) seen at a conference, the barometer was designed to provide students with a simple, anonymous method for providing informal, formative feedback about a course (Jones, 2002). Initially intended only for the author’s courses, the barometer became a required part of all Webfuse course sites from 2001 through 2005. In 2007/2008 the barometers were used as part of a whole-of-institution attempt to encourage formative feedback in both Webfuse and Blackboard.
  • Blog Aggregation Management (BAM); and
    BAM allowed students to create individual, externally hosted web-logs (blogs) and use them as reflective journals. Students registered their external blog with BAM, which then mirrored all of the students’ blog posts on an institutional server and provided a management and marking interface for teaching staff. Created by the author for use in his own teaching in 2006, BAM was subsequently used in 26 course offerings by 2050+ students and ported to Moodle as BIM (Jones & Luck, 2009). In reviewing BAM, the ELI guide to blogging (Coghlan et al., 2007) noted:
    One of the most compelling aspects of the project was the simple way it married Web 2.0 applications with institutional systems. This approach has the potential to give institutional teaching and learning systems greater efficacy and agility by making use of the many free or inexpensive—but useful—tools like blogs proliferating on the Internet and to liberate institutional computing staff and resources for other efforts.
  • A Web 2.0 course site.
    While it looked like a normal course website, none of the functionality – including discussion, wiki, blog, portfolio and resource sharing – was implemented by Webfuse. Instead, freely available and externally hosted Web 2.0 tools and services provided all of the functionality. For example, each student had a portfolio and a weblog provided by an external site. The content of the default course site was populated by using BAM to aggregate RSS feeds (generated by the external tools), which were then parsed and displayed by Javascript functions within the course site pages. Typically students and staff did not visit the default course site, as they could access all content by using a course OPML file and an appropriate reader application.
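
The feed aggregation at the heart of BAM and the Web 2.0 course site can be sketched as follows. This is an illustrative reconstruction, not Webfuse code (the real system was written in Perl): the sample feed and the `mirror_feed` function are hypothetical, showing only the basic step of parsing a registered student feed so its posts can be mirrored on an institutional server.

```python
import xml.etree.ElementTree as ET

# A hypothetical RSS 2.0 feed as returned by a student's external blog.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Student blog</title>
    <item><title>Week 1 reflection</title><link>http://example.com/1</link></item>
    <item><title>Week 2 reflection</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def mirror_feed(rss_xml):
    """Parse an RSS 2.0 feed and return the posts to be mirrored
    (BAM-style aggregation) for marking and display."""
    root = ET.fromstring(rss_xml)
    return [
        {"title": item.findtext("title"), "link": item.findtext("link")}
        for item in root.iter("item")
    ]

# Each registered student feed would be polled like this and the
# resulting posts stored against the student's record.
posts = mirror_feed(SAMPLE_RSS)
```

In the Web 2.0 course site the same aggregated content was then rendered client-side, rather than stored for marking.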

Even within the constraints placed on the development of Webfuse, it was able to provide an array of e-learning applications that were either not present in industrial LMSes, added much later than the Webfuse services, or provided with significantly reduced functionality.

Greater levels of adoption

Encouraging staff adoption of the Webfuse system was one of the main issues raised in the original Webfuse paper (Jones & Buchanan, 1996). Difficulties in encouraging high levels of quality use of e-learning within universities have remained a theme throughout the literature. Initial use of Webfuse in 1997 and 1998 was not all that successful in achieving that goal, with only five – including the designer of Webfuse, who made 50% of all edits using the system – of 60 academic staff making any significant use of Webfuse by early 1999 (Jones & Lynch, 1999). These limitations were addressed from 1999 onwards by a range of changes to the system, how it was supported and the organizational context. The following illustrates the success of these changes by comparing Webfuse adoption with that of the official LMS (WebCT 1999-2003/4; Blackboard 2004-2009) used primarily by the non-Webfuse faculties. It first examines the number of course sites and then examines feature adoption.

From 1997, Webfuse automatically created a default course site for all Faculty courses by drawing on a range of existing course-related information. For the official institutional LMS, course sites were typically created on request and had to be populated by the academics. By the end of 2003 – 4 years after the initial introduction of WebCT as the official institutional LMS – only 15% (141) of courses from the non-Webfuse faculties had WebCT course sites. At the same time, 100% (302) of the courses from the Webfuse faculty had course sites. Due to the need for academics to populate WebCT and Blackboard course sites, the presence of a course website doesn’t necessarily imply use. For example, Tickle et al (2009) report that 21% of the 417 Blackboard courses being migrated to Moodle in 2010 contained no documents.
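
The contrast between automatic and on-request site creation can be sketched as below. This is a hypothetical illustration, not Webfuse code (which was Perl): the course records, field names and `create_default_sites` function are invented, and serve only to show how drawing on existing institutional data yields a site for every course with no effort from academics.

```python
def create_default_sites(course_records):
    """Auto-generate a minimal default site for every course,
    populated from existing institutional data."""
    return {
        c["code"]: {
            "title": c["title"],
            "coordinator": c["coordinator"],
            "synopsis": c.get("synopsis", ""),
        }
        for c in course_records
    }

# Hypothetical course records drawn from existing data repositories.
courses = [
    {"code": "CRS101", "title": "Programming A", "coordinator": "D. Jones"},
    {"code": "CRS102", "title": "Programming B", "coordinator": "A. Smith"},
]
sites = create_default_sites(courses)
# Every course gets a site: len(sites) == len(courses), i.e. 100% coverage.
```

Whether staff then modify or add to the default site is a separate question, which is why the adoption figures below count only actual feature use.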

Research examining the adoption of specific categories of LMS features provides a more useful insight into LMS usage. Figures 1 through 4 use the research model proposed by Malikowski, Thompson, & Theis (2007) to compare the adoption of LMS features between Webfuse (the thick continuous lines in each figure), CQUni’s version of Blackboard (the dashed lines), and the range of adoption rates found in the literature by Malikowski et al (2007) (the two dotted lines in each figure). This is done for four of the five LMS feature categories identified by Malikowski et al (2007): content transmission (Figure 1), class interaction (Figure 2), student assessment (Figure 3), and course evaluation (Figure 4).

Figure 1: Adoption of content transmission features: Webfuse, Blackboard and Malikowski
Figure 2: Adoption of class interaction features: Webfuse, Blackboard and Malikowski (archives of most pre-2002 course mailing lists are missing)
Figure 3: Adoption of student assessment features: Webfuse, Blackboard and Malikowski
Figure 4: Adoption of course evaluation features: Webfuse, Blackboard and Malikowski

The Webfuse usage data included in Figures 1 through 4 only include actual feature use by academics or students. For example, while from 2001 through 2005 100% of Webfuse courses contained a course evaluation feature called the course barometer, only courses where the barometer was actually used by students are included in Figure 4. Similarly, all Webfuse default course sites contained content (either automatically added from existing data repositories or copied across from a previous term), so Figure 1 only includes data for those Webfuse course sites where teaching staff modified or added content.

Figures 2 and 3 indicate Webfuse adoption rates of greater than 100%. This is possible because a number of Webfuse features – including the EmailMerge and online assignment submission and management applications – were being used in course sites hosted on Blackboard. Webfuse was seen as providing services that Blackboard did not provide, or that were significantly better than what Blackboard did provide. Similarly, the spike in Webfuse course evaluation feature adoption in 2008 to 51.6% is due to a CQU wide push to improve formative feedback across all courses that relied on the Webfuse course barometer feature.
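
The above-100% rates can be illustrated with a toy calculation. The numbers and the `adoption_rate` function below are hypothetical, intended only to show how feature use by Blackboard-hosted courses can push the apparent Webfuse adoption rate past 100% of Webfuse-hosted courses.

```python
def adoption_rate(courses_using_feature, total_courses):
    """Percentage of courses in which a feature was actually used,
    relative to the courses hosted on the system."""
    return 100.0 * courses_using_feature / total_courses

# Hypothetical term: 300 Webfuse-hosted courses, of which 280 use the
# online assignment submission feature, plus 80 Blackboard-hosted
# courses that also use that Webfuse feature.
webfuse_courses = 300
feature_users = 280 + 80  # Webfuse courses + Blackboard courses using it
rate = adoption_rate(feature_users, webfuse_courses)
# rate = 120.0, i.e. adoption above 100% of Webfuse-hosted courses
```

The denominator counts only Webfuse-hosted courses, so any cross-system use inflates the percentage beyond 100.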

Excluding use by non-Webfuse courses and focusing on the period 2003-2006, Figures 2 and 3 show that adoption of Webfuse class interaction and student assessment features was significantly higher than that of the equivalent Blackboard features at CQU. It is also significantly higher than the adoption rates found by Malikowski et al (2007) in the broader literature, and appears somewhat higher than that found amongst 2008, Semester 1 courses at the University of Western Sydney and Griffith University by Rankine et al (2009), though it should be noted that Rankine et al (2009) used different sampling and feature categorization strategies that make this comparison tentative.


Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney. Retrieved from

Behrens, S., & Sedera, W. (2004). Why do shadow systems exist after an ERP implementation? Lessons from a case study. In C.-P. Wei (Ed.), (pp. 1713-1726). Shanghai, China.

Coghlan, E., Crawford, J., Little, J., Lomas, C., Lombardi, M., Oblinger, D., & Windham, C. (2007). ELI Discovery Tool: Guide to Blogging. EDUCAUSE. Retrieved from

Danaher, P. A., Luck, J., & McConachie, J. (2005). The stories that documents tell: Changing technology options from Blackboard, Webfuse and the Content Management System at Central Queensland University. Studies in Learning, Evaluation, Innovation and Development, 2(1), 34-43.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In S. R. Philip Barker (Ed.), (pp. 884-889). Denver, Colorado: AACE.

Jones, D. (2011). An Information Systems Design Theory for E-learning. Philosophy. Australian National University. Retrieved from

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In P. J. Allan Christie Beverley Vaughan (Ed.), (pp. 331-345). Adelaide.

Jones, D., Cranston, M., Behrens, S., & Jamieson, K. (2005). What makes ICT implementation successful: A case study of online assignment submission. Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemens & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398-406). Chesapeake, VA: AACE. Retrieved from

Jones, D., & Lynch, T. (1999). A Model for the Design of Web-based Systems that supports Adoption, Appropriation and Evolution. In Y. D. San Murugesan (Ed.), (pp. 47-56). Los Angeles.

Jones, D., Vallack, J., & Fitzgerald-Hood, N. (2008). The Ps Framework: Mapping the landscape for the PLEs@CQUni project. Hello! Where are you in the landscape of educational technology? ASCILITE’2008. Melbourne.

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.

Rankine, L., Stevenson, L., Malfroy, J., & Ashford-Rowe, K. (2009). Benchmarking across universities: A framework for LMS analysis. Ascilite 2009. Same places, different spaces (pp. 815-819). Auckland. Retrieved from

Svensson, L., Andersson, R., Gadd, M., & Johnsson, A. (1999). Course-Barometer: Compensating for the loss of informal feedback in distance education (pp. 1612-1613). Seattle, Washington: AACE.

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from

The e-learning process – limitations and an alternative

And here’s the followup to the well received “LMS Product” post. This is the second section looking at the limitations of how industrial e-learning is implemented, this time focusing on the process used. Not really happy with this one, space limitations are making it difficult to do a good job of description.


It has become a maxim of modern society that without objectives, without purpose, there can be no success; setting goals and achieving them has become the essence of “success” (Introna, 1996). Many, if not most, universities follow, or at least profess to follow, a purpose driven approach to setting strategic directions (Jones, Luck, McConachie, & Danaher, 2005). This is how institutional leaders demonstrate their strategic insight, their rationality and their leadership. This is not a great surprise, since such purpose driven processes – labeled teleological processes by Introna (1996) – have dominated theory and practice to such an extent that they have become ingrained, even though the debate between the “planning school” and the “learning school” of process thought has been one of the most pervasive debates in management (Clegg, 2002).

Prior papers (Jones et al., 2005; Jones & Muldoon, 2007) have used the nine attributes of a design process formulated by Introna (1996) to argue that purpose driven processes are particularly inappropriate to the practice of tertiary e-learning. The same papers have presented and illustrated the alternative: ateleological processes. The limitations of teleological processes can be illustrated by examining Introna’s (1996) three necessary requirements for teleological design processes:

  1. The system’s behaviour must be relatively stable and predictable.
    As mentioned in the previous section, stability and predictability do not sound like appropriate adjectives for e-learning, especially into the future, given the popular rhetoric that organizations in the present era are no longer stable and are instead continuously adapting to shifting environments, placing them in a state of constantly seeking stability while never achieving it (Truex, Baskerville, & Klein, 1999).
  2. The designers must be able to manipulate the system’s behaviour directly.
    Social systems cannot be “designed” in the same way as technical systems, at best they can be indirectly influenced (Introna, 1996). Technology development and diffusion needs cooperation, however, it takes place in a competitive and conflictual atmosphere where different social groups – each with their own interpretation of the technology and the problem to be solved – are inevitably involved and seek to shape outcomes (Allen, 2000). Academics are trained not to accept propositions uncritically and subsequently cannot be expected to adopt strategies without question or adaptation (Gibbs, Habeshaw, & Yorke, 2000).
  3. The designers must be able to determine accurately the goals or criteria for success.
    The uncertain and confused arena of social behaviour and autonomous human action makes predetermination impossible (Truex, Baskerville et al. 2000). Allen (2000) argues that change in organizational and social settings involving technology is by nature undetermined.

For example, Tickle et al (2009) offer one description of the teleological process used to transition CQUni to the Moodle LMS in 2009. One of the institutional policies introduced as part of this process was the adoption of Minimum Service Standards for course delivery (Tickle et al., 2009, p. 1047). These were intended to act as a starting point for “integrating learning and teaching strategies that could influence students study habits” and to “encourage academic staff to look beyond existing practices and consider the useful features of the new LMS” (Tickle et al., 2009, p. 1042). In order to assure the quality of this process, a web-based checklist was implemented in another institutional system with the expectation that the course coordinator and moderator would actively check that the course site met the minimum standards. A senior lecturer widely recognized as a quality teacher described the process for dealing with the minimum standards checklist as follows:

I go in and tick all the boxes, the moderator goes in and ticks all the boxes and the school secretary does the same thing. It’s just like the exam check list.

The minimum standards checklist was removed in 2011.

A teleological process is not interested in learning and changing, only in achieving the established purpose. The philosophical assumptions of teleological processes – modernism and rationality – are in direct contradiction to views of learning meant to underpin the best learning and teaching. Rossi and Luck (2011, p. 62) talk about how “[c]onstructivist views of learning pervade contemporary educational literature, represent the dominant learning theory and are frequently associated with online learning”. Wise and Quealy (2006, p. 899) argue, however, that

while a social constructivist framework may be ideal for understanding the way people learn, it is at odds not only with the implicit instructional design agenda, but also with current university elearning governance and infrastructure.

Staff development sessions become focused on helping the institution achieve the efficient and effective use of the LMS, rather than on quality learning and teaching. This leads to staff developers being “seen as the university’s ‘agent’” (Pettit, 2005, p. 253). There is a reason why Clegg (2002) refers to teleological approaches as the “planning school” of process thought and the alternative, ateleological approach as the “learning school”.

The ISDT abstracted from the Webfuse work includes 11 principles of implementation (i.e. process) divided into three groups. The first and third groupings refer more to people and will be covered in the next section. The second grouping focuses explicitly on the process and was titled “An adopter-focused, emergent development process”. Webfuse achieved this by using an information systems development process based on principles of emergent development (Truex et al., 1999) and ateleological design (Introna, 1996). The Webfuse development team was employed and located within the faculty, which allowed a much more in-depth knowledge of individual and organizational needs and an explicit focus on responding to those needs. The quote earlier in this paper about the origins of the results uploading system is indicative of this. Lastly, at its best, Webfuse was able to seek a balance between teleological and ateleological processes due to a Faculty Dean who recognized the significant limitations of a top-down approach.

This process, when combined with a flexible and responsive product, better enabled the Webfuse team to work with the academics and students using the system to actively modify and construct the system in response to what was learned while using it. It was an approach much more in line with a social constructivist philosophy.


Allen, J. (2000). Information systems as technological innovation. Information Technology & People, 13(3), 210-221.

Clegg, S. (2002). Management and organization paradoxes. Philadelphia, PA: John Benjamins Publishing.

Gibbs, G., Habeshaw, T., & Yorke, M. (2000). Institutional learning and teaching strategies in English higher education. Higher Education, 40(3), 351-372.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. Adelaide.

Jones, D., & Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In R. J. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers (Eds.), (pp. 450-459). Singapore. Retrieved from

Pettit, J. (2005). Conferencing and Workshops: a blend for staff development. Education, Communication & Information, 5(3), 251-263. doi:10.1080/14636310500350505

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60-75. Retrieved from

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.

Wise, L., & Quealy, J. (2006). LMS Governance Project Report. Melbourne, Australia: Melbourne-Monash Collaboration in Education Technologies. Retrieved from

The LMS Product – limitations and an alternative

What follows is the first draft of the “Product” section for an ASCILITE paper (the overview for the paper) I hope to finish by tomorrow… just a bit of wishful thinking. Much of it has appeared on this blog previously; I’m now trying to wrangle it into a formal publication with all the limitations (e.g. space) that brings with it.

It’s a first draft, so comments and suggestions more than welcome.


One of the defining characteristics of the industrial e-learning paradigm is the reliance on the Learning Management System (LMS) as the product for organizational e-learning. Despite the associated complexities and risks, almost every university seems compelled to have an LMS (Coates, James, & Baldwin, 2005). The LMS is an example of an integrated or monolithic information system, a type of system that brings with it a set of advantages and disadvantages. On the plus side, an integrated system offers cost efficiencies and other benefits through standardization but, at the same time, such systems constrain flexibility, competitiveness and autonomy, and increase rigidity (Light, Holland, & Wills, 2001; Lowe & Locke, 2008). Such systems are best suited to circumstances where there is commonality between organizations and stable requirements with low uncertainty. This does not seem to be a good description of tertiary e-learning, either over the last 10 years or the next 10. This section looks at two of the repercussions of this mismatch – 1) organizations and people must adapt to the system; and 2) the single vendor limitation – before describing the alternate principles from the ISDT.

The first repercussion of an integrated system is captured by this comment (Sturgess & Nouwens, 2004, n.p.)

we should seek to change people’s behaviour because information technology systems are difficult to change.

This is a comment from a technical staff member participating in CQUni’s 2003 LMS selection process. This comment, rather than being isolated, captures the accepted industry best practice recommendation to implement integrated systems in their “vanilla” form because local changes are too expensive (Robey, Ross, & Boudreau, 2002). Maintaining a vanilla implementation constrains what is possible with the system, limiting change, innovation and differentiation and perhaps being a contributing factor in the poor pedagogical outcomes observed in industrial e-learning.

For example, in 2007 an instructional designer working on a redesign of a CQUni course in Nutrition informed by constructive alignment was stymied by the limitations of the Blackboard LMS. Blackboard could not support the number of group-based discussion forums required by the new course design. Normally, with an integrated system, the pedagogical approach would have to be changed to fit the confines of the system. Instead, the implementation of the course site was supplemented with one of the Webfuse discussion forums, allowing the fulfillment of the original educational design. Academic staff teaching large first-year courses using the Webfuse BAM functionality faced a similar situation when CQUni adopted Moodle. Since Moodle did not provide similar functionality, these staff would be forced to change their pedagogical approach to fit the capabilities of the integrated system.

The regular forced migration to another version of an LMS is the extreme example of the organization being forced to change in response to the technology, rather than the technology fitting the organization’s needs. It is not uncommon to hear of universities being forced to adopt a new LMS because the vendor has ceased supporting their current system. The cost, complexity and disruption caused by an LMS migration contribute to this “stable systems drag” (Truex, Baskerville, & Klein, 1999) as the institution seeks a long period of “vanilla” use to recoup the cost.

Another characteristic of an integrated system is that the quality of the tools available is limited to those provided by a single vendor or community. For example, a key component of the recent disquiet about the Curt Bonk MOOC hosted within a Blackboard LMS was the poor quality of the Blackboard discussion forum (see Lane, 2012). Reservations about the quality and functionality of the Wiki and Blog tools within Moodle are also fairly common. LMS-based tools also tend not to fare well in comparison with specialist tools, for example when LMS-based blog tools are compared with WordPress. In addition, integrated systems tend to support only one version of any given tool, leading to situations where users pine for a previous version that suited their needs better.

The ISDT formulated from the experience of developing Webfuse proposes 13 principles for the form and function of the product for emergent e-learning. These principles were divided into 3 groups:

  1. Integrated and independent services.
    Rather than a system or platform, Webfuse was positioned as glue. It was used to “fuse” together widely different services and tools into an integrated whole. Webfuse was an example of a best-of-breed system, a type of system that provides more flexibility and responsiveness to contextual needs (Light, Holland, & Wills, 2001). For example, when the existing discussion forum tool was seen as limited, a new discussion forum tool was selected and integrated into Webfuse. At the same time the old discussion forum tool was retained and could be used by those for whom it was an appropriate fit. While new tools could be added as required, the interface used by staff and students remained essentially the same. There was no need for expensive system migrations.
  2. Adaptive and inclusive architecture.
    Almost all LMS support some form of plugin architecture through which external developers can create new tools and services for the LMS. This architecture, however, is generally limited to tools written specifically for the LMS, thereby limiting what tools can be integrated. The Webfuse “architecture” was designed to support the idea of software wrappers (Sneed, 2000), enabling the inclusion of a much broader array of applications.
  3. Scaffolding, context-sensitive conglomerations.
    Most e-learning tools provide a collection of configuration options that can be used in a variety of ways. Effective use of these tools requires a combination of skills from a broad array of disciplines and significant contextual knowledge that the majority of academic staff do not possess. The most obvious example is in the overall design of a course website. Webfuse had a default course site conglomeration that combined a range of institutional data sources and Webfuse tools to automatically create a course site. A key aspect of the Webfuse wrappers placed around integrated tools was the addition of institutional specific information and services. There are significant, unexplored opportunities in adding scaffolding to e-learning tools that enable distributed cognition.
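The software-wrapper idea in the second principle can be sketched in miniature. The following Python sketch (all class and method names are hypothetical; Webfuse itself was Perl) shows how an adapter can give an external tool the interface an integrating system expects while adding institution-specific information along the way:

```python
# Sketch of a software wrapper: an external tool with its own API is
# wrapped so it presents the interface the integrating system expects,
# with institutional context added by the wrapper. All names are
# hypothetical illustrations, not actual Webfuse code.

class ExternalForum:
    """Stand-in for a third-party discussion tool with its own API."""
    def create_topic(self, title):
        return {"title": title}

class ForumWrapper:
    """Adapter exposing the integrating system's expected interface."""
    def __init__(self, tool, course_code):
        self.tool = tool
        self.course_code = course_code  # institutional context the wrapper adds

    def add_discussion(self, title):
        # Translate the system's call into the external tool's call,
        # decorating the result with institution-specific information.
        topic = self.tool.create_topic(title)
        topic["course"] = self.course_code
        return topic

wrapped = ForumWrapper(ExternalForum(), "NUTR19201")
print(wrapped.add_discussion("Week 1 questions"))
# {'title': 'Week 1 questions', 'course': 'NUTR19201'}
```

Because the wrapper, not the tool, defines the interface, a different forum tool could be swapped in behind the same `add_discussion` call without the rest of the system noticing.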

Writing about the need for universities to embrace diversity Thomas (2012) talks of Procrustes who

would stretch and sever the limbs of his guests to fit the size of his bed. We, too, are continuing to stretch and shape our higher education to a particular standard to the detriment of students and society alike.

In terms of e-learning, that “particular standard” is defined by the products we are using to implement industrial e-learning.


Coates, H., James, R., & Baldwin, G. (2005). A Critical Examination of the Effects of Learning Management Systems on University Teaching and Learning. Tertiary Education and Management, 11(1), 19-36. Retrieved from

Lane, L. M. (2012). Leaving an open online course. Retrieved from

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216-224.

Lowe, A., & Locke, J. (2008). Enterprise resource planning and the post bureaucratic organization. Information Technology & People, 21(4), 375-400.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Sneed, H. (2000). Encapsulation of legacy software: A technique for reusing legacy software components. Annals of Software Engineering, 9(1-4), 293-313.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3). Retrieved from

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012, from

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.

The life and death of Webfuse: lessons for learning and leading into the future

The following is an attempt to formulate and structure some ideas for a paper for ascilite’12 in Wellington. The aim is to convert my PhD thesis – especially “The information systems design theory for emergent university e-learning” – into something useful and interesting for the ascilite crowd. The following is an attempt to organise the mish-mash of content I currently have into something sensible.


Drawing on the 14-year life and death of an integrated online learning environment used by tens of thousands of people, this paper argues that many of the principles and practices underpinning industrial e-learning – the current dominant institutional model – are inappropriate. The paper illustrates how industrial e-learning has been a significant drag on the current practice of tertiary e-learning in terms of quantity and quality and argues that it will actively prevent universities from being able to respond to uncertainty and effectively create and explore the future of learning. The paper proposes one alternative set of successfully implemented principles and practices and argues that this alternative will better enable institutions to lead in a climate of change, rather than following along behind.

Paper structure

The introduction will briefly

  • re-connect this paper with the 1996 ascilite paper that outlined the initial design of Webfuse;
  • set that in the broader history of LMS (i.e. every institution had its own LMS) which then during the noughties got replaced by one of the big 2 or 3 enterprise LMS;
  • illustrate the problems and limited outcomes of industrial e-learning;
  • link this trend with the broader pushes toward strategic/top-down/rational management practices within universities;
  • explain how Webfuse lived through this phase and moved further and further away from the industrial e-learning trend;
  • outline the structure of the paper
    • Research method
    • On the value of Webfuse
    • Product
    • People
    • Process
    • Conclusions and future work

The Research method section (in “proving” the academic credentials of this work) will talk about

  • the cycle of action research over 14 years;
  • the formulation of the Ps Framework and the ISDTs;
  • link this back to DBR.

The On the value of Webfuse will seek to argue that the system based on the principles outlined in the paper was a “success” on a number of fronts. In doing so

  • Talk about the complex nature of what “success” means in terms of Information Systems implementation.
  • draw on quotes from the literature showing the value of the system as perceived by others.
  • summarise the greater levels of adoption and use of this system as compared to other systems both within and outside the institution.
  • Talk about features of the system which were not present in other systems for years if not ever.

The next three sections – Product, People, Process – will follow the same basic structure but will focus on a different essential component of e-learning. The structure will go something like

  • Explain the nature of the component as implemented in industrial e-learning.
  • Illustrate the problems that arise because of those principles.
  • Present the alternative set of principles and practices.

The conclusions and future work will probably cover some of the following (this is perhaps the section most likely to evolve)

  • The principles here are not a perfect solution – as a wicked design problem there is no such thing – there are problems and limitations with this approach. Not the least of which is the familiarity gap. It rejects many of the taken for granted assumptions of existing practice. Perhaps list these. Perhaps the biggest message of this approach is that institutions need to have practices that engage with these challenges rather than seek to abstract them out of existence.
  • That said, glimmers of these very different principles are increasingly visible in a range of movements within the host disciplines, e.g. agile management practices, agile development etc.
  • Perhaps talk about the limitations of the research – impartiality, discipline/rigor, context-dependency.
  • A call for more design work and design theory in this area to test/refine the principles here or develop entirely different alternatives.
  • That testing of this design theory is going to be extremely hard given the established nature of industrial e-learning within higher education organisations. In particular, the spread of senior management staff who have a sense of ownership. This tends to rule out the possibility of doing much to address the People and Process aspects, at least at an institutional level. It tends to leave the Product aspect. Where tinkering with open source LMSes may be a productive area for future work. Though at the same time providing its own sense of inertia. Some examples may be in exploring how distributed cognition can improve these systems, but also exploring technical workarounds to improve the adaptability of these systems.

Expanding out Product, People, Process

The following are an initial attempt to expand out the three main sections. Completing this has highlighted the need to think about how to present/structure the problems.

For product, this will include :

  • Current nature – is the LMS (supporting blog posts one and two).
    An enterprise system that has little or no capacity for change or customisation at the institutional or individual level (changing look and feel doesn’t count). Even open source LMS suffer problems here.
  • The problems include
    • Having to change the behaviour of people, because technology is hard to change (must include the Sturgess and Nouwens quote mentioned here)
    • More broadly the need for institutions to engage in large scale change projects because of new versions of the software.
    • Separation of data and services into separate systems (e.g. student records etc.)
    • Software that is generic and not specific to the institutional needs, the lowest-common denominator.
    • e.g. assignment management functionality in most LMS in 2011/2012 that is behind what Webfuse had 10 years ago
    • A focus on one tool. e.g. one discussion forum.
  • The alternative

For process – will draw on a few prior publications (thesis-based process posts one and two, the OODLA paper, a prior ascilite paper, and posts on procurement models and the role of people in LMS selection)

  • Current nature – i.e. teleological, plan-driven
    Need to not limit this solely to strategic management or IT process selection. Need to engage with the learning design folk who adopt this model for the design of teaching and see if there’s an argument to make this better.
  • Limitations – drawn from the publications above.
    Also perhaps mention how it clashes with how people learn.
  • The alternative – ateleological.
    Draw on insights from the thesis, but also other work e.g. Laurillard and others’ calls for teachers to be action researchers and the need for the organisation to engage with this. Perhaps even bring in Bigg’s quality model.

    In particular, see if arguments/suggestions can be developed to enhance “course design” in ways that are more ateleological.

    Mention Cavallo and the idea that any sort of change is learning and needs to connect to how people learn.

For people the focus will likely be on the (techno-)rational model as it is applied to how people think/respond and also how they are organised. (A people blog post from the thesis)

  • Current situation – people are assumed to be rational, and logical decomposition is applied to split people up into sub-groups. There is also an emphasis on cheapness in support roles. There’s also the problem raised towards the end of this post where the innovative central staff are trying to get people to use what’s been provided. Perhaps linked to the chasm.
  • Problems.
    Politics caused by organisational structures. The frontline support tasks being taken on by roles that are amongst the lowest paid in the organisation and focused tightly on the products being used rather than broad skills. The chasm and how most approaches are targeting (intentionally or not) the early adopters. Chinese whispers. Starvation of requirements. Gaming of the system to fit the teleological constraints
  • Alternative.
    Cross-disciplinary, high-skilled, distributed teams close to the users.

And the thesis is complete, what's next?

Just after 9:30pm last Wednesday I read an email from the Dean and Director of the ANU College of Business and Economics congratulating me on the fact that my thesis had been accepted without revision by the examiners and the institution. Needless to say, it was good news.

Celebrations, however, were somewhat muted and restricted to a celebratory tweet and a couple of drinks the following night. Celebrations, like blogging, had become a victim of circumstance. Circumstances that included being a student teacher placed at a local high school 4 days a week and having to prepare and deliver an increasing number of lessons, while at the same time completing university assignments, spending time with my family, and most recently recovering from the flu.

Most of these are on-going tasks, but I thought I’d take a bit of time to reflect. After all it is Friday and I currently feel like I’m getting a little ahead on tasks, which I fear is more a false dawn.

The value of a thesis

Over recent years, especially the last couple, I’ve seen quite a bit written about the value of a PhD thesis. Leigh Blackall has embarked on a PhD his way after identifying several limitations with more traditional practice. Sarah Thorneycroft has been doing work around traditional academic publishing and then there’s the more recent news (mostly out of the US) about there being an over-supply of PhD graduates. And that’s just the few that I’ve gleaned while I’ve been doing an ostrich impression and focusing on getting the thesis done and thinking about teaching. I really should make the time at some stage to follow what these and other folk are doing.

So, while it’s finished, is there still value in a PhD? Especially since I’m moving away from academia into high school teaching?

This was a question that struck me quite hard early in my high school prac teaching experience, during the first mathematics class I taught with a group of largely disengaged students. One of the students said, “I’ve never passed a math class.” She then proceeded to quite comfortably complete a set of fairly abstract algebra exercises with a minimum of assistance. It became obvious that there was a lot of room for value generation in this class. Value that could have some significant impact on the lives of students.

What value is there in a PhD? Certainly I don’t see mine ever having the same sort of impact as a good high school teacher. And that’s with a thesis that generated a journal paper that has a Google Scholar citation count of 197 and over 2000 visits to the thesis page on my blog.

The common refrain I hear in Academia is that the PhD is just the entry ticket into Academia. That it’s your on-going work that will make the contribution and have the impact. But frankly, my experiences and observations of academia and its recent trends are such that it is becoming harder and harder to have an impact beyond the pointless ticking of boxes, meeting of targets, mouthing of slogans, and the mounting of projects that are meant to look good at the time (i.e. in terms of fulfilling all of the previous important tasks of academia) while failing to have any lasting impact.

Add to this the observation that my thesis is within the Information Systems discipline, which appears to me to be a dying discipline. A discipline suffering from the growing pervasiveness of information technology, which in turn reduces the importance and relevance of specialist IS researchers. A disease that seems to be infecting many IT specialist disciplines, but is especially difficult for IS and its attempt to distinguish itself from other business disciplines and also the IT discipline. That said, being actively engaged with attempts to increase the relevance of the Information Systems/Technology disciplines would be an interesting and challenging project.

In the end, the value of my PhD comes down to a purely personal value. After taking so long to complete the thesis, I have indeed completed it. I’ve proven that I could finish it.

High school teaching

While, as described above, I can see the great impact a quality high school teacher can make, I can also see how difficult it might be. I wonder whether or not I have the energy required to make that impact. Even though my experience is limited, I can already see the mismatch between the nature of schools, their curriculum and the needs of the students. NAPLAN and QCS testing is driving an increased focus on a narrow range of intellectual pursuits, somewhat like the point Sir Ken Robinson makes in his well-known video. Interestingly, this video was shown at the weekly staff meeting at the school I’m currently placed at.

Yes, there are some moves to broaden schooling with the offering of vocational education as part of high school. But the pressure of NAPLAN seems particularly limiting on mathematics, especially within the constraints of existing curriculum and resources such as textbooks. The kids who are prepared to fit within the expectations of school are a joy to work with and get a lot out of this approach. But there are other kids who, for a variety of reasons, don’t fit and subsequently are ill-served by the system. Trying to help those students within the constraints of the system sounds like a recipe for frustration and burnout.

Seems like I’ve found the challenge for what’s next.

How hard is it to get a personalised class timetable?

Apparently, it is too complex to create a personalised class timetable for students at the institution I’m attending. I have previously described what I (as a student) have to currently do to create my class timetable as well as explaining that I helped implement a personalised class timetable system at the same institution about 10 years ago.

The purpose of this post is to find out just how difficult it would be to do it today.

The context

The institutional context includes:

  1. A collection of static web pages that contain class timetabling information.
  2. A newly introduced Google Apps for Education for students (i.e. students all have institutionally provided individual Google calendars).

The plan

The previous personalised timetable system at this institution was a stand-alone web application. With some input it created a one week summary timetable as a single HTML page. Given the change in context, in particular the availability of Google calendar, the plan is to respond to this change. Rather than having a stand-alone institutional application, the plan is to integrate with what students are using.

That is, the plan is

  1. Web scrape the institutional web page to get the data.
  2. Use the data to create an iCalendar file or similar with the timetable information.

The idea is that the student can then import the iCal file into Google Calendar or any other calendar program that supports the format (which is most). If this were an institutional system, it might even automatically pre-populate all students’ individual Google calendars with their class timetable.
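The second step of the plan can be sketched quite simply. The post goes on to use Perl’s Data::ICal; the following is a rough stdlib-only Python equivalent that hand-rolls the (very simple) iCalendar format, with hypothetical hard-coded course codes and times standing in for the scraped data:

```python
# Sketch of step 2: turning scraped class data into an iCalendar file.
# The course names, dates and times below are hypothetical test data.
from datetime import datetime

classes = [
    ("EDED11111 Lecture", datetime(2012, 2, 27, 9, 0), datetime(2012, 2, 27, 11, 0)),
    ("EDED11112 Tutorial", datetime(2012, 2, 28, 14, 0), datetime(2012, 2, 28, 15, 0)),
]

def to_ics(events):
    def stamp(dt):
        return dt.strftime("%Y%m%dT%H%M%S")
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//timetable-sketch//EN"]
    for summary, start, end in events:
        lines += ["BEGIN:VEVENT",
                  f"SUMMARY:{summary}",
                  # An explicit TZID avoids events importing with a
                  # timezone offset error (e.g. 10 hours out).
                  f"DTSTART;TZID=Australia/Brisbane:{stamp(start)}",
                  f"DTEND;TZID=Australia/Brisbane:{stamp(end)}",
                  "END:VEVENT"]
    lines.append("END:VCALENDAR")
    return "\r\n".join(lines)

print(to_ics(classes))
```

Writing the file this way and importing it into Google Calendar (or any iCalendar-aware program) is all the “personalisation” step amounts to.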

Missing access

The last time I implemented a system like this I was within the institution, this time I’m not. This means I don’t have access to information such as

  • The list of courses a student is enrolled in.
    For the test I’ll hard-code it with the four courses I’m enrolled in. It wouldn’t be hard to take any list of courses and generate a calendar.
  • The dates for each week in the University calendar (e.g. week 1 is Feb ?? to Mar ??)
    I’ll also have to hard-code this.
  • Assessment due date information which could be added to the file.
    I’ll leave this data out of this little test.
  • Ability to automatically import information into the institutional Google calendar.
    Not sure if this is possible. If it were, the institution could automatically insert a student’s timetable into their Google calendar.

Web scraping

I don’t think the web page format for the timetable has changed too much. I’m hoping that I still have the code that can easily web scrape the appropriate pages.

Yep, there’s a bit of code from a couple of years ago, let’s see if that will work.

I think the hardest part here will be getting this ancient Perl code to run on my new computer when I haven’t done anything with it for years. Yep, get that working and hey presto it is extracting the information.

One of the limitations of the old Perl install is that the database stuff isn’t working. This is okay as I don’t need it for this little exercise. So, remove the database stuff and create a hashed data structure that allows manipulation.
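To make the scraping step concrete, here is a rough Python equivalent of what the old Perl code does, assuming a hypothetical static-HTML table layout (the real page format differs):

```python
# Sketch of the scraping step: pull class rows out of a static timetable
# page and keep only the enrolled courses. The HTML layout and course
# codes here are hypothetical; the original used Perl regular expressions.
import re

html = """
<tr><td>EDED11111</td><td>Lecture</td><td>Mon</td><td>09:00</td><td>11:00</td><td>Room 1.23</td></tr>
<tr><td>EDED11112</td><td>Tutorial</td><td>Tue</td><td>14:00</td><td>15:00</td><td>Room 2.01</td></tr>
"""

enrolled = {"EDED11111"}  # hard-coded course list, as described above

row = re.compile(r"<tr>" + r"<td>(.*?)</td>" * 6 + r"</tr>")
timetable = [m for m in row.findall(html) if m[0] in enrolled]
print(timetable)
# [('EDED11111', 'Lecture', 'Mon', '09:00', '11:00', 'Room 1.23')]
```

Regex scraping like this is fragile if the page layout changes, but for a long-stable static page it is the quickest thing that works.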


Create an iCalendar file

Can this be done in Perl? Yes, a quick Google reveals the Data::ICal Perl module and some code that used Data::ICal to do something similar to what I’ve planned (though only for the 2010 FIFA World Cup).

Say what you like about Perl, but CPAN rocks. Simple single statement and Data::ICal and all other necessary files are installed and working.

A couple of hours later (including a stop for lunch) and I have produced an iCal file that can be imported into Google calendar. The main problem at the moment is that Google calendar is showing the events 10 hours later than they should be – I think this is timezone related.

The big question now: “is there an easy way to delete all these entries?”. A Google search reveals an option under settings for Google calendar.

Yep, slight change in the timezone setting and the times are okay. What about the dates?

Oops, week 12 dates showing up early (typo in hard-coded start date). Week 8 on a Tuesday, not Monday. Ahh, public holiday that Monday. So official week start date is the Tuesday. My code assumes the Monday. A more detailed version of this would need to figure out the public holidays. Same for week 9.
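The holiday problem is easy to handle once noticed. A small sketch of the week-start logic a more detailed version would need (the semester start and holiday dates below are hypothetical):

```python
# Sketch of week-date logic: the "official" start of a teaching week is
# its Monday, unless that Monday is a public holiday, in which case the
# week officially starts on the Tuesday. Dates here are hypothetical.
from datetime import date, timedelta

SEMESTER_START = date(2012, 2, 27)     # Monday of week 1 (hypothetical)
PUBLIC_HOLIDAYS = {date(2012, 4, 16)}  # hypothetical holiday Monday in week 8

def week_start(week):
    monday = SEMESTER_START + timedelta(weeks=week - 1)
    # Shift forward past any public holiday(s) at the start of the week.
    while monday in PUBLIC_HOLIDAYS:
        monday += timedelta(days=1)
    return monday

print(week_start(1))   # 2012-02-27 (a Monday)
print(week_start(8))   # 2012-04-17 (Tuesday, holiday-adjusted)
```

A production version would pull the holiday dates from an institutional or government data source rather than hard-coding them.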

It’s all working

So, what’s been implemented is a script that

  • Automatically scrapes the institutional web page with timetabling information.
  • Extracts only the courses being taken by a student.
  • Generates an iCal file with the weekly personal class timetable.

Week 1 for me looks like the following.

Week 1

The monthly view on Google calendar, with one of the events highlighted looks like this


Concept proven, work to do

It took me just over 3 hours to complete this. There was a break for lunch and most of the time was spent remembering how to interact with Perl and the Webfuse code.

This is not an “enterprise” solution, not yet. But it wouldn’t be that difficult to do. It’s certainly not impossible.

If you have followed some of my previous work or blog posts it will not be a surprise that I believe there are significant barriers in the processes around institutional IT systems that limit the possibility of these types of innovations.

On the potential flexibility of open source LMS and its limits

Today a mate posted to his blog about a small project he’s involved with. The context of this project seems to be a good opportunity to comment on the potential flexibility of open source LMS and the limits of that flexibility within an institutional context. It’s also an attempt to link it back to the design theory described in my thesis (if you want more of the theory behind the following, look at the thesis).

The following uses Moodle as an example, but I believe similar limitations exist regardless of the open source LMS. This is in part because a significant limit on the flexibility is not the LMS, but the institutional governance processes and associated factors.

The need

In this case, the need is to send students an email with a link to a survey. The link is personalised based on the students’ membership of Moodle groups. The survey asks them to answer questions based on their experience of a group task.

My initial thought is that this sounds like something Moodle should be able to do. Given the increasing emphasis on group-related work, I doubt this is a novel requirement. So there might be something in Moodle that can do it; however, based on my limited knowledge of Moodle, I can’t think of anything off the top of my head.

I believe that the functionality to do each component of this task might exist within Moodle. There is probably a way to send emails to members of a group. There might even be a way to customise that email to some extent (there is a bulk email facility in Moodle 1.9 but, from memory, it seems somewhat limited). There is also probably a way to do a group-based survey (an MCQ might be the obvious solution).

But I doubt that there is an easy way to combine these separate functions so that the group email can automatically include the link to the group’s MCQ/survey.
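The combination that seems to be missing is conceptually tiny. A minimal sketch, with hypothetical group membership, survey URLs and message text (not Moodle code):

```python
# Sketch of the missing glue: given group membership and a per-group
# survey link, build one personalised email per group member. All the
# data and message text here are hypothetical.
groups = {
    "Group A": ["alice@example.edu", "arjun@example.edu"],
    "Group B": ["bea@example.edu"],
}
survey_links = {
    "Group A": "https://survey.example.edu/s/grpA",
    "Group B": "https://survey.example.edu/s/grpB",
}

def build_emails(groups, survey_links):
    emails = []
    for group, members in groups.items():
        body = (f"Hi {group} members,\n\n"
                f"Please complete your group survey: {survey_links[group]}\n")
        for member in members:
            emails.append({"to": member, "body": body})
    return emails

for mail in build_emails(groups, survey_links):
    print(mail["to"], "->", mail["body"].splitlines()[2])
```

The point is that no new survey or email functionality is needed; what is needed is a way to wire two existing functions together, and that wiring is where the institutional processes struggle.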

Doing it outside of Moodle

There is another interesting and related comment in the post describing this project

While not ideal in that it is a separate system from the LMS, it is hoped that this trial will help inform the development of a Moodle module that will perform the same function albeit in a more integrated and seamless way

Until just over 6 months ago, I worked at the institution being described. Based on that out-of-date experience, my initial guess is that “doing it outside of Moodle” was deemed easier than engaging with Moodle and the institutional IT department.

Two limits of open source LMS flexibility

Drawing on the above examples, I’d like to propose at least two, somewhat related limits on the flexibility of open source LMS:

  1. Inflexible institutional structures and processes.
  2. The difficulty of producing/the absence of scaffolding conglomerations.

Inflexible institutional structures and processes

Modifying an enterprise implementation of Moodle effectively and efficiently is hard. You don’t want your institution’s Moodle instance to be unavailable to students and staff because a code change has broken something drastically. The Moodle code-base is itself quite difficult to get a handle on. Not overly difficult, but a non-Moodle developer can’t simply front up and start making changes quickly. They need to be enculturated into the Moodle way, to learn what works and what doesn’t. Such a requirement means that someone able to modify Moodle is often a scarce and expensive resource, especially since most universities don’t have a dedicated Moodle developer.

To address this difficulty, and also to CYA (some might argue that CYA is the major reason), institutions spend a lot of time and effort setting up appropriate governance structures. The theory being that these are objective and rational ways to manage a difficult process and an expensive, scarce resource.

The trouble is that the difficulty and expense involved mean it becomes difficult for such processes to effectively engage with “small” problems like this one, i.e. problems that don’t actually require the development of any significant new functionality or large-scale modules. They just need a few minor changes or wrappers around existing functionality. For example, the requirement above could possibly be solved (the following is an example description given off the top of my head without any investigation as to whether it would work) by

  • Modifying the Moodle quiz function to populate a database table linking groups to URLs for group specific quizzes.
  • Modifying the existing Moodle bulk-email facility (or perhaps adding a wrapper around it like I did with bim) to use this database table to send personalised emails to group members.
  • Perhaps add a new quiz report that allows viewing/comparing within/across groups.

For a variety of reasons traditional institutional LMS policies and processes are too heavy-weight to respond to this sort of need. Instead, in order for something like this to be considered, it has to be blown up into some institutional priority, e.g. a system to support peer and group-based assessment for the entire institution. A project that will require a significant amount of time doing a needs analysis, and so on.

A big project that requires lots of resources is expensive enough to be efficiently considered by the governance and related processes. Small projects are too cheap to be efficiently considered by the expensive institutional processes.

In the hardware/operating systems field, this problem is known as starvation or indefinite postponement: the situation where a task is forever ignored because of a flaw in the priority mechanism.
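As a toy illustration of that flaw (a sketch of the scheduling concept, not a model of any real governance process), consider a strict-priority scheduler in which a small, low-priority task is postponed indefinitely because bigger, higher-priority projects keep arriving:

```python
import heapq


def run(tasks, incoming_big, steps):
    """Strict-priority scheduler: lower number = higher priority."""
    heapq.heapify(tasks)
    executed = []
    for _ in range(steps):
        if incoming_big:
            # A new "big project" arrives each round and jumps the queue.
            heapq.heappush(tasks, (0, incoming_big.pop(0)))
        executed.append(heapq.heappop(tasks)[1])
    return executed


# The small fix (priority 1) is starved: a big project (priority 0)
# arrives every round and is always chosen first.
done = run([(1, "small-fix")], [f"big-{i}" for i in range(5)], 5)
```

As long as big projects keep arriving, `small-fix` never reaches the front of the queue — the same dynamic as the small Moodle change that can never compete with institutional priorities.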

So, I’m proposing that institutional implementations of open source LMS end up suffering from a “starvation limit” on flexibility.

The need for rapid development of scaffolding conglomerations

The need in this case, at least to me, sounds like an example of what I termed scaffolding, context-sensitive conglomerations. Rather than necessarily requiring a brand new Moodle module or block, this problem sounds like something that actually needs to combine the functionality from a number of existing Moodle services. Something that conglomerates the lower-level functionality provided by Moodle into something that better meets this higher-level need.

A large part of the popularity of Moodle arises from its modularity. A feature that allows for the easy development of lots of new functionality. Something that increases the flexibility of Moodle.

The problem is that this flexibility arises, in part, from keeping these different modules separate. It’s the separation that makes it easy to add a new function without (theoretically) worrying about how it will affect the other modules. They are meant to be independent. The current problems moving to Moodle 2.0 are an example of the problems that arise from dependency. All the third-party modules depend on the Moodle core, so when the Moodle core changes all the third-party modules have to change.

A strict separation between modules makes it more difficult to combine parts of these different modules into a scaffolding conglomeration.

So, I’m proposing that open source LMS have an “over reliance on module independence” that limits their flexibility.

It’s really all about balance

I can already hear proponents of traditional institutional IT governance processes or strict software engineers bemoaning the problems of not having institutional governance or of module dependence. And I do agree. There are dangers and problems. I’m not suggesting that they should necessarily be done away with.

I do, however, think that too often the balance has gone too far one way. There needs to be more recognition of the need for balance the other way: a bit less focus on the objective, best ways of technical implementation, and a bit more on the subjective, best ways to improve learning and teaching.

A command for organisations? Program or be programmed

I’ve just finished the Douglas Rushkoff book Program or be Programmed: Ten commands for a digital age. As the title suggests the author provides ten “commands” for living well with digital technologies. This post arises from the titular and last command examined in the book, Program or be programmed.

Douglas Rushkoff

This particular command was of interest to me for two reasons. First, it suggests that learning to program is important and that more people should be doing it. As I’m likely to become an information technology high school teacher there is some significant self-interest in there being a widely accepted importance to learning to program. Second, and the main connection for this post, is that my experience with and observation of universities is that they are tending “to be programmed”, rather than program. In particular when it comes to e-learning.

This post is some thinking out loud about that experience and the Rushkoff command. In particular, it’s my argument that universities are being programmed by the technology they are using. I’m wondering why. I’m hoping this will be my last post on these topics; I think I’ve pushed the barrow for all it’s worth. Onto new things next.

Program or be programmed

Rushkoff’s (p 128) point is that

Digital technology is programmed. This makes it biased toward those with the capacity to write the code.

This also gives a bit of a taste for the other commands. i.e. that there are inherent biases in digital technology that can be good or bad. To get the best out of the technology there are certain behaviours that seem best suited for encouraging the good, rather than the bad.

One of the negative outcomes of not being able to program, of not being able to take advantage of this bias of digital technology is (p 15)

…instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery.

But is all digital technology programmed?

In terms of software, yes, it is all generally created by people programming. But not all digital technology is programmable. The majority of the time, money and resources invested by universities (I’ll stick to unis, though much of what I say may be applicable more broadly to organisations) goes into “enterprise” systems. Originally this was in the form of Enterprise Resource Planning systems (ERPs) like Peoplesoft. It is broadly recognised that modifications to ERPs are not a good idea, and that instead the ERP should be implemented in “vanilla” form (Robey et al, 2002).

That is, rather than modifying the ERP system to respond to the needs of the university, the university should modify its practices to match the operation of the ERP system. This appears to be exactly what Rushkoff warns against: “we are optimizing humans for machinery”.

This is important for e-learning because, I would argue, the Learning Management System (LMS) is essentially an ERP for learning. And I would suggest that much of what goes on around the implementation and support of an LMS within a university is the optimization of humans for machinery. In some specific instances that I’m aware of, it doesn’t matter whether the LMS is open source or not. Why?

Software remains hard to modify

Glass (2001), describing one of the frequently forgotten fundamental facts about software engineering, suggested that maintenance consumes about 40 to 80 percent of software costs, with 60% of the maintenance cost due to enhancement, i.e. a significant proportion of the cost of any software system goes into adding new features to it. Remember that this is a general statement. If the software is part of a system that operates within a continually changing context, then the figure is going to be much, much higher.
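The back-of-the-envelope arithmetic implied by Glass’s figures: if maintenance is 40–80% of total cost, and 60% of maintenance is enhancement, then adding new features alone accounts for roughly 24–48% of the total cost of a system.

```python
# Back-of-the-envelope arithmetic from Glass's figures (shares of total cost).
maintenance_share = (0.40, 0.80)   # maintenance as a share of total software cost
enhancement_of_maintenance = 0.60  # share of maintenance spent on enhancement

# Enhancement alone, as a share of total cost: 24% to 48%.
enhancement_share = tuple(round(m * enhancement_of_maintenance, 2)
                          for m in maintenance_share)
```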

Most software engineering remains focused on creation. On the design and implementation of the software. There hasn’t been enough focus on on-going modification, evolution or co-emergence of the software and local needs.

Take Moodle. It’s an LMS. Good and bad like other LMS. But it’s open source. It is meant to be easy to modify. That’s one of the arguments wheeled out by proponents when institutions are having to select a new LMS. And Moodle and its development processes are fairly flexible. It’s not that hard to add a new activity module to perform some task you want that isn’t supported by the core.

The trouble is that Moodle is currently entering a phase which suggests it suffers much the same problems as most large enterprise software applications. The transition from Moodle 1.x to Moodle 2.0 is highlighting the problems with modification. Some folk are reporting difficulties with the upgrade process, others are deciding to delay the upgrade as some of the third-party modules they use haven’t been converted to Moodle 2. There are even suggestions from some that mirror the “implement vanilla” advice for ERPs.

It appears that “we are optimizing humans for machinery”.

I’m wondering if there is anyone doing research into how to make systems like Moodle more readily modifiable for local contexts. At the very least, looking at how/if the version upgrade problem can be improved, but also the ability to modify the core to better suit local requirements. There are aspects of this there already. One of the difficulties is that to achieve this you would have to cross boundaries between the original developers, service providers (Moodle partners) and the practices of internal IT divisions.

Not everyone wants to program

One reason this will be hard is that not everyone wants to program. Recently, D’Arcy Norman wrote a post talking about the difference between the geeks and folk like his dad. His dad doesn’t want to bother with this techy stuff, he doesn’t want to “program”.

This sort of problem is made worse if you have an IT division that has senior management with backgrounds in non-IT work. For example, an IT director with a background in facilities management isn’t going to understand that IT is protean, that it can be programmed. Familiar with the relative permanence of physical buildings and infrastructure such a person isn’t going to understand that IT can be changed, that it should be optimized for the human beings using the system.

Organisational structures and processes prevent programming

One of the key arguments in my EDUCAUSE presentation (and my thesis) is that the structures and processes that universities are using to support e-learning are biased away from modification of the system. They are biased towards vanilla implementation.

First, helpdesk provision is treated as a generic task. The folk on the helpdesk are seen as low-level, interchangeable cogs in a machine that provides support for all an organisation’s applications. The responsibility of the helpdesk is to fix known problems quickly. They don’t/can’t become experts in the needs of the users. The systems within which they work don’t encourage, or possibly even allow, the development of deep understanding.

For the more complex software applications there will be an escalation process. If the front-line helpdesk can’t solve the problem it gets handed up to application experts. These are experts in using the application. They are trained and required to help the user figure out how to use the application to achieve their aims. These application experts are expert in optimizing the humans for the machinery. For example, if an academic says they want students to have an individual journal, a Moodle 1.9 application expert will come back with suggestions about how this might be done with the Moodle wiki or some other kludge. If Moodle 1.9 doesn’t provide a direct match, they figure out how to kludge together the functionality it does have. The application expert usually can’t suggest using something else.

By this stage, an academic has either given up on the idea, accepted the kludge, gone and done it themselves, or (bravely) decided to escalate the problem further by entering into the application governance process. This is the heavyweight, apparently rational process through which requests for additional functionality are weighed against the needs of the organisation and the available resources. If it’s deemed important enough the new functionality might get scheduled for implementation at some point in the future.

There are many problems with this process

  • Non-users making the decisions;
    Most of the folk involved in the governance process are not front-line users. They are managers, both IT and organisational. They might include a couple of experts – e-learning and technology. And they might include a couple of token end-users/academics. Though these are typically going to be innovators. They are not going to be representative of the majority of users.

    What these people see as important or necessary, is not going to be representative of what the majority of academic staff/users think is important. In fact, these groups can quickly become biased against the users. I attended one such meeting where the first 10/15 minutes was spent complaining about foibles of academic staff.

  • Chinese whispers;
    The argument/information presented to such a group will have gone through a Chinese-whispers-like game. An analyst is sent to talk to a few users asking for a new feature. The analyst talks to the developers and other folk expert in the application. The analyst’s recommendations will be “vetted” by their manager and possibly other interested parties. The analyst’s recommendation is then described at the governance meeting by someone else.

    All along this line, vested interests, cognitive biases, different frames of references, initial confusion, limited expertise and experience, and a variety of other factors contribute to the original need being morphed into something completely different.

  • Up-front decision making; and
    Finally, many of these requests will have to battle against already-set priorities. As part of the budgeting process, the organisation will have already decided what projects and changes it will implement this year. The decision has been made. Any new requirements have to compete for whatever is left.
  • Competing priorities.
    Last in this list, but not last overall, are competing priorities. The academic attempting to implement individual student journals has as their priority improving the learning experience of the student. They are trying to get the students to engage in reflection and other good practices. This priority has to battle with other priorities.

    The head of the IT division will have as a priority staying within budget and keeping the other senior managers happy with the performance of the IT division. Most of the IT folk will have as a priority, or will be told that their priority is, making the IT division and the head of IT look good. Similarly, and more broadly, the other senior managers on 5 year contracts will have as a priority making sure that the aims of their immediate supervisor are seen to be achieved…

These and other factors lead me to believe that as currently practiced, the nature of most large organisations is to be programmed. That is, when it comes to using digital technologies they are more likely to optimize the humans within the organisation for the needs of the technology.

Achieving the alternate path, optimizing the machinery for the needs of the humans and the organisation, is not a simple task. It is very difficult. However, by either ignoring or being unaware of the bias of their processes, organisations are sacrificing much of the potential of digital technology. If they can’t figure out how to start programming, such organisations will end up being programmed.


Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Thesis abstract v1.0b

The following is yet another sign that the thesis is finally getting close to submission: the second version of the thesis abstract. There have been some changes; feedback is always welcome.


This thesis seeks to offer an answer to the problem of how to design, implement and support information systems that effectively and efficiently support e-learning within universities. This is a problem that is increasingly prevalent and important to the operation of universities. It is also a problem where existing solutions are limited in terms of variety, quality and explicit theoretical guidance. This thesis formulates a specific Information Systems Design Theory (ISDT) – An Information Systems Design Theory for Emergent University E-learning Systems – as one answer to this problem.

The ISDT is formulated using an iterative action research cycle that encompasses the design, support and evolution of the Webfuse information system at Central Queensland University (CQU) from 1996 through 2009. The Webfuse system was used by tens of thousands of staff and students. It is the knowledge gained through this experience that, at two separate stages, is used to formulate ISDTs culminating in An Information Systems Design Theory for Emergent University E-learning Systems.

The final ISDT recognises that diversity and rapid, on-going change are, for a number of reasons, the key characteristics of e-learning within universities. Consequently, the ISDT specifies both process and product models that aim to enable the e-learning information systems to be emergent. In particular the ISDT proposes that emergent e-learning information systems will: encourage and enable greater levels of e-learning adoption in terms of quantity, quality and diversity; as well as provide a level of differentiation and competitive advantage for the institution.

This thesis makes two additional contributions. First, the Ps Framework is developed and used to analyse the current, dominant practice of providing e-learning information systems within universities. The resulting analysis reveals a significant mismatch between the requirements of e-learning within universities and the characteristics of the product and process models used by the dominant approach to supporting e-learning within universities. It is this mismatch that the ISDT seeks to address. Second, problems with the existing approach for specifying ISDTs led to the development and widespread acceptance of an improved method of ISDT specification.

Dilbert as an expository instantiation

A few recent posts have been first draft excerpts from my Information Systems Design Theory (ISDT) for emergent university e-learning systems. Being academics, and hence somewhat pedantic about these things, an ISDT is meant to have a number of specific components. One of these is the expository instantiation, which is meant to act as both an explanatory device and a platform for testing (Gregor and Jones, 2007), i.e. it’s meant to help explain the theory and also support testing of the theory.

The trouble is that today’s Dilbert cartoon is probably as good an explanation as any of what is currently the third principle of implementation for my ISDT.

I’m sure that most folk working in a context where they’ve had to use a corporate information system have experienced something like this. A small change – either to fix a problem or improve the system – simply can’t be made because of the nature of the technology or the processes used to make the changes. The inability to make these changes is a major problem for enterprise systems.

The idea from the ISDT is that the development and support team for an emergent university e-learning system should be able to make small scale changes quickly without having to push them up the governance hierarchy. Where possible the team should have the skills, insight, judgement and trust so that “small scale” is actually quite large.

An example

The Webfuse e-learning system that informed much of the ISDT provides one example. Behrens (2009) quotes a user of Webfuse about one example of how it was responsive

I remember talking to [a Webfuse developer] and saying how I was having these problems with uploading our final results into [the Enterprise Resource Planning (ERP) system] for the faculty. He basically said, “No problem, we can get our system to handle that”… and ‘Hey presto!’ there was this new piece of functionality added to the system… You felt really involved… You didn’t feel as though you had to jump through hoops to get something done.

Then this is compared with a quote from one of the managers responsible for the enterprise system

We just can’t react in the same way that the Webfuse system can, we are dealing with a really large and complex ERP system. We also have to keep any changes to a minimum because of the fact that it is an ERP. I can see why users get so frustrated with the central system and our support of it. Sometimes, with all the processes we deal with it can take weeks, months, years and sometimes never to get a response back to the user.

Is that Dilbert or what?

The problem with LMS

Fulfilling this requirement is one of the areas where most LMS create problems. Most universities/organisations are getting into a situation where the LMS (even Moodle) approaches the “complex ERP system” problem described in the last quote above. Changing the LMS is too fraught with potential dangers for changes to be made quickly. Most organisations don’t try, so we’re back to a Dilbert moment.

Hence, I think there are two problems facing universities trying to fulfil principle #3:

  1. Having the right people in the support and development team, with the right experience, insight and judgement, is not a simple thing and is directly opposed to current common practice, which seeks to minimise the number of such people. Instead there’s a reliance on helpdesk staff and trainers.
  2. The product problem, i.e. the LMS is too large and difficult to change quickly and safely. I think there’s some interesting work to be done here within Moodle and other open source LMS. How do you balance the “flexibility” of open source with the complexity of maintaining a stable institutional implementation?


Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information Systems, 8(5), 312-335.
