Product models – LMS, BoB and alternatives

The following completes the “alternate models” section of the Product component started in a previous post. It’s a bit rough and ready, but hopefully good enough.

Product models

The ERP market was one of the fastest growing and most profitable areas of the software industry during the last three years of the 1990s (Sprott 2000) and has tended to dominate the IT field (Light, Holland et al. 2001). It was at this same time – the late 1990s – that commercial LMS became increasingly available and increasingly used within universities. Perhaps then, it is not surprising that in terms of the underlying product model an LMS appears to be very close to that of a single-vendor Enterprise Resource Planning (ERP) system. In both cases, all the required functionality is provided in one, integrated package sourced from a single provider. In comparing the literature it is possible to see significant commonality between the advantages and disadvantages of an LMS and those of an ERP system. The aim of this section is not to repeat the advantages and disadvantages of LMS – covered somewhat in the “LMS characteristics and limitations” section (Section 2.1.2) – or ERPs – covered in more detail in the relevant literature (Kallinikos 2004; Light 2005). It is instead to establish the existence of other potential product models and compare these with the ERP model. In addition, towards the end of this section the recent complicating factor of user-owned technology is raised.

There are two approaches to the design of an LMS (Weller, Pegler et al. 2005):

  1. monolithic or integrated approach; and
    All common tools are provided in a single software package supplied and supported by one vendor. This is the predominant approach.
  2. best of breed approach.
    An alternative approach, also termed a component or hybrid architecture, which aims to provide the same level of integration while allowing the selection of components that best suit the local context.

The same two approaches can be identified in the broader provision of enterprise information systems. It is possible to identify a reasonable spread of literature (Dewan, Seidmann et al. 1995; Geishecker 1999; Light, Holland et al. 2001; Hyvonen 2003; MacKinnon, Grant et al. 2008; Burke, Yu et al. 2009) examining various questions arising out of the difference between a monolithic ERP product model and the best of breed (BoB) model. This may not be all that surprising as such discussions have been billed as the “long-running debate” with the pendulum swinging from one view to the other and back again (Geishecker 1999). It is a debate encompassed by an even longer-standing debate over the centralisation or decentralisation of computing, its focus on efficiency versus effectiveness, and the supposed rational attempts at optimising the trade-off (King 1983). It is a debate that appears unresolvable because the issues actually driving it are the politics of organisation and resources, and especially the apparently central issue of control (King 1983).

ERP adoption involves a centralised organisation of processes and a tendency to reduce autonomy and increase rigidity (Lowe and Locke 2008). Centralisation of control preserves top management prerogatives in most decisions, whereas decentralisation allows lower level managers discretion in choosing among options (King 1983). A BoB approach allows each department to select its own solution (Dewan, Seidmann et al. 1995). Light, Holland and Wills (2001) perform a comparative analysis of the ERP (monolithic or integrated) and best of breed (BoB) approaches to enterprise information systems, which is summarised in Table 2.3.

Table 2.3 – Comparison of major differences between ERP and BoB (adapted from Light, Holland et al. 2001)
Best of breed | Single vendor ERP
Organisation requirements and accommodations determine functionality | The vendor of the ERP system determines functionality
A context sympathetic approach to BPR is taken | A clean slate approach to BPR is taken
Good flexibility in process re-design due to a variety in component availability | Limited flexibility in process re-design, as only one business process map is available as a starting point
Reliance on numerous vendors distributes risk as provision is made to accommodate change | Reliance on one vendor may increase risk
The IT department may require multiple skill sets due to the presence of applications, and possibly platforms, from different sources | A single skill set is required by the IT department as applications and platforms are common
Detrimental impact of IT on competitiveness can be dealt with, as individualism is possible through the use of unique combinations of packages and custom components | Single vendor approaches are common and result in common business process maps throughout industries. Distinctive capabilities may be impacted
The need for flexibility and competitiveness is acknowledged at the beginning of the implementation. Best in class applications aim to ensure quality | Flexibility and competitiveness may be constrained due to the absence or tardiness of upgrades and the quality of these when they arrive
Integration of applications is time consuming and needs to be managed when changes are made to components | Integration of applications is pre-coded into the system and is maintained via upgrades

Even in 1983, over twenty-five years ago, it was recognised that the terrain in which to decide between centralised and decentralised computing was continually changing (King 1983). This change is driven in no small part by the changing nature of technology, from mainframes to personal computers to managed operating environments. Similarly, the smaller discussion between ERP and BoB has also been influenced by changes in technology. In the early to mid-1980s, the mainframe-dominant market automatically defaulted to an integrated ERP approach (Geishecker 1999). Most recently, integration technologies like web services and service-oriented architectures (SOA) are seen to be enabling the adoption of BoB approaches (Chen, Chen et al. 2003). Such approaches are having an impact within the LMS field, with attempts at implementing a BoB LMS being enabled by the development of service-oriented architectures such as that by JISC (Weller, Pegler et al. 2005). Such an approach may allow a more post-industrial approach to the LMS, taking the parts that are needed, when they are needed, and granting control where it is needed (Dron 2006). Bailetti et al (2005) report on an early system that uses web services to implement a BoB approach.

In general, however, discussions about and comparisons between ERP and BoB approaches to enterprise systems suffer the same limitation as the discussion of procurement strategies in the previous section. They are still based on the assumption that it is the responsibility of the institution, and its information technology department, to select, own and maintain all of the information systems required by users. Web 2.0, e-learning 2.0 (Downes 2005) and the rise of social software require that the organisation of e-learning move beyond centralised and integrated LMS and towards a variety of separate tools which are used and managed by the students in relation to their self-governed work (Dalsgaard 2006). Stiles (2007) argues that in the future organisational needs will be best met by a BoB approach; however, student-initiated processes will be carried out using the students’ choice of tools and services – an approach that provides students with a tool-box of loosely joined small pieces (Ryberg 2008).

References

Bailetti, T., M. Weiss, et al. (2005). An open platform for customized learning environments. International Conference on Management of Technology (IAMOT).

Burke, D., F. Yu, et al. (2009). "Best of Breed Strategies: Hospital characteristics associated with organizational HIT strategy." Journal of Healthcare Information Management 23(2): 46-51.

Chen, M., A. Chen, et al. (2003). "The implications and impacts of web services to electronic commerce research and practices." Journal of Electronic Commerce Research 4(4): 128-139.

Dalsgaard, C. (2006). "Social software: E-learning beyond learning management systems." European Journal of Distance Education.

Dewan, R., A. Seidmann, et al. (1995). Strategic choices in IS infrastructure: Corporate standards versus "Best of Breed" Systems. ICIS’1995.

Downes, S. (2005). "E-learning 2.0." eLearn 2005(10).

Dron, J. (2006). Any color you like, as long as it’s Blackboard. World Conference on E-Learning in Corporate, Government, Healthcare and Higher Education, Honolulu, Hawaii, USA, AACE.

Geishecker, L. (1999). "ERP vs. best-of-breed." Strategic Finance 80(9): 62-67.

Hyvonen, T. (2003). "Management accounting and information systems: ERP versus BoB." European Accounting Review 12(1): 155-173.

Kallinikos, J. (2004). "Deconstructing information packages: Organizational and behavioural implications of ERP systems." Information Technology & People 17(1): 8-30.

King, J. L. (1983). "Centralized versus decentralized computing: organizational considerations and management options." ACM Computing Surveys 15(4): 319-349.

Light, B. (2005). "Potential pitfalls in packaged software adoption." Communications of the ACM 48(5): 119-121.

Light, B., C. Holland, et al. (2001). "ERP and best of breed: a comparative analysis." Business Process Management Journal 7(3): 216-224.

Lowe, A. and J. Locke (2008). "Enterprise resource planning and the post bureaucratic organization." Information Technology & People 21(4): 375-400.

MacKinnon, W., G. Grant, et al. (2008). Enterprise information systems and strategic flexibility. 41st Annual Hawaii International Conference on System Sciences, Waikoloa, Hawaii.

Ryberg, T. (2008). Challenges and potentials for institutional and technological infrastructures in adopting social media. 6th International Conference on Networked Learning, Halkidiki, Greece.

Sprott, D. (2000). "Componentizing the enterprise application packages." Communications of the ACM 43(4): 63-69.

Stiles, M. (2007). "Death of the VLE? A challenge to a new orthodoxy." Serials 20(1): 31-36.

Weller, M., C. Pegler, et al. (2005). "Students’ experience of component versus integrated virtual learning environments." Journal of Computer Assisted Learning 21(4): 253-259.

Procurement and software: alternate models for e-learning

And here’s the next bit of the Products component for chapter 2 of my thesis. The aim of this section is basically to argue that the LMS approach to e-learning embodies one view of how to procure software and one software model. I eventually aim to argue that both of these predominant models are essentially bad matches for the nature of e-learning within a university. The following is intended more to identify that there are alternatives than to argue for the inappropriateness. That’s for later. But I doubt I’ve stopped it coming through.

This section focuses on procurement, I hope to have the product section up later today.

Procurement and software: alternate models for e-learning

As has been noted previously, within higher education the selection and purchase of an LMS has become the almost ubiquitous and unquestioned technical solution to the provision of e-learning. This singular approach can be said to embody a single approach to the procurement of software – “buy” – and a standard software model – the integrated, enterprise system. This section is based on the assumption that there are alternatives to both these models. There are different approaches to software procurement and different software product models that may be more appropriate for e-learning within universities, especially in light of recent changes within the broader information technology market place.

Procurement strategies for information systems

There is recognition that the choice of IS procurement strategy is critical for company operations and that different kinds of systems require different kinds of resources, and consequently different procurement strategies are applicable (Hallikainen and Chen 2005). Alignment between information technology and business is seen by scholars as an important principle for the success of IT deployment and implementation (Beukers, Versendaal et al. 2006). Saarinen and Vepsalainen (1994) propose the Procurement Principle as a prescriptive model for information systems investments. The principle is based on the assumption that optimal decisions about procurement are made when there is alignment between three choices: the type of system, the procurement strategy, and the type of organisational requirements (Wild and Sobernig 2007).

The Procurement Principle is based on transaction cost economics and draws on two inherent factors – specificity of system design and uncertainty of requirements – to develop three generic types of organisational requirements (Saarinen and Vepsalainen 1994):

  1. routine;
    Common to many or most organizations with stable requirements and low uncertainty.
  2. standard; and
    Common to a group of organizations, possibly within a given domain (Wild and Sobernig 2007), with some variety and uncertainty in requirements.
  3. speculative.
    Highly specific to one company and involving high uncertainty in terms of functionality, user interfaces and the competitiveness of the organisation.

In terms of the two inherent factors – specificity of design and requirements uncertainty – the above generic types represent systems on the diagonal. Saarinen and Vepsalainen (1994) recognise that other types of systems exist, suggest that they may be difficult to deal with, and recommend that their requirements be modified to fit with the three identified types or that the systems be postponed.

Saarinen and Vepsalainen (1994) identify generic types of developers that fit with these procurement strategies. The three types are:

  1. implementers;
    Employed by an external software development company, these developers have high levels of product-specific knowledge but only limited, common knowledge about the user organisation.
  2. analysts; and
    Commissioned by the client, these staff are responsible for specifying user requirements and improving system solutions by drawing on their ability to solve generic problems and specify complex integrated systems.
  3. innovators.
    Usually employed by the user organisation, these developers have specialised knowledge about the user organisation, its users and information systems. They can communicate easily with the users and can specify and create new innovative solutions.

The appropriate matching of the type of requirements and the type of developer is then used to identify three efficient and generic procurement strategies, although in large projects these generic strategies will have to be combined and redefined in practice (Saarinen and Vepsalainen 1994). The three generic strategies are (Saarinen and Vepsalainen 1994):

  1. Routine systems can be best implemented by acquiring software packages from implementers.
  2. Standard applications require software contracting by analysts and possibly other outside resources for implementation.
  3. Speculative investments are best left for internal development by innovators.

These three generic strategies correspond to the three major approaches to information systems development: software product purchase, contractual customized development with outside vendors, and in-house development (Heiskanen, Newman et al. 2000). The selection and implementation of an LMS within a university represents software product purchase with some limited integration work. Increasingly, there is an absence of institutions adopting the other approaches, either individually or in combination.

The over-emphasis on the software product purchase approach contributes to an increasingly techno-centric view. Due to the cost involved in modifying a complex software package, most commercial systems require the institution to modify its practices to accommodate the system (Dodds 2007). So, rather than using IT to foster a culture of innovation by taking the point of view of the individual (Dodds 2007), or even the organisation, the focus is on the technology and its capabilities. As early as 1982 an alternate evolutionary approach, which appears much closer to in-house development, was recommended by Kerr and Hiltz (1982) for computer-mediated communication and found to be common with interactive systems which provide cognitive support. Kerr and Hiltz (1982) suggested that, because the technology was so new, the possibilities for alternative functions and capabilities so numerous, and users unable to adequately understand what they might do with a new technology until they had an opportunity to experience it, an approach of feedback, evaluation and incremental implementation of new features was desirable.

The reasons identified by Kerr and Hiltz (1982) seem to fit two (requirements identity and requirements volatility) of the three categories of risks associated with requirements development identified by Tuunanen et al (2007) and shown in Table 2.2. If this observation remains appropriate for current practices around e-learning, it would appear to question the alignment between the LMS procurement approach and the types of requirements that would make that approach the most efficient, as identified by the Procurement Principle.

Table 2.2 – Requirements development risks (adapted from Tuunanen, Rossi et al. 2007)
Risks | Definition
Requirements identity | The availability of requirements; high identity risk indicates requirements are unknown or indistinguishable
Requirements volatility | The stability of requirements; high volatility risk indicates requirements easily change as a result of environmental dynamics or individual learning
Requirements complexity | The understandability of requirements; high complexity risk indicates requirements are difficult to understand, specify, and communicate

In addition, both the nature of the LMS and the procurement model assume that it is necessary for the organisation to provide all of the components of the information system. In recent years the functionality and usability of technology available to individuals has been outstripping that of technology provided centrally by institutions (Johnson and Liber 2008). Increasingly, university students and staff are using a collection of tools and systems they choose, rather than tools and systems selected, owned and maintained by the university (Jones 2008).

References

Beukers, M., J. Versendaal, et al. (2006). "The procurement alignment framework construction and application." Wirtschaftsinformatik 48(5): 323-330.

Hallikainen, P. and L. Chen (2005). "A holistic framework on information systems evaluation with a case analysis." The Electronic Journal Information Systems Evaluation 9(2): 57-64.

Johnson, M. and O. Liber (2008). "The Personal Learning Environment and the human condition: from theory to teaching practice." Interactive Learning Environments 16(1): 3-15.

Jones, D. (2008). PLES: framing one future for lifelong learning, e-learning and universities. Lifelong Learning: reflecting on successes and framing futures. Keynote and refereed papers from the 5th International Lifelong Learning Conference, Rockhampton, CQU Press.

Saarinen, T. and A. Vepsalainen (1994). "Procurement strategies for information systems." Journal of Management Information Systems 11(2): 187-208.

Wild, F. and S. Sobernig (2007). Learning tools in higher education: Products, characteristics, procurement. Second Conference on Technology Enhanced Learning. Crete, Greece.

Learning Tools in Higher Education: Products, Characteristics, Procurement

Back to the PhD today, probably will do a couple of summaries of papers I’m reading. The focus is on the product models and procurement strategies used by Universities to solve the technical problem of e-learning. I start with a paper with the title “Learning Tools in Higher Education: Products, Characteristics, Procurement” (Wild and Sobernig, 2007)

Summary

Uses interviews of 100 European universities from 27 countries to identify the tools they use to facilitate learning, how intensively they are used and what procurement strategies are used.

Gives some rough figures of types of systems used. Gives a longitudinal feel to some previous studies.

Seems to indicate that European institutions find it “very important to have an institutional platform run by the institutions themselves, however, with strong connections to the open-source world”.

I wonder if the results would be the same in the US or Australia where commercial LMS adoption has been more predominant – though changing somewhat.

The reporting of the findings is, to me at least, somewhat confusing.

The greatest value for me is pointing me to the literature (Saarinen et al, 1994; Heiskanen et al, 2000) that proposes an optimal relationship between types of requirements, types of system and types of procurement strategy. I’ll be using this in the PhD and potentially some papers.

Introduction

Most unis using some sort of LMS. 250 commercial software providers, 40 open source products – large and heterogeneous products. Some evidence (Pituch and Lee, 2006) that functionality and interactivity drive usage.

What tools are being used today?

Products in the market

Participants report

  • 182 distinct tools occurred 290 times: LMS, content management, collaboration tools
  • Moodle most used – 44 instances, but only 15 of these not running in parallel with others.
  • WebCT – 14 installations.
  • 15 pure content management systems in 20 installations
  • 18 pure admin information systems – 19x.
  • 22 different authoring tools
  • 14 learning object repositories
  • 10 different assessment tools
  • 32 different collaboration tools with 51 installations
  • Most heavily used systems identified by highest active number of users – WebCT (twice), .LRN (once), CampusNet (once), Blackboard (once) and eLSe (once).

References a couple of other similar investigations of tools

Since one – five systems have vanished.

Portfolio characteristics

What activities did the tools support:

  • text-based communication – 87 (out of 100)
  • Assessments – 81
  • Quality assurance and evaluation – 53
  • Collaborative publishing – 52
  • Individual publishing – 44
  • social networking – 34
  • Authoring learning designs – 31
  • Audio/video conferencing – 31
  • Audio/video broadcasting – 25
  • User portfolio management – 23
  • simulations/online labs – 21

Text-oriented predominant. Multimedia lacking support

The following table compares reports of course sites from two previous studies and this one – some issues in comparison.

Categories | Paulsen (1999) | Paulsen (2003) | Wild and Sobernig (2007)
Up to 15 courses | 68% | 38% | 22%
More than 15 | 25% | 50% | 56%

This study also found 36% with more than 100 courses and 5% with more than 1000.

Tool usage: 49/100 delivery and 54/100 course management.

Report on problems with calculating the number of users because of various difficulties.

Procurement strategies

Procurement decisions based on 3 types of requirements

  1. Speculative requirements – organisationally unique or involve uncertainty.
  2. Standard requirements – common to organisations of a particular domain.
  3. Routine requirements – invariant across domain boundaries.

Literature suggests that in optimal cases, organisational choices are driven by these requirements. Suggests this choice represents a combination of

  • Software type – custom developed, packaged and off-the-shelf
  • Procurement strategy – in-house development (internal procurement), contracting and acquisition (both external procurement).

Same literature suggests an alignment between requirement types and organisational choices:

  • Predominantly speculative – internal development of custom software.
  • Standard requirements – customised, packaged software where customisation is externally contracted.
  • Routine requirements – off-the-shelf software.

At this stage, the explanation of the findings from the survey is really hard to follow – at least for me. I would’ve thought this should be easy. Keep that in mind when you read the following.

  • 40% follow procurement configurations considered optimal
  • 44% reported mixed configurations of requirements and procurement strategy
  • 5% report external procurement from external contractors
  • External procurement, when it does occur, predominantly with speculative requirements.
  • Internal development equally distributed across requirements – 21% speculative, 19% mixed, 18% standard
  • There are other percentages reported, but I can’t follow it and/or make sense of it with the ones I’ve summarised above

References

Heiskanen, A., M. Newman, et al. (2000). “The social dynamics of software development.” Accounting, Management & Information Technology 10(1): 1-32.

Paulsen, M. F. (2000). Online Education: An International Analysis of Web-based Education and Strategic Recommendations for Decision Makers. Bekkestua, Norway, NKI Forlaget.

Paulsen, M. F. (2003). “Experiences with Learning Management Systems in 113 European Institutions.” Educational Technology & Society 6(4): 134-148.

Pituch, K. and Y. Lee (2006). "The influence of system characteristics on e-learning use." Computers & Education 47(2): 222-244.

Saarinen, T. and A. Vepsalainen (1994). “Procurement strategies for information systems.” Journal of Management Information Systems 11(2): 187-208.

Wild, F. and S. Sobernig (2007). Learning tools in higher education: Products, characteristics, procurement. Second Conference on Technology Enhanced Learning. Crete, Greece.

Comparisons between LMS – the need for system independence

Some colleagues and I are putting the finishing touches on a paper that has arisen out of the indicators project. The paper is an exploratory one, seeking to find interesting patterns in the use of LMS (learning management systems, aka course management systems, virtual learning environments etc.) that might indicate good or bad things about that use and help improve decision-making by all participants (students through management). I hope to post the paper in coming days.

This post is about one aspect of the paper: the section where we compare feature adoption between two different LMS that have been used side-by-side at our institution, Blackboard and Webfuse. (Important: I don’t believe Webfuse is an LMS and will argue that in my PhD – Webfuse is the topic of my thesis. But it’s easier to go with the flow.) This is one of the apparent holes in the literature: we haven’t found any publications analysing and comparing system logs from different LMS, especially within the one institution over the same long time frame. In our case we went from 2005 through the first half of 2009.

The aim of this post is to identify the need and argue for the benefits in developing a LMS independent means of analysing and comparing the usage logs of different LMS at different institutions.

Anyone interested?

The following gives a bit of the background, reports on some initial findings and out of that identifies the need for additional work.

Our first step

Blackboard and Webfuse have a number of significant differences. All LMS have somewhat different assumptions, designs and names. Webfuse is significantly different, but that’s another story. The differences make comparisons between LMS more difficult. How do you compare apples with apples?

The only published approach we’re aware of that attempts to make a first step towards a solution to this problem is the paper by Malikowski, Thompson and Theis (2007) for which the abstract makes the following claims

…This article recommends a model for CMS research that equally considers technical features and research about how people learn…..This model should also ease the process of synthesizing research in CMSs created by different vendors, which contain similar features but label them differently.

I’ve talked about and used the model previously (first, second and other places). For the purposes of the paper we produced a different representation of the Malikowski et al (2007) model.

Reworked Malikowski model

From my perspective there are three contributions the model makes

  1. Provides an argument for 5 categories of features an LMS might have, gives them a common title and specifies which common features fit where.
  2. Draws on existing literature to give some initial benchmarks for the level of adoption (specified by the percentage of courses with a feature) to be expected, grouped into three levels.
    I must admit that Malikowski et al don’t specify the percentages directly; these are taken from the examples they list in tables.
  3. Suggests a model where features are adopted sequentially over time as academics become more comfortable with existing features.

Blackboard versus Webfuse – 2005 to 2009

The benefit the model has provided us is the ability to group the different features of Webfuse and Blackboard into the five categories and then compare the levels of feature adoption between the two systems and with the benchmarks identified in the Malikowski et al (2007) paper. What we found is summarised below, after a rough sketch of how the comparison can be done.
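
The comparison itself boils down to a simple calculation for each system and year: what percentage of course sites used at least one feature in each category? A minimal sketch in shell, assuming a file (course.Features – a made up name for this example) with one “offering,category” line per category used in a course site, and a file (all.Offerings) listing every course offering for the year:

# course.Features: lines like "T1/COIT11133,transmit_content" (one per category a site uses)
# all.Offerings: one line per course offering in the year
total=`wc -l < all.Offerings`
for category in transmit_content class_interaction evaluate_students evaluate_course
do
  n=`grep -c ",$category$" course.Features`
  # percentage of course sites using at least one feature in the category
  echo "$category: `echo "scale=1; 100 * $n / $total" | bc`%"
done

Run the same thing over feature lists extracted from the Blackboard logs and from the Webfuse logs and the resulting percentages are directly comparable with each other and with the Malikowski et al benchmarks.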

Transmitting content

Malikowski et al (2007) define this to include announcements, uploaded files and the use of the gradebook to share grades (but not assignment submission etc.). The following graph shows the percentage of course sites in Blackboard (black continuous line) and Webfuse (black dashed line), along with the “benchmark region” identified in Malikowski et al. In the case of transmitting content the “benchmark region” is between 50 and 100%.

Feature adoption - Transmit Content - Wf vs Bb

This shows that both Blackboard and Webfuse are in the “benchmark region”. Not surprising given the predominant use of LMSs for content transmission. What may be surprising is that Webfuse only averages around 60-75%. This is due to one of those differences between LMS. Webfuse is designed to automatically create default course sites that contain a range of content. Also, it’s quite common for the announcements facility in a Webfuse course site to be used by faculty management to disseminate administrative announcements to students.

So, in reality 100% of Webfuse courses transmit content. The percentages show those courses where the academics have uploaded additional content or made announcements themselves.

Class interactions

Class interactions covers chat rooms, email, discussion forums, mailing lists etc. Anything that gets folk in a course talking.

Feature adoption - Class Interaction- Wf vs Bb

Both Blackboard and Webfuse are, to varying extents, outside of the “benchmark area” – Webfuse quite considerably, reaching levels near 100% in recent years, while Blackboard has only just crept over. This creeping over by Bb may be an indicator that the “benchmark area” is out of date. It was created drawing on 2004 and earlier literature. If feature adoption increases over time, the “benchmark area” has probably moved up.

Evaluating students

Online assignment submission, quizzes and use of other tools to assess/evaluate students.

Feature adoption - Evaluating Students - Bb vs Wf

Over recent years Webfuse has seen double the adoption of these features compared to Blackboard and has grown outside the “benchmark area”. Most of this is online assignment submission; in fact, some of the courses using the Webfuse online assignment submission system are actually Blackboard courses.

Evaluating course/instructor

The last category we covered was evaluating the course/instruction through survey tools etc. We didn’t cover computer-based instruction as very few Blackboard courses use it and Webfuse doesn’t provide the facility.

Which raises an interesting question. I clearly remember a non-Webfuse person being quite critical that Webfuse did not offer the computer-based instruction functionality – we could have added it but no-one ever asked. What is better, paying for features few people ever use or not having features that a few people will use?

Feature adoption: evaluating Courses Bb versus Wf

First, it should be pointed out that for “rarely used” features like course evaluation there is an absence of percentages in Malikowski et al (2007). I’ve specified 20% as the upper limit for this “benchmark area” because “moderately used” was 20% or higher. So it’s probably unfair to describe the Blackboard adoption level as being at the bottom of the range. On the other hand, Webfuse is streets ahead – near 100% dropping to just less than 40%. More on this below.

Work to do

Generating the above has identified a need or value in the following future work:

  • Do an up to date literature review and establish a new “benchmark area”.
    Malikowski et al (2007) rely on literature from 2004 and before. Levels of adoption have probably gone up since then.
  • Refine the list of features per category through the same literature review.
    In recent years LMS have added blogs, wikis, social networking etc. Where do they fit?
  • Refine the definition of “adoption”.
    Malikowski and his co-authors have used at least two very different definitions of adoption. There is apparently no work to check that the papers used to illustrate the model in Malikowski et al (2007) use a common definition of adoption.
  • Develop feature specific LMS independent usage descriptions.
    In their first paper Malikowski et al (2006) count adoption as the presence of a feature, regardless of how broadly it is used. This causes problems; for example, the course evaluation figure for Webfuse is near 100% because for a number of years a course barometer (Jones, 2002) was a standard part of a Webfuse default site, i.e. every course had one. Just doing a quick check, only 23% of Webfuse courses in 2006 had a barometer in which a student made a comment.

    Malikowski (2008) adopted a new measure for adoption. Course use of a particular feature had to be above the 25th percentile of use for that feature in order to be counted. I don’t find this a good measure. Just 1 student comment on a barometer could be a potentially essential use of the feature.

    There appears to be a need to judge the level of use of a feature in a way that is sensitive to the feature. One entry in a gradebook for a course of 500 students is probably an error and can be ignored. One comment on a barometer for that same course that points out an important issue probably shouldn’t be ignored.

  • Attempt to automate comparison between LMS.
    In order to enable broader and quicker comparison between different LMS, whether between institutions or within institutions, there appears a need to automate the process – to make it quicker and more straightforward.

    One approach might be to design an LMS independent database schema for the extended Malikowski et al (2007) model. Such a schema would enable people to write “conversion scripts” that take usage logs from Blackboard, Moodle, WebCT or whatever LMS and automatically insert them into the schema. Once someone has written the conversion script for an LMS, no-one else would have to. The LMS independent schema could then be analysed and used to compare and contrast different systems and different institutions without the apples and oranges problem.
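
As a very rough sketch of what such a schema might look like (the table and column names below are mine, made up for illustration, not taken from Malikowski et al), a conversion script’s only job would then be to populate something like these two tables from its LMS’s logs:

sqlite3 lms_usage.db <<'SQL'
CREATE TABLE course_offering (
  offering_id INTEGER PRIMARY KEY,
  institution TEXT,     -- e.g. CQU
  lms         TEXT,     -- e.g. Blackboard, Webfuse, Moodle
  course      TEXT,     -- course code
  period      TEXT,     -- year and term
  students    INTEGER   -- enrolment, needed for feature-sensitive measures of use
);
CREATE TABLE feature_use (
  offering_id INTEGER REFERENCES course_offering(offering_id),
  category    TEXT,     -- one of the five Malikowski et al (2007) categories
  feature     TEXT,     -- the LMS specific feature, e.g. "discussion forum"
  uses        INTEGER   -- raw count of uses taken from the logs
);
SQL

With the data in that form, the adoption percentages and any of the more feature-sensitive measures of use could be calculated in exactly the same way regardless of which LMS the logs came from.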

References

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. World Conference on Educational Multimedia, Hypermedia and Telecommunications, Denver, Colorado, AACE.

Malikowski, S., M. Thompson, et al. (2006). “External factors associated with adopting a CMS in resident college courses.” Internet and Higher Education 9(3): 163-174.

Malikowski, S., M. Thompson, et al. (2007). “A model for research into course management systems: bridging technology and learning theory.” Journal of Educational Computing Research 36(2): 149-173.

PhD Update #22 – one day active, but some movement

The last week has seen only one day spent on the PhD. Mainly due to working on a conference paper. The good news is that the paper is connected to the PhD. It looks at mining LMS usage logs to generate indicators of patterns which may be interesting. The paper includes a comparison of LMS feature adoption between Blackboard (CQU’s existing institutional LMS) and Webfuse – the topic of the PhD. Webfuse comes out favourably from a couple of perspectives. More on this later.

What I did

The intent expressed in the last PhD update was to complete the Product section and make a good start on the pedagogy section.

In the one and a bit days I spent on the thesis I

So, one and a bit sections left to complete Product

What I’ll do next week

The plan is to complete product and hopefully complete pedagogy.

At this stage, I should have a fair number of days to work on the PhD, so I might get somewhere close.

E-Learning 2.0 and reliability of external services

BAM is a little project of mine playing at the edges of post-industrial e-learning. Since 2006 it’s been relying on students creating and using blogs provided by external service providers – mostly WordPress.com.

This reliance on external service providers has been one of the “problems” raised by folk who don’t like the idea. They fear that because the university doesn’t own the technology or have a contract with the service provider there is no certainty about the quality of service. That the students and the staff will be left high and dry as the service is yanked in the middle of term.

Those fears are not unfounded. There have been stories of Web 2.0 services disappearing in the middle of a course. However, my argument has always been that if you pick the right service providers and design systems to allow safe failure you can achieve generally better outcomes (for a cheaper price) than the mandate and purchase approach traditionally taken by institutions.

This post shares a recent experience that supports this argument and ruminates on some alternate explanations for why this approach might be better.

The story

Yesterday I received an email from one of the teaching staff involved in a course that is using BAM. The course has 170+ students spread across 5+ campuses using BAM with their posts being tracked and marked by 10 staff. Three of the students for this teacher are reporting that they can’t access their blogs.

While BAM allows students to create and use a blog on any service provider, we have found it useful to suggest providers whom we find reliable. Originally this was Blogger and WordPress.com; in the last year or so we’ve recommended WordPress.com only, i.e. based on our experience, we found WordPress.com more usable and reliable. I should point out, though, that the institution I work for does not have a formal agreement with WordPress.com. The students create free blogs on WordPress.com like any of the thousands of other folk who do so each week. I’ll pick up on this later.

After looking at the reported problem it was apparent that the blogs for the three students had been suspended because they had apparently contravened the WordPress.com terms of service (ToS). This meant that the students couldn’t post to their blogs and no-one could see any of the content posted to them. While it seemed unlikely that the students would have done anything to deserve this, it’s amazing what can happen. So the question was what had they done?

A key part of BAM is that it is designed to enable safe failure. If, as in this case, the student’s blog has disappeared – for whatever reason – it doesn’t matter. BAM keeps a mirror of the blog’s RSS/Atom feed on a university server. So while I couldn’t see the blog posts on WordPress.com, I could see the content on BAM. Nothing there looked like it would contravene the ToS.
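
The mirroring itself is nothing fancy. A minimal sketch of the idea (this is not BAM’s actual code; the file names and locations are made up for illustration) is a regular job along these lines:

# run regularly (e.g. hourly from cron)
# registered_feeds.txt: one "username feed_url" line per student blog registered with BAM
while read student feed
do
  # only replace the mirror copy if the fetch succeeds, so the last good
  # copy survives a suspended blog, a dead host, a changed URL, etc.
  curl -s -f "$feed" -o "mirror/$student.xml.new" &&
    mv "mirror/$student.xml.new" "mirror/$student.xml"
done < registered_feeds.txt

Marking and display then work from the mirrored copies, which is why the suspension of the three blogs didn’t stop anyone seeing the students’ posts.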

So the only way forward was to ask WordPress.com why they did this. This is where the fear of failure arises. I’ve seen any number of examples of technical support being horrible, including instances where the institution has paid companies significant amounts of money for support, only to receive slow responses that do little more than avoid the question or report “it looks alright from here”. If you get this sort of “service” from a supplier you pay, what sort of response am I going to get from WordPress.com?

Remember, these blogs are not my blogs. The blogs belong to students who attend the university I work for. A university WordPress.com is not likely to know anything about. A university they certainly don’t have any relationship with. In fact, it’s a university that appears to favour a competitor. Both the IT division and our Vice-Chancellor have blogs hosted by Blogger.

For these reasons, I was somewhat pessimistic about the response I would get. I was fearful that this experience would provide support for the nay sayers. How wrong I was.

12 hours after I contacted WordPress.com about this issue. I received an email response which basically said “Oops, sorry it looked like the profiles matched spammers. Our mistake. The blogs are back.”.

12 hours might seem like a long time if you’re picky. But I’m more than happy with that. It’s streets ahead of the response times I’ve seen from vendors who are being paid big money. It’s orders of magnitude better in terms of effectiveness.

Do one thing and do it well

It’s my proposition that if you choose a good Web 2.0 service provider, rather than being more risky than purchasing, installing and owning your own institutional version of the service, it is actually less risky, less expensive and results in better quality on a number of fronts. This is because a good Web 2.0 service provider has scale and is doing one thing and doing it well.

Unlike an integrated system (e.g. an LMS), WordPress.com only has to concentrate on blog engines. So its blog service is always going to be streets ahead of that provided by the LMS. Even if the LMS is open source.

A commercial LMS vendor is going to have to weigh the requirements of huge numbers of very different clients wanting different things, and consequently won’t be able to provide exactly the service the institution needs. Not to mention that they will be spread really thin to cover all the clients.

An open source LMS generally has really good support. But the institution needs to have smart people who know about the system in order to properly engage with that support and be flexible with the system.

There’s more to draw out here, but I don’t have time. Have a paper to write.

Learning requires willingness to suffer injury to one's self-esteem

Over recent weeks I have ignored Twitter, as it was consuming too much time and I have to focus on writing the PhD. There is a cost involved in doing this: you miss out on some good insights.

Aside: The quality of the insights you gather from Twitter is directly correlated with the quality of the people you follow. Listening to this podcast yesterday I heard the following description of the difference between Facebook and Twitter: Facebook is for the people you already know, Twitter is for those you don’t.

This morning I gave in and started up Nambu and have come across the following, very fitting quote

“Every act of conscious learning requires the willingness to suffer an injury to one’s self-esteem. That is why young children, before they are aware of their own self-importance learn so easily; and why older persons, especially if vain or important, cannot learn at all.” — Thomas Szasz, 1973

I plan to use this quote to argue that current approaches within universities – or at least those I’m familiar with – prevent learning.

Source

I came across this quote via a tweet by Gardner Campbell pointing to the first lecture by Michael Wesch. The quote is the lead in to the lecture.

Thomas Szasz is a somewhat controversial figure, so perhaps not the perfect source for a quote. But the quote does capture what I see as a key aspect of learning – and one that I personally struggle with.

Learning means being wrong

Szasz suggests you have to be willing to suffer through injury to your self-esteem to learn. To get it wrong. This connects with many of the other insights, quotes and perspectives on learning that I’ve seen and discussed on the blog. I’m sure there are many more.

Additional support for this idea comes from confirmation bias, the Tolstoy syndrome and pattern entrainment, not to mention the Golden Hammer law and status quo adherence. All summed up nicely by a quote from Tolstoy

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.

In order to learn something new you have to be prepared to think anew, critically examine what you currently take for granted and hold it up to the light of new insights to see if it is found wanting. While learning something new, you will make mistakes. In fact, there are any number of quotes around innovation that posit the importance of failure

If you’re not failing every now and again, it’s a sign you’re not doing anything very innovative. — Woody Allen

or

The essential part of creativity is not being afraid to fail. — Edwin Land

and

Success is on the far side of failure. — Thomas Watson Sr

Fear of failure is embedded in academia

Jon Udell has argued that academia is heavily focused on not being seen to make mistakes. Researchers only release ideas that are fully baked; half-baked ideas are discouraged

As Gardner Campbell observes in this article

For an academic, “failure” is often synonymous with “looking stupid in front of someone.” For many faculty, and maybe for me back in the 1980s, computers mean the possibility of “pulling a Charlie Gordon,” as the narrator poignantly terms it in Daniel Keyes’s Flowers for Algernon.

Fear of failure is made worse by managerialism

For quite some time I have been arguing that teleological approaches to online learning – and I now expand that to broader styles of management – within higher education are ill-suited to the challenge (Jones, Luck, McConachie and Danaher, 2005; Jones and Muldoon, 2007). Approaches to leadership and management that are driven by the current over-emphasis on efficiency and accountability are based heavily on teleological assumptions and, because of the mismatch, end up damaging universities.

But worse, at least from the perspective of learning, such approaches to leadership – at least as often practiced – are hugely fearful of failure. They seek to avoid it as much as possible. The SNAFU principle is a humorous explanation of this tendency for authoritarian hierarchies to screw up.

Of course there is also much written in the management and organisational research about this tendency. This post covers a small sample of it and includes the following quote from Argyris and Schon (1978, p116)

In a Model 1 behavioral world, the discovery of uncorrectable errors is a source of personal and organisational vulnerability. The response to vulnerability is unilateral self-protection, which can take several forms. Uncorrectable errors, and the processes that lead to them, can be hidden, disguised, or denied (all of which we call ‘camouflage’); and individuals and groups can protect themselves further by sealing themselves off from blame, should camouflage fail.

References

Jones, D., J. Luck, et al. (2005). The teleological brake on ICTs in open and distance learning. Conference of the Open and Distance Learning Association of Australia’2005, Adelaide.

Jones, D. and N. Muldoon (2007). The teleological reason why ICTs limit choice for university learners and learning. ICT: Providing choices for learners and learning. Proceedings ASCILITE Singapore 2007, Singapore.

Identifying file distribution on Webfuse course sites

As part of the thesis I’ve been engaging with some of the literature around LMS feature usage to evaluate the usage of Webfuse. A good first stab at this was reported in an earlier post. There were a number of limitations to that work; it’s time to expand a bit on it. To some extent for the PhD and to some extent because of a paper.

As with some of the other posts this one is essentially a journal or a log of what I’m doing and why I’m doing it. A permanent record of my thinking so I can come back later, if needed.

There’s even an unexpected connection with power law distributions towards the end.

Content distribution

In that previous post I did not include a graph/any figures around the use of Webfuse course sites to distribute content or files. This is because Webfuse had a concept of a default course site. i.e. every course would have the same basic default site created automatically. Since about 2001 this meant that every course site performed some aspect of information distribution including: the course synopsis on the home page, details about the course assessment, details about course resources including textbook details and a link to the course profile, and details about the teaching staff.

Beyond this, staff were able to upload files and other content as they desired, i.e. moving beyond the default course site was optional and left entirely up to the teaching staff. Some of us perhaps went overboard. Other staff may have been more minimal. The aim here is to develop metrics that illustrate that variability.

Malikowski et al (2007) have a category of LMS usage called Transmitting Content. The LMS features they include in this category include:

  • Files uploaded into the LMS.
  • Announcements posted to the course site.

So, in keeping with the idea of building on existing literature, I’ll aim to generate data around those figures. Translating those into Webfuse should be fairly straightforward; thinking includes:

  • Files uploaded into the LMS.
    Malikowski et al (2007) include both HTML files and other file types. For Webfuse and its default course sites I believe I’ll need to treat these a little differently:
    • HTML files.
      The default course sites produce HTML. I’ll need to exclude these standard HTML files.
    • Other files.
      Should be able to simply count them.
    • Real course sites.
      Webfuse also had the idea of a real course site. i.e. an empty directory into which the course coordinator could upload their own course website. This was usually used by academics teaching multimedia, but also some others, who knew what they wanted to do and didn’t like the limitations of Webfuse.
  • Announcements.
    The default course site has an RSS based announcements facility. However, some of the announcements are made by “management”, i.e. not the academics teaching the course but the middle managers responsible for a group of courses. These announcements are more administrative and apply to all students (so they get repeated in every course). In some courses they may be the only updates. These announcements are usually posted by the “webmaster”, so I’ll need to exclude those.

Implementation

I’ll treat each of these as somewhat separate.

  • Calculate # non-HTML files.
  • Calculate # of announcements – both webmaster and not.
  • Calculate # HTML files beyond default course site (I’ll postpone doing this one until later)

Calculate # non-HTML files.

Webfuse created/managed websites. So all of the files uploaded by staff exist within a traditional file system, not in a database. With a bit of UNIX command line magic it’s easy to extract the name of every file within a course site and remove those that aren’t of interest. The resulting list of files is the main data source that can then be manipulated.

The command to generate the main data source goes like this

find T1 T2 T3 -type f |                  # get all the files for the given terms
  grep -v '\.htm$' | grep -v '\.html$' | # remove the HTML files
  grep -v 'CONTENT$' |                   # remove the Webfuse data files
  grep -v '\.htaccess' |                 # remove the Apache access restriction file
  grep -v 'updates\.rss$' |              # remove the RSS file used for announcements
  grep -v '\.ctb$' | grep -v '\.ttl$' | grep -v '/Boards/[^/]*$' | grep -v '/Members/[^/]*$' |
  grep -v '/Messages/[^/]*$' | grep -v '/Variables/[^/]*$' | grep -v 'Settings.pl' |  # remove files created by the discussion forum
  sed -e '1,$s/\.gz$//'

The sed command at the end removes the gzip extension that has been placed on all the files in old course sites that have been archived – compressed.

The output of this command is the following

T1/COIT11133/Assessment/Assignment_2/small2.exe
T1/COIT11133/Assessment/Weekly_Tests/Results/QuizResults.xls
T1/COIT11133/Resources/ass2.exe

The next aim is to generate a file that contains the number of files for each course offering. From there the number of courses with 0 files can be identified, as can some other information. The command to do this is

sed -e '1,$s/^\(T.\/.........\/\).*$/\1/' all.Course.Files | sort | uniq -c | sort -r -n > count.Course.Files

After deleting a few entries for backup or temp directories, we have our list. The graph below shows a fairly significant disparity in the number of files – though the type of curve looks very familiar. To get there, the data needs to be turned into a CSV file and pulled into Excel.
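
One way to do the CSV step, using the count.Course.Files file generated above, is a one-liner (the output file name is made up):

# turn the "count course-prefix" pairs into "course-prefix,count" CSV for Excel
awk '{print $2 "," $1}' count.Course.Files > count.Course.Files.csv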

Number of uploaded files per Webfuse course site for 2005

In total, for 2005 there were 178 course sites that had files. That’s out of 299 – so 59.5%. This compares to the 50% that Col found for the Blackboard course sites in the same year.

Calculate # of Announcements

The UNIX command line alone will not solve this problem. Actually, think again, it might. What I have to do is:

  • For each updates.rss
    • count the number of posts by webmaster
    • count the number of posts by non-webmaster
    • output – courseOffering,#webmaster,#non-webmaster

Yep, a simple shell script will do it

echo COURSE,ALL,webmaster
for name in `find T1 -name updates.rss`
do
  all=`grep '<item>' $name | wc -l`           # all posts (RSS items) in the feed
  webmaster=`grep 'webmaster' $name | wc -l`  # posts made by the "webmaster"
  echo "$name,$all,$webmaster"
done

Let’s have a look at the 2005 data. Remove some dummy data, remove extra whitespace. 100% of the courses had updates. 166 (55%) had no updates from the teaching staff, 133 (45%) did. That compares to 77% in Blackboard. Wonder if the Blackboard updates also included “webmaster” type updates?

In terms of the number of announcements contributed by the teaching staff, the following graph shows the distribution. The largest number for a single offering was 34. Based on a 12 week CQU teaching term, that’s almost, on average, 3 announcements a week.

Number of coordinator announcements - Webfuse 2005

Power laws and LMS usage?

The two graphs above look very much like a power law distribution. Clay Shirky has been writing and talking about power law distributions for some time. Given that there appears to be a power law distribution going on here with usage of these two LMS features, and potentially that the same power law distribution might exist with other LMS features, what can Shirky and other theoretical writings around power law distributions tell us about LMS usage?
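
A quick way to eyeball this is a rank-size plot. The count.Course.Files data generated earlier is already sorted from largest to smallest, so pairing each course with its rank and plotting count against rank on log-log axes should give something close to a straight line if a power law is at work. A rough sketch (column layout assumed from the uniq -c output above):

# count.Course.Files lines look like "  37 T1/COIT11133/" (count, course prefix)
# produce "rank count" pairs for a log-log plot in Excel or gnuplot
awk '{print NR, $1}' count.Course.Files > rank.count

The same check could be run over the announcements data, and over any of the other feature adoption figures, to see how far the long tail Shirky describes extends into LMS usage.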

References

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.

Other information systems in higher education

The following is the next short, and still somewhat questionable, section of the Product component. A previous post discussed the limitations of an LMS; this section talks briefly about the other types of systems necessary for learning and teaching. The next section will talk about more abstract alternatives to those most commonly associated with the LMS.

Other systems

A university makes use of a large number of software applications, partly because creating a single application to run a business such as higher education is virtually impossible (Jones 2004). Universities have multiple constituencies – including parents, students, government, industry and alumni – and a need to maintain relationships with individuals that are now lifelong (Lightfoot and Ihrig 2002). Paulsen (2002) perceives e-learning to consist of a chain of four systems: content creation tools, learning management systems, student management systems, and accounting systems. The implication is that the LMS is only one component of the information systems ecosystem of a university. Institutions now have applications for financial management, human resources, admissions, recruitment, payments, procurement, research databases, course management, online library reserves, classroom scheduling, patient records, grant and contracts management and email (Lightfoot and Ihrig 2002). Over recent times many institutions have moved to enterprise systems that integrate student, financial and human resource systems (Duderstadt, Atkins et al. 2002).

It has been suggested that contemporary learning environments should integrate academic and administrative support services directly into the students’ environment (Segrave and Holt 2003). All too often the systems are not interconnected and present the user with a fragmented view of the institution (Lightfoot and Ihrig 2002). There is a general lack of integration amongst these systems (Paulsen 2002). The development of robust, institutional, technical infrastructure has become a major area of activity (Conole 2002). Large scale enterprise systems, while useful to the administrative side of the university, can work at odds with academic activities and force teaching and research to conform to business IT systems (Duderstadt, Atkins et al. 2002). Some attempts to increase the level of integration between academic and administrative systems have been made under the label of a managed learning environment (MLE).

While there remains some difficulty in defining the MLE as a concept, there is agreement that the MLE involves a whole institution approach that links systems and facilities that are already provided across the institution (Holyfield 2003). An MLE will include administrative information about courses, resources, support and guidance, collaborative information, assessment and feedback – essentially linking up to back-end office systems and databases (Conole 2002). Beyond integration with administrative systems, it has been suggested that, to fully reap the benefits of an LMS, institutions must integrate it with other systems including: identity directories, internal and external web sites, portals, library catalogs, multimedia and learning object repositories, e-portfolios, email, calendar, instant messaging, wikis, blogs, web conferencing, and other collaboration tools (Molina and Ganjalizadeh 2006). It is hypothesized that institutions implementing integrated systems will improve their chances of becoming successful, large-scale e-learning providers (Paulsen 2002).

Discussion of the benefits of integration brings us back to some of the limitations of the LMS discussed above. Integration through the use of monolithic solutions like ERP systems increases complexity and offers limited flexibility, and such systems are not designed to collaborate with other autonomous applications (Irani 2002). Based on this view, the very nature of most LMS – as examples of monolithic enterprise systems – would appear somewhat less than well suited to integration within an MLE. The difficulty of integration, and how alternative product models may provide different capabilities, is part of the focus of the next section.

References

Conole, G. (2002). "The evolving landscape of learning technology." ALT-J 10(3): 4-18.

Duderstadt, J., D. Atkins, et al. (2002). Higher education in the digital age: Technology issues and strategies for American colleges and universities. Westport, Conn, Praeger Publishers.

Holyfield, S. (2003). Developing a shared understanding of the Managed Learning Environment – the role of diagramming and requirements gathering, JISC.

Irani, Z. (2002). "Critical evaluation and integration of information systems." Business Process Management Journal 8(4): 314-317.

Jones, D. (2004). "The conceptualisation of e-learning: Lessons and implications." Best practice in university learning and teaching: Learning from our Challenges.  Theme issue of Studies in Learning, Evaluation, Innovation and Development 1(1): 47-55.

Lightfoot, E. and W. Ihrig (2002). "Next-Generation Infrastructure." EDUCAUSE Review: 52-61.

Molina, P. and S. Ganjalizadeh. (2006). "Open Source Learning Management Systems."   Retrieved 28 December, 2006, from http://www.educause.edu/LibraryDetailPage/666?ID=DEC0602.

Paulsen, M. F. (2002). "Online education systems in Scandinavian and Australian Universities: A Comparative Study." International Review of Research in Open and Distance Learning.

Segrave, S. and D. Holt (2003). "Contemporary learning environments: Designing e-Learning for education in the professions." Distance Education 24(1): 7-24.

LMS characteristics and limitations

This post follows on from previous posts and contributes the next bit of the Product component of my thesis.

Having given an overview of what a learning management system (LMS) is in the last post, this post looks at some of the characteristics and limitations of the LMS model. It’s not complete, but it’s a start.

LMS characteristics and limitations

The introduction of the LMS has started a new round in the struggle between the propensities of technology to define their own paths and academics’ appropriate desires to subordinate the technologies to the values and traditions of the academy (Katz 2003). As with any technology, LMS are not value neutral transmitters of facts but instead carry the values and priorities of their producers (Dutton and Loader 2002). This section agrees with the emergent perspective described by Markus and Robey (1988) – the perspective that sees the uses and consequences of information technology emerge unpredictably from complex social interactions, and that technology does not unambiguously determine outcomes. It also agrees with the view expressed by Kallinikos (2004) that systems can have profound effects on the structuring of work and the forms of human action they enable or constrain. This suggests that there is some value in examining the characteristics and limitations of technical systems. Consequently, this section draws on the literature around LMS to identify the characteristics and limitations of the LMS, and how those may enable or constrain learning and teaching.

Adoption of an enterprise LMS will require some standardisation of teaching and learning as all available functionality is provided by the system (Luck, Jones et al. 2004). An LMS, by its nature, is structured and has little capability for customisation (Morgan 2003). Current LMS are not customisable for instruction aimed at a specific audience with specific content (Black, Beck et al. 2007). As two of the most highly personalised sets of processes within institutions of higher education, any attempt at standardising teaching and learning is likely to be radical, painful and problematic (Morgan 2003). The pain of adoption is exacerbated by the LMS being a standardised product designed to support a non-standard base of university academics with different disciplines, teaching philosophies and instructional styles (Black, Beck et al. 2007).

There is, however, value in the standardisation inherent within an LMS as it reduces institutional pain during the selection process (Black, Beck et al. 2007). The same standardisation built into an LMS helps organizations deal with support and training as there is a fixed set of functionality. The design of an LMS is more concerned with providing the organisation with the ability to produce and disseminate information by centralising and controlling services (Siemens 2006). The standardisation embedded in the design of an LMS can create a number of operational conditions that push teaching and learning in a particular direction (Luck, Jones et al. 2004), at the very least limiting possibilities to those supported by the LMS. Managerialism may be the easiest and most natural path for a centrally managed LMS to take (Dron 2006).

The LMS model, with its nature as an integrated enterprise system, fits the long-term culture of institutional information technology and its primary concern with centralising and controlling information technology services with a view to reducing costs (Beer and Jones 2008). This approach increases the tensions created by a long-standing cultural divide within universities between the culture of administration – which values efficiency, principles of scientific management and standardised business processes – and the academic culture – which is more focused on tradition, erudition and innovation (Fernandez 2008). Management perceive information technology as a cost to be minimised while academics see it as a service to be customised for their idiosyncratic requirements (Jones 2004).

The design of an LMS embeds particular world views. For example, the Blackboard LMS – with its origins in the American higher education sector – embodies a particularly American view with “course” as the standard organisational unit within the system (Dron 2006). Rather than being a minor irritation, the inability to modify this assumption requires institutional practice to align with the system, rather than vice versa (Dron 2006). In addition, the course focus of most LMS makes it difficult to support communities of students outside of the course structure or to involve non-course participants in online courses (Beer and Jones 2008).

In terms of support for pedagogy, there are views that LMS, in general, do not dictate either a discipline or a pedagogy (Katz 2003). However, some have been designed with a pedagogical emphasis, generally constructivist (Stiles 2007), though none have entered the mainstream. Many LMS embed traditional teaching paradigms through name, metaphor and user interface (Dutton, Cheong et al. 2004; Stiles 2007). Examples include the use of common terms such as blackboard and gradebook and the use of university buildings to structure the user interface (Dutton, Cheong et al. 2004; Dron 2006; Stiles 2007). While the use of familiar concepts makes for a more intuitive interface (Stiles 2007), it can also lead to built-in constraints on the use of the LMS (Dutton, Cheong et al. 2004). The very design of the LMS can encourage horseless carriage approaches to e-learning, giving support to the observation that technology will most likely reinforce old systems rather than open new paths (Lian 2000).

The values embedded in many common LMS reveal a residue that is clearly transmissive and that adds to the banality, confusion and disappointment in the learning and teaching experience (Sullivan and Czigler 2002; deFreitas and Oliver 2005; Salmon 2005). The tendency towards behaviourist approaches to learning – with an emphasis on parcelling up knowledge into bite-sized chunks – is one of the great weaknesses of the contemporary LMS (Weigel 2005). LMS are largely based on training-type models founded on an overly simplistic understanding of the relationship between teachers, knowledge and student learning (Coates, James et al. 2005). A social constructivist approach to learning – with an emphasis on self-governed and problem-based activities – is not very well supported by the LMS (Dalsgaard 2006). The LMS assumption of a self-paced learner results in most LMS having only limited interaction or collaboration tools, such as simple chat rooms and discussion forums (Bonk 2002).

Most LMS support more or less the same pedagogy (Robson 1999). The nature of an integrated, enterprise system and its requirement for standardisation means it is unlikely that a single LMS will support more than one instructional theory, if that. This would appear problematic given the significant diversity in instructional theories adopted within a single university – whether implicitly or explicitly acknowledged (West, Waddoups et al. 2006). LMS need to become more flexible and customisable in form and allow students and faculty to choose among pedagogies in their structure (Katz 2003) to enable adaptation of the tool to fit each unique situation (West, Waddoups et al. 2006).

The standard and pre-established boundary to learning within an LMS is a course (Weigel 2005). Access to the resources, activities and people associated with learning – and subsequently the learning itself – is restricted to those individuals associated with a particular offering of the course, and further to the period when the course is offered (Beer and Jones 2008). Learners contribute to discussions that are closed and removed at the end of the course (Cameron and Anderson 2006). The model of many LMS implementations is equivalent to having students come on-campus blind-folded, taking them directly to their course-related activities, and not allowing them to see or speak to anyone not in their own course (Wise and Quealy 2006). Learning within an LMS is like a “walled garden”, outside of the context of the learner’s everyday life, environment and informal learning (Mentis 2008). The course focus of the LMS also places limits on management. For example, LMS provide only very limited functionality associated with reporting and usage monitoring at an institutional level across multiple courses (Morgan 2003).

The closed nature of many LMS goes beyond restrictions on learning. Embracing a new LMS has high entry costs because there are few efficient migration tools (Molina and Ganjalizadeh 2006). Restrictions on the migration of content, along with technical and financial factors, can make it difficult for institutions to migrate between different systems (Coates, James et al. 2005). An on-going challenge to management is the observation that e-learning technologies are undergoing a continual process of change (Huynh, Umesh et al. 2003) and that any frozen definition of “best” technology is likely to be temporary (Haywood 2002). The high cost of changing systems can contribute to lock-in (Davis, Little et al. 2008).

LMS vendors are trying to position their systems as the center-point for e-learning (Siemens 2004). The assumption of an enterprise system is that it provides all of the necessary services in one integrated whole. There are, however, increasing perceptions that the LMS may be less significant within an organisational online learning system (Davis, Little et al. 2008). The LMS may not, on its own, be sufficiently conducive to supporting the design, development and operations required within contemporary learning environments (Segrave and Holt 2003). This is a point expanded upon in the next section.

References

Beer, C. and D. Jones (2008). Learning networks: harnessing the power of online communities for discipline and lifelong learning. Lifelong Learning: reflecting on successes and framing futures. Keynote and refereed papers from the 5th International Lifelong Learning Conference, Rockhampton, Central Queensland University Press.

Black, E., D. Beck, et al. (2007). "The other side of the LMS: Considering implementation and use in the adoption of an LMS in online and blended learning environments." Tech Trends 51(2): 35-39.

Bonk, C. (2002). Collaborative tools for e-learning. Chief Learning Officer: 22-24, 26-27.

Cameron, D. and T. Anderson (2006). "Comparing Weblogs to Threaded Discussion Tools in Online Educational Contexts." International Journal of Instructional Technology and Distance Learning 2(11).

Coates, H., R. James, et al. (2005). "A Critical Examination of the Effects of Learning Management Systems on University Teaching and Learning." Tertiary Education and Management 11(1): 19-36.

Dalsgaard, C. (2006). "Social software: E-learning beyond learning management systems." European Journal of Open, Distance and E-Learning.

Davis, A., P. Little, et al. (2008). Developing an infrastructure for online learning. Theory and Practice of Online Learning. T. Anderson. Athabasca, Canada, AU Press: 121-142.

deFreitas, S. and M. Oliver (2005). "Does e-learning policy drive change in higher education? A case study relating models of organisational change to e-learning implementation." Journal of Higher Education Policy and Management 27(1): 81-95.

Dron, J. (2006). Any color you like, as long as it’s Blackboard. World Conference on E-Learning in Corporate, Government, Healthcare and Higher Education, Honolulu, Hawaii, USA, AACE.

Dutton, W., P. Cheong, et al. (2004). "The social shaping of a virtual learning environment: The case of a University-wide course management system." Electronic Journal of e-Learning 2(1): 69-80.

Dutton, W. and B. Loader (2002). Introduction. Digital Academe: The New Media and Institutions of Higher Education and Learning. W. Dutton and B. Loader. London, Routledge: 1-32.

Fernandez, L. (2008). "An antidote for the Faculty-IT divide." EDUCAUSE Quarterly 31(1): 7-9.

Haywood, T. (2002). Defining moments: Tension between richness and reach. Digital Academe: The New Media and Institutions of Higher Education and Learning. W. Dutton and B. Loader. London, Routledge: 39-49.

Huynh, M., U. N. Umesh, et al. (2003). "E-Learning as an emerging entrepreneurial enterprise in universities and firms." Communications of the AIS 12: 48-68.

Jones, D. (2004). "The conceptualisation of e-learning: Lessons and implications." Best practice in university learning and teaching: Learning from our Challenges.  Theme issue of Studies in Learning, Evaluation, Innovation and Development 1(1): 47-55.

Kallinikos, J. (2004). "Deconstructing information packages: Organizational and behavioural implications of ERP systems." Information Technology & People 17(1): 8-30.

Katz, R. (2003). "Balancing Technology and Tradition: The Example of Course Management Systems." EDUCAUSE Review: 48-59.

Lian, A. (2000). "Knowledge transfer and technology in education: Toward a complete learning environment." Educational Technology & Society 3(3): 13-26.

Luck, J., D. Jones, et al. (2004). "Challenging Enterprises and Subcultures: Interrogating ‘Best Practice’ in Central Queensland University’s Course Management Systems." Best practice in university learning and teaching: Learning from our Challenges.  Theme issue of Studies in Learning, Evaluation, Innovation and Development 1(2): 19-31.

Markus, M. L. and D. Robey (1988). "Information technology and organizational change: causal structure in theory and research." Management Science 34(5): 583-598.

Mentis, M. (2008). "Navigating the e-Learning Terrain: Aligning Technology, Pedagogy and Context." Electronic Journal of e-Learning 6(3): 217-226.

Molina, P. and S. Ganjalizadeh. (2006). "Open Source Learning Management Systems."   Retrieved 28 December, 2006, from http://www.educause.edu/LibraryDetailPage/666?ID=DEC0602.

Morgan, G. (2003). Faculty use of course management systems, Educause Centre for Applied Research: 97.

Robson, R. (1999). WWW-based course-support systems: The first generation. WWW-Based Course-Support Systems Seminar, Seattle, Washington.

Salmon, G. (2005). "Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions." ALT-J, Research in Learning Technology 13(3): 201-218.

Segrave, S. and D. Holt (2003). "Contemporary learning environments: Designing e-Learning for education in the professions." Distance Education 24(1): 7-24.

Siemens, G. (2004). "Learning Management Systems: The wrong place to start learning."   Retrieved January 12, 2007, from http://www.elearnspace.org/Articles/lms.htm.

Siemens, G. (2006). "Learning or Management System? A Review of Learning Management System Reviews." from http://ltc.umanitoba.ca/wordpress/wpcontent/uploads/2006/10/learning-ormanagement-system-with-reference-list.doc.

Stiles, M. (2007). "Death of the VLE? A challenge to a new orthodoxy." Serials 20(1): 31-36.

Sullivan, K. and P. Czigler (2002). "Maximising the educational affordances of a technology supported learning environment for introductory undergraduate phonetics." British Journal of Educational Technology 33(3): 333-343.

Weigel, V. (2005). "Course Management to Curricular Capabilities: A Capabilities Approach for the Next-Generation Course Management System." EDUCAUSE Review 40(3): 54-67.

West, R., G. Waddoups, et al. (2006). "Understanding the experience of instructors as they adopt a course management system." Educational Technology Research and Development.

Wise, L. and J. Quealy (2006). "LMS Governance Project Report." from http://www.infodiv.unimelb.edu.au/telars/talmet/melbmonash/media/LMSGovernanceFinalReport.pdf.