Assembling the heterogeneous elements for (digital) learning

Month: November 2008

Selecting a tripod

It’s been a week or so since I purchased the new camera. Last weekend I had to buy Aperture 2 for my Mac so I could easily pull the raw photos off the camera. I’ve been fairly happy with that purchase, and the photos have flowed.

The Beautiful Wife

I’ve only put 10 to 20 of them onto my Flickr photostream; a lot more are sitting on the hard drive, and I’ve yet to take on any assignments. I need to do this to start really learning something about photography, rather than relying on simple trial and error.

Of the photos I’ve placed on Flickr so far, the shot of the missus is the most visited. Though that probably has something to do with the title and/or the subject, rather than the quality of the photography.

Was going to try and pick a favourite, but couldn’t decide. I don’t mind this one of Zeke.

Reflective Rider

The next accessory

Now it is time to think about the next additional expense/accessory for the camera, a tripod.

The missus and I are off to Paris in a week or so. Paris in December means lights. Night-time displays of decorative, Xmas lights along the Champs-Elysees and department stores. A tripod is likely to be required.

I’m somewhat reluctant to go the whole hog and get a “real” tripod. The size, weight and apparent clumsiness of them suggest I don’t want to be walking around Paris with one. Which is why I was interested to see a gorillapod in a local store. It seemed small enough to fit into a bag.

There seem to have been some fairly positive reviews – one and two, and one a little more restrained, but still positive.

The last review makes the point that this is not a replacement for a real tripod. Mmm, do I want/need one for what I want to do?

Indications of limitations – blog based discussions

I’m trying to run an experiment in blog-based discussions. Trying to understand, from experience, the realities of using individual blogs for a multi-person discussion. It’s not going well.

The first problem was that WordPress’ pingbacks were not always working – as briefly mentioned in this post. The next problem is that links from an external blog (in particular one from Blogger) aren’t currently working in a way that is particularly useful for tracking a conversation.

Tony has attempted to join in the conversation from his Blogger-hosted blog via this post. He’s included a link to my original post in his post. Apparently Blogger doesn’t support trackbacks.

His link has not shown up on the post. However, the admin interface for my blog has a section that tracks all links pointing to my blog posts, and Tony’s link has shown up there. Which isn’t all that useful for folk trying to follow the conversation.

The question will be whether or not there is a configuration setting or a plugin I can enable on my blog to get around this problem. That’s a next task.

More on blogs and discussion

In some previous posts (the original post and the followup post) I’ve been playing around with using blogs for multi-person discussions. As yet, no-one else has joined in :(. Which is not surprising, to some extent, given at least one of the bits of my experiment in the followup post did not work.

This post attempts to explain what didn’t work and, in the process, introduce some basics of blogs and discussions.

Introducing pingbacks

In the followup post I included a link back to the original post. What was supposed to happen can be seen on this unrelated post. There are a number of responses to that post, including several made using the method I attempted.

WordPress calls this method a pingback. The concept of a pingback expands upon, and at least for some improves on, the idea of a trackback.

It looks like pingbacks don’t work all the time.
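Out of curiosity about what happens under the hood: the Pingback specification (which WordPress implements) says the linking blog first discovers the target’s pingback endpoint – advertised via either an X-Pingback HTTP header or a `<link rel="pingback">` element in the page – and only then makes an XML-RPC call. A minimal sketch of the discovery step in Python (the function names are mine, for illustration, not WordPress code):

```python
from html.parser import HTMLParser


class PingbackLinkParser(HTMLParser):
    """Find a <link rel="pingback" href="..."> element in a page's HTML."""

    def __init__(self):
        super().__init__()
        self.endpoint = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "pingback" and self.endpoint is None:
            self.endpoint = a.get("href")


def discover_pingback_endpoint(headers, body):
    """Per the Pingback spec: prefer the X-Pingback HTTP header,
    fall back to a <link rel="pingback"> element in the page body."""
    if "X-Pingback" in headers:
        return headers["X-Pingback"]
    parser = PingbackLinkParser()
    parser.feed(body)
    return parser.endpoint
```

With the endpoint in hand, the source blog would call `xmlrpc.client.ServerProxy(endpoint).pingback.ping(source_uri, target_uri)`. If either step fails silently – no endpoint advertised, or the XML-RPC call dropped – the pingback simply never appears, which may explain some of the unreliability.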

Some resources around blogs and discussion forums

Empty party room

In a previous blog post I tried/am trying to kick off an experiment in using a blog for a multi-person discussion as an attempt to answer a question we will have to address as part of the PLEs@CQUni project.

I’m hoping this is a party to which a few others will come.

This post is an attempt to illustrate one answer to the “mechanics” question, i.e. how might you do this and also to provide some pointers to existing information on this topic.

The mechanics

I’m posting this on my own blog. If I include a link to the previous post (as I did in the first sentence of this post) WordPress automatically tells the other blog, which then adds a link to this new blog post. The author of the original blog post gets an email from WordPress saying that someone has linked to the post. The linkage also shows up in the management interface of WordPress.

If you visit the previous blog post you should now see a link back to this post towards the bottom.

In theory, this lets each of the participants know when someone comments on their posts. It provides a set of connections between the different blogs – a way of generating a view of the discussion.
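The first step in that notification dance is the publishing blog scanning the new post’s HTML for outbound links, so it knows which targets to ping. A rough sketch of that link-scanning step (a simplification of what WordPress actually does, in Python rather than PHP):

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect absolute href values from <a> tags in a post's HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Only external, absolute URLs are candidates for pinging
            if href and href.startswith(("http://", "https://")):
                self.links.append(href)


def outbound_links(post_html):
    """Return the list of absolute links found in a post's HTML."""
    parser = LinkExtractor()
    parser.feed(post_html)
    return parser.links
```

Each link found this way would then be checked for a pingback endpoint and pinged; any link whose target doesn’t advertise an endpoint (a Blogger blog, say) just gets skipped.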

Some resources

This is not a new exercise; some existing information includes

An experiment in blog-based discussions

One of the major tools used (and mis-used) in most university-based e-learning is the discussion forum, or mailing list, or some other form of software for managing/creating multi-person dialogue. The PLEs@CQUni project is attempting to figure out and experiment with social media tools as a way to improve existing practice. An obvious need is to identify if, how and with what limitations these tools can be used to manage/create multi-person dialogues of the sort most academic staff associate with discussion forums.

The perceived need for this type of identification is mostly pragmatic. It is based on the observation that the decisions and actions people take are mostly based on patterns formed by previous experience. This is why most e-learning continues to be of the “horseless carriage” type. Being able to show academic staff that a new technology can re-create aspects of previous practice is an important step in getting them to move. This is the first step in the journey. We have to help get them out the door.

The question and assumptions

Chances are that blogs are going to be a major component of a PLE. Some of the more interesting work in this area certainly suggests this. So the modified question for this post and the activity I hope will arise from it is

What are the mechanics, benefits and limitations of using individual blogs to manage a multi-person discussion?

The idea is not to have a single blog on which everyone posts. The idea is to encourage the PLE type assumption where each participant in the discussion has their own blog and uses their blog to make their contribution.

The assumption should be that, if possible, each participant in the conversation can have their own blog on a different provider, i.e. not everyone should have to get a blog on the same provider to engage in the discussion.


This should be a type of action research. We’re not going to talk about it. We’re actually going to try and do it, and learn from the doing. This blog post will serve as the first part of an artifact that will arise out of this process. Those who participate will attempt to use this post as the first part of a conversation, by using their own blog.

Your task is to use whatever blog-based means you like to continue this conversation. The aim of the conversation is to discuss and come to some conclusion about the question listed above.

I will get the ball rolling by sharing some resources arising from a quick google. A link to it should appear below ASAP.

Information Systems Epistemology: An Historical Perspective


This is a summary, review, attempt to understand, and pick tidbits from the following book chapter

Hirschheim, R. A. (1992). Information Systems Epistemology: An Historical Perspective. In R. Galliers (Ed.), Information Systems Research: Issues, Methods and Practical Guidelines (pp. 28-60). London: Blackwell Scientific Publications.

It’s an attempt to start moving on Chapter 3 of my thesis.

Basic summary

Provides a basic overview/introduction to the history of epistemology

The bits I found particularly interesting, given my current state of understanding and work, include

  • quotes about the role of science being the search for understanding
  • Suggestions that it is better to view science as problem solving – a Popper quote that understanding is the same as problem solving. Particularly appropriate for what I understand as design research.
  • Very nice quote about researchers needing to be tool builders
  • Quotes from early anti-positivists of a need to complement positivism, not replace it.
  • Quote from Schutz about the main characteristics of social science being understanding, subjective meaning and action. Does this imply design research type work?


The paper aims to take a look at the history of epistemology within the IS discipline and consequently expose hidden assumptions beneath the conception of valid research and research methods.


It is my contention that information systems epistemology draws heavily from the social sciences because information systems are, fundamentally, social rather than technical systems.

The suggestion is that the natural sciences’ scientific paradigm is only as appropriate for information systems as it is for the social sciences.

Fundamental aspects of epistemology

epistemology – our theory of knowledge, how we acquire knowledge.

What is knowledge – the author considers it to be roughly synonymous with understanding.

Raises two questions

  1. What is knowledge – a simple problem
  2. How do we obtain valid knowledge – more problematic

What is knowledge?

Mentions the Greeks and their two types of knowledge

  1. doxa – knowledge believed to be true
  2. episteme – knowledge known to be true

This leads into the Sophists. How do we know something is true?

Author suggests it’s a straightforward problem. Since we cannot transcend our language/cultural system there is no chance of obtaining any absolute viewpoint. Hence knowledge must be “asserted”; knowledge claims are conceived in a probabilistic sense. Knowledge is not infallible but conditional – a social convention, relative to both time and place, of societal or group acceptance.

How do we obtain knowledge

“This is the role of science”. But science itself is related to societal norms and expectations, so it can be argued about. “In its most conceptual sense, it is nothing more than the search for understanding” (p30).

This implies that, given any particular cultural/societal view, just about any “scholarly” attempt at acquiring knowledge can be labeled “science”. The distinction between science and non-science is blurred if you take a “multi-cultural” approach; any particular culture may well have a fairly well-defined boundary.

“The conventions we agree to are those that have proved successful in the past. If, however, the conventions – and therefore our scientific process – cease to be successful then it would be time to reconsider” (p30-31)

It could be suggested that this is the origin of design research – a pushing of conventions because of perceived limitations.

Many of us are concerned that the present accepted research methods are no longer appropriate for the subject – indeed, they may never have been. What is needed is a fresh look at the field; in particular what is the most appropriate epistemological stance.

Science and method

Begins by laying the groundwork about the limitations/lack of success of the natural science approach. e.g. “yielded many knowledge claims but most do not have widespread community acceptance” (p31). Relates it to similar literature in the social sciences.

Some have suggested that science is better described in terms of problem or puzzle solving. If this is done, then many problems disappear because the emphasis has shifted away from correlations, statistical significance to simply looking for an appropriate way to solve a problem.

This is very much similar to some of the underpinnings of the design research work.

The author goes on to quote Popper (1972): “The activity of understanding is, essentially, the same as that of problem solving.” Science, in this view, becomes more about practical solutions to problems.

The following paragraph has some interesting implications for design research.

Some chose to view the process of problem solving as a craft (Pettigrew, 1985). Within this context the researcher should be viewed as a craftsman or a tool builder – one who builds tools, as separate from and in addition to, the researcher as tool users. Unfortunately, it is apparent that the common conception of researchers/scientists is different. They are people who use a particular tool (or a set of tools). This, to my mind, is undesirable because if scientists are viewed in terms of tool users rather than tool builders then we run the risk of distorted knowledge acquisition techniques. As an old proverb states: ‘For he who has but one tool, the hammer, the whole world looks like a nail’. We certainly need to guard against such a view, yet the way we practice ‘science’ leads us directly to that view.

Eventually gets to the point about positivism being the predominant conception of science. Defines it as “an epistemology which posits beliefs (emerging from the search for regularity and causal relationships) and scrutinizes them through empirical testing”.

Positivist science

Seeks to define/understand positivist science using five points

  1. Unity of the scientific method
    The scientific method – the accepted approach for knowledge acquisition – is universally applicable regardless of the domain of study.
  2. The search for Humean causal relationships
    There is a desire for regularity and causal relationships amongst the elements of the study. The whole is reduced into its constituent parts – reductionism.
  3. The belief in empiricism
    The only valid data are those experienced through the senses. Subjective perception, extrasensory experience, etc. are not acceptable.
  4. The value-free nature of science (and its process)
    There are no connections between the practice of the scientific method and political, ideological or moral beliefs.
  5. The logical and mathematical foundation of science
    They provide the formal basis for the quantitative analysis in the search for causal relationships.

Ontology of positivism

Ontology – the nature of the world around us and the part of it which the scientist examines

Positivism has a realist ontology. i.e. the universe consists of objectively given, immutable objects and structures that exist as empirical entities and are independent of the observer’s appreciation of them.

This contrasts with relativism or instrumentalism, which holds that reality is a subjective construction of the mind. The names and descriptions of reality that are communicated impact how reality is perceived and structured… more on this later.

Positivism has had success in the natural sciences; its record in the social sciences is somewhat more checkered.

Author provides some summaries of the historical development of epistemology and in particular draws on one provided by Ivanov (1984) which is shown in the following image.

Relevant schools of thought for information science (Ivanov, 1984)

History of IS epistemology

Divides up and introduces a history of IS epistemology in four stages (plus an emerging fifth)

  1. The arrival of positivism
    • starts with the dark ages, when the church and the study of God were the only intellectual pursuits, and traces the emergence of science through to the 17th century
    • Descartes as a major source of positivism. Mathematics as the sole basis for study. All properties could be reduced to mathematical form. Separation of mind and matter/mind and body.
    • positivism and empiricism came out of the late renaissance period. Bacon and the inductive-experimental method. Galileo – nature is consistent, not random. Newton stressing the need for experimental confirmation. Hobbes – humans could be studied using the same methods as physical phenomena.
    • and more into the 1900s.
  2. The entering of anti-positivism
    • Arriving in the latter part of the 19th century, concerned that positivism was missing the fundamental experience of life.
    • A number talked about the need for something apart from positivism – not something to replace it but something to complement it – hence the name: anti-positivism
    • Traces it back to a number of authors and gives summaries of their position. I found the description of Kant, interesting.
      Kant believed you achieve knowledge through a synthesis (which he called ‘transcendental’) of concept (understanding) and experience. The philosophy that arises is called ‘transcendental idealism’, in which there is a difference between theoretical reason (dealing with knowledge of appearances – the realm of nature) and practical reason (moral reasoning).

      Okay, so less interesting towards the end. Is this perhaps the limitation of the short description in the paper? What does Wikipedia say? Ahh, this quote piques my interest

      Kant argues, however, that using reason without applying it to experience will only lead to illusions, while experience will be purely subjective without first being subsumed under pure reason.

      I can see this being applied to a range of issues and problems I’m currently thinking of including: Kaplan’s law of instrument, teleological design etc.

      Similarly interesting is the discussion of Dilthey and a couple of comments on his beliefs. First, the suggestion that life cannot be “understood as a machine, as Hobbes suggested”. This might be useful for Sandy and her work.

      Secondly, that life cannot be understood using the explanatory model and its attempt to classify events according to laws of nature. Connections with Shirley’s theory of theories stuff.

      The wikipedia page on Dilthey has this to say

      Dilthey strongly rejected using a model formed exclusively from the natural sciences (Naturwissenschaften), and instead proposed developing a separate model for the human sciences (Geisteswissenschaften). His argument centered around the idea that in the natural sciences we seek to explain phenomena in terms of cause and effect, or the general and the particular; in contrast, in the human sciences, we seek to understand in terms of the relations of the part and the whole. In the social sciences we may also combine the two approaches, a point stressed by German sociologist Max Weber.

      And lastly, a nice quote for the Ps Framework

      Because individuals do not exist in isolation, they cannot be studied as isolated units; they have to be understood in the context of their connections to cultural and social life.

      which apparently is a quote from Polkinghorne (1983).

  3. The re-entering of positivism (through logical positivism)
    Logical positivism is suggested to be the dominant epistemology of contemporary science. But still rooted in positivism.

  4. The arrival of the contemporary critics
    Everyone’s a critic, so logical positivism didn’t last long. Some of the criticisms

    • Does not separate observable from theory
      What you observe is influenced by your theories. “In fact, it is unlikely that observation can be theory free”.
    • Lack of success in using deductive reasoning to overcome the problem of induction
    • the idea of value-free science
      In the guise of neutrality, the researcher is in fact tacitly supporting the status quo.

    Goes on about a range of others. Interestingly, Schutz influenced by Weber and Husserl got into phenomenology. “Schutz contended that Weber’s concept that the main function of the social scientist was to interpret, did not go far enough. He believed the main characteristics of social science must be ‘understanding’, ‘subjective’ meaning and ‘action'”. This has some interesting implications for the work of Hevner et al that separate out natural/social sciences from design – perhaps.

  5. Post-positivism – being a fifth stage which the author suggests is currently emerging
    Arising out of a growing band of researchers unhappy with positivism. Picks up the line of thought that knowledge is not apodeictic (i.e. a logical certainty, self-evident). Instead knowledge is accepted by some community which accepts it as an improvement on previous understanding.

    Suggested that it is more a belief about knowledge than a school of thought with agreed tenets. The wikipedia page seems to suggest a little differently

    The main tenets of postpositivism (and where it differs from positivism) are that the knower and known cannot be separated, and the absence of a shared, single reality.

    A part of this is methodological pluralism, i.e. there is no one correct method, simply many that may be contingent on the problem being studied or the ‘kind’ of knowledge desired.

First photo

Memory card (SanDisk eXtreme 3 8Gb) and bag (Tamrac expedition 5x) have been successfully purchased and returned home. The camera is finally together and the first shot has been taken.

First shot with the new camera

And yes, the image files are significantly larger than those produced by the C-770. Seems there will be some reading to do – both on simple camera operation and, potentially more importantly, on photography in general.

What to “read”?

Being a “net” sort of guy, one would assume that I should be looking to the net for resources and communities in which to participate, rather than follow my first instinct to refer to the couple of books we bought for Anna last Xmas.

Given I’m a user of Flickr, perhaps that’s a good place to start. Searching the groups part of Flickr reveals that there are 1282 groups about “learning” and “photography”. Sorting by group size reveals that the group search ain’t all that effective.

A quick look at the Flickr groups and nothing strikes me as interesting. I did get taken to Digital Photography School, which looks interesting.

A google search reveals a number one hit which is like coming home. Philip Greenspun is almost an “old friend” given his work in web-based systems development in the 1990s. Don’t think I’ll be reading too much, at least initially, online.

Kant – separation of reason and experience


I’m slowly working through some PhD related work (the post on the paper I’m reading will come out later today) and that brought me across the following description of an argument of Kant’s from the wikipedia page on Kant

Kant argues, however, that using reason without applying it to experience will only lead to illusions, while experience will be purely subjective without first being subsumed under pure reason.

I haven’t had time to follow up on this or to go to the original source, so the following may suffer for that. However, I find that I interpret this as being very connected to what I’m currently doing and writing about.

Separation of expert analysis/design and lived experience

My understanding is that Kant was arguing against both the empirical and the rational view of the world/philosophy. To some extent (possibly doubtful in its validity) I see a connection here with some of the problems I’ve been writing about.

The rational world, in my thinking, can be ascribed to aspects of the “expert designer” approach. An expert/consultant/designer in information technology, curriculum, organisational structure applies a range of theories and rules of thumb to design a solution. Such an expert has varying but only small amounts of experience with what actually goes on in the context.

For example, a curriculum designer doesn’t really know what goes on in a course. What the students experience, what the staff say and do etc. What knowledge they do have is based on less than perfect methods such as observation, evaluation results and self-reporting of the students and staff.

The lack of understanding of the lived experience limits what they can see and do. They generally aren’t aware of, or abstract away, the complexities of the connections between elements within such a system (I should point out that I’m talking primarily about design that happens within human organisations).

As a result of this lack, any solution is likely to be less than perfect.

On the other hand, the academic who is teaching the course (typically) has a large amount of lived experience. A deep understanding of what happens in the course. However, it will be somewhat limited by their patterns and what they are trained to see. In addition, (typically) they will also have no understanding of the various theories and rules of thumb that can help understand what happens and design new interventions.

So, as the Wikipedia author ascribes to Kant: solutions developed purely by an expert designer, without experience, will lead to illusion, while a solution based solely on experience will be purely subjective.

There needs to be a strong and appropriate mix of reason and experience. The right mix of practice and theory.

Implications for information technology

I wonder what this perspective would say about information technology development projects that develop entire systems divorced from experience/reality until they are completed and ready to be put into place?

Implications for the PLE project

For the PLEs@CQUni project this implies that the research project, in order to encourage use of PLE related concepts within learning and teaching, needs to be informed by both theory and experience.

Starting a new journey and hobby – photography

Bit the bullet yesterday and upgraded to a DSLR. The Olympus C-770 we’ve had for almost four years seems a bit the worse for wear after Zach and Zeke dropped it while battling for control. Plus there are the size issues, and 4MP is not cutting it anymore.

TheKids v1

The C-770 has some nostalgia value. It was purchased the day Zach was born back in 2005, primarily for the purpose of recording the new arrival and his impact on the rest of the family. Sandy complained, or at least teased me, about the price, but not the results.

The replacement

The new camera

So a few tens of thousands of shots later the C-770 is being replaced by a Sony A300 DSLR. The C-770 will be passed onto the boys as a toy.

And so begins the journey of trying to understand and use the new beast to a level worthy of the price. The price is a bit of a step up, so some extra effort is going to be required to justify it. In particular, time to learn something about focal lengths, aperture and the other dark arts of photography.

So in the search of a new hobby I’ve decided to take this on and blog my learning journey. Or at least that is the current intention. How far the intention lasts in the cold light of reality will remain to be seen.

First step, memory card

While unpacking and putting it together it became obvious that two important bits were missing: the battery and a memory card. The battery turns out to have been an oversight by the store. The camera was the last one in stock – it was on display – and consequently all the bits were spread around the place, not neatly residing in a shrink-wrapped box.

The second was the memory card. The camera did not come with one, and such a thing is necessary if you wish to actually take photos that are recorded for future viewing pleasure. So the journey begins to find out what I should buy.

From this page the details are

memory card slot accepts CompactFlash® (Type I and II), and Microdrive™ (also accepts Memory Stick® PRO Duo™/Memory Stick PRO-HG Duo via an optional CF card adapter)

From this it would appear that the CompactFlash cards are the way to go. An adaptor doesn’t sound like a plan for long term hassle-free use. The joy of the internet is that someone has always asked the question and gotten some answers. Not to mention articles giving help on the selection process.

The first suggestion, not surprisingly, is to get as large a card as possible. The CNET review suggests at least 4GB for a 10MP camera. First indications suggest at least $AUD100. Ahh, the Sony site is listing an 8Gb 300x card for $300. 2Gb ($59.95) and 4Gb ($94.95) 133x cards are a bit cheaper.
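Assuming those quoted prices, a quick price-per-gigabyte comparison suggests the 4Gb card is the best value; the 2GB card works out at roughly $30/GB, the 4GB at around $23.74/GB, and the 8GB 300x at $37.50/GB. The sums, as a small sketch:

```python
# Prices as quoted above: name -> (capacity in GB, price in AUD)
cards = {
    "2GB 133x": (2, 59.95),
    "4GB 133x": (4, 94.95),
    "8GB 300x": (8, 300.00),
}


def cheapest_per_gb(cards):
    """Return the name of the card with the lowest price per gigabyte."""
    return min(cards, key=lambda name: cards[name][1] / cards[name][0])
```

Of course the 300x card is also faster, which this comparison ignores entirely.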

Probably cheaper online, but I feel the need to start using the camera now. So I’ll have to take what I can get in town today.

A bag?

Of course, I’m also going to have to get a bag to carry all this stuff around in. More expense. I can hear Sandy grinding her teeth already, and quietly calculating how this can be used as ammunition to obtain her objectives.

Don’t like the normal camera bag “over the shoulder” look. Wonder if they make back packs that are suitable? Again, probably going to be limited to what I can find in town.

More on the expert designer – efficiency and effectiveness

A previous post has gotten a comment which I want to follow up on. The interface for writing a post gives more opportunity to be creative than that provided to add comments.

A clarification of the intent

Due to a few factors my intent may not have been clear. So one more attempt at clarity.

Let’s concentrate on one level, rather than the 3 or 4 I used in the original post. Perhaps the most connected to my current work is that of teaching and the common saying that modern teachers need to “not be the sage on the stage, but become the guide on the side”.

Sage on the stage

This is the age old image of the university course and its face-to-face sessions. The primary purpose of the professor is to analyse the topic area, identify what is important and deliver it to the students. The professor is the expert designer. The sage on the stage.

The content of the course is packaged into a format entirely controlled by the professor. A format that fits the expert designers conception of what it should look like.

Guide on the side

The alternative recognises that the learner needs to be much more in control of their learning. They have to actively construct learning through activities, tasks and approaches that are most suitable for them.

In this model, the professor gives up much of the control associated with the expert designer approach. Instead they concentrate on providing scaffolding, encouragement and guidance to the learner to aid them in their journey through the content. The design of the specific learning experience is largely the responsibility of the learner.

Spectrum not a dichotomy

It should be pointed out that this is not a dichotomy. You don’t have two extreme boxes – the expert designer option in which the designer controls all at one end, and each individual doing their own design at the other.

Instead there is a full spectrum of approaches in between, where the control of the designer becomes less and less.

A software example

A software example would be a WordPress without plugins: any and all new features for WordPress would be under the control of the WordPress software developers. The expert designers.

By providing support for plugins, WordPress allows aspects of control and design to be broadened to a more diverse group.
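The plugin idea can be sketched as a hook registry – loosely modelled on WordPress’s add_action/do_action, though this toy (in Python rather than PHP) is mine, not WordPress code. The core only defines the hook points; what runs at each point is open to anyone:

```python
class Hooks:
    """A minimal hook registry: core code fires named hooks,
    plugins register callbacks without touching core code."""

    def __init__(self):
        self._actions = {}

    def add_action(self, name, callback):
        """A plugin calls this to attach behaviour to a hook point."""
        self._actions.setdefault(name, []).append(callback)

    def do_action(self, name, *args):
        """Core calls this at a hook point; every registered callback runs."""
        return [callback(*args) for callback in self._actions.get(name, [])]


hooks = Hooks()
# A "plugin" extending publish behaviour, no core changes required
hooks.add_action("publish_post", lambda title: f"notify: {title}")
```

The design choice is the point of the spectrum argument: the core developers decide where the hook points are, but everything that happens at those points is designed by others.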

The comments

There will always be experts because it is more efficient for an organization to use division of labour techniques to maximize greater skill levels and greater productivity as a whole.

There are three ways I’d respond to this

  1. There is more to life than efficiency.
  2. The measurement of efficiency is a highly questionable exercise.
  3. I’m not sure the “expert” route is always more efficient.

More to life than efficiency

I can think of two characteristics that are often in competition with efficiency.

  1. Effectiveness

    Teaching a course with a single academic is considerably more efficient than teaching it with 5. However, for a variety of reasons (e.g. more academics means more people marking, which might mean quicker turnaround on feedback and better quality and quantity of feedback, which probably means better learning), doing it with 5 might result in a more effective outcome.

  2. Ability to adapt
    When things change you have to have some “fat” to enable change. In terms of organisations and innovation, Christensen’s disruptive innovation work seems to indicate that having and allowing different approaches is actually a good thing.

Measurement of efficiency is questionable

How and who measures what is efficient?

About 6 or 7 years ago I was fighting battles with a group of “expert designers” responsible for the institutional ERP. The group I worked with had created a web-based system for academic staff to view data in the ERP (i.e. student records). The system did a lot more than this, but this was the focus of the ERP group.

One of their arguments was that having two systems was inefficient. Instead of using our duplicate (shadow) system, academics should be using the ERP provided system. It was more efficient this way. The university didn’t have to support and maintain two different systems.

That sounds right doesn’t it? If you based your assumptions solely on what appeared in the university accounting system you would be right.

However, if you knew the organisation in a little more detail than is captured in the accounts, you would be aware that the ERP system’s approach was taking academic staff 20 minutes to generate a simple list of students in a course. And this is one of the simplest tasks academics needed to do.

The web-based duplicate system we’d developed could do it in under a minute.

Reliance on the ERP system was requiring at least one faculty to employ additional staff to perform this task for the academics. In other faculties, academics were having to waste their time performing this task or weren’t doing it.

Is that efficient?

The expert route isn’t always efficient

I think the above story also illustrates how the expert route isn’t always more efficient. Sometimes (often?) the experts get caught up in the law of instrument. They did in the above case: all they had was an ERP, so every problem had to be solved with the ERP, even though it was inefficient and terrible.

Trust and the expert

The organisation has to trust the experts to provide the information from their area of expertise.

One of the problems with experts is the law of instrument. They start to see every problem through the lens of their expertise, even when it isn’t appropriate.

Experts, especially those in support/service positions, tend to over-emphasise the requirements of their expert area over the broader needs of the organisation.

PLEs and experts

I would have thought that PLEs would lead to MORE specialization as it is far easier to build a targeted learning path to turn out experts.

I think we’re getting back to the area of confusion.

Currently, when it comes to providing the tools for students to use for e-learning, most institutions use the expert designer approach. The IT unit goes out and evaluates all the available tools, makes the most appropriate choice and everyone uses it.

The extreme PLE approach is that the institutional experts don’t select or design anything. Each individual student takes on the role of designer. They are more familiar with what they have used before, what they want to do. They do the design.

In reality, at least in the work we’ve done so far, the truth is somewhere in between. The institution minimises the design of technology but still provides some scaffolding, direction and support to help the learners make their own choices.

Tool users, research, hammers and the law of instrument

The following quote is from (Hirschheim, 1992) and is questioning the practice of research/the scientific method

Within this context the researcher should be viewed as a craftsman or a tool builder – one who builds tools, as separate from and in addition to, the researcher as tool users. Unfortunately, it is apparent that the common conception of researchers/scientists is different. They are people who use a particular tool (or a set of tools). This, to my mind, is undesirable because if scientists are viewed in terms of tool users rather than tool builders then we run the risk of distorted knowledge acquisition techniques. As an old proverb states: ‘For he who has but one tool, the hammer, the whole world looks like a nail’. We certainly need to guard against such a view, yet the way we practice ‘science’ leads us directly to that view.

Using a hammer to make an omelette

I’ve used this image in a recent presentation as a background to an important point that I’ve hammered again and again and again. “If all you have is a hammer, then everything is a nail”.

Apparently this is called the law of instrument and came from Abraham Kaplan’s The conduct of inquiry: Methodology for behavioural science, first published in 1964.

Information technology

There is a false dichotomy often trotted out in the practice of information technology: buy versus build. The impression being that “building” (being a tool builder) is a bad thing as it is wasteful. It’s seen as cheaper and more appropriate for the organisation to be a tool user.

As the “buy” option increasingly wins over the “build” option I believe I am increasingly seeing the law of instrument raise its ugly head within organisations. The most obviously bad example of this I’ve seen is folk wanting to use a WebCT/Blackboard course site for a publicity website. But there are many, many others.


You can see this in the group of staff (and institutions) who have “grown up” in e-learning with learning management systems. Their hammer is the LMS. The LMS is used to beat up on every learning problem because it is seen as a nail.

This is especially true of LMS support staff who do not have a good foundation knowledge in technology and learning and teaching. Every problem becomes a question of how to solve it within the LMS. Even though the LMS may be the worst possible tool – like making an omelette with a hammer.

Asking tool users what they’d like to do

A common research method around new types of technology in learning and teaching sees the researcher developing a survey or running focus groups. These are targeted at a group of people who are current tool users. For example, students and staff of a university currently using an LMS. The research aim is to ask these “tool users” what they would like to do with a brand new tool, often one based on completely different assumptions or models from the tool they are using.

This approach is a bit like giving stone age people a battery-powered knife (a knife that is powered and cuts “by itself”). They’d simply end up using it like they use their stone axes and bang what they are cutting. They have been shaped by their tool use. They will find it difficult to imagine the different affordances the new tool provides until they’ve used it.


I believe this was the context in which Kaplan first articulated the law of instrument: folk who get so caught up in a particular research methodology that they continue to apply it in situations where it no longer works.

The model underpinning blackboard and how ACCT19064 uses it

As proposed earlier, this post is the first step in attempting to evaluate the differences between three learning management systems. It attempts to understand and describe the model underpinning Blackboard version 6.3.


Blackboard, like most web-based systems of a certain vintage (mid-1990s to early/mid 2000s), tends to structure websites as a hierarchical collection of documents and folders (files and directories for those of us from the pre-desktop-metaphor interface days). This approach has its source in a number of places, but most directly it comes from computer file systems.

Webopedia has a halfway decent page explaining the concept. The mathematicians amongst you could talk in great detail about the pluses and minuses of this approach over other structures.

In its simplest form a hierarchical structure starts with

  • A single root document or node.
    Underneath this will be a sequence of other collections/folders/drawers.

    • Like this one
    • And yet another one
      Each of these collections can in turn contain other collections.

      • Like this sub-collection.
      • And this one.
        This hierarchical structure can continue for quite some time. Getting deeper and deeper as you go.
    • And probably another one.
      Best practice guidelines are that each collection should never contain much more than 7±2 elements, as this is a known limitation of short-term memory.
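The nested list above can be sketched as a simple tree. This is a minimal, illustrative model of a hierarchical content structure; the class and field names are my own, not anything from Blackboard.

```python
# A minimal sketch of a hierarchical content structure: each node is either
# a lone document or a collection that contains further nodes.
class Node:
    def __init__(self, title, children=None):
        self.title = title
        self.children = children or []

    def depth(self):
        """Depth of the tree rooted at this node (a lone document has depth 1)."""
        if not self.children:
            return 1
        return 1 + max(child.depth() for child in self.children)

root = Node("Course", [
    Node("Week 1", [Node("Lecture slides"), Node("Activity sheet")]),
    Node("Week 2", [Node("Lecture slides")]),
])
print(root.depth())  # → 3
```

The 7±2 guideline translates into keeping each node's `children` list short rather than letting any one collection sprawl.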

Blackboard’s idea of hierarchy

One of the problems with Blackboard is that its underpinning models don’t always match what people assume from what they see in the interface. This applies somewhat to the hierarchical model underpinning Blackboard.

Normally in a hierarchical structure there is one root document or node at the “top” of the pyramid of content. In Blackboard, each course site has a collection of content areas and you nominate one of those as the “home” page, i.e. the one that appears when people first log in. It’s not really the top of the pyramid.
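That difference can be made concrete with a small sketch: rather than a tree with a single root, a course is a flat collection of content areas with one nominated as the entry point. The names here are illustrative, not Blackboard's actual schema.

```python
# Sketch of Blackboard's variation on hierarchy: a course is a flat
# collection of content areas, one of which is nominated as the "home"
# page. There is no true root node sitting above the others.
course = {
    "content_areas": ["Notice Board", "Resources Room", "Assessment Room"],
    "home": "Notice Board",  # nominated entry point, not a structural parent
}

def entry_page(course):
    """Return the nominated home page, which must be one of the areas."""
    assert course["home"] in course["content_areas"]
    return course["home"]

print(entry_page(course))  # → Notice Board
```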

Let’s get to an example: the image below is the home page for the ACCT19064 course.

ACCT19064 home page

Note: I currently have “admin” access on this installation of Blackboard. Some of what appears in the interface is based on that access and is not normally seen by student or other staff users.

The links in the course menu on the left hand side of the image are (mostly) the top levels of the hierarchical structure of the Blackboard course. There are 13 main areas

  • Machinimas
  • Hird & Co
  • Feedback
  • Notice Board
  • Discussion Board
  • Group Space
  • Resources Room
  • Assessment Room
  • Instructor Resources
  • Teaching Team
  • External Links
  • Library
  • Helpdesk

The “home page” in the large right hand frame, which would be at the top of the hierarchy if Blackboard followed this practice, is the Announcements page. The link to the announcements page in ACCT19064 is provided by the “Notice board” course menu link.

The other complicating factor is that the course menu links for Helpdesk and Library aren’t really part of the Blackboard course site. They are links to other resources.

Feature: Blackboard allows top level folders to be links to external resources and also ad hoc elements within the course site.

The bit above is a new strategy. Every time I come to something that I think is somewhat strange/unique, or a feature of Blackboard, I am going to record it in the text and also on this page.

Feature: The course home page can be set to a selection of “pages” within the site.

Other bits that don’t fit

Underneath the course menu links there are a couple of panels. The content of some of these can be controlled by the coordinator. In the example above the designer has removed as many of the links on these panels as possible. Normally, there would be two links

  • Communication; and
    Links to a page of the default communication tools Blackboard provides each course including: announcements, collaboration, discussion board, group pages, email and class roster.
  • Course tools
    Links to a page of the default course tools (not communication tools) Blackboard provides including: address book, calendar, journal, dictionary, dropbox, glossary….. This list can be supplemented.

The links to these tools are not part of the hierarchical structure of the course. They are always there, though the designer can remove the links. Confusingly, most staff leave these links and so students waste time checking the tools out, even if they aren’t used in the course (and most aren’t).

Feature: Blackboard does not maintain the hierarchy metaphor at all well. It confuses it with “tools” which sit outside the hierarchy.

The course map feature

To really reinforce the hierarchical nature of a Blackboard course site, Blackboard provides a course map feature which provides a very familiar “Windows explorer” link representation of the structure of a course website. The following image is of a part of the course map for the ACCT19064 course site.

Blackboard course map for the ACCT19064 site

What do the course menu links point to?

The links in the course menu can point to the following items

  • Content area
    This is the default content holder in Blackboard. If the designer wants to create a collection of content (HTML, uploaded files, tools etc.) they create a content area. More on these below.
  • Tool link
    This is a link to one of the communication or course tools mentioned above.
  • Course link
    A link to some other page/item within the course, usually within a content area.
  • External link
    A URL to some external resource.

The course menu links for ACCT19064 point to the following

  • Machinimas – content area with 5 elements
  • Hird & Co – content area with 3 elements
  • Feedback – content area with 3 elements
  • Notice Board – link to the announcements tool
  • Discussion Board – direct link to the course level discussion conference
  • Group Space – a content area with 5 elements
  • Resources Room – a content area with 15 or so elements
  • Assessment Room – a content area with 6 elements
  • Instructor Resources – a content area with 1 element (see below)
  • Teaching Team – a link to the “Staff Information” tools
  • External Links – a content area with a number of links
  • Library – a direct link to the Library website.
  • Helpdesk – a mailto: link for the helpdesk

The number of elements I mention in each content area might be wrong. Blackboard supports the controlled release of content in a content area. Some people may not be able to see all of the content in a content area – explained more in the “Controlling Access” section below.

What’s in a content area?

A content area consists of a number of items, displayed one under the other. The following image is of the Assessment Room in the ACCT19064 course site. It has 6 items. Note the alternating background colour used to identify different items.

ACCT19064 assessment room

The edit view link in the top right hand corner appears when the user has permission to edit the content area. This is how you add, modify or remove an item from the content area.

An item in a content area can be one of the following

  • A “content” item
    i.e. something that contains some text, perhaps links to an uploaded file.
  • A folder
    This is how you get the hierarchical structure. A folder creates another level in the hierarchy within/underneath the current content area. This folder contains another content area.
  • An external link.
  • A course link.
  • A link to various tools or services.
    e.g. to tests or a range of different services and tools provided by Blackboard and its add ons.

Each item is associated with a particular “icon”. A folder item will have a small folder icon. A content item will have a simple document.

Feature: The icon associated with each item cannot easily be changed, especially for an individual course. Nor can it easily be made invisible, which causes problems for designers.

For example, the following image is from what was intended to be the home page for a Blackboard course. A nice image and text ruined by the document icon in the top left hand corner.

Controlling access

Blackboard provides a facility to limit who can see and access items within a content area and also what links can be seen in the course menu. However, it’s not done consistently.

Feature: Different approaches with different functionality are available to restrict access/viewing of the course menu links (very simple) and individual items in content areas (very complex and featured). Restrictions on discussion forums also appear somewhat limited.

The following image is of the “Instructor Resources” content area of the ACCT19064 course site. It is being viewed as a user who is not a member of the staff for this course. Actually, not a member of the Blackboard group “Teaching Staff”.

ACCT19064 - Instructor Resources - not staff

What follows is the same page with the same content area. However, it is now viewed as a user who is a member of the “Teaching staff” group.

ACCT19064 Instructors Resources - as staff

Access to items can be restricted in the following ways

  • Visible or not
    A simple switch which says everyone can see it, or they can’t.
  • Date based ranges
    Specify a date/time range in which it is visible.
  • Group based membership
    You can see it if you are part of the right group or in a specified list of users.
  • Assessment related
    You can only see it if you have attempted a piece of assessment or achieved a grade within a specific range.

The specification of rules to restrict access can be combined.

Feature: Access to items can be restricted based on simple on/off, date, group membership, assessment.
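Since the rules can be combined, access works like a conjunction: an item is visible only if every attached rule passes. The sketch below models that behaviour; the rule and field names mirror the list above but are assumptions of mine, not Blackboard's implementation.

```python
import datetime

# Sketch of combined adaptive-release rules: an item is visible only if
# every attached rule passes (an empty rule list means visible to all).
def visible(item, user, now):
    for rule in item.get("rules", []):
        if rule["type"] == "switch" and not rule["on"]:
            return False
        if rule["type"] == "date" and not (rule["start"] <= now <= rule["end"]):
            return False
        if rule["type"] == "group" and user["group"] not in rule["groups"]:
            return False
        if rule["type"] == "assessment" and user.get("score", 0) < rule["min_score"]:
            return False
    return True

quiz_solutions = {"rules": [
    {"type": "date",
     "start": datetime.datetime(2008, 7, 14),
     "end": datetime.datetime(2008, 11, 1)},
    {"type": "group", "groups": {"Teaching Staff", "Students"}},
]}
student = {"group": "Students"}
print(visible(quiz_solutions, student, datetime.datetime(2008, 8, 1)))  # → True
```

The same student viewing after the end date, or a user outside the listed groups, would see nothing, which matches the "weekly quizzes and solutions (available only after a specified time)" behaviour described later.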

A description of the ACCT19064 site

At first look, the course site is designed as a container for the content and tools used within the course. The design of the course site itself does not inherently provide any guidance as to what the students are meant to do, i.e. there is no study schedule or similar showing up in the course menu links.

However, looking at the announcements for the course, it appears that this type of guidance and support is given by an announcement from the course coordinator on the Sunday at the start of each week. This announcement is very specific. It outlines the individual and team-based tasks which the on-campus and off-campus students are meant to complete. There are also some additional comments, sometimes errata and sometimes the odd bit of advice.

Interestingly, these guidance announcements didn’t link students directly to the tasks.

Breaking down the content of the site

The following describes in more detail the content within each of the course menu links, at least those that point to content areas.

  • Machinimas – content area with 5 elements. There is no adaptive release.
    • Description of the machinimas and their purpose.
    • Four external links to web pages that contain video of the machinima.

    Feature: Blackboard uses breadcrumbs for navigation. Including external web pages can be made more transparent if the breadcrumbs can be re-created on those external pages.

    The machinima pages, with the video playing, look like the following image

    ACCT19064 machinima page

  • Hird & Co – content area with 3 elements. No adaptive release
    This is meant to represent the imaginary audit company used throughout the course

    • External link to an “intranet” site for an imaginary audit company.
    • External link to an external discussion forum used by AIC campuses for discussion about questions prior to face-to-face classes.
    • A link to a Blackboard discussion forum used by the Audit Partner (the coordinator)
  • Feedback – content area with 3 elements
    Actually it’s 4 elements; one is not visible due to adaptive release. The missing element is a course experience survey. This section is entirely set aside for getting feedback from students about the course. I’m guessing it was added late in the term.

    • Content item linking to an information sheet
    • Content item linking to a consent form
    • A link to survey tool for the actual survey
    • An external link with help on an issue with the survey.
  • Notice Board – link to the announcements tool
  • Discussion Board – direct link to the course level discussion conference
  • Group Space – a content area with 5 elements
    • Link to a folder for the “Group Space for Audit Teams” – content area with 3 elements
      • Link to the groups page for the course
        This is one of the Blackboard communication tools that is supported by a group allocation/management system. It allows each group to have a collection of pages/tools which are unique and only accessible by members of the group.

        Typically this includes a group discussion conference, collaboration, file exchange and email.

      • Content item containing an announcement about groups
      • Content item containing details of group allocation – group names and student members
    • Content item linking to a document explaining problems faced by Vista users
    • Link to the Blackboard drop box tool
    • Folder containing feedback for submitted tasks
    • Folder containing declarations for each quiz
  • Resources Room – a content area with 15 or so elements
    Apart from a content item describing solutions to problems for users of Vista, this content area consists entirely of folders used to group resources associated with a particular week or activity. There are

    • 12 folders for each week of term
      These are a collection of folders and content items providing access to various weekly materials such as: eLectures, powerpoint lecture slides, activity sheets as Word docs, weekly quizzes and solutions (available only after a specified time).
    • one containing feedback and facts from previous students,
      A simple collection of content items with pass results and qualitative student feedback.
    • one containing general course materials (e.g. course profile and study guide),
      Two external links to the course profile and study guide.
    • one on auditing standards
      Two external links to the auditing standards applicable to the course.
    • One containing a course revision presentation
  • Assessment Room – a content area with 6 elements
    • Team membership – content item links to a word doc that students must complete and return
    • Personal journal – link to Blackboard journal tool that is used for personal reflection and integrated into assessment and weekly activities.
    • Resources for assessment items 1, 2a and 2b
      Each contains a range of content items, folders and external links pointing to resources specific to each assessment item.
    • Exam information – collection of information about the exam
  • Instructor Resources – a content area with 7 elements, a number under adaptive release
    • Staff discussion forum – external link to an external discussion forum
    • Snapshots for teaching team – collection of word documents explaining activities/tasks for the various weeks.
    • Teaching materials for lectures – collection of materials for staff to give lectures
    • Teaching materials for the tutorials – ditto for tutorials
    • Teaching materials for the workshops – ditto for workshops
    • Information on Assessment item 2a – misc additional background on assessment item
    • Information on assessment item 2b
  • Teaching Team – a link to the “Staff Information” tools
  • External Links – a content area with a number of links
  • Library – a direct link to the Library website.
  • Helpdesk – a mailto: link for the helpdesk


A fairly traditional hierarchical design for a course website. Students receive direction on what tasks to do from weekly announcements, not from some fixed “schedule” page.

Heavy use is made of groups.

There are some significant differences between tasks/activities for on-campus versus off-campus students. While educationally appropriate, this does tend to make things more difficult for the coordinator and students, i.e. there have to be two sets of instructions created by the coordinator and students have to discern which they should follow.

Some use is made of external discussion forums, probably due to the limitation in how Blackboard allows discussion forums to be configured, i.e. one discussion conference per course and one discussion conference per group.

Staff information is not integrated with CQU systems, requiring duplication of effort. The same applies to the provision of links to the course profile and, to some extent, lectures.

Evaluating an LMS by understanding the underpinning "model"

Currently, CQUni is undertaking an evaluation of Sakai and Moodle as a replacement for Blackboard as the organisation’s Learning Management System. The evaluation process includes many of the standard activities including

  • Developing a long list of criteria/requirements and comparing each LMS against that criteria.
  • Getting groups of staff and students together to examine/port courses to each system and compare/contrast.

Personally, I feel that while both approaches are worthwhile, they are not sufficient to provide the organisation with enough detail to inform the decision. The main limitation is that neither approach tends to develop a deep understanding of the affordances and limitations of the systems. They always lead to the situation where, after a few months or years of use, people sit back and say, “We didn’t know it would do X”.

A few months, at least, of using these systems in real life courses would provide better insight but is also prohibitively expensive. This post is the start of an attempt to try another approach, which might improve a bit on the existing approaches.

What is the approach?

The approach is based somewhat on some previous ramblings and on the assumption that an LMS is a piece of information technology. Consequently, it has a set of data structures, algorithms and interfaces that make particular tasks either hard or easy. The idea is that if you engage with and understand the model, you can get some idea of which tasks are hard and which are easy.

Now there is an awful lot of distance between saying that and actually doing it. I’m not claiming that the following posts are going to achieve anywhere near what is possible to make this work effectively. My current context doesn’t allow it.

At best this approach is going to start developing some ideas of what needs to be done, including what I didn’t do. Hopefully it might “light” the way, a bit.

Using the concept elsewhere

We’ve actually been working on this approach as a basis for staff development in using an LMS, based on the assumption that understanding the basics of the model will make things somewhat easier for folk. The first attempt at this is the slidecast prepared by Col Beer and shown below.

Blackboard@CQ Uni


What will be done?

Given time constraints I can only work with a single course, from a single designer. More courses, especially different ones, would be better, but I have to live with one. I’ve tried to choose one that is likely to test a broader range of features of the LMSes to minimise this limitation. But the approach is still inherently limited by this small sample set.

The chosen course is the T2, 2008 offering of ACCT19064, Auditing and Professional Practice. For 2008 this course underwent a complete re-design driven by two talented and committed members of staff – Nona Muldoon (an instructional designer) and Jenny Kofoed (an accounting academic). As part of this re-design they made use of machinima produced in Second Life. The re-design was found to be particularly successful and has been written about.

The basic steps will be

  1. Explain the model underpinning Blackboard and how it is used within the course.
  2. Seek to understand and explain the model underpinning Moodle and then Sakai.
  3. Identify any differences between the models and how they might impact course design.

Hopefully, all things being equal, you should see a list of posts describing these steps linked below.

PLEs@CQUni: Origins, rationale and outcomes so far

The video and slides from the talk last Friday giving an overview of the PLEs@CQUni project are now up and available.



The dissonance gap in systems and LMS evaluations

Ania Lian writes in the paper Knowledge transfer and technology in education: toward a complete learning environment

It is argued that technology itself is neither liberating, empowering nor enabling one to be with other people but that it will serve whichever goals motivate its incorporation.

In a couple of papers (e.g. this one) I’ve paraphrased this as

Technology is not, of itself, liberating or empowering but serves the goals of those who guide its design and use (Lian, 2000).

This post tries to explain why this is the case and what relevance it has to an institution when it attempts to evaluate and select a new learning management system.

Why is this the case?

A learning management system is a piece of information technology. As a piece of information technology it has been designed by a small group of expert designers, typically a company or an open source community (or an individual).

This group of expert designers analyse the problem, identify some sort of solution and then turn that solution into code. They do not do this in a purely sequential or objective manner. Their past experiences, lessons learned and new knowledge all impact on this process.

However, at some stage they must eventually design and implement algorithms, data structures and interfaces. These artifacts will all embody the view of the world they have formed during the above process. This view of the world will impact upon the final system.

For example, a system designed with an emphasis on being “enterprise ready”, on being “scalable” will have a very different set of facilities, structure and “feel” than a system designed with an emphasis on a particular educational approach.

This influence of the world view of the designers is present at all levels in a system, not just in the high-level perspective. For example, the PeopleSoft ERP had its origins as a human resource management system. As such, one of the early and important identifiers for that system was “EMPLID”, i.e. employee identifier. In an HR system you have employees.

Later on PeopleSoft entered the field of providing ERPs to universities. As part of this they had to add a student records database, in part to record things like which courses a student was enrolled in.

To achieve this you have to have some form of unique identifier for each student. Most universities do this through a unique student number. I’ve seen other student records systems give the student number labels like STUD, STUD_NUM etc.

Guess what label PeopleSoft uses? (Remember its origins and the impact of the original designers.) Yep, that’s right: EMPLID – employee id.

What relevance does this have?

When a university is seeking to select a new learning management system it is selecting an enterprise information system. As such any selection will have embedded in it a certain world view. Even the notion of an LMS embodies a certain world view. That of the “big” system. This world view brings certain positives and negatives. The impact of the original designers will play a part in how effective the choice is, it will limit or enable what learning and teaching can occur, what staff and students can do.

Discussion forums in Blackboard

As one example, let’s take the Blackboard LMS (version 6.3). Like any LMS it provides support for discussion forums. The model, or at least my current understanding of the model, is that a course website can have

  • One “Course Conference” for the course.
    A course conference can consist of many separate discussion forums. This course conference is usually used for everyone in the course. For many CQU courses, this is the only course conference used.
  • Each group of students/staff can have a course conference.
    Blackboard allows you to create groups of students and staff. Each of these groups can have a single course conference. Again the single course conference can contain many different discussion forums, but they are all located in the single course conference.

So what are the limitations of this approach? Essentially, it doesn’t support each group having multiple course conferences.
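The constraint can be sketched directly: one conference per course, one per group, each holding many forums. This is a model of the limitation as described above, not Blackboard's actual code; all names are illustrative.

```python
# Sketch of the Blackboard 6.3 forum model: exactly one conference per
# course and one per group; each conference may hold many forums.
class Conference:
    def __init__(self):
        self.forums = []

class Course:
    def __init__(self):
        self.course_conference = Conference()
        self.group_conferences = {}  # at most one conference per group name

    def add_group_conference(self, group):
        if group in self.group_conferences:
            raise ValueError("each group may have only one conference")
        self.group_conferences[group] = Conference()

course = Course()
course.add_group_conference("Audit Team A")
course.group_conferences["Audit Team A"].forums.append("Week 1 activity")
# A second conference for the same group is exactly what the model forbids:
try:
    course.add_group_conference("Audit Team A")
except ValueError as e:
    print(e)  # → each group may have only one conference
```

The "rooms" design described next needs a conference per activity per group, which is precisely the second `add_group_conference` call that fails here.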

One of the designs by an instructional designer at CQU gave each group a part of the Blackboard course site containing a number of rooms: sections of their group site set aside specifically for different activities, topics or times during the term. Each room might include a number of services, such as a discussion forum, a wiki and sharing of resources.

The design made sense from the perspective of keeping everything associated with a particular activity in the one place.

This could not be done with Blackboard. The discussion forums would have had to be within the single course conference allowed for each group, taking students out of the activity and into another part of the course site.
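The constraint can be sketched as a toy data model (my own illustration, not Blackboard’s actual schema): each course or group owns at most one conference, and forums can only live inside that conference, so there is simply nowhere to represent a second conference per group.

```python
# A hypothetical sketch of the model described above -- NOT Blackboard's
# real schema. The point is structural: Group holds a single Conference,
# so the "rooms" design (several conferences per group) cannot be expressed.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Forum:
    name: str


@dataclass
class Conference:
    forums: List[Forum] = field(default_factory=list)


@dataclass
class Group:
    name: str
    conference: Optional[Conference] = None  # at most ONE per group


@dataclass
class Course:
    conference: Conference = field(default_factory=Conference)  # the course conference
    groups: List[Group] = field(default_factory=list)


# Every activity's forum is forced into the group's single conference.
group = Group("Tutorial A", conference=Conference())
group.conference.forums.append(Forum("Activity 1 discussion"))
group.conference.forums.append(Forum("Activity 2 discussion"))
# A second Conference per group simply has no field to live in.
```

Under this (assumed) model the only workaround is exactly the one described: pile all the forums into the one conference and send students elsewhere in the site for everything else.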

Assignment submission in Blackboard

The assignment submission process in Blackboard 6.3 allows students to submit any type of file, performs no checking of the files and provides only ad hoc support for marking staff.

Consequently, it is well known that you would be really stupid to use the Blackboard assignment submission system for a class with more than about 20 students. To do so opens you up to a huge amount of extra work.


A dissonance between the needs of a system’s users and its embedded world view leads to a number of problems. The dissonance becomes a gap between the users and the system. A gap that prevents the adoption of certain approaches or which creates additional workload as people attempt to work around the gap.

Eventually, this dissonance gap will lead users to attempt alternate means: the creation of shadow systems.

Understand the gap as part of the evaluation

Consequently, it seems sensible that any evaluation of new systems should include an attempt to actively identify the models and purposes underpinning the different systems and to understand the size of the dissonance gap.

All things being equal (and there’s a real good chance that they won’t be), this is something we’ll be attempting over the next few days as CQU attempts to move from Blackboard to either Sakai or Moodle.

"Big" systems – another assumption "PLEs" overthrow

This is a continuation of my attempt to develop a list of the assumptions underpinning existing practice around learning and teaching at universities which the concepts surrounding the Personal Learning Environments (PLE) term bring into question. The list started with this post and is continuing (all the posts should be linked from the bottom of the original post).

“Big” systems

Since the mid to late 1990s most higher education institutions have been adopting the “big” system fad. The “big” system fad is the adoption of really expensive information systems that do everything. What I call the “one ring to rule them all” approach.

The positive spin on these systems is that they are “enterprise” systems. They embody best practice. They promise a single system to unite all required tasks. It’s all neat and tidy and a load of crap.

Most of this came out of interest in enterprise resource planning (ERP) systems which grew out of systems to manage planning for manufacturing systems. These morphed into systems to manage the entire operations of organisations.

Around the mid-1990s these started being applied to universities. Around the same time learning management systems (LMS) arrived on the scene. They were soon sold as “enterprise” learning management systems: single systems to manage all of the “e-learning” of an organisation.

Senior management like these big systems because of the promise. Buy the “big” systems and all your problems will be over. You won’t have to do anything else. You will be able to get the entire organisation using this system. Everything will be consistent. You will be able to understand it, because it is simple, and subsequently manage and control it.

Information Technology units like “big” systems, at least at the beginning, because, for various reasons, it gives IT control. They are given the power to require people to comply and use the big system because it is so expensive everyone must use it.

Because it is expensive it is important. Because it is an IT system that is important, the IT unit is then important to the business.

Problems with big systems

The Wikipedia page on ERPs offers a good list of the disadvantages of these systems. Many of these become hugely problematic when a “big” system is applied to contexts outside of its sweet spot, i.e. contexts similar to manufacturing. Contexts that are not standardised in process, components or outputs (e.g. learning and teaching) suffer under the weight of a “big” system.

Some examples, drawing on the disadvantages listed on the Wikipedia page:

  • Customisation of the ERP software is limited
    Standard best practice advice with “big” systems is do not customise. This means your organisation and everyone in it must use the system in exactly the same way as everyone else. It is almost impossible to customise the system for local contextual needs.
  • Loss of competitive advantage
    This means your organisation cannot differentiate itself from any other competitor that is using the same “big” system.
  • Once established, switching costs are very high
    They cost so much to implement you can’t easily change. This leads to “stable systems” drag (Truex, Baskerville and Klein, 1999) where the organisation struggles against the problems of an inappropriate system because it is too expensive/difficult to change. Eventually the huge investment of resources is made and the change is made. Usually to another “big” system and the problem starts all over again.

    Supposedly, open source “big” systems are the solution to this problem. Sorry, no they aren’t. The cost of the actual software, which is the bit that is “free” with open source software, is the smallest part of the cost of a “big” system. You end up saving this very minor portion of the cost and still retain all the other problems of a “big” system.

  • The system may be too complex for the needs of the customer
    A University I know of selected Peoplesoft as its ERP. The Peoplesoft ERP system grew out of human resource management. It was, I’m told, known for being a good HRM system. From there it grew to have a number of other components added – finance, student administration etc. Guess which part of the ERP this university did not implement – the HRM system. It was too hard given existing contextual constraints.

    How much of the functionality of an “enterprise” LMS is actually used by the staff and students of a university? If my experience is anything to go by, bugger all.

The biggest problem with these systems is that they attempt to do everything. This means that a single supplier, or in the case of open source a single community, must provide all of the necessary functionality. If you want a kerfuffle widget for your “big” system, you have to wait for the vendor/community to develop a kerfuffle widget.

Small pieces, loosely joined

The PLE concept builds on the idea of small pieces loosely joined. That is, there are lots of different Web 2.0 tools, and all (most?) of them concentrate on doing one thing, providing one service. Flickr helps folk share photos, WordPress helps folk blog, etc. Each of these systems is loosely joined through a combination of feeds and open APIs.

Take a look at the home page of my blog. Down the right hand column you will see two collections of photos. One collection taken by me and another containing those photos on Flickr that I’ve used in my presentations. How do I get these into my blog?

  • Upload them into my account on Flickr/tag them as a favourite in Flickr.
  • Use FlickrRiver to create two HTML badges of random photos. One with the most interesting photos I’ve taken, the other with my favourites.
  • Use WordPress’ features to add the HTML for these two badges to my blog.

Each of these tools does one thing well. They are loosely joined via an open API (Flickr to FlickrRiver) and HTML/“API” (FlickrRiver to WordPress).
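As a rough illustration of what “loosely joined” means in practice (using a canned feed snippet standing in for Flickr’s real feed format, and an invented badge layout rather than FlickrRiver’s), a few lines of Python can turn a feed of photo entries into the kind of HTML badge a WordPress sidebar will happily accept:

```python
# Hypothetical sketch: one small piece (a feed of photo entries) joined
# to another (an HTML badge) with nothing but standard XML parsing.
# The RSS below is a stand-in, not actual Flickr output.
import xml.etree.ElementTree as ET

rss = """<rss><channel>
  <item><title>Reflective Rider</title>
    <link>http://example.com/photo/1</link></item>
  <item><title>The Beautiful Wife</title>
    <link>http://example.com/photo/2</link></item>
</channel></rss>"""


def badge_html(feed_xml: str) -> str:
    """Render each feed item as a link inside a sidebar-ready <div>."""
    root = ET.fromstring(feed_xml)
    links = [
        '<a href="{}">{}</a>'.format(item.findtext("link"), item.findtext("title"))
        for item in root.iter("item")
    ]
    return '<div class="badge">' + "".join(links) + "</div>"


print(badge_html(rss))
```

The glue is trivial precisely because each piece exposes a simple, open interface; no vendor had to anticipate this particular combination.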

Imagine how much effort, how much expense and how long you would have to wait for the vendor/community of a “big” system to provide that functionality. Does anyone think that such a tool would be easier to use and have more functionality than Flickr, FlickrRiver and WordPress?

Moving beyond the expert designer

Big Ben

Another major advantage of this approach is that it enables the “death of the expert designer”. Adding the FlickrRiver HTML badge to get my random collection of Flickr photos did not require me to go to the software designers of WordPress and ask them to add the ability to add random photos to my blog.

I could do it myself.

This removes a bottleneck within organisations.

Not without its problems

As others have pointed out, the idea of small pieces loosely joined is not without its problems. These have to be looked at and worked out. Based on the work I’ve done, I don’t see the potential problems as unsolvable and I certainly see the benefits far outweighing the problems.


Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.
