Assembling the heterogeneous elements for (digital) learning


A command for organisations? Program or be programmed

I’ve just finished the Douglas Rushkoff book Program or be Programmed: Ten commands for a digital age. As the title suggests, the author provides ten “commands” for living well with digital technologies. This post arises from the titular and last command examined in the book, Program or be programmed.

Douglas Rushkoff

This particular command was of interest to me for two reasons. First, it suggests that learning to program is important and that more people should be doing it. As I’m likely to become an information technology high school teacher, there is some significant self-interest in there being a widely accepted importance to learning to program. Second, and the main connection for this post, is that my experience with and observation of universities is that they are tending “to be programmed”, rather than program, particularly when it comes to e-learning.

This post is some thinking out loud about that experience and the Rushkoff command. In particular, it’s my argument that universities are being programmed by the technology they are using, and I’m wondering why. I’m hoping this will be my last post on these topics; I think I’ve pushed the barrow for all it’s worth. Onto new things next.

Program or be programmed

Rushkoff’s (p 128) point is that

Digital technology is programmed. This makes it biased toward those with the capacity to write the code.

This also gives a bit of a taste of the other commands, i.e. that there are inherent biases in digital technology that can be good or bad. To get the best out of the technology, there are certain behaviours that seem best suited to encouraging the good rather than the bad.

One of the negative outcomes of not being able to program, of not being able to take advantage of this bias of digital technology, is (p 15)

…instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery.

But is all digital technology programmed?

In terms of software, yes, it is all generally created by people programming. But not all digital technology is programmable. The majority of the time, money and resources invested by universities (I’ll stick to universities, though much of what I say may be applicable more broadly to organisations) goes into “enterprise” systems. Originally this was in the form of Enterprise Resource Planning systems (ERPs) like PeopleSoft. It is broadly recognised that modifications to ERPs are not a good idea, and that instead the ERP should be implemented in “vanilla” form (Robey et al, 2002).

That is, rather than modifying the ERP system to respond to the needs of the university, the university should modify its practices to match the operation of the ERP system. This appears to be exactly what Rushkoff warns against: “we are optimizing humans for machinery”.

This is important for e-learning because, I would argue, the Learning Management System (LMS) is essentially an ERP for learning. And I would suggest that much of what goes on around the implementation and support of an LMS within a university is the optimization of humans for machinery. In some specific instances that I’m aware of, it doesn’t matter whether the LMS is open source or not. Why?

Software remains hard to modify

Glass (2001), describing one of the frequently forgotten fundamental facts about software engineering, suggested that maintenance consumes about 40 to 80 percent of software costs, with 60% of that maintenance cost being due to enhancement. That is, adding new features accounts for a significant proportion of the cost of any software system, roughly a quarter to a half of the total. Remember that this is a general statement; if the software is part of a system that operates within a continually changing context, the figure is going to be much, much higher.

Most software engineering remains focused on creation. On the design and implementation of the software. There hasn’t been enough focus on on-going modification, evolution or co-emergence of the software and local needs.

Take Moodle. It’s an LMS, good and bad like other LMSs, but it’s open source and meant to be easy to modify. That’s one of the arguments wheeled out by proponents when institutions are having to select a new LMS. And Moodle and its development processes are fairly flexible: it’s not that hard to add a new activity module to perform some task you want that isn’t supported by the core.
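To make that claim a little more concrete, here is a minimal sketch of the lib.php of a hypothetical Moodle 2.x activity module. The “journal” name is an assumption (echoing the individual journal example discussed below), but the three callbacks are the ones Moodle requires of every activity module; alongside this file a module also needs a version.php, mod_form.php, view.php, a language file and a db/install.xml schema.

```php
<?php
// mod/journal/lib.php -- minimal sketch of a hypothetical activity module.
// Moodle discovers the module from the mod/journal directory and calls
// these functions when the activity is added, edited or deleted.
defined('MOODLE_INTERNAL') || die();

// Called when a teacher adds the activity to a course; returns the new id.
function journal_add_instance($journal) {
    global $DB;
    $journal->timecreated = time();
    return $DB->insert_record('journal', $journal);   // table defined in db/install.xml
}

// Called when the activity's settings are edited.
function journal_update_instance($journal) {
    global $DB;
    $journal->id = $journal->instance;
    $journal->timemodified = time();
    return $DB->update_record('journal', $journal);
}

// Called when the activity is deleted from a course.
function journal_delete_instance($id) {
    global $DB;
    return $DB->delete_records('journal', array('id' => $id));
}
```

Writing something like this is the comparatively easy part; as the transition to Moodle 2.0 shows, keeping it working across core upgrades is where the real cost lies.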

The trouble is that Moodle is currently entering a phase which suggests it suffers from much the same problems as most large enterprise software applications. The transition from Moodle 1.x to Moodle 2.0 is highlighting the problems with modification. Some folk are reporting difficulties with the upgrade process, while others are deciding to delay the upgrade because some of the third-party modules they use haven’t yet been converted to Moodle 2. There are even suggestions from some that mirror the “implement vanilla” advice for ERPs.

It appears that “we are optimizing humans for machinery”.

I’m wondering if anyone is doing research on how to make systems like Moodle more readily modifiable for local contexts. At the very least, looking at how (or if) the version upgrade problem can be improved, but also at the ability to modify the core to better suit local requirements. There are aspects of this there already. One of the difficulties is that to achieve this you would have to cross boundaries between the original developers, the service providers (Moodle partners) and the practices of internal IT divisions.

Not everyone wants to program

One reason this will be hard is that not everyone wants to program. Recently, D’Arcy Norman wrote a post talking about the difference between the geeks and folk like his dad. His dad doesn’t want to bother with this techy stuff, he doesn’t want to “program”.

This sort of problem is made worse if you have an IT division whose senior management have backgrounds in non-IT work. For example, an IT director with a background in facilities management isn’t going to understand that IT is protean, that it can be programmed. Familiar with the relative permanence of physical buildings and infrastructure, such a person isn’t going to understand that IT can be changed, that it should be optimized for the human beings using the system.

Organisational structures and processes prevent programming

One of the key arguments in my EDUCAUSE presentation (and my thesis) is that the structures and processes that universities are using to support e-learning are biased away from modification of the system. They are biased towards vanilla implementation.

First, helpdesk provision is treated as a generic task. The folk on the helpdesk are seen as low-level, interchangeable cogs in a machine that provides support for all an organisation’s applications. The responsibility of the helpdesk is to fix known problems quickly. They don’t/can’t become experts in the needs of the users. The systems within which they work don’t encourage, or possibly even allow, the development of deep understanding.

For the more complex software applications there will be an escalation process. If the front-line helpdesk can’t solve the problem, it gets handed up to application experts. These are experts in using the application. They are trained and required to help the user figure out how to use the application to achieve their aims. These application experts are expert in optimizing the humans for the machinery. For example, if an academic says they want students to have an individual journal, a Moodle 1.9 application expert will come back with suggestions about how this might be approximated with the Moodle wiki or a kludge built from some other Moodle tool. If Moodle 1.9 doesn’t provide a direct match, they figure out how to kludge together the functionality it does have. The application expert usually can’t suggest using something else.

By this stage, an academic has either given up on the idea, accepted the kludge, gone and done it themselves, or (bravely) decided to escalate the problem further by entering into the application governance process. This is the heavyweight, apparently rational process through which requests for additional functionality are weighed against the needs of the organisation and the available resources. If it’s deemed important enough, the new functionality might get scheduled for implementation at some point in the future.

There are many problems with this process:

  • Non-users making the decisions;
    Most of the folk involved in the governance process are not front-line users. They are managers, both IT and organisational. They might include a couple of experts – e-learning and technology. And they might include a couple of token end-users/academics. Though these are typically going to be innovators. They are not going to be representative of the majority of users.

    What these people see as important or necessary is not going to be representative of what the majority of academic staff/users think is important. In fact, these groups can quickly become biased against the users. I attended one such meeting where the first 10 or 15 minutes were spent complaining about the foibles of academic staff.

  • Chinese whispers;
    The argument/information presented to such a group will have had to pass through a game of Chinese whispers. An analyst is sent to talk to a few users asking for a new feature. The analyst talks to the developers and other folk expert in the application. The analyst’s recommendations will be “vetted” by their manager and possibly other interested parties. The analyst’s recommendation is then described at the governance meeting by someone else.

    All along this line, vested interests, cognitive biases, different frames of references, initial confusion, limited expertise and experience, and a variety of other factors contribute to the original need being morphed into something completely different.

  • Up-front decision making; and
    Finally, many of these requests will have to battle against already set priorities. As part of the budgeting process, the organisation will already have decided what projects and changes it will be implementing this year. Those decisions have been made. Any new requirements have to compete for whatever is left.
  • Competing priorities.
    Last in this list, but not last overall, are competing priorities. The academic attempting to implement individual student journals has as their priority improving the learning experience of the student. They are trying to get the students to engage in reflection and other good practices. This priority has to battle with other priorities.

    The head of the IT division will have as a priority staying within budget and keeping the other senior managers happy with the performance of the IT division. Most of the IT folk will have a priority, or will be told that their priority is, to make the IT division and the head of IT look good. Similarly, and more broadly, the other senior managers on 5-year contracts will have as a priority making sure that the aims of their immediate supervisor are being seen to be achieved.

These and other factors lead me to believe that as currently practiced, the nature of most large organisations is to be programmed. That is, when it comes to using digital technologies they are more likely to optimize the humans within the organisation for the needs of the technology.

Achieving the alternate path, optimizing the machinery for the needs of the humans and the organisation, is not a simple task. It is very difficult. However, by either ignoring or being unaware of the bias of their processes, organisations are sacrificing much of the potential of digital technology. If they can’t figure out how to start programming, such organisations will end up being programmed.

References

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

The nature of user involvement in LMS selection and implementation

Given what we know (see below) about the importance of people to the implementation of information systems and also to learning and teaching, how would you characterise the involvement of users in the selection and implementation of an LMS at most universities? What impact does it have?

The importance of people

There has been significant research within the information systems discipline – a small subset includes user participation and involvement (Ives and Olson 1984); technology acceptance and use (Davis 1989; Venkatesh, Morris et al. 2003); decision-making around system selection and implementation (Bannister and Remenyi 1999; Jamieson, Hyland et al. 2007); system success (DeLone and McLean 1992; Myers 1994); development methods (Mumford 1981); and the social shaping of technology (Kling 2000) – around the importance and impact of people on information systems and their success. In terms of user participation and involvement, Lynch and Gregor (2004) found that previous studies were inconclusive in terms of links with system success; however, they suggest that the level of influence users have on the development process is a better indicator of system outcomes. The perceptions of the people who may potentially use an information and communication technology play a significant role in their adoption and use of that technology (Jones, Cranston et al. 2005). Information systems are designed and used by people operating in complex social contexts; consequently such a system is understood differently by different people and given meaning by the shared understanding that arises out of social interaction (Doolin 1998).

Similar findings and suggestions are evident in the educational and e-learning literature. John and La Velle (2004) argue that new technologies at most enable rather than dictate change. Dodds (2007) suggests that any excellence demonstrated by a university is not a product of technology; it is a product of the faculty, students and staff who play differing roles in the pursuit of scholarship and learning. For Morgan (2003), teaching and learning are two of the most highly personalised processes. Numerous authors (e.g. Alexander 2001; Oblinger 2003) identify understanding learners, and particularly their learning styles, attitudes, and approaches, as essential to the effective facilitation of learning. For Watson (2006), it is clear that consideration of the human dimension is critical to education, since, as Stewart (2008) observes, the beliefs held by those involved in the educational process, regardless of how ill-informed, can have a tremendous impact on the performance of both students and teachers and on how effectively technology may be utilised. Personal characteristics have been found to influence e-learning implementation (Siritongthaworn, Krairit et al. 2006) and most universities are still struggling to engage a significant percentage of students and staff in e-learning (Salmon 2005). While technology may be the stimulus, the essential matters are complex and will be the purview of academics (Oblinger, Barone et al. 2001).

References

Alexander, S. (2001). E-learning developments and experiences. Education and Training, 43(4/5), 240-248.

Bannister, F., & Remenyi, D. (1999). Value perception in IT investment decisions. Electronic Journal of Information Systems Evaluation, 2(2).

Davis, F. D. (1989). Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Quarterly, 13(3), 319.

DeLone, W. H., & McLean, E. R. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1), 60-95.

Dodds, T. (2007). Information Technology: A Contributor to Innovation in Higher Education. New Directions for Higher Education, 2007(137), 85-95.

Doolin, B. (1998). Information technology as disciplinary technology: being critical in interpretive research on information systems. Journal of Information Technology, 13(4), 301-311.

Ives, B., & Olson, M. (1984). User involvement and MIS success: a review of research. Management Science, 30(5), 586-603.

Jamieson, K., Hyland, P., & Soosay, C. (2007). An exploration of a proposed balanced decision model for the selection of enterprise resource planning systems. International Journal of Integrated Supply Management, 3(4), 345-363.

John, P. D., & La Velle, L. B. (2004). Devices and Desires: subject subcultures, pedagogical identity and the challenge of information and communications technology. Technology, Pedagogy and Education, 13(3), 307-326.

Jones, D., Cranston, M., Behrens, S., & Jamieson, K. (2005). What makes ICT implementation successful: A case study of online assignment submission. Paper presented at the ODLAA’2005, Adelaide.

Kling, R. (2000). Learning about information technologies and social change: The contribution of social informatics. The Information Society, 16(3), 217-232.

Lynch, T., & Gregor, S. (2004). User participation in decision support systems development: Influencing system outcomes. European Journal of Information Systems, 13(4), 286-301.

Morgan, G. (2003). Faculty use of course management systems: Educause Centre for Applied Research.

Mumford, E. (1981). Participative systems design: Structure and method. Syst. Objectives solutions, 1(1), 5-19.

Myers, M. D. (1994). Dialectical hermeneutics: a theoretical framework for the implementation of information systems. Information Systems Journal, 5, 51-70.

Oblinger, D. (2003). Boomers, gen-Xers and millennials: Understanding the new students. EDUCAUSE Review, 37 – 47.

Oblinger, D., Barone, C., & Hawkins, B. (2001). Distributed education and its challenge: An overview. Washington DC: American Council on Education.

Salmon, G. (2005). Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions. ALT-J, Research in Learning Technology, 13(3), 201-218.

Siritongthaworn, S., Krairit, D., Dimmitt, N., & Paul, H. (2006). The study of e-learning technology implementation: A preliminary investigation of universities in Thailand. Education and Information Technologies, 11(2), 137-160.

Stewart, D. P. (2008). Technology as a management tool in the Community College classroom: Challenges and Benefits. Journal of Online Learning and Teaching, 4(4).

Venkatesh, V., Morris, M., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.

Watson, D. (2006). Understanding the relationship between ICT and education means exploring innovation and change. Education and Information Technologies, 11(3-4), 199-216.

Nobody likes a do-gooder – another reason for e-learning not mainstreaming?

Came across the article “Nobody likes a do-gooder: Study confirms selfless behaviour is alienating” from the Daily Mail via Morgaine’s amplify. I’m wondering if there’s a connection between this and the chasm in the adoption of instructional technology identified by Geoghegan (1994).

The chasm

Back in 1994, Geoghegan drew on Moore’s Crossing the Chasm to explain why instructional technology wasn’t being adopted by the majority of university academics. The suggestion is that there is a significant difference between the early adopters of instructional technology and the early majority. What works for one group doesn’t work for the other. There is a chasm. Geoghegan (1994) also suggested that the “technologists alliance” – vendors of instructional technology and the university folk charged with supporting instructional technology – adopt approaches that work for the early adopters, not the early majority.

Nobody likes do-gooders

The Daily Mail article reports on some psychological research that draws some conclusions about how “do-gooders” are seen by the majority

Researchers say do-gooders come to be resented because they ‘raise the bar’ for what is expected of everyone.

This resonates with my experience as an early adopter and more broadly with observations of higher education. The early adopters, those really keen on learning and teaching, are seen a bit differently by those who aren’t keen. I wonder if the “raise the bar” issue applies? I’d imagine this could be quite common in a higher education environment where research retains its primacy, but universities are under increasing pressure to improve their learning and teaching. And, more importantly, to show everyone that they have improved.

The complete study is outlined in a journal article.

References

Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conferences of the International Business Schools Computing Association, Baltimore, MD.

The end of management – lessons for universities?

Yet another “death of X” article is the spark for this post. This one comes from the Wall Street Journal and is titled The end of management. There’s been a wave of these articles recently, but this one I like because it caters to my prejudice that most of the problems in organisations, especially in universities around learning and teaching, arise from an inappropriate management paradigm. The following has some connections to the oil sheiks thread.

Some choice quotes

Corporations are bureaucracies and managers are bureaucrats. Their fundamental tendency is toward self-perpetuation. They are, almost by definition, resistant to change. They were designed and tasked, not with reinforcing market forces, but with supplanting and even resisting the market.

and

The weakness of managed corporations in dealing with accelerating change is only half the double-flanked attack on traditional notions of corporate management. The other half comes from the erosion of the fundamental justification for corporations in the first place.

And a quote from Gary Hamel which summarises much of the problem with real innovation, including innovation around management

That thing that limits us is that we are extraordinarily familiar with the old model, but the new model, we haven’t even seen yet.

Moving on to the question of resources

In corporations, decisions about allocating resources are made by people with a vested interest in the status quo. “The single biggest reason companies fail,” says Mr. Hamel, “is that they overinvest in what is, as opposed to what might be.”

The challenge that strikes at the heart of improving learning and teaching within universities is captured in this quote

there’s the even bigger challenge of creating structures that motivate and inspire workers. There’s plenty of evidence that most workers in today’s complex organizations are simply not engaged in their work.

Does your university have large numbers of academic staff that are actively engaged in teaching? How does it do it?

I’d like to work for a university that gets this, or at least is trying to.

Wicked problems and the need to engage with differing perspectives

In writing the last post, I had the opportunity to re-read the Wikipedia article on wicked problems. This quote struck a chord with me

Rittel and Webber coined the term in the context of problems of social policy, an arena in which a purely scientific-rational approach cannot be applied because of the lack of a clear problem definition and differing perspectives of stakeholders.

My experience with writing BIM from late last year through early this year is a good example of this. BIM is based on an earlier tool called BAM that was written as part of an institution specific system called Webfuse. As of early this year, the institution was dropping Webfuse and moving to Moodle. If BAM were to be continued, it had to be ported to Moodle. And the perspectives begin.

Stakeholders and their perspectives

There are at least three sets of stakeholders in the BIM process:

  • Academic staff wanting BIM written to use in their teaching.
  • Myself, the developer/researcher wanting to write BIM because of an interest in the approach.
  • The IT folk responsible for the Moodle transition project and supporting staff.

The academic staff who wanted BIM created wanted it because it enabled a pedagogical practice that had been previously successful, at least from their perspective. They didn’t really care a great deal about how; they just wanted to use that pedagogy again.

I wanted to work on BIM because I believed that both the pedagogy it enabled and the model of e-learning systems it embodied were worthwhile and potentially very important for future practice.

The IT folk didn’t want BIM written. They had limited resources to use on the project, and anything that was not core Moodle was not very attractive to them. Consequently, they spent a lot of time and effort proposing methods by which the pedagogy enabled by BAM could be enabled through various combinations of core Moodle tools. There was also quite a bit of political shenanigans being undertaken to prevent BIM being written.

Effective collaboration to enable an efficient implementation of the required pedagogy was not high on the agenda.

The winner writes the history

Obviously, the above is my perspective of what happened. I’m quite sure others involved might provide different perspectives. Especially now that BIM has been somewhat successful, at least in terms of one other institution using it and various people in the broader Moodle community saying nice things. I now begin to wonder what story will be written about the history of BIM.

I no longer work for the original institution, and am fairly confident that if BIM continues to enjoy some success, the IT folk within the institution will take some credit for an environment that enabled the development of BIM. After all, the development of BIM proves the rhetoric about the value of adopting an open source LMS like Moodle: the institution was able to develop a Moodle module that served an effective pedagogical purpose and is being adopted by others.

From my perspective, the writing of BIM has been achieved in spite of the institutional environment. Due to the difficulties of that environment, I had to do most of the work on holidays, had to fight individuals that actively worked against the development of BIM, and a range of other problems not indicative of an environment conducive to innovation.

But, now that I’ve left the organisation, it shall be interesting to hear what stories those that remain tell of BIM, its development, and their role within it.

The main point is that difference exists

Now, all of that probably sounds a bit one-sided and biased. Others might suggest a different version of events and argue that it wasn’t so bad. They are free to do so. Which version of events is more correct isn’t the point I’m trying to make here.

The point I’m trying to make is that as a wicked problem, improving learning and teaching within a university is going to have a large number of very different perspectives. The attempt to develop “the correct” perspective – which is the aim of engineering or planning approaches to solving these problems – misses the point. To establish an arbitrary and singular “correct” perspective of the problem and its solution, such a process must ignore and continually suppress alternative perspectives. This wastes energy on the suppression, and worse, closes off more fruitful solutions that arise from actively engaging with the diversity.

Two types of process and what university e-learning continues to get wrong

I should be writing other things, but there’s a wave amongst some of the “innovation bloggers” at the moment that I wanted to ride for the purposes of – once again – trying to get those driving university e-learning (and learning and teaching more generally) to realise they have something fundamentally wrong. They are using the wrong type of process.

I level this criticism at most of management, most of the support staff (information technology, instructional designers, staff developers etc) and especially at most of the researchers around e-learning etc.

For those of you at CQU who still don’t get what Webfuse was about: it wasn’t primarily about the technology, it was about this type of process. The technology was only important in so far as it enabled the process to be more responsive.

Empathy-driven innovation and a pull strategy

Over the weekend, Tim Kastelle wrote a post in which he proposes that a pull strategy is a key for empathy-driven innovation.

What is empathy-driven innovation? Tim provides the following bullet points:

  • It requires a deep understanding of what the people that will use your innovation need and want.
    Most organisational e-learning assumes that steering committees and reference groups are sufficient and appropriate for understanding what is needed. This is just plain silly. The people who reside on such things are generally very different in terms of experience and outlook than the majority of academics involved with learning and teaching. If they aren’t different at the start, the very act of being a member of such groups will, over time, make them very different. These groups are not representative.

    What’s worse is that the support structures, processes, and roles that are created to sit under these committees and implement e-learning are more likely to prevent “deep understanding” than help it. For example, different aspects of e-learning are divided along the lines of institutional structures. Anything technology related is the responsibility of the information technology folk, anything pedagogical is the responsibility of the instructional design folk, and never shall the twain meet. As these folk generally report to different managers within different organisational units, they rarely talk and share insights.

    E-learning is more than the sum of its parts. Currently, there is generally a large gulf between the academics and students “experiencing” e-learning, the technology people keeping the systems going, the instructional design folk helping academics design courses, and the management staff trying to keep the whole thing going. This gulf actively works against developing deep understanding and limits the quality of e-learning within universities.

  • Using empathy for the users of our innovations is the best way to create thick value.
    A deep, contextualised understanding and appreciation for the context of the academic staff and students helps develop truly unique and high quality e-learning applications and environments. Without it you are left copying what everyone else does, which is typically pretty limited.
  • We are creating ideas that entice people.
    Almost all university-based e-learning is based on push strategies. That is, some person or group who is “smart” identifies the right solution and then pushes it out onto the academics and students. They have to do this because their understanding of the context and needs of the academics and students is small to non-existent. Their decisions are based more on their own personal observations and preferences, or even worse, on the latest fad (e.g. e-portfolios, open source LMS etc.).

    They aren’t creating ideas that entice people, they are creating ideas that people have to use.

    Researchers are particularly bad at this.

  • Innovations that pull are inherently network-based.
    The idea is that to engage in empathy-driven innovation, you have to have connections to the people using the innovations.

    As argued above, it’s my suggestion that the structures and processes around e-learning within universities are such that they actively work against the formation of these connections. To have empathy-driven innovation you have to connect the folk involved in teaching, learning, technology and instructional design in ways that are meaningful and that actively strengthen the connections.

    At the moment, at least in my institution, there is no easy way for an academic to talk to a technical person that actually knows anything about the system, i.e. someone who can actively modify the system. The technology person can’t easily talk with someone with educational knowledge to better inform technological change. Instead each group retreats to talking amongst themselves. The necessary connections are generally only there in spite of the system, not because of it.

The Webfuse Thesis

I’m somewhat engaged in this discussion because I have seen, for a small period of time, this type of empathy-driven innovation work in the context of e-learning within a University. This is the topic of my PhD Thesis, an attempt to describe an information systems design theory for e-learning that encapsulates this.

At its simplest, the Webfuse thesis embodies two aspects:

  1. Process.
    There are two broad types of process: teleological and ateleological. I describe the two types here. Empathy-driven innovation is an example of an ateleological process. The table in the blog post describing teleological and ateleological processes mentions Seely Brown and Hagel’s push and pull distinction.

    University e-learning is always too teleological, it needs to be more ateleological.

  2. Product.
    Everyone focuses too much on the actual product or system when we talk about e-learning. With Webfuse the product was only important in terms of how flexible it could be: how much diversity could it support and how easy was it to support that diversity. Because the one thing I know about learning and teaching within universities is that it is diverse. In addition, if you are to engage in ateleological (empathy-driven) design, you need to be able to respond to local needs.

    Most importantly, the question of how flexible the product is, is not limited to just the characteristics of the product. Yes, Moodle as an open source LMS implemented with technology (PHP and MySQL) that has low entry barriers can be very flexible. However, if the organisation implements it with technology that has high entry barriers and inflexibility (e.g. Oracle), or if it adopts a process that is more teleological than ateleological, it robs Moodle of its flexibility.

From theory to intervention: Mapping theoretically derived behavioural determinants to behaviour change techniques

The following draws on principles/theory from psychology to guide thinking about how to incorporate “data” from “academic analytics” into an LMS in a way that encourages and enables academic staff to improve their learning and teaching. It’s based on some of the ideas that underpin similar approaches that have been used for students, such as this Moodle dashboard and the Signals work at Purdue University.

The following gives some background to this approach, summarises a paper from the psychology literature around behaviour modification and then explains one idea for a “signals” like application for academic staff. Some of this thinking is also informing the “Moodle curriculum mapping” project.

Very interested in pointers to similar work, suggestions for improvement or expressions of interest from folk.

Background

I have a growing interest in how insights from psychology, especially around behaviour change, can inform the design of e-learning and other aspects of the teaching environment at universities in ways that encourage and enable improvement.
Important: I did not say “require”, I said “encourage”. Too much of what passes in universities at the moment takes the “require” approach with obvious negative consequences.

This is where my current interest in “nudging” – the design of good choice architecture – and behaviour modification comes from. The basic aim is to redesign the environment within which teaching occurs in a way that encourages and enables improvement in teaching practice, rather than discourages it.

To aid in this work, I’ve been lucky enough to become friends with a psychologist who has some similar interests. We’re currently talking about different possibilities, informed by our different backgrounds. As part of that he’s pointing me to bits of the psychological literature that offer some insight. This is an attempt to summarise/reflect on one such paper (Michie et al, 2008).

Theory to intervention

It appears that the basic aim of the paper is to

  • Develop methods to clarify the list of behaviour change techniques.
  • Identify links between the behaviour change techniques and behavioural determinants.

First, a comparison of two attempts at simplifying the key behavioural determinants for change, shown in the following table. My understanding is that there are some values of these determinants that would encourage behaviour change, and others that would not.

Key Determinants of Behaviour Change (from Fishbein et al., 2001; Michie et al., 2004)

Fishbein et al                    | Michie et al
Self-standards                    | Social/professional role and identity
                                  | Knowledge
Skills                            | Skills
Self-efficacy                     | Beliefs about capabilities
Anticipated outcomes/attitude     | Beliefs about consequences
Intention                         | Motivation and goals
                                  | Memory, attention and decision processes
Environmental constraints         | Environmental context and resources
Norms                             | Social influences
                                  | Emotion
                                  | Action planning

It is interesting to see how well the categories listed in this table resonate with the limits I was planning to talk about in this presentation. That is, it really seems to me, at the moment, that much of the environment within universities around teaching and learning is designed in ways that reduce the chance of these determinants leaning towards behaviour change.

Mapping techniques to determinants

They use a group of experts in a consensus process for linking behaviour change techniques with determinants of behaviour. The “Their mapping” section below gives a summary of the consensus links. The smaller headings are the determinants of behaviour from the above table, the bullet points are the behaviour change techniques.

Now, I haven’t gone looking for more detail on the techniques. The following is going to be based solely on my assumptions about what those techniques might entail – and hence it will be limited. However, this should be sufficient for the goal of identifying changes in the LMS environment that might encourage change in behaviour around teaching.

First, let’s identify some of the prevalent techniques, i.e. those that are mentioned more than once and which might be useful/easy within teaching.

Prevalent techniques

Social encouragement, pressure and support

The technique “Social processes of encouragement, pressure, support” is linked to 4 of the 11 determinants: social/professional role and identity, beliefs about capabilities, motivation and goals, and social influences. I find this interesting as it can be suggested that most teaching is a lone and invisible act, especially in an LMS, where you can’t see what’s going on in other courses. Making what happens more visible might enable this sort of social process.

There’s also some potential connection with “Information regarding behaviour of others” which is mentioned in 3 of 11.

Monitoring and self-monitoring

Monitoring and self-monitoring are linked to 4 of the 11 determinants. Again, most LMS don’t appear to give good overall information about what a teacher is doing in a way that would enable monitoring/self-monitoring.

Related to this is “goal/target specified”, part of monitoring.

There’s more to do here, but let’s get onto a suggestion.

One suggestion

There’s a basic model process embedded here, something along the lines of:

  • Take a knowledge of what is “good” teaching and learning
    For example, Fresen (2007) argues that the level of interaction, facilitation or simply participation by academic staff is a critical success factor for e-learning. There’s a bunch more literature that backs this up, and our own research/analysis has backed this up. Courses with high staff participation show much higher student participation and a clearer correlation between student participation and grade (i.e. the more student participation, the higher the grade).
  • Identify a negative/insight into the behavioural determinants that affect academic staff around this issue.
    There are a couple. First, it’s not uncommon for staff to have an information distribution conception of teaching, i.e. they see their job as being to disseminate information, not to talk, to communicate, or to participate. Associated with this is that most staff have no idea what other staff are doing within their course sites. They don’t know how often other staff are contributing to the discussion forum or visiting the course site.
  • Draw on a behavioural technique or two to design an intervention in the LMS that can encourage a behaviour change. i.e. that addresses the negative in the determinants.
    In terms of increasing staff participation you might embed into the LMS a graph like the following (a rough sketch of how the underlying numbers might be pulled out of Moodle appears after this list). Embed it in such a way that the graph is the first thing an academic sees when they log in, perhaps on part of the screen.

    Example staff posts feedback

    What this graph shows, for a single (hypothetical) staff member, is the number of replies they have made in the discussion forums of the three courses they have taught. The number of replies is shown per term; in reality it might be shown by week of term, as the term progresses.

    This part can hit the “monitoring”, “self-monitoring” and “feedback” techniques.

    The extra, straight line represents the average number of replies made by staff in all courses in the LMS. Or alternatively, all courses in a program/degree into which the staff member teaches. (Realistically, the average would probably change from term to term).

    This aspect hits the “social processes of encouragement, pressure, support”, “modelling/demonstration behaviour of others”. By showing what other people are doing it is starting to create a social norm. One that might perhaps encourage the academic, if they are below the average, to increase their level of replies.

    But the point is not to stop here. Showing a graph like this is simple using business intelligence tools and is only a small part of the change necessary.

    It’s now necessary to hit techniques such as “graded task, starting with easy tasks”, “Increasing skills: problem-solving, decision-making, goal-setting”, “Planning, implementation”, “Prompts, triggers, cues”. It’s not enough to show that there is a problem, you have to help the academic with how to address the problem.

    In this case, there might be links associated with this graph that show advice on how to increase replies or staff participation (e.g. advice to post a summary of the week’s happenings in a course each week, or some other specific, context appropriate advice). Or it might also provide links to further, more detailed information to shed more light on this problem. For example, it might link to SNAPP to show disconnections.

    But it’s even more than this. If you wanted to hit the “Environmental changes (e.g. objects to facilitate behaviour)” technique you may want to go further than simply showing techniques. You may want this “showing of techniques” to happen within a broader community where people could comment on whether or not a technique worked. It would be useful if the tool helped automate/scaffold the performance of the task, i.e. moved up the abstraction layer from the basic LMS functionality. Or perhaps the tool and associated process could track and create “before and afters”, i.e. when someone tries a technique, store the graph before it is applied and then capture it again some time after.
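As a rough illustration of where the numbers behind such a graph could come from, the sketch below counts the forum replies made by a given staff member in a course and computes a crude site-wide average to use as the benchmark line. It is a minimal sketch only, assuming a Moodle 2.x context: the $DB data access API and the standard forum_posts and forum_discussions tables are real, but the function names (and any notion of a “local_signals” plugin) are hypothetical, and a real implementation would also need to split the counts by term or week and restrict them to people in teaching roles.

```php
<?php
// Hypothetical helper functions for a "signals for staff" style graph.
// Assumes Moodle 2.x: {forum_posts}/{forum_discussions} are the standard
// forum tables and $DB is Moodle's database API.
defined('MOODLE_INTERNAL') || die();

// Number of replies (posts with a parent) made by $userid in $courseid.
function local_signals_teacher_replies($userid, $courseid) {
    global $DB;
    $sql = "SELECT COUNT(p.id)
              FROM {forum_posts} p
              JOIN {forum_discussions} d ON d.id = p.discussion
             WHERE d.course = :courseid
               AND p.userid = :userid
               AND p.parent <> 0";   // parent = 0 is the opening post, not a reply
    return $DB->count_records_sql($sql, array('courseid' => $courseid, 'userid' => $userid));
}

// Average number of replies per course across the whole site -- the
// straight "benchmark" line on the graph.
function local_signals_average_replies() {
    global $DB;
    $sql = "SELECT AVG(replies.cnt)
              FROM (SELECT d.course, COUNT(p.id) AS cnt
                      FROM {forum_posts} p
                      JOIN {forum_discussions} d ON d.id = p.discussion
                     WHERE p.parent <> 0
                  GROUP BY d.course) replies";
    return (float) $DB->get_field_sql($sql);
}
```

Fed into a simple chart each term, those two numbers give the self-monitoring element (my replies) and the social-norm element (the average line) described above; filtering on the posts’ created timestamps would give the per-term or per-week breakdown.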

It’s fairly easy to see how the waterfall visualisation (shown below), developed by David Wiley and his group, could be used this way.

[Image: waterfall visualisation]

Their mapping

Social/professional role and identity

  • Social processes of encouragement, pressure, support

Knowledge

  • Information regarding behaviour by others

Skills

  • Goal/target specified: behaviour or outcome
  • Monitoring
  • Self-monitoring
  • Rewards; incentives (inc. self-evaluation)
  • Graded task, starting with easy tasks
  • Increasing skills: problem-solving, decision-making, goal-setting
  • Rehearsal of relevant skills
  • Modelling/demonstration of behaviour by others
  • Homework
  • Perform behaviour in different settings

Beliefs about capabilities

  • Self-monitoring
  • Graded task, starting with easy tasks
  • Increasing skills: problem-solving, decision-making, goal-setting
  • Coping skills
  • Rehearsal of relevant skills
  • Social processes of encouragement, pressure and support
  • Feedback
  • Self talk
  • Motivational interviewing

Beliefs about consequences

  • Self-monitoring
  • Persuasive communication
  • Information regarding behaviour, outcome
  • Feedback

Motivation and goals

  • Goal/target specified: behaviour or outcome
  • Contract
  • Rewards; incentives (inc. self-evaluation )
  • Graded task, starting with easy tasks.
  • Increasing skills: problem-solving, decision-making, goal-setting
  • Social processes of encouragement, pressure, support
  • Persuasive communication
  • Information regarding behaviour, outcome
  • Motivational interviewing

Memory, attention, decision processes

  • Self-monitoring
  • Planning, implementation
  • Prompts, triggers, cues

Environmental context and resources

  • Environmental changes (e.g. objects to facilitate behaviour)

Social influences

  • Social processes of encouragement, pressure, support
  • Modelling/demonstration of behaviour by others

Emotion

  • Stress management
  • Coping skills

Action planning

  • Goal/target specified: behaviour or outcome
  • Contract
  • Planning, implementation
  • Prompts, triggers, cues
  • Use of imagery

References

Fresen, J. (2007). A taxonomy of factors to promote quality web-supported learning. International Journal on E-Learning, 6(3), 351-362.

Michie, S., Johnston, M., et al. (2008). From theory to intervention: Mapping theoretically derived behavioural determinants to behaviour change techniques. Applied Psychology: An International Review, 57(4), 660-680.

Why is University/LMS e-learning so ugly?

Yesterday, Derek Moore tweeted

University webs require eye candy + brain fare. Puget Sound’s site does both with colour palate & info architecture http://bit.ly/9U1kBN

In a later tweet he also pointed out that even Puget Sound’s instance of Moodle looked good. I agreed.

This resonated strongly with me because I and a few colleagues have recently been talking about how most e-learning within universities and LMSs is ugly: depressing corporate undesign seeking to achieve quality through consistency and instead sinking to the lowest common denominator. Sorry, I’m starting to mix two of my bêtes noires:

  1. Most LMS/University e-learning is ugly.
  2. Most of it is based on the assumption that everything must be the same.

Let’s just focus on #1.

I’m using ugly/pretty in the following in the broadest possible sense. Pretty, at its extreme end, is something that resonates positively in the soul as you’re using it effectively to achieve something useful. It helps you achieve the goal, but you feel good while you’re doing it, even when you fail and even without knowing why. There’s a thesis or three in this particular topic alone, so I won’t have captured it.

Why might it be ugly? An absence of skill?

Let me be the first to admit that the majority of e-learning that I’ve been responsible for is ugly. This design (used in hundreds of course sites) is mostly mine, but has thankfully been improved (as much as possible) by other folk. At best you might call that functional. But it doesn’t excite the eyes or resonate. And sadly, it’s probably all downhill from there as you go further back in history.

Even my most recent contribution – BIM – is ugly. If you wish to inflict more pain on your aesthetic sensibility look at this video. BIM rears its ugly head from about 1 minute 22 seconds in.

In my case, these are ugly because of an absence of skill. I’m not a graphic designer, I don’t have training in visual principles. At best I pick up a bit, mostly from what I steal, and then proceed to destroy those principles through my own ineptitude.

But what about organisations? What about the LMS projects like Moodle?

Why might it be ugly? Trying to be too many things to too many?

An LMS is traditionally intended to be a single, integrated system that provides all the functionality required for institutional e-learning. It is trying to be a jack of all trades. To make something so all encompassing look good in its entirety is very difficult. For me, part of looking good is responding to the specifics of a situation in an appropriate way.

It’s also not much use being pretty if you don’t do anything. At some level the developers of an LMS have to focus on making it easy to get the LMS to do things, and that will limit the ability to make it look pretty. The complexity of LMS development places limits on making it look pretty.

At some level, the complexity required to implement a system like an LMS also reduces the field of designers who can effectively work with it to improve its design.

But what about organisations adopting the LMS, why don’t they have the people to make it look good?

Why might it be ugly? Politics?

The rise of marketing and the “importance of brand” brings with it the idea of everything looking the same. It brings out the “look and feel” police, those folk responsible for ensuring that all visual representations of the organisation capture the brand in accepted ways.

In many ways this is an even worse example of “trying to be too many things”, as the “brand” must capture a full range of print, online and other media, which can be a bridge too far for many. The complexity kills the ability for the brand to capture and make complete use of the specific media. Worse, often the “brand police” don’t really understand the media and thus can’t see the benefits of the media that could be used to improve the brand.

The brand and the brand police create inertia around the appearance of e-learning. They help enshrine the ugliness.

Then we get into the realm of politics and irrationality. It’s no longer about aesthetic arguments (difficult at the best of times); it becomes about who plays the game best, who has the best connection to leadership, who has the established inertia, who can spin the best line.

The call to arms

I think there is some significant value in making e-learning look “pretty”. I think there’s some interesting work to be done in testing that claim and finding out how you make LMS and university e-learning “pretty”.
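As one small, concrete starting point, the sketch below shows roughly what a custom Moodle 2.x theme involves. The theme name “prettier” is an assumption, but the settings are the standard config.php properties a child theme uses to inherit its layouts from an existing theme while layering its own stylesheet on top; the work of making a Moodle instance look less generic then happens in that stylesheet, which is graphic design work rather than programming.

```php
<?php
// theme/prettier/config.php -- a minimal, hypothetical Moodle 2.x theme.
// Everything structural is inherited from the standard 'base' theme; the
// visual work lives in theme/prettier/style/custom.css.

$THEME->name    = 'prettier';       // assumed theme name
$THEME->parents = array('base');    // reuse the parent theme's layouts and renderers
$THEME->sheets  = array('custom');  // load style/custom.css on top of the parent's styles
```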

Some questions for you:

  • Is there already, or can we set up, a gallery of “pretty” LMS/institutional e-learning?
    Perhaps something for Moodle (my immediate interest) but other examples would be fun.
  • What bodies of literature can inform this aim?
    Surely some folk have already done stuff in this area.
  • What might be some interesting ways forward i.e. specific projects to get started?

Embedding behaviour modification – paper summary

A growing interest of mine is investigating how the design of the environment and information systems that support university learning and teaching can be improved by giving greater consideration to factors that help encourage improvement and change. That is, not just building systems that do a task (e.g. manage a discussion forum) but designing a discussion forum that encourages and enables an academic to adopt strategies and tactics that are known to be good, if they choose to.

One aspect of the thinking around this is the idea of behaviour modification. The assumption is that to some extent improving the teaching of academics is about changing their behaviour. The following is a summary of a paper (Nawyn et al, 2006) available here.

The abstract

Ubiquitous computing technologies create new opportunities for preventive healthcare researchers to deploy behavior modification strategies outside of clinical settings. In this paper, we describe how strategies for motivating behavior change might be embedded within usage patterns of a typical electronic device. This interaction model differs substantially from prior approaches to behavioral modification such as CD-ROMs: sensor-enabled technology can drive interventions that are timelier, tailored, subtle, and even fun. To explore these ideas, we developed a prototype system named ViTo. On one level, ViTo functions as a universal remote control for a home entertainment system. The interface of this device, however, is designed in such a way that it may unobtrusively promote a reduction in the user’s television viewing while encouraging an increase in the frequency and quantity of non-sedentary activities. The design of ViTo demonstrates how a variety of behavioral science strategies for motivating behavior change can be carefully woven into the operation of a common consumer electronic device. Results of an exploratory evaluation of a single participant using the system in an instrumented home facility are presented

Summary

Tells how a PDA plus additional technology was used to embed behaviour modification strategies aimed at decreasing the amount of television watching. Describes a successful test with a single person.

Has some links/references to strategies and research giving principles for how to guide this type of design.

Introduction

Sets the scene. Too many Americans watch too much TV, are overweight and don’t get exercise. Reducing TV watching should improve health, if it is replaced with activities that aren’t sedentary. But this is difficult because TV watching is addictive, and exercise is seen as having high costs while the initial experience is not so good.

The idea is that “successful behavior modification depends on delivery of motivational strategies at the precise place and time the behavior occurs”, and that “sense-enabled mobile computing technologies” can help achieve this. This work aims to:

  • use technology to disrupt the stimulus-reward cycle of TV watching;
  • decrease the costs of physical activity.

Technology-enabled behavioral modification strategies

Prior work has included knowledge campaigns and clinical interventions – the two most common approaches. Technology used to reduce television watching has usually taken the form of gatekeeper devices that limit access – not likely to be used by adults. There are also exercise-contingent TV activation systems.

More work has aimed at increasing physical activity independent of television. Approaches used include measuring activity and providing open-loop feedback, i.e. simple, non-intrusive aids to increase activity. More interactive, just-in-time feedback may help short-term motivation – e.g. video games. There are also technology interventions that mimic a human trainer.

For those not already exercising small increases in physical activity may be better than intense regimens.

The opportunity: just-in-time interactions

The technological intervention is based on the premise that people respond best to information that is timely, tailored to their situation, often subtle, and easy to process. This intervention uses a PDA device intended to replace the television remote control and adds a graphical interface, built-in program listings, access to a media library, integrated activity management, and interactive games.

It tries to determine the goals of the user and suggest alternatives to watching TV in a timely manner. With the addition of wearable acceleration sensors it can also function as a personal trainer.

Challenges

Provide a user experience rewarding enough to be used over time.

Grabbing attention without grabbing time

Prior work on behavior change interventions reveals them to be:

  • resource-intensive, requiring extensive support staff;
  • time-intensive, requiring the user to stop everyday activity to focus on relevant tasks.

This is why the remote is seen as a perfect device. It’s part of the normal experience. Doesn’t need separate time to use.

Sustaining the interaction over time

Behavior change needs to be sustained over years to have a meaningful impact.
Extended use of a device might run the risk of annoyance, so the authors avoided paternalistic or authoritarian strategies and focused instead on strategies that promote intrinsic motivation and self-reflection. Elements of fun, reward and novelty are used to induce positive affect rather than feelings of guilt.

Avoiding the pitfall of coercion

There is a temptation to use coercion for motivation, but the likelihood that users will tolerate coercive devices for long is questionable.

Avoiding reliance on extrinsic justification

The optimal outcome of any behavioural intervention is change that persists. Heavy reliance on extrinsic justification – rewards or incentives – may result in a dependency that hurts persistence once the rewards are removed. There are also problems if the undesirable behaviour – watching TV – becomes the reward for exercise.

Case study

Low cost remote produced from consumer hardware. Laptop provided to manage media library. GUI with finger input.

Provides puzzles that use the TV for display and physical activity for input.

Behavior modification strategies

Most are derived from basic research on learning and decision-making (suggestibility, goal-setting and operant conditioning). Examples include:

  • value integration – having persuasive strategies embedded within an application that otherwise provides value to the user increases the likelihood of adoption.
  • reduction – reducing the complexity of a task increases the likelihood that it will be performed.
  • convenience – embedding within something used regularly, increases opportunities for delivery of behaviour change strategies.
  • ease of use – easier to use = more likely to be adopted over long term.
  • intrinsic motivation – incorporating elements of challenge, curiosity and control into an activity can sustain interest.
  • suggestion – you can bias people toward a course of action through even very subtle prompts and cues.
  • encouraging incompatible behaviour – encouraging behaviour that cannot be performed at the same time as the undesirable behaviour can be effective.
  • disrupting habitual behaviour – bad habits can be eliminated when the conditions that create them are removed or avoided.
  • goal setting – concrete, achievable goals promote behaviour change by orienting the individual toward a definable outcome.
  • self-monitoring – motivated people can be more effective when able to evaluate progress toward outcome goals.
  • proximal feedback – feedback that occurs during or immediately after an activity has the greatest impact on behaviour change.
  • operant conditioning – increase frequency of desirable behaviour by pairing with rewarding stimuli.
  • shaping – transform an existing behaviour into more desirable one by rewarding successive approximations of the end goal.
  • consistency – draw on the desire of people to have a consistency between what they say and do to help them adhere to stated goals.

Exploratory evaluation

Use it with a single (real life) person to find out what happens.

Done in a specially instrumented apartment, involving three phases: a baseline with the normal remote, 12 days at home, and 7 days in the lab with the special remote. The participant was not told that the study was aimed at changing behaviour around watching TV and physical activity.

Results

Television watching reduced from 133 minutes a day during baseline to 41 minutes during intervention.

Evaluation against the adopted strategies was positive.

Conclusions

The substantial improvement is important. Strategies should be phased in over time. Strategies are initially seen as novel – this curiosity can be used. Not all users will react well.

References

Nawyn, J., S. Intille, et al. (2006). Embedding behavior modification strategies into a consumer electronic device: A case study. 8th International Conference on Ubiquitous Computing: 297-314.

Different perspectives on the purpose of the LMS

Antonio Vantaggiato gives one response to a post from Donald Clark titled “Moodle: e-learning’s frankenstein”. Clark’s post is getting a bit of traction because it is being seen as a negative critique of Moodle.

I think part of this problem is the failure to recognise the importance of the perceived purpose that Moodle (or any LMS) is meant to serve. Just in my local institution, I can see a number of very different perceptions of the purpose behind the adoption of Moodle.

In the following I’m stealing bits of writing I’ve done for the thesis; some of it has appeared in previous posts. This probably makes the following sound somewhat pretentious, but I’ve gotta get some use out of the &%*#$ thesis.

The importance of purpose

Historically and increasingly, at least in my experience, the implementation of e-learning within universities has been done somewhat uncritically, with the information technology taken for granted and assumed to be unproblematic. This is somewhat surprising given the nature of universities and the role academics are meant to take. However, in my experience the selection of institutional LMSs is driven by IT and management with little room for critical thought or theory-informed decision making.

Instead they rely on a very techno-rational approach that takes a very narrow perspective of what technology is, how it has effects and how and why it is implicated in social change (Orlikowski and Iacono 2001). A different perspective is that technology serves the goals of those who guide its design and use (Lian 2000).

This is important because many, if not most, universities follow, or at least profess to follow, a purpose driven approach to setting strategic directions (McConachie, Danaher et al. 2005). The implementation of an LMS is being done to achieve specific institutional purposes. The very definition of a teleological design process is to set and achieve objectives, to be purpose driven (Introna, 1996). When an institution engages in selecting an LMS, the purpose is typically set by a small group, usually organisational leaders, who draw on expert knowledge to perform a diagnosis of the current situation in order to identify some ideal future state and how to get there.

Once that purpose is decided, everything the organisation does from then on is about achieving that purpose with maximum efficiency. By definition, any activity or idea that does not move the organisation closer to achieving its stated purpose is seen as inefficient (Jones and Muldoon, 2007).

Differences of purpose

Many of the folk responding to Clark’s post in defence of Moodle have their own notion of the purpose of Moodle, usually based on how they have used it. Others draw on the purposes espoused by the designer(s) of Moodle. There is little recognition that there exists a diversity of opinions about the purposes of Moodle.

A little of this diversity is represented in discussions about how Moodle is used in individual courses. For example, this comment mentions that Moodle does teacher-centered very well, i.e. if a teacher sees the purpose of a course site as being to distribute information, Moodle can do that. This comment makes the point that Moodle is a tool; the pedagogy is not about the tool, it is about the approach.

Now, while to some extent that is true, I also agree with Kallinikos (2004) that systems can have profound effects on the structuring of work and the forms of human action they enable or constrain.

While Moodle’s designers may have all sorts of wonderful intents for Moodle, within a university the purpose assigned to Moodle by the people implementing and supporting it plays a significant part. The processes, structures etc. that they put around Moodle within an institutional setting can enable or constrain both the purpose seen by the Moodle designers and the purpose seen by the staff and students who will use it.

Moodle/LMS as an integrated enterprise system

Due to the complexity of implementing Moodle for a largish organisation, the people driving Moodle implementations within universities are usually IT folk. It is my suggestion that the purpose they perceive for Moodle is that of an integrated, enterprise system. A university’s LMS forms the academic system equivalent of enterprise resource planning (ERP) systems in terms of pedagogical impact and institutional resource consumption (Morgan 2003).

To slightly paraphrase Siemens (2006), the purpose of an LMS for the institution is to provide the organisation with the ability to produce and disseminate information by centralising and controlling services. The LMS model with its nature as an integrated, enterprise system fits the long-term culture of institutional information technology and its primary concern with centralizing and controlling information technology services with a view to reducing costs (Beer and Jones 2008).

An LMS is designed to provide an organisation with all the tools it will need for e-learning. Weller, Pegler and Mason (2005) identify two approaches to the design of an LMS: the monolithic or integrated approach, and the best-of-breed approach. The monolithic approach is the predominant approach and seeks to provide all common online learning tools in a single off-the-shelf package (Weller, Pegler et al. 2005).

The evidence of this purpose can be seen when you go to your LMS folk and say “I’d like to do X”. The response will generally not be about the best marriage of pedagogy and technology (the best blended learning) to achieve your goal. It will generally be “how to do X in the LMS”, regardless of how much extra work, complexity and just plain inappropriateness doing X in the LMS requires.

If all you have is an LMS, every pedagogical problem is solved by the application of the LMS.

What should the purpose of an LMS be?

BIM is a representation of what I think the purpose of an LMS should be. i.e. the LMS should provide the services that are necessary/fundamental to the university/institution, and only those. Increasingly, most of the services should be fulfilled by services and resources that students and staff already use and control.

BIM provides academics teaching a course a way to aggregate blog posts from students and, if they want to, mark them. Those marks are integrated into the Moodle gradebook. The assumption is that marking/accreditation is one of the main tasks a university performs and that there aren’t external services that currently provide that service.

There are, however, a great many very good and free blog services. So students use their choice of blog provider (or something else that generates RSS/Atom) to create and manage their contributions.
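
To make the aggregation idea a little more concrete: BIM itself is a Moodle (PHP) module and the following is definitely not its code, just a rough Python sketch of the kind of “glue” involved. It assumes the third-party feedparser library and a hypothetical mapping of students to the RSS/Atom feed of whatever blog service they have chosen.

    import feedparser  # third-party library: pip install feedparser

    # Hypothetical example data – in practice students register their own feed URLs.
    student_feeds = {
        "student_one": "https://example.wordpress.com/feed/",
        "student_two": "https://example.blogspot.com/feeds/posts/default",
    }

    def aggregate_posts(feeds):
        """Fetch each student's feed and return their posts as (title, link, published) tuples."""
        posts = {}
        for student, url in feeds.items():
            parsed = feedparser.parse(url)
            posts[student] = [
                (entry.get("title", ""), entry.get("link", ""), entry.get("published", ""))
                for entry in parsed.entries
            ]
        return posts

    if __name__ == "__main__":
        for student, entries in aggregate_posts(student_feeds).items():
            print(student, "has", len(entries), "posts mirrored locally")

Marking those aggregated posts and pushing results into the gradebook then stays inside the institution’s systems; only the aggregation reaches out to the services the students already use.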

The purpose of the LMS isn’t to provide all services, just those that are required for the institution’s tasks.

Eventually, the term LMS becomes a misnomer. The system isn’t about managing learning. It’s about providing the glue between what the institution has to provide and what the learners are using. The purpose becomes achieving the best mix of pedagogy and technology, rather than working out how to use the LMS.

This perspective obviously has connections with Jon Mott’s (2010) article and the various folk who have written about this previously.

References

Beer, C. and D. Jones (2008). Learning networks: harnessing the power of online communities for discipline and lifelong learning. Lifelong Learning: reflecting on successes and framing futures. Keynote and refereed papers from the 5th International Lifelong Learning Conference, Rockhampton, Central Queensland University Press.

Jones, D. and N. Muldoon (2007). The teleological reason why ICTs limit choice for university learners and learning. ICT: Providing choices for learners and learning. Proceedings ASCILITE Singapore 2007, Singapore.

Kallinikos, J. (2004). “Deconstructing information packages: Organizational and behavioural implications of ERP systems.” Information Technology & People 17(1): 8-30.

Introna, L. (1996). “Notes on ateleological information systems development.” Information Technology & People 9(4): 20-39.

Lian, A. (2000). “Knowledge transfer and technology in education: Toward a complete learning environment.” Educational Technology & Society 3(3): 13-26.

McConachie, J., P. Danaher, et al. (2005). “Central Queensland University’s Course Management Systems: Accelerator or brake in engaging change?” International Review of Research in Open and Distance Learning 6(1).

Morgan, G. (2003). Faculty use of course management systems, Educause Centre for Applied Research: 97.

Mott, J. (2010). “Envisioning the Post-LMS Era: The Open Learning Network.” EDUCAUSE Quarterly 33(1).

Orlikowski, W. and C. S. Iacono (2001). “Research commentary: desperately seeking the IT in IT research – a call to theorizing the IT artifact.” Information Systems Research 12(2): 121-134.

Siemens, G. (2006). “Learning or Management System? A Review of Learning Management System Reviews.” from http://ltc.umanitoba.ca/wordpress/wpcontent/uploads/2006/10/learning-ormanagement-system-with-reference-list.doc.

Weller, M., C. Pegler, et al. (2005). “Students’ experience of component versus integrated virtual learning environments.” Journal of Computer Assisted Learning 21(4): 253-259.

The Ps Framework and the mismatch created by the product and process of industrial e-learning

Last year I gave a couple of presentations titled “Alternatives for the institutional implementation of e-learning”. In those presentations I essentially argued that what passes for current practice of e-learning in universities – what I now call “industrial e-learning” – suffers from a significant mismatch which, I believe, contributes to most of its limitations. My argument is that the nature of the product (the LMS) and the process (teleological) adopted in industrial e-learning are completely unsuited to the nature of the people involved in e-learning, the nature of learning and teaching, and the nature of universities and the context in which they operate.

Anyone who has been following this blog for some time will recognise that sentiment in any number of posts. But this one is a little different.

This thinking has formed the basis for and arisen out of my PhD thesis which I’m trying to rapidly finish. Over the last couple of weeks I’ve been cutting down the enormous Chapter 2 (literature review) that I wrote last year into something approaching an acceptable size. The basis for this cutting has become the argument used in those presentations.

It is with some relief that I release into the world Chapter 2 of my thesis. It’s an attempt to make the above argument in a formal way and set up the space for my contribution in the form of an information systems design theory (i.e. a better way) for e-learning.

The current state of this draft is that it basically fits together. I’m walking away from it for a couple of weeks to get some head space, before I come back to it and polish it up. Any and all feedback is welcome.

Design processes for teaching

The following is a first draft of a section from my thesis. It will form part of the newly cut down section on Process within chapter 2 (170 pages down to 50). The following tries to say something about the design processes used for teaching within universities. It starts with a characterisation of instructional design, looks at the limitations (referring to some earlier work about teleological and ateleological processes) and seeks to describe what the literature has to say about how teaching academics actually design/plan their courses.

As with previous thesis drafts, this is an early draft, I’ll re-edit and improve later, but thought I’d get this out there.

Design processes for teaching

Having introduced a framework for understanding different types of processes and examined the institutional strategic and learning and teaching processes used by universities, this section examines the types of processes used to plan, develop and run individual university courses or units. It starts with a description of what this type of process embodies and examines the ideal instructional design process, before describing what is known about the processes actually used by individual university academics. The same tendency toward teleological processes is seen, as are the same problems and limitations that arise from such processes within a context that shows significant diversity, uncertainty and human agency.

This section and its topic, while somewhat related to the discussion on Pedagogy (Section cross reference), has a different focus. Pedagogy focuses on what is known about learning and how to improve the learning that occurs within universities. This section examines the processes used to design learning as embodied in individual university courses or units. Without question this design process should be informed by knowledge of pedagogy, but the process itself is worthy of description as there are differing options and perspectives.

Reigeluth (1983) defines instructional design as a set of decision-making procedures that, given a set of outcomes for students to achieve and knowledge of the context within which they will achieve them, guides the choice and development of effective instructional strategies. Reiser (2001) describes how the field of instructional design arose out of the need for large groups of psychologists and educators to develop training materials for the military services. Given the backgrounds and skills of these people, the materials were developed using instructional principles derived from theory about instruction, learning and human behaviour (Reiser 2001). After the war this work continued and, increasingly, training was viewed as a system to be designed and developed using specialised procedures (Reiser 2001).

Models of instructional design still have a strong connection to the models developed in the 1950s based on the ADDIE (Analyse, Design, Develop, Implement, Evaluate) process (Irlbeck, Kays et al. 2006). ADDIE is a framework designed for objectivist epistemologies where front-end analysis precedes the development of curricular content (Der-Thanq, Hung et al. 2007). The learning theory used to inform instructional design has moved on from its behaviourist origins, through cognitivism and constructivism and slowly into connectivism. However, Winn (1990) identifies three areas where behaviourism still exerts power over the processes used by instructional design: the reductionist premise that if you can identify the parts, then you can teach the whole; the separation of design from implementation; and the assumption that good procedures, applied correctly, result in good instruction.

Visscher-Voerman and Gustafson (2004) identified four different paradigms for instructional design – instrumental, communicative, pragmatic and artistic – with ADDIE situated within the instrumental paradigm. They found, in confirmation of other studies, that the instrumental paradigm has dominated instructional design and that there are questions about its relevance given recent epistemological and technical developments (Visscher-Voerman and Gustafson 2004). Evidence of this dominance can be found in more recent conceptions of instructional design such as constructive alignment (Biggs 1999). Constructive alignment is based on constructivist theories of learning (Entwistle 2003), focuses on a shift from teacher-centred to student-centred teaching (Harvey and Kamvounias 2008), and is an example of outcomes-centred design. Outcomes-centred design is a four-step process: definition of learning outcomes; design of assessment tasks through which students demonstrate achievement; design of learning activities through which students develop the appropriate skills; and identification of the content that will underpin the learning activities (Phillips 2005).

This teleological view of the instructional design process has a number of flaws, many of which arise from the teleological nature of the process. Table 2.1 draws on literature around learning, teaching and instructional design to illustrate that it can be argued that Introna’s (1996) three necessary conditions for a teleological process do not hold in the instructional design context. The following describes other criticisms of this teleological approach to instructional design that have arisen from the literature, including the observation that it does not match what is known about how teaching academics actually plan their courses.

Table 2.1 – Suggestions that instructional design does not satisfy Introna’s (1996) three necessary conditions for teleological processes

Necessary condition: a stable and predictable system
  • Discipline categories bring differences (Becher and Trowler 2001) and are social constructions, subject to change from within and between disciplines.
  • If a student finds a learning strategy troubling, the student can switch to another at will. The designer could not have predicted which strategy the student would actually use (Winn 1990).
  • Traditional instructional design is not responsive enough for a society characterised by rapid change (Gustafson 1995).

Necessary condition: the ability to manipulate behaviour
  • A change in student strategy can circumvent the intent of the design, unless the design is extremely adaptable (Winn 1990).
  • Human behaviour is unpredictable, if not indeterminate, which suggests that attempts to predict and control educational outcomes cannot be successful (Cziko 1989).
  • Academic freedom in teaching refers to the right to teach a course in the way the academic feels reasonable (Geirsdottir 2009).
  • Most teachers believe they have considerable autonomy in course planning (Stark 2000).

Necessary condition: the ability to accurately determine goals
  • Curriculum decision making is characterised by conflict and contradictions, and by attempts to guard the interests and power relations within the disciplinary community (Henkel and Kogan 1999).
  • As students learn, their mental models change and hence decisions about instructional strategies made now would be different from those made initially (Winn 1990).
  • Influences on the choice of teaching approaches adopted are clearly more complex than any simple analytic model can convey (Entwistle 2003).
  • It cannot be assumed that everything is planned in advance (Levander and Mikkola 2009).
  • In the real world, no-one is sure what the instructional goals should be (Dick 1995).

The above description of instructional design as a teleological process represents the dominant, but not the only, paradigm of instructional design. It is representative of the more homogeneous view of instructional design built around the ADDIE framework (Visscher-Voerman and Gustafson 2004). While instructional designers do apparently use process-based instructional design models (e.g. ADDIE), the majority of their time is not spent working within such processes, nor do they follow them in a rigid fashion (Kenny, Zhang et al. 2005). The design processes used by instructional designers are much more heterogeneous and diverse (Rowland 1992). Dick (1995) suggests that models such as ADDIE are ultimately judged on their usefulness, not on whether they are good or bad.

Models such as ADDIE are most useful in the systematic planning of major revisions of an existing course or the creation of a new course. However, traditional university academics spend relatively little time in systematic planning activities prior to teaching an existing course (Lattuca and Stark 2009). A significant reason for this is that academics are not often required to engage in the development of new courses or major overhauls of existing courses (Stark and Lowther 1988). The predominant practice is teaching an existing course, often a course the academic has taught previously. When this happens, academics spend most of their time fine-tuning the course or making minor modifications to material or content (Stark 2000).

It is also known that academics’ practice is not described by a rational planning model, generally starts with content rather than explicit course objectives, and does not separate planning from implementation (Lattuca and Stark 2009). Since academics have traditionally not been required to document their teaching goals for a course ahead of time, it is possible that the actual teaching and learning that occurs is more in line with the teacher’s implicit, internalised knowledge than with that described in published course descriptions (Levander and Mikkola 2009). Formal descriptions of the curriculum do not necessarily provide much understanding about how teachers put their curriculum ideas into action (Argyris and Schon 1974).

As stated earlier, the instructional design process can be seen as drawing on knowledge of learning and instructional design to identify appropriate instructional strategies to achieve required outcomes within a given context. Most university academics do not have this knowledge of learning and instructional design. In addition, these staff rarely read the educational literature or call upon any available expert assistance when planning a course (Stark 2000). In the absence of formal qualifications or knowledge in this area, most academics teach in the ways they were taught (Phillips 2005) and/or in ways which fit with disciplinary norms and their recent teaching experience (Entwistle 2003). These in turn influence the conceptions of teaching and learning held by academics, which in turn influence their approaches to teaching, as described in the significant body of literature discussed in more detail in (cross reference to just after Figure 2.2 and section 2.7.1).

In seeking to describe what is known about the approaches to teaching used by academics, Richardson (2005) developed the integrated model shown in Figure 2.2. While useful, Entwistle (2003) suggests that such analytic models are too simple to capture the full complexity of the decision making that occurs when choosing teaching approaches. Stark (2000) suggests that instructional design is not only a science, but also a creative act, linked to teacher thinking that must be examined contextually, meaning that it is not amenable to a single formula or prescription. Or, perhaps, to a teleological process.

Figure 2.2 - An integrated model of teachers' approaches to teaching, conceptions of teaching and perceptions of the teaching environment (Richardson 2005)

References

Argyris, C. and D. Schon (1974). Theory in practice: Increasing professional effectiveness. Oxford, England, Jossey-Bass.

Becher, T. and P. Trowler (2001). Academic tribes and territories: Intellectual enquiry and the culture of disciplines. Buckingham, SRHE and Open University Press.

Biggs, J. (1999). Teaching for quality learning at university. Buckingham, Open University Press.

Cziko, G. A. (1989). "Unpredictability and indeterminism in human behavior: arguments and implications for educational research." Educational Researcher 18(3): 17-25.

Der-Thanq, C., D. Hung, et al. (2007). "Educational design as a quest for congruence: The need for alternative learning design tools." British Journal of Educational Technology 38(5): 876-884.

Dick, W. (1995). Enhanced ISD: A response to changing environments for learning and performance. Instructional design fundamentals: a reconsideration. B. Seels. Englewood Cliffs, Educational Technology: 13-20.

Entwistle, N. (2003). Concepts and conceptual frameworks underpinning the ETL Project. Occasional Report 3. Edinburgh, University of Edinburgh.

Geirsdottir, G. (2009). We are caught up in our own world: conceptions of curriculum within three different disciplines at the University of Iceland, Iceland University of Education. PhD: 326.

Gustafson, K. (1995). Instructional design fundamentals: clouds on the horizon. Instructional design fundamentals: a reconsideration. B. Seels. Englewood Cliffs, Educational Technology: 21-30.

Harvey, A. and P. Kamvounias (2008). "Bridging the implementation gap: a teacher-as-learner approach to teaching and learning policy." Higher Education Research & Development 27(1): 31-41.

Henkel, M. and M. Kogan (1999). Changes in Curriculum and Institutional Structures: Responses to Outside Influences in Higher Education Institutions. Innovation and adaptation in higher education. C. Gellert. London, Jessica Kingsley Publishers: 66-92.

Introna, L. (1996). "Notes on ateleological information systems development." Information Technology & People 9(4): 20-39.

Irlbeck, S., E. Kays, et al. (2006). "The Phoenix Rising: Emergent models of instructional design." Distance Education 27(2): 171-185.

Kenny, R., Z. Zhang, et al. (2005). "A review of what instructional designers do: Questions answered and questions not asked." Canadian Journal of Learning and Technology 31(1): 9-26.

Lattuca, L. and J. Stark (2009). Shaping the college curriculum: Academic plans in context. San Francisco, John Wiley & Sons.

Levander, L. and M. Mikkola (2009). "Core curriculum analysis: A tool for educational design." The Journal of Agricultural Education and Extension 15(3): 275-286.

Phillips, R. (2005). "Challenging the primacy of lectures: The dissonance between theory and practice in university teaching." Journal of University Teaching and Learning Practice 2(1): 1-12.

Reigeluth, C. M. (1983). Instructional design: what is it and why is it? Instructional design theories and models. C. M. Reigeluth. Hillsdale, NJ, Lawrence Erlbaum Associates.

Reiser, R. (2001). "A history of instructional design and technology: Part II: A history of instructional design." Educational Technology Research and Development 49(2): 57-67.

Richardson, J. (2005). "Students’ approaches to learning and teachers’ approaches to teaching in higher education." Educational Psychology 25(6): 673-680.

Rowland, G. (1992). "What do instructional designers actually do? An initial investigation of expert practice." Performance Improvement Quarterly 5(2): 65-86.

Stark, J. (2000). "Planning introductory college courses: Content, context and form." Instructional Science 28(5): 413-438.

Stark, J. and M. Lowther (1988). Strengthening the Ties That Bind: Integrating Undergraduate Liberal and Professional Study. Ann Arbor, MI, Professional Preparation Project.

Visscher-Voerman, I. and K. Gustafson (2004). "Paradigms in the theory and practice of education and training design." Educational Technology Research and Development 52(2): 69-89.

Winn, W. (1990). "Some implications of cognitive theory for instructional design." Instructional Science 19(1): 53-69.

Implications arising from the absence of the "sameness of meaning"

A few days ago Stephen Downes – a little unusually – made a sequence of comments/tweets on Twitter around the “sameness of meaning” and its impossibility. Since then I’ve had a number of experiences and discussions that suggest some of the problems associated with learning and teaching policy, process and structure within universities arise because too many people assume that there is a sameness of meaning.

Communication and the commonality of meaning

Let’s start with this tweet

Communication isn’t about commonality of meaning. That’s impossible. It’s about being able to at least approximately predict the response.

Further explanation flows from this tweet

What happens is, the word we transmit (and maybe gestures, etc) forms only a small part of the other person’s understanding of what you said

and this one

Your word is only a stimulus; Most of the person’s understanding is based on his prior knowledge, and that is what produces the response

And I particularly like this one as a guideline for how to move forward

“Get me a gazelle” would work just as well if your listener understood that he should deliver a Heineken; meaning doesn’t matter, results do

Implications for design

This speaks to me because much of what I do could be broadly called “design”. Mostly it’s around the design of information systems. This means much more than technology. An information system (in the sense I am using) also embodies all the other “wetware” (i.e. people and organisational) stuff required for the technology to be used, and used effectively.

This definition means that I include the following as design:

  • The design and implementation of training and support for the system.
  • The creation of the policies and procedures around the system.
  • The design of the organisational structures and positions within those structures that will impact on the system.
  • How people are encouraged to make decisions about the system.

As I wrote previously (Jones, 2004) – and really just repeating what others had already said – about the impact of representation/meaning

The formulation of the initial state into an effective representation is crucial to finding an effective design solution (Weber, 2003). Representation has a profound impact on design work (Hevner et al., 2004), particularly on the way in which tasks and problems are conceived (Boland, 2002). How an organisation conceptualises the e-learning problem will significantly influence how it answers the questions of how, why and with what outcomes

The answers that a university arrives at in terms of the how, why and with what outcomes end up embodying a collection of meanings. When the organisation and its members implement e-learning they too often assume that there is a commonality of meaning. Commonality of meaning is a key part of how they represent the system. Consequently, their design is fundamentally based on the idea of commonality of meaning. I think that this is a fatal flaw for much of what is designed.

What follows are some examples of where it doesn’t hold.

Software design

My main current task is the design of BIM (code should be out by Monday at the latest) and today was a day to watch a “clueless user” (she’s actually quite intelligent; she just knows little about computers and BIM) interact with BIM. BIM is designed by me. It embodies the meaning that I have formed about BIM and its task over the 3 years or so I’ve been working on it. It also embodies meanings/ideas/understandings that have formed over the last 12 to 15 years of doing e-learning and developing e-learning systems. That same meaning is informed by my experiences in social media (e.g. this blog).

The “clueless user” is a sessional teaching academic in management/human resources. She’s done a bit of e-learning and used BAM. She doesn’t have very much in the way of detailed mental models about how her computer works, how the Internet works, or how Moodle, BIM, blogs and feeds work (or even what they mean).

Needless to say, having observed the user and the meanings of BIM she demonstrated, I have a long list of improvements for the interface and operation of BIM. If some of them aren’t made, the other academics who are going to be using BIM are going to struggle. Understanding her meaning and responding to it has been helpful. It has forced me to reconsider, and hopefully improve, BIM to better fit with other meanings.

Of course, 1 person does not make a universe. But that 1 person being very different from me will help a bit.

Downes tweeted

… and my take-away is that we should be careful not to assume that people see things the same way we do, because invariably they don’t

If I’d assumed the same meaning and left BIM as it was, there would have been trouble for someone. The academics using the unmodified BIM would have suffered increased levels of frustration in dealing with a new system whose embodied meaning they did not understand. There will still be some of that, but hopefully not as much.

Downes, on the implications of this

there’s so much room for error in communication we don’t notice that we mean different things, usually, and then it surprises us when we do

I’m a little bit surprised by the level of changes needed in BIM. It’s based on a system that’s been used for 3 years, including by this same “clueless user”.

But by engaging in what I’ve done I’ve opened myself up to that surprise at a much earlier stage, where it is simpler for me to respond. Too much of how e-learning is implemented in universities does not allow itself to be exposed to “good” surprise; instead institutions get “bad” – often hugely problematic – surprises.

Minimum service standards

I know of an institution that has implemented minimum service standards for course websites. The standards have been approved by all the right committees, the designers of the standards have written a paper about them, there has been mention of them in some of the training sessions for staff, and it is now a couple of weeks out from the start of the first term using these standards.

The meaning being heard from the designers of these standards, at least until very recently, has been “it’s all good”. The meaning being heard from the academic staff now being required to fulfill the standards and complete the accompanying checklist includes: “Where did this come from?” and “How do we comply with it?”

Even some of the designers and promulgators of the standards have different meanings. Perhaps the two extremes of those meanings are:

  • The standards are a stick with which to identify the bad teachers.
  • The standards provide a scaffold within which to have discussions about the design of learning experiences within the LMS.

Now, will this difference of meaning result in a “bad” surprise? I’m not so sure. I think organisations, and how they choose to perceive the world, have a lot in common with what Downes says about communication

In fact, we mostly don’t detect the errors, there’s a huge tolerance for error in communication, that’s why it works

LMS training

I would characterise the standard approach used to “train” academics how to use a new LMS – or any new system – as:

  • In the months before the release of the system hold numerous training sessions in places and at times that suit the academics.
  • Have the supervisors of the academics, and especially the senior management of the institution, reinforce how important it is to attend these sessions.
  • Within the session seek to get the academics to understand the meaning embodied into the system so they can interact with it.
  • Provide these sessions at a time and place removed from the normal context within which the academics will use the system.
  • Employ a range of technical folk who can easily understand the meaning of the system to explain it to the academics in a way that is very similar to how the technical folk learned it.
  • Assume that at the completion of the training academics will only need a much lower level of support and training, generally limited to repeating the original training for new staff and providing front-line helpdesk staff to explain how any problems are due to the academic misunderstanding the meaning embodied in the system.

Can you see how the lack of a commonality of meaning is going to cause problems here? To me it’s obvious that the academics will not get the meaning embodied in the system.

The “clueless user” I mentioned above expressed this understanding of the training she experienced.

I did the training in the first batch. Over 6 months ago. I haven’t touched the LMS since. How much do you think I remember?

If there is no commonality of meaning, then what?

Downes’ suggestion is (remember he’s thinking/tweeting in a different context, but I think it applies)

You need to experiment- Wittgenstein called it a game – to test and feel to see what word evokes what response- there is no common ‘meaning’

Given the impossibility of any commonality of meaning and the huge complexity and diversity of the meaning associated with e-learning, learning, teaching, universities, people and technology, the processes within universities and e-learning should be aimed much more at experimentation, at sharing of meaning, at encouraging surprise and enabling effective response and interaction.

What if the assumption of commonality of meaning remains? You keep operating as if there was commonality of meaning? Downes

if ‘sameness of meaning’ were required, communication would grind to a halt.

Is there value in strategic plans for educational technology?

Dave Cormier has recently published a blog post titled Dave’s wildly unscientific survey of technology use in Higher Education. There’s a bunch of interesting stuff there. I especially like Dave’s note on e-portfolios

eportfolios are a vast hidden overhead. They really only make sense if they are portable and accessible to the user. Transferring vast quantities of student held data out of the university every spring seems complicated. Better, maybe, to instruct students to use external services.

Mainly because it aligns with some of my views.

But that’s not the point of this post. This morning Dave tweeted for folk to respond to a comment on the post by Diego Leal on strategic plans for educational technology in universities.

Strategic plans in educational technology are a bugbear of mine. I’ve been writing and thinking about them a lot recently. So I’ve bitten.

Summary

My starting position is that I’m strongly against strategic plans for educational technology in organisations. However, I’m enough of a pragmatist to recognise that – for various reasons (mostly political) – organisations have to have them. If they must have them, they must be very light on specifics and focus on enabling learning and improvement.

My main reason for this is a belief that strategic plans generally embody assumptions about organisations and planning that simply don’t hold within universities, especially in the context of educational technology. This mismatch results in strategic plans generally creating or enabling problems.

Important: I don’t believe that the problems with strategic plans (for edtech in higher education) arise because they are implemented badly. I believe problems with strategic plans arise because they are completely inappropriate for edtech in higher education. Strategic plans might work for other purposes, but not this one.

This mismatch leads to the following (amongst others) common problems:

  • Model 1 behaviour (Argyris et al, 1985);
  • Fads, fashions and band wagons (Birnbaum, 2000; Swanson and Ramiller, 2004)
  • Purpose proxies (Introna, 1996);
    i.e. rather than measure good learning and teaching, an institution measures how many people are using the LMS or have a graduate certificate in learning and teaching.
  • Suboptimal stable equilibria (March, 1991)
  • Technology gravity (McDonald & Gibbons, 2009)

Rationale

Introna (1996) identified three necessary conditions for the type of process embedded in a strategic plan to be possible. They are:

  • The behaviour of the system is relatively stable and predictable.
  • The planners are able to manipulate system behaviour.
  • The planners are able to accurately determine goals or criteria for success.

In a recent talk I argued that none of those conditions exist within the practice of learning and teaching in higher education. It’s a point I also argue in a section of my thesis

The alternative?

The talk includes some discussion of the principles of a different approach to the same problem. That alternative is based on the idea of ateleological design suggested by Introna (1996), an idea that is very similar to ideas found in broader debates in various other areas of research. This section of my thesis describes the two ends of the process spectrum.

It is my position that educational technology in higher education – due to its diversity and rapid pace of change – has to be much further towards the ateleological, emergent, naturalistic or exploitation end of the spectrum.

Statement of biases

I’ve only ever worked at the one institution (for coming up to 20 years) and have been significantly influenced by that experience. That experience has included spending 6 months developing a strategic plan for Information Technology in Learning and Teaching that was approved by the Academic Board of the institution, used by the IT Division to justify a range of budget claims, then thrown out/forgotten – and now, about 5 years later, many of its recommendations are being actioned. The experience also includes spending 7 or so years developing an e-learning system from the bottom up, in spite of the organisational hierarchy.

So I am perhaps not the most objective voice.

References

Argyris, C., R. Putnam, et al. (1985). Action science: Concepts, methods and skills for research and intervention. San Francisco, Jossey-Bass.

Birnbaum, R. (2000). Management Fads in Higher Education: Where They Come From, What They Do, Why They Fail. San Francisco, Jossey-Bass.

Introna, L. (1996). “Notes on ateleological information systems development.” Information Technology & People 9(4): 20-39.

March, J. (1991). “Exploration and exploitation in organizational learning.” Organization Science 2(1): 71-87.

McDonald, J. and A. Gibbons (2009). “Technology I, II, and III: criteria for understanding and improving the practice of instructional technology ” Educational Technology Research and Development 57(3): 377-392.

Swanson, E. B. and N. C. Ramiller (2004). “Innovating mindfully with information technology.” MIS Quarterly 28(4): 553-583.

Thoughts on "Insidious pedagogy"

The following is a reflection on and response to a paper by Lisa Lane (2009) in First Monday titled “Insidious pedagogy: How course management systems impact teaching”. I’ve been struggling to keep up with reading, but this topic is closely connected to my thesis and the presentation I’ll be giving soon.

The post starts with my thoughts and reactions to the paper and has a summary of the paper at the end.

My Thoughts

In summary, the paper basically seems to be based on

  • observing a problem; and
    In summary, the problem is that because most academics are not expert online technology users, they use course management systems (CMS) at a basic level by relying on system defaults. The system defaults in some CMS (e.g. Blackboard) are seen to encourage limited use and also to encourage academic staff to continue as novices. These novice staff produce learning environments that are less than appropriate, but they are also happy with the CMS.
  • proposing two bits of a solution.
    The two solutions are:
    • start novices with pedagogy;
      When introducing a CMS to technically novice academic staff, don’t start by examining the technical features of the CMS. This encourages them to stick with those features without considering pedagogy. Instead, start with pedagogy and work towards the tools.
    • have the CMS use opt-in, rather than opt-out.
      The default setting for an opt-out CMS is that all of the options are there, in the face of the academic. This can be confronting and lead novices to take more pragmatic approaches. An opt-in approach has fewer defaults, which encourages/requires the academic to think more holistically (see the sketch after this list).
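
To make the opt-in/opt-out distinction concrete, here is a small, purely illustrative Python sketch – the tool names and defaults are assumptions, not how any actual CMS stores its settings. An opt-out course starts with everything switched on and the academic removes what they don’t want; an opt-in course starts close to empty and tools are added deliberately.

    # Purely illustrative; tool names and defaults are assumptions, not any CMS's real configuration.
    ALL_TOOLS = {"content", "quiz", "gradebook", "forum", "chat", "wiki", "blog", "survey"}

    def opt_out_course(removed=()):
        """Opt-out: every tool is on by default; the academic switches tools off."""
        return ALL_TOOLS - set(removed)

    def opt_in_course(added=()):
        """Opt-in: start with a minimal core; the academic switches tools on."""
        return {"content"} | set(added)

    # A novice who touches nothing gets a very different course site under each model.
    print(sorted(opt_out_course()))  # all eight tools, all demanding attention
    print(sorted(opt_in_course()))   # just the minimal core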

I like the paper, especially its description of the problem. This is an important problem that is often overlooked. However, while there is some value in the solutions – the distinction between opt-in and opt-out is especially interesting – I wonder about the practicality of the “start with pedagogy” solution. Also, not surprisingly given such a complex problem, I think there are other factors to be considered.

Practicality of “start with pedagogy”

My institution is currently in the midst of adopting Moodle. It has implemented the organisationally rational approach of compulsory training sessions in Moodle, run by both IT trainers and curriculum designers. For various reasons, a number of the staff attending these sessions have asked a common question: “What’s the minimum I need to know?”. Such staff aren’t that interested in starting with pedagogy.

This raises an interesting point that I hadn’t thought of before. Given our institutional context, I believe that the number of true novices (i.e. those who have never used a CMS) amongst academic staff is very low. Many of these staff may well have very limited conceptions of e-learning from a pedagogical perspective; however, they have started to develop “their way” of teaching online. They are comfortable with that and all they want to know is how to replicate it in the new system.

In addition to this, most of the staff I know don’t start with pedagogy when they are designing their teaching. This can be due to not knowing about pedagogy, or simply for very pragmatic reasons. For example, if you are a casual, part-time staff member employed to teach an existing course, you are going to stick with what has been done before. You’re not being paid to do something different, and any problems that arise because of “difference” will not be treated well.

Other solutions

There are many other potential solutions, I will be talking about the main ones in a couple of weeks. Some other misc ones before I get back to work:

  • Engage web novice academics in the use of the Internet – especially social media – that further their career.
    e.g. Using social media to connect with other researchers, using blogs to become a “public intellectual”. This provides them with experience to be aware of different possibilities.
  • Modify the context of most universities to appropriately encourage a focus on improving learning and teaching.
    Are instructors motivated to spend more time on improving their teaching? What if they believe the following (Fairweather, 2005)

    More time teaching is a negative influence on academic pay….The trend is worsening most rapidly in institutions whose central mission focuses on teaching and learning

    Until universities truly value learning and teaching and treat it as such…….

  • Adopt a best of breed approach for the CMS.

Other thoughts

Other thoughts/responses include

Is Moodle really different?

Lane (2009) writes (emphasis added)

This is particularly true of integrated systems (such as Blackboard/WebCT), but is also a factor in some of the newer, more constructivist systems (Moodle).

This seems to accept the view that Moodle, being designed on social constructivist principles, is somehow different from and better than Blackboard, WebCT etc. I’m sorry to say that I haven’t seen anything significant while using Moodle that strongly shows those social constructivist principles.

I think there’s a really interesting research project around investigating this claim, how/if it is visible in the design of Moodle and how that claimed strength influences use of Moodle.

Today’s CMS can be customized

There’s a quote in the paper (Lane, 2009)

Today’s CMSs can be customized, changed and adapted

I question this a little. I think the point of the quote in the paper is made from the perspective of the academic, i.e. that when designing your course there is choice – an ability to customize your course in a variety of ways, enabled by the breadth of additional functionality that CMS vendors have provided.

I agree with that to some extent, however, this customization has some limits:

  • don’t break the model;
    All systems have an in-built model. You can only customize to the extent that you fit within the model. We had an experience in one course where we couldn’t create enough discussion forums in the right places for one pedagogical design. This was entirely due to the assumptions built into the CMS about how discussion forums would work. It broke the model, so we couldn’t do it.
  • your installation allows it.
    There is an important distinction to be made between what the CMS allows you to customize and what the particular installation of the CMS you are using allows you to customize. The decisions made by specific institutions can further constrain the level of customization. The simplest example is the choice at the institutional level not to install “module X”. But in some CMS there are also installation level configuration decisions that constrain customization.

I’ve argued elsewhere that the basic model of a CMS is based on that of an integrated, enterprise system – a product model well known to be inflexible. In fact, best practice information systems literature suggests that for such systems you must “implement vanilla” to minimise costs.

Designed to focus on instructor efficiency?

The paper (Lane, 2009) includes the following claim about the design of CMSs

Today’s enterprise–scale systems were created to manage traditional teaching tasks as if they were business processes. They were originally designed to focus on instructor efficiency for administrative functions such as grade posting, test creation, and enrollment management.

My position is that most of them were very badly designed to do this, if they were at all designed to do this. I’ve heard lots of folk explain that if you have a class for 30 or 40, then the commercial CMSs work fine, but if you have 800, you are buggered.

The first version of WebCT installed at my institution had an internal limit on the number of students that could be managed within the gradebook – 999. If you had 1,000 or more students in a class, you were stuffed. My institution had classes that big.

The nature of my current institution – courses having upwards of 20 different teaching staff spread across the eastern coast of Australia – means that online assignment submission and management is an important task. Experience of staff here is that the assignment submission system in Blackboard is really bad in terms of efficiency. Early indications are that the default Moodle system is just as bad. A locally produced system is significantly more efficient.

All of this seems to bring into question the “efficiency” aspect of CMS. They don’t even do that well. We should write something on this.

Paper summary

The following is a quick summary of the paper

Introduction

Nice quote from Thoreau which I might have to steal

But lo! men have become the tools of their tools.

Draws on a historian’s view to argue that technologies tend to have a purpose/objective that can limit or even determine their use.

Course Management Systems (CMS) also do this, through the defaults in those systems. Other literature tends not to focus on this. The paper suggests that

A closer look at how course management systems work, combined with an understanding of how novices use technology, provides a clearer view of the manner in which a CMS may not only influence, but control, instructional approaches.

The inherent pedagogies of CMSs

CMSs are designed mostly for administrative purposes. The built-in pedagogy is essentially based on presentation and assessment. The design of these systems makes it simple to perform presentation and assessment tasks.

That said, CMSs have been expanded to include other features, and this expansion continues. Suggests that CMSs can be customized, changed and adapted. But why aren’t faculty tinkering with the CMS to make their individual pedagogies work online?

Novice web users and the CMS

Most academics are not web-heads. Most are drafted to teach online on the basis of top-down directives. Lots more references to explain that they aren’t savvy with technology. At the same time, most have established successful learning approaches over time.

Interesting points about how much academics use the same research methods they learned in graduate school. Can expand here.

Experts and novices are different.

The fault of the defaults

Basically argues that the defaults of the CMS aren’t designed to make things easy for academics or to fit with their expectations and experience. As they spend more time with the system, they become comfortable with the defaults.

Important: makes the point about the difference of perspective between educational technologists and academics, especially how they view the CMS.

Novices are happier with CMS because – to put it bluntly – they don’t know better. It’s the folk pushing the boundaries that are less satisfied with CMS.

Solutions to CMS dominance

Treat novices differently from advanced instructors. With novices, emphasise pedagogy first. Argues that starting with technology features focuses on the novice instructor’s weakness (technological literacy) at the expense of their main strength (expertise in their discipline and their teaching).

Also suggests that “opt-out” systems – systems that show all the tools and features and expect users to choose which they don’t want – are too overwhelming for novices. Suggests that opt-in systems – like Moodle – are better, especially in the way they give similar emphasis to discussions and content transmission.

References

Fairweather, J. (2005). “Beyond the rhetoric: Trends in the relative value of teaching and research in faculty salaries.” Journal of Higher Education 76(4): 401-422.

Lane, L. (2009). “Insidious pedagogy: How course management systems impact teaching.” First Monday 14(10).

Lessons from Pedagogy for e-learning

Two thesis related posts in a day, I must be on a roll. This post actually marks a milestone, the following rough bit of material is the last bit of original writing I’ll need to do for chapter 2. What remains will be tidying up, fixing typos/spelling/grammar, “concludings” and some major cutting. Sadly chapter 2 currently stands at 200+ pages and will need some major cutting I think to be a reasonable size. That’s a job for another day.

The following is meant to abstract some lessons for e-learning based on the literature around pedagogy reviewed in earlier sections (e.g. the one from earlier today). It continues my focus on diversity and change as key characteristics of e-learning, an observation that highlights a mismatch with the standard product and process being used for e-learning.

As I near the end there are an increasing number of cross-references from this material to earlier material. Sorry, I haven’t gotten around to linking them on the blog. This is likely to be only somewhat less annoying than the poor grammar and dyslexic typing.

Lessons from Pedagogy for e-learning

The above brief overview of the Pedagogy component of the Ps Framework forms the basis for the identification of four lessons for e-learning within universities from the literature on pedagogy. The first of these is that learning is an inherently diverse human activity. The second is that e-learning is a relatively new human activity that is still changing and adding to the diversity of learning. The third lesson, based on this observation of increasing diversity, is that there is no silver bullet, no one universal approach to learning or to e-learning, and that instead e-learning should perhaps focus on its ability to support this diversity. The final lesson is that any change in learning and teaching at university relies on changing the conceptions held by academics.

Learning is inherently diverse

Dede (2008) raises the question of whether there is just one pre-eminent way of learning/teaching for every student, for every subject and for all legitimate purposes of schooling. Like everything else in education, a balance is needed – one size does not fit all – even in online settings (Cuthrell and Lyon 2007). Different learners bring to the learning experience different learning objectives, different prior knowledge and past experience, and different cognitive preferences (Dagger, Conlan et al. 2005). The diversification and massification of the student body has led universities to shift their education rhetoric from a notion of “one size fits all” to a concept of tailored, flexible learning (Lewis, Marginson et al. 2005). Learning should not be one size fits all; it can be customised to meet local requirements, and this deviation from a standard model should be seen as a strength (Cavallo 2004). A “one size fits all” approach also ignores the importance of disciplinary culture (Jones 2009), and there is no one best way of developing instruction (Davies 1991). Dede’s (2008) answer to his own question is that, given the spectrum of learning theories, “learning is a human activity quite diverse in its manifestations from person to person”. He goes on to suggest that the field of instructional design can only progress if it recognises that this diversity exists not only from person to person but even from day to day (Dede 2008).

E-learning is new and changing

To some extent, Bates (2004) is correct that e-learning does not change the fundamental process of learning: students still need to read, observe, think, discuss, practice and receive feedback. However, e-learning is creating a new environment within which learning and teaching operate, and it is contributing to the creation of, and need for, new knowledge about learning and teaching. There is little understanding of the affordances of different technologies and how these might be exploited in particular learning and teaching contexts (Conole and Dyke 2004). There is a need to engage with the affordances and constraints of particular technologies to understand how new technologies can meet specific pedagogical goals in specific content areas (Mishra and Koehler 2006). The rise of e-learning is calling for, and generating, more than knowledge simply to inform instructional design theories. With the example of connectivism, it is possible to see new knowledge, enabled or required to some extent by the rise of technology, being generated at the other three levels of learning theories identified in Section 2.1.2.

E-learning, diversity and silver bullets

The diversity inherent in learning is not matched by the theories and philosophies around the use of information and communication technologies to support learning. Such approaches treat learning as a simple activity that is relatively invariant across people, subject areas and educational objectives; consequently, most widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant (Dede 2008). The apparent high cost of developing educational materials means that, at least for for-profit organizations, a “one size fits all” approach produces economies of scale that are likely to prevail over the potential of online technologies to support customisation for the needs of individual learners (Cunningham, Ryan et al. 2000). This tendency towards one size fits all is reinforced by successive generations of pundits espousing ‘magical’ media, the single best medium for learning or the universally optimal way of learning (Dede 2008).

The difference and diversity inherent in learning challenge managerialism – a rising trend within higher education as shown in Society in Place (cross reference) – which generally seeks to elide ambiguities and to standardise individuals and experiences (Danaher, Luck et al. 2004). The managerialist approach to standardisation is well served by the monolithic or integrated product model on which learning management systems are based (cross reference to procurement and software section in Product). Innovation and diversity are served less well by such a product model. Dede (2008) argues that

from an instrumental perspective, the history of tool making shows that the best strategy is to have simultaneously available a variety of specialized tools, rather than a single device that attempts to accomplish everything.
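
In software terms, this is the difference between a single integrated product and a collection of small, specialised tools that each course can compose differently. The following is a minimal, hypothetical sketch of that contrast; the function and class names are invented for illustration and do not represent any actual LMS.

```python
# A minimal, hypothetical sketch of Dede's contrast: a single "device that
# attempts to accomplish everything" versus small, specialised tools that
# can be composed per course. All names are invented for illustration.

from typing import Callable, List

# --- the "variety of specialized tools" approach: small, single-purpose ---
def render_content(course: dict) -> str:
    return f"Content pages for {course['name']}"

def host_discussion(course: dict) -> str:
    return f"Discussion forum for {course['name']}"

def aggregate_blogs(course: dict) -> str:
    return f"Aggregated student blogs for {course['name']}"

def compose(tools: List[Callable[[dict], str]], course: dict) -> List[str]:
    """Each course assembles only the tools its pedagogy needs."""
    return [tool(course) for tool in tools]

# --- the monolithic approach: one system delivers the same fixed bundle ---
class MonolithicLMS:
    def deliver(self, course: dict) -> List[str]:
        # every course gets the same bundle, in the same shape
        return [render_content(course), host_discussion(course)]

if __name__ == "__main__":
    course = {"name": "Example Course"}
    # a discussion-centred course might pick forums and blogs only
    print(compose([host_discussion, aggregate_blogs], course))
    # while the monolith offers the identical bundle to everyone
    print(MonolithicLMS().deliver(course))
```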

Improvement comes through changing teacher conceptions

Even with the diversity in learning and the change created by the introduction of e-learning, the practice of learning and teaching in universities remains much the same. While e-learning has provided a new medium, much teaching remains old wine in new bottles (Bates 2004). As shown in section 2.1.4 (e-learning usage from past experience), the majority of academic staff still rely on old, familiar pedagogies rather than actively engaging with the new affordances offered by technology. This is something that is only going to change when the university context encourages, enables and perhaps even requires changes in the conceptions of learning and teaching held by academic staff. The on-going introduction of new technologies is, on its own, unlikely to ever bring about such change.

References

Bates, T. (2004). The promise and myths of e-learning in post-secondary education. The Network Society: A Cross-cultural Perspective. M. Castells. Cheltenham, UK, Edward Elgar: 271-292.

Cavallo, D. (2004). "Models of growth – Towards fundamental change in learning environments." BT Technology Journal 22(4): 96-112.

Conole, G. and M. Dyke (2004). "What are the affordances of information and communication technologies?" ALT-J, Research in Learning Technology 12(2): 113-124.

Cunningham, S., Y. Ryan, et al. (2000). The Business of Borderless Education. Canberra, Department of Education, Training and Youth Affairs: 328.

Cuthrell, K. and A. Lyon (2007). "Instructional strategies: What do online students prefer?" Journal of Online Learning and Teaching 3(4).

Dagger, D., O. Conlan, et al. (2005). Fundamental requirements of personalised eLearning development environments. World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2005, Vancouver, Canada, AACE.

Danaher, P. A., J. Luck, et al. (2004). Course management systems: Innovation versus managerialism. Research Proceedings of the 11th Association for Learning Technology Conference (ALT-C 2004), University of Exeter, Devon, England, Association for Learning Technology.

Davies, I. (1991). Instructional development as an art: One of the three faces of ID. Paradigms regained: the uses of illuminative, semiotic, and post-modern criticism as modes of inquiry in educational technology: a book of readings. D. Hlynka and J. Belland, Educational Technology Publications: 93-106.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. International Handbook of Information Technology in Primary and Secondary Education. J. Voogt and G. Knezek. New York, Springer: 43-59.

Jones, A. (2009). "Redisciplining generic attributes: the disciplinary context in focus." Studies in Higher Education 34(1): 85-100.

Lewis, T., S. Marginson, et al. (2005). "The network university? Technology, culture and organisational complexity in contemporary higher education." Higher Education Quarterly 59(1): 56-75.

Mishra, P. and M. Koehler (2006). "Technological pedagogical content knowledge: A framework for teacher knowledge." Teachers College Record 108(6): 1017-1054.
