Coordination, support and knowledge sharing associated with e-learning – where does your organisation fit?

A recent post summarised a paper that took some insights from the information systems discipline and applied them to the implementation of an LMS/VLE. This post draws on insights from Alavi and Leidner (2001), an influential paper (208 citations on Google Scholar) from the information systems discipline that calls for IS researchers to focus more on technology-mediated learning – i.e. e-learning.

Of the many ways the paper suggests IS researchers can make a contribution, the following is the focus of this post.

Lastly, at the organizational level of analysis, IS scholars might have insights to provide to the question of how to encourage instructors (in the role of end-users) to incorporate IT tools to improve their instructional design, development, and delivery.

Based on a review of the literature, the authors suggest a simple matrix – the figure below – to summarise four common approaches universities take to the coordination, support and knowledge sharing around e-learning at the organisational level.

Coordination, support and knowledge sharing

The four quadrants can be described as:

  1. Acquisition of technology and its support is uncoordinated. Sharing of knowledge is random – generally limited to ad hoc social networks.
  2. Technology acquisition and support remains uncoordinated, but some facilitation of knowledge sharing occurs.
  3. Technology acquisition and support is now coordinated across the institution, however, knowledge sharing is random.
  4. Technology acquisition and support is coordinated and knowledge sharing is facilitated.
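For readers who think in code, the matrix reduces to a simple two-dimensional classification. The following is purely my own illustrative sketch – the function and argument names are invented, not Alavi and Leidner's notation – with the quadrant numbering following the list above:

```python
# Illustrative sketch of Alavi and Leidner's 2x2 matrix as a tiny classifier.
# Names are my own invention; quadrant numbers follow the list above.

def quadrant(coordinated: bool, facilitated: bool) -> int:
    """Classify an institution's approach to e-learning.

    coordinated -- is technology acquisition/support coordinated institution-wide?
    facilitated -- is knowledge sharing actively facilitated (rather than random)?
    """
    if not coordinated:
        return 2 if facilitated else 1
    return 4 if facilitated else 3

# e.g. coordinated acquisition/support but random knowledge sharing:
print(quadrant(coordinated=True, facilitated=False))  # → 3
```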

Alavi and Leidner suggest that most universities are in quadrant #1. That may have been true in North America in 2000/2001, and it might continue to be true there. At this point in time, in Australia, I believe most universities could be said to be in quadrants 3 and 4 – though, in reality, there are probably aspects of practice that dip into the other quadrants. Where is your university?

The quadrant I’m most familiar with is probably quadrant 3, though at times we might have touched on 4. Facilitation of knowledge sharing is by far the most difficult of the three tasks, and one I’m not sure anyone has really grappled with effectively – especially because facilitation of knowledge sharing is not separate from acquisition and support, though it is often treated as if it were.

Acquisition and support can easily be allocated to the information technology folk, which means facilitation of knowledge sharing can occur around how to use the technology. But leveraging that knowledge sharing to inform the modification of existing technology, or the adoption of new technology, has to battle across the disciplinary gulf that separates learning and teaching from IT. It also has to battle resource starvation, where funding/resourcing for L&T IT gets starved of attention due to the “size” of the problems outside of L&T.


Alavi, M. and D. E. Leidner (2001). “Research commentary: technology-mediated learning – a call for greater depth and breadth of research.” Information Systems Research 12(1): 1-10.

The Ps framework

The following is a section of my thesis – chapter 2. As I get first drafts of this stuff done, I’m going to post it to the blog – where appropriate. This is the first.

This section is the first major part of chapter 2 – the literature review. It explains the background of the Ps Framework which will be used to structure the rest of the chapter.

The Ps Framework

This chapter aims to illustrate knowledge of the extant literature, and the associated worthy research issues, around the problem of designing and implementing e-learning and the supporting information systems within universities. This understanding, and its description in the remainder of this chapter, has been developed through the formulation of the Ps Framework as a theory for analysing, understanding and describing that literature. Elsewhere (Jones 2008; Jones, Vallack et al. 2008) the Ps Framework has been used to illustrate how the framework can be a useful tool for helping diverse stakeholders to effectively share and negotiate their various perspectives and, consequently, make sound and pragmatic decisions around e-learning. In this chapter, the Ps Framework helps ensure the literature survey is “constructively analytical rather than merely descriptive” (Perry 1998), and its components form the main section headings of this chapter.

The first part of this section (Why the Ps Framework?) provides a brief description of why the Ps Framework is necessary. Next (Components of the Ps Framework), the individual components of the Ps Framework and their graphical representation are explained. The next section (hopefully online later this week) begins the use of the components of the Ps Framework – in this case “Past Experience” – to describe one aspect of what is currently known about e-learning.

Why the Ps Framework?

The focus of this work is the development of an Information Systems Design Theory (ISDT) for e-learning. The aim is to develop insight into appropriate approaches to the design and implementation of e-learning. Consequently, the research in this thesis can be seen as a design problem. There is growing interest in design, design research and design theory in fields such as management (Boland 2002; van Aken 2004; van Aken 2005), information systems (Walls, Widmeyer et al. 1992; Hevner, March et al. 2004; Walls, Widmeyer et al. 2004; Gregor and Jones 2007), and education (Brown 1992; Collins 1992; Shavelson, Phillips et al. 2003). Design is the core of all professional training (Simon 1996). Design can be seen as a transformation from some known situation (the initial state), which is deemed to be problematic by some interested parties, into a target state (Jarvinen 2001). The formulation of the initial state into an effective representation is crucial to finding an effective design solution (Weber 2003). Representation has a profound impact on design work (Hevner, March et al. 2004), particularly on the way in which tasks and problems are conceived (Boland 2002).

The organisational selection, adoption and use of educational technology by universities is increasingly seen as an information systems implementation project (Jones, Vallack et al. 2008). How such projects are conceptualised significantly influences the design of the resulting system. Jamieson and Hyland (2006) suggest that there are relationships between decisions made in the pre-implementation phase of an information systems project, the factors considered in those decisions, and the degree of success of the project outcomes. During the pre-implementation phase, decisions involve a high volume of information, are incredibly complex, and are associated with a high degree of uncertainty (Jamieson and Hyland 2006). There remains some distance to travel before the complexity of innovation and change around university implementation of e-learning is fully understood (Cousin, Deepwell et al. 2004).

Bannister and Remenyi (1999) contend that, faced with such difficult decisions, both individual and corporate decision makers will more than likely base their decisions on instinct. The non-linear nature of e-learning implementation makes it more complex to handle and creates a need for meaning-makers or mediators between the different elements of the “implementation ecology” (Cousin, Deepwell et al. 2004). How a design problem is conceptualised by the members of an organization influences what they see as valid solutions to that problem, and so impacts directly on the quality of the decisions they make about projects. Different members of an organization will, as a result of their different experiences, have varying perspectives on a design problem. The full diversity of that experience is often so difficult to capture, compare and contrast that decision-making processes, both consciously and unconsciously, avoid the attempt.

Frameworks offer new ways of looking at phenomena and provide information on which to base sound, pragmatic decisions (Mishra and Koehler 2006). Gregor (2006) defines taxonomies, models, classification schema and frameworks as theories for analysing, understanding and describing the salient attributes of phenomena and the relationships therein. The development of taxonomies, models and frameworks to aid understanding is common in most disciplines. Examples from the educational technology field include:

  • the 4Es conceptual model (Collis, Peters et al. 2001);
    This is a model to predict the acceptance of ICT innovations by an individual within an educational context. It proposes that an individual’s acceptance of educational ICT innovations is based upon four concepts: environment, effectiveness, ease of use and engagement.
  • the ACTIONS model (Bates 2005); and
    This framework provides guidance to the process of selecting a particular educational technology by drawing on seven components: Access, Costs, Teaching and learning, Interactivity and user-friendliness, Organisational issues, Novelty and Speed.
  • E-learning ecology elements (Cousin, Deepwell et al. 2004).
    Four elements or domains are identified as requiring consideration during the implementation of e-learning within universities: pedagogical, technological, cultural and organisational.

Components of the Ps Framework

The Information Technology (IT) artifact is often taken for granted or assumed to be unproblematic, which often results in narrow conceptualisations of what technology is, how it has effects, and how and why it is implicated in social change (Orlikowski and Iacono 2001). Such limited conceptualisations often view IT as fixed, neutral and independent of its context of use. The position taken in this thesis, and demonstrated in this chapter through the use of the Ps Framework, is that IT is one of a number of components of an emergent process of change where the outcomes are indeterminate because they are situationally and dynamically contingent (Markus and Robey 1988). On-going change is not solely “technology led” or solely “organisational/agency driven”; instead, change arises from a complex interaction between technology, people and the organization (Marshall and Gregor 2002). This view of the Ps Framework and the IT artifact connects with Orlikowski and Iacono’s (2001) ensemble view of the IT artifact, where technology is seen to be embedded within the conditions of its use.

The Ps Framework consists of seven components, only one of which – Product – specifically encompasses technology. The remaining six seek to describe and understand the parts of the complex and dynamic social context within which e-learning is applied. The seven components of the Ps Framework are (as work progresses and I post additional sections of the chapter, I’ll link to them from the following):

  1. The problem and purpose;
    What is the purpose or reason for the organization in adopting e-learning or changing how it currently implements e-learning? What does the organization hope to achieve? How does the organization conceptualise e-learning?
  2. Place;
    What is the nature of the organization in which e-learning will be implemented? What is the social and political context within which it is placed?
  3. People;
    What types of people and roles exist within the organization? Management, professional and academic staff, students. What are their beliefs, biases and cultures?
  4. Pedagogy;
    What are the conceptualisations about learning and teaching which the people within the place bring to e-learning? What practices are being used to learn and teach? What practices might they like to adopt?
  5. Past experience;
    What has gone on before with e-learning, both within and outside of this particular place? What worked and what didn’t?
  6. Product; and
    What system has been chosen or designed to implement e-learning? Here, system is used in its broadest possible sense, to include the hardware, software and support roles.
  7. Process.
    What are the characteristics of the process used to choose how or what will be implemented and what process will be used to implement the chosen approach?
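For readers who think in code, the seven components can be made concrete as a simple record type. This is a hypothetical sketch of my own, not part of the framework’s formal definition; the field names simply mirror the list above:

```python
from dataclasses import dataclass

@dataclass
class PsAnalysis:
    """One organisation's e-learning situation, described per Ps component.

    Purely illustrative: the class and field names are my own invention.
    """
    purpose: str          # why adopt or change e-learning; what is hoped for
    place: str            # nature of the organisation and its social/political context
    people: str           # roles, beliefs, biases and cultures within the place
    pedagogy: str         # conceptions and practices of learning and teaching
    past_experience: str  # what has gone before, within and outside this place
    product: str          # the chosen system: hardware, software and support roles
    process: str          # how the choice is made and how it will be implemented
```

Treating an analysis as one record, rather than seven separate documents, reflects the point made below: the components only make sense in relation to one another.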

One, of potentially many, explanations of the relationship between the seven components starts with purpose. Some event, problem or factor arises that requires the organization to change the way in which it supports e-learning. This becomes the purpose underlying the process the organization uses to determine how (process) and what (product) it will change. This change will be influenced by a range of factors, including: characteristics of the organization and its context (place); the nature and conceptions of the individuals and cultures within it (people); the conceptualisations of learning and teaching (pedagogy) held by the people and the organization; and the historical precedents both within and outside the organisation (past experience).

This is not to suggest that there exists a simple linear, or even hierarchical, relationship between the components of the Ps Framework. The context of implementing educational technology within a university is too complex for such a simple reductionist view (Jones, Vallack et al. 2008). As stated above, the perspective underpinning the Ps Framework is one where the technology is one of seven components of an emergent process of change where the outcomes are indeterminate because they are situationally and dynamically contingent (Markus and Robey 1988).

Figure 1 provides a representation of the seven components of the Ps Framework for e-learning. The situationally contingent nature of these components is represented by the Place component encapsulating the remaining six. The dynamically contingent nature of these components is represented by the messiness of their representation. It is also intended that each component be connected in some way to every other component, representing that each component can influence, and be influenced by, every other.

The Ps Framework: a messy version

"Blame the teacher" and its negative impact on learning and e-learning

The following post is sparked by reading Findlow (2008) as part of my PhD work. I’m about halfway through it and finding it very interesting. In particular, this post is sparked by the following paragraph from the paper

The institutional counterpoint to this was the feeling expressed, implicitly or explicitly, by all the administrative staff I talked to that, in the words of one mid-ranking administrator, ‘No offence, but some academics need a kick up the bum’. Only five survey respondents cited constraints not explicitly linked to aspects of ‘the system’, singling out academics’ ‘attitudes’, which they elaborated as: ‘lack of imagination’ and/or ‘a reluctance to take risks’

Blame the teacher – level 1

In promulgating the idea of reflective alignment I borrowed and reworked Biggs’ (Biggs and Tang 2007) idea of constructive alignment, taking it from looking at how individual academics could design teaching to looking at how a university could improve the design of teaching performed by its academics.

The sentiments outlined in the above quote from Findlow (2008) are a perfect example of what I framed as level 1 knowledge about improving teaching: the “blame the teacher” level. The level at which management can feel consoled that the on-going problems with the quality of teaching are not the fault of the system they manage. It’s the fault of those horrible academics. It would all be better if the academics simply got a “kick up the bum”.

Escalation into “accountability” – level 2

The immediate problem that arises from level 1 knowledge about improving teaching, is that management very quickly want to provide that “kick up the bum”. This is typically done by introducing “accountability”. As Findlow (2008) writes

Key to the general mismatch seemed to be the ways in which the environment – both institution and scheme – demanded subscription to a view of accountability that impeded real innovation; that is, the sort of accountability that is modelled on classic audit: ‘conducted by remote agencies of control’ (Power 1994, 43), presuming an absence of trust, and valuing standardisation according to a priori standards.

This approach fits nicely into level 2 knowledge about improving teaching – i.e. a focus on what management does. The solution here is that management spend their time setting up strategic directions against which all must be evaluated. They then set up “accountability courts” (i.e. “remote agencies of control”) to evaluate everything that is being done, to ensure that it contributes to the achievement of those strategic directions.

This can be seen in such examples as IT governance or panels that evaluate applications for learning and teaching innovation grants. A small select group sits in positions of power as accountability judges to ensure that all is okay.

Once the directions are set and the “accountability courts” established, management play their role within those courts, especially in terms of “kicking butt” when appropriate.

Mismatched and inappropriate

There is an argument to be made that such approaches are anathema to higher education. For example, Findlow (2008) makes this point

New managerialism approaches knowledge as a finished product, packaged, positive, objective, externally verifiable and therefore located outside the knower. By contrast, an ‘academic exceptionalist’ (Kogan and Hanney 2000, 242) view of knowledge places it in the minds of knowledgeable individuals, with the holder of the knowledge also the main agent in its transmission (Brew 2003). This kind of expert or ‘professional knowing’, closely related to conventionally acquired ‘wisdom’ (Clegg 2005, 418), is produced through an organic process between people in a culture of nurturing new ideas. The process is allowed to take as long as it takes, and knowledge is not seen as a finished product.

There are arguments back and forth here. I’m going to ignore them as beyond scope for this post.

I will say that I have no time for the many academics who, at this stage, will generally trot out the “academic freedom” defence against “accountability courts”. Accountability, of an appropriate sort, is a key component of being an academic – peer review, anyone? Findlow (2008) has this to say

accountability is intrinsic to academia: the sort of accountability that is about honesty and responsibility, about making decisions on the basis of sound rationales, on the understanding that you may be called to account at any point. Strathern (2000a, 3) suggests that ‘audit is almost impossible to criticise in principle – after all, it advances values that academics generally hold dear, such as responsibility, openness about outcomes’.

Academics should be open and clear about what tasks they perform and why. Hiding behind “academic freedom” is too often an excuse to avoid being “called to account”. (That said, there are always power issues that complicate this.)

My argument against “accountability courts” is not made on grounds of principle, but on pragmatic grounds. It doesn’t work.

It doesn’t work

Remember, we’re talking here about improving the design of courses across an institution. To some extent this involves innovation – the topic of Findlow (2008), who makes the following point about it (emphasis added)

The nature of innovation … is change via problematisation and risk. In order to push the boundaries of what we know, and break down dogma, problems have to be identified and resolved (McLean and Blackwell 1997, 96). Entering uncharted territory implies risk, which requires acceptance by all stakeholders.

This last point is where the problems with “accountability courts” arise. It starts with the SNAFU principle which in turn leads to task corruption.

SNAFU principle

Believed to have arisen in the US Army during World War II, the phrase SNAFU is commonly known as an acronym expanded as Situation Normal, All Fouled Up – where “Fouled” is generally replaced with a more colloquial term. Interestingly, and as a pause in this diatribe, here’s a YouTube video of Private Snafu – a cartoon series made by the US armed services during World War II to educate the troops about important issues. You may recognise Mel Blanc’s voice.

However, the SNAFU principle gets closer to the problem. The principle is defined as

“True communication is possible only between equals, because inferiors are more consistently rewarded for telling their superiors pleasant lies than for telling the truth.”

This is illustrated nicely by the fable on this SNAFU principle page.

Can this be applied to innovation in higher education? Surely it wouldn’t happen? Findlow (2008) again

My own experience as a funded innovator, and the prevailing experience of my respondents, was that participation in a funded ‘scheme’ made authentic problematisation, and honest description of risk, difficult. Problematisation was inhibited by the necessary consideration given to funding body and institutional agendas in defining parameters for approval. Audit can be seen as a response to fear of risk, and audit-managerially governed schemes require parameters pre-determined, expected outcomes and costs known in advance. Respondents in this case related the reluctance of the scheme to provide for unanticipated needs as they arose, without which effective innovation was much harder.

Task corruption

Task corruption can be defined as

is where either an institution or individual, consciously or unconsciously, adopts a process or an approach to a primary task that either avoids or destroys the task.

It can arise when the nature of the system encourages people to comply through a number of different mechanisms. Findlow (2008) reports on one as applied to innovation

The discussion groups of new academics unanimously recounted a feeling of implicit pressure not to acknowledge problems. They all said they had quickly learned to avoid mention of ‘problems’, that if necessary the word ‘issues’ was preferable, but that these ‘issues’ should be presented concisely and as if they had already been dealt with. While their formal institutional training programme emphasised the importance of honestly addressing shortcomings, their informal exposure to management culture conveyed a very different message. They had learned, they said, that to get on in academia you had to protect yourself and the institution, separate rhetoric from reality, strategy from truth – that authentic problematisation was non-productive and potentially dangerous.

Findlow goes on to reference Trowler’s (1998) term “coping strategies” and the phrase “work to rule”. Findlow gives examples, such as innovators having to lie about a particular aspect of their innovation in the formal documents required by an “accountability court” in order to fulfill requirements – even though the rationale was accepted by senior administrators.

Academics start to work the system. This creates less than stellar confidence in the system and subsequently reduces the chances of innovation. Findlow (2008) again

Allen’s (2003) study of institutional change found that innovation was facilitated by the confidence that comes with secure working environments. Where change was judged by staff to be successful, it tended to emerge from university environments where holistic and humanistic views of scholarship and systems of implicit trust were embedded. These gave academics the confidence to take risks. Allen found that insecure environments created what Power (1994, 13) describes as ‘the very distrust that [they were] meant to address’, removed the expectation and obligation for genuinely responsible academic accountability (Giri 2000, 174), and made staff reluctant to devote time, signpost problems or try something that might not work and could reflect negatively on their career portfolios.

A solution?

The last quote from Findlow (2008) seems to provide a possible suggestion: “university environments where holistic and humanistic views of scholarship and systems of implicit trust were embedded”. Perhaps such an environment would embody level 3 knowledge of how to improve the design of courses. Such an environment might allow academics to engage in Reflective Problematisation.

Such an environment might focus on some of the features of this process.


Biggs, J. and C. Tang (2007). Teaching for Quality Learning at University. Maidenhead, England, Open University Press.

Findlow, S. (2008). “Accountability and innovation in higher education: a disabling tension?” Studies in Higher Education 33(3): 313-329.