The following is the second to last section for the People component of chapter 2 of my thesis. The basic aim of this section is to establish that people are generally not rational, and that methods which assume they are rational are destined to fail. It’s my proposition that most of the organisational approaches to e-learning and, more generally, learning and teaching at universities suffer from this flaw.

I’m not the first to make this observation and I almost certainly haven’t expressed it as well as it can be in the following. Again, I’m satisficing for the purposes of completing the thesis. There’s a thesis in this topic alone.

People, cognition and rationality

The practice of e-learning within universities has arisen at a time when changes in broader society are increasing the emphasis on accountability, efficiency and managerialisation (see Place cross ref for more discussion). From the late 1990s onwards the practice of e-learning within universities has been dominated by an industrial paradigm associated with the use of enterprise information systems (see Past Experience cross ref). Both factors are usually characterised as having a strong techno-rational basis. A techno-rational discourse relies on quantitative data and measurement to ensure accountability (Kappler 2004). Enterprise systems are an extreme application of a techno-rational perspective (Dillard and Yuthas 2006). A techno-rational approach to management sees it as a scientifically rational and efficient application of neutral knowledge on a par with the natural sciences (Morgan 1992). It is a school of thought aimed at marginalising the role of intuitive thinking through the use of analytical tools and technical solutions (Vanharanta and Easton 2009). This section draws on a range of literature to briefly outline observations that suggest limitations of such techno-rational approaches, limitations that impact upon the implementation of e-learning within universities.

At the level of the individual, there is significant research indicating that people do not make rational decisions. When making decisions, people rely on simplifying strategies such as rules of thumb and heuristics, several of which suffer from systematic biases that influence judgement (Tversky and Kahneman 1974). Cognitive biases are mental behaviours that negatively impact upon decision quality in a significant number of decisions for a significant number of people; they are inherent in human reasoning (Arnott 2006). Arnott (2006) develops a taxonomy of 37 cognitive biases identified by psychological research. Humans have primitive emotional parts to our brains that can strongly influence – both negatively and positively – the choices we make (Morse 2006). A common example from organisational life is provided by Keil and Robey (1999), who identify the “deaf effect”, whereby people remain deaf to reports of trouble in the hope that they can avoid dealing with difficult problems, and perhaps also to disassociate themselves from a failing endeavour. Even where people can make rational decisions there appear to be limits on that rationality: given a complex environment, there are limits to the ability of human beings to adapt optimally, or even satisfactorily (Simon 1991).

Cecez-Kecmanovic, Janson and Brown (2002) describe Weber’s (1978) view that there is a limit on rationality because actors’ mutual judgements of rational action will differ to the degree that their beliefs and values differ, and such belief and value conflicts cannot be resolved in a rational way. Weber’s view is that substantive rationality within organisations is inherently limited because of the inevitability of value conflict (Cecez-Kecmanovic, Janson et al. 2002). Individually inherited cultural belief systems significantly bias normal human thought and perception; these are compounded as we also inherit organisational beliefs based on a long line of learned and rigidly held inaccuracies (Bailey 2007).

These ideas of individually and organisationally different cultural beliefs connect with the idea of technological frames: the understandings that members of a social group come to have of technological artefacts, arising from knowledge of the particular technology and the local understanding of specific uses in a given setting (Orlikowski and Gash 1994). These frames provide templates for problem solving and evaluation, focus attention on information consistent with existing structures while hiding inconsistent information, and fill gaps with information consistent with existing knowledge structures (Davidson 2002). Differences in technological frames between those involved in information systems projects can lead to actions that hamper technology implementation (Orlikowski and Gash 1994).

Techno-rational approaches to management treat people as objects to be manipulated in accordance with scientific laws (Morgan 1992). Such approaches embody a deterministic view of potential adopters as predisposed to adopt innovations that are quantifiably superior from some technical perspective (Surry and Farquhar 1997). The stronger technological determinist view suggests that it is technology that shapes the forms of society and organisations (Jones 1999). Increasingly, however, interest in explaining the organisational consequences of information systems has led to positions that privilege human agency over social structure and technological features (Boudreau and Robey 2005). Information systems development is then not a case of people with clearly defined goals applying technologies with clearly defined properties to achieve clearly defined organisational effects (Jones 1999). Boudreau and Robey (2005) show that the organisational consequences of an ERP system – a class of system known to be notoriously inflexible once configured and implemented, and whose adoption is typically motivated by a desire for greater control – could be shaped and enacted through use rather than simply embedded in technical features. The trajectory of use is not wholly determined either by human agency or by the material properties of the technology, but rather by the unpredictable interplay of the two.

Research into decision making around information systems projects has revealed that such decisions are rarely logical or rational (Bannister and Remenyi 1999). Decision making about the implementation of information systems is not a techno-rational process; many decision makers rely on intuition, instinct and simple heuristics to simplify decisions (Jamieson and Hyland 2006). The practice of innovation and change within universities can never be a merely rational process (Jones and O’Shea 2004). Awareness of these limitations on individual and organisational rationality can help minimise the negative effects of irrational bias in decision making, and opens the possibility of adopting approaches and practices that can improve outcomes.


Arnott, D. (2006). "Cognitive biases and decision support systems development: a design science approach." Information Systems Journal 16: 55-78.

Bailey, C. (2007). "Cognitive accuracy and intelligent executive function in the brain and in business." Annals of the New York Academy of Sciences 1118: 122-141.

Bannister, F. and D. Remenyi (1999). "Value perception in IT investment decisions." Electronic Journal of Information Systems Evaluation 2(2).

Boudreau, M.-C. and D. Robey (2005). "Enacting integrated information technology: A human agency perspective." Organization Science 16(1): 3-18.

Cecez-Kecmanovic, D., M. Janson, et al. (2002). "The rationality framework for a critical study of information systems." Journal of Information Technology 17: 215-227.

Davidson, E. (2002). "Technology frames and framing: A socio-cognitive investigation of requirements determination." MIS Quarterly 26(4): 329-358.

Dillard, J. and K. Yuthas (2006). "Enterprise resource planning systems and communicative action." Critical Perspectives on Accounting 17(2-3): 202-223.

Jamieson, K. and P. Hyland (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. 17th Australasian Conference on Information Systems, Adelaide, Australia.

Jones, M. (1999). Information systems and the double mangle: Steering a course between the Scylla of embedded structure and the Charybdis of strong symmetry. Information Systems: Current Issues and Future Challenges. T. Larsen, L. Levine and J. DeGross. Laxenburg, Austria, IFIP: 287-302.

Jones, N. and J. O’Shea (2004). "Challenging hierarchies: The impact of e-learning." Higher Education 48(3): 379-395.

Kappler, K. (2004). NCATE: Wolf in shepherd’s clothes. Critical perspectives on the curriculum of teacher education. T. Poetter, T. Goodney and J. Bird. Lanham, MD, University Press of America: 19-40.

Keil, M. and D. Robey (1999). "Turning around troubled software projects: An exploratory study of the de-escalation of commitment to failing courses of action." Journal of Management Information Systems 15(4): 63-87.

Morgan, G. (1992). Marketing discourse and practice: Towards a critical analysis. Critical management studies. M. Alvesson and H. Willmott. London, SAGE: 136-158.

Morse, G. (2006). "Decisions and desire." Harvard Business Review 84(1): 42-51.

Orlikowski, W. and D. Gash (1994). "Technological frames: Making sense of information technology in organizations." ACM Transactions on Information Systems 12(2): 174-207.

Simon, H. (1991). "Bounded rationality and organizational learning." Organization Science 2(1): 125-134.

Surry, D. and J. Farquhar (1997). "Diffusion Theory and Instructional Technology." Journal of Instructional Science and Technology 2(1): 269-278.

Tversky, A. and D. Kahneman (1974). "Judgment under uncertainty: Heuristics and biases." Science 185(4157): 1124-1131.

Vanharanta, M. and G. Easton (2009). "Intuitive managerial thinking; the use of mental simulations in the industrial marketing context." Industrial Marketing Management In Press.

Weber, M. (1978). Economy and society. Berkeley, CA, University of California Press.