Reducing meaningless freedom and a Mahara feature request

Note: An update to this post is included at the end.

I’m currently finalising results for a course with 250+ students spread across multiple campuses and online. The final large assignment – worth 70% of the final mark – requires that students create a portfolio (are people still using the term “eportfolio”?) in Mahara and submit it via the institutional assignment submission system.

Due to the nature of the portfolio content and concerns about the privacy of school students (it’s a pre-service teacher course) the portfolio cannot be opened up to everyone. Not to mention the fact that many of the students retain this fear that someone is going to copy their work. So, students have to create this multi-page, multi-resource portfolio in Mahara and make sure that certain people can access the portfolio.

With 250+ students it was always going to be the case that a decent handful would have problems, even with reasonable instructions. And it is this decent handful that is creating extra workload for the teaching staff, workload that could be avoided if a principle we formulated in early work around assignment submission – reduce meaningless freedom – were applied to Mahara.

The following outlines that principle and describes a feature request for Mahara that might help.

Reduce meaningless freedom

Online assignment submission was one of the first applications of online learning we explored back in the mid-1990s in our courses with large numbers of distance education students (Jones and Jamieson, 1997). The early systems were not that well designed and increased the workload on the marker (sorry Kieren). However, they did help with a range of improvements over the traditional physical assignment submission process.

From this experience, we developed the principle of “reduce meaningless freedom”. Here’s how it was described in Jones (1999):

An important lesson from the on-going development of online assignment submission is to reduce the amount of “meaningless freedom” available to students. Early systems relied on students submitting assignments via email attachments. The freedom to choose file formats, mail programs and types of attachments significantly increased the amount of work required to mark assignments. Moving to a Web-based system where student freedom is reduced to choosing which file to upload was a significant improvement.

The problem was that when marking large numbers of assignments, you want to get into a routine. You can only get into a routine if certain important aspects are the same (e.g. file formats, file names, the ability to access a Mahara portfolio). The trouble is that when a large number of people complete a process, any flexibility in that process means they will complete it in different ways (including not completing it properly).

This is not a problem that can be solved by improving the instructions or applying a checklist. With a large enough number of people, there will always be some who can’t follow the instructions or who ignore the checklist.

Consequently, the information system has to be designed to remove any freedom to vary from the process. It shouldn’t remove all freedom, just the freedoms that aren’t important to the outcome of the process but that increase the workload of processing. For example, we want the students to be able to express their creativity with their Mahara portfolios. We just don’t want them to have the freedom to submit a URL for a portfolio that we can’t access.

The system should remove, or at least limit, this freedom.

Mahara feature request

The problem here is that for people new to Mahara, it is very difficult to check who can access a complex, multi-page portfolio. I do know that there is a way to give access to a complex, multi-page portfolio: you create a collection, add the pages to that collection, and create a secret URL for the collection. This is the process we described to students. The trouble is that the students have to choose to follow this process and they are free not to.

You could be hard about this and be very explicit: “If you don’t do this you will fail!”. But that doesn’t create the positive learning environment I’d like to have in my courses and it fails to recognise that our tools should be helping us achieve our goals.

What would be useful is if Mahara had a “show/check access” feature, where a person creating a Mahara portfolio could submit the URL and Mahara would generate a report of who could access which components of that portfolio. It would recurse through all the Mahara links accessible from that URL and report on who could access those links.

Having this as a feature that people have to choose to use still involves some freedom. To remove that freedom a bit more, this process could run in the background and the outcome could be made visible via the Mahara interface. For example, when editing a page that contains links to other parts of Mahara, the interface could add an appropriate label explaining who can access those links.
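
To make the requested report a little more concrete, here is a minimal, marker-side sketch of the sort of check I have in mind, written from outside Mahara. The starting URL, the restriction to links containing “view”, and the login-redirect heuristic are all assumptions for the sake of illustration; a real implementation would live inside Mahara and query its access rules directly rather than probing over HTTP.

```python
# A rough, marker-side approximation of the requested "show/check access" report.
# Assumptions (not part of Mahara): the portfolio URL, the "view" link filter and
# the login-page heuristic are hypothetical; a real implementation would sit
# inside Mahara and use its access rules rather than HTTP probing.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def check_access(start_url, max_pages=50):
    """Recursively probe pages reachable from start_url without logging in,
    and report which ones appear to require authentication."""
    host = urlparse(start_url).netloc
    seen, queue, report = set(), [start_url], {}
    while queue and len(seen) < max_pages:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url) as response:
                html = response.read().decode("utf-8", errors="replace")
                final_url = response.geturl()
        except Exception as exc:
            report[url] = f"error: {exc}"
            continue
        # Heuristic (assumption): being bounced to a login page means an
        # anonymous marker cannot see this page.
        blocked = "login" in urlparse(final_url).path.lower()
        report[url] = "login required" if blocked else "accessible without login"
        parser = LinkCollector()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Only follow links back into the same Mahara site's view pages.
            if urlparse(absolute).netloc == host and "view" in absolute:
                queue.append(absolute)
    return report


if __name__ == "__main__":
    # Hypothetical portfolio URL for illustration only.
    for page, status in check_access("https://mahara.example.edu/view/view.php?id=123").items():
        print(status, page)
```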

An update

Thanks to @icampus21.com it has been revealed to me that Mahara already has this feature: its share facility does show what is accessible. Kudos to the Mahara developers. Now, I don’t follow @icampus21.com on Twitter and I’m pretty sure they don’t follow me. So this is a nice bit of learning thanks to Twitter, hashtags and @icampus21.com.

This raises the question of why I wasn’t already aware of this. After all, I’m responsible for this course and somewhat computer literate. A significant part of the answer has to be the limitations of my approach to learning about Mahara. But other contributing factors would include that this feature is neither explicitly obvious from using Mahara nor covered in the preparation/resources provided by my current institution.

One perspective is that there is too much freedom in the way the institution allows the use of Mahara in courses. It should remove the freedom that allowed me to get this far into a semester without being aware of this feature. But perhaps it can also be addressed by making the feature more explicit/obvious in Mahara?

Then there is the whole robustness versus resilience perspective as argued by Dave Snowden.

References

Jones, D., & Jamieson, B. (1997). Three Generations of Online Assignment Management. In R. Kevill, R. Oliver, & R. Phillips (Eds.), (pp. 317-323). Perth, Australia.

Jones, D. (1999). Solving some problems with university education: Part II. Proceedings of AUSWEB’99. Ballina, Australia.

People and e-learning – limitations and an alternative

So, here’s the last of three sections examining the limitations of industrial e-learning and suggesting an alternative. Time to write the conclusion, read the paper over again and cut it down to size.

People

The characteristics of the product and process of industrial e-learning (e.g. the focus on long periods of stable use and the importance of efficient use of the chosen LMS) are directly reinforced by, and directly impact, the people and roles involved with tertiary e-learning. This section briefly examines just four examples of this impact, including:

  1. The negative impact of organizational hierarchies on communication and knowledge sharing.
    The logical decomposition inherent in teleological design creates numerous, often significant, organizational boundaries between the people involved with e-learning. Such boundaries are seen as inhibiting the ability to integrate knowledge across the organization. The following comments from Rossi and Luck (2011, p. 68) partially illustrate this problem:

    During training sessions … several people made suggestions and raised issues with the structure and use of Moodle. As these suggestions and issues were not recorded and the trainers did not feed them back to the programmers … This resulted in frustration for academic staff when teaching with Moodle for the first time as the problems were not fixed before teaching started.

  2. Chinese whispers.
    Within an appropriate governance structure, requests for changes to an LMS typically flow up from the users to a central committee made up of senior leaders from the faculties, Information Technology and central learning and teaching, with normally some representation from teaching staff and students. The length of the communication chain for the original need becomes like a game of Chinese Whispers as it is interpreted through the experiences and biases of those involved, leading to this impression reported by Rossi and Luck (2011, p. 69):

    The longer the communication chain, the less likely it was that academic users’ concerns would be communicated correctly to the people who could fix the problems.

    The cost of traversing this chain of communication means it is typically not worth the effort of raising small-scale changes.

    Not to mention killing creativity, a point which just came through my Twitter feed thanks to @kyliebudge.

  3. Mixed purposes.
    Logical decomposition also encourages different organizational units to focus on their part of the problem and lose sight of the whole picture. An IT division evaluated on its ability to minimize cost and maximize availability is not likely to want to support technologies in which it has limited expertise. This is one explanation for why the leader of an IT division would direct the IT division’s representatives on an LMS selection panel to ensure that the panel selected the LMS implemented in Java. It is also one explanation for a decision to use the latest version of the Oracle DBMS – the DBMS supported by the IT division – to support a new Moodle installation even though it had not been tested with Moodle and best practice advice was to avoid Oracle, a decision that led to weeks at the start of the “go live” term during which Moodle was largely unavailable.
  4. The perils of senior leadership.
    Having the support and engagement of a senior leader at an institution is often seen as a critical success factor for an LMS implementation. But when the successful completion of the project is tied to the leader’s progression within the leadership hierarchy it can create the situation where the project will be deemed a success, regardless of the outcome.

As an alternative, the Webfuse system relied on a multi-skilled, integrated development and support team. This meant that the small team was responsible for training, helpdesk support, and systems development. The helpdesk person handling a user’s problem was typically also a Webfuse developer who was empowered to make small changes without formal governance approval. Behrens (2009, p. 127) quotes a manager in CQU’s IT division describing the types of changes made to Webfuse as “not even on the priority radar” under traditional IT management techniques. The developers were also located within the faculty, so they also interacted with academic staff in the corridors and the staff room. This context created an approach to the support of an e-learning system with all the hallmarks of social constructivism, situated cognition, or communities of practice: the type of collaborative and supportive environment identified by Tickle et al (2009) in which academics learn through attempts to solve genuine educational problems, rather than being shown how to adapt their needs to the constraints of the LMS.

References

Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60-75. Retrieved from http://www.sleid.cqu.edu.au/include/getdoc.php?id=1122&article=391&mode=pdf

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf

Introducing the alternative

The last couple of posts have attempted to (in the confines of an #ascilite12 paper) summarise some constraints with the dominant product and process models used in industrial e-learning and suggest an alternative. The following – which probably should have been posted first – describes how and where this alternative comes from.

As all this is meant to go into an academic paper, the following starts with a discussion about “research methods” before moving on to describe some of the reasons why this alternative approach might have some merit.

As with the prior posts, this is all still first draft stuff.

Research methods and limitations

From the initial stages of its design the Webfuse system was intended to be a vehicle for both practice (it hosted over 3000 course sites from 1997-2009) and research. Underpinning the evolution of Webfuse was an on-going cycle of action research that sought to continually improve the system through insights from theory and observation of use. This commenced in 1996 and continued, at varying levels of intensity, through to 2009 when the system ceased directly supporting e-learning. This work has contributed in varying ways to over 25 peer-reviewed publications. Webfuse has also been studied by other researchers investigating institutional adoption of e-learning systems (Danaher, Luck, & McConachie, 2005) and shadow systems in the context of ERP implementation (Behrens, 2009; Behrens & Sedera, 2004).

Starting in 2001 the design of Webfuse became the focus of a PhD thesis (Jones, 2011) that made two contributions towards understanding e-learning implementation within universities: the Ps Framework and an Information Systems Design Theory (ISDT). The Ps Framework arose out of an analysis of existing e-learning implementation practices and as a tool to enable the comparison of alternate approaches (Jones, Vallack, & Fitzgerald-Hood, 2008). The formulated ISDT – An ISDT for emergent university e-learning systems – offers guidance for e-learning implementation that brings a number of proposed advantages over industrial e-learning. These contributions to knowledge arose from an action research process that combined broad theoretical knowledge – the principles of the ISDT are supported by insights from a range of kernel theories – with empirical evidence arising from the design and support of a successful e-learning system. Rather than present the complete ISDT – due primarily to space constraints – this paper focuses on how three important components of e-learning can be re-conceptualised through the principles of the ISDT.

The ISDT – and the sub-set of principles presented in this paper – seeks to provide theoretical guidance about how to develop and support information systems for university e-learning that are capable of responding to the dominant characteristics (diversity, uncertainty and rapid change) of university e-learning. This is achieved through a combination of product (principles of form and function) and process (principles of implementation) that focuses on developing a deep and evolving understanding of the context and use of e-learning. It is the ability to use that understanding to make rapid changes to the system that ultimately encourages and enables adoption and on-going adaptation. It suggests that any instantiation built following the ISDT will support e-learning in a way that: is specific to the institutional context; results in greater quality, quantity and variety of adoption; and improves the differentiation and competitive advantage of the host institution.

As with all research, the work described within this paper has a number of limitations that should be kept in mind when considering its findings. Through its use of action research, this work suffers the same limitations, to varying degrees, as all action research. Baskerville and Wood-Harper (1996) identify these limitations as: (1) lack of impartiality of the researcher; (2) lack of discipline; (3) being mistaken for consulting; and (4) context-dependency leading to difficulty in generalizing findings. These limitations have been addressed within this study through a variety of means including: a history of peer-reviewed publications throughout the process; use of objective data sources; the generation of theory; and an on-going process of testing. Consequently the resulting ISDT and the principles described here have not been “proven”. That was not the aim of this work. Instead, the intent was to gather sufficient empirical and theoretical support to build and propose a coherent and useful alternative to industrial e-learning. The question of proof and further testing of the ISDT in similar and different contexts provides – as in all research aiming to generate theory – an avenue for future research.

On the value of Webfuse

This section aims to show that there is some value in considering Webfuse. It seeks to summarise the empirical support for the ISDT and the principles described here by presenting evidence that the development of Webfuse led to a range of features specific to the institution and to greater levels of adoption. It is important to note that from 1997 through 2005 Webfuse was funded and controlled by one of five faculties at CQUniversity. Webfuse did not become a system controlled by a central IT division until 2005/2006 as a result of organizational restructures. During the life-span of Webfuse CQU adopted three different official, institutional LMS: WebCT (1999), Blackboard (2004), and Moodle (2010).

Specific to the context

During the period from 1999 through 2002 the “Webfuse faculty” saw a significant increase in the complexity of its teaching model, including the addition of numerous international campuses situated within capital cities and a doubling in student numbers, primarily through full-fee paying overseas students. By 2002, the “Webfuse faculty” was teaching 30% of all students at the University. Due to the significant increase in the complexity of teaching in this context, a range of teaching management and support services were integrated into Webfuse including: staff and student “portals”, an online assignment submission and management system, a results upload application, an informal review of grade system, a timetable generator, a student photo gallery, an academic misconduct database, an email merge facility, and assignment extension systems.

The value of these systems to the faculty is illustrated by this quote from the Faculty annual report for 2003 cited by Danaher, Luck & McConachie (2005, p. 39)

[t]he best thing about teaching and learning in this faculty in 2003 would be the development of technologically progressive academic information systems that provide better service to our students and staff and make our teaching more effective. Webfuse and MyInfocom development has greatly assisted staff to cope with the complexities of delivering courses across a large multi-site operation.

By 2003 the faculties not using Webfuse were actively negotiating to enable their staff to have access to these services. In 2009 alone, over 12,000 students and 1100 staff made use of them. Even though no longer officially supported, a few of these services continue to be used by the university in the middle of 2012.

Quotes from staff using the Webfuse systems reported in various publications (Behrens, 2009; Behrens, Jamieson, Jones, & Cranston, 2005; Jones, Cranston, Behrens, & Jamieson, 2005) also provide some insights into how well Webfuse supported the specific context at CQUni.

my positive experience with other Infocom systems gives me confidence that OASIS would be no different. The systems team have a very good track record that inspires confidence

The key to easy use of OASIS is that it is not a off the shelf product that is sooooo generic that it has lost its way as a course delivery tool.

I remember talking to [a Webfuse developer] and saying how I was having these problems with uploading our final results into [the Enterprise Resource Planning (ERP) system] for the faculty. He basically said, “No problem, we can get our system to handle that”…and ‘Hey presto!’ there was this new piece of functionality added to the system … You felt really involved … You didn’t feel as though you had to jump through hoops to get something done.

Beyond context specific systems supporting the management of learning and teaching, Webfuse also included a number of context specific learning and teaching innovations. A short list of examples includes:

  • the course barometer;
    Based on an innovation (Svensson, Andersson, Gadd, & Johnsson, 1999) seen at a conference, the barometer was designed to provide students a simple, anonymous method for providing informal, formative feedback about a course (Jones, 2002). Initially intended only for the author’s courses, the barometer became a required part of all Webfuse course sites from 2001 through 2005. In 2007/2008 the barometers were used as part of a whole-of-institution attempt to encourage formative feedback in both Webfuse and Blackboard.
  • Blog Aggregation Management (BAM); and
    BAM allowed students to create individual, externally hosted web-logs (blogs) and use them as reflective journals. Students registered their external blog with BAM, which then mirrored all of the students’ blog posts on an institutional server and provided a management and marking interface for teaching staff (a sketch of the aggregation idea follows this list). Created by the author for use in his own teaching in 2006, BAM was subsequently used in 26 course offerings by 2050+ students and ported to Moodle as BIM (Jones & Luck, 2009). In reviewing BAM, the ELI guide to blogging (Coghlan et al., 2007) identified it as
    One of the most compelling aspects of the project was the simple way it married Web 2.0 applications with institutional systems. This approach has the potential to give institutional teaching and learning systems greater efficacy and agility by making use of the many free or inexpensive—but useful—tools like blogs proliferating on the Internet and to liberate institutional computing staff and resources for other efforts.
  • A Web 2.0 course site.
    While it looked like a normal course website, none of the functionality – including discussion, wiki, blog, portfolio and resource sharing – was implemented by Webfuse. Instead, freely available and externally hosted Web 2.0 tools and services provided all of the functionality. For example, each student had a portfolio and a weblog provided by the site http://redbubble.com. The content of the default course site was populated by using BAM to aggregate RSS feeds (generated by the external tools) which were then parsed and displayed by Javascript functions within the course site pages. Typically students and staff did not visit the default course site, as they could access all content by using a course OPML file and an appropriate reader application.
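
The core aggregation idea behind BAM (and the Web 2.0 course site) is straightforward: mirror student-registered feeds onto an institutional store so staff can review and mark them. The following is a minimal sketch of that idea only; the feed URLs, the JSON storage and the use of the third-party feedparser library are illustrative assumptions, not the actual BAM implementation.

```python
# A minimal sketch of the aggregation idea behind BAM: mirror posts from
# student-registered RSS/Atom feeds onto an institutional store so a marking
# interface can work from a local copy. The registrations, storage format and
# feedparser dependency are assumptions, not the actual BAM code.
import json
import feedparser  # third-party: pip install feedparser

# Hypothetical registrations: student id -> externally hosted blog feed.
registrations = {
    "s0123456": "https://studentblog.example.com/feed",
    "s0654321": "https://another-blog.example.org/rss",
}


def mirror_posts(registrations):
    """Fetch each registered feed and return a local mirror of its entries."""
    mirror = {}
    for student, feed_url in registrations.items():
        feed = feedparser.parse(feed_url)
        mirror[student] = [
            {
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                "published": entry.get("published", ""),
                "summary": entry.get("summary", ""),
            }
            for entry in feed.entries
        ]
    return mirror


if __name__ == "__main__":
    # Store the mirrored posts so a marking interface can work offline from them.
    with open("bam_mirror.json", "w") as out:
        json.dump(mirror_posts(registrations), out, indent=2)
```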

Even within the constraints placed on the development of Webfuse it was able to develop an array of e-learning applications that are either not present in industrial LMSes, were added much later than the Webfuse services, or had significantly reduced functionality.

Greater levels of adoption

Encouraging staff adoption of the Webfuse system was one of the main issues raised in the original Webfuse paper (Jones & Buchanan, 1996). Difficulties in encouraging high levels of quality use of e-learning within universities have remained a theme throughout the literature. Initial use of Webfuse in 1997 and 1998 was not all that successful in achieving that goal, with only five of 60 academic staff – including the designer of Webfuse, who made 50% of all edits using the system – making any significant use of Webfuse by early 1999 (Jones & Lynch, 1999). These limitations were addressed from 1999 onwards by a range of changes to the system, how it was supported and the organizational context. The following illustrates the success of these changes by comparing Webfuse adoption with that of the official LMS (WebCT 1999-2003/4; Blackboard 2004-2009) used primarily by the non-Webfuse faculties. It first examines the number of course sites and then examines feature adoption.

From 1997 Webfuse automatically created a default course site for all Faculty courses by drawing on a range of existing course related information. For the official institutional LMS, course sites were typically created on request and had to be populated by the academics. By the end of 2003 – 4 years after the initial introduction of WebCT as the official institutional LMS – only 15% (141) of courses from the non-Webfuse faculties had WebCT course sites. At the same time, 100% (302) of the courses from the Webfuse faculty had course sites. Due to the need for academics to populate WebCT and Blackboard course sites, the presence of a course website doesn’t necessarily imply use. For example, Tickle et al (2009) report that 21% of the 417 Blackboard courses being migrated to Moodle in 2010 contained no documents.

Research examining the adoption of specific categories of LMS features provides a more useful insight into LMS usage. Figures 1 through 4 use the research model proposed by Malikowski, Thompson, & Theis (2007) to compare the adoption of LMS features between Webfuse (the thick continuous lines in each figure), CQUni’s version of Blackboard (the dashed lines), and the range of adoption rates found in the literature by Malikowski et al (2007) (the two dotted lines in each figure). This is done for four of the five LMS feature categories identified by Malikowski et al (2007): content transmission (Figure 1), class interaction (Figure 2), student assessment (Figure 3), and course evaluation (Figure 4).

Figure 1: Adoption of content transmission features: Webfuse, Blackboard and Malikowski
Figure 2: Adoption of class interaction features: Webfuse, Blackboard and Malikowski (missing archives of most pre-2002 course mailing lists)
Figure 3: Adoption of student assessment features: Webfuse, Blackboard and Malikowski
Figure 4: Adoption of course evaluation features: Webfuse, Blackboard and Malikowski

The Webfuse usage data included in Figures 1 through 4 only include actual feature use by academics or students. For example, while from 2001 through 2005 100% of Webfuse courses contained a course evaluation feature called a course barometer, only courses where the course barometer was actually used by students are included in Figure 4. Similarly, all Webfuse default course sites contained content (either automatically added from existing data repositories or copied across from a previous term), but Figure 1 only includes data for those Webfuse course sites where teaching staff modified or added content.
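
To clarify how such adoption figures can be calculated, the following is a small sketch of the “actual use only” counting described above. The usage records, category names, course totals and minimum-use threshold are invented for illustration and are not the Indicators/Webfuse data.

```python
# A sketch of how adoption figures might be calculated: for each year and
# feature category, count the proportion of courses where a feature in that
# category was actually used (not merely present). All data below is
# hypothetical and only illustrates the counting rule.
from collections import defaultdict

# Hypothetical usage records: (year, course, category, events)
usage = [
    (2005, "COIT11133", "content", 42),
    (2005, "COIT11133", "interaction", 0),   # forum present but never used
    (2005, "EDED11400", "interaction", 87),
    (2005, "EDED11400", "evaluation", 5),
]
courses_per_year = {2005: 302}  # assumed total courses offered that year


def adoption_rates(usage, courses_per_year, min_events=1):
    """Return {(year, category): percentage of courses with actual use}."""
    used = defaultdict(set)
    for year, course, category, events in usage:
        if events >= min_events:
            used[(year, category)].add(course)
    return {
        key: 100.0 * len(courses) / courses_per_year[key[0]]
        for key, courses in used.items()
    }


print(adoption_rates(usage, courses_per_year))
```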

Figures 2 and 3 indicate Webfuse adoption rates of greater than 100%. This is possible because a number of Webfuse features – including the EmailMerge and online assignment submission and management applications – were being used in course sites hosted on Blackboard. Webfuse was seen as providing services that Blackboard did not provide, or that were significantly better than what Blackboard did provide. Similarly, the spike in Webfuse course evaluation feature adoption to 51.6% in 2008 is due to a CQU-wide push to improve formative feedback across all courses that relied on the Webfuse course barometer feature.

Excluding use by non-Webfuse courses and focusing on the time period 2003-2006, Figures 2 and 3 show that adoption of Webfuse class interaction and student assessment features was significantly higher than that of the equivalent Blackboard features at CQU. It is also significantly higher than the adoption rates found by Malikowski et al (2007) in the broader literature. The adoption rates also appear to be somewhat higher than those found amongst 2008, Semester 1 courses at the University of Western Sydney and Griffith University by Rankine et al (2009), though it should be noted that Rankine et al (2009) used different sampling and feature categorization strategies that make this comparison tentative.

References

Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney. Retrieved from http://cgit.nutn.edu.tw:8080/cgit/PaperDL/tkw_090717140108.pdf

Behrens, S., & Sedera, W. (2004). Why do shadow systems exist after an ERP implementation? Lessons from a case study. In C.-P. Wei (Ed.), (pp. 1713-1726). Shanghai, China.

Coghlan, E., Crawford, J., Little, J., Lomas, C., Lombardi, M., Oblinger, D., & Windham, C. (2007). ELI Discovery Tool: Guide to Blogging. EDUCAUSE. Retrieved from http://www-cdn.educause.edu/eli/GuideToBlogging/13552

Danaher, P. A., Luck, J., & McConachie, J. (2005). The stories that documents tell: Changing technology options from Blackboard, Webfuse and the Content Management System at Central Queensland University. Studies in Learning, Evaluation, Innovation and Development, 2(1), 34-43.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In S. R. Philip Barker (Ed.), (pp. 884-889). Denver, Colorado: AACE.

Jones, D. (2011). An Information Systems Design Theory for E-learning. PhD thesis, Australian National University. Retrieved from http://davidtjones.wordpress.com/research/phd-thesis/

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In P. J. Allan Christie Beverley Vaughan (Ed.), (pp. 331-345). Adelaide.

Jones, D., Cranston, M., Behrens, S., & Jamieson, K. (2005). What makes ICT implementation successful: A case study of online assignment submission. Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemens & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398-406). Chesapeake, VA: AACE. Retrieved from http://www.editlib.org/p/31530

Jones, D., & Lynch, T. (1999). A Model for the Design of Web-based Systems that supports Adoption, Appropriation and Evolution. In Y. D. San Murugesan (Ed.), (pp. 47-56). Los Angeles.

Jones, D., Vallack, J., & Fitzgerald-Hood, N. (2008). The Ps Framework: Mapping the landscape for the PLEs@CQUni project. Hello! Where are you in the landscape of educational technology? ASCILITE’2008. Melbourne.

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.

Rankine, L., Stevenson, L., Malfroy, J., & Ashford-Rowe, K. (2009). Benchmarking across universities: A framework for LMS analysis. Ascilite 2009. Same places, different spaces (pp. 815-819). Auckland. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/rankine.pdf

Svensson, L., Andersson, R., Gadd, M., & Johnsson, A. (1999). Course-Barometer: Compensating for the loss of informal feedback in distance education (pp. 1612-1613). Seattle, Washington: AACE.

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf

The e-learning process – limitations and an alternative

And here’s the follow-up to the well-received “LMS Product” post. This is the second section looking at the limitations of how industrial e-learning is implemented, this time focusing on the process used. I’m not really happy with this one; space limitations are making it difficult to do a good job of the description.

Process

It has become a maxim of modern society that without objectives, without purpose, there can be no success; the setting and achieving of goals has become the essence of “success” (Introna, 1996). Many, if not most, universities follow, or at least profess to follow, a purpose-driven approach to setting strategic directions (Jones, Luck, McConachie, & Danaher, 2005). This is how institutional leaders demonstrate their strategic insight, their rationality and their leadership. This is not a great surprise since such purpose-driven processes – labeled as teleological processes by Introna (1996) – have dominated theory and practice to such an extent that they have become ingrained, even though the debate between the “planning school” of process thought and the “learning school” of process thought has been one of the most pervasive debates in management (Clegg, 2002).

Prior papers (Jones et al., 2005; Jones & Muldoon, 2007) have used the nine attributes of a design process formulated by Introna (1996) to argue that purpose-driven processes are particularly inappropriate to the practice of tertiary e-learning. The same papers have presented and illustrated the alternative, ateleological processes. The limitations of teleological processes can be illustrated by examining Introna’s (1996) three necessary requirements for teleological design processes:

  1. The system’s behaviour must be relatively stable and predictable.
    As mentioned in the previous section, stability and predictability do not sound like appropriate descriptors for e-learning, especially into the future. This is especially so given the popular rhetoric about organizations in the present era no longer being stable, but instead continuously adapting to shifting environments that place them in a state of constantly seeking stability while never achieving it (Truex, Baskerville, & Klein, 1999).
  2. The designers must be able to manipulate the system’s behaviour directly.
    Social systems cannot be “designed” in the same way as technical systems; at best they can be indirectly influenced (Introna, 1996). Technology development and diffusion needs cooperation; however, it takes place in a competitive and conflictual atmosphere where different social groups – each with their own interpretation of the technology and the problem to be solved – are inevitably involved and seek to shape outcomes (Allen, 2000). Academics are trained not to accept propositions uncritically and subsequently cannot be expected to adopt strategies without question or adaptation (Gibbs, Habeshaw, & Yorke, 2000).
  3. The designers must be able to determine accurately the goals or criteria for success.
    The uncertain and confused arena of social behaviour and autonomous human action makes predetermination impossible (Truex, Baskerville et al. 2000). Allen (2000) argues that change in organizational and social settings involving technology is by nature undetermined.

For example, Tickle et al (2009) offer one description of the teleological process used to transition CQUni to the Moodle LMS in 2009. One of the institutional policies introduced as part of this process was the adoption of Minimum Service Standards for course delivery (Tickle et al., 2009, p. 1047). These were intended to act as a starting point for “integrating learning and teaching strategies that could influence students study habits” and to “encourage academic staff to look beyond existing practices and consider the useful features of the new LMS” (Tickle et al., 2009, p. 1042). In order to assure the quality of this process, a web-based checklist was implemented in another institutional system with the expectation that the course coordinator and moderator would actively check that the course site met the minimum standards. A senior lecturer widely recognized as a quality teacher described the process for dealing with the minimum standards checklist as

I go in and tick all the boxes, the moderator goes in and ticks all the boxes and the school secretary does the same thing. It’s just like the exam check list.

The minimum standards checklist was removed in 2011.

A teleological process is not interested in learning and changing, only in achieving the established purpose. The philosophical assumptions of teleological processes – modernism and rationality – are in direct contradiction to views of learning meant to underpin the best learning and teaching. Rossi and Luck (2011, p. 62) talk about how “[c]onstructivist views of learning pervade contemporary educational literature, represent the dominant learning theory and are frequently associated with online learning”. Wise and Quealy (2006, p. 899) argue, however, that

while a social constructivist framework may be ideal for understanding the way people learn, it is at odds not only with the implicit instructional design agenda, but also with current university elearning governance and infrastructure.

Staff development sessions become focused on helping the institution achieve the efficient and effective use of the LMS, rather than quality learning and teaching. This leads to staff developers being “seen as the university’s ‘agent’” (Pettit, 2005, p. 253). There is a reason why Clegg (2002) refers to teleological approaches as the “planning school” of process thought and the alternative ateleological approach as the “learning school”.

The ISDT abstracted from the Webfuse work includes 11 principles of implementation (i.e. process) divided into 3 groups. Two of these groupings refer more to people and will be covered in the next section. The remaining grouping focused explicitly on the process and was titled “An adopter-focused, emergent development process”. Webfuse achieved this by using an information systems development process based on principles of emergent development (Truex et al., 1999) and ateleological design (Introna, 1996). The Webfuse development team was employed and located within the faculty. This allowed for a much more in-depth knowledge of the individual and organizational needs and an explicit focus on responding to those needs. The quote earlier in this paper about the origins of the results uploading system is indicative of this. Lastly, at its best Webfuse was able to seek a balance between teleological and ateleological processes due to a Faculty Dean who recognized the significant limitations of a top-down approach.

This process, when combined with a flexible and responsive product, better enabled the Webfuse team to work with the academics and students using the system to actively modify and construct the system in response to what was learned while using it. It was an approach much more in line with a social constructivist philosophy.

References

Allen, J. (2000). Information systems as technological innovation. Information Technology & People, 13(3), 210-221.

Clegg, S. (2002). Management and organization paradoxes. Philadelphia, PA: John Benjamins Publishing.

Gibbs, G., Habeshaw, T., & Yorke, M. (2000). Institutional learning and teaching strategies in English higher education. Higher Education, 40(3), 351-372.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. Adelaide.

Jones, D., & Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In R. J. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers (Eds.), (pp. 450-459). Singapore. Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/jones-d.pdf

Pettit, J. (2005). Conferencing and Workshops: a blend for staff development. Education, Communication & Information, 5(3), 251-263. doi:10.1080/14636310500350505

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60-75. Retrieved from http://www.sleid.cqu.edu.au/include/getdoc.php?id=1122&article=391&mode=pdf

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.

Wise, L., & Quealy, J. (2006). LMS Governance Project Report. Melbourne, Australia: Melbourne-Monash Collaboration in Education Technologies. Retrieved from http://www.infodiv.unimelb.edu.au/telars/talmet/melbmonash/media/LMSGovernanceFinalReport.pdf

The LMS Product – limitations and an alternative

What follows is the first draft of the “Product” section for an ASCILITE paper (the overview for the paper) I hope to finish by tomorrow… just a bit of wishful thinking. Much of it has appeared in this blog previously; I’m just now trying to wrangle it into a formal publication with all the limitations (e.g. space) that brings with it.

It’s a first draft, so comments and suggestions more than welcome.

Product

One of the defining characteristics of the industrial e-learning paradigm is the reliance on the Learning Management System (LMS) as the product for organizational e-learning. Despite the associated complexities and risks almost every university seems compelled to have an LMS (Coates, James, & Baldwin, 2005). The LMS is an example of an integrated or monolithic information system. This type of information system brings with it a set of advantages and disadvantages. On the plus side, an integrated system offers cost efficiencies and other benefits through standardization but, at the same time, such systems constrain flexibility, competitiveness and autonomy, and increase rigidity (Light, Holland, & Wills, 2001; Lowe & Locke, 2008). Such systems are best suited to circumstances where there is commonality between organizations and stable requirements with low uncertainty. This does not seem to be a good description of tertiary e-learning, either over the last 10 years or the next 10. This section looks at two of the repercussions of this mismatch – 1) organizations and people must adapt to the system; and 2) the single vendor limitation – before describing the alternate principles from the ISDT.

The first repercussion of an integrated system is captured by this comment (Sturgess & Nouwens, 2004, n.p.)

we should seek to change people’s behaviour because information technology systems are difficult to change.

This is a comment from a technical staff member participating in CQUni’s 2003 LMS selection process. This comment, rather than being isolated, captures the accepted industry best practice recommendation to implement integrated systems in their “vanilla” form because local changes are too expensive (Robey, Ross, & Boudreau, 2002). Maintaining a vanilla implementation constrains what is possible with the system, limiting change, innovation and differentiation and perhaps being a contributing factor in the poor pedagogical outcomes observed in industrial e-learning.

For example, in 2007 an instructional designer working on a redesign of a CQUni course in Nutrition informed by constructive alignment was stymied by the limitations of the Blackboard LMS. Blackboard could not support the number of group-based discussion forums required by the new course design. Normally, with an integrated system the pedagogical approach would have to be changed to fit the confines of the system. Instead, the implementation of the course site was supplemented with the use of one of the Webfuse discussion forums, which allowed the fulfillment of the original educational design. Academic staff teaching large first year courses using the Webfuse BAM functionality faced a similar situation when CQUni adopted Moodle. Since Moodle did not provide similar functionality these staff would be forced to change their pedagogical approach to fit the capabilities of the integrated system.

The regular forced migration to another version of an LMS is the extreme example of the organization being forced to change in response to the technology, rather than the technology being fitted to the organization’s needs. It is not uncommon to hear of universities being forced to adopt a new LMS because the vendor has ceased supporting their current system. The cost, complexity and disruption caused by an LMS migration contributes to “stable systems drag” (Truex, Baskerville, & Klein, 1999) as the institution seeks a long period of “vanilla” use to recoup the cost.

Another characteristic of an integrated system is that the tools available are limited to those provided by a single vendor or community. For example, a key component of the recent disquiet about the Curt Bonk MOOC hosted within a Blackboard LMS was the poor quality of the Blackboard discussion forum (see Lane, 2012). Reservations about the quality and functionality of the wiki and blog tools within Moodle are also fairly common. LMS-based tools also tend not to fare well in comparisons with specialist tools, for example, when LMS-based blog tools are compared with tools like WordPress. In addition, integrated systems tend to support only one version of any given tool, leading to situations where users pine for the previous version of a tool because it suited their needs better.

The ISDT formulated from the experience of developing Webfuse proposes 13 principles for the form and function of the product for emergent e-learning. These principles were divided into 3 groups:

  1. Integrated and independent services.
    Rather than a system or platform, Webfuse was positioned as glue. It was used to “fuse” together widely different services and tools into an integrated whole. Webfuse was an example of a best-of-breed system, a type of system that provides more flexibility and responsiveness to contextual needs (Light, Holland, & Wills, 2001). For example, when the existing discussion forum tool was seen as limited, a new discussion forum tool was selected and integrated into Webfuse. At the same time the old discussion forum tool was retained and could be used by those for whom it was an appropriate fit. While new tools could be added as required, the interface used by staff and students remained essentially the same. There was no need for expensive system migrations.
  2. Adaptive and inclusive architecture.
    Almost all LMS support some form of plugin architecture where external developers can write new tools and services for the LMS. This architecture, however, is generally limited to tools specifically written for the LMS and its architecture, thereby limiting what tools can be integrated. The Webfuse “architecture” was designed to support the idea of software wrappers (Sneed, 2000), enabling the inclusion of a much broader array of applications (a rough sketch of the wrapper idea follows this list).
  3. Scaffolding, context-sensitive conglomerations.
    Most e-learning tools provide a collection of configuration options that can be used in a variety of ways. Effective use of these tools requires a combination of skills from a broad array of disciplines and significant contextual knowledge that the majority of academic staff do not possess. The most obvious example is in the overall design of a course website. Webfuse had a default course site conglomeration that combined a range of institutional data sources and Webfuse tools to automatically create a course site. A key aspect of the Webfuse wrappers placed around integrated tools was the addition of institutional specific information and services. There are significant, unexplored opportunities in adding scaffolding to e-learning tools that enable distributed cognition.
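
As flagged in the second group above, the wrapper idea is perhaps the least familiar of these principles. The following toy sketch illustrates it; the class names and the external tool’s interface are hypothetical (Webfuse itself was written in Perl, not Python), but the pattern – a thin adapter that keeps the user-facing interface stable and adds institutional context – is the point.

```python
# A toy illustration of the software-wrapper idea used by Webfuse: an external
# tool is hidden behind a small adapter that presents a consistent interface
# and adds institution-specific context. The names and the external tool's API
# are hypothetical, not the actual Webfuse code.
class ExternalForum:
    """Stand-in for a third-party discussion tool with its own API."""
    def create_topic(self, title):
        return {"tool": "external-forum", "topic": title}


class ForumWrapper:
    """Webfuse-style wrapper: same interface regardless of the tool behind it,
    with institutional data (course code, term) added on the way through."""
    def __init__(self, tool, course_code, term):
        self.tool = tool
        self.course_code = course_code
        self.term = term

    def add_discussion(self, title):
        topic = self.tool.create_topic(title)
        # Wrap the external tool's output with institutional context.
        topic.update({"course": self.course_code, "term": self.term})
        return topic


# Swapping in a newer forum tool only requires a new wrapper; the course site
# and its users keep seeing the same interface.
forum = ForumWrapper(ExternalForum(), "COIT11133", "2004-T1")
print(forum.add_discussion("Week 1 questions"))
```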

Writing about the need for universities to embrace diversity Thomas (2012) talks of Procrustes who

would stretch and sever the limbs of his guests to fit the size of his bed. We, too, are continuing to stretch and shape our higher education to a particular standard to the detriment of students and society alike.

In terms of e-learning, that “particular standard” is defined by the products we are using to implement industrial e-learning.

References

Coates, H., James, R., & Baldwin, G. (2005). A Critical Examination of the Effects of Learning Management Systems on University Teaching and Learning. Tertiary Education and Management, 11(1), 19-36. Retrieved from http://www.springerlink.com/content/r21987609l3g1h58/

Lane, L. M. (2012). Leaving an open online course. Retrieved from http://lisahistory.net/wordpress/2012/04/leaving-an-open-online-class/

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216-224.

Lowe, A., & Locke, J. (2008). Enterprise resource planning and the post bureaucratic organization. Information Technology & People, 21(4), 375-400.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Sneed, H. (2000). Encapsulation of legacy software: A technique for reusing legacy software components. Annals of Software Engineering, 9(1-4), 293-313.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3). Retrieved from http://tojde.anadolu.edu.tr/tojde15/articles/sturgess.htm

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012, from http://theconversation.edu.au/universities-cant-all-be-the-same-its-time-we-embraced-diversity-7379

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.

Learning analytics and complexity

Another day and another #ascilite12 paper to think about. This is the one where my periodic engagement is potentially driving my co-author slightly crazy. I’m sure this contribution will add further to that.

The idea

The basic idea of the paper is to

  1. Draw on a few more insights/patterns from the data gathered as part of the Indicators project.
    This includes usage data from a single Australian university from 3 different “Learning Management Systems” over a 10+ year period.
  2. Use the lens of complex adaptive systems to
    1. Identify potential problems with existing trends in the application of learning analytics.
    2. Identify some potential alternative applications of learning analytics.

The idea builds on some of Col’s earlier thinking in this area and is likely to inform some of the next steps we take in this area.

Potential problems

The problems that we seem to have identified so far are

  1. The hidden complexity behind the simple patterns.
    1. Abstraction losing detail.
    2. Decomposition preventing action.
  2. It’s not a causal system
    1. Correlation, causation confusion.
    2. Overweening conceit of causality.

There must be more of these problems. I do wonder if a closer reading of some of the CAS literature would provide more insights.

For each of these problems we’re hoping to

  • Illustrate the nature of the problem with evidence from the data.
  • Offer insight into why this is a problem from Complex Adaptive Systems theory.
    Morrison (2006) gives a good overview of CAS, some of its applications to education and some limitations.
  • Suggest a better approach based on the insights from CAS.

Hidden complexity – abstraction losing detail

This in part picks up and illustrates part of the message Gardner Campbell made in his presentation “Here I stand” as part of LAK’12, i.e. that the nature of learning analytics and its reliance on abstracting patterns or relationships from data has a tendency to hide the complexity of reality, especially when used for decision making by people who are not directly engaged in that reality.

Col has illustrated this in this post using the traditional relationship between LMS use and grades (more use == better grades). The nice trend gets interrupted when you start looking at the complexity behind it. For example, one student who achieved a HD in every course had widely varying numbers of posts/replies in different courses. This is similar to Ken’s discoveries when looking at his own teaching: the same academic in a few different courses having widely varying practices.
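
To illustrate the point with something concrete (using synthetic numbers, not the actual Indicators data), the sketch below generates hypothetical per-student forum counts: the group means rise neatly with grade, reproducing the “more use == better grades” pattern, while the individual counts within any one grade vary wildly – which is exactly the detail the aggregate hides.

```python
# A synthetic illustration (not the Indicators data) of how a tidy aggregate
# trend can hide wide variation: average forum activity rises with grade, yet
# individual students within a grade range from near-zero to very heavy users.
import random

random.seed(1)
grades = ["F", "P", "C", "D", "HD"]
# Hypothetical per-student forum post counts, higher on average for higher grades.
students = {
    grade: [max(0, int(random.gauss(mu=10 * (i + 1), sigma=15))) for _ in range(50)]
    for i, grade in enumerate(grades)
}

for grade in grades:
    posts = students[grade]
    print(f"{grade}: mean={sum(posts) / len(posts):5.1f} "
          f"min={min(posts)} max={max(posts)}")
# The means climb with grade (the aggregate pattern), but within each grade the
# per-student counts vary widely - the detail the abstraction hides.
```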

A concrete example is management passing a rule that every course must have a course website that includes a discussion forum, even when, for entirely appropriate reasons, an on-campus course decides a forum isn’t appropriate.

Decomposition preventing action

The structure and organisation of universities are based on top-down decomposition. All of the various required functions are identified and divided into various parts: there’s HR, IT, the teaching academics in faculties, etc. With each decomposition there is a loss of the whole. Each sub-component starts to focus on its bit. This is where you get IT departments focusing on uptime, security and risk regardless of the effects this has on the quality of learning.

You can see the effect of this in the learning analytics literature. ALTC-funded projects have had to take a browser/client-based approach to tool development because the IT department won’t provide access to the database. It’s one of the reasons why the Indicators project is a little further ahead than most, even though we are a very small group. Through a series of accidents we had access to data and the skills necessary to do something with it.

The effect is also visible in the location of data. Student results are in the student records system, LMS activity is in the LMS, etc. This is why “dashboards” become the solution: they bring the data into a single system that is maintained by the data mining folk within an institution, even though the real value of the patterns revealed by these systems is within the learning environment (the LMS for most), not in yet another system.

You can also see this in the increasing tendency for “dashboards” to be the organisational solution to learning analytics. It’s what the data mining folk in the institution already do, so why not do it for learning analytics? The only trouble is that the information provided by learning analytics is most useful within the LMS – a contributing factor to some of the limitations of the tools and the difficulty staff and students have using them.

The major difficulty for learning analytics is that action in response to learning analytics takes place at the teaching/learning coal-face, not in the dashboard system or the other places inhabited by senior management and support staff.

It’s not a causal system

University senior management assume that they can manipulate the behaviour of people. For example, there is the lovely quote I often use from an LMS working group, where one of the technical people suggested “that we should change people’s behaviour because information technology systems are difficult to change”. In a complex system there simply isn’t that sort of causality.

For example, when Moodle was introduced at the institution in question there was grave concern about how few Blackboard course sites actually contained discussion forums. A solution to this was the implementation of “minimum course standards” accompanied by a checklist that was ticked by the academic and double-checked by the moderator to assure certain standards were implemented (e.g. a discussion forum). Subsequent data reveals that while all courses may have had a discussion forum, a significant proportion of courses had discussion forums with fewer than 5 posts. This is the mistaken “overweening conceit of causality”.

Then there is the obvious confusion between correlation and causation, i.e. simply because HD students average more use of the LMS, it doesn’t mean that if all students use the LMS more they’ll get better marks.

Some alternatives

Okay, so given these problems what might you do differently? A few initial suggestions:

  • Put a focus on the technology to aid sense-making and action, especially by academics and students.
    The technology can’t know enough about the context to make decisions. It can, however, help the people make decisions.
  • Put this into the learning environment currently used by these folk (i.e. the LMS).
    It has to be a part of the environment. If it’s separate, it won’t be used.
  • Break-down the silos.
    Currently, much of learning analytics is within a course or across an institution, or perhaps focused on a specific student. Academics within a program need to be sharing their insights and actions. Students need to be able to see how they are going against others…

This is not meant to represent a new direction for the practice of learning analytics. Rather, it's one interesting avenue for further research.

References

Morrison, K. (2006). Complexity theory and education. APERA Conference, Hong Kong (pp. 1-12). Retrieved from http://edisdat.ied.edu.hk/pubarch/b15907314/full_paper/SYMPO-000004_Keith Morrison.pdf

The life and death of Webfuse: lessons for learning and leading into the future

The following is an attempt to formulate and structure some ideas for a paper for ascilite'12 in Wellington. The aim is to convert my PhD thesis – especially "The information systems design theory for emergent university e-learning" – into something useful and interesting for the ascilite crowd. What follows is an attempt to organise the mish-mash of content I currently have into something sensible.

Abstract

Drawing on the 14-year life and death of an integrated online learning environment used by tens of thousands of people, this paper argues that many of the principles and practices underpinning industrial e-learning – the current dominant institutional model – are inappropriate. The paper illustrates how industrial e-learning has been a significant drag on the current practice of tertiary e-learning in terms of quantity and quality and argues that it will actively prevent universities from being able to respond to uncertainty and effectively create and explore the future of learning. The paper proposes one alternative set of successfully implemented principles and practices and argues that this alternative will better enable institutions to lead in a climate of change, rather than following along behind.

Paper structure

The introduction will briefly

  • re-connect this paper with the 1996 ascilite paper that outlined the initial design of Webfuse;
  • set that in the broader history of the LMS (i.e. initially every institution had its own LMS), which during the noughties got replaced by one of the big two or three enterprise LMS;
  • illustrate the problems and limited outcomes of industrial e-learning;
  • link this trend with the broader pushes toward strategic/top-down/rational management practices within universities;
  • explain how Webfuse lived through this phase and moved further and further away from the industrial e-learning trend;
  • outline the structure of the paper
    • Research method
    • On the value of Webfuse
    • Product
    • People
    • Process
    • Conclusions and future work

The Research method section (in “proving” the academic credentials of this work) will talk about

  • the cycle of action research over 14 years;
  • the formulation of the Ps Framework and the ISDTs;
  • link this back to DBR.

The On the value of Webfuse section will seek to argue that the system based on the principles outlined in the paper was a "success" on a number of fronts. In doing so it will:

  • talk about the complex nature of what "success" means in terms of Information Systems implementation;
  • draw on quotes from the literature showing the value of the system as perceived by others;
  • summarise the greater levels of adoption and use of this system as compared to other systems both within and outside the institution;
  • talk about features of the system which were not present in other systems for years, if not ever.

The next three sections – Product, People, Process – will follow the same basic structure but will focus on a different essential component of e-learning. The structure will go something like

  • Explain the nature of the component as implemented in industrial e-learning.
  • Illustrate the problems that arise because of those principles.
  • Present the alternative set of principles and practices.

The conclusions and future work will probably cover some of the following (this is perhaps the section most likely to evolve)

  • The principles here are not a perfect solution – as a wicked design problem there is no such thing – and there are problems and limitations with this approach, not the least of which is the familiarity gap: the approach rejects many of the taken-for-granted assumptions of existing practice. Perhaps list these. Perhaps the biggest message of this approach is that institutions need to have practices that engage with these challenges rather than seek to abstract them out of existence.
  • That said, glimmers of these very different principles are increasingly visible in a range of movements within the host disciplines, e.g. agile management practices, agile software development etc.
  • Perhaps talk about the limitations of the research – impartiality, discipline/rigor, context-dependency.
  • A call for more design work and design theory in this area to test/refine the principles here or develop entirely different alternatives.
  • Testing this design theory is going to be extremely hard given the established nature of industrial e-learning within higher education organisations, in particular the spread of senior management staff who have a sense of ownership of it. This tends to rule out the possibility of doing much to address the People and Process aspects, at least at an institutional level, and leaves the Product aspect, where tinkering with open source LMSes may be a productive area for future work (though these bring their own sense of inertia). Some examples may include exploring how distributed cognition can improve these systems, as well as technical workarounds to improve their adaptability.

Expanding out Product, People, Process

The following is an initial attempt to expand out the three main sections. Completing this has highlighted the need to think about how to present/structure the problems.

For product, this will include:

  • Current nature – the LMS (supporting blog posts one and two).
    An enterprise system that has little or no capacity for change or customisation at the institutional or individual level (changing look and feel doesn't count). Even open source LMSs suffer problems here.
  • The problems include
    • Having to change the behaviour of people, because technology is hard to change (must include the Sturgess and Nouwens quote mentioned here).
    • More broadly the need for institutions to engage in large scale change projects because of new versions of the software.
    • Separation of data and services into separate systems (e.g. student records etc.)
    • Software that is generic rather than specific to institutional needs: the lowest common denominator.
    • e.g. assignment management functionality in most LMS in 2011/2012 that is behind what Webfuse had 10 years ago.
    • A focus on a single tool, e.g. one discussion forum.
  • The alternative

For process – this will draw on a few prior publications (thesis-based process posts one and two, the OODLA paper, a prior ascilite paper, and posts on procurement models and the role of people in LMS selection):

  • Current nature – i.e. teleological, plan-driven.
    This shouldn't be limited solely to strategic management or IT process selection. It needs to engage with the learning design folk who adopt this model for the design of teaching and see if there's an argument to make this better.
  • Limitations – drawn from the publications above.
    Also perhaps mention how it clashes with how people learn.
  • The alternative – ateleological.
    Draw on insights from the thesis, but also other work, e.g. Laurillard and others' calls for teachers to be action researchers and the need for the organisation to engage with this. Perhaps even bring in Biggs' quality model.

    In particular, see if arguments/suggestions can be developed to enhance “course design” in ways that are more ateleological.

    Mention Cavallo and the idea that any sort of change is learning and needs to connect to how people learn.

For people, the focus will likely be on the (techno-)rational model as it is applied to how people think/respond and also to how they are organised (a people blog post from the thesis).

  • Current situation – people are assumed to be rational, and logical decomposition is applied to split people up into sub-groups. There is also an emphasis on cheapness in support roles. There's also the problem raised towards the end of this post where the innovative central staff are trying to get people to use what's been provided. Perhaps linked to the chasm.
  • Problems.
    Politics caused by organisational structures. Frontline support tasks being taken on by roles that are amongst the lowest paid in the organisation and focused tightly on the products being used rather than broad skills. The chasm, and how most approaches target (intentionally or not) the early adopters. Chinese whispers. Starvation of requirements. Gaming of the system to fit the teleological constraints.
  • Alternative.
    Cross-disciplinary, highly skilled, distributed teams close to the users.