Assembling the heterogeneous elements for (digital) learning


Dilbert as an expository instantiation

A few recent posts have been first draft excerpts from my Information Systems Design Theory (ISDT) for emergent university e-learning systems. Academics being somewhat pedantic about these things, an ISDT is meant to have a number of specific components. One of these is the expository instantiation, which is meant to act as both an explanatory device and a platform for testing (Gregor and Jones, 2007), i.e. it’s meant to help explain the theory and also to enable testing of the theory.

The trouble is that today’s Dilbert cartoon is probably as good an explanation as any of what is currently the third principle of implementation for my ISDT.

Dilbert.com

I’m sure that most folk working in a context where they’ve had to use a corporate information system have experienced something like this. A small change – either to fix a problem or improve the system – simply can’t be made because of the nature of the technology or the processes used to make the changes. The inability to make these changes is a major problem for enterprise systems.

The idea from the ISDT is that the development and support team for an emergent university e-learning system should be able to make small scale changes quickly without having to push them up the governance hierarchy. Where possible the team should have the skills, insight, judgement and trust so that “small scale” is actually quite large.

An example

The Webfuse e-learning system that informed much of the ISDT provides one example. Behrens (2009) quotes a user of Webfuse on one example of how it was responsive:

I remember talking to [a Webfuse developer] and saying how I was having these problems with uploading our final results into [the Enterprise Resource Planning (ERP) system] for the faculty. He basically said, “No problem, we can get our system to handle that”… and ‘Hey presto!’ there was this new piece of functionality added to the system… You felt really involved… You didn’t feel as though you had to jump through hoops to get something done.

This is then compared with a quote from one of the managers responsible for the enterprise system:

We just can’t react in the same way that the Webfuse system can, we are dealing with a really large and complex ERP system. We also have to keep any changes to a minimum because of the fact that it is an ERP. I can see why users get so frustrated with the central system and our support of it. Sometimes, with all the processes we deal with it can take weeks, months, years and sometimes never to get a response back to the user.

Is that Dilbert or what?

The problem with LMS

Fulfilling this requirement is one of the areas where most LMS create problems. Most universities/organisations are getting into the situation where the LMS (even Moodle) is approaching the “complex ERP system” problem described in the quote above. Changing the LMS is so fraught with potential dangers that these changes can’t be made quickly. Most organisations don’t try, so we’re back to a Dilbert moment.

Hence, I think there are two problems facing universities trying to fulfil principle #3:

  1. Having the right people in the support and development team with the right experience, insight and judgement is not a simple thing, and it is directly opposed to current common practice, which seeks to minimise the number of such people. Instead there’s a reliance on helpdesk staff and trainers.
  2. The product problem. i.e. it’s too large and difficult to change quickly and safely. I think there’s some interesting work to be done here within Moodle and other open source LMS. How do you balance the “flexibility” of open source with the complexity of maintaining a stable institutional implementation?

References

Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information Systems, 8(5), 312-335.

Justificatory knowledge

The following is a first version of the justificatory knowledge section of my ISDT for emergent university e-learning systems. I’m still fairly uncertain just what information is meant to go in here, and also just how far I should go with the references to other theories (there are lots) and how much time should be spent looking at the interactions between them.

If you have some literature/theories which support or contradict this approach, I will be really happy to hear about it.

Justificatory knowledge

The purpose of the justificatory knowledge component is to provide an explanation of why the ISDT is structured as it is and why it should be expected to work appropriately. Much of the justificatory knowledge that underpins this ISDT has been described previously within the literature review (Chapter 2), the first Webfuse action research cycle (Chapter 4), and earlier in this chapter. To avoid repetition this section provides a summary and brief discussion of the justificatory knowledge underpinning the ISDT for emergent university e-learning systems. This summary is linked specifically to the ISDT’s principles for form and function, and principles of implementation.

The justificatory knowledge described below arose from the experiences obtained and the literature read during the design and support of Webfuse. This knowledge is not necessarily complete, nor the only established knowledge – theoretical or otherwise – that could be used to justify the principles of the ISDT. Hovorka and Germonprez (2009) identify, as a weakness of design research, the lack of guidance around the interaction between the various kernel theories that make up justificatory knowledge and how the influence of these kernel theories may change during use. To some extent, the context-sensitive, emergent nature of the approach embodied by this ISDT – and its kernel theories – means that such advice is embedded in the justificatory knowledge that supports the ISDT.

Justificatory knowledge for principles for form and function

Table 5.20 provides a summary of the justificatory knowledge and is followed by a brief description. For each of the three categories of principles of form and function for this ISDT, Table 5.20 provides pointers to sections of this thesis and references to literature that describe the justificatory knowledge in more detail.

Table 5.20 – Summary of justificatory knowledge for principles of form and function

| Principle | Justificatory knowledge |
| --- | --- |
| Integrated and independent services | Section 2.3.2 – Software wrappers (Bass, Clements et al. 1998; Sneed 2000) |
| Adaptive and inclusive architecture | Systems of systems (Perrochon and Mann 1999); Section 2.3.2 – Best of breed (Light, Holland et al. 2001; Lowe and Locke 2008), Service Oriented Architectures (Chen, Chen et al. 2003; Weller, Pegler et al. 2005), End-user development (Eriksson and Dittrich 2007); Section 4.4.4 – Micro-kernel architecture (Liedtke 1995) |
| Scaffolding, context-sensitive conglomerations | Constructive templates (Nanard, Nanard et al. 1998), End-user development (Eriksson and Dittrich 2007) |

As summarised in Chapter 4, a software wrapper is a type of encapsulation in which a software component is encased in an alternative abstraction that allows clients, often in a new context, to access the wrapped component’s services (Bass, Clements et al. 1998; Sneed 2000). As such, software wrappers are one example of an approach that provides integrated and independent software services.
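To make the wrapper idea concrete, here is a minimal sketch in Python. It is not Webfuse code and all names are hypothetical: a legacy results-upload component with an awkward interface is encased in a new abstraction so that a client in a new context can use it without knowing the original interface.

```python
# A minimal, hypothetical sketch of the software-wrapper idea. Names are illustrative only.

class LegacyResultsUploader:
    """Stands in for an existing component with a dated, fiddly interface."""

    def connect(self, host: str, port: int) -> None:
        print(f"connected to {host}:{port}")

    def put_record(self, raw_line: str) -> None:
        # expects a pipe-delimited record in a fixed layout
        print(f"uploaded: {raw_line}")


class ResultsUploadWrapper:
    """Wraps the legacy component behind the simpler interface a new client expects."""

    def __init__(self, host: str = "erp.example.edu", port: int = 5000):
        self._legacy = LegacyResultsUploader()
        self._legacy.connect(host, port)

    def upload(self, student_id: str, course: str, grade: str) -> None:
        # translate the convenient call into the legacy record format
        self._legacy.put_record(f"{student_id}|{course}|{grade}")


if __name__ == "__main__":
    uploader = ResultsUploadWrapper()
    uploader.upload("S0123456", "COIT11222", "HD")
```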

Some of the relative advantages and limitations of more tightly integrated systems are described in the enterprise software literature. In this literature, comparisons between tightly integrated systems and best-of-breed approaches have argued that integration involves centralisation of processes and consequently a tendency to reduce autonomy, increase rigidity, and reduce competitiveness (Light, Holland et al. 2001; Lowe and Locke 2008). The best-of-breed approach, focusing on a more inclusive integration of appropriate packages, increases system flexibility while at the same time requiring greater time, skills and resources to integrate diverse applications (Light, Holland et al. 2001). Perrochon and Mann (1999) argue that traditional approaches to system architecture, even those with a focus on adaptivity, are suited to greenfield developments due to their reliance on the assumption of designing (specifying the architecture) and then implementing. The rise of component-oriented software has created the problem of systems of systems, which require the combination of well-engineered components or systems into an overall system they were never, and could never be, designed for (Perrochon and Mann 1999).

The concept of constructive templates (Catlin, Garret et al. 1991; Nanard, Nanard et al. 1998) was developed in response to the difficulty faced by content providers in developing hypermedia structures that followed the known principles of interface and hypermedia design. Constructive templates helped content experts to create well designed hypermedia (Catlin, Garret et al. 1991).

Justificatory knowledge for principles of implementation

The justificatory knowledge for this ISDT’s principles of implementation – summarised in Table 5.21 and briefly described below – draws heavily on what is known about alternatives to traditional, plan-driven software development methodologies, as discussed in Section 2.4 and Sections 5.3.1 and 5.3.2 of this chapter.

Table 5.21 – Summary of justificatory knowledge for principles of implementation

| Principle | Justificatory knowledge |
| --- | --- |
| Multi-skilled, integrated development and support team | Job rotation, multi-skilling etc (Faegri, Dyba et al. 2010); Organisational learning, situated learning, situated action and communities of practice (Seely Brown and Duguid 1991); Knowledge-based theory of organizational capability (Grant 1996) |
| An adopter-focused, emergent development process | Section 2.4 examines the topic of processes, including a comparison of traditional plan-driven processes (e.g. the SDLC) and learning-focused processes such as emergent development, with additional discussion in Section 5.3.2; Section 5.3.1 introduces the conception of adopter-focused development |
| A supportive organisational context | Organisational fit (Hong and Kim 2002); Strategic alignment (Henderson and Venkatraman 1993); Bricolage (Chae and Lanzara 2006); Mindful innovation (Swanson and Ramiller 2004) |

Seely Brown and Duguid (1991) argue that the tendency for education, training and technology design to focus on abstract representations detached from practice distorts the intricacies of practice and consequently hinders how well practice can be understood, engendered, or enhanced. The idea of a development team integrated and embedded in the everyday practice of e-learning seeks to improve the learning of both academics and students about how to harness e-learning, and also to improve the learning of the development team (and the organisation) about how e-learning is being used. The ISDT seeks to establish a process for supporting and developing e-learning which is situated in shared practice with a joint, collective purpose.

Faegri, Dyba et al (2010) argue that turbulent environments increase the importance of employee skills and competences and that having employees cycle through different jobs – such as developers spending time on the helpdesk – can improve knowledge redundancy and organisational knowledge creation, among other benefits. Faegri, Dyba et al (2010) also cite Keil-Slawik (1992) as arguing that a full understanding of software requires experience developing that software. The traditional hierarchical structures associated with the division of labour around e-learning within universities – e.g. helpdesk staff and developers organised into separate units within an IT division; learning and teaching experts located in another division focused on learning and teaching; and faculty academics located in academic units – are seen by Grant (1996) to inhibit the ability to integrate knowledge from members of an organisation. Such integration is seen as fundamental to the organisation’s ability to create and sustain competitive advantage (Grant 1996).

There is significant literature (March 1991; Baskerville, Travis et al. 1992; Mintzberg 1994; Bamford and Forrester 2003) in a variety of disciplines that identifies plan-driven processes as the dominant approach in most organisations. This and related literature also examines the limitations of this over-emphasis, especially in contexts with rapid change or significant diversity (see Section 2.4). Consequently there is significant literature identifying both the theoretical basis and guidance (Introna 1996; Truex, Baskerville et al. 1999) and practical implementation methods (Beck 2000; Schwaber and Beedle 2002) for more emergent or adopter-based development processes.

An emergent university e-learning information system is a large-scale information system. In this context, “large-scale” is used in the sense adopted by Chae and Lanzara (2006), as referring to systems that involve both organisational technologies and technological innovations that “comprise and connect multiple communities of practice within an organisation or between organisations”. Literature examining success factors in information systems development (e.g. Ewusi-Mensah 1997; Scott and Vessey 2002) has long considered it vital for senior management to be supportive of and committed to systems development. Brown et al (2007) identify commitment – defined as the resources dedicated to IT, organisational dedication to change procedures, and top management support – as one of the two most cited problems in the IS projects they examined, and as the factor most cited within the literature.

Organisational fit (Hong and Kim 2002) and strategic alignment (Henderson and Venkatraman 1993) between various aspects of an organisation and its information technology systems and processes have long been argued to be critical success factors. A similar importance placed on having an organisational context that is committed and appropriate to information systems development is also found in approaches that are less traditional or teleological (e.g. bricolage and mindful innovation) and have more in common with the emergent, adopter-focused approach advocated in this ISDT. Collective or organisational bricolage requires the combined effort of several individuals and groups (Chae and Lanzara 2006). An organisation which is mindful in innovating with IT uses reasoning grounded in its own organisational facts and specifics when thinking about the innovation; the organisation recognises that context matters (Swanson and Ramiller 2004). Within mindful innovation, management have a responsibility to foster conditions that prompt collective mindfulness (Swanson and Ramiller 2004).

References

Bamford, D., & Forrester, P. (2003). Managing planned and emergent change within an operations management environment. International Journal of Operations and Production Management, 23(5), 546-564.

Baskerville, R., Travis, J., & Truex, D. (1992). Systems without method: the impact of new technologies on information systems development projects. In K. E. Kendall (Ed.), The Impact of Computer Supported Technologies on Information Systems Development (pp. 241-251). Amsterdam: North-Holland.

Bass, L., Clements, P., & Kazman, R. (1998). Software Architecture in Practice. Boston: Addison-Wesley.

Beck, K. (2000). Extreme Programming Explained: Embrace Change: Addison-Wesley.

Brown, S., Chervany, N., & Reinicke, B. (2007). What matters when introducing new information technology. Communications of the ACM, 50(9), 91-96.

Catlin, K., Garret, L. N., & Launhardt, J. (1991). Hypermedia Templates: An Author’s Tool. Paper presented at the Proceedings of Hypertext’91.

Chae, B., & Lanzara, G. F. (2006). Self-destructive dynamics in large-scale technochange and some ways of counteracting it. Information Technology & People, 19(1), 74-97.

Chen, M., Chen, A., & Shao, B. (2003). The implications and impacts of web services to electronic commerce research and practices. Journal of Electronic Commerce Research, 4(4), 128-139.

Eriksson, J., & Dittrich, Y. (2007). Combining tailoring and evolutionary software development for rapidly changing business systems. Journal of Organizational and End User Computing, 19(2), 47-64.

Ewusi-Mensah, K. (1997). Critical Issues in Abandoned Information Systems Development Projects. Communications of the ACM, 40(9), 74-80.

Faegri, T. E., Dyba, T., & Dingsoyr, T. (2010). Introducing knowledge redundancy practice in software development: Experiences with job rotation in support work. Information and Software Technology, 52(10), 1118-1132.

Grant, R. (1996). Prospering in dynamically competitive environments: organizational capability as knowledge integration. Organization Science, 7(4), 357-387.

Henderson, J., & Venkatraman, N. (1993). Strategic alignment: Leveraging information technology for transforming organizations. IBM Systems Journal, 32(1), 4-16.

Hong, K.-K., & Kim, Y.-G. (2002). The critical success factors for ERP implementation: an organizational fit perspective. Information & Management, 40(1), 25-40.

Hovorka, D., & Germonprez, M. (2009). Tinkering, tailoring and bricolage: Implications for theories of design. Paper presented at the AMCIS 2009.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Keil-Slawik, R. (1992). Artifacts in software design. In C. Floyd, H. Zullighoven, R. Budde & R. Keil-Slawik (Eds.), Software Development and Reality Construction (pp. 168-188). Berlin: Springer-Verlag.

Liedtke, J. (1995). On micro-kernel construction. Operating Systems Review, 29(5), 237-250.

Light, B., Holland, C., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216-224.

Lowe, A., & Locke, J. (2008). Enterprise resource planning and the post bureaucratic organization. Information Technology & People, 21(4), 375-400.

March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71-87.

Mintzberg, H. (1994). The rise and fall of strategic planning: Reconceiving roles for planning, plans, planners. New York: Free Press.

Nanard, M., Nanard, J., & Kahn, P. (1998). Pushing Reuse in Hypermedia Design: Golden Rules, Design Patterns and Constructive Templates. Paper presented at the Proceedings of the 9th ACM Conference on Hypertext and Hypermedia.

Perrochon, L., & Mann, W. (1999). Inferred Designs. IEEE Software, 16(5), 46-51.

Schwaber, K., & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Scott, J., & Vessey, I. (2002). Managing risks in enterprise systems implementations. Communications of the ACM, 45(4), 74-81.

Seely Brown, J., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2(1), 40-57.

Sneed, H. (2000). Encapsulation of legacy software: A technique for reusing legacy software components. Annals of Software Engineering, 9(1-4), 293-313.

Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 28(4), 553-583.

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.

Weller, M., Pegler, C., & Mason, R. (2005). Students’ experience of component versus integrated virtual learning environments. Journal of Computer Assisted Learning, 21(4), 253-259.

Principles of form and function

The aim of my thesis is to formulate an information systems design theory for e-learning. Even though I have a publication or two that have described early versions of the ISDT, I’ve never been really happy with them. However, I’m getting close to the end of this process, at least for the purposes of getting the thesis submitted.

The following is a first draft of the “Principles of form and function”, one of the primary components of an ISDT as identified by Gregor and Jones (2007). I’ll be putting up a draft of the principles of implementation in a little while (UPDATE: principles of implementation now up). These are still just approaching first draft stage; they need a bit more reflection and some comments from my esteemed supervisor. Happy to hear thoughts.

By the way, the working title for this ISDT is now “An ISDT for emergent university e-learning systems”.

Principles of form and function

Gregor and Jones (2007) describe the aim of the principles of form and function as defining the structure, organisation, and functioning of the design product or design method. The ISDT described in Chapter 4 was specifically aimed at the World-Wide Web, as shown in its title, “An ISDT for web-based learning systems”. Such technology-specific assumptions are missing from the ISDT described in this chapter in order to avoid technological obsolescence. By not relying on a specific technology the ISDT can avoid a common problem with design research – the perishability of findings – and enable the on-going evolution of any instantiation to continue regardless of the technology.

The principles of form and function for this ISDT are presented here as divided into three groupings: integrated and independent services; adaptive and inclusive architecture; and, scaffolding, context-sensitive conglomerations. Each of these groupings and the related principles are described in the following sub-sections and illustrated through examples from Webfuse. The underlying aim of the following principles of form and function is to provide a system that is easy to modify and focused on providing context-specific services. The ISDT’s principles of implementation (Section 5.6.4) are designed to work with the principles of form and function in order to enable the design of an emergent university e-learning information system.

Integrated and independent services

The emergent nature of this ISDT means that, rather than prescribe a specific set of services that an instantiation should provide, the focus here is on providing mechanisms to quickly add and modify new services in response to local need. It is assumed that an instantiation would provide an initial set of services (see principle 4) with which system use could begin. Subsequent services would be added in response to observed need.

An emergent university e-learning system should:

  1. Provide a method or methods for packaging and using necessary e-learning services from a variety of sources and of a variety of types.
    For example, Webfuse provided two methods for user-level packaging of services – page types and Wf applications – and also used design patterns and object-oriented design for the packaging of implementation-level services. The types of services packaged through these means included: information stored in databases; various operations on that data; external services such as enterprise authentication services; open source COTS; and remote applications such as blogging tools.
  2. Provide numerous ways to enable different packages to interact and integrate.
    Webfuse provided a number of methods through which the packaging mechanisms described in the previous point could be integrated. For example, Wf applications provided a simple, consistent interface that enabled easy integration from numerous sources. It was through this approach that Wf applications such as email merge, course list, and course photo album were integrated into numerous other services. To allow staff to experience what students see on StudentMyCQU, the ViewStudentMyCQU application was implemented as a wrapper around the StudentMyCQU application.
  3. Provide a packaging mechanism that allows for a level of independence and duplication.
    Within Webfuse, modifications to page types could be made with little or no effect on other page types. It was also possible to have multiple page types of the same type. For example, there were three different web-based discussion forums with slightly different functionality preferred by different users. Similarly, the use of the Model-View-Controller design pattern in Wf applications enabled the same data to be represented in many different forms. For example, class lists could be viewed by campus, with or without student photos, as a CSV file, as an HTML page, etc. (a minimal sketch of this model/view separation follows this list).
  4. Provide an initial collection of services that provide a necessary minimum of common e-learning functionality covering: information distribution, communication, assessment, and administration.
    The initial collection of services for Webfuse in 2000 included the existing page types and a range of support services (see Section 4.4.3). These provided sufficient functionality for academics to begin using e-learning. It was this use that provided the opportunity to observe, learn and subsequently add, remove and modify available services (see Section 5.3).
  5. Focus on packaging existing software or services for integration into the system, rather than developing custom-built versions of existing functionality.
    With Webfuse this was mostly done through the use of the page types as software wrappers around existing open source software as described in Chapter 4. The BAM Wf application (see 5.3.6) integrated student use of existing blog engines (e.g. http://wordpress.com) into Webfuse via standardised XML formats.
  6. Present this collection of services in a way that for staff and students resembles a single system.
    With Webfuse, whether users were managing incidents of academic misconduct, finding the phone number of a student, responding to a student query on a discussion forum, or uploading a Word document they believed they were using a single system. Via Staff MyCQU they could access all services in a way that fit with their requirements.
  7. Minimise disruption to the user experience of the system.
    From 1997 through 2009, the authentication mechanism used by Webfuse changed at least four times. Users of Webfuse saw no visible change. Similarly, Webfuse page types were re-designed from purely procedural code to being heavily object-oriented. The only changes in the user interface for page types were where new services were added.
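As flagged in principle 3 above, the following is a minimal Python sketch of the model/view separation: the same class-list data filtered by campus, rendered as CSV, or rendered as HTML, without duplicating the underlying data. It is illustrative only, not Webfuse’s actual implementation, and all names are hypothetical.

```python
# A minimal sketch (not Webfuse code) of one model with multiple views.

from dataclasses import dataclass
from typing import List


@dataclass
class Student:
    name: str
    student_id: str
    campus: str


class ClassList:
    """The model: holds the data and simple queries over it."""

    def __init__(self, students: List[Student]):
        self.students = students

    def by_campus(self, campus: str) -> List[Student]:
        return [s for s in self.students if s.campus == campus]


def render_csv(students: List[Student]) -> str:
    """One view of the same data."""
    lines = ["name,student_id,campus"]
    lines += [f"{s.name},{s.student_id},{s.campus}" for s in students]
    return "\n".join(lines)


def render_html(students: List[Student]) -> str:
    """Another view of the same data."""
    rows = "".join(
        f"<tr><td>{s.name}</td><td>{s.student_id}</td><td>{s.campus}</td></tr>"
        for s in students
    )
    return f"<table>{rows}</table>"


if __name__ == "__main__":
    class_list = ClassList([
        Student("Alice", "S001", "Rockhampton"),
        Student("Bob", "S002", "Bundaberg"),
    ])
    print(render_csv(class_list.students))
    print(render_html(class_list.by_campus("Rockhampton")))
```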

Adaptive and inclusive architecture

Sommerville (2001) defines software architecture as the collection of sub-systems within the software and the framework that provides the necessary control and communication mechanisms for these sub-systems. The integrated and independent services described in the previous section are the “sub-systems” for an emergent university e-learning system. Such a system, like all large information systems, needs some form of system architecture. The major difference for this ISDT is that traditional architectural concerns such as consistency and efficiency are not as important as being adaptive and inclusive.

The system architecture for an emergent university e-learning system should:

  1. Be inclusive by supporting the integration and control of the broadest possible collection of services.
    The approach to software wrappers adopted as part of the Webfuse page types was to enable the integration of any external service at the expense of ease of implementation. Consequently, the Webfuse page types architecture integrated a range of applications using very different software technologies, including a chat room that was a Java application; a page counter implemented in the C programming language; a lecture page type that combined numerous different applications; and three different discussion forums implemented in Perl. In addition to the page types, Webfuse also relied heavily on the architecture provided by the Apache web server for access control, authentication, and other services. The BAM Wf application (Section 5.3.6) used RSS and Atom feeds as a method for integrating disparate blog applications. Each of these approaches embodies a very different architectural model, which increases the cost of implementation but also increases the breadth of services that can be integrated and controlled.
  2. Provide an architecture that is adaptive to changes in requirements and context.
    One approach is the use of an architectural model that provides high levels of maintainability through fine-grained, self-contained components (Sommerville 2001). This was initially achieved in Webfuse through the page types architecture. However, in order to achieve a long-lived information system there is a need for more than this. Sommerville (2001) suggests that major architectural changes are not a normal part of software maintenance. As a system that operated for 13 years in a Web environment, Webfuse had to undergo major architectural changes. In early 2000, performance problems arose due to increased demand for dynamic web applications (student quizzes), resulting in a significant change to the Webfuse architecture. This change was aided by Webfuse’s reliance on the Apache web server, whose continual evolution provided the scaffolding for the architectural change.

The perspective of this ISDT is that traditional homogeneous approaches to software architecture (e.g. component architectures) offer numerous advantages. However, there are some drawbacks. For example, a component architecture can only integrate components that have been written to meet the specifications of the component architecture. Any functionality not available within that component architecture is not available to the system. To some extent such a limitation closes off possibilities for diversity – which this ISDT views as inherent in university learning and teaching – and future emergent development. This does not rule out the use of component architectures within an emergent university e-learning system, but it does mean that such a system would also be using other architectural models at the same time to ensure it was adaptive and inclusive.

Scaffolding, context-sensitive conglomerations

The design of e-learning in universities requires the combination of skills from a variety of different professions (e.g. instructional design, web design etc), and yet is most often performed by academics with limited knowledge of any of these professions. This limited knowledge creates significant workload for the academics and contributes to the limited quality of much e-learning. Adding experts in these fields to help with course design is expensive and somewhat counter to the traditional practice of learning and teaching within universities. This suggests that e-learning in universities has a need for approaches that allow the effective capture of expertise in a form that can be re-used by non-experts without repeated direct interaction with experts. Such an approach could aim to reduce perceived workload and increase the quality of e-learning.

An emergent university e-learning information system should:

  1. Provide the ability to easily develop larger conglomerations of packaged services, including through end-user development.
    A conglomeration is not simply an e-learning service such as a discussion forum. Instead it provides additional scaffolding around such services, possibly combining multiple services, to achieve a higher-level task. While many conglomerations would be expert designed and developed, offering support for end-user development would increase system flexibility. The Webfuse default course site approach (Section 5.3.5) is one example of a conglomeration. A default course site combines a number of separate page types (services), specific graphical and instructional designs, and existing institutional content into a course website with a minimum of human input. Another form of conglomeration that developed with Webfuse was Staff MyCQU. This “portal” grew to become a conglomeration of integrated Wf applications designed to package a range of services academics required for learning and teaching.
  2. Ensure that conglomerations provide a range of scaffolding to aid users, increase adoption and increase quality.
    There is likely to be some distance between the knowledge of the user and that required to effectively use e-learning services. Scaffolding provided by the conglomerations should seek to bridge this distance, encourage good practice, and help the user develop additional skills. For example, over time an “outstanding tasks” element was added to Staff MyCQU to remind staff of unfinished work in a range of Wf applications (a small sketch of this sort of aggregation follows this list). The BAM Wf application was designed to support the workload involved in tracking and marking individual student reflective journals (Jones and Luck 2009). A more recent example, focused more on instructional design, is the instructional design wizard included in the new version of the Desire2Learn LMS. This wizard guides academics through course creation via course objectives.
  3. Embed opportunities for collaboration and interaction into conglomerations.
    An essential aim of scaffolding conglomerations is enabling and encouraging academics to learn more about how to effectively use e-learning. While the importance of community and social interaction to learning is widely recognised, most professional development opportunities occur in isolation (Bransford, Brown et al. 2000). Conglomerations should aim to provide opportunities for academics to observe, question and discuss use of the technology. Examples from Webfuse are limited to the ability to observe. For example, all Webfuse course sites were, by default, open for all to see. The CourseHistory Wf application allowed staff to see the grade breakdown for all offerings of any course. A better example would have been if the CourseHistory application encouraged and enabled discussions about grade breakdowns.
  4. Ensure that conglomerations are context-sensitive.
    Effective integration with the specific institutional context enables conglomerations to leverage existing resources and reduce cognitive dissonance. For example, the Webfuse default course site conglomeration was integrated with a range of CQU specific systems, processes and resources. The Webfuse online assignment submission system evolved a number of CQU specific features that significantly increased perceptions of usefulness and ease-of-use (Behrens, Jamieson et al. 2005).
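The “outstanding tasks” element mentioned in principle 2 can be sketched roughly as an aggregation problem: each application reports its own unfinished work, and the portal simply collects those reports into one reminder list. The Python below is a hypothetical illustration (application names and messages are invented), not Webfuse’s implementation.

```python
# A minimal, hypothetical sketch of an "outstanding tasks" aggregator.

from typing import Callable, Dict, List

# Each application registers a function that returns its outstanding tasks
# for a given staff member.
task_sources: Dict[str, Callable[[str], List[str]]] = {
    "assignment submission": lambda staff_id: ["3 assignments awaiting marking"],
    "academic misconduct": lambda staff_id: [],
    "blog aggregation (BAM)": lambda staff_id: ["2 student blog posts unmarked"],
}


def outstanding_tasks(staff_id: str) -> List[str]:
    """Collect unfinished work across all registered applications."""
    tasks = []
    for app_name, source in task_sources.items():
        for task in source(staff_id):
            tasks.append(f"{app_name}: {task}")
    return tasks


if __name__ == "__main__":
    for task in outstanding_tasks("jsmith"):
        print(task)
```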

References

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. Paper presented at the Australasian Conference on Information Systems’2005, Sydney.

Bransford, J., Brown, A., & Cocking, R. (2000). How people learn: brain, mind, experience, and school. Washington, D.C.: National Academy Press.

Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information Systems, 8(5), 312-335.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. Paper presented at the World Conference on Education Multimedia, Hypermedia and Telecommunications 2009. from http://www.editlib.org/p/31530.

Sommerville, I. (2001). Software Engineering (6th ed.): Addison-Wesley.

How strict a blueprint do ISDTs provide?

Am working on the final ISDT for the thesis. An Information Systems Design Theory (ISDT) is a theory for design and action. It is meant to provide general principles that help practitioners design information systems. Design theory provides guidance about how to build an artifact (process) and what the artifact should look like when built (product/design principles) (Walls, Widmeyer et al. 1992; Gregor 2002). Walls et al (1992) see an ISDT as an integrated set of prescriptions consisting of a particular class of user requirements (meta-requirements), a type of system solution with distinctive features (meta-design) and a set of effective development practices (design method). Each of these components of an ISDT can be informed by kernel theories, either academic or practitioner theory-in-use (Sarker and Lee 2002), that enable the formulation of empirically testable predictions relating the design theory to outcomes (Markus, Majchrzak et al. 2002).

My question

I’m just about happy with the “ISDT for emergent university e-learning systems” that I’ve developed. A key feature of the ISDT is the “emergent” bit. This implies that the specific context within which the ISDT might be applied is going to heavily influence the final system. To some extent there is a chance that aspects of the ISDT should be ignored based on the nature of the specific context. Which brings me to my questions:

  1. How far can the ISDT go in saying, “ignore principle X” if it doesn’t make sense?
  2. How much of the ISDT has to be followed for the resulting system to be informed by the ISDT?
  3. If most of the ISDT is optional based on contextual factors, how much use is the ISDT?
  4. How much and what sort of specific guidance does an ISDT have to give to be useful and/or worthwhile?

Class of systems

One potential line of response to this is based on the “class of systems” idea. The original definition provided by Walls et al (1992) for the meta-design component indicates that it “Describes a class of artefacts hypothesized to meet the meta-requirements” and not a specific instantiation. van Aken (2004) suggests that rather than a specific prescription for a specific situation (an instantiation), the intent should be a general prescription for a class of problems. van Aken (2004) arrives at this idea through the use of Bunge’s idea of a technological rule.

van Aken (2004) goes on to explain the role of the practitioner in the use of a technological rule/ISDT:

Choosing the right solution concept and using it as a design exemplar to design a specific variant of it presumes considerable competence on the part of practitioners. They need a thorough understanding both of the rule and of the particulars of the specific case and they need the skills to translate the general into the specific. Much of the training of students in the design sciences is devoted to learning technological rules and to developing skills in their application. In medicine and engineering, technological rules are not developed for laymen, but for competent professionals.

This seems to offer some support for the idea that this problem is not really a problem.

Emergent

It appears that the idea of “emergent” then simply places a greater emphasis on context than is generally the case in practice. There is, I believe, a significant difference between emergent/agile development and traditional approaches, so it’s probably worthwhile making the distinction in a mild way when introducing the ISDT and then reinforcing this in the artifact mutability and principles of implementation sections.

The first stab

The following paragraph is a first draft of the last paragraph in the introduction to the ISDT. It starts alright, but I’m not sure I’ve really captured (or understand) what I’m trying to get at with this. Is it just an attempt to signpost perspectives included below? Need to be able to make this clearer I think.

It is widely accepted that an ISDT – or the related concept of a technological rule – is not meant to describe a specific instantiation, but instead to provide a general prescription for a class of problems (Walls, Widmeyer et al. 1992; van Aken 2004). The ISDT presented here is intended to offer a prescription for e-learning information systems for universities. In addition to addressing this general class of problems, the ISDT presented here also includes in its prescription specific advice – provided in the principles of implementation and artifact mutability components of the ISDT – intended to be somewhat more general again. This is captured in the use of the word “emergent” in the title of the ISDT, intended in the sense adopted by Truex et al (1999) where “organisational features are products of constant social negotiation and consensus building….never arriving but always in transition”. This suggests the possibility that aspects of this ISDT may also be subject to negotiation within specific social contexts and subsequently not always seen as relevant.

References

Gregor, S. (2002). Design Theory in Information Systems. Australian Journal of Information Systems, 14-22.

Markus, M. L., Majchrzak, A., & Gasser, L. (2002). A Design Theory for Systems that Support Emergent Knowledge Processes. MIS Quarterly, 26(3), 179-212.

van Aken, J. (2004). Management research based on the paradigm of the design sciences: The quest for field-tested and grounded technological rules. Journal of Management Studies, 41(2), 219-246.

Walls, J., Widmeyer, G., & El Sawy, O. A. (1992). Building an Information System Design Theory for Vigilant EIS. Information Systems Research, 3(1), 36-58.

Light-weight analytics tools as part of scaffolding, context-sensitive conglomerations

A couple of days ago I floated the idea of scaffolding, context-sensitive conglomerations as one idea/model/suggestion for how e-learning systems (currently mostly LMS, but hopefully other models will arise) might be designed.

George Siemens has posted about light-weight analytics tools such as SNAPP. Both of the comments on that post are, to my current somewhat focused/biased perspective, suggestions of the need for scaffolding conglomerations. Both comments are from practitioners who talk about how they supplement their use of discussion forums with other forms of representation. It would appear obvious that these combinations of tools are useful. I’m pretty sure you could find quite a few talented and motivated academics across the world who are using this combination. I’m also pretty sure that few of them would be located within the same institution.

Are there any discussion forum tools in e-learning systems that already provide this sort of scaffolding for users? Are there any IT departments in universities that have recognised this need and are helping academics make this connection?

I’m not aware of any, and this suggests to me that there are some fundamental problems with the way these systems are being supported and structured. i.e. current approaches mean it is unlikely for these sorts of networks of tools/conglomerations to arise.

SNAPP has used a good approach that makes it simpler to create these conglomerations, through the use of browser plugins. But the advantages of that approach come with a negative, i.e. I don’t believe you can currently generate a SNAPP visualisation for a group of courses (e.g. to see how the students/staff in a program are interacting), or compare visualisations between different courses. You also can’t easily combine SNAPP with other context-specific data sources such as the student records system etc.

Consequently, SNAPP is a great example of a tool that enables a “scaffolding conglomeration” when combined with an LMS discussion forum. But it still remains difficult to add the “context-sensitive” component.
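To illustrate what the “context-sensitive” component might look like, here is a rough Python sketch that builds a who-replies-to-whom network from forum posts drawn from more than one course, and annotates participants from a student records lookup. This is not how SNAPP itself works; the data structures, course codes and field names are all assumptions made for illustration.

```python
# A rough, hypothetical sketch: forum reply network across courses, annotated
# with data from another institutional source (a stand-in for student records).

from collections import Counter
from typing import Dict, List, Tuple

# Forum posts from two hypothetical courses: (course, author, replied_to_author)
posts: List[Tuple[str, str, str]] = [
    ("COIT11222", "alice", "bob"),
    ("COIT11222", "bob", "alice"),
    ("EDED20491", "alice", "carol"),
    ("EDED20491", "carol", "alice"),
    ("EDED20491", "carol", "bob"),
]

# A stand-in for a student records system lookup.
student_records: Dict[str, str] = {
    "alice": "Bachelor of Education",
    "bob": "Bachelor of IT",
    "carol": "Bachelor of Education",
}


def interaction_network(posts, courses=None) -> Counter:
    """Count replier -> repliee edges, optionally restricted to a set of courses."""
    edges = Counter()
    for course, author, replied_to in posts:
        if courses is None or course in courses:
            edges[(author, replied_to)] += 1
    return edges


if __name__ == "__main__":
    # A program-wide view across both courses, annotated with program of study.
    for (author, replied_to), weight in interaction_network(posts).items():
        print(f"{author} ({student_records[author]}) -> "
              f"{replied_to} ({student_records[replied_to]}): {weight}")
```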

I think there’s value in exploring how SNAPP and similar tools can be used as both scaffolding and context-sensitive conglomerations, and more importantly, what impact it can have on the practice of teachers, and subsequently the quality of learning.

Adding advice

While I remember, there’s a next step that I’d like to see a “scaffolding, context-sensitive conglomeration” take: advice, examples and connections.

For example, assume I’m teaching a course, I use a discussion forum and I’ve designed its use for a specific pedagogic purpose. I’ve installed SNAPP and to my horror discover a problem. What do I do? What strategies can I employ to address this problem? What strategies have other academics in similar (or even different situations) used? What happened? Can I get their contact details so I can have a chat?

I would imagine that this addition could also be implemented for students who discovered a “bad” pattern in their own practice. They could receive advice, examples and connections from other students.

The learning analytics group seems to lean towards “intelligent”/”adaptive” software to provide this sort of service. I’m more interested in how we can use these tools to connect people and provide the scaffolding that enables and encourages them to take some action.

Misc. reflections on reading about situated cognition

For various reasons, mostly PhD related (and somewhat related to procrastination), I’m taking the time to read a bit more about situated cognition. Not sure how far it will go. The following are some ad hoc reflections and essentially a diary of what I’m reading. Not aiming for this post to fulfil any purpose beyond being a place to dump observations.

The wikipedia page

So far, the Wikipedia page on situated cognition seems fairly extensive and a reasonable place to start.

Misapplication of community of practice?

The wikipedia page has the following definition of community of practice

The concept of a **community of practice** (often abbreviated as CoP) refers to the process of social learning that occurs and shared sociocultural practices that emerge and evolve when people who have common goals interact as they strive towards those goals.

I find this somewhat interesting in that my experience with CoPs around university learning and teaching has been with special groups set up for specific purposes above and beyond normal teaching, i.e. rather than having a CoP around teaching at university X, where the common purpose is to teach, a CoP is set up around attrition, graduate attributes etc. and focuses on that as the goal, rather than the teaching.

I do wonder whether this on-going ad hoc creation of CoPs around special topics that are important, but haven’t been embedded into common institutional practice is a symptom of CoP misuse. i.e. if normal practice of teaching within an institution was more like a CoP, would you really need a separate CoP on retention etc? Does the need for separate CoPs indicate that the normal practice of teaching within an institution isn’t like a CoP and hence, perhaps there isn’t a sense of a common purpose amongst those involved in the process of normal teaching? Instead of a common purpose, do the actors within an institution’s normal practice of teaching and learning have their own different purposes?

The glossary from which this definition comes highlights the amount of thought, and consequently the special language, that has arisen around situated cognition. It also has some interesting resonance with the need for design theories to have a constructs section.

Affordances

Interesting to see some of the origins of the concept of affordances beyond Norman and HCI/usability. The idea that affordances are the individual’s interpretations of what action is possible within a given environment, formed through their perception of that environment, connects strongly with some of the problems I have with the stereotypical university teaching environment and the nature of e-learning systems, i.e. I think the affordances seen by many teaching staff aren’t good in terms of improving learning and teaching.

The relationship between affordances and schemata seems to be an area of some disquiet and needs more reading – Glenberg and Robertson (1999).

Perception

This section was a little disappointing; it only mentions visual perception. While this is apparently an important influence on situated cognition, even from my limited reading and knowledge there appears to have been a lot more work done on perception.

Memory

Again I feel there could be more here. But it does pose the interesting question of how situated cognition downplays the importance of stored, symbolic representations in memory, instead holding that perception and action are co-determined by effectivities and affordances. This raises the question of what learning and knowing are (which are covered next). It also links back to the disquiet about the link between affordances and schemata.

Knowing and learning

So knowing is not a thing, a memory, but a verb; it is the action/participation of an agent in an environment. This is where the idea that knowledge cannot be separated from context comes from, and it gives rise to the importance of context.

This is interesting, challenging and somewhat comforting. It is comforting because to some extent it represents an idea embedded in how Webfuse worked. For Webfuse, context was important. Webfuse wasn’t a general purpose tool that could be used elsewhere; Webfuse could not be separated from its context. Mmm, situated cognition as a kernel theory for the ISDT and Webfuse?

It does seem that some of these sections have been made very narrow, focusing specifically on language/literacy learning.

Pedagogical implications

So, situated cognition is the theory of “mind/knowing/learning etc”, while cognitive apprenticeship etc are instructional design theories drawing on that theory, it appears.

Critiques

A small section on critiques closes off the page. On the face of it, the critiques do a reasonable job of removing many of the assumptions on which situated cognition is based. Need to have a look further into Anderson et al (2000).

Of course the criticisms come from cognitivists – more information-processing types – who hold a perspective that has also been challenged. Interesting that Herbert Simon is one of the critics (last author on Anderson et al).

References

Anderson, J. R., Greeno, J. G., Reder, L. M., & Simon, H. A. (2000). Perspectives on learning, thinking, and activity. Educational Researcher, 29, 11-13.

Scaffolding, context-sensitive conglomerations in e-learning systems

For the last week or so I’ve been attempting to bring together the principles that underpin the design theory for e-learning that will be the main contribution of my thesis. This post summarises one of the principles of form and function that is emerging from the last week or so. I’m using this blog post to escape the confines of PhD-ese and see if writing about it can generate some further refinement.

Am aiming for brevity and clarity in the following.

A design theory contains principles of form and function, and principles of implementation. The following is more form and function, while the previous post is more implementation. There is more to the ISDT than just these two posts.

The problem

The functionality provided by most e-learning systems is in the form of low-level primitive tasks (e.g. upload a document, add a discussion forum, create a quiz). Most e-learning systems provide a fairly large collection of these low-level features. Many of these features, because they are designed for the general case, come with large numbers of options and configuration settings so that the task can be tweaked to the broadest collection of uses.

Consequently, achieving a high-level task (e.g. supporting a discussion forum that encourages student-to-student collaboration, or creating an educationally well-designed course website) requires a significant amount of knowledge, skills and time on the part of teaching academics. Some (perhaps much) of the time spent by teachers in leveraging these low-level, very general features is expended on bridging the gap between the generality of the feature set and the specifics of the context. Much of the time is spent trying to translate and wrangle what is known about good practice into the specifics required to implement it with the low-level feature set of the e-learning system.

I suggest that these difficulties limit the quality and quantity of how these feature sets are used. These difficulties appear to be one of the factors that contribute to the on-going fairly poor quality of most e-learning within formal education settings, especially universities.

Scaffolding, context-sensitive conglomerations as a solution?

The idea is that an e-learning system should provide the ability to create “scaffolding, context-sensitive, conglomerations” of its low-level feature set.

Examples

An example of this is the default course site “conglomeration” used in the Webfuse system. This is the system I designed and forms the basis for my PhD. So you can see where the idea has come from. The default course sites conglomeration was aimed at making it very simple for academics to create a default course site. So simple, in fact, that if they wanted to, they didn’t have to do anything.

Another possible example of a “conglomeration” from my own work could be the Moodle module BIM. BIM is a conglomeration of external blog engines (e.g. WordPress.com) and various Moodle features, especially the gradebook. The BIM “conglomeration” is designed to make it simpler for teachers to manage students using individual blogs for assessment and other purposes. It provides a bridge between the external blogs and Moodle’s marking/management features.

Moving further afield, the work being done by Desire2Learn with its instructional design wizard provides another approach to the provision of a conglomeration, in this case through a “wizard” approach.

Conglomerations

In this context, a conglomeration is essentially a way to group together the existing low-level functionality provided by the e-learning system into something at a much higher level of abstraction. The term conglomeration has been chosen specifically to avoid terms like component, framework, package etc that, at least with some technical systems, refer to fairly specific ways of combining technology features. I’m trying to avoid these terms for a range of reasons, including:

  • Non-specificity;
    The aim of the ISDT (information systems design theory) is to provide broad guidance that is not specific to a particular approach or technology.
  • Inclusivity;
    Most of the technical approaches to grouping functionality (e.g. component-based architectures) are not inclusive, i.e. if you’re using architecture X, you can only group features that are implemented using architecture X. While recognising there’s always going to be a practical need for something like this, with my ISDT I am trying to argue that for e-learning systems the conglomeration mechanism needs to be as inclusive as possible (e.g. BIM’s reliance on RSS/Atom feeds – loose coupling – rather than a specific API for a particular blog service; see the sketch after this list).
  • User focus.
    As part of all this the conglomerations only have to group these features from the perspective of the user. It doesn’t actually mean that they are technically grouped through the use of the same architecture. What’s important is that the user experience with the conglomeration leads them to believe the different bits of low-level functionality are integrated and working together.
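As an example of the loose coupling mentioned under inclusivity, the sketch below only assumes that each student’s blog exposes a standard RSS/Atom feed, rather than calling a blog-engine-specific API. It uses the Python feedparser library; the feed URL is illustrative and this is not BIM’s actual code.

```python
# A small sketch of feed-based loose coupling: any blog engine that produces
# RSS/Atom can be integrated, with no engine-specific API. Requires the real
# feedparser package (pip install feedparser); the URL below is illustrative.

import feedparser


def fetch_student_posts(feed_url: str):
    """Return (title, link, published) tuples for each entry in a student's feed."""
    feed = feedparser.parse(feed_url)
    return [
        (entry.get("title", ""), entry.get("link", ""), entry.get("published", ""))
        for entry in feed.entries
    ]


if __name__ == "__main__":
    # Works with any feed, regardless of which blog engine produced it.
    for title, link, published in fetch_student_posts("https://example.wordpress.com/feed/"):
        print(published, title, link)
```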

Context-sensitive

Context-sensitive implies that the conglomerations know something about and offer support for factors or knowledge that is specific to the context of the people using the conglomeration. i.e. it’s not such a general purpose tool as to require the user to bridge the gap between their context and the tool.

For example, the default course site approach within Webfuse was designed to know about and integrate with as much of the local context as possible. The course synopsis, details about the assessment, staff details etc were automatically drawn from institutional databases and available within the conglomeration. In a more able and educationally enlightened context, the default course site conglomeration might have known about and offered specific support for the use of institutional graduate attributes or course learning outcomes.
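A much-simplified sketch of that idea follows: given only a course code, a default course site is assembled entirely from institutional data, with no input required from the academic. The data sources and fields below are stand-ins for illustration, not Webfuse’s actual schema.

```python
# A hypothetical sketch of assembling a default course site from institutional data.

from typing import Dict

# Stand-ins for institutional databases.
COURSE_CATALOGUE: Dict[str, Dict] = {
    "COIT11222": {
        "synopsis": "An introduction to programming.",
        "assessment": ["Assignment 1 (20%)", "Assignment 2 (30%)", "Exam (50%)"],
        "coordinator": "Dr A. Example",
    }
}


def default_course_site(course_code: str) -> Dict:
    """Assemble a default course site purely from institutional data."""
    details = COURSE_CATALOGUE[course_code]
    return {
        "title": course_code,
        "synopsis": details["synopsis"],
        "assessment_page": details["assessment"],
        "staff_page": [details["coordinator"]],
        "discussion_forum": f"{course_code}-forum",  # created even if never customised
    }


if __name__ == "__main__":
    print(default_course_site("COIT11222"))
```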

Scaffolding

This aspect is still a work in progress and builds somewhat on the context-sensitive attribute. The term scaffolding was chosen because of the connection between this idea and situated/distributed cognition, cognitive apprenticeship etc. The idea is that the e-learning system should be designed in a way that helps teachers (and perhaps students) learn about using the system effectively.

The assumption is that, for most teachers, using the e-learning system to improve and/or change their teaching practice is a learning process. They haven’t done this before. Rather than treat learning about how to make this change as a separate professional development activity, the e-learning system embodies/embeds the learning into its conglomerations. In some way, the system adopts an approach that includes situated modelling, coaching and fading.

Collins et al (1991) define it this way

When scaffolding is provided by a teacher, it involves the teacher in executing parts of the task that the student cannot yet manage. A requisite to such scaffolding is accurate diagnosis of the student’s current skill level or difficulty and the availability of an intermediate step at the appropriate level of difficulty in carrying out the target activity. Fading involves the gradual removal of supports until students are on their own.

Am still considering the implications of this addition and how far to take/interpret it. Potentially this could suggest that the system should have a model of the teachers' abilities and be "intelligent". I don't see this as being a requirement. An equally plausible, and probably more likely, approach would be for this diagnosis to be performed by people.

Need to think this through some more and see how much more of the insights offered by the learning theories this idea is based on could/should be used.

References

Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator(Winter), 6-11, 38-46.

Situated shared practice, curriculum design and academic development

Am currently reading Faegri et al (2010) as part of developing the justificatory knowledge for the final ISDT for e-learning that is meant to be the contribution of the thesis. The principle from the ISDT that this paper connects with is the idea of a “Multi-skilled, integrated development and support team” (the name is a work in progress). The following is simply a placeholder for a quote from the paper and a brief connection with the ISDT and what I think it means for curriculum design and academic development.

The quote

The paper itself is talking about an action research project where job rotation was introduced into a software development firm with the aim of increasing the quality of the knowledge held by software developers. The basic finding was that, in this case, there were some benefits; however, the problems outweighed them. I haven't read all the way through; I'm currently working through the literature review. The following quote is from the review.

Key enabling factors for knowledge creation is knowledge sharing and integration [36,54]. Research in organizational learning has emphasized the value of practice; people acquire and share knowledge in socially situated work. Learning in the organization occurs in the interplay between tacit and explicit knowledge while it crosses boundaries of groups, departments, and organizations as people participate in work [17,54]. The process should be situated in shared practice with a joint, collective purpose [12,14,15].

Another related quote

The following is from a bit more related reading, in particular Seely Brown & Duguid (1991) – emphasis added

The source of the oppositions perceived between working, learning, and innovating lies primarily in the gulf between precepts and practice. Formal descriptions of work (e.g., “office procedures”) and of learning (e.g., “subject matter”) are abstracted from actual practice. They inevitably and intentionally omit the details. In a society that attaches particular value to “abstract knowledge,” the details of practice have come to be seen as nonessential, unimportant, and easily developed once the relevant abstractions have been grasped. Thus education, training, and technology design generally focus on abstract representations to the detriment, if not exclusion of actual practice. We, by contrast, suggest that practice is central to understanding work. Abstractions detached from practice distort or obscure intricacies of that practice. Without a clear understanding of those intricacies and the role they play, the practice itself cannot be well understood, engendered (through training), or enhanced (through innovation).

Relevance?

I see this as highly relevant to the question of how to improve learning and teaching in universities, especially in terms of the practice of e-learning, curriculum design and academic development. It’s my suggestion that the common approaches to these tasks in most universities ignore the key enabling factors mentioned in the above quote.

For example, the e-learning designers/developers, curriculum designers and academic developers are generally not directly involved with the everyday practice of learning and teaching within the institution. As a result the teaching academics and these other support staff don’t get the benefit of shared practice.

A further impediment to shared practice is the divisions between e-learning support staff, curriculum designers and academic developers that are introduced by organisational hierarchies. At one stage, I worked at a university where the e-learning support people reported to the IT division, the academic staff developers reported to the HR division, the curriculum designers reported to the library, and teaching academics were organised into faculties. There wasn’t a common shared practice amongst these folk.

Instead, any sharing that did occur was either at high-level project or management boards and committees, or in design projects prior to implementation. The separation reduced the ability to combine, share and create new knowledge about what was possible.

The resulting problem

The following quote is from Seely Brown and Duguid (1991)

Because this corporation’s training programs follow a similar downskilling approach, the reps regard them as generally unhelpful. As a result, a wedge is driven between the corporation and its reps: the corporation assumes the reps are untrainable, uncooperative, and unskilled; whereas the reps view the overly simplistic training programs as a reflection of the corporation’s low estimation of their worth and skills. In fact, their valuation is a testament to the depth of the rep’s insight. They recognize the superficiality of the training because they are conscious of the full complexity of the technology and what it takes to keep it running. The corporation, on the other hand, blinkered by its implicit faith in formal training and canonical practice and its misinterpretation of the rep’s behavior, is unable to appreciate either aspect of their insight.

It resonates strongly with some recent experience of mine at an institution rolling out a new LMS. The training programs around the new LMS, the view of management, and the subsequent response from the academics showed some very strong resemblances to the situation described above.

An alternative

One alternative is what I'm proposing in the ISDT for e-learning. The following is an initial description of the roles/purpose of the "Multi-skilled, integrated development and support team". Without too much effort you could probably translate this into broader learning and teaching, not just e-learning. Heaven forbid, you could even use it for "blended learning".

An emergent university e-learning information system should have a team of people that:

  • is responsible for performing the necessary training, development, helpdesk, and other support tasks required by system use within the institution;
  • contains an appropriate combination of technical, training, media design and production, institutional, and learning and teaching skills and knowledge;
  • through the performance of its allocated tasks the team is integrated into the everyday practice of learning and teaching within the institution and cultivates relationships with system users, especially teaching staff;
  • is integrated into the one organisational unit, and as much as possible, co-located;
  • can perform small scale changes to the system in response to problems, observations, and lessons learned during system support and training tasks rapidly without needing formal governance approval;
  • actively examines and reflects on system use and non-use – with a particular emphasis on identifying and examining what early innovators are doing – to identify areas for system improvement and extension;
  • is able to identify and to raise the need for large scale changes to the system with an appropriate governance process; and
  • is trusted by organisational leadership to translate organisational goals into changes within the system, its support and use.

References

Faegri, T. E., Dyba, T., & Dingsoyr, T. (2010). Introducing knowledge redundancy practice in software development: Experiences with job rotation in support work. Information and Software Technology, 52(10), 1118-1132.

Seely Brown, J., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2(1), 40-57.

University e-learning systems: the need for new product and process models and some examples

I’m in the midst of the horrible task of trying to abstract what I think I know about implementing e-learning information systems within universities into the formal “language” required of an information systems design theory and a PhD thesis. This post is a welcome break from that, but is still connected in that it builds on what is perhaps fundamentally different between what most universities are currently doing, and what I think is a more effective approach. In particular, it highlights some more recent developments which are arguably a step towards what I’m thinking.

As it turns out, this post is also an attempt to crystallise some early thinking about what goes into the ISDT. So some of the following is a bit rough. Actually, writing this has identified one perspective that I hadn't thought of, which is potentially important.

Edu 2.0

The post arises from having listened to this interview with Graham Glass the guy behind Edu 2.0, which is essentially a cloud-based LMS. It’s probably one of a growing number out there. What I found interesting was his description of the product and the process behind Edu 2.0.

In terms of product (i.e. the technology used to provide the e-learning services), the suggestion was that because Edu 2.0 is based in the cloud – in this case Amazon's S3 service – it could be updated much more quickly than more traditional institutionally hosted LMSs. There is some connection here with Google's approach to on-going modifications to live software.

Coupled with this product flexibility was a process (i.e. the process through which users were supported and the system evolved) that very much focused on the Edu 2.0 developers interacting with the users of the product. For example, releasing proposals and screenshots of new features within discussion forums populated with users and getting feedback; and also responding quickly to requests for fixes or extensions from users – to such an extent that Glass reports users of Edu 2.0 feeling like it is "their Edu 2.0" because it responds so quickly to them and their needs.

The traditional Uni/LMS approach is broken

In the thesis I argue that when you look at how universities are currently implementing e-learning information systems (i.e. selecting and implementing an LMS), the product (the enterprise LMS, the one ring to rule them all) and the process they use are not a very good match at all for the requirements of effectively supporting learning and teaching. In a nutshell, the product and the process are aimed at reducing diversity and the ability to learn, while diversity is a key characteristic of learning and teaching at a university. Not to mention that when it comes to e-learning within universities, it's still very early days and it is essential that any systemic approach to e-learning have the ability to learn from its implementation and make changes.

I attempted to expand on this argument in the presentation I gave at the EDUCAUSE’2009 conference in Denver last year.

What is needed

The alternative I’m trying to propose within the formal language of the ISDT is that e-learning within universities should seek to use a product (i.e. a specific collection of technologies) that is incredible flexible. The product must, as much as possible, enable rapid, on-going, and sometimes quite significant changes.

To harness this flexibility, the support and development process for e-learning should, rather than be focused on top-down, quality assurance type processes, be focused on closely observing what is being done with the system and using those lessons to modify the product to better suit the diversity of local needs. In particular, the process needs to be adopter focused, which is described by Surry and Farquhar (1997) as seeing the individual choosing to adopt the innovation as the primary force for change.

To some extent, this ability to respond to the local social context can be hard to achieve with a software product that has to be used in multiple different contexts, e.g. an LMS used in different institutions.

Slow evolution but not there yet

All university e-learning implementation is not the same. There has been a gentle evolution away from less flexible products to more flexible products, e.g.

  1. Commercial LMS, hosted on institutional servers.
    Incredibly inflexible. You have to wait for the commercial vendor to see the cost/benefit argument to implement a change in the code base, and then you have to wait until your local IT department can schedule the upgrade to the product.
  2. Open source LMS, hosted on institutional servers.
    Less inflexible. You still have to wait for a developer to see your change as an interesting itch to scratch. This can be quite quick, but it can also be slow. It can be especially quick if your institution has good developers, but good developers cost big money. Even if the developer scratches your itch, the change has to be accepted into the open source code base, which can take some time if it's a major change. Then, finally, after the code base is changed, you have to wait for your local IT shop to schedule the upgrade.
  3. Open source LMS, with hosting outsourced.
    This can be a bit quicker than the institutional hosted version. Mainly because the hosting company may well have some decent developers and significant knowledge of upgrading the LMS. However, it’s still going to cost a bit, and it’s not going to be real quick.

The cloud-based approach used by EDU 2.0 does offer a product that is potentially more flexible than existing LMS models. However, apart from the general slowness of updating, a change that is very specific to an individual institution is going to cause some significant problems, regardless of the product model.

Some alternative product models

The EDU 2.0 model doesn’t help the customisation problem. In fact, it probably makes it a bit worse as the same code base is being used by hundreds of institutions from across the globe. The model being adopted by Moodle (and probably others), having plugins you can add, is a step in the right direction in that institutions can choose to have different plugins installed.
However, this model typically assumes that all the plugins have to use the same API, language, or framework. If they don’t, they can’t be installed on the local server and integrated into the LMS.

This requirement is necessary because there is an assumption for many (but not all) plugins that they provide the entire functionality and must be run on the local server. So there is a need for a tighter coupling between the plugin and the LMS and consequently less local flexibility.

A plugin like BIM is a little different. There is a wrapper that is tightly integrated into Moodle to provide some features. However, the majority of the functionality is provided by software (in this case blogging engines) chosen by the individual students. Here the flexibility is provided by the loose coupling between blog engine and Moodle.
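
To make the loose-coupling idea a little more concrete, here is a minimal Python sketch. It is an illustration under my own assumptions, not BIM's actual code: the only contract between the institutional system and the blog engine is "serve a valid RSS or Atom feed at this URL", and the feedparser library hides the differences between feed formats, so students can use whatever blogging service they like.

    # A minimal sketch of the loose-coupling idea; not BIM's actual code.
    # Each student registers the URL of whatever blog they choose; the only
    # contract is that the URL serves a valid RSS or Atom feed.
    import feedparser  # third-party: pip install feedparser

    # Hypothetical registrations: student identifier -> feed URL
    student_feeds = {
        "s0123456": "https://example.wordpress.com/feed/",
        "s0654321": "https://example.blogspot.com/feeds/posts/default",
    }

    def mirror_posts(registrations):
        """Fetch each student's feed and return their posts in a common form."""
        mirrored = {}
        for student, url in registrations.items():
            feed = feedparser.parse(url)  # handles both RSS and Atom
            mirrored[student] = [
                {"title": entry.get("title", ""),
                 "link": entry.get("link", ""),
                 "published": entry.get("published", ""),
                 "summary": entry.get("summary", "")}
                for entry in feed.entries]
        return mirrored

    if __name__ == "__main__":
        for student, posts in mirror_posts(student_feeds).items():
            print(student, "-", len(posts), "posts mirrored")

A tightly coupled alternative would call a service-specific API (e.g. a client for one particular blog service), which would rule out every other blog engine a student might choose.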

Mm, still need some more work on this.

References

Surry, D., & Farquhar, J. (1997). Diffusion Theory and Instructional Technology. e-Journal of Instructional Science and Technology, 2(1), 269-278.

Oil sheiks, Lucifer and university learning and teaching

The following arises from a combination of factors, including a recent post from Mark and a radio interview with Philip Zimbardo, both discussed below.

Old wine in new bottles

Perhaps the key quote from Mark’s post is

This post is simply to try and say what many people don’t want to say and that is, that most universities really don’t care about educational technology or elearning.

My related perspective is that the vast majority of university learning and teaching is, at best (trying to be very positive), just ok. There’s a small bit that is really, really bad; and a small bit that is really, really good. In addition, most interventions to improve learning and teaching are not doing anything to change this distribution. At best, they might change the media, but the overall distribution is the same.

There’s a quote from Dutton and Loader (2002) that goes something like

without new learning paradigms educators are likely to use technology to do things the way they have always done; but with new and more expensive technology.

I am currently of the opinion that without new management/leadership paradigms to inform how universities improve learning and teaching, the distribution is going to remain the same just with new and more expensive organisational structures. This article from the Goldwater Institute about administrative bloat at American universities might be an indicator of that.

Don’t blame the academics

The "When good people turn bad" radio program is an interview with Philip Zimbardo. He's the guy responsible for the Stanford prison experiment, an example of where good people turned really bad because of the situation in which they were placed. The interview includes the following from Prof Zimbardo

You no longer can focus only on individual freedom of will, individual rationality. People are always behaving in a context, in a situation, and those situations are always created and maintained by powerful systems, political systems, cultural, religious ones. And so we have to take a more complex view of human nature because human beings are complex.

This resonates somewhat with a point that Mark makes

the problem of adoption is primarily not a technical one but one of organisational culture

I agree. It's the culture, the systems, the processes and the policies within universities that are encouraging/enshrining this distribution where most university learning and teaching is, at best, just ok.

The culture/system doesn’t encourage nor enable this to change. When management do seek to do something about this, their existing “management paradigm” encourages an emphasis on requiring change without doing anything effective to change the culture/system.

The proposition and the interest

Which is where my interest lies, and why I propose the following

If you really wish to improve the majority of learning and teaching within a university, then you have to focus on changing the culture/system so that academic staff are encouraged and enabled to engage in learning about how to teach.

In addition, I would suggest that requiring learning (e.g. through requiring all new academic staff to obtain a formal qualification in learning) without aligning the entire culture/system to enable academic staff to learn and experiment (some of these characteristics are summarised here) is doomed to failure.

I’d also suggest that there is no way you can “align” the culture/system of a university to enable and encourage academic staff learning about teaching. At best you can engage in a continual process of “aligning” the culture/system as that process of “aligning” is itself a learning process.

Easy to say

I can imagine some university leaders saying "No shit Sherlock, what do you think we're doing?". My response is that you aren't really doing this. Your paradigm is fundamentally inappropriate, regardless of what you claim.

However, actually achieving this is not simple and I don't claim to have all the answers. This is why this is phrased as a proposition; it's an area requiring more work.

I am hoping that within a few days, I might have a small subset of an answer in the next, and hopefully final, iteration of the design theory for e-learning that is meant to be the contribution of my thesis.

References

Dutton, W. and B. Loader (2002). Introduction. Digital Academe: The New Media and Institutions of Higher Education and Learning. W. Dutton and B. Loader. London, Routledge: 1-32.

Lessons learned from Webfuse: 2000 onwards

The following is an early draft of the “lessons learned” section of chapter 5 of the thesis, the third last section that needs to be completed (to first draft stage). It still needs some work and completing the last two sections will probably lead to some changes, but it’s a start.

The basic aim of this section is to draw out reasons why the intervention (in this case the Webfuse system I designed) succeeded or failed. As I write this, pretty sure I haven’t finished.

Lessons learned

Before attempting to describe the final ISDT arising from this work, this section seeks to reflect on the outcomes of the intervention in an attempt to understand how well the intervention achieved the changes sought and to understand the observed success and failures.

Relative unimportance of the technical product

From the perspective of data structures, algorithms, and bleeding edge technology Webfuse was not at all innovative. Use of scripting languages, relational databases, and open source applications to construct websites was fairly common and widespread. Nor would much of its implementation be considered theoretically correct by researchers focusing on relational databases, software engineering or computer science. For example, the schema used by Webfuse databases could not be described as being appropriately normalised. In addition, a common complaint about Webfuse has been that it was using technology that would not scale and that was not “enterprise-ready” (even though it could and did scale and support the enterprise). The questions of technical novelty, technical purity, or fulfilling arbitrary scalability guidelines had little or no effect on the success of Webfuse. The success of Webfuse arose from becoming, and being able to stay, an integral and useful part of the everyday life of the students and staff of the institution.

Webfuse was not a product

This emphasis on the characteristics of the technical product was also evident in the continual queries from colleagues asking when Webfuse would be sold or made available to other institutions. The ability of other institutions to adopt Webfuse as a product was seen as a way of proving its success. This was based on the assumption that Webfuse, like all software, was a product that could be reused regardless of the organisational context. This was also the assumption that underpinned the development of Webfuse during the first phase of its development (1996 through 1999). One example of this is the observation that Webfuse was made available as an open source project for people to download in 1997.

A key characteristic of the second phase of Webfuse development was the recognition that the product and its features were not as important as how well those features matched the needs of the local context and continued to evolve in response to those needs. The most important part of Webfuse was the process, not the product. It was through this process of contextual adaptation that Webfuse became part of the way things were done at CQU; it became part of the culture. This tight connection with the institution meant that while the principles behind Webfuse and some of the applications might be useful at other institutions, it was impossible to distribute Webfuse as a software product. An understanding of this distinction improved the implementation of Webfuse. However, an inability to explain the importance of this distinction to various stakeholders contributed to the eventual demise of Webfuse and especially its ateleological process.

The importance of the pedagogue

Coates et al (2005) suggest that a recurrent message from educational technology research is that "it is not the provision of features but their uptake and use that really determines their educational value". This message matches well with the experience of Webfuse. During the initial phase of Webfuse development described in Chapter 4, the provision of features in terms of various page types was not sufficient to generate use by academic staff and consequently any impact in terms of educational value. If the teaching staff responsible for a course did not use the provided features, or did not integrate them effectively into a course, there was no educational value. The pedagogue was of central importance in terms of any educational value arising from e-learning. It was through understanding and using this principle that Webfuse was able to become an everyday part of the practice of teaching academics.

Change takes time, familiarity, need, support, and adaptation

Many, if not all, teaching staff did not make decisions to adopt new educational practices and technologies immediately upon their release. Such adoption decisions occurred over varying periods of time as a result of a combination of individual contextual factors. Effective use of novel practices and technologies often lagged adoption by a number of years. The introduction of novel practices into an organisation generated a need for changes in organisational practices and support before those practices could become widely adopted and appropriately used. For example, the use of the course barometer feature was highest and most appropriate in 2002 and 2008 (see Figure 5.12) when use was encouraged and supported by organisational resources. In addition, as novel features become more widely used, there is a need to adapt those features in response to lessons learned and changing requirements.

Helping people increases trust and knowledge

From 2000 onwards the Webfuse development staff also fulfilled the roles of system trainers and frontline helpdesk staff. Each of these roles is inherently challenging, and attempting to balance the competing demands of each role adds further to the complexity. However, there were also a number of significant benefits that arose from this multi-skilling. These benefits included:

  • Helpdesk staff with increased knowledge of the systems;
    The helpdesk staff handling user problems had deep understandings of how the systems worked, what they could do, and how they could be manipulated. This deep level of knowledge enabled quicker and more flexible responses to problems.
  • Increased ability for rapid changes; and
    In some cases, those flexible responses involved quick modification of the Webfuse code to correct a problem or add a new feature. Such minor problems did not have to rise through a helpdesk escalation process before being remedied.
  • Developers with increased knowledge of the needs and capabilities of the users.
    The offering of helpdesk support and training sessions provided a deeper understanding of the capabilities and needs of both staff and student users that could drive the on-going design and development of Webfuse.

Each of these benefits combined to increase trust in the system and its direction, as evidenced by the increased use shown above and the following quote from a member of academic staff

my positive experience with other Infocom systems gives me confidence that OASIS would be no different. The systems team have a very good track record that inspires confidence

You can’t keep all the people happy

The experience with Webfuse from 2000 through 2009 has highlighted just how difficult answering the question – "Was Webfuse a success?" – actually is, and how dependent it is upon the experiences and position of the person answering the question. In terms of success, it is possible to point to the statistics showing much higher levels of usage by staff and students. It is also possible to point to qualitative comments from staff around trust and confidence and to formal management reports describing Webfuse as "[t]he best thing about teaching and learning in this faculty in 2003". At the same time, it's possible to point to consistent arguments from central IT staff that Webfuse was a shadow system that duplicated existing systems and was subsequently inefficient and wasteful (Jones, Behrens et al. 2004). There were also comments from one senior member of staff in 2004 suggesting that Webfuse had made no significant difference to learning and teaching at CQU.

Ateleological processes don’t fit in a teleological environment

Webfuse experienced its greatest levels of support and improvement during the period from 2000 through 2004. During these years the Faculty of Informatics and Communication (Infocom) – which supported Webfuse – was undergoing significant growth in student numbers, complexity, and available resources. At the same time, Infocom had a Dean who had publicly expressed support (Marshall 2001) for a more ateleological approach to organisational and systems development, and was comfortable with that approach within Infocom.

From 2004 onwards there were a number of changes within CQU, including: (1) changes in faculty and institutional leadership; (2) changes in student enrolment profile raising concerns about faculty and institutional funding; and, (3) an organisational restructure resulting in increased centralisation and/or out-sourcing of services. These changes led the institution toward a much more teleological approach to systems development and support. Under these conditions the ateleological Webfuse process was seen as wasteful of resources and, to some extent, nonsensical.

References

Coates, H., R. James, et al. (2005). "A Critical Examination of the Effects of Learning Management Systems on University Teaching and Learning." Tertiary Education and Management 11(1): 19-36.

Jones, D., S. Behrens, et al. (2004). The rise and fall of a shadow system: Lessons for enterprise system implementation. Managing New Wave Information Systems: Enterprise, Government and Society, Proceedings of the 15th Australasian Conference on Information Systems, Hobart, Tasmania.

Marshall, S. (2001). Faculty level strategies in response to globalisation. 12th Annual International Conference of the Australian Association for Institutional Research. Rockhampton, QLD, Australia.

30% of information about task performance

Over on the Remote Learner blog, Jason Cole has posted some information about a keynote by Dr Richard Clark at one of the US MoodleMoots. I want to focus on one key quote from that talk and its implications for Australian higher education and current trends to “improve” learning and teaching and adopt open source LMS (like Moodle).

It’s my argument that this quote, and the research behind it, has implications for the way these projects are conceptualised and run. i.e. they are missing out on a huge amount of potential.

Task analysis and the 30%

The quote from the presentation is

In task analysis, top experts only provide 30% of information about how they perform tasks.

It's claimed that all the points made by Clark in his presentation are supported by research. It appears likely that the support for this claim comes from Sullivan et al (2008). This paper addresses the problem of trying to develop the procedural skills necessary for professions such as surgery.

The above quote arises due to the problems experts have in describing what they do. Sullivan et al (2008) offer various descriptions and references of this problem in the introduction

This is often difficult because as physicians gain expertise their skills become automated and the steps of the skill blend together [2]. Automated knowledge is achieved by years of practice and experience, wherein the basic elements of the task are performed largely without conscious awareness [3]. This causes experts to omit specific steps when trying to describe a procedure because this information is no longer accessible to conscious processes [2]

Then later, when describing the findings of their research they write

The fact that the experts were not able to articulate all of the steps and decisions of the task is consistent with the expertise literature that shows that expertise is highly automated [2,3,5] and that experts make errors when trying to describe how they complete a task [3,6,7]. In essence, as the experts developed expertise, their knowledge of the task changed from declarative to procedural knowledge. Declarative knowledge is knowing facts, events, and objects and is found in our conscious working memory [2]. Procedural knowledge is knowing how to perform a task and includes both motor and cognitive skills [2]. Procedural knowledge is automated and operates outside of conscious awareness [2,3]. Once a skill becomes automated, it is fine-tuned to run on autopilot and executes much faster than conscious processes [2,8]. This causes the expert to omit steps and decision points while teaching a procedure because they have literally lost access to the behaviors and cognitive decisions that are made during skill execution [2,5].

The link to analysis and design

A large number of universities within Australia are either:

  1. Changing their LMS to an open source LMS (e.g. Moodle or Sakai), and using this as an opportunity to “renew” their online learning; and/or
  2. Busy on broader interventions to “renew” their online learning due to changes in government policies such as quality assurance, graduate attributes and a move to demand funding for university places.

The common process being adopted by most of these projects is from the planning school of process. i.e. you undertake analysis to identify all relevant, objective information and then design the solution on that basis. You then employ a project team to ensure that the design gets implemented, and finally you put in a skeleton team that maintains the design. This holds whether the focus is an information system (e.g. the selection, implementation and support of an LMS) or broader organisational change (e.g. strategic plans).

The problem is that the “expert problem” Clark refers to above means that it is difficult to gather all the necessary information. It’s difficult to get the people with the knowledge to tell all that they know.

A related example follows.

The StaffMyCQU Example

Some colleagues and I – over a period of almost 10 years – designed, supported, and evolved an information system called Staff MyCQU. An early part of its evolution is described in the "Student Records" section of this paper. It was a fairly simple web application that provided university staff with access to student records and a range of related services. Over its life cycle, a range of new and different features were added and existing features tweaked, all in response to interactions with the system's users.

Importantly, the system's developers were also generally the people handling user queries and problems on the "helpdesk". Quite often, handling those queries would result in tweaks and changes to the system. Rather than being designed up front, the system grew and changed with the people using it.

The technology used to implement Staff MyCQU is now deemed ancient and, even more importantly, the system and what it represents is now politically tainted within the organisation. Hence, for the last year or so, the information technology folk at the institution have been working on replacement systems. Just recently, there have been some concrete outcomes of that work, which has resulted in systems being shown to folk, including some of the folk who had used Staff MyCQU. On being shown a particular feature of the new system, it soon became obvious that the system didn't include a fairly common extension of the feature – an extension that had actually been within Staff MyCQU from the start.

The designers of the new system, with little or no direct connection with actual users doing actual work, don’t have the knowledge about user needs to design a system that is equivalent to what already exists. A perfect example of why the strict separation of analysis, design, implementation and use/maintenance that is explicit in most IT projects and divisions is a significant problem.

The need for growing knowledge

Sullivan et al (2008) suggest cognitive task analysis as a way of better "getting at" the knowledge held by the expert, and there's a place for that. However, I also think that there is a need for recognition that the engineering/planning method is just not appropriate for some contexts. In those contexts, you need more of a growing/gardening approach. Or, in some cases, you need to include more of the growing/gardening approach in your engineering method.

Rather than seeking to gather and analyse all knowledge separate from practice and prior to implementation, implementation needs to be designed to pay close attention to knowledge that is generated during implementation and to provide the ability to act upon that knowledge.

Especially for wicked problems and complex systems

Trying to improve learning and teaching within a university is a wicked problem. There are many different stakeholders or groups of stakeholders, each with a different frame of reference, which leads to different understandings of how to solve the problem. Simple techno-rational solutions to wicked problems rely on the adoption of one of those frames of reference and ignorance of the remainder.

For example, implementation of a new LMS is seen as an information technology problem and treated as such. Consequently, success is measured by uptime and successful project implementation. Not on the quality of learning and teaching that results.

In addition, as you solve wicked problems, you and all of the stakeholders learn more about the problem. The multiple frames of reference change and consequently the appropriate solutions change. This is getting into the area of complex adaptive systems. Dave Snowden has a recent post about why human complex adaptive systems are different.

Prediction

Universities that lean too heavily on engineering/planning approaches to improving learning and teaching will fail. However, they are likely to appear to succeed due to the types of indicators they choose to adopt as measurements of success and the capability of actors to game those indicators.

Universities that adopt more of a gardening approach, will have greater levels of success, but will have a messier time of it during their projects. These universities will be where the really innovative stuff comes from.

References

Sullivan, M., A. Ortega, et al. (2008). "Assessing the teaching of procedural skills: can cognitive task analysis add to our traditional teaching methods." The American Journal of Surgery 195: 20-23.

Usage of dynamic web applications by staff and students

The following is a “formalisation” of a previous post. This is the version that forms the first draft of a section in the thesis.

Usage of dynamic web applications by staff and students

As described in Section 5.3.4, a major addition to the integrated OLE provided by Webfuse from 2000 onwards was in the form of dynamic web applications developed using the Wf framework. This section briefly illustrates and explains the usage of these dynamic web applications by both staff and students from 2000 onwards. It shows how these applications enabled the integrated OLE provided by Webfuse to become a heavily used component of everyday learning and teaching within the institution.

Table 5.16 divides people using the Wf framework applications into students – people enrolled in courses at the institution – and staff – people employed by the institution to deliver and support courses. For each group of users it shows the number of people using the Wf applications each year from 2000 through 2009. It also shows the number of requests those people made of the Wf applications within the given year.

Year | Student users | Student requests | Staff users | Staff requests
2000 | 538 | 1,888 | 2 | 31
2001 | 3,361 | 27,936 | 96 | 4,139
2002 | 4,210 | 39,805 | 298 | 57,864
2003 | 6,115 | 206,884 | 575 | 164,274
2004 | 8,664 | 491,136 | 633 | 260,657
2005 | 9,937 | 461,999 | 782 | 311,080
2006 | 11,994 | 1,033,619 | 1,159 | 656,609
2007 | 10,810 | 820,847 | 1,289 | 748,107
2008 | 12,085 | 777,522 | 1,234 | 880,015
2009 | 12,342 | 900,870 | 1,169 | 1,157,987

In 2000, the first and only Wf application was called TakeQuiz. As the name suggests, students used TakeQuiz to complete online quizzes. From 2000 through 2002, TakeQuiz was the dominant Wf application used by students, accounting for 100% of student requests in 2000 and over 99% in 2001 and 2002. Staff use of TakeQuiz was limited to only two staff testing TakeQuiz to become familiar with the student experience.

The year 2001 saw the development of the initial Wf applications aimed at supporting staff. These included the results upload application, a facility to generate a class list containing details of enrolled students, an initial online assignment management application, quiz management applications, and an access control application for the website. As described in Section 5.3.6 the results upload and class list functionality was implemented in response to difficulties arising from the introduction of a new student records system. The top four Wf applications by staff requests were: results upload (53.4%); online assignment management (15.4%); class list (9%); and access control (6.9%).

During 2002 the focus on developing Wf applications to better support academic staff continued with the development of Staff MyInfocom, a staff "portal" that initially provided staff with simple access to student records. This resulted in a significant increase in staff usage. The top four Wf applications used by staff, accounting for almost 83% of staff usage, were: Staff MyInfocom (41.3%); online assignment management (33.5%); quiz management (4.5%); and results upload (3.5%).

The usefulness of Staff MyInfocom was such that by 2003, staff from outside Infocom were asking for access. In response a non-Infocom staff “portal” – Staff MyCQU – was developed. The top six Wf applications in 2003 by number of staff requests were: Staff MyInfocom (40%); Staff MyCQU (28.9%); Assignment management (15.9%); results upload (4.3%); informal review of grade (3%); and, quiz management (2.9%). A student “portal” – Student MyInfocom – was also developed in 2003, primarily to help students submit and track their online assignment submissions. The top three Wf applications in 2003 by number of student requests were: Student MyInfocom (55.5%); Take quiz (43%); and, the timetable generator (1.1%).

In 2004, the usage of Staff MyCQU exceeded that of Staff MyInfocom. In 2006, due to an organisational restructure, Staff MyInfocom became Staff MyCQU as the services provided became recognised as an institutional service. From 2006 onwards Staff MyCQU was the most used Wf application by staff, accounting for, on average, 75.5% of staff requests each year. Additional administrative Wf applications were added in 2005 – a system for tracking student academic misconduct incidents – and in 2007 – a system to manage student requests for assignment extensions. By 2009, the number of staff requests on Wf applications had broken the million mark.

In terms of the percentage of staff and students using Wf applications, in 2009, almost 59% of CQU students used a Wf application. This use was primarily online assignment submission. Given there is no way to know how many staff associated with teaching are employed by CQU, it is more difficult to give a figure for percentage of staff. As an indicative figure, the CQU AUQA Performance Portfolio (CQU 2010) summarises the CQU staff profile as consisting of 365 full-time equivalent (FTE) academic staff and 658 non-academic staff for a staff total of 1023. The staff figure for 2009 of 1169 shown in Table 5.16 illustrates how the figure of 1023 does not include a range of staff employed by CQU partners or as casual teaching and support staff. However, there does appear to be a significant percentage of staff using Wf applications.

The large number of requests, the significant percentage of staff/students using Wf applications, the use of Wf applications by staff not using the Webfuse course sites, and the breadth of services provided by Wf applications suggest that the Wf applications have been very successful in improving the integrated OLE provided by Webfuse. In particular, the Wf applications have been instrumental in fulfilling the major lessons learned at the end of Chapter 4 – "to better integrate with the requirements and practice of academic staff and students".

References

CQU (2010). Central Queensland University Performance Portfolio 2010. Rockhampton, Central Queensland University: 114.

Usage of Wf applications

Have spent the last day or so doing some more data munging of access logs trying to determine the level of usage of Wf applications within Webfuse. Wf was the framework for developing dynamic web applications within Webfuse – the topic/outcome of my PhD. The following is an initial summary of that munging with some initial reflection. This will get further polished and re-worked for inclusion in chapter 5 of the thesis.
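
For what it's worth, the munging is conceptually simple. The sketch below is illustrative Python under my own assumptions, not the actual scripts: it assumes Apache "combined" format logs with an authenticated username in the third field, and that student accounts can be told apart from staff accounts by their shape. It accumulates distinct users and request counts per year for each group, which is essentially how a table like the one below could be produced.

    # A rough, hypothetical sketch of summarising access logs by year and user type.
    import re
    from collections import defaultdict

    # Apache "combined" log format, e.g.
    # 1.2.3.4 - s0123456 [01/Mar/2004:10:15:32 +1000] "GET /some/app HTTP/1.1" 200 512 ...
    LOG_LINE = re.compile(
        r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<date>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+')

    def user_type(username):
        # Assumption: student accounts look like s0123456; everything else is staff.
        return "student" if re.fullmatch(r"s\d+", username or "") else "staff"

    def summarise(log_lines):
        users = defaultdict(set)      # (year, type) -> distinct usernames
        requests = defaultdict(int)   # (year, type) -> request count
        for line in log_lines:
            m = LOG_LINE.match(line)
            if not m or m.group("user") == "-":
                continue              # skip unparseable or unauthenticated requests
            year = m.group("date").split("/")[2][:4]   # year from "01/Mar/2004:..."
            kind = user_type(m.group("user"))
            users[(year, kind)].add(m.group("user"))
            requests[(year, kind)] += 1
        return users, requests

    if __name__ == "__main__":
        with open("access_log") as logfile:   # hypothetical log file name
            users, requests = summarise(logfile)
        for year, kind in sorted(requests):
            print(year, kind, len(users[(year, kind)]), requests[(year, kind)])

The per-application breakdowns later in this post could be produced the same way, keyed on the application name in the URL path rather than on user type.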

Users and requests

The following table splits people using the Wf applications into students and staff. It shows the number of users of each type using Wf applications and the number of requests they made.

Year | Student users | Student requests | Staff users | Staff requests
2000 | 538 | 1,888 | 2 | 31
2001 | 3,361 | 27,936 | 96 | 4,139
2002 | 4,210 | 39,805 | 298 | 57,864
2003 | 6,115 | 206,884 | 575 | 164,274
2004 | 8,664 | 491,136 | 633 | 260,657
2005 | 9,937 | 461,999 | 782 | 311,080
2006 | 11,994 | 1,033,619 | 1,159 | 656,609
2007 | 10,810 | 820,847 | 1,289 | 748,107
2008 | 12,085 | 777,522 | 1,234 | 880,015
2009 | 12,342 | 900,870 | 1,169 | 1,157,987

History and comments

The following is simply a collection of comments seeking to offer some meaning, context and reflection on the statistics in the above table.

2000

The first Wf application was the standard dynamic application of any e-learning system, "take quiz" – the application students use to take an online quiz. It was developed in 2000, based on some earlier work that was modified to fit within a mod_perl development environment in order to deal with the increased system load created by all those students taking the quiz at the same time, at the last minute.

The only staff use of Wf applications was experimental: staff taking a quiz to see what the students would see.

2001

2001 saw the development of the first staff-specific Wf application, the results upload app. This app allowed staff teaching a course to upload a CSV file containing the final student results for entry into the institution's student records system. It was developed after a particularly inefficient solution was proposed by the project implementing Peoplesoft at the institution. The results upload application was responsible for 53.4% of the staff requests in 2001. The other staff applications included:

  • various quiz management features – 15.4%;
  • an early online assignment submission system – 12.2%;
  • an app to download a course class list – 9%; and
  • an app to restrict access to portions of a class site – 6.9%.

For students, the take quiz application accounts for over 99% of the student requests.

2002

This is where the staff "portal" applications – Staff MyInfocom and Staff MyCQU – are developed. Both are extensions of the class list and results upload applications initially developed in 2001. The top 4 Wf applications based on staff requests (and accounting for almost 83% of staff Wf application requests) are:

  1. Staff MyInfocom – 41.3%;
  2. Online assignment management – 33.5%;
  3. Quiz management applications – 4.5%;
  4. Results upload – 3.5%.

It is this year that online assignment management really starts being used heavily.

For students, the take quiz application still accounts for over 99.5% of the Wf requests.

2003

In 2003 a student “portal”, and a number of other Wf applications for students, were developed. The major Wf applications by student use in 2003 were:

  • Student MyInfocom (the “portal”) – 55.5%;
    The portal includes the mechanisms by which students submit online assignments.
  • Take quiz – 43%;
  • The timetable generator – 1.1%.

For staff, the “Staff MyCQU” portal – used by staff outside the original Webfuse faculty – starts to get heavy use. The top Wf applications by staff usage are:

  • Staff MyInfocom – 40%;
  • Staff MyCQU – 28.9%;
  • Assignment management – 15.9%;
  • Results upload – 4.3%;
  • Informal review of grade – 3%;
  • Quiz management – 2.9%.

It is interesting to see the decreasing proportion of use of the quiz management facilities.

2004

Webfuse was developed within a particular faculty. 2004 is the year where usage by staff outside that faculty starts to exceed usage by staff within that faculty. Webfuse is becoming an institutional system, at least in use. It takes another couple of years before it is official.

In terms of staff usage, the top Wf applications are:

  • Staff MyCQU – 41.1%;
  • StaffMyInfocom – 30.5%;
  • Assignment management – 15.9%;
  • Quiz – 4.8%.

For students it is:

  • StudentMyInfocom – 78%;
  • Take Quiz – 20.4%;
  • Timetable generator – 1.3%.

2005

This is the first year I was no longer officially involved with Webfuse. It's also the year that the direction of Webfuse is officially being set more by faculty management. This is evident in the rise of the Academic Misconduct Database (AMD), a system for tracking incidents of academic misconduct by students.

For staff the top Wf applications by usage in 2005 were:

  • StaffMyCQU – 46%;
  • StaffMyInfocom – 26.3%;
  • Assignment management – 12.3%;
  • AMD – 8.1%.

For students:

  • Student MyInfocom – 88.6%;
  • TakeQuiz – 9.6%;
  • Timetable generator – 1.2%.

2006

This is the year when an organisational restructure encouraged the broader adoption of Wf applications within the institution, as parts of the faculty where Webfuse originated were broken up and spread around. You could say the infection that was Webfuse spread. An example of this organisational restructure is the retirement of the "MyInfocom" brand. The faculty "Infocom" no longer existed and the brand morphed into MyCQU.

For staff, the top Wf applications were:

  • StaffMyCQU – 71.1%;
  • Assignment management – 11.1%;
  • Plagiarism – 10.5%;
  • Results upload – 1.4%.

This is also the year in which the timetable generator no longer existed, so the top student applications were:

  • StudentMyCQU – 92.2%;
  • Take quiz – 6.1%.

2007

I believe this is the year Webfuse and the Wf applications officially became institutional systems, as the Webfuse team moved from sitting in a faculty into the central IT division. I found it interesting to see the emergence of the mail merge facility as a top staff application.

The top applications by staff usage were:

  • StaffMyCQU – 73.7%;
  • Plagiarism – 9%;
  • Assignment management – 7.6%;
  • Extension system – 1.6%;
    This was a system by which students could request an extension for an assignment, and that request would be managed and tracked by staff.
  • Results upload – 1.6%;
  • Mail merge – 1.4%.

2007 is also the year in which the Topic Allocation Nomination System (TANS) became heavily used. Used in only one course, the TANS allowed students to select topics they would use for assessment.

The top applications by student usage were:

  • Student MyCQU – 88.1%;
  • take quiz – 7.8%;
  • Topic Allocation system – 2.9%.

2008

This is the first year in which staff requests are greater than student requests. This is probably a factor of students being increasingly served by other systems (another student portal was well underway by now), and the increasing number of applications aimed specifically at staff.

Top applications by staff usage were:

  • Staff MyCQU – 77.6%;
  • Assignment management – 7.4%;
  • Plagiarism – 4.6%;
  • Extension – 2%;
  • Mail merge – 1.7%;
  • Results upload – 1.5%.

For students:

  • Student MyCQU – 84.9%;
  • Take Quiz – 6.3%;
  • TANS – 5%;
  • Barometer – 1.4%;
  • Extension system – 1.3%;
  • BAM – 0.5%.

Amongst the student numbers, it is interesting to see the appearance (admittedly in a small way) of the barometer and BAM. I believe the barometer appears because of an institutional project aimed at encouraging more feedback from students. BAM appears because of its broader use, especially in a large course or two, and the availability of a method for students to track and see their progress.

2009

2009 was essentially the final year of Webfuse. Though the Wf applications continue to live on for a while in 2010, they will eventually disappear – even though this is the first year in which staff usage of Wf applications breaks the million request barrier.

Most interesting for me amongst the staff usage is the rise of BAM. The top applications by staff usage were:

  • Staff MyCQU – 79.8%;
  • Assignment management – 7.1%;
  • Plagiarism – 3.3%;
  • BAM – 2.4%;
  • Extension system – 1.4%;
  • MailMerge – 1.4%;
  • Results upload – 1.2%.

For the students, BAM continues to rise and the barometer drops (note: this is a comparison in percentage terms, but because the overall number of student requests increased between 2008 and 2009, the barometer's fall from 1.4% of 777,522 requests to 0.9% of 900,870 requests is still a drop in absolute terms, from roughly 10,900 requests to roughly 8,100). It is my understanding that the "push" to use the barometer was not continued in 2009 and its use was left up to the academic staff, hence the drop. This mirrors lessons learned in earlier work with the barometers.

For students, the top used Wf applications were:

  • Student MyCQU – 86.8%;
  • Take Quiz – 5.5%;
  • BAM – 2.8%;
  • TANS – 2.6%;
  • Extension system – 1.4%;
  • Barometer – 0.9%.

What’s with the student portal?

From about 2006/2007 CQU had an official student portal, so why does StudentMyCQU still account for such a large percentage of student usage of Wf applications?

The simple answer is that the StudentMyCQU application also embodies the interface students use to submit online assignments. For example, of the 781,705 requests of the Student MyCQU application made in 2009, at least 76% of the requests were submitting, checking or viewing online assignments.

Combining the student usage with the staff usage, online assignment submission and management was among the most used features of Webfuse.

Usage of Webfuse course sites

As part of the PhD I have to summarise aspects of Webfuse usage. The following is a quick attempt to summarise usage of the Webfuse course sites from 1997 through 2009. The main aim is to show what I've got so far, think about what needs to be added, generate a to do list, and make an initial stab at explaining some of the movements. The hope is that writing this is essentially a rough draft/scribble of material that will end up in the thesis.

Course site usage

The following table provides the stats I have so far. The table has the following columns:

  • year;
    Webfuse course sites first appeared in 1997 and ceased at the end of 2009. Each row shows statistics for courses that were offered in that academic year. At this institution the academic year started in Feb/March. For most of these years there were between 3 and 5 different terms. Some courses were offered multiple times a year.
  • course site hits;
    Most of the content on Webfuse course sites was available to anyone on the World-Wide Web without a need to login. Hence it is impossible to determine just how many of the requests for course site content and services came from students, staff, or the general public. Instead, the number of requests for HTML pages and data files is given.
  • page updates; and
    Course sites were created and modified by the page update process. This column shows the number of unique page updates by teaching staff. Page updates by support staff have been removed.
  • Unique authors.
    The number of teaching academics who performed page updates in a year.
Year | Course site hits | Page updates | Unique authors
1997 | 780,651 | 2,232 | 15
1998 | 905,326 | 7,278 | 41
1999 | 1,378,699 | 6,523 | 36
2000 | 1,833,577 | 5,588 | 40
2001 | 6,491,238 | 2,213 | 38
2002 | 5,346,867 | 11,313 | 86
2003 | 4,686,393 | 13,618 | 99
2004 | 4,133,551 | 12,186 | 81
2005 | 2,422,395 | 8,059 | 84
2006 | 3,278,221 | 13,618 | 86
2007 | 1,891,192 | 12,444 | 70
2008 | 1,848,491 | 9,920 | 64
2009 | 1,958,401 | 8,639 | 53

Some misc comments and points about this data:

  • From 1997 through to the first half of 2001, the default Webfuse course sites were generally fairly limited and created manually by support staff. Some were then modified by the teaching academics.

    To do: An important point to bring out is that in the early years, most of the page updates were down to a very small percentage of the staff. e.g. in 1997 almost 50% of the page updates were done by the one teaching academic – me. Need to add here some indication of how distributed the page updates were amongst the teaching staff.

  • Through to about 2002/2003, the number of students within courses with Webfuse course sites was increasing significantly. From there the numbers decrease quite significantly.

    To do: calculate the number of students enrolled in the Webfuse courses and include it in this table to give some indication of this potential effect.

  • In the second half of 2001, a new automated and “better” designed default course site approach was implemented. There is then a significant increase in the number of staff updating pages and in the number of page updates. This is sort of the point I’m trying to prove.

    To do: There’s a strange dip in the number of page updates in 2001, down to less than half of the previous year’s figure. Check the source of this dip.

  • In 2004/2005 the faculty in which Webfuse originated was broken up, with different disciplines going into different faculties. This followed closely after the adoption of Blackboard as the institution’s LMS. Hence the number of courses using Webfuse course sites dropped.

    To do: Include the number of Webfuse course sites in each year to give some relative idea about the ups and downs of the page updates and unique authors.
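One way to address the to-do about how page updates were distributed amongst teaching staff is to calculate, for each year, the share of updates made by the most active authors. The sketch below is one possible approach, reusing the assumed CSV layout from the earlier sketch; the measure actually used in the thesis is still to be decided.

```python
import csv
from collections import Counter, defaultdict

def update_concentration(path, top_n=1):
    """Share of each year's page updates made by the top_n most active authors."""
    per_year = defaultdict(Counter)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["role"] == "teaching":          # only teaching staff updates
                per_year[row["year"]][row["author"]] += 1
    shares = {}
    for year, counts in per_year.items():
        total = sum(counts.values())
        top = sum(n for _, n in counts.most_common(top_n))
        shares[year] = top / total if total else 0.0
    return shares

if __name__ == "__main__":
    for year, share in sorted(update_concentration("page_updates.csv").items()):
        print(f"{year}: top author made {share:.1%} of page updates")
```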

Features used in Webfuse course sites

Time to get back into the thesis. The following is the next completed section from the evaluation part of chapter 5 of my thesis. It is the result of much data munging and some writing; it still needs a bit more reflection and thought, but is getting close.

Features used in course sites

The previous sub-section examined the number of pages used in default course sites from 2000 through 2004. This sub-section seeks to examine in more detail the question of feature adoption within the Webfuse course sites. In particular it seeks to describe the impact of the introduction of the default course sites approach and compare its results with feature adoption in course websites from other systems at other institutions. This is done using the Malikowski et al (2007) model introduced in Chapter 4, which abstracts LMS features into five system-independent categories (see Figure 4.8). This sub-section first describes the changes in the available Webfuse features – both through new Webfuse page types and Wf applications – from 2000 onwards in terms of the Malikowski et al (2007) model. It then outlines how Webfuse feature adoption within course sites changed over the period from 2000 through 2004 and compares that with other systems at other institutions. Finally, it compares and contrasts feature adoption during 2005 through 2009 at CQU between Webfuse and Blackboard.

As described in Chapter 4, the fifth Malikowski et al (2007) category – Computer-Based Instruction – is not included in the following discussions because Webfuse never provided features that fit within this category. In addition, it is a category of feature rarely present or used in other LMS, especially from 2000 through 2004. Table 5.12 lists the remaining four Malikowski et al (2007) categories and the Webfuse features within those categories, both from 1997-1999 and from 2000 onwards. The 2000 onward features include those provided by both page types and Wf applications.

Table 5.12 – Allocation of Webfuse page types (1997-1999) and Webfuse features (2000-) to Malikowski et al (2007) categories
| Category | Page types (1997-1999) | Webfuse features (2000-) |
| --- | --- | --- |
| Transmitting content | Various content and index page types; Lecture and study guide page types; File upload and search page types | CourseHome; CourseResources; CourseSchedule; CourseStaff; CourseAssessment; RSSUpdates; LectureRepository; Timetable generator (Jones 2003) |
| Creating class interactions | Email2WWW; EwgieChatRoom; WWWBoard and WebBBS | Yabb; CourseGroup, CourseGroups; CourseMailingLists; Email Merge; Etutes |
| Evaluating students | AssignmentSubmission | Quiz; Assignment extension management; Academic misconduct database; OASIS (Jones and Behrens 2003); BAM (Jones and Luck 2009); Plagiarism detection; IROG (Jones 2003); Peer Review; Topic Allocation |
| Evaluating course and instructors | Barometer; UnitFeedback/FormMail | Survey |

Table 5.13 shows the percentage of Webfuse courses that adopted features in each of the categories proposed by Malikowski et al (2007) from 1997 through 2009. The “Malikowski %” row represents the level of feature adoption found by Malikowski et al (2007) in the LMS literature for usage reported before 2004. The “Blackboard %” row represents feature adoption within Blackboard by CQU courses during 2005. Blackboard was adopted as the official institutional LMS by CQU in 2004. The subsequent rows show the level of feature adoption within Webfuse courses from 1997 through 2009. The following describes some limitations and context for the data in Table 5.13, after which some additional visualisations of this data are shown and some conclusions are drawn.

Table 5.13 – Feature adoption in Webfuse course sites (1997-2009)
| Usage | Transmitting content | Class interactions | Evaluating students | Evaluating courses and instructors |
| --- | --- | --- | --- | --- |
| Malikowski % | >50% | 20-50% | 20-50% | <20% |
| Blackboard % | 94% | 28% | 17% | 2% |
| 1997 | 34.9% | 1.8% | 0.9% | 9.2% |
| 1998 | 38.4% | 48.6% | 1.4% | 0.7% |
| 1999 | 46.0% | 9.0% | 2.1% | 9.5% |
| 2000 | 46.6% | 43.7% | 24.7% | 6.9% |
| 2001 | 51.6% | 32.4% | 47.1% | 28.3% |
| 2002 | 69.6% | 63.8% | 57.7% | 44.2% |
| 2003 | 69.2% | 68.5% | 93.7% | 37.7% |
| 2004 | 61.3% | 61.9% | 91.8% | 35.7% |
| 2005 | 64.2% | 69.2% | 93.6% | 39.8% |
| 2006 | 70.0% | 68.7% | 105.1% | 31.6% |
| 2007 | 68.5% | 102.0% | 168.1% | 33.1% |
| 2008 | 72.9% | 110.7% | 192.0% | 51.6% |
| 2009 | 69.2% | 105.7% | 211.4% | 42.7% |

A variety of contextual factors and limitations are necessary to understand the data presented in Table 5.13. These include:

  • Missing course sites;
As mentioned for previous tables, the course website archives for 1998 and 2000 are each missing course sites for a single term. The percentages shown in Table 5.13 represent the percentage of courses offered in the terms for which archival information is available.
  • Missing mailing lists;
For most of the period shown in Table 5.13 a significant proportion of courses made use of electronic mailing lists for course communication. These lists, while supported by the Webfuse team, did not have an automated web interface until after the introduction of the default course sites. Information about the use of mailing lists before the default course sites is somewhat patchy, with none available before 2000 and only partial information for 2000 and the first half of 2001.
  • Optional versus compulsory content transmission;
All Webfuse course sites, including both the manually produced sites (before the second half of 2001) and the default course sites (from the second half of 2001), included content. Rather than simply showing 100%, Table 5.13 shows the percentage of courses where additional content was transmitted through the course site by teaching staff. This was an optional practice.
  • The definition of adoption and the course barometer;
    From 2001 through 2005 the presence of a course barometer was part of the Infocom default course site. This means that 100% of all Webfuse course sites had a course barometer. However, this is not represented in the figures for “evaluating courses and instructors” in Table 5.13. Instead, Table 5.13 includes the percentage of courses where the course barometer was actually used within the course.
  • Greater than 100% adoption.
    From 2006 onwards, both the class interactions and evaluating students columns suggest that greater than 100% of Webfuse course sites had adopted features in these categories. This arises due to the ability of courses to use a number of the Webfuse-provided features (e.g. email merge and results upload) without using Webfuse for their course sites (a calculation sketch follows this list).
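To make the calculation behind Table 5.13 (and the greater than 100% figures) concrete, the following sketch shows one way the percentages could be computed: each feature is mapped to a Malikowski et al (2007) category, the number of courses using at least one feature in a category is counted, and that count is divided by the number of courses with a Webfuse course site in that year. The feature-to-category mapping shown is abbreviated, and the record format and course names are assumptions for illustration.

```python
from collections import defaultdict

# Abbreviated mapping of Webfuse features to Malikowski et al (2007) categories.
CATEGORY = {
    "CourseResources": "Transmitting content",
    "Yabb": "Class interactions",
    "CourseMailingLists": "Class interactions",
    "EmailMerge": "Class interactions",
    "Quiz": "Evaluating students",
    "OASIS": "Evaluating students",
    "BAM": "Evaluating students",
    "Barometer": "Evaluating courses and instructors",
}

def adoption_percentages(feature_usage, course_sites):
    """Percentage of course sites adopting each category, per year.

    feature_usage: iterable of (year, course, feature) records.
    course_sites: dict mapping year -> number of Webfuse course sites.
    Because some features (e.g. email merge) were used by courses without
    a Webfuse course site, the numerator can exceed the denominator,
    giving adoption figures above 100%.
    """
    adopters = defaultdict(set)   # (year, category) -> set of adopting courses
    for year, course, feature in feature_usage:
        if feature in CATEGORY:
            adopters[(year, CATEGORY[feature])].add(course)
    return {
        (year, cat): 100.0 * len(courses) / course_sites[year]
        for (year, cat), courses in adopters.items()
        if course_sites.get(year)
    }

if __name__ == "__main__":
    # Placeholder course identifiers and a two-course-site year, for illustration.
    usage = [(2008, "CRS1", "BAM"), (2008, "CRS2", "EmailMerge"),
             (2008, "CRS3", "EmailMerge"), (2008, "CRS1", "CourseResources")]
    print(adoption_percentages(usage, {2008: 2}))
```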

The following graphs enable a visual comparison between the levels of feature adoption within Webfuse, Blackboard and the ranges reported by Malikowski et al (2007), and are also used to draw some conclusions about that adoption. These graphs use almost the same data as shown in Table 5.13, only separated into the four Malikowski et al (2007) categories. The only difference is that the following graphs also show how feature adoption for Blackboard changed over the period 2005 through 2009, rather than simply showing the level of adoption for 2005 as in Table 5.13. The Blackboard figures for 2009 only include data from the first CQU term, not the entire year.

Figure 5.9 provides a visualisation of the percentage of courses using features associated with content transmission. The Malikowski et al (2007) range is identified by the dotted lines, which represent the finding that, as of around 2004, it was common to find between 50% and 100% of course sites using content transmission features. The dashed line in Figure 5.9 shows that from 2005 through 2009 between 80% and 100% of CQU Blackboard course sites were using content transmission features. The thicker black line with data labels represents the percentage of Webfuse course sites using the option of adding content transmission features to the default course sites.

Figure 5.9 – Percentage course sites adopting content transmission: Webfuse, Blackboard and Malikowski et al (2007)
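For completeness, here is a minimal matplotlib sketch of how a figure in the style of Figure 5.9 could be regenerated from the Table 5.13 data. It only uses values actually shown in that table, so the Blackboard series is reduced to its 2005 value rather than the full 2005-2009 dashed line of the real figure.

```python
import matplotlib.pyplot as plt

# Webfuse content transmission adoption percentages from Table 5.13 (1997-2009).
years = list(range(1997, 2010))
webfuse = [34.9, 38.4, 46.0, 46.6, 51.6, 69.6, 69.2, 61.3,
           64.2, 70.0, 68.5, 72.9, 69.2]

fig, ax = plt.subplots()
# Malikowski et al (2007) range for content transmission (>50%), shown as dotted lines.
ax.axhline(50, linestyle=":", color="grey")
ax.axhline(100, linestyle=":", color="grey")
# Only the 2005 Blackboard figure (94%) appears in Table 5.13, so plot a single point.
ax.plot([2005], [94], marker="s", linestyle="none", label="Blackboard (2005)")
# Webfuse series as a thicker black line with data labels.
ax.plot(years, webfuse, linewidth=2, color="black", marker="o", label="Webfuse")
for x, y in zip(years, webfuse):
    ax.annotate(f"{y:.1f}", (x, y), textcoords="offset points", xytext=(0, 5), fontsize=7)
ax.set_ylabel("% of course sites")
ax.set_title("Content transmission")
ax.legend()
fig.savefig("figure_5_9_sketch.png")
```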

From Figure 5.9 it is possible to see that there was an increase in the optional use of content transmission features when the default course site approach was introduced during the second half of 2001. In 2002, the first full year of operation for the default course site approach, use of content transmission features was more than 20 percentage points higher than in 2000, the last full year without the default course site approach. From 2002 the adoption rate stayed above 60%.

Figure 5.10 shows the percentage of course websites adopting class interaction features such as discussion forums, chat rooms etc. As of 2004, Malikowski et al (2007) found that it was typical to find between 20% and 50% of course sites adopting these features. From 2005 through 2009, the percentage of Blackboard courses adopting class interaction features increased from 28% to 61%. The data series with the data labels represents the adoption of class interactions within Webfuse course sites and highlights some of the limitations and contextual issues discussed above about Table 5.13.

Figure 5.10 – Percentage course sites adopting class interactions: Webfuse, Blackboard and Malikowski et al (2007)

As mentioned in the previous chapter, the Department of Mathematics and Computing (M&C) – in which the Webfuse work originated – had started using email lists in 1992 as a way of interacting with distance education students. These lists arose from the same place as Webfuse. As outlined above, prior to 2001 the archives of these mailing lists were kept separate from the Webfuse course sites and records are somewhat patchy. For example, there are archives of the mailing lists for 1998, hence the peak of 48.6% in 1998, while the 1.8% and 9% adoption figures for 1997 and 1999 represent years for which mailing list data is missing. In addition, the greater than 100% adoption rates in 2007-2009 arise from increased use of the email merge facility by courses that did not have Webfuse course sites. These courses accessed the email merge facility through Staff MyCQU.

Figure 5.10 shows that adoption of class interaction features was significantly higher within Webfuse than both the Malikowski ranges and Blackboard. Given that, once adopted, it was unusual for a course mailing list to be dropped unless replaced by a web-based discussion forum, it is thought that complete archives of the pre-2001 mailing lists would indicate that, as early as 1997, almost 50% of Webfuse course sites had adopted some form of class interaction. Most of this adoption arose from M&C courses continuing to use mailing lists. The increased adoption of class interaction features post 2002 arises from the increased prevalence of web-based discussion forums, especially amongst non-M&C courses.

Figure 5.11 shows the percentage adoption of features related to student assessment – typically quizzes and online assignment submission. It shows that the typical Malikowski et al (2007) adoption rate is expected to be between 20% and 50%, and that CQU Blackboard adoption from 2005 through 2009 ranged between 17% and 30%. Webfuse adoption, on the other hand, after being minimal from 1997 through 1999, increased to over 90% from 2003 through 2005 before exceeding 100% from 2006 onwards.

Figure 5.11 – Percentage course sites adopting student assessment: Webfuse, Blackboard and Malikowski et al (2007)

The almost non-existent adoption of student assessment features within Webfuse from 1997 through 1999 reflects the almost non-existent provision of these features. A primitive online assignment submission system was used in a small number of courses during these years, mostly those taught by the Webfuse designer. From 2000 onwards an online quiz system became available and a new assignment submission system began to be developed. From this point adoption grew to over 90% by 2003. The use of Webfuse student assessment features far outstrips both the Malikowski ranges and those of CQU Blackboard courses.

Figure 5.12 shows the adoption of course evaluation features. It shows the expected Malikowski et al (2007) range to be between 0% and 20%. The adoption of course evaluation features by CQU Blackboard courses ranges from 2% in 2005 through to 5% in 2009. Prior to 2001, the Webfuse adoption rate is less than 10%, but it then increases to range between 28% and 52% from 2001 on. This increase is generally due to the increased availability of the Webfuse course barometer feature (see Section 5.3.6).

Figure 5.12 – Percentage course sites adopting course evaluation: Webfuse, Blackboard and Malikowski et al (2007)

Two of the peaks in the Webfuse adoption of course evaluation features from Figure 5.12 coincide with concerted efforts to encourage broader use of the course barometer. The 2002 peak at 44.2% coincides with the work of a barometer booster within Infocom during 2001 and early 2002 as described in Jones (2002). The 2008 peak of 51.6% coincides with a broader whole of CQU push to use the barometer for student evaluation purposes.

The above suggests that, in terms of feature adoption by courses, Webfuse and the default course site approach have been somewhat successful. The approach ensured that 100% of courses offered by the organisational unit using Webfuse had a course site with some level of content transmission, with a significant amount of additional content added to the course sites. Overall, there was broader adoption of content transmission with less effort required by academics. In terms of course interactions, student assessment and course evaluation features, the services provided by Webfuse after 2001 resulted in levels of adoption greater than broadly expected (as indicated by the Malikowski model) and greater than found in the use of the Blackboard system at the same institution.

References

Jones, D. (2003). How to live with ERP systems and thrive. Paper presented at the 2003 Tertiary Education Management Conference, Adelaide.

Jones, D., & Behrens, S. (2003). Online Assignment Management: An Evolutionary Tale. Paper presented at the 36th Annual Hawaii International Conference on System Sciences, Hawaii.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. Paper presented at the World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009. Retrieved from http://www.editlib.org/p/31530.

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.
