Assembling the heterogeneous elements for (digital) learning


Exploring knowledge reuse in design for digital learning: tweaks, H5P, constructive templates and CASA

The following has been accepted for presentation at ASCILITE’2019. It’s based on work described in earlier blog posts.


Abstract

Higher education is being challenged to improve the quality of learning and teaching while at the same time dealing with challenges such as reduced funding and increasing complexity. Design for learning has been proposed as one way to address this challenge, but a question remains around how to sustainably harness all the diverse knowledge required for effective design for digital learning. This paper proposes some initial design principles embodied in the idea of Context-Appropriate Scaffolding Assemblages (CASA) as one potential answer. These principles arose out of prior theory and work, contemporary digital learning practices and the early cycles of an Action Design Research process that has developed two digital ensemble artefacts employed in over 30 courses (units, subjects). Early experience with this approach suggests it can successfully increase the level of design knowledge embedded in digital learning experiences, identify and address shortcomings with current practice, and have a positive impact on the quality of the learning environment.

Keywords: Design for Learning, Digital learning, NGDLE.

Introduction

Learning and teaching within higher education continues to be faced with significant, diverse and on-going challenges. These challenges increase the difficulty of providing the high-quality learning experiences necessary to produce graduates of the standard society expects (Bennett, Lockyer, & Agostinho, 2018). Goodyear (2015) groups these challenges into four categories: massification and the subsequent diversification of needs and expectations; growing expectations of producing work-ready graduates; rapidly changing technologies, creating risk and uncertainty; and, dwindling public funding and competing demands on time. Reconceptualising teaching as design for learning has been identified as a key strategy to sustainably, and at scale, respond to these challenges in a way that offers improvements in learning and teaching (Bennett et al., 2018; Goodyear, 2015). Design for learning aims to improve learning processes and outcomes through the creation of tasks, environments, and social structures that are conducive to effective learning (Goodyear, 2015; Goodyear & Dimitriadis, 2013). The ability of universities to develop the capacity of teaching staff to enhance student learning through design for learning is of increasing financial and strategic importance (Alhadad, Thompson, Knight, Lewis, & Lodge, 2018).

Designing learning experiences that successfully integrate digital tools is a wicked problem: one that requires the utilisation of expert knowledge across numerous fields to design solutions that respond appropriately to the unique, incomplete, contextual, and complex nature of learning (Mishra & Koehler, 2008). The shift to teaching as design for learning requires different skills and knowledge, but also brings shifts in the conception of teaching and the identity of the teacher (Gregory & Lodge, 2015). Effective implementation of design for learning requires detailed understanding of pedagogy and design and places cognitive, emotional and social demands on teachers (Alhadad et al., 2018). The ability of teachers to deal with this load has a significant impact on learners, learning, and outcomes (Bezuidenhout, 2018). Academic staff report perceptions that expertise in digital technology and instructional design will be increasingly important to their future work, but that these are also the areas where they have the least competency and the highest need for training (Roberts, 2018). Helping teachers integrate digital technology effectively into learning and teaching has been at or near the top of issues facing higher education over several years (Dahlstrom, 2015). However, the nature of this required knowledge is often underestimated by common conceptions of the knowledge required by university teachers (Goodyear, 2015). Responding effectively will not be achieved through a single institutional technology, structure, or design, but instead will require an “amalgamation of strategies and supportive resources” (Alhadad et al., 2018, pp. 427-429). Approaches that do not pay enough attention to the impact on teacher workload run the risk of less than optimal learner outcomes (Gregory & Lodge, 2015).

Universities have adopted several different strategies to ameliorate the difficulty of successfully engaging in design for digital learning. For decades a common solution has been that course design, especially involving the adoption of new methods and technologies, should involve systematic planning by a team of people with appropriate expertise in content, education, technology and other required areas (Dekkers & Andrews, 2000). The use of collaborative design teams with an appropriate, complementary mix of skills, knowledge and experience mirrors the practice in other design fields (Alhadad et al., 2018). However, the prevalence of this practice in higher education has been low, both then (Dekkers & Andrews, 2000) and now. The combination of high demand and the limited availability of people with the necessary knowledge means that many teaching staff miss out (Bennett, Agostinho, & Lockyer, 2017). A complementary approach is professional development that provides teaching staff with the necessary knowledge of digital technology and instructional design (Roberts, 2018). However, access to professional development is not always possible and funding for professional development and training has rarely kept up with the funding for hardware and infrastructure (Mathes, 2019). There has been work focused on developing methods, tools and repositories to help analyse, capture and encourage reuse of learning designs across disciplines and sectors (Bennett et al., 2017). However, it appears that design for learning continues to struggle to enter mainstream practice (Mor, Craft, & Maina, 2015), with design work undertaken by teachers apparently not including the use of formal methods or systematic representations (Bennett et al., 2017). There does, however, remain on-going demand from academic staff for customisable and reusable ideas for design (Goodyear, 2005): approaches that respond to academic concerns about workload and time (Gregory & Lodge, 2015) and that require neither radical changes to existing work practices nor the development of complex knowledge and skills (Goodyear, 2005).

If there are limitations with current common approaches, what other approaches might exist? This leads to the research question of this study:

How might the diverse knowledge required for effective design for digital learning be shared and used sustainably and at scale?

An Action Design Research (ADR) process is being applied to develop one answer to this question. ADR is used to describe the design, development and evaluation of two digital artefacts – the Card Interface and the Content Interface – and the subsequent formulation of initial design principles that offer a potential answer to the research question. The paper starts by describing the research context and research method. The evolution of each of the two digital artefacts is then described. This experience is then abstracted into six design principles encapsulated in the concept of Context-Appropriate Scaffolding Assemblages (CASA). Finally, the conclusions and implications of this work are discussed.

Research context and method

This research project started in late 2018 within the Learning and Teaching (L&T) section of the Arts, Education and Law (AEL) Group at Griffith University. Staff within the AEL L&T section work with the AEL’s teachers to improve the quality of learning and teaching across about 1300 courses (units, subjects) and 68 programs (degrees). This work seeks to bridge the gaps between the macro-level institutional and technological vision and the practical, coal-face realities of teaching and learning (micro-level). In late 2018 the macro-level vision at Griffith University consisted of current and long-term usage of the Blackboard Learn Learning Management System (LMS) along with a recent decision to move to the Blackboard Ultra LMS. In this context, a challenge was balancing the need to help teaching staff continue to improve learning and teaching within the existing learning environment while at the same time helping the institution develop, refine, and achieve its new macro-level vision. It is within this context that the first offering of Griffith University’s Bachelor of Creative Industries (BCI) program would occur in 2019. The BCI is a future-focused program designed to attract creatives who aspire to a career in the creative industries by instilling an entrepreneurial mindset to engage and challenge the practice and business of the creative industries. Implementation of the program was supported through a year-long strategic project including a project manager and educational developer from the AEL L&T section working with a Program Director and other academic staff. This study starts in late 2018 with a focus on developing the course sites for the seven first year BCI courses. A focus of this work was to develop a striking and innovative design that mirrored the program’s aims and approach, and that could be maintained by the relevant teaching staff beyond the project’s protected niche. This raised the question of how to ensure that the design knowledge required to maintain a digital learning environment into the future would be available within the teaching team.

To answer this question an Action Design Research (Sein, Henfridsson, Purao, & Rossi, 2011) process was adopted. ADR is a merging of Action Research with Design Research developed within the Information Systems discipline. ADR aims to use the analysis of the continuing emergence of theory-ingrained, digital artefacts within a context as the basis for developing generalised outcomes, including design principles (Sein et al., 2011). A key assumption of ADR is that digital artefacts are not established or fixed. Instead, digital artefacts are ensembles that arise within a context and continue to emerge through development, use and refinement (Sein et al., 2011). A critical element of ADR is that the specific problem being addressed – design of the online learning environment for courses within the BCI program – is established as an example of a broader class of problems – how to sustainably and at scale share and reuse the diverse knowledge required for effective design for digital learning (Sein et al., 2011). This shift moves ADR work beyond design – as practised by any learning designer – to research intending to provide guidance on how others might address similar challenges in other contexts that belong to the broader class of design problems.

Figure 1 provides a representation of the ADR four-stage process and the seven principles on which ADR is based. Stages 1 through 3 represent the process through which ensemble digital artefacts are developed, used and evolved within a specific context. The next two sections of this paper describe the emergence of two artefacts developed for the BCI program as they cycled through the first three ADR stages numerous times. The fourth stage of ADR – Formalisation of Learning – aims to abstract the situated knowledge gained during the emergence of digital artefacts into design principles that provide guidance for addressing a class of field problems (Sein et al., 2011). The third section of this paper formalises the learning gained in the form of six initial design principles structured around the concept of Context-Appropriate Scaffolding Assemblages (CASA).


Figure 1 – ADR Method: Stages and Principles (adapted from Sein et al., 2011, p. 41)

Card Interface (artefact 1, ADR stages 1-3)

In response to the adoption of a trimester academic calendar, Griffith University encourages the adoption of a modular approach to course design. It is recommended that course profiles use modules to group and describe the teaching and learning activities. Subsequently, it has become common practice for this modular structure to be used within the course site using the Blackboard Learn content area functionality. Doing this well is not straightforward. Blackboard Learn has several functional limitations in legibility, design consistency, content arrangement and content adjustment that make it difficult to achieve quality visual design (Bartuskova, Krejcar, & Soukal, 2015). Usability analysis has also found that the Blackboard content area is inflexible, inefficient to use, and creates confusion for teaching staff regardless of their level of user experience (Kunene & Petrides, 2017). Overcoming these limitations requires levels of technical and design knowledge not typically held by teaching staff. Without this knowledge the resulting designs typically range from purely textual (e.g. the left-hand side of Figure 2) through to exemplars of poor design choices including the likes of blinking text, poor layout, questionable colour choices, and inconsistent design. While specialist design staff can and have been used to provide the necessary design knowledge to implement contextually-appropriate, effective designs, such an approach does not scale. For example, any subsequent modification typically requires the re-engagement of the design staff.

To overcome this challenge the Blackboard Learn user community has developed a collection of related solutions (Abhrahamson & Hillman, 2016; Plaisted & Tkachov, 2011) that use Javascript to package the necessary design knowledge into a form that can be used by teachers. Griffith University has for some time used one of these solutions, the Blackboard Tweaks building block (Plaisted & Tkachov, 2011) developed at the Queensland University of Technology. One of the tweaks offered by this building block – the Themed Course Table – has been widely used by teaching staff to generate a tabular representation of course modules (e.g. the right-hand side of Figure 2). However, experience has shown that the level of knowledge required to maintain and update the Themed Course Table can challenge some teaching staff. For example, re-ordering modules can be difficult for some, and the dates commonly used within the table must be manually added and then modified when copied from one offering to another. Finally, the inherently text-based and tabular design of the Themed Course Table is also increasingly dated. This was an important limitation for the Bachelor of Creative Industries. An alternative was required.

Figure 2 – Example Blackboard Learn Content Areas: Textual versus Themed Course Table

That alternative would use the same approach as the Themed Course Table to achieve a more appropriate outcome. The Themed Course Table, other related examples from the Blackboard community, and the H5P authoring tool (Singh & Scholz, 2017) are contemporary examples of constructive templates (Nanard, Nanard, & Kahn, 1998). Constructive templates arose from the hypermedia discipline to encourage the reuse of design knowledge and have been found to reduce cost and improve consistency, reliability and quality while enabling content experts to author and maintain hypermedia systems (Nanard et al., 1998). Constructive templates encapsulate a specific collection of design knowledge required to scaffold the structured provision of necessary data and generate design instances. For example, the Themed Course Table supports the provision of data through the Blackboard content area interface. It then uses design knowledge embedded within the tweak to transform that data into a table. Given these examples and the author’s prior positive experience with the use of constructive templates within digital learning (Jones, 2011), the initial plan for the BCI course content areas was to replace the Themed Course Table “template” with one that adopted both a more contemporary visual design and a forward-oriented view of design for learning. Dimitriadis and Goodyear (2013) argue that design for learning needs to be more forward-oriented and consider what features will be required in each of the lifecycle stages of a learning activity. That is, as the Themed Course Table replacement is being designed, consider what specific features will be required during configuration, orchestration, and reflection and re-design.

The first step in developing a replacement was to explore contemporary web interface practices that could replace the table. Due to its responsiveness to different devices, highly visual presentation, and widespread use amongst Internet and social media services, a card-based interface was chosen. Based on the metaphor of a paper card, this interface brings together all data for a particular object with an option to add contextual information. Common practice with card-based interfaces is to embed into a card memorable images related to the card content (see Figure 3). Within the context of a course module overview such a practice has the potential to positively impact student cognition, emotions, interest, and motivation (Leutner, 2014; Mayer, 2017). A practical advantage of card-based interfaces is that their widespread use means there are numerous widely available resources to aid implementation. This was especially important to the BCI project team, as it did not have significant graphical and client-side design knowledge to draw upon.

Next, a prototype was developed to test how effectively a card-based interface would represent a course’s learning modules. An iterative process was used to translate features and existing practice from the Themed Course Table to a card-based interface. Feedback from other design staff influenced the evolution of the prototype. It also highlighted differences of opinion about some of the visual elements such as the size of the cards, the number of cards per row, and the inclusion of the date in the top left-hand corner. Eventually the prototype card interface was shown to the BCI teaching team for input and approval. With approval given, a collection of Javascript and HTML was created to transform a specifically formatted Blackboard content area into a card interface.
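
As a rough illustration of the approach just described (and not the actual Card Interface source, which is not included in this paper), the following sketch shows how embedded Javascript might rebuild a specifically formatted Blackboard content area as cards. The container and item selectors, class names, and assumed item structure (a heading plus an optional image) are hypothetical.

    // Minimal sketch: transform a specifically formatted content area into cards.
    // Assumes each module item exposes a heading (h3) and an optional image.
    document.addEventListener('DOMContentLoaded', function () {
      var contentArea = document.querySelector('#content');   // hypothetical container
      var items = contentArea.querySelectorAll('.item');      // hypothetical item selector
      var cardHolder = document.createElement('div');
      cardHolder.className = 'cardInterface';

      items.forEach(function (item) {
        var heading = item.querySelector('h3');
        var image = item.querySelector('img');
        var card = document.createElement('div');
        card.className = 'card';
        card.innerHTML =
          (image ? '<img src="' + image.src + '" alt="">' : '') +
          '<div class="card-title">' + (heading ? heading.textContent : '') + '</div>';
        cardHolder.appendChild(card);
        item.style.display = 'none';   // hide the original, text-only representation
      });

      contentArea.appendChild(cardHolder);
    });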

Figure 3 shows just two of the six different styles of card-based interface currently supported by the Card Interface. This illustrates a key feature of the original conception of constructive templates – separation of content from presentation (Nanard et al., 1998) – allowing for different representations of the same content. The left-hand image in Figure 3 and the inclusion of dates on some cards illustrate one way the Card Interface supports a forward-oriented approach to design. Initially, the module dates are specified during the configuration of a course site. However, the dates typically only apply to the initial offering of the course and will need to be manually changed for subsequent offerings. To address this, the Card Interface knows the trimester weekly dates from the university academic calendar. Dates to be included on the Card Interface can then be provided using the week number (e.g. Week 1, Week 5 etc.). The Card Interface identifies the trimester a course offering belongs to and translates all week numbers into the appropriate calendar dates.

Figure 3 – Two early visualisations of the Card Interface
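
A minimal sketch of the week-number-to-date translation described above follows. It is not the Card Interface’s actual code: the trimester identifiers, start dates, and lookup structure are illustrative assumptions, but they show how knowledge of the academic calendar can be embedded once and then reused across course offerings.

    // Hypothetical sketch: translate "Week N" into a calendar date for a given trimester.
    // The keys and start dates below are example values only (Monday of Week 1).
    var TRIMESTER_START = {
      'T1-2019': new Date(2019, 1, 25),   // 25 Feb 2019 (JavaScript months are zero-based)
      'T2-2019': new Date(2019, 6, 8)     // 8 Jul 2019
    };

    function weekToDate(trimester, weekNumber) {
      var start = TRIMESTER_START[trimester];
      if (!start) { return null; }                        // unknown teaching period
      var date = new Date(start.getTime());
      date.setDate(date.getDate() + (weekNumber - 1) * 7);
      return date;
    }

    // e.g. a card labelled "Week 5" in a Trimester 1 2019 course offering
    console.log(weekToDate('T1-2019', 5).toDateString());   // Mon Mar 25 2019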

Despite being designed for the BCI program, the first use of the Card Interface was not in the BCI program. Instead, in late 2018 a librarian working on a Study Skills site learned of the Card Interface from a colleague. Working without any additional support, the librarian was able to use the Card Interface to represent 28 modules spread over 12 content areas. Implementation of the Card Interface in the BCI courses started by drawing on existing learning module content from course profiles. Google Image Search was used to identify visually striking images that could be associated with each module (e.g. the left-hand side of Figure 3). The Card Interface was also used on the BCI program’s Blackboard site. However, the program site had a broader purpose leading to different design decisions and the adoption of a different style of card-based interface (see the right-hand image in Figure 3).

Anecdotal feedback from BCI staff and students suggests that the initial implementation and use of the Card Interface was positive. In addition, the visual improvements offered by the Card Interface over both the standard Blackboard Content Area and the Themed Course Table tweak led to interest from other courses and programs. As of late July 2019, the Card Interface has been used in over 55 content areas in over 30 Blackboard sites. Adoption has occurred at both the program and individual course level, driven by exposure within the AEL L&T team or by academics seeing it and wanting it. Widespread use has generated different requirements, leading to creative uses of the Card Interface (e.g. the use of animated GIFs as card images) and the addition of new functionality (e.g. the ability to embed a video, instead of an image). Requirements from another strategic project led to a customisation of the Card Interface to provide an overview of assessment items, rather than modules.

With its adoption in multiple courses and use for different purposes the Card Interface appears to have successfully encapsulated a collection of design knowledge into a form that can be readily adopted and adapted. Use of that knowledge has improved the resulting design. Contributing factors to this success include: building on existing practice; providing advantages above and beyond existing practice; and, the capability for both teaching and support staff to rapidly customise the Card Interface. Further work is required to gain greater and more objective insight into the impact of the Card Interface on the student experience and outcomes of learning and teaching.

Content Interface (artefact 2, ADR stages 1-3)

The Card Interface provides a visual overview of course modules. The next challenge for the BCI project was the design, implementation and support of the learning activities and resources that form the content of those course modules. This task is inherently more creative and important, typically involves significantly more content, and must be completed using the same, problematic Blackboard interface. This requirement is known to encourage teaching staff to avoid the interface by using offline documents and slides (Bartuskova et al., 2015). This is despite evidence that failing to leverage affordances of the online environment can create a disengaging student experience (Stone & O’Shea, 2019) and that course content is a significant influence on students’ perceptions of course quality (Peltier, Schibrowsky, & Drago, 2007). Adding to the difficulty, the BCI teaching staff had limited, little recent, or no experience with Blackboard, and contracted staff did not have access to Blackboard at all. This raised the question of how to support the design, implementation and re-design of effective modular, online learning resources and activities for the BCI.

Observation of, and experience with, the Blackboard interface identified three main issues. First, staff either did not know how to use the Blackboard content interface or did not have access to it. Second, the Blackboard authoring interface provides limited authoring functionality. For example, beyond issues identified in the literature (Bartuskova et al., 2015; Kunene & Petrides, 2017) there is no support for standard authoring functionality such as grammar checking, reference management, commenting, and version control. Lastly, once the content is placed within Blackboard the user interface is limited and quite dated. On the plus side, the Blackboard interface does provide the ability to integrate a variety of different activities such as discussion forums, quizzes etc. The intent was to address these issues while at the same time retaining the ability to use the Blackboard activities.

For better or worse, the most common content creation tool for most University staff is Microsoft Word. Anecdotal observation suggests that many staff have adopted the practice of drafting content in Word before copying and pasting it into Blackboard. The Content Interface is designed to transform Word documents into good quality online learning activities and resources (see Figure 4). This is done by using an open source converter to semantically transform Word documents into HTML, which is then copied and pasted into Blackboard. A collection of design knowledge embedded into Javascript then transforms the HTML in several ways. Semantic elements such as activities and readings are visually transformed. All external web links are modified to open in a new tab to avoid a common Blackboard error. The document is transformed into an accordion interface with a vertical list of headings that can be clicked on to display the associated content. This progressive reveal: allows readers to get an overall picture of the module before focusing on the details; provides greater control over how they engage with the content; and is particularly useful on mobile platforms (Budiu, 2015; Loranger, 2014).

Figure 4 – Example Module as a Word document and in the Content Interface in Blackboard
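
The in-Blackboard transformations described above can be sketched as follows. This is not the Content Interface’s actual source: it assumes jQuery and jQuery UI are loaded and that the pasted, Word-derived HTML sits inside a hypothetical #module element made up of alternating headings and content blocks.

    // Sketch only: two of the transformations applied to the Word-derived HTML.
    $(function () {
      // 1. External web links open in a new tab, avoiding a common Blackboard error.
      $('#module a[href^="http"]').attr('target', '_blank');

      // 2. Progressive reveal: turn the heading/content pairs into an accordion.
      $('#module').accordion({
        collapsible: true,        // allow every panel to be closed
        active: false,            // start with all panels closed
        heightStyle: 'content'    // size each panel to fit its content
      });
    });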

To date, the Content Interface has been used to develop over 75 modules in 13 different Blackboard sites, most of these within the seven BCI courses. Experience using the still incomplete Content Interface suggests that there are significant advantages. For example, Library staff have adopted it to create research skills modules that are used in multiple course sites. Experience in the BCI shows that sharing documents through OneDrive and using comments and track changes enables the Word documents to become boundary objects helping the course development team co-create the module learning activities and resources. Where staff are comfortable with Word as an authoring environment, the authoring process is more efficient. The resulting accordion interface offers an improvement over the standard Blackboard interface. However, creating documents with Word is not without its challenges, especially the use of Word styles and templates. Also, the extra steps required can be perceived as problematic when minor edits need to be made, and when direct editing within Blackboard is perceived to be easier and quicker, especially for time-poor teaching staff. Better integration between Blackboard and OneDrive will help. Further advantages are possible when the Content Interface is further contextually customised to offer forward-oriented functionality specific to the module learning design.

Initial Design Principles (ADR stage 4)

This section engages with the final stage of the ADR process – formalisation of learning – to produce design principles that provide actionable insight for practitioners. The following six design principles guide the development of Context-Appropriate Scaffolding Assemblages (CASA) that help to share and reuse, sustainably and at scale, the design knowledge necessary for effective design for digital learning. The design principles are grouped using the three components of the CASA acronym.

Context-Appropriate

1. A CASA should address a specific contextual need within a specific activity system. The highest quality learning and teaching involves the development of appropriate context-specific approaches (Mishra & Koehler, 2006). A CASA should not be implemented at an institutional level. Such top-down projects are unable to pay enough attention to contextually specific needs as they aim for a solution that works in all contexts. Instead, a CASA should be designed in response to a specific need arising in a course or a small group of related courses. Following Ellis and Goodyear (2019), the focus in designing a CASA should not be on the needs of individual students, but on the whole activity system. That is, consideration should be given to the complex assemblage of learners, teachers, content, pedagogy, technology, organisational structures and the physical environment with an emphasis on encouraging students to successfully engage in intended learning activities. For example, both the Card and Content Interfaces arose from working with a group of seven courses in the BCI program as the result of two separate, but related, needs. While the issues addressed by these CASA apply to many courses, the ability to develop and test solutions at a small scale was beneficial. Rather than a focus primarily on individual learners, the solutions were heavily influenced by an analysis of the available tools (e.g. Blackboard Tweaks, Office365), practices (e.g. modularisation and learning activities described in course profiles), and other components of the activity systems.

2. CASA should be built using and result in generative technologies. To maximise and maintain contextual appropriateness, a CASA must be able to be designed and redesigned as easily as possible. Zittrain (2008) labels technologies as generative or sterile. Generative technologies have a “capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences” (Zittrain, 2008, p. 70). Sterile technologies prevent this. Generative technologies enable convivial systems where people can be “actively engaged in generating creative extensions to the artefacts given to them” (Fischer & Girgensohn, 1990, p. 183). It is the end-user modifiability of generative technology that is crucial to knowledge-based design environments and enables response to unanticipated, contextual requirements (Fischer & Girgensohn, 1990). Implementing CASA using generative technologies allows easy design for specific contexts. Ensuring that CASA are implemented as generative technologies enables easy redesign for other contexts. Generativity, like other technological affordances, arises from the relationship between the technology and the people using the technology. Not only is it necessary to use technology that is easier to modify, it is necessary to be able to draw upon appropriate technological skills. This could mean having people with those technological skills available to educational design teams. It could also mean having a network of intra- and inter-institutional CASA users and developers collaboratively sharing CASA and the knowledge required for use and development; like that available in the H5P community (Singh & Scholz, 2017).

For example, development of the Card and Content Interfaces was only possible due to Blackboard Learn supporting the embedding of Javascript. The value of this generative capability is evident through the numerous projects (Abhrahamson & Hillman, 2016; Plaisted & Tkachov, 2011) from the Blackboard community that leverage this capability; a capability that has been removed in Ultra, the next version of the Blackboard LMS. The use of Office365 by the Content Interface illustrates the rise of digital platforms that are generative and raise questions about how innovation through digital technologies is enabled and managed (Yoo, Boland, Lyytinen, & Majchrzak, 2012). Using the generative jQuery library to implement the Content Interface’s accordion enables modification of the accordion’s look and feel through the use of jQuery’s theme roller and library of existing themes. The separation of content from presentation in the Card Interface has enabled at least six redesigns for different purposes. This work was possible because the BCI development team had ready access to the necessary technological skills and was able to draw upon a wide collection of open source software and online support.
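
As a small, hypothetical illustration of that generativity: because the accordion is a standard jQuery UI widget, its look and feel can be changed by loading a different jQuery UI theme stylesheet (for example, one built with ThemeRoller) without touching the Content Interface’s own code. The CDN URL below points at one of the stock themes.

    // Hypothetical: restyle the accordion by swapping in a different jQuery UI theme.
    var theme = document.createElement('link');
    theme.rel = 'stylesheet';
    theme.href = 'https://code.jquery.com/ui/1.12.1/themes/ui-darkness/jquery-ui.css';
    document.head.appendChild(theme);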

3. CASA development should be strategically aligned and supported. Services to support design for learning within Australian universities are limited and insufficient for the demand (Bennett et al., 2017). Services capable of supporting the development of CASA are likely to be more limited. Hence appropriate decisions need to be made about how and what CASA are designed, re-designed and supported. Resources used to develop CASA are best allocated in line with institutional strategic projects. CASA development should proceed with consideration of the “manageably small set of particularly valued activity systems” (Ellis & Goodyear, 2019, p. 188) within the institution and be undertaken with institutionally approved and supported generative technologies. For example, the Card and Content Interfaces arose from an AEL strategic project. Both interfaces were focused on providing contextually-appropriate customisation and support for the institutionally important activity system of creating modular learning activities and resources. Where possible these example CASA have used institutionally approved digital technologies (e.g. OneDrive and Blackboard). The sterile nature of existing institutional infrastructure has made it necessary to use more generative technologies (e.g. Amazon Web Services) that are neither officially approved nor supported. However, the approach used does build upon an approach from an existing institutionally approved technology – Blackboard Tweaks (Plaisted & Tkachov, 2011).

Scaffolding

4. CASA should package appropriate design knowledge to enable (re-)use by teachers and students. Drawing on ideas from constructive templates (Nanard et al., 1998), CASA should package the diverse design knowledge required to respond to a contextually-appropriate need in a way that allows this design knowledge to be easily reused in different instances. CASA enable the sustainable reuse of contextually applied design knowledge in learning activity systems and subsequently reduce cost and improve quality and consistency. For example, the Card Interface combines the knowledge from web design and multimedia learning research (Leutner, 2014; Mayer, 2017) in a way that has allowed teaching staff to generate a visual overview of the modules in numerous course sites. The Content Interface combines existing knowledge of the Microsoft Word ecosystem with web design knowledge to improve the design, use and revision of modular content.

5. CASA should actively support a forward-oriented approach to design for learning. To “thrive outside of the protective niches of project-based innovation” (Dimitriadis & Goodyear, 2013, p. 1) the design of a CASA must not focus only on initial implementation. Instead, CASA design must explicitly consider and include functionality to support the configuration, orchestration, and reflection and re-design of the CASA. For example, the Card Interface leverages contextual knowledge to enable dates to be specified independent of the calendar to automate re-design for subsequent course offerings. As CASA tend to embody a learning design, it should be possible to improve each CASA’s support for orchestration by implementing checkpoint and process analytics (Lockyer, Heathcote, & Dawson, 2013) specific to the CASA’s embedded learning design.

Assemblages

6. CASA are conceptualised and treated as contextual assemblages. Like all technologies, CASA are assemblies of other technologies (Arthur, 2009), where technologies are understood to include techniques such as organisational processes and pedagogies, as well as hardware and software. But a contextual assemblage is more than just technology. It includes consideration of and connections with the policies, practices, funding, literacies and discourse across levels from the societal down through sector, organisational, personal, individual, formal and informal. These are the elements that make up the mess and nuance of the context, where the practice of educational technology gets complex (Cottom, 2019). A CASA must be generative in order to be designed and re-designed to respond to this contextual complexity. A CASA needs to be inherently heterogeneous, ephemeral, local, and emergent: a need that is opposed and ill-suited to the dominant rational system view underpinning common digital learning practice, which sees technologies as planned, structured, consistent, deterministic, and systematic. Instead, connecting back to design principle one, CASA should be designed in recognition of the importance and complex intertwining of the human, social and organisational elements in any attempt to use digital technologies. Such design plays down the usefulness of distinctions between developer and user, or pedagogy and technology. For example, the Card Interface does not use the Lego approach to assembly that informs the Next Generation Digital Learning Environment (NGDLE) (Brown, Dehoney, & Millichap, 2015) and underpins technologies such as the Learning Tools Interoperability (LTI) standard. Instead of combining clearly distinct blocks with clearly defined connectors, the Card and Content Interfaces are intertwined with and modify the Blackboard user interface to connect with the specifics of context. This suggests that the Lego approach is useful, perhaps even necessary, but not sufficient.

Conclusions, Implications, and Further Work

Universities are faced with the strategically important question of how to sustainably and at scale leverage the knowledge required for effective design for digital learning. The early stages of an Action Design Research (ADR) process have been used to formulate one potential answer in the form of six design principles encapsulated in the idea of Context-Appropriate Scaffolding Assemblages (CASA). To date, the ADR process has resulted in the development and use of two prototype CASA within a suite of seven courses and, within six months, their subsequent adoption in another 24 courses. CASA draw on the idea of constructive templates to capture diverse design knowledge in a form that enables use of that knowledge by teachers and students to effectively address contextually specific needs. By adopting a forward-oriented view of design for learning, CASA offer functionality to support configuration, orchestration, and reflection and re-design in order to encourage on-going use beyond the protected project niche of initial implementation. The use of generative technologies and an assemblage perspective enables CASA development to be driven by and re-designed to fit the specific needs of different activity systems and contexts. Such work will be most effective when it is strategically aligned and supported with the aim of supporting and refining institutionally valued activity systems.

Use of the Card and Content Interfaces within and beyond the original project suggests that these CASA have successfully encapsulated the necessary design knowledge to address shortcomings with current practice and have had a positive impact on the quality of the digital learning environment. But it’s early days. These CASA can be improved by more completely following the CASA design principles. For example, the Content Interface currently offers only generic support for module design. Significantly greater benefits would arise from customising the Content Interface to support specific learning designs and provide contextually appropriate forward-oriented functionality. More experience is needed to provide insight into how this can be done effectively. Further work is required to establish if, how and what impact the use of CASA has on the quality of the learning environment and the experience and outcomes of both learning and teaching. Further work could also explore the questions raised by the CASA design principles about existing digital learning practice. The generative principle raises questions about whether moves away from leveraging the generativity of web technology – such as the design of Blackboard Ultra and the increasing focus on mobile apps – will make it more difficult to integrate contextually specific design knowledge. Do reported difficulties accessing student engagement data with H5P activities (Singh & Scholz, 2017) suggest that the H5P community could fruitfully pay more attention to supporting a forward-oriented design approach? Does the assemblage principle point to potential limitations with some conceptualisations and implementations of next generation digital learning environments?

References

Abhrahamson, A., & Hillman, D. (2016). Customize Learn with CSS and Javascript injection. Presented at the BBWorld 16, Las Vegas, NV. Retrieved from https://community.blackboard.com/docs/DOC-2103

Alhadad, S. S. J., Thompson, K., Knight, S., Lewis, M., & Lodge, J. M. (2018). Analytics-enabled Teaching As Design: Reconceptualisation and Call for Research. Proceedings of the 8th International Conference on Learning Analytics and Knowledge, 427–435.

Arthur, W. B. (2009). The Nature of Technology: what it is and how it evolves. New York, USA: Free Press.

Bartuskova, A., Krejcar, O., & Soukal, I. (2015). Framework of Design Requirements for E-learning Applied on Blackboard Learning System. In M. Núñez, N. T. Nguyen, D. Camacho, & B. Trawiński (Eds.), Computational Collective Intelligence (pp. 471–480). Springer International Publishing.

Bennett, S., Agostinho, S., & Lockyer, L. (2017). The process of designing for learning: understanding university teachers’ design work. Educational Technology Research & Development, 65(1), 125–145.

Bennett, S., Lockyer, L., & Agostinho, S. (2018). Towards sustainable technology-enhanced innovation in higher education: Advancing learning design by understanding and supporting teacher design practice. British Journal of Educational Technology, 49(6), 1014–1026.

Bezuidenhout, A. (2018). Analysing the Importance-Competence Gap of Distance Educators with the Increased Utilisation of Online Learning Strategies in a Developing World Context. International Review of Research in Open and Distributed Learning, 19(3), 263–281.

Brown, M., Dehoney, J., & Millichap, N. (2015). The Next Generation Digital Learning Environment: A Report on Research (p. 11). Louisville, CO: EDUCAUSE.

Budiu, R. (2015). Accordions on Mobile. Retrieved July 18, 2019, from Nielsen Norman Group website: https://www.nngroup.com/articles/mobile-accordions/

Cottom, T. M. (2019). Rethinking the Context of Edtech. EDUCAUSE Review, 54(3). Retrieved from
https://er.educause.edu/articles/2019/8/rethinking-the-context-of-edtech

Dahlstrom, E. (2015). Educational Technology and Faculty Development in Higher Education. Retrieved from ECAR website: https://library.educause.edu/resources/2015/6/educational-technology-and-faculty-development-in-higher-education

Dekkers, J., & Andrews, T. (2000). A meta-analysis of flexible delivery in selected Australian tertiary institutions: How flexible is flexible delivery? In L. Richardson & J. Lidstone (Eds.), Proceedings of ASET-HERDSA 2000 Conference (pp. 172-182).

Dimitriadis, Y., & Goodyear, P. (2013). Forward-oriented design for learning: illustrating the approach. Research in Learning Technology, 21, 1–13.

Ellis, R. A., & Goodyear, P. (2019). The Education Ecology of Universities: Integrating Learning, Strategy and the Academy. Routledge.

Fischer, G., & Girgensohn, A. (1990). End-user Modifiability in Design Environments. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 183–192.

Goodyear, P. (2005). Educational design and networked learning: Patterns, pattern languages and design practice. Australasian Journal of Educational Technology, 21(1). https://doi.org/10.14742/ajet.1344

Goodyear, P. (2015). Teaching As Design. HERDSA Review of Higher Education, 2, 27–59.

Goodyear, P., & Dimitriadis, Y. (2013). In medias res: reframing design for learning. Research in Learning Technology, 21, 1–13.

Gregory, M. S. J., & Lodge, J. M. (2015). Academic workload: the silent barrier to the implementation of technology-enhanced learning strategies in higher education. Distance Education, 36(2), 210–230.

Jones, D. (2011). An Information Systems Design Theory for E-learning (PhD, Australian National University). Retrieved from https://openresearch-repository.anu.edu.au/handle/1885/8370

Kunene, K. N., & Petrides, L. (2017). Mind the LMS Content Producer: Blackboard usability for improved productivity and user satisfaction. Information Systems, 14.

Leutner, D. (2014). Motivation and emotion as mediators in multimedia learning. Learning and Instruction, 29, 174–175.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439–1459.

Loranger, H. (2014). Accordions for Complex Website Content on Desktops. Retrieved July 18, 2019, from Nielsen Norman Group website: https://www.nngroup.com/articles/accordions-complex-content/

Mathes, J. (2019). Global quality in online, open, flexible and technology enhanced education: An analysis of strengths, weaknesses, opportunities and threats. Retrieved from International Council for Open and Distance Education website:
https://www.icde.org/knowledge-hub/report-global-quality-in-online-education

Mayer, R. E. (2017). Using multimedia for e-learning. Journal of Computer Assisted Learning,
33(5), 403–423.

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054.

Mor, Y., Craft, B., & Maina, M. (2015). Introduction – Learning Design: Definitions, Current Issues and Grand Challenges. In M. Maina, B. Craft, & Y. Mor (Eds.), The Art & Science of Learning Design (pp. ix–xxvi). Rotterdam: Sense Publishers.

Nanard, M., Nanard, J., & Kahn, P. (1998). Pushing Reuse in Hypermedia Design: Golden Rules, Design Patterns and Constructive Templates (pp. 11–20). ACM.

Peltier, J. W., Schibrowsky, J. A., & Drago, W. (2007). The Interdependence of the Factors Influencing the Perceived Quality of the Online Learning Experience: A Causal Model. Journal of Marketing Education; Boulder, 29(2), 140–153.

Plaisted, T., & Tkachov, N. (2011). Blackboard Tweaks: Tools for Academics, Designers and Programmers. Retrieved July 2, 2019, from http://tweaks.github.io/Tweaks/index.html

Roberts, J. (2018). Future and changing roles of staff in distance education: A study to identify training and professional development needs. Distance Education, 39(1), 37–53.

Sein, M. K., Henfridsson, O., Purao, S., & Rossi, M. (2011). Action Design Research. MIS Quarterly,
35(1), 37–56.

Singh, S., & Scholz, K. (2017). Using an e-authoring tool (H5P) to support blended learning: Librarians’ experience. In H. Partridge, K. Davis, & J. Thomas (Eds.), Me, Us, IT! Proceedings ASCILITE2017: 34th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education (pp. 158–162).

Stone, C., & O’Shea, S. (2019). Older, online and first: Recommendations for retention and success. Australasian Journal of Educational Technology, 35(1). https://doi.org/10.14742/ajet.3913

Yoo, Y., Boland, R. J., Lyytinen, K., & Majchrzak, A. (2012). Organizing for Innovation in the Digitized World. Organization Science, 23(5), 1398–1408.

Zittrain, J. (2008). The Future of the Internet–And How to Stop It. Yale University Press.

From thinking to tinkering: The grassroots of strategic information systems

What follows is a long overdue summary of Ciborra (1992). I think it will have a lot of insight for how universities implement e-learning. The abstract for Ciborra (1992) is

When building a Strategic Information System (SIS), it may not be economically sound for a firm to be an innovator through the strategic deployment of information technology. The decreasing costs of the technology and the power of imitation may quickly curtail any competitive advantage acquired through an SIS. On the other hand, the iron law of market competition prescribes that those who do not imitate superior solutions are driven out of business. This means that any successful SIS becomes a competitive necessity for every player in the industry. Tapping standard models of strategy analysis and data sources for industry analysis will lead to similar systems and enhance, rather than decrease, imitation. How then should “true” SISs be developed? In order to avoid easy imitation, they should emerge from the grass roots of the organization, out of end-user hacking, computing, and tinkering. In this way the innovative SIS is going to be highly entrenched with the specific culture of the firm. Top management needs to appreciate local fluctuations in practices as a repository of unique innovations and commit adequate resources to their development, even if they fly in the face of traditional approaches. Rather than looking for standard models in the business strategy literature, SISs should be looked for in the theory and practice of organizational learning and innovation, both incremental and radical.

My final thoughts

The connection with e-learning

Learning and teaching is the core business of a university. For the 20+ years I’ve worked in Australian Higher Education there have been calls for universities to become more distinct. It would then seem logical that the information systems used to support, enhance and transform (as if there are many that do that) learning and teaching (I’ll use e-learning systems in the following) should be seen as Strategic Information Systems.

Since the late 1990s the implementation of e-learning systems has been strongly influenced by traditional approaches to strategic and operational management. The adoption of ERP systems was in no small way a major contributor to this. This recent article (HT: @katemfd) shows the lengths to which universities are going when they select an LMS (sadly, for many, e-learning == LMS).

I wonder how much of the process is seen as being for strategic advantage. Part, or perhaps all, of Ciborra’s argument for tinkering is on the basis of generating strategic advantage. The question remains whether universities see e-learning as a source of strategic advantage (anymore)? Perhaps they don’t see selection of the LMS as a strategic advantage, but given the lemming-like rush toward “we have to have a MOOC” of many VCs it would seem that technology enhanced learning (apologies to @sthcrft) is still seen as a potential “disruptor”/strategic advantage.

For me this approach embodies the rational analytic theme to strategy that Ciborra critiques. The tinkering approach is what is missing from university e-learning and its absence is (IMHO) the reason much of it is less than stellar.

Ciborra argues that strategic advantage comes from systems where development is treated as an innovation process. Where innovation is defined as creating new knowledge “about resources, goals, tasks, markets, products and processes” (p. 304). To me this is the same as saying to treat the development of these systems as a learning process. Perhaps more appropriately a constructionist learning process. Not only does such a process provide institutional strategic advantage, it should improve the quality of e-learning.

The current rhetoric/reality gap in e-learning arises from not only an absence, but active prevention and rooting out, of tinkering and bricolage. An absence of learning.

The deficit model problem

Underpinning Ciborra’s approach is that the existing skills and competencies within an organisation provide both the source and the constraint on innovation/learning.

A problem with university e-learning is the deficit model of most existing staff, i.e. most senior management, central L&T staff, and middle managers (e.g. ADL&T) have a deficit model of academic staff. They aren’t good enough. They don’t know enough. They have to complete a formal teaching qualification before they can be effective teachers. We have to nail down systems so they don’t do anything different.

Consequently, existing skills and competencies are only seen as a constraint on innovation/learning. They are never seen as a source.

Ironically, the same problem arises in the view of students held by the teaching academics that are disparaged by central L&T etc.

The difficulties

The very notion of something being “unanalyzable” would be very difficult for many involved in University management and information technology to accept. Let alone deciding to use it as a foundation for the design of systems.

Summary of the paper

Introduction

Traditional approaches for designing information systems are based on “a set of guidelines” about how best to use IT in a competitive environment and “a planning and implementation strategy” (p. 297).

However, the “wealth of ‘how to build an SIS’ recipes” during the 1990s failed to “yield a commensurate number of successful cases” at least not measured against the rise of systems in the 1980s. Reviewing the literature suggests a number of reasons, including

  • Theoretical literature emphasises rational assessment by top management as the means for strategy formulation, ignoring alternative conceptions from the innovation literature that value learning over thinking, and experimentation as a means for revealing new directions.
  • Examining precedent-setting SISs suggests that serendipity, reinvention and other factors were important in their creation. These are missing from the rational approach.

So there are empirical and theoretical grounds for a new kind of guidelines for SIS design.

Organisations should ask

  1. Does it pay to be innovative?
  2. Are SISs offering competitive advantage or are they competitive necessity?
  3. How can a firm implement systems that are not easily copied and thus generate returns?

In terms of e-learning this applies

the paradox of micro-economics: competition tends to force standardization of solutions and equalization of production and coordination costs among participants.

i.e. the pressures to standardise.

The argument is that an SIS must be based on new practical and conceptual foundations

  • Basing an SIS on something that can’t be analysed, like organisational culture, will help avoid easy imitation. Leveraging the unique sources of practice and know-how at the firm and industry level can be the source of sustained advantage.
  • SIS development should be closer to prototyping and engaging with end-users’ ingenuity than has been realised.

    The capability of integrating unique ideas and practical design solutions at the end-user level turns out to be more important than the adoption of structured approaches to systems development or industry analysis (Schoen 1979; Ciborra and Lanzara, 1990)

Questionable advantage

During the 1980s a range of early adopters of strategic information systems (SISs) – think old-style airline reservation systems – brought benefits to some organisations and bankruptcy to those that didn’t adopt. This gave rise to a range of frameworks for identifying SISs.

I’m guessing some of these contributed to the rise of ERP systems.

But the history of those cited success stories suggests that SISs only provide an ephemeral advantage before being copied. One study suggests 92% of systems followed industry-wide trends. Only three were original.

I imagine the percentage in university e-learning would be significantly higher. i.e. you can’t get fired if you implement an LMS (or an eportfolio).

To avoid the imitation problem there are suggestions to figure out the lead time before competitors can copy. But that doesn’t avoid the problem, especially given the rise of consultants and services to help overcome it.

After all, if every university can throw millions of dollars at Accenture etc they’ll all end up with the same crappy systems.

Shifts in model of strategic thinking and competition

This is where the traditional approaches to strategy formulation get questioned.

i.e. “management should first engage in a purely cognitive process” that involves

  1. appraise the environment (e.g. SWOT analysis)
  2. identify success factors/distinctive competencies
  3. translate those into a range of competitive strategy alternatives
  4. select the optimal strategy
  5. plan it in sufficient detail
  6. implement

At this stage I would add “fail to respond to how much the requirements have changed” and start over again as you employ new senior leadership

This model is seen in most SIS models.

Suggests that in reality actual strategy formulation involves incrementalism, muddling through, myopic and evolutionary decision making. “Structures tend to influence strategy formulation before they can be impacted by the new vision” (p. 300)

References Mintzberg (1990) to question this school of thought in three ways:

  1. Assumes that the environment is highly predictable and events unfold in predicted sequences, when in fact implementation surprises happen, resulting in a clash between inflexible plans and the need for revision.
  2. Assumes that the strategist is an objective decision maker not influenced by “frames of reference, cultural biases, or ingrained, routinized ways of action” (p. 301). Contrary to a raft of research.
  3. Strategy is seen as an intentional design process rather than as learning “the continuous acquisition of knowledge in various forms”. Quotes a range of folk to argue that strategy must be based on effective adaptation and learning involving both “incremental, trial-and-error learning, and radical second-order learning” (p. 301)

The models of competition implicit in SIS frameworks tend to rely on theories of business strategy from industrial organisation economics. i.e. returns are determined by industry structure. To generate advantage a firm must change the structural characteristics by “creating barriers to entry, product differentiation, links with suppliers” (p. 301).

There are alternative models

  • Chamberlin’s (1933) theory of monopolistic competition

    Firms are heterogeneous and compete on resource and asset differences – “technical know-how, reputation, ability for teamwork, organisational culture and skills, and other ‘invisible assets’ (Itami, 1987)” (p. 301)

    Differences enable high return strategies. You compete by cultivating unique strengths and capabilities and defending against imitation.

  • Schumpeter’s take based on innovation in product, market or technology

    Innovation arises from creative destruction, not strategic planning. The ability to guess and learn, along with luck, appear to be the competitive factors.

Links these with Mintzberg’s critique of rational analytic approaches and identifies two themes in business strategy

  1. Rational analytic

    Formulate strategy in advance based on industry analysis. Plan and then implement. Gains advantage relative to firms in the same industry structure.

  2. Tinkering (my use of the phrase)

    Strategy difficult to plan before the fact. Advantage arises from exploiting unique characteristics of the firm and unleashing its innovating capabilities

Reconsidering the empirical evidence

Turns to an examination of four well-known SISs based on the two themes and other considerations from above. This examination shows that these “cases emphasize the discrepancy between ideal plans for an SIS and the realities of implementation” (p. 302). i.e.

The system was not developed according to a company-wide strategic plan; rather, it was the outcome of an evolutionary, piecemeal process that included the ingenious tactical use of systems already available.

i.e. bricolage. Even more revealing:

the conventional MIS unit was responsible not only for initial neglect of the new strategic applications within McKesson, but also, subsequently, for the slow pace of company-wide learning about McKesson’s new information systems

Another system “was supposed to address an internal inefficiency” (p. 303) not some grand strategic goal.

And further

The most frequently cited SIS successes of the 1980s, then, tell the same story. Innovative SISs are not fully designed top-down or introduced in one shot; rather, they are tried out through prototyping and tinkering. In contrast, strategy formulation and design take place in pre-existing cognitive frames and organizational contexts that usually prevent designers and sponsors from seeing and exploiting the potential for innovation. (p. 303)

New foundations for SIS design

SIS development must be treated as an innovation process. The skills and competencies in an organisation are both a source of and a constraint on innovation. The aim is to create knowledge.

New knowledge can be created in two non-exclusive ways

  1. Tinkering.

    Rely on local information and routine behaviour: learning by doing, incremental decision making and muddling through.

    Accessing more diverse and distant information, when an adequate level of competence is not present, would instead lead to errors and further divergence from optimal performance (Heiner, 1983) (p. 304)

    People close to the operational level have to be able to tinker to solve new problems. “local cues from a situation are trusted and exploited in a somewhat unreflective way, aiming at ad hoc solutions by heuristics rather than high theory”

    The value of this approach is to keep development of an SIS close to the competencies of the organisation and ongoing fluctuations.

  2. Radical learning

    “entails restructuring the cognitive and organisational backgrounds that give meaning to the practices, routines and skills at hand” (p. 304). It requires more than analysis and requirements specifications; it “aims at restructuring the context of both business policy and systems development” and requires “intervening in situations and designing-in-action”.

    The change in context allows new ways of looking at the capabilities and devising new strategies. The sheer difference becomes difficult to imitate.

SIS planning by oxymorons

Time to translate those theoretical observations into practical guidelines.

Argues that the way to develop an SIS is to proceed by oxymoron, fusing “opposites in practice and being exposed to the mismatches that are bound to occur” (p. 305). Defines seven:

  • 4 to bolster incremental learning
    1. Value bricolage strategically
    2. Design tinkering

      This is important

      Activities, settings, and systems have to be arranged so that invention and prototyping by end-users can flourish, together with open experimentation (p. 305)

      Set up the organisation to favour local innovation, e.g. ad hoc project teams, ethnographic studies.

    3. Establish systematic serendipity

      Open experimentation results in largely incomplete designs, the constant intermingling of implementation and refinement, concurrent or simultaneous conception and execution – NOT sequential

      An ideal context for serendipity to emerge and lead to unexpected solutions.

    4. Thrive on gradual breakthroughs.

      In a fluctuating environment the ideas that arise are likely to include those that don’t align with established organisational routines. The raw material for innovation. “management should appreciate and learn about such emerging practices”

  • Radical learning and innovation
    1. Practice unskilled learning

      Radically innovative approaches may be seen as incompetent when judged by old routines and norms. Management should value this behaviour as an attempt to unlearn old ways of thinking and doing. It’s where new perspectives arise.

    2. Strive for failure

      Going for excellence suggests doing better what you already do, which generates routinized and efficient systems – the competency trap. Creative reflection over failures can suggest novel ideas and designs, as well as the recognition of discontinuities and flex points.

    3. Achieve collaborative inimitability

      Don’t be afraid to collaborate with competitors. Expose the org to new cultures and ideas.

These seven oxymorons can represent a new “systematic” approach for the establishment of an organizational environment where new information—and thus new systems can be generated. Precisely because they are paradoxical, they can unfreeze existing routines, cognitive frames and behaviors; they favor learning over monitoring and innovation. (p. 306)

References

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297–309.

Eduhacking – a better use for (part of) academic conferences?

In short, can we get an Eduhack style event running at ASCILITE’12? Want to help? If you want, skip to the point

 

Possibly the most productive conference I’ve ever been to was the 1996 ITiCSE Conference in Barcelona. (It seems the conference has evolved from “Integrating Technology into CS Education” to “Innovation and Technology in CS Education”.) Apart from being my first trip to Spain, the conference introduced me to something different in terms of conferences: the working groups.

We were the first set of working groups and at that stage it worked a bit like this:

  • Someone came up with a topic – in our case “World Wide Web as an Interactive Teaching Resource”.
  • They called for participants.
  • We started collaborating ahead of the conference.
  • During the conference we (based on my vague recollection of 16 years ago)
    • Worked for a day or two before the conference proper started.
    • Did some work during the conference, including presenting a “poster” on our current progress. (apparently shown in the image below)
    • Did some final work at the end/after of the conference.
  • Produced a final document.

Poster of working group

The benefit

The biggest benefit that flowed from that event was meeting the co-author of the book we wrote, which (even with its limitations) remains the most cited of my publications. Without being a member of the working group with my co-author, the book would never have been written.

Having to work with folk at a conference on a specific project, rather than sit and listen or drink and network, provides additional insights and connections. It can also be a bit more challenging, but nothing in life is free.

The wasted opportunity

This type of approach seems to address the wasted opportunity that is most conferences. You have all these talented, diverse and skilled folk in the one location, but limit their interaction to presentations, panels and socialising. Nothing, at least in my experience, works to bring those diverse perspectives together to produce something.

For a long time, I’ve been wondering if something different is possible.

Looking for alternatives

The ITiCSE working group approach was okay, but fairly traditional. It aimed to produce an academic paper. I was involved with the first one; it would be interesting to see how they’ve evolved and changed based on the experience.

The REACT project tried to ensure that planned innovations in L&T benefited from diverse perspectives before implementation. But, like the working group idea, it used an academic paper as the main artifact. REACT never really went anywhere.

And then there is Eduhacking in the style used by @sthcrft and @stuffy65 at UNE, and in particular @sthcrft’s call

do we need a cross-institution eduhack event? From my point of view, anything that’s about collaborating on ideas and possibilities has got to be better than yet another show and tell event. Who’s in?

I’m thinking: Yes and me. The question is where to now?

How might it work?

Education Hack Day describes the aim this way

The mission was simple: listen to problems sourced by teachers from around the world, pick a dozen or so to tackle, and form teams around those problems that would each come up with and execute a creative solution to solve them.

This seems to have been based on the older and broader idea, from the developer world, of a hackathon. As with the UNE experiment, the focus here wouldn’t necessarily be on software developers, but a broader cross-section of people.

So a process might be:

  • Pick a conference, one that has a good cross section of educational technology type folk.
    For example, ASCILITE’12.
  • Run an Eduhack day just before the conference proper starts, probably as a workshop.
  • Actively encourage a diverse collection of folk to come along.
  • Distribute a call for problems prior to the conference.
  • Ensure that the setting for the Eduhack is appropriate (i.e. not the normal conference breakout room).
  • Have a loose process to select the problems to be worked on and let folk go.
  • Have some of the outcomes presented as posters during the conference.
  • Encourage/enable groups to keep working on the problems post-conference, perhaps for presentation as traditional papers at the next conference?

I’m sure there are improvements to be made. Who’s interested?

Herding cats and losing weight: the vimeo video

This is in part a test of WordPress.com’s new support for Vimeo video. The video below is of a presentation I gave at CQU this year. The abstract is below. The slides are on slideshare.

Abstract

The environment within which Universities operate has changed significantly over recent years. Two of the biggest changes have been a reduction in state funding for universities and, at the same time, an increased need for universities to demonstrate the quality and appropriateness of their services, especially learning and teaching.

Consequently, most universities have developed a range of strategies, policies, structures and systems with the intent of improving and demonstrating the quality of their learning and teaching. This presentation will draw on the metaphors of herding cats and losing weight to examine the underlying assumptions of these attempts, the resulting outcomes, question whether or not they are the best we can hope for, and present some alternatives.

The video

The Innovation Prevention Department: Why?

The final keynote at ASCILITE’09 was by James Clay and was titled the Future of Learning (this is a link to an apparently earlier presentation by same author, same topic). Many aspects of the talk resonated with many in the audience, however, the one that perhaps resonated the most was that of the Innovation Prevention Department.

James, as he describes in this comment, was suggesting that most organisations have one department that seems to hold back innovation. The comment reveals James’ use came from Jon Trinder (slide 6). A quick google reveals the phrase being used as a chapter title in this 2002 book. So, the idea that there always seems to be one department within an organisation that prevents innovation doesn’t seem to be new.

So, what’s the problem?

The problem is that in giving a list of departments that could possibly fulfill this role, James started with the information technology department, and that’s about where most of the audience seemed to stop listening. Many didn’t hear the other suggestions in James’ list. From where I was sitting, as soon as IT was mentioned most of the audience started nodding their heads and remembering specific examples of where their IT folk had thwarted some innovation. It wasn’t long before “the Innovation Thwarting Department”, as a play on the information technology department, was doing the rounds.

As it happened, there were a couple of IT folk in the audience and another couple listening to the tweet stream. Not surprisingly, they were somewhat chagrined at this disparaging label for the work they do. Mark Smithers talks about his dismay at seeing the tweets from ASCILITE mentioning this. Nick Sharrat shares his thoughts about

the frustration I often feel when my profession is disparaged for actually just ‘doing it’s job’, especially by people who often display an incredible naivity about the real world of IT.

Not surprisingly, people don’t like being disparaged.

Is there something there?

Both Mark and Nick give lots of examples of the difficulties that IT faces in doing its job: the many constraints within which it has to operate. They give examples of where the request or idea from the user is significantly flawed from a different perspective.

However, isn’t it a worry when a significant percentage of the audience at a conference like ASCILITE’09, when presented with “innovation prevention department”, immediately thought of the IT department? Rather than simply explaining why IT folk are rational professionals working in a complex environment, shouldn’t there be an interest in understanding why the ASCILITE crowd are thinking this way?

Given that ASCILITE is about computers in learning in tertiary education, the folk at ASCILITE are keen to use computers effectively for that task. Given that IT professionals are a key component/enabler of this work, shouldn’t the two groups be getting on?

I can’t see how the two groups can work together effectively if there exists this gulf in perceptions. If the source of the gulf is understood, perhaps that will enable the gulf to be bridged.

Does anyone know of any work that has sought to document the causes for this gulf between IT and L&T folk? What about identifying strategies for moving forward?

My suggested reasons

Both Nick and Mark give a variety of reasons/problems for why/how IT functions. The following is a start on some reasons that I propose for why L&T folk have formed the “innovation prevention” impression of IT. I’m doing this because I believe it is the first step in moving forward. Let’s have both sides get their cards on the table and then figure out ways forward.

I must emphasise that the following list is based on my experiences, reading and perceptions. It is not meant to be definitive and may not describe what really happened; it does, however, capture my perceptions of that reality. Please understand that the perceptions people have of what goes on are what drive how they act. It doesn’t matter whether or not you believe that the reality is otherwise. People will react based on their perceptions, not yours.

I should note that I have an IT background. I’ve taught IT professionals. Some of whom have worked within IT departments in higher ed. I’ve also run a large scale IT system that was not part of central IT (though it is now).

The assumption of objectiveness, rationality and professionalism.

Nick argues that

So, next time your IT department seems to be out to get you, give them a little more credit – you need to trust that they are proffesionals making very difficult compromises.

Being a professional brings with it the aura of objectiveness and rationality. The trouble is that people are not information-processing intelligences that make rational decisions. Our intelligence is based on pattern matching, and our pattern-matching processes are rife with biases and shortcomings. For example, the following was the finding reported by the technical team (consisting mostly of IT professionals) on the comparison between the different LMSs being considered at my institution in about 2003/2004

strongly feels that the Blackboard product has the best overall technical fit and provides the best opportunity available to meet our tactical needs while minimizing support problems and costs

This is in spite of the observation that Blackboard had never been run on the existing infrastructure, that one of the other LMSs was a locally grown system that had been running on the existing infrastructure for a number of years, and that a year or so after the implementation of Blackboard the institution had to invest in an entirely new server infrastructure due to problems running Blackboard on the old infrastructure.

And that’s before we get into politics. I’m sure any number of people within organisations can point to situations where the politics of the situation has driven the decision. IT departments are not absent of politics.

Note: this does not mean that L&T folk are better, more rational, than IT folk. It’s just that both sets of people are prone to irrationality and biases.

Who specifies the needs around innovation?

Nick specifies the role of IT professionals as

to provide systems that meet the business needs. That’s ‘needs’ and not ‘wants’

The trouble is that when it comes to innovation you can’t specify, you can’t plan. I use a quote from Joseph Gavin Jr in my email signature

If a major project is truly innovative, you cannot possibly know its exact cost and its exact schedule at the beginning. And if in fact you do know the exact cost and the exact schedule, chances are that the technology is obsolete.

It’s the emphasis on specification and planning built into most IT projects that is directly anathema to innovation.

Limited understanding of the nature of teaching and learning.

This continues on from the previous point. IT is focused on specification, global solutions, the same solutions for all. Learning and teaching is all about diversity, variability and change – features that do not match well with traditional IT processes. I’ve argued this in a recent presentation video (and slides).

IT assumes that the same processes used for student records systems will work for learning and teaching.

The user deficit model.

Nick’s comment about people with “an incredible naivete about the real world of IT” demonstrates a user deficit model held by some IT folk (but by no means all, and I don’t know Nick, so this is not meant to be a characterisation of him). i.e. the users are stupid, we need to make the decisions for them, we know what’s best for them.

This type of model is embodied in the acronym PEBKAC and it encourages a blame the user approach to thinking. Read the criticisms of PEBKAC to see how often user error/stupidity is due to the lack of quality in the IT systems.

There’s a trite little saying

There are only two industries that refer to their customers as users. The computer industry and the trade in illicit drugs.

It may be trite but it shows a mindset that can and does exist in some IT folk.

That said, there’s also a similar mindset towards/deficit model of teaching academics held by L&T support staff.

The wrong rules

While I was writing this, Matthew Allen tweeted a point made by Skewes at the Broadband Futures event in Australia

the problem is not technology; it’s the rules which prevent innovation

This ties in somewhat with the first point, but it’s also more than that.

The vast majority of the practices of IT folk arose from a period when IT resources and the ability to use them were scarce and expensive. Increasingly with the advent of social media, the cloud, SaaS etc, I think we’re seeing the rise of a period of abundance in terms of the ability and availability of certain types of IT resource (some others may remain scarce).

The rules for handling scarcity are the wrong rules for handling abundance.

Is there value in strategic plans for educational technology

Dave Cormier has recently published a blog post titled Dave’s wildly unscientific survey of technology use in Higher Education. There’s a bunch of interesting stuff there. I especially like Dave’s note on e-portfolios

eportfolios are a vast hidden overhead. They really only make sense if they are portable and accessible to the user. Transferring vast quantities of student held data out of the university every spring seems complicated. Better, maybe, to instruct students to use external services.

Mainly because it aligns with some of my views.

But that’s not the point of this post. This morning Dave tweeted for folk to respond to a comment on the post by Diego Leal on strategic plans for educational technology in universities.

Strategic plans in educational technology are a bugbear of mine. I’ve been writing and thinking about them a lot recently. So I’ve bitten.

Summary

My starting position is that I’m strongly against strategic plans for educational technology in organisations. However, I’m enough of a pragmatist to recognise that – for various reasons (mostly political) – organisations have to have them. If they must have them, they must be very light on specifics and focus on enabling learning and improvement.

My main reason for this is a belief that strategic plans generally embody an assumption about organisations and planning that simply doesn’t exist within universities, especially in the context of educational technology. This mismatch results in strategic plans generally creating or enabling problems.

Important: I don’t believe that the problems with strategic plans (for edtech in higher education) arise because they are implemented badly. I believe problems with strategic plans arise because they are completely inappropriate for edtech in higher education. Strategic plans might work for other purposes, but not this one.

This mismatch leads to the following (amongst others) common problems:

  • Model 1 behaviour (Argyris et al, 1985);
  • Fads, fashions and band wagons (Birnbaum, 2000; Swanson and Ramiller, 2004)
  • Purpose proxies (Introna, 1996);
    i.e. rather than measure good learning and teaching, an institution measures how many people are using the LMS or have a graduate certificate in learning and teaching.
  • Suboptimal stable equilibria (March, 1991)
  • Technology gravity (McDonald & Gibbons, 2009)

Rationale

Introna (1996) identified three necessary conditions for the type of process embedded in a strategic plan to be possible. They are:

  • The behaviour of the system is relatively stable and predictable.
  • The planners are able to manipulate system behaviour.
  • The planners are able to accurately determine goals or criteria for success.

In a recent talk I argued that none of those conditions exist within the practice of learning and teaching in higher education. It’s a point I also argue in a section of my thesis.

The alternative?

The talk includes some discussion of some principles of a different approach to the same problem. That alternative is based on the idea of ateleological design suggested by Introna (1996). An idea that is very similar to broader debates in various other areas of research. This section of my thesis describes the two ends of the process spectrum.

It is my position that educational technology in higher education – due to its diversity and rapid pace of change – has to be much further towards the ateleological, emergent, naturalistic or exploitation end of the spectrum.

Statement of biases

I’ve only ever worked at the one institution (for coming up to 20 years) and have been significantly influenced by that experience. Experience which has included spending 6 months developing a strategic plan for Information Technology in Learning and Teaching that was approved by the Academic Board of the institution, used by the IT Division to justify a range of budget claims, thrown out/forgotten, and now, about 5 years later, many of the recommendations are being actioned. The experience also includes spending 7 or so years developing an e-learning system from the bottom up, in spite of the organisational hierarchy.

So I am perhaps not the most objective voice.

References

Argyris, C., R. Putnam, et al. (1985). Action science: Concepts, methods and skills for research and intervention. San Francisco, Jossey-Bass.

Birnbaum, R. (2000). Management Fads in Higher Education: Where They Come From, What They Do, Why They Fail. San Francisco, Jossey-Bass.

Introna, L. (1996). “Notes on ateleological information systems development.” Information Technology & People 9(4): 20-39.

March, J. (1991). “Exploration and exploitation in organizational learning.” Organization Science 2(1): 71-87.

McDonald, J. and A. Gibbons (2009). “Technology I, II, and III: criteria for understanding and improving the practice of instructional technology ” Educational Technology Research and Development 57(3): 377-392.

Swanson, E. B. and N. C. Ramiller (2004). “Innovating mindfully with information technology.” MIS Quarterly 28(4): 553-583.

What are the conditions that are conducive to the creation of a variety of new ideas?

I’m currently working on the Process section of my thesis. As part of that I’m referring back to a book chapter (that was a conference paper) by Bo Carlsson (2004) titled “Public policy as a form of design”. This post is an attempt to summarise some of the points made in that chapter as they are connected to my new job.

What are the conditions that are conducive to the creation of a variety of new ideas?

Let’s start with this quote

Sometimes the first and most important policy objective is to remove obstacles to creativity and to foster entrepreneurship rather than to take new initiatives. The formation of new clusters can be facilitated but not directed. Planning cannot replace the imaginative spark that creates innovation.

He does make the point that once innovation clusters evolve, then it may be necessary for appropriate policy to develop.

Three types of facilitating policy stand out

  1. Increase absorptive capacity or receiver competence.
    i.e. the ability of folk to identify innovations and convert them into “business opportunities”. Such capacity is built through research and development, hiring of personnel, training of personnel and accumulation of experience.
  2. Increase connectivity.
    Increasing the quantity and quality of linkages within and outside of the system.
  3. Promote entrepreneurship and encourage variety.

    Given the risk and uncertainty associated with each link in the chain, the greater the number of players, each with uncertain and divergent beliefs about the chances of success, the greater are the chances of successful outcomes. This is a game of effectiveness, not efficiency. Let the market (or the public), not bureaucrats, select the successful projects

Misc quotes

The higher the opportunity cost of entrepreneurship, the lower the quality of entrepreneur because the process becomes driven by adverse selection. In the extreme, the only agents willing to undertake entrepreneurship are those who cannot do anything else.

Application to innovation within universities

Given the growing influence of managerialization within society and the increasing moves to standardisation and accountability within higher education, it is not difficult to identify some tensions. Indeed, the tension between accountability and innovation, and its negative ramifications within universities, is the topic of Findlow (2008).

Limitations in academic staff development, the lack of perceived importance of learning and teaching, a focus on specific technologies and a range of other factors mean that the absorptive capacity of universities around e-learning is not great.

The connectivity is also somewhat limited due to the hierarchical structures that arise out of the same influences. The separation of academics into disciplinary sub-groups and the organisational distance between the academics and the IT and L&T folk all arise from these structures and their increasing reliance on teleological design.

The lack of connectivity and the increase in top-down approaches to decision-making are significantly reducing the variety of approaches. Obviously the pressure for standardisation and the fear of risk-taking impact upon this.

Now, there are activities that can work around this; however, the emphasis on top-down decision making will make this difficult. Unless of course someone senior can see the light.

References

Carlsson, B. (2004). Public policy as a form of design. Managing as Designing. R. Boland and F. Collopy. Standford, CA, Standford Business Books: 259-264.

Findlow, S. (2008). “Accountability and innovation in higher education: a disabling tension?” Studies in Higher Education 33(3): 313-329.

Learning spaces: expenditure and time spent learning

I’ve just been listening to this podcast of a keynote by Dr Phil Long. Apart from the content of the talk, this is interesting because Dr Long has just recently started work at the University of Queensland (which is just down the road from here) at the Centre for Educational Innovation and Technology.

There is much good content. I particularly recommend listening through the question and answer section at the end for a story about a “student/teacher” relationship that is very inspiring.

The bit that sparks this post is also in the question and answers at the end and is interesting to me because of a nascent idea I have for some experimentation. When talking about the future of university campuses, Dr Long suggests that classrooms will become marginal and goes on to say

Most institutions, when it comes to infrastructure, spend 8 of 10 dollars on physical classroom infrastructure, and if you do any study on where students learn you will find that less than 7% of the time when they are working on class related work happens in that box. How do you spend 80% of your dollar on where students are spending 7% of their time?

There’s a growing interest at my current institution in the question of learning spaces, though much of the interest I’ve seen so far seems to be stuck about 5 years ago. The nature of my current institution is such that for a large proportion of our students, the 7% figure would actually be 0%. In some cases, a large number of our students never set foot on a campus.

And yet, much of our expenditure, our concerns, our learning and teaching practice and our management and workload calculations are built around the assumption of the classroom. Even though it is increasingly problematic.

It’s as if, in the words of Postman, the lecture and its associated goods and chattels continue to be mythic.

Where's the inspiration? Where's the desire to improve?

The title and spark for this post comes from this post entitled “A night of ‘Biggest Loser’ Inspiration”. I came across it via a tweet from Gardner Campbell and in particular the quote from the post he tweeted (included here sans the 140 character tweet limit)

People follow inspiration and that’s where students will go — where they are inspired to learn, collaborate, build and innovate.

I’m guessing, though am currently not 100% certain, that my current institution will want me to contribute to creating this sort of inspiration in my new role. I’m excited by that, but I’m also concerned that it will be really difficult. When I’m looking at the difficulties I will face, the biggest is perhaps embodied in the second question from this quote from the post (caps in original, I’ve added the emphasis)

I want to scream, “WHERE’S THE INSPIRATION? WHERE’S THE DESIRE TO IMPROVE?”

Where’s the desire to improve?

When it comes to improving learning and teaching I am a firm believer in the absolute centrality of teachers’ conceptions of learning and teaching. Yes, I agree that student learning is the focus; you want to inspire students to learn, collaborate, build and innovate. However, I work within a university setting and am tasked with helping improve the learning students receive from the university. In that setting the conceptions of learning held by the teaching staff directly impact upon the quality of the student learning.

Consequently, I currently believe that an important, if not the most important, aim for my position should be to encourage academics to reflect upon their conceptions of learning and teaching. The theory (see the following figure from Trigwell, 2001) is that change in those conceptions is the only way to achieve sustainable improvements in the quality of learning experienced by students.

Trigwell's model of teaching

The problem is that this will only happen if there is a desire on the part of the academics to reflect. If there’s no desire, it won’t work. My current institution has been going through some tough times, which may make it hard to find that inspiration.

Further connections with the biggest loser

The post that started this was sparked by watching the Biggest Loser – one of the recent franchises of reality shows to go global. Since listening to some of David Maister’s podcasts from his recent book (Strategy and the Fat Smoker – there’s a good review/overview here), I’ve been pondering the connection between weight loss and encouraging innovation and improvement in learning and teaching (I can see at least a presentation and maybe some research arising from this work, but it’s been put aside until I finish other tasks).

I particularly like this quote

If you truly want to succeed (and many people do not want it badly enough to make it happen) then you must never settle, never give up, never coast, never just accept what is, even if you are currently performing at a high level.

which I took from this review of the book. The review was done by a lawyer who focused on one chapter of Maister’s book – Chapter 17: The Trouble with Lawyers. The review includes this

He highlights four problems that prevent “lawyers from effectively functioning in groups:”

  • problems with trust;
  • difficulties with ideology, values, and principles;
  • professional detachment; and
  • unusual approaches to decision making (referring to lawyers’ propensity to attack any idea presented to locate and highlight its weaknesses, with the result that “within a short time, most ideas, no matter who initiates them, will be destroyed, dismissed, or postponed for future examination.”)

A list which I find fairly appropriate for university academics.

One of the observations that arise from the book is the examination of consultants and the businesses that employ them and a comparison with health professionals and fat smokers. Consultants are brought in to tell the business how to improve itself, just as health professionals are brought in to help fat smokers. The trouble is, that like fat smokers, most business people already know what they are doing wrong. Fat smokers know they need to stop smoking, eat well and start exercising. What do health professionals tell fat smokers? Stop smoking, eat well and start exercising. Duh!

Of course, consultants know that most business people know what the consultants know. Increasingly most of the business people have been through the same education processes and read the same literature as the consultants. Though the business people often have the huge benefit of long-term and in-depth practical experience within the specific context of the business. A consultant knows this and has to justify his/her fee. So consultants come in with a barrage of jargon and technologies (in the broadest possible senses) that the business person doesn’t have. However, in the end it all boils down to the same knowledge.

I can see a lot of similarities here between instructional designers (and other folk employed to help academics) and academics. The instructional designers are the consultants and the academics are the business people. I see instructional designers developing a barrage of jargon and technologies which essentially boil down to telling the fat smoker to stop smoking, eat well and exercise. Essentially telling the academic what they already know but making it so difficult to understand that the academic spends more time understanding than implementing.

Of course, this is a generalisation and metaphor with all the attendant limitations. But, I do believe there is a glimmer, possibly more, of truth. It also makes some assumptions and raises some questions:

  • That academics do know the equivalent of “stop smoking, eat well and exercise” for learning and teaching.
    Having worked in a number of positions that help academics in their teaching, I’ve had an opportunity to see a large number of very different academics. Sadly and somewhat surprisingly, a fairly significant number appear to be somewhat clueless. However, I do wonder how much of this is lack of knowledge and how much is simply poor execution.
  • If they know, why don’t they follow through?
    What are the factors or reasons why this knowledge isn’t put into action? Can anything be done to address them?
  • Is there really an equivalent of “stop smoking, eat well and exercise” for learning and teaching?

It has to be intrinsic

Have to add this in before I close. This review of Maister’s book mentions the following as one of the many answers provided by Maister

Motivation must be intrinsic, not extrinsic. The biggest barrier to change is the feeling that “it’s OK so far.”

When I ask “where’s the desire”, I think this is perhaps the best answer. When the desire to improve and innovate is intrinsic to the academic, then the question becomes how does the university get out of their way and help them achieve?

But, how do you enable/encourage/create that intrinsic motivation? Can you? That’s the question I’d like to investigate.

Some initial thoughts on e-learning and innovation

Theoretically, I’m in the process of starting a new job that is focused on encouraging e-learning and innovation within a university context. The following post is an early attempt to try and make sense of this job, what it might do and how it might do it. It’s probably of little value to others, but I’m trying to be open about this.

This is still early days and the understanding will continue to grow and change. Because human beings are pattern-matching intelligences, this exploration, arising as it does out of my own attempts to make sense of this job, will necessarily reflect my past experiences and patterns of action. Feel free to disagree and suggest alternative perspectives.

The model

The following started out as a 2×2 framework but has evolved as I’ve been writing this post. The model attempts to represent the process and two of the decisions that have to be made when attempting a change or innovation within an organisation. In summary, the idea is that:

  • Some spark creates the need for the change or innovation.
  • This makes it necessary to decide what to do in order to respond to the spark.
  • Once the what is decided, it is necessary to decide how to do it.
  • How things are done can also contribute to the next or different sparks.

I’ve purposely not included numbers in the above list. Cycles can start in any of these stages and there isn’t always a cycle. In fact, some might argue a significant flaw in many organisations is a failure to draw knowledge from how things were done in order to inform the next spark. Alternatively, it may not always be possible to connect the causal cycle until after the fact.

A Change Cycle

The spark

There is generally a spark – some event, thing, or knowledge – that makes it necessary to make some change or undertake some action. It might be to solve a problem or achieve a goal. The spark may not be identified prior to the change, only after. The model suggests that such a spark can arise from a spectrum between two extremes:

  1. Idealistic; and
    In this context, something at the idealistic end would generally be something created by an expert, management or government. For example, last night the Australian Federal Government released its budget for the next year, and it includes a number of projects and changes that will require strategic responses from Australian universities. Alternatively, it might be something internal to an organisation, such as the appointment of a new Vice-Chancellor.
  2. Naturalistic.
    In this context, this is understood to be something that arises from the “coal-face” of the organisation. An extreme example might be an individual academic faced with a group of students not understanding a particular concept.

This is meant to be a spectrum, an example from the middle might be an institution (not a single academic) identifying long turnaround times on assignment feedback.

What to do

Given a spark, it is necessary to identify what to do. What can be done to respond to the spark? Of all the different projects that might work, how does an institution or individual identify what to do?

The model represents two extreme ends of a spectrum:

  1. Fad; and
    This is where a project is chosen simply because someone else has done it. e.g. “boys with toys” represents a lone-ranger academic adopting the use of Twitter because he saw the Oprah show on Twitter.
  2. Knowledge.
    The chosen project is identified based on some theoretical knowledge – be it organisational, learning, technical etc – and its application to the local context. For example, the adoption of constructive alignment based on the ideas of Biggs.

How to do it

Having decided what to do, it is now necessary to plan how to do it. This spectrum draws on a distinction made by Kurtz and Snowden (2007) and is one I’ve used before. The following table compares the two approaches.

Idealistic | Naturalistic
Achieve ideal state | Understand a sufficiency of the present in order to stimulate evolution
Privilege expert knowledge, analysis and interpretation | Favour enabling emergent meaning at the ground level
Separate diagnosis from intervention | Diagnosis and intervention intertwined with practice

Snowden’s point – and I agree – is that idealistic approaches only work in contexts in which there is a clear connection between cause and effect. i.e. you can predict that if you do X, then Y will happen. Snowden points out in his Cynefin framework and associated writing that there are other contexts, that require different approaches.

Cynefin domains

Putting the model in context

The rest of the post attempts to use the above model to begin understanding the context within which the new job takes place. The aim is to help me formulate plans for the position that I need to raise with the hierarchy to get approval. It covers

  • the spark; and
    An attempt to identify factors, both naturalistic and idealistic, that are creating a need for change or innovation within the institution.
  • a list of projects.
    Based on a combination of “what to do”, “how to do it” and knowledge/prejudices around the local context, identify an initial list of potential projects.

Some of the thinking that follows does (or will/should) include a range of existing projects and processes within the organisation. While the new position may not be directly connected with these projects and processes, it is necessary for the position to be aware of and work with them.

This is only an initial list – it will grow and change as time goes by.

The spark

The following aims to provide an initial list of potential sparks that might be important for the new position to either do something about or at least work with or inform. I’ve attempted to group them in some initial rough categories as a way to help brainstorm:

  • From the position description.
  • Organisational strategic plans.
  • Organisational factors.
  • Government policies and other external factors.

From the position description

The new position comes with a list of accountabilities by which the incumbent will be judged. Not surprisingly, this focuses the attention. The following are drawn from those accountabilities.

Organisational strategic plans

Like most institutions mine has developed a strategic plan, but it also has a range of other organisational goals, understandings and cultures that also have to be considered. I need to better understand these.

First, focus on the strategic plan, which is divided up into 8 main sections. Many of the components of these aims are the responsibility of existing organisational units. I’ll focus on the ones that appear to connect with the new job, but leave the others in, struck through. Each component is further divided into: what we need to do; how we will do it; and how we will know that we are doing it well.

  1. Learning and Teaching
    • What we need to do
      • Provide a multimodal educational platform supported by appropriate technology.
      • Ensure that programs meet future industry and community needs.
      • Provide multiple pathways and a seamless fit for articulating students.
      • Improve student retention and progression rates.
      • Support collaboration within and across campus and administrative structures to ensure successful student learning.
      • Develop and reward staff capability in innovative curriculum design, teaching and assessment, and the scholarship of learning and teaching
    • How we will do it
      • Progress the implementation of the Student Learning Journey.
      • Benchmark programs against relevant industry and labour market needs.
      • Review graduate attributes and improve integration into programs.
      • Provide formal and informal mentoring for new academic and casual teaching staff.
      • Identify, develop and support learning and teaching leaders.
      • Support staff to engage in the scholarship of learning and teaching and develop innovative practices.
    • Doing it well?
      • Improved Course Experience Questionnaire (CEQ) and Graduate Destination Survey (GDS) outcomes against benchmarked universities.
      • Improved Learning and Teaching Performance Fund outcomes.
      • Increase in the quality of Australian Learning and Teaching Council (ALTC) awards and grants applications and maintenance of success in an increasingly competitive arena.
      • Improved student engagement as measured by the Australasian Survey of Student Engagement.
      • Improved Student Evaluations of Teaching and an increase in the number of students participating.
  2. Research and innovation
    • What we need to do
      • Support research excellence in the University’s priorities for research that contribute to the Resource Industries; Community Health and Social Viability; and Intercultural Education and that this research meets the needs of the communities we serve.
      • Develop and support a vibrant research culture and intellectual environment.
      • Enhance the quality and dissemination of research outcomes.
      • Support quality research programs to enable staff and students to achieve success and realise their full potential.
      • Provide quality, relevant services and support to research stakeholders.
      • Increase the University’s research performance.
    • How we will do it
      • Increase external research income through effective policies, training and processes and focus investment for growth in the Research Institutes.
      • Provide training, staff development, networking and mentoring for staff involved in research and reward excellence and encourage exploration and innovation.
      • Research and university leaders will work strategically with industry, community, government and other stakeholders to align research priorities with industry needs.
      • Foster an environment of active enquiry, innovative development and effective dissemination
    • Doing it well?
      • External research income to increase by 50% in the next 2 years and to be benchmarked against other institutions.
        There is an interesting split between “innovation”/L&T funding and research funding.
      • Receipt of external research investments other than research project income.
      • Improvement in the quantum of quality publication outputs registered each year by category and compared with other institutions.
      • Improvement in the University ranking for external research performance funds relative to the sector.
      • Increase in the number of research active staff by 5% per annum.
      • Increase in the number of Research Higher Degree enrolments and increase in the number of Research Higher Degree students completing on time or earlier
  3. Community engagement
  4. Domestic engagement
    • What we need to do
      • Address the shortfall in domestic student enrolments as a matter of urgency through a range of strategies to build demand, attract students to CQUniversity and improve retention.
      • Develop appropriate contemporary programs and courses to meet the needs of domestic students, increasing participation, access, retention and success of students.
      • Develop new ways to attract students to CQUniversity including building on marketing initiatives, the re-branding exercise and redressing reputational issues.
      • Develop new ways to engage with industry, business and the community via new learning initiatives.
      • Develop new educational models for the future that are aligned with our broad mission “to be what you want to be”.
      • Explore ways to increase distance education offerings and enhance our reputation as a renowned distance education provider
    • How we will do it
      • Continue the development of new suites of contemporary programs in areas of demonstrated demand.
      • Implement the new brand.
      • Improve customer service led by Navigate CQUniversity.
      • Implement Alternative Pathways in 2008.
    • Doing it well?
      • By achieving our student enrolment targets (not necessarily DEEWR targets).
      • Increase in domestic student retention rates by 1% per annum.
      • 5% increase per annum in number of students entering bridging programs and progressing to award studies.
      • Increase in access and participation rates for equity students.
      • Increase the access and participation of Aboriginal and Torres Strait Islander students
  5. International engagement
    • What we need to do
    • How we will do it
      • Build staff capability in learning and teaching related to international students, especially curriculum design and culturally inclusive teaching practices which meets the needs and expectations of international students.
      • Establish priorities and encourage engagement in research through IERI (Intercultural Education Research Institute) that informs international education in areas of policy, systems, planning, pedagogy and others.
      • Develop and implement the new CQUniversity/CMS interface and maximise the benefits resulting from 100% ownership of CMS by expanding the range of academic programs at the Australian International Campuses.
      • Explore low risk delivery mechanisms and pathway linkages.
      • Increase student and staff mobility through improved Study Abroad and Exchange programs.
    • Doing it well?
  6. People and performance
    • What we need to do
      • Fully integrate the human resource strategy with the organisational strategy, via the implementation of the Management Plan – Human Resources.
      • Invest in the development of staff to ensure that they have the requisite skills and abilities to support the attainment of the University’s strategic objectives.
      • Develop whole of University strategies in support of improved staff morale.
      • Facilitate opportunities for collaborative projects across organisational boundaries. – this is interesting
      • Provide a safe workplace for staff and students and meet all Workplace Health & Safety legislative requirements.
    • How we will do it
      • Complete the organisational restructure process by end 2008.
      • Implement revised PRPD processes.
      • Develop workforce planning and succession planning tools.
      • Develop recruitment strategies to attract and recruit high performing staff.
      • Provide management and leadership training for all managers and supervisors.
      • Negotiate a new Union Collective Agreement prior to the nominal expiry date of the current agreement.
      • Encourage active staff involvement in professional bodies.
      • Conduct focus groups with staff on ways to improve staff morale.
      • Facilitate greater opportunities for meaningful communication between staff and University managers at all levels across the University.
      • Develop Service Level Agreements for the delivery of human resources services across the University.
      • Reduce the number of staff and student injuries on University property through a range of strategies.
    • Doing it well?
  7. Resources, systems and infrastructure
    • What we need to do
      • Increase revenue and decrease costs.
      • Ensure an appropriate linkage between the planning and budget functions of the organisation.
      • Ensure management has access to the appropriate and timely information and reporting tools.
      • Ensure the University has a Strategic Asset Management Plan to support our strategic initiatives.
      • Ensure the University has an ICT Management Plan which supports our strategic initiatives.
      • Ensure campus development plans are in place to support the future operational and strategic needs of the university.
      • Ensure the University has a Financial Management Plan which supports the strategic direction of the University.
      • Work towards sustainable resource management and leadership in environmental outcomes from our operations
    • How we will do it
    • Doing it well?
  8. Governance and quality

Need to find out which parts of the organisation are responsible for the above and what they are doing.

Organisational factors

Perhaps the most open to debate, given the lack of agreement amongst stakeholders and some of the points about Model 1 behaviour, but just as important. Some of the following connect with the strategic plans.

  • Evaluation of learning and teaching – beyond just course based and other limitations.
  • Flexibility and quality of learning platforms and technologies.
  • Actual quality of learning and teaching, administration, e-learning.
  • An emphasis on fad and idealistic dimensions.

Government strategies

  • Teaching funding linked to performance outcomes on quality, participation and completion rates.
  • Student centred funding.

List of projects

An early version of the model in this post was a traditional 2×2 model (with slightly different labels). While I’ve moved on from there and now think the two dimensions are spectrums, the 2×2 model offers some help in understanding what could be done. The following table summarises.

Sector Description What can be done
Idealistic-fads The predominant mode within organisations. This position will probably have little influence on these projects as they are driven by senior management. The best that can be hoped for is to provide evidence and insight to those making decisions. Focused on the nature of the organisation and the experience of students and staff. Helping increase the quality and quantity of the feedback to these folk. Make them aware of the limitations of the chosen approaches. Make sure that the knowledge generated from these projects is available and used to inform subsequent projects. Be aware of the fads/trends that are rising and become familiar with them. Perhaps attempt.
Idealistic-knowledge Generally limited use at the organisational level, some use in isolated areas. The insights from the projects are likely to be useful. Ensure that the knowledge is disseminated and informs subsequent projects. Be aware of the types of knowledge that can help inform projects and their implementation.

This is probably where traditional university “innovation” grants sit. I’ll probably have to engage with these, but the cartoon below strikes me as saying a few things about these grants, and there’s also the work of Findlow.

Naturalistic-fads A common approach – often seen in lone rangers. No point, ability or benefit in stopping these. Better to help inform their implementation and learn their lessons. How to do this effectively is another question. There are some connections here, or perhaps in the next sector, with incremental, cumulative improvement arising out of the work of the Teaching, Learning, Technology group.
Naturalistic-knowledge Rarely used, and the sector I feel is most appropriate for innovation around learning and teaching. I have talked previously about the idea of reflective alignment, something I’d like to try. Perhaps there are others.

Innovation in Corporate America

Quotes about innovation and creativity

Theoretically, I’m in the process of starting a new job that is focused on encouraging e-learning and innovation within a university context. I’m still reading some of the different literature but the following quotes resonate with me around this position and how it is likely to evolve.

The purpose and place of “idea” departments

McLuhan and how innovation roles/departments are isolation wards

In big industry new ideas are invited to rear their heads so they can be clobbered at once. The idea department of a big firm is a sort of lab for isolating dangerous viruses.

This is a real danger for the new position, as it sits outside the organisational structures in which the vast majority of learning and teaching occurs, especially when that separation presents barriers to the following.

Innovation is fostered by information gathered from new connections; from insights gained by journeys into other disciplines or places; from active, collegial networks and fluid, open boundaries. Innovation arises from ongoing circles of exchange, where information is not just accumulated or stored, but created. Knowledge is generated anew from connections that weren’t there before. — Margaret Wheatley

New: The following quote mirrors, to some extent, the McLuhan quote. It’s taken from a Clay Shirky post on the future of newspapers.

Leadership becomes faith-based, while employees who have the temerity to suggest that what seems to be happening is in fact happening are herded into Innovation Departments, where they can be ignored en masse. This shunting aside of the realists in favor of the fabulists has different effects on different industries at different times.

The importance of failure

Woody Allen on failure

If you’re not failing every now and again, it’s a sign you’re not doing anything very innovative.

Edwin Land

The essential part of creativity is not being afraid to fail.

Thomas Watson Sr

Success is on the far side of failure.

Innovation ain’t logical

Einstein on the connection between innovation and logic

Innovation is not the product of logical thought, although the result is tied to logical structure.

Discoveries are often made by not following instructions, by going off the main road, by trying the untried. — Frank Tyger

That so few now dare to be eccentric marks the chief danger of our time — John Stuart Mill

Solutions look for problems

And one I find particularly appropriate for e-learning.

We are surrounded by engineers’ follies: too many technical solutions still looking for problems to solve.

The uncreative mind can spot wrong answers, but it takes a very creative mind to spot wrong questions. — Anthony Jay

Pattern entrainment

As pattern-matching intelligences, human beings base their decision making on a first-fit pattern match with past experience. This pattern entrainment is one of the reasons so many innovations start out as “horseless carriages” – new technology used to replicate old practice.

The importance of creativity and what it is

Creativity, as has been said, consists largely of rearranging what we know in order to find out what we do not know. Hence, to think creatively, we must be able to look afresh at what we normally take for granted. — George Kneller

Along similar lines

It’s easy to come up with new ideas; the hard part is letting go of what worked for you two years ago, but will soon be out of date. — Roger von Oech

Of course, management is always important.

It isn’t the incompetent who destroy an organisation. The incompetent never get in a position to destroy it. It is those who achieved something and want to rest upon their achievements who are forever clogging things up. — F. M. Young

Once we rid ourselves of traditional thinking we can get on with creating the future. — James Bertrand

Open and closed

A quote from John Cleese on open and close modes, I see some connections with the Model 1 and Model 2 behaviours observed by Argyris.

We all operate in two contrasting modes, which might be called open and closed. The open mode is more relaxed, more receptive, more exploratory, more democratic, more playful and more humorous. The closed mode is the tighter, more rigid, more hierarchical, more tunnel-visioned. Most people, unfortunately, spend most of their time in the closed mode. Not that the closed mode cannot be helpful. If you are leaping a ravine, the moment of takeoff is a bad time for considering alternative strategies. When you charge the enemy machine-gun post, don’t waste energy trying to see the funny side of it. Do it in the “closed” mode. But the moment the action is over, try to return to the “open” mode—to open your mind again to all the feedback from our action that enables us to tell whether the action has been successful, or whether further action is needed to improve on what we have done. In other words, we must return to the open mode, because in that mode we are the most aware, most receptive, most creative, and therefore at our most intelligent.

Argyris identifies attempts to “maximise winning and minimise losing” and “minimise generating or expressing negative feelings” as being key governing variables in Model 1 behaviour – the dominant model used in most organisations.

The things we fear most in organizations—fluctuations, disturbances, imbalances—are the primary sources of creativity. — Margaret Wheatley

The achievement of excellence can only occur if the organization promotes a culture of creative dissatisfaction. — Lawrence Miller

Difficulties of innovation

Also an aspect of pattern entrainment.

And of course, Machiavelli’s quote

It ought to be remembered that there is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things. Because the innovator has for enemies all those who have done well under the old conditions, and lukewarm defenders in those who may do well under the new. This coolness arises partly from fear of the opponents, who have the laws on their side, and partly from the incredulity of men, who do not readily believe in new things until they have had a long experience of them.

Pondering a new position – request for help

After a period of uncertainty, it appears likely that at some stage during May 2009 I may be starting a new position at my current institution. The position goes under the title – “eLearning and Innovation Specialist”. It is an academic position; I will retain my current appointment as a Level C academic but, rather than teaching, will have to achieve the following position purpose:

You are the person who consults effectively and broadly with key stakeholders and provides strategic advice and leadership in learning and teaching innovation, primarily in the area of e-learning. Your primary purpose is to promote strategic e-learning development, in conjunction with all major stakeholders, to ensure CQUniversity achieves its ongoing e-learning goals.

Some of the above and a list of the accountabilities for the position are available on the eLearning and innovation page on this blog. This page will serve as a central point for discussion on this blog about the position.

Given the nature of the position, the organisation and recent history, I intend to use this blog to reflect on the position. I believe strongly in the value of open and transparent discussion as a way to increase distributed cognition and consequently increase the spread and effectiveness of innovation around learning and teaching. So a major aim here is to engage in, or at least spark, discussion about the position, what it does and how successful it is. In fact, I believe that using the blog in this way is a good way to fulfill some of the position accountabilities, including: relationships, communicate and publish, continuous improvement, and SoTL and the teaching/research nexus.

Current task

Next week I will be meeting with the position’s supervisor to engage in a planning process for what the role incumbent should be doing over the next year. My current task is to develop my ideas for what that should be, as a starting point for negotiation with my supervisor. Until those plans are discussed and signed off by the supervisor, they are simply potential ideas, not actual aims or projects.

My aim today is to read literature around innovation, especially in learning and teaching, to inform that planning. The initial questions I’d like to ask of the literature are:

  • What is innovation? Revolution, evolution, disruptive etc.?
  • How to measure successful innovation? What are the barriers and enablers towards innovation?
  • What models exist for encouraging innovation? Which work, which don’t?

Literature I know of

The following is a list of literature I’m aware of and currently intend to look at and/or revisit. Some of the literature is innovation oriented, some of it is specific to learning and teaching.

  • Innovate and integrate (Jasinski, 2007)
    Commissioned research from the Australian Flexible Learning Framework into the processes of embedding innovative practices.
  • Accountability and innovation in higher education: a disabling tension? (Findlow, 2008)
    A paper that empirically explores the tension between accountability and innovation within UK higher education (very closely related to the Australian context). The link above is to a previous post that draws on some of the ideas from the paper.
  • The work of Clayton Christensen – especially disruptive innovation
  • Perspectives on innovation from the complex adaptive systems literature (Carlisle and McMillan, 2006; Webb, Lettice & Lemon, 2006)
    I believe CAS and associated concepts provide a much more useful model for understanding of concepts such as universities and innovation. Consequently, I believe they provide a better foundation for acting.
  • Postman, Papert and others, particularly those examining why innovation in learning and teaching hasn’t been all that successful.
  • The current death of university meme should probably also be looked at.

It would seem obvious that, since I’m using a blog for this discussion, I should also list some associated blogs. I haven’t done that yet, as it’s not yet 100% certain I’ll continue in the position.

What can you do to help?

I’d value any suggestions you have on the following questions, or just about anything else that is vaguely related to this. Please leave your comment here or, if you’d like your comments kept private, email me (Gmail has better spam protection, so I’m using it rather than my institutional email account).

I’m particularly keen on hearing about:

  • Any additional questions I should consider about the nature of innovation and the role.
  • Any suggestions for additional literature that might be useful.
  • Pointers to people or units at other universities that have similar roles.
  • What are the prejudices or blindspots that exist in the above?

References

Carlisle, Y. and E. McMillan (2006). “Innovation in organisations from a complex adaptive systems perspective.” Emergence: Complexity and Organization 8(1): 2-9.

Findlow, S. (2008). “Accountability and innovation in higher education: a disabling tension?” Studies in Higher Education 33(3): 313-329.

Jasinski, M. (2007). Innovate and integrate: embedding innovative practices. Brisbane, Australian Flexible Learning Framework: 243.

Webb, C., F. Lettice, et al. (2006). “Facilitating learning and innovation in organisations using complexity science principles.” Emergence: Complexity and Organization 8(1): 30-41.

"Blame the teacher" and its negative impact on learning and e-learning

The following post is sparked by reading Findlow (2008) as part of my PhD work. I’m about halfway through it and finding it very interesting. In particular, this post is sparked by the following paragraph from the paper

The institutional counterpoint to this was the feeling expressed, implicitly or explicitly, by all the administrative staff I talked to that, in the words of one mid-ranking administrator, ‘No offence, but some academics need a kick up the bum’. Only five survey respondents cited constraints not explicitly linked to aspects of ‘the system’, singling out academics’ ‘attitudes’, which they elaborated as: ‘lack of imagination’ and/or ‘a reluctance to take risks’

Blame the teacher – level 1

In promulgating the idea of reflective alignment I borrowed and reworked Biggs’ ideas of constructive alignment (Biggs and Tang, 2007), taking it from looking at how individual academics could design teaching to looking at how a university could improve the design of teaching performed by its academics.

The sentiments outlined in the above quote from Findlow (2008) are a perfect example of what I framed as level 1 knowledge about improving teaching: the “blame the teacher” level. The level at which management can feel consoled that the on-going problems with the quality of teaching are not the fault of the system they manage. It’s the fault of those horrible academics. It would all be better if the academics simply got a “kick up the bum”.

Escalation into “accountability” – level 2

The immediate problem that arises from level 1 knowledge about improving teaching, is that management very quickly want to provide that “kick up the bum”. This is typically done by introducing “accountability”. As Findlow (2008) writes

Key to the general mismatch seemed to be the ways in which the environment – both institution and scheme – demanded subscription to a view of accountability that impeded real innovation; that is, the sort of accountability that is modelled on classic audit: ‘conducted by remote agencies of control’ (Power 1994, 43), presuming an absence of trust, and valuing standardisation according to a priori standards.

This approach fits nicely into Level 2 knowledge about improving teaching – i.e. it is a focus on what management does. The solution here is that management spend their time setting up strategic directions against which all must be evaluated. They then set up “accountability courts” (i.e. “remote agencies of control”) to evaluate everything that is being done to ensure that it contributes to the achievement of those strategic directions.

This can be seen in such examples as IT governance or panels that evaluate applications for learning and teaching innovation grants. A small select group sits in positions of power as accountability judges to ensure that all is okay.

Once the directions are set and the “accountability courts” are set up, management play their role within those courts, especially in terms of “kicking butt” when appropriate.

Mismatch and inappropriate

There is an argument to be made that such approaches are an anathema to higher education. For example, Findlow (2008) makes this point

New managerialism approaches knowledge as a finished product, packaged, positive, objective, externally verifiable and therefore located outside the knower. By contrast, an ‘academic exceptionalist’ (Kogan and Hanney 2000, 242) view of knowledge places it in the minds of knowledgeable individuals, with the holder of the knowledge also the main agent in its transmission (Brew 2003). This kind of expert or ‘professional knowing’, closely related to conventionally acquired ‘wisdom’ (Clegg 2005, 418), is produced through an organic process between people in a culture of nurturing new ideas. The process is allowed to take as long as it takes, and knowledge is not seen as a finished product.

There are arguments back and forth here. I’m going to ignore them as beyond scope for this post.

I will say that I have no time for many of the academics who, at this stage, will generally trot out the “academic freedom” defence against “accountability courts”. Accountability, of an appropriate sort, is a key component of being an academic (peer review, anyone?). Findlow (2008) has this to say

accountability is intrinsic to academia: the sort of accountability that is about honesty and responsibility, about making decisions on the basis of sound rationales, on the understanding that you may be called to account at any point. Strathern (2000a, 3) suggests that ‘audit is almost impossible to criticise in principle – after all, it advances values that academics generally hold dear, such as responsibility, openness about outcomes’.

Academics should be open and clear about what tasks they perform and why. Hiding behind “academic freedom” is too often an excuse to avoid being “called to account”. (That said, there are always power issues that complicate this.)

My argument against “accountability courts” is not on the grounds of principle, but on pragmatic grounds: it doesn’t work.

It doesn’t work

Remember, we’re talking here about improving the design of courses across an institution. To some extent this involves innovation – the topic of Findlow (2008) – who makes the following point about innovation (emphasis added)

The nature of innovation … is change via problematisation and risk. In order to push the boundaries of what we know, and break down dogma, problems have to be identified and resolved (McLean and Blackwell 1997, 96). Entering uncharted territory implies risk, which requires acceptance by all stakeholders.

This last point is where the problems with “accountability courts” arise. It starts with the SNAFU principle which in turn leads to task corruption.

SNAFU principle

Believed to have arisen in the US army in World War II, the phrase SNAFU is commonly known as an acronym expanded to Situation Normal, All Fouled Up – where “Fouled” is generally replaced with a more colloquial term. Interestingly, and as a pause in this diatribe, here’s a YouTube video of Private Snafu – a cartoon series made by the US armed services during World War II to educate the troops about important issues. You may recognise Mel Blanc’s voice.

However, the SNAFU principle gets closer to the problem. The principle is defined as

“True communication is possible only between equals, because inferiors are more consistently rewarded for telling their superiors pleasant lies than for telling the truth.”

This is illustrated nicely by the fable on this SNAFU principle page.

Can this be applied to innovation in higher education? Surely it wouldn’t happen? Findlow (2008) again

My own experience as a funded innovator, and the prevailing experience of my respondents, was that participation in a funded ‘scheme’ made authentic problematisation, and honest description of risk, difficult. Problematisation was inhibited by the necessary consideration given to funding body and institutional agendas in defining parameters for approval. Audit can be seen as a response to fear of risk, and audit-managerially governed schemes require parameters pre-determined, expected outcomes and costs known in advance. Respondents in this case related the reluctance of the scheme to provide for unanticipated needs as they arose, without which effective innovation was much harder.

Task corruption

Task corruption can be defined as

is where either an institution or individual, consciously or unconsciously, adopts a process or an approach to a primary task that either avoids or destroys the task.

It can arise when the nature of the system encourages people to comply through a number of different mechanisms. Findlow (2008) reports on one as applied to innovation

The discussion groups of new academics unanimously recounted a feeling of implicit pressure not to acknowledge problems. They all said they had quickly learned to avoid mention of ‘problems’, that if necessary the word ‘issues’ was preferable, but that these ‘issues’ should be presented concisely and as if they had already been dealt with. While their formal institutional training programme emphasised the importance of honestly addressing shortcomings, their informal exposure to management culture conveyed a very different message. They had learned, they said, that to get on in academia you had to protect yourself and the institution, separate rhetoric from reality, strategy from truth – that authentic problematisation was non-productive and potentially dangerous.

Findlow goes on to reference Trowler’s (1998) term “coping strategies” and the phrase “work to rule”. Findlow gives examples, such as innovators having to lie about a particular aspect of their innovation in the formal documents required by an “accountability court” in order to fulfill requirements, even though the rationale was accepted by senior administrators.

Academics start to work the system. This creates less than stellar confidence in the nature of the system and subsequently reduces the chances of innovation. Findlow (2008) again

Allen’s (2003) study of institutional change found that innovation was facilitated by the confidence that comes with secure working environments. Where change was judged by staff to be successful, it tended to emerge from university environments where holistic and humanistic views of scholarship and systems of implicit trust were embedded. These gave academics the confidence to take risks. Allen found that insecure environments created what Power (1994, 13) describes as ‘the very distrust that [they were] meant to address’, removed the expectation and obligation for genuinely responsible academic accountability (Giri 2000, 174), and made staff reluctant to devote time, signpost problems or try something that might not work and could reflect negatively on their career portfolios.

A solution?

The last quote from Findlow (2008) seems to provide a possible suggestion in “university environments where holistic and humanistic views of scholarship and systems of implicit trust were embedded”. Perhaps such an environment would embody level 3 knowledge of how to improve the design of courses. Such an environment might allow academics to engage in Reflective Problematisation.

Such an environment might focus on some of the features of this process.

References

Biggs, J. and C. Tang (2007). Teaching for Quality Learning. Maidenhead, England, Open University Press.

Findlow, S. (2008). “Accountability and innovation in higher education: a disabling tension?” Studies in Higher Education 33(3): 313-329.

Cooked course feeds – An approach to bringing the PLEs@CQUni, BAM and Indicators projects together?

The following is floating an idea that might be useful in my local context.

The idea

The idea is to implement a “cooked feed” for a CQUniversity course: an RSS or OPML feed that students, staff, or both can subscribe to in order to receive a range of automated information about their course. Since some of this information would be private to the course or the individuals involved, the feed would be password protected and could differ depending on the identity of the person pulling it.

For example, a student of the course would receive generic information about the course (e.g. any recent posts to the discussion forums, details of resources uploaded to the course site) as well as information specific to them (e.g. that their assignment has been marked, or that someone has responded to one of their discussion posts). A staff member could receive similar generic and specific information. Since CQU courses are often offered across multiple campuses, staff and student information could be specific to the campus or to sets of students (e.g. a tutor would receive regular updates on their students – have they logged into the course site, etc.).
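
To make the idea a little more concrete, here’s a rough sketch (in Python, with entirely hypothetical data structures, field names and URLs – nothing here reflects actual CQUni systems) of how a per-person “cooked” feed might be assembled: the same pool of course events gets filtered by the identity and role of whoever is pulling the feed before being serialised as RSS.

```python
# A minimal sketch of an identity-aware "cooked" course feed.
# The data structures, field names and URLs are hypothetical illustrations only.
import xml.etree.ElementTree as ET

def cooked_feed(user, course_code, events):
    """Return RSS 2.0 XML containing only the events this user should see."""
    def visible(event):
        if event["audience"] == "all":                # generic course news
            return True
        if event["audience"] == user["role"]:         # staff- or student-only items
            return True
        return event.get("student_id") == user["id"]  # e.g. "your assignment was marked"

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"{course_code} cooked feed for {user['name']}"
    ET.SubElement(channel, "link").text = f"https://example.edu/feeds/{course_code}"
    ET.SubElement(channel, "description").text = "Generic and personal course information"

    for event in filter(visible, events):
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = event["title"]
        ET.SubElement(item, "description").text = event["body"]
        ET.SubElement(item, "pubDate").text = event["when"]

    return ET.tostring(rss, encoding="unicode")

# Example: a tutor sees course-wide items plus staff-only items.
tutor = {"id": "t001", "name": "A. Tutor", "role": "staff"}
events = [
    {"audience": "all", "title": "New resource uploaded",
     "body": "Week 3 lecture slides are now on the course site",
     "when": "Mon, 16 Mar 2009 09:00:00 +1000"},
    {"audience": "staff", "title": "Results due",
     "body": "Assignment 1 results are due to be uploaded by Friday",
     "when": "Mon, 16 Mar 2009 09:05:00 +1000"},
]
print(cooked_feed(tutor, "EDU12345", events))
```

The point isn’t the code; it’s that the feed is generated per person at request time, rather than being a single static file for the whole course.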

A staff member might get a set of feeds like this:

  1. Student progress – perhaps containing a collection of feeds. One might be a summary feed that summarises progress (or the lack thereof) for all students, and then one feed per student.
  2. Course site – provides posts related to the course website. For example, posts to discussion forums, usage statistics of resources and features etc.
  3. Tasks and events – updates of when assignments are due, when assignments are meant to be marked, when results need to be uploaded. These updates would not only contain information about what needs to be done, but also provide links and advice about how to perform them.

The “cooked” adjective suggests that the feeds are not simply raw data from original sources, but that they undergo additional preparation to increase the value of the information they contain. For example, rather than a single post simply listing the students who have (or have not) visited a course site, the post might contain the students’ GPA for previous courses, some indication of how long into a term they normally access a course site, when they added the course (in both date and week format – i.e. week 2 of term), links back to institutional information systems to see photos and other details of the students, links to an email merge facility to send a private/bulk email to all students in a particular category, a list of which staff are responsible for which students, etc.

The point is that the “cooking” turns generic LMS information into information that is meaningful for the institution, the course, the staff, and the students. It is this contextual information that will almost always be missing from generic systems, simply because they have to be generic and each institution is going to be different.
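
As a sketch of what that “cooking” step might look like in practice (again, all field names, values and URLs below are hypothetical), the snippet joins a raw list of students who haven’t visited the course site with some institutional context before it goes into a feed post.

```python
# A sketch of "cooking": joining raw LMS data (students who haven't visited
# the course site) with institutional context before publishing it in a feed.
# All field names, values and URLs are hypothetical.

def cook_non_visitors(non_visitors, student_records, week_of_term):
    """Turn a raw list of student IDs into a contextualised summary post."""
    lines = [f"Students yet to visit the course site (week {week_of_term}):"]
    for sid in non_visitors:
        s = student_records[sid]
        lines.append(
            f"- {s['name']} ({s['campus']}, GPA {s['gpa']}, enrolled week {s['enrol_week']}) "
            f"details: https://example.edu/students/{sid} "
            f"email: mailto:{sid}@example.edu"
        )
    return "\n".join(lines)

records = {
    "s0001": {"name": "B. Student", "campus": "Rockhampton", "gpa": 4.5, "enrol_week": 2},
    "s0002": {"name": "C. Student", "campus": "Mackay", "gpa": 5.8, "enrol_week": 1},
}
print(cook_non_visitors(["s0001", "s0002"], records, week_of_term=3))
```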

Why?

The PLEs@CQUNi project already has a couple of related sub-projects doing work in this area – discussion forums and BAM.

Discussion forums. The slideshow below explains how staff can currently access RSS feeds generated from the discussion forums of CQU’s current implementation of Blackboard version 6.3. A similar feature has already been developed for the discussion forum used in the other “LMS” being used at CQU.

The above slideshow uses the idea of the “come to me” web. This meme encompasses one reason why doing this might be a good thing. It saves time and it makes information more visible to the staff and the students. Information they can draw upon to decide what to do next. Information in a form that allows them to re-purpose and reuse it for tasks that make sense to them, but would never be apparent to a central designer.

BAM. The Blog Aggregation Management (BAM) project now generates an OPML feed unique for each individual staff member to track their students’ blog posts. The slidecast below outlines how they can use it.
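
For anyone unfamiliar with OPML, the following sketch (with illustrative URLs and structure, not taken from BAM itself) shows the sort of outline a staff member’s aggregator might subscribe to: one RSS entry per student blog.

```python
# A sketch of a per-staff OPML outline: one RSS feed entry per student blog.
# The structure, names and URLs are illustrative, not BAM's actual output.
import xml.etree.ElementTree as ET

def student_blogs_opml(staff_name, students):
    opml = ET.Element("opml", version="2.0")
    head = ET.SubElement(opml, "head")
    ET.SubElement(head, "title").text = f"Student blogs for {staff_name}"
    body = ET.SubElement(opml, "body")
    for s in students:
        # type="rss" tells the aggregator each outline element is a feed subscription
        ET.SubElement(body, "outline", type="rss", text=s["name"], xmlUrl=s["feed_url"])
    return ET.tostring(opml, encoding="unicode")

print(student_blogs_opml("A. Tutor", [
    {"name": "B. Student", "feed_url": "https://bstudent.example.com/feed/"},
    {"name": "C. Student", "feed_url": "https://cstudent.example.com/feed/"},
]))
```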

The indicators project is seeking to mine usage logs of the LMS to generate information that is useful to staff. I think there is value in this project looking at generating RSS feeds for staff based on the information it generates. Why? The answer depends on the difference between lag and lead indicators.

I’ve always thought that too much of the data generated at universities consists of lag indicators: indicators that tell you how good or bad things went. For example, “oh dear, course X had an 80% failure rate”. While having this information is useful, it’s too late to do anything. You can’t (well, you shouldn’t be able to) change the failure rate after it has happened.

What is much more useful are lead indicators. Indicators that offer you some insight into what is likely to happen. For example, “oh dear, the students all failed that pop quiz about topic X”. If you have some indication that something is starting to go wrong, you may be able to do something about it.

Aside: Of course this brings up the problematic way most courses are designed, especially the assessment. They are designed in ways such that there are almost no lead indicators. The staff have no real insight into how the students are going until they hand in an assignment or take an exam, by which time it is too late to do anything.

Having the indicators project generating RSS posts summarising important lead indicators for a course might encourage and help academics take action to prevent problems developing into outright failure.
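
As a toy example of the kind of lead indicator a feed post could carry (the threshold and advice below are arbitrary illustrations, not outputs of the indicators project):

```python
# A toy lead indicator: students with little or no course-site activity by a
# given week of term. The threshold and advice are arbitrary illustrations.

def inactivity_indicator(activity_counts, week, threshold=1):
    """activity_counts maps student id -> number of course-site hits so far."""
    at_risk = sorted(sid for sid, hits in activity_counts.items() if hits < threshold)
    return {
        "title": f"Week {week}: {len(at_risk)} students are yet to use the course site",
        "students": at_risk,
        "advice": "Consider a personal email or phone call before the first assignment is due.",
    }

print(inactivity_indicator({"s0001": 0, "s0002": 14, "s0003": 0}, week=3))
```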

This is also encompassed in the idea of BAM generating feeds and the very idea of BAM in the first place. It allows staff to see which students are or are not progressing (lead indicator) and then take action they deem appropriate.

It’s also a part of the ideas behind reflective alignment. That post also has some suggestions about how to implement this sort of thing.

How to improve L&T and e-learning at universities

Over the last week or so I’ve been criticising essentially all current practice used to improve learning and teaching. There are probably two main prongs to my current cynicism:

  1. Worse than useless evaluation of learning and teaching; and
    Universities are using evaluation methods that are known to be worthless and/or can’t get significant numbers of folk to agree on some definition of “good” learning and teaching.
  2. A focus on what management do.
    Where, given the difficulty of getting individual academics (let alone a significant number of them) to change and/or improve their learning and teaching (often because of the problems with point #1), the management/leadership/committee/support hierarchy within universities embarks on a bit of task corruption and starts to focus on what they do, rather than on what the teaching staff do.

    For example, the university has improved learning and teaching if the academic board has successfully mandated the introduction of generic attributes into all courses, had the staff development center run appropriate staff development events, and introduced “generic attributes” sections within course outlines. They’ve done lots of things, hence success. Regardless of what the academics are really doing and what impacts it is having on the quality of learning and teaching (i.e. see point #1).

So do you just give up?

So does this mean you can’t do anything? What can you do to improve learning and teaching? Does the fact that learning and teaching (and improving learning and teaching) are wicked problems mean that you can’t do anything? This is part of the problem Col is asking about with his indicators project. This post is mostly aimed at trying to explain some principles and approaches that might work. As well as attempting to help Col, it’s attempting to make concrete some of my own thoughts. It’s all a work in progress.

In this section I’m going to try and propose some generic principles that might help inform how you might plan something. In the next section I’m going to try and apply these principles to Col’s problem. Important: I don’t think this is a recipe. The principles are going to be very broad and leave a lot of room for the application of individual knowledge. Knowledge of both generic theories of teaching, learning, people etc. and also of the specific contexts.

The principles I’m going to suggest are drawn from:

  • Reflective alignment – a focus on what the teachers do.
  • Adopter-based development processes.
  • A model for evaluating innovations informed by diffusion theory.
  • Emergent/ateleological design.
  • The Cynefin framework.

Reflective alignment

In proposing reflective alignment I believe it is possible to make a difference. But only if

The focus is on what the teacher does to design and deliver their course. The aim is to ensure that the learning and teaching system, its processes, rewards and constraints are aiming to ensure that the teacher is engaging in those activities which ensure quality learning and teaching. In a way that makes sense for the teacher, their course and their students.

The last sentence is important. It is about what makes sense for the teacher. It is not what some senior manager thinks should work, or what the academic board thinks is important or good. Any attempt to introduce something that doesn’t engage with the individual teacher, and doesn’t encourage them to reflect on what they are doing and hopefully make a small improvement, will fail.

Adopter-based development

This has strong connections with the idea of adopter-based development processes, which are talked about in this paper (Jones and Lynch, 1999)

places additional emphasis on being adopter-based and concentrating on the needs of the individuals and the social system in which the final system will be used.

Forget about the literature, forget about the latest fad (mostly) and concentrate first and foremost on developing a deep understanding of the local context, the social system and its mores and the people within it. What they experience, what their problems are, what their strengths are and what they’d like to do. Use these as the focus for deciding what you do next, not the latest, greatest fad.

How do you decide?

In this paper (Jones, Jamieson and Clark, 2003) we drew on Rogers’ diffusion theory (Rogers, 1995) to develop a model that might help folk make these sorts of decisions. The idea was to evaluate a potential innovation against the model in order to

increase their awareness of potential implementation issues, estimate the likelihood of reinvention, and predict the amount and type of effort required to achieve successful implementation of specific … innovations.

Variables influencing rate of adoption

The model consists of five characteristics of an innovation diffusion process that will directly influence the rate of adoption of the innovation. These characteristics, through the work of Rogers and others, also help identify potential problems facing adoption and potential solutions.
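
To give a flavour of that style of evaluation, the sketch below scores a candidate innovation against Rogers’ five perceived attributes of innovations (relative advantage, compatibility, complexity, trialability, observability). It illustrates the general approach only; it is not a reproduction of the model from the 2003 paper, and the scale, weights and scores are made up.

```python
# A sketch of scoring a candidate innovation against Rogers' five perceived
# attributes. Illustrative only; not the exact model from the 2003 paper.

ATTRIBUTES = ["relative advantage", "compatibility", "complexity",
              "trialability", "observability"]

def adoption_outlook(scores):
    """scores: attribute -> 1 (works against adoption) to 5 (works for adoption).
    For complexity, a score of 5 means the innovation is perceived as simple."""
    missing = [a for a in ATTRIBUTES if a not in scores]
    if missing:
        raise ValueError(f"No score given for: {missing}")
    average = sum(scores[a] for a in ATTRIBUTES) / len(ATTRIBUTES)
    weak = [a for a in ATTRIBUTES if scores[a] <= 2]   # likely implementation issues
    return average, weak

average, weak_spots = adoption_outlook({
    "relative advantage": 4, "compatibility": 2, "complexity": 3,
    "trialability": 5, "observability": 2,
})
print(f"Average outlook {average:.1f}/5; attributes needing attention: {weak_spots}")
```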

This model can be misused. It can be used as an attempt to encourage adoption of Level 2 approaches to improving learning and teaching. i.e. someone centrally decides on what to do and tries to package it in a way to encourage adoption. IMHO, this is the worst thing that can happen. Application of the model has to be driven by a deep understanding of the needs of the people within the local context. In terms of reflective alignment, driven by a desire to help encourage academics to reflect more on their learning and teaching.

Emergent/ateleological design

Traditional developer-based approaches to information systems are based on a broadly accepted and unquestioned set of principles that are completely and utterly inappropriate for learning and teaching in universities. Since at least this paper (Jones, 2000) I’ve been arguing for different design processes based on emergent development (Truex, Baskerville and Klein, 1999) and ateleological design (Introna, 1996).

Truex, Baskerville and Klein (1999) suggest the following principles for emergent development:

  • Continual analysis;
  • Dynamic requirements negotiation;
  • Useful, incomplete specifications;
  • Continuous redevelopment; and
  • The ability to adapt.

They are expanded in more detail in the paper. There have been many similar discussions about processes. This paper talks about Introna’s ateleological design process and its principles. Kurtz and Snowden (2007) talk about idealistic versus naturalistic approaches, which are summarised in the following table.

Idealistic | Naturalistic
Achieve ideal state | Understand a sufficiency of the present in order to stimulate evolution
Privilege expert knowledge, analysis and interpretation | Favour enabling emergent meaning at the ground level
Separate diagnosis from intervention | Diagnosis and intervention to be intertwined with practice

No surprises for guessing that I believe that a naturalistic process is much more appropriate.

Protean technologies

Most software packages are severely constraining. I’m thinking mostly of enterprise systems, whose design tends to embody the assumption that controlling what users do is necessary to ensure efficiency. I believe this just constrains what people can do and limits innovation, and in an environment like learning and teaching that is a huge problem.

Truex et al (1999) make this point about systems and include “ability to adapt” as a prime requirement for emergent development. The software/systems in play have to be adaptable. As many people as possible, as quickly as possible, need to be able to modify the software to enable new functionality as the need becomes apparent. The technology has to enable, in Kurtz and Snowden’s (2007) words, “emergent meaning at the ground level”. It also has to allow “diagnosis and intervention to be intertwined with practice”.

That is, the software has to be protean. As much as possible, the users of the system need to be able to play with the system and to try new things, and where appropriate there have to be developers who can help and enable these things to happen more quickly. This implies that the software has to enable and support discussion amongst many different people, to help share perspectives and ideas. The mixing of ideas helps generate new and interesting ideas for changes to the software.

Cynefin framework

Cynefin framework

Which brings us to the Cynefin framework. As a wicked problem, I place teaching and attempting to improve teaching into the Complex domain of the Cynefin framework. This means that the most appropriate approach is to “Probe – Sense – Respond”. i.e. do something small, see how it works and then encourage the stuff that works and cease/change the stuff that doesn’t.

Some ideas for a way forward

So to quickly finish this off, some off the cuff ideas for the indicators project:

  • Get the data from the indicators into a form that provides some information to real academics, is easy to access, and is preferably part of a process or system they already use.
  • Make sure the form is perceived by the academics to provide some value.
  • Especially useful if the information/services provided by the indicators project enables/encourages reflection on the part of the academics.
    For example, giving a clear, simple, regular update on some information about student activity that is currently unknown. Perhaps couched with advice that helps provide options for a way to solve any potential problems.
  • Use a process and/or part of the product that encourages a lot of people talking about/contributing to ideas about how to improve what information/services the indicators provides.
  • Adopt the “open source” development ethos: “release early, release often”.
  • Perhaps try and create a community of academics around the project that are interested and want to use the services.
  • Pick people that are likely to be good change agents. Keep in mind Moore’s chasm and Geoghegan’s identification of the technologists’ alliance.

References

Introna, L. (1996). “Notes on ateleological information systems development.” Information Technology & People 9(4): 20-39.

David Jones, Teresa Lynch, (1999). A Model for the Design of Web-based Systems that supports Adoption, Appropriation, and Evolution, Proceedings of the 1st ICSE Workshop on Web Engineering, Murugesan, S. & Deshpande, Y. (eds), Los Angeles, pp 47-56

David Jones, Kieren Jamieson, Damien Clark, (2003). “A Model for Evaluating Potential WBE Innovations,” Hawaii International Conference on System Sciences, vol. 5, no. 5, pp. 154a, 36th Annual Hawaii International Conference on System Sciences (HICSS’03) – Track 5, 2003.

Kurtz, C. and D. Snowden (2007). Bramble Bushes in a Thicket: Narrative and the intangibles of learning networks. Strategic Networks: Learning to Compete. Gibbert, Michel, Durand and Thomas, Blackwell.

Rogers, E. (1995). Diffusion of Innovations. New York, The Free Press.

Truex, D., R. Baskerville, et al. (1999). “Growing systems in emergent organizations.” Communications of the ACM 42(8): 117-123.

The IRIS model of Technology Adoption – neat and incomplete?

George Siemens has a post introducing the IRIS model of technology adoption – image shown below.

IRIS model of technology adoption

I always start off having a vague disquiet about these types of models. I think the main reason is the point George makes at the start of the post

In many instances, it’s a matter of misunderstanding (determining the context from which different speakers are arguing)

i.e. some of my disquiet arises from bringing a different context/perspective to this. The following is my attempt to clearly identify the source of my vague sense of disquiet about this model.

At the moment, I think I’m going to identify three sources:

  1. It’s too neat.
  2. Misinterpretation based on different definitions.
  3. It misses the most important part.

It’s too neat

I’ve argued before that frameworks and their graphical representations tend to make the inherently messy, too neat.

One of the things I don’t like about frameworks is that they have (for very good reasons) to be tidy. This certainly helps understanding, a key purpose of frameworks, but it also can give the false impression of tidiness, of simplicity of a tame problem. My interest is currently in e-learning within universities, which I consider to be extremely messy. To me it is an example of a wicked problem.

Innovation in any reasonably complex social system is also a wicked problem.

Part of my disquiet about the neatness is how I’ve seen models, frameworks and taxonomies used within organisations. They’ve been used as a replacement for recognising, understanding and dealing with the complexities and messiness of the real situation. Some of the sentiment expressed by Jim Groom about leadership captures some of this. I’ve seen this problem lead too often to faddish and fashionable adoption of innovations. I tend to think much of the organisational implementation of e-learning is based on fads and fashions.

That said, a neat graphical representation is a good way to start understanding, but it’s not the end game.

Misinterpretation based on definitions

Perhaps getting back to the point about misunderstandings arising from context. In my current context fads and fashions are something I see regularly and am thinking about. Hence when I see “How do we duplicate it?” under the systematization component of the IRIS model, I immediately think of fads and fashions. I wonder if “How do we scale it?” or “How do we encourage widespread appropriate adoption?” might capture better what George’s intent might be.

It doesn’t go far enough

The IRIS model, as it stands, appears (based on my interpretation) to suffer from the same problem that almost all of these types of models have. It focuses mostly on the development and pays little or no attention to the long-term use, adaptation and evolution of innovations. I hesitate to label it as such, but the IRIS model seems to have a very strong basis in teleological design (Jones et al, 2005; Jones and Muldoon, 2007). Again, this could be the impact of perspective and context. It could be me falling into the hole provided by Kaplan’s law of the instrument.

Going back to a major component of my information systems design theory for e-learning and a quote from an old paper (Jones, Lynch and Jamieson, 2003)

The world in which systems are developed is not static and entirely predictable – systems will need to be altered and maintained. Maintenance typically consumes about 40 to 80 percent of software costs and 60% of that maintenance cost is due to enhancement (Glass, 2001). That is, adding new capabilities that were not known of during the analysis phase. If maintenance is such a large part of system development the assumption of a period of low-cost maintenance to recoup costs from the analysis and development phases seems less valid. If an organization is operating in a continually changing context then a large investment in up front analysis is a poor investment as requirements change before the end of the analysis stage (Truex & Klein, 1999).

I think there is another step in the IRIS model – let’s call it Evolution. It’s a step that comes after Systematization and has a cyclical relationship with both Systematization and Innovation. No innovation survives in its original form once it starts to be used. A whole range of limitations and unexpected affordances of the innovation are discovered as it is used in new and complex settings. This is especially true if the innovation makes use of some form of protean technology that enables and even encourages the modification of the innovation by people within a given context.

Time for the hobby horse

The lack of attention paid to this Evolution phase is perhaps the aspect of university-based e-learning that annoys me most. The processes used and the products selected generally don’t pay sufficient attention to the need and benefits of actively enabling this evolution. It’s the type of thinking that leads to systems and practices that aren’t moving with the times and require organisations to enter into large-scale replacement projects – e.g. the selection of a new LMS.

This approach is based on the idea of “big up-front design”, as shown in the following model from Truex et al (1999). There is a period of analysis and design which is expensive. Then, to recoup costs, there is a period of stable use until the system is no longer suitable, at which point it needs to be replaced.

Big up front design

The major problem with this process is evident if you look at universities and their use of learning management systems: expand the timelines out to 10 years or so and you get the following.

The long term effects of big up front design

That is, because they don’t pay enough attention to Evolution and because they don’t treat their “Product” (i.e. the LMS) as protean, every 5 years or so they have to go through the expensive and onerous process of replacing their systems.

References

Glass, R. (2001). Frequently Forgotten Fundamental Facts about Software Engineering. IEEE Software, 110-112.

David Jones, Teresa Lynch, Kieren Jamieson, Emergent Development of Web-based Education, Proceedings of Informing Science + IT Education Conference, Pori, Finland

David Jones, Jo Luck, Jeanne McConachie, P. A. Danaher, The teleological brake on ICTs in open and distance learning, To appear in Proceedings of ODLAA’2005

David Jones, Nona Muldoon, The teleological reason why ICTs limit choice for university learners and learning, In ICT: Providing choices for learners and learning. Proceedings ASCILITE Singapore. pp 450-459

Truex, D., & Klein, B. (1999). Growing Systems in Emergent Organizations. Communications of the ACM, 42(8), 117-123.
