Assembling the heterogeneous elements for (digital) learning


A perspective on why institutional e-learning is so bad

It’s about time to tilt at the same windmill again. For as long as I can remember I’ve thought institutional e-learning was done badly. Here’s another attempt to explain why and map out a way forward. The following is based heavily on this paper that will be presented at ASCILITE’2014 and is a slightly re-worked version of something I shared as part of my current institution’s attempts to formulate operational plans.

The argument is that institutional e-learning is based entirely on the wrong mindset. To see any hope of improvement it needs to integrate a little of another largely incommensurable mindset. I use a problem specific to my practice below to illustrate the argument. My co-author shares a different problem in the paper that illustrates the same point, but his problem is potentially more interesting.

My problem

I teach EDC3100, ICT and Pedagogy, a third-year core course in the Bachelor of Education. The first semester enrolment typically consists of 300-plus pre-service teachers studying to become every type of teacher (early childhood, primary, secondary and VET), located at each of USQ’s campuses and in Malaysia, with about 170 of the students studying via online learning. Some of these students – due to exemptions – are in their first semester of University study. Others are in their 6th, 7th or later year of study at USQ.

As a course teaching teachers how to use Information and Communication Technologies to enhance/transform their pedagogy, EDC3100 requires all students to make heavy use of ICTs. Many of these students are not “digital natives”. Even those with years of online study at USQ show surprising levels of digital illiteracy. Hence there are lots of questions from students that need answering. Almost all of these questions are asked on discussion forums.

When I respond to a question on a course discussion forum it’s often important/useful to tailor the response to the specifics of the student. In particular, it’s not unusual to see the need to customise a response based on:

  1. The student’s mode of study (at which campus, or online).
  2. The “type” of teacher they are studying to become (early childhood, primary etc).
  3. Whether they also have a specific learning area/discipline.
    e.g. HPE students often have a specific set of challenges around ICTs, while secondary students have specific learning areas and typically focus on two or so disciplines.
  4. Whether this is the student’s first semester of study.
  5. The physical location of online students.
    Students in other states or overseas often use different curricula etc.

Challenge: If you are teaching in a University, can you find out this type of information about your students?

At my institution I have access to an LMS and a student records system. This information is mostly not in the LMS, and where it is, finding it requires navigating to another page. While the information is in the student records system, the default access provided to academics does not allow them to see it.

The silly solution I’ve used for the last 3 years has been to:

  1. At the start of semester, ask the Faculty staff member with the appropriate permissions to generate a spreadsheet providing this information for all students.
  2. Keep that spreadsheet open when replying to student queries and check it when needed.

As you might imagine this doesn’t happen as often as it should because it takes time. Of course, the spreadsheet is out of date almost straight away as students add and drop the course.

My new solution

What I did instead was modify my web browser so that when it sees any page provided by the Study Desk that contains a link to a user profile, it adds a new link to the page close by the user profile link. When I click on this new link (and it’s for a student in EDC3100) a dialog box pops up with additional information about the student (what they are studying, their mode of study, how many courses they’ve completed and their city/post code/country).

The following figure shows what it looks like. Note:

  1. The [details] links near the author’s name and photo.
    I haven’t spent the time to tidy this kludge up.
  2. The dialog box and how the forum post is somewhat greyed out.
    The idea is that I can check this information, click Ok and then reply to the query.
[Image: MAV-based more user details by David T Jones, on Flickr]
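For the technically curious, the mechanics look roughly like the following userscript. This is a minimal sketch only, not the actual code: the Study Desk URL pattern, the local lookup service (and its address, http://localhost:3000) and the JSON fields are all assumptions for illustration.

```javascript
// ==UserScript==
// @name     EDC3100 student details (sketch)
// @include  https://usqstudydesk.usq.edu.au/*
// @grant    GM_xmlhttpRequest
// ==/UserScript==
// A minimal sketch only. The @include pattern, the local lookup service
// and its JSON fields are assumptions, not the real code.

// Moodle user profile links look like .../user/view.php?id=<userid>&course=...
document.querySelectorAll('a[href*="/user/view.php"]').forEach(function (link) {
  var userId = new URL(link.href).searchParams.get('id');
  if (!userId) { return; }

  // Inject a [details] link immediately after the profile link.
  var details = document.createElement('a');
  details.textContent = ' [details]';
  details.href = '#';
  details.addEventListener('click', function (event) {
    event.preventDefault();
    // Ask a small local service (fed from the student records spreadsheet)
    // for the extra information about this student.
    GM_xmlhttpRequest({
      method: 'GET',
      url: 'http://localhost:3000/student/' + userId,
      onload: function (response) {
        var s = JSON.parse(response.responseText);
        // The real solution uses a dialog box; alert() keeps the sketch short.
        alert(s.name + '\nMode: ' + s.mode +
              '\nSpecialisation: ' + s.specialisation +
              '\nLocation: ' + s.location);
      }
    });
  });
  link.parentNode.insertBefore(details, link.nextSibling);
});
```

The appeal of this approach is that it needs no changes to Moodle and no requests to IT; the dialog’s contents can be changed by editing a script I control.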


Currently, this only works via a web browser running on my laptop. It’s a personal solution. It is based on a particular set of technologies developed and currently being used at CQUni for a strategic project around retention.

Why can’t USQ have more solutions like this? SET in our ways

The argument is that USQ (or any other university) is generally unable to achieve a solution like this due to the usually implicit mindset that underpins how it operates. My co-author and I have tried to make this mindset explicit as the “SET framework” based on how the institution answers three questions:

  1. What work gets done? – Strategic
    i.e. there is a strategic plan and a sequence of operational plans that define what is acceptable. The assumption is that the organisation has identified some ideal future state (enshrined in the plans) and what work can be done is judged against how well it helps the organisation achieve this pre-defined state. Any work that isn’t in the plan is deemed inefficient or unimportant.
    This particular problem could be aligned with the existing institutional plans, but there’s a question of how easily this could be achieved.
  2. How is ICT perceived? – Established
    This quote from an IT person at CQU from around 2003/4 sums up this perspective nicely: “we should seek to change people’s behavior because information technology systems are difficult to change” (Sturgess & Nouwens, 2004, n.p.). On this view it is really hard to change ICTs, so instead people and their practices should change. This is especially prevalent with “enterprise systems”, where best practice advice is to implement them “vanilla” (i.e. with no change to the technology).
    Peoplesoft (USQ’s student records system) is a horrendously difficult and expensive system to modify. Moodle – as open source software – is theoretically easier to modify, but still requires significant technical skill (i.e. expensive and rare) to modify properly. Even Moodle is very difficult to modify if your modifications require contextually specific changes to the system’s in-built assumptions. e.g. modifying the Moodle discussion forum to show whether an EDC3100 student is studying to become a HPE teacher is not likely to happen.
  3. How is the world perceived? – Tree-like
    By this I mean hierarchical tree in the computer science sense. Strategic approaches will always use logical decomposition to reduce a big, really hard problem into lots of little easy to solve problems and assumes that you can just put all those little solutions together again and solve the big problem. You can see this in org charts, how learning and teaching is broken down into programs, courses, topics, learning objectives, attributes etc, and information systems.
    Each of the little boxes in the tree becomes responsible for a specific task. e.g. the development and support of IT is meant to be done in the IT box. Teacher education is done in the education box, etc. Managing student records is done in the Peoplesoft box, which is the responsibility of the Student Administration box.

    The problem is that it’s really, really hard to move between boxes. If I wanted my problem solved, it would have to be recognised by the folk in my box who are part of the IT governance process. They would take my problem (along with everyone else’s) up the hierarchy to someone/group who can make a judgement. A small problem like this is almost certainly going to be starved of attention as the focus is on achieving strategic goals. If it does get attention, there’s the challenge of trying to bridge the two different boxes in which Peoplesoft and Moodle reside. etc.

    The more likely outcome is that I’m not going to bother (at least not with the formal structure).

How is this possible? Breaking BAD

The solution I’ve developed is possible due to a different mindset that provides different answers to the three questions above. We’ve labelled this mindset the “BAD framework”.

It answers the three questions this way:

  1. What work gets done? – Bricolage
    Rather than trying to achieve some pre-defined perfect state, bricolage focuses on solving concrete, contextualised problems using the resources that are to hand. I had a problem that I needed to solve, so I figured out how I could solve it with the resources I had to hand. This year I’ve been able to improve my solution because I had access to new and better resources. But not resources that were provided by USQ. A good set of APIs would be a great help.
  2. How is ICT perceived? – Affordances
    ICTs are seen as protean. They can and should be manipulated, modified and re-worked to help people achieve what they want to achieve. There is no such thing as a “perfect design” or “perfect system”; the diversity and rapid change inherent in learning and teaching makes such an idea nonsensical.
    In this case, I’ve been able to use the spreadsheet manually generated from Peoplesoft, Perl, Postgres, PHP, Greasemonkey, the Firefox web browser and the well-designed HTML created by Moodle to manipulate and change the appearance and functionality of the Study Desk pages (a sketch of this assemblage follows the list).
  3. How is the world perceived? – Distributed
    The world is (and universities are) complex, dynamic, and consists of interdependent assemblages of diverse actors (human and not) connected via complex networks. The ability to quickly construct and traverse those connections is essential to learning, understanding and action. Podolny & Page (1998) apply the distributed view to governance and organisations and describe it as meaning that two or more actors are able to undertake repeated interactions over a period of time without having a centralised authority responsible for resolving any issues arising from those interactions.
    Rather than the typical tree-like structure of server (Study Desk) and client (my laptop), my solution draws on a network of technologies, some on my laptop and some on university servers. I’ve used those distributed technologies to make connections not previously possible and hence I’m now able to do more than previously.
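To make the “network of technologies” concrete, the following sketches the lookup-service half of the assemblage. The original uses Perl, PHP and Postgres; this stand-in uses only Node.js built-ins, and the file name and CSV columns are assumptions.

```javascript
// Sketch of the lookup service the browser script talks to. The actual
// solution used Perl/PHP/Postgres; this illustrative stand-in needs only
// Node.js built-ins. Assumption: students.csv was exported from the
// Peoplesoft spreadsheet with columns
// moodle_id,name,mode,specialisation,location
const fs = require('fs');
const http = require('http');

// Load the spreadsheet export into an in-memory lookup table.
const students = {};
fs.readFileSync('students.csv', 'utf8').split('\n').slice(1).forEach(line => {
  const [id, name, mode, specialisation, location] = line.split(',');
  if (id) { students[id] = { name, mode, specialisation, location }; }
});

// Serve /student/<moodle_id> as JSON on localhost.
http.createServer((req, res) => {
  const id = req.url.split('/')[2];
  const student = students[id];
  res.writeHead(student ? 200 : 404, {
    'Content-Type': 'application/json',
    // Let the userscript running on the Study Desk pages call this service.
    'Access-Control-Allow-Origin': '*'
  });
  res.end(JSON.stringify(student || { error: 'unknown student' }));
}).listen(3000);
```

Re-importing a fresh spreadsheet is a one-line change, which is the point: each piece is small, replaceable, and owned by the person who needs it.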

Implications

The paper that goes into this in more detail closes with this:

The suggestion here is not that institutions should see the BAD framework as a replacement for the SET framework, but rather that they should engage in some bricolage and explore how contextually appropriate mixtures of both frameworks can help bridge their e-learning reality/rhetoric chasm. Perhaps universities need to break a little BAD?

Hence the suggestion for institutions is to figure out whether we want to break a little BAD and how that might be done.

However, as argued above, it takes more than just having good technology. The fundamental mindset that underpins much of how an organisation does business needs to be questioned. This is hard.

The paper also raises the following as potential examples of how existing conceptions might need to be challenged:

rather than require the IT division to formally approve and develop all applications of ICT, their focus should perhaps turn (at least in part) to enabling and encouraging “ways to make work-arounds easier for users to create, document and share” (Koopman & Hoffman, 2003, p. 74) through organisational “settings, and systems … arranged so that invention and prototyping by end-users can flourish” (Ciborra, 1992, p. 305).

rather than academic staff development focusing on ensuring that the appropriate knowledge is embedded in the heads of teaching staff (e.g. formal teaching qualifications), there should be a shift to a focus on ensuring that the appropriate knowledge is embedded within the network of actors – both people and artefacts – distributed within and perhaps outside the institution.

Rather than accept “the over-hyped, pre-configured digital products and practices that are being imported continually into university settings” (Selwyn, 2013, p. 3), perhaps universities should instead recognise that “a genuine grassroots interest needs to be developed in the co-creation of alternative educational technologies. In short, mass participation is needed in the development of ‘digital technology for university educators by university educators’” (p. 3).

The argument isn’t that we should throw out Moodle or other systems. Instead, there need to be mechanisms by which we can harness the knowledge distributed across the institution to extend and modify those existing technologies into something that is unique to the institutional context.

For me, I’d love to see what happens if institutional e-learning were characterised by:

  * Widespread and on-going bricolage by a widely distributed collection of individuals and groups (students, teachers and others) from across the entire institution, all connected via various means and learning from and building upon each other’s work.
  * An institutional context that provides a range of functionality that supports and enables this on-going engagement with bricolage and recognises that this is where its competitive advantage will come from.
  * An institutional context that is actively trying to make it easier to connect to actors from across and outside the institution and grow the knowledge embedded in those connections.
  * All of this knowledge being used to manipulate and modify technologies to achieve new and interesting learning experiences.


Breaking BAD to bridge the reality/rhetoric chasm

The following is a copy of a paper accepted at ASCILITE’2014 (and nominated for best paper) written by myself and Damien Clark (CQUniversity – @damoclarky). The official conference version of the paper is available as a PDF.

Presentation slides are available on Slideshare and Google Slides.

The source code for the Moodle Activity Viewer is available on GitHub, as are some of the scripts produced at USQ.

Abstract

The reality of using digital technologies to enhance learning and teaching has a history of falling short of the rhetoric. Past attempts at bridging this chasm have tried: increasing the perceived value of teaching; improving the pedagogical and technological knowledge of academics; redesigning organisational policies, processes and support structures; and, designing and deploying better pedagogical techniques and technologies. Few appear to have had any significant, widespread impact, perhaps because of the limitations of the (often implicit) theoretical foundations of the institutional implementation of e-learning. Using a design-based research approach, this paper develops an alternate theoretical framework (the BAD framework) for institutional e-learning and uses that framework to analyse the development, evolution, and very different applications of the Moodle Activity Viewer (MAV) at two separate universities. Based on this experience it is argued that the reality/rhetoric chasm is more likely to be bridged by interweaving the BAD framework into existing practice.

Keywords: bricolage, learning analytics, e-learning, augmented browsing, Moodle.

Introduction

In a newspaper article (Laxon, 2013) Professor Mark Brown makes the following comment on the quality of contemporary University e-learning:

E-learning’s a bit like teenage sex. Everyone says they’re doing it but not many people really are and those that are doing it are doing it very poorly. (n.p.)

E-learning – defined by the OECD (2005) as the use of information and communications technology (ICT) to support and enhance learning and teaching – has been around for so long that there have been numerous debates about replacing it with other phrases. Regardless of the term used, there “has been a long-standing tendency in education for digital technologies to eventually fall short of the exaggerated expectations” (Selwyn, 2012, n.p.). Writing in the early 1990s, Geoghegan (1994) seeks to understand why the three-decade-long “vision of a pedagogical utopia” (n.p.) promised by instructional technologies has failed to eventuate. Ten years on, Salmon (2005) notes that e-learning within universities is still struggling to move beyond projects driven by innovators and engage a significant percentage of students and staff. Even more recently, concerns remain about how much technology is being used to effectively enhance student learning (Kirkwood & Price, 2013). Given that “Australian universities have made very large investments in corporate educational technologies” (Holt et al., 2013, p. 388) it is increasingly important to understand and address the reality/rhetoric chasm around e-learning.

Not surprisingly the literature provides a variety of answers to this complex question. Weimer (2007) observes that academics come to the task of teaching with immense amounts of content knowledge, but little or no knowledge of teaching and learning beyond, perhaps, their personal experience. That situation may not change significantly given that academics are expected to engage equally in research and teaching, yet work towards promotion criteria that are perceived to primarily value achievements in research (Zellweger, 2005). It has been argued that the limitations of the Learning Management System (LMS) – the most common university e-learning tool – make the LMS less than suitable for more effective learner-centred approaches and are contributing to growing educator dissatisfaction (Rahman & Dron, 2012). It’s also been argued that the “limited digital fluency of lecturers and professors is a great challenge” (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3) for the creative leveraging of emerging technologies. Another contributing factor is likely to be Selwyn’s (2008) suggestion that educational technologists have failed to be cognisant of “the more critical analyses of technology that have come to the fore in other social science and humanities disciplines” (p. 83). Of particular interest here is the observation of Goodyear et al (2014) that the “influence of the physical setting (digital and material) on learning activity is often important, but is under-researched and under-theorised: it is often taken for granted” (p. 138).

This paper reports on the initial stages of a design-based research project that aims to bridge the e-learning reality/rhetoric chasm by exploring and harnessing alternative theoretical foundations for the institutional implementation of e-learning. The paper starts by comparing and contrasting two different theoretical foundations of institutional e-learning. The SET framework is suggested as a description of the mostly implicit assumptions underpinning most contemporary approaches. The BAD framework is proposed as an alternative and perhaps complementary framework that better captures the reality of what happens and, if effectively integrated into institutional practices, may help bridge the chasm. The development of a technology – the Moodle Activity Viewer (MAV) – and its use at two different universities is then used to illustrate the benefits and limitations of the SET and BAD frameworks, and how the two can be fruitfully combined. The paper closes with some discussion of implications and future work.

Breaking BAD versus SET in your ways

The work described here is part of an on-going cycle of design-based research that aims to develop new artefacts and theories that can help bridge the e-learning reality/rhetoric chasm. We believe that bridging this chasm is of theoretical and practical significance to the sector and to us personally. The interventions we describe in the following sections arose out of our day-to-day work and were informed by a range of theoretical perspectives. This section offers a brief description of the theoretical frameworks that have informed and been refined by this work. This is important as design-based research should depart from a problem (McKenney & Reeves, 2013), be grounded in practice, theory-driven and seek to refine both theory and practice (Wang & Hannafin, 2005). The frameworks described here are important because they identify a mindset (the SET framework) that contributes significantly to the on-going difficulty in bridging the e-learning reality/rhetoric chasm, and offers an alternate mindset (the BAD framework) that provides principles that can help bridge the chasm. The SET and BAD frameworks are broadly incommensurable ways of answering three important, inter-related questions about the implementation of e-learning. While the SET framework represents the most commonly accepted mindset used in practice, both frameworks are evident in both the literature and in practice. Table 1 provides an overview of both frameworks.

Table 1: The BAD and SET frameworks for e-learning implementation

What work gets done?
  SET: Strategy – following a global plan intended to achieve a pre-identified desired future state.
  BAD: Bricolage – local piecemeal action responding to emerging contingencies.

How ICT is perceived?
  SET: Established – ICT is a hard technology and cannot be changed. People and their practices must be modified to fit the fixed functionality of the technology.
  BAD: Affordances – ICT is a soft technology that can be modified to meet the needs of its users, their context, and what they would like to achieve.

How you see the world?
  SET: Tree-like – the world is relatively stable and predictable. It can be understood through logical decomposition into a hierarchy of distinct black boxes.
  BAD: Distributed – the world is complex, dynamic, and consists of interdependent assemblages of diverse actors (human and not) connected via complex networks.

What work gets done: Bricolage or Strategic

The majority of contemporary Australian universities follow a strategic approach to deciding what work gets done. Numerous environmental challenges and influences have led to universities being treated as businesses with an increasing prevalence of managers using “strategic control and a focus on outputs which can be quantified and compared” (Reid, 2009, p. 575) to manage academic activities. A strategic approach involves the creation of a vision identifying a desired future state and the development of operational plans to bring about the desired future state. The only work that is deemed acceptable is that which fits within the established operational plan and is seen to contribute to the desired future state. All other work is deemed inefficient. The strategic approach is evident at all levels of institutional e-learning. Inglis (2007) describes how the government required Australian universities to have institutional learning and teaching strategic plans published on their websites. The strategic or planning-by-objectives (e.g. learning outcomes, graduate attributes) approach also underpins how course design is largely assumed to occur, with Visscher-Voerman and Gustafson (2004) finding that it underpins “a majority of the instructional design models in the literature” (p. 77). The strategic approach is so ingrained that it is often forgotten that these ideas have not always existed (Kezar, 2001), have significant flaws, and that there is at least one alternate perspective.

Bricolage, “the art of creating with what is at hand” (Scribner, 2005, p. 297) or “designing immediately” (Büscher, Gill, Mogensen, & Shapiro, 2001, p. 23), involves the manipulation and creative repurposing of existing, and often unlikely, resources into new arrangements to solve a concrete, contextualized problem. Ciborra (1992) argues that bricolage – defined as the “capability of integrating unique ideas and practical design solutions at the end-user level” (p. 299) – is more important in developing organisational applications of ICT that provide competitive advantage than traditional strategic approaches. Scribner (2005) and other authors have used bricolage to understand the creative and considered repurposing of readily available resources that teachers use to engage in the difficult task of helping people learn. Bricolage is not without its problems. There are risks associated with extremes of both the strategic and bricolage approaches to how work gets done (Jones, Luck, McConachie, & Danaher, 2005). In the context of institutional e-learning, the problem is that at the moment the strategic is crowding out bricolage. For example, Groom and Lamb (2014) observe that the cost of supporting an enterprise learning tool (e.g. LMS) limits resources for user-driven innovation, in part because it draws “attention and users away” (n.p.) from the strategic tool (i.e. LMS). The demands of sustaining the large and complex strategic tool dominate priorities and lead to “IT organizations … defined by what’s necessary rather than what’s possible” (Groom & Lamb, 2014, n.p.). There would appear to be some significant benefit to exploring a dynamic and flexible interplay between the strategic and bricolage approaches to deciding what work gets done.

How ICT is perceived: Affordances or Established

The established view sees ICT as a hard technology (Dron, 2013). What can be done with hard technology is fixed in advance either by embedding it in the technology or “in inflexible human processes, rules and procedures needed for the technology’s operation” (Dron, 2013, p. 35). An example of this is the IT person quoted by Sturgess and Nouwens (2004) as suggesting in the context of an LMS evaluation process that “we should seek to change people’s behavior because information technology systems are difficult to change” (n.p). This way of perceiving ICTs assumes that the functionality provided by technology is established and cannot be changed. This creates the problem identified by Rushkoff (2010) where “instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery” (p. 15). Perhaps in no small way the established view of ICT in e-learning contributes to Dede’s (2008) observation that “widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant” (p. 58). The established view of ICT challenges Kay’s (1984) discussion of the “protean nature of the computer” (p. 59) as “the first metamedium, and as such has degrees of freedom and expression never before encountered” (p. 59). The problem is that digital technology is “biased toward those with the capacity to write code” (Rushkoff, 2010, p. 128) and increasingly those who can code have been focused on avoiding it.

The established view of ICT represents a narrow view of technological change and human agency. When unable to achieve a desired outcome, people will use the available knowledge and resources to create an alternative path; they will create a workaround (Koopman & Hoffman, 2003). For example, Hannon (2013) talks about the “hidden effort” (p. 175) of “meso-level practitioners – teaching academics, learning technologists, and academic developers” (p. 175) to bridge the gaps created by centralised technologies. The established view represents the designer-centred idea of achieving “perfect” software (Koopman & Hoffman, 2003), rather than recognising the need for on-going adaptation due to the diversity, complexity and on-going change inherent in university e-learning. The established view also ignores Kay’s (1984) description of the computer as offering “degrees of freedom and expression never before encountered” (p. 59). The established view does not leverage the affordance of ICT for change and freedom. Following Goodyear et al (2014), affordances are not a feature of a technology, but rather a relationship between the technology and the people using it. Within university e-learning the affordance for change has been limited due to both the perceived nature of the technology – best practice guidelines for integrated systems such as the LMS and ERP recommend vanilla implementation (Robey, Ross, & Boudreau, 2002) – and the people – the apparent low digital fluency of academics (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3). However, this is changing. There are faculty and students who are increasingly digitally fluent (e.g. the authors of this paper) and easily capable of harnessing the advent of technologies that “help to make bricolage an attainable reality” (Büscher et al., 2001, p. 24), such as the IMS LTI standards, APIs (Lane, 2014) and augmented browsing (Dai, Tsai, Tsai, & Hsu, 2011). An affordances perspective of ICT seeks to leverage the capacity for ICT to be manipulated so that it offers the best possible affordances for learners and teachers. It is a move away from the established “design of an artefact towards emergent design of technology-in-use, particularly by the users” (Johri, 2011, p. 212).

How you see the world: Distributed or Tree-like

The methods used to solve most of the large and complex problems that make up institutional e-learning rely upon a tree-like or hierarchical conception of the world. To manage a university it is broken up into a tree-like structure consisting of divisions, faculties, schools, and so on. The organisation of the formal learning and teaching done at the university relies upon a tree-like structure of degrees, majors/minors, courses or units, learning outcomes, weeks, lectures, tutorials, etc. The information systems used to enable formal learning and teaching mirror the tree-like structure of the organisation with separation into different systems responsible for student records, learning management, learning content management etc. The individual information systems themselves are broken up into tree-like structures reliant on modular design. These tree-like structures are the result of the reliance on methods that use analysis and logical decomposition to reduce larger complex wholes into smaller more easily understood and manageable parts (Truex, Baskerville, & Travis, 2000). These methods produce tree-like structures of independent, largely black-boxed components that interact through formally approved mechanisms that typically involve oversight or approval from further up the hierarchy. For example, a request for a new feature in an LMS must wend its way up the tree-like governance structure until it is considered at the institutional level, compared against institutional priorities and ranked against other requests, before possibly being passed down to the other organisational black-box that can fulfill that request. There are numerous limitations associated with tree-like structures. For example, Holt et al (2013) identify just one of these limitations when they argue that the growing complexity of institutional e-learning means that no one leader at the top of a hierarchical tree has the knowledge to “possibly contend with the complexity of issues” (p. 389).

The solution suggested by Holt et al (2013) is distributed leadership, which is in turn based on broader theoretical foundations of distributed cognition, social learning, as well as network and activity theories. A theoretical foundation that can be seen in a broad array of distributed ways of looking at the world. For example, in terms of learning, Siemens (2008) lists the foundations of connectivism as: activity theory; distributed and embodied cognition; complexity; and network theory. At the core of connectivism is the “thesis that knowledge is distributed across a network of connections and therefore learning consists of the ability to construct and traverse those networks” (Downes, 2011, n.p.). Johri (2011) links much of this same foundation to socio-materiality and suggests that it offers “a key theoretical perspective that can be leveraged to advance research, design and use of learning technologies” (p. 210). Podolny & Page (1998) apply the distributed view to governance and organisations and describe it as meaning that two or more actors are able to undertake repeated interactions over a period of time without having a centralised authority responsible for resolving any issues arising from those interactions. Rather than the responsibility and capability for specific actions being seen as belonging to any particular organisational member or group (tree-like), the responsibility and capability is distributed across a network of individuals, groups and technologies. The distributed view sees institutional e-learning as complex, dynamic, and interdependent assemblages of diverse actors (both human and not) distributed in complex networks.

It is our argument that being aware of the differences in thinking between the SET and BAD frameworks offers insight that can guide the design of interventions that are more likely to bridge the e-learning reality/rhetoric chasm. The following sections describe the development and adaptation of the Moodle Activity Viewer (MAV) at both CQUni and USQ as an example of what is possible when breaking BAD.

Breaking BAD and the development of MAV

The second author works for Learning and Teaching Services at CQUniversity (CQUni). In late 2012, he was working on a guide for teaching staff titled “How can I enhance my teaching practice?”. In contributing to the “Designing effective course structure” section of this guide, the author asked a range of rhetorical questions including “How do you know which resources your students access the most, and the least?”. Providing an answer to this question for the reader took more effort than expected. There are reports available in Moodle 2.2 (the version being used by CQUni at the time) that can be used to answer this question. However, they suffer from a number of limitations including: duplicated report names; unclear differences between reports; usage values include both staff and student activity; poor speed of generation; and, a tabular format. It was apparent that these limitations were acting as a barrier to reflection on course design. This was especially problematic, as the institution had placed increased emphasis on generating and responding to student feedback (CQUniversity, 2012). Annual course enhancement reports – introduced in 2010 – required teaching staff to respond to feedback from students and highlight enhancements to be made for the course’s next offering (CQUniversity, 2011). Information about activity and resource usage on the course Moodle site was seen by some to be useful in completing these reports. However, there was no apparent strategic or organisational imperative to address issues with the Moodle reports and it appeared likely that the aging version of Moodle (version 2.2) would persist for some time given other organisational priorities. As a stopgap solution the author and a colleague engaged in some bricolage and began writing SQL queries for the Moodle database and generating Excel spreadsheets. Whilst this approach provided more useful data, the spreadsheets were manually generated on request and the teaching staff had to bridge the conceptual gap between the information within the Excel spreadsheet and their Moodle course site.

In the months following, the author started thinking about a better approach. While CQUni had implemented a range of customisations to the institution’s Moodle instance, substantial changes required a clear understanding of the final requirements, alignment with strategic imperatives, and the support of senior management. At this stage of the process it was not overly clear what the final requirements of a solution would be, hence more experimentation was required to better understand the problem and possible solutions prior to making the case for modifying Moodle. While the author did not have the ability to change the institution’s version of Moodle itself, he did have access to a copy of the Moodle database, a server computer, and software development abilities. Any bridging of this particular gap would need to draw on available resources (bricolage) and not disturb or impact critical high-availability services such as Moodle. Given uncertainty about what functionality might best enable reflection on course design, any potential solution would also need to enable a significant level of agility and experimentation (bricolage).

The technical solution that seemed to best fulfill these requirements was augmented browsing. Dai et al (2011) define augmented browsing as “an effective means for dynamically adding supplementary information to a webpage without having users navigate away from the page” (p. 2418). The use of augmented browsing to add functionality to an LMS is not new. Leony et al (2012) created a browser add-on that embeds learning analytics graphs directly within the Moodle LMS course home page. Dawson et al (2011) used what are known as bookmarklets to generate interactive sociograms that visualise student learning networks as part of SNAPP. The problems that drove SNAPP’s use of augmented browsing – complex and difficult to interpret LMS reports and the difficulty of getting suggestions from teaching staff integrated into an institutional LMS (Dawson et al., 2011) – mirror those faced at CQU.

Through a process of bricolage the Moodle Activity Viewer (MAV) was developed as an add-on for the Firefox web browser. More specifically, the MAV is built upon another popular Firefox add-on called Greasemonkey, and in Greasemonkey terms MAV is known as a userscript.  However, for the purposes of this paper, the MAV will be referred to more generally as an add-on to the browser. The intent was that the MAV would generate a heat map and embed it directly onto any web page produced by Moodle. A heat map shades each of the links in a web page with a spectrum of colours where the deeper red shades indicate links that are being clicked on more often (see Figure 1). The implementation of the MAV is completely separate from the institutional Moodle instance meaning its use has no impact on the production Moodle environment. Once the MAV add-on is installed into Firefox, and with it turned on, any web page from a Moodle course site can have a heat map overlaid on all Moodle links in that page. This process starts with the MAV add-on recognising a newly loaded page as belonging to a Moodle course site. When this occurs the MAV will generate a query asking for usage figures associated with every relevant Moodle link on that web page. This query is sent to the MAV server hosted on an available server computer. The MAV server translates the query into appropriate queries that will extract the necessary information from the Moodle database. As implemented at CQU, the MAV server relies on a copy of the Moodle database that is updated daily. While not necessary, use of a copy of the Moodle database ensures that there is no risk of disrupting the production Moodle instance.
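In outline, the client side of that process might look like the sketch below. It is illustrative only (the actual MAV source is on GitHub): the server URL, query format and JSON response shape are assumptions, and the real add-on is a Greasemonkey userscript rather than the plain fetch() call used here for brevity.

```javascript
// Sketch of the MAV heat map overlay. Assumptions: the MAV server answers
// GET <server>/usage?mode=...&ids=... with JSON of the form {"<cmid>": count}.
function applyHeatMap(mode) {  // mode: 'clicks' or 'students'
  // 1. Collect the activity/resource links on the Moodle course page.
  //    Moodle module links look like /mod/<type>/view.php?id=<cmid>.
  var links = Array.prototype.slice.call(
    document.querySelectorAll('a[href*="/mod/"][href*="view.php"]'));
  var ids = links.map(function (a) {
    return new URL(a.href).searchParams.get('id');
  });

  // 2. Ask the MAV server for usage figures for those links.
  fetch('http://mav.example.edu/usage?mode=' + mode + '&ids=' + ids.join(','))
    .then(function (resp) { return resp.json(); })
    .then(function (usage) {
      var max = Math.max.apply(null, ids.map(function (id) {
        return usage[id] || 0;
      }).concat([1]));
      // 3. Shade each link: deeper red means relatively heavier usage.
      links.forEach(function (a, i) {
        var count = usage[ids[i]] || 0;
        a.style.backgroundColor =
          'rgba(255, 0, 0, ' + (0.15 + 0.6 * count / max).toFixed(2) + ')';
        a.title = count + ' ' + mode;  // raw figure as a tooltip
      });
    });
}
applyHeatMap('clicks');
```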

The MAV add-on can be configured to generate overlays based on the number of clicks on a link, or the number of students who have clicked on a link. It can also be configured to limit the overlays to particular groups of students or to a particular student. When used on the main course page, MAV provides an overview of how students are using all of the course resources. Looking at a discussion forum page with the MAV enabled allows the viewer to analyse which threads or messages are receiving the most attention. Hence MAV can provide a simple form of process analytics (Lockyer, Heathcote, & Dawson, 2013).
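On the server side, the difference between these configurations largely reduces to how the usage figure is aggregated. A sketch of the query building follows, assuming the daily copy retains Moodle 2.x’s standard log table (mdl_log, with course, cmid and userid columns); everything else is illustrative rather than the actual MAV implementation.

```javascript
// Sketch: build the SQL the MAV server might run against the copied Moodle
// database. mdl_log and its columns follow Moodle 2.x conventions; the
// function itself is an illustration, not the actual MAV code.
function buildUsageQuery(mode, withUserFilter) {
  // 'clicks' counts every hit on a link; 'students' counts the distinct
  // users who clicked at least once.
  var measure = (mode === 'students') ? 'COUNT(DISTINCT userid)' : 'COUNT(*)';
  var sql = 'SELECT cmid, ' + measure + ' AS usage ' +
            'FROM mdl_log WHERE course = $1';
  if (withUserFilter) {
    // Restrict the overlay to a group of students, or a single student.
    sql += ' AND userid = ANY($2)';
  }
  return sql + ' GROUP BY cmid';
}

// e.g. buildUsageQuery('students', false) gives the query behind the
// "number of students who clicked" overlay for a whole course.
```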

An initial proof-of-concept implementation of the MAV was developed by April 2013. A few weeks later this implementation was demonstrated to the “Moodle 2 Project Board” to seek approval to continue development. The plan was to engage in small trials with academic staff and evolve the tool. The intent was that this would generate a blueprint for the implementation of heat maps within Moodle itself. The low-risk nature of the approach contributed to approval to continue. However, by July 2013, the institution had downsized through an organisational restructure and resources in the IT department were subsequently reduced. As part of this restructure, and in an effort to reduce costs, the IT department set out to reduce the level of in-house systems development in favour of more established “vanilla” systems (off-the-shelf with limited or no customisations). This new strategy made it unlikely that the MAV would be re-implemented directly within Moodle, and meant that the augmented browsing approach might be the more viable longer-term option. As the MAV was being developed and refined, it was tested by a small group of teaching staff within the creator’s team. Then in September 2013, the first official trial was launched, making the MAV available to all staff within one of CQUniversity’s schools.

Figure 1: How MAV works [image: “How MAV works” by David T Jones, on Flickr]

Early in March 2012, prior to the genesis of the MAV, the second author and a colleague developed a proposal for a student retention project. It was informed by ongoing research into learning analytics at the institution and motivated by a strategic institutional imperative to improve student retention (CQUniversity, 2011). It was not until October 2013 – after the commencement of the first trial of the MAV – that a revised version of the proposal received final approval and the project commenced in November under the name EASICONNECT. Part of the EASICONNECT project was the inclusion of an early alerts system called EASI (Early Alert Student Indicators), designed to identify disengaged students early and provide simple tools to nudge them to re-engage, with the hope of improving student retention. In 2013, between the proposal submission and final approval of the EASICONNECT project, EASI was created as a proof-of-concept under a different name (Student Support Indicators – SSI) and used in a series of small term-based trials, evolving similarly to the MAV. One of the amendments made to the approved proposal by the project sponsor (management) was the inclusion of the MAV as a project deliverable in the EASICONNECT project.

Neither EASI nor the MAV was strictly the result of a strategic plan. Both systems arose from bricolage undertaken by two members of CQUni’s Learning and Teaching Services that was later recognised as contributing to the strategic aims of the institution. With the eventual approval of the EASICONNECT project, the creators of EASI and the MAV worked more closely together on these tools and the obvious linkages between them were developed further. Initially this meant modifying the MAV so staff participating in the EASI trial could easily navigate from the MAV to EASI. In Term 1, 2014 EASI introduced links for each student in a course that, when clicked, would open the Moodle course site with the MAV enabled only for the selected student. While EASI showed a summary of the number of clicks made by the student in the course site, the MAV could then contextualise this information, revealing where those clicks took place directly within Moodle. In Term 2, 2014 a feature often requested by teaching staff was added to the MAV that would identify students who had and hadn’t clicked on links. The MAV also provided an option for staff to open EASI to initiate an email nudge to either group of students. Figure 2 provides a comparison of week-to-week usage of the MAV between Terms 1 and 2 of 2014. The graphs show usage in terms of the number of page views and number of staff using the system, with the Term 2 figures running up until the end of Week 10 (of 15).

Both MAV and its sister project EASI were initiated as a form of bricolage. It was only later that both projects enjoyed the synthesised environment of a strategic project that provided the space and institutional permission for this work to scale and continue to merge. MAV arose due to the limited affordances offered by the LMS and the promise that different ICT could be harnessed to enhance the perceived affordances. Remembering that affordances are not something innate to a tool, but instead co-constitutive between tool, user and context, the on-going use of bricolage allowed the potential affordances of the tool to evolve in response to use by teaching staff. Through this approach MAV has been able to evolve from potentially offering affordances of value to teaching staff as part of “design for reflection and redesign” (Dimitriadis & Goodyear, 2013) to also offering potential affordances for “design for orchestration” (Dimitriadis & Goodyear, 2013).

Figure 2: 2014 MAV usage at CQUni, comparison between T1 and T2 [images: “MAV Usage – page views” and “MAV usage – # staff” by David T Jones, on Flickr]

Implementing MAV as a browser add-on also enables a break from the tree-like conceptions that underpin the design of large integrated systems like an LMS. The tree-like conception is so evident in the Moodle LMS that it is visible in the name: Moodle is an acronym for Modular Object-Oriented Dynamic Learning Environment. “Modular” captures the fact that “Moodle is built in a highly modular fashion” (Dougiamas & Taylor, 2003, p. 173), meaning that logical decomposition is used to break the large integrated system into small components or modules. This modular architecture allows the rapid development and addition of independent plugins and is a key enabler of the flexibility of Moodle. However, it relies on each of the modules being largely independent of each other, which has the consequence of making it more difficult to provide functionality that crosses modular boundaries, such as taking usage information from the logging system and integrating that information into all of the modules that work together to produce a web page generated by Moodle.

Extending MAV at another institution

In 2012 the first author commenced work within the Faculty of Education at the University of Southern Queensland (USQ). The majority of the allocated teaching load involved two offerings of EDC3100, ICTs and Pedagogy. EDC3100 is a large (300+ on-campus and online students in first semester, and ~100 totally online in second semester) core, third-year course for Bachelor of Education (BEdu) students. The author expected that USQ would have high quality systems and processes to support large, online courses. This was due to USQ’s significant reputation in the practice and research of distance and online education; its then stated vision “To be recognised as a world leader in open and flexible higher education” (USQ, 2012, p. 5); and the observation that “by 2012 up to 70% of students in the Bachelor of Education were studying at least some subjects online” (Albion, 2014, p. 1163). The experience of teaching EDC3100 quickly revealed an e-learning reality/rhetoric chasm.

As a core course, EDC3100 has students studying at all of USQ’s campuses, at a Malaysian partner, and online from across Australia and the world. The students are studying to become teachers in early childhood, primary, secondary and VET settings. The course is designed so that the “Study Desk” (the Moodle course site) is an essential source of information and support for all students. The course design makes heavy use of discussion forums for a range of learning activities. Given the size and diversity of the student population there are times when it is beneficial for teaching staff to customise their responses to the student’s context and specialisation. For instance, an example from the Australian Curriculum may be appropriate for a primary or lower secondary pre-service teacher based in Australia, but inappropriate for a VET pre-service teacher. Whilst the Moodle discussion forum draws on user profiles to identify authors of posts, the available information is limited to that provided centrally via the institution and by the users. For EDC3100 this means that a student’s campus is apparent through their membership of the Moodle groups automatically created by USQ’s systems; however, seeing this requires navigating away from the discussion forum. The student’s specialisation is not visible in Moodle. The only way this information is available is to ask an administrative staff member with the appropriate student records access to generate a spreadsheet (and then update the spreadsheet as students add and drop the course) that includes this specific information. The lack of easy access to this information constrains the ability of teaching staff to effectively intervene.

One explanation for the existence of this gap is the limitations of the SET approach to institutional e-learning systems. The tree-based practice of logical decomposition results in distinct tasks – such as the management of student demographic and enrolment data (Peoplesoft), and the practice of online learning (Moodle) – being supported by different information systems with different data models and owned by different organisational units. Logical decomposition allows each of these individual systems and their owners to focus on the efficiency of their primary task. However, it comes at the cost of making it more difficult to both recognise and respond to requirements that go across the tasks (e.g. teaching). It is even more difficult when the requirement is specific to a subset of the organisation. For example, ensuring that information about the specialisation of BEdu students is evident in Moodle is only of interest to some of the staff teaching into the BEdu. Even if this barrier could be overcome, modifying the Moodle discussion forum to make this type of information more visible would be highly unlikely due to the cost, difficulty and (quite understandable) reluctance to make changes to enterprise software inherent in the established view of technology.

To address this need the MAV add-on was modified to recognise USQ Moodle web pages that contain links to student profiles (e.g. a forum post). On recognising such a page, the modified version of MAV queries a database populated using the manually provided spreadsheet described above. MAV uses that information to add to each student profile link a popup dialog that provides student information such as specialisation and campus without leaving the page. Adding different information (e.g. activity completion, GPA etc.) to this dialog can proceed without the approval of any centralised authority. The MAV server and the database run on the author’s laptop and the author has the skill to modify the database and write new code for both the MAV server and client. As such it’s an example of Podolny and Page’s (1998) distributed approach to governance. The only limitation is whether or not the necessary information can be retrieved in a format that can be easily imported into the database.

Conclusions, implications and future work

Future work will focus on continuing an on-going cycle of design-based research exploring how, and with what impacts, the BAD framework can be fruitfully integrated into the practice of institutional e-learning. To aid this process we are exploring how MAV, its various modifications, and descendants can be effectively developed and shared within and between institutions. As a first step, the CQU MAV code has been released on GitHub (https://github.com/damoclark/mav), development is occurring in the open, and interested collaborators are welcome. A particular interest is in exploring and evaluating the use of MAV to implement scaffolding and context-sensitive conglomerations. Proposed in Jones (2012), a conglomeration seeks to enhance the affordances offered by any standard e-learning tool (e.g. a discussion forum) with a range of additional and often contextually specific information and functionality. Both uses of MAV described above are simple examples of a conglomeration. Of particular interest is whether these conglomerations can be used to explore whether Goodyear’s (2009) idea that “research-based evidence and the fruits of successful teaching experience can be embodied in the resources that teachers use at design time” can be extended to institutional e-learning tools.

Perhaps the biggest challenge to this work arises from the observation that the SET framework forms the foundation for current institutional practice and that the SET and BAD frameworks are largely incommensurable. At CQU, MAV has benefited from recognition and support of senior management; yet, it still challenges the assumptions of those operating solely through the SET framework. The incommensurable nature of the SET and BAD frameworks implies that any attempts to fruitfully merge the two will need to deal with existing, and sometimes strongly held, assumptions and mindsets. For example, rather than require the IT division to formally approve and develop all applications of ICT, their focus should perhaps turn (at least in part) to enabling and encouraging “ways to make work-arounds easier for users to create, document and share” (Koopman & Hoffman, 2003, p. 74) through organisational “settings, and systems … arranged so that invention and prototyping by end-users can flourish” (Ciborra, 1992, p. 305). Similarly, rather than academic staff development focusing on ensuring that the appropriate knowledge is embedded in the heads of teaching staff (e.g. formal teaching qualifications), there should be a shift to a focus on ensuring that the appropriate knowledge is embedded within the network of actors – both people and artefacts – distributed within and perhaps outside the institution. Rather than accept “the over-hyped, pre-configured digital products and practices that are being imported continually into university settings” (Selwyn, 2013, p. 3), perhaps universities should instead recognise that “a genuine grassroots interest needs to be developed in the co-creation of alternative educational technologies. In short, mass participation is needed in the development of ‘digital technology for university educators by university educators’” (p. 3).

Biggs (2012) conceptualises the job of a teacher as being responsible for creating a learning context in which “all students are more likely to use the higher order learning processes which ‘academic’ students use spontaneously” (p. 39). If this perspective is taken one step back, then it is the responsibility of a university to create an institutional context in which all teaching staff are more likely to create the type of learning context which ‘good’ teachers create spontaneously. The on-going existence of the e-learning reality/rhetoric chasm suggests many universities are yet to achieve this goal. This paper has argued that this is due in part to the institutional implementation of e-learning being based on a limited SET of theoretical conceptions. The paper has compared the SET framework with the BAD framework and argued that the BAD framework provides a more promising theoretical foundation for bridging this chasm. It has illustrated the strengths and weaknesses of these two frameworks through a description of the origins and on-going use of the Moodle Activity Viewer (MAV) at two institutions. The suggestion here is not that institutions should see the BAD framework as a replacement for the SET framework, but rather that they should engage in some bricolage and explore how contextually appropriate mixtures of both frameworks can help bridge their e-learning reality/rhetoric chasm. Perhaps universities need to break a little BAD?

References

Albion, P. (2014). From creation to curation: Evolution of an authentic ‘assessment for learning’ task. In M. Searson & M. Ochoa (Eds.), Society for Information Technology & Teacher Education International Conference (pp. 1160-1168). Chesapeake, VA: AACE.

Biggs, J. (2012). What the student does: teaching for enhanced learning. Higher Education Research & Development, 31(1), 39-55. doi:10.1080/07294360.2012.642839

Büscher, M., Gill, S., Mogensen, P., & Shapiro, D. (2001). Landscapes of practice: bricolage as a method for situated design. Computer Supported Cooperative Work, 10(1), 1-28.

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297-309.

CQUniversity. (2011). CQUniversity Annual Report 2010 (p. 136). Rockhampton.

CQUniversity. (2012). CQUniversity Annual Report 2011 (p. 84). Rockhampton.

Dai, H. J., Tsai, W. C., Tsai, R. T. H., & Hsu, W. L. (2011). Enhancing search results with semantic annotation using augmented browsing. IJCAI Proceedings – International Joint Conference on Artificial Intelligence, 22(3), 2418-2423.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). "Seeing" networks: visualising and evaluating student learning networks. Final Report 2011. Canberra: Australian Learning and Teaching Council.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43-62). New York: Springer.

Dimitriadis, Y., & Goodyear, P. (2013). Forward-oriented design for learning: illustrating the approach. Research in Learning Technology, 21, 1-13. Retrieved from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/20290

Downes, S. (2011). “Connectivism” and Connective Knowledge. Retrieved from http://www.huffingtonpost.com/stephen-downes/connectivism-and-connecti_b_804653.html

Dron, J. (2013). Soft is hard and hard is easy: learning technologies and social media. Form@re – Open Journal per la Formazione in Rete, 13(1), 32-43. Retrieved from http://fupress.net/index.php/formare/article/view/12613

Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conference of The International Business Schools Computing Association. Baltimore, MD.

Goodyear, P. (2009). Teaching, technology and educational design: The architecture of productive learning environments (pp. 1-37). Sydney. Retrieved from http://www.olt.gov.au/system/files/resources/Goodyear%2C P ALTC Fellowship report 2010.pdf

Goodyear, P., Carvalho, L., & Dohn, N. B. (2014). Design for networked learning: framing relations between participants’ activities and the physical setting. In S. Bayne, M. de Laat, T. Ryberg, & C. Sinclair (Eds.), Ninth International Conference on Networked Learning 2014 (pp. 137-144). Edinburgh, Scotland. Retrieved from http://www.networkedlearningconference.org.uk/abstracts/pdf/goodyear.pdf

Groom, J., & Lamb, B. (2014). Reclaiming innovation. EDUCAUSE Review, 1-12. Retrieved from http://www.educause.edu/visuals/shared/er/extras/2014/ReclaimingInnovation/default.html

Hannon, J. (2013). Incommensurate practices: sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29(2), 168-178. doi:10.1111/j.1365-2729.2012.00480.x

Holt, D., Palmer, S., Munro, J., Solomonides, I., Gosper, M., Hicks, M., … Hollenbeck, R. (2013). Leading the quality management of online learning environments in Australian higher education. Australasian Journal of Educational Technology, 29(3), 387-402. Retrieved from http://www.ascilite.org.au/ajet/submission/index.php/AJET/article/view/84

Inglis, A. (2007). Approaches taken by Australian universities to documenting institutional e-learning strategies. In R. J. Atkinson, C. McBeath, S.K. Soong, & C. Cheers (Eds.), ICT: Providing Choices for Learners and Learning. Proceedings ASCILITE Singapore 2007 (pp. 419-427). Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/inglis.pdf

Johnson, L., Adams Becker, S., Cummins, M., & Estrada, V. (2014). 2014 NMC Technology Outlook for Australian Tertiary Education: A Horizon Project Regional Report. Austin, Texas. Retrieved from http://www.nmc.org/publications/2014-technology-outlook-au

Johri, A. (2011). The socio-materiality of learning practices and implications for the field of learning technology. Research in Learning Technology, 19(3), 207-217. Retrieved from http://researchinlearningtechnology.net/coaction/index.php/rlt/article/view/17110

Jones, D. (2012). The life and death of Webfuse: principles for learning and leading into the future. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings ascilite Wellington 2012 (pp. 414-423). Wellington, NZ.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. In 17th Biennial Conference of the Open and Distance Learning Association of Australia. Adelaide.

Kay, A. (1984). Computer Software. Scientific American, 251(3), 53-59.

Kezar, A. (2001). Understanding and Facilitating Organizational Change in the 21st Century: Recent Research and Conceptulizations. ASHE-ERIC Higher Education Report, 28(4).

Kirkwood, A., & Price, L. (2013). Technology-enhanced learning and teaching in higher education: what is “enhanced” and how do we know? A critical literature review. Learning, Media and Technology, (August), 1-31. doi:10.1080/17439884.2013.770404

Koopman, P., & Hoffman, R. (2003). Work-arounds, make-work and kludges. Intelligent Systems, IEEE, 18(6), 70-75.

Lane, K. (2014). The University of API (p. 28). Retrieved from http://university.apievangelist.com/white-paper.html

Laxon, A. (2013, September 14). Exams go online for university students. The New Zealand Herald.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439-1459. doi:10.1177/0002764213479367

McKenney, S., & Reeves, T. C. (2013). Systematic Review of Design-Based Research Progress: Is a Little Knowledge a Dangerous Thing? Educational Researcher, 42(2), 97-100. doi:10.3102/0013189X12463781

OECD. (2005). E-Learning in Tertiary Education: Where do we stand? (p. 289). Paris, France: Centre for Educational Research and Innovation, Organisation for Economic Co-operation and Development. Retrieved from http://new.sourceoecd.org/education/9264009205

Podolny, J., & Page, K. (1998). Network forms of organization. Annual Review of Sociology, 24, 57-76.

Rahman, N., & Dron, J. (2012). Challenges and opportunities for learning analytics when formal teaching meets social spaces. In 2nd International Conference on Learning Analytics and Knowledge (pp. 54-58). Vancourver, British Columbia: ACM Press. doi:10.1145/2330601.2330619

Reid, I. C. (2009). The contradictory managerialism of university quality assurance. Journal of Education Policy, 24(5), 575-593. doi:10.1080/02680930903131242

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Rushkoff, D. (2010). Program or be programmed: Ten commands for a digital age. New York: OR Books.

Salmon, G. (2005). Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions. ALT-J, Research in Learning Technology, 13(3), 201-218.

Scribner, J. (2005). The problems of practice: Bricolage as a metaphor for teachers’ work and learning. Alberta Journal of Educational Research, 51(4), 295-310. Retrieved from http://ajer.journalhosting.ucalgary.ca/ajer/index.php/ajer/article/view/587

Selwyn, N. (2008). From state‐of‐the‐art to state‐of‐the‐actual? Introduction to a special issue. Technology, Pedagogy and Education, 17(2), 83-87. doi:10.1080/14759390802098573

Selwyn, N. (2012). Social media in higher education. The Europa World of Learning. Retrieved from http://www.educationarena.com/pdf/sample/sample-essay-selwyn.pdf

Selwyn, N. (2013). Digital technologies in universities: problems posing as solutions? Learning, Media and Technology, 38(1), 1-3. doi:10.1080/17439884.2013.759965

Siemens, G. (2008). What is the unique idea in Connectivism? Retrieved July 13, 2014, from http://www.connectivism.ca/?p=116

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3). Retrieved from http://tojde.anadolu.edu.tr/tojde15/articles/sturgess.htm

Truex, D., Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53-79.

USQ. (2012). University of Southern Queensland 2011 Annual Report. Toowoomba. doi:10.1037/e543872012-001

Visscher-Voerman, I., & Gustafson, K. (2004). Paradigms in the theory and practice of education and training design. Educational Technology Research and Development, 52(2), 69-89.

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5-23.

Weimer, M. (2007). Intriguing connections but not with the past. International Journal for Academic Development, 12(1), 5-8.

Zellweger, F. (2005). Strategic Management of Educational Technology: The Importance of Leadership and Management. Riga, Latvia.

Searching for a phrase and some research

This is a plea for help. I’m certain I remember a particular phrase/concept that arose from some research around educational technology from 10+ years ago (may have been as long as 30 years ago).

It was a phrase/concept used to look critically at the tendency for education to create special "education" versions of real software, i.e. rather than use a standard bit of software – e.g. Word (*shudder*) – students would have to use a word processor made specially for education (mostly K-12, I believe).

Does this ring any bells for you? Can you point me in a useful direction?

Thanks.

You want digitally fluent faculty?

The 2014 Horizon Report for Higher Education has identified the “Low Digital Fluency of Faculty” as the number 1 “significant challenge impeding higher education technology adoption”. I have many problems with this, but the image below captures my main problem.

You want digitally fluent faculty? by David T Jones, on Flickr

As a fairly digitally fluent faculty member I have yet to work for an institution of higher education that is able to deal with digitally fluent faculty. I’ve spent the last 20+ years banging my head against the digital illiteracies of higher education institutions. So to hear that the low digital fluency of faculty is seen as the #1 challenge impeding technology adoption is really rather aggravating.

(And I do know that Nicholson’s character didn’t actually say both lines)

Three paths for learning analytics and beyond: Moving from rhetoric to reality

Paper accepted to ASCILITE’2014 and nominated for best paper.

Beer, C., Tickner, R., & Jones, D. (2014). Three paths for learning analytics and beyond : moving from rhetoric to reality. In B. Hegarty, J. McDonald, & S. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014 (pp. 242–250).

Abstract

There is growing rhetoric about the potential of learning analytics in higher education. There is also concern about what the growing hype around learning analytics will mean for the reality. Will learning analytics be a repeat of past mistakes, where technology implementations fail to move beyond a transitory fad and provide meaningful and sustained contributions to learning and teaching? How can such a fate be avoided? This paper identifies three paths that learning analytics implementations might take, with particular consideration to their likely impact on learning and teaching. An ongoing learning analytics project – currently used by hundreds of teaching staff to support early interventions to improve student retention – at a regional Australian university is examined in relation to the three paths, and some implications, challenges and future directions are discussed.

Keywords: learning analytics, learning, teaching, data, complexity, bricolage

Introduction

The delivery of distance education via the Internet is the fastest growing segment of adult education (Carr-Chellman, 2004; Macfadyen & Dawson, 2010) and there is considerable pressure for institutions to 'join the herd'. Burgeoning demand for university places, increased competition between universities, and globalisation coupled with reduced public funding are driving universities to expend time and resources on e-learning (Ellis, Jarkey, Mahony, Peat, & Sheely, 2007). There is, however, evidence to suggest that the ubiquitous adoption of learning management systems (LMS) to meet institutional e-learning needs has constrained innovation and negatively impacted the quality of the learning experience (Alexander, 2001; Paulsen, 2002). This has contributed to a gap between the rhetoric around the virtues of e-learning and the complicated reality of the e-learning 'lived experience'. Increasingly, the adoption of technology by universities is being driven by a search for a panacea that will bridge this gap, and is showing a tendency toward faddism.

Managerial faddism or hype is the tendency of people to eagerly embrace the newest fad or technology of the moment and to see problems as being largely solvable (or preventable) through better or more 'rational' management (Goldfinch, 2007). Birnbaum (2001) says of managerial fads: "they are usually borrowed from other settings, applied without full consideration of their limitations, presented either as complex or deceptively simple, rely on jargon, and emphasize rational decision making" (p. 2). Maddux and Cummings (2004) suggest that the use of information technology in higher education has been "plagued by fad and fashion since its inception" (p. 514). It is argued that management hype cycles are propagated by the top-down, teleological approaches that dominate technology innovation, and indeed management, in higher education (Duke, 2001). Given the higher education sector's disposition to adopting technological concepts based on hype and apparent rationality (Duke, 2001), there is a danger that the implementation of emerging technology-related concepts, such as learning analytics (LA), will fail to make sustained and meaningful contributions to learning and teaching (L&T).

The aim of this paper is to explore how LA can avoid becoming yet another fad by analysing the likely implementation paths institutions might take. The paper starts by examining what we now know about LA, for evidence that LA appears to be in the midst of a hype cycle that is likely to impede its ability to provide a sustained and meaningful contribution to L&T. The paper then examines some conceptual and theoretical frameworks around hype cycles, technology implementation, complex systems and models of university learning. These frameworks form the basis for identifying and analysing three likely paths universities might take when implementing LA. CQUniversity's recent experience with a LA project that aims to assist with student retention is drawn upon to compare and contrast these paths, before implications and future work are presented.

What we know about learning analytics

Johnson et al. (2013) define Learning Analytics (LA) as the collection and analysis of data in education settings in order to inform decision-making and improve L&T. Siemens and Long (2011) define LA as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (p. 34). Others have said, "learning analytics provides data related to students' interactions with learning materials which can inform pedagogically sound decisions about learning design" (Fisher, Whale, & Valenzuela, 2012, p. 9). Definitions aside, it can be said that the widespread use of technology in higher education has allowed the capture of detailed data on events that occur within learning environments (Pardo, 2013). The explosion in the use of digital technologies within L&T has contributed to the sector's growing interest in LA, due to the ability of technology to generate digital trails (Siemens, 2013a) that can be captured and analysed. These digital trails have the potential to inform L&T practices in a variety of ways.
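To make the idea of these digital trails a little more concrete, the following is a minimal sketch – assuming a hypothetical CSV export of Moodle click events with student_id, timestamp and event_type columns, rather than any institution's actual schema – of how such trails might be aggregated into per-student weekly activity counts, one of the simplest analyses the definitions above imply.

```python
import csv
from collections import defaultdict
from datetime import datetime

def weekly_activity(event_log_path):
    """Count each student's LMS events per ISO week.

    Expects a CSV with columns: student_id, timestamp (ISO 8601), event_type.
    Returns {student_id: {iso_week: event_count}}.
    """
    counts = defaultdict(lambda: defaultdict(int))
    with open(event_log_path, newline="") as f:
        for row in csv.DictReader(f):
            week = datetime.fromisoformat(row["timestamp"]).isocalendar()[1]
            counts[row["student_id"]][week] += 1
    return counts

# Hypothetical usage:
# activity = weekly_activity("moodle_events.csv")  # invented file name
# print(activity["s0123456"])  # e.g. {29: 12, 30: 4}
```

Even a toy aggregation like this hints at the interpretation problem discussed below: the counts say nothing by themselves about learning, context or cause.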

It is said that LA can contribute to course design, student success, faculty development, predictive modelling and strategic information (Diaz & Brown, 2012). Others say that LA can identify students at risk, highlight student learning needs, aid reflective practice and enable teachers to appropriately tailor instruction, among other things (Johnson et al., 2013). Reports abound identifying LA as a key future trend in L&T, with many predicting its rise to mainstream practice in the near future (Johnson et al., 2013; Lodge & Lewis, 2012; New Media Consortium, 2012; Siemens, 2011). Siemens and Long (2011) typify this rhetoric when they say that LA "is essential for penetrating the fog that has settled over much of higher education" (p. 40). While LA still has a long way to go to live up to this expectation, the prospect of evidence-informed practice and improved data services in an increasingly competitive and online higher education marketplace is fuelling institutions' interest in LA.

There are many reasons for the normative pull towards improved data services in higher education. Administrators demand data to support resource and strategic planning; faculty and administrators are hungry for information that can assist institutions with student recruitment and retention; and external agencies, such as governments, require a range of data indicators about institutional performance (Guan, Nunez, & Welsh, 2002). Prior to the emergence of LA, this desire for improved data services had in many cases led to the adoption of data warehouses by universities. A data warehouse is "a subject-oriented, integrated, non-volatile and time-variant collection of data in support of management's decisions" (Inmon, 2002). Data warehouses grew out of decision support systems and their use has escalated over recent years with the increasing volumes and varieties of data being collected by institutions (Guan et al., 2002). Unfortunately, and despite large volumes of data, data warehouses suffer from high failure rates and limited use (Goldfinch, 2007).

It has been said that a majority of information systems (IS) fail, and that the larger the development, the more likely it is to fail (Goldfinch, 2007). While there are many potential reasons for IS project failure, managerial faddism, management approaches and immense complexity are shown to be significant factors (Goldfinch, 2007). These factors are of particular concern for LA, due to a range of underlying complexities and the 'contextuality' of what LA is representing (Beer, Jones, & Clark, 2012). Managerial faddism and management approaches to technology adoption can constrain an implementation's ability to deal with complexity, as solutions are often presented as universally applicable 'quick fixes' (Birnbaum, 2001). This is a concern for LA, as there is evidence to suggest that it is currently in the midst of a hype cycle.

The learning analytics hype

In observing the growing interest and attempted implementations of learning analytics within Australian universities, it is increasingly apparent that learning analytics is showing all the hallmarks of a management fashion or fad (Jones, Beer, & Clark, 2013). Fads are innovations that appear to be rational and functional, and are aimed at encouraging better institutional performance (Gibson & Tesone, 2001). Environmental factors such as increasing competition, regulation and turbulence contribute to the development of fads where there is an overwhelming desire to ‘be part of the in crowd’ (Gibson & Tesone, 2001). Fads often ‘speak to managers’ in that they appear to be common-sense and appeal to organisational rationality around efficiency and effectiveness, which makes counter-argument difficult (Birnbaum, 2001). Learning analytics talks strongly to managerialism due to its potential to facilitate data-driven decision-making and to complement existing institutional business intelligence efforts (Beer et al., 2012). Once an innovation such as LA achieves a high public profile, it can create an urgency to ‘join the bandwagon’ that swamps deliberative, mindful behaviour (Swanson & Ramiller, 2004).

The Horizon Project is an on-going collaborative research effort between the New Media Consortium and various partners to produce annual reports intended to help inform education leaders about significant developments in technology in higher education. LA has been mentioned in the Horizon Project's reports in some form for the last five years. In the 2010 and 2011 reports, visual data analysis (Johnson, Levine, Smith, & Stone, 2010) and then learning analytics (Johnson et al., 2011) were placed in the four to five year time frame for widespread adoption. In 2012 and 2013, perhaps as a sign of developing hype, LA moved to '1 year or less until widespread adoption'. However, in a 2014 report (Johnson, Adams Becker, Cummins, & Estrada, 2014), predictions about the widespread adoption of learning analytics moved back to the 2 to 3 year time frame. Johnson et al. (2014) explain that this longer time frame is not because "learning analytics has slowed in Australian tertiary education" (p. 2), but because new aspects of learning analytics add "more complexity to the topic that will require more time to explore and implement at scale" (p. 2). Could this perhaps echo Birnbaum's (2001) earlier observation that fads are often presented as complex or deceptively simple? During a trip to Australia in 2013, George Siemens, a noted international scholar in the LA arena, said: "I'm not familiar with (m)any universities that have taken a systems-level view of LA… Most of what I've encountered to date is specific research projects or small deployments of LA. I have yet to see a systemic approach to analytics use/adoption" (Siemens, 2013b).

The gathering hype around LA (Jones et al., 2013) appears to be following a similar trend to that seen in the business world around the concept of "big data" – the analysis and use of large datasets in business. Universities also followed the business world in the widespread adoption of data warehouse technology for decision support (Ramamurthy, Sen, & Sinha, 2008). While data warehouses have been around for some time, they have been plagued by high failure rates and limited spread or use (Ramamurthy et al., 2008). This is indicative of a larger trend in industry, where "the vast majority of big data and magical business analytics project fail. Not in a great big system-won't-work way… They fail because the users don't use them" (Schiller, 2012). The adoption of these technologies appears to be perilous even when the rush to adoption is not being driven by hype. If learning analytics is showing all the signs of being yet another fad, what steps can organisations take to avoid this outcome? The following section describes some theoretical frameworks that are drawn upon to help identify potential paths.

Theoretical frameworks

Hype cycles characterise a technology's typical progression from emerging technology to either productive use or disappointment (Linden & Fenn, 2003). Hype cycles have been linked to a recognition that imitation is often the driving force behind the diffusion of any technological innovation (Ciborra, 1992). Birnbaum (2001) suggests that technology hype cycles start with a technological trigger, which is followed by a growing revolution and the rapid expansion of narrative and positivity around the technology. Then comes minimal impact, where enthusiasm for the technology starts to wane and initial reports of success become tempered by countervailing reports of failure. This is followed by the resolution of dissonance, where the original promoters of the fad seek to explain its failure to achieve widespread impact. Such explanations tend not to blame the fad itself, but instead attribute failure to "a lack of leadership, intransigence of followers, improper implementation, and lack of resources" (Birnbaum, 2001). Hype cycles are linked with the teleological or top-down approaches to technology adoption that have primacy in higher education (Birnbaum, 2001), a practice that seems ignorant of research suggesting that ateleological or bottom-up approaches to technology adoption can lead to more meaningful implementations (Duke, 2001).

Defining what constitutes a successful implementation of Information and Communication Technology (ICT) is perilous. The conventional approach to recognising a successful ICT project, according to Marchand and Peppard (2013), relates to some easily answered questions: Does it work? Was it deployed on time? Was it within budget? Did it adhere to the project plan? Goldfinch (2007) extends this to say that ICT projects can often fail simply because they are not used as intended, or users do not use them at all, for reasons such as recalcitrance, lack of training or usability. More traditional project success measures might be useful for straightforward ICT projects where the requirements can be determined at the design stage; however, ICT projects around data and analytics are much more difficult to evaluate in terms of success or failure (Marchand & Peppard, 2013). These systems require people to interpret and create meaning from the information the systems provide. While deploying analytical ICT is relatively easy, understanding how it might be used is much less clear, and such projects cannot be mapped out in a neat fashion (Marchand & Peppard, 2013). This suggests that traditional 'top down' approaches associated with technology implementation might be less than ideal for LA implementations.

Teleological, top-down or plan-based approaches dominate technology adoption in higher education (McConachie, Danaher, Luck, & Jones, 2005). Known as planning or plan-based approaches, they are typically idealistic and episodic, and follow a deliberate plan or strategy (Boehm & Turner, 2003). The suitability of these approaches for resolving complex problems has been questioned (Camillus, 2008). By contrast, ateleological or learning approaches follow an emergent path and are naturalistic and evolutionary (Kurtz & Snowden, 2003). The debate between the planning and learning schools of process has been one of the most pervasive in the management literature (Clegg, 2002), with many authors critically evaluating the two schools (e.g., Mintzberg, 1989; Kurtz & Snowden, 2003; McConachie et al., 2005).

The use of planning-based processes for the implementation of LA projects creates a problem when online learning environments are acknowledged as non-linear complex systems (Barnett, 2000; Beer et al., 2012; Mason, 2008a, 2008b). Complex systems are systems that adapt, learn or change as they interact (Holland, 2006). They are non-linear in that they contain nested agents and systems that are all interacting and evolving, so we cannot understand any of the agents or systems without reference to the others (Plsek & Greenhalgh, 2001). Cause and effect are not evident and cannot be predicted, meaning that even small interventions can have far-reaching, disproportionate and impossible-to-predict consequences (Boustani et al., 2010; Shiell, Hawe, & Gold, 2008). If LA is about understanding learners and the contexts within which they learn, then considering online learning environments as complex systems has a profound effect on how we approach LA projects. It follows that what contemporary universities need is the most productive elements of both teleological and ateleological approaches to the eight elements of the design process identified by Introna (1996). Such a synthesis is crucial to addressing the plethora of issues competing for the attention of university decision-makers, whether in Australia or internationally.

The development of LA tools and processes is only the first of the steps Elias (2011) identifies as necessary for the implementation of LA. The second step, and arguably the far more difficult one, is "the integration of these tools and processes into the practice of teaching and learning" (p. 5). Beer et al. (2012) argue that it is likely to be teachers who have the right mix of closeness to, and expertise with, the learning context to make the best use of LA-derived information, echoing earlier arguments that teachers are perhaps the most important element of any attempt to enhance learning and teaching (Radloff, 2008). Achieving such a goal would appear to require some understanding of the practice of teaching and learning. One such understanding is provided by Trigwell's (2001) model of university teaching. As shown in Figure 1, Trigwell's (2001) model suggests that the student learning experience is directly related to teachers' strategies; teachers' planning; teachers' thinking, including knowledge, conceptions and reflections; and the L&T context. This is difficult because the teacher's context is complex and dynamic. If LA represents data about learners and their contexts, and its goal is to enhance L&T, it is crucial that it engages with teachers and their dynamic contexts (Sharples et al., 2013).

Figure 1. Trigwell's (2001) model of university teaching (image: "Trigwell's model of teaching" by David T Jones, on Flickr).

The three paths

Based on the preceding theoretical perspectives and personal experience within Australian universities, it is possible to identify at least three potential paths – 'do it to', 'do it for', and 'do it with' – that universities might take when pondering how to harness LA. In the rest of the paper we describe these three paths and then use them to understand the emergence of an LA project at a particular Australian university.

Do it to the teachers

‘Do it to’ describes the top-down, techno-rational and typical approach to ICT adoption in higher education. In theory, this approach starts with the recognition that LA aligns with identified institutional strategic goals. From there a project is formed that leads to a technology being identified and considered at the institutional level, usually with input from a small group of people, before being implemented institution-wide. ‘Do it to’ approaches will typically involve the setting up of a formal project with appropriate management sponsorship, performance indicators, budgets, project teams, user groups, and other project management requirements.

The ‘do it to’ approach focuses much of its attention on changing the teaching and learning context (the left hand end of Figure 1) in terms of policies and systems. The assumption is that this change in context will lead to changes in teacher thinking, planning and strategy. ‘Do it to’ provides a focus on creating the right context for L&T but its effect on teacher thinking, planning and strategy is arguably deficient. ‘Do it to’ represents a mechanistic approach that although common, is likely to fail (Duke, 2001) and this is particularly troublesome for LA implementations for a range of reasons.

The difficulty of ICT implementation for data and analytics projects (Marchand & Peppard, 2013) is compounded in LA due to its novelty and an absence of predefined approaches that are known to work across multiple contexts (Siemens, 2013a). L&T contexts are complex and diverse (Beer et al., 2012), and imposing technological solutions onto these environments can lead to a problem of task corruption, where staff engagement is superficial and disingenuous (Rosh White, 2006). Centralised approaches to LA can often be mistakenly viewed as purely an exercise in technology (Macfadyen & Dawson, 2012) and may provide correlative data that can be misleading or erroneous at the course or individual level (Beer et al., 2012).

Do it for the teachers

Geoghegan (1994) identifies the growth of a "technologists' alliance" between innovative teaching staff, central instructional technology staff and information technology vendors as responsible for many of the developments that seek to harness information technology to enhance student learning. While this alliance is often called upon to contribute to the "do it to" path, its members are also largely responsible for the "do it for" path. Driven by a desire and responsibility to enhance student learning, members of the alliance draw on their more advanced knowledge of technology and its application to learning and teaching to: organise staff development sessions; experiment with, adopt or develop new applications of technology; and help with the design and implementation of exemplar information technology enhanced learning designs. Such work may lead to changes in the L&T context – in much the same way as the "do it to" path – through the availability of a new Moodle plugin for learning analytics or visits from experts on certain topics. It can also lead to changes in the thinking, planning and strategies of a small number of teaching staff: typically the innovative teaching staff participating in the exemplar applications of technology, who are becoming, or already are, part of the technologists' alliance.

While the technologists' alliance is responsible for many of the positive examples of harnessing information technology to enhance L&T, Geoghegan (1994) also argues that its members have "also unknowingly worked to prevent the dissemination of these benefits into the much larger mainstream population". Geoghegan (1994) attributes this largely to the extensive differences between the members of the technologists' alliance and the majority of teaching staff. Rather than recognising and responding to this chasm, there has been a failure to acknowledge its existence, an assumption of homogeneity, and a belief that adoption is simply a matter of overcoming increased resistance to change, rather than addressing qualitatively distinct perspectives and needs (Geoghegan, 1994).

Do it with the teachers

This approach is firmly entrenched in the learning approach to process mentioned previously. This path starts by working with teaching academics inside the course or unit 'black box' during the term. The idea is to develop an understanding of the lived experience, encompassing all its diversity and difficulty, so as to establish how LA can help within the context. The aim is to fulfil Geoghegan's (1994) advice to develop an application "well-tuned to the instructional needs" that provides a "major and clearly recognizable benefit or improvement to an important process". Such applications provide those outside the technologists' alliance with a compelling reason to adopt a new practice. It is through the adoption of new practices that educators gain new experiences "that enable appropriation of new modes of teaching and learning that enable them to reconsider and restructure their thinking and practice" (Cavallo, 2004), an approach that Cavallo (2004) argues is significantly more effective in changing practice than "merely being told what to do differently" (p. 97). Thus the 'do it with' path starts with the current reality of L&T – the right-hand end of Figure 1 – and works back toward the left.

Beyond being a potentially more effective way of changing thinking and practice around learning, the "do it with" approach brings a number of other potential benefits. This type of bottom-up or evolutionary approach is also known as bricolage – "the art of creating with what is at hand" (Scribner, 2005) – and has been identified as a major component of how teachers operate (Scribner, 2005). It is also a primary source of strategic benefit from information technology (Ciborra, 1992). However, the 'do it with' path also has some hurdles to overcome. These approaches are messy and tend not to fit with existing institutional approaches to technology adoption or innovation. Learning approaches are agile and require freedom to adapt and change, which clashes with existing organisational cultural norms around technology innovation, implementation and uniformity. 'Do it with' approaches do not fit with existing organisational structures that are rationally decomposed into specialised units (Beer et al., 2012). Other problems can be attributed to workloads and competing requirements, which can inhibit the collaborative, reflective and adaptable approaches required for bricolage. There are also questions about whether such approaches can be translated into sustainable, long-term practices.

A question of balance

The three approaches described above are not mutually exclusive. Elements of all three are very likely to, and perhaps need to, exist within LA implementations. It is a question of balance. The typical approach to ICT implementation is 'do it to', which constrains the impact the implementation might have on L&T. This paper has suggested that 'do it with', and even 'do it for', approaches may allow LA to make more sustained and meaningful contributions to L&T. However, they contrast starkly with existing institutional technology adoption and implementation norms based on 'do it to'. While the way forward may not be clear, it is clear that a better balance between all three of these approaches is needed if LA is going to enhance learning, teaching and student success. The following section describes a LA implementation at a regional Australian university with a very complex and diverse L&T environment.

EASI @ CQU

EASI (Early Alert Student Indicators) is a LA project at CQUniversity targeting a strategic goal around student retention by improving academic-student contact. It combines student descriptive data from the student information system with student behaviour data from the Moodle LMS, and provides this data, in real time, to teaching academics within their Moodle course sites. It also provides the academics with a number of ways to 'nudge' students who might be struggling at any point during the term. The term 1, 2014 trial was deemed very successful, with 5,968 visits to EASI across the term by 283 individual academic staff who looked at 357 individual courses. A majority of the 39,837 nudges recorded were mail-merges, where academics used the in-built mail-merge facility to send personalised emails to students. The 7,146 students who received at least one 'nudge' email during the term had, by the end of term, 51% more Moodle clicks on average than students who did not receive nudges. This may be indicative of heightened engagement and aligns with anecdotal comments from academics indicating that the personalised email 'nudges' promoted increased student activity and dramatically elevated staff-student conversation.
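As a rough illustration of the kind of data combination EASI performs, here is a speculative sketch – not the actual EASI code; the file names, column names and the activity threshold are invented for illustration – that joins a student information system extract with Moodle click counts, flags low-activity students as candidates for a 'nudge', and computes the nudged versus non-nudged average-clicks comparison reported above.

```python
import csv

def load_by_id(path, key="student_id"):
    """Load a CSV into a dict keyed by student id."""
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def early_alerts(sis_csv, lms_csv, min_clicks=5):
    """Flag students whose Moodle activity falls below a simple threshold.

    sis_csv columns: student_id, name, program, campus (descriptive data)
    lms_csv columns: student_id, clicks (behavioural data)
    All names and the threshold are illustrative assumptions.
    """
    students = load_by_id(sis_csv)
    activity = load_by_id(lms_csv)
    alerts = []
    for sid, info in students.items():
        clicks = int(activity.get(sid, {}).get("clicks", 0))
        if clicks < min_clicks:
            alerts.append((sid, info.get("name", ""), clicks))
    return alerts

def average_clicks(lms_csv, nudged_ids):
    """Compare average clicks for nudged versus non-nudged students."""
    nudged, others = [], []
    for sid, row in load_by_id(lms_csv).items():
        (nudged if sid in nudged_ids else others).append(int(row["clicks"]))
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(nudged), mean(others)
```

Even in this toy form the 'do it with' argument is visible: deciding what counts as 'struggling' (here, an arbitrary five clicks) is a contextual judgement that belongs with the teaching academic rather than the tool.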

Based on a strategic goal to address a growing student retention problem, a formal project was proposed in 2012 based on a project proposal document (PPD) that outlined how the project would contribute to the strategic goal. There were more than a dozen iterations of this document before the project gained final approval, which then required a project initiation document (PID) to be submitted. The PID, over a number of iterations, provided fine-grained detail on a range of plans for the project including the project plan, project scope, deliverables, milestones, budget and quality. Twelve months after the PPD, work officially began on the project following the final approval of the PID. On the surface it would appear that this particular LA project followed a ‘do it to’ approach with formal project management methodology, and early indications about its success are encouraging. However, the underlying and invisible reality suggests a different story.

The idea for EASI evolved from many conversations and collaborations, going back to 2008, between staff from the central L&T unit and coalface academic staff. These conversations and collaborations were predominantly about finding ways of making better use of data to inform L&T. The central L&T staff were somewhat unusual in that they were active LA researchers, possessed software development experience, and were in daily contact with, and sharing insights with, front-line academic teaching staff. The central L&T staff pursued LA in their own time, using informal access to test data that was often incomplete or inconsistent. The EASI concept developed during 2011, when these staff identified the potential for LA to contribute to the strategic imperative of improving student retention. A number of small-scale pilots/experiments were conducted in close partnership with the participating teaching academics on a trial-and-error basis.

These trials occurred prior to the approval of the formal project plan, using a combination of the 'do it with' and 'do it for' paths, before the start of the formal project and its requirements constrained the approach strictly to 'do it to'. The essence of this story is that the project's success, as defined by senior management (Reed, 2014), is directly attributable to the tinkering and experimentation that occurred with front-line academics prior to the commencement of the formal project. The 'do it with' and 'do it for' components allowed the bricolage that made the implementation meaningful (Ciborra, 1992), while the 'do it to' component provided the resourcing necessary to progress the idea beyond the tinkering stage. Perhaps the key message from the EASI experience is that there needs to be a balance between all three approaches if LA is to make sustained and meaningful contributions to L&T.

Conclusion

A story was told in this paper of an apparently successful 'do it to' LA project. It was suggested that this project was successful only because of its underpinning and preceding 'do it with' and 'do it for' processes. These processes allowed the project to adapt in response to the needs of its users over time, prior to the start of the formal project. Based on this experience and the theoretical frameworks described in this paper, it would appear likely that attempts to implement LA without sufficient 'do it with' will fail. Turn-key solutions and the increasing trend toward 'systems integration' and outsourcing are unlikely to allow the bricolage required for sustained and meaningful improvement in complex L&T contexts. There is even a question of how long the EASI project can remain successful, given that the formal project and its associated resourcing will cease at the end of the project.

While this paper specifically targeted LA, there is a question as to whether the same paths, or combination thereof, are required more broadly for improving L&T in universities. Is the broader e-learning rhetoric/reality gap a result of an increasing amount of ‘do it to’ and ‘do it for’ and not enough ‘do it with’? How much effort are universities investing in each of the three paths? How could a university appropriately follow the ‘do it with’ path more often? What impacts might this have on the quality of learning and teaching? The exploration of these questions may help universities to bridge the gap between e-learning rhetoric and reality.

References

Alexander, S. (2001). E-learning developments and experiences. Education + Training, 43(4/5), 240-248.

Barnett, R. (2000). Supercomplexity and the Curriculum. Studies in Higher Education, 25(3), 255-265. doi:10.1080/03075070050193398

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.

Birnbaum, R. (2001). Management fads in higher education: Where they come from, what they do, why they fail. San Francisco: Jossey-Bass.

Boehm, B., & Turner, R. (2003). Using Risk to Balance Agile and Plan-Driven Methods. Computer, 36(6), 57.

Boustani, M. A., Munger, S., Gulati, R., Vogel, M., Beck, R. A., & Callahan, C. M. (2010). Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system. Clinical Interventions In Aging, 5, 141-148.

Camillus, J. C. (2008). Strategy as a Wicked Problem. Harvard Business Review, 86(5), 98-106.

Carr-Chellman, A. A. (2004). Global perspectives on e-learning: Rhetoric and reality. Sage.

Cavallo, D. (2004). Models of growth—towards fundamental change in learning environments. BT Technology Journal, 22(4), 96-112.

Ciborra, C. U. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297-309.

Diaz, V., & Brown, M. (2012). Learning analytics: A report on the ELI focus session (ELI Paper 2: 2012). Educause Learning Initiative.

Duke, C. (2001). Networks and Managerialism: field-testing competing paradigms. Journal of Higher Education Policy & Management, 23(1), 103-118. doi:10.1080/13600800020047270

Elias, T. (2011). Learning analytics: Definitions, processes and potential. Learning, 23, 134-148.

Ellis, R. A., Jarkey, N., Mahony, M. J., Peat, M., & Sheely, S. (2007). Managing Quality Improvement of eLearning in a Large, Campus-Based University. Quality Assurance in Education: An International Perspective, 15(1), 9-23.

Fisher, J., Whale, S., & Valenzuela, F.-R. (2012). Learning Analytics: a bottom-up approach to enhancing and evaluating students' online learning. University of New England: Office for Learning and Teaching.

Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conference of the International Business Schools Computing Association, Baltimore, MD.

Gibson, J. W., & Tesone, D. V. (2001). Management fads: Emergence, evolution, and implications for managers. Academy of Management Executive, 15(4), 122-133. doi:10.5465/AME.2001.5898744

Goldfinch, S. (2007). Pessimism, computer failure, and information systems development in the public sector. Public Administration Review, 67(5), 917-929.

Guan, J., Nunez, W., & Welsh, J. F. (2002). Institutional strategy and information support: the role of data warehousing in higher education. Campus-Wide Information Systems, 19(5), 168.

Holland, J. (2006). Studying Complex Adaptive Systems. Journal of Systems Science and Complexity, 19(1), 1-8. doi:10.1007/s11424-006-0001-z

Inmon, W. H. (2002). Building the data warehouse (3rd ed.). New York: Wiley.

Introna, L. D. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Johnson, L., Adams Becker, S., Cummins, M., & Estrada, V. (2014). 2014 NMC Technology Outlook for Australian Tertiary Education: A Horizon Project Regional Report. Austin, Texas: New Media Consortium.

Johnson, L., Adams, S., Cummins, M., Freeman, A., Ifenthaler, D., & Vardaxis, N. (2013). Technology Outlook for Australian Tertiary Education 2013-2018: An NMC Horizon Report Regional Analysis. Austin, Texas: New Media Consortium.

Johnson, L., Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, Texas: New Media Consortium.

Johnson, L., Levine, A., Smith, R., & Stone, S. (2010). The 2010 Horizon Report. Austin, Texas: New Media Consortium.

Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon Report. Austin, Texas: New Media Consortium.

Jones, D., Beer, C., & Clark, D. (2013). The IRAC framework: Locating the performance zone for learning analytics. Paper presented at ASCILITE 2013: Electric Dreams, Sydney. http://www.ascilite.org.au/conferences/sydney13/program/papers/Jones.pdf

Kurtz, C. F., & Snowden, D. J. (2003). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42(3), 462-483.

Linden, A., & Fenn, J. (2003). Understanding Gartner's hype cycles (Strategic Analysis Report No. R-20-1971). Gartner, Inc.

Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. Paper presented at ASCILITE 2012, Wellington.

Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, 54(2), 588-599. doi:10.1016/j.compedu.2009.09.008

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

Maddux, C., & Cummings, R. (2004). Fad, fashion, and the weak role of theory and research in information technology in education. Journal of Technology and Teacher Education, 12(4), 511-533.

Marchand, D. A., & Peppard, J. (2013). Why IT Fumbles Analytics. Harvard Business Review, 91(1), 104-112.

Mason, M. (2008a). Complexity theory and the philosophy of education. Educational Philosophy and Theory, 40(1), 15. doi:10.1111/j.1469-5812.2007.00412.x

Mason, M. (2008b). What Is Complexity Theory and What Are Its Implications for Educational Change? Educational Philosophy and Theory, 40(1), 35-49.

McConachie, J., Danaher, P. A., Luck, J., & Jones, D. (2005). Central Queensland University's Course Management Systems: Accelerator or Brake in Engaging Change? International Review of Research in Open and Distance Learning, 6(1).

New Media Consortium. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: New Media Consortium and Educause Learning Initiative.

Pardo, A. (2013). Social learning graphs: combining social network graphs and analytics to represent learning experiences. International Journal of Social Media and Interactive Learning Environments, 1(1), 43-58.

Paulsen, M. (2002). Online education systems in Scandinavian and Australian universities: A comparative study. The International Review of Research in Open and Distance Learning, 3(2).

Plsek, P. E., & Greenhalgh, T. (2001). Complexity science: The challenge of complexity in health care. BMJ (Clinical Research Ed.), 323(7313), 625-628.

Radloff, A. (2008). Engaging staff in quality learning and teaching: what's a Pro Vice Chancellor to do? Sydney: HERDSA.

Ramamurthy, K. R., Sen, A., & Sinha, A. P. (2008). An empirical investigation of the key determinants of data warehouse adoption. Decision Support Systems, 44(4), 817-841.

Reed, R. (2014, 10/7/2014). [EASI project success].

Rosh White, N. (2006). Tertiary education in the Noughties: the student perspective. Higher Education Research & Development, 25(3), 231-246. doi:10.1080/07294360600792947

Schiller, M. J. (2012). Big Data Fail: Five Principles to Save Your BI Butt. Retrieved 1/6/2014 from http://www.cioinsight.com/c/a/Expert-Voices/Big-Data-Fail-Five-Principles-to-Save-Your-BI-Butt-759074/

Scribner, J. P. (2005). The problems of practice: Bricolage as a metaphor for teachers' work and learning. Alberta Journal of Educational Research, 51(4), 295-310.

Sharples, M., McAndrew, P., Ferguson, R., FitzGerald, E., Hirst, T., & Gaved, M. (2013). Innovating Pedagogy 2013 (Report 2). Milton Keynes, United Kingdom: The Open University.

Shiell, A., Hawe, P., & Gold, L. (2008). Complex interventions or complex systems? Implications for health economic evaluation. BMJ, 336(7656), 1281-1283.

Siemens, G. (2011). Learning and Knowledge Analytics. Retrieved 1/11/2011 from http://www.learninganalytics.net/?p=131

Siemens, G. (2013a). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380-1400. doi:10.1177/0002764213498851

Siemens, G. (2013b). [Systems level learning analytics].

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 46(5). Retrieved from http://www.educause.edu/ero/article/penetratingfog-analytics-learning-and-education

Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 553-583.

Trigwell, K. (2001). Judging university teaching. International Journal for Academic Development, 6(1), 65-73. doi:10.1080/13601440110033698

Breaking BAD to bridge the e-learning reality/rhetoric chasm

@damoclarky and I got a bit lucky. Our ASCILITE paper has been accepted with revisions. Apparently the first reviewer hated the “theoretical construct” we were using to make our argument. The following is what we originally wrote, sharing it here to hopefully spark some critique and improvement (and also not to entirely waste the writing when I gut it and start again).

Start with the problem and then the “construct”, both adapted from the paper.

Problem

In a newspaper article (Laxon, 2013), Professor Mark Brown makes the following comment on the quality of contemporary university e-learning:

E-learning's a bit like teenage sex. Everyone says they're doing it but not many people really are and those that are doing it are doing it very poorly. (n.p.)

E-learning – defined by the OECD (2005) as the use of information and communications technology (ICT) to support and enhance learning and teaching – has been around for so long that there have been numerous debates about replacing it with other phrases. Regardless of the term used, there "has been a long-standing tendency in education for digital technologies to eventually fall short of the exaggerated expectations" (Selwyn, 2012, n.p.). Writing in the early 1990s, Geoghegan (1994) seeks to understand why the three-decade-long "vision of a pedagogical utopia" (n.p.) promised by instructional technologies has failed to eventuate. Ten years on, Salmon (2005) notes that e-learning within universities is still struggling to move beyond projects driven by innovators and engage a significant percentage of students and staff. Even more recently, concerns remain about how much technology is being used to effectively enhance student learning (Kirkwood & Price, 2013). Given that "Australian universities have made very large investments in corporate educational technologies" (Holt et al., 2013, p. 388), it is increasingly important to understand and address the rhetoric/reality chasm around e-learning.

Not surprisingly, the literature provides a variety of answers to this complex question. Weimer (2007) observes that academics come to the task of teaching with immense amounts of content knowledge, but little or no knowledge of teaching and learning beyond, perhaps, their personal experience. This situation may not change significantly given that academics are expected to engage equally in research and teaching, yet work towards promotion criteria that are perceived to primarily value achievements in research (Zellweger, 2005). It has been argued that the limitations of the Learning Management System (LMS) – the most common university e-learning tool – make it less than suitable for more effective learner-centred approaches and are contributing to growing educator dissatisfaction (Rahman & Dron, 2012). It has also been argued that the "limited digital fluency of lecturers and professors is a great challenge" (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3) for the creative leveraging of emerging technologies. Another contributing factor is likely to be Selwyn's (2008) suggestion that educational technologists have failed to be cognisant of "the more critical analyses of technology that have come to the fore in other social science and humanities disciplines" (p. 83). Of particular interest here is the observation of Goodyear et al. (2014) that the "influence of the physical setting (digital and material) on learning activity is often important, but is under-researched and under-theorised: it is often taken for granted" (p. 138).

Our argument is that the set of implicit assumptions that underpin the practice of institutional e-learning within universities (which we’ll summarise under the acronym SET) leads to a digital and material environment that contributes significantly to the reality/rhetoric chasm. The argument is that while this mindset underpins how universities go about the task of institutional e-learning, they won’t be able to bridge the chasm.

Instead, we argue that another mindset needs to play a larger role in institutional practice. How much larger, we don't know. We'll summarise this mindset under the acronym "BAD". Yep, we think institutional e-learning needs to break BAD.

Breaking BAD versus SET in your ways

The following table contrasts the two frameworks and expands their acronyms. A slightly more detailed examination of the two frameworks follows.

Table 1: The BAD and SET frameworks for e-learning implementation

| Component | BAD | SET |
|---|---|---|
| How work gets done | Bricolage – concrete problems are solved through creative recombination of existing resources | Strategy – a desired future state is identified, and all the resources required to achieve that state in the most efficient way are identified and provided |
| How ICT is perceived | Affordances – ICT is protean; it can be modified to enhance and transform current practice, and to make things easier for users | Established – ICT is fixed and implemented vanilla; processes change to fit, and users are trained to use the provided functionality |
| How you see the world | Distributed – the world is complex, dynamic and unpredictable | Tree-like – the world is relatively stable and predictable; it can be understood through logical decomposition into a hierarchy |

How work gets done

(this was originally titled “How stuff happens” but was probably what one reviewer described as “inappropriately colloquial”. Need a better label for this. The idea is that the organisation only recognises work of a particular type. It’s the only way it conceives of anything interesting/important happening. Not sure the following explains this well enough)

It would be an unusual contemporary Australian university that was not – at least proclaiming the rhetoric of – following a strategic approach to its operations. Numerous environmental challenges and influences have led to universities being treated as businesses, with an increasing prevalence of managers using "strategic control and a focus on outputs which can be quantified and compared" (Reid, 2009, p. 575) to manage academic activities. In line with this has been an increasingly strategic approach to learning and teaching. The requirement that Australian universities have institutional learning and teaching strategic plans publicly available on their websites before accessing a government learning and teaching fund (Inglis, 2007) is just one example of how university teaching has become an object of policy, with learning and teaching excellence necessarily involving the specification of goals (Clegg & Smith, 2008). The perceived importance of strategic approaches to institutional e-learning is illustrated by Carter et al.'s (2011) identification of the importance of ensuring "Technology alignment with goals of the organization" (p. 207). The strategic or planning-by-objectives (e.g. learning outcomes, graduate attributes) approach also underpins how course design is largely assumed to occur, with Visscher-Voerman and Gustafson (2004) finding that it underpins "a majority of the instructional design models in the literature" (p. 77). These approaches to understanding "how stuff happens" are so ingrained that it is often forgotten that these ideas have not always existed (Kezar, 2001) and that there is an alternate perspective.

(An example comparing bricolage and engineering approaches might be useful, might actually be a better structure for this section)

An example of this alternate perspective can be found in the idea of bricolage, or "the art of creating with what is at hand" (Scribner, 2005, p. 297). Bricolage involves the manipulation and creative repurposing of existing, and often unlikely, resources into new arrangements to solve a concrete problem. A bricoleur (someone who engages in bricolage), when faced with a project, does not analyse what resources may be required to fulfil that project (a more strategic approach); instead they ask how the project can be achieved with the resources already available (Hatton, 1989). Hatton (1989) used bricolage – though somewhat negatively – to understand the work of teachers, as did Scribner (2005). In terms of developing strategic applications of ICT, Ciborra (1992) argues that the "capability of integrating unique ideas and practical design solutions at the end-user level" (p. 299) – bricolage – is more important than strategic approaches.

As argued by Jones et al (2005), there are risky extremes inherent in both the strategic and bricolage approaches. The suggestion here, within the context of university e-learning, is that it would be fruitful to explore a dynamic and flexible interplay between the two. The problem is that at the moment the strategic is crowding out the bricolage. As Groom and Lamb (2014) observe, the cost of supporting an enterprise learning tool (e.g. the LMS) limits the resources available for user-driven innovation, in part because such innovation draws "attention and users away" from the strategic tool. The demands of sustaining the large, complex, strategic tool dominate priorities and lead to "IT organizations…defined by what's necessary rather than what's possible" (Groom & Lamb, 2014, n.p). The established view of Information and Communication Technologies (ICT) arises in part from the predominance of this strategic view of how work happens.

How ICT is perceived: Affordances or Established

Widely accepted best practice within the IT industry is that large integrated systems – like an LMS – should be implemented in their "vanilla" form, as local changes are too expensive (Robey, Ross, & Boudreau, 2002). This way of perceiving ICTs assumes that the functionality provided by a technology is established and cannot be changed. This perception of an LMS encourages the adoption of only those pedagogical designs that are supported by the existing LMS functionality and precludes the exploration of contextually specific learning designs (Jones, 2012). Perceiving and implementing the LMS as an established product simplifies and reduces the cost of training and support, but increases the difficulty of adoption as teaching staff attempt to use a standardised system to support hugely diverse disciplines, teaching philosophies and instructional styles (Black, Beck, Dawson, Jinks, & DiPietro, 2007). Perhaps in no small way the established view of ICT in e-learning contributes to Dede's (2008) observation that "widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant" (p. 58). This perception of ICT sits at odds with Kay's (1984) description of the "protean nature of the computer" (p. 59) as "the first metamedium, and as such has degrees of freedom and expression never before encountered" (p. 59). It is, however, closely linked with the techno-rational assumptions of the strategic view, an approach that is increasingly seen as a naïve view of ICT, technology and organisations.

(Remove some of the quotes and tell a better story).

Goodyear et al (2014) argue that in thinking about design for networked learning it is vital to acknowledge "the likelihood of slippage between the task as set and the actual activity" (p. 139). Hannon (2013) describes a case where "meso-level practitioners – teaching academics, learning technologists, and academic developers" (p. 175) undertake "hidden effort" (p. 175) to deal with the gaps between technology and pedagogy that arise from the application of centralised technologies. Rather than stick with the established functionality provided by an information system, increasingly technically literate users draw upon increasingly available technologies to develop systems that bridge the gaps between their needs and the established information system. While often seen as dangerous and inefficient, such systems can provide a source of creativity and innovation that helps organisations survive in a competitive environment (Behrens, 2009). Such systems arise because ICT is not seen as established, but rather as one of a number of components of an emergent process of change whose outcomes are indeterminate because they are contingent on the specifics of the context and the situation (Markus & Robey, 1988). In particular, they arise from an on-going process – not unlike bricolage – in which users explore how the affordances of ICT can be leveraged to address concrete problems. The term affordances is used here as defined by Goodyear et al (2014), "not as pre-given, but as co-evolving, emergent and partly co-constitutive" (p. 142), and as a way of exploring how what is actually done with e-learning systems is "influenced by the qualities of the place in which they are working" (p. 137). Our view is that the implementation of e-learning systems needs to be perceived as an on-going and emergent exploration of the affordances that could be most useful for the students and teachers within a given context. This echoes Johri's (2011) observation that bricolage shifts focus away from the established "design of an artefact towards emergent design of technology-in-use, particularly by the users" (p. 212).

(that can certainly be improved upon)

How you see the world: Distributed or Tree-like

Techno-rational methods such as strategic planning and software development perceive the world (or at least act as if it were) as a hierarchy, as tree-like. These methods use analysis and logical decomposition to reduce larger wholes into smaller, more easily understood and managed parts (Truex, Baskerville, & Travis, 2000). This approach is problematic because the isolation of components is largely imaginary and their separation leads to a loss of the rich interdependencies between components (Truex et al., 2000). Enterprise systems are informed heavily by these tree-like conceptions, and this is reflected in university e-learning environments and their poor fit with the heterarchical and self-organised potential of contemporary technologies and educational practices (Hannon, Ryberg, & Riddle, 2014). Goodyear et al (2014) argue "that the dominant images of the object of our research do not yet reflect the extent to which learning networks now consist of heterogenous assemblages of tasks, activities, people, roles, rules, places, tools, artefacts and other resources, distributed in complex configurations across time and space and involving digital, non-digital and hybrid entities" (p. 140). We suggest that the same applies to the dominant conceptions underpinning the implementation of institutional e-learning systems.

The limitations of tree-like models and a preference for distributed models are evident in a number of sources. Holt et al (2013) argue for the importance of distributed leadership in institutional e-learning, the growing complexity of e-learning meaning that no one leader at the top of a hierarchical tree has the knowledge to "possibly contend with the complexity of issues" (p. 389). The trend towards distribution is obviously evident in connectivism and its "thesis that knowledge is distributed across a network of connections and therefore learning consists of the ability to construct and traverse those networks" (Downes, 2011, n.p). Siemens' (2008) list of some of the concepts from which connectivism arises – such as activity theory, distributed and embodied cognition, complexity and network theory – illustrates the breadth of this move to distributed understandings. The socio-material approaches to studying and understanding networked learning (and technology-embedded practices more broadly) mentioned by both Hannon (2013) and Goodyear et al (2014) echo a distributed view and underpin the emergent view of technology mentioned in the previous section. They also link with the idea of bricolage: paying close attention to what occurs within the distributed network and responding to context-specific problems by experimenting with the affordances perceived within the components of a network/assemblage, so as to reduce the chasm between rhetoric and reality.

The balkanisation threat to network learning

As part of NGL the plan was to play with Mendeley as a medium, but some limitations of Mendeley meant it didn't quite fit the bill.

Image: "Disconnected" by larsomat, on Flickr (CC BY-NC-SA 2.0)

Undeterred, Tracey's spent some time exploring and sharing more about Mendeley and its possibilities for network learning, including a journal article that explains how a tool like Mendeley responds to the changes happening in science research (and perhaps research more broadly). Interesting to see Mendeley's PKM-related process (7 parts, rather than 3) –

  1. Organise
  2. Manage
  3. Read
  4. Write
  5. Collaborate
  6. Discover
  7. Participate

Another alternative

Earlier this morning, as it happens, I received a "cold call" about colwiz, which has the goal to

accelerate research by providing a robust reference manager with data sharing and collaboration capabilities

Sounds an awful lot like Mendeley. I could take the time to do an in-depth comparison – I might do that one day, as I can see some interesting applications in terms of learning and teaching – but I don't have the time right now.

But what worries me is that this is yet another example of the balkanisation of the networked world. I wonder how easily I – as a current Mendeley user – could interact with a research group using colwiz (and vice versa)? My guess is that we'd all have to standardise on one reference manager. There's a tendency for the commercial imperative to focus on getting everyone into a single tool at the expense of interconnections between tools. I'm pretty sure this will get a mention at some stage in Connected Courses.

As it happens, the movie "A Beautiful Mind" was on TV last night, which raises the question: might this commercial tendency benefit from an understanding of "Governing Dynamics"?

The potential silliness of some responses to a networked world is further illustrated by my initial searches for an image or clip from the movie. My first find comes up with the message

The clip you are trying to watch is unavailable in your region. We periodically let studios know which clips are in high demand so please check back soon

A later search reveals this blog post which has the same clip embedded but via YouTube.

The network routes/works around blockages or breakdowns. Wasn’t that one of the aims behind the design of the Internet?

Which raises the next question: is there anyone designing software to allow connections to be made between disparate reference management tools? Will making connections between balkanised commercial interests become a demand that another entity will have to satisfy? Perhaps a commercial opportunity?
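For what it's worth, the raw plumbing for such a bridge already half-exists: most reference managers (Mendeley among them, and presumably colwiz) can export RIS or BibTeX. A crude translator between formats isn't hard to sketch – the following illustrative Perl fragment handles only a handful of RIS tags and makes no claims about either product's actual export quirks:

[code lang="perl"]
use strict;
use warnings;

# A toy RIS record of the kind most reference managers can export.
my $ris = <<'RIS';
TY  - JOUR
AU  - Shneiderman, B.
TI  - Relate-Create-Donate
PY  - 1998
ER  -
RIS

# Parse: RIS lines look like "XX  - value"; tags such as AU can repeat.
my %rec;
foreach my $line ( split /\n/, $ris ) {
    next unless $line =~ /^([A-Z0-9]{2})  - (.*)$/;
    push @{ $rec{$1} }, $2;
}

# Emit a (very) minimal BibTeX entry from the parsed tags.
my $key = lc( ( split /,/, $rec{AU}[0] )[0] ) . ( $rec{PY}[0] // '' );
print "\@article{$key,\n";
print "  author = {", join( ' and ', @{ $rec{AU} } ), "},\n";
print "  title  = {$rec{TI}[0]},\n";
print "  year   = {$rec{PY}[0]}\n";
print "}\n";
[/code]

Of course, the format conversion is the easy bit. The balkanisation problem is that the interesting data – annotations, groups, the social connections – stays locked inside each silo.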

Learning how to make waves

Had to share this quote that I came across via an artefact produced by a student. Actually, the version I’ve found in the original is slightly different, but the intent is the same.

We have to do more than teach kids to surf the net, we have to teach them to make waves (Shneiderman, 1998, p. 29)

I haven’t read the rest of the article but the quote resonates with me and a couple of recent experiences. In particular the idea that perhaps the first step to help “them to make waves” is that the teachers and the educational organisations that employ them are making waves. And Shneiderman (1998) agrees since just before the above quote comes (emphasis added)

Some technology cheerleaders and national leaders focus on installing computers in classrooms as a measure of success in transforming education. However, even if there were one computer available for each student, with appropriate software and network access, there is no assurance that education would improve. The technology alone can never be a solution, but in the hands of a knowledgeable teacher, appropriately designed technology can become a useful tool. (p. 29)

The trouble is that increasingly I'm thinking that what it means to be a "knowledgeable teacher" is extremely limited. Beyond this I also get the feeling that the solutions commonly adopted to make teachers more knowledgeable only address part of the problem.

What’s missing?

Image: "On Edge" by jurvetson, on Flickr (CC BY 2.0)

Back in 2003 I wrote (Jones, 2003)

the basic premise of this paper is that a gap exists between the functionality of all institutional information systems and the needs of the staff and students (n.p.)

That particular paper (Jones, 2003) goes on to illustrate various gaps that existed between the systems provided by one university and the requirements of the staff and students. The paper also describes how a group I worked with was able to bridge those gaps/gaping chasms.

The problem is that, rather than bridge those chasms, most institutions are reverting to relying on people to bridge them. Or, in the words of Douglas Rushkoff (2010)

…instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery.

Perhaps institutions are defining knowledgeable teachers as those who bridge these chasms.

Are teachers making waves with technology, or are they being overwhelmed by technology?

So when the 2014 Horizon Reports (e.g. Johnson et al, 2014) identify the "Low digital fluency of faculty" as one of the "Significant challenges impeding higher education technology adoption", I'm not so sure.

It’s not enough to know how to use the technology (to bridge the chasm), you need to be able to change the technology.

Why isn’t it happening?

So why aren’t the IT units of universities helping change the technology to bridge the chasms? Groom and Lamb (2014) identify this problem

IT organizations are often defined by what’s necessary rather than what’s possible, and the cumulative weight of an increasingly complex communications infrastructure weighs ever heavier.

and they then quote Martin Weller about the problems associated with large enterprise systems like an LMS

"…The level of support, planning and maintenance required for such systems is considerable. So we developed a whole host of processes to make sure it worked well. But along the way we lost the ability to support small scale IT requests that don't require an enterprise level solution. In short, we know how to spend £500,000 but not how to spend £500." The myriad costs associated with supporting LMSs crowd out budget and staff time that might be directed toward homegrown, open-source, and user-driven innovation

The world view associated with maintaining large enterprise systems is anathema to change. The accepted industry best practice recommendation is to implement such systems in their "vanilla" form because local changes are too expensive (Robey, Ross, & Boudreau, 2002).

Making waves as changing technology

My belief is that if teachers and organisations want to make waves, rather than surf (the net, the next fad or fashion, etc.), then they need to do it by changing technology – by understanding the premise offered by Rushkoff (2010)

Digital technology is programmed. This makes it biased toward those with the capacity to write the code.

Better results come from being able to change digital technology. The ability to bridge the chasms has to be brought back into organisations that wish to be seen as leaders in the "digital age".

The Connected Courses folk seem very much to get this idea. Overnight the #ccourses tweet stream included significant mention of the notion of "tinkering" as a useful approach to knowledge production. Hopefully some universities will get the idea that enabling teachers and students to tinker with the technologies they use for learning is almost certainly going to be a brilliant way for the organisation to learn to make waves.

References

Groom, J., & Lamb, B. (2014). Reclaiming innovation. EDUCAUSE Review, 1–12.

Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, Texas.

Jones, D. (2003). How to live with ERP systems and thrive. In Tertiary Education Management Conference’2003. Adelaide.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17–46.

Rushkoff, D. (2010). Program or be programmed: Ten commands for a digital age. New York: OR Books.

Shneiderman, B. (1998). Relate–Create–Donate: a teaching/learning philosophy for the cyber-generation. Computers & Education, 31(1), 25–39. doi:10.1016/S0360-1315(98)00014-1

Too much stuff, not enough time

The plan by now was that I would have spent a few weeks engaging with the readings from NETGL and figuring out what insights they can offer for enhancing and transforming the two courses I currently teach. By this time I'd also have set up my new domain (something I have actually done) and moved this blog to that domain. But that was not to be. Connected Courses is starting soon and it's too relevant and interesting not to engage.

I don’t have a good track record of engaging with “MOOCs”. Not off to a good start with this one, at least in my head. But perhaps as the participants of NGL have learned over the last few weeks, part of the trick is figuring out what you can do with the time you have (and having the discipline to make sure you make some time). Time for me to feel a little of their pain and see just how well I can handle it.

Signing up for Connected Courses

So once again I venture into the realm of a “MOOC”. Will be interesting to see if the organisers of Connected Courses shudder a little bit when that particular label is used. Especially given that Connected Courses is being described as

Connected Courses is a collaborative community of faculty in higher education developing networked, open courses that embody the principles of connected learning and the values of the open web.

With a mission of

Our goal is to build an inclusive and expansive network of teachers, students, and educational offerings that makes high quality, meaningful, and socially connected learning available to everyone.

It does have a Syllabus, so there are some artefacts of a "course". That said, there are some very interesting people behind it, so there should be lots to learn and fun to be had – if I get the chance to engage fully.

The main reason for this fairly rambling post is to ensure that I have at least one post in the “connectedcourses” category on this blog. I’m trying to connect my blog to the course and the advice is

Please make sure this URL works and links to the place that shows your tagged/categorized blog posts. If you have not written any, do not proceed. The wheels may fall off your bus (just kidding) – there needs to be at least ONE post visible at this address when you enter it in a web browser

Seems the aggregator they are using has the same problem with empty feeds as BIM. I have to give the same advice to folk in my courses. I wonder if the requirement is as slightly annoying to them as it is to me. There's also the problem of finding the feeds for categories/tags, rather than for the whole site. It's slightly reassuring that these folk are having the same problems, but also a challenge to see if I can modify BIM to address these issues.
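As an aside, for WordPress blogs (like this one) the category and tag feeds typically follow a predictable convention – append feed/ to the archive URL:

[code]
# the whole blog
http://example.wordpress.com/feed/

# just the "connectedcourses" category
http://example.wordpress.com/category/connectedcourses/feed/

# or a tag
http://example.wordpress.com/tag/connectedcourses/feed/
[/code]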

I had hoped to have relocated this blog to my shiny new domain by now, but time hasn’t been in abundance recently. Likely to be a recurring theme over coming months.

Counting the uncountable – NGL participation

The following documents the writing of a script to perform simple counts of what the NGL participants have been doing on their blogs. Another post on the course blog will offer an explanation of the emails that will be sent to participants real soon now.

What?

There are 10+ participants in NGL. The indicators of participation being looked for are

  • Number of posts.
  • Average word count per post.
  • % of posts with links to blog posts from other participants.
  • % of posts with links to other online resources.
  • % of links in a participant's posts that appear on that blog first (out of all participants).

Starting point

Will start with the EDC3100 script and modify from there. That script currently calculates the following relevant statistics (a rough sketch of this kind of counting follows the list)

  • Posts per week – not needed, but total posts will be available
  • average word count
  • # of links
  • # of links to other participants
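As a rough illustration of the kind of counting involved – not the actual EDC3100/BIM code, and with all the data structures invented for the example – something like:

[code lang="perl"]
#!/usr/bin/perl
use strict;
use warnings;

# Invented data: post HTML keyed by participant, plus each participant's
# blog URL. The real script pulls these from BIM's Moodle tables.
my %blog_url = (
    alice => 'http://alice.example.com',
    bob   => 'http://bob.example.com',
);
my %posts = (
    alice => [ '<p>See <a href="http://bob.example.com/p1">Bob</a> and '
             . '<a href="http://example.org/x">this</a>.</p>' ],
    bob   => [ '<p>No links here, just some words.</p>' ],
);

foreach my $student ( sort keys %posts ) {
    my ( $words, $with_links, $with_student_links ) = ( 0, 0, 0 );
    foreach my $html ( @{ $posts{$student} } ) {
        ( my $text = $html ) =~ s/<[^>]+>/ /g;   # crude de-HTMLing
        my @w = split ' ', $text;
        $words += @w;                            # running word count
        my @links = $html =~ /href="([^"]+)"/g;  # naive link extraction
        $with_links++ if @links;
        # a "student link" starts with another participant's blog URL
        foreach my $link (@links) {
            if ( grep { $_ ne $student && index( $link, $blog_url{$_} ) == 0 }
                 keys %blog_url ) {
                $with_student_links++;
                last;
            }
        }
    }
    my $n = scalar @{ $posts{$student} };
    printf "%s: %d posts, %.1f words/post, %d with links, %d with student links\n",
        $student, $n, $n ? $words / $n : 0, $with_links, $with_student_links;
}
[/code]

The real script obviously pulls the posts and participant blog URLs out of BIM's Moodle tables rather than hard-coding them, but the counting logic is much the same.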

Changes

Remove activity completion

Get Moodle user information – are we only including currently enrolled students? It is now.

What about blog posts? Yep.

Calculate the stats for each participant

  • NUM_POSTS – done.
  • (AVG_)POST_LENGTH – done.
  • POSTS_WITH_STUDENT_LINKS – done.
  • POSTS_WITH_LINKS – done.
  • LINKS_HERE_FIRST – to do.

    This is the more difficult task. The requirement here is, for each link (not to another participant's blog) made in a blog post, to check whether it's the first time the link has appeared in any participant's post.

    At the moment the function counting links does have the timepublished for the blog post. It also creates an array containing a hash for each link. That's all links – not just external ones – but maybe that doesn't matter.

    What we need here is probably a hash with key on the link and the value being a reference to the hash about the post (which has timepublished).

    With each student object having this, the BlogStatistics object can then generate stats for LINKS_HERE_FIRST.

    DoTheLinks updated to do this in Marking.pm

    — See below —

Assign a standard and show the report

  • NUM_POSTS – DONE
  • (AVG_)POST_LENGTH – DONE
  • POSTS_WITH_STUDENT_LINKS – DONE
  • POSTS_WITH_LINKS – DONE
  • LINKS_HERE_FIRST

Currently the report only assigns percentages for each stat; these need translating into a mark for the assignment (a sketch of this roll-up follows the list). This would have to

  • average the percentages for each descriptor for a criterion.
    The current descriptors/criteria relationship is

    • Posts (10 marks)
      • # posts
      • # words per post
    • Connections (5 marks)
      • % posts with links to other participant blog posts
    • Other links (5 marks)
      • % posts with links to other resources
      • % of posts where links occur first – not calculated yet
  • calculate the mark per criterion

    The above are stored in a hash where the key is the unique id for the descriptor

    • LENGTH = # words per post
    • NUM_POSTS = # posts
    • LINKS = % posts with links to other resources
    • STUDENT_LINKS = % posts with links to other participant blog posts
  • add them up
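A minimal sketch of what that roll-up might look like, with the weights taken from the criteria above and the percentages invented (the real values come from {MARKING}->{STATS}):

[code lang="perl"]
use strict;
use warnings;

# Assumed: per-student percentages keyed by descriptor id (invented values).
my %stats = ( NUM_POSTS => 80, LENGTH => 100, STUDENT_LINKS => 60, LINKS => 90 );

# Criteria map descriptor ids to a mark out of some maximum.
my %criteria = (
    Posts       => { marks => 10, descriptors => [qw(NUM_POSTS LENGTH)] },
    Connections => { marks => 5,  descriptors => [qw(STUDENT_LINKS)] },
    OtherLinks  => { marks => 5,  descriptors => [qw(LINKS)] },  # LINKS_HERE_FIRST later
);

my $total = 0;
foreach my $name ( sort keys %criteria ) {
    my $c   = $criteria{$name};
    my @pct = map { $stats{$_} } @{ $c->{descriptors} };
    my $avg = 0;
    $avg += $_ / @pct for @pct;              # average the descriptor percentages
    my $mark = $avg / 100 * $c->{marks};     # scale to the criterion's marks
    printf "%-12s %.1f / %d\n", $name, $mark, $c->{marks};
    $total += $mark;
}
printf "Total: %.1f / 20\n", $total;
[/code]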

Calculating first blogs

The task here is for each student, calculate the percentage of links included in their blog posts that appear there first (amongst all the other student blogs)

What we need here is probably a hash with key on the link and the value being a reference to the hash about the post (which has timepublished).

There is a function createBlogMapping that loops through each post for each student and creates a hash ref MAPPING that maps out who links to who.

A similar function is needed that works only on external links (or perhaps all links) and uses the timepublished to create the necessary hashref.

Perhaps something like
[code lang="perl"]
# for each link, record when it was first published and by which post
$whenShared->{$link} = {
    EARLIEST => $timepublished,  # unix timestamp when published
    POST     => $post,           # reference to the blog post's hash
};
[/code]

This hash would allow a loop for each student that counts the number of times the POST value is one of that user's posts.

Links to another student's post are excluded from this count – a first link to another student's post is treated the same as a link anywhere else.

So we're looking at two methods (sketched below)

  1. constructWhenShared – create the hash ref
  2. calculateEarliestPercent – add to {MARKING}->{STATS} the percentage of links first here.
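A rough, self-contained sketch of those two methods – the real versions would live in Marking.pm and work on BIM's existing data structures, so everything here (names, shapes, data) is illustrative:

[code lang="perl"]
use strict;
use warnings;

# Assumed shape: each post has OWNER, TIMEPUBLISHED and a list of
# external LINKS (links to other participants' posts already excluded).
my @posts = (
    { OWNER => 'alice', TIMEPUBLISHED => 100, LINKS => ['http://example.org/a'] },
    { OWNER => 'bob',   TIMEPUBLISHED => 200, LINKS => ['http://example.org/a',
                                                        'http://example.org/b'] },
);

# constructWhenShared: for each link, remember the earliest post to use it.
sub constructWhenShared {
    my ($posts) = @_;
    my %when_shared;
    foreach my $post (@$posts) {
        foreach my $link ( @{ $post->{LINKS} } ) {
            if ( !exists $when_shared{$link}
                 || $post->{TIMEPUBLISHED} < $when_shared{$link}{EARLIEST} ) {
                $when_shared{$link} = { EARLIEST => $post->{TIMEPUBLISHED},
                                        POST     => $post };
            }
        }
    }
    return \%when_shared;
}

# calculateEarliestPercent: % of a student's links that appeared on
# their blog before anyone else's.
sub calculateEarliestPercent {
    my ( $student, $posts, $when_shared ) = @_;
    my ( $links, $first ) = ( 0, 0 );
    foreach my $post ( grep { $_->{OWNER} eq $student } @$posts ) {
        foreach my $link ( @{ $post->{LINKS} } ) {
            $links++;
            $first++ if $when_shared->{$link}{POST}{OWNER} eq $student;
        }
    }
    return $links ? 100 * $first / $links : 0;
}

my $when_shared = constructWhenShared( \@posts );
printf "alice: %.0f%%  bob: %.0f%%\n",
    calculateEarliestPercent( 'alice', \@posts, $when_shared ),
    calculateEarliestPercent( 'bob',   \@posts, $when_shared );
# prints: alice: 100%  bob: 50%
[/code]

The design choice that matters is doing the "who was first" bookkeeping once, in constructWhenShared, so that the per-student percentage becomes a cheap lookup.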
