Anyone capturing users' post-adoptive behaviours for the LMS? Implications?

Jasperson, Carter & Zmud (2005)

advocate that organizations strongly consider capturing users’ post-adoptive behaviors, over time, at a feature level of analysis (as well as the outcomes associated with these behaviors). It is only through analyzing a community’s usage patterns at a level of detail sufficient to enable individual learning (regarding both the IT application and work system) to be exposed, along with the outcomes associated with this learning, that the expectation gaps required to devise and direct interventions can themselves be exposed. Without such richness in available data, it is unlikely that organizations will realize significant improvements in their capability to manage the post-adoptive life cycle (p. 549)

Are there any universities “capturing users’ post-adoptive behaviours” for the LMS? Or any other educational system?

There’s lots of learning analytics research (e.g. interesting stuff from Gašević et al., 2015) going on, but most of that is focused on learning and learners. This is important stuff and there should be more of it.

But Jasperson et al (2005) are Information Systems researchers publishing in one of the premier IS journals. Are there University IT departments that are achieving the “richness in available data…(that) will realize significant improvements in their capability to manage the post-adoptive life cycle”?

If there is, what does that look like? How do they do it? What “expectation gaps” have they identified? What “direct interventions” have they implemented? How?

My experience suggests that this work is limited. I wonder what implications that has for the quality of system use and thus the quality of learning and teaching?

What “expectation gaps” are being ignored? What impact does that have on learning and teaching?

Jasperson et al (2005) develop a “Conceptual model of post-adoptive behaviour”, shown in the image below. Post-adoptive behaviours can include the decision not to use, or to change how to use. A gap in expectations that is never filled is not likely to encourage on-going use.

They also identify that there is an “insufficient understanding of the technology sensemaking process” (p. 544). The model suggests that technology sensemaking is a precursor to “user-initiated learning interventions”, examples of which include: formal or informal training opportunities; accessing external documentation; observing others; and, experimenting with IT application features.

Perhaps this offers a possible explanation for complaints about academics not using the provided training/documentation for institutional digital learning systems? Perhaps this might offer some insight into the apparent “low digital fluency of faculty” problem.

conceptual model of post-adoptive behaviours


Gašević, D., Dawson, S., Rogers, T., & Gašević, D. (2015). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68–84. doi:10.1016/j.iheduc.2015.10.002

Jasperson, S., Carter, P. E., & Zmud, R. W. (2005). A Comprehensive Conceptualization of Post-Adoptive Behaviors Associated with Information Technology Enabled Work Systems. MIS Quarterly, 29(3), 525–557.

The CSCW view of Knowledge Management

Earlier this week I attended a session given by the research ethics folk at my institution. One of the observations was that they’d run training sessions but almost no-one came. I’ve heard similar observations from L&T folk, librarians, and just about anyone else aiming to help academics develop new skills. Especially when people spend time and effort developing yet another you beaut website or booklet that provides everything one would want to know about a topic. There’s also the broader trope developing about academics/teachers being digitally illiterate, which I’m increasingly seeing as unhelpful and perhaps even damaging.

Hence my interest when I stumbled across Ackerman et al (2013) a paper titled “Sharing knowledge and expertise: The CSCW View” with the abstract

Knowledge Management (KM) is a diffuse and controversial term, which has been used by a large number of research disciplines. CSCW, over the last 20 years, has taken a critical stance towards most of these approaches, and instead, CSCW shifted the focus towards a practice-based perspective. This paper surveys CSCW researchers’ viewpoints on what has become called ‘knowledge sharing’ and ‘expertise sharing’. These are based in an understanding of the social contexts of knowledge work and practices, as well as in an emphasis on communication among knowledgeable humans. The paper provides a summary and overview of the two strands of knowledge and expertise sharing in CSCW, which, from an analytical standpoint, roughly represent ‘generations’ of research: an ‘object-centric’ and a ‘people-centric’ view. We also survey the challenges and opportunities ahead.

What follows are a summary and some thoughts on the paper.

Thoughts? Possibilities?

The paper’s useful in that it appears to give a good overview of the work from CSCW on this topic. It’s relevant to some of the problems being faced around digital learning.

All this is especially interesting to me due to my interest in exploring the design and impact of distributed means of sharing knowledge about digital learning

Look at Cabitza and Simone (2012) – two levels of information, and affording mechanisms – as informing design. Their work on knowledge artifacts (Cabitza et al, 2008) might also be interesting.

Brown and Duguid’s (2000) Network of Practice is a better fit for what I’m thinking here.

CSCW has a tendency to precede development with ethnographic studies.

Learning object repositories?

Given the fairly scathing findings re: the idea of repositories, what does this say about current University practices around learning object repositories?

Is digitally illiterate a bad place to start?

The “sharing expertise” approach would appear to assume that the people you’re trying to help have knowledge to share. Labeling teachers as digitally illiterate would appear to mean you couldn’t even conceptualise this as a possibility. Is this a core problem here?

The shift from system to individual practice

At some level the shift in the CSCW work illustrates a shift from focusing on IT systems to a focus on individual practices. The V&R mapping process illustrates some of this.

Context and embedding is important

Findings reinforce the contextual and situated nature of knowledge (is that a bias from the assumptions of these researchers?). Does this explain many of the problems currently being faced? i.e. what’s being done at the moment is neither contextual nor situated? Would addressing this improve outcomes?


A topic dealt with by different research communities (Information Systems, CSCL, Computer Science), each with their particular focus and limitations. e.g. CS has developed interesting algorithms but “Empirical explorations into the practice of knowledge-intense work have been typically lacking in this discourse” (p. 532).

The CSCW strength has been “to have explored the relationship between innovative computational artifacts and knowledge work – from a micro-perspective” (p. 532)

Uses two different terms that “connote CSCW’s spin on the problem” i.e.

that knowledge is situated in people and in location, and that the social is an essential part of using any knowledge…far more useful systems can be developed if they are grounded in an analysis of work practices and do not ignore the social aspects of knowledge sharing. (p. 532)

  1. Knowledge sharing – knowledge is externalised so that it can be captured/manipulated/shared by technology.
  2. Expertise sharing – where the capability/expertise to do work is “based on discussions among knowledgeable actors and less significantly supported by a priori externalizations”

Speak of generations of knowledge management

  1. Repository models of information and knowledge.
    Ignoring the social nature of knowledge, focused on externalising knowledge.
  2. Sharing expertise
    Tying communication among people into knowledge work. Either through identifying how best to “find” who has the knowledge or on creating online communities to allow people to share their knowledge. – expertise finders, recommenders, and collaborative help systems.
    Work later scaled to Internet size systems and communities – collectives, inter-organisational networks etc.

Repository model

started with attempts “to build vast repositories of what they knew” (p. 533).

it should be noted that CSCW never really accepted that this model would work in practice (p. 534)…Reducing the richness of collective memory to specific information artifacts was utopian (p. 537)

Findings from various CSCW repository studies

  • Standard issues with repository systems

    particularly difficulty with motivating users to author and organize the material and to maintain the information and its navigation

  • Context is important.

    Some systems tackled the problem of context by trying to channel people to expertise that was as local as possible based on the assumption that “people nearby an asker would know more about local context and might be better at explaining than might experts”.

    Other research found “difficulties of reuse and the organisation of the information into repositories over time, especially when context changed…showed that no organisational memory per se existed; the perfect repository was a myth” (p. 534)

  • Need to embed.

    such a memory could be constructed and used, but the researchers also found they needed to embed both the system and the information in both practice and in the organizational context

  • situated and social.

    CSCW in general has assumed that understanding situated use was critical to producing useful, and usable, systems (Suchman 1987; Suchman and Wynn 1984) and that usability and usefulness are social and collaborative in nature (p. 537)

  • deviations seen as useful

    Exceptions in organizational activities, instead of being assumed to be deviations from correct procedures, were held to be ‘normal’ in organizational life (Suchman 1983) and to be examined for what they said about organizational activity, including information handling (Randall et al. 2007; Schmidt 1999) (p. 537)

  • issues in social creation, use, and reuse of information.

    • issues of motivation,
      Getting information is hard. Aligning reward structures a constant problem. The idea of capturing all knowledge clashed with a range of factors, especially in competitive organisational settings.
    • context in reuse,
      “processes of decontextualisation and recontextualisation loomed over the repository model” (p. 538). “This is difficult to achieve, and even harder to achieve for complex problems” (p. 539).
    • assessments of reliability and authoritativeness,
      de/recontextualisation is social/situated. Information is assessed based on: expertise of the author, reliability, authoritativeness, quality, understandability, the provisional/final nature of the information, obsolescence and completeness, and whether it is officially vetted.
    • organizational politics, maintenance, and
      “knowledge sharing has politics” (p. 539). Who is and can author/change information impacts use. Categories/meta data of/about data has politics.
    • reification
      “repository systems promote an objectified view of knowledge” (p. 540)

Repository work has since been commercialised.

Some of this work is being re-examined/redone due to new methods: machine learning and crowd-sourcing.

Boundary objects – “critical to knowledge sharing. Because of their plasticity of meaning boundary objects serve as translation mechanisms for ideas, viewpoints, and values across otherwise difficult to traverse social boundaries. Boundary objects are bridges between different communities of practice (Wenger 1998) or social worlds (Strauss 1993).” (p. 541)

“information objects that have meaning on both sides of an intra-organisational or inter-organisational boundary”.

CSCW tended to focus on “tractable information processing objects” (p. 542) – forms etc. – easier to implement but “over-emphasis on boundary objects as material artifact, which can limit the analytical power that boundary objects bring to understanding negotiation and mediation in routine work”

Example – T-Matrix – supporting production of a tire and innovation.

Cabitza and Simone (2012) identify two levels of information

  1. awareness promoting information – current state of the activity
  2. knowledge evoking information – triggering previously acquired knowledge or triggering/supporting learning and innovation

Also suggest “affording mechanisms”

Other terms

  1. “boundary negotiating” objects
    Less structured ideas of boundary objects suggested
  2. knowledge artifacts – from Cabitza et al (2013)

    a physical, i.e., material but not necessarily tangible, inscribed artifact that is collaboratively created, maintained and used to support knowledge-oriented social processes (among which knowledge creation and exploitation, collaborative problem solving and decision making) within or across communities of practice…. (p. 35)

    These are inherently local, remain open for modification. Can stimulate socialisation and internalisation of knowledge.

common information spaces – common central archive (repository?) used by distributed folk. Open and malleable by nature. A repository is closed/finalised, a CIS isn’t. Various work to make the distinction – e.g. degrees of distribution; kinds of articulation work and artifacts required; the means of communication; and the differences in frames of participant reference.

Various points made as to the usefulness of this abstraction.


  • Assembly – “denote an organised collection of information objects”
  • Assemblages – “would include the surrounding practices and culture around an object or collection” (p. 545)

How assemblies are put together and their impacts is of interest.

Sharing expertise

Emphasis on interpersonal communications over externalisation in IT artifacts. “ascribed a more crucial role to the practices of individuals” (p. 547). A focus on sharing tacit knowledge – including contextual knowledge.

tacit/explicit – Nonaka’s mistake – explicit mention of the misinterpretation of Polanyi’s idea of tacit knowledge. The mistaken assumption/focus was on making tacit knowledge explicit, when Polanyi used tacit to describe knowledge that is very hard, if not impossible, to make explicit.

Tacit knowledge can be learned only through common experiences, and therefore, contact with others, in some form, is required for full use of the information. (p. 547)

Community of practice can “roughly be defined as a group that works together in a certain domain and whose members share a common practice”.

Network of practice (from Brown and Duguid, 2000) – members do not necessarily work together, but work on similar issues in a similar way.

Community of Interest – defined by common interests, not common practice. Diversity is a source of creativity and innovation.

I like this critique of the evolution of use of CoP

Intrinsically based in their view of ‘tacit knowledge,’ the Knowledge Management community appropriated CoP in an interventionist manner. CoPs were to be cultivated or even created (Wenger et al. 2002), and they became fashionable as ‘the killer application for knowledge management practitioners’ (Su and Wilensky 2011, p. 10) with supposedly beneficial effects on knowledge exchange within groups. (p. 547)

CSCW didn’t use CoPs in an interventionist way – instead as an analytical lens.

Social capital – from Bourdieu – “refers to the collective abilities derived from social networks”. Views sharing “in the relational and empathic dimension of social networks” (p. 548).

Nahapiet and Ghoshal (1998) suggest it consists of 3 dimensions

  1. Structural opportunity (‘who’ shares and ‘how’);
    Which is where the technical enters the picture.
  2. Cognitive ability (‘what’ is shared);
  3. Relational motivation (‘why’ and ‘when’ people engage)

Latter 2 dimensions not often considered by system designers.

The sharing approach places emphasis on “finding-out” work. Where knowledge is found by knowing/asking others and in finding the source, de-contextualising and then re-contextualising. Often involves “local knowledge” – which tends to have an emergent nature. What’s important is only known in the situation at hand and who holds it evolves within a concrete situation.

People finding and expertise location

Move from focusing on representations of data to the interactions between people – trying to produce and modify them. Tackling technical, organisational and social issues simultaneously.

Techniques include: information retrival, network analysis, topics of interest, expertise determination.

Profile construction can be contentious – privacy, identification of expertise. Especially given “big data” approaches to analysing and identification.

Expertise finding’s 3 stages: identification, selection, escalation.

Need to promote awareness of individual expertise and their availability – “based in ‘seeing’ others’ activities” (p. 551)

“people prefer others with whom they share a social connection to complete strangers” (p. 553) – no surprise there – but people known directly weren’t chosen, as they were deemed not likely to have any greater expertise. Often the people chosen were 2 or 3 degrees of separation away.

Profiles also found by one study to be often out of date. Explored “peripheral awareness” as a solution.

Open issues

  • Development of personal profiles.
  • Privacy and control.
  • Accuracy.

Finding others – a lot of work outside CSCW.

CoIs in the form of web Q&A communities have arisen on the Internet, with research that has studied question classification, answer quality, user satisfaction, motivation and reputation.


  • more money = more answers, but not necessarily better quality.
  • charitable contributions increased credibility of answers “in a nuanced way”?
  • Altruism and reputation building two important motivations

Recent research looking at “social Q&A” – how people use social media to answer – two lines of research (echoing above)

  1. social analysis of existing systems;
    Looking at: impact of tie strength on answer quality, org setting, response rates when asking strangers – especially with quick, non-personal answers, community size and contact rate.
  2. technical development of new systems

Future directions

Interconnected practices: expertise infrastructures

Increasing inter-connectedness

  • may cause “experts” to become anonymous.
  • propel new types of interactions via micro-activities – microtasking environments make it easy/convenient to help
  • Collaboratively constructed information spaces – Wikipedia – numerous papers examine how it was constructed, including work looking more broadly at wikis
  • Other research looked at GitHub, Mozilla bug reports etc.
  • And work looking at social media, microblogging etc and its use.


Ackerman, M. S., Dachtera, J., Pipek, V., & Wulf, V. (2013). Sharing Knowledge and Expertise: The CSCW View of Knowledge Management. Computer Supported Cooperative Work (CSCW), 22(4-6), 531–573. doi:10.1007/s10606-013-9192-8

Re-purposing V&R mapping to explore modification of digital learning spaces


Apparently there is a digital literacy/fluency problem with teachers. The 2014 Horizon Report for Higher Education identified the “Low Digital Fluency of Faculty” as the number 1 “significant challenge impeding higher education technology adoption”. In the 2015 Horizon Report for Higher Education this morphs into “Improving Digital Literacy” being the #2 significant challenge. While the 2015 K-12 Horizon Report has “Integrating Technology in Teacher Education” as the #2 significant challenge.

But focusing solely on the literacy of the teaching staff seems a bit short sighted. @palbion, @chalkhands and I are teacher educators working in a digitally rich learning environment (i.e. a large percentage of our students are online only students). We are also fairly digitally fluent/literate. In a paper last year we explored how a distributive view of knowledge sharing helped us “overcome the limitations of organisational practices and technologies that were not always well suited to our context and aims”.

Our digital literacy isn’t a problem; we’re able to, and believe we have to, overcome the limitations of the environment in which we teach. Increasingly the digital tools we are provided by the institution do not match the needs we have for our learning designs, and consequently we make various types of changes.

Often these changes are seen as bad. At best these changes are invisible to other people within our institution. At worst they are labelled as duplication, inefficient, unsafe, and feral. They are seen as shadow systems. Systems and changes that are undesirable and should be rooted out.


Rather than continue this negative perspective, @palbion, @chalkhands and I have just finished a rough paper that set out to explore if there was anything valuable or interesting to learn from the changes we made to our digital learning spaces. Our process for this paper was

  1. Generate a list of stories of the changes we made to our digital learning/teaching spaces.
    Using a Google doc and a simple story format (descriptive title; what change was made; why; and, outcomes) each of us generated a list of stories of where we’d changed the digital tools/spaces we use for our teaching.
  2. Map those stories using a modified Visitor and Resident mapping approach.
    The stories needed to be analysed in some way. The Visitors & Residents approach offered a number of advantages – more detail below.
  3. Reflect upon what that analysis showed and about potential future applications of this approach.

What follows is some reflection on the approach, a description of the original V&R map, and a description and example of our modified V&R map.

Reflection on the approach

In short, we (I think I can say we) found the whole approach interesting and could see some potential for broader use. In particular, the potential benefits of the approach include:

  1. Great way to start discussions and share knowledge.
    Gathering stories and analysing them using the V&R process appear to be very useful ways for starting discussions and sharing knowledge. Not the least because it starts with people sharing what they are doing (trying to do) now, rather than some mythical ideal future state.
    Reports from others using the original V&R mapping process suggest this is a strength of the V&R mapping approach. Our experience seems to suggest this might continue with the modified map we used.
  2. Doesn’t start by assuming that people are illiterate.
    Neither @palbion nor I think we’re digitally illiterate. We have formal qualifications in Information Technology (IT). @chalkhands doesn’t have formal qualifications in IT. Early on in this process she was questioning whether or not she had anything to add; she wasn’t as “literate” as @palbion and I. However, as we started sharing stories and mapping them, that questioning went away.
    The V&R approach is very much based on the idea of focusing on what people do, rather than who they are or what they know (or don’t). It doesn’t assume teaching staff are digitally illiterate; it is just interested in what people do. I think this is a much more valuable starting point for engaging in this space. It appears likely to provide a method for helping universities follow observations from the 2015 Horizon Report: that solving the “digital literacy problem” requires “individual scaffolding and support along with helping learners as they manage conflict between practice and different contexts”; that “Understanding how to use technologies is a key first step, but being able to leverage them for innovation is vital to fostering real transformation in higher education”; and “that programs with one-size-fits-all training approaches that assume all faculty are at the same level of digital literacy pose a higher risk of failure.”
  3. It accepts that the ability for people to change digital technologies is not only ok, it is necessary and unavoidable.
    Worthen (2007) makes the point that those in charge of institutional IT (including digital learning spaces) want to prevent change while the people using digital systems want the technology to change

    Users want IT to be responsive to their individual needs and to make them more productive. CIOs want IT to be reliable, secure, scalable, and compliant with an ever increasing number of government regulations

    Since the CIOs are in charge of the technology (they have the power), the practice of changing digital systems without having gone through the approved governance processes is deemed bad and something to be avoided. This is at odds with the inevitability of change, especially in learning and teaching, if you accept Shulman’s (1987) identification of the “knowledge base of teaching” as lying (emphasis added)

    at the intersection of content and pedagogy, in the capacity of a teacher to transform the content knowledge he or she possesses into forms that are pedagogically powerful and yet adaptive to the variations in ability and background presented by the students (p. 15)

The original V&R map

The original V&R map (example in the image below) is a Cartesian graph with two axes. The X-axis ranges from visitor to resident and describes how you perceive and use digital technologies. A visitor sees a collection of disparate tools that are fit for specific purposes. When something has to be done, the visitor selects the tool, gets the job done, and leaves the digital space without leaving a social trace. A resident, on the other hand, sees a digital space where they can connect and socialise with others. The Y-axis ranges from Institutional to Personal and describes where the use of digital technologies sits on a professional/personal scale.

The following map shows someone for whom LinkedIn is only used for professional purposes. So it’s located toward the “Institutional” end of the Y-axis. Since LinkedIn is about leaving a public social trace for others to link to, it’s located toward the “Resident” end of the X-axis.

Our modified V&R map

Our purpose was to map stories about how we had changed digital technologies within our role as teacher educators. Thus the normal Institutional/Personal scale for the Y-axis doesn’t work: we’re only considering activities that are institutional in purpose. In addition, we’re focusing on activities that changed digital technologies, because we’re interested in understanding the types of changes that were made. As a result we adopted a “change scale” as the Y-axis. The scale was adapted from software engineering/information systems research and is summarised below.

  • Use – tool used with no change. Example: add an element to a Moodle site.
  • Internal configuration – change the operation of a tool using the tool’s own configuration options. Example: change the appearance of a Moodle site via its course settings.
  • External configuration – change the operation of a tool using means external to the tool. Example: inject CSS or JavaScript into a Moodle site to change its operation.
  • Customization – change the tool by modifying its code. Example: modify the Moodle source code, or install a new plugin.
  • Supplement – use another tool (or tools) to offer functionality not provided by existing tools. Example: implement course-level social bookmarking by requiring use of Diigo.
  • Replacement – use another tool to replace/enhance functionality provided by existing tools. Example: require students to use external blog engines rather than the Moodle blog engine.
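To make the “external configuration” level concrete, here is a minimal sketch. The CSS selector, the colours, and the assumption that you can run a few lines of JavaScript against a Moodle page (e.g. via a user script) are all illustrative, not part of any real institutional setup:

```javascript
// Sketch only: the selector and rules below are invented examples.
// The point is "external configuration": changing a tool's appearance
// from outside its own settings, here by injecting a CSS rule.

// Build a CSS rule as a string (kept pure so it can be tested outside a browser).
function buildCourseCss(selector, rules) {
  const body = Object.entries(rules)
    .map(([prop, val]) => `  ${prop}: ${val};`)
    .join('\n');
  return `${selector} {\n${body}\n}`;
}

const css = buildCourseCss('.course-content .section .activity', {
  'border-left': '4px solid #6c2',
  'padding-left': '0.5em',
});

// In a browser context, the rule would be injected like this:
if (typeof document !== 'undefined') {
  const style = document.createElement('style');
  style.textContent = css;
  document.head.appendChild(style);
}
```

The same pattern scales from cosmetic tweaks up to the Greasemonkey-style scripts discussed later in this post.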

Since we were new to the V&R mapping process and were trying to do this work quickly without being able to meet, some additional scaffolding was placed on the X-axis (visitor–resident). This provided a common level of understanding of the scale and was based on a specific (and fairly limited) definition of “social trace”. The lowest level of the scale was “tools used by teachers”, which meant no social trace. The scale gradually increased the number of people involved in the activities mediated by the digital technology: “subsets of students in a course”, “all students in a course”, and right on up to “anyone on the open web”.
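As a rough sketch of how a story ends up as a point on the map, each story can be treated as data and located on the two scales. The scale labels below paraphrase this post, the numeric positions are arbitrary, and the example story is invented:

```javascript
// Hypothetical sketch of the modified V&R map as data.

// Y-axis: the change scale, from "use" up to "replacement".
const changeScale = [
  'use', 'internal configuration', 'external configuration',
  'customization', 'supplement', 'replacement',
];

// X-axis: visitor -> resident, scaffolded by who can see a social trace.
const audienceScale = [
  'tools used by teachers',          // no social trace: visitor end
  'subsets of students in a course',
  'all students in a course',
  'anyone on the open web',          // resident end
];

// Map a story (in the simple story format) to (x, y) on the template map.
function mapStory(story) {
  return {
    title: story.title,
    x: audienceScale.indexOf(story.audience),
    y: changeScale.indexOf(story.change),
  };
}

// An invented example story.
const point = mapStory({
  title: 'Require external blog engines',
  change: 'replacement',
  audience: 'anyone on the open web',
});
```

Plotting the resulting points would reproduce something like the template map below; the value of the exercise was less the plot than the shared vocabulary the two scales gave us.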

The following image is the “template” map that each of us used to map out our stories of changing digital technologies.

Modified V&R map template

An example map and stories

The following image is the outcome of mapping my stories of change. A couple of example stories are included after the image.

My V&R change map

Know thy student

This story involves replacing/supplementing existing digital tools, but is something that only I use. Hence Visitor/Replacement.

What? A collection of Greasemonkey scripts, web scraping, and a local database/server designed to help me know my students and what they were doing in the Study Desk. Wherever there is a Moodle user profile link, the script will add a link [ details ] that is specific for each user. If I click on that link I see a popup window with a range of information about the student.

Why? Because finding out this information about a student would normally take 10+ minutes and require the use of multiple different web pages in two different systems. Many of these pages don’t exactly make it easy to see the information. Knowing the students better is a core part of improving my teaching.

Outcomes? It’s been a godsend, saving time and enabling me to be more aware of student progress.
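The mechanics of such a script can be sketched roughly as follows. The id-extraction pattern reflects the usual shape of Moodle profile URLs (`user/view.php?id=…`), but the local server address is a hypothetical stand-in for whatever the real setup talks to:

```javascript
// Sketch of the "Know thy student" idea as a Greasemonkey-style script.
// Moodle profile links look like .../user/view.php?id=1234&course=5678;
// the script extracts the user id and points a "[ details ]" link at a
// local server (the URL below is invented).

const DETAILS_BASE = 'http://localhost/knowthystudent/';

// Pull the Moodle user id out of a profile link (pure, testable).
function userIdFromProfileUrl(url) {
  const match = url.match(/user\/view\.php\?id=(\d+)/);
  return match ? match[1] : null;
}

function detailsUrlFor(profileUrl) {
  const id = userIdFromProfileUrl(profileUrl);
  return id ? `${DETAILS_BASE}?user=${id}` : null;
}

// In a browser, each profile link gets a companion "[ details ]" link:
if (typeof document !== 'undefined') {
  for (const a of document.querySelectorAll('a[href*="user/view.php"]')) {
    const details = detailsUrlFor(a.href);
    if (!details) continue;
    const extra = document.createElement('a');
    extra.href = details;
    extra.textContent = ' [ details ]';
    a.after(extra);
  }
}
```

The real version does considerably more (popups, scraped data, a local database), but the pattern of augmenting institutional pages from the outside is the interesting part.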

Using links in student blog posts

A fairly minor example of change. There’s a question of whether it’s just “use” or “internal configuration”. After all, it’s just using an editor on a web page to create some HTML. It was bumped up to “internal configuration” because of an observation that hyperlinks were not often used by many teachers. Something I’m hoping that @beerc will test empirically.

What? Some comments I write on student blog posts will make use of links to offer pointers to relevant resources.

Why? It’s more useful/easy to the students to have the direct link. Hence more likely to make use of the suggestion.

Outcomes? Minor anecdotal positive comments. Not really known

Early indications and reflection

The change scale worked okay but could use some additional reflection. In particular we raised some questions about whether many of the “replacement” examples of change (including those in my map above) are actually examples of supplement.

On reflecting on all this we made some initial observations, including

  1. Regardless of perceived levels of digital literacy we all engaged in a range of changes to digital technologies.
  2. Not surprisingly, the breadth/complexity of those changes increased with greater digital literacy.
  3. In the end very few of our changes were “replacement”. Almost all were focused more on overcoming perceived shortcomings with the provided tools, rather than duplicating their functionality.
  4. Most of the changes tended to congregate towards the “visitor” end of the X-axis. Not surprising given that all of the digital technologies provided by the institution are not on the open web.
  5. Almost all of the stories that involved “replacement” were based on moving out onto the “open web”. i.e. they were all located toward the “resident” end of the X-axis.
  6. Changes were being made due to two main reasons: improving the efficiency of institutional systems or practices; or, customising digital technologies to fit the specific learning activities we wanted to implement.

Are our institutions digital visitors? What are the impacts on learning and teaching?

As it happens, we’ve been talking and thinking about the Visitor/Resident typology (White & Cornu, 2011) the last couple of weeks. The network gods have been kind, because overnight a post titled “The resident web and its impact on the academy” (Lanclos & White, 2015) floated across my Twitter stream. Much food for thought.

It has me wondering

Are universities digital visitors? If so, what impact is this having on learning and teaching?

Update: more reading and thinking has led to the addition of a section “Branding pushing out social traces”.

Residents and visitors

White & Cornu (2011) describe visitors as those that

understand the Web as akin to an untidy garden tool shed. They have defined a goal or task and go into the shed to select an appropriate tool which they use to attain their goal…Visitors are unlikely to have any form of persistent profile online which projects their identity into the digital space

White & Cornu (2011) describe residents as those that

see the Web as a place, perhaps like a park or a building in which there are clusters of friends and colleagues whom they can approach and with whom they can share information about their life and work. A proportion of their lives is actually lived out online where the distinction between online and off–line is increasingly blurred. Residents are happy to go online simply to spend time with others and they are likely to consider that they ‘belong’ to a community which is located in the virtual…To Residents, the Web is a place to express opinions, a place in which relationships can be formed and extended.

How Universities think about digital learning spaces

@damoclarky and I argued that institutional digital learning is informed by the SET mindset. A mindset that approaches any large, complex problem (like digital learning) with a Tree-like approach. That is, it employs logical decomposition to break the large problem up into smaller and smaller problems until there is a collection of solvable problems that can be allocated to individual units. The units then solve the problems (largely) independently, and the small solutions are joined back up together and consequently (hopefully) solve the original big problem.

You can see evidence of this tree-like perspective all over our institutions and the digital learning spaces they produce.

The institutions themselves are divided into hierarchical organisational structures.

What the institution teaches is divided up into a hierarchical structure consisting of programs (degrees), majors, courses, semesters, weeks, lectures, and tutorials.

And more relevant to this argument, the institutional, digital learning space is divided up into separate tools.

At my institution those separate tools include, but are not limited to:

  • the staff/student portal;
  • the Learning Management System;
    In the case of my institution that’s Moodle. Moodle (like many of these systems) is structured into a tree-like collection of modules. The “M” in Moodle stands for Modular.
  • the eportfolio system;
  • the learning object repository system;
  • the library system;
  • the gradebook (Peoplesoft); etc….

Each tool is designed to serve a particular goal, to help complete a specific task.

Hence the tendency for people to see these digital learning spaces “as akin to an untidy garden tool shed” where, when they want to do something, they “go into the shed to select an appropriate tool which they use to attain their goal” (White & Cornu, 2011).

This collection of separate tools is not likely to be seen as a “place, perhaps like a park or a building in which there are clusters of friends and colleagues whom they can approach and with whom they can” (White & Cornu, 2011) learn.

Of course, there is some awareness of this problem, which leads to a solution.

Brand as unifying solution

Increasingly, the one solution that the corporate university seems able to provide for this “untidy garden tool shed” problem is branding. The idea being that if all the tools use the same, approved, corporate brand then all will be ok. It will be seen as an institutional learning space. With the emphasis explicitly on the institution. It is the institution’s brand that is used to cover the learning space, not the learners’ and not the teachers’. I see some problems with this.

First, is the observation made by Lanclos and White (2015) in the context of the resident web and the academy

scholars will gain a form of currency by becoming perceived as “human” (the extent to which ‘humanness’ must be honest self-expression or could be fabricated is an interesting question here) rather than cloaked by the deliberately de-humanised unemotive academic voice.

In this context the problem isn’t so much the “de-humanised unemotive academic voice” as it is the stultifying stripping of individuality on the altar of the institutional identity. It doesn’t matter whether you’re learning engineering, accounting, teaching or anything else. It’s the institution and how it wishes to project itself that matters.

Which creates the second problem for which one of my institution’s documents around a large institutional digital learning project provides a wonderful exemplar.

Can you have a digital learning experience that is consistent, brand enhancing, and optimal for each student? I tend to think not. Especially in light of arguments that the diversification and massification of the student body has led universities to shift their education rhetoric from a notion of “one size fits all” to a concept of tailored, flexible learning (Lewis, Marginson et al. 2005).

My current experience is that instead of getting digital learning spaces that support tailored and flexible learning, institutions are more likely to create learning spaces that “have less variety in approach than a low-end fast-food restaurant” (Dede, 2008, p. 58).

Brand pushing out social traces

The visitors/residents typology (White and Cornu, 2011) is particularly interested in whether or not people are leaving social traces of themselves online as they interact with digital learning spaces (well, they are actually focused on the participatory web, but I’ll narrow it a bit). Does the “consistent..brand enhancing” approach to institutional digital learning spaces limit the likelihood of social traces being left? Can institutional digital learning spaces be seen as places people will want to reside within when they’re branded?

It would seem obvious that such a branded space couldn’t be seen as “my space”, especially for students. But what about the impact on teachers? Many teachers – for better or worse – like to customise the learning space, not only for the needs of the students, but also to project their personality. Can this be done in a branded digital space?

Impact on learning?

The above points to an institutionally provided (and sometimes mandated) digital learning space that is more likely to resemble a consistently branded, untidy garden tool shed. A space that learners and teachers are unlikely to perceive as one they would wish to inhabit. Instead, it’s more likely to encourage them to see the learning space as a place to visit, complete a task, and leave ASAP. Which would appear likely to negatively impact engagement and learning.

It would also appear to be a perception that is not going to help institutions address a pressure identified by Lanclos and White (2015)

The academy can no longer simply serve its own communities in the context of the networked Web, and it is under increasing cultural pressure to reach out and appear relevant.


Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43–62). New York: Springer.

Lewis, T., S. Marginson, et al. (2005). “The network university? Technology, culture and organisational complexity in contemporary higher education.” Higher Education Quarterly 59(1): 56-75.

White, D., & Le Cornu, A. (2011). Visitors and Residents: A new typology for online engagement. First Monday, 16(9). doi:10.5210/fm.v16i9.3171

What is "netgl" and how might it apply to my problem

At least a couple of the students in a course I help out with are struggling a little with Assignment 2 which asks them “to develop a theory-informed plan for using NGL to transform your teaching (very broadly defined) practice”.

The following is a collection of bits of advice that will hopefully help. Littered throughout are also some examples from my own practice.

NGL != social media

Networked and Global Learning (NGL/netgl) should not be interpreted to mean the use of social media. In the course we use blogs, Diigo, feed readers etc. as the primary form of NGL practice, and in the past this has led folk to think that NGL equates to the use of social media.

Just because we used blogs, Diigo, and feed readers, that doesn’t mean you should. You should use whatever is appropriate to your problem and your context.

What is NGL?

Which begs the question: if not just social media, what is NGL?

As I hope was demonstrated in the first two-thirds of the course there is no one definition of NGL. There are many different views from many different perspectives.

The first week’s material had a section on networked learning that included a few broad definitions. I particularly like the Goodyear et al (2014) quote that includes

learning networks now consist of heterogeneous assemblages of tasks, activities, people, roles, rules, places, tools, artefacts and other resources, distributed in complex configurations across time and space and involving digital, non-digital and hybrid entities.

The course material also covers more specific conceptions of NGL. e.g. connectivism gets a mention in week 1, as does public click pedagogy.

Week 3 mentions groups, networks, collectives and communities; the idea of network learning as a 3rd generation of pedagogy; and some historical origins of network learning.

What’s your problem?

It’s all overwhelming, is a common refrain I’m hearing. Understanding that there is a range of different views of NGL probably isn’t going to help. That’s one of the reasons why Assignment 2 is intended to use a design-based research (DBR) approach, i.e. (emphasis added)

a particular approach to research that seeks to address practical problems by using theories and other knowledge to develop and enhance new practices, tools and theories.

At some level DBR can help narrow your focus by asking you to focus on a practical problem. A problem near and dear to your heart and practice.

Of course, the nature of “problems” in and around education are themselves likely to be complex and overwhelming. The example I give from my own practice – described initially as “university e-learning tends to be so bad” or “a bit like teenage sex” is a big complex problem with lots of perspectives.

How do you reduce the big overwhelming problem to something that you can meaningfully address?

This is where the literature and theory(ies) enter the picture.

What might “theory informed” mean?

First, go and read a short post titled What is theory and why use theories?.

Adopting this broad and pragmatic view of theory, there are many ideas and concepts littered throughout this course (and many, many more outside) including, but not limited to: connectivism; connected learning; communities of practice; groups, networks, collectives, and communities; threshold concepts etc. In understanding your problem, you are liable to draw upon a range of others.

As per the short post theories are meant to be useful to you in understanding a situation or problem and then as an aid in formulating action.

Combining theories from NGL and your “problem”

The theories for assignment 2 aren’t limited just to theories from NGL. You should also use theories that are relevant to your problem.

You look around for how other people have conceptualised the problem and the approaches and theories that they have used. Do any of those resonate with you? Can you see any problems or limitations with the approaches used? Are there other theoretical lenses or just simple ways of understanding the problem that help narrow down useful avenues for action?

In terms of my problem with the perceived quality limitations of university e-learning, I’ve been using the TPACK framework for a while as one theoretical lens. TPACK is a fairly recent and broadly used theory for understanding the knowledge teachers require to design technology-based learning experiences. (Since all models are wrong, it has its limitations.)

Drawing on TPACK I wonder if the reason why university e-learning is so bad is because the TPACK (knowledge) being used to design, implement, and support it is insufficient. It needs to be improved.

Not an earth-shatteringly insightful or novel suggestion. But focusing on TPACK does suggest that perhaps I concentrate my attention for potential solutions within the TPACK-related literature, rather than elsewhere. There is almost always more literature than anybody (especially in the context of a few weeks) can get their head around. So, for better or worse, you need to start drawing boundaries.

Now with a focus on TPACK it’s time to combine my personal experience with the theory and associated literature. My personal experience and context may also help focus my exploration. e.g. if I were working in a TAFE/VET context, I might start looking at the literature for mentions of TPACK in the TAFE/VET context (or just at TAFE/VET literature). Again, narrowing down the focus.

I might find that there’s nothing in the TAFE/VET context that mentions TPACK in conjunction with e-learning. This might highlight an opportunity to learn lessons from other contexts and test them out in the TAFE/VET context. Or there might already be some TPACK/TAFE/VET/e-learning literature that I can learn from.

In my case, as someone with relatively high TPACK, I get really annoyed when people think the main challenge is “low digital fluency of faculty” (i.e. teaching staff). This gets me thinking that perhaps the problem isn’t going to be solved by focusing on developing the knowledge of teaching staff. i.e. requiring teaching staff to have formal teaching qualifications isn’t (I believe) going to solve the problem, so what is?

You want digitally fluent faculty?

This is potentially interesting because a fair chunk of existing practice assumes that formal teaching qualifications or the “right” professional development opportunities will help teaching staff develop the right TPACK and thus university e-learning will be fantastic. Being able to mount a counter to a prevailing orthodoxy might be interesting and useful. It might make a contribution. It might also identify a fundamental misunderstanding of a problem and a need to read and consider further.

In my case that led to an interest in (seeing a connection with) another theoretical idea, i.e. the distributive view of learning and knowledge. I do recommend Putnam & Borko (2000) as a good place to start learning about how the distributive view of knowledge and thinking can help situate teacher learning.

The combination of TPACK and the distributive view of learning appears to be useful. So we ended up using it in this paper to explore our experience with university e-learning. That work led to questions such as

  • How can institutional learning and teaching support engage with the situated nature of TPACK and its development?
  • How can University-based systems and teaching practices be closer to, or better situated in, the teaching contexts experienced by pre-service educators?
  • How can the development of TPACK by teacher educators be made more social?
  • How can TPACK be shared with other teacher educators and their students?
  • Can the outputs of digital renovation practices by individual staff be shared?
  • How can institutions encourage innovation through digital renovation?
  • What are the challenges and benefits involved in encouraging digital renovation?

Most of these are questions that could be good candidates for a design-based research project. i.e. can you use these and other theories to design an intervention or change in practice?

Designing an intervention

This recent post is my attempt to answer at least this question from above

How can institutional learning and teaching support engage with the situated nature of TPACK and its development?

It takes the distributed view of TPACK, the BAD mindset, and tries to envision some changes in practice/technology that might embody the principles from those theoretical ideas.

The idea is that being guided by those theoretical ideas makes it more likely that I can predict what can/should happen. I can justify the design of the intervention. I might be wrong, but it will hopefully be a better reason for the specific design approach than “because I wanted to”.

The ultimate aim of a DBR approach is to design, implement, and then test this design to see if it does achieve what I think it might.

Don’t forget the context. Don’t focus on the technology

My example above is very heavy in terms of technology and requires fairly significant technical expertise. That’s because it is something that I’ve designed for my specific context. It makes sense (hopefully) within that context.

If I were someone else working (with less technical knowledge) in a different context (e.g. an outback school with no Internet connection), then the solution I would design would be different.

Putnam and Borko (2000) give a range of examples around teacher learning that aren’t heavily technology based. If there is no Internet connection, there might be a high prevalence of mobile phones. If not, I might need to become a little more creative about using low levels of digital technologies.

In fact, if I were in a very low technology environment, I’d be actively searching the literature for insight and ideas about how other people have dealt with this problem. Almost certainly I wouldn’t be the first in the world.


Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4-15.

What is theory and why use theories?

The following is an edited version of something used in a course I teach that’s currently hidden away in the LMS. I’m adding it here because I’m using it with another group of students.

It’s a quick attempt to cover what I perceive to be a reasonable hole for many education students, i.e. what exactly is a theory and why the hell would I want to use them? My impression is that not many of them have developed an answer to these questions that they are comfortable with.

This is a complex and deeply contested pair of questions. I’m assuming that if you lined up 50 academics you’d get at least 50 different sets of answers. My hope is that this is a useful start for some. Feel free to add your own pointers and answers to these questions.

If you want a more detailed look into the nature of theory then I can recommend Gregor (2006).

What is theory?

I take an inclusive and pragmatic view of theory.

An inclusive view, because there is a huge array of very different ideas that can be labelled theories. A pragmatic view is taken because the reason we use theories in this course is to make it easier to do something. To understand a particular situation, or, for most reading this, to figure out how to design some use of digital technology to enhance or transform student learning.

Hirst (2012, p. 3) describes educational theory as

A domain of practical theory, concerned with formulating and justifying principles of action for a range of practical activities.

i.e. educational theory should help you teach and help your learners learn.

In the context of this particular course we touch on various ideas such as: the Computer Practice Framework, TPACK, Backwards Design, the RAT framework, the SAMR model, the TIP Model, constructivism, and many more. For the purposes of this course, we’ll call these things theories. They help with “formulating and justifying principles of action”.

There is huge variability in the purpose, validity, and approaches used to formulate and describe these objects called theories. A theory isn’t inherently useful, important, or even appropriate. That’s a judgement that you need to make.

A theory is just a model, and “all models are wrong, but some are useful” (Box, 1979).

Why use theories?

Thomas (1997, p. 78) cites Mouly (1978)

Theory is a convenience – a necessity, really – organizing a whole slough of facts, laws, concepts, constructs, principles into a meaningful and manageable form

These theories are useful because they help you understand, formulate and justify how and what to do. In this course, these theories will help you plan, implement, and evaluate/reflect upon the use of digital technologies to improve your teaching and your students’ learning.

Learning and teaching are difficult enough. When you add digital technologies to the mix even more complexity arises. The theories we introduce in this course should hopefully help you make sense of this complexity, and guide you in understanding, planning, implementing and evaluating your use of ICTs.


Gregor, S. (2006). The nature of theory in information systems. MIS Quarterly, 30(3), 611–642.

Hirst, P. H. (2012). Educational theory. In P. H. Hirst (Ed.), Educational Theory and Its Foundation Disciplines (pp. 3-29). Milton Park, UK: Routledge.

Thomas, G. (1997). What’s the Use of Theory? Harvard Educational Review, 67(1), 75-105.

Technology required by teachers to customise technology-enhanced units

This is the 2nd post (first here) looking at Instructional Science 43(2) on the topic of “Teachers as designers of technology enhanced learning”. This post looks at Matuk et al (2015).

In summary

  1. The claim is that the ability for teachers to customise is positive for learning.

    Teachers’ involvement in curriculum design is essential for sustaining the relevance of technology-enhanced learning materials. Customizing – making small adjustments to tailor given materials to particular situations and settings – is one design activity in which busy teachers can feasibly engage. Research indicates that customizations based in evidence from student work lead to improved learning outcomes (p. 229)

  2. Customisations by four middle/high school teachers are examined to see how these customisations were afforded.
  3. Identified 4 types of customisations (the abstract says 3, but then proceeds to list these 4):
    • “devising timely instructional interventions to provide individualised guidance”
    • “planning activities and adjusting milestones to align with students’ progress”
    • “modifying existing materials to better integrate content into overall curriculum plans”
    • incorporating scaffolds to better address students’ needs
  4. Identified 3 technology features that support customisations
    • A system that logs student work for teachers’ inspection;
    • tools for conducting dynamic, formative assessment; and,
    • an authoring environment that supports re-design of units at multiple levels of granularity

In this paper, we argue that teachers’ effectiveness in customizing TEL materials also relies on the affordances of the tools available to them, particularly in their ability to make students’ ideas visible (p. 232)

Preliminary design principles “for flexibly adaptive curriculum materials based on the premise of making student work visible as evidence to inform teachers’ customizations” (p. 250)

  1. Provide an interface for browsing logged responses;
    i.e. display responses and revisions “and give teachers a persistent record of their students’ thinking”.
  2. Integrate scaffolds that make student thinking explicit;
    i.e. make students’ thinking processes visible to teachers to enable formative advice. Strong link here with learning process analytics (Lockyer et al, 2013)
  3. Provide technologies to monitor real-time progress;
  4. Offer flexible, accessible authoring tools that support testing and refinement.
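The first two principles hinge on logging every revision of a student’s response, not just the latest answer. A minimal sketch of what such a persistent response log might look like (all names here are hypothetical; the paper does not describe WISE’s implementation at this level of detail):

```python
from collections import defaultdict
from datetime import datetime, timezone

# Hypothetical log of student responses: each revision is appended, never
# overwritten, giving teachers a persistent record of students' thinking.
response_log = defaultdict(list)


def log_response(student_id, step_id, text):
    """Append a revision rather than replacing earlier answers."""
    response_log[(student_id, step_id)].append(
        {"text": text, "at": datetime.now(timezone.utc)}
    )


def revision_history(student_id, step_id):
    """Return every revision a student made to a step, oldest first."""
    return [r["text"] for r in response_log[(student_id, step_id)]]


# Hypothetical usage: two revisions of the same answer
log_response("s1", "q1", "Plants eat soil")
log_response("s1", "q1", "Plants make food from light via photosynthesis")
```

Browsing `revision_history` for a step is the kind of “persistent record of students’ thinking” the first principle asks an interface to expose.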

Challenges for future technologies: a research and design agenda

  1. How do we design interfaces and real-time displays that make students’ logged data both accessible to, and usable by teachers?
  2. How can we make the underlying instructional framework transparent such that the curriculum materials themselves guide teachers’ customizations?
  3. How can authoring tools be designed that both take advantage of teachers’ expertise and respect their time?

Some of the findings echo some of the ideas from learning analytics, but more directly from a teacher perspective.


  1. The participants and context for this study were fairly limited. What types of customisations, and features to support customisations, might be identified by examining the work of other teachers in other contexts? Especially contexts that make significantly greater use of digital technologies (e.g. largely online university courses)?
  2. This paper appears to focus on teachers redesigning technology-enhanced “curriculum materials”, almost a content focus. What differences do you have to consider if you see digital technology as part of the learning space? As the environment in which learning occurs, not just the curriculum?
  3. The idea of educative curriculum materials – “curriculum materials with additional tools and resources to aid teachers in attending to changing classroom dynamics, reflecting on their practices, and seeking new approaches to solving problems” (p. 233) – resonates with the idea of Context Appropriate Scaffolding Assemblages (CASA), including the idea of a CASA that allows course designers (teacher educators) to annotate their digital learning spaces (course sites) with explanations and rationalisations behind the designs. Perhaps something useful for other teacher educators, but also for pre-service teachers (links to an idea that @palbion has previously mentioned).
  4. How does this paper’s purpose/context

    existing research establishes that technology can support teachers’ customizations. It also characterizes broad categories of the kinds of customizations teachers make. Still, little is known about the specific ways by which technology enables customizations, especially those based in students’ ideas.

    link and inform the purpose/context of the paper(s) we’re thinking of?
    Links somewhat back to questions #1 and #2. A different context and a broader notion of digital technologies. Also perhaps a focus more on the type of digital knowledge required of teachers. “Affordance”, as in “affordance of a technology for customisation”, is a relational term. It’s dependent on the functionality of the technology and the teacher’s capability to perceive and perform tasks with that functionality.

  5. The technology-enhanced units being customised here can be customised “without the need for programming skills” (p. 234). Might not this limit the type of customisations that teachers can undertake? Might not teachers with programming skills want to make different customisations, and thus require different affordances from the systems? The customisations identified in this paper are very dependent on the nature of the system and the affordances it offered. Would a more open system, combined with a teacher with programming skills, identify more and different customisations/technology features?
    Something that the authors identify later

    Our findings raise questions for future research about how teachers’ different prior knowledge of their students and of the subject matter, their individual skills with technology, and their personal orientations toward their roles as teachers and designers, influence their interpretations and responses to their students’ work. They also raise questions about how these interactions are manifested in teachers’ customizations. (p. 250)

  6. Is this observation

    A recent review of 30 technology-based inquiry-learning environments identified only eight, including WISE, that support teachers’ customizations (Donnelly et al. 2014) (pp. 234-235)

    indicative of a broader problem around digital technologies? i.e. they are generally not designed to be modified by teachers. There’s an aspect of that around the LMS; what about more broadly? How does this fit in with various perspectives about the (de-)professionalisation of teachers?

It’s all about putting the context back in

Reading the 4 types of customisation that were identified puts me in mind of the reusability paradox described as the tension between these two observations

  • “The more context a learning object has, the more (and the more easily) a learner can learn from it.”
  • “To make learning objects maximally reusable, learning objects should contain as little context as possible.”

And my current pet argument that the mindset underpinning the design and implementation of digital technologies for learning and teaching has a (strong) tendency to remove context and hence reduce pedagogical value.


What strikes me about the four customisations is that they are all about modifying the “technology-enhanced units” to insert more context: e.g. providing individualised guidance, aligning with students’ progress, better integrating content into overall curriculum plans, and better addressing students’ needs. All of these involve teachers modifying the “technology” to better respond to context.

Which resonates strongly with Shulman’s (1987) suggestion that

the key to distinguishing the knowledge base of teaching lies at the intersection of content and pedagogy, in the capacity of a teacher to transform the content knowledge he or she possesses into forms that are pedagogically powerful and yet adaptive to the variations in ability and background presented by the students (p. 15)

And also picks up a quote from this paper

The relationship between teachers and curriculum has been characterized as one between designers and their tools (Brown 2009). In designing curriculum, teachers combine available materials with their own knowledge and expertise to craft instructional experiences (Brown and Edelson 2003). (p. 232)

Animated gif of reusability paradox showing a trend to putting more context into the object



The authors argue that

materials that yield to teachers’ modifications better respond to the classroom’s changing needs, constraints, and resources…research finds that teachers who attend to students’ ideas design more effective instruction and formative feedback (Black and Wiliam 2010) (p. 230)

But the various constraints of the classroom setting mean that

their customization decisions tend to be driven by issues of practicality and feasibility (Boschman et al. 2014) rather than by evidence from students’ ideas

Reasons why materials are changed and how are outlined with some supporting references. Labelled as curriculum customizations (Brown and Edelson, 2003). Largely guided by experience, practicalities etc.

Customisation may be a process of differentiation leading to learning gains. “This process demands a degree of expertise” (p. 231). “Customisations based in students’ ideas have been shown to lead to improved learning outcomes (Ruiz-Primo and Furtak, 2007)…how teachers understand their students’ thinking also influences the kinds of customizations they make” (p. 232)

The role of technology in supporting customisation

The relationship between teachers and curriculum has been characterized as one between designers and their tools (Brown 2009)…Thus, by understanding how teachers use tools to aid their practice, we can further define their facilitating roles. (p. 232)

Apparently Schwartz et al (1999) make a point related to the need to provide flexibly adaptive materials that can support teacher customisation without losing integrity. Which brings up the interesting point

because whereas teachers’ adaptations of materials to local conditions can sometimes lead to improved student learning, it is also possible that they deviate from the intended value of the innovation (p. 232)

TEL materials can afford/guide customisations. Many examples of TEL curriculum material have done this. Also mentions educative curriculum materials as materials with additional tools and resources – e.g. annotations on documents viewable by a teacher that offer suggestions for implementation and describe the rationale behind these designs.

The context

Case studies arise from use of the Web-based Inquiry Science Environment (WISE), a system used by 9,900+ teachers and 80,000+ students, with 8,000 different customised WISE units (at the time of writing). Up-to-date statistics are available from the web site.

Essentially appears to be a collection of established units in the form of web pages, animations etc supported by various functions (e.g. concept maps). It does have an authoring environment that “allows users to copy and modify existing units without the need for programming skills” (p. 234).

This is interesting

A recent review of 30 technology-based inquiry-learning environments identified only eight, including WISE, that support teachers’ customizations (Donnelly et al. 2014) (pp. 234-235)

Cases of customisation and the role of technology

Much detailed description. Explaining how and why the four teachers customised the WISE units in response to their students. Shows the origins of the four types of customisation.


Teachers used different tools based on a range of factors:

students’ differing needs; the conceptual and linguistic challenges most prominent in teachers’ regard; teachers’ own instructional goals; and teachers’ orientations toward technology, pedagogy and their roles as designers with respect to the curriculum materials (p. 248)

There was variability in modes of customisation – variability in level of digital changes

These differences in customization mode might be explained by teachers’ familiarity with, and orientations toward technology; as well as to the support available for using that technology (Inan and Lowther 2009; Koehler and Mishra 2008; Zhao et al. 2002)….If teachers did indeed vary in their facilities and familiarities with technology, then with consistent amounts of training, their customization strategies would come to more closely resemble one another. But another explanation for teachers’ differences is their perceptions of themselves as designers (Cviko et al. 2013) and as research participants in curriculum development projects such as WISE. (p. 249)

The last point is perhaps interesting.

How can technologies offload the effort involved in giving individualised guidance?

“logistic constraints of the classroom can limit what teachers can do” (p. 253). Mainly talks about automation as the tactic. A fairly limited discussion, and something a lot of machine intelligence researchers are working on.


Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439–1459. doi:10.1177/0002764213479367

Matuk, C. F., Linn, M. C., & Eylon, B.-S. (2015). Technology to support teachers using evidence from student work to customize technology-enhanced inquiry units. Instructional Science, 43, 229–257. doi:10.1007/s11251-014-9338-1

Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–21.

Teachers as designers of technology enhanced learning?

Some colleagues and I are starting to wonder about what type of “digital knowledge” teachers might need. This is occurring in the context of a re-design of a Bachelor of Education. This particular post is a summary of reading and thinking about ideas outlined in Kirschner (2015) and related writings. Apparently Instructional Science 43(2) features contributions discussing “teacher as a design professional”.

In particular, the idea of

if and how teachers as designers of technology enhanced learning might (not) be feasible or even desirable (p. 309)

Some of the major points Kirschner (2015) makes include

  1. Technology Enhanced Learning (TEL) is not new. Teachers have been using technologies to enhance learning since movable type and the blackboard (and perhaps earlier).

    But the question here is whether or not digital technologies represent a different type of technology. e.g. Kay’s (1984) identification of the computer as the first metamedium and also “the important original idea of opening tool creation to every user – even children” (Wardrip-Fruin & Montfort, 2003, p. 391) suggest that digital technologies can/should be very different from the historical technologies that Kirschner relies upon.

    Be this as it may, the five contributions have not convinced me that TEL is different from all other innovations and/or why it should be treated as such. (p. 318)

    I’m wondering if this perceived lack of distinction between digital technologies and other technologies is the important question here. If digital technologies are just like other technologies, then learning how to use them is sufficient. But if digital technologies are different, then perhaps just learning how to use them is not sufficient.

  2. Higher level competencies are more important/fundamental than knowledge of tools, techniques, and technologies.
    I’m not convinced you can develop these higher level competencies without having knowledge and experience using the tools, techniques, and technologies that are contemporary to your initial entry into the profession (whatever profession).

There’s more in this special issue that might be of interest.

Role of design

Kirschner (2015) continues

Both practicing professionals and institutions for teacher education must understand and embrace the role of design in professional competencies if technology enhanced learning is ever to be fully integrated into teaching and learning processes (p. 309)

Which begs the question: what is meant by design?

Teacher competencies

Kirschner & Davis (2003) apparently identified “pedagogical benchmarks for ICT in teacher education” and argued that

as long as institutions for teacher education see the computer or ICT as an addition to teacher training and not as something fundamental to it, the computer would never become an integral well-used part of the teaching/learning process. (Kirschner, 2015, p. 310)

Suggesting that for this to happen, the teacher educators who design and teach into teacher education programs need to see ICT as integral to the teaching/learning process. i.e. at the very least they should see themselves as digital residents.

i.e. pre-service teachers don’t take courses in “teacher aided, textbook aided, or whiteboard aided instruction/learning”. There is a need to learn about the different ways these tools can be integrated into learning, but the point is that this shouldn’t happen in a separate course.

Should ICT be special and be taught in special courses? In a perfect world, perhaps not, as all of the teacher educators would be digital residents, easily demonstrating how ICT is integral to learning and teaching. But is this happening?

Kirschner uses “teacher competencies” to answer. Teacher competencies are defined as

combination of complex cognitive and higher- order skills, highly integrated knowledge structures, interpersonal and social skills, and attitudes and values (Van Merrienboer & Kirschner, 2012, p.2)

The suggestion is that “Many professionals, be they academic or vocational, have five basic competencies” (Kirschner, 2015, p. 310)

  1. Gathering necessary background/situational information;
  2. Analysing that information and arriving at a diagnosis for/decision as to a course of action;
  3. Determining exactly what needs to be done/what steps need to be taken;
  4. Carrying out the chosen actions; and,
  5. Evaluating whether the result of the actions was what was hoped for or expected, and if not, returning to #1.

I do wonder what the argument is for establishing these as the 5 basic competencies? Not that I necessarily disagree, but why these 5?

A connection is made to medical doctors and the point is made that

the basic competences do not really change, but rather the enabling/underlying knowledge, skills, and attitudes do. (p. 311).

For a teacher these changes include

a wealth of new and/or different domain specific knowledge, pedagogic knowledge and pedagogic content knowledge that is increasingly evidence informed….now includes new pedagogical techniques and mastery of newly available technologies. And attitudes of society, learners, parents/guardians are changing, even concerning the role and function of knowledge, learning and even formal education (p. 311)

Kirschner (2015) suggests that “the design of TEL is not a new competence that needs to be acquired, but is rather the twenty first century equivalent of twentieth century phenomena” (p. 311). The connection is made with the AECT and its origins in the use of audio-visual technologies. The faddish nature of technology is illustrated and the suggestion is that teacher training needs to focus on training that can be applied/transferred across a variety of contexts over an unlimited time span.


  1. There’s an assumption here that such knowledge/training is or can be currently known, can be meaningfully taught to students, and will maintain its currency into the future and into different contexts. Is there such knowledge?
  2. Or do the five competencies above represent the ability to learn and adapt, and thus form the only knowledge that can be applied/transferred?
  3. How does this perspective go when faced with the reality of politics etc, especially those participants who believe that there is the “one true” method for learning and the “one true” set of knowledge to be imparted on learners?

Relationship between competencies and tools, techniques and ingredients

Can this higher level and transferable set of competencies be successfully imparted to novices, people who have yet to experience a range of contexts?

Kirschner (2015) argues

from the viewpoint of teacher competences, there is really no need for specific attention to TEL. TEL is an artefact of the times, and is essentially the educational equivalent of part of the expert chef’s arsenal of tools, techniques and ingredients.

Granted my knowledge of chef training is limited to the media, but the abiding image I have of the film Julie and Julia is of lots of practice chopping onions.

This image – reinforced by just about every other popular representation of chefs – suggests that a first step in becoming a chef is developing the immediate capabilities required to operate within a professional kitchen. The first step in learning to be a chef is not learning generic high level competencies. You first have to become expert in the “arsenal of tools, techniques and ingredients” so you can start to develop and apply the high level competencies to become a truly creative chef.

This echoes my observations of the difficulties faced in first year programming courses. Courses where problems arose when the expert programmer/academic assumed novice programmers should start by learning the fundamentals of program design (very similar to Kirschner’s 5 competencies) without having to worry about the low level of skills of basic programming language syntax.

Kirschner (2015) positions the “modern day expert teacher” as a “top-chef who integrates different educational ingredients according to effective, efficient and enjoyable pedagogic/educational techniques making use of different tools and technologies afforded at this moment” (pp 312-313).

The problem is that you can’t be a top chef unless you are intimately familiar with the ingredients, tools and techniques.

Which is perhaps captured by this quote from Van den Dool and Kirschner (2003, p. 176)

Teachers need to integrate ICT competence into their core teaching competences and the educational system must integrate it into the heart of learning and teaching. What really counts at the end of the day is if teachers and learners feel that ICT tools are a ‘normal’ part of their competences and not an add-on, either in a positive or negative sense.

What is this digital competence?

My original annotation on this quote from February this year was

Yes, but given we don’t yet have a good handle on how to effectively do this and most teachers aren’t, isn’t it necessary that they learn how?

What is this “ICT competence” that he talks of? What is its nature? Is it just being a tool user? Is it being a digital renovator? How can teacher educators – many of whom haven’t “integrated ICT competence into the core teaching competences” – make judgements about what is ICT competence?

How can educational systems that have failed to integrate ICT “into the heart of learning and teaching” make judgements about what is/isn’t required?

Ecology of Education

Uses CSCL and “New learning” as examples of approaches that lost their way.

e.g. CSCL focused too much on the technology, then on the nature of collaboration, without paying enough attention to the type of learning. Not sure that any form of separation – computer vs collaboration vs learning – is an appropriate way forward. It’s about a mixture of all three to create something new.

New learning posed new approaches and was intended to complement, rather than replace, older approaches. But the bandwagon took over. That happens.

The “ecology of education” is defined as the exchanges and affordances between learners, teachers, and the digital tools, virtual environments, and physical spaces. As an ecosystem it is both

  1. a system; and,

    i.e. “a complex whole made up of elements that work together as parts of an interconnecting network” (p. 314)

  2. systemic.

    Change any part and the change will ripple through the entire system.

And this ecosystem extends out more broadly into its surrounding educational system – government policy, political parties, commercial companies etc…

The proposal is that

if research into teachers as designers of TEL is to have ecological validity, it must be undertaken in ways that accommodate the ecology of education, attending to its systems and systemic nature

It’s this systems and systemic nature that causes challenges such as

Teachers typically have little time, limited expertise and rarely any formal endorsement for their design efforts

They don’t readily have access to “their own work space or ‘down time’ during the day to maintain and increase their professionalism”.


The case for teachers as designers is grounded in policy, research, and practice:

developments towards decentralized structures require teachers to be more involved in curriculum design (e.g. Dinham 2005); research, which shows that teacher customization of materials can enhance student learning (Gerard et al. 2010); and practice, where the mismatch between existing resources and needs of specific learners/settings requires that teachers design to improve alignment (McLoughlin 2001). (p. 314)

Comments on special issue contributions

Kirschner (2015) now goes onto look at the other contributions.

Critique of McKenney et al (2015) includes

  • Can current teachers really develop the identified knowledge?
  • Are teachers actually reflective practitioners?

Cover et al (2015) – importance of teacher participation in design

  • Doubts about generalisability from two case studies.
  • Is participation a worthwhile goal in itself?
  • What about the student?

Matuk et al (2015) – “added value of teachers’ re-design of curriculum materials via small, systematic adjustments” – identifying 4 types of customisations and three technology features that support customisation

  • Limited use of student voice.
  • Subjective nature of the data, analysis etc.

Voogt et al (2015) – teacher participation in design teams provide professional development.

  • focus on teacher design at the lesson/course level, rather than curriculum level
  • move beyond adaptation and look at shared design and construction

Svihla et al (2015) identify patterns of support for teacher designing

  • limitations of the research design make the findings less than surprising.

Overall, results are “soft” due to reliance on small-scale case studies with varied methods and tools.

Discussions and conclusions

In commenting on the research framework proposed by McKenney et al (2015) it’s seen as potentially limited because it is based on “existing literature” and thus may not be comprehensive.

And this

Teachers are designers— of all learning, including TEL. Research in this area is important, but the TaD (of TEL) field is still young and needs to be more clearly placed in the broader ecology of education; that is, should not be compartmentalized as something different and fragmented in the field’s approach to it. (p. 320)


Kirschner, P. A. (2015). Do we need teachers as designers of technology enhanced learning? Instructional Science, 43(2), 309–322. doi:10.1007/s11251-015-9346-9

Wardrip-Fruin, N., & Montfort, N. (Eds.). (2003). The New Media Reader. Cambridge: The MIT Press.

University e-learning: Removing context and adding sediment

The following outlines the core of the argument used in a talk to folk at UniSA today titled “The perceived uselessness of the Technology Acceptance Model (TAM) for e-learning”. The argument is that the mindset underpinning the implementation of institutional e-learning within Universities focuses on widespread reuse across an institution (and sometimes beyond). As a result, institutional e-learning has a tendency to remove considerations of context, which in turn reduces/removes any chance of learners and teachers perceiving any usefulness or ease-of-use in the provided systems and processes.

The end result is that rather than enabling high quality learning experiences, institutional e-learning practices are creating sediment that clogs up any attempt to create high quality learning experiences. The following offers one possible explanation for why this is the case and offers a possible solution.

Example – “Know thy student”

The ability to know thy student is of central importance to learning and teaching. However, research around learning analytics has identified that institutional e-learning systems do a particularly poor job at supporting this fairly central task.

Seven years ago Dawson and McWilliam (2008, p. 3) found that

current LMS present poor data aggregation and similarly poor visualisation tools in terms of assisting staff in understanding…student learning behaviour

Two years ago Corrin et al. (2013, p. 204) found that

A common request that emerged across the focus groups was the ability to correlate data across systems

If I want to know who one of my students is, where they are located, what type of teacher they are studying to become (e.g. Early Childhood, Primary, Secondary, Special Education etc), what activities they’ve completed on the course site, and what course related posts they’ve written on their blog I have to (as summarised by the following image) spend 10+ minutes wandering around 3 different websites.


While the above diagram uses simple and consistent black boxes to represent each of the web pages I use to get the information, the reality is actually much more complex, as shown by the following image. It’s a full screen dump of the Activity Completion report in Moodle. Each of the rows in the massive table represents a student in my course. Each of the columns represents an activity they are asked to complete on the course site. A tick in a particular box indicates that they have completed that activity. Given the size and complexity of this representation it’s actually quite hard to identify whether or not a student has completed an activity.


Lesson from TAM – people won’t use this

The Technology Acceptance Model (TAM) proposes that people are much more likely to use a system if they perceive the system to be

  1. easy to use; and,
  2. useful.

Do you perceive the above system to be useful and easy to use?

I don’t. Which is why I (and, I assume, most other teaching staff) don’t use it.

Given that this is blindingly obvious, and that both Dawson et al (2008) and Corrin et al (2013) have already identified this problem, why hasn’t the problem been fixed?

SET mindset – removing context, usefulness, and ease of use

Institutional e-learning – like much in contemporary corporate Universities – is driven by a SET mindset.

Amongst the many problems with the SET mindset is that it must focus on reuse. The learning objects/systems that the SET mindset creates must be usable (at least) across an entire institution. This is contributed to by each of the components of the SET mindset.

  1. Strategic – the emphasis of the SET mindset is on strategic planning. Strategies that are important for the organisation. Strategic planning separates the planning from the doing. It separates the planning from the context.
  2. Established – the SET framework has an established view of digital technologies. i.e. it’s hard, expensive and subsequently wrong to modify or customise technology. Thus the organisation must use the same technology. It can’t be modified to respond to contextual needs.
  3. Tree-like – the SET framework breaks big, difficult problems down into lots of little parts that are solved separately. Each of the parts of the organisation is focused on their little part of the problem and doing it well. There can’t be sufficient focus on the useful whole (e.g. a learning experience). A learning experience is actually a combination of different parts (e.g. branded look and feel, maintaining uptime on Moodle, university policy on extensions etc.), yet the people with the greatest focus on the whole/the learning experience (i.e. the learners and teachers) have the least capability to modify or control the parts and how they are put together.

In terms of the reusability paradox the SET mindset tends to focus on reuse at the expense of pedagogical value. It removes context (and thus usefulness and ease of use) from the learning objects/systems in order to be able to reuse them in different contexts.


BAD mindset – adding context, usefulness, and ease of use

On the other hand, the BAD mindset tends to put context back into the learning objects/systems. It responds to the needs of a specific context and focuses on maximising usefulness and ease of use within the confines of that context. As a result, the BAD mindset tends to reduce the capability to reuse the learning object/system.


This tendency is contributed to by each of the parts of the BAD mindset

  1. Bricolage – combines doing and planning/design within the specifics of the context. Takes what is available within the context and uses that creatively to scratch an itch. The nature of the solution is dependent on the context, the available resources, and the connections that can be made.
  2. Affordances – the BAD mindset sees digital technology as inherently “protean…to be shaped and exploited…the first metamedium, and as such it has degrees of freedom for representation and expression” (Kay, 1984, p. 59). i.e. digital technology is a hugely flexible resource that can and should be “shaped and exploited” to fit the requirements specific to a context.
  3. Distributed – rather than see the world as a collection of separate boxes never to be questioned, the BAD mindset sees the world as a distributed collection of connections and relationships that exist to be connected and re-connected in new and useful ways.

The BAD solution to “know thy student”

Using the BAD mindset, I’ve implemented a solution to the “know thy student” problem that I’ve used this year. Another post offers a more detailed description of that solution. The above argument suggests that the BAD solution should be hugely more contextually appropriate and thus useful and easy to use than the SET solution. The following table provides a simple comparison between the SET and BAD solutions.

| Mindset | Where | How long |
| --- | --- | --- |
| SET | 3 separate websites, 9+ web pages | 10+ minutes |
| BAD | Wherever I interact with students on the course web site | 3 mouse clicks |

I certainly know which solution I use more.
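At its core, the BAD solution is just aggregation: pull the per-student fragments out of each system and key them on a shared identifier, then display the merged record wherever the teacher already is. Here is a minimal Python sketch of that idea; the system, field, and student names are invented for illustration (the actual solution is a mashup of Moodle, BIM, and other USQ-specific pieces, implemented quite differently):

```python
# A deliberately tiny, hypothetical model of the "know thy student" mashup:
# three dicts stand in for three separate systems, keyed on a shared
# (invented) student identifier.

def build_student_view(enrolments, lms_activity, blog_posts):
    """Merge per-system records into a single per-student view."""
    view = {}
    for student_id, record in enrolments.items():
        view[student_id] = {
            "name": record["name"],
            "specialisation": record["specialisation"],  # e.g. Early Childhood, Primary
            "completed_activities": lms_activity.get(student_id, []),
            "blog_posts": blog_posts.get(student_id, []),
        }
    return view

# Toy data standing in for the three separate websites
enrolments = {"s123": {"name": "Alex", "specialisation": "Primary"}}
lms_activity = {"s123": ["Week 1 quiz", "Introduction forum post"]}
blog_posts = {"s123": ["Why I want to teach"]}

view = build_student_view(enrolments, lms_activity, blog_posts)
print(view["s123"]["specialisation"])  # Primary
```

The point isn’t the code; it’s that once the fragments live in one structure, answering “who is this student?” is a lookup rather than a 10-minute wander across websites.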

Can this scale?


The “know thy student” solution I’ve been using can’t be used by anyone else as it depends on a mix of technologies and learning designs specific to my context. For example, I don’t think any other course (not taught by me) at USQ uses a combination of activity completion and the BIM module that I use in my course. But the whole point of the BAD mindset is that the specifics of the “know thy student” solution can and should be modified to suit the specifics of the design of your course and your context.

Some of the technologies I use won’t scale to other people. However, the general trend with digital technologies (e.g. the rise of API-centric architectures) is such that it can be easily re-created and scaled. The challenge at the moment is that the SET mindset is holding back the adoption and innovative use of these technological trends.

For example, Peter Albion has customised the Moodle Assignment activity to better suit his needs. Peter’s customisation should (with little or no modification) be able to be used by any teacher who is using the Moodle assignment activity. (It should be especially easy if you are using the Firefox browser, but not all that difficult if you are using another browser)

But not everyone can code. University e-learning systems currently have a starvation problem. That is, a range of projects that should get implemented can’t because there aren’t enough development resources. Customising everything to each specific context/learning design is never going to happen, unless perhaps everyone can do their own coding.

But that’s the point of a Distributed perspective. Not everyone needs to be able to code, though it might be a huge benefit. All you need to do is be connected through your various links and connections to someone who can code. You also need to be within an environment that actively enables people who can code to share what they do in a way that can be re-used, customised, and re-shared.

You need to be in an environment that recognises and responds to Anton Ego’s sentiment in the following image. Rather than a SET-based environment that believes a great artist (programmer) can only come from the IT division (if you’re lucky) and an external consultant (if you’re unlucky). E-learning’s starvation problem is coming from too few people and too few perspectives being allowed and encouraged to engage in modification.

Not everyone can

CASA – Context-Appropriate Scaffolding Assemblages

While @beerc and I were enjoying the following view of Queenstown post the 2015 ASCILITE conference we started talking about a range of ideas.

Queenstown View

One of those was the idea of CASA – Context-Appropriate Scaffolding Assemblages – as a representation of the type of “systems” that a BAD mindset would produce. Not as a replacement for the types of systems that the SET mindset generates. CASA are meant to be the recombination, reconnection, and mashup of a range of different parts of SET systems in ways that respond to contextual requirements.

Today’s talk at UniSA included an attempt to move the CASA concept forward a bit and give a few more examples of what CASA might look like.

The BAD and CASA acronyms overlap (but not as neatly as the following suggests)

  • Context-Appropriate and Bricolage.

    The focus is to increase pedagogical value (e.g. ease of use and usefulness) by putting more and more context into the e-learning systems. To enable individual teachers (and learners) to scratch the itches they have, rather than having to wait on the organisation or beyond.

  • Scaffolding and Affordances

    The aim is to modify digital technologies and generally make connections that help learners and teachers accomplish tasks that are specific to the context. Echoing the idea of Electronic Performance Support Systems (Hannafin et. al., 2001)

  • Assemblages and Distribution

    The focus is the on-going production, destruction, and re-construction of heterogeneous, productive, and desired socio-material connections/relationships. A naive and nascent channeling of Müller (2015) and Introna (2013).

Supplementary Assessment CASA

My faculty has a formal process for how supplementary assessments are meant to be managed using Moodle. To help academics implement this policy the faculty distributed version 2.7 of an 11 page PDF outlining the steps required.

This is an example of how the SET mindset is unable to insert additional context into its systems. Instead of modifying the tool to fit the task, the user has to modify their actions.

Rather than do this, why not implement a CASA that offers support to teaching staff to implement this policy? Support that is located exactly where they need it, i.e. in the Moodle assignment activity module, as shown in the following image.

supplementary CASA

The idea is that I know I need to set a supplementary assessment. I go to Moodle and add an assessment activity. But since I’ve installed this CASA on my browser, it modifies the traditional Moodle interface and adds a button for “Supplementary”. If I click on that button it scaffolds me through the process for creating a supplementary assessment as per university policy.

The CASA is implemented via augmented browsing. It’s not a modification of Moodle. Moodle has an inherently tree-like structure and modifying it can be seen as problematic. Though modifying Moodle to implement this (or any) CASA is possible, it’s probably unlikely.
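Stripped of the browser integration, the core decision such a CASA makes is simple: does the current page warrant the supplementary scaffold, and if so, which steps apply? The sketch below is Python purely for illustration (the real thing would be a userscript layered over Moodle); the URL pattern and the policy steps are invented, not the faculty’s actual process:

```python
import re

# Invented policy steps – the real ones live in the faculty's 11 page PDF.
SUPPLEMENTARY_STEPS = [
    "Duplicate the original assignment activity",
    "Prefix the name with 'Supplementary'",
    "Restrict access to the eligible students",
    "Set the due date required by policy",
]

def scaffold_for(url):
    """Return the scaffold steps if the page looks like a Moodle assignment."""
    if re.search(r"/mod/assign/", url):
        return SUPPLEMENTARY_STEPS
    return []

print(len(scaffold_for("https://lms.example.edu/mod/assign/view.php?id=42")))  # 4
print(scaffold_for("https://lms.example.edu/mod/forum/view.php?id=7"))  # []
```

The design choice is that the policy lives as data (a list of steps), so when version 2.8 of the PDF arrives only the data changes, not the scaffolding machinery.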

Minute paper CASA

A minute paper is a fairly well-known, simple, and effective strategy for getting feedback from students. However, Stead (2005) found that

the one-minute paper is perhaps not used especially extensively…largely due to lack of knowledge of its existence and the perception that it would be too time-consuming to analyse the responses (p. 118)

If only we had access to a technology that allowed for the simple capture, analysis, visualisation, and querying of data. That might help solve this problem.

Well, what about a minute paper CASA connected with the Moodle Feedback activity? The Feedback activity can serve this purpose. However, because it’s designed for reuse across a range of feedback contexts, configuring the Feedback activity to implement a minute paper takes a bit of work. It is also unlikely that the Feedback activity provides the type of analysis functionality that would be directly specific to the minute paper.

A minute paper CASA could add information about the minute paper to the Moodle interface, thereby increasing awareness (a little). But it could also provide a scaffolded (and perhaps almost entirely automated process) for creating a minute paper. The minute paper CASA could also usefully provide specific learning analytics for the minute paper.

minute paper CASA
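To make the “specific learning analytics” idea concrete, here’s a crude sketch of the kind of summary a minute paper CASA could layer over the Feedback activity’s raw responses: a tally of recurring terms in the “muddiest point” answers. Everything here (the stopword list, the sample responses) is invented for illustration:

```python
from collections import Counter

# Invented, minimal stopword list – a real CASA would use something better.
STOPWORDS = {"the", "a", "an", "of", "to", "i", "was", "is", "about"}

def muddiest_points(responses, top_n=3):
    """Tally the most frequent non-stopword terms across responses."""
    words = []
    for response in responses:
        words.extend(w for w in response.lower().split() if w not in STOPWORDS)
    return Counter(words).most_common(top_n)

# Toy "muddiest point" responses
responses = [
    "the reusability paradox",
    "I was confused about the reusability paradox",
    "assessment criteria",
]
print(muddiest_points(responses, top_n=2))  # [('reusability', 2), ('paradox', 2)]
```

Even something this crude answers the teacher’s real question (“what are most of them stuck on?”) faster than reading every response, which is exactly the time objection Stead (2005) identifies.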

Ice-breaker process analytics CASA

Lockyer et al (2013) define learning process analytics as

data and analysis (that) provide direct insight into learner information processing and knowledge application…within the tasks that the student completes as part of the learning design (p. 1448)

One of the learning designs I use early in my course is an ice breaker activity using a discussion forum. Students are asked to post an introduction to themselves and then read through the introductions provided by other students. Their aim is to locate someone they think is the “same” as them and someone who is “different”. Once identified, they are asked to say “Hi” to those individuals.

It’s not a bad activity. However, because the Moodle discussion forum is designed to be re-used in the broadest possible collection of contexts, it provides no scaffolding specific to this particular learning design. This is where a CASA specific to this learning design could help. Either when I create the discussion forum, or perhaps later when I configure it, I would specify that this discussion forum is being used for this specific learning design.

From then on, when I view this specific discussion forum the interface is modified to provide context-appropriate scaffolding. For example, a “check progress” button might be added to allow me to see where in the process students are up to. It might also provide some scaffolding around how I might encourage some of the laggards.

iceBreaker CASA 1

The CASA might also modify my “know thy students” CASA so that while I’m within this specific discussion forum the display is modified to include information specific to the learning design. In this case, a simple legend showing whether or not the student has completed the three required posts.

iceBreaker CASA 2
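The “check progress” logic reduces to counting each student’s posts against the three required by the learning design (an intro, plus replies to a “same” and a “different” student). A hypothetical sketch, with an invented post structure (a real CASA would pull this from the Moodle forum data):

```python
# Classify each student against the ice-breaker design's three required posts.
def icebreaker_progress(posts, students):
    """Map each student to how far through the ice-breaker they are."""
    progress = {}
    for student in students:
        mine = [p for p in posts if p["author"] == student]
        intros = sum(1 for p in mine if p["type"] == "intro")
        replies = sum(1 for p in mine if p["type"] == "reply")
        if intros == 0:
            progress[student] = "not started"
        elif replies < 2:
            progress[student] = "introduced, %d of 2 replies" % replies
        else:
            progress[student] = "complete"
    return progress

# Toy forum data
posts = [
    {"author": "Alex", "type": "intro"},
    {"author": "Alex", "type": "reply"},
    {"author": "Sam", "type": "intro"},
    {"author": "Sam", "type": "reply"},
    {"author": "Sam", "type": "reply"},
]
print(icebreaker_progress(posts, ["Alex", "Sam", "Jo"]))
```

This is exactly the sort of learning design-specific process analytic Lockyer et al (2013) describe: the generic forum tool has the data, but only the design knows that “three posts of particular kinds” is what completion means.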


The talk also briefly touched on the idea of a CASA for CASA. This idea was previously described in a post looking at a BAD approach to developing distributed TPACK.

What’s next?

The immediate focus (I hope) is on exploring how the “know thy student” “CASA” can be scaled, customised, and tested with other colleagues here at USQ. What challenges are likely to exist in trying to convince the SET-mindset parts of the institution that they need to break BAD? Which of the fears that they have about breaking BAD will be proven? What haven’t we predicted? Will it make any difference?


Introna, L. (2013). Epilogue: Performativity and the Becoming of Sociomaterial Assemblages. In F.-X. de Vaujany & N. Mitev (Eds.), Materiality and Space: Organizations, Artefacts and Practices (pp. 330–342). Palgrave Macmillan.

Müller, M. (2015). Assemblages and Actor-networks: Rethinking Socio-material Power, Politics and Space. Geography Compass, 9(1), 27–41. doi:10.1111/gec3.12192

Stead, D. R. (2005). A review of the one-minute paper. Active Learning in Higher Education, 6(2), 118–131. doi:10.1177/1469787405054237