@damoclarky has commented on yesterday’s Part 2 post, a comment that’s sparked a bit of thinking. I’ve moved my lengthy response into this post, rather than leaving it as a reply to the comment.

What is it? Stable or unstable?

@damoclarky writes

There also appears (at least to me) to be an irony in your blog post. On the one hand, we have technology as unstable, with constant change occurring such as Apple iOS/Phone updates, or 6monthly Moodle releases. Then on the other, we have:

“… commonplace notions of digital technologies that underpin both everyday life and research have a tendency to see them “as relatively stable, discrete, independent, and fixed” (Orlikowski & Iacono, 2001, p. 121).”

Part of the argument I’m working toward is that how people/organisations conceptualise and then act with digital technology doesn’t align with, or leverage, the nature of digital technology. This lack of alignment causes problems and/or lost opportunities. This is related to the argument Orlikowski & Iacono make as they identify 5 different views of technology, illustrate the differences, and argue for the importance of theorising the “IT artifact”.

The “relatively stable, discrete, independent, and fixed” view of technology is one of the views Orlikowski & Iacono describe – the tool view. There are other views, and what I’m working on here is a somewhat different representation. I’m actually arguing against that tool view. The discrepancy between the “relatively stable, discrete, independent, and fixed” view of digital technology and the unstable and protean nature of digital technology is evidence (for me) of the problem I’m trying to identify.

Actually, as I write this and re-read Orlikowski and Iacono, it appears likely that the other aspect of the nature of digital technology described in the Part 2 post – opacity – contributes to the tool view. Orlikowski and Iacono draw on Latour to describe the tool view as seeing technologies as “black boxes”, which aligns with the idea of digital technologies becoming increasingly opaque.

Stable but unstable

For most people the tools they use are black boxes. They can’t change them; they have to live with what those tools can or can’t do. At the same time they face the problem of those tools changing (upgrades of Moodle, Microsoft Office etc.), of the tools being unstable. But even though the tools change, they remain opaque; they remain black boxes that the person has to make do with. They can’t change the tools, they just have to figure out how to get on.

Perceptions of protean

Is it just perception that technology is not protean? There is a power differential at play. Who owns technology? Do you really “own” your iPhone? What about the software on your iPhone? What controls or restrictions exist when you purchase something? What about your organisation’s OSS LMS software? It is very opaque, but who has permissions to change it?

Later in the series the idea of affordances will enter the picture. That will talk a bit more about how the perception of a digital technology as protean (or anything else) depends on the actor and the environment, not just on the nature of the digital technology.

But there’s also the question of whether or not the tool itself is protean. Apple is a good example. Turkle actually identifies the rise of the GUI, and the belief of Jobs and Apple in controlling the entire experience, as major factors in the increasing opacity of digital technology. While reprogrammability is a fundamental property of digital technology, the developers of a digital technology can decide to limit who gets to leverage that property. The developers of digital technology can limit its protean nature.

In turn, the organisational gatekeepers of digital technology can further limit its protean nature. For example, the trend toward standard course sites within university-run LMS, as discussed by Mark Smithers.

But as you and I know, no matter how hard they try they can’t remove it entirely. Witness the long history of shadow systems, workarounds, make-work and kludges (Koopman & Hoffman, 2003) spread throughout the use of digital technologies (and probably beyond). For example, my work on doing something with the concrete lounges in my institution’s LMS. But at this stage we’re starting to enter the area of affordances etc.

The point I’m trying to make is that digital technologies can be protean. At the moment, most of the digital technologies within formal education are not. This is contributing to formal education’s inability to effectively leverage digital technology.

Black boxes, complexity and abstraction

Part of the black box approach to technology is about dealing with complexity. Not in the complexity theory sense, but in the sense of breaking big things into smaller things, thus making them easier to understand. This is a typical human approach to problem solving. If we were to reduce the opacity of technological black boxes, how much complexity could we expect educators to cope with in order to make their own changes?
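To put the black box idea into code terms (a minimal, made-up sketch, not drawn from any particular LMS), the person using finalGrade below only ever deals with the surface function; the weighting model that does the actual work is hidden away, which is precisely what makes it both easy to use and impossible for that person to change:

```typescript
// Hypothetical sketch of a "black box": the caller sees one simple function,
// while the model that actually does the work stays hidden inside.

// The surface: all an educator (as a user) ever sees.
function finalGrade(assignmentMark: number, examMark: number): number {
  return round(weightedTotal(assignmentMark, examMark));
}

// The internals: the model of how marks are combined. Hidden from the user.
function weightedTotal(assignment: number, exam: number): number {
  const ASSIGNMENT_WEIGHT = 0.4; // assumed weighting, purely for the sketch
  const EXAM_WEIGHT = 0.6;
  return assignment * ASSIGNMENT_WEIGHT + exam * EXAM_WEIGHT;
}

function round(mark: number): number {
  return Math.round(mark * 10) / 10; // one decimal place
}

console.log(finalGrade(72, 85)); // 79.8
```

Opening the box means exposing the user to everything below that first function, and the question above is how much of that educators can reasonably be expected to deal with.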

When I read Turkle in more detail for the first time yesterday, this was one of the questions that sprang to mind. Turkle talks about people being able to perceive the bare technology of early personal computers as transparent, but even as she does this she notes

When people say that they used to be able to “see” what was “inside” their first personal computers, it is important to keep in mind that for most of them there still remained many intermediate levels of software between them and the bare machine. But their computer system encouraged them to represent their understanding of the technology as knowledge of what lay beneath the screen surface. They were encouraged to think of understanding as looking beyond the magic of the mechanism (p. 23).

She then goes on to argue that the rise of the GUI – especially in the Macintosh – encouraged people to stay on the surface: to see the menus, windows and icons and interact with those; to understand that clicking this icon, that menu, and selecting this option led to this outcome, without understanding how any of it actually worked.

The problem I’m suggesting here isn’t that people should know the details of the hardware, or the code that implements their digital technology. It’s that they should go beyond the interface to understand the model used by the digital technology.

The example I’ll use in the talk (I think) will be the Moodle assignment activity. I have a feeling (one that could be explored with research) that most teachers (and perhaps learners) are stuck at the interface. They have eventually learned which buttons to push to achieve their task, but they have no idea of the model used by the Moodle assignment activity, because neither the training they receive nor the opaque nature of the activity’s interface helps them understand that model.

How many teaching staff using the Moodle assignment activity could define and explain the connections between availability, submission types, feedback types, submission settings, notifications, and grade? How many could develop an appropriate mental model of how it works?  How many can then successfully translate what they would like to do into how the Moodle assignment activity should be configured to help them achieve those goals?
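To make “the model” a little more concrete, here’s a rough sketch of the kind of settings the assignment activity ties together. This is my own simplified, hypothetical representation – the field names are made up for illustration and this is not Moodle’s actual data structure:

```typescript
// A simplified, hypothetical model of a Moodle-style assignment activity.
// The point is the connections between settings, not the real Moodle schema.

interface AssignmentModel {
  availability: {
    allowSubmissionsFrom: Date;
    dueDate: Date;
    cutOffDate?: Date;           // after this, no submissions at all
  };
  submissionTypes: Array<"file" | "onlineText">;
  submissionSettings: {
    requireClickSubmit: boolean; // drafts vs. a final "submit" step
    attemptsAllowed: number;
  };
  feedbackTypes: Array<"comments" | "annotatedPDF" | "rubric">;
  notifications: {
    notifyMarkersOnSubmission: boolean;
    notifyStudentsOnGrade: boolean;
  };
  grade: { maxGrade: number; scale?: string };
}

// One way the settings interact: is a submission made "now" accepted at all?
function acceptsSubmission(a: AssignmentModel, now: Date): boolean {
  const { allowSubmissionsFrom, cutOffDate } = a.availability;
  if (now.getTime() < allowSubmissionsFrom.getTime()) return false;
  if (cutOffDate && now.getTime() > cutOffDate.getTime()) return false;
  return true; // late but before the cut-off: accepted, flagged as late
}

// Example use with illustrative dates.
const essay: AssignmentModel = {
  availability: {
    allowSubmissionsFrom: new Date("2025-03-01"),
    dueDate: new Date("2025-03-21"),
    cutOffDate: new Date("2025-03-28"),
  },
  submissionTypes: ["file"],
  submissionSettings: { requireClickSubmit: true, attemptsAllowed: 1 },
  feedbackTypes: ["comments", "rubric"],
  notifications: { notifyMarkersOnSubmission: true, notifyStudentsOnGrade: true },
  grade: { maxGrade: 100 },
};
console.log(acceptsSubmission(essay, new Date()));
```

A teacher who holds even this rough a model in their head can reason about why a student “couldn’t submit”, rather than only knowing which buttons they usually push.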

What about the home page for a Moodle course site? How many of the really poorly designed Moodle course home pages are due to the fact that the teachers have been unable to develop an effective mental model of how Moodle works because of the opaque nature of the technology?

How many interactive whiteboards are sitting unused in school classrooms because the teacher doesn’t have a mental model of how they work and thus can’t identify the simple fix required to get them working again?

I imagine that the more computational thinking a teacher/learner is capable of, the more likely it is that they have actively tried to construct the model behind the tool, and subsequently the more able they are to leverage the Moodle assignment activity to fit their needs. The more someone sees a digital technology as protean rather than opaque, the more likely I think they are to actively try to grok the model underpinning it.

This isn’t about delving down into the depths of the abstraction layers. It’s just about trying to see beyond the opaque interface.

Another interesting research project might be to explore whether modifying the interface of a digital technology to make it less opaque – to make the model underpinning the digital technology clearer to the user – would make it easier to use and eventually improve the quality of the task the user wishes to complete. For example, would it improve the quality of learning and teaching with digital technology?

Can you do anything? How?

Without sounding too dramatic (or cynical), without industry-wide changes to how digital technology is viewed, are attempts to address the issues outlined in your blog post futile?

How do you bring about industry-wide change in attitude and thinking?

The funny thing is that significant parts of the digital technology industry are already moving toward ideas related to this. Increasingly, what software developers – especially within organisations – are doing is informed by the nature of digital technologies outlined here. But that hasn’t quite translated into formal education institutions. It is also unclear just how much of this thinking on the part of software developers has informed how they think about what the users of their products can do. In some cases, though, the changes they are making to help themselves leverage the nature of digital technologies are making it more difficult, if not impossible, to prevent their users from making use of it.

For example, both you and I know that the improvements in HTML have made it much easier to engage in screen scraping. The rise of jQuery has also made it much easier to make changes to web pages in tools like Moodle. But at the same time you get moves to limit this (e.g. the TinyMCE editor in Moodle actively looking to hobble JavaScript).
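As a small illustration (a sketch only, assuming a browser console on a hypothetical Moodle course page where jQuery is available, and with CSS class names that are my guesses rather than a guaranteed part of any theme), this is the kind of scraping and tweaking that consistent HTML and jQuery make almost trivial:

```typescript
// Sketch: run in the browser console on a (hypothetical) Moodle course page.
// Assumes jQuery is already loaded and that activities are marked up as
// li.activity with their name inside a .instancename element.
declare const $: any; // jQuery, assumed to be provided by the page

// "Screen scrape" the names of all activities on the course home page.
const activityNames: string[] = [];
$("li.activity .instancename").each((_: number, el: HTMLElement) => {
  activityNames.push($(el).text().trim());
});
console.log(`Found ${activityNames.length} activities:`, activityNames);

// Tweak the page: visually flag any activity whose name mentions "assignment".
$("li.activity").each((_: number, el: HTMLElement) => {
  const name = $(el).find(".instancename").text();
  if (/assignment/i.test(name)) {
    $(el).css("border-left", "4px solid orange"); // simple visual cue
  }
});
```

In effect, the page’s own structure becomes an API, whether or not the vendor intended it. That is exactly the sort of thing the kludges and workarounds mentioned earlier rely on.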

This is something that will get picked up more in later posts in this series.

So it is going to happen. It’s not going to be easy, but I do think it’s going to get easier.

References

Koopman, P., & Hoffman, R. (2003). Work-arounds, make-work and kludges. IEEE Intelligent Systems, 18(6), 70-75.