Assembling the heterogeneous elements for (digital) learning

Month: June 2016

Any pointers to an old, ancient game?

Way back in 1986 I started studying undergraduate computer science at the University of Queensland. One of our first year programming assignments was to use the fancy, new Macintosh computers to add some code to a game.  I’m looking for pointers to the name of the game and any online resources about it. A working version on some contemporary platform would be great.

Any help more than welcome.

The game

The game was played on a grid, typically a 4 by 4 grid that looked something like this.

Grid 001

The idea is that there were random mirrors hidden throughout the grid. The aim of the game was to figure out what type of mirrors were located where within the grid. To do this you had a flashlight that you could shine through one of the holes on the outside. The light would exit the grid at another location, depending on the location and type of mirrors it encountered. A bit like this.
grid 002

There were three types of mirrors: two diagonal mirrors (/ and \) and an X mirror. The diagonal mirrors would change the direction of the light depending on how the light struck the mirror. The X mirror would direct the light back the way it came.

The following image shows one potential layout of mirrors to explain how the light behaved in the above image.

Grid 003

The light travels straight ahead until it hits the first diagonal mirror. This mirror causes the light to change direction and travel directly up, where it immediately hits another diagonal mirror which sends the light travelling right again until it exits the grid.
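From memory, the core of the game is easy to sketch in code. The following Python is my own reconstruction, not the original assignment's code; the grid representation, coordinates and function names are all invented for illustration:

```python
# Directions as (row, col) deltas
DIRS = {"right": (0, 1), "left": (0, -1), "up": (-1, 0), "down": (1, 0)}

# How each mirror type redirects an incoming beam
SLASH = {"right": "up", "up": "right", "left": "down", "down": "left"}      # the / mirror
BACKSLASH = {"right": "down", "down": "right", "up": "left", "left": "up"}  # the \ mirror
OPPOSITE = {"right": "left", "left": "right", "up": "down", "down": "up"}   # the X mirror

def shine(grid, row, col, direction):
    """Trace a beam from a starting cell until it leaves the grid.

    Returns the (row, col) just outside the grid and the exit direction.
    """
    while 0 <= row < len(grid) and 0 <= col < len(grid[0]):
        cell = grid[row][col]
        if cell == "/":
            direction = SLASH[direction]
        elif cell == "\\":
            direction = BACKSLASH[direction]
        elif cell == "X":
            direction = OPPOSITE[direction]   # straight back the way it came
        dr, dc = DIRS[direction]
        row, col = row + dr, col + dc
    return row, col, direction

# A 4x4 grid matching the behaviour described above: the beam enters
# travelling right, hits a / mirror, bounces up into a second / mirror,
# and exits travelling right again.
grid = [
    [".", ".", ".", "."],
    [".", "/", ".", "."],
    [".", "/", ".", "."],
    [".", ".", ".", "."],
]
print(shine(grid, 2, 0, "right"))  # (1, 4, 'right')
```

The guessing part of the game is then just comparing the player's predicted exit point against `shine`'s answer for a hidden grid.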

Nature of digital technology? Part 2 – expansion

@damoclarky has commented on yesterday’s Part 2 post. A comment that’s sparked a bit of thinking. I’ve moved my lengthy response into this post, rather than leaving it as a reply to the comment.

What is it? Stable or unstable?

@damoclarky writes

There also appears (at least to me) to be an irony in your blog post. On the one hand, we have technology as unstable, with constant change occurring such as Apple iOS/Phone updates, or 6monthly Moodle releases. Then on the other, we have:

“… commonplace notions of digital technologies that underpin both everyday life and research have a tendency to see them “as relatively stable, discrete, independent, and fixed” (Orlikowski & Iacono, 2001, p. 121).”

Part of the argument I’m working toward is that how people/organisations conceptualise and then act with digital technology doesn’t align with or leverage the nature of digital technology. This lack of alignment causes problems and/or lost opportunities. This is related to the argument that Orlikowski & Iacono make as they identify 5 different views of technology, illustrate the differences and argue for the importance of theorising the “IT artifact”.

The “relatively stable, discrete, independent, and fixed” view of technology is one of the views Orlikowski & Iacono describe – the tool view. There are other views and what I’m working on here is a somewhat different representation. I’m actually arguing against that tool view. The discrepancy between the “relatively stable, discrete, independent, and fixed” view of digital technology and the unstable and protean nature of digital technology is evidence (for me) of the problem I’m trying to identify.

Actually, as I’m writing this and re-reading Orlikowski and Iacono it appears likely that the other nature of digital technology described in the part 2 post – opaque – contributes to the tool view. Orlikowski and Iacono draw on Latour to describe the tool view as seeing technologies as “black boxes”. Which aligns with the idea of digital technologies as being increasingly opaque.

Stable but unstable

For most people the tools they use are black boxes. They can’t change them. They have to live with what those tools can or can’t do. But at the same time they face the problem of those tools changing (upgrades of Moodle, Microsoft Office etc.), of the tools being unstable. But even though the tools change, they remain opaque, they remain black boxes. Black boxes that the person has to make do with; they can’t change them, they just have to figure out how to get by.

Perceptions of protean

Is it just perception that technology is not protean? There is a power differential at play. Who owns technology? Do you really “own” your iPhone? What about the software on your iPhone? What controls or restrictions exist when you purchase something? What about your organisation’s OSS LMS software? It is very opaque, but who has permission to change it?

Later in the series the idea of affordances will enter the picture. This will talk a bit more about how the perception of a digital technology being protean (or anything else) or not does indeed depend on the actor and the environment, not just the nature of the digital technology.

But there’s also the question of whether or not the tool itself is protean. Apple is a good example. Turkle actually talks about the rise of the GUI and the belief of Jobs at Apple in controlling the entire experience as major factors in the increasing opacity of digital technology. While reprogrammability is a fundamental property of digital technology, the developers of digital technology can decide to limit who can leverage that property. The developers of digital technology can limit the protean nature of digital technology.

In turn the organisational gatekeepers of digital technology can further limit the protean nature of digital technology. For example, the trend toward standard course sites within university-run LMS as talked about by Mark Smithers.

But as you and I know, no matter how hard they try they can’t remove it entirely. The long history of shadow systems, workarounds, make-work and kludges (Koopman & Hoffman, 2003) is spread throughout the use of digital technologies (and probably beyond). For example, my work at doing something with the concrete lounges in my institution’s LMS. But at this stage we’re starting to enter the area of affordances etc.

The point I’m trying to make is that digital technologies can be protean. At the moment, most of the digital technologies within formal education are not. This is contributing to formal education’s inability to effectively leverage digital technology.

Blackboxes, complexity and abstraction

Part of the black box approach to technology is to deal with complexity. Not in terms of complexity theory, but in terms of breaking big things into smaller things, thus making them easier to understand. This is a typical human approach to problem solving. If we were to alter the opacity of technological black boxes, how much complexity can we expect educators to cope with in then being able to leverage their own changes?

When I read Turkle in more detail for the first time yesterday, this was one of the questions that sprang to mind. Turkle talks about people once being able to perceive the bare technology as transparent, but even as she does this she mentions

When people say that they used to be able to “see” what was “inside” their first personal computers, it is important to keep in mind that for most of them there still remained many intermediate levels of software between them and the bare machine. But their computer system encouraged them to represent their understanding of the technology as knowledge of what lay beneath the screen surface. They were encouraged to think of understanding as looking beyond the magic of the mechanism (p. 23).

She then goes on to argue how the rise of the GUI – especially in the Macintosh – encouraged people to stay on the surface. To see the menus, windows and icons and interact with those. To understand that clicking this icon, that menu, and selecting this option led to this outcome, without understanding how it actually worked.

The problem I’m suggesting here isn’t that people should know the details of the hardware, or the code that implements their digital technology. But that they should go beyond the interface to understand the model used by the digital technology.

The example I’ll use in the talk (I think) will be the Moodle assignment activity. I have a feeling (which could be explored with research) that most teachers (and perhaps learners) are stuck at the interface. They have eventually learned which buttons to push to achieve their task. But they have no idea of the model used by the Moodle assignment activity because the training they receive and the opaque nature of the interface to the Moodle assignment activity doesn’t help them understand the model.

How many teaching staff using the Moodle assignment activity could define and explain the connections between availability, submission types, feedback types, submission settings, notifications, and grade? How many could develop an appropriate mental model of how it works?  How many can then successfully translate what they would like to do into how the Moodle assignment activity should be configured to help them achieve those goals?

What about the home page for a Moodle course site? How much of the really poor design of Moodle course home pages is due to teachers being unable to develop an effective mental model of how Moodle works because of the opaque nature of the technology?

How many interactive white boards are sitting unused in school classrooms because the teacher doesn’t have a mental model of how it works and thus can’t identify the simple fix required to get it working again?

I imagine that the more computational thinking a teacher/learner is capable of, the more likely it is that they have actively tried to construct the model behind that tool, and subsequently the more able they are to leverage the Moodle assignment activity to fit their needs.  The more someone sees a digital technology as not opaque and as protean, the more likely I think that they will actively try to grok the model underpinning the digital technology.

This isn’t about delving down in the depths of the abstraction layer. It’s just trying to see beyond the opaque interface.

Another interesting research project might be to explore if modifying the interface of a digital technology to make it less opaque – to make the model underpinning the digital technology clearer to the user – would make it easier to use and eventually improve the quality of the task they wish to complete?  e.g. would it improve the quality of learning and teaching with digital technology?

Can you do anything? How?

Without sounding too dramatic (or cynical), without industry-wide changes to how digital technology is viewed, are attempts to address the issues outlined in your blog post futile?

How do you bring about industry-wide change in attitude and thinking?

The funny thing is that significant parts of the digital technology industry are already moving toward ideas related to this. Increasingly, what software developers – especially within organisations – are doing is informed by the nature of digital technologies outlined here. But that hasn’t quite translated into formal education institutions. It is also unclear just how much of this thinking on the part of software developers has informed how they think about what the users of their products can do. But in some cases, the changes they are making to help them leverage the nature of digital technologies are making it more difficult, if not impossible, to prevent their users from making use of it.

For example, both you and I know that the improvements in HTML have made it much easier to engage in screen scraping. The rise of jQuery has also made it much easier to make changes to web pages in tools like Moodle. But at the same time you get moves to limit this (e.g. the TinyMCE editor on Moodle actively looking to hobble javascript).
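As a toy illustration of the screen-scraping point, here is a sketch using only Python’s standard library. The HTML fragment and the link path are invented for the example, not taken from a real Moodle page:

```python
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Collect the href of every anchor tag found in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# An invented fragment standing in for part of a course page
page = '<ul><li><a href="/mod/assign/view.php?id=1">Assignment 1</a></li></ul>'
scraper = LinkScraper()
scraper.feed(page)
print(scraper.links)  # ['/mod/assign/view.php?id=1']
```

The cleaner and more consistent the HTML, the less brittle this kind of scraping is, which is the sense in which improvements in HTML have made it easier.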

This is something that will get picked up more in later posts in this series.

So it’s not going to happen quickly, and it’s not going to be easy, but I do think it’s going to get easier.


Koopman, P., & Hoffman, R. (2003). Work-arounds, make-work and kludges. Intelligent Systems, IEEE, 18(6), 70-75.

The nature of digital technology? Part 2

This is a followup to yesterday’s Part 1 post and a continuation of an attempt to describe the nature of digital technology, and to think about what this might reveal about how and what is being done by formal education as it attempts to use digital technology for learning and teaching. This post moves from the fundamental properties of digital technologies (yesterday’s focus) to what some suggest is the nature of digital technologies.

Note: this is not the end of this series. There’s a fair bit more to go (e.g. this is all still focused on a single black box/digital technology, it hasn’t touched on what happens when digital technology becomes pervasive). I’m not entirely comfortable with the use of “nature” at this level, but the authors I’m drawing on use that phrase.

Recap and revision

Yesterday’s post aimed to open up the black box of digital technology a touch by explaining the two fundamental properties (data homogenization and reprogrammability) of digital technology proposed by Yoo, Boland, Lyytinen, and Majchrzak (2012). This was originally represented using this image.

Fundamental Properties

I don’t think the image makes the point that these are fundamental properties of the black box, the digital technology. Hence, the following revised image. The idea being that data homogenization and reprogrammability are properties that are “baked into” digital technology. Identifying these properties has opened up the black box a little. This is going to be useful as I attempt to develop the model of digital technology further.
Fundamental Properties embedded

Nature of digital technologies

The aim here is to move up a bit from the fundamental properties to look at the “nature” of digital technologies.  As mentioned above, I’m not entirely happy with the use of the phrase “nature” at this level, but I don’t have a better term at the moment, and I’m drawing on Koehler and Mishra (2009) here who argued (emphasis added)

By their very nature, newer digital technologies, which are protean, unstable, and opaque, present new challenges to teachers who are struggling to use more technology in their teaching. (p. 61)

As they argue the combination of protean, unstable, and opaque makes the use of digital technology by teachers (and others) difficult. The following seeks to expand and explore that a bit more.

The following representation (I’m not a designer by any stretch of the imagination) is attempting to illustrate that this “nature” of digital technology sits above (or perhaps builds upon, or becomes possible due to) the fundamental properties introduced in the last post.

Nature of Digital Technology


Unstable

In this context, Koehler and Mishra (2009) define unstable as “rapidly changing” (p. 61). Which version of the iPhone (insert your preference) do you have? The combination of data homogenization and reprogrammability means that digital technologies can be changed, and other external factors tend to make sure that they do. Commercial pressures mean that consumer digital technologies keep changing. Other digital technologies change to improve their functionality.

But beyond that is the argument that digital technology shows exponential growth. Bigum (2012) writes

To most, the notion of an exponential is something that belongs in a mathematic’s classroom or perhaps may somehow be related to home loan repayments. Exponential change is not something with which we have had to become familiar, despite the fact of Moore’s Law and other Laws that map the growth of various digital technologies and which tell us that the price of various digital technologies is halving roughly every 18 months to 2 years and that their performance is doubling on about the same time scale….The fact is that the various digital technologies that end up in laptop computers, mobile phones, and an increasing number of things that we tend not to associate with computers, are still doubling their performance and halving their cost in fixed time periods, i.e. we are seeing exponential growth. (p. 32-33)
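The scale of those fixed-period doublings is easy to underestimate. A quick back-of-the-envelope calculation, using the 18-month and 2-year figures from the quote above over a 40-year span:

```python
# Performance multiplier implied by doubling every 18 or 24 months over 40 years
months = 40 * 12
for period in (18, 24):
    doublings = months / period
    # 2 raised to the number of doublings in the span
    print(f"doubling every {period} months over 40 years: x{2 ** doublings:,.0f}")
```

Even the slower 24-month cycle gives a million-fold improvement over 40 years, which is the sense in which "tweaking around the edges" struggles to keep up.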


Opaque

Koehler and Mishra (2009) draw on Turkle (1995) to define opaque as “the inner workings are hidden from users”. Turkle (1995) talks about people having “become accustomed to opaque technology”. Meaning that as the power of digital technologies has increased, we no longer see the inner workings of the technology. She suggests that computers of the 1970s “presented themselves as open, ‘transparent’, potentially reducible to the underlying mechanisms”. Perhaps more importantly she argues that

their computer systems encouraged them to represent their understanding of the technology as knowledge of what lay beneath the screen surface. They were encouraged to think of understanding as looking beyond the magic to the mechanism. (p. 23)

Earlier this year, as part of an introductory activity, I asked students to find and share an image (or other form of multimedia) that captured how they felt about digital technologies. The following captures just some of the images shared, and also captures a fairly widespread consensus of how these pre-service educators felt about digital technology. I’m guessing that it resonates with quite a few people.
Perceptions of computers
The increasingly opaque nature of digital technology, combined with our increasing reliance on digital technologies in most parts of our everyday life, would seem to have something to do with this sense of frustration. Ben-Ari and Yeshno (2006) found that people with appropriate conceptual models of digital technologies were better able to analyse and solve problems, while learners without appropriate conceptual models were limited to aimless trial and error. I suggest that it is this aimless trial and error, due to an inappropriate conceptual model of how a digital technology works, that creates the feelings of frustration illustrated by the above image.


Protean

This is the characteristic that I’ve written the most about. The following two paragraphs are from the first version of Jones and Schneider (2016).

The commonplace notions of digital technologies that underpin both everyday life and research have a tendency to see them “as relatively stable, discrete, independent, and fixed” (Orlikowski & Iacono, 2001, p. 121). Digital technologies are seen as hard technologies, technologies where what can be done is fixed in advance either by embedding it in the technology or “in inflexible human processes, rules and procedures needed for the technology’s operation” (Dron, 2013, p. 35). As noted by Selwyn and Bulfin (2015) “Schools are highly regulated sites of digital technology use” (p. 1) where digital technologies are often seen as a tool that is: used when and where permitted; standardised and preconfigured; conforms to institutional rather than individual needs; and, a directed activity. Rushkoff (2010) argues that one of the problems with this established view of digital technologies is that “instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery” (p. 15). This hard view of digital technologies perhaps also contributes to the problem identified by Selwyn (2016) where in spite of the efficiency and flexibility rhetorics surrounding digital technologies, “few of these technologies practices serve to advantage the people who are actually doing the work” (p. 5). Digital technologies have not always been perceived as hard technologies.

Seymour Papert in his book Mindstorms (Papert, 1993) describes the computer as “the Proteus of machines” (p. xxi) since the essence of a computer is its “universality, its power to simulate. Because it can take on a thousand forms and can serve a thousand functions, it can appeal to a thousand tastes” (p. xxi). This is a view echoed by Alan Kay (1984) and his discussion of the “protean nature of the computer” (p. 59) as “the first metamedium, and as such has degrees of freedom and expression never before encountered” (p. 59). In describing the design of the first personal computer, Kay and Goldberg (1977) address the challenge of producing a computer that is useful for everyone. Given the huge diversity of potential users they conclude “any attempt to specifically anticipate their needs in the design of the Dynabook would end in a disastrous feature-laden hodgepodge which would not be really suitable for anyone” (Kay & Goldberg, 1977, p. 40). To address this problem they aimed to provide a foundation technology and sufficient general tools to allow “ordinary users to casually and easily describe their desires for a specific tool” (Kay & Goldberg, 1977, p. 41). They aim to create a digital environment that opens up the ability to create computational tools to every user, including children. For Kay (1984) it is a must that people using digital technologies should be able to tailor those technologies to suit their wants, since “Anything less would be as absurd as requiring essays to be formed out of paragraphs that have already been written” (p. 57). For Richard Stallman (2014) the question is more fundamental, “To make computing democratic, the users must control the software that does their computing!” (n.p.).

Implications for formal education

The above – at least for me – opens up a range of questions about how formal education uses digital technology for learning and teaching. A small and rough list follows.

Unstable changes everything

If digital technologies are fundamentally different and if they are unstable (rapidly – even exponentially – changing) then everything will change. Bigum (2012) writes

Taken together and without attempting to anticipate how any of these technologies will play out, it is nevertheless patently clear that doing school the way school has always been done or tweaking it around the edges will not prepare young people who will grow up in this world (p. 34)

Bigum (2012) then draws on this from Lincoln

The dogmas of the quiet past, are inadequate to the stormy present. The occasion is piled high with difficulty, and we must rise – with the occasion. As our case is new, so we must think anew, and act anew. We must disenthrall ourselves, and then we shall save our country

The increasing neo-liberal/corporatisation fetish for efficiency etc. within formal education appears to be placing an emphasis on refining what we already do. Dropping the dogmas of the quiet past would mean admitting that people had it wrong….etc. It’s difficult to see how such change will happen.

Moving beyond recipe followers?

Since digital technology is increasingly opaque, it is increasingly difficult for people to develop conceptual models of how digital technology works. As a result, many people have developed recipes that they follow when using digital technology. i.e. they know that if they press this button, select that menu, and check that box this will happen. They don’t know why, they just know the recipe.

Increasingly, a lot of the training and documentation provided to help people use digital technologies consists of recipes. They are step-by-step examples (with added screenshots) of the recipe to follow to achieve a specific goal. If people don’t have the recipe, or the recipe doesn’t work, then they are stuck. They don’t have the conceptual models necessary to analyse and solve problems.

What can be done to digital technologies and the methods used to support them to help people develop better conceptual models? If you do that, does that improve the quality of learning and teaching with digital technology?

If your documentation and training is a collection of recipes, why aren’t you automating those recipes and building them into the technology? i.e. making use of the protean nature of digital technology?

What or whom drives the change? What is the impact?

My institution has adopted Moodle, an open source LMS. One of the benefits of open source is that it is meant to be more protean. It can change. The Moodle release calendar shows the aim of releasing a major upgrade of Moodle every six months. It appears that my institution aims to keep reasonably up to date with that cycle. This means that every 6 months a change process kicks in to make staff and students aware that a change is coming. It means that every 6 months or so it is possible that staff and students will find changes in how the system works. Changes they didn’t see the need for.

To make matters worse, since most people are recipe followers, even the most minor of changes cause confusion and frustration. Emotions that make people question why this change has been inflicted upon them. An outcome not likely to enhance acceptance and equanimity.

Perhaps if more of the changes being made responded to the experiences and needs of those involved, change might be more widely accepted. The problem is that because most institutional digital technologies aren’t that protean, changes can only be made by a small number of specific people who are in turn constrained by a hierarchical governance process. A situation that might lead to a problem of starvation, where priority is given to large-scale, institutional-level changes rather than changes beneficial to small numbers of specific situations.

Would mapping who makes changes to institutional digital technologies, and why, reveal this starvation? How can institutional digital technologies be made more protean and more able to respond to the needs of individuals? What impact would that have on learning and teaching? Is this sort of change necessary to respond to exponential growth?

Opaque technology creates consumers, not producers

Kafai et al (2014) talk about the trend within schools of transforming “computer class” into the study of how to use applications such as word processors and spreadsheets. Approaches which they argue

These technology classes promote an understanding of computers and software as black boxes where the inner workings are hidden to users. (p 536)

In contrast they argue that

working with e-textiles gives students the opportunity to grapple with the messiness of technology; taking things apart, putting them back together, and experimenting with the purposes and functions of technology make computers accessible to students

Which importantly has the effect of

by engaging learners in designing e-textiles, educators can encourage student agency in problem solving and designing with technologies. This work can disrupt the trend that puts students on the sidelines as consumers rather than producers of technology

Currently, most digital learning environments within formal education tend to lean towards being opaque and not protean. Does this contribute toward a cadre of learners and teachers that see themselves as consumers (victims?) of digital technologies for learning and teaching? Would the provision of a digital learning environment that is transparent and protean help encourage learner and teacher agency? Would this transform their role from consumer to producer? Would this improve the use of digital technology for learning and teaching within formal education?


Ben-Ari, M., & Yeshno, T. (2006). Conceptual Models of Software Artifacts. Interacting with Computers, 18(6), 1336–1350. doi:10.1016/j.intcom.2006.03.005

Kafai, Y. B., Fields, D. A., & Searle, K. A. (2014). Electronic Textiles as Disruptive Designs: Supporting and Challenging Maker Activities in Schools. Harvard Educational Review, 84(4), 532–556,563–565. doi:10.17763/haer.84.4.46m7372370214783

Koehler, M., & Mishra, P. (2009). What is Technological Pedagogical Content Knowledge (TPACK)? Contemporary Issues in Technology and Teacher Education, 9(1), 60–70.

Yoo, Y., Boland, R. J., Lyytinen, K., & Majchrzak, A. (2012). Organizing for Innovation in the Digitized World. Organization Science, 23(5), 1398–1408.

What is the nature of digital technology? Part 1

Formal education in most of its forms is still struggling to effectively harness digital technology to enhance and transform learning and teaching, even with a history of 40+ years of various attempts. The reasons for this are numerous and diverse. The following is an attempt to look at one of the reasons. A reason which, at least to me, seems to have been somewhat ignored.

The technology. Does digital technology have a unique nature/set of capabilities/affordances that sets it apart from other types of technology? If so, what is it? What might understanding the nature of digital technology have to say about how formal education is attempting to use it to transform learning and teaching?

The following is a first attempt to frame some thinking that is moving towards a presentation I’ll be giving in a couple of weeks.  This is only the first step, there’ll be follow up posts over the coming week or two. These posts will aim to develop my own understanding of a model that aims to capture the nature of pervasive digital technology. It’s a model that will draw largely on the work of Yoo, Boland, Lyytinen, and Majchrzak (2012) combined with a few others (e.g. Papert, 1980; Kay, 1984; Mishra & Koehler, 2006). That model will then be used to look at current attempts within formal education to use digital technology for learning and teaching.

Views of Digital Technology

For most people digital technology is a black box. Regardless of what type of digital technology, it’s a black box.

DT black box
Orlikowski and Iacono (2001) label this the tool view of technology which

represents the common, received wisdom about what technology is and means. Technology, from this view is the engineered artifact, expected to do what its designers intend it to do. (p. 123)

They go on to cite work by Kling and Latour to describe this view and its limitations before going on to examine 4 other views of the IT artifact. The motivation for their work is that “The IT artifact itself tends to disappear from view, be taken for granted, or is presumed to be unproblematic once it is built and installed” (Orlikowski & Iacono, 2001, p. 121). They proceed to describe 4 additional “broad metacategories” of the IT artifact, “each representing a common set of assumptions about and treatments of information technology in IS research” (Orlikowski & Iacono, 2001, p. 123). Metacategories or views of technology that draw on a range of perspectives outside of their discipline, such as Actor-Network Theory.

My attempt here at opening up the black box of digital technology perhaps best fits with Orlikowski & Iacono’s (2001) fourth view of technology – the computational view – where the interest is “primarily in the capabilities of the technology to represent, manipulate, store, retrieve, and transmit information, thereby supporting, processing, modeling, or simulating aspects of the world” (Orlikowski & Iacono, 2001, p. 127). My focus here is on trying to explore what is the unique nature of digital technology. Not as an end in itself, but as a starting point that will draw on (at least) the other four views of technology suggested by Orlikowski & Iacono (2001) in attempting to understand and improve the use of digital technology within formal education.

Fundamental properties of digital technology

Yoo, Boland, Lyytinen, and Majchrzak (2012) argue that the “fundamental properties of digital technology are reprogrammability and data homogenization” (p. 1398)
Fundamental Properties

Data homogenization

Whether a digital technology is allowing you to talk to friends via Skype (or smartphone or…); capture images of snow monkeys; listen to Charlie Parker; measure the temperature; analyse the social interactions in a discussion forum; or put your students to sleep as you read from your PowerPoint slides (which they’re viewing via some lecture capture system), all of the data is represented as a combination of 0s and 1s. All the data is digital. Since all digital technologies deal with 0s and 1s, in theory at least, all digital technologies can handle all data. The content has been separated from the medium (Yoo, Henfridsson & Lyytinen, 2010).

Analog technologies, on the other hand, have a tight coupling between content and medium. If you had bought “Born in the USA” on a record, to play it on your Walkman you had to record it onto a cassette tape. Adding it as background to that video you recorded with your video camera involves another translation of the content from one medium to another.

Data homogenization is the primary reason why you – as per the standard meme – can now carry all of the following in your pocket.




It’s not just the content that is represented digitally with digital technology. Digital technology also stores digitally the instructions that tell it how and what to do. Digital technologies have a processing unit that will decode these digital instructions and perform the tasks they specify. More importantly, those instructions can – in the right situations – be changed. A digital technology is reprogrammable. What a digital technology offers to the user does not need to be limited by its current function.
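A toy sketch of that idea (all names here are mine, purely illustrative): the “device” below simply executes whatever instructions it currently stores, and those instructions can be swapped without touching the hardware.

```python
# A minimal sketch of reprogrammability: the stored instructions are just
# data, so they can be replaced and the same device does something new.
class Device:
    def __init__(self, program):
        self.program = program            # stored instructions, as data

    def run(self, x):
        return self.program(x)            # the "processor" executes them

calculator = Device(lambda x: x * 2)
print(calculator.run(21))                 # 42

calculator.program = lambda x: x ** 2     # reprogram the same device
print(calculator.run(21))                 # 441
```

The same physical artifact now does something new, which is precisely what an analog technology cannot do.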

Questions for formal education?

The above is but the first step in building a layered model for the nature of digital technology. The intent is that each layer should include a couple of questions related to how formal education is using digital technology. The following are a rough and fairly weak initial set. Really just thinking out loud.

Where is the convergence?

If data homogenisation is a fundamental property of digital technology, then why isn’t there more convergence within formal education’s digital technologies? Why is the information necessary for learning and teaching kept siloed in different systems?

When I’m answering a student question in the LMS, why do I need to spend 20 minutes heading out into the horrendous Peoplesoft web-interface to find out in which state of Australia the student is based?

Should we buy? Should we build?

I wonder if there is a large educational institution anywhere in the world that hasn’t at some stage, somewhere within the organisation, had the discussion about whether they should buy OR build their digital technology? I wonder if there’s a large educational institution anywhere in the world that hasn’t felt it appropriate to lean heavily toward the buy (and NOT build) solution?

What is gained and/or lost by ignoring a fundamental property of digital technology?


Orlikowski, W., & Iacono, C. S. (2001). Research commentary: Desperately seeking the “IT” in IT research - A call to theorizing the IT artifact. Information Systems Research, 12(2), 121–134.

Yoo, Y., Henfridsson, O., & Lyytinen, K. (2010). The new organizing logic of digital innovation: An agenda for information systems research. Information Systems Research, 21(4), 724–735. doi:10.1287/isre.1100.0322

Yoo, Y., Boland, R. J., Lyytinen, K., & Majchrzak, A. (2012). Organizing for Innovation in the Digitized World. Organization Science, 23(5), 1398–1408.

Learn to code for data analysis – step 1

An attempt to start another MOOC.  Learn to code for data analysis from FutureLearn/OUUK.  Interested in this one to perhaps start the migration from Perl to Python as my main vehicle for data munging; and also to check out the use of Jupyter notebooks as a learning environment.


  • The approach – not unexpectedly – resonates. Very much like the approach I use in my courses, but done much better.
  • The Jupyter notebooks work well for learning, and could be useful in other contexts.  Good example of the move toward a platform.
  • The bit of Python I’ve seen so far looks good. The question is whether or not I have the time to come up to speed.

Getting started

Intro video from a BBC journalist and now the software.  Following a sequential approach, pared down interface, quite different from the standard, institutional Moodle interface. It does have a very visible and simple “Mark as complete” interface for the information.  Similar to, but perhaps better than the Moodle book approach from EDC3100.

Option to install the software locally (using Anaconda) or use the cloud (SageMathCloud).  Longer term, local installation would suit me better, but interested in the cloud approach.  The instructions are not part of the course; they seem to be generic instructions used by the OUUK.


Intro using a video, which on my connection was a bit laggy. SageMathCloud allows connection with existing accounts, up and going.  Lots of warnings about this being a free service with degraded performance, and the start up process for the project is illustrating that nicely.  Offline might be the better option. Looks like the video is set up for the course.

The test notebook loads and runs. That’s nice.  Like I expected, will be interesting to see how it works in “anger”.

Python 3 is the go for this course, apparently.


Worried a little about installing another version of python.  Hoping it won’t trash what I have installed, looks like it might not.  Looks like the download is going to take a long time – 30 min+.  Go the NBN!

Course design

Two notebooks a week: exercise and project.  Encouraged to extend project. Exercises based on data from WHO, World Bank etc.  Quizzes to check knowledge and use of glossaries.  Comments/discussions on each page.  Again embedded in the interface, unlike Moodle.  Discussion threads expand into RHS of page.

Course content

Week 1

Start with a question – point about data analysis illustrated with a personal story. Has prompts to expand and share related to that story.  Encouraging connections.

Ahh, now the challenge of how to segue into first steps in programming and supporting the wide array of prior knowledge there must be. Variables and assignment, and a bit of Jupyter syntax.  Wonder how the addition of Jupyter impacts cognitive load?

Variable naming and also starting to talk about syntax, errors etc. camelCase is the go apparently.
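For the record, a hedged sketch of the style being taught (the variable names and numbers below are my own inventions, in the spirit of the course’s WHO-style data):

```python
# camelCase variable names, as the course apparently recommends.
# The numbers are invented for illustration.
deathsInPortugal = 140
populationInPortugal = 10_000_000

deathsPerMillion = deathsInPortugal / populationInPortugal * 1_000_000
print(deathsPerMillion)   # 14.0
```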

And now for some coding. Mmm, the video is using Anaconda.  Could see that causing some problems for some learners, and the discussion seems to illustrate aspects of that.  Seems installing Anaconda was more of a problem. Hence the advantages of a cloud service, if it is available.

Mmm, notebooks consist of cells. These can be edited and run. Useful possibilities.

Expressions.  Again Jupyter adds its own little behavioural wrinkle that could prove interesting.  If the last line in a cell is an expression, its value will be output.  Can see that being a practice people try when writing stand-alone Python code.
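A quick sketch of the wrinkle: in a notebook cell a bare expression on the last line is displayed automatically, but in a stand-alone script the same expression is evaluated and silently thrown away.

```python
total = 3 + 4

total          # in a Jupyter cell this would display 7; in a script, nothing
print(total)   # works in both: prints 7
```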

Functions. Using established functions.
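By “established functions” I take it they mean Python’s built-ins and the like; a minimal example (the data is invented):

```python
# Built-in functions need no import.
values = [3, 41, 12, 9, 74, 15]
print(len(values))                          # 6
print(min(values), max(values))             # 3 74
print(round(sum(values) / len(values), 2))  # mean: 25.67
```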

Onto a quiz.  Comments on given answers include an avatar of the teaching staff.

Values and units.  With some discussion to connect to real examples.

Pandas. The transition to working with large amounts of data. And another quiz, connected to the notebook.  That’s a nice connection.  Works well.

Range of pages and exercises looking at the pandas module.  Some nice stuff here.
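A rough sketch of the flavour of those pandas exercises (the table below is invented, in the style of the WHO tuberculosis data the course uses):

```python
import pandas as pd

# Invented WHO-style data for illustration.
df = pd.DataFrame({
    "Country": ["Portugal", "Brazil", "Angola"],
    "TB deaths": [140, 4400, 6900],
})

print(df["TB deaths"].sum())       # total deaths across the table
print(df[df["TB deaths"] > 1000])  # only the rows with the larger counts
```

Whole-column operations like these are what make the data frame such a useful abstraction for this kind of analysis.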

Do I bother with the practice project?  Not now.  But nice to see the notebooks can be exported.

Week 2 – Cleaning up our act

The BBC journalist giving an intro and doing an interview. Nodding head and all.

Ahh weather data.  Becoming part of the lefty conspiracy that is climate change?  🙂

Comparison operators, with the addition of data frames.  Which appears to be a very useful abstraction.

Bitwise operators. I’ve always called these logical or boolean operators.  Boolean isn’t given a lot of intro yet.
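The reason for the “bitwise” label, as I understand it: on pandas columns it’s `&` and `|` (Python’s bitwise operators) that combine boolean Series element by element, since `and`/`or` don’t work there. A sketch with invented data:

```python
import pandas as pd

# Invented data; the point is combining two boolean Series with &.
df = pd.DataFrame({"year": [2010, 2011, 2012, 2013],
                   "deaths": [120, 90, 150, 80]})

mask = (df["year"] >= 2011) & (df["deaths"] > 100)
print(mask.tolist())   # [False, False, True, False]
print(df[mask])        # only the 2012 row satisfies both conditions
```

Note the parentheses around each comparison: `&` binds more tightly than `>=` and `>`, so leaving them out is a common trap.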

Ahh, the first bit of “don’t worry about the syntax, just use it as a template” advice. Looks like it’s using the equivalent of a hash that hasn’t yet been covered.


