Assembling the heterogeneous elements for (digital) learning

Initial thoughts from CogEdge accreditation course

As I’ve mentioned before, Myers-Briggs puts me into the INTP box, a Keirsey Architect-Rational, which, amongst many other things, means I have an interest in figuring out the structure of things.

As part of that interest in “figuring out the structure” I spent three days last week in Canberra at a Cognitive Edge accreditation course. Primarily run by Dave Snowden (you know a man with his own Wikipedia page must be important), who, along with others, has significant criticisms of the Myers-Briggs stuff, the course aims to bring people up to speed with Cognitive Edge’s approach, methods and tools for management and the social sciences.

Since this paper in 2000, like many software people who found a resonance with agile software development, I’ve been struggling to incorporate ideas connected to complex adaptive systems into my practice. Through that interest I’ve been reading Dave’s blog and publications and listening to his presentations for some time. When the opportunity to attend one of his courses arose, I jumped at the chance.

This post serves two main roles:

  1. The trip report I need to generate to explain my absence from CQU for a week.
  2. Forcing me to write down some immediate thoughts about how it might be applied at CQU before I forget.

Over the coming weeks on this blog I will attempt to engage with, reflect on, and integrate into my context the huge amount of information that was funnelled my way during the week. Some of that starts here, but I’m likely to be spending years engaging with some of these ideas.

What’s the summary

In essence, the Cognitive Edge approach is to take insights from science, in particular complex adaptive systems theory and cognitive science, along with techniques from other disciplines, and apply them to social science, in particular management.

That’s not particularly insightful or original; it’s essentially a rephrasing of the session blurb. In my defence, I don’t think I can come up with a better description, and it is important to state because the Cognitive Edge approach seriously questions many of the fundamental assumptions of current practice in management and the social sciences.

It’s also important to note that the CogEdge approach only questions these assumptions in certain contexts. The approach does not claim universality, nor does it accept claims of universality from other approaches.

That said, the CogEdge approach does provide a number of theoretical foundations from which to question much of what passes for practice within the Australian higher education sector and within organisations more broadly. I’ll attempt to give some examples in a later section. The next few sub-sections provide a brief overview of some of these theoretical foundations. I’ll try to pick up these foundations and their implications for practice at CQU and within higher education at a later date.

The Cynefin Framework

At the centre of the CogEdge approach is the Cynefin framework.

The Wikipedia page describes it as a decision-making framework. Throughout the course we were shown a range of contexts in which it can be used to guide people in making decisions. The Wikipedia page lists knowledge management, conflict resolution and leadership; during the course others were mentioned, including software development.

My summary (see the Wikipedia page for a better one) is that the framework is based on the idea that there are five different types of systems (the fifth type, disorder, is for when you don’t know which of the other four systems you’re dealing with). Most existing management practice is based on the idea of there being just one type of system: an ordered system, the type of system where causality is straightforward and where the right leader (or leadership group) can fully understand it and design (or, more likely, adopt from elsewhere) interventions that will achieve some desired outcome.

If the intervention happens to fail, then it is seen as a problem with the implementation of the intervention: someone failed, there wasn’t enough communication, not enough attention was paid to the appropriate culture and values, and so on.

The Cynefin framework suggests that there are five different contexts, which offers an alternate perspective on failure: the nature of the approach was not appropriate for the type of system.
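
To make that concrete, here’s a minimal sketch (my own illustration in Python, not anything from the course; the decision patterns follow Snowden’s published descriptions of the framework) mapping each Cynefin context to the kind of decision approach it calls for:

```python
# A minimal, illustrative mapping of Cynefin contexts to the
# decision approach each calls for (after Snowden's descriptions).
CYNEFIN = {
    "simple":      "sense -> categorise -> respond: apply best practice",
    "complicated": "sense -> analyse -> respond: bring in experts, apply good practice",
    "complex":     "probe -> sense -> respond: run safe-to-fail experiments, amplify what works",
    "chaotic":     "act -> sense -> respond: stabilise first, ask questions later",
    "disorder":    "first work out which of the other four contexts you are in",
}

def decision_approach(context: str) -> str:
    """Return the decision pattern suited to a given Cynefin context."""
    return CYNEFIN[context.lower()]

# Treating a university course, or a kid's birthday party, as complex:
print(decision_approach("complex"))
```

The point of the framework is that failure often comes from applying the first two patterns (which assume an ordered system) in the complex context, where only the third is appropriate.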

A good example of this mismatch is the story Dave regularly tells about the children’s birthday party. Versions of it include an mp3 audio description (taken from this presentation) and a blog post that points to a video offering a much more detailed description.

The kids’ birthday party is an example of what the Cynefin framework calls a complex system. The traditional management-by-objectives approach originally suggested for the party is appropriate for the simple and complicated sectors of the Cynefin framework, but not the complex.

Everything is fragmented

“Everything is fragmented” was a common refrain during the course. It draws on what cognitive science has found out about human cognition. The idealised view is that human beings are rational decision makers: we gather all the data, consider the problem from all angles, perhaps consult some experts and then make the best decision (we optimise).

In reality, the human brain only gets access to small fragments of the information presented to it. We compare those fragments against the known patterns in our brain (our past experience) and then choose the first match (we satisfice). The argument is that we take fragments of information and assemble them into something somewhat meaningful.
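
As a toy illustration of the difference (my own sketch, not something from the course), satisficing takes the first option that matches a “good enough” pattern, while optimising insists on seeing every option first:

```python
# Toy contrast between optimising and satisficing decision making.
# The options and the "good enough" threshold are made up for illustration.
options = [("A", 5), ("B", 7), ("C", 3)]  # (choice, payoff) pairs

def optimise(options):
    """Gather all the data, then pick the best option."""
    return max(options, key=lambda option: option[1])

def satisfice(options, good_enough=4):
    """Scan options as they arrive and accept the first match."""
    for choice, payoff in options:
        if payoff >= good_enough:
            return (choice, payoff)

print(optimise(options))   # ('B', 7): requires examining everything
print(satisfice(options))  # ('A', 5): the first pattern match wins
```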

The CogEdge approach recognises this, and its methods and software are designed to build on this strength.

Approach, methods and software

The CogEdge approach is called “naturalising sensemaking”. Dave offers a simple definition of sensemaking here:

the way in which we make sense of the world so that we can act in it

Kurtz and Snowden provide a comparison between what passes for the traditional approach within organisations (idealistic) and their approach (naturalistic). My attempt to summarise that comparison:

Idealistic: identify the desired future state and implement approaches intended to achieve it.
Naturalistic: gain sufficient understanding of the present context and choose projects to stimulate the evolution of the system, monitor that evolution and intervene as necessary.

Idealistic: emphasis is on expert knowledge and expert analysis and interpretation.
Naturalistic: emphasis is on the inherent unknowability of a complex system, which means affording no privilege to expert interpretation and instead favouring emergent meaning at the coal-face.

Idealistic: diagnosis precedes and is separate from intervention; diagnosis/research identifies best practice and informs interventions to close the gap between now and the identified future state.
Naturalistic: all diagnoses are also interventions, and all interventions provide an opportunity for diagnosis.

As well as providing the theoretical basis for these views, the CogEdge approach provides a collection of methods that help managers actually act within a naturalistic, sense-making approach. It isn’t an approach that says step back and let it all happen.

There is also the SenseMaker Suite, software that supports (and is supported by) the methods and is informed by the same theoretical insights.

Things to question

Based on the theoretical perspective taken by CogEdge it is possible to raise a range of questions (many of a very serious nature) about practices current within the Australian higher education sector. The following list is a collection of suggestions; I need to work more on these.

The content of this list is based on my assumption that learning and teaching within a current Australian university is a complex system and fits into the complex sector of the Cynefin framework. I believe all of the following practices only work within the simple or complicated sectors of the Cynefin framework.

My initial list includes the following, and where possible I’ve attempted to list some of the flaws of each approach within the complex sector of the Cynefin framework:

  • Quality assurance.
    QA assumes you can document all your processes and that the written-down processes are a complete picture of what is actually done. It assumes you can predict the future. As practiced by AUQA, it assumes that a small collection of auditors from outside the organisational context can come in, look around for a few days and make informed comments on the validity of what is being done. It assumes that these auditors are experts making rational decisions, not pattern-matchers fitting what they see against their past experience.
  • Carrick grants emphasising cross-institutional projects to encourage adoption.
    Still thinking about this one, but my current unease is based on a belief in the uniqueness of each context and the difficulty of moving the same innovation, as-is, across different institutional contexts.
  • Requiring teaching qualifications from new academic staff.
    There is an assumption that the quality of university learning and teaching can be increased by requiring all new academic staff to complete a graduate certificate in learning and teaching. This assumes that folk won’t game the requirement, i.e. complete the grad cert and then ignore the majority of what they “learnt” when they return to a context which does not value or reward good teaching. It also assumes that academics will gain access to the knowledge they need to improve from such a grad cert, a setting in which they are normally not going to develop a great deal of TPCK, i.e. the knowledge they get won’t be contextualised to their unique situation.
  • The application of traditional, plan-driven technology governance and management models to the practice of e-learning.
    Such models are inherently idealistic and simply do not work well for a practice that is inherently complex.
  • Current evaluation of learning and teaching.
    The current surveys given to students at the end of term are generally out of context (i.e. applied well after the student has had the positive or negative experience). Surveys also limit the breadth of the information students can provide to whatever is enshrined in the questions. The course barometer idea we’ve been playing with for a long time is a small step in the right direction.

There are many more, but it’s getting past time to post this.

Possible projects

Throughout the course there were all sorts of ideas about how aspects of the CogEdge approach could be applied to improve learning and teaching at CQU. Of course, many of these have been lost or are still in my notebooks waiting to be saved.

A first step would be to fix the practices outlined in the previous section, which I believe are now highly questionable. Some others include:

  • Implement a learning and teaching innovation scheme based on some of the ideas of the Grameen Bank.
    e.g. if at least 3 academics from different disciplines can develop an idea for a particular L&T innovation and agree to help each other implement it in each of their courses, then it gets supported immediately, with no evaluation by an “expert panel” (a toy sketch of this rule follows the list).
  • Expand/integrate the course barometer idea to collect stories from students (and staff?) during the term and have those stories placed into the SenseMaker software.
    This could significantly increase CQU’s ability to pick up weak signals about trouble (but also about things that are working) and be able to intervene. Not to mention generating a strong collection of evidence to use with AUQA etc.
  • Use a number of the different CogEdge methods to help create a context in which quality learning and teaching arise more naturally.
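
For the first item, the support rule is simple enough to write down. A toy sketch (my own, entirely hypothetical names and data structures):

```python
# Toy sketch of the proposed Grameen-style support rule: an idea is
# funded immediately if at least three academics, each from a
# different discipline, commit to implementing it in their courses.
def qualifies_for_support(backers: list[tuple[str, str]]) -> bool:
    """backers is a list of (academic, discipline) pairs."""
    disciplines = {discipline for _, discipline in backers}
    return len(backers) >= 3 and len(disciplines) >= 3

print(qualifies_for_support(
    [("Alice", "nursing"), ("Bob", "engineering"), ("Cath", "law")]
))  # True: supported immediately, no expert panel required
```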

There are many others, but it’s time to get this post, posted.

Disclaimers

I’ve been a believer in complexity-informed, bottom-up approaches for a long time. My mind holds a collection of patterns about this stuff to which I am positively inclined. Hence it is no great surprise that the CogEdge approach resonates very strongly with me.

Your mileage may vary.

In fact, I’d imagine that most hard-core, plan-driven IT folk, those in the business process re-engineering and quality assurance worlds, and others from a traditional top-down management school probably disagree strongly with all of the above.

If so, please feel free to comment. Let’s get a dialectic going.

I’m also still processing all of the material covered in the three-day course and in the additional readings. This post was done over a few days in different locations, so there are certain to be inconsistencies, typos, poor grammar and basic mistakes.

If so, please feel free to correct.

From scarcity to over abundance – paradigm change for IT departments (and others)

Nothing all that new in this post, at least not that others haven’t talked about previously. But writing this helps me think about a few things.

Paradigms, good and bad

A paradigm can be (and has been) defined as a particular collection of beliefs and ways of seeing the world; perhaps as the set of high-level abstractions which a particular community creates to enable very quick communication. For this purpose a common paradigm, a shared collection of abstractions, is incredibly useful, especially within a discipline. It provides members of a community spread across a wide geographic area with a shared language they can use.

It also has a downside: paradigm paralysis. The high-level abstractions, the ways of seeing the world, become so ingrained that members of the community are unable to see outside the paradigm. A good example is the longitude problem, where established experts ignored an innovation from a non-expert because it fell outside their paradigm, their way of looking at the world.

Based on my previous posts it is no great surprise to find out that I think that there is currently a similar problem going on with the practice of IT provision within organisations.

What’s changed

The paradigm around organisational IT provision arose within a very different context, one that existed for quite some time but is now undergoing a significant shift caused by (at least) three factors:

  1. The rise of really cheap, almost ubiquitous computer hardware.
  2. The rise of cheap (sometimes free), easy to use software.
  3. The spread of computer literacy beyond the high priests of ITD.

The major change is that what was once scarce and had to be managed as a scarce resource (hardware, software and expertise) is now available in abundance.

Hardware

From the 50s until recently, hardware was really, really expensive, generally under-powered and consequently had to be protected and managed. For example, in the late 1960s in the USA there weren’t too many human endeavours that would have had more available computing power than the Apollo 11 moon landing. And yet, in modern terms, it was a pitifully under-resourced enterprise.

Mission control, the folk on earth responsible for controlling and supporting the flight, had access to computing power equivalent to (probably less than) the MacBook Pro I’m writing this blog entry on. The lunar module, the bit that took the astronauts from lunar orbit down to the surface and back again, is said to have had less computing power than the digital watch I am currently wearing.

Moore’s law means that computing power increases exponentially, with a similarly dramatic (downward) impact on the price of any given amount of that power.
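
As a back-of-the-envelope illustration (my own numbers, assuming the common rule of thumb of a doubling every two years), the compounding since Apollo looks like this:

```python
# Rough Moore's law compounding; assumes computing power doubles
# every two years, which is a rule of thumb rather than a law.
DOUBLING_PERIOD_YEARS = 2

def growth_factor(years: float) -> float:
    """How many times more capable hardware becomes over `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# From the Apollo 11 era (1969) to this post (2008):
print(f"{growth_factor(2008 - 1969):,.0f}x")  # roughly 740,000x
```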

Software

Software has traditionally been something you had to purchase, originally only from the manufacturer of the hardware you used. Then, as hardware became more prevalent, independent software vendors arose. Then came public domain software, open source software and, more recently, Web 2.0 software.

Not only was more software available through these alternate approaches, it also became easier to use. There are at least half a dozen free blog services and a similar number of email services available on the Web, all offering a better user experience than the equivalent services provided within organisations.

Knowledge and literacy

The primitive nature of the “old” computers meant that they were very difficult to program and support, but since their introduction the ability to maintain and manipulate computers to achieve something useful has become steadily easier. Originally, only the academics, scientists and engineers who were designing computers could maintain and manipulate them. Eventually a profession arose around that maintenance and manipulation. As the evolution continued, teenage boys of a certain social grouping became extremely proficient, through to today, when increasing numbers of people (though still not the majority) are able to maintain and manipulate computers to achieve their own ends.

At the same time, the spread of computers meant that more and more children grew up with them. A number of the “uber-nerds” who grew up in the 60s and 70s had parents working in industries that enabled the nascent uber-nerds to access computers, to grow up with them. Today it is increasingly rare for anyone not to grow up with some familiarity with technology.

For example, Africa has the fastest-growing adoption rate of mobile phones in the world. I recently read that the diffusion of mobile phones in South Africa was put at 98%.

Yes, there is still a place for professionals. But the increasing power and ease of use of computers means that their place is increasingly not about providing specialised services for a particular organisation, but about providing generalised platforms which the increasingly informed general public can manipulate and use without needing the IT department.

For example, there’s an increasingly limited need (not quite no need) for an organisation to provide an email service when there are numerous free email services that are generally more reliable, more accessible and provide greater functionality than internal organisational services.

From scarcity to abundance

The paradigm of traditional IT governance etc is based around the idea that hardware, software and literacy are scarce. This is no longer the case. All are abundant. This implies that new approaches are possible, perhaps even desirable and necessary.

This isn’t something that just applies to IT departments. The line of work I’m in, broadly speaking “e-learning”, is also influenced by this idea. The requirement for universities to provide learning management systems is becoming increasingly questionable, especially if you believe this change from scarcity to abundance suggests the need for a paradigm change.

The question for me is: what will the new paradigm be? What problems will it create that need to be addressed? Not just the problems caused by an old paradigm battling a new one, but the problems the new paradigm itself will have. What shape will it take? How can organisations make use of this change?

Some initial thoughts from others: Better than Free.

A related question is what impact will this have on the design of learning and teaching?
