Assembling the heterogeneous elements for (digital) learning


How do you implement PLEs into higher education courses?

Jocene reflects a bit upon a slidecast (titled “Personal Learning Environments: The future of education?”) by Graham Attwell.

I tend to sense a touch of frustration in the post, which I don’t think is at all surprising, since the question “How do you implement PLEs into higher education courses?” is extremely complex. Doing anything to change learning and teaching within higher education is extremely difficult. It becomes almost impossible when the change potentially brings into question not only the pedagogical practice of individual academics, but also the assumptions underpinning the administrative and technological bureaucracies that have accreted within tertiary institutions.

Institutions of higher education have essentially failed to implement “enterprise e-learning” in a way that caters for and values the diversity inherent in university teaching. I’m somewhat pessimistic about their ability to implement e-learning that caters for the diversity of university students – an order of magnitude (or two) greater level of diversity.

A way forward

That diversity is, for me, a clue to a way forward. Implementing PLEs within higher education is about a focus on the potential adopters, both the teaching staff and the students. It means answering questions like “What do they want?”, “What do they do?” and “What is a problem you can solve for them that makes a difference?” with something that is related to, or at least moves them towards, the ideals of a PLE.

However, answering these questions should not be done by asking them directly. When it comes to something brand new, something that challenges established ways of doing things, simply asking people “what would you like to do with X?” is a waste of time. If they tell you anything, they will tell you what they’ve always done.

I wonder if this explains the current suggestion that the next generation of students don’t want their university life mixed in with their social life. They don’t want universities getting into Facebook and other social spaces. But if the students haven’t seen good examples of how this might work, you can’t really rely on their feedback; they don’t know yet.

I’ve talked about the 7 principles of knowledge management and in particular principle #2

We only know what we know when we need to know it. Human knowledge is deeply contextual and requires stimulus for recall.

Which is why I like this comment from Jocene

He talks about the need to contextualise the PLE. Well, yes. My colleague and I have decided to push ahead with our own contextualised understanding, so we can start to reflect upon rather than speculate about our PLE work.

Get stuck in, try a few things and then reflect upon what worked and what didn’t. What did the students like, what might be better? This sounds like a much more effective approach than researchers and prognosticators extrapolating what they think people will need and how they should use it. However, I think Jocene’s next quote highlights the difficulty in drawing a line between teleological and ateleological design.

But we still keep getting stuck, half way over the implementation hurdle! If we teleologically suggest a way forward for any group of learners, then we are not facilitating a PLE, we are imposing our values.

Traditionally, e-learning within universities is teleological, and because the nature of teleological design is a complete and utter mismatch with the requirements of e-learning, problems arise. Some colleagues and I have pointed these problems out in two publications (Jones, Luck, McConachie and Danaher, 2005; Jones and Muldoon, 2007).

One defining characteristic of teleological design is that the major design decisions are made by a small group of experts and/or leaders. Their decisions are meant to be accepted by the rest of the group and are meant to be the best decisions possible. I think Jocene’s worried about this type of “imperialism” within PLEs. If she and her colleague make decisions about what should be done, aren’t they being teleological?

They don’t have to be, but it does depend on how you go about it. To my mind you reduce the teleological nature of your decisions by doing the following

  • make small changes to existing practice;
  • ensure that the changes solve problems or provide new services that will be valued by the participants;
  • ensure that you will learn lessons/try new things/make different mistakes than you have before;

i.e. you are doing safe-fail probes rather than fail-safe design. This is a distinction drawn by Dave Snowden, which is talked about more here.

What does that mean for PLEs in higher education?

Some quick thoughts on what this might mean in concrete form for implementing PLEs in higher education:

  • Modify existing e-learning infrastructure to enable it to work with the Web 2.0/mashup/PLE technology and approaches.
    e.g. generate RSS feeds out of various course management system (CMS) features and make them available to students and staff (a rough sketch of what this might look like follows this list).
  • Use the “web 2.0’ifying” of the CMS to build features that solve problems or provide better services for staff and students.
  • Build some examples of PLEs using these services in existing social media applications – the obvious one is probably Facebook – but this decision should be guided by some of the following points.
  • Don’t be exclusionary, don’t focus all efforts on one type of PLE or social media application.
  • Implement strategies and techniques to really engage with the students and staff to learn about what they do. NOT what they say they do, but what they actually do and experience. Use this insight to guide the above.
  • Use the strategies and techniques in the previous point to observe what happens when staff and students do (or do not) use the PLE services and use that insight to identify the next step.
  • Ensure a tight connection with and awareness of what other interesting folk are doing in this area and use it to inform the design of the next safe-fail probes you are doing.
  • Try not to do too much for staff or students. The whole notion of the PLE is that you are empowering them to do things for themselves. If the “instructional assistant/designer” does too much for them it breaks this ideal and it also doesn’t scale.
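As a concrete illustration of the first point above, here is a minimal sketch of what “generate RSS feeds out of CMS features” might look like. Everything in it is hypothetical – the URLs, the forum_posts structure and the field names are stand-ins for whatever a real CMS actually stores – so treat it as a sketch of the idea rather than an implementation for any particular system.

```python
# Minimal sketch: turn hypothetical CMS forum data into an RSS 2.0 feed.
# The data, URLs and field names below are invented for illustration.
from datetime import datetime, timezone
from email.utils import format_datetime
from xml.sax.saxutils import escape

# Hypothetical rows, as they might come back from a CMS forum query.
forum_posts = [
    {"title": "Week 1 discussion open",
     "link": "https://cms.example.edu/forum/1",
     "posted": datetime(2009, 1, 12, 9, 30, tzinfo=timezone.utc)},
    {"title": "Assignment 1 questions",
     "link": "https://cms.example.edu/forum/2",
     "posted": datetime(2009, 1, 14, 14, 5, tzinfo=timezone.utc)},
]

def rss_feed(title, link, posts):
    """Build a minimal RSS 2.0 document from a list of post dicts."""
    items = "\n".join(
        "    <item>\n"
        f"      <title>{escape(p['title'])}</title>\n"
        f"      <link>{escape(p['link'])}</link>\n"
        f"      <pubDate>{format_datetime(p['posted'])}</pubDate>\n"
        "    </item>"
        for p in posts
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n'
        "  <channel>\n"
        f"    <title>{escape(title)}</title>\n"
        f"    <link>{escape(link)}</link>\n"
        f"{items}\n"
        "  </channel>\n"
        "</rss>"
    )

print(rss_feed("Course forum", "https://cms.example.edu/forum", forum_posts))
```

Once a feed like this exists for each CMS feature, students and staff can pull it into whatever reader or mashup tool they prefer, which is the point of the exercise: the institution exposes the data, the participants choose the environment.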

References

Jones, D., Luck, J., McConachie, J. and Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. Proceedings of ODLAA’2005.

Jones, D. and Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In ICT: Providing choices for learners and learning, Proceedings of ASCILITE Singapore, pp. 450-459.

Using a blog for course design FoULT sessions

I’ve bitten the bullet and have decided to use a WordPress blog to support the 6 hour orientation to course analysis and design I’m supposed to run next week.

It’s probably going to be much more work than I should or planned to put in, but so far it’s been fairly easy. It may be worthwhile.

Good teaching is not innate, it can be "learned" – and what's wrong with academic staff development

The title to this post is included in a quote from Kane, Sandretto and Heath (2004)

The research team, comprising two teacher educators and an academic staff developer, embarked upon this research confident in the belief that good teaching is not innate, it can be learned. With this in mind, the project sought to theorise the attributes of excellent tertiary teachers and the relationships among those attributes, with the long-term goal of assisting novice academics in their development as teachers.

This coincides nicely with my current task and also with an idea I came across on the week-end about deliberate practice and the work of Anders Ericsson.

The combination of these “discoveries” is also providing some intellectual structure and support for the REACT idea about how to improve learning and teaching. However, it’s also highlighting some flaws in that idea, though the flaws aren’t anywhere near as large as those in what passes for the majority of academic staff development around learning and teaching.

The following introduces these ideas and how these ideas might be used to improve academic staff development.

Excellent tertiary teaching

Kane et al (2004) close the introduction of their paper with

We propose that purposeful reflection on their teaching plays a key role in assisting our participants to integrate the dimensions of subject knowledge, skill, interpersonal relations, research/teaching nexus and personality into recognised teaching excellence. We conclude with a discussion of the implications of our model for staff development efforts.

Their proposition about the role of reflection in contributing to excellent teaching matches my long-held belief and perception that all of the best university teachers I’ve seen engage in on-going reflection about their teaching, keep looking for new knowledge and keep trying (and evaluating) innovations based on that knowledge in the hope of improving their teaching.

The authors summarise a long history of research into excellent teaching that focused on identifying the attributes of excellent teachers (e.g. well prepared, stimulate interest, show high expectations etc.) but they then suggest a very important distinction.

While these, and other studies, contribute to understanding the perceived attributes of excellent teachers, they have had limited influence on improving the practice of less experienced university teachers. Identifying the elements of “good” university teaching has not shed light on how university teachers develop these attributes.

The model they develop is shown below. They suggest

Reflection lies at the hub of our model and we propose that it is the process through which our participants integrate the various dimensions

Attributes of excellent tertiary teaching

The authors don’t claim this model to have identified any novel sets of attributes. But they do suggest that

the way in which the participants think about and understand their own practice through purposeful reflection, that has led to their development of excellence

What’s been said about reflection?

The authors have a few paragraphs summarising what’s been said about reflection in connection to tertiary teaching, for example

Day (1999) wrote “it is generally agreed that reflection in, on and about practice is essential to building, maintaining and further developing the capacities of teachers to think and act professionally over the span of their careers”.

They trace reflection back to Dewey and his definition

“an active, persistent, and careful consideration of any belief or supposed form of knowledge in light of the grounds supporting it and future considerations to which it tends”.

They also mention a framework of reflection outlined by Hatton and Smith (1995) and use it to provide evidence of reflection from their sample of excellent teachers.

Expertise and deliberate practice

Among the many quotes Kane et al (2004) provide supporting the importance of reflection is this one from Sternberg and Horvath (1995)

in the minds of many, the disposition toward reflection is central to expert teaching

Another good quote (Common 1989, p. 385).

“Master teachers are not born; they become. They become primarily by developing a habit of mind, a way of looking critically at the work they do; by developing the courage to recognize faults, and by struggling to improve”

Related to this view is the question “Was Mozart, and other child prodigies, brilliant because of some innate talent?”. This is a question that this blog post takes up. The answer it gives is no. Instead, it’s the amount and quality of practice they engage in which makes the difference. Nurture wins the “nature versus nurture” battle.

The blog post builds on the work of Anders Ericsson and the concept of “deliberate practice”. The abstract for Ericsson et al (1993) is

The theoretical framework presented in this article explains expert performance as the end result of individuals’ prolonged efforts to improve performance while negotiating motivational and external constraints. In most domains of expertise, individuals begin in their childhood a regimen of effortful activities (deliberate practice) designed to optimize improvement. Individual differences, even among elite performers, are closely related to assessed amounts of deliberate practice. Many characteristics once believed to reflect innate talent are actually the result of intense practice extended for a minimum of 10 years. Analysis of expert performance provides unique evidence on the potential and limits of extreme environmental adaptation and learning.

Implications for academic staff development

If reflection or deliberate practice are key to developing mastery or expertise, then how do approaches to academic staff development and associated policies, processes and structures around university learning and teaching help encourage and enable this practice?

Seminars and presentations probably help those who are keen to become aware of new ideas that may aid their deliberate practice. However, attendance at such events is minimal. Much of existing practice seems to provide some level of support to those, the minority, already engaging in deliberate practice around learning and teaching.

The majority seem to be able to get away without engaging like this. Perhaps there’s something here?

References

Common, D.L. (1989). ‘Master teachers in higher education: A matter of settings’, The Review of Higher Education 12(4), 375–387.

Hatton, N. and Smith, D. (1995). ‘Reflection in teacher education: Towards definition and implementation’, Teaching & Teacher Education 11(1), 33–49.

Kane, R., S. Sandretto, et al. (2004). “An investigation into excellent tertiary teaching: Emphasising reflective practice.” Higher Education 47(3): 283-310.

Sternberg, R. and Horvath, J. (1995). ‘A prototype view of expert teaching’, Educational Researcher 24(6), 9–17.

The design of a 6 hour orientation to course analysis and design

It’s that time of year again: next week I get to run a session with 20 or so new CQU academics looking at course analysis and design. The session is part of a four day program entitled Foundations of University Learning and Teaching (FoULT). The session is run twice a year.

The following post gives an overview of some of my thinking behind the session this year. The sessions won’t really be finalised until they’re over, so if you have any feedback or suggestions, fire away.

Constraints

The following constraints apply

  • The session lasts 6 hours.
  • I’m told there will be 24 participants; I expect fewer than that.
  • I’ll be the only facilitator.
  • The participants are required to do this as part of their employment and some may be less than enthusiastic, though there are generally some very keen participants.
  • The sessions will be held in a computer lab. The computers are arranged around the walls of the room and there is a table without computers in the middle of the room.
  • The sessions run for 3 hours after lunch on one day and 3 hours before lunch the following day.
  • The participants will be a day and a half into the four days by the time they get to this session (information overload kicking in).
  • Earlier on the first day they will have done sessions on “knowledge management” and assessment – moderation and marking.
  • The title of the sessions is “course analysis and design” so I should probably do something close to that.
  • I don’t have the time to do a lot of preparation because of other responsibilities.
  • Have done this session a few times before (slides from the last time are Introduction, Implementation, Analysis and design) so that experience will constrain my thinking.
  • Theoretically, I don’t believe that there is much chance of radically changing minds or developing expertise in new skills. The best I can hope for is sparking interest, raising awareness and pointing them in the right direction.

The plan

I’m thinking that the session should aim to

  • Make people aware of the tools and support currently available to help with their teaching.
  • Introduce them to some concepts or ideas that may lead them to re-think the assumptions on which they base their course design.
  • Introduce them to some resources and ideas that may help them design their courses.

Activities during the session will include

  • Some presentation of ideas using video and images.
  • Discussion and sharing of responses and their own ideas via in class discussion but also perhaps through the CDDU wiki and/or perhaps this blog.
  • A small amount of activity aimed at performing some design tasks.
  • A bit of playing around with various systems and resources.

There won’t be any assessment for this one.

The sessions

I’m planning on having 4 sessions over the 6 hours

  1. Introduction
    Set up who I am and what we’re going to be doing. Find out more about the participants – maybe get them to put this on the wiki or perhaps a WordPress blog — that sounds like an idea. Introduce the Trigwell (2001) model of university teaching that I’ll be using as a basic organising concept. Use it to introduce some of the ideas and explain the aim of the sessions. Introduce them to the technology we’ll be using and get them going.
  2. The T&L Context
    Talk about the details of the CQUni T&L context. What tools and resources are available? What do students see when they use various systems (something staff often don’t see)? Who to ask for help? etc. Also include mention of “Web 2.0” tools i.e. that the context and tools for T&L aren’t limited to what is provided by the institution. Provide an opportunity to play and ask questions about this. Aim is to be concrete, active and get folk aware of what tools they can use. Hopefully to keep them awake after lunch.
  3. Teachers’ thinking
    Introduce and “attack” some basic ideas that inform the way people think about learning and teaching. Some ideas about course design, learning and teaching and human cognition.
  4. Teachers’ planning
    Talk about the process of actually doing course design and some of the ideas, resources and tools that can be used during this process.

The plan is that the first two would be on the afternoon of the first day with the last two on the following day.

The Trigwell (2001) model of teaching is shown in the following image and is briefly described on the flickr page for the image. You should see the connection between the names of the sessions and the model.

Trigwell's model of teaching

Actually, after posting this I’ve made some changes to expand the use of the Trigwell (2001) model including teachers’ strategies and in particular gathering some of their strategies.

What’s needed? What would be nice?

I want to provide pointers to additional resources and also make use of good resources during the sessions. The list of what I’ve got is available on del.icio.us.

If you know of any additional resources you’d recommend please either add them in the comments of this post or tag them in del.icio.us with foult

Feedback on the above ideas would also be welcome.

References

Trigwell, K. (2001). “Judging university teaching.” The International Journal for Academic Development 6(1): 65-73.

Some possible reasons why comparisons of information systems are broken

All over the place there are people in organisations performing evaluations and comparisons of competing information systems products with a strong belief that they are being rational and objective. Since the late 1990s or so, most Universities seem to be doing this every 5 or so years around learning management systems. The problem is that these processes are never rational or objective because the nature of human beings is such that they never can be (perhaps very rarely – e.g. when I’m doing it 😉 ).

Quoting Dave Snowden

Humans do not make rational, logical decisions based on information input, instead they pattern match with either their own experience, or collective experience expressed as stories. It isn’t even a best fit pattern match, but a first fit pattern match. The human brain is also subject to habituation, things that we do frequently create habitual patterns which both enable rapid decision making, but also entrain behaviour in such a manner that we literally do not see things that fail to match the patterns of our expectations.

Dave also makes the claim that all the logical process, evaluations, documents and meetings we wrap around our pattern-matching decisions is an act of rationalisation. We need to appear to be rational so we dress it up. He equates the value of this “dressing up” with that of the ancient witch doctor claiming some insight from the spirit world leading him to the answer.
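The “first fit” versus “best fit” distinction in that quote also has a literal algorithmic reading, and a toy sketch may make the difference concrete. This is purely illustrative – the features, patterns and threshold below are all invented – but it shows why a first-fit matcher can settle on an early, poorer pattern that a best-fit matcher would pass over.

```python
# Toy illustration of "first fit" vs "best fit" pattern matching.
# All features, patterns and the threshold are invented for illustration.

def similarity(situation, pattern):
    """Crude similarity: fraction of shared features (Jaccard index)."""
    return len(situation & pattern) / len(situation | pattern)

def first_fit(situation, patterns, threshold=0.5):
    """Return the first pattern that matches 'well enough', then stop."""
    for p in patterns:
        if similarity(situation, p) >= threshold:
            return p
    return None

def best_fit(situation, patterns):
    """Score every pattern and return the closest match."""
    return max(patterns, key=lambda p: similarity(situation, p))

patterns = [{"lecture", "exam"},
            {"online", "forum", "exam"},
            {"online", "portfolio"}]
new_situation = {"online", "portfolio", "exam"}

print(first_fit(new_situation, patterns))  # settles for an early, adequate match
print(best_fit(new_situation, patterns))   # scans everything for the closest match
```

Snowden’s point is that human decision making behaves like first_fit, not best_fit: it stops at the first pattern that feels close enough rather than scanning all the alternatives.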

Via a Luke Daley tweet I came across a TED talk by Dan Gilbert that provides some evidence from psychology about why this is true. You can see it below or go to the TED page.

As an aside the TED talks provide access to a lot of great presentations and even better they are freely available and can be downloaded. Putting them on my Apple TV is a great way to avoid bad television.

The "dominant" assumptions underlying university-based e-learning: an introduction

As part of working on my thesis I’m working on chapter 2. As the traditional literature review, one purpose of the chapter is to demonstrate what I know about the topic and to highlight what I think are the flaws or holes in current research and practice that I believe my research will address. The following builds on some initial ideas from a previous blog post and serves as some practice in formulating my ideas, so it will still be rough. Feel free to suggest improvements, point out problems and disagree.

I was going to leap headlong into describing the flaws of university-based e-learning that I perceive. However, before getting to those I thought I’d address what I see as the source of these problems – the “dominant assumptions”.

I was going to develop a list of those dominant assumptions to include in this post. Then it got too long and was getting too inflexible. So now I’m going to give an introduction in this post and have separate posts to develop the assumptions I associate with different components.

The argument

The basic argument is that much of the organisational practice of selecting, designing and supporting e-learning information systems (including both the technology and how the technology is harnessed through organisational practices) within universities is far less effective than it could be. The vast majority of this practice can be argued to be more closely aligned with the band-wagon effect than of being appropriate for the organisation.

This is because of the complexity of the factors that must be considered in order to make informed decisions about this practice, and the observation that most of these decisions are flawed because they draw on a limited set of dominant assumptions about the different components that contribute to e-learning. These dominant conceptualisations are entirely incompatible with the nature of e-learning within universities and/or their unquestioned acceptance limits consideration of alternate perspectives that might be useful.

Important: I’m not thinking directly about the practice of e-learning within individual courses, though that practice is directly impacted upon by what I am thinking about. My focus is at the organisational level – with how a university, or perhaps one university organisational unit, and its management decide how to implement and support e-learning in terms of technology, policy and processes.

Introducing the Ps Framework

The Ps Framework is defined in this paper (Jones, Vallack and Fitzgerald-Hood, 2008) as

As a descriptive theory, the Ps Framework is proposed as a tool to make some sense of the complex, uncertain and contradictory information surrounding the organisational adoption of educational technology. The seven components of the Ps Framework identify many (any claim to exhaustive coverage would require additional research) of the important factors to be considered in such decisions.

Earlier in the same paper the explanation of descriptive theory (aka taxonomy or framework) is given as

Frameworks offer new ways of looking at phenomena and provide information on which to base sound, pragmatic decisions (Mishra & Koehler, 2006). Gregor (2006) defines taxonomies, models, classification schema and frameworks as theories for analysing, understanding and describing the salient attributes of phenomena and the relationships therein.

The sheer complexity and scope of the factors to be considered around the organisational implementation of e-learning within universities is such that, I believe, there is a need for some sort of framework to structure the discussion. The Ps Framework may not be the final or best framework for structuring this discussion, but it’s the most complete one that I’m currently aware of.

There are other frameworks that have been used in e-learning, including:

  • the 4Es conceptual model (Collis et al, 2001)
    Used to predict the acceptance of ICT innovations by an individual within an educational context – environment, effectiveness, ease of use and engagement.
  • the ACTIONS model (Bates, 2005)
    Provides guidance on selecting a particular educational technology – Access, Costs, Teaching and learning, Interactivity and user-friendliness, Organisational issues, Novelty and Speed.
  • the P3 model (Khan 2004)
    An approach to course development, delivery and maintenance.

I’m sure there are many others. If you know of any, please share them.

The components of the Ps Framework

The seven components of the Ps Framework (I’m quoting from the paper) are

  1. Purpose.
    What is the purpose or reason for the organization in adopting e-learning or changing how it currently implements e-learning? What does the organization hope to achieve? How does the organization conceptualise its future and how e-learning fits within it?
  2. Place.
    What is the nature of the organization in which e-learning will be implemented? What is the social and political context within which it operates? How is the nature of the system in which e-learning will be implemented understood?
  3. People.
    What type of people and roles exist within the organization? What are their beliefs, biases and cultures?
  4. Pedagogy.
    What are the conceptualisations about learning and teaching, which the people within the place bring to e-learning? What practices are being used to learn and teach? What practices might the people like to adopt? What practices are most appropriate?
  5. Past experience.
    What has gone on before with e-learning, both within and outside of this particular place? What worked and what didn’t? What other aspects of previous experience at this particular institution will impact upon current plans?
  6. Product.
    What type of "systems" or products are being considered? What is the nature of these products? What are their features? What are their affordances and limitations?
  7. Process.
    What are the characteristics of the process used to choose how or what will be implemented? What process will be used to implement the chosen approach?

The relationship between the seven components can be explained as starting with purpose. Some event or reason will require an organization to change the way in which it supports e-learning. This becomes the purpose underlying a process used by the organization to determine how (process) and what it (product) will change. This change will be influenced by a range of factors including: characteristics of the organization and its context (place); the nature of the individuals and cultures within it (people); the conceptualisations of learning and teaching (pedagogy) held by the people and the organization; and the historical precedents both within and outside the organisation (past experience). This is not to suggest that there exists a simple linear, or even hierarchical, relationship between the components of the Ps Framework. The context of implementing educational technology within a university is too complex for such a simple reductionist view. It is also likely that different actors within a particular organization will have very different perspectives on the components of the Ps Frameworks in any given context.

The dominant assumptions

In the previous post that this one has grown out of I argued near the end that

E-learning in universities generally suffers from bandwagons because the decision makers draw upon a number of dominant assumptions that negatively influence what they do.

Many of these assumptions are so fundamental that they are never questioned. In fact, they are never even thought about. The decision makers don’t even know that they don’t know. This limits the quality of their decisions and contributes to the bandwagon effect.

For the rest of this post I use the components of the Ps Framework to develop an early list of these dominant assumptions and the problems they create. Where possible I try to include some explanation of the problem with each assumption and some examples.

Disclaimer: “dominant” is a bit strong for some (perhaps all) of these assumptions. Each of the assumptions has a different “dominance level”, which is a factor of how many people believe it and to what level they are unaware that there are more appropriate alternatives.

References

Bates, A. W. (2005). Technology, E-learning and Distance Education, Routledge.

Collis, B., O. Peters, et al. (2001). “A model for predicting the educational use of information and communication technologies.” Instructional Science 29(2): 95-125.

Gregor, S. (2006). “The nature of theory in information systems.” MIS Quarterly 30(3): 611-642.

Jones, D. and S. Gregor (2004). An information systems design theory for e-learning. Managing New Wave Information Systems: Enterprise, Government and Society, Proceedings of the 15th Australasian Conference on Information Systems, Hobart, Tasmania.

Jones, D., J. Vallack, et al. (2008). The Ps Framework: Mapping the landscape for the PLEs@CQUni project. Hello! Where are you in the landscape of educational technology? ASCILITE’2008, Melbourne.

Khan, B. (2004). “The People-Process-Product Continuum in E-Learning: The e-learning P3 model.” Educational Technology 44(5): 33-40.

Mishra, P. and M. Koehler (2006). “Technological pedagogical content knowledge: A framework for teacher knowledge.” Teachers College Record 108(6): 1017-1054.

OECD. (2005, 17 January 2006). “Policy Brief: E-learning in Tertiary Education.” Retrieved 5 December, 2006, from http://www.oecd.org/dataoecd/55/25/35961132.pdf.

Walls, J., G. Widmeyer, et al. (1992). “Building an Information System Design Theory for Vigilant EIS.” Information Systems Research 3(1): 36-58.

Some things that are broken with evaluation of university teaching

This article from a Training industry magazine raises a number of issues, well known in the research literature, about the significant limitations that exist with the evaluation of university teaching.

Essentially the only type of evaluation done at most universities is what the article refers to as “level 1 smile sheets”. That is, student evaluation forms that ask students to rate what they felt they learned, what they felt about the course and about the teacher. As Will Thalheimer describes

Smile sheets (the feedback forms we give learners after learning events) are an almost inevitable practice for training programs throughout the workplace learning industry. Residing at Donald Kirkpatrick’s 1st level—the Reaction level—smile sheets offer some benefits and some difficulties.

His post goes on to list some problems, benefits and a potential improvement. Geoff Parkin shares his negative view on them.

The highlight for me from the Training mag article was

In some instances, there is not only a low correlation between Level I and subsequent levels of evaluation, but a negative one.

The emphasis on level 1 evaluation – why

Most interestingly, the article then asks the question, “why do so many training organisations, including universities, continue to rely on level 1 smile sheets?”

The answer it provides is that they are too scared of what they might find. It’s the ostrich approach of sticking the head in the sand.

What else should be done?

This google book search result offers some background on “level 1” and talks about the other 3 levels. Another resource provides some insights and points to other resources. I’m sure if I dug further there would be a lot more information about alternatives.

Simply spreading the above findings amongst the folk at universities who rely on and respond to findings of level 1 smile sheets might be a good start. It’s probably necessary to start moving beyond the status quo.

Open source learning management systems – the latest fad in e-learning

The following is forming up as something to go into my thesis as part of the “Past experience” component of the Ps Framework.

Essentially, I’m going to try and

  • Suggest a timeline of e-learning within universities.
  • Contend that this history shows all the hallmarks of fads and fashions in organisational decision making, aka the bandwagon effect.
  • Outline what I think is one contributing factor to this bandwagon effect.

The timeline

Still early days for this, but I think you can group the timeline for e-learning in universities into the following stages (the times I’ve put in place are based on my experience at CQUniversity):

  1. Pre-Internet – pre-1992 or so
    This is the use of bulletin board systems and proprietary systems run by telecommunication companies. Text-only, primitive and horrible user interfaces, and really, really expensive charges for dialing up at speeds approaching 2400bps.
  2. Individual Internet – 1992-~1996 (perhaps later in some places)
    Individual innovators adopting use of FTP, email, Gopher and the very early days of the Web for their own classrooms. Most access is still via text-based interfaces with very slow connections and expensive charges. The Internet is not yet a widespread thing.
  3. Cottage industry learning management systems – ~1995/6-late 1990s
    The mid to late 1990s saw widespread recognition that the majority of academic staff simply did not have the skills or time to individually design their use of Internet technologies (Goldberg, Salari et al. 1996; Jones and Buchanan 1996). Through this time a diverse collection of intranet-based systems, home-built virtual learning environments, off-the-shelf products and customized groupware solutions were developed by different schools and faculties (Dron 2006). Some of them are still being used today – but all are just about gone.
  4. Institutional and then Enterprise commercial learning management systems – late 1990s through to now
    Centralised systems start being put in place. Almost without exception these are commercial systems from vendors. In the early days there were a plethora of vendors, but eventually this is reduced through mergers and takeovers. Some time in the noughties these systems go for added “prestige” by calling themselves “enterprise” systems. Most are still based on dodgy bits of technology put together in an ad hoc manner.
  5. Open source learning management systems – mid-noughties and onwards
    For various reasons, mostly cost, universities are increasingly going for open source learning management systems. Essentially the same as the commercial systems in terms of functionality, but designed by an open source community.

In terms of Gartner’s technology hype cycle my guess is that we’re currently climbing the peak of inflated expectations in terms of open source learning management systems. I believe most of the universities in New Zealand have adopted Moodle. All of the Australian universities that have recently made decisions about changing their LMS, I believe, have gone open (or community) source.

I was going to include the Wikipedia image of the hype cycle, but the format is wrong for this task. So I searched Flickr and found the following. It’s one view of emerging technologies in education and elsewhere. Interesting to see where other people have put Web 2.0, e-portfolios etc. It doesn’t distinguish between open source and commercial course management systems though. (click on the image to go to the original photo or here for the original blog post the image is from).

http://fleeptuque.com/ version of the gartner hype cycle

I believe we’re also at the start of another stage in this development, the “post-industrial” approach. But this one still hasn’t risen to the awareness of most organisations, at least not to the point of senior management making a decision to base the institution’s approach on it.

  • Post-industrial – mid-noughties (perhaps later)
    This stage includes various ideas from such concepts as e-learning 2.0 (or the original), personal learning environments and web 2.0. At a very simple level, the emphasis moves away from the industrial model, where everything is supplied by the institution through “enterprise” systems, to a model where the services provided by the institution blend in with the services and requirements of the learners, where learner is defined in the broadest possible sense.

The case for fads, fashions and band-wagons

Back in June, 2007 – when writing this paper I said

…the subsequent limitation of rationality is demonstrated by the “faddish” adoption of LMSs within universities. Surprise has been expressed at how quickly university learning and teaching, commonly known for its reluctance towards change, has been modified to incorporate the use of learning management systems (West et al., 2006). Pratt (2005) finds connections between the Australian university sector’s adoption of e-learning during the 1990s and the concept of management fashions, fads and bandwagons where a relatively transitory collection of beliefs can legitimise the exercise of mindlessness with respect to innovation with information technology (Swanson & Ramiller, 2004). In particular, given conditions of uncertainty about prevailing technologies organisations may rely on imitation to guide decision making (Pratt, 2005).

One of the reasons, I believe, that the timeline above applies fairly well to e-learning, regardless of institution or country, is that most have been influenced by the bandwagon effect. The people making decisions in most organisations have had so little understanding of how to go about e-learning that they have followed what everyone else is doing.

Following on from the above quote I asked the following questions

Is the current trend amongst universities to move towards open source learning management systems (e.g. Moodle) the most recent e-learning fashion? Will an open source learning management system, especially one that is supported within an institution in the same way as a commercial product, really make a significantly different impact than use of a commercial product?

Guess which university has recently decided to adopt Moodle?

The source of the bandwagon – unquestioned assumptions

Information systems development (and other interventions in human organisations) are not rational, purposive or goal-driven processes, they are instead subject to human whims, talents and the personal goals of those involved (Truex et al., 2000). Decision making about the implementation of information systems is not a techno-rational process with many decision makers relying on intuitions or instincts and simple heuristics to simplify decision making (Jamieson & Hyland, 2006). People are not rational in that their decision-making is influenced by a range of cognitive and other biases.

In the words of Dave Snowden, “Human beings are pattern matching intelligences”. Or more specifically (Snowden, 2005)

This builds on naturalistic decision theory in particular the experimental and observational work of Gary Klein (1944) now validated by neuro-science, that the basis of human decision is a first fit pattern matching with past experience or extrapolated possible experience. Humans see the world both visually and conceptually as a series of spot observations and they fill in the gaps from previous experience, either personal or narrative in nature. Interviewed they will rationalize the decision in whatever is acceptable to the society to which they belong: “a tree spirit spoke to me” and “I made a rational decision having considered all the available facts” have the same relationship to reality

When it comes to e-learning, the same problem arises. No matter who looks at e-learning, their answers to the question of “how to do e-learning” are almost always limited by their background and experience. They do not see the whole picture; they only focus on what they know. And the fashion of what all the other universities are doing becomes a narrative so strong that it becomes the obvious way to go.

Related to this problem is that there are a range of unquestioned assumptions underlying these decisions. I’ll try and use the Ps Framework to represent these. (This is very much formative for the thesis.)

  1. Past experience – there’s almost an ignorance of what has gone before and the limitations of previous attempts. My university has just gone through its 3rd selection process for an LMS. How many has yours done?
  2. People – there’s an assumption that people are rational. The above disproves some of that. Google “bounded rationality” or “cognitive bias”. There’s an assumption that students and staff will adopt whatever is decided.
  3. Product – there’s almost an automatic assumption that e-learning means LMS.
  4. Place – people seem to think universities and the context they operate in are static. There seems to be an assumption that universities are ordered or complicated systems, rather than complex systems (see the Cynefin framework).
  5. Process – there is a complete bias to teleological design processes (Introna, 1996; Jones and Muldoon, 2007) when ateleological design is more appropriate. In the words of Snowden, “fail-safe” design versus “safe-fail” design.

Institutional implementation of e-learning can’t help but be faddish and fashionable, because many of the underlying understandings of the components that contribute to these decisions are unquestionably biased towards one understanding of those components. These understandings have become the single domineering, and usually unquestioned, understandings that underpin the organisational implementation of e-learning within universities.

Paraphrasing Truex et al (2000)

The adoption of such domineering understandings not only imprisons thinking about those understandings, but also thinking about thinking about those understandings.

References

Dron, J. (2006). Any color you like, as long as it’s Blackboard. World Conference on E-Learning in Corporate, Government, Healthcare and Higher Education, Honolulu, Hawaii, USA, AACE.

Goldberg, M., S. Salari, et al. (1996). “World-Wide Web – Course Tool: An environment for building WWW-based courses.” Computer Networks and ISDN Systems 28: 1219-1231.

Jamieson, K. and P. Hyland (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. 17th Australasian Conference on Information Systems, Adelaide, Australia.

Jones, D. and R. Buchanan (1996). The design of an integrated online learning environment. Proceedings of ASCILITE’96, Adelaide.

Introna, L. (1996). “Notes on ateleological information systems development.” Information Technology & People 9(4): 20-39.

Pratt, J. (2005). “The Fashionable Adoption of Online Learning Technologies in Australian Universities.” Journal of the Australian and New Zealand Academy of Management 11(1): 57-73.

Snowden, D. (2005). Multi-ontology sense making: A new simplicity in decision making. Management Today, Yearbook 2005. R. Havenga.

Swanson, E. B. and N. C. Ramiller (2004). “Innovating mindfully with information technology.” MIS Quarterly 28(4): 553-583.

Truex, D., R. Baskerville, et al. (2000). “Amethodical systems development: the deferred meaning of systems development methods.” Accounting Management and Information Technologies 10: 53-79.

West, R., G. Waddoups, et al. (2006). “Understanding the experience of instructors as they adopt a course management system.” Educational Technology Research and Development.

Why am I an ePortfolio skeptic?

Update: It’s not just eportfolios

16 September, 2012 – for some reason today has seen a fair bit of interest in this post from over 3 years ago. I thought I’d take the opportunity to move this particular argument up a level.

Proposition: Attempts to improve or – heaven forbid – “innovate” university teaching and learning are largely driven by mindless innovation, fads and fashions. Eportfolios were just one such fad. There have been and will be many more. Some of the more recent include: e-learning, the LMS, the enterprise LMS, the open source LMS (e.g. an earlier post where I proposed that the open source LMS was yet another fad), and more recently MOOCs.

@downes seemed slightly annoyed when I wrote in this earlier post

MOOCs are the latest fad to hit higher education.

It was never my intent to denigrate the work he and others have done with cMOOCs. Rather, it was to criticise how universities – especially Australian universities – were responding to the rise of MOOCs; somewhat along the lines of what @bonstewart argues in “Is MOOC more than just a buzzword?”.

My argument is that rather than mindfully innovating – or simply improving – learning and teaching (either with or without), universities are driven by outside influences: by fads, fashions and buzzwords.

When I say I’m an eportfolio skeptic, I’m not necessarily denigrating eportfolios (or MOOCs or the LMS or any other fad, fashion or buzzword). I’m critiquing the institutional leadership and management that continues to be driven by these fads, fashions and buzzwords and is apparently unaware of the problems this entails.

Both this original post and this post “Justificatory knowledge” use Swanson and Ramiller (2004) on innovating mindfully with technology. The “justificatory” post includes the following summary.

An organisation which is mindful in innovating with IT, uses reasoning grounded in its own organisational facts and specifics when thinking about the innovation, the organisation recognises that context matters (Swanson and Ramiller 2004). Within mindful innovation, management have a responsibility to foster conditions that prompt collective mindfulness (Swanson and Ramiller 2004).

Especially when how the institution adopts the latest fad ends up corrupting some of the fundamental underpinnings of the original idea. e.g. how the adoption of a specific eportfolio system seems to create the situation where universities are mandating that students create a portfolio in the institutional system. So much for individual choice.

Or when institutional MOOCs are no longer MOOCs as @downes defines them

What makes a MOOC is the way it is designed – it supports thousands of users that fully interact because it is distributed. It’s not located in just one place, it is located in many places.

Original post starts here

Update: Donald Clark has 7 reasons why he doesn’t want one.

I am a skeptic when it comes to ePortfolios. I believe they are a waste of time. Another fad that will take attention away from activities that will actually improve learning and teaching at universities. I believe they embody many of the faulty assumptions and mistakes that underpin most of e-learning within universities.

So, why do I think that? Why am I using such strong language to describe it? Am I right? (Answers to these questions are always open to change).

That second last question is easy to answer: given a range of contextual factors, I’m increasingly annoyed at people making, what I see as, the same mistakes again and again and again. I’m also simply in a grumpy mood today. It’s now over a month since I first wrote the start of this paragraph, and that sentiment still stands. I also find myself in a position where I can be a little more critical.

I’m going to try and get this post published today to achieve something and also because I came across this presentation from the EDUCAUSE Mid-Atlantic Regional Conference. It’s called “Assessing Impact: E-Portfolios in Higher Education”.

Yet another fad – ignorance of place, emphasis on product

In an earlier post I started some of this complaint. In the session that post reports on there were lots of gleams of interest from CQUniversity folk when e-portfolios and elgg were mentioned. I could see people, who had little or no idea about what e-portfolios or elgg were, becoming interested in looking at them. At the time I said

This is one example of how the “product” (in terms of the Ps Framework) overwhelms consideration of “place”, of context. This is exactly how fads and fashions arise in educational technology and organisational/management practice in general.

Here’s what Swanson and Ramiller (2004) said about fads and fashions.

Attention deferral and contextual insensitivity may appear to be unproblematic in the face of the overwhelming “proof” afforded by the larger community’s rush toward the innovation

i.e. since everyone else is doing it, you don’t have to worry about considering whether or not it actually makes sense for your context. In many cases you also don’t have to worry about whether or not everyone else that is already using the fad is succeeding or not.

The educational technology/curriculum design community and consequently the broader university community is talking about e-portfolios. In Australia there is the Australian ePortfolio Project which is raising the profile of e-portfolios in the sector. I predict that this project will further increase awareness of e-portfolios within the sector and before the year is out (if they haven’t already) some discussions will occur about their use at CQUniversity.

The EDUCAUSE presentation above suggests e-portfolios have been around for almost 10 years. So it’s taken a while. The same presentation also reports on some less than stellar results from students and staff. It seems the people aren’t that happy.

The “technologists” alliance – ignorance of people

I’m guessing that if we did an analysis of the literature around ePortfolios we would find three categories of people making up the majority, if not the entirety, of the list of authors. Those categories are

  1. Vendors or developers of e-portfolio systems – the people that make and sell the systems.
  2. Institutional instructional designers or instructional technology folk – the people implementing e-portfolio projects within universities.
  3. Innovative academic staff – the people who adopt e-portfolios first.

In other words, a minority of the folk involved with education within universities.

In the words of William Geoghegan (1994) these three groups of folk are the “technologists” alliance. He had this to say about them (emphasis and some explanation added by me)

Those involved include faculty innovators and early adopters, campus IT (IT here is instructional technology – US phrase that includes instructional designers and information technology folk) support organizations, and information technology vendors with products for the instructional market. Ironically, while this alliance has fostered development of many instructional applications that clearly illustrate the benefits that technology can bring to teaching and learning, it has also unknowingly worked to prevent the dissemination of these benefits into the much larger mainstream population.

Geoghegan (1994) continued to say

There seems to have been a naive assumption on the part of all three communities that what worked for those who were already committed to the use of instructional technology, were actively applying it in their own work and were serving as evangelists to others would work equally well with those who had not yet committed.

Geoghegan works with Geoffrey Moore’s concept of a chasm that exists between the early adopters of a product (the enthusiasts and visionaries) and the early majority (the pragmatists), represented something like this.

The diffusion curve with the chasm

Figure 1 – Revised technical adoption cycle

This is not to suggest that elgg or e-portfolios are a bad idea; there is some value in them if they are appropriate for the organisational context and are adopted because of a large organisational need, not because someone heard or read something positive about the idea. They are a good idea if they are implemented in a way that engages with the reality of what the people within the place are dealing with and with what they are ready to do.

The following table is an example of the differences which Geoghegan talks about between the folk that work with the technologists’ alliance and those in the mainstream.

Early Adopters | Mainstream
Like radical change | Like gradual change
Visionary | Pragmatic
Project oriented | Process oriented
Risk takers | Risk averse
Willing to experiment | Need proven uses
Self sufficient | Need support
Relate horizontally | Relate vertically

The types of support and encouragement you give to the early adopters have to be radically different from those you give to the mainstream. Time and time again, I have heard senior university folk express the opinion “We’ll concentrate on the people that are keen”. This perspective only entrenches the gap; it makes certain that the mainstream won’t engage. It’s a mistake.

There are many other differences between people that will impact upon perceptions and adoption. The process used to identify, design, develop and implement any technology, including e-portfolios, has to be aware of and work with these differences.

Failure to adopt or work – ignorance of process

E-portfolios, like many other fads, are becoming a solution looking for a problem. The process used to implement these fads within higher education goes something like this

  • There is a problem. A problem is identified and someone or some group is convinced that it is important and requires that a project be set up.
  • Some analysis is performed. A small group of folk, generally from the technologists’ alliance with a few senior management folk added, go away and do some analysis. They may be helped by a consultant. Typically, the problem will have been framed so that the decision is a foregone conclusion. For example, rather than examining the question “How do we improve assessment?” the question will be “Which approach to e-portfolios should we take?”
  • A decision is made. That analysis will be used by some smaller group to make a decision. From now on, this is the goal towards which the organisation and its resources are focused.
  • The decision is implemented. A project group is set up to achieve the goal. The project group will be there to ensure that all work moves towards the goal. Anything that is different is cut off.
  • Long period of support. To recoup the costs involved in making and implementing the decision, the organisation then has to use the “system” for a long period. During this time organisational resources are focused on supporting the system (and little or nothing else).
  • Eventually it will start to drift. In some cases the people won’t want to use it; they aren’t convinced of the rationale. They may well appear to be working towards the goal, but they may simply be paying sufficient lip service so they don’t get into trouble. Then there will be the problem of “stable systems drag” (Truex, Baskerville et al, 1999), where the world has moved on and the goal no longer makes any sense. Alternatively, a new senior executive could arrive with different approaches and kill one set of fads for another set.
  • Another problem is perceived, and the process starts again.

You can see this with the notion of LMS churn going on within universities. In the 10 years since 1999, CQUniversity will have been through 3 separate processes to replace an LMS.

The very nature of universities – the place, the people and the pedagogy – makes this type of process completely and utterly inappropriate and destined to fail.

But that’s a general perspective that applies to just about anything within universities; what about e-portfolios in particular?

The wrong solution – the wrong product

The definition of e-portfolio used in the EDUCAUSE presentation is taken from Lorenzo and Ittelson (2005)

a digitized collection of artifacts including demonstrations, resources, and accomplishments that represent an individual, group, or institution.

For the majority of the technologists’ alliance within universities this means that the university must purchase, or build, and then support a software system that resides on university hardware and is badged with the university look and feel. It will almost certainly be referred to as an “enterprise” software system to give it that badge of respectability, even though the label means nothing in terms of reliability, flexibility or suitability, and probably means more in terms of cost and constraints.

The major problem with this product approach is that it ignores the observation that one student = multiple learning experiences = multiple learning “institutions”.

In the typical e-portfolio product there is an assumption that the student’s only place for learning is the host institution. It ignores the observation that students attend multiple learning institutions (including work-place training) and that most learning is informal. In other words, an institution that plays a very small part in the learning of a student expects the student to place all of their “demonstrations, resources and accomplishments” on the institution’s server.

What are we trying to solve – ignorance of purpose

The EDUCAUSE presentation gives the following primary uses of e-portfolios

  • Academic advising
  • Institutional accreditation
  • Curricular development at program level
  • Career planning and development
  • Alumni development

I have two main problems with these stated purposes

  1. One system, multiple tasks. One of my major problems with “enterprise systems” is that they try to do everything. They try to be all things to all tasks and end up being really bad at all of them. I hear the information technology folk cry, “But they are all integrated!”. Yeah, but no-one uses them because they are really horrible to use. The above list includes a number of very different tasks performed by very different people. The assumption that one system can perform all of these well is somewhat questionable. Rather than put all tasks in one system, adopt a best-of-breed approach and put your effort into making sure the pieces are integrated (see the sketch after this list). This requires that the IT folk (rather than the users) do some work. Of course, there is an alternate position based on a number of false assumptions, but that’s a story for another day.
  2. The institutional, not adopter, focus. Take a look at the diffusion curve image above. The majority of people are not the innovators and early adopters. This applies to university academics. Most university academics are keen to do a good enough job in teaching their classes so they can concentrate on other pursuits. Look at the list of primary uses of e-portfolios. How many of them are going to be of direct interest to the majority of academics, most of the time? Perhaps academic advising, maybe institutional accreditation from time to time, but typically that involves only a small number of academics. Perhaps curricular development at program level? Most of the programs in my context don’t do this. Your context might be different. (Remember, place/context is important.)

    How many of these tasks are seen as problematic by these academics? Remember the majority are very different from the innovators and early adopters. See the table above. Most people don’t want to radically change the way they are doing things. They will only consider radical change if there are huge problems with current practice.

    The same applies to students. How many of the above tasks are students directly involved with regularly? How many of these tasks do students currently have huge problems with?
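To make the best-of-breed point concrete, here is a minimal sketch of the glue-code idea. It is illustration only: the two “systems”, their data and every field name are hypothetical, and a real integration would work against actual APIs or databases.

    # Sketch of a best-of-breed approach: two specialised systems, each
    # good at one task, joined by a small piece of glue code. All names
    # and data here are hypothetical.

    def career_planning_system(student_number):
        """Pretend API of a dedicated career-planning tool."""
        plans = {"C0101010X": {"career_goal": "software developer"}}
        return plans.get(student_number, {})

    def academic_advising_system(student_number):
        """Pretend API of a dedicated academic-advising tool."""
        notes = {"C0101010X": {"adviser": "A. Smith", "last_meeting": "2008-11-03"}}
        return notes.get(student_number, {})

    def integrated_view(student_number):
        """The IT folk's work: one view for the user, many systems behind it."""
        view = {"student_number": student_number}
        view.update(career_planning_system(student_number))
        view.update(academic_advising_system(student_number))
        return view

    print(integrated_view("C0101010X"))

The point of the sketch is where the effort sits: each task lives in a system built for it, and the integration work is done once, in glue code, by the IT folk rather than by the users.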

Conclusions

There’s more that could be put in here. For those who are wondering, yes, much of this content is related to my PhD. To a large extent I believe that there is a large collection of mistaken assumptions underpinning most of the practice of e-learning within universities. Many of the above complaints about e-portfolios can easily be applied to other technologies and how they are implemented within universities. More on this as the thesis progresses.

References

Geoghegan, W. (1994). Whatever happened to instructional technology? 22nd Annual Conference of the International Business Schools Computing Association, Baltimore, MD, IBM.

Lorenzo, G. and J. Ittelson (2005). An Overview of E-Portfolios. D. Oblinger (ed.). Boulder, CO: EDUCAUSE Learning Initiative. http://www.educause.edu/ir/library/pdf/ELI3001.pdf

Truex, D., R. Baskerville, et al. (1999). “Growing systems in emergent organizations.” Communications of the ACM 42(8): 117-123.

Swanson, E. B. and N. C. Ramiller (2004). “Innovating mindfully with information technology.” MIS Quarterly 28(4): 553-583.

Is all diversity good/bad – a taxonomy of diversity in the IS discipline

In a previous post I pointed to and summarised a working paper that suggests that IS research is not all that diverse, at least at the conceptual level.

The Information Systems (IS) discipline has for a number of years been having an on-going debate about whether or not the discipline is diverse. Part of that argument has been discussion about whether diversity is good or bad for IS and for a discipline in general.

Too much diversity is seen as threatening the academic legitimacy and credibility of a discipline. Others have argued that too little diversity could also cause problems.

While reading the working paper titled “Metaphor, meaning and myth: Exploring diversity in information systems research” I began wondering about the definition of diversity. In particular, the questions I was thinking about were

  1. What are the different types of diversity in IS research?
    Based on the working paper I believe there are a number of different types of diversity. What are they?
  2. Are all types of diversity bad or good?
    Given I generally don’t believe in universal generalisations, my initial guess is that the answer will be “it depends”. In some contexts/purposes, some will be bad and some will be good.
  3. Is this topic worthy of a publication (or two) exploring these questions and the implications they have for IS, for other disciplines, and for research in general?
    Other disciplines have had these discussions.
  4. Lastly, what work have IS researchers already done in answering these questions, particularly the first two?
    There’s been a lot of work in this area, so surely someone has provided some answers to these questions.

What different types of diversity exist?

The working paper that sparked these questions talks about conceptual diversity.

It also references Benbasat and Weber (1996) – two of the titans of the IS discipline, and perhaps “the” article in this area – who propose three ways of recognising research diversity

  1. Diversity in the problems addressed.
  2. Diversity in the theoretical foundations and reference disciplines used to account for IS phenomena.
  3. Diversity of research methods used to collect, analyse and interpret data.

The working paper also suggests that Vessey et al (2002) added two further characteristics

  1. Research approach.
  2. Research method.

I haven’t read the Vessey paper but, given this summary, I’m a bit confused: these two additional characteristics seem to fit into the third “way” from Benbasat and Weber. Obviously some more reading is required.

In the work on my thesis I’m drawing on four classes of questions about a domain of knowledge from Gregor (2006). They are

  1. Domain questions. What phenomena are of interest in the discipline? What are the core problems or topics of interest? What are the boundaries of the discipline?
  2. Structural or ontological questions. What is theory? How is this term understood in the discipline? Of what is theory composed? What forms do contributions to knowledge take? How is theory expressed? What types of claims or statements can be made? What types of questions are addressed?
  3. Epistemological questions. How is theory constructed? How can scientific knowledge be acquired? How is theory tested? What research methods can be used? What criteria are applied to judge the soundness and rigour of research methods?
  4. Socio-political questions. How is the disciplinary knowledge understood by stakeholders against the backdrop of human affairs? Where and by whom has theory been developed? What are the history and sociology of theory evolution? Are scholars in the discipline in general agreement about current theories or do profound differences of opinion exist? How is knowledge applied? Is the knowledge expected to be relevant and useful in a practical sense? Are there social, ethical or political issues associated with the use of the disciplinary knowledge?

I wonder if these questions might form a useful basis for, or contribution to, a taxonomy of diversity in IS. At this stage, I think some sort of taxonomy of diversity might indeed be useful.

Using metaphor to examine diversity (or lack thereof) in research

This post contains a link to a PDF working paper that uses metaphor analysis on a collection of journal papers to examine just how diverse research within the Information Systems field actually is. It finds that the IS field is actually not very diverse at all from this perspective, which somewhat contradicts what most IS folk accept.

The paper treats research as a cognitive process, a process that is constrained/influenced by the concepts we have of the world and the objects we study in our research. Our understandings of these concepts very often go unquestioned. The paper uses metaphor analysis to uncover what understandings IS researchers (writing in one of the IS field’s premier journals) have of the concept of the “organisation”.

Work in organisational science has identified three main metaphors used for the organisation. They are:

  1. Organisation as a machine
  2. Organisation as an organism
  3. Organisation as a culture

It finds that the “organisation as a machine” metaphor is by far the most used and the most detailed, followed a long way behind by “organisation as an organism”. “Organisation as a culture” is almost non-existent.

Within learning and teaching at universities I’m increasingly seeing a lot of this same bias. Much of the management, leadership and “quality” assurance stuff I see in higher education has a very strong assumption of the machine metaphor, a metaphor of questionable use in any group of human beings, let alone a university, and especially around the act of learning and teaching.

With this fundamental assumption, I’m not surprised that much of what is proposed to improve learning and teaching fails.

Seven principles of knowledge management and applications to e-learning, curriculum design and L&T in universities

I’ve been a fan of Dave Snowden and his work for a couple of years. In this blog post from last year Dave shares 7 principles for “rendering knowledge”. For me, these 7 principles have a direct connection with the tasks I’m currently involved with: e-learning, curriculum design and helping improve the quality of learning and teaching.

If I had the time and weren’t concentrating on another task I’d take some time to expound upon the connections that I see between Snowden’s principles and the tasks I’m currently involved with. I don’t, so I will leave it as an exercise for you. Perhaps I’ll get a chance at some stage.

Your considerations would be greatly improved by taking a look at the keynote presentation on social computing that Dave gave at the Knowledge Management Asia conference, based on these 7 principles. I listened to the podcast yesterday; the slides are also available.

I strongly recommend that anyone working in fields around e-learning, curriculum design and the like listen to this podcast.

For example

Let’s take #2

  • We only know what we know when we need to know it.
    Human knowledge is deeply contextual and requires stimulus for recall. Unlike computers we do not have a list-all function. Small verbal or nonverbal clues can provide those ah-ha moments when a memory or series of memories are suddenly recalled, in context to enable us to act. When we sleep on things we are engaged in a complex organic form of knowledge recall and creation; in contrast a computer would need to be rebooted.

The design of both e-learning software and of learning and teaching currently relies a great deal on traditional design processes built around analysis, design, implementation and evaluation. For example, at the start of the process people are asked to reflect on and share insights and requirements about the software/learning design, divorced from the reality of actually using that software or learning design. Based on the knowledge generated by that reflection, decisions are made about change.

The trouble is that asking people these questions divorced from the context is never going to get at the real story.

Content, redirects and impact on Google ranking

Late last year I wrote that this blog was currently ranked #12 by Google (US-based search) for a search on “david jones”.

Over the Xmas break I spent a fair bit of time copying content from my old website to this blog and putting in place permanent redirects from the old site to the new content. I also uploaded some images to my Flickr account and included links to the associated content on the blog.
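The redirects are just HTTP 301 responses. A minimal sketch of the idea, assuming the old site can still run a CGI script, and with an entirely hypothetical path-to-URL mapping:

    #!/usr/bin/env python
    # Minimal permanent-redirect CGI sketch. The mapping below is
    # hypothetical; a real one would cover every old page.
    import os

    REDIRECTS = {
        "/papers.html": "https://newblog.example/publications/",
        "/cv.html": "https://newblog.example/about/",
    }

    path = os.environ.get("PATH_INFO", "")
    target = REDIRECTS.get(path, "https://newblog.example/")

    # 301 (not 302) tells Google the move is permanent, so the old
    # pages' ranking signals follow the content to its new home.
    print("Status: 301 Moved Permanently")
    print("Location: %s" % target)
    print()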

Today, when I do the same search the blog is now ranked #5 by Google for a US-based search.

For an Australian-based Google search the site is up from 177 to 87.

Reflections and Implications from Webfuse – Domain languages

As I am currently writing up the PhD I have banned myself from working on any new papers. However, as I work through the PhD I will get ideas for papers, so rather than waste time writing them in full, or worse, forget about them, I’m going to try to write about them on the blog and categorise them appropriately. The hope is that post-PhD I can come back to a large collection of papers to write. Alternatively, I’ll have a collection of dribble to laugh at.

First cab off the rank is the idea of a “Webfuse reflections and implications” paper. To some extent this would come from the last chapter of my thesis and capture some of the lessons, reflections etc. learned from the 12 or so years working on the thesis and Webfuse (the artifact that arose from/created the thesis).

In part the idea for this paper is to capture the messy bits that have to be solved in practice but are typically overlooked by researchers and by the enterprise folk implementing e-learning. The hope is that these reflections/implications could spark ideas for future research projects.

Some of the initial ideas which will be expanded upon include

  • Domain languages and the mismatch between enterprise software and institutional practice.
  • The need for “loose joining” between e-learning systems and other institutional databases (see the sketch after this list).
  • Limitations of traditional IT governance and other forms of hierarchical division of labour and responsibilities.
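“Loose joining” deserves a quick illustration. The sketch below assumes a hypothetical nightly CSV export (the file name and column names are invented): the e-learning system consumes the export rather than querying the student records database directly, so the two systems are joined by a file format rather than by each other’s internals.

    # Sketch of "loose joining": read a nightly export from the student
    # records system instead of coupling to its database. The file name
    # and column names are hypothetical.
    import csv

    def load_enrolments(export_path="enrolments.csv"):
        """Return {student_number: [unit, ...]} from last night's export."""
        enrolments = {}
        with open(export_path) as export:
            for row in csv.DictReader(export):
                enrolments.setdefault(row["student_number"], []).append(row["unit"])
        return enrolments

    # Because the only dependency is the export's format, the records
    # system can change internally without breaking the e-learning side.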

Domain languages and LMS mismatches

The idea for this one was sparked by this post on iPhoto and Domain Languages and some previous work.

The definition of domain language given on the 37signals blog is

A domain language is the set of words that reflect the way you cut up a domain. It consists of the pieces you sliced and the names you chose to give them. This language defines an application and makes it special.

Universities have domain languages. Actually, university sectors in different countries have different domain languages and to some extent different universities within the same country can have different domain languages.

For example, in the mid-1990s the institution I worked at then had a domain language that consisted of the following terms

  • Unit – an individual course/subject that a student enrols in (e.g. Programming I etc.)
  • Course – a collection of units that make up a student’s entire study for a degree (e.g. bachelor of information technology).
  • Student number – the unique identifying number (actually combination of letters and numbers) for a particular student (e.g. C0101010X)

As the 37signals post suggests, software applications have to encapsulate a domain language. The designers of those applications have to make choices about how they divide up the application space, what objects are sensible and what to call those objects.

Around the turn of the century the institution I worked with adopted the PeopleSoft enterprise resource planning (ERP) system for some aspects of its work, in particular its student administration. PeopleSoft came from the North American university sector and had a domain language that made sense for that sector. It also originally started in the human resource management sphere, and aspects of its domain language showed those origins.

For example, the institution, its students and staff had to change the language they used from the above to the following

  • Unit became Course.
    This re-use of a term that meant something completely different in the previous domain language was particularly confusing.
  • Course became Program.
  • Student number, while still used in normal conversation, became EMPLID (employee ID) in the database and some aspects of the interface.

The change in domain language made the adoption of the software more difficult. It required significant changes to a broad array of practices and documentation that normally would not have been required. It also resulted in my institution (and the others that adopted PeopleSoft) using a domain language different from that of many other Australian universities.
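One way of softening such a mismatch, sketched below, is a thin translation layer that lets local practice keep its own terms while the ERP keeps its. This is a hypothetical illustration (the field names are invented), not how PeopleSoft was actually integrated:

    # Sketch of a thin translation layer between a local domain language
    # and an ERP's. Field names are hypothetical, for illustration only.

    LOCAL_TO_PEOPLESOFT = {
        "unit": "course",           # an individual subject, e.g. Programming I
        "course": "program",        # a whole degree, e.g. B. Info. Tech.
        "student_number": "emplid", # PeopleSoft's HR origins showing through
    }

    def to_peoplesoft(record):
        """Rename a record's keys from local terms to ERP terms."""
        return {LOCAL_TO_PEOPLESOFT.get(key, key): value
                for key, value in record.items()}

    local_record = {"unit": "Programming I", "student_number": "C0101010X"}
    print(to_peoplesoft(local_record))
    # {'course': 'Programming I', 'emplid': 'C0101010X'}

A layer like this keeps the renaming in one place, rather than scattering it through every practice and document that mentions a unit or a course.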

The Blackboard e-learning system we’ve been using over the last few years has also shown this sort of problem. The main one has been the difference in the names given to teaching staff.

Implementing the rotating banner image

I’ve mentioned some plans to implement a rotating banner image on this blog. As you may have picked up from this post, if you’re looking at the site, such a rotating banner image has been implemented. Here’s the story.

It’s one of pragmatism. The plan of not using an external server was, after a minimum of searching, proving to be a little more difficult than I thought. So rather than waste time I’ve simply re-used the approach from my old site, i.e. this script. At the moment the script simply loops through a list of images stored in the file system of the host web server.
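The script itself isn’t reproduced here, but a minimal sketch of that sort of rotation is below, assuming a CGI-capable host and hypothetical directory and URL names. It picks an image at random rather than looping in strict order, which gives much the same visible effect:

    #!/usr/bin/env python
    # Sketch of a rotating-banner CGI: pick an image from a directory on
    # the host web server and redirect the request to it. The directory
    # and URL prefix are hypothetical.
    import os
    import random

    IMAGE_DIR = "/var/www/banners"
    IMAGE_URL = "/banners"

    images = sorted(name for name in os.listdir(IMAGE_DIR)
                    if name.lower().endswith((".jpg", ".jpeg", ".png", ".gif")))

    # 302 (temporary) rather than 301, so each request can get a
    # different image instead of a cached permanent answer.
    print("Status: 302 Found")
    print("Location: %s/%s" % (IMAGE_URL, random.choice(images)))
    print()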

After purchasing the “CSS Edit” ability from WordPress.com (about $AUD20 for a year) I’ve added a bit of CSS to call the above script and hey presto, rotating banner image.

In the spirit of release early, release often, I hope to continue modifying this to move it further away from the original design and towards some of the newer plans. In particular, the use of Flickr to host the images.

Webfuse usage statistics – Online assignment submission

Online assignment submission is one of the most used features of Webfuse and is explained in part in a few publications like this one.

Number of students and courses

Year | # of students | % of all students | # of courses | # of assignments
2000 | 1067 | 5.6% | 11 | 1519
2001 | 1646 | 7.5% | 18 | 3915
2002 | 4005 | 17.4% | 28 | 13468
2003 | 3805 | 16.5% | 48 | 14792
2004 | 5709 | 23.3% | 65 | 23273
2005 | 8134 | 29.1% | 79 | 26781
2006 | 7109 | 25.5% | 92 | 29132
2007 | 6328 | 27.3% | 97 | 25317
2008 | 6571 | 31.9% | 124 | 24081
