In a series of blog posts (starting with this one) I’ve been trying to develop a list of fundamental assumptions about learning and teaching at universities which the various concepts associated with personal learning environments (PLEs) bring into question.
This post attempts to add another.
The expert designer
Within the practice of learning and teaching at universities there are a number of levels at which an expert designer (or a small group thereof) is assumed to be necessary. These include:
- Senior management (and their consultants);
Any important decision must be made by the small group of senior managers. Typically they will draw on “experts” to provide analysis and recommendations and then the senior management (or manager) will make the decision.
Senior management is difficult and requires great skill and foresight, and consequently these decisions can’t be left to ordinary people. They don’t have the skill.
- Learning design; and
The design of a university course is performed by the academic (or small group thereof) with demonstrable discipline expertise in the form of a PhD. They might be aided by their consultants, the instructional designers and other technical staff, but in the end it is the academic staff who make the decisions.
After all, learning all about a discipline area is difficult. It requires great depth and breadth of knowledge to understand how best to do this. You couldn’t leave this sort of thing up to the learners. They don’t have the knowledge to do this.
- Provision of information technology systems.
Information technology is complex and complicated. There is a broad chasm of difference between looking after your home PC and managing large, complex and important enterprise systems. Such a task requires enterprise IT experts, and their consultants, to make these difficult decisions and ensure that the organisation isn’t losing money.
You can’t simply leave information technology decisions up to the end-user. They don’t have this breadth and depth of knowledge. They would make mistakes. It would waste resources.
And the list could go on for each of the professional groups or divisions that infest a modern university. Ordering textbooks, booking travel, looking after gardens and buildings, all of these activities, as implemented in a modern organisation, assume that there is a need for the experts to take control.
Problems with this approach
The main problem with this tendency is “one size fits all”. The central, small group of designers can never fully understand the diversity of all of their clients and in many cases could never efficiently provide a customised service to each of them. For example, a university course is never customised for an individual student’s pre-existing knowledge – even though this is one of the things we know is important for learning.
Web 2.0, social media and other advances in technology are bringing this practice into question. Increasingly there are abundant, inexpensive and simple-to-use tools which users can adopt and, more importantly, adapt to their own preferences. These tools, through the use of standards, can be used to access organisational services (if those services are configured appropriately).
It’s becoming possible for the end-user to use the tools they already know, rather than being forced to use the tools selected by the central IT folk of the organisation they work for or study at.
Related to this, the approach assumes that the “non-experts” actually need the input of the experts. Increasingly with IT, you don’t. Similarly, many people learn quite effectively through informal learning without the need for a discipline expert. The need for, and supposed rise of, lifelong learning means this trend should only increase.
Following on from this is the assumption that the experts really are experts. I’m sure anyone who has worked within an organisation can point to organisational decisions which demonstrably suggest that the experts weren’t so expert.