Assembling the heterogeneous elements for (digital) learning


Situated shared practice, curriculum design and academic development

I'm currently reading Faegri et al (2010) as part of developing the justificatory knowledge for the final ISDT for e-learning that is meant to be the contribution of the thesis. The principle from the ISDT that this paper connects with is the idea of a “Multi-skilled, integrated development and support team” (the name is a work in progress). The following is simply a placeholder for a quote from the paper and a brief connection with the ISDT, and what I think it means for curriculum design and academic development.

The quote

The paper itself reports an action research project in which job rotation was introduced into a software development firm with the aim of increasing the quality of the knowledge held by software developers. The basic finding was that, in this case, there were some benefits; however, the problems outweighed them. I haven't read all the way through; I'm currently working through the literature review. The following quote is from that review.

Key enabling factors for knowledge creation is knowledge sharing and integration [36,54]. Research in organizational learning has emphasized the value of practice; people acquire and share knowledge in socially situated work. Learning in the organization occurs in the interplay between tacit and explicit knowledge while it crosses boundaries of groups, departments, and organizations as people participate in work [17,54]. The process should be situated in shared practice with a joint, collective purpose [12,14,15].

Another related quote

The following is from a bit more related reading, in particular Seely Brown & Duguid (1991) – emphasis added

The source of the oppositions perceived between working, learning, and innovating lies primarily in the gulf between precepts and practice. Formal descriptions of work (e.g., “office procedures”) and of learning (e.g., “subject matter”) are abstracted from actual practice. They inevitably and intentionally omit the details. In a society that attaches particular value to “abstract knowledge,” the details of practice have come to be seen as nonessential, unimportant, and easily developed once the relevant abstractions have been grasped. Thus education, training, and technology design generally focus on abstract representations to the detriment, if not exclusion of actual practice. We, by contrast, suggest that practice is central to understanding work. Abstractions detached from practice distort or obscure intricacies of that practice. Without a clear understanding of those intricacies and the role they play, the practice itself cannot be well understood, engendered (through training), or enhanced (through innovation).

Relevance?

I see this as highly relevant to the question of how to improve learning and teaching in universities, especially in terms of the practice of e-learning, curriculum design and academic development. It’s my suggestion that the common approaches to these tasks in most universities ignore the key enabling factors mentioned in the above quote.

For example, the e-learning designers/developers, curriculum designers and academic developers are generally not directly involved with the everyday practice of learning and teaching within the institution. As a result, the teaching academics and these other support staff don't get the benefit of shared practice.

A further impediment to shared practice is the divisions between e-learning support staff, curriculum designers and academic developers that are introduced by organisational hierarchies. At one stage, I worked at a university where the e-learning support people reported to the IT division, the academic staff developers reported to the HR division, the curriculum designers reported to the library, and teaching academics were organised into faculties. There wasn’t a common shared practice amongst these folk.

Instead, any sharing that did occur was either at high-level project or management boards and committees, or in design projects prior to implementation. The separation reduced the ability to combine, share and create new knowledge about what was possible.

The resulting problem

The following quote is from Seely Brown and Duguid (1991)

Because this corporation’s training programs follow a similar downskilling approach, the reps regard them as generally unhelpful. As a result, a wedge is driven between the corporation and its reps: the corporation assumes the reps are untrainable, uncooperative, and unskilled; whereas the reps view the overly simplistic training programs as a reflection of the corporation’s low estimation of their worth and skills. In fact, their valuation is a testament to the depth of the rep’s insight. They recognize the superficiality of the training because they are conscious of the full complexity of the technology and what it takes to keep it running. The corporation, on the other hand, blinkered by its implicit faith in formal training and canonical practice and its misinterpretation of the rep’s behavior, is unable to appreciate either aspect of their insight.

It resonates strongly with some recent experience of mine at an institution rolling out a new LMS. The training programs around the new LMS, the view of management, and the subsequent response from the academics showed some very strong resemblances to the situation described above.

An alternative

One alternative is what I'm proposing in the ISDT for e-learning. The following is an initial description of the roles/purpose of the “Multi-skilled, integrated development and support team”. Without too much effort you could probably translate this into broader learning and teaching, not just e-learning. Heaven forbid, you could even use it for “blended learning”.

An emergent university e-learning information system should have a team of people that:

  • is responsible for performing the necessary training, development, helpdesk, and other support tasks required by system use within the institution;
  • contains an appropriate combination of technical, training, media design and production, institutional, and learning and teaching skills and knowledge;
  • is integrated, through the performance of its allocated tasks, into the everyday practice of learning and teaching within the institution, and cultivates relationships with system users, especially teaching staff;
  • is integrated into the one organisational unit, and as much as possible, co-located;
  • can rapidly perform small scale changes to the system in response to problems, observations, and lessons learned during system support and training tasks, without needing formal governance approval;
  • actively examines and reflects on system use and non-use – with a particular emphasis on identifying and examining what early innovators do – to identify areas for system improvement and extension;
  • is able to identify and to raise the need for large scale changes to the system with an appropriate governance process; and
  • is trusted by organisational leadership to translate organisational goals into changes within the system, its support and use.

References

Faegri, T. E., Dyba, T., & Dingsoyr, T. (2010). Introducing knowledge redundancy practice in software development: Experiences with job rotation in support work. Information and Software Technology, 52(10), 1118-1132.

Seely Brown, J., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2(1), 40-57.

The rider, elephant, and shaping the path

Listened to this interview with Chip Heath, a Stanford professor of organizational behaviour, about his co-authored book Switch: How to change things when change is hard. My particular interest in this arises from figuring out how to improve learning and teaching in universities. From the interview, this seems to be another in a line of “popular science” books aimed at making clear what science/research knows about the topic.

The basic summary of the findings seems to be: if you wish to make change more likely, then your approach has to (metaphorically):

  • direct the rider;
    The rider represents the rational/analytical decision making capability of an individual. This capability needs to be appropriately directed.
  • engage the elephant; and
    The elephant represents the individual’s emotional/instinctive decision making approach. From the interview, the elephant/rider metaphor has the express purpose of showing that the elephant is far stronger than the rider. In typical situations, the elephant is going to win, unless there’s some engagement.
  • shape the path.
    This represents the physical and related environment in which the change is going to take place. My recollection is that the shaping has to support the first two components, but also be designed to make it easier to traverse the path and get to the goal.

There are two parts of the discussion that stuck with me as I think they connect with the task of improving learning and teaching within universities.

  1. The over-rationalisation of experts.
  2. Small scale wins.

Over-rationalisation of experts

The connection between organisational change and losing weight seems increasingly common; it's one I used and it's mentioned in the interview. One example from the interview shows how a major problem with change is that it is driven by experts, experts who have significantly larger “riders” (i.e. rational/analytical knowledge of the problem area/target of change) than the people they are trying to change. This overly large rider leads to change mechanisms that overcomplicate things.

The example they use is the recently modified food pyramid from the United States, which makes suggestions something like, “For a balanced diet you should consume X tablespoons of Y a day”. While this makes sense to the experts, a normal person has no idea how many tablespoons of Y are in their daily diet. In order to achieve the desired change, the individual needs to develop all sorts of additional knowledge and expertise, which is just not likely.

They compare this with some US-based populariser of weight loss who proposes much simpler suggestions e.g. “Don’t eat anything that comes through your car window”. It’s a simpler, more evocative suggestion that appears to be easier for the rider to understand and helps engage the elephant somewhat.

I can see the equivalent of this within learning and teaching in higher education. Change processes are typically conceived and managed by experts, experts who over-rationalise.

Small scale wins

Related to the above is the idea that change always involves barriers or steps that have to be overcome. Change is difficult. The suggestion is that when shaping the path you want to design it in such a way that the elephant can almost just walk over the barrier. The interviewer gives the example of never being able to get her teenage sons to stop taking towels out of the bathroom and into their bedroom. Eventually what worked was “shaping the path” by storing the sons' underwear in the bathroom, not their bedroom.

When it comes to improving learning and teaching in universities, I don’t think enough attention is paid to “shaping the path” like this. I think this is in part due to the process being driven by the experts, so they simply don’t see the need. But it is also, increasingly, due to the fact that the people involved can’t shape the path. Some of the reasons the path can’t be shaped include:

  • Changing the “research is what gets me promoted” culture in higher education is very, very difficult and not likely to happen effectively if just one institution does it.
  • The L&T path (e.g. the LMS product model or the physical infrastructure of a campus) is not exactly set up to enable “shaping”.
  • The people involved at a university, especially in e-learning, don’t have the skills or the organisational structure to enable “shaping”.

30% of information about task performance

Over on the Remote Learner blog, Jason Cole has posted some information about a keynote by Dr Richard Clark at one of the US MoodleMoots. I want to focus on one key quote from that talk and its implications for Australian higher education and current trends to “improve” learning and teaching and adopt open source LMS (like Moodle).

It’s my argument that this quote, and the research behind it, has implications for the way these projects are conceptualised and run. i.e. they are missing out on a huge amount of potential.

Task analysis and the 30%

The quote from the presentation is

In task analysis, top experts only provide 30% of information about how they perform tasks.

It’s claimed that all the points made by Clark in his presentation are supported by research. It appears likely that the support for this claim comes from Sullivan et al (2008). This paper addresses the problem of trying to develop the procedural skills necessary for professions such as surgery.

The above quote arises due to the problems experts have in describing what they do. Sullivan et al (2008) offer various descriptions and references of this problem in the introduction

This is often difficult because as physicians gain expertise their skills become automated and the steps of the skill blend together [2]. Automated knowledge is achieved by years of practice and experience, wherein the basic elements of the task are performed largely without conscious awareness [3]. This causes experts to omit specific steps when trying to describe a procedure because this information is no longer accessible to conscious processes [2]

Then later, when describing the findings of their research they write

The fact that the experts were not able to articulate all of the steps and decisions of the task is consistent with the expertise literature that shows that expertise is highly automated [2,3,5] and that experts make errors when trying to describe how they complete a task [3,6,7]. In essence, as the experts developed expertise, their knowledge of the task changed from declarative to procedural knowledge. Declarative knowledge is knowing facts, events, and objects and is found in our conscious working memory [2]. Procedural knowledge is knowing how to perform a task and includes both motor and cognitive skills [2]. Procedural knowledge is automated and operates outside of conscious awareness [2,3]. Once a skill becomes automated, it is fine-tuned to run on autopilot and executes much faster than conscious processes [2,8]. This causes the expert to omit steps and decision points while teaching a procedure because they have literally lost access to the behaviors and cognitive decisions that are made during skill execution [2,5].

The link to analysis and design

A large number of universities within Australia are either:

  1. Changing their LMS to an open source LMS (e.g. Moodle or Sakai), and using this as an opportunity to “renew” their online learning; and/or
  2. Busy on broader interventions to “renew” their online learning due to changes in government policies such as quality assurance, graduate attributes and a move to demand-driven funding of university places.

The common process being adopted by most of these projects is from the planning school, i.e. you undertake analysis to identify all relevant, objective information and then design the solution on that basis. You then employ a project team to ensure that the design gets implemented, and finally you put in a skeleton team that maintains the design. This applies both to information systems (e.g. the selection, implementation and support of an LMS) and to broader organisational change (e.g. strategic plans).

The problem is that the “expert problem” Clark refers to above means that it is difficult to gather all the necessary information. It’s difficult to get the people with the knowledge to tell all that they know.

A related example follows.

The Staff MyCQU example

Some colleagues and I – over a period of almost 10 years – designed, supported, and evolved an information system called Staff MyCQU. An early part of its evolution is described in the “Student Records” section of this paper. It was a fairly simple web application that provided university staff with access to student records and a range of related services. Over its life cycle, a range of new and different features were added and existing features tweaked, all in response to interactions with the system's users.

Importantly, the system's developers were also generally the people handling user queries and problems on the “helpdesk”. Quite often, those queries would result in tweaks and changes to the system. Rather than being designed up front, the system grew and changed with people using it.

The technology used to implement Staff MyCQU is now deemed ancient and, even more importantly, the system and what it represents is now politically tainted within the organisation. Hence, for the last year or so, the information technology folk at the institution have been working on replacement systems. Just recently, there have been some concrete outcomes of that work, with systems being shown to folk, including some of the folk who had used Staff MyCQU. On being shown a particular feature of the new system, it soon became obvious that the system didn't include a fairly common extension of the feature, an extension that had actually been within Staff MyCQU from the start.

The designers of the new system, with little or no direct connection with actual users doing actual work, don’t have the knowledge about user needs to design a system that is equivalent to what already exists. A perfect example of why the strict separation of analysis, design, implementation and use/maintenance that is explicit in most IT projects and divisions is a significant problem.

The need for growing knowledge

Sullivan et al (2008) suggest cognitive task analysis as a way of better “getting at” the knowledge held by the expert, and there's a place for that. However, I also think there needs to be recognition that the engineering/planning method is simply not appropriate for some contexts. In those contexts you need more of a growing/gardening approach, or at least to include more of the growing/gardening approach in your engineering method.

Rather than seeking to gather and analyse all knowledge separately from practice and prior to implementation, the implementation needs to be designed to pay close attention to the knowledge that is generated during implementation, and to enable acting upon that knowledge.

Especially for wicked problems and complex systems

Trying to improve learning and teaching within a university is a wicked problem. There are many different stakeholders, or groups of stakeholders, each with a different frame of reference, which leads to different understandings of how to solve the problem. Simple techno-rational solutions to wicked problems rely on the adoption of one of those frames of reference and ignorance of the remainder.

For example, implementation of a new LMS is seen as an information technology problem and treated as such. Consequently, success is measured by uptime and successful project implementation, not by the quality of learning and teaching that results.

In addition, as you solve wicked problems, you and all of the stakeholders learn more about the problem. The multiple frames of reference change and consequently the appropriate solutions change. This is getting into the area of complex adaptive systems. Dave Snowden has a recent post about why human complex adaptive systems are different.

Prediction

Universities that lean too heavily on engineering/planning approaches to improving learning and teaching will fail. However, they are likely to appear to succeed due to the types of indicators they choose to adopt as measurements of success, and the capability of actors to game those indicators.

Universities that adopt more of a gardening approach will have greater levels of success, but will have a messier time of it during their projects. These universities will be where the really innovative stuff comes from.

References

Sullivan, M., Ortega, A., et al. (2008). Assessing the teaching of procedural skills: Can cognitive task analysis add to our traditional teaching methods? The American Journal of Surgery, 195(1), 20-23.

Implications of cognitive theory for instructional design

The following is a summary/reflection of Winn (1990); the abstract follows

This article examines some of the implications of recent developments in cognitive theory for instructional design. It is argued that behavioral theory is inadequate to prescribe instructional strategies that teach for understanding. Examples of how instructional designers have adopted relevant aspects of cognitive theory are described. However, it is argued that such adoption is only a first step. The growing body of evidence for the indeterminism of human cognition requires even further changes in how instructional designers think and act. A number of bodies of scholarly research and opinion are cited in support of this claim. Three implications of cognitive theory for design are offered: instructional strategies need to be developed to counter the reductionism implicit in task analysis; design needs to be integrated into the implementation of instruction; designers should work from a thorough knowledge of theory not just from design procedures.

Summary

Suggests problems arise when decisions within instructional design are driven by cognitive theory rather than behavioural theory, mostly around the assumptions of rationality and predictability and, subsequently, the appropriateness of the traditional teleological design process used by instructional design. Suggests some approaches/implications that might help address these somewhat.

Reflection

The ideas expressed here offer support for the ideas I’ve been formulating about how to improve learning and teaching at Universities. Which obviously means I think it is an important bit of work by an intelligent person. It probably does have flaws. Will need to read and reflect more.

Still not sure that these principles have been applied broadly enough (though the conclusion seems to indicate yes). Winn has focused on changes to the practice of instructional designers in how they approach design, without talking about how they may have to change the way they work with academics. Instructional design, for me, is as much about staff development as it is about design, at least within the current university context. Instructional design within universities can't scale unless it builds capacity amongst the academic staff and the system to help in design.

Many of these limitations of instructional design are similar to those I’ve been trying to push around the institutional implementation of e-learning and more generally about approaches to improve learning and teaching e.g. graduate attributes.

Introduction

Starts with a definition of instructional design from Reigeluth (1983) – essentially, a set of decision-making procedures which, given the outcomes to be achieved and the conditions under which they are to be achieved, develops the most effective instructional strategies.

Generally done with analysis of outcomes/conditions, selection of strategies, and iterative testing until some level of success is achieved. The decisions are guided by instructional theory.

Gives examples of instructional design processes informed by cognitive theory.

Suggests there is evidence that cognitive theory is impacting the thinking/actions of instructional designers; however, cognitive theory requires further changes in the way they think/act. Has problems with the analysis and selection/testing stages. Current approaches are not sufficient.

Suggests that instructional design should be driven by an understanding of theories of learning and instruction, rather than mastery of design techniques.

I’m assuming here he means that the type and nature of the steps within the design process itself should be informed by these theories, not just, as he also recognises, the decisions made within those steps, which are already driven by them. I’m a bit slow this morning.

Instructional design and behavioural theory

Supports/explains the notion that instructional design originated in behavioural theory, the dominant learning theory of the time when ID originated. Shows how instructional design processes evolved to fit the needs of behavioural theory. Examples include the reductionist nature of task analysis, and pilot testing being sufficient to debug instruction that consisted of stimulus-response prescriptions, i.e. behaviourists did not consider that there were “mental operations” within the learner that might mediate between stimulus and response. This resulted in design being separated from implementation.

If instruction can be developed to the point where acceptable student performance is likely to occur, then it matters little whether instruction is implemented immediately after the designer has attained this standard, or at some later time and in some other place.

Connects with literature that acknowledges the separation (Richey, 1986), thinks it creates problems (Nunan, 1983; Streibel, 1989; Winn, 1989) and others which think it desirable (Heinich, 1970, 1984). Desirable because “it allows instruction to be brought up to a high standard, and then to be distributed far and wide so that all students can enjoy the advantages of top-rate teaching”.

Lastly, suggests the idea that instructional design can be “done by the numbers” also arises from the behavioural tradition. The idea is that any novice designer can be successful if they just follow the process – do it by the numbers.

In summary, 3 important areas where behaviourism still exerts power over instructional design:

  1. The reductionist premise that if you can identify the parts, then you can teach the whole.
  2. The separation of design from implementation.
  3. The assumption that good procedures, applied correctly, result in good instruction.

Staying within the behavioural tradition, Winn suggests that these 3 are not a problem if you're limiting yourself to low-level skills. The problems arise when you move to higher levels of cognitive processing.

Cognitive theory

The aim here is to explain why the three assumptions are problematic as informed by cognitive theory – the obvious thought here is: what would constructivism or connectivism suggest?

The description of cognitive theory is

Changes in behavior are seen as indirect rather than direct outcomes of learning. Observable behavior is mediated and controlled by such mental activities as the acquisition, organization and application of knowledge about the world (Neisser, 1976; Rumelhart and Norman, 1981); by the development of skills that allow the encoding, storing and retrieval of information (E. Gagne, 1985; Shuell, 1986); by people’s motivation (Keller, 1983); their perception of what a task requires of them (Salomon, 1983a); and their perception of their likelihood of success (Salomon, 1983b; Schunk, 1984). Consequently, students are seen as active in the construction of knowledge and the development of skills, leading to the conclusion that learning is a generative process under the control of the learner (Wittrock, 1979, 1982).

To my somewhat untrained ear, this sounds like it has aspects of constructivism.

References Bonner (1988) as identifying a number of the differences between traditional designers and those informed by cognitive theory including:

  • task analysis;
    Traditionally aims to identify directly observed behaviours. Cognitive theory requires that “unobservable” tasks be analysed, i.e. the mental tasks to be mastered before observable performance is possible. Examples include identifying the declarative and procedural knowledge or schemata required to perform. Also recognition that moving from novice to expert involves many steps that need to be mastered.
  • objectives;
    Statements of what the student is to accomplish under what conditions and to what criterion is a behaviourist approach. Cognitive objectives are schematic representations of the knowledge to be acquired and procedures to apply.
  • learner characteristics;
    Focus on the schemata/mental models students bring to instruction, not their behaviours. May not be a clear line between what they need to know and what they know – learner as dirty slate.

    This acknowledges the importance of current knowledge of the world, represented in mental models, for the acquisition of new knowledge and skills. Research (De Kleer and Brown, 1981; Larkin, 1985; and authors contributing to Gentner and Stevens, 1983) has shown that learning occurs as students’ mental models acquire refinement and accuracy.

  • instructional strategies;
    Behaviourism selected instructional strategies based on the type of learning to take place, the type of learning outcome.

    But because the cognitive conception of learning places so much importance on the student’s development of adequate knowledge structures, cognitive procedures and mental models, the designer should create opportunities for what Bonner calls a “cognitive apprenticeship” centered around problem-solving rather than prescribe strategies a priori.

    Some general principles may determine aspects of the strategy; however, it evolves like a conversation.

As Rieber (1987) points out, instruction that is designed from cognitive principles can lead to understanding rather than just to memorization and skill performance.

This speaks to me because too much of what passes for improving learning and teaching strikes me as most likely to create memorisation and skill performance, not long term change.

The need for further change

While instructional designers are adopting principles from cognitive theory, the idea is that recent thinking in cognitive psychology and related fields brings into question some of the assumptions of cognitive theory as currently accepted. Moving on to the reasons:

  • metacognition;
    Metacognition research shows that students have, or can be trained to acquire, the ability to reflect on their performance and adopt/adapt different learning strategies. This means that the intent of an instructional design can be circumvented if the student finds the chosen strategy problematic. If the instruction is not adaptable, or the student doesn't choose a good strategy, then the instructional design is compromised.

    I wonder what implications this has for constructive alignment and its idea of forcing the student to do the right thing?

  • dynamic nature of learning;
    Very interesting. As the learner learns, they develop knowledge and skill that is different from the start. The analysis performed at the start to select the instructional strategy no longer holds. If the analysis was done now, a different strategy would be required.

    Nunan (1983) develops this line of reasoning in his argument against the value of instructional design, drawing on arguments against the separation of thought from action put forward by Oakeshott (1962) and Polanyi (1958).

  • emergent properties of cognition;
    This is the argument against reductionism. Emergence is the idea that the properties of the whole cannot be explained solely by examining the individual parts; the nature of the whole affects the way elements within it behave. A number of people have claimed that the actions of the human mind exhibit emergent properties (Churchland, 1988; Gardner, 1985).

    The reductionism that underpins task and learner analysis “acts counter to, or at best ignores, a significant aspect of human cognition, which is the creation of something entirely new and unexpected from the ‘raw material’ that has to be learned”.

  • plausible reasoning;
    A designer informed by cognitive theory assumes that the thought processes of a student will be as logical as the instruction itself. In order to learn from a machine, the student has to think like a machine (Streibel, 1986). There is lots of evidence to suggest people are not logical. “Plausible reasoning” is Collins' (1978) idea that people proceed on hunches and incomplete information. Hunt (1982) suggests plausible reasoning has allowed the human species to survive.

    If we waited for complete sets of data before making decisions, we would never make any and would not have evolved into intelligent beings.

  • situated cognition; and
    Somewhat related to previous. Streibel (1989) argues that “cognitive science can never form the basis for instructional design because it proceeds from the assumption that human reasoning is planful and logical when in fact it is not”. References Brown, Collins and Duguid (1989); Lave (1988) and Suchman (1987) – i.e. situated cognition folk – to argue that the way we solve problems is dependent on the situation in which the problem occurs. We do not use formal or mathematical reasoning.
  • unpredictability of human behaviour.
    The 5 previous points suggest that human behaviour is indeterminate. Cziko (1989) gives 5 types of evidence to argue that the unpredictability and indeterminism of human behaviour is central to the debate concerning the epistemology of educational research. Winn (1990) suggests it applies equally well to instructional design:
    1. Individual learner differences interact in complex ways with treatments which make prediction of performance difficult.
    2. Chaos theory suggests the smallest changes in initial states lead to wild and totally unpredictable fluctuations in a system's behaviour, something that is more pronounced in complex cognitive systems.
    3. Much learning is “evolutionary” in that it arises from chance responses to novel stimuli.
    4. Humans have free will which can be exercised and subsequently invalidate any predictions about behaviours made deterministically from data.
    5. Quantum mechanics shows that observing a phenomenon, changes that phenomenon so that the results of observations are probabilities, not certainties.

Though eclectic, this body of argument leads one to question seriously both the assumption of the validity of instructional prescriptions and the assumption that what works for some students will work for others.

While prediction may not be part of instructional design itself, it is part of the theories design depends upon, and Reigeluth (1983) points out that any theory of instruction, while not deterministic, relies for its validity on the probability of the prescriptions made from it. Without such validity, you may as well rely on trial and error.

Conclusions

Cognitive theory has been incorporated into instructional design, but behaviourism's influence remains, and that causes problems.

Cognitive task analysis to develop objectives is just as reductionist as the behaviourist approaches. The whole approach designers take needs to be re-examined. Three directions might include:

  1. Analysis and synthesis;
    Addressing reductionist analysis – instructional strategies need to ensure knowledge/skill components are put back together in meaningful ways, e.g. Reigeluth and Stein's (1983) use of “summarizers” and “synthesizers” in elaboration theory.

    Balance analysis as a design procedure with synthesis as an instructional strategy. Such prescriptions should exist in instructional theories.

  2. Design and implementation;

    For instruction to be successful, it must therefore constantly monitor and adapt to unpredicted changes in student behavior and thinking as instruction proceeds… To succeed, then, instructional decisions need to be made while instruction is under way and need to be based on complete theories that allow the generation of prescriptions rather than on predetermined sets of prescriptions chosen ahead of time by a designer. (p. 64)

    Requires the teacher to monitor and modify the prescribed strategies as instruction proceeds. Requires teachers to be well schooled in instructional design and to have a solid knowledge of theories of learning and instruction, so that they can respond in an informed way, e.g. with methods that allow them to invent prescriptive principles when the need arises.

    The second recommendation is that the designer needs to monitor the actual use of the instructional system during implementation, or make provision for the user to change instructional strategies.

  3. Theory and procedure.
    Decisions about instructional strategies need to be based on more than just the application of design procedures. Rather than techniques being taught, the principles should be.

    This problem is made worse by researchers who are content to identify strategies that work on single occasions rather than determine the necessary conditions for their success (Clark, 1983).

Reservations about instructional design

The following is at first a rambling diatribe outlining some of my reservations with instructional design as it is practised. Then it is a summary/reflection on Winn (1990) – “Some implications of cognitive theory for instructional design”. The abstract for Winn (1990)

This article examines some of the implications of recent developments in cognitive theory for instructional design. It is argued that behavioral theory is inadequate to prescribe instructional strategies that teach for understanding. Examples of how instructional designers have adopted relevant aspects of cognitive theory are described. However, it is argued that such adoption is only a first step. The growing body of evidence for the indeterminism of human cognition requires even further changes in how instructional designers think and act. A number of bodies of scholarly research and opinion are cited in support of this claim. Three implications of cognitive theory for design are offered: instructional strategies need to be developed to counter the reductionism implicit in task analysis; design needs to be integrated into the implementation of instruction; designers should work from a thorough knowledge of theory not just from design procedures.

Actually, I’m running out of time, this post will be just the diatribe. The summary/reflection on Winn (1990) will have to wait till later.

Some context

The following line of thought is part of an on-going attempt to identify potential problems in the practice of instructional design because I work within a Curriculum Design & Development Unit at a University. I am trying to identify and understand these problems as an attempt to move toward something that might be more effective (but would likely have its own problems). The current attempt at moving toward a solution will hopefully arise out of some ideas around curriculum mapping.

The diatribe

Back in the mid-1990s I was put in charge of my first courses. The institution I worked at was, at that stage, a true 2nd generation distance education provider bolted onto an on-campus university (the university was a few years old, having evolved from an institute of advanced education). Second generation distance education was “enterprise” print distance education. There was a whole infrastructure, set of processes and resources targeted at the production of print-based study guides and resource materials that were sent to students as their prime means of education. Part of those resources were the instructional designers.

From the start, my experiences with the instructional designers and the system they existed within were not good. The system couldn't see that it was becoming increasingly less relevant with the rise of information technology, and the instructional designers seemed more interested in their knowledge about what was the right thing to do than in recognising the realities of my context and abilities. Rather than engaging with me and my context and applying their knowledge to show how I could solve my problems, they kept pushing their own ideal situations.

Over 15 years on, and not a lot has changed. I still see the same problem in folk trying to improve learning and teaching at that institution. Rather than engaging in an on-going process of improvement and reflection, it's all about big bang changes and their problems. Worse, then as now, only the smallest proportion of academics is being effectively engaged by the instructional designers, i.e. the academics that are keen, the ones willing to engage with the ideas of the designers (and others). This is perhaps my biggest concern/proposition: that the majority of academics are not engaging with this work, and that a significant proportion of them (but not all) are not improving their teaching. But there are others:

  • Instructional designers are increasingly the tools of management, not folk helping academics.
    In an increasingly managerialist sector, the “correct” directions/methods for learning and teaching are increasingly being set by government, government funded bodies (e.g. ALTC and AUQA) and subsequently the management and professionals (e.g. instructional designers, staff developers, quality assurance etc.) that are institutionally responsible for being seen to respond effectively to the outside demands.

    There are two problems with this:

    1. the technologists' alliance; and
      The professionals within universities, because of their interactions with the external bodies and because their success depends on engaging with and responding to the demands of those bodies, start to think more like the external body. For example, many of the folk on the ALTC boards etc. are from university L&T centres. Their agenda internally becomes more about achieving ALTC outcomes than outcomes for the academics. Geoghegan (1994) identified the technologists' alliance around technology; it is increasingly in existence for L&T.
    2. do what management says.
      Similarly, because senior management within universities are being measured on how well they respond to the external demands, they too suffer the same problem. In addition, because they are generally on short-term contracts, there is increased pressure to respond via short-term approaches that show short-term gain but are questionable in the long term. Instructional designers etc. are then directed to carry out these short-term approaches, even if they will hurt in the long term or are seen as nonsensical by academics.

    The end result is that academics perceive instructional designers as people doing change to them, not doing change with them or for them. Not a good foundation on which to encourage change and improvement in something as personal as teaching.

  • Traditional instructional design is not scalable.
    My current institution has about 4 instructional designers. The first term of this year sees the institution offering 400+ courses; that means somewhere around 800 courses a year, or 200 courses a year per instructional designer. If you're looking at each course being “helped” once every two years, that means each course gets the instructional designer for about 2 days every 2 years, at best (a back-of-envelope sketch of this arithmetic follows the list).

    In this environment, traditional ADDIE-type big-bang approaches can't scale.

  • Instructional design seems informed by a great knowledge of ideal learning and teaching, but none of how to effectively bridge the gap between academics and that ideal.
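
To make the scalability point concrete, here is a minimal back-of-envelope sketch of the arithmetic from the second bullet above. The designer and course counts come from the post; the terms-per-year and working-days figures are my assumptions.

```python
# Back-of-envelope: how much instructional designer time does each
# course actually get? Designer and course counts are from the post;
# terms per year and working days per year are assumptions.

designers = 4            # instructional designers at the institution
courses_per_term = 400   # courses offered in the first term
terms_per_year = 2       # assumption: two comparable terms a year
working_days = 220       # assumption: working days per designer per year

courses_per_year = courses_per_term * terms_per_year   # ~800
per_designer = courses_per_year / designers            # ~200 courses a year

# If each course is "helped" only once every two years, a designer
# actively helps about half their allocation in any given year.
helped_per_year = per_designer / 2                     # ~100 courses
days_per_course = working_days / helped_per_year       # ~2.2 days

print(f"Courses per designer per year: {per_designer:.0f}")
print(f"Designer days per course, once every two years: {days_per_course:.1f}")
```

Even with generous assumptions, that's roughly two days of design attention per course every two years, which is the “at best” figure above.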

References

Geoghegan, W. (1994). Whatever happened to instructional technology? 22nd Annual Conferences of the International Business Schools Computing Association, Baltimore, MD, IBM.

Winn, W. (1990). “Some implications of cognitive theory for instructional design.” Instructional Science 19(1): 53-69.

Improving university teaching – learning from constructive alignment by *NOT* mandating it

The problem

Most university teaching is crap. Too general, too sweeping? Maybe, but based on my experience I'm fairly comfortable with that statement. The vast majority of what passes for teaching at universities has a number of really significant flaws. It's based more on what the teaching academic is familiar with (generally their discipline experience) than on any idea of what might be effective.

So, how do you improve it? This is not a simple question to answer. However, I also believe that most of the current and proposed answers being used by universities are destined to fail. That is, they will be able to show some good practice amongst a small percentage of academic staff, while the vast majority of learning and teaching remains less than good.

I should point out that almost all of my attempts to describe why I think this is the case and to outline a more appropriate solution have been, essentially, failures.

The following is an attempt to draw on Biggs' (2001) three levels of teaching to formulate three levels of improving teaching that can be used to understand approaches to improving learning and teaching. I'll briefly outline an important part of what I think is a better solution. I'll also reject the approach Biggs (2001) outlines as being too teleological, too complex, and simply not likely to be effectively implemented, and consequently destined to fail.

By the end of writing this post, I’ve come up with a name “reflective alignment” for my suggested solution.

Biggs’ three levels of teaching

Levels of thinking about learning and teaching

The image to the right is taken from a short film that explains constructive alignment, an approach developed by John Biggs. (I recommend the film if you want another perspective on this.)

These levels of knowledge about teaching lay the blame for poor student outcomes in the hands of the teachers and what they perceive teaching to be about. The three levels are a focus on:

  1. What the student is.
    This is the horrible “blame the student” approach to teaching. I’ll keep doing what I do. If the students can’t learn then it is because they are bad students. It’s not my fault. Nothing I can do.
  2. What the teacher does.
    This is the horrible “look at me and all the neato, innovative teaching that I'm doing” level. I'm doing lots of good and difficult things in my teaching. But are the students learning?
  3. What the student does.
    Obviously this is the good level. The focus is on teaching that leads to learning. Biggs (2001) uses a quote from Tyler (1949) to illustrate that this is not a new idea

    [learning] takes place through the active behavior of the student: it is what he does that he learns, not what the teacher does

Flowing from these levels is the idea of constructive alignment, which encompasses the type of teaching characteristic of a level 3 teacher. Constructive alignment is based on three simple steps:

  • Clearly specify detailed learning objectives for students.
  • Arrange teaching and learning activities that encourage/require students to carry out tasks that provide them with exposure, practice and feedback on the learning objectives.
  • Design a grading/marking system that requires students to demonstrate how well they have achieved the stated learning objectives.

Performing these 3 simple steps well results in the situation that Biggs (2001) describes

In aligned teaching, where all components support each other, students are “trapped” into engaging in the appropriate learning activities, or as Cowan (1998) puts it, teaching is “the purposeful creation of situations from which motivated learners should not be able to escape without learning or developing” (p. 112). A lack of alignment somewhere in the system allows students to escape with inadequate learning.
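
One way to make “alignment” concrete is to treat the course design as data and check that every objective has both a matching activity and a matching assessment. The following is a purely illustrative sketch; the course content and structure are invented, not from Biggs.

```python
# Illustrative only: constructive alignment as a checkable property.
# Each learning objective should have activities (practice and
# feedback) and an assessment that demonstrates achievement.

course = {
    "critique a research design": {
        "activities": ["peer review workshop", "journal club"],
        "assessment": "written critique marked against criteria",
    },
    "design a small study": {
        "activities": [],  # no practice: students can "escape" here
        "assessment": "research proposal",
    },
}

for objective, parts in course.items():
    aligned = bool(parts["activities"]) and bool(parts["assessment"])
    print(f"{objective}: {'aligned' if aligned else 'NOT aligned'}")
```

The point of the sketch is simply that a lack of alignment anywhere (here, an objective with no supporting activity) is exactly the gap through which, in Biggs' terms, students escape with inadequate learning.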

Sounds simple, doesn’t it? So why don’t more people use it?

“Staff development” is crap!

That’s my characterisation of the position Biggs (2001) espouses (SDC = Staff Development Centre). This includes the following comments

…getting teachers to teach better, which is what staff development is all about…..staff development…is being minimized in many universities, not only in the UK but also in Australia and New Zealand…..Typically, staff development is undertaken in workshops run by the staff development centre…This is the fundamental problem facing SDCs: the focus is on individual teachers, not on teaching

I particularly liked the following comment from Biggs (2001) and find a lot of resonances with local contextual happenings.

Too often SDCs are seen from a Level 2 theory as places providing tips for teachers, or as remedial clinics for poor or beginning teachers. Most recently, they are being replaced by training in educational technology, in the confused belief that if teachers are using IT then they must be teaching properly for the new millennium.

Biggs’ solution

Biggs’ (2001) own summary is hard to argue with

In sum, QE cannot be left to the sense of responsibility or to the priorities of individual teachers. The institution must provide the incentives and support structures for teachers to enhance their teaching, and most importantly, to involve individuals through their normal departmental teaching in QE processes.

However, the detail of his suggested solution is, I think, hideously unworkable, to the extent that it would likely have a negative impact on the quality of teaching if any institution of a decent size tried to implement it. As Biggs (2001) says, but about a slightly different aspect, “the practical problems are enormous”.

I’ve been involved enough with the underbelly of teaching and learning at universities to have a significant amount of doubt about whether the reality of learning and teaching matches the representation presented to the external world. I’ve seen institutions struggle with far simpler tasks than the above, and individual academics and managers “game the system” to be seen to comply while not really fulfilling (or even understanding) the requirements.

3 levels of improving teaching

Leadership: when in doubt, wave a flag

I’d like to propose that there are 3 levels of improving teaching that have some connection with Biggs’ 3 levels of teaching. My 3 levels are:

  1. What the teacher is.
    This is where management put teachers into good and bad categories. Any problems with the quality of teaching are the fault of the academic staff, not the system in which they work.
  2. What the management does.
    This is the horribly simplistic approach taken by most managers and typically takes the form of fads, i.e. where they think X (where X might be generic skills, quality assurance, problem-based learning or even, if they are really silly, a new bit of technology) will make all the difference, and proceed to take on the heroic task of making sure everyone is doing X. The task is heroic because it usually involves a large project and radical change. It requires the leadership to be “leaders”, to wield power, to re-organise, i.e. complex change that is destined to fail.
  3. What the teacher does.
    The focus is on what the teacher does to design and deliver their course. The aim is to ensure that the learning and teaching system, its processes, rewards and constraints are aiming to ensure that the teacher is engaging in those activities which ensure quality learning and teaching. In a way that makes sense for the teacher, their course and their students.

Reflective alignment – my suggested solution

Biggs’ constructive alignment draws on active student construction of learning as the best way to learn, hence the “constructive” bit in the name. I’m thinking “reflective alignment” would be a good name for my suggestion.

This is based on the assumption that what we really want academic staff to be doing, in order to ensure that they are always improving their learning and teaching, is “being reflective”, i.e. engaging in deliberate practice. I’ve talked a bit about this in an earlier post.

I’m just reading a paper (Kreber and Castleden, 2009) that includes some support for my idea

We propose that teaching expertise requires a disposition to engage in reflection on core beliefs…..The value attributed to the notion of ‘reflective practice’ in teaching stems from the widely acknowledged view that reflection on teaching experience contributes to the development of more sophisticated conceptual structures (Leinhardt and Greeno 1986), which in turn lead to enhanced teaching practice and eventually, it is hoped, to improved student learning.

So, simply and without detail, I believe that if a university wants to significantly improve the quality of the majority, if not all, of its learning and teaching, then it needs to create a context within which academic staff can't help but engage in reflective practice as part of their learning and teaching.

That’s the minimum, and not all that easy. The next step would be to create an environment in which academic staff can receive support and assistance in carrying out the ideas which their reflection identifies. But this is secondary. In the absence of this, but the presence of effective reflection, they will work out solutions without the support.

(There is some potential overlap with Biggs' (2001) solution, but I don't think his focuses primarily on encouraging reflection. It has more in common with Level 2 approaches to improving learning and teaching, especially in how it would be implemented in most universities. Yes, the implementation problem still remains for my solution, which could also most likely be implemented as a Level 2 approach. But any solution should be contextually sensitive.)

References

Biggs, J. (2001). “The Reflective Institution: Assuring and Enhancing the Quality of Teaching and Learning.” Higher Education 41(3): 221-238.

Kreber, C. and H. Castleden (2009). “Reflection on teaching and epistemological structure: reflective and critically reflective processes in ‘pure/soft’ and ‘pure/hard’ fields.” Higher Education 57(4): 509-531.

Using a blog for course design sessions

I’ve bitten the bullet and have decided to use a WordPress blog to support the 6-hour orientation to course analysis and design I’m supposed to run next week.

It’s probably going to be much more work than I planned or should put in, but so far it’s been fairly easy. It may be worthwhile.

Good teaching is not innate, it can be “learned” – and what’s wrong with academic staff development

The title to this post is included in a quote from Kane, Sandretto and Heath (2004)

The research team, comprising two teacher educators and an academic staff developer, embarked upon this research confident in the belief that good teaching is not innate, it can be learned. With this in mind, the project sought to theorise the attributes of excellent tertiary teachers and the relationships among those attributes, with the long-term goal of assisting novice academics in their development as teachers.

This coincides nicely with my current task, and also with an idea I came across on the weekend about deliberate practice and the work of Anders Ericsson.

The combination of these “discoveries” is also providing some intellectual structure and support for the REACT idea about how to improve learning and teaching. However, it’s also highlighting some flaws in that idea. Though the flaws aren’t anywhere near as large as those in what passes for the majority of academic staff development around learning and teaching.

The following introduces these ideas and how these ideas might be used to improve academic staff development.

Excellent tertiary teaching

Kane et al (2004) close the introduction of their paper with

We propose that purposeful reflection on their teaching plays a key role in assisting our participants to integrate the dimensions of subject knowledge, skill, interpersonal relations, research/teaching nexus and personality into recognised teaching excellence. We conclude with a discussion of the implications of our model for staff development efforts.

Their proposition about the role of reflection in contributing to excellent teaching matches my long-held belief and perception that all of the best university teachers I’ve seen have been those that engage in on-going reflection about their teaching, keep looking for new knowledge, and keep trying (and evaluating) innovations based on that knowledge in the hope of improving their teaching.

The authors summarise a long history of research into excellent teaching that focused on identifying the attributes of excellent teachers (e.g. well prepared, stimulate interest, show high expectations etc.) but they then suggest a very important distinction.

While these, and other studies, contribute to understanding the perceived attributes of excellent teachers, they have had limited influence on improving the practice of less experienced university teachers. Identifying the elements of “good” university teaching has not shed light on how university teachers develop these attributes.

The model they develop is shown below. They suggest

Reflection lies at the hub of our model and we propose that it is the process through which our participants integrate the various dimensions

Attributes of excellent tertiary teaching

The authors don’t claim this model to have identified any novel sets of attributes. But they do suggest that

the way in which the participants think about and understand their own practice through purposeful reflection, that has led to their development of excellence

What’s been said about reflection?

The authors have a few paragraphs summarising what’s been said about reflection in connection to tertiary teaching, for example

Day (1999) wrote “it is generally agreed that reflection in, on and about practice is essential to building, maintaining and further developing the capacities of teachers to think and act professionally over the span of their careers”.

They trace reflection back to Dewey and his definition

“an active, persistent, and careful consideration of any belief or supposed form of knowledge in light of the grounds supporting it and future considerations to which it tends”

They also mention a framework of reflection outlined by Hatton and Smith (1995) and use it to provide evidence of reflection from their sample of excellent teachers.

Expertise and deliberate practice

Among the many quotes Kane et al (2004) provide supporting the importance of reflection is this one from Sternberg and Horvath (1995)

in the minds of many, the disposition toward reflection is central to expert teaching

Another good quote comes from Common (1989, p. 385).

“Master teachers are not born; they become. They become primarily by developing a habit of mind, a way of looking critically at the work they do; by developing the courage to recognize faults, and by struggling to improve”

Related to this view is the question: were Mozart, and other child prodigies, brilliant because of some innate talent? This is a question that this blog post takes up, and the answer it gives is no. Instead, it’s the amount and quality of practice they engage in which makes the difference. Nurture wins the “nature versus nurture” battle.

The blog post builds on the work of Anders Ericsson and the concept of “deliberate practice”. The abstract for Ericsson et al (1993) is

The theoretical framework presented in this article explains expert performance as the end result of individuals’ prolonged efforts to improve performance while negotiating motivational and external constraints. In most domains of expertise, individuals begin in their childhood a regimen of effortful activities (deliberate practice) designed to optimize improvement. Individual differences, even among elite performers, are closely related to assessed amounts of deliberate practice. Many characteristics once believed to reflect innate talent are actually the result of intense practice extended for a minimum of 10 years. Analysis of expert performance provides unique evidence on the potential and limits of extreme environmental adaptation and learning.

Implications for academic staff development

If reflection and deliberate practice are key to developing mastery or expertise, then how do approaches to academic staff development, and the associated policies, processes and structures around university learning and teaching, help encourage and enable this practice?

Seminars and presentations probably help those who are keen to become aware of new ideas that may aid their deliberate practice. However, attendance at such events is minimal. Much of existing practice seems to provide some level of support only to the minority already engaging in deliberate practice around learning and teaching.

The majority seem to be able to get away without engaging like this. Perhaps there’s something here?

References

Common, D. L. (1989). “Master teachers in higher education: A matter of settings.” The Review of Higher Education 12(4): 375-387.

Ericsson, K. A., Krampe, R. T. and Tesch-Römer, C. (1993). “The role of deliberate practice in the acquisition of expert performance.” Psychological Review 100(3): 363-406.

Hatton, N. and Smith, D. (1995). “Reflection in teacher education: Towards definition and implementation.” Teaching & Teacher Education 11(1): 33-49.

Kane, R., Sandretto, S. and Heath, C. (2004). “An investigation into excellent tertiary teaching: Emphasising reflective practice.” Higher Education 47(3): 283-310.

Sternberg, R. and Horvath, J. (1995). “A prototype view of expert teaching.” Educational Researcher 24(6): 9-17.

The design of a 6 hour orientation to course analysis and design

It’s that time of year again: next week I get to run a session with 20 or so new CQU academics looking at course analysis and design. The session is part of a four-day program entitled Foundations of University Learning and Teaching (FoULT), which is run twice a year.

The following post gives an overview of some of my thinking behind the session this year. The sessions won’t really be finalised until they are over, so if you have any feedback or suggestions, fire away.

Constraints

The following constraints apply

  • The session lasts 6 hours.
  • I’m told there will be 24 participants; I expect fewer.
  • I’ll be the only facilitator.
  • The participants are required to do this as part of their employment and some may be less than enthusiastic, though there are generally some very keen participants.
  • The sessions will be held in a computer lab. The computers are arranged around the walls of the room and there is a table without computers in the middle of the room.
  • The session runs in two blocks: 3 hours after lunch on one day and 3 hours before lunch the following day.
  • The participants will be a day and a half into the four days by the time they get to this session (information overload kicking in).
  • Earlier on the first day they will have done sessions on “knowledge management” and assessment – moderation and marking.
  • The title of the sessions is “course analysis and design” so should probably do something close to that.
  • I can’t put in a lot of preparation because of time constraints and other responsibilities.
  • Have done this session a few times before (slides from the last time are Introduction, Implementation, Analysis and design) so that experience will constrain my thinking.
  • Theoretically, I don’t believe that there is much chance of radically changing minds or developing expertise in new skills. The best I can hope for is sparking interest, raising awareness and pointing them in the right direction.

The plan

I’m thinking that the session should aim to

  • Make people aware of the tools and support currently available to help with their teaching.
  • Introduce them to some concepts or ideas that may lead them to re-think the assumptions on which they base their course design.
  • Introduce them to some resources and ideas that may help them design their courses.

Activities during the session will include

  • Some presentation of ideas using video and images.
  • Discussion and sharing of responses and their own ideas via in-class discussion, but also perhaps through the CDDU wiki and/or this blog.
  • A small amount of activity aimed at performing some design tasks.
  • A bit of playing around with various systems and resources.

There won’t be any assessment for this one.

The sessions

I’m planning on having 4 sessions over the 6 hours

  1. Introduction
    Set up who I am and what we’re going to be doing. Find out more about the participants – maybe get them to put this on the wiki or perhaps a WordPress blog — that sounds like an idea. Introduce the Trigwell (2001) model of university teaching that I’ll be using as a basic organising concept. Use it to introduce some of the ideas and explain the aim of the sessions. Introduce them to the technology we’ll be using and get them going.
  2. The T&L Context
    Talk about the details of the CQUni T&L context. What tools and resources are available? What do students see when they use various systems (something staff often don’t see)? Who to ask for help? etc. Also include mention of “Web 2.0” tools i.e. that the context and tools for T&L aren’t limited to what is provided by the institution. Provide an opportunity to play and ask questions about this. Aim is to be concrete, active and get folk aware of what tools they can use. Hopefully to keep them awake after lunch.
  3. Teachers’ thinking
    Introduce and “attack” some basic ideas that inform the way people think about learning and teaching. Some ideas about course design, learning and teaching and human cognition.
  4. Teachers’ planning
    Talk about the process of actually doing course design and some of the ideas, resources and tools that can be used during this process.

The plan is that the first two would be on the afternoon of the first day with the last two on the following day.

The Trigwell (2001) model of teaching is shown in the following image and is briefly described on the flickr page for the image. You should see the connection between the names of the sessions and the model.

Trigwell's model of teaching

Actually, after posting this I’ve made some changes to expand the use of the Trigwell (2001) model to include teachers’ strategies and, in particular, gathering some of the participants’ strategies.

What’s needed? What would be nice?

I want to provide pointers to additional resources and also make use of good resources during the sessions. The list of what I’ve got is available on del.icio.us.

If you know of any additional resources you’d recommend, please either add them in the comments of this post or tag them in del.icio.us with foult.

Feedback on the above ideas would also be welcome.

References

Trigwell, K. (2001). “Judging university teaching.” The International Journal for Academic Development 6(1): 65-73.

Some things that are broken with the evaluation of university teaching

This article from a Training industry magazine raises a number of issues, well known in the research literature, about the significant limitations that exist with the evaluation of university teaching.

Essentially the only type of evaluation done at most universities is what the article refers to as “level 1 smile sheets”. That is, student evaluation forms that ask students to rate what they felt they learned and how they felt about the course and the teacher. As Will Thalheimer describes

Smile sheets (the feedback forms we give learners after learning events) are an almost inevitable practice for training programs throughout the workplace learning industry. Residing at Donald Kirkpatrick’s 1st level—the Reaction level—smile sheets offer some benefits and some difficulties.

His post goes on to list some problems, benefits and a potential improvement. Geoff Parkin shares his negative view on them.

The highlight for me from the Training mag article was

In some instances, there is not only a low correlation between Level I and subsequent levels of evaluation, but a negative one.

The emphasis on level 1 evaluation – why

Most interestingly, the article then asks the question, “why do so many training organisations, including universities, continue to rely on level 1 smile sheets?”

The answer it provides is that they are too scared of what deeper evaluation might find. It’s the ostrich approach of sticking the head in the sand.

What else should be done?

This Google Book Search result offers some background on “level 1” and talks about the other 3 levels. Another resource provides some insights and points to other resources. I’m sure if I dug further there would be a lot more information about alternatives.

Simply spreading the above findings amongst the folk at universities who rely on, and respond to, the findings of level 1 smile sheets might be a good start. It’s probably necessary if we’re to move beyond the status quo.

Choosing your indicators – why, how and what

The unit I work with is undertaking a project called Blackboard Indicators. Essentially the development of a tool that will perform some automated checks on our institution’s Blackboard course sites and show some indicators which might identify potential problems or areas for improvement.

The current status is that we’re starting to develop a slightly better idea of what people are currently doing through use of the literature and also some professional networks (e.g. the Australasian Council on Open, Distance and E-learning) and have an initial prototype running.

Our current problem is how to choose what the indicators should be. What types of problems might you see? What is a “good” course website?

Where are we up to?

Our initial development work has focused on three categories: course content, coordinator presence and all interactions. There is more detail in this previous post.

Colin Beer has contributed some additional thinking about some potential indicators in a recent post on his blog.

Col and I have talked about using our blogs and other locations to talk through what we’re thinking to develop a concrete record of our thoughts and hopefully generate some interest from other folk.

Col’s list includes

  • Learner.
  • Instructor.
  • Content.
  • Interactions: learner/learner, learner/instructor, learner/content, instructor/content

Why and what?

In identifying a list of indicators, as when trying to evaluate anything, it’s probably a good idea to start with a clear definition of why you are doing this and what you are trying to achieve.

The stated purpose of this project is to help us develop a better understanding of how, and how well, staff are using the Blackboard course sites. In particular, we want to know about any potential problems (e.g. a course site not being available to students) that might cause a large amount of “helpdesk activity”. We would also like to know about trends across the board which might indicate the need for some staff development, improvements in the tools, or some support resources to improve the experience of both staff and students.
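
To make the “potential problems” idea concrete, the sketch below shows what a first-cut automated availability check might look like. It’s an illustration only: the URL pattern, the courses.txt input file, and the assumption that an unavailable course site returns a non-200 HTTP status are all hypothetical.

    #!/bin/sh
    # Hypothetical first-cut indicator: flag course sites that don't respond,
    # e.g. because they haven't been made available to students.
    # courses.txt holds one course code per line (hypothetical input file).
    BASE="https://blackboard.example.edu/courses"   # hypothetical URL pattern

    while read course
    do
      status=$(curl -s -o /dev/null -w "%{http_code}" "$BASE/$course")
      if [ "$status" != "200" ]
      then
        echo "$course: possible problem (HTTP $status)"
      fi
    done < courses.txt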

There are many other aims which might apply, but this is the one I feel most comfortable with, at the moment.

Some of the other aims include

  • Providing academic staff with a tool that can aid them during course site creation by checking their work and offering guidance on what might be missing.
  • Provide management with a tool to “check on” course sites they are responsible for.
  • Identify correlations between characteristics of a course website and success.

The constraints we need to work within include

  • Few or no resources – the implication being that manual, human checking of course sites is not currently a possibility.
  • A difficult organisational context due to an on-going restructure – this makes it hard to get engagement from staff in a task seen as additional to existing practice. It also suggests a need to help staff deal with existing problems rather than creating more work: we need to be seen to be working with staff to improve and change, rather than inflicting change upon them.
  • The LMS will be changing – come 2010 we’ll be using a new LMS, so whatever we do has to be transportable.

How?

From one perspective there are two types of process which can be used in a project like this

  1. Teleological or idealist.
    A group of experts get together, decide and design what is going to happen and then explain to everyone else why they should use it and seek to maintain obedience to that original design.
  2. Ateleological or naturalist.
    A group of folk, including significant numbers of people doing the real work, collaborate to look at the current state of the local context and undertake a lot of small-scale experiments to figure out what makes sense. They examine and reflect on those experiments, chuck out the ones that didn’t work, and build on the ones that did.

(For more on this check out: this presentation video or this presentation video or this paper or this one.)

From the biased way I explained the choices, I think it’s fairly obvious which approach I prefer. A preference for the ateleological approach also means that I’m not likely to spend vast amounts of time evaluating and designing criteria based on my perspectives alone. It’s more important to get a set of useful indicators up and going, in a form that can be accessed by folk, and to have a range of processes by which discussion and debate are encouraged and then fed back into the improvement of the design.

The on-going discussion about the project is more likely to generate something more useful and contextually important than large up-front analysis.

What next then?

As a first step, we have to get something useful (for both us and others) up and going in a form that is usable and meaningful. We then have to engage with the folk using it to find out what they think and where they’d like to take it next. In parallel, there is the idea of finding out, in more detail, what other institutions are doing and seeing what we can learn.

The engagement is likely going to need to be aimed at a number of different communities including

  • Quality assurance folk: most Australian universities have quality assurance folk charged with helping the university be seen by AUQA as being good.
    This will almost certainly, eventually, require identifying what effective/good outcomes for a course website are, as outcomes are a main aim for the next AUQA round.
  • Management folk: the managers/supervisors at CQU who are responsible for the quality of learning and teaching at CQU.
  • Teaching staff: the people responsible for creating these artifacts.
  • Students: for their insights.

Initially, the indicators we develop should match our stated aim – to identify problems with course sites and become more aware of how they are being used. To a large extent this means not worrying about potential indicators of good outcomes and whether or not there is a causal link.

I think we’ll start discussing/describing the indicators we’re using and thinking about on a project page and we’ll see where we go from there.

Creating slidecasts on Slideshare – e-learning support?

The problem

The unit I work with is responsible for helping staff (and to some extent students) of CQUniversity with their learning and teaching. This is traditionally a fairly difficult task, made more difficult at CQUniversity by a number of contextual factors. Perhaps the largest is the fact that staff and students are spread across at least 9 Australian campuses along almost the full length of the Australian east coast, a couple of overseas campuses and partners, and throughout the world via distance education.

Face-to-face support sessions, either one-on-one or in groups, are somewhat difficult when our small unit is entirely based on one of the campuses. We have to look at using technology and other strategies to address this geographic distribution. We’ve been slowly developing our website using a wiki and other Web 2.0 tools. This post talks about our early attempts at using slidecasts – simple PowerPoint presentations with a narration, in our case designed to be short and sharp and focused on a particular need.

In some vague, nascent and emergent way this work also links into, and aims to continue the growth of, our PLEs@CQUni project, which seeks to develop insights into how CQUniversity can effectively make use of the increasing number of social media services available out on the net. We hope that, if this approach proves useful, we will develop much simpler and easier ways in which CQUniversity staff and students can make use of this sort of approach.

What we’ve done

Our initial experiments with slidecasts have been around the use of CQU’s online assignment submission system – OASIS – and have been implemented using Slideshare.

The slidecasts that are in place at the moment cover the following topics

How we did it

The process currently being used involves the following steps

  1. Prepare a powerpoint presentation with an emphasis on showing what happens.
  2. Use the “record narration” facility of PowerPoint 2007/2008 to “deliver” the session in a room.
    This is currently done

    • Using a simple usb headset/mic in my office.
    • Without linking the audio files (this makes sure that PowerPoint creates a separate wav file for each slide – which is important for the following steps).
    • On a Windows box – the times I’ve tried this on my Mac, PowerPoint has cut arbitrary lengths of audio off the end of each slide’s narration.
  3. Save a copy of the presentation (without audio) in Office 2004 format and upload it to Slideshare.
  4. Unzip the Office 2007 version of the presentation (with audio) and create a single MP3 file containing the whole narration.
    This is done using the following steps (on the command line on my Mac)

    • Rename the audio files for the first 9 slides
      PowerPoint names the audio for the first 9 slides audio1.wav, audio2.wav etc. This throws out the sort order of the files, i.e. echo audio*.wav results in: audio1.wav audio10.wav audio11.wav … audio2.wav audio20.wav audio21.wav. That is not good when you want to concatenate the files in slide order, so I rename them with a simple shell script

      for name in 1 2 3 4 5 6 7 8 9
      do
        # zero-pad so that audio02.wav sorts before audio10.wav
        mv audio${name}.wav audio0${name}.wav
      done
      

      Appropriate use of sort could probably achieve the same thing, but I’m not keeping the unpacked files, so no problems here.

    • Concatenate the individual wav files into one using SoX
      Very simply, using “sox audio*.wav all.wav”
    • Convert the wav file into mp3
      SoX should be able to do this, but I haven’t had time to nut it out (busy, busy) so I’ve fallen back on the approach I know works – iTunes. Two likely alternatives are sketched below.
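
      For what it’s worth, either of the following one-liners should do the conversion – the first assumes SoX was compiled with MP3 (LAME) support, the second assumes the LAME encoder itself is installed. I haven’t verified either in this workflow, hence the iTunes fallback.

      # if SoX was built with MP3 support
      sox all.wav all.mp3
      # or using the LAME encoder directly
      lame all.wav all.mp3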
  5. Upload the mp3 file to a public website
  6. Use the audio linking slidecast facility on Slideshare to link slides to the corresponding bits of the audio
    The command soxi gives information about individual sound files. I use it to identify the duration of each slide’s audio, which makes using the Slidecast audio linker quicker.

    # print the duration of each slide's narration
    for name in *.wav
    do
      echo $name
      soxi $name | grep Duration
    done
    
Long term aim

Much of the above process can be automated. I can see a process by which someone gives a presentation with the narration feature of PowerPoint turned on. They then upload the complete (and usually very large) file to a CQUniversity website (the size of the file and the specific requirements for CQUni would require an institutional system). The CQUni system could then extract the audio, produce the mp3, upload it to a public website, upload a version of the presentation to Slideshare and connect the audio with the slides.
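
As a rough sketch, the local half of that pipeline might look like the following shell script. It’s an illustration only: the presentation file name is hypothetical, the ppt/media path is an assumption about where the narration files live inside the .pptx archive, SoX and LAME are assumed to be installed, and the Slideshare upload/synchronisation step is left out (see the API limitation below).

    #!/bin/sh
    # Hypothetical sketch of the manual steps described above.
    # A .pptx file is a zip archive; this assumes the per-slide
    # narration wav files live under ppt/media/.
    unzip -o presentation.pptx 'ppt/media/*.wav' -d extracted
    cd extracted/ppt/media

    # zero-pad audio1.wav .. audio9.wav so the glob sorts in slide order
    for name in 1 2 3 4 5 6 7 8 9
    do
      [ -f audio${name}.wav ] && mv audio${name}.wav audio0${name}.wav
    done

    sox audio*.wav all.wav   # concatenate in slide order
    lame all.wav all.mp3     # encode the single narration mp3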

Perhaps the current major limitation with this idea, at least the last time I checked, is that the Slideshare API doesn’t appear to support supplying timing data for the slides, so the MP3 audio synchronisation could not be automated.

The other major problem is whether or not this approach is actually useful, usable and used by staff and students.

Scavenger hunt and other methods for getting into systems

Much of what we do involves enabling academics (and students) to become familiar with particular technologies. Sufficiently familiar to think about how they can use it in their learning and teaching. We’ve had to do it with Blackboard and we’ll have to do it with Second Life.

The aim of this post is to reflect upon some methods of doing this.

The recipe method

Some of this sort of work is reduced to the “recipe method”. Sessions become the presentation of long lists of recipes to the poor participants, e.g. if you want to do X then you do steps 1, 2, 3, 4 and 5.

The benefit of this approach is that it is simple to present and is often the quickest way for the participant to do something. The drawback is that they develop no real understanding of how the system works, so they are unable to problem-solve or extend their understanding into new and more useful applications.

It can also be incredibly boring.

Understand the model

The approach I prefer, and have started experimenting with, involves trying to give the participants a working model of the system and opportunities to experiment.

It works on the assumption that any technology has an underlying model and a set of affordances – things it can do easily.

An example I use to illustrate this is the Introducing the book video, which shows how early users of the new-fangled “book technology” had some problems. (In writing this post I noticed that someone has posted a modified, English language version.)

The point is that very few (if any) people within Universities today would have any problems doing anything with a book because they understand the underlying model. It’s second nature. They can problem solve, develop new uses and take their understanding to different types of printed material.

The aim of a session should be to help the participants develop this type of knowledge of a computer system.

How do you do this?

This should actually be titled “how have I tried to do this recently?” as it summarises the rationale behind my recent attempts. These have followed these major steps

  • Show the “introducing the book” video – explain the need to develop models of a system (slides 6 and 7)
  • Become familiar with the language (slides 8-38 or so)
    This was probably too long. But during these slides the students had a list of terms which they were meant to fill in as a group (2 or 3 folk) – a sort of term bingo. With the winning team getting some small prize (usually chocolate).
  • Introduce the model (slide 26-40)
    It’s not done well in the slides, but the aim is to connect the model of the system with something that is familiar. In this case, a Blackboard course site is organised in much the same way as a set of folders on a computer. This would be better illustrated with small activities involving the students.

    In a subsequent similar session, not yet online, I tried to connect the notion of “breadcrumbs” back to their understanding by having a slide of Hansel and Gretel and explaining the origin of the term “breadcrumbs” back to something many of them already knew.

  • Scavenger hunt (slide 40)
    Having been given the overview the question now is how to get the participants to test and apply that knowledge. The approach here is a scavenger hunt. In the same teams as the term bingo the students were given a set of items to find on Blackboard. These items were connected with their use of the system – e.g. the Social Work Blackboard course sites. The items were chosen to require them to apply different aspects of the model that were introduced.

This type of activity works best when

  • There is a time for the participants to play
  • The scavenger hunt has been designed to be specific to their needs or preferences
  • The hunt is done in a small group, hopefully so that members can cover for each other’s limitations and make each other feel a bit more comfortable.

What do students find useful?

In a growing category of blog posts I’m expanding, and attempting to apply, my interest in diffusion theory and related theories to increasing the use of course websites. A major requirement in achieving this, as outlined in the previous post, is an understanding of what students find useful.

In this post, I’m trying to bring together some research that I’m aware of which seeks to answer this question by actually asking the students. If you know of any additional research, let me know.

Accessing the student voice

Accessing the student voice is the final report from a project which analysed 280,000 comments on Course Experience Questionnaires from 90,000 students at 14 Australian universities. The final report has 142 pages (and is available from the web page). Obviously the following is a selective synopsis of an aspect of it.

The report summarises (pp. 7 and 8) the 12 sub-domains which attracted the highest percentage of mentions, which it equates with those that are important to students. In rank order they are

  1. Course Design: learning methods (14.2% share of the 285,000 hits)
    There were 60 different methods identified as the best aspect of their studies which fell into 5 clusters

    • 16 face-to-face methods that focused on interactive rather than passive learning
    • 7 independent study and negotiated learning methods
    • 20 practice-oriented and problem-based learning methods
    • 6 methods associated with simulated environments and laboratory methods
    • 11 ICT enabled learning methods
  2. Staff: quality and attitude (10.8%)
  3. Staff: accessibility (8.2%)
  4. Course Design: flexibility & responsiveness (8.2%)
  5. Course Design: structure & expectations (6.7%)
  6. Course Design: practical theory links (5.9%)
  7. Course Design: relevance (5.6%)
  8. Staff: teaching skills (5.4%)
  9. Support: social affinity (3.8%)
  10. Outcomes: knowledge/skills (3.8%)
  11. Support: learning resources (3.5%)
  12. Support: infrastructure & learning environment (3.4%)

Grouping those into totals

  • Course design – 40.6%
  • Staff – 24.4%
  • Support – 10.7%
  • Outcomes: knowledge/skills – 3.8%

Link to the 7 principles

A quote from the report

The analysis revealed that practice-oriented and interactive, face-to-face learning methods attracted by far the largest number of ‘best aspect’ comments.

Of the 7 Principles for Good Practice in Education mentioned in the last post, #3 is “encourages active learning”

What about CQU students

In late 2007 we asked CQU’s distance education students three questions

  1. What did you like or find useful?
  2. What caused you problems?
  3. What would you like to see?

Students were asked to post their answers to an anonymous discussion forum. This means they could see each other’s posts and respond.

An initial summary of the responses is available and CQU staff can actually view a copy of the discussion forum containing the original student comments.

A simple analysis revealed the following top 10 mentions

  1. 106 – Some form of eLecture – video, audio etc.
  2. 86 – Quick, effective and polite responses to study queries.
  3. 66 – Clear and consistent information about the expectations of the course and assignments e.g. making past assignments and exams available.
  4. 55 – Study guides.
  5. 53 – Good quality and fast feedback on assignments.
  6. 33 – For resources that are essentially print documents to be distributed as print documents.
  7. 30 – A responsive discussion board.
  8. 27 – Online submission and return of assignments.
  9. 25 – More information about exams, including more detailed information on how students went on exams.
  10. 21 – Having all material ready by the start of term.

CQU Students – 1996

Back in 1996 CQU staff undertook a range of focus groups with CQU distance education students aimed at identifying issues related to improving distance education course quality. This work is described in more detail in a paper (Purnell, Cuskelly and Danaher, 1996) from the Journal of Distance Education.

Arising from this work were six interrelated areas of issues. These issues were used to group the suggested improvements from the DE students; the improvements are explained in detail in the paper and summarised below.

  1. student contact with lecturers/tutors,
    • Easy access to people with relevant expert knowledge and skills (usually the lecturer).
    • Flexible hours for such access.
    • Some personal contacts through telephone and, where possible, some face-to-face contact.
    • Additional learning resources, such as audio- and videotapes to provide more of a personal touch.
  2. assessment tasks,
    • Detailed feedback (approximately one written page) on completed assessment tasks indicating how to improve achievement.
    • Timely feedback so that students can utilize feedback in future assessment tasks in the unit.
    • A one-page criteria and standards sheet showing specific criteria to be used in each assessment task and the standards associated with each criterion (statements of the achievement required for a high grade, etc.).
    • Clear advice on assessment tasks in the unit resource materials and in other contacts such as teleconferences.
    • Where possible, the provision of exemplar responses to similar assessment tasks be provided in the study materials.
    • Lecturers to be mindful of the differences in resources available to rural students compared to those in larger urban areas when setting and marking assessment tasks.
  3. flexibility,
    • Non-compulsory residential schools available at various locations of no more than three days’ duration and incorporating use of facilities such as libraries.
    • Greater consideration of the complexities of lives of distance education students by encouraging, for example, more self-paced learning.
    • Access to accredited study outside traditional semester times.
    • Lecturers/tutors to consider more fully the needs of isolated students in rural areas in support provided.
  4. study materials,
    • Ensure study materials arrive on time (preferably in the week prior to the commencement of a semester).
    • Efficient communications with students-particularly with the written materials provided in addition to the study materials.
    • Ensure each unit’s study guide matches other resources used in a unit, such as a textbook.
    • Lecturers should be mindful of extra costs for students to complete a unit in which, for example, specialized computer software might be needed; if a textbook must be purchased, it should be used sufficiently to justify its purchase.
    • Lecturers should cater to the range of students they have, especially from rural areas, with the study requirements for each unit (many participants reported that self-contained study materials in which there was little or no need to secure other resources to achieve high grades were valued).
  5. mentors, and
    • Having access to mentors is desirable but should be optional for students.
    • Issues about the role of a mentor need to be clarified.
  6. educational technology.
    • Continue to use and make more effective use of technologies familiar to students, such as the telephone and audio- and videocassettes.
    • Examine ways of minimizing access costs to the Internet for students, especially in rural areas.
    • Provide appropriate technical support for students to be able to access and use the Internet.
    • Provide professional development for staff to meet individual needs for using educational technologies involving, for example, interactive television, audio graphics, CD-ROM, e-mail, and the World Wide Web.

The commonalities between this list, from 1996, and the list generated in 2007 are not small.

Creating quality course websites – the pragmatic approach

In a previous post I laid out some rationale for an organisational approach to increase the usage of course websites. In this post I provide more detail on the rationale behind the pragmatic approach, which was described this way in that previous post.

  • Pragmatic – ad hoc changes to particular aspects of a course website.
    Most academic staff make these changes in an unguided way. I’ll suggest that you are likely to obtain greater success if those ad hoc changes are guided by theories and frameworks such as the Technology Acceptance Model (TAM) and related work (Davis, 1989), Diffusion Theory (Rogers, 1995) and the 7 Principles for Good Practice in Education (Chickering and Gamson, 1987).

I’ll describe each of the three theories that form the foundation of this idea. In a later post, I’ll try and take up the idea of how this could be used in the design of a course website.

The fundamental idea is that these three theories become the basis for guiding the design of a standard course website which becomes the minimal online course presences for an institution. These theories are applied with close attention to the local context and consequently there will be differences between contexts, perhaps even between disciplines or types of courses.

Technology acceptance model

The Technology Acceptance Model (TAM) suggests that there are two main factors which will influence how and when people use an information system (the assumption is that a course website is an information system):

  1. Perceived usefulness.
    “The degree to which a person believes that using a particular system would enhance his or her job performance” (Davis 1989).
  2. Perceived ease of use.
    “The degree to which a person believes that using a particular system would be free from effort” (Davis 1989).

Some more discussion about the use of TAM within e-learning can be found in (Behrens, Jamieson et al, 2005).

The questions that arise from this idea for the design of a standard course website might include:

  1. What do the students currently find useful?
  2. What additions might they find useful?
  3. The same questions applied to all staff, both teaching and support.
  4. How can these requirements be fulfilled in a way that is easy to use?
  5. Just how do you determine that?

Diffusion theory

Diffusion Theory (Rogers 1995) encompasses a number of related theories explaining why people adopt (or don’t adopt) innovations. The best known of these is the theory of perceived attributes.

The idea is that how a potential adopter perceives an innovation’s attributes influences whether or not they will adopt it. The perceived attributes that have the biggest influence on adoption are:

  • Relative advantage.
    The degree to which an innovation is perceived as better than the idea it supersedes.
  • Compatibility.
    The degree to which an innovation is perceived as being consistent with the existing values, past experiences, and needs of potential adopters.
  • Complexity.
    The degree to which an innovation is perceived as difficult to understand and use.

If you want students to make use of an online course presence then they must perceive the services offered by that course presence to be useful (relative advantage), easy to use (complexity) and something that meets their expectations of a university experience (compatibility).

The questions which arise from this include

  • What do students expect from their university learning experience?
  • What are their capabilities when it comes to technology and online learning?
  • What do they find useful?

This is one theoretical explanation for why you would expect online lectures, especially if implemented with a YouTube like interface, to be seen as a positive thing by students. In particular because most students still see lectures as a core component of a university education. They expect to have lectures.

This prediction is backed up by the findings of the Carrick Project examining the impact of web-based lecture technologies. You can find particular mention of this in the project’s publications.

Jones, Jamieson and Clark (2003) talk more about the use of diffusion theory for choosing potential web-based educational innovations.

That paper moves beyond the perceived attributes component of diffusion theory. The theory’s other components offer a range of insights and potential advice for other aspects of this type of project. For example,

  • Innovation-decision – is the decision to adopt a particular innovation an optional, collective or authority decision.
    Authority decisions enable the fastest adoption, but may be circumvented.
  • Communication channels – how information is communicated to people impacts on the level of adoption.

The 7 principles

The 7 principles for good practice in undergraduate education were drawn from research on good teaching and learning and were intended to help academics address problems including apathetic students.

The 7 principles are that good practice in education:

  1. encourages contact between students and faculty,
  2. develops reciprocity and cooperation among students,
  3. encourages active learning,
  4. gives prompt feedback,
  5. emphasizes time on task,
  6. communicates high expectations, and
  7. respects diverse talents and ways of learning.

It could be argued that the 7 principles are very specific, research-informed advice about how to design activities and resources which students perceive to be useful and which provide them with relative advantage. This has obvious connections with diffusion theory and TAM.

For example, principle 4 is “gives prompt feedback”. A design feature derived from that might be to return marked assignments to all students within 2 days. Based on my experience with students, they would perceive this as very useful and believe they gain an advantage from it.

This connection suggests that appropriate use of the 7 principles could significantly increase the use of an online course presence.

Implementation considerations – what about the staff?

The insights from diffusion theory and TAM also apply to the teaching staff and even the organisation. Teaching staff are critical to learning and teaching. If they aren’t positively involved it won’t work well. From an organisational perspective, anything that is planned needs to be doable within the resource constraints and also needs to be compatible with the organisation’s current structure.

Creating quality course websites

CQUni has an interest in increasing the quality of the course websites, as part of a broader push to improve the quality of learning and teaching. This post is an attempt to engage with some of the issues and develop some suggestions for moving forward.

There are many arguments for why this particular focus on course websites might be somewhat misguided. For example, there is a growing argument (under various titles, including Personal Learning Environments) for the need to move away from this sort of institutional focus to a much greater focus on the learning environment of the individual student. While those discussions are important, as an institution we do need to have a pragmatic focus and improve what we do and what our students experience.

So, this post and any subsequent project ignore those broader questions. However, that’s not to say that CQU isn’t engaging with them – we are. For example, the PLEs@CQUni project is aiming to examine some of the questions raised by notions of PLEs and social software and what impact they might/should have on institutional practice.

What are quality and success?

Before embarking on any sort of attempt to “improve quality” you should probably seek to define what quality is. When do you know that you have succeeded?

There have been any number of attempts to determine quality standards for course websites. Many of these draw on guidelines from the educational research literature or from Human-Computer Interaction, Information Architecture and other web/online related disciplines. I’m from an information systems background so, not surprisingly, I’ll draw on that background.

Within the information systems research literature, how to determine success, and consequently replicate it, has received a great deal of attention. One thing this attention has established is that the notion of success is extremely subjective. An IT department will label something successful while a user of the same system may disagree strongly; the finance division may have yet another perspective. For this and other reasons the IS literature has moved on to using system use as a measure of success (Behrens, Jamieson et al. 2005).

That is, a system or tool is successful if there is large and sustained usage of it by people – hopefully the people you intended. As you might imagine, there has been subsequent research on the quality of that use. However, the level of use has been established as a fairly reliable measure of success.

I’m going to suggest that the level of student and staff use of a course website is a useful benchmark for the success, and even quality, of an online course. You certainly cannot impact learning outcomes or students’ perceptions and experience through an online course presence if they don’t make use of it.

If you accept this, then the question becomes what can we do to encourage use, to encourage success.

If you don’t accept it, and many people might not, then the rest of this argument becomes almost pointless. If that’s you, I would like to hear the arguments for why it doesn’t make sense.

Encouraging use

Drawing on some of the ideas from a previous post, I’m going to suggest that there are two main approaches you can take to increase the use of a course website

  1. Pragmatic – ad hoc changes to particular aspects of a course website.
    Most academic staff make these changes in an unguided way. I’ll suggest that you are likely to obtain greater success if those ad hoc changes are guided by theories and frameworks such as the Technology Acceptance Model (TAM) and related work (Davis, 1989), Diffusion Theory (Rogers, 1995) and the 7 Principles for Good Practice in Education (Chickering and Gamson, 1987).
  2. Re-design – where the entire course (and consequently the course website) is re-designed by a project that returns to first principles.
    Again, most academic staff I’m familiar with do this type of re-design in a fairly unguided way. I’ll suggest that re-design approaches informed by appropriate educational theories or frameworks are more likely to succeed, and that the approach already being used at CQU, which uses Constructive Alignment (Biggs and Tang, 2007) and the 7 Principles (Chickering and Gamson, 1987), works well.
The argument is that a well-implemented approach drawing on either of these options can improve the use of a course website. However, I will propose that the cost and level of success of each approach differ, as suggested below for an institutional setting – i.e. not an individual course, but a large collection of courses within a program, faculty or university. Obviously these are broad predictions of outcomes abstracted away from a particular context.

  • Level of use. Pragmatic: a significant increase in use is possible. Re-design: a really significant increase in use is possible.
  • Quality of use/outcomes. Pragmatic: some increase in the quality of use and student outcomes, but still largely reliant on the individual student’s capabilities rather than the actual course website. Re-design: potentially huge increases in the quality of use and student outcomes.
  • Difficulty/cost of implementation. Pragmatic: somewhat difficult but not all that expensive – large-scale change in a university is always difficult. Re-design: extremely difficult and expensive; typically this will require the academic staff associated with the courses to radically rethink their conceptualisations of learning and teaching, which is not easy.

The suggestion

I’m hoping to expand on this further in subsequent posts as an attempt to outline a project by which an institution like CQU can significantly improve the use of its course websites and subsequently improve the learning experience of its students.

In summary, the proposal involves the following

  1. A broad-scale push for pragmatic improvement to course sites.
    A project to ensure that all course sites are informed by diffusion theory, TAM and the 7 principles in a way that is “automatic” and simple. Essentially, (almost) all course websites should at least illustrate these characteristics.
  2. A process of complete re-design targeting courses with the largest numbers of students.
  3. An on-going process by which the lessons and outcomes of the re-design are fed into the broad-scale process of pragmatic improvement.
