Assembling the heterogeneous elements for (digital) learning

Month: September 2012

Compliance cultures and transforming the quality of e-learning

In putting the finishing touches on this ASCILITE paper I discovered that Tuesday will be the two-year anniversary of when I first put together much of the following on attempts by universities to improve/transform the quality of e-learning through checklists and other “quality assurance” methods. Given that I still see this tendency from central L&T folk in universities – especially those in management – and that the original checklist that sparked the following has largely been abandoned, I thought I’d share this.

The anecdotal spark will be briefly touched upon in the ASCILITE paper; the quick summary of some literature won’t be, due to space constraints. But I do find it increasingly interesting/frightening/sad that these approaches are still being adopted, even with widespread knowledge of what actually happens.

The anecdotal spark

The spark for this was a chat with a friend who was, and is, a Senior Lecturer within a Faculty at an Australian university. I was in a central L&T support role. My friend is one of the few academics who was widely respected and had made significant contributions to the institution. He/she, however, was being increasingly frustrated by the “quality assurance” of L&T, especially the recent introduction of a checklist for the minimum service standard for course websites. The nature of the checklist, and the technology used to implement and manage it, was so pointless that the widespread academic way of dealing with the checklist was captured by this quote:

I go in and tick all the boxes, the moderator goes in and ticks all the boxes and the school secretary does the same thing. It’s just like the exam check list.

This was always a bit sad because the intent – at least the published, espoused intent – of the minimum service standards was to act as a starting point for “integrating learning and teaching strategies that could influence students study habits” and to “encourage academic staff to look beyond existing practices and consider the useful features of the new LMS” (Tickle et al., 2009, p. 1042). But the outcome was no great surprise given what is said in the literature.

Some of the literature

Knight and Trowler (2000)

Likewise, attempts to improve teaching by coercion run the risk of producing compliance cultures, in which there is ‘change without change’, while simultaneously compounding negative feelings about academic work

Harvey and Newton (2004, p. 149)

These studies reinforce the view that quality is about compliance and accountability and has, in itself, contributed little to any effective transformation of the student learning experience.

Radloff (2008, n.p.)

Staff may question the institutional approach to quality which they perceive as compliance driven creating ‘busy work’ (Anderson 2006; Harvey & Newton 2004; Laughton 2003) with little positive impact on teaching practice and student learning experiences (Harvey 2006). They may therefore try to avoid, subvert or actively reject attempts to implement quality systems and processes. As Jones and de Saram (2005, p. 48) note, “It is relatively easy to develop a system and sets of procedures for quality assurance and improvement on paper. To produce a situation where staff on campus ‘buy into’ this in an authentic and energetic manner is much more difficult”.

What’s really surprising is that the last author quoted here was the Pro-Vice Chancellor responsible for learning and teaching just before the checklist approach was introduced.

References

Harvey, L., & Newton, J. (2004). Transforming quality evaluation. Quality in Higher Education, 10(2), 149–165.

Knight, P., & Trowler, P. (2000). Department-level cultures and the improvement of learning and teaching. Studies in Higher Education, 25(1), 69–83.

Radloff, A. (2008). Engaging staff in quality learning and teaching: What’s a Pro Vice Chancellor to do? In Engaging Communities: Proceedings of the 31st HERDSA Annual Conference (pp. 285–296). Rotorua.

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. In Proceedings ASCILITE Auckland 2009 (pp. 1038–1047). Auckland, NZ.

Chasing dreams and recognising realities: teachers' responses to ICT

Doing a bit of reading of the literature, as preparation for the redevelopment of a 3rd year course helping pre-service teachers figure out how ICTs can be integrated into/transform their learning and teaching.

This is a summary of Underwood and Dillon (2011). The abstract is

The teaching profession’s response to the inexorable march of new technology into education has been a focus of research for some 30 years. Linked with the impact of ICT on measurable performance outcomes, teacher attitudes to technology and the impact on pedagogic practice have been central to that research, a research that has often seen teachers as a barrier, not a force for change. The current article brings together findings from a decade of studies that have explored the ways in which teaching staff have responded to the growing notion that ICT is a core part of the teaching toolkit. In doing so we question the simplistic stereotyping of Luddite teachers. Drawing on findings from rare, but crucially important, longitudinal projects the article discusses hopes and fears raised by teaching staff when confronted with changes to existing pedagogy, before moving on to explore issues such as the ‘technology dip’, how maturity modelling can inform our understanding of technological change in schools and ways forward for helping teaching staff to embed technology into their teaching. The article concludes with a discussion of why it is important that the educational system meets this challenge from a learner’s perspective

The paper gives some insights into lines of research in this field, though I’m not sure whether the paper actually delivers anything earth-shaking.

Are teachers the problem or the solution?

Starts with a quote from Aviram and Talmi (2004) that argues for the inevitability of an ICT revolution arising mostly out of the “omnipresence of ICT in our everyday lives”. Then proceeds with references suggesting it isn’t so inevitable:

  • Research showing most people don’t use the advanced features of technologies (e.g. mobile phones), which may explain the lack of readiness to use m-learning despite heavy use of mobiles.
  • Technology “often fits uncomfortably with teachers’ professional judgements”.
  • Technologies that move teachers outside their comfort zone have slower take up and higher rejection rates (Watson 2001).
  • Jamieson-Proctor et al. (2006) note that teachers want to enhance the current curriculum rather than transform it with ICT – a reluctance to go beyond familiar practices – arguing teaching is a conservative profession resistant to change. This leads to the conclusion that positive impacts are more likely when based on existing pedagogical practice, with the example of the IWB given – though there is some mention of teacher unease around this.

Comment: This is human nature 101. People are pattern matching intelligences. Anything that goes outside their established patterns doesn’t really register. It gets transformed into what they know, or ignored. Changing this is difficult, but then that is the nature of learning. Learning anything new is difficult. Which is one reason why I think an ICTs course for pre-service teachers has to try and engage pre-service teachers in the use of ICTs in new and interesting ways and challenge them to re-think their “existing pedagogical practice” AND show them how ICTs can be used to support their existing practice.

Underwood and Dillon (2011) continue with the idea that it is more than professional practice; the type of people who go into teaching may also be a factor: “If teachers, as a group, are inherently low technology users compared to the general population, does this mean there is a natural resistance to the embedding of technology into the educational processes and practices?” Limited references are given to support this view, and they then suggest there is evidence that the profession is actually more constructive than this, if a little cautious.

Comment: If “teachers, as a group” are prone to low technology use, what might this say about teacher educators?

Three ways forward

  1. A minimum emphasis on technology – a laissez-faire approach.
    Argues this is not an acceptable alternative as it leads to a digital underclass.
  2. Bend the technology to the system.
    Argues that there are some benefits, but also leads to an impoverished world in digital terms.
  3. Merge and evolve.
    So we need a merger of technology and education and then an evolution – an evolution that requires a skilled teaching workforce.

The first two “options” tend to remind me of Cnut the Great (King Canute), but even the 3rd option suggests Cnut to me. As if the education system will have the ability to pick and choose what is merged. Who in the education system makes this decision? Is it the government (which level?), the principals and school leadership, or teachers? How do these folk propose to stop students using ICTs any way they wish? How do principals/school leadership propose to stop teachers using the ICTs they have in their pocket to teach better? (and so on up the chain). The evolution will happen; the question of anyone being able to control it is much more open.

The Technology Dip

Suggesting that change is not linear, but arises from a set of complex, interacting influences. Change takes a long time and the “grammar of school” is a barrier (p. 321):

“To truly embed change we have to unveil the hidden mechanisms that rule school first”

Argues that there is a “technology dip”, claiming “unequivocal confirmation of the existence of, and recovery from, the ‘technology dip’”. From Somekh et al. (2004): school performance on national tests dipped in the years following the introduction of resources into the Test Bed schools, but the research shows swift and strong recoveries post dip.

Comment: The graphs demonstrating this seem somewhat light on statistical detail. There is also the question of whether national tests are a good measure.

Talks a bit about the assimilation of technology into the grammar of school, but notes that this uninspiring use of ICTs can often hide more interesting changes.

Mentions Crook et al. (2010) and their in-depth case studies of 85 teachers, finding that ICT is largely used to support expository, construction and search activities.

There’s a bit of talk about VLEs and maturity models. The purpose wasn’t obvious.

Comes back to this:

A consensus that emerges from much of the research on teacher responses to technology is that perceived usefulness is the most influential predictor of satisfaction and intention to continue e-learning usage.

with a range of research supporting it.

Ends by identifying the need to bridge the gap between current concepts of learning/schooling and the demand for flexible thinkers and “debatable citizens” (a strange term).

References

Underwood, J., & Dillon, G. (2011). Chasing dreams and recognising realities: teachers’ responses to ICT. Technology, Pedagogy and Education, 20(3), 317–330. doi:10.1080/1475939X.2011.610932

The illusion we understand the past fosters overconfidence in our ability to predict the future

As mentioned in the last post I’m currently reading Thinking, Fast and Slow by Daniel Kahneman. The title of this post comes from this quote from that book

The illusion that we understand the past fosters overconfidence in our ability to predict the future

Earlier in the same paragraph Kahneman writes

As Nassim Taleb pointed out in The Black Swan, our tendency to construct and believe coherent narratives of the past makes it difficult for us to accept the limits of our forecasting ability.

Later in the same chapter, Kahneman writes (my emphasis)

The main point of this chapter is not that people who attempt to predict the future make many errors; that goes without saying. The first lesson is that errors of prediction are inevitable because the world is unpredictable. The second is that high subjective confidence is not to be trusted as an indicator of accuracy (low confidence could be more informative).

The connection to e-learning and the LMS

I read this section of Kahneman’s book while at lunch. On returning I found that @sthcrft had written about “The post-LMS non-apocalypse” in part influenced by @timklapdor’s post from earlier this week Sit down we need to talk about the LMS.

In her post, @sthcrft tries (and by her own admission somewhat fails) to describe what a “post-LMS” world might look like. She’s being asked to predict the future, which, given the above (and a range of other perspectives), is a silly thing to try to do. And this is my main problem with the current top-down, “management science” driven approach being adopted by universities: an approach that is predicated on the assumption that you can predict the future. But, before moving on to management, let’s just focus on the management of IT systems and the LMS.

(About to paraphrase some of my own comments on @sthcrft’s post).

I have a problem with the LMS as a product model. It has serious flaws. But in seeking to replace the LMS, most universities are continuing to use the same process model. The plan-driven process model that underpins all enterprise information systems procurement/development assumes you can predict the future – in this case, that you can predict all of the features that are ever going to be required by all of the potential users of the system.
Not going to happen.

Even though I like @timklapdor’s idea of the environment as a much better product model, it will suffer from exactly the same problems if it is developed/implemented without changing the process model and all that it brings with it. The focus on the plan-driven process model ends up with hierarchical organisations with the wrong types of people/roles and the wrong types of inter-connections between them to deal with the “post-LMS” world.

This is one of the reasons why I don’t think the adoption of open source LMSes (e.g. Moodle) is going to result in any significant changes in the practice of e-learning.

This is the point I will try to make in a 2012 ASCILITE paper. In that same paper, I’ll briefly touch on an alternative. For the longer version of that story – made significantly inaccessible through the requirements of academic writing – see my thesis.

Management and narratives

On a related note, a conversation with a colleague today reinforced the idea that one of the primary tasks taken on by senior managers (e.g. Vice-Chancellors) of a certain type is the creation of coherent narratives. Creating a positive narrative of the institution and its direction and accomplishments seems to have become a necessary tool to demonstrate that the senior manager has made a positive contribution to the institution. It’s a narrative destined to please all stakeholders, perhaps especially the senior manager’s set of potential employers.

I wonder about the cause/impact that this increasing emphasis on a coherent institutional narrative has on the belief of those within organisations that you can and should predict the future. And I wonder if this type of narrative is preventing organisations from acting on Alan Kay’s maxim:

The best way to predict the future is to make it

Perhaps organisations with certain types of leaders are so busy focusing on predicting the future that they can’t actually make it? Perhaps, for them, management is all about constructing coherent narratives.

The core problem with learning analytics?

Had some time this morning to read Analytics in Higher Education: Benefits, Barriers, Progress and Recommendations from the EDUCAUSE Center for Applied Research. It’s a report on the results of a survey and seven focus groups (involving IT and Institutional Research – IR – people from universities) that attempt to “address the topics of defining analytics, identifying challenges, and proposing solutions” (Bichsel, 2012, p. 5). The following uses the quotes from this report that really struck me to identify and suggest what might be the core problem I have with learning analytics: the assumption of rationality.

The report uses this working definition for analytics

Analytics is the use of data, statistical analysis and explanatory and predictive models to gain insights and act on complex issues

The benefit of analytics

In talking about the benefits of analytics, the report says (Bichsel, 2012, p. 12):

Others mentioned how a systematic use of analytics increases the likelihood that faculty, staff, and particularly administrators will base their decisions on data rather than on intuition or conventional wisdom

Ahh, our saviour. Data. The enabler of rationality?

This does seem to be one of the arguments rolled out for moving to learning analytics. Let’s make our decisions based on data, rather than our beliefs.
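To make the “predictive models” part of the working definition concrete, here’s a minimal, entirely hypothetical sketch: a logistic regression that flags “at-risk” students from LMS activity counts. The features, numbers and cutoff are all invented for illustration – this is not from the report, just the kind of model the definition points to.

```python
# Hypothetical sketch of a learning analytics "predictive model".
# All data is invented; a real system would use actual LMS activity
# logs and enrolment outcomes.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented features per student: [logins/week, forum posts, resources viewed]
X = np.array([
    [12, 5, 40],
    [2, 0, 3],
    [8, 2, 25],
    [1, 1, 5],
    [15, 7, 60],
    [3, 0, 8],
])
# Invented outcome: 1 = passed the course, 0 = did not
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Score two new (also invented) students and flag those below a cutoff
new_students = np.array([[2, 1, 6], [10, 4, 30]])
pass_prob = model.predict_proba(new_students)[:, 1]
for i, p in enumerate(pass_prob):
    print(f"student {i}: P(pass) = {p:.2f}", "(flag)" if p < 0.5 else "")
```

Note that even in this toy version the model only produces a number. Choosing the cutoff, and deciding what to do about a flagged student, is exactly the “experience and wisdom” the focus group participant describes below.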

The problem

I’m currently reading Thinking, Fast and Slow by Daniel Kahneman. The book does a good job of explaining some of the numerous flaws in the decision making capabilities of human beings. A number of the flaws the book covers have to do with how exceedingly poor human beings are at statistical thinking and drawing inferences from data.

It is not reading destined to make you believe that “data-based decision making” is going to save us.

One of the participants in the focus groups seems to make this point

Data are not going to give you a decision. It is your experience and wisdom – that’s what leads you to make decisions. Data are supposed to be a rational approach that provides you a solid foundation for your thinking so that when somebody questions you, you say that there is a rationale behind my data, but still the decision comes from you. The human brain has to make the decision, not an analytical tool. You should have a number of years experience, background and wisdom before you make the decision, but you need to have the data to help you. It’s just a tool, not an end in itself.

In the end, it will be people that draw conclusions and make decisions based on what analytics will show them. Human beings that have bounded rationality, competing agendas etc.

But there’s also another problem with this, which is summed up by this quote from a blog post by danah boyd:

Just because you see traces of data doesn’t mean you always know the intention or cultural logic behind them. And just because you have a big N doesn’t mean that it’s representative or generalizable.

Engage in the complex system

Returning to the EDUCAUSE working definition:

Analytics is the use of data, statistical analysis and explanatory and predictive models to gain insights and act on complex issues

I don’t like the trend in learning analytics which reduces it to reports – reports that are analysed by experts and managers divorced from the environment in which the complex issues arise, with the assumption of rationality guiding decision making.

Any promise held by analytics would seem to arise from actually using it to engage in the complex issues, observe what happens and learn from that.

One example of industrial e-learning as "on the web" not "of the web"

The following arises from some recent experiences with the idea of “minimum course sites” and this observation from @cogdog in this blog post

I have no idea if this is off base, but frankly it is a major (to me) difference of doing things ON the web (e.g. putting stuff inside LMSes) and doing things OF the web.

It’s also a query to see if anyone knows of an institution that has implemented a search engine across its institutional e-learning systems in a way that effectively allows users to search for resources in a course-centric way.

The symptom

There’s a push on from my current institution’s central L&T folk to develop a minimum course site standard – some minimum set of services, buttons etc. that will achieve the nirvana of consistency. Everything will be the same.

The main espoused reason why this is a good thing is that the students have been asking for it. There has been consistent feedback from students that none of the course sites are the same.

The problem

Of course, the real problem isn’t that students want everything to be the same. The real problem is that they can’t find what they are looking for. Sure, if everything was the same then they might have some ideas about where to find things, but that has problems including:

  • The idea that every course site at a university can be structured the same is a misunderstanding of the diversity inherent in courses, especially as people try to move away from traditional models such as lecture/tutorial etc.
  • The idea that one particular structure will be understandable/appropriate to all people is also questionable.
  • Even if all the sites are consistent and this works, it won’t solve the problem of when a student is working on a question about “universal design” and wants to find where that was mentioned amongst the many artefacts in the course site.

The solution

The idea that the solution to this problem is to waste huge amounts of resources in the forlorn attempt to achieve some vaguely acceptable minimum standard that is broadly applicable seems to be a perfect example of “doing things ON the web, rather than doing things OF the web”.

I can’t remember the last time I visited a large website and attempted to find some important information by navigating through the site structure. Generally, I – like I expect most people – come to a large site almost directly to the content I am interested in either through a link provided by someone or via a search engine.
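As a rough, hedged illustration of that alternative, here’s a toy inverted index over some invented course page text – a sketch of the idea of course-wide search, not a claim about how any institution (or Moodle) implements it. A real version would need to crawl the LMS, respect permissions, rank results, and so on.

```python
# Toy sketch of course-wide search: an inverted index over
# hypothetical course pages. Page names and text are invented.
import re
from collections import defaultdict

pages = {
    "week3/lecture": "Universal design for learning and assessment tasks",
    "week5/reading": "Accessibility, universal design and inclusive pedagogy",
    "week1/intro": "Course outline, assessment overview and contacts",
}

# Map each word to the set of pages containing it
index = defaultdict(set)
for page, text in pages.items():
    for word in re.findall(r"\w+", text.lower()):
        index[word].add(page)

def search(query):
    """Return the pages containing every word of the query."""
    results = None
    for word in re.findall(r"\w+", query.lower()):
        pages_with_word = index.get(word, set())
        results = pages_with_word if results is None else results & pages_with_word
    return results or set()

# The student's "universal design" question from the list above
print(search("universal design"))  # e.g. {'week3/lecture', 'week5/reading'}
```

That’s the student’s “universal design” problem solved, in sketch form, in a couple of dozen lines – which is rather the point: search is cheap and well understood OF the web, yet seems hard to retrofit ON to institutional e-learning systems.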

Broader implications

To me the idea of solving this problem through minimum standards is a rather large indication of the shortcomings of industrial e-learning. Industrial e-learning is the label I’ve applied to the current common paradigm of e-learning adopted by most universities. It’s techno-rational in its foundations and involves the planned management of large enterprise systems (be they open source or not). I propose that “industrial e-learning” is capable of, and concerned primarily with, “doing things ON the web, rather than doing things OF the web”.

Some potential contributing factors might include:

  1. Existing mindsets.
    At this institution, many of the central L&T folk come from a tradition of print-based distance education where consistency of appearance was a huge consideration. Many of these folk are perhaps not “of the web”.
  2. Limitations of the tools.
    It doesn’t appear that Moodle has a decent search engine, which is not surprising given the inspiration of its design and its stated intent of not being an information repository.
  3. The nature of industrial e-learning, its product and process.
    A key characteristic of industrial e-learning is a process that goes something like this
    1. Spend a long time objectively selecting an appropriate tool.
    2. Use that tool for a long time to recoup the cost of moving to the new tool.
    3. Aim to keep the tool as vanilla as possible to reduce problems with upgrades from the vendor.
      This applies to open source systems as much as proprietary systems.
    4. Employ people to help others learn how to best use the system to achieve their ends.
      Importantly, the staff employed are generally not there to help others learn how to “best achieve their ends”; the focus definitely tends to be on how to “best use the system to achieve their ends”.
    5. Any changes to the system have to be requested through a long process that involves consensus amongst most people and the approval of the people employed in point 4.

    This means that industrial e-learning is set up to do things the way the chosen systems work. If you have to do something that isn’t directly supported by the system, it’s very, very hard. e.g. add a search engine to Moodle.

All of these make it very hard for industrial e-learning to be “doing things OF the web”.
