Assembling the heterogeneous elements for digital learning

The emperor has no clothes – why is the learning and teaching performance fund naked

Vilhelm Pedersen illustration for Andersen's 'Emperor's New Clothes'

The Australian Federal Government has a Learning and Teaching Performance Fund (LTPF) that is meant to allocate money to Australian universities based on the quality of, and/or improvement in, their learning and teaching.

Based on what I know of this approach, I think it is fundamentally broken. It’s probably a case of “the emperor has no clothes”, i.e. Australian universities know it is broken, but can’t point it out because they want to get the money.

The fund and how it works

According to this story in The Australian newspaper, the fund allocated A$73 million this year. The administrative information for providers document outlines the process for 2009. The process has changed over the three years it has been run.

There are two data sources used by the fund:

• Australian Graduate Survey; and
  All Australian university graduates receive a survey in the months after they graduate which asks them two broad sets of questions: are they working, and in what; and how satisfied were they with their study/university. The LTPF uses two sets of indicators from this survey:
  • Student satisfaction indicators
    • Satisfaction with generic skills
    • Satisfaction with good teaching
    • Overall satisfaction
  • Outcome indicators
    • Full-time employment
    • Further full-time and part-time study
• Higher Education Student Collection.
  These statistics are used in the LTPF to examine progress rates amongst bachelor students and the retention rate for the same students.

The process used goes something like this:

• An adjustment process is applied to the raw indicators data (a hypothetical sketch of what such an adjustment might look like follows this list).
• Each university gets a package describing details of the findings for their students.
• The university provides a submission offering information that may explain some of the results.
• An expert panel looks at the information, provides advice to the government and back to the institutions, and generally ensures the process is effective.
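
I don’t know the detail of the adjustment process that is actually applied to the LTPF indicators. Purely to make that first step concrete, here’s a minimal sketch, assuming nothing more sophisticated than standardising each institution’s raw indicator value against the sector average. The institutions and figures are invented; this is not the LTPF methodology.

```python
# Hypothetical sketch only: standardise a raw satisfaction indicator
# against the sector mean so institutions are compared as "distance from
# the sector average" rather than as raw percentages.
from statistics import mean, stdev

# Invented example data: overall satisfaction (%) for a handful of institutions.
raw_scores = {
    "Uni A": 78.2,
    "Uni B": 81.5,
    "Uni C": 74.9,
    "Uni D": 80.1,
}

sector_mean = mean(raw_scores.values())
sector_sd = stdev(raw_scores.values())

# Express each institution's result as a z-score relative to the sector.
adjusted = {uni: (score - sector_mean) / sector_sd
            for uni, score in raw_scores.items()}

# Print institutions from most above to most below the sector average.
for uni, z in sorted(adjusted.items(), key=lambda item: item[1], reverse=True):
    print(f"{uni}: {z:+.2f}")
```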

The trouble is that I think the majority of the data at the foundation of this process is less than reliable.

Why is it broken

A couple of weeks ago I published a post titled “Somethings that are broken with the evaluation of university teaching”. Essentially it is a collection of links pointing out that “level 1 smile sheets” (surveys that ask learners “were you satisfied?”) have significant and well-known limitations in validity and value. In particular, the following quote is from this article:

    In some instances, there is not only a low correlation between Level I and subsequent levels of evaluation, but a negative one.

From my perspective the Course Experience Questionnaire (the part of the Australian Graduate Survey that asks about student satisfaction) essentially takes the “level 1 smile sheet” approach and applies it to a graduate’s entire university experience. I don’t see this move to a broader area of coverage (the whole university experience, up from an individual course/unit/subject) helping address the concerns about “level 1 smile sheets”. In fact, I see it making them much worse.

The course experience is likely to cover at least three years. For some part-time students it might be as much as 6, 9 or more years. Do we really believe that their experience over the last 6 to 12 months isn’t going to overshadow their earlier experience and be far fresher in their minds?

There are also problems with the Graduate Destination Survey – the part of the survey that asks graduates what they are doing now. This article points out some of the limitations.

To some extent, having institutions comment on the data before the expert panel examines it might address some of this. But I don’t think that goes any way towards addressing the significant limitations of this form of evaluation.

Solutions

So if it’s broken, what are the alternatives?

I don’t know. It’s a difficult question and I don’t have the knowledge, experience or time to recommend a solution. Given the nature of this problem, I’m not even sure that there is a single correct solution.

Based on what I know, some suggestions I think might be worth more consideration include:

• Measure fit for context/purpose, not comparison.
  To some extent the LTPF process acknowledges that comparing the quality of learning and teaching across all of the diversity of the Australian higher education sector is extremely difficult. So why continue to try to do it? Why not focus on how well the learning and teaching works for its given context? Measure fit for purpose, not comparison against others. That said, any form of measurement has some potential downsides and negative outcomes.
• Concentrate on improvement.
  This is somewhat similar to the last point. It’s also linked to one of my common sayings, “It’s not how bad you start, but how quickly you get better”. Rather than measure fit for context, measure and reward how much better the learning and teaching at a particular institution has become (a hypothetical sketch of this idea follows the list). This also has some potential negative consequences.
• Use other forms of evaluation/data.
  This article and, I’m sure, many other places talk about additional forms of evaluation beyond level 1 smile sheets. Dave Snowden also has some interesting approaches to evaluation which might apply.
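
To make the “concentrate on improvement” suggestion a little more concrete, here’s another purely hypothetical sketch that ranks institutions on the change in a single indicator between two years, rather than on its absolute value. Again, the institutions and figures are made up and this is not an existing scheme.

```python
# Hypothetical sketch only: reward improvement rather than absolute position,
# by ranking institutions on the change in an indicator between two years.
previous_year = {"Uni A": 72.0, "Uni B": 81.0, "Uni C": 65.0}
current_year = {"Uni A": 74.5, "Uni B": 81.5, "Uni C": 70.0}

# Change in the indicator, in percentage points, for each institution.
improvement = {uni: current_year[uni] - previous_year[uni]
               for uni in previous_year}

# Ranked by improvement, Uni C (the lowest absolute score) comes out on top.
for uni, delta in sorted(improvement.items(), key=lambda item: item[1], reverse=True):
    print(f"{uni}: {delta:+.1f} percentage points")
```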

In general, rather than wringing hands over how difficult this is, spending inordinate amounts of time reflecting on how hard it is and arguing over which of many options is best, spending lots of money on consultants who will push their own barrow at the expense of any knowledge of the local context, leaping at the latest fad, or saying it’s all too hard because we can’t change the government’s mind, I would suggest an organisation attempt a lot of safe-fail probes to investigate different potential solutions. Apply the lessons learned to improve evaluation within the institution and then promote it amongst the sector.
