Assembling the heterogeneous elements for (digital) learning


Early steps in developing a design system/model for Professional Learning Opportunities

A big responsibility for the new team I work with is the design, implementation and revision of Professional Learning Opportunities (PLOs) for teaching staff at our current institution. The PLO term has been gifted to us as part of the restructure process/documents that created the team. It’s a term I quite like since I’ve chosen to interpret it as covering a huge range of possibilities beyond just face-to-face, synchronous, physical professional development. This is good because the team has been charged with doing something different.

This post is part of the process of coming up with something different. It links our thinking with some work being done elsewhere and is an attempt to think about what else we can add. This post is also an example of the team walking the walk, i.e. if we’re aiming to help teaching staff become open and connected educators, then we need to be operating in ways that are open and connected.

Untethered Faculty Development – a starting “recipe”

A few weeks ago I stumbled across the idea of untethered faculty development from the folk at Teaching and Learning Innovations at CSU Channel Islands. It had some strong resonances with what we’d been talking about, but also provided a concrete example. (For further inspiration, it appears that the whole institution had adopted a Domain of One’s Own approach with Reclaim Hosting that was embedded within professional learning practice.)

Then this week I stumbled across this 12-minute online presentation, which offered some further insight into the why and how of untethered faculty development, including an explanation of the following breakdown of what they’ve done to untether faculty development from the constraints of synchronous, face-to-face delivery.

This is described as a “recipe” that is provided to all facilitators and is organised around what happens before, during, and after a session.

Before

  • Thick invitation: an email with links & additional information for those who can’t attend. It’s a PLO in itself.
  • Develop resource site: an online site with all resources for the PLO.
  • Create dynamic agenda: a Google doc where people can ask questions etc. prior to the start.
  • Offer remote participation: use of Zoom to allow remote participation.

During

  • All materials digital.
  • Engage remote participation.
  • Use dynamic agenda.
  • Collaborative notes: can be combined with the dynamic agenda; a place where participants can share their notes/thoughts.
  • Record, where possible.

After

  • Finalise recording.
  • Refresh resource site: use the collaborative notes and other discussions to improve the resource site.
  • Write & share blog post: all facilitators are asked to write a blog post that links to the resource site.
  • Follow-up communication: almost a “thick conclusion” to the session.

What else?

We’ve already started doing aspects of this. For example, here’s the resource site for the 2017 teaching orientation. (It’s hosted on my blog because we didn’t have a space. We have just taken ownership of a WordPress site within our institution where we’ll be starting work.)

Even this limited practice includes a few additional steps, for example:

  1. Create a short URL for each resource site.

    e.g. http://bit.ly/2017orient is the short URL for the 2017 teaching orientation. This is so people can write down a URL and later find the resource site. (One way this step might be scripted is sketched just after this list.)

    With a thick invitation, this might not be needed. But then again, people forget, and perhaps during the session they want to visit the resource site.

  2. Add an evaluation step to after.

    e.g. the 2017 teaching orientation resource site links to the results of a simple evaluation of the session.

    I imagine the CSU-CI folk evaluate their PLOs. The absence of this step probably says more about its connection to the idea of untethering faculty development.
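
As flagged in the first step above, creating the short URLs is something that could even be scripted. The following is a minimal sketch only: it assumes Python with the requests library, the Bitly v4 API, and a valid access token in a BITLY_TOKEN environment variable, and the long URL shown is just a placeholder. It isn’t how we actually create the short URLs; the bit.ly website does the job.

```python
# Hypothetical sketch: shorten a resource site URL via the Bitly v4 API.
# Assumes the BITLY_TOKEN environment variable holds a valid access token.
import os

import requests


def shorten(long_url: str) -> str:
    """Return a bit.ly short URL for the given long URL."""
    response = requests.post(
        "https://api-ssl.bitly.com/v4/shorten",
        headers={"Authorization": f"Bearer {os.environ['BITLY_TOKEN']}"},
        json={"long_url": long_url},
    )
    response.raise_for_status()
    return response.json()["link"]


if __name__ == "__main__":
    # Placeholder URL; the real resource sites live on our WordPress site.
    print(shorten("https://example.com/2017-teaching-orientation"))
```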

But what else could be added? What else should we do? What shouldn’t we do? These are questions being answered in this Google document by the team and anyone else who wants to. Feel free to add to the document.

Understanding systems conditions for sustainable uptake of learning analytics

My current institution is – like most other universities – attempting to make some use of learning analytics. The following uses a model of system conditions for sustainable uptake of learning analytics from Colvin et al (2016) to think about how/if those attempts might be enhanced. This is done by

  1. summarising the model;
  2. explaining how the model is “wrong”; and,
  3. offering some ideas for future work.

My aim here is mainly a personal attempt to make sense of what I might be able to do around learning analytics (LA) given the requirements of my current position. Requirements that include:

  • to better know my “learner”;

    In my current role I’m part of a team responsible for providing professional learning for teaching staff. My belief is that the better we know what the teaching staff (our “learners”) are doing and experiencing, the better we can help. A large part of the learning and teaching within our institution is supported by digital technologies. Meaning that learning analytics (LA) is potentially an important tool.

    How can we adopt LA to better understand teaching staff?

  • to help teaching staff use LA;

    A part of my work also involves helping teaching academics develop the knowledge/skills to modify their practice to improve student learning. A part of that will be developing knowledge/skills around LA.

    How can we better support the adoption of/development of knowledge/skills around LA by teaching staff?

  • increasing and improving research.

    As academics we’re expected to do research. Increasingly, we’re expected to be very pragmatic about how we achieve outcomes. LA is still (at least for now?) a buzz word, and since we have to engage with LA anyway, we may as well do research on it. We’ve also done a bit of this in the past, which needs building upon.

    How can we best make a contribution to research around LA?

The model

The following uses work performed by an OLT-funded project looking at student retention and learning analytics, a project that took a broader view and resulted in, among other things, the model drawn on below.

Given the questions I asked in the previous section and my current conceptions it appears that much of my work will need to focus on helping encourage the sustainable uptake of LA within my institution. Hence the focus here on that model.

The model looks like this.

Model of system conditions for sustainable uptake of LA (Colvin et al, 2016)

At some level the aim here is to understand what’s required to encourage educator uptake of learning analytics in a sustainable way. The authors define educator as (Colvin et al, 2016, p. 19)

all those charged with the design and delivery of the ‘products’ of the system, chiefly courses/subjects, encompassing administrative, support and teaching roles

The model identifies two key capabilities that drive “the flow rate that pushes and pulls educators along the educator uptake pipeline from ‘interested‘ to ‘implementing‘”. These are

  1. Strategic capability “that orchestrates the setting for learning analytics”, and
  2. Implementation capability “that integrates actionable data and tools with educator practices”.

There are two additional drivers of the “flow rate”

  1. Tool/data quality – the “tool or combination of tools that manage data inputs and generate outputs in the form of actionable feedback” (Colvin et al, 2016, p. 30).
  2. Research/learning – “the organisational learning capacity to monitor implementations and improve the quality of tools, the identification and extraction of underlying data and the ease of usability of the feedback interface” (Colvin et al, 2016, p. 30)

The overall aim/hope is to create a “reinforcing feedback loop” (Colvin et al, 2016, p. 30) between the elements acting in concert, which drives uptake. Uptake is accelerated by LA meeting “the real needs of learners and educators”.

How the model is “wrong”

All models are wrong, but some are useful (one explanation for why there are so many frameworks and models within education research). At the moment, I see the above model as useful for framing my thinking, but it’s also a little wrong. That’s to be expected.

After all, Box (1979) thought

it would be very remarkable if any system existing in the real world could be exactly represented by any simple model. (p. 202)

Consequently, given that Colvin et al (2016) identify the implementation of LA as a complex phenomenon “shaped by multiple interrelated dimensions traversing conceptual, operational and temporal domains…as a non-linear, recursive, and dynamic process…” (p. 22), it’s no great surprise that there are complexities not captured by the model (or by my understanding and representation of it in this post).

The aim here is not to argue that (or how) the model is wrong, nor to suggest places where the model should be expanded. Rather, the aim is to identify the complexities around implementation that aren’t visible in the model (but which may be in the report) and to use those to identify important/interesting/challenging areas for understanding and action, i.e. to think about the areas that interest me the most.

“Complexifying” educator uptake

The primary focus (shown within a green box) of the model appears to be encouraging the sustainable uptake of LA by educators. There are at least two ways to make this representation a bit more complex.

Uptake

Uptake is represented as a two-step process moving from Interested to Implementing. There seems to be scope to explore more broadly than just those two steps.

What about awareness? Arguably, LA is a buzz word and just about everyone may be aware of it. But are they? If they are aware, what is their conceptualisation of LA? Is it just a predictive tool? Is it even a tool?

Assuming they are aware, how many are actually already in the interested state?
I think @hazelj59 has done some research that might provide some answers about this.

Then there’s the 4 paths work that identifies at least two paths for implementing LA that aren’t captured here. These two paths involve doing it with (DIW) the educator, and enabling educator DIY. Rather than simply implementing LA, these paths see the teacher being involved with the construction of different LA, moving into the tool/data quality and research/learning elements of the model.

Educator

The authors define “educator” to include administrative, support and teaching roles, yet the above model includes all educators in the one uptake process. The requirements/foci/capabilities of these different types of roles are going to be very different, and some of these types of educators are largely invisible in discussions around LA, e.g. there are currently no moves to provide the type of LA that would be useful to my team.

And of course, this doesn’t even mention the question of the learner. The report does explicitly mention a focus on supporting student empowerment, drawing on a conception of learners that includes their need to develop agency, where LA’s role is to help students take responsibility for their learning.

Institutional data foundation: enabling ethics, privacy, multiple tools, and rapid innovation

While ethics isn’t mentioned in the model, the report does highlight ethical and privacy considerations as important.

When discussing tool/data quality the report mentions “an analytic tool or combination of tools that manage data inputs and generate outputs in the form of actionable feedback”. Given the complexity of LA implementation (see the above discussion) and the current realities of digital learning within higher education, it would seem unlikely that a single tool would ever be sufficient.

The report also suggests (Colvin et al, 2016, p. 22)

that the mature foundations for LA implementations were identified in institutions that adopted a rapid innovation cycle whereby small scale projects are initiated and outcomes quickly assessed within short time frames

Combined with the increasing diversity of data sources within an institution, these factors seem to suggest that having an institutional data foundation is a key enabler. Such a foundation could provide a common source for all relevant data to the different tools that are developed as part of a rapid innovation cycle. It might be possible to design the foundation so that it embeds institutional ethical, privacy, and other considerations.

Echoing the model, such a foundation wouldn’t need to be provided by a single tool. It might be a suite of different tools. However, the focus would be on encouraging the provision of a common data foundation used by tools that seek to manipulate that data into actionable insights.
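
To make that idea a little more concrete, here’s a minimal sketch (in Python) of what a common data foundation shared by multiple small tools might look like, with institutional ethics/privacy rules embedded in the foundation rather than re-implemented in each tool. Everything here (class names, the approved-purpose list, the sample data) is a hypothetical illustration of the idea, not anything proposed by Colvin et al (2016) or existing at my institution.

```python
# A hypothetical sketch of a "common data foundation": one shared access layer
# that embeds privacy/ethics rules, which any number of small, rapidly built
# learning analytics tools can query. All names and rules here are invented.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ActivityRecord:
    student_hash: str  # pseudonymised identifier, not the raw student id
    course: str
    event: str         # e.g. "forum_post", "quiz_attempt"
    week: int


class DataFoundation:
    """Single point of access to learning data for all tools built on top."""

    APPROVED_PURPOSES = {"course_improvement", "student_support"}

    def __init__(self, records: List[ActivityRecord]):
        self._records = records

    def query(self, purpose: str,
              predicate: Callable[[ActivityRecord], bool]) -> List[ActivityRecord]:
        # Ethical/privacy policy lives here, so every tool inherits it.
        if purpose not in self.APPROVED_PURPOSES:
            raise PermissionError(f"Purpose '{purpose}' is not approved")
        return [r for r in self._records if predicate(r)]


# One of potentially many small "rapid innovation" tools sharing the foundation.
def weekly_engagement(foundation: DataFoundation, course: str) -> Dict[int, int]:
    counts: Dict[int, int] = {}
    for record in foundation.query("course_improvement",
                                   lambda r: r.course == course):
        counts[record.week] = counts.get(record.week, 0) + 1
    return counts


if __name__ == "__main__":
    foundation = DataFoundation([
        ActivityRecord("a1b2", "EDU101", "forum_post", 1),
        ActivityRecord("c3d4", "EDU101", "quiz_attempt", 2),
    ])
    print(weekly_engagement(foundation, "EDU101"))  # {1: 1, 2: 1}
```

The point of the sketch is simply that the different tools never touch the raw data sources directly; they all go through the one foundation, which is where the ethics, privacy and data-quality work can accumulate.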

Rapid innovation cycle and responding to context

The report argues that the successful adoption of LA (Colvin et al, 2016, pp. 22-23)

is dependent on an institution’s ability to rapidly recognise and respond to organisational culture and the concerns of all stakeholders

and argues that

the sector can further grow its LA capacity by encouraging institutions to engage in similarly diffuse, small-scale projects with effective evaluation that quickly identifies sites of success and potential impact (p. 22)

This appears to be key, but how do you do it? How does an institution create an environment that actively encourages and enables these types of “small-scale projects with effective evaluation”?

My current institution has the idea of Technology Demonstrators, which appears to resonate somewhat with this. However, I’m not sure that this project has yet solved the problem of “effective evaluation” or of how/when to scale beyond the initial project.

Adding in theory/educational research

In discussing LA, Rogers et al (2015, p. 233) argue

that effective interventions rely on data that is sensitive to context, and that the application of a strong theoretical framework is required for contextual interpretation

Where does the “strong theoretical framework” come from, if not educational and related literature/research? How do you include this?

Is this where someone (or some group) needs to take on the role of data wrangler to support this process?

How do you guide/influence uptake?

The report assumes that once the elements in the above model are working in concert to form a reinforcing feedback loop, LA will increasingly meet the real needs of learners and educators, and that this will in turn accelerate organisational uptake.

At least for me, this begs the question: How do they know – let alone respond to – the needs of learners and educators?

For me, this harks back to why I perceive that the Technology Acceptance Model (TAM) is useless. TAM views an individual’s intention to adopt a particular digital technology as being most heavily influenced by two factors: perceived usefulness, and perceived ease of use. i.e. if the LA is useful and easy to use, then uptake will happen.

The $64K question is what combination of features of an LA tool will be widely perceived by educators to be useful and easy to use? Islam (2014, p. 25) identifies the problem as

…despite the huge amount of research…not in a position to pinpoint…what attributes…are necessary in order to build a high level of satisfaction and which…generate dissatisfaction

I’ve suggested one possible answer but there are sure to be alternatives and they need to be developed and tested.

The “communities of transformation” approach appears likely to have important elements of a solution. Especially if combined with an emphasis on the DIW and DIY paths for implementing learning analytics.

The type of approach suggested in Mor et al (2015) might also be interesting.

Expanding beyond a single institution

Given that the report focuses on uptake of LA within an institution, the model focuses on factors within the institution. However, no institution is an island.

There are questions around how an institution’s approach to LA can be usefully influenced and influence what is happening within the literature and at other institutions.

Future work

Framing this future work as research questions:

  1. How/can you encourage improvement in the strategic capability without holding up uptake?
  2. How can an institution develop a data foundation for LA?
  3. How to support rapid innovation cycles, including effective evaluation, that quickly identify sites of success and potential impact?
  4. Can the rapid innovation cycles be done in a distributed way across multiple teams?
  5. Can a combination of technology demonstrators and an institutional data foundation provide a way forward?
  6. How to support/encourage DIW and DIY approaches to uptake?
  7. Might an institutional data foundation and rapid innovation cycles be fruitfully leveraged to create an environment that helps combine learning design, student learning, and learning analytics? What impact might this have?

References

Box, G. E. P. (1979). Robustness in the Strategy of Scientific Model Building. In R. Launer & G. Wilkinson (Eds.), Robustness in Statistics (pp. 201–236). Academic Press.

Colvin, C., Wade, A., Dawson, S., Gasevic, D., Buckingham Shum, S., Nelson, K., … Fisher, J. (2016). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching. Retrieved from http://he-analytics.com/wp-content/uploads/SP13-3249_-Master17Aug2015-web.pdf
