The following is a summary of, and some ad hoc thoughts on, Macfadyen et al (2014).
There’s much to like in the paper. But the basic premise I see in the paper is that the fix for the problems of the current inappropriate teleological processes used in institutional strategic planning and policy setting is an enhanced/adaptive teleological process. The impression I take from the paper is that it’s still missing the need for institutions to enable actors within them to make greater use of ateleological processes (see Clegg, 2002). Of course, Clegg goes on to do the obvious and develop a “dialectical approach to strategy” that merges the two extremes.
Is my characterisation of the adaptive models presented here appropriate?
I can see very strong connections between the arguments this paper makes about institutions and learning analytics and the reasons why I think e-learning is a bit like teenage sex.
But given the problems with “e-learning” (i.e. most of it isn’t much good in pedagogical terms), what does that say about the claim that we’re in an age of “big data” in education? If the pedagogy of most e-learning is questionable, is the data being gathered any use?
Conflating “piecemeal” and “implementation of new tools”
The abstract argues that there must be a shift “from assessment-for-accountability to assessment-for-learning” and suggests that it won’t be achieved “through piecemeal implementation of new tools”.
It seems to me that this is conflating two separate ideas:
- piecemeal; and,
i.e. unsystematic or partial measures. The shift can’t happen bit-by-bit; instead it has to proceed at the whole-of-institution level. This is the necessary step in the argument that institutional change is (or must be) involved.
One of the problems I have with this is that if you are thinking of educational institutions as complex adaptive systems, then they are the type of system where a small (i.e. piecemeal) change could potentially (but not always) have a large impact. In a complex system, a few very small, well-directed changes may have a large impact. Or alternatively, picking up on ideas I’ve heard from Dave Snowden, implementing large numbers of very small projects and observing the outcomes may be the only effective way forward. By definition, a complex system is one where being anything but piecemeal may be an exercise in futility, as you can never fully understand a complex system, let alone guess the likely impacts of proposed changes.
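As a toy illustration of that sensitivity (mine, not the paper’s): in even a simple non-linear system, two trajectories that differ by a tiny “piecemeal” perturbation can end up in very different places. The logistic map is the standard minimal example.

```python
# Toy illustration (not from the paper): sensitive dependence on initial
# conditions in a simple non-linear system. Two runs of the logistic map
# differ only by a "piecemeal" perturbation of one part in a million,
# yet end up in very different states.

def logistic_map(x0, r=3.9, steps=50):
    """Iterate x -> r*x*(1-x), a classic model of chaotic dynamics."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

baseline = logistic_map(0.500000)
perturbed = logistic_map(0.500001)  # the small, well-directed change
print(baseline, perturbed)          # the two trajectories have diverged
```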
The paper argues that systems of any type are stable and resistant to change. There’s support for this argument. I need to look for dissenting voices and evaluate.
- implementation of new tools.
i.e. the “build it and they will come” approach won’t work. Which I think is the real problem, and is indicative of the sort of simplistic planning processes that the paper argues against.
These are two very different ideas. I’d also argue that while these alone won’t enable the change, they are both necessary for it. I’d argue further that institutional change (by itself) is also unlikely to achieve the type of cultural change required. The argument presented in seeking to explain “Why e-learning is a bit like teenage sex” is essentially this: institutional attempts to enable and encourage change in learning practice toward e-learning fail because they are too focused on institutional concerns (large-scale strategic change) and not enough on enabling elements of piecemeal growth (i.e. bricolage).
The Reusability Paradox and “at scale”
I also wonder about considerations raised by the reusability paradox in connection with statements like (emphasis added) “learning analytics (LA) offer the possibility of implementing real-time assessment and feedback systems and processes at scale”. Can the “smart algorithms” of LA marry the opposite ends of the spectrum – pedagogical value and large-scale reuse? Can the adaptive planning models bridge that gap?
Abstract
In the new era of big educational data, learning analytics (LA) offer the possibility of implementing real-time assessment and feedback systems and processes at scale that are focused on improvement of learning, development of self-regulated learning skills, and student success. However, to realize this promise, the necessary shifts in the culture, technological infrastructure, and teaching practices of higher education, from assessment-for-accountability to assessment-for-learning, cannot be achieved through piecemeal implementation of new tools. We propose here that the challenge of successful institutional change for learning analytics implementation is a wicked problem that calls for new adaptive forms of leadership, collaboration, policy development and strategic planning. Higher education institutions are best viewed as complex systems underpinned by policy, and we introduce two policy and planning frameworks developed for complex systems that may offer institutional teams practical guidance in their project of optimizing their educational systems with learning analytics.
Introduction
First para is a summary of all the arguments for learning analytics
- awash in data (I’m questioning)
- now have algorithms/methods that can extract useful stuff from the data
- using these methods can help make sense of complex environments
- education is increasingly complex – increasing learner diversity, reducing funding, increasing focus on quality and accountability, increasing competition
- not using the data is no longer an option
It also includes a quote from a consulting company promoting FOMO/falling behind if you don’t use it. I wonder how many different fads they’ve said that about?
Second para explains what the article is about – “new adaptive policy and planning approaches….comprehensive development and implementation of policies to address LA challenges of learning design, leadership, institutional culture, data access and security, data privacy and ethical dilemmas, technology infrastructure, and a demonstrable gap in institutional LA skills and capacity”.
But this is based on the idea of universities as complex adaptive systems, and the view that “simplistic approaches to policy development are doomed to fail”.
Assessment practices: A wicked problem in a complex system
Assessment is important. Demonstrates impact – positive and negative – of policy. Assessment still seen too much as focused on accountability and not for learning. Diversity of stakeholders and concerns around assessment make substantial change hard.
“Assessment practice will continue to be intricately intertwined both with learning and with program accreditation and accountability measures.” (p. 18). NCLB (No Child Left Behind) is used as an example of the problems this creates, and Goodhart’s law gets a mention.
Picks up the ongoing focus on “high-stakes snapshot testing” to provide comparative data. Mentions that
Wall, Hursh and Rodgers (2014) have argued, on the other hand, that the perception that students, parents and educational leaders can only obtain useful comparative information about learning from systematized assessment is a false one.
But it also suggests that learning analytics may offer a better approach, citing Wiliam (2010).
Identifies the need to improve assessment practices at the course level. Various references.
Touches on the difficulties in making these changes. Mentions wicked problems and touches on complex systems
As with all complex systems, even a subtle change may be perceived as difficult, and be resisted (Head & Alford, 2013).
But doesn’t pick up the alternate possibility that a subtle change that might not be seen as difficult could have large ramifications.
Learning analytics and assessment-for-learning
This paper is part of a special issue on LA and assessment. Mentions other papers that have shown the contribution LA can make to assessment.
Analytics can add distinct value to teaching and learning practice by providing greater insight into the student learning process to identify the impact of curriculum and learning strategies, while at the same time facilitating individual learner progress (p. 19)
The argument is that LA can help both assessment tasks: quality assurance, and learning improvement.
Technological components of the educational system and support of LA
The assumption is that there is a technological foundation for storing, managing, visualising and processing big educational data. There’s a need for more than just the LMS. Need to mix it all up, and thus “institutions are recognizing the need to re-assess the concept of teaching and learning space to encompass both physical and virtual locations, and adapt learning experiences to this new context (Thomas, 2010)” (p. 20). Add to that the rise of multiple devices etc.
Identifies the following requirements for LA tools (p. 21) – emphasis added
- Diverse and flexible data collection schemes: Tools need to adapt to increasing data sources, distributed in location, different in scope, and hosted in any platform.
- Simple connection with institutional objectives at different levels: information needs to be understood by stakeholders with no extra effort. Upper management needs insight connected with different organizational aspects than an educator. User-guided design is of the utmost importance in this area.
- Simple deployment of effective interventions, and an integrated and sustained overall refinement procedure allowing reflection
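The first of those requirements lends itself to a concrete sketch. Here’s a minimal illustration (my own construction, assuming nothing about actual LA tools) of a pluggable data-collection scheme where new sources can be added without touching the pipeline:

```python
# A minimal sketch (mine, not the paper's) of "diverse and flexible data
# collection schemes": a common interface that lets new data sources be
# plugged in without changing the collection pipeline.
from abc import ABC, abstractmethod
from typing import Iterable


class EventSource(ABC):
    """Any platform (LMS, library system, video server, ...) that emits learning events."""

    @abstractmethod
    def events(self) -> Iterable[dict]:
        ...


class LMSEventSource(EventSource):
    """Hypothetical adapter; the event fields are illustrative only."""

    def events(self) -> Iterable[dict]:
        yield {"student": "s1", "action": "viewed_page", "source": "lms"}


def collect(sources: Iterable[EventSource]) -> list[dict]:
    # Aggregate events from however many sources an institution has;
    # adding a new source means adding a new adapter, not changing this.
    return [event for source in sources for event in source.events()]


print(collect([LMSEventSource()]))
```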
Some nice overlaps with the IRAC framework here.
It does raise interesting questions about what institutional objectives actually are. Even more importantly, how easy is it (or isn’t it) to identify what those are and what they mean at the various levels of the institution?
Interventions

An inset talks about the sociotechnical infrastructure for LA. It mentions the requirement for interventions. (p. 21)
The third requirement for technology supporting learning analytics is that it can facilitate the deployment of so–called interventions, where intervention may mean any change or personalization introduced in the environment to support student success, and its relevance with respect to the context. This context may range from generic institutional policies, to pedagogical strategy in a course. Interventions at the level of institution have been already studied and deployed to address retention, attrition or graduation rate problems (Ferguson, 2012; Fritz, 2011; Tanes, Arnold, King, & Remnet, 2011). More comprehensive frameworks that widen the scope of interventions and adopt a more formal approach have been recently proposed, but much research is still needed in this area (Wise, 2014).
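To make that definition concrete, here’s a rough sketch (my construction, not the paper’s) of how an intervention and its range of contexts might be represented:

```python
# Rough sketch (mine, not the paper's): an intervention is "any change or
# personalization introduced in the environment to support student
# success", with a context ranging from institutional policy down to
# pedagogical strategy in a single course.
from dataclasses import dataclass
from enum import Enum


class Scope(Enum):
    INSTITUTION = "institution"   # e.g. retention/attrition initiatives
    COURSE = "course"             # e.g. pedagogical strategy in a course
    INDIVIDUAL = "individual"     # personalization for a single student


@dataclass
class Intervention:
    description: str       # the change/personalization introduced
    scope: Scope           # the context the change applies to
    trigger: str           # the analytics finding that prompted it
    expected_outcome: str  # what "supporting student success" means here


# Hypothetical example, not one the authors give:
nudge = Intervention(
    description="Email students who haven't logged in for 14 days",
    scope=Scope.COURSE,
    trigger="LMS login records",
    expected_outcome="Re-engagement with course activities",
)
print(nudge)
```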
And then this (pp. 21-22) which contains numerous potential implications (emphasis added)
Educational institutions need technological solutions that are deployed in a context of continuous change, with an increasing variety of data sources, that convey the advantages in a simple way to stakeholders, and allow a connection with the underpinning pedagogical strategies.
But what happens when the pedagogical strategies are very, very limited?
Then makes this point as a segue into the next section (p. 22)
Foremost among these is the question of access to data, which needs must be widespread and open. Careful policy development is also necessary to ensure that assessment and analytics plans reflect the institution’s vision for teaching and strategic needs (and are not simply being embraced in a panic to be seen to be doing something with data), and that LA tools and approaches are embraced as a means of engaging stakeholders in discussion and facilitating change rather than as tools for measuring performance or the status quo.
The challenge: Bringing about institutional change in complex systems
“the real challenges of implementation are significant” (p. 22). The above identifies “only two of the several and interconnected socio-technical domains that need to be addressed by comprehensive institutional policy and strategic planning”
- influencing stakeholder understanding of assessment in education
- developing the necessary institutional technological infrastructure to support the undertaking
And this has to be done whilst attending to business as usual.
Hence it’s not surprising that education lags other sectors in adopting analytics. Identifies barriers
- lack of practical, technical and financial capacity to mine big data
A statement from the consulting firm that also just happens to be in the business of selling services to help.
- perceived need for expensive tools
Cites various studies showing education institutions stuck at gathering and basic reporting.
And of course even if you get it right…
There is recognition that even where technological competence and data exist, simple presentation of the facts (the potential power of analytics), no matter how accurate and authoritative, may not be enough to overcome institutional resistance (Macfadyen & Dawson, 2012; Young & Mendizabal, 2009).
Why policy matters for LA
Starts with establishing higher education institutions as a “superb example of complex adaptive systems” but then suggests that (p. 22)
policies are the critical driving forces that underpin complex and systemic institutional problems (Corvalán et al., 1999) and that shape perceptions of the nature of the problem(s) and acceptable solutions.
I struggle a bit with that observation and even more with this argument (p. 22)
we argue that it is therefore only through implementation of planning processes driven by new policies that institutional change can come about.
Expands on the notion of CAS and wicked problems. Makes this interesting point
Like all complex systems, educational systems are very stable, and resistant to change. They are resilient in the face of perturbation, and exist far from equilibrium, requiring a constant input of energy to maintain system organization (see Capra, 1996). As a result, and in spite of being organizations whose business is research and education, simple provision of new information to leaders and stakeholders is typically insufficient to bring about systemic institutional change.
Now talks about the problems more specific to LA and the “lack of data-driven mind-set” of senior management. Links this to an earlier example of using institutional research to inform institutional change (McIntosh, 1979) and to a paper by Ferguson applying those findings to LA. From there and other places, the factors identified include
- academics don’t want to act on findings from other disciplines;
- disagreements over qualitative vs quantitative approaches;
- researchers and decision makers speak different languages;
- lack of familiarity with statistical methods;
- data not presented/explained to decision makers well enough;
- researchers tend to hedge and qualify conclusions;
- valorization of faculty autonomy and resistance to any administrative efforts perceived to interfere with T&L practice.
Social marketing and change management literature is drawn upon to suggest that “social and cultural change” isn’t brought about simply by giving access to data – “scientific analyses and technical rationality are insufficient mechanisms for understanding and solving complex problems” (p. 23). Returns to
what is needed are comprehensive policy and planning frameworks to address not simply the perceived shortfalls in technological tools and data management, but the cultural and capacity gaps that are the true strategic issues (Norris & Baer, 2013).
Policy and planning approaches for wicked problems in complex systems
Sets about defining policy. Includes this which resonates with me
Contemporary critics from the planning and design fields argue, however, that these classic, top–down, expert–driven (and mostly corporate) policy and planning models are based on a poor and homogenous representation of social systems mismatched with our contemporary pluralistic societies, and that implementation of such simplistic policy and planning models undermines chances of success (for review, see Head & Alford, 2013).
Draws on wicked problem literature to expand on this. Then onto systems theory.
And this is where the argument about piecemeal growth being insufficient arises (p. 24)
These observations not only illuminate why piecemeal attempts to effect change in educational systems are typically ineffective, but also explains why no one–size–fits–all prescriptive approach to policy and strategy development for educational change is available or even possible.
and perhaps more interestingly
Usable policy frameworks will not be those which offer a to do list of, for example, steps in learning analytics implementation. Instead, successful frameworks will be those which guide leaders and participants in exploring and understanding the structures and many interrelationships within their own complex system, and identifying points where intervention in their own system will be necessary in order to bring about change
One thought is whether this idea strikes “management” as “researchers hedging their bets” – a problem mentioned above.
Moves on to talk about “adaptive management strategies” (Head & Alford, 2013), which offer new means for policy and planning that “can allow institutions to respond flexibly to ever-changing social and institutional contexts and challenges”. These strategies talk about
- role of cross-institutional collaboration
- new forms of leadership
- development of enabling structures and processes (budgeting, finance, HR etc)
Interesting that notions of technology don’t get a mention.
Two “sample policy and planning models” are discussed.
- Rapid Outcome Mapping Approach (ROMA) – from international development
“focused on evidence-based policy change”. An iterative model. I wonder about this
Importantly, the ROMA process begins with a systematic effort at mapping institutional context (for which these authors offer a range of tools and frameworks) – the people, political structures, policies, institutions and processes that may help or hinder change.
Perhaps a step up, but isn’t this still big up-front design? It assumes you can actually do this mapping. But then some is better than none?
Apparently this approach is used more in Ferguson et al (2014)
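As a rough sketch of ROMA’s iterative shape (my paraphrase of the description above, not an authoritative implementation; the stub functions stand in for genuinely hard institutional work):

```python
# Hedged sketch of ROMA's iterative character (mine, not the paper's).

def map_context(institution):
    """ROMA starts by systematically mapping the people, political
    structures, policies, institutions and processes that may help
    or hinder change."""
    return {"institution": institution, "stakeholders": [], "barriers": []}

def plan_and_act(context):
    """Develop and implement a strategy against the mapped context."""
    return {"observed_outcomes": []}

def learn(context, outcomes):
    """Feed observed outcomes back into the context map."""
    context["last_outcomes"] = outcomes["observed_outcomes"]
    return context

# The point is the loop - map, act, learn, re-map - rather than one big
# up-front plan executed to completion.
context = map_context("Example University")
for _ in range(3):  # arbitrary number of iterations for illustration
    context = learn(context, plan_and_act(context))
```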
- “cause-effect framework” – DPSEEA framework
Driving force, Pressure, State, Exposure, Effect (DPSEEA) – a way of identifying linkages between the forces underpinning complex systems.
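As an illustration of the chain (a hypothetical example I’ve built from this paper’s own assessment discussion, not one the authors give):

```python
# Illustrative sketch (mine) of the DPSEEA chain as a data structure:
# each stage links a driving force through to its eventual effect.
from dataclasses import dataclass

@dataclass
class DPSEEAChain:
    driving_force: str  # underlying societal/institutional force
    pressure: str       # the pressure that force exerts on the system
    state: str          # the resulting state of the system
    exposure: str       # who or what is exposed to that state
    effect: str         # the eventual effect on those exposed

# Hypothetical education example built from the assessment discussion
# earlier in the paper:
chain = DPSEEAChain(
    driving_force="Increased accountability demands on institutions",
    pressure="More high-stakes snapshot testing",
    state="Assessment focused on accountability rather than learning",
    exposure="Students and teachers working to the test",
    effect="Narrowed learning; Goodhart's law effects",
)
print(chain)
```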
Ferguson et al (2014) apparently show that “apparently successful institutional policy and planning processes have pursued change management approaches that map well to such frameworks”. So practice is not yet informed by the frameworks, only consistent with them? Of course, there’s always the question of the people driving those processes being the ones reporting on their own work.
I do like this quote (p. 25)
To paraphrase Head and Alford (2013), when it comes to wicked problems in complex systems, there is no one-size-fits-all policy solution, and there is no plan that is not provisional.
References
Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: Overcoming the barriers to large-scale adoption. Journal of Learning Analytics, 1(3), 120–144. doi:10.1145/2567574.2567592
Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge. Research and Practice in Assessment, 9(Winter), 17–28.