Assembling the heterogeneous elements for (digital) learning

Month: November 2012

Moving beyond a fashion: likely paths and pitfalls for learning analytics

The following resources are for a presentation given at the SoLAR Southern Flare Conference on 30 November 2012.

The premise of the talk is that learning analytics shows all the hallmarks of a management fashion, fad, or bandwagon, and that to avoid this fate we need to talk more realistically about its implementation. The talk identifies three paths that universities might take to implement learning analytics and the pitfalls of each. The argument is that one or two of these paths are dominant and a third has been largely forgotten. It’s the forgotten path that my co-author and I are most interested in, and the one we think will allow learning analytics to have the most impact on learning and teaching.

There is an abstract on the conference site and an extended abstract.

This presentation evolved from an unsuccessful OLT grant application that attempted to engage with the forgotten path.

Slides

References

Abrahamson, E., & Fairchild, G. (1999). Management fashion: Lifecycles, triggers and collective learning processes. Administrative Science Quarterly, 44(4), 708–740.

Arnott, D., & Pervan, G. (2005). A critical analysis of decision support systems research. Journal of Information Technology, 20(2), 67–87.

Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. Curriculum, technology and transformation for an unknown future. Proceedings of ASCILITE Sydney 2010 (pp. 75–86). Sydney.

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. The hidden complexity behind simple patterns. In M. Brown (Ed.), Future Changes, Sustainable Futures. Proceedings of ascilite 2012. Wellington, NZ.

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning, adoption, activity, grades and external factors. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 60–70). Auckland, New Zealand.

Birnbaum, R. (2000). Management Fads in Higher Education: Where They Come From, What They Do, Why They Fail. San Francisco: Jossey-Bass.

Campbell, G. (2012). Here I Stand. Retrieved April 2, 2012, from https://sas.elluminate.com/site/external/jwsdetect/playback.jnlp?psid=2012-03-01.1231.M.0728C08DFE8BF0EB7323E19A1BC114.vcr&sid=2008104

Cecez-Kecmanovic, D., Janson, M., & Brown, A. (2002). The rationality framework for a critical study of information systems. Journal of Information Technology, 17, 215–227.

Clark, K., Beer, C., & Jones, D. (2010). Academic involvement with the LMS: An exploratory study. In C. Steel, M. Keppell, P. Gerbic, & S. Housego (Eds.), Curriculum, technology & transformation for an unknown future. Proceedings ascilite Sydney 2010 (pp. 487–496).

Convery, A. (2009). The pedagogy of the impressed: how teachers become victims of technological vision. Teachers and Teaching, 15(1), 25–41.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final Report 2011. Canberra.

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116–128.

Findlow, S. (2008). Accountability and innovation in higher education: a disabling tension? Studies in Higher Education, 33(3), 313–329.

Fleming, P., & Spicer, A. (2003). Working at a cynical distance: Implications for power, subjectivity and resistance. Organization, 10(1), 157–179.

Geoghegan, W. (1994). Whatever happened to instructional technology? In S. Bapna, A. Emdad, & J. Zaveri (Eds.), (pp. 438–447). Baltimore, MD: IBM.

Hirschheim, R., Murungi, D. M., & Peña, S. (2012). Witty invention or dubious fad? Using argument mapping to examine the contours of management fashion. Information and Organization, 22(1), 60–84.

Jamieson, K., & Hyland, P. (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. Adelaide, Australia.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. New Media Consortium. Austin, Texas.

Johnson, L., Smith, R., Levine, A., & Haywood, K. (2010). The horizon report: 2010 Australia-New Zealand Edition. Austin, Texas.

Jones, D. (2012). The life and death of Webfuse: Principles for learning and leading into the future. ASCILITE’2012.

Jones, N., & O’Shea, J. (2004). Challenging hierarchies: The impact of e-learning. Higher Education, 48, 379–395.

Lattuca, L., & Stark, J. (2009). Shaping the college curriculum: Academic plans in context. San Francisco: John Wiley & Sons.

Macfadyen, L., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Educational Technology & Society, 15(3), 149–163.

Marsh, J., Pane, J., & Hamilton, L. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. Santa Monica, CA.

Pollock, N. (2005). When is a work-around? Conflict and negotiation in computer systems development. Science Technology Human Values, 30(4), 496–514.

Ramamurthy, K., Sen, A., & Sinha, A. P. (2008). Data warehousing infusion and organizational effectiveness. Systems, Man and …, 38(4), 976–994.

Rogers, E. (1995). Diffusion of Innovations (4th ed.). New York: The Free Press.

Schiller, M. J. (2012). Big Data Fail: Five Principles to Save Your BI. CIO Insight. Retrieved from http://www.cioinsight.com/c/a/Expert-Voices/Big-Data-Fail-Five-Principles-to-Save-Your-BI-Butt-759074/

Shaw, P. (1997). Intervening in the shadow systems of organizations: Consulting from a complexity perspective. Journal of Organizational Change Management, 10(3), 235–250.

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Snowden, D. (2002). Complex Acts of Knowing. Journal of Knowledge Management, 6(2), 100–111.

Stark, J. (2000). Planning introductory college courses: Content, context and form. Instructional Science, 28(5), 413–438.

Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 28(4), 553–583.

Trigwell, K. (2001). Judging university teaching. The International Journal for Academic Development, 6(1), 65–73.

Tutty, J., Sheard, J., & Avram, C. (2008). Teaching in the current higher education environment: perceptions of IT academics. Computer Science Education, 18(3), 171–185.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

White, N. (2006). Tertiary education in the Noughties: the student perspective. Higher Education Research & Development, 25(3), 231–246.

The life and death of Webfuse: What's wrong with industrial e-learning and how to fix it

The following is a collection of presentation resources (i.e. the slides) for the ASCILITE’2012 presentation of this paper. The paper and presentation are a summary of the outcomes of my PhD work. The thesis goes into much more detail.

Abstract

Drawing on the 14-year life and death of an integrated online learning environment used by tens of thousands of people, this paper argues that many of the principles and practices underpinning industrial e-learning – the current dominant institutional model – are inappropriate. The paper illustrates how industrial e-learning can limit the outcomes of tertiary e-learning and limit the ability of universities to respond to uncertainty and effectively explore the future of learning. It limits their ability to learn. The paper proposes one alternate set of successfully implemented principles and practices as being more appropriate for institutions seeking to learn for the future and lead in a climate of change.

Slides

The slides are available on Slideshare and should show up below. These slides are the extended version, prior to the cutting required to fit within the 20-minute time limit.

References

Arnott, D. (2006). Cognitive biases and decision support systems development: a design science approach. Information Systems Journal, 16, 55–78.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney.

Brews, P., & Hunt, M. (1999). Learning to plan and planning to learn: Resolving the planning school/learning school debate. Strategic Management, 20(10), 889–913.

Cecez-Kecmanovic, D., Janson, M., & Brown, A. (2002). The rationality framework for a critical study of information systems. Journal of Information Technology, 17, 215–227.

Central Queensland University. (2004). Faculty teaching and learning report. Rockhampton, Australia.

Davenport, T. (1998). Putting the Enterprise into the Enterprise System. Harvard Business Review, 76(4), 121–131.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), (pp. 43–59). New York: Springer.

Dillard, J., & Yuthas, K. (2006). Enterprise resource planning systems and communicative action. Critical Perspectives on Accounting, 17(2-3), 202–223.

Fleming, P., & Spicer, A. (2003). Working at a cynical distance: Implications for power, subjectivity and resistance. Organization, 10(1), 157–179.

Haywood, T. (2002). Defining moments: Tension between richness and reach. In W. Dutton & B. Loader (Eds.), (pp. 39–49). London: Routledge.

Hutchins, E. (1991). Organizing work by adaptation. Organization Science, 2(1), 14–39.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20–39.

Jamieson, K., & Hyland, P. (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. Adelaide, Australia.

Jones, D. (1996). Solving Some Problems of University Education: A Case Study. In R. Debreceny & A. Ellis (Eds.), Proceedings of AusWeb’96 (pp. 243–252). Gold Coast, QLD: Southern Cross University Press.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In P. Barker & S. Rebelsky (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002 (pp. 884–889). Denver, Colorado: AACE.

Jones, D. (2003). Course Barometers: Lessons gained from the widespread use of anonymous online formative evaluation. QUT, Brisbane.

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In A. Christie, B. Vaughan, & P. James (Eds.), Making New Connections, ascilite’1996 (pp. 331–345). Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemens & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398–406). Chesapeake, VA: AACE.

Jones, N., & O’Shea, J. (2004). Challenging hierarchies: The impact of e-learning. Higher Education, 48, 379–395.

Katz, R. (2003). Balancing Technology and Tradition: The Example of Course Management Systems. EDUCAUSE Review, 38(4), 48–59.

Kurtz, C., & Snowden, D. (2007). Bramble Bushes in a Thicket: Narrative and the intangibles of learning networks. In M. Gibbert & T. Durand (Eds.). Blackwell.

Laurillard, D. (2002). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. London: Routledge.

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216–224.

March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71–87.

Mintzberg, H. (1989). Mintzberg on Management, Inside our Strange World of Organisations. New York: Free Press.

Morgan, Glenda. (2003). Faculty use of course management systems. EDUCAUSE Center for Applied Research.

Morgan, Glenn. (1992). Marketing discourse and practice: Towards a critical analysis. In M. Alvesson & H. Willmott (Eds.), (pp. 136–158). London: SAGE.

Pozzebon, M., Titah, R., & Pinsonneault, A. (2006). Combining social shaping of technology and communicative action theory for understanding rhetorical closure in IT. Information Technology & People, 19(3), 244–271.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17–46.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60–75.

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Seely-Brown, J., & Hagel, J. (2005). From push to pull: The next frontier of innovation. The McKinsey Quarterly. McKinsey & Company.

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3).

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012, from http://theconversation.edu.au/universities-cant-all-be-the-same-its-time-we-embraced-diversity-7379

Truex, D., Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53–79.

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117–123.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Underwood, J., & Dillon, G. (2011). Chasing dreams and recognising realities: teachers’ responses to ICT. Technology, Pedagogy and Education, 20(3), 317–330. doi:10.1080/1475939X.2011.610932

Wagner, E., Scott, S., & Galliers, R. (2006). The creation of “best practice” software: Myth, reality and ethics. Information and Organization, 16(3), 251–275.

Weick, K., & Quinn, R. (1999). Organizational change and development. Annual Review of Psychology, 50, 361–386.

A triumph of the explicit over the tacit and the subsequent loss of learning

I’ve spent the last week dealing with a range of institutional systems for the submission and processing of assignments, results etc. I’m likely to spend at least another week or two trudging through the inexplicable holes, dead-ends, and busy work such systems create. Hence the need for a break. While walking through the local “Japanese Gardens” back to the office I stumbled across a possible explanation. Or at least a catchy phrase to represent that explanation and provide an opportunity to revisit and share some recent reading.

These ill-fitting systems are illustrative of the triumph of the explicit over the tacit (and implicit) that is embodied in the business-like processes and policies in use in the modern Australian university. It’s this triumph that is the biggest barrier to widespread improvement and innovation in learning and teaching at those institutions, because it limits institutional learning.

(Photo: Japanese Gardens)

For example, the design of these information systems is based on the traditional Software Development Life Cycle, where some poor sod had to develop the set of requirements that were then dutifully turned into software by the IT department or some vendor (even worse, because the requirements become even less important as the focus shifts to what the vendor’s system can do). The requirements have to be made explicit so that the IT department can prove to unhappy users, who find themselves with a system they can’t use, that the system is exactly what they asked for.

Beyond this there is a need to make various formal university policies explicit (and then unintentionally hide them on intranets). There needs to be an explicit model for everything, and everyone has to follow the same explicit model. After all, consistency is quality (isn’t it?). Managers are happy when they see evidence of these processes being consistently followed, when in reality every person and their dog is complaining bitterly about the constraints and inappropriateness of these models and actively searching for any way around them.

The attempt to capture all insight and knowledge about the system and its requirements and make it explicit has failed. Worse, the people, policies and processes put in place are largely incapable of recognising this, let alone doing something about it. But the explicit has triumphed.

By trying to appear rational and capable of making everything explicit, these processes and policies are sacrificing the tacit. Not only does this triumph make the institution ignorant of the reality of the lived experience of its staff and students, it sacrifices any ability to learn and innovate.

This is no great original insight. Many folk have made similar observations before.

Ciborra (2002)

In this respect, science-based, method-driven approaches can be misleading. Contrary to their promise, they are deceivingly abstract and removed from practice.

…how to go beyond appearances and the illusionary realness of those management and systems concepts in common currency and how to value the apparitions of the ordinary in the life of information systems and organisations.

The development of organisational information systems, processes and policies aims to abstract away the realities of context and achieve a neat, tidy, rational model. I see a great similarity between this and what Seely Brown, Collins and Duguid (1989) suggest:

Many methods of didactic education assume a separation between knowing and doing, treating knowledge as an integral, self-sufficient substance, theoretically independent of the situations in which it is learned and used.

You can see evidence of this in the observation that the people who develop such systems are generally not involved in the day-to-day situations in which those systems are used. Not to mention that many of the system owners aren’t directly involved in the day-to-day use of such systems. The administrative staff put in place to double-check that the standard process is consistently followed never actually have to complete the process themselves; they just make sure everyone else does. The “knowers” and “doers” are separate.

It’s not uncommon for the “knowers” to talk disparagingly of the “doers”. Anyone who has attended a meeting where IT management and academic management get together will have stories of this. So, not only do the knowers not engage in the doing, they don’t value the insights that arise from the “doing”.

In getting into situated learning, Seely Brown et al. (1989) continue:

The activity in which knowledge is developed and deployed, it is now argued, is not separable from or ancillary to learning and cognition. Nor is it neutral. Rather, it is an integral part of what is learned.

And this, I believe, isn’t limited to the development of information systems and formal university policies and processes. This triumph of the explicit over the tacit directly informs much of the practice of central learning and teaching policies and processes. The very institutional instruments that are meant to inform and improve the quality of learning and teaching.

I’m going to suggest that the harnessing of Information and Communication Technologies (ICTs) and the widespread improvement of the quality of learning and teaching within Australian universities are being significantly held back by this uncritical acceptance of supposedly rational methods that result in the triumph of the explicit over the implicit. Even worse, this triumph is a big drag on the ability of these institutions to learn how to be better and to innovate.

Ciborra (2002)

let us drop the old methodologies, in order to be better able to see the new dimensions the technology is going to disclose to us

The capacity to integrate unique ideas and practical design solutions at the end-user level turns out to be more important than the adoption of structured approaches to systems development or industry analysis

The power of bricolage, improvisation and hacking is that these activities are highly situated; they exploit, in full, the local context and resources at hand, while often pre-planned ways of operating appear to be derooted, and less effective because they do not fit the contingencies of the moment. Also, almost by definition these activities are highly idiosyncratic, and tend to be invisible both because they are marginalised and because they unfold in a way that is small in scope. The results are modes of operating that are latent and not easy to imitate. Thus, the smart bricolage or the good hack cannot be easily replicated outside the cultural bed from which it has emerged.

Throw away your best practices, your annual plans, your quality assurance, etc. (at least a bit) and allow the space and resources for bricolage. Allow the harnessing of the tacit.

References

Ciborra, C. (2002). The Labyrinths of Information: Challenging the Wisdom of Systems. Oxford, UK: Oxford University Press.

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Moving beyond a fashion: likely paths and pitfalls for learning analytics

The following started life as a submission to the SoLAR Southern Flare Conference and is serving a double purpose as a contribution to #cfhe12, which is currently looking at Big data and analytics.

One of the questions asked about learning analytics for this week is, “Is it a fad?”. I agree with Ian Reid’s comment on an earlier post: it’s almost certainly going to be another fad. The following offers some evidence for this, some insights into why it will be the case, and suggests one way it might be avoided.

First, the slideset used for the presentation, and then the “extended” abstract of the talk.

Fashions and Learning Analytics

Baskerville and Myers (2009) define a management fashion as “a relatively transitory belief that a certain management technique leads rational management progress” (p. 647). For a variety of reasons, it appears that learning analytics will become the next fashion in educational technology. Siemens and Long (2011) position learning analytics as “essential for penetrating the fog that has settled over much of higher education” (p. 40) in part because making decisions based on data seems “stunningly obvious” (p. 31). The 2012 Horizon technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) placed learning analytics into the “one year or less” time-frame for adoption. Anecdotal reports suggest that every Australian higher education institution has at least one, if not more, learning analytics projects underway. However, just two years ago the 2010 Horizon technology outlook for Higher Education in Australia and New Zealand (Johnson, Smith, Levine, & Haywood, 2010) included no mention of learning analytics.

If institutions are going to successfully harness learning analytics to address the challenges facing the higher education sector, then it is important to move beyond the slavish adoption of the latest fashion and aim for more mindful innovation. Swanson and Ramiller (2004) define mindfulness “as the nuanced appreciation of context and ways to deal with it lies at the heart … of what it means to manage the unexpected in innovating with IT” (p. 556). Hirschheim, Murungi and Pena (2012) argue that the introduction of social considerations at an early stage in discussions “may help moderate the adoption of a new IS innovation and replace the sudden and short-lived bursts of interest with a more enduring application of the innovation” (p. 76).

The following seeks to identify a range of broader considerations that are necessary to move learning analytics beyond being just the next fashion. It proposes three likely paths Australian universities may take in their adoption of learning analytics. It will argue that at least one of these paths is dominant and that the best outcomes will be achieved when institutions combine all three paths into a contextually appropriate strategy. It will identify a range of pitfalls specific to each path and another set of pitfalls common to all three paths. This is informed by experience from a four-year-old project exploring learning analytics within an Australian university (Beer, Clark, & Jones, 2010; Beer, Jones, & Clark, 2009; Beer, Jones, & Clark, 2012), broader experience in e-learning, and insights from the learning analytics, education, management and information systems literature. This work will inform the next round of design-based research that is seeking to explore how, and with what impacts, academics can be encouraged to use learning analytics to inform individual pedagogical practice.

Three likely paths

The paths do not have clear and distinct boundaries; however, at the core of each path there is a distinct set of fundamental assumptions and common practices. The three paths, listed below in decreasing order of prevalence and increasing distance from the learning context, are:

  1. Do it to the academics and students.
    While Siemens and Long (2011) define this path as academic analytics, it is common to hear such projects described as learning analytics. This path involves the implementation and use of a data warehouse (DW) and associated business intelligence (BI) tools. It is the current dominant path (Dawson, Bakharia, Lockyer, & Heathcote, 2011; Johnson & Cummins, 2012).
  2. Do it for the academics and students.
    Researchers, vendors and institutional learning and teaching organizations design and implement methods, models, tools and professional development intended to be used by academics to harness learning analytics to inform their pedagogical practice.
  3. Do it with the academics and students.
    This path focuses on working closely with academics and students to explore how learning analytics can be helpful. It recognizes the complexity and contextual nature of teaching practice and the limited knowledge around how to effectively use learning analytics to inform individual practice. It assumes that the best way to change how people think about education is to “have experiences that enable appropriation of new modes of teaching and learning that enable them to reconsider and restructure their thinking and practice” (Cavallo, 2004, p. 97).

Potential pitfalls specific to each path are listed below, each with a brief description and references.

Do it to:
- Complex implementation requiring significant organisational changes (Ramamurthy, Sen, & Sinha, 2008) and facing a range of problems (Arnold, 2010; Campbell, 2012; Campbell, DeBlois, & Oblinger, 2007; Greenland, 2011).
- Subsequent high failure rates and limited use (Macfadyen & Dawson, 2012; Ramamurthy et al., 2008; Schiller, 2012).
- Over-use of simplistic, quantitative measures leading to compliance cultures, ineffective responses, and limited engagement from staff (Jones & Saram, 2005; Knight & Trowler, 2000; Palmer, 2012).
- Few, if any, insights into how learning analytics can help academics inform the design and evaluation of their teaching practice (Dawson et al., 2011).

Do it for:
- A tendency to focus on abstract representations that are detached from practice, distorting the intricacies of practice and limiting how well it can be understood and enhanced (Seely Brown, Collins, & Duguid, 1989).
- Limited adoption because of an ignorance of the diversity of academic staff and the subsequent homogeneity of technology implementation and support (Geoghegan, 1994).
- An assumption that academics design their teaching using a rational planning model; Lattuca and Stark (2009) suggest that this is not the case.

Do it with:
- Such an approach spends time and energy discovering what to do and consequently can be seen as inefficient (Introna, 1996).
- Systems that engage predominantly in exploration and too little in exploitation exhibit too many undeveloped new ideas and too little distinctive competence (March, 1991).
- A decentralized approach can lead to problems including a lack of resources and a feeling of operating either outside of or in opposition to the institution’s policies and processes (Hannah, Jenny, Ruth, & Jane, 2010).

Other potential pitfalls

Beyond path-specific pitfalls, there are a number of common pitfalls. The session will use the literature to identify and describe a range of these. A sample of these common pitfalls is listed below.

We’re not rational:
- Data-driven decision making “does not guarantee effective decision making” (Marsh, Pane, & Hamilton, 2006, p. 10). Organisational and political conditions and the interpretations of individuals and collectives shape and mediate this process (Marsh et al., 2006, p. 3).
- Given a complex environment, there are limits to the ability of human beings to adapt optimally, or even satisfactorily (Simon, 1991).
- Even the best Decision Support System “cannot overcome poor managerial decision making” (Hosack, Hall, Paradice, & Courtney, 2012, p. 321).

Issues from informing fields:
- Learning analytics draws on a number of established fields with active research programs that are identifying relevant issues that will impact upon learning analytics projects. Examples include big data (Bollier & Firestone, 2010; Boyd & Crawford, 2012) and Decision Support Systems (Arnott & Pervan, 2005; Hosack et al., 2012).

Learning is diverse:
- There is no one best way of developing instruction (Davies, 1991), and instructional design can only progress with the recognition that “learning is a human activity quite diverse in its manifestations from person to person and even from day to day” (Dede, 2008, p. 58).
- Both students (Beer, Jones, & Clark, 2012) and academics (Clark, Beer, & Jones, 2010) show diversity in how they interact with the LMS across different courses. The simple patterns produced by analytics hide significant complexity (Beer, Jones, & Clark, 2012); see the sketch after this list.

Measuring the wrong thing:
- While LMS adoption is almost universal at the institutional level, it is limited at the course level (Jones & Muldoon, 2007), with the majority of use focused on content distribution (Malikowski, 2010). Analysis of existing LMS data is therefore measuring limited, poor-quality learning and teaching.
- The absence of high-quality data leads to data becoming misinformation and subsequently to invalid inferences (Marsh, Pane, & Hamilton, 2006).
- The trend away from institutional information systems is likely to further reduce the level of data available for analysis.

Forgetting about action:
- In considering data-driven decision making, “equal attention needs to be paid to analysing data and taking action based on data” (Marsh, Pane, & Hamilton, 2006, p. 10). In the context of learning analytics, taking action doesn’t receive the same level of attention.
- Being able to interpret the patterns provided by learning analytics and apply them to practice is difficult, time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008).

Novelty and dynamic contexts:
- Transforming an institution as complex as the university is neither linear nor predictable (Duderstadt, Atkins, & Van Houweling, 2002).
- Operating in a dynamic context requires organisational structures that adjust and become far more responsive to change (Mintzberg, 1989).
- Instructional design is an archetypal example of an ill-structured problem (Jonassen, 1997).
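
As a concrete illustration of the kind of “simple pattern” mentioned above, the following is a minimal, purely hypothetical Python sketch that counts LMS clicks per student and correlates the counts with final grades. The file names and column names are assumptions made for the example only, not a description of any particular LMS export; the point is that a single tidy number like this is exactly the sort of measure that hides the diversity and complexity described above.

    # A deliberately simplistic learning analytics "indicator": clicks per student
    # correlated with final grade. File and column names are hypothetical.
    import csv
    from collections import Counter
    from statistics import correlation  # requires Python 3.10+

    # Assumed format: one row per LMS click, with a student_id column.
    clicks = Counter()
    with open("lms_clicks.csv", newline="") as f:
        for row in csv.DictReader(f):
            clicks[row["student_id"]] += 1

    # Assumed format: one row per student, with student_id and final_grade columns.
    with open("final_grades.csv", newline="") as f:
        grades = {row["student_id"]: float(row["final_grade"])
                  for row in csv.DictReader(f)}

    # Pair up the students that appear in both files.
    students = sorted(set(clicks) & set(grades))
    x = [clicks[s] for s in students]
    y = [grades[s] for s in students]

    # One neat number that says nothing about why students clicked, what they did,
    # or how the pattern differs between courses, cohorts and teaching designs.
    print(f"clicks vs grade correlation: {correlation(x, y):.2f}")

Nothing in this sketch requires a data warehouse or a business intelligence tool, which is part of the point: the hard work is not computing the number but deciding what, if anything, to do in response to it.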

References

Arnold, K. E. (2010). Signals: Applying Academic Analytics. Educause Quarterly, 33(1).

Arnott, D., & Pervan, G. (2005). A critical analysis of decision support systems research. Journal of Information Technology, 20(2), 67–87.

Baskerville, R., & Myers, M. (2009). Fashion waves in Information Systems research and practice. MIS Quarterly, 33(4), 647–662.

Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. Curriculum, technology and transformation for an unknown future. Proceedings of ASCILITE Sydney 2010 (pp. 75–86). Sydney.

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. The hidden complexity behind simple patterns. ASCILITE’2012. Wellington, NZ.

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning, adoption, activity, grades and external factors. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 60–70). Auckland, New Zealand.

Bollier, D., & Firestone, C. (2010). The promise and peril of big data. Washington DC: The Aspen Institute.

Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679.

Campbell, G. (2012). Here I Stand. Retrieved April 2, 2012, from https://sas.elluminate.com/site/external/jwsdetect/playback.jnlp?psid=2012-03-01.1231.M.0728C08DFE8BF0EB7323E19A1BC114.vcr&sid=2008104

Campbell, J., DeBlois, P., & Oblinger, D. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), 40–42.

Cavallo, D. (2004). Models of growth – Towards fundamental change in learning environments. BT Technology Journal, 22(4), 96–112.

Clark, K., Beer, C., & Jones, D. (2010). Academic involvement with the LMS: An exploratory study. In C. Steel, M. Keppell, P. Gerbic, & S. Housego (Eds.), Curriculum, technology & transformation for an unknown future. Proceedings ascilite Sydney 2010 (pp. 487–496).

Clegg, S., & Smith, K. (2008). Learning, teaching and assessment strategies in higher education: contradictions of genre and desiring. Research Papers in Education, 25(1), 115–132.

Davies, I. (1991). Instructional development as an art: One of the three faces of ID. Performance and Instruction, 20(7), 4–7.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final Report 2011. Canberra.

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), (pp. 43–59). New York: Springer.

Duderstadt, J., Atkins, D., & Van Houweling, D. (2002). Higher education in the digital age: Technology issues and strategies for American colleges and universities. Westport, Conn: Praeger Publishers.

Geoghegan, W. (1994). Whatever happened to instructional technology? In S. Bapna, A. Emdad, & J. Zaveri (Eds.), (pp. 438–447). Baltimore, MD: IBM.

Greenland, S. (2011). Using log data to investigate the impact of (a) synchronous learning tools on LMS interaction. In G. Williams, P. Statham, N. Brown, & B. Cleland (Eds.), ASCILITE 2011 (pp. 469–474). Hobart, Australia.

Hannah, F., Jenny, P., Ruth, L., & Jane, M. (2010). Distance education in an era of eLearning: challenges and opportunities for a campus-focused institution. Higher Education Research & Development, 29(1), 15–28.

Hirschheim, R., Murungi, D. M., & Peña, S. (2012). Witty invention or dubious fad? Using argument mapping to examine the contours of management fashion. Information and Organization, 22(1), 60–84. doi:10.1016/j.infoandorg.2011.11.001

Hosack, B., Hall, D., Paradice, D., & Courtney, J. F. (2012). A Look Toward the Future: Decision Support Systems Research is Alive and Well. Journal of the Association for Information Systems, 13(Special Issue), 315–340.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20–39.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. New Media Consortium. Austin, Texas.

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition (p. 42). Austin, Texas.

Johnson, L., Smith, R., Levine, A., & Haywood, K. (2010). The horizon report: 2010 Australia-New Zealand Edition. Austin, Texas.

Jonassen, D. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94.

Jones, D., & Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In R. J. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers (Eds.), ICT: Providing choices for learners and learning. Proceedings ASCILITE Singapore 2007 (pp. 450–459). Singapore.

Jones, J., & Saram, D. D. D. (2005). Academic staff views of quality systems for teaching and learning: a Hong Kong case study. Quality in Higher Education, 11(1), 47–58. doi:10.1080/13538320500074899

Knight, P., & Trowler, P. (2000). Department-level Cultures and the Improvement of Learning and Teaching. Studies in Higher Education, 25(1), 69–83.

Lattuca, L., & Stark, J. (2009). Shaping the college curriculum: Academic plans in context. San Francisco: John Wiley & Sons.

Macfadyen, L., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Educational Technology & Society, 15(3), 149–163.

Malikowski, S. (2010). A Three Year Analysis of CMS Use in Resident University Courses. Journal of Educational Technology Systems, 39(1), 65–85.

March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71–87.

Marsh, J., Pane, J., & Hamilton, L. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. Santa Monica, CA.

Mintzberg, H. (1989). Mintzberg on Management, Inside our Strange World of Organisations. New York: Free Press.

Palmer, S. (2012). Student evaluation of teaching: keeping in touch with reality. Quality in Higher Education, 18(3), 297–311. doi:10.1080/13538322.2012.730336

Ramamurthy, K., Sen, A., & Sinha, A. P. (2008). Data warehousing infusion and organizational effectiveness. Systems, Man and …, 38(4), 976–994. doi:10.1109/TSMCA.2008.923032

Schiller, M. J. (2012). Big Data Fail: Five Principles to Save Your BI. CIO Insight. Retrieved from http://www.cioinsight.com/c/a/Expert-Voices/Big-Data-Fail-Five-Principles-to-Save-Your-BI-Butt-759074/

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5).

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 28(4), 553–583.
