Context-Appropriate Scaffolding Assemblages: A generative learning analytics platform for end-user development and participatory design

David Jones, Celeste Lawson, Colin Beer, Hazel Jones

Paper accepted to the LAK2018 workshop – Participatory design of learning analytics

Jones, D., Lawson, C., Beer, C., & Jones, H. (2018). Context-Appropriate Scaffolding Assemblages: A generative learning analytics platform for end-user development and participatory design. In A. Pardo, K. Bartimote, G. Lynch, S. Buckingham Shum, R. Ferguson, A. Merceron, & X. Ochoa (Eds.), Companion Proceedings of the 8th International Conference on Learning Analytics and Knowledge. Sydney, Australia: Society for Learning Analytics Research. Retrieved from http://bit.ly/lak18-companion-proceedings

Abstract

There remains a significant tension in the development and use of learning analytics between course/unit or learning design specific models and generic, one-size-fits-all models. As learning analytics increases its focus on scalability there is a danger of erring toward the generic and limiting the ability to align learning analytics with the specific needs and expectations of users. This paper describes the origins, rationale, and use cases of a work-in-progress design-based research project attempting to develop a generative learning analytics platform: one that encourages a broad audience to make unfiltered and unanticipated changes to learning analytics. It is hoped that such a generative platform will enable the development and greater adoption of embedded and contextually specific learning analytics and subsequently improve learning and teaching. The paper asks which tools, social structures, and techniques from participatory design might inform the design and use of the platform, and whether participatory design might be more effective when partnered with generative technology.

Keywords: Contextually Appropriate Scaffolding Assemblages (CASA); generative platform; participatory design; DIY learning analytics

Introduction

One size does not fit all in learning analytics. There is no technological solution that will work for every teacher, every time (Mishra & Koehler, 2006). Context-specific models improve teaching and learning, yield better results, and improve the effectiveness of human action (Baker, 2016; Gašević, Dawson, Rogers, & Gasevic, 2016). Despite this, higher education institutions tend to adopt generalised approaches to learning analytics. Whilst this may be cost-effective and efficient for the organisation (Gašević et al., 2016), the result is a generic approach that is unable to cater for the full diversity of learning and learners and shows “less variety than a low-end fast-food restaurant” (Dede, 2008).

Institutional implementation of learning analytics, in terms of both practice and research, remains limited to conceptual understandings and is empirically narrow (Colvin, Dawson, Wade, & Gašević, 2017). In practice, learning analytics has suffered from a lack of human-centeredness (Liu, Bartimote-Aufflick, Pardo, & Bridgeman, 2017). Even when learning analytics tools are designed with the user in mind (e.g. Corrin et al., 2015), the resulting tools tend to be what Zittrain (2008) defines as non-generative or sterile. In particular, the adoption of such tools tends to require institutional support and subsequently leans toward the generic, rather than the specific. This perhaps provides at least part of the answer to why learning analytics dashboards are seldom used to intervene during the teaching of a course (Schmitz, van Limbeek, Greller, Sloep, & Drachsler, 2017) and leads us to the research question: How can the development of learning analytics better support the needs of specific contexts, drive adoption, and sustain ongoing design and development? More broadly, we are interested in whether and how learning analytics can encourage the adoption of practices that position teaching as design and subsequently improve learning experiences and outcomes (Goodyear, 2015) by supporting a greater focus on do-it-with (DIW, participatory design) and do-it-yourself (DIY, where teachers are seen as designers) design, implementation, and application of learning analytics. This focus challenges the currently more common do-it-to (DIT) and do-it-for (DIF) approaches (Beer, Tickner & Jones, 2014).

This project seeks to explore learning analytics using a design-based research approach informed by a broader information systems design theory for e-learning (Jones, 2011), experience with Do-It-With (DIW) (Beer et al., 2014) and teacher Do-It-Yourself (DIY) learning analytics (Jones, Jones, Beer, & Lawson, 2017), and technologies associated with reproducible research, in order to design and test a generative learning analytics platform. Zittrain (2008) defines a generative system as having the “capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences” (p. 70). How generative a system is depends on five principal factors: (1) leverage; (2) adaptation; (3) ease of mastery; (4) accessibility; and (5) transferability (Zittrain, 2008). A focus for this project is exploring whether and how a generative learning analytics platform can act as a boundary object for the diverse stakeholders involved with the design, implementation and use of institutional learning analytics (Suthers & Verbert, 2013). Such an object broadens the range of people who can engage in creative acts of making learning analytics as a way of making sense of current and future learning and teaching practices and the contexts within which they occur. The platform – named CASA, an acronym standing for Contextually Appropriate Scaffolding Assemblages – will be designed to enable all stakeholders, alone or together, to participate in decisions around the design, development, adoption and sharing of learning analytics tools. These tools will be created by combining, customising, and packaging existing analytics – either through participatory design (DIW) or end-user development (DIY) – to provide context-sensitive scaffolds that can be embedded within specific online learning environments.

Know thy student – Teacher DIY learning analytics

Jones et al. (2017) use a case of teacher DIY learning analytics to draw out a set of questions and implications for the institutional implementation of learning analytics and the need for CASA. The spark for the teacher DIY learning analytics was the observation that it took more than 10 minutes, using two separate information systems including a number of poorly designed reports, to gather the information necessary to respond to an individual learner’s query in a discussion forum. The teacher was able to design an embedded, ubiquitous and contextually specific learning analytics tool (Know Thy Student) that reduced the time taken to gather the necessary information to a single mouse click. The tool was used in four offerings of a third-year teacher education unit across 2015 and 2016. Analysis of usage logs indicates that it was used 3,100 separate times to access information on 761 different students, representing 89.5% of the total enrolled students. This usage was spread across 666 days over the two years, representing 91% of the available days during this period. This is a significant level of usage, especially given that most learning analytics dashboards are seldom used to intervene during the teaching of a course (Schmitz et al., 2017). Usage also went beyond responding to discussion forum questions. Because the tool was (unintentionally) available throughout the entire learning environment (embedded and ubiquitous), unplanned uses of the tool developed, contributing to improvements in the learner experience. This led to the implication that embedded, ubiquitous, contextual learning analytics encourages greater use and enables emergent practice (Jones et al., 2017). It provides leverage to make the difficult job of teaching a large-enrolment, online course easier. However, the implementation of this tool required significant technical knowledge and hence was not easy to master, not accessible, and not easily transferable: the remainder of Zittrain’s (2008) principal factors required for a generative platform. The questions now become: How to reduce this difficulty? How to develop a generative learning analytics platform?

CASA: Technologies and Techniques

To answer these questions, CASA will draw on a combination of common technologies associated with reproducible research, including virtualisation, literate computing (e.g. Jupyter Notebooks), and version control systems (Sinha & Sudhish, 2016), combined with web augmentation (Díaz & Arellano, 2015) and web scraping (Glez-Peña, Lourenço, López-Fernández, Reboiro-Jato, & Fdez-Riverola, 2014). Reproducible research technologies enable CASA to draw upon a large and growing collection of tools developed and used by the learning analytics and other research communities. The growing importance of reproducible research also means that there is an increasing number of university teaching staff familiar with these technologies, and an emerging research literature sharing insights and advice on supporting academics to develop the required skills (e.g. Wilson, 2014). Virtualisation allows CASA to be packaged as a single image that individuals can easily download, install and execute on their own computing platforms. Web augmentation provides the ability to adapt existing web-based learning environments so that learning analytics can be embedded directly into the current, common learning context. The combination of these technologies will be used to implement the CASA platform, enabling the broadest possible range of stakeholders to individually and collaboratively design and implement different CASA instances. Such instances can be mixed and matched to suit context-specific requirements and shared amongst a broader community. The following section provides a collection of CASA use case scenarios, including explicit links to Zittrain’s (2008) five principal factors of a generative platform.
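Before turning to those scenarios, the following is a minimal illustrative sketch, not the project’s actual implementation, of how these technologies might fit together: a CASA instance running inside a downloaded virtual image exposes per-student clickstream summaries, which a web augmentation user script could then inject into existing LMS pages. The file, route, and column names (clickstream.csv, /student/<id>, student_id, timestamp) are assumptions made purely for illustration.

    # Illustrative sketch only: a CASA instance inside a virtual image serves
    # per-student summaries; a browser user script (web augmentation) would
    # fetch this JSON and embed it in the LMS page. File, route and column
    # names are assumptions, not the project's actual design.
    import pandas as pd
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Clickstream data previously downloaded from the LMS during configuration
    clicks = pd.read_csv("clickstream.csv", parse_dates=["timestamp"])
    clicks["student_id"] = clicks["student_id"].astype(str)

    @app.route("/student/<student_id>")
    def student_summary(student_id):
        rows = clicks[clicks["student_id"] == student_id]
        return jsonify({
            "student_id": student_id,
            "total_clicks": int(len(rows)),
            "last_access": rows["timestamp"].max().isoformat() if len(rows) else None,
        })

    if __name__ == "__main__":
        # The user script in the teacher's browser queries this local instance.
        app.run(port=8008)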

CASA: Use case scenarios

A particular focus of the CASA platform is to enable individual teachers to adopt CASA instances while minimising the need to engage with institutional support services (accessibility). Consequently, a common scenario would be one where a teacher (Cara) observes another teacher (Daniel) using a CASA instance. It is obvious to Cara that this specific CASA instance makes a difficult job easier (leverage), which motivates her to trial it. Cara visits the CASA website and downloads and executes a virtual image (the CASA instance) on her computer, assuming she has local administrator rights. Cara configures CASA by visiting a URL for this new CASA instance and stepping through a configuration process that asks for some context-specific information (e.g. the URL for Cara’s course sites). Cara’s CASA uses this to download basic clickstream and learner data from the LMS. Finally, Cara downloads the Tampermonkey browser extension and installs the CASA user script in her browser. Now, when visiting any of her course websites, Cara can access visualisations of basic clickstream data for each student.
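As a rough illustration of the kind of basic clickstream visualisation this scenario describes, the following sketch plots one student’s daily LMS activity from a CSV export. The data layout, column names and student identifier are assumptions for illustration rather than the actual CASA design.

    # Sketch of a per-student clickstream visualisation; the CSV layout,
    # column names and student id are illustrative assumptions.
    import pandas as pd
    import matplotlib.pyplot as plt

    clicks = pd.read_csv("clickstream.csv", parse_dates=["timestamp"])
    one = clicks[clicks["student_id"].astype(str) == "s1234567"]  # hypothetical id

    daily = one.set_index("timestamp").resample("D").size()
    daily.plot(kind="bar", figsize=(10, 3), title="Daily LMS activity for s1234567")
    plt.tight_layout()
    plt.show()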

To further customise her CASA instance, Cara uploads additional data to provide more contextual and pedagogical detail (adaptation). The ability to do this is sign-posted and scaffolded from within the CASA tool (mastery). To expand the learner data, Cara sources a CSV file from her institution’s student records system. Once uploaded, all the additional information about each student appears in her CASA instance, and Cara can choose to further hide, reveal, or re-order this information (adaptation). To associate important course events (Corrin et al., 2015) with the clickstream data, Cara uses a calendar application to create an iCalendar file with important dates (e.g. assignment due dates, weekly lecture times). This file is uploaded or connected to CASA and the events are subsequently integrated into the clickstream analytics. At this stage, Cara has used CASA to add embedded, ubiquitous and contextually specific learning analytics about individual students into her course site. At no stage has Cara gained access to new information; CASA has simply made it easier for Cara to access existing information, increasing her efficiency (leverage). This positive experience encourages Cara to consider what more is possible.
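A minimal sketch of how the two uploads in this scenario might be folded into the clickstream analytics follows, assuming a student records CSV keyed by a student identifier and a standard iCalendar file; all file and column names are illustrative assumptions rather than the project’s data model.

    # Sketch: enrich clickstream data with student-records columns and parse
    # important course dates from an uploaded iCalendar file. File and column
    # names are illustrative assumptions.
    import pandas as pd
    from icalendar import Calendar

    clicks = pd.read_csv("clickstream.csv", parse_dates=["timestamp"])
    students = pd.read_csv("student_records.csv")   # extra contextual columns
    enriched = clicks.merge(students, on="student_id", how="left")

    # Course events (assignment due dates, lecture times) to overlay on
    # activity plots.
    with open("course_events.ics", "rb") as f:
        cal = Calendar.from_ical(f.read())

    events = [
        {"summary": str(ev.get("SUMMARY")), "start": ev.get("DTSTART").dt}
        for ev in cal.walk("VEVENT")
    ]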

Cara engages in a discussion with Helen, a local educational designer. The discussion explores the purpose of using learning analytics and how it relates to intended learning outcomes. This leads to questions about exactly how and when Cara is engaging in the learning environment, and then to various forms of participatory design with Chuck (a software developer). Chuck demonstrates how the student clickstream notebook from Cara’s existing instance can be copied and modified to visualise staff activity (mastery). Chuck also demonstrates how this new instance can be shared back to the CASA repository and how this process will eventually allow Daniel to choose to adopt this new instance (transferability). These discussions may also reveal insights into other factors, such as limitations in Cara’s conceptions and practices of learning and teaching, or institutional factors and limitations (e.g. limited quality or variety of available data).
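The copy-and-modify step Chuck demonstrates could be as small as changing a single filter in the existing notebook. A hedged sketch follows, assuming (purely for illustration) that the clickstream export carries a role column distinguishing staff from students.

    # Sketch of modifying the student clickstream notebook cell to show staff
    # activity instead; the 'role' column is an assumption about the data.
    import pandas as pd
    import matplotlib.pyplot as plt

    clicks = pd.read_csv("clickstream.csv", parse_dates=["timestamp"])
    staff = clicks[clicks["role"] == "teacher"]   # was: clicks["role"] == "student"

    weekly = staff.set_index("timestamp").resample("W").size()
    weekly.plot(title="Staff activity per week")
    plt.tight_layout()
    plt.show()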

Conclusions and questions

This paper has described the rationale, origins, theoretical principles, planned technical implementation and possible use cases for CASA. CASA is a generative learning analytics platform intended to act as a boundary object: an object that engages diverse stakeholders more effectively in creative acts of making in order to make sense of, and respond to, the diversity and complexity inherent in learning and teaching in contemporary higher education. By allowing both DIW (participatory design) and DIY (end-user development) approaches to the implementation of learning analytics, we think CASA can enable the development of embedded, ubiquitous and contextually specific applications of learning analytics, better position teaching as design, and subsequently improve learning experiences and outcomes. As novices in the practice of participatory design, we are looking for assistance in examining how insights from participatory design can inform the design and use of CASA. For us, there appear to be three areas of design activity where participatory design can help, along with the possibility that the addition of generative technology might strengthen participatory design.

First, the design of the CASA platform itself could benefit from participatory design. A particular challenge to implementation within higher education institutions is that, as a generative platform, CASA embodies a different mindset. A generative mindset invites open participation and assumes open participation provides significant advantage, especially in terms of achieving contextually appropriate applications. It sees users as partners and co-designers. An institutional mindset tends to see users as the subject of design and, due to concerns about privacy, security, and deficit models, seeks to significantly limit participation in design. Second, the DIW interaction between Cara, Helen and Chuck in the use case section is a potential example of using participatory design and the CASA platform to co-design and co-create contextually specific CASA instances. What methods, tools and techniques from participatory design could help these interactions? Is there benefit in embedding support for some of these within the CASA platform? Lastly, the CASA approach also seeks to enable individual teachers to engage in DIY development. According to Zittrain (2008), the easier we can make it for teachers to develop their own CASA instances (mastery), the more generative the platform will be. What insights from participatory design might help increase CASA’s generative nature? Can CASA be seen as an example of a generative toolkit (Sanders & Stappers, 2014)? Or does the DIY focus move into the post-design stage (Sanders & Stappers, 2014)? Does it move beyond participatory design? Is the combination of participatory design and generative technology something different from, and more effective than, participatory design alone? If it is separate, then how can the insights generated by DIY making with CASA be fed back into the ongoing participatory design of the CASA platform, other CASA instances, and sense-making about the broader institutional context?

References

Baker, R. (2016). Stupid Tutoring Systems, Intelligent Humans. International Journal of Artificial Intelligence in Education, 26(2), 600-614. https://doi.org/10.1007/s40593-016-0105-0

Beer, C., Tickner, R., & Jones, D. (2014). Three paths for learning analytics and beyond: moving from rhetoric to reality. In B. Hegarty, J. McDonald, & S. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014 (pp. 242-250).

Colvin, C., Dawson, S., Wade, A., & Gašević, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. F. Wise, & D. Gašević (Eds.), The Handbook of Learning Analytics (pp. 281-289). Alberta, Canada: Society for Learning Analytics Research.

Corrin, L., Kennedy, G., Barba, P. D., Williams, D., Lockyer, L., Dawson, S., & Copeland, S. (2015). Loop: A learning analytics tool to provide teachers with useful data visualisations. In T. Reiners, B. von Konsky, D. Gibson, V. Chang, L. Irving, & K. Clarke (Eds.), Globally connected, digitally enabled. Proceedings ascilite 2015 (pp. 57-61).

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43-62). New York: Springer.

Díaz, O., & Arellano, C. (2015). The Augmented Web: Rationales, Opportunities, and Challenges on Browser-Side Transcoding. ACM Trans. Web, 9(2), 8:1-8:30.

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84. https://doi.org/10.1016/j.iheduc.2015.10.002

Glez-Peña, D., Lourenço, A., López-Fernández, H., Reboiro-Jato, M., & Fdez-Riverola, F. (2014). Web scraping technologies in an API world. Briefings in Bioinformatics, 15(5), 788-797. https://doi.org/10.1093/bib/bbt026

Goodyear, P. (2015). Teaching As Design. HERDSA Review of Higher Education, 2, 27-50.

Jones, D. (2011). An Information Systems Design Theory for E-learning (Doctoral thesis, Australian National University, Canberra, Australia). Retrieved from https://openresearch-repository.anu.edu.au/handle/1885/8370

Jones, D., Jones, H., Beer, C., & Lawson, C. (2017, December). Implications and questions for institutional learning analytics implementation arising from teacher DIY learning analytics. Paper presented at the ALASI 2017: Australian Learning Analytics Summer Institute, Brisbane, Australia. Retrieved from http://tiny.cc/ktsdiy

Liu, D. Y.-T., Bartimote-Aufflick, K., Pardo, A., & Bridgeman, A. J. (2017). Data-Driven Personalization of Student Learning Support in Higher Education. In A. Peña-Ayala (Ed.), Learning Analytics: Fundaments, Applications, and Trends (pp. 143-169). Springer International Publishing.

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.

Sanders, E. B.-N., & Stappers, P. J. (2014). Probes, toolkits and prototypes: three approaches to making in codesigning. CoDesign, 10(1), 5-14. https://doi.org/10.1080/15710882.2014.888183

Schmitz, M., van Limbeek, E., Greller, W., Sloep, P., & Drachsler, H. (2017). Opportunities and Challenges in Using Learning Analytics in Learning Design. In Data Driven Approaches in Digital Education (pp. 209-223). Cham: Springer.

Sinha, R., & Sudhish, P. S. (2016). A principled approach to reproducible research: a comparative review towards scientific integrity in computational research. In 2016 IEEE International Symposium on Ethics in Engineering, Science and Technology (ETHICS) (pp. 1-9).

Suthers, D., & Verbert, K. (2013). Learning analytics as a middle space. In Proceedings of the Third International Conference on Learning Analytics and Knowledge – LAK ’13 (pp. 2-5).

Wilson, G. (2014). Software Carpentry: lessons learned. F1000Research, 3.

Zittrain, J. (2008). The Future of the Internet–And How to Stop It. Yale University Press.
