Assembling the heterogeneous elements for (digital) learning

Category: bricolage

Representing problems to make the solution transparent

The following illustrates how the game Number Scrabble and Herb Simon’s thoughts on the importance of problem representation in design appear likely to help with the migration of 1000s of course sites from Blackboard Learn (aka Blackboard Original) to another LMS, and perhaps to remain useful post-migration.

Number Scrabble

Number Scrabble is a game I first saw described in Simon’s (1996) book Sciences of the Artificial. I used it in a presentation from 2004 (the source of the following images).

Number Scrabble is a game played between two players. The players are presented with nine cards, numbered 1 to 9. The players take turns selecting one card at a time. The aim is to get three cards that add up to 15 (aka a “book”). The first player to obtain a book wins. If no player gets a book, the game is a draw.

Basic number scrabble

Making the solution transparent

Simon (1996) argues that problem representation is an important part of problem solving and design. He identifies the extreme (perhaps not always possible) version of this view as

Solving a problem simply means representing it so as to make the solution transparent.

He uses the example of Number Scrabble to illustrate the point.

How much easier would you find it to play Number Scrabble if the cards were organised in the following magic square?

Would it help any if I mentioned another game, tic-tac-toe?

Number scrabble's magic square

With this new representation Number Scrabble becomes a game of tic-tac-toe. No arithmetic is required, and the tactics and strategies most people are already familiar with become applicable.

My Problem: Course Migration – Understand what needs migrating

Over the next two years my colleagues and I will be engaged in the process of migrating University courses from the Blackboard Learn (aka Blackboard Original) LMS to another LMS. Our immediate problem is to understand what needs migrating and to identify if and how it should/can be migrated to the new LMS.

I’ve actually grown to quite like Blackboard Learn. But it’s old and difficult to use (well). It’s very hard to fully understand the purpose and design of a course site by looking at and navigating around it. A course site is likely to have a handful of areas curated by the teaching staff, each with a collection of different tools and content organised according to various schemes. There are another handful of areas for configuring the course site.

To make things more difficult, a Blackboard course site has a modal interface, meaning the course site will look different for different people at different times.

In addition, using Dron’s (2021) definition, Blackboard Learn is a very soft technology, which makes it hard to use. As a soft technology, Blackboard Learn provides great flexibility in how it is used. That flexibility, applied across 1000s of course sites, will reveal many interesting approaches.

Attempting to understand the design, purpose and requirements of a Blackboard course site by looking at it is a bit like playing Number Scrabble with a single line of cards. A game we have to play 1000s of times.

Can we make the migration problem (more) transparent? How we’re trying

I wondered if the design problem of if/what/how to migrate a course site would be simpler if we were able to change the representation of the course site. Could we develop a representation that would make the solution (more) transparent?

Could we develop a representation we designers could use to gain an initial understanding of the intent and method of a course site? A representation we could use during collaboration with the teaching staff and other colleagues to refine that understanding and plan the migration. A representation that could be scaled for use across 1000s of course sites and perhaps lay the foundation for business as usual post-migration.

What I currently have is a collection of Python code that given a URL for a Blackboard course site will:

  1. Scrape the course site and store a data structure representing the site, its content and configuration.
  2. Perform various forms of analysis and modeling with this data to reveal important features.
  3. Generate a Word document summarising the course and hopefully providing the representation we need.

The idea is that, given a list of 1000s of Blackboard courses, the code can quickly perform these steps and provide a more transparent representation of the problem.
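
To make the shape of this concrete, here is a minimal sketch of those three steps. It is not the actual script: the libraries (requests, BeautifulSoup, python-docx), the URL, the selector and the feature being counted are all placeholder assumptions, and the real code has to handle Blackboard authentication and many more content types.

    import json
    import requests
    from bs4 import BeautifulSoup
    from docx import Document

    def scrape(course_url, session):
        """Step 1: fetch the course site and keep a simple data structure."""
        soup = BeautifulSoup(session.get(course_url).text, "html.parser")
        # Placeholder selector: the real script walks menus, content areas and tools
        links = [a.get_text(strip=True) for a in soup.select("a")]
        return {"url": course_url, "links": links}

    def analyse(site):
        """Step 2: derive features that matter for migration decisions."""
        return {"link_count": len(site["links"])}

    def report(site, features, out_path):
        """Step 3: generate a Word document summarising the course."""
        doc = Document()
        doc.add_heading(site["url"], level=1)
        doc.add_paragraph(f"Number of links found: {features['link_count']}")
        doc.save(out_path)

    if __name__ == "__main__":
        session = requests.Session()  # authentication against Blackboard omitted
        site = scrape("https://example.edu/webapps/blackboard/some-course", session)
        with open("site.json", "w") as f:
            json.dump(site, f)        # keep the scraped structure for later analysis
        report(site, analyse(site), "course_summary.docx")

Run across a list of course URLs, the same three functions produce one summary document per course, which is the more transparent representation the migration work needs.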

But is it useful? Is it making solutions transparent? Yes

The script is not 100% complete. But it’s already proving useful.

Yesterday I was helping a teacher with one task on their course site (a story for another blog post). The teacher mentioned in passing another problem from earlier in the course. A problem that has been worked around, but for which the cause remains mysterious. It was quite a strange problem. Not one I’d encountered before. I had some ideas but confirmation would require further digging into the complexity of a Blackboard course site. Who has the time?!

As I’m also currently working on the “representation” script I thought I’d experiment with this course. Mainly to test the script, but maybe to reveal some insights.

I ran the script. Skimmed the resulting Word document and bingo there’s the cause. A cause I would never have considered. But it is understandable how it came about.

The different representation made the solution transparent!!

References

Dron, J. (2021). Educational technology: What it is and how it works. AI & SOCIETY. https://doi.org/10.1007/s00146-021-01195-z

Simon, H. (1996). The sciences of the artificial (3rd ed.). MIT Press.

Exploring auto-coding with NVivo

The challenge here is to learn more about using NVivo in order to design processes for a research project exploring the prevalence and nature of workarounds in higher education learning and teaching.

Can Word documents be imported and pre marked up?

The current plan is to have people complete a Word template. The template consists of numerous questions related to Alter’s Theory of Workarounds (Alter, 2014). The question is whether there’s a good way to structure this document to make it easier to code responses according to Alter’s (2014) theory? A simple first step to further analysis.

The answer is yes. The following documents the process.

What’s the NVivo model?

I’m a great believer in the idea that most difficulties with using software arise from a model mismatch between the person and the software. Hence my starting point with a new bit of software is to try and build a representation of the model underpinning the software.

The fact that NVivo’s makers have an Understand the key concepts page has me feeling good about NVivo. As does the observation that the Using NVivo page starts with a focus on different types of qualitative research – the domain with which most potential NVivo users will have some familiarity. Start with where the user is. Good.

Though a diagram on the “using” page does suggest that importing data into NVivo is a step separate from coding.

Quick summary of some of the concepts

Concept | NVivo Purpose | Local project
Files | Materials to analyse (can organise in folders) | Individual workaround descriptions
Memos | Place to store ideas and thoughts arising from analysis | Descriptions of analysis process/progress
Coding | Process of analysing content and allocating to a node | Analysis of workarounds
Nodes | Container for content coded as belonging to a common theme. Can be organised in a hierarchy. |

There is some support for auto-coding structured content, which relies on consistent use of paragraph styles in documents. It even supports nested nodes through the use of nested headings (e.g. H1, H2 etc).
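
Because the auto-coding hinges on consistent heading levels, one way to keep the template consistent is to generate it programmatically. The following is a rough sketch using python-docx; the questions are hypothetical placeholders standing in for those actually derived from Alter’s (2014) theory.

    from docx import Document

    # Hypothetical questions standing in for those derived from Alter's (2014) theory
    QUESTIONS = [
        "Describe the workaround",
        "What was the obstacle or misfit that prompted it?",
        "What were the intentions behind the workaround?",
        "What were the consequences?",
    ]

    doc = Document()
    doc.add_heading("Workaround description", level=1)   # becomes the parent node when auto-coded
    for question in QUESTIONS:
        doc.add_heading(question, level=2)               # becomes a nested node when auto-coded
        doc.add_paragraph("(respondent's answer goes here)")
    doc.save("workaround_template.docx")

Whether generated or hand-built, the point is the same: Heading 1 and Heading 2 styles give NVivo’s auto-code wizard the structure it needs.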

First test

As it happens, many years of frustration with Microsoft Word have convinced me of the value of using paragraph styles correctly. Hence the template is already set up. I should be able to import and auto-code my first document.

Wonder if I can do it without following the recipe instructions.

There was an option to

Create a new case for each imported file?

What’s a case in NVivo speak? Ahh, nodes can be theme or case nodes. Each workaround being a separate case node seems like a useful idea. But there’s also the idea of a workaround belonging to a particular individual. Appears multiple case nodes are possible.

Importing is straightforward. Need to use the autocode wizard and make some decisions about where to put new nodes. For now, place them under the case node for the workaround.

And it all appears to work

Software engineering for computational science: past, present, future

Following is a summary of Johanson and Hasselbring (2018) and an exploration of what, if anything, it might suggest for learning design and learning analytics. Johanson and Hasselbring (2018) explore why scientists who have been developing software to do science (computational science) haven’t been using principles and practices from software engineering to develop this software. The idea is that such an understanding will help frame advice for how computational science can be improved through application of appropriate software engineering practice (**assumption**).

This is interesting because of potential similarities between learning analytics (and perhaps even learning design in our now digitally rich learning environments) and computational science. Subsequently, lessons about if and how computational science has/hasn’t been using software engineering principles might provide useful insights for the implementation of learning analytics and the support of learning design. I’m especially interested due to my observation that both practice and research around learning analytics implementation isn’t necessarily exploring all of the possibilities.

In particular, Johanson and Hasselbring (2018) argue that it is necessary to examine the nature of computational science and subsequently select and adapt software engineering techniques that are better suited to the needs of computational scientists. For me, this generates questions such as:

  1. What is the nature of learning analytics?
  2. What is the nature of learning design?
  3. What happens to the combination of both?

    Increasingly it is seen as necessary that learning analytics be tightly aligned with learning design. Is the nature/outcome/practice of this combination different? Does it require different types of support?

  4. For all of the above is there a difference between the nature espoused in the research literature and the nature experienced by the majority of practitioners?
  5. What types and combination of software engineering/development principles and practices are best suited to the nature of learning analytics and learning design?

Summary of the paper

  • Question/problem

    The development of software with which to do science is increasing, but this practice isn’t using software engineering practices. Why? What are the underlying causes? How can it be changed?

  • Method

    Survey of relevant literature examining software development in computational science. About 50 publications were examined, the majority of them case studies, with some surveys.

  • Findings

    Identify 13 key characteristics (divided into 3 groups) of computational science that should be considered (see table below) when thinking about which software engineering knowledge might apply and be adapted.

    Examines some examples of how software engineering principles might be/are being adapted.

Implications for learning analytics

Johanson and Hasselbring (2018) argue that the chasm between computational scientists and software engineering researchers arose from the rush on the part of computer scientists, and then software engineers, to avoid the “stigma of all things applied”: the search for general principles that applied in all places. This leads to the following problem

Because of this ideal of generality, the question of how specifically computational scientists should develop their software in a well-engineered way, would probably have perplexed a software engineer and the answer might have been: “Well, just like any other application software.”

In learning analytics there are people offering more LA-specific advice. For example, Wise & Vytasek (2017) and, just this morning via Twitter, the pre-print of a looming BJET article. Both are focused on providing advice that links learning analytics and learning design.

But I wonder if this is the only way to look at learning analytics? What about learning analytics for reflection and exploration? Does the learning design perspective cover it?

But perhaps a more interesting question might be whether or not it is assumed that the learning analytics/learning design principles identified by these authors should then be implemented using traditional software engineering practices?

The 13 characteristics, organised by category:

Nature of scientific challenges

  1. Requirements are not known up front

    • Software is used to make novel discoveries and further understanding; it is “deeply embedded” in an exploratory process

    • The aim is not “to produce software but to obtain scientific results”. Segal (2005): scientists say they are “programming experimentally”

    • Design and requirements are rarely seen as distinct steps

  2. Verification and validation is difficult and strictly scientific

    • Verification: demonstrate that the implementation of the models is correct

    • Validation: demonstrate that the software captures the real world

    • Validation is hard because models are being used “precisely because the subject at hand is ‘too complex, too large, too small, too dangerous, or too expensive to explore in the real world’” (Segal and Morris, 2008)

    • Problems arise from four different dimensions/combinations (Carver et al, 2007):

      • The model of reality is insufficient

      • The algorithm used to discretise the mathematical problem can be inadequate

      • The implementation of the algorithm is wrong

      • The combination of models can propagate errors

    • Testing methods could help, but are rarely used

  3. Overly formal software processes restrict research

    • Easterbrook and Johns (2009): big up-front design is a “poor fit” for computational science, where software is deeply embedded in the scientific model

    • There is a need for the flexibility to quickly experiment with different solution approaches (Carver et al, 2007)

    • A very iterative process is used, iterating over both the software and the underlying scientific theory

    • Explicit connections with agile software development have been established in the literature, but even those lightweight processes are largely rejected (a representation of the process is shown in the figure below)

Limitations of computers

  4. Development is driven and limited by hardware

    • Scientific software is not limited by the scientific theory, but by the available computing resources

    • Computational power is an issue

  5. Use of “old” programming languages and technologies

    • Some communities are moving toward Python, but typically the non-technical disciplines (biology/psychology) and only for small-scale projects

  6. Intermingling of domain logic and implementation details

  7. Conflicting software quality requirements (performance, portability and maintainability)

    • Interviews of scientific developers rank the requirements as:

      • Functional correctness

      • Performance

      • Portability

      • Maintainability

Cultural environment

  8. Few scientists are trained in software engineering

    • Segal (2007) describes them as “professional end user developers” who develop software to advance their own professional goals

    • “In contrast to most conventional end user developers, however, computational scientists rarely experience any difficulties learning general-purpose languages”

    • But keeping up with software engineering is just too much for people who are already busy writing grants etc.

    • They didn’t want to delegate development, as it often required a PhD in the discipline to be able to understand and implement the software

  9. Different terminology

    • e.g. computational scientists speak of “code” not “software”

  10. Scientific software in itself has no value but still it is long-lived

    • Code is valued because of the domain knowledge captured within it

  11. Creating a shared understanding of a “code” is difficult

    • Preference for informal, collegial ways of knowledge transfer, not documentation

    • “scientists find it harder to read and understand documentation artifacts than to contact the author and discuss”

  12. Little code re-use

  13. Disregard of most modern software engineering methods

A model of scientific software development

Johanson and Hasselbring (2018) include the following figure as a representation of how scientific software is developed. They note its connections with agile software development, but also describe how computational scientists find even the lightweight discipline of agile software development to be a poor fit.

Model of Scientific Software Development

Anecdotally, I’d suggest that the above representation would offer a good description of much of the “learning design” undertaken in universities. Though with some replacements (e.g. “develop piece of software” replaced with “develop learning resource/experience/event”).

If this is the case, then how well does the software engineering approach to the development and implementation of learning analytics (whether it follows the old SDLC or agile practices) fit with this nature of learning design?

References

Johanson, A., & Hasselbring, W. (2018). Software Engineering for Computational Science: Past, Present, Future. Computing in Science & Engineering. https://doi.org/10.1109/MCSE.2018.108162940

Wise, A., & Vytasek, J. (2017). Learning Analytics Implementation Design. In C. Lang, G. Siemens, A. F. Wise, & D. Gaševic (Eds.), The Handbook of Learning Analytics (1st ed., pp. 151–160). Alberta, Canada: Society for Learning Analytics Research (SoLAR). Retrieved from http://solaresearch.org/hla-17/hla17-chapter1

Teacher DIY learning analytics – implications & questions for institutional learning analytics

The following provides a collection of information and resources associated with a paper and presentation given at ALASI 2017 – the Australian Learning Analytics Summer Institute in Brisbane on 30 November, 2017. Below you’ll find an abstract, a recording of a version of the presentation, the presentation slides and the references.

The paper examines the DIY development and use of a particular application of learning analytics (known as Know thy student) within a single course during 2015 and 2016. The paper argues that, given the limitations of what is known about the institutional implementation of learning analytics, examining teacher DIY learning analytics can reveal some interesting insights. The paper identifies three implications and three questions.

Three implications

  1. Institutional learning analytics currently falls short of an important goal.

    If the goal of learning analytics is that “of getting key information to a human being who can use it” (Baker, 2016, p. 607) then institutional learning analytics is falling short, and not just at a specific institution.

  2. Embedded, ubiquitous, contextual learning analytics encourages greater use and enables emergent practice.

    This case suggests that learning analytics interventions designed to provide useful contextual data appropriately embedded ubiquitously throughout the learning environment can enable significant levels of usage, including usage that was unplanned, emerged from experience, and changed practice.

    In this case, Know thy student was used by the teacher on 666 different days (~91% of the days that the tool was available) to find out more about ~90% of the enrolled students. Graphical representations below.

  3. Teacher DIY learning analytics is possible.

    Know thy student was implemented by a single academic using a laptop, widely available software (including some coding), and existing institutional data sources.

Three questions

  1. Does institutional learning analytics have an incomplete focus?

    Research and practice around the institutional implementation of learning analytics tends to focus on “at scale”: learning analytics that can be used across multiple courses or an entire institution. That focus appears to be at the expense of course- or learning-design-specific learning analytics, which appear to be more useful.

  2. Does the institutional implementation of learning analytics have an indefinite postponement problem?

    Aspects of Know thy student are specific to the particular learning design within a single course. The implementation of such a specific requirement would appear unlikely to have ever been undertaken by existing institutional learning analytics implementation. It would have been indefinitely postponed.

  3. If and how do we enable teacher DIY learning analytics?

    This case suggests that teacher DIY learning analytics is possible and potentially overcomes limitations in current institutional implementation of learning analytics. However, it’s also not without its challenges and limitations. Should institutions support teacher DIY learning analytics? How might that be done?

Usage

The following heat map shows the number of times Know thy student was used on each day during 2015 and 2016.

Know thy student usage clicks per day

The following bar graph contains 761 “bars”. Each bar represents a unique student enrolled in this course. The size of the bar shows the number of times Know thy student was used for that particular student. (One student was obviously used for testing purposes during the development of the tool)

Know thy student usage clicks per student

Abstract

The paper on which it is based has the following abstract.

Learning analytics promises to provide insights that can help improve the quality of learning experiences. Since the late 2000s it has inspired significant investments in time and resources by researchers and institutions to identify and implement successful applications of learning analytics. However, there is limited evidence of successful at scale implementation, somewhat limited empirical research investigating the deployment of learning analytics, and subsequently concerns about the insight that guides the institutional implementation of learning analytics. This paper describes and examines the rationale, implementation and use of a single example of teacher do-it-yourself (DIY) learning analytics to add a different perspective. It identifies three implications and three questions about the institutional implementation of learning analytics that appear to generate interesting research questions for further investigation.

Presentation recording

The following is a recording of a talk given at CQUni a couple of weeks after ALASI. It uses the same slides as the original ALASI presentation, however, without a time limit the description is a little expanded.

Slides

Also view and download here.

References

Baker, R. (2016). Stupid Tutoring Systems, Intelligent Humans. International Journal of Artificial Intelligence in Education, 26(2), 600–614. https://doi.org/10.1007/s40593-016-0105-0

Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124–129.

Colvin, C., Dawson, S., Wade, A., & Gašević, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. F. Wise, & D. Gaševic (Eds.), The Handbook of Learning Analytics (1st ed., pp. 281–289). Alberta, Canada: Society for Learning Analytics Research (SoLAR).

Corrin, L., Kennedy, G., & Mulder, R. (2013). Enhancing learning analytics by understanding the needs of teachers. In Electric Dreams. Proceedings ascilite 2013 (pp. 201–205).

Díaz, O., & Arellano, C. (2015). The Augmented Web: Rationales, Opportunities, and Challenges on Browser-Side Transcoding. ACM Trans. Web, 9(2), 8:1–8:30. https://doi.org/10.1145/2735633

Dron, J. (2014). Ten Principles for Effective Tinkering (pp. 505–513). Presented at the E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, Association for the Advancement of Computing in Education (AACE).

Ferguson, R. (2014). Learning analytics FAQs. Education. Retrieved from https://www.slideshare.net/R3beccaF/learning-analytics-fa-qs

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2015). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicating learning success. The Internet and Higher Education, 28, 68–84. https://doi.org/10.1016/j.iheduc.2015.10.002

Germonprez, M., Hovorka, D., & Collopy, F. (2007). A theory of tailorable technology design. Journal of the Association of Information Systems, 8(6), 351–367.

Grover, S., & Pea, R. (2013). Computational Thinking in K-12: A Review of the State of the Field. Educational Researcher, 42(1), 38–43. https://doi.org/10.3102/0013189X12463051

Hatton, E. (1989). Levi-Strauss’s Bricolage and Theorizing Teachers’ Work. Anthropology and Education Quarterly, 20(2), 74–96.

Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition (No. 9780989733557). Austin, Texas. Retrieved from http://www.nmc.org/publications/2014-horizon-report-higher-ed

Jones, D., & Clark, D. (2014). Breaking BAD to bridge the reality/rhetoric chasm. In B. Hegarty, J. McDonald, & S. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014 (pp. 262–272).

Kay, A., & Goldberg, A. (1977). Personal Dynamic Media. Computer, 10(3), 31–41.

Ko, A. J., Abraham, R., Beckwith, L., Blackwell, A., Burnett, M., Erwig, M., … Wiedenbeck, S. (2011). The State of the Art in End-user Software Engineering. ACM Comput. Surv., 43(3), 21:1–21:44. https://doi.org/10.1145/1922649.1922658

Kruse, A., & Pongsajapan, R. (2012). Student-Centered Learning Analytics (CNDLS Thought Papers). Georgetown University. Retrieved from https://cndls.georgetown.edu/m/documents/thoughtpaper-krusepongsajapan.pdf

Levi-Strauss, C. (1966). The Savage Mind. Weidenfeld and Nicolson.

Liu, D. Y.-T. (2017). What do Academics really want out of Learning Analytics? – ASCILITE TELall Blog. Retrieved August 27, 2017, from http://blog.ascilite.org/what-academics-really-want-out-of-learning-analytics/

Liu, D. Y.-T., Bartimote-Aufflick, K., Pardo, A., & Bridgeman, A. J. (2017). Data-Driven Personalization of Student Learning Support in Higher Education. In A. Peña-Ayala (Ed.), Learning Analytics: Fundaments, Applications, and Trends (pp. 143–169). Springer International Publishing. https://doi.org/10.1007/978-3-319-52977-6_5

Lonn, S., Aguilar, S., & Teasley, S. D. (2013). Issues, Challenges, and Lessons Learned when Scaling Up a Learning Analytics Intervention. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 235–239). New York, NY, USA: ACM. https://doi.org/10.1145/2460296.2460343

MacLean, A., Carter, K., Lövstrand, L., & Moran, T. (1990). User-tailorable Systems: Pressing the Issues with Buttons. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 175–182). New York, NY, USA: ACM. https://doi.org/10.1145/97243.97271

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054.

Norman, D. A. (1993). Things that make us smart: defending human attributes in the age of the machine. Cambridge, Mass: Perseus.

Repenning, A., Webb, D., & Ioannidou, A. (2010). Scalable Game Design and the Development of a Checklist for Getting Computational Thinking into Public Schools. In Proceedings of the 41st ACM Technical Symposium on Computer Science Education (pp. 265–269). New York, NY, USA: ACM. https://doi.org/10.1145/1734263.1734357

Scanlon, E., Sharples, M., Fenton-O’Creevy, M., Fleck, J., Cooban, C., Ferguson, R., … Waterhouse, P. (2013). Beyond prototypes: Enabling innovation in technology‐enhanced learning. London. Retrieved from http://tel.ioe.ac.uk/wp-content/uploads/2013/11/BeyondPrototypes.pdf

Sinha, R., & Sudhish, P. S. (2016). A principled approach to reproducible research: a comparative review towards scientific integrity in computational research. In 2016 IEEE International Symposium on Ethics in Engineering, Science and Technology (ETHICS) (pp. 1–9). https://doi.org/10.1109/ETHICS.2016.7560050

Wiley, D. (n.d.). The Reusability Paradox. Retrieved from http://cnx.org/content/m11898/latest/

Wiliam, D. (2006). Assessment: Learning communities can use it to engineer a bridge connecting teaching and learning. JSD, 27(1).

Yoo, Y., Boland, R. J., Lyytinen, K., & Majchrzak, A. (2012). Organizing for Innovation in the Digitized World. Organization Science, 23(5), 1398–1408.

Zittrain, J. L. (2006). The Generative Internet. Harvard Law Review, 119(7), 1974–2040.

Improving teacher awareness, action and reflection on learner activity

The following post contains the content from a poster designed for the 2017 USQ Toowoomba L&T celebration event. It provides some rationale for a technology demonstrator at USQ based on the Moodle Activity Viewer.

What is the problem?

Learner engagement is a key to learner success. Most definitions of learner engagement include “actively participating, interacting, and collaborating with students, faculty, course content and members of the community” (Angelino & Natvig, 2009, p. 3).

70% of USQ students study online. By mid-November 2017, 26,754 students had been active in USQ’s Moodle LMS.

In online learning, the absence of visual cues makes teacher awareness of student activity difficult (Govaerts, Verbert, & Duval, 2011).  Richardson (2011) identifies “the role which teaching staff play in inspiring, challenging and engaging students” as “perhaps the most woefully neglected aspect of quality in higher education” (p. 2)

Learning analytics (LA) is the “use of (big) data to provide actionable intelligence for learners and teachers” (Ferguson, 2014). However, current tools provide poor data aggregation, poor visualisation capabilities and have other limitations that inhibit teachers’ ability to: understand student activity; respond appropriately; and, reflect on course design (Dawson & McWilliam, 2008; Corrin et al, 2013; Jones & Clark, 2014).

How will it be addressed?

Teachers can be supported through tools that help them “analyse, appraise and improve practices in their everyday activity systems” (Knight et al, 2006, p. 337).

This Technology Demonstrator has implemented the Moodle Activity Viewer (MAV), and will customise and scaffold its use, within the USQ activity system.

The MAV is a useful and easy-to-use tool that provides representations of student activity from within all Moodle learning spaces. It provides affordances to support teacher intervention and further analysis.

MAV - How many students

MAV’s overlay answering the question how many and what percentage of students have accessed each Moodle activity & resource?

What are the expected outcomes?

The project aims to explore two questions:

  1. If and how does the provision of contextual, useful, and easy to use representations of online learner activity help teachers analyse, appraise and improve their practices?
  2. If and how does this change in teacher activity influence learner activity and learning outcomes?

MAV - How many clicks

MAV’s overlay answering the question how many times have those students clicked on each Moodle activity & resource?

Want to learn more?

Ask for a demonstration of MAV during the poster session.

USQ staff can learn more* about and start using MAV from http://tiny.cc/aboutmav and http://tiny.cc/installmav

* (Only from a USQ campus or via the USQ VPN)

MAV - How many students in a forum

MAV’s overlay answering the question how many and what percentage of students have read posts in this introductory activity?

MAV - Who accessed and how to contact them

MAV’s student access dialog providing details of and enabling teacher contact with the students who have accessed the “Fix my class IWB” forum.

References

Angelino, L. M., & Natvig, D. (2009). A Conceptual Model for Engagement of the Online Learner. Journal of Educators Online, 6(1), 1–19.

Corrin, L., Kennedy, G., & Mulder, R. (2013). Enhancing learning analytics by understanding the needs of teachers. In Electric Dreams. Proceedings ascilite 2013 (pp. 201–205).

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicators of learning and teaching performance. Queensland University of Technology and the University of British Columbia.

Ferguson, R. (2014). Learning analytics FAQs. Education. Retrieved from https://www.slideshare.net/R3beccaF/learning-analytics-fa-qs

Govaerts, S., Verbert, K., & Duval, E. (2011). Evaluating the Student Activity Meter: Two Case Studies. In Advances in Web-Based Learning – ICWL 2011 (pp. 188–197). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25813-8_20

Jones, D., & Clark, D. (2014). Breaking BAD to bridge the reality/rhetoric chasm. In B. Hegarty, J. McDonald, & S.-K. Loke (Eds.), Proceedings of the 31st Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE 2014) (pp. 262–272). Sydney, Australia: Macquarie University.

Knight, P., Tait, J., & Yorke, M. (2006). The professional learning of teachers in higher education. Studies in Higher Education, 31(3), 319–339. https://doi.org/10.1080/03075070600680786

Nudging up MyOpinion response rates using a gamified leaderboard

The following information is taken from and adds to the contents of a poster by Alice Brown and me for a USQ L&T Celebration Event. It describes the need for a Student Evaluation of Teaching leaderboard, how it works, and the results of some early applications (12 to 15% increases in response rates in individual courses, resulting in response rates that are double the institutional average – around 50%, rather than 25%).

Challenge – Course evaluation ‘buy-in’ and responding to MyOpinion

Many academics struggle with getting ‘buy-in’ from students in terms of providing feedback to institutional Student Evaluation of Teaching (SET) surveys (labelled MyOpinion at USQ). While efforts to prompt students to respond might include various forms of communication, announcements, reference to how a course has reflected and acted on feedback in future updates, and other promotional features (including USQ’s use of a big yellow button), increasing response rates still tends to be a challenge, with average SET response rates fluctuating between 30% and 50% (Bennett & De Bellis, 2010; Spooren et al, 2013).

Nudging, gamification & leaderboards

  • Nudge – “any aspect of the choice architecture that alters people’s behaviour in a predictable way without forbidding any options or significantly changing their economic incentives” (Thaler & Sunstein, 2008, p. 6).
  • Deterding et al (2011) define gamification as “the use of game design elements in non-game contexts” (p. 10) with the intent to “motivate and increase user activity” (p. 9)
  • Leaderboards are amongst the most popular game mechanics found in case studies using gamification in education (Dicheva et al, 2015).

A leaderboard is a game design element consisting of a visual display that ranks players according to their accomplishments; when used in an educational setting it serves as a way for students to directly compare their own performance with that of others (Christy & Fox, 2014, p. 67)

Question – Can a MyOpinion leaderboard help increase student response rates?

The leaderboard was initially trialled in 2015 in the course EDC3100 and then extended into several other courses in 2016. Two academics wanted to explore whether providing a ‘nudge’ through the integration of a MyOpinion leaderboard could increase response rates.

How does a MyOpinion leaderboard work?

The following image (click on it to see a bigger version) illustrates how the leaderboard works. See below for more explanation.

How a MyOpinion leaderboard works

The MyOpinion leaderboard works like this:

  1. Course examiner checks response rates.

    While the MyOpinion survey is live, the course examiner checks the MyOpinion site every day or so. At this stage, the course examiner can only see the number of responses. Nothing more.

  2. Course examiner updates a Google spreadsheet

    If the current response rate increases, the course examiner updates a previously configured Google spreadsheet. The spreadsheet contains data about all the relevant offerings of the courses, including: the number of enrolled students; and, the number of MyOpinion responses. The spreadsheet automatically calculates the percentage response rate. The current course offering is indicated by a yes in the appropriate column.

  3. Visitors to the course home page see the leaderboard

    Every time someone visits the course home page, the data in the Google Spreadsheet is transformed into a table that ranks different course offerings based on the percentage response rate. This is done via a small bit of common Javascript that can be embedded into almost any web page (a rough sketch of this transformation appears after this list).

  4. Additional nudges are employed

    The visibility of the leaderboard may not be sufficient. Typically course examiners have used other means to nudge students to complete MyOpinion. Often using the leaderboard data to spark any ‘competitive nature’.

  5. Students complete the MyOpinion survey

    Thereby changing the response rate and starting the cycle all over again.
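
The embedded JavaScript does the ranking in the browser, but the transformation itself is simple. The following Python approximation is only a sketch: it assumes the spreadsheet has been exported as CSV with offering, enrolled, responses and current columns, which may not match the real column names.

    import csv

    def leaderboard(csv_path):
        """Rank course offerings by MyOpinion response rate, highest first."""
        with open(csv_path, newline="") as f:
            rows = list(csv.DictReader(f))
        for row in rows:
            # The spreadsheet calculates this automatically; recompute it here
            row["rate"] = 100.0 * int(row["responses"]) / int(row["enrolled"])
        return sorted(rows, key=lambda r: r["rate"], reverse=True)

    for rank, row in enumerate(leaderboard("myopinion.csv"), start=1):
        marker = " <-- this offering" if row["current"].strip().lower() == "yes" else ""
        print(f"{rank}. {row['offering']}: {row['rate']:.1f}%{marker}")

The ranked list is what students see on the course home page: the current offering sitting somewhere in the ranking is the nudge.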

Results

As the graphs below show, each of the courses that have trialled the leaderboard have:

  1. Increased response rates by at least 12-15%; and,
  2. Have achieved response rates around or more than double the institutional average.

    (Institutional average currently only available for 2015 and 2016)


What’s next?

  • Improve support for other types of leaderboard.

    A single USQ course may have 3 or 4 different groups of students based on campus. At least one staff member has experimented with using the leaderboard to rank response rates from different student groups within the current offering of the course.

  • Improve advice for using the leaderboard

    Current advice is accessible via the more information section below. There are a number of ways this could be improved.

  • Promote more broadly to academics

    Currently the leaderboard has largely been promoted within School of Teacher Education and Early Childhood.

  • Explore options to automate leaderboard updating

    In theory, if an API were available for MyOpinion response rates, there would be no need for manual updating of the Google spreadsheet (or perhaps for the Google spreadsheet at all).

  • Integrate a range of other communication strategies
  • Evaluate

More information

  • How to implement the leaderboard in your USQ course?
    • There are instructions that USQ course examiners can use to implement a MyOpinion leaderboard.
    • There is also a video (only visible to USQ staff) where Alice and I walk through the process of setting up the leaderboard in her course.
  • Background on the origins of the leaderboard

    This post provides some background on how and why the semi-automated leaderboard approach was created.

References

Bennett, T., & De Bellis, D. (2010). The Move to a System of Flexible Delivery Mode (Online v Paper) Unit of Study Student Evaluations at Flinders University. Management Issues and the Study of Initial Changes in Survey Volume, Response Rate and Response Level. Journal of Institutional Research, 15(1), 41–53.

Christy, K. R., & Fox, J. (2014). Leaderboards in a virtual classroom: A test of stereotype threat and social comparison explanations for women’s math performance. Computers & Education, 78, 66–77. https://doi.org/10.1016/j.compedu.2014.05.005

Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From Game Design Elements to Gamefulness: Defining “Gamification.” In Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments (pp. 9–15). New York, NY, USA: ACM. https://doi.org/10.1145/2181037.2181040

Dicheva, D., Dichev, C., Agre, G., & Angelova, G. (2015). Gamification in Education: A Systematic Mapping Study. Journal of Educational Technology & Society, 18(3), 75–88.

Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the Validity of Student Evaluation of Teaching: The State of the Art. Review of Educational Research, 83(4), 598–642. https://doi.org/10.3102/0034654313496870

Thaler, R., & Sunstein, C. (2008). Nudge: Improving decisions about health, wealth and happiness. New York: Penguin.

Breaking BAD to bridge the e-learning reality/rhetoric chasm

@damoclarky and I got a bit lucky. Our ASCILITE paper has been accepted with revisions. Apparently the first reviewer hated the “theoretical construct” we were using to make our argument. The following is what we originally wrote, sharing it here to hopefully spark some critique and improvement (and also not to entirely waste the writing when I gut it and start again).

Start with the problem and then the “construct”, both adapted from the paper.

Problem

In a newspaper article (Laxon, 2013) Professor Mark Brown makes the following comment on the quality of contemporary University e-learning

E-learning’s a bit like teenage sex. Everyone says they’re doing it but not many people really are and those that are doing it are doing it very poorly. (n.p).

E-learning – defined by the OECD (2005) as the use of information and communications technology (ICT) to support and enhance learning and teaching – has been around for so long that there have been numerous debates about replacing it with other phrases. Regardless of the term used there “has been a long-standing tendency in education for digital technologies to eventually fall short of the exaggerated expectations”(Selwyn, 2012, n.p.). Writing in the early 1990s Geoghagen (1994) seeks to understand why a three decade long “vision of a pedagogical utopia” (n.p.) promised by instructional technologies has failed to eventuate. Ten years on Salmon (2005) notes that e-learning within universities is still struggling to move beyond projects driven by innovators and engage a significant percentage of students and staff. Even more recently concerns remain about how much technology is being used to effectively enhance student learning (Kirkwood & Price, 2013). Given that “Australian universities have made very large investments in corporate educational technologies” (Holt et al., 2013, p. 388) it is increasingly important to understand and address the rhetoric/reality chasm around e-learning.

Not surprisingly the literature provides a variety of answers to this complex question. Weimer (2007) observes that academics come to the task of teaching with immense amounts of content knowledge, but little or no knowledge of teaching and learning, beyond perhaps their personal experience. A situation which may not change significantly given that academics are expected to engage equally in research and teaching and yet work towards promotion criteria that are perceived to primarily value achievements in research (Zellweger, 2005). It has been argued that the limitations of the Learning Management System (LMS) – the most common university e-learning tool – make the LMS less than suitable for more effective learner-centered approaches and is contributing to growing educator dissatisfaction (Rahman & Dron, 2012). It’s also been argued that the “limited digital fluency of lecturers and professors is a great challenge” (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3) for the creative leveraging of emerging technologies. Another contributing factor is likely to be Selwyn’s (2008) suggestion that educational technologists have failed to be cognisant of “the more critical analyses of technology that have come to the fore in other social science and humanities disciplines” (p. 83). Of particular interest here is the observation of Goodyear et al (2014) that the “influence of the physical setting (digital and material) on learning activity is often important, but is under-researched and under-theorised: it is often taken for granted” (p. 138).

Our argument is that the set of implicit assumptions that underpin the practice of institutional e-learning within universities (which we’ll summarise under the acronym SET) leads to a digital and material environment that contributes significantly to the reality/rhetoric chasm. While this mindset underpins how universities go about the task of institutional e-learning, they won’t be able to bridge the chasm.

Instead, we argue that another mindset needs to play a larger role in institutional practice. How much larger we don’t know. We’ll summarise this mindset under the acronym “BAD”. Yep, we think institutional e-learning needs to break BAD.

Breaking BAD versus SET in your ways

The following table contrasts the two frameworks and expands their acronyms. A slightly more detailed examination of the two frameworks follows.

Table 1: The BAD and SET frameworks for e-learning implementation
Component | BAD | SET
How work gets done | Bricolage – concrete problems are solved through creative recombination of existing resources | Strategy – a desired future state is identified; all resources required to achieve that state in the most efficient way are identified and provided
How ICT is perceived | Affordances – ICT is protean. It can be modified to enhance and transform current practice, and to make it easier for the users | Established – ICT is fixed and implemented vanilla. Processes change to fit and users are trained to use the provided functionality
How you see the world | Distributed – the world is complex, dynamic and unpredictable | Tree-like – the world is relatively stable and predictable. It can be understood through logical decomposition into a hierarchy

How work gets done

(this was originally titled “How stuff happens” but was probably what one reviewer described as “inappropriately colloquial”. Need a better label for this. The idea is that the organisation only recognises work of a particular type. It’s the only way it conceives of anything interesting/important happening. Not sure the following explains this well enough)

It would be an unusual contemporary Australian university that was not – at least proclaiming the rhetoric of – following a strategic approach to its operations. Numerous environmental challenges and influences have led to universities being treated as businesses with an increasing prevalence of managers using “strategic control and a focus on outputs which can be quantified and compared” (Reid, 2009, p. 575) to manage academic activities. In line with this has been the increasing strategic approach to learning and teaching. The requirement that Australian universities have institutional learning and teaching strategic plans publicly available on their websites prior to accessing a government learning and teaching fund (Inglis, 2007) is just one example of how university teaching has become an object of policy with the learning and teaching excellence necessarily including the specification of goals (Clegg & Smith, 2008). The perceived importance of strategic approaches to institutional e-learning is illustrated by Carter et al’s (2011) identifying the importance of ensuring “Technology alignment with goals of the organization” (p. 207). The strategic or planning-by-objectives (e.g. learning outcomes, graduate attributes) approach also underpins how course design is largely assumed to occur with Visscher-Voerman and Gustafson (2004) finding that it underpins “a majority of the instructional design models in the literature” (p. 77). These approaches to understanding “how stuff happens” are so ingrained that it is often forgotten that these ideas have not always existed (Kezar, 2001) and that there is an alternate perspective.

(An example comparing bricolage and engineering approaches might be useful, might actually be a better structure for this section)

An example of this alternate perspective can be found in the idea of bricolage or “the art of creating with what is at hand” (Scribner, 2005, p. 297). Bricolage involves the manipulation and creative repurposing of existing, and often unlikely, resources into new arrangements to solve a concrete problem. A bricoleur (someone who engages in bricolage) when faced with a project does not analyse what resources may be required to fulfill that project (a more strategic approach), instead they ask how the project can be achieved with the resources already available (Hatton, 1989). Hatton (1989) used bricolage to understand the work of teachers, though Scribner (2005) thinks somewhat negatively. In terms of developing strategic applications of ICT, Ciborra (1992) argues that the “capability of integrating unique ideas and practical design solutions at the end-user level” (p. 299) (bricolage) is more important than strategic approaches.

As argued by Jones et al (2005) there are risky extremes inherent in both the strategic and bricolage approaches to process. The suggestion here within the context of university e-learning is that it would be fruitful to explore a dynamic and flexible interplay between the strategic and bricolage approaches. The problem is that at the moment the strategic is crowding out the bricolage. As Groom and Lamb (2014) observe, the cost of supporting an enterprise learning tool (e.g. LMS) limits resources for user-driven innovation, in part because it draws “attention and users away” from the strategic tool. The demands of sustaining the large, complex and strategic tool dominate priorities and lead to “IT organizations…defined by what’s necessary rather than what’s possible” (Groom & Lamb, 2014, n.p). The established view of Information and Communication Technologies (ICT) in part arises from the predominance of the strategic view of how work happens.

How ICT is perceived: Affordances or Established

Widely accepted best practice within the IT industry is that large integrated systems – like an LMS – should be implemented in their “vanilla” form as modifying them is too expensive (Robey, Ross, & Boudreau, 2002). This way of perceiving ICTs assumes that the functionality provided by technology is established and cannot be changed. This perception of an LMS encourages the adoption of only those pedagogical designs that are supported by the existing LMS functionality and precludes the exploration of contextually specific learning designs (Jones, 2012). Perceiving and implementing the LMS as an established product simplifies and reduces the cost of training and support, but increases the difficulty of adoption as teaching staff attempt to use a standardised system to support hugely diverse disciplines, teaching philosophies and instructional styles (Black, Beck, Dawson, Jinks, & DiPietro, 2007). Perhaps in no small way the established view of ICT in e-learning contributes to Dede’s (2008) observation that “widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant” (p. 58). This perception of ICT challenges Kay’s (1984) discussion of the “protean nature of the computer” (p. 59) as “the first metamedium, and as such has degrees of freedom and expression never before encountered” (p. 59). However, this perception of ICT is closely linked with the techno-rational assumptions of the strategic view, an approach that is increasingly seen as a naïve view of ICT, technology and organisations.

(Remove some of the quotes and tell a better story).

Goodyear et al (2014) argue that in thinking about design for networked learning it is vital to acknowledge “the likelihood of slippage between the task as set and the actual activity” (p. 139). Hannon (2013) describes a case where “meso-level practitioners – teaching academics, learning technologists, and academic developers” (p. 175) undertake “hidden effort” (p. 175) to deal with the gaps between technology and pedagogy that arise from the application of centralised technologies. Rather than stick with the established functionality provided by an information system, increasingly technically literate users draw upon increasingly available technologies to develop systems that bridge the gaps between their needs and the established information system. While often seen as dangerous and inefficient, such systems can provide a resource of creativity and innovation that helps organisations survive in a competitive environment (Behrens, 2009). Such systems arise because ICT is not seen as established, but rather as one of a number of components of an emergent process of change where the outcomes are indeterminate because they are contingent on the specifics of the context and the situation (Markus & Robey, 1988). In particular, they arise due to an on-going process – not unlike bricolage – where users are exploring how the affordances of ICT can be leveraged to address concrete problems. The term affordances is used here as defined by Goodyear et al (2014): “not as pre-given, but as co-evolving, emergent and partly co-constitutive” (p. 142) and as a way of exploring how what is actually done with e-learning systems is “influenced by the qualities of the place in which they are working” (p. 137). Our view is that it is necessary for the implementation of e-learning systems to be perceived as an on-going and emergent exploration of the affordances that could be the most useful for the students and teachers within a given context. This echoes Johri’s (2011) observation that bricolage shifts focus away from the established “design of an artefact towards emergent design of technology-in-use, particularly by the users” (p. 212).

(that can certainly be improved upon)

How you see the world: Distributed or Tree-like

Techno-rational methods such as strategic planning and software development perceive (or at least act as if they perceive) the world as a hierarchy or as being tree-like. These methods use analysis and logical decomposition to reduce larger wholes into smaller more easily understood and manageable parts (Truex, Baskerville, & Travis, 2000). This approach is problematic because the isolation of components is largely imaginary and their separation leads to a loss of rich interdependencies between components (Truex et al., 2000). Enterprise systems are informed heavily by these tree-like conceptions and this is reflected in university e-learning environments and their poor fit with the heterarchical and self-organised potential of contemporary technologies and educational practices (Hannon, Ryberg, & Riddle, 2014). Goodyear et al (2014) argue “that the dominant images of the object of our research do not yet reflect the extent to which learning networks now consist of heterogenous assemblages of tasks, activities, people, roles, rules, places, tools, artefacts and other resources, distributed in complex configurations across time and space and involving digital, non-digital and hybrid entities” (p. 140). We suggest that the same applies to the dominant conceptions underpinning the implementation of institutional e-learning systems.

The limitations of tree-like models and a preference for distributed models are evident in a number of sources. Holt et al (2013) argue for the importance of distributed leadership in institutional e-learning due to the growing complexity of e-learning, meaning that no one leader at the top of a hierarchical tree has the knowledge to “possibly contend with the complexity of issues” (p. 389). The trend towards distribution is obviously evident in connectivism and its “thesis that knowledge is distributed across a network of connections and therefore learning consists of the ability to construct and traverse those networks” (Downes, 2011, n.p). Siemens’ (2008) list of some of the concepts from which connectivism arises – such as activity theory, distributed and embodied cognition, complexity and network theory – illustrates the breadth of this move to distributed understandings. The socio-material approaches to studying and understanding networked learning (and technology embedded practices more broadly) mentioned by both Hannon (2013) and Goodyear et al (2014) echo a distributed view and underpin the emergent view of technology mentioned in the previous section. It also links with the idea of bricolage: paying close attention to what occurs within the distributed network and responding to context-specific problems by experimenting with the affordances perceived by the components of a network/assemblage to reduce the chasm between rhetoric and reality.

Bringing the LMS into the network – Experiment # 1 – Activity completion

The following is the first step in an attempt to modify the Moodle Activity Viewer (or at least a local instance). I’d like a modified version of MAV to allow me to

  1. find out how students are progressing with activity completion.

    Rather than use clicks (as MAV does currently) to track student usage, use activity completion. I use this in EDC3100; however, activity completion isn't even turned on at the Moodle level at the other institution.

  2. Easily display additional information about students as a roll-over/popup for any links to a student profile page.

The following is an initial exploration of how MAV works and what changes I'll need to make to kludge it into working within the local constraints.

It starts with a description of how this is going to work, follows with some initial explorations of getting MAV to work within my browser, an exploration of how MAV actually does this, and some initial exploration of the changes I'll have to make. It finishes with some suggestions for the next step.

All as a local instance

The first constraint is that this is all being done as a local instance. I'll be the only one who can see it, and only when I'm using my laptop. It will work something like this

  • I have MAV installed on a version of the Firefox browser.
  • When I visit one of my institutional Moodle course sites MAV will recognise this and as a result will
    • Send a query to a web server running on my laptop asking for activity completion (and other) data for all, some or one student.
    • The web server on my laptop will query a database on my laptop that contains a copy of the activity completion data for my courses and send a reply back to Firefox/MAV.
    • On receipt of the reply Firefox/MAV will update the display of my course site to colour code the activities based on how many student(s) have completed the activity.

The reliance on my laptop and a local database is due to the difficulty of making connections to the institutional servers/data.
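To make that last step concrete, here's a minimal sketch of the kind of local service the laptop web server could provide. It assumes a local MySQL copy of the standard Moodle completion tables (mdl_course_modules, mdl_course_modules_completion); the database name and credentials are made up. It's an illustration of the idea, not MAV's actual server code.

[code lang="perl"]
#!/usr/bin/perl
# Hypothetical local endpoint: given a course id, return a JSON object mapping
# each activity (course module) id to the number of students who've completed it.
# Database name/credentials and the local copy of the Moodle tables are
# assumptions for illustration only.
use strict;
use warnings;
use CGI qw(:standard);
use DBI;
use JSON qw(encode_json);

my $course_id = param('courseid') or die "No courseid supplied";

my $dbh = DBI->connect( 'DBI:mysql:database=moodle_copy;host=localhost',
                        'mav_user', 'secret', { RaiseError => 1 } );

# Standard Moodle schema: completionstate > 0 means the student has completed
# the activity in some form (complete, complete-pass, complete-fail).
my $sth = $dbh->prepare(q{
    SELECT cm.id AS cmid, COUNT(cmc.userid) AS completed
      FROM mdl_course_modules cm
 LEFT JOIN mdl_course_modules_completion cmc
        ON cmc.coursemoduleid = cm.id AND cmc.completionstate > 0
     WHERE cm.course = ?
  GROUP BY cm.id
});
$sth->execute($course_id);

my %completion;
while ( my $row = $sth->fetchrow_hashref ) {
    $completion{ $row->{cmid} } = $row->{completed};
}

print header('application/json'), encode_json( \%completion );
[/code]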

From a "theoretical" perspective, this is part of our argument that the LMS is not a full-fledged member of networked learning. It's too hard to make new connections to the LMS to enable new learning. MAV and local databases are an attempt to make it easier to connect to the LMS and its large number of individual parts. The theory is that by making these connections easier, it becomes easier to innovate and to encourage the development of more interesting learning that gets more use.


MAV recognising institutional LMS page

First some documentation on MAV and how it works

The local MAV code is in /usr/local/www/mav and /usr/local/www/smarty

MAV has two separate servers it knows about

  • balmiServer – this will be my local laptop
  • Moodle Server – this is the institutional LMS

    POINT: Would be interesting to see if this could be multiple servers? e.g. when I want it to work on both my local Moodle server and the institutional one

These are set in ~/mav/gmdocs. Set it to http://usqstudydesk.usq.edu.au/m2

Go to this link http://localhost/fred/mav/gm/moodleActivityViewer.user.js and install the updated version of MAV

That seems to be working. Getting at least some information dumped into the console. Seems to be breaking on a call to balmi.getLoggedInUserIDNumber() — moodleActivityViewer.user.js 1199

Ahh, seems the USQ study desk has an extra breadcrumb in the list that breaks the code. Modify the code in balmi.user.js and all is good.

getMoodleLinks

balmi.user.js has a function getMoodleLinks that extracts all the Moodle type links from the page. This includes setting up some regular expressions to do the extraction.

Change: the RE needs to be updated for my institutional Moodle. There's also another RE replacement a little further down for the link name.
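The actual extraction is done in JavaScript inside balmi.user.js, but the kind of pattern that needs updating looks something like the following Perl sketch. The link is made up (using the m2 URL from above) and the code is illustrative only, not the real balmi code.

[code lang="perl"]
# Illustrative only: the sort of regular expression that pulls Moodle activity
# links out of a course page and builds the structure MAV sends to its server.
use strict;
use warnings;

my $html = '<a href="http://usqstudydesk.usq.edu.au/m2/mod/forum/view.php?id=12345">Forum</a>';

my %links;
while ( $html =~ m{href="[^"]*/mod/(\w+)/(view\.php\?id=\d+)"}g ) {
    # e.g. "/mod/forum/view.php?id=12345" => ["forum", "view.php?id=12345"]
    $links{"/mod/$1/$2"} = [ $1, $2 ];
}
[/code]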

How does MAV work?

Try to nut out the process MAV uses and identify what possible changes I’ll need to make for both the activity completion and also the student information idea.

The client runs the GreaseMonkey script ~/mav/gmdocs/moodleActivityView.user.js installed on Firefox. It starts off by calling:

  • balmi.getCoursePageLink – Will only run MAV if the page is a valid Moodle page.

    Looks for the Moodle breadcrumbs and extracts the course id.

    CHANGE: this is where I could hard code the detection of my courses and also do the translation between the course ID on the USQ Moodle server and the course ID on the Moodle server on my laptop.

    If the page is not what MAV is looking for (getCoursePageLink returns NULL), MAV exits.

  • moodleActivityView.user.js – does a range of set-up prep: adding the MAV interface etc.

    CHANGE: Some of these will need to change based on what I want to be able to do.

  • Adds the mavUpdatePage function as a listener for the page load event – i.e. this is what updates the page.
  • mavUpdatePage does some debug stuff and then calls
  • generateJSONRequest – generate the particular request to send to the MAV server in JSON
    • balmi.user.js – balmi.getCoursePageLink() – A duplicate call
    • balmi.user.js – balmi.getMoodleLinks

      gets all the links that are part of a Moodle course page. This is for the activity tracking.

      returns data of the form

      "/mod/forum/view.php?id=12345": ["forum", "view.php?id=12345"]

      CHANGE: For activity completion the aim here will be to return only the links for valid Moodle activities.

      CHANGE: For the user details option, looking at returning the links to user details.

    • Filters out a range of links that shouldn’t be included
    • calls requestData – actually makes the request
  • updatePage – takes the data returned from the MAV server and updates the links, either by increasing the font size or changing the background colour of the links.

    CHANGE: the activity completion will be closest to a version of the number of students. Rather than the number of students who clicked on the link, it will be the number of students who completed the activity.

    Has a loop that goes through all the links in the page. If a link matches something that's come back from the MAV server, then make the change.

The server is implemented using ~/mav/phpdocs/api/getActivity.php, which processes the request

  • decodes and logs the request
  • getCourseIdFromCourseHomePageLink – extracts the course id which is used to query the Moodle database
  • SQL to count # student in course

    CHANGE: Not needed for the student ID stuff.

  • Checks to see if the user wants # clicks or # students and whether just for an individual student, a group(s) or all.

    CHANGE: Again not needed for student details.

  • Calls generateSQLQuery (in ~/mav/lib) – just a wrapper around a fairly standard PHP template for dynamically generated SQL.

    The template is in ~/mav/lib/getActivityQueryTemplate.php. This uses a range of PHP code to generate the appropriate SQL query to extract the stats per link.

    CHANGE: the activity completion modifications could be implemented in here. Fairly similar to the existing approach, but using activity completion rather than the Moodle log tables.

  • Processes the query for each link, placing the results into a data structure
  • Constructs the JSON object to send back to the browser.

    CHANGE: This is where my kludge will have to translate the student and activity ids returned by the SQL into the values that are being used on the USQ Moodle server and are thus what the browser will find embedded in the HTML.
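    A rough, self-contained sketch of what that translation step might look like. The hard-coded hashes are dummy values standing in for the local results and the local-id-to-USQ-id mapping (which would come from the mapping discussed under the questions below).

    [code lang="perl"]
    # Hypothetical illustration of the reply-time id translation.
    use strict;
    use warnings;

    my %usq_id_for = ( 17 => 263678, 18 => 263679 );  # local cmid => USQ module id (dummy values)
    my %completed  = ( 17 => 42,     18 => 35     );  # local cmid => # students completed (dummy values)

    my %reply;
    for my $local_id ( keys %completed ) {
        my $usq_id = $usq_id_for{$local_id} or next;   # skip anything unmapped
        $reply{$usq_id} = $completed{$local_id};
    }
    # %reply is now keyed by the ids the browser will find in the USQ page's HTML,
    # ready to be encoded as JSON and sent back.
    [/code]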

Approach for changes

Separate clients and servers for the two approaches. Perhaps modify the existing one for activity completion, but still do this separately from the existing MAV stuff so I have a clean copy? Definitely have to put this under git.

Questions

  1. What’s the format for links to the student profile? Does it use the Moodle user id?

    Basically a link to the script user/view.php with the user’s id and the course id as parameters.

    [code lang="html"]
    <a href="~/user/view.php?id=USERID&course=COURSEID">Fred Nerf</a>
    [/code]

  2. How do you distinguish activity links from other links in Moodle?

    Looks like a list element with a class of activity is a good first start. If it in turn contains a span of class autocompletion that’s another good sign.

    On top of that, with activity completion you're only looking for stuff within the course-content div or, below that, the weeks unordered list.

    [code lang="html"]
    <li class="activity book modtype_book " id="module-263678">
    <div>
    <div class="mod-indent-outer"><div class="mod-indent"></div>
    <div>
    <div class="activityinstance">
    <a class="" onclick="" href="..mod/book/view.php?id=263678"><img src="" class="iconlarge activityicon" alt=" " role="presentation" />
    <span class="instancename">Setting up your tools: Diigo, a blog and Twitter<span class="accesshide " > Book</span></span>
    </a>
    </div>
    <span class="actions">
    <span class="autocompletion"><img title="Completed: Setting up your tools: Diigo, a blog and Twitter" alt="Completed: Setting up your tools: Diigo, a blog and Twitter" class="smallicon" src="" /></span>
    </span>
    </div>
    </div>
    </div>
    </li>
    [/code]

  3. How am I going to map the USQ Moodle activity and user ids to the ids used on my local server?

    A simple script to parse the HTML file for the course home page should be able to extract the ids for each of the activities on the USQ server and also the associated names. The above HTML shows that the id is in the id attribute of the list element. I already have the names of the activities with a hard-coded sequential id in the local database, so the mapping can be done that way.
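    A rough sketch of what that script might look like, based only on the HTML structure shown above. The file name and the regular expression are assumptions; a proper HTML parser would be more robust.

    [code lang="perl"]
    #!/usr/bin/perl
    # Sketch only: extract USQ activity ids and names from a saved copy of the
    # course home page, using the structure visible in the HTML above.
    use strict;
    use warnings;

    open my $fh, '<', 'course_home_page.html' or die "Can't open page: $!";
    my $html = do { local $/; <$fh> };

    my %usq_activities;    # USQ module id => activity name
    while ( $html =~ m{<li[^>]*class="activity[^"]*"[^>]*id="module-(\d+)".*?
                       <span[^>]*class="instancename">([^<]+)}gsx ) {
        my ( $id, $name ) = ( $1, $2 );
        $name =~ s/\s+$//;     # tidy trailing whitespace
        $usq_activities{$id} = $name;
    }

    # The names can then be matched against the activity names already in the
    # local database to build the USQ id <=> local id mapping.
    printf "%s => %s\n", $_, $usq_activities{$_}
        for sort { $a <=> $b } keys %usq_activities;
    [/code]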

To do

Misc tasks to do

  • Think through how this kludge is going to be done. Likely possibilities include
    1. Separate javascript plugins and servers for the activity completion and the user details.
    2. Modify the existing plugins and servers to handle the additional requests.
    3. Integrate activity completion into the existing MAV, but have user details separate

      Mainly because activity completion is largely the same as the existing display work that MAV does.

  • User detail display
    • Investigate what's the best way to pass the data back to the browser – just data with the HTML generated by the browser, or HTML generated by the server and simply inserted by the browser.
    • Chat with Rolley and see whether the rollover/popup idea can be implemented with the same HTML stuff used by the rest of MAV.
  • Extract the activity id data from the USQ server.
  • Find out if there is a report Moodle can generate that lists all the users in a course, so I can extract user ids from the USQ Moodle server to create a mapping to the local user ids.

    The activity participation report will generate a list of all students with a link that includes user id and their name.

  • Get the MAV code base into git.
  • Implementing the separate user details version of MAV might be the first major change to do.

From thinking to tinkering: The grassroots of strategic information systems

What follows is a long overdue summary of Ciborra (1992). I think it has a lot of insight to offer on how universities implement e-learning. The abstract for Ciborra (1992) is

When building a Strategic Information System (SIS), it may not be economically sound for a firm to be an innovator through the strategic deployment of information technology. The decreasing costs of the technology and the power of imitation may quickly curtail any competitive advantage acquired through an SIS. On the other hand, the iron law of market competition prescribes that those who do not imitate superior solutions are driven out of business. This means that any successful SIS becomes a competitive necessity for every player in the industry. Tapping standard models of strategy analysis and data sources for industry analysis will lead to similar systems and enhance, rather than decrease, imitation. How then should "true" SISs be developed? In order to avoid easy imitation, they should emerge from the grass roots of the organization, out of end-user hacking, computing, and tinkering. In this way the innovative SIS is going to be highly entrenched with the specific culture of the firm. Top management needs to appreciate local fluctuations in practices as a repository of unique innovations and commit adequate resources to their development, even if they fly in the face of traditional approaches. Rather than looking for standard models in the business strategy literature, SISs should be looked for in the theory and practice of organizational learning and innovation, both incremental and radical.

My final thoughts

The connection with e-learning

Learning and teaching is the core business of a university. For the 20+ years I've worked in Australian Higher Education there have been calls for universities to become more distinct. It would then seem logical that the information systems used to support, enhance and transform (as if there are many that do that) learning and teaching (I'll use e-learning systems in the following) should be seen as Strategic Information Systems.

Since the late 1990s the implementation of e-learning systems has been strongly influenced by traditional approaches to strategic and operational management. The adoption of ERP systems was in no small way a major contributor to this. This recent article (HT: @katemfd) shows the lengths to which universities are going when they select an LMS (sadly, for many e-learning == LMS).

I wonder how much of the process is seen as being for strategic advantage. Part, or perhaps all, of Ciborra's argument for tinkering is on the basis of generating strategic advantage. The question remains whether universities see e-learning as a source of strategic advantage (anymore). Perhaps they don't see selection of the LMS as a strategic advantage, but given the lemming-like rush toward "we have to have a MOOC" by many VCs, it would seem that technology enhanced learning (apologies to @sthcrft) is still seen as a potential "disruptor"/strategic advantage.

For me this approach embodies the rational analytic theme of strategy that Ciborra critiques. The tinkering approach is what is missing from university e-learning and its absence is (IMHO) the reason much of it is less than stellar.

Ciborra argues that strategic advantage comes from systems where development is treated as an innovation process. Where innovation is defined as creating new knowledge “about resources, goals, tasks, markets, products and processes” (p. 304). To me this is the same as saying to treat the development of these systems as a learning process. Perhaps more appropriately a constructionist learning process. Not only does such a process provide institutional strategic advantage, it should improve the quality of e-learning.

The current rhetoric/reality gap in e-learning arises not only from an absence of tinkering and bricolage, but from their active prevention and rooting out. An absence of learning.

The deficit model problem

Underpinning Ciborra’s approach is that the existing skills and competencies within an organisation provide both the source and the constraint on innovation/learning.

A problem with university e-learning is the deficit model of most existing staff. i.e. most senior management, central L&T and middle managers (e.g. ADL&T) have a deficit model of academic staff. They aren't good enough. They don't know enough. They have to complete a formal teaching qualification before they can be effective teachers. We have to nail down systems so they don't do anything different.

Consequently, existing skills and competencies are only seen as a constraint on innovation/learning. They are never seen as a source.

Ironically, the same problem arises in the view of students held by the teaching academics that are disparaged by central L&T etc.

The difficulties

The very notion of something being “unanalyzable” would be very difficult for many involved in University management and information technology to accept. Let alone deciding to use it as a foundation for the design of systems.

Summary of the paper

Introduction

Traditional approaches for designing information systems are based on "a set of guidelines" about how best to use IT in a competitive environment and "a planning and implementation strategy" (p. 297).

However, the "wealth of 'how to build an SIS' recipes" during the 1990s failed to "yield a commensurate number of successful cases", at least not when measured against the rise of systems in the 1980s. Reviewing the literature suggests a number of reasons, including

  • Theoretical literature emphasises rational assessment by top management as the means for strategy formulation, ignoring alternative conceptions from the innovation literature that value learning more than thinking and see experimentation as a means for revealing new directions.
  • Examining precedent-setting SISs suggests that serendipity, reinvention and other factors were important in their creation. These are missing from the rational approach.

So there are empirical and theoretical grounds for a new kind of guidelines for SIS design.

Organisations should ask

  1. Does it pay to be innovative?
  2. Are SISs offering competitive advantage or are they a competitive necessity?
  3. How can a firm implement systems that are not easily copied and thus generate returns?

In terms of e-learning this applies

the paradox of micro-economics: competition tends to force standardization of solutions and equalization of production and coordination costs among participants.

i.e. the pressures to standardise.

The argument is that an SIS must be based on new practical and conceptual foundations

  • Basing an SIS on something that can't be analysed, like organisational culture, will help avoid easy imitation. Leveraging the unique sources of practice and know-how at the firm and industry level can be the source of sustained advantage.
  • SIS development should be closer to prototyping and engaging with end-users’ ingenuity than has been realised.

    The capability of integrating unique ideas and practical design solutions at the end-user level turns out to be more important than the adoption of structured approaches to systems development or industry analysis (Schoen 1979; Ciborra and Lanzara, 1990)

Questionable advantage

During the 1980s a range of early strategic information systems (SISs) – think old-style airline reservation systems – arose, bringing benefits to the organisations that adopted them and bankruptcy to some that didn't. This gave rise to a range of frameworks for identifying SISs.

I’m guessing some of these contributed to the rise of ERP systems.

But the history of those cited success stories suggests that SISs only provide an ephemeral advantage before being copied. One study suggests 92% of systems followed industry-wide trends. Only three were original.

I imagine the percentage in university e-learning would be significantly higher. i.e. you can’t get fired if you implement an LMS (or an eportfolio).

To avoid the imitation problem there are suggestions to figure out the lead time for competitors to copy. But that doesn't avoid the problem, especially given the rise of consultants and services to help overcome it.

After all, if every university can throw millions of dollars at Accenture etc they’ll all end up with the same crappy systems.

Shifts in model of strategic thinking and competition

This is where the traditional approaches to strategy formulation get questioned.

i.e. “management should first engage in a purely cognitive process” that involves

  1. appraise the environment (e.g. SWOT analysis)
  2. identify success factors/distinctive competencies
  3. translate those into a range of competitive strategy alternatives
  4. select the optimal strategy
  5. plan it in sufficient detail
  6. implement

At this stage I would add “fail to respond to how much the requirements have changed” and start over again as you employ new senior leadership

This model is seen in most SIS models.

Suggests that in reality actual strategy formulation involves incrementalism, muddling through, myopic and evolutionary decision making. “Structures tend to influence strategy formulation before they can be impacted by the new vision” (p. 300)

References Mintzberg (1990) to question this school of thought in 3 ways

  1. Assumes that the environment is highly predictable and events unfold in predicted sequences, when in fact implementation surprises happen, resulting in a clash between inflexible plans and the need for revision.
  2. Assumes that the strategist is an objective decision maker not influenced by “frames of reference, cultural biases, or ingrained, routinized ways of action” (p. 301). Contrary to a raft of research.
  3. Strategy is seen as an intentional design process rather than as learning “the continuous acquisition of knowledge in various forms”. Quotes a range of folk to argue that strategy must be based on effective adaptation and learning involving both “incremental, trial-and-error learning, and radical second-order learning” (p. 301)

The models of competition implicit in SIS frameworks tend to rely on theories of business strategy from industrial organisation economics. i.e. returns are determined by industry structure. To generate advantage a firm must change the structural characteristics by “creating barriers to entry, product differentiation, links with suppliers” (p. 301).

There are alternative models

  • Chamberlin’s (1933) theory of monopolistic competition

    Firms are heterogeneous and compete on resource and asset differences – “technical know-how, reputation, ability for teamwork, organisational culture and skills, and other ‘invisible assets’ (Itami, 1987)” (p. 301)

    Differences enable high return strategies. You compete by cultivating unique strengths and capabilities and defending against imitation.

  • Schumpeter’s take based on innovation in product, market or technology

    Innovation arises from creative destruction, not strategic planning. The ability to guess, learn and luck appear to be the competitive factors.

Links these with Mintzberg's critique of rational analytic approaches and identifies two themes in business strategy

  1. Rational analytic

    Formulate strategy in advance based on industry analysis. Plan and then implement. Gains advantage relative to firms in the same industry structure.

  2. Tinkering (my use of the phrase)

    Strategy difficult to plan before the fact. Advantage arises from exploiting unique characteristics of the firm and unleashing its innovating capabilities

Reconsidering the empirical evidence

Turns to an examination of four well-known SISs based on the two themes and other considerations from above. These "cases emphasize the discrepancy between ideal plans for an SIS and the realities of implementation" (p. 302). i.e.

The system was developed by one of the business units; it was not developed according to a company-wide strategic plan. Rather, it was the outcome of an evolutionary, piecemeal process that included the ingenious tactical use of systems already available.

i.e. bricolage and even more revealing

the conventional MIS unit was responsible not only for initial neglect of the new strategic applications within McKesson, but also, subsequently, for the slow pace of company-wide learning about McKesson’s new information systems

Another system “was supposed to address an internal inefficiency” (p. 303) not some grand strategic goal.

And further

The most frequently cited SIS successes of the 1980s, then, tell the same story. Innovative SISs are not fully designed top-down or introduced in one shot; rather, they are tried out through prototyping and tinkering. In contrast, strategy formulation and design take place in pre-existing cognitive frames and organizational contexts that usually prevent designers and sponsors from seeing and exploiting the potential for innovation. (p. 303)

New foundations for SIS design

SIS development must be treated as an innovation process. The skills/competencies in an organisation are both a source of and a constraint on innovation. The aim is to create knowledge.

New knowledge can be created in two non-exclusive ways

  1. Tinkering.

    Rely on local information and routine behaviour (learning by doing, incremental decision making and muddling through).

    Accessing more diverse and distant information, when an adequate level of competence is not present, would instead lead to errors and further divergence from optimal performance (Heiner, 1983) (p. 304)

    People close to the operational level have to be able to tinker to solve new problems. “local cues from a situation are trusted and exploited in a somewhat unreflective way, aiming at ad hoc solutions by heuristics rather than high theory”

    The value of this approach is to keep development of an SIS close to the competencies of the organisation and ongoing fluctuations.

  2. Radical learning

    "entails restructuring the cognitive and organisational backgrounds that give meaning to the practices, routines and skills at hand" (p. 304). It requires more than analysis and requirements specifications. Aims at restructuring the context of both business policy and systems development. Requires "intervening in situations and designing-in-action".

    The change in context allows new ways of looking at the capabilities and devising new strategies. The sheer difference becomes difficult to imitate.

SIS planning by oxymorons

Time to translate those theoretical observations into practical guidelines.

Argues that the way to develop an SIS is to proceed by oxymoron, fusing "opposites in practice and being exposed to the mismatches that [are] bound to occur" (p. 305). Defines 7:

  • 4 to bolster incremental learning
    1. Value bricolage strategically
    2. Design tinkering

      This is important

      Activities, settings, and systems have to be arranged so that invention and prototyping by end-users can flourish, together with open experimentation (p. 305)

      Set up the organisation to favour local innovation, e.g. ad hoc project teams, ethnographic studies.

    3. Establish systematic serendipity

      Open experimentation results in largely incomplete designs, the constant intermingling of implementation and refinement, concurrent or simultaneous conception and execution – NOT sequential

      An ideal context for serendipity to emerge and lead to unexpected solutions.

    4. Thrive on gradual breakthroughs.

      In a fluctuating environment the ideas that arise are likely to include those that don’t align with established organisational routines. The raw material for innovation. “management should appreciate and learn about such emerging practices”

  • Radical learning and innovation
    1. Practice unskilled learning

      Radically innovative approaches may be seen as incompetent when judged by old routines and norms. Management should value this behaviour as an attempt to unlearn old ways of thinking and doing. It’s where new perspectives arise.

    2. Strive for failure

      Going for excellence suggests doing better what you already do, which generates routinized and efficient systems (the competency trap). Creative reflection over failures can suggest paths to novel ideas and designs, and also help with the recognition of discontinuities and flex points.

    3. Achieve collaborative inimitability

      Don’t be afraid to collaborate with competitors. Expose the org to new cultures and ideas.

These seven oxymorons can represent a new “systematic” approach for the establishment of an organizational environment where new information—and thus new systems can be generated. Precisely because they are paradoxical, they can unfreeze existing routines, cognitive frames and behaviors; they favor learning over monitoring and innovation. (p. 306)

References

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297–309.

Needed updates to cc_attrib.pl

The following is a list of updates I need to make to a perl script I wrote last year that helps me properly attribute the Creative Commons licenced Flickr photos I use in presentations. This list arises from preparing the welcome video for this year's course. Most, if not all, of the updates are to make the script easier to use, prevent the chance of "spam"-like behaviour and deal with apparent reliability issues with the Flickr API.

Parse the slides file – ignore comments

As I discover images I want to use in a presentation, I maintain a text file with the details as follows
[code lang="perl"]
1,http://www.flickr.com/photos/rameshng/5930493923/ # Welcome picture Welcome.jpg
2,http://www.flickr.com/photos/rameshng/5930493923/ # Welcome picture Welcome.jpg
[/code]

The script doesn’t parse this file yet. Also, the “comments” approach is a new thing and appears useful for tracking. The script should ignore those.
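Something along these lines might do it. This is a sketch only; the file name and exactly what gets kept from each line are assumptions.

[code lang="perl"]
# Sketch: read the slides file, strip the trailing "# ..." comments and keep
# the slide number and Flickr URL. The file name is an assumption.
use strict;
use warnings;

open my $fh, '<', 'slides.txt' or die "Can't open slides file: $!";
my @slides;
while ( my $line = <$fh> ) {
    chomp $line;
    $line =~ s/\s*#.*$//;          # ignore the comment
    next unless $line =~ /\S/;     # skip anything left empty
    my ( $slide_num, $url ) = split /\s*,\s*/, $line, 2;
    push @slides, { slide => $slide_num, url => $url };
}
[/code]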

Track successful comments

One of the main tasks of the script is to post an acknowledgement comment to the Flickr page for an image. This morning the Flickr API would successfully post these comments to some pages, and not others. This meant a manual check to see which worked and which didn't, removing those that did work from the script, and trying again. Had to do this 3 times.

Would be useful if the script tracked which comments were successfully made and didn’t try to make another comment on those. Don’t want to start spamming.
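One possible approach is a small persistent record of the photo pages that have already been commented on, checked before each attempt and only updated when the Flickr API call succeeds. In the sketch below, post_comment_to_flickr() is a placeholder for whatever the script currently uses to talk to the API.

[code lang="perl"]
# Sketch: remember successful comments in a tied DBM file so re-runs don't
# comment on the same Flickr page twice. post_comment_to_flickr() is a
# placeholder for the script's existing Flickr API call.
use strict;
use warnings;
use DB_File;

tie my %commented, 'DB_File', 'commented.db'
    or die "Can't open commented.db: $!";

sub attribute_photo {
    my ($photo_url) = @_;
    return if $commented{$photo_url};              # already done on an earlier run
    my $ok = post_comment_to_flickr($photo_url);   # placeholder for the real call
    $commented{$photo_url} = time if $ok;          # only record on success
}
[/code]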

Track all images used

Following on from that, I’m wondering whether the script should track all images ever run through the script. There’s a good chance I might use an image in more than one presentation associated with a course, not sure I’d want to make the same comment again. Perhaps I should. If I used the image in a research presentation – very different from the course – perhaps I should make a new comment.
