Assembling the heterogeneous elements for (digital) learning

Month: May 2014

Making BIM ready for Moodle 2.6

The very nice folk from my institution’s ICT group warned me back in March that

I have started work on the moodle 2.6 upgrade that will be happening midyear and have come across some deprecation warning from BIM. Just giving you plenty of notice that an updated version will be needed before release.

That was just as my first use of BIM on the institution’s servers was getting underway. That’s gone reasonably well and it will be continuing (and hopefully expanding as I learn more about what’s required and possible with the approach) next semester, so I better get BIM playing nicely with 2.6. That’s what this post is reporting on.

BIM for Moodle 2.6 (and also 2.5) is available from the BIM plugin database entry and also from GitHub.

Get Moodle 2.6 running

Let’s get the latest version of Moodle 2.6 – 2.6.3 – and install that.

So that’s the first change: a PHP setting for caching. Not that I’ll need that for testing. Looks like I can ignore it for now.

Get BIM installed

I’m doing this so irregularly now it’s good that I actually documented this last time.

That all appears to be working. Ahh, but I haven’t turned the debugging all the way up to annoying yet.

That’s better

get_context_instance() is deprecated, please use context_xxxx::instance() instead.

And at about this stage it was always going to be time to….

Check the Moodle 2.6 release notes

I checked the Moodle 2.6 release notes and then the developer notes. Nothing particularly related to this warning.

Do it manually

As outlined in this message, this particular usage has been deprecated for a few versions. deprecatedlib.php suggests it will be removed in 2.8.

So the changes I’m making look like this
[code language="php"]
#$context = get_context_instance( CONTEXT_COURSE, $course->id );
$context = context_course::instance( $course->id );
[/code]

I can see this is needed in the following

  • ./coordinator/allocate_markers.php
  • ./coordinator/find_student.php
  • ./index.php (done?)
  • ./lib/groups.php
  • ./lib/locallib.php
  • ./marker/view.php
  • ./view.php – this one had actually been done earlier
    #$context = get_context_instance( CONTEXT_MODULE, $cm->id );
    $context = context_module::instance( $cm->id );

That all seems to be working.

Do a big test

Will back up a large BIM activity with a temp course from my Moodle 2.5 instance and restore it under Moodle 2.6.

Some more issues

print_container() is deprecated. Please use $OUTPUT->container() instead. Done
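
For the record, that change has this general shape (a minimal sketch rather than BIM’s actual code; $html stands in for whatever content was being printed):
[code language="php"]
# Before: print_container() echoed the wrapped content directly.
#print_container( $html );
# After: $OUTPUT->container() returns the HTML string, so echo it.
echo $OUTPUT->container( $html );
[/code]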

Does my course suffer from semester droop?

The institutional LMS seems to be having some problems, so I’ll post this instead.

Quite a few folk I work with have made observations about semester droop, i.e. attendance at lectures/tutorials dropping off as the semester progresses. @damoclarky and @beerc confirmed that the same droop can be seen in the longitudinal LMS data they have access to across most courses.

So the question I wanted to explore was

Does the course site from the S2, 2013 offering of my course show evidence of semester droop?

The quick answer is “Yes, a bit”. But it’s not huge or entirely unexpected. Might be interesting to explore more in other courses and especially find out what’s behind it.

Why do this?

I thought this would be interesting because

  1. I have a little tool that allows me to view usage of the site very easily.

    If it were harder, I probably wouldn’t have done it.

  2. The S2, 2013 offering is entirely online, with no on-campus students, so the course site is the main form of official interaction.
  3. Part of the final result (no more than 3%) comes from completing the sequence of weekly activities on the course site.
  4. I’ve tried to design these activities so that they explicitly link with the assessment and Professional Experience (the course is for pre-service teachers, who teach in schools for 3 weeks during the semester).

How?

Created two views of the S2, 2013 EDC3100 course site using MAV

  1. Clicks; and,

    Shows the entire course site with the addition of a heat map that shows the number of times students have clicked on each link.

  2. Students.

    The same image, but rather than clicks the heat map shows the number of students that clicked on each link.

Findings

  1. Students – there is some drop off.

    91 students completed all the assessment. 9 did not.

    97 students is the largest number of students clicking on any link. This is limited to the Assessment link and a couple of links in the first week. Where did the other two go?

    The activities in the last week range from 48 students clicking on a link up to 83 students.

    So definite drop off with some students not completing the activities in the last few weeks.

  2. Clicks.

    The Assessment link had the most clicks – 1559.

    The “register your blog” link had 1211 clicks. This is where students registered and looked for other students’ blog addresses. The blog contributed to the final result.

    The discussion forums for Q&A and Assignment 3 attracted 977 and 949 clicks respectively.

    Activities in the first week ranged from 177 clicks up to 352, indicating that many students started these more than once.

    Activities in the last week ranged from 83 to 146 clicks. The activity with 146 clicks was titled “Pragmatic assignment 3 advice”.

    Definite drop off. The most popular activity in the last week got fewer clicks than the least popular activity from week 1.

Reasons?

@palbion made the point that students are pragmatic and do what they think they need. It appears the EDC3100 design addresses this somewhat, in that students tend to stick with the activities for as long as they need them.

However, by the last week the students have the results from two assignments that make up 59% of their assessment. I wonder if the small percentage associated with completing study desk activities, combined with knowing their likely mark, results in them making a pragmatic decision to stop? That is one potential explanation for the drop off in the last week.

The other is that they are probably busy with other assignments and courses they need to catch up on after being on Professional Experience.

@beerc has made the suggestion that perhaps by the end of semester the students are more confident with the value of the course site and how to use it. They’ve had the full semester to become familiar, hence fewer clicks searching around to make sure everything is checked.

Of course, asking them would be the only way to find out.

Thoughts?

From thinking to tinkering: The grassroots of strategic information systems

What follows is a long overdue summary of Ciborra (1992). I think it holds a lot of insight for how universities implement e-learning. The abstract for Ciborra (1992) is

When building a Strategic Information System (SIS), it may not be economically sound for a firm to be an innovator through the strategic deployment of information technology. The decreasing costs of the technology and the power of imitation may quickly curtail any competitive advantage acquired through an SIS. On the other hand, the iron law of market competition prescribes that those who do not imitate superior solutions are driven out of business. This means that any successful SIS becomes a competitive necessity for every player in the industry. Tapping standard models of strategy analysis and data sources for industry analysis will lead to similar systems and enhance, rather than decrease, imitation. How then should “true” SISs be developed? In order to avoid easy imitation, they should emerge from the grass roots of the organization, out of end-user hacking, computing, and tinkering. In this way the innovative SIS is going to be highly entrenched with the specific culture of the firm. Top management needs to appreciate local fluctuations in practices as a repository of unique innovations and commit adequate resources to their development, even if they fly in the face of traditional approaches. Rather than looking for standard models in the business strategy literature, SISs should be looked for in the theory and practice of organizational learning and innovation, both incremental and radical.

My final thoughts

The connection with e-learning

Learning and teaching is the core business of a university. For the 20+ years I’ve worked in Australian Higher Education there have been calls for universities to become more distinct. It would then seem logical that the information systems used to support, enhance and transform (as if there are many that do that) learning and teaching (I’ll use “e-learning systems” in the following) should be seen as Strategic Information Systems.

Since the late 1990s the implementation of e-learning systems has been strongly influenced by traditional approaches to strategic and operational management. The adoption of ERP systems was in no small way a major contributor to this. This recent article (HT: @katemfd) shows the lengths to which universities go when they select an LMS (sadly, for many e-learning == LMS).

I wonder how much of that process is seen as being for strategic advantage. Part, or perhaps all, of Ciborra’s argument for tinkering rests on generating strategic advantage. The question remains whether universities see e-learning as a source of strategic advantage (anymore). Perhaps they don’t see selection of the LMS as a strategic advantage, but given the lemming-like rush toward “we have to have a MOOC” of many VCs, it would seem that technology enhanced learning (apologies to @sthcrft) is still seen as a potential “disruptor”/strategic advantage.

For me this approach embodies the rational analytic theme to strategy that Ciborra critiques. The tinkering approach is what is missing from university e-learning and its absence is (IMHO) the reason much of it is less than stellar.

Ciborra argues that strategic advantage comes from systems where development is treated as an innovation process. Where innovation is defined as creating new knowledge “about resources, goals, tasks, markets, products and processes” (p. 304). To me this is the same as saying to treat the development of these systems as a learning process. Perhaps more appropriately a constructionist learning process. Not only does such a process provide institutional strategic advantage, it should improve the quality of e-learning.

The current rhetoric/reality gap in e-learning arises not only from an absence of tinkering and bricolage, but from their active prevention and rooting out. An absence of learning.

The deficit model problem

Underpinning Ciborra’s approach is that the existing skills and competencies within an organisation provide both the source and the constraint on innovation/learning.

A problem with university e-learning is the deficit model of most existing staff. i.e. most senior management, central L&T and middle managers (e.g. ADL&T) have a deficit model of academic staff. They aren’t good enough. They don’t know enough. They have to complete a formal teaching qualification before they can be effective teachers. We have to nail down systems so they don’t do anything different.

Consequently, existing skills and competencies are only seen as a constraint on innovation/learning. They are never seen as a source.

Ironically, the same problem arises in the view of students held by the teaching academics that are disparaged by central L&T etc.

The difficulties

The very notion of something being “unanalyzable” would be very difficult for many involved in University management and information technology to accept. Let alone deciding to use it as a foundation for the design of systems.

Summary of the paper

Introduction

Traditional approaches for designing information systems are based on “a set of guidelines” about how best to use IT in a competitive environment and “a planning and implementation strategy” (p. 297).

However, the “wealth of ‘how to build an SIS’ recipes” during the 1990s failed to “yield a commensurate number of successful cases” at least not measured against the rise of systems in the 1980s. Reviewing the literature suggests a number of reasons, including

  • The theoretical literature emphasises rational assessment by top management as the means of strategy formulation, ignoring alternative conceptions from the innovation literature that value learning more than thinking, and experimentation as a means of revealing new directions.
  • Examining precedent-setting SISs suggests that serendipity, reinvention and other factors were important in their creation. These are missing from the rational approach.

So there are empirical and theoretical grounds for a new kind of guidelines for SIS design.

Organisations should ask

  1. Does it pay to be innovative?
  2. Are SISs offering competitive advantage or are they competitive necessity?
  3. How can a firm implement systems that are not easily copied and thus generate returns?

In terms of e-learning this applies

the paradox of micro-economics: competition tends to force standardization of solutions and equalization of production and coordination costs among participants.

i.e. the pressures to standardise.

The argument is that an SIS must be based on new practical and conceptual foundations

  • Basing an SIS on something that can’t be analysed, like organisational culture, will help avoid easy imitation. Leveraging the unique sources of practice and know-how at the firm and industry level can be the source of sustained advantage.
  • SIS development should be closer to prototyping and engaging with end-users’ ingenuity than has been realised.

    The capability of integrating unique ideas and practical design solutions at the end-user level turns out to be more important than the adoption of structured approaches to systems development or industry analysis (Schoen 1979; Ciborra and Lanzara, 1990)

Questionable advantage

During the 1980s a range of early adopters of strategic information systems (SISs) – think old-style airline reservation systems – emerged, bringing benefit to some organisations and bankruptcy to those that didn’t adopt. This gave rise to a range of frameworks for identifying SISs.

I’m guessing some of these contributed to the rise of ERP systems.

But the history of those cited success stories suggests that an SIS only provides an ephemeral advantage before being copied. One study suggests 92% of systems followed industry-wide trends. Only three were original.

I imagine the percentage in university e-learning would be significantly higher. i.e. you can’t get fired if you implement an LMS (or an eportfolio).

To avoid the imitation problem there are suggestions to figure out the lead time for competitors to copy. But that doesn’t avoid the problem, especially given the rise of consultants and services to help overcome it.

After all, if every university can throw millions of dollars at Accenture etc they’ll all end up with the same crappy systems.

Shifts in model of strategic thinking and competition

This is where the traditional approaches to strategy formulation get questioned.

i.e. “management should first engage in a purely cognitive process” that involves

  1. appraise the environment (e.g. SWOT analysis)
  2. identify success factors/distinctive competencies
  3. translate those into a range of competitive strategy alternatives
  4. select the optimal strategy
  5. plan it in sufficient detail
  6. implement

At this stage I would add “fail to respond to how much the requirements have changed” and start over again as you employ new senior leadership.

This model is seen in most SIS models.

Suggests that in reality actual strategy formulation involves incrementalism, muddling through, myopic and evolutionary decision making. “Structures tend to influence strategy formulation before they can be impacted by the new vision” (p. 300)

References Mintzberg (1990) to question this school of thought in 3 ways

  1. Assumes that the environment is highly predictable and events unfold in predicted sequences, when in fact implementation surprises happen, resulting in a clash between inflexible plans and the need for revision.
  2. Assumes that the strategist is an objective decision maker not influenced by “frames of reference, cultural biases, or ingrained, routinized ways of action” (p. 301). Contrary to a raft of research.
  3. Strategy is seen as an intentional design process rather than as learning “the continuous acquisition of knowledge in various forms”. Quotes a range of folk to argue that strategy must be based on effective adaptation and learning involving both “incremental, trial-and-error learning, and radical second-order learning” (p. 301)

The models of competition implicit in SIS frameworks tend to rely on theories of business strategy from industrial organisation economics. i.e. returns are determined by industry structure. To generate advantage a firm must change the structural characteristics by “creating barriers to entry, product differentiation, links with suppliers” (p. 301).

There are alternative models

  • Chamberlin’s (1933) theory of monopolistic competition

    Firms are heterogeneous and compete on resource and asset differences – “technical know-how, reputation, ability for teamwork, organisational culture and skills, and other ‘invisible assets’ (Itami, 1987)” (p. 301)

    Differences enable high return strategies. You compete by cultivating unique strengths and capabilities and defending against imitation.

  • Schumpeter’s take based on innovation in product, market or technology

    Innovation arises from creative destruction, not strategic planning. The ability to guess, learn and luck appear to be the competitive factors.

Links these with Mintzberg’s critique of rational analytic approaches and identifies two themes in business strategy

  1. Rational analytic

    Formulate strategy in advance based on industry analysis. Plan and then implement. Gains advantage relative to firms in the same industry structure.

  2. Tinkering (my use of the phrase)

    Strategy difficult to plan before the fact. Advantage arises from exploiting unique characteristics of the firm and unleashing its innovating capabilities

Reconsidering the empirical evidence

Turns to an examination of four well-known SISs based on the two themes and other considerations from above. This examination shows that these “cases emphasize the discrepancy between ideal plans for an SIS and the realities of implementation” (p. 302). i.e.

The system was developed by one of the business units. The system was not developed according to a company-wide strategic plan; rather, it was the outcome of an evolutionary, piecemeal process that included the ingenious tactical use of systems already available.

i.e. bricolage. And even more revealing

the conventional MIS unit was responsible not only for initial neglect of the new strategic applications within McKesson, but also, subsequently, for the slow pace of company-wide learning about McKesson’s new information systems

Another system “was supposed to address an internal inefficiency” (p. 303) not some grand strategic goal.

And further

The most frequently cited SIS successes of the 1980s, then, tell the same story. Innovative SISs are not fully designed top-down or introduced in one shot; rather, they are tried out through prototyping and tinkering. In contrast, strategy formulation and design take place in pre-existing cognitive frames and organizational contexts that usually prevent designers and sponsors from seeing and exploiting the potential for innovation. (p. 303)

New foundations for SIS design

SIS development must be treated as an innovation process. The skills/competencies in an organisation are both a source and a constraint on innovation. The aim is to create knowledge.

New knowledge can be created in two non-exclusive ways

  1. Tinkering.

    Rely on local information and routine behaviour: learning by doing, incremental decision making and muddling through.

    Accessing more diverse and distant information, when an adequate level of competence is not present, would instead lead to errors and further divergence from optimal performance (Heiner, 1983) (p. 304)

    People close to the operational level have to be able to tinker to solve new problems. “local cues from a situation are trusted and exploited in a somewhat unreflective way, aiming at ad hoc solutions by heuristics rather than high theory”

    The value of this approach is to keep development of an SIS close to the competencies of the organisation and ongoing fluctuations.

  2. Radical learning

    “entails restructuring the cognitive and organisational backgrounds that give meaning to the practices, routines and skills at hand” (p. 304). It requires more than analysis and requirements specifications, aiming at restructuring the context of both business policy and systems development. It requires “intervening in situations and designing-in-action”.

    The change in context allows new ways of looking at the capabilities and devising new strategies. The sheer difference becomes difficult to imitate.

SIS planning by oxymorons

Time to translate those theoretical observations into practical guidelines.

Argues that the way to develop an SIS is to proceed by oxymoron, fusing “opposites in practice and being exposed to the mismatches that are bound to occur” (p. 305). Defines 7:

  • 4 to bolster incremental learning
    1. Value bricolage strategically
    2. Design tinkering

      This is important

      Activities, settings, and systems have to be arranged so that invention and prototyping by end-users can flourish, together with open experimentation (p. 305)

      Set up the organisation to favour local innovation, e.g. ad hoc project teams, ethnographic studies.

    3. Establish systematic serendipity

      Open experimentation results in largely incomplete designs, the constant intermingling of implementation and refinement, concurrent or simultaneous conception and execution – NOT sequential

      An ideal context for serendipity to emerge and lead to unexpected solutions.

    4. Thrive on gradual breakthroughs.

      In a fluctuating environment the ideas that arise are likely to include those that don’t align with established organisational routines. The raw material for innovation. “management should appreciate and learn about such emerging practices”

  • Radical learning and innovation
    1. Practice unskilled learning

      Radically innovative approaches may be seen as incompetent when judged by old routines and norms. Management should value this behaviour as an attempt to unlearn old ways of thinking and doing. It’s where new perspectives arise.

    2. Strive for failure

      Going for excellence suggests doing better what you already do, which generates routinized and efficient systems: the competency trap. Creative reflection over failures can suggest ways to novel ideas and designs, and also the recognition of discontinuities and flex points.

    3. Achieve collaborative inimitability

      Don’t be afraid to collaborate with competitors. Expose the org to new cultures and ideas.

These seven oxymorons can represent a new “systematic” approach for the establishment of an organizational environment where new information—and thus new systems—can be generated. Precisely because they are paradoxical, they can unfreeze existing routines, cognitive frames and behaviors; they favor learning over monitoring and innovation. (p. 306)

References

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297–309.

How much reblogging is "bad"?

Pragmatic students have always made judgements about exactly how they are going to engage with the activities teachers set them. I know I made decisions not to attend lectures (especially those at 8am on cold winter mornings). At times I’ve focused on just the assignments without engaging completely in the various intended learning activities. I made those decisions in a context nowhere near as constrained as many of my current students find themselves in terms of trying to balance work (often full-time), family (often very complex), and study.

So it’s no great surprise to see students “corrupting” the intent of learning activities designed into the course I teach. It’s happened before and I assume that there is a whole range of different forms of task corruption going on. Over the weekend, however, I became aware of a different type of task corruption. It appears that at least some students are simply “reblogging” the posts of others without making additional comments.

The use of re-blogging has arisen for the first time this year due to changes in the advice given to students. This year I was much more specific in advising them to use WordPress.com for their individual blog, and apparently many of them have noticed the WordPress “re-blog” feature. I’m not a big fan of reblogging. I don’t use it: I’m comfortable with creating links (something which a fair portion of my students appear to struggle with), I like more control over what appears on my blog, and the “re-blog” functionality doesn’t really fit my blogging process. That said, I don’t mind students using re-blogging as part of their reflective process for the course.

The course’s use of student blogs is aimed more at getting participation than at “right” answers. Students have to write a number of posts each week, a portion of those have to contain links to online resources, and a number of them have to have links to the posts of other students. The aim is to encourage the students to engage with a different tool and to develop some reflective practice without worrying too much about providing the answers I want. Re-blogging is an easy way to include a link to someone else’s post and to use that as a spark for reflective thinking. I have no problem with that.

The issue that has arisen is that at least some of the students appear to have just re-blogged. They haven’t added any comments of their own. So no reflection of their own. The aim of the following is to

  1. Explore how much re-blogging has gone on amongst the current cohort.
  2. Give some thought to whether this is a problem and what should be done.

How much re-blogging?

So how much is going on?

Handily I have a copy of all the students’ posts, and the WordPress.com re-blog feature uses a specific set of tags to indicate the different parts of a re-blog (the simplified tag structure is shown after the questions below). This makes it very easy to analyse. The process here will seek to answer the following questions

  1. What percentage of all posts are based on a reblog?

    There are 242 posts out of 5915 posts so far that contain a reblog. So about 4% of posts.

  2. What percentage of students have used the reblog facility?

    There are 338 students who have posts in the system. 80 of those students have used reblog. About 24%.

  3. What percentage of each student’s total number of posts use reblog?

    The figures above suggest that those students who are using reblog are averaging about 3 reblogs. With the average total number of posts per student at 17.5.

    Somewhat alarming, however, is the observation that the top two rebloggers have 25 and 18 reblogs respectively. Given that by this stage of semester 30 posts would be a good number to have written, this is a little worrying.

    But the chart below suggests otherwise. It shows that the students who are reblogging lots are also posting lots. There are one or two who are getting close to a 50% split. Will need to explore those a bit more closely.

  4. What percentage of each reblog is original content?

    Appears that the reblogs average about 33% original content. In other words, the original post written by someone else (on average) makes up two-thirds of the post. There are 20-odd posts with less than 10% original content. These appear to be examples where “Great post” or similar is the entire original contribution.

    The question is whether these students are getting any advantage in terms of marks through this practice.

    For the second assignment there are 6 students who are getting some small benefit from reblogging. Of these 6 only 2 have multiple reblogs (4 and 2).

The simplified WordPress tag structure is
[code language="html"]
<div class="reblogger-note-content">Comment made by person reblogging.</div>
<div class="reblog-post">the post that was reblogged</div>
[/code]
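
For the curious, the percentages above boil down to something like the following (a minimal sketch rather than the actual analysis script; it assumes each post’s HTML is available as a string and relies on the reblog-post class shown above):
[code language="php"]
<?php
// Minimal sketch: estimate what fraction of a post's text is original,
// i.e. not inside the div that holds the reblogged content.
function reblog_original_fraction( $html ) {
    if ( strpos( $html, 'reblog-post' ) === false ) {
        return null;  // not a reblog
    }

    $doc = new DOMDocument();
    @$doc->loadHTML( $html );  // @ silences warnings from messy markup
    $xpath = new DOMXPath( $doc );

    // All visible text in the post.
    $total = strlen( trim( $doc->textContent ) );

    // Text that came from the post being reblogged.
    $reblogged = 0;
    foreach ( $xpath->query( "//div[contains(@class,'reblog-post')]" ) as $div ) {
        $reblogged += strlen( trim( $div->textContent ) );
    }

    return $total > 0 ? ( $total - $reblogged ) / $total : 0;
}
[/code]
Something along those lines, run over the saved posts, gives the sort of percentages reported above.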

[Chart: Reblogs and posts, by David T Jones, on Flickr]

Conclusions

Some quick, initial conclusions based on the above

  1. There isn’t any evidence of widespread use of the reblogging feature to corrupt the task.
  2. The rise of this practice has only appeared this year as a result of slight changes in the tools used, and perhaps a group of students who are more familiar with other social media than blogging (i.e. more ready to adopt the retweet approach in blogging).
  3. Reblogging appears to challenge some of the more traditional notions of ownership held by some students.

    e.g. the idea that by reblogging others are trying to steal their ideas, rather than providing an affirmation of those ideas and trying to spread them further.

  4. Though perhaps the major concern was students being seen as getting a free ride.
  5. I still have the tendency to assume students are seeking to corrupt the task, rather than looking for other more relevant explanations.
  6. Exploring reasons and actions at the student level would likely reveal more interesting negotiations and understandings.

All of the above need to be thought about and explored some more. One of these days.
