The need for a third way

One of the themes of this blog is that the majority of current approaches to improving learning and teaching within universities simply don’t work. At least not in terms of enabling improvement in the majority of the learning and teaching at an institution. Recently I finally finished reading the book Nudge by Thaler and Sunstein. Chapter 18 is titled “The Real Third Way”. This post explores how that metaphor connects with some of the thinking expressed here.

The real third way

Thaler and Sunstein mention that the “20th century was pervaded by a great deal of artificial talk about the possibility of a ‘Third Way'” in politics. Their proposal is that libertarian paternalism, the topic of the book, represents a real third way. I’m not talking politics but there appears to be the same need to break out of a pointless dichotomy and move onto something more useful.

The characterisations of the two existing ways provided by Thaler and Sunstein are fairly traditional (stereotypical?) extremes of the political spectrum. i.e.:

  1. Liberal/Democrat – “enthusiasm for rigid national requirements and for command-and-control regulation. Having identified serious problems in the private market, Democrats have often insisted on firm mandates, typically eliminating or at least reducing freedom of choice.”
  2. Conservative/Republican – have argued against government intervention and on behalf of a laissez-faire approach, with freedom of choice being a defining principle. They argue that “in light of the sheer diversity of Americans one size cannot possibly fit all”.

Thaler and Sunstein’s third way – libertarian paternalism – is based on two claims:

  1. Choice architecture is pervasive and unavoidable.
    Small features of social situations have a significant impact on the decisions people make. The set of these features – the choice architecture – in any given social situation already exists and is already influencing people toward making good or bad decisions.
  2. Choice architecture can be manipulated while retaining freedom of choice.
    It is possible to make minor changes to the set of features in a social situation such that it encourages people to make “better” decisions, whilst still allowing them to make the “bad” decision, if that’s what they want.

Connections with improving learning and teaching

Early last year I borrowed and slightly modified Biggs’ 3 levels of teaching to identify 3 levels of improving learning and teaching. Obviously there is a numerical connection between these 3 levels and the “3 ways” outlined above. The more I’ve thought about it, the more I realise that the connections are more significant than that, and that the “3rd way” seems to be a useful way to position my beliefs about how to improve learning and teaching within a university. Here goes my first attempt at explicating it.

Expanding upon the 3 levels of improving L&T

The 3 levels I initially introduced can be expanded/morphed into ways or into stages. In terms of stages, I could probably argue that the levels/stages represent a historical evolution of how learning and teaching has been dealt with in universities. Those three stages are:

  1. What the teacher is (i.e. ignore L&T).
    This is the traditional/historical stage that some long-term academics look back on with fond memories. University management didn’t really get involved with teaching and learning. Individual academics were left to teach the course the way they felt it should be taught. There was little oversight and little need for outside support.

    The quality of the teaching was solely down to the nature of the teacher. If they were a good teacher, good things happened. If bad… This was the era of selective higher education where, theoretically, only the best and the brightest went to university and most were seen to have the intellectual capability and drive to succeed regardless.

    For a surprising number of universities, especially those in the top rank of universities, this is still primarily how they operate. However, those of us working in “lesser” institutions are now seeing a different situation.

  2. What management does (i.e. blame the teacher).
    Due to the broadly publicised characteristics of globalisation, the knowledge economy, accountability etc. there is now significant pressure upon universities to demonstrate that the teaching at their institutions is of high quality. Actually, this has morphed into proxy measures where the quality of teaching is being measured by ad hoc student memories of their experience (CEQ surveys), how many of the academics have been forced to complete graduate certificates in higher education, what percentage of courses have course websites and how well the institution has filled out forms mapping graduate attributes.

    All of these changes to the practice of teaching and learning are projects that are initiated and “led” by senior university management. The success of the institution is judged on how well senior university management complete those projects.

    As each new fad arises within government or the university sector, there is a new set of projects to be completed. Similarly, when a new set of senior management starts within an institution, there is a new set of projects to be completed. In this case, however, the projects aren’t typically all that new. Instead they are simply the opposite of what the last management did. i.e. if L&T support was centralised by the last lot of management, it must now be de-centralised.

    Most academics suffering through this stage would like to move back to the first stage. I think they and their institutions instead need to move on to the next one.

  3. What the teacher does.
    For me this is where the institution, its systems, processes etc. are continually being aligned to encourage and enable academics to improve what they are doing. The focus is on what the teacher does. This has strong connections with ideas of distributive leadership, the work of Fullan (2008) and Biggs (2001).

    For me implementing this stage means taking an approach more informed by complex adaptive systems, distributive leadership, libertarian paternalism, emergent/ateleological design and much more. This stage recognises that in many universities stage 1 no longer works. Successful teaching now draws on so many people and skills that academics can’t do it by themselves (if they ever did). However, that doesn’t mean that the freedom of academics to apply their insights and knowledge should be removed.

So, now I’ve expanded on those, time to connect these three ways with some other triads.

Connections with politics

The following table summarises what I see as the connections with the 3 stages of improving learning and teaching and the work of Thaler and Sunstein (2008).

  1. Conservative/republican == What the teacher is.
    i.e. the laissez-faire approach to teaching and learning. Academics are all too different; no one system or approach to teaching can work for us all.
  2. Liberal/democrat == What management does.
    There are big problems with learning and teaching at universities that can only be solved by major projects led by management. Academics can’t be trusted to teach properly, so we need to put in place systems that mandate how they will teach and force them to comply.
  3. Libertarian paternalism == What the teacher does.
    The teaching environment (including the people, systems, processes, policies and everything else) within a university has all sorts of characteristics that influence academics to make good and bad decisions about how they teach. To improve teaching you need to make small and on-going changes to the characteristics of that environment so that the decisions academics are most likely to make will improve the quality of their teaching and learning. A particular focus should be on encouraging and enabling academics to reflect on their practice and take appropriate action.

Approaches to planning

This morning George Siemens pointed to this report (Baser and Morgan, 2008) and made particular mention of the following chart that compares assumptions between two different approaches to planning.

Comparison of assumptions in different approaches to planning (adapted from Baser and Morgan, 2008)
| Aspect | Traditional planning | Complex adaptive systems |
| --- | --- | --- |
| Source of direction | Often top down with inputs from partners | Depends on connections between the system agents |
| Objectives | Clear goals and structures | Emerging goals, plans and structures |
| Diversity | Values consensus | Expects tension and conflict |
| Role of variables | Few variables determine the outcome | Innumerable variables determine outcomes |
| Focus of attention | The whole is equal to the sum of the parts | The whole is different than the sum of the parts |
| Sense of the structure | Hierarchical | Interconnected web |
| Relationships | Important and directive | Determinant and empowering |
| Shadow system | Try to ignore and weaken | Accept; most mental models, legitimacy and motivation for action come out of this source |
| Measures of success | Efficiency and reliability are the measures of value | Responsiveness to the environment is the measure of value |
| Paradox | Ignore or choose | Accept and work with paradox, counter-forces and tension |
| View on planning | Individual or system behaviour is knowable, predictable and controllable | Individual and system behaviour is unknowable, unpredictable and uncontrollable |
| Attitude to diversity and conflict | Drive for shared understanding and consensus | Diverse knowledge and particular viewpoints |
| Leadership | Strategy formulator and heroic leader | Facilitative and catalytic |
| Nature of direction | Control and direction from the top | Self-organisation emerging from the bottom |
| Control | Designed up front and then imposed from the centre | Gained through adaptation and self-organisation |
| History | Can be engineered in the present | Path dependent |
| External interventions | Direct | Indirect; helps create the conditions for emergence |
| Vision and planning | Detailed design and prediction; needs to be explicit, clear and measurable | A few simple explicit rules and some minimum specifications, leading to a strategy that is complex but implicit |
| Point of intervention | Design for large, integrated interventions | Where opportunities for change present themselves |
| Reaction to uncertainty | Try to control | Work with chaos |
| Effectiveness | Defines success as closing the gap with a preferred future | Defines success as fit with the environment |

I was always going to like this table as it encapsulates, extends and improves my long-term thinking about how best to improve learning and teaching within universities. I long ago accepted (Jones, 2000; Jones et al, 2005) that universities are complex adaptive systems and that any attempt to treat them as ordered systems is doomed to failure.

I particularly liked the row on shadow systems as it corresponds with what some colleagues and I (Jones et al, 2004) suggested some time ago.

In terms of connections with the stages of improving learning and teaching:

  1. No planning == What the teacher is.
    i.e. there is no real organisational approach to planning how to improve learning and teaching. It’s all left up to the academic.

    Often “traditional planning” proponents will refer to the complex adaptive systems approach to planning as “no planning”. Or worse, they’ll raise the spectre of no control, no discipline or no governance over the complex adaptive systems planning approach. What they are referring to is actually the no planning stage. A CAS planning approach, done well, needs as much, if not more, discipline and “governance” as a traditional planning approach done well.

  2. Traditional planning == What management does.
    University management (at least in Australia) is caught in this trap of trying to manage universities as if they were ordered systems. They are creating strategic plans, management plans, embarking on analysis and then design of large scale projects and measuring success by the completion of those projects, not on what they actually do to the organisation or the quality of learning and teaching.
  3. Complex adaptive systems == What the teacher does.
    The aim is to increase the quantity and quality of the connections between agents within the university. To harness the diversity inherent in a large group of academics to develop truly innovative and appropriate improvements. To be informed by everything in the complex adaptive systems column.

Orders of change

There also seem to be connections to yet another triad described by Bartunek and Moch (1987) when they take the concept of schemata from cognitive science and apply it to organisational development. Schemata are organising frameworks or frames that are used (without thinking) to make decisions. i.e. you don’t make decisions about events alone; how you interpret them is guided by the schemata you are using. Schemata (Bartunek and Moch, 1987):

  • Help identify entities and specify relationships amongst them.
  • Act as data reduction devices as situations/entities are represented as belonging to a specific type of situation.
  • Guide people to pay attention to some aspects of the situation and to ignore others.
  • Guide how people understand or draw implications from actions or situations.

In moving from the cognition of individuals to organisations, the idea is that different organisations (and sub-parts thereof) develop organisational schemata that are sustained through myths, stories and metaphors. These organisational schemata guide how the organisation understands and responds to situations in much the same way as individual schemata. e.g. they influence what is important and what is not.

Bartunek and Moch (1987) then suggest that planned organisational change is aimed at trying to change organisational schemata. They propose that successful organisational change achieves one or more of three different orders of schematic change (Bartunek and Moch, 1987, p486):

  1. First-order change – the tacit reinforcement of present understandings.
  2. Second-order change – the conscious modification of present schemata in a particular direction.
  3. Third-order change – the training of organisational members to be aware of their present schemata and thereby more able to change these schemata as they see fit.

Hopefully, by now, you can see where the connections with the three stages of improving teaching and learning are going, i.e.

  1. First-order change == What the teacher is.
    Generally speaking how teaching is understood by the academics doesn’t change. Their existing schemata are reinforced.
  2. Second-order change == What management does.
    Management choose a new direction and then lead a project that encourages/requires teaching academics to accept the new schemata. When the next fad or the next set of management arrives, a new project is implemented and teaching academics once again have to accept new schemata. If you’re like me, then you question whether the academics are actually accepting the new schemata or merely being seen to comply.

    The most obvious current example of this approach is the growing requirement for teaching academics to have formal teaching qualifications. i.e. by completing the formal teaching qualification they will change their schemata around teaching. Again, I question (along with some significant literature) the effectiveness of this.

  3. Third-order change == What the teacher does.
    The aim here is to have an organisational environment that encourages and enables individual academics to reflect on their current schemata around teaching and be able to change it as they see problems.

    From this perspective, I see the major problem within universities not as academics lacking appropriate schemata to improve teaching, but that the environment within which they operate doesn’t encourage or enable them to implement, reflect on or change their schemata.

Conclusions

I think there is a need for a 3rd way of improving learning and teaching within universities. It is not something that is easy to implement. The 2nd way of improving learning and teaching is so embedded into the assumptions of government and senior management that they are not even aware of (or at best not going to mention) the limitations of their current approach or that a 3rd way exists.

Look down the “Traditional planning” column in the table above and you can see numerous examples of entrenched, “common-sense” perspectives that have to be overcome if the 3rd way is to become possible. For example, in terms of diversity and conflict, most organisational approaches place emphasis on consensus. Everyone has to be happy and singing from the same hymn sheet, “why can’t everyone just get along?”. The requirement to have a hero leader and hierarchical organisational structures are other “common-sense” perspectives.

Perhaps the most difficult aspect of implementing a 3rd way is that there is no “template” or set process to follow. There is no existing university that has publicly stated it is following the 3rd way. Hence, there’s no-one to copy. An institution would have to be first. Something that would require courage and insight. Not to mention that any attempt to implement a 3rd way should (for me) adopt an approach to planning based on the complex adaptive systems assumptions from the above table.

References

Baser, H. and P. Morgan (2008). Capacity, Change and Performance Study Report, European Centre for Development Policy Management: 166.

Bartunek, J. and M. Moch (1987). “First-order, second-order and third-order change and organization development interventions: A cognitive approach.” The Journal of Applied Behavioral Science 23(4): 483-500.

Biggs, J. (2001). “The Reflective Institution: Assuring and Enhancing the Quality of Teaching and Learning.” Higher Education 41(3): 221-238.

Fullan, M. (2008). The six secrets of change. San Francisco, CA, John Wiley and Sons.

Jones, D. (2000). Emergent development and the virtual university. Learning’2000. Roanoke, Virginia.

Jones, D., J. Luck, et al. (2005). The teleological brake on ICTs in open and distance learning. Conference of the Open and Distance Learning Association of Australia’2005, Adelaide.

Thaler, R. and C. Sunstein (2008). Nudge: Improving decisions about health, wealth and happiness. New York, Penguin.

Adding email merge to BIM

The following details an attempt to use user/messageselect.php with BIM, as a step towards implementing an email merge facility for BIM.

BIM passing users

The intent here is that BIM will be used to select the users and will pass them to messageselect. The first test will be to replace the current “unregistered users” section on “Your Students”, which simply shows a list of email addresses that the staff member has to copy and paste into an email program. See the following screen shot (click on it to see it larger).

Unregistered users - BIM your students

The idea is to replace it with a simple link that, when clicked, will pass the details of the unregistered users to messageselect.php.

Parameters for messageselect

For this to work, I need to pass messageselect all the parameters it expects in the way it expects them.

First, the parameters it expects are:

  • The list of user ids for the recipients.
    This is done using checkboxes with parameter names of the form user{id} (e.g. user42), where id is the Moodle user id.
  • The course id.
    id set to the Moodle course id.
  • formaction
    Seems to simply be the name of the messageselect script.
  • returnto
    The path of the script it’s coming from.
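
Putting those together, a minimal sketch of what BIM needs to generate might look like the following. This is illustrative only – the user ids (42, 57), course id (7) and returnto path are made-up values, not taken from a real install:

[sourcecode lang="html"]
<!-- Illustrative sketch: a form POSTing two users (Moodle ids 42 and 57)
     from course id 7 to messageselect.php. All values are made up. -->
<form method="post" action="/user/messageselect.php">
  <input type="hidden" name="id" value="7" />
  <input type="hidden" name="returnto" value="/mod/bim/view.php" />
  <input type="hidden" name="formaction" value="messageselect.php" />
  <input type="hidden" name="user42" value="on" />
  <input type="hidden" name="user57" value="on" />
  <input type="submit" value="Email unregistered students" />
</form>
[/sourcecode]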

Parameter passing for message select

In terms of how to pass the data, I’ve tried a normal query string. But that didn’t seem to create the necessary outcome.

It appears that messageselect uses the PHP $_POST superglobal, which is populated by a form using the post method. So let’s try that.

Yep, that seems to work. May be as simple as that.

Have been able to get that working. However, the “returnto” doesn’t seem to work all the way down the various screens. It works on the first, but not on the last.

bim_email_merge

The following is the function I’ve added to BIM to enable the use of messageselect.php.

[sourcecode lang="php"]
function bim_email_merge( $ids, $course, $returnto, $button_msg ) {

    global $CFG;

    // Open a form that POSTs to Moodle's messageselect.php with the
    // course id, return path and formaction parameters it expects
    print <<<EOF
<form method="post" action="$CFG->wwwroot/user/messageselect.php">
<input type="hidden" name="id" value="$course" />
<input type="hidden" name="returnto" value="$returnto" />
<input type="hidden" name="formaction" value="messageselect.php" />
<input type="submit" name="submit" value="$button_msg" />
EOF;

    // One hidden field per recipient, mimicking the user{id} checkboxes
    // that messageselect.php normally generates itself
    foreach ( $ids as $id ) {
        print '<input type="hidden" name="user' . $id . '" value="on" />';
    }
    print "</form>";
}
[/sourcecode]

This function displays a submit button with a given message. If pressed the form sends a list of Moodle user ids ($ids) to messageselect. At this stage the user can create the message, choose to remove some users and then send the message. I think.
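
As a usage sketch, the “unregistered users” section could then call the function along these lines. The variable names, the returnto path and the button label are illustrative assumptions, not BIM’s actual code:

[sourcecode lang="php"]
// Illustrative call site only: $unregistered is assumed to hold the
// Moodle user ids of students who have not yet registered a blog,
// and the returnto path is a guess at the relevant BIM script.
$unregistered = array( 42, 57, 103 );

bim_email_merge(
    $unregistered,                     // recipients (Moodle user ids)
    $course->id,                       // Moodle course id
    "/mod/bim/view.php?id={$cm->id}",  // script to return to (assumed)
    "Email unregistered students"      // label for the submit button
);
[/sourcecode]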

Implemented in BIM, it looks like the following.

BIM's new email merge

Institutional changes – 2000 and beyond – and their impact

This carries on “bits” from chapter 5 of the thesis. It’s a rough draft of a description of the institutional context within CQU from 2000 onwards. It’s brief and targeted mainly at the factors which impact on Webfuse development.

It needs more work and checking. If you have any suggestions, fire away.

Institutional changes

In mid-1996 CQU appointed a new Vice-Chancellor who was an advocate of a number of new initiatives (Gregor, Wassenaar et al. 2002), including the 1998 organisational restructure, the introduction of a four-term year and an increasing emphasis on overseas, full-fee paying students. While all of these changes were introduced prior to 2000, each had on-going ramifications that were being dealt with by CQU management and staff. These ramifications, in combination with a number of additional changes, were part of the reason why the next CQU Vice-Chancellor described CQU as a “work in progress” and “a unique university” (Hancock 2002). However, the institution did retain the vision “to be a unified university, acknowledged universally as a leader in flexible teaching and learning” (Hancock 2002).

As described in chapter 4, by 1998 Webfuse was being used to support the online learning and teaching activities of the Faculty of Informatics and Communication (Infocom). While the organisational restructure that led to the creation of Infocom happened in 1998, it was not until 1999 that the foundation Dean of Infocom commenced work at CQU. The Dean saw various forces for change, including ICTs, enabling and requiring the development of a “‘glocal’ networked education paradigm” in order to provide a scalable and globally competitive flexible model of educational delivery (Marshall 2001). The emergent development of this model underpinned Infocom’s Singapore online project (Marshall 2001) and subsequently impacted upon broader faculty practices.

In parallel with these developments within Infocom, CQU was performing various reviews and planning processes aimed at developing “structures and systems that are responsive to the needs of learners and the changing nature of higher education in the 21st Century” (CQU 2001). The third stage of this process was the release of a Strategic Plan for Flexible Learning in 2001. Evidence of the importance of flexibility and on-going change is summarised by the following exhortation from that plan (CQU 2001):

The Strategic Plan for Flexible Learning is a ‘living document’. It is imperative that the Strategic Plan be regarded with the same flexibility as the very learning experiences it aims to promote and enhance. To regard the Strategic Plan as anything less will threaten CQU’s position as a market leader in a competitive environment

As described in Chapter 4, during the second half of the 1990s CQU, in partnership with a commercial company, created a number of international campuses based in major Australian cities. By the late 1990s, these campuses in combination with the dot-com boom and changes in Australian migration rules contributed to significant growth in the student population at CQU. In 1996 international students comprised only 7.3% of CQU’s student population (Marshall and Gregor 2002). By 2002 CQU was, in terms of international students, one of Australia’s fastest growing universities, with only 25% of CQU’s students being recent high school graduates (Marshall and Gregor 2002). By 2004, 40% of CQU’s student population were international students from 121 countries (Luck, Jones et al. 2004). From 1996 through 2004 CQU increased its total student numbers by almost 50%. Consequently, in 2002 the CQU Vice-Chancellor described CQU as having the “most geographically disparate, ethnically diverse and fastest growing student population of any Australian University” (Hancock 2002).

By 2002, Infocom was teaching about 30% of all CQU students including almost 56% of the students at the international campuses (Jones 2003). From 1999 through 2002 Infocom student numbers more than doubled (Condon, Shepherd et al. 2003). However, by 2003 the global downturn in IT started to impact Infocom enrolments. Table 5.2, adapted from Condon et al (2003), summarises the trend in Infocom student numbers from 1998 through 2003.

Table 5.2 – Increase in Infocom student numbers – 1998-2003 (adapted from Condon, Shepherd et al. 2003).

| Year | Total | Percentage increase |
| --- | --- | --- |
| 1998 | 16646 | Infocom’s first year |
| 1999 | 18504 | 11% on previous year |
| 2000 | 25784 | 39% on previous year |
| 2001 | 37664 | 48% on previous year |
| 2002 | 42654 | 13.2% on previous year |
| 2003 | 36105 | -15.3% on previous year |

This growing complexity and the growing recognition of the importance of e-learning led CQU into a number of technological changes, including the adoption of a number of enterprise systems. In order to cope with the increasing complexity, CQU’s Vice-Chancellor was strongly in favour of integrating the university’s administrative systems (Jones, Behrens et al. 2004). Consequently, in 1999 CQU’s senior management took the decision to implement the PeopleSoft suite of administrative systems (McConachie 2001). The implementation of PeopleSoft was seen as a business process re-engineering project which would require second-order structural and policy change at the University (McConachie 2001). The decision to adopt an ERP system like PeopleSoft was common within the Australian higher education sector at this time. By 2002 almost 90% of Australian universities had adopted at least one module of an ERP from a major vendor, with approximately 55% of universities using PeopleSoft (Beekhuyzen, Nielsen et al. 2002).

In 2002, CQU’s Vice-Chancellor continued a long-running mantra of CQU senior management by writing that universities needed the ability to be responsive to a world that was changing fast and needed to provide education that was flexible in terms of delivery time, mode, location and content (Hancock 2002). The increasing requirement for flexibility and the accompanying increasing interest in online learning meant that by 1999 CQU’s existing processes for online/multimedia development – focused around the interactive multimedia unit (Macpherson and Smith 1998) described in chapter 4 – could no longer respond to demand (Sturgess and Nouwens 2004). After a survey and simple technical evaluation it was decided to adopt WebCT as a trial institutional learning management system (LMS) (Sturgess and Nouwens 2004). An academic interviewed by Gregor et al (2002) reports major problems with the WebCT trials due to inadequate infrastructure, a problem solved by the purchase of a large central Web server. Subsequently, WebCT became the official, institutional platform for e-learning. By the end of 2003 just over 10% of courses offered by CQU had a course website (Jones 2003). WebCT was replaced with Blackboard in 2004 (Danaher, Luck et al. 2005), which in turn was replaced by Moodle in 2010 (Tickle, Muldoon et al. 2009).

Impacts of these changes

As a result of the changes described above, CQU had a diverse student population quite unlike that of a traditional university (Marshall and Gregor 2002). It was not unusual for course enrolments at the international campuses to be considerably greater than those on the Queensland campuses (Oliver and Van Dyke 2004). By 1999 it was already obvious that these changes had significantly increased the complexity of teaching, increased duplication of teaching methods and significantly decreased time and resources (Jones 1999). Speaking from experience teaching at CQU, Kehoe et al (2004) describe how the development of large undergraduate courses, challenging at any time, becomes even more complex when the students represent a combination of internal and distance education students, and domestic and international students. By 2001 CQU had 11 course offerings with over 1000 enrolled students each. Typically these courses would be supported by close to 20 academic staff, including a number of casual staff, all managed by a single CQU academic.

The 1999 CQU review of distance education and flexible learning recognised that the work necessary to continue to provide existing services, while at the same time planning, implementing and progressing a broad array of on-going changes, was considerable (CQU 1999). The growing complexity of teaching and learning on this scale led to the development of additional policies, procedures, systems and support structures to guide the management of teaching and learning. This included the employment of additional staff. Over an 18-month period Infocom staff numbers rose from 80 to 150, with a doubling of general staff (25 to 53) and almost a doubling of academic staff (55 to almost 90) (Condon, Shepherd et al. 2003). At the same time, the increasingly complex demands created by these changes focused attention on the need for supporting information systems such as an ERP (Oliver and Van Dyke 2004). However, there were significant problems with some of the systems implemented to address these problems. Oliver and Van Dyke (2004) report that rather than decrease staffing costs, the implementation of a new ERP had increased staffing levels, suggesting that processes had become more complicated, rather than simpler, and that cited benefits for staff have been “difficult to discern in practice”. Drawing on CQU experience, Jones et al (1999) identify two characteristics of the systems and processes set up to respond to these changes which limit flexibility. First, the cost of setting up these systems suggests a period of stable use in order to recoup costs. Second, the way aspects of learning and teaching are split amongst existing organisational structures limits convergence and integration.

Writing in 2002, the Vice-Chancellor of CQU recognised that the institution’s rapid growth “has placed great strain on its staff and its physical and technological infrastructure” (Hancock 2002). In particular, the attempt to increase flexibility by offering year-round teaching had placed great strain on staff and required new approaches to workload and workforce planning (Hancock 2002). Numerous authors (McConachie 2001; Luck, Jones et al. 2004; Oliver and Van Dyke 2004) report that CQU staff members increasingly describe themselves as change weary. McConachie (2001) describes how CQU staff perceive the many changes of previous years to have been poorly communicated and badly managed, leading to a climate where further change is unwelcome. Not surprisingly, it is not unusual for academic staff to resist attempts to alter their routines or their control over specific tasks (Hough, McNaught et al. 1998; Jones, Gregor et al. 2003).

As outlined in Section 5.2.1, the foundation Infocom Dean led the development of a “glocal networked learning paradigm” (Marshall 2001) that was first trialled with Infocom’s Singapore operations. As with other changes, this one required the provision of policies, processes, resources and systems for successful implementation. Early in 2000 the author, and chief Webfuse designer, was seconded away from teaching to help support the Singapore project. By August 2001 the decision was made to extend this assignment for as long as necessary (Marshall 2000) and to broaden its scope beyond Singapore, which consequently led to the author taking on the position of Faculty Teaching and Learning Innovation Officer.

The growing importance of the web and Webfuse to Infocom’s operations is demonstrated by the changes in the Infocom web team. The web team was responsible for maintaining the faculty’s website, including its online learning operations, and for the on-going development of Webfuse. From 1997 through 2000 the web team consisted of a webmaster, a part-time “developer” and ad hoc support from other faculty technical staff. The webmaster was responsible for the design and support of the entire faculty website, including some tasks associated with Webfuse development. The part-time “developer” was the author who, while not employed to perform Webfuse development, continued developing Webfuse for research purposes. By 2001 the web team had expanded to a webmaster, three permanent developers and a contracted developer (2001-2003).

As shown in Table 5.2, by 2003 Infocom student numbers were beginning to drop, mostly attributable to the global downturn in IT. Previous external perceptions of Infocom as innovative, with hard-working staff, began to change to perceptions of Infocom as greedy and somewhat less than successful (Condon, Shepherd et al. 2003). By late 2003 the foundation Dean of Infocom had been seconded to special projects; he left the University in early 2004 (Jones, Behrens et al. 2004). Also in late 2003, and in line with the drop in student numbers, there were indications that faculty budgets would be decreased and an increased push for the centralisation of services. During 2004 CQU underwent another organisational review, which led to an organisational restructure during 2005. During this time Webfuse support moved first into one of the new faculties and then into CQU’s central IT division. By 2008 there was one Webfuse developer working for the central IT division.

References

Beekhuyzen, J., J. Nielsen, et al. (2002). ERPS in universities: The Australian explosion! Pacific Asia Conference on Information Systems. Tokyo, Japan.

Condon, A., J. Shepherd, et al. (2003). Managing the evolution of a new faculty in the 21st century. ATEM’2003. Adelaide, SA.

CQU (1999). Review of distance education and flexible learning "The Foresight Saga". Rockhampton, Central Queensland University: 43.

CQU (2001). Strategic plan for Flexible Learning. Rockhampton, Central Queensland University.

Danaher, P. A., J. Luck, et al. (2005). "The stories that documents tell: Changing technology options from Blackboard, Webfuse and the Content Management System at Central Queensland University." Studies in Learning, Evaluation, Innovation and Development 2(1): 34-43.

Gregor, S., A. Wassenaar, et al. (2002). "Developing a virtual organization: Serendipity or strategy?" Asian Academy of Management 7(1): 1-19.

Hancock, G. (2002). "Higher education at the crossroads: A review of Australian Higher Education – A response from Central Queensland University."   Retrieved 19 July, 2009, from http://www.backingaustraliasfuture.gov.au/submissions/crossroads/pdf/280.pdf.

Hough, G., C. McNaught, et al. (1998). Developing a Faculty Plan for Flexible Delivery for the Next Five Years – and how to get there. Proceedings of ASCILITE’98.

Jones, D. (1999). Solving some problems with university education: Part II. Ausweb’99, Balina, Australia.

Jones, D. (2003). How to live with ERP systems and thrive. 2003 Tertiary Education Management Conference, Adelaide.

Jones, D., S. Behrens, et al. (2004). The rise and fall of a shadow system: Lessons for enterprise system implementation. Managing New Wave Information Systems: Enterprise, Government and Society, Proceedings of the 15th Australasian Conference on Information Systems, Hobart, Tasmania.

Jones, D., S. Gregor, et al. (2003). An information systems design theory for web-based education. IASTED International Symposium on Web-based Education, Rhodes, Greece, IASTED.

Jones, D., S. Stewart, et al. (1999). Patterns: Using Proven Experience to Develop Online Learning. Proceedings of ASCILITE’99, Brisbane, QUT.

Kehoe, J., B. Tennent, et al. (2004). "The challenge of flexible and non-traditional learning and teaching methods: Best practice in every situation?" Studies in Learning, Evaluation, Innovation and Development 1(1): 56-63.

Luck, J., D. Jones, et al. (2004). "Challenging Enterprises and Subcultures: Interrogating ‘Best Practice’ in Central Queensland University’s Course Management Systems." Best practice in university learning and teaching: Learning from our Challenges.  Theme issue of Studies in Learning, Evaluation, Innovation and Development 1(2): 19-31.

Macpherson, C. and A. Smith (1998). "Academic authors’ perceptions of the instructional design and development process for distance education: A case study." Distance Education 19(1): 124-141.

Marshall, S. (2000). Edmu, secondment, Hartford etc. D. Jones. Rockhampton.

Marshall, S. (2001). Faculty level strategies in response to globalisation. 12th Annual International Conference of the Australian Association for Institutional Research. Rockhampton, QLD, Australia.

Marshall, S. and S. Gregor (2002). Distance education in the online world: Implications for higher education. The design and management of effective distance learning programs. R. Discenza, C. Howard and K. Schenk. Hershey, PA, USA, IGI Publishing: 21-36.

McConachie, J. (2001). "Who benefits from exploratory business research? The effect of sub-cultures on the implementation of an enterprise system: An Australian regional university perspective." Queensland Journal of Educational Research 17(2): 193-208.

Oliver, D. and M. Van Dyke (2004). Looking back, looking in and looking on: Treading over the ERP battleground. Qualitative case studies on implementation of enterprise wide systems. L. von Hellens, S. Nielsen and J. Beekhuyzen. Hershey, PA, Idea Group: 123-138.

Sturgess, P. and F. Nouwens (2004). "Evaluation of online learning management systems." Turkish Online Journal of Distance Education 5(3).

Tickle, K., N. Muldoon, et al. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. ascilite 2009. Auckland, NZ: 1038-1047.

Focusing on integration – chapter 5

Back working on the thesis. The following is a rough draft of the introduction and part of the first section from Chapter 5 of the thesis. This chapter is starting to tell the story of Webfuse from 2000 to 2004 and beyond.

As I’m reading and writing, I’m remembering all sorts of details, which has led to a change in the title of the chapter. “Focusing on better integration” isn’t about technical integration, it’s about integrating “e-learning” into the everyday practice of academics. This is where I think Webfuse was a success.

Introduction

The previous chapter, chapter 4, described the first iteration (1996 to 1999) of the action-research process that led to the development of Webfuse and the Information Systems Design Theory (ISDT) that forms the basis for this thesis. This chapter describes the final iteration of the action research process (2000-2004 and beyond) and how it led to the formulation of the final version of the ISDT. A significant point of difference between this cycle and the previous one is that a more ateleological process was adopted. That is, unlike the previous cycle, which started with a specific set of design principles informing the design of an information system, this cycle commenced with an existing information system with a number of known problems and proceeded via various changes to attempt to address those problems. The final ISDT is an accumulation of the learning and reflection from this ateleological process of experimenting with the existing Webfuse system.

This chapter uses the same basic structure – adapted from the synthesised design and action research approach proposed by Cole et al (2005) – as used in chapter 4. However, in keeping with the more ateleological approach adopted in this cycle, the description of the intervention does not include a section explaining a priori design principles. The chapter starts with the problem definition (Section 5.2), including a description of the changes happening within the broader societal and institutional contexts during this period (Section 5.2.1) and a brief summary of the problems with Webfuse that arose from the first iteration (Section 5.2.2). Next, the intervention is described (Section 5.3) as a collection of separate, but related, changes in the system and its support. The outcomes of the intervention are then examined in the evaluation section (Section 5.4). All of this is brought together, first as an ISDT for e-learning within universities (Section 5.5), and then through the identification of lessons learned (Section 5.6).

Problem Definition

As described in Section 4.2 (cross ref), the basic problem needing to be solved was how (in 1996) to enable the Department of Mathematics and Computing (M&C) – and later the Faculty of Informatics and Communication – at Central Queensland University (CQU) to use the World-Wide-Web and other Internet-based technologies in its teaching and learning. The solution implemented to address this problem was the design and implementation of the Webfuse e-learning system (Section 4.3 cross ref). By 2000, the development and support of Webfuse had entered a new phase, informed by changes within the broader context and the desire to address the lessons learned during Webfuse’s early use. This section starts by describing the changes in the broader societal and CQU contexts, which both enabled and influenced the development of Webfuse from 2000 onwards (Section 5.2.1). Section 5.2.2 describes how these contextual factors influenced how Webfuse was supported and developed during this period. Finally, Section 5.2.3 briefly re-iterates the lessons learned from the use of Webfuse from 1997 through 1999 identified in Chapter 4.

Changes in institutional context

The period towards the end of the 20th and start of the 21st centuries saw considerable change in the context within which this work was performed. This section seeks to provide a brief summary of those changes and how they impacted the development and support of the Webfuse information system and eventually the direction taken with the ISDT. It starts with a summary of some of the major societal changes impacting higher education within Australia. Next, it provides a brief description of the changes, many influenced by societal factors, within the CQU context from 1999 onwards.

From 1999 onwards, acceptance of, access to, and use of computers and the Internet amongst the staff and students of CQU increased significantly, in line with changes within the broader societal context. Household Internet access within Australia quadrupled from 16% in 1998 to 64% in 2006/7 (Australian Bureau of Statistics 2008). For most of this time, however, the majority of Australian households made do without broadband Internet connections. By 2004/5, only 16% of Australian households had a broadband Internet connection, increasing to 43% by 2006/7 (Australian Bureau of Statistics 2008). Cost remained a significant barrier, with only 34% of people in bottom income quintile households having home Internet access compared with 77% in the top income quintile (Australian Bureau of Statistics 2007). This rapid increase, mirrored in other advanced countries, represented the growing penetration of the Internet and the World-Wide Web into everyday life.

The growing adoption of information and communications technologies (ICTs) had other broad societal impacts. In the years leading up to 2002, universities faced an almost overwhelming demand for information technology (IT) skills, fuelled by the dot-com boom and the perceived Y2K crisis (Smyth and Gable 2008). In addition, in mid-1999 the Australian government introduced an initiative that allowed former full-fee paying overseas students – studying a specified set of programs, including IT – to apply for permanent residence within the first six months after course completion, even if they did not have work (Birrell 2000). At the same time, the more robust market disposition of the state meant that universities were not only required to be more efficient and effective in using state resources, but were also required to compensate for reduced government funding by attracting private funds (Danaher, Gale et al. 2000). For a number of Australian universities this led to an increased reliance on overseas full-fee paying students in programs matching government-specified skills areas. Initially this included a heavy emphasis on IT skills but, with a global downturn in IT from 2002 leading to a decline in demand for these courses (Smyth and Gable 2008), the emphasis shifted to other programs such as accounting. As outlined in Chapter 4, CQU had adopted a strategic direction based on planned growth in the market for overseas students, and consequently fluctuations in demand impacted the institution.

In March 2000 the Australian federal and state ministers of education established the Australian Universities Quality Agency (AUQA) and assigned it the responsibility of providing public assurance of the quality of Australia’s universities and other institutions of higher education (AUQA 2000). Vidovich (2002) argues that the types of public sector policies that resulted in the formation of AUQA amount to mechanisms of indirect steerage, developed as a complement to the policies of devolution, decentralisation and deregulation characteristic of a prevailing market ideology. Vidovich (2002) also argues that the rise of quality policy and of globalisation within education at around the same time suggests that the two were intimately bound. Woodhouse (2003) – the Executive Director of AUQA – argues that the most frequently cited reasons for the great increase in external quality agencies in higher education are: the increase in public funding, the connection between higher education and national needs, and the growth in student numbers. Woodhouse (2003) reports that feedback on the trial and substantive AUQA quality audits in 2001 and 2002 was positive, with universities reporting beneficial effects through the audits and the self-reflection triggered by prospective audits. However, in a review of 320 substantive contributions from the first 15 volumes of the journal Quality in Higher Education, Harvey and Williams (2010) suggest that the overall tenor is that “external quality evaluations are not particularly good at encouraging improvement”.

By 1994, as one of eight nationally accredited Distance Education Centres, CQU had almost 5000 students in 400 courses studying by distance education (CQU 1999). By the late 1990s, the changes described in the previous paragraphs had begun to significantly influence conceptions of on-campus and distance education. By this time Australian distance education had been through three phases: (1) external studies (1911 to the early/mid 1970s); (2) distance education (early/mid 1970s to the mid 1980s); and (3) open learning (mid 1980s onwards) (Campion and Kelly 1988). By the late 1990s, factors such as declining funds, advancing technology and the changing demography of students had triggered a profound process of change in which distance education methods and systems were converging with those of face-to-face teaching (Moran and Myringer 1999). By 2004, Bigum and Rowan (2004) could describe how flexibility in teaching and learning had become commonplace within Australian higher education, with enthusiasm for the term arising from perceptions of it being: a) a more effective and efficient means of getting teaching resources to students; and b) through online teaching, a possible means of generating revenue from overseas fee-paying students. The key idea of flexible learning was a move away from instructor choice of key learning dimensions toward an approach that offered the student the flexibility to pick from a number of choices (Collis and Moonen 2002). Dekkers and Andrews (2000) suggested that once the use of technology became more common, discussion of flexible learning, like that of open learning, would soon revert simply to discussion of teaching and learning.

References

AUQA. (2000). "Mission, objectives, vision and values."   Retrieved 26 May 2010, from http://auqa.edu.au/aboutauqa/mission/.

Australian Bureau of Statistics (2007). Patterns of Internet Access in Australia 2006. Canberra, Australian Bureau of Statistics: 80.

Australian Bureau of Statistics (2008). Internet access at home. Australian Social Trends 2008. Canberra, ACT, Australian Bureau of Statistics: 10.

Bigum, C. and L. Rowan (2004). "Flexible learning in teacher education: myths, muddles and models." Asia-Pacific Journal of Teacher Education 32(3): 213-226.

Birrell, B. (2000). "Information technology and Australia’s immigration program: Is Australia doing enough?" People and Place 8(2): 77-83.

Campion, M. and M. Kelly (1988). "Integration of external studies and campus-based education in Australian higher education: The myth and the promise." Distance Education 9(2): 171-201.

Cole, R., S. Purao, et al. (2005). Being proactive: Where action research meets design research. Twenty-Sixth International Conference on Information Systems: 325-336.

Collis, B. and J. Moonen (2002). "Flexible learning in a digital world." Open Learning 17(3): 217-230.

CQU (1999). Review of distance education and flexible learning "The Foresight Saga". Rockhampton, Central Queensland University: 43.

Danaher, P. A., T. Gale, et al. (2000). "The teacher educator as (re)negotiated professional: critical incidents in steering between state and market in Australia." Journal of Education for Teaching 26(1): 55-71.

Dekkers, J. and T. Andrews (2000). A meta-analysis of flexible delivery in selected Australian tertiary institutions: How flexible is flexible delivery? ASET-HERDSA 2000, Toowoomba, Qld.

Harvey, L. and J. Williams (2010). "Fifteen years of quality in higher education." Quality in Higher Education 16(1): 3-36.

Moran, L. and B. Myringer (1999). Flexible learning and university change. Higher education through open and distance learning. K. Harry. London, Routledge: 57-71.

Smyth, B. and G. G. Gable (2008). The information systems discipline in Queensland. The Information Systems Academic Discipline in Australia. G. G. Gable, S. Gregor, R. Clarke, G. Ridley and R. Smyth. Canberra, ACT, Australia, ANU E Press: 187-208.

Vidovich, L. (2002). ‘Acceding to audits’: New quality assurance policy as a ‘settlement’ in fostering international markets for Australian higher education. Australian Association for Research in Education Conference. Brisbane.

Woodhouse, D. (2003). "Quality improvement through quality audit." Quality in Higher Education 9(2): 133-139.

One potential approach to provide a Moodle email merge facility

One of the issues I have to address with the BIM Moodle module is the provision of an email merge facility. I (and a couple of other people I know) haven’t been able to find how to do this within Moodle. The following outlines one proposal for how this might be done within Moodle 1.9.

I’m very keen to hear from more experienced Moodle folk about whether or not this type of service already exists within Moodle.

It’s likely that I will attempt to implement aspects of this approach in the next week to extend BIM.

What is email merge?

Essentially, it is a method to send the same message to multiple recipients, where each message can be customised to include information specific to each recipient. There are three main tasks in email merge:

  • Selecting the recipients.
    Specify the list of folk you want to send the message to.
  • Create the message.
    Enter the message, including support for specifying the information that will be specific to each person.
  • Manage the sending/re-sending of the message.
    Tracking who has received the message, specifying whether to try again automatically etc.

The following is a screen shot of the manage message screen from the Webfuse email merge facility, originally implemented by Nathaniel in 2002.

Email merge. The screen has a simple textbox for the message and supports attachments. The “Add tag to message” component allows the user to select “tags” from a drop-down box. In Webfuse the tags include parts of the student’s name, email address, student id, and the program they were studying.

Why use it

For most teaching staff, Webfuse email merge was used to send messages to groups of students: to welcome and orient them to the course, to remind them that an assignment was due soon and point them to resources, or to ask why they hadn’t submitted the assignment. In my experience, an email merge appears more personal and generates a greater level of connection with the student. Many, if not all, of the students realised it was a bulk email, but the personal touch helped.

What’s available in Moodle?

I’m still fairly new to Moodle from a user perspective, and the only functionality I’ve been able to find that comes close is the “Message course users” functionality that is available under course participants. When you view the participants in a course you can select some of them and then choose to “add /send message” – see the following.

Moodle select participants

Then you see a typical HTML editor with some additional guidance, plus a list of selected users which you can further edit. See the following.

Moodle send message

In terms of the main tasks for email merge there are some limitations:

  • Selecting the recipients.
    You can only select the recipients from the entire list of people within a course. This is limiting in two ways. First, you may wish to include recipients that cross a course boundary. Second, you may wish to start with an existing list of recipients, not select from the entire list of course participants.

    For example, you may wish to use email merge to send a message to all students who haven’t completed an assignment. Hence, from the gradebook you’d like to be viewing those students and have a link “Mail merge” that allows you to select all those students.

  • Create the message.
    There are two limitations here: no support for attachments and no support for personalisation, though it does have the HTML editor.
  • Manage sending.
    There doesn’t appear to be any support for this, so you can’t schedule the message to be sent at a specific time or on a specific event.

Improving recipient selection

Going beyond a course boundary is a little more difficult; however, improving selection within a course could be possible. The form that displays the message takes the list of recipients as a parameter – it appears in session data – so theoretically it might be possible for other Moodle extensions to generate this session data and call the form.

Improving message creation

The main missing piece here is the ability to include “tags” and have them replaced with personal information for each recipient. There are three broad tasks here:

  • Specifying the tags and where the information is.
  • Providing an interface that allows message senders to include tags in a message.
  • As each message is being sent, replace the tags with the actual personal information for the specific recipient.

The last two will likely require modifications to the file moodle/user/messageselect.php, which seems to implement most of this:

  • message edit screen;
    Need to add support to describe the available tags and allow the user to insert them in the message.
  • preview screen;
    Allows the user to see the message before it is sent. Add to this the ability to see the tags replaced with specific information from a user.
  • sending the message.
    i.e. where the tags get replaced with each recipient’s information.
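The sending step – replacing tags with each recipient’s details – is conceptually simple string substitution. A minimal sketch of the idea; the tag names, function name and data shapes here are illustrative only, not existing Moodle or Webfuse code:

```php
<?php
// Replace [tag] placeholders in a message template with per-recipient
// values. Tag names and the recipient record shape are assumptions.
function merge_message($template, $recipient) {
    $message = $template;
    foreach ($recipient as $tag => $value) {
        $message = str_replace('[' . $tag . ']', $value, $message);
    }
    return $message;
}

$template = 'Dear [firstname], the [course] assignment is due soon.';
$student  = array('firstname' => 'Alice', 'course' => 'COIS20025');
echo merge_message($template, $student);
// Dear Alice, the COIS20025 assignment is due soon.
```

The preview screen could reuse the same function with a sample recipient, so the sender sees exactly what one student would receive.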

Specifying the tags

Two ways to do this, simple and complex.

The simplest way to do this would be to restrict it to standard Moodle system information about users, such as name and email address, plus more standard extensions such as the gradebook. This would mean a “simple” change to moodle/user/messageselect.php.

A more complicated approach would be to make greater use of Moodle’s extensibility, i.e. allow each activity/block to define its own set of tags and have moodle/user/messageselect.php handle those. For example:

  • BIM could define its own set of tags (e.g. REGISTERED_FEED for the student’s registered blog feed).
  • When a user clicks on email merge from BIM, it would call messageselect and pass the list of users selected from BIM (e.g. all students with unregistered blogs).
  • messageselect would know which extension called it and check to see if that extension defines its own tags.
  • messageselect would then use those tags (and how to get the information for each user) to modify the edit screen, the preview screen and the sending of the message.
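One hypothetical way to wire this up would be to follow Moodle’s existing modname_function naming convention and have messageselect look for an optional per-module callback. Neither function below exists in Moodle 1.9 – this is a sketch of the idea, with assumed field and tag names:

```php
<?php
// Hypothetical convention: an activity module may define
// MODNAME_email_merge_tags(), returning tag => value pairs for a user.
function bim_email_merge_tags($user) {
    return array(
        'REGISTERED_FEED' => $user->feedurl,    // assumed field, for illustration
        'FIRSTNAME'       => $user->firstname,
    );
}

// In messageselect: use the calling module's tags if it defines the
// callback, otherwise fall back to core user fields only.
function get_merge_tags($modname, $user) {
    $callback = $modname . '_email_merge_tags';
    if (function_exists($callback)) {
        return $callback($user);
    }
    return array('FIRSTNAME' => $user->firstname);
}
```

This keeps messageselect ignorant of any particular module: it only needs to know which module called it and apply the convention.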

Fixing BIM's backup and restore

The following outlines steps to continue work on BIM’s backup and restore functionality. As per this issue, the user part of the backup has errors.

It appears that the code was actually working.

Re-create the problem

It’s been a few months since I worked on this. Have to re-create the problem first.

Looking through the bim/backuplib.php code, the first evidence is that the user code is commented out in the function bim_backup_one_mod. Let’s uncomment that and try to back up a BIM.

Okay, that seems to have completed. No errors reported. Is the Moodle debugging option set to the highest? Yep. So the problem isn’t a syntax error, it’s an error in operation/implementation.

Let’s look at the resulting backup and see where it is going wrong.

[sourcecode lang=”bash”]
david-joness-macbook-pro:tmp david$ unzip *
Archive: backup.zip
creating: course_files/
creating: course_files/1/
inflating: course_files/1/david.2.xml
inflating: course_files/1/david.xml
creating: group_files/
creating: group_files/60/
creating: group_files/61/
inflating: moodle.xml
creating: site_files/
[/sourcecode]

Should there be BIM specific files here? No, looks like the data is within the files. Okay, there’s no user data being saved. Why?

Ahh, there’s more commented code to uncomment. Mm, still getting the message “without user data”. Missing something.

Ahh, apparently not all users can back up user data. I was logged in as my own account, no permission. Login as root, and there’s the user stuff.

Okay, BIM backup appears to work – no errors. Let’s look at the files.

Yep, that seems to be working so far. All three tables are being saved in the XML file and in apparently correct format.

Doing the restore

So, is the problem with the restore? The restore looks to have worked. The big question is how to test this. First, let’s look at the restored BIM. There are errors, but also no students. It appears that the problem is that the students/users from the backed-up course aren’t in the restored, separate course.

What if we do the restore within the same course?

Well, the restore process didn’t create any errors, but the newly restored BIM (used in addition to the existing BIM activity) has the same sorts of errors as when restored into a new course: there’s a problem with manage marking.

Let’s look at the database and see what’s been restored, check each BIM table.

  • bim – information about all BIM activities.
    As expected. 3 BIMs on my test box. The values are all as I’d expect.

    The original activity has id 1, id 2 is the restore into a new course (course id 15), and id 3 is the activity restored into the same course (course id 4).

  • bim_group_allocation – which groups are allocated to which staff.
    Again, there are 3 BIMs listed. The same number of entries for each BIM. The userids of the markers are the same regardless of the course. The group ids are different between courses. As expected (I think).
  • bim_questions – list of questions for the activity.
    Ok, as expected as it’s the user stuff I’m checking and this isn’t user stuff.
  • bim_student_feeds – where are the students registered feeds?
    All correct. Each of the 3 bims have exactly the same data. The userids are the same regardless of the course. This indicates that the restore is making the right decisions about the students.
  • bim_marking – marking and other information about each student post (3 feeds registered, 10 posts per feed, 3 bims)
    As expected there are 90 rows in this table. After a quick check, it appears that this is all good.

This is somewhat strange. Looks like it’s all working, so why the problems?

Is it because of the user I’m logged in as (admin user), what about a standard teacher? Nope, same errors. Time to look at the code.

The errors being reported are in a Moodle library – tablelib.php. Hence the data being passed in must be corrupt/wrong in some way. If I compare the original BIM activity with the restored one (in the same course) there are differences in the manage marking output. The restored one is missing one of the questions. However, under manage questions all the questions are listed.

The code generating the header of the table generates the same data. Okay, the problem is within the code that generates the contents of the table.

Bugger, my problem here. The function “bim_get_question_id” compares the title of a question to get the unique id of the question. If questions have the same title, which they can and do in this example, then there’s a problem. Need to fix that.

Fixing get_question_id

This function is only used in one place, so it looks like the solution is to remove the need for it.

Let’s try simply adding the id of the question as the index for the array.

Yep, that works.
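For the record, the change amounts to keying the questions array by the (unique) database id rather than the (possibly duplicated) title. A simplified illustration of why the title-keyed version broke – this is example data, not the actual BIM code:

```php
<?php
// Duplicate titles collapse into one array entry; database ids do not.
$questions = array(
    array('id' => 1, 'title' => 'Reflection'),
    array('id' => 2, 'title' => 'Reflection'),  // same title, different question
);

$by_title = array();
$by_id    = array();
foreach ($questions as $q) {
    $by_title[$q['title']] = $q;  // second entry overwrites the first
    $by_id[$q['id']]       = $q;  // ids are unique, nothing is lost
}
echo count($by_title);  // 1
echo count($by_id);     // 2
```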

Does it work now?

So, does that mean the back up and restore process is working? Checking through the restored BIM in the same course, it appears it does. Some errors are there when restored to a new course.

This appears to be because there are no students assigned to the course and the error checking in BIM ain’t great. Fix that and the error messages disappear, but there still appear to be users within BIM – which is probably what should happen, because there are no students in the course but there are in BIM.

Okay, the problems appear to be not with backup/restore, but with courses not having students enrolled and the poor error checking in BIM. If I fix up the error checking, we should be in action.

Required fixes

  • bim_create_posts_display – another area where the same question title is causing problems.

That’s it. It seems to be working.

Just need to remove the debug statements in the restore process and commit it.

Adding multiple visualisation approaches to Indicators block

This post is a summary of work being done to update the Moodle indicators block so that it can support multiple visualisation tools and approaches.

Problem

The indicators block is intended (at least by me) to be a way in which various visual insights (indicators) about what is going on within Moodle can be shown to students and staff. Col’s initial indicators within the block were generated using the Google chart tools. This worked really well and I think we’ve only scratched the surface of those tools. However, there appears to be a need to support multiple visualisation approaches; reasons might include:

  • the visualisation tool doesn’t provide necessary functionality; and
  • the need for multiple visualisations of the same data.

A simple example of this comes from the only “data” the indicators block currently visualises – the level of activity in a course site by staff or students. Currently this is shown as a “dial” or speedo (see below). The dial ranges from red through to green and a black arrow indicates the level of activity by the participant.

Next step in indicators block

Alan commented that he didn’t like the dial/meter visualisation in that it seems to encourage a simplistic “more is better” perception. Alan would prefer some sort of traffic light visualisation. After a very quick look, I don’t think the Google chart tools provide a traffic light visualisation. Regardless, you get the idea.

Rather than force someone to use only one visualisation, it would seem better if the Moodle indicators block allowed people to choose (and implement) the ones they preferred. i.e. support for multiple visualisations.

What’s been done

The aim here is to complete and describe three tasks that enable multiple visualisations. These tasks are:

  1. Move the Indicators to a Model/View pattern.
    The intent is to separate the calculation of the data from the visualisation. i.e. to allow multiple different visualisations.
  2. Add support for an alternate visualisation tool.
    In this case, the Protovis library.
  3. Implement a couple of different visualisations of existing data.
    Essentially to test and illustrate the use of the Model/View patterns.

Most of these have been done, but only in initial stages for staff.

Model/View

For each indicator there are two main tasks it must perform:

  1. Generate/retrieve the data to be visualised.
  2. Generate the visualisation.

The aim here is to separate out those two tasks into two classes: a model and a view. This means that the existing indicator code, which looks like this.

[sourcecode language="php"]
$indicator = IndicatorFactory::create($context);
$this->content->text = $indicator->generateText();
[/sourcecode]

Will get modified to something like this.

[sourcecode language="php"]
$model = IndicatorFactory::createModel( $context );
$view = IndicatorFactory::createView( $model, $context );
$this->content->text = $view->generateVisualisation();
[/sourcecode]

The factory class is now responsible for generating both the model and the view. The above is likely to change over time. For example, rather than passing just $context, there might be other information, e.g. user preferences etc.
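To make the separation concrete, here is a minimal sketch of what the two classes and the factory might look like. The class and method names (activityModel, getData etc.) are illustrative assumptions for this post, not the actual indicators block code.

```php
// Hypothetical sketch only: class/method names are illustrative,
// not the committed indicators block code.

// The model's only job is to generate/retrieve the data.
class activityModel {
    public function getData() {
        // The real model would query the Moodle logs; this
        // just returns a dummy activity percentage.
        return array( 'activity' => 65 );
    }
}

// The view's only job is to turn that data into a visualisation.
class activityView {
    private $model;

    public function __construct( $model ) {
        $this->model = $model;
    }

    public function generateVisualisation() {
        $data = $this->model->getData();
        return '<div class="indicator">Activity: ' .
               $data['activity'] . '%</div>';
    }
}

// The factory pairs a model with a view, mirroring the
// createModel/createView calls above.
class IndicatorFactory {
    public static function createModel( $context ) {
        return new activityModel();
    }
    public static function createView( $model, $context ) {
        return new activityView( $model );
    }
}
```

With this split, swapping in a different visualisation only requires changing what createView returns; the model is untouched.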

Let’s see if I can get this to work with some testing.

Adding a protovis visualisation

The aim here is to create a second visualisation of the existing indicator using Protovis. Use of the Protovis view will initially be hard coded for some users, eventually to be replaced with some preference or rotation approach.

Running out of time at the moment, so I’m going to put in a dummy Protovis view that simply shows a bar graph. It doesn’t use the data from the model at all.

So, here’s what the staff indicator looks like with the Google chart view.

Staff activity indicator

The bit of the factory that generates this view looks like this

[sourcecode lang="php"]
if ( has_capability( 'moodle/legacy:teacher', $context ) ||
     has_capability( 'moodle/legacy:editingteacher', $context ) ) {
    require_once(
        $CFG->dirroot.'/blocks/indicators/staff/activity/google_view.php' );
    return new activityView( $model );
}
[/sourcecode]

Eventually, rather than a straight require of the google_view file, this would be replaced by some algorithm that figures out which view the user wants. But, for now, I’ve introduced the following, which randomly selects which view to use.

[sourcecode lang="php"]
$view = "/blocks/indicators/staff/activity/google_view.php";
if ( rand( 0, 1 ) == 1 ) {
    $view = "/blocks/indicators/staff/activity/protovis_view.php";
}
require_once( $CFG->dirroot.$view );
return new activityView( $model );
[/sourcecode]

The dummy protovis view looks like this

Proof of concept - protovis in Moodle indicators block
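For the record, here is a sketch of how such a dummy view might be put together: a PHP class that emits a hard-coded Protovis bar chart. The class name, element id and the specific Protovis calls are all assumptions for illustration, not the committed protovis_view.php code.

```php
// Hypothetical sketch: the class name, element id and protovis
// calls are illustrative, not the committed protovis_view.php.
class protovisActivityView {
    public function generateVisualisation() {
        // Hard-coded bar data -- ignores the model entirely,
        // as described above.
        return '
<div id="indicator_pv"></div>
<script type="text/javascript+protovis">
  new pv.Panel().canvas("indicator_pv").width(150).height(140)
    .add(pv.Bar)
      .data([1, 1.2, 1.7, 1.5, 0.7])
      .width(20)
      .height(function(d) d * 80)
      .bottom(0)
      .left(function() this.index * 25)
    .root.render();
</script>';
    }
}
```

The PHP side only returns markup; the browser-side Protovis library does the actual drawing when the page renders.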

To do

Need to update the student view to use this model, and need to start generating some different models and views.

Also need to think about how the models can be used to do some “caching” of database content.

Understanding what teachers do: First step in improving L&T

The following is an attempt to explain the initial description and rationale of an exploratory research project (perhaps ethnographic, narrative inquiry or some similar qualitative methodology) aimed at understanding what teachers/academics actually experience within a particular environment during a single term. The assumption is that by better understanding the lived experience of the teaching staff you can better understand why (or why not) teaching is likely to improve.

In terms of suggestions and advice, I’m really keen to hear from people who might have some insights to share around:

  • methodology;
    What good methods are there to gain the type of insight I’m interested in without being too onerous for the academics involved?
  • related literature;
    Where is the literature that talks about this sort of approach within university teaching, or perhaps more broadly in education?

Why? – The personal aspect

I’m interested in this because I am coming to the opinion that it is the quality, quantity and diversity of the connections within the network of people, policies, technologies and other system objects that enables or constrains the ability of a university to improve its teaching and learning. In particular, the connections which surround the teaching staff and students define what they experience, and that experience impacts what they are likely to do (or not). My bias is that I think the network/environment surrounding most staff/students is actively preventing improvement in learning and teaching.

In my current job I am expected to help improve the quality of teaching and learning. Much of what I do (e.g. Moodle curriculum mapping, the broader alignment project, and the indicators Moodle block) is aimed at modifying the environment/network around teaching staff to enable and encourage them to improve their teaching. But this is only half the equation.

Aside: My focus is on teaching staff. It is not on students. While I agree 100% that student-centered approaches to learning are the most effective (there’s a lot of “buzz word” baggage around that phrase, so I hesitate to use it), I don’t teach students. My job is to help other academic staff improve what they are doing, to create an environment in which they not only think that student-centered learning is a good thing (which many of them already do) but in which the environment actually helps them implement such approaches, rather than actively hindering them. I’ve seen too many attempts to encourage student-centered approaches that ignore the teaching staff and consequently get hamstrung by reactance.

The other half of the equation is getting a good understanding of the environment/network as experienced by the teaching staff. Up until now I’ve been relying on my recent experience of the same environment (which is now 3+ years old) and ad hoc discussions with colleagues (which is limited by all sorts of bias). This understanding is necessary because of the need to:

  • Be more aware of what some of the potential problems or needs are that need addressing.
  • Design the interventions to address those problems.
  • Understand the impact and ramifications of those interventions.
  • Provide evidence to others of the problems within the environment and the value of the interventions.

Why? – the research perspective

So that’s the personal perspective, what about the research perspective?

First off, one of the “buzzwords” within education fields at the moment is distributive leadership. Here’s something I wrote describing distributive leadership in the alignment project blurb

Parrish et al (2008) define distributive leadership as the distribution of power through a collegial sharing of knowledge, of practice, and reflection within a socio-cultural context. Zepke (2007) argues that this is more than the delegation of tasks and responsibilities, and more than collaborative practice. Spillane et al (2004, p. 9) argue that, based on its foundations in distributed cognition and activity theory, distributive leadership is not limited to people, but can also be attributed to artefacts such as language, notational systems, tools and buildings. Leadership activity is distributed through an interactive web of actors, artefacts and situation (Spillane et al., 2004, p. 20).

Spillane et al (2004) go on to define leadership as

the identification, acquisition, allocation, co-ordination, and use of the social, material, and cultural resources necessary to establish the conditions for the possibility of teaching and learning.

i.e. the identification of the social, material and cultural resources within an organisation is an important part of creating the conditions for teaching and learning.

Support for the importance of the environment in terms of its impact on learning outcomes comes from 30 years of empirical research by Prosser et al (2003) and Ramsden et al (2007) which has produced abundant empirical inquiry and theory that links the quality of student learning outcomes with: (1) the approaches to learning taken by students; (2) the students’ perceptions of the learning context; and (3) the approaches to teaching practiced by teaching staff. In turn, this research confirms the findings of other leadership studies by illustrating that variation in teaching approaches is associated with perceptions of the academic environment (Ramsden et al., 2007).

In terms of models of what we know about teaching and learning, the importance of the environment and context is illustrated by Trigwell’s (2001) model of teaching (click on the images to see larger versions)

Trigwell's model of teaching

And by Richardson’s (2005) model of teachers’ approaches to teaching

Integrated model of teachers' approaches to teaching

While pedagogues are likely to adopt teaching approaches that are consistent with their conceptions of teaching, there may be differences between espoused theories and theories in use (Leveson 2004). While pedagogues may hold higher-level views of teaching, other contextual factors may prevent use of those conceptions (Leveson 2004). Environmental, institutional, or other issues may impel pedagogues to teach in a way that is against their preferred approach (Samuelowicz and Bain 2001). While conceptions of teaching influence approaches to teaching, other factors such as institutional influence and the nature of students, curriculum and discipline may also influence teaching approaches (Kember and Kwan 2000). Prosser and Trigwell (1997) found that pedagogues with a student-focused approach were more likely to report that their departments valued teaching, that their class sizes were not too large, and that they had control over what was taught and how it was taught. Other contextual factors that frustrate pedagogues’ intended approaches to teaching may include senior staff with traditional teacher-focused conceptions raising issues about standards and curriculum coverage, and students who induce teachers to adopt a more didactic approach (Richardson 2005). In addition, teachers who experience different contexts may adopt different approaches to teaching in those different contexts (Lindblom-Ylanne, Trigwell et al. 2006).

i.e. the perceptions that teaching staff hold of the environment in which they teach have a direct effect on how they teach. If you want to improve the quality of teaching within a university you have to understand how the academics perceive/experience the environment.

The literature I’ve seen to date by Prosser and his colleagues has been mostly survey based. I’m interested in a more detailed insight into the actual lived experience, rather than ad hoc recollections filtered through survey questions.

What

To my way of thinking this has to be an exploratory, qualitative and ethnographic investigation. I’m looking to gain insight into the day-to-day lived experience of academics and how they react to that experience, what it does to them. I need to read up some more. What follows are some initial thoughts.

Murthy (2008) describes how good ethnography “effectively communicates a social story, drawing the audience into the daily lives of the respondents”. This is what I’m trying to get to, I want the stories of the daily lives of the academics around learning and teaching. Murthy (2008) goes on to give an overview of digital ethnography, but nothing immediately helpful…but it seems connected to what I was thinking of doing. Hookway (2008) also looks promising but the site is down for scheduled maintenance.

How

So, without much reading, I’ve been thinking about starting this with a small exploratory study along the following lines:

  • Approach half a dozen academics from my current institution.
    Selected to be somewhat diverse in terms of likely experience, location, subject etc.
  • Invite them to be co-researchers.
    I’d rather they were collaborators than research subjects. I want them to have greater ownership and motivation to be involved. I want the benefit of their insight not just into the everyday experience of teaching, but also into the research.
  • For a single teaching term, ask them to contribute stories about their experience with teaching to a blog.
    Whenever they do something around teaching, something different, something frustrating etc., write a story on the blog. As short or as long as they like. It might be a personal blog, or it might be a group blog. It might well have to be a private blog.
  • At the end of term, employ various methods to analyse the data.
  • Present it locally and publish it.

References

Biggs, J. (1996). “Enhancing teaching through constructive alignment.” Higher Education 32(3): 347-364.

Kember, D. and K.-P. Kwan (2000). “Lecturers’ approaches to teaching and their relationship to conceptions of good teaching.” Instructional Science 28(5): 469-490.

Leveson, L. (2004). “Encouraging better learning through better teaching: a study of approaches to teaching in accounting.” Accounting Education 13(4): 529-549.

Lindblom-Ylanne, S., K. Trigwell, et al. (2006). “How approaches to teaching are affected by discipline and teaching context.” Studies in Higher Education 31(3): 285-298.

Hookway, N. (2008). “‘Entering the blogosphere’: some strategies for using blogs in social research.” Qualitative Research 8(1): 91-113.

Murthy, D. (2008). “Digital Ethnography.” Sociology 32(5): 837-855.

Parrish, D., G. Lefoe, et al. (2008). The GREEN Resource: The development of leadership capacity in higher education. Wollongong, CEDIR, University of Wollongong: 64.

Prosser, M. and K. Trigwell (1997). “Relations between perceptions of the teaching environment and approaches to teaching.” British Journal of Educational Psychology 67(1): 25-35.

Prosser, M., P. Ramsden, et al. (2003). “Dissonance in experience of teaching and its relation to the quality of student learning.” Studies in Higher Education 28(1): 37-48.

Ramsden, P., M. Prosser, et al. (2007). “University teachers’ experiences of academic leadership and their approaches to teaching.” Learning and Instruction 17(2): 140-155.

Samuelowicz, K. and J. Bain (2001). “Revisiting academics’ beliefs about teaching and learning.” Higher Education 41(3): 299-325.

Spillane, J., R. Halverson, et al. (2004). “Towards a theory of leadership practice: a distributed perspective.” Journal of Curriculum Studies 36(1): 3-34.

Trigwell, K. (2001). “Judging university teaching.” The International Journal for Academic Development 6(1): 65-73.

Zepke, N. (2007). “Leadership, power and activity systems in a higher education context: will distributive leadership serve in an accountability driven world?” International Journal of Leadership in Education 10(3): 301-314.

Draft chapter 4 of the thesis is up

A couple of days ago I wrote the last few sentences for a fairly serious first draft of chapter 4 of the thesis. This chapter has to be re-read by me, read by my esteemed supervisor and then by a copy editor, so it’s not finished yet. But it’s a step closer.

This chapter tells the story of and rationale behind the development and use of Webfuse from 1997 through 1999. It attempts to formalise the thinking behind Webfuse into the first version of an information systems design theory for e-learning within universities. Since the thinking behind Webfuse was very naive, the resulting design theory is somewhat naive. From my perspective much of what passes for thinking around e-learning within universities today, is just as, if not more, naive.

The next step is to move onto chapter 5 which tells the story/rationale of the final period of Webfuse: 2000 through 2004 and beyond.

How curriculum mapping in Moodle might work

The purpose of this post is to provide a concrete description of how curriculum mapping of a Moodle course might work. The hope is that this will enable a broader array of people to comment on the approach and, in particular, identify flaws or problems. So, please comment.

This is being done as part of the alignment project and picks up from some earlier examination of Moodle’s existing outcomes feature.

Overview

The aim is to modify Moodle (as little as possible) to enable teaching staff to perform two tasks:

  1. Map how well the activities, resources and assessment within their Moodle course align with a set of outcomes.
    Related to this task is the ability to maintain this mapping as the course is modified.
  2. Use the alignment information about their course (and other courses) to enhance their course.

Each of those two tasks is expanded below.

Implementation

The implementation suggested below is based on ideas from Moodle’s existing support for Outcomes. Some of the following screen shots are using that existing support, some are slightly modified. Moodle’s existing support for outcomes (or competencies) is in terms of tracking how students are going in achieving specific outcomes or competencies. Rather than individual students, this project is mapping the activities, resources and assessments against outcomes. But the principle is basically the same.

Mapping

This task has the following steps (which are explained below):

  1. Specifying the outcomes.
  2. Mapping against outcomes.
  3. Maintaining the mapping.

There is also the problem of whether or not a Moodle course site can be used to map everything about a course.

Specifying the outcomes

The first step is to specify which outcomes courses will be mapped against. Moodle supports two “types” of outcomes:

  • “standard” outcomes; and
    These would be created at the institution level and able to be used across all Moodle course sites for that installation.
  • course outcomes.
    These are added to a specific course and can only be used within that course.

Outcomes are placed into Moodle by direct entry via the Moodle interface or uploading a CSV file. Important or interesting values for an outcome include:

  • Both a full and short name.
  • A description of the outcome.
  • The scale to be used for measuring the outcome.

Scales are used by Moodle to evaluate or rate performance. By default this is a numeric value; however, Moodle supports the creation of custom scales. For example, the scales Moodle page talks about the cool scale that consists of the values: Not cool, Not very cool, Fairly cool, Cool, Very cool, The coolest thing ever!

My current institution is rolling out its graduate attributes. There are eight graduate attributes, each of which could be loaded as a standard outcome in Moodle. The institution is currently using three levels – introductory, intermediate and graduate – and has created descriptions of these levels for each attribute. These could form the basis for a scale for each attribute/outcome.

The following is an example CSV file that can be uploaded into Moodle to achieve this.

[sourcecode lang="text"]
outcome_name;outcome_shortname;outcome_description;scale_name;scale_items;scale_description
Communication;comm;"Described here http://dmai.cqu.edu.au/FCWViewer/view.do?page=7949";"CQU Graduate Attributes (Communication)";"Introductory – Use appropriate language to describe/explain discipline-specific fundamentals/knowledge/ideas (C2), Intermediate – Select and apply an appropriate level/style/means of communication (C3), Graduate – Formulate and communicate views to develop an academic argument in a specific discipline (A4)";
Problem solving;ps;"Described here http://dmai.cqu.edu.au/FCWViewer/view.do?page=7949";CQU Graduate Attributes (Problem solving);"Introductory – Manage time and prioritise activities within the University’s framework for learning (C3), Intermediate – Make decisions to develop solutions to given situations/questions (C5), Graduate – Formulate strategies to identify, define and solve problems including, as necessary, global perspectives (P5)";
[/sourcecode]

Mapping against outcomes

Let’s start with an example Moodle course site with “editing turned on”. With “editing turned on” you get a collection of additional icons next to just about every element of the site. See the following image (click on it to see a larger version).

Moodle course page - editing on

Can you see the icon that looks like a hand holding a pen? This is the “edit” icon. If you click on this icon you get taken to the edit page for that item of the Moodle course site. An edit page for a Moodle item contains a number of components specific to the item, and a number of components common to all items. The following image is a portion of the edit page for a Moodle discussion forum with some additional labels added to show the specific and common components.

Moodle edit page - outcomes

Did you spot the “Outcomes” component of the above edit page? It shows a list of “outcomes” which match the graduate attributes of my current institution. Against each “outcome” there is a check box. To “map” this discussion forum against a graduate attribute, you simply check the appropriate box. It would not be a great stretch to think that “Communication” and “Team work” might be appropriate.

Important: This is all in Moodle now. No additions needed.

The “on” or “off” nature of the check box is very limited. This is due to the purpose Moodle’s current outcome support is meant to fulfill. For curriculum mapping you would want something more like the following.

Example curriculum mapping outcomes

The above has two main changes:

  1. Addition of the question mark icon.
    In Moodle practice clicking on the question mark gives you help. In terms of outcomes for curriculum mapping I would expect that at the least this would explain the outcome (in this case a graduate attribute) and the scale being used. It might include examples and might include a link to talk to a real person.
  2. Replace the checkbox with the scale.
    In this case it’s showing a drop box next to each outcome/attribute. These drop boxes, as shown by the box next to “Communication”, contain the three-level scale being used by my current institution.

There is a lot more you could do with this particular interface, but the basic point is that when a teacher is editing or creating a new item for a Moodle course site, they can map that item against the course outcomes at the same time.

Maintaining the mapping

Following on from the last point, the fundamental idea of this project is that a mapping of the alignment within a course site is maintained all of the time. It’s not something done every now and then because an accrediting body is visiting. The idea is that once a course site is mapped, maintaining the mapping fits into normal academic practice. For example, common practice at my institution is that each offering of a course does not start with a brand new, empty Moodle course site. Instead, the previous course offering is copied over for the next term and then edited.

With the suggested changes, the copying of the course site would also copy the mapping. So rather than mapping the entire course site all over again, the teacher only needs to map the new items added to the site or modify the mappings of any items they might change.

The new “mapping” features of Moodle should encourage/warn the teacher when the alignment is no longer correct. The following image is an example of what a teacher might see if they have changed the Moodle item, but not updated the outcomes/alignment mapping.

Out of date mapping
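One simple way the block might detect an out-of-date mapping is by comparing timestamps: if the item has been edited since its outcomes were last mapped, flag it. A minimal sketch, assuming hypothetical timestamp fields rather than the actual Moodle schema:

```php
// Hypothetical sketch: a mapping is considered stale if the course
// item has been edited after its outcomes were last mapped. The
// timestamp parameters are assumed fields, not actual Moodle schema.
function mapping_is_stale( $item_timemodified, $mapping_timemapped ) {
    return $item_timemodified > $mapping_timemapped;
}
```

The warning shown above could then be displayed for any item where this check returns true.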

Map everything?

There’s an assumption in the above that by mapping every item in a Moodle course site you are considering everything about the course. It’s a somewhat faulty assumption because most Moodle course sites are at best a supplement to what happens face-to-face or via other media. If this idea is to work, then thought would have to be given to how you design a Moodle course site that captures all aspects of a course.

This is by no means a simple task or one without potential problems. However, I do think that supporting people to collaborate about this question in the context of considering overall course alignment will allow interesting and useful approaches to develop. Approaches that could potentially improve the quality of Moodle course sites.

But this is something that would need to be tested.

Using the information

The previous section gave an overview of how the mapping of course alignment would be performed. This is only the first part of this project. The next, and potentially more interesting, step is what happens when people start using the availability of this information to inform quality enhancement of courses.

What people might do with this information is not something I think you can predict. The way the project was initially framed was to allow these potential uses to flow from action research cycles. However, some initial ideas have been proposed. The following describes those I think are the most generative, i.e. the ways of using this information that are most likely to generate further interesting applications of, or responses to, it.

The three uses I talk about below are:

  1. visualising the alignment;
  2. sharing the alignment; and
  3. contextualising L&T support.

Visualising the alignment

The simplest use would be for a teacher taking on a course to be able to see how aligned (or not) the course is. The following is the type of visualisation that might be used. It’s taken from Lowe and Marshall (2004) and a tool developed at Murdoch University. Each graduate attribute has 4 graphs representing objectives, learning activities, assessments and contents; the size of each graph represents how often/much the attribute is covered by those course elements.

GAMP visualisation of course alignment

In the above image it’s visible that the “Ethics” graduate attribute is quite heavily covered in course objectives, somewhat in course contents, a bit less in assessment, but is not covered at all by learning activities. One of the propositions underpinning the project is that an explicit representation of alignment problems is likely to encourage teaching staff to fix them (see the contextualise L&T support section for more on this). This type of visualisation could be especially helpful for new or casual teaching staff who take on a course for the first time.

A Moodle implementation could be modified to send reminders to teaching staff about apparent misalignment.
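Producing a visualisation like the one above implies aggregating the per-item mappings into counts per outcome and element type. A rough sketch; the shape of the mapping records here is an assumption for illustration:

```php
// Hypothetical sketch: the mapping records and their fields are
// illustrative, not an actual Moodle table structure. Counts how
// many times each outcome is covered by each type of course
// element (objective, activity, assessment, content).
function summarise_alignment( array $mappings ) {
    $summary = array();
    foreach ( $mappings as $m ) {
        if ( !isset( $summary[$m['outcome']][$m['type']] ) ) {
            $summary[$m['outcome']][$m['type']] = 0;
        }
        $summary[$m['outcome']][$m['type']]++;
    }
    return $summary;
}
```

An outcome with a zero (or missing) count for a given element type is exactly the sort of "hole" the visualisation would make visible.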

Share the alignment

Making the level of alignment within a course explicit to the staff teaching the course is only the first step. A common problem faced by degree programs is preventing duplication of content or content holes. If all courses within a program use this feature then it’s fairly simple to combine the alignment information for multiple courses into a form that can be shared. The following is another example from Lowe and Marshall (2004) and shows a visualisation for multiple courses.

GAMP program visualisation

This type of visualisation could be factored into quality assurance processes for a program at the start of a term. The program’s teaching group could adopt a collaborative process at the start of term to address any holes or duplications.

The sharing could also be more ad hoc. The visualisation of the course (the first image from Lowe and Marshall) could be extended to provide links to examples. i.e. when a visualisation like the above shows that the Ethics graduate attribute is not covered by any learning activities, there could be a link to other courses that do have activities covering the Ethics graduate attribute. Teaching staff could follow these links to view those activities as a way of getting ideas. Which courses show up via these links could be chosen in a number of ways.

The alignment could also be shared with students. Adding the ability to view the contents of a site structured using the outcomes would be quite easy. Lots more interesting applications could be developed.

Contextualise L&T support

Above it was suggested that the visualisations of alignment could, when problems are identified, provide links to courses that can be used as examples. The visualisations could also provide links to documents, presentations, discussions and people who could provide specific support. This could help curriculum designers and related L&T support folk contextualise their assistance in a very specific way. Such an approach moves towards achieving Boud’s (1999) argument that L&T support needs to be embedded within the context of academic work, that it needs to occur in or close to the teaching academics’ sites of practice.

References

Boud, D. (1999). Situating academic development in professional work: Using peer learning. International Journal for Academic Development, 4(1), 3-10.

Lowe, K., & Marshall, L. (2004). Plotting renewal: Pushing curriculum boundaries using a web based graduate attribute mapping tool. Paper presented at the 21st ASCILITE Conference, Perth.