Assembling the heterogeneous elements for (digital) learning

Month: May 2010

The need for a third way

One of the themes for this blog is that the majority of current approaches to improving learning and teaching within universities simply don’t work. At least not in terms of enabling improvement in the majority of the learning and teaching at an institution. I recently finished reading the book Nudge by Thaler and Sunstein. Chapter 18 is titled “The Real Third Way”. This post explores how that metaphor connects with some of the thinking expressed here.

The real third way

Thaler and Sunstein mention that the “20th century was pervaded by a great deal of artificial talk about the possibility of a ‘Third Way'” in politics. Their proposal is that libertarian paternalism, the topic of the book, represents a real third way. I’m not talking politics, but in learning and teaching there appears to be the same need to break out of a pointless dichotomy and move on to something more useful.

The characterisations of the two existing ways provided by Thaler and Sunstein are fairly traditional (stereotypical?) extremes of the political spectrum, i.e.:

  1. Liberal/Democrat – “enthusiasm for rigid national requirements and for command-and-control regulation. Having identified serious problems in the private market, Democrats have often insisted on firm mandates, typically eliminating or at least reducing freedom of choice.”
  2. Conservative/Republican – have argued against government intervention and on behalf of a laissez-faire approach, with freedom of choice being a defining principle. They argue that “in light of the sheer diversity of Americans one size cannot possibly fit all”.

Thaler and Sunstein’s third way – libertarian paternalism – is based on two claims:

  1. Choice architecture is pervasive and unavoidable.
    Small features of social situations have a significant impact on the decisions people make. The set of these features – the choice architecture – in any given social situation already exists and is already influencing people toward making good or bad decisions.
  2. Choice architecture can be manipulated while retaining freedom of choice.
    It is possible to make minor changes to the set of features in a social situation such that it encourages people to make “better” decisions, whilst still allowing them to make the “bad” decision, if that’s what they want.

Connections with improving learning and teaching

Early last year I borrowed and slightly modified Biggs’ 3 levels of teaching to identify 3 levels of improving learning and teaching. Obviously there is a numerical connection between these 3 levels and the “3 ways” outlined above. The more I’ve thought about it, the more I realise that the connections are more significant than that, and that the “3rd way” seems to be a useful way to position my beliefs about how to improve learning and teaching within a university. Here goes my first attempt at explicating it.

Expanding upon the 3 levels of improving L&T

The 3 levels I initially introduced can be expanded/morphed into ways or into stages. In terms of stages, I could probably argue that the levels/stages represent a historical evolution of how learning and teaching has been dealt with in universities. Those three stages are:

  1. What the teacher is (i.e. ignore L&T).
    This is the traditional/historical stage that some long-term academics look back on with fond memories, where university management didn’t really get involved with teaching and learning. Individual academics were left to teach the course the way they felt it should be taught. There was little oversight and little need for outside support.

    The quality of the teaching was solely down to the nature of the teacher. If they were a good teacher, good things happened. If bad… This was the era of selective higher education where, theoretically, only the best and the brightest went to university and most were seen to have the intellectual capability and drive to succeed regardless.

    For a surprising number of universities, especially those in the top rank of universities, this is still primarily how they operate. However, those of us working in “lesser” institutions are now seeing a different situation.

  2. What management does (i.e. blame the teacher).
    Due to the broadly publicised characteristics of globalisation, the knowledge economy, accountability etc. there is now significant pressure upon universities to demonstrate that the teaching at their institutions is of high quality. Actually, this has morphed into proxy measures where the quality of teaching is being measured by ad hoc student memories of their experience (CEQ surveys), how many of the academics have been forced to complete graduate certificates in higher education, what percentage of courses have course websites and how well the institution has filled out forms mapping graduate attributes.

    All of these changes to the practice of teaching and learning are projects that are initiated and “led” by senior university management. The success of the institution is judged by how well senior university management have done in completing those projects.

    As each new fad arises within government or the university sector, there is a new set of projects to be completed. Similarly, when a new set of senior management starts within an institution, there is a new set of projects to be completed. In this case, however, the projects aren’t typically all that new. Instead they are simply the opposite of what the last management did, i.e. if L&T support was centralised by the last lot of management, it must now be de-centralised.

    While most academics suffering through this stage would like to move back to the first stage, I think they and their institutions need to move on to the next one.

  3. What the teacher does.
    For me this is where the institution, its systems, processes etc. are continually being aligned to encourage and enable academics to improve what they are doing. The focus is on what the teacher does. This has strong connections with ideas of distributive leadership and the work of Fullan (2008) and Biggs (2001).

    For me, implementing this stage means taking an approach more informed by complex adaptive systems, distributive leadership, libertarian paternalism, emergent/ateleological design and much more. This stage recognises that in many universities stage 1 no longer works. Successful teaching now draws on so many people and skills that academics can’t do it by themselves (if they ever could). However, that doesn’t mean that the freedom of academics to apply their insights and knowledge should be removed.

So, now I’ve expanded on those, time to connect these three ways with some other triads.

Connections with politics

The following summarises what I see as the connections between the 3 stages of improving learning and teaching and the work of Thaler and Sunstein (2008).

  1. Conservative/republican == What the teacher is.
    i.e. the laissez-faire approach to teaching and learning. Academics are all too different, no one system or approach to teaching can work for us.
  2. Liberal/democrat == What management does.
    There are big problems with learning and teaching at universities that can only be solved by major projects led by management. Academics can’t be trusted to teach properly, so we need to put in place systems that mandate how they will teach and force them to comply.
  3. Libertarian paternalism == What the teacher does.
    The teaching environment (including the people, systems, processes, policies and everything else) within a university has all sorts of characteristics that influence academics to make good and bad decisions about how they teach. To improve teaching you need to make small, on-going changes to the characteristics of that environment so that the decisions academics are most likely to make will improve the quality of their teaching and learning. A particular focus should be on encouraging and enabling academics to reflect on their practice and take appropriate action.

Approaches to planning

This morning George Siemens pointed to this report (Baser and Morgan, 2008) and made particular mention of the following chart that compares assumptions between two different approaches to planning.

Comparison of assumptions in different approaches to planning (adapted from Baser and Morgan, 2008)

Aspect | Traditional planning | Complex adaptive systems
Source of direction | Often top down with inputs from partners | Depends on connections between the system agents
Objectives | Clear goals and structures | Emerging goals, plans and structures
Diversity | Values consensus | Expects tension and conflict
Role of variables | Few variables determine the outcome | Innumerable variables determine outcomes
Focus of attention | The whole is equal to the sum of the parts | The whole is different from the sum of the parts
Sense of the structure | Hierarchical | Interconnected web
Relationships | Important and directive | Determinant and empowering
Shadow system | Try to ignore and weaken | Accept; most mental models, legitimacy and motivation for action come out of this source
Measures of success | Efficiency and reliability are measures of value | Responsiveness to the environment is the measure of value
Paradox | Ignore or choose | Accept and work with paradox, counter-forces and tension
View on planning | Individual or system behaviour is knowable, predictable and controllable | Individual and system behaviour is unknowable, unpredictable and uncontrollable
Attitude to diversity and conflict | Drive for shared understanding and consensus | Diverse knowledge and particular viewpoints
Leadership | Strategy formulator and heroic leader | Facilitative and catalytic
Nature of direction | Control and direction from the top | Self-organisation emerging from the bottom
Control | Designed up front and then imposed from the centre | Gained through adaptation and self-organisation
History | Can be engineered in the present | Path dependent
External interventions | Direct | Indirect; helps create the conditions for emergence
Vision and planning | Detailed design and prediction; needs to be explicit, clear and measurable | A few simple explicit rules and some minimum specifications, leading to a strategy that is complex but implicit
Point of intervention | Design for large, integrated interventions | Where opportunities for change present themselves
Reaction to uncertainty | Try to control | Work with chaos
Effectiveness | Defines success as closing the gap with a preferred future | Defines success as fit with the environment

I was always going to like this table as it encapsulates, extends and improves my long-term thinking about how best to improve learning and teaching within universities. I long ago accepted (Jones, 2000; Jones et al, 2005) that universities are complex adaptive systems and that any attempt to treat them as ordered systems is doomed to failure.

I particularly liked the row on shadow systems as it corresponds with what some colleagues and I (Jones et al, 2004) suggested some time ago.

In terms of connections with the stages of improving learning and teaching,

  1. No planning == What the teacher is.
    i.e. there is no real organisational approach to planning how to improve learning and teaching. It’s all left up to the academic.

    Often “traditional planning” proponents will refer to the complex adaptive systems approach to planning as “no planning”. Or worse, they’ll raise the spectre of no control, no discipline or no governance over the complex adaptive systems planning approach. What they are referring to is actually the no planning stage. A CAS planning approach, done well, needs as much discipline and “governance” as a traditional planning approach done well, if not more.

  2. Traditional planning == What management does.
    University management (at least in Australia) is caught in the trap of trying to manage universities as if they were ordered systems. They are creating strategic plans and management plans, embarking on the analysis and then design of large-scale projects, and measuring success by the completion of those projects, not by what those projects actually do to the organisation or to the quality of learning and teaching.
  3. Complex adaptive systems == What the teacher does.
    The aim is to increase the quantity and quality of the connections between agents within the university. To harness the diversity inherent in a large group of academics to develop truly innovative and appropriate improvements. To be informed by everything in the complex adaptive systems column.

Orders of change

There also seems to be connections to yet another triad described by Bartunek and Moch (1987) when they take the concept of schemata from cognitive science and apply it to organisational development. Schemata are organising frameworks or frames that are used (without thinking) to make decisions. i.e. you don’t make decisions about events alone, how you interpret them is guided by the schemata you are using. Schemata (Bartunek and Moch, 1987):

  • Help identify entities and specify relationships amongst them.
  • Act as data reduction devices as situations/entities are represented as belonging to a specific type of situation.
  • Guide people to pay attention to some aspects of the situation and to ignore others.
  • Guide how people understand or draw implications from actions or situations.

In moving from the cognition of individuals to organisations, the idea is that different organisations (and sub-parts thereof) develop organisational schemata that are sustained through myths, stories and metaphors. These organisational schemata guide how the organisation understands and responds to situations in much the same way as individual schemata, e.g. they influence what is important and what is not.

Bartunek and Moch (1987) then suggest that planned organisational change is aimed at trying to change organisational schemata. They propose that successful organisational change achieves one or more of three different orders of schematic change (Bartunek and Moch, 1987, p486):

  1. First-order change – the tacit reinforcement of present understandings.
  2. Second-order change – the conscious modification of present schemata in a particular direction.
  3. Third-order change – the training of organisational members to be aware of their present schemata and thereby more able to change these schemata as they see fit.

Hopefully, by now, you can see where the connections with the three stages of improving teaching and learning are going, i.e.

  1. First-order change == What the teacher is.
    Generally speaking how teaching is understood by the academics doesn’t change. Their existing schemata are reinforced.
  2. Second-order change == What management does.
    Management choose a new direction and then lead a project that encourages/requires teaching academics to accept the new schema. When the next fad or the next set of management arrives, a new project is implemented and teaching academics once again have to accept a new schema. If you’re like me, then you question whether the academics are actually accepting this new schema or merely being seen to comply.

    The most obvious current example of this approach is the growing requirement for teaching academics to have formal teaching qualifications, i.e. the assumption that by completing a formal teaching qualification they will change their schemata around teaching. Again, I question (along with some significant literature) the effectiveness of this.

  3. Third-order change == What the teacher does.
    The aim here is to have an organisational environment that encourages and enables individual academics to reflect on their current schemata around teaching and to change them as they see problems.

    From this perspective, I see the major problem within universities as being not that academics don’t have appropriate schemata to improve teaching, but that the environment within which they operate doesn’t encourage or enable them to implement, reflect on or change their schemata.

Conclusions

I think there is a need for a 3rd way of improving learning and teaching within universities. It is not something that is easy to implement. The 2nd way of improving learning and teaching is so embedded in the assumptions of government and senior management that they are not even aware of (or at best not going to mention) the limitations of their current approach, or that there exists a 3rd way.

Look down the “Traditional planning” column in the table above and you can see numerous examples of entrenched, “common-sense” perspectives that have to be overcome if the 3rd way is to become possible. For example, in terms of diversity and conflict, most organisational approaches place emphasis on consensus. Everyone has to be happy and reading from the same hymn sheet: “why can’t everyone just get along?”. The requirement to have a hero leader and hierarchical organisational structures are other “common-sense” perspectives.

Perhaps the most difficult aspect of implementing a 3rd way is that there is no “template” or set process to follow. There is no existing university that has publicly stated it is following the 3rd way. Hence, there’s no-one to copy. An institution would have to be first. Something that would require courage and insight. Not to mention that any attempt to implement a 3rd way should (for me) adopt an approach to planning based on the complex adaptive systems assumptions from the above table.

References

Baser, H. and P. Morgan (2008). Capacity, Change and Performance Study Report, European Centre for Development Policy Management: 166.

Bartunek, J. and M. Moch (1987). “First-order, second-order and third-order change and organization development interventions: A cognitive approach.” The Journal of Applied Behavioral Science 23(4): 483-500.

Biggs, J. (2001). “The Reflective Institution: Assuring and Enhancing the Quality of Teaching and Learning.” Higher Education 41(3): 221-238.

Fullan, M. (2008). The six secrets of change. San Francisco, CA, John Wiley and Sons.

Jones, D. (2000). Emergent development and the virtual university. Learning’2000. Roanoke, Virginia.

Jones, D., J. Luck, et al. (2005). The teleological brake on ICTs in open and distance learning. Conference of the Open and Distance Learning Association of Australia’2005, Adelaide.

Thaler, R. and C. Sunstein (2008). Nudge: Improving decisions about health, wealth and happiness. New York, Penguin.

Adding email merge to BIM

The following details an attempt to use Moodle’s user/messageselect.php with BIM, as a step towards implementing an email merge facility for BIM.

BIM passing users

The intent here is that BIM will be used to select the users and will pass them to messageselect.php. The first test will be to replace the current “unregistered users” section on “Your Students”, which simply shows a list of email addresses that the staff member has to copy and paste into an email program. See the following screen shot (click on it to see it larger).

Unregistered users - BIM your students

The idea is to replace it with a simple link that, when clicked, will pass the details of the unregistered users to messageselect.php.

Parameters for messageselect

For this to work, I need to pass messageselect all the parameters it expects in the way it expects them.

First, the parameters it expects are:

  • The list of user ids for the recipients.
    This is done using checkboxes with parameter names of the form userID, where ID is the Moodle user id (e.g. user1234).
  • The course id.
    id set to the Moodle course id.
  • formaction
    Seems to simply be the name of the messageselect script.
  • returnto
    The path of the script it’s coming from.
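Putting those together, the form that messageselect.php expects might look something like the following sketch. The course id, user ids, site URL and return path are all made-up values for illustration only.

```html
<!-- Hypothetical example of the form messageselect.php expects:
     course id 1234, recipients with Moodle user ids 56 and 78 -->
<form method="post" action="http://example.com/moodle/user/messageselect.php">
  <input type="hidden" name="id" value="1234" />
  <input type="hidden" name="user56" value="on" />
  <input type="hidden" name="user78" value="on" />
  <input type="hidden" name="formaction" value="messageselect.php" />
  <input type="hidden" name="returnto" value="/mod/bim/view.php" />
  <input type="submit" name="submit" value="Email selected users" />
</form>
```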

Parameter passing for message select

In terms of how to pass the data, I’ve tried a normal query string, but that didn’t seem to create the necessary outcome.

It appears that messageselect.php uses the PHP $_POST variable, which is populated when a form is submitted with the post method. So let’s try that.

Yep, that seems to work. It may be as simple as that.

I have been able to get that working. However, the “returnto” doesn’t seem to work all the way down the various screens. It works on the first, but not on the last.

bim_email_merge

The following is the function I’ve added to bim to enable the use of messageselect.php

[sourcecode lang="php"]
function bim_email_merge( $ids, $course, $returnto, $button_msg ) {

    global $CFG;

    // Open a form that POSTs to Moodle's messageselect.php with the
    // parameters it expects: course id, returnto and formaction.
    print <<<EOF
<form method="post" action="$CFG->wwwroot/user/messageselect.php">
<input type="hidden" name="id" value="$course" />
<input type="hidden" name="returnto" value="$returnto" />
<input type="hidden" name="formaction" value="messageselect.php" />
<input type="submit" name="submit" value="$button_msg" />
EOF;

    // One hidden "checked checkbox" per recipient - messageselect.php
    // looks for parameters named userNNN, where NNN is the Moodle user id.
    foreach ( $ids as $id ) {
        print "<input type=\"hidden\" name=\"user{$id}\" value=\"on\" />";
    }
    print "</form>";
}
[/sourcecode]

This function displays a submit button with a given message. If pressed the form sends a list of Moodle user ids ($ids) to messageselect. At this stage the user can create the message, choose to remove some users and then send the message. I think.
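To make the behaviour concrete, the following stand-alone sketch stubs out the $CFG global (which Moodle normally provides) and calls the function with made-up user ids, course id and return path. In BIM itself the call would sit inside the “Your Students” page code, so everything below the stub is illustrative only.

```php
<?php
// Stand-alone sketch of calling bim_email_merge(). Outside Moodle the
// $CFG global doesn't exist, so it is stubbed here; all ids, URLs and
// paths are hypothetical values for illustration only.
$CFG = new stdClass();
$CFG->wwwroot = 'http://example.com/moodle';

function bim_email_merge( $ids, $course, $returnto, $button_msg ) {
    global $CFG;
    // Form that POSTs the expected parameters to messageselect.php
    print <<<EOF
<form method="post" action="$CFG->wwwroot/user/messageselect.php">
<input type="hidden" name="id" value="$course" />
<input type="hidden" name="returnto" value="$returnto" />
<input type="hidden" name="formaction" value="messageselect.php" />
<input type="submit" name="submit" value="$button_msg" />
EOF;
    // One userNNN parameter per intended recipient
    foreach ( $ids as $id ) {
        print "<input type=\"hidden\" name=\"user{$id}\" value=\"on\" />";
    }
    print "</form>";
}

// A button that would send users 101, 102 and 103 to messageselect.php
bim_email_merge( array( 101, 102, 103 ), 1234, '/mod/bim/view.php',
                 'Email unregistered students' );
```

The generated form simply replays the checkbox parameters that messageselect.php would have received from its own user-selection screen, which is why hidden inputs (rather than visible checkboxes) are enough.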

Implemented in BIM, it looks like the following.

BIM's new email merge

Institutional changes – 2000 and beyond – and their impact

This carries on “bits” from chapter 5 of the thesis. It’s a rough draft of a description of the institutional context within CQU from 2000 onwards. It’s brief and targeted mainly at the factors which impact on Webfuse development.

It needs more work and checking. If you have any suggestions, fire away.

Institutional changes

In mid-1996 CQU appointed a new Vice-Chancellor who was an advocate of a number of new initiatives (Gregor, Wassenaar et al. 2002), including the 1998 organisational restructure, the introduction of a four-term year and an increasing emphasis on overseas, full-fee paying students. While all of these changes were introduced prior to 2000, each had on-going ramifications that were being dealt with by CQU management and staff. These ramifications, in combination with a number of additional changes, were part of the reason why the next CQU Vice-Chancellor described CQU as a “work in progress” and “a unique university” (Hancock 2002). However, the institution did retain the vision “to be a unified university, acknowledged universally as a leader in flexible teaching and learning” (Hancock 2002).

As described in chapter 4, by 1998 Webfuse was being used to support the online learning and teaching activities of the Faculty of Informatics and Communication (Infocom). While the organisational restructure that led to the creation of Infocom happened in 1998 it was not until 1999 that the foundation Dean of Infocom commenced work at CQU. The Dean saw various forces for change, including ICTs, enabling and requiring the development of a “‘glocal’ networked education paradigm” in order to provide a scaleable and globally competitive flexible model of educational delivery (Marshall 2001). The emergent development of this model underpinned Infocom’s Singapore online project (Marshall 2001) and subsequently impacted upon broader faculty practices.

In parallel with these developments within Infocom, CQU was performing various reviews and planning processes aimed at developing “structures and systems that are responsive to the needs of learners and the changing nature of higher education in the 21st Century” (CQU 2001). The third stage of this process was the release of a Strategic Plan for Flexible Learning in 2001. Evidence of the importance of flexibility and on-going change is summarised by the following exhortation from that plan (CQU 2001):

The Strategic Plan for Flexible Learning is a ‘living document’. It is imperative that the Strategic Plan be regarded with the same flexibility as the very learning experiences it aims to promote and enhance. To regard the Strategic Plan as anything less will threaten CQU’s position as a market leader in a competitive environment

As described in Chapter 4, during the second half of the 1990s CQU, in partnership with a commercial company, created a number of information campuses based in major Australian cities. By the late 1990s, these campuses in combination with the dot-com boom and changes in Australian migration rules contributed to significant growth in the student population at CQU. In 1996 international students comprised only 7.3% of CQU’s student population (Marshall and Gregor 2002). By 2002 CQU was, in terms of international students, one of Australia’s fastest growing universities, with only 25% of CQU’s students being recent high school graduates (Marshall and Gregor 2002). By 2004, 40% of CQU’s student population were international students from 121 countries (Luck, Jones et al. 2004). From 1996 through 2004 CQU increased its total student numbers by almost 50%. Consequently, in 2002 the CQU Vice-Chancellor described CQU as having the “most geographically disparate, ethnically diverse and fastest growing student population of any Australian University” (Hancock 2002).

By 2002, Infocom was teaching about 30% of all CQU students including almost 56% of the students at the international campuses (Jones 2003). From 1999 through 2002 Infocom student numbers more than doubled (Condon, Shepherd et al. 2003). However, by 2003 the global downturn in IT started to impact Infocom enrolments. Table 5.2, adapted from Condon et al (2003), summarises the trend in Infocom student numbers from 1998 through 2003.

Table 5.2 – Increase in Infocom student numbers – 1998-2003 (adapted from Condon, Shepherd et al. 2003).
Year | Total | Percentage increase
1998 | 16646 | Infocom’s first year
1999 | 18504 | 11% on previous year
2000 | 25784 | 39% on previous year
2001 | 37664 | 48% on previous year
2002 | 42654 | 13.2% on previous year
2003 | 36105 | -15.3% on previous year

This growing complexity and the growing recognition of the importance of e-learning led CQU into a number of technological changes including the adoption of a number of enterprise systems. In order to cope with the increasing complexity CQU’s Vice Chancellor was strongly in favour of integrating the university’s administrative systems (Jones, Behrens et al. 2004). Consequently, in 1999 CQU’s senior management took the decision to implement the PeopleSoft suite of administrative systems (McConachie 2001). The implementation of PeopleSoft was seen as a business process re-engineering project which would require second-order structural and policy change at the University (McConachie 2001). The decision to adopt an ERP system like PeopleSoft was common within the Australian higher education sector at this time. By 2002 almost 90% of Australian universities had adopted at least one module of an ERP from a major vendor with approximately 55% of universities using PeopleSoft (Beekhuyzen, Nielsen et al. 2002).

In 2002, CQU’s Vice-Chancellor continued a long-running mantra of CQU senior management by writing that universities needed the ability to be responsive to a world that was changing fast and needed to provide education that was flexible in terms of delivery time, mode, location and content (Hancock 2002). The increasing requirement for flexibility and the accompanying increasing interest in online learning meant that by 1999 CQU’s existing processes for online/multimedia development – focused around the interactive multimedia unit (Macpherson and Smith 1998) described in chapter 4 – could no longer respond to demand (Sturgess and Nouwens 2004). After a survey and simple technical evaluation it was decided to adopt WebCT as a trial institutional learning management system (LMS) (Sturgess and Nouwens 2004). An academic interviewed by Gregor et al (2002) reports major problems with the WebCT trials due to inadequate infrastructure, a problem solved by the purchase of a large central Web server. Subsequently, WebCT became the official, institutional platform for e-learning. By the end of 2003 just over 10% of courses offered by CQU had a course website (Jones 2003). WebCT was replaced with Blackboard in 2004 (Danaher, Luck et al. 2005), which in turn was replaced by Moodle in 2010 (Tickle, Muldoon et al. 2009).

Impacts of these changes

As a result of the changes described above CQU had a diverse student population quite unlike that of a traditional university (Marshall and Gregor 2002). It was not unusual for course enrolments at the international campuses to be considerably greater than those on the Queensland campuses (Oliver and Van Dyke 2004). By 1999 it was already obvious that these changes had significantly increased the complexity of teaching, increased duplication of teaching methods and significantly decreased time and resources (Jones 1999). Speaking from experience teaching at CQU, Kehoe et al (2004) describe how the development of large undergraduate courses, challenging at any time, becomes even more complex when the students represent a combination of internal and distance education students, and domestic and international students. By 2001 CQU had 11 course offerings that had over 1000 enrolled students. Typically these courses would be supported by close to 20 academic staff, including a number of casual staff, all managed by a single CQU academic.

The 1999 CQU review of distance education and flexible learning recognised that the work necessary to continue providing existing services, while at the same time planning, implementing and progressing a broad array of on-going changes, was considerable (CQU 1999). The growing complexity of teaching and learning on this scale led to the development of additional policies, procedures, systems and support structures to guide the management of teaching and learning. This included the employment of additional staff. Over an 18 month period Infocom staff numbers rose from 80 to 150, with a doubling of general staff (25 to 53) and almost a doubling of academic staff (55 to almost 90) (Condon, Shepherd et al. 2003). At the same time, the increasingly complex demands created by these changes focused attention on the need for supporting information systems such as an ERP (Oliver and Van Dyke 2004). However, there were significant problems with some of the systems implemented to address these problems. Oliver and Van Dyke (2004) report that rather than decrease staffing costs, the implementation of a new ERP had increased staffing levels, suggesting that processes had become more complicated rather than simpler, and that cited benefits for staff had been “difficult to discern in practice”. Drawing on CQU experience, Jones et al (1999) identify two characteristics of the systems and processes set up to respond to these changes which limit flexibility. First, the cost of setting up these systems suggests a period of stable use in order to recoup costs. Second, how aspects of learning and teaching are split amongst existing organisational structures limits convergence and integration.

The Vice-Chancellor of CQU, writing in 2002, recognised that the institution’s rapid growth “has placed great strain on its staff and its physical and technological infrastructure” (Hancock 2002). In particular, the attempt to increase flexibility by offering year-round teaching had placed great strain on staff and required new approaches to workload and workforce planning (Hancock 2002). Numerous authors (McConachie 2001; Luck, Jones et al. 2004; Oliver and Van Dyke 2004) describe how CQU staff members increasingly describe themselves as change weary. McConachie (2001) describes how CQU staff perceive the many changes of previous years to have been poorly communicated and badly managed, leading to a climate where further change is unwelcome. Not surprisingly, it is not unusual for academic staff to resist attempts to alter their routines or their control over specific tasks (Hough, McNaught et al. 1998; Jones, Gregor et al. 2003).

As outlined in Section 5.2.1, the foundation Infocom Dean led the development of a “glocal networked learning paradigm” (Marshall 2001) that was first trialled with Infocom’s Singapore operations. As with other changes, this one required the provision of policies, processes, resources and systems for successful implementation. Early in 2000 the author, and chief Webfuse designer, was seconded away from teaching to help support the Singapore project. By August 2001 the decision was made to extend this assignment for as long as necessary (Marshall 2000) and to broaden its scope beyond Singapore, which led to the author taking on the position of Faculty Teaching and Learning Innovation Officer.

The growing importance of the web and Webfuse to Infocom’s operations is demonstrated by the changes in the Infocom web team. The web team maintained the faculty’s website, including its online learning operations, and was also responsible for the on-going development of Webfuse. From 1997 through 2000 the web team consisted of a webmaster, a part-time “developer” and ad hoc support from other faculty technical staff. The webmaster was responsible for the design and support of the entire faculty website, which included some tasks associated with Webfuse development. The part-time “developer” was the author who, while not employed to perform Webfuse development, continued developing Webfuse for research purposes. By 2001 the web team had expanded to a webmaster, three permanent developers and a contracted developer (2001-2003).

As shown in Table 5.2, by 2003 Infocom student numbers were beginning to drop, mostly attributable to the global downturn in IT. Previous external perceptions of Infocom as innovative with hard-working staff began to change to one where Infocom was seen as greedy and somewhat less than successful (Condon, Shepherd et al. 2003). By late 2003 the foundation Dean of Infocom was seconded to special projects, and he left the University in early 2004 (Jones, Behrens et al. 2004). Also in late 2003, in line with the drop in student numbers, there were indications that faculty budgets would be decreased and an increased push for centralisation of services. During 2004 CQU underwent another organisational review, which led to an organisational restructure during 2005. During this time Webfuse support moved first into one of the new faculties and then into CQU’s central IT division. By 2008 there was one Webfuse developer working for the central IT division.

References

Beekhuyzen, J., J. Nielsen, et al. (2002). ERPS in universities: The Australian explosion! Pacific Asia Conference on Information Systems. Tokyo, Japan.

Condon, A., J. Shepherd, et al. (2003). Managing the evolution of a new faculty in the 21st century. ATEM’2003. Adelaide, SA.

CQU (1999). Review of distance education and flexible learning "The Foresight Saga". Rockhampton, Central Queensland University: 43.

CQU (2001). Strategic plan for Flexible Learning. Rockhampton, Central Queensland University.

Danaher, P. A., J. Luck, et al. (2005). "The stories that documents tell: Changing technology options from Blackboard, Webfuse and the Content Management System at Central Queensland University." Studies in Learning, Evaluation, Innovation and Development 2(1): 34-43.

Gregor, S., A. Wassenaar, et al. (2002). "Developing a virtual organization: Serendipity or strategy?" Asian Academy of Management 7(1): 1-19.

Hancock, G. (2002). "Higher education at the crossroads: A review of Australian Higher Education – A response from Central Queensland University."   Retrieved 19 July, 2009, from http://www.backingaustraliasfuture.gov.au/submissions/crossroads/pdf/280.pdf.

Hough, G., C. McNaught, et al. (1998). Developing a Faculty Plan for Flexible Delivery for the Next Five Years – and how to get there. Proceedings of ASCILITE’98.

Jones, D. (1999). Solving some problems with university education: Part II. Ausweb’99, Balina, Australia.

Jones, D. (2003). How to live with ERP systems and thrive. 2003 Tertiary Education Management Conference, Adelaide.

Jones, D., S. Behrens, et al. (2004). The rise and fall of a shadow system: Lessons for enterprise system implementation. Managing New Wave Information Systems: Enterprise, Government and Society, Proceedings of the 15th Australasian Conference on Information Systems, Hobart, Tasmania.

Jones, D., S. Gregor, et al. (2003). An information systems design theory for web-based education. IASTED International Symposium on Web-based Education, Rhodes, Greece, IASTED.

Jones, D., S. Stewart, et al. (1999). Patterns: Using Proven Experience to Develop Online Learning. Proceedings of ASCILITE’99, Brisbane, QUT.

Kehoe, J., B. Tennent, et al. (2004). "The challenge of flexible and non-traditional learning and teaching methods: Best practice in every situation?" Studies in Learning, Evaluation, Innovation and Development 1(1): 56-63.

Luck, J., D. Jones, et al. (2004). "Challenging Enterprises and Subcultures: Interrogating ‘Best Practice’ in Central Queensland University’s Course Management Systems." Best practice in university learning and teaching: Learning from our Challenges.  Theme issue of Studies in Learning, Evaluation, Innovation and Development 1(2): 19-31.

Macpherson, C. and A. Smith (1998). "Academic authors’ perceptions of the instructional design and development process for distance education: A case study." Distance Education 19(1): 124-141.

Marshall, S. (2000). Edmu, secondment, Hartford etc. D. Jones. Rockhampton.

Marshall, S. (2001). Faculty level strategies in response to globalisation. 12th Annual International Conference of the Australian Association for Institutional Research. Rockhampton, QLD, Australia.

Marshall, S. and S. Gregor (2002). Distance education in the online world: Implications for higher education. The design and management of effective distance learning programs. R. Discenza, C. Howard and K. Schenk. Hershey, PA, USA, IGI Publishing: 21-36.

McConachie, J. (2001). "Who benefits from exploratory business research? The effect of sub-cultures on the implementation of an enterprise system: An Australian regional university perspective." Queensland Journal of Educational Research 17(2): 193-208.

Oliver, D. and M. Van Dyke (2004). Looking back, looking in and looking on: Treading over the ERP battleground. Qualitative case studies on implementation of enterprise wide systems. L. von Hellens, S. Nielsen and J. Beekhuyzen. Hershey, PA, Idea Group: 123-138.

Sturgess, P. and F. Nouwens (2004). "Evaluation of online learning management systems." Turkish Online Journal of Distance Education 5(3).

Tickle, K., N. Muldoon, et al. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. ascilite 2009. Auckland, NZ: 1038-1047.

Focusing on integration – chapter 5

Back working on the thesis. The following is a rough draft of the introduction and part of the first section from Chapter 5 of the thesis. This chapter is starting to tell the story of Webfuse from 2000 to 2004 and beyond.

As I’m reading and writing, I’m remembering all sorts of details, which has led to a change in the title of the chapter. “Focusing on better integration” isn’t about technical integration, it’s about integrating “e-learning” into the everyday practice of academics. This is where I think Webfuse was a success.

Introduction

The previous chapter, chapter 4, described the first iteration (1996 to 1999) of the action research process that led to the development of Webfuse and the Information Systems Design Theory (ISDT) that forms the basis for this thesis. This chapter describes the final iteration of the action research process (2000-2004 and beyond) and how it led to the formulation of the final version of the ISDT. A significant point of difference between this cycle and the previous one is that an obviously more ateleological process was taken. That is, unlike the previous cycle, which started with a specific set of design principles informing the design of an information system, this cycle commences with an existing information system with a number of known problems and then proceeds via various changes to attempt to address those problems. The final ISDT is an accumulation of the learning and reflection from this ateleological process of experimenting with the existing Webfuse system.

This chapter uses the same basic structure – adapted from the synthesised design and action research approach proposed by Cole et al (2005) – as used in chapter 4. However, in keeping with the more ateleological approach adopted in this cycle, the description of the intervention does not include a section explaining the a priori design principles. The chapter starts with the problem definition (Section 5.2), including a description of the changes happening within the broader societal and institutional contexts during this period (Section 5.2.1) and a brief summary of the problems with Webfuse that arose from the first iteration (Section 5.2.2). Next, the intervention is described (Section 5.3) as a collection of separate, but related, changes in the system and its support. The outcomes of the intervention are then examined in the evaluation section (Section 5.4). All of this is brought together first as an ISDT for e-learning within universities (Section 5.5) and then through the identification of lessons learned (Section 5.6).

Problem Definition

As described in Section 4.2 (cross ref), the basic problem needing to be solved was how (in 1996) to enable the Department of Mathematics and Computing (M&C) – and later the Faculty of Informatics and Communication – at Central Queensland University (CQU) to use the World-Wide-Web and other Internet-based technologies in its teaching and learning. The solution implemented to address this problem was the design and implementation of the Webfuse e-learning system (Section 4.3 cross ref). By 2000, the development and support of Webfuse entered a new phase informed by changes within the broader context and the desire to address the lessons learned during Webfuse’s early use. This section starts by describing the changes in the broader societal and CQU contexts, which both enabled and influenced the development of Webfuse from 2000 onwards (Section 5.2.1). Section 5.2.2 describes how these contextual factors influenced how Webfuse was supported and developed during this period. Finally, Section 5.2.3 briefly re-iterates the lessons learned during the use of Webfuse from 1997 through 1999 identified in Chapter 4.

Changes in institutional context

The period towards the end of the 20th and start of the 21st centuries saw considerable change in the context within which this work was performed. This section seeks to provide a brief summary of those changes and how they impacted the development and support of the Webfuse information system and eventually the direction taken with the ISDT. It starts with a summary of some of the major societal changes impacting higher education within Australia. Next, it provides a brief description of the changes, many influenced by societal factors, within the CQU context from 1999 onwards.

From 1999 onwards, acceptance of, access to and use of computers and the Internet amongst the staff and students of CQU increased significantly, in line with changes within the broader societal context. Household Internet access within Australia quadrupled from 16% in 1998 to 64% in 2006/7 (Australian Bureau of Statistics 2008). For most of this time, however, the majority of Australian households made do without broadband Internet connections. By 2004/5, only 16% of Australian households had a broadband Internet connection, increasing to 43% by 2006/7 (Australian Bureau of Statistics 2008). However, cost remained a significant barrier, with only 34% of people in bottom income quintile households having home Internet access compared with 77% in the top income quintile (Australian Bureau of Statistics 2007). This rapid increase, mirrored in other advanced countries, represented the growing penetration of the Internet and the World-Wide Web into everyday life.

The growing adoption of information and communications technologies (ICTs) had other broader societal impacts. In the years leading up to 2002, universities faced an almost overwhelming demand for information technology (IT) skills fuelled by the dot-com boom and the perceived Y2K crisis (Smyth and Gable 2008). In addition, the Australian government introduced an initiative in mid-1999 that allowed former full-fee paying overseas students – studying a specified set of programs, including IT – to apply for permanent residence within the first six months after course completion, even if they did not have work (Birrell 2000). At the same time, the more robust market disposition of the state meant that universities were not only required to be more efficient and effective in using state resources but were also required to compensate for reduced government funding by attracting private funds (Danaher, Gale et al. 2000). For a number of Australian universities this led to an increased reliance on overseas full-fee paying students in programs matching government-specified skills areas. Initially this included a heavy emphasis on IT skills, but with the global downturn in IT from 2002 leading to a decline in demand for these courses (Smyth and Gable 2008), the emphasis shifted to other programs such as accounting. As outlined in Chapter 4, CQU had adopted a strategic direction based on planned growth into the market for overseas students, and consequently fluctuations in demand impacted the institution.

In March 2000 the Australian federal and state ministers of education established the Australian Universities Quality Agency (AUQA) and assigned it the responsibility of providing public assurance of the quality of Australia’s universities and other institutions of higher education (AUQA 2000). Vidovich (2002) argues that the types of public sector policies that resulted in the formation of AUQA amount to mechanisms of indirect steerage developed as a complement to policies of devolution, decentralization and deregulation characteristic of a prevailing market ideology. Vidovich (2002) also argues that the rise of quality policy and globalisation within education at around the same time suggests that the two were intimately bound. Woodhouse (2003) – the Executive Director of AUQA – argues that the most frequently cited reasons for the great increase in external quality agencies in higher education are: the increase in public funding, the connection between higher education and national needs and the growth in student numbers. Woodhouse (2003) reports that feedback on trial and substantive AUQA quality audits in 2001 and 2002 was positive, with universities reporting beneficial effects through the audits and self-reflection triggered by prospective audits. In a review of 320 substantive contributions from the first 15 volumes of the journal Quality in Higher Education, Harvey and Williams (2010) suggest that the overall tenor is that “external quality evaluations are not particularly good at encouraging improvement”.

By 1994, as one of eight nationally accredited Distance Education Centres, CQU had almost 5000 students in 400 courses studying by distance education (CQU 1999). By the late 1990s, the changes described in the previous paragraphs had begun to significantly influence the conceptions of on-campus and distance education. By this time Australian distance education had been through three phases: (1) external studies (1911 to early/mid 1970s); (2) distance education (early/mid 1970s to mid 1980s); and, (3) open learning (mid 1980s onwards) (Campion and Kelly 1988). By the late 1990s factors such as declining funds, advancing technology and the demography of students had triggered a profound process of change where distance education methods and systems were converging with those of face-to-face teaching (Moran and Myringer 1999). By 2004, Bigum and Rowan (2004) describe how flexibility in teaching and learning was commonplace within Australian higher education and how enthusiasm for the term arose from perceptions of it being: a) a more effective and efficient means of getting teaching resources to students, and b) through online teaching offering the possibility of generating revenue from overseas fee-paying students. The key idea of flexible learning was a move away from instructor choice of key learning dimensions toward an approach that offered the student the flexibility to pick from a number of choices (Collis and Moonen 2002). Dekkers and Andrews (2000) suggested that once the use of technology became more common, discussion of flexible learning, like that with open learning, would soon revert simply to discussion of teaching and learning.

References

AUQA. (2000). "Mission, objectives, vision and values."   Retrieved 26 May, 2010, from http://auqa.edu.au/aboutauqa/mission/.

Australian Bureau of Statistics (2007). Patterns of Internet Access in Australia 2006. Canberra, Australian Bureau of Statistics: 80.

Australian Bureau of Statistics (2008). Internet access at home. Australian Social Trends 2008. Canberra, ACT, Australian Bureau of Statistics: 10.

Bigum, C. and L. Rowan (2004). "Flexible learning in teacher education: myths, muddles and models." Asia-Pacific Journal of Teacher Education 32(3): 213-226.

Birrell, B. (2000). "Information technology and Australia’s immigration program: Is Australia doing enough?" People and Place 8(2): 77-83.

Campion, M. and M. Kelly (1988). "Integration of external studies and campus-based education in Australian higher education: The myth and the promise." Distance Education 9(2): 171-201.

Cole, R., S. Purao, et al. (2005). Being proactive: Where action research meets design research. Twenty-Sixth International Conference on Information Systems: 325-336.

Collis, B. and J. Moonen (2002). "Flexible learning in a digital world." Open Learning 17(3): 217-230.

CQU (1999). Review of distance education and flexible learning "The Foresight Saga". Rockhampton, Central Queensland University: 43.

Danaher, P. A., T. Gale, et al. (2000). "The teacher educator as (re)negotiated professional: critical incidents in steering between state and market in Australia." Journal of Education for Teaching 26(1): 55-71.

Dekkers, J. and T. Andrews (2000). A meta-analysis of flexible delivery in selected Australian tertiary institutions: How flexible is flexible delivery? ASET-HERDSA 2000, Toowoomba, Qld.

Harvey, L. and J. Williams (2010). "Fifteen years of quality in higher education." Quality in Higher Education 16(1): 3-36.

Moran, L. and B. Myringer (1999). Flexible learning and university change. Higher education through open and distance learning. K. Harry. London, Routledge: 57-71.

Smyth, B. and G. G. Gable (2008). The information systems discipline in Queensland. The Information Systems Academic Discipline in Australia. G. G. Gable, S. Gregor, R. Clarke, G. Ridley and R. Smyth. Canberra, ACT, Australia, ANU E Press: 187-208.

Vidovich, L. (2002). ‘Acceding to audits’: New quality assurance policy as a ‘settlement’ in fostering international markets for Australian higher education. Australian Association for Research in Education Conference. Brisbane.

Woodhouse, D. (2003). "Quality improvement through quality audit." Quality in Higher Education 9(2): 133-139.

One potential approach to provide a Moodle email merge facility

One of the issues I have to address with the BIM Moodle module is the provision of an email merge facility. I (and a couple of other people I know) haven’t been able to find how to do this within Moodle. The following outlines one proposal for how this might be done within Moodle 1.9.

I’m very keen to hear from more experienced Moodle folk about whether or not this type of service already exists within Moodle.

It’s likely that I will attempt to implement aspects of this approach in the next week to extend BIM.

What is email merge

Essentially it is a method to send the same message to multiple recipients, where each message can be customised to include information specific to each recipient. There are three main tasks in email merge:

  • Selecting the recipients.
    Specify the list of folk you want to send the message to.
  • Create the message.
    Enter the message, including support for specifying the information that will be specific to each person.
  • Manage the sending/re-sending of the message.
    Tracking who has received the message, specifying whether to try again automatically etc.

The following is a screen shot (click on it to see a bigger version) of the manage message screen from the Webfuse email merge facility originally implemented by Nathaniel in 2002.

Email merge. It has a simple textbox for the message and supports attachments. The “Add tag to message” component allows the user to select some “tags” from a drop box. In Webfuse the tags include parts of the student’s name, email address, student id, and the program they were studying.
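The tag idea itself is simple enough to sketch in a few lines. The following Python sketch is purely illustrative – the `%TAG%` syntax, function names and data layout are my invention, not Webfuse’s actual implementation:

```python
# Illustrative sketch only: tag syntax (%FIRSTNAME% etc.), names and
# data layout are invented to show the idea, not Webfuse's real code.
def merge_message(template, recipient):
    """Return the template with each %TAG% replaced by recipient data."""
    message = template
    for field, value in recipient.items():
        message = message.replace("%" + field.upper() + "%", str(value))
    return message

students = [
    {"firstname": "Alice", "email": "alice@example.com"},
    {"firstname": "Bob", "email": "bob@example.com"},
]

template = "Dear %FIRSTNAME%, the assignment is due Friday."
for student in students:
    print(merge_message(template, student))
```

The same template goes out once per recipient, with each copy personalised just before sending.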

Why use it

Most teaching staff used Webfuse email merge to send messages to groups of students: to welcome and orient them to the course, to remind them that an assignment was due soon and point to resources, and to ask them why they didn’t submit the assignment. In my experience, an email merge appears more personal, and that generates a greater level of connection with the student. Many, if not all, of the students realised it was a bulk email, but the personal touch helped.

What’s available in Moodle?

I’m still fairly new to Moodle from a user perspective, and the only functionality I’ve been able to find that comes close is the “Message course users” functionality that is available under course participants. When you view the participants in a course you can select some of them and then choose to “add /send message” – see the following.

Moodle select participants

Then you see a typical HTML editor with some additional guidance, plus a list of selected users which you can further edit. See the following.

Moodle send message

In terms of the main tasks for email merge there are some limitations:

  • Selecting the recipients.
    You can only select the recipients from the entire list of people within a course. This is limiting in two ways. First, you may wish to include recipients that cross a course boundary. Second, you may wish to start with an existing list of recipients, not select from the entire list of course participants.

    For example, you may wish to use email merge to send a message to all students who haven’t completed an assignment. Hence, from the gradebook you’d like to be viewing those students and have a link “Mail merge” that allows you to select all those students.

  • Create the message.
    Two limitations here: no support for attachments, and no support for personalisation. Though it does have the HTML editor.
  • Manage sending.
    Doesn’t appear to have support for this. So, you can’t schedule the message to be sent at a specific time or on a specific event.

Improving recipient selection

Going beyond a course boundary is a little more difficult; however, improving selection within a course could be possible. The form that displays the message takes the list of recipients as a parameter (it appears in session data), so theoretically it might be possible for other Moodle extensions to generate this session data and call the form.

Improving message creation

The main missing piece here is the ability to include “tags” and get them replaced with personal information for each recipient. There are three broad tasks here:

  • Specifying the tags and where the information is.
  • Providing an interface that allows message senders to include tags in a message.
  • As each message is being sent, replace the tags with the actual personal information for the specific recipient.

The last two will likely require modifications to the file moodle/user/messageselect.php, which seems to implement most of this:

  • message edit screen;
    Need to add support to describe the available tags and allow the user to insert them in the message.
  • preview screen;
    Allows the user to see the message before it is sent. Add to this the ability to see the tags replaced with specific information from a user.
  • sending the message.
    i.e. where the tags get replaced with each recipients’ information.

Specifying the tags

Two ways to do this, simple and complex.

The simplest way to do this would be to restrict it to just standard Moodle system information about users, such as name and email address, and more standard extensions such as the gradebook. This would mean a “simple” change to moodle/user/messageselect.php.

A more complicated approach would be to allow greater support for Moodle’s extensibility, i.e. allow each activity/block to define its own set of tags and have moodle/user/messageselect.php be able to handle those. For example:

  • BIM could define its own set of tags (e.g. REGISTERED_FEED for the student’s registered blog feed).
  • When a user clicks on email merge from BIM, it would call messageselect and pass the list of users selected from BIM (e.g. all students with unregistered blogs).
  • messageselect will know which extension called it and check to see if that extension defines its own tags.
  • messageselect would then use those tags (and how to get the information for each user) to modify the edit screen, the preview screen and the sending of the message.
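The extensibility idea might be sketched as a small tag registry. Everything below is hypothetical – the registry API and function names are my invention, not Moodle 1.9 code; only the REGISTERED_FEED tag comes from the BIM example above:

```python
# Hypothetical sketch of an extensible tag registry. None of this is
# Moodle 1.9 API; only the REGISTERED_FEED tag name comes from the post.
tag_providers = {}

def register_tags(extension, resolvers):
    """resolvers maps a tag name to a function(user) returning its value."""
    tag_providers[extension] = resolvers

def resolve_tags(extension, message, user):
    # Core tags every extension gets, plus the extension's own tags.
    resolvers = {"FIRSTNAME": lambda u: u["firstname"]}
    resolvers.update(tag_providers.get(extension, {}))
    for tag, fn in resolvers.items():
        message = message.replace("%" + tag + "%", str(fn(user)))
    return message

# BIM registering its own tag:
register_tags("bim", {"REGISTERED_FEED": lambda u: u["feed_url"]})

user = {"firstname": "Alice", "feed_url": "http://example.com/feed"}
print(resolve_tags("bim", "Hi %FIRSTNAME%, feed: %REGISTERED_FEED%", user))
```

messageselect would hold the core tags and consult the registry for whichever extension called it, so new activities could add tags without touching the core code.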

Fixing BIM's backup and restore

The following outlines steps to continue work on BIM’s backup and restore functionality. As per this issue, the user part of the backup has errors.

It appears that the code was actually working.

Re-create the problem

It’s been a few months since I worked on this. Have to re-create the problem first.

Looking through the bim/backuplib.php code, the first evidence is that the user code is commented out in the function bim_backup_one_mod. Let’s uncomment that and try to back up a BIM.

Okay, that seems to have completed. No errors reported. Is the Moodle debugging option set to the highest? Yep. So the problem isn’t a syntax error, it’s an error in operation/implementation.

Let’s look at the resulting backup and see where it is going wrong.

[sourcecode lang=”bash”]
david-joness-macbook-pro:tmp david$ unzip *
Archive: backup.zip
creating: course_files/
creating: course_files/1/
inflating: course_files/1/david.2.xml
inflating: course_files/1/david.xml
creating: group_files/
creating: group_files/60/
creating: group_files/61/
inflating: moodle.xml
creating: site_files/
[/sourcecode]

Should there be BIM-specific files here? No, it looks like the data is within moodle.xml. Okay, there’s no user data being saved. Why?

Ahh, there’s more commented code to uncomment. Mm, still getting the message “without user data”. Missing something.

Ahh, apparently not all users can back up user data. I was logged in as my own account, no permission. Login as root, and there’s the user stuff.

Okay, BIM backup appears to work – no errors. Let’s look at the files.

Yep, that seems to be working so far. All three tables are being saved in the XML file and in apparently correct format.

Doing the restore

So, is the problem with the restore? The restore looks to have worked. The big question is how to test it. First, let’s look at the restored BIM. There are errors, but also no students. It appears that the problem is that the students/users from the backed up course aren’t in the restored, separate course.

What if we do the restore within the same course?

Well, the restore process didn’t create any errors, but the newly restored BIM (in addition to the existing BIM activity) has the same sorts of errors as when restored into a new course: there’s a problem with manage marking.

Let’s look at the database and see what’s been restored, check each BIM table.

  • bim – information about all BIM activities.
    As expected. 3 BIMs on my test box. The values are all as I’d expect.

    The original activity has id 1, id 2 is the restore in a new course (course id 15), and id 3 is the activity restored in the same course (course id 4).

  • bim_group_allocation – which groups are allocated to which staff.
    Again, there are 3 BIMs listed. The same number of entries for each BIM. The userids of the markers are the same regardless of the course. The group ids are different between courses. As expected (I think).
  • bim_questions – list of questions for the activity.
    Ok, as expected as it’s the user stuff I’m checking and this isn’t user stuff.
  • bim_student_feeds – where are the students registered feeds?
    All correct. Each of the 3 bims have exactly the same data. The userids are the same regardless of the course. This indicates that the restore is making the right decisions about the students.
  • bim_marking – marking and other information about each student post (3 feeds registered, 10 posts per feed, 3 bims)
    As expected there are 90 rows in this table. After a quick check, it appears that this is all good.

This is somewhat strange. Looks like it’s all working, so why the problems?

Is it because of the user I’m logged in as (admin user), what about a standard teacher? Nope, same errors. Time to look at the code.

The errors being reported are in a Moodle library – tablelib.php. Hence the data being passed in must be corrupt/wrong in some way. If I compare the original BIM activity with the restored one (in the same course) there are differences in the manage marking output. The restored one is missing one of the questions. However, under manage questions all the questions are listed.

The code generating the header of the table generates the same data. Okay, the problem is within the code that generates the contents of the table.

Bugger, my problem here. The function “bim_get_question_id” compares the title of a question to get the unique id of the question. If questions have the same title, which they can and do in this example, then there’s a problem. Need to fix that.

Fixing get_question_id

This function is only used once, in this one section of the code. So it looks like the solution is to remove the need to use it.

Let’s try simply adding the id of the question as the index for the array.

Yep, that works.
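The failure mode and the fix can be shown with a language-agnostic sketch (Python here purely for illustration – the real BIM code is PHP, and the question data below is invented):

```python
# Invented data to illustrate the bug: looking a question up by title
# breaks as soon as two questions share a title.
questions = [
    {"id": 1, "title": "Week 1"},
    {"id": 2, "title": "Week 1"},  # duplicate title
]

def get_question_id_by_title(title):
    # Old approach: always returns the first match, so question 2's
    # display data gets attributed to question 1.
    for question in questions:
        if question["title"] == title:
            return question["id"]

# New approach: key the display data by the unique question id instead,
# so duplicate titles no longer collide.
display = {question["id"]: question["title"] for question in questions}
```

Keying by the id means each question keeps its own slot in the array regardless of what it’s called.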

Does it work now?

So, does that mean the back up and restore process is working? Checking through the restored BIM in the same course, it appears it does. Some errors are there when restored to a new course.

This appears to be because there are no students assigned to the course and the error checking in BIM ain’t great. Fix that and the error messages disappear, but there still appear to be users within BIM. This is probably what should happen, because there are no students in the course, but there are in BIM.

Okay, the problems appear to be not with backup/restore, but with courses not having students enrolled and the poor error checking in BIM. If I fix up the error checking, we should be in action.

Required fixes

  • bim_create_posts_display – another area where the same question title is causing problems.

That’s it. Seems to be working.

Just need to remove the debug statements in the restore process and commit it.

Adding multiple visualisation approaches to Indicators block

This post is a summary of work being done to update the Moodle indicators block so that it can support multiple visualisation tools and approaches.

Problem

The indicators block is aimed (at least for me) at being a way in which various visual insights (indicators) about what is going on within Moodle can be shown to students and staff. Col’s initial indicators within the block were generated using the Google chart tools. This worked really well, and I think we’ve only scratched the surface with those tools. However, there appears to be a need to support multiple visualisation approaches; reasons might include:

  • the visualisation tool doesn’t provide necessary functionality; and
  • the need for multiple visualisations of the same data.

A simple example of this comes from the only “data” the indicators block currently visualises – the level of activity in a course site by staff or students. Currently this is shown as a “dial” or speedo (see below). The dial ranges from red through to green and a black arrow indicates the level of activity by the participant.

Next step in indicators block

Alan commented that he didn’t like the dial/meter visualisation in that it seems to encourage a simplistic “more is better” perception. Alan would prefer some sort of traffic light visualisation. After a very quick look, I don’t think the Google chart tools provide a traffic light visualisation. Regardless, you get the idea.

Rather than force someone to use only one visualisation, it would seem better if the Moodle indicators block allowed people to choose (and implement) the ones they preferred. i.e. support for multiple visualisations.

What’s been done

The aim here is to complete and describe three tasks that enable multiple visualisations. These tasks are:

  1. Move the Indicators to a Model/View pattern.
    The intent is to separate the calculation of the data from the visualisation. i.e. to allow multiple different visualisations.
  2. Add support for an alternate visualisation tool.
    In this case, the Protovis library.
  3. Implement a couple of different visualisations of existing data.
    Essentially to test and illustrate the use of the Model/View patterns.

Most of these have been done, but only at an initial stage, and only for staff.

Model/View

For each indicator there are two main tasks it must perform:

  1. Generate/retrieve the data to be visualised.
  2. Generate the visualisation.

The aim here is to separate those two tasks into two classes: a model and a view. This means that the existing indicator code that looks like this:

[sourcecode language="php"]
$indicator = IndicatorFactory::create($context);
$this->content->text = $indicator->generateText();
[/sourcecode]

will get modified to something like this:

[sourcecode language="php"]
$model = IndicatorFactory::createModel( $context );
$view = IndicatorFactory::createView( $model, $context );
$this->content->text = $view->generateVisualisation();
[/sourcecode]

The factory class is now responsible for generating both the model and the view. The above is likely to change over time. For example, rather than passing just $context, there might be other information, e.g. user preferences etc.
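A minimal sketch of what the factory side of that might look like. The class names follow the post; everything else, including the hard-coded model/view pairing and the inline class bodies (the real code would require_once the appropriate files), is an assumption of mine:

[sourcecode lang="php"]
// Illustrative sketch only: the real factory would require_once the
// appropriate model/view files and inspect $context (and, later, user
// preferences) to decide what to build.
class activityModel {
    public $context;
    public function __construct( $context ) {
        $this->context = $context;
    }
}

class activityView {
    private $model;
    public function __construct( $model ) {
        $this->model = $model;
    }
    public function generateVisualisation() {
        return '<div>visualisation of ' . $this->model->context . '</div>';
    }
}

class IndicatorFactory {
    // Generate/retrieve the data to be visualised
    public static function createModel( $context ) {
        return new activityModel( $context );
    }
    // Wrap a visualisation around that data
    public static function createView( $model, $context ) {
        return new activityView( $model );
    }
}
[/sourcecode]

With these in place, the block’s calling code reduces to the two factory calls plus generateVisualisation().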

Let’s see if I can get this to work with some testing.

Adding a protovis visualisation

The aim here is to create a second visualisation of the existing indicator using Protovis. The Protovis view will initially be hard-coded for some users, eventually to be replaced with some preference or rotation approach.

Running out of time at the moment, so I’m going to put in a dummy protovis view that simply shows a bar graph. Doesn’t use the data from the model at all.

So, here’s what the staff indicator looks like with the Google chart view.

Staff activity indicator

The bit of the factory that generates this view looks like this:

[sourcecode lang="php"]
if ( has_capability( 'moodle/legacy:teacher', $context ) ||
     has_capability( 'moodle/legacy:editingteacher', $context ) ) {
    require_once(
        $CFG->dirroot.'/blocks/indicators/staff/activity/google_view.php');
    return new activityView( $model );
}
[/sourcecode]

Eventually, rather than a straight require of google_view, this would be replaced by some algorithm that figures out which view the user wants. But, for now, I’ve introduced the following, which randomly selects which view to use.

[sourcecode lang="php"]
$view = "/blocks/indicators/staff/activity/google_view.php";
if ( rand( 0, 1 ) == 1 ) {
    $view = "/blocks/indicators/staff/activity/protovis_view.php";
}
require_once( $CFG->dirroot.$view );
return new activityView( $model );
[/sourcecode]
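One possible shape for that eventual algorithm, sketched as a standalone function. The preference keys and the function itself are made up for illustration; in Moodle the preference value would presumably come from get_user_preferences():

[sourcecode lang="php"]
// Hypothetical sketch: pick the view file from a per-user preference,
// falling back to the Google chart view when the preference is missing
// or unrecognised.
function indicators_pick_view( $preferred ) {
    $views = array(
        'google'   => '/blocks/indicators/staff/activity/google_view.php',
        'protovis' => '/blocks/indicators/staff/activity/protovis_view.php',
    );
    if ( isset( $views[ $preferred ] ) ) {
        return $views[ $preferred ];
    }
    return $views['google'];
}
[/sourcecode]

The block would then require_once $CFG->dirroot plus whatever path this returns, as in the random version above.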

The dummy protovis view looks like this

Proof of concept - protovis in Moodle indicators block

To do

Need to update the student view to use this model, and to start generating some different models and views.

Also need to think about how the models can be used to do some “caching” of database content.

Understanding what teachers do: First step in improving L&T

The following is an attempt to explain the initial description and rationale of an exploratory research project (perhaps ethnographic, narrative inquiry or some similar qualitative methodology) aimed at understanding what teachers/academics actually experience within a particular environment during a single term. The assumption is that by better understanding the lived experience of the teaching staff you can better understand why (or why not) teaching is likely to improve.

In terms of suggestions and advice, I’m really keen to hear from people who might have some insights to share around:

  • methodology;
    What good methods are there to gain the type of insight I’m interested in without being too onerous for the academics involved?
  • related literature;
    Where is the literature that talks about this sort of approach within university teaching, or perhaps more broadly in education?

Why? – The personal aspect

I’m interested in this because I’m coming to the opinion that it is the quality, quantity and diversity of the connections within the network of people, policies, technologies and other system objects that enable or constrain a university’s ability to improve its teaching and learning. In particular, the connections which surround the teaching staff and students define what they experience, and that experience impacts what they are likely to do (or not). My bias is that I think the network/environment surrounding most staff/students is actively preventing improvement in learning and teaching.

In my current job I am expected to help improve the quality of teaching and learning. Much of what I do (e.g. Moodle curriculum mapping, the broader alignment project, and the indicators Moodle block) is aimed at modifying the environment/network around teaching staff to enable and encourage them to improve their teaching. But this is only half the equation.

Aside: My focus is on teaching staff. It is not on students. While I agree 100% that student-centered approaches (there’s a lot of “buzz” around that phrase, so I hesitate to use it) to learning are the most effective, I don’t teach students. My job is to help other academic staff improve what they are doing, to create an environment in which they not only think that student-centered learning is a good thing (which many of them already do) but in which the environment actually helps them implement such approaches, rather than actively hindering them. I’ve seen too many attempts to encourage student-centered approaches that ignore the teaching staff and consequently get hamstrung by reactance.

The other half of the equation is getting a good understanding of the environment/network as experienced by the teaching staff. Up until now I’ve been relying on my recent experience of the same environment (which is now 3+ years old) and ad hoc discussions with colleagues (which is limited by all sorts of bias). This understanding is necessary because of the need to:

  • Be more aware of what some of the potential problems or needs are that need addressing.
  • Design the interventions to address those problems.
  • Understand the impact and ramifications of those interventions.
  • Provide evidence to others of the problems within the environment and the value of the interventions.

Why? – the research perspective

So that’s the personal perspective, what about the research perspective?

First off, one of the “buzzwords” within education fields at the moment is distributive leadership. Here’s something I wrote describing distributive leadership in the alignment project blurb

Parrish et al (2008) define distributive leadership as the distribution of power through a collegial sharing of knowledge, of practice, and reflection within a socio-cultural context. Zepke (2007) argues that this is more than the delegation of tasks and responsibilities, and more than collaborative practice. Spillane et al (2004, p. 9) argue that, based on its foundations in distributed cognition and activity theory, distributive leadership is not limited to people, but can also be attributed to artefacts such as language, notational systems, tools and buildings. Leadership activity is distributed through an interactive web of actors, artefacts and situation (Spillane et al., 2004, p. 20).

Spillane et al (2004) go on to define leadership as

the identification, acquisition, allocation, co-ordination, and use of the social, material, and cultural resources necessary to establish the conditions for the possibility of teaching and learning.

i.e. the identification of the social, material and cultural resources within an organisation is an important part of creating the conditions for teaching and learning.

Support for the importance of the environment in terms of its impact on learning outcomes comes from 30 years of empirical research by Prosser et al (2003) and Ramsden et al (2007) which has produced abundant empirical inquiry and theory that links the quality of student learning outcomes with: (1) the approaches to learning taken by students; (2) the students’ perceptions of the learning context; and (3) the approaches to teaching practiced by teaching staff. In turn, this research confirms the findings of other leadership studies by illustrating that variation in teaching approaches is associated with perceptions of the academic environment (Ramsden et al., 2007).

In terms of models of what we know about teaching and learning, the importance of the environment and context is illustrated by Trigwell’s (2001) model of teaching (click on the images to see larger versions)

Trigwell's model of teaching

And by Richardson’s (2005) model of teachers’ approaches to teaching

Integrated model of teachers' approaches to teaching

While pedagogues are likely to adopt teaching approaches that are consistent with their conceptions of teaching, there may be differences between espoused theories and theories in use (Leveson 2004). While pedagogues may hold higher-level views of teaching, other contextual factors may prevent use of those conceptions (Leveson 2004). Environmental, institutional, or other issues may impel pedagogues to teach in a way that is against their preferred approach (Samuelowicz and Bain 2001). While conceptions of teaching influence approaches to teaching, other factors such as institutional influence and the nature of students, curriculum and discipline may also influence teaching approaches (Kember and Kwan 2000). Prosser and Trigwell (1997) found that pedagogues with a student-focused approach were more likely to report that their departments valued teaching, that their class sizes were not too large, and that they had control over what was taught and how it was taught. Other contextual factors that frustrate pedagogues’ intended approaches to teaching may include senior staff with traditional teacher-focused conceptions raising issues about standards and curriculum coverage, and students who induce teachers to adopt a more didactic approach (Richardson 2005). In addition, teachers who experience different contexts may adopt different approaches to teaching in those different contexts (Lindblom-Ylanne, Trigwell et al. 2006).

i.e. the perceptions of the environment in which they teach held by teaching staff have a direct effect on how they teach. If you want to improve the quality of teaching within a university you have to understand how the academics perceive/experience the environment.

The literature I’ve seen to date by Prosser and his colleagues has been mostly survey based. I’m interested in a more detailed insight into the actual lived experience, rather than ad hoc recollections filtered through survey questions.

What

To my way of thinking this has to be an exploratory, qualitative and ethnographic investigation. I’m looking to gain insight into the day-to-day lived experience of academics, how they react to that experience, and what it does to them. I need to read up some more. What follows are some initial thoughts.

Murthy (2008) describes how good ethnography “effectively communicates a social story, drawing the audience into the daily lives of the respondents”. This is what I’m trying to get to, I want the stories of the daily lives of the academics around learning and teaching. Murthy (2008) goes on to give an overview of digital ethnography, but nothing immediately helpful…but it seems connected to what I was thinking of doing. Hookway (2008) also looks promising but the site is down for scheduled maintenance.

How

So, without much reading, I’ve been thinking about starting this with a small exploratory study along the following lines:

  • Approach half a dozen academics from my current institution.
    Selected to be somewhat diverse in terms of likely experience: location, subject, etc.
  • Invite them to be co-researchers.
    I’d rather they were collaborators than research subjects. I want them to have greater ownership and motivation to be involved, and I want the benefit of their insight not just into the everyday experience of teaching, but also into the research.
  • For a single teaching term, ask them to contribute to a blog stories about their experience with teaching.
    Whenever they do something around teaching, something different, something frustrating, etc., they write a story on the blog. As short or as long as they like. It might be a personal blog, or it might be a group blog. It might well have to be a private blog.
  • At the end of term, employ various methods to analyse the data.
  • Present it locally and publish it.

References

Biggs, J. (1996). “Enhancing teaching through constructive alignment.” Higher Education 32(3): 347-364.

Kember, D. and K.-P. Kwan (2000). “Lecturers’ approaches to teaching and their relationship to conceptions of good teaching.” Instructional Science 28(5): 469-490.

Leveson, L. (2004). “Encouraging better learning through better teaching: a study of approaches to teaching in accounting.” Accounting Education 13(4): 529-549.

Lindblom-Ylanne, S., K. Trigwell, et al. (2006). “How approaches to teaching are affected by discipline and teaching context.” Studies in Higher Education 31(3): 285-298.

Hookway, N. (2008). “‘Entering the blogosphere’: some strategies for using blogs in social research.” Qualitative Research 8(1): 91-113.

Murthy, D. (2008). “Digital Ethnography.” Sociology 32(5): 837-855.

Parrish, D., G. Lefoe, et al. (2008). The GREEN Resource: The development of leadership capacity in higher education. Wollongong, CEDIR, University of Wollongong: 64.

Prosser, M. and K. Trigwell (1997). “Relations between perceptions of the teaching environment and approaches to teaching.” British Journal of Educational Psychology 67(1): 25-35.

Prosser, M., P. Ramsden, et al. (2003). “Dissonance in experience of teaching and its relation to the quality of student learning.” Studies in Higher Education 28(1): 37-48.

Ramsden, P., M. Prosser, et al. (2007). “University teachers’ experiences of academic leadership and their approaches to teaching.” Learning and Instruction 17(2): 140-155.

Samuelowicz, K. and J. Bain (2001). “Revisiting academics’ beliefs about teaching and learning.” Higher Education 41(3): 299-325.

Spillane, J., R. Halverson, et al. (2004). “Towards a theory of leadership practice: a distributed perspective.” Journal of Curriculum Studies 36(1): 3-34.

Trigwell, K. (2001). “Judging university teaching.” The International Journal for Academic Development 6(1): 65-73.

Zepke, N. (2007). “Leadership, power and activity systems in a higher education context: will distributive leadership serve in an accountability driven world?” International Journal of Leadership in Education 10(3): 301-314.

Draft chapter 4 of the thesis is up

A couple of days ago I wrote the last few sentences for a fairly serious first draft of chapter 4 of the thesis. This chapter has to be re-read by me, read by my esteemed supervisor and then by a copy editor, so it’s not finished yet. But it’s a step closer.

This chapter tells the story of and rationale behind the development and use of Webfuse from 1997 through 1999. It attempts to formalise the thinking behind Webfuse into the first version of an information systems design theory for e-learning within universities. Since the thinking behind Webfuse was very naive, the resulting design theory is somewhat naive. From my perspective, much of what passes for thinking around e-learning within universities today is just as naive, if not more so.

The next step is to move onto chapter 5 which tells the story/rationale of the final period of Webfuse: 2000 through 2004 and beyond.

How curriculum mapping in Moodle might work

The purpose of this post is to provide a concrete description of how curriculum mapping of a Moodle course might work. The hope is that this will enable a broader array of people to comment on the approach and, in particular, identify flaws or problems. So, please comment.

This is being done as part of the alignment project and picks up from some earlier examination of Moodle’s existing outcomes feature.

Overview

The aim is to modify Moodle (as little as possible) to enable teaching staff to perform two tasks:

  1. Map how well the activities, resources and assessment within their Moodle course align with a set of outcomes.
    Related to this task is the ability to maintain this mapping as the course is modified.
  2. Use the alignment information about their course (and other courses) to enhance their course.

Each of those two tasks is expanded below.

Implementation

The implementation suggested below is based on ideas from Moodle’s existing support for Outcomes. Some of the following screen shots are using that existing support, some are slightly modified. Moodle’s existing support for outcomes (or competencies) is in terms of tracking how students are going in achieving specific outcomes or competencies. Rather than individual students, this project is mapping the activities, resources and assessments against outcomes. But the principle is basically the same.

Mapping

This task has the following steps (which are explained below):

  1. Specifying the outcomes.
  2. Mapping course items against those outcomes.
  3. Maintaining the mapping.

There is also the problem of whether or not a Moodle course site can be used to map everything about a course.

Specifying the outcomes

The first step is to specify which outcomes courses will be mapped against. Moodle supports two “types” of outcomes:

  • “standard” outcomes; and
    These would be created at the institution level and able to be used across all Moodle course sites for that installation.
  • course outcomes.
    These are added to a specific course and can only be used within that course.

Outcomes are placed into Moodle by direct entry via the Moodle interface or by uploading a CSV file. Important or interesting values for an outcome include:

  • Both a full and short name.
  • A description of the outcome.
  • The scale to be used for measuring the outcome.

Scales are used by Moodle to evaluate or rate performance. By default this is a numeric value; however, Moodle supports the creation of custom scales. For example, the scales Moodle page talks about the cool scale that consists of the values: Not cool, Not very cool, Fairly cool, Cool, Very cool, The coolest thing ever!

My current institution is rolling out its graduate attributes. There are eight graduate attributes, each of which could be loaded as a standard outcome in Moodle. The institution is currently using three levels – introductory, intermediate and graduate – and has created descriptions of these levels for each attribute. These could form the basis for a scale for each attribute/outcome.

The following is an example CSV file that can be uploaded into Moodle to achieve this.

[sourcecode lang="text"]
outcome_name;outcome_shortname;outcome_description;scale_name;scale_items;scale_description
Communication;comm;"Described here http://dmai.cqu.edu.au/FCWViewer/view.do?page=7949";"CQU Graduate Attributes (Communication)";"Introductory – Use appropriate language to describe/explain discipline-specific fundamentals/knowledge/ideas (C2), Intermediate – Select and apply an appropriate level/style/means of communication (C3), Graduate – Formulate and communicate views to develop an academic argument in a specific discipline (A4)";
Problem solving;ps;"Described here http://dmai.cqu.edu.au/FCWViewer/view.do?page=7949";CQU Graduate Attributes (Problem solving);"Introductory – Manage time and prioritise activities within the University’s framework for learning (C3), Intermediate – Make decisions to develop solutions to given situations/questions (C5), Graduate – Formulate strategies to identify, define and solve problems including, as necessary, global perspectives (P5)";
[/sourcecode]

Mapping against outcomes

Let’s start with an example Moodle course site with “editing turned on”. With “editing turned on” you get a collection of additional icons next to just about every element of the site. See the following image (click on it to see a larger version).

Moodle course page - editing on

Can you see the icon that looks like a hand holding a pen? This is the “edit” icon. If you click on this icon you get taken to the edit page for that item of the Moodle course site. An edit page for a Moodle item contains a number of components specific to the item, and a number of components common to all items. The following image is a portion of the edit page for a Moodle discussion forum with some additional labels added to show the specific and common components.

Moodle edit page - outcomes

Did you spot the “Outcomes” component of the above edit page? It shows a list of “outcomes” which match the graduate attributes of my current institution. Against each “outcome” there is a check box. To “map” this discussion forum against a graduate attribute, you simply check the appropriate box. It would not be a great stretch to think that “Communication” and “Team work” might be appropriate.

Important: This is all in Moodle now. No additions needed.

The “on” or “off” nature of the check box is very limited. This is due to the purpose Moodle’s current outcome support is meant to fulfill. For curriculum mapping you would want something more like the following.

Example curriculum mapping outcomes

The above has two main changes:

  1. Addition of the question mark icon.
    In Moodle practice, clicking on the question mark gives you help. In terms of outcomes for curriculum mapping, I would expect that at the least this would explain the outcome (in this case a graduate attribute) and the scale being used. It might include examples and might include a link to talk to a real person.
  2. Replace the checkbox with the scale.
    In this case it’s showing a drop box next to each outcome/attribute. These drop boxes, as shown by the box next to “Communication”, contain the three-level scale being used by my current institution.

There is a lot more you could do with this particular interface, but the basic point is that when a teacher is editing or creating a new item for a Moodle course site, they can map that item against the course outcomes at the same time.

Maintaining the mapping

Following on from the last point, the fundamental idea of this project is that a mapping of the alignment within a course site is maintained all of the time. It’s not something done every now and then because an accrediting body is visiting. The idea is that once a course site is mapped, maintaining the mapping fits into normal academic practice. For example, common practice at my institution is that each offering of a course does not start with a brand new, empty Moodle course site. Instead, the previous course offering is copied over for the next term and then edited.

With the suggested changes, the copying of the course site would also copy the mapping. So rather than mapping the entire course site all over again, the teacher only needs to map the new items added to the site or modify the mappings of any items they might change.

The new “mapping” features of Moodle should encourage/warn the teacher when the alignment is no longer correct. The following image is an example of what a teacher might see if they have changed the Moodle item, but not updated the outcomes/alignment mapping.

Out of date mapping
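One simple way such a warning could be detected is by comparing timestamps: Moodle items carry a timemodified field, and if the mapping stored its own last-modified time, staleness becomes a single comparison. This is a sketch of mine, not part of the project’s code; the function name and the mapping table it implies are assumptions:

[sourcecode lang="php"]
// Hypothetical check: a mapping is stale if the course item has been
// modified more recently than its outcome mapping was last saved.
function mapping_is_stale( $item_timemodified, $mapping_timemodified ) {
    return $item_timemodified > $mapping_timemodified;
}

// e.g. a forum edited after its mapping was last updated
var_dump( mapping_is_stale( 1273500000, 1273400000 ) ); // bool(true)
[/sourcecode]

The block or edit page would run this check per item and render the warning icon for any item where it returns true.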

Map everything?

There’s an assumption in the above that by mapping every item in a Moodle course site you are considering everything about the course. It’s a somewhat faulty assumption because most Moodle course sites are at best a supplement to what happens face-to-face or via other media. If this idea is to work, then thought would have to be given to how you design a Moodle course site that captures all aspects of a course.

This is by no means a simple task or one without potential problems. However, I do think that supporting people to collaborate about this question in the context of considering overall course alignment will allow interesting and useful approaches to develop. Approaches that could potentially improve the quality of Moodle course sites.

But this is something that would need to be tested.

Using the information

The previous section gave an overview of how the mapping of course alignment would be performed. This is only the first part of this project. The next, and potentially more interesting, step is what happens when people start using the availability of this information to inform quality enhancement of courses.

What people might do with this information is not something I think you can predict. The way the project was initially framed was to allow these potential uses to flow from action research cycles. However, some initial ideas have been proposed. The following describes those I think are the most generative, i.e. the ways of using this information most likely to generate further interesting applications of, or responses to, it.

The three uses I talk about below are:

  1. Visualising the alignment.
  2. Sharing the alignment.
  3. Contextualising L&T support.

Visualising the alignment

The simplest use would be for a teacher taking on a course to be able to see how aligned (or not) a course is. The following is the type of visualisation that might be used. It’s taken from Lowe and Marshall (2004) and a tool developed at Murdoch University. Each graduate attribute has 4 graphs representing objectives, learning activities, assessments and contents, the size of the graph represents how often/much the attribute is covered by those course elements.

GAMP visualisation of course alignment

In the above image it’s visible that the “Ethics” graduate attribute is quite heavily covered in course objectives, somewhat in course contents, a bit less in assessment, but is not covered at all by learning activities. One of the propositions underpinning the project is that explicit representations of alignment problems are likely to encourage teaching staff to fix them (see the contextualise L&T support section for more on this). This type of visualisation could be especially helpful for new or casual teaching staff who take on a course for the first time.

A Moodle implementation could be modified to send reminders to teaching staff about apparent misalignment.

Share the alignment

Making the level of alignment within a course explicit to the staff teaching the course is only the first step. A common problem faced by degree programs is preventing duplication of content or content holes. If all courses within a program are using this feature, then it’s fairly simple to combine the alignment information from multiple courses into a form that can be shared. The following is another example from Lowe and Marshall (2004) and shows a visualisation for multiple courses.

GAMP program visualisation

This type of visualisation could be factored into quality assurance processes for a program at the start of a term. The program’s teaching group could adopt a collaborative process at the start of term to address any holes or duplications.

The sharing could also be more ad hoc. The visualisation of the course (the first image from Lowe and Marshall) could be extended to provide links to examples. i.e. when you see a visualisation like the above that shows that the Ethics graduate attribute is not covered by any learning activities, there could be a link to other courses that do have activities covering the Ethics graduate attribute. Teaching staff could follow these links to view those activities as a way of getting ideas. Which courses show up via these links could be chosen in a number of ways.

The alignment could also be shared with students. Adding the ability to view the contents of a site structured using the outcomes would be quite easy. Lots more interesting applications could be developed.

Contextualise L&T support

Above it was suggested that the visualisations of alignment could, when problems are identified, provide links to courses that can be used as examples. The visualisations could also provide links to documents, presentations, discussions and people who could provide specific support. This could help curriculum designers and related L&T support folk contextualise their assistance in a very specific way, moving towards Boud’s (1999) argument that L&T support needs to be embedded within the context of academic work, that it needs to occur in or close to teaching academics’ sites of practice.

References

Boud, D. (1999). Situating academic development in professional work: Using peer learning. International Journal for Academic Development, 4(1), 3-10.

Lowe, K., & Marshall, L. (2004). Plotting renewal: Pushing curriculum boundaries using a web based graduate attribute mapping tool. Paper presented at the 21st ASCILITE Conference, Perth.

Moving the indicators Moodle block to a factory class

The following reports on some work on the indicators block to move it towards using some object-orientation and the factory design pattern.

Why?

One of the requirements we’ve talked about for the indicators block was the ability to show many different “indicators” (visualisations of something important about learning and teaching within a Moodle course). The idea is that the individual user could either scroll through the different visualisations or they could configure it to show a subset that they are interested in. Some examples of indicators that are already out there which might be included are:

  • An effort meter;
    This is what the block currently shows. It’s closely related to the Moodle meter idea of Lewis Carr, to the extent that both the meter and the current indicator use Google’s chart API for the graphics. Though the meter appears to place users into four groups, while the indicators block currently uses a straight numeric scale.
  • Traffic lights;
    Purdue University has used traffic lights to represent students’ standing in a class. This is a little bit like Michael de Raadt’s Moodle progress bar block, at least in terms of helping students visualise their progress, this time within a course.
  • Network visualisations of connections;
    The SNAPP project is the main example of this I know. Visualise the network of interactions between participants in a discussion forum.
  • Waterfall visualisation;
    This comes from work done by David Wiley and his students. It’s connected to the traffic lights/progress bar idea, but is focused more on showing student progress to the teacher. Allowing the teacher to see which students are struggling or not.

The last one is somewhat connected to a visualisation that Col has been working on. He describes the rationale and the approach in this blog post. More on that later.

Are there any other ideas for visualisations?

What?

Given the aim is to include multiple visualisations and we do want people to be able to add their own to the block, we need to have a clean way of separating the code for the different indicators. This is what the factory design pattern does.

The idea of the factory pattern is that you want the code for the different indicators to be separated out. These indicators are different, but they all do essentially the same task (look at the LMS data and generate some visualisation). Deciding which of the indicators to show to the user is, however, fairly complex. In terms of the indicators this decision will eventually need to consider:

  • What type of user is this?
    Students and staff will be able to see different indicators.
  • How has the user configured the indicators block?
  • Based on the above, which indicator should we show?

The initial version of the block had the decision about which indicator to show and the code for both indicators (staff and student) combined into the one function. That was a bit messy with two indicators; with six or seven it would have been a nightmare.

With the factory design pattern, the guts of the block’s main function looks like this.

[sourcecode lang="php"]
// get details about the context
$context = get_context_instance(CONTEXT_COURSE,
                                $SESSION->cal_course_referer);

// create the correct indicator
$indicator = IndicatorFactory::create($context);
// use the indicator to generate the HTML to put in the block
$this->content->text = $indicator->generateText();
[/sourcecode]

There’s a class called IndicatorFactory which when given the current context decides which indicator should be used. The factory then constructs the right indicator and returns it.

All indicators simply generate the “text” that is placed in the block. So we call the indicator's generateText() function and we're done.

All the IndicatorFactory class does at the moment is look at the type of user. If the user is a teacher, then it creates an object of the class staffActivity. If it is a student, it creates an object of the class studentActivity.

[sourcecode lang="php"]
if ( has_capability( 'moodle/legacy:teacher', $context ) ||
     has_capability( 'moodle/legacy:editingteacher', $context ) ) {
    require_once( $CFG->dirroot.'/blocks/indicators/staff/staffActivity.php' );
    return new staffActivity;
} else if ( has_capability( 'moodle/legacy:student', $context ) ) {
    require_once( $CFG->dirroot.'/blocks/indicators/student/studentActivity.php' );
    return new studentActivity;
}
[/sourcecode]

Both the “activity” classes contain the SQL statements, a bit of maths and the call to the Google charts API necessary to generate the particular visualisation. Both of the “activity” classes extend the abstract class Indicator. The idea is that each of the above indicators we implement will have its own class that extends the Indicator class.
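As a rough, self-contained sketch of how that structure hangs together: Indicator, IndicatorFactory, generateText(), staffActivity and studentActivity are the names used in the block, while the canned HTML and the $role string are stand-ins for the real SQL/Google-chart work and the Moodle $context checks.

```php
<?php
// Sketch only: the canned HTML and the $role string replace the real
// database queries and capability checks described above.

abstract class Indicator {
    // Every indicator generates the HTML that is placed in the block.
    abstract public function generateText();
}

class staffActivity extends Indicator {
    public function generateText() {
        // In the real block: SQL + some maths + a Google chart URL.
        return "<div>staff activity indicator</div>";
    }
}

class studentActivity extends Indicator {
    public function generateText() {
        return "<div>student activity indicator</div>";
    }
}

class IndicatorFactory {
    // The real factory calls has_capability() on the $context; a role
    // string keeps this sketch self-contained.
    public static function create($role) {
        if ($role === 'teacher') {
            return new staffActivity;
        }
        return new studentActivity;
    }
}

$indicator = IndicatorFactory::create('teacher');
echo $indicator->generateText(), "\n";
```

Adding a new indicator then becomes: write a new subclass of Indicator and teach the factory when to create it; nothing else in the block has to change.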

What’s next

This has laid the foundation for having multiple indicators for each user category. The next steps might include:

  • How will users move between different indicators?
    One idea is that there are left and right arrows at the bottom of the indicators that allow the user to scroll through the available indicators. Have other Moodle blocks done this? How? Does this mean a refresh of the entire page or do we do some HTML trickery?

    Alternatively, do we randomly (or sequentially) show the indicators every time the page is refreshed? Do we build some smarts into the indicators block so that at certain times, or based on certain events, it shows specific indicators first? For example, in the day or two leading up to an assignment due date it might show the percentage of students who have submitted and those who haven't.

  • How will users be able to configure which indicators they are interested in?
    Eventually (not right now) we may want to allow the users to configure which ones they are interested in.
  • Can we improve the current indicators?
    For example, some of the SQL for the student activity indicator uses “roleid=5” to indicate a student. I think this is deprecated.
  • Do we need to and how will we “cache” the data required for the indicators?
    At the moment, both indicators query the database every time the block is shown. In a large installation this could lead to some performance problems. Eventually, we will need to look at an approach to “caching” to reduce the performance hit.
  • Do we need to move to a model/view pattern for the indicators?
    Are we going to want the one set of data around an indicator to be visualised in many different ways? SNAPP already does this with the different types of network visualisation it supports. If so, we may wish to split the indicators into a model (calculate the data) and view (visualise it) objects.
  • How do we support bigger indicators?
    SNAPP visualisations are probably not going to fit within a block, especially if we're talking about building on SNAPP by enabling visualisations of discussion forums across a range of courses, not just the current one. How do we support indicators that need quite large areas? A popup? A new page? What's the Moodle way? What's the cool way?
  • Are there other visualisation tools?
    The Google chart API looks like a good, low impact way of doing visualisations. But it might not provide everything we need. Are there other alternatives?
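If we did go down the model/view path, the split might look something like the following sketch. All the class names here are hypothetical (none come from the block), and the hard-coded data stands in for the SQL the real model would run against the Moodle logs.

```php
<?php
// Hypothetical model/view split: one model calculates the data, several
// interchangeable views visualise it. Names and data are illustrative only.

class ForumActivityModel {
    // Returns participant => post count; hard-coded here for illustration.
    public function getData() {
        return array('alice' => 12, 'bob' => 3);
    }
}

interface IndicatorView {
    public function render(array $data);
}

class BarChartView implements IndicatorView {
    public function render(array $data) {
        // The real view would build a full Google chart API URL from $data.
        return '<img src="http://chart.apis.google.com/chart?chd=t:'
             . implode(',', array_values($data)) . '" />';
    }
}

class NetworkView implements IndicatorView {
    public function render(array $data) {
        // A SNAPP-style network visualisation would go here.
        return '<div>network of ' . count($data) . ' participants</div>';
    }
}

// The same model's data feeding two different visualisations.
$model = new ForumActivityModel();
$views = array(new BarChartView(), new NetworkView());
foreach ($views as $view) {
    echo $view->render($model->getData()), "\n";
}
```

The point of the split is that supporting a new visualisation of existing data means writing only a new view, and caching the model's data would benefit every view at once.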

Is there more to communities of practice?

Markus and I have been talking about behaviour change and lots more for a bit. He's about to start a new job and has been speculating about what he might do and how it connects with what we've been talking about. This post is an attempt to make explicit my initial gut reaction to the idea that

distributive leadership may not help, but communities of practice might

Its main outcomes will likely be to encourage me to read and understand more about both conceptions.

A cynical, ill-informed view of communities of practice

My knowledge of communities of practice is limited to my observations of local attempts to implement them and having purchased and skimmed one of Wenger's books on the topic. Based on that vastly ill-informed perspective, my current opinion is that communities of practice suck. I've previously outlined two of the reasons I don't think they are working to improve L&T within universities. On reflection, my problems with CoPs are the following (most are based on observations of the local context, hence generalisability is somewhat limited):

  • If voluntary, they attract the folk who least need them.
    A voluntary CoP around assessment within a university is likely to attract people who are intrinsically interested in assessment. People who aren’t interested won’t join. I suggest that it’s the folk who won’t join that are more likely to be the folk who need to join.
  • If compulsory, they encourage compliance or corruption.
    Force an academic (knowledge worker? Dentist?) to do something they don't want to do and they will aim to be seen to comply, but won't really participate. There's a chance participation might lead to a change of mind, but that's a big if.
  • They are likely to suffer all the common collaboration problems.
    Get any group of people together and you are likely to find problems such as the herd mentality, the echo chamber etc. Not to mention the problem that, for some, CoPs are becoming seen as yet another tool of management.
  • They find it hard to change the system.
    Most of the CoPs I have seen set up are not directly connected with the formal leadership of the institution or the formal processes/systems of the institution. Consequently, the discussions do not end up changing the institution’s systems or processes. The focus becomes sharing and talking, not making changes.

Obviously most of these can be addressed in setting up the CoP. But what I'm interested in is getting the whole system/context set up in a way that encourages on-going reflection and change. And that's more than just people talking.

Distributive leadership as different/better

At the moment, the most interesting aspect of distributive leadership for me is its foundation on distributed cognition. Due to this foundation, leadership within distributive leadership is not a function of the formal leadership hierarchy within an organisation. It is also more than distributing leadership responsibility into the people within an organisation. It’s about distributing leadership responsibility into the people and the systems and the processes of the organisation.

This is important: if you limit leadership to just the formal leaders, or even just to the people, you are missing the point. I also think this is the bit most people don't get, and one I'm still figuring out how to explain. Is this a point of departure from CoP: the inclusion of systems/processes, as well as people, into consideration?

How about we consider what leadership is? What does it entail? Some previous reading and thinking identified two functions that are thought to be indispensable to leadership (Southwell and Morgan, 2009; Leithwood and Levin, 2005):

  • Direction-setting; and
    What is the direction or purpose considered valuable/appropriate?
  • Influence.
    Encouraging folk to move towards the direction, to achieve the purpose.

Let’s use Markus’ problem as the example. Thinking of this from a “distributive leadership” perspective (or at least what I currently think might be such a thing) might suggest the following perspective.

The direction or purpose is to reduce the sense of isolation of the dentists within Scottish prisons. In itself, this is a pretty big assumption. What’s the evidence that the dentists feel this and that it’s a problem? Is the lack of funding priority indicative that this is not important?

As I'm ignorant of this evidence, but can see that it could be important, let's assume that it is. Why aren't the dentists already moving in this direction? Why aren't they talking more?

My interest/argument would be in finding out what it is about the systems and processes within which they work that prevents them from reducing isolation, and in identifying the interesting ways you could reduce those barriers. The question is: if getting rid of isolation is an important direction, why aren't the systems and processes the dentists use in their job helping them get there?

For me this isn't about setting up a community of practice, as that doesn't fundamentally address the barriers in the system that are creating the isolation. Participating in a CoP might create a reason to overcome the isolation, but it doesn't remove the barriers.

Some examples

The following is an attempt to provide some more concrete examples of what I mean by focusing on the prison dentists case. Rather than seriously suggesting specific solutions, the idea here is to give a taste of how small, diverse and widespread the changes might be.

Markus describes a system that includes under-staffing, logistical difficulties, difficulties sourcing equipment and materials, and under-funding. From that I get a picture of dentists who are overworked by a system that continually shows it doesn't really value what they do.

Change? Have the prison hierarchy do something that (continually) illustrates to the dentists that their services are valued (e.g. give all prison dentists the choice car parking spot).

Would folk in such a context have any sense of ownership or control over the work they do?

Change? Get prison hierarchy buy-in to engage in a social network stimulation type project involving the dentists and others within the system. Perhaps focused on oral health?

Would folk in such a context have the time, motivation and technology available to talk to each other?

Change? Identify some dentistry good practice that requires collaboration and is broadly accepted. Give them all iPads (sexy technology they can collaborate with, but also use for other things) as part of a project that seeks to embed the selected practice into their everyday lives.

This is definitely echoing the alignment project (internal echo chamber?).

Another thought: do they need to communicate with other prison dentists? What about other dentists, or other prison workers?

Some tweaks to the indicators block

Yesterday’s post introduced Col’s initial work on the indicators block. This post reports on some minor tweaks I’ve been doing this afternoon, trying to find escape in something concrete.

Setting the title

As reported in the last post, the block ended up having a title [[Indicators]]. This was because get_string was being used to set the title but the necessary language file (from which to source the string) was not created.

First fix is to just hard code the title.

[sourcecode lang="php"]
$this->title = "Indicators"; // get_string('Indicators', 'block_indicators');
[/sourcecode]

That works. But the proper solution would be to figure out where the “lang” file should go for a block. According to this, it should be lang/en/block_BLOCKNAME.php. Small problem: that should be lang/en_utf8, not lang/en (as per here).

[sourcecode lang="php"]
$this->title = get_string("indicators","block_indicators");
[/sourcecode]
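For the record, the language file that get_string() call expects would presumably contain something like the following (a minimal sketch; the real file may define more strings):

```php
<?php
// blocks/indicators/lang/en_utf8/block_indicators.php
// Maps the string key used by get_string() to the displayed title.
$string['indicators'] = 'Indicators';
```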

Will commit those changes.

How to distinguish user roles?

The block was using roleid=5 as a way to identify students. I believe this is a deprecated approach, so I need to find a better way. In my wonderings, I came across an approach that uses the has_capability function along with some “legacy” capabilities for student, staff, guest and admin. The following is an early example from the block.

[sourcecode lang="php"]
if ( has_capability( 'moodle/legacy:teacher', $context ) ) {
    print "This is a teacher<br />";
} else if ( has_capability( 'moodle/legacy:student', $context ) ) {
    print "This is a student<br />";
    $canview = 0;
}
[/sourcecode]

What’s next

At least in my head, the plan is to enable different groups of users to see different sets of “indicators”, where an “indicator” is a single graphic. This means that we need a good way to:

  • distinguish between different users;
  • call different code to generate the indicators for the different users;
  • distinguish which indicator a user wishes to see;
  • call different code based on the indicator.

A nice structure to do that might be next on the list. If I were working in Perl, I'd be doing this with a factory class. Should we go OO in PHP?

Some other tasks:

  • I’m getting errors when running the block as a teacher.
    [sourcecode lang="bash"]
    Table 'moodle.m_course' doesn't exist

    select (count(*)/count(distinct(userid))) from mdl_log where course='4' and userid='3' and action in ('add discussion','add post','update post') and course in (select id from m_course where idnumber like '%2010') and userid in ( select userid from m_role_assignments where roleid !='5' and contextid in (select id from m_context where contextlevel='50'))
    line 686 of lib/dmllib.php: call to debugging()
    line 379 of lib/dmllib.php: call to get_recordset_sql()
    line 71 of blocks/indicators/block_indicators.php: call to count_records_sql()
    line 317 of blocks/moodleblock.class.php: call to block_indicators->get_content()
    line 341 of blocks/moodleblock.class.php: call to block_base->is_empty()
    line 338 of lib/blocklib.php: call to block_base->_print_block()
    line 276 of course/format/weeks/format.php: call to blocks_print_group()
    line 229 of course/view.php: call to require()

    Warning: Division by zero in /Applications/XAMPP/xamppfiles/htdocs/moodle/blocks/indicators/block_indicators.php on line 80

    Warning: Division by zero in /Applications/XAMPP/xamppfiles/htdocs/moodle/blocks/indicators/block_indicators.php on line 86
    [/sourcecode]

  • What should an admin user see when they view the block?
  • Check the performance of the existing SQL code and think about what we might need to do to significantly reduce its cost.
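On the first of those errors: the query above mixes two table prefixes (mdl_log versus m_course, m_role_assignments, m_context), so the m_-prefixed tables won't exist in an installation whose prefix is mdl_. A hedged sketch of the likely fix, building every table name from the configured prefix ($CFG is faked here, and the simplified query is illustrative rather than the block's actual SQL):

```php
<?php
// Sketch of the suspected fix for "Table 'moodle.m_course' doesn't exist":
// derive every table name from $CFG->prefix instead of hard-coding two
// different prefixes. $CFG is faked; in Moodle it is the global config.

$CFG = new stdClass();
$CFG->prefix = 'mdl_';

$p = $CFG->prefix;
$sql = "SELECT (COUNT(*)/COUNT(DISTINCT(userid)))
        FROM {$p}log
        WHERE course = '4'
          AND course IN (SELECT id FROM {$p}course
                         WHERE idnumber LIKE '%2010')";

// Every table now shares the configured prefix.
echo $sql, "\n";
```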

Qualms about the alignment project

Yesterday, I posted a draft of an application for what is currently being called the alignment project. Stephen Downes has commented on the alignment project in his OLDaily. Stephen's comments are:

This is totally not my approach, but a careful and detailed articulation of an alternative. It should be considered… I would criticize the duality the approach presupposes, between ‘quality’ (alignment) and the disorganized ‘lone wolf’ approach to teaching.

Stephen is not alone in having qualms about the project. I have some as well. This is an attempt to make those qualms explicit and see if I can develop an argument/perspective that addresses at least some of them. After a week or two of developing the application draft, I’m a bit too close to the idea. I need to be more critical so that the idea can be improved.

I welcome suggestions and arguments, especially those targeting weaknesses or mistakes in the application.

Aside: people wonder why I post to this blog. The prime reason is exactly the type of comment Stephen has made and what it encourages and enables me to think about. The duality Stephen mentions is important and not something I would have thought closely about without his spark.

Summary

My response to Stephen’s criticism is that I recognise that this duality is a big problem. It’s also the most likely outcome of the project, i.e. those lone-wolves who aren’t seen as being “aligned” are also seen as “poor quality”. My intent, however, is to use alignment as an idea acceptable to higher education that can be used to modify a broken system to increase the level of reflection and discussion around L&T that occurs as part of the everyday practice of teaching academics. That “alignment” is a means, not the end.

I also think it unlikely that this is what will happen.

Qualms about constructive alignment

The alignment project as described draws heavily on John Biggs' work on constructive alignment. I've always had two qualms about constructive alignment:

  • Assumption of plan-driven or teleological design.
    Constructive alignment is a teleological design process. It assumes you can identify the outcomes at the start and use them as a basis to identify/design the actions necessary to get students there. A number of aspects of learning, even in its limited form of university-based learning, make me doubt whether this is really all that possible. The next point picks up on human agency, but other aspects include the inherent diversity in learners' backgrounds, capabilities and aims. Even in higher education the talk is of the increasing diversity of students. Given that diversity, how do you claim to identify a set of outcomes that is suitable for all, let alone develop activities and assessments that are suitable for all?

    I have a long standing preference for ateleological design processes and have a (potentially forlorn) hope that the alignment project might be more ateleological than teleological.

  • Assumption that you can “force” students to learn.
    Here's a quote from Biggs (2001) that illustrates the assumption that troubles me:

    In aligned teaching, where all components support each other, students are
    “trapped” into engaging in the appropriate learning activities, or as Cowan (1998) puts it, teaching is “the purposeful creation of situations from which motivated learners should not be able to escape without learning or developing” (p. 112). A lack of alignment somewhere in the system allows students to escape with inadequate learning.

    At one level I'm worried about this perspective because of what words such as “trapped” and “should not be able to escape” can mean, and what they say about the people in charge of such a system.

    My more pragmatic problem with this perspective is that I don’t think it can work. People always have some level of agency. Many university students are highly pragmatic, they will use their agency to subvert the system to achieve their ends with means they find acceptable. I’m not convinced that even the best constructively aligned course can escape the effects of compliance and task corruption.

Qualms about the reflective institution

The alignment project is essentially aimed at implementing something that approaches Biggs' (2001) idea of a reflective institution. The application gives a summary of the stages involved in achieving Biggs' goal. I've actually written about the idea previously; back in February last year I wrote:

However, the detail of his suggested solution is, I think, hideously unworkable to such an extent as likely to have a negative impact on the quality of teaching if any institution of a decent size tried to implement it. As Biggs (2001) says, but about a slightly different aspect, “the practical problems are enormous”.

I've been involved enough with the underbelly of teaching and learning at universities to have a significant amount of doubt about whether the reality of learning and teaching matches this representation to the external world. I've seen institutions struggle with far simpler tasks than the above, and individual academics and managers “game the system” to be seen to comply while not really fulfilling (or even understanding) the requirements.

The project’s assumptions

In my head, the project is based on the following assumptions:

  • For most academics, the majority of teaching is copying a previous course, making some minor modifications and teaching it.
  • Preparation for teaching is generally driven by administrative deadlines (the bookshop needs to order textbooks on date X, teaching starts on Y etc) and systems (you use system X to order the textbook, the LMS to create your course site).
  • Most, if not all, of these systems do not encourage or enable the academic to think about the concepts of learning underpinning these decisions. They just have to choose the textbook, make sure the assignment is different from last year and ensure that there are no egregiously out of date references in the lectures.
  • Consequently, most academics (upwards of 50%) just do what they did last time, teach the way they were taught. (There are exceptions, but they are the minority).
  • If you add into these systems some minor tweaks that encourage and enable academics to reflect on the conceptions of learning within the course, and provide some appropriate support, then you might encourage the majority to start reflecting and eventually improving their teaching.

To some extent the project is based on the nature of the teaching context at the participating institutions. For example, my current institution has specified that every course will have a Moodle course site.

Qualms about the project and some possible responses

The following are the qualms I can think of about the project. Can you suggest more? For each of these, I've tried to describe what I think is a response.

It will die within a week

Qualm: The alignment project is still in pre-application days. There are on-going discussions with various institutional leaders about whether or not they think that this is an idea that they can support. There’s always a chance that by this time next week (or not long after) the project will be dead due to lack of support.

Response? If that happens, I’m hoping we can continue with the project on a smaller scale. Perhaps with just a single program or two at my current institution to explore some of the ideas and their impacts. At the very least, I’m interested in how/if some form of curriculum mapping can be put into Moodle.

Teleological design

Qualm: As I've stated above, I don't think teleological design works. In particular, I don't think this is the way most academics approach the design of their teaching. The draft alignment project application actually cites literature that shows academics mostly making minor changes to existing courses. This post expands on this literature. When I've seen constructive alignment in action, it's typically been as part of a large redesign of a course. For all sorts of reasons I think this is a failing.

Response? The methodology expressed in the project proposal recognises this and seeks to introduce the question of alignment in a way which fits with the focus academics have on minor modifications to existing teaching. The hope is that by making alignment a visible part of the tools and support around making minor modifications, then staff may start thinking about alignment and be able to make minor modifications that improve alignment. The aim is not massive redesign to ensure alignment. It’s about minor changes that improve alignment.

And, if I’m honest, the aim for me is not really to get them implementing constructive alignment. It’s just to create an environment that encourages and enables academics to reflect on their teaching and how they are doing it.

Using it as a stick

Qualm: This is one of my biggest fears. Theoretically the project will result in the alignment (or lack thereof) being readily visible to all folk associated with a course/program. This is going to include people in formal leadership positions. This could very easily lead to it being used as a stick to beat the “bad” teachers. In Biggs' (2001) words, a focus on the teacher, rather than the teaching.

Response? The only defense I can see against this is ensuring that the people in those leadership positions are intelligent folk who can see the problems with this. Or, perhaps at the least, ensuring that their actions are visible enough so that other, more enlightened folk can mitigate their effects. In some situations, I'm not sure we'd be able to convince teaching academics of the potential success of this approach.

In the project, I think this can be addressed by having people on the project reference group who are broadly recognised as experts in this field and having them interact closely with the members of the institutional steering committees (containing formal leaders). Hopefully during the project they can learn the lessons which inform later practice.

Corruption

Qualm: In terms of a likely outcome, I can see – given my comments about human agency above – that some/most academics would employ task corruption once the alignment project was in place. Task corruption is where a group or individual, consciously or unconsciously, adopts an approach to a task that either avoids or destroys the task. White (2006) talks about two approaches:

  • amputation – where parts of the task are no longer performed; and
  • simulation – the emphasis is on being seen to have done the task, not actually to have done it.

Response? In the end, I don't think there's anything that can prevent this from happening. All you can do is provide an environment in which the practice becomes valued. That's what I think this project is about: making alignment part of the culture, the way things are done. This will never be entirely successful, and may well not succeed at all. However, the responses currently in the project include:

  • making alignment and the responses to it visible;
    Much current corruption occurs because university teaching is a primarily solitary act. There are questions around whether making it visible is a good thing, about forcing one's views on others. But then academia is supposed to be about peer review.
  • enable and encourage;
    The focus is on creating an environment that helps academics engage in this. If they aren’t engaging then the environment needs to be tweaked, hence the focus on action research. This needs to be on-going.

Technological gravity

Qualm: Related to the above is the conception of technological gravity that McDonald and Gibbons (2009) define and which I've posted about (and linked to edupunk). It is based on three major assumptions around learning and teaching:

  1. Technology I – different technologies automatically lead people to develop quality instruction.
    i.e. Moodle is an LMS designed with social-constructivist principles and it’s open source. If the institution adopts Moodle then the quality of instruction will improve.
  2. Technology II – different techniques/methods lead people to develop quality instruction.
    i.e. if all our courses are designed using constructive alignment, then student learning outcomes will improve.
  3. Technology III – characteristics of a local situation are used to identify the technologies and methods that will have a practical, positive influence in solving a defined problem/improve learning.
    i.e. this is the bit I think has some connections with edupunk – perhaps what Stephen describes as the lone-wolf approach above.

Technological gravity is defined as the force that seems to suck people and institutions away from Technology III and towards Technology I and II. This project is just as likely to suffer from technological gravity as anything else. McDonald and Gibbons (2009) identify three reasons for technological gravity:

  1. distracted focus;
    i.e. the institution has to get ready for a quality audit and needs to focus on that.
  2. status quo adherence;
    i.e. the changes introduced are re-interpreted (or mis-interpreted) and slightly adapted to fit with current practice. “Of course, my 3-hour lecture on the basics of theory helps the students develop critical evaluation skills.”
  3. over-simplification.
    i.e. doing X is too hard, we need to make it simpler for folk to do routinely. By making it simpler, something is lost.

Responses? In terms of focus, the aim of the project is to embed this in the institutional systems so that it just happens. Buy-in of leadership, building it into institutional systems (the LMS) etc. are all steps being taken. This will be hard.

With status quo adherence, if the project works then this should hopefully be the type of problem the quality enhancement process focuses on. That process is focused on encouraging people to reflect, in part visibly.

In terms of over-simplification, this is perhaps where the quality feasibility stage comes in. Mm, weak.

The quality and lone wolf duality

Qualm: Stephen’s qualms include

the duality the approach presupposes, between ‘quality’ (alignment) and the disorganized ‘lone wolf’ approach to teaching.

I think this is based on the idea that any disorganised “lone wolf” approach to teaching is by definition not aligned and consequently can't be thought of as quality teaching.

Response: I don't think the project can respond to this. The first stage of Biggs' (2001) reflective institution (and consequently of this project) is to make the quality model clear. The model in this case is some sense of alignment. Being aligned is, by definition, quality.

If you strongly believe in constructive alignment, then this is probably not a problem. However, as I outlined above, I have qualms about constructive alignment. If you asked most people around here whether I am the lone-wolf or the quality-focused/aligned kind, most would answer lone-wolf. Given these qualms, how can I justify this project and my involvement with it?

Having thought about this, my current response has two themes:

  • the alternatives are even worse;
    Anything has to be better than what I see as increasingly common practice within Australian higher education. I’ll pick up on this more below.
  • I’m not as dogmatic as Biggs.
    In the following, I'll argue that my perspective is not as black and white as the duality Stephen has identified. Instead, I'm taking a more gray perspective, perhaps seeing the quality model not as a dichotomy or end-point, but as encouraging a dialectic.

My impression of Biggs (solely from his writings) and of some of the constructive alignment practitioners I've met (perhaps they influence my perspective of Biggs) is of a very dogmatic perspective. Alignment provides the answer, and the answer is good. If you are not a follower of the answer, you are a heathen. If you follow alignment, you need to re-design your course so that it is 100% aligned – as approved by the constructive alignment church. There's almost a touch of Technology II about constructive alignment.

To me, learning and teaching is much more complex. I can see how a perfectly aligned course could have horrendously horrible outcomes, depending on the context. I can also see how an apparently mis-aligned course can generate good quality outcomes. For me, the aim of the alignment project is not to achieve perfectly aligned courses and hence quality student learning outcomes. The aim of the project is to modify everyday teaching practice so that it encourages and enables academics to start asking questions, rather than simply following administrative processes. Am I trying to make use of a Technology III perspective of alignment?

So, why use alignment at all? Because the system is broken – I pick this up in the next section.

So why do it

All of the above has got me thinking about why I’m pushing this project. Here are my answers.

I have a job

At the simplest and most pragmatic level, I have a job. The institution pays me to do certain things that it deems important and ALTC grants are pretty high on the list of importance. If I want to continue to have a job, I need to demonstrate I’m fulfilling their goals.

That’s not sufficient though. I also think I could enjoy the job (eventually) and generate some benefit broader than my personal employment.

The system is broken

As it happens, earlier this week I was listening to this discussion Stephen had in Argentina. One of the topics covered was that, in his opinion, the current educational system within which his “audience” were working is broken. One point being that it is very difficult, if not impossible, to use a lot of his approaches within such a broken system.

In terms of the higher education sector in Australia, I agree with Stephen, the system is broken. Even worse, for some time I have been dismayed at the increasingly prevalent broken approaches that are being adopted in the quest for improving the quality of L&T within that broken system. It’s a common theme on this blog. What is currently being done within Australian universities to improve L&T will, at best, offer slight improvements for the folk who were already improving, or at worst, significantly decrease the overall quality of L&T (while at the same time showing “evidence” of improvement).

At this stage I come back to some thinking about inside-out versus outside-in that was sparked by questions from Leigh Blackall. In my job I think I have to come up with approaches that can change the system from the inside-out. It’s an aim likely to fail.

To do this, you have to have some connection with what is being done within the system. Alignment is broadly and generally unquestioningly accepted within higher education, especially amongst the leadership, which will make this project attractive. More importantly, in my experience most people can, from a common sense perspective (“Common sense is the collection of prejudices acquired by age eighteen”), accept the idea of alignment as a good thing. It’s also a fairly simple thing to understand (though difficult to implement).

What this means is:

  1. Management can see the rationale for this and how it fits with the external demands they are having to deal with.
  2. Teaching academics can (hopefully) see the initial sense of the idea, at least enough to start talking about it.

i.e. this helps get the idea accepted and helps the project introduce into everyday practice some discussion about L&T that moves beyond administrative tasks. It helps introduce a change that might move the system in the right direction (but won’t fix it).

In summary, alignment is a means to an end. It’s not what is fundamental about this project. What is fundamental is encouraging a bit more reflection around L&T into everyday practice. I don’t really care how aligned the courses are, as long as academics are working in an environment that helps them reflect on their L&T and do something about it.

Of course, translating that view into reality, and avoiding alignment being seen as an end, is another story altogether.

References

Biggs, J. (2001). The Reflective Institution: Assuring and Enhancing the Quality of Teaching and Learning. Higher Education, 41(3), 221-238.

McDonald, J., & Gibbons, A. (2009). Technology I, II, and III: criteria for understanding and improving the practice of instructional technology. Educational Technology Research and Development, 57(3), 377-392.

White, N. (2006). Tertiary education in the Noughties: the student perspective. Higher Education Research & Development, 25(3), 231-246.

Getting started with Col's indicators block

Col has been playing around with some ideas for a Moodle indicators block. This is a record of my first attempt to install and play with the block. Might also do a bit of reflection and setting up of processes etc so we can go further with this.

The long term goal is to promote the Indicators project, help some folk and do some research.

Warning: Much of the following is intended only for the indicators project team. At this stage, there’s probably not a lot of value in anyone outside the project trying to use the block. It’s very early days.

Installing the block

Install the single PHP file provided by Col in the right place in my local Moodle install, set permissions, visit the “admin” page for my Moodle install and she’s right to go. Go to a dummy course, log in as a staff member, add the block and it’s all working. The block currently shows some idea of effort on the part of students, so logged in as a staff member I don’t see much. Log in as a dummy student and this is what I see. (Click on it to see a bigger version)

Indicators block version 0

It seems to work, though with a few errors. The dummy student I’m using hasn’t done a lot and the arrow indicates that. The errors include:

  • The PHP error re: undefined variable.
  • The [[Indicators]] as the label.
  • The quite large amount of screen space being taken up by the right hand block column – only since the indicators block was added.
  • The white background for the graph, rather than transparent.

The aim is to make this open source and let anyone work on it – or at least anyone in the indicators project as a first step. This means we need to get this under version control.

New code – effort tracking during early stages

Col’s just sent some new code, installed it and refreshed the page for the dummy student. I get the following

Next step in indicators block

The background colour has been improved. However, the interesting observation is that the one page reload has catapulted this student from a fairly low effort level to a fairly high effort level.

My first guess, without even having looked at the code, is that this is because it’s a dummy course: there are no real students and I only use it occasionally for testing. This means very low levels of usage by “students”. At these levels, depending on the maths used, a single extra page refresh can make a huge difference.

This is something the block should recognise and address. Some solutions might include:

  • Having a “too low to show” option, so that effort isn’t tracked in a state of low usage.
  • Or showing that overall usage is low and liable to wild swings. Perhaps a visible “confidence” level that indicates how confident the block is that it is showing you something meaningful.

Putting the block under git

If we’re going to work collaboratively on this, and allow other people to use it, we need some sort of support for version control and a range of other features. I’ve been using git and github for BIM, so I think we should use those for the block. I’m still a newbie at this, but I’m slightly ahead of the other guys in the indicators project. So the following shares what I did to get this up and going in the hope that it is useful for them and that they (and others) can pick up any errors I made.

Getting started

I’ve only done this once before, a month or so ago, and can’t remember anything. So, I’m starting with the github help.

I’ve already set up my laptop to use github, which from memory involved: creating a github account, setting up some environment stuff and generating some ssh keys. Just follow the guides in the right hand menu on help.github.com. I’ve also just found the learn.github site.
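From memory, that one-time setup amounts to something like the following. This is a rough reconstruction – help.github.com is the authoritative guide, and the name, email and key path below are placeholders:

```shell
# Rough reconstruction of the one-time github setup -- follow the
# guides at help.github.com for the real thing. Name/email are placeholders.
export HOME="$(mktemp -d)"    # demo only: keeps this sketch out of your real config

# identify yourself to git (this is what shows up in commit logs)
git config --global user.name "Your Name"
git config --global user.email "you@example.com"

# generate an ssh keypair; the contents of id_rsa.pub get pasted into
# github's account settings page for SSH public keys
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -N "" -q -f "$HOME/.ssh/id_rsa" -C "you@example.com"
```

Without the name/email configuration, commits get attributed to whatever your machine guesses; without the key, github will refuse pushes over ssh.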

The process

Here’s what I did

  • initialised a new git repo for the block;
    [sourcecode language=”bash”]
    bash$ cd blocks/indicators
    bash$ git init
    Initialized empty Git repository in /Applications/XAMPP/xamppfiles/htdocs/moodle/blocks/indicators/.git/
    [/sourcecode]
    This is the empty git repository
  • add and commit the file
    [sourcecode language=”bash”]
    bash$ git add block_indicators.php
bash$ git commit -m 'initial commit'
    [master (root-commit) c1a7051] initial commit
    1 files changed, 103 insertions(+), 0 deletions(-)
    create mode 100644 block_indicators.php
    [/sourcecode]
  • Quick double check
    [sourcecode language=”bash”]
    bash$ git log
    commit c1a70517f09d2f86de53e9e1c6a056d864e7622d
    Author: David Jones <davidthomjones@gmail.com>
    Date: Thu May 13 10:17:50 2010 +1000

    initial commit
    [/sourcecode]

  • Add a new repository on github
Actually, when you create a new repository, github presents you with the full set of instructions. (A point I didn’t remember is to make the name of the project match the folder name.) Part of those instructions include what I’ve already done. Here’s the rest.
    [sourcecode language=”bash”]
    bash$ git remote add origin git@github.com:djplaner/indicators.git
    bash$ git push origin master
    Counting objects: 3, done.
    Delta compression using up to 2 threads.
    Compressing objects: 100% (2/2), done.
    Writing objects: 100% (3/3), 1.63 KiB, done.
    Total 3 (delta 0), reused 0 (delta 0)
    To git@github.com:djplaner/indicators.git
    * [new branch] master -> master
    [/sourcecode]
  • github likes to have a README file, so let’s add one.
    [sourcecode language=”bash”]
    bash$ vi README
    bash$ git add README
    bash$ git commit -m "Added readme"
    [master a08e84a] Added readme
    1 files changed, 5 insertions(+), 0 deletions(-)
    create mode 100644 README
    bash$ git push origin master
    Counting objects: 4, done.
    Delta compression using up to 2 threads.
    Compressing objects: 100% (3/3), done.
    Writing objects: 100% (3/3), 419 bytes, done.
    Total 3 (delta 0), reused 0 (delta 0)
    To git@github.com:djplaner/indicators.git
    c1a7051..a08e84a master -> master
    [/sourcecode]
  • I have a second local Moodle install, here’s how I get the code from git into the appropriate place for that second Moodle install
    [sourcecode language=”bash”]
    bash$ cd cm_moodle/blocks
    bash$ git clone git://github.com/djplaner/indicators.git
    Initialized empty Git repository in /Applications/XAMPP/xamppfiles/htdocs/cm_moodle/blocks/indicators/.git/
    remote: Counting objects: 6, done.
    remote: Compressing objects: 100% (5/5), done.
    remote: Total 6 (delta 0), reused 0 (delta 0)
    Receiving objects: 100% (6/6), done.
    bash$ cd indicators
    bash$ ls
    README block_indicators.php
    [/sourcecode]
    At this stage the clone has already created a full local git repository, so it’s ready to work with locally – commit as usual and push changes back (given write access).

  • Let’s fix that undefined variable problem, caused by a variable not being declared/initialised before an “if”, and then update github.
    [sourcecode language=”bash”]
    bash$ vi block_indicators.php
    bash$ git commit -a -m "fixed undefined canview"
    bash$ git push --mirror git@github.com:djplaner/indicators.git
    Counting objects: 5, done.
    Delta compression using up to 2 threads.
    Compressing objects: 100% (3/3), done.
    Writing objects: 100% (3/3), 341 bytes, done.
    Total 3 (delta 1), reused 0 (delta 0)
    To git@github.com:djplaner/indicators.git
    a08e84a..caf4b54 master -> master
    * [new branch] origin/HEAD -> origin/HEAD
    * [new branch] origin/master -> origin/master
    [/sourcecode]
  • Okay, now to get that change showing up in my original indicators block
    [sourcecode language=”bash”]
    bash$ git pull origin master
    From github.com:djplaner/indicators
    * branch master -> FETCH_HEAD
    Updating a08e84a..caf4b54
    Fast forward
    block_indicators.php | 1 +
    1 files changed, 1 insertions(+), 0 deletions(-)
    [/sourcecode]
    A similar thing on my second Moodle install (which I think others will be doing) would’ve been just “git pull”.
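Pulling the steps above together, the ongoing loop for the project team is just edit, commit, push, and pull on the other installs. A sketch of that loop, using a local bare repository to stand in for github so it runs offline – the paths and the commit are illustrative only:

```shell
# The ongoing loop -- edit, commit, push, pull -- sketched end-to-end.
# A local bare repository stands in for github; paths are illustrative.
cd "$(mktemp -d)"
HUB="$PWD/hub.git"                              # stand-in for github
git init --bare "$HUB"

git clone "$HUB" moodle/blocks/indicators       # first Moodle install
cd moodle/blocks/indicators
git config user.name "Demo" && git config user.email "demo@example.com"
echo '<?php // block with $canview fix' > block_indicators.php
git add block_indicators.php
git commit -m "fixed undefined canview"
git push origin HEAD                            # publish the change
cd ../../..

git clone "$HUB" cm_moodle/blocks/indicators    # second Moodle install
cd cm_moodle/blocks/indicators
git pull                                        # picks up everyone's pushes
```

The second clone ends up with the fixed file without anyone mailing code around, which is the whole point of moving off emailed PHP files.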

Reflections and what’s next

That all seems to work. I do realise that I’ve probably just identified some of the basic commands without really grokking the full capabilities of git and github. Perhaps I’ll learn.

Tasks to do later on include:

  • Identify how to add the other members of the indicators project to the github repository (so they have write access).
  • Take a look at Col’s code and start thinking about what we can do to add features.

More thinking about the alignment project

The following is the latest, and the first close-to-complete (but not quite there), draft of the proposal explaining the alignment project. While informed by good discussions with a range of folk, it is still a bit limited. It should be improved over the next couple of weeks.

Even if the application doesn’t get off the ground it has helped me make connections with a range of different bodies of work (complex adaptive systems, connectivism, distributive leadership and distributed cognition), some of which I’ve been aware of and some I’ve ignored. It has helped develop my interest in thinking about how to combine some of the principles underpinning these bodies of work with behaviour change, hopefully to do some interesting things in the future.

As always, any comments/suggestions are more than welcome.

Executive summary

The aim of this project is to build distributive leadership capacity into institutional systems and processes to encourage and enable alignment and quality enhancement. It aims to make consideration of alignment a regular, transparent and integrated part of common teaching practice, supported by effective systems and processes. The project aims to fulfil the suggestion by Biggs (1996) that attempts to enhance teaching should seek to address the system as a whole, rather than simply adding “good” components such as new curriculum or methods. It seeks to build distributive leadership to empower academics to actively engage in alignment and move towards achieving what Biggs (2001) calls ‘the reflective institution’.

For most teaching academics, the consideration of alignment in their courses and programs is not a part of everyday teaching practice. Consideration of alignment is typically limited to events such as significant re-design of courses and programs, or visits from accreditation or quality assurance organisations. The dominant teaching experience for academics is teaching an existing course, generally one the academic has taught previously. In such a setting, academics spend most of their time fine tuning a course or making minor modifications to material or content (Stark, 2000). Given this focus, it does not appear surprising when Green et al (2009) report that “many academic staff continue to employ inappropriate, teacher-centered, content focused strategies”. If the systems and processes of university teaching and learning practice do not encourage and enable everyday consideration of alignment, is it surprising that many academics don’t consider alignment?

Instructional (Cohen, 1987), curriculum (Anderson, 2002) and constructive (Biggs, 1996) alignment are all built on a similar foundation: the recognition that student learning outcomes are significantly higher when there are strong links between those learning outcomes, assessment tasks, and instructional activities and materials. Cohen (1987) argues that limitations in learning are not mainly caused by ineffective teaching, but are instead mostly the result of a misalignment between what teachers teach, what they intend to teach, and what they assess as having been taught. The importance of achieving and demonstrating alignment with expected outcomes is also a central component of outcomes-based accreditation and quality assurance approaches that are increasingly widespread within higher education.

Consequently, the main tasks of this project are based on the three stages which Biggs (2001, p. 221) identified as encouraging institutional reflective practice. These are:

  1. Make explicit the quality model.
    Alignment should be explicit if it is to be seen as a key to quality student learning outcomes. The systems, technology, processes and support practices around learning and teaching should therefore enable and encourage alignment to be an everyday consideration. This support will enable: a) the level of alignment within a course, or group of courses, to be mapped and understood; and b) information about the alignment of a course or courses to be used in the everyday learning and teaching practice.
  2. Build in support for quality enhancement.
    An institution must also establish mechanisms that allow it to review and improve current practice, as it is not sufficient to simply make the quality model explicit (Biggs, 2001, p. 223). This stage aims to help teachers to ‘teach better’ through the provision of responsive, appropriate, and contextualised support that responds to insights gained as a result of a greater focus on alignment and other factors.
  3. Institute a process for quality feasibility.
    An institution can only enhance quality if it actively identifies and removes factors that inhibit quality learning (Biggs, 2001, p. 229). This requires formal leadership, processes and hierarchies at the participating institutions to be actively involved in the removal of these inhibiting factors. For the project this involves factors identified through the quality enhancement process and also, more broadly, factors inhibiting the project’s aim of building distributive leadership capacity.

This project will help teaching academics to more regularly consider alignment through context sensitive and collegial methods by building distributive leadership capacity into the participant institutions. This improved capacity will empower and encourage teaching academics to develop and grow their conceptions of teaching and learning and engage in ongoing improvement of teaching. This process is aided by the active removal of inhibiting factors. The combination of all these actions should lead to significant improvements in student learning outcomes.

Background and rationale

While it is common to describe leadership as a concept that eludes comprehensive definition (Southwell & Morgan, 2009), Parker (2008) suggests that some level of conceptual clarity around leadership within higher education has emerged from the ALTC leadership grants. This emerging view sees leadership in universities as inclusive and distributed, as opposed to the “deeply entrenched association of leadership with hierarchy and authority” (Parker, 2008). Lakomski (2005) argues that the growing recognition of distributed leadership within organisational theory is helping debunk the leader myth of traditional leadership theories. This project, like a number of previous ALTC Leadership projects, is based on the concept of distributed or distributive leadership.

Parrish et al (2008) define distributive leadership as the distribution of power through a collegial sharing of knowledge, of practice, and reflection within a socio-cultural context. Zepke (2007) argues that this is more than the delegation of tasks and responsibilities, and more than collaborative practice. Spillane et al (2004, p. 9) argue that, based on its foundations in distributed cognition and activity theory, distributive leadership is not limited to people, but can also be attributed to artefacts such as language, notational systems, tools and buildings. Leadership activity is distributed through an interactive web of actors, artefacts and situation (Spillane et al., 2004, p. 20). Spillane et al (2004, p. 11) define leadership as

the identification, acquisition, allocation, co-ordination, and use of the social, material, and cultural resources necessary to establish the conditions for the possibility of teaching and learning.

Over thirty years of research (Prosser, Ramsden, Trigwell, & Martin, 2003; Ramsden, Prosser, Trigwell, & Martin, 2007) has produced abundant empirical inquiry and theory that links the quality of student learning outcomes with: (1) the approaches to learning taken by students; (2) the students’ perceptions of the learning context; and (3) the approaches to teaching practiced by teaching staff. In turn, this research confirms the findings of other leadership studies by illustrating that variation in teaching approaches is associated with perceptions of the academic environment (Ramsden et al., 2007). As Biggs (1999) argues, it is the alignment of all aspects of the system that contributes to higher quality outcomes. Conversely, misalignment within an institutional system is likely to contribute to a lowering of quality outcomes. In particular, while pedagogues may hold a higher-level view of teaching, other contextual factors may prevent use of those conceptions (Leveson, 2004).

A fundamental assumption of this project is that there is a misalignment between the importance of instructional and curriculum alignment to student learning outcomes and its prevalence within the teaching and learning systems and processes of universities. This misalignment is seen as a major contributing factor to Barrie et al’s (2009) observation that despite significant espoused intentions around graduate attributes,

Australian universities have not generally been successful in deliberately and systematically refocussing the curriculum in ways that foreground the development of these attributes as opposed to the acquisition of factual disciplinary content or the accumulation of isolated and unrelated knowledge, skills and dispositions

This project aims to address this misalignment through making alignment a prevalent component of the teaching and learning systems of the participant institutions. It seeks to move consideration of alignment beyond a focus on program review or accreditation purposes, towards making consideration of alignment as a part of everyday teaching practice. To achieve this goal, the project must deal with a number of problems. The approaches this project will adopt to address these problems are described in the following.

Most teaching practice is not alignment focused

The practice of most academics does not separate planning from implementation, and rather than starting with explicit course objectives, starts with content (Lattuca & Stark, 2009). The dominant setting for academics is teaching an existing course for which they spend most of the time making minor modifications to material or content (Stark, 2000). For most staff teaching a course starts with the existing course materials such as outlines, assignments and website. The general description of these existing courses embedded in these materials may be non-specific and not systematically explain the content of teaching and the outcome of learning (Levander & Mikkola, 2009). This makes it difficult to understand just how aligned a course is both within itself and with other courses in the program. This problem is compounded by the increasing casualisation of academic staff that leads to a context where there is high staff turnover, lack of ownership and lack of institutional support (Green et al., 2009).

The project will embed consideration of alignment into everyday practice by modifying the main institutional learning and teaching information system used by teachers and students, the LMS. The intent is to map alignment of a subset of existing courses within the LMS through a collaborative process between teaching academics and support staff. As described above, standard practice for most academics is to copy the course site from the last offering and make minor modifications to material and activities. The LMS modifications will enable and encourage teaching academics to modify the alignment mapping of their course as they make these minor modifications. Importantly, the project also aims to identify and experiment with additional LMS modifications that enable teaching staff and students to make use of the alignment mapping within the LMS.

Teaching is an isolated, solitary practice

The norms of the higher education community encourage autonomy and independence (Uchiyama & Radin, 2009). Lowe and Marshall (2004) describe academic life as often isolated, noting that even when this isolation is overcome, few academics will discuss course design and teaching practices with peers. The planning and implementation of teaching has largely been a private issue, creating the possibility that the actual delivered teaching represents the teacher’s implicit, internalised knowledge and not that described in published course descriptions (Levander & Mikkola, 2009).

Enabling examination, comparison and discussion about the alignment and how it was achieved amongst groups of courses, teaching academics and other stakeholders is a major aim of the project. Initially this may focus on leveraging the alignment information for staff teaching courses within the same program, including program coordinators. The L&T support section below describes how the project hopes to enable and encourage connections between teaching academics and L&T support staff.

Alignment is difficult

Levander and Mikkola (2009) describe the full complexity of managing alignment at the degree level which makes it difficult for the individual teacher and the program coordinator to keep connections between courses in mind. von Konsky et al (2006) describe how the sharing of courses between programs and a variety of outcome types (e.g. graduate attributes and course, program, discipline accrediting body learning outcomes) significantly complicates curriculum design and review. In reporting on the status of curriculum mapping, a significant task associated with alignment, Willet (2008) reports on the need for more research on effective political and electronic strategies for the construction and maintenance of curriculum maps, especially those that improve faculty participation and buy-in.

The overarching aim of the project is to build distributive leadership capacity into the systems (mostly in the form of modifications to the LMS) and processes (mostly aimed at helping teaching staff overcome these difficulties) of the participant institutions. The project aims to reduce, if not remove, the difficulties associated with this task. It seeks to achieve this by adopting an action research methodology that draws heavily on the skills, experience and insights from a broad array of project participants. The action research methodology recognises that a major part of this project is focused on learning about these difficulties and how best to reduce them within the host institutions. The following table summarises how participant selection will help reduce the impact of difficulties.

  • Reference group
    Members: chosen due to expertise and experience gained from previous ALTC leadership grants (e.g. ???) and related alignment and mapping work (Lowe & Marshall, 2004; Oliver, Jones, Ferns, & Tucker, 2007).
    Responsibilities: critique and offer suggestions for improvement of project plans and results.
  • Institutional steering committees (1 per institution)
    Members: institutional members with expertise/responsibility for aspects of institutional strategic aims or operational environment.
    Responsibilities: planning how project activities are integrated into each institution, and fulfilling the quality feasibility task.
  • Project team
    Members: institutional L&T support staff with expertise and insight into alignment and related issues.
    Responsibilities: collaborating with and helping participating teaching academic staff map and respond to course alignment.
  • Teaching academic staff
    Members: teaching staff responsible for courses selected (using a process developed by the institutional steering committee and reviewed by the reference group) for participation in the project.
    Responsibilities: engage reflectively with the process and its outcomes.

Concerns around learning and teaching (L&T) support

Academics come to teaching with immense amounts of content knowledge but little or no knowledge of teaching and learning (Weimer, 2007). Given this limited knowledge, and the complexities and importance of learning and teaching knowledge, universities have provided various types of L&T support (e.g. staff development, instructional design etc). How this support is provided, and questions about its impact on the quality of L&T, remain problematic. Parker (2008) identifies the on-going tension between centralised and devolved L&T support. It is widely recognised that the activities and resources associated with L&T support are used by small numbers of teaching academics, and usually not those most in need of the support (The National GAP, 2009). Weimer (2007) argues that despite nearly 30 years of effort, L&T support roles have had little impact on the instructional quality of higher education.

By making alignment an everyday consideration of teaching practice, the project aims to directly address some concerns around L&T support by drawing on important insights from the literature. Numerous authors (Biggs, 1999; Prosser & Trigwell, 1999; Ramsden, 1998) have argued that the focus of L&T support should shift from techniques and technologies towards the facilitation and support of a more reflective approach to teaching. Encouraging reflection at all levels is a fundamental component of the project’s aim to move towards Biggs’ (2001) idea of the reflective institution. The quality enhancement task of the project is most closely associated with encouraging a reflective approach to teaching. Biggs (2001, p. 227) argues that the fundamental problem with L&T support is the focus on individual teachers, rather than on teaching. Following his approach, this project maintains the on-going focus on the alignment of courses, not on individual teachers. Boud (1999) argues that L&T support needs to be embedded within the context of academic work, that it needs to occur in or close to the teaching academics’ sites of practice. The aim of the quality enhancement phase is to make consideration of alignment an important site of practice for teaching academics and to provide the L&T support necessary as part of this site of practice.

Limitations of quality assurance

While outcomes-based quality assurance has been a prevalent component of higher education for a number of years, there remain significant concerns about how it is implemented and the subsequent outcomes. Raban (2007) observes that the quality management systems of most universities employ procedures that are retrospective and weakly integrated with long term strategic planning. He continues to argue that the conventional quality management systems used by higher education are self-defeating as they undermine the commitment and motivation of academic staff through an apparent lack of trust, and divert resources away from the core activities of teaching and research (Raban, 2007, p. 78). Barrie et al (2009) identify a bureaucratic approach to quality assurance as a potential contributor to the limited engagement of university staff in graduate attributes curriculum renewal. Biggs (2001) defines this type of quality assurance as retrospective and argues that its procedures are frequently counter-productive for quality and that most of its indicators concentrate on administrative procedures. He cites Bowden and Marton’s (1998) opinion that “retrospective QA actually damages teaching”.

Biggs’s (2001) conception of the reflective institution, with its use of prospective quality assurance, is presented as a solution that can make retrospective QA redundant. This project seeks to build distributive leadership capacity that enables the development of prospective quality assurance based around the everyday teaching practice of academics. Biggs (2001) defines prospective quality assurance as, in part, a bottom-up, systemic and supportive process with a priority on educational or scholarly outcomes. Such an approach focuses on the teaching, not the teacher. These characteristics have significant connections with Southwell and Morgan’s (2009) description of Fullan’s (2008) “new leadership”, which they describe as having many of the hallmarks of distributed leadership.

Long-term systemic change

As an attempt to build distributive leadership capacity, the fundamental problem facing the project is encouraging long-term, systemic change. The change should not disappear once the project completes; it should become part of everyday operations. To achieve long-term, systemic change the project will:

  1. Ensure participation of formal institutional leadership and integration with institutional priorities.
    Beyond simply expressing support for the project, this requires the active participation of formal institutional leadership roles in the institutional steering committees. These committees are responsible for developing the institutional implementation plans for two cycles of alignment embedding. The plans are intended to ensure that the project integrates appropriately with institutional priorities and practices. The committees are also tasked with Biggs’s (2001) quality feasibility task, which aims to increase institutional alignment.
  2. Adopt an action research perspective that is flexible and responsive.
    There is recognition that the type of fundamental change being attempted by this project is difficult, complex and replete with uncertainty. A critical success factor for the project is the ability to identify and respond to new insights. The project’s action research methodology and the very nature of Biggs’s (2001) idea of a reflective institution aim to achieve on-going learning and improvement.
  3. Maintain a scholarly, not bureaucratic, focus.
    As described above, the very nature of prospective quality assurance (Biggs, 2001) is bottom-up, systemic and supportive, with a priority on educational or scholarly outcomes.
  4. Modify an institutional information system.
    A fundamental enabler of this project is the presence of an information system, embedded into the everyday practice of learning and teaching (for both students and staff), that encourages and enables consideration of alignment. Rather than develop a stand-alone tool, this project seeks to modify the institutional LMS, a system to which the institutions are already significantly committed. Both participant institutions have adopted the open source Moodle as their institutional LMS. Because Moodle is open source, it is not only possible to make the changes; the subsequent changes will also become available to the broader Moodle community, which increases the likelihood of on-going support both within and beyond the participant institutions.

Project outcomes

The project aims to build leadership capacity within two institutions that enables consideration of alignment to become part of everyday teaching practice. The outcomes of that aim will include:

  • Within both institutions a number of courses that have had their instructional alignment mapped, made visible and reflected upon.
  • Increased availability and knowledge of resources around alignment and course mapping, especially those produced by ALTC projects, within the participant institutions.
  • For some of these courses, evidence of changes over time in the alignment and structure of the course.
  • Evidence of whether or not there have been changes in the conceptions of learning and teaching held by teaching staff participants.
  • Evidence of whether or not there have been changes in student learning experience or outcomes.
  • Availability of extensions to the Moodle LMS that enable the mapping of instructional alignment within and between courses.
  • Availability of extensions to the Moodle LMS that leverage course alignment information to provide a diverse collection of learning and teaching services.

Methodology

The project will use an eight-stage process that has at its core two action research cycles. Each action research cycle consists of three stages:

  • plan,
    The institutional steering committee with input from other institutional project members formulates a plan for the research cycle. Institutional plans are shared between participant institutions and reviewed by the reference group.
  • embed, and
    At its core, the project team works with selected teaching academic participants to map, understand and respond to the alignment within their courses. A key part of this stage will be identifying how the alignment information of a course, held within the LMS, can be leveraged to improve L&T. This stage will typically proceed over an entire term.
  • review.
    A formal process of reviewing what happened during the embed stage involving all project participants.

Given the two action research cycles, each with the above three stages, two stages remain. These focus on the broader tasks of establishing and completing the project.

References

Anderson, L. (2002). Curricular alignment: A re-examination. Theory into Practice, 41(4), 255-260.

Barrie, S., Hughes, C., & Smith, C. (2009). The national graduate attributes project: integration and assessment of graduate attributes in curriculum. Sydney: Australian Learning and Teaching Council.

Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347-364.

Biggs, J. (1999). Teaching for quality learning at university. Buckingham: Open University Press.

Biggs, J. (2001). The Reflective Institution: Assuring and Enhancing the Quality of Teaching and Learning. Higher Education, 41(3), 221-238.

Boud, D. (1999). Situating academic development in professional work: Using peer learning. International Journal for Academic Development, 4(1), 3-10.

Bowden, J., & Marton, F. (1998). The University of Learning. Oxford: Routledge.

Cohen, S. A. (1987). Instructional alignment: Searching for a magic bullet. Educational Researcher, 16(8), 16-20.

Fullan, M. (2008). The six secrets of change. San Francisco, CA: John Wiley and Sons.

Green, W., Hammer, S., & Star, C. (2009). Facing up to the challenge: why is it so hard to develop graduate attributes? Higher Education Research & Development, 28(1), 17-29.

Lakomski, G. (2005). Managing without Leadership: Towards a Theory of Organizational Functioning. Elsevier Science.

Lattuca, L., & Stark, J. (2009). Shaping the college curriculum: Academic plans in context. San Francisco: John Wiley & Sons.

Levander, L., & Mikkola, M. (2009). Core curriculum analysis: A tool for educational design. The Journal of Agricultural Education and Extension, 15(3), 275-286.

Leveson, L. (2004). Encouraging better learning through better teaching: a study of approaches to teaching in accounting. Accounting Education, 13(4), 529-549.

Lowe, K., & Marshall, L. (2004). Plotting renewal: Pushing curriculum boundaries using a web based graduate attribute mapping tool. Paper presented at the 21st ASCILITE Conference, Perth.

McKinney, L. (2010). Evaluability assessment: Laying the foundation for effective evaluation of a community college retention program. Community College Journal of Research and Practice, 34(4), 299-317.

Oliver, B., Jones, S., Ferns, S., & Tucker, B. (2007). Mapping curricula: ensuring work-ready graduates by mapping course learning outcomes and higher order thinking skills. Paper presented at the Evaluations and Assessment Conference. Retrieved 17 Feb, 2010, from http://www.eac2007.qut.edu.au/proceedings/proceedings_ebook.pdf.

Parker, L. (2008). Leadership for excellence in learning and teaching in Australian higher education: Review of the Australian Learning and Teaching Council (ALTC) Program 2006-2008. Sydney: Australian Learning and Teaching Council.

Parrish, D., Lefoe, G., Smigiel, H., & Albury, R. (2008). The GREEN Resource: The development of leadership capacity in higher education. Wollongong: CEDIR, University of Wollongong.

Prosser, M., Ramsden, P., Trigwell, K., & Martin, E. (2003). Dissonance in experience of teaching and its relation to the quality of student learning. Studies in Higher Education, 28(??), 37-48.

Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education. Buckingham: SRHE / Open University Press.

Raban, C. (2007). Assurance versus enhancement: less is more? Journal of Further and Higher Education, 31(1), 77-85.

Ramsden, P. (1998). Learning to Lead in Higher Education. London: Routledge.

Ramsden, P., Prosser, M., Trigwell, K., & Martin, E. (2007). University teachers’ experiences of academic leadership and their approaches to teaching. Learning and Instruction, 17(2), 140-155.

Southwell, D., & Morgan, W. (2009). Leadership and the impact of academic staff development and leadership development on student learning outcomes in higher education: A review of the literature. Sydney: Australian Learning and Teaching Council.

Spillane, J., Halverson, R., & Diamond, J. (2004). Towards a theory of leadership practice: a distributed perspective. Journal of Curriculum Studies, 36(1), 3-34.

Stark, J. (2000). Planning introductory college courses: Content, context and form. Instructional Science, 28(5), 413-438.

The National GAP. (2009). Key issues to consider in the renewal of learning and teaching experiences to foster graduate attributes. Sydney: The National Graduate Attributes Project.

Uchiyama, K. P., & Radin, J. L. (2009). Curriculum Mapping in Higher Education: A Vehicle for Collaboration. Innovative Higher Education, 33(4), 271-280.

von Konsky, B., Loh, A., Robey, M., Gribble, S., Ivins, J., & Cooper, D. (2006). The benefit of information technology in managing outcomes focused curriculum development across related degree programs. Paper presented at the 8th Australian Conference on Computing Education, Hobart, Australia.

Weimer, M. (2007). Intriguing connections but not with the past. International Journal for Academic Development, 12(1), 5-8.

Wholey, J. S. (2004). Evaluability Assessment. In J. S. Wholey, H. P. Hartry & K. E. Newcomer (Eds.), Handbook of practical program evaluation. San Francisco: Jossey-Bass.

Willett, T. (2008). Current status of curriculum mapping in Canada and the UK. Medical Education, 42(8), 786-793.

Zepke, N. (2007). Leadership, power and activity systems in a higher education context: will distributive leadership serve in an accountability driven world? International Journal of Leadership in Education, 10(3), 301-314.
