Assembling the heterogeneous elements for (digital) learning

Month: January 2015

Using the PIRAC – Thinking about an "integrated dashboard"

On Monday I’m off to a rather large meeting to talk about what data might be usefully syndicated into an integrated dashboard. The following is an attempt to think out loud about the (P)IRAC framework (Jones, Beer and Clark, 2013) in the context of this local project. Partly to help prepare me for the meeting, but also to ponder some recent thoughts about the framework.

This is still a work in progress.

Get the negativity out of the way first

Dashboards sux!!

I have a long-term negative view of the value of dashboards and traditional data warehouse/business intelligence type systems. A view that has arisen out of both experience and research. For example, the following is a slide from this invited presentation. There’s also a paper (Beer, Jones, & Tickner, 2014) that evolved from that presentation.

[Slide 19 from the presentation]

I don’t have a problem with the technology. Data warehouse tools do have a range of functionality that is useful. However, in terms of providing something useful to the everyday life of teachers in a way that enhances learning and teaching, they leave a lot to be desired.

The first problem is the Law of Instrument.

[Image: “Hammer ... Nail ...” by Theen …, on Flickr. Licensed under Creative Commons Attribution-Noncommercial-Share Alike 2.0 Generic]

The only “analytics” tool the institution has is the data warehouse, so that’s what it has to use. The problem is that the data warehouse cannot be easily and effectively integrated into the daily act of learning and teaching in a way that provides significant additional affordances (more on affordances below).

Hence it doesn’t get used.

Now, leaving that aside.

(P)IRAC

After a few years of doing learning analytics work, we put together the IRAC framework as an attempt to guide learning analytics projects: to broaden the outlook on what needed to be considered, especially what needed to be considered to ensure that the project outcome was widely and effectively used. The idea is that the four elements of the framework can help ponder what is available and what might be required. The four original components of IRAC are summarised in the following table.

IRAC Framework (adapted from Jones et al 2013)
Information
  • the information we collect is usually about “those things that are easiest to identify and count or measure” but which may have “little or no connection with those factors of greatest importance” (Norman, 1993, p. 13).
  • Verhulst’s observation (cited in Bollier & Firestone, 2010) that “big data is driven more by storage capabilities than by superior ways to ascertain useful knowledge” (p. 14).
  • Is the information required technically and ethically available for use?
  • How is the information to be cleaned, analysed and manipulated?
  • Is the information sufficient to fulfill the needs of the task?
  • In particular, does the information captured provide a reasonable basis upon which to “contribute to the understanding of student learning in a complex social context such as higher education” (Lodge & Lewis, 2012, p. 563)?
Representation
  • A bad representation will turn a problem into a reflective challenge, while an appropriate representation can transform the same problem into a simple, straightforward task (Norman, 1993).
  • To maintain performance, it is necessary for people to be “able to learn, use, and reference necessary information within a single context and without breaks in the natural flow of performing their jobs.” (Villachica et al., 2006, p. 540).
  • Olmos and Corrin (2012) suggest that there is a need to better understand how visualisations of complex information can be used to aid analysis.
  • Considerations here focus on how easy it is to understand the implications and limitations of the findings provided by learning analytics (and much, much more).
Affordances
  • A poorly designed or constructed artefact can greatly hinder its use (Norman, 1993).
  • To have a positive impact on individual performance an IT tool must be utilised and be a good fit for the task it supports (Goodhue & Thompson, 1995).
  • Human beings tend to use objects in “ways suggested by the most salient perceived affordances, not in ways that are difficult to discover” (Norman, 1993, p. 106).
  • The nature of such affordances is not inherent to the artefact, but is instead co-determined by the properties of the artefact in relation to the properties of the individual, including the goals of that individual (Young, Barab, & Garrett, 2000).
  • Glassey (1998) observes that through the provision of “the wrong end-user tools and failing to engage and enable end users” even the best implemented data warehouses “sit abandoned” (p. 62).
  • The consideration for affordances is whether or not the tool and the surrounding environment provide support for action that is appropriate to the context, the individuals and the task.
Change
  • Evolutionary development has been central to the theory of decision support systems (DSS) since its inception in the early 1970s (Arnott & Pervan, 2005).
  • Rather than being implemented in a linear or parallel fashion, development occurs through continuous action cycles involving significant user participation (Arnott & Pervan, 2005).
  • Buckingham-Shum (2012) identifies the risk that research and development based on data already being gathered will tend to perpetuate the existing dominant approaches from which the data was generated.
  • Bollier and Firestone (2010) observe that once “people know there is an automated system in place, they may deliberately try to game it” (p. 6).
  • Universities are complex systems (Beer, Jones, & Clark, 2012) requiring reflective and adaptive approaches that seek to identify and respond to emergent behaviour in order to stimulate increased interaction and communication (Boustani et al., 2010).
  • Potential considerations here include, who is able to implement change? Which, if any, of the three prior questions can be changed? How radical can those changes be? Is a diversity of change possible?

Adding purpose

Whilst on holiday enjoying the Queenstown view below and various refreshments, @beerc and I discussed a range of issues, including the IRAC framework and what might be missing. Both @beerc and @damoclarky have identified potential elements to be added, but I’ve always been reluctant. However, one of the common themes underpinning much of the discussion of learning analytics at ASCILITE’2014 was: for whom is learning analytics being done? We raised this question somewhat in our paper when we suggested that much of learning analytics (and educational technology) is mostly done to academics (and students), typically in the service of the needs of senior management or central services. But the issue was also raised by many others.

Which got us thinking about Purpose.

[Image: Queenstown view]

As originally framed (Jones et al, 2013)

The IRAC framework is intended to be applied with a particular context and a particular task in mind … Olmos & Corrin (2012), amongst others, reinforce the importance for learning analytics to start with “a clear understanding of the questions to be answered” (p. 47) or the task to be achieved.

If you start the design of a learning analytics tool/intervention without a clear idea of the task (and its context) in mind, then it’s going to be difficult to implement.

In our discussions in NZ, I’d actually forgotten about this focus in the original paper. This perhaps reinforces the need for IRAC to become PIRAC. To explicitly make purpose the initial consideration.

Beyond increasing focus on the task, purpose also brings in the broader organisational, personal, and political considerations that are inherent in this type of work.

So perhaps purpose encapsulates

  1. Why are we doing this? What’s the purpose?
    Reading between the lines, this particular project seems to be driven more by the availability of the tool and a person with the expertise to do stuff with it. The creation of a dashboard seems the strongest reason given.
    Tied in with this seems to be the point that the institution needs to be seen to be responding to the “learning analytics” fad (the FOMO problem). Related to this will, no doubt, be some idea that by doing something in this area, learning and teaching will improve.
  2. What’s the actual task we’re trying to support?
    In terms of a specific L&T task, nothing is mentioned.
  3. Who is involved? Who are they? etc.
    The apparent assumption is that it is teaching staff. The integrated dashboard will be used by staff to improve teaching?

Personally, I’ve found thinking about these different perspectives useful. Wonder if anyone else will?

(P)IRAC analysis for the integrated dashboard project

What follows is a more concerted effort to use PIRAC to think about the project. Mainly to see if I can come up with some useful questions/contributions for Monday.

Purpose

  • Purpose
    As above the purpose appears to be to use the data warehouse.

    Questions:

    • What’s the actual BI/data warehouse application(s)?
    • What’s the usage of the BI/data warehouse at the moment?
    • What’s it used for?
    • What is the difference in purpose in using the BI/data warehouse tool versus Moodle analytics plugins or standard Moodle reports?
  • Task
    Without knowing what the tool can do, I’m left pondering what information-related tasks are currently frustrating or limited. A list might include

    1. Knowing who my students are, where they are, what they are studying, what they’ve studied, and when they add/drop the course (in a way that I can leverage).
      Which is part of what I’m doing here.
    2. Having access to the results of course evaluation surveys in a form that I can analyse (e.g. with NVivo).
    3. How do I identify students who are not engaging, struggling, not learning, or doing fantastically, and intervene?

    Questions:

    • Can the “dashboards” help with the tasks above?
    • What are the tasks that a dashboard can help with that aren’t available via the Moodle reports?
  • Who
  • Context

What might be some potential sources for a task?

  1. Existing practice
    e.g. what are staff currently using in terms of Moodle reports and is that good/bad/indifferent?

  2. Widespread problems?
    What are the problems faced by teaching staff?
  3. Specific pedagogical goals?
  4. Espoused institutional priorities?
    Personalised learning appears to be one. What are others?

Questions:

  • How are staff using existing Moodle reports and analytics plugins?
  • How are they using the BI tools?
  • What are widespread problems facing teaching staff?
  • What is important to the institution?

Information

The simple questions

  • What information is technically available?
    It appears that the data warehouse includes data on

    • enrolment load
      Apparently aimed more at trends, but can do semester numbers.
    • Completion of courses and programs.
    • Recruitment and admission
      The description of what’s included in this isn’t clear.
    • Student evaluation and surveys
      Appears to include institutional and external evaluation results. Could be useful.

    As I view the dashboards, I do find myself asking questions (fairly unimportant ones) related to the data that is available, rather than the data that is important.

    Questions

    • Does the data warehouse/BI system know who’s teaching what when?
    • When/what information is accessible from Moodle, Mahara and other teaching systems?
    • Can the BI system enrolment load information drill down to course and cohort levels?
    • What type of information is included in the recruitment and admission data that might be useful to teaching staff?
    • Can we get access to course evaluation surveys for courses in a flexible format?
  • What information is ethically available?

Given the absence of a specific task, it would appear

Representation

  • What types of representation are available?
    It would appear that the dashboards etc. are being implemented with PerformancePoint, hence its integration with SharePoint (off to a strong start there). I assume the project is relying on PerformancePoint’s “dashboards” feature and what that can do. There would also appear to be a requirement for Silverlight to see some of the representations.

    Questions

    • Can the data warehouse provide flexible/primitive access to data?
      i.e. CSV, text or direct database connections?
  • What knowledge is required to view those representations?
    There doesn’t appear to be much in the way of contextual help with the existing dashboards. You have to know what the labels/terminology mean. Which may not be a problem for the people for whom the existing dashboards are intended.
  • What is the process for viewing these representations?

Affordances

Based on the information above about the tool, it would appear that there are no real affordances that the dashboard system can provide. It will tend to be limited to representing information.

  • What functionality does the tool allow people to do?
  • What knowledge and other resources are required to effectively use that functionality?

Change

  • Who, how, how regularly and with what cost can the
    1. Purpose;
      Will need to be approved via whatever governance process exists.
    2. Information;
      This would be fairly constrained. I can’t see much of the above information changing, at least not in terms of getting access to more or different data. The question about ethics could potentially mean that there would be less information available.
    3. Representation; and,
      Essentially it would appear that all of the dashboards can change, though any change will be limited by the specifics of the tool.
    4. Affordances.
      You can’t change what you don’t have.

    be changed?

Adding some learning process analytics to EDC3100

In Jones and Clark (2014) we drew on Damien’s (Clark) development of the Moodle Activity Viewer (MAV) as an example of how bricolage, affordances and distribution (the BAD mindset) can add some value to institutional e-learning. My empirical contribution to that paper was talking about how I’d extended MAV so that when I was answering a student query in a discussion forum I could quickly see relevant information about that student (e.g. their major, which education system they would likely be teaching into etc).

A major point of that exercise was that it was very difficult to actually get access to that data at all, let alone get access to it within the online learning environment for the course. At least if I had to wait for the institutional systems and processes to lumber into action.

As this post evolved, it’s become also an early test to see if the IRAC framework can offer some guidance in designing the extension of this tool by adding some learning process analytics. The rest of this post

  1. Defines learning process analytics.
  2. Applies that definition to my course.
  3. Uses the IRAC framework to show off the current mockup of the tool and think about what other features might be added.

Very keen to hear some suggestions on the last point.

At this stage, the tool is working but only the student details are being displayed. The rest of the tool is simply showing the static mockup. This afternoon’s task is to start implementing the learning process analytics functionality.

Some ad hoc questions/reflections that arise from this post

  1. How is the idea of learning process analytics going to be influenced by the inherent tension between the tendency for e-learning systems to be generic and the incredible diversity of learning designs?
  2. Can generic learning process analytics tools help learners and teachers understand what’s going on in widely different learning designs?
  3. How can the diversity of learning designs (and contexts) be supported by learning process analytics?
  4. Can a bottom-up approach work better than a top-down?
  5. Do I have any chance of convincing the institution that they should provide me with
    1. Appropriate access to the Moodle and Peoplesoft database; and,
    2. A server on which to install and modify software?

Learning process analytics

The following outlines the planning and implementation of the extension of that tool through the addition of process analytics. Schneider et al (2012) (a new reference I’ve just stumbled across) define learning process analytics

as a collection of methods that allow teachers and learners to understand what is going on in a learning scenario, i.e. what participants work(ed) on, how they interact(ed), what they produce(d), what tools they use(d), in which physical and virtual location, etc. (p. 1632)

and a bit later on learning scenario and learning process analytics are defined as

as the measurement and collection of learner actions and learner productions, organized to provide feedback to learners, groups of learners and teachers during a teaching/learning situation. (p. 1632)

This is a nice definition in terms of what I want to achieve. My specific aim is to

collect, measure, organise and display learner actions and learner productions to provide feedback to the teacher during a teaching/learning situation

Two main reasons for the focus on providing this information to the teacher

  1. I don’t have the resources or the technology (yet) to easily provide this information to the learners.
    The method I’m using here relies on servers and databases residing on my computer (a laptop). Not something I can scale to the students in my class. I could perhaps look at using an external server (the institution doesn’t provide servers) but that would be a little difficult (I haven’t done it before) and potentially get me in trouble with the institution (not worth the hassle just yet).

    As it stands, I won’t even be able to provide this information to the other staff teaching into my course.

  2. It’s easier to see how I can (will?) use this information to improve my teaching and hopefully student learning.
    It’s harder to see how/if learners might use any sort of information to improve their learning.

Providing this information to me is the low hanging fruit. If it works, then I can perhaps reach for the fruit higher up.

Learner actions and productions

What are the learner actions and productions I’m going to generate analytics from?

The current course design means that students will be

  1. Using and completing a range of activities and resources contained on the course site and organised into weekly learning paths.
    These actions are in turn illustrated through a range of data including

    • Raw clicks around the course site stored in system logs.
    • Activity completion.
      i.e. if a student has viewed all pages in a resource, completed a quiz, or posted the required contributions to a discussion forum they are counted as completing an activity. Students get marks for completing activities.
    • Data specific to each activity.
      i.e. the content of the posts they contributed to a forum, the answers they gave on a quiz.
  2. Posting to their individual blog (external to institutional systems) for the course.
    Students get marks for # of posts, average word count and links to other students and external resources.
  3. Completing assignments.
  4. Contributing to discussions on various forms of social media.
    Some officially associated with the course (e.g. Diigo) and others unofficially (student Facebook groups).

I can’t use some of the above as I do not have access to the data. Private student Facebook groups are one example, but more prevalent is institutional data that I’m unable to access. In fact, the only data I can easily get access to is

  • Student blog posts; and,
  • Activity completion data.

So that’s what I’ll focus on. Obviously there is a danger here that what I can measure (or in this case access) is what becomes important. On the plus side, the design of this course does place significant importance on the learning activities students undertake and the blog posts. It appears what I can measure is actually important.

Here’s where I’m thinking that the IRAC framework can scaffold the design of what I’m doing.

Information

Is all the relevant Information and only the relevant information available?

Two broad sources of information (a sketch of the possible shape of each follows the list)

  1. Blog posts.
    I’ll be running a duplicate version of the BIM module in a Moodle install running on my laptop. BIM will keep a mirror of all the posts students make to their blogs. The information in the database will include

    • Date, time, link and total for each post.
    • A copy of the HTML for the post.
    • The total number of posts made so far, the URL for the blog and its feed.
  2. Activity completion.
    I’ll have to set up a manual process for importing activity completion data into a database on my computer. For each activity I will have access to the date and time when the student completed the activity (if they have).
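
As a rough sketch, the records for the two sources might take shapes like the following. The field names are my assumptions for illustration, not BIM’s or Moodle’s actual schema.

[code lang="javascript"]
// Hypothetical record shapes for the two information sources.
// Field names are illustrative assumptions, not the actual schema.
var blogPost = {
  student: "s1234567",                  // student identifier
  title: "Week 3 reflections",
  link: "http://example.com/2015/01/week-3",
  published: "2015-01-19T10:15:00",     // date and time of the post
  html: "<p>...</p>"                    // copy of the HTML for the post
};

var activityCompletion = {
  student: "s1234567",
  activity: "Week 1: Getting started",  // activity name
  completed: "2015-01-12T14:02:00"      // null if not yet completed
};
[/code]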

What type of analysis or manipulation can I perform on this information?

At the moment, not a lot. I don’t have a development environment that will allow me to run lots of complex algorithms over this data. This will have to evolve over time. What do I want to be able to do initially? An early, incomplete list of some questions (a sketch of how a couple might be computed follows the list)

  1. When was the last time the student posted to their blog?
  2. How many blog posts have they contributed? What were they titled? What is the link to those posts?
  3. Are the blog posts spread out over time?
  4. Who are the other students they’ve linked to?
  5. What activities have they completed? How long ago?
  6. Does it appear they’ve engaged in a bit of task corruption in completing the activities?
    e.g. is there a sequence of activities that were completed very quickly?
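
As a minimal sketch, and assuming blog posts arrive as an array of records shaped like the hypothetical example above, the first three questions might be answered with something like:

[code lang="javascript"]
// Given an array of blog post records, answer questions 1-3:
// when was the last post, how many posts, and their spread over time.
function blogPostStats(posts) {
  var times = posts.map(function (p) {
    return new Date(p.published).getTime();
  }).sort(function (a, b) { return a - b; });

  return {
    count: posts.length,
    lastPost: times.length ? new Date(times[times.length - 1]) : null,
    // crude measure of "spread": days between first and last post
    spreadDays: times.length > 1 ?
      (times[times.length - 1] - times[0]) / 86400000 : 0
  };
}
[/code]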

Representation

Does the representation of the information aid the task being undertaken?

The task here is basically giving me some information about the student progress.

For now it’s going to be a simple extension to the approach talked about in the paper. i.e. whenever my browser sees a link to a user profile on a course website, it will add a [Details] link next to it. If I click on that link I see a popup showing information about that student. The following is a mockup (click on the images to see a larger version) of what is currently partially working

[Image: 001 – Personal details]

By default the student details are shown. There are two other tabs, one for activity completion and one for blog posts.
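
A minimal sketch of the link-injection step, assuming jQuery is available (as it is with MAV) and that Moodle profile links can be recognised by their /user/view.php URLs; the selector and the showStudentDetails function are my assumptions:

[code lang="javascript"]
// Add a [Details] link after every user profile link on the page.
$('a[href*="/user/view.php"]').each(function () {
  var profileUrl = $(this).attr('href');
  $('<a href="#">[Details]</a>')
    .insertAfter(this)
    .click(function (e) {
      e.preventDefault();
      showStudentDetails(profileUrl); // hypothetical popup function
    });
});
[/code]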

Requirement suggestion: Add into the title of each tab some initial information. e.g. Activity completion should include something like “(55%)” indicating the percentage of activities currently completed. Or perhaps it might be the percentage of the current week’s activities that have been completed (or perhaps the current module).

The activity completion tab is currently the most complicated and the ugliest. Moving the mouse over the Activity Completion tab brings up the following.

[Image: 002 – Activity completion]

The red, green and yellow colours are ugly and are intended as a simple traffic light representation. Green means all complete, red means none, and yellow means in progress on some scale.

The course is actually broken up into 3 modules. The image above shows each module being represented. Open up a module and you see the list of weeks for that module – also with the traffic light colours. Click on a particular week and you see the list of activities for that week, again with colours but also with the date when the student completed the activity.
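
As a sketch, the traffic light colour at each level (activity, week, module) might be derived from the fraction of activities completed; this is my assumption about the logic, not what the mockup actually implements:

[code lang="javascript"]
// Map a completion count to a traffic light colour.
function trafficLight(completed, total) {
  if (total === 0) return 'grey';      // nothing to complete (assumption)
  var fraction = completed / total;
  if (fraction === 1) return 'green';  // all complete
  if (fraction === 0) return 'red';    // none complete
  return 'yellow';                     // in progress
}
[/code]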

Requirement suggestion: The title bars for the weeks and modules could show the first and last time the student completed an activity in that week/module.

Requirement suggestion: The date/time when an activity was completed could be a roll-over. Move the mouse over the date/time and it will change the date/time to how long ago that was.

Requirement suggestion: What about showing the percentage of students who have completed activities? Each activity could show the % of students who had completed it. Each week could show the percentage of students who had completed that week’s activities. Each module could….

Requirement suggestion: Find some better colours.

The blog post tab is the most under-developed. The mockup currently only shows some raw data that is used to generate the student’s mark.

[Image: 003 – Blog posts]

Update: the following screenshot shows progress on this tab; it is taken from the working tool.

[Image: blog process analytics tab]

Requirement suggestions:

  • Show a list of recent blog post titles that are also links to those posts.
    Knowing what the student has (or hasn’t) blogged recently may give some insight into their experience.
    Done: see above image.
  • Show the names of students where this student has linked to their blog posts.
  • Organise the statistics into Modules and show the interim mark they’d get.
    This would be of immediate interest to the students.

Affordances

Are there appropriate Affordances for action?

What functionality can this tool provide to me that will help?

Initially it may simply be the display of the information. I’ll be left to my own devices to do something with it.

Have to admit to being unable to think of anything useful, just yet.

Change

How will the information, representation and the affordances be Changed?

Some quick answers

  1. ATM, I’m the only one using this tool and it’s all running from my laptop. Hence there’s no worry about the impact on others if I make changes to what the tool does. This allows some rapid experimentation.
  2. Convincing the organisation to provide an API or some other form of access directly (and safely/appropriately) to the Moodle database would be the biggest/easiest way to change the information.
  3. Exploring additional algorithms that could reveal new insights and affordances is also a good source.
  4. Currently the design of the tool and its environment is quite kludgy. Some decent design could make this particularly flexible.
    e.g. simply having the server return JSON data rather than HTML and having some capacity on the client side to format that data could enable some experimentation and change (see the sketch below).
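
A sketch of that design, assuming a hypothetical local analytics server that returns the blog statistics as JSON; the URL, the #blog-tab id and the JSON fields are all my assumptions:

[code lang="javascript"]
// Fetch a student's blog stats as JSON from a (hypothetical) local
// server and format them on the client side, rather than receiving
// pre-built HTML from the server.
function showBlogStats(studentId) {
  $.getJSON('http://localhost:8080/analytics/blog/' + studentId,
    function (stats) {
      $('#blog-tab').html(
        '<p>' + stats.count + ' posts, last on ' + stats.lastPost + '</p>');
    });
}
[/code]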

References

Schneider, D. K., Class, B., Benetos, K., & Lange, M. (2012). Requirements for learning scenario and learning process analytics. In World Conference on Educational Multimedia, Hypermedia and Telecommunications (pp. 1632–1641).

A proposal for fixing what's broken with ed tech support in some universities

This paper analyses the outcomes of what a small group of academics (myself included) attempted to do to develop the knowledge/capability to develop effective learning for hundreds of pre-service teachers via e-learning. That experience is analysed using a distributive view of knowledge and learning, and illustrates just how broken what passes for ed tech support/academic staff development in some universities is. Picking up on yesterday’s post, the paper reports on academics harnessing their digital fluency to address the almost complete lack of usefulness of the institutionally developed attempts at supporting academic staff in developing the knowledge necessary for effective e-learning.

The distributive view of knowledge and learning used in the paper drew on three conceptual themes from Putnam and Borko (2000) and one theme we’ve added. Those themes suggest that knowledge and learning is/should be

  1. situated;
    Context matters. An inappropriate context can limit transfer of learning into different contexts. The entire system/context in which learning takes place is fundamental to what is learned.
  2. social;
    How we think and what we know arises from on-going interactions with groups of people over time.
  3. distributed; and,
    The knowledge required to perform a task does not exist solely within an individual person or even groups of people. It also resides in artifacts. Appropriate tools can enhance, transform and distribute cognition and expand “a system’s capacity for innovation and invention” (Putnam & Borko, 2000, p. 10).
  4. protean.
    The computer is “the first metamedium, and as such has degrees of freedom and expression never before encountered” (Kay, 1984, p. 59); it has a “protean nature”. i.e. digital technology can be flexible and should be open to being manipulated in response to needs.

This post is an attempt to propose one way in which institutional attempts at ed tech support could be transformed to actually support these four themes. i.e. to actually be made useful and appropriate for the task.

Analysing existing practice

When I first started on this post the plan was to analyse my existing institution’s attempts at ed tech support. So I logged into the Moodle site for the course I’ll be teaching next semester and asked the question, “If I had a problem with X (whatever that is), how would I find an answer?”. The answer to that question was summarised in this post. A post that is password protected because of the embarrassing difficulty I had in answering that question.

Using the four themes the following criticisms might be made

  1. situated;
    The support resources were not situated in the context. I could not find any help with the course site from within the course site. I had to go to another website and waste time figuring out which labyrinth of links I would follow to get to the support resources. The first time I tried it I failed.

    I just wanted to check some aspect of my earlier analysis. Initially I had difficulties finding my way through the labyrinth.

  2. social;
    Almost all of the support resources were centrally produced and approved, some with heavy production values. Pure information distribution. There was a small collection of Moodle discussion forums, intended I imagine to encourage social interaction. Half of those forums had no posts; the other half had single posts, all from the same author.

    Apart from the discussion forums, the only way to add to these resources was via the small number of people from central support.

    This also means that the “message” shared via these resources can be controlled by the institution, raising the question of whether differing views can be expressed. For example, there is a section on using the Mahara e-portfolio that extols the educational virtues of Mahara. There’s no way I can contribute the reasons I don’t use Mahara and use something different. The point isn’t that Mahara is a bad tool, but that there are some issues with using it and there are alternatives. More importantly, there’s no way to share this alternate view.

  3. distributed; and
    Any content to be added to the site had to be manually added by a small select group of people. There was no integration between the resources and other systems. For example, the IT Help Desk system was in no way integrated. So if there was a known problem with the “discussion forums” being raised through the IT help desk, there was no way for that information to appear in the support resources on “discussion forums”.
  4. protean.
    The support resources were implemented in an ad hoc collection of Moodle-based course sites. The resources were all static and professionally designed, with little or no way to repurpose those resources or to add to them.

    Moodle discussion forums can generate RSS feeds and also have the option of subscribing to a forum via email. These are methods that allow a user to modify how they interact with the discussion forum. If I’m interested in a forum I can integrate new activity on the forum into my daily routine either through my feed reader or email.

    The ability to grab the RSS feed or subscribe via email to the discussion forums in these support resources is not visible via the interface that has been used.

Some design principles

Beyond critiquing what exists, the four conceptual themes above might also be useful in terms of developing guidelines for what might be. Here’s an initial brainstorm of potential guidelines. Feel free to add and argue.

  1. The support resources should be situated within the context of the academics.
    Some of what this might suggest includes

    • If you want to learn about using the discussion forums better, you should be able to do this from within the discussion forum.
    • If you want to know how to use the discussion forum for an introductory/warm-up activity at the start of semester, this should be possible.
    • The support you receive should be tailored(able) to the type of course or discipline you are teaching.
    • The support system should know who you are, what you’re teaching, what you’ve done before, what groups you belong to, what time of semester (e.g. before semester starts, first couple of weeks, assessment due, end of semester etc) it is etc.
    • The support system should use the tools that people use (not the institution).
      i.e. not this from Dutton

      Organizations aren’t thinking about the ‘networked individual’ – the networking choices and patterns of individual Internet users. They’re still focused on their own organizational information systems and traditional institutional networks.

  2. The support area should encourage/enable participation in various discourse communities.
    Which might suggest approaches such as

    • The system should make you aware of the communities/individuals that are using the tool you are currently (thinking of) using.
      e.g. if you’re looking at using the BIM Moodle module the support system should help you become aware of who else has used the tool and perhaps how (leading into..)
    • The system should help capture and make available for on-going use the “residue of experience” (Riel & Polin, 2004) of other members of the community.
      The discussion, reflections and analysis of prior use of tools and methods should be available. At a simple level, this might be ensuring that any and all questions about the discussion forum (including those from the helpdesk) be visible/searchable from the support site about the discussion forum.
  3. The support area should integrate with and integrate into it all of the appropriate organisational and external systems and processes.
    This might include such things as

    • Knowledge from other systems offering support appear automatically.
      For example, any known issues information about tools are integrated appropriately into the environment.
    • Organisational information sources such as student records systems, teaching responsibilities databases, results of course evaluation surveys etc should be integrated into the support and used to situate and modify resources appropriately.
    • Knowledge from the support area should be openly available (as appropriate) for integration into other systems.
      Might be as simple as generating an RSS/OPML feed (or two) or allowing email subscription. Perhaps publish an API.
    • The “how to do” advice in the support area should actually help you do it.
      i.e. rather than a sequence of steps describing what you do, there’s actually a link that will take you back to actual system and help you do it. Linked to the idea of Context Appropriate Scaffolding Assemblages (CASA).
  4. The support area should support manipulation and change by the users and their actions (protean).
    This might mean

    • Something as simple as having decent customisation options.
    • Something more radical like Smallest Federated Wiki.
      i.e. where each individual or group could fork the support resources and make their own changes. Changes that might be potentially integrated back into the original institutional version.

One illustration

So how might that work in action. Here’s one possible illustration.

Login

You start by logging in to one of the institutional systems (e.g. the LMS).

Straight away I have a qualm about whether or not a login is required. In order for the system to know about you (see the situated principle above) some form of identification is required. But requiring a login means that the system isn’t open. So perhaps there’s an avenue that doesn’t require a login.

The Mini-map appears

Not only do you login to the LMS but you also login to the “support system” and the mini-map appears.

The mini-map is a small icon (or three) that appears in the browser, perhaps in the top right hand corner of the page. From now on, wherever you go the mini-map is there. But as you move around to different systems it will likely change, because it knows your situation and responds accordingly.

This is based on the mini-map concept from games occurring in immersive 3D worlds. The suggestion isn’t that this mini-map be represented as an actual map (though perhaps it might be); the point is that its purpose is to help orient you within the e-learning space.

What the mini-map might do

Nesbitt et al (2009) suggest that a

mini-map might also display the position of key landmarks along with the position of the player’s avatar and any other relevant actors in the game

which gives some idea of what the “mini-map” in this context might do.

Specific functionality might include

  1. You are here.
    Provide a summary of what it knows about your current location within the teaching and learning environment. This might include insight into the time of term, common or required tasks you may need to complete soon (or have completed at similar times in the past), and updates and announcements on what’s going on in the environment since you were last here.

    e.g. new problems that have arisen around where you are, such as the lecture capture system being down and being worked upon. This would also suggest that the support system is independent (distributed) from the various services, so it can keep working if they are down.

  2. Who else is here.
    Let you know who else is on this particular page, or who else is using this particular service in another course or at another time. e.g. other people who have used this particular service. Provide some functionality to allow you to control and organise who you want to know about.
  3. What have they done
    Access to the residue of experience, what have these people actually done within your current location. What worked. What didn’t. This might also be links to literature etc.
  4. How to do stuff.
    Advice on how to perform various tasks. Pedagogical patterns, learning designs etc including potential CASA’s that would help or do stuff for you.

And on a non-institutional system

The mini-map would appear when you visit any online location that has been used for learning and teaching. For example, if you went to Google Drive you would have access to (almost) the same functionality described above.

If the mini-map didn’t appear, because you’ve visited a tool that no-one else has used before, you could choose to add the tool to the mini-map and that addition would then be visible to others.

Implementation

I see the mini-map being implemented with something like a Greasemonkey script. This is how it’s possible for it to appear independent of whether you’re viewing an institutional or non-institutional system.

It might work something like the following (a rough sketch of the core steps follows the list)

  1. You’ve installed the Greasemonkey script on your browser.
  2. You can choose to enable or disable the script at anytime.
  3. Then, whenever you visit a web page the mini-map grabs the URL for that page and sends it to a server.
  4. The server checks to see if that URL matches anything supported by the mini-map.
  5. The version of the mini-map that is displayed depends on whether the URL is currently supported
    • Supported – then show the full mini-map.
    • Not supported – then show the minimal mini-map with just the option to “add this page”
  6. If viewing the supported mini-map you then have access to a range of functionality.
  7. Some functionality will pop up new information.
    e.g. click on the “People” icon and the mini-map might show a list of people you “know” that are/have been here.
  8. Some functionality will take you into a different system.
    e.g. click on one of the people in the list and you might get taken to a web page that shows what they were doing, when, and also provides access to details of what others have done.
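
A skeletal sketch of steps 3 to 5, assuming a hypothetical mini-map server; in a Greasemonkey script, GM_xmlhttpRequest allows the cross-site call to that server. The server URL and the two show functions are assumptions:

[code lang="javascript"]
// ==UserScript==
// @name     mini-map
// @include  *
// @grant    GM_xmlhttpRequest
// ==/UserScript==

// Steps 3-5: send the current page's URL to a (hypothetical) mini-map
// server and show the full or minimal mini-map depending on the answer.
GM_xmlhttpRequest({
  method: 'GET',
  url: 'http://minimap.example.com/check?url=' +
       encodeURIComponent(document.location.href),
  onload: function (response) {
    var info = JSON.parse(response.responseText);
    if (info.supported) {
      showFullMiniMap(info);   // hypothetical: people, residue, how-to
    } else {
      showMinimalMiniMap();    // hypothetical: just "add this page"
    }
  }
});
[/code]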

The different systems used to provide the various support services should be whatever makes sense, but with a focus on tools people use all the time and not limited to institutional tools. You might use Slack for some functions. SFW might be good for others.

An interesting and challenging extension to this would be to allow the “mini-map” to be extensible by just about anyone at anytime.

Time for lunch.

References

Dutton, W. (2010). Networking distributed public expertise: strategies for citizen sourcing advice to government. One of a Series of Occasional Papers in Science and Technology Policy. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1767870

Nesbitt, K., Sutton, K., Wilson, J., & Hookham, G. (2009). Improving player spatial abilities for 3D challenges. Proceedings of the Sixth …, 1–3. doi:10.1145/1746050.1746056

Putnam, R., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15. Retrieved from http://www.jstor.org/stable/1176586

Riel, M., & Polin, L. (2004). Online learning communities: Common ground and critical differences in designing technical environments. In S. A. Barab, R. Kling, & J. Gray (Eds.), Designing for Virtual Communities in the Service of Learning (pp. 16–50). Cambridge: Cambridge University Press.

Trying out a new writing process

It’s Wednesday, that means writing day. Time to continue work on a paper that’s been mentioned previously. In this process I’m exploring being open about the writing process and attempting to create a collection of artifacts (evidence?) leading up to the final paper. This post documents today’s steps, with a particular focus on first steps with a new Smallest Federated Wiki (SFW) enabled writing process that hopefully merges fruitfully with the use of this blog.

An ideal SFW-enabled writing process

Perhaps it’s my programming background, but when I start writing a paper or a presentation I start by developing the structure. The sequence of ideas or points that I want to make in the paper. Once I have that structure in place, once I know what I want to say, I find writing that much easier. Of course the structure will change over time, but having the outline of ideas helps.

In the past I’ve used a range of ad hoc methods for developing this structure, ranging from the purely mental approach of reciting and reframing it in my head through to drawing upon artifacts such as pen & paper, text files edited with vi, and Word documents. As the title suggests, the plan here is to use SFW: to create a page/article in SFW that consists of a list of ideas/concepts, with each of those ideas/concepts being a separate page in SFW.

The idea being that the writing process will be a sequence of reading, thinking and writing about each of those ideas/concepts. Of fleshing out each of the pages for the single ideas and perhaps moving them around, removing some and adding others. At some stage, when I’m happy with the raw material in SFW, the idea will be to translate that into Word or some other editing software.

On the plus side, this approach seems to offer better support (the linking of ideas and their separate development, versioning at a paragraph level, enabling others to build on what I’ve written) for a writing approach I’m comfortable with. However, it’s also likely to offer some challenges and misfits. For example, I found it difficult to come up with labels for the ideas/concepts that make up the paper that are both meaningful to me (and the writing process) and general enough to encourage/enable others to contribute and build upon those ideas.

If other people do contribute and build on those ideas, what implications might this have for authorship? At the moment, I don’t see the SFW ideas being in the form of the final paper. They will be more general. When I move to the word processing software I will re-write/write my own version, but it will likely be influenced by the input of any other contributors.

I also wonder about the affordances of the SFW interface for writing. i.e. the level of technical support it offers for entering and organising large amounts of text and my understanding of how best to leverage that. There’s still not the level of “fluency” (I feel somewhat dirty using that word) I’d like.

Associated with that I wonder about the question of citation management. Important to academic writing, but not directly supported by SFW. I’ll have to develop some practices and wonder how those will scale, especially as I move into the writing of the formal paper.

Time will tell.

Case studies in relevant journals

The next phase of writing is to start filling in some of the ideas in the paper structure by looking at the literature. The first area I’m looking at is the question of case study research in the journals I’m targeting, and also around the question in the paper (what’s involved in the reality of trying to develop high quality learning environments in higher education? Does that explain why there’s limited widespread quality?). The point is to identify prior work to build on and learn from.

This raises the question of how best to search through articles in specific journals. Does Google Scholar support this or do I have to return to using the old style (horrendous) “library database search”? Ahh, it appears the institutional library has moved a step beyond some of my memories from years ago, and my local collection of papers is revealing some tidbits.

Time to start sifting and reading.

Progress made. Too early to tell how it will go.

At least one of the papers I skimmed resonated with a negative local experience and generated a rant.

Barriers to higher education technology adoption: Digital fluency or usefulness?

Motaghian et al (2013) in talking about “web-based learning systems” (LMS) conclude that (p. 167)

perceived usefulness was the most important influential factor on instructors’ intention and their actual use of the systems (adoption)

This is seen as important since earlier they’ve argued (p. 158)

despite the emerging trend of using web-based learning systems to facilitate teaching and learning activities, the number of users of web-based learning systems is not increasing as fast as expected (Wang & Wang, 2009). Eventually, while e-learning has been promoted to various levels of users, the intention to continue using such system is still very low. Although initial acceptance of e-learning is an important first step toward achieving e-learning success, actual success still needs continued usage (Lee, 2010). However, since the web is a new medium (for educators and learners alike) for course delivery and learning, it is not well known which mediating and moderating factors in the online environment contribute more to its acceptance and use (Sanchez-Franco, Martínez-López, & Martín-Velicia, 2009)

They found that how useful an instructor perceived the institutional web-based learning system to be was the most important factor influencing use.

They found that “perceived usefulness (0.50) contributed two times more to intention to use than the perceived ease of use (0.25)” (p. 166) and concluded (emphasis added) (p. 166)

As a result, instructor’s perceived ease of use might not be as important as instructor’s perceived usefulness in this context. Thus, instructors will be more likely to continue to use the system if they consider it useful. Hence, instructors’ requirements should be taken into consideration when developing web-based learning systems (Wang & Wang, 2009).

What might that suggest about the idea from the 2014 Horizon Report that the #1 barrier to higher education technology adoption is the low digital fluency of academic staff?

Might it suggest that low levels of system adoption say more about the usefulness of the technology than about the fluency of the instructors? Might it suggest that the requirements of instructors aren’t being taken into consideration in their development?

Given my recent experience (at the same time as reading this paper) with institutional approaches to e-learning, I know how I’d answer those questions.

What about you?

Note: In talking about this I’m generally focusing on how institutions implement these systems (e.g. Moodle as implemented locally), rather than the development of those systems (e.g. Moodle itself). I think institutions (at least the ones that I’ve experienced) are particularly incompetent at producing systems that I would perceive as useful.

The importance of bricolage

Not too surprisingly, this observation has me thinking about the differences between the SET (in their way) mindset used by most institutional e-learning and the (breaking) BAD mindset used by people like myself.

A SET mindset takes a Strategic approach to deciding what work gets done. Its focus is on achieving the institutional plan or vision; for example, making sure that every course uses the standard look and feel template. Instructors’ perceived usefulness of the system is not the prime concern of a SET mindset.

A BAD mindset uses bricolage as a way of deciding what work gets done. Ciborra (1992) defined bricolage as the “capability of integrating unique ideas and practical design solutions at the end-user level” (p. 299). Bricolage is about solving the problems experienced by users. Bricolage focuses on enhancing perceived usefulness.

So, would the SET mindset or the BAD mindset contribute to greater levels of adoption?

Seems like a no-brainer to me.

References

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297–309.

Motaghian, H., Hassanzadeh, A., & Moghadam, D. K. (2013). Factors affecting university instructors’ adoption of web-based learning systems: Case study of Iran. Computers & Education, 61, 158–167. doi:10.1016/j.compedu.2012.09.016

Learning to live with a standard look and feel

About – this is an old post (2015) that was never posted. Posting it now for other reasons.

The following outlines experiments and steps taken to live with the dissonances that arise from the implementation of a new institutional look and feel for course sites. The background to this work is documented in this blog post which ends with the following questions that I try to answer:

  1. Can I change the study schedule and assessment links to my existing approaches?
  2. Can I remove the resources link?
  3. Can I implement a macro/API system to insert USQ information?
  4. What can and might I do around providing a search engine for the course site?

Jquery to modify links

The first two questions cry out for a bit of JavaScript modification. JavaScript is just one of the more recent web technologies of which I have only passing knowledge.

The menu links have unique classes

  • study schedule is “link-studyschedule”
  • assessment is “link-assessments”
  • resources is “link-resources”

A quick Google search revealed this post on StackOverflow that includes an answer with the following approach
[code lang="html"]
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript">
$(function(){
  $(".classname").hide();
});
</script>
[/code]

It uses a Google-provided jQuery source file to enable the use of jQuery. Adding something like this to the top of a Moodle page will hide any element with that class. A quick search for a bit more jQuery knowledge and all done. In 10 minutes I’ve implemented something that would work, but there’s one problem.
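
Adapted to the menu link classes above, the result might look something like the following sketch. Whether the classes sit on the anchors themselves is my assumption, as are the replacement URLs:

[code lang="javascript"]
$(function () {
  // Question 2: remove the resources link entirely.
  $('.link-resources').hide();
  // Question 1: point study schedule and assessment at my own pages.
  // Assumes the classes are on the anchors; the URLs are placeholders.
  $('a.link-studyschedule').attr('href', '/my-study-schedule');
  $('a.link-assessments').attr('href', '/my-assessment');
});
[/code]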

Adding it to every study desk page

This only works if the javascript injection is on every page on the study desk: every page I migrate in (all of those Moodle book chapters would require manual copy and paste) and every page produced by a Moodle plugin (e.g. quiz, discussion forum etc). This is troubling and potentially limiting to what I can do.

If I really wanted to, I could do the manual insertion. But not for the quizzes and discussion forums.

The Moodle way would be to include this in the theme. However, I believe that the USQ look and feel is implemented as a site-wide theme. Changes specific to a course are not likely to be high on the agenda, let alone organisationally doable. Moodle does allow the notion of course-specific themes, but that would require the USQ powers that be to allow this and then enable the creation of an EDC3100-specific theme that used the new look and feel plus my JavaScript.

Not likely to happen.

Adding a redirect + adding to some pages

One option/kludge would involve a combination of injecting the javascript on those pages that I can, and setting the official pages for assessment and study schedule to redirect to where I want them to be. So anyone who finds their way to the page would get sent elsewhere.

It’s a questionable approach. It also depends on being able to insert some javascript on the new look and feel’s assessment and study schedule pages.

Which you can’t. It appears that they’ve implemented something that prevents javascript inserted into assessment items from being included on the page displayed. Not that surprising, but a pain.

Which suggests that engaging the hierarchy will be required. Oh fun.

A simple template system

Each of the weekly learning paths on the course site is placed into a topic. Each topic has a label that includes the title of the learning path and the dates for the week in which it is meant to apply. In the image below you should be able to see “21-25 Jul (Week 1)” near the top of the image.

[Image: Week 1 topic from the course site]

The problem is that this is a screen shot from the Semester 1, 2015 offering of the course. The appropriate date is not 21-25 Jul for this learning path.

The same problem exists with the study schedule as shown in the following image. The August dates are not appropriate for these weeks.

[Image: study schedule]

Each semester, as the course is copied over, I need to manually update these dates and hope that I get them all. The time it takes to do this reduces the amount of time I can spend on making truly important changes to the course.

The advantage of the study schedule in the new look and feel is that it does pull in the dates for the weeks and include them automatically. The problem is that it requires you to constrain yourself to the remaining functionality the study schedule page provides.

The aim here is to explore whether it’s possible to use some JavaScript to fix this problem. In summary, the plan is to

  1. Create a javascript data structure summarising the dates for each week of semester.
  2. Replace the hard-coded/explicit week dates (i.e. 11-15 Aug) in the study schedule and the topic titles with a variable.
    The most likely approach will be to use a simple div such as
    [code lang="html"]
    <div id="WEEK_4_DATE"></div>
    [/code]
    This means that if this template approach doesn’t work, there will just be an empty slot, not something ugly. But of course the topic headings won’t take HTML, so the DIV option is out for those.
  3. Have a function that does a search and replace for the variables.

With all three of these in place, the dates should just be added in. The advantage is that next semester all I need to do is change the data structure and all the pages will be automatically updated. A rough sketch of the pieces follows.
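
A sketch of pieces 1 and 3 of the plan, using the div placeholders from piece 2 where they can be used; the week dates are example values:

[code lang="javascript"]
// 1. Data structure: the dates for each week of semester (examples).
var weekDates = {
  WEEK_1_DATE: '2-6 Mar',
  WEEK_2_DATE: '9-13 Mar',
  WEEK_4_DATE: '23-27 Mar'   // matches the placeholder div above
};

// 3. Search and replace: fill each placeholder div with its dates.
$(function () {
  for (var week in weekDates) {
    $('#' + week).text(weekDates[week]);
  }
});
[/code]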

Oh dear, there’s a slight bug in the due dates in the new look and feel. Week 9 is listed as “27-1 April” i.e. 27 April to 1 April. It should be “27 April-1 May”. Will have to report that.

What might the 3 levels of organisational culture reveal about university e-learning

I have fulfilled my organisational duty and attended and participated in a 3 hour workshop intended to achieve some level of shared vision within the organisation. As always, I remain cynical about the likely impact such sessions will have on the organisation and my experience of it. There was, however, some benefit in making me aware of Schein’s three levels of organisational culture (apparently from this book), summarised in the following table (and more on Wikipedia).

[Image: “Schein’s Model” by Shirarae – Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons]

Three levels of culture
(Adapted from Schein, 1992, p. 24)
Level Description
1. Artifacts Visible and feelable structures and processes

Observed behaviour – difficult to decipher

2. Espoused beliefs and values Ideals, goals, values, aspirations

Ideologies

Rationalisations – may or may not be congruent with behaviour and other artifacts

3. Basic underlying assumptions Unconscious, taken-for-granted beliefs and values – determine behaviour, perception, thought and feeling

An idea is that the artifacts are one avenue for exploring the espoused beliefs and the underlying assumptions that inform the organisational culture.

Hence my question, what do the artifacts associated with organisational e-learning say about the organisational culture of those institutions? What are the espoused beliefs and underlying assumptions that inform that culture? Are there any contradictions?

What follows is a quick application of this to my next task – starting the preparation of my course site for the next semester. This is a simple exercise, not an in-depth analysis, and certainly informed by the barest of familiarities with a very specific view of culture. But on the face of it, this exercise strikes me as a useful lens. (It would appear that @leahpmac has already done some work around "culture and e-learning". It's a small world.)

1. Artifact – Standard course design

The artifact I need to deal with is the new look and feel for the institutional LMS. This means that not only will every course look the same, there are also expectations about what should be on the course site. For my course, this means it will look something like the following

Home page

Everything at the top of the page and in the left hand column is part of the new, standardised look and feel. Everything under the "Welcome to EDC3100: ICT and Pedagogy" heading is what was copied over from the last offering of the course. Hence the "Right now:" message suggesting the course has ended. The following is what the 2014 course site looked like.

edc3100 2014

Broadly speaking the changes involved in this project were

  • Changes in the branding/look and feel.
  • More moved into the reduced banner.
    Explicit mention of the time at USQ, notifications, my courses, useful links, course administration etc. moved into the banner, which takes up just a bit less vertical space. These look like good moves, but the breadcrumbs continue to include the unnecessary listing of the faculty to which the course belongs. I don't think the students care about the faculty and it takes up a fair bit of space.
  • The removal of any ability to have a right hand column.
    Only two columns, not three now.
  • An "expand all/collapse all" option for the main content area.
    I assume this is an attempt to address the scroll of death problem with Moodle. Wonder how much of this is implemented with standard Moodle and how much local customisation?
  • The addition of highly visible menu as the first element in the left hand column.
    These appear to be divided up into two broad categories of items

    1. Links to existing Moodle features (Forums, Resources, Calendar, Participants); or,
      As the labels suggest, these all point to equivalent Moodle functionality. e.g. Forums links to the Moodle service that lists all forums etc.
    2. Institutionally specific pages that are semi-integrated with institutional data sources (Study Schedule, Teaching Team, Assessment).
      The basic model for these pages is to know about institutionally specific information such as the dates associated with weeks of semester, due dates for assignments, and the staff teaching into the course. Such information is drawn from institutional databases and combined with some capability to use the Moodle HTML editor to add additional text. For example, the study schedule will fill in the dates for Week 1 and teaching staff can manually edit information such as the name of the module for that week.

2. Espoused beliefs and values

As expressed in the support resources, the rationale for this new look and feel includes

  • similar experience;
    The desire that students have a similar experience regardless of the course.
  • findability.
    That students are able to find the information they need easily.

The support video for the assessment tab also proposes that the assessment tab “will be very useful for your students”.

3. Assumptions

Obviously I do not know what assumptions these beliefs are based upon, but the following perhaps are not a million miles away

  • Consistency is generally a good thing for learning.
    Given that the institutional strategic plan puts significant weight on personalisation, creativity and innovation, having everything the same doesn't seem appropriate. Insights from research around learning, teaching, and educational technology would seem to support that view, e.g. some of the points from Chris Dede (Harvard Professor of Education) mentioned in this post.
  • A consistent look and feel will make information more findable.
    Wikipedia suggests that findability involves a bit more than just user interface design. Especially when the user interface design is only really helping users locate very specific bits of information. i.e. the new look and feel does make it easy for students to find the forums, assessment, study schedule, and teaching team for a course. Good. But what about the content included in the resources? More on this below in "What about the resources". In short, if findability is a concern, install a search engine!
  • That it’s possible to have a consistent look and feel across the diversity of courses in an institution.
    The new look and feel does have some features that allow for flexibility, even though this raises questions about the espoused belief in consistency. If no information is entered for assessment or study schedule, the students won't see those options in the menu. In addition, the study schedule page provides some flexibility in terms of how many columns form the study schedule and what their titles are, allowing individual courses to substitute in the language they use.
    However, there’s a limit to how far this goes. More on this below in “Grouping the weeks”.

An assumption that appears to underpin this new look and feel is that the focus is student centred. The aim is to enhance the student experience. Now that's a good aim, perhaps the best aim. But the follow-on assumption in this case is that teaching staff aren't capable of using the online environment to enhance the student experience and that the institution needs to do something.

From 1997 through 2004 I helped design, implement and support one approach to an institution doing something about this.
Since then I've made the argument for this approach. However, there are two important points missing from the "new look and feel" at my current institution. They are

  1. the ability to opt out; and
    There will always be academics who can and wish to create their own course sites. The approach provided the opportunity for academics to do this.
  2. an adopter-focused and emergent development approach to the new look and feel.
    i.e. it's not sufficient for a project team to design the new look and feel and roll it out. There will inevitably be problems with the look and feel, and some really good ideas about how to enhance it will emerge from on-going use. How students and staff are using the new look and feel needs to be closely watched, and those insights used to continually develop it to solve problems and enhance it.

The absence of these points from the new look and feel suggests that there is an underlying assumption that there is nothing to be learned from the teaching staff and their experience. It's a prime example of the "do it to" and perhaps "do it for" paths and an apparent avoidance of the "do it with" path (Beer et al, 2014).

Interestingly, I’ve just found the following slide on Flickr that purports to represent Schein’s cognitive transformation model for analysing organisational cultures. I’m guessing this was the basis for the consultant/facilitator. What I find particularly relevant to the specific decision is the circle around the outside labelled “organisation iteratively adapts” which I see as resonating with the adopter-focused and emergent development approach mentioned above.

3 Levels of Organization Culture (Schein) by MizzD, on Flickr
Creative Commons Creative Commons Attribution-Noncommercial-Share Alike 2.0 Generic License   by  MizzD 

What are the limitations of the new look and feel?

The following explores the dissonances that exist between the new look and feel and the approach I use in my course.

What about the resources?

The following image is a partial screen shot of the “Resources” page for my course.

Resources

This partially illustrates that my course is designed so that each week contains a learning path. A collection of activities and resources that all students are expected to work through. The “resources” page only shows the resources, but at least it does show you that there are quite a few Moodle books amongst those resources.

A Moodle book is essentially a collection of web pages. The Moodle books are used to structure student learning around a task or some related concept. For example, the first book in Week 1 is titled “Setting up” and is designed to help the students set up Diigo, their blog and Twitter. The next book explains the first required learning task for the course – introducing themselves.

Each book is typically at least 3 or 4 web pages long. A quick visual count reveals almost 50 separate Moodle books in the learning paths. Some cover important concepts, concepts that the students will wish to revisit later in the semester. In particular, some books give advice about assessment. It's not unusual for students to ask "Where did we talk about *insert topic*?"

Nothing in the new look and feel will help students find this information.

A search engine would.

It would be nice not to have to implement this kludge again to enable the search.

Duplication and confusion

It would be interesting to find out the thinking behind promoting the “Resources” link into the menu for the new look and feel. I assume the aim (in line with the espoused beliefs above) is to make it easier for students to find the resources and that this is important for learning.

However, I wonder if it’s going to create some duplication/confusion, especially given the design of my course.

The image above shows part of the resources view for my course site, including most of the initial resources for Week 1. The following image shows the learning path for week 1.

Week1

The resources page offers essentially the same view as the learning path, but it misses two components. First, it doesn't include the activities (e.g. the discussion forums "Share your introduction" and "Where you fill in the blanks" are missing). Second, the headings are missing. These are used to group the resources and activities into meaningful groups.

The presence of the resources link doesn’t appear to add any value to students and appears likely to create some confusion.

Grouping the weeks

Adding a study schedule page is potentially a useful addition. Something that isn't present on many sites. Some students may find this useful. That's the reason why I've had a study schedule in my course site since I started.

Take a look at the “Course content” box in the middle of the page below. Do you see the link “study schedule”?

Home page

The following image shows part of the study schedule I’ve created. A problem I have is that there are some features of this study schedule that the new look and feel won’t support

  1. grouping weeks by modules;
    In the following, Module 2 includes weeks 4, 5 and 6. The new look and feel doesn’t support this grouping. This is problematic because the course is designed to have four modules and it’s a good thing that the study schedule clearly shows these modules.
  2. assignments as separate rows; and
    The submission date for assignment 2 is quite clearly shown in the study schedule by an entire row of a different colour. The new look and feel's schedule embeds this as part of a cell in the week's row.

Not major problems, but illustrations of how a consistent approach to course design breaks down when it meets the design decisions made by individual teachers. If those design decisions are bad, there may not be a problem. But what if those design decisions are valid? Is it appropriate that they should be thrown out and the course revert to the norm?

studySchedule

Limited assessment information

Just under "Study Schedule" in the "Course Content" box above you will see a link labelled "Assessment", i.e. my course site already provides a range of information about course assessment. This is again a case of the new look and feel duplicating what I already do, and doing so in a way that loses functionality.

The new look and feel assessment page does have some advantages. For example, it allows you to create cohort specific assessment information that is only seen by that cohort. The trouble is that I don’t do that in my course. So no value for me.

The new look and feel’s approach to assessment creates a single page for assessment. Everything about assessment for a course on a single page. This is a problem as the following shows.

Assessment

Can you see the “Table of Contents” heading in the left-hand menu? That’s the start of the list of information I provide on Assessment. It includes the following

  • Assessment (1 and a bit pages long)
    Overview of course assessment, due dates, percentage etc. What you see in the above image.
  • How to request an extension (almost 2 pages long)
    Some FAQs about extensions and details of how to ask.
  • Learning Journal (almost 3 pages long)
    The learning journal is a core part of the learning design (and assessment) of the course. It’s new to students, this section offers an explanation of the learning journal.
  • Problems with the learning journal (about 2 pages long)
    FAQs about the learning journal, particular common problems.
  • Assignment 1 (3 and a bit pages)
    Detailed description of assignment including submission process and the rubrics.
  • Assignment 1 Questions and Answers Video (less than a page)
    A video/screen cast answering FAQs about assignment 1.
  • How to submit assignment 1 (less than a page)
    Another video showing how to submit the assignment.
  • Assignment 2 (about 3 pages)
    Detailed description of assignment including submission process and the rubrics.
  • Assignment 3 (about 4 pages)
    Detailed description of assignment including submission process and the rubrics.
  • But I’m not going on Professional Experience? (a page and a bit)
    Assignment 3 is linked to professional experience. In some circumstances students aren’t going on Professional Experience. Describes what those students do.
  • How to query the marking (a page)
    Describes the process students should use to query the marking of their assignments.

That's a total of about 22 pages (as you might see, the new look and feel uses a larger font and more white space) of information that under the new look and feel would appear to have to go onto a single page. Horrendous for me to create and worse for students to actually find and use.

With the above I use the Moodle book plugin to create and manage this collection of information. The Moodle book plugin also provides a nice way for students to print this information. Either the entire book or selected “chapters”. This “nice way” includes removing all of the additional web interface elements.

The new look and feel does attempt to implement something like this “nice way”, however, it’s a generic web approach that leads to overlapping of real content with basic web navigation (at least on my Mac).

Problems with Moodle books

Exploring the difference with assessment has revealed some additional problems around the use of Moodle books and the new look and feel. The following image is the Assessment overview page in the old course site as it would be seen by students.

Old Assessment

In the above image, look for the following components

  1. Table of contents; and
    This is in the top left hand corner. It’s the ToC for the Assessment book and shows all of the components. It aids findability. Students can see all of the chapters in the book.
  2. Book administration.
    This is in the bottom right hand corner. It has two important links: “Print book” and “Print this chapter”. These are the links that allow students to print versions of the book/chapters with just the content (the middle column).

Now, look below. This is the same assessment book in the new look and feel. What do you notice about the “Table of Contents” and the “Book administration” components?

Assessment

The two problems that I see are

  1. Table of contents is partially below the fold; and
    Due to the new look and feel's use of a larger font, more whitespace, a single column on the left-hand side, and the retention of the standard menu at the top of that column, the Table of Contents for the book gets pushed down so that large parts of it aren't visible.
    I should note that I have a large monitor and keep my browser windows open in a longer format than most. There will be some students for whom the Table of Contents will not be visible at all.
  2. There is no "Book administration" component.
    Actually, there is. It just doesn't appear in the image above, and I've only just now found where it is located after a concerted effort, i.e. I knew it had to be there, so I went looking. The following image shows the "Forums" page in the new look and feel. In the black bar at the top of the page you should be able to see "Forum administration". When viewing a Moodle book this is where "Book administration" will appear.

DiscussionForum

What do I need to do?

Based on the above, here’s a list of questions to answer

  1. Can I change the study schedule and assessment links to my existing approaches?
    Happy to stay with the menu and the new look and feel, but would prefer that the menu link to the relevant Moodle books.
  2. Can I remove the resources link?
    I don't think it adds any value in my course and it is potentially going to cause confusion. So can I remove it?
    Question: I wonder whether the organisation has done any analysis of student usage of the course sites over Semester 3 (when they’ve been using the new look and feel)?
  3. Can I implement a macro/API system to insert USQ information?
    The one really useful addition to the new look and feel is the integration with USQ systems to pull information (e.g. dates for each week, assignment due dates etc). The problem is that this functionality is only available within the assessment and study schedule features from the new look and feel. As above, I don’t want to use these.
    However, it would be really useful if there were a combination API and course site wide macro facility that would allow me to enter something like the following in the HTML for my course
    [code lang="html"]
    Assignment 1 due: <div class="assignment_1_due_date" course="edc3100" offering="2015_1"></div>
    [/code]
    and have it automagically replaced with the appropriate due date.

The API is a step too far, but some kludges with javascript might make the macro possible.
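For example, here is a minimal sketch of the macro half, where a hand-maintained data structure stands in for the missing API (the macro names follow the example above; the dates are hypothetical):

[code lang="js"]
// Stand-in for the missing institutional API: a mapping from macro
// names to values that would need updating each offering.
var MACROS = {
  "assignment_1_due_date": "27 April 2015",
  "assignment_2_due_date": "1 June 2015"
};

// Fill every macro div (e.g. <div class="assignment_1_due_date">)
// with the matching value. Assumes the div carries only that class.
var divs = document.getElementsByTagName("div");
for (var i = 0; i < divs.length; i++) {
  var name = divs[i].className;
  if (MACROS.hasOwnProperty(name)) {
    divs[i].innerHTML = MACROS[name];
  }
}
[/code]

A real API would replace the hard-coded MACROS object with a call to an institutional service keyed on the course and offering attributes.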

  • How best to explain to students how to use Moodle books in the new look and feel?
  • What can and might I do around providing a search engine for the course site?
    I feel this may be a step too far for this year.


If I am to be a good organisational citizen, then I should raise the above questions through the formal support mechanisms and wait for them to identify whether these changes are possible and allowed. (I fear that the latter is more likely to be the real problem.)

Of course, there is also the possibility that some of the above can be implemented through a bit of bricolage.

Lessons from Schein's 3 levels of organisational culture?

Arguably I've established above that the organisational culture contains strong assumptions about consistency equating to findability, and that it is possible to employ a consistent approach across all courses in an entire university. I think I've established from the above that there are problems with these assumptions. I also think that I've illustrated that the new look and feel is (or at least appears to be) suffering from the absence of on-going iterative development, i.e. it's not learning from itself.

In the workshop a slide was handed around that showed artifacts, espoused beliefs, and assumptions as three vertically arranged boxes (typical Powerpoint). Between assumptions and espoused beliefs, and espoused beliefs and artifacts, there were two arrows, each pointing from one box to the other. The suggestion being that assumptions influence espoused beliefs and espoused beliefs influence assumptions. Similarly for artifacts.

As we've argued in this paper, I believe that the organisation works on the assumption that its digital artifacts – such as the new look and feel – are established, i.e. they can't be changed by anyone very easily, and certainly not by anyone who hasn't been approved via the appropriate governance structure. Hence the arrow suggesting that anything should change the artifact is somewhat attenuated when it comes to digital artifacts in enterprises.

However, as we've argued in another paper, digital technologies are protean. They are flexible and changeable, some more so than others. For example, phone apps are hard to change unless you're the developer, but the web environment is definitely protean. This suggests that the ability to change the artifact is available to cultures that don't hold the assumptions or espoused beliefs of the dominant organisational culture.

Of course, the problem is making changes to artifacts set up with the assumption that they are established. That can be harder than you think.

Learning about case study methodology to research higher education

The following is a summary and some thoughts on Harland (2014). The abstract for the paper is

Learning about teaching through inquiry is a sound practice for professional development and the university teacher-as-researcher is now commonplace. With the proliferation of inquiry-based postgraduate programmes in university teaching, more academics from across the disciplines are learning new ways of doing research. In this paper, I draw on 10 years’ experience of teaching research methods in higher education. I teach on a one-year part-time course that aims to help academics change their practice and contribute to the theories of teaching in higher education through publication of their work. The preferred research method is case study and there are four questions that both inexperienced and experienced participants can find challenging: what is the potential of case study; what forms of data are acceptable; when does analysis stop and what makes a quality case study? I conclude with a set of recommendations for the new researcher aimed at enhancing the quality of research. Suggestions include properly integrating existing theory into published work, avoiding positivist analogues to judge research, using multiple methods in research design and avoiding writing descriptively or without a critical audience in mind.

My interest

As outlined previously, I have to write more journal articles. The current paper idea I have goes under the following working title "BAM, BIM, Blogs and Breaking BAD: What does it take to create quality e-learning?". It's currently conceptualised as a case study of the development and use of BAM and BIM to support the use of individual student blogs from 2006 through 2015. The basic argument is that it's no surprise that most e-learning is not that great, given the difficulty of doing anything decent within the current institutional mindset around e-learning. The idea is to draw on the Breaking BAD paper, a presentation to MoodleMoot and various other publications around BIM/BAM over the years.

Given it's a case study, and case studies have limitations, it's probably a good idea to be able to write up the method in a way that ticks all the right boxes. Hence my interest in Harland (2014).

Thoughts

Case study is a dominant/well accepted method. Tick.

How often does Computers and Education accept case study work?

Raises some interesting points for consideration by early case study researchers in higher education. Not a "how to" guide such as Baxter and Jack (2008).

Summary

Introduction

Researching one's own teaching practice is one way academics learn to teach. Teacher-as-researcher, link to high schools and Boyer's SoTL. Defines "dual researchers" as people researching both their discipline and the teaching of that discipline.

Author from hard sciences. Research in academic development didn’t follow the science rules.

Cites Tight (2012) that qualitative inquiry remains the dominant research method in higher ed journals.

As a reviewer, author finds most articles are sub-par. Case study used in most articles read.

# of case studies published in four higher education journals: 2007-2012
adapted from Harland (2014, p. 1114)
Journal Location Case study Conceptual study Total #
Higher Education Europe 344 181 525
Studies in Higher Education Europe 238 62 300
Teaching in Higher Education Europe 176 111 287
Higher Education Research & Development Australasia 146 101 247
Total: 904 455 1359

"Case study consists of empirical inquiries of single cases that are contextually unique (Stake, 1995)" (Harland, 2014, p. 1114) – my emphasis added – typically addressing something of interest to the author's professional practice. Has intrinsic value to those that benefit from the professional practice, but can also contribute to "the theories and practices of higher education"

There are four sticking points in learning case study research methods

  1. What is the potential of the case study?
  2. What forms of data are acceptable?
  3. When does analysis stop?
  4. What makes a quality case study?

These are used to structure the rest of the paper.

Has a para which is essentially the research method para. Autoethnography is used. The four sticking points are addressed through a personal narrative.

What is the potential of the case study?

Specificity of case study research is seen to limit its contribution to theory. But that type of certainty of knowledge is very techno-rational. Case study inquiry involves individuals/teams interpreting data, requiring new standards of judgement (Flyvbjerg, 2006). Flyvbjerg contrasts rule-based and case-based knowledge; case-based knowledge is always context-dependent.

As I see it, no two practice contexts are ever genuinely the same and so rules and deterministic models for guiding thinking and action are not that useful. (p. 1115-1116)

Case study research cannot be truly replicated given the uniqueness of context, but it can be learned from. What each reader may learn will differ.

While it's possible to generalise from case studies (Denzin, 2009), it's unusual. Though it is argued that cases provide an opportunity for generalisation.

Case study research seen as better for generating hypotheses than theory building (Flyvbjerg, 2006) – depending on the definition of theory. Options include

  1. Explanatory and predictive of cause & effect and thus can direct action
  2. pragmatist perspective that has theory/practice intertwined. People generate theories to seek meaning in practice. A form of personal theory building.

Theoretical relevance enhanced if existing theories are integrated. Through which contribution can be made through new interpretation of data. “Existing theory should be seen as an integral part of the case” (p. 1116)

What forms of data are acceptable?

"case study may rely on multiple sources of evidence and be practiced as multi-method research (Denzin & Lincoln, 1994)" (p. 1117) as long as it helps answer the research question, i.e. it can use quantitative research methods. Apparently a surprise to some.

Tight (2012) found only 5% of 440 published articles in 15 higher education journals used a mixed quantitative/qualitative methodology.

When does analysis stop?

Outcomes of any analytic technique will depend on intentions, background knowledge, cognitive processes, mindset etc. Hence analysis is recursive.

Minimise the time between collecting data and writing the research account. “disciplined writing seems to be the most essential part of the analytical process” (p. 1118).

No genuine endpoint.

What makes a quality case study?

Better to engage wider theories than just describe practice. But that is not sufficient.

Quality case research:

  • requires imagination (Dewey, 1938)
  • requires creativity (Morse, 1995)
  • must bring the reader as close as possible to the experience (Fossey et al, 2002)
  • provide conceptual insight (Siggelkow, 2007)
  • should be believable, which requires coherence and provide new theory and instrumental utility (Eisner, 1991)
  • “potential to create an impact on the field of practice” (p. 1118)
  • have something important to say
  • well structured and clearly written

Argues that a case study should enable someone to learn from it.

Author explores the impact (from this measure) of one of his case study publications. Somewhat sobering results.

Conclusion

Summarises the key points to be "attentive to" and "cautious of" against the four challenges, plus some more general comments, e.g. research must fulfil its purpose and this needs to be known ahead of time to help align process and outcomes.

Does make the point that case study is a form of learning and that this can be seen in daily practice, more so than in research articles.

Also points out that publications on case study methods are “often complex or underpinned by unstated assumptions, following a procedure is never straightforward” (p. 1121). Case study research does allow you to learn from experience, so these methods should be seen as guidelines.

References

Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and implementation for novice researchers. The Qualitative Report, 13(4), 544–559.

Harland, T. (2014). Learning about case study methodology to research higher education. Higher Education Research & Development, 33(6), 1113–1122. doi:10.1080/07294360.2014.911253

Choosing a research publication outlet: Part II

My name is David. I’m an academic. It’s been 8 years since my last journal paper (but what a paper it was, thanks Shirley).

In my defence, three of those years were spent finishing the PhD, one finishing a GDL&T, and three getting used to a new discipline, all while feeling generally cynical about the whole journal process and not having much I considered useful to say. But that has to change, otherwise I'll have some explaining to do. This year is about writing some journal papers. So where should I publish?

I asked and answered a similar question in 2009, hopefully this time I might actually go the next step and write something.

(Note: I start the writing process by getting to know the destination. The theory being that a paper is more likely to get accepted if it fits the requirements, aims, tone etc of the destination. I already have a few ideas re: papers, finding a suitable destination is the first task.)

What’s changed since 2009?

I’ve officially moved from the Information Systems discipline to Education. But given I was originally focused on educational technology, the journals identified in that original post are still suitable.

I’m at a new university that has recently introduced awards for journal articles “designed to encourage .. researchers to make strategic decisions about the journals in which they seek to publish, in order to help them get the best return on their research efforts and as a crucial step in improving the research performance of … as an institution”. Obviously linked to the on-going ERA. This is judged largely on the Scopus SNIP (Source-Normalized Impact per Paper) index.

Obviously there is more to research impact than the journal’s “citation potential”, but it’s the current measure of choice. At the very least I’ll have to consider it.

Updating the list

For the following list I’ve

  1. Taken a list of journals mentioned in the original post.

    Have excluded a couple I no longer deem interesting.

  2. Added any other journals I think potentially relevant.
  3. Identified/updated the following information about the journals
    • Ranking as per the Oz government approach.
      This was used in the 2009 ERA, but wasn’t used in the 2012 exercise.
    • SNIP
      Provided by this search form.
    • h5-index and h5-median.
      In my travels I’ve discovered various other types of rankings such as Google scholar’s method which generates the following for

    • Are the papers open or closed?

      This remains an important personal consideration.

    • Position on article copyright.

      i.e. is it ok for me to put a copy of the article on my personal site? A potential proxy for being open.

    • Max paper size.
    • # Issues per year.
    • Turnaround time on review.
    • ERA FoR
      Using John Lamp’s interface to the ERA data.

Once I have that information, the plan is to look more closely at the current editorial directions and the types of papers being accepted for publication in each of the different journals.

The list and observations

The complete list is available as a Google spreadsheet.

The top 10 journals ordered by SNIP are

  1. Computers and Education
  2. Internet and Higher Education
  3. Studies in Higher Education
  4. Educational Technology Research & Development
  5. IRRODL
  6. British Journal of Educational Technology
  7. Educational Technology & Society
  8. Higher Education Research & Development
  9. Teaching in Higher Education
  10. AJET

The top 10 journals ordered h5-index as per Google Scholar are

  1. Computers and Education
  2. British Journal of Educational Technology
  3. Educational Technology & Society
  4. Internet and Higher Education
  5. IRRODL
  6. Educational Technology Research & Development
  7. Studies in Higher Education
  8. AJET
  9. Higher Education Research & Development
  10. Teaching in Higher Education

Interesting to see BJET and ET&S climb the ladder in terms of h5-index.

Could I win an institutional award?

This is not the purpose for writing, but it’s an interesting exercise in exploring how level the playing field is between various disciplines (other than the sciences).

The only results I can find for the institutional publication awards had the following SNIP values

  • 1st – 3.154
  • 2nd – 3.01
  • 3rd – 2.181
  • Student award – 3.273
  • Special mention – 8.2

    An article in Nature that was in the wrong time frame.

Computers and Education is by far the highest ranked of these journals using either SNIP or h5-index. C&E's SNIP value was 3.29, meaning a first place, at least for the above period.

Internet and Higher Education has a SNIP value of 2.55, making 3rd possible.

BJET, IRRODL and AJET (the journals most familiar to me) have SNIP values of 1.71, 1.77 and 1.16 respectively.

While a sample size of one isn't great, it appears possible, but only just, if I focus on higher education or general educational technology journals.

The couple of teacher education related journals that get a mention either don't have a SNIP or have one just above 1. Not likely to please the "impact police". Of course, I didn't go searching for more general teacher education journals.

Other misc. observations on journals

Computers and Education

Apparently has a "liberal copyright policy", which appears to permit posting to open websites but doesn't exactly trumpet that position, e.g. it doesn't appear to be mentioned on this page.

There is a choice to publish an article as open access, but there’s a fee required. $1800!!!!!! But as an author you do get a 30% discount on Elsevier books.

Recommends a clearly defined article structure. Wonder how many published articles follow that? Apparently not many.

Has a “turnaround time” of around 10 weeks.

Internet and Higher Education

Audience – “faculty, administrators, and librarians charged with the responsibility of fostering the use of information technology and the Internet on their respective campuses.” Potentially a good fit.

Seems Elsevier have a standard structure they like. Same here as Computers & Education.

About a 5 week turnaround time.

Educational Technology Research and Development

Two-month turnaround claimed.

The fee for your article being open access is $USD3000!!!!

BJET

$USD3000 cost for open access. Not really clear about whether it's okay to share versions of articles via personal websites.

So where might I publish?

No clear winner, but some thoughts include

  • Publishing in Computers and Education would satisfy the “impact police” but the initial topic I have planned may not fit well.
  • My preference is in open access journals and IRRODL is perhaps a closer fit for what I’m thinking of writing.
  • Internet and Higher Education is a closed journal, but also a reasonable fit and it promises the fastest decision time.

Time will tell.

This year it's all about the connections

The following is an attempt to give some structure to what I might do this year in terms of teaching and research. It’s in the same spirit as a similar post by Tim Klapdor. A summary of my possible contribution to 2015, rather than a set of predictions for 2015. More in line with Alan Kay’s quote

Don’t worry about what anybody else is going to do… The best way to predict the future is to invent it

(though it appears that a version of this adage may have arisen from others)

Two broad sections to the post

  1. Networks, learning and connections that attempts to find a unifying theme; and

    Mostly this section is useful for me.

  2. Questions and potential projects that attempts to outline what I might do (but never get near to completing).

Of course this list will change and grow as I (re-)connect with other ideas and people.

Networks, learning and connections

When I started this post (at least a couple of weeks ago now) the unifying thread seemed to be making connections and the idea that helping people make better connections with ideas, people and practices will help learning. The following quotes were central to this idea.

Goodyear et al (2005) define network learning as

learning in which information and communications technology (ICT) is used to promote connections: between one learner and other learners, between learners and tutors; between a learning community and its learning resources.

The Wikipedia page on Networked Learning starts with (and includes an interesting quote from Alexander’s “A Pattern Language”)

Networked learning is a process of developing and maintaining connections with people and information, and communicating in such a way so as to support one another’s learning. The central term in this definition is connections.

And have just come across the following quote from Dutton (2010) via Downes’ “A year in photos” (a post which has much to offer – and which surprisingly cites some of my work)

Organizations aren’t thinking about the ‘networked individual’ – the networking choices and patterns of individual Internet users. They’re still focused on their own organizational information systems and traditional institutional networks.

All of the above links to the problem I’m trying to fix. i.e. that a typical University’s attempt to implement and support network learning does not effectively encourage the development and maintenance of connections. Both in terms of the learning undertaken by the folk who pay to be students at these institutions, and in terms of the folk who are paid to teach at these institutions. Since it’s hard to make these connections in this context, much of the learning that occurs is far from good. If you fix this problem, the quality of learning should increase.

My last three publications have all been attempts to understand the problem and identify potential ways forward. In summary,

  1. Breaking Bad (Jones & Clark, 2014);

    Initial description of the BAD and the SET mindsets. The SET mindset is what I see informing the implementation of most institutional approaches to network learning. The BAD mindset is what I see as the more useful approach. This is illustrated by showing how the SET mindset has created a couple of concrete lounges and how the BAD mindset can help make it possible to live with a concrete lounge.

    The question is whether the BAD and SET mindsets can be fruitfully combined. Something I hope the following will help explore.

  2. Three paths (Beer, Jones & Tickner, 2014); and,

    Proposes three paths for change in institutional network learning: do it to, for and with. Typically most practice is in the form of “do it to” and “do it for” and ignores “do it with”. The “S” in the SET mindset stands for strategic, which is what the “do it to” and “do it for” paths best represent. The “B” in the BAD mindset stands for bricolage, which is what the “do it with” approach best represents.

    The paper illustrates how a successful institutional learning analytics project arose from a combination of all three. The argument being that without the "do it with" path – which would have been missing from a typical institutional project – this project would not have succeeded. The broader proposition is that network learning within a university is best served by a combination of the paths. (Almost by definition, if you're focusing on personal/individual learning you must take a "do it with" approach; you can't "do it to" yourself, can you?)

  3. TPACK as shared practice (Jones, Heffernan & Albion, 2015).

    Uses four themes associated with a distributed view of learning to analyse the experiences of three teacher educators as they try to develop the knowledge required for effective integration of ICT into teaching (TPACK). Illustrates how the practices undertaken by the three teacher educators use a distributed/social approach to learning to overcome the limitations of institutional systems and support, thereby partially illustrating the advantages of the "D" (for Distributed) in the BAD mindset over the "T" (for tree-like or hierarchical) in the SET mindset.

    Also talks about digital renovation and the value of participants in network learning (both learners and teachers) becoming digital renovators.

    As the title suggests, this paper also took a quick stab at outlining some of the questions that might arise from this perspective. These are built upon in the next section.

Questions and potential projects

This starts with the four themes used in Jones, Heffernan and Albion (2015). These are used to generate some questions and random ideas for projects. Then come some additional ideas which don't fit nicely into the themes.

Situated in particular physical and social contexts

Standard institutional practice tends toward ignoring the situated perspective of learning and knowledge. Most especially in terms of assuming that there are single approaches that can apply to all learning that occurs within the institution. This can be seen in the idea of a standard course layout and design for all course sites. Also in the generic nature of much of the support and professional development around network learning.

Some of the questions that arise include

  1. How will a University-wide consistent structure for course sites impact the situated nature of learning?

    My current institution is introducing this across the institution this year. Even at a first glance I can see this standard design causing issues with the design I have used in my course. What other impacts will it have? What workarounds will I (and others) develop to make the design fit our pedagogical plans and the requirements of our situation? Will the standard design achieve the goal suggested by the proponents of making it easier for students to find information?

  2. How can institutional learning and teaching support engage with the situated nature of TPACK and its development?
    Mishra & Koehler (2006, p 1029) argue that (emphasis added)

    Quality teaching requires developing a nuanced understanding of the complex relationships between technology, content, and pedagogy and using this … to develop appropriate, context-specific strategies and representation

    i.e. the best teaching arises from knowledge that is specific to the content/course/learner/context. Institutional L&T support is anything but specific. It tends to connect teachers with central support staff with generic expertise in technology or pedagogy. Rarely does it connect teachers with people with experience and expertise in the combination of technology and pedagogy or better yet technology, content, pedagogy and context. i.e. help staff teaching into the Bachelor of Education support each other (in addition to other support). But also help connect people across disciplines who are using the same technology and pedagogy. e.g. enabling people using the BIM Moodle module to know who else is using it and engage in conversations.

    Jones et al (2015) shares some of the experience of how we did something approaching aspects of this using a blog shared amongst a group of teacher educators. I think there’s significant potential in embedding more of this sort of thing within the institutional network learning environment. More on this soon.

  3. How can University-based systems and teaching practices be closer to, or better situated in, the teaching contexts experienced by pre-service educators?

    Most of my teaching is to people studying to become teachers. While they are studying they use a range of university provided technologies. Once they graduate, how well do these technologies prepare them for what they will and should (not the same thing) use as teachers? Will they continue to use any of it? If the technology they use whilst studying is better situated within a teacher's professional practice, will they use it more/better? Can we, and how do we, better integrate the technology used by teachers?

Some additional questions

  1. How and with what impacts can process analytics be implemented in my course?
    Process analytics are a type of analytics that provides information about the “information processing and knowledge application” (Lockyer et al, 2013, p. 1448) of learners as they engage within a particular learning design. This implies that the analytics being gathered are specific to the particular learning design being used. It is situated.
    Currently most university learning analytics is anything but situated. It’s the harnessing of data warehouses and other enterprise tools to provide generic analytics, not analytics situated in a particular learning design in a particular course. This is something that the largest course I teach is crying out for. Would love to explore what I can do.
  2. How to situate the knowledge of what works in educational technology into the practices of my students?
    There’s a lot of research about what works (and doesn’t) in learning and learning with technology. Research that could provide useful knowledge to the students taking EDC3100, ICT and Pedagogy. The question is how to help them build meaningful connections to the knowledge arising from that research in ways that are effective. This paper gives an overview of some of the problems.

    This problem is not that far removed from the same problem applied to university teachers in an e-learning environment.

Social

An oft repeated refrain around here is, “we don’t have those conversations about teaching anymore”. Those hallway and lunch-room conversations where colleagues share insights and experiences about teaching don’t appear to be happening any more. Lots of causes for this including the massification of higher education, multiple campus institutions, increasing workload etc.

But another under-estimated cause is that the institutional network learning technologies are not designed to enable and encourage social conversations about teaching. Jones et al (2015) shares a story about how a small group of us set up a shared blog to support some of this, but I think we can do better.

Two early questions that arose were

  1. How can the development of TPACK by teacher educators be made more social?
    i.e. how can the learning and teaching environment better encourage social conversations about teaching.

  2. How can TPACK be shared with other teacher educators and their students?
    As teacher educators we are meant to model the development and application of TPACK to our students. How can they view and engage with us in the social conversations that help encourage development of TPACK?

Distributed

Jones and Clark (2014) argues that too much of university e-learning is influenced by a tree-like (hierarchical) conception, i.e. that the world can be divided up into a neat hierarchical structure of separate components that are controlled by a particular group/individual and that these components can be combined to produce a larger whole without losing anything. Perhaps most importantly, each of the components can only make connections that are approved by the hierarchy (if at all).

Early questions include

  • How can technologies specific to teacher education (e.g. lesson and unit plan templates) be enhanced to increase the capability and learning of teacher educators?
    The current planning templates at my institution are not distributed. They are stand-alone. There is no automatic connection to the work of others or to the professional resources required for planning.

    For example, if I’m creating a lesson plan to help meet this content descriptor from the Australian Curriculum I have to manually copy the content descriptor into the unit plan. I also have to manually search the Scootle database/community for resources and discussions associated with this content descriptor. I have to manually search online for other lesson plans that aim to cover this content descriptor.

    More immediately important to my teaching: the students, markers and I have to manually check a completed lesson plan to see that the content descriptors and the assessment criteria are aligned as per the curriculum, rather than have the unit plan template automatically populate the plan with the appropriate criteria based on the selected content descriptors (a sketch follows this list).

  • Is it possible to measure the digital fluency of a university, rather than focus on its teaching staff?
    The recent claim that the lack of digital fluency of teaching staff is the #1 “significant challenge impeding higher education technology adoption” strikes me as blaming the boxes at the bottom of the hierarchy for the problems of the whole. The knowledge of how to effectively integrate technology into learning and teaching isn’t something that’s solely in the head of teaching staff. It’s knowledge that is distributed across the many connections that make up the university.

    I’d argue that if digital fluency is a challenge to technology adoption, then it’s the lack of fluency held by the entire network, not the individual teachers that’s the problem.

    How would you measure the digital fluency of a university?

  • Can the LMS (and other systems) be more like the Globe Theatre?
    The LMS and other systems used to implement e-learning within universities provide little or no additional intelligence to the design and orchestration of learning. All (or the vast majority of) the intelligence comes from the designer(s)/teacher(s). The Globe Theatre was set up to help actors meet the demands of their task; can the LMS be similarly enhanced? This is linked to the CASA idea below.
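Returning to the planning template point in the first bullet above, a distributed template might be driven by a structure like the following. This is a minimal sketch: the descriptor code, text and criteria are hypothetical placeholders and would, in practice, be fetched from an API rather than hard-coded.

[code lang="js"]
// Hypothetical mapping from content descriptor codes to their text
// and aligned assessment criteria. In a distributed template this
// would come from an API (e.g. curriculum or Scootle data), not be
// hard-coded.
var DESCRIPTORS = {
  "ACTDIP013": {
    text: "(descriptor text would be drawn in here)",
    criteria: ["(aligned criterion 1)", "(aligned criterion 2)"]
  }
};

// Populate a plan from the selected codes, removing both the manual
// copying and the manual alignment check.
function populatePlan(selectedCodes) {
  return selectedCodes.map(function (code) {
    var d = DESCRIPTORS[code];
    return { code: code, descriptor: d.text, criteria: d.criteria };
  });
}

var plan = populatePlan(["ACTDIP013"]);
[/code]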

Encouraging more distribution also raises questions about

  1. How can the institution be encouraged to implement APIs and other means for enabling greater distribution?
    University information systems are generally fairly closed. The provision of APIs offers a way to make it easier to develop distributed systems.

    I’m particularly interested in how I can modify BIM appropriately to offer an API that can help in my own teaching.

  2. Exploring the impact of connections on student learning in EDC3100.
    In EDC3100 (a course I teach) students are required to create and use their own blog. They are encouraged to link to the blogs of other students and to outside links. Does encouraging these connections have any impact on learning? How might we find out? How can this inform the design of the course and the design of process analytics?

    Nick Kelly and I are in the early days of exploring how analytics might help.

Protean

The argument in both Jones et al (2015) and Jones and Clark (2014) is that typically universities see digital technologies as established. i.e. they are very difficult (if not impossible) to modify and what changes can be made are only done by qualified people under very strict governance structures. The argument is that this destroys/ignores the inherently protean nature of digital technologies. These are technologies that can be and should be changed.

Both those papers have the early stages of the development of an argument that good e-learning must involve an element of digital renovation by both learners and teachers. i.e. the learners and teachers must be able to modify the technologies to best suit their needs.

Some questions include

  1. Can the outputs of digital renovation practices by individual staff be shared?
    Currently a small number of staff (including myself) engage in a range of digital renovation practices. We implement kludges to make the concrete lounges that infest institutional e-learning into something approaching comfortable. These changes are typically unique to our contexts (they are situated) and abilities, and can't easily be shared.

    Being able to share these practices could potentially provide some advantage. If only in being aware of where the concrete lounges are, but also perhaps as a way to address the resourcing issue that arises in institutional e-learning.

  2. How can institutions encourage innovation through digital renovation?
    Some of this connects back to the idea of APIs above. Assuming digital renovation is a good thing, how can an institution encourage and benefit from the innovations that would arise from more widespread digital renovation?
  3. What are the challenges and benefits involved in encouraging digital renovation?

A FedWiki textbook?

Beaulieu National Motor Museum 18-09-201 by Karen Roe, on Flickr
Creative Commons Creative Commons Attribution 2.0 Generic License   by  Karen Roe 

This is a random thought that's sprung from an unexpected source. I have just received an email from the institutional leaders of learning and teaching at my institution advising of two new internal grant opportunities. One of those is an "Open textbook initiative" intended "to develop an alternative to the traditional textbook".

For now, I'll leave aside the various horseless carriage connections to be made with the idea of focusing on an alternative to a traditional textbook.

Given my recent dabbling with the #fedwikihappening, my immediate thought was what would a “textbook” look like if it were implemented on top of the Smallest Federated Wiki (SFW)? You certainly couldn’t get much more open. What would a SFW-based textbook even look like? Is it a nonsensical idea?

Tim also has a post reflecting on his experience with #fedwikihappening.

Looking for and bridging the chasm

The presentation given for Jones & Clark (2014) was structured around the idea of a chasm between what was possible with existing institutional systems and what was actually needed to efficiently perform a task. The paper identifies two examples of a chasm that we bridged using the BAD mindset.

Very briefly at the end of the presentation a “process” was suggested as a possible way forward. This process was thrown together late the night before the presentation after enjoying a post-conference drink or two. It’s not great, but a starting point. The process was based around the following questions

  1. What are the important e-learning tasks?
    What do the institution, its teachers, and its learners think are the important tasks associated with learning? That might be something like the 7 principles for good practice in undergraduate education or some other identification of what is considered important.

    Importantly, these “tasks” should NOT be deemed equivalent to functionality provided by information systems like an LMS. They should be tasks/activities that teachers and/or learners engage in.

    At some level, this step is seen as allowing management and the “experts” to provide their input into what is important. An attempt to integrate some level of strategic thinking. Having discussions about what is seen as important across all teaching staff could potentially be useful (and divisive).

  2. How’s the reality/rhetoric chasm?
    Critically examine what a teacher/learner needs to do to complete those important tasks using existing systems etc. How hard is it to complete these tasks well?
  3. Who can/is bridging that chasm?
    If the task is important and difficult to currently complete, then there will almost certainly be teachers/learners who have bridged this gap via various means. Find out who they are and what they are doing. This can help understand the causes of the chasm, identify ideas for bridging it, and probably also help identify other chasms that they are dealing with.
  4. What help is being/can be provided to help bridge the gap?
    The chasm is probably being bridged in spite of the institutional systems, rather than because of them. Identify what can be done to make it easier for those bridging the chasm. Chances are these steps will help others (see next question), but they may also help the bridge-builders develop better bridges.
  5. Can “bridges” be shared/scaled?
    Can the bridges identified be shared more widely? What might it take? Can those bridges inspire tweaks to institutional systems that can be scaled?

This basic idea is very much aligned with the idea of “making quality feasible” from Biggs (2001). An idea that is part of his suggestion for a reflective institution as a way of assuring and enhancing the quality of learning.

Interestingly, I had some early conversations with @catspyjamasnz about some ideas about how some of this might be done.

My proposition is that this type of process is missing from most institutions because they have adopted the Strategic view of how things get done (see Jones and Clark, 2014). A mindset that assumes you can identify the requirements, design the system, build the system, and then let it run in support mode for a long time. It doesn’t recognise the complex collection of work-arounds and connections that have to be made in order for things to work effectively.

Context-Appropriate Scaffolding Assemblages (CASA)

Late last year the part of the university I currently work for produced a new method for setting and managing supplementary assessment. In essence, another go at passing the course for those students who just fell short. It was the responsibility of the course examiner to implement this new method with the support of a 9-page PDF containing instructions and screenshots of how to perform the method using existing institutional systems. Some difficulties and disquiet arose as the process wasn’t immediately straightforward, and if you only had one or two students in this situation it felt like the cost outweighed any benefit.

The assumption is that the person should have the knowledge and the time to implement this drawn-out process, rather than the available technologies providing appropriate support. For example, a better system would have shown a list of students, allowed the course examiner to select those who had been granted supplementary assessment, and then set up those assessments appropriately for the course examiner.

However, such a system isn’t possible because the generic tools the institution uses know nothing about these more specific, situated processes and requirements. Hence the gulf has to be bridged manually by the person.

To bridge this gap there appears to be a need for context-appropriate scaffolding assemblages. i.e. collections of technologies and other resources that are appropriate to the context and which scaffold the performance of important tasks. Not as entire new systems, but rather as ways to glue together different existing systems.
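
To make the idea a little more concrete, here is a minimal sketch of the sort of glue a CASA might be assembled from, applied to the supplementary assessment example above. The results.csv export, its columns, and the 45-49% band are all assumptions standing in for whatever the actual institutional systems and policy specify.

    // A minimal CASA-style glue sketch (Node.js). Reads a hypothetical
    // results.csv export (id,name,mark) from the student records system
    // and lists the students who just fell short, i.e. the candidates
    // for supplementary assessment.
    const fs = require("fs");

    const students = fs.readFileSync("results.csv", "utf8")
      .trim()
      .split("\n")
      .slice(1) // skip the header row
      .map(line => {
        const [id, name, mark] = line.split(",");
        return { id, name, mark: Number(mark) };
      });

    // The 45-49% band is an assumption standing in for the institution's
    // actual supplementary assessment rule.
    const candidates = students.filter(s => s.mark >= 45 && s.mark < 50);

    for (const s of candidates) {
      console.log(`${s.id}\t${s.name}\t${s.mark}`);
    }

The value isn’t in the script itself; it’s in encoding the situated knowledge (who qualifies) once, so the course examiner no longer has to bridge that gulf manually.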

This is not a new idea. My PhD work suggested the idea of scaffolding, context-sensitive conglomerations. There are two reasons for the shift to CASA:

  1. Obtain a decent acronym; and
  2. More strongly connect with socio-material perspectives.

Honan (2004) suggests understanding teachers as

bricoleurs, who gather an assemblage of practices, ideas, and theories, to create meaningful classroom practices (p. 109)

The idea of CASA is that, for quality learning in a contemporary university setting, it is no longer appropriate for the teacher alone to be doing this. Taking the distributed and social perspective on learning and knowledge, it is necessary that the organisation take on the role of bricoleur, gathering an assemblage of ……

The fix to the Peoplesoft gradebook implemented last year is an early example of a CASA. A Greasemonkey script is used to create an assemblage that makes for a more meaningful/appropriate practice. Both this example and the supplementary assessment problem above are more administrative than pedagogical. The challenge is to explore how and what CASAs can be developed to aid learning and teaching.
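
As an indication of how small such an assemblage can be, the following is a sketch of a Greasemonkey-style user script in the same spirit as the gradebook fix. The @match URL, the table id and the cell class are all hypothetical stand-ins; a real script would use the markup of the actual Peoplesoft pages.

    // ==UserScript==
    // @name   Gradebook CASA sketch
    // @match  https://example.edu/gradebook/*
    // @grant  none
    // ==/UserScript==
    // A sketch that reshapes a generic results page around a situated
    // task: spotting students granted a supplementary assessment.
    // All selectors here are hypothetical stand-ins.
    (function () {
      "use strict";

      const table = document.querySelector("#resultsTable");
      if (!table) return;

      for (const row of table.querySelectorAll("tbody tr")) {
        const grade = row.querySelector(".grade");
        // Highlight rows flagged for supplementary assessment ("SA")
        // and fade the rest, so the relevant students stand out.
        if (grade && grade.textContent.trim() === "SA") {
          row.style.backgroundColor = "#fff3b0";
        } else {
          row.style.opacity = "0.5";
        }
      }
    })();

The script changes nothing in the underlying system; it simply layers situated knowledge over a generic interface, which is the essence of the CASA idea.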

References

Biggs, J. (2001). The Reflective Institution: Assuring and Enhancing the Quality of Teaching and Learning. Higher Education, 41(3), 221–238.

Dutton, W. (2010). Networking distributed public expertise: strategies for citizen sourcing advice to government. One of a Series of Occasional Papers in Science and Technology Policy. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1767870

Goodyear, P., Jones, C., & Asensio, M. (2005). Networked learning in higher education: Students’ expectations and experiences. Higher Education, 50(3), 473–508. doi:10.1007/s10734-004-6364-y

Honan, E. (2004). Teachers as bricoleurs: Producing plausible readings of curriculum documents. English Teaching: Practice and Critique, 3(2), 99–112. Retrieved from http://www.geocities.ws/ehonan05/EngTchingpracticeandcritique.pdf

"Do it with" teachers or students?

Late last year @salvetore read and commented on the “three paths” paper that @beerc and @rolley wrote for ASCILITE’2014. (Always nice to know that someone reads a paper). The following is a response I’ve just added to the thread on the Moodle research discussion forum. In particular, it picks up on two suggestions from Sean Dicks, which, badly summarised, are:

  1. That the focus should be on doing it with students, rather than teachers.
  2. That the use of Trigwell’s model (see images below) “seems somewhat incongruent with the earlier statement that in complex systems parts are all ‘interacting and evolving so that we cannot understand any of the agents or systems without reference to the others’”.

The original paper argues that for learning analytics (and perhaps any form of educational technology) to move beyond being a fad, it needs to be implemented by drawing on a good combination of three paths: do it to, for, and with the teachers. (Remember this is in the context of Australian higher education).

The focus on teachers, not students??

The focus on teachers rather than students in the paper comes from our increasing disquiet with the rise of the “do it to” approach around educational technology within Australian higher education (this is the context we work within). This is especially obvious within learning analytics over recent years.

The “do it to” path involves senior (and not so senior) management responding to the rising fad, identifying a tool and implementing it with the expectation that it will improve learning and teaching. The innovation is done to the teaching staff who are expected to change their practice.

In our experience, this path generally doesn’t produce any significant, widespread change in learning or teaching.

We don’t disagree that the ultimate focus is on “do it with” the students, to contribute to the improvement of student learning. The question is how best do you achieve that goal within higher education as it currently stands?

And that’s where Trigwell’s model enters the picture, i.e. the idea that what the student experiences is largely (but not completely) influenced by the strategies, planning and thinking of teachers. The proposition is that if you change teachers’ thinking and strategies, then the student learning experience will change.

Trigwell's model of teaching

Our concern is that the current focus on the “do it to” path ignores this and instead assumes that it’s possible for the institution to directly impact the student experience. Whereas the Trigwell model (or at least one interpretation of it) suggests that the best the “do it to” approach can achieve is to change the teaching and learning context.

The “do it with” approach is focused much more on engaging directly with teachers’ strategies as the starting place. Almost like desire paths, it builds on what they are already doing. In part, the assumption is that people learn from experience: change their experience (strategies) and you have a better chance of changing their thinking.

Hopefully, the institution doing more “with teachers” will lead to a situation where teachers are doing more “with students”.

Complexity

We do believe that with learning and teaching in a university setting, we’re talking about a complex adaptive system. The following slide (click on it to see a larger version) from an earlier presentation that contributed to the paper visually represents this belief, and how it connects with the Trigwell model, a little better.

Slide65

While (at best) the teaching and learning context may be the same, each of the subsequent layers will be unique to each individual teacher and student, i.e. each teacher will think differently, which will in turn translate into different approaches to planning and hence strategies. In addition, each student will respond to those strategies in different ways. Hence the multitude of arrows in the above.

The double-headed nature of the arrows also points to the complexity of the problem. Double-headed is meant to suggest an inter-dependence. If you change a teacher’s strategy (and thus their experience), you will hopefully change their planning, which in turn will change their future strategies, which in turn will… and these changes will happen in ways that could not have been planned. Which is why we think doing it “with” teachers is so important: it enables the institution to respond effectively, building upon positive changes and responding to negative ones.

TPACK as shared practice: Toward a research agenda

The following is a draft of a paper that has been accepted at the SITE’2015 conference, and which has just been selected as a SITE 2015 Outstanding Paper.

Authors: Peter Albion, Amanda Heffernan and myself.

Jones, D., Heffernan, A., & Albion, P. R. (2015). TPACK as shared practice: Toward a research agenda. In D. Slykhuis & G. Marks (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2015 (pp. 3287–3294). Las Vegas, NV: AACE. Best paper award

Abstract

The task of using Information and Communication Technologies (ICT) to effectively teach hundreds of pre-service educators, many of whom never attend campus, is a significant challenge, which is amplified by the need to do so in ways that model how they might use ICT in their own classrooms once they graduate. This paper analyses a collection of posts written across a teaching year on a group blog by three teacher educators as they explored their practice and attempted to learn how to meet this challenge. The analysis uses a distributed view of knowledge and learning to identify the barriers and enablers encountered, and how the teacher educators developed their distributed TPACK throughout the year. The paper argues that a distributed view of TPACK offers some interesting insights that can inform practitioners, researchers and policy makers as they explore practice and learn how to meet the technology integration challenge.

Introduction

The 30+ year goal of effective integration of technology into learning and teaching remains elusive (Brantley-Dias & Ertmer, 2013) despite the voluminous amount of effort, research, and literature devoted to developing insights into how to achieve it. Belland (2009) critiqued the focus of researchers on teachers’ beliefs and barriers to adoption of Information and Communication Technologies (ICT) for learning and teaching. Instead he suggested that the explanation for limited uptake of ICT by teachers in their classroom practice is related to their past experience and its effect upon teachers’ habitus or dispositions to act in certain ways. He argued that, consistent with the observation that teachers very often teach as they were taught, the most powerful influence on the practices of teacher graduates would be their own experience as students in school. He cited examples of teacher educators professing constructivist beliefs but using teacher-directed approaches to prepare teacher candidates to adopt constructivist practices. For teacher education to be effective in overwriting the effects of conventional schooling to graduate teachers prepared to integrate ICT, it must employ the strategies it promotes and allow teacher candidates to experience them as learners. The value of such modeling has long been recognized (LeBaron & Bragg, 1994) but not often achieved.

Technological Pedagogical Content Knowledge (TPACK)–one recent significant component of the literature–has become a popular framework for describing the knowledge required by teachers to successfully integrate technology and has underpinned attempts to measure and develop the TPACK of individual teachers (Di Blas, Paolini, Sawaya, & Mishra, 2014). As a consequence of the on-going growth in the perceived importance of ICT in education it has increasingly been seen as essential for teachers at all levels of education to build their TPACK (Doyle & Reading, 2012). Hence, teacher educators require knowledge in the domains of content, pedagogy, technology and their various intersections. Typically they are well prepared by education and experience in content and pedagogy but technology, and its intersections with the other forms of knowledge, presents significant challenges. Teacher educators are subject to their own habitus (Belland, 2009) and often have limited, if any, experience of the application of new ICT. Lankshear, Snyder and Green (2000) observed that, where teachers have limited experience of particular practices in the real world, they are challenged to design related authentic activities for the classroom. The protean nature of ICT, as discussed below, continually places teacher educators in the position of needing to model ICT practices that are not established parts of their repertoire. How then can teacher educators effectively model ICT integration so that the habitus of teacher candidates will be transformed? How can teacher educators develop the TPACK required for such modeling?

These were among the questions facing a group of teacher educators at the beginning of 2014. With a long history in the provision of distance education, the University of Southern Queensland has over recent years moved into online learning in a strategic way. The institution’s strategic plan (USQ, 2012) seeks to “provide students with personalised adaptive learning support” through the use of “emerging technologies” that “enable collaborative teaching and individualised learning for students regardless of their location or lifestyle” (p. 6). For teacher educators this has meant that “by 2012, up to 70% of students in the 4-year Bachelor of Education were studying at least some subjects online” (Albion, 2014, p. 72) and struggling at times when the reality did not always match the institutional rhetoric.

At the start of 2014 a private, shared blog was set up amongst a group of six teacher educators in an attempt to explore their shared practice and help bridge this reality/rhetoric gap. The blog provided a space to share stories about what frustrated, pleased, confused and surprised the teacher educators as they attempted to integrate technology into their teaching. Acting as a virtual water cooler, the blog evolved into a space where teacher educators at different stages of their careers could share practices; seek and provide support; and learn from each other. Between February and October 2014, 82 posts and 150 comments were shared by the six teacher educators.

This paper explores some experiences captured in the 77 posts and 111 comments shared by the three main contributors to the blog, who are also co-authors of this paper. Given the situated and distributed nature of the experience of using the shared blog, the paper draws upon a distributive view of learning and knowledge for this analysis. It begins by looking at recent writing around TPACK with a particular focus on perceived issues with TPACK and suggestions that a distributed view of TPACK might prove useful. Next, the paper describes four conceptual themes of a situated and distributed view of learning and knowledge. These themes are then used to identify and analyse the stories shared on the group blog. Finally, some initial questions for practitioners, teacher educators, researchers and policy makers are raised in the form of a proposed research agenda.

A distributive view of learning and knowledge

As noted by Di Blas et al. (2014), TPACK (the knowledge required to effectively integrate technology) “has consistently been conceptualized as being a form of knowledge that is resident in the heads of individual teachers” (p. 2457). The potential limitations of this perspective led those authors to draw on distributed cognition to explore the idea of distributed TPACK. Earlier, Putnam and Borko (2000) examined “new ideas about the nature of knowledge, thinking and learning” (p. 4), ideas which they labelled the “situative perspective” and included ideas such as situated cognition, distributed cognition and communities of practice. Phillips (2014) used these ideas in his investigation of the development and enactment of TPACK in workplace settings, implicitly recognizing the existence of TPACK as a form of shared practice embedded in context rather than knowledge held privately by individuals. Dron and Anderson (2014) suggest that the situative perspective of learning shares some common concepts and themes with a range of theoretical perspectives including heutagogy, distributed cognition, activity theory, Actor Network Theory, complexity theory and complex adaptive systems, Communities/Networks of Practice, and Connectivism. This range of perspectives has arisen from diverse disciplinary traditions.

Putnam and Borko (2000) developed three conceptual themes capturing the essence of these new ideas to examine the implications they may hold for how teachers design learning experiences, and learn about new ways of teaching. In the following we describe and use these three themes plus one other to analyse and draw implications from the stories shared on the blog.

Situated in particular physical and social contexts

The situated theme of learning and knowledge rejects the idea that context does not matter. Instead the entire context, understood as an interactive system including people, materials and representational systems, in which an activity takes place becomes “a fundamental part of what is learned” (Putnam & Borko, 2000, p. 4). The situated nature of learning means that an inappropriate context can limit transfer of learning into different contexts, a perspective that Putnam and Borko (2000) link to teacher complaints about professional development removed from the classroom being seen as “too removed from the day-to-day work of teaching to have a meaningful impact” (p. 6).

Social in nature

The social perspective of learning and knowing recognises the important role of other individuals and of discourse communities beyond encouragement and stimulation. Instead, how we think and what we know arises from our on-going interactions with groups of people over time. The implication is that rather than learning being limited only to instruction in particular concepts and skills, it “is as much a matter of enculturation into a community’s ways of thinking and dispositions” (Putnam & Borko, 2000, p. 5). The conception of schools as a powerful discourse community with established ways of thinking offers a partial explanation for the resistance to fundamental change of classroom teaching (Putnam & Borko, 2000).

Distributed across the individual, other persons and tools

The view of cognition as distributed proposes that the knowledge required to perform a task does not exist solely within an individual person, or just within groups of people. A distributed view sees cognition as requiring a contribution from a range of people and artifacts. The appropriate tools can enhance, transform and distribute cognition and through this expand “a system’s capacity for innovation and invention” (Putnam & Borko, 2000, p. 10). This view offers lenses for exploring how technologies may be able to support and transform teaching and learning (Putnam & Borko, 2000).

Digital technologies are protean

Part of the argument for the addition of technology to Shulman’s PCK to form TPACK was that the rise of digital technologies had created “a very different context from earlier conceptualizations of teacher knowledge, in which technologies were standardized and relatively stable” (Mishra & Koehler, 2006, p. 1023). The rapid and on-going evolution of digital technologies means they never become transparent and it becomes important for teachers to continually develop technological knowledge (Mishra & Koehler, 2006). Though it has been suggested that, as digital technology use within schools becomes more common, “TPACK should, at least in theory, become embedded within other aspects of teachers’ knowledge” (Brantley-Dias & Ertmer, 2013, p. 117), the evolution of digital technologies will require corresponding changes in TPACK, making it inherently unstable knowledge. With this theme we are seeking to explore two propositions. First, that technological knowledge should remain a first-class component of TPACK, because of its constant and rapid change. Second, that there may be benefits from changing the nature of the technological knowledge that is useful for teachers. As a result, this is a far more tentative theme than the previous three, but it also builds on the increased role technology plays in cognition suggested by those three themes.

Dron and Anderson (2014) quote Churchill (1943) as saying “We shape our buildings and afterwards our buildings shape us” (p. 50). But digital technologies are different, or at least they can be. Kay (1984) described the “protean nature of the computer” (p. 59) and suggested that it is “the first metamedium, and as such has degrees of freedom and expression never before encountered” (p. 59). However, experiencing the full protean nature of digital technologies requires the knowledge to manipulate them, particularly through (but not limited to) programming. If learning and knowledge are distributed across networks of people and objects – which in contemporary classrooms includes a significant proportion of digital technologies – then the ability to modify digital technologies appropriately would seem to be one approach to enhancing learning, especially given Shulman’s (1987) view that the distinguishing knowledge of a teacher is the capacity “to transform the content knowledge he or she possesses into forms that are pedagogically powerful and yet adaptive to the variations in ability and background presented by the students” (p. 15). With digital technologies it is possible and desirable that we shape our technologies, then our technologies shape us, and then – as we learn – we shape our technologies some more.

Stories from the blog

The following analysis draws on these four conceptual themes from a distributed view of TPACK to analyse the stories shared on the blog.

Situated

A number of discussions on the blog focused on the challenges of centralised processes or communications that needed to be contextualised to our own circumstances. As noted previously, an inappropriate context can limit transfer of learning, and posts from the blog demonstrated that a similar limitation was found when communications or processes seemed less than an effective fit for our context.

An indication that the institutional units responsible for the technologies at the university are perhaps not working in exactly the same context as teaching staff is revealed through upgrade and maintenance work occurring at perhaps the worst possible times. For example, upgrades to the online assignment submission and management system were scheduled in the week major assignments were due to be submitted.

Senior academics responsible for learning and teaching adopted a communiqué model to encourage effective learning, teaching, and technology integration. This model involved the same intermittent email sent out to all academic staff across the whole university. Due to their less than context-specific nature, the communiqués were limited to largely generic information. The lack of connection to the situated experience of teaching staff became a common concern about this model. For example, a communiqué including notice of decommissioning of the old Computer Managed Assessment process was sent to all academic staff, regardless of whether they made use of this function.

When considering the importance of situated knowledge and understanding of ICT, questions were raised in terms of how we are preparing our students for their future teaching contexts. The institution’s use of an LMS and other tools that are unlikely to be found in many schools generated questions about the usefulness of these approaches in terms of preparing students to transfer their experience and skills beyond the context of university into their own teaching practices. Further consideration of this notion can also be found below in relation to the ‘distributed’ theme.

Discussion on the blog noted, in particular, the power of our shared context as a strength of the learning opportunities afforded by the collaborative nature of the space. Centralised support options by their very nature have to be neutral and accessible by any academic within the many disciplines offered by the university, with the support on offer being limited to the experiences of staff typically without any knowledge of the specifics of teacher education. In comparison, a blog shared by a group of teacher educators enabled the sharing of expertise developed through many years in various education sectors and systems. Moreover, that experience was grounded in attempting to develop and harness TPACK to enhance teacher education for the same cohort of students, thereby significantly increasing the relevance of the support provided to our own needs and those of our students. This reflects two themes: first, as mentioned previously, that the learning opportunities within the blog were situated in our contexts; and second, the notion that distributed TPACK is social in nature.

Social

By its very nature the blog was an appropriate way of sharing information in ways that fit into our own schedules, likely more so than a traditionally scheduled meeting or forum might have been. Furthermore, the approach provided opportunities for participants to post comments and observations, or voice frustrations, at the time that they arose. This resulted in a wider range of issues being raised than might have been discussed had we waited for a scheduled meeting. Issues or points that might not have ordinarily warranted a specific email or phone call to other participants were quickly and easily posted to the blog. Fellow participants could then opt-in or -out of the conversation, depending on whether they were interested or experiencing a similar issue. Given that all three of us were comfortable with the interactive or social nature of online environments, the blog functioned well as a platform for discussion. That we contributed the majority of posts and comments on the blog does raise the question of whether our colleagues found it an equally conducive platform.

One aspect of the blog that had significant rewards was the power of sharing practice, adding value to our teaching as a result. As a relative newcomer to tertiary teaching, Amanda was seeking to learn more about the ways colleagues were making use of the various functions afforded by Moodle, the LMS used by our institution. David provided access to the online environment for one of his courses, giving Amanda the opportunity to explore the ways of working initiated by a more experienced colleague who holds expertise in the field of ICT and education. One of the key themes explored earlier in this paper, the notion that TPACK is situated in particular physical and social contexts, was demonstrated by the blog in that it provided opportunities for learning that were contextualised to our field. This was particularly evident in the result of this collaboration, where Amanda was able to see the LMS being utilised in ways that she previously had not observed in other courses. In addition, this provided the opportunity to engage in informed discussions with David about the impact of certain tools and approaches being used in his course. An important aspect to note here is that this was provided in a context of teacher education courses as opposed to other, less contextualised, opportunities that may have been available through central support or elsewhere online. This resulted in a shift in practice for Amanda, adjustments to the online spaces, and more efficient ways of delivering learning experiences for students in her courses.

In the same vein, another significant discussion on the blog was about the delivery of course content and learning experiences for our students. Similar opportunities for shifts in practice resulted from discussions about processes for releasing course content, and even preferences for ways of recording lectures or vodcasts. The key opportunity here, of course, was the chance to discuss these approaches with people who had a shared context of teacher education, alongside the variety of levels of experience and expertise in different areas, which added a richness to the conversation. Interactions beyond the blog also benefited participants, with David and Peter’s interactions in other online spaces such as Twitter providing solutions to challenges that David was facing with the LMS. The solution provided by Peter was identified as one that would not have been found as easily without these social networks.

Distributed

Stories from the blog are also illustrative of the notion that TPACK is distributed across the individual, other persons and tools. A number of the stories shared were identifying and sharing our responses to gaps in the complex assemblage of people, technologies and practices that make up learning and teaching, gaps that seemed to reduce the overall level of TPACK available across the system. One example of this problem was a new system for managing course documents that could support the use of only the Harvard referencing style, even though courses in teacher education and some other departments of the university require the use of the APA referencing style. Other examples of such gaps included issues with gaining access to computer labs, complexities of appointing casual teaching staff to work in courses, accessing the information necessary to create student groups, issues with the availability and performance of the LMS at the start of semester, and processes for contacting students who did not submit assignments, or checking in with students who had yet to engage within our courses. Discussion ranged from commiserating with our colleagues over having similar issues, to providing solutions which each of us had developed over time.

At times, though, these solutions became moot when systems changed unexpectedly and each of us identified some approaches that had worked in the past but were no longer possible for various reasons, either around submission of assessment, delivery and rollover of course content for ease of updating, or the continued use of an e-portfolio from a previous institution. While each of these issues was solved with workarounds, this was often a time sensitive exercise and the end result was not necessarily ideal. For example, Peter was working to modify course content from a previous semester and reached a solution, but “it took far longer than it needed to and finished in what seems an illogical arrangement”.

The rise of Web 2.0 and the cloud has also seen a significant increase in the availability of a range of technologies (e.g., Google providing student email accounts) from providers outside of the institutional context. The use of Facebook and other social media as a means of working with students was also a pertinent discussion point at different times throughout the year. At times we discussed issues that had arisen on social media sites that were impacting upon the course (such as misinformation being disseminated and causing confusion for students), but we also discussed ways of trying to engage students on multiple platforms. Tensions sometimes emerged around this integration of ‘outside’ technologies. One example of note is the frustration voiced by an institutional learning designer at the direction they were given to encourage the use of formally approved technologies only. The discovery of a taxonomy separating technologies into core, supported, allowed and emerging is also worth noting here and harks back to the concepts discussed earlier in the situated theme, wherein we questioned how comprehensively we were preparing students for a world beyond our institutional tools and approaches.

This insular, institutional view of allowed systems is the reason why the closing of discussion forums on course sites just prior to and for 72 hours after an exam is seen as a practice that will prevent students being able to communicate inappropriately about the examination. This seems to be a moot practice, given the standard experience of each course having its own (and sometimes multiple) student-created and private Facebook groups, not to mention the other LMS provided forms of communication that are available to students.

All of these experiences with the problematic distribution of TPACK across and beyond the institution are perhaps a significant contributor to an observation shared after a discussion with other colleagues, for whom “it was just accepted as par for the course that there would always be fairly major problems with the technology”. This acceptance of problems not only contributes to frustration, it also raises questions about the impact on innovation. Amanda posed the question “it makes me wonder. How many people would put [innovation] in the too-hard basket and go back to a tried-and-true method that has worked for them before?”. What impact does this have on the ability to effectively integrate technology, if the knowledge of how to effectively integrate technology is distributed across artifacts that are seen as always having fairly major problems? This leads directly to the next theme, which concerns the rapidly evolving nature of technology and the importance of being able to work with/in and around limited tools.

Protean

One solution to the problematic distribution of TPACK across complex assemblages is the idea of ‘digital renovation’. Rather than accept these problems, digital renovation draws upon the protean nature of digital technologies to adjust or enhance rigid and problematic systems to develop solutions. Digital renovation provides the opportunity to open up new educational possibilities, but only for those with the TPACK to engage in digital renovation. For example, David shared a pedagogical practice during a planning day and generated some significant interest from another teacher educator. However, the digital technologies provided by the institution do not provide sufficient capability to make this practice plausible within the context of a largish course (300+ students). While David was able to write Perl scripts that bridged the gap, this particular renovation was very specifically situated within the context of the renovator and could not be easily adopted by others. So, despite the interest in an effective practice, it could not be adopted more widely.

But not all such solutions suffer the same problem. Throughout the year the shared blog was used to share and discuss tools and shortcuts that had been developed to work around (or within) the limitations of institutional systems, providing timely solutions for challenges that were, at times, plaguing all of us. The blog’s enabling of timely solutions is vital here, ensuring that while teachers may be ‘perpetual novices’ (Mueller et al., 2008) when it comes to rapidly changing technology and tools, these solutions were developed and shared responsively, providing collegial support and breaking down silos that may have existed. The blog, therefore, enabled us to collectively grow our own solutions to issues as they arose.

The inability to undertake appropriate digital renovation also creates gaps at the institutional level. It has often been reported that a major problem for students using the university’s LMS was their inability to find required resources within course sites. The widely accepted reason for this difficulty within the institution is claimed to be the inconsistent and poor layout and design of individual course sites. However, it also points to the apparent inability of the institution to provide an effective search engine – the widely accepted method for finding resources online – that works within the bounds of the institutional LMS. Interestingly, a project is currently underway to streamline course sites to provide a consistent environment for students and apparently solve the discoverability issue. Discussion on the blog raised questions about the impact of this that also relates to previous themes reflecting the importance of context and specialised knowledge – an environment that works in one course may not be a good fit for another.

A final conversation worth noting in relation to the concept that digital technologies are protean was about the complexities of working in a recently updated system and the subsequent requirement of teaching staff adopting new, and as yet untested, processes. The difficulties faced in adopting these practices raised the question of whether the digital fluency of teaching staff was sufficient. Another perspective on these difficulties arose from one participant noting that the systems and processes being used were ‘scarcely fit for purpose’, raising questions about the digital fluency of the institution. While it was clear that those involved had reasonable ideas about what makes for good online learning practice, they did not always seem to have the digital fluency required to translate those ideas into efficient and effective practice.

The blog afforded us an interactive, low-pressure space to explore these ways of improving our practice, to engage in thoughtful and critical discussions, and to share the load of developing these understandings.

Conclusions and a research agenda

Throughout 2014 a group of six teacher educators used a group blog to share the ups and downs of trying to effectively integrate technology into the education of pre-service teachers. Analysis of the 77 posts and 111 comments shared by three of those teacher educators using a distributed view of learning and knowledge was used to extract insights into how these educators developed and shared the TPACK necessary to effectively integrate technology into their teaching. The analysis has revealed that our TPACK was enhanced through the ability to engage in social discussion with colleagues from within the same context. Such situated collaborations helped overcome the limitations of organisational practices and technologies that were not always well suited to our context and aims. Knowledge of how to leverage the protean nature of digital technologies to overcome some of these limitations also helped.

In this light, and assuming that “developing TPACK ought to be a critical goal of teacher education” (Mishra & Koehler, 2006, p. 1046), what do we do next? How can a distributed view of TPACK help teacher educators model ICT integration so that the habitus of teacher candidates is transformed? How can we develop the TPACK required for such modeling? Table 1 provides a list of research questions that make up an initial agenda for future work that maps out some potentially interesting directions.

Table 1: Proposed research questions for future research on TPACK as shared practice
Situated
  • How will a University-wide consistent structure for course sites impact the situated nature of learning?
  • How can institutional learning and teaching support engage with the situated nature of TPACK and its development?
  • How can University-based systems and teaching practices be closer to, or better situated in, the teaching contexts experienced by pre-service educators?
Social
  • How can the development of TPACK by teacher educators be made more social?
  • How can TPACK be shared with other teacher educators and their students?
Distributed
  • Is it possible to measure the digital fluency of a university, rather than focus on its teaching staff?
  • How can technologies specific to teacher education (e.g. lesson and unit plan templates) be enhanced to increase the capability and learning of teacher educators?
Protean
  • Can the outputs of digital renovation practices by individual staff be shared?
  • How can institutions encourage innovation through digital renovation?
  • What are the challenges and benefits involved in encouraging digital renovation?

In framing this research agenda, it is important to keep in mind that the distributed view of knowledge drawn upon in this paper strongly suggests that there are significant limits to what teacher educators can achieve alone. The knowledge required is situated, distributed and social. Thus the success of such a research agenda will depend on how effectively all of the people and artifacts involved in teacher education can be engaged in it.

References

Albion, P. R. (2014). From Creation to Curation: Evolution of an Authentic ‘Assessment for Learning’ Task. In L. Liu, D. Gibson, V. Brown, T. Cavanaugh, J. Lee, C. Maddux, M. Ochoa, M. Ohlson, D. Slykhuis & J. Voogt (Eds.), Research Highlights in Technology and Teacher Education 2014 (pp. 69-78). Waynesville, NC: AACE.

Belland, B. R. (2009). Using the theory of habitus to move beyond the study of barriers to technology integration. Computers & Education, 52(2), 353-364. doi: 10.1016/j.compedu.2008.09.004

Brantley-Dias, L., & Ertmer, P. A. (2013). Goldilocks and TPACK: Is the Construct “Just Right?”. Journal of Research on Technology in Education, 46(2), 103-128.

Di Blas, N., Paolini, P., Sawaya, S., & Mishra, P. (2014). Distributed TPACK: Going Beyond Knowledge in the Head. In M. Searson & M. N. Ochoa (Eds.), Society for Information Technology & Teacher Education International Conference 2014 (pp. 2464-2472). Jacksonville, Florida, United States: AACE.

Doyle, H., & Reading, C. (2012). Building teacher educator TPACK: Developing leaders as a catalyst for change in ICT Education. In M. Brown, M. Hartnett & T. Steward (Eds.), Future challenges, sustainable futures. (pp. 272-282). Wellington, NZ: ascilite.

Dron, J., & Anderson, T. (2014). Teaching crowds: Learning and social media. Athabasca University Press.

Kay, A. (1984). Computer Software. Scientific American, (September), 53-59.

Lankshear, C., Snyder, I., & Green, B. (2000). Teachers and Technoliteracy: Managing literacy, technology and learning in schools. Sydney: Allen and Unwin.

LeBaron, J. F., & Bragg, C. A. (1994). Practicing what we preach: Creating distance education models to prepare teachers for the twenty-first century. The American Journal of Distance Education, 8(1), 5-19.

Mishra, P., & Koehler, M. J. (2006). Technological Pedagogical Content Knowledge: A Framework for Teacher Knowledge. Teachers College Record, 108, 1017-1054.

Phillips, M. D. (2014). Teachers’ TPACK enactment in a Community of Practice. Unpublished PhD, Monash University, Melbourne. Retrieved from http://newmediaresearch.educ.monash.edu.au/lnmrg/sites/default/files/Teachers%27%20TPACK%20enactment%20in%20a%20Community%20of%20Practice.pdf

Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 4-15.

Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57, 1-22.

USQ. (2012). USQ Strategic Plan 2013-2015. Toowoomba, Qld. Retrieved from https://www.usq.edu.au/about-usq/about-us/plans-reports/strategic-plan
