Does institutional e-learning have a TPACK problem?

The following is the first attempt to expand upon an idea that’s been bubbling along for the last few weeks. It arises from a combination of recent experiences, including

  • Working through the institutional processes to get BIM installed on the institutional Moodle.
  • Using BIM in my own teaching and the resulting changes (and maybe something along these lines) that will be made.
  • Talking about TPACK to students in the ICTs and Pedagogy course.
  • On-going observations of what passes for institutional e-learning within some Australian Universities (and which is likely fairly common across the sector).

Note: the focus here is on the practice of e-learning within Universities and the institutionally provided systems and processes.

The problem(s)

A couple of problems that spark this thinking

  1. How people and institutions identify the tools available/required.
  2. How the tools provide appropriate support, especially pedagogical, to the people using them.

Which tools?

One of the questions I was asked to address in my presentation requesting that BIM be installed on the institutional LMS was something along the lines of “Why would other people want to use this tool? We can’t install a tool just for one person.”

Well, one answer was that a quick Google search of the institution’s course specifications revealed 30+ courses in 2012 using reflective journals of varying types. BIM is a tool designed primarily to support the use of reflective learning journals by students via individual blogs.

I was quite surprised to find 30+ courses already doing this. This generated some questions

  • How are they managing the workload and the limitations of traditional approaches?
    The origins of BIM go back to when I took over a course that was using a reflective journal assessment task, implemented by students keeping journals as Word documents and submitting them at the end of semester. There were problems.
  • I wonder how many of the IT and central L&T people knew that there were 30+ courses already using this approach?
    In this context, it would be quite easy to draw the conclusion that the IT and central L&T folk are there to help people with the existing tools and to keep their own workload to a minimum by controlling what new tools are added to the mix, rather than looking for opportunities for innovation within the institution. Which leads to..
  • I wonder why the institution wasn’t already actively looking for tools to help these folk?
    Especially given that reflective learning journals (diaries etc) are “recognised as a significant tool in promoting active learning” (Thorpe, 2004, p. 327), but at the same time they are also “demanding and time-consuming for both students and educators” (Thorpe, 2004, p. 339).

A combination of those questions/factors seems to contribute to recent findings about the workloads faced by academics in terms of e-learning (Tynan et al., 2012)

have increased both the number and type of teaching tasks undertaken by staff, with a consequent increase in their work hours

and (Bright, 2012, n.p)

Lecturers who move into the online learning environment often discover that the workload involved not only changes, but can be overwhelming as they cope with using digital technologies. Questions arise, given the dissatisfaction of lecturers with lowering morale and increasing workload, whether future expansion of this teaching component in tertiary institutions is sustainable.

How do the tools provide support?

One of the problems I’m facing with BIM is that the pedagogical approach I originally used, and which drove the design of BIM, is not the pedagogical approach I’m using now. The features and functions currently in BIM don’t match what I want to do pedagogically. I’m lucky, I can change the system. But not many folk are in this boat.

And this isn’t the first time we’ve faced this problem. Reaburn et al (2009) used BIM’s predecessor in a “work integrated learning” course where the students were working in a professional context. They got by, but this pedagogical approach yet again had different requirements.

TPACK

“Technological Pedagogical Content Knowledge (TPACK) is a framework that identifies the knowledge teachers need to teach effectively with technology” (Koehler, n.d.). i.e. it identifies a range of different types of knowledge that are useful, perhaps required, for the effective use of technology in teaching and learning. While it has its detractors, I believe that TPACK can provide a useful lens for examining the problems with institutional e-learning and perhaps identify some suggestions for how institutional e-learning (and e-learning tools) can be better designed.

To start, TPACK proposes that successful e-learning (I’m going to use that as short-hand for the use of technology in learning and teaching) requires the following types of knowledge (with my very brief descriptions)

  • Technological knowledge (TK) – how to use technologies.
  • Pedagogical knowledge (PK) – how to teach.
  • Content knowledge (CK) – knowledge of what the students are meant to be learning.

Within institutional e-learning you can see this separation in organisational structures and also the assumptions of some of the folk involved. i.e.

  • Technological knowledge – is housed in the institutional IT division.
  • Pedagogical knowledge – is housed in the central L&T division.
  • Content knowledge – academics and faculties are the silos of content knowledge.

Obviously there is overlap. Most academics have some form of TK, PK and CK. But when it comes to the source of expertise around TK, it’s the IT division, and so on.

TPACK proposes that there are combinations of these three types of knowledge that offer important insights

  • Pedagogical Content Knowledge (PCK) – the idea that certain types of content are best taught using certain types of pedagogy.
  • Technological Pedagogical Knowledge (TPK) – the knowledge that certain types of technologies work well with certain types of pedagogy (e.g. teaching critical analysis using a calculator probably isn’t a good combination)
  • Technological Content Knowledge (TCK) – that content areas draw on technologies in unique ways (e.g. mathematicians use certain types of technologies that aren’t used by historians)

Lastly, TPACK suggests that there is a type of knowledge in which all of the above are combined, and when this is used effectively the best examples of e-learning arise. i.e. TPACK – Technological, Pedagogical and Content Knowledge.

The problem I see is that institutional e-learning, its tools, its processes and its organisational structures are getting in the way of allowing the generation and application of effective TPACK.

Some Implications

Running out of time, so some quick implications that I take from the above and want to explore some more. These are going to be framed mostly around my work with BIM, but there are potentially some implications for broader institutional e-learning systems which I’ll briefly touch on.

BIM’s evolution is best when I’m teaching with it

Assuming that I have the time, the best insights for the future development of BIM have arisen when I’m using BIM in my teaching, when I’m able to apply the TPACK that I have to identify ways the tool can help me. When I’m not using BIM in my teaching I don’t have the same experience.

At this very moment, however, I’m only really able to apply this TPACK because I’m running BIM on my laptop (and using a bit of data munging to bridge the gap between it and the institutional systems). This means I am able to modify BIM in response to a need, test it out and use it almost immediately. When/if I begin using BIM on the institutional version of Moodle, I won’t have this ability. At best, I might hope for the opportunity for a new version of BIM to be installed at the end of the semester.

There are reasons why institutional systems have these constraints. The problem is that these constraints get in the way of generating and applying TPACK and thus limit the quality of the institutional e-learning.

I also wonder if there’s a connection here with the adoption of Web 2.0 and other non-institutional tools by academics. i.e. do they find it easier to generate and apply TPACK to these external tools because they don’t have the same problems and constraints as the institutional e-learning tools?

BIM and multiple pedagogies

Arising from the above point is the recognition that BIM needs to be able to support multiple pedagogical approaches. i.e. the PK around reflective learning journals reveals many different pedagogical approaches. If BIM as an e-learning tool is going to effectively support these pedagogies then new forms of TPK need to be produced. i.e. BIM itself needs to know about and support the different reflective journal pedagogies.

There’s a lot of talk about how various systems are designed to support a particular pedagogical approach. However, I wonder just how many of these systems actually provide real TPK assistance? For example, the design of Moodle “is guided by a ‘social constructionist pedagogy'” but it’s pretty easy to see examples of how it’s not used that way when course sites are designed.

There are a range of reasons for this, not the least of which is that teachers and academics creating course sites are often focused on more pragmatic tasks. But part of the problem is also, I propose, the level of TPK provided by Moodle, i.e. the level of technological support it provides for people to recognise, understand and apply that pedagogical approach.

There’s a two-edged sword here. Providing more TPK may help people adopt this approach, but it can also close off opportunities for different approaches. Scaffolding can quickly become a cage. Too much focus on a particular approach also closes off opportunities for adoption.

But on the other hand, the limited amount of specific TPK provided by the e-learning tools is, I propose, a major contributing factor to the workload issues around institutional e-learning. The tools aren’t providing enough direct support for what teachers want to achieve. So the people have to bridge the gap. They have to do more work.

BIM and distributed cognition – generating TPACK

One of the concerns raised in the committee that had to approve the adoption of BIM was about the level of support. How is the institution going to support academics who want to use BIM? The assumption being that we can’t provide the tool without some level of support and training.

This is a valid concern. But I believe there are two assumptions underpinning it which I’d like to question, and for which I’d like to explore alternatives. The assumptions are

  1. You can’t learn how to use the tool, simply by using the tool.
    If you buy a good computer/console game, you don’t need to read the instructions. Stick it in and play. The games are designed to scaffold your entry into the game. I haven’t yet met an institutional e-learning tool that can claim the same. Some of this arises, I believe, from the limited amount of TPK most tools provide. But it’s also how the tool is designed. How can BIM be designed to support this?
  2. The introduction of anything new has to be accompanied by professional development and other forms of formal support.
    This arises from the previous point but it is also connected to a previous post titled “Professional development is created, not provided”. In part, this is because the IT folk and the central L&T folk see their job as providing professional development sessions (and some have their effectiveness measured by the number of sessions they provide or helpdesk calls they process).

It’s difficult to generate TPACK

I believe that the current practices, processes and tools used by institutional e-learning systems make it difficult for the individuals and organisations involved to develop TPACK. Consequently the quality of institutional e-learning suffers. This contributes to the limited adoption of features beyond content distribution and forums, and is part of the reason behind the perceptions of increasing workload around e-learning.

If this is the case, then can it be addressed? How?

References

Bright, S. (2012). eLearning lecturer workload: working smarter or working harder? In M. Brown, M. Hartnett, & T. Stewart (Eds.), ASCILITE’2012. Wellington, NZ.

Reaburn, P., Muldoon, N., & Bookallil, C. (2009). Blended spaces, work based learning and constructive alignment: Impacts on student engagement. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 820–831). Auckland, NZ.

Thorpe, K. (2004). Reflective learning journals: From concept to practice. Reflective Practice: International and Multidisciplinary Perspectives, 5(3), 327–343.

Tynan, B., Ryan, Y., Hinton, L., & Mills, L. (2012). Out of hours: Final report of the project e-Teaching leadership: planning and implementing a benefits-oriented costs model for technology-enhanced learning. Strawberry Hills, Australia.

Professional development is created, not provided

Over recent weeks I’ve been so busy that I’ve largely ignored Twitter. To my detriment. A quick return to it this afternoon found me following two links via tweets from @palbion. The two links were

  1. How effective is the professional development undertaken by teachers?, and
  2. Removing the lids of learning.

The first is a blog post outlining the many limitations of professional development as practiced in schools and many other locations (e.g. the L&T PD at Universities) and suggesting how it can be fixed to become both “useful and cost effective”. I agree that much of Professional Development is essentially worthless, but at least two aspects of the post troubled me greatly.

The assumption that impact on student learning outcomes is the only true measure of the value of Professional Development worries me significantly. It’s simplistic in that it reduces the complexity of schools, teaching and teachers to a single measure. The practice of such abstraction is always going to lose something. But worse, when a measure becomes a target it ceases to be a good measure. i.e. Goodhart’s law

But what really bugged me was that the solution to the woes of Professional Development was better Professional Development. I disagree. I think you have to get rid of Professional Development and replace it with learning. i.e. the teachers (and academics) essentially have to continue learning. Here’s my provocative proposal

Professional development is mostly a solution provided by management due to flaws in the system that management preside over.

i.e. the education (or university) system – in its broadest understanding – is set up to make it difficult for the members of that system to learn and more importantly make changes based on what they learn.

The post actually makes the point itself when it says

Fortunately there have been a raft of reports (e.g. from EPPI and from Ofsted, among many others) that tell us exactly what to look for, and the good news is that great teacher learning is a remarkably similar beast to the great pupil learning.

Slide 19 of the Removing the lids of learning presentation by Dean Shareski contains the following quote from Stephen Downes

We need to move beyond the idea that an education is something that is provided for us, and toward the idea that an education is something that we create for ourselves.

I suggest that you can replace “education” with “professional development” and as a result you identify the solution to the problem of Professional Development.

BIM: Why and what?

Have to give a 10 minute spiel to the USQ L&T Systems Advisory group soon to support a request to get BIM installed in the institutional Moodle instance. The slide set is below.

One point to address was whether others would use it. A quick Google search of online course specifications revealed that 32 of the USQ courses offered in 2012 used some form of reflective/learning journal/diary. Most connected to assessment. This makes me wonder if they are still using the old “Word document” journal approach or similar, what their experiences are like and what strategies they are using to deal with the workload and problems with this sort of approach.

I also wonder what it says about the institutional L&T and IT folk if they aren’t aware that 32 courses are using this approach. Plus, I wonder how many more people might use the approach if they were aware of it and the broader implications for how new technologies get added to the institutional systems.

Visualising the blog network of #edc3100 students

The following describes the process and results of using Gephi to generate some visualisations of the inter-connections between the blogs of students in the course I’m teaching. The process is heavily informed by the work of Tony Hirst.

The result

The following represents the student blogs that have connected with each other. Size of the node is weighted towards the number of connections coming in. You can see a couple in the bottom right hand corner who have linked to themselves. The figure also suggests that there are 6 or 7 communities within these.

Network

There are actually 300+ blogs in the data set. However, a significant number of those are not yet connected to another blog. Hence the small number in the above image. Two possible explanations for this

  1. Many of the students haven’t yet taken seriously the need to connect with each other.
  2. There’s a bug in the code producing the file.

Will need to follow up on this. Will also need to spend a bit more time exploring what Gephi is able to do. Not to mention exploring why 0.8.2 of Gephi wouldn’t run for me.

The process

The process essentially seems to be

  1. Generate a data file summarising the network you want to visualise.
  2. Manipulate that data file in Gephi.

The rest contains a bit of a working diary of implementing the above two steps.

Generating the file

The format of the “GDF” file used in Tony’s post appears to be

  • A text file.
  • Two main sections
    1. Define some user/node information.

      The format is shown below. The key seems to be the “name”, which is a unique identifier used in the next section.

      [code lang="bash"]
      nodedef> name VARCHAR,label VARCHAR, totFriends INT,totFollowers INT, location VARCHAR, description VARCHAR
      67332054,jimhillwrites,105,282,"Herne Hill, London","WIRED UK Product Editor."
      [/code]

    2. Define the connections

      Essentially a long list of id pairs representing a user and their friends. I’m assuming this means the user connects to the friend.

      [code lang="bash"]
      edgedef> user VARCHAR,friend VARCHAR
      67332054,137703483
      [/code]

More on the GDF format available here. It mentions a minimal GDF file and also mentions that the edge thickness can be specified. This seems useful for this experiment i.e. edge thickness == number of links from one student blog to another.

So the file format I’ll use will come straight from the minimal spec, i.e.
[code lang="bash"]
nodedef>name VARCHAR,label VARCHAR
s1,Site number 1
s2,Site number 2
s3,Site number 3
edgedef>node1 VARCHAR,node2 VARCHAR, weight DOUBLE
s1,s2,1.2341
s2,s3,0.453
s3,s2, 2.34
s3,s1, 0.871
[/code]

Thinking I’ll use the “hostname” for the student’s blog as the “site number”. Maybe just the first four letters of it. Just to keep student anonymity.

Questions for later

  1. Can I modify the file format to include with each “friend” connection a date?

    The idea is that the date will represent when the post was made. Using this I might be able to generate a visualisation of the connections over time.

  2. Is there value in also mapping the connections to the other links within the students’ posts?

    Could provide some idea of what they are linking to and help identify any interesting clusters.
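On the first question, the GDF spec appears to allow extra columns in the edgedef, so a date could be carried on each edge. A sketch of what that might look like (untested; whether Gephi’s timeline features will actually use a plain VARCHAR date from a GDF file is something to check – the GEXF format may be the safer route for dynamic graphs):

[code lang="bash"]
edgedef>node1 VARCHAR,node2 VARCHAR,weight DOUBLE,date VARCHAR
s1,s2,1.0,2013-03-18
s2,s3,2.0,2013-03-21
[/code]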

The data

The database I’m maintaining contains

  • All the URLs for the students’ blogs.
  • All the posts to the students’ blogs.

I already have a script that is extracting links from each of the student blogs. I just need to modify this to count the number of connections between student blogs... a bit easier than I thought it might be.

Now a quick script to generate the GDF file. Done.
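For what it’s worth, the quick script boils down to something like the following sketch. Python is used here purely for illustration; the `link_counts` structure, the names and the four-letter anonymised labels are my assumptions, not the actual script (which pulls from the database of harvested posts).

```python
from urllib.parse import urlparse
import io

# Hypothetical input: number of links from one student blog to another,
# keyed by (source URL, target URL). In practice this comes from the
# database of harvested posts.
link_counts = {
    ("http://alicelearns.wordpress.com", "http://bobonict.wordpress.com"): 3,
    ("http://bobonict.wordpress.com", "http://alicelearns.wordpress.com"): 1,
}

def site_id(url):
    # First four letters of the hostname, to keep student anonymity.
    return urlparse(url).hostname[:4]

def write_gdf(counts, out):
    # Minimal GDF: a nodedef section, then a weighted edgedef section.
    nodes = sorted({site_id(u) for pair in counts for u in pair})
    out.write("nodedef>name VARCHAR,label VARCHAR\n")
    for n in nodes:
        out.write(f"{n},Site {n}\n")
    out.write("edgedef>node1 VARCHAR,node2 VARCHAR,weight DOUBLE\n")
    for (src, dst), n in counts.items():
        out.write(f"{site_id(src)},{site_id(dst)},{float(n)}\n")

buf = io.StringIO()
write_gdf(link_counts, buf)
print(buf.getvalue())
```

One thing to watch: two blogs whose hostnames share the same first four letters would collapse into a single node, so the real script would need to de-duplicate the labels.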

Using Gephi

This is where I’m hoping Tony Hirst’s instructions work with a minimum of hassle.

That’s a bugger. Empty menus on Gephi. It’s not working. Is it wrong of me to suspect Java issues?

Going back to an older version. That’s worked, but I haven’t yet installed it into Applications. 0.8.2 seemed to have worked previously as well. Get this done and figure it out later.

File opened. We have a network.

001 - First Graph

Tony Hirst then removes the unconnected nodes. I’d like to leave them in as it will illustrate the point that the students need to connect with others.

The Modularity algorithm doesn’t seem to be working as I’d expect (on my very quick read). It’s finding 200+ communities. Perhaps that is to be expected given that most blogs are connected by one or two links and that I haven’t removed the unconnected nodes. Yes, it works much better if you do that.
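As a sanity check outside Gephi, the same effect can be reproduced with networkx. This is an assumption on my part – Gephi’s Modularity statistic is a Louvain implementation while networkx’s greedy routine is a different algorithm, so the exact communities will differ – but the isolated-node behaviour is the same: every unconnected blog becomes its own “community”.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy version of the blog network: a few genuinely linked blogs plus
# many blogs that haven't linked to anyone yet (the isolated nodes).
G = nx.Graph()
G.add_edges_from([("s1", "s2"), ("s2", "s3"), ("s4", "s5")])
G.add_nodes_from(f"iso{i}" for i in range(20))

# Each isolated node ends up as its own community, inflating the count.
print(len(greedy_modularity_communities(G)))

# Removing the unconnected nodes first, as Tony Hirst does, fixes it.
G.remove_nodes_from(list(nx.isolates(G)))
print(len(greedy_modularity_communities(G)))
```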

A bit more playing produces the final result above.

Many of our students are neither digital natives nor digitally literate

Yesterday I attended a session with three different presentations focused around “student voices and their current use of technologies at USQ”. There was some very interesting information presented. However, I have a few reservations with aspects of the research and especially with some of the conclusions that have been drawn. I’m hoping to reflect upon and post more about this when I have some time. But an experience just now reinforced one of my reservations and is worth a short sidetrack.

A largish survey of students found that all of the students had some form of access to the Internet. Given that it was an online survey, this is perhaps not surprising. But that’s not the problem I particularly want to address here.

The big reservation I have is that one of the conclusions drawn from this work was that our students are “Digitally literate and agile”. My experience suggests that this is not the case.

My experience

The course I’m currently teaching has 300+ students, mostly 3rd year Bachelor of Education students, spread throughout Australia and the world. 200+ of these students are online students. i.e. they generally don’t set foot on a campus. The course itself is titled “ICTs and Pedagogy” and as you might expect we’re pushing some boundaries with the use of ICTs. Attempting to model what we espouse. Some examples include

  • Students are required to set up their own blog and use it as a reflective journal.
  • They are required/asked to sign up for Diigo and join a course Diigo group.
  • We’re using Diigo’s annotation facility to mark up online readings and the assignment pages.
  • They are encouraged (but not required) to join Twitter.
  • We use Google docs for shared content creation.
  • The course Moodle site is heavily used including the discussion forums leading to a lot of email traffic (this is picked up below).
  • We use a range of ad hoc activities to demonstrate different ICTs.
    e.g. a version of the “weather Flickr” activity of @courosa

Many of the students, especially the online students, have been studying online for 3+ years. Some of the earlier courses these students have completed encourage them to engage with different ICTs e.g. developing a webquest or creating digital stories.

So, obviously these students are “digitally literate and agile”?

Some gaps in digital literacy

Here’s a quick list of some of the questions/problems students have asked/had in the first three weeks of semester

  • Not knowing what their university provided email address is.
  • Not knowing to look in the junk folder for a confirmation email.
  • Not knowing how to add a link using one of the WYSIWYG editors provided on web-based services such as Moodle, WordPress etc.
  • Not knowing that you have to “right/ctrl” click on a link to download a file rather than display it in the web browser.
  • Not knowing about email filters.

The last point is a big one. It’s fairly common for students taking four online courses to get a lot of email from the discussion forums for those courses. This flood of email messages takes over the Inbox and leads to confusion and missed information. Many of these students have been experiencing this for 3+ years. Yet, almost none of them knew about email filters.

Perhaps one of the most successful learning activities in the course was a Google doc that was created for the students to list the problems they were having with the course and any suggestions they might have for tools or practices that might help solve those problems. The following image is a screen shot of a section of that document about email filters. Click on it to see it bigger.

email filters

This particular activity was combined with reading about Toolbelt theory and encouraging the students to start building their toolbelt. To have them start taking control of their problems and identifying how they can solve them.

Not digital natives

Perhaps the major mistake (one of many) I made in the design of the first few weeks of this course was that I assumed that the students were far more digitally literate than they appear to be. Not only that, as someone who is fairly “digitally literate”, I assumed that they would have the experience/knowledge to be able to implement the “Tech support cheat sheet” without much prompting.

Lessons

When we are designing our learning experiences we (i.e. I) cannot assume that the students are “digitally literate and agile”. I need to give more thought to scaffolding these experiences and perhaps explore ways to better help them develop what @irasocol describes as an essential survival skill

knowing how to pick the right tool for the job and moment, how to use that tool well, and how to find new tools

I also think there’s a lesson here about research methodologies. Methodologies – e.g. surveys and focus groups – that capture insights from people divorced from the actual activity (e.g. the “current use of technologies at USQ”) are going to overlook important insights. That limitation has to be kept in mind when drawing conclusions and recommendations for action.

The absence of a search function – my current big problem with a Moodle installation

Consider this a plea for suggestions. In particular, consider it a plea for workarounds that I can implement quickly (and painlessly).

The problem

I have a Moodle course site. It has a range of activities, many with a page or two of text that sets the context and explains the task. The image below shows what the activities for one week look like.

Week 1 learning path

Now this works fine if a student works sequentially through the activities. It tracks what they’ve completed etc.

It fails miserably when they want to revisit the page about “X”. They have to remember in which week “X” was talked about, and under which activity “X” was addressed.

I have problems doing this and I wrote the stuff.

The “web way” solution

If this was any other website, we’d follow the advice of Jakob Nielsen

Search is one of the most important user interface elements in any large website. As a rule of thumb, sites with more than about 200 pages should offer search.

The “web way” solution would be to have a search engine. But the Moodle installation of the University I teach the course for doesn’t appear to provide this functionality. I believe the only way this can occur is to allow Google to have access to all courses on the site. While there may be reasons for this, it’s not a solution I’m pushing just to solve my problem.

How can I provide my students with a search function? How can I make my course site “of the web” and not “on the web”?

I have heard it suggested that repositories are the answer, i.e. Moodle is not a content hosting platform and doesn’t try to be; if you want searchable content, place it in a repository. The trouble is we’re not talking here about large documents, just a lot of small pages that are closely wrapped around specific learning activities in Moodle. I’m yet to see a repository integration that works as seamlessly as I’d expect.

My interim solution

In the absence of any brilliant ideas, it appears that the only way to do this is to create a duplicate website that is actually “of the web”. i.e. one that is indexed by Google. I’m thinking probably a blog with pages set up to match the weeks and other components.

Some have suggested providing the pages as a PDF document (or three). The problem with this is that there is web content (videos, animations etc) embedded throughout. Producing a print document would allow folk to search, but then they wouldn’t have access to the web content (unless they clicked on a link etc).

Producing a second website is by no means a perfect solution, some of its limitations include

  • Extra workload for me.
  • Large potential to create confusion amongst the students
    e.g. which website do I visit? Which website has the correct content? Do I need to check both websites?
  • Loss of some Moodle functionality.
    The course currently uses the Moodle activity completion functionality to allow students to track their completion, but also as part of the assessment. If students start working through the blog version of the website it will lead to “But I already did that activity!” problems.

Surely there has to be a better solution?

How much of a cage should I build?

Just how much of a cage should I make my course into? How far should I take the constraints? The following sets the scene and asks the questions. Would love to hear alternate views.

Cat in a Cage, Valparaiso by geezaweezer, on Flickr (Creative Commons Attribution 2.0 Generic License)

The course

The course I teach has 300+ students spread throughout much of Australia, parts of Asia and perhaps other parts of the world. Studying both on-campus and online. It has folk who will be teaching everything from Early Childhood through to TAFE/VET, and everything in-between.

The course website is a Moodle site. Each week the students have a list (perhaps too long a list) of activities to complete (see the image). To help them keep track of what they have and haven’t done the Moodle activity completion functionality was used. That’s what produces the nice ticked boxes indicating activity completion.

Week 1 learning path

Building on this, part of the assessment of the course is tied to how many of these activities they complete. Activity completion is actually linked to keeping a learning journal and contributes 15% of the total course mark (5% for each of the 3 assignments).

Task corruption

Given the pragmatic nature of students today, and especially given the perceived large amount of work in the first week, it was not surprising that a bit of “task corruption” is creeping in. I assumed that some students would figure out that for the activities that are “web pages” with exercises, simply visiting the page would tick the box.

But there are other activities that require students to post to a discussion forum with some form of artefact. For example, a description of a resource found on Scootle and a description of how it links to their curriculum. These are posts that I see and try to keep a track of. So a bit of a deterrent?

Turns out perhaps not. I’m starting to see the odd post that, either purposely or not, doesn’t address the task but is sufficient to be recorded as an activity completion.

The question is whether or not I should be actively policing this?

The trade-off

I don’t want to set myself the task of being a policeman. But perhaps I need to implement some penalties here, some options might include:

  • A gentle warning, at least initially.
  • Warn the student and delete their activity completion for the given activity (i.e. do it again).
  • Deduct marks for task corruption.

    Of course, there will always be the “but I misunderstood the task sir” excuse.


One purely pragmatic reason against doing this is that it would take a lot of work to police. For another, I’ve already expressed some reservations about what it means to impose new learning strategies on a group of learners, and that’s certainly something the course is currently doing.

We’re also talking about 3rd year University students; shouldn’t they live with their choices? If they don’t engage in these activities I do believe they will learn less and perform worse on the other assessment.

Then there’s the question of the students who are engaging with the activities and may potentially be receiving the same marks as those who have engaged in task corruption. I’m sure both sets of students would have views about that.

Perhaps I should just mention this to the students to discourage (though not prevent) this practice?

Thoughts? Suggestions?

How are they going?

The lack of interaction/feedback between student and teacher in large, contemporary, Australian university courses has always frustrated me. With 350+ students currently enrolled in the course I’m teaching, I’m keen to address this problem. Enter the weekly “course barometer”, a simple practice I’m hoping we can keep up for the current semester. The following is a quick summary of the results for the first week and a description of how it works.

How it works

In summary,

  • I ask the students to complete a Google form at the end of their learning for a week.
  • Their responses get put into a Google spreadsheet.
  • The responses are examined, analysed and inform what we’re doing in the course over coming weeks.
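The tallying side of this analysis can be sketched in a few lines of Python, assuming the responses have been downloaded from the Google spreadsheet as a CSV file (the column name and function are illustrative, not part of any Google API):

```python
import csv
from collections import Counter

def tally_feelings(csv_path, column="How do you feel about EDC3100 at the moment?"):
    """Count how often each feeling word was selected across all responses."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Google Forms joins multiple-choice selections with ", "
            for word in row.get(column, "").split(", "):
                if word:
                    counts[word.strip().lower()] += 1
    return counts
```

Something like `tally_feelings("week1.csv").most_common(10)` would then give the ten most frequently selected feelings for the week.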

The questions/tasks in the Google form are (all except #4 are free response)

  1. Write down the two most important things you have learnt in EDC3100 this week.
  2. What would you most like more help with?
  3. How do you feel about EDC3100 at the moment? (Select all the words that apply to you)
  4. What is the biggest worry affecting your work in EDC3100 at the moment?
  5. How could we improve EDC3100?

A process similar to this has been widely used. This particular set of questions arises from the following

The IMPACT procedure (Clarke, 1987 cited in Goos et al., 2007, p. 411) is one method for discovering the concerns and opinions of students. It involves the regular completion of the following simple questionnaire during class (for this unit during the Friday “Reality and Reflection” lessons) and the retention of responses over the period of the class. Goos et al (2007, p. 411) suggest that the success of this process “depends on respecting the confidentiality of student responses and acting on these responses where appropriate to improve students’ experiences of learning mathematics.”

taken from here

Due to the point about “confidentiality” and the novelty of this approach, I’ve decided (for now) not to open up access to the Google spreadsheet with the data to anyone except the teaching staff in the course.

First week’s responses

The following images (click on them to see a bigger version) are word clouds generated by sending the raw responses for each question through Tagxedo. I still need to look more closely at the feedback, but some initial thoughts.
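The frequency counting that sits underneath a tool like Tagxedo is simple enough to sketch. A minimal Python version, for the free-text questions (the stop-word list is illustrative only):

```python
import re
from collections import Counter

# Illustrative stop-word list; a real one would be much longer.
STOP_WORDS = {"the", "a", "an", "and", "to", "of", "in", "i", "is", "it", "for"}

def word_frequencies(responses):
    """Return the word counts a cloud generator would size its words from."""
    words = []
    for text in responses:
        words.extend(w for w in re.findall(r"[a-z']+", text.lower())
                     if w not in STOP_WORDS)
    return Counter(words)

# e.g. word_frequencies(["More time for the assignment", "time management"])
# gives "time" the highest count
```

As noted below, this sort of count is most meaningful for the fixed-choice question; free-text responses really need a closer reading.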

How are you feeling?

How are you feeling?

I was happy and a little surprised to see some of the more positive feelings be visible. I had worried it would be all negative. Week 1 was very challenging and time consuming.

Since students are given a fixed set of words and are able to add a few words of their own, this is perhaps the best question to analyse using a word cloud. A word cloud is not so useful for the free text questions.

Two most important things

Week 1 - Most important learning

Help

Week 1 - What help do you need?

I need to look at these responses in more detail. It’s interesting at some level that “assignment” is not the biggest. Arguably having “learning” and “understanding” be more of a focus is potentially a good thing. But closer examination is needed.

There is also the “how to use the tool” presence (blog, Twitter and Diigo).

Biggest worry

Week 1 - biggest worry?

Time and workload have been the big worry, at least via other communication mechanisms, and that appears to have come through in this.

Improve

Week 1 - How can we improve?

Time and workload would appear to be a major area for improvement. Future weeks will see this improve and we’ll need to revisit the design of the course a bit. As it stands, week 1 is probably too much of an ask.

However, I’m going to be interested to see how this evolves over coming weeks. Much of the work in week 1 was setting up new tools and developing some foundational insights that should really help in subsequent weeks.

BIM2 and disable_form_change_checker

As a developer, you have to love it when someone using your code diagnoses and identifies their own problem with your code. Especially if they give you a clear and concise explanation you can use. That’s what happened with the BIM2 problem I blogged about recently. It appears I was using Moodle 2.3.4, the problem was found on Moodle 2.3.2+, and there was a change in the Moodle code in between. The following describes the bug and hopefully the fix/change I’ve made to the BIM code.

The problem

The problem arises in bim/marker/allocation_form.php with this
[source code="php"]
// turn off the checking
$mform->disable_form_change_checker();
[/source]

disable_form_change_checker is described here and it was added in Moodle 2.3.3.

So the question is how to handle this neatly so that BIM gracefully degrades with older versions of Moodle?

The solution?

One approach is to simply require the more recent version of Moodle, but given this is one function call in one section of the code, there has to be a more fine-grained solution, doesn’t there?

Perhaps just removing the call? But I remember it getting quite annoying without it. So, for now, it stays.

Of course, method_exists. I need to code more.

[source code="php"]
// Only call disable_form_change_checker if this version of
// Moodle (2.3.3 and later) actually provides it.
if ( method_exists( $mform, "disable_form_change_checker" ) ) {
    $mform->disable_form_change_checker();
}
[/source]

No problem with 2.4. What about older versions?

Yes, Moodle 2.2 crashes with this problem in BIM. And method_exists fixes it. Time to commit the code and we’re done.