Exploring Moodle Book usage – Part 7 – When are they used?

The last post in this series looked briefly at the contents of Moodle Book resources. This post is going to look at when the book resources are used, including:

  • What time of day are the books used?
  • When in the semester are they used?

At the end I spend a bit of time exploring the usage of the Book resources in the course I teach.

What time of day are they used?

This is a fairly simple, perhaps useless, exploration of when during the day the books are used. It is more out of general interest and lays the groundwork for the code used in the next question.

Given the huge disparity in the number of views versus prints versus updates, there will be separate graphs for each, meaning 3 graphs per year. For my own interest and for the sake of comparison, I’ve included a fourth graph which is the same analysis for the big 2015 offering of the course I teach. This is the course that perhaps makes the largest use of the Book and also the offering in which I did lots of updates.

The graphs below show the number of events that occurred in each hour of the day: midnight to 1am, 1am to 2am, and so on. Click on the graphs to see expanded versions.
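Counting events into hour-of-day buckets is straightforward once the event timestamps are extracted. A minimal sketch (assuming a list of Unix-epoch timestamps, such as those in the timecreated column of Moodle’s standard log table; the function name is my own):

```python
from collections import Counter
from datetime import datetime, timezone

def events_per_hour(timestamps):
    """Count events in each hour-of-day bucket.

    Index 0 is midnight to 1am, index 23 is 11pm to midnight.
    timestamps: iterable of Unix epoch seconds (assumed UTC here;
    a real analysis would convert to the institution's timezone).
    """
    counts = Counter(
        datetime.fromtimestamp(ts, tz=timezone.utc).hour for ts in timestamps
    )
    # Return a full 24-entry list so quiet hours show as 0, not as gaps
    return [counts.get(hour, 0) for hour in range(24)]

# Example: two events between midnight and 1am, one between 11am and midday
sample = [1420072200, 1420073100, 1420110600]
hourly = events_per_hour(sample)
```

The resulting 24-element list can be fed straight to whichever charting tool produces the per-hour graphs.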

There is no graph for prints per hour for 2012 as there were none in the database. This appears likely to be a bug that needs to be addressed.

Overall findings from time of day

Growth – The maximum number of events has grown each year (as expected given earlier indications of growth).

  • max views per hour: from just under 35,000 in 2012 to over 150,000 in 2015
  • max prints per hour: from just over 400 in 2013 to over 1,500 in 2015
  • max updates per hour: from just over 500 in 2012 to over 6,000 in 2015.

Similarity – The overall shapes of the graphs stay the same, suggesting a consistent pattern in interaction.

This is especially the case for the viewing events. They start with a low number from midnight to 1am and continue to drop until 5am, after which they grow until reaching the maximum per hour between 11am and midday. There is then a general drop away until 7pm to 8pm, when usage grows again before dropping away after 9pm.

Views per hour each year

2012
2012 views per hour

2013
2013 views per hour

2014
2014 views per hour

2015

2015 views per hour

EDC3100 2015 S1

EDC3100 2015 1 views per hour

Prints per hour each year

2012

2012 prints per hour

2013

2013 prints per hour

2014

2014 prints per hour

2015

2015 prints per hour

EDC3100 2015 S1

EDC3100 2015 1 prints per hour

Updates per hour each year

2012

2012 updates per hour

2013

2013 updates per hour

2014

2014 updates per hour

2015

2015 updates per hour

EDC3100 2015 S1

EDC3100 2015 1 updates per hour

Calendar Heatmaps

A calendar heatmap is a fairly common method of representing “how much of something” is happening each day of the year. The following aims to generate calendar heatmaps using the same data shown in the above graphs. The plan is to use the method/code outlined on this page.

It requires the generation of a two-column CSV file: the first column is the date in YYYYMMDD format and the second column the “how much of something” for that day. See the example data on the blog post. Looks like it might be smart enough to figure out the dates involved. Let’s see.

It is, but doing all of the years together doesn’t work all that well given the significant increase in numbers of courses using the Book as time progresses and the requirement for the heatmap to use the same scale for all years. As a result the 2012 usage doesn’t show up all that well. Hence each of the years were mapped on separate heatmaps.
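Generating that two-column CSV from event data can be sketched as follows (a minimal sketch; it assumes a list of Unix-epoch timestamps for the events of interest, and the function name is my own):

```python
import csv
from collections import Counter
from datetime import datetime, timezone

def write_heatmap_csv(timestamps, path):
    """Aggregate Unix-epoch event times into per-day counts and write
    the two-column (YYYYMMDD, count) CSV the heatmap code expects."""
    per_day = Counter(
        datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y%m%d")
        for ts in timestamps
    )
    with open(path, "w", newline="") as out:
        writer = csv.writer(out)
        # Sorted so the file reads chronologically
        for day in sorted(per_day):
            writer.writerow([day, per_day[day]])
```

Running this once per year, with only that year’s events, gives each year its own scale and avoids the 2012 usage being washed out.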

The following calendar heatmaps show how often the Book resources were viewed on each day. The events counted are only those for Book resources from courses offered in the given year. In 2012, 2013 and 2014 this means that there is a smattering of views of the books early in the following year (semester 3 stretches from November to February). There is no similar usage for the 2015 books because the data does not include any 2016 events.

The darker the colour the greater the use. In the 2012 image below you should be able to see a tool tip showing a value of 81 (out of 100) that is quite dark, but not the darkest.

2012

The 2012 map seems to establish the pattern.  Heavy use at the start of semester with a gradual reduction through semester. A few upticks during semester and toward the end of semester.

I no longer have easy access to specific dates for 2012 and 2013. The 2014 heatmap has some specific dates which should broadly apply to these earlier years.
2012 Book usage

2013

2013 Book usage - calendar heatmap

2014

The institution maintains a web page that shows the important dates for 2014, it includes:

  • March 3 – Semester 1 starts.
    Course websites open 2 weeks before this date – 17th Feb
  • June 16 – Semester 1 exams start.
  • July 21 – Semester 2 starts
    Course websites open 2 weeks prior – 7th July.
  • November 3 – Semester 2 exams start.
  • November 17 – Semester 3 starts.

Screen Shot 2016-09-11 at 4.52.36 pm

2015

The semester 1 2015 offering of my course had the following due dates for its 3 assignments

  1. 30th March – which appears to coincide with a heavy usage day.
  2. 4th May – also a slightly heavy usage day, but not as heavy.
  3. 15th June – two somewhat heavy usage days before and on this date.

Raising the question of what the heatmap for that course might look like – see below

Screen Shot 2016-09-11 at 4.53.10 pm

EDC3100 – S1, 2015

Focusing just on my course the increase in usage just before the due date for the assignments is more obvious. One of the reasons for this is that all the Assessment information for the course is included in a Moodle Book resource.
EDC3100 S1 2015 book usage - calendar heatmap
Other time periods relevant to this course are:

  • April 6 to 17 – the two-week mid-semester break, which corresponds to two of the lightest periods of usage of book resources.
  • May 18 to June 5 – a three-week period when most of the students are on Professional Experience within schools, which also corresponds to a light period of usage.

The two heaviest days of usage are the 9th and 10th of March. The start of Week 2 of semester. It’s a time when the pressure is on to get a blog created and registered and start completing learning paths.

After the peak of the first three weeks, usage of the Book resources drops to around 50% per day.

Questions to arise from this

  • Does the learning journal assessment item for EDC3100 change when students interact with the course site?
  • Is the pattern of usage (down to 50% a day) indicative of students turning off, or becoming more familiar with the approach?
  • Does the high level of usage indicate

It also raises the question of whether particular offerings of the course show any differences.

2012 – S2

The 2012 S2 pattern is quite a bit different. It is a bit more uneven and appears to continue well after the semester is finished. This is partly because it was the first semester the course used the Book module, and partly because a semester 3 offering of the course for a few students used the same resources.
EDC3100 2012 2 - Book usage

The 2012 heatmap also shows a trend that continues: usage of the Book resources continues well past the end of semester. It’s not heavy usage, but it is still there.

Question: is that just me, or does it include students?

2013 – S1

2013 S1 is a bit different as well. Lighter use at the start of semester. A bit heavier usage around assignment due dates. My guess is that this was still early in the evolution of how the Book was being used.

EDC3100 2013 S1 - Book usage

2013 – S2

This map seems to be evolving toward the heavy use at the start of semester.
EDC3100 2013 S2 - Book usage

2014 – S1

And now the pattern is established. Heavy use at the start of semester and in the lead up to Assignment 1. A slight uptick then for Assignments 2 and 3. With the light usage around Professional Experience evident.

EDC3100 2014 S1 - Book usage

2014 – S2

EDC3100 2014 S2 - Book usage

2015 – S2

EDC3100 2015 S2 - Book usage

What about just the students?

The following shows just the student usage for the 2013 S1 offering. There is not a huge difference to the “all role” version above, suggesting that it is students who are doing most of the viewing. But it does confirm that the on-going usage of the Book resources past the end of the semester is by students who appear to have found some value in the information after the course.

EDC3100 2013 1 - Just students

Which comes first? Pedagogy or technology?

Miranda picks up on a common point around the combination of technology and pedagogy with this post titled Pedagogy First then Technology. I disagree. If you have to think in simple sequential terms, then I think pedagogy should be the last consideration, not the first. The broader problem, though, is our tendency to want to limit ourselves to the sequential.

Here’s why.

The world and how we think isn’t sequential

The learning and teaching literature is replete with sequential processes such as ADDIE, Backwards Design, Constructive Alignment etc. It’s replete with such models because that’s what academics and experts tend to do. Develop models. The problem is that all models are wrong, but some of them are useful in certain situations for certain purposes.

Such models attempt to distill what is important from a situation to allow us to focus on that and achieve something useful. The only trouble is that the act of distillation throws something away. It’s an approach that suffers from a problem identified by Sir Samuel Vimes in Feet of Clay by the late Terry Pratchett

What arrogance! What an insult to the rich and chaotic variety of the human experience.

Very few, if any, human beings engage in anything complex or creative (such as designing learning) by following a sequential process.  We are not machines. In a complex task within a complex environment you learn as much, if not more, by engaging in the process as you do planning what you will do beforehand.

Sure, if the task you are thinking about is quite simple, or if it is quite complicated and you have a lot of experience and expertise around that task, then you can perhaps follow a sequential process. However, if you are a teacher pondering how to transform learning through the use of digital technology (or using something else), then your task is neither simple, nor is it complicated, nor is it something you likely have experience or expertise with.

A sequential process to explain why technology first

Technologies for Children is the title of a book that is designed to help teachers develop the ability to help learners engage with the Australian Curriculum – Technologies learning area. A curriculum that defines two subjects: Design and Technologies, and Digital Technologies. In the second chapter (Fleer, 2016) the author shares details of how one year 4/5 teacher integrates this learning area into her class. It includes examples of “a number of key statements that reflected the technological processes and production skills” (Fleer, 2016, p. 37) that are then turned into learner produced wall charts. The following example wall chart is included in Fleer (2016, p. 37). Take note of the first step.

When we evaluate, investigate, generate designs, generate project plans, and make/produce we:

  1. Collaboratively play (investigate) with the materials.
  2. Evaluate the materials and think about how they could be used.
  3. Generate designs and create a project plan for making the item.
  4. Produce or make the item.
  5. Evaluate the item.
  6. Write about the item and talk with others.
  7. Display the item.

Before you can figure out what you are going to do with a digital technology, you need to be fully aware of how the technology works, what it can do, what the costs of doing that are, what it can’t do, etc. Once you’ve got a good handle on what the digital technology can do, then you can figure out interesting and effective ways to transform learning using the technology. i.e. pedagogy is the last consideration.

This is not to suggest that pedagogy is less important because it comes last. Pedagogy is the ultimate goal.

But all models are wrong

But of course all models are wrong. This model is (arguably) only appropriate if you are not familiar with digital technology. If you know all about digital technology or the specific digital technology you are considering, then  your need to play with the digital technology first is lessened.  Maybe you can leap straight to pedagogy.

The trouble is that most teachers that I know have fairly limited knowledge of digital technologies. In fact, I think many of the supposed IT experts within our institutions and the broader institution have somewhat limited understandings of the true nature of digital technologies. I’ve argued that this limited understanding is directly impacting the quality of the use of digital technology for learning and teaching.

The broader problem with this “technology first” model – as with the “pedagogy first” model – is the assumption that we engage in any complex task using a simple, sequential process. Even the 7 step sequential process above is unlikely to capture “the rich and chaotic variety” of how we evaluate, investigate and generate designs for using digital technology for learning and teaching. A teacher is just as likely to “play (investigate)” with a new digital technology by trying it out in a small, safe-to-fail experiment to see how it plays out. Perhaps this is repeated over a few cycles until the teacher is more comfortable with how the digital technology works in the specific context, with the specific learners.

References

Fleer, M. (2016). Key ideas in the technologies curriculum. In Technologies for Children (pp. 35–70). Cambridge University Press.

Exploring Moodle Book Module usage – part 1 – background and planning

I’m due to have the slides for a Moodlemoot Australia presentation in a few weeks. Time to get organised. The following is (perhaps) the first of a sequence of posts reporting on progress toward that presentation and the related research.

Background

My interest in research is primarily driven by the observation that most educational usage of digital technology to enhance learning and teaching is fairly bad. Typically the blame for this gets laid at the feet of teaching staff who are digitally illiterate, not qualified to teach, or are laggards. My belief/argument is that the problem really arises because the environment within formal education institutions just doesn’t understand what is required to make a difference. Much of what they do (e.g. institutional standards for course sites, checklists, training, support documentation, design and support of technologies…) does little to help and tends to make the problem worse.

You want digitally fluent faculty?

A contributing factor is that institutional attempts to improve digital learning fail to be based on any insights into how people (in this case teaching staff and all those involved with digital learning) learn. How institutions implement digital learning actually gets in the way of people learning how to do it better.

Schema and the grammar of school

The ideas of schema and the grammar of school offer one example of this failure. This earlier post includes the following quote from Cavallo (2004), which establishes the link:

David Tyack and Larry Cuban postulated that there exists a grammar of school, which makes deviation from our embedded popular conception of school feel as nonsensical as an ungrammatical utterance [1]. They describe how reform efforts, whether good or bad, progressive or conservative, eventually are rejected or denatured and assimilated. Reform efforts are not attempted in the abstract, they are situated in a variety of social, cultural and historical contexts. They do not succeed or fail solely on the basis of the merit of the ideas about learning, but rather, they are viewed as successful based upon their effect on the system and culture as a whole. Thus, they also have sociological and institutional components — failure to attend to matters of systemic learning will facilitate the failure of the adoption of the reforms. (p. 96)

The grammar of school problem is linked to the idea of schema, which links to the following quote that I first saw in Arthur (2009) and which is taken from Vaughan (1986, p. 71):

[In the situations we deal with as humans, we use] a frame of reference constructed from integrated sets of assumptions, expectations and experiences. Everything is perceived on the basis of this framework. The framework becomes self-confirming because, whenever we can, we tend to impose it on experiences and events, creating incidents and relationships that conform to it. And we tend to ignore, misperceive, or deny events that do not fit it. As a consequence, it generally leads us to what we are looking for. This frame of reference is not easily altered or dismantled, because the way we tend to see the world is intimately linked to how we see and define ourselves in relation to the world. Thus, we have a vested interest in maintaining consistency because our own identity is at risk.

Evidence of schema in how digital technologies are used

Horsey, Horseless Carriage

The schema idea means that people will perceive and thus use digital technologies in ways that fit with their “integrated sets of assumptions, expectations and experiences”. This is an explanation for the horsey, horseless carriage way people respond to digital technologies. It’s why courses where the majority of students are online students and will never come onto a campus are still designed around the idea of face-to-face lectures and tutorials.

It also explains why when I finally returned to teaching a course I adopted the idea of a ramble for the structure of the course.  It explains why the implementation of the ramble evolved into using the Moodle Book module the way it does today. The images below (click on them to see larger versions) illustrate the connection between my practice 20 years apart, more detail follows.

1996: The 85321 "online" book - 1996
2016: Online book 2016

The 1996 image is a page from  the study guide (wonder how many people can play the au file containing the Wayne’s World II quote) for the Systems Administration course I taught in 1996. The 2016 image is a page from the “study guide” I developed for an Arts & Technologies C&P course.

I believe/suggest that the influence of schema is also a significant contributor to the practice of other teaching staff as they transition into digital learning. It’s a factor in why most course sites remain dumping grounds for lecture slides, and in the subsequent widespread growth in the use of lecture capture systems.

And it’s not just the teaching staff. Students have developed schema about what it means to be taught, and what it means to be taught at university. A schema developed either through direct experience, or via the experience of others and various media. The typical schema for university education involves large lecture halls and tutorials.

 

So what?

The above suggests that whenever students and teachers engage with a digital technology (or any change) and its use for learning and teaching, there are three main possibilities:

  1. It is seen as nonsensical and rejected.
    e.g. whatever was said doesn’t make sense given existing grammar rules and is seen as just being wrong.
  2. It sounds like something familiar and is modified to fit within the confines of that familiar practice.
    e.g. whatever was said sounds an awful lot like an existing use of grammar (even though it is different), and thus is interpreted as matching that existing use.
  3. The significant difference is seen as valuable and existing practice is modified to make use of that difference.
    e.g. the different use of grammar is both understood as different and the difference is valued, and subsequently existing practice is modified to incorporate the new grammar.

If this is the case, then examining the use (or not) of a digital technology in learning and teaching should reveal evidence of these possibilities. This seems very likely, given widespread common complaints about the use of digital technology to enhance learning and teaching. Complaints that see most practice stuck at possibility #2 (at best).

If this is the case, then perhaps this way of thinking might also identify how to address this.

But first, I’m interested in seeing if use of a particular digital technology matches this prediction.

Use of the Moodle Book module

Due to a 2015 grant from the USQ OpenTextbook Initiative I’m going to explore the use of the Moodle Book module. The plan is to analyse the use of the Moodle Book module (the Book) at USQ to see how both learners and teachers are engaging with it, see if the above expectations are met, and figure out what might be done in terms of the support and development of the Moodle Book module to help improve this.

What follows is an initial map of what I’ll be exploring.

A major aim here is to explore whether a student or teacher using the Book has made the transition from possibility #2 (treating the Book as a print-based book) to possibility #3 (recognising that this is an online book, and using that difference). I’ve highlighted some of the following questions/analysis which I think will be useful indicators of this transition. The darker the yellow highlight, the more strongly I think it might indicate someone making the leap to an online book.

Question for you: What other practices might indicate use that has moved from #2 to #3?

Which courses use the Book

First step is to explore whether the Book is being used. How many courses are using it? How many books are being produced with the module?

As the abstract for the talk suggests, early analysis revealed a growth in use, but I’m wondering how sound that analysis was. Hence there is a need to:

  1. Correctly identify the number of course offerings using the Book each year.
  2. Identify the number of different teaching staff who are responsible for those courses.
    Longer term, it would be useful to ask these staff about their background and reasons for using the Book.
  3. Identify the type of courses using the Book.
  4. How many books are being produced by each course?
  5. How do the books fit into the structure of the course?
    1. Is the structure the same from offering to offering?
    2. How much does the number and content of the books change from offering to offering?

Characteristics of the book content

  1. Statistics around the level of readability of the text (e.g. Flesch-Kincaid etc).
  2. The structure of the book – are sub-chapters used?
  3. Are images, video, Moodle activities included?
  4. What about links?
    • Are there any links at all?
    • What is linked to?
    • Are links purely to external resources? 
    • How many links connect back to other parts of the course’s Books?
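The link questions above could be explored with a small amount of standard-library Python. A sketch, assuming each chapter’s content is available as an HTML string; the site hostname used here is a hypothetical example, not the actual configuration:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collect href values from a chapter's HTML content."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def classify_links(chapter_html, site_host="example.usq.edu.au"):
    """Split a chapter's links into internal (back into the course site)
    and external. site_host is an assumed hostname, purely illustrative."""
    parser = LinkCollector()
    parser.feed(chapter_html)
    internal, external = [], []
    for link in parser.links:
        host = urlparse(link).netloc
        # Relative links and links to the site host count as internal
        (internal if not host or host == site_host else external).append(link)
    return internal, external
```

Counting how many internal links point at other Book chapters (e.g. URLs containing `/mod/book/view.php`) would then give one indicator for the last question above.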

Patterns in how the books are authored

  1. How are the books authored?
    • From scratch?
      1. Using the web interface?
      2. Via an import process?
    • Copied from previous offerings?
    • ?? other??
  2. How are they edited? 
    My expectation is that a teacher who sees the Book as a replacement for a print book will not be editing the books during semester.

Patterns in how the books are read/used

  1. Are students reading the books online or printing them out?
  2. Does printing always happen at the start of semester? Does it continue through semester? Does it drop off?
  3. When are students reading the books?
  4. What is the nature of the paths they take through the books?
    1. Do they read the books and the chapters in order?
    2. How long do they spend on each chapter?
    3. Do they revisit particular books?
  5. How often do discussion forum posts in a course include links to chapters/sub-chapters within the books?
    • Posts written by teaching staff
    • Post written by students

References

Arthur, W. B. (2009). The Nature of Technology: what it is and how it evolves. New York, USA: Free Press.

Cavallo, D. (2004). Models of growth – Towards fundamental change in learning environments. BT Technology Journal, 22(4), 96–112.

How many digital devices do you have?

In a couple of the courses I teach I ask students (for slightly different purposes) the question from the title of this post, “How many digital devices do you have?”.  In one of the courses that question takes the form of a quiz and looks something like the following.

Question text

How many different digital technologies do you own?
Select one:
a. 0
b. 1 to 5
c. 6 to 10
d. 11 to 20
e. More than 20

What answer would you give?

Count them up folks. What answer would you give? I’ll give you some space to think about that before talking about what some other folk have said.

 

 

What others have said

Some of the students in another course (where the question is framed somewhat differently) have offered the type of answers I expected, based on the framing of the question.

Jay identifies 3 devices. Neema lists 2.

Thinking a bit further afield than that I can probably count quite a few more than that in my house. I’ll ignore devices personal to other members of my family. This gets me the following list: laptop, 2 smart phones, digital camera, printer, various external drives, Apple TV device, T-Box, X-Box One.  That’s 9.

But that doesn’t really start to count them

Fleming (2011) writes that it is

estimated that today’s well-equipped automobile uses more than 50 microcontroller units (p. 4)

Wikipedia defines a microcontroller as “a small computer on a single integrated circuit containing a processor core, memory, and programmable input/output peripherals”.

So your car alone potentially has you well into double figures. Remember that Fleming was writing in 2011. If you have recently purchased the latest Mercedes E-Class, chances are the number of microcontroller units in your car goes well beyond 50.

And of course, with your thinking re-calibrated by this example, you can probably quite easily identify additional devices in your house that are likely to contain microcontrollers.

Implications

Digital devices are increasingly ubiquitous. Digital isn’t limited to a separate device like a computer, tablet, or smart phone. It’s wearable and in every thing.

I expect most people not to be aware of just how reliant they are on digital technologies in everything they do. Hence it’s uncertain that they understand or are prepared for what this might mean for what they do. For example, I don’t think many people in higher education or education more broadly quite understand the implications this has for how those organisations operate, perform, or exist. I’m not convinced that the patterns they use to make sense of the world are ready yet to deal with these changes effectively.

But then I’m not convinced the technologists are either.

Interesting times ahead.

References

Fleming, B. (2011). Microcontroller units in automobiles. IEEE Vehicular Technology Magazine, 6(3), 4–8. doi:10.1109/MVT.2011.941888

 

Teacher presence in network learning

A new semester and the Networked and Global Learning course is running again. Apologies to those in the other courses I teach, but this course is consistently the most engaging and interesting. It’s a course in which I typically learn as much as the other participants. However, due to the reasons/excuses outlined in the last post, I haven’t been able to engage as much as I would have liked with the course.

This has me thinking about something Adam wrote, in particular the argument/observation from Rovai (2002) which Adam describes as

This is bringing to light the sense of disconnection students are often experiencing due to physical and psychological separation from teachers, peers and institutions

What follows are some random reactions to this particular quote and an attempt to connect it with my teaching.

Badly designed learning generates bad outcomes

As someone who has been working, learning and teaching online for a long time, I am biased and this idea troubles me. In fact, it puts me in mind of the following point made in this recent post around the question of banning laptops in the classroom because handwriting is better for learning:

Those studies about the wonders of handwriting all suffer from the same set of flaws, namely, a) that they don’t actually work with students who have been taught to use their laptops or devices for taking notes. That is, they all hand students devices and tell them to take notes in the same way they would in written form. In some cases those devices don’t have keyboards; in some cases they don’t provide software tools to use (there are some great ones, but doing it in say, Word, isn’t going to maximize the options digital spaces allow), in some cases the devices are not ones the students use themselves and with which they are comfortable. And b) the studies are almost always focused on learning in large lecture classes or classes in which the assessment of success is performance on a standardized (typically multiple-choice) test, not in the ways that many, many classes operate, and not a measure that many of us use in our own classes. And c) they don’t actually attempt to integrate the devices into the classes in question,

In terms of student disconnection, is it arising from there truly being something essential that a physical face-to-face learning experience provides that can’t be provided in an online space?

Or, is it because the types of online learning experiences examined by Rovai were not designed appropriately to draw on the affordances offered by an online learning environment? Do the online learning experiences examined by Rovai suffer the same problem that most attempts to engage in open education illustrate? i.e. an inability to break out of the “persistent patterns of relations” (Bigum, 2012) that are formed by someone brought up teaching face-to-face?

Given that the abstract for Rovai (2002) includes

Data were collected from 375 students enrolled in 28 different courses, offered for graduate credit via the Blackboard e-learning system by a private university

This indicates that the “persistent patterns of relations” under examination in this paper are from a North American university in 2000/2001, where online learning was limited to the Blackboard LMS. A time and system unlikely to be described by anyone as the pinnacle of an online learning experience.

Might the sense of disconnection arise from the poor quality of the learning experience (online or otherwise) rather than the lack of physical presence?

Or is it simply that both teachers and learners have yet to figure out how to leverage the affordances of online learning?

What type of presence should a teacher have?

The following two images represent connections formed between participants in two discussion forums in a large course I teach (these are from first semester 2015). Each dot represents a participant: a red dot is a teacher, a blue dot a student. A connection between two people is formed when one of them replies to a post from the other.

This first image is from the general Question and Answers forum on the course site.

Forum network map

The second image is from the Introduction and welcome forum, where students introduce themselves and say hi to someone the same and someone different.

Screen Shot 2016-08-07 at 3.01.34 pm

In the first image, there is one red dot (me) that is strongly the centre of all that’s going on. I’m answering questions. In the second image, the red dot that is me is only lightly connected.
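The kind of reply network shown in these images can be rebuilt from forum data. A rough sketch, loosely modelled on the structure of Moodle’s forum posts table (the field names here are assumptions, not the exact schema):

```python
from collections import defaultdict

def reply_network(posts):
    """Build an undirected reply network from forum posts.

    posts: iterable of dicts with 'id', 'userid' and 'parent' keys,
    where a parent of 0/None marks the start of a thread.
    Returns {frozenset({user_a, user_b}): reply_count}, i.e. one
    weighted edge per pair of participants who replied to each other.
    """
    author_of = {p["id"]: p["userid"] for p in posts}
    edges = defaultdict(int)
    for p in posts:
        parent = p.get("parent")
        if parent and parent in author_of:
            a, b = p["userid"], author_of[parent]
            if a != b:  # ignore self-replies
                edges[frozenset((a, b))] += 1
    return dict(edges)
```

Feeding the resulting edge list into a graph layout tool (Gephi, networkx, etc.) gives maps like the ones above; the degree of the teacher’s node is then a crude measure of how central the teacher is.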

Which is better? Of course it depends. Which is scalable in an effective way?

The Equivalency Theorem suggests that as long as one of student-teacher, student-student, or student-content interaction is high, deep formal learning can occur. High levels of more than one and the educational experience will be more satisfying.

So far the NGL course has been suffering from low student-teacher interaction.  I wonder about the other two? Time will tell.

Teacher as meddler in the middle

A couple of years ago I wrote this post as an example of an “as teacher” post – a requirement for the NGL course. Not a lot has changed, and all this talk of interaction and connection has me thinking again of the first question I was interested in two years ago

How I can be a more effective “meddler in the middle”?

In particular, how can I be more aware of the types of interactions students are having in the courses I teach, and subsequently what actions can I take to help strengthen them as necessary? If I do this, what impact will it have on student learning and their experience?

I wonder if it is the paucity of methods for understanding exactly how and what interactions are occurring that has me refining teaching materials. Materials that students may not be engaging with. I’m hoping that this project will help reveal how and if students are engaging with the content in at least one course. Anecdotally, it appears that for many, interaction with the content is little more than a box to tick. If borne out, this raises the question of how to get students to interact/engage effectively with the content.

There are similar questions to be explored around use of blogs and the connections between students….

References

Bigum, C. (2012). Edges, Exponentials and Education: Disenthralling the Digital. In L. Rowan & C. Bigum (Eds.), Transformative Approaches to New Technologies and Student Diversity in Futures Oriented Classrooms: Future Proofing Education (pp. 29–43). Springer. doi:10.1007/978-94-007-2642-0

Rovai, A. (2002). Development of an instrument to measure classroom community. The Internet and Higher Education, 5(3), 197–211. doi:10.1016/s1096-7516(02)00102-1

Valuing the "residue of experience" a bit more

For a while now I have been drawing on the following quote from Riel and Polin (2004)

Over time, the residue of these experiences remains available to newcomers in the tools, tales, talk, and traditions of the group. In this way, the newcomers find a rich environment for learning. (p. 18)

to explain why I encourage/require the use of various types of social media (blogs, social bookmarking, feed readers) in my courses. This 2014 post identifies the problem (what happens in a course site, stays and dies in a course site) and how the social media used in these courses helps address that problem. If you do a Google search for edc3100 blog, you will get another illustration of how at least some of the residue of experience remains available to newcomers in at least one of the courses.

The problem is that this year has revealed that the design of the course doesn’t yet value that residue of experience, at least not in terms of the main value measure for many students – assessment. Students gain marks for writing blog posts that link to posts from other students, but the code that does this marking only recognises currently enrolled students. Linking to the broader residue of experience doesn’t count.

Interestingly, this has only become an issue this year. Only this year have students been asking why they missed out on marks for links to other (“old”) student posts. Leaving aside why it’s only started this year, this post documents the move to valuing the residue of experience.

After implementing the code below, it appears that at least 28 students (about 25%) this semester have linked to blog posts from students in previous offerings of the course. It would be interesting to explore this further: see how prevalent the practice has been in previous courses, and update these visualisations to show the connections between offerings.

What I need to do

The process will be

  • Refamiliarising myself with how the “valuing” is currently done.
  • Planning and implementing how to value the residue of experience.
  • Figuring out if/how to check how often the residue of experience has been used.

How it is currently valued

Some Perl code does the work. Details follow.

The BlogStatistics class gathers all information about the blogs for students in the current course offering. A method generateAllStatistics does some of the grunt work.

But this class also creates a data member MARKING for each student, based on the Marking class and its GenerateStats method. This class gets the content from the bim_marking table (i.e. all the posts by the student).

GenerateStats accepts a reference to a hash that contains links to all the other blogs in the course (for the specific offering). It calls DoTheLinks (gotta love the naming) and passes it the hash ref to do the count.

One question is how much old data do I currently have?  Seems like there’s only the 2015 and 2016 data easily accessible.

Planning and implementation

One approach would be

  • BlogStatistics generates a list of old student blog URLs
    • add BlogStatistics::getOldStudentBlogs that creates $%BlogStatistics::OLD_BLOGS DONE
  • BlogStatistics passes this into each call to Marking::GenerateStats  DONE
  • Marking::GenerateStats would pass this onto Marking::DoTheLinks DONE
    • also increment POSTS_WITH_STUDENT_LINKS if a link is to an old student blog DONE
    • increment POSTS_WITH_OLD_STUDNET_LINKS if a link is to an old student blog DONE
  • Modify the report generator to show OLD links DONE
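For the record, the logic of that change can be sketched as follows. This is an illustrative translation into Python — the actual implementation is the Perl described above, and the URL matching and example data here are assumptions based on the plan:

```python
# Illustrative Python sketch (the real code is Perl) of counting a
# student's posts that link to current and to "old" student blogs.
# The counter names follow the plan; the matching rule is an assumption.

import re

URL_RE = re.compile(r'https?://[^\s"\'<>]+')

def count_blog_links(posts, current_blogs, old_blogs):
    """posts: list of post bodies; *_blogs: sets of blog base URLs."""
    stats = {"POSTS_WITH_STUDENT_LINKS": 0,
             "POSTS_WITH_OLD_STUDENT_LINKS": 0}
    for post in posts:
        links = URL_RE.findall(post)
        hits_current = any(l.startswith(b) for l in links for b in current_blogs)
        hits_old = any(l.startswith(b) for l in links for b in old_blogs)
        # the change: a link to an old blog also counts as a student link
        if hits_current or hits_old:
            stats["POSTS_WITH_STUDENT_LINKS"] += 1
        if hits_old:
            stats["POSTS_WITH_OLD_STUDENT_LINKS"] += 1
    return stats

stats = count_blog_links(
    ["See https://alice.wordpress.com/week-1 for a similar take",
     "An older take: https://zoe.wordpress.com/2015/intro",
     "No links here"],
    current_blogs={"https://alice.wordpress.com"},
    old_blogs={"https://zoe.wordpress.com"},
)
```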


Planning changes to EDC3100 assignment 1

In the first half of the year there was a new assignment in EDC3100, designed both to enhance student learning and to experiment with making the data produced by students and markers as part of the assessment process more accessible for manipulation by software. i.e. the students and markers entered data into a spreadsheet.

It’s a new semester; time to reflect on that initial use and see what changes should and can be made.

Student results

Let’s start with student results. (Note: this is all a bit rough and ready)

Overall, the average mark for the assignment was 13.8 out of 19 (72%) with a standard deviation of around 3. But that’s for both parts of the assignment.

Given the current practice of using Word documents as assignment cover sheets, extracting the specific marks for the checklist/spreadsheet part of the assignment is difficult. But I have an Excel spreadsheet for each submission, and I can run a script to get that data.

The average mark is about 9.5 out of 14 (68%), with a standard deviation of around 2.
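As a rough sketch of what that script’s analysis step looks like — the marks below are invented for illustration, and pulling the real marks out of each submitted spreadsheet would use something like openpyxl:

```python
# Sketch of the analysis step: given the marks pulled from each
# submitted spreadsheet, compute cohort average, percentage, and
# a (population) standard deviation. The marks here are made up.

from statistics import mean, pstdev

def summarise(marks, out_of):
    """Return rounded cohort statistics for a list of marks."""
    avg = mean(marks)
    return {
        "average": round(avg, 1),
        "average_pct": round(100 * avg / out_of, 1),
        "stdev": round(pstdev(marks), 1),
    }

# invented marks out of 14, purely to show the calculation
cohort = summarise([9, 11, 8, 10, 12, 7], out_of=14)
```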

Let’s dig a bit deeper into the three criteria that made up that mark:

  1. Mark – students use a checklist to evaluate a lesson plan and its use of ICT and pedagogy.
  2. Acceptable use – focused on students ability to identify a lesson plan they can use wrt copyright.
  3. RAT – students use the RAT model to evaluate the use of ICT and pedagogy in the lesson plan

The following table compares cohort performance on the criteria and overall.

Criteria         Average %   stdev %
Overall          68          15.8
Mark             75.2        17.2
Acceptable Use   63.2        16.7
RAT              59.3        17.8

The RAT question was where the students were least successful. It’s also (arguably) the more difficult question. The checklist (Mark) was the highest scoring criterion. Acceptable use is also quite low and needs some work.

Those last two are where the focus will go for now.

Other thoughts and experiences

Student feedback

Student feedback included the following comments related to the assignment

Some of the items we were required to assess in Assignment One could have been better explained

more guidance was required for Assignment 1. I didn’t like the use of the Excel document

The last point is connected to the issue of not being able to justify the interpretation, which links back to points raised elsewhere. The first point is one to ponder. The results above suggest that’s not where the real need lies.

Marker feedback

Feedback from markers included

  • Students identifying the use of an IWB, when in fact it’s just being used as a data projector.
  • Little understanding of what constitutes: an authentic problem, and connections beyond the classroom
  • Some surprise that even with 50 assignments to mark, there were few double ups of lesson plans.
  • Another liked the format in that it gave students a better handle on what to look for in an ICT-rich lesson and the RAT model was useful for framing an evaluation.
  • The wording and nature of the statements for the acceptable use and the RAT questions need to be clarified – too confusing (for both marker and student)

One aspect of the assignment that troubled one of the markers was that the lesson chosen by the student only had to include some form of ICT. It didn’t need to be rich or effective use of ICT. This was actually one of the aims of the assignment: to allow students to develop some appreciation for the breadth of what is possible and just how narrow use often is.

Questions asked during semester

  • Struggles to find a CC-licensed lesson plan.
  • Clarity about what makes an acceptable lesson plan
    • e.g. Can an American lesson be used?
    • Linked to concerns about Q10 and distinguishing between an appropriate lesson plan and whether or not you can use it due to copyright
  • Questions re: terms of use and uploading
  • What if I can’t find any information about copyright?
  • How can/should the lesson plan be put online?
  • The distinction between when a student is using an ICT and when the teacher is using it
  • Explanation of how the checklist questions are marked – e.g. those that don’t apply
  • Reporting bugs in the formatting of the cells

Personal thoughts

Early reflections on the semester included

The spreadsheet worked reasonably well. The checklist within the spreadsheet requires some refinement. As does some aspects of the rubric. The duplication of a Word-based coversheet needs to be removed.

Other thoughts during the semester included:

  • Students had a tendency to treat the free text questions as requiring an essay.
  • The “pretend” context for the task wasn’t clear enough.
  • In particular, a problem about the exact legal status of ACME’s server, links and making copies of files.
  • Issues with specific questions and the checklist
    • The “web applications” option under “What is used” causing confusion about overlap with “web browser” question
    • Q16 includes mention of print material around ICT
    • Q26 mentions embedded hardware; there’s a question about it and its connection with IWBs
    • Appears to be strong connections between Q22 and A46
    • The purpose of Q10 is not clear enough, confusion with matching curriculum etc.
    • A feeling that there are too many questions and perhaps overlap
    • Criteria for RAT question isn’t clear enough about the quality of the response
      • e.g. not mentioning all uses of ICT and Pedagogy
      • Missing out on themes
      • Incorrectly identifying something as belonging to a theme
    • Suggestion for a drop down box around linkage of ICT to objectives: not related, somewhat related, essential, extends, transforms
  • More explicit scaffolding/basic activities around the evaluation questions
    • e.g. Is ICT being used to Question, Scaffold, Lecture, in an authentic task

Random suggestions

Due to institutional constraints (not to mention time) none of the changes to be made can be radical.  Keeping with that, some initial suggested changes to explore include:

  1. Pre-submission checks
    1. What pre-submission checks should I run?
    2. Can they be run? How does that integrate with the Moodle assignment activity workflow?
  2. Remove the cover sheet entirely, just use the spreadsheet
    1. Need to include the learning journal mark into the spreadsheet
    2. Would be nice to do this automagically
  3. Tweaking the marking
    1. The criteria for Acceptable use and RAT questions need to be improved
    2. Look closely at each of the points about the questions
  4. Student preparation
    1. Make clear the need not to write essays for the free text questions
    2. Finding CC licensed lesson plans
      1. Great difficulty in finding those that are CC licensed
      2. Provide a list of prior sites people have used
      3. Generate some sort of activity to test understanding of CC with a specific example
    3. RAT Model
      1. More activities in learning paths
      2. Better labeling on  the spreadsheet
    4. More questions/activities around specific terms and concepts within the checklist


Planning an EDC3100 "installfest"

The following documents the planning of an “installfest” for the course EDC3100. Implementation and reflection will come later.

Rationale

The course encourages/requires students to modify their learning process in the course to engage in Jarche’s seek/sense/share framework using a combination of a personal blog, Diigo, and the Feedly feed reader.

This is a radical departure and a challenge for most students. It results in a lot of time expended at the start of semester. For example, a past student shared her experience

I spent a lot of time trying to work out blogging, Diigo and Feedly and to be honest I am still only using the bare minimum with blogging

Not a good outcome, and apparently what has been used previously doesn’t work. So an alternative is required.

As it happens, the same student also suggested a possible solution

My thoughts on changes or additions to the course that I would have found useful, would have been to come to a workshop near the start.

I’ve been pondering this suggestion and how it might work with the next offering of the course, which has around 100 online students. Being of a certain age, I remember installfests and have been wondering if that might be a useful model. Leading to questions such as:

Can something like an installfest be run in a online video-conference space? Will students participate? Will it help? How to organise it within existing constraints?

Design thoughts

Linux Installfest HOWTO

Interestingly, I came across the Linux Documentation Project’s Linux Installfest HOWTO; the following starts from that document.

The location will be virtual, not physical, so advice about preparing the physical location doesn’t quite apply. However, the features of the Zoom service will need to be considered.

Consideration: Might the breakout room feature of Zoom be useful for organising people at different stages?

Bringing up the major constraint: there’s likely to be only me to fulfil the various suggested roles. With more time I might have been able to organise additional help, but let’s not talk about the one-week gap between semester 1 and semester 2.

Consideration: Can the session structure be informed by the identified roles? e.g. the receptionist role could be covered by the initial part of the session, which focuses on welcoming people to the space. It might also be useful to explicitly ask for volunteers who are a little further ahead than others, volunteers who might take on a Tier 1 support role.

Consideration: Can a Google document/sheet be used to get an idea of people’s knowledge, experience and comfort level with the various tools? Is completing this sheet part of the entry process? Perhaps something based on the data sheet?

Consideration: Have a space at the end for reflection? Perhaps people could do this, in part, on their blogs? It might even be a good exercise to start them making connections etc., and to see all the tools working together.

Fit with the course requirements

Course requirements to consider include

  • Blog
    • Which blog?
    • Posts and their model.
    • Feeds
  • Trying to help students develop an appreciation of the value of developing conceptual models of how a technology works, moving beyond recipe following.
  • Challenge of explaining how these three tools fit together.
  • What about seek/sense/share, and what that means for how they learn.
    Question: Do the why first? Too abstract. Leave it until the end? Then they don’t know why, and it’s perhaps too late and they’re too tired from everything else. Perhaps show them how it all looks at the end?
  • Identity
    • anonymous or not
    • Professional identity
    • Not being an egg
  • How to demonstrate to people the process
    Select a volunteer and I help guide them through the process using some sort of scaffold (e.g. the slides or study desk)
  • How to give people the time to try it by themselves and perhaps get support
  • How to encourage/enable reuse of sections of the video to integrate into the learning paths

Questions to ask (form/spreadsheet)

  • Name
  • Are you willing to volunteer to be guided?
  • Blog
    • Do you have one set up?
    • Rate your knowledge of your blog
    • Have you written a blog post?
    • Have you customised your blog?
  • Diigo
    • Do you have a Diigo account?
    • Do you have a Diigo extension installed?
    • Have you bookmarked something using Diigo?
    • Have you shared it to the EDC3100 Diigo group?
  • Feedly
    • Have you logged into Feedly?
    • Have you imported the EDC3100 OPML files?
    • Have you tried following anyone else?


Initial design

Welcome

First 5+ minutes focus on welcoming everyone and asking them to fill out the form.

Outline the purpose of the session.

Outline the structure

  • Welcome
  • Where are we up to, where are we going
  • Doing it
    • Diigo
    • Feedly
    • Blog
  • Pulling it all together

Where are we up to? Where are we going?

Explain the three tools and the seek/sense/share approach to learning, only briefly on why, focus on concrete illustration showing my use of the tools. Link this to professional identity and the idea of being anonymous. Which tools need to be anonymous?

We want you to be able to do this by the end of the session.

Show the sheet behind the form – link to an idea they can use, mention the Google spreadsheet links in the EDC3100 Diigo group.  Find out where people are up to, think about approaches, ask for volunteers to be Tier 1 support – perhaps on the chat?  Or perhaps in a breakout room.

Outline structure (easy first, to more difficult)

  • Feedly
  • Diigo
  • Blog

Diigo

  1. Sign-up for account.
    Make sure to go to “learn more” — username and email (which email: personal or USQ?)
  2. Join the EDC3100 group.
  3. Show the emails I get and the approval process
  4. Install a Diigo tool
    Recommend Diigo extension – but Diigolet will do
  5. Bookmark a page for yourself
  6. Bookmark a page to the group????
  7. Do minute paper

Feedly

  1. Which account – link to professional identity
    1. Umail if only for University – this okay because it’s not visible.
    2. Facebook or other account if using for personal
  2. Visit Feedly – Hit the get started button – login
  3. Import the OPML files.
  4. Add some content – get them to search in Feedly for something they are interested in
  5. Make the point about not reading the actual page, but a copy; show how to access the actual page
  6. minute paper

Blog

  1. Which blog service
  2. Which identity – anonymous etc.
  3. Go to your choice of blog provider
  4. Hit the equivalent of “Create website”
  5. Follow the process
  6. Choose your configuration
  7. Write your first blog post — maybe suggest it should be linked to this post and reflect upon it.  Work in some ideas about reflection.
  8. Register the blog on the Study Desk — probably shouldn’t show this in Zoom.
  9. Talk about the WordPress reader and its relationship with Diigo
  10. Minute paper

Pulling it all together

  1. Can I get them to download the OPML file and import it into Feedly?
  2. Come back to the seek/sense/share process
    1. Seek – Start with Feedly
      1. See discussion forum posts
      2. See posts from other students
    2. Sense – on blog
    3. Share – on blog and Diigo
  3. Another minute paper???

Tasks

  1. Powerpoint scaffold for the session
  2. Google Forms
    1. Where are you up to?
    2. Minute papers
      1. Feedly
      2. Diigo
      3. Blog
  3. Set up data system for EDC3100 S2
    1. Blog registration counter
    2. Creating OPML files
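The “Creating OPML files” task could be sketched along these lines. A hypothetical Python sketch — the function name and feed list are invented, and the real system would presumably build the list from the blog registrations:

```python
# Hypothetical sketch: generate an OPML file from registered blog
# feeds, so students can import them all into Feedly in one step.

import xml.etree.ElementTree as ET

def make_opml(title, feeds):
    """feeds: list of (label, feed_url) pairs; returns OPML as a string."""
    opml = ET.Element("opml", version="2.0")
    head = ET.SubElement(opml, "head")
    ET.SubElement(head, "title").text = title
    body = ET.SubElement(opml, "body")
    for label, url in feeds:
        ET.SubElement(body, "outline", text=label, title=label,
                      type="rss", xmlUrl=url)
    return ET.tostring(opml, encoding="unicode")

# invented registrations for illustration
opml = make_opml("EDC3100 S2 blogs", [
    ("Student A", "https://studenta.wordpress.com/feed/"),
    ("Student B", "https://studentb.blogspot.com/feeds/posts/default"),
])
```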


Any pointers to an old, ancient game?

Way back in 1986 I started studying undergraduate computer science at the University of Queensland. One of our first year programming assignments was to use the fancy, new Macintosh computers to add some code to a game.  I’m looking for pointers to the name of the game and any online resources about it. A working version on some contemporary platform would be great.

Any help more than welcome.

The game

The game was played on a grid, typically 4 by 4, that looked something like this.

Grid 001

The idea was that random mirrors were hidden throughout the grid. The aim of the game was to figure out what type of mirrors were located where within the grid. To do this you had a flashlight that you could shine through one of the holes on the outside. The light would exit the grid at another location, depending on the location and type of mirrors it encountered. A bit like this
grid 002

There were three types of mirrors: two diagonal mirrors (/ and \) and an X mirror. The diagonal mirrors would change the direction of the light depending on how the light struck the mirror. The X mirror would direct the light back the way it came.

The following image shows one potential layout of mirrors to explain how the light behaved in the above image.

Grid 003

The light travels straight ahead until it hits the first diagonal mirror. This mirror causes the light to change direction, heading directly up, where it immediately hits another diagonal mirror which sends the light travelling right again until it exits the grid.
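For what it’s worth, the light-tracing behaviour described above can be sketched in modern code. A hypothetical Python sketch — the grid representation and names are my own, not the original assignment’s:

```python
# Sketch of tracing a light beam through a grid of mirrors.
# Cells hold '/', '\', 'X', or '.' (empty). The beam starts at an
# edge cell with a direction and travels until it leaves the grid.

DEFLECT = {
    "/":  {"R": "U", "L": "D", "U": "R", "D": "L"},
    "\\": {"R": "D", "L": "U", "U": "L", "D": "R"},
}
STEP = {"R": (0, 1), "L": (0, -1), "U": (-1, 0), "D": (1, 0)}
REVERSE = {"R": "L", "L": "R", "U": "D", "D": "U"}

def trace(grid, row, col, direction):
    """Follow the beam from (row, col) heading `direction`;
    return the (row, col) just outside the grid where it exits."""
    rows, cols = len(grid), len(grid[0])
    while 0 <= row < rows and 0 <= col < cols:
        cell = grid[row][col]
        if cell == "X":                    # X mirror: straight back
            direction = REVERSE[direction]
        elif cell in DEFLECT:              # diagonal mirror
            direction = DEFLECT[cell][direction]
        dr, dc = STEP[direction]
        row, col = row + dr, col + dc
    return row, col

# an invented layout with two stacked '/' mirrors: a beam entering
# the middle row heading right bounces up, then right, and exits
grid = [[".", "/", "."],
        [".", "/", "."],
        [".", ".", "."]]
exit_pos = trace(grid, 1, 0, "R")
```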

Early thoughts on S1, 2016 offering of EDC3100

First semester for 2016 is just about over. Time to reflect back on what’s happened with EDC3100, ICT and Pedagogy for this semester.

Overall, I feel the course is in a better place than it was last year. But there remains some significant room for improvement.

It will be interesting to see what the students think. It will be a couple of months until I see their feedback.

Changes made this semester

A range of different changes were made this semester.

New module 1 and assignment 1

Historically, EDC3100 starts with a bang and a lot of work – too much work – for students. The old assignment 1 required students to expend a fair bit of time getting to know a new technology. The return on that investment wasn’t as much as it might have been, hence students disliked it. This semester Assignment 1 and the supporting Module 1 were completely re-designed with the intent to reduce student workload and focus on a few particular outcomes.

In short, that appears to have worked okay. Workload was reduced. The content and activities for Module 1 could use some enhancement to make their purpose clearer and more engaging. The weekly release of Module 1 wasn’t great.

Assignment 1 as an Excel spreadsheet

Assignment 1 was designed with students using an Excel spreadsheet for at least three reasons:

  1. Provide students with more experience using an application type they appear not to have used a great deal.
  2. Ensure that the insights generated by both students and markers could be analysed via computer programs.
  3. Reduce the workload for markers by using Excel.

The spreadsheet worked reasonably well. The checklist within the spreadsheet requires some refinement. As does some aspects of the rubric. The duplication of a Word-based coversheet needs to be removed.

Analysing the submitted spreadsheets via software has commenced, but hasn’t been completed.  In part this is due to Moodle not providing a good way of extracting all marked files. This has been worked around – thanks to the good folk at LSS – but time is an issue. More work needs to be done on the analysis and sharing of insights gained from it.

The return of Toowoomba lectures

In 2015 there were no Toowoomba lectures and as a result the SET results for 2015 suffered. Students missed the lectures. In 2016 they were back and were streamed live using Zoom. The lectures and way stations were also more effectively integrated into the Study Desk structure.

The lectures were okay and there were folk attending via Zoom. Overall, however, the lectures need work. The exact relationship between the lectures and the learning paths needs to be thought about. Should they duplicate the paths? Complement them?

Recordings of the lectures (hosted on Vimeo) show a greater amount of usage than I expected to see.
Video stats end May 2016

Refinements to later modules

A range of minor to more major refinements were made, especially for Module 3. These are generally an improvement over what went before.

The not so good

The not so good experiences this semester include:

  • The weekly release of Module 1.
  • My absence in Week 4 due to a conference.
  • The difficulty in finding material within the learning paths.
    The availability of the Moodle book search block from next semester will help massively with this problem.
  • Assignment 2 has the students thinking more about unit design than ICT and pedagogy.
  • A couple of major “marking outages” leading to late return of marked assignments.
  • The quality and quantity of feedback provided to students.
    This is a two-edged sword. Feedback on assignments was variable. Feedback via some of the study desk activities and discussion forums was quite good.
  • The on-going confusion amongst at least quite a few students around the learning journal and the marking of it.
  • Marking of assignments still requires too much work from the markers.

To do

All of this (including the following) will need to be revisited once the student feedback has been given, released to me, and considered.

Beyond some of what is mentioned above, and based on what I currently know, the following need to be done:

  1. Use analytics to explore how students are engaging with the learning paths, including the ability to produce an ePub version and/or print them.
  2. Analyse the Assignment 1 data and identify what new activities/resources this might be useful for.
  3. Better integrate the recorded lectures and other components with the learning paths.
  4. Think about a re-design of Module 2 and Assignment 2.