Assembling the heterogeneous elements for (digital) learning


Exploring Moodle Book usage – Part 7 – When are they used?

The last post in this series looked briefly at the contents of Moodle Book resources. This post is going to look at when the book resources are used, including:

  • What time of day are the books used?
  • When in the semester are they used?

Toward the end I spend a bit of time exploring the usage of the Book resources in the course I teach.

What time of day are they used?

This is a fairly simple, perhaps useless, exploration of when during the day the books are used. It’s more out of general interest, and it lays the groundwork for the code for the next question.

Given the huge disparity in the number of views versus prints versus updates, there will be separate graphs for each, meaning 3 graphs per year. For my own interest and for the sake of comparison, I’ve included a fourth graph which shows the same analysis for the big 2015 offering of the course I teach. This is the course that perhaps makes the largest use of the Book, and also the offering in which I did lots of updates.

The graphs below show the number of events that occurred in each hour of the day: midnight to 1am, 1am to 2am, and so on. Click on the graphs to see expanded versions.
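
The binning itself is simple. The following minimal sketch (not the actual analysis code) shows how the hourly counts might be generated; the table and column names assume the Moodle 2.7+ standard logstore, and the 2012/2013 events would actually come from the legacy mdl_log table.

  #!/usr/bin/perl
  # Minimal sketch: bin Moodle Book view events into hour-of-day counts.
  # Table/column names assume the standard logstore (Moodle 2.7+).
  use strict;
  use warnings;
  use DBI;

  my $dbh = DBI->connect( 'dbi:mysql:database=moodle', 'user', 'password',
      { RaiseError => 1 } );

  my $sth = $dbh->prepare(
      q{SELECT timecreated FROM mdl_logstore_standard_log
        WHERE component = 'mod_book' AND action = 'viewed'} );
  $sth->execute;

  my %per_hour;
  while ( my ($time) = $sth->fetchrow_array ) {
      my $hour = ( localtime $time )[2];    # hour of the day, 0..23
      $per_hour{$hour}++;
  }

  printf "%02d:00 %6d\n", $_, $per_hour{$_} // 0 for 0 .. 23;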

There is no graph for prints per hour for 2012 as there were none in the database. This appears likely to be a bug that needs to be addressed.

Overall findings from time of day

Growth – The maximum number of events has grown each year (as expected given earlier indications of growth).

  • max views per hour: 2012 just less than 35K to 2015 over 150K
  • max prints per hour: 2013 just over 400 to 2015 over 1500
  • max updates per hour: 2012 just over 500 to 2015 over 6000.

Similarity – The overall shapes of the graphs stay the same, suggesting a consistent pattern of interaction.

This is especially the case for the viewing events: starting with a low number from midnight to 1am, there is an ongoing drop in events until 5am, after which activity grows until the maximum per hour is reached between 11am and midday. Then there is a general drop away until 7pm to 8pm, when it grows again before dropping away after 9pm.

Views per hour each year

2012
2012 views per hour

2013
2013 views per hour

2014
2014 views per hour

2015

2015 views per hour

EDC3100 2015 S1

EDC3100 2015 1 views per hour

Prints per hour each year

2012

2012 prints per hour

2013

2013 prints per hour

2014

2014 prints per hour

2015

2015 prints per hour

EDC3100 2015 S1

EDC3100 2015 1 prints per hour

Updates per hour each year

2012

2012 updates per hour

2013

2013 updates per hour

2014

2014 updates per hour

2015

2015 updates per hour

EDC3100 2015 S1

EDC3100 2015 1 updates per hour

Calendar Heatmaps

A calendar heatmap is a fairly common method of representing “how much of something” is happening each day of the year. The following aims to generate calendar heatmaps using the same data shown in the above graphs. The plan is to use the method/code outlined on this page.

It requires the generation of a two-column CSV file: the first column is the date in YYYYMMDD format and the second column the “how much of something” for that day. See the example data on the blog post. It looks like it might be smart enough to figure out the dates involved. Let’s see.
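
As a rough sketch (assuming the daily counts have already been pulled out of the logs into a hash keyed by the epoch time of each day’s midnight), generating that CSV might look like:

  # Sketch: write the two-column CSV (YYYYMMDD, count) the heatmap
  # code expects. %per_day is assumed to already hold the daily counts.
  use strict;
  use warnings;
  use POSIX qw(strftime);

  my %per_day;    # epoch of each day's midnight => number of events

  open my $csv, '>', 'heatmap.csv' or die "heatmap.csv: $!";
  foreach my $day ( sort { $a <=> $b } keys %per_day ) {
      printf {$csv} "%s,%d\n",
          strftime( '%Y%m%d', localtime $day ), $per_day{$day};
  }
  close $csv;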

It is, but doing all of the years together doesn’t work all that well given the significant increase in the number of courses using the Book as time progresses and the requirement for the heatmap to use the same scale for all years. As a result the 2012 usage doesn’t show up all that well. Hence each of the years was mapped on a separate heatmap.

The following calendar heatmaps show how often the Book resources were viewed on each day. The events counted are only those for Book resources from courses offered in the given year. In 2012, 2013 and 2014 this means that there is a smattering of views of the books early in the following year (semester 3 stretches from Nov to Feb). There is no similar usage for the 2015 books because the data does not include any 2016 events.

The darker the colour the greater the use. In the 2012 image below you should be able to see a tool tip showing a value of 81 (out of 100) that is quite dark, but not the darkest.

2012

The 2012 map seems to establish the pattern.  Heavy use at the start of semester with a gradual reduction through semester. A few upticks during semester and toward the end of semester.

I no longer have easy access to specific dates for 2012 and 2013. The 2014 heatmap has some specific dates which should broadly apply to these earlier years.
2012 Book usage

2013

2013 Book usage - calendar heatmap

2014

The institution maintains a web page that shows the important dates for 2014, which include:

  • March 3 – Semester 1 starts.
    Course websites open 2 weeks before this date – 17th Feb
  • June 16 – Semester 1 exams start.
  • July 21 – Semester 2 starts
    Course websites open 2 weeks prior – 7th July.
  • November 3 – Semester 2 exams start.
  • November 17 – Semester 3 starts.

2014 Book usage - calendar heatmap

2015

The semester 1 2015 offering of my course had the following due dates for its 3 assignments:

  1. 30th March – which appears to coincide with a heavy usage day.
  2. 4th May – also a slightly heavy usage day, but not as heavy.
  3. 15th June – two somewhat heavy usage days before and on this date.

This raises the question of what the heatmap for that course might look like – see below.

2015 Book usage - calendar heatmap

EDC3100 – S1, 2015

Focusing just on my course, the increase in usage just before the assignment due dates is more obvious. One of the reasons for this is that all the assessment information for the course is included in a Moodle Book resource.
EDC3100 S1 2015 book usage - calendar heatmap
Other time periods relevant to this course are:

  • April 6 to 17 – the two-week mid-semester break, which corresponds to two of the lightest periods of usage of the book resources.
  • May 18 to June 5 – a three-week period when most of the students are on Professional Experience within schools, which also corresponds to a light period of usage.

The two heaviest days of usage are the 9th and 10th of March. The start of Week 2 of semester. It’s a time when the pressure is on to get a blog created and registered and start completing learning paths.

After the peak of the first three weeks, usage of the Book resources drops to around 50% of that peak per day.

Questions to arise from this

  • Does the learning journal assessment item for EDC3100 change when students interact with the course site?
  • Is the pattern of usage (down to 50% a day) indicative of students turning off, or becoming more familiar with the approach?
  • Does the high level of usage indicate

It also raises the question of whether particular offerings of the course show any differences.

2012 – S2

The 2012 S2 pattern is quite a bit different. It is a bit more uneven and appears to continue well after the semester is finished. This is because this was the first semester the course used the Book module, and also because there was a semester 3 offering of the course for a few students that used the same resources.
EDC3100 2012 2 - Book usage

The 2012 heatmap also shows a trend that continues: usage of the Book resources continues well past the end of semester. It’s not heavy usage, but it is still there.

Question: is that just me, or does it include students?

2013 – S1

2013 S1 is a bit different as well. Lighter use at the start of semester. A bit heavier usage around assignment due dates. My guess is that this was still early in the evolution of how the Book was being used.

EDC3100 2013 S1 - Book usage

2013 – S2

This map seems to be evolving toward the heavy use at the start of semester.
EDC3100 2013 S2 - Book usage

2014 – S1

And now the pattern is established. Heavy use at the start of semester and in the lead up to Assignment 1. A slight uptick then for Assignments 2 and 3. With the light usage around Professional Experience evident.

EDC3100 2014 S1 - Book usage

2014 – S2

EDC3100 2014 S2 - Book usage

2015 – S2

EDC3100 2015 S2 - Book usage

What about just the students?

The following shows just the student usage for the 2013 S1 offering. There is not a huge difference from the “all roles” version above, suggesting that it is students who are doing most of the viewing. But it does confirm that the on-going usage of the Book resources past the end of the semester comes from students who appear to have found some value in the information after the course.

EDC3100 2013 1 - Just students

Which comes first? Pedagogy or technology?

Miranda picks up on a common point around the combination of technology and pedagogy with this post titled Pedagogy First then Technology. I disagree. If you have to think in simple sequential terms, then I think pedagogy should be the last consideration, not the first. The broader problem, though, is our tendency to want to limit ourselves to the sequential.

Here’s why.

The world and how we think isn’t sequential

The learning and teaching literature is replete with sequential processes such as ADDIE, Backwards Design, Constructive Alignment etc. It’s replete with such models because that’s what academics and experts tend to do. Develop models. The problem is that all models are wrong, but some of them are useful in certain situations for certain purposes.

Such models attempt to distill what is important from a situation to allow us to focus on that and achieve something useful. The only trouble is that the act of distillation throws something away. It’s an approach that suffers from a problem identified by Sir Samuel Vimes in Feet of Clay by the late Terry Pratchett

What arrogance! What an insult to the rich and chaotic variety of the human experience.

Very few, if any, human beings engage in anything complex or creative (such as designing learning) by following a sequential process.  We are not machines. In a complex task within a complex environment you learn as much, if not more, by engaging in the process as you do planning what you will do beforehand.

Sure, if the task you are thinking about is quite simple, or if it is quite complicated and you have a lot of experience and expertise around that task, then you can perhaps follow a sequential process. However, if you are a teacher pondering how to transform learning through the use of digital technology (or using something else), then your task is neither simple, nor is it complicated, nor is it something you likely have experience or expertise with.

A sequential process to explain why technology first

Technologies for Children is the title of a book that is designed to help teachers develop the ability to help learners engage with the Australian Curriculum – Technologies learning area. A curriculum that defines two subjects: Design and Technologies, and Digital Technologies. In the second chapter (Fleer, 2016) the author shares details of how one year 4/5 teacher integrates this learning area into her class. It includes examples of “a number of key statements that reflected the technological processes and production skills” (Fleer, 2016, p. 37) that are then turned into learner produced wall charts. The following example wall chart is included in Fleer (2016, p. 37). Take note of the first step.

When we evaluate, investigate, generate designs, generate project plans, and make/produce we:

  1. Collaboratively play (investigate) with the materials.
  2. Evaluate the materials and think about how they could be used.
  3. Generate designs and create a project plan for making the item.
  4. Produce or make the item.
  5. Evaluate the item.
  6. Write about the item and talk with others.
  7. Display the item.

Before you can figure out what you are going to do with a digital technology, you need to be fully aware of how the technology works, what it can do, what the costs of doing that are, what it can’t do…etc. Once you’ve got a good handle on what the digital technology can do, then you can figure out interesting and effective ways to transform learning using the technology. i.e. pedagogy is the last consideration.

This is not to suggest that pedagogy is less important because it comes last. Pedagogy is the ultimate goal.

But all models are wrong

But of course all models are wrong. This model is (arguably) only appropriate if you are not familiar with digital technology. If you know all about digital technology, or the specific digital technology you are considering, then your need to play with the digital technology first is lessened. Maybe you can leap straight to pedagogy.

The trouble is that most teachers I know have fairly limited knowledge of digital technologies. In fact, I think many of the supposed IT experts within our institutions and beyond have somewhat limited understandings of the true nature of digital technologies. I’ve argued that this limited understanding is directly impacting the quality of the use of digital technology for learning and teaching.

The broader problem with this “technology first” model – as with the “pedagogy first” model – is the assumption that we engage in any complex task using a simple, sequential process. Even the 7 step sequential process above is unlikely to capture “the rich and chaotic variety” of how we evaluate, investigate and generate designs for using digital technology for learning and teaching. A teacher is just as likely to “play (investigate)” with a new digital technology by trying it out in a small, safe-to-fail experiment to see how it plays out. Perhaps this is repeated over a few cycles until the teacher is more comfortable with how the digital technology works in the specific context, with the specific learners.

References

Fleer, M. (2016). Key ideas in the technologies curriculum. In Technologies for Children (pp. 35–70). Cambridge University Press.

Exploring Moodle Book Module usage – part 1 – background and planning

I’m due to have the slides ready for a Moodlemoot Australia presentation in a few weeks. Time to get organised. The following is (perhaps) the first of a sequence of posts reporting on progress toward that presentation and the related research.

Background

My interest in research is primarily driven by the observation that most educational usage of digital technology to enhance learning and teaching is fairly bad. Typically the blame for this gets laid at the feet of the teaching staff, who are digitally illiterate, not qualified to teach, or are laggards. My belief/argument is that the problem really arises because the environment within formal education institutions just doesn’t understand what is required to make a difference. Much of what institutions do (e.g. institutional standards for course sites, checklists, training, support documentation, design and support of technologies…) does little to help and tends to make the problem worse.

You want digitally fluent faculty?

A contributing factor is that institutional attempts to improve digital learning fail to be based on any insights into how people (in this case teaching staff and all those involved with digital learning) learn. How institutions implement digital learning actually gets in the way of people learning how to do it better.

Schema and the grammar of school

The ideas of schema and the grammar of school offer one example of this failure. This earlier post includes the following quote from Cavallo (2004), which establishes the link

David Tyack and Larry Cuban postulated that there exists a grammar of school, which makes deviation from our embedded popular conception of school feel as nonsensical as an ungrammatical utterance [1]. They describe how reform efforts, whether good or bad, progressive or conservative, eventually are rejected or denatured and assimilated. Reform efforts are not attempted in the abstract, they are situated in a variety of social, cultural and historical contexts. They do not succeed or fail solely on the basis of the merit of the ideas about learning, but rather, they are viewed as successful based upon their effect on the system and culture as a whole. Thus, they also have sociological and institutional components — failure to attend to matters of systemic learning will facilitate the failure of the adoption of the reforms. (p. 96)

The grammar of school problem is linked to the idea of schema which links to the following quote that I first saw in Arthur (2009) and which is taken from Vaughan (1986, p. 71)

[In the situations we deal with as humans, we use] a frame of reference constructed from integrated sets of assumptions, expectations and experiences. Everything is perceived on the basis of this framework. The framework becomes self-confirming because, whenever we can, we tend to impose it on experiences and events, creating incidents and relationships that conform to it. And we tend to ignore, misperceive, or deny events that do not fit it. As a consequence, it generally leads us to what we are looking for. This frame of reference is not easily altered or dismantled, because the way we tend to see the world is intimately linked to how we see and define ourselves in relation to the world. Thus, we have a vested interest in maintaining consistency because our own identity is at risk.

Evidence of schema in how digital technologies are used

Horsey, Horseless Carriage

The schema idea means that people will perceive and thus use digital technologies in ways that fit with their “integrated sets of assumptions, expectations and experiences”. This is an explanation for the horsey, horseless carriage way people respond to digital technologies. It’s why courses where the majority of students are online students and will never come onto a campus are still designed around the idea of face-to-face lectures and tutorials.

It also explains why when I finally returned to teaching a course I adopted the idea of a ramble for the structure of the course.  It explains why the implementation of the ramble evolved into using the Moodle Book module the way it does today. The images below (click on them to see larger versions) illustrate the connection between my practice 20 years apart, more detail follows.

The 85321 "online" book - 1996
Online book - 2016

The 1996 image is a page from  the study guide (wonder how many people can play the au file containing the Wayne’s World II quote) for the Systems Administration course I taught in 1996. The 2016 image is a page from the “study guide” I developed for an Arts & Technologies C&P course.

I believe/suggest that the influence of schema is also a significant contributor to the practice of other teaching staff as they transition into digital learning. It’s a factor in why most course sites remain dumping grounds for lecture slides, and in the subsequent widespread growth in the use of lecture capture systems.

And it’s not just the teaching staff. Students have developed schema about what it means to be taught, and what it means to be taught at university. A schema developed either through direct experience, or via the experience of others and various media. The typical schema for university education involves large lecture halls and tutorials.

So what?

The above suggests that whenever students and teachers engage with a digital technology (or any change) and its use for learning and teaching, there are three main possibilities:

  1. It is seen as nonsensical and rejected.
    e.g. whatever was said doesn’t make sense given existing grammar rules and is seen as just being wrong.
  2. It sounds like something familiar and is modified to fit within the confines of that familiar practice.
    e.g. whatever was said sounds an awful lot like an existing use of grammar (even though it is different), and thus is interpreted as matching that existing use.
  3. The significant difference is seen as valuable and existing practice is modified to make use of that difference.
    e.g. the different use of grammar is both understood as different and the difference is valued, and existing practice is subsequently modified to incorporate the new grammar.

If this is the case, then examining the use (or not) of a digital technology in learning and teaching should reveal evidence of these possibilities. This seems very likely, given widespread common complaints about the use of digital technology to enhance learning and teaching. Complaints that see most practice stuck at possibility #2 (at best).

If this is the case, then perhaps this way of thinking might also identify how to address this.

But first, I’m interested in seeing if use of a particular digital technology matches this prediction.

Use of the Moodle Book module

Due to a 2015 grant from the USQ OpenTextbook Initiative I’m going to explore the use of the Moodle Book module. The plan is to analyse the use of the Moodle Book module (the Book) at USQ to see how both learners and teachers are engaging with it, see if the above expectations are met, and figure out what might be done in terms of the support and development of the Moodle Book module to help improve this.

What follows is an initial map of what I’ll be exploring.

A major aim here is to explore whether a student or teacher using the Book has made the transition from possibility #2 (treating the Book as a print-based book) to possibility #3 (recognising that this is an online book, and using that difference). I’ve highlighted some of the following questions/analysis, which I think will be useful indicators of this transition. The darker the yellow highlight, the more strongly I think it might indicate someone making the leap to an online book.

Question for you: What other practices might indicate use that has moved from #2 to #3?

Which courses use the Book

First step is to explore whether the Book is being used. How many courses are using it? How many books are being produced with the module?

As the abstract for the talk suggests, early analysis revealed a growth in use, but I’m wondering how sound that analysis was. Hence there is a need to

  1. Correctly identify the number of course offerings using the Book each year (see the sketch after this list).
  2. Identify the number of different teaching staff who are responsible for those courses.
    Longer term, it would be useful to ask these staff about their background and reasons for using the Book.
  3. Identify the type of courses using the Book.
  4. How many books are being produced by each course?
  5. How do the books fit into the structure of the course?
    1. Is the structure the same from offering to offering?
    2. How much does the number and content of the books change from offering to offering?
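
As a first pass, something like the following sketch might do the counting. It assumes the standard Moodle tables (mdl_book has a course column, mdl_course a startdate) and that a course’s start year is a reasonable proxy for the offering year.

  # Sketch: count course offerings with at least one Book, per year.
  use strict;
  use warnings;
  use DBI;

  my $dbh = DBI->connect( 'dbi:mysql:database=moodle', 'user', 'password',
      { RaiseError => 1 } );

  my $counts = $dbh->selectall_arrayref(
      q{SELECT FROM_UNIXTIME(c.startdate, '%Y') AS year,
               COUNT(DISTINCT b.course) AS offerings
        FROM mdl_book b
        JOIN mdl_course c ON b.course = c.id
        GROUP BY year ORDER BY year} );

  printf "%s: %d course offerings\n", @$_ foreach @$counts;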

Characteristics of the book content

  1. Statistics around the level of readability of the text (e.g. Flesch-Kincaid); a sketch of the calculation follows this list.
  2. The structure of the book – are sub-chapters used?
  3. Are images, video, Moodle activities included?
  4. What about links?
    • Are there any links at all?
    • What is linked to?
    • Are links purely to external resources? 
    • How many links connect back to other parts of the course’s Books?
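
For reference, the Flesch-Kincaid grade level is a simple formula over word, sentence and syllable counts. A minimal sketch (the syllable counting is the hard part; something like the CPAN Lingua::EN::Syllable module could supply it):

  # Flesch-Kincaid grade level, given word, sentence and syllable counts.
  sub fk_grade {
      my ( $words, $sentences, $syllables ) = @_;
      return 0.39 * ( $words / $sentences )
           + 11.8 * ( $syllables / $words )
           - 15.59;
  }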

Patterns in how the books are authored

  1. How are the books authored?
    • From scratch?
      1. Using the web interface?
      2. Via an import process?
    • Copied from previous offerings?
    • ?? other??
  2. How are they edited? 
    My expectation is that a teacher who sees the Book as a replacement for a print book will not be editing the books during semester.

Patterns in how the books are read/used

  1. Are students reading the books online or printing them out?
  2. Does printing always happen at the start of semester? Does it continue through semester? Does it drop off?
  3. When are students reading the books?
  4. What is the nature of the paths they take through the books?
    1. Do they read the books and the chapters in order?
    2. How long do they spend on each chapter?
    3. Do they revisit particular books?
  5. How many times do discussion forum posts in a course include links to chapters/sub-chapters within the books?
    • Posts written by teaching staff
    • Posts written by students

References

Arthur, W. B. (2009). The Nature of Technology: what it is and how it evolves. New York, USA: Free Press.

Cavallo, D. (2004). Models of growth – Towards fundamental change in learning environments. BT Technology Journal, 22(4), 96–112.

How many digital devices do you have?

In a couple of the courses I teach I ask students (for slightly different purposes) the question from the title of this post, “How many digital devices do you have?”.  In one of the courses that question takes the form of a quiz and looks something like the following.

Question text

How many different digital technologies do you own?
Select one:
a. 0
b. 1 to 5
c. 6 to 10
d. 11 to 20
e. More than 20

What answer would you give?

Count them up, folks. What answer would you give? I’ll give you some space to think about that before talking about what some other folk have said.

 

 

What others have said

Some of the students in another course (where the question is framed somewhat differently) have offered the type of answers I expected, based on the framing of the question.

Jay identifies 3 devices. Neema lists 2.

Thinking a bit further afield, I can probably count quite a few more in my house. I’ll ignore devices personal to other members of my family. This gets me the following list: laptop, 2 smart phones, digital camera, printer, various external drives, Apple TV device, T-Box, X-Box One. That’s 9.

But that doesn’t really start to count them

Fleming (2011) writes that it is

estimated that today’s well-equipped automobile uses more than 50 microcontroller units (p. 4)

Wikipedia defines a microcontroller as “a small computer on a single integrated circuit containing a processor core, memory, and programmable input/output peripherals”.

So your car alone potentially has you well into double figures. Remember that Fleming was writing in 2011. If you have recently purchased the latest Mercedes E-Class, chances are the number of microcontroller units in your car goes well beyond 50.

And of course, with your thinking re-calibrated by this example, you can probably quite easily identify additional devices in your house that are likely to contain microcontrollers.

Implications

Digital devices are increasingly ubiquitous. Digital isn’t limited to a separate device like a computer, tablet, or smart phone. It’s wearable and in everything.

I expect most people not to be aware of just how reliant they are on digital technologies in everything they do. Hence it’s uncertain that they understand or are prepared for what this might mean for what they do. For example, I don’t think many people in higher education or education more broadly quite understand the implications this has for how those organisations operate, perform, or exist. I’m not convinced that the patterns they use to make sense of the world are ready yet to deal with these changes effectively.

But then I’m not convinced the technologists are either.

Interesting times ahead.

References

Fleming, B. (2011). Microcontroller units in automobiles. IEEE Vehicular Technology Magazine, 6(3), 4–8. doi:10.1109/MVT.2011.941888

Teacher presence in network learning

A new semester and the Networked and Global Learning course is running again. Apologies to those in the other courses I teach, but this course is consistently the most engaging and interesting. It’s a course in which I typically learn as much as the other participants. However, due to the reasons/excuses outlined in the last post, I haven’t been able to engage as much as I would have liked with the course.

This has me thinking about something Adam wrote, in particular the argument/observation from Rovai (2002) which Adam describes as

This is bringing to light the sense of disconnection students are often experiencing due to physical and psychological separation from teachers, peers and institutions

What follows are some random reactions to this particular quote and an attempt to connect it with my teaching.

Badly designed learning generates bad outcomes

As someone who has been working, learning and teaching online for a long time, I am biased, and this idea troubles me. In fact, it puts me in mind of the following point made in this recent post around the question of banning laptops in the classroom because handwriting is better for learning

Those studies about the wonders of handwriting all suffer from the same set of flaws, namely, a) that they don’t actually work with students who have been taught to use their laptops or devices for taking notes. That is, they all hand students devices and tell them to take notes in the same way they would in written form. In some cases those devices don’t have keyboards; in some cases they don’t provide software tools to use (there are some great ones, but doing it in say, Word, isn’t going to maximize the options digital spaces allow), in some cases the devices are not ones the students use themselves and with which they are comfortable. And b) the studies are almost always focused on learning in large lecture classes or classes in which the assessment of success is performance on a standardized (typically multiple-choice) test, not in the ways that many, many classes operate, and not a measure that many of us use in our own classes. And c) they don’t actually attempt to integrate the devices into the classes in question,

In terms of student disconnection, is it arising from there truly being something essential that a physical face-to-face learning experience provides that can’t be provided in an online space?

Or, is it because the types of online learning experiences being examined by Rovai have not been designed appropriately to draw on the affordances offered by an online learning environment? Do the online learning experiences examined by Rovai suffer the same problem that most attempts to engage in open education illustrate? i.e. an inability to break out of the “persistent patterns of relations” (Bigum, 2012) that are formed by someone brought up teaching face-to-face?

Given that the abstract for Rovai (2002) includes

Data were collected from 375 students enrolled in 28 different courses, offered for graduate credit via the Blackboard e-learning system by a private university

This indicates that the “persistent patterns of relations” under examination in this paper are from a North American university in 2000/2001, where online learning was limited to the Blackboard LMS. A time and system unlikely to be described by anyone as offering the pinnacle of an online learning experience.

Might the sense of disconnection arise from the poor quality of the learning experience (online or otherwise) rather than the lack of physical presence?

Or is it simply that both teachers and learners have yet to figure out how to leverage the affordances of online learning?

What type of presence should a teacher have?

The following two images represent connections formed between participants in two discussion forums in a large course I teach (these are from first semester 2015). Each dot represents a participant: a red dot is a teacher, a blue dot a student. A connection between two people is formed when one of them replies to a post from the other.
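
A minimal sketch of how those connections might be extracted (the hash layout is my own; in Moodle each row of the mdl_forum_posts table records a post’s userid and its parent post):

  # Sketch: build an undirected reply network. An edge joins two
  # participants whenever one replies to a post written by the other.
  use strict;
  use warnings;

  my %posts;    # post id => { userid => ..., parent => parent post id }
                # ...assumed populated from mdl_forum_posts
  my %edges;    # "smallerUserId-largerUserId" => number of replies

  foreach my $id ( keys %posts ) {
      my $parent = $posts{$id}{parent} or next;    # skip thread starters
      my ( $u, $v ) = ( $posts{$id}{userid}, $posts{$parent}{userid} );
      next if $u == $v;                            # ignore self-replies
      my $key = join '-', sort { $a <=> $b } ( $u, $v );
      $edges{$key}++;
  }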

This first image is from the general Question and Answers forum on the course site.

Forum network map

The second image is from the Introduction and welcome forum, where students introduce themselves and say hi to someone the same and someone different.

Introduction and welcome forum network map

In the first image, there is one red dot (me) that is very much the centre of all that’s going on. I’m answering questions. In the second image, the red dot that is me is only lightly connected.

Which is better? Of course it depends. Which is scalable in an effective way?

The Equivalency Theorem suggests that as long as one of student-teacher, student-student, or student-content interaction is high, deep formal learning can occur. With high levels of more than one, the educational experience will be more satisfying.

So far the NGL course has been suffering from low student-teacher interaction.  I wonder about the other two? Time will tell.

Teacher as meddler in the middle

A couple of years ago I wrote this post as an example of an “as teacher” post – a requirement for the NGL course. Not a lot has changed, and all this talk of interaction and connection has me thinking again of the first question I was interested in two years ago

How I can be a more effective “meddler in the middle”?

In particular, how can I be more aware of the types of interactions students are having in the courses I teach, and subsequently what actions can I take to strengthen them as necessary? If I do this, what impact will it have on student learning and their experience?

I wonder if it is the paucity of methods for understanding exactly how and what interactions are occurring that has me refining teaching materials. Materials that students may not be engaging with. I’m hoping that this project will help reveal how and if students are engaging with the content in at least one course. Anecdotally, it appears that for many, interaction with the content is little more than a box to tick. If borne out, this raises the question of how to get students to interact/engage effectively with the content.

There are similar questions to be explored around use of blogs and the connections between students….

References

Bigum, C. (2012). Edges , Exponentials and Education : Disenthralling the Digital. In L. Rowan & C. Bigum (Eds.), Transformative Approaches to New Technologies and student diversity in futures oriented classrooms: Future Proofing Education (pp. 29–43). Springer. doi:10.1007/978-94-007-2642-0

Rovai, A. (2002). Development of an instrument to measure classroom community. The Internet and Higher Education, 5(3), 197–211. doi:10.1016/s1096-7516(02)00102-1

Valuing the "residue of experience" a bit more

For a while now I have been drawing on the following quote from Riel and Polin (2004)

Over time, the residue of these experiences remains available to newcomers in the tools, tales, talk, and traditions of the group. In this way, the newcomers find a rich environment for learning. (p. 18)

to explain why I encourage/require the use of various types of social media (blogs, social bookmarking, feed readers) in my courses. This 2014 post identifies the problem (what happens in a course site, stays and dies in a course site) and how the social media used in these courses help address that problem. If you do a Google search for edc3100 blog, you will get another illustration of how at least some of the residue of experience remains available to newcomers in at least one of the courses.

The problem is that this year has revealed that the design of the course doesn’t yet value that residue of experience, at least not in terms of the main value measure for many students – assessment. Students gain marks for writing blog posts that link to posts from other students, but the code that does this marking only recognises currently enrolled students. Linking to the broader residue of experience doesn’t count.

Interestingly, this has only become an issue this year. Only this year have students been asking why they missed out on marks for links to other (“old”) student posts. Leaving aside why it’s only started this year, this post documents the move to valuing the residue of experience.

After implementing the code below, it appears that at least 28 students (about 25%) this semester have linked to blog posts from students in previous offerings of the course. It would be interesting to explore this further: see how prevalent the practice has been in previous courses, and update these visualisations to show the connections between offerings.

What I need to do

The process will be

  • Refamiliarising myself with how the “valuing” is currently done.
  • Planning and implementing how to value the residue of experience.
  • Figuring out if/how to check how often the residue of experience has been used.

How it is currently valued

Some Perl code does the work. Details follow.

The BlogStatistics class gathers all the information about the blogs for students in the current course offering. A method generateAllStatistics does some of the grunt work.

But this class also creates a data member MARKING for each student, based on the Marking class and its GenerateStats method. This class gets the content from the bim_marking table (i.e. all the posts by the student).

GenerateStats accepts a reference to a hash that contains links to all the other blogs in the course (for the specific offering). It calls DoTheLinks (gotta love the naming) and passes it the hash ref to do the count.

One question is how much old data do I currently have?  Seems like there’s only the 2015 and 2016 data easily accessible.

Planning and implementation

One approach would be

  • BlogStatistics generates a list of old student blog URLs
    • add BlogStatistics::getOldStudentBlogs that creates %BlogStatistics::OLD_BLOGS DONE
  • BlogStatistics passes this into each call to Marking::GenerateStats  DONE
  • Marking::GenerateStats would pass this onto Marking::DoTheLinks DONE
    • also increment POSTS_WITH_STUDENT_LINKS if a link is to an old student blog DONE
    • increment POSTS_WITH_OLD_STUDNET_LINKS if a link is to an old student blog DONE
  • Modify the report generator to show OLD links DONE
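
In rough terms, the modified DoTheLinks ends up looking something like the following sketch (this is the idea, not the actual BIM code; real URL matching would need some normalisation):

  # Sketch: count posts containing links to current and old student blogs.
  # $current and $old are hash refs keyed by blog URL.
  sub DoTheLinks {
      my ( $self, $current, $old ) = @_;

      foreach my $post ( @{ $self->{POSTS} } ) {
          # crude href extraction; the real thing might use HTML::Parser
          my @links = $post->{content} =~ m{href="([^"]+)"}g;
          my ( $has_student, $has_old ) = ( 0, 0 );
          foreach my $url (@links) {
              $has_student = 1 if exists $current->{$url};
              $has_old     = 1 if exists $old->{$url};
          }
          # links to old blogs now count as student links as well
          $self->{POSTS_WITH_STUDENT_LINKS}++     if $has_student || $has_old;
          $self->{POSTS_WITH_OLD_STUDNET_LINKS}++ if $has_old;
      }
  }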

Planning changes to EDC3100 assignment 1

In the first half of the year there was a new assignment in EDC3100 designed both to enhance student learning and to experiment with making the data produced by students and markers as part of the assessment process more accessible for manipulation by software. i.e. the students and markers entered data into a spreadsheet.

It’s a new semester, time to reflect on that initial use and see what changes should and can be made.

Student results

Let’s start with student results. (Note: this is all a bit rough and ready)

Overall, the average mark for the assignment was 13.8 (72%) out of 19, with a standard deviation of around 3. But that’s for both parts of the assignment.

Given the current practice of using Word documents as assignment cover sheets, extracting the specific marks for the checklist/spreadsheet assignment is difficult. But I have an Excel spreadsheet and I can run a script to get that data (a sketch follows below).

The average mark is about 9.5 (68%) out of 14, with a standard deviation around 2.
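
The script is nothing fancy. A sketch of the approach using the CPAN Spreadsheet::ParseXLSX module (the directory layout and the location of the mark cell are hypothetical):

  #!/usr/bin/perl
  # Sketch: pull a single mark cell out of each submitted spreadsheet
  # and report the average.
  use strict;
  use warnings;
  use Spreadsheet::ParseXLSX;

  my $parser = Spreadsheet::ParseXLSX->new;
  my ( $total, $count ) = ( 0, 0 );

  foreach my $file ( glob 'submissions/*.xlsx' ) {
      my $workbook = $parser->parse($file) or next;
      my ($sheet)  = $workbook->worksheets;       # first sheet
      my $cell     = $sheet->get_cell( 40, 5 );   # hypothetical mark cell
      next unless defined $cell;
      $total += $cell->value;
      $count++;
  }

  printf "Average mark: %.1f over %d spreadsheets\n", $total / $count, $count
      if $count;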

Let’s dig a bit deeper into the three criteria that made up that mark. The three criteria were

  1. Mark – students use a checklist to evaluate a lesson plan and its use of ICT and pedagogy.
  2. Acceptable use – focused on students’ ability to identify a lesson plan they can use wrt copyright.
  3. RAT – students use the RAT model to evaluate the use of ICT and pedagogy in the course

The following table compares cohort performance on the criteria and overall.

Criteria          Average %   stdev %
Overall           68          15.8
Mark              75.2        17.2
Acceptable Use    63.2        16.7
RAT               59.3        17.8

The RAT question was where the students were least successful. It’s also (arguably) the most difficult question. The checklist (Mark) criterion had the highest average. Acceptable use is also quite low and needs some work.

Those last two are where the focus will go for now.

Other thoughts and experiences

Student feedback

Student feedback included the following comments related to the assignment

Some of the items we were required to assess in Assignment One could have been better explained

more guidance was required for Assignment 1. I didn’t like the use of the Excel document

The last point was connected to the issue of not being able to justify the interpretation, which links back to points raised elsewhere. The first point is one to ponder. The results above suggest that’s not where the real need lies.

Marker feedback

Feedback from markers included

  • Identifying use of an IWB, when in fact it’s just being used as a data projector.
  • Little understanding of what constitutes: an authentic problem, and connections beyond the classroom
  • Some surprise that even with 50 assignments to mark, there were few double ups of lesson plans.
  • Another liked the format in that it gave students a better handle on what to look for in an ICT-rich lesson and the RAT model was useful for framing an evaluation.
  • The wording and nature of the statements for the acceptable use and the RAT questions need to be clarified – too confusing (for marker and student)

One aspect of the assignment that troubled one of the markers was that the lesson chosen by the student only had to include some form of ICT. It didn’t need to be a rich or effective use of ICT. This was actually one of the aims of the assignment: to allow students to develop some appreciation for the breadth of what is possible and just how narrow use often is.

Questions asked during semester

  • Struggles to find a CC-licensed lesson plan.
  • Clarity about what makes an acceptable lesson plan
    • e.g. Can an American lesson be used?
    • Linked to concerns about Q10 and distinguishing between an appropriate lesson plan and whether or not you can use it due to copyright
  • Questions re: term of use and uploading
  • What if I can’t find any information about copyright?
  • How can/should the lesson plan be put online?
  • The distinction between when a student is using an ICT and when the teacher is using it
  • Explanation of how the checklist questions are marked – e.g. those that don’t apply
  • Reporting bugs in the formatting of the cells

Personal thoughts

Early reflections on the semester included

The spreadsheet worked reasonably well. The checklist within the spreadsheet requires some refinement. As does some aspects of the rubric. The duplication of a Word-based coversheet needs to be removed.

 Other thoughts during the semester included:

  • Students had a tendency to treat the free text questions as requiring an essay.
  • The “pretend” context for the task wasn’t clear enough.
  • In particular, a problem about the exact legal status of ACME’s server, links and making copies of files.
  • Issues with specific questions and the checklist
    • The “web applications” option under “What is used” causing confusion about overlap with “web browser” question
    • Q16 includes mention of print material around ICT
    • Q26 mentions embedded hardware, a question of it and the connection with IWB
    • Appears to be strong connections between Q22 and A46
    • The purpose of Q10 is not clear enough, confusion with matching curriculum etc.
    • A feeling that there are too many questions and perhaps overlap
    • Criteria for RAT question isn’t clear enough about the quality of the response
      • e.g. not mentioning all uses of ICT and Pedagogy
      • Missing out on themes
      • Incorrectly identifying something as belonging to a theme
    • Suggestion for a drop down box around linkage of ICT to objectives: not related, somewhat related, essential, extends, transforms
  • More explicit scaffolding/basic activities around the evaluation questions
    • e.g. Is ICT being used to Question, Scaffold, Lecture, in an authentic task

Random suggestions

Due to institutional constraints (not to mention time) none of the changes to be made can be radical. In keeping with that, some initial suggested changes to explore include:

  1. Pre-submission checks
    1. What pre-submission checks should I run?
    2. Can they be run? How does that integrate with the Moodle assignment activity workflow?
  2. Remove the cover sheet entirely, just use the spreadsheet
    1. Need to include the learning journal mark into the spreadsheet
    2. Would be nice to do this automagically
  3. Tweaking the marking
    1. The criteria for Acceptable use and RAT questions need to be improved
    2. Look closely at each of the points about the questions
  4. Student preparation
    1. Make clear the need not to write essays for the free text questions
    2. Finding CC licensed lesson plans
      1. Great difficulty in finding those that are CC licensed
      2. Provide a list of prior sites people have used
      3. Generate some sort of activity to test understanding of CC with a specific example
    3. RAT Model
      1. More activities in learning paths
      2. Better labeling on  the spreadsheet
    4. More questions/activities around specific terms and concepts within the checklist

 

Planning an EDC3100 "installfest"

The following documents the planning of an “installfest” for the course EDC3100. Implementation and reflection will come later.

Rationale

The course encourages/requires students to modify their learning process in the course to engage in Jarche’s seek/sense/share framework using a combination of a personal blog, Diigo, and the Feedly feed reader.

This is a radical departure and a challenge for most students. It results in a lot of time expended at the start of semester. For example, a past student shared her experience

I spent a lot of time trying to work out blogging, Diigo and Feedly and to be honest I am still only using the bare minimum with blogging

Not a good outcome, and apparently what has been used previously doesn’t work. So an alternative is required.

As it happens, the same student also suggested a possible solution

My thoughts on changes or additions to the course that I would have found useful, would have been to come to a workshop near the start.

I’ve been pondering this suggestion and how it might work with the next offering of the course, which has around 100 online students. Being of a certain age I remember installfests, and I have been wondering if that might be a useful model. Leading to questions such as…

Can something like an installfest be run in a online video-conference space? Will students participate? Will it help? How to organise it within existing constraints?

Design thoughts

Linux Installfest HOWTO

Interestingly, I came across the Linux Documentation Project’s Linux Installfest HOWTO, the following starts from that document.

The location will be virtual, not physical. So advice about preparing the physical location doesn’t quite apply. However, the features of the Zoom service will need to be considered.

Consideration: Might the “other room” feature of Zoom be useful for organising people at different stages?

Bringing up the major constraint: there’s likely to be only me to fulfill the various suggested roles. With more time I might have been able to organise additional help, but let’s not talk about the one week missing between semester 1 and semester 2.

Consideration: Can the session structure be informed by the identified roles? e.g. a receptionist role could be taken by the initial part of the session, which focuses on welcoming people to the space. It might also be useful to explicitly ask for volunteers who are a little further ahead than others, volunteers who might take on a Tier 1 support role.

Consideration: Can a Google document/sheet be used to get an idea of people’s knowledge, experience and comfort level with the various tools? Is completing this sheet part of the entry process? Perhaps something based on the data sheet?

Consideration: Have a space at the end for reflection? Perhaps in part people could do this on their blog?  It might even be a good exercise to start them making connections etc.  To see all the tools working together.

Fit with the course requirements

Course requirements to consider include

  • Blog
    • Which blog?
    • Posts and their model.
    • Feeds
  • Trying to help students develop an appreciation of the value of developing conceptual models of how a technology works, moving beyond recipe following.
  • Challenge of explaining how these three tools fit together.
  • What about seek/sense/share, and what that means for how they learn.
    Question: Do the why first? Too abstract.  Leave it until the end? Don’t know why and perhaps too late and too tired by everything else.  Perhaps show them how it all looks at the end?
  • Identity
    • anonymous or not
    • Professional identity
    • Not being an egg
  • How to demonstrate to people the process
    Select a volunteer and I help guide them through the process using some sort of scaffold (e.g. the slides or study desk)
  • How to give people the time to try it by themselves and perhaps get support
  • How to encourage/enable reuse of sections of the video to integrate into the learning paths

Questions to ask (form/spreadsheet)

  • Name
  • Are you willing to volunteer to be guided?
  • Blog
    • Do you have one set up?
    • Rate your knowledge of the blog
    • Have you written a blog post?
    • Have you customised your blog?
  • Diigo
    • Do you have a Diigo account?
    • Do you have a Diigo extension installed?
    • Have you bookmarked something using Diigo?
    • Have you shared it to the EDC3100 Diigo group?
  • Feedly
    • Have you logged into Feedly?
    • Have you imported the EDC3100 OPML files?
    • Have you tried following anyone else?

Initial design

Welcome

First 5+ minutes focus on welcoming everyone and asking them to fill out the form.

Outline the purpose of the session.

Outline the structure

  • Welcome
  • Where are we up to, where are we going
  • Doing it
    • Diigo
    • Feedly
    • Blog
  • Pulling it all together

Where are we up to? Where are we going?

Explain the three tools and the seek/sense/share approach to learning, only briefly on why, focus on concrete illustration showing my use of the tools. Link this to professional identity and the idea of being anonymous. Which tools need to be anonymous?

We want you to be able to do this by the end of the session.

Show the sheet behind the form – link to an idea they can use, mention the Google spreadsheet links in the EDC3100 Diigo group.  Find out where people are up to, think about approaches, ask for volunteers to be Tier 1 support – perhaps on the chat?  Or perhaps in a breakout room.

Outline structure (easy first, to more difficult)

  • Feedly
  • Diigo
  • Blog

Diigo

  1. Sign-up for account.
    Make sure they go to “learn more” — username and email (which email: personal or USQ?)
  2. Join the EDC3100 group.
  3. Show the emails I get and the approval process
  4. Install a Diigo tool
    Recommend Diigo extension – but Diigolet will do
  5. Bookmark a page for yourself
  6. Bookmark a page to the group????
  7. Do minute paper

Feedly

  1. Which account – link to professional identity
    1. Umail if only for University use – this is okay because it’s not visible.
    2. Facebook or other account if using for personal
  2. Visit Feedly – Hit the get started button – login
  3. Import the OPML files.
  4. Add some content – get them to search in Feedly for something they are interested in
  5. Make the point about not reading the actual page, but a copy; show how to access the actual page
  6. minute paper

Blog

  1. Which blog service
  2. Which identity – anonymous etc.
  3. Go to your choice of blog provider
  4. Hit the equivalent of “Create website”
  5. Follow the process
  6. Choose your configuration
  7. Write your first blog post — maybe suggest it should be linked to this post and reflect upon it.  Work in some ideas about reflection.
  8. Register the blog on the Study Desk — probably shouldn’t show this in Zoom.
  9. Talk about the WordPress reader and its relationship with Diigo
  10. Minute paper

Pulling it all together

  1. Can I get them to download the OPML file and import it into Feedly?
  2. Come back to the seek/sense/share process
    1. Seek – Start with Feedly
      1. See discussion forum posts
      2. See posts from other students
    2. Sense – on blog
    3. Share – on blog and Diigo
  3. Another minute paper???

Tasks

  1. Powerpoint scaffold for the session
  2. Google Forms
    1. Where are you up to?
    2. Minute papers
      1. Feedly
      2. Diigo
      3. Blog
  3. Set up data system for EDC3100 S2
    1. Blog registration counter
    2. Creating OPML files

Any pointers to an old, ancient game?

Way back in 1986 I started studying undergraduate computer science at the University of Queensland. One of our first year programming assignments was to use the fancy, new Macintosh computers to add some code to a game.  I’m looking for pointers to the name of the game and any online resources about it. A working version on some contemporary platform would be great.

Any help more than welcome.

The game

The game was played on a grid, typically 4 by 4, that looked something like this.

Grid 001

The idea is that there were random mirrors hidden throughout the grid. The aim of the game was to figure out what type of mirrors were located where within the grid. To do this you had a flashlight that you could shine through one of the holes on the outside. The light would exit the grid at another location, depending on the location and type of mirrors it encountered. A bit like this
grid 002

There were three types of mirrors: two diagonal mirrors (/ and \) and an X mirror. The diagonal mirrors would change the direction of the light depending on how the light struck the mirror. The X mirror would direct the light back the way it came.

The following image shows one potential layout of mirrors to explain how the light behaved in the above image.

Grid 003

The light travels straight ahead until it hits the first diagonal mirror. This mirror causes the light to change direction, heading directly up, where it immediately hits another diagonal mirror, which sends the light travelling right again until it exits the grid.
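
To make that behaviour concrete, the following is a minimal sketch (my own representation, definitely not the original Macintosh assignment code) of tracing the beam through a grid of mirrors:

  #!/usr/bin/perl
  # Sketch: trace a flashlight beam through a grid of hidden mirrors.
  # '/' and '\' are the diagonal mirrors, 'X' reflects straight back,
  # '.' is an empty cell.
  use strict;
  use warnings;

  my @grid = (
      [ '.', '/', '.', '.' ],
      [ '.', '/', '.', '.' ],
      [ '.', '.', 'X', '.' ],
      [ '.', '.', '.', '\\' ],
  );

  # Trace a beam entering at ($row, $col) heading ($dr, $dc);
  # returns the position at which it leaves the grid.
  sub trace {
      my ( $row, $col, $dr, $dc ) = @_;
      while (    $row >= 0 && $row < @grid
              && $col >= 0 && $col < @{ $grid[0] } ) {
          my $cell = $grid[$row][$col];
          if    ( $cell eq '/' )  { ( $dr, $dc ) = ( -$dc, -$dr ) }
          elsif ( $cell eq '\\' ) { ( $dr, $dc ) = ( $dc, $dr ) }
          elsif ( $cell eq 'X' )  { ( $dr, $dc ) = ( -$dr, -$dc ) }
          $row += $dr;
          $col += $dc;
      }
      return ( $row, $col );
  }

  # Beam entering the left side of row 1, travelling right: it hits the
  # first '/', goes up, hits the second '/', and exits to the right.
  my ( $r, $c ) = trace( 1, 0, 0, 1 );
  print "Beam exits at row $r, column $c\n";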

Early thoughts on S1, 2016 offering of EDC3100

First semester for 2016 is just about over. Time to reflect back on what’s happened with EDC3100, ICT and Pedagogy for this semester.

Overall, I feel the course is in a better place than it was last year. But there remains some significant room for improvement.

It will be interesting to see what the students think. It will be a couple of months until I see their feedback.

Changes made this semester

A range of different changes were made this semester.

New module 1 and assignment 1

Historically, EDC3100 starts with a bang and a lot of work – too much work – for students. The old assignment 1 required students to expend a fair bit of time getting to know a new technology. The return on that investment wasn’t as much as it might have been, hence students disliked it. This semester Assignment 1 and the supporting Module 1 were completely re-designed with the intent to reduce student workload and focus on a few particular outcomes.

In short, that appears to have worked okay. Workload was reduced. The content and activities for Module 1 could use some enhancement to make their purpose clearer and more engaging. The weekly release of Module 1 wasn’t great.

Assignment 1 as an Excel spreadsheet

Assignment 1 was designed around students using an Excel spreadsheet for at least three reasons:

  1. Provide students with more experience using an application type they appear not to have used a great deal.
  2. Ensure that the insights generated by both students and markers could be analysed via computer programs.
  3. Using Excel to reduce the workload for markers.

The spreadsheet worked reasonably well. The checklist within the spreadsheet requires some refinement. As does some aspects of the rubric. The duplication of a Word-based coversheet needs to be removed.

Analysing the submitted spreadsheets via software has commenced, but hasn’t been completed.  In part this is due to Moodle not providing a good way of extracting all marked files. This has been worked around – thanks to the good folk at LSS – but time is an issue. More work needs to be done on the analysis and sharing of insights gained from it.

The return of Toowoomba lectures

In 2015 there were no Toowoomba lectures and as a result the SET results for 2015 suffered. Students missed the lectures. In 2016 they were back and were streamed live using Zoom. The lectures and way stations were also more effectively integrated into the Study Desk structure.

The lectures were okay and there were folk attending via Zoom. Overall, however, the lectures need work. The exact relationship between the lectures and the learning paths needs to be thought about. Should a lecture duplicate the learning paths, or complement them?

Recordings of the lectures (hosted on Vimeo) show a greater amount of usage than I expected to see.
Video stats end May 2016

Refinements to later modules

A range of minor to more major refinements were made, especially for Module 3. These are generally an improvement over what went before.

The not so good

The not-so-good experiences this semester include:

  • The weekly release of Module 1.
  • My absence in Week 4 due to a conference.
  • The difficulty in finding material within the learning paths.
    The availability of the Moodle book search block from next semester will help massively with this problem.
  • Assignment 2 has the students thinking more about unit design than ICT and pedagogy.
  • A couple of major “marking outages” leading to late return of marked assignments.
  • The quality and quantity of feedback provided to students.
    This is a two-edged sword. Feedback on assignments was variable. Feedback via some of the study desk activities and discussion forums was quite good.
  • The on-going confusion amongst quite a few students around the learning journal and the marking of it.
  • Marking of assignments still requires too much work from the markers.

To do

All of this (including the following) will need to be revisited once the student feedback has been given, released to me, and considered.

Beyond some of what is mentioned above, and based on what I currently know, the following need to be done:

  1. Use analytics to explore how students are engaging with the learning paths, including the ability to produce an ePub version and/or print them.
  2. Analyse the Assignment 1 data and identify what new activities/resources this might be useful for.
  3. Better integrate the recorded lectures and other components with the learning paths.
  4. Think about a re-design of Module 2 and Assignment 2.

Competence with digital technology: Teacher or environment?

Apparently there’s a problem with digital skills in Australian schools. Only 52% of Year 10 students achieved a minimum standard of digital competence, and the teachers tasked to help develop that competence feel they aren’t competent. Closer to home, I’ve previously pointed out that the pre-service teachers I work with are far from digital natives seamlessly harnessing digital technologies to achieve their learning, teaching, and life goals.

Given the perceived importance of digital competence, something must be done. Otherwise “we run the real risk of creating a generation of digitally illiterate students”.

But what?

Mcleod and Carabott suggest

explicit teaching of digital competence through professional development for teachers. This is also important in teacher education programs…

digital competence tests should also be required for teacher registration

What do I think of those suggestions?

Well, they certainly have the benefit of being familiar to those involved in formal education, expanding as they do existing ideas of testing teachers.

But I’m not sure that’s a glowing recommendation. There’s an assumption that those familiar practices are working and should be replicated in other areas.

Limited views of knowledge – blame the teacher

Beyond that they seem based on a fairly limited view of knowledge. Di Blas et al (2014) talk about the knowledge required to integrate digital technologies into teaching as having

consistently been conceptualized as being a form of knowledge that is resident in the heads of individual teachers (p. 2457)

This is the type of view that sees the problem of a perceived lack of digital competence as something to be fixed only by filling the heads of individual teachers with the necessary digital competence, and then testing whether or not those heads have been filled appropriately. If they haven’t been filled properly, then it tends to be seen as the teacher’s fault.

The limitations of this view mean that I don’t think any approach based on it will be successful. (After all, a deficit model is not a great place to start.)

A distributive view

In this paper (Jones, Heffernan, & Albion, 2015) some colleagues and I draw on a distributive view of learning and knowledge to explore our use as teacher educators of digital technologies in our learning and teaching. Borrowing and extending work from Putnam and Borko (2000) we see a distributive view of learning and knowledge focused on digital technologies as involving at least four conceptual themes:

  1. Learning/knowledge is situated in particular physical and social contexts;
  2. It is social in nature;
  3. It is distributed across the individual, other people, and tools; and, that
  4. Digital technologies are protean.

How does this help with the digital competence of school students, teachers, and teacher educators? It suggests we think about what these themes might reveal about the broader context within which folk are developing and using their digital competence.

Schools and digital technologies

Are schools digitally rich environments? Each year I teach about 400 pre-service teachers who head out into schools on Professional Experience for three weeks. During that time they are expected to use digital technologies to enhance and transform their students’ learning. As they prepare for this scary prospect, the most common question from my students is something like

My school has almost no (working) digital technologies? What am I going to do?

Many schools are not digitally rich environments.

Where schools do have digital technologies, those technologies are often seen in ways that mirror reports from Selwyn and Bulfin (2015)

Schools are highly regulated sites of digital technology use (p. 1)…

…valuing technology as

  1. something used when and where permitted;
  2. something that is standardized and preconfigured;
  3. something that conforms to institutional rather than individual needs;
  4. something that is a directed activity. (p. 15)

As teacher educators with large percentages of online students, our digital environment is significantly richer in terms of the availability of digital technologies. However, in our 2015 paper (Jones, Heffernan, & Albion, 2015) we report that the digital technologies we use for our teaching match the description from Selwyn and Bulfin. Our experience echoes Rushkoff’s (2010) observation that “instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery” (p. 15). More recently I worked on a paper (Jones and Schneider, in review) with a high school teacher that identified the same problem in schools: digital technologies that were inefficient, got in the way of effective learning and teaching, and failed to mirror the real-world digital technology experience.

How do students and especially teachers learn to value and develop their digital competence in such an environment?

In the recent paper (Jones and Schneider, in review) we wondered what might happen if this environment was modified to actually enable and encourage staff and student agency with digital technologies. That is, allow people to optimise the technology for what they want to do, rather than optimise what they want to do to suit the technology. If this was done:

  • Would it lead to digital environments that were more effective in terms of learning and teaching?
  • Would it demonstrate the value of digital technologies and computational thinking to teachers in their practice?
  • Would this improve their digital competence?

If you could do it, I think it would positively impact all of these factors. But doing so requires radically rethinking a number of assumptions and practices that underpin most of education and the institutional use of digital technologies.

I’m not holding my breath.

Instead, I wonder how long before there’s a standardised test for that.

References

Di Blas, N., Paolini, P., Sawaya, S., & Mishra, P. (2014). Distributed TPACK: Going beyond knowledge in the head. In Society for Information Technology & Teacher Education International Conference (pp. 2457–2465). Retrieved from http://www.editlib.org/p/131154

Jones, D., Heffernan, A., & Albion, P. (2015). TPACK as shared practice: Toward a research agenda. In L. Liu & D. Gibson (Eds.), Research Highlights in Technology and Teacher Education 2015 (pp. 13–20). Waynesville, NC: AACE. Retrieved from http://www.editlib.org/d/151871

Putnam, R., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15. Retrieved from http://www.jstor.org/stable/1176586

Rushkoff, D. (2010). Program or be programmed: Ten commands for a digital age. New York: OR Books.

Selwyn, N., & Bulfin, S. (2015). Exploring school regulation of students’ technology use – rules that are made to be broken? Educational Review, 1–17. doi:10.1080/00131911.2015.1090401


Some simple analysis of student submissions

The last post outlined the process for extracting data from ~300 student submissions. This one outlines what was done to actually do some analysis on that data.

The analysis has revealed

  • Around 10% of the submissions have an issue with the URL entered.
  • About 16 lesson plans have been evaluated by more than one student.
  • At least 100 students have evaluated a lesson plan found online, with the Australian Curriculum Lessons site being the most popular.
  • An education-based site set up by the industry group Dairy Australia appears to be the most progressive in terms of applying a CC license to resources (apart from the OER Commons site).
  • It’s enabled allocating related assignments to a single marker, but the process for doing so with the Moodle assignment management system is less than stellar.

Time to do some marking.

What?

Having extracted the data, the following tests can/should be done:

  1. Is the lesson plan URL readable?
  2. Are there any lesson plans being evaluated by more than one student?
    Might be useful to allocate these to the same marker.
  3. What is the distribution of where lesson plans are sourced from? Did most use their own lesson plan?
  4. Pre-check whether the lesson plan can be used (as per copyright etc.).

Is the lesson plan readable?

Code is simple enough, but using LWP::Simple::head is having some problems.

Let’s try LWP::UserAgent.  That’s working better.
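In rough terms the check looks like the following sketch (a minimal illustration that assumes the URLs are passed on the command line; the real script pulls them from the extracted data):

[code lang="perl"]
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Check whether each submitted lesson plan URL responds to a HEAD request
my @urls = @ARGV;    # assumption: URLs supplied on the command line
my $ua   = LWP::UserAgent->new( timeout => 10 );

foreach my $url (@urls) {
    my $response = $ua->head($url);
    print "UNREADABLE: $url (", $response->status_line, ")\n"
        unless $response->is_success;
}
[/code]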

Seems that if they successfully enter the URL it’s readable.

3 students have used file:/ as a URL – not online.

Distribution of lesson plans

Aim here is to group all the URLs based on the hostname. This will then allow the generation of some statistics about where the lesson plans are being sourced from. Findings include the following counts for domains:

  • 1 – domain = UNI (a problem)
  • 3 – that don’t have a domain
  • 96 that appear to be using their own site as the source, indicating their own lesson plan
  • 19 domains indicating a lesson planning site
    Accounting for 107 students
  • 32 with some sort of ERROR

That’s still not 300. Ahh, we seem to have some problems with entering URLs correctly; common mistakes include:

  • Just leaving off the http:// entirely
  • Mangling bits of http:// (e.g. ttp:// or http//)
  • Using a local file i.e. file:////
  • Having the URL as “To complete”
  • Having the URL empty

Fix those up as much as possible. Most of these students appear to have put something usable in the cover sheet – if they have one.
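A minimal sketch of the normalisation and hostname grouping (my reconstruction; the URI module and reading URLs from stdin are assumptions, and some of the real fixes were done by hand):

[code lang="perl"]
#!/usr/bin/perl
use strict;
use warnings;
use URI;

my %count;    # hostname => number of lesson plans sourced from that host

while ( my $url = <STDIN> ) {
    chomp $url;

    # Repair the common mistakes seen in the submissions
    $url =~ s{^ttp://}{http://};     # mangled scheme
    $url =~ s{^http//}{http://};     # missing colon
    $url = "http://$url" unless $url =~ m{^\w+:};    # scheme left off entirely

    my $host = eval { URI->new($url)->host } || 'ERROR';
    $count{$host}++;
}

# Report the domains, most popular first
print "$count{$_}\t$_\n" for sort { $count{$b} <=> $count{$a} } keys %count;
[/code]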

Duplicate URLs

There are 16 lesson plans that are used by more than one student.  Most by 2, 4 by 3, 1 by 4 and 1 by 5 students.

Identifying these means I can allocate them to the same marker. It would be nice if there was an easier way to do this in Moodle.
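Finding the duplicates is straightforward once the URLs are in a data structure. A sketch, assuming a hypothetical %submissions hash mapping student ids to the URL each evaluated:

[code lang="perl"]
# %submissions ( student id => lesson plan URL ) would be populated
# from the database of extracted spreadsheet data
my %submissions;

# Invert the mapping: URL => list of students who evaluated it
my %students_for_url;
while ( my ( $student, $url ) = each %submissions ) {
    push @{ $students_for_url{$url} }, $student;
}

# Report any lesson plan evaluated by more than one student
foreach my $url ( keys %students_for_url ) {
    my @students = @{ $students_for_url{$url} };
    print scalar(@students), " students evaluated $url\n" if @students > 1;
}
[/code]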

Pre-check use

At least 107 of the students are using a lesson plan found online. The question is whether or not they can use that lesson plan as per copyright etc.

I could manually check each site, but perhaps as a short cut I should check the spreadsheets of a couple of students, see what argument they’ve mounted for fair use, and then confirm that.

The sites to check are

  • http://www.oercommons.org
    Not surprisingly, under a CC NC-SA license, though the NC is a little bit of a surprise.
  • http://www.australiancurriculumlessons.com.au
    Requires permission to upload.
  • http://readwritethink.org
    Seems to allow reuse.
  • http://www.capthat.com.au
    Copyright applies, but they’ve been good in granting permission for students to use for this assignment.
  • http://www.dairy.edu.au
    Dairy Australia showing off their online nous by applying a CC license.
  • http://cdn.3plearning.com
    Most students appear to have had to request permission.

Interestingly, there is some large variation between people using the same site. These should be allocated to the same marker; it will cut down on time for them.

Setting up the analysis of student submissions

A couple of weeks ago I wrote this post outlining the design of an Excel spreadsheet EDC3100 students were asked to use for their first assignment. They’ll be using it to evaluate an ICT-based lesson plan. The assignment is due Tuesday and ~140 have submitted so far. It’s time to develop the code that’s going to help me analyse the student submissions.

Aim

The aim is to have a script that will extract each student’s responses from the spreadsheet they’ve submitted and place those responses into a database. From there the data can be analysed in a number of ways to help improve the efficiency and effectiveness of the marking process, and to explore some different practices (the earlier post has a few random ideas).

The script I’m working on here will need to

  1. Be given a directory path containing unpacked student assignment submissions.
  2. Parse the list of submitted files and identify all the spreadsheets
  3. Exclude those spreadsheets that have already been placed into the database.
    Eventually this will need to be configurable.
  4. For all the new spreadsheets
    1. Extract the data from the spreadsheet

At this stage, I don’t need to stick the data in a database.
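A skeleton for that script might look like the following (a sketch only; the helper routines are hypothetical stubs standing in for the database check and the parsing discussed below):

[code lang="perl"]
#!/usr/bin/perl
use strict;
use warnings;

my $dir = shift @ARGV or die "Usage: $0 submissions_directory\n";

# 1. and 2. Find all the spreadsheets in the submissions directory
opendir( my $dh, $dir ) or die "Can't open $dir: $!\n";
my @spreadsheets = grep { /\.xlsx?$/i } readdir $dh;
closedir $dh;

foreach my $file (@spreadsheets) {
    next if already_processed($file);         # 3. skip known files
    my $data = extract_data("$dir/$file");    # 4. parse the spreadsheet
    # ...eventually: store $data in the database
}

sub already_processed { return 0 }     # stub: would check the database
sub extract_data      { return {} }    # stub: see the parsing steps below
[/code]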

Steps

  1. Code that when given a directory will extract the spreadsheet names
  2. Match the filename to a student #id.
  3. Parse an individual Excel sheet
    1. Rubric
    2. About
    3. What
    4. How
    5. Evaluation
    6. RAT
  4. Mechanism to show the values associated with each question number in the sheet.
    Look at a literal data structure.
  5. Implement a test sheet
  6. See which student files will give me problems.

Extract spreadsheet names


This is where the “interesting” naming scheme used by the institutional system comes into play. The format appears to be

SURNAME Firstname_idnumber_assignsubmission_file_whateverTheStudentCalledTheFile.extension

Where

  • SURNAME Firstname
    Matches the name of the student with the provided case (e.g. “JONES David”)
  • idnumber
    Appears to be the id for this particular assignment submission.
  • assignsubmission_file_
    Is a constant, there for all files.
  • whateverTheStudent…
    Is the name of the file the student used on their computer. It appears likely that some students will have been “creative” with their naming schemes; at least one student has a file named something.xlsx.docx
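Given that format, pulling the pieces apart is a single regular expression. A sketch (the file name here is hypothetical, and it assumes the constant string never appears inside a student's name):

[code lang="perl"]
#!/usr/bin/perl
use strict;
use warnings;

# A hypothetical file name in the institutional format
my $file = 'JONES David_12345_assignsubmission_file_assignment1.xlsx';

if ( $file =~ /^(.+?)_(\d+)_assignsubmission_file_(.+)$/ ) {
    my ( $name, $submission_id, $original ) = ( $1, $2, $3 );
    print "name=$name  submission=$submission_id  file=$original\n";
}
[/code]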

Match the filename to a student id

This is probably going to be the biggest problem area. I need to connect the file to an actual unique student id. The problem is that the filename doesn’t contain a unique id that is associated with the student (e.g. the Moodle user id for the student, or the institutional student number).  All it has is the unique id for the submission.

Hence I need to rely on matching the name.  This is going to cause problems if there are students with the same name, or students who have changed their name while the semester is under way. Thankfully it appears we don’t currently have that problem.
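So the matching reduces to a lookup of the (normalised) name against a class list. A sketch, where %class_list is a hypothetical stand-in for an institutional export:

[code lang="perl"]
use strict;
use warnings;

# Hypothetical class list: "SURNAME FIRSTNAME" => student number
my %class_list = ( 'JONES DAVID' => '0061234567' );

sub student_id_for {
    my ($name) = @_;
    return $class_list{ uc $name };    # normalise case; undef if no match
}

print student_id_for('JONES David') // 'no match', "\n";
[/code]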

Test with 299 submitted files

Assignment due this morning – let’s test with the 299 submitted files.

Ahh, issues with people’s names: apostrophes.

Problem files

Apparently 18 errors out of 297 files.  Where did the other 2 go?

“Bad” submissions include

  1. 10 with only 1 file submitted;
    All 10 only submitted the checklist. Not the cover sheet or the lesson plan.
  2. 26 with only 2 files submitted (3 total required)
    1. 25 – Didn’t submit the lesson plan
    2. 1 – Didn’t submit the checklist
    3. 0 – Didn’t submit the coversheet
  3. 18 files that appear to have the bad xlsx version problem described below.

That implies that some of the people who submitted 3 files didn’t submit an Excel file?

Oh, and I’m quite proud, in a nerdy, strange way, of this


[code lang="bash"]
# For each submission id that appears on exactly 3 files, check whether
# one of those files is an Excel spreadsheet; report the ids that have none
for name in `ls | cut -d_ -f2 | sort | uniq -c | sort -r | grep ' 3 ' | sed -e '1,$s/^.*[0-9] //'`
do
    files=`ls *$name*`
    echo $files | grep -q ".xls"
    if [ $? -eq 1 ]
    then
        echo "found $name"
    fi
done
[/code]

I’m assuming there will be files that can’t be read. So what are the problems?

Seems they are all down to Microsoft’s “Composite Document File V2 Format”. These files will open in Excel, but challenge the Perl module I’m using.

Out of the 297 submitted so far, 18 have this problem.  Going to leave those for another day.
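A quick way to spot these files without opening Excel is to look at their magic bytes: real .xlsx files are zip archives, while the Composite Document format has its own signature. A sketch (my own check, not from the original scripts):

[code lang="perl"]
#!/usr/bin/perl
use strict;
use warnings;

# Classify a spreadsheet file by its first four bytes
sub file_format {
    my ($path) = @_;
    open( my $fh, '<:raw', $path ) or return 'unreadable';
    read( $fh, my $magic, 4 );
    close $fh;
    return 'xlsx (zip)' if $magic eq "PK\x03\x04";          # zip archive
    return 'CDF V2'     if $magic eq "\xD0\xCF\x11\xE0";    # OLE2/Composite Document
    return 'unknown';
}

print "$_: ", file_format($_), "\n" for glob '*.xls*';
[/code]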

What to expect/look for from SITE'2016?

Fountain

I’m spending this week attending the SITE’2016 conference (SITE = Society of Information Technology and Teacher Education). This is my first SITE and the following outlines some of my expectations and intent.

It’s big

SITE is one of a raft of conferences run by AACE. I’ve been to two of them previously: EdMedia and Elearn. These are big conferences: 1000+ delegates, up to and beyond 10 simultaneous sessions. Lots of in-crowds and cliques. Lots of times when there is nothing you’re really interested in, and lots of times when there are multiple things you are very interested in. A lot of really good stuff lost in the mass.

Observations that have been borne out by my first glance at the program.  Too much to take in and do justice to.

At face value, a fairly traditional large conference. With the same breadth of simple to complex, of repetition to real innovation, from boring to mind-blowing. Probably the same ratio as well.

As I’m far from being an extroverted and expert networker, I instead rely on actively trying to make some connections between what I see and what I’m doing or going to do.

Join a clique?

While our paper didn’t get an overall paper award, it was successful in winning a TPACK SIG Paper Award. Given that our previous paper was also TPACK related and won a paper award, this might suggest a “clique” with which I have some connection, and there are a couple of related papers that sound interesting.

There also appear to be other “cliques” based around computational thinking, design thinking/technologies, and ICT integration by pre-service teachers (more generally than TPACK). All of these are interests of mine; they connect directly to my teaching.

I’m thinking a particular focus for the next few days will be identifying and sharing ideas for using digital technology in school settings with the current EDC3100 crew.

There was one explicit mention of OER in the titles. Pity I can’t get access to the content of the talks yet (thanks to how I was registered and the closed way the organisers treat the proceedings – online now, but only for those registered).

Time to get the presentation finished.


Setting up an Excel checklist

For a brand new first assignment for EDC3100 the students are being asked to find a lesson plan that uses digital technologies to enhance learning (ICT and Pedagogy), and evaluate it against a checklist. The following documents my explorations about how to set up this checklist.

Current status

Have test version that appears to be working. Need to test it on different platforms. A task for tomorrow.

If you’re an EDC3100 student, then do try downloading it and taking a look – is it going to work for you? Remember, it is an early test; I need to do more testing myself.

Why?

The rationale for this assignment includes the following:

  1. Broaden the students’ awareness of what is possible with ICT and Pedagogy.
  2. Make them aware of some ways to evaluate how ICT and Pedagogy is being used.
  3. Help them question just how they can use a resource they found online.
  4. Create a “database” of information that I can analyse.

The last point may not sound all that educational, but the hope is that the ability to “mine” (i.e. use digital technologies to analyse) the students’ responses and the markers’ judgements around this task will enable a range of new practices that will enhance/transform learning and teaching.

Some initial examples of what might be possible

  • Pre-marking checks.
    e.g. students are required to include a URL to a lesson plan. A program could check that the URL is actually correct. In a perfect world, it would warn the student before allowing them to finalise submission – but the LMS isn’t flexible enough for that.
  • Marker allocation.
    e.g. if one of the markers has an interest in mathematics, allocating her all of the maths-related evaluations might be a good idea.
  • Supporting moderation and providing summary feedback.
    e.g. after marking, a program could analyse how all the students have performed and generate summary feedback. Feedback that could be used to inform the moderation process and also to provide students an overall picture of how everyone went.
  • Providing a shareable database of evaluated lesson plans.
    The assignment has 300+ students finding a lesson plan and evaluating its use of ICT and Pedagogy. These evaluations are then marked by practicing teachers. In an environment where there is abundant information, these evaluations might help focus attention on what’s actually “good”, e.g. here’s a list of all the lessons that transform (as per the RAT framework) student learning using ICT, rather than here’s a list of lessons that use ICT. At the very least, this could be useful within the course.

But before any of those is possible, I have to figure out an appropriate method to create the checklist.

Requirements

A good method is going to meet the following requirements

  1. Easy/efficient/familiar for the students to use. Students will need to:
    • check boxes; and,
    • write/copy and paste small sections of text.
  2. Easy/efficient/familiar for the marker
    Markers will need to

    • read and understand student responses;
    • indicate right/wrong (check boxes);
    • write small sections of text (comments/feedback);
    • make judgements against a rubric; and,
    • calculate a total
  3. Work with the existing technology, including
    • Be submitted/returned via Moodle;
    • Be a format that can be analysed via programs.

The two most obvious options are an Excel spreadsheet, or a Word document. Yep, pandering to closed formats, but then most of the open formats break the first two requirements.

I am assuming that students will have access to Excel, though that might be a big ask. They may need to resort to Google Sheets or some other tool; the question will be whether doing this gets in the way of any script.

What can I analyse programmatically?

I should probably double check which formats I can actually write programs to analyse.

Excel works nicely – even down to the checkbox.  Word is being more difficult.

I’ll go with Excel, though I may be pandering to my prejudices.

Setting up the checklist in Excel

The requirements are:

  • About 90 questions separated into four sections
    That might seem a bit much, but a fair number of those are covered by lists of ICT from which students choose what is being used.

    Having the different sections on different sheets could be useful.  Might also challenge the students, but that’s a good thing.

  • Large % of the questions are just checkboxes.
    Have tested that I can get that to work.
  • Another portion of questions require a checkbox plus a textbox to include proof
  • Small number are just a textbox
  • One has a table like structure
  • Marking
    • For almost all the questions, marking means indicating right or wrong
      What should the default value be?  Wrong or right?
    • A couple of questions require a textbox to make a judgement call
    • Would like the rubric for the assignment in the spreadsheet and for it to be auto-filled in by the marker’s actions

Set up different worksheets – that’s working. Got a format that looks okay, some questions going in, and checkboxes.

Can I read the test sheet from a script?

Absolutely, multiple sheets, no worries.  Looking good.
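For the record, the reading side looks roughly like this (a sketch assuming the Spreadsheet::ParseXLSX module, one of several Perl options for .xlsx files, and a hypothetical checklist.xlsx file name):

[code lang="perl"]
#!/usr/bin/perl
use strict;
use warnings;
use Spreadsheet::ParseXLSX;

my $parser   = Spreadsheet::ParseXLSX->new;
my $workbook = $parser->parse('checklist.xlsx')
    or die "Can't parse spreadsheet\n";

# Walk every worksheet and dump every non-empty cell
foreach my $sheet ( $workbook->worksheets ) {
    print 'Sheet: ', $sheet->get_name, "\n";
    my ( $row_min, $row_max ) = $sheet->row_range;
    my ( $col_min, $col_max ) = $sheet->col_range;

    for my $row ( $row_min .. $row_max ) {
        for my $col ( $col_min .. $col_max ) {
            my $cell = $sheet->get_cell( $row, $col ) or next;
            print "  ($row,$col) ", $cell->value, "\n";
        }
    }
}
[/code]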


Using resources appropriately

The following is intended to be an example that will be used in the course I’m teaching. It’s meant to demonstrate appropriate ways to reuse resources that have been created in different ways. It’s also an opportunity to explicitly test my understanding. So feel free to correct me.

The idea is that whether and how you can use a resource (be it words, audio, video etc.) depends on who created the resource, copyright, and any additional conditions that have been applied.

Using a resource I created

The following image is a photo taken by me. I’m the copyright owner, so I’m free to use it any way I like. No need to reference or give attribution.

If I’d taken this image as part of preparing teaching materials for my paid work for the University of Southern Queensland, then I would have to ask their permission to use it here, as the University (currently) retains copyright ownership of materials produced for teaching purposes.

Eating in the bath

There’s no need to include any attribution on this image, as I own the copyright.

Using a public domain image

The following image – taken from a book from the 1800s – is in the public domain. There are no restrictions on how I (or you) can use this image.

Image from page 363 of “Encyclopédie d’histoire naturelle; ou, traité complet de cette science” (1800s, public domain)

With public domain resources, there’s no need for an attribution, but it would be nice to do.

Using a Creative Commons image

The following image was taken by Daisuke Tashiro, who has chosen to apply to it a Creative Commons license which allows me to reuse the image as long as I fulfill the conditions of the license, including appropriate attribution of the image.

To properly attribute the image, I make use of the ImageCodr service.

If I were to use the above image without the attribution – just the image itself – I would be breaking the terms of the license. However, I can link to the image without any attribution and without breaking any copyright conditions.

Using a copyrighted image

The following image is copyrighted – all rights reserved. While I can link to this image without breaking copyright, if I embed it in this blog post I’m likely to get into trouble.

Unless, that is, I ask the copyright holder for permission to use the image. As I have known the copyright holder for a long time, I’ve been able to do this quite easily and quickly. However, if you don’t know the copyright holder, obtaining permission may take quite some time, and may not happen at all.


Copyright © (2012) Colin Beer – used with permission

If I don’t get permission from the copyright holder, I can’t use this image. Even if I provide a nice attribution of the resource, I still can’t use it.

What is that last image about?

The image is a little interesting in the context of the course. It indicates that there is a potential relationship between the final grade a student achieves in a course and the week of term when the student first accesses the course website, i.e. if you first access a course website in week 5, you are likely to get a lower grade than students who accessed the course website earlier.

