Assembling the heterogeneous elements for (digital) learning

Month: April 2016

Focus, innovation and university IT

I’m currently reading “Automate This” by Christopher Steiner and came across the following:

It’s not often that the most important innovations in the world come from the GEs and the Microsofts, the authors point out. They come from entrepreneurs who are focused on that one area with an intensity that bigger companies simply can’t bring. Most big firms see their time as best spent on making their current products and processes more efficient….Because of this fact, big companies tend to fall into traps of overmanaging and underinnovating

I don’t think this observation is all that earth-shattering; I’ve heard it expressed in many places before. For example, “the authors” Steiner mentions are Kedrosky and Stangler in this report, which includes the following (also quoted by Steiner):

Startup firms specialize—in a way that larger and more-established companies can barely contemplate—in attacking complex problems in cheaper and more efficient ways

This argument connects with the problems with University IT services discussed by Mark Smithers and Martin Weller. Both Mark and Martin pick up on the unique needs of universities when it comes to IT; for example, Martin writes:

We have to get back to having dialogue, and having IT people who understand the needs of universities (and equally academics who understand the demands of IT systems).

Can incommensurate parties have dialogue?

In a previous post I argued that an “incommensurability barrier” exists between the techno (University IT) and the pedagogue (University teacher). The quote from Steiner illustrates the nature of this barrier. IT is focused on efficiency (and other criteria) across the entire organisation, while the teacher specialises (focuses) on the course(s)/student(s) they are teaching. Their focus is on attacking the complex problem of teaching their course in different ways – sometimes with a focus on cheaper and more efficient, but also on more effective.

Animated gif of reusability paradox showing a trend to putting more context into the object

Which, for me, brings to mind Wiley’s reusability paradox.

Stephen Downes makes the point that this divide isn’t necessarily inherent in either the technology or education disciplines and I agree. However, at least in my experience (and others that I know) this divide does exist within the very limited confines of University education. Perhaps in any formal education involving large organisations.

What can be done?

Good question. Tim Klapdor talks about a significant part of the answer:

So rather than seeking to constantly create smarter technologies, what if you simply allowed people more control over how they interacted with them? What if you provided tools that allowed users to move data between systems more easily? What if you got your internal systems to talk to each other in a shared language? What if you made systems more contextually aware? What if instead of investing millions in “better” technology you empowered your users?


The need for technopedagogues and will it ever go away?

Tim Klapdor writes about (along with a bunch of other stuff today) the process of discovering the concept of a technopedagogue, and offers his translation of the French definition of that concept:

The technopedagogue is a kind of bilingualist, one foot in human needs and learning process, and the other in technology and its potential. So a technopedagogue can oversee the design and even the implementation of interfaces, environments and the digital tools that support learning or various processes. The technopedagogue communicates easily with system architects and programmers as well as administrators, trainers and teachers. They can also act as a translator between the two, often translating the educational needs into the technical requirements. What makes this techno-pedagogical bridge so vital to our digital society is the ability to maximise the potential of the technological tools to meet our needs, which are first and foremost, human.

Like Tim I can see some resonance between that role and the type of stuff I do. In fact, any value that exists in the stuff that I do comes because I’m able to bridge the techno-pedagogical chasm. Not because I’m a brilliant pedagogue, or even a brilliant technologist. Far from it.

It comes from the fact that it is so unusual for any one person or organisation to be able to effectively bridge the techno-pedagogical chasm.

Do we need to teach “ICT and Pedagogy”?

Four years ago I started work at my current institution, charged with the responsibility of teaching “ICT and Pedagogy” to pre-service teachers. Back then I wondered why there was still a need for this type of course. I wrote:

If technology is part of every day learning and teaching, why have a separate ICT in pedagogy course?

I really wasn’t sure there would remain a need for the course. I was fairly certain I’d be looking for another job fairly soon.

Not as long as the chasm remains

I don’t think that any more.

The perceived importance of digital technologies in education remains and is perhaps more widely recognised. It appears that governments and formal education institutions are starting to spend more time and attention on the problem.

But I don’t think any of that is going to work because the chasm between the techno and the pedagogue is going to remain.

The Incommensurability barrier

For me, the chasm is unlikely to disappear because the techno (perhaps defined as involving the “system architects and programmers as well as administrators”) works with a mindset that is incommensurable with the mindset used by the pedagogue. Another of Tim’s posts from today, on scale, gives one example of this incommensurability.

The techno is interested in scale: in systems and practices that work for the whole organisation, or the whole of learning and teaching.

The pedagogue is interested – as much as they can be within the current system – in the individual, the specific.

Hence you get Wiley’s reusability paradox.

Ignorance of the nature of digital technologies

The “techno” mindset is so strong that we end up with organisational approaches to digital technologies that no longer reflect an understanding of the fundamental nature of digital technologies. Rather than benefiting from Kay’s protean meta-medium, we get the situation Rushkoff identified where human beings are optimised for the digital technologies. A situation we talk a bit about in this paper.

Digital fluency won’t help

Rather than recognise this fundamental problem, the trend over recent years has been to blame the teachers. They aren’t using our technology because they aren’t digitally fluent. If only they were digitally fluent, then everything would be okay. It won’t be.

There may well be some benefit, but unless the digital technologies available to teachers (and students) help them deal with the contextual, the specific, and the individual, the chasm will continue to exist. Technology focused on the general won’t be useful. There’ll still be a chasm between what is provided and what is required.

Re-arranging the deck chairs

Hence courses like ICT and Pedagogy will continue to exist, but there will be lots of activity re-arranging the deck chairs. My course isn’t helping because it isn’t teaching X. So revise it to teach X. Oh, that didn’t work. The course needs to be taught in the first year. No, that didn’t work. It needs to be integrated across the curriculum. No, that didn’t work…

Now to go write some code to bridge the gap between the generic software systems the organisation provides me and the specific needs of my course, the pedagogy I use in it, and my learners.


Playing with Wikity

What is Wikity?

I have a feeling I may not be able to give a good answer to that. That’s part of the point of the work this post starts: play around with Wikity and explore some possibilities.

At the moment, I see Wikity as a new style of mindtool. One that might be useful for me personally, but perhaps also for much more than just my personal use. It’s a wiki-type tool that runs on top of WordPress and is informed by the ideas of Federated Wiki. It’s the most recent instantiation of thinking from Mike Caulfield.

The plan here is to install the 0.3 version of Wikity on my under-utilised Reclaim Hosting site.

Log

  1. Create a new sub-domain http://wikity.djon.es
    Straightforward.
  2. Install WordPress – multisite
  3. Import the wikity theme – and hey it’s working – at least the home page has changed
  4. Install the auto-upload images plugin
    WordPress wants to update this plugin, but the advice for Wikity is not to.
  5. Start creating some initial cards
    1. Use the “Wik-it” bookmarklet to “social bookmark” “OER as a participatory activity”, creating this card.

Title-Abstract-Treatment-Connections

Getting Wikity installed was always going to be the easy part. Figuring out how to use it productively and integrate that use into my regular practice was going to be the challenge.

The Wikity introduction positions “Title-Abstract-Treatment-Connections” as the best structure for a card. It encourages people to engage with this by creating a new card. I’m wondering if I can edit an existing one (it’s a wiki, of course you can, is my expectation).

Mmm, the edit link in the card view brings up the WordPress edit screen. Wonder how that will work with markdown? The catalog view gives the more markdown-like view.

The focus in Wikity is not to summarise the article/resource, but to capture the idea. Something I’m somewhat terrible at, and need to practice more.

Connections to other cards are important, so let’s try that by adding one on the Reusability Paradox. Oh nice, the auto image upload works, though I have to use the full syntax for the image.
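To make the Title-Abstract-Treatment-Connections shape concrete, here’s a hypothetical card. I’m assuming Wikity’s markdown editing and [[double square bracket]] linking; the title would be the card’s WordPress post title, and the linked card names are invented:

```
The more context a learning object carries, the more useful it is to one
particular course and the less reusable it is anywhere else. (Abstract)

Wiley’s point is that the trade-off can’t be escaped, only chosen: you pick
a point on the context/reuse continuum. (Treatment)

Connections: [[OER as a participatory activity]], [[Learning objects]]
```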

Paths look interesting.

Use of WordPress pages could be useful for some of my potential applications of Wikity.

Copying other people’s work

Given I’ve self-installed Wikity separate from others, can I still copy cards from others?

The docs don’t appear to touch on that yet, so let’s try it out. Find an interesting card and try to copy it: stick in the URL for my Wikity and hit copy.

Nice, that worked. Mmm, but it appears that the connection Mike made to another card (Ideology of Disgust) doesn’t come across correctly.  Or is that because he hasn’t written that card yet?  No, he created it.

Time to add another personal connection to it. Something I’ve been thinking about recently.

All that seems to be working.

Further applications

I can see some definite applications personally, as long as I have the discipline. Wonder if Diigo’s days are numbered? At least for personal use.

Beyond that there are some potentially interesting contributions Wikity can help with ideas around the BAD mindset, CASA, and some sort of foundation for a “distributed TPACK” approach.

Also some potential for teaching.  It would be a good fit for Networked and Global Learning.

All I need is time.

The immediate application might be to this project that is supporting a group of teacher educators using analytics to understand what’s going on in the courses.


Designing a collection of analytics to explore "engagement"

I’m working with a group of fellow teacher educators here at USQ to explore what is happening around student engagement with our online courses. It’s driven by the apparently less-than-stellar responses around “engagement” on the QILT site from our prior students. It’s also driven by some disquiet about the limitations of aggregated and de-contextualised data like that reported on the QILT site, and also that arising from most learning analytics (e.g. as found by Gašević et al. (2015)).

Hence our idea is to do something like

  1. Take a collection of teacher education courses.
  2. Iteratively apply a range of increasingly specific learning analytics to reveal what’s happening around engagement in our course sites.
  3. Reflect on what we learn and what that might have to say about
    • the use of aggregated and de-contextualised data/analytics;
    • what data/analytics might be useful for ourselves and our students; and
    • what’s happening in our courses, and how that compares to what we thought was going on.

As the person most familiar with technology and learning analytics, I’m tasked with identifying the sequence of “increasingly specific learning analytics” that we’ll use.

What follows is a first draft.  I’m keen to hear suggestions and criticisms. Fire away.

Specific questions to be answered

  1. Does the sequence below make sense?
  2. Are there other types of analytics that could be usefully added to the following and would help explore student/staff engagement and/or perhaps increase contextualisation?
  3. What literature exists around each of these analytics, where did they apply the analytics, and what did they find?

Process overview

Each of the ovals in the following diagram is intended to represent a cycle where some analytics are presented. We’ll reflect on what is revealed and generate thoughts and questions. The labels for the ovals are a short-hand for a specific type of analytics. These are described in more detail below.

The sequence is meant to capture the increasing contextualisation. The first four cycles would use fairly generic analytics, but analytics that reveal different and perhaps more specific detail. The last two cycles – learning design and course specific – are very specific to each course. The course specific cycle would be aimed at exploring any of the questions we identified for our individual courses as we worked through the other cycles.
Version 1 of process
It won’t be quite as neat as the above. There will be some iteration and refinement of existing and previous cycles, but the overall trend would be down.

The analytics below could also be compared and analysed in a variety of ways, most of which would respond to details of our context, e.g. comparisons against mode and specialisation.

Click/grade & Time/grade

This cycle replicates some of the patterns from Beer et al (2010) (somewhat shameless, but relevant, self-citation) and related work. It’s aimed at just getting the toe in the water and getting the process set up. It’s also arguably as removed from student learning/engagement as you can get. A recent post showed off what one of these will look like.
EDC3100 2015 Course and grades

This would also include the heatmap type analysis such as the following diagrams.

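As a rough sketch of the mechanics behind a clicks/grade box plot (not the actual scripts we’ll use; the log format and grade bands are invented), pandas can get the numbers together quickly:

```python
# A rough pandas sketch of the clicks/grade pattern.
# One row per LMS click, with each student's final grade attached.
import pandas as pd

clicks = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s3", "s3", "s3", "s4"],
    "grade":   ["HD", "HD", "P",  "F",  "F",  "F",  "P"],
})

# Total clicks per student, then summarised per grade band --
# these are the numbers a clicks-vs-grade box plot draws.
per_student = clicks.groupby(["grade", "student"]).size().rename("clicks")
print(per_student.groupby("grade").describe())
```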

“Rossi” data sets

Rossi et al (2013) extended the Beer et al (2010) work with what I’ll call the “Rossi” data sets, drawing on the Interaction Equivalency Theorem (Miyazoe and Anderson, 2011) and hence increasing the theoretical connection with interaction/engagement.

The additions in the “Rossi” data sets follow

Proportion of clicks within LMS discussion forums (dhits)

| | Non-forum clicks | Forum clicks (dhits) |
| --- | --- | --- |
| CQU Moodle courses (n=12870 students) | 68% | 32% |

# of forum hits (dhit), posts, and replies

A “dhit” is a click on a forum, including navigation etc. The idea here is to compare dhits with posts and replies:

the posts and replies made within the forums are more representative of student and teacher engagement and interaction (Rossi et al 2013, p. 48)

| | dhits | Posts | Replies |
| --- | --- | --- | --- |
| Moodle CQU T1, 2011 (n=12870) | 385113 | 17154 | 29586 |

learner-learner, learner-teacher and ratio

# of forum posts that are learners replying to a learner post (learner-learner), or a learner responding to a teacher or vice versa (learner-teacher).

| | Learner-learner | Learner-teacher | Ratio of LT to LL |
| --- | --- | --- | --- |
| Average for T1, 2011 CQU courses (n=336) | 86 | 56 | .65 |

Comparison of learner-learner, learner-teacher and learner content interactions


Networks and paths

These analytics focus on the relationships and connections between people, and the paths they follow while studying – moving beyond counting things to starting to understand connections.

This 2013 post from Martin Hawksey (found via his other post mentioned below) gives an overview of a range of uses and tools (including SNAPP) for social network analysis. It’s the early SNAPP work that outlines some of what these visualisations can help identify (a small sketch follows the list):

  • isolated students
  • facilitator-centric network patterns where a tutor or academic is central to the network with little interaction occurring between student participants
  • group malfunction
  • users that bridge smaller clustered networks and serve as information brokers
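As a sketch of how a couple of those patterns might be spotted programmatically. networkx is my assumption here (SNAPP itself is a visualisation tool), and the who-replied-to-whom data is invented:

```python
# Spotting isolated students and facilitator-centric patterns with networkx.
# The reply list is invented; real data would come from a forum export.
import networkx as nx

students = {"alice", "bob", "carol", "dave", "erin"}   # erin never posts
replies = [("alice", "teacher"), ("bob", "teacher"),
           ("carol", "alice"), ("teacher", "bob"), ("dave", "teacher")]

G = nx.DiGraph()
G.add_nodes_from(students | {"teacher"})
G.add_edges_from(replies)

# Isolated students: enrolled, but connected to no one.
print("isolated:", [s for s in students if G.degree(s) == 0])

# Facilitator-centric pattern: share of interactions involving the teacher.
teacher_edges = sum("teacher" in edge for edge in G.edges)
print(f"teacher involved in {teacher_edges} of {G.number_of_edges()} interactions")
```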

The following is one of my first attempts at generating such a graph. It shows the connections between individual student blogs (from EDC3100 2013). The thicker the line between dots (blogs), the more links. Romero et al (2013) offers one example of this type of analysis.

001 - First Graph

A Sankey diagram is a method for representing flow in networks and can be used to understand usage of websites. Martin Hawksey has just written this post (showing how to take LMS discussion data and send it through Google Analytics), which includes the following screenshot of “event flow” (a related idea). It shows (I believe) how a particular user has moved through a discussion forum, and it looks like it provides various ways to interact with this information.

Google Analytics - Event Flow

Hoping we might be able to leverage some of the work Danny Liu is doing.
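If we end up drawing event flows ourselves rather than through Google Analytics, Plotly’s Sankey trace is one option (an assumption on my part; the forum-navigation flows below are invented):

```python
# A minimal Sankey sketch with invented forum-navigation flows.
import plotly.graph_objects as go

labels = ["Forum home", "Thread A", "Thread B", "Reply form"]
fig = go.Figure(go.Sankey(
    node=dict(label=labels),
    link=dict(source=[0, 0, 1], target=[1, 2, 3], value=[8, 4, 3]),
))
fig.write_html("event_flow.html")  # open the file in a browser
```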

Sentiment, content, and broader discourse analysis

The previous cycles are focused on using clicks and links to understand what’s going on. This cycle would start to play with natural language processing to analyse what the students and teachers are actually saying.

This is a fairly new area for me. Initially, it might focus on the two analyses below (a rough sketch follows the list):

  • readability/complexity analysis;
    Unpublished work from CQU has identified a negative correlation between the complexity of writing in assignment specifications and course satisfaction.
  • sentiment analysis
    How positive or negative are forum posts etc? The comments and questions on this blog post about a paper using sentiment analysis on MOOC forums provides one place to start.
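A first-step sketch covering both bullets for a single forum post. The textstat package and nltk’s VADER analyser (with the vader_lexicon downloaded) are assumptions about tooling, not something the project has settled on:

```python
# Readability plus sentiment for one (invented) forum post.
import textstat
from nltk.sentiment.vader import SentimentIntensityAnalyzer

post = ("I'm really struggling with this assignment specification. "
        "The requirements are unclear and the deadline feels impossible.")

print("Flesch-Kincaid grade:", textstat.flesch_kincaid_grade(post))
print("sentiment:", SentimentIntensityAnalyzer().polarity_scores(post))
```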

Learning design

The plan here is to focus explicitly on the learning designs within the courses and explore what can be revealed using checkpoint and process analytics as outlined by Lockyer et al (2013).

Course specific

Nothing explicitly planned here. The idea is that the explorations and reflections from each of the above cycles will identify a range of additional course-specific questions that will be dealt with as appropriate.

References

Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. In Curriculum, technology and transformation for an unknown future. Proceedings of ASCILITE Sydney 2010 (pp. 75–86). Sydney. Retrieved from http://ascilite.org.au/conferences/sydney10/procs/Beer-full.pdf

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2015). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68–84. doi:10.1016/j.iheduc.2015.10.002

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist. doi:10.1177/0002764213479367

Rossi, D., van Rensburg, H., Beer, C., Clark, D., Danaher, P., & Harreveld, B. (2013). Learning interactions: A cross-institutional multi-disciplinary analysis of learner-learner and learner-teacher and learner-content interactions in online learning contexts. Retrieved from http://www.dehub.edu.au/wp-content/uploads/2013/07/CQU_Report.pdf


Playing with D3

I’m part of a group that’s trying to take a deep dive into our courses using “learning analytics”. My contribution is largely the technology side of it, and it’s time to generate some pretty pictures. The following is a summary of some playing around with D3.js that ended up with some success with Plot.ly.

Example – box plots

Toe in the water time: can I get an example working? Start with box plots, as I’ve got some data that could fit with that.

The rough process should be something like:

  1. Set up some space on the local web server.
  2. Install the library.
  3. Get the example working.
  4. Modify the example for my data

Step #4 is a bit harder given that I don’t yet grok the d3 model.

Work through some tutorials

Starting with bar charts.

So, some jQuery-like basics: selections, selectors, method chaining.

It appears data joins might be the most challenging of the concepts.

data space (domain) -> display space (range)

Mmm, d3.js is too low level for current needs.

Plot.ly

Plot.ly is an online service driven by a Javascript library that is built upon d3.js and other libraries. It appears to be at the level I’d like. The example works nicely.

Does appear to be a big project.

Ohh nice.  The example looks very appropriate.

A bit of data wrangling and the following is produced.

EDC3100 2015 Course and grades
Even that simple little test reveals some interesting tidbits. Exploring a bit further should be even more interesting.
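For the record, a rough equivalent via Plotly’s Python API. This is an assumed stand-in (the post itself used the Plot.ly JavaScript library), and the clicks-per-student data is invented:

```python
# A grouped box plot of clicks per student, by final grade.
import plotly.graph_objects as go

grades = {"F": [3, 10, 12], "P": [25, 40, 33], "HD": [60, 85, 72]}
fig = go.Figure([go.Box(y=clicks, name=grade) for grade, clicks in grades.items()])
fig.update_layout(title="Clicks per student by final grade (invented data)")
fig.write_html("clicks_by_grade.html")  # open the file in a browser
```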

Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success

What follows is a summary of

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2015). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68–84. doi:10.1016/j.iheduc.2015.10.002

I’ve skimmed it before, but renewed interest is being driven by a local project to explore what analytics might reveal about 9 teacher education courses, especially in light of the QILT process and data.

Reactions

Good paper.

There are connections to the work we’re doing in terms of a similar number of courses (9) and a focus on looking into the diversity hidden by aggregated and homogenised data analysis. The differences are:

  • we’re looking at the question of engagement, not prediction (necessarily);
  • we’re looking for differences within a single discipline/program and aiming to explore diversity within/across a program
  • in particular, what it might reveal about our assumptions and practices
  • some of our offerings are online only

Summary

Gašević et al (2015) looks at the influence of specific instructional conditions in 9 blended courses on success prediction using learning analytics and log data.

A lack of attention to instructional conditions can lead to an over or under estimation of the effects of LMS features on students’ academic success

Learning analytics

There is interest in, but also questions around, the portability of learning analytics.

the paper aims to empirically demonstrate the importance for understanding the course and disciplinary context as an essential step when developing and interpreting predictive models of academic success and attrition (Lockyer, Heathcote, & Dawson, 2013)

Some work aims to decontextualise – i.e. to identify predictive models that can

inform a generalized model of predictive risk that acts independently of contextual factors such as institution, discipline, or learning design. These omissions of contextual variables are also occasionally expressed as an overt objective.

While there are some large scale projects, most are small scale and (emphasis added)

small sample sizes and disciplinary homogeneity adds further complexity in interpreting the research findings, leaving open the possibility that disciplinary context and course specific effects may be contributing factors

Absence of theory in learning analytics – at least until recently. Theory that points to the influence of diversity in context, subject, teacher, and learner.

Most post-behaviorist learning theories would suggest the importance of elements of the specific learning situation and student and teacher intentions

Impact of context – Mentions Finnegan, Morris and Lee (2009) as a study that looked at the role of contextual variables and finding disciplinary differences and “no single significant predictor shared across all three disciplines”

Role of theoretical frameworks – argument for benefits of integrating theory

  • connect with prior research;
  • make clear the aim of research designs and thus what outcomes mean.

Theoretical grounding for study

Winne and Hadwin’s “constructivist, meta-cognitive approach to self-regulated learning”:

  1. learners construct their knowledge by using tools (cognitive, physical, and digital);
  2. to operate on raw information (stuff given by courses);
  3. to construct products of their learning;
  4. learning products are evaluated via internal and external standards
  5. learners make decisions about the tactics and standards used.
  6. decisions are influenced by internal and external conditions

Leading to the proposition

that learning analytics must account for conditions in order to make any meaningful interpretation of learning success prediction

The focus here is on instructional conditions.

Predictions from this

  1. Students will tend to interact more with recommended tools
  2. There will be a positive relationship between students’ level of interaction and the instructional conditions of the course (tools with a high frequency of use will have a large impact on success)
  3. The central tendency will prevail so that models that aggregate variables about student interaction may lead to over/under estimation

Method

Correlational (non-experimental) design. 9 first-year courses that were part of an institutional project on retention. Participation in that project was based on a discipline-specific low level of retention – a quite low 20% (at least to me). 4134 students from 9 courses over 5 years – not big numbers.

Outcome variables – percent mark and academic status – pass, fail, or withdrawn (n=88).

Data based on other studies and availability

  • Student characteristics: age, gender, international student, language at home, home remoteness, term access, previous enrolment, course start.
  • LMS trace data: usage of various tools; some continuous, some lesser-used tools treated as dichotomous and then categorical variables (reasons given)

Various statistical tests and models were used.

Discussion

Usage across courses was variable, hence the advice (p. 79):

  1. there is a need to create models for academic success prediction for individual courses, incorporating instructional conditions into the analysis model.
  2. there must be careful consideration in any interpretation of any predictive model of academic success, if these models do not incorporate instructional conditions
  3. particular courses, which may have similar technology use, may warrant separate models for academic success prediction due to the individual differences in the enrolled student cohort.

And

we draw two important conclusions: a) generalized models of academic success prediction can overestimate or underestimate effects of individual predictors derived from trace data; and b) use of a specific LMS feature by the students within a course does not necessarily mean that the feature would have a significant effect on the students’ academic success; rather, instructional conditions need to be considered in order to understand if, and why, some variables were significant in order to inform the research and practice of learning and teaching (pp. 79, 81)
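A toy illustration of conclusion (a), assuming invented data and sklearn: a pooled model reports one moderate forum coefficient, while per-course models show the effect belongs almost entirely to one course.

```python
# Pooled vs per-course success prediction; data and effect sizes are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "course": ["A"] * 6 + ["B"] * 6,
    "forum":  [0, 1, 2, 4, 5, 6,  0, 1, 2, 4, 5, 6],   # forum use per student
    "passed": [0, 0, 0, 1, 1, 1,  1, 1, 0, 1, 0, 1],
})

# One model over everything hides the per-course differences...
pooled = LogisticRegression().fit(df[["forum"]], df["passed"])
print("pooled forum coefficient:", pooled.coef_[0][0])

# ...which show up when each course gets its own model.
for course, sub in df.groupby("course"):
    model = LogisticRegression().fit(sub[["forum"]], sub["passed"])
    print(course, "forum coefficient:", model.coef_[0][0])
```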

Closes out with some good comments on moving students/teachers beyond passive consumers of these models and the danger of existing institutional practice around analytics having decisions be made too far removed from the teaching context.


First steps in integrating LATs OER into Moodle open book

The following documents initial explorations into leveraging the two Learning Activity Types (LATs) short courses that have been released as Open Educational Resources (OERs) by Hofer and Harris (2016). As outlined in a prior post my plan is to use these OERs as a test case for the Moodle open book project. The aim being to

  1. Test and identify areas for improvement for the Moodle open book code using a real OER.
  2. See how/if this OER could be leveraged for the course I teach and related courses coming down the line.
  3. Eventually explore how and if this work might connect with broader work around OERs, potential work here around Open Educational Practices and teacher education, and work to encourage adoption and adaptation of the LAT OER.

What’s been done

  1. The Elementary short course IMS package has been successfully imported into Moodle (first image below).
  2. The web-based Elementary course has been converted into a Moodle Book resource (second image below).
  3. The Moodle Book version of the Elementary course has been exported to GitHub using the Moodle open book tool (third image below).

All that was done quite easily. A couple of minor bugs to report, but nothing major.

Next steps are (not necessarily in order)

  1. Improve the structure/scaffolding of some of the pages with multiple videos.
  2. Explore if the video transcripts can be usefully integrated into the pages.
  3. Modify the Moodle book tool to produce an HTML version that is more immediately usable (e.g. a nice “book” interface, rather than a single long web page).
  4. Figure out when/if we might use this in EDC3100.

Random questions, outcomes, and future work

  1. Could a resource like this be integrated into multiple courses within the BEdu?
    Hofer and Harris (2016) suggest that this is a possibility with the LAT OER.
  2. How might the collection of student written lesson plans be more broadly contributed to and used?
  3. Can the experience students go through be captured, shared, and leveraged in some useful way?

What’s in the LAT short course OERs?

The first step is to figure out the content and format of the LAT OERs to gain some idea of how and if they fit with the Moodle open book and my course.

The aim of the broader Learning Activity Type (LAT) work and the OER short courses is to help pre-service teachers develop the knowledge required to integrate digital technologies into their teaching. Learning Activity Types offer a taxonomy of learning activities, linked to specific learning areas, that are used during lesson planning.

There is one short course each for primary (elementary) and secondary pre-service teachers. Each course is divided into eight sequential modules that include videos and transcripts. The sequence is:

  1. Reflect on prior experience with digital technologies in learning and teaching and what worked/didn’t.
  2. Select three lessons from a collection of lesson plans written by other pre-service teachers.
  3. Analyse these sample lessons and their learning objectives, learning activities used, student assessment, and use of digital and non-digital technologies.
  4. Given a single demonstration lesson plan, replace learning activities that don’t match the objectives with those that do.
  5. Consider and explain the replacement of technologies in the demonstration lesson plan with others.
  6. Review portions of interviews with experienced teachers about making similar changes.
  7. Returning to the sample lessons, choose the LAT taxonomy that matches and explore it.
  8. Use that to think about substitution of learning activities and technologies in the sample lessons, considering a range of factors and engaging in discussions.
  9. Create their own lesson plan based on a range of considerations.
  10. Evaluate their lesson plan with two self-tests called “Is it worth it?”

Related resources include

  1. Blackboard-produced content package files – Elementary and secondary.
  2. Websites with the short courses – Elementary and secondary.
  3. Various media files – Elementary and secondary.
    This includes PowerPoint, video, caption, script, sample lesson, and various student guide files.
  4. Instructor guide.
  5. Related share-alike materials.
  6. Collection of student written lesson plans.

Let’s try the IMS package

The LAT OERs are provided as IMS packages, which should import directly into Moodle. Not exactly what I want to do here, but worth a look.


Well that worked quite smoothly.  Add the resource, import the file and there it is.

The layout/interface isn’t that nice (very subjective, and I’m likely biased). It starts with an introductory video giving context/background on the courses. The interface (at least in Moodle) hides the “play” button.

Nice, the video makes mention of instructor-provided discussion forums. Mmm, a video playback issue – is the video remote or local?

Ahh, a broken link. The IMS version links back to Blackboard. The equivalent web version has the open link. That seems to apply to all of the pages.

Each page has a video, and the transcript is available. Wonder if I can integrate both, and whether there’s any value in doing so. I know I’d probably prefer the text version.

The lesson design page is a bit busy.

Tasks and questions

  1. Where are the videos located here? Part of the Moodle LMS?
    Perhaps same as the web site versions?
  2. Is the “hidden” play button a Moodle interface problem? Can it be fixed?
  3. Report the UMW Blackboard link for the sample lesson plan on “Analyse existing lessons” in elementary
  4. Do some form of automated link check etc.? (a small sketch follows this list)
  5. Can/should the various guides be converted into HTML?
  6. Might some of the pages (e.g. lesson design) be better scaffolded/labelled?
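On task 4 above, even a tiny script gets most of the way there. The requests package and the URL are assumptions on my part; real URLs would first be scraped from the course pages:

```python
# A minimal automated link check; the URL list is hypothetical.
import requests

urls = ["http://example.com/sample-lesson-plan"]  # would be scraped from the pages
for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = type(exc).__name__  # record the failure instead of a status code
    print(url, status)
```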

What might it look like?

Converting the two courses into Moodle Books looks like it would be fairly straightforward. Each book could be integrated into the EDC3100 study desk fairly easily. One modification might be the integration of the script of the videos into each page, to support those who want to read, but also to enable the use of search engines.

Initial plan – straight cut-and-paste

Create a bog-standard Moodle Book with just the current content of the Elementary course.

Provide a better feel for the content and how it goes in the book. Identify any issues and ideas. Provide a concrete version on GitHub for later experimentation.

Add the video transcripts

Can this be done?

Did it work?

Straight cut-and-paste

The process is:

  1. Use Firebug to copy and paste the content from the Elementary course.
  2. Import it into the Book module.
  3. Link it to GitHub.

Here’s the Moodle book equivalent of the above.


Misc observations

  • HTML isn’t using headings.
    Welcome page changes font size.
  • Nor is it using paragraphs – lots of line-break tags instead.
  • The web versions and the IMS version use different titles for pages
  • Do I need to include the welcome? If so, I need to fix the images for the college and license.
  • Fix the warnings with the GitHub tool.
  • Couldn’t create a folder with a space.

GitHub version

Background: The Moodle open book tool enables the Moodle Book module to export/import content to/from GitHub, adding all the benefits of GitHub. It does this by combining all the pages in a Moodle book into a single HTML file that is placed onto GitHub – a file that can be split up again and used to modify a Moodle book.

The tool is still under development, but it does work.
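To illustrate that combine/split round trip (the real tool is PHP inside Moodle; the chapter markers and structure below are invented):

```python
# Toy version of the single-file round trip the tool performs.
import re

chapters = {"Welcome": "<p>Hi</p>", "Identify existing lessons": "<p>...</p>"}

def combine(chapters):
    """Join every chapter into one HTML string with recoverable boundaries."""
    return "\n".join(f'<div class="chapter" title="{t}">\n{b}\n</div>'
                     for t, b in chapters.items())

def split(html):
    """Recover the chapters from the combined file."""
    return dict(re.findall(r'<div class="chapter" title="(.*?)">\n(.*?)\n</div>',
                           html, re.S))

assert split(combine(chapters)) == chapters  # lossless round trip
```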

Here’s the GitHub HTML file produced by the Moodle open book tool that contains the LAT Elementary course. It’s based entirely on the Moodle Book version of the course I created in the previous step. You can see the file as a web page via this link. The image below is a screenshot of that web page. You  can just see the second page (Identify existing lessons) peeking up below.

