Learning analytics, quality indicators and meso-level practitioners

“failure” (CC BY 2.0) by tinou bao

When it comes to research I’ve been a bit of a failure, especially when measured against some of the more recent strategic and managerial expectations. Where are those quartile 1 journal articles? Isn’t your h-index showing a downward trajectory? The concern generated by these quantitative indicators not only motivated

Continue reading Learning analytics, quality indicators and meso-level practitioners

Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success

What follows is a summary of Gašević, D., Dawson, S., Rogers, T., & Gašević, D. (2015). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68–84. doi:10.1016/j.iheduc.2015.10.002. I’ve skimmed it before, but renewed interest is being driven by a local

Continue reading Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success

Helping teachers "know thy students"

The first key takeaway from Motz, Teague and Shepard (2015) is that “Learner-centered approaches to higher education require that instructors have insight into their students’ characteristics, but instructors often prepare their courses long before they have an opportunity to meet the students.” The following illustrates one of the problems teaching staff (at least in my institution)

Continue reading Helping teachers "know thy students"

Dashboards suck: learning analytics' broken metaphor

I started playing around with what became learning analytics in 2007 or so. Since then, any time “learning analytics” is mentioned in a university there’s almost an automatic mention of dashboards. So much so that I was led to tweet:

@s_palm Well everyone knows that “real” LA requires a dashboard — Don Quixote Jones (@djplaner) June

Continue reading Dashboards suck: learning analytics' broken metaphor

Revisiting the IRAC framework and looking for insights

The Moodlemoot’AU 2015 conference is running working groups, one of which is looking at assessment analytics — in essence, thinking about what can be done in the Moodle LMS code to enhance assessment. As it happens, I’m giving a talk during the Moot titled “Four paths for learning analytics: Moving beyond a management fashion”.

Continue reading Revisiting the IRAC framework and looking for insights

Reading – Embracing Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge

The following is a summary of, and some ad hoc thoughts on, Macfadyen et al (2014). There’s much to like in the paper. But the basic premise I see in the paper is that the fix for the inappropriate teleological processes currently used in institutional strategic planning and policy setting is an enhanced/adaptive teleological process.

Continue reading Reading – Embracing Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge

The four paths for implementing learning analytics and enhancing the quality of learning and teaching

The following is a placeholder for two related presentations: “Four paths for learning analytics: Moving beyond a management fashion”; and an extension of Beer et al (2014) (e.g. there are four paths now, rather than three) that’s been accepted to Moodlemoot’AU 2015. “The four paths for implementing learning analytics and

Continue reading The four paths for implementing learning analytics and enhancing the quality of learning and teaching
