Blog post 2 Colloquium 2 – Guest speaker Simon Welsh

Learning analytics

Guest speaker Simon Welsh, Manager of Adaptive Learning and Teaching Services at Charles Sturt University, provided an overview of the use of learning analytics to track, evaluate and support the academic performance of students in higher education. Universities collect and store large amounts of data comprising student admission files, interactions on LMSs, library records and other university systems. Learning analytics can mine some of this data to help institutions track engagement, performance and progress, measure the impact of online course design and plan future learning support. But this is only the tip of the iceberg when we consider tracking student learning in online learning environments.

Tip of the iceberg © Presenter Media, 2015

Welsh categorised this data mining in online learning environments into two main streams: academic analytics, which mines data to support the management of students, staff and institutions, and learning analytics, which mines data capturing students’ interactions in online learning environments to support and inform learning and teaching processes. The focus of such data gathering has often been to track and monitor ‘at risk’ students in order to reduce student failure and dropout rates. The driver behind this monitoring is primarily to reduce student attrition and thereby protect institutional revenue.

Welsh observed that the typical institutional focus has been on risk and engagement reporting systems or activity reporting systems. These systems apply ‘big data’ techniques to behavioural and outcome data. Optimally, this approach should support predictions of upcoming challenges for both institutions and students. Presumably, departments would mine for patterns of engagement to identify program strengths, resource deficiencies and shifting marketplace demand for courses. At the institutional level, data mining would help to align resources to support program initiatives; at the individual student level, it would reveal engagement and interaction within the online learning environment.
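To make the risk-reporting idea concrete, here is a minimal sketch of how an ‘at risk’ prediction from LMS engagement data might work. The features, training data and 0.5 threshold are all invented for illustration; Welsh did not present an implementation.

```python
# Illustrative sketch only: flagging 'at risk' students from LMS
# engagement counts. Features, data and the 0.5 threshold are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [logins_per_week, forum_posts, resources_viewed]
# Label: 1 = student later withdrew, 0 = completed
X = np.array([[12, 5, 40], [1, 0, 3], [8, 2, 25],
              [0, 0, 1], [10, 4, 30], [2, 1, 5]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Score a new student and flag them if predicted risk exceeds 0.5
new_student = np.array([[3, 0, 6]])
risk = model.predict_proba(new_student)[0, 1]
print(f"Predicted attrition risk: {risk:.2f}",
      "-> flag for follow-up" if risk > 0.5 else "-> no flag")
```

Note that a model like this only ever sees behavioural counts, which is exactly the data-quality problem raised below.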

Challenges

Welsh turned the discussion toward the challenges of current institutional approaches to learning analytics. Specifically:

  • Data quality: What type of data are learning analytics capturing? Is it the quantity of interactions (clicks and keystrokes) or the quality of learning? Participation in a discussion forum does not guarantee that rich learning or collaboration has taken place (a toy illustration follows this list).
  • Disconnect between learning theory and analytics: over-reliance on inductive analytic methods (‘big data’) with limited or no connection to learning theory. Learning is viewed too much as a behaviour (clicks on a topic or discussion forum) rather than a process (what is really happening when the student clicks on that topic).
  • Vendor driven: vendors define what is important, because they design the algorithms. Proprietary systems and marketplace competition have created closed systems which do not integrate with other systems (e.g. data migration to a new LMS is expensive and problematic, and support for collaborative online technologies such as social media plugins is low, although modern LMS environments such as Canvas and Desire2Learn do better).
  • Ethics: ethical dilemmas such as who owns the data and whether students should be profiled. Should there be private spaces where students can collaborate with each other without being tracked?
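As a toy illustration of the data-quality challenge, the snippet below shows how two students with identical forum post counts can have very different depths of contribution. The posts and the word-count proxy are invented for illustration.

```python
# Toy illustration of the data-quality challenge: equal 'participation'
# counts can hide very different engagement. Posts are invented.
posts = {
    "student_a": ["I agree.", "Me too.", "Thanks!"],
    "student_b": [
        "Building on Kim's point, the model breaks down when...",
        "Here is a counter-example from my own workplace...",
        "Has anyone compared this with Siemens' framing of analytics?",
    ],
}

for student, messages in posts.items():
    count = len(messages)  # what a typical LMS activity report shows
    avg_words = sum(len(m.split()) for m in messages) / count
    print(f"{student}: {count} posts, ~{avg_words:.0f} words per post")
# Same post count, very different depth of contribution.
```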

Other significant challenges include the inability of analytics to capture learning which takes place outside the LMS (Long & Siemens, 2011; Duval, 2013, https://youtu.be/LfXDzpTnvqY), for example, activity and learning that takes place via personal learning networks (PLNs) in online communities of practice such as Facebook, Twitter, blogs, online gaming communities, MOOCs and YouTube.

Way forward

The discussion then turned toward a re-conceptualisation of adaptive learning and teaching: rather than adapting learning and teaching to whatever data the analytics produce, learning design should create learning experiences which generate the ‘right’ analytics in the first place.

Quantified self

This webinar inspired me to do some research into the topic of the quantified self. The rapid development of ubiquitous technology, such as wearable devices, is providing us with ways of self-tracking and monitoring ourselves. Improvements in human biometric monitoring have seen a rapid uptake of self-monitoring using mobile devices to track everything from our diet and exercise to spending, sleeping and geo-locations (Wolf, 2010, http://www.ted.com/talks/gary_wolf_the_quantified_self?language=en). Can learning analytics be as mobile-responsive and individualised, tracking, monitoring and adjusting the way we engage with learning content beyond the LMS, across multiple sites and platforms (Long & Siemens, 2011)? Our uptake of wearable technologies provides real-time feedback promoting ‘…self-knowledge through self-tracking with technology as the enabler…’ (Constantini, 2014, https://www.youtube.com/watch?v=FESv2CgyJag).

 

Wolf (2012), in an online seminar Are people machines? Lessons from the quantified self, discussed the potential of self-tracking technology to provide quasi-reinforcement. Wolf describes wearable technologies as Skinnerian machines: technologies which reinforce or extinguish behaviours. This links with Welsh’s comment that analytics capture behaviour, not learning processes. Current learning analytic approaches track behaviour and are designed to reinforce behavioural interactions (e.g. a badge, token or tick indicating progress through content or participation in a forum) without measuring the quality of engagement or participation.

 

Tip of the iceberg

Erik Duval (2013, https://youtu.be/LfXDzpTnvqY) takes the discussion further by looking at the rapidly expanding world of online learning where learning is open. Adopting the stance that learning analytics should empower the learner, he contrasts the popularity of MOOCs with traditional learning within universities, arguing that MOOCs remove barriers and create open networks for authentic learning. Duval teaches a MOOC of 3,000 students at the Katholieke Universiteit Leuven, working on authentic problems in an open learning environment. Coursework designed around ‘fake problems’ in a classroom, critiqued only by the lecturer, is replaced with open communication on the web, where student learning is visible to anyone and open to external critique and comment. Duval comments that this changes the dynamics of the ‘classroom’: students are working on solving problems with ‘real people’.

In 2013, Stanford University’s Computer Science 101 course on Coursera registered 300,000 students. Learning analytics from online course providers such as Coursera and edX capture participation in a global learning space.

 

‘Very little of the relevant activities of the student go on in the Learning Management System. It’s like very little learning goes on in the classroom. And if you want to get a real picture of what is really happening in their heads, you should go beyond what is in the Learning Management System.’ (Duval, 2013)

Learning analytics capture ‘digital exhaust’, such as clicks, blog posts and participation in discussion forums, providing a detailed picture of the behaviour of students in the MOOC learning environment. Duval’s intention is to capture the relevant activity of students and push the learning analytics further by ‘wiring’ the students with sensors. He comments that this sort of ‘digital exhaust’ capture is not the same as the analytics that LMSs record: very little relevant learning activity happens within the LMS, whereas MOOCs capture data about student learning in open, authentic learning environments. Duval also discusses how analytics can be used to steer student learning, which he describes as both ‘interesting’ and ‘dangerous’. Large student cohorts enable educational data mining that compares a student’s behaviour with that of other ‘like’ students to provide feedback on performance and steer learning, but this is based on generated sets of hypotheses. For example, “Your behaviour seems similar to student B”, from which the system can predict the outcome and recommend a particular course of action.
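Here is a minimal sketch of the kind of ‘like student’ comparison Duval describes, assuming activity is reduced to simple count vectors. The profiles and the choice of cosine similarity are my own illustrative assumptions, not Duval’s actual system.

```python
# Illustrative sketch (not Duval's actual system): find the enrolled
# student whose 'digital exhaust' profile is most similar to yours.
import numpy as np

def cosine_similarity(a, b):
    """Similarity of two activity vectors, ignoring overall volume."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical activity counts: [clicks, blog_posts, forum_replies]
profiles = {
    "student_b": np.array([130, 5, 14]),
    "student_c": np.array([10, 0, 1]),
    "student_d": np.array([60, 12, 2]),
}

me = np.array([125, 4, 16])
nearest = max(profiles, key=lambda s: cosine_similarity(me, profiles[s]))
print(f"Your behaviour seems similar to {nearest}.")
```

A system like this could then look up how the ‘nearest’ student fared and recommend a course of action, which is precisely where the steering becomes both interesting and dangerous.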

Duval comments that systems can now be built which guide students every ten or fifteen minutes and tell them what to do. However, the danger is that students taught with this type of ‘intelligent’, adaptive, individualised instruction graduate dependent on a piece of software which has guided their learning. This does not develop the 21st century ‘C’ skills: it does not teach them how to be creative, communicative or collaborative.

So how do Duval and his peers leverage the power of learning analytics to help student learning? Using a participatory design approach, the students have designed and built dashboards to track and monitor their own progress. The dashboard is a mobile app which records and displays the interactions that are significant to the student and suggests the actions they need to take. Duval compares this with a traditional LMS, which only provides students with their ‘administrative’ details. Even modern LMSs such as Desire2Learn provide dashboards which give the academic a picture of student performance but offer limited feedback to the student (https://documentation.desire2learn.com/en/understanding-student-dashboard). Duval provides an innovative example where learning analytics have been integrated with the physical environment, providing students with real-time feedback on their communication behaviours (Bachour, Kaplan & Dillenbourg, 2010).
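A minimal sketch of how such a student-facing dashboard might aggregate interaction events into a summary and a suggested next action is below; the event types, threshold and suggestion text are hypothetical, not taken from the students’ actual app.

```python
# Hypothetical student-facing dashboard summary: count interaction
# events and suggest a next action. Event kinds and the rule are invented.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    student: str
    kind: str  # e.g. "blog_post", "forum_reply", "resource_view"

def dashboard_summary(events, student):
    """Count one student's interactions and suggest a next action."""
    counts = Counter(e.kind for e in events if e.student == student)
    suggestion = ("Post a reflection on your blog this week"
                  if counts["blog_post"] == 0
                  else "Keep up the regular blogging")
    return counts, suggestion

events = [
    Event("hyacinth", "forum_reply"),
    Event("hyacinth", "resource_view"),
    Event("hyacinth", "resource_view"),
]
counts, suggestion = dashboard_summary(events, "hyacinth")
print(dict(counts), "->", suggestion)
```

The design point is that the feedback loop closes with the student, not the academic: the learner sees their own activity and what to do next.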

 

Implications for my professional practice

As a learning designer, the challenge is not only to incorporate feedback from learning analytics data about student interactions within the LMS to inform instructional design practices. It is also to design authentic activities for open learning spaces, for example social media platforms such as Twitter, wikis and blogs, Facebook, online gaming communities, MOOCs and YouTube, and then ‘steer’ students to reflect on and document their learning in the closed LMS environment so that activity outside the LMS is captured. Social curation tools such as Diigo and Pinterest also provide opportunities for designing learning in open spaces, while Google Docs provides opportunities for collaborative critiquing and co-construction of knowledge in authentic environments. I will also seek professional development opportunities at work to explore the use of wearable technology and/or mobile devices to capture data in a quantified self approach to learning design, specifically where learning activities are designed for 21st century skill acquisition (creativity, communication, collaboration), empowering the learner with self-knowledge through self-tracking via emergent mobile and wearable technologies.

 

References

Bachour, K., Kaplan, F. & Dillenbourg, P. (2010). An interactive table for supporting participation balance in face-to-face collaborative learning. IEEE Transactions on Learning Technologies, 3(3), 203-213.

Constantini, L. (2014). The quantified self: How wearable sensors expand human potential: TEDxMileHigh [Video file]. Retrieved from https://www.youtube.com/watch?v=FESv2CgyJag

Duval, E. (2013). Open learning analytics [Video file]. Retrieved from https://youtu.be/LfXDzpTnvqY

Long, P. & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, September/October.

Welsh, S. (2015, July 27). Learning analytics: A traveller’s guide. In Digital futures colloquium. Retrieved from https://connect.csu.edu.au/p54oghdkhtf/?launcher=false&fcsContent=true&pbMode=normal

Wolf, G. (2010). The quantified self [Video file]. Retrieved from http://www.ted.com/talks/gary_wolf_the_quantified_self?language=en

Wolf, G. (2012). Are people machines? Lessons from the quantified self [Video file]. Retrieved from https://youtu.be/zw4sg7pCQTc

3 comments on “Blog post 2 Colloquium 2 – Guest speaker Simon Welsh”
  1. Hyacinth, what a very interesting piece you have written. Very thought provoking, especially the implications. I wonder, if we could get this environment set up, would that mean the extinction of NAPLAN/HSC/VCE? Would they be relevant any more? The concept of knowledge creation and curation is becoming more and more fascinating and, in my opinion, important. I don’t think we seek out the opportunities for our students to do this enough so that it becomes part of the learning experience that is valued.
    It is going to be very interesting to see how far learning analytics gets within the area of education.

  2. Hyacinth, sorry it has taken me so long to respond to this blog post. I really enjoyed reading your in-depth explorations and thoughts about using learning analytics, and in particular your thoughts about how this can inform learning design. The use of an LMS is contentious, and in contrast to the flexibility and feasibility of Web 2.0 environments there is much to be done with design of spaces to meet both learning and institution needs.

  3. Thanks for such an informative post Hyacinth. Your explanation and discussion of the issues surrounding the use of data were helpful to understand how complex the situation is. Also, the links to Duval and his peers’ work are valuable for further investigation into the questions about balancing the ethical use of data for the better learning outcomes of learners.
