From Should Be, Could Be To Data-driven Adaptive Learning


With the advent of Web 2.0 technologies and all they encompass, active participation through connecting, creating and collaborating is key. The initial readings by Selwyn (2010) identify the need to reflect on what has been achieved, and to move beyond the brink of educational technologies – what they should be or could be – to actually using the technologies as they are intended.

The idea of learning analytics and data mining was introduced via a colloquium given by Simon Welsh. Anyone who participates online via any social media platform would be naive to think that there is not some sort of tracking and data gathering behind every click, just as it would be naive to think that those who engage with shops via VIP cards and frequent shopper schemes are not being tracked through the purchases they make. Data is everywhere and yes, our lives are being scrutinised in the interest of big business. Who is analysing the data, and how that data is being used, is the ‘grey’ area, and it is where many people could and should ask questions about privacy. Are the offerings made through data analysis something the consumer really wants, or are they being ‘forced’ upon them, and is that indeed an ethical space to be in?

Now apply this idea of data mining and analysis to education and you have learning analytics. The purpose of learning analytics is to track a student’s pathway through their learning. The Society for Learning Analytics Research (SoLAR) defines learning analytics as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.” The questions and thoughts that arose for me when participating in this colloquium were: who is considered to be the expert in analysing and deciding how the data is used? What makes them the expert in determining what is best for that particular student? Who drives the data and then ultimately takes responsibility when the learning interventions are not what the student wants or needs? Where does motivation fit into learning analytics?

One way of gauging a student’s engagement was said to be the number of clicks and interactions in the LMS, but there are many educational technologies beyond the LMS that some students use in order to learn. It just seems very black and white at the moment, and seems to be reverting to a pedagogy that is content driven rather than considering that success in learning can be determined by a number of factors. Simon Buckingham Shum, Director of the Connected Intelligence Centre at UTS, states that we “need to be careful that the learning analytics do not impose a pedagogy or a mindset that is counter to where we are trying to take our schools or universities.”
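The black-and-white nature of click counting can be sketched as a toy example. Everything below – the function names, the threshold, the event lists – is a hypothetical illustration of the general idea, not any real LMS’s implementation:

```python
# Hypothetical sketch of a click-count engagement metric, as critiqued above.
# Names and the threshold are illustrative assumptions, not a real LMS API.

def engagement_score(lms_events):
    """Count LMS clicks/interactions as a crude proxy for engagement."""
    return len(lms_events)

def flag_at_risk(students, threshold=20):
    """Label students 'at risk' purely on LMS click counts."""
    return {name: engagement_score(events) < threshold
            for name, events in students.items()}

students = {
    "uses_lms_heavily": ["click"] * 50,
    # Learns via books, videos and study groups: invisible to the LMS log.
    "learns_elsewhere": ["click"] * 3,
}

print(flag_at_risk(students))
```

The second student is flagged as at risk despite possibly learning effectively outside the LMS, which is exactly the content-driven blind spot the paragraph above describes.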

As a primary school teacher librarian I have experienced the introduction of various ways to track students, such as NAPLAN data, data walls, and literacy and numeracy continuums, and one of the dangers we are constantly warned about is teaching to the test or the continuum. Human nature being what it is, though, some teachers see data about student learning as something that will be used against them as a judgement of their abilities. The idea of adaptive learning is that the data is used to meet the student at their point of need and to differentiate the content to suit that individual student. While I remain apprehensive and pensive towards the field of learning analytics, particularly in a primary school setting, I can see this concept as a further way to evidence student learning.

The nature of learning analytics and data mining remains a concept that I will continue to reflect on, and at the moment I can honestly say I need a lot more professional development to develop my understanding here. Learning analytics seems to be offering the ‘secret sauce’ (Sharkey, 2014), but I agree that its benefits can only be determined through the ‘ability to execute.’ The aim of education is always to give our students a boost to help them move forward and meet their information needs, but when factors such as skill set (digital, literacy, intra- and interpersonal skills), mindset and motivation (just to highlight a few) are considered, do learning analytics meet students at their humanity, or is the focus too much on what the machine is generating?




Buckingham Shum, S. (2015). CIC: The future of learning. Learning Analytics.

Selwyn, N. (2010). Looking beyond learning: notes towards the critical study of educational technology. Journal of Computer Assisted Learning, 26(1), 65–73. DOI: 10.1111/j.1365-2729.2009.00338.x

Selwyn, N. (2014). Education and ‘the digital’. British Journal of Sociology of Education, 35(1), 155-164. DOI: 10.1080/01425692.2013.856668.

Sharkey, M. (2014).


4 thoughts on “From Should Be, Could Be To Data-driven Adaptive Learning”

  1. Hi Michelle. I relate strongly to your final comment about the importance of including humanity within learning analytics. I heard Dr Simon Breakspear speak recently, asking the audience “How human is your digital learning strategy?” I think this applies to learning analytics also, reminding us that the purpose of using technology in learning analytics is to create a deeper and better understanding of teaching and learning processes. We need to always be mindful that there is a person who has generated the data we are analysing – “put a face to the data”, as Sharratt and Fullan would say.

    • Hi Jo,

      Thanks for taking the time to connect and comment, and thanks for the link. It’s interesting reading about the recent release of NAPLAN results. It’s all about the student numbers. The general media are reporting that despite increased funding there has been no movement. Perhaps teachers and students are still trying to get their heads around new and increased expectations in the curriculum. Perhaps funding needs to be used for resources that schools do not have, given the introduction of digital technologies and new literacies. The data is still limited in the picture it presents.

      I can see the benefits of this area of learning analytics, but I am still apprehensive about how the data is used and who is considered to be the expert.

  2. Hi Michelle,
    Wonderful reflection on learning analytics. Like you, I have a lot more to find out about it. Your questions resonated with me too: the whole idea of who is ‘an expert’ to begin with, across the field of digital scholarship but also within learning analytics, and, as you say, ‘what makes them the expert in determining what is best for a particular student’. As a fellow TL, while I have used data from the library management database regarding borrowing stats, etc., I do question their validity. Perhaps they give one ‘snapshot’ of the culture of reading within the library, but they also have to be considered skewed. I think I need to consider how (and why) for my own professional practice next. A lot to think about. I have also bookmarked a number of your resources for tomorrow’s reading – so big thanks!

    • Hi Yvonne,
      Thanks for your comments. I agree that the data we get through our Library Management Systems is about the borrowing, but does it give us a true indication of the user as a reader? Perhaps we can tell that they favour fantasy over mystery, or fiction over non-fiction, but until I hear that student read, or ask for a summary or some sort of response, I can’t actually make a complete judgement of that student’s ability as a reader. That’s why analytics possibly also needs to be a co-operative and collaborative concept. I am loving all the questions that this course has raised for me.
