INF537: The 2-sigma problem revisited

Introduction

In traditional university settings, students interact directly with academics through lectures and tutorials, giving academics the potential to gain deep insights into how individual students learn and to improve their learning outcomes. In online environments, learning analytics functionality is increasingly providing similar insights. However, a great deal of online learning takes place outside the LMS. While it may be possible to collate data on student activity both inside and outside the LMS, this research examines current practices around that data and whether it is meaningful beyond basic indicators such as participation or some measure of engagement.

The challenge for this study has its origins in Bloom’s 2 Sigma Problem (Bloom, 1984, p.4). Bloom investigated studies in which cohorts of students were taught using three different conditions:

  • Conventional: a class size of about 30 students with periodic tests.
  • Mastery Learning: a class size of about 30 students with the same instruction, usually by the same teacher, but with formative assessment, feedback and corrective actions.
  • Tutoring: students assigned to a good tutor in groups of up to 3 students.

Students in the Mastery Learning cohort typically obtained results one standard deviation higher than those taught using the conventional approach, while students in the tutoring cohort typically obtained results two standard deviations higher. Bloom referred to this as the ‘2 sigma problem’ and sought to devise teaching approaches through which students taught in group conditions could achieve the same kinds of results as those taught under tutoring conditions.

The impact that individual tutoring had was clear, and the problem that Bloom sought to solve was one of scale; but scale has since increased by orders of magnitude over the numbers he and his team investigated. We now have classes (e.g. in MOOCs) running potentially into the thousands (Knox, 2014, p. 166). Additionally, educators are using not just the online tools provided by their institutions but general web services as well, albeit with varying degrees of success (Bennett, Bishop, Dalgarno, Waycott, & Kennedy, 2012, p. 533). Learning and teaching are taking place well beyond what may traditionally have been considered a classroom environment.

Usage of each of these technologies leaves behind digital trails, or footprints (Gasevic, 2015). In his presentation to the World Conference on E-Learning 2015, Gasevic defines learning analytics as the measurement, collection, analysis and reporting of data about learners and their contexts, for the purpose of understanding and optimizing learning and the environments in which it occurs. One of the key areas of research for this study is investigating whether and how learning analytics might help online learning attain levels of efficacy similar to those achieved through tutoring or, as Siemens, Dawson & Lynch (2013) express it, ‘to meld the seemingly mutually exclusive variables of quality and scale into the increasingly pervasive online learning environments of the future’.

Method

This case study begins by examining what variables contribute towards fulfilling and successful learning experiences in on-campus higher education settings. It then examines a number of learning analytics implementations to determine the extent to which learning analytics is ‘enabling the understanding and optimizing of online learning’, particularly in the context of developing and supporting approaches that aim to emulate in some way the richness of on-campus learning experiences. Finally, the study examines the associated challenges and opportunities, and how technologists and learning technologists are responding, or can respond, to meet those challenges. The study seeks to identify potential ways forward for learning analytics.

Case Study Review

In his research, Bloom (1984, p. 6) identifies a number of variables with a high impact on learning, including personalised instruction (tutoring), reinforcement, corrective feedback (formative assessment), cues and explanations, student classroom participation, student time on task, improved reading/study skills, and cooperative learning. Boyer (1992, p. 90) identifies the ‘scholarship of teaching’, the ability to inspire future scholars, as one of the key properties of a scholar. Yair examines this further, identifying a number of teacher-related properties which yield rich learning experiences, including the ability to motivate students, symmetric student-teacher relations and personalisation (Yair, 2008, p. 452). Yair also found that teachers highly regarded by their students were passionate and had a high degree of integrity, and that many such teachers intertwined that passion with great teaching skills and knowledge (Yair, 2008, p. 455).

While the intent of this study is to gain insights into strategies that can transfer successful properties of on-campus learning to online learning experiences using learning analytics, it is useful to examine blended learning, a combination of the two, to see what insights can be drawn from that approach. Citing limitations in the social aspect of learning in e-learning environments, Akkoyunlu & Soylu (2008, p. 183) investigated differences in learning styles based on Kolb’s Learning Style Inventory (as cited in Akkoyunlu & Soylu, 2008, p. 185) in blended learning environments. The research revealed that ‘divergers’ found the face-to-face components of blended learning most valuable, while ‘assimilators’ preferred the structure of the online environment. In well-designed environments, however, there was no significant difference in academic results across the different learning styles (Akkoyunlu & Soylu, 2008, p. 190).

Al-Hebaishi (2012, p. 379) compared blended and fully online instruction and found a significant difference in results, with the blended learning approach achieving the better outcomes. Instruction for both groups was delivered online, but the blended learning group received follow-up face-to-face discussion – using what is now referred to as a flipped-classroom design (“Flipped classroom”, n.d.). Social isolation has been identified as a challenge for ‘online only’ students (Frankola, 2001, para. 29) and may account for that difference. In a survey of students across 46 universities, however, Wuensch, Aziz, Ozan, Kishore & Tabrizi (2008, p. 524) found students appreciated the richness of the social interactions in face-to-face environments but also highly valued the convenient (e.g. self-paced) nature of online learning environments.

Countering this to some extent, Ladyshewsky & Taplin compared the performance of students in a leadership development course delivered in three modes (face-to-face, blended learning, and fully online) and found that the fully online students achieved significantly better results (Ladyshewsky & Taplin, 2014, p. 273). In this study, however, the online delivery was designed using a highly social-constructivist approach, which may have countered the social isolation identified earlier. Ladyshewsky & Taplin (2014, p. 285) also recognised that learning styles and motivation may have contributed to the difference in results.

In a study investigating learning styles and attitudes toward online education with students across four universities, Martinez de Monarrez & Korniejczuk (2013, pp. 186-187) found significant differences in attitudes towards online education based upon learning style. As learning analytics evolves towards supporting more adaptive learning approaches (Welsh, 2016, slide 11), the results of these studies suggest that analytics capable of identifying and supporting different learning styles could play an important role. Buckingham Shum & Ferguson (2012, p. 14) take this a step further by discussing learning dispositions, which provide a layer of contextual awareness over the top of learning styles.

To ascertain whether learning analytics is, or can be, used to enrich and improve learning experiences in a fashion similar to on-campus learning, this study examined a range of learning analytics case studies and implementations, with interesting results. Siemens, Dawson & Lynch developed a ‘sophistication model’ demonstrating that the higher education sector, as a whole, is relatively immature in its understanding and implementation of learning analytics (Siemens, Dawson & Lynch, 2013, p. 28). Their report reviewed a number of universities and found that most are only just starting to consider and understand how to build up their organisational capacity. Implementations tended to be small-scale and focused on student retention and identifying students at risk. Welsh (2016, slide 6), in his presentation on the state of learning analytics, argued that the area has to date tended to focus on students at risk and retention, but is increasingly focusing on improving learning outcomes. This argument is further backed by West et al. (2015, p. 1), who found that many institutions are not very advanced in their use of learning analytics and that supporting retention tends to be one of the main business drivers for it. This is evidenced in a number of the case studies West et al. examined (“Batchelor Institute of Indigenous Tertiary Education”, 2015; “Murdoch University”, 2015; “University of Newcastle”, 2015; “Charles Darwin University”, 2015; “Griffith University”, 2015). In these studies, analytics data was typically a combination of student demographic data and activity data from the LMS.

The Signals project at Purdue University also addressed the challenges of retention and improving student success, along with institutional accountability. The project collects data from a number of sources and provides feedback not just to teachers, but to students directly through the use of a student dashboard, enabling the student to be proactive in addressing any issues that they may have (Arnold, 2010, para. 5).

In addition to demographic and activity data, Faridhan, Loch & Walker (2013, p. 280) seek to improve retention via learning analytics by also including past and current academic performance in their implementation, developing predictive models that trigger intervention strategies.

The data the LMS provides for identifying at-risk students is largely quantitative, measuring participation through the number of logins, crude measures of time on task, and simple test scores. Dietrichson (2013, p. 333) argues that, as a measure of behaviour, this data on its own does not allow for deeper insights from a constructivist perspective. In addition to such data, Dietrichson (2013, p. 343) analyses social interactions between class participants in Moodle’s interactive tools, using data such as the number of interactions, communication channels (who is communicating with whom), word counts and other linguistic measures, in order to gain insights into the social aspect of learning and its effectiveness in such environments.
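
To make this kind of analysis concrete, the following is a minimal sketch of a who-talks-to-whom analysis of forum posts, in the spirit of the measures Dietrichson describes. The post records, field names and use of the networkx library are illustrative assumptions, not Dietrichson’s implementation or Moodle’s data format.

```python
# Illustrative sketch only: a minimal who-talks-to-whom analysis of forum posts,
# in the spirit of Dietrichson's social measures. Records and field names are
# hypothetical, not drawn from any particular LMS export.
import networkx as nx

posts = [
    # (author, replied_to, text) -- replied_to is None for a thread starter
    ("alice", None, "Has anyone started question two of the assignment?"),
    ("bob", "alice", "Yes, I think the key is the second reading."),
    ("carol", "bob", "Agreed, and the lecture example helps as well."),
    ("alice", "bob", "Thanks, that makes sense now."),
]

g = nx.DiGraph()
for author, replied_to, text in posts:
    g.add_node(author)
    if replied_to is not None:
        # accumulate an edge weight for each reply between a pair of participants
        w = g.get_edge_data(author, replied_to, default={"weight": 0})["weight"]
        g.add_edge(author, replied_to, weight=w + 1)

# Simple quantitative measures: reply counts, word counts and a centrality score
word_counts = {}
for author, _, text in posts:
    word_counts[author] = word_counts.get(author, 0) + len(text.split())

print("Replies sent per participant:", dict(g.out_degree(weight="weight")))
print("Replies received per participant:", dict(g.in_degree(weight="weight")))
print("Words written per participant:", word_counts)
print("Degree centrality:", nx.degree_centrality(g))
```

Even a toy analysis like this shows how communication channels and linguistic volume can be derived from the same activity records that usually feed only login and view counts.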

The social construction of knowledge and social interactions form an important part of compelling and effective learning experiences (Bloom, 1984, p. 6; Yair, 2008, p. 452; Al-Hebaishi, 2012, p. 379). In online contexts, in addition to using the collaborative features of the LMS, teachers are enhancing this social aspect through the use of social networks (Hammett, St. Croix & Wicks, 2012, p. 1037), potentially fracturing and limiting the ability to gain insights through learning analytics into the learning that is taking place, since the LMS can only report on activity within it. Buckingham Shum & Ferguson (2010, p. 5) extend the concept of learning outside the LMS with SocialLearn, a platform that integrates many social networking services, aggregates learning events, supports a range of pedagogies, provides structure for learning processes, and explores the potential of a range of Social Learning Analytics such as content and connection recommendation engines.

Walker (2012, para. 5-6) describes a number of ways in which Twitter is being used for teaching and learning in higher education and goes on to demonstrate how learning analytics can be applied to understand levels of engagement, social interactions and so on (Walker, 2012, para. 9-13). While Walker attempts to move the analysis of data from simplistic behavioural observations towards social-constructivist observations in order to gain meaningful insights into learning, the approach is limited in that it is constrained to data from the Twitter platform.

Twitter is just one of many social networking platforms where learning and teaching are taking place (Hammett, St. Croix & Wicks, 2012, p. 1037; Buckingham Shum & Ferguson, 2010, p. 5). Capturing learning activity data from those platforms to enable meaningful analytics is a challenge since, by design, they do not emit learning activity-related information. Bakharia, Kitto, Pardo, Gašević & Dawson (2016, p. 1) acknowledge this and attempt to solve the problem by extracting activity data from those platforms, wrapping that data with some form of learning ‘construct’, and loading it into a single location (database) for analysis using a software specification called xAPI, or the ‘experience’ API. This approach brings with it a range of challenges and opportunities. Before discussing those challenges and opportunities it is useful to summarise the focus of the learning analytics implementations that have formed the basis for this study. Table 1 provides a high-level view of that focus.

Table 1: Implementation summary.

| University | Retention / students at risk | Improved learning outcomes | Improving course design / content | Teacher engagement / performance | Institutional accountability |
| --- | --- | --- | --- | --- | --- |
| University of Michigan | Y | Y | Y | N | Y |
| University of Wisconsin | Y | Y | N | N | Y |
| Open University of the UK | Y | Y | Y | N | N |
| University of New South Wales | Y | N | N | N | N |
| University of New England | Y | P | P | N | N |
| Queensland University of Technology | Y | N | Y | N | N |
| University of Technology Sydney | Y | N | N | N | P |
| Swinburne University of Technology | Y | N | N | N | N |
| University of South Australia (1) | Y | N | N | N | N |
| University of Queensland | Y | P | Y | N | Y |
| Batchelor Institute of Indigenous Tertiary Education | N | N | N | N | N |
| Murdoch University | Y | N | N | N | Y |
| University of Newcastle | Y | N | N | N | N |
| Charles Darwin University | Y | N | N | N | Y |
| Griffith University | Y | Y | N | N | Y |
| Purdue University | Y | Y | N | N | Y |
| University of Melbourne | Y | N | Y | N | Y |
| Macquarie University | Y | N | N | N | N |
| University of South Australia (2) | N | N | Y | N | N |
| Swinburne University (3) | Y | N | N | N | N |
| Swinburne University (4) | N | Y | N | Y | N |

Y – Yes, N – No, P – Partial (i.e. some focus in this area but not significant)

Sources:

  • Siemens, G., Dawson, S. & Lynch, G. (2013)
  • Corrin, L., Kennedy, G., de Barba, P.G., Lockyer, L., Gašević, D., Williams, D., Dawson, S., Mulder, R., Copeland, S., & Bakharia, A. (2016).
  • Faridhan, Y. A., Loch, B. & Walker, L. (2013)
  • Walker, L. (2012).

Further details of these case studies are provided in Appendix B.

The implementations summarised in the table represent neither a complete picture of the state of learning analytics across higher education nor even of the universities from which they come. They do, however, appear in research commissioned at a systemic level into the current state of learning analytics, as determined by the authors of the various source reports. What is obvious from the table is that most of the focus is on learner behaviour rather than a more holistic perspective (i.e. there is no deep consideration of the impact that teachers and content make). That is, most organisations are still in the beginning stages of Siemens, Dawson, & Lynch’s Learning Analytics Sophistication Model (Siemens, Dawson, & Lynch, 2013, p. 27).

Challenges and Opportunities

Examination of learning analytics implementations reveals a number of challenges as organisations attempt to raise their level of sophistication, improve student success, and enrich the online learning experience. A challenge for teachers using multiple online tools is extracting data in common formats. Increasingly, educational technology providers store and can provide this data using technical standards for learning analytics data such as IMS Caliper (IMS Global Learning Consortium, n.d.). This has the potential to work well for those interested in collecting learning activity data from multiple educational systems such as learning management systems, but will not work for those who seek to gain insights from learning activity on other platforms such as social networks.

Bakharia, Kitto, Pardo, Gašević & Dawson (2016, p. 1) aim to solve this by using a broader specification called xAPI. xAPI allows implementers to collect data about online interactions with resources and people from many different types of systems and use that data to gain insights into the learning that is taking place (‘The Tin Can API’, n.d., para. 4). xAPI implementers describe and organize the data they are collecting from various sources using ‘recipes’, which provide the structure for the data. Bakharia, Kitto, Pardo, Gašević & Dawson (2016, p. 2) created recipes for the collection of data from YouTube, Twitter, Facebook and WordPress. A challenge for xAPI implementers is that each may have their own supposedly unique requirements, so each implementation may collect slightly different data, or data used in different contexts, rendering aggregation and comparison meaningless. JISC is building up a set of xAPI recipes for UK higher education (JISC, 2016, para. 6); however, it is currently limited to a quite simplistic set of behavioural measures such as ‘logged in’, ‘logged out’, ‘viewed resource’, ‘assignment submitted’, ‘assignment graded’, ‘visited library’, ‘attempt started’ and ‘attempt completed’ (JISC, n.d., para. 4).
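
To make the shape of an xAPI statement concrete, the following is a minimal sketch that constructs a single actor–verb–object statement and sends it to a Learning Record Store. The LRS endpoint, credentials and activity identifiers are placeholders, and the statement is a generic illustration of the specification rather than one of the published recipes.

```python
# Illustrative sketch only: constructing and sending a single xAPI statement.
# The LRS endpoint, credentials and identifiers below are placeholders, and the
# statement is a generic example rather than one of the JISC or CLA recipes.
import requests

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/commented",
        "display": {"en-US": "commented"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.edu/activities/week-3-discussion",
        "definition": {"name": {"en-US": "Week 3 discussion thread"}},
    },
}

response = requests.post(
    "https://lrs.example.edu/xapi/statements",   # hypothetical Learning Record Store
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_key", "lrs_secret"),              # placeholder credentials
)
print(response.status_code, response.text)
```

Note that the verb in this example (‘commented’) is itself a simple behavioural measure of the kind listed above, which underlines the limitation discussed next.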

Such behavioural measures are poor indicators of learning: the fact that a student clicked on a video, for example, does not necessarily mean they paid attention to it or learned anything from it. Chiang, Tseng, Chiang, & Hung (2015, p. 2277) recognize this limitation and call for research efforts into standardizing behavioural definitions and aligning them with learning theories in order to provide more meaningful insights. They also recognise the challenge of measuring the same behaviours consistently across different environments.

Walker’s analysis of Twitter data addressed this problem through the development of more complex and sophisticated measures from a social-constructivist approach, providing more meaningful evidence of the learning that took place (Walker, 2012, para. 9-13). Buckingham Shum refers to this as moving from ‘clicks to constructs’, measuring dispositions such as curiosity and diligence (Buckingham Shum, 2016); it involves developing a deeper understanding and interpretation of observable behaviours (clicks). The Knewton adaptive learning platform is arguably one of the most sophisticated implementations of learning analytics to date (Siemens, Dawson, & Lynch, 2013, p. 5), performing real-time analytics over large volumes of data to make decisions and employing sophisticated measures of ‘active time’ and ‘work remaining’ (Wilson & Nichols, 2015, p. 14). It also uses Item Response Theory to continuously monitor learning, placing it well ahead of more simplistic behavioural measurement implementations.
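
As a hedged illustration of the kind of estimation Item Response Theory supports (and not a description of Knewton’s proprietary implementation), the following sketch uses the standard two-parameter logistic model to estimate a learner’s ability from a set of item responses; the item parameters and responses are invented.

```python
# Illustrative sketch only: ability estimation with the two-parameter logistic
# (2PL) IRT model. The item parameters and responses are invented; this is a
# generic textbook formulation, not Knewton's implementation.
import math

def p_correct(theta, a, b):
    """Probability of a correct response under the 2PL model:
    P(correct | theta) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_ability(responses, items):
    """Maximum-likelihood estimate of ability theta via a simple grid search."""
    grid = [x / 100.0 for x in range(-400, 401)]  # theta from -4 to 4
    best_theta, best_ll = 0.0, float("-inf")
    for theta in grid:
        ll = 0.0
        for correct, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p) if correct else math.log(1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

# Invented item parameters (discrimination a, difficulty b) and a response pattern
items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.7), (1.0, 1.2)]
responses = [True, True, True, False]
print("Estimated ability:", estimate_ability(responses, items))
```

Re-running the estimate after every response is what allows an adaptive platform to monitor learning continuously rather than relying on a single end-of-topic test score.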

While the Knewton platform is a good example of what is achievable through learning analytics, it does highlight another challenge for learning analytics in general: the collection and usage of student data. Collection and storage of such data can be a contentious issue, as highlighted by the failure of the InBloom initiative. InBloom was a $100m investment in enabling improved learning outcomes for students across a number of US states (initially New York State) through data collection and analysis. However, it failed to convince the public it could protect the data or build the trust required for such an initiative (Kharif, 2014, para. 5) and was forced to shut down. JISC provides a code of practice for learning analytics that attempts to address this challenge by covering areas such as responsibility, transparency and consent, privacy, and custodianship of data. An important part of the development and implementation of the code of practice is the involvement of all stakeholders, most notably students (JISC, 2015, para. 9). Charles Sturt University outlines a code of practice that restricts the collection of learner data to learning and teaching systems (Charles Sturt University, 2015, p. 6). This approach was reinforced in conversation with S. Welsh of CSU, who raised concerns over the inherent risks of starting to collect ‘learning data’ from outside learning and teaching systems (S. Welsh, personal communication, September 16, 2016). It contrasts with the approach of K. Kitto of Queensland University of Technology, who, in her work collecting data from outside the LMS with xAPI, simply uses consent forms with students to collect limited, tagged data from social networks (K. Kitto, personal communication, September 20, 2016).

Interestingly, the CSU Learning Analytics Code of Practice acknowledges the three main roles in learning – student, teacher and content (Charles Sturt University, 2015, p. 3). Most of the learning analytics implementations reviewed focused entirely on student interactions, with a small number also focusing on student-content interactions and improvement. Walker’s research on learning analytics for Twitter appears to be one of very few which also measures teachers’ activities and their subsequent impact on learning, yet surprisingly this type of information is readily available to teachers. For example, Blackboard Learn’s Retention Center includes a ‘teacher dashboard’ showing ‘Your Course Activity’ (Blackboard, n.d., p. 23). Coupled with student data, this feature can provide interesting insights into the impact that teacher activity has on student interactions with the LMS.
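
As an illustration of the kind of insight meant here, the following sketch correlates invented weekly teacher-activity counts with student-activity counts. It is a simple descriptive analysis assumed for illustration, not a feature of the Retention Center itself.

```python
# Illustrative sketch only: relating teacher activity to student activity week by
# week. The weekly counts below are invented; a real analysis would draw them
# from LMS activity reports rather than hard-coded lists.
import math
import statistics

teacher_actions = [12, 30, 8, 25, 5, 28, 10, 32]           # e.g. announcements, forum posts, feedback items
student_actions = [140, 260, 120, 240, 95, 255, 130, 270]  # e.g. logins, posts, submissions

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"Correlation between teacher and student activity: {pearson(teacher_actions, student_actions):.2f}")

# Descriptive comparison: average student activity in weeks of high vs low
# teacher activity, split at the median teacher-activity level.
median_teacher = statistics.median(teacher_actions)
high = [s for t, s in zip(teacher_actions, student_actions) if t >= median_teacher]
low = [s for t, s in zip(teacher_actions, student_actions) if t < median_teacher]
print("Mean student activity in high-teacher-activity weeks:", statistics.mean(high))
print("Mean student activity in low-teacher-activity weeks:", statistics.mean(low))
```

Correlation alone does not establish that teacher activity causes student activity, but even a descriptive comparison like this begins to bring the teacher back into view as a variable worth measuring.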

Conclusion

Just as Bloom sought to provide groups of students with learning experiences equivalent to those provided by tutoring, the challenge for online learning and learning analytics remains one of scale, though significantly larger. Learning analytics to date has tended to focus on retention through the measurement of activity in the LMS. As it becomes more sophisticated, the discipline needs to start asking more sophisticated questions about the nature of the learning experience. For example, most universities list a global outlook, social and cultural awareness, and a dedication to lifelong learning as desired graduate attributes. Campuses, with their diverse populations, clubs and activities, help impart these attributes. Can online students’ experiences be enriched in equivalent ways and, if so, how do we measure that? Teachers contribute overwhelmingly to the richness of the learning experience, yet very few analytics implementations focus on this area, while at least some focus on improving content and course design. As Buckingham Shum states, we need to move from ‘clicks to constructs’, integrating learning design and student and teacher dispositions into the development of our implementations, all the while remaining acutely aware of the ethical, privacy and security implications of storing and using this valuable data.

References

Akkoyunlu, B., & Soylu, M. Y. (2008). A Study of Student’s Perceptions in a Blended Learning Environment Based on Different Learning Styles. Educational Technology & Society, 11 (1), 183-193.

Al-Hebaishi, S. M. (2012). A Comparison of Learners’ Achievement between Blended Learning and Distance Learning. International Journal on E-Learning, 11(4), 373-382.

Arnold, K. E. (2010). Signals: Applying Academic Analytics. EDUCAUSE Quarterly, 33(1). Retrieved October 2, 2016, from http://er.educause.edu/articles/2010/3/signals-applying-academic-analytics

Bakharia, A., Kitto, K., Pardo, A., Gašević, D. & Dawson, S. (2016). Recipe for success: lessons learnt from using xAPI within the connected learning analytics toolkit. Paper presented at the Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, United Kingdom. Retrieved August 30, 2016, from http://www.beyondlms.org/assets/papers/xapiAnalytics.pdf

Bennett, S., Bishop, A., Dalgarno, B., Waycott, J., & Kennedy, G. (2012). Implementing Web 2.0 technologies in higher education: A collective case study. Computers & Education, 59(2), 524-534. doi: 10.1016/j.compedu.2011.12.022

Bloom, B. S. (1984). The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring. Educational Researcher, 13(6), 4-16.

Boyer, E. L. (1992). Scholarship Reconsidered: Priorities of the Professoriate. Issues In Accounting Education, 7(1), 87-91.

Buckingham Shum, S. & Ferguson, R. (2010). Towards a Social Learning Space for Open Educational Resources. In Open ED 2010 Proceedings. Barcelona: UOC, OU, BYU. Retrieved October 1, 2016, from http://hdl.handle.net/10609/5085

Buckingham Shum, S. & Ferguson, R. (2012). Social learning analytics. Journal of Educational Technology and Society, 15(3) pp. 3–26. Retrieved October 1, 2016, from http://www.ifets.info/journals/15_3/2.pdf

Case Study 1: Batchelor Institute of Indigenous Tertiary Education. (2015). Retrieved September 30, 2016, from http://www.letstalklearninganalytics.edu.au/wp-content/uploads/2015/06/1-Batchelor-Case-Study.pdf

Case Study 2: Murdoch University. (2015). Retrieved September 30, 2016, from http://www.letstalklearninganalytics.edu.au/wp-content/uploads/2015/06/2-Murdoch-University.pdf

Case Study 3: University of Newcastle. (2015). Retrieved September 30, 2016, from http://www.letstalklearninganalytics.edu.au/wp-content/uploads/2015/06/3-Newcastle-Case-Study.pdf

Case Study 4: Charles Darwin University. (2015). Retrieved September 30, 2016, from http://www.letstalklearninganalytics.edu.au/wp-content/uploads/2015/06/4-Charles-Darwin-Case-Study-31-March-2015.pdf

Case Study 5: Griffith University. (2015). Retrieved September 30, 2016, from http://www.letstalklearninganalytics.edu.au/wp-content/uploads/2015/06/5-Griffith-Case-Study.pdf

Charles Sturt University. (2015). CSU Learning Analytics Code of Practice. Retrieved October 7, 2016, from http://www.csu.edu.au/__data/assets/pdf_file/0010/2507824/2016-CSU-Learning-Analytics-Code-of-Practice_v3-3.pdf

Chiang, C.F., Tseng, H.C., Chiang, C.C. & Hung, J.L. (2015). A case study on learning analytics using Experience API. In D. Rutledge & D. Slykhuis (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2015 (pp. 2273-2278). Chesapeake, VA: Association for the Advancement of Computing in Education (AACE). Retrieved October 2, 2016, from http://www.learntechlib.org/d/150316

Corrin, L., Kennedy, G., de Barba, P.G., Lockyer, L., Gašević, D., Williams, D., Dawson, S., Mulder, R., Copeland, S., & Bakharia, A. (2016). Completing the Loop: Returning Meaningful Learning Analytic Data to Teachers. Sydney: Office for Learning and Teaching. Retrieved October 4, 2016, from http://melbourne-cshe.unimelb.edu.au/__data/assets/pdf_file/0006/2083938/Loop_Handbook.pdf

Dietrichson, A. (2013). Beyond Clickometry: Analytics for Constructivist Pedagogies. International Journal on E-Learning, 12(4), 333-351. Chesapeake, VA: Association for the Advancement of Computing in Education (AACE). Retrieved October 1, 2016, from http://www.editlib.org.ezproxy.csu.edu.au/d/38478

Experience API. (n.d.). In Wikipedia, Retrieved August 30, 2016, from https://en.wikipedia.org/wiki/Experience_API_(Tin_Can_API)

Faridhan, Y. A., Loch, B. & Walker, L. (2013). Improving retention in first-year mathematics using learning analytics. 30th Ascilite Conference Proceedings. 278-282. Retrieved October 1, 2016, from http://www.ascilite.org/conferences/sydney13/program/papers/Faridhan.pdf

Flipped classroom. (n.d.). In Wikipedia, Retrieved September 17, 2016, from https://en.wikipedia.org/wiki/Flipped_classroom

Frankola, K. (2001). Why online learners drop out. Retrieved September 30, 2016 from http://www.workforce.com/articles/why-online-learners-drop-out

Gasevic, D. (2015). Learning analytics are more than a technology. Presented at E-Learn: World Conference on E-Learning 2015. Retrieved September 24, 2016, from https://www.youtube.com/watch?v=ScP58HGcEHA

Hammett, R., St. Croix, L. & Wicks, C. (2012). Enhancing Online and Face-to-Face Teaching with Social Networking Sites and Digital Literacy Activities. In T. Bastiaens & G. Marks (Eds.), Proceedings of E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2012 (pp. 1036-1041). Chesapeake, VA: Association for the Advancement of Computing in Education (AACE). Retrieved October 1, 2016, from http://www.editlib.org.ezproxy.csu.edu.au/d/41730

IMS Global Learning Consortium. (n.d.). Caliper Analytics.   Retrieved October 6, 2016, from https://www.imsglobal.org/activity/caliperram

JISC. (2015). Code of practice for learning analytics. Retrieved October 8, 2016, from https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics

JISC. (2016). Effective Learning Analytics. Retrieved October 6, 2016, from https://analytics.jiscinvolve.org/wp/2016/06/28/a-technical-look-into-learning-analytics-data-and-visualisations/

JISC. (n.d.). xAPI recipes for the JISC Learning Analytics Project V0.4. Retrieved October 6, 2016, from https://github.com/jiscdev/xapi

Kharif, O. (2014). Privacy Fears Over Student Data Tracking Lead to InBloom’s Shutdown. Retrieved October 7, 2016, from http://www.bloomberg.com/news/articles/2014-05-01/inbloom-shuts-down-amid-privacy-fears-over-student-data-tracking

Knox, J. (2014). Digital culture clash: “massive” education in the E-learning and Digital Cultures MOOC. Distance Education, 35(2), 164-177. doi: 10.1080/01587919.2014.917704

Ladyshewsky, R. & Taplin, R. (2014). Leadership Development Using Three Modes of Educational Delivery: Online, Blended and Face to Face. International Journal on E-Learning, 13(3), 273-290. Chesapeake, VA: Association for the Advancement of Computing in Education (AACE). Retrieved October 1, 2016, from http://www.editlib.org.ezproxy.csu.edu.au/d/41246

Martinez de Monarrez, P. & Korniejczuk, V. (2013). Learning Styles and Attitudes toward Online Education in Four Universities in the State of Nuevo Leon, Mexico. International Journal on E-Learning, 12(2), 183-195. Chesapeake, VA: Association for the Advancement of Computing in Education (AACE). Retrieved October 1, 2016, from http://www.editlib.org.ezproxy.csu.edu.au/d/37550

Siemens, G., Dawson, S. & Lynch, G. (2013). Improving the Quality and Productivity of the Higher Education Sector: Policy and Strategy for Systems-Level Deployment of Learning Analytics. Retrieved September 14, 2016, from http://www.olt.gov.au/system/files/resources/SoLAR_Report_2014.pdf

The Tin Can API. (n.d.). What is the Tin Can API? Retrieved October 6, 2016, from https://tincanapi.com/overview/

Walker, L. (2012). Twitter Learning Analytics in R. In M. Brown, M. Hartnett & T. Stewart (Eds.), Proceedings of ASCILITE – Australian Society for Computers in Learning in Tertiary Education Annual Conference 2012. Australasian Society for Computers in Learning in Tertiary Education. Retrieved October 2, 2016, from http://www.ascilite.org/conferences/Wellington12/2012/images/custom/walker,_lyndon_-_twitter_learning.pdf

Welsh, S. (2016). Learning Analytics: A Traveller’s Guide [INF537 Colloquia Material, PowerPoint Slides]. Retrieved September 25, 2016, from https://interact2.csu.edu.au/bbcswebdav/pid-1023694-dt-content-rid-2321128_1/xid-2321128_1

West, D., Huijser, H., Lizzio, A., Toohey, D., Miles, C., Searle, B. & Bronnimann, J. (2015) Learning Analytics: Assisting Universities with Student Retention, Final Report (Part 1) prepared for the Australian Government Office for Learning and Teaching.  Retrieved September 28, 2016, from http://www.olt.gov.au/system/files/resources/SP13_3268_West_Report_2015.pdf

Wilson, K. & Nichols, Z. (2015). The Knewton Platform: A General-Purpose Adaptive Learning Infrastructure. Retrieved August 16, 2016, from https://www.knewton.com/wp-content/…/knewton-technical-white-paper-201501.pdf

Wuensch, K., Aziz, S., Ozan, E., Kishore, M. & Tabrizi, M.H.N. (2008). Pedagogical Characteristics of Online and Face-to-Face Classes. International Journal on E-Learning, 7(3), 523-532. Chesapeake, VA: Association for the Advancement of Computing in Education (AACE)

Yair, G. (2008). Can we administer the scholarship of teaching? Lessons from outstanding professors in higher education. Higher Education, 55(4), 447-459. doi: 10.1007/s10734-007-9066-4

 

Appendix A: Acknowledgments

One of the tasks I identified in my case study proposal was connecting with a number of experts in the area and interviewing them. I had planned on asking the same questions of each of them and, while I perhaps hadn’t been able to meet with a statistically valid sample of experts, asking the same questions would at least impart some sort of rigor. However, almost immediately in my first interview we veered off track into a free-flowing discussion on what the expert was passionate about. This approach carried on into a number of informal discussions with experts in the field and I feel that, on a personal and academic level, I have gained far deeper insights into learning analytics as a result. While not all of the following experts are directly referenced in the paper, their insights and passion have in no small way made an impact on the way I feel about the field:

  • Simon Welsh, Manager, Adaptive Learning & Teaching Services, Division of Student Learning, Charles Sturt University. An expert into the way that learning analytics is evolving but also responsible for the delivery of high-quality, ethical digital services into the university.
  • Kirsty Kitto, Queensland University of Technology. Dr. Kitto is passionate about exploring learning ‘beyond the LMS’ and gaining insights into that learning. Dr. Kitto is an inaugural member of the board of the Data Interoperability Standards Consortium (http://datainteroperability.org), an initiative that is taking over governance of the xAPI specification.
  • Jon Mason, Charles Darwin University. While an expert in the field of Learning Analytics, Jon has imparted to me the importance and power of asking the right questions, arguably one of the reasons why to date Learning Analytics has not delivered on its promise.
  • Rebecca Ferguson, The Open University (UK). Dr Ferguson has been working for years to improve the experience of her online students and is an expert in Learning Analytics.
  • Yong-Sang Cho (Korea Education & Research Information Service, South Korea). The South Korean education system is one of the most stressful (for students) education systems in the world and Yong-Sang is determined to bring back the ‘joy of learning’ to Korean students.
  • Tore Hoel, Oslo and Akershus University College of Applied Sciences, Norway. Mr. Hoel is an expert on privacy and data protection, two key areas which I would have liked to expand on further in my paper, but I simply could not find a way of doing them justice within the word count.

 

Appendix B: Case Study implementations and sources

 

Institution Case Study
University of Michigan

Overview

Created a task force to advise leadership on how to use learning analytics to improve both learning and the productivity of the university

Outcomes

Has funded a number of institutional projects and is collecting feedback to advise on future directions

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
University of Wisconsin

Overview

Building organisational capacity, implementing tools to identify and target students at risk, and personalise their learning

Outcomes

A small series of projects initiated which if successful, will be deployed at a larger scale

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
Open University of the UK

Overview

3-year plan implementing analytics to alert educators to students’ future needs and current needs, and to support retrospective analysis (insights for course review and design)

Outcomes

Retrospective analysis in place but yet to roll out real-time analytics

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
University of New South Wales

Overview

Developing data warehouse to track achievement and identify students at risk

Outcomes

Have identified 11 academic risk indicators – use emails to contact at-risk students

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
University of New England

Overview

Focus on revitalising learning, teaching and assessment, and improving retention

Outcomes

Early alert program (EAP), and Automated Wellness Engine (AWE) used to contact most at risk students

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
Queensland University of Technology

Overview

Course performance reporting and monitoring early intervention activities for students at risk

Outcomes

Significant improvement in retention for students that have been contacted

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
University of Technology Sydney

Overview

University wide strategy with focus on becoming a ‘data intensive’ university

Outcomes

Early identification of students at risk, piloting a student dashboard measuring effort and engagement.

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
Swinburne University of Technology

Overview

Focus on students at risk of dropping out using a set of demographic and behavioural triggers

Outcomes

Pilot with commencing students, positive response from students and improved retention.

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
University of South Australia

Overview

Learning Analytics adopted to ‘provide an empirical base for improving student success and retention outcomes’

Outcomes

Identification of at-risk students has been used to inform the development of intervention and support strategies.

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
University of Queensland

Overview

Wish to use analytics to transform the student experience. Collecting and analysing data on individual learners, and on program/course performance.

Outcomes

Developing tools to support data-informed decision making for students and academics.

Source
Siemens, G., Dawson, S. & Lynch, G. (2013)
Batchelor Institute of Indigenous Tertiary Education

Overview

Low number of students with high reliance on personal relationships between teacher and each student. Additionally, very low usage of technologies with analytics functionality.

Outcomes

Has close ties with Charles Darwin University and may ‘piggy back’ on research and infrastructure from that institution

Source
“Case Study 1: Batchelor Institute of Indigenous Tertiary Education”, (2015).
Murdoch University

Overview

Identified a number of at-risk behaviours including non-attendance, missing multiple classes, non-submission of early assessment items and non-engagement with online tutorials. Uses data for improved reporting and action, exploring analytics reporting available in the LMS (Moodle)

Outcomes

Recognises the importance of senior leadership sponsorship and also involvement of wide range of stakeholders to determine the questions that LA can start to answer.

Source
“Case Study 2: Murdoch University.” (2015)
University of Newcastle

Overview

Focussing on infrastructure aspect of learning analytics, combining data from Student Information Systems (SIS) and Learning Management Systems (LMS) to identify students at risk.

Outcomes

Recognises the importance of planning and understanding the types of data available from different systems, and how that may be integrated. Understanding the right questions to ask is important too.

Source
“Case Study 3: University of Newcastle.” (2015)
Charles Darwin University

Overview

Using data warehousing and analytics tools to provide insights into students at risk. Mapping data from SIS to analytics data schemas, customising environment to support the requirements of a dual-sector university.

Outcomes

Recognises learning analytics can provide much more than supporting retention and students at risk. Change management is a challenge.

Source
“Case Study 4: Charles Darwin University.” (2015)
Griffith University

Overview

Has a relatively mature implementation of learning analytics supporting retention and is exploring predictive analytics. Focuses on the intersections between people and data.

Outcomes

Identified a number of principles in relation to learning analytics and retention including:

·      Mutual responsibility

·      Transparency

·      Authentic culture

·      Local ownership

·      University-wide coordination and partnership

·      Information management

·      Monitoring of effectiveness

·      Sustainability

Source
“Case Study 5: Griffith University.” (2015)
Swinburne University

Overview

Combining past and current academic performance data with demographic and activity data to improve the retention of first-year mathematics students.

Outcomes

Outcomes of project not reported on.

Source
Faridhan, Y. A., Loch, B. & Walker, L. (2013)
Purdue University

Overview

The Signals project at Purdue University addressed the challenges of retention and improving student success, and also focused on institutional accountability. The project collects data from a number of sources and provides feedback not just to teachers, but to students directly through the use of a student dashboard.

Outcomes

Increase in student retention and self-help.   Identification of a number of future directions and challenges for learning analytics. Literature seems to refer to this project as a key learning analytics initiative.

Source
Arnold, K. E. (2010)
University of Melbourne

Overview

Analytics tool used to:

·      identify patterns of usage of course materials with the goal of using that information for making improvements to the course

·      Identify students at risk

Outcomes

Largely used for behavioural analysis of usage of course materials. One negative was the lag between usage and availability of data for analysis – would have liked close to real-time data but the data feed was 24-48 hours behind.

Source
Corrin, L., Kennedy, G., de Barba, P.G., Lockyer, L., Gašević, D., Williams, D., Dawson, S., Mulder, R., Copeland, S., & Bakharia, A. (2016).
Macquarie University

Overview

Analytics tool used to:

·      Identify resource access

·      Identify patterns of access to resources

·      Identify improvements to LMS

·      Link access to resources to assessment

Outcomes

Used to contact non-engaging students, and in considering appeals on assessments (i.e. whether the student had actually accessed the resources).

Source
Corrin, L., Kennedy, G., de Barba, P.G., Lockyer, L., Gašević, D., Williams, D., Dawson, S., Mulder, R., Copeland, S., & Bakharia, A. (2016).
University of South Australia

Overview

Analytics tool used to:

·      identify patterns of resource usage

·      Identify resource usage in relation to assessment

Outcomes

Used the implementation to identify concepts that students were struggling with and relate that back to the resources that were being used.

Source
Corrin, L., Kennedy, G., de Barba, P.G., Lockyer, L., Gašević, D., Williams, D., Dawson, S., Mulder, R., Copeland, S., & Bakharia, A. (2016).
Swinburne University

Overview

Using Twitter data to perform analysis measuring evidence of social learning.

Outcomes

Insights into social participation and communications channels between students, students/teacher.

Source
Walker, L. (2012).

 

 

Appendix C: Further information on Data and Privacy Policies, Learning Analytics Codes of Practice

The following list formed part of the general research for this case study and provides further information; however, it is not directly discussed or cited in the main paper:

ANDS. (n.d.). Ethics, consent and data sharing. Retrieved October 4, 2016, from http://www.ands.org.au/guides/ethics-consent-and-data-sharing

Insight Centre for Learning Analytics. (2013). Privacy Statement for the Insight Website. Retrieved October 4, 2016, from https://www.insight-centre.org/privacy

Kingston University. (n.d.). Privacy Policy. Retrieved October 4, 2016, from http://www.kingston.ac.uk/privacy-policy/

Monash University. (n.d). Conduct and Compliance Procedure. Privacy. Retrieved October 4, 2016, from http://privacy.monash.edu.au/procedure/

The Open University. (n.d). Ethical Use of Student Data for Learning Analytics Policy. Retrieved October 4, 2016, from http://www.open.ac.uk/students/charter/essential-documents/ethical-use-student-data-learning-analytics-policy

University of Melbourne, (n.d.), Privacy. Retrieved October 4, 2016, from http://www.unimelb.edu.au/governance/compliance/privacy

University of Queensland. (n.d). Right to Information and Privacy. Retrieved October 4, 2016, from https://www.uq.edu.edu.au/rti

Victoria University, (n.d). Privacy. Retrieved October 4, 2016, from https://www.vu.edu.au/privacy

 

Appendix D: Sample Graduate Attributes

The following list provides links to a sample of Graduate Attributes:

 

Charles Sturt University (n.d). What are they learning? Retrieved October 7, 2016, from http://www.csu.edu.au/division/student-learning/home/csu-academics/sessional-staff/know-your-students/what-are-they-learning

Griffith University. (n.d). Griffith Graduate Attributes. Retrieved October 4, 2016, from https://www.griffith.edu.au/learning-teaching/student-success/graduate-attributes

RMIT University. (n.d.). Graduate Attributes. Retrieved October 4, 2016, from http://www1.rmit.edu.au/teaching/graduateattributes

The Open University of Hong Kong. (n.d.). Graduate Attributes. Retrieved October 4, 2016, from http://www.ouhk.edu.hk/wcsprd/Satellite?pagename=OUHK/tcSingPage&c=C_PAU&cid=191182123200

University of Adelaide. (n.d.). University of Adelaide Graduate Attributes. Retrieved October 4, 2016, from https://www.adelaide.edu.au/learning/strategy/gradattributes/

University of South Australia. (n.d). Graduate qualities. Retrieved October 4, 2016, from http://w3.unisa.edu.au/gradquals/

University of Sydney, (n.d.). Statement of Graduate Attributes. Retrieved October 4, 2016, from http://sydney.edu.au/education-portfolio/ei/GraduateAttributes/statement.htm

 

INF537: Critical Reflection (Assignment 3, Part B)

As I reflect on my journey in this Master’s program, considering how I will apply what I have learned, I look back on my very first post (Leeson, March 7, 2014) and how overwhelmed I was.  Interestingly, in that post I found myself looking at the difference between the on-campus experience and distance learning (a key element of my final assignment).  In our first online meet-up it was with some trepidation that I introduced myself as more of an IT worker than a teacher or teacher librarian, and I felt somewhat intimidated by the expertise of my fellow students (I feel much more comfortable now and quite welcomed by this brilliant group).

I set out with the task of really trying to understand the challenges, opportunities and motivations that education professionals are faced with as a result of the sheer impact that technology and change are having on society.  Without knowing it at the time, I think I was already an advocate of cross-disciplinary study, something I was able to investigate in depth in this subject (Leeson, September 15, 2016). SAMR and TPACK have now become important models for me (Leeson, October 10, 2014).  SAMR articulates a view that I have held for some time, while TPACK demonstrates for me just how many areas of expertise today’s teacher needs in order to be successful with educational technologies.

An early disappointment for me was that INF535 Information Flow and Advanced Search was not available, as it was one of my priority subjects; however, in its place I was able to undertake INF415 Management of Information Agencies, which enabled me to gain insights into the challenges for academic libraries and librarians (Leeson, August 14, 2016a; Leeson, August 14, 2016b).

The subtitle of this blog is ‘a reflective learning blog’ and it represents an important aspect of how I now learn.  I have had blogs in the past which have mainly been commentary-related or special interest; however, this blog has deep meaning for me as it has become an integral part of my learning and learning journey.  For subjects which did not have a blogging component I felt that something important was missing (Leeson, July 25, 2016a).  I have written over 80 posts and hopefully made many useful comments.  I tend to value the interactions in the blogspace more than those in the forums (reflecting on this, I seem to have used the forums for discussion about course matters while blog commentary seems to be more centred on learning). Sometimes I feel my comments on others’ blogs are a little inadequate as I am communicating with some impressive people and find myself only able to offer words of encouragement.  That encouragement, though, is really important in countering the isolation of the distance learner, and the Twitter interactions have also been fantastic for that (particularly as we approach assignment deadlines – a few words of encouragement work wonders).

In INF537 I have really taken on board aspects of Weller’s digital scholarship (Weller, 2012) and have used this blog to help develop my assignments (Leeson, July 25, 2016b; Leeson, July 25, 2016c; Leeson, August 15, 2016; Leeson, August 17, 2016; Leeson, August 27, 2016; Leeson, September 26, 2016; Leeson, September 27, 2016; Leeson, October 3, 2016) – some of these posts hopefully demonstrate some quite independent investigation and questioning.  I hope in some small way I have been able to take on the attributes of a digital scholar and will be using those as I move onwards from this very rewarding program.

I set out to understand the challenges faced by our educators (at least those that relate to technology) but feel that I have gained so much more.  For me the passion and commitment of our educators has been a highlight and perhaps one of the biggest takeaways from this program.  It is certainly one which I hope can be reflected into the increasingly pervasive world of online education.

 

References

Leeson, J. (March 7, 2014).  First Thoughts on Knowledge Networks and Digital Innovation. [blog post].  Retrieved October 7, 2016, from http://thinkspace.csu.edu.au/jerry/2014/03/07/first-thoughts-on-knowledge-networking-and-digital-innovation/

Leeson, J. (October 10, 2014).  INF533: Assessment Item 8, Part C, Critical Reflection. [blog post].  Retrieved October 8, 2016, from http://thinkspace.csu.edu.au/jerry/2014/10/10/inf533-assessment-item-8-part-c-critical-reflection/

Leeson, J. (July 25, 2016a).  First post for a new subject (just a reflection really). [blog post].  Retrieved October 9, 2016, from http://thinkspace.csu.edu.au/jerry/2016/07/25/first-post-for-a-new-subject-just-a-general-reflection-really/

Leeson, J. (July 25, 2016b).  Insights from the learning analytics colloquium. [blog post].  Retrieved October 9, 2016, from http://thinkspace.csu.edu.au/jerry/2016/07/25/insights-from-the-learning-analytics-colloquium/

Leeson, J. (July 25, 2016c).  Thoughts on Module 1 – the blogging aspect. [blog post].  Retrieved October 9, 2016, from http://thinkspace.csu.edu.au/jerry/2016/07/25/thoughts-on-module-1-the-blogging-aspect/

Leeson, J. (August 14, 2016a).  INF415: Case Study. [blog post]. Retrieved October 8, 2016, from http://thinkspace.csu.edu.au/jerry/2016/08/14/inf415-case-study/

Leeson, J. (August 14, 2016b).  INF415: Management Article. [blog post]. Retrieved October 8, 2016, from http://thinkspace.csu.edu.au/jerry/2016/08/14/inf415-management-article/

Leeson, J. (August 15, 2016).  Case Study Proposal. [blog post].  Retrieved October 9, 2016, from  http://thinkspace.csu.edu.au/jerry/2016/08/15/case-study-proposal/

Leeson, J. (August 17, 2016).  Digital scholarship or just note taking in a public manner? [blog post].  Retrieved October 9, 2016, from http://thinkspace.csu.edu.au/jerry/2016/08/17/digital-scholarship-or-just-note-taking-in-a-public-manner-1/

Leeson, J. (August 27, 2016).  On the irony of extolling the virtues of open access and open publishing from behind a paywall. [blog post]. Retrieved October 9, 2016, from http://thinkspace.csu.edu.au/jerry/2016/08/27/on-the-irony-of-extolling-the-virtues-of-open-access-and-open-publishing-from-behind-a-paywall/

Leeson, J. (September 15, 2016).  Interpretive discussion on digital scholarship and interdisciplinary knowledge and research. [blog post].  Retrieved October 7, 2016, from http://thinkspace.csu.edu.au/jerry/2016/09/15/interpretive-discussion-on-digital-scholarship-and-interdisciplinary-knowledge-and-research/

Leeson, J. (September 26, 2016).  Case Study background.  [blog post]. Retrieved October 9, 2016, from  http://thinkspace.csu.edu.au/jerry/2016/09/26/case-study-background/

Leeson, J. (September 27, 2016).  Is the LMS education’s equivalent of a panopticon? [blog post]. Retrieved October 9, 2016, from http://thinkspace.csu.edu.au/jerry/2016/09/27/is-the-lms-educations-equivalent-of-a-panopticon/

Leeson, J. (October 3, 2016). Capturing some thoughts. [blog post]. Retrieved October 9, 2016, from http://thinkspace.csu.edu.au/jerry/2016/10/03/capturing-some-thoughts/

Weller, M. (2012). The virtues of blogging as scholarly activity. Chronicle of Higher Education, 58(35), B27-B28.

 

 

Capturing some thoughts

Largely a transcript of my attempt at a voice thread here:

What my case study is about is asking what it is that makes for a really rich and meaningful face-to-face learning experience, and whether some of the key attributes of that can be transferred to online learning through the use of well-designed learning analytics.

It turns out that this is not a new problem. Bloom (1984) asked the same question but from the perspective of creating teaching and learning approaches for groups which were as good as 1-1 tutoring.

This, I think, forms one of the key challenges for learning analytics – how can we emulate, at scale, the richness of an on-campus learning experience?

So far my research has wandered into areas such as:

  • motivation
  • learning styles
  • teacher qualities
  • adaptive learning
  • bringing data in from many disparate locations and trying to make sense of it
  • but perhaps most importantly, are we asking the right questions?

To that end I have been thinking about graduate attributes. Most universities seem to identify:

  • social and cultural awareness
  • a global outlook
  • active and lifelong learning

as key attributes.

I find myself wondering how many fully online courses intrinsically include the imparting of these attributes into their design and if so, how is that measured/reported on from both an academic and student perspective?

From a learning analytics perspective, these types of questions do not seem to be asked at all.

Ultimately, I am wondering whether, along with the challenge of replicating in some way at scale the symmetry of the student/teacher and peer relationships, the development of some of these so-called soft skills is an important missing piece.

Footnote:  I have moved on from the importance of learning styles to the importance of learning dispositions.  This really resonated with me when I found it in the article on social analytics by Buckingham Shum & Ferguson (2012).

 

Buckingham Shum, S. & Ferguson, R. (2012). Social learning analytics. Journal of Educational Technology and Society, 15(3) pp. 3–26. Retrieved October 1, 2016, from http://www.ifets.info/journals/15_3/2.pdf

Is the LMS education’s equivalent of a Panopticon?

In a previous post I alluded to the collection and insightful use of data (about learning activity) as a potential means for addressing what I see as one of the major challenges for online learning at scale (i.e. teaching cohorts of thousands, if not hundreds of thousands) and that is the loss of meaningful and insightful relationships between teachers and students.

The collection of data implies measurement, which implies observation.  Observation is another interesting area of investigation, i.e. does observation alone imply intervention of some sort (and can it have an impact)?  In a classic and somewhat contentious series of experiments in the 1920s and 1930s, workers at a factory had their productivity measured under a series of different conditions, ranging from decreasing and increasing the ambient light through to increased work breaks and a number of other variables, all of which seemed to increase their productivity.  Ultimately this became known as the Hawthorne Effect, or Observer Effect.  The mere act of being observed, it was concluded, altered the behaviour of the workers.

That observation changes behaviour is far from a new line of thought.  Eighteenth-century philosopher Jeremy Bentham came up with the concept of the Panopticon, an institutional structure which allowed its inmates to be observed without them knowing whether or not they were actually being observed at any given moment.  His hypothesis was that simply by thinking they might be being observed, inmates would change their behaviour accordingly.

Elevation, section and plan of Jeremy Bentham’s Panopticon penitentiary, drawn by Willey Reveley, 1791. from wikipedia https://en.wikipedia.org/wiki/Panopticon

… and so we move on to today, and the proliferation of the LMS in our higher education (and other) environments.  Higher Ed institutions proudly proclaim how good their policies and procedures are, providing great terms and conditions along with full disclosure that they may use activity data generated from the LMS to help their students with their progress.  In other words: we may be observing you through the collection of usage data on the LMS.  What sort of impact does this have on students, i.e. does the mere knowledge that they may be being tracked change their behaviour in some way?

It gets better(?).  As we know, much learning takes place outside the LMS, so now we have tech-savvy educators and administrators, along with technology experts, working away to collect learning activity trails from outside the LMS, all in the name of improving student outcomes.  Don’t get me wrong, I seem to have ‘drunk from the same Gatorade bottle’ and find myself thinking this is a good thing – let’s develop some really deep and personal insights into how we are learning, and use the collection of data, masses of it in fact, to improve that learning.
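
To make ‘collecting learning activity trails from outside the LMS’ a little more concrete, here is a minimal sketch of one common approach: sending an Experience API (xAPI) style statement to a Learning Record Store. Everything in it (the endpoint, credentials, learner identifier and activity) is a hypothetical placeholder rather than a real service; it is only meant to illustrate the kind of plumbing involved.

```python
# A minimal sketch: recording one out-of-LMS learning activity as an xAPI-style
# statement. All endpoints, credentials and identifiers below are hypothetical.
import requests

statement = {
    "actor": {
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",  # hypothetical learner identifier
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/commented",
        "display": {"en-US": "commented"},
    },
    "object": {
        "id": "http://thinkspace.csu.edu.au/jerry/2016/10/03/capturing-some-thoughts/",
        "definition": {"name": {"en-US": "Blog post: Capturing some thoughts"}},
    },
}

# POST the statement to a (hypothetical) Learning Record Store.
response = requests.post(
    "https://lrs.example.edu/xapi/statements",  # placeholder URL
    json=statement,
    auth=("lrs_user", "lrs_password"),          # placeholder credentials
    headers={"X-Experience-API-Version": "1.0.3"},
)
print(response.status_code)
```

Even a toy example like this highlights the Panopticon concern above: every blog comment, video view or forum post can become a traceable record.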

Sometimes though, it is just useful to step back and take a look at something from a different perspective (yay for cross-disciplinary research and knowledge).

Cheers,

Jerry


Footnote: my case study is on the collection of data and learning analytics, and I do think it is a good thing (I think?).

Hawthorne Effect. (n.d.).  In Wikipedia.  Retrieved September 16, 2016, from https://en.wikipedia.org/wiki/Hawthorne_effect

Panopticon. (n.d.). In Wikipedia.  Retrieved September 22, 2016, from https://en.wikipedia.org/wiki/Panopticon#The_panopticon_as_metaphor

Case study background

One of the key motivators for my case study in this subject is the impact that interactions with others have on learning, particularly the interactions between learner and teacher and with other learners.  In my interpretive discussion on digital scholarship I examined this from the perspective of the social construction of knowledge, but for me it is so much more than that.  I studied for my undergraduate degree part-time and mostly as a ‘distance student’.  This was in the early 90s, and while I completed the degree, I felt that I had missed out on much of the ‘university experience’, i.e. the interactions with other students and the on-campus experience.  This Master’s program has been interesting for me in that I have had more (quite a bit more) interaction with other students than in my earlier university experience.  Obviously the collaborative tools available to us have been a big part of this, but so too has the way those tools have been integrated into the learning experience.

In a relatively small cohort, I guess it is still possible to have a personal connection between student and teacher and, if designed and facilitated well, between students.  However, how can personal connections play any sort of equitable role at scale (i.e. hundreds, if not thousands, of students), and how is it possible to develop meaningful insights into individual students’ progress?

In my previous job I worked with learning technologies at a university which prided itself on its ‘on-campus experience’ and saw that as a key differentiator for it.  I had the chance to witness this in many ways such as:

  • the energy and ‘vibe’ of O Week,
  • the always crowded areas of the student hub,
  • the focus on small group discovery,
  • the somewhat different energy of examinations periods and finally,
  • the excitement of graduation periods (particularly with international students dragging their families all over campus and off to nearby cafes and restaurants, showing off where they had spent their last few years).

Working in the university provided me with the opportunity to witness at first hand many different academics with many different motivations for teaching, ranging from those who taught because they were obliged to (they would much rather have been researching) through to some of the most passionate and amazing teachers I have ever come across.  The difference the latter made was incredible.

This background forms the basis for my final assignment in this program.  Having seen just how much difference a good teacher can make, not only to students’ performance in a given subject but also to their futures, how can even a part of this impact be facilitated in a seemingly impersonal online learning environment?

Benjamin Bloom (1984) investigated what he called ‘the 2 sigma problem’.  He examined studies in which cohorts of students were taught under three different conditions:

  1. Conventional: class size of about 30 students with periodic tests.
  2. Mastery Learning: class size of about 30 students with same instruction, usually the same teacher, but with formative assessment, feedback and corrective actions.
  3. Tutoring: students assigned to a good tutor in groups of up to 3 students.

The students in the tutoring cohort typically had results 2 standard deviations higher than the students in the conventional  approach and the students in the Mastery Learning cohort typically had results 1 standard deviation higher than those in the conventional approach.  Bloom referred to this as the ‘2 sigma problem’ and asked the question ‘how can teaching approaches be devised such that students in group conditions achieve the same types of results as those who are taught under tutoring conditions?’.
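
To put some (entirely made-up) numbers on the ‘sigma’ language, the short sketch below expresses the difference between group means in standard-deviation units; the figures are illustrative only and are not Bloom’s data.

```python
# A toy illustration of "1 sigma" and "2 sigma" gains, using hypothetical
# summary statistics rather than Bloom's actual data.
conventional_mean = 50.0   # hypothetical mean score under conventional instruction
conventional_sd = 10.0     # hypothetical standard deviation of the conventional group
mastery_mean = 60.0        # hypothetical mean score under mastery learning
tutoring_mean = 70.0       # hypothetical mean score under one-to-one tutoring

def sigma_gain(group_mean: float) -> float:
    """Difference from the conventional mean, in standard-deviation units."""
    return (group_mean - conventional_mean) / conventional_sd

print(sigma_gain(mastery_mean))   # 1.0, i.e. roughly '1 sigma' above conventional
print(sigma_gain(tutoring_mean))  # 2.0, i.e. roughly '2 sigma' above conventional
```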

The impact that individual tutoring/teaching has is clear, but the problem that Bloom sought to solve has increased by orders of magnitude now that we have classes running into the thousands with no direct physical access to a tutor or teacher.  What we do have, though, is data, and potentially lots of it.  Are the insights available through good, real-time learning analytics enough to help students achieve more?  What do we mean by achievement, i.e. is it just better results?  What about the ‘joy’ of learning and the impact that it can have?


Bloom, Benjamin S. (1984). The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring. Educational Researcher, 13(6), 4-16.


Interpretive discussion on digital scholarship and interdisciplinary knowledge and research

Introduction

The increasing use of collaborative learning approaches with an emphasis on online global collaboration is an important current trend in the Australian Tertiary Sector (New Media Consortium, 2016a, p.6). Digital scholarship, which includes teaching (Veletsianos & Kimmons, 2012, p.767), is a necessary precondition for this to occur. Another key trend is the importance of cross-disciplinary approaches in rethinking how higher education institutions work, as part of broader changes needed to bridge the disconnect with the 21st Century economy (New Media Consortium, 2016b, p.10). This is all taking place alongside the massive transformation that digital technologies are bringing about in society globally, changing our expectations of and demands on education.

This paper will examine what it means to engage in digital scholarship and how such practice is an important factor in the pursuit of interdisciplinary knowledge and research. The paper will focus on higher education but draw from education and learning more broadly to highlight the importance of interdisciplinary approaches in an increasingly complex environment – one which is not determined by organisational boundaries but facilitated through communities of practice (Wenger, 2011, p.5) and the use of digital technologies. The paper will also examine the challenges for both academics and institutions.

Interpretive Discussion

To understand digital scholarship it is first necessary to define scholarship. Scanlon (2014, p.13) uses a definition by Boyer in 1990 outlining four distinct functions of scholarly activity that include discovery (creating new knowledge), integration (integrating knowledge across disciplines), application (applying that knowledge more broadly, using scholarly discipline) and teaching. Veletsianos & Kimmons (2012, p.766) also emphasise the importance of both research and teaching when describing scholarship. Digital scholarship, as argued by Veletsianos & Kimmons (2012, p.767) is not simply a matter of using digital technologies to perform these functions, but embraces the idea of openness and collaboration.

Weller (2011, p.99) uses the term ‘open scholar’ and identifies a number of characteristics that open scholars are likely to adopt, including a distributed online identity (typically spread across several social media services) that is often aggregated to a single location such as a blog. Open scholars develop networks of peers through those services and also publish their work informally on them. They do this in a manner complementary to their formal publications and may even promote that formal work through their social networks. Open scholars share their work automatically and mix their personal and professional outputs as part of developing their network and reputation.

Nissani (1997, p.203) defines interdisciplinary knowledge as familiarity with two or more disciplines and interdisciplinary research as combining components of two or more disciplines to create new knowledge. Nissani argues the massive growth in human knowledge and the increasing need for specialization has contributed towards the need for interdisciplinary research (Nissani, 1997, p.202). As problems became too complex, interdisciplinary research helped solve them and reduced the potential for errors that could arise from single-disciplinary perspectives (Nissani, 1997, p.209). This argument is supported by Shin’s examination of the interrelationships between disciplines, and how new knowledge is constructed such that the sum of it is greater than its component parts across disciplines (Shin, 1986, p.100). Hayes Jacobs (1989, para.9) also recognises the challenges brought about by the massive growth of knowledge and the resultant fragmentation and specialization in research and practice, and presents an argument for interdisciplinary curriculum content as a response.

The importance of a multi-disciplinary approach can be seen in the Australian Curriculum (ACARA, 2010, p.1; ACARA, 2012, p.5) in preparing students for further study and success in the workplace. ACARA also integrates a number of ‘general capabilities’, including ICT skills, across the curriculum to help students ‘live and work successfully in the 21st Century’ (ACARA, n.d., para. 1). Interdisciplinary study benefits students in a number of ways, as it allows them to synthesize ideas from many areas and develop transferable skills such as critical thinking and communication skills (Appleby, 2015, para. 5). From a scholar’s perspective, however, the benefits of interdisciplinary studies may be offset by concerns such as isolating themselves from the ‘core of their field’ and potentially lowering their reputation and/or chances of tenure (Jones, 2010, p.79).

Brown & Adler (2008, pp. 18-19) examine the impact that technology has on education and how advances in technology are changing the way we learn. In the early days of the Web (Web 1.0) we saw the proliferation of Open Educational Resources (OERs), which opened up access to content; however, teaching and learning still happened in what they saw as a ‘Cartesian’ context, where knowledge could be seen as a substance and pedagogy as knowledge transfer. With the advent of Web 2.0, exemplified by its participatory nature, learning became more social and knowledge was socially constructed. This social construction of knowledge is seen as pivotal in open scholarship (Weller, 2011, p.99).

Brown examines collaboration through the massive growth in socially constructed knowledge in Wikipedia and also in communities of practice surrounding World of Warcraft (Brown, 2010). He argues games such as World of Warcraft help to develop the leadership, communications and knowledge management skills required by cross-functional teams in the 21st century workplace, and help develop information and communications technology skills. Thomas & Brown (2009, p.40) believe that in games such as World of Warcraft we also develop skills such as learning what the right things to know are. They argue these environments develop an immense problem solving capability through what they call a ‘networked imagination’ and provide insights into what a future workplace might look like. McGonigal (2010) too imagines a future where digital games are used to solve “really big”, global problems through building on the inherent affordances of those environments. They are immersive, goal-oriented with “epic quests”, build trust, enable massive collaboration, and importantly, have room to fail and try again.

In its Green Paper on Citizen Science, the European Union recognises the transformation of science towards a culture of openness and sharing that is enabled by information and communications technology (European Commission, 2013, p.14). The EU has adopted the term ‘Digital Science’ to describe this. Examples of citizen science using the power of digital technologies and the collective effort of the public to solve problems include:

  • NASA’s Open Innovation Initiative (NASA, 2016), encouraging the general public to collaborate with its scientists to innovate and solve complex challenges.
  • The PolyMath Project (“Polymath Project”, n.d.), using mass collaboration to help mathematicians solve complex mathematical problems. Importantly, the use of blogging is enabling the mathematicians to learn from people with complementary (multidisciplinary) knowledge (Nielsen, 2011, p.2) and to solve those problems in a fraction of the time it would otherwise take them.

In addition to collaborative problem solving, blogging can benefit scholarly life by increasing productivity, networking, and access to timely academic discussion (Weller, 2012, para. 2). It is, however, not without its challenges, one of which is the assessment and measurement of scholarly activity when it comes to tenure and promotion. For instance, blogging is not as easy to measure as peer-reviewed publications, and web statistics can be deceiving (Weller, 2012, para. 11). On publication, Veletsianos & Kimmons (2012, p. 772) cite evidence to suggest that open access papers are actually cited more than non-open access papers and that there is no evidence that open access publishing harms citations. The author’s own research practice would seem to back this up, as all sources referenced in this work and others are either open access or freely available through university library services. Paywalls have been, and continue to be, a barrier that can discourage access (Leeson, 2016, para. 2). There is however a cost to open publishing that still needs to be met, and effective, sustainable business models are still being refined to support it (Ilva, Laitinen & Saarti, 2016, p.24).

Katz (2010) examines the impact of technology on scholarship from both individual and enterprise perspectives. Starting with Boyer’s definition of scholarship as discovery, integration, application and teaching, Katz (2010, p.48) highlights how digital technologies are enhancing scholarly pursuits through massively improved, low-cost communications and through tools that increase our ability to attend to multiple tasks, to work at macro or micro scale and to access an abundance of information, all of which greatly assist interdisciplinary pursuits. Katz goes on to examine the impact of such technology on higher education institutions and, while he argues that information technology has revolutionized the mission of higher education and ‘secured a place for scholars and scholarship in the knowledge era’, higher education itself faces the same types of challenges which beset the broadcasting, publishing and entertainment industries (Katz, 2010, p.48). The consumer (of higher education) is now empowered, and competition for their custom has intensified as a result of the global impact of technology (Katz, 2010, p.54).

While authors such as Katz and the European Commission argue firmly on how technology is bringing about change in scholarship, Veletsianos & Kimmons (2012, p.769) in their introduction of the term Networked Participatory Scholarship (using participatory technologies to share, reflect and improve scholarship) are more circumspect in assigning causality when examining the relationship between technology and scholarly culture. In other words, are cultural shifts driving technology or is technology transforming culture? Veletsianos and Kimmons believe the two influence each other in a complicated manner.

Goodfellow (2014, p. 6) argues that scientific scholarship is inherently open even without a digital context. It is based upon the principle that scholars give up their intellectual property rights over their work in exchange for recognition as the discoverer, enhancing their reputation. Digital openness emphasises ‘universal participation’ in the social construction of knowledge and, Goodfellow (2014, p. 9) argues, presents a challenge for specialist disciplinary communities striving to establish a ‘stable and enduring record’ of knowledge (Goodfellow, 2014, p. 12). This notion is challenged by Wenger’s argument that the institution is ‘not the privileged focus of learning’, but that meaningful knowledge is created and acquired far more broadly (Wenger, 2011, p.5).

Another challenge for digital scholarship is the pressure faced by academics in competition for jobs, grant funding and also for tenure as releasing data prior to ‘publication’ can give competitors an advantage (Nielsen, 2011, p. 8). Scanlon (2014, p.17) found similar concerns from academics about releasing data they had worked on. Those academics could however see how they may benefit in tangential ways (e.g. opening up opportunities for collaboration) but were still resistant to the idea of making their data open.

Wolski & Richardson also recognise challenges associated with digital scholarship and, in response, have developed a model for institutional support of it. Organisationally, assessment of academic research tends to be based on publication; addressing this requires institutional change articulating a vision for research with a corresponding digital strategy (Wolski & Richardson, 2014, p.88). For digital infrastructure, institutions need to better understand which solutions they should be providing and the external tools (e.g. Skype, Dropbox) that their academics are using. Preservation of digital assets, particularly those that are ‘born digital’ (Thomas, 2015, para. 5), presents a challenge that Wolski & Richardson (2014, p.92) believe academic libraries should be well positioned to meet using national services such as the National Collaborative Research Infrastructure Strategy. Mason (2015, p.36), however, believes reliance on government-funded initiatives such as these is risky due to the machinations of the political systems that fund them, with long-term funding being particularly problematic. Long-term preservation of open publishing is also seen as a challenge by Ilva, Laitinen & Saarti (2016, p.25). The final component in Wolski & Richardson’s model for institutional support of digital scholarship is ongoing professional development and support (Wolski & Richardson, 2014, p.94).

Conclusion

Digital scholarship, and academia more broadly, do not exist in a vacuum; they have an increasingly complex relationship with technology and the wider global society. The advent of the participatory web has enabled the public to engage in massive online scientific challenges and empowered individuals to redefine and renegotiate their expectations and requirements of higher education. The era of openness and collaboration, while providing opportunities for scholars and institutions, is also challenging them to reinvent themselves to remain relevant. Emerging pedagogies in new technology environments, such as games and social platforms for learning, are enabling incredibly complex problems to be solved and education to occur well beyond institutional boundaries. Digital scholarship and interdisciplinary knowledge, research and practice are key components that will enable higher education to meet the current and emerging demands being placed upon it.

References

ACARA. (2010). Australian Curriculum Information Sheet. Retrieved August 17, 2016, from http://www.acara.edu.au/_resources/AC_SCIENCE_INFO_Senior_Sec_v1_FINAL.pdf.

ACARA. (2012). The Shape of the Australian Curriculum: Health and Physical Education. Retrieved August 17, 2016, from http://www.acara.edu.au/_resources/Shape_of_the_Australian_Curriculum_Health_and_Physical_Education.pdf.

ACARA. (n.d.). General capabilities. Retrieved August 17, 2016, from http://acara.edu.au/curriculum/general-capabilities.

Appleby, M. (2015). What are the benefits of interdisciplinary study? Retrieved August 20, 2016, from http://www.open.edu/openlearn/education/what-are-the-benefits-interdisciplinary-study

Brown, J. S. (2010). The Knowledge Economy of the World of Warcraft.   Retrieved August 16, 2016, from https://www.youtube.com/watch?v=RZG6WTRP-6E

Brown, J. S. & Adler, R. (2008). Open education, the long tail, and learning 2.0. Educause Review, 43(1), 16-20. Retrieved August 16, 2016, from http://www.educause.edu/ero/article/minds-fire-open-education-long-tail-and-learning-20

European Commission. (2013). Green Paper on Citizen Science. Retrieved August 26, 2016, from http://ec.europa.eu/newsroom/dae/document.cfm?doc_id=4122

Goodfellow, R. (2014). Scholarly, digital, open: an impossible triangle? Research In Learning Technology, 21. doi:http://dx.doi.org/10.3402/rlt.v21.21366

Hayes Jacobs, H. (1989). Interdisciplinary Curriculum: Chapter 1. The Growing Need for Interdisciplinary Curriculum Content.   Retrieved August 21, 2016, from http://www.ascd.org/publications/books/61189156/chapters/The-Growing-Need-for-Interdisciplinary-Curriculum-Content.aspx

Ilva, J., Laitinen, M.A. & Saarti, J., (2016). The Costs of Open and Closed Access: Using the Finnish Research Output as an Example. LIBER Quarterly. 26(1), pp.13–27. DOI: http://doi.org/10.18352/lq.10137

Jones, C. (2010) “Interdisciplinary Approach – Advantages, Disadvantages, and the Future Benefits of Interdisciplinary Studies,” ESSAI: Vol. 7, Article 26. Available at: http://dc.cod.edu/essai/vol7/iss1/26

Katz, R. (2010). Scholars, Scholarship, and the Scholarly Enterprise in the Digital Age. EDUCAUSE Review, 45(2), 44-56.

Leeson, J. (2016). On the irony of extolling the virtues of open access and open publishing from behind a paywall. Retrieved August 28, 2016, from http://thinkspace.csu.edu.au/jerry/2016/08/27/on-the-irony-of-extolling-the-virtues-of-open-access-and-open-publishing-from-behind-a-paywall/

Mason, J. (2015). Digital Amnesia and the Demise of a Learning Community. Learning Communities: International Journal of Learning in Social Contexts [Special Issue: Narrative Inquiry], 18, 30-39.

McGonigal, J. (2010). Gaming Can Make a Better World.   Retrieved August 24, 2016, from http://www.ted.com/talks/jane_mcgonigal_gaming_can_make_a_better_world?language=en

NASA. (2016). open NASA.   Retrieved August 26, 2016, from https://open.nasa.gov/about/

New Media Consortium. (2016a). 2016 NMC Technology Outlook – Australian Tertiary Education. Retrieved August 27, 2016, from http://cdn.nmc.org/media/2016-nmc-technology-outlook-au.pdf

New Media Consortium. (2016b). NMC Horizon Report: 2016 Higher Education Edition. Retrieved August 27, 2016, from http://cdn.nmc.org/media/2016-nmc-horizon-report-he-EN.pdf

Nielsen, M. (2011). Reinventing Discovery: The New Era of Networked Science: Princeton University Press.

Nissani, M. (1997). Ten cheers for interdisciplinarity: The case for interdisciplinary knowledge and research. The Social Science Journal, 34(2), 201-216. doi: http://dx.doi.org/10.1016/S0362-3319(97)90051-3

Polymath Project. (n.d.). In Wikipedia. Retrieved August 26, 2016, from https://en.wikipedia.org/wiki/Polymath_Project#Problems_solved

Scanlon, E. (2014). Scholarship in the digital age: Open educational resources, publication and public engagement. British Journal of Educational Technology, 45(1), 12–23. doi:10.1111/bjet.12010

Shin, U. (1986). The structure of interdisciplinary knowledge: A Polanyian view. Issues in Integrative Studies, 4, 93-104.

Thomas, W. (2015). What is digital scholarship? A typology [Blog post]. Retrieved August 25, 2016, from http://railroads.unl.edu/blog/?p=1159

Thomas, D, & Brown, J. S. (2009). Why virtual worlds can matter. International Journal of Learning and Media, 1(1), 37-49.

Weller, M. (2012). The virtues of blogging as scholarly activity. Chronicle of Higher Education, 58(35), B27-B28.

Weller, M. (2011). Openness in Education. In The Digital Scholar: How Technology Is Transforming Scholarly Practice (pp. 96–113). London: Bloomsbury Academic. Retrieved August 15, 2016, from http://dx.doi.org/10.5040/9781849666275

Wenger, E. (2011). Communities of practice: A brief introduction. Retrieved August 27, 2016, from https://scholarsbank.uoregon.edu/xmlui/handle/1794/11736

Wolski, M., & Richardson, J. (2014). A Model for Institutional Infrastructure to Support Digital Scholarship. Publications, 2(4), 83-99. doi: 10.3390/publications2040083

Veletsianos, G. & Kimmons, R. (2012). Networked Participatory Scholarship: Emergent techno-cultural pressures toward open and digital scholarship in online networks. Computers & Education, 58(2). doi: 10.1016/j.compedu.2011.10.001

on the irony of extolling the virtues of open access and open publishing from behind a paywall

[Image: irony]

I think the image above demonstrates the problem pretty well.  In my research I have been looking for resources on open access and open publishing, hoping to use them in my assignment.  I found what looks to be a great book on open science and the way science needs to change to embrace the idea of openness (in some areas it already is, and there are some great examples such as the Polymath Project and GenBank).  There is a preview of the book (you can read Chapter 1, which essentially provides teasers for the rest), suggesting that the remainder will be a really great discussion of openness, open access, open publishing, open science, and the opportunities and challenges in this area.  It seems to go on to extol the virtues of open access; however, all that great content is locked behind a paywall.  So much for openness!

In my eagerness to read the book (I can’t afford to buy yet another book on top of fees etc.) I tried to access it through the Uni library, only to find out it isn’t available.  Upon further investigation, it appears not all university libraries are created equal, and some (including one near me) seem to have much larger reserves to draw on to expand their (largely virtual) collections. Off I went to that university to see if I could borrow the book.  It turns out that I can, if I pay a joining fee that is even larger than the cost of the book.  That fee only provides access to the library for the current academic year (which is fast coming to an end) and will not be pro-rated. As a consequence, that potentially great resource was not accessed (along with a number of other resources sitting behind paywalls).

I guess I find some irony in having to pay to learn about open access and how good the author thinks it is!

Digital scholarship – or just note-taking in a public manner? (1 of ?)

A cautionary note: the title of this post is really just a question to myself.  It follows on from watching Martin Weller’s YouTube video “The Digital Scholar – how technology is transforming scholarly practice”, so here I am starting to collate my notes/thoughts on some of our readings into a blog post.  The format is going to be messy and I am going to keep adding to it, which is probably not good form for a blog post.  While I am not a scholar according to the definition we are working with, that is, “individuals who participate in teaching and/or research endeavors (e.g., doctoral students, faculty members, instructors, and researchers)” (Veletsianos & Kimmons, 2012, p.766), I am going to attempt to adopt some of the ‘open’ practices Weller speaks of.  All that being said, and if you have read this far, here are my evolving notes…

Weller, M. (2015). The Digital Scholar – how technology is transforming scholarly practice.

  • Characteristics of ‘open scholars’
    • distributed identity
    • …but with a ‘central place’
    • have a cultivated online network
    • open publishing
    • range of informal outputs
    • will always be trying new tools/services
    • identity tends to be a mix of personal/professional
    • automatically share output

Lessons

  • Lesson 1: Accept digital scholarship as relevant to you
  • Lesson 2: Resolve the tension between existing and new practice
  • Lesson 3: Use the network to enhance engagement and dissemination
  • Lesson 4: There are new research possibilities
  • Lesson 5: Embrace unpredictability

Weller makes the point that digital scholarship is complementary to, not in competition with, traditional scholarship.

Greenhow, C., Robelia, B., & Hughes, J. E. (2009). Learning, teaching, and scholarship in a digital age Web 2.0 and classroom research: What path should we take now?

Published in 2009, this is a bit old when it comes to discussing Web 2.0 and the participatory web; however, here are some thoughts:

  • begins by discussing the change from a ‘read-only’ web (Web 1.0) to a ‘read-write’ (Web 2.0) web.
  • interesting to see these authors start from the premise that “youth’s proclivities for web 2.0 influence learning and teaching” (Greenhow, Robelia & Hughes, 2009, p.248) – in contrast to Veletsianos and Kimmons (2012) reserving their opinion on the causal relationship between technology and culture (although they do concede that the way technologies are used in larger cultures may influence sub-cultures, e.g. academic publishers and research communities) (Veletsianos & Kimmons, 2012, p.770). It’s also interesting to see Katz’s (2010) opinion on technology as an enabler – “we may rightly argue that information technology has truly revolutionised the mission of higher education and is a prime enabler of the knowledge-driven era” (Katz, 2010, p.48)
  • introduces the notion of social scholarship – developing an online identity using services such as Delicious to tag, critique and share resources:
    • openness
    • sharing
    • collaboration
    • conversation
    • transparent revision (e.g. this blog post)
  • makes brief mention of licensing regimes such as Creative Commons to help academics define how they want to share their works (Greenhow, Robelia & Hughes, 2009, p.254)
  • The article starts to consider the ethics of using social network data for research (Greenhow, Robelia & Hughes, 2009, p.254) and identifies some of the privacy/permissions and other challenges, such as unintentionally influencing the data (e.g. the Hawthorne Effect).

Scanlon, Eileen. (2014). Scholarship in the digital age: Open educational resources, publication and public engagement.

Scanlon (2014), like others (e.g. Weller, 2015), begins by looking at Boyer’s identification of four functions of scholarship (Boyer, 1990, in Scanlon, 2014, p.13):

  • discovery
  • integration
  • application
  • teaching

… and then goes on to look at how digital scholarship might affect these functions.

References

Greenhow, C., Robelia, B., & Hughes, J. E. (2009). Learning, teaching, and scholarship in a digital age Web 2.0 and classroom research: What path should we take now? Educational Researcher, 38(4), 246–259. doi: 10.3102/0013189X09336671

Hawthorne Effect. (n.d.). In Wikipedia. Retrieved August 15, 2016, from https://en.wikipedia.org/wiki/Hawthorne_effect

Katz, R. (2010). Scholars, Scholarship, and the Scholarly Enterprise in the Digital Age. EDUCAUSE Review, 45(2), 44-56.

Scanlon, E. (2014). Scholarship in the digital age: Open educational resources, publication and public engagement. British Journal of Educational Technology, 45(1), 12-23. doi: 10.1111/bjet.12010

Veletsianos, G. & Kimmons, R. (2012). Networked Participatory Scholarship: Emergent techno-cultural pressures toward open and digital scholarship in online networks. Computers & Education, 58(2). doi: 10.1016/j.compedu.2011.10.001

Weller, M. (2015). The Digital Scholar – how technology is transforming scholarly practice. Retrieved August 15, 2016 from https://www.youtube.com/watch?v=ut4boLiWLHM&feature=youtu.be  

Case Study Proposal

Title

Can Learning Analytics provide the insights into online student learning that on-campus teachers have through direct access to student interactions?

Brief Description

In traditional university settings students have direct interactions with academics through lectures and tutorials, enabling the potential for deep insights into the way individual students learn and improving their learning outcomes. In online settings, some of these insights may be provided through learning analytics in learning management systems (LMSs); however, a great deal of learning takes place outside the LMS. While it may be possible to collate data related to student activity outside the LMS, this research will investigate the challenges around that data and whether it is meaningful beyond basic indicators such as participation and/or some form of engagement.

Expected Outcomes

This case study will:

  • Demonstrate the importance of gaining insights into ‘online students’ learning.
  • Demonstrate the level of sophistication that LMSs are able to provide into student learning that would need to be matched or at least approximated by analysis of data from other sources.
  • Demonstrate the importance of a holistic view of students’ online learning.
  • Describe the challenges associated with collecting insights into student learning beyond the LMS (eg privacy, ethical, policy, validity).
  • Examine the challenges of collecting data from different sources and making sense of that data (eg skills required, tools available, interoperability, semantics, vocabularies etc; see the sketch after this list).
  • Determine the viability of providing sophisticated insights into student learning beyond the LMS.
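
As a hint of what the interoperability and vocabulary challenge above looks like in practice, the sketch below normalises activity records from two hypothetical sources (an LMS forum export and an external blog platform) into a single event format before any analysis. All field names, identifiers and formats are invented for illustration; real platforms will differ.

```python
# A minimal sketch of normalising learning activity from different (hypothetical)
# sources into one common record format. Field names and formats are invented.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LearningEvent:
    student_id: str
    source: str        # e.g. "lms_forum", "blog"
    action: str        # a shared, controlled vocabulary, e.g. "posted", "commented"
    timestamp: datetime

def from_lms_forum(record: dict) -> LearningEvent:
    # Hypothetical LMS export format: {"user": ..., "posted_at": ...}
    return LearningEvent(
        student_id=record["user"],
        source="lms_forum",
        action="posted",
        timestamp=datetime.fromisoformat(record["posted_at"]),
    )

def from_blog_comment(record: dict) -> LearningEvent:
    # Hypothetical blog platform format: {"author_email": ..., "date": ...}
    return LearningEvent(
        student_id=record["author_email"],  # still needs matching to a student identity
        source="blog",
        action="commented",
        timestamp=datetime.fromisoformat(record["date"]),
    )

events = [
    from_lms_forum({"user": "s1234567", "posted_at": "2016-09-12T10:15:00"}),
    from_blog_comment({"author_email": "student@example.edu", "date": "2016-09-13T08:05:00"}),
]
print(events)
```

Even in this toy form, the hard parts of the case study are visible: matching identities across systems, agreeing on a shared vocabulary of actions, and deciding whether the resulting events say anything meaningful about learning rather than mere activity.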

Case Study Plan

Major Steps:

  1. Validate the assertions made in relation to the increase of online/distance learning in higher education and also the increasingly sophisticated analytics functionality in LMSs and other formal education-specific online services.
  2. Examine the challenges associated with using services outside the LMS for teaching and learning in higher education.
  3. Demonstrate how data from non-educational web services is being collected to gain insights into student learning and examine the validity/effectiveness of that data/approach.
  4. Propose next steps forward for learning analytics in using data from multiple online services (both education and non-education specific).

Resources required:

  • Evidence/data showing the increasing participation in online-only learning and consequent lack of direct academic/student interactions.
  • Research/literature demonstrating the complexity and science utilized within analytics engines in LMSs.
  • Literature demonstrating the challenges associated with collecting data from different (eg non-educational) platforms.
  • Interviews with learning analytics experts.
  • Evidence describing current approaches and challenges.

Projected timeline:

Week beginning Monday 22 August

  • Begin literature review.
  • Connect with learning analytics communities of practice/experts.

Week beginning Monday 29 August

  • Continue with literature review.
  • Organise interviews with experts.

Week beginning Monday 5 September

  • Consolidate evidence supporting the assertions made in the description of topic.

12 September – 25 September

  • Document challenges with collecting data from outside the LMS and the validity of this data (from a learning context).

Week beginning Monday 26 September

  • Develop insights/next steps for the application of learning analytics with data from both educational and non-educational platforms.
  • Complete the draft case study report.

Week beginning Monday 3 October

  • Final draft of case study report.
  • Reflection on research report.

Week beginning Monday 10 October

  • Submit final draft.

Thoughts on Colloquium 2: Leading Learning in a Digital World

Like everyone else, I found Pip’s presentation really quite inspirational.  The amount of content covered was almost overwhelming.  Coming from more of a technology background than an educational background, I feel that I have a strong leaning towards the use of technologies in learning and teaching, and I think I am quite aware of the ICT capabilities that are part of the Australian Curriculum, although I am more involved in higher education than in primary and secondary education.  While it was great to see such use of technology in the classroom, I started to wonder what might be a sensible balance between having to acquire ICT skills and, at the same time, learning the subject being taught (e.g. Japanese).  I wondered just how much time is being spent learning about the technology and how much on the actual subject.  Pip herself mentioned this, citing feedback from students about the technology challenges associated with one of their projects.

It was also interesting to hear about her approaches to change and to encouraging more innovative practice.  Looking at the models in Yvonne’s, Chantal’s and Jo’s summary, I feel a bit uncomfortable with some of the labels used when categorizing staff in relation to change.  ‘Fundamentalists’ and ‘laggards’ are labels that may not resonate particularly well with those staff.  In dealing with people in these categories we are essentially dealing with resistance to change, and it is important to understand that some approaches will work well with some people and not at all with others.  Resistance may be cognitive (I don’t know), psychological (I am not able) or ideological/power-driven (I don’t want).  Plying staff with more information and more examples may help with cognitive resistance and, to a certain extent, psychological resistance, but it will only reinforce ideological/power-driven resistance.  Ideological resistance needs to be addressed at the level of values (e.g. perhaps by connecting with shared historical values and through strong leadership).

I found that the following video on change management, while not education-specific, provides some great insights and also identifies a number of areas for further investigation on this subject.