EdTech Global

Developing a global understanding of educational technology

Author: claire.hazenberg

Module 7: ICT Integration – Part 2 (Assessment & Monitoring)

In my previous post, ICT Integration – Part 1 (Defining), I defined what ICT integration looks like in my context. In this post I will suggest several assessment and monitoring approaches that can be used to measure the impact of this integration.

How do we assess the degree to which we are integrating ICT into the classrooms?

Current monitoring and evaluation processes within the project do, to some extent, assess the degree to which technology is being integrated into the classroom. This occurs through the analysis of quantitative data showing the number of hours students use the devices. While this provides a clear measure of the time spent using the technology, it gives little indication of the impact of the integration itself.

 

So how can we measure the impact?

Collaboratively as a group, the Education Specialists have tried to tackle this issue previously. Through meetings and copious amounts of WhatsApp messaging we have identified some strategies that could work within our context to measure the impact of the technology integration.

Strategy One – Triangulate data to measure learning growth

  • What? Triangulating data on the time spent using the device with previous and current literacy and numeracy learning achievement data for each child.
  • Why? If the technology integration is proving effective, then the students who spend more time on the devices should be making greater gains in numeracy and literacy compared to those who use the devices for less time.
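A minimal sketch of this triangulation, using entirely hypothetical hours-on-device and score-gain figures (in practice the data would come from the project's usage logs and the literacy/numeracy achievement records):

```python
# Hypothetical data: hours on device per term, and gain = current
# assessment score minus previous score, for the same six learners.
from math import sqrt

hours = [4, 10, 6, 15, 8, 12]
gains = [2, 9, 4, 14, 6, 11]

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(hours, gains)
print(f"r = {r:.2f}")  # a strongly positive r supports the hypothesis
```

A positive correlation on its own would not prove causation, of course — stronger students may simply be given more device time — but a flat or negative correlation would be a clear warning sign.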

Strategy Two – Leverage and analyze the data collected from the app to measure task fluency and understanding

  • What? Analyze the time spent on a task alongside the number of attempts the learner makes before completing the task correctly.
  • Why? The tasks, which are structured into units, are designed using rote-learning principles: as a unit progresses, the tasks are simply repeated with no increase in difficulty. As a result, students moving through a unit should complete tasks in shorter amounts of time and with higher accuracy, demonstrating learning growth through increased understanding and fluency.
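One way to operationalise this, sketched here with hypothetical app-log records (the real field names and export format would depend on the app):

```python
# Each record: (task position in unit, seconds on task, attempts before correct).
# Under the rote-repetition design, later tasks in a unit should show
# shorter times and fewer attempts for the same learner.
records = [
    (1, 120, 3), (2, 110, 3), (3, 90, 2),
    (4, 70, 2), (5, 60, 1), (6, 45, 1),
]

def averages(rows):
    """Mean time-on-task and mean attempts for a slice of records."""
    times = [t for _, t, _ in rows]
    tries = [a for _, _, a in rows]
    return sum(times) / len(times), sum(tries) / len(tries)

early = [r for r in records if r[0] <= 3]   # first half of the unit
late = [r for r in records if r[0] > 3]     # second half of the unit
early_time, early_tries = averages(early)
late_time, late_tries = averages(late)

# Growth signal: later tasks are both faster and more accurate.
print(late_time < early_time and late_tries < early_tries)
```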

Strategy Three – Leverage and analyze the data collected from the app to measure basic digital literacy skills of learners

  • What? Compare the amount of time learners take to log on to their profile and begin working.
  • Why? This comparison would reveal any increase (or decrease) in digital literacy skills, that is, the ability to locate and use the app. Ideally, log-on time should decrease the more students use the devices.
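The trend could be checked with something as simple as comparing a learner's early sessions against their recent ones — again sketched with hypothetical per-session timings:

```python
# Hypothetical data: seconds one learner takes to log on and begin
# working, one figure per session, in chronological order.
logon_seconds = [95, 80, 72, 60, 55, 40, 35, 30]

half = len(logon_seconds) // 2
first_half = logon_seconds[:half]
second_half = logon_seconds[half:]

# A positive drop means log-on time is falling: growing digital literacy.
drop = sum(first_half) / len(first_half) - sum(second_half) / len(second_half)
print(f"average log-on time fell by {drop:.0f} seconds")  # 37 for this sample
```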

Strategy Four – Teacher questionnaire to measure technical and pedagogical ICT knowledge

  • What? Conduct qualitative and quantitative teacher questionnaires.
  • Why? As part of the ongoing training facilitated by the Education Specialists, each session should allocate time for a questionnaire that aims to measure the technical and pedagogical ICT knowledge of teachers and school leaders. The questionnaire would be modelled on the UNESCO ICT Competency Framework for Teachers (UNESCO, 2011), with a particular focus on the following two areas:
    • Pedagogy – ICT enhanced learning and
    • Teacher Professional Learning – Digital Literacy.

Strategy Five – Community consultations to measure awareness and values of ICT integration in education

  • What? Hold bi-annual community consultations (mothers’ groups, parent associations, etc.) to discuss the benefits of ICTs, address concerns and collect qualitative data on community awareness of, and the value placed on, ICT integration in schools.
  • Why? In Malawi the prominence of ICT in everyday life is minimal; as such, it is believed that the impact of ICT integration is felt not just within the school but also in communities, particularly in community perceptions of the efficacy of ICT.

References:

United Nations Educational, Scientific and Cultural Organization (UNESCO). (2011). UNESCO ICT competency framework for teachers. Retrieved from https://unesdoc.unesco.org/ark:/48223/pf0000213475

 

Module 7: ICT Integration – Part 1 (Defining)

If we can’t define it, we can’t measure it.

This is the opening sentiment of Proctor, Watson & Finger’s (2003) study on measuring ICT curriculum integration. Since 2003, the need to quantify and assess ICT integration has been a prominent area of study. Numerous methodologies, strategies and approaches for measuring integration have been suggested, including the use of questionnaires (Jamieson-Proctor et al., 2007; Christensen & Knezek, 2008), maturity models informed by international ICT competency standards (Solar, Sabattin & Parada, 2013) and “Technology Mapping” methodologies (Angeli & Valanides, 2009).

To me, this diversity in assessment and measurement tools indicates that there is no one way to define ICT integration. For example, integration can be defined by:

  • the consistent presence of ICT across schools and subject areas (Çapuk, 2015),
  • the uptake of ICTs in schools by teachers and pupils (Cox, 2008),
  • specific uses and ranges of ICT use (Marshall & Cox, 2008),
  • its ability to empower (Gareis & Hüsing, 2009) and
  • effective use of ICT in digital communication and collaboration (Shamir-Inbal & Blau, 2017).

How do I define ICT integration in my context?

Adapted from Proctor, Watson & Finger (2003)

Using the model by Proctor, Watson & Finger (2003), it is clear that in terms of ICT integration my current context exists within the third quadrant.

 

Within the scope of the project there is wide use of ICT amongst learners; however, at a school and community level there is still a need to justify and substantiate the efficacy of ICT integration in education. With plans for the project to be adopted into a government initiative, there is more need than ever to resist the urge to turn back. Instead, it is vital to keep the foot on the pedal and continue to build capacity and increase community awareness. Doing so will hopefully result in the following:

  • See – all learners in the program using devices twice a week; multiple teachers facilitating sessions; improved learning outcomes in literacy and numeracy; increased digital literacy amongst learners; and increased teacher ICT technical and pedagogical knowledge.
  • Hear – parents and community members talking positively about the initiative, headteachers actively supporting teachers and students sharing their successes.
  • Feel – inspired, supported and confident.

I acknowledge this is a diverse range of indicators and, as such, they need to be measured and assessed using a range of qualitative and quantitative monitoring tools.

 

References

Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development, and assessment of ICT–TPCK: Advances in technological pedagogical content knowledge (TPCK). Computers & Education, 52(1), 154–168.

Çapuk, S. (2015). ICT Integration models into middle and high school curriculum in the USA. Procedia-Social and Behavioral Sciences, 191, 1218-1224.

Christensen, R., & Knezek, G. (2008). Self-report measures and findings for information technology attitudes and competencies. In Voogt, J., & Knezek, G. (Eds.), International handbook of information technology in primary and secondary education, Vol. 20 (pp. 349–365). US: Springer.

Cox, M. J. (2008). Researching IT in education. In Voogt, J., & Knezek, G. (Eds.), International handbook of information technology in primary and secondary education, Vol. 20 (pp. 965–981). US: Springer.

Gareis, K., & Hüsing, T. (2009). Measuring transformational use of ICTs at regional level. In Handbook of research on ICT-enabled transformational government: A global perspective (pp. 351-378). IGI Global.

Jamieson-Proctor, R., Watson, G., Finger, G., Grimbeek, P., & Burnett, P. C. (2007). Measuring the use of information and communication technologies (ICTs) in the classroom. Computers in the Schools, 24(1), 167–184.

Marshall, G., & Cox, M. J. (2008). Research methods: Their design, applicability and reliability. In Voogt, J., & Knezek, G. (Eds.), International handbook of information technology in primary and secondary education, Vol. 20 (pp. 983–1002). US: Springer.

Proctor, R., Watson, G., & Finger, G. (2003). Measuring information and communication technology (ICT) curriculum integration. Computers in the Schools, 20(4), 67–87.

Shamir-Inbal, T., & Blau, I. (2017). Which pedagogical parameters predict the general quality of ICT integration from the perspective of elementary school leaders?. Computers in the Schools, 34(3), 168-191.

Solar, M., Sabattin, J., & Parada, V. (2013). A maturity model for assessing the use of ICT in school education. Journal of Educational Technology & Society, 16(1), 206-218.

Module 6: Digital Technology Resources Queensland

Australian Curriculum: Digital Technology

P-10 subject curriculum, achievement standards.

Digital Technologies Hub

Lesson ideas, learning sequences. 

C2C & The Learning Place

Learning sequences, lesson plans and assessment tasks contextualised for Queensland schools.

Australian Institute for Teaching and School Leadership

Teaching standards, leadership strategies.

Digital Strategy – Education Queensland

State-wide strategy to deliver personalised, collaborative and integrated learning experiences for the digital generation.

Scootle

National repository of digital resources aligned to the Australian Curriculum.


© 2021 EdTech Global
