The Evidence-based TL

Evidence-based practice should not only drive the pedagogical initiatives of classroom teachers; it should also be at the core of the work of teacher librarians. Evidence can be used to identify a need or problem, to plan and implement initiatives, and to evaluate and reflect on those programs. It can also be used to justify library initiatives and programs, budgetary and staffing requirements, and a range of other resourcing needs. That said, when evidence is gathered by the practitioner for advocacy purposes, the trustworthiness of the evidence and its interpretation comes into question. It is therefore vital that evidence-gathering practices be holistic and incorporate a range of methods.

Vertical and horizontal data sets can help schools better understand their students’ learning journeys while considering their specific context. Data can be interpreted in myriad ways and manipulated to suit a purpose. This becomes a problem when data is used to compare school performance rather than to focus on individual student progression. The focus must be on student learning progress rather than comparative outcomes. Vertical data gathered from high-stakes testing should therefore be used in conjunction with horizontal data that provides a more holistic picture of students’ learning levels (Renshaw, Baroutsis, van Kraayenoord, Goos, & Dole, 2013).

Teacher librarians adopt a holistic approach to evidence-based practice by considering the whole school they serve and by drawing on a range of evidence in a range of ways (Gillespie, 2013). As seen in Figure 1, Todd (2015) proposes a holistic model for school libraries, which includes evidence for practice (Foundational: informational), evidence in practice (Process: transformational), and evidence of practice (Outcomes: formational). Furthermore, Robins (2015) highlights the usefulness of action research as part of a holistic approach to collect qualitative and quantitative data and connect educational research with improved practice. Oddone (2017) provides an overview of action research for school libraries, as seen in Figure 2.

A table showing Todd's three categories of evidence-based practice:
1. Evidence FOR practice (Foundation: informational) – existing formal research provides the essential building blocks for professional practice.
2. Evidence IN practice (Process: transformational) – locally produced evidence (data generated by practice; librarian-observed evidence) is meshed with research-based evidence to provide a dynamic decision-making environment. Applications and actions.
3. Evidence OF practice (Outcomes: formational) – user-reported evidence shows that the learner changes as a result of inputs, interventions, activities, and processes. Results: impacts and outcomes; evidence of the closing of gaps.
Figure 1: Holistic model of evidence-based practice for school librarians (Todd, 2015).


Infographic showing: Action Research for Teacher Librarians: A brief introduction and overview to Action Research as a tool for evidence based practice for teacher librarians.
Figure 2: Action Research for Teacher Librarians: A brief introduction and overview to Action Research as a tool for evidence based practice for teacher librarians. (Oddone, 2017).

Gillespie (2013) found that teacher librarians gather evidence through two key modes: engaging and encountering. The nature of evidence-based practice is therefore not linear. Teacher librarians gather evidence through purposeful and accidental encounters, which involve research-based and/or practitioner-based evidence, and apply it for improved practice through a combination of intuition and reflection (Gillespie, 2013). As with improvement in any field, particularly education, reflection is key to driving improvement because it provides the metacognitive prompts needed to interpret evidence and to bring the practitioner’s professional experience and expertise to bear in applying evidence meaningfully.

Teacher librarians can utilise evidence-gathering tools to gather evidence for, in, and of practice. Gillespie (2013) recommends that valuable evidence be drawn from and used within three areas: teaching and learning, library management, and professional practice. A tool such as the School Library Impact Measure (which I discussed in my final assessment for ETL504) equips teacher librarians with a framework to assess the impact of their instruction on student learning outcomes during a Guided Inquiry experience (Todd, Kuhlthau, & Heinström, 2005). To gather evidence of library management, including collection development, the environment, and services, teacher librarians can use benchmarks from peer institutions and questionnaires or surveys to evaluate performance, set new goals or standards, and implement strategies. Benchmarks from peer institutions can be sourced through network meetings and the Softlink survey.

As a school library that runs a college-wide academic reading and writing program, we utilise data to assess the extent to which the program effectively improves student learning. Reading comprehension tests help the teacher librarians and teachers determine the effect of the strategies on student abilities. Teacher librarians should work with heads of department and classroom teachers to analyse and digest the data in meaningful ways. Collaboration is central to how the data analysis process is perceived: teachers do not want to be judged on the results of their classes. While this is important to note, it is equally important to hold honest and open conversations about the opportunities the data may present. Viewing the results through an opportunities lens ensures all stakeholders feel safe and valued, and keeps student learning central.

This year, I have been delving deeper into the world of data and evidence to assess the impact the Study Skills program has on our boarding students. The program is designed to assist boarding students with organisation, prioritisation, study skills, assessment skills, and research skills, with growth mindset underpinning the sessions. A teacher librarian (myself on Tuesday and Wednesday nights, and our Head of Library on Monday nights) facilitates a half-hour targeted session with the students in the senior library – one cohort per night on a weekly rotation. These sessions are timely and relevant to assessment and class work to ensure their value is clear. After the half-hour session, the students undertake independent study for the remaining hour and a half while the teacher librarian circulates and assists students in small groups and one-on-one.

I am pleased to share that since the inception of the program last year, the boarders’ results are trending up – often at a greater rate than those of the day students. While many factors influence the boarders’ results (strategies from classroom teachers, tutoring, and support from boarding supervisors and parents, for example), the results are also indicative of a renewed impetus for study and a changing culture that has developed from the program. The program has been a feat of collaboration and has seen staff members from across the college come together for a common cause. To gather evidence, I used data from both Learning Analytics and TASS to compare day and boarding groups, and boarding groups over time. This also provides opportunities to compare external high-stakes results, such as NAPLAN, with curriculum results. This evidence was used not only to assess the effectiveness of the program but also to open conversations with staff and students about learning outcomes. From this, I created a report to document the findings and present them to teaching staff and the leadership team (Figure 3).
I have had incredibly valuable conversations with students concerning their progress and strategies going forward. One-on-one, I talked to students about their GPAs, compared these against their report cards, and then again against their individual assessment tasks to identify strengths and areas for improvement. The additional ownership students took of their results, and the empowerment they felt, was palpable. The evidence is used to evaluate and reflect on the programs, to plan targeted sessions, to empower students, and to advocate for the work done by the teacher librarians.

Title page of Study Skills report. The page states: "Study Skills, 6-monthly report, 2018 Semester one."
Figure 3: Title page of Study Skills report.
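The cohort comparison described above (day students versus boarders over time) can be sketched in code. The figures, field names, and GPA scale below are entirely invented for illustration; real data would come from systems such as Learning Analytics or TASS.

```python
# Hypothetical sketch: comparing boarder and day-student results over time.
# All sample data here is invented for illustration only.

from statistics import mean

# Each record: (semester, cohort, gpa) -- invented sample data.
records = [
    ("2017-S2", "boarder", 7.1), ("2017-S2", "boarder", 6.8),
    ("2017-S2", "day", 8.0), ("2017-S2", "day", 7.6),
    ("2018-S1", "boarder", 7.9), ("2018-S1", "boarder", 7.5),
    ("2018-S1", "day", 8.1), ("2018-S1", "day", 7.8),
]

def cohort_means(records):
    """Average GPA for each (semester, cohort) group."""
    groups = {}
    for semester, cohort, gpa in records:
        groups.setdefault((semester, cohort), []).append(gpa)
    return {key: round(mean(vals), 2) for key, vals in groups.items()}

def growth(means, cohort, earlier, later):
    """Change in a cohort's average between two semesters."""
    return round(means[(later, cohort)] - means[(earlier, cohort)], 2)

means = cohort_means(records)
print(growth(means, "boarder", "2017-S2", "2018-S1"))  # boarder trend
print(growth(means, "day", "2017-S2", "2018-S1"))      # day trend
```

In this toy data the boarders' average rises faster than the day students', which is the kind of comparison the Study Skills report draws; the same grouping logic extends to comparing NAPLAN results against curriculum results.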

Ultimately, learning must be at the centre of the analysis, discussion, and subsequent initiatives or actions. Evidence must be gathered at a local level to determine the effectiveness of library programs, and through other means, such as empirical research, to ensure best practice. Evidence must be made explicit, and concerted efforts must be made to “measure the relationship between inputs, outputs, actions and student outcomes” (Hughes, Bozorgian, & Allan, 2014, p. 15). These practices can help to ensure the longevity of the school library and establish the library and its staff as invaluable to the school community.


References

Gillespie, A. (2013). Untangling the evidence: Teacher librarians and evidence based practice [Thesis]. Retrieved from https://eprints.qut.edu.au/61742/2/Ann_Gillespie_Thesis.pdf

Hughes, H., Bozorgian, H., & Allan, C. (2014). School libraries, teacher-librarians and student outcomes: Presenting and using the evidence. School Libraries Worldwide, 20(1), 29-50. doi: 10.14265.20.1.004

Oddone, K. (2017). Action research for teacher librarians [Infographic]. Retrieved from https://my.visme.co/projects/jwvj7ogk-action-research-for-teacher-librarians#s1

Renshaw, P., Baroutsis, A., van Kraayenoord, C., Goos, M., & Dole, S. (2013). Teachers using classroom data well: Identifying key features of effective practices. Final report. Retrieved from https://www.aitsl.edu.au/docs/default-source/default-document-library/teachers-using-classroom-data-well.pdf

Robins, J. (2015). Action research empowers school librarians. School Library Research, 18, 1-38. Retrieved from http://ezproxy.csu.edu.au

Todd, R. J. (2015). Evidence-based practice and school libraries: Interconnections of evidence, advocacy, and actions. Knowledge Quest, 43(3), 8-15. Retrieved from https://search-proquest-com.ezproxy.csu.edu.au

Todd, R., Kuhlthau, C., & Heinström, J. (2005). School library impact measure SLIM: A toolkit and handbook for tracking and assessing student learning outcomes of guided inquiry through the school library. Retrieved from https://cissl.rutgers.edu/sites/default/files/inline-files/slimtoolkit.pdf

Algorithms Rule the World: It's time to get SCRAPpy

Image of a conceptual computer algorithm. Neon green data lines running vertically down a projected screen in a black room.

The ability to read and interpret information is a fundamental skill needed to participate fully in the world. These basic skills (though actually quite complex) will continue to be as important as, if not more important than, they were in the past. The information-rich world is expanding; however, algorithms are filtering the information we see. People’s values and beliefs are consistently reinforced, while other perspectives are left out or buried on page 3 of Google search results – an equally ominous fate. In turn, this leads to confirmation bias, which can be detrimental to those who cannot critically evaluate what they are experiencing and reading.

Algorithms present users with a calculated selection of “relevant” information, so users must develop and employ the skills needed to work within these algorithms. The top search results are not always the most useful. Searchers should not ignore other titbits of information, such as the snippets Google displays under each search result. While Google can replace a page’s meta description with its own algorithmically determined snippet (Silver Smith, 2013), the snippet is still a useful port of call that many searchers skip over. It allows savvy searchers to make a preliminary assessment of the relevance and worth of the search results – which are not always at the top of the page (Wineburg & McGrew, 2017). But algorithms are not the only cause of confirmation bias. Ashrafi-Amiri and Al-sader (2016) suggest that assumptive search queries, based on fact retrieval and verification, characteristically retrieve more biased results than non-assumptive queries; that is, queries that are knowledge-acquisitive, comparative, analytical, or exploratory in nature. Information literacy instructors must be aware of this and consider it when developing instruction for students.

TLs must address the critical thinking skills required to work with, and within, algorithms that reinforce bias. Maynes (2015) identifies the role of information literacy instructors in explicitly teaching students about the forms of bias, ways to identify their own bias, and skills to mitigate its potential effects. This involves teaching the metacognitive skills students need to know not only which strategies to use but how, when, and why to use them (Maynes, 2015). A combination of lateral and vertical reading is useful in all information evaluation situations. While many libraries utilise a CRAP (Currency, Reliability, Authority, Purpose/Point of View) test to step students through the information evaluation process, other steps can also be added so students tune into their metacognition and identify their bias (Wineburg & McGrew, 2017). Allan (2017) suggests incorporating some form of personal reflection into the information literacy sessions offered to students. Students must not only be taught that confirmation bias exists; they must also be taught the skills to identify it in themselves and to deal with it when it occurs. One such strategy is to notice when a source of information elicits an emotional response – does it make you happy or sad? Does it reinforce or challenge your views? Developing this self-regulation prompts the reader to seek additional information and to consider the opposite or an alternative (Hirt & Markman, 1995; Lord, Lepper, & Preston, 1984; Mussweiler, Strack, & Pfeiffer, 2000). It requires information searchers to reflect on their reactions at each step and to consider whether their evaluation of the usefulness or credibility of a source would be the same if it presented the opposing viewpoint. Deliberately considering the opposing viewpoint requires searchers to consider their own bias and the bias of others, and is a powerful strategy for unveiling subconscious or hidden bias.
Allan (2017) posits adding an S (Self-examination or Self-awareness) to the beginning of the CRAP test would highlight the importance of identifying and recognising cognitive and confirmation bias.

SCRAP it: Source evaluation process.
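As a rough illustration only, the SCRAP steps could be captured as a simple checklist structure. The question wording and function names below are my own, not drawn from Allan (2017) or from any library's published test.

```python
# Illustrative sketch of a SCRAP-style source checklist.
# The question wording here is hypothetical, not taken from a published test.

SCRAP_QUESTIONS = {
    "Self-examination": "Have I noticed my own reaction and possible bias?",
    "Currency": "Is the source recent enough for this topic?",
    "Reliability": "Is the information supported elsewhere (checked laterally)?",
    "Authority": "Who is the author or organisation, and what is their expertise?",
    "Purpose": "Why was this published, and what point of view does it take?",
}

def evaluate_source(answers):
    """Given {criterion: True/False}, list the criteria that still need work."""
    return [c for c in SCRAP_QUESTIONS if not answers.get(c, False)]

# Example: a searcher who has checked everything except their own bias.
answers = {c: True for c in SCRAP_QUESTIONS}
answers["Self-examination"] = False
print(evaluate_source(answers))  # -> ['Self-examination']
```

Placing Self-examination first mirrors Allan's point that recognising one's own bias should precede, not follow, the other evaluation steps.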

The importance of slowing down the information evaluation process by thinking effortfully and deliberately (Kahneman, 2011) and evaluating laterally (Wineburg & McGrew, 2017) is central to 21st century information and digital literacy. Evaluating laterally requires searchers to seek and consider context and perspective, which means they must seek additional information. Slowing down does not simply mean taking longer to read the article and its parts – it means careful and deliberate consideration, and delaying judgement by first taking your bearings and exploring laterally. This may mean first leaving the site, or visiting the About Us section to find out more about the author or the organisation, before navigating back to the original source (Wineburg & McGrew, 2017). Lateral thinking can occur at multiple stages of the CRAP test, particularly when assessing the reliability and the purpose or point of view of the source; searchers will need to explore other sites to learn more about the information. While searchers will not always slow down and employ lateral reading, it is important to know when to do so. High-stakes situations – where the searcher may already possess a strong bias, where the information may have significant consequences for the searcher or others, or where the issue or topic is highly contested – may require more deliberate reasoning to ensure the searcher acquires balanced, truthful information (Maynes, 2015). Considering the opposite is another practical strategy to employ in these situations.

It is clear that information evaluation and digital literacy skills need to evolve with changing demands and issues within the information landscape. Information literacy instructors must stay abreast of these changes and adapt evaluation strategies as needed. A start might be to model and incorporate lateral reading into existing strategies and follow Allan’s (2017) suggestion and put that S at the beginning of CRAP.

 

References

Allan, M. (2017). Information literacy and confirmation bias: You can lead a person to information, but can you make him think? Informed Librarian Online, 2017(5). Retrieved from https://asu-ir.tdl.org/handle/2346.1/30699

Ashrafi-Amiri, N. & Al-sader, J. (2016). Effects of confirmation bias on web search engine results and a differentiation [Thesis]. Retrieved from https://core.ac.uk/download/pdf/43564372.pdf

Hirt, E.R., & Markman, K.D. (1995). Multiple explanation: A consider-an-alternative strategy for debiasing judgments. Journal of Personality and Social Psychology, 69(6), 1069– 1086.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Lord, C. G., Lepper, M. R., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality and Social Psychology, 47(6), 1231-1243.

Mussweiler, T., Strack, F., & Pfeiffer, T. (2000). Overcoming the inevitable anchoring effect: Considering the opposite compensates for selective accessibility. Personality and Social Psychology Bulletin, 26(9), 1142-1150.

Silver Smith, C. (2013). Influencing how Google displays your page description. Retrieved from https://www.practicalecommerce.com/influencing-how-google-displays-your-page-description

Wineburg, S. & McGrew, S. (2017). Lateral reading: Reading less and learning more when evaluating digital information [Report]. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3048994

In the Dip

I had the opportunity this week to work with Year 10 HPE students on their “Risky Behaviours” unit. In groups, students select one area of interest, analyse the topic and statistics, develop guidelines, and produce a presentation. The students had previously visited the library to unpack their assignment by identifying instruction words, key terms, and limiters, and to identify and practise effective search skills. Their second lesson with a TL was a check-in during the exploration stage, which gave me the opportunity to guide students through and out of the dip. I asked questions to gauge their knowledge, interest areas, and chosen avenues. We talked about potential pathways to narrow their topics, Boolean operators to assist in searching, and helpful databases and websites. While these were all discussed during the first lesson, students in the exploration stage require reminders and guidance to refocus their thoughts and actions. Through these conversations, the students had many light bulb moments, connecting their chosen topic to events they had seen or read in the news. This piqued their interest and marked the beginning of the students entering the Third Space. Students went on to discuss, in their groups, their personal experiences and viewpoints on the topic and how it relates to their lives. The session allowed me to remind students to refer back to the task to ensure their research and responses were relevant and would meet task requirements, and it allowed me to engage in their research process as a guide and fellow learner. This zone of intervention allows TLs to help students move from exploration to formulation, equipping them with the skills to put information into some kind of order and better make sense of what they are finding.

Graphic showing the GID process and the light bulb moment that can occur with intervention during the explore stage, which transitions students to the identify stage
Moving from Explore to Identify. GID Framework by Kuhlthau, C., Maniotes, L. K., & Caspari, A. K. (2012). Light bulb Pixabay image by jambulboy https://pixabay.com/en/question-questions-man-head-2519654/ modified under a Creative Commons (CC0 1.0) license

The TL’s Place in the Dip

(and Before and Beyond)

Photograph showing a dip in a curved road and a road sign to the right of the road that says "dip". A mountain appears in the background.
“DIP sign as mountain seen from the visitor centre in evening” flickr photo by daveynin https://www.flickr.com/photos/daveynin/37451356472 shared under a Creative Commons (BY) license

Reflecting on the need for an evidence-based inquiry process to support 21st century inquiry, I have come to revisit Kuhlthau’s ISP and the GID Framework once more. Currently, the TL programs running at my college include a variety of sessions, all intended to enhance the learning outcomes of students while also supporting teachers. Reading comprehension and writing sessions have dominated this year. While their value is unquestionable, the team has realised we also need to refocus to ensure all facets of our role are equally met: curriculum leader, information specialist, and information services manager. While these have not been entirely neglected, they have dipped out of the spotlight. Not only can we support the college in developing effective literacy practices and the metalanguage surrounding them, we must also support the information needs of our clients through digital and information literacy programs.

This got me thinking about our recent lessons with Year 7 HPE students. The TL team held interactive lessons with all classes to develop their search and retrieval skills, as students are about to embark on several small research activities throughout the term. These lessons were buzzing with student and teacher engagement. Students learnt, first-hand, the power of keywords and Boolean phrases. Now, as I think back to Kuhlthau’s Information Search Process [ISP], I see it would be highly beneficial to revisit each class while they are in the exploration stage and perhaps feeling the sudden and stabbing pangs of uncertainty and research anxiety (Kuhlthau, 2004). I experienced this myself during my last assignment for this semester. I had never felt such anxiety or uncertainty around a topic and my thinking. I was overwhelmed with the information and could not see a way out. I was in the dip, and it was quickly becoming an abyss. With a moment of clarity, I sought intervention. The information barrier had led to a total cognitive and affective meltdown but, with intervention, I regained my certainty and refocused my thinking (Kuhlthau & Cole, 2013). If that is not the ISP in action, I don’t know what is. If this was my extreme, how are the Year 7s coping with only fledgling inquiry capabilities?

Well, the Year 7 HPE students are approaching the danger zone (Kuhlthau & Cole, 2013). So, how can I, as a TL, support them? Kuhlthau and Cole (2013) suggest tapping into the Third Space. It is now, in this space of uncertainty, that students need reminding to tap into their prior knowledge. As a true constructivist approach, GID reminds students to build on and reflect on their knowledge and belief systems. We must revisit the search process, collaborate and share findings and thinking, relate new information to old, and engage with the wider community to expand understanding and connect the new information to the world. Students must first be encouraged to relax, slow down, read without distraction, and reflect on the information (Kuhlthau, Maniotes, & Caspari, 2012). There are many stages of intervention throughout the ISP; however, the dip requires particular attention because it can make or break knowledge construction and task engagement. A series of reflective questions helps at this stage. Kuhlthau, Maniotes and Caspari (2012) recommend asking students what they have found interesting thus far, what they would like to share with others, and what questions they have. Engaging with others provides students with an opportunity to further tap into the Third Space, to connect the academic world to their world. The Third Space allows students to connect to their world to construct new ideas and perspectives (Kuhlthau, Maniotes, & Caspari, 2007). Providing students with these opportunities will be more important than ever, as the revised senior curriculum will require students to develop unique inquiries and responses. Not only does this support expectations from the QCAA, it also aligns with the social and personalised learning preferences of Gen Z (Gudowski, 2018; Kozinsky, 2017). The GID Framework is flexible enough to support 21st century students and their ways of learning.

I realise, on reflection, the Year 7 HPE unit could have been an excellent opportunity to engage the department in using the full GID Framework and work with the teachers and students throughout the entire process – not just sporadically. This is something I will endeavour to do next year, as GID is students’ GPS through the information search process.

 

References

Gudowski, C. (2018, May 25). Gen Z and GID [Blog post]. Retrieved from http://52guidedinquiry.edublogs.org/category/third-space/

Kuhlthau, C. C. (2004). Seeking meaning: A process approach to library and information services (2nd ed). Westport, Connecticut: Libraries Unlimited.

Kuhlthau, C., & Cole, C. (2013). Third space as an information system and services intervention methodology for engaging the user’s deepest levels of information need. Proceedings of the American Society for Information Science and Technology, 49(1), 1-6. doi: 10.1002/meet.14504901074

Kuhlthau, C. C., Maniotes, L. K., & Caspari, A. K. (2007). Guided inquiry: Learning in the 21st century. Retrieved from EBSCOhost

Kuhlthau, C. C., Maniotes, L. K., & Caspari, A. K. (2012). Guided inquiry design: A framework for inquiry in your school. Retrieved from ProQuest Ebook Central

Kozinsky, S. (2017). How generation Z is shaping the change in education. Forbes. Retrieved from https://www.forbes.com/sites/sievakozinsky/2017/07/24/how-generation-z-is-shaping-the-change-in-education/#3b7303f66520