Portable Magic

“Books make people quiet, yet they are so loud.” – Nnedi Okorafor

Tag: digital resources

Reading From the Screen

Reflection: Think about how you process information and read. Are young people any different? Do they use technology differently to older people? Have ebooks ‘taken off’ in your school? What reasons could explain this?


Several years ago our school brought in Mark McCrindle for an all-day PL – repeated a few years later, weirdly. He used selected data to show that teens are ‘digital natives’ – a misleading term that has been widely countered – and thus that they needed a different style of instruction, using lots of short, multimodal texts. Pop Teaching instead of Pop Art, if you will.

Yet all of us listening were thinking the same thing: that’s not our experience.

I don’t see a lot of difference in how they use technology, or how confident they are with it, only that they’re more engaged with messaging apps like Snapchat, and will be on multiple platforms. ‘Older’ people will probably stick to text, Messenger and WhatsApp rather than communicating via Instagram messages. Young people don’t like using email, but then no one uses email to chat anymore – that’s what apps are for.

Our students, aged 16–18, aren’t all that familiar with technology, really. They’re just very comfortable navigating their phones. They have little experience with desktop programs, especially Word. They interact, rather than utilise or explore. They’re certainly capable of learning more, but in general their technology use involves a lot of passive staring.

Our school library had a subscription to Wheelers, but hardly anyone borrowed books from it. Admittedly the selection wasn’t great – but even for the texts we did have, it cost over three thousand dollars a year. We ditched it.

Part of the reason for its lack of popularity is that a lot of teens aren’t reading much at all – they prefer to watch, and their attention spans are getting shorter and shorter. Some do read ebooks, but they’re more likely to borrow them from the state library, which has a really good selection. Another issue is browsing: they’re just less likely to do it on an app or a website. Even getting to the Wheelers site seemed too difficult.

I definitely process information in limited ways compared to others. My students are content reading off a screen, but I need a print copy or my eyes struggle – partly due to glare, partly because text on screen invites me to skim. I suspect my students do this too, though! Many of the articles for this degree I have to print in order to read – which lets me highlight and annotate, a good study habit anyway.

Module 5.1 Discussion: digital trends

Working in a senior secondary public college, there is limited scope for embracing the latest trend in digital literacy or interactive media. A public school simply doesn’t have the funds, nor is there space in the curriculum for much experimentation. With 30 weeks to teach 40 weeks’ worth of material and prepare students for exams, my experience has been not to get too clever. Working Padlet into my classes is about as inventive as I get.

What I do see is an increasing dependency on the mobile phone – not for learning but for distraction. Avoidance. There’s just so much going on in their lives, from relationship drama to sorting out a lift home to organising shifts at work. It’s hard to cut across that noise, so I’m not averse to integrating some form of digital learning – in the past I’ve tried student blogs and creating memes, but these don’t have the same benefits as gamification, as described by Briggs (2016).

Their reference to the benefits of Minecraft in the classroom really highlighted for me the difference between what you can do in a primary school (or high school) setting, compared to a college. My son is in grade 6 and his teacher is using Minecraft almost every Friday morning to teach numeracy – but not just numeracy. Teams are given tasks, or challenges, to complete; doing so requires collaboration, maths skills and problem-solving, and then design and implementation followed by a reflection. Not only is my son super excited about being able to ‘play’ Minecraft with his friends in class, but he gets a chance to apply the learning in an engaging way.

Too often, in digital device design, ‘engaging’ seems to mean ‘bells and whistles’. (There’s a nice bit of alliteration!) All those custom-made tablet-style devices for children, or the apps designed for them, seem to do little more than keep kids quiet. Some, like the ABC’s Reading Eggs (there’s also a maths version), are quite good at supplementing and consolidating more traditional classroom learning. But the digital media mentioned by Springen (2010) are a lot more gimmicky than the publishers would like to admit – which is why they haven’t ‘taken off’. There’s no substance to them. They’re not satisfying. You don’t get to sit with your thoughts, which we really need – our brains really need.

Springen quotes several publishers as saying they don’t intend for digital media to replace print books; what’s not acknowledged is that young people are so distracted by shiny shiny, and getting so many dopamine hits from digital media/devices, that they’re not learning how to be present for a traditional book. It’s something that needs to be taught. My son’s primary school newsletter frequently includes messages about the importance of parents reading to their children from a young age, every. single. day. Digital media aims to free parents from this ‘chore’ and create a shortcut. But there isn’t one.

I’m generalising, of course. But that’s how we make a point. And my point is, it can’t be ‘digital media for the sake of digital media’. They’re not all equal. And just plopping a device in front of a kid doesn’t absolve adults from their responsibility to teach. There’s plenty of research on the benefits of print-based reading as opposed to digital, for learning comprehension especially (Delgado et al., 2018), as well as the potential harm caused by devices on children’s creativity (Ruder, 2019). Digital devices have a lot in common with TV shows like Cocomelon, which employs the same techniques to hook toddlers as poker machines do (Kosmas, 2022).

It must surely be about balance, and choosing digital media wisely and carefully. In terms of resourcing the library collection, I can see this being more difficult. Unlike with books, teacher librarians can’t browse devices and apps with the same ease, and it’s not always obvious how a child will interact with them, or what exactly they will (really) learn. It will take a lot more work to research different digital media – and for teachers, more work to figure out how to integrate them in such a way that students actually learn something. Digital media are here to stay, and so much has already changed; we do have a responsibility to teach children how to safely engage with and navigate digital and online spaces.

References

Briggs, S. (2016, January 16). Using gaming principles to engage students. InformED. https://www.opencolleges.edu.au/informed/features/using-gaming-principles-to-engage-students/

Delgado, P., Vargas, C., Ackerman, R., & Salmerón, L. (2018). Don’t throw away your printed books: A meta-analysis on the effects of reading media on reading comprehension. Educational Research Review, 25, 23–38. https://www.sciencedirect.com/science/article/pii/S1747938X18300101

Kosmas, S. (2022, March 17). Cocainemelon: Why toddlers can get addicted to watching Cocomelon. Evie. https://www.eviemagazine.com/post/cocainemelon-why-toddlers-can-get-addicted-to-watching-cocomelon

Ruder, D. B. (2019, June 19). Screen time and the brain. Harvard Medical School News & Research. https://hms.harvard.edu/news/screen-time-brain

Springen, K. (2010, July 19). The digital revolution in children’s publishing. Publishers Weekly. https://www.publishersweekly.com/pw/by-topic/childrens/childrens-industry-news/article/43879-the-digital-revolution-in-children-s-publishing.html

 

Models of Collecting and Analysing Data

[ETL503 – Module 5: Evaluating Collections]

Explore at least two of the following sources and note the types of analytical data and models of collecting data that are presented.

  1. Karen Grigg, Chapter 9: “Assessment and evaluation of e-book collections” 

“Libraries are now using means such as balanced scorecard, circulation and usage statistics, survey measures, focus groups, and identification of strength areas of the institution as methods to ensure that book collections are vital and relevant.” (p. 128)

  1. Usage data
  2. Overlap analysis
  3. Survey instruments
  4. Benchmarking
  5. Focus groups
  6. Balanced scorecard method

Usage data:

  • collecting and interpreting circulation data. Things to look for include
    • which subject areas are borrowed the most from
    • whether texts with illustrations are more likely to be borrowed than others
    • any preference for standalone volumes vs series
    • whether theoretical or practical texts are more popular
    • for eResources, whether the number of people seeking to borrow a text exceeds the agreed-upon number in the licence (meaning there’s high demand for that title)
  • Challenges with using usage (circulation) data include
    • too many inconsistencies in how data on online or eResources are collected (and these statistics are managed by the vendor, not the librarian)
    • the type of usage allowed for eResources, such as a single-user model when a title can only be borrowed by one person at a time, rather than multiple (in such cases, potential borrowers are turned away and may not return, making it hard to gauge just how popular a text is)
    • issues with ascertaining how useful an eResource is, as the usage data doesn’t reveal this (it may have simply been promoted more in the search results) (p. 129)
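The circulation-data questions above can be sketched with a few lines of Python – a minimal illustration only, with invented loan records standing in for a real library management system export:

```python
from collections import Counter

# Hypothetical circulation records: (title, subject_area, format) per
# loan. Titles, subjects and formats are invented for illustration.
loans = [
    ("Macbeth", "English", "print"),
    ("Hamlet", "English", "print"),
    ("Essential Maths Methods", "Mathematics", "print"),
    ("Big History", "History", "ebook"),
    ("Macbeth", "English", "ebook"),
]

# Which subject areas are borrowed from the most?
by_subject = Counter(subject for _, subject, _ in loans)

# List subjects from most- to least-borrowed.
for subject, count in by_subject.most_common():
    print(f"{subject}: {count}")
```

The same pattern (count loans grouped by some attribute) answers the other questions in the list – standalone vs series, illustrated vs not – by swapping which field is counted.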

Overlap analysis:

  • “Overlap analysis can query, for each database in a library’s collection, the number of titles in that database that are unique to that database and the number that are available elsewhere in the library’s collection.” (p. 130)
  • used mostly for databases, but possibly with ebook ‘packages’ to “compare titles by subject area, looking for both gaps and overlaps.” (p. 130)
  • helps locate duplications and free up budget to fill gaps
  • Limitations include:
    • duplicates within an e-book package can’t be ‘cancelled’
    • the copies (duplicates) may not be equal, but slightly different versions, meaning the librarian still has to evaluate each “for such factors as ease of navigation, inclusions of graphs and illustrations, and potential negative financial consequences of cancellations that may be imposed by the vendor” (p. 130)
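As a toy illustration of the idea, set operations over two (invented) package title lists surface both the overlaps and the unique holdings Grigg describes:

```python
# Hypothetical title lists for two e-book packages; all names invented.
package_a = {"Python Basics", "Romeo and Juliet", "World Atlas"}
package_b = {"Romeo and Juliet", "World Atlas", "Chemistry in Context"}

overlap = package_a & package_b      # titles duplicated across packages
unique_to_a = package_a - package_b  # titles only package A provides
unique_to_b = package_b - package_a  # titles only package B provides

print(sorted(overlap))
```

As the chapter notes, the duplicates flagged this way still need human evaluation – the ‘same’ title in two packages may differ in navigation, illustrations or licence terms.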

Survey instruments:

  • methods of surveying users (the people borrowing the resources)
  • allows librarians to “evaluate” the impressions users have of the collection, including the e-book collection. Users can note how they feel about “the increasing availability of e-books, what e-books they have used, and what the users perceive as gaps in the collection.” (p. 130)
  • must be used alongside other methods of data collection, as the results are not comprehensive. Usually only the most engaged users participate, so you won’t hear from the under-users whose reasons for staying away are exactly what you want to find out

Benchmarking:

  • comparing a library’s collection to another library’s collection (p.131)
  • in selecting another library, it should be comparable by size, similarity of subject areas covered, and budget
  • search and study the other library’s catalogue, its “scope and holdings” (p. 131).
  • or network with the other library, to find someone who will provide some data

Focus groups:

  • instead of the quantitative data from user surveys, focus groups allow for “qualitative querying” (p. 131).
  • groups should be diverse
  • discussion based, with open-ended questions and “creative brainstorming” (p. 131)
  • can be used to evaluate an e-book publisher’s site – its navigability, collections and format
  • Limitations:
    • lack of anonymity may cause self-censorship (p. 132)
    • incentives may be required, which can cause some people to attend simply for a free lunch (but not to offer any useful insights)
    • they require a skilled mediator
    • groups of staff can be hard to organise due to conflicting schedules

Balanced scorecard method:

  • “a strategic approach and performance management system that can be employed to identify critical success factors and translate these into performance measures that can be tracked over time.” (p. 132)
  • includes outcome measures – the point at which the librarian decides the collection is serving its purpose
  • results are only as good as the measures used to collect the data (p. 132).
  • challenge to identify realistic measures.

  2. Amy Hart, Chapter 3: “Collection analysis: powerful ways to collect, analyze and present your data”

Assess the collection by

  1. Dewey Decimal number (p. 88)
    • how many titles does the library have in each classification?
    • enter data into a spreadsheet in order to analyse it
  2. date of publication (p. 88)
    • to determine currency/relevance
    • however, it’s important to note that some subjects do not date as much as others do
  3. circulation statistics
    • “By comparing the composition of the collection (how many titles in each Dewey class) to circulation counts by Dewey class, we could learn where supply was meeting demand.” (p. 89)
  4. then make many busy, incomprehensible and illegible graphs and charts to impress higher-ups
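Hart’s supply-versus-demand comparison can be sketched in Python; the Dewey classes and counts below are invented for illustration:

```python
# Hypothetical counts of titles held and loans recorded per Dewey
# hundreds class (all numbers invented).
holdings = {"500s Science": 120, "800s Literature": 300, "900s History": 80}
loans = {"500s Science": 90, "800s Literature": 60, "900s History": 50}

total_held = sum(holdings.values())
total_loans = sum(loans.values())

# Compare each class's share of the collection (supply) with its share
# of circulation (demand).
for dewey in holdings:
    supply_share = holdings[dewey] / total_held
    demand_share = loans[dewey] / total_loans
    print(f"{dewey}: supply {supply_share:.0%}, demand {demand_share:.0%}")
```

A class whose demand share exceeds its supply share (the 500s here) is a candidate for more purchasing; the reverse (the 800s) may signal over-supply or a weeding opportunity.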

Activity: Consider models and methods for collection evaluation which may effectively relate to the learning and teaching context, the needs of users and the school library collection within your school, or in a school with which you are familiar.
  • What are the practicalities of undertaking a collection evaluation within a school in terms of time, staffing, and priorities, as well as appropriateness of methodology?
    • as there is often a shortage of time, collecting and analysing data could actually be spread out over the year. (For instance, weeding can be done at any time in the year, while user surveys could be done at the end.)
    • using the information management system to run reports on usage is an ideal place to start. 
    • as noted in Grigg (2012), any model of collecting data should be used in conjunction with one or two others – such as surveys and overlap analysis.
    • ‘benchmarking’ is possibly the least useful, as it would be hard to find another school like mine in the whole state (I can think of one, LC, that comes close but is smaller)
    • we have the staff available, but not all staff have the skill set (yet); they/we need training
  • How does the need for, and possible benefits of an evaluation of the collection outweigh the difficulties of undertaking such an evaluation?
    • As difficult and time-consuming as it is to undertake a collection evaluation, it’s always necessary – a core part of the job. Otherwise the shelves get clogged with old, dusty, irrelevant texts; new subject areas are under-resourced; the reading interests of the student body are not understood; and online tools and resources such as databases take up big chunks of the budget yet may go unused.
    • In discovering which resources are under-utilised (such as our Wheelers e-book platform; only 4 e-books were borrowed in 2021), the TL has an opportunity to promote them and see if patronage can be driven up or, if a resource really is out of touch with users, to remove it altogether. This would free up space in the library (including online) and budget for other areas.
  • Is it better to use a simple process with limited but useful outcomes, or to use the most appropriate methodology in terms of outcomes?
    • I think it’s a combination, surely. We use what is available, which may be limited but is better than nothing. What we use, though, would be appropriate for the context – not sure I understand the bit about outcomes. It almost seems to imply that you use a methodology that will give you the result you want, rather than what’s actually happening.
  • What are the current priority areas for evaluation in your school library collection?
    • The non-fiction/reference sections
    • promoting e-books to students
    • database usage (EBSCO etc.)

References

Grigg, K. (2012). Assessment and evaluation of e-book collections. In R. Kaplan (Ed.), Building and managing e-book collections. American Library Association. https://ebookcentral.proquest.com/lib/csuau/reader.action?docID=1158439&ppg=144

Hart, A. (2003). Collection analysis: Powerful ways to collect, analyze, and present your data. In C. Andronik (Ed.), School library management (5th ed., pp. 88–91). Linworth. https://primo.csu.edu.au/discovery/delivery/61CSU_INST:61CSU/12131785030002357
