Article: From Facebook to YouTube: The Potential Exposure to COVID-19 Anti-Vaccine Videos on Social Media
Let’s be honest, COVID-19 was a bit of a bummer for everyone. The seemingly never-ending lockdowns had us turning to our screens for entertainment – spending countless hours scrolling through funny dog videos and less-than-stellar baking endeavours. But our usual feeds were increasingly littered with individuals and groups who held strong and, in some cases, insistent views on vaccinations. As someone with an autoimmune disease, I found there was no escape from all the unknowns and misinformation being broadcast everywhere, or from the nerve-wracking commentary on what I considered life-or-death decisions.
Now that we are in a post-pandemic phase, it is worth exploring how prominent social media platforms created an environment where information could spread rapidly around the world to a significant portion of the population. The circulation of misinformation without scientific backing and the promotion of anti-vaccine conspiracies on social media shaped public behaviour in the ‘real world’, with vaccine reluctance and ill-informed protests among the results.
On Facebook, studies found that around 41% to 88% of misinformation about COVID-19 remained available on the platform, despite the interventions the company introduced in late 2020 (Avaaz, 2020, and Szeto et al., 2021, as cited in Gruzd et al., 2023, p. 4). A further investigation of 56 Facebook entities showed that 37 promoted anti-vaccine views, while merely 8 shared pro-vaccine information. Thankfully, in a review of 98 anti-vaccine posts, only 33 remained available on the platform (Gruzd et al., 2023, p. 10).
Since YouTube’s beginnings in 2005, anti-vaccination ideology has taken advantage of the free streaming service and its exponential yearly audience growth. During COVID-19, however, anti-vax content kicked into high gear, and YouTube found itself dealing with the complications of its own success. The platform performed very poorly at moderating this harmful content, removing only 34% of reported videos, with its sheer size to blame (Szeto et al., 2021, as cited in Gruzd et al., 2023, p. 5). On top of the poor moderation, YouTube’s recommendation algorithm proved disastrous, creating an environment where users were bombarded with one-sided videos through targeting (Abul-Fottouh et al., 2020, as cited in Gruzd et al., 2023, p. 5).
While there has always been a small number of people who oppose government measures intended to protect the population’s health and safety, it was scary to see how social media amplified their voices and widened their audience, with serious ramifications for global vaccine uptake.
After reading about social media platforms and anti-vaccination movements, the scariest thing isn’t the targeted ads that seem to listen in on our conversations. It’s the power of misinformation, and how individuals’ and groups’ unfounded opinions can affect our society’s inner workings when they target frightened people. Let’s all stay safe and well informed!
References
Abul-Fottouh, D., Song, M. Y., & Gruzd, A. (2020). Examining algorithmic biases in YouTube’s recommendations of vaccine videos. International Journal of Medical Informatics, 140, Article 104175. https://doi.org/10.1016/j.ijmedinf.2020.104175
Avaaz. (2020). How Facebook can flatten the curve of the coronavirus infodemic.
Gruzd, A., Abul-Fottouh, D., Song, M. Y., & Saiphoo, A. (2023). From Facebook to YouTube: The potential exposure to COVID-19 anti-vaccine videos on social media. Social Media + Society, 9(1). https://doi.org/10.1177/20563051221150403
Szeto, E., Pedersen, K., & Tomlinson, A. (2021, March 30). Marketplace flagged over 800 social media posts with COVID-19 misinformation. Only a fraction were removed. CBC News. https://www.cbc.ca/news/marketplace/marketplace-social-media-posts-1.5968539