YouTube Viewers Can Be ‘Inoculated’ Against Online Misinformation, Study Finds

Topline

A series of studies found that YouTube viewers who were shown short, informational videos intended to “inoculate” them against harmful social media content were more likely to recognize untrustworthy information and less likely to share it with others, findings that researchers say could help combat misinformation online.

Key Facts

A team of researchers led by the University of Cambridge, the University of Bristol and Beth Goldberg, Jigsaw’s head of development and research, created five 90-second videos designed to “inoculate” study participants against misinformation by teaching them which warning signs to look for. The approach draws on what psychologists call “inoculation theory”: the idea that people exposed to a small dose of misinformation early are less susceptible to it in the future.

The peer-reviewed study was funded by Google Jigsaw, an arm of Alphabet that aims to develop technological solutions to societal problems (Alphabet also owns YouTube, which has faced scrutiny from critics who say the platform amplifies misinformation and harmful content).

After participants watched the videos, researchers asked them to identify manipulation techniques; compared with a control group, “inoculated” participants were in some cases more than twice as good at identifying the techniques, according to the study published Wednesday in the journal Science Advances.

In a final study, the first real-world test of inoculation theory on a social media platform, Jigsaw found that when YouTube viewers in the U.S. were shown one of the inoculation videos, their ability to recognize manipulation techniques rose by 5%, a gain Google noted is significant and five times as high as the returns on its similarly sized YouTube advertising campaigns.

These findings show that psychological inoculation can “readily be scaled across hundreds of millions of users worldwide,” study co-author Sander van der Linden, Head of the Social Decision-Making Lab at Cambridge, which led the study, said in a statement.

The “pre-bunking” of misinformation may also be more effective than classic fact-checking, which the authors noted is “impossible” to do at scale and can actually worsen the spread of conspiracy theories when debunking feels like a personal attack on the people who hold those beliefs, according to the University of Cambridge.

Tangent

Each video focused on one of five common manipulation techniques: emotionally manipulative language, incoherence, false dichotomies, scapegoating and ad hominem attacks. The clips are available to view online.

What To Watch For

Jigsaw is slated to roll out a “prebunking” campaign across multiple platforms at the end of the month, targeting users in Poland, Slovakia and the Czech Republic to combat misinformation related to Ukrainian refugees.

Key Background

For several years, social media companies have focused on fighting the spread of misinformation on their platforms, particularly as it relates to political news, like the results of the 2020 presidential election, and health information, like the safety and efficacy of Covid-19 vaccines. More recently, platforms have been faced with how to address the circulation of misinformation about abortion.

Source: https://www.forbes.com/sites/carlieporterfield/2022/08/24/youtube-viewers-can-be-inoculated-against-online-misinformation-study-finds/