Quantifying the Impact of Misinformation and Vaccine-Skeptical Content on Facebook

5/20/24 | Jennifer Allen*, Duncan J. Watts, David G. Rand
This study investigates the impact of misinformation and vaccine-skeptical content on vaccination intentions among US Facebook users. The researchers combined lab experiments, crowdsourcing, and machine learning to estimate the causal effect of 13,206 vaccine-related URLs on the vaccination intentions of 233 million US Facebook users. They found that misinformation flagged by fact-checkers had 46 times less impact than unflagged content that encouraged vaccine skepticism. Although flagged misinformation significantly reduced vaccination intentions when viewed, its exposure on Facebook was limited. In contrast, unflagged stories highlighting rare deaths following vaccination were among Facebook's most-viewed vaccine-related stories. The study therefore emphasizes the need to scrutinize factually accurate but potentially misleading content in addition to outright falsehoods.

The research shows that vaccine-skeptical content, defined as content that could undermine faith in approved vaccines even if it does not reflect an explicitly anti-vaccine viewpoint, played a significant role in driving vaccine hesitancy. Because the overall predicted impact of vaccine-skeptical content on vaccine hesitancy was much greater than that of content flagged as misinformation by fact-checkers, policies targeting only factually inaccurate content may not be sufficient to address the broader issue of vaccine hesitancy.

The study used a framework that estimates impact as the product of persuasive influence and exposure. Analyzing the persuasive effects of vaccine-related content on Facebook, the researchers found that the extent to which a headline suggested the vaccine was harmful to a person's health was the strongest predictor of negative persuasive influence. They also found that content from mainstream sources, even when not flagged as misinformation, could have a significant impact on vaccination intentions.
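The framework described above, in which a piece of content's estimated impact is the product of its per-view persuasive effect and its total exposure, can be sketched as follows. All numbers and names here are hypothetical illustrations, not the study's actual estimates.

```python
# Minimal sketch of the impact framework: a URL's estimated impact on
# vaccination intentions is its per-view persuasive effect multiplied
# by its number of views. All figures below are made up for illustration.

def estimated_impact(persuasive_effect_per_view: float, views: int) -> float:
    """Impact = persuasive influence x exposure."""
    return persuasive_effect_per_view * views

# Hypothetical comparison: flagged misinformation is more persuasive
# per view, but vaccine-skeptical mainstream content is seen far more,
# so its aggregate impact can dominate.
flagged_misinfo = estimated_impact(persuasive_effect_per_view=-0.005,
                                   views=1_000_000)
vaccine_skeptical = estimated_impact(persuasive_effect_per_view=-0.001,
                                     views=100_000_000)

print(flagged_misinfo)     # -5000.0
print(vaccine_skeptical)   # -100000.0
```

Even with a fifth of the per-view persuasive effect, the widely viewed content produces a far larger aggregate impact in this toy example, which is the core logic behind the study's headline comparison.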
The study highlights the importance of considering the reach and impact of content, not just its veracity: content that is misleading without being factually inaccurate can cause significant societal harm. Researchers and technology companies should move beyond a narrow focus on veracity and devote more attention to understanding, tracking, and potentially intervening on harmful content that is misleading without being literally false. The study provides a replicable framework for researchers and social media companies to identify and measure the impact of potentially harmful content in contexts where field experiments are not possible.