Study: Corrections on Facebook News Feed Reduce Misinformation

New research shows that, across partisan and ideological lines, people who read fact-checks on social media became, on average, better at recognizing misinformation.

February 15, 2022


Factual corrections published on Facebook’s news feed can reduce a user’s belief in misinformation, even across partisan lines, according to a new paper, published this month in the Journal of Politics, co-authored by a George Washington University assistant professor. 

Social media users were tested on their accuracy in recognizing misinformation after exposure to corrections on a simulated news feed designed to look like Facebook’s. As in the real world, however, participants were free to ignore the corrections to false stories posted in the same feed. Even with that freedom to choose what to read, users’ accuracy improved when fact-checks were included alongside false stories. 

The study’s findings contradict previous research suggesting that displaying corrections on social media was ineffective or could even backfire by reinforcing inaccurate beliefs. Instead, the new study found that even when users were not compelled to read fact-checks in a simulation of Facebook’s news feed, they nonetheless became more factually accurate despite exposure to misinformation. This finding held for both liberal and conservative users, with only some variation depending on the topic of the misinformation.

“Our study finds that posting fact-checks on Facebook increases users’ accuracy when it comes to encountering misinformation,” said Ethan Porter, assistant professor of media and public affairs and co-author of the study. “While other researchers have studied simulated news environments, ours is among the first that we are aware of to study corrections and misinformation on a detailed simulation of the news feed.”  

Researchers administered two experiments on large, nationally representative population samples using a novel platform designed to mimic Facebook’s news feed.  

Subjects in the first experiment were exposed to news feeds that contained, at random, multiple items of misinformation, factual corrections and non-political placebo content. Subjects were free to read—or avoid reading—any material they wished. The fake stories used in the experiment had actually circulated on Facebook and included, for example, false claims about immigrants and measles, climate activist Greta Thunberg and President Donald Trump. Corrections directly contradicted the false claims and offered subjects the opportunity to access a more detailed fact-check on an external website. In the first experiment, the correction appeared above a facsimile of the original fake story, and test subjects were asked to rate, on a one-to-five scale, how much they agreed with the fake story.  

The second experiment investigated whether the results would change depending on how the corrections were displayed in the news feed: the correction was placed above the original fake story, which was blurred out. Test subjects were then asked to rate the truthfulness of the fake story on a one-to-five scale. 

In both versions tested, fact-checks reduced false beliefs more than misinformation shared without corrections. Additionally, the researchers compared participants who identified as liberal with those who identified as conservative and did not find one group to be more resistant to facts than the other. 

“Our previous research has shown that individuals respond to corrections by becoming more accurate, and this study finds the same impact even when people weren’t forced to read the shared fact-checks,” Porter said. “Even on a platform that approximates Facebook’s news feed, in which test subjects were presented with fact-checks that didn’t align with their politics, sharing corrections increased their ability to recognize misinformation as such.

“Our results suggest that social media companies, policymakers and scholars need not resign themselves to the spread of misinformation on social media but can use corrections to rebut it.”

Porter is affiliated with the GW Institute for Data, Democracy and Politics and is the cluster lead of the institute’s Misinformation/Disinformation Lab.

The researchers noted limitations to their study, pointing out that while their simulation closely mimicked the look of a Facebook news feed, it did not account for how social ties to friends and connections might alter the effects of sharing fact-checks and corrections.  

The paper, “Political Misinformation and Factual Corrections on the Facebook News Feed: Experimental Evidence,” was published in the Journal of Politics. The paper was co-authored with Thomas J. Wood from Ohio State University. The research was supported with funding from Avaaz, a civil society non-profit organization.