New York University researchers have uncovered scathing new data on the spread of misinformation on Facebook, despite the site curtailing their research and even going so far as to deplatform them.
The study found that from August 2020 to January 2021, misinformation got six times more clicks on Facebook than posts containing factual news. Misinformation also accounted for the majority of engagement with far-right posts (68%), compared with 36% of engagement with far-left posts.
Facebook blocked the researchers’ personal accounts and site access last month. The company said its decision to deplatform the researchers was related to a separate study on political ads that used a browser extension allowing users to anonymously share the ads they saw on Facebook with the researchers.
“NYU’s Ad Observatory project studied political ads using unauthorized means to access and collect data from Facebook, in violation of our Terms of Service. We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC Order,” Mike Clark, Product Management Director, said in a statement.
But Laura Edelson, lead researcher on the Cybersecurity for Democracy project at NYU, said this didn’t make sense because the browser extension is still available for download. However, the restrictions on their access to Facebook have hindered the group’s research into political ads and misinformation.
“We couldn’t even replicate this study ourselves right now if we tried,” Edelson said on “Reliable Sources” Sunday.
Facebook paid a $5 billion penalty in a settlement with the Federal Trade Commission on privacy and data sharing in 2019.
The company could not be immediately reached for comment on Sunday.
Additionally, thousands of posts relating to the Jan. 6 Capitol attack went missing on CrowdTangle, a popular research tool. Edelson said her team reported the bug to Facebook, but the platform didn’t seem to be aware that the posts were gone.
“It took several rounds with them to get them to acknowledge the full scope of this,” Edelson said. “Frankly, I’d be worried that they don’t understand what’s going on in their own systems.”
Facebook said the bug has been fixed.
The spread of misinformation on Facebook is partisan-agnostic, meaning that the platform does not favor or reward falsehoods coming from one side or the other. But the far-right and far-left media ecosystems are “fundamentally different,” Edelson said, with 40% of the media sources cited on the far right actively spreading false information. In other corners of the media ecosystem and other partisan groups, this number doesn’t go above 10%.
“This report looks at how people engaged with content from Pages, which represents a tiny amount of all content from Facebook,” Facebook said in a statement.