Facebook will roll out additional, temporary measures to limit election misinformation on its platform in response to an increased number of misleading claims, the company said Thursday.
Content on Facebook and Instagram will be demoted by the company’s automated systems if the systems determine that it may contain misinformation, “including debunked claims about voting,” Facebook spokesman Andy Stone said in a statement to CNN Business.
Users will face an additional hurdle when they share posts that Facebook has labeled with further context, Stone said. Users who attempt to share labeled content will now see an additional message that encourages them to visit Facebook’s voting information center.
“We are also limiting the distribution of Live videos that may relate to the election on Facebook,” Stone added.
“As vote counting continues,” Stone said, “we are seeing more reports of inaccurate claims about the election. While many of these claims have low engagement on our platform, we are taking additional temporary steps, which we’ve previously discussed, to keep this content from reaching more people.”
Facebook’s head of global affairs, Nick Clegg, had previously said the company prepared multiple “break-glass” tools and options in the event of a chaotic US election. Thursday’s announcement appears to put some of them into use.
The statement did not provide a timeframe for the rollout, and Facebook didn’t immediately respond to a question from CNN Business seeking clarification. But The New York Times, which was first to report the news, said the rollout could begin as soon as Thursday.
Baseless claims of election fraud made by President Donald Trump and his allies this week have turned up the heat on tech companies, which have for years largely allowed dubious and debunked claims to thrive on their platforms.
This week, Twitter and Facebook have increasingly labeled posts that seek to undermine the validity of the election results; YouTube, however, has largely lagged behind them.