While Facebook has repeatedly come under fire over the past few years for its role in disseminating misinformation, especially related to the 2016 election, the last two months have been especially turbulent as a whistleblower and top officials have been called to testify in front of Congress following the release of leaked internal research and documents.
These disclosures, made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel, have shed new light on the inner workings of the tech giant. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions of the documents received by Congress. Haugen also shared some of the documents with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its platforms.
Facebook has pushed back on Haugen's assertions, with CEO Mark Zuckerberg even issuing a 1,300-word statement suggesting that the documents are cherry-picked to present a misleading narrative about the company.
Here are some key takeaways from the tens of thousands of pages of internal documents.
Spread of misinformation
In one SEC disclosure, Haugen alleges "Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection."
And leaked comments from some Facebook employees on January 6 suggest the company might have had some culpability in what happened by not moving more quickly to halt the growth of Stop the Steal groups.
In response to these documents, a Facebook spokesperson told CNN, "The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them."
Global lack of support
Internal Facebook documents and research shared as part of Haugen's disclosures highlight gaps in Facebook's ability to prevent hate speech and misinformation in countries such as Myanmar, Afghanistan, India, Ethiopia and much of the Middle East, where coverage of many local languages is inadequate.
Facebook has known about human traffickers using its platforms since at least 2018, but has struggled to crack down on related content, company documents reviewed by CNN show.
Inciting violence internationally
Internal documents indicate Facebook knew its existing strategies were insufficient to curb the spread of posts inciting violence in countries "at risk" of conflict, like Ethiopia.
This is not the first time concerns have been raised about Facebook's role in the promotion of violence and hate speech. After the United Nations criticized Facebook's role in the Myanmar crisis in 2018, the company acknowledged that it had not done enough to prevent its platform from being used to fuel bloodshed, and Zuckerberg promised to increase Facebook's moderation efforts.
Impact on teens
According to the documents, Facebook has actively worked to expand the size of its young adult audience even as internal research suggests its platforms, particularly Instagram, can have a negative effect on young users' mental health and well-being.
Facebook has previously acknowledged that young adult engagement on the Facebook app was "low and regressing further," and the company has taken steps to court that audience.
However, Facebook's internal research, first reported by the Wall Street Journal, claims Facebook's platforms "make body image issues worse for 1 in 3 teen girls." Its research also found that "13.5% of teen girls on Instagram say the platform makes thoughts of 'Suicide and Self Injury' worse" and 17% say the platform, which Facebook owns, makes "Eating Issues" such as anorexia worse.
Algorithms fueling divisiveness
A late 2018 analysis of 14 publishers on the social network, entitled "Does Facebook reward outrage," found that the more negative comments incited by a Facebook post, the more likely the link in the post was to get clicked.
"The mechanics of our platform are not neutral," one staffer wrote.