Live Updates

Video: Revealed: How Facebook promoted QAnon to a ‘North Carolina mom’

What we covered here

  • CNN is publishing a series of articles based on more than ten thousand pages of internal Facebook documents, referred to as “The Facebook Papers.”
  • They give deep insight into the company’s internal culture, its approach to misinformation and hate speech moderation, internal research on its newsfeed algorithm, and communication related to Jan. 6.
  • These documents are disclosures made to the SEC and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. The redacted versions were obtained by a consortium of 17 US news organizations, including CNN.

Our live coverage of this story has ended. Read the latest here.


Here are the big takeaways from the Facebook Papers

While Facebook has repeatedly come under fire over the past few years for its role in disseminating misinformation, especially related to the 2016 election, the last two months have been especially turbulent as a whistleblower and top officials have been called to testify in front of Congress following the release of leaked internal research and documents.

These disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel have shed new light on the inner workings of the tech giant. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions of the documents received by Congress. She also shared some of the documents with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its platforms.

Facebook has pushed back on Haugen’s assertions, with CEO Mark Zuckerberg even issuing a 1,300-word statement suggesting that the documents are cherry picked to present a misleading narrative about the company.

Here are some key takeaways from the tens of thousands of pages of internal documents.

Spread of misinformation

In one SEC disclosure, Haugen alleges “Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection.”

And leaked comments from some Facebook employees on January 6 suggest the company might have had some culpability in what happened by not moving more quickly to halt the growth of Stop the Steal groups.

In response to these documents, a Facebook spokesperson told CNN, “The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them.”

Global lack of support

Internal Facebook documents and research shared as part of Haugen’s disclosures highlight gaps in Facebook’s ability to prevent hate speech and misinformation in countries such as Myanmar, Afghanistan, India, Ethiopia and much of the Middle East, where coverage of many local languages is inadequate.

Human Trafficking

Facebook has known about human traffickers using its platforms since at least 2018, but has struggled to crack down on related content, company documents reviewed by CNN show.

Inciting Violence Internationally

Internal documents indicate Facebook knew its existing strategies were insufficient to curb the spread of posts inciting violence in countries “at risk” of conflict, like Ethiopia.

This is not the first time concerns have been raised about Facebook’s role in the promotion of violence and hate speech. After the United Nations criticized Facebook’s role in the Myanmar crisis in 2018, the company acknowledged that it didn’t do enough to prevent its platform being used to fuel bloodshed, and Zuckerberg promised to increase Facebook’s moderation efforts.

Impact on Teens

According to the documents, Facebook has actively worked to expand the size of its young adult audience even as internal research suggests its platforms, particularly Instagram, can have a negative effect on their mental health and well-being.

Although Facebook has previously acknowledged young adult engagement on the Facebook app was “low and regressing further,” the company has taken steps to target that audience.

However, Facebook’s internal research, first reported by the Wall Street Journal, claims Facebook’s platforms “make body image issues worse for 1 in 3 teen girls.” Its research also found that “13.5% of teen girls on Instagram say the platform makes thoughts of ‘Suicide and Self Injury’ worse” and 17% say the platform, which Facebook owns, makes “Eating Issues” such as anorexia worse.

Algorithms fueling divisiveness

A late 2018 analysis of 14 publishers on the social network, entitled “Does Facebook reward outrage,” found that the more negative comments incited by a Facebook post, the more likely the link in the post was to get clicked.

“The mechanics of our platform are not neutral,” one staffer wrote.

Get the full analysis and list of takeaways here.

Facebook has language blind spots around the globe that allow hate speech to flourish

A passenger looks at his mobile phone on a local train in the state of West Bengal in Kolkata, India, on Nov. 11, 2020. 

For years, Facebook CEO Mark Zuckerberg touted his mission to connect the entire world — and his company has come closer than perhaps any other to fulfilling that lofty goal, with more than 3 billion monthly users across its various platforms. But that staggering global expansion has come at a cost.

Facebook’s own researchers have repeatedly warned that the company appears ill-equipped to address issues such as hate speech and misinformation in languages other than English, potentially making users in some of the most politically unstable countries more vulnerable to real-world violence, according to internal documents viewed by CNN.

The documents are part of disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions received by Congress.

Many of the countries that Facebook refers to as “At Risk” — an internal designation indicating a country’s current volatility — speak multiple languages and dialects, including India, Pakistan, Ethiopia and Iraq. But Facebook’s moderation teams are often equipped to handle only some of those languages and a large amount of hate speech and misinformation still slips through, according to the documents, some of which were written as recently as this year.

While Facebook’s platforms support more than 100 different languages globally, its global content moderation teams do not. A company spokesperson told CNN Business that its teams are comprised of “15,000 people who review content in more than 70 languages working in more than 20 locations” around the world. Even in the languages it does support, the documents show several deficiencies in detecting and mitigating harmful content on the platform.

There are also translation problems for users who may want to report issues. One research note, for example, showed that only a few “abuse categories” for reporting hate speech in Afghanistan had been translated into the local language Pashto. The document was dated Jan. 13, 2021, months before the Taliban militant group’s takeover of the country.

The documents, many of which detail the company’s own research, lay bare the gaps in Facebook’s ability to prevent hate speech and misinformation in a number of countries outside the United States, where it’s headquartered, and may only add to mounting concerns about whether the company can properly police its massive platform and prevent real-world harms.

Facebook has invested a total of $13 billion since 2016 to improve the safety of its platforms, according to the company spokesperson. (By comparison, the company’s annual revenue topped $85 billion last year and its profit hit $29 billion.) The spokesperson also highlighted the company’s global network of third party fact-checkers, with the majority of them based outside the United States.

Read the full story here.

Zuckerberg strikes defensive tone on Facebook's earnings call 

CEO Mark Zuckerberg kicked off Facebook’s quarterly earnings call on Monday by addressing the latest wave of coverage based on a trove of leaked internal documents.

“Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company,” he said. “The reality is that we have an open culture that encourages discussion and research on our work so we can make progress on many complex issues that are not specific just to us.”

The earnings come amid perhaps the biggest crisis in the social media giant’s 17-year history.

Tens of thousands of pages of internal documents leaked by whistleblower Frances Haugen informed the Wall Street Journal’s “Facebook Files” series and, on Monday, a flood of additional news coverage by a consortium of 17 US news organizations, as well as hearings with US and UK lawmakers. 

The documents provide the deepest look yet at many of Facebook’s biggest problems, including its struggles to regulate hate speech and misinformation, the use of its platform by human traffickers, research on harms to young people and more.

Facebook has repeatedly pushed back on many of the reports, saying they are misleading and mischaracterize its research and actions.

Zuckerberg last commented on the situation following Haugen’s Senate subcommittee hearing earlier this month, in a statement in which he sought to discredit Haugen. Still, on Friday, another former Facebook employee anonymously filed a complaint against the company with the SEC, making allegations similar to Haugen’s.

Despite all the bad headlines, the company posted another quarter of massive earnings.

Facebook reports over $9 billion quarterly profit amid damning headlines

On a day full of bad news for Facebook, the company reminded investors that it continues to be a money-making machine.

Facebook on Monday reported $29 billion in revenue for the three months ended in September, up 33% from the same period a year earlier. The company posted nearly $9.2 billion in profit, up 17% from the year prior.

The results were nearly in line with Wall Street analysts’ projections. Facebook’s stock rose more than 3% in after-hours trading Monday following the earnings report.

The results come amid perhaps the biggest crisis in the social media giant’s 17-year history. Tens of thousands of pages of internal documents leaked by whistleblower Frances Haugen informed the Wall Street Journal’s “Facebook Files” series, and on Monday, a flood of additional news coverage by a consortium of 17 US news organizations, as well as hearings with US and UK lawmakers. The documents provide the deepest look yet at many of Facebook’s biggest problems, including its struggles to regulate hate speech and misinformation, the use of its platform by human traffickers, research on harms to young people and more. (Facebook has pushed back on many of the reports, saying they are misleading and mischaracterize its research and actions.)

Still, Facebook is no stranger to PR crises. In most cases, Facebook’s business has continued to chug along at a healthy clip despite outcry from regulators and the public.

But this time could be different. Facebook’s massive ad business is already in a vulnerable state because of recent changes to Apple’s app tracking rules. Apple’s iOS 14.5 software update, which went into effect in April, requires that users give explicit permission for apps to track their behavior and sell their personal data, such as age, location, spending habits and health information, to advertisers. Facebook has aggressively pushed back against the changes and warned investors last year that the update could hurt its business if many users opt out of tracking.

On Monday, Facebook warned that the iOS 14 changes could create “continued headwinds” in the fourth quarter of 2021.

While much of the world spent the day focused on Facebook’s real-world harms, the company hinted to investors in the report that it wants them looking forward, not backward. Starting in the fourth quarter, the company plans to break out Facebook Reality Labs — its division dedicated to augmented and virtual reality services — as a separate reporting segment from its family of apps, which includes Instagram, WhatsApp and Facebook’s namesake social network.

CFO David Wehner said Facebook is investing so heavily in this newer division that it will reduce “our overall operating profit in 2021 by approximately $10 billion.”

In a statement with the results, Facebook CEO and cofounder Mark Zuckerberg also focused on what’s next: “I’m excited about our roadmap, especially around creators, commerce, and helping to build the metaverse.”

Anti-Defamation League blasts Facebook: Never has a single company been responsible for so much misfortune

Jonathan Greenblatt, the CEO of the Anti-Defamation League, blasted Facebook on Monday following the publication of a series of articles revealing the company’s struggles to stop hate speech, human trafficking and coordinated groups that sowed discord ahead of the Jan. 6 insurrection.

“The kind of monopolistic indifference the company has demonstrated in dealing with hate is mind-bending,” Greenblatt told CNN in a phone interview. “I don’t think ever before a single company has been responsible for so much misfortune.”

Greenblatt said the ADL is in talks with members of its coalition to “explore the appropriate response” to the Facebook Papers. “There are things advertisers can do to demonstrate their discontent,” he said. 

Last year the ADL helped launch Stop Hate for Profit, a campaign that called on major companies to pause advertising on Facebook for failures to address the incitement of violence on the platform. Hundreds of companies eventually joined the ad boycott.

“Advertisers, from Fortune 500 companies to small businesses, need to ask themselves: Do they want to continue to invest in a platform that is knowingly pushing out misinformation and hate and that seems designed more to divide than convene?” Greenblatt said. “Companies can vote with their wallets and decide where they want to build their brands, redirecting resources away from Facebook.”

The comments come after a consortium of 17 US news organizations began publishing the Facebook Papers, a series of stories based on a trove of hundreds of internal documents that were included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. The consortium, which includes CNN, reviewed the redacted versions received by Congress.

CNN’s coverage includes stories about how coordinated groups on Facebook sow discord and violence, including on Jan. 6, as well as Facebook’s challenges moderating content in some non-English-speaking countries, and how human traffickers have used its platforms to exploit people.

“The news is stunning, but not shocking,” Greenblatt said. “Mark Zuckerberg would have you believe [Facebook] was doing all it could. Now we know the truth: He was aware and did nothing about it.”

As CNN reported on Friday, the Facebook Papers suggest the company was fundamentally unprepared for how the Stop the Steal movement used its platform to organize ahead of the Jan. 6 insurrection.

“They misled investors and the public about the spread of misinformation that led to the January 6 insurrection,” Greenblatt said.

Greenblatt had his own spin on Facebook CEO Mark Zuckerberg’s famous “move fast and break things” motto for his company: “Move fast and lie about things.”

“We know they continually misled the public, misled the press, misled organizations like mine about the steps they were taking to deal with the hate on their service,” Greenblatt said.

Facebook did not respond to a request for comment but the company has denied the premise of Haugen’s conclusions around the company’s role in the Jan. 6 insurrection.

“The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them. We took steps to limit content that sought to delegitimize the election, including labeling candidates’ posts with the latest vote count after Mr. Trump prematurely declared victory, pausing new political advertising and removing the original #StopTheSteal Group in November,” Facebook spokesperson Andy Stone told CNN Friday.

Facebook also published a blog post detailing its efforts around the 2020 election. 

Still, Greenblatt said the current moment is an opportunity for business leaders to speak up about the problems at Facebook.

“Whether you’re a corporation or a celebrity or an elected official, all of us have a stake in getting this right,” he said. “Unfortunately, Facebook’s problem is all of our problem.”

However, Greenblatt suggested the best path, at this point, is to pursue regulatory changes through Congress and government agencies.

“I believe in self-regulation,” Greenblatt said, pointing to his own career in Silicon Valley. “But Facebook has proven itself incapable of demonstrating the kind of responsibility we expect for a company of any size, let alone one of its sheer scale.”


Former Facebook employee Sophie Zhang said she felt like there was "blood on her hands" after working there

This is not the first time Facebook’s current and former employees have complained about the company’s troubling practices and culture.

Sophie Zhang, who worked as a data scientist at the tech giant for almost three years, said she felt like she had “blood on her hands” after working there.

Zhang wrote a lengthy memo when she was fired by Facebook last year, detailing how she believed the company was not doing enough to tackle hate and misinformation — particularly in smaller and developing countries. Zhang said the company told her she was fired because of performance issues.

The memo was first reported last year by BuzzFeed News and later helped form the basis of a series of reports by The Guardian newspaper.

She is willing to testify before Congress about her former employer, she told CNN following whistleblower Frances Haugen’s testimony. She said she had also passed on documentation about the company to a US law enforcement agency.

“I provided detailed documentation regarding potential criminal violations to a U.S. law enforcement agency. My understanding is that the investigation is still ongoing,” she tweeted.

Central to Zhang’s allegations about Facebook is that it doesn’t do enough to tackle abuse of its platform in countries outside of the United States. Roughly 90% of Facebook’s monthly active users are outside the US and Canada, according to its most recent quarterly filing.

Read the full story here.


Facebook knew it was being used to incite violence in Ethiopia. Little was done to stop it.

Mekelle, the regional capital of Tigray, in northern Ethiopia, is seen through a bullet hole at the Ayder Referral Hospital, in May 2021.

Facebook employees repeatedly sounded the alarm on the company’s failure to curb the spread of posts inciting violence in “at risk” countries like Ethiopia, where a civil war has raged for the past year, internal documents seen by CNN show.

The social media giant ranks Ethiopia in its highest priority tier for countries at risk of conflict, but the documents reveal that Facebook’s moderation efforts were no match for the flood of inflammatory content on its platform.

The documents are among dozens of disclosures made to the US Securities and Exchange Commission (SEC) and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions received by Congress.

They show employees warning managers about how Facebook was being used by “problematic actors,” including states and foreign organizations, to spread hate speech and content inciting violence in Ethiopia and other developing countries, where its user base is large and growing. Facebook estimates it has 1.84 billion daily active users — 72% of whom are outside North America and Europe, according to its annual SEC filing for 2020.

For example, an internal report distributed in March, entitled “Coordinated Social Harm,” said that armed groups in Ethiopia were using the platform to incite violence against ethnic minorities in the “context of civil war.”

The documents also indicate that the company has, in many cases, failed to adequately scale up staff or add local language resources to protect people in these places.

Read the full story about what’s going on in Ethiopia, what Facebook knew about violent actors and the company’s insufficient mitigation strategies.

Facebook employees flagged people for sale on its platforms in 2018. It's still a problem.

Facebook has for years struggled to crack down on content related to what it calls domestic servitude: “a form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception,” according to internal Facebook documents reviewed by CNN. 

The company has known about human traffickers using its platforms in this way since at least 2018, the documents show. It got so bad that in 2019, Apple threatened to pull Facebook and Instagram’s access to the App Store, a platform the social media giant relies on to reach hundreds of millions of users each year. Internally, Facebook employees rushed to take down problematic content and make emergency policy changes to avoid what they described as a “potentially severe” consequence for the business.

But while Facebook managed to assuage Apple’s concerns at the time and avoid removal from the app store, issues persist. The stakes are significant: Facebook documents describe women trafficked in this way being subjected to physical and sexual abuse, being deprived of food and pay, and having their travel documents confiscated so they can’t escape. Earlier this year, an internal Facebook report noted that “gaps still exist in our detection of on-platform entities engaged in domestic servitude” and detailed how the company’s platforms are used to recruit, buy and sell what Facebook’s documents call “domestic servants.”

Last week, using search terms listed in Facebook’s internal research on the subject, CNN located active Instagram accounts purporting to offer domestic workers for sale, similar to accounts that Facebook researchers had flagged and removed. Facebook removed the accounts and posts after CNN asked about them, and spokesperson Andy Stone confirmed that they violated its policies.

“We prohibit human exploitation in no uncertain terms,” Stone said. “We’ve been combatting human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform.”

Read more here.

Here's a recap of the Facebook whistleblower's testimony in the UK parliament

Facebook whistleblower Frances Haugen leaves after giving evidence to the joint committee for the Draft Online Safety Bill, as part of British government plans for social media regulation, at the Houses of Parliament in London, Monday, Oct. 25, 2021. 

Facebook whistleblower Frances Haugen’s testimony in the UK parliament has concluded. She raised concerns about the company’s focus on algorithms and its approach to misinformation and hate speech moderation. She also reiterated her call for regulation of the tech giant to bring more transparency, which she previously made in her testimony to the US Congress.

Here’s a recap of what Haugen told the UK parliament:

Facebook under-invests in content safety systems for non-English languages.

“Facebook says things like, ‘we support 50 languages,’ when in reality, most of those languages get a tiny fraction of the safety systems that English gets,” Haugen told British lawmakers. “UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually [under-enforced] in the UK.”


Facebook should not be allowed to “mislead” its Oversight Board.

“I hope the Oversight Board takes this moment to stand up and demand a relationship that has more transparency,” said Haugen. “If Facebook can come in there and just actively mislead the Oversight Board — which is what they did — I don’t know what the purpose of the oversight board is.”

The board adjudicates cases on controversial content that is either left up or taken down — but these cases are just “the tip of the iceberg” when it comes to oversight at Facebook, Oversight Board member and PEN America CEO Suzanne Nossel said.

The UK is leading the world in its efforts to regulate social media platforms through its Draft Online Safety Bill.

Haugen said she couldn’t imagine that Facebook CEO and founder Mark Zuckerberg “isn’t paying attention” to the efforts.

While countries in the “Global South” do “not have the resources to stand up and save their own lives,” the UK has the chance to take a “world leading stance” with its bill, which seeks to impose a duty of care on social media sites towards their users, Haugen added.

Facebook views safety as a cost center instead of a growth center.

“I think there is a view inside the company that safety is a cost center; it’s not a growth center, which, I think, is very short-term in thinking. Because Facebook’s own research has shown that when people have worse integrity experiences on the site, they are less likely to retain,” she said Monday.

She urged British lawmakers to put regulations in place, saying it was for the good of the company’s long-term growth.

“I think regulation could actually be good for Facebook’s long-term success. Because it would force Facebook back into a place where it was more pleasant to be on Facebook,” she said.

Facebook whistleblower says Zuckerberg should pay attention to UK's efforts to regulate social media

Facebook whistleblower Frances Haugen is “proud” of the UK’s “world leading” efforts to regulate social media platforms through its Draft Online Safety Bill.

Speaking before a UK parliamentary committee Monday, Haugen said she couldn’t imagine that Facebook CEO and founder Mark Zuckerberg “isn’t paying attention to what you’re doing,” calling it a critical moment for the UK “to stand up and make sure that these platforms are in the public good, and are designed for safety.”

While countries in the “Global South” do “not have the resources to stand up and save their own lives,” the UK has the chance to take a “world leading stance” with its bill, which seeks to impose a duty of care on social media sites towards their users, Haugen added.

Earlier during her testimony, Haugen urged lawmakers to include societal harm in prospective legislation, calling its omission a “grave danger to democracy and societies around the world.”

Facebook Oversight Board member calls for more transparency

Suzanne Nossel at the 29th Annual PEN America LitFestGala at Regent Beverly Wilshire Hotel in November 2019 in Beverly Hills, California. 

As more of Facebook’s internal documents are revealed, Facebook Oversight Board member and PEN America CEO Suzanne Nossel emphasized that the platform must do a better job with transparency.

The Oversight Board’s first report, released last week, revealed that the group of independent content moderators prompted Facebook to restore more than 30 pieces of content covering major issues. It also found users are often in the dark when content is removed by Facebook, and don’t know why they’ve been banned or had their content taken down.

The board also said Facebook was not “fully forthcoming” about its cross-check system, which is used to make content decisions for high-profile users.

The Oversight Board is an effort by Facebook to bring in independent, outside expertise to oversee the platform’s content moderation decisions. There are 20 members on the board with backgrounds ranging from law and human rights to journalism. 

Read more here.

Situations in Ethiopia and Myanmar are "opening chapters of a novel that is going to be horrific to read," whistleblower says

The situations in Ethiopia and Myanmar are the “opening chapters of a novel that is going to be horrific to read,” Facebook whistleblower Frances Haugen said during her testimony to a UK parliamentary committee Monday.

Haugen shed light on the company’s use of so-called “glass break measures.”

Haugen said Facebook knows, based on its engagement-based rankings, when the temperature in a country “gets hotter,” name-checking places like Myanmar, which didn’t have any systems for classifying and labeling hate speech.

Haugen said the company has a “strategy” of only slowing down the platform when “the crisis has begun,” deploying its “glass break measures” instead of making the platform “safer as it happens.”

Haugen told US senators in a damning testimony on Oct. 5 that the social media company was aware its platform was being used to stir up ethnic violence in Ethiopia and Myanmar.

Haugen: Facebook should not be allowed to "mislead" its Oversight Board

Facebook should not be allowed to mislead its Oversight Board, whistleblower Frances Haugen told UK lawmakers on Monday, saying the body should demand more transparency from the social media giant.

“This is a defining moment for the Oversight Board. What relationship does it want to have with Facebook? I hope the Oversight Board takes this moment to stand up and demand a relationship that has more transparency,” said Haugen.

The Facebook Oversight Board is made up of experts in areas such as freedom of expression and human rights. They were appointed by the company but operate independently. The board is often described as a kind of Supreme Court for Facebook.

Last week, the board said that Facebook failed to provide crucial details about its “Cross-Check” program that reportedly shielded millions of VIP users from normal content moderation rules.

Facebook uses Cross-Check to review content decisions relating to high-profile users, such as politicians, celebrities and journalists. The program had mushroomed to include 5.8 million users in 2020, according to the Wall Street Journal.

On Sunday, Oversight Board member and PEN America CEO Suzanne Nossel said that Facebook must do a better job with transparency.

The board adjudicates cases on controversial content that is either left up or taken down — but these cases are just “the tip of the iceberg” when it comes to oversight at Facebook, Nossel said on CNN’s “Reliable Sources.”

“They didn’t want to bear the full weight of responsibility for big questions like, ‘Should Donald Trump be allowed on the platform,’” Nossel said. “I think they’re very ambivalent about regulation.”

Facebook's success was built on algorithms. Can they also fix it?

Billions of people around the world see relevant content all around the internet with the help of algorithms. It’s not unique to Facebook.

However, Facebook whistleblower Frances Haugen’s testimony earlier this month and submitted documents have renewed scrutiny of the impact Facebook and its algorithms have on teens, democracy and society at large.

The fallout has raised the question of just how much Facebook — and perhaps platforms like it — can or should rethink using a bevy of algorithms to determine which pictures, videos and news users see.

Algorithms are not going away. But there are ways for Facebook to improve them, experts in algorithms and artificial intelligence told CNN. It will, however, require something Facebook has so far appeared reluctant to offer (despite executive talking points): more transparency and control for users.

Rethinking focus on engagement

A big hurdle to making meaningful improvements is social networks’ current focus on the importance of engagement, or the amount of time users spend scrolling, clicking, and otherwise interacting with social media posts and ads, experts say.

Haugen revealed internal documents from Facebook that show the social network is aware that its “core product mechanics, such as virality, recommendations and optimizing for engagement, are a significant part” of why hate speech and misinformation “flourish” on its platform.

Changing this is tricky, though several experts agreed that it may involve considering the feelings users have when using social media, not just the amount of time they spend using it.

In the past, some might have said it would require pressure from advertisers whose dollars support these platforms. But in her testimony, Haugen seemed to bet on a different answer: pressure from Congress.

Whistleblower: Facebook views safety as a cost center instead of growth center

Former Facebook employee and whistleblower Frances Haugen told the UK parliament that the company views safety as a cost center and not an investment for growth.

“I think there is a view inside the company that safety is a cost center; it’s not a growth center, which, I think, is very short-term in thinking. Because Facebook’s own research has shown that when people have worse integrity experiences on the site, they are less likely to retain,” she said Monday.

She urged British lawmakers to put regulations in place, saying it was for the good of the company’s long-term growth.

“I think regulation could actually be good for Facebook’s long-term success. Because it would force Facebook back into a place where it was more pleasant to be on Facebook,” she said.

Remember: When Haugen testified in the US Congress, she urged the lawmakers to step in and create regulations, too.

Haugen says Facebook has a "huge weak spot" when it comes to reporting issues up the chain

Former Facebook employee and whistleblower Frances Haugen told UK lawmakers that the social media giant doesn’t devote enough resources to critical areas, such as national security and public safety.

Facebook has a “huge weak spot” when it comes to reporting issues up its chain of command, Haugen said in testimony on Monday.

“If I drove a bus in the United States, there would be a phone number in my break room that I could call that would say ‘Did you see something that endangered public safety? Call this number,’” she said.

“When I worked on counter-espionage [at Facebook] I saw things where I was concerned about national security, and I had no idea how to escalate those, because I didn’t have faith in my chain of command at that point,” added Haugen.

The whistleblower claimed that asking for more resources to tackle difficult issues was not part of Facebook’s corporate culture.

“We were told just to accept under-resourcing,” she told UK lawmakers. “There is a culture that lionizes kind of a startup ethic that, in my opinion, is irresponsible.”

“I flagged repeatedly when I worked on Civic Integrity that I felt that critical teams were understaffed, and I was told at Facebook ‘we accomplish unimaginable things with far fewer resources than anyone would think possible,’” said Haugen.

Whistleblower: Facebook's content safety systems don't apply equally to non-English-speaking countries

While testifying in the UK parliament, former Facebook employee and whistleblower Frances Haugen said one of her primary concerns is Facebook’s “under-investment in non-English languages and how they mislead the public [into thinking that] they are supporting them.”

“Facebook says things like, ‘we support 50 languages,’ when in reality, most of those languages get a tiny fraction of the safety systems that English gets,” Haugen told British lawmakers.

For example, in the recent revelations from the Facebook Papers, it is clear that Facebook employees repeatedly sounded the alarm on the company’s failure to curb the spread of posts inciting violence in “at risk” countries like Ethiopia, where a civil war has raged for the past year. But the documents reveal that Facebook’s moderation efforts were no match for the flood of inflammatory content on its platform.

By the way, by Facebook’s own estimates, it has 1.84 billion daily active users — 72% of whom are outside North America and Europe, according to its annual SEC filing for 2020.

The documents also indicate that the company has, in many cases, failed to adequately scale up staff or add local language resources to protect people in these places.

Even the most robust of Facebook’s existing moderation efforts may be best suited to American English, Haugen said.

“UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually [underenforced] in the UK,” Haugen explained.

Facebook staff this weekend were told to brace for "more bad headlines"

Facebook's Vice President of Global Affairs Nick Clegg in Berlin in June 2019.

Facebook Vice President of Global Affairs Nick Clegg told staff at the company to be prepared for more “bad headlines in the coming days” as a consortium of news organizations, including CNN, continue to publish stories based on a cache of tens of thousands of pages of leaked documents from the company. 

Clegg made the comments in an internal company post on Saturday, Axios first reported. A copy of the memo was obtained by CNN Sunday. 

Clegg took aim at news organizations, writing:

“Social media turns traditional top-down control of information on its head. In the past, public discourse was largely curated by established gatekeepers in the media who decided what people could read, see and digest. Social media has enabled people to decide for themselves – posting and sharing content directly. This is both empowering for individuals – and disruptive to those who hanker after the top-down controls of the past, especially if they are finding the transition to the online world a struggle for their own businesses.”

He also echoed public statements made by the company, writing, “At the heart of these stories is a premise which is plainly false: that we fail to put people who use our service first, and that we conduct research which we then systematically ignore. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands what we’re about, and where our own commercial interests lie.”

Facebook revelations are shocking — but nothing will change until Congress acts

Public pressure alone won’t get Facebook to change. If shame were enough, Facebook would have changed after the 2016 election. Or the Cambridge Analytica scandal. Or the 2020 election.

Even when dozens of major brands pulled their advertising over Facebook’s lax approach to regulating hate speech, the company barely felt a ding.

So it’s up to Washington to fix Facebook. And that’s no easy task.

Part of the problem with regulating Facebook is that lawmakers and regulators are feeling around in the dark for a solution to a problem society has never faced before. To borrow whistleblower Frances Haugen’s metaphor, it’s like the Transportation Department writing the rules of the road without even knowing that seat belts are an option.

And Facebook’s structure is uniquely murky, even among tech companies, according to Haugen.

“At other large tech companies like Google, any independent researcher can download from the Internet the company’s search results and write papers about what they find,” she said. “But Facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system.”

Read more here.

The Facebook whistleblower is testifying in the UK today

Former Facebook employee and whistleblower Frances Haugen testified before a Senate Committee on Commerce, Science, and Transportation hearing on Capitol Hill on October 5, 2021, in Washington, DC.

Former Facebook employee turned whistleblower Frances Haugen, who testified before Congress about how the social media giant misled the public, will now face questions in the UK Parliament.

Haugen is set to testify starting at 9:30 a.m. ET.

Haugen, the 37-year-old former Facebook (FB) product manager who worked on civic integrity issues at the company, revealed her identity during a “60 Minutes” segment that aired earlier this month.

She has reportedly filed at least eight whistleblower complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public. She also shared the documents with regulators and the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its apps.

In her testimony before Congress earlier this month, Haugen faced questions from a Commerce subcommittee about what Facebook-owned Instagram knew about its effects on young users, among other issues.

“I am here today because I believe that Facebook’s products harm children, stoke division, and weaken our democracy,” she said during her opening remarks. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won’t solve this crisis without your help.”

Facebook confronts an existential crisis

Facebook has confronted whistleblowers, PR firestorms and Congressional inquiries in recent years. But now it faces a combination of all three at once in what could be the most intense and wide-ranging crisis in the company’s 17-year history. 

On Friday, a consortium of 17 US news organizations began publishing a series of stories — collectively called “The Facebook Papers” — based on a trove of hundreds of internal company documents which were included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. The consortium, which includes CNN, reviewed the redacted versions received by Congress.  

CNN’s coverage includes stories about how coordinated groups on Facebook sow discord and violence, including on January 6, as well as Facebook’s challenges moderating content in some non-English-speaking countries, and how human traffickers have used its platforms to exploit people. The Wall Street Journal previously published a series of stories based on tens of thousands of pages of internal Facebook documents leaked by Haugen. The consortium’s work is based on many of the same documents.

Facebook has dealt with scandals over its approach to data privacy, content moderation and competitors before. But the vast trove of documents, and the many stories surely still to come from it, touch on concerns and problems across seemingly every part of its business: its approach to combatting hate speech and misinformation, managing international growth, protecting younger users on its platform and even its ability to accurately measure the size of its massive audience.  

All of this raises an uncomfortable question for the company: Is Facebook actually capable of managing the potential for real-world harms from its staggeringly large platforms, or has the social media giant become too big not to fail?

Ongoing problems

The documents show various examples of issues that Facebook has been aware of, even as it still struggles with them. Take the example of a report published by the Journal on September 16 that highlighted internal Facebook research about a violent Mexican drug cartel, known as Cártel Jalisco Nueva Generación. The cartel was said to be using the platform to post violent content and recruit new members using the acronym “CJNG,” even though it had been designated internally as one of the “Dangerous Individuals and Organizations” whose content should be removed. Facebook told the Journal at the time that it was investing in artificial intelligence to bolster its enforcement against such groups.

Despite the Journal’s report last month, CNN last week identified disturbing content linked to the group on Instagram, including photos of guns, and photo and video posts in which people appear to have been shot or beheaded. After CNN asked Facebook about the posts, a spokesperson confirmed that multiple videos CNN flagged were removed for violating the company’s policies, and at least one post had a warning added. 

Facebook’s response

Facebook, for its part, has repeatedly tried to discredit Haugen, and said her testimony and reports on the documents mischaracterize its actions and efforts.  

“At the heart of these stories is a premise which is false,” a Facebook spokesperson said in a statement to CNN. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie.” 

In a tweet thread last week, the company’s Vice President of Communications, John Pinette, called the Facebook Papers a “curated selection out of millions of documents at Facebook” which “can in no way be used to draw fair conclusions about us.” But even that response is telling — if Facebook has more documents that would tell a fuller story, why not release them?

Read more here.

Facebook's own research showed they should fact-check politicians. Instead, they let them lie.

On Jan. 6, when the Capitol attack began, some Facebook staffers began to wonder what role their company played in fueling the lies that led to the insurrection.

CNN’s Donie O’Sullivan explains how internal memos show that the company’s decision-making on content policy is routinely influenced by political considerations.
