Mark Zuckerberg and Jack Dorsey face Senate grilling over moderation practices

By Brian Fung, Kaya Yurieff and Rishi Iyengar CNN Business

Updated 7:48 a.m. ET, November 18, 2020
1:42 p.m. ET, November 17, 2020

Sen. Klobuchar's questions show Big Tech has an even bigger problem than content: antitrust

From CNN Business' Kaya Yurieff

Though Tuesday's hearing is ostensibly about content moderation and the election, Sen. Amy Klobuchar asked a number of questions about the dominance of tech platforms. It's a good reminder that for Facebook, antitrust, and not debates over moderation, may be the biggest forthcoming challenge in Washington.

Sen. Klobuchar pressed Zuckerberg about the company's allegedly anti-competitive tactics. She pointed to Facebook's purchase of Instagram in 2012, which documents show it saw as an emerging rival, as an example.

"At the time, I don't think we or anyone else viewed Instagram as a competitor as a large multi-purpose social platform," Zuckerberg said. "In fact, people at the time kind of mocked our acquisition because they thought that we dramatically spent more money than we should have to acquire something that was viewed primarily as a camera and photo sharing app."

Klobuchar responded: "We don't know how [Instagram] would've done."

"When we look at your emails it kind of leads us down this road as well with WhatsApp that part of the purchase of these nascent competitors is to -- I'll use the words of FTC Chairman Joe Simons who just said last week: 'A monopolist can squash a nascent competitor by buying it, not just by targeting it with anti-competitive activity.'"

12:58 p.m. ET, November 17, 2020

Zuckerberg and Dorsey are both Big Tech, but that doesn't mean they agree

From CNN Business' Brian Fung

Facebook and Twitter appear to be increasingly divided over what they would like to see from federal content moderation laws.

Zuckerberg has consistently emphasized how well Facebook’s systems work for detecting and blocking terrorist and child-exploitation content, categories of material that are explicitly illegal. Zuckerberg also said he would be open to a regulatory regime that imposes liability on tech platforms for some forms of content.

Twitter takes a very different tack. Dorsey has argued for rules promoting transparency in process and outcomes, and importantly, the ability for consumers to opt out of tech company algorithms that rank and determine what you should see.

Each executive’s position reflects the strengths of his respective product — Facebook’s entire business rests on its algorithm, so it makes sense that Zuckerberg would want rules that lean more heavily on a company’s ability to deploy its algorithms to manage content.

By contrast, Twitter began ranking content in users’ feeds relatively recently. For years, its main approach was to serve users tweets in reverse chronological order, without ranking or sorting.

This distinction may also be the source of another point of friction between Dorsey and Zuckerberg. Dorsey has warned against policy changes that would “entrench” large, established social media companies — a veiled shot at Facebook — and Dorsey’s arguments today have all basically sought to deny Facebook the ability to benefit disproportionately from federal rules based on its business model.

1:21 p.m. ET, November 17, 2020

Zuckerberg and Dorsey say ideological makeup of workforces doesn't lead to bias

From CNN Business' Brian Fung

Sen. Ben Sasse pressed Facebook and Twitter on the ideological makeup of their workforces, focusing particularly on their employees in Silicon Valley, and questioning how content moderation policies can be applied in a non-partisan manner.

Zuckerberg acknowledged that many of Facebook’s employees are left-leaning. But, he said, the company takes care not to allow political bias to seep into decisions. In addition, he said, Facebook’s content moderators, many of whom are contractors, are based worldwide and “the geographic diversity of that is more representative of the community that we serve than just the full-time employee base in our headquarters in the Bay Area.”

Dorsey said political affiliation is “not something we interview for” at Twitter and said that while Twitter’s decision-making and outcomes can sometimes seem opaque, the company is trying to be more transparent.

“If people are questioning that,” he said, “[then] it’s a failure.”

12:57 p.m. ET, November 17, 2020

Sen. Cruz pushes for Twitter to admit it has taken sides on voter fraud allegations

From CNN Business' Brian Fung

Sen. Ted Cruz repeatedly pressured Dorsey to admit that Twitter has taken a corporate position on whether voter fraud exists, in a wider campaign to discredit the company’s labeling of tweets questioning the election results.

“Does voter fraud exist?” Cruz asked Dorsey.

“I don’t know for certain,” Dorsey replied.

“Why, then, is Twitter right now putting purported warnings on any statement about voter fraud?” Cruz shot back.

Dorsey said the company is simply connecting users to more context about claims of voter fraud. But Cruz rejected that defense, accusing Twitter of taking a “disputed” policy position despite repeated statements by the Trump administration’s own Department of Homeland Security that the election was conducted securely.

12:17 p.m. ET, November 17, 2020

Zuckerberg’s rhetorical crutch

From CNN Business' Brian Fung

Throughout this hearing, Zuckerberg has cited Facebook’s handling of terrorist or child-exploitation content as an example of how well the company’s systems work. But this is a deflection; federal law makes this type of content illegal, so Facebook’s obligations in this area, as well as its practices, are clear and straightforward.

The entire point of the hearing is how tech platforms should handle content where the lines are not so clear-cut, and what responsibilities lie with the tech companies for addressing that.

12:54 p.m. ET, November 17, 2020

Sen. Leahy raises Facebook's Myanmar missteps

From CNN Business' Rishi Iyengar

While questioning Zuckerberg, Sen. Patrick Leahy of Vermont cited one of the most infamous examples of where Facebook's policies failed.

The company was accused of helping fuel "genocide" in Myanmar, where its platforms were used to spread hate speech and promote violence against the country's Rohingya Muslim minority. Facebook acknowledged in 2018 that it had not done enough to prevent the violence.

Leahy said he was "deeply concerned" about Facebook's role in Myanmar, and acknowledged that Facebook had taken down dozens of accounts linked to Myanmar's military that promote anti-Rohingya content.

"I compliment you for doing that but the Myanmar military just turned around and created new accounts that promote the same content," Leahy said. "So...in some way you've got a whack-a-mole problem here."

Zuckerberg noted that Facebook prohibited certain members of the military and other dangerous individuals from making new accounts, but he compared Facebook's effort to crack down on hate speech to a city's efforts to crack down on crime.

"These kinds of integrity problems are not ones that there's a silver bullet," he said. "You will always be working to help minimize the prevalence of harm in the same way that a city will never eliminate all crime, you try to reduce it...and have it be as little as possible, and that's what we're trying to do."

The oft-cited Myanmar example highlights just how Facebook's and other social networks' battles against misinformation often play out in far more sinister ways outside the United States, particularly in developing countries. In India — home to Facebook's biggest user base — the company now faces similar accusations of hate speech and political bias, with government committees examining its role in religious riots in New Delhi earlier this year.

12:02 p.m. ET, November 17, 2020

Sen. Lee cites content moderation mistakes as evidence of bias

From CNN Business' Brian Fung

Sen. Mike Lee asked a series of questions regarding content decisions that Facebook and Twitter later reversed. Listening to him, one could come away believing the companies were simply biased against the right.

But as with many of these claims, the truth is often simpler: The companies are just bad at this. Cherry-picking examples of botched moderation decisions makes it easy to paint the companies as biased, but the pattern points to error-prone enforcement, not ideology.

Both Zuckerberg and Dorsey acknowledged the errors and made little effort to defend themselves except to say that they made mistakes and corrected them. As Dorsey said of content moderation early in the hearing, "We are facing something that feels impossible."

Under questioning by Lee, Dorsey acknowledged that Twitter made a “mistake” when it acted against a tweet by Mark Morgan, the acting commissioner of US Customs and Border Protection.

Dorsey attributed the mistake to a “heightened awareness of government accounts.” Lee slammed social media platforms for repeated “mistakes” that he said appear to disproportionately affect conservative accounts.

11:51 a.m. ET, November 17, 2020

Sen. Lee claims Facebook's content labels are 'editorializing'

From CNN Business' Brian Fung

Sen. Mike Lee used his question time to advance the voter fraud narrative, accusing Facebook of trying to manipulate its users through “editorializing” and content labels.

Lee cited Facebook’s decision to label his own post about suspected election fraud as something that “sounds more like state run media announcing the party line, rather than a neutral company as it purports to be.”

“This kind of editorializing insulates people from the truth and it insinuates that anyone concerned about voter fraud must be crazy,” Lee said. “These concerns may be out of the mainstream in Palo Alto, but they’re not out of the mainstream in the rest of America.”

Facebook’s labeling is based on public statements by election authorities and the Bipartisan Policy Center.

3:32 p.m. ET, November 17, 2020

Zuckerberg defends inaction on Steve Bannon's Facebook page

From CNN Business' Kaya Yurieff and Brian Fung

Zuckerberg flatly rejected pressure to ban Steve Bannon’s Facebook page.

"How many times is Steve Bannon allowed to call for the murder of public officials before Facebook suspends his account?" Sen. Richard Blumenthal asked Zuckerberg. He also asked whether Zuckerberg would commit to taking down Bannon's Facebook page.

Zuckerberg said "no" and “that’s not what our policies suggest that we should do.”

The Facebook CEO added that the company does take down accounts that post content related to terrorism or child exploitation the first time they do so. However, for other categories, Facebook requires multiple violations before an account or page is removed.

Bannon was permanently suspended from Twitter last week after saying in a video that Dr. Anthony Fauci and FBI Director Christopher Wray should be beheaded. The video was live on Bannon's Facebook page for about 10 hours and was viewed almost 200,000 times before the company removed it, citing its violence and incitement policies.

Last week, Zuckerberg told employees at a company meeting that the video was not enough of a violation of Facebook's rules to permanently suspend the former White House chief strategist from the platform, an employee told CNN.