Mark Zuckerberg and Jack Dorsey face Senate grilling over moderation practices

By Brian Fung, Kaya Yurieff and Rishi Iyengar CNN Business

Updated 7:48 a.m. ET, November 18, 2020
1:52 p.m. ET, November 17, 2020

Facebook pressed on handling of Kenosha counter-protester group

From CNN Business' Brian Fung

Sen. Chris Coons challenged Facebook on its handling of the protests in Kenosha, Wis., particularly pushing Zuckerberg on a Facebook page that urged armed counter-protesters to gather.

Coons pointed to Facebook’s existing policy banning calls to arms, asking why the page was not removed sooner in light of that policy.

Zuckerberg, who has described Facebook’s handling of the page as a mistake, said his understanding was that the call in question did not violate the policy at the time.

Coons responded that “facially, it seems to me this was a violation of your own call to arms policy.”

1:29 p.m. ET, November 17, 2020

Hawley tries to paint common practice as nefarious

From CNN Business' Oliver Darcy

Republican Sen. Josh Hawley targeted Facebook CEO Mark Zuckerberg with a misleading line of questioning.

Hawley repeatedly asked Zuckerberg if Facebook coordinates with YouTube and Twitter for "censorship" and "to control information," citing information he had obtained from a "whistleblower."

It is well-known that social media companies do communicate with each other on issues related to foreign meddling, terrorism, and other topics.

But Hawley tried to deceptively characterize the practice of Facebook communicating with its peers as nefarious and implied that the coordination was a major revelation.

Zuckerberg pushed back against Hawley's characterization, explaining to him first that the major social media companies "do coordinate and share signals on security related topics."

"For example," Zuckerberg explained, "there is signal around a terrorist attack or around child exploitation imagery or around a foreign government creating an influence operation, that is an area where the companies do share signals about what they see."

But Zuckerberg stressed that the communication "is distinct from the content moderation policies" Facebook has.

Zuckerberg noted that the companies might share information on what they are each seeing occur on their respective platforms, but that each company makes its own decisions on how it will enforce its policies.

1:42 p.m. ET, November 17, 2020

Sen. Klobuchar's questions show Big Tech has an even bigger problem than content: antitrust

From CNN Business' Kaya Yurieff

Though Tuesday's hearing is ostensibly about content moderation and the election, Sen. Amy Klobuchar asked a number of questions about the dominance of tech platforms. It's a good reminder that for Facebook, antitrust, and not debates over moderation, may be the biggest forthcoming challenge in Washington.

Sen. Klobuchar pressed Zuckerberg about the company's allegedly anti-competitive tactics. She pointed to Facebook's purchase of Instagram in 2012, which documents show it saw as an emerging rival, as an example.

"At the time, I don't think we or anyone else viewed Instagram as a competitor as a large multi-purpose social platform," Zuckerberg said. "In fact, people at the time kind of mocked our acquisition because they thought that we dramatically spent more money than we should have to acquire something that was viewed primarily as a camera and photo sharing app."

Klobuchar responded: "We don't know how [Instagram] would've done."

"When we look at your emails it kind of leads us down this road as well with WhatsApp that part of the purchase of these nascent competitors is to -- I'll use the words of FTC Chairman Joe Simons who just said last week: 'A monopolist can squash a nascent competitor by buying it, not just by targeting it with anti-competitive activity.'"

12:58 p.m. ET, November 17, 2020

Zuckerberg and Dorsey are both Big Tech, but that doesn't mean they agree

From CNN Business' Brian Fung

What Facebook and Twitter would like to see from federal content moderation laws appears to be increasingly diverging.

Zuckerberg has consistently emphasized how well Facebook’s systems work for detecting and blocking terrorist and child-exploitation content, categories of material that are explicitly illegal. Zuckerberg also said he would be open to a regulatory regime that imposes liability on tech platforms for some forms of content.

Twitter takes a very different tack. Dorsey has argued for rules promoting transparency in process and outcomes, and importantly, the ability for consumers to opt out of tech company algorithms that rank and determine what you should see.

Each executive’s position reflects the strengths of his respective product — Facebook’s entire business rests on its algorithm, so it makes sense that Zuckerberg would want rules that lean more heavily on a company’s ability to deploy its algorithms to manage content.

By contrast, Twitter has only gotten into ranking content in users’ feeds relatively recently in its history. For years its main approach was to serve users tweets in reverse chronological order, without ranking or sorting.

This distinction may also be the source of another point of friction between Dorsey and Zuckerberg. Dorsey has warned against policy changes that would “entrench” large, established social media companies — a veiled shot at Facebook — and Dorsey’s arguments today have all basically sought to deny Facebook the ability to benefit disproportionately from federal rules based on its business model.

1:21 p.m. ET, November 17, 2020

Zuckerberg and Dorsey say ideological makeup of workforces doesn't lead to bias

From CNN Business' Brian Fung

Sen. Ben Sasse pressed Facebook and Twitter on the ideological makeup of their workforces, focusing particularly on their employees in Silicon Valley, and questioning how content moderation policies can be applied in a non-partisan manner.

Zuckerberg acknowledged that many of Facebook’s employees are left-leaning. But, he said, the company takes care not to allow political bias to seep into decisions. In addition, he said, Facebook’s content moderators, many of whom are contractors, are based worldwide and “the geographic diversity of that is more representative of the community that we serve than just the full-time employee base in our headquarters in the Bay Area.”

Dorsey said political affiliation is “not something we interview for” at Twitter and said that while Twitter’s decisionmaking and outcomes can sometimes seem opaque, the company is trying to be more transparent.

“If people are questioning that,” he said, “[then] it’s a failure.”

12:57 p.m. ET, November 17, 2020

Sen. Cruz pushes for Twitter to admit it has taken sides on voter fraud allegations

From CNN Business' Brian Fung

Sen. Ted Cruz repeatedly pressured Dorsey to admit that Twitter has taken a corporate position on whether voter fraud exists, in a wider campaign to discredit the company’s labeling of tweets questioning the election results.

“Does voter fraud exist?” Cruz asked Dorsey.

“I don’t know for certain,” Dorsey replied.

“Why, then, is Twitter right now putting purported warnings on any statement about voter fraud?” Cruz shot back.

Dorsey said the company is simply connecting users to more context about claims of voter fraud. But Cruz rejected that defense, accusing Twitter of taking a “disputed” policy position despite repeated statements by the Trump administration’s own Department of Homeland Security that the election was conducted securely.

12:17 p.m. ET, November 17, 2020

Zuckerberg’s rhetorical crutch

From CNN Business' Brian Fung

Throughout this hearing, Zuckerberg has cited Facebook’s handling of terrorist or child-exploitation content as an example of how well the company’s systems work. But this is a deflection; federal law makes this type of content illegal, so Facebook’s obligations in this area, as well as its practices, are clear and straightforward.

The entire point of the hearing is how tech platforms should handle content where the lines are not so clear-cut, and what responsibilities lie with the tech companies for addressing that.

12:54 p.m. ET, November 17, 2020

Sen. Leahy raises Facebook's Myanmar missteps

From CNN Business' Rishi Iyengar

While questioning Zuckerberg, Sen. Patrick Leahy of Vermont cited one of the most infamous examples of where Facebook's policies failed.

The company was accused of helping fuel "genocide" in Myanmar, where its platforms were used to spread hate speech and promote violence against the country's Rohingya Muslim minority. Facebook acknowledged in 2018 that it had not done enough to prevent the violence.

Leahy said he was "deeply concerned" about Facebook's role in Myanmar, and acknowledged that Facebook had taken down dozens of accounts linked to Myanmar's military that promoted anti-Rohingya content.

"I compliment you for doing that but the Myanmar military just turned around and created new accounts that promote the same content," Leahy said. "So...in some way you've got a whack-a-mole problem here."

Zuckerberg noted that Facebook prohibited certain members of the military and other dangerous individuals from making new accounts, but he compared Facebook's effort to crack down on hate speech to a city's efforts to crack down on crime.

"These kinds of integrity problems are not ones that there's a silver bullet," he said. "You will always be working to help minimize the prevalence of harm in the same way that a city will never eliminate all crime, you try to reduce it...and have it be as little as possible, and that's what we're trying to do."

The oft-cited Myanmar example highlights how Facebook and other social networks' battles against misinformation often play out in far more sinister ways outside the United States, particularly in developing countries. In India — home to Facebook's biggest user base — the company now faces similar accusations of hate speech and political bias, with government committees examining its role in religious riots in New Delhi earlier this year.

12:02 p.m. ET, November 17, 2020

Sen. Lee cites content moderation mistakes as evidence of bias

From CNN Business' Brian Fung

Sen. Mike Lee asked a series of questions regarding content decisions that Facebook and Twitter later reversed. Listening to him, one could come away believing the companies were simply biased against the right.

But as with many of these claims, the truth is often simpler: The companies are just bad at moderation. If you cherry-pick examples of their failures, it's easy to make them look biased, when the more mundane explanation is inconsistency.

Both Zuckerberg and Dorsey acknowledged the errors and made little effort to defend themselves except to say that they made mistakes and corrected them. As Dorsey said of content moderation early in the hearing, "We are facing something that feels impossible."

Under questioning by Lee, Dorsey acknowledged that Twitter made a “mistake” when it acted against a tweet by Mark Morgan, acting commissioner of US Customs and Border Protection.

Dorsey attributed the mistake to a “heightened awareness of government accounts.” Lee slammed social media platforms for repeated “mistakes” that he said appear to disproportionately affect conservative accounts.