Near the end of a tough, wide-ranging 40-minute interview on Monday, YouTube CEO Susan Wojcicki made her second attempt at an apology to users.
Wojcicki’s remarks were intended to address a backlash stemming from YouTube’s confusing response last week to videos posted by Steven Crowder, a prominent right-wing personality, which included homophobic and racist slurs directed at Vox video journalist Carlos Maza. First, YouTube said Crowder’s videos didn’t violate its policies. The next day, YouTube said it would demonetize the videos, in light of what YouTube’s head of communications described as “widespread harm to the YouTube community resulting from the ongoing pattern of egregious behavior.”
Yet, a similar apology could arguably have applied to any number of controversies swirling around the company as YouTube increasingly finds itself nudged into the harsh spotlight long occupied primarily by Facebook (FB).
This month alone, YouTube has faced scrutiny for allegedly radicalizing young men online through its recommendation engine and for suggesting videos of “prepubescent, partially clothed children” to users who had previously viewed “sexually themed content.” YouTube announced plans to ban supremacist content and pull videos denying well-documented atrocities like the Sandy Hook school shooting — only after years of conspiracy theories about the tragedy circulating on the platform — but it also removed videos from academics and anti-racist groups, some of which were later restored.
For more than two years, politicians and media on both sides of the Atlantic have scrutinized the big online platforms for their roles in spreading fake news and enabling election meddling, their efforts to tackle extremist content, and their data-privacy practices. But until recently, much of that scrutiny had focused squarely on Facebook’s bottomless pit of scandals and self-inflicted wounds rather than YouTube.
Like YouTube, Wojcicki herself has largely avoided the public grillings of her counterparts at other tech companies. While the CEOs of Facebook and Twitter have both testified before Congress, Wojcicki has not — though her boss, Google CEO Sundar Pichai, has appeared.
If the recent wave of damning headlines is any indication, YouTube’s time in the hot seat may finally be approaching. A bipartisan mix of senators have called on YouTube to do more to protect children on the platform. Google’s LGBTQ employees are said to be frustrated and angry about YouTube’s handling of the Crowder videos. And YouTube faces the prospect of a US antitrust probe into its parent company Google. When asked at the conference Monday what would happen if Google had to spin off YouTube, Wojcicki repeatedly said “I don’t know” before adding, “We would figure it out.”
Shortly before that, she said, “There’s definitely more regulation in store for us.”
At the heart of the latest crop of headlines about YouTube are fundamental concerns that the company can’t fully police its sprawling platform and that it struggles to enforce its policies consistently. YouTube now has more than 2 billion monthly active users who collectively upload more than 500 hours of content to the service every minute. Beyond that, there are concerns YouTube prioritized user engagement — and the potential to make money from those engaged eyeballs — above all else, even if that meant algorithmically directing viewers to more controversial content.
If this all sounds oddly familiar, it’s because Facebook has fielded a similar set of concerns about its power and enforcement approach since the 2016 presidential election and the Cambridge Analytica scandal.
In the interview Monday, Wojcicki repeatedly struck a defiant tone. She argued that “all the news and the concerns” have actually “been about this fractional 1%” of activity on the platform. The other 99-plus percent, she said, is “all really valuable content.” Wojcicki also stressed YouTube’s efforts to improve its video suggestions, including a move announced in January to reduce recommendations for “borderline content that could misinform users in harmful ways.”
But perhaps the most telling quote from the interview came when the interviewer pointed out what may be the fatal flaw of YouTube and its tech peers: The company doesn’t actually know what’s on its website because it doesn’t vet and approve content the way many traditional publishers do.
“We work hard to understand what is happening on it,” she said.
But no matter how hard it works, there’s no guarantee YouTube ever fully succeeds.