A version of this article first appeared in the “Reliable Sources” newsletter. You can sign up for free right here.
YouTube is under scrutiny from several different directions. One big issue: How children interact with videos. Given YouTube’s dominance, and the allure of the smartphone screen for kids, this is one of the most important issues in media and tech right now.
– The news: “YouTube is weighing a number of changes to its handling of content for children following pressure from inside and outside the company,” CNN’s Brian Fung wrote Wednesday. “Some of the changes under consideration include preventing videos from automatically playing after the previous one finishes, the person said. Another concept, first reported by the WSJ, proposes moving children’s videos off of YouTube and into YouTube Kids, a standalone app that more tightly limits the content it allows. But that would be a drastic step and is unlikely to occur, the person said, and no decisions have been made.”
– The context: The WaPo reported Wednesday that the FTC is “in the late stages of an investigation into YouTube for its handling of children’s videos…”
– The next steps: No announcements are imminent, but lots of potential changes are on the proverbial table…
“It’s not about free speech, it’s about free reach.”
According to the WSJ’s Rob Copeland, who came out with a big new story on Wednesday, that’s a new “company mantra” that’s been “endorsed” by CEO Susan Wojcicki in internal meetings.
The suggestion is that not all videos are created equal — not all videos are owed the same treatment and visibility. Some of the company’s changes, Copeland wrote, “are designed to choke viewership for certain content by burying it far from most users, rather than proactively eliminating wide swaths of videos…”
What will the FTC do?
After the WSJ story came the Post story, which revealed that the FTC could hit YouTube with a “hefty penalty” and a settlement that forces changes to “better protect kids.” The focus has been on the Children’s Online Privacy Protection Act. In other words, as WIRED editor Nicholas Thompson tweeted, the FTC investigation “is into whether YouTube illegally collected data on kids, violating privacy. It’s not about child porn, addictive recommendations, or the videos of kids being put in washing machines…”
Markey says he’s working on a bill…
Democratic Senator Ed Markey responded to the Post report by saying it’s “long overdue.” Markey tweeted that YouTube “has yet to take the necessary steps to protect its youngest users.” And he said that he’ll introduce legislation “in the coming weeks” that will combat “online design features that coerce children and create bad habits; marketing that manipulates kids and pushes them into consumer culture; the amplification of inappropriate and harmful content on the internet.”
Everyone thinks they can fix it…
On the phone with a tech exec on Wednesday night, I found myself second-guessing my own ideas about YouTube. Move all kids’ content to the YouTube Kids app? Makes sense! But: What about kids’ videos that teens and adults like to share? That’s tricky.
Or how about the YouTube feature I desperately want for my two-year-old — an option that shows only professionally made content, filtering out all the amateur and homemade videos? The question is, who’s going to determine when an amateur becomes a pro? Every suggested solution leads to more questions…
The trouble with the recommendation engine
As Fung noted in his story, “YouTube has found itself in the crosshairs amid concerns that the platform’s video-recommendation software directs viewers toward violent, disturbing or conspiratorial content. Users, including children, may start viewing safe content and then be led to less appropriate videos that have been optimized for YouTube’s algorithm. There are videos on YouTube of familiar children’s cartoon characters in dangerous or unsettling scenarios.”
→ Stephen Balkam, founder and CEO of the Family Online Safety Institute: “My hope would be that YouTube’s algorithms would be better at picking these sorts of things up, and in the case of disturbing or violent images, at the very least put them behind an interstitial warning that content might be seen as offensive.”
→ My most recent “Reliable” podcast, with Kevin Roose, was all about the recommendation engine…