YouTube on Thursday said it has started disabling comments on videos of minors following accusations that the platform aided pedophiles in finding clips of young children.
The company, which is owned by Alphabet, said it has turned off comments on “tens of millions” of videos over the past week that could be subject to “predatory behavior.” It will continue to identify such videos over the next few months.
“We recognize that comments are a core part of the YouTube experience and how you connect with and grow your audience. At the same time, the important steps we’re sharing today are critical for keeping young people safe,” the company wrote in a blog post.
Last week, video creator Matt Watson said YouTube’s algorithm created a “wormhole” that let pedophiles easily find clips of young children performing activities such as gymnastics or yoga through the website’s recommended videos feature. Some users left comments specifying the timestamps at which minors appeared in “compromising positions.”
Over the next few months, YouTube says it will widen its efforts to shut off comments on videos featuring younger minors (under 13), as well as videos of older minors (13 to 17) that could attract “predatory” behavior.
A small number of creators will be allowed to keep comments on these types of videos, but the channels must actively moderate their comments and prove the content is low risk. A YouTube spokesperson declined to say which specific accounts or types of creators would be exempt.
“We will work with them directly and our goal is to grow this number over time as our ability to catch violative comments continues to improve,” the company said in the blog post, referring to creators.
Videos featuring minors are flagged algorithmically through AI. As with any AI-assisted moderation, YouTube’s method is likely to be far from perfect, because it can be tricky for a computer to accurately determine a person’s age. Factors such as how someone is sitting or standing, or the lighting in a video, can make age harder to gauge.
Last April, YouTube announced that AI systems were detecting most of the videos that ended up getting taken down from the site. The company has also used AI to help expert groups like the Internet Watch Foundation identify and report child sexual abuse content online.
YouTube also announced it is working on a more effective tool to help detect and remove twice as many violative comments from the site.
This isn’t the first time the company has said it is prioritizing detrimental comments on its platform. In 2017, CEO Susan Wojcicki said the company was “taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether.”
At the time, YouTube announced it would hire 10,000 people to clean up offensive videos, following backlash over issues that included troubling content slipping through its YouTube Kids platform.
CNN Business’ Sara Ashley O’Brien, Rachel Metz and Oliver Darcy contributed reporting.