- Film trailer blamed for violence in Libya that leaves U.S. ambassador dead
- YouTube blocks the video in Libya and Egypt
- The video does not breach YouTube's community guidelines
YouTube on Wednesday announced it was restricting access to a controversial video that has been blamed for inciting violence in Libya and protests in Egypt.
The video, a film trailer mocking the Muslim faith, will not be accessible via YouTube in Libya and Egypt, the company said in a statement issued to CNN.
"We work hard to create a community everyone can enjoy and which also enables people to express different opinions," YouTube said by e-mail. "This can be a challenge because what's OK in one country can be offensive elsewhere.
"This video -- which is widely available on the Web -- is clearly within our guidelines and so will stay on YouTube. However, given the very difficult situation in Libya and Egypt we have temporarily restricted access in both countries.
"Our hearts are with the families of the people murdered in yesterday's attack in Libya."
The video, a 14-minute movie trailer that many view as an insult to Muslims and their faith, was blamed for setting off a wave of violence in Libya and protests in Egypt on Tuesday and Wednesday. An attack on a U.S. consulate in the Libyan city of Benghazi left the U.S. ambassador and three other embassy staffers dead. President Obama strongly condemned the violence, calling the attack "outrageous."
Afghanistan banned YouTube in response to the video, according to reports.
"We have been told to shut down YouTube to the Afghan public until the video is taken down," Aimal Marjan, from the country's ministry of information and technology, told Reuters.
The events raise what's becoming a familiar question at a time when the Internet has become the world's major medium of communication: When should websites that display user-generated content take down material that is deemed to be offensive?
This sort of debate comes up with some frequency. Some people were outraged this year when Facebook decided not to take down pages that supported the man accused of opening fire on moviegoers in a Colorado theater. YouTube also has been criticized for taking down videos showing police brutality and other acts of violence that could have political or journalistic significance. Some of those videos were reposted by the site after further review.
YouTube's online guidelines ban pornography and "graphic or gratuitous violence" and ask the people who upload videos to respect copyright laws.
On the subject of controversial speech, the site says, "We encourage free speech and defend everyone's right to express unpopular points of view. But we don't permit hate speech (speech which attacks or demeans a group based on race or ethnic origin, religion, disability, gender, age, veteran status, and sexual orientation/gender identity)."
Jillian York, director for international freedom of expression at the Electronic Frontier Foundation, said YouTube should not take down the video worldwide but should consider blocking it in specific countries where it's causing violence.
"Is it good in the short term? Probably not," she said of leaving the video online. "But for the long term and what it means for corporate free expression, yes, I think it's good to keep it up. It's a difficult balance."
The video could be seen as having news and documentary value, she said, potentially putting it in a category of videos that are allowed to stay on YouTube even though they show, for example, drug use or violence -- acts that would violate YouTube's terms if they weren't newsworthy.
This clip probably wouldn't break U.S. rules about inciting violence, York said, because it does not directly call for people to engage in violent acts.
In a 2010 lecture at a human rights conference, YouTube's Victoria Grand offered further details on how the site polices itself. With 72 hours of video being uploaded to YouTube every minute, the site relies on its audience, employees and computer programs to address and sometimes delete content that is deemed to violate its terms.
To be removed from YouTube, a video first must be flagged by one of the site's users. Computers then scan the clip for certain traits -- the first scan, Grand said, is for flesh-colored pixels -- and prioritize the flagged videos in a queue. A group of YouTube employees, positioned all over the world, then reviews that queue and judges each video against YouTube policy.
"Their job is to spend every single day looking at a queue of what I would define as the dark underbelly of the Internet," Grand said in a video posted by the group Global Voices. "So you're going to see a lot of masturbation videos in that queue; you're going to see animal abuse. You're going to see some pretty, pretty horrendous things. I would say the vast majority of flags we get are for things like pornography."
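The pipeline Grand describes -- user flags feeding an automated pre-scan that prioritizes a queue for human reviewers -- can be sketched roughly as follows. This is an illustrative toy model only; the class names and the scoring heuristic are invented for the sketch and are not YouTube's actual code or classifier:

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical sketch of the flag -> scan -> review pipeline Grand
# describes. The scoring function is an illustrative stand-in.

@dataclass(order=True)
class FlaggedVideo:
    priority: float                      # lower value = reviewed sooner
    video_id: str = field(compare=False)

def automated_scan(flag_count: int, skin_pixel_ratio: float) -> float:
    """Assign a review priority from simple signals (illustrative only).

    Grand mentions a first-pass scan for flesh-colored pixels; here that
    signal is combined with how often users flagged the video.
    """
    # More flags and more skin-toned pixels -> smaller number -> higher priority.
    return 1.0 / (1.0 + flag_count + 10.0 * skin_pixel_ratio)

queue: list[FlaggedVideo] = []

# A video enters the queue only after at least one user flags it.
for vid, flags, skin in [("a1", 3, 0.6), ("b2", 120, 0.1), ("c3", 1, 0.0)]:
    heapq.heappush(queue, FlaggedVideo(automated_scan(flags, skin), vid))

# Human reviewers pop the highest-priority items and judge them
# against the written policy.
review_order = [heapq.heappop(queue).video_id for _ in range(len(queue))]
print(review_order)  # heavily flagged video "b2" surfaces first
```

The key design point in such a system is that automation only orders the work: the removal decision itself stays with the human reviewers Grand describes.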
The film in question is considered offensive to Muslims and depicts the Muslim prophet Mohammed as a child molester, womanizer and killer. The video had about 55,000 views as of 11 a.m. ET Wednesday. It had a mostly unfavorable rating on YouTube, with more than 8,000 "dislikes" to 2,600 "likes." Nearly 9,000 people had commented on the video, and several versions of it appeared to have been uploaded to YouTube.
The 2-month-old film trailer is in English, but it has been dubbed into Arabic. The New York Times reports that an Arabic-language version of the video was taken down from YouTube, apparently for reasons of copyright infringement.
New York magazine called the video "absurd" and "Islamophobic."
The video also reportedly has been shown by means other than YouTube. Part of the film was broadcast on Egyptian TV, according to the Times.