(CNN Business) —  

It’s a new normal: Thousands of people now have jobs that require them to view graphic and disturbing videos for hours on end. But in this wild, wild west of work, critics say companies need to better understand and support the needs of those in this growing industry.

The role of content moderators was once again put into focus this week following an explosive report by The Verge into the lives of some of these workers for Facebook. The report is just the latest glimpse into the dark underbelly of the internet.

Content moderators typically help companies weed out disturbing content ranging from suicide and murder videos to conspiracy theories in order to make platforms more palatable. The report about Facebook, which cited interviews with a dozen workers who do or have done moderation work for the company, found that these workers are paid $28,800 annually, with little upward mobility and few perks. Some reported coping with trauma by getting high on breaks or having sex at the office.

“It’s not really clear what the ideal circumstance would be for a human being to do this work,” said Sarah T. Roberts, an assistant professor of information studies at UCLA, who has been sounding the alarm about the work and conditions of content moderators for years.

Not much is known about how many workers are tasked with viewing the worst of social media. There is also little understanding of the long-term effects of this kind of work, or of how to mitigate on-the-job trauma.

The content moderation industry is growing as the platforms do. Some companies have increasingly touted these human workforces as a solution to criticism over inappropriate content, oftentimes relying on outsourced workers in a "call center" environment to handle the disturbing tasks. YouTube announced in late 2017 that it would hire 10,000 people to clean up offensive videos after backlash that included troubling content seeping through its YouTube Kids platform. Facebook has said it has 15,000 workers doing content moderation, nearly double what it had just last April.

It is hard to track exact numbers because job titles vary across companies, some employees are under nondisclosure agreements, much of the work is outsourced and there tends to be a high turnover rate. But the work is almost always taxing on the workers.

A person who formerly vetted social media content for the news industry spoke to CNN Business on the condition of anonymity about experiencing PTSD as a result of moderation duties. His job required viewing footage such as chemical attacks and bus bombings.

While he was not reviewing content to evaluate whether it went against a platform’s user policies, his job was not dissimilar. He had to watch and rewatch violent and disturbing videos to get his job done.

“Every terrible Islamic State video over the last four years that you can think of, I saw it,” he said. “The WWI term for PTSD was shell shock, and that’s kind of what you suddenly feel.”

Despite being “well compensated,” the worker said there weren’t any resources outside of a standard employee assistance program to help cope with the job’s trauma.

“The horrible images that you have to see, money doesn’t really enter into it,” said the worker, who ultimately took two months off from the job for intensive therapy, and is still recovering.

Instead, he suggested that companies better care for workers by distributing the work or limiting the amount of time people spend viewing extreme content. He said firms can also provide experts who understand the trauma and symptoms that can result from exposure to certain types of content.

Roberts says there’s still a “reckoning” that needs to happen when it comes to understanding the facets, implications and costs of the job on workers.

“There’s really two exit pathways for people who do this work, for the most part: burnout and desensitization,” she told CNN Business.

While there’s not a clear alternative content moderation model for big companies like Facebook and Google, Roberts said there are other approaches being taken. Reddit and Wikipedia have more community-based models while some smaller companies allow moderators to have more responsibility in crafting content moderation policies.

Multiple companies including Facebook are beginning to use automated systems based on artificial intelligence, but they are not yet close to the point where they can replace human judgment.

Kate Klonick, an assistant professor at St. John’s University Law School who has studied content moderation from a policy perspective, told CNN Business, “A cost of not over-censoring, and having the nuance, is having humans do this kind of really gross work, at least for now.”

“AI is basically the only thing that can save people from this type of stuff, and it’s years and years away,” she said.

According to The Verge’s report, some moderators even began to embrace the views espoused in the conspiracy videos they were reviewing on the platform. Roberts, who has a book out in June on the genesis of content moderation work over the past decade, said that rings true to her.

“I have spoken extensively with a woman who was a MySpace moderator in 2007,” Roberts said. “She talked about the propensity to be irrevocably altered based on consumption of what [moderators] were supposedly moderating against.”