New York CNN Business —  

It’s a new normal: Thousands of people now have jobs that require them to view graphic and disturbing videos for hours on end. But in this wild, wild west of work, critics say companies need to better understand and support the needs of those in this growing industry.

The role of content moderators was once again put into focus this week following an explosive report by The Verge on the lives of some of the people who do this work for Facebook. The report is just the latest glimpse into the dark underbelly of the internet.

Content moderators typically help companies weed out disturbing content ranging from suicide and murder videos to conspiracy theories in order to make platforms more palatable. The report about Facebook, which cited interviews with a dozen workers who do or have done moderation work for the company, showed workers are reportedly paid $28,800 annually with little upward mobility and few perks. Some reported coping with trauma by getting high on breaks or having sex at the office.

“It’s not really clear what the ideal circumstance would be for a human being to do this work,” said Sarah T. Roberts, an assistant professor of information studies at UCLA, who has been sounding the alarm about the work and conditions of content moderators for years.

Not much is known about how many workers are tasked with viewing the worst of social media. There is also little understanding of the long-term effects of this kind of work or how to mitigate on-the-job trauma.

The content moderation industry is growing as the platforms do. Some companies have increasingly touted these human workforces as a solution to criticism over inappropriate content, oftentimes relying on outsourced workers in a “call center” environment to handle the disturbing tasks. YouTube announced in late 2017 it would hire 10,000 people to clean up offensive videos after backlash, including troubling content seeping through its YouTube Kids platform. Facebook has said it has 15,000 workers doing content moderation, nearly double the number it had just last April.

It is hard to track exact numbers because job titles vary across companies, some employees are under nondisclosure agreements, much of the work is outsourced and there tends to be a high turnover rate. But the work is almost always taxing for those who do it.

A person who formerly vetted social media content for the news industry spoke to CNN Business on the condition of anonymity about experiencing PTSD as a result of moderation duties. His job required viewing footage such as chemical attacks and bus bombings.

While he was not reviewing content to evaluate whether it went against a platform’s user policies, his job was not dissimilar. He had to watch and rewatch violent and disturbing videos to get his job done.

“Every terrible Islamic State video over the last four years that you can think of, I saw it,” he said. “The WWI term for PTSD was shell shock and that’s kind of what you suddenly feel.”

Despite being “well compensated,” the worker said there weren’t any resources outside of a standard employee assistance program to help cope with the job’s trauma.

“The horrible images that you have to see, money doesn’t really enter into it,” said the worker, who ultimately took two months off from the job for intensive therapy and is still recovering.

Instead, he suggested that companies better care for workers by distributing the work or limiting the amount of time people spend on viewing extreme content. He said firms can provide experts who understand the trauma and symptoms that could result from exposure to certain types of content.

Roberts says there’s still a “reckoning” that needs to happen when it comes to understanding the facets, implications and costs of the job on workers.

“There’s really two exit pathways for people who do this work for the most part: burnout and desensitization,” she told CNN Business.

While there’s not a clear alternative content moderation model for big companies like Facebook and Google, Roberts said there are other approaches being taken. Reddit and Wikipedia have more community-based models, while some smaller companies give moderators more responsibility in crafting content moderation policies.

Multiple companies, including Facebook, are beginning to use automated systems based on artificial intelligence, but those systems are not yet close to the point where they can replace human judgment.

Kate Klonick, an assistant professor at St. John’s University Law School who has studied content moderation from a policy perspective, told CNN Business, “A cost of not over-censoring, and having the nuance, is having humans do this kind of really gross work, at least for now.”

“AI is basically the only thing that can save people from this type of stuff, and it’s years and years away,” she said.

According to The Verge’s report, some moderators even began to embrace the views espoused in the conspiracy videos they were reviewing on the platform. Roberts, who has a book out in June on the genesis of content moderation work over the past decade, said that rings true to her.

“I have spoken extensively with a woman who was a MySpace moderator in 2007,” Roberts said. “She talked about the propensity to be irrevocably altered based on consumption of what [moderators] were supposedly moderating against.”