Misinformation Watch

By Donie O'Sullivan, Kaya Yurieff, Kelly Bourdet, the CNN Business team and contributors from across CNN

Updated 11:21 a.m. ET, January 26, 2021

Analysis: What comes next?

From CNN Business' Kelly Bourdet

We created Misinformation Watch to give CNN readers a destination for misinformation-related U.S. election coverage. With the election process now complete, this is our last post on Misinformation Watch. It is far from the end of our coverage of misinformation, though.

In the few short months of this project’s operation, a conspiracy theory took root within misinformation communities that had long thrived online: the false claim that former U.S. President Donald Trump did not lose the election but had it stolen from him.

The theory moved through well-trodden channels of misinformation -- QAnon groups, conspiratorial hashtags, fringe YouTube personalities, the former President’s social media accounts -- growing more complex as it spread. Eventually, the theory was so varied and multi-pronged that it was all but impossible to disprove to those who subscribed to it. It spilled out into the physical world and, eventually, its adherents spilled blood.

We watched, in real time, both the birth and the ultimate destructive power of that false reality. It was the perfect storm of misinformation, and it was another warning.

Misinformation is not simply the result of conspiratorial internet posters or grifters sowing fear to make a buck. Though it thrives under tech giants’ uneven content moderation policies, it will not be stamped out solely by more robust self-policing by these platforms.

Misinformation’s impact has as much to do with the ways in which we are served online content as with the content itself. It is a consequence of how social media and internet companies are built and how they profit. A company that can collect an unfathomable amount of information about a person -- where they live, who they love, whether they are happy, whether they have a job, what secret questions they ask, whether they might be interested in buying a bulletproof vest -- is able to target content and advertisements at them precisely.

When that company’s ultimate goal is simply to keep that person scrolling, to keep them returning to the platform, then the alternative realities we have seen emerge are inevitable.

In the wake of the Capitol insurrection, panicked tech titans took broad action against purveyors of lies and conspiracy, ousting tens of thousands of accounts, including ones belonging to then-President Donald Trump. It was a moment of real change.

Misinformation and conspiracy communities that were pushed out of mainstream homes like Facebook and Twitter found safe haven on Parler, which was, in turn, pushed off its web hosting service.

“Stop the Steal” conspiracy theorists, QAnon believers, and other fringe communities scattered and splintered as they sought new homes online. The impact of this exodus on the internet and the rest of the US is yet to be known.

While this project has come to an end, we’ll continue to cover both the origins and impact of misinformation. It is a topic of vital importance and we’re committed to covering it. Thanks for reading, and please keep coming back for more of our coverage of the issue — we’ve got a lot more in mind.  

5:14 p.m. ET, January 21, 2021

Many believed conspiracy theories about Trump and the election. Now, they're losing faith

From CNN Business' Richa Naik

The moment of reckoning promised by the QAnon conspiracy theory never came. Now, many believers feel confused, duped, and uncertain of what comes next.

11:42 a.m. ET, January 21, 2021

QAnon believers are in disarray after Biden is inaugurated

From CNN Business' Brian Fung and Kaya Yurieff

For years, believers of the QAnon conspiracy theory had been waiting for the moment when a grand plan would be put into action and secret members of a supposed Satanic pedophilia ring at the highest ranks of government and Hollywood would suddenly be exposed, rounded up and possibly even publicly executed. They were nearly always sure it was right around the corner, but "The Storm" never came — and the moment of Joe Biden's inauguration was the last possible opportunity for President Donald Trump to put the plan in motion.

But as Biden raised his hand and swore an oath to defend the Constitution, becoming the nation's 46th president — nothing happened.

The anti-climax sent QAnon adherents into a frenzy of confusion and disbelief, almost instantly shattering a collective delusion that had been nurtured and amplified by many on the far right. Now, in addition to being scattered to various smaller websites after Facebook (FB) and Twitter (TWTR) cracked down on QAnon-related content, believers risked having their own topsy-turvy world turned upside down, or perhaps right-side up.

Members of a QAnon-focused Telegram channel, and some users of the image board 4chan, vowed to keep the faith. Others proclaimed they were renouncing their beliefs. Still others devised new theories that purported to push the ultimate showdown further into the future. One of the ideology's most visible icons, Ron Watkins — who goes by the online moniker CodeMonkeyZ — told supporters to "go back to our lives."


9:52 a.m. ET, January 20, 2021

Facebook says it has removed tens of thousands of QAnon accounts since last summer

From CNN Business' Donie O’Sullivan and Kaya Yurieff

Facebook said Tuesday that since August it has removed about 18,300 Facebook profiles and 27,300 accounts on Facebook-owned Instagram for violating its policies against QAnon. The company has also removed 10,500 groups and 510 events for the same reason.

In a blog post, updated on the eve of the inauguration, the company said it had taken action on tens of thousands of “militarized social movements” and self-described militias since last summer.

“As of January 12, 2021, we have identified over 890 militarized social movements to date," the company said.

Facebook has come under scrutiny for the role its platform played in the lead-up to the deadly insurrection at the Capitol earlier this month. Groups and individuals spreading lies about the 2020 election and calling to protest the outcome continued to hide in plain sight on Facebook even after the Capitol riots.

Facebook announced it would crack down on QAnon last summer. The baseless conspiracy theory has been circulating since 2017. People identifying as part of QAnon were part of the mob of Trump supporters who stormed the Capitol.

Last week, Twitter announced it had suspended more than 70,000 accounts for promoting QAnon.

Despite crackdowns, the conspiracy theory continues to spread on Twitter, Facebook and fringe social platforms, according to new research from nonpartisan nonprofit Advance Democracy.

Over the holiday weekend, more than 1,280 QAnon-related accounts posted on Twitter about 67,000 times, peddling conspiracy theories about the election and President-elect Joe Biden, according to the research.

For example, one QAnon account shared a 45-second video rife with false claims about election fraud. The video racked up about 360,000 views. After CNN Business flagged the video, Twitter took down the account for violating its rules against ban evasion.

4:31 p.m. ET, January 19, 2021

Facebook shows ads for tactical gear despite announcing a temporary ban

From CNN Business' Brian Fung

Less than 24 hours before Joe Biden is set to become president, Facebook continues to show ads for tactical gear despite vowing to ban those promotions ahead of the inauguration.

A review by CNN and other internet users this week showed that ads for body armor, holsters and other equipment were being displayed on the platform as late as Tuesday afternoon. 

Often, the advertised products are pictured alongside guns, ammunition, or people clad in camouflage fatigues. 

The ads have frequently appeared in the timelines of military veterans and contribute to a false narrative of an imminent violent conflict in the United States, according to Kristofer Goldsmith, founder and president of High Ground Veterans Advocacy. 

“They’re selling the idea of pending violence, or inevitable violence, and that’s the kind of thing that becomes a self-fulfilling prophecy,” said Goldsmith. 

In one example still on Facebook Tuesday afternoon, a pair of noise-reducing earbuds was being advertised as a form of active hearing protection, shown inserted in the ears of a gunman aiming down his rifle sights. 

Another ad, for body armor, promises consumers that the product can shield them from bullets, knives, stun guns and other threats. 

A third series of ads, for hard-knuckled gloves, showed a man wearing desert camouflage and a tactical rig performing various tests on the gloves, including punching concrete walls, breaking a glass bottle by hand and rubbing broken glass on the gloves’ palms.

“They put people in combat gear in a civilian setting,” Goldsmith said of the ads. "They’re promoting this image of, ‘You need to get ready for combat.’”

Asked for comment, Facebook referred CNN to its earlier blog post announcing that it will ban “ads that promote weapon accessories and protective equipment” in the United States through at least Jan. 22. 

"We already prohibit ads for weapons, ammunition and weapon enhancements like silencers," Facebook said in the blog post. "But we will now also prohibit ads for accessories such as gun safes, vests and gun holsters in the US."

After Facebook introduced the ban on Saturday, BuzzFeed News reported the following day that some ads for tactical gear were still active. Many of the ads observed by CNN had been active, in some cases, for months. Others had been launched within the past week.

Facebook appears to have removed some of the advertisements CNN found, including a series of ads for armored plates and plate carriers. The plates had, in some cases, been shown being held by heavily muscular individuals dressed in fatigues or being inserted into camouflage-patterned backpacks. Despite having seemingly removed some of the advertisers' ads, Facebook has allowed other ads for the same products, by the same advertisers, to persist on the platform.

Another now-removed series of body armor ads included marketing copy that claimed specific levels of protection under the rubric established by the National Institute of Justice. 

Veterans are a popular target for misinformation and conspiracy theorists, Goldsmith said, because as a group they enjoy political and social authority. An endorsement by a veteran can reinforce a conspiracy theory's apparent credibility.

“If you change the mind of a veteran, there’s a good chance you change the minds of those within that veteran’s immediate circle — friends, family, coworkers,” said Goldsmith. 

4:00 p.m. ET, January 15, 2021

'Stop the steal' groups hide in plain sight on Facebook

From CNN Business' Brian Fung and Donie O'Sullivan

Groups and individuals spreading lies about the 2020 election and calling to protest the outcome have continued to hide in plain sight on Facebook, even as COO Sheryl Sandberg this week tried to downplay the platform's role in the Capitol riots. 

From altering the names of their online forums to abusing the core features of Facebook's own services, conspiracy theorists have worked to evade content moderators despite the company's vows of a crackdown, new research shows. 

These groups' efforts to remain undetected highlight the sophisticated threat confronting Facebook, despite its insistence that the problem has been worse on other platforms. The groups' persistence on mainstream social networks also raises new concerns that they could spark a new cycle of violence stretching well into Joe Biden's presidency.

The latest examples surfaced on Thursday, as extremism experts at the activist group Avaaz identified 90 public and private Facebook groups that have continued to circulate baseless myths about the election, with 166,000 total members. 

Of those, a half-dozen groups appeared to have successfully evaded Facebook's restrictions on "stop the steal" content, according to Avaaz. Though many initially had "stop the steal" in their names, the groups have since altered their profiles, according to page histories reviewed by CNN Business — allowing them to blend in with other Facebook activity. 

"So instead of 'Stop the Steal,' they became 'Stop the Fraud' or 'Stop the Rigged Election' or 'Own the Vote,'" said Fadi Quran, campaign director at Avaaz.


1:56 p.m. ET, January 14, 2021

YouTube hires a doctor to help combat Covid-19 misinformation

From CNN Business' Kaya Yurieff

YouTube is working with top health organizations to create authoritative medical videos for the platform in an effort to crack down on Covid-19 misinformation.

The new health partnership team will be headed by Dr. Garth Graham, YouTube’s new director and global head of healthcare and public health partnerships. Graham was most recently the chief community health officer at CVS Health.

YouTube will work with organizations including the American Public Health Association, Cleveland Clinic and the Forum at the Harvard School of Public Health to make “high-quality health content” for its users, according to a blog post.

Like other tech platforms, YouTube has had to tackle the spread of misinformation about Covid-19.

In October, the Google-owned platform said it would take down videos that include misinformation about Covid-19 vaccines. It previously took action on other content containing falsehoods about the virus, such as videos disputing Covid-19 exists. At the time, the company said it had removed more than 200,000 videos containing dangerous or misleading information about Covid-19 since February 2020.

9:15 a.m. ET, January 14, 2021

Messaging app Zello bans thousands of armed extremist channels after Capitol riots

From CNN Business' Brian Fung

The messaging app Zello said it has removed more than 2,000 channels on its platform related to armed extremism, and banned all “militia-related channels,” after it found evidence that some of its users participated in the Capitol riots. 

Zello, a voice messaging app that provides a walkie-talkie-like function, condemned the violence in a blog post on Wednesday.

“It is with deep sadness and anger that we have discovered evidence of Zello being misused by some individuals while storming the United States Capitol building last week,” the company said. “Looking ahead, we are concerned that Zello could be misused by groups who have threatened to organize additional potentially violent protests and disrupt the U.S. Presidential Inauguration Festivities on January 20th.”

Zello added that “a large proportion” of the channels it removed on Wednesday had been dormant for months and in some cases years.

The company is further analyzing the groups on its platform to determine whether any may violate its terms of service. But it added that because it does not store message content, the task is not as simple as running searches for keywords or hashtags and blocking them.

7:42 p.m. ET, January 13, 2021

Telegram struggling to combat calls for violence amid surge in growth

From CNN's Brian Fung and Mallory Simon

The messaging app Telegram is battling an increase in violent extremism on its platform amid a surge in new users, the company acknowledged to CNN Wednesday. 

In the last 24 hours, the company has shut down "dozens" of public forums that it said in a statement had posted "calls to violence for thousands of subscribers."

But the effort has turned into a game of cat and mouse, as many of the forums' users set up copycats as soon as their old haunts were disabled. Screenshots and Telegram groups monitored by CNN show that a number of channels containing white supremacist, hateful and other extremist content have been shut down, but at least some have been replaced by new channels. And at least one meta-channel has emerged that maintains lists of deactivated groups and redirects visitors to their replacements. One now-defunct group that CNN reviewed had more than 10,000 members.

"Our moderators are reviewing an increased number of reports related to public posts with calls to violence, which are expressly forbidden by our Terms of Service," Telegram spokesperson Remi Vaughn told CNN. "In the past 24 hours we have blocked dozens of public channels that posted calls to violence for thousands of subscribers." 

Vaughn added: "Telegram uses a consistent approach to protests and political debate across the globe, from Iran and Belarus to Thailand and Hong Kong. We welcome peaceful discussion and peaceful protests, but routinely remove publicly available content that contains direct calls to violence." 

Telegram has surpassed half a billion active users worldwide. The company announced Tuesday that it had grown by 25 million users over the past several days -- with about 3 percent of that growth, or 750,000 new signups, occurring in the United States alone, Telegram told CNN.

Apps such as Telegram, Signal and MeWe have experienced explosive growth in recent days after WhatsApp sent a notification to its users reminding them that it shares user data with its parent, Facebook -- and following the suspension of President Donald Trump and the alternative social network Parler from many major tech platforms. 

One of the people who has been reporting violent channels to Telegram is Gwen Snyder, a Philadelphia-based activist who said she has been monitoring far-right extremists on the platform since 2019. Earlier this week, as Telegram was witnessing a surge in new users, Snyder enacted a plan to organize mass pressure against Telegram’s content moderators.

“We started two days ago calling for Apple and Google to deplatform Telegram if they refused to enforce their terms of service,” Snyder told CNN. “We had dozens if not hundreds of relatively large-follower Twitter accounts amplifying the campaign.”

It’s difficult to determine whether Telegram’s actions were a direct result of the activism; Snyder said she never heard from Telegram, Apple or Google.

But at least some of the Telegram channels affected by the crackdown appeared to believe that Snyder’s efforts were responsible — and soon began posting her personal information online and targeting her with death threats.

“That’s my home address,” Snyder said in a public tweet, attaching a redacted screenshot of an extremist Telegram channel that had shared her information. Addressing Telegram, she added: “You're okay with this? ENFORCE YOUR OWN TERMS OF SERVICE.”