The moment of reckoning promised by the QAnon conspiracy theory never came. Now, many believers feel confused, duped, and uncertain of what comes next.
For years, believers in the QAnon conspiracy theory had been waiting for the moment when a grand plan would be put into action: secret members of a supposed Satanic pedophilia ring at the highest ranks of government and Hollywood would suddenly be exposed, rounded up and possibly even publicly executed. They were nearly always sure it was right around the corner, but "The Storm" never came, and the moment of Joe Biden's inauguration was the last possible opportunity for President Donald Trump to put the plan in motion.
But as Biden raised his hand and swore an oath to defend the Constitution, becoming the nation's 46th president — nothing happened.
The anti-climax sent QAnon adherents into a frenzy of confusion and disbelief, almost instantly shattering a collective delusion that had been nurtured and amplified by many on the far right. Now, in addition to being scattered to various smaller websites after Facebook (FB) and Twitter (TWTR) cracked down on QAnon-related content, believers risked having their own topsy-turvy world turned upside down, or perhaps right-side up.
Members of a QAnon-focused Telegram channel, and some users of the image board 4chan, vowed to keep the faith. Others proclaimed they were renouncing their beliefs. Still others devised new theories that purported to push the ultimate showdown further into the future. One of the ideology's most visible icons, Ron Watkins — who goes by the online moniker CodeMonkeyZ — told supporters to "go back to our lives."
Facebook said Tuesday that since August it has removed about 18,300 Facebook profiles and 27,300 accounts on Facebook-owned Instagram for violating its policies against QAnon. The company has also removed 10,500 groups and 510 events for the same reason.
In a blog post, updated on the eve of the inauguration, the company said it had taken action on tens of thousands of “militarized social movements” and self-described militias since last summer.
“As of January 12, 2021, we have identified over 890 militarized social movements to date,” the company said.
Facebook has come under scrutiny for the role its platform played in the lead-up to the deadly insurrection at the Capitol earlier this month. Groups and individuals spreading lies about the 2020 election and calling to protest the outcome continued to hide in plain sight on Facebook even after the Capitol riots.
Facebook announced it would crack down on QAnon last summer. The baseless conspiracy theory has been circulating since 2017. People identifying as part of QAnon were part of the mob of Trump supporters who stormed the Capitol.
Last week, Twitter announced it had suspended more than 70,000 accounts for promoting QAnon.
Despite crackdowns, the conspiracy theory continues to spread on Twitter, Facebook and fringe social platforms, according to new research from nonpartisan nonprofit Advance Democracy.
Over the holiday weekend, more than 1,280 accounts related to QAnon posted on Twitter about 67,000 times, peddling conspiracy theories about the election and President-elect Joe Biden, according to the research.
For example, one QAnon account shared a 45-second video rife with false claims about election fraud. The video racked up about 360,000 views. After CNN Business flagged the video, Twitter took down the account for violating its rules against ban evasion.
Less than 24 hours before Joe Biden is set to become president, Facebook continues to show ads for tactical gear despite vowing to ban those promotions ahead of the inauguration.
A review by CNN this week, along with observations from other internet users, showed that ads for body armor, holsters and other equipment were still being displayed on the platform as late as Tuesday afternoon.
Often, the advertised products are pictured alongside guns, ammunition, or people clad in camouflage fatigues.
The ads have frequently appeared in the timelines of military veterans and contribute to a false narrative of an imminent violent conflict in the United States, according to Kristofer Goldsmith, founder and president of High Ground Veterans Advocacy.
“They’re selling the idea of pending violence, or inevitable violence, and that’s the kind of thing that becomes a self-fulfilling prophecy,” said Goldsmith.
In one example still on Facebook Tuesday afternoon, a pair of noise-reducing earbuds was being advertised as a form of active hearing protection, shown inserted in the ears of a gunman aiming down his rifle sights.
Another ad, for body armor, promises consumers that the product can shield them from bullets, knives, stun guns and other threats.
A third series of ads, for hard-knuckled gloves, showed a man wearing desert camouflage and a tactical rig as he performed various tests on the gloves, including punching concrete walls, breaking a glass bottle by hand and rubbing broken glass on the gloves’ palms.
“They put people in combat gear in a civilian setting,” Goldsmith said of the ads. “They’re promoting this image of, ‘You need to get ready for combat.’”
Asked for comment, Facebook referred CNN to its earlier blog post announcing that it will ban “ads that promote weapon accessories and protective equipment” in the United States through at least Jan. 22.
"We already prohibit ads for weapons, ammunition and weapon enhancements like silencers," Facebook said in the blog post. "But we will now also prohibit ads for accessories such as gun safes, vests and gun holsters in the US."
After Facebook introduced the ban on Saturday, BuzzFeed News reported the following day that some ads for tactical gear were still active. Many of the ads observed by CNN had been running for months; others had been launched within the past week.
Facebook appears to have removed some of the advertisements CNN found, including a series of ads for armored plates and plate carriers. Some of those ads showed the plates held by heavily muscled individuals dressed in fatigues, or being inserted into camouflage-patterned backpacks. Even so, Facebook has allowed other ads for the same products, from the same advertisers, to remain on the platform.
Another now-removed series of body armor ads included marketing copy that claimed specific levels of protection under the rubric established by the National Institute of Justice.
Veterans are a popular target for misinformation and conspiracy theorists, Goldsmith said, because as a group they enjoy political and social authority. An endorsement by a veteran can reinforce a conspiracy theory's apparent credibility.
“If you change the mind of a veteran, there’s a good chance you change the minds of those within that veteran’s immediate circle — friends, family, coworkers,” said Goldsmith.
Groups and individuals spreading lies about the 2020 election and calling to protest the outcome have continued to hide in plain sight on Facebook, even as COO Sheryl Sandberg this week tried to downplay the platform's role in the Capitol riots.
From altering the names of their online forums to abusing the core features of Facebook's own services, conspiracy theorists have worked to evade content moderators despite the company's vows of a crackdown, new research shows.
These groups' efforts to remain undetected highlight the sophisticated threat confronting Facebook, despite its insistence that the problem is less severe on its platform than on others. Their persistence on mainstream social networks also raises new concerns that they could spark a cycle of violence stretching well into Joe Biden's presidency.
The latest examples surfaced on Thursday, as extremism experts at the activist group Avaaz identified 90 public and private Facebook groups that have continued to circulate baseless myths about the election, with 166,000 total members.
Of those, a half-dozen groups appeared to have successfully evaded Facebook's restrictions on "stop the steal" content, according to Avaaz. Though many initially had "stop the steal" in their names, the groups have since altered their profiles, according to page histories reviewed by CNN Business — allowing them to blend in with other Facebook activity.
"So instead of 'Stop the Steal,' they became 'Stop the Fraud' or 'Stop the Rigged Election' or 'Own the Vote,'" said Fadi Quran, campaign director at Avaaz.
YouTube is working with top health organizations to create authoritative medical videos for the platform in an effort to crack down on Covid-19 misinformation.
The new health partnership team will be headed by Dr. Garth Graham, YouTube’s new director and global head of healthcare and public health partnerships. Graham was most recently the chief community health officer at CVS Health.
YouTube will work with organizations including the American Public Health Association, Cleveland Clinic and the Forum at the Harvard School of Public Health to make “high-quality health content” for its users, according to a blog post.
Like other tech platforms, YouTube has had to tackle the spread of misinformation about Covid-19.
In October, the Google-owned platform said it would take down videos that include misinformation about Covid-19 vaccines. It previously took action on other content containing falsehoods about the virus, such as videos disputing Covid-19 exists. At the time, the company said it had removed more than 200,000 videos containing dangerous or misleading information about Covid-19 since February 2020.
The messaging app Zello said it has removed more than 2,000 channels on its platform related to armed extremism, and banned all “militia-related channels,” after it found evidence that some of its users participated in the Capitol riots.
Zello, a voice messaging app that provides a walkie-talkie-like function, condemned the violence in a blog post on Wednesday.
“It is with deep sadness and anger that we have discovered evidence of Zello being misused by some individuals while storming the United States Capitol building last week,” the company said. “Looking ahead, we are concerned that Zello could be misused by groups who have threatened to organize additional potentially violent protests and disrupt the U.S. Presidential Inauguration Festivities on January 20th.”
Zello added that “a large proportion” of the channels it removed on Wednesday had been dormant for months and in some cases years.
The company is further analyzing the groups on its platform to determine whether any may violate its terms of service. But it added that because it does not store message content, the task is not as simple as running searches for keywords or hashtags and blocking them.
The messaging app Telegram is battling an increase in violent extremism on its platform amid a surge in new users, the company acknowledged to CNN Wednesday.
In the last 24 hours, the company has shut down "dozens" of public forums that it said in a statement had posted "calls to violence for thousands of subscribers."
But the effort has turned into a game of cat and mouse, as many of the forums' users set up copycats as soon as their old haunts were disabled. Screenshots and Telegram groups monitored by CNN show that a number of channels devoted to white supremacy, hate and other extremism have been shut down, but at least some have been replaced by new channels. At least one meta-channel has emerged that maintains lists of deactivated groups and redirects visitors to the replacements. One now-defunct group that CNN reviewed had more than 10,000 members.
"Our moderators are reviewing an increased number of reports related to public posts with calls to violence, which are expressly forbidden by our Terms of Service," Telegram spokesperson Remi Vaughn told CNN. "In the past 24 hours we have blocked dozens of public channels that posted calls to violence for thousands of subscribers."
Vaughn added: "Telegram uses a consistent approach to protests and political debate across the globe, from Iran and Belarus to Thailand and Hong Kong. We welcome peaceful discussion and peaceful protests, but routinely remove publicly available content that contains direct calls to violence."
Telegram has surpassed half a billion active users worldwide. The company announced Tuesday that it had grown by 25 million users over the past several days -- with about 3 percent of that growth, or 750,000 new signups, occurring in the United States alone, Telegram told CNN.
Apps such as Telegram, Signal and MeWe have experienced explosive growth in recent days after WhatsApp sent a notification to its users reminding them that it shares user data with its parent, Facebook -- and following the suspension of President Donald Trump and the alternative social network Parler from many major tech platforms.
One of the people who has been reporting violent channels to Telegram is Gwen Snyder, a Philadelphia-based activist who said she has been monitoring far-right extremists on the platform since 2019. Earlier this week, as Telegram was witnessing a surge in new users, Snyder enacted a plan to organize mass pressure against Telegram’s content moderators.
“We started two days ago calling for Apple and Google to deplatform Telegram if they refused to enforce their terms of service,” Snyder told CNN. “We had dozens if not hundreds of relatively large-follower Twitter accounts amplifying the campaign.”
It’s difficult to determine whether Telegram’s actions were a direct result of the activism; Snyder said she never heard back from Telegram, Apple or Google.
But at least some of the Telegram channels affected by the crackdown appeared to believe that Snyder’s efforts were responsible — and soon began posting her personal information online and targeting her with death threats.
“That’s my home address,” Snyder said in a public tweet, attaching a redacted screenshot of an extremist Telegram channel that had shared her information. Addressing Telegram, she added: “You’re okay with this? ENFORCE YOUR OWN TERMS OF SERVICE.”
Facebook has seen online signals, on its platform and elsewhere, indicating the potential for more violence following last week’s insurrection, a company spokesperson told CNN Wednesday.
The company is working with organizations that track terrorists and dangerous groups to monitor conversation on other platforms, like 8Kun (formerly 8chan) and 4chan, in an effort to prevent talk of violence from those platforms becoming popular on Facebook, the spokesperson said.
One example of this work, according to the spokesperson, is collecting and indexing promotional fliers circulating on other sites for demonstrations planned this weekend and on Inauguration Day. Indexing that material makes it easier for Facebook to identify and remove it from its platforms, or to block it from being posted in the first place.
The spokesperson said Facebook is monitoring and removing praise of or support for last week’s storming of the US Capitol from its platform.
Facebook has passed on information to the FBI and is cooperating with the agency’s efforts to identify members of last week’s insurrection, the spokesperson said.