Editor’s Note: Kara Alaimo, an associate professor in the Lawrence Herbert School of Communication at Hofstra University, writes about women and social media. She was spokeswoman for international affairs in the Treasury Department during the Obama administration. Follow her on Twitter @karaalaimo. The opinions expressed in this commentary are solely those of the author. View more opinion at CNN.
On Sunday, Twitter confirmed it had permanently suspended the personal account of Republican Georgia Rep. Marjorie Taylor Greene for violating the company’s policy against sharing misinformation about Covid-19.
She’ll still have access to her Congressional account.
Twitter said Greene’s tweet claiming there had been an “extremely high amount of Covid vaccine deaths” earned her a fifth “strike,” which resulted in permanent loss of her account. She had previously tweeted claims that vaccines were “failing,” when experts say they have saved millions of lives, and that Covid-19 wasn’t dangerous; in fact, it’s potentially deadly.
The decision to permanently suspend the account was the right call for Twitter, and applying it more broadly would be an astonishingly simple and effective way of helping to curb the spread of deadly disinformation. Now, it’s time for other social networks to follow suit.
In the United States, the number of Covid cases just hit a record high. With everyone from medical professionals to firefighters, train operators and airline personnel getting sick en masse, the continued operation of essential services is now in jeopardy in some places, as of course, are people’s lives. It’s never been more important for people to receive accurate information about how to protect themselves and their families from this virus, which includes getting vaccines that are safe and effective at preventing serious disease and death.
But misinformation is a key reason why many Americans still refuse to get vaccinated. Last April, results from an ongoing Kaiser Family Foundation survey found 81% of people who definitely planned not to get vaccinated had been exposed to at least one myth about vaccines and either thought it was true or weren’t sure whether it was true. With nearly half of all Americans regularly getting their news from social media, according to the Pew Research Center, it’s essential for social networks to play a part in weeding out these kinds of untruths.
While addressing the problem of misinformation is often described as impossibly complex, it is simpler than it may appear. A key component is for social media companies to shut down the accounts of people who post misinformation repeatedly or who are unusually influential, like Greene.
One reason this is so important is that the majority of misinformation on social platforms comes from a shockingly small number of accounts. For example, according to internal Facebook documents reviewed by the Wall Street Journal, half of the posts made in Facebook Groups disabled because of pandemic-related misinformation were created by just 5% of the Groups’ users. That’s why shuttering the accounts of a relatively small number of people who most often repeat dangerous information – whom Facebook internally calls the “big whales,” according to the Wall Street Journal report – is one key to this problem.
“If we see harmful misinformation on the platform, then we take it down. It’s against our policy,” chief executive Mark Zuckerberg said in a CBS interview in August. “But do we catch everything? Of course, there are mistakes that we make or areas where we need to improve.”
Another key is going after people with large followings, since their posts stand to be particularly influential. Greene, for example, had more than 465,000 followers on her personal account.
But for a long time, social networks have done just the opposite. For example, according to Facebook’s own oversight board, the company has long operated a program called “Cross-Check,” which protects the accounts of famous people from automatic penalties that would apply to other users who post misinformation, hate, or other content that violates the company’s Community Standards.
That’s outrageous. The accounts of people in the public eye are disproportionately influential, so it is especially important they be held responsible for adhering to the rules. A spokesperson for Facebook, Andy Stone, told The Wall Street Journal the “Cross-Check” program “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.” The oversight board added in October it has accepted a request from the company to review the program and make recommendations on how it can be changed.
It’s time for this to stop. When accounts post vaccine misinformation, those accounts should be shut down. Full stop. No one should ever get a free pass for being famous or powerful.
And, by the way, this is not a free speech issue. Information that puts people’s lives at risk makes no healthy contribution to public debate. That’s why, as I’ve pointed out before, there is plenty of precedent for restricting it.
If social networks applied such a rule against misinformation uniformly, it would also send an important signal to other users that they, too, would be putting their accounts at risk by sharing inaccurate information. That would also likely radically stem the flow of false claims.
It isn’t difficult to figure out how social networks can stop the spread of many dangerous deceptions about the coronavirus. But it does require them to show some backbone and be willing to stand up to powerful people when they break the rules. Twitter’s decision on Greene was the right move. Facebook and Twitter have also banned another influential person, former President Donald Trump, but only after his actions on social media contributed to last year’s attack on the Capitol. Now other tech leaders must have the courage to make this call every time a powerful person uses their platform to endanger people.