Earlier this month, ISIS posted a video of its horrific burning of a captured Jordanian pilot. Unfortunately, this was not the first time ISIS has used Twitter, an American social media company, to broadcast its barbaric acts to the world. In August, when ISIS released gruesome footage of the beheading of American journalist James Foley, it also used social media. In fact, ISIS has been using Twitter for years.
Nor is ISIS the only terrorist group on Twitter. Hezbollah and al-Nusra Front, the al Qaeda branch in Syria, are on Twitter as well. On January 14, the al Qaeda branch in Yemen, known as al Qaeda in the Arabian Peninsula or simply AQAP, claimed responsibility via Twitter for the terrorist attacks in Paris that killed 17 people. The group has two official accounts on the platform.
There are many more examples from such groups, all of which the U.S. government has officially designated as foreign terrorist organizations. It's with this reality in mind that on January 27, my subcommittee held a hearing on terrorists' use of social media. At that hearing, experts detailed how terrorist use of social media platforms has long been a problem.
If social media is being used to help radicalize thousands of people and raise millions of dollars from many more, the obvious question is: Why is no one shutting these accounts down? American companies aren't, and neither is the American government.
I've heard two arguments for why we should keep the status quo.
The first -- and easiest to set aside -- is the claim that if the U.S. government were to shut down terrorists' social media accounts, it would be violating their free speech rights.
My own belief is that the Constitution does not apply to terrorists.
These thugs gave up their right to free speech the first time they killed innocent civilians. We should certainly not be helping them kill more. But this isn't just my thinking.
The Supreme Court has already held as much in Holder v. Humanitarian Law Project, ruling that the First Amendment does not protect speech that constitutes material support to a terrorist organization. Indeed, free speech protections do not extend to speech that harms others, such as the creation and distribution of child pornography.
The second argument is that terrorists' use of social media provides the intelligence community with information it would not otherwise be able to acquire. But while terrorists may slip up from time to time, they also know that social media is by its very nature public, and they are careful about what they say there.
Nor is terrorist use of social media a new phenomenon. We have had years to weigh the intelligence we can gather about terrorist groups against the advantages in messaging and recruitment that terrorists gain from these platforms. And from what I have heard, allowing this public, online jihad to continue has produced no significant intelligence breakthroughs. The fact that there are more terrorists using social media than ever before should tell us all we need to know about who is benefiting.
To put it bluntly, private American companies should not be operating as the propaganda megaphone of foreign terrorist organizations.
So what needs to change?
For a start, social media companies themselves need to do more. It is not good enough to pay attention only when bad press threatens a company's public image after something truly horrific is posted online. Companies have not only a public responsibility but a legal obligation to do more. Federal law makes it unlawful to knowingly provide a foreign terrorist organization designated under Section 219 of the Immigration and Nationality Act with "material support or resources," a term that includes "any property, tangible or intangible, or service."
That's about as comprehensive as you can get.
What's more, most social media companies already have terms of service that prohibit threats of violence, which would preclude terrorist use of their platforms. But companies need to do a better job of enforcing their own terms. The absence of child pornography and stolen copyrighted material on social media platforms -- content that is quickly removed if it appears at all -- demonstrates what these companies can do.
With this in mind, they would do well to consider dedicating teams to removing terrorist content, and to streamlining their reporting processes so users can easily flag terrorist activity. Companies have the technology and the resources to crack down on terrorists' use of their platforms; they just need the motivation to act.
This is where the federal government can assist.
In 2011, the White House promised a strategy to prevent online radicalization. But more than three years later -- and despite a summit last week aimed at tackling extremism -- we are still waiting on that strategy. Without one, the federal government's efforts to combat terrorist use of social media will be as haphazard and lackluster as the efforts of private social media companies. Instead, we need a strategy that clearly articulates our goals, the roles and responsibilities of each federal agency that needs to be involved, and how we are going to work with civil society.
It is mindboggling to think that those who behead and burn others alive are able to use our own companies against us to further their cause. But that is exactly what is occurring.
American newspapers would never have allowed the Nazis to place recruitment ads during World War II. Designated foreign terrorist organizations should not be allowed to use private American companies to reach billions of people with their violent propaganda in an instant, all for free.