(CNN Business) —  

Among the many tragedies of the massacre at two New Zealand mosques on Friday is a bitter irony: The terrorist who killed at least 50 people in an Islamophobic attack resembled in many ways a member of ISIS. If his life had gone differently in some way, he might well have ended up as one and killed people somewhere else in its name. The type of extremism and hatred is of course different. But they have at least one thing in common: the internet as a tool of radicalization.

There is still much we don’t know about the suspect and his background. But before anything at all was known about him, anyone who has studied or covered extremism and these kinds of attacks could have given you an educated guess about what kind of person he was: Male. Probably in his 20s. Decent chance of at least a minor criminal record. More than likely a history of hatred toward or violence against women. Oh, and one more thing — probably spent a fair amount of time on the internet.

People could easily become radicalized before social media. Many are still radicalized without it. But social media, often in combination with other factors, has proven itself an efficient radicalizer, in part because it allows for the easy formation of communities and in part because of its algorithms, used to convince people to stay just a little longer, watch one more video, click one more thing, generate a little more advertising revenue.

The recommendations that YouTube provides, for instance, have been shown to push users toward extreme content. Someone who comes to the site to watch a video about something in the news could quickly find themselves seeing a conspiracy theory clip instead, for instance. (In January, YouTube said it was taking steps to remedy this.) A few years ago, someone looking for information about Islam could soon find themselves listening to a radical preacher.

Combine those algorithms with men who are disaffected, who may feel that the world owes them more, and you have a recipe for creating extremism of any stripe.

“They’re picking up an ideology that helps them justify their rage, their disappointment, and it’s something available,” Jessica Stern, a research professor at Boston University’s Pardee School of Global Studies and the co-author of “ISIS: The State of Terror,” told CNN Business Friday. “Terrorism runs in fads. We noticed that people were picking up the ISIS ideology who weren’t even Muslim, they were converting to Islam. The ISIS ideology was an attractive way for some of these men to express their rage and disappointment. This is another ideology that is becoming very popular, it’s another fad.”

For all the much-deserved criticism they've gotten recently over the things they've failed to act upon, the social networks did step up and take real and impressive action when faced with a deluge of ISIS supporters and content.

“The issue on mainstream sites is for the most part there’s been an aggressive takedown” of ISIS-related content, Seamus Hughes, the deputy director of the Program on Extremism at George Washington University, said. “That same dynamic hasn’t happened when it comes to white supremacy.”

The companies could take action against white supremacists now. Indeed, they could go on forever like that, playing whac-a-mole with different movements that pop up and begin radicalizing their users, moving against them after enough people have been killed. It would be easier for them to do that than to actually deal with the underlying problem of those algorithms designed to keep people around.

“It makes sense from a marketing perspective; if you like Pepsi then you’re going to watch more Pepsi videos… but you take that to the logical extreme with white supremacy videos,” Hughes said. “They’re going to have to figure out how to not completely scrap a system that has brought them hundreds of millions of dollars of ad revenue while not also furthering someone’s radicalization or recruitment.”

Perhaps the most disheartening aspect of this is that the companies have been told, over and over again, that they have a problem. Ben Collins, a reporter with NBC News, tweeted Friday, “Extremism researchers and journalists (including me) warned the company in emails, on the phone, and to employees’ faces after the last terror attack that the next one would show signs of YouTube radicalization again, but the outcome would be worse. I was literally scoffed at.”

So what should the platforms do now?

Asked that question, Bill Braniff, the director of the National Consortium for the Study of Terrorism and Responses to Terrorism (START) and a professor of the practice at the University of Maryland, said, "What I believe we should be asking them to do is to continue to minimize the salience or the reach of violent extremist propaganda, that calls for violence… but not to limit themselves to just content takedowns as the way to do that. What happens when a large platform takes down this content or these views is that the content just shifts to smaller platforms. … Maybe fewer people will be exposed over time, and that's a good thing, but that's not the same as a comprehensive solution."

Content takedowns alone can both contribute to a persecution narrative and drive people to smaller, more radical sites, Braniff noted. And he thinks that means giving up an opportunity to use the algorithms to redirect, rather than reinforce.

“We know that people… can actually be addressed through counseling [and] mentorship,” he said. “If instead of directing people who might be flirting with extremism to support, if you censor them and remove them from these platforms you lose… the ability to provide them with an off-ramp.”

While noting that platforms should still take down content that explicitly calls for violence, which also violates their terms of service, Braniff said, “There’s some content that doesn’t violate the terms of use, and so the question is, can you make sure that information is contextualized with videos before and after it on the feed?”

The comprehensive solution he sees is a change to the algorithms, so that they could point people to differing views or even in some cases to support such as counseling.

“Algorithms can either foster groupthink and reinforcement or they can drive discussion,” he said. “Right now the tailored content tends to be, ‘I think you’re going to like more of the same,’ and unfortunately that’s an ideal scenario for not just violent extremism but polarization … We’re only sharing subsets of information and removing the middle ground, the place where we come together to discuss different ideas… [a] massive part of violent extremism is polarization, and it’s really dangerous.”