Hiring the wrong person can be costly.
Interviews are often the only way employers have to get to know a candidate personally before they offer them a job. While background and reference checks are standard for most new hires, they don’t always catch behaviors that can clash with a company’s culture and values and create a toxic work environment.
California-based Fama Technologies is offering another layer of screening to help identify risky applicants before they get hired. It uses artificial intelligence to analyze a person’s public digital footprint to find problematic behaviors like sexual harassment, bigotry and bullying.
“If you make a hire and it turns out they were posting sexist, racist and other lunacy online…that is not only a liability for an employer, it also calls into question your ability to make a hiring decision,” said Alex Granovsky, an employment lawyer in New York City.
Fama, which launched in 2015, creates risk profiles for job applicants and even current employees by scouring publicly available online content, including social media posts, message boards, blogs, news articles and comment sections that can be tied to the person.
“We do not score, there’s no thumbs up or down,” said Fama CEO Ben Mones. “We aren’t saying anything about the person … We can say this piece of text is an example of bigotry.”
A report affects a hiring or employment decision about 12% to 15% of the time, said Mones.
Companies using Fama are legally obligated to share the report with job candidates and employees so they can contest or explain the results, according to Mones, who added that report accuracy, as reported by candidates, is 99.98%.
Companies can select which behaviors to focus on for the screening, but sexism, bigotry and violence are the most frequently searched categories.
But there are some categories of information that are off-limits.
Job applicants and employees have legal protections against hiring discrimination. It is illegal for an employer to discriminate against a job applicant based on race, color, religion, sex, national origin, age or disability. Fama says it ensures companies never interact with these protected classes of information.
“Every person controls their own social media,” said Granovsky. “You choose what you put up there and there are consequences for making statements that offend others. It can result in not being hired, job loss, or even a punch in the face.”
While most of the screens are for potential hires, Fama has recently seen a large increase in the number of companies running checks on current workers.
“It’s about level setting. It’s saying, maybe where you grew up or where you worked this stuff could be OK, but here at this company it is not,” said Mones.
The company has more than 120 clients and screens 20,000 people a month.
In 2018, Fama screened more than 20 million pieces of public content online and found 14.2% of people had red flags for misogyny and sexism and 10.3% had indicators for bigotry, racism and hate speech. Almost 12% had red flags for violence, drugs and crime.
Mones said Fama works with companies in a variety of industries, including major brands in financial services, entertainment and media and professional sports.
The software can also flag drug and alcohol use, but Mones says most companies aren’t looking at those subjects on social media as a factor in employability. So what about that party picture posted on Facebook? “Don’t worry about having the Solo cup in your hand.”
Mones came up with the idea for Fama after a new employee he brought on at a former company harassed a colleague within a few months on the job. After the incident, Mones realized that the hire had exhibited harassing behavior on social media. He said if he had seen it, he wouldn’t have hired the person.
“Toxic behavior drives down productivity and innovation,” said Mones. “Other employees feel disconnected and can feel less focused and less motivated to come to work. The sphere of influence that a toxic employee has will adversely affect others.”