You probably haven’t seen PimEyes, a mysterious facial-recognition search engine, but it may have spotted you.
If you upload a picture of your face to PimEyes’ website, it will immediately show you any pictures of yourself that the company has found around the internet. You might recognize all of them, or be surprised (or, perhaps, even horrified) by some; these images may include anything from wedding or vacation snapshots to pornographic images.
PimEyes is open to anyone with internet access. It’s a stark contrast to Clearview AI, which became well known for building an enormous stash of faces by scraping images of people from social networks, and which limits its use to law enforcement (Clearview has said it has hundreds of such customers).
PimEyes’ decision to make facial-recognition software available to the general public crosses a line that technology companies have typically been unwilling to cross, and opens up endless possibilities for how it can be used and abused.
Imagine a potential employer digging into your past, an abusive ex tracking you, or a random stranger snapping a photo of you in public and then finding you online. This is all possible through PimEyes: Though the website instructs users to search for themselves, it doesn’t stop them from uploading photos of anyone. At the same time, it doesn’t explicitly identify anyone by name, but as CNN Business discovered by using the site, that information may be just clicks away from images PimEyes pulls up.
“Using the latest technologies, artificial intelligence and machine learning, we help you find your pictures on the Internet and defend yourself from scammers, identity thieves, or people who use your image illegally,” the website declares.
It’s precisely this ease of access that concerns Clare Garvie, a senior associate at Georgetown Law’s Center on Privacy and Technology, who has extensively researched police use of facial-recognition technology.
“Face recognition at its foundation is a tool of identification,” Garvie told CNN Business. “Think of any reason a person would want to conduct an identification — positive and negative — and that’s what this tool makes possible.”
“A creepy stalking tool”
PimEyes lets users see a limited number of small, somewhat pixelated search results at no cost. A monthly subscription, which starts at $29.99, unlocks more extensive search results and features, such as clicking through to full-size images on the websites where PimEyes found them and setting up alerts for when PimEyes finds new pictures of faces online that its software believes match an uploaded face.
The company offers a paid plan for businesses, too: $299.99 per month lets companies conduct unlimited searches and set up 500 alerts.
The images come from a range of websites, including company, media and pornography sites — the last of which PimEyes told CNN Business that it includes so people can search online for any revenge porn in which they may unknowingly appear.
But while Clearview AI built its massive stockpile of faces in part by scraping images from major social networks (it was subsequently served with cease-and-desist notices by Facebook, Google, and Twitter, sued by several civil rights groups, and declared illegal in Canada), PimEyes said it does not scrape images from social media. (A Clearview AI spokesperson would not confirm whether the company currently grabs photos from social sites such as Facebook and Twitter, saying only that the company “collects only public data from the open internet.” The company’s CEO has said in the past that it has a First Amendment right in the United States to collect publicly available information.)
Although PimEyes instructs visitors to only search for their own face, there’s no mechanism on the site to ensure it’s used this way. Several Twitter users claim to have used it in an effort to identify US Capitol rioters, for example — efforts that PimEyes told CNN Business it is aware of but says are unavoidable, despite violating the site’s terms and conditions, since PimEyes can’t verify who is performing a search for a given face. The site, PimEyes noted, doesn’t identify by name either those who search for faces or those whose faces show up in search results.
There’s also no way to ensure this facial-recognition technology isn’t used to misidentify people. There are a handful of US state laws restricting the use of facial-recognition systems and city-wide bans on it, yet these rules tend to target how government and businesses might use such software, not individuals.
PimEyes’ ease of access and the lack of enforcement of its own search rules make it a tool primed for online stalking and surveillance, said Lucie Audibert, legal officer with London-based human rights group Privacy International.
“In the hands of random citizens, like you or me, it becomes a creepy stalking tool where you can identify anyone on the streets or in any public space,” Audibert said.
To get a sense for what PimEyes can do and how well it works, CNN Business paid for the $29.99-per-month individual subscription, which gave me the ability to conduct 25 “premium” searches per day, see all the search results PimEyes dredged up from around the internet, and set up alerts for any new images that PimEyes comes across.
I conducted multiple searches for my face online, using new and old photos featuring different hairstyles. In some I wore glasses; in others I did not. Sometimes, before PimEyes would conduct a search, a pop-up forced me to check two boxes saying I accepted the site’s terms of service and that I agreed to use a photo of my face to conduct the search.
The results that were actually pictures of me (and not, say, pornographic images of similar-looking women, of which there were plenty) were mostly familiar. These included work-related headshots, still images from videos I recorded while testing gadgets years ago, and a picture of me smiling with my high school journalism teacher.
There was one surprise: a photo of me dancing at a friend’s wedding in 2013. I hadn’t realized the picture was taken at the time, but that’s not what was startling. Rather, it was the fact that I’m hardly in the picture at all. On the right side of the frame, you can see part of my face, in profile.
My eyes appear closed and I’m wearing black glasses. It’s a blurry image, but it’s definitely me.
With PimEyes, I could trace a selfie to my identity with just a few clicks. As a journalist with headshots and biographies at multiple publications’ websites, it’s pretty easy to connect my face to my name online. So I tried again with the image of a friend (after first getting his consent) who works in another field and has a smaller online presence; one of the first results was from his website, which has his name in the URL.
With their permission, I also ran several co-workers’ selfies through PimEyes to see what popped up. It revealed photos documenting bits and pieces of my colleagues’ pasts: my boss’s wedding, the adoption of another manager’s dog, the time a fellow reporter’s funny facial expression was turned into a meme when he was in college (he knew this, fortunately). In multiple cases, it only took a click or two to connect faces to names.
Shrouded in secrecy
I wanted to learn more about how PimEyes works, and why it’s open to anyone, as well as who’s behind it. This was much trickier than uploading my own face to the website. The website currently lists no information about who owns or runs the search engine, or how to reach them, and users must submit a form to get answers to questions or help with accounts.
Poring over archived images of the website via the Internet Archive’s Wayback Machine, as well as other online sources, yielded some details about the company’s past and how it has changed over time.
The Pimeyes.com website was initially registered in March 2017, according to a domain name registration lookup conducted through ICANN (Internet Corporation for Assigned Names and Numbers). An “about” page on the Pimeyes website, as well as some news stories, shows it began as a Polish startup.
PimEyes positions itself as a tool for finding pictures of yourself online, yet this was not always its focus. An image of the website from October 2018, for instance, indicates it instructed users to upload a photo of whomever they wanted to look for. It showed pictures of celebrities such as Angelina Jolie, Rihanna, and Donald Trump as examples.
In June 2020, some news articles noted how PimEyes may be used by stalkers. In one piece, PimEyes told the BBC that the website’s aim was to help individuals “fight for their own online privacy,” including finding fake profiles, leaked images, and unauthorized photo usage. At the time, it also told the BBC that it worked with police forces via a software investigation tool called Paliscope (and an archived version of the PimEyes website’s “Frequently Asked Questions” page indicated that PimEyes marketed to law enforcement as recently as that month; though that reference was gone a few days later, a company blog post suggests PimEyes’ technology can be used to “look for criminals or missing persons.”)
In early July, the website suddenly emphasized personal privacy. “Upload your photo and find where your face image appears online. Start protecting your privacy,” PimEyes’ site said at the time.
The shift makes sense to Garvie, who pointed out that, initially, Clearview AI was more widely available than it is now (she knows someone, she said, outside of law enforcement, who had the app on his phone).
She thinks PimEyes more strongly resembles Russian facial-recognition software FindFace than Clearview; FindFace, which was available to consumers in Russia, gained prominence in 2016 for its ability to match up faces in user-submitted images to pictures on Russian social network Vkontakte. (The software, which was also used to identify and badger Russian sex workers, is currently available just to business and government customers.)
The person or people behind the PimEyes Team email address refused to conduct a formal interview, saying they “don’t take part in live interviews or direct interviews,” but that they would answer questions sent via email. Over multiple messages they answered a number of questions, but ignored or sidestepped others, such as why the company had switched its focus from suggesting users search for anyone to searching just for yourself.
They would not say how much they paid to purchase PimEyes from its prior owners, nor why they bought it, though they did write the company is currently based in the Seychelles due to the country’s “good incorporation environment.”
When asked where employees are actually based, they answered that PimEyes has an “international team, but we don’t want to disclose details.”
Our emails back and forth did, however, reveal a potential clue about their location, thanks to timestamps. The first note I sent them was timestamped at 11:58 am, PDT, on Thursday, April 8; their response, which I got the next day at 2:31 am my time, included my note, but this time the timestamp above my words read 20:58, or 8:58 pm. When it’s 11:58 am in California, it’s 8:58 pm in a number of places, including Poland. This same nine-hour time difference was evident across numerous emails.
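The inference above is simple time-zone arithmetic, sketched below in Python. The year and the specific zones (PDT for California, CEST for Poland in April) are assumptions made for illustration; the email client involved is unknown.

```python
from datetime import datetime, timedelta, timezone

# Assumed offsets: Pacific Daylight Time is UTC-7; Central European
# Summer Time (Poland's clock in April) is UTC+2. Year is assumed.
pdt = timezone(timedelta(hours=-7))
cest = timezone(timedelta(hours=2))

sent = datetime(2021, 4, 8, 11, 58, tzinfo=pdt)      # 11:58 am PDT
as_seen_by_recipient = sent.astimezone(cest)          # same instant, Polish clock

print(as_seen_by_recipient.strftime("%H:%M"))         # prints 20:58
```

An email client that stamps quoted text in the recipient’s local zone would render 11:58 am PDT as 20:58, the nine-hour gap described above.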
They confirmed that the facial-recognition search engine works similarly to other such systems, by comparing measurements between different facial features in one image (the one you upload) to those in others (in this case, ones it has found online). In order to match up the faces that users submit, PimEyes must scour the internet for images of people. PimEyes doesn’t save images from around the internet, they explained, but it does keep an index of facial-feature measurements from photos it has spotted on the web.
This kind of AI-driven image-matching is different from what happens when you upload a picture of yourself to a site such as Google Images and conduct a search: There, the results will include pictures of similar people (for me, that means lots of dark-haired women in glasses), but Google isn’t using facial measurements in the hopes of finding you, specifically, in other pictures online.
The person behind the PimEyes Team email would not provide a current figure for how many faces it has indexed. But according to archived images of PimEyes.com, as of August 2018, PimEyes said it had analyzed “over 30 million websites,” and in November 2019, the company claimed to have analyzed 900 million faces (Clearview AI, by comparison, claimed to have scraped over 3 billion photos from the internet as of February 2020).
When PimEyes’ search engine finds a match between the photo a user uploads and one PimEyes has previously seen online, it can pair the measurements of the previously analyzed photo with the web address where that photo is located. The website shows you an array of all the pictures it thinks look most like your own photo.
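PimEyes has not published its algorithm, so the mechanics described above can only be illustrated in general terms. The sketch below shows the basic idea of measurement-based matching: an index pairs each previously analyzed face’s measurements with the web address where the photo was found, and a query face is compared against every entry. The measurement vectors, URLs, and similarity threshold are all invented for the example; real systems use learned embeddings with hundreds of dimensions.

```python
import math

# Hypothetical index: each entry pairs a vector of facial-feature
# measurements with the URL where the photo was found. Note that only
# measurements are stored, not the images themselves.
FACE_INDEX = [
    ([0.42, 0.31, 0.77, 0.15], "https://example.com/wedding-photo.jpg"),
    ([0.40, 0.33, 0.75, 0.14], "https://example.com/old-headshot.jpg"),
    ([0.90, 0.12, 0.20, 0.66], "https://example.com/unrelated.jpg"),
]

def cosine_similarity(a, b):
    """Similarity between two measurement vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def search(query, index, threshold=0.99):
    """Return (similarity, url) pairs above the threshold, best match first."""
    hits = [(cosine_similarity(query, vec), url) for vec, url in index]
    return sorted([h for h in hits if h[0] >= threshold], reverse=True)

# Measurements extracted from the uploaded photo (invented values).
uploaded_face = [0.41, 0.32, 0.76, 0.15]
for score, url in search(uploaded_face, FACE_INDEX):
    print(f"{score:.3f}  {url}")
```

Here the two faces with nearly identical measurements surface as matches while the dissimilar one is filtered out, which is roughly the behavior the article describes: an array of pictures the engine thinks look most like the uploaded photo, each linked to its source page.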
The search accuracy, the company claimed, is about 90%; in general, the accuracy of facial-recognition technology depends on many factors, such as the quality of face images that are fed into a system.
The person behind the PimEyes Team email claimed the company doesn’t use photos that are uploaded by users to improve its software. PimEyes claims to delete images that are uploaded to the site after two days.
They would not name any paying business customers, only saying that “there are no law enforcement agencies among them.”
“It is naive to think that if our search engine didn’t exist, harassers wouldn’t break the law,” they wrote. “On the other hand — we are available to everyone, so any victim of harassment or other internet crime can check themselves using our search engine.”
Connecting names and faces
This accessibility is precisely what concerns Audibert, of Privacy International, and Garvie, of Georgetown. One of Audibert’s biggest concerns about PimEyes, she said, perhaps even more so than with Clearview, is whose hands it could fall into. People could use it to identify others in public places, she points out, while private companies could use it to track people.
It could also result in plenty of users misidentifying the faces that the search engine thinks closely resemble the person they’re trying to find, the consequences of which could be enormous. Police already use facial-recognition systems to track down potential suspects, even though the technology has been shown to be less accurate when identifying people of color. Several Black men, at least, have been wrongfully arrested due to this use of facial recognition.
Garvie, who used PimEyes on an image of her own face, noticed that most of the results that were not her were of similar-looking White women in their 30s. This type of misidentification is common across facial-recognition algorithms, she said, and also makes it more likely that a person who sees those results will then make a misidentification.
PimEyes’ technology could hurt people in other ways, too, such as by outing people who are transgender — intentionally or not. When Rachel Thorn, a professor at Kyoto Seika University, uploaded a recent photo of herself to PimEyes, she encountered other recent images of herself. There were also older images, she said, where she presented as masculine. She looks very different today, she said, but guessed that PimEyes may have picked up on similarities between facial features in a recent photo and old photos.
“As a transgender person it was not a great feeling to see old photos of myself show up. I’m pretty sure almost any transgender person would feel the same way,” she said.
Thorn, who studies Japanese graphic novels, known as manga, was impressed by the technology but also worried about how it could be abused. And since the site didn’t stop her from uploading anyone else’s image, she did: She looked up an acquaintance who had worked in pornography by uploading a selfie that person sent her. Sure enough, pornographic images of her friend popped up.
“I thought, ‘Oh my gosh’,” she said. “If you wanted to find out if someone had ever done work in porn, this would do it.”