San Francisco (CNN Business) —

By day, Paul Shales is a computer programmer who works in advertising operations for a bank. By night, he’s creating videos that show Elon Musk as a creepy looking, giggly baby; President Donald Trump as a temperamental pageant contestant on “Toddlers & Tiaras”; and Kim Kardashian freestyle rapping.

His videos are all fakes — deepfakes, actually, which use artificial intelligence to realistically show people doing things they didn’t actually do. Shales’ creations aren’t meant to fool or scare people; rather, they’re meant to be fun and funny, and occasionally politically satirical. Shales posts them to places like YouTube and Instagram where, under the monikers TheFakening and The_Fakening, respectively, he’s amassed a collective audience of about 100,000 people who check out his work — and often respond with the emoji equivalent of a face laughing so hard it’s crying.

While political leaders and government officials are increasingly concerned about the potential for deepfakes to be used to mislead voters in the 2020 election, Shales is among a growing number of computer hobbyists who see the AI-crafted videos as a new form of online humor, rather than a way to trick or threaten people.

“I recognized right away, ‘Hey this could build a social following. People will enjoy this,’” Shales, who’s based in Toronto, told CNN Business.

Elon Musk is depicted as a baby in one of Paul Shales' deepfake videos.

Learning to fake with AI

Deepfakes have only been around for a few years; the first known videos, posted to Reddit in 2017, featured celebrities’ faces swapped with those of porn stars. Shales got interested in making them himself early this year, and in February released his first deepfake: in it, he plastered the face of actor Nicolas Cage onto Elon Musk’s body to make it appear as if Cage, rather than Musk, was smoking marijuana during a podcast interview with comedian Joe Rogan.

Shales admits it isn’t a great video; the resulting face is more of a morph than a swap, he said. And the voice is still unmistakably Musk’s. But Shales kept going and quickly got better. One video looks much more natural: it features actor and comedian Pete Davidson’s face matched with the body of pop star Ariana Grande (Davidson’s ex-girlfriend), tearfully accepting an award and singing.

His most popular video places Elon Musk’s face on the infant in a popular baby montage video with impressively disturbing results.

He’s gotten good enough to snag some paid deepfake-related work. Shales is currently working on a deepfake for a music video, he said, and he helped make some for a segment about the technology for the TV show “Full Frontal with Samantha Bee.” (That show airs on TBS, which is owned by CNN parent company WarnerMedia.)

“I don’t know where it’s headed,” Shales said. “I’m just enjoying it, for the time being, getting some attention.”

Deep Homage, a fellow maker of fun deepfakes who wants to remain anonymous due to the controversy surrounding the medium, also sees it as a way to entertain people and, perhaps, gain some followers on social media. His videos tend to nod at Hollywood screen icons like Grace Kelly and Marilyn Monroe.

“I would never do a deepfake with the intention of harming someone, attacking someone, or slandering someone,” he told CNN Business.

It’s not easy making fakes

Yet while Shales and Deep Homage enjoy making these videos, they want to make it clear how much work goes into each one — so much work, in fact, that they aren’t too worried that the existing technology behind deepfakes could fool someone in a way that’s truly dangerous.

It’s not the kind of thing you can make in a day, Deep Homage pointed out.

“The idea that anyone with a computer can make these is off base,” he said.

To create a deepfake, such as this one of Kardashian freestyling, Shales needs a base video (in this case, an actress named Natalie Friedman doing an impression of Kardashian rapping), and thousands of photos of the face he wants in the final video (Kardashian herself). A computer then spends hours attempting to match Kardashian’s face with the key points on Friedman’s face, one video frame at a time.

It’s not that simple, though. Shales uses free deep-learning programs like FaceSwap and DeepFaceLab to replace faces in videos, but the process also requires a lot of human grunt work to get a good-looking final result. For instance, Shales had to collect a bunch of clips of Kardashian from her reality show, “Keeping Up with the Kardashians,” use computer software to sort out the images with faces in them, and then manually weed out the bad ones.
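That dataset-building step — pull frames from source clips, keep only the ones where a face is detected, then hand the survivors to a human for a final weeding pass — can be sketched roughly as follows. This is a simplified illustration, not the actual pipeline: `detect_face` here is a placeholder stub standing in for the trained face-detection models that tools like FaceSwap and DeepFaceLab bundle.

```python
# Rough sketch of the dataset-building step described above:
# iterate over extracted video frames, keep only frames where a
# face was detected, and record the face's bounding box so the
# bad detections can be weeded out manually afterward.

def detect_face(frame):
    # Placeholder detector. A real implementation would run a
    # trained face-detection model over the frame's pixels and
    # return a bounding box, or None if no face is found.
    return frame.get("face")

def build_face_dataset(frames):
    """Keep frames with a detected face, tagged for manual review."""
    kept = []
    for frame in frames:
        box = detect_face(frame)
        if box is not None:
            kept.append({"frame": frame["id"], "face_box": box})
    return kept

# Toy example: three frames, one with no detectable face.
frames = [
    {"id": 0, "face": (10, 20, 64, 64)},
    {"id": 1, "face": None},   # no face detected -> discarded
    {"id": 2, "face": (12, 18, 60, 60)},
]
dataset = build_face_dataset(frames)  # keeps frames 0 and 2
```

The manual weeding Shales describes happens after this automated pass: the detector inevitably keeps blurry, occluded, or misdetected faces that would degrade the swap model if left in the training set.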

After a deepfake computer program comes up with final images that Shales is satisfied with, he almost always has to do some smoothing and blurring by hand with visual effects software (AI, he notes, is getting better at this kind of finishing work).

These are not the fakes we’re looking to stop

These are not the kind of deepfakes that some experts watching the space for emerging threats are concerned about stopping. Sam Gregory, program director for nonprofit Witness, which works with human rights defenders, thinks the kinds of videos clearly meant for entertainment are really interesting. He believes it’s important to protect the ability for people to use AI to make such videos, which often are meant as satire or political commentary.

His favorite so far? Shales’ Elon-Musk-as-a-baby video, which he sees as playful and a way of puncturing the bombast of Silicon Valley.

“This is like the visible, fun side of the dark side,” he said.