Deepfakes Were Created As a Way to Own Women's Bodies—We Can't Forget That
Illustration by Jacqueline Lin

When Redditors started using AI to attach celebrities' faces to porn performers' bodies, the media reaction focused on the implications for potential political hoaxes, but we need to focus on the women they harmed.

This essay originally appeared in the Privacy & Perception Issue of Vice Magazine, created in collaboration with Broadly. You can read more stories from the issue here.

When we first found Reddit user “Deepfakes”—a play on deep learning, the branch of artificial intelligence behind the technique, and fake photos—he was in a celebrity doppelgänger porn forum quietly going about his hobby: creating the videos that would eventually take on his name and become a symbol of Truth’s deterioration.

There he was, out in the virtual wide open, diligently cranking out fictional videos of women through a machine-learning meat grinder and posting them to public porn forums. And holy shit did they look believable.

For the blissfully uninitiated, deepfakes are AI-generated videos in which one person’s face is attached to another person’s body. They began as Frankenporn made from mash-ups of celebrity faces and porn performers’ bodies, creating pretty realistic videos of sex that never happened.

Deepfakes first posted his creations to Reddit toward the end of 2017, and by December, my editors at Motherboard, VICE’s science and tech outlet, and I had tracked him down and published the first in-depth article on the phenomenon. Come mid-January 2018, people in deepfakes-focused subreddits had made a user-friendly application called FakeApp for running the algorithm the job required. Soon, it seemed like every fake-porn enthusiast was making their own AI-generated sex tapes with varying success.

As it got easier for anyone to do, more outlets started reporting on the phenomenon and panic ensued as media theorists considered the implications of video losing its inherent veracity, especially when it came to news and politics. But it all began with the sex—and a long legacy of toxic male culture and willful ignorance of consent that’s come to a glitchy, moaning, pixelated head.

Deepfakes communicated with me in Reddit direct messages only, and only for a brief time, until, I think, the thing he created got too large for his comfort. “I’m just a programmer who has an interest in machine learning,” his first reply said. “I’m not a professional researcher.” At that point, in December, his work was only popular on a subreddit devoted to editing female celebrities’ faces onto still photos of porn performers—a pastime as old as photo editing.

But a fairly straightforward AI algorithm made those years of shitty ’shop jobs and doppelgänger forums look like scrapbooking. Deepfakes discovered that he could use existing AI algorithms to bring those images to life. All one needed to do was feed the algorithm hundreds of images of someone’s face, pit that training against a video of another person, and let the AI try to match the video as closely as possible to the original training target. The technique took the concept of creating DIY kink content and cranked the dystopia level clean off the dial. It was Weird Science combined with a Lovecraftian monster, and anyone could do it.
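To make concrete how little machinery that took, here is a minimal, illustrative sketch of the shared-encoder, two-decoder autoencoder design generally credited with the technique, written in PyTorch. Everything in it (layer sizes, step counts, the random stand-in data) is an assumption for demonstration, not the original poster’s actual code.

```python
# A minimal, conceptual sketch of the shared-encoder / two-decoder
# autoencoder behind early face-swap deepfakes. All names, layer
# sizes, and the random stand-in "faces" are illustrative assumptions;
# real training needed hundreds of aligned face crops per person and
# many hours of GPU time.
import torch
import torch.nn as nn

def make_encoder() -> nn.Sequential:
    # Compress a 3x64x64 face crop into a 256-dim latent code.
    return nn.Sequential(
        nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
        nn.Flatten(),
        nn.Linear(64 * 16 * 16, 256),
    )

def make_decoder() -> nn.Sequential:
    # Reconstruct a 3x64x64 face crop from the latent code.
    return nn.Sequential(
        nn.Linear(256, 64 * 16 * 16), nn.ReLU(),
        nn.Unflatten(1, (64, 16, 16)),
        nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
        nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
    )

encoder = make_encoder()    # shared: learns pose, expression, lighting
decoder_a = make_decoder()  # learns to render person A's face
decoder_b = make_decoder()  # learns to render person B's face

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.MSELoss()

# Random tensors stand in for the aligned face crops a real run would use.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(3):  # a real run takes tens of thousands of steps
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The swap: encode a frame of person B, then decode it with person A's
# decoder, yielding A's face wearing B's pose and expression.
swapped = decoder_a(encoder(faces_b))
```

The unsettling part is exactly what the sketch shows: once the shared encoder has learned pose and expression from both sets of faces, swapping one person’s identity onto another’s body is a single line of code.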

Everyone could, but not everyone would. Not everyone has the patience or the graphics-processing capabilities to wrestle their own fantasy woman out of an algorithm and a series of datasets, but those elements—a good computer and a high tolerance for failure—were the only things stopping anyone from making their own. Accessible, open-source AI research (combined with a gaming and photo-processing community willing to walk newbies through every troubleshooting bump) democratized a process that was once only possible through big-budget Hollywood magic.

Over the course of weeks of reporting, I must have watched hours of deepfakes posted in forums, mostly in the form of seconds-long GIFs and soundless clips. When the algorithm ran correctly, with enough data and time, the end product was fairly believable, especially when fried down to a lower-resolution image. But when the training went wrong, the results—which people posted so others could learn from their mistakes—were mildly horrifying. Blurred videos of distorted faces, expressions writhing in that painful, frustrated way so many faces in heterosexual porn do, plus a layer of homemade AI-generated uncanny valley; disjointed bits of women smashed onto other women, being violently jackhammered by a disembodied dick; glitches that twisted faces into nightmare masks, possessed by an algorithm trying to crawl out face-first.

I watch a lot of porn for my job. I talk to and read about sex workers and their trade all day, most days of the week. So, there’s no such thing as NSFW for me. But deepfakes were different. None of the women featured in these films—neither the porn performers nor the women whose faces were being used—had consented to have their likenesses spread around the internet like that. It disturbed me because it could happen to anyone, and the people doing it couldn’t be bothered to think twice.

In forums, all the imagery—successful attempts, glitches, source material—was interspersed with nonchalant commentary critiquing the results and troubleshooting one another’s processes. This is how the users discussed real people’s likenesses: as datasets and optimal training times.

One of the last questions I asked Deepfakes—before his creation achieved viral fame, and he stopped talking to me—was whether he was concerned his hobby might someday be used to hurt people. No, he replied. Any technology can be used to harm, he reasoned. “I don’t think it’s a bad thing for more average people to engage in machine learning research.”

But the women affected felt differently. When I showed a porn performer one of the early deepfakes, she questioned the property rights of people working hard to make a living in porn, only to have their videos stolen and abused. And one woman, a minor YouTube celebrity, barely held back tears when I asked how she found out her face was on a deepfake: Her middle-school-aged audience had stumbled onto videos of what looked like her in explicit situations and wanted to know if it was really her.

To most deepfakers, however, these women are simply the sum of interchangeable body parts.

In major news outlets around the world, the deepfakes story blitzed straight past sex and consent and into “fake news” territory. As coverage graduated from “OMG nudes” to “OMG what if someone makes fake videos of Trump launching nukes,” the media stopped grappling with how it all began.

To understand ideologically where deepfakes came from, it’s helpful to understand how porn itself has evolved. In the Playboy and Hustler heydays of the ’60s and ’70s, porn products popped up mostly in public, male-dominated spaces like hole-in-the-wall shops or erotic film screenings—spaces where a woman might hesitate to venture, thanks to social stigma.

The internet has changed how people access porn, but for the most part, the industry is still geared toward the male consumer. And men aren’t picking up videotapes in porn stores anymore, but they are still taking up niche, male-dominated public spaces on the internet, where explicitly entering as a woman or marginalized person opens you up to abuse and harassment. Reddit’s one of them—with years-old communities devoted to “doppelgänger” fetishes and posting creepshots of people. Discord, a chat platform made for gamers and used by people who harass women in the name of their hobby, is another, and we found troves of images of real women in those private channels. Then there are largely toxic forum websites like 4chan and 8chan, where men trade explicit images of women and private forums fill with revenge porn.

In these online spaces, men’s sense of entitlement over women’s bodies tends to go entirely unchecked. Users feed off one another to create a sense that they are the kings of the universe, that they answer to no one. This logic is how you get incels and pickup artists, and it’s how you get deepfakes: a group of men who see no harm in treating women as mere images, and view making and spreading algorithmically weaponized revenge porn as a hobby as innocent and timeless as trading baseball cards.

This is the point I kept coming back to: We have to pay attention to the spirit in which deepfakes started. We can move beyond it, and talk at length about ethical uses of artificial intelligence, fake-news literacy, and who has access to powerful tools like machine learning. But we must first acknowledge that the technology that could start a nuclear war was born as a way for men to have their full, fantastical way with women’s bodies.

That is what’s at the root of deepfakes. And the consequences of forgetting that are more dire than we can predict.

Note: As of February 7, 2018, Reddit has updated its site-wide policy to explicitly ban involuntary pornography.