Scoopfeeds — Intelligent news, curated.
The shock of seeing your body used in deepfake porn
AI

MIT Technology Review · May 14, 2026, 9:00 AM

Why this matters: a development in AI with implications for how people work, create, and decide.

When Jennifer got a job doing research for a nonprofit in 2023, she ran her new professional headshot through a facial recognition program. She wanted to see if the tech would pull up the porn videos she’d made more than 10 years before, when she was in her early 20s. It did return some of that content, along with something alarming she’d never seen before: one of her old videos, but with someone else’s face on her body.

“At first, I thought it was just a different person,” says Jennifer, who is being identified by a pseudonym to protect her privacy. But then she recognized a distinctly garish background from a video she’d shot around 2013, and she realized: “Somebody used me in a deepfake.” Eerily, the facial recognition tech had identified her because the image still contained some of Jennifer’s features: her cheekbones, her brow, the shape of her chin. “It’s like I’m wearing somebody else’s face like a mask,” she says.

Conversations about sexualized deepfakes, which fall under the umbrella of nonconsensual intimate imagery (NCII), most often center on the people whose faces are featured doing something they didn’t really do, or shown on bodies that aren’t really theirs. These are often popular celebrities, though over the past few years more people (mostly women and sometimes youths) have been targeted, sparking alarm, fear, and even legislation. But these discussions and societal responses usually are not concerned with the bodies the faces are attached to in these images and videos. As Jennifer, now 37 and a psychotherapist working in New York City, puts it: “There’s never any discussion about ‘Whose body is this?’”

For years, the answer has generally been adult content creators. Deepfakes in fact earned their name back in November 2017, when someone with the Reddit username “deepfakes” uploaded videos showing the faces of stars like Scarlett Johansson and Gal Gadot pasted onto porn actors’ bodies.

Article preview — originally published by MIT Technology Review. Full story at the source.
Read full story on MIT Technology Review →
Aggregated and edited by the Scoop newsroom. We surface news from MIT Technology Review alongside other reporting so you can compare coverage in one place.