Taylor Swift Wants to Trademark Her Likeness. These TikTok Deepfake Ads Show Why
Key takeaways
- The move comes as AI deepfakes continue to proliferate across social media.
- A new report from AI detection company Copyleaks shows that Swift and other stars have recently had their likenesses used in scammy advertisements.
- The fake ads show Swift and other celebrities in what appear to be common interview settings—red carpet events or talk show sets.
Photo-Illustration: WIRED Staff; Getty Images
Last week, Taylor Swift filed a trio of trademark applications to protect her image and voice. One is meant to cover a well-known photograph of the pop singer holding a pink guitar during a concert on her record-breaking Eras tour, while the two sound trademarks are for simple identifying phrases: “Hey, it’s Taylor Swift,” and “Hey, it’s Taylor.”
The move comes as AI deepfakes continue to proliferate across social media. Any individual stands to have their likeness exploited in the creation of nonconsensual AI-generated material; earlier this month, an Ohio man was the first person convicted under a new federal law criminalizing “intimate” visual deceptions of this sort. Celebrities, meanwhile, find themselves at risk of both explicit deepfakes and false endorsements.
A new report from AI detection company Copyleaks shows that Swift and other stars have recently had their likenesses used in scammy advertisements. Researchers identified a cluster of sponsored videos on TikTok that appeared to show Swift, Kim Kardashian, Rihanna, and others promoting “potentially fraudulent or malicious services,” with the clips making use of what the researchers call “realistic-sounding voices” as well as “textured filters meant to mask some of the flaws in the AI-generated visuals.”