Taylor Swift is the New Rallying Cry in the Fight Against Deepfakes
Reportedly, millions of people viewed fake explicit photos of pop megastar Taylor Swift, and politicians now have a new reason to take up the fight against deepfakes.
Deepfakes are falsified images of a person made with AI. In this case, the explicit images were deepfakes of Swift, posted on Telegram and X, formerly Twitter. Though the platforms have been working to remove the content, the BBC reported that just one image received more than 47 million views. In a statement to the publication, X said it was removing the images and taking action against the accounts that posted them.
Regardless, those images can’t be unviewed, and politicians are using the high-profile example to add fuel to efforts to combat deepfakes.
“What’s happened to Taylor Swift is nothing new. For yrs, women have been targets of deepfakes w/o their consent. And w/ advancements in AI, creating deepfakes is easier & cheaper,” New York Representative Yvette D. Clarke posted on X. “This is an issue both sides of the aisle & even Swifties should be able to come together to solve.”
Fellow New York Representative Joe Morelle echoed Clarke’s thoughts and used the story to promote his bill, introduced in May 2023, that would make non-consensual explicit deepfake images illegal.
“The explicit AI images of Taylor Swift have people wondering: how is this not illegal?” Morelle posted on X. “I was astounded, too — so I wrote legislation to make non-consensual deepfakes a federal crime. Join me in advocating for passage of my bill, the Preventing Deepfakes of Intimate Images Act.”
While both Clarke and Morelle are Democrats from New York, the issue has caught attention across the aisle. The Guardian noted that Republican Congressman Tom Kean Jr. of New Jersey also made a statement against the practice.
“It is clear that AI technology is advancing faster than the necessary guardrails. Whether the victim is Taylor Swift or any young person across our country, we need to establish safeguards to combat this alarming trend,” The Guardian quoted Kean as saying. The publication further noted that the New Jersey politician co-sponsored Morelle’s bill, joining one other Republican, New York Representative Michael Lawler, and 20 Democrats.
Passage of the bill would bring the U.S. in line with the United Kingdom, which made non-consensual, explicit deepfake images illegal back in December 2022, as The Guardian points out. Importantly, the bill would target all non-consensual deepfake pornography, and celebrities are far from the only victims.
Still, explicit deepfakes aren’t the only ones causing problems. Swift’s likeness was used earlier this month to scam people into participating in a fake giveaway for Le Creuset cookware. Similar scams have targeted YouTuber MrBeast and Tom Hanks as well.
Image credits: Header photo licensed via Depositphotos.