Taylor Swift fans, alongside politicians and the White House, have expressed their outrage over AI-generated fake pornographic images of the pop superstar that recently went viral on various social media platforms.
According to reports, one such image on X, formerly known as Twitter, was viewed more than 47 million times before it was removed on Thursday, roughly 17 hours after it was posted.
White House Press Secretary Karine Jean-Pierre called the incident “alarming” and noted that lax enforcement disproportionately affects women and girls, who are often the primary targets of online harassment.
Deepfake porn images featuring celebrities are not a new phenomenon, but there is growing concern that easy-to-use generative AI tools could lead to a surge in toxic or harmful content. Reports of young women and teenagers being victimised on social media with sexually explicit deepfakes are also on the rise, and the fakes are becoming more realistic and easier to create, AFP reported.
The targeting of Swift, one of the world’s most-listened-to artists on Spotify, has drawn significant attention to the issue, with her fans expressing outrage. Swift, known for leveraging her influence for social causes, had previously urged her 280 million Instagram followers to vote and played a role in prompting US Congress hearings about Ticketmaster’s handling of her concert ticket sales.
Influencer Danisha Carter commented on X that Swift’s influence could help spur legislative action against deepfake pornography.
X, which has more lenient policies on nudity than Meta-owned platforms, is one of the largest platforms hosting pornographic content, yet Apple and Google have so far tolerated it under their app store guidelines.
X stated that Non-Consensual Nudity (NCN) images are strictly prohibited on its platform and that it has a zero-tolerance policy towards such content. It said it was actively removing the images, taking action against the accounts responsible, and monitoring for further violations.
However, the images continued to circulate and were shared on other platforms like Telegram. Swift’s representatives have not commented on the matter.
The incident also recalls Swift’s past experiences as a target of right-wing conspiracy theories and fake videos.
Democratic Congresswoman Yvette Clarke and Republican Congressman Tom Kean have raised concerns about advances in AI and the lack of adequate safeguards against deepfake pornography. Federal legislation would be needed to impose legally mandated controls, but passing it remains a challenge in a deeply divided US Congress.
Deepfake technology, increasingly used to create explicit content targeting women, is widely accessible online. Research indicates that the majority of deepfake videos circulating on the internet are pornographic in nature.