The rapid online spread of deepfake pornographic images of Taylor Swift has renewed calls, including from US politicians, to criminalise the practice, in which artificial intelligence is used to synthesise fake but convincing explicit imagery.
The images of the US pop star have been distributed across social media and seen by millions this week. One image of Swift hosted on X, which had previously circulated on the app Telegram, was viewed 47m times before it was removed.
X said in a statement: “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”
Yvette D Clarke, a Democratic congresswoman for New York, wrote on X: “What’s happened to Taylor Swift is nothing new. For yrs, women have been targets of deepfakes [without] their consent. And [with] advancements in AI, creating deepfakes is easier & cheaper. This is an issue both sides of the aisle & even Swifties should be able to come together to solve.”