Taylor Swift deepfake pornography sparks renewed calls for US legislation

The rapid online spread of deepfake pornographic images of Taylor Swift has renewed calls, including from US politicians, to criminalise the practice, in which artificial intelligence is used to synthesise fake but convincing explicit imagery.

Online Spread of Deepfake Images

Explicit deepfake images of Taylor Swift recently went viral on social media, raising fresh questions about the need for legislation to address the practice. The fake images, created using artificial intelligence and reportedly first shared in a Telegram group, were viewed by millions of people; one post on X (formerly Twitter) drew 47 million views before it was taken down.

In response, X, the platform on which many of the images circulated, said it was actively removing identified images and taking action against the accounts responsible for posting them. The incident has nonetheless sparked a wider conversation about the need for legal measures to combat deepfake pornography.

Calls for Legislation

US politicians, including Democratic congresswoman Yvette D. Clarke and Republican congressman Tom Kean Jr., have voiced concern about deepfake pornography and called for legislative action. Clarke noted that deepfakes have long been used to target women without their consent, and that advances in AI have made such content easier and cheaper to produce. She emphasised the importance of bipartisan cooperation in addressing the issue.

While some US states have their own laws against deepfakes, there is a growing push for federal legislation. In May 2023, Democratic congressman Joseph Morelle introduced the Preventing Deepfakes of Intimate Images Act, which would make it illegal to share deepfake pornography without the subject's consent. Morelle and Kean have both emphasised the need for safeguards to combat this alarming trend and protect potential victims.

Targeting Women and the Need for Safeguards

Deepfake technology is used primarily to target women in sexually exploitative ways. A 2019 study by DeepTrace Labs found that 96% of deepfake videos online were non-consensual pornography. The problem has worsened as generative AI tools have made it possible to produce highly convincing images from simple text prompts.

High-profile women, including celebrities such as Scarlett Johansson, have spoken out about the prevalence of fake pornography featuring their likenesses. In December 2022, the UK government moved to make non-consensual deepfake pornography illegal. Such steps aim to protect women and girls from the abuse and humiliation caused by manipulated intimate images.