
Taylor Swift deepfakes: a legal case from the singer could help other victims of AI pornography

In between her record-shattering Eras tour and cheering on her NFL-star boyfriend Travis Kelce, Taylor Swift may be gearing up for a history-making legal battle over AI pornography. Swift is reportedly preparing to take action against distributors of “deepfake” images of her.

Obscene images of Swift began circulating on X (formerly Twitter) on January 25. The images, which fans described as “disgusting”, reportedly originated in a Telegram group dedicated to generating artificial pornographic content of women. They were live for around 17 hours and viewed more than 45 million times before being taken down. X temporarily blocked searches for Swift’s name in an attempt to stop other users from sharing the images.

In response, a group of US senators have introduced a bill to criminalise the distribution of AI-generated, non-consensual sexual images.

There is currently no federal law in the US against deepfake content. Such legislation has been discussed, though mainly in response to the use of generative AI in political misinformation.

But until a law is passed, Swift’s options for recourse are limited. She could sue the company responsible for the technology, or possibly bring a civil suit against the image creators or distributors. Microsoft, whose software was allegedly used to create the images, has already applied restrictions to prevent similar images being generated.

The impact of image-based sexual abuse

Swift’s case is high-profile due to her celebrity status, but AI-generated pornography and deepfakes are a rapidly growing problem as the technology becomes more accessible. It now takes just 25 minutes and costs nothing to create artificial pornography.

Almost all deepfake content is pornographic in nature, and it nearly always depicts women. While celebrities are often targeted, the reality is that anyone with access to a photo of you could easily create pornographic content using your likeness.

While the images may be fake, the harmful impact of this experience is very real. Studies into the non-consensual sharing of pornographic or intimate images have documented the feelings of violation and shame felt by victims. Anxiety, depression and PTSD are common results of this experience.

As victims frequently blame themselves or experience blame from others, it can be difficult to move on. What’s more, the digital nature of sexual image sharing means that victims must live in constant fear of the images resurfacing and being shared over and over again.

Actor Jennifer Lawrence and television personality Georgia Harrison, both famous victims of image-based sexual abuse, have discussed how their celebrity status increased the scale and scope of their own experiences.

Deepfake porn and other forms of digital sexual abuse are sometimes dismissed as a “lesser” harm than physical sexual assault because of their online nature. A high profile case like Swift’s could help draw attention to the genuine impact of these images on victims.

The fact that it is not “really” her body does not diminish the seriousness. Research shows that the “pornification” of non-risqué photos using Photoshop or AI can be just as harmful to the person depicted. The intent behind these images is not simply sexual: the degradation and humiliation of the victim is an equally desired outcome.

What does the law in the UK say?

Through the Online Safety Act, UK law now criminalises the distribution (though not the creation) of digitally made or altered intimate images.

While the change has been welcomed by campaigners, there are still barriers preventing victims from seeking justice.

As with sexual violence more broadly, incidents of non-consensual sexual image sharing often go unreported. When they are reported, poor police response can exacerbate the harms.

Prosecution rates are low, often due to lack of evidence or victims withdrawing their support. A lack of guidance from officers, no guarantee of anonymity, or a fear of repercussion from the perpetrator are just some of the factors that might lead to someone deciding not to see a case through.

A record of fighting sexism

This would not be the first time Swift has taken legal action against misogyny. In 2017, she won her claim against a former radio DJ who groped her at a meet-and-greet. She claimed just US$1 (79p) in damages, stating that she wanted to use the case to take a stand against sexual violence.

Courtroom sketch of Taylor Swift conferring with her attorney at a civil trial in 2017.
Swift is no stranger to battling sexism in the courtroom. Jeff Kandyba/Alamy

While Swift’s celebrity status has made her a target, her fame brings advantages that many victims of sexual violence don’t have, as she noted in the aftermath of her 2017 trial. Her platform, wealth and influence gave her the means to speak out where others cannot.

The “Taylor Swift effect” has proven to be a powerful economic force. If she does choose to take action against the creators of the images, it could be a landmark case against artificial pornography. Increased public awareness might embolden other victims to come forward, and put pressure on governing bodies around the world to enforce the law more effectively.

If you have been affected by artificial pornography or other forms of image-based sexual abuse, there are a number of places you can find help. The Revenge Porn Helpline and The Cyber Helpline can provide guidance on removing these images, reporting to the police, and accessing additional support.
