Deepfake porn: why we need to make it a crime to create it, not just share it

Deepfake pornography – where someone’s likeness is imposed into sexually explicit images with artificial intelligence – is alarmingly common. The most popular website dedicated to sexualised deepfakes, usually created and shared without consent, receives an enormous volume of traffic each month. The content almost exclusively targets women. There has also been a surge in “nudifying” apps which transform ordinary images of women and girls into nudes.

Author


  • Clare McGlynn

    Professor of Law, Durham University

When Jodie, the subject of a new BBC Radio File on 4 documentary, received an anonymous email telling her she’d been deepfaked, she was devastated. Her sense of violation intensified when she discovered that the man responsible was someone who’d been a close friend for years. She was left with suicidal feelings, and several of her other female friends were also victims.

The horror confronting Jodie, her friends and other victims is not caused by unknown “perverts” on the internet, but by ordinary, everyday men and boys. Perpetrators of deepfake sexual abuse can be our friends, acquaintances, colleagues or classmates. Girls around the world have realised that their classmates are using apps to transform their social media posts into nudes and sharing them in groups.

Having worked with victims and spoken to many young women, it is clear to me that deepfake porn is now an invisible threat pervading the lives of all women and girls. Deepfake pornography or nudifying ordinary images can happen to any of us, at any time. And, at least in the UK, there is nothing we can do to prevent it.

While UK laws criminalise sharing deepfake porn without consent, they do not cover its creation. The possibility of creation alone implants fear and threat into women’s lives.

Deepfake creation itself is a violation

This is why it’s time to consider criminalising the creation of sexualised deepfakes without consent. In the House of Lords, one peer described deepfake abuse as a “new frontier of violence against women” and called for creation to be criminalised.

It’s also a debate taking place around the world. The US is considering federal legislation to give victims a right to sue for damages or injunctions in a civil court, following states that have criminalised creation. Other jurisdictions already criminalise the production of sexualised deepfakes without consent.

A common response to the idea of criminalising the creation of deepfakes without consent is that deepfake pornography is a personal fantasy, just like imagining it in your head. But it’s not – it is creating a digital file that could be shared online at any moment, deliberately or through malicious means such as hacking.

It’s also not clear why we should privilege men’s rights to sexual fantasy over the rights of women and girls to sexual integrity, autonomy and choice. This is non-consensual conduct of a sexual nature. Neither the porn performer nor the woman whose image is imposed into the porn has consented to their images, identities and sexualities being used in this way.

Creation may be about sexual fantasy, but it is also about power and control, and the humiliation of women. Men’s sense of entitlement over women’s bodies pervades the internet chat rooms where sexualised deepfakes and tips for their creation are shared. As with other forms of abuse against women, deepfake porn is a way of telling women to get off the internet.

Taking the law further

A law that only criminalises the distribution of deepfake porn ignores the fact that the non-consensual creation of the material is itself a violation. Criminalising production would aim to stop this practice at its root.

While there are legitimate concerns about over-criminalisation of social problems, there is a worldwide under-criminalisation of harms experienced by women, particularly online abuse.

And while criminal justice is not the only – or even the primary – solution to sexual violence, it remains an important one. Not all women want to report to police, but some do. We also need new civil powers to enable judges to order internet platforms and perpetrators to take down and delete imagery, and to require compensation be paid where appropriate.

As well as laying the foundation for education and cultural change, the criminal law can impose greater obligations on internet platforms. If creating pornographic deepfakes were unlawful, it would be difficult for technology companies to continue to prop up the deepfake ecosystem, for search engines to continue returning deepfake porn sites at the top of searches, and for social media companies and app stores to continue to advertise nudify apps.

The reality of living with the invisible threat of deepfake sexual abuse is now dawning on women and girls. My women students are aghast when they realise that the student next to them could make deepfake porn of them, tell them they’ve done so and that they’re enjoying watching it – and yet there’s nothing they can do about it, because it’s not unlawful.

With women expressing their deep despair that their futures are in the hands of the “unpredictable behaviour” and “rash” decisions of men, it’s time for the law to address this threat.

The Conversation

Clare McGlynn does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
