Imagine if all the photographs from your life were stolen from you. Photos of you blowing out candles on your 18th birthday, basking in the sun on holiday, looking gleeful at graduation, or awestruck at a friend’s wedding. Imagine if these photographs from every chapter of your life were harvested by a stranger who used technology to crop and cut the pictures as if for a collage. Imagine if one day, by total happenstance, you discovered not only that your face had been plastered all over the internet but that it had been placed on someone else’s naked body and framed as pornography.
These are the unsettling dangers of deepfake porn. In recent years, deepfakes – videos and pictures in which a person’s face and body have been digitally altered so that they appear to be someone else – have attracted a lot of attention for the political risks they pose. Some deepfakes are harmless (see apps like TikTok for a host of counterfeit videos of Vladimir Putin and Joe Biden dancing together). But others pose a malicious threat to diplomacy and national security because of their unmatched power to blur the boundary between fact and fabrication. Yet in the ongoing discussion of the political ramifications of deepfakes, we have side-lined the technology’s primary victims: women.
In fact, we forget that deepfake technology originated with a Reddit user with the screen name “deepfakes”, who used techniques developed and open-sourced by AI researchers to swap the faces of female celebrities, from Kim Kardashian to Ariana Grande, onto porn videos.
But the problem is that AI technology has since become markedly more sophisticated, and it is no longer only celebrities and public figures who are the focus of these identity-theft attacks, but everyday women. New apps are cropping up left, right, and centre which give anybody the chance to create non-consensual porn. A concerning MIT Technology Review investigation recently revealed that a new AI app invites anyone to upload a picture of any face and project it onto anyone’s body with a simple click of a button. The app’s tagline encourages users to “turn anyone into a porn star by using deepfake technology to swap the person’s face into an adult video.”
The app was discovered by Henry Ajder, a deepfake researcher, who refers to the platform as “Y” to avoid helping it break into the mainstream. Currently, the app has a small user base, but researchers like Ajder have long feared that an app along these lines would emerge, and that more would follow, giving users the ability to create a new form of revenge porn.
The research company Sensity AI estimates that between 90 and 95 per cent of all online deepfake videos are non-consensual porn, and around 90 per cent of those feature women – so why are we not hearing more about this, and is the law doing enough to protect women from this violence?
“People aren’t talking about this issue despite the fact it could happen to anyone,” says 26-year-old law graduate and activist Noelle Martin. Martin, who lives in Perth, Australia, has campaigned against image-based abuse ever since she discovered deepfake photos of herself aged 17.
Out of curiosity, she had done a reverse Google image search of a picture of herself to see where else it cropped up on the internet – and that is how she came across the images. Her stomach sank when she realised her face had been photoshopped onto the bodies of porn actresses engaged in sexual acts, in hundreds of pornographic images distributed across porn sites. “It became something that was impossible to control,” says Martin. “One of the biggest issues was not knowing who was responsible and also to find out later that the perpetrator had already been targeting me for a year before I even found out.”
Exasperated by the fact there was no judicial recourse in Australia, Martin began to speak out and supercharge the discussion about image-based sexual abuse. Yet instead of curtailing the abuse, it merely opened up a can of worms. “When I started speaking out and getting more involved with the law reform in Australia, the perpetrators stopped creating photos and started making videos,” she tells me.
“I remember one day, when I was at work, I got an email saying there was a video of me. I opened the link, and it was a fabricated video of me having sexual intercourse with my full name on the title, saying ‘Noelle Martin, feminist and activist in Australia’. I then found another one and realised the image used was of me at 17, which they clearly used as a foundation to build the video from.”
Eventually, Martin’s herculean efforts paid off, and the law was changed in Australia in 2018, making it a criminal offence to distribute non-consensual intimate images. “I played a role in the legal process, but that role was on the backs of many other people,” she said. “I did what I could, and that was to put a human face to the issue, and that’s so vital. There are still online sites with a continuous thread of collecting images from women’s social media profiles. But if I don’t speak about it, who will?”
Over in the UK, Helen Mort, a poet and broadcaster from Sheffield, has also been vocal about being the target of a fake pornography campaign. In an interview with MIT Technology Review, she said that what shocked her was that the intimate images she found were based on photos taken between 2017 and 2019, from holiday snaps to pregnancy pictures. Even more worryingly, the pictures had been extracted from her private social media accounts, including a Facebook profile she had deleted. “It really makes you feel powerless, like you’re being put in your place,” she told the publication. “Punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you.’”
Fortunately, movements to ban non-consensual deepfake porn are gaining momentum in the US and the UK. Yet for victims, finding legal recourse is still like looking for a needle in a haystack. In the US, 46 states have some ban on revenge porn, but only Virginia’s and California’s include faked and deepfake media. In the UK, revenge porn is banned, but the law doesn’t cover images that have been faked or altered. There are no laws that specifically target deepfakes, nor is there a “deepfake intellectual property right” that could be invoked in a dispute. The UK also does not have a specific law protecting a person’s “image” or “personality.”
Campaigners and academics have called on the government to fill in the legal gaps and take swift action to regulate deepfakes. Professor Clare McGlynn of Durham University, a leading legal expert, told the BBC that there was a chance of an “epidemic” of sexual abuse if the law wasn’t reformed. She warned that, although the number of people affected by deepfake pornography is low, we cannot afford to be complacent when tackling this problem. “If we don’t stop this, if we don’t try and change things now, this is going to become the next pandemic,” she said. “It’s going to be the next epidemic of abuse.”
Without a change in the legal and regulatory framework, there will be a culture of mistrust, reputations will be ruined, and image-based abuse will inevitably increase. But encouragingly, an independent review on strengthening the law is underway in the UK. The Law Commission, the independent body that reviews laws and recommends reforms when needed, is currently conducting a review of the existing criminal law as it relates to taking, making and sharing intimate images without consent. The review covers “sharing an altered image – including sexualised photoshopping and deepfake pornography”.
Although steady progress is being made in countries such as the US and UK, campaigners like Noelle Martin stress that lawmakers and politicians need to understand that this issue is global and cannot be resolved with a straightforward slam of the gavel. “The law is only one aspect. The problem is how effective they are when the victim doesn’t know who the perpetrator is or they are overseas. We have a government regulator in Australia that goes and gets the material removed on behalf of the victim, but this doesn’t take into account that the material could multiply from just one site.
“Ultimately, this is a global issue that transcends borders, so there has to be a global response to the nature of the threat. We need collaborative action between government, law enforcement and regulators, like we see with child sexual abuse and exploitation. But how do we get there?”
Apart from the US and the UK, no other country bans non-consensual fake porn at a national level. “Many countries do not have the infrastructure or legislation in place to tackle this issue,” says Martin. “Countries need to employ domestic legislation and provide avenues for justice for people who experience this sort of abuse. But on top of this, we also need greater awareness and education, we need to challenge attitudes around victim-blaming and slut-shaming, and we also need to know how effective the law is in supporting victims.”
Another big problem with tackling the spread of non-consensual deepfake pornography is that deepfakes are still a relatively unknown technology and that victims are reluctant to come forward with their stories for fear it would lead to public shame and embarrassment. Mort told MIT Technology Review that she had been trolled since sharing her experience publicly, and Martin also told me how the abuse towards her only escalated when she spoke out, and about the emotional toll that took on her.
Still, it is the fearless work of campaigners like Helen Mort and Noelle Martin that will kickstart reform and help lawmakers understand the scope of this multifarious issue as the technology becomes ever more ubiquitous. “In order for people to understand just how horrible this issue is, it’s powerful to hear from someone who has lived through it,” says Martin. “I’m not a celebrity or a public figure; I’m no different to anyone else. If it can happen to me, it can happen to anyone.”