6 Reasons We Need to Take on Deepfake Porn – Right Now.

Have you heard of deepfakes yet? It's a new form of video technology that has been weaponized to harm women online. Think of it as nonconsensual pornography 2.0 (aka revenge porn 2.0). Videos are created using a face-swapping app or software to transfer a victim's face onto someone else's body in a pornographic video – making it falsely (yet realistically) appear that the victim is engaging in sex acts. These videos are made and shared without the victim's consent. We call this nonconsensual deepfake pornography.

Here are 6 reasons why we created a guide for victims and anyone helping them:  

Violence. Against. Women: The statistics leave no doubt: 96% of all deepfake videos are pornographic, and all of them target women. What began with a focus on female celebrities has now spread to non-celebrities. It gets worse. A recent offshoot of deepfake tech – a malicious 'x-ray' app – digitally removes a victim's clothing from any photo, and it only works on photos of women. Any form of abuse this popular, and this gendered, demands our full attention.

Deepfake porn doesn't need to fool anyone – it's more popular than ever: The concern dominating the conversation around deepfakes is deception – how videos will fool viewers into believing something that isn't true. But deepfake porn doesn't need to be believed to cause harm. When viewers visit deepfake porn websites – 134,364,438 unique views and counting – it isn't because they believe the videos are real. These websites openly identify as deepfake porn sites, and most include 'deepfake' in their name. This means the harm to victims of nonconsensual deepfake porn is different, and far worse.

If a deepfake is first believed to be authentic, the damage will be similar to that of nonconsensual porn. If it isn't believed (or is later debunked), the woman depicted is still publicly sexualized and fetishized without her consent. Even if all deepfakes could be detected instantaneously (which isn't possible right now), victims would still suffer in this way. For nonconsensual deepfake pornography – currently 96% of the problem – we have to contemplate solutions other than detection.

Deepfakes make it harder to believe victims: Despite the rising popularity and devastating impact of deepfakes, many people remain completely unaware this technology exists. That includes victim service providers, law enforcement, judges, lawmakers and even victims themselves. It means a victim may not be believed – and may be re-traumatized – when they say it isn't actually them in a pornographic deepfake.

Waiting For More Victims Is An Unacceptable Strategy: High schools and college campuses are not yet reporting deepfake abuse as rampant among students. Domestic violence survivors are not yet reporting nonconsensual deepfake pornography in droves. And yet we know deepfake porn is rapidly growing in popularity, and the technology to create these videos is improving – and becoming more accessible – in real time. The mass proliferation of deepfake pornography is not a question of if, but when. If we act now, we can prepare and educate victims and first responders before we get there.

We Are All Potential Targets: Nonconsensual pornography is predicated on the existence of an actual intimate image or video of the victim. Deepfakes need only images of a victim's face. With our faces plastered all over social media and the Internet, this puts us all at risk.

Too Few Resources: Nonconsensual pornography laws have been passed in 46 states. They won't help here. Those laws generally require that the victim's own body be exposed, but deepfake porn swaps only the victim's face onto someone else's body. Still, four deepfake laws were passed this year in the United States. Only two address nonconsensual deepfake pornography. The other two? Election fraud. As it stands now, 50% of the laws passed do not address 96% of the problem.

For questions about the Victim Resource Guide or our work at EndTAB, you can reach us here.