We’ve spent a lot of time worrying about the possible effect of deepfake videos on the 2020 election.
While that’s a real concern, we were blown away by the stats in a report from Deeptrace Labs. The most startling statistic was that 96% of fake videos across the internet are of women, mostly celebrities, whose images are used in sexual fantasy deepfakes without their consent.
Deeptrace Labs identified 14,678 deepfake videos across a number of streaming platforms and porn sites, nearly double its previous count of 7,964 videos in December 2018.
Sadly, we imagine we’ll see a surge in lawyers representing exploited celebrities whose publicity rights have been violated. Far worse, we are quite sure those women (and non-celebrities too) feel physically violated by these images. Revenge porn (targeting ex-girlfriends/wives) has also been taken to a whole new level with the use of deepfake videos.
The top four websites dedicated to hosting deepfakes received a combined 134 million views on such videos. There is, sadly, no shortage of demand for these images.
There are places on the internet (we’re not going to give them publicity here) with a lineup of celebrities. Their faces move, smile and blink as you move around them. They are fully nude, waiting for you to decide what you’ll do to them as you peruse a menu of sex positions. Inevitably, because there is so much money to be made, the sex will be of all kinds, including rape.
We briefly watched a snippet from one of the videos. It was creepy and nauseating. To think that a real woman somewhere would have to cope with seeing herself manipulated by a user in this manner is horrific. And of course, those behind the videos will move to using children as well. Because they can and because there is a market. The full force of the law needs to stop revenge porn, the violation of publicity rights of celebrities, and the non-consensual use of anyone’s face in these videos. Where the laws are currently insufficient, we need new and stronger laws.
Most of the states have revenge porn laws of some kind, sometimes weak laws with minor penalties. The laws tend to assume postings by a vengeful ex-spouse or lover rather than a mass market for products capitalizing on the demand for celebrities in sexual deepfake videos.
Sharing deepfake revenge porn is now a specific crime in Virginia (effective July 1, 2019). We have not seen a study of how many current revenge porn laws fail to specifically criminalize deepfake revenge porn videos, but it is a good guess that many state laws are now inadequate. The federal government (we know you are shocked) has not been able to agree on a law outlawing revenge porn deepfakes.
How do we combat the spread of $50 apps like DeepNude (thankfully defunct as we write, though there will be others), which could “undress” a woman in a single click? DeepNude was trained on more than 10,000 images of nude women and would produce the undressed image within 30 seconds—and of course the image could be shared to smear reputations (by sending it to the woman’s employer or friends and family) or posted online as revenge porn.
Let’s hope our legislatures and the federal government pass laws with teeth to put a stop to this online debasement of women.