
Taylor Swift, Deepfake Porn and the Law

Since I began working in the legal technology field, I’ve been aware of the risks of “deepfake” audio and video. The technology is growing in popularity as it becomes easier to use.

If you’re not familiar with it yet, you will be. Deepfakes are audio and video that show someone saying things or acting in ways that never happened. Whether it’s President Biden’s voice or Taylor Swift’s face, the technology is being used to “prove” things that never occurred.

This article offers a good look at the problem, which goes much deeper than celebrity culture:

Deepfake Porn: It Impacts More People Than Just Taylor Swift

“Since this technology has become more widely available, 90-95% of deepfake videos are now nonconsensual pornographic videos and, of those videos, 90% target women — mostly underage.”[15]  All a perpetrator needs is an innocent photo from social media and they can turn it into an explicit image that looks just like the victim.[16] The problem is believed to be regularly affecting millions of teenage girls.[17]

As the article explains, there is no federal law against this in the US. There are some state laws, but they’re not very effective. The problem is that we’ve traditionally treated sexual abuse material as criminal because, as the saying goes, behind every photo is a child being abused. With deepfakes, though, that abuse never happened. So when a teenage girl is the subject of deepfake porn, she was never sexually assaulted, and under that framework there is no crime.

There’s a lot of harm, though. In the case of a celebrity, it might be reputational harm. In the case of teenagers, that fake can turn into blackmail, bullying, and all the mental health issues that go along with them. (The same is true of adults, but we know how vulnerable teens are to this and how it too often ends.)

There may be laws that cover the blackmail and the bullying, but there’s no law against creating the video in the first place. And once it exists, no one can control where it goes or how it will be used. The question is: should there be a law against creating and/or disseminating fake porn or other materials meant to harm the people depicted? Thanks to the publicity surrounding the fake porn videos of Taylor Swift, we finally seem to be thinking about what laws should be in place. That’s a start, but it’s a start that has come far, far too late. This has been going on for a couple of years now, and technology has gotten a huge head start on the law in this and many other areas.

What can we do in the meantime? Learn about deepfake technology, and don’t believe everything you see and hear. The harm to teens is lessened if they know that most people won’t believe a fake video. The harm isn’t zero, but it’s harder to blackmail someone when no one believes the material is real. Taylor was harmed, but far less than an unknown teenager could be, precisely because no one thinks the videos of her are real.

We also have to decide that this is not acceptable. Whether there is a law against it or not, society has to treat this as the gross violation it is. It shouldn’t be socially acceptable to make non-consensual fake videos of anyone for any reason. We aren’t close to that. I hope we get there, and I hope we find the right laws to prevent the most harm. But I also know how far behind we are, I know how fast this technology is moving, and I’m afraid we’ll never get in front of it at the rate we’re working now.

Technology can do great things, and it can cause harm. People who use it to cause harm should not be celebrated.

 
