Hundreds Of People Are Now Making Fake Celebrity Porn Thanks To A New AI-Assisted Tool

Last month, we wrote about a redditor named ‘deepfakes’, who was starting to make a name for himself on Reddit thanks to his face-swap porn. Mind you, editing tools have been letting people put celebrity faces onto pornographic images and video for years now, but deepfakes’ creations were on a different level.

He was using a machine learning algorithm on his home computer, fed with armloads of images of both faces involved. After a few hours of training, the AI could lay the replacement face over the original in moving video, producing a fairly convincing fake porn clip in the process. Now, hundreds of others are doing the same.
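
The article doesn't publish the redditor's code, but the recipe it describes — one model trained on piles of photos of both faces, then used to paint one face over the other frame by frame — is commonly sketched as a shared encoder with a separate decoder per person. The snippet below is an illustrative Python/PyTorch sketch of that general idea only; the layer sizes, learning rate, and helper names (train_step, swap_to_b) are assumptions, not the actual tool.

```python
# Illustrative sketch (not the actual 'deepfakes' code) of the shared-encoder,
# per-identity-decoder face-swap idea: one encoder learns a generic face
# representation from both people; each decoder learns to rebuild one person.
# At swap time, encode a frame of person A and decode it with B's decoder.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16x16 -> 8x8
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z)

encoder = Encoder()    # shared between both identities
decoder_a = Decoder()  # reconstructs person A
decoder_b = Decoder()  # reconstructs person B
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=5e-5,
)

def train_step(faces_a, faces_b):
    """One step: each decoder learns to rebuild its own person's faces."""
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

def swap_to_b(face_a):
    """The swap itself: encode person A's frame, decode with B's decoder."""
    with torch.no_grad():
        return decoder_b(encoder(face_a))

# Random 64x64 RGB batches stand in for aligned face crops from real footage.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)
print(train_step(faces_a, faces_b))
print(swap_to_b(faces_a).shape)  # torch.Size([8, 3, 64, 64])
```

The "few hours of training" the article mentions corresponds to repeating a step like train_step over thousands of cropped, aligned face images; once the decoders are good enough, every frame of the target video is pushed through the swap and pasted back into place.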

One Redditor’s attempt to put Daisy Ridley’s face on a pornstar

Another redditor has since created an app, built on the same machine learning algorithm, for making AI-assisted porn videos. Thanks to its easier user interface, there are now many others on the platform using it to build their own pornographic creations. In fact, there's an entire subreddit called deepfakes dedicated to these attempts, successes and failures alike, and it has close to 30,000 subscribers. Gal Gadot, Maisie Williams, Katy Perry, Taylor Swift: you can find faked porn videos of all of them, some incredibly convincing. All you need is the app, photos or video of the celebrity of your choice, and a porn video featuring a similar-looking performer.

The future of fake porn we predicted is hurtling towards us at blinding speed, and the problem is that no country in the world has laws equipped to deal with this kind of thing. Is it sexual harassment, or mere defamation? More importantly, how will it affect celebrities, and even regular people, when this technology progresses to the point where we can't tell the difference?

To be sure, there are a lot of failed attempts on the subreddit, but some manage to come uncomfortably close to believable. For instance, redditor DaFakir decided to try his hand at faking Katy Perry porn.

mengohmengohmeng instead thought to manipulate Kristen Bell.

HaDenG made Gal Gadot star in a pornographic title once more. 

pornbot205 meanwhile wanted Natalie Dormer to be raunchy on video.

As you can see, there are still flaws here and there, but it's improving every day. It's a major problem, and frankly also a major violation, for the people involved in these fake clips. Redditors on the subreddit argue that editing tools have allowed people to create fake porn for years, and that before that there were pornographic magazines and scissors. However, the real issues here are how realistic these fake videos can turn out (with the right training material) and how much easier the Deep Fake app makes the whole process.

However, there is something of a silver lining to this technology. Other redditors have been using the app in decidedly non-pornographic ways, instead swapping out celebrity faces in movies and TV shows. For instance, here's redditor derpfakes, who decided the world needed more Nicolas Cage.

Even more impressive, however, is his attempt to fix Rogue One. In the movie, the final scene featuring a young Carrie Fisher had to be recreated with expensive, multi-million-dollar CGI. That's the clip on top. On the bottom is derpfakes' AI-assisted face swap using Fisher's performance in the original Star Wars trilogy. “Bottom is a 20 minute fake that could have been done in essentially the same way with a visually similar actress. My budget: $0 and some Fleetwood Mac tunes,” he says.

It’s a glimpse into how this kind of AI-assisted editing could be used for good, particularly in Hollywood. Unfortunately, it also means it’s going to get very hard in the future to decide whether something is fake or real. In the hands of trolls, this is a worrisome issue. In the hands of vengeful exes and black hat hackers, it’s downright terrifying.
