Imagine this: You’re browsing the web and stumble onto a porn site where you notice a face that looks strangely familiar. Perplexed, you click the link – but what you see horrifies you. You’re watching yourself in a porn scene. You’ve never shot porn – this can’t be right.
While the prospect of being featured in a porn scene probably isn’t on your bucket list, the video we’re referring to shows actor Gal Gadot in an incest video where she’s allegedly having sex with her stepbrother – only it’s not Gadot; her face has been ‘pasted’ onto a porn star’s body.
Using machine learning, Reddit user ‘Deepfakes’ trained an algorithm on real porn videos and images of Gal Gadot, allowing it to create an approximation of the actor’s face that could then be mapped onto the performer in the video – essentially an automated face-swap. The end product isn’t perfect – we had to watch it a few times to be sure – but it will certainly fool the unsuspecting, or the more imaginative users out there who just want to see Wonder Woman with her kit off.
This isn’t the first time Deepfakes has dabbled in face-swap dark arts. He’s made similar videos with Taylor Swift and Game of Thrones star Maisie Williams. All it takes is a couple of hundred face images, from which he can generate the distorted images used to train the AI ‘network’.
“I just found a clever way to do face-swap,” he said, referring to his algorithm. “With hundreds of face images, I can easily generate millions of distorted images to train the network. After that, if I feed the network someone else’s face, the network will think it’s just another distorted image and try to make it look like the training face,” he told Motherboard.
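Deepfakes’ actual tool reportedly uses deep neural networks trained on video frames; as a rough illustration of the idea in his quote, here is a deliberately simplified toy sketch in plain NumPy. It trains a tiny linear autoencoder on distorted copies of one synthetic “face” vector, then feeds it a different face – every name, size, and parameter below is invented for illustration, not taken from his code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a target face: a flattened 8x8 grayscale patch.
face = rng.random(64)

def distort(img, rng):
    """Produce a distorted copy: pixel noise plus a small random shift."""
    noisy = img + rng.normal(0.0, 0.05, img.shape)
    return np.clip(np.roll(noisy, rng.integers(-2, 3)), 0.0, 1.0)

# From one face, generate many distorted training images -- echoing the
# "hundreds of faces -> millions of distorted images" idea in the quote.
X = np.stack([distort(face, rng) for _ in range(500)])

# Tiny linear autoencoder: compress 64 pixels to 16 latent values and back.
W_enc = rng.normal(0.0, 0.1, (64, 16))
W_dec = rng.normal(0.0, 0.1, (16, 64))

initial_err = np.mean((X @ W_enc @ W_dec - X) ** 2)

lr = 0.01
for _ in range(300):
    Z = X @ W_enc                 # encode the distorted images
    Y = Z @ W_dec                 # decode: try to reproduce the input
    err = Y - X
    # Plain gradient descent on mean squared reconstruction error.
    W_dec -= lr * Z.T @ err / len(X)
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)

final_err = np.mean((X @ W_enc @ W_dec - X) ** 2)

# Feed in a face the network has never seen: it treats it as "just another
# distorted image" and reconstructs something resembling the training face.
other_face = rng.random(64)
swapped = np.clip(other_face @ W_enc @ W_dec, 0.0, 1.0)
```

The network only ever learns to reproduce variations of the training face, so any unfamiliar input gets pulled toward it – which is the whole trick behind the swap.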
What some people may find a little concerning is that Deepfakes isn’t a professional researcher – he’s just a programmer with an interest in machine learning. Anyone with a grasp of programming, recent computer hardware, and access to social media platforms, where people constantly upload photos of themselves, could hypothetically generate fake videos. The technology can be used for good or ill, and the ethical implications are deep.
Motherboard asked Deepfakes what he considered the ethical implications of this technology when used to create non-consensual videos, like revenge porn:
“Every technology can be used with bad motivations, and it’s impossible to stop that… The main difference is how easy [it is] to do that by everyone. I don’t think it’s a bad thing for more average people [to] engage in machine learning research.”