In Depth

AI-generated fake celebrity porn takes over Reddit

App face-swaps movie stars and X-rated actors - but is it legal?

Artificial intelligence (AI) is being used to create fake celebrity pornography videos by placing the faces of movie stars onto the bodies of porn performers.  

The trend was kick-started in December when a Reddit user by the name of deepfakes posted mocked-up celebrity porn videos made using AI-assisted editing software, reports The Verge.

According to the website, other Reddit users are now employing a growing range of “easy-to-use” editing software to create their own face-swapped sex films and are posting them to deepfakes’ chat page, which has more than 15,000 subscribers.

Wonder Woman star Gal Gadot, Taylor Swift, Scarlett Johansson, and Game of Thrones actor Maisie Williams are among those who have been featured in the X-rated clips.

Most of the editing apps employ machine learning, which uses photographs to create human masks that are then overlaid on top of adult film footage, says Motherboard.

“All the tools one needs to make these videos are free,” the website says. The apps also come with “instructions that walk novices through the process”.

Is it legal?

No, says The Sun, since the fake porn videos are created without the consent of the celebs featured in them. 

Andrew Murray, a professor of law at the London School of Economics, told the newspaper: “To put the face of an identifiable person onto images of others, and then sharing them publicly, is a breach of Data Protection Law.”

Murray says that stars could sue the creators of fake porn for defamation if the videos are “received as genuine images and the celebrity, as a result, is viewed less favourably by members of society”.

The videos could also be seen as a form of harassment, he told The Sun, and celebrities could report them to the police.

Questioning reality  

The ease with which plausible fake videos can be made is causing widespread concern, with fears that it heralds an era when “even the basic reality of recorded film, image or sound can’t be trusted”, reports The Guardian.

Mandy Jenkins, from social news company Storyful, told the newspaper: “We already see it doesn’t even take doctored audio or video to make people believe something that isn’t true.”

Reddit user deepfakes has told Motherboard that the technology is still in its infancy.

Deepfakes said they intended to keep improving the porn-creation software so that users can “simply select a video on their computer” and swap the performer’s face with a different person “with the press of one button”.
