Daisy Ridley
Actress Daisy Ridley is among the stars whose faces have been superimposed onto porn performers' bodies. Christopher Polk/Getty Images

KEY POINTS

  • The app uses machine learning to superimpose faces onto adult films.
  • Experts warn that celebrities could sue for defamation.

Celebrities including Star Wars actress Daisy Ridley and Wonder Woman star Gal Gadot have had their faces superimposed onto the bodies of porn actresses thanks to a new app whose results experts warn are startlingly realistic.

In a growing trend known as "deepfakes", artificial intelligence is used to take a person's image and swap it with that of a porn performer. The result is realistic footage appearing to show the person carrying out sex acts, without their permission. The underlying technology is similar to that recently rolled out by Apple, which allows a user to control the face of emoji characters.
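The core idea commonly described for deepfakes is a single shared encoder that learns a pose-and-expression code, paired with a separate decoder trained for each identity; swapping a face means encoding a frame of person A and decoding it with person B's decoder. The toy below is a deliberately simplified linear sketch of that architecture, not FakeApp's actual code (real systems use deep convolutional networks trained on thousands of aligned face crops); the "faces" here are just vectors along per-identity directions, scaled by a shared "expression" value.

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 8  # toy "image" size

# Two identities: each renders the same underlying "expression" value s
# along its own fixed direction (orthogonal, to keep the sketch simple).
w_a = np.zeros(dim); w_a[0] = 1.0
w_b = np.zeros(dim); w_b[1] = 1.0

s = rng.uniform(-1.0, 1.0, (400, 1))  # shared expression samples
faces_a = s @ w_a[None, :]            # identity A's "frames"
faces_b = s @ w_b[None, :]            # identity B's "frames"

# Shared encoder (image -> 1-D latent) and one decoder per identity.
enc = rng.normal(0.0, 0.5, (dim, 1))
dec_a = rng.normal(0.0, 0.5, (1, dim))
dec_b = rng.normal(0.0, 0.5, (1, dim))

# Train both autoencoder branches by gradient descent on squared
# reconstruction error; the encoder is updated by both identities.
lr = 0.2
for _ in range(3000):
    for X, dec in ((faces_a, dec_a), (faces_b, dec_b)):
        z = X @ enc                      # encode
        err = z @ dec - X                # reconstruction error
        g_dec = z.T @ err / len(X)       # gradient w.r.t. this decoder
        g_enc = X.T @ (err @ dec.T) / len(X)  # gradient w.r.t. shared encoder
        dec -= lr * g_dec
        enc -= lr * g_enc

# "Face swap": encode a frame of identity A, decode with B's decoder.
frame_a = 0.7 * w_a
swapped = (frame_a @ enc) @ dec_b

# The swapped output should lie along B's direction, with A's direction gone.
print("swap direction:", np.round(swapped / np.linalg.norm(swapped), 2))
```

The bottleneck is what makes the trick work: because both identities pass through the same latent code, the decoder for B learns to repaint any pose or expression in B's likeness, which is why the per-identity training footage matters far more than the source video.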

The technique is named after the Reddit user "deepfakes", who is believed to be the first to use machine learning to face-swap celebrities into porn with any degree of realism. In December, the user created a subreddit named after themselves, which has amassed almost 30,000 followers in two months.

Some of the most popular videos on the forum include one that appears to show singer Katy Perry performing a solo sex act, while another features Taylor Swift's head superimposed on the body of adult actress Krystal Boyd.

Another Redditor, "deepfakeapp", created a user-friendly app called FakeApp based on deepfakes' algorithm, Motherboard reported.

He told the website that he wants to hone it to the point where users can select a video and swap in a face from a public library "with the press of one button".

Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, told Motherboard last year that users can tell the images are fake because the faces are not matched up seamlessly. But he believes faultless, easy-to-use technology will be "widespread" in a year or two.

Deborah Johnson, Professor Emeritus of Applied Ethics at the University of Virginia's school of engineering, told the website that this will destabilise the "whole business of trust and reliability" as it becomes impossible to distinguish between real and fake porn.

Responding to concerns that this technology is unethical and could be used for revenge porn and other nefarious purposes, a user named self.deepfakes wrote on Reddit: "To those who condemn the practices of this community, we sympathise with you. What we do here isn't wholesome or honorable, it's derogatory, vulgar, and blindsiding to the women that deepfakes works on.

"There is concern that such realistic technology could be used to destroy a person's reputation via fake revenge porn or even forge evidence in court.

"That said, the work that we create here in this community is not with malicious intent. Quite the opposite. We are painting with revolutionary, experimental technology, one that could quite possibly shape the future of media and creative design.

"No matter what happens, this technology was going to become a reality. Nothing could stop that. And ironically enough, the safest hands for it to be in might just be the general public with the power to desensitise it, rather than an exclusive few, with the power to exploit it."

Luke Patel, a partner at Blacks Solicitors in Leeds, told IBTimes UK that the speed with which such technology progresses makes it difficult for laws to keep pace and adequately support victims.

However, he added that celebrities could take legal action if such videos breach data protection law by using their images without consent, and could bring libel claims if the videos are shown to harm their reputation.

"Due to the nature and graphic content of these videos it seems that this would not be too difficult to prove. If the celebrity victim who brought such action was successful he could look to seek damages or injunctive relief to permanently remove these defamatory images from the internet," he said.

"Unfortunately the long arm of the law is just simply not long enough or equipped to deal quickly with such fast changes in the digital world," he said, adding: "The pressure and responsibility to curb such activity can only lie with the internet platforms and more needs to be done to clamp down on them and for them to ensure tighter controls are in place to ensure that their platforms are not used for such illicit activities."