A PORN company said it can now use notorious "deepfakes" editing software to put viewers in scenes.

Naughty America has been working with the artificial intelligence tech to create fabricated sex tapes that look completely realistic.

Earlier this year, The Sun reported on a new app called "deepfakes" that created convincing face-swap videos using machine learning.

By scanning thousands of pictures of a person, it was possible to map their face onto someone else's – with exceptional accuracy.

It soon emerged that pervs were using the tech to put celebrity faces on porn stars during sex scenes.

Now Naughty America says it wants to let randy porn punters put their own faces in adult movies using the same technology.

In a blog post, the company explained how viewers could personalise virtual reality porn videos with their own mugs – or the face of their partners.

"The artificial intelligence and machine-learning technology used for these creations now allows Naughty America to not just customize VR porn videos for consumers, but also to personalize VR experiences for them, too, by adding their own face, or that of a consenting loved one, to a specific fantasy," the XXX film firm explained.

Speaking to Variety, Naughty America CEO Andreas Hronopoulos said: "We see customisation and personalisation as the future."

First, viewers need to send in footage of themselves.

They'll need to perform very specific facial expressions to allow the saucy tech to work its magic.

Then, the viewer's face will be accurately mapped onto an actor's face, creating a convincing – yet fake – porn clip.

This might sound terrifying, especially if someone used your face to create their own fake porn movie.

But Hronopoulos said that although the tech is more advanced, the concept is nothing new.

"It's just editing, that's all it is," he explained.

"People have been editing people's faces on pictures since the internet started."

The Sun first reported on "deepfakes" in January, revealing how the tech could be illegal in the UK.

At the time, Andrew Murray, Professor of Law at the London School of Economics, told us how there are "many forms of legal recourse" available to celebrities targeted by these sex videos.

It all began when a Reddit user called "deepfakes" was found to be swapping celebrity faces – including those of Gal Gadot, Maisie Williams, and Taylor Swift – onto the bodies of porn performers.

The user soon created his own Reddit sub-forum – also called "deepfakes" – where he, and other users, posted their phoney celeb sex videos. The forum has since been banned.

Soon after that, another user called "deepfakeapp" created a downloadable app that uses artificial intelligence and machine learning to produce face-swapped pornos with barely any effort.

The Sun discovered fake sex tapes of top stars like Jennifer Lawrence, Katy Perry, Evangeline Lilly, Daisy Ridley, Blake Lively, Sophie Turner, Kristen Bell, Cara Delevingne, Emma Watson, Natalie Portman, Kate Beckinsale and Jessica Alba – all created using "deepfakes" tech.

Law expert Murray told The Sun: "To put the face of an identifiable person onto images of others and then share them publicly is a breach of Data Protection Law.

"Should the images be received as genuine images and the celebrity, as a result, be viewed less favourably by members of society, then they could sue for defamation if it was shown to have harmed their reputation."

The Sun discovered how some sick pervs were using childhood photos of Emma Watson to create deepfakes – with some pics taken when the Harry Potter star was only 10 years old.

In February, the fake porno clips were taken down from a number of sites, including Reddit, Pornhub and Twitter.

However, it later emerged that these "deepfake" videos were still easily available online.

Speaking to Fast Company, Naughty America's Hronopoulos defended the tech: "Deepfakes don't hurt people. People using deepfakes hurt people."