In recent weeks there has been an explosion in what has become known as deepfakes: pornographic videos manipulated so that the original actress's face is replaced with somebody else's.
As these tools have become more powerful and easier to use, they have enabled the transfer of sexual fantasies from people's imaginations to the internet. The trend tramples not only over the boundaries of human decency, but also over our ability to trust what we see and hear.
Beyond its use for hollow titillation, the sophistication of the technology could bring about serious consequences. The fake news crisis, as we know it today, may be only the beginning.
Several videos have already been made involving President Trump's face, and while they are obvious spoofs it's easy to imagine the effect being produced for propaganda purposes.
As is so often the case, institutions and companies have been caught unaware and unprepared. The websites where this kind of material has begun to proliferate are watching closely - but most are unsure what to do, and nervous about their next steps.
Within communities experimenting with this technique, there is excitement as famous faces suddenly appear in an unlikely "sex tape".
Only rarely do we see flickers of a heavy conscience as they discuss the true effects of what they are doing. Is creating a pornographic movie using someone's face unethical? Does it really matter if it is not real? Is anyone being hurt?
Perhaps they should ask: How does this make the victim feel?
As one user on Reddit put it, "this is turning into an episode of Black Mirror" - a reference to the dystopian science-fiction TV show.
How are deepfakes created?
One piece of software commonly being used to create these videos has, according to its designer, been downloaded more than 100,000 times since being made public less than a month ago.
Doctoring sexually explicit images has been happening for over a century, but the process was often a painstaking one - considerably more so for altering video. Realistic edits required Hollywood-esque skills and budgets.
But by using machine learning, that editing task has been condensed into three user-friendly steps: Gather a photoset of a person, choose a pornographic video to manipulate, and then just wait. Your computer will do the rest, though it can take more than 40 hours for a short clip.
The most popular deepfakes feature celebrities, but the process works on anyone as long as you can get enough clear pictures of the person - not a particularly difficult task when people post so many selfies on social media.
The technique is drawing attention from all over the world. Recently there has been a spike in searches for "deepfake" from internet users in South Korea - spurred, it can be assumed, by the publication of several manipulated videos depicting 23-year-old K-Pop star Seolhyun.
"This feels like it should be illegal," read a comment from one viewer. "Great work!"
Some celebrities in particular seem to have attracted the most attention from deepfakers.
Anecdotally, the interest seems to be driven by the shock factor: the extent to which a real explicit video involving the subject would create a scandal.
Fakes depicting actress Emma Watson are among the most popular on deepfake communities, alongside those involving Natalie Portman.
But clips have also been made of Michelle Obama, Ivanka Trump and Kate Middleton. Kensington Palace declined to comment on the issue.
A clip featuring Gal Gadot, who played Wonder Woman, was one of the first deepfakes to demonstrate the possibilities of the technology.
An article by technology news site Motherboard predicted it would take a year or so before the technique became automated. It ended up taking just a month.
And as the practice draws more ire, some of the sites facilitating the sharing of such content are considering their options - and taking tentative action.
Gfycat, an image hosting site, has removed posts it identified as being deepfakes - a task likely to become much more difficult in the not-too-distant future.
Reddit, the community website that has emerged as a central hub for sharing, has yet to take any direct action - but the BBC understands it is looking closely at what it could do.
A Google search for specific images can often suggest similar posts due to the way the search engine indexes discussions on Reddit.
Google has in the past altered its search results to make certain types of material more difficult to find - but it is not clear whether it is considering such a step at this early stage. Like the rest of us, these companies are only just becoming aware that this kind of material exists.
In recent years, these sites have wrestled with the problem of so-called "revenge porn", real images posted without the subject's consent as a way to embarrass or intimidate. Deepfakes add a new layer of complexity to what could be used to harass and shame people. A video may not be real - but the psychological damage most certainly would be.
It is a tech journalism cliche to say that one of the biggest drivers of innovation has historically been the porn business - whether it improved video compression, or was instrumental in the success of home video cassettes.
As was the case then, what has begun here with porn could reach into other facets of life.
In a piece for The Outline, journalist Jon Christian sets out a worst-case scenario: that this technology "could down the road be used maliciously to hoax governments and populations, or cause international conflict".
It is not a far-fetched threat. Fake news - whether satirical or malicious - is already shaping global debate and changing opinions, perhaps to the point of swaying elections.
Combine this with advances in audio synthesis from companies such as Adobe, and fakery for both the eyes and the ears becomes possible - tricking even the most astute news watcher.
But for now, it is mostly porn. Those experimenting with this software do not skirt the issue.
"What we do here isn't wholesome or honourable, it's derogatory, vulgar, and blindsiding to the women that deepfakes works on," wrote one user on Reddit, before concocting the laughable suggestion that deepfakes might actually diminish the impact of revenge porn.
"If anything can be real, nothing is real," the user added.
"Even legitimate homemade sex movies used as revenge porn can be waved off as fakes as this system becomes more relevant."
This kind of justification gymnastics is of course designed to protect the mental well-being of those who create this material, rather than those who are featured in it.
But the deepfake community is right about one thing: the technology is here, and there is no going back.