Some might argue that many modern-day innovations are a direct result of people wanting to find better ways of accessing porn. Even the internet, for that matter, is mostly porn. As expected, people eventually got sick of regular porn and wanted to take it up a notch. That directly resulted in the birth of celebrity porn (and loads of other messed-up stuff, but let’s not go there). Armed with some garden-variety porn, video-editing software and a mountain of sexual frustration, the internet found ways to put just about anyone in just about any scene. The problem only got worse after ‘compromising’ photographs of several prominent Hollywood personalities were leaked online, an event known in internet history as The Fappening (I didn’t come up with that).

Initially, one could only find such content on the fringes of the internet: small, private subreddits, shady porn sites and 4chan (surprise, surprise). There was not a lot to go around, considering that it required some degree of expertise in video editing and a lot of patience. But as technology progresses, we get newer and better software. Soon enough, the Steve Jobs of morphed porn entered the scene. Reddit user /u/deepfakes shot to fame after posting several realistic-looking celebrity porn videos. His methods were a secret until now, when he revealed the tricks of his trade and made them open for public use.

A brief history of celebrity porn on Reddit

The r/CelebFakes subreddit was a community devoted to photo manipulations of celebrities, mainly photoshopping them to appear nude. These photos often spread onto porn sites. Thankfully, the subreddit laid down a few ground rules: no celebrities under 18 and no non-celebrities.

Until recently, the top post on r/CelebFakes was a two-year-old video manipulation that spliced an interview with Emma Watson over footage of an adult film actress removing her top. The result was a fairly convincing clip of Watson stripping down in a newsroom. It held the top spot until very recently, when a morphed gif of Maisie Williams (Arya Stark from Game of Thrones) took its place.

For obvious reasons, I won’t be posting the gif here. It doesn’t look very convincing, and one can tell it’s morphed after a point. Make no mistake: the technology is far from perfect, but it leads us down a slippery slope. Is this something we want to perfect?

How it works

The software automates the process of replacing the face in every frame of a video with the one you want. It churns out convincing fake frames in large numbers, which, when stitched back together, form a video. The automation is handled by deep learning using an artificial neural network. Similar techniques have been employed in Hollywood for years, for everything from face swaps to making Carrie Fisher look younger in a Star Wars movie.
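To give a rough sense of what’s going on under the hood: the approach commonly described for these face-swap tools is an autoencoder with a single shared encoder and one decoder per identity. You train the encoder on faces of both people, then push person A’s face through person B’s decoder to get the swap. The sketch below is a minimal, hypothetical PyTorch version of that idea; the framework choice, layer sizes and training loop are my own assumptions for illustration, not the actual tool’s code.

```python
# Concept sketch of the shared-encoder / two-decoder face-swap idea.
# NOT the actual tool's implementation; sizes and details are made up.
import torch
import torch.nn as nn

IMG = 64  # assume 64x64 RGB face crops

def conv_block(cin, cout):
    return nn.Sequential(nn.Conv2d(cin, cout, 4, stride=2, padding=1), nn.LeakyReLU(0.1))

def deconv_block(cin, cout):
    return nn.Sequential(nn.ConvTranspose2d(cin, cout, 4, stride=2, padding=1), nn.ReLU())

# Shared encoder: compresses any face into a small latent vector.
encoder = nn.Sequential(
    conv_block(3, 32), conv_block(32, 64), conv_block(64, 128), conv_block(128, 256),
    nn.Flatten(), nn.Linear(256 * 4 * 4, 512),
)

def make_decoder():
    # Per-identity decoder: reconstructs one specific person's face from the latent.
    return nn.Sequential(
        nn.Linear(512, 256 * 4 * 4), nn.Unflatten(1, (256, 4, 4)),
        deconv_block(256, 128), deconv_block(128, 64), deconv_block(64, 32),
        nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
    )

decoder_a, decoder_b = make_decoder(), make_decoder()

loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=5e-5,
)

def train_step(faces_a, faces_b):
    # During training, each identity is reconstructed through its *own* decoder.
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

def swap_a_to_b(faces_a):
    # The "swap": encode A's face, decode it with B's decoder.
    with torch.no_grad():
        return decoder_b(encoder(faces_a))
```

In a real pipeline, the swapped face would then be blended back into the original frame, and the whole thing repeated for every frame of the video, which is where the hours of GPU time go.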

The process is a bit cumbersome and requires a powerful Nvidia GPU. The tutorial page states that at least 2GB of VRAM is required, but 6GB is recommended for better results. The tool harnesses the CUDA cores found in Nvidia GPUs for all the heavy lifting; AMD GPUs are unfortunately not supported at this point. According to the tutorial page:

"The program requires CUDA 8.0 support, which means you must use a recent NVIDIA graphics card. Furthermore, the computational model requires a minimum amount of graphics card memory. As a rule of thumb, the GTX 1060 with 6GB of VRAM should be able to provide satisfactory baseline performance requiring approximately 10-20 hours of model training. The GTX 1070 or 1080 will provide better performance and faster model training. In general, graphics cards with 2GB of VRAM will barely suffice or not be able to work. We have successfully tested a GTX 1050 with 2GB of VRAM using the settings batch size=32, layers=2, and nodes=32 in FakeApp. Another user has reported success with a 2GB card using batch size=64, layers=1, and nodes=128. Lowering the number of nodes and layers may allow systems with less VRAM to use FakeApp, but the quality of the results may suffer."

After some extensive ‘research’ and a few hours of meddling with the software, I found that getting the output to look realistic is far more difficult than I thought. Despite following the instructions to a tee, the results were always off and looked horrible. However, people who are a lot better at this than I am have managed to get some impressive results.

[Image]

Part of the reason many people end up with sub-par results, I suspect, is underpowered hardware. Yes, you can still use your GTX 1060, but for better results you’re better off firing up the 4x Titan Xps in SLI. It is recommended that you let the software “train” for 15-20 hours for optimal results, but sometimes you still end up with abominations like this.

[Image]

Is 4chan really behind this?

It isn’t wrong to assume that this is one of the many projects perpetrated by the cesspool of the internet, namely the American imageboard 4chan. Anyone who’s been on the internet long enough knows that most bad things start there, including The Fappening. Even if it did originate on 4chan, though, there is no evidence of it (and yes, I did check chanarchives). The subreddits in question somehow survived the wave of bans that shut down the likes of r/fatpeoplehate and r/incels because they were close-knit and didn’t have a lot of members. With this new development and the sheer amount of attention it’s been getting, it is very likely that the subreddits will be shut down, or at least heavily moderated.

Is this legal?

Since all the tools involved in the process are free and open-source, the software itself is, technically, legal. So far, image-hosting sites such as Imgur and Gfycat have done well to keep most deepfake gifs and images off their platforms. Legally speaking, the content does violate DMCA guidelines in the US, as the underlying porn video is the intellectual property of the site that hosted or produced it.

However, clips are now finding their way to porn sites, where they’ve racked up a lot of views. We don’t see any of the porn sites taking the content down anytime soon, given that it only makes their product ‘more appealing to the masses’, for lack of a better phrase. And even though there are regulations in place that can control the distribution of doctored content, they aren’t going to accomplish a whole lot. Just about anyone can download the software, and that is, to put it mildly, terrifying.

Just how slippery is the slope?

Short answer: very. While Reddit users claim that this is merely ‘for the lulz’, the software can wreak havoc in the wrong hands. Fake news is already a problem, and now, as the cherry on top, we have fake videos. Any idiot can generate footage of a person they don’t like for revenge, leverage or lulz. Don’t like your neighbour? Morph his or her face into a porno. The possibilities are endless, and that is not a good thing.

Another major repercussion will be a rise in instances of revenge porn: explicit or incriminating material that’s taken and uploaded to the internet without the consent of the subject or the owner of the material, often by jilted lovers trying to get back at their exes. Even though there are regulations that hold the guilty party accountable for uploading such content, there is potential for some cases to slip through the cracks.

The most frightening part is yet to come: the moment the technology is applied to discredit political opposition or dissent. The repercussions could be disastrous and cause major turmoil among constituents. It is not a new tactic by any means and has been employed several times before. The only thing that changes now is that just about anyone can produce such footage.

Currently, video evidence is treated as all but infallible, both legally and in day-to-day life. That won’t be the case in a few years, when genuine footage can be dismissed as morphed and vice versa, and without the proper resources it’ll be impossible to tell one from the other. I reckon the credibility of video will erode over the years, and we’ll have to figure out a new, tamper-proof method of recording things.

So, what now?

Not much can be done, to be honest. The subreddit already has over 25,000 subscribers, many of whom have very likely used the software and recommended it to friends. The only silver lining here is that it is a bit of a chore to operate, so non-tech-savvy people will likely steer clear. The software is only going to get better with time as more people begin to notice and develop it.

On one hand, machine learning and AI have legitimate uses that could revolutionize the way we live; on the other, we have applications such as this one. Limiting access to machine learning resources might help, but the internet always finds a way. Whenever a new technology has been developed, people have found a way to misuse it. Do we stop researching new things just because they can be misused and go back to the Stone Age? Probably not.

I’m not entirely pessimistic, and I still think we can eke out an iota of good from the whole episode. We might get better laws specific to morphed videos, along with increased awareness about information security among the public. Companies will be driven to put better safeguards in place to ensure that their content isn’t misused. It doesn’t add up to much overall, but it’s a start.