It recently came to public knowledge that pornographic material distributor Pornhub (essentially YouTube for pornography, for those of you pretending not to know what that is) has been using Artificial Intelligence to scan their pornography database. Essentially, they’ve trained an AI whose job it is to sit and watch the latest pornography all day every day.

Nice.

Joking aside, the reason is that the AI is being trained, via machine learning, in the fine art of facial recognition so that it can automatically identify which porn star appears in each video and let users find that performer’s other videos. Imagine ‘Amazon recommends’, but everyone’s naked. So far the AI has watched 50,000 videos during its beta testing, and Pornhub hopes to have it scan the site’s entire porn database by Summer 2018. Eventually, they hope it will be able to recognise and account for hair colour, sex position and even fetishes.
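For the curious, facial recognition systems like this typically boil each detected face down to a numerical ‘embedding’ and then match it against a database of known faces. The sketch below is purely illustrative: the performer names, the tiny four-number vectors and the 0.9 threshold are all made up (a real model works on embeddings of 128 or more dimensions), but the matching logic is the standard nearest-neighbour idea.

```python
import math

# Made-up 4-dimensional "embeddings" standing in for the vectors a real
# face-recognition model would output. Real systems use 128+ dimensions.
KNOWN_PERFORMERS = {
    "performer_a": [0.9, 0.1, 0.3, 0.7],
    "performer_b": [0.2, 0.8, 0.5, 0.1],
}

def cosine_similarity(u, v):
    """How alike two embeddings are: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def identify(face_embedding, threshold=0.9):
    """Return the best-matching known performer, or None if no match
    clears the (illustrative) similarity threshold."""
    best_name, best_score = None, threshold
    for name, reference in KNOWN_PERFORMERS.items():
        score = cosine_similarity(face_embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Train the model on enough labelled faces and ‘scan the whole database’ just means running `identify` over a frame from every video.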

This doesn’t just help lonesome guys and gals find material starring their new favourite performer online so that they can…well…do whatever it is they want to do. It is also hugely helpful in combating piracy.

The adult entertainment industry has a huge copyright and piracy problem. Though ‘porn piracy’ doesn’t sound like a major social issue, its scale actually rivals that of music or film piracy. The porn industry grosses as much as $3 billion per year on the internet alone. That’s before accounting for video, magazine or in-store sales, which in the USA alone rack up an impressive extra $10 billion per year on top of that figure. And with 69% of pay-per-view internet content being pornography, it’s actually very big business.

Global porn revenues have declined 50% in the last decade due to the huge amount of free porn online. Sites like Pornhub, xHamster, Pornmd, YouPorn and others all offer it for free and instead make much of their income from advertising and the sale of data.

So what does this have to do with facial recognition? Well, just as YouTube uses Content ID to keep piracy off its platform, porn sites could verify that the videos posted are owned by the people posting them. That means people will no longer be able to record paid-for porn and then post it on a free site. An AI that can detect these copyright infringements and flag them for removal would be a massive boon for porn producers and the performers they pay.
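The Content ID idea is worth a quick sketch. A real system computes perceptual fingerprints that survive re-encoding and cropping; the toy version below cheats by hashing exact ‘frame’ strings, and the studio name, frame labels and 50% match ratio are all invented for illustration. But the core mechanism is the same: fingerprint an upload, compare against a catalogue of licensed content, and flag it if enough fingerprints match.

```python
import hashlib

def fingerprint(frames, window=3):
    """Hash overlapping windows of (simulated) frame data.
    Real systems use perceptual fingerprints, not exact hashes."""
    prints = set()
    for i in range(len(frames) - window + 1):
        chunk = "|".join(frames[i:i + window]).encode()
        prints.add(hashlib.sha256(chunk).hexdigest())
    return prints

def is_pirated(upload, catalogue, match_ratio=0.5):
    """Return the rights holder if enough of the upload's fingerprints
    match a catalogued video, else None."""
    up = fingerprint(upload)
    for owner, frames in catalogue.items():
        matched = len(up & fingerprint(frames))
        if up and matched / len(up) >= match_ratio:
            return owner
    return None

# Hypothetical catalogue: one licensed video owned by "studio_x".
catalogue = {"studio_x": ["f1", "f2", "f3", "f4", "f5"]}
```

An upload that reproduces the catalogued frames gets traced back to its owner; an unrelated upload sails through. Facial recognition adds another signal on top of this, since the performers in a ripped video identify its likely source.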

So why has this caused widespread uproar throughout the industry and beyond?

It has been claimed that this technology could ruin lives – especially in relation to ‘revenge porn’. Revenge porn is when someone, usually a scorned ex-partner, posts a pornographic video or image online with the sole intention of embarrassing or hurting the person in it.

Typically this happens after a break-up, when intimate images from the relationship are shared on platforms like Facebook, Twitter or Instagram, the very places where the victim’s family and friends can see them, maximising the damage. A piece of facial recognition software could be used to identify adult entertainers, find their Facebook or Twitter accounts and have their private videos posted where their families can see them.
Needless to say, many amateur porn performers keep this side of their lives very private, and their families tend not to know about it. It is easy to see why this could be so embarrassing.

Porn still has a massive social stigma attached. People who produce or consume it can often face accusations of moral bankruptcy. But the fact is, those who consume pornography are actually more likely to be blood and organ donors, to engage in volunteer activities and to participate in community projects.

On this front, Pornhub have already stated that they will only use this technology on professional porn stars. How professionals will be distinguished from amateurs is so far unclear.

Despite this, many claim that professional porn stars have a right to privacy, and that allowing people to trace their real names and identities is unethical and morally unjustifiable. And this is where they begin to lose touch a bit with reality.

Someone who makes their living by showing their face on video and sharing that video with millions of people realistically forfeits their right to anonymity.

It comes with the territory. Imagine a pop star or an actor declaring that though they want to sing to a sell-out crowd or become a key character in Game of Thrones, they don’t want anyone to know who they are.
It’s not only pretty unjustifiable for a public figure who makes money from their fame to seek total secrecy and anonymity; it’s also almost unenforceable, in both law and practice.

If tomorrow I were to make a video of myself fully nude explaining the plot of ‘Seven Samurai’ (one of my favourite movies), I would be totally free to do so. That is my right. However, once I upload that content to a public place, I do so on the understanding that some day the video might come to light, and people I know might see it.

Am I saying that the ‘average Joe’ or ‘average Jane’ does not have a right to privacy? Not at all. It should be illegal (and in most places is) for someone to share intimate videos of people that were never made to be shared.
But for those who produce content for the sole purpose of sharing it far and wide in exchange for money? It is neither reasonable nor realistic for them to expect to be exempted from the onward march of technology on the basis that they’ve changed their mind.

That said, there should definitely be a way of grouping porn actors under their ‘porn name’ rather than their real one. If only so an elderly relative Googling their granddaughter’s address doesn’t accidentally give themselves a heart attack if their safe search is off.