Current AI Is Built Against Porn, Not For It
We all use social media and various online platforms, but we rarely stop to think about how these platforms are moderated, especially when it comes to porn. After all, it’s not a sexy thing to think about.
When it comes to social media, most platforms use a mixture of AI and human moderation to keep things ‘clean’. Companies such as Meta have been making a big public push towards fully automated moderation, for a couple of reasons: scalability, and the public scrutiny they have received over the toll moderation takes on moderators’ mental health. AI moderation is often presented in the press as the solution to all of these challenges, one that will eventually eliminate the need for human moderators. That is currently far from the case, and further still when it comes to pornography, because AI for this industry is even more underdeveloped.
Built for or against porn?
Most current AI models are pretty good at detecting nudity. But when it comes to moderating a porn site, flagging nudity isn’t going to help you. In fact, the majority of AI models on the market are built against porn, not for it. They don’t understand the nuances of porn: what kinds of content actually appear on these sites, what is acceptable, and what is potentially harmful. So porn sites still rely heavily on human moderators, who are the only ones able to make a proper call.
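To make that concrete, here is a minimal sketch of why a generic nudity filter adds no signal on an adult platform. The scores and threshold are hypothetical, not taken from any real model:

```python
# Hypothetical nudity scores from a generic classifier (0.0 to 1.0).
# On a mainstream platform these would vary widely; on a porn site
# nearly every upload scores high, so the flag carries no information.
uploads = {
    "clip_001": 0.97,
    "clip_002": 0.93,
    "clip_003": 0.98,
}

NUDITY_THRESHOLD = 0.8  # a typical "flag for review" cutoff

flagged = [vid for vid, score in uploads.items() if score > NUDITY_THRESHOLD]
print(f"Flagged {len(flagged)} of {len(uploads)} uploads")
# Flagged 3 of 3 uploads: the entire catalog lands in the review queue.
```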
Humans can barely understand consent; how can AI?
Even though it would be nice for consent to be a basic, easy-to-understand concept, it is still heavily argued about in our society. Only recently has the mainstream begun to widely accept that consent can be withdrawn, and plenty of people still struggle with even that. If humans still struggle to understand sexual consent, how on earth is an AI model going to?
Context is everything, especially when it comes to porn
Given the challenges around sex and all the grey areas it brings, context is everything. Some videos can appear quite rough, and it would be all too easy for AI to flag one as violence and take it down. A human moderator, however, can (most of the time) tell the difference between actual sexual violence and a rough sex video. Rough sex and sexual violence are two very different things.
A good example of context is fake blood. AI would flag a video featuring someone covered in blood, but a human moderator would understand from context that the blood is fake and being poured on deliberately. This is just one of many niche scenarios that play out on porn sites.
Flagging, not deciding
AI has certainly been advancing in recent years, but it’s not quite there yet when it comes to moderation, and especially not for video. Most current models are built to flag content and place it in a moderation queue for a human; very few can actually review content and make a decision on it without human intervention.
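As a rough sketch of what that flag-then-queue pipeline looks like in practice (the labels, threshold, and function names here are illustrative assumptions, not any vendor’s actual API):

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class ModerationResult:
    video_id: str
    label: str        # e.g. "violence", "nudity", "ok"
    confidence: float

human_review_queue: "Queue[ModerationResult]" = Queue()

def triage(result: ModerationResult) -> str:
    """The model flags; a human decides. Only confident 'ok' results skip review."""
    if result.label == "ok" and result.confidence > 0.95:
        return "published"
    # Everything else, including rough scenes misread as violence,
    # goes to a human moderator, who makes the actual decision.
    human_review_queue.put(result)
    return "queued_for_human"

print(triage(ModerationResult("clip_042", "violence", 0.71)))  # queued_for_human
```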
False positives/negatives
Very few companies have been able to train their models against databases of illegal content, so precision rates are often low and inaccuracies common. Similarly, some of the ‘best’ AI models on the market still struggle to identify things such as hate speech, and so they flag a large number of false positives. That sends a stream of content to human moderators that should never have been flagged, which often increases the moderators’ workload rather than reducing it. This is a serious problem for social media, and an even worse one for porn.
AI models built to fight hate speech would probably flag most pornographic content. Just think about the words used in porn: slut, whore. In a consensual context these words are often perfectly acceptable, yet an AI model would flag most of the content on the site, creating an endless moderation queue for the human moderator and relieving no pressure at all.
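The arithmetic behind that endless queue is worth spelling out. The numbers below are illustrative assumptions, not measured rates, but they show how a flagger that looks reasonably accurate buries moderators when almost all content uses ‘flaggable’ language consensually:

```python
# Illustrative numbers, not measured rates.
daily_uploads = 10_000
truly_violating = 0.01          # 1% of uploads actually break the rules
flag_rate_on_violations = 0.95  # the model catches most real violations
flag_rate_on_consensual = 0.40  # ...but also flags 40% of consensual content
                                #    that merely uses words like "slut"

true_positives = daily_uploads * truly_violating * flag_rate_on_violations
false_positives = daily_uploads * (1 - truly_violating) * flag_rate_on_consensual

queue_size = true_positives + false_positives
precision = true_positives / queue_size

print(f"Daily review queue: {queue_size:.0f} items")  # ~4055 items
print(f"Precision: {precision:.1%}")                  # ~2.3% are real violations
```

In other words, a human moderator reviewing that queue would find a genuine violation in roughly one of every forty items.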
Fragmentation
Very few AI models on the market can handle image, video, text, and audio moderation all at a high precision rate, so many companies are stuck stitching together multiple providers rather than using an all-in-one solution.
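In practice that fragmentation looks something like the sketch below: a thin in-house layer routing each media type to a different vendor. The provider names and interface are hypothetical, purely to illustrate the glue code an all-in-one model would make unnecessary:

```python
from typing import Protocol

class ModerationProvider(Protocol):
    """The interface each vendor wrapper has to satisfy."""
    def moderate(self, content: bytes) -> dict: ...

class ImageVendorA:
    def moderate(self, content: bytes) -> dict:
        return {"flagged": False}  # would call vendor A's image API here

class VideoVendorB:
    def moderate(self, content: bytes) -> dict:
        return {"flagged": False}  # would call vendor B's video API here

# ...plus separate vendors for text and audio: four contracts,
# four APIs, four billing relationships, four sets of failure modes.
PROVIDERS: dict[str, ModerationProvider] = {
    "image": ImageVendorA(),
    "video": VideoVendorB(),
}

def moderate(media_type: str, content: bytes) -> dict:
    return PROVIDERS[media_type].moderate(content)
```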
Overall, it would be great if an AI company trained a model to properly moderate porn; it would relieve real pressure from human moderators. Let’s face it: watching porn all day and viewing harmful sexual content isn’t good for the psyche. I used to do it, and it certainly catches up with you. But the porn industry is still regarded as shady and problematic, so most companies continue to build against it rather than for it.
If you think about it, there is plenty of material out there to train a model on. But it would need to be done with a deep understanding of the porn industry, from the business side as well as the creator side, to produce a model that helps the industry rather than unfairly censoring it or creating more work.