Human moderators are responsible for keeping us safe online, and yet we know very little about them or how their job works. Indeed, before starting my company I don’t think I had ever really thought about moderators. I knew that they existed, but I had very little idea of what they actually did or how it worked.
When I started Freyja, I knew that we would need moderation because we allowed third-party pornographic content. However, I had no clue where to start. I spoke to a couple of my mentors, who recommended that I watch a documentary on Facebook moderators called ‘The Cleaners’. If you haven’t seen it, I highly recommend it. It shows how many companies use external moderation companies in countries such as the Philippines, and walks you through the day-to-day life of a moderator, the type of content they have to see and the effects this can have on them. After watching it, I was amazed that the effects on moderators, and the way companies moderate, have not been questioned more publicly. Of course, AI can be a way to reduce human moderators’ exposure to harmful content, but it’s just not that simple yet. AI can assist with moderation, but it is not yet good enough when it comes to video, and moderation is also a very subjective and circumstantial thing, which is why human moderators can often be better than AI. Too often, on platforms that rely too heavily on AI, I have seen extreme cases of unfair censorship. The reality is that moderation and guidelines are philosophical matters, which is why the work is best done by humans and AI combined.
When it came to my company and our approach to moderation, I knew that I wanted to do things differently from mainstream social media and adult sites. A couple of things were particularly important to me:
We wanted to make Freyja the safest possible social media site out there. That is why we designed a pre-moderation system that prevented harmful content from ever going live.
I wanted to use a moderation company that provided support to moderators, and I wanted to be transparent about our moderation approach and our treatment of moderators.
I wanted to approach our moderation guidelines philosophically and democratically, in order to avoid unfair censorship.
I wanted hands-on experience as a moderator, because I felt I could not be making the ‘rules’ of the platform without understanding what actually happens, if I wanted to make things better for moderators and our creators.
We had been working on our guidelines for months before we launched. This was an incredibly difficult process, because we were very aware of the gravity of what we were doing. Make one mistake with the guidelines, miss something out, and you suddenly leave the door open for harmful content. We found ourselves trying to think of every possible thing that could go wrong on our site. Similarly, it was very challenging to know where to draw the line on grey areas, and not to let personal bias enter those decisions. When writing the guidelines it became clear to us that this would be a forever-evolving process, and one that we wanted our creators and users to be involved in, to keep things democratic. Once we finally had our guidelines, we had to make a training presentation for the moderators explaining what was allowed and what was not. This was a very interesting experience, as you literally had to spell out things that seemed simple; from a moderation standpoint, things really do need to be spelled out clearly so that mistakes are not made. See below for an example from the presentation:
Now, you may look at the above image and think it’s comical, but that’s not really the point here. The point is that you really have to spell out the seemingly simplest of things when training moderators so that nothing gets confused. For instance, if this example were not spelled out, we would run the risk of a moderator unfairly censoring the person on the right-hand side. Similarly, the same goes for ‘advanced weaponry’. Most companies list ‘advanced weaponry’ as something that is banned. However, when it comes to porn sites that blanket statement does not work, because what about BDSM or sex toys? Therefore, I had to make an 80-page PowerPoint on what counted as ‘approved’ BDSM weaponry, to avoid confusion and accidental censorship.
Moreover, when I said that I wanted to take on a set number of moderation hours a month, a lot of people looked at me as though I was crazy, because that’s not exactly something most CEOs would do. But how on earth should I be allowed to make guidelines that affect creators and moderators if I have no understanding of the content that moderators actually handle?
Obviously, I have not worked full-time as a moderator and so cannot claim to fully understand that experience. But what was it like for me, being a moderator for a porn site?
It’s important to understand what content goes to the moderator. On most sites the moderator will be sent whatever is reported, or random content to sweep-check. On Freyja, by contrast, we had a pre-moderation system: the AI would pick up any nude content, all of which had to be moderated before it could be posted. And so, we had to look at everything.
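Purely as an illustration, the gate worked roughly like the sketch below. The classifier call, queue and publish step are assumptions made for this example, not a description of our actual stack.

```python
# Hypothetical sketch of a pre-moderation gate: anything the nudity
# classifier flags is held for human review instead of going live.
def handle_upload(content, nudity_classifier, priority_queue, publish):
    if nudity_classifier(content):       # AI picks up nude content
        priority_queue.append(content)   # held until a moderator reviews it
    else:
        publish(content)                 # everything else can go live
```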
The moderation system was made up of two queues: priority and escalation. Content would automatically go into the priority queue. When a creator’s content comes up, you are there to check three things:
Does the KYC (ID) match this creator? In other words, is it the right person?
Is there anyone else in this content? If so, have they been tagged, and do you have their consent and ID?
Does this content break any guidelines?
Based on that, you can then approve, reject or escalate the content. Content would be escalated if it was in a grey area, and you wanted a second opinion on it.
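To make those checks concrete, here is a minimal sketch of that decision flow. The field names and the moderate() function are purely illustrative assumptions for this post, not Freyja’s actual code.

```python
# Hypothetical sketch of the approve / reject / escalate decision for one
# item in the priority queue. Field names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ThirdParty:
    tagged: bool
    consented: bool
    id_verified: bool

@dataclass
class ContentItem:
    kyc_matches_creator: bool                                       # check 1
    third_parties: List[ThirdParty] = field(default_factory=list)   # check 2
    breaks_guidelines: bool = False                                  # check 3
    grey_area: bool = False          # unclear cases get a second opinion

def moderate(item: ContentItem) -> str:
    """Return 'approve', 'reject' or 'escalate' for one queued item."""
    # Check 1: the KYC/ID must match the uploading creator.
    if not item.kyc_matches_creator:
        return "reject"
    # Check 2: everyone else in the content must be tagged, consenting and verified.
    for person in item.third_parties:
        if not (person.tagged and person.consented and person.id_verified):
            return "reject"
    # Check 3: clear guideline breaches are rejected outright.
    if item.breaks_guidelines:
        return "reject"
    # Grey areas go to the escalation queue for a second opinion.
    if item.grey_area:
        return "escalate"
    return "approve"
```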
Here are examples of the most common escalation cases:
The main thing that occurred was creators uploading content without having done their KYC verification, which meant it had to be rejected. This was always an easy one to resolve, though, as you would just reach out to the creator and ask them to verify. However, I always struggled to understand why this was such a recurring issue when these were creators who had been in the industry for many years and knew they needed to verify on any site.
Creators wouldn’t tag their third party, which meant the content had to be rejected.
The unverified dick. This was a daily occurrence as a moderator. A user would want to send their dick to a creator, but it would first go to moderators, like me, who would then have to ask them to please ID their dick. In a way, that ID request was the best way to stop unsolicited dick pics…
The grey area. There were some videos that definitely crossed into the grey area, where you couldn’t quite work out what was going on or whether it was okay, such as extreme BDSM bruising or, in one case, someone who appeared to be covered in blood. Quite often, grey-area cases were cleared up once you spoke with the creator. However, there were certainly a couple of these videos that were hard to have to watch.
The thing I was most grateful for with our system was that the content was not yet live, which meant that when something was unclear, we had the chance to have an open dialogue with creators and users rather than just censoring them unfairly. The main thing I learnt from working as a moderator was that context is everything: even if you have rough guidelines, most things are taken case by case, and you just have to use your common sense and ask yourself, does this seem okay? If in doubt, talk and escalate.
How much content I had to moderate varied depending on the time of day. Most creators would upload a whole month's worth of content in bulk and then wouldn’t upload again for weeks. This meant that sometimes you’d only have a couple of dozen items in the moderation queue, and other times it would be hundreds or even thousands. It wasn’t very predictable. The majority of content came in during the early hours of the morning or in the middle of the night. I had system alerts on for whenever an item entered the queue, which meant you always felt on high alert.
Something important to talk about is the effect this has on a moderator. I can only speak to my own experience, and I can definitely say that it affected me in some shape or form. At first I thought that it wouldn’t; I thought I could separate it from my personal life. For the first couple of weeks of moderating I did, although it was quite a shock to have to look at so much adult content. However, as time went on I became incredibly desensitised to it, which is probably not a good thing. I noticed that if I spent too long thinking about guidelines or moderating, my mood would massively dip and I would feel down. Similarly, I certainly saw how it affected my own personal love life, and I felt more closed off from people. It was a very lonely task, with few people having had a similar experience.
My relationship with moderation is love-hate. I find moderation fascinating, and I am passionate about how companies can make it a safer and yet more democratic process. At the same time, I know that it can’t all have been good for me. In terms of how to reduce these effects on moderators, I’m not really sure what the solution is. Some say AI is; however, I fear it does not understand some of the context and nuances of content, especially adult content. Most moderation systems are currently built to ban adult content rather than allow it, so there is a large knowledge gap there. What my experience taught me is that we can’t eliminate the human from moderation and guidelines. We need humanity, but it needs to work in conjunction with tech, and there need to be mental health support and hour caps for moderators. I hope that more companies will enter the AI space focusing specifically on moderation for adult companies, as that niche focus is needed to advance this type of moderation tech. A general moderation solution built for mainstream social media platforms will not have the same effect on an adult entertainment platform. Lastly, we as consumers need to think more about the platforms we use and their treatment of moderators, and push for transparency, so that we can consume content more ethically.