Content Moderation — Critical Power

Alix Gallardo
3 min read · Dec 14, 2021

Case Study at a Gaming Community

What is content moderation?

“The governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse.” — James Grimmelmann, “The Virtues of Moderation,” Yale J.L. & Tech. 42 (2015)

Why is it necessary?

A content moderation model is created to make sure that all content is fit for general public consumption and does not violate the community standards, site guidelines, or policies.

User-generated content (UGC) can fuel the life of your product indefinitely. As a community or company, you can drive revenue, build brand thought leadership and loyalty, and offer safe user experiences and free expression by implementing a strong UGC moderation model: one that keeps out child sexual exploitation material, hate speech, terror propaganda, harassment, graphic violence, and anything else that violates the community standards, site guidelines, or policies.

There are three primary approaches to content moderation:

  • Manual content moderation
  • Automated content moderation
  • Hybrid content moderation

There are also five moderation types:

  • Pre-moderation (human moderators; high control, but a delay on postings)
  • Post-moderation (human moderators unpublish content after it goes live; volume issues and risk for companies)
  • Reactive moderation (community responsibility; scalable growth)
  • Distributed moderation (rating systems with human moderators)
  • Automated moderation (computer vision, natural language processing, and AI, with human support moderation)

This case study covers reactive moderation of different content types (images, video, and text) in a hybrid approach. With this model we were able to scale community growth.
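As a rough illustration of what a hybrid approach can look like, here is a minimal sketch of a pipeline where an automated classifier handles the clear-cut cases and a human moderator reviews the uncertain ones. The `classify` function, thresholds, and review queue are assumptions for illustration, not this community’s actual stack.

```python
# Illustrative hybrid moderation pipeline: automated scoring for clear cases,
# human review for everything in between. All names and thresholds here are
# assumptions, not the community's real system.

def moderate(item, classify, human_review_queue,
             block_threshold=0.9, allow_threshold=0.2):
    """Return 'blocked', 'published', or 'pending_review' for a piece of UGC."""
    score = classify(item)           # estimated probability the item violates policy
    if score >= block_threshold:     # confident violation: block automatically
        return "blocked"
    if score <= allow_threshold:     # confident it is fine: publish immediately
        return "published"
    human_review_queue.append(item)  # uncertain: escalate to a human moderator
    return "pending_review"

# Example usage with a stubbed classifier
queue = []
stub_classifier = lambda text: 0.95 if "spam" in text else 0.5
print(moderate("buy spam now", stub_classifier, queue))   # blocked
print(moderate("hello friends", stub_classifier, queue))  # pending_review
```

The point of the split thresholds is that automation only acts where it is confident in either direction; everything ambiguous stays with human moderators.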

Some context about the gaming community:

  • Latin role-play gaming community for desktop.
  • 6,000 daily active users.
  • 150–200 reports per day.
  • The reactive hybrid moderation model ran from 2019 to 2021.

About the users:

  • Between 18 and 30 years old.
  • Latino or Hispanic origin.
  • Between 2 and 3 hours of playing time per user per day.

Challenge: Democratization. How could we democratize this process?

We designed a reactive moderation model in order to keep the community healthy. Everyone has to be proactive and participate in reporting inappropriate content. Here is how we answered our first question: how could we democratize this reactive moderation model?

  1. Consistency and Transparency — Rules are structured to reduce bias and subjectivity so that users can make consistent judgements on each case. As a role-play gaming community, we took a decentralized approach to content moderation that empowers users to manage their own speech, helps democratize expression, and enables localized and diverse viewpoints. Users keep a record with their own evidence (photos, in-game conversations, and/or videos). If they don’t have evidence, they don’t have the right to submit a report, which saves time for both parties, User A and User B.
  2. Reach an agreement — A 6-hour time frame, with push/email notifications, enables a conversation between both parties. We promote communication so they can look for an agreement before the case becomes a “Public Report” and goes to the voting process.
  3. Accuracy — Quality user profiles. The Public Report is shared only with quality users, which for us means users with all of these characteristics: more than 90 days as a registered user, 30 days as an active user, a good reputation level (no past reports, no sanctions), and ownership of an in-game house or business.
  4. Democratization — The community gets involved in the process, and the decision is backed by a 50+1 majority vote. Voters are random, pre-filtered users, so there is no manipulation, favoritism, or relationship between the voters and the users involved in the report (no nexus: those who vote are not related to those being judged). A minimal sketch of this filter-and-vote logic follows this list.
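To make the mechanics of steps 3 and 4 concrete, here is a minimal sketch in Python of the quality-user filter and the 50+1 vote resolution. The class, field names, and exact thresholds are assumptions drawn from the rules above, not the community’s actual implementation.

```python
from dataclasses import dataclass
import random

# Hypothetical user record; the field names mirror the rules above but are
# assumptions, not the community's real data model.
@dataclass
class User:
    user_id: int
    registered_days: int                  # days since registration
    active_days: int                      # days active as a user
    past_reports: int                     # reports previously upheld against this user
    sanctions: int                        # sanctions received
    owns_property: bool                   # owns an in-game house or business
    related_to: frozenset = frozenset()   # user_ids of friends/relations

def is_quality_user(user: User) -> bool:
    """Quality-profile filter: 90+ days registered, 30 days active,
    clean reputation, and an in-game house or business owner."""
    return (user.registered_days > 90
            and user.active_days >= 30
            and user.past_reports == 0
            and user.sanctions == 0
            and user.owns_property)

def pick_voters(candidates, parties, pool_size):
    """Randomly select pre-filtered voters with no nexus to either party."""
    party_ids = {p.user_id for p in parties}
    eligible = [u for u in candidates
                if is_quality_user(u)
                and u.user_id not in party_ids
                and not (u.related_to & party_ids)]
    return random.sample(eligible, min(pool_size, len(eligible)))

def resolve_report(votes):
    """50+1 rule: the report is upheld only with a strict majority of 'yes' votes."""
    upheld = sum(1 for v in votes if v)
    return upheld > len(votes) / 2
```

The key design choice is that eligibility and nexus are checked before voters are sampled, so favoritism is ruled out structurally rather than audited after the fact.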

Results:

  • Reports were reduced from 200 to 30 per day.
  • Admins do not intervene in sanction decisions that come from user reports, which avoids manipulation and keeps resentment away from admins.
  • Scalable structure.
  • The process promotes transparency.
