How to Develop Robust Moderation Methodology

Moderation, at its core, is about ensuring that content published on a site, typically submitted by the site's users themselves, complies with the site's Terms of Service (ToS). This function is all too often seen as an analog task: groups of moderators sit at terminals, clearing content submission queues by asking simple yes/no questions like "Is this porn? Is this hate speech? Does this content contain personally identifiable data?"

The problem with approaching moderation as an analog, queue-clearing activity is that it simply doesn't scale. What if you have hundreds of thousands of pieces of content being submitted every minute? (YouTube, for example, has 24 hours of video uploaded every minute!) Your company simply won't be able to afford the sheer number of bodies needed to clear those queues.

Therefore, it's important to stop thinking about "moderation" as that analog activity of reacting to full queues and instead look at the primary objective moderation is trying to fulfill: an overall reduction in inappropriate content being published. That means reacting to bad content that's been submitted, but it also means reducing the amount of negative content created in the first place. It's about making your moderators smarter as well as making your community more active. The goal is not to optimize moderation clear rates any more than the goal of customer service is to reduce call times.

No, the goal is to improve overall community health by providing a positive, safe environment using every method you can. To that end, here are ten methods that you should be applying as you develop any community. The question isn't which of these methods to apply; the question is in what ratio to apply each.

Governance
The starting point, and all too often the stopping point, in preparing moderation processes is the governance piece: Terms of Service, Community Guidelines, and other formal documents meant to define "appropriate behavior".

Alison Michalk has a great overview of how to approach the creation of such documents. By far, my favorite governance example is Flickr’s Community Guidelines. Fun, clear, and shareable.

Engagement
General community management practices: the development of culture, encouraging positive and discouraging negative activities, and participation from the company.

Engagement is most commonly discussed in the context of message boards/forums, but it is also highly effective, though implemented differently, at all three levels of your Presence Framework.

As I've written about before, the moderation activity can, and should, be used to help build your community's culture.

Processes
Community moderation activities – generic approval processes, ranging from a single individual to multiple levels of approval.

This is where traditional in-sourced or out-sourced vendor moderation processes come into play: the straightforward concept of human approval before or after content is posted.
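As a rough sketch of that concept (the class and field names here are mine, not from any vendor's tooling), pre-moderation boils down to content that cannot publish until its approval chain completes:

    from dataclasses import dataclass
    from enum import Enum

    class Status(Enum):
        PENDING = "pending"
        APPROVED = "approved"
        REJECTED = "rejected"

    @dataclass
    class Submission:
        content: str
        approvals_needed: int = 1   # 1 = single moderator; >1 = multi-level approval
        approvals: int = 0
        status: Status = Status.PENDING

        def approve(self) -> None:
            """Record one moderator's sign-off; publish once the chain completes."""
            if self.status is not Status.PENDING:
                return
            self.approvals += 1
            if self.approvals >= self.approvals_needed:
                self.status = Status.APPROVED

        def reject(self) -> None:
            """Any moderator in the chain can reject outright."""
            if self.status is Status.PENDING:
                self.status = Status.REJECTED

Post-moderation is the same structure with the content visible while its status is still pending; the workflow, not the code, is the hard part.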

Positioning
Moderation is as much about providing a sense of security and safety as it is about simply deleting inappropriate content. Social experiences have a culture, and when that culture is one of positivity, the experience overall tends to be vastly more positive. It's not enough to simply have great moderation processes; you need to make that safety visible as well.

Algorithmic Tech
Using technology to discover and exploit patterns in tone, structure, users, response times, and other data points to automatically identify potential problems and/or filter those problems out before moderators even see them.
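Here's a deliberately naive sketch of the idea; the patterns, weights, and thresholds are invented for illustration, and a real system would learn its signals from historical moderation decisions:

    import re

    # Illustrative signals only; real systems learn these from past decisions.
    BANNED_PATTERNS = [re.compile(p, re.IGNORECASE)
                       for p in (r"\bviagra\b", r"\bfree money\b")]
    MAX_POSTS_PER_MINUTE = 5

    def risk_score(text: str, posts_last_minute: int, author_strikes: int) -> float:
        """Combine simple content and behavior signals into a 0-1 risk score."""
        score = 0.0
        if any(p.search(text) for p in BANNED_PATTERNS):
            score += 0.6                       # content pattern match
        if posts_last_minute > MAX_POSTS_PER_MINUTE:
            score += 0.2                       # suspicious posting velocity
        score += min(author_strikes, 3) * 0.1  # user history
        return min(score, 1.0)

    def route(text: str, posts_last_minute: int, author_strikes: int) -> str:
        """Auto-hide the obvious problems, queue the borderline, publish the rest."""
        score = risk_score(text, posts_last_minute, author_strikes)
        if score >= 0.8:
            return "auto-hide"      # moderators never see it
        if score >= 0.4:
            return "review-queue"   # a human decides
        return "publish"

The payoff is the middle tier: the filter absorbs the unambiguous junk, so humans spend their minutes only on the genuinely borderline cases.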

UX Tech
Improved user-facing technology such as like buttons, report-abuse links, on-topic buttons, and other tools that give users a chance to actively participate in identifying and reporting problems (a sketch of how these signals can feed moderation follows the examples below).

  • Amazon’s “was this helpful” buttons
  • Get Satisfaction’s smiley faces
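Here's a minimal sketch of how such signals might feed the moderation queue; the signal names and the three-report threshold are assumptions for the example, not any platform's actual behavior:

    from collections import Counter

    REPORT_THRESHOLD = 3  # illustrative; tune to your community's size

    class FeedbackTally:
        """Count user feedback on one piece of content and escalate to
        the moderation queue once enough abuse reports accumulate."""

        def __init__(self) -> None:
            self.votes: Counter[str] = Counter()

        def record(self, signal: str) -> str | None:
            self.votes[signal] += 1
            if signal == "report_abuse" and self.votes[signal] >= REPORT_THRESHOLD:
                return "escalate"  # hand off to human moderators
            return None

A "was this helpful" vote simply accumulates; the third "report_abuse" on the same item triggers escalation, which means your users are doing the first pass of triage for you.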

Reputation Systems
In any online social experience, reputation is crucial. Whether it's simply a cultural reputation among community members or a specific points/badges collection, reputation can help with a range of community-building activities. Moderation efforts can be significantly helped by applying UX Tech and Algorithmic Tech together with reputation status. Yahoo Answers is extremely strong in this area.
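As a toy illustration of that combination (the reputation weights and threshold are invented for the example), weighting abuse reports by reporter reputation lets a few trusted voices outrank a pile of throwaway accounts:

    def weighted_report_score(reports: list[tuple[str, float]]) -> float:
        """Sum abuse reports, weighting each by the reporter's 0-1 reputation.

        Two reports from long-standing, accurate reporters should outweigh
        a dozen from brand-new or historically inaccurate accounts.
        """
        return sum(reputation for _reporter, reputation in reports)

    reports = [("veteran_user", 0.9), ("new_account", 0.1), ("trusted_flagger", 1.0)]
    if weighted_report_score(reports) >= 1.5:  # illustrative threshold
        print("send to the front of the moderation queue")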

For more on reputation systems, be sure to pick up the new book, Web Reputation Systems.

Tool Consistency
Undedicated moderator resources (moderators who don't work on just one property day in, day out) spend a surprisingly large percentage of their time simply wrestling with poorly designed moderation tools that lack consistency across properties. Moderators can clear multiple pieces of content per minute, so every minute lost to struggling with bad tools is time spent entirely the wrong way.

I would love to see a company like Yahoo or Google, or any other organization with a vast array of moderation-heavy properties, lead the industry by creating a set of moderation tool standards – a UI/UX library of sorts – that every property using moderation tools is required to adopt. Yahoo already has experience with this type of concept through the YUI Library, for example. Let's see that same thinking applied to moderation tools.
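In code terms, such a standard might start with nothing fancier than a shared contract that every property's tooling has to implement – a hypothetical sketch, not an existing Yahoo or Google library:

    from abc import ABC, abstractmethod

    class ModerationTool(ABC):
        """A hypothetical shared contract for moderation tools, so a moderator
        moving between properties finds the same core actions everywhere."""

        @abstractmethod
        def approve(self, content_id: str) -> None: ...

        @abstractmethod
        def reject(self, content_id: str, reason: str) -> None: ...

        @abstractmethod
        def escalate(self, content_id: str, note: str) -> None: ...

The UI layer matters even more than the API, but the principle is the same: identical actions, identical names, identical placement, on every property.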

Programs
Specifically designed programs, such as the Facebook Community Council, that grant additional powers to select groups of partners, customers, or users.

After the AOL Community Leaders and About.com court cases a few years back, companies have been very hesitant to engage users in anything that could be perceived as a "real job". Even the Facebook Community Council is small and invite-only simply because the program is in beta before rolling out to anyone who wants to participate. Programs can be implemented successfully and legally; they just require proper precautions and proper planning. It's actually quite simple: if you're going to develop a program that treats volunteers like paid staff (only without actually paying them), stop it.

The fine folks at TextsFromLastNight.com recently added a "Moderate" feature to their iPhone app. You can pay 99 cents for the privilege of moderating content submitted by users. Hey, I bought it… what can I say? I love my TFLN nuttiness!

Gaming/Application
Moderation functions wrapped in a shell of activity that users can enjoy as a game or a useful secondary application.

Remember: The question isn't which of these methods to apply; the question is in what ratio to apply each.
