The discussion offers an interesting look at the ins and outs of adding content moderation to your community-building efforts.
Developing a solid moderation game plan for any community is not just a good thing to do, it’s an absolute requirement. Without an appropriate, robust moderation plan, the community is almost destined to fail sooner rather than later. This holds true for brand-driven communities as well as user-driven communities.
Too many people assume that pre-moderation (moderating all content before it ever appears to the public) is the best solution. After all, if the company looks over the content first, what problems could exist? Separate from the issue of users finding ways to trick your moderators once they discover they’re being moderated, there are some immediate risks:
- Realism: Who wants to hang out in a community where someone is looking over your shoulder like a disapproving parent when there are tons of other community sites around?
- Trust: Inherently, pre-moderation says to your community members "we don’t trust you". That’s not a very good way to build social connection.
- Turnaround time: Will you be staffing the moderation team 24/7? If not, what happens at midnight when users are still uploading content and expecting to see it on the site quickly? Users don’t like to wait minutes, much less hours or days.
- Focus: The nature of pre-moderation draws resources and focus away from tasks like encouraging users to participate, creating methods of drawing users into your community, and keeping content on track through participation rather than directive.
- Scalability: If your community becomes successful despite the issues above, are you going to build a team of people capable of reviewing content additions in a timely fashion? What can your budget withstand… 5 full-timers? 10? 20?
Easy? Not really. Safer? Well, depends on what you mean when you talk about "safe". Sure, there’s a higher chance that porn might slip through. But there’s also a significantly increased failure possibility too. MySpace users change their profiles on average every 72 hours. Imagine if every single bit of content had to be approved before it was displayed. With 100+ million users now, do you think the turnaround time would be acceptable to that audience? How long before MySpace users get tired of waiting around and take off for some other site where things happen faster?
Certainly post-moderation has its issues too. If you’re dealing with kids, pre-moderation becomes much more important. But beyond a specific use case that requires pre-moderation, post-moderation offers many more possibilities.
- Involve your users: Give them tools to easily and quickly notify moderators of problematic content. User flagging can do wonders for keeping the site in line with what your users are comfortable with.
- Enlist a volunteer army: In any great community, there is a core group of users who want to make sure the community they love lives on. Ask them to help you with the moderation process. Not only does this help with scalability, it lets them know that the site is as much theirs as yours.
- Spot check: With the right user volunteers and notification methods, you can often rely on spot checking, rather than manually reviewing every piece of content.
- Robust tools: When you are designing your community, don’t wait until the last minute to design the moderation tools. Planning methods of dealing with potential problems at the same time you plan the rest of the project ensures that your site can handle anything users throw at it. For instance, you might decide to tie the way content is rated to the way content is flagged for inappropriateness, based on the belief that content which is both low-rated and flagged should be seen by a moderator immediately.
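To make that last idea concrete, here is a minimal sketch in Python of a review queue that escalates content which is both flagged and low-rated. The function names, thresholds, and priority levels are all hypothetical, chosen for illustration; a real site would tune them to its own norms.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical thresholds -- tune these to your community's norms.
LOW_RATING = 2.0  # average rating (on a 1-5 scale) below which content is suspect

@dataclass(order=True)
class ReviewItem:
    priority: int                       # lower number = reviewed sooner
    content_id: str = field(compare=False)

def enqueue_for_review(queue, content_id, avg_rating, flag_count):
    """Flagged AND low-rated content jumps to the front of the queue;
    merely flagged content comes next; everything else waits for a spot check."""
    if flag_count > 0 and avg_rating < LOW_RATING:
        priority = 0  # urgent: a moderator sees it immediately
    elif flag_count > 0:
        priority = 1
    else:
        priority = 2
    heapq.heappush(queue, ReviewItem(priority, content_id))

queue = []
enqueue_for_review(queue, "photo-17", avg_rating=4.5, flag_count=0)
enqueue_for_review(queue, "post-42", avg_rating=1.2, flag_count=3)
enqueue_for_review(queue, "video-9", avg_rating=3.8, flag_count=1)

# Moderators pop items in priority order: flagged + low-rated content first.
first = heapq.heappop(queue)
```

The point of the sketch is the design decision, not the code: because the rating system and the flagging system were planned together, combining their signals into a single review priority is a one-line rule rather than a retrofit.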
This is a significant issue in the development of any project that directly asks for user input. If you’re not spending time on this issue early and often, your project may well fall flat.
The good news is that with proper planning, it’s not an insurmountable issue.