Content Governance: Questions to Ask Before Launching a User Forum or Community Site
She said WHAT!? Don’t be taken aback by angry trolls. Here are the policies you need to put in place before you launch a Facebook page or online community.
Content governance describes the set of practices for managing content in the real world, when things don’t go according to plan.
In this post in our content governance series, we’ll talk about the community management questions you need to answer before starting a Facebook business page, online forum or community site. The focus of this post is on writing policies that help community managers moderate comments and help your burgeoning community thrive.
Write Two Sets of Policies
Before you launch an online community, write both a public-facing set of community guidelines and an internal moderation policy.
The community guidelines should be specific enough that you can use them as objective criteria for moderating a comment or banning an abusive user.
Having a written public-facing policy can really help when a community member sends your team an angry email asking why their comment was removed or their account banned.
CUSTOMER: I am very angry that you deleted my artistic photographs!
MODERATOR: While we enjoyed your photographs, they violated our community guidelines because they depict [this banned topic]. Here’s a link to those guidelines.
CUSTOMER: Oh, I didn’t know that there was a rule about that. [calms down]
MODERATOR: Thanks for reviewing those guidelines. How else can I help you today?
Community Guidelines Policy Examples
Here are two examples of well-written public-facing community guidelines:
Both of these documents explain the focus of the Facebook page or forum, what behavior or language will not be tolerated, and what actions will be taken when a participant violates this policy (such as deleting a comment, and banning a participant for repeat offenses).
Both policies encourage participants to stay on topic and warn that off-topic content may be deleted from the site or page. Both policies also define what the business is not responsible for, such as links posted by participants on the page or in the forum.
Internal Moderation Policy
An internal moderation policy has a very different purpose than public-facing community guidelines. Its purpose is to guide and standardize the way your organization responds to community participants in specific situations.
This document, which should be updated over time as you encounter new situations, should also include official responses to frequently asked questions. Why? Canned responses save community moderators a lot of time, and they double as a training tool for new moderators.
Questions You Need to Answer
An internal moderation policy should answer a number of questions before you launch.
Here are the questions you should ask:
- What is the mission of the Facebook page or community forum?
- When is it appropriate to delete a comment?
- When is it appropriate to ban a user?
- How frequently will comments be moderated? Immediately? Hourly? Daily?
- What is our policy about moderating comments outside of normal office hours?
- Will we empower forum participants to assist us with moderation?
- What could go wrong? How will we address each of those situations? (http://www.mrmediatraining.com/2011/06/10/think-like-a-sociopath-act-like-a-saint/)
- What will our approach be to controversial topics?
- How will we address giving advice on topics that may have a legal or medical component? Are we allowed to give legal or medical advice?
- What words or phrases should be banned or flagged for moderation? Many social media commenting and user forum tools allow you to define a list of words and phrases that trigger moderation, or which are not allowed to be used.
- What are the pros and cons of deleting a comment or shutting down a thread in a user forum?
- How will we handle duplicate threads on similar topics? Many user forums have a moderator post a link to the existing thread and then disable comments on the new, duplicate discussion thread.
- When does a moderation situation warrant an official response from our organization, or need to be escalated up a chain? What is that chain?
- Is there a list of spokespeople that we can call upon to jump in and talk about particular topics?
- Are there people at our organization who should be whitelisted in a user forum?
- What is our master list of appropriate channels to direct user forum participants to when they ask a question that is better addressed outside the forum? Examples include a customer service phone number, another department’s website, or a tech support ticketing system.
- What frequently asked questions can we draft responses to ahead of time?
- Where will our policies be posted and maintained? How often will they be reviewed?
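Regarding the banned-words question above: most commenting and forum tools let you configure a trigger list, but the underlying idea is simple enough to sketch. Here is a minimal, hypothetical example of a word filter that flags comments for moderation; the `BANNED_TERMS` list is a placeholder you would fill from your own internal moderation policy, not a real product configuration.

```python
import re

# Hypothetical placeholder terms -- replace with the words and phrases
# your internal moderation policy says should trigger a review.
BANNED_TERMS = ["spamword", "banned-phrase"]

def needs_moderation(comment: str) -> bool:
    """Return True if the comment contains any banned term,
    matched as a whole word and case-insensitively."""
    for term in BANNED_TERMS:
        if re.search(r"\b" + re.escape(term) + r"\b", comment, re.IGNORECASE):
            return True
    return False
```

A filter like this only holds a comment for human review; deciding whether to delete it, respond to it, or ban the author is still a judgment call governed by your written policy.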
Finally, you should consider actions that you will take to proactively facilitate conversations and encourage repeat use of the Facebook page or user forums.
Community management is, after all, not just about planning for worst-case scenarios, but about proactively facilitating the kinds of engagement and behavior you want.