If a brand’s social web presence is compared to a party, it needs some form of security to keep out any guests who cause disturbances or behave offensively, so that all the other guests can have a good time, stay a bit longer, and hopefully come to the next event. Reviewing and responding to damaging posts and publicity should be an important part of any brand’s social media strategy. Websites and platform owners are not responsible for the content posted on them; it is brands themselves that must take ownership of their content, and of content connected with them. Brands need to moderate all forms of postings, communications and exchanges related to their brand, even those only inadvertently associated with it. This leads to the first point: where to moderate?

Where?

Within the social web there are a multitude of platforms and sites where content is exchanged and comments are left. Some are owned and managed directly by a brand, for example, company blogs and the company website. But it goes further than that. There are social networking platforms, chat rooms, topical forums, review sites, media channels, infographic sites, and article sites. All of these should be taken into account when moderating what is being said about a brand. A useful tip for covering all this ground is to use a service such as Google Alerts to track your brand or company name. It’s worth pointing out that, as well as being a tool for putting out fires, widespread moderation of this kind can be a proactive way of further promoting brand identity and forging routes to your own website.
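For teams that want to supplement an alerting service with their own scanning, the core of mention tracking is just a keyword match over incoming text. This is a minimal sketch; the brand terms are hypothetical placeholders, and a real setup would feed it posts pulled from the platforms above:

```python
import re

# Hypothetical watch list; in practice this would mirror the terms
# you register with a service such as Google Alerts.
BRAND_TERMS = ["acmeco", "acme co"]

def mentions_brand(text, terms=BRAND_TERMS):
    """Return True if the text mentions any watched brand term.

    Matching is case-insensitive and uses word boundaries, so
    'AcmeCo' matches but 'macmecoat' does not.
    """
    lowered = text.lower()
    return any(re.search(r"\b" + re.escape(term) + r"\b", lowered)
               for term in terms)
```

A scan like this only finds mentions; deciding how to respond to each one is still a human job.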

Why?

First and foremost, content moderation is about controlling what is being said about your brand, protecting its reputation, and removing any offending associations that may form. Negative publicity and misleading information can go viral very rapidly and very easily, resulting in mass circulation beyond anybody’s control, with potentially fatal outcomes for a brand; especially smaller brands that may not have the resources to respond quickly enough. And as many astute business minds would testify, responding to offensive comments and keeping your brand clean can be likened to dealing with any operational problem, complaint or criticism. If the situation is dealt with in a timely manner and successfully resolved, it can actually be turned into positive publicity and an opportunity to showcase excellent customer service.

What?

So what exactly does a brand need to look out for when it comes to its online content, exchanges around it, and the publicity surrounding it? Cyberbullying and abuse are obvious examples: although extreme, they still occur all too often. Illegal content is another clear no-no. But there are also more subtle content abuses that can take place. These may not have such a traumatic effect on other users and participants, but they can be just as damaging to a brand’s longevity. Users whose names include insulting or rude words should be treated as trouble and in most cases barred from future communications. Off-topic posts are also potentially harmful; there are other, more suitable locations for them. They can cause damaging diversions from what’s relevant to you, as well as put off new prospects and even loyal fans.
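The username rule above is the easiest of these checks to automate. A minimal sketch, assuming a hand-maintained blocklist (the example entries here are invented, not from any real service):

```python
# Hypothetical blocklist of terms a community manager maintains.
BLOCKED_TERMS = {"spam4u", "trollking"}

def username_allowed(name, blocked=BLOCKED_TERMS):
    """Return True if the username contains no blocked term.

    Case-insensitive substring check, so 'TrollKing99' is caught
    by the 'trollking' entry.
    """
    lowered = name.lower()
    return not any(term in lowered for term in blocked)
```

Subtler abuses, such as off-topic posting, resist this kind of simple rule and usually need human review.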

How?

There are a few different methodologies for moderating content. What’s appropriate will depend on the nature and purpose of the site, how you want the brand to be perceived, and the target market. With pre-moderation, content and postings are fully reviewed before being published. With post-moderation, content is reviewed after it has been published and can be removed if required. Reactive moderation gives users and community members the opportunity to flag offensive and undesirable content, which is then reviewed by a moderator. Distributed moderation gives users permission to moderate other users’ contributions. Whilst the type of content moderation used is in the hands of the site owner or platform rather than the brand, this knowledge should go some way towards helping a brand decide which sites and platforms it wishes to engage with.
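The practical difference between these four methodologies comes down to when a post is visible. The sketch below makes that explicit; the function and its parameters are illustrative simplifications, not any platform’s actual behavior:

```python
from enum import Enum

class ModerationMode(Enum):
    PRE = "pre"                   # reviewed before it goes live
    POST = "post"                 # goes live, reviewed afterwards
    REACTIVE = "reactive"         # reviewed only once a user flags it
    DISTRIBUTED = "distributed"   # community votes decide visibility

def is_visible(mode, reviewed=False, approved=False, flagged=False, score=0):
    """Decide whether a post is shown under each moderation mode."""
    if mode is ModerationMode.PRE:
        # Nothing appears until a moderator has approved it.
        return reviewed and approved
    if mode is ModerationMode.POST:
        # Live by default; taken down if a review rejects it.
        return not (reviewed and not approved)
    if mode is ModerationMode.REACTIVE:
        # Live unless it was flagged and a moderator then rejected it.
        return not (flagged and reviewed and not approved)
    if mode is ModerationMode.DISTRIBUTED:
        # Community vote tally decides; non-negative scores stay up.
        return score >= 0
    raise ValueError(f"unknown mode: {mode}")
```

Notice that only pre-moderation guarantees harmful content never appears; the other three trade that guarantee for faster, higher-volume conversation.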

Brands also need to determine who should be monitoring their content and brand presence: employees, volunteers or outsourced vendors, for example. A brand needs people, a framework that sets the boundaries of what is and is not acceptable, and the right software tools in place to keep on top of it.

Determining where and how a brand will protect itself against harmful content and content exchanges should be at the forefront of its social media strategy, not a reactionary afterthought. There is simply too much at risk.

LiveWorld, a user content management company, is a trusted partner to the world’s largest brands, including the number-one companies in retail, CPG, pharmaceutical, and financial/travel services.