
9 Feb 2013

Content Moderation - The Front End Of Social Marketing Management

Using social media to expand, sell and publicize a brand presents challenges unknown before the internet age. While having internet users comment on and engage with a brand is the aim of every social marketing management team, the days when every press release, every company statement and every interview could be kept under complete control are gone.

Customers can tell the world in seconds what they think of a brand, and brands need to react appropriately. While the content on social media cannot by definition be totally controlled, it can be supervised and arbitrated. This is the role of moderation.

Why moderate?
Consider two of the main reasons to have some kind of moderation for your social media presence:
  • Protecting your brand
  • Protecting against defamation and legal problems
A social marketing management strategy that does not include moderation presents a risk to the brand. Dissatisfied customers, mischievous posters and spammers all have the same access to your social media presence as the people that you actually want to target. These less-desirable elements may be very internet-savvy and able to cause massive disruption to your social media sites.

The legal aspect should also be considered. The concept of ‘defamation’ has existed in the legal systems of most countries for centuries, but in most cases the laws have not kept up with the rapid evolution of the internet. However, there have now been successful prosecutions for ‘online’ offences – there was an arrest in the UK following an offensive tweet made about one of the Olympic athletes, and there have been other examples.

As a general rule, brands should assume that the big social network sites are not willing to take responsibility for what is said or posted on them. This principle has been upheld in the American courts, although some cases in Europe have indicated that this may not necessarily apply elsewhere.

It is wise for brands and their social marketing management teams to work on the basis that they are responsible for the content on their own YouTube channels, Facebook pages and Twitter feeds, and act accordingly.

The correct degree of moderation also needs to be considered. The recent closure of the popular Thorn Tree travel forum, owned at the time by the BBC, is an interesting example of what can happen if moderation is implemented too late, but possibly too heavily. Details of what happened are still emerging, but it appears that BBC top management were informed that the forum contained a huge number of inappropriate posts.

With the high level of nervousness at the corporation following the Jimmy Savile scandal, the decision was taken to shut the forum with immediate effect. The forum was closed for two weeks over the Christmas and New Year period, and is still not fully in operation. It remains to be seen if it will regain its previous levels of activity and popularity, now that higher levels of moderation have been implemented and much content is still unavailable.

Levels of moderation
Given that moderation should be seen as a ‘highly desirable’, if not essential, feature of a social marketing management strategy, the next questions are ‘how do I moderate, and how much control should I take over what the users are saying and doing?’

There are several ways for a social marketing management team to approach moderation. Considerations include available effort, cost and what is felt to be acceptable to the site users.

• Pre-moderation; all postings or uploads are checked before being visible on the site. This is the safest method – but also the most costly and time-consuming. It will require 24/7 effort if an acceptable timescale is to be achieved.

This is because most users want instant results from their postings. Automated moderation (discussed in more detail below) can be a useful compromise and can help to reduce social marketing management costs. However, automated moderation is only a useful aid, and is no substitute for ‘real people’.

• Post-moderation; postings and uploads appear, but are also reviewed and may be removed if inappropriate. This relies on fast reactions and also needs full-time monitoring. The internet does not recognize weekends and holidays.

• Community moderation; new content is only checked when flagged by a user. This has the big advantage of using ‘free’ labor – social media marketing is always about getting the customers to do the work, but if they are also participating in the management of the forum that is even better.

This model also gives the users a feeling of ownership and involvement in the forum or community concerned. YouTube uses this model, and defines many different categories of flag, which helps moderators to prioritize their actions – with hours of new content being uploaded every minute, this pre-filtering is essential.

• Distributed moderation; this uses a rating system that allows users to vote on comments or postings. A threshold is set which, when reached, will trigger an alert to a moderator to investigate further. This method also relies on user involvement.
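The threshold mechanism behind distributed moderation can be sketched in a few lines of Python. This is only an illustration – the class, function names and threshold value are invented for this example, not taken from any particular platform:

```python
# Sketch of distributed moderation: user flags accumulate per post,
# and crossing a threshold queues the post for a human moderator.
FLAG_THRESHOLD = 5  # illustrative value; tune it to your community's size


class Post:
    def __init__(self, post_id, text):
        self.post_id = post_id
        self.text = text
        self.flags = 0
        self.needs_review = False


def record_flag(post, review_queue):
    """Count one user's flag; alert a moderator once the threshold is hit."""
    post.flags += 1
    if post.flags >= FLAG_THRESHOLD and not post.needs_review:
        post.needs_review = True          # marked once, never re-queued
        review_queue.append(post)         # moderators work through this queue


queue = []
post = Post(1, "A suspicious comment")
for _ in range(5):
    record_flag(post, queue)
print([p.post_id for p in queue])  # → [1]
```

Note that the post is queued only once, however many further flags arrive – without that guard, a pile-on by users would flood the moderators' queue with duplicates.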

Moderation Rules
As moderation is never a one-person job, there must be a common ‘book of rules’ so that the social marketing management team know what to allow and what to stop. Although there is an option to ‘refer for further consideration’, this can take valuable time.

Given the possible huge volume of postings, automated moderation software and filters are almost essential. These tools allow the obvious spams, scams and irrelevancies to be spotted quickly and easily. Very few forums are able to avoid an attack by a spammer scattering adverts for inappropriate services – but automated moderation can stop this before it ever reaches the site.

When using automated moderation with obscenity filters, it is important to apply some intelligence to avoid the ‘Scunthorpe problem’. This refers to the problems that arise when an apparent obscenity or other banned word is embedded in a longer word, causing false triggers of the filter. This problem has been around since the mid-1990s and still appears occasionally.
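A short Python sketch shows why the ‘Scunthorpe problem’ occurs and how matching on whole words avoids it. The word list and function names here are invented purely for illustration; a real filter would be far more sophisticated:

```python
import re

# A deliberately tiny banned-word list, for illustration only.
BANNED_WORDS = ["ass"]


def naive_filter(text):
    """Substring match: falsely triggers when the word is embedded
    in a longer, innocent word (the 'Scunthorpe problem')."""
    lowered = text.lower()
    return any(word in lowered for word in BANNED_WORDS)


def word_boundary_filter(text):
    """Whole-word match: only triggers when the banned word stands alone."""
    return any(
        re.search(r"\b" + re.escape(word) + r"\b", text, re.IGNORECASE)
        for word in BANNED_WORDS
    )


post = "Please read the class assignment."
print(naive_filter(post))          # → True  (false positive on 'class')
print(word_boundary_filter(post))  # → False (post passes correctly)
```

Even the word-boundary version is only a first step – it does nothing about deliberate misspellings or substituted characters, which is one more reason automated filters remain an aid to, not a replacement for, human moderators.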

There are further issues for international businesses. Most will be familiar with the misunderstandings that can occur between American English and British English, but what about more global issues? There is a common Thai name that means ‘blessings’ but means something entirely different in English, and there is the harmless English word ‘lull’, which is less harmless in Dutch and Flemish.

So if a site is likely to generate content in languages other than English, the moderation team and software will need access to the appropriate language skills. This is still an area where there is no substitute for human intervention.


Finally, remember the role of a moderator in nuclear physics: a substance used to keep a reaction from running wildly out of control. No-one can predict the reactions of users to your social media presence, so it is up to you and your social marketing management team to keep that reaction controlled.

Faizan Ahmad

About the Author:

This article is posted by Faizan, the Author and Founder of TechSenser. He is a professional blogger from India and a passionate writer about technology, gadgets, how-to guides, etc. You can connect with him on Google+.