Six Techniques for Safer User Generated Content Campaigns
I’ve worked with my vendor partner, friend, and all-around great folks at eModeration to develop a fun whitepaper: Six Techniques for Safer User Generated Content (UGC) Campaigns.
The paper details techniques that creators of UGC sites (brands and marketing agencies alike) can use to protect both their brand reputations and their users, while still creating a site that is fun and engaging. While there are a great many methods to find this balance, the paper focuses on six key, and often underappreciated, techniques:
1. Craft your guidelines – create “community guidelines” rather than “terms and conditions”. Use accessible language so that users will understand the rules of the site. After all, the clearer the guidelines, the more likely users will abide by them.
2. Build automated filters – the first line of defense against offensive, litigious, illegal or hijack-marketing content should be smart filters. Filters should not replace human intervention – they will never understand slang trends or cultural sensitivities, for example – but they will get rid of the more obviously “bad” content and help to offset some of the load on the human moderators.
3. Embrace your technology – use some basic mathematics and logic-informed algorithms to build tools that human moderators can use to review content. For example, keep an eye out for a single user who is making numerous submissions within a given time frame. Look at a user’s site history – how many times have they been in agreement or dispute with the moderator? Is there a particular piece of content that is driving significant volumes of traffic – and is it for the right reason, or because of undesirable content?
4. Enlist your users – most site users want a positive experience. Given the opportunity, many of them will help to protect the safety and quality of a project. Enlisting users not only helps moderators, but engages users in the site itself. Make sure to develop tools and processes that make it easy and rewarding for “good” users to help protect against “bad” behavior.
5. Make moderation actions visible – contrary to traditional thinking, human moderation does not work best when hidden from view. In fact, hiding moderation techniques can be an implicit invitation for users to try to abuse or get round the system. If these controls are visible and clearly laid out, they can discourage people from posting bad content. Moderators have a job not just to remove content, but also to work with the community to educate users as to what is and isn’t acceptable. Some users make honest mistakes, so they should be allowed to make amends and resubmit content.
6. Moderation tools need love too – test the usability of the moderation tools alongside site testing. You don’t want to find that you’ve created a site that’s difficult to moderate once it has gone live. Smart interface design can significantly reduce moderation time (and cost). Your moderators will love you if you show them the same love you’d show your users!
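To make technique #2 concrete, here's a minimal sketch of a first-line automated filter. The pattern list and threshold are illustrative assumptions (the whitepaper doesn't prescribe an implementation); the key idea is that the filter only auto-rejects the obviously bad, and routes borderline content to human moderators instead of silently deleting it.

```python
import re

# Hypothetical blocklist for illustration; a real deployment would load
# curated, regularly updated term lists (profanity, legal risk, spam links).
BLOCKED_PATTERNS = [
    re.compile(r"\bfree\s+v[i1]agra\b", re.IGNORECASE),  # classic spam phrase
    re.compile(r"https?://bit\.ly/\S+", re.IGNORECASE),  # shortened links often used in hijack marketing
]

def prefilter(text: str) -> str:
    """Return 'reject', 'review', or 'accept' for one submission.

    The filter only catches obvious cases; anything borderline is
    queued for a human moderator rather than removed automatically.
    """
    hits = sum(1 for pattern in BLOCKED_PATTERNS if pattern.search(text))
    if hits >= 2:
        return "reject"   # clearly bad: multiple independent signals
    if hits == 1:
        return "review"   # possibly bad: send to the moderation queue
    return "accept"

print(prefilter("Check out my holiday photos!"))           # accept
print(prefilter("FREE VIAGRA here: https://bit.ly/xyz1"))  # reject
```

Requiring two independent signals before auto-rejecting is one simple way to keep false positives low, which matters because (as technique #5 notes) users should be able to understand and recover from moderation decisions.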
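The "numerous submissions within a given time frame" heuristic from technique #3 can be sketched as a sliding-window counter. The class name, limit, and window are assumptions made for illustration, not part of the whitepaper:

```python
from collections import defaultdict, deque
from typing import Deque, Dict

class SubmissionRateMonitor:
    """Flag users who exceed `limit` submissions within `window` seconds."""

    def __init__(self, limit: int = 5, window: float = 60.0):
        self.limit = limit
        self.window = window
        self._times: Dict[str, Deque[float]] = defaultdict(deque)

    def record(self, user_id: str, timestamp: float) -> bool:
        """Record one submission; return True if the user should be flagged."""
        times = self._times[user_id]
        times.append(timestamp)
        # Drop submissions that have aged out of the sliding window.
        while times and timestamp - times[0] > self.window:
            times.popleft()
        return len(times) > self.limit

# Four submissions in 15 seconds against a limit of 3 per minute:
monitor = SubmissionRateMonitor(limit=3, window=60.0)
flags = [monitor.record("user42", t) for t in (0, 5, 10, 15)]
print(flags)  # [False, False, False, True]
```

A flag here wouldn't block the user outright; in the spirit of the technique, it would surface the account in the moderators' review tools alongside the user's dispute history.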
All of that, and it even features some cool British English! What more could you want?
UPDATE: The discussion begins!
- Best Engaging Communities
- Internet Retailing
- Snow Patrol
- The Retail Bulletin
- Julian on Software
- Green Blog
- IT Week (UK)
- Phil Muncaster
- mad.co.uk (requires registration)
- Two Point Touch