eModeration: Five Techniques For Creating Safer Environments For Children
The fantastic outsourced moderation company, eModeration, has released a new whitepaper called “Five Techniques for Creating Safer Environments for Children”. It’s a quick and helpful read, so be sure to check it out. [download]
Here’s the author’s quick recap of what you’ll read about:
1. Consult all available research when drafting parental guidelines – any organisation that plans to set up a virtual world must explain in its parental guidelines how it mitigates risk. Each virtual world will vary in theme and content, but there are a number of rules that children and parents should adhere to. The full list can be found in the whitepaper, but the golden rule for children is never to share personally identifiable information (PII) online; this way, a child cannot be traced.
2. Use automated moderation filters – these can intercept the disclosure of a child’s personal information, preventing children from giving away their mobile phone number, email or IM address, or their social network pages, which would otherwise hand a wealth of useful information to an adult with malicious intent. Sophisticated filters can flag to a moderator when a child is being persistently pursued for information, such as where they are from, what school they go to, and their personal preferences, such as favourite football team or singer. Filters can also be used to tackle overt bullying, abuse and harassment.
3. Utilise the expertise and experience of moderators – automated filters often detect inappropriate or abusive behaviour, but they do not remove the need for human moderators, who are trained to keep the peace and ensure a healthy playing environment. Without human moderators, children can find themselves in a ‘Lord of the Flies’ scenario. Moderators who take on the role of a character or host also enhance the playing experience for children: they can help players overcome in-game challenges and obstacles, and suggest new things within the game to experience.
4. Make reporting inappropriate behaviour clear and simple – virtual worlds must provide a very easy way for children and parents to report instances of inappropriate behaviour, and should provide easily accessible contact details for a moderator. The site should also have a very clear policy on what constitutes bullying so that children understand what is, and what isn’t, acceptable behaviour before they play.
5. Get parents involved – parents have a very important role to play in ensuring virtual worlds and MMOGs are safe and positive places for their children to play. There should always be a ‘parents’ guidelines’ page clearly visible on the site. It is also very important that parents are encouraged to adopt a balanced approach when it comes to educating their children on the dangers of virtual worlds, and do not unwittingly frighten them before they’ve even played the game.
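To make point 2 concrete, here is a minimal Python sketch of how an automated filter might intercept PII and flag persistent pursuit. Everything here is an illustrative assumption, not eModeration’s actual system: the regex patterns, the probing phrases, and the escalation threshold are all simplified placeholders, and a production filter would need far more robust detection.

```python
import re
from collections import Counter

# Hypothetical PII patterns; a real filter would be far more thorough.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,}\d"),
    "social_url": re.compile(r"\b(?:facebook|myspace|bebo)\.com/\S+", re.I),
}

# Hypothetical probing phrases for the persistent-pursuit flag.
PROBING_PHRASES = ("where are you from", "what school", "how old are you")

def scan_message(text: str) -> list:
    """Return the PII categories detected in a single chat message."""
    return [name for name, pat in PII_PATTERNS.items() if pat.search(text)]

class PursuitTracker:
    """Counts probing questions per sender and flags repeat offenders."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold  # questions before escalating to a moderator
        self.counts = Counter()

    def observe(self, sender: str, text: str) -> bool:
        """Record a message; return True when the sender should be escalated."""
        if any(phrase in text.lower() for phrase in PROBING_PHRASES):
            self.counts[sender] += 1
        return self.counts[sender] >= self.threshold
```

In use, `scan_message` would run on every outgoing message to block PII before it is posted, while a single `PursuitTracker` instance would watch incoming messages and alert a human moderator once a sender crosses the threshold.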
Still looking for more moderation tips? Check out the joint Ant’s Eye View and eModeration whitepaper from last year.