Awareness Is Key: Online Safety 101 For Game Community Management

In a nutshell:

  • Anonymity, in-game offenses, and their impact on player safety. 

  • A must-have process for identifying and handling threats. 

  • How meaningful partnerships can help ensure the wellbeing of your game community.


Game communities are, by and large, a force for good. They help people from different backgrounds, ethnicities, cultures, and genders connect over a shared emotional bond to the games they love to play. But with the cover of anonymity, the places where people come together online will inevitably be home to varying degrees of intimidation. 

In a recent podcast on online safety, we had a chat with Kelvin Plomer and Steve Wilson of Jagex about how they prioritize and maintain a secure and harmonious environment for a veritable horde of RuneScape players—loyal patrons of a game that has received a Guinness World Record for the most known users of an MMORPG and recently reached 1 million subscribers on its 18th birthday.

In their insightful half-hour episode on our GCM Podcast, these old-school game veterans reveal timeless insights that every game developer and community manager should keep in mind. We’re going to sum them up for you here.


Understanding The Online Game Community Landscape

To understand the online landscape, game community managers must bear in mind that their players’ actions reflect what is already circulating among the general online population. 

Online safety is not an issue for developers alone to worry about. Still, keeping a proactive eye on how broader online trends affect individual players and the community at large is a worthwhile goal.

Toxicity covers a multitude of sins.

ANONYMITY & OFFENSES

While anonymity can fuel greater aggression than you see in real life, often only a small percentage of misdeeds within the game and community relate to matters of safety.

Kelvin Plomer points out that toxicity covers a whole multitude of sins, and that what people do and don’t find offensive varies. According to their numbers, only the smallest fraction of offenses—0.002% in RuneScape—relate to community safety.

So how can you and your community management team set up a process to determine what is a threat against a user, group, company, or public safety? How should you handle them, and when do you escalate to a higher authority?

WHAT YOU CAN DO

When it comes to your game, consider first how you can collect information and build an understanding of where player offenses are coming from. 

This will also vary according to the nature of your game. If one player takes a jab at another with a phrase like “I !#@% own you” or “Get rekt N00B!”, some outsiders might find it alarming, but players will point out that it’s an FPS they’re playing. When contextual to the actions players can take in a game (and the perspective of those playing it), some colorful language is to be expected. 

That being said, developers can and should take steps to actively prevent the kind of toxicity that can result in emotional and physical harm from breaking into their communities.

We define the infringement of online safety as anything that makes a person feel uncomfortable.

Improving Online Safety

When it comes to ensuring the safety of a game community, managers should be empowered to take a proactive approach. Here are a number of structures your team can put in place to make sure that happens in your game.

THREAT HANDLING PROCESS 

  1. Player Interaction Guidelines and Safety FAQs. Establish a clear definition of what constitutes appropriate online behavior and outline steps to take and people to contact when threats are encountered. It’ll show players within the community that you take safety seriously, and that toxicity will not be tolerated.

  2. An Easily Accessible Reporting System. In RuneScape, players can report anything through any channel at any time.

  3. Identification of High-Risk Reports. Ideally—and depending on the number of players you’re dealing with—you would have an automated system set up to do this.

  4. Human Moderation. Once high-risk reports have been identified, human moderators can separate “false positives” from real threats and take the necessary steps: dismissing a report as game banter, reaching out to see if someone is okay, banning a player from the game and community, determining whether there is a genuine threat of harm to anyone involved, and/or escalating the concern to the police.

  5. Employee Support. Remember that the employees who deal with threats on a regular basis need support as well. Stay tuned for more on how to look after your team and yourself in a future post.

  6. Partnerships. Depending on the nature of your game, you may be able to work with the police or partner with nonprofits that focus on improving online safety.

➤ Pro Tip: If you’re serious about investing in player wellbeing, consider working with NGOs such as the Fair Play Alliance.
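To make the triage flow above more concrete, here is a minimal sketch of an automated high-risk filter (steps 2–4: report intake, flagging, and routing to human review). The keyword lists, weights, threshold, and `Report` structure are all illustrative assumptions for this post, not Jagex’s actual system, which would rely on trained classifiers and far richer context.

```python
from dataclasses import dataclass

# Illustrative term weights -- hypothetical values, not a real ruleset.
HIGH_RISK_TERMS = {"hurt you": 8, "know your address": 6}
BANTER_TERMS = {"rekt": -3, "n00b": -3, "own you": -2}
ESCALATION_THRESHOLD = 8  # scores at or above this go to a human moderator

@dataclass
class Report:
    reporter: str
    offender: str
    message: str

def risk_score(report: Report) -> int:
    """Sum the weights of every known term found in the reported message."""
    text = report.message.lower()
    score = 0
    for term, weight in {**HIGH_RISK_TERMS, **BANTER_TERMS}.items():
        if term in text:
            score += weight
    return score

def triage(report: Report) -> str:
    """Route a report: human review, a lower-priority queue, or dismissal."""
    score = risk_score(report)
    if score >= ESCALATION_THRESHOLD:
        return "human_review"   # step 4: a moderator decides on escalation
    if score > 0:
        return "queue"          # uncertain: hold for routine manual review
    return "auto_dismiss"       # likely game banter / false positive

print(triage(Report("p1", "p2", "I will hurt you, I know your address")))  # human_review
print(triage(Report("p1", "p2", "Get rekt N00B!")))                        # auto_dismiss
```

The key design point is the same one the process describes: automation only narrows the funnel, and anything flagged as high-risk still lands in front of a human moderator who makes the final call.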

IDENTIFYING FALSE POSITIVES AND KNOWING WHEN TO ESCALATE

To learn how to separate “false positives” from concerns that require action from a higher authority, the RuneScape team brought in the police to hold workshops. This has helped them better understand when to escalate a player concern and develop effective ways to handle those that don’t require police intervention.

Even though these occurrences can be rare, individuals will be affected by them.

Steve Wilson also notes that they’ve found their work with organizations such as the Fair Play Alliance very useful when dealing with online safety. In addition, RuneScape has aligned itself with three mental health charities, showing how seriously they take a proactive investment in player and staff wellbeing.

For his part, Kelvin Plomer adds that even though direct threats to player safety are rare, individuals will be affected by them.



In Summary:

To ensure a safe and amicable environment for your players:

  • Understand the landscape. Using insights to understand where offenses are coming from will help you put structures in place to deal with them appropriately.

  • Develop a process for handling threats. Consider how you can process and prioritize player reports to ensure they receive the necessary attention.

  • Know when to escalate. Perhaps you can organize a police workshop to help community moderation staff better identify threats that require intervention.

  • Look out for your teammates, and yourself! As rewarding as the job of a game community manager may be, it can also cause a lot of stress—and require solid emotional support. 

  • Build meaningful partnerships. Another great way to improve player wellbeing is by working with nonprofits that have greater online safety as their goal. 

Stay tuned for more fundamental and expert knowledge for managing your game community, and don’t forget to pick up a free copy of our Game Community Management Essentials guide! 

(It comes free with a subscription that puts the best content of every month in your inbox.) 👍✨


Pay it forward. Share this post with your friends and colleagues.