Last month, Vice ran an interesting article by Jess Morrissette on how games marketing invented toxic gaming culture by promoting toxicity and harassment as value propositions for gaming. While that approach was considered perfectly reasonable at the time, games marketing has fortunately taken a turn for the better.
It reminded me of the CX challenges many great studios and CX leaders face daily when trying to protect their player community. In 2014, a Riot Games study suggested that players who are exposed to abusive language or harassment the first time they play are potentially 320% more likely to churn and never return. Think about it by removing the divider between online and “RL”: if you are verbally assaulted when you go to a grocery store, are you going to go back?
A study released last year revealed that toxicity runs rampant in free-to-play games and that 74% of US online gamers have experienced some form of harassment when playing online. The survey was conducted by the Anti-Defamation League (ADL) in collaboration with games analytics firm Newzoo.
The study revealed that:
- 65% of people playing video games online have experienced “severe harassment, such as physical threats, stalking, and sustained harassment”.
- 53% of those reporting harassment said they were targeted for their “race, religion, ability, gender, gender identity, sexual orientation, or ethnicity”.
- 29% of people surveyed reported they were doxed (had personal or identifying information published about them with malicious intent) at some point while playing games online.
- 23% of people surveyed reported “exposure to extremist ideologies and hateful propaganda”.
The expectation is clear: more than half of the surveyed players believe that video-game studios are responsible for player safety, inclusiveness and content moderation, and should do more to provide them.
While it no longer comes as a surprise that building strong communities influences player retention, moderated in-app chats also improve the overall player experience and drive higher LTV and brand loyalty.
Two Hat, the company that runs an AI-powered content moderation platform, released a research paper based on the in-game community data of a Top 10 mobile Action RPG title.
It states that gamers who participate daily in moderated chats show up to 20 times higher LTV. They also play four times as many daily sessions, and their average session length is 60% longer.
That means players who feel engaged, appreciated and heard by studios play more and spend more. Not to mention a 2009 study by Waggener Edstrom Worldwide that blew our minds with the news that friends are three times more likely than traditional advertising to influence a game purchase.
So what can gaming studios do to protect their community and brand while banking on more highly engaged players? Here are some top tips:
- Determine the community’s voice and create community guidelines to set the tone on what is acceptable.
- Provide a parent portal and parental controls when the audience requires parental guidance.
- Invest in a moderation tool that proactively filters undesirable content when the game includes an in-game chat feature (a minimal sketch of such a filter, including mute and report hooks, follows this list).
- Provide block, ignore, mute or report functionalities as an added safety feature in games.
- Invest in professional community managers and moderators to prioritize and triage content queues, increase efficiency, and improve the ROI of community health efforts.
- Encourage positive behavior by means of game design and game experiences.
- Join other video-game studios at the Fair Play Alliance.
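To make the filtering and mute/report tips above a bit more concrete, here is a minimal sketch of how an in-game chat pipeline could check messages before delivery. Every name in it (ChatModerator, the blockedTerms list, the report queue) is an illustrative assumption rather than any specific vendor's API, and a hard-coded word list only stands in for the maintained dictionaries or classifiers a real moderation platform would use.

```typescript
// Minimal sketch of a proactive chat filter with mute and report support.
// All names and the tiny word list are illustrative assumptions.

type PlayerId = string;

interface ChatMessage {
  from: PlayerId;
  to: PlayerId; // recipient, kept simple here (no channels)
  text: string;
}

interface ModerationResult {
  delivered: boolean;
  reason?: "blocked_term" | "muted_by_recipient";
}

class ChatModerator {
  // In production this would be a maintained dictionary or classifier;
  // a short hard-coded list keeps the sketch self-contained.
  private blockedTerms = ["slur1", "slur2", "threat-phrase"];

  // recipient -> set of senders they have muted or blocked
  private mutes = new Map<PlayerId, Set<PlayerId>>();

  // simple report queue for human moderators to triage
  private reports: { reporter: PlayerId; offender: PlayerId; text: string }[] = [];

  mute(recipient: PlayerId, sender: PlayerId): void {
    if (!this.mutes.has(recipient)) this.mutes.set(recipient, new Set());
    this.mutes.get(recipient)!.add(sender);
  }

  report(reporter: PlayerId, offender: PlayerId, text: string): void {
    this.reports.push({ reporter, offender, text });
  }

  // Proactive filtering: runs before the message ever reaches the recipient.
  send(msg: ChatMessage): ModerationResult {
    const lowered = msg.text.toLowerCase();
    if (this.blockedTerms.some((term) => lowered.includes(term))) {
      return { delivered: false, reason: "blocked_term" };
    }
    if (this.mutes.get(msg.to)?.has(msg.from)) {
      return { delivered: false, reason: "muted_by_recipient" };
    }
    // hand the message to the game's normal chat transport here
    return { delivered: true };
  }

  pendingReports(): number {
    return this.reports.length;
  }
}

// Usage example
const moderator = new ChatModerator();
moderator.mute("playerB", "playerA");
console.log(moderator.send({ from: "playerA", to: "playerB", text: "gg" }));
// -> { delivered: false, reason: "muted_by_recipient" }
```

A keyword check like this is only a starting point; real deployments layer on context-aware classification and route edge cases to the human community managers and moderators mentioned above.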
If managed properly, social features and in-game communities can bring significant benefits in engagement and increased LTV. Left unchecked, however, they may pose a risk to your audience, brand and reputation.
What are you doing to keep your players safe?