
How expert player support can combat toxicity in gaming

Words by 5CA
Reading time 13 min


Like so much of modern life, from working from home to television, gaming has come a long way in the last twenty years. Studies have shown it helps people with PTSD, promotes better well-being, and improves cognitive ability. But the gaming world isn’t all sunshine and rainbows: the rise of online play has also exposed gamers to toxicity.

Toxic behavior is unfortunately rampant in gaming culture. Anyone who has ever played an online multiplayer shooter knows just how brutal it can be.

According to Unity’s Toxicity in Multiplayer Games Report, 72% of online gamers have run into toxicity, making it one of the most widespread problems in the medium. And given that toxicity can cost you players and damage your studio’s reputation, it’s something everyone in the gaming industry needs to take action against.

It all starts by identifying exactly what toxicity in online gaming is.

What is toxicity?

Play online multiplayer games and it won’t be long before you hear jeers of “you’re so toxic” through a headset; the word gets thrown around as both an insult and a label. But toxic gamers didn’t start out that way. For most, it’s behavior they picked up just by playing.

Toxicity in gaming ranges from throwaway jabs like “git gud” to cruel personal attacks that can ruin lives. It can also surface in gameplay itself, through actions that obstruct teammates, like team-killing or blocking their paths.

The biggest issue is that it has become so normalized and, like bullying in the real world, has a knock-on effect. Players who have experienced toxicity more often than not carry it forward and end up displaying the same behaviors.


What causes toxicity in gaming?

Toxicity in the gaming community is now more the norm than the exception. But as noted above, many players are toxic because others were toxic to them first.

According to a Firstpost survey, 65.6% of players have been subjected to abusive behavior while playing. Tellingly, 59.4% also admitted to having been offensive to other players themselves.

Much of this toxicity comes down to the anonymity of online games. Unlike the days when couch co-op with friends was the only way to play, players rarely know the people they’re matched with. And toxicity is now so commonplace that many see it as an acceptable reaction that carries no consequences.

But the effects of toxicity reach beyond anonymity. Your players may not know who they’re playing against, but every one of them has a personal life, feelings, and a unique frame of mind. This is particularly true for players who don’t fit the stereotypical (and mistaken) gamer demographic of young white men.

Let’s look at the types of toxicity online gamers face.

What kind of toxic behavior are your players running into?

Toxic behavior is a loose term that covers many actions; to combat it, we need to understand what it entails. Some of the types of toxicity your players may encounter include:

Rage

On the surface, rage is a natural reaction to losing or doing poorly in a game. Unfortunately, many gamers now take that rage out on others. There’s even a term for it: gamer rage.

Scientific studies of gamer rage find that the biggest triggers include dying, losing, other players’ actions, and technical problems. Players then ‘rage’ by abusing others through voice chat, quitting mid-match, ruining the game for their teammates, and, in extreme cases, destroying things at home.

Griefing

Griefing is a form of harassment that takes many shapes. Griefers are everywhere and transcend genres, from massively multiplayer online role-playing games (MMORPGs) to free-to-play first-person shooters (FPS). While the forms vary, griefing always boils down to players taking the enjoyment of the game away from others.

Griefing is chaotic, born less of malice than of boredom. There are countless YouTube videos of griefing, many of them celebrated by those who watch. Whether it’s blinding teammates with flashbangs in CS:GO, spamming voice lines, or destroying player builds in Minecraft, griefing is a big issue.

Thankfully, many gaming companies are beginning to implement reporting systems that make it much harder.

Harassment

Harassment is a massive problem in online gaming. While the world once pictured the gamer demographic as exclusively young white boys, time has proven otherwise. Gamers come in all shapes and sizes, genders, skin colors, and sexualities. As a result, many players outside that imagined demographic experience online harassment simply because of who they are.

In a report by the Anti-Defamation League (ADL), more than half of multiplayer gamers reported harassment related to their race/ethnicity, religion, ability, gender, or sexual orientation in the first half of 2020. Furthermore, a third of LGBTQ, Black, and Hispanic/Latinx players experienced in-game harassment related to their sexual orientation, race, or ethnicity.

Women also experience sexism and cyberbullying on a massive scale, with many female streamers reporting targeted threats related to their gender. Many women avoid voice chat altogether, putting themselves at a disadvantage in team-based games that require communication. This kind of harassment was most notably evident in Gamergate, in which thousands of gamers threatened, harassed, doxed, and attacked women in games journalism, with repercussions that spread into an ongoing hate campaign.

Unity’s study finds that women are more likely than men to stop playing a multiplayer game if they experience toxic behavior. And considering nearly half of gamers are women, losing them would be detrimental to any developer who wants their game to succeed.

Doxing

Doxing, or doxxing, is a term that originated in cybersecurity circles and spread naturally to online gaming and social media. It’s a severe threat that goes well beyond trolling and can have dire personal consequences. Many popular streamers have recently been victims of dox attacks, with hackers finding their personal details and revealing them to the world, all to shame, embarrass, or, in extreme cases, physically harm them.

In the US, doxing has led to ‘swatting’: filing false police reports against a streamer so that armed police show up at their home, often live on stream. It’s all done for the “lols”, despite its potentially serious consequences.


Negative effects of toxicity in gaming

Gaming toxicity has many adverse effects on both gamers and game developers. For gamers, it can mean diminished enjoyment and psychological harm; for gaming companies, it can mean a massive dent in the player base and the game’s success.

Some of the negative effects you may see, with data from the Unity, Firstpost, and ADL surveys, include:

  • 67% of multiplayer gamers would stop playing a game cold turkey if other players were being toxic
  • 49% of women have quit playing a game after running into toxic behavior
  • Toxicity has knock-on effects, decreasing women’s motivation to play video games and to pursue technical careers
  • Only 24% of players see any point in using in-game reporting tools, while 23% simply quit the game instead

If your game gets a reputation for rampant toxicity, players will likely be put off and share their experiences with friends. And given how popular forums like Reddit are for airing grievances, a single post can reach thousands of avid gamers in less than an hour.

So how can you create a safe player environment that makes your players comfortable enough to keep enjoying your games?

How to create a safe player environment

Many game developers have already noticed the negative impacts of toxicity, with some starting initiatives to combat it.

The Fair Play Alliance, for example, is a global alliance of gaming industry professionals working together to create better gaming experiences. Its members include CCP, EA, Epic Games, Behaviour Interactive, Blizzard, and hundreds of others.

While the FPA prefers not to use the term ‘toxicity’, its commitment to creating healthy online gaming communities stands firm. Its members avoid a one-size-fits-all approach, working together to challenge the status quo across their games by understanding, and ultimately addressing, disruptive behavior.

Valve has implemented a community-driven approach to toxicity built around a behavior score: players with high scores review cases involving low-scoring players to decide whether action is needed. Similarly, Riot Games’ League of Legends has a zero-tolerance policy, in which proven abusive behavior results in player bans.
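To make that idea concrete, here is a toy sketch of how a behavior-score gate might separate trusted reviewers from accounts flagged for review. The thresholds, names, and fields are invented for illustration and don’t reflect Valve’s actual system.

```python
# Toy illustration of a behavior-score gate; thresholds and names are
# hypothetical and do not reflect Valve's real implementation.
from dataclasses import dataclass

REVIEWER_MIN_SCORE = 9_000   # assumed cutoff for trusted reviewers
REVIEW_BELOW_SCORE = 3_000   # assumed cutoff for flagging accounts

@dataclass
class Player:
    player_id: str
    behavior_score: int  # rises with clean games, falls with valid reports

def can_review_cases(player: Player) -> bool:
    """Only consistently well-behaved players get to review cases."""
    return player.behavior_score >= REVIEWER_MIN_SCORE

def needs_community_review(player: Player) -> bool:
    """Low-scoring players have their reported games queued for review."""
    return player.behavior_score < REVIEW_BELOW_SCORE
```

The design point is that moderation power is earned through sustained good behavior, rather than handed to everyone equally.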

Take This is a non-profit mental health organization founded to increase support for the gaming community and those in the gaming industry. And lastly, Raising Good Gamers aims to teach everyone in the gaming industry how to push for positive change for the next generation.

Initiatives like the FPA, Take This, and Raising Good Gamers reflect what the gaming world could be without toxicity. But as a developer, you can start creating these environments yourself, with initiatives that stem from actively listening to your players:

Community management

Video games have always had gigantic communities. Before the internet age, passionate gamers bought magazines, called tip helplines, and traded secrets with friends. The internet allowed those communities to grow beyond pen-pal letters or happening to know someone with the answers you need.

Community management is a direct connection to the people playing your games. Managers have to understand your game’s community and those within it. They’re on the frontline, connecting with your players and responding to feedback, whether good or bad. As a result, good community managers can help with toxic behavior outside of the game itself.

✔️ Encourage your community managers to flag bad behavior, investigate player concerns, and produce newsletters that openly discuss your stance against harassment.

After all, most players want to know you care about them as much as they care about your games.

Sentiment analysis

Sentiment analysis is a digital tool combining natural language processing (NLP) and machine learning to measure player sentiment. It lets you mine text from forums and Twitter to see what players are saying about your game. Knowing their thoughts, you can then create a plan to remedy their issues.

✔️ Implement a sentiment analysis model to help you dissect game feedback at lightning speed, allowing you to develop solutions quicker.

If sentiment analysis interests you, 5CA provides a full-service Trending Topics & Sentiment Analytics solution. Learn more at 5CA.tech.
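For a sense of the mechanics, here is a minimal sketch using NLTK’s open-source VADER analyzer, a model tuned for short, informal social-media text. It assumes the player posts have already been collected; a production pipeline would add scraping, language detection, and topic grouping on top.

```python
# Minimal sentiment-analysis sketch using NLTK's VADER model.
# Assumes player posts are already collected; scraping is out of scope.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()

posts = [
    "The new patch is great, matchmaking feels way fairer now.",
    "Every ranked lobby is full of flamers, I'm done with this game.",
]

for post in posts:
    # 'compound' runs from -1 (very negative) to +1 (very positive)
    compound = analyzer.polarity_scores(post)["compound"]
    if compound <= -0.05:
        label = "negative"
    elif compound >= 0.05:
        label = "positive"
    else:
        label = "neutral"
    print(f"{label:8} {compound:+.2f}  {post}")
```

Aggregating these scores over time, per channel or per game feature, is what turns raw chatter into trends you can act on.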

VIP support

VIP support lets you segment your players and respond to their individual needs. If an issue relating to toxicity arises, they can feel confident you’ll go out of your way to solve it. And although VIP support usually doesn’t cover your entire player base, it can be a way of tackling the most serious cases.

✔️ Include harassment protection in your VIP support. By taking harassment claims seriously, you can take action against problem players, keeping your game safer for everyone. 

Although only a fraction of your player base may be VIPs, such efforts have a knock-on effect. As long as you make an effort to listen to your players, they will respond.

Behavioral standards

Behavioral standards come down to game theory: analyzing the strategic decisions players make in your multiplayer game. What parts of your game push players toward toxicity? Once you know, you can work out how to curb it.

While changing parts of the game to counter toxic players may seem like a leap on paper, the results are worth it. Many developers continuously monitor player feedback and make changes that benefit players. Games such as Behaviour Interactive’s Dead by Daylight, for example, run regular community surveys whose results shape future patches.

✔️ Monitor player feedback to see what parts of your game lead to a less-than-enjoyable experience, then work on patches to overcome it.

While there’ll always be toxic gamers, limiting what leads to toxicity can benefit overall player satisfaction.

Protect your players

As a developer, you love your game as much as your players do, and naturally you want them to keep enjoying it. Unfortunately, there’s no quick fix for players dissatisfied with the experience of playing with others. But player protection goes a long way.

Protecting your players from the bad behavior that limits their enjoyment is a good place to start. Whether through community managers or by taking an active interest in your players yourself, it pays to show you care.

✔️ Address negative feedback points in your patch notes, and make your players aware you’re working on protecting them. 

Put some time and thought into how you’ll deal with problem players. It’s worth implementing a ban or penalty system that deters negative behavior, thus protecting your player base.
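As a sketch of what such a system could look like, here is a hypothetical escalating penalty ladder. The steps and durations are illustrative assumptions, not recommendations; the point is that repeat offenses escalate predictably.

```python
# Hypothetical escalating penalty ladder; steps and durations are
# illustrative only. Each newly confirmed offense moves a player one
# rung up the ladder.
from dataclasses import dataclass

PENALTY_LADDER = [
    ("warning", 0),           # first confirmed offense: warning only
    ("chat_mute_hours", 24),  # second: 24-hour chat mute
    ("suspension_days", 3),   # third: short suspension
    ("permanent_ban", 0),     # further offenses: permanent ban
]

@dataclass
class PlayerRecord:
    player_id: str
    confirmed_offenses: int = 0

def next_penalty(record: PlayerRecord) -> tuple[str, int]:
    """Return and record the penalty for a newly confirmed offense."""
    step = min(record.confirmed_offenses, len(PENALTY_LADDER) - 1)
    record.confirmed_offenses += 1
    return PENALTY_LADDER[step]

record = PlayerRecord("player_123")
print(next_penalty(record))  # ('warning', 0)
print(next_penalty(record))  # ('chat_mute_hours', 24)
```

A graduated ladder like this keeps first-time slip-ups recoverable while still removing persistent offenders.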

Specialized player support

Specialized player support combines most of the above points and more. The fact is, people of all demographics are playing your games. As a result, you need player support that reflects your audience.

Segmenting your player base is one way to prepare a specialized support initiative: identify player archetypes and learn what you can do to support each one. A casual gamer who plays after work, for example, usually prefers relaxed sessions; running into toxicity spoils that downtime and may push them to other games.

✔️ Identify who is playing your game, why they’re playing it, and what you can do to support their enjoyment.
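As a toy illustration of that segmentation step, the sketch below buckets players into archetypes from two behavioral signals. The archetype names and cutoffs are invented for the example; real segmentation would be driven by your own analytics data.

```python
# Toy player-segmentation sketch; archetypes and cutoffs are invented
# for illustration and would come from real analytics in practice.
def archetype(avg_session_minutes: float, sessions_per_week: int) -> str:
    if sessions_per_week >= 10:
        return "hardcore"            # plays most days, long sessions
    if avg_session_minutes <= 45 and sessions_per_week <= 4:
        return "casual_after_work"   # short, infrequent wind-down play
    return "regular"

print(archetype(30, 3))   # casual_after_work
print(archetype(90, 12))  # hardcore
```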

Hiring gamers to fill your player support roles can help immensely, especially if they play the game they’re supporting. They know who’s playing and how to help better than those who don’t.

If you’re interested in humanizing your player experience, specialized support may be the key to combatting toxicity personally.


How player support can combat toxicity

Player support contributes massively to player satisfaction. Put yourself in your players’ shoes: if you were playing a game you love and ran into players harassing you, where would you turn? Support.

Having gamers on your player support team is almost essential. True customer satisfaction comes from customers feeling heard, and the same goes for player satisfaction: when players get to talk to somebody who understands their issues, they immediately feel understood.

So if the key to combatting toxicity could lie in your specialized player support, what can you do to amp it up?

Monitor in-game chat

One of the biggest drivers of toxicity in online gaming is anonymity; many players feel they can say whatever they like without repercussions. But text-based chat is far easier to monitor than voice chat, and asking players to provide proof is straightforward.

✔️ Let players know they can submit screenshots of harassment, then investigate. Issuing short-term bans will deter others from doing the same.
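As one example of what automated monitoring can look like, here is a deliberately simple keyword filter that queues messages for human review. A real moderation pipeline would layer machine-learning classifiers and locale-specific word lists on top; the blocklist entries here are placeholders.

```python
# Deliberately simple keyword-based chat filter; real pipelines add ML
# classifiers and human review. Blocklist entries are placeholders.
import re

BLOCKLIST = {"placeholder_slur", "kys"}  # curated per game and locale
WORD_RE = re.compile(r"[a-z0-9]+")

def should_flag(message: str) -> bool:
    """Queue a message for human review if it hits the blocklist."""
    words = set(WORD_RE.findall(message.lower()))
    return bool(words & BLOCKLIST)

for msg in ["gg everyone", "kys noob"]:
    if should_flag(msg):
        print(f"flagged for review: {msg!r}")
```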

Take reports seriously

Many players feel their reports go unnoticed; that’s not your fault, nor is it theirs. With so many messages to sift through, ensuring every player’s voice is heard is genuinely hard. Even so, empower your support team to take as many reports seriously as they can. Otherwise, your players might stop playing.

✔️ Encourage your player support team to investigate reports of harassment to ensure your players feel heard.

Implement game masters

Game masters are an old institution, but they’re still a great way of connecting with your player base. As well as helping players who are struggling with parts of the game, they can be present to witness and report toxic behavior.

✔️ Implement game masters to monitor in-game behavior and report any harassment they witness. 

Ticket flagging

Ticket flagging should already be part of your support system; even so, it’s worth adding a dedicated flag for harassment. Flagging helps tickets reach the right people: tickets about technical issues, for example, go to technical support, while tickets flagged for in-game behavior reach those who can investigate the reports thoroughly.

✔️ Dedicate a subset of your support team to handle harassment-related tickets. In doing so, these reports won’t get swallowed by others, and your team can deal with them accordingly. 
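A minimal sketch of that routing logic follows. The queue names and the 'flag' field are assumptions for the example; most helpdesk platforms expose equivalent routing through triggers or APIs.

```python
# Minimal ticket-routing sketch; queue names and the 'flag' field are
# hypothetical stand-ins for whatever your helpdesk tool exposes.
ROUTING = {
    "technical": "tech_support_queue",
    "billing": "billing_queue",
    "harassment": "trust_and_safety_queue",  # dedicated team subset
}

def route_ticket(ticket: dict) -> str:
    """Send a ticket to the queue matching its flag, else general."""
    return ROUTING.get(ticket.get("flag"), "general_queue")

print(route_ticket({"id": 101, "flag": "harassment"}))
# -> trust_and_safety_queue
```

Keeping harassment in its own queue means those reports are never buried under password resets and refund requests.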

Where to start

According to Unity’s Toxicity in Multiplayer Games report, 44% of multiplayer gamers think players who exhibit toxic behavior should be suspended. While player support is more mitigation than cure for this behavior, it can go a long way.

First, figure out why your players are displaying such behavior in the first place. That starts with your community and with building the foundations of a healthy gaming experience directly into the game, including your player support. By proactively considering the gaming community before your game’s release, you’ll already be in a stable position to deal with issues later.

And supporting your players with expert player support is a good starting point.

If you’re looking to boost your specialized player support strategy, 5CA can deliver valuable insights and solutions to help combat toxicity for good. Get in touch.
