Content warning: this piece mentions suicide.
“Go kill yourself.”
I received this message on my Xbox One X console while playing one of my favourite online multiplayer games. I had just eliminated an opposing player, which is part of the gameplay and all in good fun, when that message suddenly arrived from my opponent. I wish I could say it was the first time I had received that kind of message. It wasn't, and that's when I changed my console settings to allow voice communication and messages from friends only.
Today, the majority of people with a computer, phone, or game console can play online games where they can connect with other players from across the globe. This type of gaming creates more engagement between players, more competitive gameplay, and a new sense of thrill. However, being able to communicate with strangers through online gaming has also revealed the dark side of the video game community.
This is why online video games are often associated with aggressive behaviour, including verbal abuse and death threats. So why is this type of behaviour so prevalent within the online video game community?
Nearly 200 studies reviewed by the American Psychological Association have failed to find evidence that violent video games correlate with crime. Yet these studies have concluded that the online video game community is a place where people often engage in inappropriate behaviour. Further research has shown that gamers' anonymity, overly competitive gameplay, and the encouragement of sexist attitudes toward women combine to create this negative environment.
These abusers feel their anonymity protects them from the consequences of harassment. Online competitive gameplay can also create a stressful environment where players can be blamed for any in-game mistakes they make. Add in the stereotype that women can’t play video games and you have the perfect ingredients to create a toxic atmosphere. These are some of the reasons why the video game community can be so awful, but what measures are being taken to fix this problem?
The system that currently counters abusive text chat on most video game consoles is simple: if an inappropriate word is detected, the player who used it automatically loses communication privileges for a set period of time, with the duration depending on whether that player has prior offences. However, the banning system considers only a limited list of words to be abusive, so loopholes like strategic word choices still allow players to harass others.
Getting rid of abuse over voice chat is even more complicated because no reliable system exists to detect inappropriate spoken language. There's no stopping someone from shouting homophobic slurs or racist remarks. These players can be reported, but the discouraging reality is that they rarely face appropriate consequences for their actions: a player reported for verbal harassment over voice chat is only banned if the video game company has time to review the report.
Thankfully, there are ways video game companies can counter today's problematic gaming community. Blizzard Entertainment found success in lowering abusive behaviour within its popular game Overwatch: the company found that giving players the option to group with others of similar skill levels, and letting them send pre-made compliments, reduced incidents of harassment.
Gaming companies need to try harder to create stricter systems that counter the dark side of the video game community, ones that accurately detect harassment and consistently take appropriate action. The flaws and limitations of current banning regulations are evident, but companies like Blizzard Entertainment have shown that improvements can be made. It's time to make this a priority so people can connect and have fun playing video games instead of experiencing abuse.