The good and the bad
I make no secret that I met some of my closest friends through World of Warcraft over five years ago. To this day we still play together on a weekly basis, despite the fact that we let our subscriptions to the behemoth of an MMO lapse a long time ago. Through the guild that we formed we were able to teach each other, level our characters, deck them out in awesome-looking gear and defeat some of the most difficult bosses ever implemented in the game. All humble-bragging aside, we were and still are a pretty close-knit group that understood and cooperated with each other extremely well, all without ever seeing each other in real life (until recently). This degree of amicability, however, is not always the norm for people who choose to enter an online community.
For newcomers and those slow to learn an online game’s mechanics, the prospect of dealing with those who are already well acquainted with the world and its intricacies can be daunting, if not entirely overwhelming. This problem is exacerbated for those with any kind of social anxiety disorder, prompting them to disregard these games altogether out of fear. Many of these people will become victims of cyberbullying of the video game variety.
Your experience with cyberbullying in an online game varies depending on the game you are playing, but in the end it will always involve at least one player harassing another. With the online communities for games growing as the years go by, the problem of policing and punishing these offending individuals becomes increasingly difficult. The circumstances involving bullying in a game differ significantly from what you might see over social media, where an individual often uses their real name and identity. In an online game, players often choose to create a new identity, private and detached from their real-life one. While this certainly protects players from harassment in real life, it makes punishing trolls, griefers and plain old bullies exceedingly difficult, as there is little in the way of accountability for a fictional identity. This has not stopped developers from trying to create some, though, and with recent events in cyberbullying both within games and without, the emphasis on making progress has only intensified.
The traditional method
Because of the differences between games, policing their communities requires a variety of approaches. In World of Warcraft, developer Blizzard went with, and continues to stand by, a fairly traditional method. Right-clicking a player’s name in chat allows one to report them for disruptive behavior, e.g. spamming, offensive language, abuse and more. As the tickets against a player mount up, a game master will eventually intervene and, after reviewing their chat logs, punish the player accordingly. In the time I spent lazing about Stormwind City, waiting for my guild to get ready for a raid, I sent many of these reports myself, followed swiftly by adding the reported player to my ignore list.
This simple approach is a relatively common one for MMOs, with others such as Rift, Guild Wars 2 and more utilizing it. From what I can tell, though, it is a successful one as well. Reported players often receive temporary bans and even risk having their accounts suspended permanently for repeat offenses. I can honestly say that I never saw the same person offending week after week, but then again I never removed anyone from my ignore list. It was simply easier to mark these people as unpleasant and deal with the people I already knew were kind and helpful. Much like in real life, the first impressions that these people made had a lasting effect on me, and I ignored them much like I do those who irritate me in reality.
But what about the games where you don’t get to choose who you interact with? Multiplayer online battle arenas, or MOBAs for short, one of the fastest-growing genres of online games, face a decidedly greater challenge when it comes to negative player behavior. It’s become fairly common to hear the communities for games like League of Legends or DOTA 2 referred to as toxic. They certainly have a history as such, as I can personally attest. Due to their harsh learning curve and complex character-building structure, it can be easy for a new player to fall behind and for a returning player to flounder in confusion for a time. The responses from the players, both on the less experienced player’s team and on the opposing one, are often less than helpful. Making the slightest mistake, like missing a skill-shot on an escaping player or even being the wrong person to kill an enemy player, can set a toxic player’s sights on you, creating an atmosphere of verbal abuse and negativity.
The major problem with this situation is that, unlike in WoW, you cannot simply ignore this person and go on playing. LoL and DOTA 2 require you to keep playing with them until the match is over, or you risk receiving a negative report on your own account for leaving early. This is understandable, as leaving would put the rest of your team at a competitive disadvantage, and hopefully they don’t all deserve it. So you play with the troll, possibly muting him so you can avoid his abuse, but nonetheless still sharing virtual space with him. Sadly this rarely improves things, and your team suffers all the same. Fortunately, the developer of LoL, Riot Games, has taken an interesting and largely innovative approach to the issue.
Putting the power in the players’ hands
Riot Games’ attempts at dealing with player harassment have become a major success story for the company. Recently Jeffrey “Lyte” Lin and Carl “StatusKwoh” Kwoh, the lead social systems designer and producer at Riot Games, respectively, spoke at MIT regarding the issue. The video is quite long, but worth a watch should you have the time.
If you weren’t able to get through the whole thing, I’ll summarize the main points for you and give you the background necessary to understand them. Riot, in an effort to improve the level of discourse and cooperation in its player base, has implemented a number of systems to do so. Players are able to report offending players at the end of a match, which functions similarly to WoW’s method of player reporting. These reports prompt the records of the match to be brought to The Tribunal, a form of public trial in which the players of LoL are able to review a case and vote on whether or not the player in question violated the game’s rules. If the community agrees that they did, the case is pushed forward to a Riot employee, who doles out the appropriate punishment.
The system has been largely successful, especially now that Riot has been sharing the information from the Tribunal case with the guilty players, as it shows them where they went wrong. As was shown in the video, having the player base actively participate in the process had a very positive effect, as it removed the sense that players were being arbitrarily punished by the developers. Riot’s decision to further incentivize good behavior has certainly helped as well. Giving players a goal, and making that goal reliant on being helpful, kind, or at the very least quiet during a game, goes a long way toward keeping the toxic players out of the loop.
What the future holds
Riot’s largely successful approach to cyberbullying in its player base has prompted others to adopt similar tactics. Though not everyone is copying LoL’s largely player-driven policing structure, there are still some noteworthy examples. DOTA 2, for instance, rather than banning players from the game, places those who receive an inordinate number of negative reports into a separate league in which they can only compete against each other, effectively creating a toxic waste dump of negativity. Newer games, such as the recently released Titanfall, take a similar approach, putting cheaters and negative players in the same league as one another, which they cannot leave. Though I personally feel that the Riot method is more productive, it certainly pleases me to imagine awful people being forced to play with only other awful people. It’s like a virtual hell that we get to inflict on those who truly deserve it! Even Microsoft is doing it with the Xbox One, forcing all the negative players into one large pool of hate.
Though I would much rather see more in the way of positive reinforcement strategies, I find it hard to complain about toxic players being forced to play with other toxic players. Still, in the back of my mind it just feels less productive than teaching them to be better people. We could argue all day about whether or not creating an incentive to be nice truly makes people better, but in the end it doesn’t really matter. It’s simply more rewarding for me to see someone become kinder than to watch them drown in the bile of other people’s hate. Not that that makes it less funny, though.