New Blizzard “Strike Team” Will Help Address Negative Player Behaviour in Overwatch

Joe O'Brien

Blizzard has revealed some of the new steps they are taking to help combat negative behaviour by Overwatch players.

Overwatch principal designer Scott Mercer took to the Blizzard forums to explain the steps they are taking to “create a friendlier place to play Overwatch and provide you with more control over your game experience”.

Blizzard has made its stance on “toxicity” abundantly clear, and the post explains the latest measures they are taking to reduce undesirable behaviours in-game. Those measures include the formation of a “strike team” to coordinate and execute these efforts.

One of the methods Mercer touches on involves improving the report system by developing machine learning techniques that help the system identify punishable behaviours and protect players from false reports.

Machine learning is a branch of artificial intelligence in which systems “learn” by repeated exposure to data, rather than being explicitly programmed, reducing the reliance on developers to account for the nuances of every scenario the system might encounter. Machine learning techniques are well-suited to categorization tasks – such as identifying whether a chat log contains any offensive or rule-breaking language.
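To illustrate the idea, here is a minimal text-classification sketch in Python using a naive Bayes approach. This is purely illustrative and is not Blizzard's system: the tiny hand-written training set, the two labels, and the function names are all assumptions for the example. A production moderation system would be trained on vast quantities of moderated chat logs and would use far more sophisticated models.

```python
from collections import Counter
import math

# Toy training data: chat lines labelled "abusive" or "ok".
# (Illustrative only -- a real system learns from millions of
# moderated chat logs, not a handful of hand-written examples.)
TRAINING = [
    ("you are trash uninstall now", "abusive"),
    ("worst player ever leave the game", "abusive"),
    ("nice shot great play", "ok"),
    ("good game well played everyone", "ok"),
]

def train(examples):
    """Count word frequencies per label for a naive Bayes model."""
    word_counts = {"abusive": Counter(), "ok": Counter()}
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Return the label with the highest smoothed log-probability."""
    vocab = set()
    for counts in word_counts.values():
        vocab.update(counts)
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # Prior probability of the label, plus per-word likelihoods
        # with add-one (Laplace) smoothing for unseen words.
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

word_counts, label_counts = train(TRAINING)
print(classify("nice play everyone", word_counts, label_counts))        # ok
print(classify("uninstall you are trash", word_counts, label_counts))   # abusive
```

The key property this demonstrates is that the classifier's behaviour comes entirely from the labelled data it was trained on, not from hand-written rules – which is why such a system can scale across the nuances of real chat without developers enumerating every scenario.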

The post states that technologies such as this will work together with player reports and be used to “quickly bring attention to bad behaviour so that the appropriate steps can be taken”, implying that punishments will not be delivered directly by an automated system.

The post also touches on the recently added “Avoid as Teammate” tool, and on harsher punishments for repeat abusive-chat offenders. The full statement can be read below:

Hello, everyone! Today I want to discuss many changes and improvements we’re making to create a friendlier place to play Overwatch and provide you with more control over your game experience. The Overwatch development team has formed a “strike team” to coordinate and execute all these efforts alongside our customer service teams, community team, and several other departments across Blizzard.

The core tool we rely on to identify bad behavior is the player report system. Recently we changed the in-game player reporting categories by removing the “Poor Teamwork” category and changing “Griefing” to “Gameplay Sabotage”. We made those changes to make it easier to correctly choose a report category that matches the bad behavior being reported.

Accurate in-game reports of bad behavior are the best way for you to help us improve community behavior, and we want to make sure that process is easy and clear for everyone. We also monitor our Overwatch social accounts for reports of players behaving badly, and we follow up on these reports with investigations and appropriate penalties.

I’ve seen many comments from players who think that their reports are meaningless, but we want to stress that they are actually very helpful and incredibly important to improving the Overwatch community. The in-game thank you messages we recently added let you know that the time you spend making a report is indeed making a positive contribution. Just this last week, we corrected an issue that prevented many players from receiving these messages, so players should now start seeing even more feedback about their reports. To further improve these systems, we’re doing a lot of exciting work to develop “machine learning” systems to assist in accurately identifying abusive chat and gameplay sabotage. These technologies will work together with player reports, empowering the community to quickly bring attention to bad behavior so that the appropriate steps can be taken to discourage or prevent future bad behavior from ruining others’ experiences playing Overwatch. These same systems will also protect players from false reports.

That being said, we recognize that anyone can have a bad day, and they might not even realize they are using abusive chat or acting negatively to their fellow players. To help players realize that other players have taken notice of their actions, we recently added in-game warnings to let them know they’re acting in a way that’s unacceptable to the Overwatch community.

Recently we provided another active tool to give you more control of your Overwatch experience: the ability to mark players as people who you don’t want to play with as a teammate. Our initial deployment of the feature only allows you to avoid two players, but we want to increase that limit over time as we become more confident that it doesn’t negatively affect matchmaking.

We’ve also made some changes to how we penalize repeat offenders for abusive chat. Repeat offenders used to be penalized with a silence, restricted from communicating with other players in the game, with each subsequent infraction incurring a longer silence duration. We’ve changed this so repeat offenders can now be suspended, and therefore unable to play Overwatch, for longer and longer durations. If someone continues to use abusive chat even after being warned, silenced, and suspended enough times, they’ve proven they do not want to be a positive member of the Overwatch community and will be permanently banned from playing Overwatch.

Going forward, we’re also really excited about some new social features we’ll make available in the summer. We’ll talk about those more when they’re closer to being ready for testing on the PTR, but they should give players even more ways to control their gameplay experience online. Thanks, everyone!


About The Author

Joe O'Brien was a veteran esports and gaming journalist, with a passion for and knowledge of almost every esport, ranging from Call of Duty, to League of Legends, to Overwatch. He joined Dexerto in 2015, as the company's first employee, and helped shape the coverage for years to come.