Published On: June 4th, 2020

The primary challenge of gaming content moderation

Katherine Cross, working on her Ph.D. at the University of Washington and a frequent commentator on gaming and online issues, has noted a few times that games are often marketed and sold as “consequence-free indulgence,” which is both true and completely reasonable. Gaming is often an escape. But then a challenge arises: what if the idea of “consequence-free indulgence” is extended to interactions between the players? Someone opts to use a game as a form of personal escape, and once in the game, they are confronted with racist or sexist comments, or with bullying, on the boards or in multiplayer. We’ve seen both gender and racial harassment on Twitch, for example. For gaming to work as a community, it has to provide a visual escape, a chance for a hero’s journey, and fun elements like sweet dance moves in Fortnite. It cannot include harassment or bad actors. That threatens the community and, frankly, threatens the revenue of the gaming company: harassment means departures, departures mean fewer users, and fewer users mean less money, as well as a weaker brand in-market.

How do you effectively approach gaming content moderation, then?

It starts with a mix of tech and human

In that same article linked above, Cross notes that “we need more community managers who can use sophisticated tools to aid their work,” while also saying “algorithms can’t fully replace mods much in the same way a hammer can’t just replace a carpenter.”

This has been the Conectys position for years. The tech tools are there, yes, especially in the form of AI moderation. But they are not fully at scale, and there is a lot of nuance around online negativity and insults that often doesn’t get flagged by technology. In gaming content moderation, you periodically see racist emoji use that an AI won’t necessarily catch, even though a human would instantly recognize the negative context.
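To make the “mix of tech and human” idea concrete, here is a minimal sketch of how an automated check and a human review queue could sit side by side. The scoring function, thresholds, emoji heuristic, and routing labels are illustrative assumptions, not a description of any specific Conectys or client tooling:

```python
# Minimal sketch of a hybrid "tech + human" moderation queue. Everything here
# (the toy scoring function, the thresholds, the routing labels) is an
# illustrative assumption, not a real moderation system.

def auto_toxicity_score(message: str) -> float:
    """Toy stand-in for an AI moderation model: returns a 0..1 toxicity score."""
    blocklist = {"idiot", "trash"}       # placeholder terms a real model would learn
    words = set(message.lower().split())
    return 0.95 if words & blocklist else 0.05

def contains_emoji(message: str) -> bool:
    """Crude check for emoji-range characters, whose meaning is context-dependent."""
    return any(0x1F300 <= ord(ch) <= 0x1FAFF for ch in message)

def route(message: str) -> str:
    score = auto_toxicity_score(message)
    if score >= 0.9:
        return "auto-remove"             # unambiguous: technology acts alone
    if contains_emoji(message) or score >= 0.4:
        return "human-review"            # nuance and context: a moderator decides
    return "allow"                       # benign: keep the game moving in real time

if __name__ == "__main__":
    for msg in ["gg everyone", "you are trash", "nice 'play' \U0001F921"]:
        print(f"{route(msg):12s} <- {msg!r}")
```

The point of the sketch is the routing, not the scoring: clear-cut cases are handled automatically at speed, while anything the model is unsure about, including context-dependent emoji, lands with a human moderator.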

As a result, your gaming content moderation strategy cannot be all technology. But as you scale a gaming platform, it can’t be all human either. It needs to be a mix. Brian Solis has been a thought leader in this space for a few years and has done a couple of webinars on the topic: your gaming content moderation strategy has to be a mix of human and tech. He rightfully notes:

Raising the bar means raising our standards. Demanding online communities foster healthy environments, protecting its users from toxic behavior and being unapologetic in doing so. And also raising our own standards as users. We need to understand the motivation behind our own behavior, and ask, “Why am I sharing this content? Why am I making this comment? Does this contribute to the greater good? Would I say this in real life? Why can’t I see that I am a villain or as an active part of the problem?”

There are undoubtedly broader social questions in the overall gaming content moderation approach you take.

The core considerations of a gaming content moderation strategy

  • Users must feel secure, but cannot feel slowed down: The game still needs to unfold in real time and be exciting. It cannot be stopped for every check. The inherent yin-yang of content moderation is speed vs. security.
  • Moderators must be emotionally protected: There has been an increasing push in the last couple of years for moderators to be psychologically safe, which is great and very necessary. That comes down to training and frequent check-ins with your mods.
  • Take the temperature of users: No, not in the COVID sense. Are they enjoying the game and the community? At Conectys, we introduce CSAT in customer service for our gaming clients, which helps measure customer impressions and track whether gaming fans are happy with the company and product. We also measure NPS immediately after each interaction, producing highly actionable insight that can boost customer service and development as well. After all, the decision to recommend a product to a friend is usually backed by several criteria, which users will express when they submit NPS answers. If you’re missing systems like this, your feedback loop isn’t delivering everything it could for customer service. (A minimal sketch of the NPS and CSAT arithmetic follows this list.)
  • Realize this is often highly complex subject matter: Have you seen the newest gaming engine from Epic? It’s called Unreal Engine 5. Here is a screenshot from it:

[Screenshot: Unreal Engine 5]

This is an insanely powerful engine that allows devs to build hyper-realistic ecosystems, often at a fraction of the time and cost. This is all part of gaming’s move towards “the metaverse,” a technically complex approach to our future interaction with games. Who’s going to moderate that world? They are going to need some unreal skills at understanding the gameplay and what drives it.
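As promised in the list above, here is a minimal sketch of the NPS and CSAT arithmetic. It uses the standard definitions (promoters score 9-10 and detractors 0-6 on a 0-10 scale; CSAT is the share of 4-5 ratings on a 1-5 scale); the survey data is made up, and nothing here reflects a Conectys-specific system:

```python
# Minimal sketch of the NPS and CSAT arithmetic, using the standard
# definitions. The survey answers below are hypothetical.

from typing import Iterable

def nps(scores: Iterable[int]) -> float:
    """Net Promoter Score: % promoters minus % detractors, from -100 to +100."""
    scores = list(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def csat(ratings: Iterable[int]) -> float:
    """CSAT: percentage of responses rating the interaction 4 or 5 out of 5."""
    ratings = list(ratings)
    satisfied = sum(1 for r in ratings if r >= 4)
    return 100.0 * satisfied / len(ratings)

if __name__ == "__main__":
    # Hypothetical post-interaction survey answers from a gaming community.
    print(f"NPS:  {nps([10, 9, 8, 6, 10, 3, 9]):+.1f}")
    print(f"CSAT: {csat([5, 4, 4, 2, 5, 3]):.1f}%")
```

Measuring both right after each interaction is what makes the feedback loop actionable: CSAT tells you how the individual touchpoint went, while NPS tracks how the relationship with the game and the brand is trending.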

This is why we ensure our Conectys agents are fantastic communicators with solid industry knowledge. They already have the baseline skills and experience to work with our gaming clients, and we provide them with comprehensive training so they can seamlessly adhere to defined SLAs and deliver the appropriate reactions, tone, and more to your customer base. This is a crucial step for customer service that actually results in satisfied, repeat customers.


Contact our sales team to learn more or send us your RFP!
