Conectys Outsourcing made easy

What are the most important qualities you need in an outsourcing partner?

What we do now

The de facto play for more effective user-generated content (UGC) moderation has been technology. The thinking is often: throw more tech at the problem. That helps, but it’s not the full picture. Remember: at its best, tech is a force multiplier. At worst, it can exacerbate existing blind spots in your process. Tech is necessary within UGC, but it typically cannot handle cultural and language nuance very well.


Consider the role of artificial intelligence (AI) in moderation. We’ve consistently thought AI would be the primary guiding force in UGC at scale, and AI is certainly helpful as a tool (notice we didn’t say solution).


AI is a strong tool in the moderation arsenal, but right now it still lacks deeper nuance. In many ways, it’s still nascent. Basing your scaled moderation strategy almost entirely on this kind of technology won’t work today. Adapting AI to local cultures, language dynamics, customer demographic preferences, local jurisdiction requirements, and spontaneous events within the community is a long way from perfection. On top of that, imagine a product or community channel that broadcasts video live, has comments enabled, and allows images in those comments. AI simply isn’t ready to be the only moderator of a world like that.
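One common way to treat AI as a tool rather than the whole solution is confidence-threshold routing: the model auto-handles only clear-cut cases and escalates anything ambiguous, or anything in a locale it handles poorly, to human reviewers. A minimal sketch of that idea; the `classify` stub, thresholds, and locale list are all illustrative assumptions, not any specific vendor's system:

```python
# Hypothetical hybrid moderation router: AI decides clear-cut cases,
# humans handle nuance. Thresholds and classify() are illustrative stubs.

AUTO_REMOVE = 0.95   # model very confident the content violates policy
AUTO_APPROVE = 0.05  # model very confident the content is fine

def classify(content: str) -> float:
    """Stand-in for a real moderation model; returns P(violation)."""
    banned = {"spam", "scam"}
    return 0.99 if banned & set(content.lower().split()) else 0.01

def route(content: str, locale: str, human_only_locales: set[str]) -> str:
    # Locales the model handles poorly always go straight to a human.
    if locale in human_only_locales:
        return "human_review"
    score = classify(content)
    if score >= AUTO_REMOVE:
        return "remove"
    if score <= AUTO_APPROVE:
        return "approve"
    # Everything ambiguous is escalated rather than guessed at.
    return "human_review"

print(route("great post!", "en-US", {"th-TH"}))   # approve
print(route("buy now spam", "en-US", {"th-TH"}))  # remove
print(route("great post!", "th-TH", {"th-TH"}))   # human_review
```

The design point is the middle band: the system never forces the model to guess on nuanced content, which is exactly the gap the surrounding text describes.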

Oh, so what should we be doing?

If the answer is not “Rely on technology only” for companies facing a massive scale of UGC moderation needs, what is the answer right now?
Many companies initially perform human moderation with in-house employees. They start with a small team, using a PDCA (plan-do-check-adjust) approach. This lets the company build its moderation expertise, and internal employees gain knowledge. All good. But if the company is growing, and growing quickly, it’s not sustainable. And remember, for social media properties, “scale” is a pace difficult to comprehend, much less keep up with.


The opportunity to scale internationally, leverage best practices, and improve operating costs, though, comes from a different approach. You typically cannot achieve this through hiring alone, especially if your business or its communities are growing at a modern pace.
At that point, many turn to outsourcing, simply because they lack the resources, experience, and expertise to moderate at speed and scale internally. And to be blunt, when moderation is done poorly, the legal and brand repercussions are massive, so it’s often an area of the business best outsourced to experts in UGC and moderation.


But if you’re going to consider outsourcing, what do you need to be looking for?
Glad you asked. That’s why we wrote this paper. It’s all about how to scale and expand moderation operations right now, before AI is ready to do everything, while handling multilingual nuance and more.