Content Moderation Definition
When you see a TikTok video or an Instagram post, it usually wasn't simply posted; some type of review process took place on the back end before a wider audience could see it. That process is called "content moderation," and it takes many different forms, including users policing other users on many websites and platforms.

In general, the goal of content moderation is to keep users safe without sacrificing speed. You want people to see each other's posts and engage with them, and you don't want them waiting forever for those posts to appear, but you also want to make sure that whatever reaches your community isn't graphic, in poor taste, or needlessly divisive. In short, content moderation is a balance between safety and speed.

The legal and regulatory landscape around content moderation has also grown more complicated in multiple countries, with many regulators blaming social media for a range of societal ills. This will be an interesting space to watch over the next decade.
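To make that submit-then-review flow concrete, here is a minimal sketch in Python. It is illustrative only: the `looks_safe` check and the blocked-term list are hypothetical stand-ins, since real platforms combine automated classifiers with human review queues.

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    PENDING = "pending"      # awaiting review
    PUBLISHED = "published"  # visible to the wider audience
    REMOVED = "removed"      # blocked from the feed

@dataclass
class Post:
    author: str
    text: str
    status: Status = Status.PENDING

BLOCKED_TERMS = {"graphic-violence", "spam-link"}  # hypothetical rule list

def looks_safe(post: Post) -> bool:
    """Hypothetical automated check; real systems use ML models plus human reviewers."""
    return not any(term in post.text.lower() for term in BLOCKED_TERMS)

def moderate(post: Post) -> Post:
    # Decide quickly so users are not left waiting: safety and speed.
    post.status = Status.PUBLISHED if looks_safe(post) else Status.REMOVED
    return post

if __name__ == "__main__":
    feed = [
        moderate(Post("alice", "Check out my new recipe!")),
        moderate(Post("bob", "spam-link buy now")),
    ]
    for p in feed:
        print(p.author, p.status.value)
```

The point of the sketch is only the shape of the pipeline: nothing reaches the feed until a moderation decision has been made, and that decision has to happen fast enough that users don't notice the delay.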