Crowdsourcing is one of the most valuable things the Internet offers website owners, online community managers, and social media moderators.
Crowdsourced content moderation, also known as reactive moderation, gives you a faster and cheaper way of managing and filtering user-generated content (UGC).
Random users, for example, can be gathered to complete an online task for you or solve a problem that would normally require professional help. Labor costs drop significantly because you either pay only for the tasks you assign to your volunteer moderators or let your entire community evaluate content at no cost.
But letting the crowd handle these tasks, instead of outsourcing content moderation to an expert agency, can also be a dangerous step. When you address a talent shortage by appointing volunteers as content moderators, you make your online brand reputation susceptible to several risks.
The risks of crowdsourcing moderation
While crowdsourcing can give you promising results because you gain a workforce dedicated to helping you, it can also leave you with content moderators who are not fit for the job.
The volunteers who filter damaging content for you were never trained to perform the task objectively.
Most of the time, community members hide behind a veil of anonymity, especially if you allow them to use random usernames. Because the identities of your crowdsourced moderators are barely known, they feel less accountable for the things you ask them to do. If they give a post a thumbs down or report a random image on your message board as offensive, there is no serious consequence compelling them to be mindful about what they are doing.
The sanctions you can impose on crowdsourced moderators who perform poorly, such as blocking them or stripping them of their moderator privileges, are not intimidating enough. They can always create a new account, regain their membership points, and turn into Internet trolls instead of helping you protect your brand.
Volunteer moderators are rarely educated about the technical aspects of UGC moderation. Their understanding of good versus bad content is mainly conceptual: if a post is offensive, they downvote it; if it is funny, they give it a thumbs up. Because you are not letting experts do the job, you are relying on people who evaluate content in black-and-white terms. In the long run, this could damage a brand's online reputation.
Combining crowdsourcing and outsourcing
Content moderation is a multi-faceted digital security practice. Damaging content can be interpreted in several ways; one witty remark can be amusing to some but offensive to others. When you ask people to screen UGC, you are empowering them to apply their personal preferences and values in protecting the welfare of the entire community.
Professional content moderators know how to step back and apply a website's predefined Terms of Use policies and UGC guidelines when managing your online community. Their motivation for protecting the conversations on your social media accounts or forums is not hinged on personal biases; they do it to protect your brand's online reputation.
The fact that monetary compensation is at stake is enough to motivate them to do things right. Plus, as part of your team, they have a better grasp of what is expected of the brand and the community it caters to.
Crowdsourcing can indeed help you in many ways, but relying purely on inexpensive labor can be detrimental. Instead, you can implement a two-step process: your community members report, downvote, or reject damaging or inappropriate content, and your expert moderators then verify those reports. This directs your experts' attention to posts that need immediate action while keeping your community engaged in protecting your online property.
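To make the two-step process concrete, here is a minimal sketch in Python. The report threshold, post statuses, and class names are illustrative assumptions rather than a prescribed implementation; a real setup would hook into your platform's reporting tools and your moderators' review workflow.

from dataclasses import dataclass

# Hypothetical threshold: how many community reports push a post to expert review.
REPORT_THRESHOLD = 3

@dataclass
class Post:
    post_id: str
    content: str
    reports: int = 0
    status: str = "visible"  # "visible", "pending_review", or "removed"

class TwoStepModeration:
    """Step 1: the community flags content. Step 2: expert moderators verify the flags."""

    def __init__(self):
        self.posts = {}
        self.review_queue = []  # posts waiting for an expert decision

    def add_post(self, post: Post):
        self.posts[post.post_id] = post

    def report(self, post_id: str):
        # Community step: count the report and escalate once the threshold is hit.
        post = self.posts[post_id]
        post.reports += 1
        if post.reports >= REPORT_THRESHOLD and post.status == "visible":
            post.status = "pending_review"
            self.review_queue.append(post)

    def expert_review(self, post_id: str, violates_guidelines: bool):
        # Expert step: a trained moderator confirms or dismisses the community's report.
        post = self.posts[post_id]
        post.status = "removed" if violates_guidelines else "visible"

# Usage: three community reports escalate the post, then an expert confirms the violation.
queue = TwoStepModeration()
queue.add_post(Post("p1", "A borderline remark"))
for _ in range(3):
    queue.report("p1")
queue.expert_review("p1", violates_guidelines=True)

The point of the split is that cheap community signals only prioritize the review queue; the final keep-or-remove decision stays with trained moderators applying your published guidelines.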
When you outsource content moderation services, you can be more confident that the moderators evaluate content using the proper criteria. There is less chance that ignorance, bias, or laziness will affect your expert moderators' ability to manage your community. If you truly want your brand to have a safe online platform, technical expertise must always be part of how you carry out your quality assurance strategies.
Open Access BPO can help you transition from crowdsourced or reactive moderation to having your own team of experienced content moderators. We leave no piece of UGC unchecked to ensure your brand reputation remains pristine. Contact us to discuss our content moderation services, available in over 30 languages.