How IT Can Help You With UGC: CMS Development and Moderation
These days, user-generated content (UGC) is essential for any company that wants to remain relevant and competitive. Allowing UGC on your platform is a must, as it harnesses the power of your product’s end users.
UGC does come with its fair share of risks, however, including brand-damaging issues like profanity, nudity, and racism. These elements could prove disastrous to a brand if they slip through the cracks.
Marketing departments are developing forward-thinking UGC campaigns that drive real-time engagement with existing and potential customers. And as the IT department, you have the integral job of ensuring that the proper tech is in place to be certain that these campaigns run safely and all UGC is carefully vetted.
To safeguard your company’s reputation and protect your users, it is critical to have a clear plan in place before you open the gates to UGC. Fortunately, there is an array of tech available that can work behind the scenes to moderate this incoming content.
To avoid the risks associated with allowing UGC on your platform, you can monitor content in house by building your own CMS with advanced moderation capabilities. You also have the option of partnering with a moderation company that offers an easy-to-integrate API.
Option 1: Building Your Own CMS
If you plan to monitor content in house, you will need to build a comprehensive CMS that lets you easily monitor high volumes of submissions and take immediate action when you need to block users or take down UGC quickly. A more advanced CMS will enable you to queue up potentially high-risk submissions, including UGC from new users, repeat offenders, or any content that appears to be receiving a lot of views.
The more you know about your users’ track record on your site, the more effective your moderation tools can be.
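As a rough illustration, here is a minimal TypeScript sketch of how a CMS might use user history to triage submissions for review. The risk signals, field names, and cutoffs are all hypothetical; your own signals will depend on what your platform tracks.

```typescript
// Hypothetical triage sketch: prioritize submissions from new users,
// repeat offenders, or high-visibility content for review.
interface Submission {
  id: string;
  userId: string;
  viewCount: number;
}

interface UserHistory {
  accountAgeDays: number; // how long the user has been on the platform
  pastViolations: number; // submissions previously taken down
}

type QueuePriority = "high" | "normal";

function triagePriority(sub: Submission, history: UserHistory): QueuePriority {
  const isNewUser = history.accountAgeDays < 7;    // assumed cutoff
  const isRepeatOffender = history.pastViolations > 0;
  const isHighVisibility = sub.viewCount > 1000;   // assumed cutoff

  return isNewUser || isRepeatOffender || isHighVisibility ? "high" : "normal";
}
```

The exact signals and thresholds are a policy decision, but the point stands: the more user history your CMS can feed into this kind of triage, the sharper your moderation queue becomes.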
Handling content in house can be challenging if there are high volumes of submissions and if you wish to monitor the content 24/7, which is why partnering with a professional moderation company may be the best route.
Option 2: Partnering With a Moderation Company That Uses Advanced Tools
If reviewing content in house and building your own CMS seems daunting, you should team up with an experienced moderation company. The right partner can provide your IT department with an API that can be easily integrated into any website or mobile application. Depending on the partner you choose, your users can submit text, photo, or video content, which can then be checked against your moderation criteria, even on the fly.
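To make the integration concrete, here is a hedged TypeScript sketch of submitting UGC to a moderation partner. The endpoint, payload shape, and response fields are illustrative assumptions, not any specific vendor's API.

```typescript
// Illustrative only: the endpoint and payload shape will vary by vendor.
interface ModerationRequest {
  contentType: "text" | "image" | "video";
  contentUrl: string;   // URL to the hosted content (see the URL note below)
  callbackUrl?: string; // optional webhook for asynchronous results
}

interface ModerationResponse {
  requestId: string;
  scores: Record<string, number>; // e.g. { nudity: 0.82, profanity: 0.05 }
}

async function submitForModeration(req: ModerationRequest): Promise<ModerationResponse> {
  const res = await fetch("https://api.example-moderation.com/v1/review", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MODERATION_API_KEY}`,
    },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`Moderation API error: ${res.status}`);
  return res.json() as Promise<ModerationResponse>;
}
```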
Humans are necessary for effectively reviewing images for context, but AI also plays a major part in this process. The most effective way to moderate content is to partner with a company that offers AI and live image moderation in one platform. Efficient AI technology can complement live moderation by eliminating content that is clearly NSFW in real time.
The AI moderation service should return a score that indicates the likelihood of offensive content. You can then determine what automated action your platform will take based on thresholds that you set. For example, you could determine that all images flagged with a score of 70% or higher for nudity will either be auto-deleted or be prioritized in an internal queue for a live team to review and confirm.
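The threshold logic from that example might look like this in TypeScript. The 70% nudity cutoff comes from the example above; the action names and the choice between auto-deletion and queued review are assumptions you would tailor to your own policy.

```typescript
type ModerationAction = "auto_delete" | "priority_review" | "allow";

// Map an AI nudity score (0-1) to an automated action.
// The 0.7 threshold mirrors the 70% example above; adjust per your policy.
function actionForScore(nudityScore: number, autoDeleteEnabled: boolean): ModerationAction {
  if (nudityScore >= 0.7) {
    return autoDeleteEnabled ? "auto_delete" : "priority_review";
  }
  return "allow";
}
```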
Ideally, you will submit the URLs to your content rather than the actual files. This ensures that your partner is simply viewing your UGC submissions via a URL that you can take down at any time, and not storing the content. Also, be sure to lock down that URL so it is only viewable from your moderation partner’s IP address.
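One way to enforce that lockdown, assuming you serve the content from your own Node.js service rather than a CDN rule or reverse proxy, is a simple IP allowlist check. The partner IP below is a placeholder.

```typescript
import { createServer } from "node:http";

// Placeholder: replace with your moderation partner's published IP range.
const ALLOWED_IPS = new Set(["203.0.113.10"]);

// Serve UGC files only to requests originating from the partner's IP.
const server = createServer((req, res) => {
  const clientIp = req.socket.remoteAddress ?? "";
  if (!ALLOWED_IPS.has(clientIp)) {
    res.writeHead(403).end("Forbidden");
    return;
  }
  // ...stream the requested UGC file here...
  res.writeHead(200).end("content");
});

server.listen(8080);
```

In practice this check often lives in a firewall, reverse proxy, or CDN rule rather than application code, and note that `remoteAddress` may reflect a load balancer's IP if you terminate traffic behind one.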
Take Advantage of User Reporting Tools
Your audience knows what content they want to view, and will typically make it abundantly clear what they don’t want to see. Letting your audience easily report upsetting content puts a substantial set of extra eyes at your company’s disposal. Easy reporting also fosters a sense of community, as users work together with the brand to promote a safe environment.
To accomplish this, implement a simple reporting button that automatically escalates the content to your internal team or your moderation partner. If reports are handled by an internal team, your CMS should surface them in a review queue, and users should have the ability to provide further details on why they flagged certain content.
If you are working with a moderation partner’s API, it is important to communicate both the reported content and the metadata that accompanies it (explaining why it was flagged). Effective use of AI-based moderation technology and a well-trained moderation team, as described earlier, should significantly reduce the number of submissions flagged by users.
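As a sketch of that hand-off, a report endpoint might capture the user's reason and forward it, along with a reference to the content, to the partner's API. The field names and partner endpoint here are assumptions for illustration.

```typescript
// Hypothetical shape of a user report coming from the report button.
interface UserReport {
  contentId: string;
  contentUrl: string;
  reporterId: string;
  reason: string;   // e.g. "nudity", "harassment"
  details?: string; // optional free-text explanation from the reporter
}

// Forward the reported content and its metadata to the partner's API.
async function escalateReport(report: UserReport): Promise<void> {
  const res = await fetch("https://api.example-moderation.com/v1/reports", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MODERATION_API_KEY}`,
    },
    body: JSON.stringify(report),
  });
  if (!res.ok) throw new Error(`Report escalation failed: ${res.status}`);
}
```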
After all, you want your users to help maintain a safe environment, but without professional moderation backing them up, a toxic site will lose its audience quickly.
Allowing UGC on your platform doesn’t have to be risky business. With the right moderation tools, you can use UGC to support real-time engagement while successfully protecting your company’s brand.
Confident with Your IT Strategy?
If you found the information in this blog post helpful and you'd like to discuss your business's technology strategy, we'd be happy to hear from you.