
Supercharge Trust & Safety: Keyword Flagging & More in Checkstep’s Latest Updates


We’ve been busy updating and adding new features to our Trust & Safety platform. Check out some of the latest release announcements from Checkstep!

Improved Abilities to Live-Update Your Trust & Safety Workflows

Trust and Safety operations are always evolving, and new forms of violating content appear all the time. It’s critical that Trust & Safety teams can adapt to new trends and, in the case of a breaking emergency or PR event, quickly moderate new types of content. Checkstep has made it easier to respond to new issues fast.

Quick Responses with Keyword Flagging

Sometimes the easiest way to address emerging issues is with keywords. We’ve updated our keyword flagging features to give any Checkstep customer the ability to add new keyword strategies and quickly start tagging new content themes. Updates to your keywords apply to your content within 30 seconds, letting you respond to and moderate emerging topics faster than ever before.
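To make the idea concrete, here is a minimal sketch of how keyword-based flagging can work. The strategy names, terms, and function below are illustrative examples only, not Checkstep’s actual schema or API:

```python
import re

# Illustrative keyword strategy: map each content theme to a list of
# flag terms. These names and terms are hypothetical examples.
KEYWORD_STRATEGIES = {
    "crypto-scam": ["free bitcoin", "guaranteed returns"],
    "election-misinfo": ["rigged ballots", "stolen votes"],
}

def flag_content(text: str) -> list[str]:
    """Return the themes whose keywords appear in the text (case-insensitive)."""
    hits = []
    lowered = text.lower()
    for theme, keywords in KEYWORD_STRATEGIES.items():
        for kw in keywords:
            # Word boundaries avoid matching inside longer words.
            if re.search(r"\b" + re.escape(kw) + r"\b", lowered):
                hits.append(theme)
                break  # one match is enough to tag the theme
    return hits
```

Because the strategies live in data rather than code, adding a new theme is just an update to the keyword table, which is what makes this approach fast to deploy against an emerging trend.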

Easy policy rule updates to specific content types

Our Trust and Safety customers regularly review their policy rules and adjust how much of their content is reviewed by human moderators versus automated by our moderation bot. We’ve added new self-service features to scope policy rules to specific content types, so you can add new rules for parts of your moderation flow in real time.

Agile updates to AI categorization (beta)

While keywords are often an easy starting point for responding to emerging trends, sometimes you need more sophisticated models to handle new types of content issues. Customers can now update large-language model (LLM) content classification in under 30 seconds, adapting their AI toolkit on the fly.
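One way to picture runtime-updatable LLM classification is a prompt built from editable configuration. The field names and labels below are a hypothetical sketch, not Checkstep’s actual configuration format:

```python
# Hypothetical classifier configuration that can be edited at runtime.
classifier_config = {
    "labels": ["hate_speech", "spam", "none"],
    "instructions": "Classify the content into exactly one label.",
}

def build_prompt(content: str, config: dict) -> str:
    """Render the classification prompt that would be sent to an LLM."""
    labels = ", ".join(config["labels"])
    return (
        f"{config['instructions']}\n"
        f"Allowed labels: {labels}\n"
        f"Content: {content}\n"
        f"Label:"
    )

# Responding to an emerging trend: add a new label with no redeploy.
classifier_config["labels"].append("election_misinfo")
```

Because the prompt is rendered from data on every request, a change to the label set or instructions takes effect on the very next piece of scanned content.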

Trust & Safety Workforce Management Updates

Managing and monitoring a Trust and Safety operations organization requires setting and raising the bar for the quality, efficiency, and safety of your teams. Checkstep is always updating its transparency tools and its efficiency features:

Average Handling Time (AHT) Reporting

Average handling time for Trust and Safety decisions is a key measure of team efficiency. Checkstep has added Average Handling Time (AHT) reporting for moderators and for queues, so you can see how quickly decisions are made and where content decisions are difficult. Customers are already using this statistic to measure efficiency gains, to coach and support moderators, and to identify areas where policies are not clear enough. We’ve also seen new customers achieve an 83% time saving on moderation decisions with Checkstep.
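The metric itself is simple to sketch. Assuming each decision record carries a moderator, a queue, and a handling duration (real systems would derive the duration from case open/close timestamps), AHT is just the mean per group; the record shape below is illustrative:

```python
from collections import defaultdict
from statistics import mean

# Each record: (moderator, queue, handling_seconds). Example data only.
decisions = [
    ("alice", "images", 42.0),
    ("alice", "images", 30.0),
    ("bob",   "text",   12.0),
    ("bob",   "images", 18.0),
]

def aht_by(key_index: int, records) -> dict[str, float]:
    """Average handling time grouped by moderator (index 0) or queue (index 1)."""
    groups = defaultdict(list)
    for record in records:
        groups[record[key_index]].append(record[2])
    return {key: mean(times) for key, times in groups.items()}
```

Grouping by queue highlights where decisions are slow (often a sign of unclear policy), while grouping by moderator supports coaching conversations.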

AI Marketplace Updates

Checkstep has added new AI scanning tools (including multi-modal scanning) to its AI marketplace. It’s easy to mix and match your scanning approach with Checkstep, and we recently made it easier to track your scanning usage across different AI models in ‘Settings’ > ‘Usage’. Monitor your AI spending and make more informed decisions about your scanning approach to continue to drive efficiency for your Trust and Safety operations.

Small Improvements for your Trust & Safety Operations

We’ve also launched a ton of optimizations and small improvements for the platform:

  • Small-screen moderator view updates: see all metadata even if you’re moderating on a small screen.
  • New date picker in reporting: it’s now easier to pick a date range for reports.
  • Updated resiliency features: we’ve added resiliency options for moderators (blur, grayscale, etc.) while watching a video.
  • Self-service user management: update user roles or disable accounts yourself in ‘Settings’ > ‘Members’.
  • CSV export from quality assurance: download a CSV to dig into quality assurance metrics from the Checkstep dashboard.
  • CSV export for transparency reporting: customers not using our transparency portal can now download a CSV of key transparency metrics to build a separate report.

And of course, we’ve made a number of performance improvements and bug fixes. Stay tuned for a busy autumn of Checkstep product development.

Book a Demo: Experience Checkstep’s Latest Updates Live!
