
Building a trusted and authentic social shopping experience: Bloop partners with Checkstep for comprehensive moderation and compliance solutions

Bloop x Checkstep

The fast-growing ecommerce startup Bloop has partnered with Checkstep to safeguard the user-generated content (UGC) on its new social shopping platform, ensuring Trust and Safety for users.

About Bloop

Bloop is reshaping the social shopping landscape by rewarding real consumer influence. Bloop combines the best elements of social networking and marketplace platforms. The team aims to revolutionize shopping by rewarding users for posting real consumption recommendations, creating an engaging, community-driven experience. How it works: consumers earn 10% credit just for posting, and 5% credit when they generate a sale.

Bloop will launch in Portugal in early 2025, with plans to expand to Spain and France in the following years. The founders have raised a $1.3 million pre-seed round, making Bloop one of the best-funded early-stage e-commerce startups in Europe.

Bloop's moderation challenges

With the rise of GenAI, platforms have a higher risk of scams, misinformation and impersonation, with an increase in accounts created with the sole purpose of deceiving other users. Bloop’s mission is to build strong and distinct policies and processes for both its social feed and marketplace to mitigate these risks. 

Bloop’s approach to moderation will be proactive rather than reactive (i.e. relying on users to report unwanted content). Ahead of launch, Bloop’s main concern is the risk of inappropriate, duplicate, or misleading content spreading across both the social feed and the marketplace. The team actively sought a solution that could help them develop distinct strategies and responses for each part of the platform in an efficient, compliant, and cost-effective manner.

To ensure the authenticity of every post (i.e. verifying real users, detecting bots or AI-generated content, and preventing reposts or copyright infringement), Bloop has chosen to partner with Checkstep. Checkstep will help Bloop develop proactive moderation strategies from the start.

Goals of our partnership 

Through this partnership, Bloop will integrate Checkstep’s AI-powered moderation tools to ensure a safe environment for its users. By moderating both the social feed and the marketplace, Checkstep provides Bloop with comprehensive protection against inappropriate or misleading content, while also ensuring compliance with the European Union’s Digital Services Act (DSA).

Checkstep’s policy-centric moderation solution allows Bloop to apply different content rules across its UGC sources, creating a tailored experience for both social interactions and marketplace transactions.
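To illustrate the idea of per-source policy routing, here is a minimal sketch. The policy names, sources, and `policies_for` function are hypothetical assumptions for illustration only, not Checkstep's actual API or Bloop's real configuration.

```python
# Illustrative sketch: routing UGC to different policy sets by source.
# All names below are hypothetical, not Checkstep's real API.

POLICIES = {
    # Rules tailored to social interactions on the feed
    "social_feed": ["hate_speech", "explicit_content", "ai_generated_spam"],
    # Rules tailored to marketplace transactions
    "marketplace": ["counterfeit_listing", "misleading_claims", "duplicate_listing"],
}

def policies_for(source: str) -> list[str]:
    """Return the policy set that applies to a given UGC source."""
    return POLICIES.get(source, [])

# A feed post and a product listing are checked against different rules:
print(policies_for("social_feed"))
print(policies_for("marketplace"))
```

The design point is simply that one moderation pipeline can apply different rule sets depending on where the content originates, rather than forcing a single policy across the whole platform.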

“We are excited to collaborate with Checkstep as we launch our innovative platform,” said João Neves, Bloop’s CTO, to Andrew Kaptarenko, Technical Account Manager at Checkstep. “Your expertise in content moderation and compliance will help us ensure that our users can trust the content they see and share on our platform.”

Bloop’s legal counsel was particularly impressed with Checkstep’s experience with DSA compliance and the flexibility of its moderation solutions. Having previously evaluated other solutions, such as AWS Rekognition and Azure Content Moderation, Bloop found that Checkstep offered a superior end-to-end moderation suite that required fewer internal resources, allowing the team to focus on platform growth and innovation. In short, Bloop can focus on building the core of its product, while having a flexible way to anticipate any risk linked to user-generated content, including explicit content and hate speech that can damage brands and user experiences.

Bloop’s Head of Business Development added: “As a fast-growing platform, moderation is a small part of our overall operations, but we needed a solution that minimized our resource allocation while still ensuring complete safety and compliance. Checkstep’s consultative approach was exactly what we needed, helping us navigate the complexities of Trust & Safety.”

Why Bloop chose Checkstep

  • Consultative partnership: Checkstep’s team, led by Andrew, worked closely with Bloop to provide guidance on Trust & Safety and DSA compliance, ensuring that the solution was customized to Bloop’s specific needs.
  • Comprehensive moderation coverage: Checkstep’s end-to-end moderation suite offers seamless coverage across Bloop’s social feed and marketplace, reducing the strain on internal teams and providing users with a safe environment.
  • DSA Compliance Expertise: Checkstep’s experience with companies like Shein and their deep knowledge of the DSA regulations were critical factors in Bloop’s decision to partner with them.

This partnership marks a significant milestone for Bloop as it prepares for its Q1 2025 launch, positioning the platform as a trusted, safe, and innovative space for social shopping.

About Checkstep

Checkstep is a leading AI-powered Trust & Safety platform, helping companies ensure compliance and trust in their platforms with their complete software suite. With expertise in managing user-generated content and ensuring compliance with the European Union’s Digital Services Act, Checkstep partners with companies across multiple industries to create safer online environments.
