7 dating insights from London Global Dating Insights Conference 2024

Justin at GDI 2024

Hi, I’m Justin, Sales Director at Checkstep. In September, I had the opportunity to attend the Global Dating Insights Conference 2024, where leaders in the dating industry gathered to discuss emerging trends, challenges, and the evolving landscape of online dating. This year’s conference focused on how dating platforms are adapting to new user behaviors, regulatory changes, and the need for enhanced trust and safety measures. From the rise of conscious dating to combating toxic behaviors, here are seven key insights shaping the future of digital dating.

Dating Insight #1: The rise of conscious dating and user intent focus

The first dating insight concerns a large share of users, especially younger generations, who are leaning towards intentional, values-driven dating rather than casual encounters. This shift emphasizes trust, safety, and meaningful interactions, making it crucial for dating platforms to moderate content effectively so they can foster deeper connections and keep users safe.

Dating Insight #2: Challenges with toxic behaviors

Digital dating faces specific challenges such as ghosting, breadcrumbing, kittenfishing, and orbiting. These behaviors undermine trust and cause emotional distress for users. During the conference, experts said they want content moderation platforms to play a role by detecting patterns in user communications and addressing these harmful behaviors proactively, delivering safer and more engaging experiences.

Dating Insight #3: Increased scrutiny and regulation

The third dating insight is, unsurprisingly, about regulation. With the Digital Services Act, the Online Safety Acts, and the many other regulations already in force or on the way, dating platforms need robust moderation systems to remain compliant and avoid fines. Solutions should include tools for monitoring evolving content and maintaining a balance between automation and human moderation. They should also ensure sound data collection so that moderation decisions can be reported transparently to the EU Transparency Database. You can read about Shein’s experience with Checkstep’s latest connection to the database here.
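As a purely illustrative sketch, the kind of data worth capturing at decision time might look like the record below. The field names are placeholders invented for this example, not the official Statement of Reasons schema; they would need to be mapped to the EU Transparency Database's required fields when reporting.

```python
# Hypothetical per-decision record kept for DSA-style transparency reporting.
# Field names are illustrative; map them to the official Statement of Reasons
# schema required by the EU Transparency Database when submitting.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ModerationDecisionRecord:
    content_id: str
    decision: str            # e.g. "removed", "restricted", "no_action"
    policy_ground: str       # internal policy or legal ground for the decision
    detected_by: str         # "automated", "human", or "hybrid"
    user_notified: bool
    decided_at: str

record = ModerationDecisionRecord(
    content_id="profile_987",
    decision="removed",
    policy_ground="impersonation / fake profile",
    detected_by="hybrid",
    user_notified=True,
    decided_at=datetime.now(timezone.utc).isoformat(),
)
print(asdict(record))  # ready to be queued for transparency reporting
```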

Dating Insight #4: Need for hybrid moderation models

Platforms benefit from combining AI-driven moderation (for scalability and speed) with human oversight (for nuanced cases). This approach ensures faster removal of inappropriate content, detects scams, and handles complex situations like identity-based harassment more effectively. Having a strong policy team design and review moderation policies, using LLMs to scan, detect, and action the most harmful content, and leaving only marginal cases to human moderators is not only more efficient but also better for moderators' well-being.
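As an illustration, here is a minimal sketch of how such a hybrid pipeline might route content. The labels, thresholds, and routing outcomes are assumptions made for the example, not a description of any specific product.

```python
# Hypothetical hybrid moderation router: an LLM (or any classifier) scores
# each message, clear-cut cases are auto-actioned, and only uncertain or
# sensitive cases are queued for human moderators.

from dataclasses import dataclass

AUTO_ACTION_THRESHOLD = 0.90   # assumed threshold; tuned per policy and label
HUMAN_REVIEW_THRESHOLD = 0.50

@dataclass
class ModerationResult:
    label: str          # e.g. "harassment", "scam", "safe"
    confidence: float   # classifier confidence between 0 and 1

def route(message_id: str, result: ModerationResult) -> str:
    """Decide what happens to a message based on the classifier output."""
    if result.label == "safe":
        return "publish"
    if result.confidence >= AUTO_ACTION_THRESHOLD:
        # High-confidence violation: remove automatically and log the decision.
        return "auto_remove"
    if result.confidence >= HUMAN_REVIEW_THRESHOLD:
        # Ambiguous or nuanced case: send to the human moderation queue.
        return "human_review"
    # Low-confidence signal: publish but keep for sampling and QA.
    return "publish_with_monitoring"

# Example: an identity-based harassment flag with middling confidence
# ends up with a human moderator rather than being auto-removed.
print(route("msg_123", ModerationResult(label="harassment", confidence=0.72)))
```

The point of the threshold split is that moderators only ever see the ambiguous middle band, which keeps queues short and limits their exposure to the most graphic content.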

Dating Insight #5: Combatting catfishing and fake profiles

With the prevalence of fake profiles and romance scams, platforms are increasingly implementing photo verification and identity checks. Moderation systems can further enhance this by using AI to flag suspicious activity patterns and by integrating real-time verification systems. Dating companies we met at the conference consistently report spam and scams as the biggest risk they need to manage on their platforms.
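By way of illustration only, a first line of defense often combines a few simple behavioral signals before any manual review. The signals and weights below are invented for this sketch and are not a real detection model.

```python
# Hypothetical scam-risk scorer: combines behavioral signals that commonly
# correlate with fake profiles or romance scams. Signals and weights are
# illustrative placeholders, not production logic.

def scam_risk_score(profile: dict) -> float:
    score = 0.0
    if not profile.get("photo_verified", False):
        score += 0.3                       # unverified photos
    if profile.get("messages_sent_first_hour", 0) > 50:
        score += 0.3                       # mass outreach shortly after signup
    if profile.get("asks_to_move_off_platform", False):
        score += 0.25                      # classic romance-scam pattern
    if profile.get("account_age_days", 0) < 1:
        score += 0.15                      # brand-new account
    return min(score, 1.0)

profile = {
    "photo_verified": False,
    "messages_sent_first_hour": 80,
    "asks_to_move_off_platform": True,
    "account_age_days": 0,
}
print(scam_risk_score(profile))  # 1.0 -> flag for verification or human review
```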

Dating Insight #6: Adapting moderation to new dating trends

New dating trends such as open casting (engaging with multiple partners simultaneously) reflect the complexity of modern dating interactions. Moderation tools need to handle growing volumes of communication while focusing on intent-based filtering: identifying harmful behaviors even when they are phrased differently or subtly. This is something we’ve developed at Checkstep, letting you change your labels and the definitions you provide to your LLM in seconds, even while scanning, as sketched below.
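The snippet below sketches what intent-based, configurable labels can look like in practice. The label names, definitions, and prompt format are placeholders for illustration, not Checkstep's actual configuration or API.

```python
# Hypothetical intent-based filtering: policy labels are plain-text
# definitions that can be edited at any time, and the prompt sent to the
# LLM is rebuilt from them on every scan. Names and wording are examples.

LABELS = {
    "coercive_pressure": "Messages that pressure someone into meeting, "
                         "sharing images, or continuing contact after a refusal.",
    "financial_solicitation": "Any request for money, gift cards, crypto, "
                              "or bank details, however it is phrased.",
}

def build_prompt(message: str) -> str:
    definitions = "\n".join(f"- {name}: {desc}" for name, desc in LABELS.items())
    return (
        "Classify the message against these policy labels "
        "and answer with the matching label names, or 'none'.\n"
        f"Labels:\n{definitions}\n\n"
        f"Message: {message}"
    )

# Updating a definition takes effect on the next scan, with no retraining.
LABELS["financial_solicitation"] += " Includes indirect hints such as unpaid bills."
print(build_prompt("I'd love to visit you but I can't afford the ticket..."))
```

Because the policy lives in the label definitions rather than in model weights, intent that is phrased in a new or subtle way can be covered by editing a sentence instead of collecting training data.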

Dating Insight #7: User retention through enhanced Trust & Safety

Safety on dating apps correlates directly with better user retention, as safer platforms encourage long-term engagement. Tools that give users control (e.g., block, report, or mute functions) are becoming essential features for dating apps.

These insights from the Global Dating Insights 2024 Conference underscore the critical role content moderation plays in shaping the future of dating platforms. As user expectations evolve and regulations tighten, platforms that can balance innovation with user safety will be best positioned to succeed.

We want content moderation to enhance your users’ experience so that they can find their special someone more easily.
