
Expert’s Corner with Head of Product Yu-Lan Scholliers

We recently spoke with Checkstep’s newest addition to the team, Yu-Lan Scholliers, who joins us as Head of Product after her recent role as a Trust and Safety Specialist at Facebook. We asked her about her background and interests to get to know her a little better.

1. What drew you to the Trust & Safety / Online Safety space?

I would say that the space picked me! A couple of years ago, I wanted to change teams within Meta but had no idea what I wanted to do. The manager of the Trust & Safety team (internally it’s called Community Integrity) presented me with a couple of options, and when I saw counter-terrorism on the list I immediately knew that’s what I wanted to work on. Later on, I also worked on other areas such as Adult Sexual Exploitation and Suicide & Self-Injury Prevention. I love the Trust & Safety space for two main reasons:

  • Helping to build safe communities: the internet is here to stay, and I want to contribute to making it a safe space where people can get together, share, interact, and learn.
  • The variety of challenges: not only technical ones, but also defining and refining policies based on learnings and integrating legislative requirements. Above all, the adversarial aspect in some areas is fascinating; this problem is never fully solved.

2. Why did you choose Checkstep?

Moving to a start-up was a big step for me after spending my entire career so far in big corporations:

  • I believe Checkstep is a great product: it aggregates different technologies through both in-house development and partnerships, together with workflow management, so we can offer clients a one-stop shop.
  • What convinced me was the founders’ proven track record of successfully building and growing start-ups, and their mission-driven mindset.

I’m also excited to join a start-up as a new personal challenge, bringing my experience in Trust & Safety to this fast-paced and innovative environment, working with some incredible people!

3. What are your interests outside of work, your hobbies etc.?

Ah, so many! I love exploring new places with my partner (2021 has been my most well-traveled year ever), and I enjoy spending time with my friends. I have also started learning to play the piano, and you can find me cantering through the countryside on my horse, Picasso. I’m a curious person and love learning new things, so I read a lot of books and listen to podcasts on a variety of topics.

4. Is the upcoming legislation (UK’s Online Safety Bill) a reflection of existing Trust & Safety practices at Big Tech? What do you think the impact is on smaller platforms?

In my opinion, Big Tech has played a major role in shaping online safety regulations, both in the UK and in Europe. Due to the sheer size of these networks, the impact of any safety issue on the platform can be severe, which has driven regulators to think about how to deal with them. I believe an open discussion on regulation is welcome; however, the current regulation sometimes remains very vague on some hugely complex issues. Big platforms already have many tools in place, so the legislation will mainly affect smaller platforms, as they do not necessarily have the required resources, nor do they want to run the risk of facing jail time. As a result, I believe some platforms might seriously reconsider the value of their UK presence, or they might err on the side of caution, which could result in over-enforcement.

5. What do you think are the key indicators of a good trust and safety system?

In general, it’s always a good sign if a platform has a strong cross-functional group of people working on the problem, i.e. policy experts, engineering, product, and operations. On a more granular level, a good system is designed both to prevent harm from happening and to reduce any harm that does occur in a timely and efficient manner. The ultimate indicator is the prevalence of a given harm on the platform: a platform wants to be able to measure how much harm there is and reduce it over time. However, measuring prevalence can be extremely difficult, and a range of other indicators exist that can serve as proxies. I could write a whole book on this, but the best way to learn more is to have a look at the transparency reports published by various tech platforms.

6. Do you think women are well-represented in tech? Do you think diversity and the tech industry go hand-in-hand?

No, and neither are other underrepresented groups. I was fortunate to work with many incredible people, including women, during my time at Meta, because the data function was very well balanced in terms of gender; however, I have also worked with engineering teams that were all male. This is due to a combination of supply issues, i.e. some groups not having access to education or not believing tech is the right path for them, and hiring challenges.

I do think the tech industry has the means and capabilities to address this, i.e. by providing access to education and information. Two examples of these efforts I really like are The Female Lead and the Your Life Campaign, both driven by Edwina Dunn. I believe we need to focus on all-encompassing diversity, going beyond gender alone, by supporting and inspiring adolescents. By doing so, not only can we lift up and grow more people, but we can also ensure that the people who build platforms are culturally diverse and representative of the users on those platforms.

7. Isolation often leads people to take drastic measures, such as self-harm or partaking in online challenges that could be dangerous. Platforms often find such incidents harder to detect. In your opinion, how can platforms be better prepared?

There is a delicate balance between allowing people to express thoughts around mental health, suicide, and self-harm, and protecting vulnerable users. By keeping such content on a platform, a community can intervene and reach out, but that same content can also be extremely triggering. I believe one of the things a platform can do here is to have a clear and comprehensive policy in place, which product teams can then translate into detection and safeguarding efforts, while also building tools for the community to raise any issues or risks they perceive on the platform.

8. When the Taliban took control of Afghanistan, they also took control of the national social media channels. At that point, however, they were still designated as a terrorist organization. Even though no extreme activity has been detected, do you think they should be allowed access to the various social media platforms?

I am against any form of violence, and for me, violence has no place in society, let alone on social media. The issue can be very nuanced, however: it is up for debate whether a fighter using social media for personal use, without explicit affiliation to any entity, should be banned. There are often no clear-cut decisions, and it’s very important for platforms to have a clear policy on this, while also ensuring there is a feedback loop in place. Designating organizations can be very polarizing and politically charged, and it’s important that platforms are aware of the impact this can have.
