We recently spoke with Checkstep’s newest addition to the team, Yu-Lan Scholliers, who joins us as Head of Operations from Facebook, where she was a Trust and Safety Specialist. We asked her a bit about her background and interests to get to know her a little better.
1. What drew you to the Trust & Safety / Online Safety space?
I would say that the space picked me! A couple of years ago, I wanted to change teams within Meta but had no idea what I wanted to do. The manager of the Trust & Safety team (internally it’s called Community Integrity) presented me with a couple of options, and when I saw counter-terrorism on the list I immediately knew that’s what I wanted to work on. Later on, I also worked on other areas such as Adult Sexual Exploitation and Suicide & Self-Injury Prevention. I love the Trust & Safety space for two main reasons:
- Helping to build safe communities: the internet is here to stay, and I want to contribute to making it a safe space where people can get together, share, interact and learn
- The variety of challenges: not only technical challenges, but also how to define and refine policies based on learnings and how to integrate legislative requirements. Above all, the adversarial aspect in some areas is fascinating; this problem is never fully solved.
2. Why did you choose Checkstep?
Moving to a start-up was a big step for me after spending my entire career so far in big corporations:
- I believe Checkstep is a great product that aggregates different technologies, through both in-house development and partnerships, together with workflow management, so we can offer clients a one-stop shop
- What convinced me was the founders’ proven track record of successfully building and growing start-ups, and their mission-driven mindset
I’m also excited to join a start-up as a new personal challenge, bringing my experience in Trust & Safety to a fast-paced, innovative environment and working with some incredible people!
3. What are your interests outside of work, your hobbies, etc.?
Aah, so many!! I love exploring new places with my partner (2021 has been my most well-traveled year ever), and I enjoy spending time with my friends. I also started learning how to play the piano, and you can find me cantering through the countryside on my horse, Picasso. I’m a curious person and love learning new things, so I read a lot of books and listen to podcasts on a variety of topics.
4. Is the upcoming legislation (UK’s Online Safety Bill) a reflection of existing Trust & Safety practices at Big Tech? What do you think the impact is on smaller platforms?
In my opinion, Big Tech has played a major role in shaping online safety regulations, both in the UK and in Europe. Due to the sheer size of these networks, the impact of any safety issue on the platform can be severe, which has driven regulators to think about how to deal with them. I believe an open discussion on regulation is welcome; however, the current regulation sometimes remains very vague on some hugely complex issues. Big platforms already have many tools in place, so the legislation will mainly affect smaller platforms, which do not necessarily have the required resources, nor do they want to run the risk of facing jail time. As a result, I believe some platforms might seriously reconsider the value of their UK presence, or they might err on the side of caution, which could result in over-enforcement.
5. What do you think are the key indicators of a good trust and safety system?
In general, it’s always a good sign if a platform has a strong cross-functional group of people working on the problem, i.e. policy experts, engineering, product and operations. On a more granular level, a good system is designed both to prevent harm from happening and to reduce any harm that does occur in a timely and efficient manner. The ultimate indicator is the prevalence of a given harm on the platform: a platform wants to be able to measure how much harm there is and to reduce it over time. However, measuring prevalence can be extremely difficult, and a range of other indicators exists that might serve as proxies. I could write a whole book on this, but the best way to learn more is to have a look at the transparency reports published by various tech platforms.
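To make the prevalence idea concrete, here is a minimal sketch of the sampling approach described in platform transparency reports: draw a random sample of content views, label each one, and report the violating share with a confidence interval. All names here (`views`, `is_violating`, the sample size) are illustrative assumptions, not Checkstep’s or any specific platform’s actual methodology.

```python
import math
import random

def estimate_prevalence(views, is_violating, sample_size=1000, z=1.96):
    """Estimate the share of content views that violate policy.

    `views` is a list of viewed content items and `is_violating` is a
    labeling function (in practice, a human-review verdict). Both names
    are hypothetical, for illustration only.
    """
    sample = random.sample(views, min(sample_size, len(views)))
    hits = sum(1 for item in sample if is_violating(item))
    p = hits / len(sample)  # point estimate of prevalence
    # Normal-approximation 95% confidence interval for a proportion
    margin = z * math.sqrt(p * (1 - p) / len(sample))
    return p, (max(0.0, p - margin), min(1.0, p + margin))

# Toy example: roughly 0.5% of simulated views are violating
views = [{"violating": random.random() < 0.005} for _ in range(100_000)]
p, (lo, hi) = estimate_prevalence(views, lambda item: item["violating"])
print(f"Estimated prevalence: {p:.3%} (95% CI: {lo:.3%} to {hi:.3%})")
```

The arithmetic is the easy part; as noted above, the practical difficulty is obtaining reliable labels at scale, which is why platforms often fall back on proxy indicators.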
6. Do you think women are well-represented in tech? Do you think diversity and the tech industry go hand-in-hand?
No, and neither are other diversity groups. I have been fortunate to work with many incredible people, including women, during my time at Meta, because the data function was very well balanced in terms of gender; however, I have also worked with engineering teams that were all male. This is due to a combination of supply issues, i.e. some groups not being able to access education or not believing tech is the right path for them, and hiring challenges.
I do think the tech industry has the means and capabilities to address this, e.g. by providing access to education and information. Two examples of such efforts I really like are The Female Lead and the Your Life Campaign, both driven by Edwina Dunn. I also believe we need to pursue all-encompassing diversity, looking beyond gender alone, by supporting and inspiring adolescents. By doing so, not only can we lift up and grow more people, but we can also ensure that the people who build platforms are culturally diverse and representative of the users on those platforms.
7. Isolation often leads people to take drastic measures, such as self-harm or partaking in online challenges that could be dangerous. Platforms often find such incidents harder to detect. In your opinion, how can platforms be better prepared?
There is a delicate balance between allowing people to express thoughts around mental health, suicide and self-harm, and protecting vulnerable users. Keeping such content on a platform lets the community intervene and reach out, yet that same content can also be extremely triggering. I believe one of the things a platform can do here is to have a clear and comprehensive policy in place, which product teams can then translate into detection and safeguarding efforts, while also building tools for the community to raise any issues or risks they perceive on the platform.
8. When the Taliban took control of Afghanistan, they also took control of the national social media channels. But at that point, they were still designated a terrorist organization. Even though no extreme activity has been detected, do you think they should be allowed access to the various social media platforms?
I am against any form of violence; for me, violence has no place in society, let alone on social media. The issue can be very nuanced, however: it is up for debate whether a fighter using social media for personal use, without explicit affiliation to any entity, should be banned. There are often no clear-cut decisions, and it’s very important for platforms to have a clear policy on this, while also ensuring there is a feedback loop in place. Designating organizations can be very polarizing and politically charged, and it’s important that platforms are aware of the impact this can have.