
Minor protection: 3 updates you should make to comply with DSA provisions


Introduction

While the EU already has some rules to protect children online, such as those found in the Audiovisual Media Services Directive, the Digital Services Act (DSA) introduces specific obligations for platforms.

As platforms adapt to the DSA's minor-protection provisions, it's important for businesses to take proactive measures to comply with these obligations effectively.

This article will go over three essential updates that businesses and online platforms should prioritize to align with DSA provisions regarding minor protection:

  • create strong privacy, safety, and security measures
  • avoid targeted ads for minors
  • include the rights of children in the digital provider's risk assessment.

These updates are crucial for creating a safer online environment for minors while meeting regulatory obligations. Let's explore each one in detail to understand how it can strengthen your compliance efforts and better protect minors.

Minor protection: ensuring a high level of privacy, safety and security

What the DSA says

Article 28.1: “Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service.”

What it means for online platforms

Digital providers must design their interfaces with the highest level of privacy, safety, and security for minors. Best practices and available guidance can be found in the new European strategy for a better internet for kids (BIK+). Here are its main takeaways (a minimal configuration sketch follows the list):

  1. Collaboration and Partnership: The BIK+ notes the importance of multi-stakeholder cooperation to address the challenges faced by young internet users.
  2. Safer Internet Centers: Use the resources and support offered by Safer Internet Centers, which are established across Europe to promote safe and responsible internet use among children and young people. These centers provide a lot of educational resources, helplines, and awareness-raising campaigns to empower minors and their caregivers with the tools and knowledge needed to navigate the online world safely.
  3. Age-Appropriate Content and Services: The BIK+ emphasizes the importance of age-appropriate design and content moderation to ensure that digital services cater to the unique needs and vulnerabilities of young users.
  4. Digital Literacy and Education: Invest in digital literacy programs and educational initiatives to help improve minors' digital skills and critical thinking abilities. The BIK+ initiative offers a range of resources and tools for educators, parents, and young people to promote digital literacy and responsible online behavior.
  5. Empowering Young People: Empower young people to become active participants in shaping their online experiences and advocating for their rights and safety. Encourage platforms to involve young users in the design and governance of digital services.
  6. Continuous Learning and Improvement: Stay informed about evolving trends, technologies, and risks online through constant learning and professional development. Engage with industry forums, conferences, and training programs to exchange insights and best practices with peers.
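To make the age-appropriate design point concrete, here is a minimal sketch in TypeScript of what safe-by-default settings for accounts identified as belonging to minors could look like. The interface and function names are hypothetical illustrations, not requirements spelled out in the DSA or BIK+:

```typescript
// Minimal sketch, assuming a hypothetical settings model: safe-by-default
// account settings applied to users identified as minors. None of these
// names come from the DSA or BIK+ themselves.

interface AccountSettings {
  profileVisibility: "public" | "private";
  directMessages: "everyone" | "contacts_only";
  geolocationSharing: boolean;
  personalizedAds: boolean;
  recommendations: "personalized" | "age_appropriate_defaults";
}

// Safe defaults for minors: private profile, limited contact, no location
// sharing, no profiling-based ads (see Article 28.2 below).
function defaultSettingsForMinor(): AccountSettings {
  return {
    profileVisibility: "private",
    directMessages: "contacts_only",
    geolocationSharing: false,
    personalizedAds: false,
    recommendations: "age_appropriate_defaults",
  };
}

// Applied at signup, or whenever a user is later identified as a minor.
function applyMinorSafeguards(isMinor: boolean, current: AccountSettings): AccountSettings {
  return isMinor ? defaultSettingsForMinor() : current;
}
```

The key design choice is that the safest configuration is the starting point, so protection does not depend on a minor (or their caregiver) discovering and enabling the right settings.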

Minor protection: avoid targeting minors through ads

What the DSA says

Article 28.2: “Providers of online platform shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.”

What it means for online platforms

What comes out of this article is that digital service providers must not present advertisements to minors based on profiling.

Targeted advertisements (or targeted ads) use the information users share online and the websites they visit to show them ads they are predicted to respond to. This is especially problematic for minors, because such ads may push them to buy things they don't need or present only one side of a story.
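As one illustration, here is a minimal TypeScript sketch of how an ad-serving path could enforce this rule. The types and the `knownMinorWithReasonableCertainty` signal are hypothetical; how a platform establishes "reasonable certainty" (declared age, verification signals, parental controls) is left open by the DSA:

```typescript
// Minimal sketch of an ad-selection guard for the Article 28.2 rule: no
// profiling-based ads when the recipient is known, with reasonable
// certainty, to be a minor. All type and function names are hypothetical.

interface Recipient {
  // How "reasonable certainty" is established (declared age, verification
  // signals, parental controls) is up to the provider.
  knownMinorWithReasonableCertainty: boolean;
}

// Contextual ads rely on the content of the page rather than the user's
// personal data, so they remain available for minors.
type AdStrategy = "profiling_based" | "contextual_only";

function selectAdStrategy(recipient: Recipient): AdStrategy {
  return recipient.knownMinorWithReasonableCertainty
    ? "contextual_only"
    : "profiling_based";
}
```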

Minor protection: include the rights of children in your risk assessment

What the DSA says

Very large online platforms and search engines (VLOPs and VLOSEs) must make additional efforts to protect minors.

VLOPs and VLOSEs, those with more than 45 million monthly active users in the EU, have to comply with stricter rules, because “the greater the size, the greater the responsibilities of online platforms”. Under the DSA, they must conduct risk assessments at least once a year to gauge any negative effects on the fundamental rights enshrined in the Charter of Fundamental Rights of the European Union, including privacy, freedom of expression and information, the prohibition of discrimination, and the rights of the child.

The reference here is to children's right to the protection and care necessary for their well-being. In all actions relating to children, whether taken by public authorities or private institutions, the child's best interests must be a primary consideration. Audits must also consider how Big Tech's algorithms affect the physical and mental health of minors.

What it means for online platforms

This means making sure that large online platforms' risk assessments cover fundamental rights, including the rights of the child. Platforms should assess how easy it is for children and adolescents to understand how the service works, as well as their possible exposure to content that could impair their physical or mental wellbeing or moral development.

By incorporating children's rights into risk assessments, digital service providers can develop stronger safeguards and mitigation strategies adapted to the unique needs and vulnerabilities of minors.
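As a rough illustration, a provider could track children's-rights findings as a structured part of its annual risk assessment. The sketch below, with hypothetical TypeScript types and category names, shows one possible shape; it is not a format prescribed by the DSA:

```typescript
// Rough sketch: recording children's-rights findings as a structured part
// of an annual risk assessment. The categories loosely track the concerns
// discussed above; the format is illustrative, not prescribed by the DSA.

interface ChildRightsRisk {
  category:
    | "exposure_to_harmful_content"
    | "unwanted_contact"
    | "data_privacy"
    | "algorithmic_effects_on_wellbeing"
    | "comprehensibility_of_service";
  severity: "low" | "medium" | "high";
  mitigation: string;
}

interface AnnualRiskAssessment {
  year: number;
  childRightsRisks: ChildRightsRisk[];
}

const example: AnnualRiskAssessment = {
  year: 2024,
  childRightsRisks: [
    {
      category: "algorithmic_effects_on_wellbeing",
      severity: "high",
      mitigation: "non-personalized recommendations enabled by default for minors",
    },
  ],
};
```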

For example, TikTok and YouTube, in addition to banning targeted ads for minors, have set minors' profiles to private by default, which means that the videos they upload can only be viewed by people they approve.

Conclusion

The DSA's minor-protection provisions recognize children's rights and their particular vulnerabilities online. As digital service providers navigate the complexities of compliance, it becomes imperative to integrate the rights of children into risk assessments comprehensively. This entails acknowledging children's rights to privacy, safety, and access to age-appropriate content, while also considering the potential risks they may encounter online, including exposure to harmful content, exploitation, and data privacy breaches.

Checkstep helps providers to comply with the three new DSA minor protection updates by providing an all-in-one DSA compliance solution.


What is the DSA?

The Digital Services Act (DSA) is European Union legislation that regulates digital services to address issues like online content moderation, disinformation, and user protection. It entered into force in November 2022 and has applied in full since February 2024.


What does the DSA say about minor protection?

The Digital Services Act (DSA) includes provisions to protect minors online, such as: a requirement that platforms accessible to minors ensure a high level of privacy, safety, and security; a ban on presenting minors with advertisements based on profiling; and, for the largest platforms, the obligation to cover the rights of the child in annual risk assessments.
