
Minor protection: 3 updates you should make to comply with DSA provisions


Introduction

While the EU already has some rules to protect children online, such as those found in the Audiovisual Media Services Directive, the Digital Services Act (DSA) introduces specific obligations for platforms.

As platforms adapt to the DSA's minor-protection provisions, it's important for businesses to take proactive measures to comply with these regulations effectively.

This article will go over three essential updates that businesses and online platforms should prioritize to align with the DSA's provisions on minor protection:

  • create strong privacy, safety and security measures
  • avoid targeted ads for minors
  • include the rights of children in the digital provider's risk assessment.

These DSA minor-protection updates are crucial for creating a safer online environment for minors while meeting regulatory obligations. Let's explore them in detail to understand how they can strengthen your compliance efforts and keep minors protected.

Minor protection: ensuring a high level of privacy, safety and security

What the DSA says

Article 28.1: “Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service.”

What it means for online platforms

Digital providers must design their interfaces to give minors the highest level of privacy, security and safety. Best practices and available guidance are set out in the new European strategy for a better internet for kids (BIK+). Here are the main points to take from it (a short illustration of privacy-by-default settings follows the list):

  1. Collaboration and Partnership: The BIK+ notes the importance of multi-stakeholder cooperation to address the challenges faced by young internet users.
  2. Safer Internet Centers: Use the resources and support offered by Safer Internet Centers, which are established across Europe to promote safe and responsible internet use among children and young people. These centers provide educational resources, helplines, and awareness-raising campaigns that equip minors and their caregivers with the tools and knowledge needed to navigate the online world safely.
  3. Age-Appropriate Content and Services: The BIK+ emphasizes the importance of age-appropriate design and content moderation to ensure that digital services cater to the unique needs and vulnerabilities of young users.
  4. Digital Literacy and Education: Invest in digital literacy programs and educational initiatives to help improve minors' digital skills and critical-thinking abilities. The BIK+ initiative offers a range of resources and tools for educators, parents, and young people to promote digital literacy and responsible online behavior.
  5. Empowering Young People: Empower young people to become active participants in shaping their online experiences and advocating for their rights and safety. Encourage platforms to involve young users in the design and governance of digital services.
  6. Continuous Learning and Improvement: Stay informed about evolving trends, technologies, and risks online through constant learning and professional development. Engage with industry forums, conferences, and training programs to exchange insights and best practices with peers.
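
To make the design guidance concrete, here is a minimal sketch of what privacy-by-default settings for a minor's account could look like. It is illustrative only: the type and function names are our own assumptions, and neither the DSA nor BIK+ prescribes a specific configuration.

```python
from dataclasses import dataclass


@dataclass
class AccountSettings:
    profile_visibility: str      # "public" or "private"
    direct_messages_from: str    # "everyone" or "approved_contacts"
    discoverable_in_search: bool
    show_online_status: bool
    personalized_ads: bool


def apply_minor_defaults(settings: AccountSettings) -> AccountSettings:
    """Apply the most protective defaults for a minor's account.

    Article 28.1 asks for "appropriate and proportionate measures";
    defaulting to the most private configuration is one common way
    to evidence that.
    """
    settings.profile_visibility = "private"
    settings.direct_messages_from = "approved_contacts"
    settings.discoverable_in_search = False
    settings.show_online_status = False
    settings.personalized_ads = False  # also anticipates Article 28.2
    return settings
```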

Minor protection: avoid targeting minors through ads

What the DSA says

Article 28.2: “Providers of online platforms shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.”

What it means for online platforms

What comes out of this article is that digital service providers must not present advertisements to minors based on profiling.

Targeted advertisements (or targeted ads) use the information a person shares online and the websites they visit to show that person ads the advertiser thinks they will respond to. This is a particular concern for minors, because such ads are built to persuade them to buy things they don't need, or show only one side of a story.
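
As a sketch of how a platform might honor Article 28.2 in its ad-serving logic, the example below filters out profiling-based ads whenever the recipient is, with reasonable certainty, a minor. The minor-detection signal and all names here are hypothetical assumptions; the DSA does not mandate a specific age-assurance method.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class User:
    declared_age: Optional[int]


@dataclass
class Ad:
    creative_id: str
    targeting: str  # "contextual" or "profiled"


def is_minor_with_reasonable_certainty(user: User) -> bool:
    # Hypothetical signal: platforms typically combine a declared date
    # of birth with age-assurance checks and internal classifiers.
    return user.declared_age is not None and user.declared_age < 18


def select_ads(user: User, candidates: List[Ad]) -> List[Ad]:
    """Drop profiling-based ads when the recipient is likely a minor (Art. 28.2)."""
    if is_minor_with_reasonable_certainty(user):
        # Contextual ads (matched to the content being viewed, not to
        # the person) remain permissible; profiled ads are filtered out.
        return [ad for ad in candidates if ad.targeting == "contextual"]
    return candidates
```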

Minor protection: include the rights of children in your risk assessment

What the DSA says

Very large online platforms and search engines (VLOPs and VLOSEs) must make additional efforts to protect minors.

Very large online platforms and search engines (VLOPs and VLOSEs) – those with more than 45 million monthly active users in the EU – must comply with stricter rules because “the greater the size, the greater the responsibilities of online platforms”. Under the DSA, they must conduct risk assessments at least once a year to gauge any negative effects on the fundamental rights enshrined in the Charter of Fundamental Rights of the European Union, including privacy, freedom of expression and information, the prohibition of discrimination and the rights of the child.

The rights of the child referenced here include children's right to the protection and care necessary for their well-being, and the requirement that private institutions treat the child's best interests as a primary consideration. Audits must also consider how Big Tech's algorithms affect the physical and mental health of minors.

What it means for online platforms

This means making sure that large online platforms' risk assessments cover fundamental rights, including the rights of the child. They should assess how easy it is for children and adolescents to understand how their service works, as well as possible exposure to content that could impair their physical or mental wellbeing or moral development.

By incorporating children's rights into risk assessments, digital service providers can develop stronger safeguards and mitigation strategies adapted to the unique needs and vulnerabilities of minors.

For example, TikTok and YouTube, in addition to banning targeted ads for minors, set minors' profiles to private by default, which means that the videos they upload can only be viewed by the people they approve.
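
Below is a hedged sketch of how the children's-rights section of a DSA risk assessment might be recorded internally. The risk categories echo those discussed above; the structure and field names are assumptions for illustration, not a format prescribed by the regulation.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ChildRiskItem:
    risk: str        # e.g. "exposure to age-inappropriate content"
    likelihood: str  # "low", "medium" or "high"
    severity: str    # "low", "medium" or "high"
    mitigations: List[str] = field(default_factory=list)


def minors_risk_register() -> List[ChildRiskItem]:
    """Seed register covering the child-specific risks named in this article."""
    return [
        ChildRiskItem(
            risk="exposure to content impairing physical or mental wellbeing",
            likelihood="medium", severity="high",
            mitigations=["age-appropriate content filters",
                         "recommender-system audits"],
        ),
        ChildRiskItem(
            risk="unwanted contact from unknown adults",
            likelihood="medium", severity="high",
            mitigations=["private-by-default profiles", "restricted DMs"],
        ),
        ChildRiskItem(
            risk="profiling-based advertising shown to minors",
            likelihood="low", severity="medium",
            mitigations=["ad-serving gate on minor signal (Art. 28.2)"],
        ),
    ]
```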

Conclusion

The DSA's minor-protection provisions recognize children's rights and their particular vulnerabilities online. As digital service providers navigate the complexities of compliance, it becomes imperative to integrate the rights of children into risk assessments comprehensively. This entails acknowledging children's rights to privacy, safety, and access to age-appropriate content, while also considering the risks they may encounter online, including exposure to harmful content, exploitation, and data privacy breaches.

Checkstep helps providers to comply with the three new DSA minor protection updates by providing an all-in-one DSA compliance solution.


What is the DSA?

The Digital Services Act (DSA) is European Union legislation that regulates digital services to address issues such as online content moderation, disinformation, and user protection. It entered into force in November 2022 and has applied in full since 17 February 2024.


What does the DSA say about minor protection?

The Digital Services Act (DSA) includes provisions to protect minors online, such as: age verification measures, stricter privacy rules for minors' data, content moderation and safety measures, promotion of digital literacy, and accountability mechanisms for digital service providers.
