The Digital Services Act (DSA) Guide

What is the Digital Services Act (DSA)?


The Digital Services Act, otherwise known as the DSA, is the first attempt by the
European Union to govern platforms at the regulatory level. Up until this point, all 27 EU
member states have each had their own laws that may or may not apply to online
platforms. The DSA is the first major effort to harmonise these separate laws under one
universal piece of legislation.

Broadly speaking, this regulation’s goals are to hold digital services accountable for:

  1. The dissemination of illegal and harmful content.
    a. Illegality is determined at both the EU level and Member State level.
  2. Transparency of their operations, including the design of their services.
    a. Focusing on transparency reporting requirements.
  3. Protection of user information.
    a. Including compliance with GDPR.

More specifically, the DSA focuses on:

  1. Preventing the sale of illegal goods and services.
  2. Ensuring the safety of minors.
  3. Banning content that is illegal and/or misleading.
  4. Prohibiting advertising that is targeted using sensitive personal information.

This guide will help you prepare your service or platform for the DSA. We will cover all the essential information you need to get started on the path to full compliance. While the introductory information above is important, this guide goes a step further and details:

  1. What specific kinds of entities the DSA applies to.
  2. What the particular requirements are for those who own and/or run in-scope services.
  3. What exactly is a Transparency Report and what data it should include.
  4. Suggestions for preparing to comply with the DSA.

Who Does the DSA Apply to?

While it is aimed at all digital companies that have users or customers in the EU, the DSA divides in-scope services into these categories:

  1. Very large online platforms (VLOPs). The platforms in this category reach an
    average of at least 45 million EU users per month (roughly 10% of the EU
    population).
  2. Hosting services, such as web-hosting or cloud storage services.
  3. Intermediary services don’t usually post their own content, but instead provide
    the infrastructure – think domain services or internet service providers (ISPs).
  4. Online marketplaces: platforms that facilitate sales and transactions between
    users and sellers. These include e-commerce sites as well as, for example, app
    stores.

Services that fall within scope:

  • Social Networks
  • Messaging Platforms
  • Blogs, Forums, Q&A
  • Dating Sites
  • Creativity Platforms
  • Wellness and Personal Development
  • Education Services and Child-specific Platforms
  • Live Streaming Platforms
  • Streaming Platforms (General)
  • Hosting Platforms
  • Gaming and E-Sport Platforms
  • Gig Economy/Marketplace Platforms
  • Review Sites
  • Crowdfunding Platforms
  • Online News Sites
  • Sport Media Platforms
  • Retail Investing Platforms
  • iGaming Platforms
  • Community Building Platforms
  • e-Commerce Platforms

NOTE: It’s important to know that your service doesn’t have to be based in the EU
for the DSA to apply.

What Happens if a Service Doesn’t Comply?

In considering consequences for non-compliance, the EU has paid special attention to VLOPs. Services in this category that are found to be non-compliant can be fined up to 6% of their global annual turnover. Services that continue to breach the DSA’s obligations risk a complete ban from operating in the EU. For companies that are too small to fall into the VLOP category, regulatory supervision will be the responsibility of the Digital Services Coordinator in each Member State. In those cases, the Coordinator in the relevant Member State will decide how non-compliance is penalised.

Exemptions

Businesses considered to be micro or small enterprises are exempt from some requirements. These
businesses must have:

  1. Fewer than 50 employees.
  2. A maximum annual turnover of €10 million.

These businesses are exempt from the following requirements:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Prioritising notices from trusted flaggers.
  3. Processes and/or measures that defend against malicious and abusive notices.
  4. Safety by Design.
  5. Transparency reports detailing all moderation actions.
  6. Online advertising transparency.
  7. User traceability to track illegal activity.
  8. Reporting activity that is suspected of being criminal.

What are the DSA Requirements?

The obligations under the DSA vary by the size of the platform in question, with VLOPs
carrying the heaviest burden for compliance. Here’s a list of requirements based on
which category your service falls under:

Intermediary Services

  1. Transparency reports detailing all moderation actions.
  2. Clear and easy-to-find Terms of Service.
  3. Designated points of contact and (where applicable) a legal representative.
  4. Full cooperation with orders and processes required by the authorities of EU
    Member States.

Hosting Services

All of the above, in addition to:

  1. Reporting of illegal activity.
  2. A notice-and-action procedure; a rough sketch of what such a flow could look like follows below.
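
Purely as an illustration of the notice-and-action idea mentioned above, here is a minimal sketch in Python. The names (Notice, handle_notice, review_queue) and fields are hypothetical assumptions for this example, not part of the DSA or any specific library; they loosely mirror the kind of information a notice is expected to carry and the acknowledgement a platform sends back.

```python
# Hypothetical sketch of a notice-and-action flow. Names and fields are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from uuid import uuid4


@dataclass
class Notice:
    content_url: str                  # where the allegedly illegal content lives
    explanation: str                  # why the notifier considers it illegal
    reporter_name: str | None = None  # notifiers generally identify themselves
    reporter_email: str | None = None
    good_faith: bool = False          # statement that the notice is accurate and complete
    notice_id: str = field(default_factory=lambda: str(uuid4()))
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def handle_notice(notice: Notice, review_queue: list) -> dict:
    """Acknowledge receipt and queue the notice for review.

    The eventual decision (remove, restrict, or take no action) would also be
    communicated back to the notifier, together with a statement of reasons.
    """
    review_queue.append(notice)
    return {
        "notice_id": notice.notice_id,
        "status": "received",
        "received_at": notice.received_at.isoformat(),
    }


if __name__ == "__main__":
    queue: list[Notice] = []
    ack = handle_notice(
        Notice(
            content_url="https://example.com/post/123",
            explanation="Listing appears to offer a counterfeit product.",
            good_faith=True,
        ),
        queue,
    )
    print(ack["status"], len(queue))
```

The point of the sketch is the shape of the exchange: a structured notice comes in, the platform confirms receipt, and the item enters a review queue from which a reasoned decision is later sent back to the notifier.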

Online Platforms

All of the above, in addition to:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Prioritising notices from trusted flaggers. These are people or entities who are
    independent of platforms and appointed by the relevant Digital Services
    Coordinator of an EU Member State. Their job is, among other things, to report
    illegal content on platforms.
  3. Reporting of suspected criminal offences.
  4. Processes and/or measures that defend against malicious and abusive notices.
  5. Bans on ads that target children or are targeted based on protected
    characteristics of users. These characteristics include race, sexual orientation,
    and political views.
  6. Safety by Design:
    a. Banning platform designs that inhibit users’ ability to make free and
    informed decisions.
    b. (For platforms accessed by minors) making sure to take appropriate
    measures to ensure protection and privacy.
  7. Recommendation system transparency, including banning the use of ‘dark
    patterns’.
  8. Online advertising transparency (a sketch of a per-ad disclosure record follows this list).
  9. Marketplace obligations, including (but not limited to):
    a. Compliance by design.
    b. Random checks.
    c. Third-party vetting.
    d. User traceability to track illegal activity.
    e. Security of personal information.
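
To make items 5 and 8 above more concrete, here is a minimal, hypothetical sketch of a per-ad disclosure record and a targeting check. The names (AdDisclosure, validate_targeting, SENSITIVE_ATTRIBUTES) and the attribute list are illustrative assumptions, not a schema prescribed by the DSA.

```python
# Illustrative sketch of per-ad transparency and targeting checks. Names are hypothetical.
from dataclasses import dataclass

# Example set of sensitive characteristics that must not be used for ad targeting;
# this list is illustrative, not exhaustive.
SENSITIVE_ATTRIBUTES = {
    "race_or_ethnicity", "sexual_orientation", "political_views",
    "religion", "health", "trade_union_membership",
}


@dataclass
class AdDisclosure:
    is_advertisement: bool           # the content must be identifiable as an ad
    advertiser: str                  # on whose behalf the ad is presented
    payer: str                       # who paid for the ad (may differ from the advertiser)
    targeting_parameters: list[str]  # main parameters used to select the recipient


def validate_targeting(params: list[str], recipient_is_minor: bool) -> None:
    """Reject targeting that profiles minors or relies on sensitive characteristics."""
    if recipient_is_minor and params:
        raise ValueError("Ads shown to minors must not be targeted using profiling.")
    banned = SENSITIVE_ATTRIBUTES.intersection(params)
    if banned:
        raise ValueError(f"Targeting on sensitive characteristics is not allowed: {banned}")


if __name__ == "__main__":
    disclosure = AdDisclosure(
        is_advertisement=True,
        advertiser="Example Brand",
        payer="Example Brand",
        targeting_parameters=["language", "approximate_location"],
    )
    validate_targeting(disclosure.targeting_parameters, recipient_is_minor=False)
    print(disclosure)
```

The disclosure fields correspond to what a user should be able to see alongside an ad, while the check reflects the targeting bans described in item 5.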

VLOPs

All of the above, in addition to:

  1. Published and easy-to-find Codes of Conduct.
  2. Transparent sharing of data with external researchers, auditors, and relevant
    authorities.
  3. Process that allows users to opt out of recommendation systems.
  4. Crisis response process that includes, among other things, measures to
    cooperate with relevant authorities.
  5. Regular conducting of risk assessments.

What are Transparency Reports?

Transparency reports are one of the main requirements of the DSA, applying to all
in-scope services. Specifically, all services are required to publish a report once a year
at a minimum. This process is similar to the requirement set forth in Germany’s NetzDG
legislation and includes similar information obligations as well.

The information required in a transparency report depends on the size of the service in
question. How you compile it will also depend on the size of your Trust and Safety team and
your overall strategy for user safety. It’s important to note that while the DSA sets
universal measures for all services, it doesn’t strictly prescribe how these obligations
are to be met. This accounts for the myriad ways Trust and Safety teams go about
keeping their users safe.

For the purposes of the transparency reports, here is a list of the required information (a rough data sketch follows the list):

  1. Number of moderation actions (including kinds of actions taken).
  2. Number of content reports received and any resulting actions taken.
  3. Number of appeals against moderation decisions.
  4. Any actions taken in response to appeals.
  5. Average response time as well as time needed to action requests.
  6. Number of take-down orders issued by EU authorities – these are to be
    categorised by type of content reported.
  7. Number of active users on your service, to be updated biannually.
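
As a purely illustrative way of keeping track of these data points, here is a minimal sketch of a report record in Python. The field names (TransparencyReport, moderation_actions, and so on) and the example values are assumptions for this sketch; the DSA does not mandate a specific format or schema.

```python
# Hypothetical container for the data points listed above. Field names are illustrative.
from dataclasses import dataclass


@dataclass
class TransparencyReport:
    period_start: str                                  # reporting period, e.g. "2024-01-01"
    period_end: str                                    # e.g. "2024-12-31"
    moderation_actions: dict[str, int]                 # action type -> count
    content_reports_received: int                      # reports/notices submitted by users
    actions_from_reports: dict[str, int]               # what happened to reported content
    appeals_received: int                              # appeals against moderation decisions
    actions_from_appeals: dict[str, int]               # e.g. {"upheld": 20, "reinstated": 5}
    average_response_time_hours: float                 # average time to respond to a request
    average_action_time_hours: float                   # average time to action a request
    authority_orders_by_content_type: dict[str, int]   # take-down orders from EU authorities
    average_monthly_active_users: int                  # refreshed at least every six months


# Example values only, for illustration.
report = TransparencyReport(
    period_start="2024-01-01",
    period_end="2024-12-31",
    moderation_actions={"removal": 120, "visibility_restriction": 45},
    content_reports_received=300,
    actions_from_reports={"removed": 90, "no_action": 210},
    appeals_received=25,
    actions_from_appeals={"upheld": 20, "reinstated": 5},
    average_response_time_hours=6.5,
    average_action_time_hours=24.0,
    authority_orders_by_content_type={"illegal_hate_speech": 2, "counterfeit_goods": 1},
    average_monthly_active_users=150_000,
)
print(report.appeals_received)
```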

What Can You Do to Prepare for the DSA?

On February 17th, 2024, the DSA will be enforceable for all in-scope platforms and
services. Businesses need to prepare well in advance of this date. There are a few ways
to do so. Here are some examples:

  1. A key element in preparing for the DSA is actually one of the legislation’s own
    requirements: risk assessments. Carrying out these risk assessments of your
    service will help you understand how your users may be at risk. Additionally, you’ll
    learn how your service may be at risk of non-compliance.
  2. Reviewing your existing processes. How do you take down content? How do you
    process appeals? Do you have an easy-to-find Terms of Service document? How
    about your reporting mechanisms? These are just a few of the questions you may
    ask yourself.
  3. Staying aware of your transparency reporting obligations. We put together an
    expert webinar to help you with this.
  4. Keeping track of developments in the regulatory landscape, including any
    upcoming legislation.
  5. Learning about designing for trust.

FAQ


What is the DSA?

The Digital Services Act, also known as the DSA, is the first attempt by the European Union to regulate platforms. Until now, all 27 EU Member States have each had their own laws that may or may not apply to online platforms. The DSA is the first major attempt to harmonise these separate laws under one universal piece of legislation.


When will the DSA be enforceable for all in-scope platforms?

The DSA will be enforceable for all in-scope platforms starting February 17th, 2024.


Who does the DSA apply to?

The DSA applies to all digital companies with users or customers in the EU, divided into
these categories:

  1. Very large online platforms (VLOPs): platforms that reach an average of at least 45
    million EU users per month (roughly 10% of the EU population).
  2. Hosting services, such as web-hosting or cloud storage services.
  3. Intermediary services, which don’t usually post their own content but instead provide
    the infrastructure – think domain services or internet service providers (ISPs).
  4. Online marketplaces: platforms that facilitate sales and transactions between users
    and sellers, including e-commerce sites and app stores.
