The Digital Services Act (DSA) Guide


What is the Digital Services Act (DSA)?


The Digital Services Act, also known as the DSA, is the first attempt by the
European Union to regulate platforms at the EU level. Until now, all 27 EU
Member States have each had their own laws that may or may not apply to online
platforms. The DSA is the first major effort to harmonise these separate laws under one
universal piece of legislation.

Broadly speaking, this regulation’s goals are to hold digital services accountable for:

  1. The dissemination of illegal and harmful content.
    a. Illegality is determined at both the EU level and Member State level.
  2. Transparency of their operations, including the design of their services.
    a. Focusing on transparency reporting requirements.
  3. Protection of user information.
    a. Including compliance with GDPR.

More specifically, the DSA focuses on:

  1. Preventing the sale of illegal goods and services.
  2. Ensuring the safety of minors.
  3. Banning content that is illegal and/or misleading.
  4. Prohibiting advertising that is targeted using sensitive personal information.

This guide will help you prepare your service or platform for the DSA. We will cover all the essential information you need to get started on the path to full compliance. While the introductory information above is important, this guide will go a step further and detail:

  1. What specific kinds of entities the DSA applies to.
  2. What the particular requirements are for those who own and/or run in-scope services.
  3. What exactly a Transparency Report is and what data it should include.
  4. Suggestions for preparing to comply with the DSA.

Who Does the DSA Apply to?

While it is aimed at all digital companies that have users or customers in the EU, the DSA divides in-scope services into the following categories (a simple classification sketch follows the list):

  1. Very large online platforms, or VLOPs. Platforms in this category reach an
    average of 45 million EU users per month (more than 10% of the EU
    population).
  2. Hosting services, such as web-hosting or cloud storage services.
  3. Intermediary services, which don’t usually post their own content but instead
    provide the infrastructure – think domain services or internet service providers (ISPs).
  4. Online marketplaces, including platforms that facilitate sales or transactions
    between users and sellers. These include e-commerce sites as well as, for
    example, app stores.
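
To make the size threshold concrete, here is a minimal, purely illustrative Python sketch of how a service might check itself against the 45-million-user figure above. The function and its inputs are hypothetical, not part of the DSA text, and assume you already track average monthly active EU users.

```python
# Illustrative only: a rough self-classification against the DSA's size tiers.
VLOP_THRESHOLD = 45_000_000  # the 45 million figure, roughly 10% of the EU population

def dsa_tier(avg_monthly_eu_users: int, is_online_platform: bool) -> str:
    """Return a rough DSA tier label for a service (hypothetical helper)."""
    if is_online_platform and avg_monthly_eu_users >= VLOP_THRESHOLD:
        return "very large online platform (VLOP)"
    if is_online_platform:
        return "online platform"
    return "hosting or intermediary service"

# Example: a social network averaging 50 million EU users per month
print(dsa_tier(50_000_000, is_online_platform=True))  # -> very large online platform (VLOP)
```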

Services that fall within scope:

  • Social Networks
  • Messaging Platforms
  • Blogs, Forums, Q&A
  • Dating Sites
  • Creativity Platforms
  • Wellness and Personal Development
  • Education Services and Child-specific Platforms
  • Live Streaming Platforms
  • Streaming Platforms (General)
  • Hosting Platforms
  • Gaming and E-Sport Platforms
  • Gig Economy/Marketplace Platforms
  • Review Sites
  • Crowdfunding Platforms
  • Online News Sites
  • Sport Media Platforms
  • Retail Investing Platforms
  • iGaming Platforms
  • Community Building Platforms
  • e-Commerce Platforms

NOTE: It’s important to know that your service doesn’t have to be based in the EU
for the DSA to apply.

What Happens if a Service Doesn’t Comply?

In setting out the consequences for non-compliance, the EU has paid special attention to VLOPs. Services in this category that are found to be non-compliant can be fined up to 6% of their global annual turnover. Services that continue to breach the DSA’s obligations risk an outright ban from operating in the EU. For companies too small to fall into the VLOP category, regulatory supervision will be the responsibility of the Digital Services Coordinator in each Member State, and the Coordinator in the relevant Member State will decide how non-compliance is penalised.

Exemptions

Businesses considered to be micro or small enterprises are exempt from some requirements. These
businesses must have:

  1. Fewer than 50 employees.
  2. An annual turnover of no more than €10 million.

These businesses are exempt from the following requirements:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Prioritising notices from trusted flaggers.
  3. Processes and/or measures that defend against malicious and abusive notices.
  4. Safety by Design.
  5. Transparency reports detailing all moderation actions.
  6. Online advertising transparency.
  7. User traceability to track illegal activity.
  8. Reporting activity that is suspected of being criminal.

What are the DSA Requirements?

The obligations under the DSA vary by the size of the platform in question, with VLOPs
carrying the heaviest burden for compliance. Here’s a list of requirements based on
which category your service falls under:

Intermediary Services

  1. Transparency reports detailing all moderation actions.
  2. Clear and easy-to-find Terms of Service.
  3. Designated points of contact and (where applicable) a legal representative.
  4. Full cooperation with orders and processes required by the authorities of EU
    Member States.

Hosting Services

All of the above, in addition to:

  1. Reporting of illegal activity.
  2. A notice-and-action procedure for illegal content (a minimal intake sketch follows this list).
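
Since the notice-and-action procedure is essentially an intake-and-response workflow, here is a minimal sketch of what a notice record could look like internally. The field names and the acknowledge helper are assumptions for illustration, not terms defined by the DSA.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IllegalContentNotice:
    # Illustrative fields: a notice should identify the reporter, point to the
    # exact content, and explain why the reporter considers it illegal.
    reporter_contact: str
    content_url: str
    explanation: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "received"  # e.g. received -> under_review -> actioned / rejected

def acknowledge(notice: IllegalContentNotice) -> str:
    """Confirm receipt to the reporter, the first step of the workflow."""
    return f"Notice about {notice.content_url} received at {notice.received_at.isoformat()}"
```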

Online Platforms

All of the above, in addition to:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Trusted flaggers. These are people or entities who are independent of platforms
    and appointed by the relevant Digital Services Coordinator of an EU Member
    State. Their job is, among other things, to report illegal content on platforms.
  3. Reporting of suspected criminal offences.
  4. Processes and/or measures that defend against malicious and abusive notices.
  5. Bans on ads that target children or that are targeted based on protected
    characteristics of users, such as race, sexual orientation, and political views
    (a compliance-check sketch follows this list).
  6. Safety by Design:
    a. Banning platform designs that inhibit users’ ability to make free and
    informed decisions.
    b. (For platforms accessed by minors) taking appropriate measures to ensure
    their protection and privacy.
  7. Recommendation system transparency, including banning the use of ‘dark
    patterns’.
  8. Online advertising transparency.
  9. Marketplace obligations, including (but not limited to):
    a. Compliance by design.
    b. Random checks.
    c. Third-party vetting.
    d. User traceability to track illegal activity.
    e. Security of personal information.

VLOPs

All of the above, in addition to:

  1. Published and easy-to-find Codes of Conduct.
  2. Transparent sharing of data with external researchers, auditors, and relevant
    authorities.
  3. Process that allows users to opt out of recommendation systems.
  4. Crisis response process that includes, among other things, measures to
    cooperate with relevant authorities.
  5. Regularly conducted risk assessments.

What are Transparency Reports?

Transparency reports are one of the main requirements of the DSA, applying to all
in-scope services. Specifically, all services are required to publish a report once a year
at a minimum. This process is similar to the requirement set forth in Germany’s NetzDG
legislation and carries similar information obligations.

The information required on a transparency report depends on the size of the service in
question. Furthermore, it depends on how large your Trust and Safety team is along with
your overall strategy concerning user safety. It’s important to note that while the DSA
has universal measures for all services, it doesn’t strictly advise on how these obligations
are to be met. This accommodates the myriad ways Trust and Safety teams go about
keeping their users safe.

For the purposes of the transparency reports, here is a list of the required information (a data-structure sketch follows the list):

  1. Number of moderation actions (including kinds of actions taken).
  2. Number of content reports received and any resulting actions taken.
  3. Number of appeals against moderation decisions.
  4. Any actions taken in response to appeals.
  5. Average response time as well as time needed to action requests.
  6. Number of take-down orders issued by EU authorities – these are to be
    categorised by type of content reported.
  7. Number of active users on your service, to be updated at least every six months.
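
To see how these figures might be collected in one place, here is a sketch of a report structure mirroring the list above. The keys and zero values are placeholders for illustration and do not follow any official template.

```python
# Placeholder structure for a yearly DSA transparency report (illustrative only).
transparency_report = {
    "reporting_period": "2023",
    "moderation_actions_by_type": {"removal": 0, "visibility_restriction": 0},
    "content_reports_received": 0,
    "actions_taken_on_reports": 0,
    "appeals_received": 0,
    "appeal_outcomes": {"upheld": 0, "overturned": 0},
    "avg_time_to_action_hours": 0.0,
    "authority_takedown_orders_by_content_type": {},
    "avg_monthly_active_eu_users": 0,  # republish at least every six months
}
```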

What Can You Do to Prepare for the DSA?

On February 17th, 2024, the DSA will be enforceable for all in-scope platforms and
services. Businesses need to prepare well in advance of this date. There are a few ways
to do so. Here are some examples:

  1. A key element in preparing for the DSA is actually one of the legislation’s own
    requirements: risk assessments. Carrying out these risk assessments of your
    service will help you understand how your users may be at risk. Additionally, you’ll
    learn how your service may be at risk of non-compliance.
  2. Reviewing your existing processes. How do you take down content? How do you
    process appeals? Do you have an easy-to-find Terms of Service document? How
    about your reporting mechanisms? These are just a few of the questions you may
    ask yourself.
  3. Staying aware of your transparency reporting obligations. We put together an
    expert webinar to help you with this.
  4. Keeping track of developments in the regulatory landscape, including any
    upcoming legislation.
  5. Learning about designing for trust.

FAQ


What is the DSA?

The Digital Services Act, also known as the DSA, is the first attempt by the European Union to regulate platforms. Until now, all 27 EU Member States have each had their own laws that may or may not apply to online platforms. The DSA is the first major attempt to harmonise these separate laws under one universal piece of legislation.


When will the DSA be enforceable for all in-scope platforms?

The DSA will be enforceable for all in-scope platforms starting February 17th, 2024.


Who does the DSA apply to?

  1. Very large online platforms, or VLOPs. Platforms in this category reach an
    average of 45 million EU users per month (more than 10% of the EU
    population).
  2. Hosting services, such as web-hosting or cloud storage services.
  3. Intermediary services, which don’t usually post their own content but instead
    provide the infrastructure – think domain services or internet service providers (ISPs).
  4. Online marketplaces, including platforms that facilitate sales or transactions
    between users and sellers. These include e-commerce sites as well as, for
    example, app stores.
