
The Digital Services Act (DSA) Guide

What is the Digital Services Act (DSA)?


The Digital Services Act, otherwise known as the DSA, is the European Union's first
attempt to regulate online platforms at the EU level. Until now, all 27 EU Member
States have each had their own laws that may or may not apply to online platforms.
The DSA is the first major effort to harmonise these separate laws under one
universal piece of legislation.

Broadly speaking, this regulation aims to hold digital services accountable for:

  1. The dissemination of illegal and harmful content.
    a. Illegality is determined at both the EU level and Member State level.
  2. Transparency of their operations, including the design of their services.
    a. Focusing on transparency reporting requirements.
  3. Protection of user information.
    a. Including compliance with GDPR.

More specifically, the DSA focuses on:

  1. Preventing the sale of illegal goods and services.
  2. Ensuring the safety of minors.
  3. Banning content that is illegal and/or misleading.
  4. Prohibiting advertising that is targeted using sensitive personal information.

This guide will help you prepare your service or platform for the DSA. We will cover all the essential information you need to get started on the path to full compliance. While the introductory information above is important, this guide goes a step further and details:

  1. What specific kinds of entities the DSA applies to.
  2. What the particular requirements are for those who own and/or run in-scope services.
  3. What exactly is a Transparency Report and what data it should include.
  4. Suggestions for preparing to comply with the DSA.

Who Does the DSA Apply to?

While it is aimed at all digital companies that have users or customers in the EU, the DSA divides in-scope services into these categories:

  1. Very large online platforms or VLOPs. The platforms in this category reach an
    average of 45 million EU users (or more than 10% of the EU population) per
    month.
  2. Hosting services include web-hosting or cloud storage services.
  3. Intermediary services don’t usually post their own content, but instead provide
    the infrastructure – think domain services or internet service providers (ISPs).
  4. Online marketplaces: platforms that facilitate sales and transactions
    between users and sellers. These include e-commerce sites as well as, for
    example, app stores.
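
To make the VLOP threshold concrete, here is a minimal sketch in Python of checking whether a service's average monthly EU user count meets the 45 million mark. The threshold comes from the DSA itself; the function name and user figures below are purely illustrative:

```python
# Sketch: does a service meet the DSA's VLOP threshold of an
# average of 45 million monthly EU users?
# All user counts below are illustrative, not real data.

VLOP_THRESHOLD = 45_000_000  # average monthly active EU users

def is_vlop(monthly_eu_users: list[int]) -> bool:
    """Return True if the average monthly EU user count meets the VLOP threshold."""
    average = sum(monthly_eu_users) / len(monthly_eu_users)
    return average >= VLOP_THRESHOLD

# Six months of hypothetical EU user counts
recent_months = [44_000_000, 46_500_000, 45_200_000,
                 47_000_000, 44_800_000, 46_000_000]
print(is_vlop(recent_months))  # True: the six-month average exceeds 45M
```

In practice, how "active users" are counted is itself a methodological question; the sketch only shows the averaging step.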

Services that fall within scope:

  • Social Networks
  • Messaging Platforms
  • Blogs, Forums, Q&A
  • Dating Sites
  • Creativity Platforms
  • Wellness and Personal Development Services
  • Education and Child-specific Platforms
  • Live Streaming Platforms
  • Streaming Platforms (General)
  • Hosting Platforms
  • Gaming and E-Sport Platforms
  • Gig Economy/Marketplace Platforms
  • Review Sites
  • Crowdfunding Platforms
  • Online News Sites
  • Sport Media Platforms
  • Retail Investing Platforms
  • iGaming Platforms
  • Community Building Platforms
  • e-Commerce Platforms

NOTE: It’s important to know that your service doesn’t have to be based in the EU
for the DSA to apply.

What Happens if a Service Doesn’t Comply?

In setting consequences for non-compliance, the EU has paid special attention to VLOPs. Services in this category that are found to be non-compliant can be fined up to 6% of their global annual turnover. Services that continue to breach the DSA's obligations risk a complete ban from operating in the EU. For companies too small to fall into the VLOP category, regulatory supervision will be the responsibility of the Digital Services Coordinator in each Member State, and the Coordinator in the relevant Member State will decide how non-compliance is penalised.

Exemptions

Businesses considered micro or small enterprises are exempt from some requirements. These
businesses must have:

  1. Fewer than 50 employees.
  2. A maximum annual turnover of €10 million.
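
As a quick illustration, the two criteria above can be expressed as a simple check. This is a hypothetical sketch only; the function name and inputs are illustrative, and actual eligibility should be confirmed against the regulation itself:

```python
# Hypothetical sketch of the micro/small-enterprise test described above:
# fewer than 50 employees AND annual turnover of at most EUR 10 million.

def qualifies_for_exemption(employees: int, annual_turnover_eur: float) -> bool:
    """Return True if a business meets both small-enterprise criteria."""
    return employees < 50 and annual_turnover_eur <= 10_000_000

print(qualifies_for_exemption(30, 8_000_000))   # True: both criteria met
print(qualifies_for_exemption(60, 8_000_000))   # False: too many employees
```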

These businesses are exempt from the following requirements:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Prioritising notices from trusted flaggers.
  3. Processes and/or measures that defend against malicious and abusive notices.
  4. Safety by Design.
  5. Transparency reports detailing all moderation actions.
  6. Online advertising transparency.
  7. User traceability to track illegal activity.
  8. Reporting activity that is suspected of being criminal.

What are the DSA Requirements?

The obligations under the DSA vary by the size of the platform in question, with VLOPs
carrying the heaviest burden for compliance. Here’s a list of requirements based on
which category your service falls under:

Intermediary Services

  1. Transparency reports detailing all moderation actions.
  2. Clear and easy-to-find Terms of Service.
  3. Designated points of contact and (where applicable) a legal representative.
  4. Full cooperation with orders and processes required by the authorities of EU
    Member States.

Hosting Services

All of the above, in addition to:

  1. Reporting of illegal activity.
  2. A notice-and-action procedure.

Online Platforms

All of the above, in addition to:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Trusted flaggers. These are people or entities who are independent of platforms
    and appointed by the relevant Digital Services Coordinator of an EU Member
    State. Their job is, among other things, to report illegal content on platforms.
  3. Reporting criminal offences.
  4. Processes and/or measures that defend against malicious and abusive notices.
  5. Bans on ads that target children or are targeted based on protected
    characteristics of users. These characteristics include race, sexual orientation,
    and political views.
  6. Safety by Design:
    a. Banning platform designs that inhibit users’ ability to make free and
    informed decisions.
    b. (For platforms accessed by minors) making sure to take appropriate
    measures to ensure protection and privacy.
  7. Recommendation system transparency, including banning the use of ‘dark
    patterns’.
  8. Online advertising transparency.
  9. Marketplace obligations, including (but not limited to):
    a. Compliance by design.
    b. Random checks.
    c. Third-party vetting.
    d. User traceability to track illegal activity.
    e. Security of personal information.

VLOPs

All of the above, in addition to:

  1. Published and easy-to-find Codes of Conduct.
  2. Transparent sharing of data with external researchers, auditors, and relevant
    authorities.
  3. Process that allows users to opt out of recommendation systems.
  4. Crisis response process that includes, among other things, measures to
    cooperate with relevant authorities.
  5. Regular conducting of risk assessments.

What are Transparency Reports?

Transparency reports are one of the main requirements of the DSA, applying to all
in-scope services. Specifically, all services are required to publish a report once a year
at a minimum. This process is similar to the requirement set forth in Germany’s NetzDG
legislation and includes similar information obligations as well.

The information required on a transparency report depends on the size of the service in
question. Furthermore, it depends on how large your Trust and Safety team is along with
your overall strategy concerning user safety. It’s important to note that while the DSA
has universal measures for all services, it doesn’t strictly advise on how these obligations
are to be met. This accounts for the myriad ways Trust and Safety teams go about
keeping their users safe.

For the purposes of the transparency reports, here is a list of the required information:

  1. Number of moderation actions (including kinds of actions taken).
  2. Number of reported content and any resulting actions taken.
  3. Number of appeals to moderation decisions.
  4. Any actions taken in response to appeals.
  5. Average response time as well as time needed to action requests.
  6. Number of take-down orders issued by EU authorities – these are to be
    categorised by type of content reported.
  7. Number of active users on your service, to be updated biannually.
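
As an illustration only, the required data points could be gathered into a structure like the following. The DSA does not mandate any particular report format, and every field name and figure here is hypothetical:

```python
# Illustrative sketch: the data points a DSA transparency report must
# cover, collected into a simple dictionary and serialised to JSON.
# All field names and numbers are made up for demonstration.
import json

transparency_report = {
    "period": "2024",
    # 1. Moderation actions, broken down by kind of action taken
    "moderation_actions": {"removals": 1_200, "warnings": 340, "suspensions": 25},
    # 2. Reported content and resulting actions
    "reported_content": {"notices_received": 2_150, "actions_taken": 1_400},
    # 3-4. Appeals and actions taken in response
    "appeals": {"received": 180, "decisions_reversed": 22},
    # 5. Average response/action time
    "avg_response_time_hours": 12.5,
    # 6. Take-down orders from EU authorities, categorised by content type
    "takedown_orders_by_content_type": {"hate_speech": 14, "counterfeit_goods": 9},
    # 7. Active users, to be updated every six months
    "average_monthly_active_users": 3_400_000,
}

print(json.dumps(transparency_report, indent=2))
```

Whatever structure you choose, mapping each field back to the numbered list above makes it easier to audit the report for completeness.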

What Can You Do to Prepare for the DSA?

On February 17th, 2024, the DSA will be enforceable for all in-scope platforms and
services. Businesses need to prepare well in advance of this date. There are a few ways
to do so. Here are some examples:

  1. A key element in preparing for the DSA is actually one of the legislation’s own
    requirements: risk assessments. Carrying out these risk assessments of your
    service will help you understand how your users may be at risk. Additionally, you’ll
    learn how your service may be at risk of non-compliance.
  2. Reviewing your existing processes. How do you take down content? How do you
    process appeals? Do you have an easy-to-find Terms of Service document? How
    about your reporting mechanisms? These are just a few of the questions you may
    ask yourself.
  3. Staying aware of your transparency reporting obligations. We put together an
    expert webinar to help you with this.
  4. Keeping track of developments in the regulatory landscape, including any
    upcoming legislation.
  5. Learning about designing for trust.

FAQ


What is the DSA?

The Digital Services Act, also known as the DSA, is the first attempt by the European Union to regulate platforms. Until now, all 27 EU Member States have each had their own laws that may or may not apply to online platforms. The DSA is the first major attempt to harmonise these separate laws under one universal piece of legislation.


When will DSA be enforceable for all in-scope Platforms?

The DSA will be enforceable for all in-scope Platforms starting February 17th, 2024.


Who does the DSA apply to?

  1. Very large online platforms or VLOPs. The platforms in this category reach an
    average of 45 million EU users (or more than 10% of the EU population) per
    month.
  2. Hosting services, including web-hosting or cloud storage services.
  3. Intermediary services, which don't usually post their own content, but
    instead provide the infrastructure – think domain services or internet
    service providers (ISPs).
  4. Online marketplaces: platforms that facilitate sales and transactions
    between users and sellers. These include e-commerce sites as well as, for
    example, app stores.
