
The Digital Services Act (DSA) Guide


What is the Digital Services Act (DSA)?


The Digital Services Act, otherwise known as the DSA, is the first attempt by the
European Union to govern platforms at the regulatory level. Up until this point, all 27 EU
member states have each had their own laws that may or may not apply to online
platforms. The DSA is the first major effort to harmonise these separate laws under one
universal piece of legislation.

Broadly speaking, this regulation’s goals are to hold digital services accountable for:

  1. The dissemination of illegal and harmful content.
    a. Illegality is determined at both the EU level and Member State level.
  2. Transparency of their operations, including the design of their services.
    a. Focusing on transparency reporting requirements.
  3. Protection of user information.
    a. Including compliance with GDPR.

More specifically, the DSA focuses on:

  1. Preventing the sale of illegal goods and services.
  2. Ensuring the safety of minors.
  3. Banning content that is illegal and/or misleading.
  4. Prohibiting advertising that is targeted using sensitive personal information.

This guide will help you prepare your service or platform for the DSA. We will cover all the essential information you need to get started on the path to full compliance. While the introductory information above is important, this guide goes a step further and details:

  1. What specific kinds of entities the DSA applies to.
  2. What the particular requirements are for those who own and/or run in-scope services.
  3. What exactly a Transparency Report is and what data it should include.
  4. Suggestions for preparing to comply with the DSA.

Who Does the DSA Apply to?

While it is aimed at all digital companies that have users or customers in the EU, the DSA divides in-scope services into these categories:

  1. Very large online platforms, or VLOPs. The platforms in this category reach an
    average of 45 million EU users (equivalent to roughly 10% of the EU population)
    per month.
  2. Hosting services, such as web-hosting or cloud storage services.
  3. Intermediary services, which don’t usually post their own content but instead
    provide the infrastructure – think domain services or internet service providers
    (ISPs).
  4. Online marketplaces, including platforms that facilitate sales and transactions
    between users and sellers. These include e-commerce sites as well as, for
    example, app stores.
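The VLOP threshold above is a simple numeric test. As an illustrative sketch only (the DSA itself does not define a programmatic check, and the function name is our own), it can be expressed as:

```python
# Illustrative sketch: the 45-million-user VLOP threshold from the text.
# The function name and this programmatic framing are hypothetical.

VLOP_THRESHOLD = 45_000_000  # average monthly EU users, ~10% of the EU population

def is_vlop(avg_monthly_eu_users: int) -> bool:
    """A platform is a VLOP if it averages 45 million or more EU users per month."""
    return avg_monthly_eu_users >= VLOP_THRESHOLD

print(is_vlop(50_000_000))  # True
print(is_vlop(10_000_000))  # False
```

In practice, designation as a VLOP is a formal decision by the European Commission based on user numbers the platform reports, not a self-assessment.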

Services that Fall Within Scope:

  • Social Networks
  • Messaging Platforms
  • Blogs, Forums, Q&A
  • Dating Sites
  • Creativity Platforms
  • Wellness and Personal Development
  • Education and Child-specific Platforms
  • Live Streaming Platforms
  • Streaming Platforms (General)
  • Hosting Platforms
  • Gaming and E-Sport Platforms
  • Gig Economy and Marketplace Platforms
  • Review Sites
  • Crowdfunding Platforms
  • Online News Sites
  • Sport Media Platforms
  • Retail Investing Platforms
  • iGaming Platforms
  • Community Building Platforms
  • e-Commerce Platforms

NOTE: It’s important to know that your service doesn’t have to be based in the EU
for the DSA to apply.

What Happens if a Service Doesn’t Comply?

In setting out consequences for non-compliance, the EU has paid special attention to VLOPs. Services in this category that are found to be non-compliant can be fined up to 6% of global annual turnover, and those that continue to breach the DSA’s obligations risk a complete ban from operating in the EU.

For companies too small to fall into the VLOP category, regulatory supervision is the responsibility of the Digital Services Coordinator in each Member State. In those cases, the Coordinator in the relevant Member State decides how non-compliance is penalised.

Exemptions

Businesses considered micro or small are exempt from some requirements. These
businesses must have:

  1. Fewer than 50 employees.
  2. A maximum annual turnover of €10 million.

These businesses are exempt from the following requirements:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Prioritising notices from trusted flaggers.
  3. Processes and/or measures that defend against malicious and abusive notices.
  4. Safety by Design.
  5. Transparency reports detailing all moderation actions.
  6. Online advertising transparency.
  7. User traceability to track illegal activity.
  8. Reporting activity that is suspected of being criminal.
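The two exemption criteria above combine with a logical AND: a business must satisfy both to qualify. A minimal sketch, assuming the figures quoted in this guide (the function and parameter names are hypothetical):

```python
# Hedged sketch of the micro/small exemption test described in the text.
# Both conditions must hold; names are our own, not from the DSA.

def qualifies_for_exemptions(employees: int, annual_turnover_eur: float) -> bool:
    """Micro/small businesses have fewer than 50 employees AND
    at most EUR 10 million in annual turnover."""
    return employees < 50 and annual_turnover_eur <= 10_000_000

print(qualifies_for_exemptions(30, 5_000_000))    # True
print(qualifies_for_exemptions(120, 5_000_000))   # False: too many employees
print(qualifies_for_exemptions(30, 20_000_000))   # False: turnover too high
```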

What are the DSA Requirements?

The obligations under the DSA vary by the size of the platform in question, with VLOPs
carrying the heaviest burden for compliance. Here’s a list of requirements based on
which category your service falls under:

Intermediary Services

  1. Transparency reports detailing all moderation actions.
  2. Clear and easy-to-find Terms of Service.
  3. Designated points of contact and (where applicable) a legal representative.
  4. Full cooperation with orders and processes required by the authorities of EU
    Member States.

Hosting Services

All of the above, in addition to:

  1. Reporting of illegal activity.
  2. A notice-and-action procedure.

Online Platforms

All of the above, in addition to:

  1. A mechanism for user complaints as well as out-of-court disputes.
  2. Trusted flaggers. These are people or entities who are independent of platforms
    and appointed by the relevant Digital Services Coordinator of an EU Member
    State. Their job is, among other things, to report illegal content on platforms.
  3. Reporting criminal offences.
  4. Processes and/or measures that defend against malicious and abusive notices.
  5. Bans on ads that target children or are targeted based on protected
    characteristics of users. These characteristics include race, sexual orientation,
    and political views.
  6. Safety by Design:
    a. Banning platform designs that inhibit users’ ability to make free and
    informed decisions.
    b. (For platforms accessed by minors) making sure to take appropriate
    measures to ensure protection and privacy.
  7. Recommendation system transparency, including banning the use of ‘dark
    patterns’.
  8. Online advertising transparency.
  9. Marketplace obligations, including (but not limited to):
    a. Compliance by design.
    b. Random checks.
    c. Third-party vetting.
    d. User traceability to track illegal activity.
    e. Security of personal information.

VLOPs

All of the above, in addition to:

  1. Published and easy-to-find Codes of Conduct.
  2. Transparent sharing of data with external researchers, auditors, and relevant
    authorities.
  3. Process that allows users to opt out of recommendation systems.
  4. Crisis response process that includes, among other things, measures to
    cooperate with relevant authorities.
  5. Regular conducting of risk assessments.

What are Transparency Reports?

Transparency reports are one of the main requirements of the DSA, applying to all
in-scope services. Specifically, all services are required to publish a report once a year
at a minimum. This process is similar to the requirement set forth in Germany’s NetzDG
legislation and includes similar information obligations as well.

The information required on a transparency report depends on the size of the service in
question. Furthermore, it depends on how large your Trust and Safety team is along with
your overall strategy concerning user safety. It’s important to note that while the DSA
has universal measures for all services, it doesn’t strictly advise on how these obligations
are to be met. This accounts for the myriad of ways Trust and Safety teams go about
keeping their users safe.

For the purposes of the transparency reports, here is a list of the required information:

  1. Number of moderation actions (including kinds of actions taken).
  2. Number of content reports and any resulting actions taken.
  3. Number of appeals to moderation decisions.
  4. Any actions taken in response to appeals.
  5. Average response time as well as time needed to action requests.
  6. Number of take-down orders issued by EU authorities – these are to be
    categorised by type of content reported.
  7. Number of active users on your service, to be updated biannually.
What Can You Do to Prepare for the DSA?

On February 17th, 2024, the DSA will be enforceable for all in-scope platforms and
services. Businesses need to prepare well in advance of this date. There are a few ways
to do so. Here are some examples:

  1. A key element in preparing for the DSA is actually one of the legislation’s own
    requirements: risk assessments. Carrying out these risk assessments of your
    service will help you understand how your users may be at risk. Additionally, you’ll
    learn how your service may be at risk of non-compliance.
  2. Reviewing your existing processes. How do you take down content? How do you
    process appeals? Do you have an easy-to-find Terms of Service document? How
    about your reporting mechanisms? These are just a few of the questions you may
    ask yourself.
  3. Staying aware of your transparency reporting obligations. We put together an
    expert webinar to help you with this.
  4. Keeping track of developments in the regulatory landscape, including any
    upcoming legislation.
  5. Learning about designing for trust.

FAQ


What is the DSA?

The Digital Services Act, also known as the DSA, is the first attempt by the European Union to regulate platforms. Until now, all 27 EU Member States have each had their own laws that may or may not apply to online platforms. The DSA is the first major attempt to harmonise these separate laws under one universal piece of legislation.


When will the DSA be enforceable for all in-scope platforms?

The DSA will be enforceable for all in-scope Platforms starting February 17th, 2024.


Who does the DSA apply to?

  • Very large online platforms (VLOPs): platforms that reach an average of 45
    million EU users (equivalent to roughly 10% of the EU population) per month.
  • Hosting services, such as web-hosting or cloud storage services.
  • Intermediary services, which don’t usually post their own content but instead
    provide the infrastructure – think domain services or internet service
    providers (ISPs).
  • Online marketplaces, including platforms that facilitate sales and transactions
    between users and sellers, such as e-commerce sites and app stores.
