
Digital Services Act (DSA) : The Ultimate FAQ


Where did the DSA come from?

The DSA was created to regulate online platform safety and liability. Its predecessor is the e-Commerce Directive, which is still in force, but the DSA has deleted, replaced or supplemented a number of its provisions. In particular, it adds a new tranche of compliance obligations on businesses that fall within its scope.
You may be familiar with the e-Commerce Directive's categories of mere conduit, caching and hosting services. These categories are carried over into the DSA, which deletes and replaces the e-Commerce Directive's platform liability provisions.

In essence, where a service is within scope and takes a passive role in respect of the content it is transmitting, hosting or disseminating, it is exempt from liability for that content. Those provisions are carried across and updated from the e-Commerce Directive, with some supplemental provisions added, particularly around information provision.

The e-Commerce Directive already requires service providers to give certain information about commercial communications; the DSA adds to those requirements.

What’s the difference between the DSA and other compliance?

The change the DSA brings is a long list of additional procedural and process compliance obligations that apply to organizations carrying out content moderation. This is why organizations are now feeling that they need to put in place, and be able to prove, DSA compliance processes.


How does the DSA combine with the Audiovisual Media Services Directive (AVMSD)?

The DSA sits alongside the AVMSD to supplement its provisions and strengthen the enforcement system behind them. The DSA also applies to a broader range of services that carry out audiovisual content moderation.


DSA: Why now?

The DSA was brought forward because the e-Commerce Directive was implemented a long time ago and the digital landscape has changed significantly since then. Social media, marketplaces and other online services are now part of our day-to-day lives, and the regime in the e-Commerce Directive was deemed insufficient to keep the online environment safe.


What does the DSA cover? SAFETY

The DSA refers to notice and action obligations, but content moderation can also be carried out voluntarily. The DSA covers the specific manner in which content moderation must be carried out. However, it does not place a general obligation on services to monitor for illegal content; it sets out a structure for dealing with illegal content once it is identified.


What does the DSA cover? TRANSPARENCY

The DSA covers reporting, transparency and how providers moderate content on their platforms. It requires providers to be able to explain the algorithms they use to moderate and recommend content and how they work. On ad transparency, the main parameters on the basis of which adverts are targeted now need to be disclosed on user interfaces as part of these obligations.
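As a purely illustrative sketch of the kind of per-advert information a user interface might surface under these obligations (the class and field names below are our own assumptions, not wording from the regulation):

```python
from dataclasses import dataclass, field


@dataclass
class AdDisclosure:
    """Hypothetical per-advert transparency record shown alongside an ad."""
    is_advertisement: bool                  # the content is clearly labelled as an ad
    advertiser: str                         # on whose behalf the ad is presented
    payer: str                              # who paid for the ad, if different
    main_targeting_parameters: list[str] = field(default_factory=list)


# Example of what a platform might render next to an advert
disclosure = AdDisclosure(
    is_advertisement=True,
    advertiser="Example Brand Ltd",
    payer="Example Brand Ltd",
    main_targeting_parameters=["approximate location", "interest: running"],
)
print(disclosure)
```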


What does the DSA cover? ACCOUNTABILITY

The DSA scales obligations with influence: the larger the organization, or the greater its influence from a content moderation and online safety perspective, the greater the obligations.
The largest entities are designated VLOPs and VLOSEs (Very Large Online Platforms and Very Large Online Search Engines). Their obligations came into force sooner than those of other providers.


What does the DSA cover? USER RIGHTS

The DSA requires providers to explain their procedures around content moderation. Providers must inform users of any rights to redress they have, and online platforms must give users a way to challenge moderation decisions. The DSA creates an actual obligation to put in place internal complaint-handling mechanisms and out-of-court dispute settlement regimes, or to make these accessible, so that content creators can contest a decision.


Is DSA like GDPR?

There is a lot of GDPR DNA running through the DSA, especially around transparency, user rights and safety. European digital legislation is increasingly coalescing around the same principles.


What are the responsibilities of intermediary services under the DSA?

Under the Digital Services Act (DSA), intermediary service providers have several responsibilities aimed at promoting online safety, transparency, and accountability. Some of these responsibilities include:

Content Moderation: Intermediary service providers are responsible for implementing measures to detect and remove illegal content promptly. This includes content that violates laws such as hate speech, terrorism, child sexual abuse material, and counterfeit products.
User Reporting Mechanisms: They must provide users with easily accessible and user-friendly mechanisms to report illegal content. These reporting systems should be efficient and transparent, enabling users to notify the platform of any potentially harmful or illegal content they encounter.
Transparency Requirements: Intermediary service providers are obligated to provide transparent information about their content moderation practices, including how content is moderated, the criteria used for removal, and the outcomes of user reports.
Protection of Freedom of Expression: Intermediary service providers need to strike a balance between removing illegal content and preserving legitimate speech.
Risk Assessment and Mitigation: Intermediary service providers are expected to conduct regular risk assessments to identify and mitigate potential risks associated with their services.
Cooperation with Authorities: They must cooperate with competent authorities, including law enforcement agencies and regulatory bodies, to combat illegal activities and ensure compliance with legal obligations.


What are the responsibilities of hosting services under the DSA?

Under the DSA, hosting services have the same responsibilities as intermediary services, plus notice and action obligations, the obligation to provide information (statements of reasons) to users, and the obligation to report suspected criminal offences or harm.


What are the responsibilities of platform services under the DSA?

Under the DSA, online platforms have the same responsibilities as hosting services, plus:
– Provide complaint mechanisms
– Requirements with random checks
– Trusted flaggers
– Measures against abusive notices and counter-notices
– User-facing transparency of ads
– Transparency of recommendation algorithms
– Ban on dark patterns
– Ban on targeted ads to minors


What should be in my DSA risk assessment?

What is important is to look at how quickly content could be disseminated through the services in question: essentially, what is the likelihood and impact of the content, and what is an effective way of measuring that risk (a simple likelihood-times-impact scoring approach is sketched after the list below). Checkstep helps online platforms understand the prevalence of certain issues so they can begin to put mitigation strategies in place.
A DSA risk assessment must evaluate potential risks associated with your digital service, including:

1) Legal Compliance: Assess compliance with relevant laws and regulations.
2) Illegal Content: Identify risks related to the presence of illegal content on your platform.
3) User Safety: Evaluate risks to user safety, including the potential for harmful interactions, cyberbullying, harassment, and exposure to inappropriate content.
4) Privacy Risks: Assess risks to user privacy, including data collection, processing, and storage practices, and ensure compliance with data protection regulations such as GDPR or other applicable laws.
5) Cybersecurity: Identify cybersecurity risks, such as data breaches, hacking, malware, and phishing attacks, and implement appropriate measures to safeguard against them.
6) Content Moderation: Evaluate the effectiveness of your content moderation practices in detecting and removing illegal or harmful content, and assess the potential impact of inadequate moderation on users and society.
7) User Reporting Mechanisms: Assess the functionality and effectiveness of user reporting mechanisms for reporting illegal or harmful content, and ensure prompt and appropriate responses to user reports.
8) Cooperation with Authorities: Evaluate your cooperation with competent authorities, including law enforcement agencies and regulatory bodies, in combating illegal activities and ensuring compliance with legal obligations.
9) Technological Risks: Identify risks associated with the use of technology on your platform, such as algorithmic bias, discriminatory practices, and unintended consequences of automated decision-making systems.
10) Third-party Services: Assess risks associated with third-party services or content hosted on your platform, including potential legal liabilities and reputational risks.
11) Emerging Risks: Stay ahead of emerging risks and trends such as new forms of online harm, technological advancements, and evolving regulatory requirements.
12) Business Continuity: Evaluate risks to business continuity, such as service disruptions, financial losses, and reputational damage, and implement strategies to mitigate these risks.
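As a purely illustrative sketch of the likelihood-times-impact scoring mentioned above (the 1 to 5 scales, thresholds and category names are our own assumptions, not DSA requirements):

```python
# Illustrative likelihood x impact scoring for a DSA-style risk register.
# The 1-5 scales and thresholds below are assumptions, not DSA requirements.

RISK_CATEGORIES = [
    "illegal content",
    "user safety",
    "privacy",
    "cybersecurity",
    "algorithmic bias",
]


def risk_score(likelihood: int, impact: int) -> int:
    """Score = likelihood (1-5) x impact (1-5); higher means more urgent."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must be between 1 and 5")
    return likelihood * impact


def risk_level(score: int) -> str:
    """Map a score onto a simple traffic-light level."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"


# Example: fast-spreading illegal content with severe impact
score = risk_score(likelihood=4, impact=5)
print(score, risk_level(score))  # 20 high
```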


What should be in a Content Moderation Policy? (DSA)

– Clear definition of prohibited content.
– Reporting mechanisms for users.
– Procedures for content moderation.
– Transparency in decision-making.
– Appeals process for users.
– Compliance with privacy regulations.
– Training and support for moderators.
– Commitment to continuous improvement.

Checkstep DSA Plugin automates this directly for online platforms.


What is a Statement of Reasons? (DSA) / How to make a Statement of Reasons (DSA)

A Statement of Reasons under the Digital Services Act (DSA) is a document that explains the rationale behind a decision made by a digital service provider regarding content moderation.
A Statement of Reasons should provide a clear and transparent explanation of why certain content was removed, restricted, or otherwise moderated.

To make a Statement of Reasons, digital service providers need to explain the specific reasons for their actions, including references to relevant policies, guidelines, and legal requirements. The statement should be concise, clear, and easily understandable for users, ensuring transparency and accountability in content moderation practices.

The Statement of Reasons must show whether the provider's decision was made through human or automated review.
It must also include a section telling users how to appeal that decision.

With the Checkstep DSA Plugin, online platforms can automate its creation in one click.


How to handle complaints against content removal

Users can complain about a decision that removed their content, suspended their access, or suspended their ability to monetise content. Providers should offer an easy-to-use complaint system, react to complaints quickly, and reinstate content as soon as possible if a user's appeal succeeds. Providers must not rely solely on automated means when deciding these complaints.
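As a rough sketch of that flow (the function names, statuses and fields are our own assumptions, not terms from the regulation or from Checkstep's product):

```python
from datetime import datetime, timezone

# Illustrative internal complaint-handling step. The statuses and fields
# are assumptions for the sketch, not terms defined by the DSA.


def handle_complaint(complaint: dict, human_review_decision: str) -> dict:
    """Resolve a user complaint against a moderation decision.

    The DSA requires that complaints are not decided solely by automated
    means, so a human decision is passed in explicitly.
    """
    resolution = {
        "complaint_id": complaint["id"],
        "original_action": complaint["action"],   # e.g. "content_removed"
        "decision": human_review_decision,        # "upheld" or "reversed"
        "resolved_at": datetime.now(timezone.utc).isoformat(),
    }
    if human_review_decision == "reversed":
        # Reinstate the content / account / monetisation without undue delay
        resolution["follow_up"] = "reinstate"
    return resolution


print(handle_complaint({"id": "c-42", "action": "content_removed"}, "reversed"))
```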

To facilitate this step, Checkstep DSA Plugin automatically handles complaints.


What are the requirements for an Out of Court Dispute Settlement? (DSA)

The requirements for an Out of Court Dispute Settlement (ODS) body under the Digital Services Act (DSA) include providing an accessible and impartial mechanism for resolving disputes between users and digital service providers. The body must:
– Be independent and impartial of platforms
– Have expertise in illegal content
– Be easily accessible
– Be fast and cost-effective, and operate in at least one official EU language
– Publish clear and fair rules
– Reimburse the user's costs if the user wins
– Not charge the user if they lose

Checkstep DSA Plugin helps online platforms by automating each case report for them.


What should be in a Notice and Action form? (DSA)

According to the DSA, a Notice and Action form must include:
– The reason why the content is considered illegal
– The URL of the content
– A description of the content
– The contact details of the reporter
– A declaration that the report is accurate
– Information on how to challenge the resulting decision
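For illustration only, a minimal sketch of how those fields might be captured in a submission payload (the class and field names are our own, not wording from the DSA):

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative notice payload; field names are assumptions, not DSA wording.


@dataclass
class IllegalContentNotice:
    reason_illegal: str            # why the reporter considers the content illegal
    content_url: str               # URL(s) locating the content
    description: str               # description of the content
    reporter_name: Optional[str]   # contact details of the reporter
    reporter_email: Optional[str]
    accuracy_declaration: bool     # reporter confirms the notice is accurate


notice = IllegalContentNotice(
    reason_illegal="Suspected counterfeit goods listing",
    content_url="https://example.com/listing/123",
    description="Listing offers branded goods at implausible prices",
    reporter_name="Jane Doe",
    reporter_email="jane@example.com",
    accuracy_declaration=True,
)
print(notice)
```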


What should be in a Transparency Report? (DSA)

According to the DSA, a Transparency Report must include:
– The number of orders to act received from authorities
– The number of Notice and Action submissions, categorised
– Information on automated tools and human moderator training
– The number of internal complaints
– Reaction times
– The number of out-of-court settlements
– Anonymised data
– AI accuracy metrics
– The number of suspensions
– The number of monthly active recipients, published every six months
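As a minimal sketch of the kind of aggregation that sits behind such a report (the metric names and sample data are our own assumptions, not a prescribed format):

```python
from collections import Counter
from statistics import median

# Toy moderation log; in practice this would come from your moderation system.
actions = [
    {"source": "notice", "category": "hate_speech", "automated": True,  "hours_to_action": 2},
    {"source": "notice", "category": "counterfeit", "automated": False, "hours_to_action": 18},
    {"source": "order",  "category": "terrorism",   "automated": False, "hours_to_action": 1},
]

report = {
    "orders_to_act": sum(1 for a in actions if a["source"] == "order"),
    "notices_by_category": dict(Counter(a["category"] for a in actions if a["source"] == "notice")),
    "automated_decisions": sum(1 for a in actions if a["automated"]),
    "median_hours_to_action": median(a["hours_to_action"] for a in actions),
}
print(report)
```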

If you need a template to create your first Transparency Report, you’ll find one here.


When will the DSA be enforced?

The DSA has applied in full since the 17th of February 2024 and is now officially enforced.


What is a Statement of Reason?

A Statement of Reason is a document that provides clear explanations for decisions made by digital service providers regarding content moderation under the Digital Services Act (DSA).

It explains:
– What content was actioned,
– Why content was actioned,
– How content was actioned,
– How to appeal.

If you need a template to create your first statement of reasons, you’ll find one here.


When should you issue a Statement of Reasons?

A Statement of Reasons under the DSA should be issued by digital service providers when they take actions like content removal or restriction. It provides clear explanations for these decisions, promoting transparency and accountability in content moderation practices.
This means a Statement of Reasons must be issued in the case of:
– Downranking content
– Hiding content
– Removing content
– Restricting access
– Suspending account
– Suspending monetization.


What should be in a Statement of Reasons?

A Statement of Reasons must include:
– What content was actioned
– Why content was actioned (Illegal / Against policy)
– If illegal, what law it broke
– If against policy, which policy it went against.
– How content was actioned (AI / Human)
– How to appeal.
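A minimal sketch of how those fields might be represented in a system that issues statements of reasons automatically (the class, enum and field names are our own, not taken from the DSA or from Checkstep's plugin):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class DecisionBasis(Enum):
    ILLEGAL_CONTENT = "illegal"          # cite the law that was broken
    POLICY_VIOLATION = "against_policy"  # cite the policy that was breached


@dataclass
class StatementOfReasons:
    content_reference: str               # what content was actioned
    action_taken: str                    # e.g. "removed", "visibility restricted"
    basis: DecisionBasis                 # why it was actioned
    legal_ground: Optional[str]          # which law, if illegal
    policy_ground: Optional[str]         # which policy, if against policy
    automated_decision: bool             # AI or human review
    appeal_instructions: str             # how the user can appeal


sor = StatementOfReasons(
    content_reference="post/98765",
    action_taken="removed",
    basis=DecisionBasis.POLICY_VIOLATION,
    legal_ground=None,
    policy_ground="Community guidelines - harassment",
    automated_decision=False,
    appeal_instructions="Appeal within 6 months via Settings > Appeals",
)
print(sor)
```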

If you need a template to create your first statement of reasons, you’ll find one here.


What rights does a user have when their content is removed?

– Get a Statement of Reasons to have an explanation why their content has been removed.
– Get six months to appeal, online for free (“Internal Complaint Handling”).


What if a user doesn’t want to appeal internally?

A user has the option to use an Out of Court Dispute Settlement body (see above), which means they are not charged if they lose and are reimbursed for their costs if they win.


What is “Notice and Action”? (DSA)

“Notice and Action” under the Digital Services Act (DSA) refers to a process where digital service providers receive notifications about potentially illegal content on their platforms and take appropriate action in response. The “Notice” is the ability for users to flag illegal content, and the “Action” means the company must act on it.
This typically involves users reporting illegal content to the platform, which triggers a review process by the provider to assess the reported content and take the necessary action, such as removing or restricting access to the content.

The Checkstep DSA Plugin helps online platforms by automating this for them.


How should you allow a user to flag illegal content? (DSA)

Users should be provided with a clear and easily accessible “Report” or “Flag” button on the platform’s interface. This button should lead to a user-friendly reporting interface where users can select relevant categories, provide additional details if necessary, and submit their report.
The “Notice & Action” mechanism must be clearly identifiable and close to the content in question, allow multiple items to be reported in one notice, allow submission online, and make it possible to distinguish reliable from unreliable submitters.


What does a “Notice and Action” form look like? (DSA)

A “Notice and Action” form typically consists of a user-friendly interface with fields for users to provide information about the content they are reporting. It may include options to select the type of violation (e.g., hate speech, harassment), space to describe the issue, and an option to upload supporting evidence such as screenshots or URLs. The form should be intuitive and easy to navigate, facilitating the quick and accurate reporting of illegal content.


A user has flagged illegal content on my platform, now what?

After receiving a report of illegal content on your platform you must:
– Review the flagged content to assess its validity and determine whether it violates your platform's policies or legal obligations.
– Take appropriate action, such as removing or restricting access to the content if it is indeed illegal or violates your terms of service.
– Notify the user who flagged the content about the outcome of their report, maintaining transparency in the process (including whether it was an automated or human decision).
– Consider measures to prevent similar violations in the future, and continuously monitor and address content moderation issues on your platform.
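Put together as a rough sketch (the function names, categories and fields are our own; this is not a prescribed workflow):

```python
# Illustrative end-to-end handling of a user flag. All names are assumptions.


def review_flag(flag: dict) -> dict:
    """Review flagged content and decide on an action."""
    # In practice this combines automated checks with human review.
    is_violation = flag["category"] in {"hate_speech", "csam", "counterfeit"}
    return {
        "action": "remove" if is_violation else "no_action",
        "automated": False,
    }


def notify_reporter(flag: dict, decision: dict) -> str:
    """Tell the reporter the outcome, including whether it was automated."""
    how = "automated" if decision["automated"] else "human"
    return (
        f"Your report {flag['id']} was reviewed ({how} decision): "
        f"{decision['action']}"
    )


flag = {"id": "f-7", "category": "counterfeit", "url": "https://example.com/x"}
decision = review_flag(flag)
print(notify_reporter(flag, decision))
```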

How can you trust users to flag content appropriately?

To help ensure users flag content appropriately, it is possible to designate “trusted flaggers”. This helps because trusted flaggers:
– have expertise in a given area of illegal content
– use audit trails and data analytics before flagging
– prioritise their reports depending on the urgency of the content.
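As an illustrative sketch of how a review queue might prioritise trusted-flagger reports (the urgency weights and scoring are our own assumptions, not rules set by the DSA):

```python
import heapq

# Illustrative priority queue: notices from trusted flaggers and about more
# urgent categories are reviewed first. Weights are assumptions, not DSA rules.

URGENCY = {"csam": 3, "terrorism": 3, "hate_speech": 2, "counterfeit": 1}


def priority(notice: dict) -> int:
    base = URGENCY.get(notice["category"], 1)
    bonus = 2 if notice["trusted_flagger"] else 0
    return -(base + bonus)  # heapq is a min-heap, so negate for "highest first"


queue: list[tuple[int, str]] = []
for n in [
    {"id": "n1", "category": "counterfeit", "trusted_flagger": False},
    {"id": "n2", "category": "hate_speech", "trusted_flagger": True},
]:
    heapq.heappush(queue, (priority(n), n["id"]))

print(heapq.heappop(queue)[1])  # n2: trusted flagger + urgent category goes first
```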

Can you count on the police to help flag illegal content on your site?

No, as they are likely to be overstretched, although law enforcement becoming a ‘trusted flagger’ is an area to watch. Realistically, independent groups are more likely to become trusted flaggers.

Do you have to moderate your website’s content?

Under the DSA, providers are generally required to act promptly to remove illegal content once they become aware of it.

While the DSA does not impose a general obligation to monitor content, it does impose obligations on platforms to address illegal content effectively. In practice, therefore, moderating a website's content is usually necessary to ensure compliance with the DSA.


What should your content moderation policy contain?

A content moderation policy should contain:
– clear guidelines on prohibited content
– reporting mechanisms for users
– moderation procedures
– transparency in decision-making
– an appeals process
– privacy considerations
– training and support for moderators
– different language for adults and minors
– a commitment to continuous improvement.

If you need a template to create or update your Content Moderation policy, you’ll find one here.


What happens to all these Statements of Reasons?

They are submitted to the DSA Transparency Database, where anyone can view them online.
If you want to see a user-friendly report, here’s the link to our own database.
