
Digital Services Act (DSA): The Ultimate FAQ


Where did the DSA come from?

The DSA was created to regulate online safety and platform liability. Its predecessor is the e-Commerce Directive, which is still in force, but the DSA has deleted, replaced or supplemented several of its provisions. In particular, it adds a new tranche of compliance obligations on businesses that fall within its scope.
You may be familiar with the e-Commerce Directive’s categories of “mere conduit”, “caching” and “hosting” services. These categories are carried over into the DSA, which deletes and replaces the e-Commerce Directive’s platform liability provisions.

In essence, services within its scope that take a passive role in respect of the content they transmit, host or disseminate are exempt from liability for that content. Those provisions are carried across and updated from the e-Commerce Directive, with some supplemental provisions added, particularly around information provision.

The e-Commerce Directive already requires providers to supply certain information about commercial communications; the DSA adds to those requirements.

What’s the difference between the DSA and other compliance regimes?

The change the DSA brings in is a long list of additional procedural and process compliance obligations that apply to organizations carrying out content moderation. This is why we are now starting to see organizations feeling they need to put DSA compliance processes in place and be able to prove them.


How does the DSA combine with the Audiovisual Media Services Directive (AVMSD)?

The DSA sits alongside the AVMSD, supplementing its provisions and strengthening the enforcement system behind them. The DSA also applies to a broader range of services that carry out audiovisual content moderation.


DSA: Why now?

The DSA has been brought forward because the e-Commerce Directive was implemented a long time ago and the digital landscape has changed a great deal since then. Social media, marketplaces and other online services are now part of our day-to-day lives, and the regime under the e-Commerce Directive was deemed insufficient to keep the online environment safe.


What does the DSA cover? SAFETY

The DSA sets out notice and action obligations, but content moderation can also be carried out voluntarily. The DSA covers the specific manner in which content moderation must be carried out. However, it doesn’t place a general obligation on services to monitor content; it sets out a structure for dealing with illegal content.


What does the DSA cover? TRANSPARENCY

The DSA covers reporting, transparency and how providers moderate content on their platforms, including recommendation algorithms: providers must be able to explain those algorithms and how they work. The DSA also addresses ad transparency: the main parameters on which adverts are targeted now need to be disclosed in user interfaces as part of these obligations.


What does the DSA cover? ACCOUNTABILITY

Under the DSA, the larger an organization is and the greater its influence from a content moderation and online safety perspective, the greater its obligations.
The largest of these organizations are designated as VLOPs or VLOSEs (Very Large Online Platforms or Very Large Online Search Engines). The obligations on these entities came into force sooner.


What does the DSA cover? USER RIGHTS

The DSA requires providers to explain their content moderation procedures. Providers must inform users about any rights to redress they have, and online platforms must give users a way to challenge moderation decisions. The DSA creates an obligation to put in place internal complaint-handling mechanisms and out-of-court dispute settlement regimes, or to make these accessible, so that users can contest those decisions.


Is DSA like GDPR?

There is a lot of GDPR DNA running through the DSA, especially around transparency, user rights and safety. European digital legislation is clearly starting to coalesce.


What are the responsibilities of intermediary services under the DSA?

Under the Digital Services Act (DSA), intermediary service providers have several responsibilities aimed at promoting online safety, transparency, and accountability. Some of these responsibilities include:

Content Moderation: Intermediary service providers are responsible for implementing measures to detect and remove illegal content promptly. This includes content that violates laws such as hate speech, terrorism, child sexual abuse material, and counterfeit products.
User Reporting Mechanisms: They must provide users with easily accessible and user-friendly mechanisms to report illegal content. These reporting systems should be efficient and transparent, enabling users to notify the platform of any potentially harmful or illegal content they encounter.
Transparency Requirements: Intermediary service providers are obligated to provide transparent information about their content moderation practices, including how content is moderated, the criteria used for removal, and the outcomes of user reports.
Protection of Freedom of Expression: Intermediary service providers need to strike a balance between removing illegal content and preserving legitimate speech.
Risk Assessment and Mitigation: Intermediary service providers are expected to conduct regular risk assessments to identify and mitigate potential risks associated with their services.
Cooperation with Authorities: They must cooperate with competent authorities, including law enforcement agencies and regulatory bodies, to combat illegal activities and ensure compliance with legal obligations.


What are the responsibilities of hosting services under the DSA?

Under the DSA, hosting services have the same responsibilities as intermediary services, plus notice and action obligations, the obligation to provide information to users, and the obligation to report criminal offences or harm.


What are the responsibilities of platform services under the DSA?

Under the DSA, platform services have the same responsibilities as hosting services, plus:
– Provide complaint mechanisms
– Requirements with random checks
– Trusted flaggers
– Measures against abusive notices and counter-notices
– User-facing transparency of ads
– Transparency of recommendation algorithms
– Ban on dark patterns
– Ban on targeted ads to minors


What should be in my DSA risk assessment?

What matters is how quickly content could be disseminated through your services: what is the likelihood and impact of harmful content, and how can risk and likelihood be measured effectively? Checkstep helps online platforms understand the prevalence of certain issues so they can begin to put mitigation strategies in place.
A DSA risk assessment must evaluate potential risks associated with your digital service, including:

1) Legal Compliance: Assess compliance with relevant laws and regulations.
2) Illegal Content: Identify risks related to the presence of illegal content on your platform.
3) User Safety: Evaluate risks to user safety, including the potential for harmful interactions, cyberbullying, harassment, and exposure to inappropriate content.
4) Privacy Risks: Assess risks to user privacy, including data collection, processing, and storage practices, and ensure compliance with data protection regulations such as GDPR or other applicable laws.
5) Cybersecurity: Identify cybersecurity risks, such as data breaches, hacking, malware, and phishing attacks, and implement appropriate measures to safeguard against them.
6) Content Moderation: Evaluate the effectiveness of your content moderation practices in detecting and removing illegal or harmful content, and assess the potential impact of inadequate moderation on users and society.
7) User Reporting Mechanisms: Assess the functionality and effectiveness of user reporting mechanisms for reporting illegal or harmful content, and ensure prompt and appropriate responses to user reports.
8) Cooperation with Authorities: Evaluate your cooperation with competent authorities, including law enforcement agencies and regulatory bodies, in combating illegal activities and ensuring compliance with legal obligations.
9) Technological Risks: Identify risks associated with the use of technology on your platform, such as algorithmic bias, discriminatory practices, and unintended consequences of automated decision-making systems.
10) Third-party Services: Assess risks associated with third-party services or content hosted on your platform, including potential legal liabilities and reputational risks.
11) Emerging Risks: Stay ahead of emerging risks and trends such as new forms of online harm, technological advancements, and evolving regulatory requirements.
12) Business Continuity: Evaluate risks to business continuity, such as service disruptions, financial losses, and reputational damage, and implement strategies to mitigate these risks.


What should be in a Content Moderation Policy? (DSA)

– Clear definition of prohibited content.
– Reporting mechanisms for users.
– Procedures for content moderation.
– Transparency in decision-making.
– Appeals process for users.
– Compliance with privacy regulations.
– Training and support for moderators.
– Commitment to continuous improvement.

Checkstep DSA Plugin automates this directly for online platforms.


What is a Statement of Reasons? (DSA) / How to make a Statement of Reasons (DSA)

A Statement of Reasons under the Digital Services Act (DSA) is a document that explains the rationale behind a decision made by a digital service provider regarding content moderation.
A Statement of Reasons should provide clear and transparent explanations of why certain content was removed, restricted, or otherwise moderated.

To make a Statement of Reasons, digital service providers need to explain the specific reasons for their actions, including references to relevant policies, guidelines, and legal requirements. The statement should be concise, clear, and easily understandable for users, ensuring transparency and accountability in content moderation practices.

The Statement of Reasons has to show whether the provider’s decision was made through human or automated review.
It also has to include a section showing users how to appeal that decision.
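As an illustration, those elements can be captured in a simple record. The TypeScript sketch below is a hypothetical shape for a Statement of Reasons; the field names are our own and are not the official schema.

```typescript
// Hypothetical shape for a Statement of Reasons record (illustrative only;
// the official schema uses its own field names).

type ModerationGround = "illegal_content" | "terms_of_service_violation";
type DecisionMethod = "automated" | "human_review" | "human_review_of_automated_flag";

interface StatementOfReasons {
  contentId: string;                 // which item of content was actioned
  actionTaken: "removal" | "visibility_restriction" | "account_suspension" | "demonetisation";
  ground: ModerationGround;          // illegal content vs. policy breach
  legalReference?: string;           // if illegal: the law relied on
  policyReference?: string;          // if against policy: the clause relied on
  facts: string;                     // the facts and circumstances relied on
  decisionMethod: DecisionMethod;    // automated or human review
  redress: string[];                 // available appeal routes
  issuedAt: string;                  // ISO 8601 timestamp
}

// Example record for a policy-based removal.
const example: StatementOfReasons = {
  contentId: "post-12345",
  actionTaken: "removal",
  ground: "terms_of_service_violation",
  policyReference: "Community Guidelines §4 (harassment)",
  facts: "Post reported by another user and reviewed by a moderator.",
  decisionMethod: "human_review_of_automated_flag",
  redress: ["internal complaint handling", "out-of-court dispute settlement", "judicial redress"],
  issuedAt: new Date().toISOString(),
};

console.log(JSON.stringify(example, null, 2));
```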

With Checkstep DSA Plugin, online platforms can automate its creation in one click.


How to handle complaints against content removal

If a user complains about a decision that removed content, suspended their access, or suspended their ability to monetize content, providers must offer an easy-to-use complaint system, react quickly to complaints, and reinstate content as soon as possible if the appeal succeeds. Providers must not rely solely on automated means when deciding on these complaints.
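A minimal TypeScript sketch of that complaint flow is shown below. It assumes a hypothetical `Complaint` record and `reinstateContent` helper; the point it illustrates is that the outcome comes from a human reviewer and that content is reinstated when an appeal succeeds.

```typescript
// Minimal sketch of internal complaint handling (hypothetical types and helpers).

interface Complaint {
  complaintId: string;
  contentId: string;
  decisionAppealed: "removal" | "account_suspension" | "demonetisation";
  userExplanation: string;
}

type ComplaintOutcome = "upheld" | "reversed";

// Placeholder for the platform-specific reinstatement logic.
function reinstateContent(contentId: string): void {
  console.log(`Content ${contentId} reinstated.`);
}

// Complaint decisions must not be taken solely by automated means, so the
// outcome here comes from a human reviewer rather than a model score.
function resolveComplaint(
  complaint: Complaint,
  humanReviewerDecision: ComplaintOutcome
): ComplaintOutcome {
  if (humanReviewerDecision === "reversed") {
    // Appeal succeeded: undo the original action without undue delay.
    reinstateContent(complaint.contentId);
  }
  console.log(`Complaint ${complaint.complaintId}: ${humanReviewerDecision}`);
  return humanReviewerDecision;
}

// Example usage.
resolveComplaint(
  {
    complaintId: "c-001",
    contentId: "post-12345",
    decisionAppealed: "removal",
    userExplanation: "The post was satire, not harassment.",
  },
  "reversed"
);
```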

To facilitate this step, Checkstep DSA Plugin automatically handles complaints.


What are the requirements for an Out of Court Dispute Settlement? (DSA)

The requirements for an Out-of-Court Dispute Settlement (ODS) body under the Digital Services Act (DSA) include providing an accessible and impartial mechanism for resolving disputes between users and digital service providers. This mechanism must:
– Be independent and impartial of the platforms
– Have expertise in illegal content
– Be easily accessible
– Be fast and cost-effective, operating in at least one EU language
– Publish clear and fair rules
– Reimburse the user’s costs if the user wins
– Not charge the user if they lose

Checkstep DSA Plugin helps online platforms by automating each case report for them.


What should be in a Notice and Action form? (DSA)

According to the DSA, a Notice and Action form must include (see the sketch after this list):
– The reason why the content is illegal
– The URL of the content
– A description of the content
– The contact details of the reporter
– A declaration that the report is accurate
– Information on how to challenge the decision
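As a sketch of how those fields might translate into a submission payload, here is a hypothetical TypeScript structure; the field names are our own and are not prescribed by the DSA.

```typescript
// Hypothetical payload for a Notice and Action submission (illustrative field names).

interface NoticeAndActionSubmission {
  explanation: string;          // why the reporter considers the content illegal
  contentUrl: string;           // exact URL or other precise location of the content
  contentDescription: string;   // what the content is
  reporterName?: string;        // contact details of the reporter
  reporterEmail?: string;
  accuracyDeclaration: boolean; // statement that the notice is accurate and complete
  submittedAt: string;          // ISO 8601 timestamp
}

const notice: NoticeAndActionSubmission = {
  explanation: "The listing offers counterfeit goods, contrary to trademark law.",
  contentUrl: "https://example.com/listing/987",
  contentDescription: "Product listing for branded trainers at an implausible price.",
  reporterName: "Jane Doe",
  reporterEmail: "jane@example.com",
  accuracyDeclaration: true,
  submittedAt: new Date().toISOString(),
};

console.log(JSON.stringify(notice, null, 2));
```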


What should be in a Transparency Report? (DSA)

According to the DSA, a Transparency Report must include (a sketch of the underlying figures follows the list):
– Number of orders to act
– Number of Notice & Actions, categorised 
– Information on automated tools and human training
– Numbers of internal complaints
– Reaction times
– Numbers of out-of-court settlements
– Anonymized data
– AI accuracy metrics
– Number of suspensions
– Monthly active recipients every six months
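Purely as an illustration, the figures above could be collected in a structure like the following TypeScript sketch; the field names and example values are our own and do not follow the Commission’s reporting template.

```typescript
// Hypothetical structure for the figures a DSA transparency report draws on
// (illustrative only; not the official reporting template).

interface TransparencyReportMetrics {
  periodStart: string;                        // ISO 8601 date
  periodEnd: string;
  authorityOrdersReceived: number;            // orders to act against illegal content
  noticesByCategory: Record<string, number>;  // notice-and-action reports, categorised
  automatedToolsDescription: string;          // tools used and moderator training given
  internalComplaints: number;
  medianHandlingTimeHours: number;            // reaction times
  outOfCourtSettlements: number;
  automatedDetectionAccuracy: number;         // e.g. precision of AI flags, 0..1
  suspensionsIssued: number;
  averageMonthlyActiveRecipients: number;     // published at least every six months
}

// Example report object for a six-month period (illustrative values).
const h1Report: TransparencyReportMetrics = {
  periodStart: "2024-01-01",
  periodEnd: "2024-06-30",
  authorityOrdersReceived: 3,
  noticesByCategory: { "hate speech": 120, "counterfeit goods": 45, "scams": 80 },
  automatedToolsDescription: "Keyword and ML classifiers with weekly moderator training.",
  internalComplaints: 37,
  medianHandlingTimeHours: 18,
  outOfCourtSettlements: 2,
  automatedDetectionAccuracy: 0.93,
  suspensionsIssued: 11,
  averageMonthlyActiveRecipients: 250000,
};

console.log(JSON.stringify(h1Report, null, 2));
```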

If you need a template to create your first Transparency Report, you’ll find one here.


When will the DSA be enforced?

On 17 February 2024. It is now officially in force.


What is a Statement of Reasons?

A Statement of Reasons is a document that provides clear explanations for decisions made by digital service providers regarding content moderation under the Digital Services Act (DSA).

It explains:
– What content was actioned,
– Why content was actioned,
– How content was actioned,
– How to appeal.

If you need a template to create your first statement of reasons, you’ll find one here.


When should you issue a Statement of Reasons?

A Statement of Reasons under the DSA should be issued by digital service providers when they take actions like content removal or restriction. It provides clear explanations for these decisions, promoting transparency and accountability in content moderation practices.
This means a Statement of Reasons must be issued in the case of:
– Downranking content
– Hiding content
– Removing content
– Restricting access
– Suspending an account
– Suspending monetization.


What should be in a Statement of Reasons?

A Statement of Reasons must include:
– What content was actioned
– Why content was actioned (Illegal / Against policy)
– If illegal, what law it broke
– If against policy, which policy it went against.
– How content was actioned (AI / Human)
– How to appeal.

If you need a template to create your first statement of reasons, you’ll find one here.


What rights does a user have when their content is removed?

– Receive a Statement of Reasons explaining why their content has been removed.
– Appeal online, free of charge, within six months (“internal complaint handling”).


What if a user doesn’t want to appeal internally?

A user also has the option of an out-of-court dispute settlement (see above): the user is not charged if they lose and is reimbursed if they win.


What is “Notice and Action”? (DSA)

“Notice and Action” under the Digital Services Act (DSA) refers to a process where digital service providers receive notifications about potentially illegal content on their platforms and take appropriate action in response. The “Notice” is the ability for users to flag illegal content, and the “Action” is the obligation on the company to act on it.
This typically involves users reporting illegal content to the platform, which triggers a review process by the provider to assess the reported content and take the necessary action, such as removing or restricting access to it.

Checkstep DSA Plugin helps online platforms by automating this for them.


How should you allow a user to flag illegal content? (DSA)

Users should be provided with a clear and easily accessible “Report” or “Flag” button on the platform’s interface. This button should lead to a user-friendly reporting interface where users can select relevant categories, provide additional details if necessary, and submit their report.
The “Notice and Action” mechanism must be clearly identifiable and close to the content in question, allow multiple items to be reported in one notice, allow submission online, and make it possible to distinguish reliable from unreliable submitters.
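A minimal sketch of how an incoming report might be validated before it enters the moderation queue is shown below. The field names mirror the hypothetical Notice and Action payload above; nothing here is a prescribed API.

```typescript
// Minimal validation of an incoming user report before queueing it for review
// (hypothetical field names; not a prescribed DSA API).

interface IncomingReport {
  category: string;             // e.g. "hate speech", "counterfeit goods"
  contentUrl: string;
  explanation: string;
  reporterEmail?: string;
  accuracyDeclaration: boolean;
}

function validateReport(report: IncomingReport): string[] {
  const errors: string[] = [];
  if (!report.category.trim()) errors.push("A category must be selected.");
  if (!/^https?:\/\//.test(report.contentUrl)) errors.push("A valid URL to the content is required.");
  if (report.explanation.trim().length < 10) errors.push("Please explain why the content is illegal.");
  if (!report.accuracyDeclaration) errors.push("The accuracy declaration must be ticked.");
  return errors;
}

// Example: a report missing its explanation is rejected with actionable messages.
const errors = validateReport({
  category: "scams",
  contentUrl: "https://example.com/post/42",
  explanation: "",
  accuracyDeclaration: true,
});
console.log(errors.length === 0 ? "Report queued for review." : errors.join("\n"));
```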


What does a “Notice and Action” form look like? (DSA)

A “Notice and Action” form typically consists of a user-friendly interface with fields for users to provide information about the content they are reporting. It may include options to select the type of violation (e.g., hate speech, harassment), space to describe the issue, and an option to upload supporting evidence such as screenshots or URLs. The form should be intuitive and easy to navigate, facilitating the quick and accurate reporting of illegal content.


A user has flagged illegal content on my platform, now what?

After receiving a report of illegal content on your platform you must (see the sketch after this list):
– Review the flagged content to assess its validity and determine whether it violates your platform’s policies or legal obligations.
– Take appropriate action, such as removing or restricting access to the content if it is indeed illegal or violates your terms of service.
– Notify the user who flagged the content about the outcome of their report, maintaining transparency in the process (including whether it was an automated or human decision).
– Consider implementing measures to prevent similar violations in the future, and continuously monitor and address content moderation issues on your platform.
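The steps above can be sketched as a single review function. This is a hypothetical outline, not Checkstep’s implementation; the helper names are our own.

```typescript
// Hypothetical outline of handling a flagged item end to end
// (helper names are illustrative, not a real API).

type ReviewDecision = "no_action" | "remove" | "restrict_access";

interface FlaggedItem {
  contentId: string;
  reporterEmail: string;
  category: string;
}

function applyAction(contentId: string, decision: ReviewDecision): void {
  console.log(`Applied "${decision}" to ${contentId}.`);
}

function notifyReporter(email: string, decision: ReviewDecision, automated: boolean): void {
  const how = automated ? "automated review" : "human review";
  console.log(`Emailing ${email}: outcome "${decision}" (decided by ${how}).`);
}

function handleFlaggedItem(
  item: FlaggedItem,
  decision: ReviewDecision,   // outcome of the validity review
  automated: boolean          // whether the decision was automated or human
): void {
  // 1. Review has happened upstream; 2. apply the action if one is needed.
  if (decision !== "no_action") {
    applyAction(item.contentId, decision);
  }
  // 3. Tell the reporter the outcome, including whether it was automated.
  notifyReporter(item.reporterEmail, decision, automated);
  // 4. Feed the case back into monitoring and policy improvement (not shown).
}

// Example usage.
handleFlaggedItem(
  { contentId: "post-12345", reporterEmail: "jane@example.com", category: "scams" },
  "remove",
  false
);
```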

How can you trust users to flag content appropriately?

To help ensure users flag content appropriately, it is possible to designate “trusted flaggers”. This helps because trusted flaggers:
– have expertise in a given area of illegal content
– use audit trails and data analytics before flagging
– prioritize their reports depending on the urgency of the content.

Can you count on the police to help flag illegal content on your site?

No, as they’re likely to be overstretched, although law enforcement becoming a “trusted flagger” is an area to watch. Realistically, independent groups are more likely to become trusted flaggers.

Do you have to moderate your website’s content?

Under the DSA, providers are generally required to implement measures to detect and remove illegal content promptly.

While the DSA does not explicitly mandate content moderation, it imposes obligations on platforms to address illegal content effectively. Therefore, moderation of a website’s content is often necessary to ensure compliance with the DSA.


What should your content moderation policy contain?

A content moderation policy should contain:
– clear guidelines on prohibited content
– reporting mechanisms for users
– moderation procedures
– transparency in decision-making
– an appeals process
– privacy considerations
– training and support for moderators
– different language for adults and minors
– a commitment to continuous improvement.

If you need a template to create or update your Content Moderation policy, you’ll find one here.


What happens to all these Statements of Reasons?

They go to the EU’s DSA Transparency Database, where everyone can view them online.
If you want to see a user-friendly report, here’s the link to our own database.
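For platforms that automate this, the submission is essentially an authenticated HTTP POST of the statement’s fields. The sketch below is a generic illustration only: the endpoint URL, token and payload shape are placeholders supplied by the caller, not the official EU Transparency Database API, whose documented schema should be used in practice.

```typescript
// Generic sketch of pushing a Statement of Reasons to a reporting endpoint.
// The endpoint, token and payload shape are caller-supplied placeholders, not
// the official EU Transparency Database API.

async function submitStatement(
  endpoint: string,                    // the database's documented submission URL
  apiToken: string,                    // credential issued to the platform
  statement: Record<string, unknown>   // statement fields per the official schema
): Promise<void> {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiToken}`,
    },
    body: JSON.stringify(statement),
  });
  if (!response.ok) {
    throw new Error(`Submission failed: ${response.status} ${response.statusText}`);
  }
  console.log("Statement of Reasons submitted.");
}
```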
