Where did the DSA come from?
The DSA was created to regulate online safety and platform liability. Its predecessor is the e-Commerce Directive, which is still in force, but the DSA has deleted, replaced or added to some of its provisions. In particular, it has added a new tranche of compliance obligations on businesses that fall within its scope.
You may well be familiar with the e-Commerce Directive's categories of mere conduit, caching and hosting services. These categories are carried over into the DSA, which deletes and replaces the e-Commerce Directive's platform liability provisions.
In essence, services within scope that take a passive role in respect of the content they are intermediating (transmitting, hosting or disseminating) are exempt from liability for that content. Those provisions are carried across and updated from the e-Commerce Directive, with some supplemental provisions, particularly around information provision.
The e-Commerce Directive already requires service providers to give certain information about commercial communications; the DSA adds to those requirements.
The change the DSA brings in is a long list of additional compliance and procedural obligations that apply to organizations carrying out content moderation. This is why we are now starting to see organizations feeling that they need to set up, and be able to prove, DSA compliance processes.
How does the DSA combine with the Audiovisual Media Services Directive (AVMSD)?
The DSA sits alongside the AVMSD to supplement its provisions and to strengthen the enforcement system behind them. The DSA also applies to a broader range of services that carry out audiovisual content moderation.
DSA: Why now?
The DSA was brought forward because the e-Commerce Directive was implemented a long time ago and the digital landscape has changed a great deal since then. Social media, marketplaces and other online services are now part of our day-to-day lives, and the regime under the e-Commerce Directive was deemed insufficient to keep the online environment safe.
What does the DSA cover? SAFETY
The DSA refers to notice and action obligations, but content moderation can also be carried out voluntarily. The DSA covers the specific manner in which you must carry out content moderation. However, it does not place a general obligation on services to monitor content; it simply sets out a structure for dealing with illegal content.
What does the DSA cover? TRANSPARENCY
The DSA covers reporting, transparency and how providers moderate content on their platforms. It requires providers to be able to explain their algorithms and how they work. It also touches on ad transparency: the main parameters on which adverts are targeted now need to be disclosed in user interfaces as part of these obligations.
What does the DSA cover? ACCOUNTABILITY
The DSA takes a tiered approach: the larger the organization, or the greater its influence from a content moderation and online safety perspective, the greater its obligations.
The largest of these organizations are called VLOPs and VLOSEs (Very Large Online Platforms and Very Large Online Search Engines). The obligations on these entities come into force sooner.
What does the DSA cover? USER RIGHTS
The DSA wants providers to explain their procedures around content moderation. Providers will have to inform users about any rights to redress they have, and online platforms must give users a way to challenge the moderation decisions they make. The DSA creates an actual obligation to put in place internal complaint mechanisms and out-of-court dispute settlement regimes, or to make these accessible, so that content creators can contest those decisions.
Is DSA like GDPR?
There is a lot of GDPR DNA running through the DSA, especially around transparency, user rights and safety. It is becoming clear that European digital legislation is starting to coalesce.
What are the responsibilities of intermediary services under the DSA?
Under the Digital Services Act (DSA), intermediary service providers have several responsibilities aimed at promoting online safety, transparency, and accountability. Some of these responsibilities include:
Content Moderation: Intermediary service providers are responsible for implementing measures to detect and remove illegal content promptly. This includes content that violates the law, such as hate speech, terrorist content, child sexual abuse material, and counterfeit products.
User Reporting Mechanisms: They must provide users with easily accessible and user-friendly mechanisms to report illegal content. These reporting systems should be efficient and transparent, enabling users to notify the platform of any potentially harmful or illegal content they encounter.
Transparency Requirements: Intermediary service providers are obligated to provide transparent information about their content moderation practices, including how content is moderated, the criteria used for removal, and the outcomes of user reports.
Protection of Freedom of Expression: Intermediary service providers need to find a balance between removing illegal content and preserving legitimate speech.
Risk Assessment and Mitigation: Intermediary service providers are expected to conduct regular risk assessments to identify and mitigate potential risks associated with their services.
Cooperation with Authorities: They must cooperate with competent authorities, including law enforcement agencies and regulatory bodies, to combat illegal activities and ensure compliance with legal obligations.
What are the responsibilities of hosting services under the DSA?
Under the DSA, the responsibilities of hosting services are the same as those of intermediary services, plus notice and action obligations, the obligation to provide information to users, and the obligation to report criminal offenses or harm.
What are the responsibilities of platform services under the DSA?
Under the DSA, the responsibilities of platform services are the same as those of hosting services, plus:
– Provide complaint mechanisms
– Requirements with random checks
– Trusted flaggers
– Measures against abusive notices and counter-notices
– User-facing transparency of ads
– Transparency of recommendation algorithms
– Ban on dark patterns
– Ban on targeted ads to minors
What should be in my DSA risk assessment?
What is important is to look at how quickly content could be disseminated through your services: what is the likelihood and impact of harmful content, and what is an effective way to measure that risk and likelihood. Checkstep helps online platforms understand the prevalence of certain issues so they can begin to put mitigation strategies in place.
A DSA risk assessment must evaluate potential risks associated with your digital service, including:
1) Legal Compliance: Assess compliance with relevant laws and regulations.
2) Illegal Content: Identify risks related to the presence of illegal content on your platform.
3) User Safety: Evaluate risks to user safety, including the potential for harmful interactions, cyberbullying, harassment, and exposure to inappropriate content.
4) Privacy Risks: Assess risks to user privacy, including data collection, processing, and storage practices, and ensure compliance with data protection regulations such as GDPR or other applicable laws.
5) Cybersecurity: Identify cybersecurity risks, such as data breaches, hacking, malware, and phishing attacks, and implement appropriate measures to safeguard against them.
6) Content Moderation: Evaluate the effectiveness of your content moderation practices in detecting and removing illegal or harmful content, and assess the potential impact of inadequate moderation on users and society.
7) User Reporting Mechanisms: Assess the functionality and effectiveness of user reporting mechanisms for reporting illegal or harmful content, and ensure prompt and appropriate responses to user reports.
8) Cooperation with Authorities: Evaluate your cooperation with competent authorities, including law enforcement agencies and regulatory bodies, in combating illegal activities and ensuring compliance with legal obligations.
9) Technological Risks: Identify risks associated with the use of technology on your platform, such as algorithmic bias, discriminatory practices, and unintended consequences of automated decision-making systems.
10) Third-party Services: Assess risks associated with third-party services or content hosted on your platform, including potential legal liabilities and reputational risks.
11) Emerging Risks: Stay ahead of emerging risks and trends such as new forms of online harm, technological advancements, and evolving regulatory requirements.
12) Business Continuity: Evaluate risks to business continuity, such as service disruptions, financial losses, and reputational damage, and implement strategies to mitigate these risks.
What should be in a Content Moderation Policy? (DSA)
– Clear definition of prohibited content.
– Reporting mechanisms for users.
– Procedures for content moderation.
– Transparency in decision-making.
– Appeals process for users.
– Compliance with privacy regulations.
– Training and support for moderators.
– Commitment to continuous improvement.
Checkstep DSA Plugin automates this directly for online platforms.
What is a Statement of Reasons? (DSA) / How to make a Statement of Reasons (DSA)
A Statement of Reasons under the Digital Services Act (DSA) is a document that explains the rationale behind a decision made by a digital service provider regarding content moderation.
A Statement of Reasons should provide clear and transparent explanations of why certain content was removed, restricted, or otherwise moderated.
To make a Statement of Reasons, digital service providers need to explain the specific reasons for their actions, including references to relevant policies, guidelines, and legal requirements. The statement should be concise, clear, and easily understandable for users, ensuring transparency and accountability in content moderation practices.
The Statement of Reasons has to show whether the provider's decision was made through human or automated review.
The Statement of Reasons also has to include a section showing users how to appeal that decision.
With Checkstep DSA Plugin, online platforms can automate its creation in one click.
How to handle complaints against content removal
Users can complain about decisions that removed their content, suspended their access, or suspended their ability to monetize content. Providers should offer an easy-to-use complaint system, react quickly to complaints, and reinstate content as soon as possible if the user's appeal succeeds. Providers must not rely solely on automated means when handling these complaints.
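To make this concrete, here is a minimal TypeScript sketch of what an internal complaint-handling flow could look like. The types and functions (Complaint, humanReview, reinstateContent, notifyUser) are hypothetical placeholders for a platform's own systems, not part of the DSA or of any particular product.

```typescript
// Hypothetical sketch of an internal complaint-handling flow.
// All types and function names are illustrative, not a real API.

type ModerationAction = "content_removed" | "account_suspended" | "monetization_suspended";

interface Complaint {
  id: string;
  userId: string;
  contentId: string;
  originalAction: ModerationAction;
  submittedAt: Date;
  userStatement: string;
}

interface ComplaintOutcome {
  complaintId: string;
  upheld: boolean;            // true = original decision stands
  reviewedByHuman: boolean;   // complaints should not be decided by automated means alone
  decidedAt: Date;
}

// Stand-ins for the platform's own review, enforcement and messaging systems.
declare function humanReview(complaint: Complaint): Promise<{ upheld: boolean }>;
declare function reinstateContent(contentId: string): Promise<void>;
declare function notifyUser(userId: string, outcome: ComplaintOutcome): Promise<void>;

async function handleComplaint(complaint: Complaint): Promise<ComplaintOutcome> {
  // A human reviewer makes the final call on the appeal.
  const { upheld } = await humanReview(complaint);

  const outcome: ComplaintOutcome = {
    complaintId: complaint.id,
    upheld,
    reviewedByHuman: true,
    decidedAt: new Date(),
  };

  // If the appeal succeeds, reverse the original action promptly.
  if (!upheld && complaint.originalAction === "content_removed") {
    await reinstateContent(complaint.contentId);
  }

  // Tell the user the result either way.
  await notifyUser(complaint.userId, outcome);
  return outcome;
}
```

The key points illustrated are that a human reviewer makes the final decision and that the original action is reversed promptly when an appeal succeeds.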
To facilitate this step, Checkstep DSA Plugin automatically handles complaints.
What are the requirements for an Out of Court Dispute Settlement? (DSA)
The requirements for an Out-of-Court Dispute Settlement (ODS) body under the Digital Services Act (DSA) include providing an accessible and impartial mechanism for resolving disputes between users and digital service providers. This mechanism must:
– Be independent and impartial of platforms
– Have expertise in illegal content
– Be easily accessible
– Be fast and cost-effective, and operate in at least one EU language
– Publish clear and fair rules
– Reimburse the user's costs if the user wins
– Not charge the user if they lose
Checkstep DSA Plugin helps online platforms by automating each case report for them.
What should be in a Notice and Action form? (DSA)
According to the DSA, a Notice and Action form must include the following (a minimal data sketch follows this list):
– Reason why content is illegal
– URL to content
– Description of content
– Contact details of reporter
– A declaration that the report is accurate
– How to challenge the decision
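As a rough illustration only, the fields above could be captured in a data structure like the TypeScript sketch below. The interface and field names are our own assumptions, not a schema prescribed by the DSA.

```typescript
// Illustrative data model for a DSA-style Notice and Action submission.
// Field names are hypothetical; the DSA prescribes the information, not the schema.

interface NoticeAndAction {
  illegalityReason: string;        // why the reporter considers the content illegal
  contentUrls: string[];           // exact location(s) of the content (one notice can cover several items)
  contentDescription: string;      // description of the content being reported
  reporterContact?: {              // contact details of the reporter
    name: string;
    email: string;
  };
  accuracyDeclaration: boolean;    // reporter confirms the notice is accurate and made in good faith
  submittedAt: string;             // ISO 8601 timestamp
}

// Example submission a platform might receive:
const exampleNotice: NoticeAndAction = {
  illegalityReason: "Counterfeit goods offered for sale",
  contentUrls: ["https://example.com/listing/123"],
  contentDescription: "Listing offering counterfeit branded watches",
  reporterContact: { name: "Jane Doe", email: "jane@example.com" },
  accuracyDeclaration: true,
  submittedAt: new Date().toISOString(),
};
```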
What should be in a Transparency Report? (DSA)
According to the DSA, a Transparency Report must include the following (a sketch of the underlying data follows this list):
– Number of orders to act
– Number of Notice & Actions, categorised
– Information on automated tools and human training
– Numbers of internal complaints
– Reaction times
– Numbers of out-of-court settlements
– Anonymized data
– AI accuracy metrics
– Number of suspensions
– Monthly active recipients every six months
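Purely as a sketch, the figures above could be collected into a structure like the one below before being written up into the report. The type and field names are illustrative assumptions, not a format mandated by the DSA.

```typescript
// Hypothetical shape for the data behind a DSA transparency report.
// All figures should be reported in anonymized, aggregate form.

interface TransparencyReportData {
  reportingPeriod: { from: string; to: string };      // e.g. a six-month window
  ordersToActReceived: number;                         // orders from authorities to act against content
  noticesByCategory: Record<string, number>;           // Notice & Action counts, broken down by category
  automatedToolsDescription: string;                   // information on automated moderation tools
  moderatorTrainingDescription: string;                // information on human moderator training
  internalComplaintsReceived: number;
  medianResponseTimeHours: number;                     // reaction times to notices and complaints
  outOfCourtSettlements: number;
  aiAccuracy: { precision: number; recall: number };   // accuracy metrics for automated moderation
  accountSuspensions: number;
  monthlyActiveRecipients: number;                     // published at least every six months
}
```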
If you need a template to create your first Transparency Report, you’ll find one here.
When will the DSA be enforced?
From the 17th of February 2024. It is now officially in force.
What is a Statement of Reasons?
A Statement of Reasons is a document that provides clear explanations for decisions made by digital service providers regarding content moderation under the Digital Services Act (DSA).
It explains:
– What content was actioned,
– Why content was actioned,
– How content was actioned,
– How to appeal.
If you need a template to create your first statement of reasons, you’ll find one here.
When should you issue a Statement of Reasons?
A Statement of Reasons under the DSA should be issued by digital service providers when they take actions like content removal or restriction. It provides clear explanations for these decisions, promoting transparency and accountability in content moderation practices.
This means a Statement of Reasons must be issued in the case of:
– Downranking content
– Hiding content
– Removing content
– Restricting access
– Suspending account
– Suspending monetization.
What should be in a Statement of Reasons?
A Statement of Reasons must include the following (a minimal data model is sketched after this list):
– What content was actioned
– Why content was actioned (Illegal / Against policy)
– If illegal, what law it broke
– If against policy, which policy it went against.
– How content was actioned (AI / Human)
– How to appeal.
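As an illustration, the required elements could be modelled with a data structure like the TypeScript sketch below. The type names and fields are our own assumptions, not a schema defined by the DSA.

```typescript
// Illustrative data model for a DSA Statement of Reasons.
// Names and structure are a sketch, not a format defined by the DSA.

type ActionTaken =
  | "content_removed"
  | "content_hidden"
  | "content_downranked"
  | "access_restricted"
  | "account_suspended"
  | "monetization_suspended";

type DecisionBasis =
  | { kind: "illegal"; lawViolated: string }       // which law the content broke
  | { kind: "policy"; policyViolated: string };    // which platform policy it went against

interface StatementOfReasons {
  contentId: string;            // what content was actioned
  actionTaken: ActionTaken;     // how it was actioned
  basis: DecisionBasis;         // why it was actioned
  automatedDecision: boolean;   // AI or human review
  appealInstructions: string;   // how the user can appeal
  issuedAt: string;             // ISO 8601 timestamp
}
```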
If you need a template to create your first statement of reasons, you’ll find one here.
What rights does a user have when their content is removed?
– Get a Statement of Reasons explaining why their content has been removed.
– Get six months to appeal, online and free of charge (“Internal Complaint Handling”).
What if a user doesn’t want to appeal internally?
The user has the option of an out-of-court dispute settlement (see above): they pay nothing if they lose and are reimbursed if they win.
What is “Notice and Action”? (DSA)
“Notice and Action” under the Digital Services Act (DSA) refers to a process where digital service providers receive notifications about potentially illegal content on their platforms and take appropriate action in response. The “Notice” is the ability for users to flag illegal content, and the “Action” is the obligation on the company to act on it.
This typically involves users reporting illegal content to the platform, which then triggers a review process by the provider to assess the reported content and take the necessary actions, such as removing or restricting access to the content.
The Checkstep DSA Plugin helps online platforms by automating this for them.
How should you allow a user to flag illegal content? (DSA)
Users should be provided with a clear and easily accessible “Report” or “Flag” button on the platform’s interface. This button should lead to a user-friendly reporting interface where users can select relevant categories, provide additional details if necessary, and submit their report.
The “Notice & Action” mechanism must be clearly identifiable and placed close to the content in question, allow multiple items to be reported through one notice, allow submission online, and allow the platform to distinguish reliable from unreliable submitters.
What does a “Notice and Action” form look like? (DSA)
A “Notice and Action” form typically consists of a user-friendly interface with fields for users to provide information about the content they are reporting. It may include options to select the type of violation (e.g., hate speech, harassment), space to describe the issue, and an option to upload supporting evidence such as screenshots or URLs. The form should be intuitive and easy to navigate, facilitating the quick and accurate reporting of illegal content.
A user has flagged illegal content on my platform, now what?
After receiving a report of illegal content on your platform, you must (a workflow sketch follows this list):
– Review the flagged content to assess its validity and determine if it violates your platform's policies or legal obligations.
– Take appropriate action, such as removing or restricting access to the content if it is indeed illegal or violates your terms of service.
– Notify the user who flagged the content about the outcome of their report, maintaining transparency in the process (include whether it was an automated or human decision).
– Consider implementing measures to prevent similar violations in the future, and continuously monitor and address content moderation issues on your platform.
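As a simplified sketch of this review-and-respond flow, the TypeScript below shows one way the steps could fit together. The types and helper functions (FlaggedItem, reviewContent, removeContent, sendStatementOfReasons, notifyReporter) are hypothetical placeholders for a platform's own systems.

```typescript
// Hypothetical end-to-end handling of a user-submitted notice.
// All types and functions are illustrative placeholders, not a real API.

interface FlaggedItem {
  contentUrl: string;
  reporterEmail: string;
}

type ReviewOutcome = "illegal" | "policy_violation" | "no_violation";

declare function reviewContent(item: FlaggedItem): Promise<ReviewOutcome>;
declare function removeContent(contentUrl: string): Promise<void>;
declare function sendStatementOfReasons(contentUrl: string, outcome: ReviewOutcome, automated: boolean): Promise<void>;
declare function notifyReporter(reporterEmail: string, outcome: ReviewOutcome, automated: boolean): Promise<void>;

async function processFlaggedItem(item: FlaggedItem): Promise<void> {
  // 1. Review the flagged content against the law and the platform's policies.
  const outcome = await reviewContent(item);
  const automated = false; // in this sketch a human makes the final call

  // 2. Take appropriate action if the content is illegal or breaches policy.
  if (outcome !== "no_violation") {
    await removeContent(item.contentUrl);
    // 2a. Inform the uploader with a Statement of Reasons.
    await sendStatementOfReasons(item.contentUrl, outcome, automated);
  }

  // 3. Tell the reporter the result, including whether the decision was automated.
  await notifyReporter(item.reporterEmail, outcome, automated);
}
```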
To make sure content is flagged appropriately, it is possible to designate “trusted flaggers”. This helps because they:
– have expertise in a given area of illegal content
– use audit trails and data analytics before flagging
– prioritize their reports depending on the urgency of the content.
Will law enforcement act as trusted flaggers?
No, as they are likely overstretched, although law enforcement becoming a “trusted flagger” is an area to watch. Realistically, independent groups are more likely to become trusted flaggers.
Is content moderation mandatory under the DSA?
Under the DSA, providers are generally required to implement measures to detect and remove illegal content promptly.
While the DSA does not explicitly mandate content moderation, it imposes obligations on platforms to address illegal content effectively. Therefore, moderation of a website’s content is often necessary to ensure compliance with the DSA.
What should your content moderation policy contain?
A content moderation policy should contain:
– clear guidelines on prohibited content
– reporting mechanisms for users
– moderation procedures
– transparency in decision-making
– appeals process
– privacy considerations
– training and support for moderators
– different language for adults and minors
– commitment to continuous improvement.
If you need a template to create or update your Content Moderation policy, you’ll find one here.
What happens to all these Statements of Reasons?
They go to the DSA Transparency Database, a public database where everyone can view them online.
If you want to see a user-friendly report, here’s the link to our own database.