Watch the recording

Digital Services Act Deep Dive: Statement of Reason & Appeals


How Statements of Reason should align to your policies

When should you produce one, and what information should it contain? The Statement of Reasons is a cornerstone of a fair and user-centric digital landscape.

The hidden benefits of Appeals

Empowering users to challenge content moderation decisions by allowing them to present their perspectives and points of view is required under the DSA. We explain the non-legal benefits of offering appeals to your users.

Why human reviewers are here to stay

The DSA requires a human layer for appeals to ensure fairness and objectivity. With the right systems, you can use automation to flag unwanted content, improving both the speed and the efficiency of human decisions.

Out-of-court disputes: what do we know so far?

What do we know so far about out-of-court dispute settlement, and how should you prepare evidence?

What is discussed in the webinar?

The webinar covered various aspects of the DSA, including its background and scope, responsibilities around statements of reasons, internal complaint handling, and out-of-court disputes. It also covered the importance of user redress, the need for users to understand decisions made by online platforms, and the significance of clarity and precision in the statements of reasons provided to users. The discussion offers insights into how platforms should handle content moderation, appeals, and user rights under the DSA.

Statement of Reason

What is a Statement of Reason?

If you offer a hosting service or online platform and moderate content, a Statement of Reasons (SoR) is a clear statement that explains to the affected user why a piece of content was actioned and what rights they have to appeal. This statement must:

  • Be issued as soon as you action content
  • Tell the user what content was actioned
  • Tell the user why the content was actioned
  • Tell the user how the content was actioned
  • Tell the user what they can do to appeal
  • Be clear, easily understandable and precise
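
As a hypothetical sketch, the required information above could be modeled as a simple data structure. The field names here are assumptions for illustration, not the official DSA Transparency Database schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StatementOfReasons:
    """Illustrative SoR record; field names are assumptions, not an official schema."""
    content_id: str                # which content was actioned
    action_taken: str              # how it was actioned, e.g. "removed", "hidden", "downranked"
    facts_and_circumstances: str   # why it was actioned
    legal_or_policy_ground: str    # the law or T&C clause relied upon
    automated_detection: bool      # whether automated means were used
    redress_options: list          # what the user can do to appeal
    issued_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def validate(self) -> None:
        """Reject an SoR with any substantive field left empty."""
        for name in ("content_id", "action_taken",
                     "facts_and_circumstances", "legal_or_policy_ground"):
            if not getattr(self, name).strip():
                raise ValueError(f"Statement of Reasons missing required field: {name}")
        if not self.redress_options:
            raise ValueError("Statement of Reasons must list redress options")
```

Populating and validating such a record as soon as an enforcement action is taken makes "issued as soon as you action content" a property of the workflow rather than a manual step.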

Providing a Statement of Reason is essential for transparency, accountability, and user trust on a platform, as well as a key part of being compliant with the Digital Services Act.

  1. Transparency: Users deserve to know why their content was moderated or taken down. A clear statement helps users understand the rules and guidelines they need to follow.
  2. Accountability: By explaining the reasons behind content actions, the platform demonstrates accountability for its decisions, ensuring that they are based on objective criteria and not arbitrary measures.
  3. User Trust: When users know the reasons for content moderation, they are more likely to trust the platform and feel that their rights and concerns are respected.
  4. Appeals Process: A Statement of Reasons allows users to identify potential misunderstandings or errors in moderation decisions and provides them with information on how they can appeal the decision if they believe it was unjustified.
  5. Legal Compliance: Providing a clear statement of reasons is an explicit legal requirement for platforms under the DSA.
  6. Improvement: Analysing and documenting the reasons for content actions can help the platform improve its content moderation policies and practices over time

The Statement of Reasons must be clear, easily understandable and precise.

When should a Statement of Reason be issued?

When it comes to moderating content, there are times when certain measures need to be applied to maintain a trustworthy and secure online environment. 

Content enforcement encompasses actions that limit the visibility and accessibility of specific information shared within a service. These measures are essential to ensure a positive user experience and to uphold community guidelines.

Essentially, a Statement of Reasons should be issued whenever an enforcement action is taken, such as:

  • Downranking Content: Adjusting the visibility or prominence of specific content to maintain quality and relevance.
  • Hiding Content: Temporarily making certain content not visible to the public or specific users to address potential issues.
  • Removing/Deleting Content: Permanently taking down or deleting content that violates guidelines or poses risks to the community.
  • Restricting Access to the Service: Temporarily limiting access to certain features, such as placing an account in “read-only” mode, as a preventive measure.
  • Permanently Deleting Accounts or Temporarily Suspending Access: Taking necessary actions to manage user accounts that may be in violation of policies or community guidelines. This could include either permanent account deletion or temporary suspension of access as deemed appropriate.

Checkstep has an inbuilt Statement of Reasons workflow which explains the process and the user’s rights in a clear and transparent way.

Are there times when an SoR isn't needed?

  1. Removing illegal content: In certain instances, content removal or other action is taken in response to legal orders targeting illegal content (Art. 9); in such cases, platforms may be exempt from providing an SoR.

  2. Contact Information Considerations: Platforms are relieved from the obligation of furnishing an SoR when they lack access to the user’s contact details. This exemption is applicable when moderation actions are taken without means for direct communication.

  3. Dealing with Deceptive High Volume Content: Deceptive high volume content, such as spam, poses a persistent challenge in the digital landscape. Platforms have the discretion to address such content without necessarily providing detailed explanations through an SoR.
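
The three exemptions above can be sketched as a simple decision helper. This is a hypothetical illustration; the parameter names are assumptions, and real compliance logic would of course need legal review:

```python
def sor_required(has_contact_details: bool,
                 removal_under_legal_order: bool,
                 deceptive_high_volume: bool) -> bool:
    """Illustrative check: is a Statement of Reasons needed for an enforcement action?

    Returns False only when one of the three exemptions applies.
    """
    if removal_under_legal_order:   # Art. 9 legal orders targeting illegal content
        return False
    if not has_contact_details:     # no means of direct communication with the user
        return False
    if deceptive_high_volume:       # e.g. spam
        return False
    return True
```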

In the context of regulatory guidance, two crucial Recitals warrant attention:

  1. Recital 64 – Prior Warnings: Platforms are advised, through Recital 64, to issue prior warnings to users before resorting to severe measures like account suspension. This practice allows users the opportunity to rectify potential violations, promoting fairness and adherence to rules.

  2. Recital 66 – Real-time Reporting: Recital 66 emphasises the timely transmission of SoRs to the Commission’s database, where feasible. This provision enables regulatory authorities to monitor content moderation practices effectively and ensure compliance with guidelines.

Internal Complaint Handling

The User's Right to Complain about a Content Moderation Action

Whenever you enforce content and provide a Statement of Reasons, you must give the user access to an internal complaint mechanism for up to six months after the relevant decision, electronically and free of charge.

  1. Decisions Involving Information Removal or Access Disabling: Whenever a decision results in the removal of content or the disabling of user access to information, the affected user retains the right to voice their concerns and seek redress through the internal complaints mechanism.

  2. Decisions Pertaining to Service Suspension or Termination: In cases where the platform takes action to suspend or terminate the provision of services (partially or entirely) to recipients, users are granted the opportunity to initiate internal complaints within the designated six-month timeframe.

  3. Decisions Relating to Account Suspension or Termination: Should a user’s account face suspension or termination as a consequence of platform decisions, they are entitled to raise complaints through the established internal channels.

  4. Decisions Affecting Monetisation Opportunities: In situations where the platform decides to suspend or terminate a user’s ability to monetise the information they provide, users have the right to challenge the decision internally.
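
As a minimal sketch, the six-month complaint window could be enforced like this. The function name and the exact day count used to approximate six months are assumptions, not a prescribed calculation:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Approximation of the six-month window; the exact counting rule is an assumption.
COMPLAINT_WINDOW = timedelta(days=183)

def complaint_window_open(decision_at: datetime,
                          now: Optional[datetime] = None) -> bool:
    """Illustrative check: can the user still lodge an internal complaint?

    The DSA gives users up to six months after the relevant decision.
    """
    now = now or datetime.now(timezone.utc)
    return now <= decision_at + COMPLAINT_WINDOW
```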

Checkstep has an appeals workflow built in which explains the user’s rights in a clear and transparent way.

A well-designed complaints procedure is the cornerstone of user empowerment and fairness in content moderation. It should be readily accessible, user-friendly, and equipped to handle sufficiently precise and well-substantiated complaints. Timeliness is of the essence, ensuring that complaints are promptly addressed and resolved. When initial content or user actions are overturned upon appeal, the procedure must facilitate the swift restoration of content or access without unnecessary delays. Furthermore, the process should not solely rely on automated mechanisms, recognising the importance of human review and judgment in fostering a just and equitable online environment. With these principles at its core, the complaints procedure becomes a powerful tool in upholding user rights and maintaining the integrity of content governance policies.

Out of Court Disputes

What is the mechanism for settling disputes?

As well as offering internal complaint handling, platforms must give users access to an out-of-court dispute settlement body that has been certified to resolve disputes, whether or not the user has first gone through internal complaint handling.


The requirements for certification of such a settlement body include:

  • Being independent and impartial of platforms
  • Having expertise in illegal content or one or more areas of enforcement of T&Cs
  • Being easily accessible through electronic means
  • Being fast, efficient, and cost-effective, and able to conduct dispute resolution in at least one official Union language
  • Publishing clear and fair rules of procedure

If the user wins an out-of-court dispute, the platform must reimburse all reasonable costs. If the platform wins, the user is not required to pay any costs or expenses.

This is without prejudice to redress complaints through existing legal processes where applicable.

Watch the recording to discover more about the Digital Services Act, Statement of Reason, and Appeals.

You can sign up for upcoming DSA webinars here

Don't delay, get Digital Services Act compliance in 2 Weeks!

Speak to one of our experts and find out how
Talk to an expert