
Watch the recording

Digital Services Act Masterclass

Highlights

How the DSA builds upon existing legislation

The DSA modernises and enhances rules governing digital services within the European Union, covering harmful content, disinformation, and illegal activities.

Responsibilities under the DSA

The DSA introduces different levels of compliance obligations depending on whether the business operates as an Intermediary Service, Hosting Service, or Platform Service.

What the DSA means in action

What content moderation processes need to contain, e.g. Risk Assessments, Statements of Reasons, Appeals, Notice & Action, Transparency Reporting and Law Enforcement Reporting.

Demonstration

What does a DSA compliant system look like in the real world? A demonstration of how Checkstep’s DSA compliant platform works and an overview of the architecture that needs to be in place.

What is discussed in the webinar?

History of the Digital Services Act

While the DSA is a comprehensive legislative proposal, it primarily seeks to revise and complement two main existing regulations:
  1. E-Commerce Directive (2000/31/EC): The DSA aims to update and modernise the E-Commerce Directive, which provided certain liability exemptions for online services that transmit and host user-generated content. The DSA preserves and updates these liability exemptions, and also creates additional provisions around user protection, transparency, and risk management.
  2. Audiovisual Media Services Directive (AVMSD) (2010/13/EU): The AVMSD is a sector-specific directive which already contains some content moderation rules, specifically in relation to video-sharing platforms (VSPs). The DSA proposes to extend and strengthen the enforcement system behind these rules, and to make them applicable to other online services that host or provide access to audiovisual content. This update is aimed at ensuring consistent regulations for all providers of audiovisual services.
By revising and complementing these existing regulations, the DSA intends to address the evolving challenges posed by digital intermediary services, such as harmful content, disinformation, and illegal activities.
The DSA retains the categories of “mere conduit,” “cache,” and “host” from the E-Commerce Directive, which define the liability exemptions for platforms. Essentially, platforms that play a passive role in transmitting or hosting content are exempt from liability. The DSA updates and replaces the platform liability provision of the E-Commerce Directive, commonly known as “safe harbours”.

The most significant change brought by the DSA is the set of compliance obligations imposed on organisations involved in content moderation. This means organisations will need to establish robust compliance processes and procedures to meet these new requirements. The DSA will become applicable to the majority of organisations by February 2024, so it is advisable to start preparing early!

What is the Digital Services Act?

The Digital Services Act (DSA) is a regulatory framework aimed at modernising and enhancing the rules governing digital services within the European Union (EU). It builds upon the existing E-Commerce Directive while addressing the evolving digital landscape and the need for stronger accountability of online platforms.

The DSA addresses a wide range of digital services, including social media platforms, online marketplaces, search engines, and other intermediaries. Its main objectives include:
  1. Safety: By requiring platforms to implement measures to detect and remove illegal content such as hate speech, terrorist propaganda, and counterfeit products.
  2. Transparency: By obligating platforms to disclose their content moderation policies, algorithms, and advertising practices. Users should have clear information about how their data is used and have the ability to make informed choices.
  3. Accountability for the largest platforms: By designating certain platforms as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) and imposing specific obligations on them. VLOPs/VLOSEs have a significant influence on EU users, and these additional obligations are designed to ensure that they are held to account appropriately.
  4. User Rights: The right to contest platform decisions, access and portability of data, and safeguards against unfair commercial practices.

Responsibilities

Intermediary Services

Intermediary Services are the baseline category under the DSA: every organisation in scope must meet this set of obligations, with Hosting Services and Platform Services layering further duties on top.

The core obligations for Intermediary Services are:

  1. Transparency Reporting: Organisations must publish annual reports on content moderation. These reports should include information such as the number of orders received from authorities, the amount and type of content taken down, and other relevant details.
  2. Terms of Service: Intermediary services must disclose information about their content moderation practices, terms and conditions, and the tools they use. They should also provide details about their complaints procedure.
  3. Cooperation with National Authorities: If authorities issue orders to act against illegal content or request information about specific service recipients, organisations must comply. They are required to inform the authorities of the actions taken and also notify the affected service recipients.
  4. Points of Contact and, where necessary, Legal Representative: All organisations subject to the DSA must appoint points of contact for both authorities and service recipients. If an organisation is not established within the EU but falls under the jurisdiction of the DSA due to its extraterritorial reach, it must appoint a legal representative. This representative is contacted by the authorities when necessary.

It is important for intermediary services to familiarise themselves with these obligations to ensure compliance with the DSA. By adhering to these requirements, organisations can contribute to a safer and more transparent online environment.

Hosting Services

Hosting services play a crucial role in ensuring the safety and legality of online content. However, the responsibilities of hosting services can be complex and multi-layered. In this article, we will explore the obligations that hosting services have, particularly in relation to notice and action, providing information to users, and reporting criminal offences and harm.

As well as adhering to the same DSA obligations as Intermediary Services, Hosting Services have two extra obligations:

  1. Notice and Action: An important aspect of hosting services’ responsibilities is the concept of notice and action: the obligation to respond promptly to reports of illegal content. Rather than making subjective judgments about whether an offence is criminal or not, hosting services are expected to act upon suspicions of criminal activity; essentially, they serve as a bridge between users and the authorities.
  2. Community Reporting: To facilitate notice and action, hosting services often employ a community reporting system. This system allows anyone to report illegal content found on the platform. By making the reporting process accessible to all users, hosting services encourage active participation in ensuring a safe online environment. This approach forms the foundation for developing effective products and services in the field of content moderation.

Platform Services

Platform services, often considered a subset of hosting services, play a crucial role in not only storing content but also disseminating it to the public. These services, which include social media platforms and online marketplaces, are categorised as platform services because they make content publicly accessible without significant barriers. Under the DSA, extra obligations are imposed on these platforms, on top of the obligations for Hosting Services and Intermediary Services. These are:

  1. Trusted Flaggers: Designated organisations recognised as trusted flaggers by the authorities are given priority for notices. These organisations play a crucial role in reporting and identifying harmful or illegal content on the platform.

  2. Abusive Notices and Counter-Notices: Measures are put in place to address abusive notices and counter-notices. This is to prevent misuse of the notice-and-takedown process, ensuring fair resolution for content disputes.

  3. Complaint and Redress Mechanism: Platforms are required to establish complaint and redress mechanisms to offer internal complaint resolution and out-of-court dispute settlement options for affected individuals. This ensures users have a way to address issues related to their content or accounts.

  4. Online Marketplaces Obligations: Online marketplaces like Amazon have additional requirements, such as implementing Know Your Business Customer (KYBC) to vet third-party suppliers and ensuring compliance by design with random checks on suppliers.

  5. Transparency of Online Advertising: Platforms must disclose specific information about each ad, including the advertiser, the entity on whose behalf the ad is placed, and the main parameters used for ad targeting. This enhances transparency in the online advertising ecosystem.

  6. Ban on Dark Patterns: The DSA prohibits the use of dark patterns, manipulative tactics designed to coerce users into certain actions, such as misleading visuals or false urgency.

  7. Protections for Minors: The DSA reinforces protections for minors by prohibiting targeted advertisements based on sensitive data categories defined under GDPR, and it also bans deliberate targeting of ads towards children.

Overall, the DSA aims to ensure a safer, more transparent, and accountable digital environment for users, while placing specific obligations on online platforms and marketplaces to meet these goals. As with any regulatory framework, the practical implementation and enforcement will depend on the relevant authorities and how they work with the platforms.

DSA in Action: What are the best practices?

Risk Assessments

Whilst only VLOPs/VLOSEs are required to conduct risk assessments, it is best practice for all platforms to conduct them regardless. The assessment should focus on specific systemic risks, including:

  1. Dissemination of illegal content through their services.
  2. Negative effects on fundamental rights, private life, freedom of expression, data protection, and more.
  3. Impact on civic discourse, electoral processes, public security, and wellbeing, including gender-based violence and public health.

Factors considered should include intentional manipulation, inauthentic use, automated exploitation, and the dissemination of illegal or violating content. By staying compliant, you will safeguard your platform’s integrity.
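As an illustration of how such an assessment could be recorded, here is a minimal sketch in Python. The class and field names are hypothetical, and the likelihood-times-impact scoring is a simple placeholder rather than a methodology prescribed by the DSA.

```python
# Illustrative sketch only: a minimal structure for recording a systemic-risk
# assessment along the categories listed above. All names are hypothetical.
from dataclasses import dataclass, field
from enum import Enum


class RiskCategory(Enum):
    ILLEGAL_CONTENT_DISSEMINATION = "dissemination of illegal content"
    FUNDAMENTAL_RIGHTS = "negative effects on fundamental rights"
    CIVIC_DISCOURSE_AND_SAFETY = "civic discourse, elections, public security and wellbeing"


@dataclass
class RiskAssessmentEntry:
    category: RiskCategory
    description: str                 # what the risk looks like on this platform
    likelihood: int                  # e.g. 1 (rare) to 5 (almost certain)
    impact: int                      # e.g. 1 (minor) to 5 (severe)
    mitigations: list[str] = field(default_factory=list)

    @property
    def score(self) -> int:
        # Simple likelihood x impact scoring; real assessments may weight differently.
        return self.likelihood * self.impact


# Example usage
entry = RiskAssessmentEntry(
    category=RiskCategory.ILLEGAL_CONTENT_DISSEMINATION,
    description="Coordinated inauthentic accounts sharing illegal material",
    likelihood=3,
    impact=4,
    mitigations=["automated detection", "trusted flagger escalation"],
)
print(entry.score)  # 12
```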

Checkstep has inbuilt Assessments for you to measure your platform against systemic risks 

Content Moderation Policies (T&Cs)

Everything starts with a policy. For content to be transparently moderated and fair sanctions issued to users, there must be a clear policy to enforce against. 

The policy should cover: 

  • Procedures
  • Measures 
  • Tools used for content moderation (including algorithmic decision-making and human review)
  • A procedure for any internal complaint-handling systems

The information needs to be provided in clear and unambiguous language, and it must be publicly available in an easily accessible format. 
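As a rough illustration, a policy record can carry a version history alongside the publicly available text. The structure below is a hypothetical sketch only, not a description of any particular product or of a format required by the DSA.

```python
# Hypothetical sketch of a versioned content-moderation policy record.
# Field names are illustrative; the DSA requires the information above,
# not any particular data model.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class PolicyVersion:
    version: str
    published_on: date
    text: str                        # clear, unambiguous public wording
    tools_disclosed: list[str]       # e.g. ["automated classifiers", "human review"]
    complaint_procedure_url: str     # where users can find the internal complaints process


@dataclass
class Policy:
    name: str
    versions: list[PolicyVersion] = field(default_factory=list)

    def current(self) -> PolicyVersion:
        # The latest published version is the one users are held to.
        return max(self.versions, key=lambda v: v.published_on)


policy = Policy(name="Hate speech")
policy.versions.append(PolicyVersion(
    version="1.0",
    published_on=date(2023, 8, 1),
    text="Content attacking people on the basis of protected characteristics is removed.",
    tools_disclosed=["keyword filters", "human review"],
    complaint_procedure_url="https://example.com/appeals",
))
print(policy.current().version)  # 1.0
```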

Checkstep Policy manager allows you to create and deploy policies at the click of a button, and version policies through our integrated repository.

Statement of Reasons

If you provide a hosting service/online platform and you enforce against content, you must provide the user with a Statement of Reasons containing certain information. “Enforce” covers both content- and user-level actions, including when a user is disabled or content is removed, downranked or otherwise restricted (inc. “shadow bans”).

The Statement of Reasons needs to include at least: 

  • What content was actioned (taken down, restricted, downranked, etc.)
  • Why the content/user was actioned (illegality or a policy/T&Cs violation)
  • Whether the action was based on automated decision-making or human review
  • If the content was considered illegal: under what applicable legislation, and the legal ground on which it was considered illegal, or
  • If the content violated your policy/T&Cs: why it went against the policy/T&Cs, including the contractual ground relied on
  • Information about the internal appeal/complaint process and out-of-court dispute resolution

The Statement of Reasons must be clear, easily understandable and precise.
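To make the required fields concrete, here is a minimal data-model sketch in Python. The names are hypothetical and the structure is illustrative only; it is not the schema of the Commission’s transparency database or of any specific product.

```python
# Hypothetical sketch of the information a Statement of Reasons must carry.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class ActionTaken(Enum):
    REMOVAL = "removal"
    RESTRICTION = "restriction"       # e.g. downranking or other visibility limits
    ACCOUNT_SUSPENSION = "account_suspension"


class Ground(Enum):
    ILLEGAL_CONTENT = "illegal_content"   # actioned under applicable law
    TERMS_VIOLATION = "terms_violation"   # actioned under policy/T&Cs


@dataclass
class StatementOfReasons:
    action: ActionTaken
    ground: Ground
    automated_decision: bool                     # automated decision-making vs human review
    legal_basis: Optional[str] = None            # required when ground is ILLEGAL_CONTENT
    contractual_basis: Optional[str] = None      # required when ground is TERMS_VIOLATION
    explanation: str = ""                        # clear, precise reason in plain language
    redress_info: str = "You can appeal internally or use out-of-court dispute settlement."

    def validate(self) -> None:
        # Whichever ground is relied on, the matching basis must be stated.
        if self.ground is Ground.ILLEGAL_CONTENT and not self.legal_basis:
            raise ValueError("Statement must cite the legal ground for illegality")
        if self.ground is Ground.TERMS_VIOLATION and not self.contractual_basis:
            raise ValueError("Statement must cite the contractual ground relied on")


sor = StatementOfReasons(
    action=ActionTaken.REMOVAL,
    ground=Ground.TERMS_VIOLATION,
    automated_decision=True,
    contractual_basis="Community Guidelines, section 4 (spam)",
    explanation="The post was identified as bulk commercial spam.",
)
sor.validate()  # raises if a required basis is missing
```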

Internal Complaint Handling

Whenever you enforce content and provide a Statement of Reasons, you must allow the user to complain to you internally, electronically and free of charge, for up to six months after the relevant decision.

This obligation applies in respect of: 

  • Decisions to remove or disable access to the information;
  • Decisions to suspend or terminate the provision of the service, in whole or in part, to the recipients;
  • Decisions to suspend or terminate the recipients’ account;
  • Decisions to suspend or terminate the ability to monetise information provided by the recipients.

The complaints procedure should (as sketched below):

  • Be easy to access, user-friendly, and enable and facilitate sufficiently precise and adequately substantiated complaints
  • Enable complaints to be handled in a timely manner
  • Where the initial decision to action content or a user is overturned on appeal, ensure the content or access is restored without undue delay
  • Ensure decisions to users are not unduly delayed
  • Not rely solely upon automated means
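A minimal sketch of how these requirements might map onto a complaint-handling workflow is shown below. The six-month window comes from the text above; the class and function names are hypothetical.

```python
# Hypothetical complaint-handling sketch: checks the six-month window,
# requires human involvement in the outcome, and restores content when
# a decision is overturned.
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class ModerationDecision:
    decision_id: str
    decided_on: date
    content_restored: bool = False


COMPLAINT_WINDOW = timedelta(days=183)  # roughly six months after the decision


def can_complain(decision: ModerationDecision, today: date) -> bool:
    # Complaints must be accepted electronically and free of charge
    # for up to six months after the relevant decision.
    return today - decision.decided_on <= COMPLAINT_WINDOW


def resolve_complaint(decision: ModerationDecision, overturned: bool,
                      reviewed_by_human: bool) -> ModerationDecision:
    # The outcome must not rely solely on automated means.
    if not reviewed_by_human:
        raise ValueError("Complaint outcomes require human review")
    if overturned:
        # Restore content or access without undue delay.
        decision.content_restored = True
    return decision


decision = ModerationDecision("dec-001", decided_on=date(2024, 1, 10))
if can_complain(decision, today=date(2024, 4, 1)):
    resolve_complaint(decision, overturned=True, reviewed_by_human=True)
print(decision.content_restored)  # True
```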


Checkstep has an inbuilt statement of reason and appeals workflow which explains the process and the user’s rights in a clear and transparent way

Out of Court Dispute Settlement

As well as offering internal complaint handling, platforms must ensure users can access a certified out-of-court dispute settlement body to resolve disputes, whether or not they have first been through internal complaint handling.

The requirements for certification of such a settlement body include: 

  • Being independent and impartial of platforms
  • Having expertise in illegal content or one or more areas of enforcement of T&Cs 
  • Being easily accessible through electronic means
  • Being fast, efficient and cost-effective, and able to conduct dispute resolution in at least one official Union language
  • Publishing clear and fair rules of procedure

If the user wins the out-of-court dispute, the platform needs to reimburse all reasonable costs. If the platform wins, the user is not required to pay any costs or expenses.
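The cost rule is asymmetric and simple enough to express directly. The sketch below is purely illustrative; the function and parameter names are made up.

```python
# Hypothetical sketch of the asymmetric cost rule for out-of-court disputes:
# if the user wins, the platform reimburses reasonable costs; if the platform
# wins, the user owes nothing.
def settlement_costs_owed_by_platform(user_won: bool, user_reasonable_costs: float) -> float:
    if user_won:
        return user_reasonable_costs   # platform reimburses the user's reasonable costs
    return 0.0                         # the user never pays the platform's costs


print(settlement_costs_owed_by_platform(True, 120.0))   # 120.0
print(settlement_costs_owed_by_platform(False, 120.0))  # 0.0
```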

This is without prejudice to redress complaints through existing legal processes where applicable.

Checkstep tracks the history of content moderation decisions as well as the result of appeals

Notice & Action

Any person or entity needs to be able to report content that they consider to be illegal on hosting services/online platforms. Reports from designated Trusted Flaggers (art. 19) need to be prioritised by online platforms.

The Notice & Action form should allow:

  • An expression of why the content is illegal
  • The URL to be submitted where the content is hosted
  • Free text to help further identify the content (where required)
  • Capture of the name and email address of the person or entity reporting the content, with validation of that email before the report can be submitted (except where the report concerns CSAM or child abuse)
  • A statement that the report being submitted is accurate and complete, and that the submitter believes the content is illegal

The report handling system should send a confirmation email that the complaint has been received and, once the case has been reviewed, notify the submitter of the decision, how it was reached (human or automated review), and how to challenge it (the report form response should be a version of the appeal form).
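As an illustration of the form fields listed above, here is a small validation sketch. All names are hypothetical, and the email-verification step is only represented as a flag rather than implemented against a real service.

```python
# Hypothetical sketch of validating a Notice & Action submission
# against the fields described above.
from dataclasses import dataclass
from typing import Optional


@dataclass
class IllegalContentReport:
    reason_illegal: str                 # why the submitter considers the content illegal
    content_url: str                    # where the content is hosted
    additional_details: str = ""        # optional free text to identify the content
    reporter_name: Optional[str] = None
    reporter_email: Optional[str] = None
    email_verified: bool = False
    accuracy_confirmed: bool = False    # submitter confirms the report is accurate and complete
    concerns_csam: bool = False         # CSAM/child-abuse reports may be submitted anonymously


def validate_report(report: IllegalContentReport) -> list[str]:
    errors = []
    if not report.reason_illegal.strip():
        errors.append("A statement of why the content is illegal is required")
    if not report.content_url.startswith(("http://", "https://")):
        errors.append("A valid URL to the content is required")
    if not report.accuracy_confirmed:
        errors.append("The submitter must confirm the report is accurate and complete")
    if not report.concerns_csam:
        # Identity and a verified email are required except for CSAM/child-abuse reports.
        if not (report.reporter_name and report.reporter_email and report.email_verified):
            errors.append("Name and a verified email address are required")
    return errors


report = IllegalContentReport(
    reason_illegal="Listing offers counterfeit goods",
    content_url="https://example.com/listing/123",
    reporter_name="Jane Doe",
    reporter_email="jane@example.com",
    email_verified=True,
    accuracy_confirmed=True,
)
print(validate_report(report))  # an empty list means the report can be accepted
```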

Checkstep’s reporting form is designed to prevent “manifestly unfounded” submissions.

Transparency Reporting - Intermediary & Hosting

All organisations subject to the DSA that conduct content moderation need to publish an annual transparency report on their content moderation activities.

Transparency reporting should include:

  • Number of orders received (to act on illegal content or provide information) – plus information including average time to take action 
  • Number of Notice & Actions received to remove illegal content – broken down by category of illegal content, whether action was based on T&Cs or illegality, whether automated means were used to make the decision, and the average time taken
  • Information about any own-initiative content moderation, including use of automated tools (+ their accuracy/any safeguards), and training & assistance provided to employees 
  • Number and type of measures that affect the availability of information provided by recipients 
  • Internal complaints: the number of complaints submitted, why they were submitted, average time to handle complaints and the number of times complaints were overturned

Reports will also be transmitted to a public database (TBC – managed by the EU Commission).
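As a rough illustration, much of this report can be aggregated from moderation logs. The sketch below uses hypothetical names and covers only a subset of the data points listed above.

```python
# Hypothetical sketch: aggregating a few transparency-report figures
# from a log of moderation decisions. Field names are illustrative only.
from collections import Counter
from dataclasses import dataclass


@dataclass
class DecisionRecord:
    category: str          # e.g. "hate_speech", "counterfeit"
    automated: bool        # decided by automated means?
    ground: str            # "illegal" or "terms"
    handling_days: float   # time from notice to action


def summarise(decisions: list[DecisionRecord]) -> dict:
    by_category = Counter(d.category for d in decisions)
    automated = sum(d.automated for d in decisions)
    avg_days = sum(d.handling_days for d in decisions) / len(decisions) if decisions else 0.0
    return {
        "notices_by_category": dict(by_category),
        "automated_decisions": automated,
        "human_decisions": len(decisions) - automated,
        "average_handling_days": round(avg_days, 2),
    }


log = [
    DecisionRecord("counterfeit", automated=True, ground="illegal", handling_days=0.5),
    DecisionRecord("hate_speech", automated=False, ground="terms", handling_days=2.0),
]
print(summarise(log))
```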

Transparency Reporting - Platforms

Online platforms’ transparency reports will need to include everything in Intermediaries & Hosting, plus…

  • Types and number of illegal content removed (CSAM, Terrorist, etc.)
  • Types and number of content removed from terms and conditions breaches
  • Number of out-of-court dispute settlements (submitted, upheld, overturned, abandoned/completed), average time to complete, and whether outcome was implemented 
  • Data must be anonymised
  • AI accuracy metrics & safeguards
  • The number of suspensions applied to those who post illegal content, or who persistently send manifestly unfounded notices

All online platforms also need to publish their “monthly active recipients” at least once every six months (first publication was on 17 February 2023).

Watch the recording to discover more about the Digital Services Act.

You can sign up for upcoming DSA webinars here

Don't delay, get Digital Services Act compliance in 2 Weeks!

Speak to one of our experts and find out how
Talk to an expert