What kinds of risks must be assessed (e.g. disinformation, CSAM, scams)?
The DSA specifies a wide range of risks that platforms must consider, including:
Illegal content (e.g. child sexual abuse material, terrorism, scams, counterfeit goods).
Disinformation and misinformation that undermine democratic processes or public health.
Manipulation of the service, such as through algorithmic amplification or recommendation bias.
Harms to vulnerable groups, especially children and minorities.
Impact on fundamental rights, including freedom of expression, data protection, and non-discrimination.
Platforms must document both the likelihood and severity of these risks and describe how they will mitigate them.
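As an illustration only (the DSA does not prescribe any particular format), a risk register entry might record each risk's likelihood, severity, and planned mitigations so they can be ranked and reviewed. The sketch below is hypothetical; the field names and scoring scheme are assumptions, not a mandated structure:

```python
from dataclasses import dataclass, field
from enum import Enum


class Level(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class RiskEntry:
    """One row in a hypothetical DSA risk register (illustrative fields only)."""
    category: str                  # e.g. "illegal content", "disinformation"
    description: str               # how the risk manifests on this service
    likelihood: Level              # assessed probability of the harm occurring
    severity: Level                # assessed impact if the harm does occur
    mitigations: list[str] = field(default_factory=list)

    def priority(self) -> int:
        # Simple likelihood x severity score to rank which risks to address first.
        return self.likelihood.value * self.severity.value


# Example: documenting a CSAM risk alongside its planned mitigations.
csam_risk = RiskEntry(
    category="illegal content (CSAM)",
    description="Abusive material shared via private uploads",
    likelihood=Level.LOW,
    severity=Level.HIGH,
    mitigations=["hash matching on upload", "trained escalation team"],
)
print(csam_risk.priority())  # 3
```

Whatever format a platform chooses, the key point is the same: each identified risk should pair an assessment (likelihood and severity) with concrete, reviewable mitigation measures.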