What kinds of risks must be assessed (e.g. disinformation, CSAM, scams)?
The DSA specifies a wide range of risks that platforms must consider, including:
Illegal content (e.g. child sexual abuse material, terrorism, scams, counterfeit goods).
Disinformation and misinformation that undermine democratic processes or public health.
Manipulation of the service, such as exploitation of algorithmic amplification or recommendation bias.
Harms to vulnerable groups, especially children and minorities.
Impact on fundamental rights, including freedom of expression, data protection, and non-discrimination.
Platforms must document both the likelihood and severity of these risks and describe how they will mitigate them.
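One practical way to document likelihood and severity is a structured risk register. The sketch below is a minimal illustration in Python, assuming a simple 1-5 scoring scale and category labels drawn from the list above; the field names, scale, and likelihood-times-severity priority formula are illustrative assumptions, not anything the DSA itself prescribes.

```python
from dataclasses import dataclass
from enum import Enum


class RiskCategory(Enum):
    """Risk categories mirroring the list above (illustrative labels)."""
    ILLEGAL_CONTENT = "Illegal content (e.g. CSAM, terrorism, scams, counterfeit goods)"
    DISINFORMATION = "Disinformation and misinformation"
    SERVICE_MANIPULATION = "Manipulation of the service (e.g. algorithmic amplification)"
    VULNERABLE_GROUPS = "Harms to vulnerable groups"
    FUNDAMENTAL_RIGHTS = "Impact on fundamental rights"


@dataclass
class RiskEntry:
    """One documented risk, scored on an assumed 1-5 scale."""
    category: RiskCategory
    description: str
    likelihood: int   # 1 = rare, 5 = near certain (assumed scale)
    severity: int     # 1 = negligible, 5 = critical (assumed scale)
    mitigation: str   # how the platform intends to mitigate this risk

    @property
    def priority(self) -> int:
        # Likelihood x severity product: a common risk-matrix heuristic,
        # used here only to rank entries for review.
        return self.likelihood * self.severity


# Example usage: record one risk, then sort the register by priority.
register = [
    RiskEntry(
        category=RiskCategory.DISINFORMATION,
        description="Health misinformation amplified by recommendations",
        likelihood=4,
        severity=4,
        mitigation="Demote flagged content; surface authoritative sources",
    ),
]
register.sort(key=lambda entry: entry.priority, reverse=True)
for entry in register:
    print(entry.category.name, entry.priority, entry.mitigation)
```

Keeping likelihood, severity, and the planned mitigation in one record makes it straightforward to show, for each risk, both how it was assessed and how it will be addressed.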