How must platforms notify users about content removal?
Questions & Answers
Whenever content is removed or an account is restricted under the EU Digital Services Act (DSA), platforms must provide the affected user with a statement of reasons.
This notice must:
Identify the content or account affected.
Explain why the action was taken and cite the relevant rule or law.
Give users information about how to appeal.
Be delivered in clear, accessible language.
For example, if a photo is removed for hate speech, the platform must say which policy was applied, when the removal happened, and where to submit an appeal. Notices should be logged and made available for regulator review.
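The requirements above map naturally onto a structured record that platforms can log and render into a user-facing notice. As a minimal sketch (the class name, fields, and method are illustrative assumptions, not a DSA-mandated schema):

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class StatementOfReasons:
    """Hypothetical record capturing the elements a DSA notice must contain."""
    content_id: str            # identifies the content or account affected
    action: str                # e.g. "removal" or "account restriction"
    policy_cited: str          # the specific rule or law relied upon
    explanation: str           # plain-language reason the action was taken
    action_timestamp: datetime # when the action happened
    appeal_url: str            # where the user can submit an appeal

    def to_notice(self) -> str:
        """Render the record as a clear, accessible notice for the user."""
        return (
            f"Your content ({self.content_id}) was subject to {self.action} "
            f"on {self.action_timestamp:%Y-%m-%d} under the policy "
            f"'{self.policy_cited}'. Reason: {self.explanation} "
            f"You can appeal at {self.appeal_url}."
        )


# Usage: the hate-speech photo example from the text.
notice = StatementOfReasons(
    content_id="photo-8841",
    action="removal",
    policy_cited="Hate Speech Policy",
    explanation="The image contained content targeting a protected group.",
    action_timestamp=datetime(2024, 5, 1),
    appeal_url="https://example.com/appeals",
).to_notice()
```

Keeping the record structured (rather than free text) also makes it straightforward to log each notice and expose it for regulator review, as the text requires.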