FAQs
Our most frequently asked questions
Learn more about our AI content moderation platform
-
What is the EU Digital Services Act (DSA)?
The EU Digital Services Act (DSA) is a European Union regulation that became fully applicable across the EU in February 2024, setting common rules for how online intermediaries handle illegal content, transparency, and user protection.
-
Who does the DSA apply to?
The DSA applies to virtually all intermediary services offered to users in the European Union, from internet access and hosting providers to online platforms and marketplaces, regardless of where the provider is established.
-
What are the main requirements of the DSA for online platforms?
The DSA introduces a structured set of duties for online platforms, with obligations scaling according to size and risk.
-
What is a Very Large Online Platform (VLOP) under the DSA?
A Very Large Online Platform (VLOP) is defined in the DSA as an online service with more than 45 million average monthly active users in the EU — roughly 10% of the EU population.
-
What are the penalties for non-compliance with the DSA?
Non-compliance with the DSA can trigger fines of up to 6% of a company's global annual turnover, periodic penalty payments, and, for repeated serious infringements, temporary suspension of the service in the EU.
-
What is the UK Online Safety Act (OSA)?
The UK Online Safety Act (OSA), passed in 2023, is the United Kingdom’s flagship online safety law.
-
How does the OSA differ from the DSA?
While both the DSA (EU) and OSA (UK) address online safety, their focus and enforcement differ: the DSA centres on illegal content, transparency, and systemic risk across the EU, enforced by the European Commission and national Digital Services Coordinators, while the OSA imposes duties of care around illegal content and content harmful to children in the UK, enforced by Ofcom.
-
What is the role of dispute resolution bodies in compliance?
Dispute resolution bodies act as independent overseers for unresolved appeals.
-
How do appeals processes differ in the OSA vs DSA?
Both the DSA (EU) and OSA (UK) require appeals, but the emphasis differs:
DSA: Appeals must be available for all users, with structured escalation to out-of-court dispute resolution bodies.
OSA: Services must run accessible, transparent complaints procedures, with Ofcom assessing whether they work in practice.
-
How quickly must platforms respond to appeals?
The DSA requires platforms to respond to appeals “without undue delay.” While the regulation doesn’t specify exact deadlines, regulators expect timely handling, often within days or weeks depending on case volume.
-
How can user appeals be automated for compliance?
Automation can make DSA appeals processes scalable and compliant.
-
What’s the process for handling malicious or abusive notices?
Under the DSA, platforms must warn and then suspend, for a reasonable period, the processing of notices from individuals or entities that frequently submit manifestly unfounded notices, and document those suspensions in their policies.
-
What role do trusted flaggers play under the DSA?
Trusted flaggers are organisations or individuals designated by EU authorities as reliable reporters of illegal content, and platforms must process their notices with priority and without undue delay.
-
How must platforms notify users about content removal?
Whenever content is removed or an account is restricted under the DSA, platforms must provide a statement of reasons.
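To make this concrete, a statement of reasons can be modelled as a simple record capturing the elements DSA Article 17 requires. This is an illustrative sketch: the field names are our own, only the required information comes from the regulation.

```python
from dataclasses import dataclass, asdict

@dataclass
class StatementOfReasons:
    """Illustrative record of the information a platform must give
    the affected user under DSA Article 17."""
    restriction: str           # e.g. "content removal", "account suspension"
    facts: str                 # facts and circumstances relied on
    automated_detection: bool  # whether automated means were used
    ground: str                # legal provision or terms-of-service clause
    redress: str               # available appeal / redress options

def render_notice(s: StatementOfReasons) -> str:
    """Format the statement as a plain-text user notice."""
    return "\n".join(
        f"{k.replace('_', ' ').title()}: {v}" for k, v in asdict(s).items()
    )
```

Storing the statement as structured data, rather than free text, also lets the same record feed transparency reporting later.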
-
What is the appeals process required by the DSA?
The DSA requires platforms to give users a clear and fair appeals process whenever content or accounts are restricted.
-
How do I build a reporting dashboard for compliance?
A compliance reporting dashboard centralises all the data needed for DSA obligations.
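As a sketch of the aggregation step behind such a dashboard, the function below rolls raw moderation actions up into headline metrics. The action schema (`type`, `category`, `appealed`) is an assumption for illustration, not a prescribed DSA format.

```python
from collections import Counter

def summarise(actions: list[dict]) -> dict:
    """Aggregate raw moderation actions into dashboard metrics.

    Each action dict is assumed to carry 'type' (e.g. 'removal'),
    'category' (policy area) and 'appealed' (bool).
    """
    return {
        "total_actions": len(actions),
        "by_type": dict(Counter(a["type"] for a in actions)),
        "by_category": dict(Counter(a["category"] for a in actions)),
        "appeal_rate": (
            sum(a["appealed"] for a in actions) / len(actions) if actions else 0.0
        ),
    }
```

Computing these figures from the raw action log, rather than maintaining separate counters, keeps the dashboard consistent with what would be reported to a regulator.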
-
What’s the difference between user-facing and regulator-facing reports?
User-facing reports are public summaries of moderation activity, written so ordinary users can understand them. Regulator-facing reporting is more granular: for example, under the DSA platforms must submit each statement of reasons to the Commission's DSA Transparency Database and answer information requests from Digital Services Coordinators.
-
What are the risks of not publishing transparency reports?
Failing to publish transparency reports exposes companies to legal, financial, and reputational risks.
-
How can AI support DSA transparency reporting?
AI can significantly reduce the burden of compliance reporting by:
Automatically classifying enforcement actions and tagging them with policy categories.
-
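As a toy illustration of that classification step, the snippet below maps free-text enforcement reasons to policy categories with keyword rules. A production system would use a trained model, but the tagged output feeds reports the same way, and the keyword table is purely illustrative.

```python
# Illustrative rule-based tagger; real systems would use a trained classifier.
POLICY_KEYWORDS = {
    "scam": "fraud",
    "counterfeit": "intellectual_property",
    "harassment": "harassment",
}

def tag_action(reason_text: str) -> str:
    """Map a free-text enforcement reason to a policy category."""
    text = reason_text.lower()
    for keyword, category in POLICY_KEYWORDS.items():
        if keyword in text:
            return category
    return "other"
```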
What tools are available to automate transparency reporting?
Platforms can use compliance automation tools to collect, aggregate, and publish the data required by the DSA.
-
How do transparency reports differ between the DSA and OSA?
Both the DSA (EU) and OSA (UK) require transparency, but the focus is different:
DSA: Standardised annual reports with specific metrics such as takedowns, appeals, and active users.
OSA: Transparency reports produced by categorised services in response to Ofcom notices, covering the information Ofcom specifies.
-
What data needs to be included in a DSA transparency report?
A DSA transparency report must contain specific metrics that show how moderation systems are working.
-
How often must platforms publish transparency reports under the DSA?
All platforms in scope of the DSA must publish transparency reports at least once every 12 months.
-
What is a statement of reasons under the DSA?
A statement of reasons is a notice that platforms must give users whenever their content or account is restricted.
-
What transparency obligations are included in the DSA?
Transparency is a central theme of the DSA: platforms must publish regular moderation statistics, give users a statement of reasons for every restriction, explain how their recommender systems work in their terms of service, and label advertisements, including who paid for them.
-
What is a DSA transparency report?
A DSA transparency report is a publicly available document that online platforms must publish at least once a year.
-
Which companies are exempt from the DSA?
Micro and small enterprises (fewer than 50 employees and annual turnover below €10 million) are exempt from most platform-specific obligations, unless they qualify as a VLOP. Core duties, such as acting on notices of illegal content, still apply.
-
What does “safety by design” mean in the DSA?
“Safety by design” in the DSA means building digital services in ways that reduce risk and protect users, especially minors, from the start.
-
How often must risk assessments be carried out?
VLOPs and VLOSEs must carry out risk assessments at least once a year, and additionally before deploying any new functionality likely to have a critical impact on the risks identified.
-
What does the DSA require during a crisis response?
The DSA introduces a crisis response mechanism for extraordinary situations such as pandemics, terrorist attacks, or armed conflicts.
-
What kinds of risks must be assessed (e.g. disinformation, CSAM, scams)?
The DSA specifies a wide range of risks that platforms must consider, including:
Illegal content, such as CSAM, terrorist material, and scams.
Negative effects on fundamental rights, civic discourse, and electoral processes.
Risks to public health, minors, and users' physical and mental well-being, including from disinformation.
-
How do platforms mitigate risks identified in assessments?
The DSA requires platforms to put in place reasonable, proportionate mitigation measures, such as adapting content moderation or recommender systems, tightening terms of service, adjusting interface design, or strengthening internal processes, and then to test whether those measures worked in the next assessment cycle.
-
How do companies document appeals for regulators?
Platforms typically keep an auditable record of each appeal: the original decision, the grounds raised, timestamps, the reviewer, and the outcome. Aggregated figures then feed into transparency reports and can be produced on request to regulators.
-
What are risk assessments under the DSA?
Risk assessments are structured evaluations that Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) must carry out under the DSA.
-
How do crowdfunding platforms comply with the DSA?
Crowdfunding platforms that host user content or seller pages are typically online platforms under the DSA.
-
What is the cost of compliance for mid-sized platforms?
-
How does the OSA impact gaming companies?
In the UK, the Online Safety Act applies to user-to-user and search services that operate in the UK, which typically includes online games with chat, UGC, voice, or social features.
-
What is the difference between the DSA and Germany’s NetzDG law?
NetzDG is a German law from 2017 that targets certain illegal content categories and imposes fast removal timelines and transparency reporting for large social networks operating in Germany. The DSA is broader in scope and applies EU-wide; once the DSA became fully applicable, Germany repealed NetzDG in 2024 in favour of national legislation implementing the DSA.
-
What compliance steps are required for marketplaces under the DSA?
Marketplaces must run "know your business customer" traceability checks on traders, design their interfaces so sellers can provide legally required information, make reasonable efforts to randomly check products against official databases, and inform consumers when they have hosted an illegal product or service.
-
What are Australia’s online safety regulations?
Australia's Online Safety Act 2021, administered by the eSafety Commissioner, combines user complaint schemes (covering cyberbullying, image-based abuse, and illegal content), Basic Online Safety Expectations, and mandatory industry codes.
-
How does Singapore regulate harmful online content?
Singapore uses the Online Safety (Miscellaneous Amendments) Act and a Code of Practice for Online Safety administered by IMDA.
-
What are the content moderation obligations in India?
India’s IT Rules 2021 and later amendments impose duties on intermediaries and “Significant Social Media Intermediaries” with more than 5 million users.
-
What are the main global online safety regulations in 2025?
In 2025, a few frameworks shape most platform obligations worldwide: the EU's DSA, the UK's OSA, Australia's Online Safety Act, Singapore's online safety rules, and India's IT Rules, alongside emerging US state-level laws.
-
How does the EU DSA compare with California’s Age-Appropriate Design Code?
The DSA is a broad platform regulation covering illegal content handling, transparency, research access, and marketplace seller checks across the EU. California's Age-Appropriate Design Code is much narrower: it targets online services likely to be accessed by children and requires privacy-protective defaults and age-appropriate design, and its enforcement has been delayed by ongoing US constitutional litigation.
-
How can AI help with ongoing risk monitoring?
AI can enhance risk monitoring by detecting patterns and anomalies that human teams might miss.
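One simple pattern-detection technique is flagging days whose report volume deviates sharply from the historical mean. The sketch below uses a basic z-score test; real monitoring would use more robust statistics, and the threshold is an illustrative choice.

```python
import statistics

def anomalies(daily_counts: list[int], threshold: float = 2.0) -> list[int]:
    """Return indices of days whose report volume deviates from the
    mean by more than `threshold` standard deviations."""
    if len(daily_counts) < 2:
        return []
    mean = statistics.mean(daily_counts)
    sd = statistics.stdev(daily_counts)
    if sd == 0:
        return []  # perfectly flat series: nothing to flag
    return [
        i for i, count in enumerate(daily_counts)
        if abs(count - mean) / sd > threshold
    ]
```

A flagged day is a prompt for human review of what drove the spike, not an automatic enforcement trigger.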
-
What’s the difference between systemic risks and content-level risks?
The DSA distinguishes between two kinds of risks:
Content-level risks: Individual harmful or illegal items, such as a single post containing CSAM or a fraudulent listing.
Systemic risks: Harms that emerge from how the service works at scale, such as recommender systems amplifying disinformation or design features that endanger minors. Content-level risks are handled through moderation; systemic risks require design and process changes assessed in the DSA's risk assessment cycle.
-
What is the role of external researchers in risk assessments?
Under Article 40 of the DSA, vetted researchers can request access to platform data to study systemic risks in the EU, and their findings can inform both the platforms' own risk assessments and regulatory scrutiny.
-
How do companies prepare for audits under the DSA?
VLOPs and VLOSEs must undergo independent audits at least once a year, at their own expense. Preparation usually means documenting moderation policies and processes, keeping decision logs and risk assessment records audit-ready, and assigning a compliance function that can answer auditors' requests.
-
How do regulators audit risk assessments?
Digital Services Coordinators and the European Commission can audit how platforms conduct risk assessments.
Want to see our AI content moderation platform for yourself?
Book a demo to see how it can help you deliver safer, more inclusive content at scale.