The European Commission has issued a €120 million fine against X (formerly Twitter) under the Digital Services Act (DSA). Beyond the staggering amount, this is a big deal for the future of online platforms, not only in Europe but around the world: DSA enforcement is no longer a future scenario. It’s an operating reality.

The fine isn’t about content removal or platform policy decisions. It’s about precision: the DSA is demanding a new level of rigour from platforms, especially where trust and transparency are fundamental to the product and to user safety.


The real story isn’t moderation. It’s the systems behind trust.


We hear a lot about content decisions and moderation: they trigger heated debates around curbing free speech. But the X fine relates to trust and transparency obligations: how authenticity is signalled, how advertising is disclosed, and whether legitimate oversight is possible.

The reported focus areas include:

  • User verification: experiences that can mislead users about authenticity.
  • Advertising transparency: shortcomings in the ad repository.
  • Researcher access: restrictions on access to anonymised data that is essential for democratic oversight.

Whether you agree with every enforcement move or not, the fact is that regulators are now treating these areas as high-impact infrastructure - because they shape safety outcomes at scale.


The new baseline: trust IS infrastructure


For years, many platforms treated trust as a blend of policy documents, reactive operations, and brand messaging. The DSA changes the nature of the conversation.

Under this regime, what matters is not how you talk about compliance, but whether you can demonstrate it. In X’s case, the questions are:

  • Can your software show how identity, reputation, and authenticity signals work, and prove - in a data-driven way - that they don’t systematically mislead?
  • Do you have internal evidence that ads are disclosed and tracked in a way that enables real scrutiny?
  • Can you enable legitimate oversight without compromising user privacy or security?

This is why I keep coming back to a simple idea: Online safety is not a department. It’s infrastructure.
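
To make that concrete, here is a minimal sketch - hypothetical names, not a prescribed design or anyone’s actual architecture - of what treating trust signals as infrastructure can look like: every verification or authenticity decision lands in an append-only record, so the three questions above can be answered with data rather than assertions.

```typescript
// Hypothetical sketch: an append-only record of every trust-signal
// decision, so "how does verification work?" is answerable with data.
interface TrustSignalEvent {
  eventId: string;       // unique, immutable identifier
  userId: string;        // account the signal applies to
  signal: "verified_badge" | "reputation_score" | "authenticity_label";
  decision: "granted" | "revoked" | "denied";
  basis: string[];       // e.g. ["payment_method_on_file", "id_document_check"]
  policyVersion: string; // rules in force at decision time
  decidedBy: "automated" | "human_review";
  occurredAt: string;    // ISO-8601 timestamp
}

// "Could this signal systematically mislead?" becomes a query:
// what share of granted badges rest on a single basis, e.g. payment?
function shareGrantedOnSingleBasis(
  events: TrustSignalEvent[],
  basis: string
): number {
  const granted = events.filter((e) => e.decision === "granted");
  const singleBasis = granted.filter(
    (e) => e.basis.length === 1 && e.basis[0] === basis
  );
  return granted.length === 0 ? 0 : singleBasis.length / granted.length;
}
```

The point isn’t this particular schema. It’s that “demonstrate, don’t describe” only works if decisions leave durable, queryable evidence behind.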


DSA enforcement is speeding up: a platform-by-platform recap


One reason the X fine matters is that it doesn’t sit in isolation. Over the past year of active enforcement, the Commission has steadily moved from opening proceedings to preliminary findings, binding commitments, and now a headline penalty.


Here’s a practical snapshot of the pattern platforms should be paying attention to:

X (formerly Twitter)
  • Milestone & date: €120M fine (Dec 2025)
  • Regulator focus: Transparency and trust obligations tied to verification design, ad transparency, and researcher access
  • What to watch next: Escalation risk if remediation is slow; obligations continue even amid strong public pushback

Meta (Facebook & Instagram)
  • Milestone & date: Formal breach notification / findings (Oct 2025)
  • Regulator focus: Friction or failures in user reporting, appeals, and researcher access
  • What to watch next: Under review - this stage often precedes remedies or penalties if changes aren’t convincing

TikTok
  • Milestone & date: Investigated; no fine on the ad library after commitments were accepted (2025)
  • Regulator focus: Ad transparency and the shape of a compliant ad repository
  • What to watch next: Commitments can avoid a fine on a specific area, but wider DSA scrutiny can continue

Temu
  • Milestone & date: Preliminary breach notice (Jul 2025)
  • Regulator focus: Illegal/dangerous product listings and adequacy of risk mitigation
  • What to watch next: Potential fine pending response; marketplaces are clearly in scope for active enforcement

AliExpress
  • Milestone & date: Commitments made (Jun 2025)
  • Regulator focus: Binding transparency and safety upgrades, reportedly with external monitoring
  • What to watch next: Cooperation can avert a fine, but it can still result in ongoing oversight requirements

Amazon
  • Milestone & date: VLOP status upheld by EU court (Nov 2025)
  • Regulator focus: Whether the platform remains subject to VLOP obligations
  • What to watch next: You can litigate labels, but compliance expectations remain - and the bar keeps rising


My take: the pattern is the point

If you zoom out, the consistent theme isn’t any one company. It’s the EU’s direction of travel.

The Commission keeps returning to the same operational foundations:

  • Do your product mechanics create misleading trust signals?
  • Can the public meaningfully see who is advertising what?
  • Can users report and appeal in ways that actually work?
  • Can legitimate oversight happen through structured access for researchers?

In other words: the DSA is pushing Trust & Safety out of the policy and ops layer and into the platform architecture itself.
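
Take the second question - who is advertising what - as an example. The DSA’s ad-repository obligation (Article 39) expects very large platforms to publish, for each ad, its content, who it was presented on behalf of, who paid, when it ran, whether and how it was targeted, and its aggregate reach. Here is a hedged sketch of what one repository entry might look like; the field names are illustrative, not the legal text.

```typescript
// Illustrative shape of a DSA-style ad repository entry (Art. 39).
// Field names are hypothetical; the obligation, not this schema, is real.
interface AdRepositoryEntry {
  adId: string;
  content: string;               // the creative, or a stable reference to it
  presentedOnBehalfOf: string;   // the natural or legal person behind the ad
  paidBy: string;                // who paid, if different
  firstShown: string;            // ISO-8601 date
  lastShown: string;             // ISO-8601 date
  targeted: boolean;
  targetingParameters: string[]; // main parameters, e.g. ["age:18-34", "region:DE"]
  totalRecipients: number;       // aggregate reach (broken down per member state in the regulation)
}
```

If a platform can’t populate a record like this for every ad it serves, its “ads transparency page” is a widget, not a governance system.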

What happens when you don’t build for auditability


Most platforms don’t choose non-compliance. They drift into it because their systems evolved quickly - and then regulation arrives and asks for something teams rarely design for by default: traceability.

That’s where the pain usually shows up:

  • Decisions happen across tools and teams, but evidence isn’t consistently captured.
  • Policies exist, but enforcement is inconsistent or difficult to prove after the fact.
  • Reporting and appeals exist, but not in a way that creates reliable records and metrics.
  • Ads transparency is treated like a page or a widget - not a governance system.

And when scrutiny arrives - internally, from partners, from watchdogs, from regulators - you end up retrofitting compliance into a moving platform. That’s expensive. It’s disruptive. And it’s exactly what enforcement actions are designed to discourage.
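
As one illustration of what designing for traceability means in practice (a sketch under assumptions, with hypothetical names): if every report and appeal writes a timestamp at each state transition, the metrics that scrutiny demands become queries instead of archaeology.

```typescript
// Hypothetical lifecycle record for a user report: every transition
// is timestamped so the full history is reconstructable on demand.
interface ReportRecord {
  reportId: string;
  contentId: string;
  reportedAt: string;   // ISO-8601
  policyCited: string;  // the rule the reporter invoked
  decision?: "removed" | "restricted" | "no_action";
  decidedAt?: string;
  appealedAt?: string;
  appealOutcome?: "upheld" | "overturned";
}

// Median time-to-decision in hours - the kind of number a transparency
// report (or a regulator) asks for, computable only if timestamps exist.
function medianHoursToDecision(reports: ReportRecord[]): number {
  const hours = reports
    .filter((r) => r.decidedAt !== undefined)
    .map((r) => (Date.parse(r.decidedAt!) - Date.parse(r.reportedAt)) / 3.6e6)
    .sort((a, b) => a - b);
  if (hours.length === 0) return 0;
  const mid = Math.floor(hours.length / 2);
  return hours.length % 2 ? hours[mid] : (hours[mid - 1] + hours[mid]) / 2;
}
```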

“We’ll build it later” is now a risky strategy


The most common trap I see is timing.

Leadership teams say: “We’ll solve this once we hit the next growth milestone.” But regulation doesn’t wait for your roadmap, and enforcement doesn’t care that you had good intentions.

If you have European users, the practical question is: how quickly can you reach a defensible level of trust and transparency?

That leads to the strategic decision every platform is now being pushed to make: build vs buy.
Initial investment
  • Build: a multi-month effort requiring in-house expertise, with significant engineering and policy investment.
  • Buy: deploy proven workflows, evidence capture, reporting, and AI-assisted moderation faster.

Maintenance costs
  • Build: heavy operational maturity - constant engineering needs, 24/7 operations with robust security, and ongoing adaptation of T&S configurations and policies as your product and market evolve.

There’s no universal answer. But time is a universal constraint.


How Checkstep helps: trust, compliance and transparency in one platform


At Checkstep, we help platforms put a real Trust & Safety operating system in place - one that supports day-to-day moderation and the governance expectations that regulations like the DSA are bringing into focus.

Checkstep offers full-stack DSA compliance modules:

  • Verified user handling & deceptive design detection.
  • Ads repository compliance tools.
  • Research-safe data infrastructure.
  • Risk assessments and systemic harm modeling.

This is about staying ahead of the curve, not reacting to it.



The urgency: enforcement is the start, not the peak


The €120M fine against X is a clear proof point that DSA enforcement can be costly and fast-moving. But the deeper message is even more important: regulators are building expectations around repeatable, auditable systems - not best-effort intentions.

If you’re not sure where you stand, the next step is straightforward.



Request a DSA readiness audit


We’ll map the gaps that matter most - from trust signals to reporting and appeals to transparency workflows - and outline a path to compliance that won’t overwhelm your operations.