Microsoft and the Digital Fairness Act
Read Microsoft’s Official Statement on the Digital Fairness Act (PDF)
On 24 October 2025, during the European Commission’s public consultation on the Digital Fairness Act, Microsoft published its official feedback on the upcoming legislation.
Microsoft tells the Commission that EU law already covers most DFA concerns (dark patterns, addictive design, personalisation, subscriptions). It asks Brussels to enforce and simplify what exists, avoid overlapping rules, use risk-based obligations (especially for minors), and not mandate blanket bans (e.g., on loot boxes or autoplay). On subscriptions, it supports clearer disclosures and easy online cancellation but warns against “double consent” for free-trial conversions and against a one-click “cancel button” that bypasses account login. It also wants better regulator coordination and room to digitise consumer information.
Why this matters (and how it maps to the DFA rulebook)
The DFA is expected to target a concrete set of practices: dark patterns, addictive design/gaming, personalisation, influencer marketing, contract cancellations & digital subscriptions, and unfair terms. Microsoft’s submission engages each bucket, but its through-line is: simplify, harmonise and only legislate where there’s a proven gap.
Microsoft’s starting position: enforce what we have, streamline what overlaps
Microsoft opens by backing “strong standards of online safety and consumer protection,” but says the EU should “prioritize the consistent application” of existing rules (UCPD, CRD, GDPR, DSA, DMA, AI Act, etc.) and avoid duplicating them with a new, sprawling package. It argues much of the framework only started applying recently (e.g., Omnibus since May 2022; DSA to all services since February 2024), so it’s “premature to conclude that the current framework is ineffective.”
The DFA should be “limited to very targeted adjustments,” with a particular opportunity to digitise consumer information so it’s clearer and more accessible.
Microsoft also urges clear definitions where consumer, data-protection and platform laws intersect (e.g., what exactly counts as a “dark pattern”), to prevent contradictory obligations and regulator forum-shopping.
Dark patterns: target proven harms, don’t re-legislate everything
Microsoft says the existing toolbox already bans deceptive UIs (DSA Article 25, UCPD misleading/omission/aggressive practices; GDPR and ePrivacy constraints). Priority should be consistent enforcement and guidance to separate illegal manipulation from “standard marketing techniques.”
New rules should “target specific and identified harmful and illegal patterns,” not cast a vague net over common design practices.
DFA tie-in: the Act’s list (biased hierarchies, confirm-shaming, drip pricing, hard cancellations, fake urgency) fits Microsoft’s focus on enumerated harms, backed by guidance rather than a sweeping new regime.
Addictive design & gaming: caution on bans; use risk-based duties and existing mitigations
Microsoft warns against blanket prohibitions on popular features (autoplay, infinite scroll, streaks), arguing the DSA already obliges Very Large Online Platforms to assess and mitigate systemic risks (including over-engagement) and that design risk is context-dependent. It prefers “clear, evidence-based, outcome-focused principles” over feature bans.
On games, Microsoft defends in-game purchases and currencies as the engine of modern “live-service” development, highlighting parental controls (spend limits, “Ask a Parent,” purchase histories), and noting PEGI and EU law already govern transparency. It stresses that “in-game currency” is digital content, not real money, with no value outside the game.
On loot boxes, Xbox commits to probability disclosure, no odds manipulation, and ensuring items have fair value but “strongly caution[s] against a blanket ban on lootboxes for minors,” citing existing parental controls and the cost/inefficacy of heavy age checks.
What this means for the DFA: regulate specific risky mechanics (e.g., require odds transparency and ban misleading claims), rely on parental controls for minors, and avoid a one-size-fits-all rule that could wall off EU users from global content.
Personalisation & advertising: already dense law, fix within that stack
Microsoft frames personalised offers/ads as economically essential and says the existing regime (GDPR fairness/consent; ePrivacy cookies; DSA ad transparency and minor protections; DMA consent for cross-service data; AI Act manipulation bans; CRD/UCPD transparency) is “extensive and complex.” Any proven issues should be addressed inside that framework, not via the DFA.
Translation: don’t create a parallel “unfair personalisation” code. Enforce GDPR/DSA consistently, and align interpretations among DPAs and consumer authorities.
DFA tie-in: the project’s “problematic personalisation” bucket (opaque profiling, personalised pricing, exploiting vulnerabilities) is real, but Microsoft wants harmonised enforcement, not a new layer of overlapping consent/UX duties.
Contracts, cancellations & subscriptions: simpler flows, but no “double consent” or insecure one-click
Microsoft backs a clear, flexible baseline across devices and services:
Disclose key terms upfront (price, renewal, cancellation).
Get consent to auto-renewals before charging or starting a free trial that converts.
Send reminders for long cycles and before trial → paid.
Make cancellation simple and free, generally in the same medium (if you sign up online, cancel online).
It warns against two ideas floated around the DFA:
A single “cancel” button with no login: Microsoft says that's a security risk and often clunkier in practice (consumers end up hunting for order numbers). It prefers cancellation within the logged-in account.
Mandatory “double consent” before a free trial turns paid: Microsoft says it creates “notification fatigue,” kills trials (it cites South Korea's rule, which led Microsoft and others to stop offering trials there), and adds engineering burden with little benefit.
DFA tie-in: the Act’s “subscription traps” goal (hard cancellations, silent renewals, hidden price jumps) is compatible with Microsoft’s stance, so long as the law leaves room for service-appropriate UX and avoids blanket re-consent mandates.
Children & age assurance: define which risks, then apply a risk-based check
Microsoft supports protecting minors but says the consultation doesn't specify, with evidence, which risks warrant gating via age assurance, leaving a danger that providers would have to install checks “regardless of risk level or type.” It urges the DFA to define concrete risks not covered elsewhere, then let high-risk services demonstrate mitigation (age assurance being one tool among several), balancing safety with children's access to culture, learning and play.
In short: harmonise the when and why of age assurance; keep it proportionate and privacy-respecting; don’t make app stores universal gatekeepers for everyone else’s risks.
Regulator coordination: cut duplication, align decisions
Because consumer, platform and data-protection rules now interlock, Microsoft asks for stronger cooperation and coordination among the CPC network, national consumer authorities, Digital Services Coordinators, and the Commission, so firms aren’t audited by multiple bodies in divergent ways and consumers get uniform outcomes.
“Technology-neutral,” evidence-based drafting: don’t regulate features, regulate outcomes
Microsoft repeatedly calls for technology-neutral, evidence-based rules that focus on harm and outcomes, not on specific UI widgets. It also asks lawmakers to consider different business models (marketplaces ≠ gaming ≠ search) and to keep any new DFA duties narrowly scoped.
Where this lands for Microsoft’s businesses
Xbox & first-party games: Odds disclosure, parental controls and clear spend histories are already in place; Microsoft would oppose a loot-box ban for minors but finds risk-based duties manageable.
Microsoft 365/Consumer subscriptions: The company is fine with transparent sign-up and easy online cancellation, but will resist double consent and login-free cancellation buttons as costly, confusing, and insecure.
Ads & retail surfaces (e.g., Bing, Microsoft Start): Microsoft wants harmonised enforcement across GDPR/DSA/ePrivacy rather than a new DFA layer on “unfair personalisation.”
Policy overhead: A single, clarified playbook and coordinated enforcers would trim duplicate audits and reduce conflict between legal regimes.
Quick reference: DFA practice → Microsoft’s position
Dark patterns → Enforce DSA/UCPD/GDPR; issue guidance; target specific manipulations, not broad, vague bans.
Addictive design → Use risk-based, outcome rules; no blanket bans; DSA already compels risk assessment/mitigation.
Gaming features (loot boxes, virtual currencies) → Keep transparency + parental controls; treat in-game currency as digital content; avoid blanket prohibitions.
Personalisation & ads → Framework already “extensive and complex”; fix within GDPR/ePrivacy/DSA/DMA/AI Act, don’t add a DFA overlay.
Contracts, cancellations, subscriptions → Clear disclosures; easy online cancellation; no account-less cancel button; no “double consent.”
Children & age assurance → Define which risks first; apply proportionate, privacy-respecting age checks for high-risk contexts.
Regulatory overlaps → Simplify; digitise consumer info; coordinate enforcers to avoid duplication.
What Microsoft is pushing back on
A new, free-standing dark-pattern code that duplicates DSA/UCPD.
Feature bans (e.g., universal autoplay or loot-box bans) without context and evidence.
Login-free “cancel buttons” and mandatory re-consent before trial conversion.
What Microsoft welcomes
Targeted fixes (not broad rewrites), digitised consumer information, and harmonised guidance across Member States and authorities.