Apple and the Digital Fairness Act

Read Apple’s Official Statement on the Digital Fairness Act (PDF)

On 24 October 2025, during the European Commission’s public consultation on the Digital Fairness Act, Apple published its official feedback on the upcoming legislation.

Apple welcomes the DFA’s consumer-protection goals (privacy, child safety, simpler information), but warns against definitions of “dark patterns” that would inadvertently penalise privacy-protective prompts. It urges tougher rules on tracking and “unfair personalisation”; EU-wide, privacy-friendly age assurance for minors; clearer allocation of responsibility to the service closest to the user; and permission to digitise mandatory product information. Apple also complains that competition rules (DMA) have created privacy risks by forcing more interoperability and data access.

Why this matters (and how it fits the DFA rulebook)

The DFA is being designed to curb dark patterns, addictive design, unfair personalisation and pricing, misleading influencer marketing, subscription/cancellation traps, and unfair contract terms across the EU’s digital economy. Think: neutral consent flows, simpler cancellations, fewer manipulative nudges, safer defaults for minors.

Apple’s October 2025 submission positions the company as broadly supportive of these goals, especially on privacy and child safety, while drawing red lines around UI design freedom for privacy protections, responsibilities in marketplaces, and the interaction with competition law.

Apple’s starting pitch: “we support the goals” (but with caveats)

Apple opens by praising the DFA’s direction (“protecting the safety of children online” and “strengthening transparency and user controls”) and says it wants to “continue to engage” as the law takes shape. It then lists five priorities: stronger tracker rules; child-safety obligations; clear allocation of responsibilities; better authority coordination; and digitised consumer information.

“The DFA creates an important opportunity to empower consumers… [by] mandating proactive information to consumers on the use of trackers [and] requiring that choices are presented neutrally.”

How this maps to the DFA practice list: proactive, neutral consent flows speak directly to DFA dark-pattern restrictions (no biased buttons, no confirm-shaming), while “trackers” and “unfair personalisation” are squarely in scope.

Personalisation & ad tracking: clamp down on “unfair” profiling, reward privacy-tech

Privacy is the centrepiece of Apple’s submission. It argues that “many companies” fuel hyper-personalised advertising via opaque tracking and data brokers, and it wants the DFA to set clear, enforceable guardrails on “unfair and deceptive personalisation” in online ads, paired with incentives for privacy-preserving methods such as on-device processing, differential privacy, and even homomorphic encryption.

“Unfair and deceptive personalization practices… must be stopped through clear, enforceable rules.”

“The DFA should incentivize privacy-preserving and enhancing methods… Privacy-by-design practices such as automatic blocking of trackers by default should also be systematically encouraged.”
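For readers who don’t work in privacy engineering, a quick illustration of one technique Apple names: differential privacy, in its textbook form, adds calibrated random noise to an aggregate statistic so that no individual’s data can be inferred from the published result. A minimal sketch using the classic Laplace mechanism (purely illustrative; the function names are ours, not Apple’s):

```swift
import Foundation

// Illustrative only: the Laplace mechanism, the textbook form of
// differential privacy. Noise scaled to sensitivity/epsilon is added to an
// aggregate statistic so that no single user's data meaningfully changes
// the published result. Not Apple's implementation.

/// One sample from Laplace(0, scale): the difference of two exponentials.
func laplaceNoise(scale: Double) -> Double {
    let u1 = Double.random(in: Double.ulpOfOne..<1.0)
    let u2 = Double.random(in: Double.ulpOfOne..<1.0)
    return scale * log(u1 / u2)
}

/// Release a count with epsilon-differential privacy.
/// A counting query has sensitivity 1: adding or removing one user
/// changes the true count by at most 1.
func privateCount(trueCount: Int, epsilon: Double) -> Double {
    let sensitivity = 1.0
    return Double(trueCount) + laplaceNoise(scale: sensitivity / epsilon)
}

// Example: publish how many users opted in, without exposing any individual.
print("Noisy count:", privateCount(trueCount: 12_345, epsilon: 1.0))
```

Apple’s point is that the DFA should reward designs of this kind, where a useful aggregate signal is published but raw, user-level data never leaves the device.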

What this could mean in practice

  • Consent UX: the DFA pushes neutral choices; Apple wants to ensure privacy prompts (like its App Tracking Transparency) remain legitimate and are not mislabelled as dark patterns.

  • “Pay-or-OK” alternatives: Apple says users who refuse tracking or “do not pay for a specific digital service” should still get equivalent services, a signal against coercive “pay or be tracked” models that DFA watchers will recognise.

  • Personalisation rules: The DFA’s work on problematic personalisation (opaque profiling, personalised pricing, exploitation of vulnerabilities) aligns with Apple’s critique of ad-tech profiling.

Business impact for Apple: Stronger anti-tracking norms complement Apple’s on-device design and its ATT ecosystem, fortifying a differentiator it already markets. (Apple cites EU research praising ATT’s “user-friendly” prompt and its high opt-out rate as evidence of user empowerment.)

Dark patterns: target manipulation, not privacy safeguards

Apple backs a DFA definition that “focus[es] on practices aimed at harming consumers or circumventing their rights,” and explicitly warns against drafting that would discourage default settings, notifications or designs intended to protect privacy.

“We must not give data harvesters another opportunity to weaponize EU law to their benefit.”

Where this touches DFA specifics: the rulebook’s examples include biased hierarchies, nagging, hidden cancellations, fake timers, drip pricing, basket sneaking, unclear “sponsored” labels, pre-ticked boxes, and confirm-shaming. Apple says all of these should be in scope, while privacy prompts should stay out of the firing line.

Children, addictive design & age assurance: harmonise, be privacy-first, and make platforms answerable

Apple argues for a complementary DFA that fills gaps between the GDPR, DSA and AVMSD, and backs EU-wide age-assurance standards for high-risk services that are tailored to risk and privacy-preserving. It also supports setting a minimum EU age for social media access to avoid fragmented national rules.

“The DFA should define adequate and harmonized means and methods for age assurance… [taking into account] the risk of the service to minors, the impact on user privacy, and other rights and freedoms.”

“Focusing responsibility on the party that is closest to the consumer and content will be key…”

Where this meets DFA practice areas: the Parliament and consumer groups have pushed to restrict addictive design (infinite scroll, autoplay, streaks) and game mechanics such as loot boxes and in-app currencies for minors; these topics sit alongside Apple’s focus on age assurance and provider accountability.

Subscriptions, cancellations & marketplace responsibility: put duties on the service that sells to the user

Apple highlights its App Store rules and human review as consumer protections (including around subscriptions). Its core policy ask here is role clarity: make the service provider, “the party closest to the consumer and content”, legally responsible for age checks and for implementing consumer rights, rather than imposing a generalised obligation on marketplaces to police every downstream flow.

DFA angle: the Act is expected to mandate simpler cancellations, ban subscription traps (auto-renewals without notice, nagging), and neutralise biased subscription UIs. These are areas where Apple says it already imposes strict rules, but it wants the legal duty to sit with the seller, not the storefront.

Competition rules vs privacy: Apple’s warning shot about DMA spillovers

Apple claims that recent EU competition and platform mandates have created privacy risks by forcing technical access to data Apple itself keeps restricted. It points to DMA-driven changes, such as third-party access to notifications decrypted on device and to users’ Wi-Fi history, arguing these undermine GDPR data minimisation and invite tracking. (These are Apple’s assertions; regulators and rivals may dispute the characterisation and necessity.)

“This sets a precedent… to collect even more data in ways that undermine the GDPR’s data minimization principles and consent.”

Why DFA drafters should care: the Act will need to mesh with DSA/DMA/AVMSD without opening privacy loopholes and, as Apple puts it, ensure that privacy authorities’ views are taken into “utmost account.”

Digitising consumer information: let us put the paper online

Apple wants EU-mandated user information (safety labels, radio specs, energy data, recycling info, etc.) to be digital by default rather than printed and stuffed into every box.

“In 2023, Apple bundled the equivalent of nearly 1 billion sheets of A4 paper with all its products in Europe for regulatory information alone…”

It argues digital delivery is more accessible (screen readers, zoom, contrast, language choice), cheaper to update, and greener. It’s a deregulatory ask that fits the DFA’s “modernisation” motif.

What Apple didn’t really engage on

  • Influencer marketing: little to no detail, despite the DFA’s focus on it.

  • Personalised pricing tests or transparency duties: Apple speaks broadly about “unfair personalisation” but doesn’t spell out mechanisms such as price disclosure or “no-surprises” tests.

  • Detailed game-mechanic bans (loot boxes, pity timers): Apple leans instead on age assurance and provider accountability.

Likely impact on Apple’s products & business

  • Reinforced privacy moat: If the DFA clamps down on cross-service tracking and coercive consent, Apple’s on-device and ATT posture looks vindicated and competitively helpful.

  • UI guardrails, not handcuffs: Tight but targeted dark-pattern bans should spare privacy-protective prompts, the line Apple most wants in the sand.

  • Child-safety engineering: EU-wide age-assurance norms will trigger design work across Apple’s ecosystems but, if responsibility sits with service providers, Apple avoids being the universal gatekeeper for others’ age checks.

  • Compliance simplification if digitisation wins: E-labelling and digital instructions would cut both paper consumption and the cost of updates at scale.

  • Persistent DMA tension: The more interoperability mandates encroach on device-level data, the louder Apple will argue that privacy is being traded away; expect sustained push-and-pull with enforcers.

Quick reference: DFA practice → Apple’s position

  • Dark patterns → Ban manipulative UIs; don’t catch privacy prompts and protective defaults.

  • Addictive design/gaming → Focus on age assurance and harmonised rules for minors; put the duty on the service closest to the content.

  • Personalisation & profiling → Stop “unfair” ad personalisation; prefer privacy-preserving tech; ensure equivalent service when users decline tracking or payments.

  • Contracts, cancellations & subscriptions → Neutral, simple cancellations; keep legal accountability on the seller/service, not the marketplace hosting the app.

  • Unfair terms → Implicitly supportive of clarity and accessibility; strongest concrete ask is digitised mandatory information.

Apple’s bottom line for lawmakers

“Enable privacy-by-design, stop unfair personalisation, protect kids with harmonised, privacy-friendly age assurance, clarify who’s responsible, coordinate enforcers, and let us put the paper online.”

If the Commission leans that way, the DFA will land close to Apple’s worldview. If instead it sweeps in privacy prompts as potential dark patterns, cements pay-or-OK models, or leans on marketplaces to police everything, expect pushback.

FAQ

What parts of the DFA does Apple most support?
Privacy guardrails for tracking/personalisation, child safety (age assurance), neutral consent flows, and digital consumer information.

What worries Apple?
Overly broad “dark pattern” definitions that could chill privacy-protective defaults; DMA-style mandates that, in Apple’s view, erode GDPR minimisation by forcing more data access; and unclear responsibility splits that dump duties on marketplaces.