TikTok and the Digital Fairness Act
Read TikTok’s Official Statement on the Digital Fairness Act (PDF)
On 24 October 2025, during the European Commission’s public consultation on the Digital Fairness Act, TikTok published its official feedback on the upcoming legislation.
TikTok urges the EU to enforce and harmonise existing laws (DSA, GDPR/ePrivacy, UCPD/CRD) before adding new ones. It pushes a risk-based approach to minors’ safety and “persuasive design” (rather than blanket bans on features), defends personalisation as pro-consumer and pro-SME, and asks for centralised, consistent enforcement. Where the DFA considers new duties (across dark patterns, addictive design, personalisation, influencer marketing, and subscriptions/cancellations), TikTok mostly prefers guidance and simplification over new prohibitions.
Why this matters (and how it maps to the DFA rulebook)
The DFA is expected to target concrete online practices: dark patterns, addictive design, personalisation, influencer marketing, contract cancellations & digital subscriptions, and unfair contract terms. This focused list is meant to complement the existing EU consumer-law stack.
TikTok’s submission supports the consumer-protection goals but warns against duplicating today’s frameworks or hard-banning UI features that may be useful or benign depending on context. It wants a horizontal, trader-agnostic approach with proportionate, risk-based measures and better enforcement coherence.
“Prioritise strengthening the existing legislative framework through guidance and more effective, and centralised enforcement.”
Minors, age assurance & “digital well-being”: risk-based obligations, not feature bans
This is TikTok’s centre of gravity. The company highlights a multi-layered age-assurance model combining technology, moderation, and reports from parents; it says it removed 25+ million suspected under-13 accounts globally in the last quarter. It is exploring the Commission’s age-verification app as an additional tool.
TikTok lists “50 safeguards” for younger users by default and differentiates the experience for 13–15 vs 16–17. Under-16 accounts are private by default and ineligible for recommendation; all under-18s get a 60-minute daily screen-time prompt and no notifications after bedtime; LIVE, gifting and TikTok Shop are 18+. Families can use Family Pairing and Restricted Mode to lock down settings.
It backs obligations that require services using “persuasive design features that are aimed predominantly at engagement” to assess and mitigate risks of over-use, proportionately to the service, the audience (e.g., minors), and the trader’s size.
Crucially, TikTok prefers the DSA’s new minors’ protection guidance language (“persuasive design”) over the consultation’s “addictive design” label, arguing that the science on “digital addiction” is still evolving and screen-time alone is a poor proxy for harm. It asks for DSA Article 35(3) guidance to help VLOPs (like TikTok) operationalise minors’ well-being duties, and for any cross-industry duty to exclude VLOPs already covered by DSA risk-mitigation, so as not to double-regulate.
DFA fit: This aligns with the DFA’s focus on addictive design but pushes lawmakers toward risk assessments + mitigations, not universal bans (e.g., on autoplay or infinite scroll). Parliament has signalled interest in curbing specific engagement hooks, so expect tension here.
Dark patterns & choice architecture: central guidance, clear lines, no duplicative regimes
TikTok says the EU already outlaws deceptive UIs through UCPD, CRD, GDPR/ePrivacy, and DSA Article 25, but interpretations aren’t aligned. It asks the Commission for centralised cross-regulatory guidance on “dark patterns” (or the broader “harmful online choice architecture”), clarifying the hierarchy and interplay between regimes and distinguishing legitimate business practices from harmful ones.
It also points to the CJEU’s Compass Banca decision to argue the “average consumer” benchmark is dynamic, already capable of reflecting modern behavioural insights (e.g., information overload and cognitive biases); in TikTok’s view, that is another reason to rely on guidance and enforcement rather than inventing new, vague concepts such as “fairness by design.”
TikTok urges the EU to avoid introducing specific, inflexible and mandatory presentation rules for information or cancellations; form should follow context and device constraints.
Personalisation & ads: keep it, regulate it, simplify the pop-ups
TikTok defends personalisation as the backbone of a vibrant digital economy, helping consumers discover and SMEs grow (it cites €4.8bn added by TikTok SMEs in 2023 and 51,100 jobs). It argues the current legal stack already draws the important red lines (e.g., no profiling of minors, no special-category data for ads) through GDPR/ePrivacy/DSA. Its bottom line: no clear legislative gaps on personalisation have been shown.
TikTok wants less fragmentation and fatigue: integrate cookie/trackers rules from ePrivacy into GDPR’s risk-based framework, and rationalise overlapping consent/information prompts so controls are usable on mobile. It also warns not to unduly restrict legitimate advertising and personalisation models, invoking the freedom to conduct business and the need for a level playing field.
Influencer marketing: the law already bans hidden ads; make enforcement consistent
TikTok stresses that hidden digital advertising is already prohibited and that many issues stem from low awareness among creators/brands and fragmented national rules. It says the platform auto-labels paid ads, provides commercial content disclosure tools, and trains creators via policy hubs and a Creator Code of Conduct. It backs the Commission’s Influencer Legal Hub, EASA AdEthics, and DiscloseMe, suggesting EU-level recognition of accepted disclosure options to standardise practice.
Pricing tactics, dynamic pricing & transparency
TikTok doesn’t see a need for material interventions; instead, it proposes targeted updates to UCPD guidance to address dynamic pricing expectations. It supports continued Commission sweeps on price transparency and drip pricing with follow-up enforcement.
Contracts, cancellations & subscriptions
TikTok supports the DFA’s simplification agenda: reduce repetitive information obligations in in-app purchases and recurring services; adopt digital-by-default information; remove paper withdrawal forms; modernise the notion of a “durable medium.” It also favours EU-level alignment of post-termination information and contract-termination rules. For new obligations on cancellations or disclosure, it cautions against rigid, one-size-fits-all UX mandates that ignore product context or device constraints.
Simplification, harmonisation & (centralised) enforcement
Here TikTok is emphatic. It welcomes the Digital Omnibus and a Digital Fitness Check; calls out fragmented national approaches (e.g., influencer rules) as a cost driver; and proposes strengthening EU-level enforcement, including a central supervisory/enforcement authority under a revised CPC Regulation, annual EU enforcement priorities, and closer coordination across the DSA, GDPR, and Data Act enforcement silos. It also wants more transparency about outcomes (commitments, decisions, case closures) and annual guidance updates.
TikTok warns against introducing new, vague concepts like “fairness by design” or reversing the burden of proof.
Where TikTok aligns and where it pushes back (vs the DFA practice list)
Dark patterns / harmful choice architecture → Support cross-regulatory guidance, clear lines; avoid duplicative, inflexible rules on form and layout.
Addictive design → Prefer “persuasive design” framing; risk-based assessments and mitigations; no blanket bans on features; VLOPs already have DSA duties.
Personalisation → Keep it; enforce existing limits (no profiling of minors, etc.); simplify consent/info flows; don’t restrict legitimate ad models.
Influencer marketing → Laws already ban hidden ads; focus on awareness, tools, harmonised disclosures, and consistent enforcement.
Pricing → No major new rules; update UCPD guidance for dynamic pricing; keep EU sweeps.
Contracts, cancellations & subscriptions → Simplify and digitise; align termination rules; resist rigid, one-format mandates.
Likely impact on TikTok’s products & business
Minors’ safeguards: The platform already touts strict defaults, gating of certain features to 18+, and auditing under the DSA. A DFA that codifies risk-based minors’ duties (rather than banning features) would fit TikTok’s model; broad bans could force UI/algorithm changes across feeds and LIVE.
Personalisation & ads: If the DFA leans toward harmonised enforcement (not new layers), TikTok’s For You model and SME ad tools remain viable; sweeping restrictions on personalisation would bite.
Influencer economy: EU-recognised disclosure standards would reduce frictions for creators/brands; fragmented national rules would keep compliance complex.
Compliance overhead: A centralised CPC-style enforcer and annual guidance would bring clarity, but also concentrated oversight. TikTok is effectively inviting that trade-off.
What critics may say (context for readers)
Parliament has floated feature-level restrictions (e.g., infinite scroll, autoplay, constant notifications) on public-health grounds, especially for minors. That is a sharper approach than TikTok’s risk-based preference and could reignite debates about engagement-maximising design versus user well-being.
Bottom line for DFA drafters
TikTok is offering a bargain: keep the toolbox we have, make it work better, and target real gaps with risk-based measures. Expect it to support minors’ protections and simplification, while pushing back on blanket feature bans, duplicative regimes, or vague catch-alls that would constrain personalisation and muddy compliance.