Authored By: Neha Anusha D
O.P. Jindal Global University
Introduction
Digital platforms and online marketplaces now sit at the centre of trade mark disputes, providing the primary infrastructure through which infringing goods are advertised, sold and promoted. The earlier E-Commerce Directive (ECD) was drafted for a much smaller and more “neutral” internet, and its conditional safe harbours for intermediary services were widely criticised for failing to address the active role of platforms in curating, ranking and monetising user content. The Digital Services Act (DSA) responds by retaining the safe harbours but radically expanding the obligations that attach to them, particularly for very large platforms and search engines.
This article critically examines the DSA’s model of intermediary liability with a focus on trade mark infringements and the tension between effective enforcement and the protection of users’ fundamental rights. It situates the DSA against the background of earlier EU and national regimes (including the UK’s E-Commerce Regulations), explains key mechanisms such as notice-and-action and trusted flaggers, and evaluates likely “spillover effects” of the DSA on jurisdictions beyond the EU. It argues that while the DSA will deliver real benefits for brand owners and consumers, it also risks over-removal, increased reliance on automated filtering and a structural tilt in favour of powerful rights-holder constituencies.
From Safe Harbours to Risk Governance
Under the ECD, intermediary service providers benefit from safe harbours for “mere conduit”, “caching” and “hosting” provided they lack knowledge of illegality and act expeditiously upon obtaining such knowledge. This system was complemented by a prohibition on imposing general monitoring obligations, reflecting a political choice to protect the openness and scalability of online services. Over time, however, platforms evolved from passive hosts into active gatekeepers, using recommender systems, behavioural advertising and marketplace design to shape user attention and transactions. EU case law on trade mark infringement through online marketplaces exposed tensions in the “neutral host” fiction, as courts struggled with questions of knowledge, control and economic benefit.
The DSA explicitly builds upon and updates the ECD rather than replacing it. Safe harbours remain, and the no-general-monitoring principle is reiterated. Yet the Act introduces a tiered architecture of obligations that effectively recalibrates the risk calculus for intermediaries. All intermediary services must comply with baseline transparency and reporting duties. Hosting services face additional obligations around notice-and-action and statements of reasons. Online platforms incur further responsibilities concerning complaint handling, trader traceability, advertising transparency and dark patterns. Very large online platforms (VLOPs) and very large online search engines (VLOSEs) must conduct recurring systemic risk assessments, implement risk-mitigation measures, undergo independent audits and cooperate closely with regulators.
For trade mark disputes, this shift from a minimalist liability regime to a thick layer of procedural and organisational duties matters in two ways. First, it standardises and strengthens the infrastructure through which rights-holders can notify and seek removal of infringing content. Second, it embeds trade mark enforcement within a broader risk-governance framework that encourages proactive detection and mitigation of “systemic” illegal content, potentially pulling platforms closer to de facto general monitoring.
Notice-and-Action and Trusted Flaggers
Structured notice-and-action
The DSA requires hosting providers, including platforms, to operate user-friendly electronic mechanisms for notifying allegedly illegal content. Notices must identify the content precisely (for example, via URLs), provide a substantiated explanation of illegality, identify the notifier, and include a good-faith statement regarding accuracy. Upon receipt, the provider must assess the notice in a timely, diligent, objective and non-arbitrary manner, decide whether to remove or disable access, and inform the notifier of its decision and available redress.
For trade mark owners, this represents an EU-wide codification and enhancement of practices many global platforms already apply under EU and US law. Rights-holders gain more predictable procedures, clearer expectations of platform behaviour and the backing of significant fines for systemic non-compliance. Traders and users, in turn, benefit from greater transparency about why content is removed and from access to complaint mechanisms.
However, the structure and incentives of the system skew towards caution on the part of platforms. Because failing to remove genuinely illegal content can lead to regulatory sanctions, while over-removal of borderline content is largely invisible to regulators, platforms have strong reasons to err on the side of takedown. In nuanced trade mark scenarios such as comparative advertising, resale of genuine goods, parody or fan uses, this may translate into a default bias toward removal, even where the legality of the content is at least arguable.
The rise of trusted flaggers
The DSA introduces “trusted flaggers”: entities (not individuals) with recognised expertise and independence whose notices must be treated with priority and processed without undue delay by online platforms. These entities must report annually on their activities and can lose their status if they submit large volumes of imprecise or inadequately substantiated notices. The legislation explicitly envisages industry associations as prime candidates, partly to keep the overall number of trusted flaggers limited.
This mechanism clearly empowers collective trade mark enforcement. Brand coalitions and anti-counterfeiting organisations can leverage trusted flagger status to secure rapid removal of infringing listings and promotional content and to coordinate systemic action against recurring offenders. From a consumer-protection perspective, this may significantly reduce the prevalence of counterfeit goods and deceptive advertising.
However, concentrating priority notification powers in a small group of private actors also raises rule-of-law concerns. Even with reporting and revocation safeguards, trusted flaggers will exert disproportionate influence over platform enforcement practices. Smaller traders, resellers and users—whose interests may diverge from those of major rights-holders—have no symmetric institutional role in the system. The danger is that the DSA’s trusted flagger regime may amplify existing power imbalances, lowering the effective threshold for content removal in ways that disproportionately favour large rights-holder constituencies.
Systemic Risk, Recommender Systems and Marketplace Design
For VLOPs and VLOSEs, the DSA goes beyond individual notices to impose duties around “systemic risks”. Platforms must periodically identify, analyse and assess risks stemming from their design, functioning and use, including the dissemination of illegal content such as trade mark-infringing goods and the impact of their services on fundamental rights, civic discourse and public security. They must then adopt “reasonable, effective and proportionate” mitigation measures. These may include adjusting recommender and ranking algorithms, tightening terms and conditions and their enforcement, refining content-moderation processes, introducing additional protections for vulnerable groups, and modifying cooperation with trusted flaggers.
In the trade mark context, systemic risk obligations are likely to drive:
- More intensive onboarding and verification of traders, especially in high-risk product categories.
- Increased proactive detection of suspect listings, for example via image recognition, keyword filters and behavioural signals.
- Stricter defaults in search and recommendation logic to demote or exclude higher-risk sellers or product types.
- Closer integration of brand-owner inputs into marketplace design.
These measures can materially reduce the volume and visibility of infringing goods, improving both consumer protection and the integrity of legitimate trade channels. Yet they also push platforms towards continuous, large-scale analysis of user activity, blurring the line between specific, targeted interventions and general monitoring. Automated filters may struggle to distinguish counterfeit goods from lawful resales, repairs or modified products, and to account for contextual defences such as parody or commentary that reference trade marks. Because false positives are rarely visible to regulators but quickly felt by users and small traders, the systemic risk framework risks institutionalising a quiet over-enforcement of trade mark norms.
Recommender system transparency obligations may further shape trade mark outcomes. Platforms must explain, in accessible terms, the main parameters of their recommender systems and, for VLOPs, provide at least one option not based on profiling. These duties can indirectly encourage safer default configurations that favour verified traders and demote unverified or high-risk sellers. At the same time, the partial disclosure of ranking logic may enable sophisticated infringers to adapt their strategies, underscoring the need for continually adaptive enforcement rather than static compliance.
Users’ Rights and Procedural Safeguards
A key innovation of the DSA is the package of procedural rights afforded to users whose content or accounts are affected by moderation decisions. Hosting providers must issue statements of reasons for content restrictions, including the legal or contractual basis, whether the content is regarded as illegal or simply policy-violating, and available avenues of redress. Online platforms must implement internal complaint-handling mechanisms, staffed by suitably qualified personnel rather than purely automated systems, and offer access to certified out-of-court dispute settlement bodies. Users retain the right to pursue judicial remedies and to seek compensation for DSA breaches.
In principle, these mechanisms should curb arbitrary or opaque removals and give traders and users meaningful routes to challenge erroneous trade mark enforcement. In practice, however, their effectiveness may be limited by time and resource constraints. Complaints and out-of-court dispute procedures can take weeks or months to resolve, which is often too slow in fast-moving commercial environments. A small business whose listing has been removed during a key sales period may suffer irreparable loss even if ultimately vindicated. The costs and complexity of pursuing redress—particularly across borders—may also deter many users and SMEs from contesting rights-holder claims, reinforcing the structural imbalance between well-resourced brand owners and fragmented, resource-constrained counterparties.
National Regimes and Global Spillovers
Although the DSA is a directly applicable Regulation intended to achieve maximum harmonisation in its field, it coexists with sector-specific EU and national laws, including those governing intellectual property, media, consumer protection and security. Member States cannot impose additional obligations in areas covered by the DSA, but they retain room to regulate issues beyond its scope. This creates a layered regulatory environment in which trade mark disputes may engage multiple legal instruments simultaneously.
Outside the EU, the DSA’s impact is nevertheless likely to be significant. Large platforms seldom maintain entirely separate compliance architectures for each jurisdiction; instead, they tend to adopt global or regional standards aligned with the most demanding regulatory regimes. The DSA therefore functions as a de facto global benchmark. Its notice-and-action, trusted flagger and systemic risk frameworks are likely to influence legislative debates and platform practices in jurisdictions such as the UK, which continues to apply ECD-style safe harbours through its E-Commerce Regulations while implementing its own Online Safety Act 2023. As a result, the DSA’s particular balance between trade mark enforcement, platform accountability and users’ rights may be exported—directly or indirectly—into legal systems with different constitutional cultures and IP policies.
Conclusion
The Digital Services Act represents a decisive shift from a minimalist safe-harbour regime to a dense governance framework that embeds trade mark enforcement within a broader system of platform accountability and risk management. For trade mark owners and consumers, it promises more consistent and powerful tools: harmonised notice-and-action procedures, strengthened trader traceability, prioritised enforcement via trusted flaggers, and systemic risk duties that push platforms to address counterfeit and infringing goods proactively. For users and small traders, it offers new transparency rights and procedural avenues to challenge content removals and seek redress.
Yet these benefits come with serious risks. Strong compliance incentives, combined with systemic risk obligations, are likely to promote over-removal, generalised reliance on automated filtering and a structural bias toward the preferences of large rights-holder organisations. Trusted flagger regimes may entrench enforcement asymmetries, while complaint and dispute-settlement mechanisms may prove too slow and burdensome to protect time-sensitive commercial and expressive interests. Through regulatory spillover, these dynamics may spread beyond the EU, shaping global standards for intermediary liability and trade mark enforcement.
Whether the DSA’s net effect is normatively desirable will depend on how regulators interpret its proportionality and fundamental-rights provisions, how platforms design and operate their enforcement systems, and how effectively users, SMEs and civil society can participate in governance processes around systemic risk, codes of conduct and dispute resolution. The DSA provides a powerful toolkit for tackling trade mark infringement in the digital environment, but its legitimacy will ultimately turn on whether it succeeds in balancing that aim with the preservation of an open, diverse and rights-respecting online ecosystem.
Bibliography
- Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC OJ L277/1.
- Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market OJ L178/1.
- Electronic Commerce (EC Directive) Regulations 2002, SI 2002/2013.
- Online Safety Act 2023 (UK), Bill documents https://bills.parliament.uk/bills/3137 accessed 3 February 2026.
- Dinwoodie GB and others, Internet Service Provider Liability for Copyright and Trade Mark Infringement (OUP 2017).
- Papandropoulou P and others, ‘Internet Intermediaries’ Liability for Online Copyright Infringement’ in van Zimmeren E and others (eds), Intellectual Property Perspectives on the Regulation of New Technologies (De Gruyter 2021).
- Buiten J, ‘The Digital Services Act: A New Horizontal Framework for Intermediary Liability in the EU’ in Pollicino O and others (eds), The Digital Services Act (2022).
- Krokida M, Online Platforms and Intermediary Liability in EU Law (PhD thesis, European University Institute 2021).
- European Broadcasting Union, Digital Services Act: A Handbook for Public Service Media (EBU Legal and Policy Department 2023).
- Clifford Chance, ‘The Digital Services Act: What Is It and What Impact Will It Have?’ (Client briefing, December 2022).
- Digital Services Act – Practical Implications for Online Services and Platforms (White Paper, 2023).
- Impact of the Digital Services Act on Online Services and Platforms (Policy Brief, 2023).
- Keller D, ‘The EU’s New Digital Services Act and the Rest of the World’ (Verfassungsblog, 7 November 2022) https://verfassungsblog.de/the-eus-new-digital-services-act-and-the-rest-of-the-world accessed 3 February 2026.