Authored By: Charushila Pranavi A/P Sukumaran
ABSTRACT
The emergence of artificial intelligence (AI) in creative fields such as literature, music, and visual art presents a unique challenge to the traditional framework of copyright law in the United Kingdom. This paper explores whether AI-generated works should be eligible for copyright protection and, if so, who should hold ownership: the developer, the user, or another party. While UK law recognises computer-generated works under Section 9(3)¹ of the Copyright, Designs and Patents Act 1988 (CDPA), authorship remains anchored in human contribution. This article examines the legal, ethical, and philosophical dilemmas, evaluates recent UK and international jurisprudence, including the landmark Getty Images v Stability AI case, and advocates for a hybrid model that respects both the rights of human creators and technological innovation. The discussion highlights ethical concerns such as accountability and the dilution of human authorship in the age of machine intelligence. Ultimately, the article argues for adaptive legislation and international collaboration to redefine authorship in an AI-augmented creative economy.
INTRODUCTION
Artificial intelligence is transforming the creative landscape, producing poetry, music, visual artworks, and scripts that rival human-made content. This evolution challenges foundational copyright principles that hinge on human attributes such as intention, creativity, and expression. UK copyright law, governed by the CDPA 1988, attempts to accommodate AI via Section 9(3), which assigns authorship to the person who made the “arrangements necessary” for the creation. However, this provision, drafted before the advent of generative AI, now faces scrutiny in terms of applicability, relevance, and fairness.
Proponents of AI copyright protection argue it fosters innovation and rewards investment, while critics warn that extending authorship to non-human or developer entities undermines human creativity’s legal recognition. This article explores competing perspectives, current UK laws, emerging judicial trends, and the imperative for reform.
BACKGROUND
Historically, copyright has protected works authored by humans, rooted in notions of originality and moral rights. As AI systems like GPT-4, DALL·E, and DeepMind’s AlphaCode create increasingly autonomous works, the distinction between human and machine creativity has blurred. The UK was among the first to legislate for computer-generated works. Under Section 9(3) CDPA, “the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken.” Yet ambiguity remains: does authorship lie with the developer, the user, or the system operator?
Internationally, the US insists on human authorship (e.g., the DABUS cases), while China has begun recognising AI-assisted creations when humans exert significant creative control. The UK now stands at the crossroads of aligning domestic law with digital innovation. As AI systems become capable of autonomous or semi-autonomous creativity, these foundations are being tested. Unlike earlier software, AI systems such as GPT-4 do not merely follow pre-defined instructions; they learn, adapt, and generate novel content, often through opaque internal processes. As a result, the human role is increasingly indirect and arguably less “creative” in the traditional legal sense.
3.0 ANALYSIS
3.1 Legal Personhood and Copyright Eligibility
UK law defines authors and rights holders as humans or legal persons; AI lacks legal personality and cannot independently hold rights. In Thaler v Comptroller-General of Patents [2023] UKSC 49², the UK Supreme Court confirmed that an AI system cannot be named as an inventor under patent law. Although the case concerned patents, the reasoning applies similarly to copyright: since AI cannot possess legal rights or responsibilities, it cannot qualify as an author or rights holder under current UK legal frameworks.
3.2 Section 9(3): Ambiguities and Limitations
Section 9(3) of the Copyright, Designs and Patents Act 1988 provides that, in the case of computer-generated works, the author is “the person by whom the arrangements necessary for the creation of the work are undertaken.” However, the provision lacks clarity about who qualifies as making these arrangements, especially when artificial intelligence is involved. This ambiguity was addressed in THJ Systems Ltd v Sheridan [2023] EWCA Civ 1354³, where the Court of Appeal held that the software developer, rather than the user, was the author, as the developer made the key intellectual and creative choices necessary for the work’s creation. This case highlights that human intellectual input is crucial in determining authorship. Moreover, works generated entirely by autonomous AI, without human “free and creative choices”, are unlikely to satisfy the originality requirement under UK copyright law and may therefore fall outside the scope of protection.
3.3 Moral Rights and Originality
UK originality demands that works bear the author’s “own intellectual creation”, requiring free, creative, and personal choices. This standard is not purely technical; it also encompasses a personal dimension. Under the UK’s implementation of the EU standard of originality, works must reflect a degree of individual expression, judgement, and free will⁴. Extending moral rights to AI would therefore devalue the rights of human creators and disrupt normative legal frameworks.
AI, however, lacks consciousness, intentionality, and moral agency. It operates based on probabilistic pattern recognition, not personal experience or aesthetic judgment. Therefore, granting AI systems moral rights such as the right of attribution or integrity would be both conceptually and legally incoherent. Moreover, doing so could undermine human creators, whose personal and cultural identities are often inextricably linked to their creative output.
Thus, the “human touch” remains essential not only for meeting legal originality standards but also for upholding the integrity of moral rights. These rights are grounded in human dignity, identity, and creativity: qualities that AI systems do not and cannot possess.
3.4 Ethical Implications
In addition to legal complexities, the rise of AI-generated content raises serious ethical concerns. Granting full copyright protection to AI-generated works risks marginalising human artists and obscuring accountability, potentially flooding the market with low-cost, high-volume content that undermines the economic and cultural value of human creativity. Many AI systems are trained on datasets scraped from the internet without the consent of creators, raising issues of privacy, ownership, and infringement, including concerns over derivative works. For example, training a model on copyrighted images or written works without obtaining permission or offering compensation undermines the rights of original creators. These concerns are largely unaddressed by existing UK law, which has yet to provide a regulatory framework governing the sourcing and use of training data.
3.5 High-Profile UK Case: Getty Images v Stability AI
A prominent illustration of the legal and ethical tensions in this area is Getty Images v Stability AI⁵, which came to trial in the London High Court in June 2025. Getty Images alleged that Stability AI used millions of its copyrighted images without permission to train the image-generating model Stable Diffusion. The lawsuit also claimed that Stability AI reproduced Getty’s watermarks, suggesting intent to deceive users or misrepresent ownership.
Although Getty later withdrew its central copyright claims on jurisdictional grounds, because the model was trained outside the UK, the case continues on trademark and indirect copyright grounds. It has become a legal flashpoint, and the High Court’s eventual decision will be watched closely both in the UK and internationally for its effect on future litigation, licensing standards, and regulatory approaches.
Its outcome will likely influence global standards on:
- AI training data legality
- Jurisdiction over AI-related IP claims
- Developer liability for AI outputs
The case illustrates how existing copyright law strains under AI’s capabilities and business models.
3.6 Comparative Approaches
- United States: requires human authorship. For instance, in Zarya of the Dawn, only the human-created parts of the work were protected.
- EU: favours originality grounded in human creative choices, similar to the UK approach.
- China: in Li v Liu, AI-assisted works were protected where human intervention was significant.
The UK has the opportunity to adopt a nuanced, future-focused model.
4.0 Proposal for Reform in the UK
The current UK legal framework, largely rooted in human-centric copyright principles, is increasingly inadequate in addressing the complexities introduced by AI. To keep pace with rapid technological developments and to provide clarity and fairness to all stakeholders, comprehensive reform is essential.
The first priority should be to clarify Section 9(3) by explicitly defining what constitutes the “arrangements necessary” for the creation of a computer-generated work. This clarification should help determine who holds ownership among the various actors involved, such as developers, users, and funders. Secondly, the UK should adopt a Hybrid Authorship Model that recognises the evolving nature of creative collaboration involving AI. This model would acknowledge human contributors in AI-assisted works and allocate copyright where meaningful human input, such as prompt engineering, editing, or output curation, is evident. This approach avoids the artificial binary of either human or AI authorship, reflecting the collaborative reality of modern content creation.
Importantly, the reform should maintain the principle of AI non-ownership, ensuring that AI systems themselves remain barred from holding copyright. Legal personhood and rights must stay anchored in human agency and accountability.
To increase transparency and facilitate enforcement, the UK should establish a registration framework that allows creators to disclose AI involvement and human contributions at the point of copyright registration. This would provide clear records of authorship and the extent of AI collaboration. Finally, reforms must include ethical safeguards. These would regulate the use of training data, requiring transparency in dataset sourcing, prompt logging, and obtaining consent where necessary. These measures, reflected in recent government consultations, are essential to protect creators’ rights and privacy, and to maintain public trust in AI-generated content.
A strictly human-centric framework is inadequate for current realities. UK reform should address gaps and enhance clarity.
Recommendations:
- Clarify Section 9(3): define “arrangements necessary” to determine ownership among developers, users, or funders.
- Introduce a Hybrid Authorship Model: recognise human contributors in AI collaborations, and attribute rights where meaningful human input such as prompt engineering or editing is evident.
- Maintain AI non-ownership: keep AI systems barred from holding copyright.
- Create a registration framework: allow creators to disclose AI involvement and human contributions during registration.
- Introduce ethical safeguards: implement rules for transparent training data usage, prompt logging, and consent, as proposed in recent government consultations.
4.1 Introduce a Hybrid Authorship Model
Instead of forcing a rigid, binary choice between human authorship and AI-generated work, UK copyright law should embrace a hybrid authorship model. This approach recognises the increasingly collaborative nature of creativity involving both humans and AI systems. Under such a model, varying levels of human creative input, ranging from crafting prompts to selecting, refining, or curating AI outputs, would be acknowledged as valid contributions deserving of authorship rights. Importantly, this model elevates the role of prompt design and output curation as integral acts of creativity, rather than mere technical tasks. By doing so, the law would better reflect the realities of modern content creation, where humans and machines interact to produce original works, while maintaining clear boundaries so that AI itself remains ineligible for ownership or moral rights.
5.0 Conclusion
The creative potential of AI compels a fundamental re-examination of authorship within UK copyright law. While Section 9(3) provides a historical framework for computer-generated works, it falls short in addressing the sophisticated realities of AI-generated content. A thoughtfully designed hybrid authorship model, which clearly defines the roles of humans and AI in the creative process, offers the best path forward. It safeguards human authorship and moral rights while enabling innovation and commercial development.
The outcome of key cases such as Getty Images v Stability AI will be pivotal. These decisions will not only influence UK law but may also set precedents affecting international standards on AI and copyright. As AI continues to reshape creativity, the UK has the opportunity to lead by adopting a balanced, clear, and ethically grounded legal framework.
BIBLIOGRAPHY
Articles
- Neville (2024) Ownership of AI-generated content in the UK <https://www.aoshearman.com/en/insights/ownership-of-ai-generated-content-in-the-uk> accessed 29 October 2024
- Aleksandra J (2023) Who Owns the Content Produced By Artificial Intelligence Platforms? <https://www.linkedin.com/pulse/who-owns-content-produced-artificial-intelligence-platforms-jarosz/> accessed 29 October 2024
- LPP (2023) Who Owns the Copyright Work in AI-Generated Art? <https://lpplaw.my/insights/e-articles/who-owns-the-copyright-work-in-ai-generated-art/> accessed 29 October 2024
- Newell A and Simon HA (1956) The Logic Theory Machine: A Complex Information Processing System, IRE Transactions on Information Theory
CASES
- Thaler v Comptroller-General of Patents [2023] UKSC 49
- THJ Systems Ltd v Sheridan [2023] EWCA Civ 1354
- Zarya of the Dawn (U.S. Copyright Office, 2023)
- Getty Images v Stability AI (High Court of England and Wales, ongoing 2025)
- Li v Liu, Beijing Internet Court, (2019) Jing 0491 Min Chu No. 19816
LEGISLATION
- Section 9(3), Copyright, Designs and Patents Act 1988 (c 48)
1 Section 9(3), Copyright, Designs and Patents Act 1988 (c 48)
2 Thaler v Comptroller-General of Patents [2023] UKSC 49
3 THJ Systems Ltd v Sheridan [2023] EWCA Civ 1354
4 Neville (2024) Ownership of AI-generated content in the UK <https://www.aoshearman.com/en/insights/ownership-of-ai-generated-content-in-the-uk > accessed 29 October 2024
5 Getty Images v Stability AI (High Court of England and Wales, ongoing 2025)