A Comparative Analysis of the Legal Implications of Artificial Intelligence in Contract Formation: Nigeria, the United Kingdom, and the European Union

Authored By: Oluwatamilore Omojola

Bowen University

Abstract

Artificial intelligence (AI) is rapidly transforming the mechanisms of contract formation, raising complex legal questions regarding capacity, consent, liability, and enforceability. This article conducts a comparative analysis of the legal implications of AI-assisted and AI-generated contracts under the laws of Nigeria, the United Kingdom, and the European Union. While Nigeria relies on traditional contract doctrines rooted in human agency, both the UK and EU have begun developing adaptive frameworks to regulate AI-driven commercial transactions. Adopting a doctrinal and comparative methodology, the article evaluates whether existing Nigerian contract law can accommodate autonomous contracting technologies and identifies critical legal gaps. It argues that without explicit statutory guidance, Nigeria risks legal uncertainty and diminished competitiveness in the digital economy. The paper concludes by proposing legislative and policy reforms to integrate AI into Nigeria’s contract law framework, drawing lessons from international models to promote legal certainty, innovation, and accountability.

Introduction

In the second quarter of 2025, the National Information Technology Development Agency (NITDA) in Nigeria unveiled the country’s new National Artificial Intelligence Strategy (NAIS), signalling a watershed moment for law and technology in Africa’s largest economy. The strategy was complemented by the National Artificial Intelligence Policy (NAIP), launched earlier in the year, which laid out ethical guardrails, governance structures and the ambition to steer AI innovation towards inclusive growth, thereby establishing a foundation for the development of AI in Nigeria.

Yet while the policy signals are bold, the legal terrain for one of the most consequential shifts in contract law remains unsettled. What happens when an automated algorithm, rather than a human mind, negotiates and enters into a binding agreement? Can the existing Nigerian legal framework, rooted in offer, acceptance, consideration and intention, adequately accommodate this new paradigm? Most importantly, does it provide clear lines of accountability when a smart contract executes, flawlessly or catastrophically, without a human signature?

This article addresses these questions through a comparative lens, exploring how Nigerian law currently handles contract formation in the age of artificial intelligence and juxtaposing it with approaches in the United Kingdom and the European Union (EU). The UK is chosen for its shared common-law heritage with Nigeria and its emerging regulatory responses to AI in contracts. The EU is included for its advanced, risk-based regulatory regime that is shaping global norms. Through a comparative analysis of these jurisdictions, the paper aims to highlight lacunae in Nigeria’s regime and offer reform pathways grounded in international best practice.

Research Methodology

This article adopts a doctrinal and comparative legal research methodology, supported by analytical and descriptive tools. The doctrinal method is employed to examine the existing legal principles governing contract formation in Nigeria, the United Kingdom, and the European Union, focusing on statutes, case law, policy frameworks, and regulatory instruments relevant to artificial intelligence (AI). This involves a close textual analysis of primary legal sources such as the contract laws of the Nigerian states, the Companies and Allied Matters Act 2020, the Nigeria Data Protection Regulation (NDPR), the United Kingdom’s Electronic Communications Act 2000, and the European Union Artificial Intelligence Act 2024.

The article also draws upon secondary sources including scholarly articles, policy papers, international guidelines, judicial commentaries, and reports from regulatory bodies such as NITDA, the UK Law Commission, and the European Commission. A qualitative analytical approach is employed to interpret these materials, offering a critical evaluation of the adequacy, effectiveness, and future prospects of AI regulation in contract formation.

To ensure relevance and contemporary value, the research incorporates recent developments, such as Nigeria’s National Artificial Intelligence Policy (2024/2025), global trends in smart contracting, and ongoing debates regarding AI liability and personhood. No empirical or field-based research was conducted; rather, the article relies on authoritative textual sources and established principles of legal reasoning. The methodology supports the article’s objective of contributing to legal scholarship and policy reform by providing actionable recommendations for the Nigerian legal system based on comparative insights.

Legal Framework Governing AI Contract Formation in:

  • Nigeria: The legal foundation for contract formation in Nigeria is rooted in common law principles inherited from the United Kingdom, emphasizing offer, acceptance, consideration and intention to create legal relations.[1] These principles are preserved in judicial precedent and statutory instruments such as the Contract Law of the various states, the Evidence Act 2011[2], and the Companies and Allied Matters Act (CAMA) 2020, which recognizes the validity of electronic signatures and digital communication in commercial transactions. However, artificial intelligence (AI) systems are not currently recognized under Nigerian law as legal persons capable of creating legal relations.[3] AI is perceived as a tool operated by humans, thereby raising uncertainty where agreements are formed autonomously without direct human involvement.

Furthermore, the Nigeria Data Protection Regulation (NDPR) 2019[4] and the National Artificial Intelligence Policy (2024/2025)[5] establish obligations regarding transparency, accountability and data governance in AI deployment. These policy documents demonstrate the country’s intention to regulate AI-driven economic activities but remain non-binding and do not address the enforceability or validity of AI-generated contracts. As a result, Nigeria’s legal framework is reactive rather than anticipatory, relying on general contract principles that presuppose human agency.

  • United Kingdom: The UK maintains a similar common law foundation for contract formation but has taken significant steps to integrate AI within its digital regulatory architecture. The Electronic Communications Act 2000[6] and the Electronic Identification and Trust Services for Electronic Transactions Regulations facilitate electronic contracts by recognizing automated systems and digital signatures. The UK Law Commission’s report on smart contracts (2021) affirms that English contract law is sufficiently flexible to accommodate smart contracts formed through code, even where elements of offer and acceptance are automated.[7]

However, like Nigeria, the UK does not confer legal personhood on AI. Instead, contracts generated by AI are attributed to the natural or legal person deploying the system. Nonetheless, the UK is ahead in policy development, as evidenced by the 2023 AI Regulation White Paper[8], which proposes a pro-innovation framework to address accountability, transparency and autonomy in AI-assisted contracting.

  • European Union: The European Union adopts a more structured and proactive regulatory approach. While traditional principles of contract formation remain governed by Member States’ national laws, the EU establishes harmonized standards for digital transactions through the eIDAS Regulation and the GDPR. Most notably, the EU Artificial Intelligence Act (2024) represents the world’s first comprehensive AI law, classifying AI systems based on risk and imposing strict obligations on providers and deployers. Although the Act stops short of granting legal personality to AI systems, it directly regulates AI applications involved in contractual processes, mandating human oversight, transparency in automated decision-making and liability safeguards.[9]

Additionally, the EU’s digital contract law frameworks recognize smart contracts as legally enforceable where parties have manifested consent through automated means. Thus, the EU has developed an anticipatory model that directly addresses the emerging challenges posed by AI in contract formation, positioning itself as a global standard-setter.

Judicial Interpretation and Case Law

  • Nigeria: Judicial interpretation of artificial intelligence in contract formation is virtually non-existent in Nigeria. Nigerian courts continue to apply classical contract principles derived from common law, which assume contracts are formed between natural or legal persons with the mental capacity to consent. In Esso West Africa Inc. v. T. Oyegbola[10], the Supreme Court stated that the law recognizes modern business techniques and must not turn a blind eye to the complexities introduced by computers. However, the dictum was applied in the context of electronic evidence, not AI autonomy. Nigerian courts have yet to determine whether contracts concluded by AI systems without direct human intervention can satisfy the requirements of offer and acceptance or intention to create legal relations.

Due to the absence of domestic case law, Nigerian courts are likely to rely on persuasive precedents from common law jurisdictions, particularly the United Kingdom. The lack of judicial clarification creates uncertainty regarding enforceability and liability in AI-generated contracts, leaving parties to rely on general principles of agency and attribution.

  • United Kingdom: The UK judiciary has not yet confronted a case directly concerning AI-generated contracts, but significant judicial and policy contributions already recognize automated contracting mechanisms. The Law Commission’s Smart Contracts Report (2021) synthesizes judicial approaches to electronic and automated agreements, concluding that existing English contract law is sufficiently adaptable to govern smart contracts formed by algorithmic systems.[11] This aligns with the principle that contract formation depends on the objective manifestation of assent, not the subjective intention of a human decision-maker.

In Thaler v Comptroller-General of Patents (2021)[12], the Court of Appeal made clear that AI cannot be named as an inventor under patent law because it is not recognized as a legal person. Although this was not a contract case, it demonstrates that AI lacks legal capacity, meaning that any contract involving AI must be linked back to a human or corporate party responsible for it. Similarly, English courts have accepted contracts made through automated systems. For example, in Thornton v Shoe Lane Parking (1971)[13], the court accepted that a machine could make a contractual offer, which becomes binding once a customer interacts with it. These cases show that the courts regard AI as a tool used by people rather than an independent party to a contract.

  • European Union: Within the European Union, judicial interpretation concerning artificial intelligence and contractual arrangements is significantly shaped by the jurisprudence of the European Court of Justice (ECJ). The ECJ consistently underscores the principles of transparency, accountability and the necessity of human oversight in automated decision-making processes. For instance, in Case C-40/17 Fashion ID GmbH v Verbraucherzentrale NRW[14], the Court affirmed that entities deploying automated systems bear joint responsibility for decisions effectuated through such digital technologies. Although this case primarily addressed data processing issues, it established foundational liability principles applicable to AI-driven contractual systems.

Moreover, the EU’s Artificial Intelligence Act (2024)[15] expressly regulates AI used in contractual processes, mandating human intervention in high-risk AI systems. While no specific ECJ ruling has yet determined the contractual capacity of AI, the legal architecture presumes that AI cannot function as an autonomous contracting party. Instead, liability is allocated to the provider or deployer under a strict regulatory regime.

Critical Analysis and Comparative Evaluation

The emergence and use of AI introduce significant disruptions to traditional contract law doctrines, particularly regarding capacity, consent, liability and enforceability. A comparative analysis of Nigeria, the United Kingdom and the European Union reveals fundamental gaps in Nigeria’s current legal architecture, while showcasing progressive and anticipatory frameworks in the UK and the EU.

  • Capacity and Legal Personhood: Under the law of contract, agreements are formed between entities with legal capacity. AI, however, possesses no legal personality in any of the jurisdictions examined. In both Nigeria and the UK, AI is treated as a technological tool, not as an independent contracting agent. The EU’s AI Act also rejects AI personhood; instead, it imposes obligations on developers and deployers. This means contracts generated autonomously by AI must be linked to a natural or legal person. Nigeria’s reliance on human-centric doctrines creates uncertainty when contracts are concluded by AI without explicit human intervention, unlike the UK and EU, which have begun developing guidance for attribution and liability.
  • Consent and Intention: The doctrines of offer and acceptance in Nigerian contract law presuppose human intention, whether express or implied by conduct. AI-generated agreements challenge this presumption: if an AI tool autonomously negotiates or generates contractual terms, it remains unclear whether the requirement of mutual assent is satisfied. In contrast, UK law operates on objective intention, enabling recognition of contracts formed through automated systems where the human deployer is deemed to consent to the AI’s actions. The EU goes further by requiring transparency in automated decision-making, ensuring that parties are aware when they are interacting with AI. Nigeria lacks such explicit provisions, exposing parties to uncertainty and potential disputes.
  • Liability and Risk Allocation: Liability remains a critical challenge in Nigeria’s AI contracting landscape. Unlike the UK, which allocates responsibility contractually or through the doctrine of attribution, holding the AI deployer accountable, Nigeria lacks a specific statutory or judicial framework addressing AI-related liability. Instead, parties must rely on general tort and agency law, producing legal uncertainty. The EU, by contrast, imposes strict regulatory liability under the AI Act, requiring proactive risk mitigation. Consequently, while the EU anticipates and regulates AI-related disputes, Nigeria remains reactive and vulnerable to ambiguity, highlighting the urgent need for legislative clarity.
  • Enforceability of Smart Contracts: In Nigeria, there is no explicit legislative recognition of smart contracts, although they may be enforced through existing contract rules. The UK Law Commission affirms that smart contracts are compatible with English law and a valid means of expressing contractual terms. The EU recognizes smart contracts in its digital transaction regulations, subject to transparency and human oversight requirements. This positions the UK and EU as more legally prepared for AI-driven contracting (a simplified illustration of such self-executing code follows below).
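To make the idea of a self-executing agreement concrete for readers unfamiliar with the technology, the sketch below models, in ordinary Python rather than actual blockchain code, how a smart contract reduces terms such as payment and delivery to conditions that execute automatically. It is purely illustrative and assumes nothing about any particular system or jurisdiction: the class EscrowAgreement, its methods, and the parties named are hypothetical.

    # Purely illustrative sketch (not a real blockchain contract); all names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class EscrowAgreement:
        """Toy self-executing agreement: the code itself embodies the terms."""
        seller: str
        buyer: str
        price: float
        paid: float = 0.0
        delivered: bool = False
        settled: bool = False

        def deposit(self, amount: float) -> None:
            # The buyer's payment is the conduct manifesting acceptance of the coded terms.
            self.paid += amount
            self._try_settle()

        def confirm_delivery(self) -> None:
            self.delivered = True
            self._try_settle()

        def _try_settle(self) -> None:
            # Performance executes automatically once the coded conditions are met;
            # no human signature intervenes at this point.
            if self.paid >= self.price and self.delivered and not self.settled:
                self.settled = True
                print(f"Funds of {self.price} released to {self.seller}.")

    # Example interaction: the only human acts are deploying and using the code.
    agreement = EscrowAgreement(seller="Vendor Ltd", buyer="Buyer Plc", price=100.0)
    agreement.deposit(100.0)
    agreement.confirm_delivery()  # settlement executes automatically

The sketch illustrates the point made above: at the moment of performance no human signature intervenes, so the resulting obligations must be attributed to the persons who deployed or interacted with the code.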

Recent Developments

Nigeria has taken its first step with the Federal Government’s National Artificial Intelligence Policy (2024/2025), recognizing AI as a transformative legal and economic force. This policy promotes ethical AI use, accountability, and innovation but remains mostly aspirational without binding legislation. Additionally, the National Information Technology Development Agency (NITDA) is beginning to establish regulations, showing Nigeria’s commitment to align with global trends. However, the country still lacks specific laws governing AI in contract formation, leaving a regulatory gap.

By contrast, the UK has moved beyond policy to implementation. The UK Law Commission has confirmed the legal enforceability of smart contracts and is actively exploring how digital assets and AI will shape the future of commercial law. Government publications from 2023 set out a regulatory environment encouraging innovation while providing businesses with legal certainty for the use of AI in contracts. The European Union has gone even further by introducing the AI Act (2024), the world’s first comprehensive legal framework addressing AI. It requires transparency, categorizes AI systems by risk, and mandates human oversight. These measures reflect a structured and forward-looking approach, setting a regulatory benchmark that Nigeria is yet to establish.

Way Forward

As Nigeria stands on the verge of the AI revolution, it is imperative that the government and lawmakers seize this moment to enact clear, forward-thinking legislation that not only recognizes AI’s transformative potential but also safeguards commercial certainty and public trust. The existing National Artificial Intelligence Policy lays an important foundation, yet without binding legal frameworks, Nigeria risks lagging behind other jurisdictions.

Taking cues from the UK and EU, Nigeria should legislate specifically to recognize AI-assisted contracts, clarifying that responsibility ultimately lies with the human or corporate agents behind the technology. In doing so, Nigeria can foster confidence in its digital economy, encouraging innovation while ensuring accountability. Furthermore, updating existing laws such as the Evidence Act and the Companies and Allied Matters Act to explicitly accommodate AI and smart contract technologies will modernize Nigeria’s legal infrastructure, preventing costly ambiguities in enforcement.

Consumer protections must also be strengthened through mandated transparency, rigorous bias mitigation, and human review of AI-driven contractual decisions, building on the foundations laid by the Nigeria Data Protection Regulation. Lastly, the establishment of an AI regulatory sandbox would provide a controlled, experimental environment enabling policymakers and stakeholders to understand AI’s real-world impacts on contracting and to develop adaptive, informed regulations accordingly.

Conclusion

In conclusion, this comparative analysis demonstrates that while Nigeria has begun to recognize the significance of artificial intelligence, its current contract law framework is ill-equipped to address AI-mediated agreements. Unlike the United Kingdom’s adaptive common law approach and the European Union’s comprehensive AI regulatory regime, Nigeria lacks clear legal provisions on capacity, consent, liability, and enforceability in AI-driven contracting. To ensure legal certainty and remain competitive in the global digital economy, Nigeria must move from policy aspiration to binding legislation. The future of commercial transactions will increasingly be shaped by autonomous systems, and unless proactive reforms are implemented, Nigeria risks legal ambiguity and diminished economic relevance in the emerging AI-driven contractual landscape.

Reference(S):

Statutes and Regulations

Nigeria

  • Companies and Allied Matters Act 2020 (Nig.).
  • Evidence Act 2011 (Nig.).
  • Nigeria Data Protection Regulation (NDPR), issued by NITDA (2019).
  • National Artificial Intelligence Policy (Fed. Gov’t of Nig. 2024/2025).
  • National Artificial Intelligence Strategy (NITDA 2025).

United Kingdom

  • Electronic Communications Act 2000, c. 7 (UK).
  • Electronic Identification and Trust Services for Electronic Transactions Regulations (UK).

European Union

  • Regulation (EU) 2016/679, General Data Protection Regulation (GDPR).
  • Regulation (EU) No. 910/2014, eIDAS Regulation.
  • Regulation (EU) 2024/1689 of the European Parliament and of the Council on Artificial Intelligence (Artificial Intelligence Act), OJ L 2024/1689.

Case Law

Nigeria

  • Esso W. Afr. Inc. v. Oyegbola, (1969) 1 N.M.L.R. 194 (Nig.).

United Kingdom

  • Thaler v. Comptroller-General of Patents, Designs and Trade Marks, [2021] EWCA Civ 1374 (CA) (UK).
  • Thornton v. Shoe Lane Parking Ltd., [1971] 2 Q.B. 163 (C.A.) (UK).

European Union

  • Case C-40/17, Fashion ID GmbH & Co. KG v. Verbraucherzentrale NRW eV, ECLI:EU:C:2019:629.

Government & Policy Reports

  • Law Commission of England and Wales, Smart Legal Contracts: Advice to Government, Law Com No. 401 (2021).
  • UK Government, AI Regulation White Paper (2023).
  • European Commission, Proposal for a Regulation on Artificial Intelligence, COM (2021) 206 final.

Secondary Sources

  • Anthony Odor & Nkem Odiaka, Engaging with Qualifying Principles in Nigerian Contract Law, in More Constitutional Dimensions of Contract Law: A Comparative Perspective (Springer Int’l Publ’g 2019).

[1] A Odor and N O Odiaka, ‘Engaging with Qualifying Principles in Nigerian Contract Law’ in More Constitutional Dimensions of Contract Law: A Comparative Perspective (Springer International Publishing 2019) 111-128.

[2] Evidence Act, 2011 (Nigeria) (Act No. 18 of 2011)

[3] Temitope Omojugba, ‘Can AI Be Considered a Legal Entity? Examining AI Personhood Under Nigerian Law’ (18 June 2025).

[4] Nigeria Data Protection Regulation 2019 (Regulation No. NITDA/CC/001) (2019).

[5] Federal Republic of Nigeria, National Artificial Intelligence Policy (2024/2025) https://ncair.nitda.gov.ng/wp-content/uploads/2025/09/National-Artificial-Intelligence-Strategy-19092025.pdf accessed 25 October 2025.

[6] Electronic Communications Act 2000 (c 7).

[7] UK Law Commission, ‘Smart Contracts: Consultation Paper’ (2021) Law Com CP No 246.

[8] United Kingdom Government, ‘AI Regulation: A Pro-Innovation Approach – White Paper’ (2 August 2023) https://www.gov.uk/government/publications/ai-regulation-a-pro-innovation-approach accessed 25 October 2025.

[9] Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) OJ L 2024/1689 http://data.europa.eu/eli/reg/2024/1689/oj accessed 25 October 2025.

[10] Esso West Africa Inc. v. Oyegbola, (1969) 1 N.M.L.R. 194.

[11] Law Commission, Smart Legal Contracts: Advice to Government (Law Com No 401, 2021).

[12] Thaler v Comptroller-General of Patents, Designs and Trade Marks [2021] EWCA Civ 1374.

[13] Thornton v Shoe Lane Parking Ltd [1971] 2 QB 163 (CA).

[14] Case C-40/17 Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV [2019] ECLI:EU:C:2019:629.

[15] European Union, Artificial Intelligence Act [2024]
