Authored By: Azeemah Lubnaa Jaulim
The University of Sheffield
Abstract:
Emerging neurotechnologies, from neuromarketing tools to advanced Brain-Computer Interfaces, blur the line between mind and machine. Wearable EEG headsets record brain activity to infer attention or emotional states, while implants such as Neuralink’s aim to translate neural signals into computer commands.[1] These developments raise pressing legal questions: do existing rights protect the last ‘private’ frontier – our thoughts and feelings – from corporate or state intrusion? As Farahany notes, ‘the brain was the last safe space – now privacy’s final frontier is fading.’[2] It is now important to ask: do current laws adequately protect mental privacy and cognitive liberty in the neurotech era?
Keywords: Neuromarketing; Brain-Computer Interfaces; Neurodata; Cognitive Liberty; Human Rights Act 1998; ECHR; GDPR
Introduction: The Neurotechnology Challenge
From neuromarketing to Brain-Computer Interfaces (BCIs), advances in neuroscience and neurotechnology promise unprecedented insights into human thoughts and emotions, but also pose profound challenges to privacy and autonomy. This article examines UK and EU law on emerging neurodata technologies. It begins by defining the key technologies that create neurodata – information collected from the brain or nervous system, which can include brain structure, brain activity, or even emotions and intentions. We explore whether current legal regimes, including the Human Rights Act 1998 (which incorporates the European Convention on Human Rights) and the UK General Data Protection Regulation (GDPR), adequately apply to brain data, cognitive liberty and mental privacy. While these provisions offer partial protection for the freedom to control one’s own mind, they leave significant gaps in relation to inferred and unconscious data. Drawing on academic commentary, we show that neuromarketing and BCIs strain the law’s capacity. We then consider proposals for new rights, such as ‘mental integrity,’ or an emerging consensus around a right to cognitive liberty. In conclusion, we argue that while existing rights cover certain neurotech risks, law reform will eventually be needed to safeguard autonomy in the neural age.
Background
Neuromarketing, defined as marketing based on neuroscience research,[3] utilises EEG, fMRI or eye-tracking to measure unconscious consumer responses and tailor advertising. As the US Government Accountability Office explains, BCIs create direct pathways between the brain and external devices.[4] This enables users to control cursors or prosthetics with neural signals.[5] Both yield neurodata, information from the brain or nervous system,[6] ranging from raw EEG waveforms and blood-flow imaging to inferred mental states such as emotions, intentions or memories.[7] Neurodata’s sensitivity lies in revealing biometric signatures and intimate health or thought patterns. The European Data Protection Supervisor warns that neurotechnologies can infer ‘physical health or fitness, problem-solving, reasoning, decision-making, comprehension, memory retrieval, perception, language, emotions, etc.’, endangering cognitive liberty – one’s right to self-governance of the mind.[8]
UK law, through the Human Rights Act 1998 (HRA), incorporates ECHR rights: Article 8 (respect for private life), Article 9 (freedom of thought, conscience and religion) and Article 10 (expression).[9] EU law adds the Charter of Fundamental Rights Articles 3 (mental integrity) and 8 (data protection), while GDPR Article 9 restricts ‘special category’ personal data, including biometric and health data. However, these regimes predate modern neurotech and address neuromarketing and BCIs incoherently.
Human Rights and Data Protection Framework
ECHR and HRA Protections
In the UK, the HRA incorporates the ECHR. Article 8 ECHR guarantees that ‘everyone has the right to respect for his private and family life.’[10] This covers personal autonomy, identity and data privacy. The term ‘private life’ has been interpreted expansively by the Strasbourg Court in cases such as Pretty v UK, concerning the applicant’s autonomy,[11] and KA v Finland on sexual orientation inference.[12] Thus, a strong argument exists that mental privacy – the inviolability of one’s thoughts and brain processes – falls within Article 8’s protective scope, although this extension has not yet been litigated before the Court. Article 9 ECHR is also applicable: it provides that ‘Everyone has the right to freedom of thought, conscience and religion’[13] and, in its internal dimension, affords an absolute right that shields inner thought from state coercion. In theory, this ensures mental autonomy, but its case law – notably Kokkinakis v Greece, Mikyas and Others v Belgium and Lindholm v Denmark[14] – focuses on belief and religion. Article 9 may arguably shield against forced mind-reading or interrogation by implant, but it is unclear whether it directly tackles corporate or subtle cognitive interference. Article 10 (expression) could analogously protect the dissemination of thoughts but not internal mental processes per se. In sum, the HRA/ECHR provides strong rhetorical support for freedom of thought, but makes no explicit mention of ‘brain data’ or ‘cognitive liberty.’
EU Law: Charter and GDPR
In the EU, the Charter of Fundamental Rights (CFR) recognises related rights: Article 3 confirms ‘the right to integrity of the person,’ with Article 3(1) protecting ‘physical and mental integrity’ as a general right and Article 3(2) requiring ‘the free and informed consent of the person concerned’ in medical contexts.[15] The Charter thus explicitly acknowledges mental integrity. Article 8 CFR guarantees that personal data will be processed fairly and with consent. These provisions reflect EU awareness of neurorights: as Bublitz notes, the Charter’s acknowledgement of mental integrity ‘indicates a broader understanding of human dignity’ appropriate to neurotechnology challenges.[16] However, in practice the Charter applies to EU institutions and to Member States implementing EU law, but not (formally) to the UK or to private actors.
The GDPR (and its UK counterpart) is the central data-protection law. It does not mention ‘neurodata’ explicitly, but its protections are potentially relevant. Personal data is any information relating to an identified or identifiable person. Neurodata, containing unique brain signals, can clearly identify individuals,[17] so it qualifies as personal data. Moreover, much neurodata falls under Article 9’s ‘special categories.’ For instance, brain signals revealing health (a neurological condition, stress levels) or biometric identity (unique EEG patterns) are covered by Article 9(1), which prohibits processing of special categories except under narrow exceptions. The EDPS notes that neural data often involves biometric or health information.[18] So, in principle, the GDPR requires ‘explicit consent’ or another exception for processing EEG or fMRI data. As Ienca and Andorno warn, the GDPR was drafted before consumer neurotechnology and does not expressly mention ‘inferred’ brain data such as decoded thoughts.[19] Highly intimate inferences about cognition might therefore escape existing definitions.
In sum, Articles 8 and 9 ECHR and the GDPR/Charter Article 8 cover key terrain – private life, thought and sensitive data. But mental privacy and cognitive liberty are novel concepts stretching beyond these frameworks. We must ask: can neurodata use evade the letter or spirit of these laws? The following sections analyse this question.
Neuromarketing, Neurodata and Privacy
Neuromarketing uses neuroscience tools to gauge consumers’ unconscious reactions and steer their decisions. By measuring physiological signals such as pupil dilation and neural responses, marketers infer attention, emotion or preference in ways surveys cannot. As Sposini notes, neuromarketing ‘builds more effective marketing techniques to push consumers’ by tapping into biases of which they are not even aware.[20] Yet the GDPR does not explicitly classify such inferred neurodata as a special category, creating a regulatory gap: brain-driven insights essentially peer into mental states that traditional privacy safeguards never envisioned.[21]
From a GDPR perspective, processing neural signals raises key questions. First, firms need a legal basis under Article 6, perhaps consent or legitimate interest, and if the data qualify as ‘special,’ an Article 9 exemption. Imagine a marketing agency fitting volunteers with EEG caps to refine an ad. When linked to an individual login, raw EEG signals become personal data. Without explicit, informed consent, Article 9 would bar this processing. But standard click-through agreements rarely ensure that consent is both fully informed and voluntary, especially given the complexity of neural readings. The EDPS warns that brain data can reveal thoughts ‘that do not translate into actions,’ so users may not understand what they have authorised.[22] Even when consent is obtained, the principles of data minimisation and purpose limitation (Article 5) demand that any neural profiling be strictly necessary and proportionate – a high bar given the intrusiveness of brain data.
Beyond the GDPR, could the ECHR constrain neuromarketing? Article 8 protects ‘private life,’ which courts have extended to personal data, as seen in S and Marper v UK concerning DNA retention.[23] It seems plausible that thoughts and neural signatures form part of private life, yet no judgment has explicitly recognised brain data under Article 8. Article 9 guarantees freedom of thought but has traditionally applied to belief systems rather than consumer inclinations. While companies inferring emotional reactions may not violate absolute protections, the broader concern is autonomy, or a right not to be psychologically manipulated – an area the ECHR has not fully explored.
Neuromarketing exploits vulnerabilities beyond conventional profiling. It blends ‘physical and emotional reactions to sensory stimuli’ into psychological engineering. Lynskey argues that data protection frameworks must evolve to address such novel uses.[24] In fact, the EU AI Act (2024) flags high‑risk systems that manipulate cognition, though it stops short of banning neuromarketing. Still, unauthorised collection or breaches of neural data could trigger severe GDPR liabilities.
Practical examples abound: Dutch firm NordicBrain uses EEG for ad testing; Amazon’s Lab126 has investigated EEG headsets for shopping feedback;[25] consumer devices like Muse or Emotiv headbands, marketed for wellness, could easily repurpose their data streams for targeted ads. Combining web behaviour with EEG-inferred ‘interest levels’ would far outstrip cookie-based profiling in invasiveness. Although EU and UK law would treat this as special-data processing, enforcement remains largely untested.
In sum, neuromarketing lays bare the limits of current privacy law. While the GDPR provides essential guardrails, covering personal data identification, consent, and security, it was never designed for subconscious neuroinferences.[26] Since neurodata is surrendered ‘without explicit communication,’[27] these novel insights pose acute risks to cognitive autonomy. Many commentators agree: the GDPR is a critical starting point, but it ‘cannot be spared criticism’ when applied to the uncharted territory of brain-based marketing.[28]
Brain-Computer Interfaces and Cognitive Liberty
BCIs extend neurotechnology beyond marketing into direct interaction with the brain, either by reading neural signals to control devices (Category 2 in EDPS terminology) or by delivering feedback to stimulate brain activity (Category 3, closed‑loop neurofeedback).[29] Devices range from Neuralink’s high‑resolution implants, offering fMRI‑like precision, to consumer‑grade EEG/fNIRS headsets and gaming peripherals that translate thought into action. As the US Government Accountability Office observes, BCIs ‘could help people with paralysis regain control of their limbs… or augment human capabilities by allowing people to control machinery using their thoughts.’[30] Yet this immediacy also allows BCIs to monitor cognitive states in real time – detecting drowsiness for safety or measuring which virtual stimuli elicit pleasure, potentially for commercial exploitation.[31]
Legally, BCIs straddle medical-device regulation and data-protection law. Clinical BCIs generally fall under healthcare exceptions – patient consent and confidentiality rules apply.[32] But consumer-oriented BCIs for gaming, productivity or social media escape these safeguards. If a headset logs your brainwaves to boost gameplay, the resulting data qualifies as personal and often ‘special category’ under the GDPR. Article 9(2)(a) mandates ‘explicit and specific’ consent for processing cognitive data, yet continuous or stealth data collection undermines truly informed consent. The ICO warns that neuro-sensors generate information users ‘cannot even perceive,’ making meaningful consent improbable.[33]
Beyond data protection, BCIs pose autonomy and integrity concerns. Category 3 devices that stimulate neural activity risk altering mood or attention[34] – for example, an AI-enhanced VR game that subtly modulates users’ focus to increase engagement. While Article 8 of the ECHR protects physical integrity, no case law explicitly addresses brain interventions; Article 9 safeguards freedom of thought but has been narrowly applied to belief systems. In EU law, Article 3 of the CFR enshrines ‘mental integrity,’ requiring ‘free and informed consent’ for any brain intervention.[35] This suggests that non-therapeutic BCIs must respect mental autonomy as strictly as medical implants.
Many argue for enshrining ‘cognitive liberty’ – the right to control one’s mental processes – as an explicit legal principle.[36] Farahany champions a formal ‘right to cognitive liberty’ encompassing freedom of thought and mental privacy[37] while Balkin and Lawrence debate whether existing rights, like the US First Amendment, can adapt to these challenges.[38] In Europe, one could view cognitive liberty as implicit within the penumbras of Articles 8 and 9, yet novel neurotechnologies demand more tailored protection.
Real-world examples underscore the urgency. Neuralink has implanted a 1024-electrode chip in a monkey’s brain to control a computer; human trials are pending. Meta’s research into non-invasive BCIs has produced patents for wristbands and AR glasses that capture subvocal neural signals, and its ‘telepathic typing’ prototype interprets intended words in real time. Start-ups like NextMind offer hands-free app control via EEG, while DARPA funds military BCIs for drone operation and enhanced cognition. When a device decodes unspoken words, is that expression or pure thought? If a headset stimulates the brain to heighten focus, is that therapy or covert behavioural control?
BCIs reveal a stark mismatch between technological capability and legal frameworks. Although the GDPR and human‑rights law offer starting points – treating neural outputs as data and persons as rights‑bearers – they lack dedicated remedies for the nuanced harms of mental manipulation. Without explicit rules safeguarding neural privacy and cognitive autonomy, companies and states could engineer cognitive effects with little oversight – an outcome at odds with fundamental notions of self‑determination.
Limits of Consent under GDPR
The GDPR’s consent requirement for ‘special’ data (Article 9(2)(a)) seems to cover neural signals – but in practice, it falls short.
- Informed Consent Is Elusive. Effective consent demands a clear understanding of what is collected and its risks. Yet neurodata are involuntary and deeply revealing: they may encode fleeting thoughts or hidden beliefs. As the European Data Protection Supervisor observes, users ‘have no direct control over the information disclosed due to the intrinsic and involuntary nature of neurodata.’[39] Even a meticulous privacy notice may fail to convey that ‘reading brainwaves’ can unearth unconscious mental states.[40] Vulnerable groups – children or those with cognitive impairments – pose a greater challenge still, since neuroprocessing is even harder to explain to them.
- Granularity vs Innovation. GDPR requires consent to be specific and granular. A single broad consent for all future neural uses is invalid, yet seeking fresh consent for every new feature is impractical. Controllers must outline ‘deep insights’ neurodata can yield – insights unattainable by other means – yet cannot foresee every inference at design time.[41] This tension risks invalidating overly general consent while stifling technological progress.
- Coercion Through Product Access. Although the GDPR allows consent to be refused or withdrawn, opting out of neurodata collection often means losing access to the product entirely – for example, a BCI headset that will not function without brain calibration. Such ‘take-it-or-leave-it’ setups undermine genuine voluntariness, akin to forcing consent in exchange for a free service – an approach the GDPR explicitly rejects.
- Purpose Limitation and Necessity. Even with valid consent, controllers must adhere to data minimisation and strict purpose limitation, as highlighted by the EDPS.[42] Given brain data’s intrusiveness, they must justify why simpler sensors (eye-tracking, heart-rate monitors) or questionnaires are insufficient. Clinical uses (for example, epilepsy diagnosis) may meet this test; marketing applications likely will not.
In sum, while GDPR’s consent framework nominally applies to neurotech, ordinary click‑through models are ill‑suited to protect the most intimate aspects of mental life.[43] Balkin argues that beyond bans, we need proactive protections – perhaps a standalone right to cognitive liberty – rather than reliance on inadequate consent alone.[44]
Adequacy of Current Protections
UK and EU law offer a mosaic of safeguards for neural technologies, but important gaps persist. On the plus side, Article 8 ECHR and domestic privacy rules protect personal autonomy, while the GDPR and UK-GDPR impose robust requirements on personal data – consent, purpose limitation, security, and data-subject rights all apply when neurodata are linked to individuals.[45]
Moreover, new regulatory initiatives – the Digital Services Act, AI Act, and guidance from the EDPS and ICO – signal growing awareness of neurotech risks.[46]
Yet these regimes were not designed for our inner selves. As consumer-policy analysis observes, the GDPR and the AI Act were ‘designed for traditional contexts’ and struggle with AI-driven neuromarketing.[47] Inferred or hidden mental data often evade clear classification: pupil-dilation tracking is not health data yet reveals stress or interest; emotion-recognition algorithms produce cognitive inferences not explicitly listed as special-category data. Anonymous processing may slip beneath the GDPR’s ambit, and non-biometric cognitive profiling remains ambiguously regulated, forcing regulators to stretch existing rules to cover novel harms.
No case law directly addresses mental privacy. While S and Marper v UK[48] held that blanket DNA retention violated privacy, suggesting that brain-wave profiles could likewise intrude, courts have not yet recognised brain data under Article 8. The margin of appreciation doctrine gives states leeway in new domains; perhaps the ECtHR will ultimately affirm mental autonomy as core to ‘private life,’ but until then protection remains theoretical.
Enforcement is nascent. The EDPS’s June 2024 TechDispatch[49] urges that neurodata be processed only under the highest justifications – such as medical necessity – and the ICO cautions against neuro-mining, but these remain guidelines.[50] Industry often outpaces law: Spain’s AEPD and the EDPS jointly flagged neural data as high-risk in 2024, yet the UK Supreme Court has never ruled on neurotech. In the US LabCorp litigation,[51] DNA from discarded items was deemed non-private – if courts treat EEG skin-surface data similarly, mental privacy could be left undefended.
Academics are sceptical. Lynskey argues that data protection was not built for unforeseeable technologies;[52] Farahany insists that freedom of thought cannot be assumed safe once machines can read brains.[53] Some propose new rights – cognitive liberty, mental privacy or mental integrity – beyond existing privacy or expression guarantees.[54]
In sum, current protections form only a partial safety net: they touch some neurotech harms, but neither GDPR nor human-rights law was conceived for cognitive autonomy, and fresh legal measures may be essential.
Proposals for New Rights and Reform
Given the limits identified, reform is necessary. Proposals range from modest legal clarifications to the enactment of new fundamental rights. One idea is to label certain neurodata uses as inherently disallowed (akin to a rights-based objection to subliminal persuasion). For instance, the EDPS and others suggest banning brain fingerprinting outside healthcare.[55] The EU Charter already enshrines mental integrity, but its enforcement could be bolstered by sector-specific rules, such as medical device law and AI law. The GDPR could be amended explicitly to list ‘neuronal or brain data’ as a special category, removing any ambiguity. Some have even floated a ‘right to cognitive liberty’ in constitutional law. Farahany, for example, proposes recognising the right to control one’s own brain and to be free from non-consensual mental interference.[56] In Europe, advocates have suggested an Article 3 bis CFR covering ‘personal identity, free will and mental privacy,’ modelled on the 2021 Chilean constitutional amendment.[57]
Such reforms face challenges. Introducing a new human right is no small matter: ECHR amendments require broad consensus among Member States. Critics also argue that existing rights already cover what matters. For example, as the European Parliament report notes, Article 3 CFR already includes mental integrity,[58] so a separate ‘right to mental integrity’ might be redundant. Similarly, the Council of Europe’s Committee on Bioethics has observed that personal autonomy and privacy provide a foundation for neurorights without rewriting treaties. However, the novelty of neural data may warrant explicit mention in binding instruments. The Council of Europe’s strategic plan (2020-25) emphasises embedding rights in emerging biotechnologies, hinting at possible new protocols.[59]
Beyond rights, regulation is crucial. The EU’s Artificial Intelligence Act (2024) could classify certain neuro-AI as high-risk (for example, emotion-detection systems used in marketing or employment).[60] Digital services rules might force transparency from platforms using neurotech. Data-protection enforcement also needs escalation: regulators might issue binding guidance that any BCI used for marketing must meet the strictest standards, or even treat such use as high-risk processing requiring a Data Protection Impact Assessment under GDPR Article 35. The UK, post-Brexit, might follow suit in its Data Protection Act or new online safety laws – a mooted ‘Online Safety Act 2.0’ could extend to neurotech content.
Rights-based mechanisms beyond privacy are also suggested. For instance, a right to mental privacy could enshrine that one’s brain states are inviolate without consent – analogous to bodily privacy. A right to cognitive liberty could combine freedom of thought and autonomy: for example, blocking subliminal influences or requiring individuals’ permission before AI systems profile their unconscious preferences. It can be countered that too many new rights risk complicating the law without clear benefit. But the speed of neurotech development – as the Friends of Europe piece notes – may require proactive legal foresight.[61] Without fresh safeguards, markets or states could come to dominate minds, even unintentionally.
Conclusion
Neurotechnology has moved from science fiction to everyday reality: EEG headbands and BCIs, powered by AI, now probe our innermost thoughts.[62] Although Article 8 ECHR and the GDPR extend core safeguards to brain data, inferred neural signals occupy a legal grey zone. Short-term reform must classify all neural outputs as special-category personal data and deem cognitive profiling high-risk AI. Longer term, Europe should enshrine mental privacy in the Charter or new directives, and the UK should update its Data Protection Act or human-rights law. Four steps – legislative clarity, binding neurodevice guidance, ethics and education, and international neurorights dialogue – would help secure cognitive liberty and human dignity.
Reference(S):
[1] Elon Musk, Neuralink Progress Update (2020) https://neuralink.com accessed 3 July 2025.
[2] Nita A Farahany, The Battle for Your Brain (St. Martin’s Press 2023) 11.
[3] Ale Smidts and others, ‘Advancing Consumer Neuroscience’ (2014) 55(4) Journal of Marketing Research 427.
[4] Rafael Yuste and others, ‘Four Ethical Priorities for Neurotechnologies and AI’ (2017) 551 Nature 159.
[5] United States Government Accountability Office, Science & Tech Spotlight: Brain–Computer Interfaces (2022) https://www.gao.gov/products/gao-22-106118 accessed 3 July 2025.
[6] EDPS, ‘Opinion 13/2022 on the European Commission’s Proposal on AI Regulation’ (2022) 4 https://edps.europa.eu accessed 3 July 2025.
[7] EDPS, ‘TechDispatch: Neurotechnology’ (June 2024) 3 https://edps.europa.eu accessed 3 July 2025.
[8] Marcello Ienca and Roberto Andorno, ‘Towards New Human Rights in the Age of Neuroscience and Neurotechnology’ (2017) 13 Life Sciences, Society and Policy 5.
[9] Human Rights Act 1998, s 1 and Sch 1, art 8; Convention for the Protection of Human Rights and Fundamental Freedoms, art 8,9.
[10] Human Rights Act 1998, Sch 1, Pt I, Art 8, European Convention on Human Rights (ECHR), Article 8.
[11] Pretty v United Kingdom (2002) 35 EHRR 1 (ECHR).
[12] KA v Finland (2003) 37 EHRR 38 (ECHR).
[13] European Convention on Human Rights (ECHR), Article 9.
[14] Kokkinakis v Greece App no 14307/88 (ECtHR, 25 May 1993); Mikyas and Others v Belgium App no 50681/20 (ECtHR, 16 May 2024); Lindholm and the Estate after Leif Lindholm v Denmark App no 25636/22 (ECtHR, 5 November 2024).
[15] Charter of Fundamental Rights of the European Union [2012] OJ C 326/391, art 3(1).
[16] J C Bublitz, ‘My Mind is Mine!? Cognitive Liberty as a Legal Concept’ in J C Bublitz and R Merkel (eds), Cognitive Enhancement: Ethical and Policy Implications in International Perspectives (Oxford University Press 2014).
[17] European Data Protection Supervisor, ‘TechDispatch #1/2024 – Neurodata’ (3 June 2024).
[18] Ibid.
[19] M Ienca and R Andorno, ‘Towards New Human Rights in the Age of Neuroscience and Neurotechnology’ (2017) 13 Life Sciences, Society and Policy 5.
[20] L Sposini, ‘Impact of New Technologies on Economic Behavior and Consumer Freedom of Choice: from Neuromarketing to Neuro-Rights’ (2023) Journal of Digital Technologies and Law <https://www.lawjournal.digital/jour/article/view/375> accessed 11 July 2025.
[21] Research Brief, ‘Neurodata: Navigating GDPR and AI Act Compliance in the Context of Neurotechnology’ (Geneva Academy, 2024) 6.
[22] Ibid [17].
[23] S and Marper v United Kingdom App nos 30562/04 and 30566/04 (ECtHR, 4 December 2008).
[24] O Lynskey, ‘Complete and Effective Data Protection’ (2023) 76 Current Legal Problems 297.
[25] Ale Smidts and others, ‘Measuring Neural Arousal for Advertisements and Its Relationship with Advertising Success’ (2020) 14 Frontiers in Neuroscience 736.
[26] Ibid [21].
[27] Ibid [19].
[28] M Ienca, ‘Mental data protection and the GDPR’ (2022) 9(1) Journal of Law and the Biosciences lsac006.
[29] Ibid [17].
[30] US Government Accountability Office, ‘Brain-Computer Interfaces: Applications, Challenges, and Policy Options’ (GAO-23-106118, 2023).
[31] Z Mohy-Ud-Din et al, ‘Pleasure detection by online brain–computer interface’ (2011) 25(3) Measurement 235; C-T Lin et al, ‘A Real-Time Wireless Brain–Computer Interface System for Drowsiness Detection’ (2013) 20(4) IEEE Transactions on Biomedical Circuits and Systems 1; S Woo et al, ‘An Open Source-Based BCI Application for Virtual World Tour’ (2021) 15(7) Frontiers in Neuroscience 1.
[32] Health Care (Consent) and Care Facility (Admission) Act, RSBC 1996, c 181, s 5; Health and Care Professions Council, ‘Consent and confidentiality’ (2024).
[33] Information Commissioner’s Office, ‘Tech Futures: Neurotechnology’ (2023).
[34] Ibid [17].
[35] Ibid [15].
[36] J C Bublitz and R Merkel, ‘Cognitive Liberty or the International Human Right to Freedom of Thought’ (2014) 13(1) Surveillance & Society 43.
[37] N Farahany, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology (St. Martin’s Press 2023).
[38] J M Balkin, ‘Digital Speech and Democratic Culture: A Theory of Freedom of Expression for the Information Society’ (2004) 79 NYU Law Review 1; G Lawrence, ‘Freedom of Thought in the Age of Neuroscience’ (2012) 101 Georgetown Law Journal 593.
[39] Ibid [17].
[40] S Rainey, ‘Brain Recording, Mind-Reading, and Neurotechnology’ (2020) 7(2) Neuroethics 1, 4.
[41] Ibid [15].
[42] Ibid [17].
[43] Ibid [15]; [28]; [40].
[44] J M Balkin, ‘The Three Laws of Robotics in the Age of Big Data’ (2015) 78(1) Ohio State Law Journal 121.
[45] Regulation (EU) 2022/2065 (Digital Services Act) [2022] OJ L277/1; Regulation (EU) 2024/1689 (AI Act) [2024] OJ L168/1.
[46] Ibid [17].
[47] A Pirozzoli, ‘The Human-centric Perspective in the Regulation of Artificial Intelligence: A New Challenge for the European Union’ (2024) European Papers – European Forum.
[48] Ibid [23].
[49] Ibid [17].
[50] Ibid [33].
[51] Laboratory Corporation of America Holdings, dba Labcorp v Luke Davis et al, No 24-304 (US Supreme Court, 5 June 2025).
[52] O Lynskey, ‘Complete and Effective Data Protection’ (2023) 76 Current Legal Problems 297.
[53] Ibid [2].
[54] Ibid [36].
[55] Ibid [17]; Hong Yang and Li Jiang, ‘Regulating neural data processing in the age of BCIs: Ethical concerns and legal approaches’ (2025) PMC11951885.
[56] Ibid [2].
[57] European Parliament, Directorate-General for Parliamentary Research Services, The protection of mental privacy in the area of neuroscience (2024) EPRS STU(2024)757807 18; Neurorights Foundation, ‘Chile’ (2021); Lorena Guzmán H, ‘Chile: Pioneering the protection of neurorights’ (UNESCO Courier, 2021).
[58] European Parliament, Directorate-General for Parliamentary Research Services, The protection of mental privacy in the area of neuroscience (2024) EPRS_STU(2024)757807 18.
[59] Council of Europe, ‘Strategic Action Plan on Human Rights and Technologies in Biomedicine (2020–2025)’ (2020).
[60] Regulation (EU) 2024/1689 (AI Act) [2024] OJ L168/1.
[61] Centre for Future Generations and the Institute of Neuroethics, ‘Towards Inclusive EU Governance of Neurotechnologies’ (2024).
[62] Ibid [2].





