Authored By: U. Hajara Aashika
Government Law College, Karaikudi
INTRODUCTION:
The digital revolution has shifted the paradigm of identification from “what you know” (passwords) to “who you are” (biometrics). Biometric data, encompassing fingerprints, iris scans, and facial geometry, is unique, permanent, and inherently linked to an individual’s physical identity. However, unlike a compromised password, a compromised biometric template cannot be “reset,” making its protection a matter of extreme constitutional urgency.
In India, the journey toward biometric privacy has been defined by the landmark Supreme Court ruling in K.S. Puttaswamy v. Union of India (2017), which elevated privacy to a fundamental right under Article 21. Despite this, the rapid integration of biometrics into daily life, from Aadhaar-linked payments to facial recognition at airports, has outpaced traditional legal safeguards. With the notification of the Digital Personal Data Protection (DPDP) Rules in late 2025, India has entered a new era of data sovereignty. This article critically examines the adequacy of this new framework, exploring whether it provides a robust shield for our most intimate data or merely a procedural facade in an age of escalating digital surveillance.
The Biometric Identity:
In the modern legal lexicon, biometric data refers to the measurement and statistical analysis of an individual’s unique physical and behavioral characteristics. Unlike traditional identifiers such as passwords, PINs, or physical ID cards, which are external to the person, biometric data is intrinsic to the human body. Under the broad umbrella of Privacy Law, biometrics are typically categorized into two types:
- Physiological Biometrics: These include fingerprints, iris patterns, retina scans, facial geometry, and DNA profiles.
- Behavioral Biometrics: These include voiceprints, gait patterns (the way a person walks), and even keystroke dynamics (the rhythm with which a person types).
The “Immutability” Factor:
The primary legal concern regarding biometric data is its immutability. If a bank password or a credit card number is compromised in a data breach, the user can simply change the password or cancel the card. However, a person cannot “reset” their fingerprints or “change” their facial structure. Once a biometric template is digitized and leaked, that individual’s identity is potentially compromised for the rest of their life. This creates a permanent risk of identity theft and unauthorized profiling that traditional data does not carry.
From Biology to Binary:
It is important to understand that most modern systems do not store an actual “photo” of a fingerprint. Instead, they use algorithms to convert physical features into a mathematical string called a biometric template. The legal risk arises when these templates are stored in centralized databases. If a “Data Fiduciary” (the company or government agency) fails to use end-to-end encryption, hackers can intercept these templates. In the context of the 2026 digital landscape, where biometrics are used for everything from unlocking a smartphone to authorizing a high-value UPI transaction, the stakes of a leak are no longer just about privacy; they are about financial and physical security.
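The distinction between storing a raw template and storing a protected derivative can be sketched in a few lines of Python. This is a deliberately simplified illustration, not a description of how any real system (such as Aadhaar’s central repository) actually works: real biometric matching is fuzzy, since two scans of the same finger never yield identical readings, and production systems rely on specialised constructions such as fuzzy extractors. The exact-match model below merely shows why a database that holds only salted one-way hashes is far less dangerous to leak than one holding raw templates.

```python
import hashlib
import secrets

def make_template(features):
    """Serialise a hypothetical feature vector (e.g. fingerprint minutiae
    coordinates) into a compact byte-string "template"."""
    return b",".join(f"{x:.4f}".encode() for x in features)

def enroll(template):
    """Store a random salt and a SHA-256 digest, never the raw template.
    If this record leaks, the underlying features cannot be recovered."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + template).hexdigest()
    return salt, digest

def verify(candidate, salt, stored_digest):
    """Re-derive the digest from a fresh scan and compare."""
    return hashlib.sha256(salt + candidate).hexdigest() == stored_digest

# Stand-in values, not real biometric data.
features = [0.1234, 0.9876, 0.5555]
salt, digest = enroll(make_template(features))
print(verify(make_template(features), salt, digest))         # True
print(verify(make_template([0.0, 0.0, 0.0]), salt, digest))  # False
```

The design point for the legal analysis is that the choice between these two storage models is precisely what obligations such as “reasonable security safeguards” under the DPDP Act leave to the Data Fiduciary’s discretion.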
The Risk of “Function Creep”:
A significant inherent risk in biometric collection is “Function Creep.” This occurs when biometric data collected for one specific, legitimate purpose is later used for a different, unauthorized purpose. Without strict “Purpose Limitation” clauses in the law, the transition from a “service-oriented” biometric check to a “surveillance-oriented” system becomes almost invisible to the average citizen.
Constitutional Foundations:
The legal discourse on biometric privacy in India cannot exist without the foundational bedrock of the 2017 Supreme Court judgment in K.S. Puttaswamy v. Union of India. Before this landmark ruling, the right to privacy was a fractured concept, often debated but not explicitly guaranteed. The nine-judge bench unanimously declared that privacy is an intrinsic part of the Right to Life and Personal Liberty under Article 21 of the Constitution of India.
The “Triple Test” for Biometric Collection:
For any state-led collection of biometric data, such as Aadhaar, facial recognition for law enforcement, or digital health IDs, to be constitutionally valid, it must satisfy the three-pronged test established in the Puttaswamy judgment:
- Legality: The collection must be backed by an existing law passed by Parliament.
- Legitimate Aim: The state must demonstrate a clear and necessary objective, such as the prevention of crime or the distribution of social welfare benefits.
- Proportionality: This is the most critical element for biometrics. The state must prove that there is a rational nexus between the data collected and the goal achieved, and that it is the “least intrusive” method available.
Informational Self-Determination:
A key sub-doctrine emerging from these cases is “Informational Self-Determination”. This principle asserts that an individual should maintain control over their data, including the right to know who has access to it and the ability to correct or delete it. When applied to biometrics, this suggests that once a citizen provides data for a specific purpose, the entity loses the right to use that data for any other purpose without fresh, explicit consent.
However, under Section 7 of the DPDP Act, the state often bypasses this by citing “Certain Legitimate Uses,” effectively suspending the Data Principal’s right to erasure if the state deems the data necessary for a “specified purpose” related to a subsidy. This creates a tension between constitutional philosophy and statutory practice that remains a significant flaw in the 2023 framework.
The 2026 Context: Post-Puttaswamy Challenges:
Moving into 2026, the challenge lies in the “normalization” of biometric surveillance. While the judiciary has set high standards, the executive branch often invokes “administrative necessity” to bypass the spirit of the Puttaswamy ruling. For example, the mandatory use of facial recognition at airports (DigiYatra) or for registering for competitive exams often presents a “choice” that is not truly a choice, as opting out results in significant disadvantage. The conclusion is difficult to avoid: while the Constitution protects the body, regulatory practice is increasingly treating the body as a public database.
The DPDP Act, 2023:
While the Puttaswamy judgment provided the constitutional philosophy, the Digital Personal Data Protection (DPDP) Act, 2023, provides the statutory machinery. In the context of 2026, this Act is the primary lens through which all biometric processing in India is regulated. However, a critical legal analysis reveals that while the Act strengthens some areas, it leaves significant gaps regarding the unique nature of body-based data.
The Classification of Biometrics:
Unlike the European Union’s GDPR, which classifies biometric data as a “Special Category” requiring heightened protection, the Indian DPDP Act treats it under the broader definition of “Personal Data.” However, under the 2025 DPDP Rules, entities handling large volumes of biometrics are often classified as Significant Data Fiduciaries (SDFs). This classification mandates them to appoint a Data Protection Officer (DPO) and conduct regular Data Protection Impact Assessments (DPIAs).
The Mechanism of “Notice and Consent”:
The Act is built on the pillar of “Consent.” For a company to scan your face or fingerprint, it must provide a Notice that is clear and available in multiple languages. The consent must be Free, Specific, Informed, and Unconditional.
In practice, “Consent” is often a legal fiction. If an employer makes biometric attendance mandatory for salary processing, or a bank requires a thumbprint for an account, the “Data Principal” (the individual) is in a position of weak bargaining power. The legal question for 2026 is whether consent given under the pressure of losing a livelihood can truly be considered “free.”
The Liability Gap:
A major point of analysis is the penalty structure. While the Act imposes heavy fines for data breaches, these fines go to the Consolidated Fund of India, not to the victim. For someone whose permanent biometric identity has been compromised, the lack of direct compensation remains a significant flaw in the 2023 framework.
The “Legitimate Use” Exception:
While the DPDP Act is built on the foundation of consent, Section 7 introduces the concept of “Certain Legitimate Uses.” This section is arguably the most controversial aspect of the Act, especially when applied to biometric data. It allows “Data Fiduciaries” to process personal data without obtaining the explicit consent of the Data Principal under specific circumstances.
Broad Government Powers:
Under Section 7, the State can process biometric data for providing subsidies, benefits, services, certificates, licenses, or permits. In the 2026 legal landscape, where biometric authentication is increasingly mandatory for accessing basic government services, this creates a significant loophole. If a citizen must provide a thumbprint or iris scan to receive food rations or a pension, and the law allows the State to process that data without formal “consent” under the guise of “service delivery,” the constitutional right to privacy is put at risk.
The Risk of “Function Creep” and Surveillance:
The primary legal concern here is “Function Creep”: the use of data for a purpose other than the one for which it was originally collected. For example, biometric data collected for a “Digital India” scholarship program could technically be accessed by law enforcement agencies under the broad exemptions provided for “maintenance of public order” or “national security.”
Technological Frontiers:
As we navigate through 2026, the intersection of Artificial Intelligence (AI) and biometric data has created a new frontier for legal scrutiny. The most prominent application of this technology is Automated Facial Recognition Systems (AFRS), which are being rapidly deployed in Indian airports (via DigiYatra), railway stations, and by law enforcement agencies for “predictive policing.”
These deployments raise two distinct concerns, which the sections below examine in detail. The first is “Function Creep”: facial geometry collected for “seamless travel” at an airport could technically be cross-referenced with criminal databases or used for commercial profiling by third-party vendors, and the DPDP Act’s mandate of Purpose Limitation is not backed by strict technological barriers. The second is passive collection: unlike fingerprints, which require a “positive act” on a scanner, facial recognition and gait analysis can be performed from a distance without the individual ever knowing they are being “read,” hollowing out the Act’s framework of “Notice and Consent.”
A Global Comparison: India’s DPDP Act vs. the European Union’s GDPR
In the realm of international data protection, the European Union’s General Data Protection Regulation (GDPR) is widely considered the “Gold Standard.”
Classification of Data: “Sensitive” vs. “General”
Under Article 9 of the GDPR, biometric data is classified as a “Special Category of Personal Data.” The processing of such data is generally prohibited unless the controller can meet very strict conditions, such as “explicit consent” or “substantial public interest.” In contrast, the Indian DPDP Act does not create a separate statutory category for “Sensitive Personal Data.” While the 2025 Rules allow the government to designate certain entities as Significant Data Fiduciaries based on the volume of sensitive data they handle, the Act itself treats biometrics under the same broad umbrella as a person’s name or email address. This lack of a “high-risk” classification is a point of significant academic debate.
The Right to be Forgotten and Erasure:
The GDPR provides a robust “Right to Erasure” (Article 17), which is particularly potent for biometrics. If a biometric template is no longer necessary for its original purpose, the user has a strong legal claim to have it deleted. The Indian DPDP Act also provides a Right to Correction and Erasure (Section 12); however, this right is subject to several “Legitimate Use” exemptions that do not exist in the same capacity in Europe. For instance, if the Indian state deems biometric data necessary for a “specified purpose” related to a subsidy, the Right to Erasure can be effectively suspended.
Data Protection Impact Assessments (DPIA):
The GDPR mandates a DPIA for any processing that is “likely to result in a high risk to the rights and freedoms of natural persons.” This almost always includes biometric processing. In India, the requirement for a DPIA is not universal; it is currently reserved for Significant Data Fiduciaries. This means smaller startups or private security firms in India might deploy biometric surveillance without the mandatory rigorous risk assessment required of their European counterparts.
The Challenge of Passive Surveillance: Facial Recognition and Gait Analysis
In the realm of biometric data, a critical legal distinction exists between “active” and “passive” collection methods. Traditional biometrics, such as fingerprints, typically require a “positive act” or a voluntary gesture from the individual, such as placing a finger on a scanner. This physical interaction provides a clear moment of “Notice and Consent” as envisioned by the Digital Personal Data Protection (DPDP) Act, 2023.
However, the 2026 technological landscape is increasingly dominated by passive systems, most notably Automated Facial Recognition Systems (AFRS) and gait analysis. These technologies can capture an individual’s unique physical and behavioral characteristics from a distance, in real-time, and without the person ever being aware that they are being “read”.
The Erosion of “Informed Consent”
Under the DPDP Act, consent must be Free, Specific, Informed, and Unconditional. In a public space, such as a government building, a railway station, or a mall, where cameras are scanning faces for “security” or “analytics,” the requirement for a “clear notice” is often reduced to a small sticker or a generic digital sign. This fails the rigorous test of Informed Consent established in the Puttaswamy judgment. If a citizen cannot realistically walk through a public area without being scanned, the “choice” to opt out is effectively non-existent, rendering the consent involuntary.
DigiYatra and the Illusion of Choice
A prominent example of this tension is the DigiYatra initiative at Indian airports. While framed as a tool for “seamless travel,” it utilizes facial geometry to replace traditional boarding passes. The legal concern arises when opting out of such a system results in significant delays or disadvantages, leaving the Data Principal with weak bargaining power. The critical question is whether a “choice” made under the pressure of administrative necessity can ever be considered “free” under the Section 7 framework.
The Invisible Threat of Function Creep
Passive surveillance is particularly susceptible to “Function Creep.” Facial geometry collected for an airport check-in could technically be cross-referenced with criminal databases for “predictive policing” or utilized by third-party vendors for commercial profiling. While the DPDP Act mandates Purpose Limitation, the lack of strict technological barriers or a “Sensitive Data” classification for biometrics makes such transitions almost invisible to the average citizen.
The Redressal Gap: Penalties vs. Compensation
A major point of analysis for the 2026 legal landscape is the penalty structure under the DPDP Act. While the Act imposes heavy financial penalties on Data Fiduciaries for data breaches, these fines are deposited into the Consolidated Fund of India rather than being used to compensate the affected individuals.
For a Data Principal whose permanent biometric identity has been compromised, the lack of a direct compensation mechanism is a significant flaw. Unlike a financial password, a leaked biometric template represents a lifelong risk of identity theft that cannot be “cured” by a state-collected fine. This section concludes that the 2023 framework focuses more on punishing the fiduciary than restoring the victim, a balance that may need to be reconsidered as biometric leaks become more frequent.
Conclusion:
The transition from traditional identification methods to biometric systems is an irreversible trend in India’s digital evolution. As this article has explored, while biometrics offer unparalleled convenience and security, they also introduce a unique set of legal risks primarily due to their immutability and the potential for invisible, large-scale surveillance.
The Digital Personal Data Protection Act, 2023, along with its 2025/2026 Rules, marks a significant milestone in India’s journey toward a regulated data economy. By establishing the roles of Data Fiduciaries and granting rights to Data Principals, the Act provides a much-needed statutory skeleton. However, as analyzed in the preceding sections, the “Legitimate Use” exemptions under Section 7 and the lack of a specific “Sensitive Data” classification for biometrics remain areas of concern that could undermine the constitutional spirit of the Puttaswamy judgment. In conclusion, the human body should not be treated as a mere set of data points for administrative ease. As we move further into the digital age, the law must act as a vigilant guardian, ensuring that the “Body as a Password” does not lead to the “Body as a Tool of State Control.” Only by balancing technological utility with robust, human-centric legal safeguards can India truly achieve the vision of a secure and private Digital India.
Bibliography:
- The Constitution of India, 1950.
- The Digital Personal Data Protection Act, 2023 (India).
- The Digital Personal Data Protection Rules, 2025 (India).
- General Data Protection Regulation (GDPR), Regulation (EU) 2016/679.
- The Information Technology Act, 2000 (India).
Case Laws:
- K.S. Puttaswamy v. Union of India, (2017) 10 SCC 1.
- Justice K.S. Puttaswamy (Retd.) v. Union of India, (2019) 1 SCC 1 (Aadhaar Judgment).
Reports & Journals:
- Justice B.N. Srikrishna Committee, A Free and Fair Digital Economy: Protecting Privacy, Empowering Indians (2018).
- Law Commission of India, Report No. 271 on Human DNA Profiling (2017).





