Authored By: Ankita Vijay
Faculty of Law, Banaras Hindu University
ABSTRACT
Virtual try-on tools deployed by fashion e-commerce platforms collect sensitive biometric data, including facial scans and body measurements, raising significant privacy concerns under Indian law. This article examines whether such data collection violates consumer privacy rights and whether the Digital Personal Data Protection Act, 2023 provides adequate legal safeguards. The analysis proceeds across three dimensions: the classification of biometric and body data under existing statutes, the adequacy of consent mechanisms employed by major platforms, and the identification of regulatory gaps in the current legal framework. Drawing on the landmark Puttaswamy judgment and comparative international frameworks, the article finds that prevailing industry practices of implied and bundled consent fail to satisfy the DPDPA’s requirement of free, specific, informed, and unambiguous consent. The article concludes that legislative reform is necessary to address definitional ambiguities, mandate sector-specific guidelines, and establish clear data retention limits for retail biometric technologies.
INTRODUCTION
Technological innovation has significantly transformed the global retail sector, particularly fashion e-commerce. Virtual try-on technology is a prominent example. E-commerce platforms such as Amazon (Outfit-VITON), Sephora (makeup), IKEA (furniture), Lenskart (eyewear), Nike (footwear), and Myntra (fashion) deploy virtual try-on technology to enhance customers’ online shopping experience, boosting purchase confidence and reducing returns.
Virtual try-on technology is a digital tool that allows a person to virtually try on clothes, shoes, and other accessories before purchasing them, giving an idea of how an item will look, fit, and size. The technology uses Augmented Reality (AR) and Artificial Intelligence (AI) to build a realistic model of the customer: the customer either uploads a picture or uses the device camera, and the AI then analyses the image and fits the accessory to the body or face.[1]
Despite these benefits, the technology raises serious privacy and legal concerns. Virtual try-on tools frequently require the collection and processing of highly sensitive personal information, including facial images, body measurements, and behavioural data on consumer preferences. If stored improperly, this data exposes users to privacy risks and data breaches.[2]
In India, the Digital Personal Data Protection Act, 2023, the Information Technology Act, 2000, and the IT (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 are the primary instruments governing the protection of digital personal data.
This article examines whether the collection of biometric data through virtual try-on tools by fashion e-commerce platforms violates consumer privacy rights, and whether the Digital Personal Data Protection Act, 2023 provides adequate legal safeguards against such risks. In addressing this question, the article examines three subsidiary issues: first, whether facial scans and body measurements qualify as sensitive personal data or biometric data under Indian law; second, whether fashion platforms satisfy the consent requirements mandated by the DPDPA 2023; and third, whether the existing legal framework contains a regulatory gap that warrants legislative reform.
BACKGROUND AND CONCEPTUAL FRAMEWORK
In India, the Digital Personal Data Protection Act, 2023, the Information Technology Act, 2000, and the IT (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 are the major laws governing the protection, use, and storage of data and the handling of data breaches.[3]
The DPDP Act, 2023 was enacted to regulate the processing of digital personal data. It imposes obligations upon data fiduciaries of lawful processing, informed consent, purpose limitation, and reasonable security safeguards when handling personal data.
The Information Technology Act, 2000, together with the IT (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, also provides for data protection. The Rules designate certain categories of data, including biometric information, as sensitive personal data requiring higher levels of protection during processing and storage.
Virtual try-ons arguably amount to biometric processing under the DPDPA’s broad definition of personal data (any data that identifies an individual).[4] Scholarship, however, has underemphasized retail technology, focusing on Aadhaar and surveillance, even as lawsuits have emerged worldwide, for example against Estée Lauder for facial scanning without explicit consent.[5] The Puttaswamy judgment noted that proportionality must be demonstrated and that biometric data must not be processed for commercial purposes when not required.[6] The EU’s General Data Protection Regulation classifies biometrics as special category data requiring explicit consent. India has no regulations specifically governing retail biometrics; platforms self-regulate through privacy policies, often with bundled consents.[7]
LEGAL ANALYSIS
The legal regulation of virtual try-ons requires examination of three questions: the classification of biometric data, the adequacy of consent mechanisms, and the obligations imposed on digital platforms handling personal data.
Sensitive personal data and biometrics are not explicitly defined under the DPDP Act, 2023. However, section 2(t) defines ‘personal data’ as any data about an individual who is identifiable by or in relation to such data,[8] and section 2(n) defines ‘digital personal data’ as personal data in digital form.[9] The physical and biological features of a face are unique and can identify a person.[10] Facial scans and bodily measurements therefore fall within this definition, and the provisions of the Act apply to e-commerce platforms that use these tools.
The IT (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 explicitly define ‘biometrics’ as technologies that measure and analyse human body characteristics, such as fingerprints, eye retinas and irises, voice patterns, facial patterns, hand measurements, and DNA, for authentication purposes.[11] Rule 3 classifies biometric information as sensitive personal data.[12]
The landmark case of Justice K.S. Puttaswamy v. Union of India (2017) declared the right to privacy as a fundamental right under Article 21 of the Constitution, encompassing personal data protection. The court emphasized the need for legislative safeguards to protect personal data, including biometric data, from unauthorized collection and use.[13]
Section 4 of the DPDP Act, 2023 lays down the grounds for processing personal data: data may be processed only where the data principal has given consent. Under section 6(1), such consent must be free, specific, informed, unconditional, and unambiguous, with a clear affirmative action. The data principal has the right to withdraw consent at any time under section 6(4). Where consent is withdrawn, the data fiduciary must, within a reasonable time, cease and cause its data processors to cease processing the personal data, unless such processing without consent is required or authorised under the Act, the rules made thereunder, or any other law for the time being in force in India. Notice must be provided at or before the time the data is collected. It should specify the type of personal data collected, the purpose of processing, and how the data principal may exercise her rights, including withdrawal of consent under section 6(4), grievance redressal under section 13, and approaching the Data Protection Board. The notice should be in simple and clear language (English or a language of the Eighth Schedule), separate from the site’s terms of service, and consent must not be inferred from mere usage.[14] In practice, however, fashion platforms still operate under the Information Technology Act, 2000 and the SPDI Rules, 2011. Although their privacy policies were updated in 2025, those policies continue to treat consent as implied or bundled, assuming agreement from the mere use of the platform.
For example, Myntra states that using its site or providing information constitutes explicit consent to its policy and terms,[15] while Flipkart provides that consent is granted simply by visiting the site or creating an account and can be revoked only by e-mailing support.[16] Likewise, Ajio bundles consent with browsing or account creation, cites only the older IT legislation, and offers no opt-out short of discontinuing the service.[17] Amazon Fashion follows the same model. None of these platforms provides an itemized, DPDPA-style notice or separate affirmative consent checkboxes for purposes such as marketing, analytics, and third-party sharing. While self-serve privacy centres and data-deletion options exist, they are no substitute for the prior consent the law requires. The Digital Personal Data Protection Rules, 2025 prescribe requirements for notices and consent managers, but these provisions will not be fully enforced until 13 May 2027, eighteen months after their notification. Until then, platforms face no real consequence for non-compliance, even though the core principles of the Act already apply.
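The difference between bundled consent and the itemized, purpose-specific consent the DPDPA envisages can be illustrated with a short sketch. The code below is purely hypothetical (it is not drawn from any platform’s actual implementation, and the purpose names are illustrative); it simply models per-purpose affirmative opt-in with withdrawal, the pattern that separate consent checkboxes would support:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical purposes an itemized DPDPA-style notice might list.
PURPOSES = ("try_on_simulation", "marketing", "analytics", "third_party_sharing")

@dataclass
class ConsentRecord:
    """Per-purpose affirmative consent: nothing is pre-checked or bundled."""
    granted: dict = field(default_factory=dict)  # purpose -> opt-in timestamp

    def grant(self, purpose: str) -> None:
        # Consent is recorded only for an explicitly named purpose.
        if purpose not in PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.granted[purpose] = datetime.now(timezone.utc)

    def withdraw(self, purpose: str) -> None:
        # Withdrawal at any time; processing for this purpose must then cease.
        self.granted.pop(purpose, None)

    def may_process(self, purpose: str) -> bool:
        return purpose in self.granted

record = ConsentRecord()
record.grant("try_on_simulation")            # user ticks only the try-on box
assert record.may_process("try_on_simulation")
assert not record.may_process("marketing")   # marketing was never opted into
record.withdraw("try_on_simulation")
assert not record.may_process("try_on_simulation")
```

The point of the design is that no purpose is pre-selected and withdrawing one purpose leaves the others untouched, mirroring the free, specific, and unambiguous standard of section 6(1); bundled consent, by contrast, collapses all purposes into a single implied grant.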
Many companies have already begun preparing for these Rules, including building privacy portals, in anticipation of being designated Significant Data Fiduciaries with heightened obligations. After May 2027, non-compliance can attract penalties of up to ₹250 crore per breach.[18]
Users can already request access to, correction of, or deletion of their data by contacting a platform’s support team, but platforms still do not seek consent in a clear and specific manner.
The Digital Personal Data Protection Rules will require fashion e-commerce platforms to redesign their websites and apps to provide compliant notices and to obtain genuine, affirmative consent.
Companies should begin auditing their current practices for compliance. Users concerned about how their data is being used should review their privacy settings and request deletion where desired.
In sum, current industry practice falls short; platforms will have to comply fully with the Digital Personal Data Protection Rules by 2027.
As mentioned earlier, the primary laws governing the protection of personal data and biometric information in India include the DPDP Act, 2023, the IT Act, 2000, and the SPDI Rules, 2011 promulgated under section 43A of the IT Act. Despite these statutes, regulatory and legal gaps emerge when these laws are applied to advancing technologies such as virtual try-on tools. The principal gaps are:
Lack of clear definition of Biometric data in retail technology
While biometric data is defined under the SPDI Rules, the definition does not cover bodily measurements and digital avatars, the two data types widely used by fashion e-commerce platforms to make customer-specific recommendations.
Absence of Sector-Specific Regulation for Retail Technologies
The current legal framework provides broad definition and technology neutral data protection provisions. However, technologies like virtual try-on systems involve continuous image scanning, AI-based analysis, and behavioural tracking, which raise unique privacy risks. There are currently no sector-specific guidelines for fashion e-commerce platforms using such technologies.
Inadequate Guidance on Biometric Data Retention
The DPDP framework prescribes a three-year retention limit for data held by large e-commerce platforms and requires that data be deleted once the purpose is fulfilled or consent is withdrawn. But it does not specify:
- whether facial or body data should be deleted immediately after simulation, or
- whether platforms may retain such data for AI model training or product recommendation systems.
This creates uncertainty about permissible retention practices.
CASE LAWS DISCUSSION
Judicial interpretation of privacy rights in India has played a significant role in shaping the legal framework governing personal data protection. Although courts have not yet directly addressed virtual try-on technologies, several landmark decisions provide important principles relevant to biometric data and digital privacy.
Justice K.S. Puttaswamy v. Union of India (2017) is the landmark case on privacy rights. The petitioners challenged Aadhaar’s requirement of biometrics such as fingerprints and iris scans, contending that it invaded privacy under Article 21. The Supreme Court declared privacy a fundamental right and laid down a test of proportionality: any invasion of privacy must pursue a legitimate aim, be rational, and be no more intrusive than necessary, with the harms it causes weighed against its benefits.
In the subsequent Aadhaar judgment (Puttaswamy II), the court struck down Section 57 of the Aadhaar Act, which had allowed private companies to use Aadhaar for authentication, because it opened the door to exploitation without consent. This is relevant to the use of body scans in try-ons because biometrics can reveal a person’s core identity; the court stressed that biometric use must be carefully considered and consent obtained.[19]
The court in Puttaswamy II further held that private companies could not conduct e-KYC without proper safeguards, noting that biometrics can undermine anonymity and that data collection can breach a person’s right to self-determination. This is especially relevant to try-ons, where data may be retained indefinitely or used for profiling; companies must justify their use of such data.[20]
No Indian case has yet addressed biometrics in e-commerce, but Manohar Lal Sharma v. Union of India (2021) is instructive: the court observed that facial recognition technology must be regulated because of its potential to invade privacy.[21] Comparable litigation abroad includes the Estée Lauder BIPA suit in the United States, where the company was sued for collecting facial geometry without disclosing it to users.[22] These cases establish that biometric processing requires a legitimate purpose and transparency, principles directly applicable to virtual try-ons.
Biometric use in virtual try-ons is thus a live concern because it permits large-scale data collection without meaningful consent. The cases above show that companies deploying biometrics must proceed carefully, obtain user consent, and justify the technology’s use. Puttaswamy and Puttaswamy II are central in this context because they establish the principles of privacy and proportionality that govern biometric processing.
CRITICAL ANALYSIS AND FINDINGS
Existing laws inadequately address retail biometrics. The DPDPA’s consent mechanism is ineffective for facial scans taken during try-ons, which often result in long-term storage of the data. Nor is there any rule requiring companies that are not Significant Data Fiduciaries to conduct data protection impact assessments.
Since Puttaswamy, courts have consistently ruled in favour of privacy, but their attention has centred on government programmes while private retail goes largely unexamined. This leaves gaps that allow companies to profile consumers, for instance by targeting them on the basis of body type, without accountability. Although the law provides for substantial fines, the absence of guidelines lets platforms circumvent the rules through vague policies, eroding consumer trust.
The analysis shows that try-on tools collect extensive data while the existing laws are insufficiently specific to retail: they do not address how scans may be processed or how vendors should be audited. This creates risks of data theft, unauthorized sharing, and unfair treatment. Legislative reform should therefore treat retail biometrics as high-risk processing, require clear and specific consent, and impose limits on how long the data may be retained.
CONCLUSION
The manner in which Indian fashion e-commerce platforms collect data via virtual try-ons violates customers’ right to privacy. When consumers use these tools, the platforms capture facial scans and body measurements, which constitute personal information. The platforms do not seek clear permission; they rely on implied, non-affirmative consent that does not fulfil the criteria of free, specific, informed, unconditional, and unambiguous consent given by a clear affirmative action under section 6 of the DPDP Act.
The DPDPA 2023, the very statute meant to provide this protection, itself contains gaps: it does not clearly define biometric data or specify how platforms should handle it.
The DPDP Act 2023 does not provide adequate safeguards. It lacks explicit retail-biometric definitions, sector-specific guidelines, clear retention limits for simulation data, and mandatory impact assessments for non-significant fiduciaries. Enforcement of the 2025 Rules is delayed until May 2027, leaving proportionality concerns from the Puttaswamy judgment unaddressed. Legislative reform is therefore imperative to close these gaps.
REFERENCE(S):
Table of Cases
- Castelaz v. Estee Lauder Companies, Inc., No. 22 CV 5713, 2024 WL 136872 (N.D. Ill. Jan. 10, 2024)
- Justice KS Puttaswamy(Retd) And Anr vs Union Of India And Ors AIR 2017 SUPREME COURT 4161
- Justice K.S.Puttaswamy(Retd) vs Union Of India, AIR 2018 SC (SUPP) 1841
- Manohar Lal Sharma vs Union Of India, AIR 2021 SC 5396
Table of Legislation
- Digital Personal Data Protection Act, 2023
- Information Technology Act, 2000
- Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules)
BIBLIOGRAPHY
- Abhinav Girdhar, ‘How Virtual Try-On Technology Is Revolutionizing the Fashion E-Commerce Industry?’ (Pixazo Blog, 20 February 2025) https://www.pixazo.ai/blog/virtual-try-on-technology-in-fashion-ecommerce accessed 12 March 2026.
- AJIO Commerce Privacy Policy https://seller.ajio.com/ajiocommerce/privacy/ accessed 12 March 2026.
- AZB & Partners, ‘Biometric Data: Regime in India’ (16 October 2019) https://www.azbpartners.com/bank/biometric-data-regime-in-india/ accessed 12 March 2026.
- ‘Fashion Brand Named in Data Privacy Lawsuit Over Virtual Try-On Feature’ (Steptoe LLP) https://www.steptoe.com/en/news-publications/fashion-brand-named-in-data-privacy-lawsuit-over-virtual-try-on-feature.html accessed 12 March 2026.
- Flipkart Privacy Policy https://www.flipkart.com/pages/privacypolicy accessed 12 March 2026.
- Gehlot K, ‘Biometric Data Technology in Fashion Retail: Opportunities, Risks, and the Legal Landscape’ (Fashion Law Journal, 16 July 2025) https://fashionlawjournal.com/biometric-data-technology-in-fashion-retail/ accessed 12 March 2026.
- King Stubb & Kasiva, ‘Regulation of Biometric Data under the DPDP Act’ (2 November 2023) https://ksandk.com/data-protection-and-data-privacy/regulation-of-biometric-data-under-the-dpdp-act/ accessed 12 March 2026
- Ronita Halder, ‘Legal Implications of Facial Recognition Technology in India’ (2025) 2 Journal of Contemporary Law and Society 99.
[1] Abhinav Girdhar, “How Virtual Try-On Technology Is Revolutionizing the Fashion E-Commerce Industry?” (Pixazo Blog, February 20, 2025) <https://www.pixazo.ai/blog/virtual-try-on-technology-in-fashion-ecommerce> accessed March 12, 2026.
[2] Gehlot K, “Biometric Data Technology in Fashion Retail: Opportunities, Risks, and the Legal Landscape” (Fashion Law Journal, July 16, 2025) <https://fashionlawjournal.com/biometric-data-technology-in-fashion-retail/> accessed March 12, 2026
[3] “Biometric Data: Regime in India” (azb, October 16, 2019) <https://www.azbpartners.com/bank/biometric-data-regime-in-india/> accessed March 12, 2026
[4] King Stubb & Kasiva, “Regulation of Biometric Data under the DPDP Act” (King Stubb & Kasiva, November 2, 2023) <https://ksandk.com/data-protection-and-data-privacy/regulation-of-biometric-data-under-the-dpdp-act/> accessed March 12, 2026.
[5] “Fashion Brand Named in Data Privacy Lawsuit Over Virtual Try-On Feature” (Steptoe LLP) <https://www.steptoe.com/en/news-publications/fashion-brand-named-in-data-privacy-lawsuit-over-virtual-try-on-feature.html> accessed March 12, 2026.
[6] Justice KS Puttaswamy(Retd) And Anr vs Union Of India And Ors AIR 2017 SUPREME COURT 4161
[7] AZB & Partners (n 3).
[8] The Digital Personal Data Protection Act, 2023 s.2(t).
[9] The Digital Personal Data Protection Act, 2023 s.2(n).
[10] Ronita Halder, “Legal Implications of Facial Recognition Technology in India” (2025) 2 Journal of Contemporary Law and Society 99.
[11] Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011, rule 2(1)(b)
[12] Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011, rule 3.
[13] Puttaswamy (n 6).
[14] The Digital Personal Data Protection Act, 2023 ss 4,6.
[15] Myntra Privacy Policy https://www.myntra.com/privacypolicy accessed 12 March 2026
[16] Flipkart Privacy Policy https://www.flipkart.com/pages/privacypolicy accessed 12 March 2026
[17] AJIO Commerce Privacy Policy https://seller.ajio.com/ajiocommerce/privacy/ accessed 12 March 2026.
[18] “India’s Digital Personal Data Protection Act 2023 Brought into Force” (www.hoganlovells.com) <https://www.hoganlovells.com/en/publications/indias-digital-personal-data-protection-act-2023-brought-into-force-> accessed March 12, 2026
[19] Justice K.S. Puttaswamy (Retd) v Union of India, AIR 2018 SC (SUPP) 1841 (Puttaswamy II).
[20] Justice K.S.Puttaswamy(Retd) vs Union Of India, AIR 2018 SC (SUPP) 1841
[21] Manohar Lal Sharma vs Union Of India, AIR 2021 SC 5396
[22] Castelaz v. Estee Lauder Companies, Inc., No. 22 CV 5713, 2024 WL 136872 (N.D. Ill. Jan. 10, 2024)





