
Data Protection & Privacy
Contents
1. Constitutional Foundation
2. Statutory Framework
3. DPDP Act, 2023 and DPDP Rules, 2025 Explained
3.1 Implementation Timeline
3.2 Key Concepts
3.3 Lawful Grounds of Processing
3.4 Obligations & Compliance
3.5 Rights of Data Principals
3.6 Enforcement & Penalties
4. Data Governance & Contractual Risks
4.1 Key Agreements
4.2 Allocation of Liability
5. Digital Piracy & Data Theft – Legal Meaning
5.1 What is Piracy in Indian Law?
5.2 Piracy vs Data Breach vs Cybercrime
6. Piracy Law Framework in India
6.1 Copyright Act, 1957
6.2 IT Act, 2000
6.3 Bharatiya Nyaya Sanhita
7. Intermediary Liability & Platform Responsibility
8. Enforcement, Litigation & Remedies
8.1 Civil Remedies
8.2 Criminal Remedies
8.3 Regulatory Action
9. Sector-Specific Risk Areas
9.1 OTT & Media Platforms
9.2 SaaS & Cloud Companies
9.3 Fintech & BFSI
9.4 EdTech
9.5 AI & Data Analytics Companies
9.6 Marketplaces & Aggregators
1. Constitutional Foundation
The modern framework of data protection and privacy law in India is constitutionally anchored in the recognition of the Right to Privacy as a fundamental right by the Supreme Court in Justice K.S. Puttaswamy (Retd.) v. Union of India (2017). In this landmark nine-judge bench decision, the Court unanimously held that the right to privacy is intrinsic to the right to life and personal liberty under Article 21 of the Constitution, and also flows from the freedoms guaranteed under Part III of the Constitution. The judgment overruled earlier precedents that treated privacy as a mere common law or statutory interest, firmly elevating it to a constitutional guarantee enforceable against the State.
The Court conceptualised privacy not as a narrow right against physical intrusion, but as a multi-dimensional right encompassing autonomy, dignity, and informational self-determination. Privacy was recognised as essential to individual autonomy, understood as the ability of a person to make personal choices free from unwarranted interference, and as a core component of human dignity. Importantly, the judgment acknowledged informational privacy as a distinct and critical facet, recognising an individual’s right to control the collection, use, and dissemination of personal data in an increasingly digital and data-driven society.
This constitutional understanding laid the normative foundation for India’s data protection regime, directly influencing subsequent legislative developments, including the Digital Personal Data Protection Act, 2023. The principles articulated in Puttaswamy, such as legality, necessity, proportionality, and procedural safeguards, now operate as constitutional guardrails against excessive data collection, surveillance, and misuse, and continue to guide judicial interpretation of privacy, data protection obligations, and State as well as private sector conduct in the digital ecosystem.
2. Statutory Framework
India’s data protection and information governance regime is not contained in a single, consolidated statute but operates through a layered statutory framework comprising general data protection law, legacy cyber law provisions, sector-specific regulations, and transparency legislation. Together, these instruments regulate the collection, processing, storage, sharing, and disclosure of personal and sensitive data, while balancing privacy interests with regulatory oversight, public interest, and national security considerations.
The Digital Personal Data Protection Act, 2023 (DPDP Act) and the DPDP Rules, 2025 constitute the cornerstone of India’s contemporary data protection framework. Together, they establish a comprehensive, consent-centric regime governing the processing of digital personal data, whether collected online or collected offline and subsequently digitised. The Act defines the roles and responsibilities of Data Fiduciaries and Data Processors, codifies the rights of Data Principals, prescribes lawful grounds of processing, and introduces enforceable obligations relating to purpose limitation, data minimisation, security safeguards, breach reporting, and grievance redressal. Importantly, the DPDP Act also creates an adjudicatory mechanism in the form of the Data Protection Board of India and empowers the Central Government to notify Significant Data Fiduciaries and sector-specific compliance requirements, thereby marking a decisive shift from a purely IT-security-based approach to a rights-based data governance model.
The Information Technology Act, 2000 continues to operate as India’s primary cyber law statute and remains relevant alongside the DPDP Act. While the IT Act does not establish a general data protection regime, it addresses unauthorised access, data theft, hacking, and computer-related offences through both civil and criminal provisions. Sections such as Section 43 and Section 66 impose liability for unauthorised extraction, damage, or misuse of data, while intermediary liability provisions and due diligence requirements regulate digital platforms and online service providers. Post-DPDP, the IT Act increasingly functions as an enforcement and penal framework complementing data protection obligations, particularly in cases involving cybercrime, data breaches, and digital piracy.
The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, commonly referred to as the SPDI Rules, represent an earlier attempt to regulate sensitive personal data under the IT Act. These Rules impose obligations relating to consent, privacy policies, data security practices, and restrictions on disclosure of sensitive personal data such as financial information, health data, and biometric information. Although the DPDP Act now occupies the field of personal data protection, the SPDI Rules continue to have limited relevance, particularly in areas not yet fully notified under the DPDP framework and in contractual and compliance practices that still reference “reasonable security practices” under the IT Act. Questions of overlap, implied repeal, and transitional applicability remain legally significant during the implementation phase of the DPDP regime.
In addition to general data protection law, sectoral regulations issued by regulators such as the Reserve Bank of India (RBI), Securities and Exchange Board of India (SEBI), Insurance Regulatory and Development Authority of India (IRDAI), and telecommunications authorities play a critical role in shaping data governance obligations. These regulations often impose stricter standards on regulated entities, including data localisation requirements, cybersecurity controls, audit obligations, vendor risk management, and incident reporting timelines. For entities operating in regulated sectors such as banking, fintech, capital markets, insurance, and telecom, compliance with DPDP obligations must be harmonised with sector-specific regulatory mandates, which frequently operate as special laws imposing higher thresholds of accountability.
The Right to Information Act, 2005 (RTI Act) represents a distinct but intersecting statutory regime focused on transparency and access to information held by public authorities. Historically, Section 8(1)(j) of the RTI Act provided an exemption for disclosure of personal information unless justified by larger public interest. Following the enactment of the DPDP Act, amendments introduced through Section 44(3) of that Act have modified the personal information exemption, significantly altering the balance between privacy and transparency. This legislative change has generated important legal questions regarding the dilution of privacy safeguards under the RTI framework, the scope of permissible disclosure of personal data by public authorities, and the constitutional consistency of such amendments in light of the Puttaswamy principles.
Taken together, this statutory framework reflects India’s evolving attempt to reconcile individual privacy rights, digital innovation, regulatory supervision, and public interest considerations. For businesses, platforms, and public bodies, effective compliance requires not only understanding the DPDP Act in isolation, but also navigating its interaction with legacy IT laws, sectoral regulations, and transparency statutes that continue to shape India’s data governance ecosystem.
3. DPDP Act, 2023 and DPDP Rules, 2025 Explained
3.1 Implementation Timeline
3.2 Key Concepts
A clear understanding of the core concepts under India’s data protection framework is essential to correctly determine legal obligations, compliance responsibilities, and risk exposure under the Digital Personal Data Protection Act, 2023. These definitions are not merely semantic; they directly influence consent architecture, contractual structuring, liability allocation, and regulatory enforcement.
3.2.1 Personal Data & Digital Personal Data
Personal Data and Digital Personal Data form the foundational distinction under the DPDP framework. Personal data refers to any data about an individual who is identifiable by or in relation to such data, whether directly or indirectly. The DPDP Act, however, consciously narrows its operative scope to digital personal data, meaning personal data that is processed in digital form or personal data that is initially collected in non-digital form but subsequently digitised. This distinction is legally significant, as purely offline personal data that is never digitised falls outside the Act’s ambit. In practice, given the pervasive digitisation of records, most modern data processing activities across platforms, enterprises, and public authorities inevitably trigger DPDP applicability.
3.2.2 Data Principal
A Data Principal is the individual to whom the personal data relates and is the rights-holder under the DPDP regime. The Act recognises data principals as the central beneficiaries of data protection, granting them enforceable rights such as the right to access information about processing, seek correction and erasure of personal data, withdraw consent, and access grievance redressal mechanisms. In the case of children and persons with disabilities, the DPDP Act introduces heightened protections, including verifiable parental consent and restrictions on certain forms of processing. The concept of the data principal operationalises the constitutional idea of informational self-determination by placing individuals at the heart of data governance.
3.2.3 Data Fiduciary
A Data Fiduciary is any person or entity that determines the purpose and means of processing personal data. This role carries the primary compliance burden under the DPDP Act. Data fiduciaries are responsible for issuing legally compliant notices, obtaining valid consent where required, ensuring purpose limitation and data minimisation, implementing reasonable security safeguards, and reporting personal data breaches. The classification as a data fiduciary depends on decision-making authority, not ownership of infrastructure or technical control. Consequently, startups, platforms, employers, financial institutions, and even public authorities routinely qualify as data fiduciaries where they decide “why” and “how” personal data is processed.
3.2.4 Data Processor
A Data Processor is an entity that processes personal data on behalf of a data fiduciary and strictly in accordance with the fiduciary’s instructions. Data processors do not independently determine the purpose of processing and cannot use personal data for their own commercial objectives. Their obligations arise primarily through contractual arrangements, such as data processing agreements, which must mandate security safeguards, confidentiality, breach reporting, and deletion or return of data upon termination. While the DPDP Act places direct statutory obligations primarily on data fiduciaries, processors remain exposed to regulatory and contractual liability, particularly in cases of negligence, unauthorised processing, or security failures.
3.2.5 Significant Data Fiduciary (SDF)
The category of Significant Data Fiduciary (SDF) reflects a risk-based regulatory approach under the DPDP Act. The Central Government may notify a data fiduciary as an SDF based on factors such as the volume and sensitivity of personal data processed, risk of harm to data principals, impact on electoral democracy, national security considerations, and use of new technologies. Significant Data Fiduciaries are subject to enhanced compliance obligations, including the appointment of a Data Protection Officer based in India, independent data audits, and the implementation of additional organisational and technical safeguards. This classification is particularly relevant for large digital platforms, fintech entities, data-driven businesses, and infrastructure providers, where data processing activities have systemic or societal implications.
Collectively, these key concepts establish the structural architecture of the DPDP regime, enabling a differentiated allocation of responsibilities and ensuring that regulatory obligations are proportionate to the nature, scale, and risk profile of data processing activities.
3.3 Lawful Grounds of Processing
The DPDP Act adopts a structured, purpose-driven approach to lawful processing of personal data, departing from the open-ended flexibility seen in some foreign regimes. Processing of digital personal data is permitted only when it falls within clearly identified lawful grounds, ensuring that data collection and use remain tethered to legitimacy, necessity, and proportionality.
Consent remains the primary and default lawful ground under the DPDP Act. Consent must be free, specific, informed, unambiguous, and given through a clear affirmative action, preceded by a valid notice explaining the purpose of processing. The Act mandates that consent be revocable at any time, and withdrawal must be as easy as giving consent. From a compliance perspective, this requires businesses to design robust consent management systems, maintain records of consent, and ensure that processing ceases upon withdrawal unless another lawful ground applies.
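By way of illustration, the following minimal Python sketch shows one way a consent management system might record purpose-specific grants and withdrawals in an auditable form. The class and field names are hypothetical assumptions; neither the Act nor the Rules prescribe any particular implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional


@dataclass
class ConsentRecord:
    """One auditable consent entry for a single data principal and purpose."""
    principal_id: str            # internal identifier of the data principal
    purpose: str                 # specific purpose stated in the notice
    notice_version: str          # version of the notice shown before consent
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None


class ConsentLedger:
    """Append-only record of grants and withdrawals so consent remains auditable."""

    def __init__(self) -> None:
        self._records: List[ConsentRecord] = []

    def grant(self, principal_id: str, purpose: str, notice_version: str) -> ConsentRecord:
        record = ConsentRecord(principal_id, purpose, notice_version,
                               given_at=datetime.now(timezone.utc))
        self._records.append(record)
        return record

    def withdraw(self, principal_id: str, purpose: str) -> None:
        # Withdrawal should be as easy as granting: one call flags the active record.
        for record in self._records:
            if (record.principal_id == principal_id
                    and record.purpose == purpose and record.is_active()):
                record.withdrawn_at = datetime.now(timezone.utc)

    def may_process(self, principal_id: str, purpose: str) -> bool:
        # Processing must stop on withdrawal unless another lawful ground applies.
        return any(r.principal_id == principal_id and r.purpose == purpose and r.is_active()
                   for r in self._records)
```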
Legitimate Uses represent statutory carve-outs where personal data may be processed without obtaining explicit consent. These include processing necessary for compliance with law, performance of State functions, responding to medical emergencies, employment-related purposes, and protection of public order or safety. Legitimate uses are tightly circumscribed and cannot be expanded by contractual agreement or internal policy. Misclassification of routine commercial processing as “legitimate use” exposes organisations to enforcement risk.
Deemed Consent, a concept carried over from the 2022 draft Bill, survives in the enacted Act principally through the legitimate use covering voluntarily provided data: where a data principal voluntarily provides personal data for a specified purpose and has not indicated that she does not consent, separate consent is not required. While this reduces friction in practical workflows, it is not a blanket exemption. Fiduciaries must still ensure transparency, relevance, and fairness, and cannot rely on voluntary provision where the processing is intrusive, excessive, or disconnected from the original context of disclosure.
3.4 Obligations & Compliance
The DPDP Act places the primary compliance burden on Data Fiduciaries, reflecting their control over the purpose and means of processing. Compliance is not merely procedural but structural, requiring integration of data protection principles into organisational design.
A compliant Notice and Consent Architecture is foundational. Data fiduciaries must issue clear, intelligible notices detailing the categories of personal data collected, purposes of processing, rights of data principals, and grievance mechanisms. Consent flows must be purpose-specific, granular, and capable of audit, particularly for digital platforms and data-driven businesses.
Purpose Limitation requires that personal data be processed strictly for the purpose specified at the time of collection. Any secondary use must be compatible with the original purpose or supported by a fresh lawful ground. This principle directly constrains practices such as data repurposing, profiling, and analytics beyond disclosed objectives.
Storage Limitation obligates fiduciaries to retain personal data only for as long as necessary to fulfil the stated purpose or comply with legal requirements. Indefinite retention, passive archiving, or retention “for future use” is inconsistent with the DPDP framework and increases breach exposure.
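As a practical illustration, the short Python sketch below checks whether data held for a given purpose has outlived a documented retention period. The purposes and periods shown are hypothetical placeholders; actual schedules must flow from the stated purpose, applicable law, and the organisation's retention policy.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical purpose-wise retention periods, for illustration only.
RETENTION_PERIODS = {
    "order_fulfilment": timedelta(days=365),
    "grievance_redressal": timedelta(days=3 * 365),
}


def is_retention_expired(purpose: str, collected_at: datetime,
                         now: Optional[datetime] = None) -> bool:
    """Return True when data held for this purpose is due for erasure or anonymisation."""
    now = now or datetime.now(timezone.utc)
    period = RETENTION_PERIODS.get(purpose)
    if period is None:
        # No documented basis to retain: treat as expired rather than keeping "for future use".
        return True
    return now - collected_at > period
```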
The requirement of Reasonable Security Safeguards mandates technical and organisational measures proportionate to the nature and sensitivity of personal data. While the Act avoids prescribing specific technologies, it expects risk-based security practices, access controls, encryption where appropriate, vendor oversight, and documented information security policies.
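The Act is technology-neutral, but field-level encryption is one commonly adopted safeguard. The sketch below, assuming the third-party cryptography package is available, shows the idea in Python; real deployments would manage keys through a KMS or HSM rather than generating them in application code.

```python
from cryptography.fernet import Fernet  # third-party 'cryptography' package

# Key management (rotation, storage in a KMS/HSM, access logging) is the hard part
# and is only hinted at here; generating a key inline is for illustration only.
key = Fernet.generate_key()
cipher = Fernet(key)


def protect(value: str) -> bytes:
    """Encrypt a personal data field before it is written to storage."""
    return cipher.encrypt(value.encode("utf-8"))


def reveal(token: bytes) -> str:
    """Decrypt only within access-controlled code paths."""
    return cipher.decrypt(token).decode("utf-8")
```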
Data Breach Reporting introduces a transparency and accountability obligation. Personal data breaches must be reported to the Data Protection Board of India and, where required, to affected data principals. Failure to detect, assess, or report breaches in a timely manner can attract significant penalties, even where no actual harm is demonstrated.
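For internal readiness, organisations often maintain a structured breach summary that can be adapted to the prescribed reporting form. The following Python sketch assembles such a summary; the field names are illustrative assumptions and do not reproduce the format or timelines prescribed under the DPDP Rules.

```python
import json
from datetime import datetime, timezone


def build_breach_report(incident_id: str, detected_at: datetime, nature: str,
                        affected_principals: int, mitigation: str) -> str:
    """Assemble an internal breach summary for onward reporting to the Board and,
    where required, to affected data principals."""
    report = {
        "incident_id": incident_id,
        "detected_at": detected_at.isoformat(),
        "nature_of_breach": nature,
        "approx_affected_data_principals": affected_principals,
        "mitigation_measures": mitigation,
        "report_generated_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(report, indent=2)
```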
3.5 Rights of Data Principals
The DPDP Act operationalises privacy by conferring enforceable rights on Data Principals, positioning individuals as active participants rather than passive subjects of data processing.
The Right of Access enables data principals to obtain confirmation of processing, a summary of personal data processed, and details of data sharing. This right enhances transparency and imposes an implicit documentation burden on fiduciaries.
The Right to Correction allows individuals to seek rectification of inaccurate or misleading personal data, ensuring data quality and fairness in decision-making systems.
The Right to Erasure empowers data principals to require deletion of personal data once the purpose of processing is fulfilled or consent is withdrawn, subject to legal retention obligations.
Grievance Redressal is a core compliance pillar. Data fiduciaries must establish effective grievance mechanisms and respond within prescribed timelines, failing which data principals may approach the Data Protection Board.
The Right of Nomination allows data principals to nominate another individual to exercise rights on their behalf in the event of death or incapacity, reflecting a continuity-based approach to informational autonomy.
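Operationally, many fiduciaries track rights requests and grievances in a ticketing structure so that responses within the applicable timelines can be evidenced. The Python sketch below illustrates one such structure; the 30-day window is a hypothetical internal target, not the statutory timeline.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical internal response target; the timelines prescribed under the
# DPDP Rules and the fiduciary's published grievance policy govern compliance.
RESPONSE_WINDOW = timedelta(days=30)


@dataclass
class RightsRequest:
    principal_id: str
    kind: str                    # e.g. "access", "correction", "erasure", "nomination", "grievance"
    received_at: datetime
    resolved_at: Optional[datetime] = None

    def is_overdue(self, now: Optional[datetime] = None) -> bool:
        """Flag requests at risk of exceeding the response window so they can be
        escalated before the data principal approaches the Board."""
        now = now or datetime.now(timezone.utc)
        return self.resolved_at is None and now - self.received_at > RESPONSE_WINDOW
```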
3.6 Enforcement & Penalties
Enforcement under the DPDP Act is centralised through the Data Protection Board of India, which functions as a specialised adjudicatory body empowered to inquire into non-compliance and impose penalties. The Board’s role is administrative rather than judicial, focusing on regulatory enforcement rather than compensation.
The Act prescribes monetary penalties that may extend up to ₹250 crore per instance for the most serious violations, such as failure to implement reasonable security safeguards, with lower ceilings for other categories of non-compliance. Penalties are not linked to turnover percentages but are structured in the Schedule to reflect the seriousness of the violation.
In determining penalties, the Board considers factors such as the nature and gravity of the violation, duration of non-compliance, mitigation measures adopted, previous violations, and the degree of harm caused or likely to be caused to data principals. This introduces a quasi-discretionary enforcement model grounded in proportionality.
4. Data Governance & Contractual Risks
Data protection and piracy law intersect most sharply at the contractual layer, where rights over data, content, and digital assets are allocated. Poorly drafted contracts often convert compliance lapses into litigation exposure, particularly where data misuse overlaps with intellectual property violations.
4.1 Key Agreements
A compliant Privacy Policy serves as both a statutory notice instrument and a risk-allocation document. Inconsistencies between actual practices and published policies frequently form the basis of enforcement actions and consumer litigation.
Terms of Use govern user-generated content, platform liability, takedown rights, and acceptable use, making them critical for piracy prevention and intermediary protection.
Data Processing Agreements (DPAs) formalise fiduciary-processor relationships, specifying processing instructions, security measures, breach reporting obligations, and data return or deletion protocols.
Vendor & SaaS Contracts must address data access, audit rights, sub-processing, and liability allocation, particularly where vendors handle sensitive or regulated data.
Cloud & Hosting Agreements raise complex issues relating to data location, access controls, incident response, and third-party risk, making them central to both DPDP compliance and anti-piracy enforcement.
4.2 Allocation of Liability
The distinction between Fiduciary and Processor risk determines who bears primary regulatory exposure. Fiduciaries face statutory liability, while processors are typically exposed through contractual indemnities and negligence claims.
Indemnities are critical tools for risk transfer but must be carefully calibrated to cover regulatory penalties, third-party claims, and IP infringement.
Limitation of Liability clauses attempt to cap exposure but may be scrutinised where statutory penalties or fundamental breaches are involved.
Audit Rights enable fiduciaries to verify compliance and are increasingly demanded by regulators and enterprise clients alike.
5. Digital Piracy & Data Theft – Legal Meaning
Digital piracy and data theft are often conflated, but Indian law treats them as overlapping yet distinct legal phenomena. Early clarification is essential to avoid misclassification and strategic errors in enforcement.
5.1 What is Piracy in Indian Law?
Copyright piracy involves unauthorised reproduction, distribution, or communication of protected works. Software piracy includes unauthorised copying, licensing violations, and circumvention of access controls. Content scraping and data extraction may amount to piracy where proprietary databases, compilations, or protected content are systematically appropriated.
Database rights and trade secrets are protected through a combination of copyright law, contract, and confidentiality doctrines, particularly where substantial investment and secrecy are established.
5.2 Piracy vs Data Breach vs Cybercrime
While piracy focuses on unauthorised exploitation, data breaches involve unauthorised access or disclosure, and cybercrime encompasses broader criminal conduct. These categories often overlap but trigger different remedies.
Civil exposure typically involves injunctions and damages, while criminal exposure arises under IT and penal statutes, depending on intent, scale, and harm.
6. Piracy Law Framework in India
The Copyright Act, 1957 remains the principal statute governing piracy. Section 14 defines exclusive rights, Section 51 establishes infringement, and Sections 65A and 65B criminalise circumvention of technological protection measures and tampering with rights management information respectively. The Act provides both civil remedies and criminal sanctions.
The IT Act, 2000 supplements copyright law by addressing unauthorised access and extraction under Section 43 and computer-related offences under Section 66, while also shaping intermediary liability.
Under the Bharatiya Nyaya Sanhita, offences such as cheating, criminal breach of trust, and theft apply to digital assets where dishonest intent and property elements are established.
7. Intermediary Liability & Platform Responsibility
Intermediary liability is central to modern piracy enforcement. Platforms rely on safe harbour protection under the IT Act, conditioned upon compliance with due diligence obligations.
Failure to act on takedown notices, delay in response, or facilitation of infringement can result in loss of immunity. Indian courts have increasingly issued Ashok Kumar (John Doe) injunctions and dynamic injunctions to combat rogue piracy websites, requiring intermediaries to proactively block mirror and redirect sites.
8. Enforcement, Litigation & Remedies
Civil remedies include injunctions, blocking orders, and monetary relief such as damages and account of profits. Criminal remedies involve FIRs, cyber crime cell investigations, and search and seizure operations. Regulatory action may be initiated by CERT-In, MeitY, and the Data Protection Board, depending on the nature of the violation.
9. Sector-Specific Risk Areas
Risk profiles under data protection and technology laws vary significantly across sectors, depending on the nature of data processed, the degree of control exercised over processing, and the regulatory frameworks applicable to the business. A sector-specific understanding is therefore critical to accurately assess compliance obligations and liability exposure.
OTT and media platforms operate at the intersection of content regulation, intermediary liability, and large-scale personal data processing. These platforms routinely collect and analyse user viewing behaviour, preferences, and engagement data for recommendation engines and targeted advertising. At the same time, they must comply with takedown obligations, grievance redressal mechanisms, and piracy-related enforcement, often under tight timelines. The challenge lies in balancing platform neutrality, freedom of expression, and data protection obligations, while navigating overlapping requirements under the IT Act, intermediary guidelines, and the data protection framework.
SaaS and cloud companies typically position themselves as data processors, but this classification is increasingly nuanced in practice. Where such providers determine aspects of data processing, reuse data for analytics or product improvement, or train models on customer data, they may assume data fiduciary responsibilities. Additional risks arise from cross-border data transfers, client-specific regulatory requirements, and contractual exposure through audit rights, indemnities, and breach notification obligations. Misalignment between operational realities and contractual role definitions is a common compliance risk in this sector.
Fintech and BFSI entities face heightened scrutiny due to the sensitivity of financial, identity, and transactional data they process. These entities must comply not only with data protection obligations but also with sectoral regulations issued by RBI, SEBI, IRDAI, and other regulators. Key risk areas include consent management, purpose limitation, vendor and outsourcing governance, cybersecurity controls, and breach reporting. In this sector, data protection compliance is deeply intertwined with prudential regulation, consumer protection norms, and operational resilience requirements.
EdTech platforms encounter unique risks due to their processing of children’s personal data. Enhanced consent standards, restrictions on behavioural tracking and profiling, limitations on targeted advertising, and strict data retention requirements apply in such cases. Platform design choices, including the use of dark patterns or excessive data collection, can significantly increase regulatory and reputational exposure. Compliance failures in this sector often have amplified consequences due to heightened public and regulatory sensitivity around child data protection.
AI and data analytics companies face evolving regulatory expectations concerning lawful data sourcing, secondary use, and accountability. Risks arise in establishing valid legal grounds for data used in model training, particularly where data is scraped, inferred, or obtained from third parties. As AI systems increasingly influence decision-making affecting individuals, issues of transparency, explainability, and role classification (fiduciary versus processor) become critical. Regulatory focus in this space is gradually expanding beyond data protection to encompass algorithmic accountability and ethical governance.
Marketplaces and aggregators occupy a hybrid position in the digital ecosystem, simultaneously acting as platforms, facilitators, and enforcers. They process personal data of multiple stakeholder groups, including buyers, sellers, and service providers, and must carefully manage liability boundaries for third-party conduct. Ensuring effective notice-and-takedown mechanisms, enforcing platform rules, and maintaining neutrality without assuming unintended fiduciary responsibilities presents a complex compliance challenge. Poorly structured governance or data flows can inadvertently trigger deeper regulatory obligations.