Special Report | Published on April 22, 2025

Digital Personal Data Protection Rules, 2025: Recommendations to MeitY

The Ministry of Electronics and Information Technology (MeitY), Government of India, released the draft Digital Personal Data Protection Rules, 2025 (DPDP Rules) on 3 January 2025 and invited feedback by 5 March 2025. On 18 February 2025, MeitY organised a consultative session on the draft DPDP Rules; the event, held in New Delhi, was attended by the Observer Research Foundation (ORF). Based on the consultative session and discussions with various stakeholders and scholars working in this domain, ORF submitted the following comments to MeitY.

Attribution:

Basu Chandola, Shravishtha Ajaykumar and Tanusha Tyagi, “Digital Personal Data Protection Rules, 2025: Recommendations to MeitY,” ORF Special Report No. 258, April 2025, Observer Research Foundation.

Introduction

The Digital Personal Data Protection (DPDP) Act[1] was enacted in August 2023 for the “processing of digital personal data in a manner that recognises both the right of individuals to protect their personal data and the need to process such personal data for lawful purposes.”[2] While the DPDP Act lays the broad foundation for data protection, the DPDP Rules,[3] released on 3 January 2025, clarify the details, address the finer points, and operationalise the Act.

ORF’s comments on the DPDP Rules address the following key themes:

  1. Registration and obligations of Consent Manager
  2. Verifiable consent for the processing of personal data of a child or of a person with disability who has a lawful guardian
  3. Localisation of data vis-à-vis Significant Data Fiduciary
  4. Due diligence for algorithmic software
  5. Functioning of Board as digital office
I. Registration and Obligations of Consent Manager

Under the current data protection framework, the DPDP Rules elaborate on the registration, accountability, and obligations of the Consent Manager. Rule 4, read with the First Schedule, provides for consent management, the obligations of Consent Managers, and their registration with the Data Protection Board (Board).

Consent Managers need to ensure that Data Principals can easily give, review, and withdraw their consent whenever they choose. They also maintain accurate records of all consent-related activities and enforce strong security practices to protect this information. These obligations simplify consent management, increase trust, and provide individuals with more control over their data.

The following are a few areas of concern around Rule 4 and the First Schedule:

  • Under Part A of Schedule 1, several conditions for the registration of Consent Managers have been provided. However, many terms used leave room for discretion by the Board. These include the determination of “sufficient capacity to fulfil its obligations as a Consent Manager”, “general character of management”, adequate “earning prospects of the applicant”, and whether the key managerial personnel and senior management of the applicant company are “individuals with a general reputation and record of fairness and integrity.” The Board has been entrusted with wide discretionary powers, and no guidelines for exercising this discretion have been provided.
  • The Schedule calls for independent certification of interoperability and of appropriate technical and organisational measures being in place. However, the Rules are silent on who can conduct such independent certification and on the standards to be applied in the process. Further, there is no guidance on the interoperability standards that Consent Managers must abide by.
  • Under Part B of Schedule 1, Consent Managers must “take reasonable security safeguards to prevent personal data breaches.” However, the Rules are silent on what such reasonable security safeguards could be.
II. Verifiable Consent for Processing of Personal Data of a Child or of a Person with Disability Who Has a Lawful Guardian

Section 9 of the DPDP Act requires a Data Fiduciary to obtain verifiable consent of the parents or lawful guardian for processing the personal data of a child or a person with disability. Rule 10 expands on Section 9 and prescribes how such verifiable consent can be obtained.

Personal Data of a Child

  • Though these measures aim to protect children’s privacy and their access to online resources, they may unintentionally cause more harm. In many cases, especially in rural or urban poor households with only one device, the device might be registered to an adult but primarily used by children and adolescents. Currently, 82.1 percent of rural youth aged 15-24 have internet access.[4] Age verification in such cases of invisible use will be difficult to validate, and where validation is possible, it could affect digital growth in low-access areas of rural India.
  • Under the Verifiable Parent Consent mechanism (Rule 10), children must voluntarily self-declare to a Data Fiduciary that they are children within the meaning of the Act.[5] While seemingly straightforward, this approach creates challenges in verifying children’s consent. Self-declaration opens a potential loophole, as many children may falsify their age to bypass restrictions, particularly in the absence of accessible mechanisms for verifiable consent. This undermines the effectiveness of age restrictions and could result in a digital divide.[6] Children unable to navigate the consent process due to technical or systemic barriers may resort to falsifying information,[7] exposing themselves to online risks and being excluded from essential digital services.
  • The requirement under the rules to verify a child’s account on an online platform using identity and age details issued by a government-authorised entity raises serious concerns about privacy and the fundamental rights of citizens. The rules propose that parents voluntarily provide such details using services like the Digital Locker,[8] but this process inherently compromises the privacy of both the child and the parent. By requiring sensitive government-issued credentials to verify the parent’s adult status, these provisions introduce unnecessary surveillance and data collection into the digital ecosystem. This creates a database of personal information vulnerable to breaches, misuse, or exploitation, further impinging on individuals’ right to privacy.
  • Moreover, the rules fail to clarify how often consent will be required. Will the verification process be a one-time exercise or an ongoing requirement for every instance of data sharing or account modification? Without such clarification, the burden on parents could become excessive and intrusive, discouraging their participation in digital platforms. This lack of specificity in the rules leaves room for arbitrary implementation and could result in overreach, infringing on citizens’ fundamental rights to access and use digital services without undue surveillance or interference. These provisions, while intended to protect children, risk disproportionately affecting all users by normalising mass data collection and eroding the digital autonomy of individuals.

Personal Data of Person with Disability Who Has Lawful Guardian

Under Rule 10, persons diagnosed with “severe mental disabilities”, specifically those registered under the Rights of Persons with Disabilities Act, 2016, or the National Trust for the Welfare of Persons with Autism, Cerebral Palsy, Mental Retardation and Multiple Disabilities Act, 1999, require consent to be submitted by their guardians before their data can be processed.

A few areas of concern around this provision for persons with disability who have lawful guardians are as follows:

  • This provision, while well-intentioned, risks a regressive implementation that undermines the agency and dignity of those with a diagnosis who are capable of autonomy. The clause does not capture the heterogeneity of the disabled population and will contribute to continued discrimination against the community. It is fundamentally at odds with the Rights of Persons with Disabilities Act, 2016 (RPwD Act), which encourages autonomy and independence in the disabled community.[9]
  • The draft DPDP Rules highlight a complex issue regarding consent management for persons with disability, specifically when their “legal guardians” are involved. While the Rules define legal guardians as those appointed under either the RPwD Act or the National Trusts Act, 1999, they overlook a fundamental difference between these two laws, leading to confusion and potential problems for persons with disabilities.
  • The National Trusts Act, 1999, follows a substituted decision-making model, where the guardian steps in to make decisions on behalf of the person with a disability, assuming they are unable to decide for themselves. On the other hand, the RPwD Act promotes supported decision-making, where the guardian’s role is to assist, not replace, the person’s judgment, recognising that persons with disabilities have the legal right and ability to make their own choices, with some assistance when needed. The Rules fail to consider these basic differences, which could lead to conflicts in certain situations, such as the following:
    • If a person with autism or cerebral palsy has a guardian under the National Trusts Act, does that mean the guardian can make all decisions about the person’s data, even if the individual is able to consent with little or no support from the guardian?
    • For someone under the RPwD Act with a guardian, what happens if the person themselves wants to consent to data sharing? Considering the RPwD Act focuses more on support rather than complete substitution, are they allowed to do so, or does the guardian still need to step in?
  • This confusion is more than a technical problem; it has real-life consequences. It risks wrongly assuming that all persons with disabilities lack the ability to make their own decisions. Both the RPwD Act and international standards, such as the United Nations Convention on the Rights of Persons with Disabilities (CRPD), are clear that people with disabilities should be able to make their own choices, with support if necessary.
  • The draft Rules also do not explain what happens if a person with a disability makes a decision independently, without involving their guardian. Would this be considered a violation of the law or the Rules? What impact would it have on their digital rights or their relationship with their guardian? These unanswered questions could unnecessarily complicate life for persons with disabilities and their families.
  • It is also unclear how the data of the children of persons with disabilities will be managed, which takes away parental agency even within the confines of the rules.
  • At its core, the problem is that the draft Rules do not recognise the diversity of abilities and needs among persons with disabilities. They treat everyone under the same blanket rule, which can unintentionally take away people’s autonomy and independence. To truly protect and empower persons with disabilities in the digital space, the Rules must clarify when guardianship is needed, respect the right to independent consent, and ensure that any support provided enhances, rather than overrides, the person’s voice.
III. Localisation of Certain Kinds of Data

Section 16 of the DPDP Act provides the Central Government with the power to “restrict the transfer of personal data by a Data Fiduciary for processing to such country or territory outside India as may be notified.” The Section allows for the free flow of data across international borders unless the Central Government blacklists a country by notification. Thus, the Act has liberalised India’s position on data localisation. However, the draft DPDP Rules attempt to impose data localisation requirements beyond Section 16 of the DPDP Act.

Under Section 10 of the DPDP Act, the Central Government has the power to notify a Data Fiduciary as a Significant Data Fiduciary (SDF) and require it to undertake other measures consistent with the provisions of the Act. Rule 12 builds on this power and requires an SDF to ensure that “personal data specified by the Central Government, based on the recommendations of a committee it constitutes, is processed subject to the restriction that the personal data and traffic data pertaining to its flow are not transferred outside the territory of India.”

This may be problematic because of the following reasons:

  • It is a well-settled principle that delegated legislation should not travel beyond the purview of the parent Act. The general power to make rules is strictly ancillary and does not enable the authority to extend the scope of the general operation of the enactment.[10] In this case, Rule 12 limits the free flow of data envisioned under Section 16 of the Act. Delegated legislation, by nature, cannot replace or modify the parent law, nor can it establish substantive law. If subordinate legislation tends to replace, modify, or create new law, it can be struck down as ultra vires.[11] The Rules should not go against the spirit of the statute.
  • The Rules are silent on the composition of the committee under this rule, leaving wide discretion. Furthermore, the nature of the recommendations is unclear, i.e., whether the recommendations are binding on the Central Government or not.
  • The Rules introduce data localisation, which may affect trade, create difficulties for international businesses, and impose additional burdens on MSMEs.
  • The ambiguity around the types of data covered by this provision will leave businesses uncertain, and they may perceive India’s data protection framework as opaque. This could erode confidence and impede business expansion and investment in the country. For instance, in the financial sector, Mastercard faced significant compliance difficulties under the Reserve Bank of India’s data localisation requirements, and how such sectoral mandates will interact with the Rules remains unclarified; other sectors may face similar outcomes. Localisation will also increase operational complexity for companies, especially those involved in data-driven activities like AI, cloud computing, and e-commerce, where uninterrupted data sharing is essential for innovation and future expansion.
IV. Due Diligence for Algorithmic Software

Rule 12 requires an SDF to “observe due diligence to verify that algorithmic software deployed by it for hosting, display, uploading, modification, publishing, transmission, storage, updating, or sharing of personal data processed by it are not likely to pose a risk to the rights of Data Principals.”

This provision raises concerns:

  • The Rules do not clarify what kind of due diligence is expected from the SDF, creating uncertainty about their obligations under this provision.
  • The term “algorithmic software” is too broad and could encompass any and all computer programmes, including AI-based solutions. Accordingly, the scope and applicability of this provision need to be clarified.
V. Functioning of Board as Digital Office

Section 28 of the Act provides that the Board will function as a digital office and adopt such techno-legal measures as may be prescribed. While Section 2(m) of the Act defines a digital office as “an office that adopts an online mechanism wherein the proceedings, from receipt of intimation or complaint or reference or directions or appeal, as the case may be, to the disposal thereof, are conducted in online or digital mode”, the Act requires the Rules to expand on the techno-legal measures to be adopted by the Board.

However, the Rules state only that “the Board may adopt techno-legal measures to conduct proceedings in a manner that does not require the physical presence of any individual.” This leaves the techno-legal measures unexplained, with no further clarification provided in the Rules.

Endnotes

[1] Ministry of Electronics and Information Technology, The Digital Personal Data Protection Act, 2023, August 11, 2023, https://www.meity.gov.in/static/uploads/2024/06/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf.

[2] Ministry of Electronics and Information Technology, “Preamble,” Digital Personal Data Protection Act, 2023.

[3] Ministry of Electronics and Information Technology, Digital Personal Data Protection Rules, 2025, https://www.meity.gov.in/writereaddata/files/259889.pdf.

[4] Ministry of Communications, “Rural Youth Lead India's Digital Transformation,” Government of India, October 21, 2024, https://pib.gov.in/PressNoteDetails.aspx?NoteId=153358&ModuleId=3&reg=3&lang=1.

[5] Rule 10, Draft DPDP Rules, 2025, https://static.mygov.in/innovateindia/2025/01/03/mygov-999999999568142946.pdf.

[6] Medha Garg and Shravani N. Lanka, “Safety of Children Online: A Privacy Tradeoff?,” Internet Freedom Foundation, January 19, 2025, https://internetfreedom.in/safety-of-children-online-a-privacy-trade-off/.

[7] Garg and Lanka, “Safety of Children Online: A Privacy Tradeoff?”

[8] Rule 10(b), Draft DPDP Rules, 2025, https://static.mygov.in/innovateindia/2025/01/03/mygov-999999999568142946.pdf.

[9] The Rights of Persons with Disabilities Act, 2016, https://www.indiacode.nic.in/bitstream/123456789/15939/1/the_rights_of_persons_with_disabilities_act%2C_2016.pdf.

[10] A. Unnikrishnan, “Scope and Limitations of Subordinate Legislation Under IBC,” Insolvency and Bankruptcy Board of India, https://ibbi.gov.in/uploads/resources/6ebf0b574193197ec88c562234877962.pdf.

[11] Rajya Sabha Secretariat, Committee on Subordinate Legislation, Rajya Sabha Practice & Procedure Series, New Delhi, February 2005, https://cms.rajyasabha.nic.in/UploadedFiles/Procedure/PracticeAndProcedure/English/13/committ_sub_legis.pdf.

The views expressed above belong to the author(s).

Authors

Basu Chandola is an Associate Fellow. His areas of research include competition law, interface of intellectual property rights and competition law, and tech policy. Basu has ...

Shravishtha Ajaykumar is Associate Fellow at the Centre for Security, Strategy and Technology. Her fields of research include geospatial technology, data privacy, cybersecurity, and strategic ...

Tanusha Tyagi is a research assistant with the Centre for Digital Societies at ORF. Her research focuses on issues of emerging technologies, data protection and ...