Author: Pooja Pandey

Expert Speak India Matters
Published on Oct 16, 2024

The rapid growth of Ed-Tech during the pandemic highlights gaps in quality, legal frameworks, and children's rights—India must adopt proper measures to address these challenges

Ed-Tech in India: The quest for child privacy and well-being

Introduction

In the past decade, the appeal of integrating technology into education has risen meteorically. The Education Technology (Ed-Tech) sector in India received its golden ticket during the COVID-19 pandemic, which endorsed its legitimacy and pervasiveness. The necessity of Ed-Tech services, along with their rapid growth and scale of adoption during the pandemic, led in some cases to a reduced focus on quality parameters, legal frameworks, and fair-trade requirements. This, in turn, affected children's rights and well-being.

The integration of Ed-Tech into mainstream education continues to gain interest, and is recognised, prioritised, and actively promoted by the National Education Policy (NEP 2020) (Sections 23 and 24). This interest, however, is framed predominantly around service delivery, curriculum, pedagogy, innovation, and impact. Worryingly, one of the primary beneficiaries of Ed-Tech, viz. children, remains the least discussed in Ed-Tech-linked discourse, inter alia, the normative aspects and impacts of Ed-Tech on children's right to privacy, agency, and developmental rights.

Ed-Tech and child well-being

The data privacy of children is closely tied to their rights and well-being. Ed-Tech has been accused of harvesting data and harming children. Recent international reports have highlighted recurring, non-consensual digital tracking of children and instances of children's data being used for hard-selling and mis-selling tactics. While there is no definitive test for classifying an activity as Ed-Tech, the commonality is that it involves platforms that collect, process, and use large amounts of personal data. Piecemeal data on children, often obtained without prior consent, can be used to construct full profiles that include critical information such as identities, location, biometrics, preferences, and abilities.

The said data can either inform the design and pricing of Ed-Tech 'products' or be invasively shared with data brokers, advertisers, or third parties, increasing the risks of behavioural assessment, surveillance, unsolicited advertising, extortion, and exposure to sensitive and age-inappropriate content. These implications are also cross-border, likely leading to violations of international regulations and the global commercialisation of children's data, which can become increasingly difficult to track and curb. Evidence suggests that surveillance can even extend to families and educational institutions, especially in cases of device sharing or home network scanning. The intermeshing of Artificial Intelligence (AI) in Ed-Tech can lead to cognitive manipulation, controlling or altering student behaviour. For instance, some Ed-Tech platforms offer AI-led prods and personalised learning options, steering parents and children towards advanced learning options and altering their educational behaviours and learning patterns.

Landscape overview

There continues to be definitional ambiguity about what constitutes 'Ed-Tech', as a sector and as a service. Legally, Ed-Tech is not recognised as sui generis and is governed, in fragments, by a patchwork of legislation, including the newly formulated Digital Personal Data Protection (DPDP) Act 2023, the Information Technology Act 2000, the Consumer Protection Act 2019, the Rights of Persons with Disabilities Act 2016, and the Right to Education Act 2009. Barring the DPDP Act, which partially addresses child protection and data privacy, the other legislations are peripheral in scope, at best.

Additionally, Ed-Tech in India is provided by a diverse range of providers—large and small; state and non-state; for-profit and not-for-profit. However, a comprehensive mapping of the Ed-Tech providers and the nature of their services or offerings is yet to be undertaken.

With respect to governance, there is a lack of clarity in identifying the focal ministry(ies) or bodies responsible for regulating Ed-Tech and its framework. While the central government has issued parliamentary responses and advisories in the past to warn against the misuse of Ed-Tech, their enforceability has been limited. This ambivalent regulatory stance has created further challenges, such as data breaches, economic exploitation, and the digital exclusion of marginalised children in India.

Challenges

Several challenges impede child privacy and well-being in Ed-Tech. In addition to regulatory ambiguities, the sector lacks clear norms for the registration, categorisation, and accreditation of Ed-Tech providers. There also remains a major gap in the technical and socio-legal evidence that could establish the extent and modality of privacy and rights infringements. The former gap reflects a lack of functional understanding of how the technology operates; the latter, a limited regulatory and experiential understanding of the governance and adoption of Ed-Tech and its impact on child privacy.

The concentration of power in global tech companies and ancillary organisations is rapidly redefining India's education sector. There is a significant asymmetry of power and information between Ed-Tech providers and all other stakeholders, compounded by providers' data practices, which are often difficult to comprehend. By leveraging personal data to inform the design and format of future educational services, these providers frequently modify curricular and pedagogical aspects, with or without aligning them to state-mandated frameworks governing curriculum and pedagogy. The absence of information on clear adherence to such frameworks can undermine the oversight and capacity of the state and educational institutions to effectively manage these technologies. The implications extend beyond privacy and safety to educational outcomes themselves.

Another critical challenge relates to the awareness and perception of Ed-Tech among parents, educators, and education systems. These stakeholders often fail to grasp the nature, risks, and impacts of data misuse in Ed-Tech, favouring pedagogical, curricular, or financial considerations while actively de-prioritising concerns of child privacy and well-being. The adoption of Ed-Tech is frequently justified by its purported ability to bring innovation to education. However, such claims are not supported by conclusive evidence. On the contrary, Ed-Tech services often exhibit questionable pedagogical value and have been reported to reinforce biases and beliefs, which is particularly detrimental in an Indian context marred by socio-economic inequalities and discrimination.

Way forward

Addressing the issues of children's data privacy and safety requires normative, regulatory, institutional, and design-level commitments. Understanding the real aims and objectives of education is the first step; this would inform the 'Ed' aspect of Ed-Tech and set it apart from any other service- or profit-oriented business endeavour. Today's child is tomorrow's active citizen and consumer. Any distortion, extortion, or manipulation of their personal data can have catastrophic impacts on their life pathways, including higher education, job opportunities, world views, and beliefs. Safeguarding their educational experiences is critical.

To this end, the first step would be instituting stringent regulatory reforms, including: a) a definition of 'Ed-Tech' and its constituents in India; b) codification of Ed-Tech norms and standards; and c) the institution of a statutory agency/board/organisation of the central and state government(s) to act as the accrediting and registering body for all Ed-Tech organisations and individuals, and to provide grievance redressal. Such a body should also offer a unified digital interface through which recipients of Ed-Tech services can check and control quality and privacy concerns, including features like end-to-end encryption and automatic recognition of, and sensitivity towards, explicit words and personal information.

As the sector evolves, there are also demands for a push towards self-regulation. This must be viewed critically and should ultimately be tied to a codified set of practices, to ensure uniform monitoring and checks. The creation of autonomous bodies such as the National Educational Technology Forum (NETF) by the Ministry of Education is a welcome step, inter alia, in generating national and sub-national evidence on Ed-Tech.

Secondly, there is a vital need to understand the intricacies of data use and processing and their impacts on children's privacy and well-being, coupled with building awareness among parents and educators. Measures such as Ed-Tech audits, or ranking Ed-Tech platforms on their privacy practices, can be adopted to foster awareness among parents. Recent longitudinal, large-scale studies in other contexts have captured perceptions and risk assessments of Ed-Tech data privacy; similar assessments are needed in India to document the experiences and awareness of stakeholders using Ed-Tech. The media can also play a significant role in fostering wider awareness and disseminating information.

Thirdly, providers must be instructed to adopt robust design measures that prevent misuse and deceptive data practices. These design measures should be fundamentally interlinked with broader educational design principles and deter risky design features that may harm the child. The informed consent of the child and parents should be mandated. A useful example is Google, which, after repeated advocacy and pressure, revised design features on its Ed-Tech platforms in favour of children's safety and privacy. Schools and institutions must conduct thorough evaluations and implement safeguards before deploying Ed-Tech platforms for children, while also training parents, teachers, and administrators to responsibly manage student data.

Lastly, the agendas and vision of the state vis-à-vis Ed-Tech need to be clearly articulated and communicated, thereby guiding any subsequent interventions. Such interventions must be mindful of the Indian socio-political context and its inequalities, to minimise harm to the most marginalised children and to commit to realising their rights to education, protection, and development.


Pooja Pandey is a Senior Resident Fellow in Education at Vidhi Centre for Legal Policy, New Delhi. 

The views expressed above belong to the author(s).