
Image by Lone Thomasky & Bits&Bäume / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/
Who is Using Data and How?
Every day, technology and the internet become more integrated into our lives, and the data generated by our time online and our virtual interactions is an unavoidable by-product. 149 zettabytes (where a zettabyte is one trillion gigabytes) of data were “created, captured, copied, and consumed” (Statista 2025c) in 2024 alone. However, the lack of transparency and agency we have over our data remains a major concern, making data privacy and protection an urgent need.
The commodification of our data continues to grow, with service providers increasingly focused on activities such as trading users’ data, and the amount of sensitive data and personally identifying information they collect for these aims increasing in step. Personal data collected in this manner could be misused for crime, exploited for targeted marketing, or put to other such uses. While some of these cases are unequivocally bad (such as personal data being used for doxing or e-crime) and some might have benefits (such as targeted marketing offers), none of them has been explicitly agreed to by users. Thus, data’s commodification, combined with its “Non-Rivalrous, Invisible and Recombinant nature” (Pandey et al 2020: 37), leaves it vulnerable to unchecked proliferation and potential harm to unsuspecting users. This problem is compounded by the fact that a majority of online activities do not involve conscious or voluntary data disclosure (tracking of search terms, browsing history, cursor activity, etc).
By relying on opt-out rather than opt-in measures, service providers place the burden of responsibility entirely on the individual. Under such measures, any disclosure of information is treated as a user’s willing choice to disregard their own privacy interests, even though it rarely represents an informed or deliberate choice (UNESCO 2022). These opt-out measures usually rest on broad, generic statements of consent, leaving users and their data vulnerable. Users have little to no knowledge of how their data could be used or distributed, creating a clear information asymmetry.
In 2024, 31% of respondents to a 12-country survey shared that, as internet users, they did not regulate their cookie preferences, instead accepting all cookies by default (Statista 2025b). While this was a decrease from the previous year (45%) (Statista 2025b), set against the 5.5 billion global internet users in 2024 (up from 5.3 billion the previous year) (Statista 2025a), 31% remains an extremely significant share: if it were representative of all users, it would correspond to roughly 1.7 billion people. Further, even consent given with full knowledge of the terms need not reflect the user’s willing acceptance of them; it might instead reflect the need to access the service. This leads to a clear power imbalance, with users left with an “illusion of empowerment”: presented with a choice, yet in effect provided with no better alternatives. If they want to access a service or site, and right away at that, most people will consent simply to move forward. This imbalance is even acknowledged in the legal terms for internet users and service providers, with the former referred to as “Data Subjects” and the latter as “Data Controllers”, in a modern-day pastiche of feudal hierarchies instead of data democracy (Lawrence 2016).
There is a real need to combat the misuse and non-consensual monetisation of our data. In a world where everything we do leaves an electronic trail, yet there is little to no transparency regarding how this data is collected and where it is shared, solving the issues of consent and accountability in the ethical use of our data becomes vital.
How Do Data Trusts Safeguard Data?
By creating a platform for users to provide explicit consent on when and how their data is used, data trusts help solve this problem to a large extent, returning control of the data to the users. First publicly put forth in 2016 by Professor Neil Lawrence, data trusts were described as “a mutual organisation formed to manage data on its members’ behalf. Data subjects would pool their data forming a trust, stipulating conditions under which data could be shared” (Lawrence 2016). These trusts could serve numerous purposes, from managing medical data to deciding whether a user wants better product recommendations; both trivial and sensitive data could be entrusted to them. By providing choice and allowing the concerns of data subjects to be highlighted and reflected, an ecosystem of data trusts would allow the current landscape to move from data feudalism to a data democracy, with “data governance by the people, for the people and with their consent” (Lawrence 2016).
Expanding upon this concept, in 2019 Professor Sylvie Delacroix and Professor Neil Lawrence published a paper, ‘Bottom-up data Trusts: disturbing the ‘one size fits all’ approach to data governance’. The article establishes the need for data trusts and provides several case studies of how they could be incorporated into legal frameworks and everyday navigation of the internet, through use cases such as social media use and medical data management. It further considers and rebuts certain arguments against data trusts, such as concerns about security and exit procedures.
Professor Delacroix and Professor Lawrence’s article thus provides a foundational and necessary understanding of data trusts and their potential implementation, and this paper aims to build upon it to consider a youth-centric adoption of the model. In order to do this, we must clearly outline the mechanism and processes of a data trust:
The legal mechanism of a data Trust aims to leverage the resources concomitant with the pooling of data to directly address the power-asymmetries mentioned above. A Trust is formed when a person in whom a set of resources is vested—the Trustee—is compelled to hold and manage those resources either for the benefit of another person(s)—the beneficiaries—or for some legally enforceable purpose(s) other than the Trustee’s own. Aside from its allowing for ‘more subtle shades of ownership than the common law permits’, the duties which a Trust structure imposes upon the Trustee(s) are also better suited to the particular vulnerabilities at stake, as they demand the Trustee’s undivided loyalty and dedication to the interests and aspirations of the data subjects (as beneficiaries of the Trust). (Delacroix and Lawrence 2019: 40)
This comes with the added caveat that “The Trustee’s duties are fiduciary”, and thus “considerably more onerous than a ‘duty of care’” (Delacroix and Lawrence 2019: 40).
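To make this mechanism concrete, below is a minimal, illustrative sketch in Python of how a data trust’s core relationships could be modelled: data subjects pool their data and stipulate sharing conditions, and the trustee may release data only for purposes a subject has explicitly consented to. All names here (DataTrust, DataSubject, SharingCondition, share_for) are hypothetical and invented for exposition; this is a sketch of the idea, not an implementation of any existing system.

```python
from dataclasses import dataclass, field

# Hypothetical, illustrative model of the trust structure described by
# Delacroix and Lawrence (2019): data subjects pool their data in a trust,
# and the trustee may share it only under conditions the subjects stipulate.

@dataclass
class SharingCondition:
    purpose: str     # e.g. "medical research", "product recommendations"
    allowed: bool    # whether the subject consents to this purpose

@dataclass
class DataSubject:
    name: str
    conditions: list[SharingCondition] = field(default_factory=list)
    data: dict = field(default_factory=dict)  # the pooled data itself

    def permits(self, purpose: str) -> bool:
        """True only if the subject has explicitly opted in to this purpose."""
        return any(c.purpose == purpose and c.allowed for c in self.conditions)

class DataTrust:
    """The trustee holds the pooled data and owes a fiduciary duty to the
    beneficiaries: it may share data only for purposes they have agreed to."""

    def __init__(self) -> None:
        self.beneficiaries: list[DataSubject] = []

    def enroll(self, subject: DataSubject) -> None:
        self.beneficiaries.append(subject)

    def share_for(self, purpose: str) -> list[dict]:
        # Data flows out only with explicit consent for the named purpose.
        return [s.data for s in self.beneficiaries if s.permits(purpose)]

# Usage: a subject consents to research use but not to marketing.
subject = DataSubject("alice", data={"steps_per_day": 8000})
subject.conditions.append(SharingCondition("medical research", allowed=True))

trust = DataTrust()
trust.enroll(subject)
print(trust.share_for("medical research"))    # [{'steps_per_day': 8000}]
print(trust.share_for("targeted marketing"))  # [] -- never consented
```

Note how the default in this sketch is opt-in: absent an explicit condition, share_for releases nothing, inverting the opt-out model criticised above.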
But What About Children?
However, these data trusts must also account for the data of children, arguably the most vulnerable user segment: newly active online, with limited maturity and experience, and therefore limited capability to handle the processing of their data effectively. Despite this, young people make up a significant portion of internet users, with those between the ages of 15 and 24 being “more likely to use the Internet than the rest of the population” (United Nations n.d.) and increasing numbers of younger children and toddlers being given access to devices every year (Guard Child n.d.). For these reasons, and for the sake of data privacy and protection, the digital presence of children should be carefully considered. Recital 38 of the European Union’s GDPR (Intersoft Consulting n.d.) is cognisant of this fact, stating:
Children require specific protection with regard to their personal data as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. (Recital 38).
Although awareness of this fact has grown in recent years, with care taken by parents and even the law to ensure protections, children continue to slip through the gaps of online data protection. This is reflected in the statistics: only 15% of parents are aware of the social internet usage habits of their children, 86% of girls claim they can have online conversations without their parents being aware, and 31% of children have lied about their age to access certain sites (Guard Child n.d.). The last is of particular concern, as social media use (Instagram, Snapchat, Facebook, etc) has been rising among children as young as 8 to 12 years of age (Moyer 2022), even though the minimum age to use such platforms is 13, owing to laws that prohibit companies from collecting and using children’s data because of the risks involved, including data collection for marketing purposes (United Nations n.d.). This is even more concerning when compounded with children’s limited knowledge, understanding, and awareness of data privacy and protection; children show confusion about how much privacy they have and about how their data might be valuable to service providers and at risk of further (mis)use (Livingstone et al 2019).
Even responsible adults such as parents and educators might not be aware of the full degree of consent they provide on behalf of their children or wards. When children make these choices themselves, with limited information and understanding, it can be even more dangerous. Compromised data security could expose children to myriad harms, such as cyberbullying, blackmail, or identity theft. With most digital systems still vulnerable to data theft, this is indeed a situation where it is better to be safe than sorry. There is thus a general acknowledgement that children cannot, and must not, be expected to tackle by themselves the burden of understanding and engaging with the complex digital landscape that currently exists (UNESCO 2022).
It is of note that most of the literature focuses on a generalised category of “children”, encapsulating everyone from those twelve years and below to teenagers, even though children and adolescents across different age groups can demonstrate different levels of maturity in understanding data privacy safeguards. Just as those under thirteen are often not informed enough to make truly informed choices, teenagers too may have differing levels of awareness of data privacy and security. Teenagers from economically disadvantaged groups, those out of school, and first-generation digital users may be particularly vulnerable. Teenagers from privileged backgrounds who have been taught digital literacy are in many ways also not as empowered as it may appear: owing to the stage of growth and change they are at, they are often more intent on gratification and unable to gauge the threats of data proliferation, leading to effectively the same end result for all groups.
Keeping all of these considerations in mind, the digital presence of children and adolescents poses several questions regarding their digital footprint and how they can be incorporated into the model of data trusts. Although, as minors, they could be signed up to a data trust by their guardians, many teenagers and youth are on social media platforms, such as Instagram, without the knowledge of their parents. In such a situation, having parents or guardians sign minors on to data trusts would not be effective: the guardians might not even know which social media platforms their wards use, and the data generated on those platforms would never be piped to the data trusts, falling through the cracks and remaining open to misuse. Further, as minors become adults, they may wish to own their past and future data, which would need to be handled effectively, given that their existing registration with a data trust is likely under their guardians’ control. And then there is always the question of what happens to first-generation digital users, who are often managing their own and their parents’ digital interactions.
How Data Trusts Could Accommodate the Needs of Children
There are differing ways to tackle the issue of ensuring data trusts effectively handle the online data of children. One option could be setting up a separate default data trust for minors, with extremely stringent norms regarding data sharing beyond the trust. For this to work, minors would need to self-report any social networking accounts and platforms they use. If parental backlash were a concern, a system could be devised for these discussions to be had with the trustee directly. Co-opting schools, by training and sensitising teachers on this topic so that they are available to counsel and advise students, might also be a means of increasing awareness among students.
Older minors, such as those above the age of 13 (the lowest age of lawful consent for data processing that member states may set under Article 8 of the GDPR), or above the minimum legal age of consent applicable to their nationality, could begin to discuss their preferences for the handling of their data under the supervision of a guardian or trusted adult (or without, if the guardian permits). Their data could then be switched to another, more mainstream data trust that better aligns with their interests, unless their guardians choose a data trust for them in accordance with their own.
Further, for the transition from minor to adult, a process could be instituted similar to that suggested by the Joint Parliamentary Committee set up in India in 2019 to evaluate the Personal Data Protection (PDP) Bill. While those recommendations concern the Data Protection Authority of India (DPA), a body meant to protect the interests of users, the processes are still pertinent to data trusts. Minors could re-register with the data trust and provide fresh consent a certain period before attaining the age of 18; for the DPA this period was stated to be three months (Malhotra and Bhilwar 2024), but here the exact period could be negotiated between the trust and guardians. This fresh consent could also include the terms of transferring the minor’s past data to the trust they register with as adults, if they so desire, or the deletion of their past data, in accordance with the Right to be Forgotten (Information Commissioner’s Office n.d.; Malhotra and Bhilwar 2024).
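To illustrate how such a transition might work in practice, the sketch below (in the same hypothetical Python style as earlier) models a minor’s registration held under guardian control, a re-registration window before the age of majority, and the choice between transferring past data and deleting it under the Right to be Forgotten. The 90-day window stands in for the three-month period cited above, as a placeholder for whatever period the trust and guardians negotiate; every class and method name here is invented for illustration.

```python
from datetime import date, timedelta

# Hypothetical sketch of the minor-to-adult transition described above.
# The 90-day window mirrors the three-month period suggested for India's
# DPA (Malhotra and Bhilwar 2024); everything else is illustrative only.

REREGISTRATION_WINDOW = timedelta(days=90)  # placeholder, negotiable

class MinorRegistration:
    def __init__(self, ward: str, guardian: str, birth_date: date):
        self.ward = ward
        self.guardian = guardian  # the registration sits under guardian control
        self.birth_date = birth_date
        self.past_data: list[dict] = []

    def age_of_majority_date(self) -> date:
        # Naive 18th-birthday calculation, good enough for a sketch.
        return self.birth_date.replace(year=self.birth_date.year + 18)

    def in_reregistration_window(self, today: date) -> bool:
        """Fresh consent may be collected in the window before turning 18."""
        majority = self.age_of_majority_date()
        return majority - REREGISTRATION_WINDOW <= today < majority

    def reregister(self, today: date, keep_past_data: bool) -> list[dict]:
        """Collect fresh consent: either hand past data over to the adult's
        chosen trust, or delete it, per the Right to be Forgotten."""
        if not self.in_reregistration_window(today):
            raise ValueError("Fresh consent can only be given in the window "
                             "before the ward attains majority.")
        if keep_past_data:
            return self.past_data       # transferred to the adult's trust
        self.past_data.clear()          # erased at the ward's request
        return []

# Usage: a ward turning 18 soon chooses to erase their childhood data.
reg = MinorRegistration("ward", "guardian", birth_date=date(2008, 1, 15))
reg.past_data.append({"platform": "social", "posts": 42})
print(reg.reregister(today=date(2025, 12, 1), keep_past_data=False))  # []
```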
A key consideration here is that several, if not most, choices regarding a child’s default data trust and preferences would be made by the child’s guardian or parent. However, the guardian might not themselves be informed enough to select the data trust most appropriate for their own needs and those of their child. Further, they might lack the awareness to sign up for a data trust at all. This is especially true in countries such as India, where a clear urban-rural divide in digital literacy could prevent parents from appreciating the importance of correctly handling their data and that of their child. This divide can be further exacerbated by the intersections of gender and income disparity. Necessary provisions would thus need to be made to lessen this information gap and enable easy sign-ons to data trusts.
The Way Forward
Finding a solution for onboarding young people to data trusts would be very important in countries like India, where tens of millions of minors are online, often without the knowledge of their parents. The same problem likely exists in most countries of the global south, affecting hundreds of millions of children.
The intent of this article is to bring this issue to the forefront and spark discussion. Data trusts could be truly revolutionary for the handling of individuals’ data, and as conversation about their potential implementation continues, it should not leave behind the more vulnerable segments of internet users, such as children and adolescents. This is also a call to action to include young people in developing the policies and systems that bear on their digital lives. While this article has provided only a rudimentary understanding of the problem and how it could be addressed, it is my hope that it draws attention and prompts more developed solutions, built with inclusive and participatory approaches that involve young people.
Reflexivity note:
As a 16-year-old grade 12 student from an urban context, I acknowledge the privilege I might have in terms of digital literacy and my know-how of data usage by service providers. This article emerged from my concerns about the use of my personal data by the various platforms I access. My hope is that more young people speak up on these issues, and that policy makers and service providers include young people as they frame policies for data trusts and data protection.
About the Author:
Avni Nautiyal is a grade 12 student at Oberoi International School, JVLR, Mumbai.
References:
- Delacroix, Sylvie, and Neil D. Lawrence (2019): “Bottom-up data Trusts: disturbing the ‘one size fits all’ approach to data governance,” International Data Privacy Law, Vol 9, No 4, pp 236–252.
- GDPR, Recital 38, “Special Protection of Children’s Personal Data,” (n.d.): Intersoft Consulting, https://gdpr-info.eu/recitals/no-38/.
- Guard Child, “Internet Statistics,” (n.d.): Guard Child, https://www.guardchild.com/statistics/.
- “How does the right to erasure apply to children?” (n.d.): Information Commissioner’s Office, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/children-and-the-uk-gdpr/how-does-the-right-to-erasure-apply-to-children/.
- Lawrence, Neil, “Data trusts could allay our privacy fears,” (2016): The Guardian, https://www.theguardian.com/media-network/2016/jun/03/data-trusts-privacy-fears-feudalism-democracy/.
- Livingstone, Sonia, Mariya Stoilova, and Rishita Nandagiri (2019): “Children’s Data and Privacy Online: Growing Up in a Digital Age – An Evidence Review,” London: London School of Economics and Political Science, accessed on 19 July 2025, https://eprints.lse.ac.uk/101283/.
- Malhotra, Charru, and Anushka Bhilwar (2024): “Striving to Build Citizens’ Trust in Digital World: Data Protection Bill (2021) of India,” Technology, Policy, and Inclusion: An Intersection of Ideas for Public Policy, Anjal Prakash, Aarushi Jain, Puran Singh, and Avik Sarkar (eds), Hyderabad: Routledge, pp 141–161.
- Moyer, Melinda Wenner, “Kids as Young as 8 Are Using Social Media More Than Ever, Study Finds,” (2022): The New York Times, https://www.nytimes.com/2022/03/24/well/family/child-social-media-use.html.
- Pandey, Alok Shankar, Nisheeth Dixit, and Mahim Sagar (2020): “Data Protection Framework for India,” Telecom Business Review, Vol 13, No 1, pp 36–46.
- Petrosyan, Ani, “Global number of internet users 2005-2024,” (2025a): Statista, https://www.statista.com/statistics/273018/number-of-internet-users-worldwide/.
- Statista Research Department, “Share of internet users who manage their cookie preferences worldwide 2023-2024,” (2025b): Statista, https://www.statista.com/statistics/1616768/cookie-presenfence-manage-worldwide/.
- Taylor, Petroc, “Amount of data created, consumed, and stored 2010-2023, with forecasts to 2028,” (2025c): Statista, https://www.statista.com/statistics/871513/worldwide-data-created/.
- United Nations, Global Issues, “Child and Youth Safety Online,” (n.d.): United Nations, https://www.un.org/en/global-issues/child-and-youth-safety-online/.
- UNESCO (2022): Minding the data: protecting learners’ privacy and security, France: United Nations Educational, Scientific and Cultural Organization.