Children and teenagers inhabit digital worlds shaped by laws, policies, and platform rules created primarily by adults. Their data is collected before they can consent meaningfully. Their digital footprints may follow them throughout life. Their access to information and expression faces restrictions intended to protect them but that sometimes constrain them. As digital life becomes increasingly central to childhood and adolescence, questions about young people's digital rights demand serious attention.
The Rights Framework
The UN Convention on the Rights of the Child establishes that children have rights, not merely protections to be granted by adults. These include rights to privacy, expression, information, association, and participation in decisions affecting them. Digital environments implicate all of these rights, yet their specific application to digital contexts remains underdeveloped.
Canada ratified the Convention and incorporated its principles into domestic law and policy to varying degrees. General Comment No. 25 (2021), on children's rights in relation to the digital environment, provides guidance on applying the Convention to digital contexts, but implementation in Canadian law remains incomplete.
Tension persists between protection and autonomy. Children clearly need protection from certain digital harms—exploitation, abuse, predation. But protective measures can also limit legitimate expression, restrict access to information, and deny agency. Balancing these concerns appropriately requires ongoing attention to children's evolving capacities and interests.
Privacy Rights
Children's digital data is collected extensively, often without meaningful consent. Websites, apps, games, educational platforms, and devices all gather information about young users. This data may be used for advertising, shared with third parties, or retained indefinitely.
Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) applies to children but lacks child-specific provisions comparable to those of the US Children's Online Privacy Protection Act (COPPA). Age verification remains unreliable, and parental consent mechanisms often provide more the appearance than the reality of control.
Children's privacy is often compromised by parents themselves through "sharenting"—sharing photos, stories, and information about children online. Children may object to this sharing but lack practical recourse. Their digital footprints begin before they can participate in creating them.
Educational technology creates particular privacy concerns. Platforms used in schools collect extensive data about students' learning, behaviour, and interactions. Students often have no choice about using these platforms and may not understand what data is collected or how it is used. Recent attention to EdTech privacy has prompted some improvements, but significant gaps remain.
Expression Rights
Young people increasingly express themselves through digital platforms but face restrictions both from platforms and from adults in their lives. Platform content moderation may remove youth speech that would not violate the rules if an adult had posted it. School policies often regulate students' online expression even outside school. Parents may monitor and control children's communications.
These restrictions sometimes serve legitimate protective purposes; sometimes they suppress legitimate youth voice. A blanket approach that treats all youth expression as subject to adult control fails to respect the evolving capacities that the Convention recognizes.
Anonymous and pseudonymous expression serves important purposes for young people exploring identity, seeking support for sensitive issues, or participating in contexts where revealing identity could be harmful. Efforts to eliminate anonymity online would disproportionately affect youth who benefit from these protections.
Information Access
Children have rights to access information, but their access is extensively filtered and restricted. Content filtering on school and library networks blocks material deemed inappropriate, but overblocking often prevents access to legitimate information about health, sexuality, identity, and other topics important to youth.
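To see how overblocking arises mechanically, consider a minimal, hypothetical sketch of the keyword blocklists that many filters rely on; the blocked terms and page titles below are invented for illustration, not drawn from any real product.

```python
# Hypothetical keyword blocklist; terms and pages are invented examples.
BLOCKED_TERMS = {"sex", "drug", "breast"}

def is_blocked(page_title: str) -> bool:
    """Block any page whose title contains a blocked term as a substring."""
    title = page_title.lower()
    return any(term in title for term in BLOCKED_TERMS)

pages = [
    "Sex education for teens",      # health information, but blocked
    "Breast cancer awareness",      # health information, but blocked
    "Local sports scores",          # unrelated, allowed
]

for page in pages:
    print(f"{'BLOCKED' if is_blocked(page) else 'allowed'}: {page}")
```

Because the filter matches surface strings rather than meaning, legitimate health and identity resources are swept up alongside the material the blocklist was written to catch.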
Age-gating restricts access to content deemed inappropriate for minors. While some restrictions serve clear protective purposes, others reflect contested judgments about what information young people should encounter. LGBTQ+ content, reproductive health information, and politically contentious material all face restrictions in various contexts.
Search algorithms may present different results to users identified as minors, potentially limiting access to information without users' knowledge. Personalization intended to protect can also constrain, and young people may not realize what they're not seeing.
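A similarly hedged sketch illustrates the silent narrowing that age-conditioned personalization can produce; the result catalogue and "restricted" flags are invented, but the pattern of dropping entries with no indication to the user matches the concern described above.

```python
# Hypothetical age-conditioned result filtering; the catalogue and the
# "restricted" flags are invented for illustration.
ALL_RESULTS = [
    {"title": "Sexual health clinic finder", "restricted": True},
    {"title": "Teen mental health resources", "restricted": True},
    {"title": "Homework help forum", "restricted": False},
]

def search_results(user_is_minor: bool) -> list[str]:
    """Return result titles, silently dropping restricted ones for minors."""
    return [
        r["title"]
        for r in ALL_RESULTS
        if not (user_is_minor and r["restricted"])
    ]

# A minor sees a shorter list with no indication anything was withheld.
print(search_results(user_is_minor=True))   # ['Homework help forum']
print(search_results(user_is_minor=False))  # all three titles
```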
Participation Rights
Digital platforms increasingly constitute important public spaces, but children and teenagers face barriers to participation. Age restrictions lock young people out of platforms where public discourse occurs. Even where technically permitted, youth voices are often marginalized or not taken seriously.
Decisions about digital policy—platform governance, content moderation, privacy regulation—rarely include meaningful youth input. Adults make decisions affecting young people's digital lives without consulting those affected. This contradicts Convention principles requiring that children's views be heard in matters affecting them.
School technology policies typically give students no voice. Decisions about which platforms to use, what monitoring to implement, and what restrictions to impose are made by administrators without student participation. The students most directly affected have the least say.
Protection From Harm
Children's rights to protection from exploitation, abuse, and harmful content are widely recognized. Digital environments create new vectors for harm that protection efforts must address.
Online child sexual exploitation remains a serious concern requiring robust responses. Canadian law criminalizes various forms of child exploitation, and enforcement efforts continue to develop. Platforms bear responsibility for preventing exploitation on their services, though compliance varies.
Cyberbullying affects significant numbers of young Canadians. Legal frameworks provide some recourse, but prevention and response remain inconsistent. Schools struggle to address behaviour that occurs beyond their direct oversight yet affects their students.
Exposure to violent, disturbing, or age-inappropriate content concerns many parents and policymakers. Yet defining what's harmful proves contested, and restrictions motivated by protection concerns may limit access to valuable information. Age-appropriate responses to legitimate concerns should avoid overreach.
Economic Rights
Children participate in digital economies as consumers, content creators, and increasingly as workers. Their economic interests deserve consideration within rights frameworks.
Advertising directed at children, including influencer marketing that may not be recognized as advertising, exploits developmental limitations in understanding persuasive intent. While some regulation exists, enforcement in digital contexts proves challenging.
Young content creators generate value for platforms while facing restrictions on monetization, limited labour protections, and potential exploitation by parents managing their careers. The kidfluencer phenomenon raises concerns about child labour in new forms that existing protections may not address.
In-game purchases and digital goods markets target young consumers with designs intended to maximize spending. Loot boxes and similar mechanics may constitute gambling by some definitions. Young people's economic vulnerability in these contexts deserves policy attention.
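A short worked sketch, using invented numbers, shows the economics behind the concern: when a desired item has a fixed drop probability per purchase, the expected spend to obtain it follows directly from that probability.

```python
# Illustrative arithmetic only; the drop rate and box price are invented.
drop_rate = 0.01   # assumed 1% chance a box contains the desired item
box_price = 4.99   # assumed price per box, in dollars

# With independent draws, the number of purchases needed follows a
# geometric distribution, so the expected number of boxes is 1 / drop_rate.
expected_boxes = 1 / drop_rate
expected_spend = expected_boxes * box_price

print(f"Expected boxes to obtain the item: {expected_boxes:.0f}")  # 100
print(f"Expected spend: ${expected_spend:.2f}")                    # $499.00
```

Even at these modest assumed numbers, the expected outlay bears little relation to the sticker price of a single box, which is one reason the comparison to gambling arises.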
Implementation Gaps
Even where digital rights for children are recognized in principle, implementation lags. Enforcement mechanisms prove inadequate. Complaint processes assume adult sophistication. Remedies for violations may not exist or may require resources young people lack.
Youth awareness of their own rights remains limited. Educational efforts have not kept pace with the rights framework's development. Young people may not know what rights they have, let alone how to exercise them.
Developing meaningful rights realization requires child-centered design of systems and processes—considering young people's actual capacities and circumstances rather than treating them as small adults or passive beneficiaries of adult protection.
Questions for Reflection
How should digital rights evolve as children mature? What rights should young children, pre-teens, and teenagers have, and how should transitions between these stages work?
When parents' interests in sharing about their children conflict with children's privacy interests, how should those conflicts be resolved?
What mechanisms could ensure young people's meaningful participation in decisions about digital policy that affects them?