World Children’s Day on 20 November 2024 marks 35 years since the UN Convention on the Rights of the Child (UNCRC) was adopted by the UN General Assembly. In the intervening years the UNCRC’s 54 articles have shaped the protection and empowerment of children globally – recognising each child’s status as a rights holder in every aspect of their life across the spectrum of civil, political, economic, social, health, and cultural rights. The UNCRC remains the foremost international treaty relating to children: a legally binding agreement endorsed by 196 countries, including every member of the UN except the United States, guiding governments, regulators and policymakers around the world in their approach to recognising and vindicating the rights of the child.
Much has changed societally and technologically since 1989. Childhood is increasingly digitised – one in three internet users worldwide is a child, according to research by Innocenti (2016), UNICEF's dedicated research centre. But the UNCRC remains as relevant as ever. In fact, the UNCRC's monitoring body, the UN Committee on the Rights of the Child, has issued a key guidance document, General Comment No. 25 (2021), in which it clarifies how the UNCRC applies to the digital lives of children. It examines issues such as commercial advertising, marketing and profiling; digital exclusion; access to information; harmful online content and communications; cyberaggression, online abuse and exploitation; digital surveillance; and digital play through the lens of UNCRC rights, and sets out the different roles of governments, state agencies, parents and businesses in protecting and respecting children’s rights.
The UNCRC is an international legal instrument which creates obligations for the countries which are parties to it to ensure that its rights and protections for children are implemented and respected at a national level. Additionally, the UN Committee on the Rights of the Child has specifically stated that businesses in the digital environment should respect children’s rights and prevent and remedy any abuses of their rights, and that countries which are parties to the UNCRC have obligations to ensure that businesses meet those responsibilities. Because the UNCRC applies at a state/governmental level, this generally means that regulators as state agencies must also ensure that the rights and principles in the UNCRC are respected and promoted in their specific areas of regulatory competence where children are impacted. Many regulators responsible for different aspects of digital regulation refer to the UNCRC in their guidelines and standards relating to children. The “best interests of the child” principle (one of the UNCRC’s four “General Principles” which guide the interpretation and implementation of all UNCRC rights) is often at the core of regulatory policies and guidelines for promoting and protecting the rights of child users of digital services.
The influence of the UNCRC can also be seen in laws and policies which govern different elements of the online ecosystem, such as consumer protection, data protection and privacy, marketing and advertising, digital services and online safety. In fact, the UNCRC continues to shape emerging legal frameworks in critical areas of digital regulation like artificial intelligence; for example the EU’s new AI Act references both the UNCRC and General Comment No. 25, emphasising the need to consider children’s vulnerabilities and provide such protection and care as is necessary for their well-being in the context of the development and deployment of AI systems.
It is often remarked that, despite the huge proportion of children accessing digital services as part of their everyday lives – to communicate, play and learn – the digital world was not designed with them in mind. However, high-profile enforcement decisions in the last few years, in particular from regulators in the EU and US, have demonstrated that digital businesses with children amongst their user populations must not treat child users in the same way as adult users. Instead, digital service providers need to assess the specific risks posed to children by their services as a whole, as well as by different service features and functionalities, and take measures to ensure that child users benefit from a high level of protection when using them. In essence this means safety-by-design, privacy-by-design and security-by-design, but there is no “one size fits all” set of measures; rather, digital service providers need to tailor their services depending on the risks posed to children. Risk assessments may take into account issues such as: the type of content which child users may encounter using the service (e.g. violence, pornography); the nature of contact or communications which child users may receive from other users (e.g. the possibility of unknown adults contacting them); the conduct of other users towards them online (e.g. the possibility of harassment or exploitation); and how the service’s commercial practices may pose risks (e.g. the possibility of profiling and personalised advertising, or addictive design).
Around the world, and even within the same region, there can be major variations in national laws and regulatory requirements relating to children’s use of digital products and services – for example, some countries may have very specific online safety laws while others may not. The consequence is that, for businesses offering digital products and services to mixed audiences (i.e. children and adults) across multiple jurisdictions, it is very challenging to navigate the growing web of different national legal requirements, regulatory decisions and guidelines aimed at protecting children and respecting their rights.
Critical practical issues which digital businesses may need to consider on a country-by-country basis include the following:
The benefits of prioritising the protection of children and respect for children’s rights under the UNCRC are countless for any digital business which has children as users. Aside from mitigating the legal and regulatory risks arising from non-compliance, implementing specific measures to help children have a safe, healthy and positive experience as digital service users will go towards promoting more trusted (and potentially longer-enduring) consumer relationships with children (who are the adult users of the future) as well as with parents and guardians. It will also help digital businesses position themselves as market leaders in championing best practices and driving higher standards of protection and respect for children’s rights in their sector. Having a strategy for protecting children and promoting their rights may also help digital businesses achieve their ESG and CSR goals and demonstrate higher standards of ethical business practice.
One of the first steps for any digital services provider in implementing higher standards of protection for children and respect for their rights is understanding the legal landscape in the jurisdictions where children access their services. We have developed a tool to help digital businesses understand and map a range of core obligations concerning children as users of digital services, on a country-by-country basis, in more than 40 jurisdictions. We cover obligations across consumer protection, data protection and privacy, advertising and marketing, and online safety, as well as considering how the UNCRC has been implemented at national level. We also provide an at-a-glance risk profile for each country covered, using the RAG (red, amber, green) methodology, to help digital businesses design and prioritise their own multi-jurisdictional strategy for complying with requirements to protect children and respect their rights as service users.
We’ll be launching this exciting and innovative new tool early next year. To register your interest click here.