Regulating the UAE’s Virtual Playground: New Child Digital Safety Law

Contacts

Saarah Badr

Senior Counsel
United Arab Emirates

Through working in-house in the media & entertainment industry for many years, I bring extensive regional knowledge, coupled with a practical and commercial approach.

Charles Christie

Associate
United Arab Emirates

I am an associate at our Dubai office, where I assist clients with commercial, technology, and data-related issues across the Middle East, with a primary focus on the UAE and Saudi Arabia.

Reflecting international best practice, the United Arab Emirates (UAE) has issued Federal Decree Law No. 26 of 2025 on Child Digital Safety (the Digital Safety Law), regulating the virtual worlds that are increasingly becoming the primary playground of the under-18s.  The law applies to Internet Service Providers (ISPs) and digital platforms (including gaming, social media, live-streaming and e-commerce platforms) operating in the UAE or targeting users in the UAE.  This is a clearly broad scope that captures platforms with a large UAE user base but no physical UAE presence.

The Digital Safety Law came into force on 1 January 2026, with compliance required by 1 January 2027.  Implementing regulations, specifying the penalties for breach and enforcement mechanisms, will be issued at a later stage.  A Child Digital Safety Council will also be established to coordinate policies, strategies and directions in relation to child online safety.

Here are our six key takeaways from the new Digital Safety Law:

  1. The “Virtual World” defined: To our knowledge, at the time of writing no other law defines this emerging digital environment, making this a first for the Digital Safety Law and the UAE.  A virtual world is defined as a “computer-generated simulation of reality or an imaginary environment…where users participate through digital avatars, interacting with each other and with elements of the environment according to a set of rules, within a continuous virtual time and space.”  These largely unregulated virtual spaces, and the use (and abuse) of avatars, are key risks for parents and platforms alike (particularly in the Gulf region, where revealing clothing, inappropriate props and romantic interactions conflict with cultural and religious sensitivities).  Under the law, a child’s parent or guardian must refrain from displaying or negatively exploiting the child online or in a virtual world in a way that threatens the child’s privacy, dignity and social and psychological wellbeing (such as exposing them to online bullying).  This targets “sharenting”, the practice of caregivers sharing content about children online, and requires careful consideration before uploading content featuring minors.
     
  2. Game over for “Harmful Content”: This is defined broadly, using a general concept of content that negatively affects the “moral, psychological or social values of children or society…and that violates the media content standards”.  The media content standards are those set out in the UAE Media Law and its Implementing Regulations, integrating the two content-related legislative regimes.  The result is a multi-layered compliance framework; online content must now comply with the Cybercrimes Law, the Media Law and the Digital Safety Law (although the concepts of non-compliant content are consistent).  Under the Digital Safety Law, all digital platforms have a duty to protect children from harmful content, including:
    • Implementing clear and user-friendly tools to report instances of harmful content.
    • Leveraging their technical capabilities, including AI systems and machine learning algorithms, for the proactive detection, removal or reporting of harmful content (the “proactive” detection and removal by AI systems is of particular interest, as it appears to go further than the Cybercrimes Law’s requirement of reactive takedowns upon notice).  A simplified illustration of one such workflow appears after this list.
    • Immediately reporting harmful content to the “concerned authorities”.
       
  3. All bets are off: Platforms are categorically prohibited from allowing children to access, register for or participate in online commercial games content, whether directly or indirectly, including promotional content.  “Online Commercial Games” are defined as games whose primary purpose is generating (direct or indirect) revenue for the operator, including placing bets and wagers for money or money’s worth.  This is a broad definition and may well capture non-playable gaming content if it is considered to promote gambling activities. 
     
  4. Rated and restricted: Platforms will be classified according to their type, content, usage and impact (to be determined by a decision of the Cabinet, in coordination with the relevant authorities, at a later stage), with access restricted for children below the required minimum age.  In support of the classification system, platforms are required to implement “effective and reasonable” age verification mechanisms to ensure that content can be appropriately age-gated. 
     
  5. Hands off under-13s’ data: Platforms are now specifically prohibited from collecting, processing, publishing or sharing the personal data of under-13s unless they have obtained “explicit, documented and verifiable consent” from the child’s parent or guardian.  There must also be a quick and accessible method to withdraw consent, and strict controls around access to data within the platform by its personnel.  Under-13s’ data must not be used for targeted advertising or tracking activity for commercial purposes.  Although ‘explicit consent’ is not yet defined under the UAE’s Protection of Personal Data Law, our expectation is that the threshold should be slightly higher than usual consent, i.e. express and unambiguous confirmation.  A simplified illustration of how such consent records and age-gates might be handled appears after this list.
     
  6. ISP duties defined: The law places specific obligations on ISPs, including activating content filtering systems, ensuring there are terms of service in place with parents/guardians, facilitating parental control mechanisms and immediately reporting illegal or harmful content. 
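
The duties above are framed as outcomes rather than prescribed technology.  Purely by way of illustration of the “proactive” detection point in takeaway 2, the sketch below shows one way a platform might wire scoring into its upload pipeline: content is scored before it is served, with automatic removal and reporting above one threshold and human review above another.  The classifier here is a trivial keyword placeholder standing in for a real machine-learning model, and the thresholds, names and reporting hook are our assumptions, not requirements of the Digital Safety Law.

```python
"""Illustrative only: one possible shape for a proactive moderation pipeline."""
from dataclasses import dataclass


@dataclass
class ModerationDecision:
    content_id: str
    score: float          # hypothetical "harmfulness" score in [0, 1]
    action: str           # "allow", "review" or "remove"
    reported: bool = False


# Placeholder scoring -- a real platform would call its ML model here.
HARMFUL_TERMS = {"bullying", "gambling"}


def score_content(text: str) -> float:
    words = text.lower().split()
    hits = sum(1 for word in words if word in HARMFUL_TERMS)
    return min(1.0, hits / 3)  # crude stand-in for a model probability


def report_to_authorities(decision: ModerationDecision) -> None:
    # In practice this would feed the platform's incident-reporting channel.
    print(f"reported {decision.content_id} (score={decision.score:.2f})")


def moderate(content_id: str, text: str,
             remove_threshold: float = 0.8,
             review_threshold: float = 0.4) -> ModerationDecision:
    """Proactively score uploaded content and route it before it is served."""
    score = score_content(text)
    if score >= remove_threshold:
        decision = ModerationDecision(content_id, score, "remove", reported=True)
        report_to_authorities(decision)
    elif score >= review_threshold:
        decision = ModerationDecision(content_id, score, "review")
    else:
        decision = ModerationDecision(content_id, score, "allow")
    return decision


if __name__ == "__main__":
    print(moderate("post-123", "harmless holiday photos"))
    print(moderate("post-124", "gambling gambling bullying"))
```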

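Similarly, and again purely by way of illustration, the sketch below models the under-13 parental consent record and age-gate checks described in takeaways 4 and 5, including a quick withdrawal mechanism.  The field names, thresholds and storage approach are assumptions for the example; what amounts to “explicit, documented and verifiable consent”, and the classification-based minimum ages, will be settled by the implementing regulations.

```python
"""Illustrative only: a minimal under-13 consent record and age-gate check."""
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ParentalConsent:
    child_user_id: str
    guardian_id: str
    evidence_ref: str                  # pointer to the verification artefact
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        """Quick, accessible withdrawal simply timestamps the record."""
        self.withdrawn_at = datetime.now(timezone.utc)


def may_process_personal_data(age: int, consent: Optional[ParentalConsent]) -> bool:
    """Under-13s require an active, documented parental consent record."""
    if age >= 13:
        return True        # still subject to the platform's general rules
    return consent is not None and consent.active


def may_access_platform(age: int, platform_minimum_age: int) -> bool:
    """Age-gate: access is refused below the platform's classified minimum age."""
    return age >= platform_minimum_age


if __name__ == "__main__":
    consent = ParentalConsent("child-1", "guardian-9", "doc://consent/42",
                              granted_at=datetime.now(timezone.utc))
    print(may_process_personal_data(age=11, consent=consent))    # True
    consent.withdraw()
    print(may_process_personal_data(age=11, consent=consent))    # False
    print(may_access_platform(age=11, platform_minimum_age=16))  # False
```
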
The Digital Safety Law places new obligations on ISPs and platforms operating from, and targeting under-18 users in, the UAE.  The penalties and enforcement mechanisms are, as yet, undefined.  However, platforms are required to comply with the new requirements and implement tools, processes and mechanisms to protect children in online environments.  Further obligations for age classification and verification will also follow.  We can assist with reviewing the compliance of current business processes with the Digital Safety Law and recommending enhancements where required.

Saarah Badr is Senior Counsel at Bird & Bird (having spent eight years in-house in the media & entertainment industry) and is available to discuss any of the issues highlighted above.
