UAE Child Digital Safety Law – Tighter Regulation for Digital Platform Operators and ISPs

The United Arab Emirates has enacted Federal Decree-Law No. 26/2025 on Child Digital Safety (CDS Law), introducing a comprehensive regulatory framework to protect children in the digital environment. The law came into effect on 1 January 2026 and includes a one-year compliance ramp-up period for entities subject to its provisions.

The CDS Law represents a significant development in the UAE’s digital regulatory landscape, aligning with global regulatory trends exemplified by the EU Digital Services Act and the UK Online Safety Act. It reflects a growing regional focus on monitoring digital platforms and mitigating the risks of children’s exposure to unsuitable online content, and its reach extends to platforms operated outside the UAE but accessible to users within the country.

The law applies broadly to internet service providers (ISPs) and digital platforms that operate in the UAE or target users in the country, wherever children under the age of 18 may access or be exposed to their services. Covered platforms include websites, mobile applications, gaming and streaming services, e-commerce platforms, and social media. In addition, the law imposes defined responsibilities on caregivers, including parents and guardians.

A central feature of the framework is the establishment of a Child Digital Safety Council, which will oversee implementation, propose regulatory updates, and develop standards for children’s digital privacy, security, and safe platform use. The UAE Cabinet will also issue a digital platform classification system, categorising platforms based on content type, usage, and impact, and defining corresponding compliance obligations, controls, disclosure requirements, and verification mechanisms.
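While the classification decision is still pending, affected operators may find it useful to model the axes the law already names. The following TypeScript sketch is speculative: only the three classification axes (content type, usage, and impact) come from the law itself, and every tier name and field below is an illustrative assumption.

```typescript
// Speculative data model for the forthcoming platform classification.
// Only the three axes (content type, usage, impact) come from the law;
// every enum value and field name below is an illustrative assumption.
type ContentType = "user_generated" | "editorial" | "gaming" | "commerce";
type UsageProfile = "child_directed" | "mixed_audience" | "adult_oriented";
type ImpactLevel = "low" | "medium" | "high";

interface PlatformClassification {
  contentType: ContentType;
  usageProfile: UsageProfile;
  impactLevel: ImpactLevel;
  // Obligations attach per classification: compliance controls,
  // disclosure requirements, and verification mechanisms, as will be
  // defined by the Cabinet decision once issued.
  obligations: string[];
}
```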

The CDS Law introduces enhanced protections for children’s personal data, particularly for those under the age of 13. It restricts the collection, processing, sharing, or publication of children’s data unless strict conditions are met, including verified parental consent, transparent privacy policies, data minimisation, and a prohibition on commercial use such as targeted advertising.
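To illustrate how these conditions might translate into platform logic, the TypeScript sketch below gates the processing of a child’s data on verified parental consent and rejects targeted advertising outright. The type names, consent model, and age threshold handling are assumptions for illustration; the actual verification mechanisms will be set by implementing regulations.

```typescript
// Hypothetical consent gate: names and the consent model are illustrative,
// not taken from the CDS Law's implementing regulations.
type ProcessingPurpose = "service_delivery" | "analytics" | "targeted_advertising";

interface ConsentRecord {
  childId: string;
  verifiedByParent: boolean; // e.g. confirmed via an approved verification flow
  grantedAt: Date;
}

const PARENTAL_CONSENT_AGE = 13; // the CDS Law applies heightened rules under 13

function mayProcessChildData(
  age: number,
  purpose: ProcessingPurpose,
  consent?: ConsentRecord
): boolean {
  // Commercial use such as targeted advertising is prohibited for
  // children outright, regardless of any consent on file.
  if (purpose === "targeted_advertising") return false;

  // Below the threshold, processing requires verified parental consent.
  if (age < PARENTAL_CONSENT_AGE) {
    return consent !== undefined && consent.verifiedByParent;
  }

  // Older minors remain subject to data-minimisation and transparency
  // duties, enforced elsewhere in the processing pipeline.
  return true;
}
```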

Digital platforms are required to implement effective age-verification measures proportionate to their risk classification, alongside safeguards to prevent children’s access to online commercial games and betting-related mechanics, which may include certain in-game wagering or loot box features. Platform operators must also deploy privacy settings for children’s accounts, parental controls, filtering and blocking tools, reporting mechanisms for harmful or illegal content, and systems—potentially including AI-based tools—for proactive detection and removal of harmful material.
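One way to read the proportionality requirement is as a mapping from risk classification to age-assurance strength, combined with protective defaults for child accounts. The sketch below assumes hypothetical tier names, verification methods, and settings; none of these specifics appear in the law itself.

```typescript
// Illustrative only: tier names, verification methods, and defaults are
// assumptions, pending the Cabinet's classification system and TDRA rules.
type RiskTier = "low" | "medium" | "high";
type VerificationMethod = "self_declaration" | "document_check" | "parent_verified";

// Higher-risk platforms would be expected to apply stronger age assurance.
const requiredVerification: Record<RiskTier, VerificationMethod> = {
  low: "self_declaration",
  medium: "document_check",
  high: "parent_verified",
};

interface ChildAccountSettings {
  profileVisibility: "private";
  directMessagesFromStrangers: false;
  contentFilterLevel: "strict";
  parentalDashboardEnabled: true;
}

// Child accounts start from the most protective defaults and can only be
// relaxed through parental controls, not by the child alone.
function defaultChildSettings(): ChildAccountSettings {
  return {
    profileVisibility: "private",
    directMessagesFromStrangers: false,
    contentFilterLevel: "strict",
    parentalDashboardEnabled: true,
  };
}
```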

Immediate reporting of child exploitation material to authorities is mandatory, along with compliance with takedown orders and the submission of periodic compliance reports. Additional policies will be issued by the Telecommunications and Digital Government Regulatory Authority (TDRA) for ISPs, particularly concerning content filtering and parental consent.
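A platform’s takedown workflow will need to pair removal with notification and an auditable record for periodic reporting. The sketch below is a minimal illustration of that pattern; the function names, payload shape, and log structure are hypothetical, as the law mandates the duties but not the technical interface.

```typescript
// Hypothetical takedown flow: identifiers, payload shape, and log format
// are assumptions for illustration; only the duties come from the CDS Law.
interface TakedownOrder {
  orderId: string;
  contentUrl: string;
  receivedAt: Date;
}

interface ComplianceLogEntry {
  orderId: string;
  action: "content_removed" | "reported_to_authority";
  completedAt: Date;
}

const complianceLog: ComplianceLogEntry[] = [];

// Remove the content, notify the authority, and record both steps so
// they can be evidenced in periodic compliance reports.
async function handleTakedownOrder(
  order: TakedownOrder,
  removeContent: (url: string) => Promise<void>,
  notifyAuthority: (orderId: string) => Promise<void>
): Promise<void> {
  await removeContent(order.contentUrl);
  complianceLog.push({
    orderId: order.orderId,
    action: "content_removed",
    completedAt: new Date(),
  });
  await notifyAuthority(order.orderId);
  complianceLog.push({
    orderId: order.orderId,
    action: "reported_to_authority",
    completedAt: new Date(),
  });
}
```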

Caregivers are assigned explicit responsibilities to monitor children’s online activity and limit exposure to harmful content, while balancing digital autonomy appropriate to a child’s age.

While penalties for non-compliance will be detailed in subsequent decisions, the law already provides sufficient clarity for affected entities to begin compliance assessments. For many platform operators, especially those hosting user-generated content, significant operational and technical changes may be required to meet the new standards.