Is Your Child’s Data Safe?
Written by Catarina Santos - Data Protection Expert
Understanding Privacy Risks on Social Media & Gaming Platforms
Children today are immersed in the digital world, spending hours on social media and gaming platforms. While these spaces offer entertainment and connection, they also expose children to significant risks. Personal data is often collected, stored, and shared—sometimes without adequate safeguards. As data protection practitioners, businesses, and parents, we must ensure children’s data is protected and handled responsibly.
What is really happening?
Many platforms rely on data collection for their business models, and children’s information is no exception. Social media apps track browsing habits, locations, and even biometric data. Gaming platforms encourage in-game purchases and often require extensive personal details for account creation. Targeted advertising can expose children to age-inappropriate content, while weak privacy settings make them vulnerable to online exploitation.
One major concern is how easily children’s data can be accessed or misused. Even when platforms claim to offer security, breaches and leaks happen. Data is often shared with third-party advertisers, meaning a child’s online behaviour could be tracked across multiple websites. This raises questions about consent and whether children (or their parents) truly understand what they’re agreeing to when they sign up.
Examples of real-world incidents highlight vulnerabilities children face online:
In April 2023, TikTok was fined £12.7 million by the UK’s Information Commissioner’s Office (ICO) for misusing children’s data, including failing to obtain parental consent for users under 13 and not implementing adequate age verification measures. [1]
In March 2025, the ICO launched investigations into TikTok, Reddit, and Imgur to assess their compliance with children’s data protection rules. These investigations aim to ensure the platforms have robust safety measures in place so that young users are not exposed to inappropriate or harmful content. [2]
The gaming industry has also faced criticism for inadequate data protection practices. Regulators have fined video game companies for unlawful practices involving young people’s personal data, emphasising the need for stricter compliance with privacy and data protection laws. [3]
Finally, the UK government is considering a social media ban for children under 16. Chief Medical Officer Chris Whitty has been tasked with assessing the potential risks and harms of children’s social media use, which could lead to raising the digital “age of consent” from 13 to 16. [4]
These examples underscore the pressing need for enhanced data protection measures tailored to children’s online activities.
UK GDPR & Children’s Data Protection
The UK General Data Protection Regulation (UK GDPR) explicitly recognises that children require greater protection when it comes to their personal data. This is because they may be less aware of the risks, consequences, and safeguards available to them. Recital 38 of the UK GDPR emphasises that children’s personal data merits specific protection, particularly in the context of online services such as social networking, gaming platforms, and digital marketing.
To address these concerns, UK GDPR imposes stricter obligations on organisations processing children’s data. Article 8 sets the legal age of digital consent at 13 in the UK, meaning that any online service provider offering services directly to children under this age must obtain verifiable parental consent before processing their data. Furthermore, organisations must ensure that privacy notices are written in clear, age-appropriate language, so children and their guardians fully understand how their information is collected, used, and shared.
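In practice, the Article 8 rule translates into a simple gate at sign-up: users aged 13 or over may consent for themselves, while younger users need verifiable parental consent before any processing begins. A minimal sketch of that gate, using hypothetical names rather than any platform’s real implementation, might look like this:

```python
from dataclasses import dataclass

UK_DIGITAL_AGE_OF_CONSENT = 13  # Article 8, UK GDPR


@dataclass
class SignupRequest:
    """Hypothetical sign-up record; field names are illustrative only."""
    age: int
    parental_consent_verified: bool = False


def may_process_data(req: SignupRequest) -> bool:
    """Allow processing only if the user can consent themselves (13+)
    or verifiable parental consent has been obtained."""
    if req.age >= UK_DIGITAL_AGE_OF_CONSENT:
        return True
    return req.parental_consent_verified
```

Real verification of parental consent is far harder than a boolean flag suggests, but the control flow — block processing until the legal basis exists — is the point.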
The principle of data minimisation plays a crucial role in safeguarding young users, requiring that only the necessary amount of personal data is collected and retained for as long as needed. Additionally, the right to erasure, also known as the “right to be forgotten” (Article 17), allows children or their guardians to request the deletion of their data if it is no longer necessary or has been unlawfully processed.
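Both principles have straightforward engineering consequences: collect only the fields the service actually needs, and be able to delete everything you hold when an erasure request arrives. A toy in-memory sketch (the store and function names are hypothetical) illustrates both:

```python
# Hypothetical in-memory store; a real system would use a database
# with the same two properties: minimal fields, deletable on request.
user_store: dict[str, dict] = {}


def register(user_id: str, display_name: str) -> None:
    # Data minimisation: store only what is needed to run the account,
    # not location, contacts, or browsing history.
    user_store[user_id] = {"display_name": display_name}


def erase(user_id: str) -> bool:
    """Honour an Article 17 erasure request by deleting all data held.
    Returns True if a record existed and was removed."""
    return user_store.pop(user_id, None) is not None
```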
A significant requirement under UK GDPR is that platforms must implement high privacy settings by default, particularly for children’s accounts. This aligns with the Age-Appropriate Design Code (Children’s Code), issued by the Information Commissioner’s Office (ICO), which mandates that services likely to be accessed by children must provide a high level of data protection by design and default. Despite these legal requirements, enforcement remains a challenge, with many online platforms failing to fully implement child-friendly privacy measures, leaving young users vulnerable to data misuse and online exploitation.
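“High privacy by default” means the most protective option must be the one a child gets without touching a single setting. As a sketch (the setting names here are illustrative, not taken from any real platform), that simply means the defaults in the account model are the restrictive ones:

```python
from dataclasses import dataclass


@dataclass
class ChildAccountSettings:
    """Illustrative settings object: every data-sharing feature defaults
    to off, per the Children's Code's privacy-by-default standard.
    Opting in requires a deliberate change, never the reverse."""
    profile_public: bool = False
    location_sharing: bool = False
    personalised_ads: bool = False
```

The design choice is that safety does not depend on a child (or parent) finding and flipping the right switches.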
The Online Safety Act 2023
The Online Safety Act 2023 introduces a comprehensive legal framework designed to regulate online platforms and ensure the safety of children in the digital environment. Recognising the increasing risks posed by harmful content, data misuse, and exploitative online practices, the Act places a legal duty of care on service providers to identify and mitigate potential dangers to children using their platforms. This legislation is particularly relevant for social media networks, gaming platforms, and other digital services accessible to minors.
A key requirement under the Act is that companies must conduct mandatory risk assessments to evaluate how their platforms may expose children to illegal or harmful material, including content that promotes self-harm, exploitation, or misinformation. The legislation mandates that platforms implement proportionate measures to prevent such risks, ensuring compliance through robust safety mechanisms and content moderation systems.
Age verification and assurance mechanisms form another cornerstone of the Online Safety Act. Service providers are now legally obligated to implement technology that effectively determines whether a user is underage, thereby preventing children from accessing inappropriate or harmful content. This aligns with the Age-Appropriate Design Code (Children’s Code), which complements both UK GDPR and the Online Safety Act by setting high standards for protecting children’s data and ensuring digital services act in their best interests. [5] [6]
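Whatever assurance method a provider uses — document checks, facial age estimation, or a declared date of birth — the final step is usually the same computation: derive an age from a date of birth and compare it to a threshold. A minimal, assumption-laden sketch:

```python
from datetime import date


def age_on(dob: date, today: date) -> int:
    """Whole years elapsed between dob and today."""
    years = today.year - dob.year
    # Subtract one year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years


def is_underage(dob: date, today: date, threshold: int = 18) -> bool:
    """True if the user is below the given age threshold.
    The threshold is a parameter: 13 for Article 8 consent,
    18 for adult-only content, etc."""
    return age_on(dob, today) < threshold
```

The hard part of age assurance is establishing a trustworthy date of birth in the first place; the arithmetic above is the easy final check.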
Despite these legal safeguards, enforcement and implementation remain a challenge. Many online platforms still operate within regulatory grey areas, making compliance a complex but essential responsibility for organisations that process children’s data.
Conclusion
The protection of children’s data and online safety is a shared responsibility between organisations, regulators and parents. UK GDPR provides a strong legal foundation by requiring high privacy settings, minimal data collection, and clear parental consent mechanisms, while the Online Safety Act enforces stricter obligations on platforms to protect children from online harms.
For businesses, compliance is no longer an option but a legal necessity. Organisations processing children’s data must integrate privacy-by-design principles, conduct risk assessments, and implement robust age verification systems to meet their legal obligations.
Parents, too, play an essential role by actively engaging with their children’s online activities, leveraging their UK GDPR rights, and advocating for greater transparency from digital service providers. While legislation provides a crucial framework, the practical implementation of these laws will determine whether they effectively safeguard children in an increasingly digital world.
By adopting a proactive and legally sound approach, organisations can not only comply with evolving regulatory requirements but also build trust with users and create a safer, more responsible digital ecosystem for future generations.
Join the conversation
This article was written by Data Protection Expert Catarina Santos, who will be joining our audience live on the Data Protection Made Easy podcast on the 14th of March 2024, between 12:30 and 13:30. It’s completely free to join, and anyone is welcome to get involved. If you would like to sign up for this upcoming discussion, simply visit our events page and register for free.
