📱 Children and Social Media - The Future of Data Protection
How are we shaping a digital future on networks to protect privacy?
Saying that minors don’t use social media because the "terms of use" don’t allow it is as false as assuming that none of us from Generation Y used the Internet in the ‘90s. Every child and teenager with access to a screen, whether at home or on the street, is on social media.
According to the TIC Kids Online Brazil survey¹, 88% of children and teenagers between 9 and 17 years old have a social media profile, with WhatsApp used by 78%, Instagram by 64%, and TikTok by 60% of these users.
Among platforms less popular among adults, Snapchat stands out as a favorite among teenagers. A study revealed that in the United States, 60% of teenagers use Snapchat, while only 33% of adults are present on the platform.
Another example is Yubo, a social network designed to "make new friends" and foster a sense of community, targeting teenagers and young adults aged 13 to 25. Yubo allows users to create video and live chat groups with up to 10 friends and has 40 million users worldwide.
And you, grown-up, had you heard of Yubo? I hadn’t until I started writing this post and had to look up what the "kids" are using these days.
With every click, like, or share, children and teenagers are feeding a digital ecosystem that doesn't always work in their favor. And the burden of worrying about it falls on us adults, not on the young users themselves.
Recently, the UK’s Information Commissioner’s Office (ICO) announced investigations² into how platforms like TikTok, Reddit, and Imgur handle the data of users aged 13 to 17. The goal of this investigation is to ensure that children's privacy isn’t treated as a bargaining chip for algorithms that determine what content these young users consume. The concern isn’t just about what’s being shown but also about what’s being collected behind the scenes.
It’s a tricky issue because children and teenagers are both pioneers, the first to adopt new platforms, and at the same time products of an industry built on attention. While the ICO pushes companies to comply with data protection laws, the reality is that many social networks use young users' information to fuel their recommendation engines. In the TikTok case, for example, regulators are examining whether the platform manipulates what teenagers watch based on a data profile those teenagers don't even realize they are building.
At its core, it’s a tug-of-war between regulation and innovation. Companies claim they are implementing safety measures, but history shows that these solutions often come too late and fall short. The UK, through its Children's Code, has been reinforcing rules like age verification requirements, data collection transparency, and minimizing the use of sensitive information. Even so, challenges persist, and one of the biggest is ensuring that these young users truly understand the full extent of what’s at stake.
Interestingly, resistance to these initiatives doesn’t come only from companies but also from users and their parents. An ICO survey found that 42% of British parents feel they have little to no control over the information social media platforms collect about their children. On the other hand, only 23% said they stopped using a platform due to privacy concerns. This highlights a paradox: the fear is real, but so is digital fatigue. People are exhausted from fighting a system that always seems one step ahead.
So what’s the alternative? Completely isolating kids from digital media, which could limit their access to education and leave them with less knowledge compared to their peers? Or maybe what I’m presenting as a big problem is actually part of the solution?
I only got my first computer at 15, while my classmates had been using them since they were 10. That didn’t make me less competitive; if anything, I may have developed other skills along the way.
This debate ties directly to the concept of privacy and monitoring. How do we give kids the freedom to develop digital autonomy without exposing them to unnecessary risks? The solution isn’t in drastic bans but in education and transparency. Families need to understand that excessive monitoring can create a false sense of security, while an environment built on trust and open dialogue can encourage a more conscious and responsible use of technology.
Talking is easy, but in everyday life, I imagine parents just need a tablet with SpongeBob to get a little peace and quiet.
In Brazil, the General Data Protection Law (LGPD) also includes specific measures for protecting children and teenagers, ensuring that the principle of "best interest" is upheld. The National Data Protection Authority (ANPD) has been working on guidelines to help companies navigate the collection and processing of minors' data. However, challenges remain, especially when it comes to gambling disguised as entertainment and the use of deepfakes to manipulate children online. These technologies introduce even more sophisticated and harder-to-detect risks, making the regulatory debate even more urgent.
In the future, we’ll likely see more global regulations similar to the UK’s Children's Code and the U.S. COPPA (Children's Online Privacy Protection Act). But the real question is: will these laws be enough? Or will we keep chasing an industry that always finds creative ways to bypass regulations? If we truly want a safer digital world for kids, we need a real commitment from all stakeholders—governments, companies, parents, and educators alike.
At the end of the day, children's privacy can’t be a battle we just accept as lost. Protecting kids' and teenagers' data isn’t just a matter of regulation—it’s about digital justice. A future where young people can explore the internet without fear of being watched or manipulated isn’t utopian—it’s necessary. The real question is whether we’ll take responsibility for this or keep pretending that unread terms of service are enough to solve the problem.
1. https://cetic.br/pt/pesquisa/kids-online/
2. https://www.theregister.com/2025/03/03/uk_regulator_investigates_tiktok_and/
Very pertinent reflections! At the level of the individual or the family unit, it's hard to think of simple solutions. We truly need coordinated action from companies and governments on this front (which seems even less simple, though indispensable).