EU Proposes Raising the Bar for Kids’ Safety Online: New Minimum Age, Bans on Addictive Design
SDG 16: Peace, Justice & Strong Institutions | SDG 3: Good Health & Well-Being
Institutions: Ministry of Electronics & Information Technology | Ministry of Women & Child Development
Members of the European Parliament (MEPs) on the Internal Market and Consumer Protection Committee backed new rules to enhance minors’ safety online. Their proposals include an EU-wide digital minimum age of 16 for accessing social media, video platforms, and AI companions (unless parents consent), alongside a baseline age of 13 for any social media registration.
The report calls for stricter regulation of persuasive technologies, such as targeted ads, influencer marketing, addictive interface features (infinite scroll, autoplay), loot boxes, and “dark patterns.” Many such mechanisms would be banned by default for minors.
MEPs also pressed for stronger enforcement of the Digital Services Act (DSA), including fines for, or bans on, non-compliant platforms. They suggested personal liability for senior leaders in cases of repeated or serious violations concerning minors.
These proposals mark a shift from reactive moderation to “safety by default” design for platforms accessed by minors. If adopted, they could set a global benchmark, especially for countries crafting their own child-protection rules for digital platforms. The EU’s move underscores an emerging norm: protecting minors online requires differentiated regulatory design.
Currently, Australia is the only country to have legislated a ban on social media use for under-16s.
Relevant Question for Policy Stakeholders:
For countries like India, how would a higher digital minimum age and default bans on addictive features translate into local regulation, enforcement, child rights protection, and tech design norms?
Read the full release here: New EU Measures Needed to Make Online Services Safer for Minors