Global Legislative Momentum on Social Media Age Restrictions Intensifies with New Measures in Norway and Turkey
Introduction
A growing number of national governments are enacting or proposing legislation to restrict social media access for minors, with Norway and Turkey being the most recent jurisdictions to announce such measures. These actions reflect a broad regulatory response to concerns about the impact of social media platforms on children's mental health, safety, and development. At the same time, technology companies such as Meta are attempting to ease regulatory pressure by introducing enhanced parental supervision tools.
Main Body
Norway’s minority Labour government announced on April 24 that it will present a bill to parliament by the end of 2026 to prohibit children under the age of 16 from using social media. Under the proposal, technology companies would bear responsibility for verifying users’ ages at login, with enforcement mechanisms linked to the EU-inspired Digital Services Act framework. Prime Minister Jonas Gahr Støre said the legislation aims to prevent childhood from being dominated by algorithms and screens. The government previously considered a 15-year age limit based on date of birth but opted for a stricter, uniform system to avoid social divisions within school classes. Norway’s Data Protection Authority has warned that delegating age verification to private companies could undermine user privacy. Figures from Norway’s media regulator indicate that 51% of children aged 9–10 and 74% of those aged 11–12 were active on social platforms in 2026, despite the current minimum age of 13.

Turkey’s parliament passed a bill on April 23 that restricts social media access for children under the age of 15. The legislation requires platforms to implement age-verification systems and to create secure digital spaces for minors in which controlled use is encouraged. It also obliges platforms with a high number of users to appoint a representative in Turkey and brings game software platforms under the regulatory scope, requiring them to classify games by users’ age. President Recep Tayyip Erdoğan has 15 days to approve the bill, after which it becomes law. In a televised address, Erdoğan characterized certain digital sharing applications as corrupting children’s minds and described social media platforms as problematic.

Australia became the first country to implement a comprehensive ban on social media for children under 16 in December 2025, covering platforms such as TikTok, YouTube, Instagram, Facebook, Snapchat, and X.
Non-compliant companies face penalties of up to A$49.5 million (US$35.3 million). Following Australia’s lead, several other nations have announced or enacted similar restrictions. France’s National Assembly approved legislation in January to ban children under 15 from social media, pending Senate approval. Greece will ban access for children under 15 from January 2027. Denmark announced a ban for children under 15, with parental consent possible for those aged 13 and above. Spain will ban access for minors under 16 and require age-verification systems. Indonesia and Malaysia have restricted access for children under 16, with Indonesia gradually deactivating accounts on high-risk platforms. Brazil’s Digital Statute, effective March 2025, requires minors under 16 to link their accounts to a legal guardian and bans addictive features such as infinite scroll. In India, the state of Karnataka banned social media for children under 16 in March, and the national chief economic adviser called for age restrictions, describing platforms as “predatory.” The United Kingdom is considering an Australia-style ban and is testing curfews and time limits in 300 households. The European Parliament adopted a non-binding resolution in November calling for a minimum age of 16 on social media and harmonized digital age limits across the EU.

In response to the regulatory trend, Meta has announced updates to its Teen Accounts offering, including a new Insights tab for parents in the US, UK, Australia, Canada, and Brazil that shows the topics their teen has asked Meta AI about. The company also introduced a content filter modeled on 13+ movie ratings and default restrictions on following accounts that share age-inappropriate content. Meta acknowledged that no system is perfect and asked parents for continued feedback.
However, child protection advocates argue that existing controls remain insufficient, noting that official data in several European countries show that large numbers of children under 13 maintain social media accounts. Concerns have also been raised about the potential for circumvention via virtual private networks (VPNs). Following the UK’s Online Safety Act, Proton VPN reported a 1,400% hourly increase in new registrations, and NordVPN reported a 1,000% rise in subscriptions from UK users. In Australia, daily active VPN sessions peaked at 1.32 million after age restrictions were implemented.
Conclusion
The global regulatory landscape for children’s social media access is undergoing rapid transformation, with legislative momentum building across multiple jurisdictions. While governments increasingly view age-based restrictions as a necessary intervention, challenges persist regarding effective enforcement, privacy implications of mandatory age verification, and the potential for users to circumvent restrictions through technological means. Technology companies are simultaneously adapting their platforms to retain parental trust, though the efficacy of these measures remains subject to ongoing debate.