A recent report by Australia's eSafety regulator found that more than 80% of children aged 12 and under used social media platforms last year, often in breach of age restrictions. As the country prepares a social media ban for under-16s, experts are calling for stricter enforcement of age verification measures.

Rising Social Media Use Among Young Children Sparks Regulatory Concerns in Australia
Australian eSafety report reveals alarming social media usage rates among children aged 12 and under, prompting calls for stronger regulations.
A significant proportion of Australian children are using social media platforms intended for people aged 13 and older. A study by the eSafety regulator found that more than 80% of children aged 12 and under accessed platforms such as YouTube, TikTok and Snapchat last year. As Australia moves to ban social media use for under-16s by the end of this year, scrutiny of these platforms is intensifying.
The survey of more than 1,500 children found that 84% had used at least one social media or messaging service in the past year. More than half of them accessed these services through an account belonging to a parent or caregiver, while about a third had accounts of their own, 80% of which were set up with an adult's help.
Despite these findings, the report indicates that only 13% of the children had an account shut down by a social media company for breaching age restrictions. This raises concerns about the effectiveness of the platforms' age verification, which often involves no robust checks at sign-up.
The eSafety commissioner, Julie Inman Grant, said keeping children safe online is a shared responsibility and called for collaboration among social media companies, device manufacturers and guardians. The report also found that current age verification measures fall short: platforms such as Snapchat, TikTok, Twitch and YouTube rely on signals from user engagement to detect underage users, meaning children may already be exposed to risks and harms before they are identified.
As Australia works to improve digital safety for young people, the report's findings underscore the urgent need for regulatory frameworks that hold social media platforms accountable for enforcing their age restrictions.