A report by Australia's eSafety regulator indicates that a significant number of children below the legal minimum age are using popular social media platforms, prompting discussions about stricter regulations and age verification.
Australian Children Frequently Use Social Media Despite Age Restrictions

A new report reveals that over 80% of Australian children under 13 are active on social media, with plans underway to enforce stricter age restrictions.
More than 80% of Australian children aged 12 and under accessed social media or messaging platforms last year, even though those services are primarily intended for users aged 13 and older, according to findings from the eSafety regulator. The report highlights the widespread use of platforms such as YouTube, TikTok, and Snapchat among younger users. In response, Australia is preparing to introduce a social media ban for individuals under 16, expected to take effect by the end of this year.
The companies under scrutiny, including Discord, Google (YouTube), Meta (Facebook and Instagram), Reddit, Snap, TikTok, and Twitch, did not immediately comment on the findings. While most platforms require users to be at least 13 years old to create an account, exceptions exist. For instance, YouTube offers Family Link, which allows guardians to supervise accounts for users under 13, and provides a dedicated app, YouTube Kids, designed for younger children. Usage of YouTube Kids was excluded from the report for this reason.
The eSafety commissioner, Julie Inman Grant, shared that the report's insights are essential for formulating further actions, emphasizing the collective responsibility of safeguarding children online, which includes social media companies, device producers, parents, educators, and legislators.
In a survey of more than 1,500 Australian children aged 8 to 12 about their social media habits, researchers found that 84% had used at least one social media or messaging service since early last year. Over half accessed these platforms through a parent or guardian's account, while a third of those who used social media had accounts of their own. Among children with personal accounts, 80% reported receiving help from a parent or caregiver to set the account up. Only 13% of children with their own accounts had them closed by the platforms for being under age.
The report underscores inconsistencies across the industry in how age is checked at different stages of the user experience, and in particular the lack of effective intervention at the point of account registration. When asked about their age verification methods, Snapchat, TikTok, Twitch, and YouTube reported using a range of tools and technologies to identify users who may be under 13 based on their on-platform activity.
However, these detection tools generally work only once a user is already active on the platform, meaning children may be exposed to risks before their age can be verified.