Teen Influencers Condemn Australia's Social-Media Restrictions, Calling Online Work Their 'Purpose'
Young creators say account bans erase livelihoods as platforms and the government prepare for enforcement.

Teen creators say a world-first age restriction that stops under-16s from holding accounts will strip them of livelihoods, communities, and civic voice.
Many young people who have built followings on platforms such as TikTok and Instagram told this newspaper that they regard content creation as both work and identity. The federal government says the new rules, which require platforms to take 'reasonable steps' to stop under-16s holding accounts, are designed to protect children from harms linked to platform design.
The law comes into force on 10 December 2025 and has prompted legal challenges, industry warnings about enforcement, and a fierce public debate about parenting, privacy, and political speech.
'This Is My Job' — Creators Say The Ban Will Cost Livelihoods
For many of the teenagers interviewed, or who have spoken publicly in recent weeks, the platforms are not merely leisure but a source of income, education, and community.
'This is my job,' one 15-year-old influencer said in comments attached to a TikTok clip decrying the ban; the clip's comment stream shows a sharp split, with many users voicing support for the restrictions and saying teenagers' primary role should be study.
Platforms including TikTok and Instagram host hundreds of thousands of young Australians who post regularly and who may lose monetisation, brand deals, and audiences when account access is removed.
Industry statements underline the practical impact. TikTok's Australian newsroom statements and public comments have warned that the ban's scope, and the inclusion or exclusion of specific services, will be complex to implement and could push teens towards smaller, less-regulated apps. Meta and Snap have said they oppose aspects of the law but will comply with the new minimum-age rules.
Teens And Advocates Take The Case To The High Court
Within days of the government finalising implementation details, a constitutional challenge was filed in the High Court by the Digital Freedom Project, backed by two 15-year-olds who say the law will 'rob' young Australians of a meaningful avenue of political communication.
The filing argues the restriction is 'not reasonably appropriate and adapted' and points to democratic harms: teenagers use interactive social tools to follow news, contact representatives, and organise civic activity.
The Digital Freedom Project's public materials say the ban could exclude about 2.6 million young people from account-based interaction online; government figures and independent researchers provide different estimates of the affected cohorts, but all sources agree the numbers are large and that enforcement will raise technical and privacy questions.

Communications Minister Anika Wells has defended the policy in Parliament and at the National Press Club, saying the measure supports parents and aims to reduce harms associated with logged-in social media use.
The government's eSafety regulator has published guidance and assessment tools to help platforms implement age-assurance measures and warns that companies could face heavy penalties for systemic breaches.
The Risk Of Driving Teens To Smaller Apps
The law requires platforms to take 'reasonable steps' to prevent under-16s holding accounts; regulators and platforms have debated tactics such as facial age-estimation, third-party age checks, and identity documents.
Platforms including Snapchat and YouTube have begun introducing age-verification prompts in Australia; others warn that identity checks could force minors to surrender privacy or create new security risks if verification data is centralised. The eSafety Commissioner expects companies to report on account removals and will have enforcement powers, with civil penalties of up to A$49.5 million (£24.6 million, $33 million) for corporations found to have failed to take reasonable steps.
Child-safety advocates who supported the measure say removing account features for young children will lower exposure to targeted recommendation systems and harmful interactions.
Critics say the policy is blunt, will drive children to unregulated platforms, and could harm vulnerable groups, particularly disabled, rural, and LGBTIQ+ teens who rely on online communities. Surveys and trials informed the legislation, but the government's Age Assurance Technology Trial also underscored uncertainties about the reliability of age checks and about unintended consequences.
'I am learning to be a creator — I'm learning skills I'll carry through life,' one teen told reporters. Others argue that being locked out of accounts while still able to view content logged out is no substitute for the social and civic functions that accounts enable.
How the courts, platforms, and regulators resolve the competing imperatives of child safety, privacy, and freedom of expression will be watched globally, and not least by the young Australians who say the platforms are where they find community, work, and, as several put it plainly, 'purpose'.
© Copyright IBTimes 2025. All rights reserved.