7 min read
Imagine receiving a notice that your Instagram or TikTok account will be deactivated under a new law for under-16s. That is now the reality for young people in Australia.
Governments and regulators around the world are watching this unprecedented digital experiment closely: its success or failure could inspire similar laws in other countries, reshaping online experiences for a generation.

From December 10, 2025, Australia will require designated social media platforms to take reasonable steps to prevent users under 16 from holding accounts. Major companies like Meta and TikTok must find and deactivate millions of underage accounts.
The government is determined to create a safer online environment for children. This decisive action highlights growing global concerns about youth mental health and online safety.

Despite loud objections, big social media companies promised to follow the new rules. Executives from Meta, TikTok, and Snap confirmed their compliance in a recent hearing. They stated they disagree with the law but will fully abide by it.
Their cooperation is crucial for the ban to have any real effect. This reluctant agreement shows the growing power of national legislation over global tech firms.

Global concern about social media’s impact on youth mental health is at an all-time high. Politicians and parents are increasingly worried about cyberbullying and inappropriate content. Australia’s bold move is a direct response to this growing public anxiety.
This law represents a major test for digital safety regulations worldwide. Other nations are likely to consider similar measures if Australia’s approach proves successful.

The tech giants have several reasons for opposing this ban. They argue that simply blocking access does not solve core problems. Companies believe it might push young users toward more dangerous, unregulated corners of the internet.
They also feel it cuts off a vital space for social connection and creative expression. This perspective highlights the complex debate between safety and freedom online.

Meta told Australian authorities it had identified about 450,000 accounts under 16 across Facebook and Instagram in Australia, while TikTok estimated about 200,000, and Snapchat estimated about 440,000. These are company estimates reported at a parliamentary hearing.
These platforms now face the huge task of contacting all these users before the December deadline. This logistical challenge is unprecedented in the history of social media regulation.

If an account is deactivated, what becomes of all your photos and messages? Meta says it will give young users a clear choice. They can either permanently delete all their data or have the company store it until they turn sixteen.
TikTok and Snap are planning to offer very similar options to their users. This ensures people do not immediately lose their cherished digital memories and conversations.

You might wonder how platforms can accurately determine who is underage. The platforms say they will use automated behaviour signals and other age estimation tools to flag accounts that appear to be run by people under 16, and those flagged may be able to appeal with third-party age verification tools.
For instance, an account claiming to be 25 but acting like a teenager would be flagged. This behavioral analysis is a key part of their new detection strategy.
No automated system is perfect, and mistakes are inevitable. What happens if you are over sixteen but get mistakenly flagged? Meta and TikTok will direct those users to a third-party age-verification tool. This provides a way for people to prove their age and regain access.
Snap is still developing its own specific solution for handling these appeals. Fairness and accuracy in this process are major concerns for everyone involved.
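To make the flag-then-appeal idea concrete, here is a toy Python sketch of a rule-based behaviour scorer. Every signal name, weight, and threshold below is invented for illustration; the platforms' real systems are proprietary machine-learning models, not simple rules like these.

```python
# Toy illustration only: a rule-based scorer over hypothetical behaviour
# signals. Real platforms use proprietary ML models; every signal name
# and threshold here is invented for the example.

def underage_score(account: dict) -> float:
    """Return a 0..1 score; higher means more teen-like behaviour."""
    if account.get("stated_age", 0) < 16:
        return 1.0  # self-declared under 16: flag immediately
    score = 0.0
    # Hypothetical signals a model might weigh:
    if account.get("school_hours_inactive", False):
        score += 0.4  # quiet during school hours
    if account.get("follows_mostly_teen_creators", False):
        score += 0.3
    if account.get("teen_slang_ratio", 0.0) > 0.5:
        score += 0.3  # heavy use of teen slang in posts
    return min(score, 1.0)

def flag_for_review(account: dict, threshold: float = 0.6) -> bool:
    """Accounts above the threshold go to age verification, not straight
    to deactivation, so adults who are wrongly flagged can appeal."""
    return underage_score(account) >= threshold
```

The key design point this sketch captures is that a flag triggers review rather than automatic deactivation, which is why the appeal route through third-party age verification matters so much.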

The government reversed an earlier plan to exempt YouTube and added the site to the platforms covered by the restrictions after safety reviews and pressure from the eSafety Commissioner.
The Commissioner's research found that a high percentage of children encountered harmful content on the site, convincing officials that YouTube posed risks similar to other social platforms.

The financial penalties for non-compliance are substantial. Social media companies that ignore the law face fines of up to 49.5 million Australian dollars. This serious financial threat pushes the platforms to invest heavily in new systems.
It is a powerful motivator for them to develop effective age-detection and enforcement procedures before the deadline.

Australia’s law is a major test case for digital safety regulation. Governments in the United States and Europe are watching the results closely, and the outcome could inspire a new wave of age-restriction laws and shape debates about youth access to social media in other jurisdictions.

This situation sparks a complex debate about the best path forward. Critics say an outright ban is a clumsy tool that does not teach responsible online behavior.
They argue for better digital literacy education and improved parental controls instead. They believe these solutions are more effective than simply shutting down access for young teens.

Supporters of the ban believe the risks of social media outweigh the benefits for young teens. They see it as a necessary step to protect mental well-being, similar to age limits for driving.
For them, safety is a higher priority than unrestricted digital access. This perspective is driven by increasing alarm over studies linking social media to teen anxiety.

Tech companies openly admit that enforcing this ban will be incredibly difficult. They point to the fundamental challenge of accurately identifying age online without being intrusive.
This massive technical and logistical hurdle was a core reason for their initial opposition. Creating a fair and effective system in such a short time is a monumental task.

This law represents a significant shift in the relationship between governments and tech giants. It demonstrates that even the most powerful companies must yield to national legislation.
Their reluctant compliance shows that lawmakers are finally asserting more control. It signals a new era of increased accountability and regulation for the digital world.

This story is about who gets to make the rules for our online lives. Should it be governments, parents, or the tech companies themselves?
As this Australian experiment unfolds, it is a perfect time to think about your own views. The global conversation about social media responsibility is just beginning, and your voice matters.
Do you think other countries should follow Australia’s lead?
This slideshow was made with AI assistance and human editing.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.