
Australia’s Social Media Ban for Children to Take Effect on Wednesday

(MENAFN) Australia is on track to become the first nation in the world to prohibit children under 16 from accessing major social media platforms, including TikTok, YouTube, Instagram, and Facebook.

The measure, passed by Parliament last year, is set to take effect on Wednesday. Companies that fail to comply could face fines of up to $33 million.

“From 10 December 2025, age-restricted social media platforms will have to take reasonable steps to prevent Australians under the age of 16 from creating or keeping an account,” the government said, framing the law as a safeguard for children “at a critical stage of their development.”

Under the new rules, platforms must use a combination of signals—including account activity, viewing habits, and user photos—to identify underage users. Companies will also be required to block minors from bypassing age restrictions through fake IDs, AI-generated images, deepfakes, or VPNs.

The tech industry has criticized the legislation as “vague,” “problematic,” and “rushed.” TikTok and Meta warned that enforcement could be challenging but committed to compliance. Meta has already started removing under-16 accounts ahead of the December 10 deadline. Snapchat and other companies cautioned that the law could drive children toward “darker corners of the internet.” Reddit called the legislation “legally erroneous” and “arbitrary.”

Similar proposals are gaining traction globally. The European Parliament adopted a non-binding resolution in November advocating a minimum social media age of 16 to ensure “age-appropriate online engagement.” Denmark has suggested banning users under 15, while France, Spain, Italy, Greece, and Denmark itself are testing a joint age-verification app. Malaysia plans to prohibit social media use for under-16s starting in 2026.

Last week, Russia banned Roblox, a popular online gaming platform for children, citing the spread of extremist content and LGBTQ-related material.

Rising concerns over online child safety have also increased legal pressure on tech giants. Meta faces lawsuits in the U.S. alleging it allowed harm to persist on its platforms, including unwanted interactions between minors and adult strangers, as well as content linked to suicide, eating disorders, and child sexual abuse.

