Algorithms Targeting Teens: Australia’s Ban Addresses Corporate Profit Motives

Communications Minister Anika Wells has characterized social media platforms as deliberately deploying algorithms that target teenagers and maximize engagement for corporate profit, positioning Australia’s under-16 ban as a necessary intervention against exploitative business practices. During her National Press Club address, Wells said tech companies have wielded enormous power over young users by manipulating teenage psychology, and that the December 10 implementation represents reclaiming that power for families.
YouTube will begin removing underage users next week despite parent company Google’s extensive concerns about the approach. Rachel Lord from Google’s policy division warned that the ban eliminates safety features families currently rely on, including parental supervision tools that allow collaborative content management, restrictions on specific channels, and wellbeing reminders promoting healthy usage patterns. The company argues the legislation was rushed and fundamentally misunderstands youth digital engagement.
Wells has dismissed industry pushback with unusually direct criticism, calling YouTube’s warnings “outright weird” and insisting platforms bear responsibility for content safety. She argued that if YouTube acknowledges hosting age-inappropriate material in logged-out states, that represents a problem the company must solve independently of government regulation. The minister directed families toward YouTube Kids as the government’s preferred alternative for younger audiences.
ByteDance’s Lemon8 app demonstrates how regulatory pressure extends beyond platforms explicitly named in legislation. The Instagram-style service announced voluntary over-16 restrictions from December 10 despite not being included in the original law. Lemon8 had experienced increased interest specifically because it avoided the initial ban, but eSafety Commissioner monitoring prompted proactive compliance rather than waiting for potential future inclusion.
The government has acknowledged that implementation will not be perfect immediately, with Wells conceding it may take days or weeks to fully take effect, but emphasized that authorities remain committed despite imperfect initial results. The eSafety Commissioner will begin collecting compliance data on December 11, with monthly updates thereafter, while platforms face penalties of up to 50 million dollars. Wells warned that any site becoming a destination for harmful content targeting young teens will be added to the restricted list. Her focus on corporate profit motives and algorithmic manipulation frames Australia’s ban as addressing fundamental business-model concerns rather than simply restricting platform access, casting the debate as protecting children from deliberate exploitation rather than as a paternalistic limitation of teen autonomy.