Meta implements major privacy overhaul for teen protection
Meta, the parent company of Facebook and Instagram, has unveiled a major privacy update aimed at strengthening protections for teen users. The change arrives amid a series of lawsuits in the United States that have pressed the company toward stricter content controls for younger users.
According to an Engadget report, the update is a comprehensive attempt by Meta to shield teenagers from harmful content on its platforms. Most notably, users under 16 will no longer see content related to self-harm, graphic violence, or eating disorders in their Feeds and Stories on Instagram and Facebook, even when it is shared by accounts they follow. To steer teens toward healthier material, searches for these sensitive topics will instead direct them to expert resources. Meta says the decision was informed by consultations with specialists in adolescent development.
Expanding control settings for enhanced safety
Existing teenage users on Meta’s platforms will also be automatically placed under the most restrictive content settings, expanding a policy that previously applied only to new accounts. These settings, known as “Sensitive Content Control” on Instagram and “Reduce” on Facebook, are now mandatory for users under 16.
Meta is also rolling out notifications prompting teens to update their privacy settings to the recommended levels. These settings limit who can repost, tag, or mention them, block messages from non-followers, and hide offensive comments, strengthening safeguards against online harassment.
This initiative builds on Meta’s earlier efforts to create a safer online environment for its younger audience. The company previously switched users under 16 to more restrictive content settings by default, introduced measures to prevent unsolicited contact from “suspicious” adults on Facebook and Instagram, and limited gender-based ad targeting for teenage users.
The urgency of these measures has been underscored by a spate of legal challenges against Meta. Notably, a collective complaint from 41 states accused the company of exacerbating mental health issues among young users. Seattle schools have filed a lawsuit over a youth mental health crisis, and a recent ruling requires social media companies to defend themselves in teen addiction lawsuits. Adding to the legal pressure, a complaint from 33 states alleges that Meta actively sought out users under 13 and was not transparent about how it handled underage accounts once detected.