Meta tightens teen safety on Facebook and Instagram with new DM restrictions

Meta, the parent company of Facebook and Instagram, has announced new direct messaging restrictions aimed at improving the safety of teenage users on its platforms. According to TechCrunch, the move comes amid growing concerns about the online safety of younger users.

The new restrictions will block unsolicited messages to teens on both Facebook and Instagram. On Instagram, adults were already barred from messaging teens who don’t follow them. The updated limits will now apply by default to all users under 16, and in some regions to all users under 18. Existing users will be informed of the changes through notifications.

On Messenger, teens will receive messages only from people who are already their Facebook friends or listed in their contacts. The measure is designed to further shield young users from potentially harmful interactions with strangers.

In addition to the direct messaging restrictions, Meta is bolstering its parental controls. Guardians will now be able to approve or reject any changes teens make to their default privacy settings, giving parents more oversight of their teens’ accounts: keeping accounts private, managing exposure to sensitive content, and controlling who can send messages to their teens.

Upcoming features and ongoing concerns

Meta also revealed plans to launch a feature that will block teens from receiving unwanted and inappropriate images in their direct messages, even in end-to-end encrypted chats. However, the company has not yet detailed how it will define or detect “inappropriate” content while maintaining user privacy.

This announcement follows Meta’s recent introduction of tools to limit exposure to content related to self-harm and eating disorders for teen users on Facebook and Instagram.

Meta’s initiatives come at a time when the company faces increasing scrutiny over child safety on its platforms. The company is currently involved in a civil lawsuit in New Mexico and is facing allegations in a federal court in California regarding the promotion of harmful content to minors. Additionally, Meta is scheduled to testify before the Senate on Jan. 31, alongside other social networks, on issues surrounding child safety online.

Maxwell William

Freelance Journalist

Maxwell Koopsen, a seasoned crypto journalist and content strategist, has notably contributed to industry-leading platforms such as Cointelegraph, OKX Insights, and Decrypt, weaving complex crypto narratives into insightful articles that resonate with a broad readership.
