In response to growing concerns over the safety of teen users, Meta Platforms, the parent company of Instagram and Facebook, has announced new measures to enhance protections against unwanted direct messages. The move, disclosed on Thursday, is part of Meta’s ongoing efforts to address regulatory pressure and ensure a safer online environment for young users.
This initiative follows Meta’s recent commitment to hiding more content from teens, a response to regulatory demands that the company shield children from potentially harmful material across its platforms.
Regulatory scrutiny intensified after a former Meta employee testified before the U.S. Senate, alleging that the company knew of harassment and other risks faced by teens on its platforms but failed to take appropriate action.
Under the new safeguards, teen users on Instagram will, by default, no longer receive direct messages from people they do not follow or have no existing connection with. Additionally, changing certain app settings will now require parental approval.
On Messenger, accounts belonging to users under 16 (or under 18 in certain countries) will receive messages only from Facebook friends or people connected through phone contacts. Meta also said that adults aged 19 and older will be barred from messaging teens who do not follow them.
These updates reflect Meta’s stated commitment to the well-being of teen users and a more secure online experience, as the company continues to adapt its platforms to evolving regulatory standards and emerging safety challenges.