Meta, the parent company of Facebook and Instagram, has unveiled plans to implement more stringent content controls to enhance online safety for teenagers.
The social media giant aims to prioritize age-appropriate experiences for young users, in line with recommendations from experts in adolescent development and mental health.
In a detailed blog post, Meta outlined its commitment to placing teenagers under “the most restrictive” content control settings on both Instagram and Facebook.
The company said it will hide certain categories of content, including posts related to suicide, self-harm, and eating disorders. These topics will no longer be recommended or surfaced through the ‘Search’ and ‘Explore’ features.
Meta emphasized the significance of creating a secure digital environment for teens, stating, “We want teens to have safe, age-appropriate experiences on our apps.”
The announcement highlighted over a decade of investment in developing policies, technology, and more than 30 tools and resources to support both teens and their parents.
As part of the heightened safety measures, users who search for terms related to these topics will instead be directed to expert resources for assistance, reinforcing the company’s commitment to mental health support.
This latest initiative reflects Meta’s ongoing efforts to evolve its platforms responsibly, taking into account the well-being and safety of its younger user base. The move aligns with industry trends emphasizing the importance of age-appropriate content and responsible digital engagement.