Meta Platforms, the parent company of Instagram, Facebook and WhatsApp, is rolling out significant changes aimed at enhancing the safety and well-being of teen users. The move comes as the company faces lawsuits and scrutiny from more than 40 U.S. states, which allege that it misled the public about the harmful effects of its services on young people.

In an official blog post, Meta reiterated its commitment to providing safe and age-appropriate experiences for teens on its apps. The company has developed more than 30 tools and resources over the years to support teens and their parents, with a particular focus on addressing content that may break platform rules or be deemed sensitive.

One major announcement is the introduction of new content policies governing what teens see on Instagram and Facebook. Meta emphasised the importance of consulting experts in adolescent development, psychology, and mental health to create a safer online environment for young users.

Another key change involves removing content about sensitive topics, such as self-harm, from teens’ experiences on Instagram and Facebook. Meta acknowledged that these topics are complex and may not be suitable for all young audiences, and it will no longer show this type of content in teens’ Feeds and Stories, even when it is shared by accounts they follow.

To further support teens, Meta will continue to share resources from expert organisations, such as the National Alliance on Mental Illness, when users post content related to struggles with self-harm or eating disorders. These changes are gradually rolling out to teens under 18 and are expected to be fully implemented on Instagram and Facebook in the coming months.
