In response to global regulatory pressure urging enhanced protection for children on its platforms, Meta Platforms announced on Tuesday that it would implement stricter content controls for teens on Instagram and Facebook. All teenagers will now be placed into the most restrictive content settings, and Instagram will impose additional limits on search terms to reduce exposure to sensitive content such as suicide, self-harm, and eating disorders.
The initiative, outlined in a blog post by Meta, aims to create a more age-appropriate experience for teens and is expected to roll out over the coming weeks. By tightening the controls, Meta seeks to address concerns raised by regulators worldwide about the potentially harmful impact of its apps on youth mental health.
The move comes as Meta faces legal challenges, with attorneys general from 33 U.S. states, including California and New York, suing the company in October over allegations of misleading the public about the dangers of its platforms. In Europe, the European Commission has also sought information regarding how Meta safeguards children from illegal and harmful content.
The scrutiny intensified following testimony in the U.S. Senate by a former Meta employee, Arturo Bejar, who alleged that the company was aware of the harassment and other harms faced by teens on its platforms but failed to take adequate action. Bejar, who called for design changes to encourage positive behaviors and improve tools for managing unpleasant experiences, expressed dissatisfaction with Meta's latest changes, stating that the company still lacked an effective way for teens to report unwanted advances.
The competition between Meta and TikTok for young users has been fierce as Facebook usage among teens declines, and brands have long prized this demographic as consumers on Facebook and Instagram. According to a 2023 Pew Research Center survey, TikTok and Instagram are more popular among U.S. teens, used by 63% and 59%, respectively, compared with only 33% for Facebook.
As Meta strives to address concerns and navigate regulatory challenges, the upcoming changes in content controls for teens reflect an ongoing effort to balance user experience with the imperative to safeguard young users from potentially harmful content.