Instagram Faces Scrutiny Over Delayed Teen Safety Features

This article was generated by AI and cites original sources.

Instagram has faced scrutiny over its delayed rollout of key safety features for teens, such as a nudity filter, as revealed in a recent court filing. Lawyers in litigation over social media app addiction questioned Meta's sluggishness in launching tools to protect young users. Although Meta acknowledged teen safety problems in 2018, it did not introduce a filter for unwanted nudity in private messages until 2024, a gap that critics say highlights significant shortcomings in its safety measures.

During a deposition, Instagram head Adam Mosseri was asked about a 2018 email exchange in which concerns were raised about inappropriate content, including explicit images, circulating in private messages on the platform. Mosseri acknowledged the risks but stressed the difficulty of balancing user privacy with safety. The delay in shipping crucial safety updates, including the nudity filter, drew attention to how Instagram approaches protecting young users from harmful content.

Statistics cited during the testimony painted a concerning picture, with a significant percentage of teens reporting exposure to inappropriate images and harmful behavior on the platform. While Instagram has introduced a range of safety features over the years, questions remain about how effective those measures are and whether they arrived quickly enough.

Source: TechCrunch