Instagram, owned by Meta, is implementing a new feature that will alert parents if their teenagers repeatedly search for terms related to self-harm or suicide on the platform. This initiative aims to provide parents with the opportunity to support their children in sensitive situations.
The alert system will notify parents when their child ‘repeatedly tries to search for terms clearly associated with suicide or self-harm within a short period of time.’ The feature will initially launch in the US, UK, Australia, and Canada for parents and teens who opt in to supervision, and will later expand to other regions.
According to Instagram, most teens do not actively search for self-harm content on the platform. In cases where such searches occur, Instagram’s policy is to block them and direct users to support resources. The goal of the new parental alerts is to empower parents to intervene if their child’s search behavior suggests a need for assistance.
Parents will receive notifications via email, text, or WhatsApp, along with in-app alerts that include guidance on how to approach conversations about sensitive topics with their children. The move underscores Instagram’s commitment to promoting safety and well-being among its younger users.
Source: The Verge