Instagram is rolling out a new feature to protect teen users, alerting parents to harmful searches.
Instagram will soon roll out a feature to protect teen users: parents will receive a notification if a teen searches for anything related to suicide or self-harm. The update, which strengthens safeguards around sensitive mental health content, will be added to the company's teen accounts and parental supervision tools. Let's learn more about this feature.
Why is this feature being introduced?
Instagram said these alerts are designed to let parents know when their child may need support, while acknowledging that some notifications may turn out to be unnecessary. The company noted that most teen users don't search for self-harm content, but when someone does, Instagram blocks the search and redirects them to a helpline or support resource.
How will the new feature work?
This feature will begin rolling out next week. Parents and teens who have enrolled in supervision will first receive an advance notice informing them that the alert feature is being activated. After that, if a teen repeatedly searches for content related to suicide or self-harm, their parents will receive a notification. These notifications can be delivered as in-app alerts, as well as by email, text message, and WhatsApp. They will include information about the teen's search along with expert advice, so parents can seek appropriate support in a critical situation. Parents will only receive an alert if the teen has made multiple self-harm-related searches within a short period of time. The company has also cautioned that in many cases parents may receive notifications even when there is no cause for concern about the teen's safety.

