New feature allows parents to track teens’ risky online searches

The photo and video sharing platform announced that parents will now receive alerts if their children search for terms related to self-harm or suicide.

CALIFORNIA: Instagram has rolled out a new safety feature designed to help parents monitor their teenagers’ online activity and protect their mental well-being.

The move is aimed at enabling timely intervention and offering families an opportunity to address concerns before they escalate.

The new addition enhances Instagram’s existing parental supervision tools, providing real-time notifications along with access to helpful resources.

These resources are intended to guide parents in having constructive conversations with their children about sensitive issues and online safety.

According to the company, the alerts will begin rolling out next week in the United States, United Kingdom, Australia, and Canada, with plans to expand the feature to additional regions in the near future.

In a further step to safeguard minors, Instagram has also restricted what younger users can find through search: content related to self-harm or suicide will now be blocked for teen accounts.

The company said the initiative is part of its broader effort to create a safer digital environment, giving parents early warnings so they can take proactive steps to support their children’s mental health.