The alerts will begin within the next two weeks and will only be sent to parents who have signed up for the social media platform’s supervision tool.
Charities have branded the move “flimsy” and called for Meta to do more to protect children.
Announcing the policy change, Meta said: “We understand how sensitive these issues are, and how distressing it could be for a parent to receive an alert like this.
“The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support.
