Poached eggs, nights out and fancy new purchases – Instagram doesn’t exactly give the full story of our lives. We all know the social media site has become something of a highlight reel, so it might come as a surprise to hear that researchers can predict which users had depression based on what filters they used (or didn’t use). This week Instagram launched a new feature designed to help monitor its users’ low points even more closely.
Now, if you see a pal post something that worries you, you can do something about it.
If a friend posts about self-harm, you can anonymously report it. They’ll get a message that reads: “Someone saw one of your posts and thinks you might be going through a difficult time. If you need support, we’d like to help.” They’ll then receive a prompt that encourages them to chat to a friend, contact a mental health line or receive some mental health advice.
Plus, if you search for any problematic hashtags like self-harm, you’ll be directed to the help page. Search terms like self-harm and thinspiration have also been banned. Instagram worked with groups like the National Eating Disorders Association and the National Suicide Prevention Lifeline to come up with the right language for the messages too.
“We listen to mental health experts when they tell us that outreach from a loved one can make a real difference for those who may be in distress. At the same time, we understand friends and family often want to offer support but don’t know how best to reach out,” Instagram chief operating officer Marne Levine told Seventeen. “These tools are designed to let you know that you are surrounded by a community that cares about you, at a moment when you might most need that reminder.”
What do you think of the latest feature? Let us know on Twitter!