Instagram takes steps to identify troubling posts, answer users' cries for help

Ben Schmitt | Monday, Oct. 31, 2016, 7:33 p.m.

Instagram, the popular, free photo-sharing application, is tackling self-harm and eating disorders with a tool aimed at answering online cries for help.

The social media platform last month launched a support system that relies on users instead of algorithms to flag troubling posts.

When a user anonymously flags a post, Instagram will send a message to the person who posted the troubling words. The message reads: “Someone saw one of your posts and thinks you might be going through a difficult time. If you need support, we'd like to help.”

The recipient will then be encouraged to talk with a friend, call a help line or get a list of support options.

A team working around the clock will review flagged posts to judge whether someone is vulnerable, prioritizing the most serious reports and responding quickly.

“There's emerging data to show this can be effective,” said Dr. Abigail Schlesinger, medical director for integrated care for Children's Hospital of Pittsburgh of UPMC and Western Psychiatric Institute and Clinic. “Kids, for example, spend a lot of time on social media. If this is a way to get them access for care, I am all in favor of it.”

Instagram, which is owned by Facebook, worked closely with the National Eating Disorders Association, the National Suicide Prevention Lifeline and Suicide Awareness Voices of Education (SAVE), along with people with real-life experience with eating disorders, self-injury or suicide. Last year, Facebook unveiled a similar tool.

“Most of the time when it comes to suicidal crisis, people are scared to ask about it,” SAVE Executive Director Daniel Reidenberg said. “If you see a post or a picture and are worried about it, you may not know who to turn to. This gives you a way to report it.”

Dr. P.V. Nickell, chair of Allegheny Health Network's Adult Psychiatric Services division, viewed the app last week and came away impressed.

“The worst thing is not asking someone if they need help,” he said. “This could help demystify that process.”

Suicide is the 10th-leading cause of death in the United States, according to the American Foundation for Suicide Prevention. Each year, about 43,000 Americans die by suicide.

Hollie Geitner, vice president of client services for WordWrite Communications, a Pittsburgh public relations firm, said Instagram's effort is an example of how social media can do good.

“Because Instagram's system is not guided by algorithms, but by people, it can be a powerful tool in identifying and helping people in distress,” she said. “A post from someone offering help or support can be that one shimmer of light and hope for someone who feels lost in a tunnel of darkness. It's not difficult to find reasons to declare social media as bad — however, it is here to stay and a part of our culture. Let's look for the positive when we can. This certainly fits that category, in my opinion.”

Geitner said she struggled with a bout of depression years ago before social media existed.

“I believe that, in many situations, the person experiencing depression is crying out for help,” she said. “Social media is one way people do that today, whether through posts in their own words or via shared memes or images.”

With more than 500 million users worldwide, Instagram is a natural platform for outreach.

“There's no question that there are many people online who are having emotional issues, and I definitely applaud Instagram's interest in trying to come up with something to address this,” said Dr. Brian Primack, director of the University of Pittsburgh Center for Research on Media, Technology and Health. “I would love to study this type of thing, see how often the system is activated and how effective it is over the years to come.”

Ben Schmitt is a Tribune-Review staff writer. Reach him at 412-320-7991 or bschmitt@tribweb.com.
