If a user reports a post by clicking “harmful behavior” and then “self-harm,” Facebook’s user safety team reviews the content and sends it to Lifeline. Once the comment is determined to be legitimate, Facebook sends an email to the user who originally posted the thoughts perceived as suicidal. The email includes Lifeline’s phone number and a link to start a confidential chat session.

The recipient decides whether to respond.

Facebook also sends an email to the person who reported the content, letting that person know the site responded. If suicide or other threats appear imminent, Facebook encourages friends to call law enforcement.

The vetting process guards against misuse and harassment while keeping the experience within the user’s control, Wolens said.

Facebook, however, has not created any software that searches the site for suicidal expressions. With so many users and so many comments that a computer algorithm could misinterpret, that would be far too difficult, Wolens said.

“The only people who will have a really good idea of what’s going on is your friends. So we’re encouraging them to speak up and giving them an easy and quick way to get help,” he said.

The Lifeline currently responds to dozens of Facebook users each day. Crisis center workers will be available 24 hours a day to respond to users who select the chat option.

___

Online:

Facebook: http://www.facebook.com

The National Suicide Prevention Lifeline: http://www.suicidepreventionlifeline.org