Just over two years ago, Facebook launched some serious suicide prevention tools, and last summer made them available around the globe. Now, the company is further upgrading its support for suicide prevention measures and also building them into Facebook Live.
Facebook has a responsibility to help keep its users safe. Everyone’s favourite social network finds itself in a unique position: through friendships on the site, it can connect people who are in distress and contemplating suicide with those who can support them. After all, one of the best ways to prevent suicide is for people in distress to hear from those who care about them. So, as part of Facebook’s efforts to make the platform safer for everyone, it is updating its suicide prevention tools and resources aimed at those who are contemplating suicide, and integrating them into Facebook Live.
Suicide prevention tools integrated into Facebook Live include “live chat support from crisis support organizations through Messenger” and “streamlined reporting for suicide, assisted by artificial intelligence.” On Facebook, users can already reach out to or report those who seem to be in distress. Facebook says that it has “teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide.”
The platform offers people who are expressing thoughts of suicide several support options, including prompting them to reach out to a friend. This is done with pre-populated text to make starting a conversation easier. Another option is the suggestion to contact a help line. Finally, it also includes “tips and resources for people to help themselves in that moment.”
With the recent update, Facebook also allows people to connect with “crisis support partners” like Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline via Messenger. The option to message any of these partners will now be available through Facebook’s suicide prevention tools, or through the organisation’s Page.
Finally, Facebook is also using AI to streamline its reporting process. As Vanessa Callison-Burch, Product Manager, Jennifer Guadagno, Researcher, and Antigone Davis, Head of Global Safety at Facebook, explain in a recent post announcing the new measures,
Based on feedback from experts, we are testing a streamlined reporting process using pattern recognition in posts previously reported for suicide. This artificial intelligence approach will make the option to report a post about “suicide or self injury” more prominent for potentially concerning posts like these.
The company is also testing “pattern recognition to identify posts as very likely to include thoughts of suicide.” Facebook says that its Community Operations team reviews posts that have been identified in this way and provides resources to the poster.