
Facebook rolls out new AI software designed to identify and assist suicidal users

By Yoni Heisler, from BGR

In a testament to how entrenched Facebook has become in our day-to-day lives, it’s not uncommon for users who commit suicide to have left telling videos or posts on their walls in the preceding days. In an effort to proactively tackle the problem, Facebook is rolling out a suite of new AI tools designed to detect posts that may suggest a user is suicidal. When a cryptic post is essentially a cry for help, identifying it sooner rather than later can ultimately help save lives.

As part of Facebook’s new push to identify situations that warrant closer attention, the social networking giant’s new software can analyze posts and even live video streams for signs that a user might be having suicidal thoughts. The software is also designed to take comments on a worrying post into account, with Facebook noting that comments like “Are you ok?” can be strong indicators that something is wrong. The new AI initiative is rolling out in international markets today and will eventually make its way to the United States, though Facebook notes that the feature will not be available across the EU.
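
To give a rough sense of what “taking comments into account” could look like in practice, here is a minimal, purely illustrative Python sketch. Facebook has not published its model, so the phrase lists, weights, and the score_post helper below are hypothetical assumptions, not the company’s actual system.

```python
# Purely illustrative sketch: NOT Facebook's actual model.
# It mimics the idea that a post's own wording plus concerned
# comments ("Are you ok?") can together raise a review-priority score.

# Hypothetical phrase lists and weights, chosen only for illustration.
POST_PHRASES = {"can't go on": 0.6, "goodbye everyone": 0.5, "no reason to live": 0.8}
COMMENT_PHRASES = {"are you ok": 0.4, "please call me": 0.3, "worried about you": 0.5}

def score_post(post_text: str, comments: list[str]) -> float:
    """Return a crude 0-1 'needs review' score for a post and its comments."""
    text = post_text.lower()
    score = sum(w for phrase, w in POST_PHRASES.items() if phrase in text)
    for comment in comments:
        c = comment.lower()
        score += sum(w for phrase, w in COMMENT_PHRASES.items() if phrase in c)
    return min(score, 1.0)  # clamp so downstream code can treat it like a probability

if __name__ == "__main__":
    post = "Goodbye everyone, I can't go on like this."
    comments = ["Are you OK?", "Please call me when you see this."]
    print(f"review score: {score_post(post, comments):.2f}")  # high score -> route to human reviewers
```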

Facebook explains how the new initiative works as follows:

We are also using artificial intelligence to prioritize the order in which our team reviews reported posts, videos and live streams. This ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders.
Context is critical for our review teams, so we have developed ways to enhance our tools to get people help as quickly as possible. For example, our reviewers can quickly identify which points within a video receive increased levels of comments, reactions and reports from people on Facebook. Tools like these help reviewers understand whether someone may be in distress and get them help.
In addition to those tools, we’re using automation so the team can more quickly access the appropriate first responders’ contact information.
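
Facebook does not describe its reviewer tooling in detail, but the idea of spotting “which points within a video receive increased levels of comments, reactions and reports” can be illustrated with a small sketch. The window size, threshold, and the activity_hotspots function below are assumptions invented for this example, not Facebook’s actual implementation.

```python
# Illustrative sketch only: the real reviewer tools are not public.
# Groups comment/reaction/report timestamps (seconds into a live video)
# into fixed windows and flags windows with unusually high activity,
# which a reviewer might then jump to first.
from collections import Counter

WINDOW_SECONDS = 30  # assumed bucket size, purely for illustration

def activity_hotspots(event_times: list[float], threshold: int = 5) -> list[tuple[int, int]]:
    """Return (window_start_second, event_count) pairs whose count meets the threshold."""
    buckets = Counter(int(t // WINDOW_SECONDS) * WINDOW_SECONDS for t in event_times)
    return sorted((start, n) for start, n in buckets.items() if n >= threshold)

if __name__ == "__main__":
    # Fake timestamps of comments, reactions, and reports on a live stream.
    events = [12.0, 14.5, 15.1, 16.0, 17.3, 18.9, 95.0,
              400.2, 401.0, 402.5, 403.1, 404.8, 405.0]
    for start, count in activity_hotspots(events):
        print(f"{count} events in the 30s window starting at {start}s")
```

A real system would presumably use such signals to rank the review queue, so that the posts and video moments most likely to indicate distress reach a human reviewer first.
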
Facebook also adds that it has employees working around the clock, 365 days a year, monitoring posts that might suggest a user is suicidal. At the same time, Facebook is working closely with first responders and support groups to reach distressed users as quickly as possible.

You can read more about Facebook’s suicide prevention measures at the web link below.

For more on this story go to: http://bgr.com/2017/11/27/facebook-suicide-prevention-ai-resource-tools/
