Facebook launches AI tool to identify users expressing suicidal thoughts

Facebook is to use artificial intelligence to help spot user posts expressing thoughts of suicide, and quickly connect members with help.
Suicidal posts will reach local authorities "twice as quickly"

The social network has said it will begin using "pattern recognition" on posts and live video on the site, as well as prioritising such posts for Facebook's review team, who can alert authorities if necessary.

The social media giant said early tests of the new programme had seen reporting of the most concerning suicidal posts reach local authorities "twice as quickly" as other reports.
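Facebook has not published technical details, but the prioritisation it describes is easy to picture: each flagged post carries a model confidence score, and the review queue surfaces the highest-scoring posts first. A minimal sketch in Python, with every name and score invented for illustration:

```python
# A rough sketch of prioritised review, assuming a hypothetical risk score
# in [0, 1] produced by the pattern-recognition model. All names here are
# illustrative, not Facebook's actual system.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FlaggedPost:
    priority: float                      # negated risk score, for the min-heap
    post_id: str = field(compare=False)
    text: str = field(compare=False)

review_queue: list[FlaggedPost] = []

def enqueue_for_review(post_id: str, text: str, risk_score: float) -> None:
    # Negate the score so heapq's min-heap pops the most concerning post first.
    heapq.heappush(review_queue, FlaggedPost(-risk_score, post_id, text))

def next_for_review() -> FlaggedPost:
    return heapq.heappop(review_queue)

enqueue_for_review("a", "low-risk post", 0.2)
enqueue_for_review("b", "high-risk post", 0.9)
print(next_for_review().post_id)  # "b" - the higher score jumps the queue
```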


Facebook's Guy Rosen said: "We are starting to roll out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. This will eventually be available worldwide, except the EU.

"This approach uses pattern recognition technology to help identify posts and live streams as likely to be expressing thoughts of suicide.

"We continue to work on this technology to increase accuracy and avoid false positives before our team reviews."

As part of the programme, Facebook said its artificial intelligence would also analyse the text of comments beneath posts, as offers of help appearing there can be a sign that a user may be in danger.


"We use signals like the text used in the post and comments - for example, comments like 'Are you OK?" and "Can I help?' can be strong indicators," Mr Rosen said.

"In some instances, we have found that the technology has identified videos that may have gone unreported."

The technology will be used alongside the social network's existing Community Operations team, which reviews reports about content posted to the site and includes specialists trained in speaking to those expressing thoughts of self-harm.

Facebook said it also works with more than 80 mental health organisations around the world as part of its suicide prevention tools.


The site also encouraged users to report any posts they spot that could suggest someone was expressing thoughts of suicide.

Facebook has been heavily criticised in recent months over its policing of content that appears on the site, with concerns over extremist material and fake news raised by government and industry experts.

Last week the firm announced it was creating a new tool to help users identify if they had liked or followed now-deleted pages linked to a Russian propaganda group as part of its crackdown on misinformation on the site.
