Facebook launches AI to help vulnerable users

Facebook has unveiled AI designed to identify users at risk of suicide, with algorithms intended to spot ‘warning signs’ in their posts and comments.

Posts flagged as concerning will be reviewed by Facebook staff, and the company will then contact users thought to be at risk of self-harm or suicide.

Founder Mark Zuckerberg recently announced that he also plans to use similar algorithms to spot signs of radicalism, along with other content that may give cause for concern.

Beyond post detection, Facebook has announced plans to spot and respond to signs of suicidal behaviour on its Facebook Live platform. The social network has teamed up with a number of mental health organisations, and will use its Messenger service to let vulnerable users contact them directly.

Advice has been available to vulnerable users for a number of years, but until now the system relied on other users raising concerns through the report button.

The system now incorporates pattern-spotting algorithms trained on previously flagged posts. Once a post is identified as concerning, it is sent for review by the network’s community operations team.
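As a rough illustration of how such a pipeline might work (Facebook has not published the details of its system), the sketch below trains a simple text classifier on previously flagged posts and routes anything scoring above a threshold to a human review queue. The training examples, the threshold, and the send_to_community_operations helper are all hypothetical.

    # Hypothetical sketch of a flagged-post classification pipeline.
    # Not Facebook's actual system, whose details are not public.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Assumed training data: posts previously reported by users,
    # labelled 1 if reviewers judged them concerning, 0 otherwise.
    train_posts = [
        "I can't take this anymore, there's no way out",
        "Great game last night, what a finish!",
    ]
    train_labels = [1, 0]

    classifier = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),
        LogisticRegression(),
    )
    classifier.fit(train_posts, train_labels)

    REVIEW_THRESHOLD = 0.5  # assumed cutoff; a real system would tune this

    def send_to_community_operations(post, score):
        # Stand-in for the human review queue; here we just print.
        print(f"Queued for review (score={score:.2f}): {post!r}")

    def route_post(post):
        # Score a new post and queue it for human review if it
        # crosses the threshold; the model never acts on its own.
        risk_score = classifier.predict_proba([post])[0, 1]
        if risk_score >= REVIEW_THRESHOLD:
            send_to_community_operations(post, risk_score)

The key point, as described above, is that the algorithm only routes posts to people: the final judgement rests with human reviewers, not the model.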

“We know that speed is critical when things are urgent,” Facebook product manager Vanessa Callison-Burch told the BBC.

The effort was praised by the US National Suicide Prevention Lifeline, though its director said he hoped Facebook could do more than just offer advice.

“It’s something that we have been discussing with Facebook,” said Dr John Draper.

“The more we can mobilise the support network of an individual in distress to help them, the more likely they are to get help. The question is how we can do that in a way that doesn’t feel invasive. I would say though that what they are now offering is a huge step forward.”

The main aim of the project is to help at-risk users during a live broadcast, rather than after the event has ended.

“Some might say we should cut off the stream of the video the moment there is a hint of somebody talking about suicide,” said Jennifer Guadagno, Facebook’s lead researcher on the project.

“But what the experts emphasised was that cutting off the stream too early would remove the opportunity for people to reach out and offer support.

“So, this opens up the ability for friends and family to reach out to a person in distress at the time they may really need it the most.”

The new system is being rolled out worldwide.
