
Facebook AI will be able to help prevent suicides on live-streams.
The act of suicide is a sensitive and controversial subject, even more so when people make the life-ending act a public one by live-streaming it on platforms such as Facebook Live. To combat this growing phenomenon, Facebook has implemented a new set of artificial intelligence tools.
The company recently announced that it will integrate a number of suicide prevention tools into its Live video feature. The new Facebook AI tools are meant not only to keep suicides from being broadcast to Facebook users, but also to provide live-chat support through Facebook Messenger from organizations such as the National Suicide Prevention Lifeline and the Crisis Text Line. These tools are meant to connect suicidal people with those who can help them overcome their current state.
The new initiative involves AI algorithms that can detect several warning signs in users' posts and in the comments viewers leave in response to a live-stream. The AI compiles the warning signs into a report, which is then reviewed by a human team. Afterward, the team contacts those at risk of self-harm to suggest ways to find support and help.
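Facebook has not published the details of its algorithms, but the general flag-then-review workflow described above can be illustrated with a minimal, purely hypothetical sketch. Everything here, including the phrase list, the flagging rule, and the function names, is an illustrative assumption, not Facebook's actual system:

```python
# Hypothetical sketch of the flag-then-review flow described in the article.
# The phrase list and flagging logic are illustrative assumptions; a real
# system would use trained pattern-recognition models, not a keyword list.
from dataclasses import dataclass, field

# Toy list of warning-sign phrases (assumed for illustration only).
WARNING_PHRASES = ["want to die", "end it all", "say goodbye", "no reason to live"]


@dataclass
class Report:
    user_id: str
    flagged_texts: list = field(default_factory=list)

    @property
    def needs_human_review(self) -> bool:
        # Any flagged text routes the report to a human team for review.
        return len(self.flagged_texts) > 0


def scan_stream(user_id: str, post: str, comments: list) -> Report:
    """Flag warning signs in a user's post and in viewers' comments."""
    report = Report(user_id=user_id)
    for text in [post, *comments]:
        lowered = text.lower()
        if any(phrase in lowered for phrase in WARNING_PHRASES):
            report.flagged_texts.append(text)
    return report


if __name__ == "__main__":
    report = scan_stream(
        user_id="example_user",
        post="I just want to say goodbye to everyone.",
        comments=["Are you okay?", "Please talk to someone."],
    )
    if report.needs_human_review:
        # In the workflow the article describes, a human team reviews the
        # report and reaches out to the at-risk user with support resources.
        print(f"Escalating report for {report.user_id}: {report.flagged_texts}")
```

The key design point the article describes is that the AI only compiles and escalates; the decision to contact an at-risk user stays with a human team.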
Public suicides are nothing new to the world. However, since the advent of live-video streaming services, suicidal people have used them to broadcast their acts to a wider audience than ever before. With over 1.8 billion users worldwide, Facebook has seen the largest number of live-streamed suicides, although official figures have not been released.
Dan Reidenberg, the director of Save.org, which collaborates with Facebook on these types of cases, has revealed that there have been seven known cases since live-streaming became popular, not all of them on Facebook.
Facebook is one of the leading tech companies in suicide prevention, as it constantly updates its tools to provide troubled users with the help they need. Until now, however, suicidal behavior mostly had to be reported by other users. The company hopes to change that by using pattern-recognition AI software to detect early signs of self-harming behavior.
What do you think about these types of tools? Have you known anyone who attempted suicide?
Image source: Pixabay