
Using Artificial Intelligence to Strengthen Suicide Prevention

A team from the University of Southern California (USC) has designed an artificial intelligence algorithm that identifies which members of a real-life social group would be the best candidates for training as "gatekeepers": people who recognize the warning signs of suicide and know how to respond.

The researchers noted that, according to the CDC, the suicide rate among people aged 10 to 24 increased by 56 percent between 2007 and 2017. The risk is especially acute among people experiencing homelessness: more than half have had thoughts of suicide or have attempted suicide, a far higher rate than in the general population.

The group aimed to examine the potential for social connections such as friends, relatives, and acquaintances to help mitigate the risk of suicide.

“In this research, we wanted to find ways to mitigate suicidal ideation and death among youth. Our idea was to leverage real-life social network information to build a support network of strategically positioned individuals that can ‘watch-out’ for their friends and refer them to help as needed,” said Phebe Vayanos, assistant professor of Industrial and Systems Engineering and Computer Science at the USC Viterbi School of Engineering.

The researchers studied the social networks of young people experiencing homelessness in Los Angeles and developed an algorithm that plans how human gatekeepers can best be positioned and trained within a network to watch out for others.

“We want to ensure that a maximum number of people are being watched out for, taking into account resource limitations and uncertainties of open world deployment. For example, if some of the people in the network are not able to make it to the gatekeeper training, we still want to have a robust support network,” said Vayanos.
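The problem Vayanos describes resembles maximum coverage on a social graph under uncertainty: pick a limited number of gatekeepers so that as many people as possible have at least one trained contact, even if some selected gatekeepers never complete the training. The sketch below illustrates that idea with a simple greedy heuristic and a Monte Carlo estimate of coverage under a uniform attendance probability. Both the greedy selection and the `p_attend` parameter are illustrative assumptions; the USC team's actual method is a robust optimization formulation, not this heuristic.

```python
import random

def expected_coverage(graph, gatekeepers, p_attend, trials=2000, seed=0):
    """Monte Carlo estimate of how many people have at least one trained
    neighbor (or are trained themselves), given that each selected
    gatekeeper only attends the training with probability p_attend."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        attended = {g for g in gatekeepers if rng.random() < p_attend}
        covered = set(attended)
        for g in attended:
            covered.update(graph[g])  # neighbors of a trained gatekeeper
        total += len(covered)
    return total / trials

def greedy_gatekeepers(graph, budget, p_attend=0.8):
    """Greedily pick the gatekeeper that most improves expected coverage
    at each step, until the training budget is spent. Illustrative
    heuristic only, not the study's robust optimization model."""
    chosen = []
    for _ in range(budget):
        base = expected_coverage(graph, chosen, p_attend)
        best, best_gain = None, -1.0
        for v in graph:
            if v in chosen:
                continue
            gain = expected_coverage(graph, chosen + [v], p_attend) - base
            if gain > best_gain:
                best, best_gain = v, gain
        chosen.append(best)
    return chosen
```

On a toy network where node "a" is friends with "b", "c", and "d", a budget of one training slot would go to "a", since training anyone else covers fewer people in expectation.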


The researchers also want the algorithm to be deployed in a way that is fair and transparent.

“We often work in environments that have limited resources, and this tends to disproportionately affect historically marginalized and vulnerable populations,” said co-author on the study Anthony Fulginiti, an assistant professor of social work at the University of Denver who received his PhD from USC.

“This algorithm can help us find a subset of people in a social network that gives us the best chance that youth will be connected to someone who has been trained when dealing with resource constraints and other uncertainties.”

The algorithm was able to reduce the bias in coverage in real-life social networks of homeless youth by as much as 20 percent.

“One of the surprising things we discovered in our experiments based on social networks of homeless youth is that existing AI algorithms, if deployed without customization, result in discriminatory outcomes by up to 68 percent difference in protection rate across races. The goal is to make this algorithm as fair as possible and adjust the algorithm to protect those groups that are worse off,” said Aida Rahmattalabi, lead author of the study.
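One way to read Rahmattalabi's point is that maximizing total coverage can leave a demographic group almost unprotected, so the selection criterion must look at per-group coverage rates. The sketch below adapts the greedy idea to a maximin rule: at each step it prefers the candidate that most raises the worst-off group's coverage. The group labels and the maximin tie-break are illustrative assumptions; the study embeds fairness inside a robust optimization model rather than this heuristic.

```python
def group_coverage_rates(graph, groups, gatekeepers):
    """Fraction of each demographic group with at least one trained
    contact (or trained themselves). `groups` maps node -> group label."""
    covered = set(gatekeepers)
    for g in gatekeepers:
        covered.update(graph[g])
    rates = {}
    for label in set(groups.values()):
        members = [v for v, lab in groups.items() if lab == label]
        rates[label] = sum(v in covered for v in members) / len(members)
    return rates

def fair_greedy(graph, groups, budget):
    """Greedy variant that raises the worst-off group's coverage first
    (maximin), breaking ties by total coverage across groups."""
    chosen = []
    for _ in range(budget):
        best, best_score = None, (-1.0, -1.0)
        for v in graph:
            if v in chosen:
                continue
            rates = group_coverage_rates(graph, groups, chosen + [v])
            score = (min(rates.values()), sum(rates.values()))
            if score > best_score:
                best, best_score = v, score
        chosen.append(best)
    return chosen
```

With two disconnected friend groups, a coverage-only objective spends every training slot on the larger group, while the maximin rule splits the budget so that no group is left without a trained contact.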

“Our aim is to protect as many youth as possible. Not only does our solution advance the field of computer science by addressing a computationally hard problem, but also it pushes the boundaries of social work and risk management science by bringing in computational methods into design and deployment of prevention programs.”

With this AI tool, healthcare professionals could help mitigate suicide risk for homeless and other high-risk individuals.


“Our algorithm can improve the efficiency of suicide prevention trainings for this particularly vulnerable population. If you are strategic, you can cover more people and you can have a more robust network of support,” said Vayanos.

“Through this study, we can also help inform policymakers who are making decisions regarding funding on suicide prevention initiatives; for example, by sharing with them the minimum number of people who need to receive the gatekeeper training to ensure that all youth have at least one trained friend who can watch out for them.”
