Artificial Intelligence is once again proving useful in healthcare, this time by demonstrating utility in suicide prediction and the clinical management of patients at high risk.
It is impossible to know whether those who died by suicide would have changed their minds. Perhaps, if something had predicted their attempt.
Many others, though, have survived suicidal thoughts and attempts. After receiving help and treatment and recovering fully, many have recognized that ending their lives would have been a mistake.
Yet there are still too many victims of suicide. It takes just a few seconds for a tortured mind to act to end a life.
Suicide has been on the rise in the United States for a generation. It has been estimated that suicide claims the lives of 14 in 100,000 Americans each year, making it the nation’s tenth leading cause of death. Nationwide, around 8.5 percent of suicide attempts end in death.
The Vanderbilt University Medical Center (VUMC) has developed and trialed a machine learning algorithm that predicts suicide attempts.
The trial ran for 11 consecutive months, concluding in April 2020, during which predictions ran silently in the background as adult patients were seen at VUMC.
The algorithm, dubbed the Vanderbilt Suicide Attempt and Ideation Likelihood (VSAIL) model, uses routine information from electronic health records (EHRs) to calculate 30-day risk of return visits for suicide attempt, and, by extension, suicidal ideation.
Colin Walsh, MD, MA, and his team evaluated the performance of the predictive algorithm with an eye to its potential clinical implementation. They reported the study in JAMA Network Open.
Dr. Walsh’s team included Kevin Johnson, MD, MS, Michael Ripperger, Sarah Sperry, PhD, Joyce Harris, Nathaniel Clark, MD, Elliot Fielstein, PhD, Laurie Novak, PhD, MHSA, and Katelyn Robinson.
According to the study, when adult patients were stratified into eight groups by their algorithm-assigned risk scores, the top stratum alone accounted for more than one-third of all suicide attempts documented in the study and approximately half of all cases of suicidal ideation. As documented in the EHR, one in 23 individuals in this highest-risk group went on to report suicidal thoughts, and one in 271 went on to attempt suicide.
“Today, across the Medical Center, we cannot screen every patient for suicide risk in every encounter — nor should we,” said Dr. Walsh, assistant professor of Biomedical Informatics, Medicine and Psychiatry. “But we know some individuals are never screened despite factors that might put them at higher risk. This risk model is a first pass at that screening and might suggest which patients to screen further in settings where suicidality is not often discussed.”
Over the 11-month test, some 78,000 adult patients were seen in the hospital, emergency room and surgical clinics at VUMC. As subsequently documented in the EHR, 395 individuals in this group reported having suicidal thoughts and 85 lived through at least one suicide attempt, with 23 surviving repeated attempts.
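The reported counts translate into small baseline rates, which is what makes the algorithm's risk stratification useful. A minimal back-of-the-envelope sketch in Python, using only the figures reported in this article (the variable names are illustrative, not from the study):

```python
# Figures reported from the 11-month silent trial at VUMC.
patients_seen = 78_000      # adult patients seen during the trial
ideation_cases = 395        # reported suicidal thoughts (per the EHR)
attempt_survivors = 85      # survived at least one suicide attempt

# Baseline rates across all patients seen.
ideation_rate = ideation_cases / patients_seen
attempt_rate = attempt_survivors / patients_seen
print(f"Suicidal ideation: {ideation_rate:.2%} of patients seen")
print(f"Suicide attempt:   {attempt_rate:.2%} of patients seen")

# In the top risk stratum, the article reports roughly 1 in 23
# with ideation and 1 in 271 with an attempt -- the latter is the
# "number needed to screen" figure Dr. Walsh compares to cholesterol
# and cancer screening.
nns_attempt = 271
print(f"Top stratum attempt rate: ~{1 / nns_attempt:.2%} (1 in {nns_attempt})")
```

The contrast between the roughly 0.1 percent baseline attempt rate and the concentration of cases in the top stratum is the core of the screening argument the article goes on to make.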
Dr. Walsh says that for every 271 people identified in the highest predicted risk group, one returned for treatment for a suicide attempt.
“This number is on a par with numbers needed to screen for problems like abnormal cholesterol and certain cancers. We might feasibly ask hundreds or even thousands of individuals about suicidal thinking, but we cannot ask the millions who visit our Medical Center every year — and not all patients need to be asked,” he says.
The study results suggest Artificial Intelligence might help as one step in directing limited clinical resources to where they are most needed.
Dr. Walsh, who originally created the algorithm with colleagues who are now at Florida State University, had previously validated it using retrospective EHR data from VUMC.
The new study’s senior author, William Stead, MD, Professor of Biomedical Informatics, said that Dr. Walsh and his team have shown how to stress-test and adapt an Artificial Intelligence predictive model in an operational electronic health record, paving the way to real-world testing of decision support interventions.