A collaboration of doctors and academics, including ABMU Health Board emergency medicine consultant Dr Sarah Spencer, has found that artificial intelligence (AI) can scan a patient’s health history and has the potential to highlight patients who may be at increased risk of suicide.
It is based on findings which show that the majority of people who take their own lives attend services other than mental health services in their final year, for apparently unrelated reasons. More than 80% of the suicide cases studied had at least one contact with a GP during the person’s last year.
This new research has looked at ways of using this health information to identify the most vulnerable. And while there is a long way to go before such methods can be used in clinical practice, Dr Spencer, who works at the Princess of Wales Hospital in Bridgend, said the collaboration is producing exciting results.
“Whilst it is very early days, this has the potential to be highly clinically relevant to a significant number of patients across the unscheduled care system,” she said.
Each year around 800,000 people across the world take their own lives, with a huge impact on families, communities and health professionals. But it is not easy to identify those at risk. A large number of variables and complex interactions mean that only a trained clinician can assess a patient’s immediate risk of suicide.
The research was carried out by a team including Dr Marcos del Pozo Banos, Professor Ann John, Professor Damon Berridge and Professor Keith Lloyd from Swansea University’s Medical School, and Dr Caroline Jones of the University’s Hillary Rodham Clinton School of Law, in collaboration with academic colleagues from European universities as well as Dr Spencer.
Dr del Pozo Banos said: “We wanted to see if we could develop an algorithm that analyses routinely collected health data to flag people, so that when patients present with seemingly unrelated conditions, practitioners can ask them appropriate questions about their thoughts and feelings if required.”
He explained the crucial role AI can now play in processing millions of records with thousands of variables to build a risk model very quickly. The model can then process the health history of patients and highlight those who may be at risk.
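As an illustration only, and not the team’s actual pipeline or data, the kind of workflow described above might be sketched as follows: a classifier is trained on many routinely collected variables and then used to score patient histories so that the highest-risk cases can be brought to a clinician’s attention. All of the variable names and the synthetic data below are hypothetical.

```python
# Illustrative sketch only: a generic risk-flagging workflow of the kind
# described above, NOT the research team's actual model or data.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000  # synthetic records; a real system would use millions of anonymised records

# Hypothetical routinely collected variables (column names invented for illustration)
records = pd.DataFrame({
    "age": rng.integers(16, 90, n),
    "gp_contacts_last_year": rng.poisson(4, n),
    "ed_attendances_last_year": rng.poisson(1, n),
    "prior_self_harm_code": rng.integers(0, 2, n),
    "chronic_pain_code": rng.integers(0, 2, n),
})
# Synthetic outcome label, purely so the example runs end to end
outcome = rng.integers(0, 2, n)

X_train, X_test, y_train, y_test = train_test_split(
    records, outcome, test_size=0.2, random_state=0
)

# Fit a risk model on the training records
model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)

# Score each patient's history and flag the highest-risk cases for a
# clinician to review; the 0.8 threshold here is arbitrary.
risk_scores = model.predict_proba(X_test)[:, 1]
flagged = X_test.assign(risk=risk_scores).query("risk > 0.8")

print("AUC on held-out data:", roc_auc_score(y_test, risk_scores))
print("Patients flagged for clinician review:", len(flagged))
```

In keeping with the researchers’ own caveat, a tool of this kind would only flag people for a clinician to assess; it would not itself make any judgement about immediate risk.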
Dr del Pozo Banos, an expert in mental health informatics and AI, said that by using primary care data it was possible to substantially increase the coverage of the system compared to other published proposals.
The team now plans to gradually increase the complexity of the AI system to improve the tool’s precision before carrying out thorough trials.
He said: “Our proposal will not replace clinical assessment of immediate risk, but we hope it will help clinicians to identify vulnerable people when they access health services so that the right questions may be asked.”
Dr Spencer said: “I got involved with this project as a result of a longstanding interest in the management of mental health conditions presenting to the emergency department (ED), and particularly in the assessment of suicide risk in those patients.
“Optimising care for mental health patients is one of the Royal College of Emergency Medicine’s top three research priorities; the contribution of suicide risk assessment to optimising care is self-evident.”