There is an increased need for mental health providers to improve self-harm and suicide risk prediction. (Image: Rosy/Pixabay)

Artificial intelligence (AI) is being utilised to enhance the prediction of suicide risk in children and adolescents: researchers have developed three machine-learning models that draw on 84 data points from electronic health records.

The models cross-analyse multiple mental health diagnostic codes and indicators, enabling them to outperform traditional health systems, which often miss a sizeable proportion of children with self-injurious thoughts or behaviours.
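The published models and data are not reproduced here. As a rough, hypothetical sketch of the general approach, the snippet below trains a standard gradient-boosted classifier on a small invented table of structured record fields and reports its recall, the share of at-risk visits it catches. Every column name and value is made up for illustration and is not drawn from the study.

```python
# Illustrative sketch only: a generic record-based classifier, not the study's models.
# All feature names and data here are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

# Hypothetical extract of structured electronic-record fields (one row per visit).
visits = pd.DataFrame({
    "age":                  [12, 15, 17, 14, 16, 13, 18, 11],
    "prior_mh_visits":      [0, 3, 1, 5, 0, 2, 4, 0],
    "depression_code":      [0, 1, 0, 1, 0, 1, 1, 0],
    "anxiety_code":         [1, 1, 0, 0, 0, 1, 0, 0],
    "ed_visit_after_hours": [0, 1, 1, 0, 1, 0, 1, 0],
    # Label: whether the visit involved self-injurious thoughts or behaviours.
    "sib_label":            [0, 1, 0, 1, 0, 1, 1, 0],
})

X = visits.drop(columns="sib_label")
y = visits["sib_label"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)

# Recall (sensitivity) matters most here: how many at-risk children are caught.
print("Recall on held-out visits:", recall_score(y_test, model.predict(X_test)))
```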

According to the Royal College of Paediatrics and Child Health, the UK youth suicide rate fell gradually between 1992 and 2017 but rose again in 2018, when a total of 714 deaths were registered; the suicide rate was highest in Northern Ireland.

In England, a quarter of 11-16-year-olds and nearly half of 17-19-year-olds (46.8%) with a diagnosed mental disorder reported having self-harmed or attempted suicide at some point in their lives. For 11-16-year-olds, this represents a more than eightfold risk compared with those without a mental health issue.

Amidst the nationwide youth mental health crisis, there is an increased need for mental health providers to improve self-harm and suicide risk prediction so that they can intervene before it is too late.

The most critical reason for suicide risk detection is to prevent loss of life. Identifying individuals at risk allows for timely intervention and support, such as coping strategies and treatment options, to help address their emotional struggles.

Current health systems often lack a reliable way to assess whether a patient coming through their doors has self-injurious thoughts or behaviours. Many risk-prediction models designed to flag children at future risk have very limited predictive accuracy and are prone to human error.

Experts who reviewed thousands of clinical notes found that diagnostic codes from the International Classification of Diseases (ICD) missed 29 per cent of children who came to the emergency department for self-injurious thoughts or behaviours.

For over half (54 per cent) of those patients, the chief complaint also failed to flag their risk. Chief complaints are the brief statements patients give at the beginning of a healthcare visit describing why they are seeking care. Even when the ICD code and the chief complaint were used together, mental health professionals still missed about 22 per cent of those patients.
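As a simple, hypothetical illustration of how such miss rates can be measured, the sketch below compares expert chart-review labels against two detection signals, an ICD self-harm code and a chief-complaint flag, and computes the share of confirmed cases each signal misses, alone and combined. The small dataset is invented and does not reproduce the study's 29, 54 and 22 per cent figures.

```python
# Illustrative sketch: measuring how many confirmed cases each detection signal misses.
# The flags and numbers below are made up for demonstration only.
import pandas as pd

# Hypothetical chart-review results: each row is a child confirmed by expert
# review to have presented with self-injurious thoughts or behaviours.
cases = pd.DataFrame({
    "icd_flagged": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],  # an ICD self-harm code was recorded
    "cc_flagged":  [1, 1, 0, 0, 0, 1, 1, 1, 0, 0],  # the chief complaint mentioned self-harm
})

def miss_rate(flag):
    """Share of confirmed cases the signal failed to flag."""
    return 1 - flag.mean()

# A case is detected by the combined approach if either signal flags it.
combined = (cases["icd_flagged"] | cases["cc_flagged"]).astype(int)

print(f"Missed by ICD codes alone:       {miss_rate(cases['icd_flagged']):.0%}")
print(f"Missed by chief complaint alone: {miss_rate(cases['cc_flagged']):.0%}")
print(f"Missed by both combined:         {miss_rate(combined):.0%}")
```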

Juliet Edgcomb, MD, PhD, the study's lead author and associate director of UCLA's Mental Health Informatics and Data Science (MINDS) Hub, said: "Our ability to anticipate which children may have suicidal thoughts or behaviours in the future is not great – a key reason is our field jumped to prediction rather than pausing to figure out if we are systematically detecting everyone who is coming in for suicide-related care. We sought to understand if we can first get better at detection."

The AI models being developed for suicide risk prediction are not intended to replace human intervention, but rather to augment the capabilities of mental health professionals and caregivers. These tools can provide an additional source of information to help guide decisions about appropriate interventions and support.

Several tech companies and research institutions are at the forefront of developing and refining these AI models. Collaborations between data scientists, mental health professionals, and technology experts are crucial to ensuring that these tools are accurate, reliable, and ethically sound.

As AI models continue to evolve and gain traction in mental health care, the collective focus remains on striking a balance between innovation and ethics to support effective suicide prevention.

By harnessing the power of technology to enhance early detection and intervention for children and adolescents at risk of suicide, society takes a significant stride towards a multifaceted response to the complex and pressing issue of youth mental health.