Tech companies tapping artificial intelligence to treat and predict mental health disorders
Among the solutions is an app that uses information like user-reported moods and usage patterns to recommend programmes. On Mental Health Day, CNA looks at what is being done in this space.
SINGAPORE: Could the future of managing mental health lie in strings of code and predictive models?
Behavioural health tech provider Holmusk is banking on that, partnering authorities in Singapore to develop a suite of digital tools for hospitals and clinics.
One solution the firm is looking to introduce is a “smart pill” to track when patients forget or skip their medication.
It works via a small, grain-sized biosensor embedded in the pill, paired with a sticky patch on the patient’s body that detects when the pill is ingested. The technology is approved in the United States.
“Let's say schizophrenia, depression patients with some psychosis – not taking the pill for a few days can be bad enough to drive them off the cliff. And if you knew that they have stopped taking the pill two days in a row, you can intervene. You can catch them early”, chief analytics officer of Holmusk, Joydeep Sarkar, told CNA.
He described the sensor as a "fascinating solution" that will work with other parts of the puzzle - feeding the information into the patient’s clinical records, flagging missed doses and making any necessary intervention part of the workflow.
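The workflow Mr Sarkar describes - ingestion events flowing into a record, with a gap of a couple of days triggering an alert - can be sketched in a few lines. The data format, threshold and function name below are illustrative assumptions, not Holmusk's actual system.

```python
from datetime import date

def missed_dose_alert(ingestion_dates, today, max_gap_days=2):
    """Return True if no ingestion event has been recorded for
    `max_gap_days` or more consecutive days.
    `ingestion_dates` is the set of days on which the biosensor
    patch confirmed a pill was taken (hypothetical data feed)."""
    if not ingestion_dates:
        return True
    last_taken = max(ingestion_dates)
    return (today - last_taken).days >= max_gap_days

# Example: last confirmed dose was two days ago -> flag for intervention
events = {date(2023, 10, 6), date(2023, 10, 7), date(2023, 10, 8)}
print(missed_dose_alert(events, today=date(2023, 10, 10)))  # True
```

In a real deployment the alert would feed into the clinical workflow Mr Sarkar mentions, rather than simply printing a flag.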
The firm has also developed an artificial intelligence (AI) model to analyse information from “notes-driven” mental health treatment or therapy.
This may allow researchers to generate insights into the efficacy of treatments and the progression of disorders at a larger scale in future.
PREDICTING RISKS FOR MENTAL HEALTH DISORDERS
Such data in areas like medication and treatment could also feed into predictive models to understand the risks surrounding each patient.
"A big part of where artificial intelligence plays a role is more complex patients - where the answers are not obvious,” Mr Sarkar said.
“Let's say somebody comes in and you stabilise them, and you keep them in the hospital. When is it okay to release them … What support systems could you actually really have in place so that patients don't get worse?”
One disorder his firm is looking to zoom in on is bipolar disorder, which has a strong genetic component.
The aim is to tap data to identify and track those at risk of developing the disorder to catch the signs early.
"I call them the low-hanging fruit because you don't really need much, (you) just (need) to connect the data”, Mr Sarkar said.
TEACHING PEOPLE TO HELP THEMSELVES
Other industry players are also looking to tap into artificial intelligence to help people take care of their own mental health.
Among them is mental health platform Intellect, which has attracted 3 million users globally since its launch in 2020.
“Mental health has long been a very strong need across Asia, in the world. That has been unmet in support. We've seen quite a sharp rise in client service over two years”, said Intellect chief executive Theodoric Chew.
Intellect's app uses information like user-reported moods and usage patterns to recommend programmes to users. It also uses algorithms to match individuals to therapists, based on their needs and specialisations.
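A matching step like the one Intellect describes - pairing a user with a therapist based on needs and specialisations - could, in its simplest form, score the overlap between the two. The data shapes and scoring rule below are illustrative assumptions, not the company's actual algorithm.

```python
def match_therapist(user_needs, therapists):
    """Rank therapists by how many of the user's stated needs fall
    within each therapist's specialisations, and return the best fit.
    (Illustrative only - a real system would weigh far more signals,
    such as self-reported moods and in-app usage patterns.)"""
    def overlap(therapist):
        return len(user_needs & therapist["specialisations"])
    return max(therapists, key=overlap)

therapists = [
    {"name": "A", "specialisations": {"anxiety", "sleep"}},
    {"name": "B", "specialisations": {"anxiety", "burnout", "grief"}},
]
best = match_therapist({"anxiety", "burnout"}, therapists)
print(best["name"])  # B
```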
The app counts 24-year-old Charis Liang among its users. The undergraduate, who previously worked as an intern at the company, turns to the app for sessions when she's overwhelmed and needs help quickly.
“You can't call up your therapist at 3am, but you can do this", she said. Exercises on the app, based on cognitive behavioural therapy, are similar to those in a traditional therapy session, with the added ease of being on-demand, she added.
"With … a therapist, you can't really do that because you need to book a time, drive yourself to the office and sometimes there are just barriers between you accessing the resource”, Ms Liang said.
HURDLES TO CROSS
Despite the perks of tech, Intellect provides therapy rooms in its office for those who still prefer physical sessions over virtual consultations, in the belief that the human touch remains indispensable in mental health care.
Also, with the sensitive nature of mental health issues, confidentiality is a major consideration for both users and providers.
“We use something called zero-knowledge encryption, which means that the data is only on a person's device and no one else - employers can't see, we can't see," Mr Chew said.
“We always ensure their privacy. And that creates a lot more trust."
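The principle Mr Chew describes - data readable only on the user's own device - rests on the key never leaving the client, so the server stores only ciphertext. The sketch below is a toy illustration of that idea using a SHA-256-derived keystream; real deployments would use a vetted cipher, and all names here are hypothetical.

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustrates the client-side principle only - NOT production crypto."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

device_key = b"stays-on-device-only"              # never sent to the server
note = "Felt overwhelmed today".encode()

ciphertext = keystream_xor(device_key, note)       # all the server ever sees
plaintext = keystream_xor(device_key, ciphertext)  # only the device can do this
print(plaintext.decode())  # Felt overwhelmed today
```

Because XOR is its own inverse, the same function encrypts and decrypts; without `device_key`, the stored ciphertext is unreadable to the provider or an employer.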
Besides privacy, another challenge is explainability, said Holmusk's Mr Sarkar.
This refers to being able to make sense of AI recommendations.
Tech companies, he said, will have to collaborate with clinicians and health authorities to look at requirements necessary for safety, and study the implications carefully.
“If I'm recommending a YouTube video, I don't need to know why the AI chose that YouTube video because there are no consequences of you not liking a YouTube video, but the consequences are far more acute in the healthcare setting,” Mr Sarkar said.