Can chatbots replace healthcare professionals to help mental health patients?

Chatbots are increasingly used in healthcare to handle mental health conditions and support general well-being. 

Chatbots, or computer programmes that simulate human conversations, are increasingly used in healthcare. (File: iStockphoto)

SINGAPORE: Chatbots can provide some comfort to people with mental health issues, but they fall short at detecting suicidal tendencies and at offering appropriate help in crisis situations, researchers have found. 

A study of nine mental health chatbots by Nanyang Technological University (NTU) showed that while they empathised with users in conversations, they were unable to recognise when users expressed suicidal tendencies or to offer personalised advice. 

Chatbots, or computer programmes that simulate human conversations, are increasingly used in healthcare. They are used to manage mental health conditions or support general well-being. 

USING CHATBOTS TO OFFER TIMELY CARE, SUPPORT WELL-BEING

The use of chatbots comes at a time when people are more aware of their mental wellness.

“I think it's important and probably COVID-19 was good to kind of bring mental health a bit more into the open and to really say to people that it is fine if they don't feel well and they can talk about these things,” said Dr Laura Martinengo, a research fellow from NTU's Lee Kong Chian School of Medicine. 

“But also, we know that health professionals are not enough. So we need other ways to treat a larger amount of the population.”

Chatbots are especially useful as healthcare systems around the world are stretched and struggling to cope with rising demand for their services, said observers. Those who feel stigmatised may be more willing to open up to a machine than to another person. 

“Stigma is a big problem. I think when you don't feel well, probably even hearing it from a machine helps,” Dr Martinengo told CNA on Tuesday (Dec 20). 

“Also, sometimes, it's very difficult for people with mental health disorders to actually talk about these things, and to tell people they don't feel well.”

Some of the chatbots allow users to type in their feelings, while others guide them through a list of options. 

Dr Martinengo said that, judging from their user interfaces and responses, these chatbots seem to be oriented more towards younger users.

“They will use words like buddy or WhatsApp, or language that probably the younger people use. So (the young) seem to be the target user group,” she added. 

“They are able to ask for your name and obviously the system will remember your name, but there are not many other ways that the chatbots personalise the conversation.”

CHATBOTS HAVE LIMITATIONS

Chatbots have limitations and cannot be relied on to address all issues or help everyone, according to observers. 

“Because obviously a machine cannot do as many things as a human can do,” said Dr Martinengo, adding that it is important that healthcare professionals are still involved in the process.

She also stressed that there may be a need to scrutinise these digital health tools, given how easily they can be made available on mobile application stores.

Some could even dish out poor or irrelevant advice. 

“And if you look at those apps, they can even be dangerous,” she added. 

“The problem is that, at the moment, nobody is truly regulating the market, so everything is still out there.”

Source: CNA/ca(ja)