Patients Telling Digital Healthcare Apps of Suicidal Thoughts

By DocWire News Editors - Last Updated: April 11, 2023

Patients are beginning to confess thoughts of suicide to digital health apps designed to let them chat with doctors or receive preliminary diagnoses. These apps have become an effective means of managing conditions such as diabetes and translating symptoms into a diagnosis, but many health startups are struggling to handle dilemmas that even suicide hotlines can find difficult.


“To be honest, when we started this, I didn’t think it was as big an issue as it obviously is,” said Daniel Nathrath, CEO of Ada Health, whose AI-powered app has been ranked the number one medical app in 130 countries.

The Europe-based company's chatbot was designed to give consumers potential diagnoses based on their symptoms and has completed over 10 million health assessments since its 2016 debut. In roughly 130,000 of those assessments, Ada Health says, the user told the chatbot about suicidal thoughts or tendencies.

Patients expressing thoughts of self-harm is one of several adverse situations that many digital health startups did not anticipate. American Well, a telemedicine company, once experienced a situation in which a physician had to call the police to report domestic violence seen during a video visit.


Though that situation was an extreme example of a crisis surfacing through a digital health platform, disclosures of suicidal thoughts are far more common. This is not surprising: research has shown that people are more likely to confess distressing thoughts to a computer than in person.

“People are going to express their suicidality,” said April Foreman, a psychologist who works with the Veterans Crisis Line run by the Department of Veterans Affairs. “We’ve destigmatized it. What we’ve not done is prepared everybody [to respond].”

In the Ada Health app, a user may answer a series of questions that begins with “Do you feel tired today?”, moves on to questions about depression, and ultimately arrives at questions about suicide, at which point the app instructs the user to seek help immediately.
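To illustrate how such an escalating flow can work, here is a minimal sketch in Python, assuming a fixed question list and a simple escalation rule; the question wording, ordering, and crisis message are hypothetical and do not reflect Ada Health's actual logic.

# Hypothetical sketch of an escalating symptom check, loosely modeled on the flow
# described above. All question text and the escalation rule are illustrative
# assumptions, not Ada Health's implementation.

CRISIS_MESSAGE = (
    "If you are having thoughts of suicide, please seek help immediately, "
    "for example by contacting a local crisis line or emergency services."
)

# Questions ordered from general fatigue to depression to suicide risk.
QUESTIONS = [
    ("Do you feel tired today?", "fatigue"),
    ("Have you felt down or depressed most days recently?", "depression"),
    ("Have you had thoughts of harming yourself or ending your life?", "suicide_risk"),
]

def run_assessment(answers: dict[str, bool]) -> str:
    """Walk the questions in order; stop and escalate if suicide risk is reported."""
    for question, topic in QUESTIONS:
        if not answers.get(topic, False):
            break  # symptom absent, so the more serious follow-ups are not asked
        if topic == "suicide_risk":
            return CRISIS_MESSAGE  # escalate: direct the user to immediate help
    return "Assessment complete. Review the suggested conditions with a clinician."

if __name__ == "__main__":
    # Example: a user reporting fatigue, depression, and suicidal thoughts.
    print(run_assessment({"fatigue": True, "depression": True, "suicide_risk": True}))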

Some users continue the conversation after the chatbot session ends, emailing suicidal thoughts to Ada’s customer support team. These workers must then handle questions far outside their job descriptions, typically by recommending mental health resources.


Most digital health companies have plans in place for patients who share thoughts of suicide. Although such cases make up a small percentage of interactions, the stakes are high enough that most executives feel a protocol is necessary. Interventions often include directing patients to suicide hotlines, having them call a friend, or encouraging them to go to an emergency department. In higher-risk scenarios, companies are often prepared to involve police or emergency medical services, regardless of patient consent.

These digital health platforms offer a convenient and often effective means of delivering health care, but many feel they are not yet equipped to handle thoughts of suicide. Dr. Peter Antall, CMO at American Well, believes medical chatbots have great potential but worries that suicidal thoughts can go undetected when no physician sees and hears the patient. American Well focuses on patient-physician interactions via video, or over the phone if the connection is poor.

“Given the acuity and the seriousness of somebody potentially trying to kill themselves, I don’t believe that any of those other technologies are there at this point,” said Antall.


Source: STAT
