
An overview of digital mental health, Part II

Technology reading your thoughts? This would have been impossible in the past. Yet Marcel Just, PhD, a member of the psychology faculty at Carnegie Mellon University, has developed and researched functional MRI algorithms that can, quite literally, read your thoughts.

A conventional MRI takes structural pictures of the brain that can identify tumors, aneurysms and signs of dementia. A functional MRI (fMRI), the technology Just works with, tracks changes in blood flow within the brain. When one thinks of a house, for example, blood flow increases in the areas of the brain that process that concept, and the fMRI detects it. It is therefore possible to determine what an individual is thinking about by measuring blood flow to different regions of the brain. Just has discussed how this could eventually be used in forensic cases, where someone’s guilt could be assessed by this type of brain scan. He has stated that it may even be possible to develop a digital tool that could scan someone’s thoughts without the person being aware of it, in a retail store or in a park. When a producer from 60 Minutes interviewed him, Just was able to determine the producer’s thoughts with 100% accuracy.
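To make the idea concrete, the sketch below shows, in broad strokes, how this kind of "thought reading" works: a classifier learns which patterns of voxel activation correspond to which concepts. This is not Just's actual pipeline; the data are random stand-ins, and the library choices (NumPy, scikit-learn) and numbers are illustrative assumptions only.

```python
# Minimal sketch of concept classification from fMRI activation patterns.
# NOTE: random stand-in data, not real scans, so accuracy will hover near chance;
# with genuine fMRI data the same recipe can learn meaningful concept patterns.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_scans, n_voxels = 120, 500                      # 120 scans, 500 voxels each (made up)
X = rng.normal(size=(n_scans, n_voxels))          # voxel activation levels (stand-ins)
concepts = rng.choice(["house", "hammer", "apple"], size=n_scans)  # concept labels

# A linear classifier maps activation patterns to concept labels;
# cross-validation estimates how well the mapping generalizes to new scans.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, concepts, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```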

Although this technology has not yet been implemented in the legal system, healthcare fields such as radiology, oncology and dermatology have been utilizing artificial intelligence, or AI. In addition, functional magnetic resonance imaging can improve communication with a stroke patient who has lost verbal expression. By scanning the patient’s thoughts, an fMRI can discern cognitive and emotional symptoms, including those of depression. This has helped post-stroke patients with expressive aphasia, who are unable to verbally communicate with their clinical team.

The pandemic provided a gateway for new apps and digital tools. As previously noted, mental health digital tools have unfortunately not been researched adequately and, based upon current knowledge, no one tool appears better than another. There may also be redundancies among these apps and no clear benefit to using one version over another, since even free online tools may provide similar benefits.

Theoretically, however, mental health AI could provide assistance in diagnosing patients without bias. An article many years ago noted that psychiatrists were more likely to label someone with schizophrenia if they were of lower socioeconomic status or were African-Americans. This racial and demographic bias may occur on an unconscious level, and might be virtually impossible to circumvent. Digital tools would not succumb to these biases. In addition, they would be more accessible to all individuals in society, and help to identify and provide treatment for the marginalized, including those who are acutely suffering from severe illnesses such as schizophrenia.

A recent article written by thought leaders in the field of digital psychiatry noted that there were no FDA-cleared or approved mental health digital tools available. That has recently changed with Pear Therapeutics, a company that has developed prescription digital therapeutics, software tools that can be prescribed for psychiatric conditions.

However, despite FDA approval, John Torous, MD, director of digital psychiatry at Harvard, noted that these tools are supported by weak data and are not scalable. An email to a senior staff member at the FDA went unanswered.

However, natural language processing (NLP) can help extract data from patient-authored communications that discuss symptoms. NLP can analyze written and spoken language to determine whether a patient has psychiatric symptoms, such as depression. Calibrating these data can support the early development of precision medicine: information about depressive symptoms, for example, can help build a database targeted to a specific individual. A digital tool can collect this information and may be able to determine whether a patient has relapsed, using only their written or spoken language.
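As a rough illustration of what such symptom extraction might look like at its simplest, the sketch below flags patient-authored text containing depressive cue phrases. The cue list, the two-cue threshold and the function name are invented for illustration; production systems rely on validated clinical lexicons and trained models rather than a handful of keywords.

```python
# Toy sketch of keyword-based symptom flagging in patient-authored text.
# The cue phrases and the two-cue threshold are illustrative assumptions only.
DEPRESSION_CUES = {"hopeless", "worthless", "exhausted", "empty", "no interest", "can't sleep"}

def screen_text(note: str) -> dict:
    """Count cue phrases in a note and return a rough flag for clinician review."""
    text = note.lower()
    hits = sorted(cue for cue in DEPRESSION_CUES if cue in text)
    return {"cues_found": hits, "flag_for_review": len(hits) >= 2}

print(screen_text("I feel hopeless and empty, and I can't sleep most nights."))
# -> {'cues_found': ["can't sleep", 'empty', 'hopeless'], 'flag_for_review': True}
```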

Psychiatry has, unfortunately, been a field of medicine challenged by the lack of objective test results. Genetic screening for medication interactions, which examines a patient’s cytochrome P450 enzymes, the liver enzymes that metabolize psychiatric medications, can predict whether there will be interactions with a specific psychiatric medication. This is currently the only such clinical test available in mental health. A report will list those medications, such as stimulants for attention deficit hyperactivity disorder, anti-anxiety pharmaceuticals, antidepressants and antipsychotics, that can be prescribed safely, those that will possibly be tolerated, and those that will definitely cause a negative interaction. This tool does not indicate which medications will be effective; it is limited to providing data about medication interactions within a specific individual.
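The sketch below shows one hypothetical way such a report could be organized as data, with medications binned by predicted interaction risk. The gene, phenotype and category names are assumptions, and the medication entries are generic placeholders, not clinical guidance.

```python
# Hypothetical structure for a pharmacogenomic screening report.
# Gene, phenotype, categories and medication names are placeholders only.
pgx_report = {
    "patient": "example-001",
    "gene": "CYP2D6",                      # one of the cytochrome P450 genes
    "phenotype": "poor metabolizer",
    "medications": {
        "prescribe_safely": ["medication_A", "medication_B"],
        "possibly_tolerated": ["medication_C"],
        "negative_interaction": ["medication_D"],
    },
}

# Print the report one risk category at a time.
for category, drugs in pgx_report["medications"].items():
    print(f"{category}: {', '.join(drugs)}")
```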

In practice, however, these tools are not always accurate, and a clinician may prescribe an antidepressant, for example, that has been cleared by this clinical test yet still causes significant side effects.

Although quantitative scales can help determine whether an individual might be depressed, suicidal or experiencing obsessive-compulsive symptoms, much depends on whether clinicians have the time to administer them and whether patients are willing to take them and honest in their responses. Natural language processing, using tools that dissect written and spoken communication, has been able to scan social media sites and determine, based upon language, whether someone might be depressed.

Carnegie Mellon University, along with other universities, has been conducting research in this domain. Social media can also be scanned to identify an individual’s social groups, which can provide an index of one’s emotional support. Loneliness is correlated with mental illness, so the ability to map a person’s social groups, and with it their predilection toward depression, could be a welcome tool. However, privacy and confidentiality must accompany tools in this nascent field.

Furthermore, given the large number of languages that exist, much more work is needed to make NLP useful for the masses. There are NLP systems currently available for languages such as Spanish, Chinese, German, French and English. Other languages, such as Indonesian, Swahili and Bengali, have not yet been incorporated into NLP systems for language analysis.

In addition, prosody can confuse a computer or digital tool. The sentence “I am having a wonderful time” could be truthful and straightforward or, with the proper inflection, sarcastic, meaning the person is actually having a terrible time.

Machine analysis of consumer sentiment about large companies, such as those in the Fortune 500, allows computers to trade stocks autonomously, something that would have been impossible at the dawn of the computer age. Machine learning can likewise identify the sentiment and depressive symptoms of an individual and could potentially help screen for suicidal thoughts.
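In the same spirit, a toy text classifier can be trained to separate concerning from unconcerning posts, as sketched below. The four training posts, the labels and the model choice are invented for illustration; real screening systems are trained on large, clinically validated datasets and evaluated far more carefully.

```python
# Toy sentiment-style classifier for short posts; all examples and labels are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_posts = [
    "had a great day with friends",
    "excited about the new project at work",
    "nothing matters anymore, so tired of everything",
    "i feel alone and hopeless most days",
]
labels = ["low_concern", "low_concern", "high_concern", "high_concern"]

# TF-IDF turns each post into word weights; logistic regression learns the boundary.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_posts, labels)

print(model.predict(["feeling hopeless and alone lately"]))  # likely 'high_concern'
```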

It is important to utilize tools that can address suicidal ideation quickly, and even more effectively than a mental health practitioner can. A study by the Substance Abuse and Mental Health Services Administration demonstrated an extreme gap between making a suicide attempt and receiving follow-up care: a survey showed that almost 20% of suicide attempts evaluated in the emergency room had no subsequent follow-up. Another study noted that primary care physicians identify depression in their patients only about 42% of the time.

A different study found that almost a quarter of patients (24.6%) who attempted suicide had seen a mental health professional in the week prior to their attempt. Some reasons for this might include the limited time most psychiatrists have with follow-up patients, often a brief 15 minutes, much of which is taken up by checking boxes in an electronic health record. This has become so distracting that, anecdotally, some patients wonder whether their clinician is reading and writing emails instead of attending to their personal needs.

Another impediment arises when a practitioner asks a patient to complete a quantitative checklist that estimates suicide risk. Patients may choose not to be honest and forthcoming about their suicidal ideation, especially if they are serious about killing themselves. Stigma may also lead patients to cover up symptoms out of embarrassment or low self-esteem. Key words associated with depression and suicide, however, can enable NLP to determine that an individual is depressed even when the patient does not recognize how severe their symptoms actually are. Like a radiology app that might be more effective at diagnosis than a live radiologist, mental health digital apps may recognize depression in someone who does not appreciate the severity of their own symptoms.

Suicidal behavior is difficult to predict clinically, and even the most experienced and intelligent practitioners are fallible. Part of the reason is that each individual carries a cumulative burden of stress across their lifetime. Something seemingly minor might happen that stays under the radar of both patient and clinician, and in a limited time slot severe stressors may not be adequately processed; there just isn’t enough time in an expedited med check. The ability to mine patients’ data, with their permission, on social media such as Twitter and Facebook can reveal linguistic patterns indicating that a patient is at serious risk.

At a time when the pandemic has overwhelmed the supply of mental health clinicians, who were already taxed prior to COVID, artificial intelligence, natural language processing, and data mining of social media sites and online blogs might provide consistent, accurate and highly sensitive results that can help diagnose depression, predict the risk of suicide, and deliver highly accessible treatment after a suicide attempt that is not completed. The future is bright, and nonprofits like Rock Health are bringing separate disciplines together to discuss ways of breaking down the barriers to delivering evidence-based digital mental health solutions to the public.

— — —

This is part two of a three-part series.

CORRECTION: An earlier version of this column had a sentence that read: “Suicidal behavior is difficult to clinically predict and even the most experienced and intelligent practitioners can be infallible.” It was intended to say “fallible.” The Enterprise regrets the error.
