An overview of digital mental health, Part III
When Elizabeth Holmes, a Stanford University dropout, founded the blood-testing company Theranos, she claimed it could process blood from a fingerstick and still accurately deliver hundreds of test results. This turned out to be a fraud, and a company once valued at $9 billion was ultimately acknowledged to be worth nothing. Such stalwarts as Henry Kissinger and George Shultz sat on the company board, and Walgreens had signed an agreement to carry the blood-testing technology in its stores. Holmes has since been successfully prosecuted and is awaiting sentencing.
As Seth Feuerstein, MD, JD, founding board member of the Center for Biomedical and Interventional Technology at Yale and Executive Director of the Yale Center for Digital Health, Innovation and Excellence, notes, digital health tools face not only evidence-based requirements but also a demand for truthful, non-fraudulent business practices. He discussed how the mental health app Cerebral has been charging subscribers for non-prescribed services, and he advised me to review the information available online about Cerebral and to help warn the public.
I had enrolled as a Cerebral subscriber a few months earlier, prior to this interview with Feuerstein, in order to evaluate the service. Even though I never used the service, and never spoke to a therapist or psychiatrist, a monthly fee of $85 was still charged to my bank account. Many others have reported similar experiences.
A customer response posted on the Better Business Bureau site declared: “I started Cerebral on May 4, 2022. I feel that I have wasted money ($195 thus far) and have not received the appropriate treatment. I was started off with a medication not directed towards my concerns/symptoms and was not provided the correct questionnaire at the beginning of my treatment, which would have allowed me to start the proper medication at a timely manner. Eventually I was told to stop taking that medication as it contraindicated with another medication I take for my migraines. I had mentioned to the prescriber that I was taking this other medication during my very first meeting, yet she did not mention any contraindications at that time. Additionally, I have been having much difficulty reaching a prescriber to start my new medication. I have directly reached out to the prescriber several times over the past week and have not heard back. It has now been over a week since I have been unable to reach the prescriber and almost 2 weeks since I have not been taking any medications, yet continuing to pay the membership fees. Aside from this, my file is never up to date…”
Feuerstein stated that it is important to become familiar with the Cerebral consumer complaints, and that consumers need data and information about the research behind an app in order to determine whether it is evidence-based. Although evidence may be lacking, or not yet available, in this increasingly crowded space, he pointed to one app that does have data and evidence behind it: he believes the evidence for Somryst, a prescription digital therapeutic for insomnia, is quite robust. Somryst is used to treat chronic insomnia and is the only prescription digital tool for chronic insomnia on the market. Pear Therapeutics, on the Somryst website, is candid that this is not a treatment for everyone, warning that it is contraindicated for patients with “any disorder exacerbated by sleep restriction, such as bipolar disorder, schizophrenia, and other psychotic spectrum disorders.” Other contraindications include epilepsy, pregnancy, risk for falls, and untreated obstructive sleep apnea.
Research on technology and behavioral health suggests that technology is more likely to advance mental healthcare than new psychological concepts or theories are. In 2008, a collaborative research group, which eventually became LYSSN, began evaluating the measurement of empathy and evidence-based interventions. The purpose was to develop technological tools to augment, but not replace, therapy. Feuerstein states that LYSSN is working with machine learning and natural language processing to assess the effectiveness of diagnosis and treatment beyond what a human being might be capable of. It can also evaluate each session for patient engagement and discern which evidence-based tools were introduced and which were lacking. It might also provide predictive modeling to estimate the likely outcome of a specific therapeutic intervention. The accuracy of these predictions could ultimately yield a sensitive suicide-warning index based upon AI-driven parsing of language.
Anecdotally, Yale students, including myself, were frequently mesmerized by Yale College psychiatry guru Robert Arnstein’s supervision sessions, in which he was able to predict what a patient would discuss in a follow-up session based solely on process notes from the present one. While this may be a gift limited to a very small set of clinicians, NLP promises to make this predictive skill available in every session where language is analyzed. A 2017 research article by Imel et al. notes that “it is also possible that machine learning models will improve our ability to predict response to psychotherapy, but may not necessarily improve our understanding.” This echoes behaviorist B. F. Skinner’s admonition that all that matters in behavior is the stimulus and the response, and that the thinking, the “inside the black box” analysis, is immaterial. While a repeatable response to a stimulus may prove the reflex arc, it will ultimately be helpful to understand the cognition inside a stimulus-response volley.
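To make the idea of machine analysis of session language concrete, here is a minimal toy sketch. The keyword rules and the word-count engagement metric below are illustrative stand-ins invented for this example; real systems such as LYSSN use trained machine-learning models, not hand-written rules like these.

```python
# Toy illustration of session-transcript analysis. The cue list and
# metrics are simplistic stand-ins, not any vendor's actual method.
REFLECTION_CUES = ("it sounds like", "you feel", "you're saying")

def label_therapist_turn(text: str) -> str:
    """Crudely label a therapist utterance as a reflection, question, or other."""
    lowered = text.lower()
    if any(cue in lowered for cue in REFLECTION_CUES):
        return "reflection"
    if lowered.rstrip().endswith("?"):
        return "question"
    return "other"

def engagement_ratio(session: list) -> float:
    """Share of session words spoken by the patient (a toy engagement proxy)."""
    patient = sum(len(text.split()) for speaker, text in session if speaker == "patient")
    total = sum(len(text.split()) for _, text in session)
    return patient / total if total else 0.0

# Hypothetical two-exchange transcript.
session = [
    ("therapist", "How has your sleep been this week?"),
    ("patient", "Honestly, terrible. I lie awake worrying about work for hours."),
    ("therapist", "It sounds like the worry itself is keeping you up."),
    ("patient", "Yes, exactly. Once I start, I can't stop."),
]

labels = [label_therapist_turn(text) for speaker, text in session if speaker == "therapist"]
print(labels)  # ['question', 'reflection']
print(round(engagement_ratio(session), 2))  # 0.51
```

A production system would replace the cue matching with models trained on annotated sessions, but the pipeline shape, labeling therapist behaviors and scoring patient engagement turn by turn, is the same.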
Another difficulty with mental health apps is that they are not always engaging or motivating enough to drive initial completion or return visits. Skip Rizzo, the director of Medical Virtual Reality at USC, notes that “we are working to develop an app to treat burnout in populations including medical clinicians.” About 400 medical students and physicians die by suicide each year, the equivalent of an entire medical school class.
Virtual reality can simulate situations that would be too dangerous or impractical to create in real life. A medically traumatic event or a fear of great heights can be recreated by immersing a patient in a virtual reality simulation with a headset. The simulation triggers the same panic that occurs in reality, but with no risk of the patient jumping or falling. Gradually exposing a patient to varying intensities can help extinguish the phobia, so that in real life the patient might experience less anxiety and panic. The next time the patient is in a skyscraper, traveling on a plane, or coping with an extreme medical emergency, the anxiety response should be significantly mitigated.
SerenityDTx is a digital virtual reality tool that may help with dementia and agitation, with an anxiety-moderating effect that helps people feel better and less ruminative. I interviewed Dr. Stephanie Yarnell-Mac Grory, the chief medical officer; Paul McCrae, MBA, the CEO; and Dr. Dion Neame, the chief medical advisor and chair of the advisory board.
The digital tool may also prove helpful in a wide variety of situations, with simulations that mimic the variables of outside reality. For example, a crowded grocery store, which might trigger a phobia in a susceptible individual, can be easier to interact with as a three-dimensional simulation. Exposure to the simulated stimulus can still induce a panic episode that mimics a real trip to a grocery store, but VR, unlike a real venue, can dial down the stimulus and help people become more comfortable with their triggers. When later visiting a brick-and-mortar grocery store, the patient may have fewer negative responses as a result of the VR intervention.
Rizzo stated that many people feel more comfortable speaking to a chatbot, especially when they are accurately informed that the only interaction occurring is between them and a computer. Rizzo notes that “people may be more comfortable speaking with an avatar when it doesn’t represent a live human.” Fear of being judged is mitigated when the software interacts directly with the patient and no humans are involved. Furthermore, if symptoms are severe, the avatar can be programmed to recommend, in a low-key manner, that the patient speak to a live therapist; this graded and measured response is offered only when symptoms suggest imminent danger. The pandemic has dramatically increased mental health issues among the public and among the medical professions, and an app that is engaging and offers more anonymous treatment might appeal to many people.
The World Health Organization estimates that almost half a billion people worldwide suffer from mental illness, and a large percentage of them have never seen a therapist. This could be related to cultural determinants and to fear of being stigmatized for seeking psychological help. If virtual reality and other apps can reach even 1% of this population, Rizzo notes, some 4.5 million people could be treated and helped.
Rizzo notes that “the future is bright for virtual reality and other apps,” and that this domain will only improve and offer far greater benefits to patients of all demographics.
Rizzo also notes a very deep need for medical publications, such as this article, that report on research in virtual reality and other apps to be translated for the public. The Plain Writing Act of 2010, signed by then-President Barack Obama, requires all federal agencies to use plain and clear language when communicating with the public. Rizzo notes that such translations can improve patients’ knowledge and help empower them to make better-educated decisions about their mental health.