The default sentiment about where mental health and technology meet may not be positive. For many people, social media is the primary place the two intersect, and that tends to bring negative associations: filters that encourage body dysmorphia, incessant lifestyle comparison, and unwelcome exposure to the opinions of too many strangers. The positive connections between mental health and technology are often overlooked. Social media can also serve as an abundant, affordable resource for learning about all things mental health (check out Remedy's Instagram and our founder Dr. Kirsten Thompson's TikTok), and apps like Most Days provide simple, convenient ways for people to begin tending to their mental health. Though it may not be the most immediate assumption, technology can benefit mental health in several ways, and today's blog topic is another example of that. In this blog, we will discuss what artificial intelligence (AI) is, the latest trends with AI in mental health care, and the challenges researchers continue to face in producing working clinical applications of AI in mental health care.

 

What is Artificial Intelligence (AI)?

In the simplest terms, artificial intelligence combines computer science and data science to create systems or machines that mimic the problem-solving and decision-making capabilities of the human mind. Common examples of AI include chatbots on websites that provide faster customer service, automatic speech recognition, which converts human speech into text, and recommendation engines on platforms like Netflix, which suggest titles based on a user's past viewing habits. An important subset of AI is machine learning. While artificial intelligence broadly refers to the ability of computers to mimic certain capabilities of the human mind, machine learning specifically refers to systems that learn and improve their performance based on the data they consume.
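For readers curious about what "learning from data" actually looks like, here is a toy sketch in Python. The shows, features, and numbers are invented purely for illustration and are not taken from any real recommendation engine.

```python
# A minimal sketch of "learning from data" (hypothetical numbers, not a real system).
# Each row describes a show a user already rated: [runtime_minutes, is_documentary, is_comedy],
# and the label says whether the user finished it (1) or abandoned it (0).
from sklearn.tree import DecisionTreeClassifier

past_shows = [
    [30, 0, 1],   # short comedy, finished
    [45, 0, 1],   # comedy, finished
    [90, 1, 0],   # long documentary, abandoned
    [120, 1, 0],  # long documentary, abandoned
]
finished = [1, 1, 0, 0]

# "Learning" here just means fitting the model's internal rules to the examples above.
model = DecisionTreeClassifier()
model.fit(past_shows, finished)

# The fitted model can now score a show the user has never seen.
new_show = [[35, 0, 1]]  # a short comedy
print(model.predict(new_show))  # -> [1]: likely to be finished
```

The point of the sketch is only the workflow: the system is not given explicit rules about what the user likes; it infers them from past behavior and applies them to new cases.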

Especially in the field of medicine, the point of AI is not to replace humans, but to enhance and optimize human capabilities. For example, an AI-powered tool built by the Icahn School of Medicine at Mount Sinai called Deep Patient allows doctors to identify high-risk patients before diseases are even diagnosed. Deep Patient analyzes a patient's medical history to predict almost 80 diseases up to one year prior to their onset. Similar applications of AI are being explored in mental health. A study published in 2021 used machine learning to identify two potential subtypes of post-traumatic stress disorder (PTSD), as well as their potential biological features, based on consolidated datasets that included self-reported symptom severity surveys, blood tests, medical histories, patient demographics, and more. These are just two examples of how the optimization power of AI is not reserved for fields like business and engineering; it also has important implications for human well-being.
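To give a rough sense of how machine learning can surface subtypes within consolidated data, here is a simplified clustering sketch. The features, values, and patient records are made up for illustration and are not drawn from the actual study.

```python
# A rough sketch of how clustering might surface candidate "subtypes" in consolidated data.
# All features and numbers below are invented for illustration.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Each row is one hypothetical patient:
# [symptom_severity_score, inflammatory_marker, sleep_quality_score]
patients = np.array([
    [82, 4.1, 2.0],
    [78, 3.9, 2.5],
    [35, 1.2, 7.0],
    [40, 1.5, 6.5],
    [85, 4.3, 1.8],
    [33, 1.1, 7.2],
])

# Put the features on a comparable scale, then ask the algorithm for two groups.
scaled = StandardScaler().fit_transform(patients)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(labels)  # e.g. [0 0 1 1 0 1]: two candidate subtypes for researchers to investigate further
```

In real research the datasets are far larger and messier, but the underlying idea is the same: let the algorithm group patients by patterns in the data, then examine whether those groups correspond to meaningful clinical or biological differences.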

 

Latest Trends with AI in Mental Health Care

Though the future of AI in mental health care looks promising and revolutionary, it remains in an exploratory state. AI is being studied as a way to improve mental health care on several fronts, such as optimizing patient evaluation, reducing the effects of human bias and error, individualizing patient diagnosis, expanding access to psychiatric care, and predicting relapse. Because using AI in mental health care is a relatively new endeavor for researchers, and there are so many avenues of benefit to explore, there is not yet enough research to support the widespread adoption of any direct clinical applications.

One of the leading areas of study for the use of AI in mental health care is digital phenotyping. In genetics, a phenotype is a person's complete set of observable characteristics, such as hair color or height: what a person expresses on the outside as a result of their genetic makeup on the inside. By analogy, a digital phenotype is the set of observable characteristics a person demonstrates through their digital behavior as a result of their internal state. Researchers are refining methods of digital phenotyping so that people can passively provide insight into their mental and emotional state through what they express in their digital behavior. What is particularly interesting about digital phenotyping is that it is not only observable, it is measurable. Screen time, keyboard activity, and app usage are just a few examples of digital behavior data that can help capture a more accurate picture of a person's mental state than self-report alone.
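As a simple illustration of how raw digital behavior might be turned into measurable features, consider the hypothetical sketch below. The log format, feature names, and numbers are our own assumptions and are not taken from any particular research platform.

```python
# A simplified sketch of turning raw phone-usage events into measurable daily features.
# The event format and feature choices are assumptions for illustration only.
from datetime import datetime

# Hypothetical daily log: (app_category, seconds_of_use)
usage_log = [
    ("messages", 1200),
    ("social", 5400),
    ("maps", 300),
    ("social", 2400),
]

total_screen_time = sum(seconds for _, seconds in usage_log)
social_time = sum(seconds for app, seconds in usage_log if app == "social")

features = {
    "date": datetime.now().date().isoformat(),
    "screen_time_hours": round(total_screen_time / 3600, 2),
    "social_share": round(social_time / total_screen_time, 2),
    "app_switches": len(usage_log),
}
print(features)
# e.g. {'date': '...', 'screen_time_hours': 2.58, 'social_share': 0.84, 'app_switches': 4}
```

Day-over-day changes in features like these, rather than any single value, are what researchers hope will reflect shifts in a person's underlying mental state.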

 

Project mindLAMP 

In discussing the implications of AI in mental health care, the IEEE Engineering in Medicine and Biology Society highlighted two interesting ongoing projects: mindLAMP and BiAffect. Project mindLAMP (Learn, Assess, Manage, Prevent) is a neuropsychiatric research app that uses smartphone sensors to help predict and understand people's lived experience of mental illness and recovery. It was initially conceptualized to monitor the mental health of people with schizophrenia, but the success and versatility of the app have expanded its use to other conditions, such as Alzheimer's disease and chronic depression. The mindLAMP app can collect several kinds of data from its users, including moment-to-moment survey responses, cognitive tests, GPS coordinates, and physical activity information. How much information the app gathers can be customized according to what a patient and their mental health care provider agree upon. Because AI can consolidate several streams of data, recognize patterns within them, and make predictions from those patterns, there is hope that mindLAMP can also be used to predict relapse of serious mental illness. The hope is that AI algorithms will eventually be able to assess a patient's risk of relapse and notify their mental health care provider before the patient is even fully aware of any significant symptomatic changes.
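To make the general idea concrete, here is a hypothetical sketch of how several data streams might be consolidated and compared against a patient's own recent baseline to flag possible relapse risk. The fields, thresholds, and numbers are invented for illustration and do not reflect mindLAMP's actual algorithms.

```python
# A hypothetical sketch: merge several daily data streams and flag days that
# deviate sharply from the patient's own recent baseline. Invented values only.
import statistics

# One record per day, merging streams an app of this kind might collect.
daily_records = [
    {"survey_mood": 7, "steps": 6200, "places_visited": 5},
    {"survey_mood": 7, "steps": 5800, "places_visited": 4},
    {"survey_mood": 6, "steps": 6400, "places_visited": 5},
    {"survey_mood": 3, "steps": 900,  "places_visited": 1},  # today
]

baseline = daily_records[:-1]
today = daily_records[-1]

def deviates(key, threshold=0.5):
    """True if today's value falls below half the patient's own recent average."""
    avg = statistics.mean(day[key] for day in baseline)
    return today[key] < avg * threshold

# If several streams drop at once, notify the care provider for follow-up.
flags = [key for key in ("survey_mood", "steps", "places_visited") if deviates(key)]
if len(flags) >= 2:
    print("Possible relapse signal, notify provider:", flags)
```

Real systems would rely on far more sophisticated models, but the principle is the same: patterns across multiple streams, judged against the individual's own history, can raise an early warning for a clinician to review.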

 

Project BiAffect

Similar to mindLAMP, Project BiAffect uses an app to gather information from certain smartphone sensors. More specifically, BiAffect runs AI algorithms on a person's keyboard metadata to predict manic and depressive episodes in people with bipolar disorder. Examples of keyboard metadata include variability in typing dynamics, mistakes, pauses, and use of the backspace key. In short, the app can make informed predictions about a person's mood episodes from their typing patterns without collecting what they actually type. The concept was supported by a small pilot study that found associations between certain mood disturbances and specific changes in keyboard metadata, such as faster typing during manic episodes and shorter messages during depressive episodes.
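As a rough illustration of the kind of content-free features involved, here is a hypothetical sketch. The event format and feature choices are our own assumptions, not BiAffect's actual methods.

```python
# A minimal sketch of keyboard-metadata features of the kind described above.
# It looks only at timing and key categories, never at the text itself.
# The event format and feature choices are assumptions for illustration.

# Hypothetical keystroke log: (seconds_since_previous_key, key_category)
keystrokes = [
    (0.00, "char"), (0.12, "char"), (0.10, "char"), (0.80, "backspace"),
    (0.15, "char"), (0.11, "char"), (1.90, "char"), (0.14, "char"),
]

gaps = [gap for gap, _ in keystrokes[1:]]
features = {
    "avg_interkey_interval": round(sum(gaps) / len(gaps), 2),
    "long_pauses": sum(1 for gap in gaps if gap > 1.0),
    "backspace_rate": sum(1 for _, key in keystrokes if key == "backspace") / len(keystrokes),
}
print(features)
# e.g. {'avg_interkey_interval': 0.47, 'long_pauses': 1, 'backspace_rate': 0.125}
```

Notice that nothing about the words themselves is stored; only how quickly, how steadily, and how often corrections are made, which is exactly why this approach is appealing from a privacy standpoint.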

  

A staggering 47 million Americans (approximately 19%) experienced a mental illness in 2021, a figure that is likely to rise in the wake of the COVID-19 pandemic. With nearly 40% of Americans living in areas without enough mental health professionals to meet the community's needs, the importance of developing solutions to the nation's declining mental health is abundantly clear, and AI may offer just that. Project mindLAMP and Project BiAffect are two examples of how AI may be used to optimize mental health care and extend it to more people in the future. They each demonstrate the potential for AI to revolutionize psychiatric care, both in maintenance and in prevention. Every day, smartphone users generate mountains of digital information on their devices, and AI presents a promising method for not only sifting through it all, but also drawing informative and useful conclusions from the patterns it recognizes. In addition to its convenience, AI can help reduce human bias and error, since people's digital behavior is analyzed by an algorithm, making the process more standardized and the conclusions more objective.

 

Challenges for AI in Mental Health Care

Along with the revolutionary possibilities AI presents for mental health care, there are field-specific challenges that are likely to delay the widespread adoption of AI among patients and mental health care providers alike. First, and most obviously, there are elements of personal psychiatric care, particularly psychotherapy, that cannot be emulated by AI. Elements of human-to-human care, such as empathy and compassion, are often important factors in the efficacy of a person's treatment. Even the most advanced AI algorithm can only generate empathetic or compassionate sentences, and a patient's perception of those statements as automated can quickly strip them of their therapeutic value. As a physician quoted in the IEEE Engineering in Medicine and Biology Society review aptly stated, "Even the best technology will fail if the end users are not open to it."

The final challenge we'll discuss is less an inherent limitation of AI and more a limitation of mental health care as a whole. Although AI can standardize assessment and diagnosis in psychiatry, psychiatric conditions themselves are not standardized. Patients with the same psychiatric diagnosis may present very differently, and biological markers for psychiatric conditions are even more ambiguous than their outward symptoms. Symptom variation exists among diagnoses of the same physical condition as well, but it is typically much smaller in scale, and thus easier for AI algorithms to overcome. So while AI is making significant strides in early detection and diagnosis of physical conditions, whose diagnoses tend to follow stricter patterns, more research is needed to translate the potential of AI for psychiatric care into realized clinical applications.

References

Oracle. (n.d.). What is AI? Learn about artificial intelligence. https://www.oracle.com/artificial-intelligence/what-is-ai/
Siegel, C. E., Laska, E. M., Lin, Z., et al. (2021). Utilization of machine learning for identifying symptom severity military-related PTSD subtypes and their biological correlates. Translational Psychiatry, 11, 227. https://doi.org/10.1038/s41398-021-01324-8
Allen, S. (2020, June). Artificial intelligence and the future of psychiatry. IEEE Engineering in Medicine and Biology Society, Pulse. https://www.embs.org/pulse/articles/artificial-intelligence-and-the-future-of-psychiatry/