In recent years we have heard a lot about the potential of digital doctors and nurses: examples of AI becoming directly responsible for our welfare.
Although it is a logical next step once AI assists in diagnosis and in evaluating treatment paths, the digitization of medical professionals is something the broad public is not yet completely comfortable with.
But what if technology turns to mental health and starts digitizing not doctors but psychologists?
Digitizing Psychologists: Is It Possible?
The case for introducing AI into this field is strong: it is estimated that a quarter of the adult population has a mental disorder. According to the World Health Organization, depression alone affects some 300 million people around the world. The sad truth is that not everyone can ask for help. The obstacles include the stigma that still exists in society, the shortage of therapists, the price of therapy and, in some countries, the qualifications of specialists.
AI appears to offer many opportunities to help people maintain and improve their mental health. Currently, the most promising domains for the application of artificial intelligence techniques are computational psychiatry and the development of specialized chatbots that can provide counseling and therapeutic services.
Computational psychiatry, broadly defined, encompasses two approaches: data-driven and theory-driven.
Data-driven approaches
Data-driven approaches apply machine learning methods to high-dimensional data to improve disease classification, predict treatment outcomes, or improve treatment selection.
Theory-driven approaches
Theory-driven approaches use models that instantiate prior knowledge about the underlying mechanisms at multiple levels of analysis and abstraction. Computational psychiatry combines multiple levels and types of computation with multiple types of data to improve the understanding, diagnosis, prediction, and treatment of mental disorders.
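To make the data-driven approach concrete, here is a minimal sketch in Python of the kind of pipeline it describes: a regularized classifier trained on high-dimensional features (standing in for fMRI-derived or survey-derived measurements) to separate two diagnostic groups. Everything here is illustrative: the data is randomly generated, with a weak artificial "signal" planted in a few features, and is not drawn from any real study.

```python
# Sketch of a data-driven classifier on high-dimensional (synthetic) data.
# Real pipelines would use fMRI, survey, or behavioral features; here the
# data and labels are random stand-ins for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_subjects, n_features = 200, 500          # far more features than subjects is typical
X = rng.normal(size=(n_subjects, n_features))
y = rng.integers(0, 2, size=n_subjects)    # 0 = control, 1 = diagnosed (synthetic labels)
X[y == 1, :10] += 1.0                      # plant a weak signal in the first 10 features

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(C=0.1, max_iter=1000)  # L2-regularized: needed when p >> n
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The key design point the sketch illustrates is regularization: with many more features than subjects, an unconstrained model would simply memorize the training set, so the classifier must be penalized toward simpler solutions and evaluated on held-out subjects.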
Digitize the Diagnostics
Mental disorders are notoriously difficult to diagnose. Currently, diagnosis is based on observing symptoms that professionals have classified as mental health disorders and collected in the Diagnostic and Statistical Manual of Mental Disorders (DSM). However, in the absence of biomarkers, symptoms collected through observation often overlap across diagnoses. Furthermore, humans are prone to inaccuracy and subjectivity: what is a three on one person's anxiety scale may be a seven on another's.
One possible way for AI to assist, or even replace, human experts, proposed by a group at Virginia Tech, is to digitize these inhomogeneous responses against homogeneous criteria, combining fMRI neuroimaging with massive data collection: survey responses, functional and structural MRIs, behavioral data, and voice data from interviews and psychological assessments.
Another example is QuartetHealth, which analyzes patients' medical histories and behavioral patterns to uncover undiagnosed mental health problems. To illustrate the concept: Quartet can flag possible anxiety based on the fact that someone has been repeatedly tested for a non-existent heart problem.
Artificial intelligence can help researchers uncover the physical symptoms of mental disorders and track the effectiveness of various interventions in the body. It may also find new patterns in our social behavior, or show where and when a certain therapeutic intervention is effective, providing a model for digitizing preventive mental health treatment.
Digitizing therapeutic assistance
Similarly to somatic diseases, artificial intelligence algorithms can be used to evaluate the treatment of mental disorders, predict the course of the disease, and help select the optimal treatment path. Statistical modeling by extracting data from existing clinical trials can enable prospective identification of patients who may respond to a specific line of treatment.
Predict the best antidepressant
One example of machine learning in this area is the application of algorithms to predict the specific antidepressant with the best chance of success. Although physicians do not have empirically validated mechanisms for assessing whether a patient with depression will respond to a specific antidepressant, the effectiveness of treatment can be improved by matching patients to interventions.
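One way to sketch the idea of matching patients to interventions is to fit a separate outcome model per treatment on historical data, then recommend the treatment whose model predicts the highest response probability for a new patient. The sketch below does exactly that on entirely synthetic data; the drug names, features, and outcome rule are all invented for illustration and do not reflect any real trial.

```python
# Hedged sketch of treatment matching: one outcome model per hypothetical
# antidepressant, each fit on synthetic historical data. For a new patient,
# we recommend the drug whose model predicts the highest response probability.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
drugs = ["drug_A", "drug_B", "drug_C"]     # hypothetical treatment options
n, p = 300, 12                             # patients per (synthetic) arm, clinical features

models = {}
for i, drug in enumerate(drugs):
    X = rng.normal(size=(n, p))            # synthetic baseline features per patient
    # Synthetic ground truth: each drug "works" for a different feature profile.
    y = (X[:, i] + 0.3 * rng.normal(size=n) > 0).astype(int)
    models[drug] = GradientBoostingClassifier(random_state=0).fit(X, y)

def recommend(patient):
    """Return the drug with the highest predicted response probability."""
    probs = {d: m.predict_proba(patient.reshape(1, -1))[0, 1]
             for d, m in models.items()}
    return max(probs, key=probs.get), probs

patient = rng.normal(size=p)               # a new (synthetic) patient
best, probs = recommend(patient)
print("recommended:", best)
```

In practice the historical arms would come from clinical trials rather than random numbers, and the per-treatment models would need careful validation before influencing any prescribing decision; the sketch only shows the shape of the computation.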
Beyond the analysis of fMRI images, computational psychiatry faces ethical, spiritual, practical, and technological issues. For example, the huge archives of intensely personal data needed by the algorithms immediately raise the problem of cybersecurity. At the same time, however, the technology itself acts as a barrier between the individual, their personal data, and the consultant. That barrier can help overcome patients' fear of stigmatization and their reluctance to turn for help.
The idea of creating chatbots that provide immediate counseling services arose as a response to the lack of therapists and the embarrassment of patients. Patients, who are often reluctant to disclose problems to a therapist they have never met before, are believed to let their guard down with AI-based tools. Additionally, the lower cost of AI treatments compared to a psychiatrist or psychologist allows coverage to be extended to a wider circle of people requiring care.
The idea of digitizing (and simulating) conversations between a therapist and a patient dates back to the 1960s, when the MIT Artificial Intelligence Laboratory designed ELIZA, the grandfather of modern chatbots. Today, advances in natural language processing and the ubiquity of smartphones have brought chatbots to the fore in mental health care.
For example, the Ginger.io app offers video and text-based therapy and coaching sessions. By analyzing past assessments and real-time data collected via mobile devices, it can help specialists track patient progress, identify moments of crisis, and develop personalized care plans.
Another example is Woebot, a computer program integrated into Facebook Messenger that aims to digitize and replicate conversations between a patient and a therapist. The digital health tool asks about your mood and thoughts, "hears" how you feel, learns about you, and offers evidence-based cognitive behavioral therapy (CBT) tools. The first randomized controlled trial of Woebot showed interesting results: after just two weeks, participants experienced a significant reduction in depression and anxiety.
The next generation of chatbots will feature avatars that can detect non-verbal cues and respond accordingly. One such virtual therapist, named Ellie, was launched by the Institute for Creative Technologies (ICT) at the University of Southern California. Its purpose? Treating veterans suffering from depression and post-traumatic stress disorder. Ellie works by using several algorithms that determine her questions, movements, and gestures. The program tracks 66 points on the patient's face and detects the patient's speech rate and the length of pauses. Ellie's actions, movements, and language mimic those of a real therapist, but only up to the point where she would start to seem too human.
Prevent social isolation
Another problem that can be addressed by AI-powered chatbots is the extreme social isolation and the difficulty in building close social relationships experienced by people suffering from mental illness. Combined with online social networks, such chatbots can foster a sense of belonging and encourage positive communication. The National Center of Excellence in Youth Mental Health in Melbourne, Australia, has launched the Moderate Online Social Therapy (MOST) project, which aims to help young people recovering from psychosis and depression. The technology digitizes a therapeutic environment in which young people learn and interact, and also serves as a platform for practicing therapeutic techniques.
Recent developments suggest that we will soon face an artificial intelligence revolution in mental health, one that will bring better access and better care at affordable cost. However, if AI builds models of mental health disorders, aren't we also building a model of normality? And if so, who gets to define what is "normal", and will that model be used as a tool or as a club?
What we should remember when we apply artificial intelligence to the study of our minds is to be careful not to reduce personality to a combination of quantifiable factors, and not to demystify mental disorders to the point of finding a problem in every idiosyncrasy.
Bianca Stan is a law graduate, a writer with several books published in Romania, and a journalist for the group "Anticipatia" (Bucharest). She focuses on the impact of exponential technologies and military robotics, and on their intersection with global trends, urbanization, and long-term geopolitics. She lives in Naples.