
How AI is transforming the practice of psychiatric therapy

Artificial intelligence (AI) refers to the development of computer systems capable of performing tasks that typically require human intelligence. It encompasses techniques that imitate aspects of human behaviour, including machine learning algorithms, natural language processing, speech processing, robotics, and other automated decision-making processes.

Over the past few years, AI has taken over tasks once performed by humans in many professional fields. There are now robots delivering groceries and working on assembly lines in factories, while AI assistants schedule meetings and answer customer-service phone lines in various organisations.

In 2017, the technology giant IBM predicted that AI would transform the delivery of mental health care over the following five years by helping clinicians better predict, monitor and track conditions, and that “what we say and write will be used as indicators of our mental health and physical wellbeing” [1].

True to this prediction, artificial intelligence (machine learning in particular) is being used in the detection, diagnosis and development of treatment solutions for psychiatric therapy and mental health care in general.

Before the integration of AI, psychiatric therapy relied entirely on human expertise, traditional therapeutic techniques, and manual processes for diagnosis, treatment, and patient care. While effective, it lacked the efficiency, personalization, and data-driven insights that AI now offers. Psychiatrists relied on interviews, questionnaires, and standardized psychological assessments to diagnose mental health conditions. Diagnostic manuals, like the Diagnostic and Statistical Manual of Mental Disorders (DSM) and International Classification of Diseases (ICD), were key references.

Though pre-AI psychiatric therapy was time-consuming and heavily dependent on the psychiatrist’s expertise and intuition, the importance of the human connection between patient and therapist cannot be overemphasised. That connection was pivotal in building strong therapeutic alliances, fostering trust and empathy, and keeping therapists directly involved with their clients.

Since the integration of AI into psychiatric therapy, artificial intelligence has proven helpful in diagnosing various kinds of mental illness through methods hitherto unavailable to human therapists. For instance, AI can access relevant information about a patient from various sources (medical records, social media posts, internet searches, wearable devices, etc.), quickly analyse and combine the different datasets it has gathered, identify relevant patterns in the data, and consequently help diagnose mental illness [2].

AI has been beneficial to mental healthcare in three main ways:

  1. Personal sensing (digital phenotyping): This is the use of digital data to measure and monitor mental health. AI can analyse social media posts and medical records to detect behavioural changes it has learnt to associate with mental health issues. For example, if a person who wears a smartwatch to measure physical activity suddenly becomes less active, an AI system may flag this as a possible sign of depression, since lethargy and lack of motivation are depressive symptoms.
  2. Natural language processing algorithms: This is the use of language in conversations (chats, emails, social media posts) to detect patterns associated with mental health issues such as depression or anxiety. Most personal devices contain a significant amount of personal data, making them a convenient and practical tool for detecting language patterns that can be linked to certain mental health conditions. With them, changes in language and typing can also be used to monitor a patient’s mental health and recovery progress.
  3. Chatbots: This is the use of questions to detect mental health issues. The chatbot asks about the patient’s mood, stress levels, energy levels, sleep patterns, etc., much as a medical practitioner would; it then analyses the patient’s answers and suggests appropriate interventions. Where there are immediate concerns for the safety of the patient or those close to them, the chatbot can send an alert to the patient’s doctor.
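The personal-sensing idea in (1) can be sketched in a few lines of code: compare a person’s recent activity against their own baseline and flag a sustained drop. The function name, window sizes, and threshold below are hypothetical illustrations of the concept, not a clinical tool or any vendor’s actual algorithm.

```python
from statistics import mean

def flag_activity_drop(daily_steps, baseline_days=14, recent_days=7, drop_ratio=0.5):
    """Flag a sustained drop in activity relative to a personal baseline.

    daily_steps: list of daily step counts, oldest first.
    Returns True if the recent average falls below drop_ratio times the
    baseline average. A toy heuristic, not a clinical screening tool.
    """
    if len(daily_steps) < baseline_days + recent_days:
        return False  # not enough history to establish a baseline
    baseline = mean(daily_steps[-(baseline_days + recent_days):-recent_days])
    recent = mean(daily_steps[-recent_days:])
    return recent < drop_ratio * baseline

# A stable fortnight (~8000 steps/day) followed by a week near 3000 steps
history = [8000] * 14 + [3000] * 7
print(flag_activity_drop(history))  # True: recent average is under half the baseline
```

A real system would, of course, combine many such signals (sleep, typing patterns, location variance) rather than relying on a single threshold.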

The integration of AI has transformed psychiatric therapy in many ways, including:

  1. Alleviating the Effect of the Global Shortage of Mental Health Workers: According to the World Health Organization (WHO), there is a global shortage of 4.3 million mental health workers, and the shortfall in low- and lower-middle-income countries is estimated to reach 10 million by 2030. AI diagnosis and treatment of patients, through apps easily installed on a smartphone or through chatbots that aid psychotherapy, provide a minimum standard of care to a much wider portion of the population. This should ease the impact of the global shortage of therapists, as AI makes intervention available to anyone who owns a smartphone and can access the internet [3].
  2. Improving Objectivity and Reducing Bias: Even though impartiality and objectivity are the ethos of all healthcare practitioners, most humans are prone to partiality. A human practitioner is likely to find it more difficult than AI to disregard certain information about a patient and focus exclusively on the symptoms.
  3. Creating Awareness of Individual Mental Health Status: One reason people do not seek help when they suffer from mental health issues is that they are often unaware of their mental health status. AI-based tools help make people more aware of their status and more amenable to seeking professional help. Apps that monitor behavioural patterns can advise users to seek medical help when they may be suffering from conditions they had no previous knowledge of, often resulting in the early detection of conditions that would otherwise have gone unnoticed.
  4. Reducing the Impact of Mental Illness Stigmatisation: Providing help without requiring the patient to disclose their challenges to another human being can benefit patients who struggle with the social stigma associated with mental illness. For these people, virtual mental health therapists or chatbots can provide the mental health support they require, as well as diagnoses and recommended therapies.
  5. Predicting Suicide: According to a meta-analysis covering 365 studies published over the past 50 years, psychiatrists are only marginally better than chance at predicting suicide [4]. AI researchers have developed an algorithm that they claim can predict with 85% accuracy whether an individual will attempt suicide within 24 months, and with 92% accuracy whether they will attempt it within a week [5]. This result was obtained using large datasets, analysis of medical records, and tracking of social media posts. It seems, therefore, that empathy and human emotion matter less in predicting suicide than AI’s access to substantial amounts of information and data.
  6. Providing Help for Asocial Patients: For patients who are asocial and struggle with human interaction, AI can be a more useful tool than a face-to-face session with a therapist. For instance, autistic patients with a limited grasp of social cues can find interactions with other humans exceedingly difficult.
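Prediction systems of the kind Walsh et al. describe train a classifier on features extracted from medical records. The sketch below is a minimal, self-contained illustration of that idea, a logistic regression fitted by stochastic gradient descent; the feature names and the data are entirely synthetic and hypothetical, and this is not the published model.

```python
import math
import random

# Hypothetical record-derived features: [prior_attempts, recent_er_visits, med_changes]
# Labels: 1 = attempted within the follow-up window, 0 = did not. All data is synthetic.
random.seed(0)

def make_patient(label):
    base = [random.random() for _ in range(3)]
    if label:  # shift positive cases so the two classes are separable
        base = [x + 1.5 for x in base]
    return base, label

data = [make_patient(i % 2) for i in range(200)]

def predict(weights, bias, x):
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))  # logistic sigmoid: risk score in (0, 1)

def train(data, lr=0.1, epochs=100):
    weights, bias = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            err = predict(weights, bias, x) - y  # gradient of the log-loss
            bias -= lr * err
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
    return weights, bias

weights, bias = train(data)
accuracy = sum((predict(weights, bias, x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The published work evaluated accuracy on held-out patients over defined time horizons; a toy training-set accuracy like the one above says nothing about real-world predictive value.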

Despite these remarkable contributions of AI to psychiatric therapy, there have been costs, one of which is the dehumanisation of health care. This may not matter much in other areas, but in psychiatry and mental health care, interactions with human beings are irreplaceable. Empathy and trust, both human dimensions, are the elixir of mental healthcare provision, and neither can be delegated to artificial intelligence.

According to Minerva & Giubilini (2023)[6], even as we celebrate the positive transformation of psychiatric therapy by the integration of AI, there are two pertinent questions that need to be addressed:

  • Though AI is better than humans at diagnosing certain diseases because it can learn from vast datasets and recognise patterns, will digital phenotyping not lead to overdiagnosis, overburdening healthcare systems and increasing costs and inefficiencies?
  • Though empathy and trust are human aspects that may be difficult, but not impossible, to encode in an algorithm, if an algorithm makes a mistake, do human therapists have a responsibility to rectify it? And if not, who is responsible?

It is possible that AI may never develop the human emotions that would allow it to fully understand the emotions of a patient, and that human practitioners may never be able to meet the growing demand for mental healthcare; nevertheless, innovation and transformation in healthcare can only be achieved with human involvement. We humans are the key stakeholders in psychiatric healthcare, and there can really be nothing about us, without us.

-Chiadi Ndu, PhD


[1] IBM. (2017). With AI, our words will be a window into our mental health. IBM Research. Retrieved from https://www.research.ibm.com/5-in-5/mental-health/ (Accessed 2 August 2019).

[2] Walsh, C. G., Ribeiro, J. D., & Franklin, J. C. (2017). Predicting risk of suicide attempts over time through machine learning. Clinical Psychological Science, 5(3), 457-469.

[3] World Health Organization. Health workforce. https://www.who.int/health-topics/health-workforce#tab=tab_

[4] Franklin, J. C., Ribeiro, J. D., Fox, K. R., Bentley, K. H., Kleiman, E. M., Huang, X., … & Nock, M. K. (2017). Risk factors for suicidal thoughts and behaviours: A meta-analysis of 50 years of research. Psychological Bulletin, 143(2), 187.

[5] Walsh, C. G., Ribeiro, J. D., & Franklin, J. C. (2017). Predicting risk of suicide attempts over time through machine learning. Clinical Psychological Science, 5(3), 457-469.

[6] Minerva, F., & Giubilini, A. (2023). Is AI the future of mental healthcare? Topoi: An International Review of Philosophy, 42(3), 1–9. Advance online publication. https://doi.org/10.1007/s11245-023-09932-3
