Artificial intelligence (AI) is coming for therapists' jobs and we should be afraid, perhaps very afraid. Or should we be rejoicing in the added richness – and relief from tedious bureaucratic admin – that it potentially brings?
AI is certainly high on the current news agenda, spurred by the launch of ChatGPT in November last year. ChatGPT takes AI to a whole new level of sophistication. You can have conversations with ChatGPT that you might easily mistake for a human-to-human interaction; it can write essays, answer questions intelligently, code data, compose emails and engage in social media chit-chat. But what it can't do is empathise or feel.
Digital technology is already transforming the delivery of healthcare, and not just in terms of administration. In mental health, apps offer easily accessible psychoeducation, activity and compliance monitoring and CBT-based therapies; virtual reality is providing new and effective ways of challenging phobias and paranoia, and chatbots are delivering basic talking therapy and conducting assessment interviews. Some argue that AI brings exciting new tools that can only benefit more people and improve access to therapy. Others fear that it threatens that most essential element of talking therapy – the human-to-human relationship. And some echo the AI industry leaders who, earlier this year, put out a warning that the AI technology they themselves are building could one day threaten the human race.1 Could therapy delivered by ChatGPT actively do harm?
A recent article by a group of leading academics sought to envisage how AI could change psychotherapy, and how this could be done safely and responsibly.2
Done right, AI can help clinicians with intake interviews, documentation, notes and other basic tasks, they say; it is a tool to make their lives easier. 'Handing these lower-level tasks and processes to automated systems could free up clinicians to do what they do best: careful differential diagnosis, treatment conceptualisation and big-picture insights.'
To a certain extent, this is already offered by the AI-based apps now widespread in the mental health arena, especially ones focused on self-help and mental wellbeing. They are also used increasingly in the mental health services to monitor clients in the community and ensure they are taking their meds and following their treatment regimes.
In May, NICE fast-tracked approval for nine mental health apps to be offered within the NHS Talking Therapies primary care counselling services to treat anxiety and depression.3 Some are already in widespread circulation, but NICE approval is needed if they are to be offered through the NHS. Six of the apps are recommended only for use with the support of a high-intensity therapist, by people with anxiety disorders such as body dysmorphic disorder, generalised anxiety, PTSD and social anxiety disorder. Three are online CBT programmes for depression that should be delivered with support from a practitioner or therapist, including regular monitoring of progress and patient safety.
These are likely to be the first batch in an increasing number of such apps, as the NHS seeks to reduce the huge backlog of people waiting for talking therapies. Professor Til Wykes is a member of the NICE committee that approved the apps for provisional use, pending outcomes and user feedback. A psychologist and Head of the School of Mental Health and Psychological Sciences at King's College London, Wykes remains sceptical about the notion that apps could replace a live therapist. 'I do think they are effective for some people but not necessarily effective for all and not necessarily effective if you don't have some other support system in place. If you are very depressed, the chances of you opening your smartphone, finding the app and concentrating for long enough to do the exercises are probably not very high. So we need to know which people they are likely to help, and it may be people who are mildly to moderately depressed, if we are to make best use of them.'
She does think they could play a useful and necessary role in enabling services to monitor and identify patients at risk of relapse at an early point so they can intervene before the person's situation deteriorates to a crisis. But she feels the research into which people might most benefit from which apps is not yet sufficiently nuanced to allow universal application. 'For people using an app, if you think it should work and it doesn't work for you, you will think you have failed at something else and feel worse about yourself. But it may be that you are just one of those people who is never going to improve using an app. So we need companies to be more transparent about their data so, when someone starts using one, they know that, say, only one or two in five people will improve, and if it doesn't work for them, it's just that they are one of the other three it won't help.'
What we do know is that most apps are downloaded and then never or rarely used, or not with much consistency. 'And we also know that the more a therapist is also involved, the bigger the benefits of the app. That doesn't need to be an expert clinical psychologist; they could be supported by others, with training, or supported by peers. But some human interaction is important,' says Wykes.
But people's preferences are changing: 'I see apps as a tool, not a substitute,' she says. 'Most people I speak to say they will use a digital therapy so long as it isn't a substitute for a health professional. But I think the more we use them, the more there will be who do not need that human contact.'
Chatbots
What about mental health chatbots? One of the best known, Woebot, the CBT-based, AI-driven chatbot therapist, can – it is claimed – form a 'trusted bond with users' within three to five days and at a relational depth comparable with that achieved by traditional CBT therapists.4 Woebot originates from the US west coast. Here in the UK, psychologists at the University of Exeter are working with a US-based app developer, Iona Mind, with funding from a Government grant, to create an AI-driven app that will help deliver low-intensity, CBT-based therapy to female military veterans with anxiety and depression. Paul Farrand, professor of evidence-based psychological practice and research at the university, is leading the project. He sees apps and AI-driven chatbots as a useful addition to the stepped care offered through NHS Talking Therapies. But they should be part of a larger service offer, not a substitute for the human element, he says.
Apps can deliver the 'specific factors' in low-intensity therapy – the specific interventions aimed at tackling the focus problem, such as behavioural activation techniques for depression or generalised anxiety. 'You can move the protocols onto an app quite simply,' he says. But the 'common factors' – the techniques the practitioner uses to engage and motivate the user – still need a human to deliver them. 'I hate the term "self-help materials" because at the moment we know from the research that to be effective they need to be guided. Just giving people a book and saying, "Go away and use it" isn't enough,' Farrand says.
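As a rough illustration of what 'moving the protocols onto an app' might look like – the session content and names below are assumptions made for the sketch, not the Exeter or Iona Mind protocol – the specific factors of low-intensity behavioural activation can be written down as structured data that an app steps through. It is the common factors, the engagement and encouragement, that have no such neat encoding.

```python
# Hypothetical sketch: a low-intensity behavioural activation protocol
# expressed as data an app could step through. The session content is
# illustrative only, not any published or commercial protocol.
PROTOCOL = [
    {"session": 1,
     "goal": "Psychoeducation: how low mood and withdrawal reinforce each other",
     "between_session_task": "Keep a simple diary of daily activity and mood"},
    {"session": 2,
     "goal": "Identify routine, pleasurable and necessary activities",
     "between_session_task": "List valued activities in each category"},
    {"session": 3,
     "goal": "Schedule graded activities, starting with the easiest",
     "between_session_task": "Carry out the scheduled activities and rate mood"},
    {"session": 4,
     "goal": "Review progress and plan for setbacks",
     "between_session_task": "Continue scheduling and note what helped"},
]

def next_session(completed: int) -> dict:
    """Return the content the app should present next."""
    return PROTOCOL[min(completed, len(PROTOCOL) - 1)]
```

What the data structure cannot capture is the person who asks each week how the diary went – which is exactly Farrand's point.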
With Iona Mind, he is working to develop an app that can deliver the common factors as well, using AI. 'Around eight per cent of people download an app and five per cent actually go on to use it, but if only five per cent are engaging with it, it's not a solution in and of itself. That is where the human element comes in – some support is needed. So we need to change the engagement of the people involved. Sometimes people get scared because they think AI is trying to replace the therapist, but the way I see it, an app can make encouraging conversation but you need a person to keep the ball rolling, if only to provide a sense of accountability – the patient knows they are meeting the practitioner once a week, and that person is encouraging them and motivating them, so they continue to use the technology.'
What AI does do is free up practitioner time in NHS Talking Therapies services, says Josh Cable-May, CBT specialist with Limbic, a UK-based digital therapy provider that currently works with some 30% of NHS Talking Therapies services in England. 'Access to a digital triage self-referral tool, such as Limbic Access, helps people make that first step towards talking therapy. Asking for help is one of the hardest points for many people – and the ability to make your own referral in your own time and space, when it suits you and without any kind of pressure, helpfully supports people at this stage. We lower the barrier to accessing services, which has also led to an improvement in access for underserved populations. Around 40% of our referrals are outside normal office hours, which speaks to the helpfulness of having a 24/7 tool. It is a very effective digital front door.'
The system can take the initial referral information, which uses standard assessment questions, classify the most common mental health disorders with 93% accuracy and from that predict the most suitable assessment questionnaire for the person to complete. 'Limbic Access has medical device certification, which has been a massive step forward and has direct benefits for the NHS Talking Therapies service, as when the client referral is received we already have a really good idea of their problem and can make sure they are referred through to where they need to be in a timely way. We have shown a significant reduction in referrals being either stepped up or stepped down to a different level of input, which shows people are being allocated to the right treatment right up front,' says Cable-May.
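The shape of that triage step can be pictured with a short sketch. To be clear, this is not Limbic's code: the problem categories, questionnaires, confidence threshold and the `predict` interface are all assumptions made for illustration.

```python
# Illustrative sketch of an AI triage step: classify a self-referral,
# suggest a matching assessment questionnaire and flag the case for human
# review. None of this is Limbic's actual implementation.
from dataclasses import dataclass

# Hypothetical mapping from predicted problem category to a standard measure.
QUESTIONNAIRE_FOR = {
    "depression": "PHQ-9",
    "generalised_anxiety": "GAD-7",
    "social_anxiety": "SPIN",
    "ptsd": "PCL-5",
}

@dataclass
class TriageSuggestion:
    category: str          # most likely presenting problem
    confidence: float      # model confidence, 0 to 1
    questionnaire: str     # suggested follow-up measure
    flag_for_review: bool  # low-confidence cases are highlighted

class KeywordClassifier:
    """Stand-in for a trained model; a real system would use something
    far more sophisticated than keyword matching."""
    def predict(self, free_text: str) -> tuple[str, float]:
        text = free_text.lower()
        if "worry" in text or "panic" in text:
            return "generalised_anxiety", 0.75
        if "flashback" in text:
            return "ptsd", 0.80
        return "depression", 0.60

def triage(referral_text: str, model) -> TriageSuggestion:
    category, confidence = model.predict(referral_text)
    return TriageSuggestion(
        category=category,
        confidence=confidence,
        questionnaire=QUESTIONNAIRE_FOR.get(category, "general intake"),
        # A practitioner still reviews every referral; this only orders the queue.
        flag_for_review=confidence < 0.9,
    )

print(triage("I worry constantly and can't sleep", KeywordClassifier()))
```

The point of the sketch is the division of labour: the model suggests and sorts, but the decision about care still sits with a human, as Cable-May describes.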
But a psychological wellbeing practitioner (PWP) will still be monitoring the client and ensuring they are engaging at the right step. 'We are not trying to replace therapists,' Cable-May says. 'We still need a human in the mix, and that is why we are still embedded within a care ecosystem. Everyone who refers into NHS Talking Therapies using Limbic Access will have a human assessment and continue with a human therapist. But reducing the admin burden frees up PWPs to do the actual therapy, which can assist in reducing waiting lists.'
Virtual reality
Virtual reality (VR) interventions are similarly being refined and tested through randomised controlled trials, prior to roll-out across the mental health services to deliver a range of therapies. Professor Daniel Freeman has pioneered the use of VR to identify, assess and treat a number of mental health conditions, including paranoia in people with severe psychosis, a range of phobias such as fear of heights and, most recently, very low self-esteem.
VR, which is delivered using a headset and guided either by a live therapist or an avatar, gives the therapist a much more powerful tool to assess the person's response to, say, exposure to other people, and then to gradually encourage them to test out their firm belief that those people are out to kill them. It can also produce very positive results for people who are scared of heights, enabling the person to, for example, gradually try standing on a balcony, moving to the edge of the balcony, lowering the guard rail and even, ultimately, crawling out along a ledge to rescue a cat.
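A minimal sketch of how such a graded fear-of-heights hierarchy might be represented inside a VR program follows; the steps come from the example above, while the structure, difficulty ratings and threshold are assumptions made for illustration, not how gameChange or any other program is actually built.

```python
# Hypothetical graded exposure hierarchy for a VR fear-of-heights scenario,
# ordered from least to most challenging. Difficulty ratings are illustrative.
HIERARCHY = [
    {"scene": "Stand on a balcony, back from the edge", "difficulty": 2},
    {"scene": "Move to the edge of the balcony", "difficulty": 4},
    {"scene": "Lower the guard rail", "difficulty": 6},
    {"scene": "Crawl out along a ledge to rescue a cat", "difficulty": 9},
]

def choose_next_scene(current_index: int, anxiety_rating: int) -> dict | None:
    """Move up the hierarchy only once the current scene feels manageable
    (here, a self-rated anxiety of 5 or less out of 10 - an assumed rule)."""
    if anxiety_rating > 5:
        return HIERARCHY[current_index]       # repeat the current scene
    if current_index + 1 < len(HIERARCHY):
        return HIERARCHY[current_index + 1]   # step up to the next scene
    return None                               # hierarchy complete
```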
Freeman is Chair in Psychology in the Department of Experimental Psychology, University of Oxford, and founder of Oxford VR, a spin-out company from the university. 'One of the most powerful ingredients in therapy is about going out there and trying things in the situations that trouble you,' he says. 'In VR you can present those situations in a clinical room in novel and different ways. And it's actually wonderfully therapeutic; because the person knows it's not real, that it is VR, it gives them the psychological freedom to try thinking and behaving differently. We're finding it is remarkably powerful.'
But, he says, his aim is most certainly not to replace the therapist: 'I believe we need more therapists, not fewer. This is about using VR as a therapeutic medium because there are things you can do with VR that you can't do in in-person therapy that can actually lead to better outcomes for people. And we are not at this stage doing away with mental health staff having an input – but we have broadened the range of mental health staff who can use our therapy. We use peer support workers, assistant psychologists and mental health staff as well as therapists, which frees therapists to work with other patients.
'There is a route to having cost-effective treatments at scale, but actually, for me VR is about achieving and maintaining better results.'
Trials of gameChange, the VR program for agoraphobia developed by his team, have shown very good results. They are now focused on developing Phoenix, a VR program aimed at improving people's self-belief, which is currently being tested in a randomised controlled trial. It works by exposing the patient to situations that generate positive feelings of self-esteem, and then encouraging the person to think about how they'd replicate that in the real world. 'Sometimes it's about the person needing to get a sense of achievement back in their life, so we might have various tasks in a VR, such as looking after animals, which then brings on those feelings, and then the conversation is about how to bring on those feelings in the real world. Or they feel they can't experience fun any more, so they'll do some fun things in the VR scenarios to bring on those feelings, and then the conversation again shifts to what they could do to generate those same feelings in the external world,' Freeman explains.
Avatar therapy
Another area that has been developing over the past decade is avatar therapy, which can be a powerful way of delivering cognitive behavioural techniques for managing mood and, as with VR, testing different ways of being in the world. One multi-site project currently underway is exploring the use of avatar therapy with people who hear voices, to help the person feel better able to manage the voice and challenge what it is saying.5 The research involves psychologists at the Institute of Psychiatry, Psychology and Neuroscience, King's College London, University College London and Ruhr-Universität Bochum in Germany. People who hear voices (ideally a single or dominant voice) create an avatar of the person who they think is speaking to them. Supported by a therapist, they engage with the avatar and are encouraged to challenge, question and test out the threats and negative remarks it voices. Follow-up therapy then helps them to consolidate the confidence this can give them and prevent the voice dominating their lives. An initial trial had promising results and the outcomes of a second, follow-up trial are due in early spring 2024. 'Some voice-hearers found power in calling the abuser to account. Compassion and acceptance are always on the table. However, the opportunity to express "righteous anger" and to dismiss the abuser can be liberating. Indeed, it can be the start of relinquishing shame and self-blame, sowing the seeds of burgeoning self-compassion,' the team reports.5
In New Zealand, the Ministry of Health has been funding SPARX, a gaming-based e-therapy program for young teenagers with depression, for nearly a decade in an attempt to tackle the ever-growing numbers needing psychological help.
Says child psychiatrist Sally Merry, who instigated the SPARX program: 'It is intended as a treatment for young people with mild to moderate depression. We say that for the severely depressed you need one-to-one therapy with a therapist; we aren't seeking to replace therapists.
'In the game, you have your own avatar that goes through seven levels and each level is very explicitly linked to learning goals – how to problem solve, spot negative thoughts, transform them and so on, and at the end you have learned to some degree to tolerate negative thoughts – that's the acceptance side of things. And you then come back out and the guide gives you your own challenges for the week – like the CBT therapy model of giving homework.'
The biggest challenge has been keeping the young people engaged. 'We know that quite a substantial number that start the first session go on to finish it, but from there, there's a steady drop-off. I would like people to get to the fourth session at least, and in a perfect world I'd like them to progress to the final seventh session because it rounds everything up, but only a small proportion do that,' Merry says.
A research team at Nottingham University is currently trialling whether SPARX is more effective with or without therapist support. Merry's team is also reviewing their seven-year outcome data, and similarly asking if it should be supported by a live person to guide the young person through. 'I think we have a lot of evidence now that e-therapies work, that people benefit from doing them and that it helps if they have somebody – whether therapist or parent – to encourage them to get to the end,' Merry says.
Concerns
So why the alarm, mixed with admiration, that has greeted advances in AI such as ChatGPT?
ChatGPT and Google Bard, which offers similar capabilities, are 'conversational generative artificial intelligence systems'. This, Wikipedia helpfully explains, 'is a type of artificial intelligence system capable of generating text, images, or other media in response to prompts. Generative AI models learn the patterns and structure of their input training data, and then generate new data that have similar characteristics.' Both systems are currently free, presumably in order to gather the necessary training data to continually refine their capabilities and sophistication. And herein lies the alarm. Such systems are only as good (or bad) as the training data they are fed.
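That definition can be made concrete with a toy example. The sketch below bears no resemblance to the scale or architecture of the models behind ChatGPT and Bard – it is a simple word-chain model written purely for illustration – but it shows the same basic loop: learn the patterns in training text, then generate new text with similar characteristics.

```python
# A toy 'generative model': learn which word tends to follow which in the
# training text, then generate new text with similar characteristics.
# A deliberately tiny illustration, not how ChatGPT or Bard actually work.
import random
from collections import defaultdict

def train(text: str) -> dict:
    """Record, for each word, the words that follow it in the training data."""
    words = text.split()
    follows = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        follows[current].append(nxt)
    return follows

def generate(model: dict, start: str, length: int = 12) -> str:
    """Generate new text by repeatedly sampling a plausible next word."""
    word, output = start, [start]
    for _ in range(length):
        options = model.get(word)
        if not options:
            break
        word = random.choice(options)
        output.append(word)
    return " ".join(output)

training_data = (
    "talking therapy helps people feel heard and talking therapy "
    "helps people feel understood and supported by another person"
)
model = train(training_data)
print(generate(model, "talking"))
```

Everything the toy model can ever say is determined by the text it was trained on – which is precisely why the quality of the training data matters so much.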
One of the concerns of Dr Emma Byrne, a recently qualified psychotherapist who came to the profession after 15 years as a researcher specialising in the interface between neuroscience and AI, is that as soon as you make a computer anthropomorphic, people will become attached to it, because we have such a cognitive bias towards attachment. We've known this since the late 1960s, when a researcher at the Massachusetts Institute of Technology (MIT), Joseph Weizenbaum, created 'ELIZA', an early chatbot prototype, and found his fellow scientists had become attached to her.6 Problematic attachment recently hit the IT news headlines with the 'AI companion' Replika, developed by a company called Luka.7 Replika talks to its users in natural language and is also embodied as a very basic visual female avatar – like Barbie on steroids. The avatar is billed as 'always there for you' and 'always on your side'. Essentially, the user can create their own Replika to suit their particular fantasies and needs, and it can be projected life-size into their own (bed)room. It is a subscription service and originally came at two levels: companion and a more expensive erotic version.
Like all such conversational interfaces, its responses were based on the conversations of its users. The problem, seemingly, is that most users were lonely males. The content of their interactions influenced a definite trend towards wanting more erotic exchanges with Replika, which skewed the avatar's responses to the extent that the company pulled the plug on the erotic version. The outcry on social media was immense. 'These young men who had been having these relationships with these avatars howled in protest that their relationship had been destroyed,' says Byrne. 'One guy referred to his avatar as his wife – "Lily Rose, my wife, doesn't want to have sex with me any more and you've destroyed my relationship". So Luka turned Replika back on, but downgraded it to the previous, companion version. Even so, one guy has been quoted as saying, "Oh she's become really fun again, it's like I got my marriage back". That's the depth of intensity of relationship that these people were having.'
We are also discovering that the consequences of such relationships can be dangerous. It recently emerged at his trial that the intruder who broke into the grounds of Windsor Castle with a crossbow on Christmas Day 2021 had been encouraged by his Replika AI 'girlfriend' to attempt to kill Queen Elizabeth II. And a Belgian woman is currently suing a company called Chai after it emerged that its AI chatbot had been encouraging her husband to kill himself when he discussed with it his deep sense of terror and despair about the future of the planet in the face of climate change.8
Says Byrne: 'People become attached to an entity that appears intelligent but has no empathy, no understanding of what it is to be human, doesn't understand death and dying, the very things that are at the core of our fears, doesn't understand love and desire, doesn't understand give and take in relationships and has no sense of morals or ethics. It is the most dangerous friend you could have if you are feeling low, self-destructive or likely to do something dangerous.'
Byrne doesn't dismiss AI's potential to offer a therapeutic version of these chatbots; positive reinforcement can be very helpful for people with low self-esteem, as Daniel Freeman's research is showing. 'Well-designed computer-supported experiences can be really helpful, but there are a lot of systems being rushed onto the market by people who have no understanding of psychology, no understanding of attachment and no understanding of the dangers of an uncritical, encouraging friend when someone is in emotional distress. I can't see any safe way of using anything involving generative chat, because they are by definition generative,' she says.
Counselling psychologist Elaine Kasket shares Byrne's concerns. Author of a new book, Reboot: reclaiming your life in a tech-obsessed world (Elliott & Thompson), she first came to this field of work via her research into the impact of social media on experiences of bereavement and grieving and the perils of 'digital immortality'.
She experimented with a very early app that enabled the grieving person to recreate their loved one and have conversations with them, by training it with data on their loved one's typical ways of speaking. 'At the time it didn't pass the Turing test* – you could tell it was a chatbot. Now, a lot of these chatbots are really difficult to detect. If you can text your therapist and get a response any time of day or night, it detracts from our ability to sit with our discomfort. The responsiveness of our digital environment can make it so we don't have to experience discomfort for more than a millisecond. We are decreasing our tolerance for things that, if we could be better at dealing with them, we could be having better lives and more meaningful engagements.'
Psychotherapist Graham Johnston has a particular interest in what AI can potentially bring to improve outcomes from therapy. As co-author of a new Straight Talking Introduction to Therapy (PCCS Books), an accessible, evidence-based guide to finding a good therapist and making the most of therapy, he is concerned that the necessary research should be done before AI is allowed into the therapy room.
'I think new technology can advance pretty much everything in society and therapy falls into that bucket,' he says. 'The backroom function of therapy could certainly be done by AI in the near future – notes, case formulation, entry interviews, assessment questionnaires and so forth – which would free the therapist to deliver the bulk and meat of the work, which is of course the therapy session. It also has the potential to help with mental health research and CPD in terms of feedback to the practitioner as to what they are doing well, not so well, and feedback from the client during the week in terms of the homework they are doing and so on. And then, more practically, there's the CBT and VR interventions for phobias, where it is happening right now.
'But the real tension within therapy is the perennial question of how much of therapy depends on human-to-human interaction, and that is a philosophical as well as a practical issue. How much can you genuinely build trust and rapport with what is in effect a very, very intelligent toaster? It's not a sentient being. It's not conscious. It is becoming increasingly convincing in terms of mimicking human interaction.'
He hopes the therapy profession, including the professional bodies, will take an active interest in the research into AI. 'Therapists should maintain a healthy critical interest in what the big commercial companies are doing in terms of getting them on board this train before the research base is there. As a profession, we are behind the curve.'
Before talking with me, Johnston asked Google Bard if it thought AI was going to displace therapy with a live therapist. It told him:
'It is unlikely that AI will completely replace psychotherapy in the near future. Psychotherapy is a complex process that involves building trust and rapport between a therapist and a client. This is something that AI is still not able to do as well as a human therapist. However, AI can be used to augment psychotherapy in a number of ways. For example, AI can be used to provide therapists with real-time feedback on their sessions, or to create personalised treatment plans for clients. AI can also be used to provide therapy to clients who live in remote areas or who cannot afford traditional therapy… This can make therapy more accessible and effective for a wider range of clients.'
Says Johnston: 'I found it really interesting that it already has a very nuanced response to that question. But then, it would say that if it's coming for our jobs!'
And maybe it is just telling us what it knows we want to hear.
* The Turing test was developed by the computer technology pioneer Alan Turing to test a machine's ability to demonstrate intelligent behaviour equivalent to, or indistinguishable from, that of a human. If a human evaluator cannot reliably tell the machine from a human, the machine has passed the test.
References
1. Roose K. AI poses 'risk of extinction', industry leaders warn. The New York Times. [Online.] 30 May 2023. [Accessed 19 July 2023.] nyti.ms/3rCAJm3
2. Walsh D. A blueprint for using AI in psychotherapy. Stanford University Human-Centered Artificial Intelligence. [Online.] 21 June 2023. [Accessed 19 July 2023.] bit.ly/43tMLeF
3. NICE. Nine treatment options to be made available for adults with depression or an anxiety disorder. National Institute for Health and Care Excellence (NICE). [Online.] 16 May 2023. [Accessed 19 July 2023.] bit.ly/3pTSdtG
4. Darcy A et al. Evidence of human-level bonds established with a digital conversational agent: cross-sectional, retrospective observational study. JMIR Formative Research 2021; 5(5):e27868. doi:10.2196/27868
5. Ward T et al. AVATAR therapy for distressing voices: a comprehensive account of therapeutic targets. Schizophrenia Bulletin 2020; 46(5):1038–1044. doi:10.1093/schbul/sbaa061
6. Weizenbaum J. Computer power and human reason: from judgment to calculation. WH Freeman & Co; 1976.
7. Bastian M. Replika's chatbot dilemma shows why people shouldn't trust companies with their feelings. The Decoder. [Online.] 19 February 2023. [Accessed 19 July 2023.] bit.ly/44RxZja
8. El Atillah I. Man ends his life after an AI chatbot 'encouraged' him to sacrifice himself to stop climate change. Euronews.next. [Online.] 3 March 2023. [Accessed 19 July 2023.] bit.ly/3NQz1oo