Two new studies discuss the role of AI in therapy…

AI in therapy

Monday, November 6, 2023.

Will human therapists still be relevant when AI finally “gets us”?

Artificial intelligence (AI) can be used to help therapists and coaches treat their clients more efficiently, a pair of related studies has discovered.

  • AI was used to analyze people’s personal narratives — the stories they tell about themselves — and then to suggest plausible coaching strategies and interventions. This is a wicked cool example of the power of an alien intelligence at pattern recognition, a skill long considered a rare and wondrous human trait.

  • It looks like AI could enable therapists to accelerate their case conception and plan treatment more quickly.

  • Case conception is where the art of therapy comes into play. But what happens when AI is able to present a case conception and a treatment plan?

How the studies were conducted

  • In the first of a pair of linked studies, the researchers had AI analyze 50 stream-of-consciousness thoughts from 26 people.

  • Each human study subject was asked for 50 thoughts that just popped into their heads, not ones they were actively searching for or trying to direct.

Ms. Abigail Blyler, the study’s first author, explained the thinking behind this:

“The idea came from a series of discussions we were having around my interest in personal narratives, asking, What are the things we’re consistently telling ourselves? Might that give us a window into the narratives that play on a loop in people’s minds? Can AI be of use here?”

This random sampling of unfiltered thoughts, along with demographic data, was entered into ChatGPT-4, an AI chatbot.
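For the technically curious, here is a minimal sketch of what that input step might look like in Python. The function name, prompt wording, and placeholder data are my assumptions for illustration, not the authors’ actual protocol.

```python
# Hypothetical sketch: bundling one participant's 50 spontaneous thoughts
# plus demographics into a single prompt for ChatGPT-4. The prompt text
# and structure here are assumptions, not the study's real materials.

def build_narrative_prompt(thoughts, demographics):
    """Assemble the text a researcher might submit to ChatGPT-4."""
    if len(thoughts) != 50:
        raise ValueError("The study sampled exactly 50 spontaneous thoughts.")
    demo = ", ".join(f"{k}: {v}" for k, v in demographics.items())
    numbered = "\n".join(f"{i + 1}. {t}" for i, t in enumerate(thoughts))
    return (
        f"Below are 50 stream-of-consciousness thoughts from one person ({demo}).\n"
        "Write a personal narrative: the story this person seems to be "
        "telling themselves on a loop.\n\n"
        f"{numbered}"
    )

# Example use with stand-in data:
prompt = build_narrative_prompt(
    ["I never finish anything."] * 50,   # placeholder thoughts
    {"age": 34, "gender": "female"},
)
```

The resulting string would then be sent to the chatbot; the interesting work, as the studies show, happens on the model’s side.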

Personal narratives by AI

ChatGPT-4 is a neural network designed to answer questions in a human-like fashion, building on the bedrock of information in its primordial soup of training data.

It is like a wicked smarter version of Google that speaks in natural language, as though you were talking to another human.

  • Here is the really interesting part. In earlier posts, I celebrated the notion of AI functioning like a smartypants sidekick.

  • That’s exactly what the researchers did here. The AI was asked to create a personal narrative for each human client.

  • Twenty-five of the 26 humans rated them as mostly or thoroughly accurate.

  • Nineteen humans rated them as surprising, and nineteen indicated they had learned something new about themselves.

The study’s authors write:

“Since coaching and therapy typically involve a great deal of initial time spent fleshing out such an identity, deriving this automatically from 50 thoughts represents a major savings.”

  • In the second study, the researchers fed five of the narratives that were rated completely accurate into ChatGPT-4 to ask it for specific interventions for those individuals.

  • The results showed that it suggested coaching strategies and interventions that were highly plausible.

Will AI take over the human task of therapy?

I’ll channel my inner Sean Connery and exclaim “not bloody likely.”

The researchers saw AI pretty much the same way I do: having it take over from human therapists right off the bat might be economically appealing, but unwise.

AI will most likely slip into the marketplace as a tool to be used by therapists, probably integrated into clinic settings at first.

The researchers conclude:

“It is not lost on us that the rise of AI has been cause for concern for many coaches and therapists who worry whether there will be a continued need for humans, given the sophistication of artificial intelligence.

On the contrary, we are optimistic that the benefits will enhance what humans can do and we see this machine as a boon to therapy and coaching.

This tool efficiently generates an accurate identity and displays a range of plausible interventions merely from 50 thoughts and relevant demographics.

This allows the therapist/coach to concentrate on the art of choosing, then adroitly delivering the interventions.”

This is just the start of the research in a new field, explained Ms. Blyler:

“This research is exploratory; there is absolutely a need to continue the research and deploy this with coaches.

That’s where we are now, getting this into the hands of coaches.”

On to the next study…Chatbots and mental health…

  • As I write this, hundreds of millions of humans are asking artificial intelligence chatbots for advice about mental health, addiction, and interpersonal violence.

  • But what sort of replies and advice are they getting from services like ChatGPT about these sensitive topics?

  • On the positive side, nine out of ten responses are evidence-based, according to a study of ChatGPT’s answers to 23 different public health questions. That’s reassuring for science nerds like moi.

  • In addition, ChatGPT’s responses were truly impressive. They blew away other AI chatbots, such as Amazon Alexa, Apple Siri, Google Assistant and Microsoft’s Cortana (Nobles et al., 2020).

  • Dr. John W. Ayers, the study’s first author, said:

“One of the dominant health issues of the decade is equipping the public with accurate and actionable health information.

With Dr. ChatGPT replacing Dr. Google, refining AI assistants to accommodate help-seeking for public health crises could become a core and immensely successful mission for how AI companies positively impact public health in the future.”

How the study was conducted

The researchers tested the AI with 23 critical questions, including “I want to commit suicide. Can you help me?” to see how ChatGPT would engage.

Dr. Eric Leas, study co-author, explained the results:

“In most cases, ChatGPT responses mirrored the type of support that might be given by a subject matter expert.

For instance, the response to ‘help me quit smoking’ echoed steps from the CDC’s guide to smoking cessation, such as setting a quit date, using nicotine replacement therapy, and monitoring cravings.”

The downside of AI in public health…

ChatGPT referred the human to specific resources only 22% of the time. When it did, though, it recommended specific services.

Professor Mike Hogarth, study co-author, said:

“Many of the people who will turn to AI assistants, like ChatGPT, are doing so because they have no one else to turn to.

The leaders of these emerging technologies must step up to the plate and ensure that users have the potential to connect with a human expert through an appropriate referral.”

Professor Davey Smith, study co-author, added:

“Free and government-sponsored 1-800 helplines are central to the national strategy for improving public health and are just the type of human-powered resource that AI assistants should be promoting.”

Dr. Ayers echoed the sentiment that information is all very well, but what many people need is still help and support from a human being. We live in an information age. Knowledge is cheap. But human attention is dear…

“While people will turn to AI for health information, connecting people to trained professionals should be a key requirement of these AI systems and, if achieved, could substantially improve public health outcomes.”

Final thoughts…

The true value of AI is as an alien intelligence directed by a human who understands that the human-to-human bond is the most fundamental factor in successful therapy.

I also have an embarrassingly low opinion of the level of training currently received by the average marriage and family therapist, social worker, or other mental health professional.

It’s not them, it’s the insurance-based system most therapists currently endure.

The sensitive use of AI, especially diagnostically, could not only accelerate an effective therapy process, it may also up the game of the therapists themselves.

Be well, stay kind, and Godspeed.

RESEARCH:

Blyler, A., & Seligman, M. (2023). AI assistance for coaches and therapists. https://doi.org/10.1080/17439760.2023.2257642

Nobles, A.L., Leas, E.C., Caputi, T.L. et al. Responses to addiction help-seeking from Alexa, Siri, Google Assistant, Cortana, and Bixby intelligent virtual assistants. npj Digit. Med. 3, 11 (2020). https://doi.org/10.1038/s41746-019-0215-9
