The AI will soothe you now…

Saturday, October 7, 2023. A dreary, soggy day of persistent rain.

The power of Cognitive Empathy… is it “good enough”?

“AI can even be better than humans at helping us with socio-emotional learning because we can feed it the knowledge of the best psychologists in the world to coach and train people,” says Grin Lord, CEO of mpathic.ai, a conversation analytics company based in Bellevue, Washington.

Jodi Halpern, a professor of bioethics at the University of California, Berkeley, and an authority on empathy and technology, frowns at this unfettered enthusiasm.

She concedes that AI might achieve the ability to bestow “cognitive empathy.” She then goes on to celebrate humans by suggesting that the empathy generated by pattern recognition is inherently inferior to the richly expressive “emotional empathy” that neurotypical humans find so deeply satisfying.

It makes sense to me that neurotypical therapists might also celebrate the uniquely human capacity to internalize another person’s pain, hope and suffering, and feel genuine concern…

Jodi made her point of view quite clear. “Empathy… that’s what’s most clinically valuable. It requires that the clinician experience something when they listen to a patient,” she said. That’s something a bot, without feelings or experiences, can’t do.

I’d like to reveal two secrets about therapists that you might find interesting. First, the level of training of most therapists is alarmingly modest.

Second, a huge cohort of American therapists feel an unrealistic sense of self-importance, because they are “wounded healers.” Many therapists rely on the profound power of having a meaningful conversation with their clients… but is the client getting the best available science-based feedback on their presenting problems?

Probably not.

So it won’t surprise me when “Ethical AI,” despite its utter lack of any direct experience of human suffering, becomes a clinical factotum.

I hear therapists complain that empathetic responses generated by an AI bot in a clinical setting will somehow be debased, simply because they are of non-human origin.

We’re human… not machines… sniff… we’re BETTER!

Here’s what AI is good at that therapists suck at…

I may catch sh*t for this, but so be it. Too many clients working with “all-purpose” therapists have specific issues, such as Borderline Personality Disorder, Generalized Anxiety Disorder, Depression, or Eating Disorders, that may respond best to a specifically structured therapeutic regimen, such as Cognitive Behavioral Therapy, Motivational Interviewing, or Dialectical Behavior Therapy. AI can guide a therapist through such a clinical protocol reliably, without fail.

It’s important to note that these complex and regimented approaches have been taught in psycho-educational video format for decades, with impressive outcomes… without therapists!
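For the code-curious, here’s a minimal Python sketch of what “guiding a clinician through a protocol, without fail” could look like: a session checklist that never silently skips a step. The steps below are a simplified illustration I invented, not an authoritative clinical protocol, and none of these names come from any real product.

```python
# Minimal sketch of an AI sidekick walking a therapist through a
# structured session. The steps are a simplified illustration, not an
# authoritative clinical protocol.
SESSION_STEPS = [
    "Check mood and review the week",
    "Set the session agenda together",
    "Review homework from last session",
    "Work the agenda items",
    "Assign new homework",
    "Summarize and elicit feedback",
]

def next_prompt(completed: set[int]) -> str | None:
    """Return the first uncompleted step, so nothing gets skipped."""
    for i, step in enumerate(SESSION_STEPS):
        if i not in completed:
            return f"Step {i + 1}: {step}"
    return None  # protocol complete

done = {0, 1}  # steps already covered this session
print(next_prompt(done))  # -> Step 3: Review homework from last session
```

The point isn’t the fancy math (there isn’t any); it’s the relentlessness. A checklist walker never gets tired, distracted, or enamored of its own instincts.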

Let’s discuss the bias of empathy quality…

Clients who have worked with me have heard me de-construct an empathetic response down to the studs. IMHO, the best use of AI is as a therapy sidekick that can compensate for an increasingly de-skilled, sh*tty therapist.

HAILEY is an AI bot that was used as a sidekick to 300 volunteer peer supporters. HAILEY’s task was to help make their advice a bit more empathetic. How well did HAILEY do?

Support staff types: “I understand how you feel.”

HAILEY interrupts and suggests something different… “If that happened to me, I would feel really isolated.”

The breathtaking results…

Here’s what they found. When the human volunteers accepted influence from HAILEY, the result was nearly 20% (OK, 19.6%) more empathetic text. Coming out of the gate, for a nascent technology, that’s an incredible outcome.

Compared with relying exclusively on cranky and unreliable humans, “AI can enhance empathy when paired with a human,” says Tim Althoff, an assistant professor of computer science at the University of Washington and a co-author of the research.
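If you’re curious what that workflow looks like in code, here’s a rough Python sketch of the loop the study describes: the human drafts a reply, an assistant proposes a more empathetic rewrite, and the human keeps the final say. The rewrite rule below is a toy stand-in I invented for illustration; the real HAILEY system uses a trained language model, and none of these function names come from the study.

```python
# Toy sketch of a HAILEY-style loop: human drafts, assistant suggests,
# human decides. The rewrite rule is an invented stand-in, not the
# trained model from the Sharma et al. study.

GENERIC_OPENERS = {
    "i understand how you feel.",
    "i know how you feel.",
    "that sounds hard.",
}

def suggest_rewrite(draft: str, situation: str) -> str | None:
    """Return a more empathetic rewrite of `draft`, or None.
    A real system would call a trained model here."""
    if draft.strip().lower() in GENERIC_OPENERS:
        # Swap generic reassurance for a concrete emotional reflection.
        return f"If {situation} happened to me, I would feel really isolated."
    return None

def compose_reply(draft: str, situation: str) -> str:
    """Human-in-the-loop step: show the suggestion, let the human decide.
    Auto-accepted here for the demo; a real UI offers accept/edit/dismiss."""
    suggestion = suggest_rewrite(draft, situation)
    if suggestion is None:
        return draft
    print(f"Assistant suggests: {suggestion!r}")
    return suggestion  # the human clicked accept

final = compose_reply("I understand how you feel.", "losing that job")
print(f"Sent to client: {final!r}")
```

Notice where the power sits: the bot only ever suggests, and nothing reaches the client until the human says so.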

AI-assisted therapy is coming to a couch near you…

Mental health in the United States is a highly regulated environment, with overworked state regulatory boards.

As AI finds a place in talk therapy, it will only be because the requisite government regulation of “Ethical AI” mandates transparency and ostensibly protects clients with a higher order of care.

If there are serious economies to be realized (and, LOL, you know there will be…), The Powers That Be will push to further de-skill public mental health.

So it logically follows that they will flatten the AAMFT lobby. Labor unions such as SEIU, which have made recent strides organizing mental health workers in clinic settings, will howl.

Expect Ethical AI to nibble at existing labor agreements, at least at first. When you’re hungry for profits, AI is an awfully tasty technology.

How AI is already helping us to be more empathetic…

  • Uniphore is an enterprise AI platform based in Palo Alto, Calif. One of its product offerings is an AI virtual-meeting tool that tracks the emotional cues of participants on a call, helping whoever is running the call analyze in real time who is and isn’t engaged and which content is being well received.

  • The technology reminds me of some of the breakthrough facial coding technology developed by John Gottman in his couples therapy research.

  • The meeting tool measures signals of emotions such as happiness, anger, or confusion by analyzing, in real time, the facial expressions of the call attendees, their prosody, pitch, and vocal tone, and the words they choose to form their thoughts.

  • The director of the call sees a dashboard that renders the emotional state of the attendees as numerical scores: overall “engagement,” plus the level and nature of each detected emotion. (A rough sketch of this kind of scoring follows this list.)

  • One of my recent clients told me that at one large tech company, AI noticed that some of its recruiters lacked empathy and active-listening skills when they were interviewing women.

  • Thanks to the pattern-recognition skills of Ethical AI, the recruiters received specific feedback, which resulted in an 8% increase in job-offer acceptance among women.
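As promised, here’s a rough Python sketch of the kind of dashboard math described above: per-attendee signals fused into one engagement number. The signal names and weights are mine, invented purely for illustration; this is not Uniphore’s actual product logic or API.

```python
# Hypothetical engagement scoring: blend facial and vocal signals,
# then discount by detected confusion. Weights are illustrative only.
from dataclasses import dataclass

@dataclass
class AttendeeSignals:
    name: str
    facial_positivity: float  # 0..1, from facial-expression analysis
    vocal_energy: float       # 0..1, from prosody, pitch, and tone
    speech_share: float       # 0..1, share of airtime in the last minute
    confusion: float          # 0..1, strength of confusion cues

def engagement_score(s: AttendeeSignals) -> float:
    """Weighted blend of positive signals, discounted by confusion."""
    raw = 0.4 * s.facial_positivity + 0.3 * s.vocal_energy + 0.3 * s.speech_share
    return round(max(0.0, raw * (1.0 - 0.5 * s.confusion)), 2)

for a in [AttendeeSignals("Ana", 0.8, 0.7, 0.40, 0.1),
          AttendeeSignals("Ben", 0.3, 0.2, 0.05, 0.6)]:
    print(f"{a.name}: engagement = {engagement_score(a)}")
```

Every one of those weights is a human judgment call dressed up as math, which is exactly the worry Sherry Turkle raises below.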

Can AI make us better humans?

Here’s an idea. What if AI becomes like a Jiminy Cricket, singing “Always Let Your Conscience Be Your Guide?”

AI responds to our “little whistle” with a plethora of therapeutic assists, from a psycho-educational factoid to the kindest words of empathy for the given situation. I think we could use the help.

But like Jiminy, AI will get confused at times.

“What happens if machines aren’t good at measuring aspects of empathy that humans consider important, like the experience of illness, pain, love and loss?” asks Sherry Turkle, professor of the social studies of science and technology at the Massachusetts Institute of Technology. “What machines can score will become the definition of what empathy is.”

But Sherry, what if we refuse to consider it as a score, but rather as a suggestion?

AI is an alien form of noticing, but it is an immensely powerful one. Humans must decide what a “good enough” experience of empathy consists of, and whether it can be captured in a “score.”

I’d prefer we enjoy the suggestions and make the final decision as well informed as humanly possible, because making emotional decisions is a sacred human prerogative.

Final Thoughts on AI agita…

I remember when I was studying Narrative Therapy, I read about a case where our hero therapist (he was the author of the textbook, as I recall…) explained a situation in which he refused to consult with his client’s physician of 11 years… because he prized his own analytical abilities on a cold read, and did not want to be sullied by the physician’s preconceptions.

I remember being amazed that what I was reading… was in a textbook.

It was perhaps the stupidest single idea I’d encountered. It was an arrogant anti-curiosity.

It seems our human capacity for hubris has become fully normalized. We credit ourselves with metaphysical powers, and often shun external sources of wisdom in all its forms.

In a world where lonely army brides are soothed by pictures of puppies, why can’t a gentle helper like me whistle and receive useful ideas for my client who is struggling with me in the moment?

My clients know me as a plain-speaking dude, but I know that I could do a lot better. So could most of us.

I, for one, would welcome real-time, non-human data on how to say something with more empathy.

I would give it my full consideration, and then I would decide. Over time, the more I used AI to de-construct the human message of empathy, the more empathetic I would become. Who knows? Eventually, I might even become a real boy!

Whether we’re using animated crickets… or machine learning… if your whistle’s weak…it’s good to get another point of view.

The more we know, the more we grow.

Be well, give a little whistle, and Godspeed!

RESEARCH:

Coan, J. A., & Gottman, J. M. (2007). The Specific Affect Coding System (SPAFF). In J. A. Coan & J. J. B. Allen (Eds.), Handbook of emotion elicitation and assessment (pp. 267–285). Oxford University Press.

Sharma, A., Lin, I. W., Miner, A. S., et al. (2023). Human–AI collaboration enables more empathic conversations in text-based peer-to-peer mental health support. Nature Machine Intelligence, 5, 46–57. https://doi.org/10.1038/s42256-022-00593-2
