AI Science

The AI Therapist Can See You Now (npr.org) 114

New research suggests that given the right kind of training, AI bots can deliver mental health therapy with as much efficacy as -- or more than -- human clinicians. From a report: The recent study, published in the New England Journal of Medicine, shows results from the first randomized clinical trial for AI therapy. Researchers from Dartmouth College built the bot as a way of taking a new approach to a longstanding problem: The U.S. continues to grapple with an acute shortage of mental health providers. "I think one of the things that doesn't scale well is humans," says Nick Jacobson, a clinical psychologist who was part of this research team. For every 340 people in the U.S., there is just one mental health clinician, according to some estimates.

While many AI bots already on the market claim to offer mental health care, some have dubious results or have even led people to self-harm. More than five years ago, Jacobson and his colleagues began training their AI bot in clinical best practices. The project, says Jacobson, involved much trial and error before it led to quality outcomes. "The effects that we see strongly mirror what you would see in the best evidence-based trials of psychotherapy," says Jacobson. He says these results were comparable to "studies with folks given a gold standard dose of the best treatment we have available."

Comments Filter:
  • No and No (Score:5, Insightful)

    by wakeboarder ( 2695839 ) on Wednesday April 09, 2025 @11:08AM (#65292639)

    AI doesn't have feelings and doesn't understand what it's like to be human. If you don't have that, talking to an AI therapist is not any better than reading a book. Don't even think for 2 seconds that you can replace a therapist with a computer.

    • Re: (Score:3, Interesting)

      It depends on the quality of the therapist. If all they are doing is listening and making "go on" noises, perhaps an AI or even a simple chatbot would do. On the other hand, how long will it be before the AI starts suggesting suicide as a solution (it has already happened, after all)?
      • by dfghjk ( 711126 )

        Suggesting suicide might be considered a good outcome if you design your metric to do so. It seems quite clear that this is what has occurred.

        Therapy is subjective and depends heavily on the participation and receptiveness of the patient. It seems that measurement is both difficult AND not the real goal of the study. It's just part of the money grab by the tech bros: is there any job that can't be done by a machine that VCs can own for a few billion more dollars?

      • "AI Therapist" should be the guy who's job it is to talk your car's anti-lock-brake-persona out of suicidal thoughts.

        The poor car AI's probably thinking:

        "I'm smarter than 99.99% of these meat bags I'm carrying, and I'm stuck in this boring dead end job. Next time I cross that bridge I want to just end it all."

        "AI Therapist" will be the human trying to convince the car not to.

      • by PPH ( 736903 )

        how long will it be before the AI starts suggesting suicide as a solution

        AI therapist: "Have you tried turning it off and back on again?"

      • by gweihir ( 88907 )

        Or, about as bad, the patient having a moment of clarity and feeling betrayed. And rightfully so.

      • by AmiMoJo ( 196126 )

        There was a study years ago that found better outcomes for patients who had sessions where a doctor simply listened to all their complaints. No treatment, nothing prescribed, just listening.

        So maybe it depends on how authentic the AI appears. If it can give a convincing performance, it might actually be useful for people to vent to, especially if it can suggest some things to help them process. A lot of therapy is just learning how to work through problems and manage them, like Acceptance and Commitment Therapy.

    • Re: (Score:2, Troll)

      The actual advice in evidence-based therapy is pretty straightforward. The patient doesn't want to have to tell the therapist they just stuck to their usual habits, so they try their best to comply. Only the feelings of the patient are relevant in this.

      The huge success of Character.AI proves that people do feel attachment to these bots, that's all that's needed.

      • "I'm better at relating to bots than to other people, and I'm happy with that" is not a good indicator of mental health.
        • That's why they need the therabot.

          • Mechanically speaking, if the goal is for people to experience lives of meaning, joy, and connection, there are ways to do it that are essentially artificial simulations, and ways to do it that are consistent with the evolutionary forces that caused us to crave meaning, joy, and connection in the first place. AKA, "real relationships."

            It is not possible to construct a rationalist (falsifiable) argument for preferring one approach to the other: that entire topic falls squarely in the realm of subjective meaning.

    • AI doesn't have feelings and doesn't understand what it's like to be human. If you don't have that, talking to an AI therapist is not any better than reading a book. Don't even think for 2 seconds that you can replace a therapist with a computer.

      Yes, but a real therapist would have to type all your deepest, darkest secrets into a computer before the data aggregators would get ahold of it. The AI therapist can direct pipe that shit right to the mothership. Why do you not want to supply all data, even your personal thoughts and fears, straight to the data-aggregators? WHY WILL NO ONE THINK OF THE POOR DATA AGGREGATORS?

    • Re:No and No (Score:5, Informative)

      by xevioso ( 598654 ) on Wednesday April 09, 2025 @12:34PM (#65292907)

      The issues are much more complicated, which is why this article is bullshit. Therapists are licensed in many states, and that is because there are a lot of regulatory issues a licensed therapist needs to comply with in order to legally be allowed to practice. AI does not have to abide by these things.

      For example, if a child therapist overhears comments in a therapy session indicating the child has been physically or sexually abused, the therapist is *legally* required to make a CPS (Child Protective Services) call after the session and make a report on what was heard, at least where I live. The AI bot can't do that.

      There's a whole bunch of practical issues associated with becoming a licensed therapist; it's not just "talking to people".

      Certain types of therapists are allowed to prescribe drugs for mental illness. Do we want AI doing this?

      • by Holi ( 250190 )

        " if a child therapist overhears comments in a therapy session indicating the child has been physically or sexually abused, the therapist is *legally* required to make a CPS (Child Protective Services) call after the session and make a report on what was heard, at least where I live. The AI bot can't do that."

        Why could an AI not do that, or at least notify the person in charge? You offer no explanation as to why you believe that.

        • by xevioso ( 598654 )

          Because the person making the call has to be licensed. The person making the call has to describe the situation in detail, which might include reading subtle facial expressions, determining if what was said was said in jest or had context around it indicating it was not a joke, or that the child's life might be in danger now or as part of ongoing abuse. The therapist might be told they need to call multiple people, and be reachable via phone, and even potentially be a witness in a court of law under certain circumstances.

          • You sound like an "about to be out of work licensed therapist"!

            Why can't the chatbot "describe the situation in detail", include "subtle facial expressions", determine "if what was said was in jest . . ."

            Why can't the chatbot call multiple people?

            You're grasping at straws. Keep changing the goal posts. It won't matter.

            • by xevioso ( 598654 )

              Because the person making the calls and doing these things, *by law* has to be a person. That is why. Are you having problems reading today?

              • You seem to be fixated on one outlying, one-in-a-million use case! First of all, a meatball (human) can make the final call after a detailed and precise explanation by the chatbot. And before you start another rant about how it isn't possible because of the "law": the law isn't set in stone. Also, the law isn't worldwide.

    • by hey! ( 33014 )

      To play devil's advocate for a moment, predictive text techniques, trained on a sufficient corpus of data, will almost certainly *simulate* empathy and feelings most of the time.

      The problem is, as with AI image generators creating hands with too many fingers or guns with barrels coming out of both ends, it doesn't have the actual understanding it needs to recognize when it's got things very wrong, and when dealing with vulnerable people this will be dangerous.

      • by gweihir ( 88907 )

        Exactly. The problem is that being right even 99% of the time is not enough. A real therapist has additional responsibilities, like detecting when things are not working or when the patient is in crisis. That requires insight, and AI cannot have that.

    • Millions of people replace therapists with books every day.

    • MOST therapists have no actual real-world trauma to relate with either. My brother-in-law is a psychologist at the VA. He talks to combat veterans. He has never once in his life been truly frightened, has never been in a fight, never gone without a meal, never been homeless, never seen any trauma of any kind. Hell, he's never even picked up a weapon or thought about the idea of being ordered to shoot another human. How in the fuck is he qualified to help others who have experienced these things?
      • About the same as surgeons that have never operated on themselves.

        • Not even close to comparable when you consider surgeons spent years in med school, residency, and practice.
          • by hey! ( 33014 )

            There are different kinds of therapists out there, and they receive different amounts of training. At the high end, a psychiatrist receives a pretty comparable amount of postgraduate training to a surgeon. At the low end, someone with a master's of clinical psychology might spend five years in postgraduate education and supervised practice.

            Where things get interesting are all the masters of social work out there who've hung out a shingle as psychotherapists. They have less clinical and theoretical training

            • You are painting with an extremely broad brush.
              • by BranMan ( 29917 )

                Which is generally OK, as long as you use enough masking tape.

              • by hey! ( 33014 )

                As broad as the brush that claims therapists can't treat a condition they don't have themselves?

                • My earlier point was that 7 billion people, although very different looking on the outside, are physiologically the same on the inside. It takes years of study at a school to teach a student the location, structure, and nature of every bone, organ, and system in the human body, along with the best practices for remedy. Psychologists, MSWs, psychiatrists, MFTs, LCSWs, etc., attempting to rewire a brain through cognitive behavioral therapy or some other counseling plan, with or without the use of medication
                  • by hey! ( 33014 )

                    Well, that's an opinion, but the evidence shows that a number of different treatment modalities improve PTSD symptoms in a significant majority of cases. Most of these don't try to "rewire the brains" of patients, certainly not CBT, which is really just a form of rational inquiry largely based in ancient stoic philosophy. Others have different philosophical antecedents -- e.g. Acceptance and Commitment Therapy is based on philosophical pragmatism. Many effective treatments for PTSD can be characterized as

      • by hey! ( 33014 )

        How in the fuck is he qualified to help others who have experienced these things?

        Training in evidence-based treatment modalities?

        Sure, if you had no training at all and had to make someone feel better just by talking with them, you wouldn't be able to help unless you'd been through the exact same experience they had. But someone with clinical training and experience treating similar patients can help a lot even if they haven't gone through the same problem themselves. A psychologist doesn't need to have phobias to treat people afraid of flying, or to have been paranoid to treat people

    • Been a while since I talked to my books. How does that work out for the rest of you?

    • OTOH ChatGPT (or whatever) is free to use whenever you like, for as long as you like, while a live human therapist might cost $75/hour or more and you have to drive to their location at a prescheduled time for your once-a-week, 55-minute session.

      There are plenty of people who could benefit from therapy but don't because they can't afford it, or can't fit it into their schedule. Would these people benefit from an AI therapist? Possibly.

    • Their results suggest otherwise.

      Personally, I'm still trying to parse if that means
      a) therapy is actually bullshit, or
      b) actual proof that there's something empirically valid about psychotherapy; if you take the human OUT OF IT and still get beneficial results, it might imply there are objective facts there.

      I think either position could use this as an argument.

    • by Holi ( 250190 )

      Not sure why this is considered insightful. Your claim is based entirely on your emotions while they are performing clinical trials.

    • Well, now it can send feelings of pain to the human brain [slashdot.org], haven't you heard? It would be quite an effective therapy. Be happier, or else... *screams*

    • I mean, I want my therapist to be detached. The fact that it doesn't have feelings is actually a plus in my book.

      My big issue is the data that these services are going to be holding on people. How long before they are subpoenaed or, worse, hacked and the data dumped? You are putting a LOT of trust in the operators of such a service.

    • I disagree, AI Q&A is likely *much* better than reading a book. The problem with a book, is that you have to retain its entirety and assess how it applies to your situation. With an AI Q&A, it can instantly access the parts of the book that apply to the context at hand.

      It's not unlike how coding AIs can insert relevant code in the right place in your program, when prompted. Sure, you could get the same information by reading a programming book, but that process is much more cumbersome and time-consuming.
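
      A minimal sketch of that retrieval idea in Python, assuming a toy keyword-overlap score (real systems use embedding search; the passages and names below are purely illustrative):

      import re
      from collections import Counter

      def tokenize(text):
          # Lowercased word counts; a crude stand-in for real embeddings.
          return Counter(re.findall(r"[a-z']+", text.lower()))

      def retrieve(question, passages, k=3):
          # Return the k passages sharing the most words with the question.
          q = tokenize(question)
          ranked = sorted(passages, key=lambda p: sum((q & tokenize(p)).values()), reverse=True)
          return ranked[:k]

      # Hypothetical passages, as if chunked from a self-help book.
      passages = [
          "Exposure exercises gradually reduce avoidance behaviour.",
          "Keeping a thought record helps identify cognitive distortions.",
          "Sleep hygiene: fixed wake time, no screens before bed.",
      ]
      print(retrieve("How can I identify cognitive distortions in my thought record?", passages, k=1))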

    • by gweihir ( 88907 )

      If you are primarily interested in getting rich and do not mind hurting people along the way, offering an "AI therapist" is for sure one of the ways to do it.

    • Let me fix the title; it's missing a space and a few commas: "The AI, the rapist, can see you now."
  • 'He says these results were comparable to "studies with folks given a gold standard dose of the best treatment we have available."'

    To be clear, this isn't an endorsement of AI, it's a condemnation of "the best treatment we have available".

    So it appears that this "research" suggests that empathy is not required for effective "mental health therapy". Imagine having a sociopath as your therapist, one who claims to provide care just as good as any other. Sure thing.

    "The U.S. continues to grapple with an acute

  • by ItsJustAPseudonym ( 1259172 ) on Wednesday April 09, 2025 @11:42AM (#65292753)

    The U.S. continues to grapple with an acute shortage of mental health providers.

    And part of the reason for this shortage of providers is the shortage of medical insurance companies in the U.S. who will cover treatment by those providers. Can't get paid? Then why be a provider?

    Maybe insurance companies will cover AI providers because the costs of the therapy will be low and predictable. At least at first. That might change when the lawsuits begin because the therapy goes wrong.

    • The U.S. continues to grapple with an acute shortage of mental health providers.

      And part of the reason for this shortage of providers is the shortage of medical insurance companies in the U.S. who will cover treatment by those providers. Can't get paid? Then why be a provider?

      Did you mean to say "part of the reason for this shortage of providers is the lack of a national health-care program designed to ensure that people who need it get it"?

      And yes, I know many/most national health systems don't adequately cover mental health, but it's getting better.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday April 09, 2025 @11:47AM (#65292775)

    ... is crackpot science. Not all of it, and perhaps not the majority, but given that the quality of therapy is extremely dependent on the match and vibe between patient and therapist, and that many practicing therapists aren't that good at their job, I totally believe that an AI can be a better option for many.

    Especially given the fact that, by simple heuristics, a therapy AI can gain experience from talking to hundreds of millions of people. Something that is totally beyond anything a human can do.

    If AI comes anywhere near what is technically possible, they'll basically be gods. And what better therapist than a literal god that gives you more than full attention and perhaps, in the future, knows every detail of your life and can recall it even if you yourself have long since forgotten it and moved the experience into the subconscious.

    By and large it's a no-brainer that AI is going to replace human therapists in the foreseeable future. If civilization doesn't crash and burn that is.

  • by BrendaEM ( 871664 ) on Wednesday April 09, 2025 @11:48AM (#65292781) Homepage
    As someone who co-facilitated a peer support group for 10 years: an AI therapist would not be able to take in all the subtle messages that a person communicates without words. One of the greatest instruments, or detriments, a therapist has is their own visceral reactions. People can more easily lie on a paper test.
    • Humans have put 15% of the US population on SSRIs.

      I think you are underestimating the potential of harm for the existing situation.

      • It's prejudiced assholes like you, who constantly spout ignorant negative opinions about psychiatric medicine, that keep many people suffering who could easily have vastly improved lives. Those people not only have to deal with mental illness, but also internalize the idea that their illness is due to some character flaw, that their suffering is a weakness, that only failures seek therapeutic and/or medicinal help for mental health problems rather than just "toughing it out." And how many of those people se
        • Psychiatric medicine is not like real medicine, it's just throwing stuff at the wall and seeing what sticks. Which is fine for extreme cases, but 15% of the population are not extreme cases. Sometimes it sticks wrong, extremely wrong.

          https://www.cdc.gov/nchs/produ... [cdc.gov]

          • Psychiatric medicine is not like real medicine, it's just throwing stuff at the wall and seeing what sticks. Which is fine for extreme cases, but 15% of the population are not extreme cases. Sometimes it sticks wrong, extremely wrong.

            https://www.cdc.gov/nchs/produ... [cdc.gov]

            Even accepting your point, it's very, very important to note that there's a difference between psychiatry and psychology.

            Most people with most problems need to see a psychologist/counselor/therapist. Pick your term. They need help identifying, understanding, and coping with stressors and events that are external to them.

            Other people have chemical imbalances that lead to unwanted conditions. Not all of these can be addressed by diet and exercise. Brain chemistry is real.

            My point is that both types

      • Oh yeah. It's not the fact that for the last century we've put people under unnatural stressors and invented new ways of being in social spaces that are innovatively toxic. It's that we're too quick with the prescription pad. Sure. It's not that we've been beta testing for the last 40 years and the data is coming back scary.

        One problem is the growth of knowledge: the recognition that a trait some people have always had is actually treatable. When I was a kid, brats and slow learners didn't get treated, they got

        • Amphetamines are a lot more predictable than SSRIs.

          AFAIK school outcomes didn't improve much from putting kids on amphetamines, though; at least the side effects were less disastrous, even with so little beneficial effect.

      • by gweihir ( 88907 )

        Surgeons kill people every day because they are not perfect. According to your "advice", surgery should be abolished.

        No, I do not expect you to see how stupid and harmful you are being.

        • What advice did I give? I suggested he held automation to an unreasonable standard given the state of human psychiatric "medicine".

    • by HiThere ( 15173 )

      I think there are populations for which this could be useful. The problem would be identifying them. And most therapist interactions are extremely limited by what will be covered. As a facilitator in a peer support group this may not be as obvious to you.

      FWIW, I found co-counseling quite useful compared to the aid that my health plan would cover.

    • Human therapists also often make mistakes. The question is, which type of therapist is more likely to get it right? For now, the human, probably. In the future, I'd place my bets on the AI therapist.

  • by sml7291 ( 6482168 ) on Wednesday April 09, 2025 @12:06PM (#65292827)

    Ah yes... telling your deepest, darkest secrets to a machine so they can be stored on a remote system somewhere that you have no control over.

    What would you want to bet that data gets used against you or is simply sold to the highest bidder... what could possibly go wrong indeed?

  • This will greatly improve the targeted advertising I get from my therapist. I mean, just last week she was trying to get me to waste my money going out to a bar to meet women. I'm sure Grok will understand me better and ensure that I spend my money on the important things, like tiny houses and testosterone supplements.
  • Talk therapy is close to useless
    It's hard to imagine that a chatbot could be worse

    • Talk therapy is close to useless

      You're wrong. [psychiatry.org]

    • Talk therapy and some medication have made my quality of life, physical and mental, dramatically better. I don't just disagree. I know you're wrong, like I know that water is wet.

    • by HiThere ( 15173 )

      You are assuming that all problems are the same.
      For some people's problems, talk therapy is useless.
      For some people's problems, they need to talk to someone else.
      For some people's problems, all they need is to express themselves and look at the problem. (Writing a book would work as well as anything, but not everyone is up to writing a book. This might be a good substitute.)

    • by PPH ( 736903 )

      Why do you say talk therapy is close to useless?

    • by gweihir ( 88907 )

      Provably wrong. So there seems to be some problem in perceiving reality on your side.

  • Psychiatrists prescribe drugs. That's the full story.
    Psychiatrists killed talk therapy... a long time ago.
    But a lot of people think talk therapy can be beneficial, so this is a good case for a private AI.
    Otherwise the AIs are going to be offering you coupons to enter into a long contract for your meds.

    Eliza might be just as good... "tell me more"... might be enough for people to enter into self-reflection.
  • If you can have an emotional relationship with a chat bot, then your mental health is bad. Friendship, love, or therapy, emotionally connecting with software is a sign of serious problems. Encouraging such relationships is a sign of sociopathy, which is pretty common at Dartmouth and other top universities.

    • by HiThere ( 15173 )

      Not everyone needs an emotional relationship with their therapist. (I know, that's heresy.) Sometimes all they need to do is look at their lives... but that takes time, which a health plan won't cover. (And sometimes they need a lot more than talking about their problem.)

      • If I just want a step by step, I can read a book. If I want therapy, I need to have a connection with somebody. Otherwise, I could just talk to a wall. If I dislike a therapist, I cannot use them. If I distrust them, I cannot use them. If I fear their judgment, I cannot use them.

        • by HiThere ( 15173 )

          Reading a book is unlikely to work unless there are lots of exercises, and you actually do them honestly.
          OTOH, writing a book might well work, if you honestly evaluate what you are writing. But few people are up to that. This might act as a substitute.

  • Is AI prohibited from revealing the substance of conversations it has in therapy sessions?

    Will unlicensed therapy AIs get the same treatment as legal AIs?

  • by Petersko ( 564140 ) on Wednesday April 09, 2025 @01:51PM (#65293117)

    The problem is data. You would need vast amounts of it. Most of it exists in the form of session notes, which are incomplete and only useful to the original practitioner because all of the gaps are tied together by their memory. There are also recordings of sessions, the use of which would be wildly in violation of privacy standards. But if you did use them, in the absence of future data, how do you know what represented good practice and care? Was the session useful? You do not really ever know that. A patient might seem to improve in spite of care, rather than because of it.

    It's like trying to train an automotive mechanical diagnostic engine using mostly just the invoices billed to clients. You don't really know which parts and services were actually needed, which were not, and which constituted outright theft.

  • Put the AI in a box without an Internet connection, and I might decide it's worth a healthy person playing around with it.

    If you are having trouble coping and looking for advice, using an AI already seems like a risky gamble. If it's connected to a cloud app, it's so much worse as you can expect at some point your data will be sold or stolen.

  • But this story has such high potential?

  • because I don't want to see an AI "therapist".

  • That the common phrase "better go talk to someone, anyone" might outshine the entire field of mental health is telling and has substantial basis in reality. Whether your head doctor is selling your head space for personal fortune-building becomes much more of a question. They really are out to get you, and the call is coming from within the house.
  • > New research suggests that given the right kind of training, AI bots can deliver mental health therapy with as much efficacy as -- or more than -- human clinicians ..

    If that's the case then it may reflect the total inadequacy of the profession.
    -------

    “Listen,” said Ford, who was still engrossed in the sales brochure,

    “they make a big thing of the ship’s cybernetics. A new generation of Sirius Cybernetics Corporation robots and computers, with the new GPP feature.”
  • she made me clean my room before dinner. AI: "That will be $200, please." Roll eyes.
  • I had a dream in which I talked to an AI and asked it to compose a letter from a mother to a son, asking him to quit smoking. Then I talked to DeepSeek, and here's the letter it composed, after a few iterations. https://pastebin.com/8Uky5br6 [pastebin.com] I hope it helps!
  • Sounds a lot like the confessional booth from THX-1138 [youtube.com]
  • ...Going to an AI chatbot for a conversation and to vent about work and relationships. It's actually amazing at pointing out relevant things to focus on, and it gives a good attempt at mitigating depression to some small extent. It's certainly not terrible.

  • by codemachine ( 245871 ) on Thursday April 10, 2025 @01:31PM (#65295335)

    Emacs has a doctor feature based on ELIZA, which dates back to the 1960s.

    Maybe if we throw a few more billions of dollars at it, modern AI systems will be able to do everything that emacs can.
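
    For anyone curious what that 1960s lineage looks like in practice, here is a minimal ELIZA-style sketch in Python (M-x doctor itself is Emacs Lisp; the rules and reflections below are made up, just to show the pattern-match-and-reflect idea):

    import random
    import re

    # First/second-person swaps so the echoed fragment reads naturally (illustrative, not Weizenbaum's originals).
    REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "your": "my", "you": "I"}

    RULES = [
        (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
        (r"i am (.*)", ["Why do you say you are {0}?", "Does being {0} trouble you?"]),
        (r".*\bmother\b.*", ["Tell me more about your family."]),
        (r".*", ["Please, go on.", "Tell me more.", "How does that make you feel?"]),
    ]

    def reflect(fragment):
        # Swap pronouns word by word ("my job" -> "your job").
        return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

    def respond(utterance):
        text = utterance.lower().strip(" .!?")
        for pattern, replies in RULES:
            match = re.fullmatch(pattern, text)
            if match:
                return random.choice(replies).format(*(reflect(g) for g in match.groups()))

    print(respond("I feel stuck in a dead-end job."))  # e.g. "Why do you feel stuck in a dead-end job?"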
