AI Medicine

A Chatbot Helped More People Access Mental-Health Services (technologyreview.com)

An AI chatbot helped increase the number of patients referred for mental-health services through England's National Health Service (NHS), particularly among underrepresented groups who are less likely to seek help, new research has found. MIT Tech Review: Demand for mental-health services in England is on the rise, particularly since the covid-19 pandemic. Mental-health services received 4.6 million patient referrals in 2022 -- the highest number on record -- and the number of people in contact with such services is growing steadily. But neither the funding nor the number of mental-health professionals is adequate to meet this rising demand, according to the British Medical Association.

The chatbot's creators, from the AI company Limbic, set out to investigate whether AI could lower the barrier to care by helping patients access help more quickly and efficiently. A new study, published this week in Nature Medicine, evaluated the effect that the chatbot, called Limbic Access, had on referrals to the NHS Talking Therapies for Anxiety and Depression program, a series of evidence-based psychological therapies for adults experiencing anxiety disorders, depression, or both.
Venture capitalist Vinod Khosla comments on X: This landmark study codifies what we have believed for so long -- that AI will not only increase the quality of care but also massively improve its access, which is one of the largest barriers to good health in all corners of the globe.

Comments Filter:
  • Seek mental health treatment immediately.
  • This just reeks of faked demo BS to promote their product.
  • by Kunedog ( 1033226 ) on Wednesday February 07, 2024 @02:46PM (#64222618)

    But neither the funding nor the number of mental-health professionals is adequate to meet this rising demand, according to the British Medical Association.

    The chatbot's creators, from the AI company Limbic, set out to investigate whether AI could lower the barrier to care by helping patients access help more quickly and efficiently.

    Sounds like demand is still up. You assert a supply problem, then say nothing about addressing it.

    • by mjwx ( 966435 )

      But neither the funding nor the number of mental-health professionals is adequate to meet this rising demand, according to the British Medical Association.

      The chatbot's creators, from the AI company Limbic, set out to investigate whether AI could lower the barrier to care by helping patients access help more quickly and efficiently.

      Sounds like demand is still up. You assert a supply problem, then say nothing about addressing it.

      There are a lot of big problems with mental health access in the UK; funding is just one of them, and this helps with several of the others.

      Knowing how and where to get access is a bit of an issue, specifically when it comes down to trying to get help with a specific problem. Second to this is that some people don't know they need (or can get) help with a mental health issue. Another huge issue is embarrassment: people find it quite difficult to discuss their problems with others, loved ones and strangers alike.

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Wednesday February 07, 2024 @03:32PM (#64222720)
    Comment removed based on user account deletion
    • This sounds like the opposite. It's an option for people who don't want to discuss things with their GP for whatever reason. In this case, it replaces filling out a form that will enable you to be referred to a human therapist.
    • by XXongo ( 3986865 )

      This seems to be just another robot gauntlet to conquer in order to finally get access.

      Yep.

      But, in the best case, it replaces a minimum-wage phone operator, rather than adding another level of gauntlet. And replaces it with one who knows all the rules.

      In the near term, automated systems that try to do this absolutely suck. They can only answer the simplest of questions, and they don't really understand the questions; they just parrot canned answers in response to hearing key words.

      The question is, are they going to get better?

  • by eln ( 21727 ) on Wednesday February 07, 2024 @04:23PM (#64222860)
    Eliza helped me through some serious shit back in the day.
  • by electroniceric ( 468976 ) on Wednesday February 07, 2024 @04:38PM (#64222892)

    As with all things AI (and actually for many technologies stretching into the past), there is a raft of stories along the lines of:

    Can you believe what technology X can do?
    It can help with hard problem Z!
    Here is a study where it showed a positive result!
    We should definitely use technology X for everything!

    In this case, as in so many, technology X is being pushed by a company that is selling it and funded the study. I've created, run, and published studies like that myself. They can have meaningful findings, but often the company's takeaway is drawn only from the positive findings:

    Overall, patients who’d used the chatbot and provided positive feedback to Limbic mentioned its ease and convenience.

    Translation:

    If you liked this idea, you thought it was neat

    What I want to know is where the studies are that show how often people who need mental-health help try a technology interaction, with or without AI, and come away turned off, discouraged, or frustrated. But nobody likes to measure that, and anyone talking about AI is going to report the stuff people love about it.

    We'll have to wait 5 more years until people can say "AI's not the answer after all".

  • California has a unique mental-healthcare problem: there are few psychologists compared to, say, New England. There are a lot of social workers here doing counseling, but that is not covered under Medicare. If you are on disability/Medicare, as I am, finding mental healthcare is difficult. I tried for six months to get a counselor from Stanford Psychiatry for trauma- and autism-related issues, and what I got from them was nothing but problems. They lied to me, and misrepresented what care
