Medicine

Hospital Algorithms Are Biased Against Black Patients, New Research Shows (medium.com) 79

Algorithms meant to help patients in need of extra medical care are more likely to recommend relatively healthy white patients over sicker black patients, according to new research set to be published in the journal Science. From a report: While the researchers studied one specific algorithm in use at Brigham and Women's Hospital in Boston, they say their audit found that all algorithms of this kind being sold to hospitals function the same way. It's a problem that affects up to 200 million patients being sorted through this system, their paper claims. Sendhil Mullainathan, co-author of the paper and professor at the University of Chicago, says the research is intended to empower "customers" -- hospitals in this case -- to vet the mechanisms behind the software they're buying. "We're going through this phase where customers buying crucial products aren't being informed in what they are," Mullainathan says. "It's like when I buy a car -- I don't literally know what's happening under the hood."

Comments Filter:
  • by Chas ( 5144 )

    I'm tired! BEING TIRED IS RACIST!
    I'm hungry. BEING HUNGRY IS RACIST!
    Can you stop screaming at me please? NORMAL TONES OF VOICE ARE RACIST!

    Hospitals try to come up with a care regimen based on a proverbial "average" patient.

    That doesn't mean they don't deal with patients as individuals.

    Merely that their care metrics START at a given point, then are adjusted to meet the patient's needs.

    It's not like they get a POC in, and the nursing staff berate them for not responding "as well as a WHITE patient does".

    • Re: (Score:1, Flamebait)

      by Can'tNot ( 5553824 )
      No one said anything about racism; that's all you. Algorithms can't be racist, but they can be wrong or misleading, and when that happens systematically, as here, it can be a problem, and it can also be interesting to learn why.
      • by Chas ( 5144 )

        Biased
        Against
        Black
        *INSERT HERE*

        In common political parlance of 2019, that's "THAT'S RACIST!"

        • In common political parlance of 2019, that's "THAT'S RACIST!"

          No, it isn't. When an algorithm has bias, that's a specific problem with a specific meaning. It's not a dog whistle, it's not code for anything, it's not part of some conspiracy. It's a flaw in an algorithm.

          • by Chas ( 5144 )

            I know this.
            You know this.

            Social justice activists?
            Ignore it as hard as they can and scream "Racism".

  • Dupe (Score:5, Insightful)

    by Shaitan ( 22585 ) on Monday October 28, 2019 @09:43AM (#59354636)

    The algorithms aren't biased toward white people. They are biased toward conditions and insurance coverage which generate more revenue. This apparently results, indirectly, in recommending more attention for white patients than black patients as a statistical correlation, but it has nothing to do with racial thinking or bias. The software is not looking at race; if more black patients came in with better insurance or more profitable conditions, the software would show a different correlation.

    The race card is a red herring to divide us roughly down the middle and pit us against each other. The actual issue is one of wealth inequality, which divides us 99.99% to 0.01%; distracting you from that is the purpose of "multiple dimension" racial thinking. The straw man is when they say "sicker black people." It shouldn't matter what race the individuals are; that has nothing to do with medical care, but the sicker part does. Are you good with a for-profit medical system that would assign more resources to a nose job than to your heart attack? If not, stop supporting the idea that the system should work that way, regardless of race.

    • Pretty sure that's the point, and I'm pretty sure the goal here is to maximize patient outcome, not profit.

      When you ask "Are you good with a for-profit medical system that would assign more resources to a nose job than to your heart attack?" the answer is no, and these algorithms need to be fixed.

      • ...I'm pretty sure the goal here is to maximize patient outcome, not profit.

        If you truly believe that is true in the 21st Century, then I dare you to try and sell that line of bullshit to any Board of Directors presiding over any part of the Medical Industrial Complex.

        There are hundreds of thousands of deaths every year in the US due to medical error, and yet how often have you heard about that problem on the evening news, or even in social media? Exactly. THAT is how much they actually care about "outcome" over profits.

        And as for anyone asking about for-profit ethics in medicine

        • Re: (Score:3, Informative)

          by geekoid ( 135745 )

          No, this is about improving health for more people. Full. Stop.
          Better outcome means better profits.

          "There are hundreds of thousands of deaths every year in the US due to medical error"
          False.
          Not that I expect actual thinking and data to change your incorrect narrative, but in case I am wrong(and I hope I am) here you go:

          https://sciencebasedmedicine.o... [sciencebasedmedicine.org]

          • Re:Dupe (Score:5, Insightful)

            by carlcmc ( 322350 ) on Monday October 28, 2019 @12:32PM (#59355060)
            No, this is about improving health for more people. Full. Stop.
            Better outcome means better profits.

            "There are hundreds of thousands of deaths every year in the US due to medical error"
            False.
            Not that I expect actual thinking and data to change your incorrect narrative, but in case I am wrong (and I hope I am) here you go:

            https://sciencebasedmedicine.org/are-medical-errors-really-the-third-most-common-cause-of-death-in-the-u-s-2019-edition/ [sciencebasedmedicine.org]

            You are incorrect on several points.

            Your first point is so nebulous as to be poorly quantifiable: a) "improving", b) "health", c) "more people". All three carry significant connotations and room for debate among many on all sides; it is not just your definition.

            Better outcomes do not equal more profits. Example: we could do a CT abdomen/pelvis on every single person in the US beginning at, say, age 30. Deaths would be 1) mostly eliminated: kidney cancer; 2) significantly reduced: small/large bowel and urothelial cancer; 3) somewhat reduced: pancreatic cancer.

            But it would be enormously expensive. Health care is a cost/benefit calculation. If you don't make enough to keep the lights on, you can't serve your primary mission.
            • You're horribly confused. Yes, it would be very expensive. That expense would equal MORE profit for the medical system, and higher costs for you. Or are you suggesting that they would just do it for free?

              • You're horribly confused. Yes, it would be very expensive. That expense would equal MORE profit for the medical system, and higher costs for you. Or are you suggesting that they would just do it for free?

                For someone who simply assumes that a "very expensive" business expense would automatically equal more profit, I'd say you're the one who's horribly confused. Insurance companies have tolerances, and you're only going to attract a handful of clients willing to pay $100K for a treatment.

                I'd also say you have no idea how much profit there is in keeping people sick either. Why do you think shitty junk food will always be inexpensive in our society? Think corn subsidies are going away anytime soon when HFCS n

                • For someone who simply assumes that a "very expensive" business expense would automatically equal more profit, I'd say you're the one who's horribly confused.

                  Nope, you're still horribly confused. It's only a business expense if you can't charge for it. So I guess you are suggesting that these tests would - for some mystical reason - be free?

                  Insurance companies have tolerances, and you're only going to attract a handful of clients willing to pay $100K for a treatment.

                  Yeah, that's what I said. They would LOVE to be able to charge everyone for those tests. The problem isn't that they can't make money by performing unnecessary tests; the problem is that most people/insurance companies don't want to pay for it.

                  I'd also say you have no idea how much profit there is in keeping people sick either.

                  I assume you're speaking from experience?

                  Why do you think shitty junk food will always be inexpensive in our society?

                  Because it's cheap to produce and easy

                  • For someone who simply assumes that a "very expensive" business expense would automatically equal more profit, I'd say you're the one who's horribly confused.

                    Nope, you're still horribly confused. It's only a business expense if you can't charge for it. So I guess you are suggesting that these tests would - for some mystical reason - be free?

                    How exactly does every major capital purchase by every corporation get justified? Ever hear of ROI before? Getting lost in taxation semantics is pointless here. Bottom line is companies justify all purchases/expenses with revenue or potential revenue. This would be no different.

                    Insurance companies have tolerances, and you're only going to attract a handful of clients willing to pay $100K for a treatment.

                    Yeah, that's what I said. They would LOVE to be able to charge everyone for those tests. The problem isn't that they can't make money by performing unnecessary tests; the problem is that most people/insurance companies don't want to pay for it.

                    Agreed. However, the difference in unnecessary and necessary treatments is a matter of morals and ethics, which our Medical system has none. You've already highlighted the fact that we already have great preventative tests that

          • No, this is about improving health for more people. Full. Stop. Better outcome means better profits.

            "There are hundreds of thousands of deaths every year in the US due to medical error" False. Not that I expect actual thinking and data to change your incorrect narrative, but in case I am wrong(and I hope I am) here you go:

            https://sciencebasedmedicine.o... [sciencebasedmedicine.org]

            I have read that article and several others, and there is one line in your cited article that tends to stand out above the rest:

            "In addition, it is probable that a significant number of deaths involving AEMT are not captured because of incomplete reporting."

            In other words, the author readily admits that no study is "bulletproof", and has highlighted a major issue in trying to determine the actual number of deaths caused by those who perpetually "practice" their craft within the massive for-profit business we call the Medical Industrial Complex.

            Even the 108,000 death figure cited in the article is a number you never hear about in the evening n

          • by Shaitan ( 22585 )

            "Better outcome means better profits."

            Not when curing a condition generates less profit than not curing it. In some cases it is more profitable to make the most effective treatment cost prohibitive for all parties, or only affordable to insurance, and let the rest make do with less effective treatments they'll continue to take as their condition degenerates and they eventually die. Those aren't better outcomes, but they are certainly more profitable.

            The free market doesn't work when the market will bear "anything I can

        • According to a recent study by Johns Hopkins, more than 250,000 people in the United States die every year because of medical mistakes, making it the third leading cause of death after heart disease and cancer. (Feb 22, 2018)

          https://www.google.com/search?q=statistics+on+medical+deaths+due+to+mistakes&rlz=1C1GCEB_enUS843US843&oq=statistics+on+medical+deaths+due+to+mi&aqs=chrome.1.69i57j33l3.13488j1j8&sourceid=chrome&ie=UTF-8/ [slashdot.org]

          Seriously, I didn't even scroll down the page. Very first par

          • The article he linked touches on the Hopkins study and shows the problems with it. Had you bothered following his link instead of immediately jumping to Dr. Google you could have avoided looking like a jackass.

      • by jwhyche ( 6192 )

        the answer is no, and these algorithms need to be removed.

        Your answer is correct but your reasoning on the algorithms is wrong. There is no room in medicine for algorithms like this. The very nature of such algorithms will maximize profit, even if that wasn't their intent.

        • Then how do you select patients and assign treatments or doctors?

          First come, first served is an algorithm.
          So is random assignment.
          ANY procedural process is an algorithm.

          To completely remove "algorithms like this" from medicine, we can do nothing but shut down all medicine! The only TRULY fair outcome is everyone dies!

          SMOD 2020

          • by jwhyche ( 6192 )

            Incorrect. When you remove algorithms like this then you get people dealing with problems. They assign proper order to the medical conditions. It is what happens when computers are taken out of the situation.

            • And how do people make those decisions? Is there some sort of set of rules they follow? Something that might be described as... an algorithm?

              • by Shaitan ( 22585 )

                Yes and no, and only yes because the algorithm has line items that equate to open ended subjective discretion and are therefore meaningless in terms of an algorithm.

    • The report says that those who pay more are recommended for better or more treatment, using amounts paid as a proxy for healthcare needs. If the report had stuck to a discussion of income, it would probably not have provoked this reaction in you. In the report, 'black' is used as a proxy for 'poorer'. But it is right: since black people are poorer and hence pay less for their healthcare, they get recommended for less treatment in the future.
      • Not quite (Score:3, Informative)

        by BuckB ( 1340061 )
        The algorithm calculates a threshold based on prior expenditures and asks doctors to pay more attention to people who are above the threshold. The study actually says that Doctors are biased _for_ black people. Doctors countermand the algorithm and admit/accept black patients at a much higher rate (19.2% vs. 11.9% in sample), and higher than the trained/computed factor of 18.3%. The study does further sift the data for the most critical patients and says that doctors could do better there.
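The cost-threshold mechanism described above is easy to demonstrate in miniature. The sketch below is a toy model, not the vendor's actual software: two groups with identical distributions of true medical need, where one group bills 30% less at the same level of sickness (the study's core claim), are screened with a single shared cost threshold. Every number is invented for illustration.

```python
import random

random.seed(0)

# Toy sketch of a cost-threshold triage rule (NOT the actual commercial
# model; all numbers invented). Each patient has a true "need" level, and
# billed cost is need scaled by a group-specific spending factor.
def simulate(spend_factor, n=10_000):
    patients = []
    for _ in range(n):
        need = random.uniform(0, 100)                    # true medical need
        cost = need * spend_factor * random.uniform(0.8, 1.2)
        patients.append((need, cost))
    return patients

group_a = simulate(spend_factor=1.0)   # higher-spending group
group_b = simulate(spend_factor=0.7)   # identical need, lower spending

# Flag for extra attention everyone whose *cost* clears a shared threshold.
threshold = 60.0
flag_rate_a = sum(cost > threshold for _, cost in group_a) / len(group_a)
flag_rate_b = sum(cost > threshold for _, cost in group_b) / len(group_b)

# Same need distribution, very different flag rates.
print(f"flagged in group a: {flag_rate_a:.1%}, group b: {flag_rate_b:.1%}")
```

Under these made-up assumptions the lower-spending group clears the threshold far less often despite being exactly as sick, which is the kind of disparity the audit reports.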
    • Anyone maximizing profit more, by curing nobody while taking all their money and merely keeping them alive so they can make more for him, will win against anybody who doesn't.

      Healthcare can only be done by an organization that tries something with an actual purpose that is also correlating with improving humanity.

      Not by ravaging "as much as 'the market' (=humans) can bear".

      This is usually the job of a government. The organization of those humans, that furthers their interests.
      Sadly, something like that does

      The race card is a red herring to divide us roughly down the middle and pit us against each other. The actual issue is one of wealth inequality, which divides us 99.99% to 0.01%; distracting you from that is the purpose of "multiple dimension" racial thinking.

      Completely agree that distracting people with trivial social issues is great cover for actual issues. That's why transgender issues get promoted to the top as it's both trivial and divisive. Let's get back to economic issues which affect the 99% and stop focusing on smaller issues. The race card should be banned as it never serves a helpful purpose, just justification when other legitimate facts are lacking.

    • Comment removed based on user account deletion
    • I've seen so many dubious studies where what it really means is that poorer people tend to have worse prospects and aren't identifying real problems at all that I don't even bother to look any more to see what the study is about. Why waste the time? It's bound to be another case of race baiting.
  • by lFaRptHjbZDI ( 6200322 ) on Monday October 28, 2019 @09:46AM (#59354642)
    When you read the actual article it turns out that the algorithm isn't biased against black people; it selects people who have more healthcare expenses. It turns out that black people spend less on healthcare, and so their cases don't receive the same weighting. The article is very careful to avoid providing enough detail for us to see how it all really works, because that would probably make it clear that the problem wasn't the algorithm, but rather a conflict between the researcher's desired outcomes and the actual outcomes. The algorithm is designed to steer patients with the highest need of medical attention over and above the "usual" level of care, and the article does nothing to suggest that the algorithm is not correctly identifying these patients. The number of people with chronic heart disease is not the same as the number of people who need extra care over and above the usual care for people with chronic heart disease, but the researcher and the author of the article conflate the two in their hunt for victimhood.
    • The algorithm is based on assumptions that bias the results in favor of white people. It needs to bias results toward better patient outcomes. You said "[i]t turns out that black people spend less on healthcare", and that needs to be part of the consideration. Or rather, knowing the answer is biased, don't make it part of the question.

      • by guruevi ( 827432 ) on Monday October 28, 2019 @11:52AM (#59354852)

        It does bias results based on better patient outcomes, the algorithm simply decided that there are real divides between certain groups of people and then the "journalist" inserted their own biased thinking into the article.

        It's not racist, because it doesn't exclude ALL black people or ONLY include white people. It divides across a line which happens to have more black people on one side than the other. Moreover, the people recommended by the algorithm get MORE intensive treatment, and statistics show that groups receiving more treatment also have higher death rates, both from the conditions they came in with and from the higher risk of hospital-borne diseases.

        We inserted the bias into the statistical result; the 'algorithm' is just a simple statistical binning algorithm. You can't call the algorithm or its creators racist because they happen to show results that don't stroke your ego or accord with your political viewpoint.

      • Agree completely. One of the authors of the paper (who is quoted in the article) is simply pointing out that using ineffective proxies can result in poor patient outcomes. In this case, the proxy should be adjusted to result in better patient outcomes. The conclusion would be the same even if you leave out the dimension of race.

      • by Shaitan ( 22585 )

        "The algorithm is based on assumptions that bias the results in favor of white people. It needs to bias results on better patient outcomes. "

        It does both, but the first is coincidental correlation and not bias. The algorithm should not be developed with any built in racial bias, including one toward statistical racial equality.

        "You said "t turns out that black people spend less on healthcare", and that needs to be part of the consideration. Or rather knowing the answer is biased, don't make it part of the q

    • by geekoid ( 135745 )

      Yes, and THAT'S why it's biased.

      "but rather a conflict between the researcher's desired outcomes and the actual outcomes."
      Thanks for the random speculation; but it's wrong.

      Bias, when discussing algorithms, is not the same as a biased person, or racism.
      It's exposing a flaw in the algorithm, which can then be adjusted for.

      "article does nothing to suggest that the algorithm is not correctly identifying these patients"
      Yes, it does.

      There is no victimhood going on. If you really cared, you would read the study.

    • by hey! ( 33014 )

      In other words, the review found *structural* racism as opposed to *ideological* racism.

      I suspect the same algorithms would also predict, significantly better than chance, whether a patient has quality primary care. The idea that you can show up at a US hospital and get excellent treatment is ridiculous; as a patient you don't know what you need or what should be done, so you need a doctor to advocate for you.

      Difference in primary care would readily account for racial disparities, whatever the *intend

        • In other words, the review found *structural* racism as opposed to *ideological* racism.

          Your conclusion is based on the assumption that resistance to traditional medical treatment is distributed proportionally across race and socioeconomic class. If that were not the case, variation from that proportional distribution would not be evidence of any bias. If black people were half as likely as white people to need extra medical care for the same condition, an algorithm that selected black people for extra care half as often as white people would have exactly zero bias. There would be no

      • by Shaitan ( 22585 )

        "In other words the review found *structural* racism"

        Actually it did not. The algorithm doesn't account for race, if it did, that would be structural racism. An inequitable racial outcome is not necessarily structural racism.

        Race is a completely artificial concept with no clear delineation or definition. A process, procedure, or algorithm which doesn't have that concept introduced* will never be racist no matter what the statistical correlation to race. The math isn't racist. If the methods we are us

    • It turns out that black people spend less on healthcare

      whoa there. black people have less money to spend on health care.

  • by Vanyle ( 5553318 ) on Monday October 28, 2019 @11:16AM (#59354682)
    It claims that black people spend less on medical care than white people do. We need to raise their costs so it is equal...
  • Bias Article (Score:4, Insightful)

    by sdinfoserv ( 1793266 ) on Monday October 28, 2019 @11:29AM (#59354740)
    Nonsense. Read the article: part of the sorting criteria includes how much people have "paid" for health care. Wealthier people have paid more. Wealthier people also tend to be white. This is not a bias ~against~ black people, it's a bias ~for~ wealthy people.
    It's like crime statistics. Poorer neighborhoods tend to have more crime. Poorer neighborhoods also tend to have more people of color. It's not about color, it's about crime.
    At some point honesty has to be part of the equation and we need to stop hiding behind imagined racism.
    • Poorer neighborhoods tend to have more crime.

      That does not mean it's the causal factor. It's relatively easy to run a linear regression against population and economic data from various cities to see whether race or socioeconomics is more strongly correlated with crime rates, but be prepared to be labeled a white supremacist and racist if you dare to perform that racist math.
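For what it's worth, the regression this comment proposes can be sketched with synthetic data. Everything below is invented: in this toy setup, crime is driven only by a poverty variable, and a demographic variable merely correlates with poverty, so the exercise shows only the mechanics of separating correlated predictors, not anything about real cities.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500  # synthetic "cities"; every number here is invented for illustration

# Toy ground truth: crime depends only on a poverty rate.
poverty = rng.uniform(0, 1, n)
# A demographic share that merely correlates with poverty.
demo = 0.5 * poverty + 0.5 * rng.uniform(0, 1, n)
crime = 5.0 * poverty + rng.normal(0, 0.5, n)

# Naive one-variable correlations make BOTH predictors look related to crime.
print("corr(crime, poverty):", round(np.corrcoef(crime, poverty)[0, 1], 2))
print("corr(crime, demo):   ", round(np.corrcoef(crime, demo)[0, 1], 2))

# A multiple regression holds each predictor fixed while testing the other:
# the demographic coefficient collapses toward zero, poverty stays near 5.
X = np.column_stack([np.ones(n), poverty, demo])
coef, *_ = np.linalg.lstsq(X, crime, rcond=None)
print("intercept, poverty, demo coefficients:", np.round(coef, 2))
```

With real data the result could go either way; the point is only that a multiple regression can distinguish predictors that simple pairwise correlations cannot.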

  • by pesho ( 843750 ) on Monday October 28, 2019 @11:33AM (#59354746)

    From the comments [sciencemag.org] on the original article [sciencemag.org]:

    "When the researchers searched for the source of the scores, they discovered that Impact Pro was using bills and insurance payouts as a proxy for a person’s overall health—a common tactic in both academic and commercial health algorithms..."

    Seems to me the algorithm is designed to maximize profit rather than health outcomes, or at the very least the designers are being lazy/clueless in selecting parameters that describe the patient's state. African Americans get ignored because they are less profitable:

    "The problem with that, he notes, is that health care costs tend to be lower for black patients, regardless of their actual wellbeing. Compared with white patients, many black patients live farther from their hospitals, for example, making it harder to go regularly. They also tend to have less flexible job schedules and more child care responsibilities."

    The same should apply to any low-income patient regardless of skin color. So the algorithm is not exactly racist. It is just greedy, stupid and inhumane, like the rest of the healthcare system in the US.
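The audit method behind findings like this can be sketched roughly as follows: compare how sick members of each group actually are at the same algorithm score. This is a toy reconstruction with invented data and a hypothetical spending gap, not the researchers' code.

```python
import random

random.seed(1)

# Synthetic audit sketch (all data invented): patients carry a true count of
# chronic conditions; the risk score is driven by spending, which understates
# sickness for group "b" (the hypothetical lower-spending group).
def make_patients(group, n=20_000):
    out = []
    spend_factor = 1.0 if group == "a" else 0.7
    for _ in range(n):
        conditions = random.randint(0, 8)        # true health need
        score = conditions * spend_factor + random.gauss(0, 0.5)
        out.append((group, conditions, score))
    return out

patients = make_patients("a") + make_patients("b")

def mean_conditions(group, lo, hi):
    """Average true sickness for one group within a band of risk scores."""
    vals = [c for g, c, s in patients if g == group and lo <= s < hi]
    return sum(vals) / len(vals)

# The audit question: at the SAME score, are the two groups equally sick?
for lo in range(0, 6, 2):
    a = mean_conditions("a", lo, lo + 2)
    b = mean_conditions("b", lo, lo + 2)
    print(f"score [{lo},{lo + 2}): group a avg {a:.2f} vs group b avg {b:.2f}")
```

If the score were an unbiased measure of need, the two groups would look equally sick within each score band; in this made-up setup group "b" is consistently sicker at the same score, which mirrors the kind of disparity the paper reports.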

    • Agree - the algorithm is about costs, reasons about prior costs, and makes recommendations. It's up to the nurses and doctors to do their job. But what the study leaves out is that, regardless of income, women spend more on health care than men. But you can't report that an algorithm is biased against men, so they left out those conclusions.
    • by tomhath ( 637240 )

      It is just greedy, stupid and inhumane like the rest of the healthcare system in US.

      Nonsense. The algorithm attempted to make a high-level recommendation based on the data which is available. Thank HIPAA for keeping most healthcare-related data out of sight. That, plus the fact that medical records are notoriously difficult to parse and understand, made it a reasonable decision to use total previous cost as a proxy for how sick the patient has been in the past (billing records are much more standardized than medical records).

  • I don't want them to attempt to cure a "pregnant" white guy's baby's sickle cell anemia because they "can't" tell different genes and needs apart.

    Hint: Don't abuse "bias" to mean some kind of *ism!

    The whole point of our brains is to bias new input based on previous input to properly predict things based on (exclusivity of) correlation.
    Any algorithm that is perfectly neutral is by definition perfectly useless.

  • by zkiwi34 ( 974563 ) on Monday October 28, 2019 @11:44AM (#59354794)

    Caring nothing about the data they are looking at, nor about understanding it, they find that black people etc.

    If they had clue 1, they would have noted that black people are far more likely to be poor than other groups.

    In the USA it is ALWAYS the poor who get screwed by the health system.

    Blacks feature more prominently as poor, ergo they get screwed over by the health system at a higher rate than their general presence in the community.

    • So in the 60s when they wanted to keep blacks out of neighborhoods they couldn't just have a "No Blacks" sign anymore. Civil rights and all that.

      So they made housing projects with zoning laws that required very large backyards.

      This bumped the price of the homes a bit. Coincidentally just a bit over what the average black home buyer could afford.

      It's also no secret that Voter Id laws are tailor made to disenfranchise minorities. The reason it's no secret is that several GOP operatives got caught
  • It's been mildly amusing to hear media reporting on Russia trying to push propaganda to create racial strife in the US while that same US media itself has been doing the same for simple profit for decades.

    Whether you find articles ridiculous or whether you buy into a narrative makes no difference. In either case the media was successful in generating outrage. Outrage yields profit and continuing motivation to keep right on doing what they're doing regardless of consequences.

    • by geekoid ( 135745 )

      WTF are you talking about? none of that makes sense.

      And a biased algorithm is just a flawed algorithm. Bias does not imply racist.
      All that is you projecting.

      " In either case media was successful in generating outrage."
      Yes, Russian propaganda being pushed on American citizens is outrageous. People should be outraged.

      Just so you know, that's actual factual data that that happened and is currently happening. There is no 'buy in'. Either you acknowledge the data, or you're an idiot.

  • DUPE (Score:4, Insightful)

    by MobyDisk ( 75490 ) on Monday October 28, 2019 @12:17PM (#59354982) Homepage

    But Slashdot's algorithms cannot yet detect a duplicate article [slashdot.org]

  • For all the reasons laid out in the above comments this is not an example of bias against blacks. The article is black racism if written by a black person, and the output of a white hyperliberal fellow-traveler black racist if put out by a white person. In any case, the easy, entitled racism against whites we see in this article has to stop.
  • The Electronic Frontier Foundation [eff.org] recently posted an article about algorithms discriminating in the housing market [eff.org]. Apparently, the existing laws protect people from racial bias that is done by proxy: meaning that you can't rent to people based on what neighborhood they grew up in, or what school they attended. While I see what they are trying to do, it concerns me. The trouble with algorithms is that they do exactly what they have been told to do. They optimize for that statistic - blindly.

    I know for

    • People who argue for what you describe are so myopic that they cannot see their latent racism. If you believe a credit score is discriminatory against blacks, you have to also believe that blacks are less capable of the things required for good credit score than are white people. How is that not racist?
  • Suggesting that looking at one algorithm allowed them to extrapolate to 200 others is so intellectually dishonest as to make this entire study junk. It's not even a single data point in favor of their intended case; as a result of the extreme bias of the researchers, their methods can't even be trusted.
