Researchers Find Racial Bias In Hospital Algorithm (wsj.com) 114
Black patients were less likely than white patients to get extra medical help, despite being sicker, when an algorithm used by a large hospital chose who got the additional attention, according to a new study underscoring the risks as technology gains a foothold in medicine. The Wall Street Journal reports: Hospitals use the algorithm -- from Optum, UnitedHealth Group's health-services arm -- to find patients with diabetes, heart disease and other chronic ailments who could benefit from having health-care workers monitor their overall health, manage their prescriptions and juggle doctor visits, according to the study published Thursday in the journal Science. Yet the algorithm gave healthier white patients the same ranking as black patients who had one more chronic illness as well as poorer laboratory results and vital signs. The reason? The algorithm used cost to rank patients, and researchers found health-care spending for black patients was less than for white patients with similar medical conditions.
For the study, data-science researchers looked at the assessments made by one hospital's use of the algorithm. The study didn't name the hospital. The researchers focused on the algorithm's rankings of 6,079 patients who identified themselves as black in the hospital's records, and 43,539 who identified as white and didn't identify themselves as any other race or ethnicity. Then the researchers assessed the health needs of the same set of patients using their medical records, laboratory results and vital signs, and developed a different algorithm. Using that data, the researchers found that black patients were sicker than white patients who had a similar predicted cost. Among those rated the highest priority by the hospital's algorithm, black patients had 4.8 chronic diseases compared with 3.8 of the conditions among white patients. The researchers found the number of black patients eligible for fast-track enrollment in the program more than doubled by prioritizing patients based on their number of chronic conditions, rather than ranking them based on cost.
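As a rough illustration of the re-ranking the researchers describe — prioritizing by number of chronic conditions rather than by predicted cost — here is a minimal Python sketch. The patient records, field names, and numbers below are invented for clarity; this is not Optum's model or the study's actual code.

```python
# Hypothetical illustration of the ranking difference described in the study.
# Field names and values are invented; the real algorithm is not public in this form.

patients = [
    {"id": 1, "predicted_cost": 12_000, "chronic_conditions": 3},
    {"id": 2, "predicted_cost": 12_000, "chronic_conditions": 5},  # sicker, same predicted cost
    {"id": 3, "predicted_cost": 20_000, "chronic_conditions": 2},
]

# Ranking by predicted spending (the proxy the study criticizes):
by_cost = sorted(patients, key=lambda p: p["predicted_cost"], reverse=True)

# Ranking by a direct health measure (closer to the researchers' alternative):
by_need = sorted(patients, key=lambda p: p["chronic_conditions"], reverse=True)

print([p["id"] for p in by_cost])  # [3, 1, 2] -- the cost proxy puts patient 3 first
print([p["id"] for p in by_need])  # [2, 1, 3] -- the need measure puts patient 2 first
```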
Re:Confused, Please Explain (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: Confused, Please Explain (Score:5, Informative)
The authors are saying that white people generate more revenue for the hospital, so they are likely to receive preferential treatment from the automated algorithm used. According to the paper, black people were statistically sicker than white people but had the same healthcare costs (i.e. revenue). This shows that sickness likely isn't the factor used to refer people to the program and that revenue generation strongly correlates with the documented outcome instead.
Re: (Score:1, Troll)
I have noticed the same thing in grocery stores: People spending more money get more food.
Re: (Score:3, Insightful)
Doctors are supposed to prioritize treatment based on medical need. If someone needs urgent treatment and someone else can wait, they are supposed to treat the urgent case first even if the less urgent patient is wealthier.
Such ethical considerations tend not to factor in to the provision of groceries.
Re: (Score:2)
Such ethical considerations tend not to factor in to the provision of groceries.
Neither in private health care.
Re: (Score:2)
> Doctors are supposed to prioritize treatment based on medical need.
What is the medical need for Viagra, when men can have sexual pleasure without erections? What is the medical need for laparoscopic surgery, or for addiction treatment centers, when people can simply eat less food or take less drugs? And what is the medical need for a wig for a cancer patient who's lost their hair, or for a replacement breast after a mastectomy for cancer? Treatment to ease medical consequences or complications are an
Re: Confused, Please Explain (Score:4)
In those cases the medical need is usually the psychological consequences, or the longer term health consequences.
Re: (Score:2)
Err...new one on me.
Exactly how would that work?
Re: (Score:2)
I can't speak on it from first-hand experience, but can't you go around back and stimulate things without needing co-operation from the front-end?
Not an ideal solution. Or maybe it is.
I dunno.
Re: (Score:1)
Re: (Score:3)
Does not happen. At least in NL (privatized medical insurance, yay!). I was kicked out (for being too expensive) from the system with a chronic illness (actually it is a disability) brought on by incomplete and incompetent treatment in the same health care system. I was told (by doctors and legal advisors) not to even think about suing either the system for mistreatment or the insurance for kicking me out, because I would lose (in NL doctors are overprotected from lawsuits for the fear of becoming "like US, whe
Re: (Score:3)
That's really awful and I hope something can be done for you, but what does medical malpractice have to do with SJWs?
Is it the capitalist angle, i.e. the huge monetary imbalance between you and the doctors that prevents you from suing them for restitution? Because if so, it seems like you should probably support the SJWs.
Re: (Score:2)
"what does medical malpractice have to do with SJWs?"
In the context of this article they are spinning an algorithm run by a business choosing the most profitable path as an issue about race. Maybe you think what the algorithm is doing is okay, maybe you think it is not, maybe you think it is fine for a for-profit and also why we shouldn't make medicine for profit. In any case race is merely a correlation and bringing it up in this manner only serves to deepen a divide based on skin color instead of solidari
Re: (Score:2)
Imagine thinking the Social Justice Inquisition was anti-capitalist. JFL. You progressives are "beards" for the capitalists who, like the gay men who used to marry women to maintain their cover, pander to lunatics on Twitter and use woke talking points for PR points. "See, we're on the side of the People. Now consume product then get excited for next product."
Re: (Score:2)
I thought they were supposed to be Communists? Aren't Communists against capitalism???
Re: (Score:2)
Sure, if you're a Boomer to whom anyone left of Reagan is a communist, as I'm sure a radlib like yourself could attest to.
Not quite (Score:4, Interesting)
Because the algorithm uses expected expenditures as a proxy for medical need, it's running into a problem because, for unrelated socio-economic or cultural reasons, black patients use fewer medical resources than others with the same needs. So, they tend to be sicker than whites who are consuming the same amount of resources, but as the algorithm only looks at expected costs, patients who should be in the program (which should more efficiently provide the needed resources) are not.
Re: (Score:2)
Actually it shows that revenue generation was the cause and not racism, the correlation to race was coincidental. Basically, the white people had better insurance.
Re: (Score:2)
Re:Confused, Please Explain (Score:4, Informative)
From the article abstract:
From the body:
In other words, the algorithm measures cost effectiveness vs risk. Will adding the patient to the program produce a cost-effective result? High values automatically get in, middle values are referred back to the primary physician, low values are automatically excluded.
The authors of the paper are unhappy that this doesn't produce the result they think it should. There's an entire section about socioeconomic unequal factors and race, historical bias and race, and social trust and race (tracing back to the Tuskegee syphilis study). Aka, there's a lot of non-measurable non-math that the authors use to justify their argument that pure math is unfair.
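A minimal Python sketch of the three-band triage logic the parent comment describes (high scores automatically get in, middle scores go back to the primary physician, low scores are excluded). The percentile cutoffs and function name are assumptions for illustration, not figures taken from the paper.

```python
# Sketch of the triage described above: auto-enroll at high risk scores,
# refer mid-range scores to the primary physician, exclude low scores.
# The cutoffs below are illustrative placeholders, not the paper's values.

AUTO_ENROLL_PCTILE = 97   # assumed cutoff for illustration
REFERRAL_PCTILE = 55      # assumed cutoff for illustration

def triage(risk_percentile: float) -> str:
    """Map an algorithmic risk-score percentile to a program decision."""
    if risk_percentile >= AUTO_ENROLL_PCTILE:
        return "auto-enroll in care-management program"
    if risk_percentile >= REFERRAL_PCTILE:
        return "refer to primary-care physician for review"
    return "no action"

for pct in (99, 70, 30):
    print(pct, "->", triage(pct))
```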
Re: (Score:3)
Aka, there's a lot of non-measurable non-math that the authors use to justify their argument that pure math is unfair.
Pure math is often times very unfair.
I think that part of the problem is that math, whether it be pure or sullied, does not have a good "bedside manner".
Curing ills and healing the injured require a little more than an algorithm or "A.I."
Empathy, compassion, dedication, commitment, insight, persistence: these are oftentimes the ingredients that provide the best clinical outcomes.
They are difficult enough to "teach" to humans. The best of them come pre-loaded.
Incorporating this into an algorithm or "A.I.
Re: (Score:2)
Depends who can pay and who is paying long term...
Re: (Score:2)
Math isn't unfair. The people using the math might be.
The problem here seems to be that the algorithm considers money as a factor. If you're the hospital, you probably think that's great. If you don't happen to be wealthy, you probably think it's unfair.
Re: (Score:2)
Money is, and must be, a factor in medical care.
In Iowa, there is a person that requires $1,000,000 in medical treatments [usatoday.com] per month. Yes, $12 million per year.
That $12 million being paid to keep that one person alive could, instead, be used to pay the lifetime expenses of 500-1000 other 'average' people.
There is, quite simply, not enough resources to give everyone unlimited treatment. There must be limits - and money is a more honest and fair proxy than any other that people have come up with.
Re: (Score:2)
I didn't say it shouldn't be. Of course money must be a factor in health care. *How* it is a factor is an important question. Most people, including health care providers and even administrators would agree that how much money *you* have would ideally not be a factor. Some health care systems are set up in such a way that the patient's wealth is an overwhelmingly important factor. Economists call that a perverse incentive.
Re:Confused, Please Explain (Score:4, Interesting)
I don't think they were "unhappy" about the results, I think they were merely discussing possible mechanisms of causation because the algorithm, according to their paper, specifically excludes race (https://science.sciencemag.org/content/366/6464/447). Since the correlation is not debatable, and because race is apparently not a parameter in the algorithm, then discussing possible mechanisms of causation is entirely reasonable, in my opinion. And when you're talking about that then of course socioeconomic status (as a possible "cause") needs to be examined or discussed because of well documented (but not well understood) socioeconomic health inequalities.
Re: (Score:3)
I should add that I think the fact that the algorithm behaves as it does is quite interesting when taken in the context of the possible socioeconomic explanations (see social determinants of health (SDH)). Considering that the social determinants of health are much studied (you'll probably find hundreds, if not thousands of papers using Google Scholar) but not well explained (if explained at all, considering some of the "paradoxes", such as moving up into a higher socioeconomic status from a lower one does not improve health, and vice
Re: (Score:3)
A more interesting question is if healthcare opportunities and outcomes should be dependent on socioeconomic status at all.
In most European countries the two are not supposed to be linked. They are, because we are not perfect, but in principle treatment is available based on medical need rather than socioeconomic status.
Re: (Score:3, Insightful)
Aka, there's a lot of non-measurable non-math that the authors use to justify their argument that pure math is unfair.
This isn't pure or even applied math. It's application of an ML algorithm. Anyone who ever uses this crap knows they're biased as hell and most of the work is massaging the data so you get sensible results rather than a bunch of irrelevant biases.
It's weird that people who work with buggy tech all day fall over themselves to deny problems when they relate to race.
Re: (Score:2)
Bull. It's perfectly possible to engineer a conventional algorithm that's inherently biased. Learning algorithms are not biased (generally provably so), and are much closer to "pure math" because the decision-making part is explicitly not based on what some human expert thinks.
The data you train one of these things with is usually biased, because very few of the people collecting the data, or using it, know the first thing about how to do so.
Re:Confused, Please Explain (Score:4, Informative)
It sounds like you're saying "bull" and then largely agreeing. Bear in mind you're talking about the learning algorithm, whereas TFA was referring to the inference step as the "algorithm". I was sticking to that definition, but I'm mentioning that here for clarity because it's pretty ambiguous overall.
I was avoiding talking about statistical bias because every time I've brought this up in these conversations I get modded troll. Apparently that topic is just too contentious for some of the mods here...
There are proofs for some algorithms for sure showing that they are unbiased estimators of the quantities in question. Classic ones being mean and population variance. Many algorithms have proofs of the opposite (for example some semi-supervised techniques) and many have no proof. We don't know what algorithm they use of course and don't know if they have used one that makes the bias-variance tradeoff more in the direction of variance. It's moderately popular to use ensembles of deep networks, and various ensemble techniques (boosting vs bagging) make different tradeoffs.
In no sense am I talking about human inserted bias, but statistically biased estimates are a very real thing.
That's of course only the bias inherent in the underlying training algorithms.
I think we are both in violent agreement that even with an unbiased learning technique, bias in the training data will cause the inference step to exhibit the same bias.
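For readers unfamiliar with statistical bias in estimators (as opposed to human-inserted bias), here is a small Python sketch of the classic example mentioned above: the naive sample variance is a biased estimator, while Bessel's correction removes that bias. The distribution and sample sizes are arbitrary choices for illustration.

```python
# Statistical bias in an estimator: the naive variance (dividing by n)
# systematically underestimates the true variance; Bessel's correction
# (dividing by n-1) gives an unbiased estimate.

import random

random.seed(0)
TRUE_VAR = 4.0  # variance of the sampled Gaussian (std dev = 2)

def average_variance_estimates(n_samples: int, n: int):
    naive_total = corrected_total = 0.0
    for _ in range(n_samples):
        xs = [random.gauss(0, 2) for _ in range(n)]
        mean = sum(xs) / n
        ss = sum((x - mean) ** 2 for x in xs)
        naive_total += ss / n            # biased estimator
        corrected_total += ss / (n - 1)  # unbiased estimator
    return naive_total / n_samples, corrected_total / n_samples

naive, corrected = average_variance_estimates(20_000, 5)
print(f"true variance {TRUE_VAR}, naive ~{naive:.2f}, corrected ~{corrected:.2f}")
# The naive average lands near 3.2 (i.e. 4 * (n-1)/n for n=5); the corrected one near 4.
```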
Re: (Score:2)
Results depend on the data you feed them. One important part of the data was: How much money did we spend already on that patient? The theory goes that an expensive patient probably has something that just gets more expensive with time, thus it would be better to spend the money now than have higher costs later, a.k.a. prevention is better than treatment.
As white people on average have more money, they tend to live closer to hospitals and doctors, making it easier
Re: (Score:2)
Re: (Score:2)
They are saying that black people have lower health spending, so the hospital makes less money treating them. In other words, the algorithm had no racism at all; the business just prioritized the most profitable clients and there was a correlation to race.
Re: (Score:2)
The algorithm and the engineers are optimizing the equation they've been told to optimize, but the people paying the bill literally have no idea what they're doing.
The algorithm has been told to focus on "expensive" patients, because making them healthier would in theory save more money. However, black people are probably sicker *because* less money is spent on their health in the first place. So this algorithm looks at patients with high costs, who incidentally *tend* to be white, then decides it should f
Re: (Score:2)
I bet you will find some sort of bias towards hydrotherapy in their algo.
Does the algorithm take skin color into account? (Score:4, Insightful)
Re: (Score:2)
How could it not? Skin color is one of the simplest factors to measure.
Re: (Score:3)
True, but does the algorithm "know" about the skin color of the patient? Just because some variable is known doesn't mean it shows up in the algorithm.
Re:Does the algorithm take skin color into account (Score:5, Informative)
Re: (Score:1)
Re: (Score:1)
If you want to "think" something, you can also look up the facts about whether "African Americans" have better health, e.g. through mortality rates. People actually measure numbers from the real world!
Re: (Score:2)
...and its not freaking close. https://www.cdc.gov/mmwr/previ... [cdc.gov]
Re: (Score:2)
This is why you control for non-natural factors.
Of course, it's not that trivial. I would expect the incidence of gunshot wound treatment to have clear racial divides, purely on the assumptions that most murders in the US are with firearms, that there's a high survival rate of attempted murder and that there are clear racial divides on murders.
So you have a healthcare need that exists in some communities to a far greater extent than in others, even though that would not be necessarily evident through analys
Re: (Score:2)
Re: (Score:2)
It does. White people spend more money on medication and medical treatment, and the hospital algo prefers people who are more likely to spend money on something they can sell. In other words, no, they did not directly select for skin color, they selected for a trait that is indirectly linked to skin color.
Re: (Score:2)
'Healthcare' needs to be about taking care of people's health, not about 'how much money can we make', leading to 'black people are poor, screw them, they're on their own'.
Re: (Score:2)
Yes, it should be. But if you make healthcare a for-profit business, don't be surprised if it acts like a for-profit business.
I think you missed a key component (Score:2)
The algorithm uses spending as a proxy for need, and since black patients tend to consume fewer medical resources than they need, their actual costs are lower than they should be and the algorithm overlooks them.
Re: Does the algorithm take skin color into account (Score:1)
Re:Does the algorithm take skin color into account (Score:4, Informative)
I'm not sure the algorithm in question takes skin colour into account, but it apparently does not take race into account. From the paper (https://science.sciencemag.org/content/366/6464/447)
Notably, the algorithm specifically excludes race.
Re: (Score:2)
Re: (Score:2)
The point they are trying to make is that even when they tried to exclude race from the algorithm some proxy for race was just found anyway.
Re: (Score:2)
Re:Does the algorithm take skin color into account (Score:5, Informative)
TFA already answers that, and the answer is no.
The issue, as TFA explains, is that the algorithm is prioritizing based on who has the most money rather than who has the greatest medical need. A non-white person might need treatment more urgently and benefit more from an early appointment, and a human would hopefully take that into account and put them at the front of the queue, but the algorithm puts the white person first because they have more money available.
You may think that's fine, and you can certainly make the argument that healthcare should be prioritized based on wealth if you like, but we should at least have that debate rather than pretending that medical need is the primary factor in decision making.
Re: (Score:2)
Re: (Score:2)
Algorithm specifically excluded race (Score:1)
Can we see this Optum algorithm and make our own minds up, seeing as it specifically excluded race [washingtonpost.com].
Re: (Score:2)
Can we see this Optum algorithm
No, you can't. And you can't even access it unless you pay them lots of money.
and make our own minds up
You're going to have to be a big boy and reach conclusions based on what you have, not some hypothetical.
seeing as it specifically excluded race.
It's amazing how slashdotters who work with buggy software all day every day and who are rightly happy to rag on companies for crappy software practices and poor management suddenly do a massive about-face when one of those mig
For profit medicine... (Score:2)
Will always consider cost first.
Sigh (Score:2)
Yup, because cost should be at the heart of any algorithm intended to identify at-risk patients and improve healthcare delivery....
Re: (Score:2)
Gov buys from a limited list of approved medicines at a very low cost. Want new medicine that is not yet approved? That new medicine does not exist in that nation.
It might get approved in 20 years depending on cost?
Get sick and hope the medicine is covered by that gov that year.
Surgery? Hope the condition can wait for the months and years to see the needed experts.
A private hospital offers new options to more people who can pay. Options
Re: (Score:3)
No, it’s not at the heart of all healthcare systems, and it certainly has absolutely NO place in any algorithm intended to identify patients that can be supported better - the cost decisions in, say, the British NHS happen after that point. Why? Because of things like this. Patient needs are identified and if a particular drug or service is on the restricted list due to cost, then it goes to panel *after* the individual patient need is identified, not as part of the identification process.
Re: (Score:2)
> No, it’s not at the heart of all healthcare systems, and it certainly has absolutely NO place in any algorithm intended to identify patients that can be supported better
The funds are not infinite. Efficiency and effectiveness have to count somewhat. Pre-natal care and vaccines, for example, are vastly more _efficient_ than months of natal ICU care and plagues, and the funds for them are sometimes drained by a very few cases. Brushing teeth is much less expensive than a root canal. So treatments ar
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
If you want to pay extra for some kind of treatment, you are looked at as stomping on someone else's civil rights.
Financial bias for profitable healthcare? (Score:1)
Re: (Score:2)
Interestingly, since black patients are consuming less healthcare than they may need (hence the disparity), putting them in the program may result in them consuming more healthcare. It's hard to say what the average impact of that would be on patient spending and the hospital
Racist surgeons (Score:1)
Whenever your skin turns black from gangrene they want to cut it out. Those racists!
Algorithm (Score:1)
Data correlation vs racial bias (Score:1)
I'm completely sure that the algorithm doesn't have explicit rules against race. Not even hidden rules intentionally created to do something against a race.
What we have here is data correlation. People from different races have different traits, different starting conditions, etc., which means they will have different results in total numbers.
That's not "racial bias" but different results based on the applied rules.
Talking about "bias" is a very bad approach to addressing the problems because it implies some
Black X Whites, or Rich vs Poor? (Score:1)
Re: (Score:2)
So ... (Score:2)
Re: (Score:2)
So how does race fit in? (Score:1)
A bias on class and wealth is not related to race. I'm sure there are many wealthy black people who are preferred along with "the whites". The bias appears to be with financial class, and not race at all. In fact, Asians and Middle-Eastern descendants rank right there with white people. So why single white people out? So, a business prefers people with money. Wow, big surprise.
The author is clearly a racist, and this article is what you get when a racist continues to look at everything as a "race" problem.
Major flaw in this.. (Score:2)
Algorithms are based on logic. The problem with trying to apply them to our healthcare system is that at best you will get a sliver of understanding of a larger system that nobody understands fully. I have worked at a couple of big healthcare companies in my career, specifically for hospitals, and it is extremely complex. Cost accounting in hospitals is a labyrinth, and that just addresses one of the problems. There are many more, and I highly doubt there is anyone who really understands it all holistic
Just follow the dollar (Score:2)
Being as the ACA very nearly gave them license to print money, it should be no surprise that they would pull something like this to increase their own payout. They knew they wouldn't have any real consequences to face; they'll say "oops!"
Wrong conclusion (Score:2)
poor (Score:2)
New normal: it's racist (Score:1)
Re:Complete Results? (Score:5, Informative)
https://science.sciencemag.org/content/suppl/2019/10/23/366.6464.447.DC1
Re: (Score:1)
From the paper: "Any patient who identified as Black was considered to be Black for the purpose of this analysis. Of the remaining patients, those who self-identified as races other than White (e.g., Hispanic) were so considered (data on these patients are presented in table S1 and fig. S1 in the supplementary materials). We considered all remaining patients to be White."
Presumably, this would include everyone who declined to answer the question.
About their approach, the paper concedes "i
Re: (Score:2)
Ssssh! You are ruining Kunedog's reverse-racism narrative!
Re: (Score:2)
The historical classification of hispanics in the US is fascinating. It flips back and forth based on contemporary politics and what some asshole from the Federal Bureau of Narcotics thinks.
oh dear (Score:1)
are you being oppressed?
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:3)
When the researchers changed a patient's race from black to white, kept all other metrics identical, and ran it through the algorithm again, the rank score came out identical.
The same when changing race from white to black and leaving all other metrics identical.
That doesn't surprise me and is why I wouldn't use the word 'bias' to describe the outcome.
It's not a racial bias, it's reflecting the reality of existing healthcare.
Far more useful would be an analysis and understanding of the root causes and determinants that lead to a racial discrepancy in treatment, because that can offer solutions.
From the summary it does seem that the primary factor driving the discrepancy is the historic cost, and that eliminating that factor provides recommendations that are far les
Re: (Score:2)
Wow, where do you get those statistics from?
Re: (Score:2)
I'm sure his first sentence is correct. Poor people are less likely to pay bills, particularly unreasonably enormous ones. Black people in the US are, on average, poorer than white ones.
The rest of his post is racist bullshit.