
AI Skin Cancer Diagnoses Risk Being Less Accurate For Dark Skin (theguardian.com) 56

AI systems being developed to diagnose skin cancer run the risk of being less accurate for people with dark skin, research suggests. From a report: The potential of AI has led to developments in healthcare, with some studies suggesting image recognition technology based on machine learning algorithms can classify skin cancers as successfully as human experts. NHS trusts have begun exploring AI to help dermatologists triage patients with skin lesions. But researchers say more needs to be done to ensure the technology benefits all patients, after finding that few freely available image databases that could be used to develop or "train" AI systems for skin cancer diagnosis contain information on ethnicity or skin type. Those that do have very few images of people with dark skin.

Dr David Wen, first author of the study from the University of Oxford, said: "You could have a situation where the regulatory authorities say that because this algorithm has only been trained on images in fair-skinned people, you're only allowed to use it for fair-skinned individuals, and therefore that could lead to certain populations being excluded from algorithms that are approved for clinical use. Alternatively, if the regulators are a bit more relaxed and say: 'OK, you can use it [on all patients]', the algorithms may not perform as accurately on populations who don't have that many images involved in training." That could bring other problems including risking avoidable surgery, missing treatable cancers and causing unnecessary anxiety, the team said.

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Is that because (Score:3, Insightful)

    by RightwingNutjob ( 1302813 ) on Thursday November 11, 2021 @05:34PM (#61979663)

    There are fewer samples in the training set or because the visual signal's SNR for skin cancer is lower for dark skin by virtue of the contrast between healthy skin and cancer being *objectively* lower for dark skin?

    I know. I'm sorry. I used the "o" word. I promise to do better and rage against the laws of logic, mathematics, and the physical universe next time.

    • Re:Is that because (Score:5, Insightful)

      by Flownez ( 589611 ) on Thursday November 11, 2021 @05:57PM (#61979733)
      You're on the money: SNR is the objective reason for this.

      The primary means of diagnosis of skin cancers is visual. Lower contrast between cancer visual indicators and skin tone = lower signal strength.

      Due to the lower incidence of skin cancer in this cohort there will be less data overall, with lower certainty in diagnosis where data does exist due to skin tone differences.

      This is a bias of nature, not a bias of race. A cosmic injustice perhaps, but one without a villain to pillory.
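      The contrast point can be made concrete with a toy calculation. A minimal sketch using Michelson contrast; the luminance values are invented for illustration, not measurements from any dermatology dataset:

      ```python
      def michelson_contrast(lesion, skin):
          """Michelson contrast between lesion and surrounding-skin luminance."""
          return abs(skin - lesion) / (skin + lesion)

      # Hypothetical 0-255 grayscale luminances, chosen only to illustrate the point.
      light_skin, dark_skin = 200.0, 80.0   # assumed background luminance per skin tone
      lesion = 60.0                          # assumed luminance of a dark lesion

      c_light = michelson_contrast(lesion, light_skin)
      c_dark = michelson_contrast(lesion, dark_skin)

      print(f"contrast on light skin: {c_light:.3f}")  # ~0.538
      print(f"contrast on dark skin:  {c_dark:.3f}")   # ~0.143
      ```

      With these made-up numbers the same lesion produces roughly a quarter of the contrast signal against dark skin, which is the "lower SNR" claim in quantitative form.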
      • by AmiMoJo ( 196126 )

        And yet humans can cope with dark skin just fine, so what is the issue with AI?

        It's poor hardware. The lighting and the camera settings are calibrated for light skin. The same thing used to happen a lot in movies: the lighting and film stock were all designed for light skin, so people with darker skin tended to look even darker than they really were, with little contrast and all the subtle shades of their skin tones lost.

        Even now you often find with cosmetic products there are 10 different light shades and 1 o

      • You're on the money: SNR is the objective reason for this. The primary means of diagnosis of skin cancers is visual. Lower contrast between cancer visual indicators and skin tone = lower signal strength. Due to the lower incidence of skin cancer in this cohort there will be less data overall, with lower certainty in diagnosis where data does exist due to skin tone differences.

        Melanin blocks out sunlight to avoid skin cancer. People closer to the equator are exposed to more solar radiation, so develop more melanin than races from closer to the poles. With greater filters, you will find fewer dark-skinned individuals with skin cancer.

        This is a bias of nature, not a bias of race. A cosmic injustice perhaps, but one without a villain to pillory.

        In the case of skin cancer, I wouldn't call having darker skin an injustice of any kind; it is such an advantage that it became the norm for peoples living in the tropics.

    • Re: (Score:1, Funny)

      by Anonymous Coward

      The SNR improves if you use a woke approved pronoun.

    • Re:Is that because (Score:4, Insightful)

      by timeOday ( 582209 ) on Thursday November 11, 2021 @08:42PM (#61980109)
      In this study, definitively the former: "Only 1,585 images contained data on ethnicity instead of, or as well as, information on skin type." And: "No images were from individuals with an African, African-Caribbean or South Asian background," the team report.

      If they had tried, the latter seems likely enough. But one piece of interesting info would be how humans (dermatologists) perform on different skin tones.

      It's not just a matter of SNR. The base rate for skin cancer is less with darker skin - more protection:

      According to the American Academy of Dermatology, people with darker skin tones often do not receive a diagnosis until the cancer is in its later stages. This tends to be because the symptoms are harder to recognize.

      Late-stage diagnosis can be life threatening. According to the Skin Cancer Foundation, the 5-year melanoma survival rate for black people in the United States is 65%, compared with 91% for white people.

      Skin cancer is not common among people with darker skin.

      In fact, according to the Skin Cancer Foundation, among black people in the U.S., skin cancer makes up only 1–2% of all cases of cancer. In people with darker skin, such as Hispanic and Asian people, skin cancer makes up 4–5% and 2–4%, respectively, of all cases of cancer.

      So, basically, nothing about the incidence, diagnosis, or survival rate of skin cancer is the same for people with white vs. black skin. And that's before AI enters the picture at all.

      • by AmiMoJo ( 196126 )

        It's likely that doctors have less experience at spotting skin cancer with darker skin, because it is less common in those groups. So AI could be a great leveller here, if trained to properly spot cancer in dark skin. Unlike a human the AI can gain a lot of experience very quickly and never forgets it.

        One other thing that isn't mentioned here is access to medical services. Late diagnosis could be partly due to patients not presenting themselves for checks until later. People with less money or less good hea

  • by Snotnose ( 212196 ) on Thursday November 11, 2021 @05:36PM (#61979679)
    Is to run more darker skin samples in the training runs.

    FFS this isn't racism or whatever, it's using what's available. It's stupid silicon, it has no free thought. Quit reading racism into everyfucking thing that happens.
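    What "run more darker skin samples in the training runs" amounts to is standard dataset rebalancing. A minimal sketch of group-balanced sampling; the group labels and counts are invented for illustration and aren't from any real dataset:

    ```python
    import random

    # Toy imbalanced dataset: 900 "light" examples vs 100 "dark" examples.
    dataset = [("light", i) for i in range(900)] + [("dark", i) for i in range(100)]

    def balanced_sample(data, n):
        """Draw n examples with equal representation per group.

        Minority groups are oversampled (drawn with replacement), so each
        group contributes n // num_groups examples to the batch.
        """
        groups = {}
        for label, item in data:
            groups.setdefault(label, []).append((label, item))
        per_group = n // len(groups)
        sample = []
        for members in groups.values():
            sample.extend(random.choices(members, k=per_group))
        return sample

    batch = balanced_sample(dataset, 200)
    counts = {}
    for label, _ in batch:
        counts[label] = counts.get(label, 0) + 1
    print(counts)  # -> {'light': 100, 'dark': 100}
    ```

    In practice you'd balance by image count per skin type (or weight the loss instead), but the idea is the same: the model sees each group equally often regardless of how skewed the raw data is.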
    • by K. S. Kyosuke ( 729550 ) on Thursday November 11, 2021 @05:53PM (#61979729)
      It's not even that much of a problem because people with dark skin have *vastly* lower incidence of skin cancer. (That actually may reinforce the problem of collecting useful training data points for them.)
      • Oh, and there might also be the problem that a diagnostic test for a condition with a probability of 20*X in one population may not necessarily be suitable for the same condition with a probability of X in a different population in the first place. Your false positives may be through the roof in the second case and you may need to use a different approach for the population with low occurrence of the condition.
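        The false-positive point follows directly from Bayes' rule. A quick sketch; the sensitivity, specificity, and prevalence numbers are assumed for illustration, not taken from the study:

        ```python
        def ppv(prevalence, sensitivity, specificity):
            """Positive predictive value: P(disease | positive test)."""
            true_pos = sensitivity * prevalence
            false_pos = (1 - specificity) * (1 - prevalence)
            return true_pos / (true_pos + false_pos)

        sens, spec = 0.90, 0.95      # assumed fixed test performance
        high, low = 0.02, 0.001      # hypothetical 20x prevalence gap

        print(f"PPV at 2% prevalence:   {ppv(high, sens, spec):.2f}")
        print(f"PPV at 0.1% prevalence: {ppv(low, sens, spec):.2f}")
        ```

        With these made-up numbers the same test yields a PPV of roughly 0.27 in the high-prevalence group but under 0.02 in the low-prevalence one, i.e. almost every positive in the second group is a false alarm, which is why a different screening approach may be needed there.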
    • Re: (Score:2, Insightful)

      by tlhIngan ( 30335 )

      Is to run more darker skin samples in the training runs.

      FFS this isn't racism or whatever, it's using what's available. It's stupid silicon, it has no free thought. Quit reading racism into everyfucking thing that happens.

      Yes, that's the solution. The problem isn't that the AI is racist, it's that the training data is. AI is only as good as its training data, and the real problem is that all training data contains bias. AI just makes plainly obvious what was subtle before.

      Perhaps the training data was simply all the employ

      • Yes, that's the solution. The problem isn't that the AI is racist, it's that the training data is. AI is only as good as its training data, and the real problem is that all training data contains bias. AI just makes plainly obvious what was subtle before.

        You're an idiot. Dark-skinned people have less skin cancer to begin with. There will be fewer samples of cancerous dark skin to analyze.

        Perhaps the training data was simply all the employees in the company, which seems reasonable as a starting point, except you don't realize that perhaps a slightly racist HR employee has made it so most people are white.

        Listen up, Captain Fucktard: White people are still the majority in the US. Therefore, in a perfect situation, the employee ratio should reflect the general population. (made up number, but probably fairly accurate) White = 60%, Black = 12%, Mexican = 20%, Everybody else = 8%

        So, if you've got 100 employees, 60 of them should be white. That's a majority of white people. it'

      • In fact, the real truth, and I suspect no one wants to hear it, is that racism is so embedded in society that it's everywhere. All AI has done is basically expose it (it's dumb, it doesn't know any better, after all). And people are reacting because it's an ugly truth raising all sorts of very uncomfortable questions in basically every area of society. In places that prided themselves on equality, or on being a meritocracy, or such (especially in western democracies), having such ugliness exposed is galling. After all, it goes against the entire grain of being, and the hypocrisy in it all.

        One more thing... Don't fuck the rest of us by reproducing. Your genetic line needs to end with you.

      • White people are the majority of the population in the US. I don't know how every single US company is supposed to get more than 13% black people on board, since it seems you're only "diverse" if white people are the minority. Especially when the applicant pool already differs from the general population. But if 90% of qualified applicants are white or Asian (aka "white privilege adjacent" and don't count towards diversity goals according to the excessively woke), and 90% of employees are therefore white, R
    • It's stupid silicon, it has no free thought.

      Oh stop with the double standards.

      Stupid silicon also does not and cannot do anything without a human to drive it. Algorithms don't come from nowhere and run on "stupid silicon" all by themselves in a way that ultimately no one is responsible for the output.

      A person made it, a person is responsible.

  • Stop This (Score:5, Insightful)

    by bill_mcgonigle ( 4333 ) * on Thursday November 11, 2021 @05:37PM (#61979681) Homepage Journal

    Your woke bullshit is killing people.

    Yes, contrast helps. Yes, training sets can be increased in size and variety. Yes, even human experts can pick out more on an albino. Yes, algorithms need adjustment for different inputs - that's goddamn reality.

    This technology is so helpful in assisting dermatologists that the hand-wringing worrywarts are going to kill people with their virtue-signaling delays.

    That is NOT a virtue. It's mental illness and we don't need an AI to detect it.

    • I think the point is to ensure that it doesn't become racist. Corporations are going to go after profit, and they could likely utilize data that shows a high percentage of detection but mask the training criteria, eventually making it a racist issue. Or, as the article states, it's approved with the bias in training and its use is denied for any individual that doesn't conform to the training data. If that were to happen, the AI could be extremely beneficial for one race and not another. And when you talk
      • Re:Stop This (Score:4, Insightful)

        by Bert64 ( 520050 ) <bert AT slashdot DOT firenzee DOT com> on Thursday November 11, 2021 @10:41PM (#61980395) Homepage

        Skin cancer affects those with lighter skin FAR more than those with darker skin. In the US the ratio is something like 22:1 (source: https://dermlite.com/pages/eth... [dermlite.com])

        Skin cancers typically manifest as a darker patch of skin - on someone with very light skin a dark patch is very easy to detect, if someone's skin is already dark then such things are much harder to detect.

        So a system which is able to detect skin cancers in light skinned patients (ie the vast majority of patients who have skin cancer) does the most good and can start saving lives. If you delay this technology because it doesn't work on darker skin then you'll be spending a huge amount of extra effort for a very small additional benefit, but how many light skinned people will die while waiting for this?

        Making everything about race and accusing people of bias is absolutely stupid. The fact is there are inherent differences between races so this is obviously going to translate into differences in the way certain conditions are detected and treated.
        The people developing these technologies are trying to produce a useful technology to aid diagnosis and save lives, but instead of being able to get on with their jobs they now have to deal with this shit.

    • Re: (Score:2, Insightful)

      You just repeated what the summary said. There was never a woke slant or any hint of social justice until you added it. You really do have wokeness on the brain.

    • by NFN_NLN ( 633283 )

      > are going to kill people with their virtue-signaling delays.

      On purpose though: https://bostonreview.net/scien... [bostonreview.net]

      "Sensitive to these injustices, we have taken redress in our particular initiative to mean providing precisely what was denied for at least a decade: **a preferential admission option for Black and Latinx heart failure patients to our specialty cardiology service**. The Healing ARC will include a flag in our electronic medical record and admissions system suggesting that **providers admit Bl

    • Re: Stop This (Score:4, Interesting)

      by RobinH ( 124750 ) on Thursday November 11, 2021 @08:28PM (#61980071) Homepage
      As long as the hand-wringing only kills white people, I don't think the woke care.
    • by AmiMoJo ( 196126 )

      Here's the actual study: https://www.thelancet.com/jour... [thelancet.com]

      They don't sound "woke", they sound like doctors doing a serious study into the effectiveness of a new technology and raising some statistically valid concerns, along with an examination of why the issues may be occurring.

      In this case, it's anti-woke bullshit that is killing people.

  • by ClueHammer ( 6261830 ) on Thursday November 11, 2021 @06:14PM (#61979765)
    do not feed the trolls
  • by poptopdrop ( 6713596 ) on Thursday November 11, 2021 @06:14PM (#61979769)

    Let them self-identify as white.

  • Few of the 21 datasets recorded the ethnicity or skin type of the individuals photographed

    Isn't it obvious from the picture? Both a human and an AI can tell the difference between light and dark. The lack of dark skin is indeed a problem, but I wonder why the study had to rely on metadata for skin tone when it is the most obvious thing you can extract from a picture.

  • Who is this "Al Skin Cancer", and how did he diagnose that risk is less accurate for dark skin? What kind of risk?

  • Rayciss! Why is this research even funded?
    /sarc
  • In latest news, scientists find it is harder to see in the dark.
