AI Better Than Dermatologists At Detecting Skin Cancer, Study Finds (cbsnews.com) 60

An anonymous reader quotes a report from CBS News: For the first time, new research suggests artificial intelligence may be better than highly trained humans at detecting skin cancer. A study conducted by an international team of researchers pitted experienced dermatologists against a machine learning system, known as a deep learning convolutional neural network, or CNN, to see which was more effective at detecting malignant melanomas. The results? 'Most dermatologists were outperformed by the CNN,' the researchers wrote in their report, published in the journal Annals of Oncology. Fifty-eight dermatologists from 17 countries participated in the study. More than half of the doctors were considered expert level, with more than five years' experience; 19 percent had between two and five years' experience, and 29 percent had less than two years' experience. At first look, dermatologists correctly detected an average of 87 percent of melanomas, and accurately identified an average of 73 percent of lesions that were not malignant. The CNN, by comparison, correctly detected 95 percent of melanomas.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Bedside? (Score:2, Troll)

    by Mikkeles ( 698461 )

    How did CNN compare in bedside manner and comforting during the examination or biopsy?

    • Re: (Score:3, Insightful)

      And this is why healthcare in the US is so expensive.

      Who cares? I want an accurate fast cheap diagnosis. Not a happy ending massage.

      • Re:Bedside? (Score:4, Informative)

        by ShanghaiBill ( 739463 ) on Thursday May 31, 2018 @02:33AM (#56703818)

        Who cares? I want an accurate fast cheap diagnosis.

        Go read Yelp reviews for doctors. 95% of the bad reviews are because of rude office staff. The competence of the doctor is irrelevant to their rating.

      • Who cares? I want an accurate fast cheap diagnosis.

        The human will not stop at looking at a picture. Anything that's suspicious will be tested. The 95% for the CNN, by itself, is not good enough. That remaining 5% is quite a large percentage of the non-obvious portion.

        • Filtering out (Score:4, Informative)

          by DrYak ( 748999 ) on Thursday May 31, 2018 @06:54AM (#56704282) Homepage

The thing with diseases that get checked regularly is that it doesn't matter *that much* if they are missed on the first check.

5% is a large percentage, but some of those could be picked up by the dermatologist supervising the exam (if there's one on the premises),
and some of those 5% will eventually be picked up during next year's check, or the year after that.
(So 12 or 24 months later, which is still within the 28-month [hindawi.com] median time before metastasis, at which point the disease turns fatal.
Meaning that we're considering 0.05 ^ 3 [people missed after 3 tests in a row spread over 24 months] * 0.5 [portion of them who potentially developed metastasis] = 62.5 per million.
That is still a lot of potential future cancer deaths, but it's a lot better than no testing at all.)
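The screening arithmetic above can be sketched directly. The numbers are the comment's own illustrative ones, and the sketch assumes misses on successive exams are independent:

```python
# Sketch of the repeated-screening arithmetic: a 5% per-exam miss rate,
# three exams spread over 24 months, and an assumed 50% of long-missed
# melanomas progressing to metastasis. All numbers are illustrative.
miss_per_exam = 0.05       # CNN misses 5% of melanomas on any one check
exams = 3                  # initial check plus two annual follow-ups
progress_fraction = 0.5    # assumed share that metastasize if missed that long

missed_all = miss_per_exam ** exams              # assumes independent misses
per_million = missed_all * progress_fraction * 1_000_000
print(f"{per_million:.1f} per million screened")  # 62.5 per million
```

The independence assumption is likely pessimistic, since a growing lesion looks different at each exam and is therefore easier to catch over time.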

          ---

          NOTE:
          unlike benign skin features (birth marks, whatever),
malignant lesions change gradually over time (that, per se, is one of the criteria dermatologists use),
so even if you run the exact same CNN on that patient's skin the year after, it will not see the exact same picture and might actually detect the cancer this time.

          • How many cases go undiagnosed because of the difficulty in just seeing a doctor?

            A cell phone picture of any problem areas twice a week should give the CNN more than enough data.

      • accurate fast cheap

        You know the old saying...

    • Re: (Score:3, Insightful)

      by AvitarX ( 172628 )

      I'm more wondering how it did on the identification of non malignant lesions.

I could identify 100% with an algorithm that just kicked out "yes", but if I got 0% of the non-malignant ones right, it's not very useful.

It looks like the doctors caught 87% of the bad ones and falsely flagged 21% of the non-malignant ones as bad, while the computer caught 95% of the bad ones.

      Or I completely misunderstood the summary because I don't know medicine.
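The distinction being reached for here is sensitivity (share of melanomas caught) versus specificity (share of benign lesions correctly cleared). A minimal sketch, using the summary's dermatologist numbers purely as illustration:

```python
# Sensitivity/specificity from confusion counts. The "always say malignant"
# classifier shows why sensitivity alone is meaningless.
def rates(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)  # melanomas correctly flagged
    specificity = tn / (tn + fp)  # benign lesions correctly cleared
    return sensitivity, specificity

# Degenerate classifier that flags everything: 100% sensitivity, 0% specificity.
print(rates(tp=100, fn=0, tn=0, fp=100))  # (1.0, 0.0)

# Roughly the dermatologists' numbers from the summary (per 100 of each kind):
print(rates(tp=87, fn=13, tn=73, fp=27))  # (0.87, 0.73)
```

The summary gives no specificity figure for the CNN, which is exactly the missing half of the comparison.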

      • Re:Bedside? (Score:5, Informative)

        by ShanghaiBill ( 739463 ) on Thursday May 31, 2018 @02:42AM (#56703828)

        Or I completely misunderstood the summary because I don't know medicine.

TFA is not much better. It is horrible journalism. It is unclear whether the "AI" is actually better, with false-positive and false-negative rates mingled together. It also seems to say that the humans and the AI were shown DIFFERENT IMAGES, and that the humans were given additional information that the AI did not have. So the comparison of results may be meaningless.

        The only thing that can be said with certainty is that CBS produces some garbage journalism.

      • by Anonymous Coward

I think you've got it. The medical industry may not be that interested in the suspiciously-not-quoted false positive rate, because they can replace skilled technicians with machine analysis to save costs, plus perform treatments on healthy patients, which would probably have OK outcomes. Plus, the extra false positives treated will raise the reported incidence statistics and so induce more of the populace into getting tested. There is a similar movement in the general cancer area where more non-cancerous p

  • by theheadlessrabbit ( 1022587 ) on Thursday May 31, 2018 @12:38AM (#56703636) Homepage Journal

    When a human doctor makes a mistake, they might learn from it and know better for next time.

    When an AI makes a mistake, every single system connected to the network might learn from it and know better for next time.

    And once an AI reaches superhuman levels of performance, it's safe to assume it will stay better.

    I am hopeful there will be some rapid advancement in this field.

    • by niks42 ( 768188 )
Anyone who has ever used voice recognition knows that heuristics can foul up. The quality of your AI experience, as it absorbs new data, may go down as well as up. Wasn't there a recent incident with some Microsoft social media software whose neural net became more and more racist over a period of 24 hours thanks to the data it gleaned from Twitter?

As a technical architect, I'd like to know how often I am going to have to back up the state of a neural net doing Computer Aided Diagnosis on CT and MR datasets so I can restore it when it starts to get worse at finding lung nodules and not better.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        As a technical architect, I'd like to know how often I am going to have to back up the state of a neural net doing Computer Aided Diagnosis on CT and MR datasets so I can restore it when it starts to get worse at finding lung nodules and not better ..

        You don't "restore it", you publish only the best model you have, and if you collect more data, you retrain it until you produce a better model.

        Letting your AI train on any dataset you throw at it constantly without checking it for class balance, content, and output quality is a fun experiment, but that's not how you do it in the real world.

        If performance for a given model "gets worse", it's usually because the input quality/distribution you give it changes.
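The "publish only the best model" workflow described above can be sketched as a simple gate; `maybe_publish` and `evaluate` here are stand-ins, not any real library's API:

```python
# Hedged sketch of "publish only the best model you have": a candidate
# retrained on new data is promoted only if it beats the currently served
# model on a fixed held-out validation set.
def maybe_publish(current, candidate, evaluate):
    """Return the model that should be served."""
    if evaluate(candidate) > evaluate(current):
        return candidate  # promote the improvement
    return current        # keep serving the old model; never regress

# Toy example: "models" are just their validation accuracy.
evaluate = lambda model: model
print(maybe_publish(current=0.93, candidate=0.91, evaluate=evaluate))  # 0.93
```

Because the validation set is fixed, a bad batch of training data can never silently degrade what is deployed, which is the point being made to the grandparent.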

      • by Kjella ( 173770 )

Anyone who has ever used voice recognition knows that heuristics can foul up. The quality of your AI experience, as it absorbs new data, may go down as well as up. Wasn't there a recent incident with some Microsoft social media software whose neural net became more and more racist over a period of 24 hours thanks to the data it gleaned from Twitter?

        That could be an issue in many cases, but when you're getting the gold standard answer a bit further down the road when the biopsy is done I doubt that's much of an issue. Over time you'll simply build up a bigger and bigger database of correct answers and incrementally improve.

    • When an AI makes a mistake, every single system connected to the network might learn from it and know better for next time.

      You know the converse is true as well. If they make a mistake ALL the machines will make the same mistake until it is corrected. With a human doctor that isn't generally true.

      • Comment removed based on user account deletion
        • by sjbe ( 173966 )

          Then, after a few years of active training, the systems will probably be ready to work on their own.

          That will (probably) never happen. More likely what will happen is that they will work in conjunction with the physician and the physician will gradually concern themselves more with results interpretation, management, and treatment. But no doctor is going to just let the machine go do its thing without any sort of oversight.

  • Years of schooling only to be replaced by a tech and AI (if a tech is even needed for imaging)...

    Where can I send my dick pics... I mean epidermal selfies?

    • by sjbe ( 173966 ) on Thursday May 31, 2018 @06:33AM (#56704228)

      Years of schooling only to be replaced by a tech and AI (if a tech is even needed for imaging)...

      That's like arguing that a code library and compiler can replace a programmer. It might change the tasks they have to deal with but it doesn't eliminate the job. If you think something like this is going to replace dermatologists you have no idea what they actually do.

      Where can I send my dick pics... I mean epidermal selfies?

      I think this says everything we need to know about you.

      • by MMC Monster ( 602931 ) on Thursday May 31, 2018 @07:18AM (#56704362)

        I think that it's the canary in the coal mine for many medical fields.

        Image processing neural nets are getting more powerful and more accessible for hospitals and (more importantly) large hospital networks.

        The ability to scale this so that a primary care physician can take a shot of a lesion and have it identify those that need confirmation with a specialist (versus sending everyone to a specialist) means there's a lower demand for specialists.

This expands well beyond dermatologists. There's no reason similar image-processing techniques can't be used in radiology, reducing a health system's need to hire more radiologists. Or in echocardiogram and electrocardiogram interpretation, freeing up cardiologists' time (so fewer cardiologists need to be hired in the future).

        I am a cardiologist. Our current MUSE electrocardiogram (EKG) system pre-reads the EKGs and has us correct the interpretations. It's correct probably 95% of the time. I can't wait until a similar system gets that good with echocardiograms. It'll free up our time so we can go home earlier in the evenings.

        • by sjbe ( 173966 ) on Thursday May 31, 2018 @07:56AM (#56704474)

          I think that it's the canary in the coal mine for many medical fields.

          I think you are worrying about it more than is justified. My wife is a pathologist so I'm watching this issue closely but so far the net benefit seems to be positive.

          Image processing neural nets are getting more powerful and more accessible for hospitals and (more importantly) large hospital networks.

I see little evidence that we are in danger of them getting powerful enough any time soon to start seriously denting the number of doctors needed. I think they will change how doctors do their job (and that's probably good), but mostly by removing a lot of needless busywork and improving quality of care.

This expands well beyond dermatologists. There's no reason similar image-processing techniques can't be used in radiology, reducing a health system's need to hire more radiologists. Or in echocardiogram and electrocardiogram interpretation, freeing up cardiologists' time (so fewer cardiologists need to be hired in the future).

I worked in a radiology clinic about 15 years ago where they were using this sort of tech to help with diagnosis. It helps the radiologist do their job better. It doesn't replace the radiologist (or pathologist, or cardiologist, etc.). To reduce head count, the impact has to be big enough to be a step change, and I don't think we are in any real danger of that happening any time soon. Even if it did, it would just change some jobs. My wife is an AP/CP pathologist, so instead of looking through a microscope most of the day she might end up looking at a monitor, or even reading reports more like a clinical pathologist. That's not a bad thing, it's just different.

"AI better at something" is likely to become very common, in many fields.
    • by sjbe ( 173966 ) on Thursday May 31, 2018 @07:15AM (#56704352)

"AI better at something" is likely to become very common, in many fields.

But that doesn't always mean what people think it means. In this case, for instance, it doesn't matter if the AI is marginally better at guessing whether a lesion is melanoma from a picture, because that isn't how melanoma is actually diagnosed. If the doctor suspects even a small chance of melanoma, they are going to biopsy the tissue and send it to pathology, where the pathologist tells them what it really is to the best of our knowledge. No dermatologist is going to treat a melanoma without a biopsy. The only important detail here is that the number of false negatives be as small as possible. When in doubt, cut it out. There should be some amount of false positives if the doctor is doing his job correctly. Obviously we want as few of these as possible, but it's not the critical issue.

      Technology like this isn't going to replace dermatologists. It's just going to become a supplemental tool to help ensure consistency of care and to help ensure more accurate results.
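The "when in doubt, cut it out" policy amounts to choosing a decision threshold for a target sensitivity rather than for overall accuracy. An illustrative sketch with made-up scores (not the study's data):

```python
# Pick the strictest score threshold that still catches every melanoma in
# the sample; everything scoring at or above it gets biopsied, and the
# resulting false positives are the accepted cost of missing nothing.
scores = [0.1, 0.2, 0.35, 0.4, 0.6, 0.7, 0.8, 0.9]   # hypothetical model outputs
labels = [0,   0,   0,    1,   0,   1,   1,   1]     # 1 = melanoma (biopsy-confirmed)

def sensitivity_at(threshold):
    flagged = [s >= threshold for s in scores]
    caught = sum(f and l for f, l in zip(flagged, labels))
    return caught / sum(labels)

threshold = max(t for t in sorted(set(scores)) if sensitivity_at(t) == 1.0)
print(threshold)  # 0.4 -- lesions scoring >= 0.4 go to biopsy
```

Tightening the threshold further would reduce false positives but let melanomas through, which is exactly the trade-off the comment says is unacceptable.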

      • by jbengt ( 874751 )

If the doctor suspects even a small chance of melanoma, they are going to biopsy the tissue and send it to pathology, where the pathologist tells them what it really is to the best of our knowledge. No dermatologist is going to treat a melanoma without a biopsy. The only important detail here is that the number of false negatives be as small as possible.

        From personal experience, this is true. Diagnosing from a single picture is just wrong, other than to alert to the obvious cancer, in which case a biopsy wi

  • by jpostel ( 114922 ) on Thursday May 31, 2018 @01:26AM (#56703720) Homepage Journal

    Oddly, the original publication calls out the use of Google Inception v4 CNN in the Methods, but the CBS News article doesn't mention it at all.

  • by chthon ( 580889 ) on Thursday May 31, 2018 @02:18AM (#56703806) Journal

    If my father-in-law's MD would have had such a system handy, then maybe he would have been diagnosed much earlier for melanoma on his foot sole. As it was, the doctor thought it was something else, and he wasn't treated until it really was too late.

    The advancement is not that it is better than a dermatologist, but that it can be added to a system that is easier to access than a dermatologist. We really need the dermatologist after the diagnosis for therapy and treatment.

    • by Anonymous Coward

      He should have gone to a dermatologist then. It is common knowledge that dermatologists CANNOT determine melanoma by looking at it. Go ask one, they will all tell you that. If it looks anything that possibly could be melanoma, they cut a sample off and send it to a lab. The lab is 98%+ accurate at detecting it and has a near zero false positive rate.

      While this AI is interesting, it does nothing to help at all. Its only real impact might be false negatives. I had 10 non-suspect spots on me removed and

  • Conclusion (Score:4, Insightful)

    by bankman ( 136859 ) on Thursday May 31, 2018 @02:22AM (#56703808) Homepage

The obvious conclusion is therefore not to train more doctors to correctly diagnose melanomas, but to have them photograph parts of your body and submit the pictures to an IT system which will in turn deliver a diagnosis. While this may be beneficial for a health system in general (not necessarily the US, where the CNN diagnosis will of course be an order of magnitude more expensive than traditional methods...), it may lead to less well trained physicians in front of the patient. What may sound like technophobia is in fact happening, or rather has been happening for quite some time. Many orthopedic and trauma surgeons rely solely on imaging systems for diagnosis, while older clinicians, or those trained in less advanced health systems, can perform a reliable first diagnosis with conventional means. You can still witness this in parts of Germany with physicians trained in the GDR.

    In theory, modern technology should supplement experience when in fact it far too often replaces it, increasing the overall financial burden to the system.

    • by sjbe ( 173966 ) on Thursday May 31, 2018 @06:51AM (#56704278)

      The obvious conclusion is therefore not to train more doctors to correctly diagnose melanomas but to have them photograph parts of your body and submit the pictures to an IT system which will in turn deliver a diagnosis.

No visual IT system, and no doctor, can definitively diagnose a melanoma on the skin. They can have a strong suspicion, and a good dermatologist will be right most of the time, but they cannot be certain in all but the most obvious of cases. Even then the tissue has to be biopsied and sent to pathology for any sort of definitive diagnosis. There they have stains, genetic markers and other tools to figure out what is growing on the patient. Furthermore, a definitive diagnosis is not always possible, because we have no unambiguous test for all forms of melanoma, nor do we even have an unambiguous set of criteria in some cases. There is an alarming amount of gestalt in the process. Some cases are fairly straightforward and others are nigh impossible to diagnose. Melanoma is challenging because it can appear in a variety of forms due to its genetic origins. It can look nearly indistinguishable from many other types of lesions, and the genetic tests, stains and other tools we have don't always give a clear answer.

      The simple fact is that in many cases a diagnosis is really just an informed guess based on the probabilities. We're saying effectively that there is an X% chance that this is melanoma so we should treat it as if it is just to be safe.

Source: my wife is a dermatopathologist, so I get to hear about all this stuff daily.

      • by bankman ( 136859 )

You are absolutely right, of course, on how a proper and final diagnosis is established. The problem, and your wife will probably agree, is that the initial diagnosis, you may even call it a suspicion, is the most important one, because it determines whether follow-up tests will be conducted at all. My point is, actually two points are, that 1) fewer first patient-facing physicians (the front line, if you will) will be taught and gain the experience to recognise melanomas (or the likelihood of it being one) in the fi

Dermatologists don't diagnose melanoma on the skin. They might suspect melanoma, but what they do is biopsy the sample and send it to pathology, where they can run a variety of tests (genetic markers, stains, etc.) and look at the tissue morphology carefully under a microscope. The pathologist then gives an evaluation of the probability that this is indeed a malignancy. Melanomas are sneaky and can resemble a wide variety of other conditions, many of them benign, and there is no dermatologist or AI system

    • All very true, but I suspect that this sort of technique is going to push its way down into the pathology departments too. After all, a number of the pathology steps involve looking at things closely and pattern matching, which are the same types of things being done by the AI being discussed.

I don't expect it to happen any time soon but I can certainly envision a system in 15-20 years that would have a nurse practitioner scan a patient, take biopsies of flagged spots and feed the biopsies into a "pathologist in a box" machine that would do all the checks required - probably down to the DNA level.

      • After all, a number of the pathology steps involve looking at things closely and pattern matching, which are the same types of things being done by the AI being discussed.

        Vision based expert systems (I wouldn't really call them AI) will certainly get used someday though there are a lot of technical challenges to get through before this is possible. There is much more to pathology than just pattern matching however.

        I don't expect it to happen any time soon but I can certainly envision a system in 15-20 years that would have a nurse practitioner scan a patient, take biopsies of flagged spots and feed the biopsies into a "pathologist in a box" machine that would do all the checks required - probably down to the DNA level.

        This already happens today. It's called Clinical Pathology [wikipedia.org]. Every time you have blood drawn (for example) that tissue gets sent to the clinical pathology department. The tissue is processed through a machine which spits out a report. The pathologist then evalu

  • ... with more than five years' experience.

So, what do they call doctors with 20-30 years of experience?

Think of the sci-fi script "tax" side of this for saving a nation's "free" health budget.
A doctor uses the scan to see if the patient qualifies for tests.
The computer always says no: the patient as presented is in good health.
That's no free referral to the needed expert.
No appointment is made to see an expert.
No free pathology is done. No free pathologist to look at the results.
No results to send back to the doctor.
The patient goes home after a full "medical" thinking they had an actual expert look at them.
Those elites who get their reputation by having 'read the book', unlike the unwashed masses: doctors, lawyers, etc.

One huge difference between dermatologists (and all medical professionals) and other workers is that they will still be pulling in huge salaries and working when we're automated out and roasting rats over garbage cans in the street.

Doctors and other health professionals were smart and formed a professional organization. They're set for life: the professional body limits the supply of new entrants, ensures they're trained properly and buys the legislation/regulations needed to ensure they're highly compensated an
