Google AI Beats Doctors at Breast Cancer Detection -- Sometimes (wsj.com)
Google's health research unit said it has developed an artificial-intelligence system that can match or outperform radiologists at detecting breast cancer, according to new research. But doctors still beat the machines in some cases. From a report: The model, developed by an international team of researchers, caught cancers that were originally missed and reduced false-positive cancer flags for patients who didn't actually have cancer, according to a paper published on Wednesday in the journal Nature. Data from thousands of mammograms from women in the U.K. and the U.S. was used to train the AI system. But the algorithm isn't yet ready for clinical use, the researchers said.
The model is the latest step in Google's push into health care. The Alphabet company has developed similar systems to detect lung cancer, eye disease and kidney injury. Google and Alphabet have come under scrutiny for privacy concerns related to the use of patient data. A deal with Ascension, the second-largest health system in the U.S., allows Google to use AI to mine personal, identifiable health information from millions of patients to improve processes and care. The health data used in the breast-cancer project doesn't include identifiable information, Google Health officials said, and the data was stripped of personal indicators before being given to Google. Radiologists and AI specialists said the model is promising, and officials at Google Health said the system could eventually support radiologists in improving breast-cancer detection and outcomes, as well as efficiency in mammogram reading.
This is like those dumb animals picking the winner (Score:2)
Re:This is like those dumb animals picking the win (Score:4, Informative)
It's not luck, it's pattern matching. If you want to call that AI, then fine, but it is data science.
Re: (Score:2)
Can't read TFA since it's behind a paywall. But a coin flip will beat doctors at "detecting" breast cancer, sometimes. This really sounds like a case of someone so eager to demonstrate their project was working that they lowered the bar to the point where random chance would've shown some successes.
I see, because you can't read the article (and apparently can't use a search engine to find alternative reports) you've decided to make up what it said and then complain that it's baloney. Interesting system.
Hey, I can't see her and I don't know where she is but I reckon your ma is doing tricks for crack. What a slut!
Re: (Score:2)
It's just luck... not science.
Well, there are some folks who claim that dogs can sniff out breast cancer:
https://www.telegraph.co.uk/ne... [telegraph.co.uk]
However, when I tried it in a bar, on a strange woman, she was less than amused.
It just goes to show you, there's one rule for dogs sniffing breasts, and another rule for humans sniffing breasts.
AI could be really awesome at early detection (Score:3)
Imagine if you could feed an AI system breast cancer scans, collected over time, from everyone who did and did not develop cancer; it might be able to discern patterns for early-stage detection that even humans have not. So many women get scans done that you could probably build a very viable recognition model from them.
Re: (Score:3)
That would be great, but it relies on assuming that there are universal signs that don't differ across populations. The more features you introduce to a neural network, the more you run the risk of overfitting it and getting something that doesn't generalize well outside of your training data. The bright side here is that it's a supervised learning problem, so we can just throw data at the network and see how it comes out.
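For illustration, here is a minimal sketch of that supervised setup in Python, using scikit-learn on purely synthetic features as a stand-in for anything mammogram-derived. The dataset, model size, and split below are assumptions for the sketch, not anything from the paper: train on labelled examples, then compare training and held-out scores, since a large gap between them is exactly the overfitting problem mentioned above.

# Minimal supervised-learning sketch: fit a classifier on labelled examples
# and check it against held-out data, because training accuracy alone can
# hide overfitting. The "features" are synthetic stand-ins for whatever you
# would extract from mammograms; this is illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Synthetic "patients": 2,000 examples, 40 features, binary label
# (1 = later developed cancer, 0 = did not). Purely made-up data.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)

# Hold out a validation set drawn from the same population.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25,
                                                  stratify=y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# A large gap between these two numbers is the overfitting warned about above.
print("train AUC:", roc_auc_score(y_train, model.predict_proba(X_train)[:, 1]))
print("val   AUC:", roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))

Note that a random split like this only catches overfitting within one population; to catch the "doesn't generalize across populations" failure mode, you would validate on data from a different hospital, scanner, or demographic than the training set.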
hmmm.. (Score:2)
Google and Alphabet have come under scrutiny for privacy concerns related to the use of patient data.
If this patient data helps perfect the system to the point where detection is spot on and in the long run can do away with doctors (for detection), I'm all for it; screw the privacy concerns of a single patient if it serves humanity. Way too many people die because detection is difficult and mostly comes far too late.
Not "AI"! (Score:1)
Re: (Score:1)
I was wondering how long it would be before this tired canard got tossed out. Artificial Intelligence as a term of art was never intended solely to mean human-style intelligence. Since the earliest days of the field, it has been fully understood that AI includes checkers-playing programs, equation solvers, etc. Heck, even computer game bots are called AI.
I just cannot understand why some people seem to want to slap down and disparage the enormous machine learning advances in AI over the past decade. You s
Re: (Score:2)
And neural nets of considerably less sophistication run most of the animal kingdom. Flatworms and honeybees can learn and figure things out individually; ants and other hymenoptera can have quite complex "thoughts" as groups.
Re: (Score:2)
Yes, and I'm sure 'No True Scotsmen' were involved in the study, either.
Radiologist POV (Score:4, Interesting)
No one likes to read mammograms. They are dangerously dull and repetitive. They also have a special quality shared by few radiological studies, namely that the only important determination is "maybe cancer? get ultrasound/MRI/biopsy" or "no cancer, see you next year."
For comparison, your doctor might order a chest radiograph to rule out pneumonia, but the list of possible clinically important findings is huge - pneumonia, edema, lung nodule, mediastinal mass, lymph node enlargement, calcified granulomas, cardiac enlargement, pleural effusion, pneumothorax, fibrosis, fractured ribs, clavicular erosion from rheumatoid arthritis - the list goes on and on, and some diagnoses can only be distinguished by cracking open the patient chart and doing some reading. It is much more complicated to develop a system able to integrate all those possible diagnoses.
I think the vast majority of radiologists would be happy to see mammography automated. "No mammo!" is an emphatic perk often cited in radiologist job postings. There is currently a significant shortage of mammographers despite the fact that it tends to be one of the easiest subspecializations in terms of call and hours. Even mammographers might be relieved to only have to focus on ultrasound, MR and biopsies.
Re: (Score:2)
Biopsy examinations are problematic as well. For many types of cancer, automated scanning systems have been found to do a much better job, including breast cancer slides if I remember correctly. A pathologist can spend 10 minutes examining 10 percent of a slide with the same accuracy as the computer, except the computer could then examine the other 90 percent of the slide and still be done in under a minute. Once insurers and the AMA admit that the machine can do a better job in this one area (for a frack of
Turing test redux (Score:1)
Radiologists spend like 20 seconds looking at an x-ray. Very many are EXTREMELY sloppy. In my own experience as a patient I have seen numerous radiologist errors and omissions.
Great (Score:2)
breast cancer grows faster than the test interval (Score:1)
Sometimes (Score:1)
1. Don't keep supporting failed students for non-academic reasons. Can't learn? Not able to keep up with the level of education?
2. Peer review your experts, every year, often.
3. Visit every lab and test all the workers/experts to see if they still have the skills needed to meet your nation's expected level of medical care.