You Got a Brain Scan at the Hospital. Someday a Computer May Use It To Identify You. (nytimes.com)
Thousands of people have received brain scans, as well as cognitive and genetic tests, while participating in research studies. Though the data may be widely distributed among scientists, most participants assume their privacy is protected because researchers remove their names and other identifying information from their records. But could a curious family member identify one of them just from a brain scan? Could a company mining medical records to sell targeted ads do so, or someone who wants to embarrass a study participant?
The New York Times: The answer is yes, investigators at the Mayo Clinic reported on Wednesday. A magnetic resonance imaging scan includes the entire head, including the subject's face. And while the countenance is blurry, imaging technology has advanced to the point that the face can be reconstructed from the scan. Under some circumstances, that face can be matched to an individual with facial recognition software.
In a letter published in the New England Journal of Medicine, researchers at the Mayo Clinic showed that the required steps are not complex. But privacy experts questioned whether the process could be replicated on a much larger scale with today's technology. The subjects were 84 healthy participants in a long-term study of about 2,000 residents of Olmsted County, Minn. Participants get brain scans to look for signs of Alzheimer's disease, as well as cognitive, blood and genetic tests. Over the years, the study has accumulated over 6,000 M.R.I. scans.
This is a bit silly (Score:3)
I need to start submitting things to Nature and NEJM that have been known for decades. Many projects that release data and have given more than lip service to anonymization have developed and used defacing software to solve this problem.
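For what it's worth, the core idea behind defacing is simply to blank out the voxels where the face sits before the volume is shared. Here's a crude, purely illustrative sketch in Python; the file name, the array orientation, and the slab chosen as the "face region" are all assumptions, and real tools such as pydeface register the scan to a template and apply a proper face mask instead:

```python
import numpy as np

# Hypothetical head scan stored as a plain array (slices, rows, cols).
# A real scan would be loaded from DICOM or NIfTI (e.g. with nibabel).
volume = np.load("t1_head_scan.npy")
defaced = volume.copy()

# Assumed orientation: inferior slices come first along axis 0, and the
# anterior (face) side comes first along axis 2. Zero out a crude slab
# covering the lower half of the slices and the front third of the volume.
n_slices, _, n_cols = defaced.shape
defaced[: n_slices // 2, :, : n_cols // 3] = 0

np.save("t1_head_scan_defaced.npy", defaced)
```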
A bigger problem is that things like the morphometry of your brain, and the pattern of vessels, are absolutely unique, and cannot be removed without obscuring the scan enough to make it useless.
Biometrics (Score:4, Interesting)
A bigger problem is that things like the morphometry of your brain, and the pattern of vessels, are absolutely unique, and cannot be removed without obscuring the scan enough to make it useless.
Yup, there's a reason why the blood vessel pattern of the hand (where it is easier to image) or your retina is considered as a possible biometric to unlock stuff.
Re: (Score:3)
We frequently have the problem in longitudinal studies where someone has entered the wrong ID. Sometimes it's obvious: you've got a kid with a small head, another small head, then an adult with a giant head, then a kid with a small head again, and so on. But sometimes the error is really subtle: you look at the scan and think "is that really the same person?", then you squint at some vessel patterns and it's clear that it's not.
Re: (Score:2)
de-facing software, haha
Re: (Score:2)
My favourite is "the radiologists were blinded...."
Just another privacy issue... (Score:2)
Could a company mining medical records to sell targeted ads do so, or someone who wants to embarrass a study participant?
This is just one of the many, many privacy aspects that need to be better regulated and enforced.
If any company is found mining individual medical records for non-medical purposes (and that most certainly includes insurance), then its leadership should shortly be found mining minerals in an actual salt mine.
Holy carp! (Score:1)
Has anyone ever been subjected to a 'brain scan' against their will or unknowingly? Is this really a problem, or is it something someone thinks may someday possibly be a problem?
There are much easier ways to capture a person's face and name - like when you get your driver's license, or when you check in at the lobby of the doctor's building, etc. Reconstructing a fuzzy face from a 'brain scan' seems like a very expensive way to collect such data - get back to me when we start requiring people to get 'brain scans'.
Re: Holy crap! (Score:2)
Associating a name with a face, when the face is extrapolated from peripheral data in a brain scan, isn't really breaching anyone's privacy.
Here in the US, insurers want photo ID when delivering any covered medical procedure, so your name and face are already in the DMV database. Your actual (say "cheese") photo, taken with your permission and consigned to the state for whatever purposes they deem appropriate.
Is the patient a child, not of driving age? OK, if they go to public school their picture and name are already on record there too.
Thankfully, HIPAA makes this non-issue for now (Score:2)
Re: (Score:2)
The relevant portion would be the consent form you sign allowing it to be used. I have reluctantly refused to participate in at least one research study because it included language like "You consent for $TISSUE to be used for $RESEARCH_STUDY, and any other use."
Uh, no. For a particular study that I consent to, yeah. For any other purpose, without limit? No. I hope consent forms aren't still so broad, but never sign things without reading them.
Nope. (Score:5, Informative)
TL;DR: The actual doctors and scientists involved *know* that the data is sensitive and should not be shared. Identification itself has been already possible for a couple of decades and isn't new.
Though the data may be widely distributed among scientists,
Nope. It's not. Not unless the patient has signed an authorization to use the collected data for research. It can't end up in a publication if the patient hasn't given consent *explicitly*. (Such consent forms are commonly found in the pile of administrative paperwork given to patients.)
In theory, even showing the scan to a colleague to ask for advice would require consent from the patient, though in that particular case the current norm is to assume implied consent (if the patient hasn't explicitly stated that they don't want their scan shown to other doctors, and the patient stands to benefit from their doctor asking peers for advice, one can presume the patient would have consented and proceed anyway). This might vary from place to place (the US tends to have much more trigger-happy lawyers than we do here in Europe).
most participants assume their privacy is protected because researchers remove their names and other identifying information from their records
Most *patients* might indeed assume that, but it's not what the doctors assume. It's still considered medical data, it's still considered potentially identifying, and it's still something that needs to be protected.
But could a curious family member identify one of them just from a brain scan? Could a company mining medical records to sell targeted ads do so, or someone who wants to embarrass a study participant?
The limitations aren't technical. The problem is that these people (curious family members, a company) should never have had access to that data to begin with; they should have no access to the medical records and scan images to mine them. If they did get access to this data (e.g. from a publication where the image was used) and the patient hadn't explicitly consented to it, somebody along the chain of custody of the patient's data is in BIG TROUBLE.
(My bet: security problems arising from the conflict between an understaffed, underfunded IT department and the hospital's bean counters wanting to squeeze out some more money, though probably not as pronounced as in your garden-variety modern business. Or it might be some clueless medical intern, or some researcher under too much pressure to "publish or perish".)
imaging technology has advanced to the point that the face can be reconstructed from the scan
Image slices have had good resolution for quite some time. A couple of decades ago, the thickness between slices was also tuned down to a similar resolution, because that's useful both in research (my case) and for some neurological investigations (epilepsy work-ups used the same protocols, I've heard).
End result: you end up with a volume whose individual voxels are about 1 mm^3.
Skin (but not hair) gives a clearly visible signal on MRI (and can somewhat be imaged on a CT scan), while air gives nearly no signal. So you just take the outer surface of the voxels in your volume (your stack of thinly sliced images) and you get a face (though usually no hair). It's very definitely recognizable.
That's no modern "cloud AI powered" magic, that's just basic stuff doable with tech from ~15 years ago.
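To make that concrete, here's a minimal sketch of the surface-extraction step in Python. The file name, the intensity threshold, and the 1 mm isotropic voxel spacing are all assumptions; a real scan would be loaded from DICOM or NIfTI (e.g. with nibabel), and this uses a reasonably recent scikit-image:

```python
import numpy as np
from skimage import measure

# Hypothetical T1-weighted head volume, one intensity value per voxel,
# shape assumed to be (slices, rows, cols) at ~1 mm isotropic resolution.
volume = np.load("t1_head_scan.npy")

# Air gives almost no signal, skin gives a clear one, so a simple intensity
# threshold separates "head" from "background". The value 100 is an assumed
# threshold; the right number depends on the scanner and sequence.
SKIN_THRESHOLD = 100

# Extract the outer isosurface of the head with marching cubes. The result
# is a triangle mesh of the face/scalp that can be rendered and compared
# against an ordinary photograph.
verts, faces, normals, values = measure.marching_cubes(
    volume, level=SKIN_THRESHOLD, spacing=(1.0, 1.0, 1.0)
)

print(f"Surface mesh: {len(verts)} vertices, {len(faces)} triangles")
```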
Now throw in slightly more modern image pattern recognition and you could even identify patients from their insides: patterns of tiny blood vessels, patterns of the sulci on the surface of the cortex, etc.
And without even seeing the actual face (3D-reconstructed from the voxels), you could match, say, an image that was published in a study about whatever brain disease against a couple of scans belonging to a given patient in a data collection you've hacked out of some hospital's poorly protected database.
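As a toy illustration of matching on internal patterns, here's a sketch that compares two binary vessel (or sulcal) masks with a simple Dice overlap. It assumes the two scans have already been registered to the same space and segmented; the file names are hypothetical and the 0.8 decision threshold is made up:

```python
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two binary masks (1.0 = identical, 0.0 = disjoint)."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    total = mask_a.sum() + mask_b.sum()
    return 2.0 * intersection / total if total else 0.0

# Assumed inputs: binary vessel masks from two scans, already rigidly
# registered to the same space and segmented.
mask_published = np.load("vessels_from_published_figure.npy")  # hypothetical
mask_hospital = np.load("vessels_from_hospital_dump.npy")      # hypothetical

score = dice(mask_published, mask_hospital)
# 0.8 is an arbitrary illustrative cutoff; a real identification pipeline
# would use far more careful statistics than a single overlap score.
print("likely same person" if score > 0.8 else "probably different people",
      f"(Dice = {score:.2f})")
```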
No shit Sherlock! (Score:2)
They don't scan your brain, they scan your head!
The head has a front part called 'the face' that can be used to identify you.
Dentist and dental records (Score:2)
Without laws (Score:2)
There can be no individual liberty.
Allowing constructs of legal fiction to operate with impunity means those of us who are living, breathing citizens of the United States sacrifice some of our personal privacy and individual liberty.