
Implantable Cardiac Devices Could Be Vulnerable To Hackers, FDA Warns (vice.com)

The U.S. Food and Drug Administration warned on Monday that pacemakers, defibrillators and other devices manufactured by St. Jude Medical, a medical device company based in Minnesota, could have put patients' lives at risk, as hackers could remotely access the devices and change the heart rate, administer shocks, or quickly deplete the battery. Thankfully, St. Jude released a new software patch on the same day as the FDA warning to address these vulnerabilities. Motherboard reports: St. Jude Medical's implantable cardiac devices are put under the skin, in the upper chest area, and have insulated wires that go into the heart to help it beat properly, if it's too slow or too fast. They work together with the Merlin@home Transmitter, located in the patient's house, which sends the patient's data to their physician using the Merlin.net Patient Care Network. Hackers could have exploited the transmitter, the manufacturer confirmed. "[It] could (...) be used to modify programming commands to the implanted device," the FDA safety communication reads. In an emailed response to Motherboard, a St. Jude Medical representative noted that the company "has taken numerous measures to protect the security and safety of our devices," including the new patch, and the creation of a "cyber security medical advisory board." The company plans to implement additional updates in 2017, the email said. This warning comes a few days after Abbott Laboratories acquired St. Jude Medical, and four months after a group of experts at Miami-based cybersecurity company MedSec Holding published a paper explaining several vulnerabilities they found in St. Jude Medical's pacemakers and defibrillators. They made the announcement at the end of August 2016, together with investment house Muddy Waters Capital.
This discussion has been archived. No new comments can be posted.

  • Well over a decade ago, RSA was working with a cardiac device manufacturer to prevent the device from being vulnerable to hackers.
    Apparently the device had a Z80 variant in it, so the RSA guys had to get hold of some old books to work out how to code for it.

    Of course, there are always cowboys taking shortcuts, so that's probably why the FDA is warning some manufacturers that they should not be taking a stupid shortcut.
  • If the FDA weren't so strict about certifying every possible change to a medical device, this would be less of an issue. Because of all the hoops and red tape manufacturers have to go through any time they make a change, the FDA rules/regulations provide a disincentive to make changes.

    And why is the FDA pointing a finger at device manufacturers, when they themselves are responsible for device approval and should have identified these issues before giving that approval? Either they're responsible for ensuring that devices are safe, or they're not. They can't have it both ways. Your government at work.
    • by geekmux ( 1040042 ) on Wednesday January 11, 2017 @10:32PM (#53651697)

      ...And why is the FDA pointing a finger at device manufacturers, when they themselves are responsible for device approval and should have identified these issues before giving that approval?

      Because the FDA does not maintain an elite army of Cyberhackers. That's why.

      Either they're responsible for ensuring that devices are safe, or they're not. They can't have it both ways. Your government at work.

      Government agencies that do maintain cybersecurity divisions have been hacked, as has the corporate sector. Even the most accelerated plan to approve changes may not be fast enough to keep up with potential threats and discovered vulnerabilities.

      Perhaps the ultimate answer is to not tie every fucking thing to the damn cloud.

      I know, I know. Fuck the inherent risks, because whoring out our digital lives is worth it every time.

      • by msauve ( 701917 )
        "Because the FDA does not maintain an elite army of Cyberhackers."

        So, you freely admit they're unqualified to complain about Cyberhacks [sic].
        • by skids ( 119237 )

          This is a silly argument you are making. Compliance with legal and policy requirements does not have to be validated by the enforcing agency. If it did, we'd have to hire people to stand in your driveway and check that your seatbelt was buckled every time you drove off.

      • by sjames ( 1099 )

        If the FDA is going to regulate this sort of thing, they'd better get some experts in.

        Agreed, we don't need everything on the cloud, but with appropriate precautions, some things can be better if they are. Why not make the device read-only unless the patient holds a security token up to his chest, for example? If the FDA were actually about more than making sure the reams of paperwork were filled out correctly and the right asses were kissed, they might give that advice or even insist on it.

        • If the FDA is going to regulate this sort of thing, they'd better get some experts in.

          After careful analysis, the expert steps up to the FDA Directors desk with a single sheet of paper. It reads:

          To Whomever is Pretending to Be Concerned,

          Stop putting every fucking thing in the damn cloud.

          Sincerely,

          Common F. Sense
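        The read-only-unless-the-token-is-present idea floated earlier in this thread can be sketched as a simple challenge-response check. This is a toy model, not how any shipping device works: the class, the method names, and the HMAC-over-a-nonce scheme are all illustrative assumptions.

```python
import hashlib
import hmac
import os

class ImplantConfig:
    """Toy model: configuration writes are rejected unless the caller
    answers a fresh challenge using a secret shared with the patient's
    physical token (the thing held up to the chest)."""

    def __init__(self, token_secret: bytes):
        self._secret = token_secret
        self._rate_limit = 70   # hypothetical setting, beats per minute
        self._challenge = None

    def issue_challenge(self) -> bytes:
        # A fresh random nonce per write attempt prevents replay.
        self._challenge = os.urandom(16)
        return self._challenge

    def set_rate_limit(self, bpm: int, response: bytes) -> bool:
        if self._challenge is None:
            return False
        expected = hmac.new(self._secret, self._challenge, hashlib.sha256).digest()
        self._challenge = None  # challenges are single-use
        if not hmac.compare_digest(expected, response):
            return False        # wrong or missing token: device stays read-only
        self._rate_limit = bpm
        return True

def token_answer(secret: bytes, challenge: bytes) -> bytes:
    """What the patient's token would compute when held near the device."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()
```

        Without the token's answer every write fails, so a remote attacker who can reach the radio but not the patient's chest gets a read-only device.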

    • by phantomfive ( 622387 ) on Thursday January 12, 2017 @03:29AM (#53652385) Journal
      I don't trust the FDA. I know what you are talking about, I've worked in the medical device industry, and it's a serious pain to get device approval, and the approval doesn't mean the code quality is good.

      That said, I trust the manufacturers even less, because I've worked with them. If you let them do easy OTA updates the way we update cell phones, you'll end up with a bunch of people dropping dead on February 29th.
    • by TheRaven64 ( 641858 ) on Thursday January 12, 2017 @08:01AM (#53652885) Journal
      I agree with your heading, but not with the rest of your post. The problem is that the FDA requires that the company have the software certified as safe by a third party, but places very few rules on what this entails. In a lot of cases, the people certifying the software don't even have access to the code: they read the design docs, but nothing else. There's no red teaming of medical device software before widespread deployment and no auditing by the FDA. The FDA is happy to certify such devices as 'safe' with nothing like enough information to be able to honestly make that claim.
  • by Anonymous Coward

    See it free! Die? It's the cost you pay for info which MUST BE FREE! It's got to be FREE!

  • by e**(i pi)-1 ( 462311 ) on Wednesday January 11, 2017 @11:07PM (#53651817) Homepage Journal
    "I opt for a Tomb"
  • 1) Interactions with medical implants need to supply their own source of power (e.g. via RF).
    2) Unpowered interactions may only occur if the medical implant detects a medical event.

    If your medical implant violates either of these rules then it is improperly designed.

    • Note: Basic security practices still apply but this solves the remote attack problem, especially those that would drain the battery.
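    The two rules above reduce to a single boolean check. A minimal sketch (the enum and function names are mine, purely illustrative):

```python
from enum import Enum

class Power(Enum):
    EXTERNAL_RF = "rf"       # the interrogating device supplies the power
    BATTERY = "battery"      # the implant's own battery is used

def interaction_allowed(power_source: Power, medical_event_detected: bool) -> bool:
    """Rule 1: externally powered (RF) interactions are always allowed,
    since they cannot drain the implant's battery.
    Rule 2: battery-powered interactions are allowed only while the
    implant itself has detected a medical event."""
    if power_source is Power.EXTERNAL_RF:
        return True
    return medical_event_detected
```

    Under rule 1 an attacker pays for the power themselves, so battery-drain attacks are off the table; under rule 2 the implant won't spend its own battery talking to anyone unless a genuine event is in progress.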

  • by jbn-o ( 555068 ) <mail@digitalcitizen.info> on Wednesday January 11, 2017 @11:58PM (#53651951) Homepage

    Karen Sandler, Executive Director of the Software Freedom Conservancy [sfconservancy.org], has an enlarged heart (hypertrophic cardiomyopathy) and is at risk of suddenly dying (due to a medical condition called "sudden death"). She has no symptoms. She has given a talk about this [youtube.com] many times at tech conferences; you should be able to find a copy of her talk online quite easily. She calls herself a "cyborg lawyer running on proprietary software" because she needs to wear a pacemaker/defibrillator device on her heart which keeps her heart beating within a predetermined acceptable range (not too slow, not too fast) by shocking her heart until it beats at an acceptable rhythm. Sandler said she's been shocked before: it's like being kicked in the chest, and it takes the wind out of her for a while, requiring her to take some time for recovery.

    She knew of software freedom and figured on these weaknesses in these devices, some of which can be controlled remotely at some distance, because all of them run on proprietary software. She tried to get the source code, even offering to sign a non-disclosure agreement to do so, and nobody would share the code with her. She said she was the only one to ask her doctors about what ran on the device. She therefore chose an older model which requires that the "programmer" device which sends a signal to the pacemaker/defibrillator be quite close to her body, so that she'd probably know if someone were doing things to her device. The lack of software freedom and full user control (ownership) of the device is quite obviously a health risk and possibly lethal. Don't let anyone tell you a lack of software freedom isn't serious.

    An interesting thing happened during her pregnancy, which she explained in an update to her talk: She learned that a pregnant woman's heart sometimes naturally races. For most women of childbearing age this isn't a problem as they're unlikely to need a pacemaker/defibrillator, so their heart can occasionally race without serious consequences. For Sandler this racing triggers the device to shock her back into an "acceptable" heart rhythm. It appears that the pacemaker/defibrillator device makers didn't test this device on women young enough to be of childbearing age, but they're apparently happy to sell the devices for implanting into users of any age. This lack of testing, in combination with the lack of software freedom, means the device manufacturers aren't doing due diligence, and they're preventing younger women, such as Sandler, from looking out for their own interests—avoiding "sudden death". One can only imagine what horrible, multiply lethal outcome could predictably result for a pregnant woman with the same condition Sandler has, if her heart raced while she was driving and her non-free pacemaker/defibrillator device shocked her. Don't let anyone tell you a lack of software freedom isn't serious.

    • "One can imagine how X could be a problem therefore X is a problem" is a fallacy. I can imagine that unicorns exist. That doesn't mean (unfortunately) that they exist.
    • by cshamis ( 854596 )
      Pacemaker and implanted device procedures aren't that dumb. They're actually pretty well thought out. All implanted pacemaker devices require near-proximity access (1-2 centimeters), and the communication between the programmer unit and the device being implanted is completely encrypted. Once it's implanted, it can ONLY be tuned or modified with your cardiac surgeon present, and in an operating theater environment. Changing anything in the device is treated exactly the same as a "new surgery."
    • by clodney ( 778910 )

      I don't see anything in your post that makes me believe that if Karen Sandler had access to the code she could make improvements to the device for her particular situation.

      First, as another poster has noted, modern implantable devices are extensively configurable, and yet most of them go in with the default settings, because the cardiologist/surgeon don't know enough about each device to tweak the settings. So it is quite conceivable that it could already be configured to deal properly with a pregnant woman.

      • by jbn-o ( 555068 )

        So the threat of death is enough for you to argue for the status quo, standing behind proprietors and denying the user full control of a device they obtained (and in Sandler's case wear inside their body), but not enough for you to let the user take control. We don't accept that for more common devices that are involved in a lot of harm, such as cars. In light of what's actually already happened to Sandler, your response is remarkably sycophantic to power. Automakers would probably be interested to talk to you

        • by clodney ( 778910 )

          So the threat of death is enough for you to argue for the status quo, standing behind proprietors and denying the user full control of a device they obtained (and in Sandler's case wear inside their body), but not enough for you to let the user take control. We don't accept that for more common devices that are involved in a lot of harm, such as cars. In light of what's actually already happened to Sandler, your response is remarkably sycophantic to power.

          I think you are mixing arguments. I was making the utilitarian case that the remedy proposed (software freedom) was unlikely to be an effective remedy in this case. I said nothing pro or con about software freedom.

          If you want to argue conceptually for software freedom, then Karen Sandler's case is nothing but an anecdote, and we can rehash the usual pro/anti FSF and GPL arguments all day long. Personally I don't view proprietary software as evil or even morally suspect, and I am fairly sure you disagree.

  • I know I can't think of any internet-connected system that doesn't have potential vulnerabilities. Why would anyone think medical devices were some sort of magical exception?

  • by mmell ( 832646 ) on Thursday January 12, 2017 @02:51AM (#53652297)
    During development of these devices, I suspect that if the software developers ever tried to raise security concerns, they were (correctly) told to worry about that after they had a device that could save lives. Not unlike documentation, once the miracle gizmo has made it past the FDA (i.e., gone into production), going back to fix kludges and clean up dirty code slides wa-a-ay down the list of priorities. Happens all the time in IT.

    True story.

      Not unlike documentation, once the miracle gizmo has made it past the FDA (i.e., gone into production), going back to fix kludges and clean up dirty code slides wa-a-ay down the list of priorities.

      And this is why capitalism is evil. It doesn't care about you, it just wants your money. It doesn't care how much environmental damage is done, for example. Therefore, it rewards a company for forever abandoning projects half-finished, at the point at which someone will pay for them. Then we are buried beneath an avalanche of ill-conceived garbage.

    • You don't want kludges in your pacemaker. There's a time and place for them, but that's not it.
  • A high ranking intelligence official, speaking on condition of anonymity, cited details of a classified government report which confirms without a doubt that hackers working on behalf of the Russian government, and personally supervised by Vladimir Putin, are conspiring to hack pacemakers in elderly Democrat-leaning USA voters to interfere with the 2018 elections.

  • Everything except telemetry should require some kind of hardware-level permission (analogous to a write-protect switch), not software. Telemetry should be end-to-end encrypted.
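    The telemetry half of that suggestion can be sketched with standard-library primitives. This only shows the integrity/authenticity side (an HMAC over each frame); real end-to-end encryption would add an AEAD cipher, and every name below is an illustrative assumption, not any vendor's protocol.

```python
import hashlib
import hmac
import json
from typing import Optional

TAG_LEN = 32  # length of a SHA-256 HMAC tag

def frame_telemetry(key: bytes, reading: dict, seq: int) -> bytes:
    """Serialize a reading with a sequence number and prepend a MAC."""
    body = json.dumps({"seq": seq, "data": reading}, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).digest()
    return tag + body

def verify_telemetry(key: bytes, frame: bytes) -> Optional[dict]:
    """Return the decoded frame, or None if it fails authentication."""
    tag, body = frame[:TAG_LEN], frame[TAG_LEN:]
    if not hmac.compare_digest(tag, hmac.new(key, body, hashlib.sha256).digest()):
        return None  # tampered frame or wrong key
    return json.loads(body)
```

    The sequence number lets the receiver reject replayed frames; anything that writes to the device belongs, as the comment says, behind a hardware switch rather than software.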

  • Should we start CVEs for biomedical equipment?

  • This is exactly like the game Hacknet! One mission was to hack a medical device manufacturer's system, find a patient's pacemaker IP, and kill him. He wanted euthanasia, but wasn't allowed to get it legally, so the player had to overload the pacer.
