The Heart of the Matter
March 24, 2015 8:18 AM

Patients should be allowed to access data generated by implanted devices. After losing his health insurance, Hugo Campos has written an article detailing his frustrations with self-care: "I can’t access the data generated by my implanted defibrillator. That’s absurd."
posted by domo (47 comments total) 20 users marked this as a favorite
 
My chest tightened a little when he started talking about hacking (basically) into his heart with a thing he bought on eBay. Eeeeeee! What if you broke your own heart?!
posted by ThePinkSuperhero at 8:31 AM on March 24, 2015 [2 favorites]


Karen Sandler gave a talk at OSCON 2011 about the issue, which is actually bigger than just the right to access the data. People who require implants should also have the right to inspect the source code and hardware design of the machines being put into their bodies to keep them alive.
posted by Poldo at 8:33 AM on March 24, 2015 [9 favorites]


Still less dangerous than taking drugs bought on a Silk Road-style website.
posted by ymgve at 8:34 AM on March 24, 2015


Silly people. You only rent your medical information, so long as you keep up the premiums.
posted by Thorzdad at 8:37 AM on March 24, 2015 [2 favorites]


Just make closed-source software de facto illegal in all commercial products. In practice, we'd merely need to decide that (a) only human-produced works are copyrightable, not the works derived from them by machines, but that (b) compiled code can derive a copyright if the source code is made available.
posted by jeffburdges at 8:38 AM on March 24, 2015 [3 favorites]


Some of my family has ICDs and it is completely inefficient for them to have to go to the doctor only to see if the thing is still working. That's all the check-ups usually are -is it on? -how's the battery?
posted by Gor-ella at 8:39 AM on March 24, 2015 [1 favorite]


According to Irish medical-device maker Medtronic

Bit of a derail, but that phrase will generate a lot of snorts and harrumphs from Minnesota readers.
posted by gimonca at 8:40 AM on March 24, 2015 [8 favorites]


Silly people. You only rent your medical information, so long as you keep up the premiums.

I know he's unpopular here, but Cory Doctorow gave an interesting talk touching on this, "The Coming Century of War Against Your Computer", at the Long Now Foundation a few years ago.

Just wait until manufacturers - or 3rd parties they've sold your account maintenance to - can threaten to shut off that defibrillator for lack of payment. Or they stop updating its software when they fold and DRM prevents - hell, makes illegal - running an alternative. Smart medical devices are going to be a multi-layered nightmare for consumers.
posted by ryanshepard at 8:43 AM on March 24, 2015 [20 favorites]


> Just make closed-source software de facto illegal in all commercial products.

At the very least for situations where not having access impacts your fundamental rights such as being in control of your own body.
posted by Poldo at 8:44 AM on March 24, 2015 [1 favorite]


Surely this is a market rife with potential for the next generation of smartphone apps.

pacemakr
defibrillatr
deep brain stimulatr
posted by phunniemee at 8:45 AM on March 24, 2015 [11 favorites]


Some of my family has ICDs and it is completely inefficient for them to have to go to the doctor only to see if the thing is still working. That's all the check-ups usually are -is it on? -how's the battery?

They should ask about getting the check-ups done via telemonitoring instead, if they're not already. Though I assume the check-ups are a bit more thorough than your description - the doctor should at least review the device's reports, make any needed parameter adjustments, and ask how you've felt since the last visit.

It's like with any other checkup - it might feel useless as long as everything is OK, but you'll be really glad you got them those few times there's something wrong.
posted by ymgve at 8:48 AM on March 24, 2015 [2 favorites]


So, patients don't already have a right of access to these medical records under HIPAA... why? Aren't the device manufacturers healthcare clearinghouses covered under HIPAA? I don't understand this.
posted by zennie at 8:59 AM on March 24, 2015 [1 favorite]


According to Irish medical-device maker Medtronic

Bit of a derail, but that phrase will generate a lot of snorts and harrumphs from Minnesota readers.


Did they or did they not do the all-American inversion and buyout scam tax maneuver to pay fewer taxes?
posted by k5.user at 9:06 AM on March 24, 2015 [1 favorite]


One of the things that weirds me out about implanted devices with murky security protocols is that I'm sure one of the first proof-of-concepts for hackers is to make somebody's heart beat in the rhythm to a pop song.

Your induced heart attack should not be to the tune of "Roar."
posted by fifteen schnitzengruben is my limit at 9:12 AM on March 24, 2015


Did they or did they not

They did.
posted by gimonca at 9:22 AM on March 24, 2015


Oh, man. It made me angry enough when I had to fight my insurer to get them to cover the type of CPAP machine that would let me access my own sleep apnea tracking data in detail. Hadn't even occurred to me that you'd have to do the same for things actually implanted in your body. Am now angry/terrified.
posted by Stacey at 9:33 AM on March 24, 2015 [3 favorites]


I decided to figure out a way—even if it was radical—to download information from my ICD without being coerced into paying thousands of dollars for routine reports. The solution was to buy a pacemaker programmer, a medical device used by the clinic to program ICDs, which I soon found on eBay. I also spent two weeks... taking a course on the fundamentals of cardiac rhythm management.

This was the point in the article that I found myself wishing I was friends with the author. Buying a programmer for the implanted biomedical device that helps keep you alive, then using it to analyze your cardiac rhythms and correlate them with environmental influences, is about the most badass nerdpunk thing I've read in a long time.

It seems so self-evident that people should have the right to see their own medical information that I'd be curious to read a cogent argument against the idea. You can side with corporate interests in protecting proprietary data-gathering techniques, but don't other legal structures already protect those technologies? The article cites the manufacturers' pandering to doctors as the hurdle, but I also suspect they simply don't trust regular people to make use of raw data.

There's an interesting debate forming, as technology gives us more and more medical data on ourselves, about when people make good decisions and when information overload actually harms our decision-making ability. We're prone to cognitive errors and we have a frightfully flawed tendency to misjudge risk. It's increasingly difficult to tell who is an unbiased medical advocate and who's treating our wallets rather than our bodies - even our own doctors are influenced by multiple financial interests. Socially, there's been something of an erosion of faith in experts. People feel themselves to be alone and are overwhelmed by the information streams that are accessible to them. How, in that environment, do we get the right information to people in a format that actually helps them make better decisions about their health?
posted by itstheclamsname at 10:03 AM on March 24, 2015 [7 favorites]


Why isn't this industry more regulated?
posted by pmfail at 10:12 AM on March 24, 2015


Like zennie, I'm also not clear on why HIPAA would not apply. Does anyone have any insight into this?
posted by yeolcoatl at 10:32 AM on March 24, 2015


It's HIPAA vs. the Digital Millennium Copyright Act.

One costs money, the other makes money, guess which wins. Relevant EFF pdf.
posted by Dreidl at 10:50 AM on March 24, 2015 [5 favorites]


The last thing we need is a generation of Boomers who never did figure out how to set the clock on their old VCRs to go around bricking their hearts.
posted by Pope Guilty at 10:55 AM on March 24, 2015 [13 favorites]


You think this is bad, just wait till an implantable device manufacturer declares bankruptcy due to the lawsuits brought on by a batch of defective devices. Your (probably defective) pacemaker is no longer supported because the manufacturer doesn't exist - what do you do now?

Remember, the company's valuable intellectual property assets are being held in escrow for the creditors to sort through - and they'll get to it any day now, but probably not in time for you.

Meanwhile, your pacemaker demands: Abort, Retry, Fail?
posted by RedOrGreen at 10:58 AM on March 24, 2015 [7 favorites]


Thorzdad:
"Silly people. You only rent your medical information, so long as you keep up the premiums"
No problem. Just get the free model that plays ads to you and everybody in a 10 foot radius every 5 minutes or so by transmitting sound waves through your bones and using your skull cavities as speakers.
posted by Hairy Lobster at 11:18 AM on March 24, 2015 [1 favorite]


It's HIPAA vs. the Digital Millennium Copyright Act.

One costs money, the other makes money, guess which wins. Relevant EFF pdf.


Thanks.

Are these medical records currently subject to copyright though? The EFF pdf says "many outputs on medical devices are not protectable as copyrighted works" (footnote 2 on page 2).
posted by zennie at 11:27 AM on March 24, 2015


Your induced heart attack should not be to the tune of "Roar."
--posted by fifteen schnitzengruben is my limit

Yeah, I much prefer my impending doom to be the gratefully staccato beats of Bucephalus Bouncing Ball
------------
The last thing we need is a generation of Boomers who never did figure out how to set the clock on their old VCRs to go around bricking their hearts.
-- posted by Pope Guilty

Well, chmod 600, no?

And make sure the user doesn't have superuser permissions to override the settings. Read and execute have to be available for the user, certainly Read. But it's the Write that is scary, and something you need to figure out how to deal with.

I don't think I'd ever want to get a brain implant that has any sort of wireless connectivity (at the very least, transdermal data transference would be necessary to make me feel comfortable), but remote access? Hell no. You gotta put a big ol' helmet on me to get ANY privs...
posted by symbioid at 12:56 PM on March 24, 2015
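
To make the read-versus-write split in the comments above concrete, here is a minimal sketch, in Python, of the access policy being described: the patient (and the clinic) can read telemetry, but only a clinician credential can change settings. The class, method, and field names are hypothetical and do not come from any real vendor SDK.

# Illustrative only: a hypothetical gate for an implanted device, showing the
# "patient can read, only a clinician can write" policy discussed above.
# None of these names come from a real vendor SDK.

class DeviceAccessError(Exception):
    pass

class ImplantInterface:
    def __init__(self, telemetry):
        self._telemetry = telemetry          # battery level, lead impedance, episode log...
        self._settings = {"pacing_rate": 60}

    def read_telemetry(self, requester):
        # Reads are allowed for the patient as well as the clinic.
        if requester not in ("patient", "clinician"):
            raise DeviceAccessError("unknown requester")
        return dict(self._telemetry)

    def write_setting(self, requester, key, value):
        # Writes require a clinician credential -- the analogue of chmod 600
        # with the clinic, not the patient, as the owner.
        if requester != "clinician":
            raise DeviceAccessError("settings changes require a clinician")
        self._settings[key] = value

if __name__ == "__main__":
    device = ImplantInterface({"battery_pct": 87, "episodes": 2})
    print(device.read_telemetry("patient"))              # allowed
    try:
        device.write_setting("patient", "pacing_rate", 80)
    except DeviceAccessError as err:
        print("blocked:", err)                           # writes are refused

Run as a script, it prints the telemetry and then the "blocked" message, which is the whole point of the policy: reading is cheap and safe to allow, writing is gated.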


I'm pretty comfortable with requiring a doctor to make *changes* to your settings on your life-supporting hardware.

It's pretty unconscionable to prevent people *reading* information about their own medical condition/state.

I'm not thrilled with the whole 'baby-boomers are techno-illiterates who can't be trusted with technology' crap either. To start with, most of their doctors are of this generation. To continue with, most of the pioneers of technology are also aging folks (who still contribute to the state-of-the-art significantly). Ageism is as attractive as other flavors of bigotry.
posted by el io at 2:09 PM on March 24, 2015 [3 favorites]


I'm pretty comfortable with requiring a doctor to make *changes* to your settings on your life-supporting hardware.

It's pretty unconscionable to prevent people *reading* information about their own medical condition/state.


The problem is that it is difficult to accomplish the latter while preserving the restrictions of the former. We've already seen proof-of-concept attacks that can cause an insulin pump to empty itself into the user (which is almost guaranteed to kill the user).
posted by NoxAeternum at 3:31 PM on March 24, 2015 [2 favorites]


The last thing we need is a generation of Boomers who never did figure out how to set the clock on their old VCRs to go around bricking their hearts.

I just had a double cheeseburger topped with slabs of bacon and a fried egg with a side of crispy hash potato with duck heart gravy. I'm already bricking my heart.
posted by srboisvert at 3:53 PM on March 24, 2015 [3 favorites]


Fitbits, Hexoskin and the rest of the heart monitors seem to only "ship your data to the cloud".

Open APIs are almost nonexistent - or at least I didn't see 'em.

The only thing that doesn't is http://www.openbci.com/ and there the software isn't fully baked. (There were some other one-off Arduino projects that might work.)

The Courts are not gonna be able to help you; getting this fixed is Con-gress only. Unless you can make an argument that the International Human Rights Treaty grants you the boon, and good luck on making THAT stick.
posted by rough ashlar at 4:46 PM on March 24, 2015 [2 favorites]
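
For what an open, local API of the kind the comment above is asking for might look like, here is a small hypothetical sketch: a wearable that answers heart-rate queries on the local network instead of only shipping data to a vendor cloud. The URL, parameters, and JSON fields are invented for illustration and are not OpenBCI's or any vendor's actual interface.

# Hypothetical sketch of an open, local device API. Nothing here corresponds
# to a real product; the endpoint and fields are assumptions for illustration.
import requests

def fetch_recent_samples(host="192.168.1.50", minutes=10):
    resp = requests.get(
        f"http://{host}/api/v1/heart_rate",
        params={"window_minutes": minutes},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()   # expected shape: [{"t": "2015-03-24T16:46:00Z", "bpm": 72}, ...]

if __name__ == "__main__":
    for sample in fetch_recent_samples():
        print(sample["t"], sample["bpm"])

The design point is simply that the data never has to leave the local network to be useful to its owner.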


The last thing we need is a generation of Boomers who never did figure out how to set the clock on their old VCRs to go around bricking their hearts.

Sure, we should definitely leave it up to disenfranchised millennials, who have time to hack it, to handle it. I mean if you seriously think that keeping the source a secret will stop anyone, well then you haven't read tech news in like 20 years.
posted by lumpenprole at 5:13 PM on March 24, 2015


In case one wants to start doing heart tracking without the pacemaker, Neurosky has an ECG set and API.
posted by rough ashlar at 8:12 AM on March 25, 2015 [1 favorite]
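
If someone did want to experiment along those lines, the usual pattern for consumer ECG/heart-rate sensors is a serial stream of samples. The sketch below shows only that generic pattern; it is not the Neurosky API, and the port, baud rate, and one-sample-per-line format are assumptions for illustration.

# Generic serial-read sketch for a consumer ECG/heart-rate sensor.
# Not the Neurosky API: port, baud rate, and line format are assumed.
import serial  # pip install pyserial

def stream_samples(port="/dev/ttyUSB0", baud=57600):
    with serial.Serial(port, baud, timeout=1) as conn:
        while True:
            line = conn.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                yield int(line)        # one raw sample per line (assumed)
            except ValueError:
                pass                   # skip malformed lines

if __name__ == "__main__":
    for sample in stream_samples():
        print(sample)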


It seems so self-evident that people should have the right to see their own medical information that I'd be curious to read a cogent argument against the idea.

1. Security risk. How do you make it possible for patients to see the data without exposing it to hackers?

2. Incompetent analysis. I'm very conflicted on this issue, as an old zine/democracy/EFF sympathizer. But there is a pretty broad-based attack on science these days, led by lots of activists (mostly Tea Party conservatives and privileged New Agey liberals), that comes from people who really don't know what they're talking about digging into scientific details that are way over their heads.

The results are things like climate-change deniers picking nits on studies, and anti-vaccine and anti-fluoride activists vehemently niggling over strands of data against an ocean of counter-data, and even Donald Rumsfeld demanding raw CIA reports so he can come to a different conclusion.

These are very unsympathetic people to defend: pharmaceutical companies that make vaccines, UN meteorological experts, CIA analysts. But generally speaking they're good at what they do, and letting people who oppose them for ideological or political reasons second-guess their scientific decisions seems to be leading to a lot more harm than good.
posted by msalt at 4:11 PM on March 25, 2015


You think this is bad, just wait till an implantable device manufacturer declares bankruptcy due to the lawsuits brought on by a batch of defective devices

You can criticize the current system for a lot of things, but in the U.S. anyway this is not going to happen because of all the regulations. These systems are about the most rigorously tested machines on earth. Probably more so than airplanes and nuclear power plants and drones. And only well-funded companies can afford the studies in the first place.
posted by msalt at 4:12 PM on March 25, 2015


Actually, it's the opposite - Aerospace rules are stricter than those for medical devices.
posted by DesbaratsDays at 11:34 AM on March 26, 2015


> 1. Security risk. How do you make it possible for patients to see the data without exposing it to hackers?

Security through obscurity is probably the weakest kind of security you can have. Security through examination by dedicated software engineers is going to be a lot better.

> letting people who oppose them for ideological or political reasons second-guess their scientific decisions seems to be leading to a lot more harm than good

Wow. This has nothing to do with questioning the science, it's about questioning demonstrably shitty engineering. Frankly it's kind of offensive for you to equate skepticism of the FDA to that of climate skeptics and antivaxxers, because what they are doing is exactly the opposite of what those people want. They're not questioning the underlying science or the need for the implants, they're not looking to stop others from using them, they just want to make it possible to find and fix more problems than the manufacturer may have been able to on their own. And at least in the case of Hugo Campos and Karen Sandler, their lives may literally depend on it.
posted by Poldo at 6:19 PM on March 26, 2015 [1 favorite]


> These systems are about the most rigorously tested machines on earth.

Umm, no. As Sandler pointed out in her talk, the FDA doesn't have any kind of testing or validation requirements for the software that controls these devices. They rely entirely upon reports produced by the manufacturers.
posted by Poldo at 6:26 PM on March 26, 2015


Poldo -- be offended if you like, but your hyperbole and 4-5-year-old studies (showing 1 adverse event!) are not that persuasive. Karen Sandler just throws out an assumption that there is 1 bug in every 100 lines of code as fact. Incidentally, the article linked here is really close to plagiarising Karen Sandler's talk, at least on the cyborg gag that both use as their opener.

The common issue is average citizens demanding to see technical details of scientific processes they aren't really qualified to analyse. Sure, some people can teach themselves how to do it. But others can draw conclusions based on limited information, and this is readily (and frequently) politicized.

There's a broader issue about the ideology of open source software. It assumes a critical mass of volunteer programmers willing to poke, prod and improve the software. That crowd-sourced labor outweighs the risks of making the source code public -- if it exists.

But if that labor doesn't materialize you've just opened a big security hole with no offsetting advantage. It's hard enough getting people to work on sexy projects like Firefox or Drupal. I think it's fair to ask if a crowd will show up to dig through and fix several manufacturers' insulin pump source code.
posted by msalt at 12:19 AM on March 27, 2015 [1 favorite]


30 seconds of Googling found a pretty important point you missed:

The FDA issued tighter regulations over device software security 2 years ago, stating that devices won't be approved until manufacturers demonstrate cybersecurity.
posted by msalt at 12:27 AM on March 27, 2015


> your hyperbole

Please. You are the one comparing actual concerns to climate change skeptics and anti-vaxxers. If I said your arguments sound like those of an astrologer it would be equally silly. How about we all just stick to the issues?

> But if that labor doesn't materialize you've just opened a big security hole with no offsetting advantage.

The idea that having the source code visible makes the product less secure is an awfully big claim, and comparing the motivation in reviewing Drupal or Firefox to reviewing an implanted heart monitor/pacemaker is also pretty interesting. The new guidelines you linked to were issued to address "security concerns" discovered without access to any source code. As I said, relying on security through keeping secrets is not good enough.

> FDA issued tighter regulations over device software security 2 years ago, stating that devices won't be approved until manufacturers demonstrate cybersecurity.

Yes I'm sure issuing new recommendations and guidelines will solve the problem once and for all. And as they even state in their safety communication:
Note: The FDA typically does not need to review or approve medical device software changes made solely to strengthen cybersecurity.


So no, they still do not review software changes, and they continue to rely on self-reporting from manufacturers. You can brush off that JAMA article I linked to all you want, but the concerns over how the FDA regulates are real and ongoing.
posted by Poldo at 6:26 PM on March 30, 2015


We must trust individuals with detailed health information, because not doing so helps create anti-vaxxers, etc., msalt. There are security concerns around how one extracts the data, since you could kill someone by reflashing bad software, but overall far greater risks arise through restricting the technology. Imagine if your doctor said you should not visit, say, Brazil because it was embargoed by the monopolist manufacturer for violating some patent.
posted by jeffburdges at 9:27 PM on March 30, 2015




Poldo: It's really hard to take you as a fair witness about these concerns when you present evidence with obvious flaws. Besides 5-year-old links and not mentioning the FDA guidelines from two years ago (which you dismiss and apparently knew about), you're clearly misquoting the FDA note above.

You conclude "they still do not review software changes" but that's not what your quote says at all:

The FDA typically does not need to review or approve medical device software changes made solely to strengthen cybersecurity.

Your more recent link is important, but the problems were found as part of a government audit of software security in key medical devices. That's exactly the thing you complained was missing. Etc.
posted by msalt at 10:35 AM on April 4, 2015


jeffburdges: There are two completely separate issues here, software security against malicious hacks, and rights to personal data.

The first is a technical and policy issue, but the latter is thornier. In general, sure it's your data. But I think there are legitimate concerns about people who want to find different answers in data that they aren't really qualified to understand.

My concern is stuff like Dick Cheney demanding the raw intel on Iraq because he didn't like the analyst's conclusions. When people are facing death, there are powerful drives to change the news to their liking.

The other concern is charlatans like Dr. Joseph Mercola, who are expert at preying on people's fears and emotions, including natural resistance to authority figures such as doctors and big companies, to make himself money. And he is in fact a prime driver of anti-vaxx, using exactly these methods. This is what quacks do, from literal "snake oil" salesmen to laetrile peddlers.
posted by msalt at 10:44 AM on April 4, 2015


msalt, you still haven't even explained how having access to the source code could make the software less secure. And despite your dismissal of the "5 year old" papers, they are intended to show how this is an ongoing problem. If you want to claim that the new guidelines have somehow solved this problem, go right ahead, but so far you haven't done that. Had you read the trivial security issues release I pointed to, you would have noted that the problems were not found by the FDA (even after those great new suggestions were put in place), DHS or indeed any "government audit", but by an independent security researcher who was nice enough to report it.

As for your claims I misquoted, I don't see it. Changes made "solely" to strengthen "cybersecurity" are indeed changes made without review, and they are exactly the sort of changes that should require it. Had the code been available, software developers whose own lives (or those of loved ones) depended on such devices would have had more than enough personal motivation to find and report problems.

Your inability to stand behind your claims, combined with your baseless and obnoxious ongoing insinuations about the intentions of the people involved, makes this my last comment on the subject.
posted by Poldo at 9:43 AM on April 5, 2015


There's an interesting contradiction here on the software side of this issue. If the problem is lack of government review of software, how do you square that with open source? The whole point of open source is to remove controls over what software is released.

In an open source model, no individual or company is responsible for the software, and any user can use any modification of the software in their device. So if there are people hurt as a result of software errors, who will compensate the victims? This isn't a big problem if you're talking about WordPress and your website, but the stakes are very different with an insulin pump.

>> Had the code been available, software developers whose own lives (or those of loved ones) depended on such devices would have had more than enough personal motivation to find and report problems

This assumes that there are multiple software developers who need each of the devices in question or have loved ones that do (and that they know the right language, have time to work on it, etc.) I don't think that's a safe assumption.

What IS a safe assumption is that anyone with malicious intent would download the source code and look for vulnerabilities. That would be much easier in open source. The asymmetry is that this model requires multiple developers to protect each device, but a single malicious person with nihilistic or terroristic goals only needs one vulnerable device -- they can check different ones until they find an easy target.
posted by msalt at 4:36 PM on April 5, 2015


There is no real review except through open source software. You can pay the core developer group regardless of license, which must include in-house review. If, however, code isn't being written to be read by complete strangers outside the organization, then it'll wind up sucking.
posted by jeffburdges at 7:21 PM on April 5, 2015


So you're saying there was never good code in history before the open source movement? That seems ... extreme. VisiCalc sucked?
posted by msalt at 11:17 PM on April 5, 2015 [1 favorite]




This thread has been archived and is closed to new comments