When Ross Compton’s Ohio home caught fire last September, the story he told police was that he grabbed a few things and rushed out of the house, hurling essentials out a bedroom window he broke open with his cane before scrambling out himself.
Police, though, were suspicious. Compton’s few things had included a computer, a suitcase packed with clothes and the charger for the external heart pump that he needed to survive. It seemed unlikely that a 59-year-old man with a pacemaker and a heart pump would have been able to gather all those things and make it out of a burning house alive. But police were stumped on how exactly to make arson charges stick.
In the end, it was his pacemaker that did him in.
After obtaining a search warrant for all the electronic data stored in Compton’s pacemaker, police determined that his device did not corroborate his story. His heart rate, pacer demand and cardiac rhythms all suggested that Compton had not in fact quickly bundled up all his most prized possessions and left in a hurry as his house went up in flames. Last month, with the help of the pacemaker data, he was indicted on charges of aggravated arson and insurance fraud.
Privacy issues are moving under our skin—now the devices that keep us alive and healthy can also be used against us in a court of law.
In 2014, a Canadian law firm used a client’s Fitbit history to help make her case in a personal injury claim, in a first-of-its-kind strategy. In 2015, data from a Fitbit was used to undermine a woman’s rape claim. Now court cases regularly include evidence gleaned from fitness trackers.
It makes sense. The technology we use is programmed to serve dual masters, those who use it and those who make it. Sometimes, the interests of those two parties conflict. Think Facebook outing gay users in order to better target advertising or Yahoo scanning user emails on behalf of governments.
But Compton’s case breaks a new barrier—flesh. While you can delete your Facebook account or leave your Fitbit at home if you’re going somewhere you’d rather not be tracked, you can’t simply turn off your pacemaker. Not only does deactivating a pacemaker require a doctor, in some cases doctors actually refuse. What happens when privacy violations are committed by devices inside of us, devices that we can’t just turn off via settings?
“EFF is concerned that as technology advances, the erosion of individual privacy in personally identifiable health information increases,” Stephanie Lacambra, the Electronic Frontier Foundation’s criminal defence attorney, said in a statement to Gizmodo. “Americans shouldn’t have to make a choice between health and privacy. We as a society value our rights to maintain privacy over personal and medical information, and compelling citizens to turn over protected health data to law enforcement erodes those rights.”
There are more than 200,000 people in the US walking around with pacemakers, and they aren’t the only ones with tiny computers inside of them. The insulin pumps that diabetics rely on to maintain their blood sugar contain computer chips. Thousands of people with Parkinson’s disease rely on chips embedded deep in their brain to control violent tremors. Advanced prosthetics also increasingly contain microprocessors that allow those who wear them to move more naturally.
Tech titans like Apple and Google are investing heavily in health tech to allow us to not only gather data about our own bodies, but share that data more easily with people like doctors. Privacy watchdogs have cast these sorts of ventures as “medical surveillance.”
Ryan Calo, a law professor at the University of Washington who focuses on emerging technologies, said that evidence from devices like pacemakers shouldn’t even be admissible in court. As with DNA evidence before it, Calo said, the risk of using such data to wrongly implicate someone in a crime is just too high.
“There’s a tendency to believe that because something is recorded by a machine it is gospel,” Calo said.
In Compton’s case, there is other evidence to suggest he committed the crime: Not only did he pack up a suitcase and several bags of things before he left the house, he also had gasoline on his clothes. Still, the medical data, which was analysed by a cardiologist, represented “key pieces of evidence” in the case, according to police.
“The idea that a random cardiologist is going to be able to read the data in a pacemaker well enough to tell whether someone committed a crime is so implausible,” Calo said. “There is danger in not understanding what this data really tells you.”
Calo said that, at least for now, when it comes to medical devices and implants, he’s more concerned about hacking. Software is copyrightable, which means that manufacturers can prevent users from altering it or even doing basic security research on it. That means that in implants there are often bits of code that even those making them cannot see. People have carried out hacks via internet-connected devices like kettles and baby monitors. What if that hidden code had a backdoor that allowed a hacker to turn off your ability to turn off your insulin pump? Just last autumn, Johnson & Johnson warned diabetic patients of a defect in one of its insulin pumps that could theoretically allow such an attack. A few years ago, former Vice President Dick Cheney opted to remove the wireless functionality of his own heart implant, fearing a similar attack.
“There really is the possibility of ubiquitous sensors,” Calo said. And the more sensors there are, the more vulnerabilities there are to exploit.
“If we have computers in our bodies designed to treat us as adversaries, it amplifies all of the powers of inequality,” science fiction author and privacy activist Cory Doctorow recently told Gizmodo.
Doctorow said he first started to worry about implantable technology after attending a demonstration by MIT biomechatronics professor Hugh Herr. Herr, a double amputee himself, was jumping around on stage to show off a new bionic leg that could run, climb and dance.
It was a feat of engineering—something that could offer those who had lost limbs the opportunity to do things that were previously unthinkable, like climb a mountain. But Doctorow thought of stories about the catastrophes that had occurred when car loan lenders started simply turning off people’s cars when they failed to make a payment. A device installed in the vehicles allowed lenders not only to turn them off after a missed payment, but to monitor their location. One woman was left stranded while trying to get her daughter to a hospital during an asthma attack. Others found their cars suddenly disabled at stoplights or even while driving on the highway. Those car loan catastrophes, of course, primarily affected high-risk buyers who could only wrangle subprime loans.
“What if you miss a payment and suddenly your leg switches off?” Doctorow said. “Or the government turns you off and says, ‘We’ve immobilised you, we’re coming to get you.’”
This may seem a bit far-fetched, but payment is often a barrier to accessing essential services. US hospitals even sometimes weigh whether a patient can pay for medical care in deciding whether to continue life support.
Technology is both progress and peril. Every opportunity it brings must be balanced against the consequences of the new vulnerabilities it creates. It could give us more agency over our own bodies. Or it could land us in jail.
Last week, Ross Compton pleaded not guilty to setting his home on fire. He told a local TV station that the investigation had “gone way out of control” and that he had “no motive whatsoever to burn down my house.” His hearing is set for later this month.
In an interview with the Washington Post, the Middletown, Ohio officer who responded to the scene of Compton’s alleged crime marvelled at just how useful that pacemaker data had been.
“It was much more informative than we thought,” he said.
It was the first time officers in Middletown had relied on pacemaker data to help make a case. Since then, they have used pacemaker data again. Twice.