Apple’s Health Experiment Is Riddled With Privacy Problems

By Jennifer Ouellette

Pharmaceutical giant GlaxoSmithKline (GSK) has partnered with Apple on a new clinical study on rheumatoid arthritis. The study relies on an iPhone app to collect data about arthritic symptoms from users as they go about their daily lives. That sounds great at first glance, but how well will it protect your privacy?

The app was built by London-based GSK using Apple’s ResearchKit, an open source software framework that transforms your iPhone into a handy diagnostic tool for clinical studies. As Gizmodo wrote at the time, “The platform aims to give anyone with an iOS device the opportunity to participate in medical research, join programmes that can help them track their symptoms, or share information with their doctors.”

So far there are just a handful of ResearchKit apps tied to clinical studies, but the GSK partnership is the first time Apple has joined forces with a major drug company. The Patient Rheumatoid Arthritis Data from the Real World (PARADE) study will use the app to track the mobility of over 300 participants suffering from rheumatoid arthritis, including information on their level of joint pain, fatigue, and changing moods. No drugs are being tested. Rather, the app guides users through a simple wrist exercise, with the iPhone’s built-in sensors recording data from that motion. That data may help Glaxo design better clinical trials in the future.

There are plenty of potential benefits to using the ResearchKit platform. It’s a huge boon to recruiting viable participants for medical studies—which can take months or years, depending on the study—and it’s cost effective, potentially saving millions of pounds. “Certainly you’ve also taken out the site costs, and the costs of having nurses and physicians explaining the studies to them and recording information,” Rob DiCicco, head of Glaxo’s clinical innovation and digital platforms group, told Bloomberg News.


But from the start, ResearchKit raised a host of ethical questions, particularly about protecting consumer privacy, and getting informed consent from all study participants.

Apple insists it never sees the data you provide through ResearchKit. But who the research institutions, hospitals, and doctors share that data with is up to them (within the constraints of laws like HIPAA). And collected data is typically “anonymised” by companies like Sage Bionetworks, removing any possible identifying information before the data is sent to the institution conducting the study.

That may not be sufficient to protect users’ privacy in this age of powerful big data informatics. There’s something called the “mosaic effect,” whereby it is possible to reconstruct someone’s identity from a relatively small amount of data, even after that data has technically been made anonymous. “We can’t promise perfect anonymity,” John Wilbanks, chief commons officer for Sage Bionetworks, told The Verge last year. “We’re going to de-identify it, but because we’re going to make it available for lots of research, there exists the chance that someone could re-identify you.”

The first few ResearchKit apps seemed to improve on the usual one-on-one process for informed consent. With an asthma app designed by Mount Sinai Hospital, for instance, users must flick through 12 separate pages, each with big graphics, large 18-point type, and simple language describing every possible risk and benefit of the study. Then users take a quiz to demonstrate they have at least a rudimentary understanding of just what they’re agreeing to.

So how does the GSK PARADE app-based study address those concerns, if at all? The company has yet to respond to Gizmodo’s request for comment, but we asked bioethicist Nicholas Evans of the University of Massachusetts Lowell to weigh in, based on his own cursory exploration of the app.

The biggest issue, according to Evans, is the app’s approach to informed consent. Tapping the button to launch the consent process takes the user to a nine-page PDF document in 12-point font—the kind of thing that’s really difficult to read on your iPhone. Then the user taps the “Agree” button in the bottom right-hand corner. Voilà! Instant informed consent. It’s similar to those End User Licence Agreements that everybody agrees to without even reading them, because come on, we just want to use the damn app already.

“A nine-page letter in PDF format is not a great way to structure informed consent on an iPhone,” Evans said. “I don’t see that there’s any incentive for someone to read this.”

In response to concerns about informed consent, Apple changed ResearchKit’s terms of use to require an ethics review for every app—that is, each app must pass muster with what’s known as an Institutional Review Board (IRB). Presumably GSK did so before launching the PARADE study.

The problem, according to Evans, is that the Apple platform doesn’t work with the usual university-based IRBs, but with a Seattle-based, private, for-profit IRB called Quorum Review. A quick perusal of its website reveals that it offers a 24-hour site review turnaround. “Twenty-four hours is not enough time to review a study protocol and its informed consent documents, and make a determination on whether or not you should approve this for the participation of human subjects,” Evans said.

It’s great that Apple put an IRB requirement in place, but that has led to an emerging market for private IRBs like Quorum Review, and a greater risk of “rubber stamping” proposed studies. “It’s going to become an increasing problem as more tech companies get into the biomedical space to do research that have really promising results,” Evans said, “but don’t go about crossing all their T’s and dotting all their I’s when it comes to their human subject research ethics.”

Part of that is linked to the “fail early, fail often” mantra associated with the Silicon Valley start-up culture, which is at odds with the slow and incremental steps required for biomedical research. “It’s great to fail early and often when the worst that can happen is someone losing their Instagram account,” Evans said. “It’s a lot more serious when failing early involves giving someone an incorrect diagnosis for a congenital condition for which there isn’t a cure. So validation has to be done.” [Bloomberg News]