Apple’s CareKit Is the Best Argument Yet for Strong Encryption
ON THE EVE of his company’s court date with the FBI, where it will defend its right to not weaken the security of its own devices, Apple CEO Tim Cook took the stage at a small theater in Cupertino to introduce a few new devices. The message of the event’s opening, though? Encryption matters. And soon, on iOS, it will matter even more.
While Cook’s remarks were brief, they were determined.
“We need to decide as a nation how much power the government should have over our data, and over our privacy,” Cook said before a mixed crowd of journalists and Apple employees. “We did not expect to be in this position, at odds with our own government. But we believe strongly that we have a responsibility to help you protect your data, and protect your privacy. We owe it to our customers, and we owe it to our country. This is an issue that impacts all of us, and we will not shrink from this responsibility.”
That Apple is prepared to fight the FBI’s insistence that it create software that would help law enforcement unlock an iPhone has been clear from the beginning. The company’s response has been active on its own site, in the media, in legal briefs, and before Congress. Cook’s message was of a piece with what Apple has repeatedly stressed.
The more powerful message, though, came a few minutes later in the keynote address in the form of a new platform called CareKit, which will let developers build health care apps capable of monitoring a wealth of information through Apple devices. The kind of data that you'd want to guard as closely as your own heartbeat.
Your Body’s Secrets
CareKit is best seen as a close cousin to ResearchKit, a platform Apple introduced last year that helped research institutions implement large-scale studies through the use of iOS devices. CareKit applies that same motivation to the individual level.
“We believe that giving individuals the tools to understand what is happening with their health is incredibly powerful,” says Apple COO Jeff Williams. “Apps designed using CareKit make this a reality by empowering people to take a more active role in their care.”
CareKit won't be broadly available until next month, but Apple demonstrated several apps that were given early access. One, developed by Sage Bionetworks and the University of Rochester, monitors the effectiveness of Parkinson's disease medication. Another, called Start, helps those taking antidepressants track the effectiveness of their medications. Still another, from the Texas Medical Center, replaces a pile of take-home paperwork with software that prompts patients to follow their often complicated post-operative steps to the letter.
The possibilities continue. CareKit will include a Care Card, which helps keep tabs on medication or physical therapy. People can measure their symptoms, upload photos that illustrate the healing process, and more. And it includes something called Connect, which enables information-sharing between patients and doctors, or family members.
The implications of CareKit, and ResearchKit before it, are momentous. ResearchKit has already helped track everything from asthma outbreaks to, through a new partnership with 23andMe, genetic profiles. Now, in addition to helping find a cure for Parkinson's disease on a large scale, an iPhone will be able to help a Parkinson's patient deal with its everyday realities.
The Most Sensitive of Data
This is powerful stuff. And the iPhone makes for an ideal health care aide, thanks to its motion-tracking M-series coprocessor, its suite of sensors, and its huge installed base. There's no limit to what people with serious medical conditions can track, and no cap on how useful an iOS device might be in coping with the day-to-day realities of illness. Our phones, if they didn't already, will know more about us than we know about ourselves.
It’s the sort of information you can’t provide unless you know that it will be protected, and the kind of data you can’t collect unless you know you can protect it. The sort of information that simply can’t be given or received without strong encryption. It’s not simply a matter of giving up a sent message or a private photo. It’s an intimate record of your health, in some cases, presumably, down to the second.
FBI Director James Comey often complains of “dark spaces,” and the dangers of a digital locker that can’t be opened. “The problem with the safe comparison is there’s no safe in the world that can’t be opened,” Comey testified to Congress last month. “If our experts can’t crack it we’ll blow the door off. This is different.”
Thank goodness. If anything merits the safety of a "dark space," it's our biological ticks and tocks, our blood pressure and our treatments and our tremors. Those things are ours, and no one else's, whether they're in our bodies or in our phones. The safe analogy fails not just because this is a door that can't be blown off; it fails because you can't store your most vulnerable self inside a safe.
And that's what CareKit, and the apps it will enable, mean. Our phones will become more than just extensions of ourselves. They're not just the equivalent of a photo album or a stack of telegrams. They're our diseases and our therapies, our heartbeats and our breaths. If anything, the idea of sharing that much even with our devices should make us squeamish, especially given that even strong encryption isn't always fully safe. The idea that anyone would actively attempt to weaken an already fragile system, with this kind of information at stake, seems unthinkable. Especially when that "anyone" is a law enforcement agency.
“If we are going to trust Apple with this data, I think this makes a very strong argument for keeping the data away from prying eyes,” says Jake Williams, founder of Rendition Infosec, who also highlighted potential concerns about handing over this much data to any company in the first place, including insecure apps, and the potential that sensitive data would be subject to subpoena. “In other words, this makes a great case for wide scale encryption on the iPhone with no back doors.”
The alternative would be for Apple not to introduce CareKit at all, and to withdraw ResearchKit, and to bail on the idea that it can use its millions of iPhone sales to enrich the human experience through science. That hardly seems viable either, especially when what's demanded in exchange, the degradation of both privacy and security for all Apple users, costs so much and offers so little.
As Apple and the FBI bring their arguments into the courtroom, remember that the fight is not just about one iPhone, or even about every iPhone that exists today. It's not just about privacy and security. It's about progress, and what it's worth to impede it. The answer, if there is one, probably isn't whatever's in that San Bernardino iPhone.
Source | Wired