For the past few weeks, I’ve been following the FBI / Apple phone unlocking case, and digging deep into the debate around encryption, security and privacy.
This debate is as old as the sun, and the exact same arguments we’re going through now were fought 20 years ago during the first crypto wars and the US government’s effort to deploy the Clipper Chip as a way of sharing crypto keys between industry and government. The stance of the tech industry has always been “strong crypto or else, because Math,” and the stance of the government has been “come on guys, let’s figure something out here.”
At USV, we’ve been trying to look at this round of the fight with fresh eyes, to the extent possible. What we’ve been wondering is: is there something different this time around? Has anything changed that might make us reconsider these dug-in, partisan-esque positions? Are there unintended consequences that the tech industry hasn’t been considering?
To paraphrase my colleagues’ arguments: Fred points out that trust, safety and security are serious issues within and around web platforms, and that platform operators do have a civic duty to cooperate with law enforcement when it’s necessary and lawful (on the surface this is not controversial — it all depends on the whys and hows). Albert adds to that, and has also written extensively about the general concern that crypto trench wars lead us down the path to a spy-vs-spy society where information and knowledge are locked up, rather than an open society that benefits from collective intelligence and open knowledge.
The part I really want to dig into is an apparent parallel here between data security and DRM. With DRM, there’s been a 30-year battle to lock down the entire software and hardware ecosystem in the name of controlling access to content. Internet / free culture advocates have long pointed out that the more enlightened approach is to understand that information wants to be free, and that we can all be better off if we adapt our culture, expectations, and business models to a world where remixing is allowed.
Now, as we look at data security and privacy, I feel a lot of those same forces coming to bear: in the name of data security and privacy, we all need to get on board with a controlled software / hardware model where companies, rather than users themselves, control data flows. This is best exemplified by Apple’s security model, which stores encryption keys in a separate “secure element” and only allows software to be installed that’s signed by Apple — conforming not only to their security policies but also to their control policies.
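To make the control point concrete, here is a minimal sketch of how a signed-code gate puts the vendor, not the user, in charge of what a device will run. The names are hypothetical, and a hash allowlist stands in for the asymmetric signatures real platforms use; this is an illustration of the structure, not Apple’s actual implementation:

```python
import hashlib

# Illustrative simplification: the "vendor" approves packages by
# recording their digests; the "device" refuses to install anything
# whose digest the vendor has not approved. Real platforms use
# asymmetric code signatures rather than a shared allowlist, but the
# power structure is the same: the gate answers to the vendor.

VENDOR_APPROVED = set()

def vendor_sign(package: bytes) -> None:
    """The vendor 'signs' a package by approving its digest."""
    VENDOR_APPROVED.add(hashlib.sha256(package).hexdigest())

def device_install(package: bytes) -> bool:
    """The device installs only vendor-approved packages."""
    return hashlib.sha256(package).hexdigest() in VENDOR_APPROVED

official = b"official app build 1.0"
sideload = b"user-modified build"

vendor_sign(official)
print(device_install(official))   # True  -- vendor-approved
print(device_install(sideload))   # False -- rejected, even if the user wants it
```

The point of the sketch is that the user never appears in the decision: whether the check serves security policy or control policy is entirely up to whoever holds the signing key.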
This, I think, is where some of us have gotten uncomfortable. What we don’t want is for the cause of security and privacy to lead us down the path to lockdown and the war against general purpose computing, the way DRM has. A risk here is that many of the folks fighting for copyright reform and device unlocking may be unwittingly undermining those same causes in the crypto/privacy/security fight.
So what I’ve been trying to do is parse apart the issues of security and control. Can we have one without the other, and can we talk about them, and advocate for (or against) them separately?
Amazingly, as I’ve been chewing on this part specifically, I came across this announcement about the effort to assemble a secure, open, mobile OS + app + app store stack. What we’ve got here is a hardened operating system built on Android (Copperhead OS), a set of secure applications (from the Guardian Project), and a distributed app store (F-Droid) with no central gatekeeping.
Why is this important? Because it shows that it’s possible to have verifiable security without the anti-innovation control that comes from centralized app stores. For example: one of our portfolio companies recently realized that by shifting from an app-store model to an API-based model, they could increase their product iterations by 1000% — shipping new code instantly, rather than waiting weeks for app store approval. This is the kind of innovation we want, and it’s just not possible with the controlled app store model.
It’s also important for other kinds of security — specifically, the ability for users to audit and inspect the devices and services they use. This was a key lesson of the VW emissions scandal, and it will become increasingly important as more Internet of Things devices do more things with more data. If we move toward a world of DRM-style data lockdown, we’ll have less knowledge of how products work and less control over our information.
This has been a long post, so I’ll just summarize by saying: I think it would do everyone good to keep looking at the encryption issue not simply through the lens of privacy and security, but also through the lens of openness and innovation, and make sure that whatever policies and technologies we support coming out of this strike the best possible balance.
P.S. The best resources from the academic community on this subject are Keys Under Doormats, an MIT publication pointing out the security risks of the “key escrow” systems that the government prefers, and Don’t Panic, a Berkman Center report pointing out the extent to which the “going dark” framing is misleading, since the overall surface area for digital surveillance has grown dramatically even as strong encryption has made some data inaccessible.