
Apple believes it is protecting freedom. It’s wrong. Here’s why.

Ed. note: This is an expanded version of a previous article, “We Don’t Need Backdoors.”

By Dave Schroeder

Let me open by saying I’m not for backdoors in encryption. It’s a bad idea, and people who call for backdoors don’t understand how encryption fundamentally works.

Apple has been ordered by a court to assist the FBI in accessing data on an iPhone 5c belonging to the employer of one of the San Bernardino shooters, who planned and perpetrated an international terrorist attack against the United States. Apple has invested a lot in OS security and encryption, but Apple may be able to comply with this order in this very specific set of circumstances.

Apple CEO Tim Cook penned a thoughtful open letter justifying Apple’s position that it shouldn’t have to comply with this order. However, what the letter essentially argues is that any technical cooperation beyond the most superficial, anything more than the claim that there is “nothing that can be done,” is tantamount to creating a “backdoor,” irrevocably weakening encryption, and faith in encryption, for everyone.

That is wrong on its face, and we don’t need “backdoors.”

What we do need is this:

A clear acknowledgment that what increasingly exists essentially amounts to virtual fortresses impenetrable by the legal and judicial mechanisms of free society, that many of those systems are developed and employed by US companies, within the US, and that US adversaries use those systems — sometimes specifically and deliberately because they are in the US — against the US and our allies, and for the discussion to start from that point.

The US has a clear and compelling interest in strong encryption, and especially in protecting US encryption systems used by our government, our citizens, and people around the world, from defeat. But the assumption that the only alternatives are either universal strong encryption, or wholesale and deliberate weakening of encryption systems and/or “backdoors,” is a false dichotomy.

How is that so?

Encrypted communication has to be decrypted somewhere in order for the recipient to use it. That fact can be exploited in various ways. It is done now. It’s done by governments and cyber criminals and glorified script kiddies. US vendors like Apple can be at least a partial aid in that process on a device-by-device, situation-by-situation basis, within clear and specific legal authorities, without doing things we don’t want, like key escrow, wholesale weakening of encryption, creating “backdoors,” or anything similar, with regard to software or devices themselves.

When Admiral Michael Rogers, Director of the National Security Agency and Commander, US Cyber Command, says:

“My position is — hey look, I think that we’re lying that this isn’t technically feasible. Now, it needs to be done within a framework. I’m the first to acknowledge that. You don’t want the FBI and you don’t want the NSA unilaterally deciding, so, what are we going to access and what are we not going to access? That shouldn’t be for us. I just believe that this is achievable. We’ll have to work our way through it. And I’m the first to acknowledge there are international implications. I think we can work our way through this.”

…some believe that is code for, “We need backdoors.” No. He means precisely what he says.

When US adversaries use systems and services physically located in the US, designed and operated by US companies, existing under US law, there are many things — entirely compatible with both the letter and spirit of our law and Constitution — that could be explored, depending on the precise system, service, software, device, and circumstances. Pretending that there is absolutely nothing that can be done, and that it must be either unbreakable, universal encryption for all, or nothing, is a false choice.

To further pretend that it’s some kind of “people’s victory” when a technical system renders itself effectively impenetrable to the legitimate legal, judicial, and intelligence processes of democratic governments operating under the rule of law in free civil society is curious indeed. Would we say the same about a hypothetical physical structure that cannot be entered by law enforcement with a court order?

Many ask why terrorists wouldn’t just switch to something else.

That’s a really easy answer — terrorists use these simple, turnkey platforms for the same reason normal people do: because they’re easy to use. A lot of our techniques, capabilities, sources, and methods have unfortunately been laid bare, but people use things like WhatsApp, iMessage, and Telegram because they’re easy. It’s the same reason that ordinary people — and terrorists — don’t use Ello instead of Facebook, or ProtonMail instead of Gmail. And when people switch to more complicated, non-turnkey encryption solutions — no matter how “simple” the more tech-savvy may think them — they make mistakes that can render their communications security measures vulnerable to defeat.

And as long as the US and its fundamental freedoms engender the culture of innovation which allows companies like Apple to grow and thrive, we will always have the advantage.

Vendors and cloud providers may not always be able to provide assistance; but sometimes they can, given a particular target (person, device, platform, situation, etc.), and they can do so in a way that comports with the rule of law in free society, doesn’t require creating backdoors in encryption, doesn’t require “weakening” their products, does not constitute an undue burden, and doesn’t violate the legal and Constitutional rights of Americans, or the privacy of free peoples anywhere in the world.

Some privacy advocates look at this as a black-and-white, either-or situation, without consideration for national interests, borders, or policy, legal, and political realities. They treat the “law” of the US or UK as fundamentally on the same footing as the “law” of China, Russia, Iran, or North Korea: they’re all “laws”, and people are subject to them. They warn that if Apple provides assistance, even just this once, then someone “bad” — by their own, arbitrary standards, whether in our own government or in a repressive regime — will abuse it.

The problem is that this simplistic line of reasoning ignores other key factors in the debate. The US is not China. Democracy is not the same as Communism. Free states are not repressive states. We don’t stand for, defend, or espouse the same principles. Apple is not a Chinese company. If Apple truly believes that complying with a lawful US court order sets a precedent for nations like China, it should perform a little self-examination and ask why it seeks to operate in China, and thus be subject to such law.

The other argument seems to be that if Apple does this once, it would constitute a “backdoor” for “all” iPhones, and thus the abrogation of the rights of all. That is also categorically false. There are a number of factors here: The iPhone belongs to the deceased individual’s employer. The FBI may have a companion laptop that this specific iPhone considers a “trusted device,” which may make it possible to deploy an OS update without a passcode. The specific device and/or OS version may have other vulnerabilities or shortcomings that can be exploited with physical access.

This argument seems to be equivalent to saying that if government has any power or capability, it will be abused, and thus should be denied; and that encryption, or anything related to it, should somehow be considered sacrosanct. It’s like saying that if we grant the government the lawful authority to enter a door, they could enter any door — even yours. Some might be quick to say this is not the same. Oh, but it is. This is not an encryption backdoor, and does not apply to all iPhones, or even all iPhone 5c models, or even most. It applies to this specific set of circumstances — legally and technically.

It is puzzling indeed to assert that the government can try to break this device, or its crypto, on its own, but if the creator of the cryptosystem helps in any way, that is somehow “weakening” the crypto or creating a “backdoor.” It is puzzling, because it is false.

Specific sets of conditions happen to exist that allow Apple to unlock certain older devices. These conditions exist less and less, and in fewer forms, as devices and iOS versions get newer. Unlocking iOS 7, for example, only works because Apple has the key. The methodology would only work in this case because it’s specifically a pre-iPhone 6 model with a 4-digit passcode and there is a paired laptop in the government’s possession. All of this is moot on iPhone 6 and newer.
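To see why the 4-digit passcode matters, consider the arithmetic: the entire keyspace is only 10,000 combinations, so once the retry counter and escalating delays are out of the way, exhausting it is trivial. Here is a minimal back-of-the-envelope sketch in Python, assuming roughly 80 milliseconds per attempt for the hardware-entangled key derivation — a figure drawn from Apple’s own iOS security documentation of that era and used here purely for illustration:

```python
# Rough brute-force estimate for short numeric passcodes once retry limits
# and escalating delays are removed. The 0.08 s per-attempt cost is an
# assumption (Apple's documentation cited ~80 ms for the passcode-entangled
# key derivation of that era); treat the numbers as illustrative only.

SECONDS_PER_ATTEMPT = 0.08  # assumed key-derivation cost per guess

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_ATTEMPT / 3600

for digits in (4, 6, 8):
    print(f"{digits}-digit passcode: {10 ** digits:,} guesses, "
          f"~{worst_case_hours(digits):,.1f} hours worst case")

# 4 digits:      10,000 guesses, ~0.2 hours (about 13 minutes)
# 6 digits:   1,000,000 guesses, ~22.2 hours
# 8 digits: 100,000,000 guesses, ~2,222.2 hours
```

The arithmetic only bites if the retry counter and escalating delays can be disabled, which is precisely the narrow, device-specific assistance at issue in this case.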

Apple is welcome to use every legal mechanism possible to fight this court order — that is their absolute right. But to start and grow their company in the United States, to exist here because of the fundamental environment we create for freedom and innovation, and then to act as if Apple is somehow divorced from the US and owes it nothing, even when a court orders it to provide assistance, is a puzzling and worrisome position. They can’t have it both ways.

If Apple wishes to argue against the application of the All Writs Act — which, while old, is precisely on-point — it needs to make the case that performing the technical steps necessary to comply with this court order creates an “undue burden.” It may be able to make just that argument.


We exist not in an idealized world where the differences of people, groups, and nation-states are erased by the promise of the Internet and the perceived panacea of unbreakable encryption.

We exist in a messy and complicated reality. People seek to do us harm. They use our own laws, creations, and technologies against us. People attack the US and the West, and they use iPhones.

Apple says that breaking this device, even just this once, assuming it is even technically possible in this instance, sets a dangerous precedent.

Refusing to comply with a legitimate court order levied by a democratic society, because of a devotion to some perceived higher ideal of rendering data off-limits under all circumstances to the valid legal processes of that society, is the dangerous precedent.

The national security implications of this case cannot be overstated. By effectively thumbing its nose at the court’s order, Apple is not protecting freedom; it is undermining that protection for the sake of a misguided belief in an ideal that does not exist and is not supported by reality.

Dave Schroeder serves as an Information Warfare Officer in the US Navy. He is also a tech geek at the University of Wisconsin—Madison. He holds a master’s degree in Information Warfare, is a graduate of the Naval Postgraduate School, and is currently in the Cybersecurity Policy graduate program at the University of Maryland University College. He also manages the Navy IWC Self Synchronization effort. Follow @daveschroeder and @IDCsync.

The views expressed in this article do not represent the views of the US Navy or the University of Wisconsin—Madison.