Crypto Wars II

26 Feb 2016

This is a modified version of an email newsletter I sent on February 22, 2016. I have attempted to update it for recent developments, but it may not be fully up to date.

Marketing all the way down

The Justice Department has called Apple’s refusal to install a backdoor in the work phone of San Bernardino terrorist Syed Farook a “marketing strategy,” implying that there is something illegitimate about catering to customers’ real demand for security. For most of us, privacy means keeping our nudes and dickpics and occasional subversive thoughts private. What is that but a form of marketing?

At the same time, the government’s stance is just as much marketing as Apple’s. In all likelihood, as Marcy Wheeler writes in Slate, there is nothing interesting on the iPhone 5c in question, because Farook destroyed two of his personal devices and because the FBI has so much other information about his activities, including a lot of information already turned over by Apple from iCloud servers.

Even if there is something interesting on the phone, the US government almost certainly has the means to hack into the phone. And there are indications that Apple offered to backdoor the phone under seal, presumably so the case could not be used to set precedent, and presumably the United States turned that down for the same reason.

What is the slippery slope Apple is afraid of? Is it being forced in the future to install a backdoor into all devices? Is the Farook case a proxy for a fight in some sealed ex parte proceeding at the FISA court? Perhaps Apple has already been secretly ordered to install a backdoor in all or some of its devices, or to hand over a backdoor to the government? Or is the fight about not giving the Chinese or Indian governments the ammunition to demand their own backdoors?

Is the FBI truly worried about the devices of terrorist suspects or kidnappers? It seems unlikely, because the bureau can gather so much information through other means and bring massive investigative resources to bear in big terrorism cases. Or is the bureau more worried about its ability to investigate drug cases or traffic accidents or kidnappings? Is this proceeding part of some long game to delegitimize and ban encryption? Is there any rational reason?

Are the intelligence and law enforcement communities against encryption because they believe it completely prevents them from catching bad guys? Do they hate encryption because it merely slows down their work? Is the goal to punish Apple and Tim Cook for being annoying and contradicting them in public? Are undercover Chinese fifth columnists within the US government engaged in a long con that will result in iPhones sold in China having backdoors?

Reading the debates in the security community, people will have different answers to all of these questions. The answer could, of course, be “all of the above.” This first battle in Crypto Wars II is hard to understand because it is hard to know what the two sides’ true motivations are, what they want and what they fear.

Crypto Wars I

This is the beginning of Crypto Wars II, so let’s remind ourselves about Crypto Wars I. Back in the 1990s, the United States government tried to restrict cryptography in various ways, by banning it or by mandating weak crypto and backdoors.

The most common encryption algorithm was still DES, published in 1975. DES uses a key that is 56 bits long, which was arguably inadequate even in the 90s. But the US government wouldn’t let American companies export software that supported full DES. So the “export” version of Netscape Navigator, which I had to use in Europe, only supported an effective key length of 40 bits, which meant the NSA could easily decrypt the credit card numbers of Europeans shopping on Amazon.co.uk. Not much else was protected at the time. Encrypted email was not common.
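
To get a sense of the gap, here is some back-of-the-envelope arithmetic in Python; the guesses-per-second figure is an arbitrary assumption for illustration, not a measurement of 1990s (or NSA) hardware.

```python
# Rough brute-force arithmetic: 40-bit export crypto vs. 56-bit DES.
# The guess rate below is an illustrative assumption, not a real benchmark.
GUESSES_PER_SECOND = 10_000_000_000  # assume 10 billion keys per second

for bits in (40, 56):
    keyspace = 2 ** bits
    seconds = keyspace / GUESSES_PER_SECOND
    print(f"{bits}-bit key: {keyspace:.2e} keys, "
          f"~{seconds:,.0f} seconds (~{seconds / 86400:.1f} days) to try them all")
```

At that assumed rate, the 40-bit keyspace falls in under two minutes while full DES takes months; those 16 extra bits are a factor of 65,536.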

The encryption used in the GSM standard for mobile telephony was also deliberately weakened so attackers could more easily eavesdrop on phone calls. The NSA obviously had the technical and computational capabilities to do all this eavesdropping. At the time anyone with a few million dollars to spend could do it too, but the cost fell rapidly to perhaps a few hundred thousand dollars in the late 90s and perhaps a few thousand dollars today.

The NSA wanted people to use an encryption chip of their own design called the Clipper chip. It was supposed to provide reasonably good cryptography, but was intentionally backdoored. In the end, the Clipper chip failed both commercially and technically, as the backdoor could be easily circumvented.

While the US government could regulate the manufacture and export of munitions, which it considered cryptography products to be, since 1791 it has not been able to ban free speech. So someone had the bright idea of printing the source code for the popular Pretty Good Privacy (PGP) crypto program in physical books that were legally exported from the United States. Those books would then be scanned and the source code reconstructed using OCR and released outside the United States as “PGPi,” the international version of PGP.

DJ Bernstein, a computer science professor then at the University of Illinois at Chicago, sued the United States on two occasions to assert his right to publish the source code for an encryption algorithm, arguing that code is speech, and won in the 9th Circuit. (Some of you may also love or hate djb as the author of qmail.) And overall, the pro-crypto side won Crypto Wars I. Although there have been a few attempts to fine American technology companies for crypto exports, in practice, cryptography is not illegal. There are few overt attempts to purposefully weaken cryptography; instead, the attempts happen in the shadows through mysterious government committees and bribes.

What is Crypto Wars II about?

There are basically two types of data you might want to encrypt and secure. The first is data in transit, for example iMessage messages, which might be captured by the NSA. Apple has designed the encryption behind iMessage so only the sender and the recipient have access to it, and Apple itself does not. This is known as “end-to-end” encryption.
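
As a rough sketch of the end-to-end idea (an illustration of the concept using the PyNaCl library, not Apple’s actual iMessage protocol): a message encrypted to the recipient’s public key can only be opened with the recipient’s private key, which never leaves the device.

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# This illustrates the concept; it is not Apple's iMessage protocol.
from nacl.public import PrivateKey, Box

sender_sk = PrivateKey.generate()     # sender's private key, stays on the sender's device
recipient_sk = PrivateKey.generate()  # recipient's private key, stays on the recipient's device

# The sender encrypts with its own private key and the recipient's public key.
ciphertext = Box(sender_sk, recipient_sk.public_key).encrypt(b"see you at six")

# Any relay in the middle sees only ciphertext (plus metadata: who, when, how big).
plaintext = Box(recipient_sk, sender_sk.public_key).decrypt(ciphertext)
assert plaintext == b"see you at six"
```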

The other kind is data at rest, stored on your laptop, phone and other devices. The content of data in transit is probably easier to secure than data at rest. But the NSA will always be able to intercept the metadata and thus know who you are talking to, how often, and when, and learn a lot about what you are doing. If you use a messaging service such as Facebook Messenger, your messages are encrypted between your computer and Facebook’s servers, and again between Facebook’s servers and the recipient. But Facebook itself has full access to the contents of the messages, and law enforcement agencies can go to Facebook and simply request a copy of your messages.

Data at rest is difficult to secure. There are often bugs in cryptography products and various ways to attack them that don’t rely on breaking the cryptography directly, and all of these flaws are easier to exploit when you have physical access to the encryption device.

The attacker can literally freeze your laptop and extract the contents of the memory chip later, a much cheaper attack than breaking the encryption. Or more trivially, you might have forgotten to lock your phone or laptop, or the FBI or the mob or whoever might get to you before you have an opportunity to do so. Or, if you are still alive, they can enhanced-interrogate you until you reveal the password or the contents of your devices, or place your finger on the Touch ID sensor. The NSA was able to “listen” to secure faxes at the EU mission in Washington by manipulating the power supply of the fax machine to leak the contents of the faxes.

If you follow the discussion in the security community about Crypto Wars II, there are differing opinions on what the government’s true objectives are.

In the first scenario, the NSA already gets enough out of messages in transit, whether through metadata or through other vulnerabilities, that it doesn’t really care about them. The true challenge is data at rest inside devices physically captured from targets.

In the second scenario, once law enforcement officers have seized physical evidence and executed search warrants, they know so much about the target that a few encrypted devices don’t matter so much. By the time the FBI has seized your iPhone, the agents have so much else that you are for all practical purposes p0wn3d, and any cooperation from Apple is often beside the point. The attacker is really going after messages in transit, before there is an opportunity to seize physical devices, where end-to-end protected services like iMessage are a big problem.

It is also possible that the US government doesn’t really have good reasons for wanting what it wants. Much of the Crypto Wars II agenda is driven by the intelligence community, and the intelligence community has always overestimated the importance of spying, and is ideologically opposed to anything that appears to hinder spying. See, for example, the lengths to which the community will go to spy on Angela Merkel, of all people.

Threat models

Security folks love to talk about threat models. For our purposes, think of a threat model as a theory of who might want to attack us and how. If you are a Chinese dissident, you might be worried about the capabilities of state actors with large budgets. If you are a normal law-abiding resident of a Western democracy, you might be more worried about identity thieves and internet trolls, and possibly being caught in a mass surveillance dragnet, but you are less worried about being singled out and specifically targeted by your government.

As someone in the latter category who uses a secure messaging service such as iMessage, the physical security of my phone may well be my biggest concern. Someone who is out to get me—with the resources of someone who is likely out to get me—may well find it much easier to find me on the street and steal my phone than to eavesdrop on my iMessages and try to break the encryption.

What has Apple done in the past?

Apple has a 15-page document about what it will and will not do for US law enforcement agencies. Generally speaking, anything stored on Apple’s servers is fair game for disclosure. Even if you use the end-to-end encrypted iMessage system for texting with your friends, if you have iCloud Backup enabled, those messages, as well as your photos and a lot of other data, are backed up to Apple’s servers, and can be released to law enforcement (and onanists).

If you don’t use iCloud Backup, your messages are secure, but Apple will still tell the FBI everything about when you sent or received those messages and who you communicated with. Your iCloud Keychain is encrypted in a different way and cannot be released this way without your passcode (or authorization from an unlocked device). All this has been done in the San Bernardino case.

When a law enforcement agency presents Apple with a locked iOS device, Apple will also extract whatever data it can. In older versions of iOS, some data was stored on an iPhone unencrypted, including photos and messages, and Apple would be happy to extract that too. But a reasonably sophisticated law enforcement organization such as the FBI can also desolder the flash memory chips in the phone and do the same thing themselves.

In successive versions of iOS, more and more data has been encrypted and protected by the passcode. So today, very little data can be directly extracted from an iOS device without having to worry about encryption and passcodes. If you want to get naked selfies out of an iPhone running a recent version of iOS, you need to enter the passcode or use Touch ID.

Older versions of iOS only supported a 4-digit numeric passcode, so an intern could probably go through all the 10,000 possible combinations in a day. However, iPhones are set up to introduce a delay in between passcode attempts, and can be set up to wipe the phone after 10 failed passcode attempts. That is why the FBI wants a backdoor for Farook’s phone. If Apple could develop a special version of iOS that does not have a delay between passcode attempts or wipe the phone after 10 failed attempts, and Farook’s phone has a 4-digit passcode, then the FBI could indeed have an intern unlock the phone in not much time at all.
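
Some back-of-the-envelope arithmetic on why those two protections matter; the per-attempt timings are assumptions for illustration, not measured iOS figures.

```python
# Why the FBI wants the delays and the auto-wipe gone: rough arithmetic.
# Both timing figures are illustrative assumptions, not measured iOS behavior.
combinations = 10 ** 4  # a 4-digit numeric passcode

# With the backdoored firmware the FBI describes, passcodes could be
# submitted electronically; assume roughly 100 ms per attempt.
no_delay_hours = combinations * 0.1 / 3600
print(f"Without delays: ~{no_delay_hours:.1f} hours to try every code")

# With stock firmware, escalating delays kick in; assume an hour per attempt,
# and remember the phone may wipe itself after 10 failures anyway.
with_delay_years = combinations * 3600 / (86400 * 365)
print(f"With delays: ~{with_delay_years:.1f} years, if the phone isn't wiped after 10 tries")
```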

In the security community, there have been rumors for years that Apple had exactly such a backdoor and would regularly use it to assist in investigations. As far as we now know, that is not the case, and Apple has never assisted law enforcement to create a backdoor on an iOS device that bypasses the passcode delay and auto-wipe features. Apple has assisted in extracting data that was not encrypted, bypassing the need for a passcode altogether, but starting with iOS 8 there is very little such completely unencrypted data on iPhones.

Under the rubric of “reasonable technical assistance,” the United States magistrate judge in the San Bernardino case is asking Apple to do two things: (1) Write a special version (locked to Farook’s iPhone) of the iOS software with a backdoor so it does not enforce passcode delays or auto-wipe. (2) Cryptographically sign the special backdoored iOS version so Farook’s iPhone 5c will trust it as a legitimate software update.

The magistrate judge has allowed Apple to restrict the backdoor so it only works on Farook’s iPhone, perhaps using its serial number, and to install the backdoor at an Apple facility. However, the order strongly suggests that the backdoor be provided to the FBI.

What is this particular backdoor useful for?

It is generally accepted that the FBI will not recover anything interesting from Farook’s iPhone. Even the San Bernardino police chief says so. That means the case is either a proxy for some other sealed proceeding or a piece of the larger Crypto Wars. Let’s pause for a moment to think about what the FBI might be able to gain from this particular backdoor. The US government certainly has the ability to create its own backdoor in iOS, and has access to exploits that allow it to bypass the need for the iOS software update to be signed by Apple, at least for some iPhone models and iOS versions.

If Apple is compelled to create a backdoor and to sign it, the signature portion of that equation is not very useful. Assuming there are no vulnerabilities in the signature creation or verification algorithms, that signature cannot be used for anything else.
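
The reason the signature is not reusable is that it verifies only over the exact bytes that were signed. Here is a minimal sketch, with PyNaCl’s Ed25519 signing API standing in for Apple’s actual code-signing scheme (the firmware string and serial number are made up):

```python
# Why a signature over one firmware image is useless for any other image.
# PyNaCl's Ed25519 API stands in for Apple's actual code-signing scheme.
from nacl.signing import SigningKey
from nacl.exceptions import BadSignatureError

apple_key = SigningKey.generate()   # stand-in for Apple's private signing key
verify_key = apple_key.verify_key   # the public half, baked into the boot chain

firmware = b"backdoored iOS build, locked to device serial AAAA0000"  # made-up payload
signed = apple_key.sign(firmware)

verify_key.verify(signed)  # the exact signed build is accepted

try:
    # Retargeting the build to a different serial changes the bytes,
    # and the old signature no longer verifies.
    tampered = signed.message.replace(b"AAAA0000", b"BBBB1111")
    verify_key.verify(tampered, signed.signature)
except BadSignatureError:
    print("Tampered image rejected")
```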

But the backdoor itself is more interesting. Any backdoor written by the FBI or the NSA would be a hack, conceived without access to the iOS source code and based on an incomplete understanding of how iOS security works. Apple would provide a good backdoor, written by someone who understands iOS security, one that won’t screw anything else up. Even if the signed version is limited to a particular iPhone, the FBI and NSA—and anyone else who gets hold of it—can learn something useful about how to compromise iOS. And thousands of copies of the signed version could soon be out there, with law enforcement agencies and defense experts.

It is telling that Apple is not simply being asked to break the encryption, but to create a specific tool that circumvents the encryption in a specific way. That is probably no coincidence. If the FBI gets this tool, they will either recklessly access Farook’s data directly on the device, or have to compel Apple to create a second tool that can extract the data on the phone in a forensically sound way.

Maybe Apple could hire a third party team of hackers to comply with the order, and not let any of its own engineers work on it.

Apple’s response

Apple responded on Thursday with a motion to oppose the original order and the FBI’s motion to compel. (I won’t get into why the United States is suing a black Lexus.) The magistrate judge will not be convinced by arguments about slippery slopes or China, but will look at whether the order is legal and whether it is excessively burdensome for Apple. The motion is worth reading in full.

Apple makes it clear that allowing the order would set a precedent that could be exploited almost immediately:

If this order is permitted to stand, it will only be a matter of days before some other prosecutor, in some other important case, before some other judge, seeks a similar order using this case as precedent. Once the floodgates open, they cannot be closed, and the device security that Apple has worked so tirelessly to achieve will be unwound without so much as a congressional vote.

There are also few limits to what else Apple could be asked to do:

For example, if Apple can be forced to write code in this case to bypass security features and create new accessibility, what is to stop the government from demanding that Apple write code to turn on the microphone in aid of government surveillance, activate the video camera, surreptitiously record conversations, or turn on location services to track the phone’s user? Nothing.

The upshot of this line of argument is that when relief is that unusual, it should not come as a result of a law as broad and vague as the All Writs Act, but from an explicit act of Congress. That is what happened with CALEA. Apple’s motion goes on at length about this argument, essentially saying that the existence of the more specific CALEA law trumps the All Writs Act in this area; if that were not the case, no law would ever be needed to authorize new types of court orders.

Apple also articulates the burden imposed by the order. The company estimates that six to ten Apple engineers would have to spend two to four weeks on the project. Some of them would come from Apple’s core OS group. (There have been many complaints about the perceived deteriorating quality of Apple software in recent years; that won’t be helped by pulling engineers away!)

No operating system currently exists that can accomplish what the government wants, and any effort to create one will require that Apple write new code, not just disable existing code functionality. Rather, Apple will need to design and implement untested functionality in order to allow the capability to enter passcodes into the device electronically in the manner that the government describes.

And what happens after this case? Would Apple be allowed to destroy the resulting code, in which case it would have to start over for the next order, or would it be forced to keep it around, with the resulting security risks? To Zdziarski’s point, the declaration attached to the motion reminds us:

Moreover, even if Apple were able to truly destroy the actual operating system and the underlying code (which I believe to be an unrealistic proposition), it would presumably need to maintain the records and logs of the processes it used to create, validate, and deploy GovtOS in case Apple’s methods ever need to be defended, for example in court. The government, or anyone else, could use such records and logs as a roadmap to recreate Apple’s methodology, even if the operating system and underlying code no longer exist.

In other words, even if Apple is allowed to destroy the code, a blueprint for the backdoor still has to remain, and the existence of that blueprint is a security risk.

As Marty Lederman first pointed out, this is intriguing:

Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks.

In other words, the NSA might have a backdoor, and it is not the case that there is “no conceivable way” for the government to extract data, which was one of the arguments in the 1977 case that first used the All Writs Act to compel phone company assistance with a pen register.

Free speech, a digression

The first of the two things the magistrate judge ordered Apple to do involves writing software, and recall that some of the pro-crypto side’s victories in Crypto Wars I were won by convincing judges that code is speech. By compelling Apple to comply with the order, the government is mandating speech. That is one of Apple’s arguments in resisting the order. The specific speech being required is not something trivial like “I love kittens,” but speech that is deeply antithetical to Apple’s policy and the wishes of its customers. This is very different from merely providing technical or forensic assistance, which Apple renders all the time, including in the San Bernardino case.

The US government cannot ban speech. Can it compel particular speech? The US government frequently obtains orders against companies that include gag orders. Companies run by security and privacy advocates often set up “warrant canaries.” For example, here is one by backup provider rsync.net declaring that it has never received such a warrant. If rsync.net does receive one in the future, it would no longer be able to publish that notice.

The theory behind warrant canaries relies on the assumption that the government cannot (secretly) compel speech. If it can, warrant canaries will not work.

(By the way: This newsletter has never been the subject of legal process I was not free to disclose.)

Apple has made a free speech argument in its motion to oppose the order:

Under well-settled law, computer code is treated as speech within the meaning of the First Amendment. … The Supreme Court has made clear that where, as here, the government seeks to compel speech, such action triggers First Amendment protections. … Compelled speech is a content-based restriction subject to exacting scrutiny, [Riley v. Nat’l Fed. of the Blind of N.C., Inc., 487 U.S. 781, 796 (1988)] at 795, 797–98, and so may only be upheld if it is narrowly tailored to obtain a compelling state interest, see Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 662 (1994).

There are mixed views on whether Apple’s free speech argument has any merit. Marty Lederman thinks it is bonkers, while the Washington Post found a lawyer who likes it a little more.

Alternatives

Is compelling Apple to create a backdoor the only option available to the FBI? Probably not.

Recall that Apple is being ordered to do two things: create a backdoor and cryptographically sign it. Although the US government probably doesn’t have access to the full source code for iOS, it probably isn’t too hard to create a backdoor because jailbreakers—people who want the ability to install non-Apple approved apps on their phones and make iOS do things that Apple does not support and won’t let regular apps do—have done all sorts of weird modifications to iOS. Jailbreakers have also had success loading custom versions of iOS into many iPhone models. The National Security Agency, with its $10 billion budget, is much better at this than a bunch of random hackers, and the NSA almost certainly has several exploits that can be used against Farook’s iPhone.

Why doesn’t the FBI just use one of those? For one thing, although the FBI is active in this area and regularly buys exploits against iPhones and other devices, it is probably the NSA that has the good stuff. And maybe the NSA just doesn’t feel like helping the folks at the bureau, perhaps because they find FBI agents annoying. Or the NSA may not want to waste a valuable exploit on a phone that is unlikely to contain anything useful.

Another thing the FBI could do is use one of its existing tools to bypass iPhone passcodes. According to USA Today, the Justice Department told one federal judge in Brooklyn that the “lack of a passcode is not fatal to the government’s ability to obtain the records.” One of the tools available to the FBI is the IP-Box, which inserts itself on the power cable leading to an iPhone’s battery. By cutting the power shortly after a failed passcode attempt, IP-Box apparently prevents the phone from recording information about the failed attempt, and it won’t count toward the 10 failed attempts allowed before automatically wiping the phone. IP-Box does not work on iOS 9, which Farook’s iPhone may be running, but it wouldn’t be crazy to think that the NSA or the FBI has a device that would work.
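
The flaw a tool like IP-Box reportedly exploits is a classic ordering bug: the phone checks the guess before it durably records the failed attempt, so cutting power in that window erases the evidence of the failure. Here is a hypothetical sketch of the broken and fixed orderings (not actual iOS code):

```python
# Hypothetical illustration of the ordering bug a tool like IP-Box reportedly
# exploits; this is not actual iOS code.

failed_attempts = 0  # stands in for a counter kept in non-volatile storage

def try_passcode_broken(guess, correct):
    """Check first, record the failure afterwards.
    Cutting power right after a wrong guess means the failure is never
    written down, so the attacker gets unlimited attempts."""
    global failed_attempts
    if guess == correct:
        return True
    # <-- attacker's box cuts power here, before the counter is written
    failed_attempts += 1
    return False

def try_passcode_fixed(guess, correct):
    """Record the attempt *before* checking, and only clear the record
    once the guess is known to be correct."""
    global failed_attempts
    failed_attempts += 1          # persist the attempt first
    if guess == correct:
        failed_attempts = 0
        return True
    return False
```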

A digression on jailbreaking

When a hacker manages to jailbreak a new iPhone model or a new version of iOS, he or she is exploiting a security flaw in iOS. That lets people jailbreak their phones and install whatever apps they want, circumventing the restrictions in Apple’s App Store. There is also a community of coders who make modifications to iOS itself; one such modification adds more information to the iOS lock screen.

Whenever a jailbreak is released, Apple usually releases an updated version of iOS a few months later that patches the security hole used by that jailbreak. Some of these holes have been quite egregious. One infamous jailbreak required only that the user visit a particular website in Safari. When Apple closes these holes, the jailbreak community invariably gets annoyed. But these are the same holes that are exploited by the NSA and other attackers.

Security is hard. There are leaks. Apple’s own experiences with jailbreakers and with IP-Box show this. Deliberately helping attackers doesn’t help matters.

Secure Enclave

iPhone 5s and later models have a Secure Enclave. This is a separate processor within the iPhone’s chip that runs a different operating system, based on the L4 microkernel. The Secure Enclave stores sensitive information such as Touch ID fingerprint data and Apple Pay credit card numbers in such a way that the data cannot be extracted from the enclave. That is why Touch ID cannot leak your fingerprints: data from the fingerprint scanner is sent to the Secure Enclave, which then compares it to recorded fingerprints without revealing them.

Having a separate processor and operating system to do this is beneficial because it can be made smaller and simpler, and therefore easier to audit. At the same time, the Secure Enclave’s relative isolation from the rest of the system reduces its so-called “attack surface,” the set of possible attacks that can be staged against it.
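
One way to picture the design is that the rest of the system never touches the secrets at all; it only gets yes-or-no answers through a narrow interface. A hypothetical toy sketch of that pattern follows (not Apple’s actual Secure Enclave API, and real biometric matching is fuzzy rather than an exact comparison):

```python
# Hypothetical sketch of an enclave-style interface: secrets stay inside, and
# the rest of the OS only ever learns a yes/no answer. Not Apple's actual API,
# and real fingerprint matching is fuzzy, not an exact hash comparison.
import hashlib
import hmac

class ToyEnclave:
    def __init__(self):
        self._templates = {}  # sealed inside; never returned to callers

    def enroll(self, slot: str, reading: bytes) -> None:
        self._templates[slot] = hashlib.sha256(reading).digest()

    def matches(self, slot: str, reading: bytes) -> bool:
        stored = self._templates.get(slot)
        if stored is None:
            return False
        # Constant-time comparison; the caller learns one bit, nothing more.
        return hmac.compare_digest(stored, hashlib.sha256(reading).digest())

enclave = ToyEnclave()
enclave.enroll("right-thumb", b"reading captured at enrollment")
print(enclave.matches("right-thumb", b"reading captured at unlock"))  # False: not the same bytes
```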

In iPhones that have a Secure Enclave, it is also responsible for enforcing passcode restrictions, such as the pause between passcode attempts and auto-wiping after 10 failed passcode attempts. It has been suggested that even if Apple could create a backdoor for Farook’s iPhone 5c, which does not have a Secure Enclave, later phones would not be vulnerable in the same way.

Although Apple’s disclosure is a little vague in this area, it now appears that the software running on the Secure Enclave can, in fact, be updated with an update file that is properly signed by Apple. This means that what Apple is being ordered to do in the San Bernardino case is also possible for phones with a Secure Enclave. What is still a little unclear is whether a correct passcode needs to be entered prior to such an update. In some ways, Apple brought all this on itself with inadequate security design.

The slippery slope is 215 degrees

Section 215 of the Patriot Act had four elements. It allowed (1) the FBI to request the production of (2) records (“tangible things”) to (3) aid in an (4) investigation. The bulk metadata collection program did not meet any of those criteria. Section 215 was used by the NSA, not the FBI. It did not require the production of any tangible things then in existence; instead, the phone companies were ordered to send all new call records every day. It was not to support an investigation, because there was no investigation; the data was stored indefinitely for possible future investigations or other use.

The All Writs Act, which is being invoked in the San Bernardino case, is much broader than Section 215. Here is the relevant portion:

The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.

Absent clear case law, this is the kind of statute that could be read very broadly or very narrowly to suit the whims of the judge applying it. An overzealous judge could use it to order anyone to do almost anything, while a more conservative judge might be more careful about making sure his or her orders are necessary, appropriate and agreeable to the usages and principles of law.

The phrase “in aid of their respective jurisdictions” implies that there needs to be a case to have jurisdiction over, but the lack of a case being investigated didn’t prevent Section 215 from being abused. The lesson of Section 215 is that the slippery slope in this area is very steep indeed.

Given how broadly the All Writs Act is written, there is a sense that, absent clearer language, it cannot be that broad in practice, and probably does not create an exception to the First Amendment. So it is quite possible that the Justice Department will lose this particular battle, or some version of it: that the Act cannot be used to compel particular speech, and that there are limits to the type and extent of “technical assistance” technology companies can be required to provide.

That is not the end of Crypto Wars II, however. The fight would then turn to Congress, which could enact a law (which Barack Obama would sign) clarifying what Apple’s technical assistance obligations are. In addition to requiring technical assistance with custom one-off backdoors like the one for Farook’s phone, a new law could require Apple to build backdoors into released versions of iOS. Assuming this is constitutional, an explicit legislative mandate would probably be better than using the All Writs Act. There is precedent for this: CALEA is a law from 1994 that required phone companies to build wiretapping functionality into their networks.

Not All Writs, Brooklyn edition

This isn’t the only All Writs Act fight Apple is involved in. Since October, Apple has been resisting a technical assistance order in a case in Brooklyn. The Brooklyn phone runs iOS 7, and as far as we can tell, Apple is being asked to do the kind of extraction of unencrypted data that it has done in the past. But Apple is objecting to how wide-ranging the order is. Alison Frankel at Reuters has a great post on how magistrate judge James Orenstein in Brooklyn approached the government’s request in that case and ordered extensive briefing and argument on whether the proposed order was proper. It is thanks to Judge Orenstein that we know that Apple has received similar orders in 9 other cases, involving 12 iOS devices. That is in addition to the device in Judge Orenstein’s case and the San Bernardino device.

Amy Davidson in the New Yorker discusses some of the dangers of overly broad use of the All Writs Act. I sometimes like to say that slippery slope arguments are a slippery slope, that they prevent you from doing anything at all. In this case there are literally US attorneys and district attorneys waiting to push us down the slope, and the government has not articulated any limiting principle to what types of assistance can be compelled using the All Writs Act.

If a case involving a non-digital phone network could be applied to smartphones, what technologies might an Apple precedent be applied to, three or four decades from now? (The N.S.A. used, or rather promiscuously misused, another pen-register case from the same era to justify its bulk data collection.) It no longer becomes fanciful to wonder about what the F.B.I. might, for example, ask coders adept in whatever genetic-editing language emerges from the recent developments in CRISPR technology to do. But some of the alarming potential applications are low-tech, too. What if the government was trying to get information not out of a phone but out of a community? Could it require someone with distinct cultural or linguistic knowledge not only to give it information but to use that expertise to devise ways for it to infiltrate that community? Could an imam, for example, be asked not only to tell what he knows but to manufacture an informant?

These are examples of cooperation that go beyond providing information or access to facilities, but they are not all that different from what Apple is being ordered to do in California.

China and India

A lot of American technology companies operate in foreign countries. Sometimes they get in trouble. Yahoo! China got a Chinese dissident imprisoned by releasing his emails to the Chinese government. Some American companies, such as Facebook and Google, deal with the ethical issues involved by simply not doing certain kinds of business in China.

China has not aggressively mandated backdoors or weakened cryptography for products sold in China. An iPhone bought in China is just as secure as one bought anywhere else. India, on the other hand, has required backdoors in BlackBerry devices sold there. Pro tip: if you are an Indian criminal, don’t use a BlackBerry. That advice also applies if you are not an Indian criminal.

BlackBerry is more or less dead, but Apple sells a lot of iPhones in China, and the Chinese market is vital to the company’s future growth prospects. iCloud data for Chinese users is already stored within China and is presumably available to the Chinese authorities. Apple can probably live with that because sending sensitive information to iCloud is optional for users. But if the Chinese government starts requiring a backdoor in iPhone hardware or software, that might be a bridge too far for Apple.

As John Gruber puts it:

I’ve long wondered why China allows companies like Apple to sell devices without back doors for their government. A big part of why they tolerate it seems to be the fact that no government gets this.

If the US government starts demanding backdoors, then the Chinese government will too. Then India, then Great Britain, then every oppressive regime and incompetent intelligence agency will want one.

It’s hard to know what the US government thinks about all this. The US law enforcement community has a history of trying to use the global position of American technology companies to its own ends, while ignoring the repercussions for the companies’ competitiveness and how other governments might respond. For example, the Department of Justice has asserted the right to demand emails stored on Microsoft servers in Ireland directly, without going through the Irish authorities.

Great Britain

Maria Farrell points out that if Apple had been a British company, this entire debate would be moot:

Because the Investigatory Powers Bill both requires an overly broad base for ‘reasonable assistance’ and accompanies it with a gagging order. (Long experience has taught me there is always a good reason for what first appears to be sloppy drafting in a Home Office bill.) So, unlike in the US, there will be no chance here of a Vodafone executive publicly refusing to actively assist government hackers once this bill has passed. The old gag about the Snowden revelations plays out once again; it was oddly funny that wide-scale surveillance was able to happen in the US illegally, and in the UK almost wholly legally.

So that’s something the USA has going for it. 🇺🇸🇺🇸🇺🇸

Why do I need security?

If you were Osama bin Laden and the US government had your iPhone, there would probably be nothing you could do, and all of this would be moot. The Justice Department wouldn’t play games with the All Writs Act; it would just have the NSA hack into your phone.

If you are in the United States, keep in mind that literally everything is a federal crime. And there is probably evidence of your crimes on your phone.

If you have commercial secrets on your phone, and you travel to China, you probably don’t want the Chinese government, or anyone else, to easily read your secrets. (If you are a major terrorist or splittist, they probably will anyway.) And while your factory processes or HVAC system or whatever else are probably not controllable from the iPhone that you carry around, you might have some indirect access to that network through your phone’s VPN access. Your network’s security is probably set up so compromising your iPhone shouldn’t be too dangerous, but you would lose one level of a defense in depth strategy.

If you have nude selfies, dickpics or other embarrassing material on your phone, you probably don’t want people to easily access them, whether it’s onanists or the US government trying to discredit you by revealing your porn habits.

If the anti-crypto side wins Crypto Wars II, the best response might be to preemptively publish your porn history and dickpics, and to just panic about the security of your systems.

Secure backdoor is secure

A couple of the Crypto Wars II takes in the last couple of weeks have tried to expose a supposed hypocrisy on Apple’s part by arguing that secure backdoors are in fact possible. The argument goes that the California magistrate judge’s order truly only applies to this one phone. The next time such an order is issued, it would only apply to the phone in that case, and so on. There is no mechanism for the government to illegitimately gain access to data on an iPhone, and no room for mass surveillance.

All of these arguments assume that the backdoor would, indeed, be secure. That would be the case if relatively few engineers at Apple have access to it, and there is no chance of it leaking. Apple provided some data for 3,093 device requests by US law enforcement agencies in the first half of 2015. Manhattan district attorney Cy Vance is ready with 175 phones he wants unlocked.

Today, the signing keys for iOS releases do not have to be used very often, and they are likely kept very secure and difficult to casually use or abuse. If thousands of devices have to be “individually” backdoored every year, those keys would also have to be used thousands of times a year, and it would be much more difficult to keep them safe. A lot of the current hacking tools for iPhone already originate from Apple’s Chinese supply chain, and you simply can’t keep secure a signing process that has to be used thousands of times a year when the stakes are this high.

Apple has not been ordered to hand over a general backdoor yet. As Jonathan Zdziarski points out, that may not continue to be the case if Farook’s phone is backdoored, and important evidence is found that points to a hitherto unknown co-conspirator. If that evidence is introduced at trial, there is a real possibility that Apple will be required to disclose the details of exactly how it obtained the evidence being presented. Now, because there is nothing interesting on this iPhone, its contents probably won’t be released at trial, and there is no defense team that will demand that its experts get full access to the backdoor. The worst case is that Apple has to provide a tool that allows the FBI to extract the full contents of the phone, including the backdoored iOS code, and that could then leak out of the FBI. However, the very next case might involve a phone with useful evidence on it, and then you have a major risk of leaks.

It is appropriate that Apple sees itself as part of the threat model for its users’ devices. Julian Sanchez explains why this kind of technical assistance is so dangerous. The whole piece is interesting; here is an important point:

This would create an internal conflict of interest: The same company must work to both secure its products and to undermine that security—and the better it does at the first job, the larger the headaches it creates for itself in doing the second. It would also, as Apple’s Cook has argued, make it far more difficult to prevent those cracking tools from escaping into the wild or being replicated.

Why this case?

The FBI will probably not recover anything useful from the iPhone in question. The bureau already has a lot of material about the attackers, much of it from iCloud, through Apple’s cooperation with search warrants and subpoenas in the case. Farook destroyed several devices before the attack, so thoroughly that it has not been possible to recover any data, and it is likely that any sensitive information was on the destroyed devices rather than on the one left intact. It was a work phone. And as I mentioned, the FBI probably turned down an offer by Apple to handle all this secretly.

We now also know that the FBI stupidly reset Syed Farook’s iCloud password in the hours after the attack. The iPhone had not backed up to iCloud for 6 weeks, but there was a chance that it would do so given the opportunity. That opportunity was lost when the iCloud password was changed. (We now know that the FBI ordered this, not the San Bernardino health department.)

The FBI also stupidly turned off the phone, which cut off many potential avenues of attack. Given how incompetent the bureau has been in this case, and the fact that the phone is unlikely to contain any information of interest, you might think this is a weird case to use to kick off Crypto Wars II.

It is a difficult choice for the bureau to make. It would have to be a terrorism investigation. A kidnapping investigation is too time-sensitive. If this had been Osama bin Laden’s iPhone with potential information on al-Qaeda operations, they would not have bothered with some law from 1789 and would simply have hacked the phone. So only a case where the phone is relatively unimportant would be a candidate for trying to crush Apple. That probably rules out phones recovered from investigations of foreign terrorism cases, which is why they settled on a case that, while tragic, involved someone going postal on his co-workers—as American as apple pie.

The FBI should probably have waited for a case where it didn’t screw up the evidence so badly, but you go to Crypto Wars with the case you have, not the case you want.

What should Apple do?

It is hard to make suggestions for what Apple should do legally and politically because, as I discussed above, it’s hard to know what this particular phase of Crypto Wars II is really about. Is the FBI trying to get some piece of legislation passed, or is it pressuring Apple to do something in a case that is under seal? Does Apple fear the legislation, or does it fear how the Chinese and Indian governments will react?

On the technical side, there are ways Apple could mitigate the risks of the measures sought in this case. Why is it even possible to upgrade the iOS software or Secure Enclave firmware without entering the passcode? It shouldn’t be. Perhaps the answer is that a phone shouldn’t be bricked forever just because its owner forgets the passcode; in that case, installing an update without the passcode should also wipe all the data. And Katie Benner reports in the New York Times that Apple is working on measures to make iPhones less vulnerable to the type of attack the FBI is staging.

Apple should get rid of “security” questions.

Apple should stop selling older phones without a Secure Enclave.

Apple should also think hard about iCloud security. Apple has the encryption keys to your iCloud backups and can disclose your backed up data—including messages and photos—to anyone. Apple handed over Farook’s iCloud backups and that is also how Jennifer Lawrence’s nude selfies were leaked. There should probably be an option to have only you hold the keys to your backups, which is already the case for your passwords in iCloud Keychain. People who use that option would be at some risk of losing all their data if they forget all their passwords and keys. Apple does not already do this because it does not want to be in the awkward position of telling parents that all the photos of their kids are lost because they were not careful enough with passwords.
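
The technical core of such an option is straightforward: derive the backup key from something only the user knows, so Apple never holds anything it could disclose. Here is a minimal sketch using PBKDF2 and PyNaCl’s SecretBox, as an illustration of the idea rather than Apple’s actual iCloud design:

```python
# Sketch of user-held backup keys: the key is derived from the user's
# passphrase, so the service provider never has anything it could disclose.
# Illustrative only; not Apple's actual iCloud design.
import os
import hashlib
from nacl.secret import SecretBox

passphrase = b"correct horse battery staple"
salt = os.urandom(16)  # stored alongside the backup; useless without the passphrase

key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000, dklen=SecretBox.KEY_SIZE)
encrypted_backup = SecretBox(key).encrypt(b"messages, photos, naked selfies")

# The provider stores only `salt` and `encrypted_backup`. Recovery requires the
# passphrase; forget it and the backup is gone for good, which is exactly the
# trade-off Apple is reluctant to impose on ordinary users.
restored = SecretBox(key).decrypt(encrypted_backup)
assert restored == b"messages, photos, naked selfies"
```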

Some practical advice

Don’t use iCloud backups. Yes, I know, it’s a huge hassle to remember to back up to iTunes.

Encrypt everything. Use HTTPS on your web servers; Let’s Encrypt provides free certificates now. On Mac, use FileVault 2, and on Linux, use eCryptfs. Use a long passcode on your phone.

Use Signal to communicate securely.

Disable Touch ID if a government, or someone who might kidnap you for your finger, is part of your threat model.

If you are a corporate IT department, just enable MDM. It lets you avoid a situation where you can’t get into an employee’s work-issued phone. This is especially important if you are legally required to maintain such access (no, iPhones are not illegal in the financial industry).

More links

How Tim Cook, in iPhone Battle, became a bulwark for digital privacy. Kieran Healy explains why Apple is fighting the FBI on this, and not some other tech company, and why the rest of Silicon Valley has been slow in coming to Apple’s support. General Hayden has a take too. Do we have a right to security?

iOS Security Guide. Read this religiously. A cartoon. If I am the victim, please pay my ransom. Definitely go read everything Jonathan Zdziarski writes on this.

A poem (apologies to various parties)

First they came for an iPhone in Brooklyn, and I did not speak out—because it was in Brooklyn. Brooklyn is terrible.

Then they came for an iPhone in California, and I did not speak out—because it was not my phone and it was used by a murderer.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for my iMessages, and I did not speak out—because I was confused and thought this was about phones.

iPhone 5c is a lame plastic phone anyway. It deserves to be backdoored.

Another poem (by JC)

Here is a cryptography scheme

that is appropriately lean and mean:

just add all the symbols mod 2

cuz if I can’t read it, neither can you

Final thoughts

I love you Admiral Rogers!

Did anyone watch last week’s episode of Scandal? The director of the NSA is always a general or admiral. It would be very strange for a mere captain to become director. Also I want a Gettysburger, with freedom fries and union rings.

1620 #farookpasscodeguesses