Guan’s blog

England

24 Jun 2016

JOHN OF GAUNT

Methinks I am a prophet new inspired
And thus expiring do foretell of him:
His rash fierce blaze of riot cannot last,
For violent fires soon burn out themselves;
Small showers last long, but sudden storms are short;
He tires betimes that spurs too fast betimes;
With eager feeding food doth choke the feeder:
Light vanity, insatiate cormorant,
Consuming means, soon preys upon itself.
This royal throne of kings, this scepter’d isle,
This earth of majesty, this seat of Mars,
This other Eden, demi-paradise,
This fortress built by Nature for herself
Against infection and the hand of war,
This happy breed of men, this little world,
This precious stone set in the silver sea,
Which serves it in the office of a wall,
Or as a moat defensive to a house,
Against the envy of less happier lands,
This blessed plot, this earth, this realm, this England,
This nurse, this teeming womb of royal kings,
Fear’d by their breed and famous by their birth,
Renowned for their deeds as far from home,
For Christian service and true chivalry,
As is the sepulchre in stubborn Jewry,
Of the world’s ransom, blessed Mary’s Son,
This land of such dear souls, this dear dear land,
Dear for her reputation through the world,
Is now leased out, I die pronouncing it,
Like to a tenement or pelting farm:
England, bound in with the triumphant sea
Whose rocky shore beats back the envious siege
Of watery Neptune, is now bound in with shame,
With inky blots and rotten parchment bonds:
That England, that was wont to conquer others,
Hath made a shameful conquest of itself.
Ah, would the scandal vanish with my life,
How happy then were my ensuing death!

Crypto Wars II

26 Feb 2016

This is a modified version of an email newsletter I sent on February 22, 2016. I have attempted to update it for recent developments, but it may not be fully up to date.

Marketing all the way down

The Justice Department has called Apple’s refusal to install a backdoor in the work phone of San Bernardino terrorist Syed Farook a “marketing strategy,” implying that there is something illegitimate about catering to customers’ real demand for security. For most of us, privacy means keeping our nudes and dickpics and occasional subversive thoughts private. What is that but a form of marketing?

At the same time, the government’s stance is just as much marketing as Apple’s. In all likelihood, as Marcy Wheeler writes in Slate, there is nothing interesting on the iPhone 5c in question, because Farook destroyed two of his personal devices and because the FBI has so much other information about his activities, including a lot of information already turned over by Apple from iCloud servers.

Even if there is something interesting on the phone, the US government almost certainly has the means to hack into the phone. And there are indications that Apple offered to backdoor the phone under seal, presumably so the case could not be used to set precedent, and presumably the United States turned that down for the same reason.

What is the slippery slope Apple is afraid of? Is it being forced in the future to install a backdoor into all devices? Is the Farook case a proxy for a fight in some sealed ex parte proceeding at the FISA court? Perhaps Apple has already been secretly ordered to install a backdoor in all or some of its devices, or to hand over a backdoor to the government? Or is the fight about not giving the Chinese or Indian governments the ammunition to demand their own backdoors?

Is the FBI truly worried about the devices of terrorist suspects or kidnappers? It seems unlikely, because the bureau can gather so much information through other means and bring massive investigative resources to bear in big terrorism cases. Or is the bureau more worried about its ability to investigate drug cases or traffic accidents or kidnappings? Is this proceeding part of some long game to delegitimize and ban encryption? Is there any rational reason?

Are the intelligence and law enforcement communities against encryption because they believe it completely prevents them from catching bad guys? Do they hate encryption because it merely slows down their work? Is the goal to punish Apple and Tim Cook for being annoying and contradicting them in public? Are undercover Chinese fifth columnists within the US government engaged in a long con that will result in iPhones sold in China having backdoors?

Reading the debates in the security community, people will have different answers to all of these questions. The answer could, of course, be “all of the above.” This first battle in Crypto Wars II is hard to understand because it is hard to know what the two sides’ true motivations are, what they want and what they fear.

Crypto Wars I

This is the beginning of Crypto Wars II, so let’s remind ourselves about Crypto Wars I. Back in the 1990s, the United States government tried to restrict cryptography in various ways, by banning it or by mandating weak crypto and backdoors.

The most common encryption algorithm was still DES, published in 1975. DES uses a key that is 56 bits long, which was arguably inadequate even in the 90s. But the US government wouldn’t let American companies export software that supported full DES. So the “export” version of Netscape Navigator, which I had to use in Europe, only supported an effective key length of 40 bits, which meant the NSA could easily decrypt the credit card numbers of Europeans shopping on Amazon.co.uk. Not much else was protected at the time. Encrypted email was not common.
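To put those key lengths in perspective, here is a back-of-the-envelope calculation. The search rate is a made-up assumption for dedicated hardware of the era, purely for illustration; what matters is the ratio between 2^40 and 2^56, not the absolute numbers.

```python
# Rough brute-force times for a 40-bit "export" key versus a 56-bit DES key.
# The keys-per-second rate is an assumed figure for illustration, not a measurement.
KEYS_PER_SECOND = 1e9  # assumed dedicated-hardware search rate

for bits in (40, 56):
    seconds = (2 ** bits / 2) / KEYS_PER_SECOND  # on average, half the keyspace is searched
    print(f"{bits}-bit key: ~{seconds / 3600:,.2f} hours on average")
```

At that assumed rate, the export key falls in minutes while full DES takes more than a year; real hardware eventually made even 56 bits tractable, as the EFF’s DES cracker showed in 1998.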

The encryption used in the GSM standard for mobile telephony was also deliberately weakened so attackers could more easily eavesdrop on phone calls. The NSA obviously had the technical and computational capabilities to do all this eavesdropping. At the time anyone with a few million dollars to spend could do it too, but the cost fell rapidly to perhaps a few hundred thousand dollars in the late 90s and perhaps a few thousand dollars today.

The NSA wanted people to use an encryption chip of their own design called the Clipper chip. It was supposed to provide reasonably good cryptography, but was intentionally backdoored. In the end, the Clipper chip failed both commercially and technically, as the backdoor could be easily circumvented.

While the US government could regulate the manufacture and export of munitions, which it considered cryptography products to be, since 1791 it has not been able to ban free speech. So someone had the bright idea of printing the source code for the popular Pretty Good Privacy (PGP) crypto program in physical books that were legally exported from the United States. Those books would then be scanned and the source code reconstructed using OCR and released outside the United States as “PGPi,” the international version of PGP.

DJ Bernstein, a computer science professor then at the University of Illinois at Chicago, sued the United States on two occasions to assert his right to publish the source code for an encryption algorithm, arguing that code is speech, and won in the 9th Circuit. (Some of you may also love or hate djb as the author of qmail.) And overall, the pro-crypto side won Crypto Wars I. Although there have been a few attempts to fine American technology companies for crypto exports, in practice, cryptography is not illegal. There are few overt attempts to purposefully weaken cryptography; instead, the attempts happen in the shadows through mysterious government committees and bribes.

What is Crypto Wars II about?

There are basically two types of data you might want to encrypt and secure. The first is data in transit, for example iMessage messages, which might be captured by the NSA. Apple has designed the encryption behind iMessage so that only the sender and the recipient have access to the content; Apple itself does not. This is known as “end-to-end” encryption.
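For the non-cryptographers, here is a minimal sketch of the end-to-end idea, using the PyNaCl library (assumed to be installed). It is emphatically not Apple’s actual iMessage protocol, which layers its own key directory and device management on top; the point is just that a relay in the middle only ever sees ciphertext and metadata.

```python
# Minimal end-to-end encryption sketch (not Apple's actual iMessage design).
# Only the endpoints hold private keys; a server relaying `ciphertext` learns
# who is talking to whom and when, but not what was said.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob's public key; PyNaCl generates and prepends a nonce.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at the usual place"
```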

The other kind is data at rest, stored on your laptop, phone and other devices. The content of data in transit is probably easier to secure than data at rest. But the NSA will always be able to intercept the metadata and thus know who you are talking to, how often, and when, and learn a lot about what you are doing. If you use a messaging service such as Facebook Messenger, your messages are encrypted between your computer and Facebook’s servers, and again between Facebook’s servers and the recipient. But Facebook itself has full access to the contents of the messages, and law enforcement agencies can go to Facebook and simply request a copy of your messages.

Data at rest is difficult to secure. There are often bugs in cryptography products and various ways to attack them that don’t rely on breaking the cryptography directly, and all of these flaws are easier to exploit when you have physical access to the encryption device.

The attacker can literally freeze your laptop’s memory chips and extract their contents later, a much cheaper attack than breaking the encryption. Or, more trivially, you might have forgotten to lock your phone or laptop, or the FBI or the mob or whoever might get to you before you have an opportunity to do so. Or, if you are still alive, they can enhanced-interrogate you until you reveal the password, hand over the contents of your devices, or place your finger on the Touch ID sensor. The NSA was able to “listen” to secure faxes at the EU mission in Washington by manipulating the power supply of the fax machine to leak the contents of the faxes.

If you follow the discussion in the security community about Crypto Wars II, there are differing opinions on what the government’s true objectives are.

In the first scenario, the NSA basically has enough data about messages in transit, either through metadata or through other vulnerabilities, that they don’t really care about that. The true challenge is data at rest inside devices physically captured from targets.

In the second scenario, once law enforcement officers have seized physical evidence and executed search warrants, they know so much about the target that a few encrypted devices don’t matter so much. By the time the FBI has seized your iPhone, the agents have so much else that you are for all practical purposes p0wn3d, and any cooperation from Apple is often beside the point. The attacker is really going after messages in transit, before there is an opportunity to seize physical devices, where end-to-end protected services like iMessage are a big problem.

It is also possible that the US government doesn’t really have good reasons for wanting what it wants. Much of the Crypto Wars II agenda is driven by the intelligence community, and the intelligence community has always overestimated the importance of spying, and is ideologically opposed to anything that appears to hinder spying. See, for example, the lengths to which the community will go to spy on Angela Merkel, of all people.

Threat models

Security folks love to talk about threat models. For our purposes, think of a threat model as a theory of who might want to attack us and how. If you are a Chinese dissident, you might be worried about the capabilities of state actors with large budgets. If you are a normal law-abiding resident of a Western democracy, you might be more worried about identity thieves and internet trolls, and possibly being caught in a mass surveillance dragnet, but you are less worried about being singled out and specifically targeted by your government.

For someone in the latter category who uses a secure messaging service such as iMessage, the physical security of the phone may well be the bigger concern. Someone who is out to get me—with the resources of the kind of adversary who is realistically out to get me—may well find it much easier to find me on the street and steal my phone than to eavesdrop on my iMessages and try to break the encryption.

What has Apple done in the past?

Apple has a 15-page document about what it will and will not do for US law enforcement agencies. Generally speaking, anything stored on Apple’s servers is fair game for disclosure. Even if you use the end-to-end encrypted iMessage system for texting with your friends, if you have iCloud Backup enabled, those messages, as well as your photos and a lot of other data, are backed up to Apple’s servers, and can be released to law enforcement (and onanists).

If you don’t use iCloud Backup, your messages are secure, but Apple will still tell the FBI everything about when you sent or received those messages and who you communicated with. Your iCloud Keychain is encrypted in a different way and cannot be released this way without your passcode (or authorization from an unlocked device). All this has been done in the San Bernardino case.

When a law enforcement agency presents Apple with a locked iOS device, Apple will also extract whatever data it can. In older versions of iOS, some data was stored on an iPhone unencrypted, including photos and messages, and Apple would be happy to extract that too. But a reasonably sophisticated law enforcement organization such as the FBI can also desolder the flash memory chips in the phone and do the same thing themselves.

In successive versions of iOS, more and more data has been encrypted and protected by the passcode. So today, very little data can be directly extracted from an iOS device without having to worry about encryption and passcodes. If you want to get naked selfies out of an iPhone running a recent version of iOS, you need to enter the passcode or use Touch ID.

Older versions of iOS only supported a 4-digit numeric passcode, so an intern could probably go through all the 10,000 possible combinations in a day. However, iPhones are set up to introduce a delay in between passcode attempts, and can be set up to wipe the phone after 10 failed passcode attempts. That is why the FBI wants a backdoor for Farook’s phone. If Apple could develop a special version of iOS that does not have a delay between passcode attempts or wipe the phone after 10 failed attempts, and Farook’s phone has a 4-digit passcode, then the FBI could indeed have an intern unlock the phone in not much time at all.
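Some rough arithmetic makes the FBI’s request concrete. Apple has said the on-device passcode key derivation is tuned to take on the order of 80 milliseconds per attempt; treat that as an approximate figure. With the delays and auto-wipe removed, that 80 ms is the only remaining bottleneck:

```python
# Rough worst-case time to brute-force an iPhone passcode if the escalating
# delays and the 10-attempt auto-wipe are disabled (what the FBI asked for).
# The ~80 ms per attempt is an approximate hardware key-derivation cost.
SECONDS_PER_ATTEMPT = 0.08

def worst_case(combinations):
    hours = combinations * SECONDS_PER_ATTEMPT / 3600
    return f"{combinations:>18,} codes: up to {hours:,.1f} hours"

print(worst_case(10 ** 4))   # 4-digit numeric PIN
print(worst_case(10 ** 6))   # 6-digit numeric PIN
print(worst_case(36 ** 8))   # 8-character lowercase alphanumeric passcode
```

A 4-digit PIN falls in minutes, a 6-digit PIN in about a day, and a decent alphanumeric passcode is still out of reach, which is also an argument for the advice further down about using a long passcode.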

In the security community, there have been rumors for years that Apple had exactly such a backdoor and would regularly use it to assist in investigations. As far as we now know, that is not the case, and Apple has never assisted law enforcement to create a backdoor on an iOS device that bypasses the passcode delay and auto-wipe features. Apple has assisted in extracting data that was not encrypted, bypassing the need for a passcode altogether, but starting with iOS 8 there is very little such completely unencrypted data on iPhones.

Under the rubric of “reasonable technical assistance,” the United States magistrate judge in the San Bernardino case is asking Apple to do two things: (1) Write a special version (locked to Farook’s iPhone) of the iOS software with a backdoor so it does not enforce passcode delays or auto-wipe. (2) Cryptographically sign the special backdoored iOS version so Farook’s iPhone 5c will trust it as a legitimate software update.

The magistrate judge has allowed Apple to restrict the backdoor so it only works on Farook’s iPhone, perhaps using its serial number, and to install the backdoor at an Apple facility. However, the order strongly suggests that the backdoor be provided to the FBI.

What is this particular backdoor useful for?

It is generally accepted that the FBI will not recover anything interesting from Farook’s iPhone. Even the San Bernardino police chief says so. That means the case is either a proxy for some other sealed proceeding or a piece of the larger Crypto Wars. Let’s pause for a moment to think about what the FBI might be able to gain from this particular backdoor. The US government certainly has the ability to create its own backdoor in iOS, and has access to exploits that allow it to bypass the need for the iOS software update to be signed by Apple, at least for some iPhone models and iOS versions.

If Apple is compelled to create a backdoor and to sign it, the signature portion of that equation is not very useful. Assuming there are no vulnerabilities in the signature creation or verification algorithms, that signature cannot be used for anything else.
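To see why, here is a sketch of how a signature binds to one specific image. Ed25519 from the `cryptography` package stands in for Apple’s actual signing scheme, which differs in its details; the property being illustrated, that the signature verifies only for the exact bytes that were signed, is what matters.

```python
# Sketch of why a signature over one specific software image can't be reused for
# another. Ed25519 here is a stand-in, not Apple's actual update-signing scheme.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

signing_key = Ed25519PrivateKey.generate()   # stays inside Apple
verify_key = signing_key.public_key()        # baked into every iPhone

# Placeholder image contents; the serial number here is purely illustrative.
backdoored_image = b"iOS build, passcode limits removed, locked to serial F2LXXXXXX"
signature = signing_key.sign(backdoored_image)

# The device accepts exactly the image that was signed...
verify_key.verify(signature, backdoored_image)  # no exception: accepted

# ...and rejects the same signature attached to any modified image.
try:
    verify_key.verify(signature, backdoored_image.replace(b"F2L", b"ABC"))
except InvalidSignature:
    print("signature does not transfer to a different image")
```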

But the backdoor itself is more interesting. Any backdoor written by the FBI or the NSA would be a hack, conceived without access to the iOS source code and based on an incomplete understanding of how iOS security works. Apple would provide a good backdoor, written by someone who understands iOS security and won’t screw anything else up. Even if the signed version is limited to one particular iPhone, the FBI and NSA—and anyone else who gets hold of it—can learn something useful about how to compromise iOS. And thousands of copies of the signed version could soon be out there, with law enforcement agencies and defense experts.

It is telling that Apple is not simply being asked to break the encryption, but to create a specific tool that circumvents the encryption in a specific way. That is probably no coincidence. If the FBI gets this tool, they will either recklessly access Farook’s data directly on the device, or have to compel Apple to create a second tool that can extract the data on the phone in a forensically sound way.

Maybe Apple could hire a third party team of hackers to comply with the order, and not let any of its own engineers work on it.

Apple’s response

Apple responded on Thursday with a motion to oppose the original order and the FBI’s motion to compel. (I won’t get into why the United States is suing a black Lexus.) The magistrate judge will not be convinced by arguments about slippery slopes or China, but will look at whether the order is legal and whether it is excessively burdensome for Apple. The motion is worth reading in full.

Apple makes it clear that allowing the order would set a precedent that could be exploited almost immediately:

If this order is permitted to stand, it will only be a matter of days before some other prosecutor, in some other important case, before some other judge, seeks a similar order using this case as precedent. Once the floodgates open, they cannot be closed, and the device security that Apple has worked so tirelessly to achieve will be unwound without so much as a congressional vote.

There are also few limits to what else Apple could be asked to do:

For example, if Apple can be forced to write code in this case to bypass security features and create new accessibility, what is to stop the government from demanding that Apple write code to turn on the microphone in aid of government surveillance, activate the video camera, surreptitiously record conversations, or turn on location services to track the phone’s user? Nothing.

The upshot of this line of argument is that when the relief sought is that unusual, it should not come from a law as broad and vague as the All Writs Act, but from an explicit act of Congress. That is what happened with CALEA. Apple’s motion goes on at length about this argument, essentially saying that the existence of the more specific CALEA law trumps the All Writs Act in this area; if that were not the case, no law would ever be needed to authorize new types of court orders.

Apple also articulates the burden imposed by the order. The company estimates that six to ten Apple engineers would have to spend two to four weeks on the project. Some of them would come from Apple’s core OS group. (There have been many complaints about the perceived deteriorating quality of Apple software in recent years; that won’t be helped by pulling engineers away!)

No operating system currently exists that can accomplish what the government wants, and any effort to create one will require that Apple write new code, not just disable existing code functionality. Rather, Apple will need to design and implement untested functionality in order to allow the capability to enter passcodes into the device electronically in the manner that the government describes.

And what happens after this case? Would Apple be allowed to destroy the resulting code, in which case it would have to start over for the next order, or would it be forced to keep it around, with the resulting security risks? Echoing a point Jonathan Zdziarski has made, the declaration attached to the motion reminds us:

Moreover, even if Apple were able to truly destroy the actual operating system and the underlying code (which I believe to be an unrealistic proposition), it would presumably need to maintain the records and logs of the processes it used to create, validate, and deploy GovtOS in case Apple’s methods ever need to be defended, for example in court. The government, or anyone else, could use such records and logs as a roadmap to recreate Apple’s methodology, even if the operating system and underlying code no longer exist.

In other words, even if Apple is allowed to destroy the code, a blueprint for the backdoor still has to remain, and the existence of that blueprint is a security risk.

As Marty Lederman first pointed out, this is intriguing:

Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks.

In other words, the NSA might have a backdoor, and it is not the case that there is “no conceivable way” for the government to extract data, which was one of the arguments in the 1977 case, United States v. New York Telephone Co., that first used the All Writs Act to compel phone company assistance with a pen register.

Free speech, a digression

The first of the two things the magistrate judge ordered Apple to do involves writing software, and recall that some of the pro-crypto side’s victories in Crypto Wars I were won by convincing judges that code is speech. By compelling Apple to comply with the order, the government is mandating speech. That is one of Apple’s arguments in resisting the order. The specific speech being required is not something trivial like “I love kittens,” but speech that is deeply antithetical to Apple’s policy and the wishes of its customers. This is very different from merely providing technical or forensic assistance, which Apple renders all the time, including in the San Bernardino case.

The US government cannot ban speech. Can it compel particular speech? The US government frequently obtains orders against companies that include gag orders. Companies run by security and privacy advocates often set up “warrant canaries.” For example, here is one by backup provider rsync.net declaring that it has never received such a warrant. If rsync.net does receive one in the future, it would no longer be able to publish that notice.

The theory behind warrant canaries relies on the assumption that the government cannot (secretly) compel speech. If it can, warrant canaries will not work.
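Mechanically, a canary is just a dated statement, signed and republished on a schedule. The sketch below uses Ed25519 from the `cryptography` package and is not how rsync.net actually does it; the point is that readers verify the signature and the freshness of the date, and the absence of a fresh canary, rather than any false statement, is the signal.

```python
# Sketch of the warrant-canary mechanism (not rsync.net's actual setup): a dated
# statement is signed and republished on a schedule. A stale or missing canary
# is the signal, so the provider never has to utter a false statement.
from datetime import date
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

provider_key = Ed25519PrivateKey.generate()
public_key = provider_key.public_key()  # published once, out of band

statement = (
    f"As of {date.today().isoformat()}, we have received no warrants "
    f"or orders that we are prohibited from disclosing."
).encode()
signature = provider_key.sign(statement)

# A reader checks both the signature and the freshness of the date.
public_key.verify(signature, statement)  # raises InvalidSignature if tampered with
```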

(By the way: This newsletter has never been the subject of legal process I was not free to disclose.)

Apple has made a free speech argument in its motion to oppose the order:

Under well-settled law, computer code is treated as speech within the meaning of the First Amendment. … The Supreme Court has made clear that where, as here, the government seeks to compel speech, such action triggers First Amendment protections. … Compelled speech is a content-based restriction subject to exacting scrutiny, [Riley v. Nat’l Fed. of the Blind of N.C., Inc., 487 U.S. 781, 796 (1988)] at 795, 797–98, and so may only be upheld if it is narrowly tailored to obtain a compelling state interest, see Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 662 (1994).

There are mixed views on whether Apple’s free speech argument has any merit. Marty Lederman thinks it is bonkers, while the Washington Post found a lawyer who likes it a little more.

Alternatives

Is compelling Apple to create a backdoor the only option available to the FBI? Probably not.

Recall that Apple is being ordered to do two things: create a backdoor and cryptographically sign it. Although the US government probably doesn’t have access to the full source code for iOS, it probably isn’t too hard to create a backdoor because jailbreakers—people who want the ability to install non-Apple approved apps on their phones and make iOS do things that Apple does not support and won’t let regular apps do—have done all sorts of weird modifications to iOS. Jailbreakers have also had success loading custom versions of iOS into many iPhone models. The National Security Agency, with its $10 billion budget, is much better at this than a bunch of random hackers, and the NSA almost certainly has several exploits that can be used against Farook’s iPhone.

Why doesn’t the FBI just use one of those? For one thing, although the FBI is active in this area and regularly buys exploits against iPhones and other devices, it is probably the NSA that has the good stuff. And maybe the NSA just doesn’t feel like helping the folk at the bureau, perhaps because they find FBI agents annoying. So the NSA may not want to waste a valuable exploit on a phone that is unlikely to contain anything useful.

Another thing the FBI could do is use one of its existing tools to bypass iPhone passcodes. According to USA Today, the Justice Department told one federal judge in Brooklyn that the “lack of a passcode is not fatal to the government’s ability to obtain the records.” One of the tools available to the FBI is the IP-Box, which inserts itself on the power cable leading to an iPhone’s battery. By cutting the power shortly after a failed passcode attempt, IP-Box apparently prevents the phone from recording information about the failed attempt, and it won’t count toward the 10 failed attempts allowed before automatically wiping the phone. IP-Box does not work on iOS 9, which Farook’s iPhone may be running, but it wouldn’t be crazy to think that the NSA or the FBI has a device that would work.

A digression on jailbreaking

When a hacker manages to jailbreak a new iPhone model or a new version of iOS, he or she is exploiting a security flaw in iOS. That lets people jailbreak their phones and install whatever apps they want, circumventing the restrictions in Apple’s App Store. There is also a community of coders who make modifications to iOS itself; one such modification adds more information to the iOS lock screen.

Whenever a jailbreak is released, Apple usually releases an updated version of iOS a few months later that patches the security hole used by that jailbreak. Some of these holes have been quite egregious. One infamous jailbreak required only that the user visit a particular website in Safari. When Apple closes these holes, the jailbreak community invariably gets annoyed. But these are the same holes that are exploited by the NSA and other attackers.

Security is hard. There are leaks. Apple’s own experiences with jailbreakers and with IP-Box show this. Deliberately helping attackers doesn’t help matters.

Secure Enclave

iPhone 5s and later models have a Secure Enclave. This is a separate processor within the iPhone’s main chip that runs its own operating system based on the L4 microkernel. The Secure Enclave stores sensitive information such as Touch ID fingerprint data and Apple Pay credit card numbers in such a way that the data cannot be extracted from the enclave. That is why Touch ID cannot leak your fingerprints: data from the fingerprint scanner is sent to the Secure Enclave, which then compares it to the enrolled fingerprints without revealing them.
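A toy sketch of that interface boundary, with made-up names and an exact-match comparison standing in for real fuzzy fingerprint matching: the enrolled template never crosses the boundary, only a yes/no answer and a retry limit do.

```python
# Toy sketch of the Secure Enclave's interface boundary (not Apple's actual API):
# the enrolled template lives only inside the enclave object, and the rest of the
# system only ever sees a match/no-match answer plus a rate limit.
class SecureEnclaveSketch:
    MAX_FAILURES = 5  # Touch ID falls back to the passcode after repeated failures

    def __init__(self, enrolled_template: bytes):
        self._template = enrolled_template   # never exported
        self._failures = 0

    def match(self, scanned_template: bytes) -> bool:
        if self._failures >= self.MAX_FAILURES:
            raise PermissionError("passcode required")
        ok = scanned_template == self._template  # real matching is fuzzy, not equality
        self._failures = 0 if ok else self._failures + 1
        return ok

enclave = SecureEnclaveSketch(b"enrolled-fingerprint-features")
print(enclave.match(b"fresh-scan-features"))  # False; no template ever leaves
```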

Having a separate processor and operating system to do this is beneficial because it can be made smaller and simpler, and therefore easier to audit. At the same time, the Secure Enclave’s relative isolation from the rest of the system reduces its so-called “attack surface,” the set of possible attacks that can be staged against it.

In iPhones that have a Secure Enclave, it is also responsible for enforcing passcode restrictions, such as the pause between passcode attempts and auto-wiping after 10 failed passcode attempts. It has been suggested that if Apple could create a backdoor for Farook’s iPhone 5c, which does not have a Secure Enclave, later phones would not be vulnerable in the same way.

Although Apple’s disclosure is a little vague in this area, it now appears that the software running on the Secure Enclave can, in fact, be updated with an update file that is properly signed by Apple. This means that what Apple is being ordered to do in the San Bernardino case is also possible for phones with a Secure Enclave. What is still a little unclear is whether a correct passcode needs to be entered prior to such an update. In some ways, Apple brought all this on itself with inadequate security design.

The slippery slope is 215 degrees

Section 215 of the Patriot Act had four elements. It allowed (1) the FBI to request the production of (2) records (“tangible things”) to (3) aid in an (4) investigation. The bulk metadata collection program did not meet any of those criteria. Section 215 was used by the NSA, not the FBI. It did not require the production of any tangible things then in existence; instead, the phone companies were ordered to send all new call records every day. It was not to support an investigation, because there was no investigation; the data was stored indefinitely for possible future investigations or other use.

The All Writs Act, which is being invoked in the San Bernardino case, is much broader than Section 215. Here is the relevant portion:

The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.

Absent clear case law, this is the kind of statute that could be read very broadly or very narrowly to suit the whims of the judge applying it. An overzealous judge could use it to order anyone to do almost anything, while a more conservative judge might be more careful about making sure his or her orders are necessary, appropriate and agreeable to the usages and principles of law.

The phrase “in aid of their respective jurisdictions” implies that there needs to be a case to have jurisdiction over, but the lack of a case being investigated didn’t prevent Section 215 from being abused. The lesson of Section 215 is that the slippery slope in this area is very steep indeed.

Given how broadly the All Writs Act is written, there is a sense that, absent clearer language, it cannot actually be that broad in practice, and that it probably does not create an exception to the First Amendment. So it is quite possible that the Justice Department will lose this particular battle, or some version of it: that courts will find the Act cannot be used to compel particular speech, and that there are limits to the type and extent of “technical assistance” technology companies can be required to provide.

That is not the end of Crypto Wars II, however. The fight would then turn to Congress, which could enact a law (which Barack Obama would sign) clarifying what Apple’s technical assistance obligations are. In addition to requiring technical assistance with custom one-off backdoors like the one for Farook’s phone, a new law could require Apple to build backdoors into released versions of iOS. Assuming this is constitutional, an explicit legislative mandate would probably be better than using the All Writs Act. There is precedent for this: CALEA is a law from 1994 that required phone companies to build wiretapping functionality into their networks.

Not All Writs, Brooklyn edition

This isn’t the only All Writs Act fight Apple is involved in. Since October, Apple has been resisting a technical assistance order in a case in Brooklyn. The Brooklyn phone runs iOS 7, and as far as we can tell, Apple is being asked to do the kind of extraction of unencrypted data that it has done in the past. But Apple is objecting to how wide-ranging the order is. Alison Frankel at Reuters has a great post on how magistrate judge James Orenstein in Brooklyn approached the government’s request in that case and ordered extensive briefing and argument on whether the proposed order was proper. It is thanks to Judge Orenstein that we know that Apple has received similar orders in 9 other cases, involving 12 iOS devices. That is in addition to the device in Judge Orenstein’s case and the San Bernardino device.

Amy Davidson in the New Yorker discusses some of the dangers of overly broad use of the All Writs Act. I sometimes like to say that slippery slope arguments are a slippery slope, that they prevent you from doing anything at all. In this case there are literally US attorneys and district attorneys waiting to push us down the slope, and the government has not articulated any limiting principle to what types of assistance can be compelled using the All Writs Act.

If a case involving a non-digital phone network could be applied to smartphones, what technologies might an Apple precedent be applied to, three or four decades from now? (The N.S.A. used, or rather promiscuously misused, another pen-register case from the same era to justify its bulk data collection.) It no longer becomes fanciful to wonder about what the F.B.I. might, for example, ask coders adept in whatever genetic-editing language emerges from the recent developments in CRISPR technology to do. But some of the alarming potential applications are low-tech, too. What if the government was trying to get information not out of a phone but out of a community? Could it require someone with distinct cultural or linguistic knowledge not only to give it information but to use that expertise to devise ways for it to infiltrate that community? Could an imam, for example, be asked not only to tell what he knows but to manufacture an informant?

These are examples of cooperation that go beyond providing information or access to facilities, but they are not all that different from what Apple is being ordered to do in California.

China and India

A lot of American technology companies operate in foreign countries. Sometimes they get in trouble. Yahoo! China got a Chinese dissident imprisoned by releasing his emails to the Chinese government. Some American companies, such as Facebook and Google, deal with the ethical issues involved by simply not doing certain kinds of business in China.

China has not aggressively mandated backdoors or weakened cryptography for products sold in China. An iPhone bought in China is just as secure as one bought anywhere else. India, on the other hand, has required backdoors in BlackBerry devices sold there. Pro tip: if you are an Indian criminal, don’t use a BlackBerry. That advice also applies if you are not an Indian criminal.

BlackBerry is more or less dead, but Apple sells a lot of iPhones in China, and the Chinese market is vital to the company’s future growth prospects. iCloud data for Chinese users is already stored within China and is presumably available to the Chinese authorities. Apple can probably live with that because sending sensitive information to iCloud is optional for users. But if the Chinese government starts requiring a backdoor in iPhone hardware or software, that might be a bridge too far for Apple.

As John Gruber puts it:

I’ve long wondered why China allows companies like Apple to sell devices without back doors for their government. A big part of why they tolerate it seems to be the fact that no government gets this.

If the US government starts demanding backdoors, then the Chinese government will too. Then India, then Great Britain, then every oppressive regime and incompetent intelligence agency will want one as well.

It’s hard to know what the US government thinks about all this. The US law enforcement community has a history of trying to use the global position of American technology companies to its own ends, while ignoring the repercussions for the companies’ competitiveness and how other governments might respond. For example, the Department of Justice has asserted the right to demand emails stored on Microsoft servers in Ireland directly, without going through the Irish authorities.

Great Britain

Maria Farrell points out that if Apple had been a British company, this entire debate would be moot:

Because the Investigatory Powers Bill both requires an overly broad base for ‘reasonable assistance’ and accompanies it with a gagging order. (Long experience has taught me there is always a good reason for what first appears to be sloppy drafting in a Home Office bill.) So, unlike in the US, there will be no chance here of a Vodafone executive publicly refusing to actively assist government hackers once this bill has passed. The old gag about the Snowden revelations plays out once again; it was oddly funny that wide-scale surveillance was able to happen in the US illegally, and in the UK almost wholly legally.

So that’s something the USA has going for it. 🇺🇸🇺🇸🇺🇸

Why do I need security?

If you were Osama bin Laden and the US government had your iPhone, there would probably be nothing you could do, and all of this would be moot. The Justice Department wouldn’t play games with the All Writs Act; it would have the NSA hack into your phone.

If you are in the United States, keep in mind that literally everything is a federal crime. And there is probably evidence of your crimes on your phone.

If you have commercial secrets on your phone, and you travel to China, you probably don’t want the Chinese government, or anyone else, to easily read your secrets. (If you are a major terrorist or splittist, they probably will anyway.) And while your factory processes or HVAC system or whatever else are probably not controllable from the iPhone that you carry around, you might have some indirect access to that network through your phone’s VPN access. Your network’s security is probably set up so that compromising your iPhone shouldn’t be too dangerous, but you would lose one layer of a defense-in-depth strategy.

If you have nude selfies, dickpics or other embarrassing material on your phone, you probably don’t want people to easily access them, whether it’s onanists or the US government trying to discredit you by revealing your porn habits.

If the anti-crypto side wins Crypto Wars II, the best response might be to preemptively publish your porn history and dickpics, and to just panic about the security of your systems.

Secure backdoor is secure

A couple of the Crypto Wars II takes in the last couple of weeks have tried to expose a supposed hypocrisy on Apple’s part by arguing that Apple’s stance reveals that secure backdoors are possible. The argument goes that the California magistrate judge’s order truly only applies to this one phone. The next time such an order is issued, it would only apply to the phone in that case, and so on. There is no mechanism for the government to illegitimately gain access to data on an iPhone, and no room for mass surveillance.

All of these arguments assume that the backdoor would, indeed, be secure. That would be the case if relatively few engineers at Apple have access to it, and there is no chance of it leaking. Apple provided some data for 3,093 device requests by US law enforcement agencies in the first half of 2015. Manhattan district attorney Cy Vance is ready with 175 phones he wants unlocked.

Today, the signing keys for iOS releases do not have to be used very often, and they are likely kept very secure and difficult to casually use or abuse. If thousands of devices have to be “individually” backdoored every year, those keys would also have to be used thousands of times a year, and it would be much more difficult to keep them safe. A lot of the current hacking tools for iPhone already originate from Apple’s Chinese supply chain, and you simply can’t keep secure a signing process that has to be used thousands of times a year when the stakes are this high.

Apple has not been ordered to hand over a general backdoor yet. As Jonathan Zdziarski points out, that may not continue to be the case if Farook’s phone is backdoored, and important evidence is found that points to a hitherto unknown co-conspirator. If that evidence is introduced at trial, there is a real possibility that Apple will be required to disclose the details of exactly how it obtained the evidence being presented. Now, because there is nothing interesting on this iPhone, its contents probably won’t be released at trial, and there is no defense team that will demand that its experts get full access to the backdoor. The worst case is that Apple has to provide a tool that allows the FBI to extract the full contents of the phone, including the backdoored iOS code, and that could then leak out of the FBI. However, the very next case might involve a phone with useful evidence on it, and then you have a major risk of leaks.

It is appropriate that Apple sees itself as part of the threat model for its users’ devices. Julian Sanchez explains why this kind of technical assistance is so dangerous. The whole piece is interesting; here is an important point:

This would create an internal conflict of interest: The same company must work to both secure its products and to undermine that security—and the better it does at the first job, the larger the headaches it creates for itself in doing the second. It would also, as Apple’s Cook has argued, make it far more difficult to prevent those cracking tools from escaping into the wild or being replicated.

Why this case?

The FBI will probably not recover anything useful from the iPhone in question. The bureau already has a lot of material about the attackers, much of it from iCloud, through Apple’s cooperation with search warrants and subpoenas in the case. Farook destroyed several devices before the attack, so thoroughly that it has not been possible to recover any data from them, and it is likely that any sensitive information was on those destroyed devices rather than the one left intact. It was a work phone. And as I mentioned, the FBI probably turned down an offer by Apple to handle all this secretly.

We now also know that the FBI stupidly reset Syed Farook’s iCloud password in the hours after the attack. The iPhone had not backed up to iCloud for six weeks, but there was a chance that it would do so given the opportunity. That opportunity was lost when the iCloud password was changed. (We now know that the FBI ordered this, not the San Bernardino health department.)

The FBI also stupidly turned off the phone, which cut off many potential avenues of attack. Given how incompetent the bureau has been in this case, and the fact that the phone is unlikely to contain any information of interest, you might think this is a weird case to use to kick off Crypto Wars II.

It is a difficult choice to make for the bureau. It would have to be a terrorism investigation. A kidnapping investigation is too time-sensitive. If this had been Osama bin Laden’s iPhone with potential information on al-Qaeda operations, they would not have bothered with some law from 1789 and would simply have hacked the phone. So only a case where the phone is relatively unimportant would be a candidate for trying to crush Apple. That probably rules out phones recovered from investigations of foreign terrorism cases, which is why they settled on a case that, while tragic, involved someone going postal on his co-workers—as American as apple pie.

The FBI should probably have waited for a case where it didn’t screw up the evidence so badly, but you go to Crypto Wars with the case you have, not the case you want.

What should Apple do?

It is hard to make suggestions for what Apple should do legally and politically because, as I discussed above, it’s hard to know what this particular phase of Crypto Wars II is really about. Is the FBI trying to get some piece of legislation passed, or is it pressuring Apple to do something in a case that is under seal? Does Apple fear the legislation, or does it fear how the Chinese and Indian governments will react?

On the technical side, there are ways Apple could mitigate the risks of the measures sought in this case. Why is it even possible to upgrade iOS software or Secure Enclave firmware without entering the passcode? It shouldn’t be. Maybe a phone should not be bricked if its owner forgets the passcode. In that case, resetting the phone should also wipe all the data. And Katie Benner reports in the New York Times that Apple is working on measures to make iPhones less vulnerable to the type of attack the FBI is staging.

Apple should get rid of “security” questions.

Apple should stop selling older phones without a Secure Enclave.

Apple should also think hard about iCloud security. Apple has the encryption keys to your iCloud backups and can disclose your backed-up data—including messages and photos—to anyone. Apple handed over Farook’s iCloud backups, and iCloud backups are also how Jennifer Lawrence’s nude selfies were leaked. There should probably be an option to have only you hold the keys to your backups, which is already the case for your passwords in iCloud Keychain. People who use that option would be at some risk of losing all their data if they forget all their passwords and keys. Apple presumably does not offer this because it does not want to be in the awkward position of telling parents that all the photos of their kids are lost because they were not careful enough with passwords.

Some practical advice

Don’t use iCloud backups. Yes, I know, it’s a huge hassle to remember to back up to iTunes.

Encrypt everything. Use HTTPS on your web servers; Let’s Encrypt provides free certificates now. On Mac, use FileVault 2, and on Linux, use eCryptfs. Use a long passcode on your phone.
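If you want help with the long-passcode advice, something like this, using Python’s standard secrets module, will do; the length and alphabet here are arbitrary choices, not a recommendation of a specific policy.

```python
# Generate a random passcode with the secrets module (cryptographically strong
# randomness, unlike random). Length and character set are arbitrary choices.
import secrets
import string

def random_passcode(length: int = 10, digits_only: bool = True) -> str:
    alphabet = string.digits if digits_only else string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_passcode())                       # e.g. a 10-digit numeric passcode
print(random_passcode(16, digits_only=False))  # or a longer alphanumeric one
```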

Use Signal to communicate securely.

Disable Touch ID if a government, or someone who might kidnap you for your finger, is part of your threat model.

If you are a corporate IT department, just enable MDM. It lets you avoid a situation where you can’t get into an employee’s work-issued phone. This is especially important if you are legally required to maintain such access (no, iPhones are not illegal in the financial industry).

More links

How Tim Cook, in iPhone Battle, became a bulwark for digital privacy. Kieran Healy explains why Apple is fighting the FBI on this, and not some other tech company, and why the rest of Silicon Valley has been slow in coming to Apple’s support. General Hayden has a take too. Do we have a right to security?

iOS Security Guide. Read this religiously. A cartoon. If I am the victim, please pay my ransom. Definitely go read everything Jonathan Zdziarski writes on this.

A poem (apologies to various parties)

First they came for an iPhone in Brooklyn, and I did not speak out—because it was in Brooklyn. Brooklyn is terrible.

Then they came for an iPhone in California, and I did not speak out—because it was not my phone and it was used by a murderer.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for another iPhone, and I did not speak out—because it was not my phone.

Then they came for my iMessages, and I did not speak out—because I was confused and thought this was about phones.

iPhone 5c is a lame plastic phone anyway. It deserves to be backdoored.

Another poem (by JC)

Here is a cryptography scheme

that is appropriately lean and mean:

just add all the symbols mod 2

cuz if I can’t read it, neither can you

Final thoughts

I love you Admiral Rogers!

Did anyone watch last week’s episode of Scandal? The director of the NSA is always a general or admiral. It would be very strange for a mere captain to become director. Also I want a Gettysburger, with freedom fries and union rings.

1620 #farookpasscodeguesses

The Danish political crisis

25 Feb 2016

For only the first or second time ever (depending on who you ask), the Danish parliament on Wednesday will vote to express no confidence in a Danish government minister. The center-left parties and the Conservative People’s Party no longer trust the environment and food minister Eva Kjer Hansen.

This all started with a farm law, passed this morning, that eases pollution restrictions on Danish farmers. The law was egged on by an extremist farmer group that unironically calls itself Sustainable Farming. One of the big changes in the law will be to let farmers use more fertilizer. Using more fertilizer will release nitrates into the waterways, which is bad and could be illegal under EU law. The law would also permit more aquaculture, which also discharges nitrates into the water. (It also releases greenhouse gases, making it more difficult for Denmark to fulfill its greenhouse gas emissions targets.)

The Conservatives have positioned themselves as an environmentalist party—they’d like to conserve the environment getitgetit—so they wanted to see an improvement in nitrogen discharge, not an increase. That’s why the law includes various initiatives to reduce nitrogen discharge, to outweigh the additional discharge from increased fertilizer use.

But those initiatives won’t be effective until 2019, and the government wanted to show an improvement in nitrogen every year starting in 2016. So it decided to include a “baseline effect,” which accounts for reduced nitrogen discharge due to factors such as farm area that is converted to residential or industrial use.

That might be a little questionable, but so far so good. The real problem is: the baseline effect used for 2016 includes the baseline effects for 2012–2015. Without that misleading figure, there would be no net nitrogen reduction in 2016.
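With made-up numbers, purely to illustrate the accounting: suppose extra fertilizer adds 10 units of nitrogen discharge in 2016 and land-use changes remove 3 units per year.

```python
# Hypothetical numbers, only to illustrate the accounting trick described above.
extra_discharge_2016 = 10        # added discharge from more fertilizer, arbitrary units
baseline_effect_per_year = 3     # reduction from land-use changes etc., per year

honest_net = extra_discharge_2016 - baseline_effect_per_year        # +7: discharge rises
claimed_net = extra_discharge_2016 - 5 * baseline_effect_per_year   # -5: 2012-2016 lumped in

print(f"2016 counted honestly: {honest_net:+d} (an increase)")
print(f"2016 with 2012-2015 baselines lumped in: {claimed_net:+d} (an apparent reduction)")
```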

At a parliamentary hearing last week, six of the seven experts who testified agreed that the figures are very misleading. But Eva Kjer Hansen has not been willing to admit that she misled parliament. That is why she is now facing a no-confidence vote on Wednesday, the earliest possible day under parliamentary rules, and she would be forced to resign if she loses that vote. Prime minister Lars Løkke Rasmussen is backing his minister and is not willing to fire her, and has threatened to call an election if she is deposed.

(Obviously there is more going on. The single-party minority government is weak and the nitrogen crisis reflects deep problems in the government’s ruling coalition.)

Vox on AC/DC

25 Jan 2016

(Cross-posted from Ello with changes.)

Vox has a video explainer on why your laptop charger gets so hot. They make the interesting claim that it’s because of the conversion from AC to DC power.

In fact, the rectification itself is relatively efficient. But after you have rectified, you are left with a quite high DC voltage. Then there is circuitry that steps down the 170V (peak) DC voltage (in the US) to the roughly 18V that a Mac laptop expects, mainly for charging its batteries. It is this DC-to-DC conversion that accounts for most of the inefficiency in a laptop power supply.
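Where the 170V figure comes from, for anyone checking the numbers: the peak of a 120V RMS sine wave is √2 times the RMS value, and a rectifier plus filter capacitor charges up to roughly that peak.

```python
# Peak voltage behind a rectifier fed from US 120 V RMS mains.
import math

v_rms = 120.0
v_peak = v_rms * math.sqrt(2)
print(f"about {v_peak:.0f} V at the peak")  # ~170 V
```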

The main reason we don’t get DC power into our houses is that a hundred years ago, it was hard or expensive to change the voltage of a DC power line, while AC voltages can be easily stepped up or down with transformers. A transformer sits near your (American) house to step a distribution voltage of a few kilovolts down to the 120/240V that an American house expects. Larger transformers sit in various places in the grid to convert transmission voltages of 115kV and up to various lower voltages, and finally to the few kilovolts that run down residential streets.

That is not to say that the traditional account of the Edison–Tesla rivalry is entirely incorrect, although Vox’s version is a bit simplified. Even if DC had won some short-term victories, the technology of the day simply wasn’t adequate for widespread DC deployment, mainly because of the difficulty of boosting and bucking DC voltages, not because of AC–DC conversion.

Today high-voltage direct current is much more feasible thanks to thyristors and insulated-gate bipolar transistors, and is often used for undersea connections and long-distance power lines from hydroelectric power plants. It is particularly useful for undersea connections, with AC–DC converters on each end, because it can link electrical grids that operate at different frequencies or phases. It is also useful over long distances because less conductor material is needed: DC does not suffer from the skin effect, which limits how deeply alternating current penetrates a conductor. You also do not have to worry as much about capacitance, which matters for both long-distance and undersea connections.

The real question that should be asked is: Why don’t we just get a lower voltage into our house? If the outlet had 18V for laptops, or perhaps 12V for a lot of other electronic devices or 5V for charging our phones, we wouldn’t need any voltage conversion. These lower voltages would cause problems for makers of higher power devices like hair dryers or ovens, but those could possibly be overcome, and who needs hair dryers in this new internet of things world. If there was a wall outlet with 18V, the laptop charger wouldn’t be necessary at all. (You can, in fact, get electrical outlets with 5V USB ports. They have an AC–DC(–DC) converter inside the outlet, and there is absolutely no way I have installed any because it would not be permissible for me to do so as I am not a licensed electrician in New York City, but hypothetically, if I had, I would think they are awesome and highly recommend them.)

Transmission voltages are not lower because lower voltage means higher current (for the same amount of power), which means more losses in the transmission lines. A 50 or 100 meter phone charging cord would be infeasible for this reason—there would be too much loss in the wire, or the cord would be infeasibly large. If we did receive DC from the electric company into our houses, it would probably be relatively high voltage DC, and we would still need inefficient DC-DC converters everywhere. Proposals for powering data centers with direct current usually call for 48VDC, with DC–DC converters at the point of load.
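Some rough numbers for the long-cord case. The wire size (0.5 mm², roughly AWG 20) and the 10 W load are assumptions picked for illustration:

```python
# Rough numbers for why a long, low-voltage charging cord doesn't work.
# Wire gauge and load are assumptions; copper resistivity is a physical constant.
RHO_COPPER = 1.68e-8        # ohm * metre
AREA = 0.5e-6               # conductor cross-section in m^2 (roughly AWG 20)
LENGTH = 2 * 100            # 100 m cord, out and back
LOAD_WATTS = 10.0

resistance = RHO_COPPER * LENGTH / AREA    # ~6.7 ohms round trip

for volts in (5.0, 120.0):
    current = LOAD_WATTS / volts
    drop = current * resistance            # volts lost in the cord
    loss = current ** 2 * resistance       # watts dissipated in the cord
    print(f"{volts:>5.0f} V: {current:5.2f} A, {drop:6.2f} V dropped, {loss:6.2f} W lost")
```

At 5 V the cord would need to drop more voltage than the supply even has, so it simply cannot deliver the power; at 120 V the same wire wastes a fraction of a watt.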

Overhead AC transmission lines typically run at 115kV and up, and some of the largest new HVDC lines run at ±800kV.

The video also makes the questionable claim that the United States has one of the best and most reliable power grids in the world. The grid is certainly physically and electrically large and has a huge amount of capital invested in it, which makes it a questionable proposition to replace it or adapt it to a new standard, but there are more blackouts and brownouts than residents of most wealthy countries would be satisfied with.

Also, the LED light bulb pictured in the video probably has a rectifier inside and runs on DC. Only very cheap LED lamps run directly off AC; I do own a set, bought at Home Depot and installed as under-cabinet lighting, but that kind of lamp is relatively rare these days.

I know a MacBook charger works a little differently: it has power factor correction, and you actually get about 380VDC inside it. The point is that rectifiers are efficient. One amp times 1.4V (a worst case, since most bridge rectifiers are based on Schottky diodes with lower forward drops) means less than 2 watts dissipated as heat. It is also still the case that if your laptop did accept 170VDC, or 380VDC, the adapter would create much less heat. Despite what Vox claims, it is the DC–DC conversion, not the AC–DC rectification, that accounts for most of the losses.
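To put the two loss mechanisms side by side, with the rough bridge figures above and an assumed (not measured) 90 percent efficiency for the DC-DC stage of a nominal 60 W charger:

```python
# Rough comparison under assumed numbers: bridge rectifier loss versus the loss
# of a DC-DC stage at an assumed 90% efficiency in a 60 W charger.
input_current = 1.0            # amps through the bridge (rough)
diode_drop = 1.4               # volts across two conducting diodes
rectifier_loss = input_current * diode_drop

output_power = 60.0            # watts delivered to the laptop
dcdc_efficiency = 0.90         # assumed, for illustration
dcdc_loss = output_power / dcdc_efficiency - output_power

print(f"rectifier: ~{rectifier_loss:.1f} W, DC-DC stage: ~{dcdc_loss:.1f} W")
```

Under those assumptions the diodes account for a watt or two while the conversion stage dissipates several times that, which is where the heat in your lap comes from.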

ISDS can be fixed

28 Aug 2015

What is ISDS?

Investor-state dispute settlement, or ISDS, is a way of settling disputes between investors and states.

It most often happens in the form of an arbitration, so some people call it investor-state arbitration, or ISA. ISDS provisions have traditionally been inserted in bilateral investment treaties, which are negotiated between two states and provide for certain benefits and protections when an investor located in one of the states invests in the other state.

One of the problems investment treaties try to solve is that the host state could seize the investment outright, or might introduce limits on currency convertibility that make it difficult to repatriate profits, or introduce regulations that discriminate against foreign investors. These concerns loom particularly large in countries with an ineffective legal system or weak rule of law, but even countries that we might think would know better will sometimes discriminate against foreign companies.

When the foreign investor can’t get justice in domestic courts, the traditional remedy has been to get the investor’s home state to try to convince the host state to pay some compensation through diplomacy, or perhaps by threatening retaliation in some other area. In the 19th century, the home state might have sent along some gunboats or even started a full scale war. Today the use of military force to settle commercial disputes is frowned upon.

The first bilateral investment treaty allowing an investor to take a state to arbitration was signed by France and Tunisia in 1969. Since then, hundreds of treaties have included arbitration clauses, and a few multilateral ones have too, notably NAFTA and the Energy Charter Treaty.

ISDS provisions give investors some confidence that if their investment is confiscated or otherwise impaired, they can take their case to an impartial tribunal and receive compensation. While an investor would probably not invest in a state that has a terrible legal system and is certain to ignore property rights, the option of using ISDS can provide a little more assurance and could tip the scales in favor of a particular country, or investing versus not investing.

ISDS provides a way to circumvent weaknesses in the host state’s rule of law and respect for property rights; it might be better to just fix that, but major institutional changes can be challenging to realize.

Even in places where courts work well, arbitration is often seen as a more efficient, faster and cheaper process than a traditional lawsuit that can take years, especially when the state is the defendant.

Most ISDS cases have been brought by European investors against poorer countries. Unsurprisingly, Argentina and Venezuela dominate the league tables of respondent states as a result of their large scale nationalizations of foreign businesses.

ISDS clauses usually say that arbitration has to follow one of two sets of rules, those set out by ICSID, part of the World Bank, or UNCITRAL, a UN institution. UNCITRAL arbitral awards against a state can be enforced in courts around the world like traditional arbitral awards. ICSID awards are easier to enforce because they are supposed to be treated like a domestic court judgment. In most cases, a state that loses will simply pay up.

Why are we talking about ISDS now?

There have been a few high-profile ISDS cases that were widely seen as abuses of the process, which has created anxiety that ISDS, which is ultimately a limitation on a state’s sovereignty, could have a chilling effect on legislation in areas like environmental protection, health and patents, and perhaps undermine democracy itself.

After the Fukushima nuclear incident in 2011, Germany decided to phase out nuclear power by 2022. As part of the phaseout, some older nuclear reactors were shut down. The Swedish state-owned energy company Vattenfall, which owned two of those reactors, decided to request arbitration for billions of euros—we don’t know exactly how much because arbitration proceedings are secret—under the ISDS provision of the Energy Charter Treaty.

Even if Vattenfall wins, the reactors will stay closed and Germany will still be nuclear-free by 2022. Arbitrators cannot issue an injunction ordering German regulators to reopen the reactors, and their awards have no direct influence on public policy. But many regulations will end up costing someone money, and if that someone is a foreign investor from a country with an ISDS clause, regulating them in a way that benefits the public—for example by reducing the risk of a nuclear disaster—could end up costing the state a lot of money.

In most countries, the government has to pay compensation if it confiscates a piece of property and takes it into its own ownership, but usually not when a regulation merely reduces the property’s value and there is a good public policy rationale for it. If the kind of legislation that legislators routinely enact could end up being ruinously expensive because a foreign investor happens to lose out, legislators could be reluctant to enact it in the first place. That is alleged to have happened in Canada in 2014 when legislation mandating plain packaging for cigarettes was withdrawn. Australia is currently engaged in an ISDS case on the same issue because Philip Morris Asia, which is nominally a Hong Kong investor, argues that the Australian plain tobacco packaging law expropriates its investment.

[Image: proposed Australian cigarette package]

ISDS cases can also be brought when courts reinterpret existing legislation to limit investors’ property rights. The Supreme Court of the United States has sharply limited what can be patented in the US through a series of decisions over the past decade, with the effect of limiting or eliminating those patent owners’ “property” rights. We think of those decisions as simply clarifying what the law always was, but arbitrators may not see it that way. No investor has yet requested arbitration over CLS v. Alice or other American patent decisions, but Eli Lilly is doing so over an aspect of Canadian patent law known as the “promise doctrine.” The same could also happen as a result of the new Unified Patent Court in Europe.

It’s important to note that the chilling effect only exists if legislators, regulators and courts are chilled by it. As long as they are not ruinous, the host state could choose to pay out any arbitral awards and just continue governing as if they never happened. The US Congress and state legislatures would probably not feel very chilled.

ISDS is part of the drafts for both the new Trans-Pacific Partnership deal being negotiated by the US and various Pacific Rim countries, and the TTIP deal between the US and the European Union, which makes it very relevant. People who are debating TPP and TTIP spend a lot of time discussing whether ISDS should be part of those deals. It also looks bad that ISDS provisions grant special privileges to foreign companies, because only foreign investors can request arbitration against a state.

[For my 🇺🇸 readers: The United States has yet to lose an ISDS case. That might be because American courts are awesomesauce, because the system is rigged, or it might just be luck. Also, check out this hilarious pro-ISDS argument that invokes the Federal Circuit.]

[For my 🇪🇺 readers: There are some intra-EU ISDS treaties. The European Commission requested their termination a few months ago because they’re not really compatible with EU law.]

How can ISDS be fixed?

The intentions behind ISDS are good: ISDS can help attract investment by protecting property rights, and by circumventing ineffectual legal systems. With some fixes—some small, some radical—I think ISDS can be cut down to size and be made palatable, while still achieving the goals it was created for.

Countries with good legal systems

When Vattenfall felt expropriated by the German government, did they really need access to an alternative judicial system to receive the compensation they deserved? Germany has very well-functioning and widely respected courts. I pooh-poohed Vattenfall’s claims over its two nuclear reactors above, but if they truly have merit, couldn’t German courts have decided that?

In fact, RWE and E.ON, two other energy companies with large investments in Germany, sued in German courts over a nuclear fuel tax, a different provision of the phaseout, and won. Well, the European Court of Justice in Luxembourg later ruled that the tax was legal. In any case, that lawsuit is close to being resolved, while Vattenfall’s case appears to be never-ending. So much for arbitration being faster.

Swedish investors don’t need ISDS to be reassured that it is safe to invest in Germany. Hong Kong investors don’t need ISDS to invest in Australia. American investors don’t need it to invest in Canada.

Vattenfall used the Energy Charter Treaty, which includes countries that at the time did not have great legal systems, and it would probably look bad to apply ISDS only to cases involving those countries. But there is no pressing need for TTIP, which is between the United States and the European Union, to include an ISDS provision. (There is an argument making the rounds that ISDS in TTIP is necessary to get China to agree to ISDS, but it doesn’t quite make sense.)

Stronger public policy exceptions

Most recent ISDS clauses include some recognition that not all losses caused by government action constitute a taking, and many arbitrators accept that. The leaked TPP investment chapter explains in an annex that

Non-discriminatory regulatory actions by a Party that are designed and applied to protect legitimate public welfare objectives, such as public health, safety, and the environment, do not constitute indirect expropriations, except in rare circumstances.

These guarantees can be made stronger, so something like the Australian plain tobacco packaging law is understood by everyone to not be expropriation.

Multilateral investment treaties often allow a commission of government representatives to issue interpretations of the treaty that are binding on arbitration panels. NAFTA has one and the TPP draft has one too. These commissions should be used more frequently and aggressively to curb abuses of the arbitration process.

Appeals

Arbitration panels consist of three arbitrators, often not very well selected, and they can make mistakes. When a judge in a conventional court makes a mistake, there is usually an appeals court that can correct it. ISDS awards can amount to billions of dollars, and the consequences of an error can be serious for the losing state.

While there are a few narrow ways in which an arbitral award can be set aside, a permanent appeals body, like the one the WTO has, could do much to curb bad arbitral awards and abuses of the system. Arbitration awards normally cannot be appealed, because appeals would increase the potential cost of a system that is supposed to save money. One way to preserve that advantage is to limit appeals to awards over a certain size.

Transparency

Arbitration is secret. The arbitration panel has the option of issuing wide-ranging confidentiality orders over the documents in the case, and in-person sessions are usually not open to the public. Opening up the proceedings whenever possible would reduce anxiety about the process.

Alternatives to ISDS

Arbitration is not the only way that investors can be protected against expropriation. MIGA, an agency of the World Bank, sells insurance against breach of contract and political risks such as expropriation. Many countries’ export-import banks offer similar coverage.

An investor with such an insurance policy may be even better protected than with an ISDS clause, because the insurer is less likely than a state to resist paying out a valid claim. MIGA’s capital can be expanded so the $720 million per-country limit can be raised and more investors can take advantage of this kind of insurance.

MIGA has only paid out eight claims so far, and only one of those was for a project in Argentina. According to the agency itself, the number is so low because they are awesome at negotiating an informal solution before a full claim is necessary.


Addendum: Contrary to popular belief, the bilateral investment treaty concluded between Germany and Pakistan in 1959 did not contain an ISDS provision. It was the first such treaty, but the arbitration procedure in Article 11 refers only to disputes between the parties to the treaty, not between a private investor and the host state of the investment.