Apple in FBI Crosshairs: An Epic Confrontation Begins

Note to RTL readers: I asked John Simek, Sensei’s Vice President, to join me in composing this post because I didn’t want to be among the ranks of those who have been issuing reports which are technologically inaccurate. It is a longer post than customary, but not sufficiently long to discuss every element of this case. We hope you find it a useful summary of events to date.

If you have enjoyed following this important story, you’re likely to hear updates and commentary at ABA TECHSHOW®, March 17-19 in Chicago. That is especially true because the keynoter is Cindy Cohn, the Executive Director of the Electronic Frontier Foundation. She has been quoted on this story online, in print and on TV, so we are really looking forward to her insights. Other speakers will no doubt reference this story as well and incorporate the most current developments as they unfold.

The encryption and privacy debate went into nuclear mode last week. The FBI went to federal court and got a judge to issue an order to have Apple assist the government in accessing an iPhone 5c that was used by Syed Rizwan Farook, who was one of the two killers in the San Bernardino mass shooting in December of last year. Apple has publicly stated it will not comply with the order and is prepared to do battle in court. Last Friday, the government filed a motion to compel Apple to comply with the court order. Obviously, news outlets have been flooded with reports, many wildly inaccurate from a tech standpoint.

Why can’t the FBI access the data on the iPhone? Apple has steadily improved the security of the iPhone, and the data on the phone is automatically encrypted when the user configures a lock code. If the optional “Erase Data” setting is enabled, the data on the phone is automatically destroyed after 10 incorrect lock code attempts. In addition, iOS imposes increasingly long delays between attempts as the number of invalid lock codes grows. On the iPhone 5c, this security mechanism is built into iOS itself.
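
To make the mechanism concrete, here is a minimal Python sketch of that kind of logic. It is our illustration, not Apple’s code: the delay schedule and the ten-attempt wipe threshold below are assumptions modeled on Apple’s documented behavior.

```python
import time

# Illustrative delay schedule: failed-attempt count -> lockout minutes.
# The exact values are assumptions modeled on iOS's documented behavior.
DELAY_MINUTES = {5: 1, 6: 5, 7: 15, 8: 15, 9: 60}
WIPE_THRESHOLD = 10  # applies when the "Erase Data" option is enabled

def try_passcode(entered: str, correct: str, failures: int) -> tuple[bool, int]:
    """Return (unlocked, updated failure count), enforcing delays and wipe."""
    if entered == correct:
        return True, 0                      # success resets the counter
    failures += 1
    if failures >= WIPE_THRESHOLD:
        raise RuntimeError("device wiped: encryption keys destroyed")
    time.sleep(DELAY_MINUTES.get(failures, 0) * 60)  # escalating delay
    return False, failures
```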

Later models (the 5s and newer) that are equipped with Touch ID enforce these protections in a dedicated hardware component called the Secure Enclave. Because the protections on the 5c live in iOS itself, the FBI is asking Apple to remove the check for invalid lock codes, remove the escalating delays between attempts, and allow lock codes to be entered from a source other than the on-screen keyboard. This means the FBI could use a computer to try PIN codes until it hit the right one. If you try every single possible PIN (a brute force attack), you WILL eventually find the right one.
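
The arithmetic shows why those three changes matter so much. Here is a back-of-the-envelope sketch, assuming the roughly 80-millisecond-per-attempt key-derivation cost Apple describes in its iOS security documentation and assuming the retry limit and delays have been removed:

```python
# Rough brute-force timing, assuming ~80 ms of key-derivation work per
# attempt (a figure from Apple's iOS security white paper) once the
# retry limit and escalating delays are gone.
ATTEMPT_SECONDS = 0.080

for digits in (4, 6):
    combos = 10 ** digits                    # e.g. 0000-9999 for 4 digits
    worst_case_min = combos * ATTEMPT_SECONDS / 60
    print(f"{digits}-digit PIN: {combos:,} codes, "
          f"worst case about {worst_case_min:,.0f} minutes")

# A 4-digit PIN falls in roughly 13 minutes at worst;
# a 6-digit PIN in under a day.
```

In other words, it is the ten-attempt wipe and the escalating delays, not the encryption math itself, that stand between the FBI and the data.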

One of the key questions is whether the FBI is asking Apple to build a “backdoor” into iOS. Technically, I wouldn’t call it a backdoor, since the request is for a custom, modified version of iOS that can be installed only on the specific device used by Farook. The obvious concern is that the custom code could be reverse engineered to remove that one-device restriction, thereby creating a backdoor that could be installed on any iPhone 5c. But since the term “backdoor” has captured the public’s fancy, we see it misused a lot. Even Apple representatives have used the term, presumably surrendering to its public adoption.
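
To see why the one-device restriction is so fragile, consider a toy sketch. This is our illustration, not Apple’s actual mechanism, and the identifier and helper function are hypothetical:

```python
# Toy illustration of a one-device restriction: the modified firmware
# refuses to run unless the hardware identifier matches the single
# authorized device.
AUTHORIZED_DEVICE_ID = "0x1A2B3C4D5E6F"  # hypothetical identifier

def disable_brute_force_protections() -> None:
    # Stand-in for the requested changes: no retry limit, no delays,
    # and electronic passcode entry.
    print("brute-force protections disabled")

def boot_custom_firmware(device_id: str) -> None:
    # The entire one-device restriction is this single comparison.
    if device_id != AUTHORIZED_DEVICE_ID:
        raise PermissionError("firmware not authorized for this device")
    disable_brute_force_protections()
```

Patch that one comparison out of the compiled code and the same tool works on any iPhone 5c; that is the reverse-engineering worry in a nutshell.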

It is also true that the degraded operating system could be leaked from Apple. Such things happen.

On the legal side, there are a number of troubling legal issues impacting both sides.

Farook was not the phone’s owner. Therefore, Apple is defying not only the Justice Department but also the wishes of the iPhone’s owner, the San Bernardino County Department of Public Health, which issued it to Syed Farook to use at work. Can you elevate the privacy interests of a phone’s user over the wishes of its owner? That could cause some companies to cease purchasing iPhones for employees’ use.

In the financial industry, letting yourself be locked out of employees’ communications is a violation of federal law. Since 2007, financial industry regulators have made clear that “FINRA expects a firm to have supervisory policies and procedures to monitor all electronic communications technology used by the firm and its associated persons to conduct the firm’s business.” In 2014, financial institutions were fined under this policy for failing to capture all of their employees’ text messages. Apple’s current legal stance may be inconsistent with this requirement.

What the court order requires of Apple is very narrow. But can the government, under the All Writs Act of 1789 (it seriously smacks of desperation to stretch such an ancient law to cover digital-age issues), get a court order commanding a company to create a degraded operating system, one that does not currently exist, to help circumvent privacy protections and security?

If that software were reverse engineered or otherwise got “into the wild,” the security of any iPhone 5c would be compromised.

The All Writs Act was written early in our country’s history, when we had a fairly small body of law. It says that federal courts may issue “all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” Read as a whole, the Act says that judges can tell people to follow the law, but they have to do so in a way that itself respects the law. There are qualifications there: warnings about the writs having to be “appropriate” and “agreeable,” not just to the law but to the law’s “principles.” The government is, unsurprisingly, ignoring those cautions. If the government can tell Apple, which itself is accused of no wrongdoing, to write a custom operating system for it, what else could it do?

The software Apple is being asked to develop would not help law enforcement with the iPhone 5s and later phones. We note that Apple CEO Tim Cook has made several misleading statements on this point, making the threat to privacy appear much bigger than it is in this particular case. But would upholding this court order set a precedent for demanding similar help with later versions of the iPhone?

For more than a year, the government has been trying to use the All Writs Act to unlock encrypted phones. Those efforts didn’t generate the publicity of the recent showdown, but they are worth noting.

Likewise, we only recently found out about a secret White House meeting around Thanksgiving of last year, in which senior national security officials devised a National Security Council “decision memo” tasking government agencies with developing encryption workarounds, estimating additional budgets, and identifying laws that might need to be changed. The goal: counter what FBI Director James Comey calls the “going dark” problem, investigators being unable to access the contents of encrypted data stored on mobile devices or traveling across the Internet. Apparently, the public signs of rapprochement with Silicon Valley were, at least in part, window dressing.

The Justice Department has said that Apple’s refusal to help “appears to be based on its concern for its business model and public brand marketing strategy” rather than a legal rationale. That’s both true and a crock. It is true that Apple has been moving steadily toward making privacy the core of its corporate ethos. That is good marketing (around the world), but it is also in line with constitutional privacy protections. And why would the government make our technology companies less competitive by forcing them to ship less secure products? Why hurt the American economy and erode privacy rights at the same time?

And if Apple loses, do we think countries like Russia and China won’t demand backdoors as the price of selling Apple products within their borders? China already tried to get backdoors, and then backed down. But let’s not imagine that the Chinese aren’t watching developments in this case closely. You may recall that RIM (now BlackBerry) agreed to a backdoor in order to sell its phones in India. A dangerous precedent.

Apple has powerful allies in this battle, including Google, FireEye, Symantec, Microsoft, the Electronic Frontier Foundation and the Center for Democracy & Technology.

The truth is that virtually all cybersecurity experts agree that backdoors (by whatever name) degrade everyone’s security. We hope the courts won’t take a step in that direction.

This post originally appeared on Ride the Lightning.

Sharon D. Nelson
E-mail: snelson@senseient.com   Phone: 703-359-0700
Digital Forensics/Information Security/Information Technology
http://www.senseient.com
http://twitter.com/sharonnelsonesq
www.linkedin.com/in/sharondnelson