By Robert Ackerman, Founder and Managing Director of Allegis Capital
As anybody familiar with the computer industry knows, the FBI wants Apple to break security protections on an iPhone linked to the deadly San Bernardino terrorist attacks — and a U.S. district court has ordered Apple to do just that.
Apple is fighting the decision for good reason. If it obeyed the court, the security of the iPhone could be compromised, helping to set in motion a trend that would materially undermine the effectiveness of cybersecurity in every conceivable venue.
Many law enforcement officials do not agree with this view; they believe encryption already allows far too many criminals to go scot-free. But why lean on Apple to crack a phone they may be able to crack themselves?
Authorities might be able to accomplish this, for example, by using a very precise atomic saw to cut through the outer structures of the A6 microprocessor inside the phone, according to a recent story in The Wall Street Journal. They could target the portion of the chip that holds the device's unique hardware key (the UID key). Then they could move the iPhone's scrambled data to another computer and unlock it by brute-force guessing the passcode of San Bernardino killer Syed Rizwan Farook.
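Why does extracting the UID key matter? Because the phone's data key is derived from the passcode entangled with that hardware key, and the phone itself enforces retry limits. With the UID key in hand, the guessing can happen on outside hardware with no limits. Here is a minimal sketch of that offline attack; the key-derivation function, iteration count, and names are hypothetical stand-ins, not Apple's actual scheme.

```python
import hashlib

def derive_key(passcode, uid_key):
    """Derive a data-encryption key from the passcode entangled with the
    device's hardware UID key (illustrative parameters, not Apple's)."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid_key, 1000)

def brute_force(target_key, uid_key):
    """Try every 4-digit passcode. Off the phone, nothing throttles the
    attempts or wipes the data after ten failures."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if derive_key(guess, uid_key) == target_key:
            return guess
    return None

uid_key = b"device-unique-hardware-key"  # what chip extraction would expose
real_key = derive_key("1234", uid_key)   # the key protecting the scrambled data
print(brute_force(real_key, uid_key))    # recovers "1234"
```

The point of the sketch is the asymmetry: without the UID key, every guess must run on the phone under its retry limits; with it, the entire 4-digit space falls in seconds.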
It is true that this tactic would be risky and very expensive. And if anything goes wrong during the process, the data could be lost forever. But why is this a greater risk than forcing Apple to comply with authorities and possibly provide the essence of a “golden key” to unveil encrypted communications to help catch criminals and terrorists?
The authorities always sidestep one extremely important detail — in the domain of cybersecurity and encryption, the bad guys are just as smart as the good guys. Exploiting vulnerabilities is their expertise. If there is a back door, they will find it, exploit it and seize valuable personal data. And how can we trust government entities, which are regularly breached, to keep such a golden key safe from criminals?
Data is the target of the vast majority of breaches, whatever their stripe. Encryption is the last line of data defense, the one relied on to protect data more than 99.999 percent of the time. If encryption is penetrated, the cornerstone of defense disappears and the stage is set for even more hacking mayhem.
Two fundamental issues are at play in the Apple-FBI brouhaha. One is the Fourth Amendment of the Constitution, which protects against unreasonable searches and seizures. Isn't this the point of encryption? The second issue is whether a back door would, in fact, improve the effectiveness of the FBI and other law enforcement agencies. FBI Director James Comey has suggested that police would have been able to track down the shooter of an Illinois man last year but for encryption built into the victim's two phones. What he failed to mention was that one of the phones, a Samsung Galaxy S6, wasn't encrypted by default.
Let's return to the specifics of the dispute. For most iPhones, the greatest danger is posed by criminals. If thieves can break into these phones, victims can easily be exposed to identity theft and perhaps even extortion. This is one of the main reasons Apple designed stronger encryption, starting with iOS 8. Any software that bypasses those protections could materially hurt iPhone users.
It's true that the FBI's proposed system for Apple has protections to ensure its passcode hack can't be used by anyone else. Apple must sign firmware updates before a given iPhone will accept them, and the FBI's proposed update would be coded to an individual phone. The software wouldn't install unless the phone's serial number matched the serial number in the code. The method proposed by the FBI is also specific to the iPhone 5c, the model Farook used. While the 5c doesn't have the Secure Enclave that ties lock screen protections to hardware in newer iPhones, it's highly likely that the FBI would request similar methods for cracking Enclave-equipped phones if it is successful in its current dispute with Apple.
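The two checks just described, Apple's signature plus a per-device serial binding, can be sketched as follows. This is a simplification with hypothetical names: Apple actually uses asymmetric cryptography (the device holds only a verification key, not the signing secret), but the logic of binding a signed image to one serial number is the same.

```python
import hashlib
import hmac

# Stand-in for Apple's signing secret (in reality, an asymmetric private key
# that never leaves Apple; the phone would hold only the public half).
APPLE_SIGNING_KEY = b"apple-private-signing-key"

def sign_update(firmware, serial):
    """Apple signs the firmware together with the target phone's serial,
    so the signature is only valid for that one device."""
    return hmac.new(APPLE_SIGNING_KEY, serial.encode() + firmware,
                    hashlib.sha256).digest()

def phone_accepts(firmware, signature, phone_serial):
    """The phone verifies the signature over its OWN serial number; a valid
    signature issued for a different phone fails the check."""
    expected = hmac.new(APPLE_SIGNING_KEY, phone_serial.encode() + firmware,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

fw = b"passcode-limit-removal-image"
sig = sign_update(fw, "SERIAL-FAROOK")
print(phone_accepts(fw, sig, "SERIAL-FAROOK"))  # True: the targeted phone
print(phone_accepts(fw, sig, "SERIAL-OTHER"))   # False: any other phone refuses
```

Note what the sketch also implies: strip the serial check out of a leaked copy of the code, and the only remaining barrier is the signature itself, which is exactly the concern raised below.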
Even though it couldn't be used to unlock other phones directly, the software proposed by the FBI could still be useful to thieves. If the code fell into the wrong hands, it could potentially be reverse-engineered into a generic version by removing the code that ties the attack to a specific phone.
This reverse-engineered version would still need Apple's signature before it could be installed, something thieves, of course, are unlikely to have. The fundamental point, however, is that the signature system would be the only thing protecting a stolen iPhone and the information inside it. By itself, this is a huge problem. New vulnerabilities pop up in software all the time, and no single system is ever considered entirely impenetrable. An undisclosed vulnerability could be used in a way that Apple and the FBI can't predict.
The law enforcement and intelligence communities do important work, and new technology has made their jobs tougher. But the answer is not lowering standards for protecting data. The right answer is to work on new approaches to identifying the bad guys. Innovation, not compromised security, is the solution.
Robert Ackerman Jr. is founder and managing director of Allegis Capital, a Palo Alto, CA-based early stage venture capital firm specializing in cybersecurity. Allegis Capital’s cybersecurity investments include Shape Security, vArmour, Red Owl, Synack, E8 Security and Area 1.