The terrorist attack at the Pensacola Naval Air Station in December has reignited a bitter debate over digital technology: Are cellphones and other devices a threat to public safety and national security if the government has no way to get past their digital security?
On Monday, US Attorney General William Barr complained that smartphone giant Apple had failed to help federal investigators break the encrypted passwords of two iPhones belonging to Mohammed Saeed Alshamrani, the Saudi Air Force cadet who murdered three people and wounded eight others in a Dec. 6 mass shooting.
"It is very important to know with whom and about what the shooter was communicating before he died," Barr said. But he said that Apple "has not given us any substantive assistance," and added that "this situation perfectly illustrates why it is critical that investigators be able to get access to digital evidence once they have obtained a court order based on probable cause."
Barr’s complaint is an echo of the government’s confrontation with Apple following the 2015 mass shooting in San Bernardino, Calif., when a husband and wife murdered 14 people during a shooting spree. In that case, the Justice Department won a court order seeking to force Apple to open the iPhone of one of the shooters. Apple resisted, but a courtroom showdown was averted when the FBI obtained phone-cracking technology from a cyber-security company.
Despite that technological breakthrough for investigators, Apple phones have apparently gotten even harder to crack. Even older iPhones are tougher than they used to be, thanks to upgrades to the operating system software. Joe Caruso, chief technology officer of Global Digital Forensics in Boston, said that his company breaks into thousands of phones every year, to assist in civil and criminal investigations, using tools that work well on phones with relatively simple passwords.
But newer iPhone software uses six-digit passwords by default and can be set to use much longer ones. As a result, “there are some iPhones that we cannot get into,” Caruso said.
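The difference a longer passcode makes is a matter of simple arithmetic. The sketch below is illustrative only (the character-set assumptions are mine, not Apple's): it counts the number of possible passcodes for the lengths mentioned above.

```python
# Illustrative arithmetic, not Apple's implementation: the size of the
# search space a phone-cracking tool would face for numeric passcodes.
for digits in (4, 6):
    combinations = 10 ** digits
    print(f"{digits}-digit numeric passcode: {combinations:,} combinations")

# A longer alphanumeric passcode grows the space dramatically. Assuming
# 62 possible characters per position (a-z, A-Z, 0-9):
alphanumeric_8 = 62 ** 8
print(f"8-character alphanumeric passcode: {alphanumeric_8:,} combinations")
```

A four-digit code has only 10,000 possibilities, trivially searchable by automated tools; the six-digit default raises that to a million, and alphanumeric passcodes push the space into the hundreds of trillions.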
In the current case, investigators recovered an iPhone 5 and an iPhone 7 after Alshamrani was killed by police. FBI investigators want to read files stored on the phones to determine whether the killer acted alone or had accomplices with whom he communicated.
The iPhone 7 comes with a fingerprint lock. It’s not known if Alshamrani used it, but in any case that may be a moot question: Most fingerprint readers on phones rely on a tiny amount of electrical current flowing through the finger, Anil Jain, a professor of computer science and engineering at Michigan State University and an expert on fingerprint technology, told the publication Live Science. And that flow stops once the person is dead.
There’s still the old-fashioned password, of course. But an iPhone is designed so that unlocking it without the correct password is almost impossible. Importantly, an iPhone can be set so that all data on the phone are wiped out after 10 failed attempts to enter a password. To avoid that risk, the FBI sought help from Apple in bypassing the phone’s security system.
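The wipe-after-10-failures behavior described above is what makes brute-force guessing so risky for investigators. A minimal sketch of that policy, with hypothetical names and logic (this is not Apple's actual firmware), might look like:

```python
# A minimal, hypothetical sketch of an "erase after 10 failed attempts"
# lock policy. Names and logic are illustrative, not Apple's code.

MAX_ATTEMPTS = 10

class LockedDevice:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False  # True once user data has been destroyed

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # nothing left to unlock
        if guess == self._passcode:
            self._failures = 0  # correct guess resets the counter
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True  # all user data erased
        return False

device = LockedDevice("482916")
for attempt in range(10):
    device.try_unlock("000000")  # ten wrong guesses
print(device.wiped)  # the device has now erased itself
```

In a real device the counter lives in dedicated security hardware, so a tool cannot simply reset it, which is why the FBI sought Apple's help rather than guessing.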
In a statement, Apple said it supplied the FBI with “many gigabytes of information” related to the phones, including the owner’s account information and data backups stored in Apple’s Internet-based iCloud service. The company also said its engineers have worked closely with the FBI to provide technical assistance in investigating the case.
But the company also defended its longstanding policy of refusing to design its phones with “backdoors” that would allow Apple or the police to gain access to a phone without its owner’s permission. The reason: Any such backdoor would sooner or later be exploited by criminals.
“We have always maintained there is no such thing as a backdoor just for the good guys,” the Apple statement read. “Backdoors can also be exploited by those who threaten our national security and the data security of our customers.”
In addition, even if Apple could grant access to the phone, there's nothing to stop a user from installing an app that encrypts data with its own non-Apple method. Such apps are available from foreign companies beyond the reach of US law.
But Stewart Baker, former general counsel at the National Security Agency, said that Apple and other companies should build backdoors for police.
“Tim Cook talks a lot about social responsibility, and his company has a responsibility to help the government catch terrorists who use its products,” Baker said, referring to Apple’s chief executive. “Instead, the company has been spending enormous resources on making it harder and harder for the government to do that.”
A version of this argument has been underway for decades. When the first robust encryption programs became available for personal computers and telephones in the early 1990s, law enforcement and intelligence agencies warned that the technology would let criminals and spies communicate in complete secrecy, and sought tough limits on its use.
Since then, encryption technology has become a vital tool for law-abiding citizens and businesses. For instance, online banking and shopping would be unthinkable without encryption systems such as SSL, which enables secure transfers of money. Over time, strong encryption became a common feature in everyday gadgets.
But now that criminals routinely use encrypted devices, the controversy is becoming more intense. Apple shows no signs of changing its position, and the government may again seek a court order to force the company’s hand.
Meanwhile, in Congress, members of both parties have expressed support for legislation to make backdoors mandatory in encrypted consumer devices.
And in a speech in July at Fordham University, Barr said that “while we remain open to a cooperative approach, the time to achieve that may be limited.”