Researchers at Johns Hopkins University have found a security flaw in Apple software that allowed them to pierce the company’s encryption, The Washington Post reports.
The flaw would likely not have allowed investigators to break into the iPhone of San Bernardino shooter Syed Rizwan Farook, but it undermines the idea that Apple has created impenetrable encryption, said research team leader Matthew Green.
Tech experts like those at the National Security Agency could easily have discovered the hole, Green said.
“If you put resources into it, you will come across something like this,” he told the Post.
The discovery comes the day before the first court hearing in the government’s bid to force Apple to unlock Farook’s phone. The case is considered a proxy for a larger dispute over whether companies should be permitted to build encryption that can’t be accessed by the manufacturer under court order.
Law enforcement officials have warned that such “warrant-proof” encryption shields criminals and terrorists from investigation. Apple, which like other companies has been steadily shoring up its security, has become the poster child for such warnings.
Technologists argue that forcing Apple to deliberately build a vulnerability into its product only opens the doors for criminals to exploit it, endangering all users of the Internet.
“Even Apple, with all their skills — and they have terrific cryptographers — wasn’t able to quite get this right,” Green said. “So it scares me that we’re having this conversation about adding back doors to encryption when we can’t even get basic encryption right.”
Green’s team has alerted Apple to the flaw and will publish a paper describing how they hacked in as soon as Apple issues a patch. The company is expected to address the problem in version 9.3 of its operating software, which will be released Monday.
Using an older version of Apple’s operating system, Green’s team wrote software to mimic an Apple server. Then they targeted a message with a link to a photo stored in Apple’s iCloud server, which was encrypted with a 64-digit key.
The team guessed the code by repeatedly changing a single digit or letter in the key and sending it to the target phone. The phone would accept a correctly guessed digit, and thousands of tries later, Green said, “we had the key.”
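The per-character acceptance Green describes works as a guessing oracle: because the phone confirms each correct character, the attacker can recover the key one position at a time instead of searching the whole keyspace. A minimal toy model of that behavior (the key, alphabet, and `phone_accepts` oracle here are illustrative stand-ins, not Apple's actual protocol):

```python
import secrets
import string

# Digits and letters, per the article's description of the 64-character key.
ALPHABET = string.digits + string.ascii_lowercase

# Toy stand-in for the key protecting the iCloud photo attachment.
SECRET_KEY = "".join(secrets.choice(ALPHABET) for _ in range(64))

def phone_accepts(candidate: str) -> bool:
    """Toy oracle modeling the reported flaw: the target phone accepts
    a message whose key prefix is correct, leaking one character of the
    key per successful guess."""
    return SECRET_KEY.startswith(candidate)

def recover_key() -> tuple[str, int]:
    """Guess the key one character at a time, counting oracle queries."""
    recovered, tries = "", 0
    for _ in range(64):
        for ch in ALPHABET:
            tries += 1
            if phone_accepts(recovered + ch):
                recovered += ch
                break
    return recovered, tries

key, tries = recover_key()
print(f"recovered 64-char key in {tries} queries")
```

With a 36-character alphabet, this needs at most 64 × 36 = 2,304 guesses, consistent with the "thousands of tries" in the account, versus the 36^64 attempts a blind brute force would require.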
The same technique would work on later operating systems, Green said, but it would likely require the more sophisticated hacking skills of a nation state.
Law enforcement could use the same technique on an unpatched phone to acquire content sent via iMessage in a criminal or terrorist investigation, he said.
FBI Director James Comey told lawmakers this month that investigators had unsuccessfully sought help from intelligence agencies to unlock Farook’s phone and were ultimately stymied.
“We don’t have the capabilities that people sometimes on TV imagine us to have,” he said.
But some experts have suggested that the solution to the encryption debate is for authorities to simply get better at legally hacking into devices.
“We’re in this situation where I think law enforcement needs to really develop those skills up by themselves,” Dr. Susan Landau, a professor of cybersecurity policy at Worcester Polytechnic Institute, told the House Judiciary Committee earlier this month.
“It’s thinking about the right way for law enforcement to develop those capabilities, the right level of funding. The funding is well below what it should be but they also don’t have the skills,” she said.