Apple could bypass iPhone security, experts say - but won't
NEW YORK — Faced with a federal judge's order to help investigators break into an iPhone used by one of the San Bernardino, California, shooters, Apple may well argue that the request places an unreasonable burden on the company.
In fact, experts say that complying with the government's request wouldn't be particularly challenging for Apple. But doing so might set a dangerous precedent that could threaten the data security of the millions of iPhone users around the world.
The phone in question was used by Syed Farook, who along with his wife, Tashfeen Malik, killed 14 people in a December attack. Investigators don't know if the phone contains important evidence about the attack or the couple's communications — and because its contents are encrypted, they won't unless they can get the passcode to unlock it. The phone was issued by Farook's employer, the county of San Bernardino.
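Why does the passcode matter so much? The phone's data is encrypted with a key derived in part from that passcode, so there is no way to read the contents without it. The snippet below is a simplified sketch of that idea, not Apple's actual design; the PBKDF2 function, iteration count and per-device salt are illustrative assumptions (Apple's real scheme also mixes in a key fused into the phone's hardware, so the derivation can only run on the device itself).

```python
import hashlib
import os

# Illustrative only: a simplified model of passcode-based encryption, not
# Apple's actual design. The real iPhone also ties the key to the device's
# hardware, so this derivation could only be performed on the phone.

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    # Stretch the short passcode into a 256-bit encryption key; the high
    # iteration count makes every single guess deliberately slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100_000)

device_salt = os.urandom(16)            # hypothetical per-device value
key = derive_key("1234", device_salt)   # only the right passcode yields this key
print(key.hex())                        # without it, the data stays ciphertext
```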
Investigators can't just try random passcodes until they hit on the right one, either. An Apple security feature has apparently been enabled on the phone — a sort of self-destruct option that renders the phone's data unreadable after 10 incorrect passcode attempts.
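In rough terms, that protection works something like the sketch below; the names and structure are illustrative stand-ins, not Apple's code. The key point is that after the tenth wrong guess, the key needed to decrypt the data is destroyed, leaving the contents permanently unreadable.

```python
MAX_ATTEMPTS = 10  # the limit described above

class LockedPhone:
    """Toy model of a passcode lock with an auto-wipe, not Apple's code."""

    def __init__(self, correct_passcode: str):
        self._passcode = correct_passcode
        self._failed = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            return False              # decryption key already destroyed
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self._wiped = True        # erase the key; data is now unreadable
        return False

phone = LockedPhone("7291")
phone.try_unlock("0000")              # wrong guess No. 1
phone.try_unlock("1111")              # wrong guess No. 2; eight more and it wipes
```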
The judge's order requires Apple to create a unique software package — one Apple CEO Tim Cook described as "a new version of the iPhone operating system" — that would allow investigators to bypass the self-destruct system. The same software would also let the government enter passcodes electronically, eliminating both the tedium of manual entry and the enforced delays the iPhone system imposes after a few wrong guesses.
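Those two changes matter because a typical four-digit passcode has only 10,000 possibilities. The sketch below shows how quickly such a code falls to an automated search once the wipe and the delays are out of the way; `try_passcode` is a hypothetical stand-in for whatever check the requested software would expose, not a real Apple interface.

```python
from itertools import product

# Why the safeguards matter: a four-digit passcode has only 10,000
# possibilities, so an unthrottled search exhausts them quickly.

def brute_force(try_passcode, length: int = 4):
    for digits in product("0123456789", repeat=length):
        guess = "".join(digits)
        if try_passcode(guess):       # stops at the correct code
            return guess
    return None

# Demonstration against a dummy checker standing in for the locked phone.
secret = "7291"
print(brute_force(lambda guess: guess == secret))  # finds "7291" within 10,000 tries
```

Even at a modest rate of a few guesses per second, exhausting 10,000 combinations takes minutes; the auto-wipe and the enforced delays are what turn a short passcode into a real barrier.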
Apple opposes the order, arguing that such software would amount to a security "backdoor" that would ultimately make iPhone users across the globe more vulnerable to information or identity theft. Both the ACLU and the Electronic Frontier Foundation have pledged to support Apple, saying that the government's request endangers security and privacy.
From a technical perspective, making such software shouldn't be difficult for Apple, experts say. But once created, it would be nearly impossible to contain, says Ajay Arora, CEO and co-founder of Vera, a startup that provides companies with encryption services.
"Imagine if that got into the wrong hands," he says. "What they're asking for is a God key — and once you get that, there's no going back."
The demands being made of Apple border on the bizarre, says Lee Tien, a staff attorney for the Electronic Frontier Foundation, a digital rights group. "Asking a technology company to make its security less secure is a crazy, stupid thing to do," he says. "It's like asking water not to be wet."
The government's best bet may be to argue that its request doesn't actually create a backdoor, even if that's how Apple characterizes it, says Robert Cattanach, a former Justice Department attorney. But Apple is probably right to worry that a government win in this case would lead to broader requests down the road.
"If the court rules in favor of the government, then I think the stage has been set for the next step, which is, 'Thanks for removing the auto-wipe. Now you need to help us defeat the code'," Cattanach says. "If you're the government, you're going to ask for that."