When a US judge ruled that Apple must help the FBI break into an iPhone belonging to one of the killers in the San Bernardino shootings, the tech world shuddered.
Why? The battle over encryption "backdoors" is a longstanding one in Silicon Valley, where a company's success can hinge on its ability to protect customer data.
The issue came into the spotlight after Edward Snowden disclosed the extent to which technology and phone companies were letting the US government spy on data being transmitted through their networks.
Since Snowden's revelations, Facebook, Apple and Twitter have each said they will not create such backdoors.
So here's the "backdoor" the FBI wants: iPhone users can enable a security feature that allows only a limited number of attempts to guess the correct passcode before all the data on the phone is deleted.
It's a security measure Apple put in place to keep important data out of the wrong hands.
Federal prosecutors investigating the San Bernardino shootings don't know the phone's passcode. If they guess incorrectly too many times, the data they hope to find will be deleted.
That's why the FBI wants Apple to disable the security feature. Once the feature is crippled, agents could try as many passcode combinations as they need.
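The arithmetic behind that request is simple. Here is a minimal toy model (not Apple's actual implementation; the class, passcode and try limit are all invented for illustration) showing why the wipe-after-N-misses feature defeats brute-force guessing, and why removing it does not:

```python
# Hypothetical sketch: a phone that wipes itself after too many wrong
# passcode guesses, versus the same phone with the limit disabled.

class PasscodeLock:
    """Toy model of a passcode lock with an optional erase-on-failure limit."""

    def __init__(self, passcode, max_tries=10):
        self._passcode = passcode
        self._max_tries = max_tries  # None = limit disabled (what the FBI wants)
        self._failed = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self._failed += 1
        if self._max_tries is not None and self._failed >= self._max_tries:
            self.wiped = True  # data destroyed after too many misses
        return False


def brute_force(lock):
    """Try every 4-digit passcode in order; return it if found, else None."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if lock.try_unlock(guess):
            return guess
        if lock.wiped:
            return None  # the data is gone; further guessing is pointless
    return None


# With the wipe enabled, brute force destroys the data within 10 guesses.
protected = PasscodeLock("7319", max_tries=10)
print(brute_force(protected))   # None

# With the limit disabled, all 10,000 combinations can be tried safely.
crippled = PasscodeLock("7319", max_tries=None)
print(brute_force(crippled))    # 7319
```

A four-digit passcode has only 10,000 possibilities, so once the retry limit is gone, exhausting them is trivial; the wipe feature is what makes even a short passcode a real barrier.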
Kurt Opsahl, a lawyer for the Electronic Frontier Foundation, a digital rights non-profit, explained that this "backdoor" means Apple will have to write brand new code that compromises key features of the phone's security.
Apple has five business days to respond to the request.
What does Apple have to say about this? Apple CEO Tim Cook said that the company would oppose the ruling.
In a message to customers published on Apple's website, he said: "We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data."
Back in December, in a wide-ranging interview with 60 Minutes, Cook defended the company's use of encryption on its mobile devices, saying users should not have to trade privacy for national security.
In the interview, Cook stood by the company's refusal to give investigators access to users' encrypted texts and messages.
What does this mean for the next time the government wants access? The order doesn't create a precedent in the sense that other courts will be compelled to follow it, but it will give the government more ammunition.
What do digital rights experts have to say?
There are two things that make this order very dangerous, Opsahl said. The first is the question it raises about who can make this type of demand. If the US government can force Apple to do this, why can't the Chinese or Russian governments?
The second is that while the government is requesting a program to allow it to break into this one, specific iPhone, once the program is created it will essentially be a master key.
It would be possible for the government to take this key, modify it and use it on other phones. Trusting that the government will have this power and never misuse it, he said, is asking a lot.
Source: MCT