I seem to remember the FBI attempting to compel Apple to decrypt a criminal's iPhone, only for Apple to refuse and claim that it wasn't possible. I'm not sure exactly what happened after that. I think it was suspected that the NSA was able to do it by exploiting an unpatched zero-day, so they didn't need Apple's help anymore and the issue dropped out of the public eye.
There are a couple of overlapping things here:
1. Apple can and does comply with subpoenas for user information that it has access to. This includes tons of data from your phone unless you're enrolled in Advanced Data Protection, because Apple stores your data encrypted at rest but retains the ability to decrypt it, so that users who lose their device or credentials can still restore their data (see the sketch after this list).
2. Apple has refused on multiple occasions, publicly, to take advantage of their position in the supply chain to insert malicious code that expands the data they have access to. This would be things like shipping an updated iOS that lets them fetch end-to-end encrypted data off of a suspect's device.
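A minimal sketch of the distinction in item 1, in Python using the `cryptography` package. The key names and data are hypothetical, and this is not Apple's actual scheme, only the key-custody model:

```python
# Illustrative only: whoever holds the decryption key is what a subpoena can reach.
from cryptography.fernet import Fernet

# Standard iCloud model: data is encrypted at rest, but the provider holds
# the key, so it can recover a locked-out user's backup -- and answer a subpoena.
provider_key = Fernet.generate_key()          # lives on the provider's servers
backup = Fernet(provider_key).encrypt(b"photos, messages, notes")
print(Fernet(provider_key).decrypt(backup))   # provider can always do this

# Advanced Data Protection model: the key exists only on the user's devices.
device_key = Fernet.generate_key()            # never leaves user hardware
protected = Fernet(device_key).encrypt(b"photos, messages, notes")
# Served a warrant, the provider can hand over `protected`, but without
# `device_key` it's opaque ciphertext -- and a lost device means lost data.
```

The recovery trade-off in the comments above falls directly out of this: whichever party can decrypt for a forgetful user can also decrypt for a warrant.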
> Apple can and does comply with subpoenas for user information that it has access to.
When we're talking about data stored on a company's servers, the company has no choice once it's served a valid warrant.
That's why Apple went all in on the concept of keeping sensitive data off their servers as much as possible.
For instance, Apple Maps never stores the driving routes you take on Apple's servers; it remembers them only on your device.
Not to mention, while Apple will publicly deny it, there are government agents working undercover at every major tech firm. Apple may or may not know who they are, but they certainly exist.
> remember the FBI attempting to compel Apple to decrypt a criminal's iPhone, only for Apple to refuse and claim that it wasn't possible
Apple refused “to write new software that would let the government bypass these devices' security and unlock” suspects’ phones [1].
> not sure exactly what happened after that
Cupertino got a lot of vitriol and limited support for its efforts.
[1] https://en.wikipedia.org/wiki/Apple%E2%80%93FBI_encryption_d...
I always assume these public performances are merely performances and that no one hears about the actual dirty work.
And of course Apple is quite right not to miss the marketing opportunity, on behalf of the shareholders. While acquiescing to lawful demands of course.
I don't remember Apple ever saying that it was impossible for them to do it, just that they didn't want to.
It was always kind of assumed that they could, e.g. by signing a malicious OS update without PIN retry limits so the FBI could brute-force it at their leisure, or something similar.
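A toy model of why that would work, assuming a plain 4-digit PIN checked against a hash (real iPhones entangle the PIN with a hardware key inside the Secure Enclave, so this shows only the arithmetic, not the actual mechanism):

```python
import hashlib
import time

def unlock_attempt(pin: str, stored_hash: bytes) -> bool:
    # Stand-in for the device's PIN check; hypothetical, not iOS internals.
    return hashlib.sha256(pin.encode()).digest() == stored_hash

stored = hashlib.sha256(b"7391").digest()  # the "suspect's" 4-digit PIN

start = time.time()
for n in range(10_000):
    candidate = f"{n:04d}"
    if unlock_attempt(candidate, stored):
        print(f"found {candidate} in {time.time() - start:.3f}s")
        break
# With no retry limit, all 10,000 possibilities fall in a fraction of a
# second. With iOS's 10-attempts-then-wipe policy, a guesser gets a 0.1%
# chance overall -- which is why the FBI wanted Apple to sign firmware with
# the limit removed, rather than just guessing.
```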
They said it was impossible for them to build a backdoor into iOS that would only be accessible to legal requests from law enforcement, which is true in the strict sense. So law enforcement bought a vulnerability exploit from a third party.
> they could, by eg signing a malicious OS update
They successfully argued in court that being forced to insert code the government wanted would be equivalent to compelled speech, in violation of the First Amendment.
As the Feds often do, they dropped the case instead of allowing it to set a precedent they didn't want.
> They successfully argued in court that being forced to insert code the government wanted would be equivalent to compelled speech
This isn't true; they never "successfully argued in court". There was never any judgment, and no precedent. They resisted a court order briefly before the FBI withdrew the request after finding another way into the device.
There wasn't a judgment because the Feds dropped a case that would have set a precedent they wanted to avoid.
Since there is longstanding legal precedent that corporations are people and that code is speech, forcing a corporation to insert code the US government demands would be a violation of the First Amendment.
https://en.wikipedia.org/wiki/Apple%E2%80%93FBI_encryption_d...
That was a show put on solely for the public to see.
If you follow the things that get disclosed/leaked/confirmed once they're 20+ years out of date, then yes, the probability that this is true is significant.
I recall there being a little more substance to it at the time. But looking back from where we are now, that is a succinct way of describing its results.
Cellebrite did the job using a vulnerability.
That being JTAG debugging. Now there are greyhat groups discovering what they can do with it beyond bypassing the PIN at power-up. I'm honestly surprised phones aren't being sold/marketed as having that disabled on both Bluetooth and USB.