How the FBI is trying to get to the Core of Apple's Security
A few weeks ago we wrote about the Microsoft Ireland case, where a US judge had ordered Microsoft to hand over the email contents of a suspected drug trafficker. The emails were stored in Microsoft’s Dublin data centre, but Microsoft refused to hand them over on the grounds that doing so would set a dangerous precedent and would compromise international data protection laws.
Now it’s Apple’s turn to be in the spotlight for refusing to comply with the US Department of Justice, although this case has some fundamental differences.
San Bernardino, California, 2 December 2015: the San Bernardino Public Health Department are holding a joint training event and Christmas party when employee Syed Rizwan Farook and his wife, Tashfeen Malik, murder 14 people and wound a further 22 in a mass shooting and attempted bombing. It is declared a terrorist attack and the FBI promptly open a counter-terrorism investigation.
After fleeing the scene, Farook and Malik are pursued by police, and both are killed in the ensuing shootout.
As part of their investigation, the FBI claim that crucial information about who Farook was communicating with before the attack is stored on his phone – an iPhone 5c running iOS 9.
Since iOS 8, Apple’s mobile operating system has encrypted the phone’s data by default, and the only way to decrypt it is with the user’s passcode. This was a conscious decision Apple made back in 2014, precisely so that they themselves would never hold their customers’ passcodes.
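To illustrate the general principle, here’s a minimal sketch of passcode-based encryption. This is not Apple’s actual implementation – Apple’s design additionally entangles the passcode with a unique key fused into the device’s hardware, so guesses can’t even be made off-device – but it shows why the vendor never needs to know the passcode:

```python
import hashlib
import os

# Sketch only: derive an encryption key from a user's passcode.
# The salt is stored on the device and isn't secret; the passcode
# itself is never stored anywhere, so neither is the key.
def derive_key(passcode: str, salt: bytes) -> bytes:
    # PBKDF2 is deliberately slow: the high iteration count makes
    # every passcode guess expensive, which hampers brute force.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

salt = os.urandom(16)
key = derive_key("1234", salt)  # the phone's data is encrypted under this key
```

The upshot is that Apple holds neither the passcode nor the key derived from it; without the passcode, the ciphertext on the phone is just noise.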
Farook may also have enabled an iOS setting which wipes the phone of all data after 10 incorrect passcode entries.
The FBI now want Apple to build a new SIF (Software Image File) – in effect a custom version of iOS – which circumvents major iPhone security features in order to get around the passcode issue. This would let the FBI conduct a ‘brute force’ attack (rapidly trying passcode combinations until the right one is found – a 4-digit passcode has only 10,000 possibilities) while ensuring the phone wouldn’t erase itself after 10 failed logins.
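To see why disabling the auto-erase feature matters so much, here’s a toy sketch (`try_passcode` is a hypothetical stand-in for the device’s real unlock check):

```python
import itertools

MAX_ATTEMPTS = 10  # the optional iOS 'erase data' setting

def try_passcode(code: str) -> bool:
    """Hypothetical stand-in for the device's real unlock check."""
    return code == "7291"  # the 'secret' passcode in this toy example

attempts = 0
for digits in itertools.product("0123456789", repeat=4):  # all 10,000 codes
    attempts += 1
    if attempts > MAX_ATTEMPTS:
        print("Auto-erase triggered: device wiped, brute force defeated")
        break
    if try_passcode("".join(digits)):
        print(f"Passcode found after {attempts} attempts")
        break
```

Without the 10-attempt limit (and the escalating delays iOS normally imposes between guesses), software can exhaust all 10,000 combinations in moments; with it, an attacker gets at most ten tries.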
A few days ago Apple published a public response – an open letter from Tim Cook entitled ‘A Message to Our Customers’ – explaining why they are not complying with the request.
Apple argue that whilst they have helped the FBI with their enquiries thus far (handing over data where they can), creating this new piece of software would be extremely dangerous.
They say it would undo the work Apple have done to make iPhones as secure as possible (some would say draconian, if you look at the way they’ve handled Error 53!) and, in the wrong hands, it could compromise the data of every single iPhone user – which for many people now includes credit card information.
Apple have addressed the argument that this is just a ‘one off’ for the San Bernardino case:
“Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
“The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers…”
Privacy activists have supported Apple’s stance. Sherif Elsayed-Ali of Amnesty International argued: “This isn’t an issue about reading a known terrorist’s texts – it’s about how the functionality could be abused in other ways.”
So where does this leave things? It all comes down to whether ‘encrypted’ actually does mean encrypted.
The difference between this case and the Microsoft Ireland case is that the data the US government is asking Microsoft for (from the suspected drug trafficker) is stored in the Cloud, unencrypted. Microsoft could hand the data over if they wanted to – they are refusing because they believe the US government has no right to access data held outside of the US, and that data should be protected by the laws of the country in which the servers holding it are located (a stance which also reassures their Cloud customers that their data is safe).
Here, Apple are saying they simply don’t have the means to unlock an encrypted device without developing insecure software. Or, if they do (and I believe they do), they certainly don’t want that fact made public – i.e. that their hardware can be broken into even when it is supposed to be ‘impossible’ without the passcode.
This Register article does a good job of offering several explanations as to why Apple may be refusing to comply with the court order if they do have the capability to do exactly what the FBI is asking.
Our security partners Sophos have taken the following stance on technology vendors such as themselves building ‘backdoors’ into their own products:
“Our ethos and development practices prohibit “backdoors” or any other means of compromising the strength of our products for any purpose, and we vigorously oppose any law that would compel Sophos (or any other technology supplier) to weaken the security of our products.”
From my standpoint, I believe Apple have the capability to unlock this particular iPhone; and in cases like this, where a court has ruled that there is sound evidence that unlocking the device will aid a counter-terrorism investigation, I believe they should do so.
Clearly many will disagree with this standpoint, as the pro-Apple rallies planned across the US for 23rd February will show.
For me, the more complex issues are:
- How Apple could unlock the device whilst guaranteeing that the software and methods cannot be replicated.
- If Apple back down on this occasion and thereby set a precedent for decrypting data on demand, where is the line drawn – fraud cases? Drug trafficking? People trafficking? That is, for me, a highly complex question, and one for international law to decide.