The Worm in the Apple

Cartoon © Signe Wilkinson

The dispute between Apple and the FBI will be going on for a while.

The government can’t even protect its own secrets. And yet it asks us to trust it with backdoors into the devices that hold our passwords and our most private information, including details about our health, wealth, and safety.

The bottom line is that compromising digital encryption on a single phone, for evidence that may or may not exist, will make us much less safe in the long run.

Think about it. Did torture make us safer? Would installing video cameras everywhere (including inside people’s homes) to watch everyone’s movements and actions make us safer? Would giving up all of our rights make us safer?


16 Comments

  1. Jonah wrote:

    My understanding is that the FBI wants to prevent the suspect’s iPhone from permanently erasing its data if the password is entered incorrectly multiple times. Overcoming the encryption itself didn’t seem to be a contentious topic as far as I know, but there are so many articles about this issue that I’m not absolutely certain what the FBI wants. Either way, it will be interesting to see how this case is resolved. Will landlords no longer have to provide keys to law enforcement to enter residences rented to suspects? Will wiretapping be made illegal even if approved by a court order?

    One additional comment about surveillance cameras. I moved to London temporarily for personal reasons a couple of years ago. I may stay here if Trump or any other Republican wins 🙂 A few weeks ago an innocent man walking to work was stabbed to death by someone probably looking to rob him. Police were eventually able to nab the killer mainly by using surveillance cameras trained on areas close to the stabbing. I’ve seen some gross misuse of London surveillance by the police now and then; they once posted a picture of a celebrity walking out of a club and made a joke about it. However, none of the Londoners I spend time with are bothered by the cameras; instead they seem to embrace them as a tool that keeps them safe, or at the very least helps prevent the next tragedy.

    Friday, February 26, 2016 at 5:54 am | Permalink
  2. ThatGuy wrote:

    A back-door into all Apple devices is a horrible idea. If it exists, someone will figure out how to abuse it. Apple and others already have issues securing their cloud services (or at least making users aware of what gets sent there); I can’t imagine intentionally creating vulnerabilities would be a very good idea.

    I do agree with the government/critic charge that Apple is doing this to protect their brand, to which I say: “no kidding.” But protecting your brand and protecting your customers aren’t always at odds.

    Bottom line, we give away enough of our privacy just by using the Internet, and risk it even further by putting just about everything on several-ounce smartphones that most people are better at losing or breaking than taking care of. We don’t need to add any more vulnerabilities.

    Oh, look, Amazon is kind enough to let me know that it is having a sale on those computer parts I want…

    Friday, February 26, 2016 at 7:59 am | Permalink
  3. Carter wrote:

    Yep, Apple cares about our privacy. It’s not like they’re tracking our every movement or monitoring every purchase we make.

    Friday, February 26, 2016 at 10:32 am | Permalink
  4. PATRIOTSGT wrote:

    Suppose the FBI uncovers a plot to assassinate President Obama because an informant said his buddy called and told him that a group of skinheads had an extensive plan. The FBI seizes the iPhone of the person who called the informant, but alas, it is locked. The FBI could stop the plot if it got access to the phone.

    So if Apple prevails, any telecommunications provider that handles sensitive information should now be able to deny a lawful court order as provided for in our Constitution. What about banks, the IRS, credit card companies, stock firms? Do our phones contain more sensitive data than they do? How about a lawfully executed search warrant on a house or business; would we find less private information inside a locked file cabinet?

    Here’s my advice: if you don’t want law enforcement to find out about your Ashley Madison account, don’t create one. If you have more sensitive data on your phone than could be retrieved from banks, credit companies, the IRS, political parties, your doctor, your closet, your bathroom, or your car, then you are an idiot. And if you don’t want someone to know who you snapchatted with (Anthony Weiner, for example), then we are in a world of hurt.

    Friday, February 26, 2016 at 11:47 am | Permalink
  5. Michael wrote:

    This may be more than you care to know, but since Jonah asked, here are the technical details in as clear language as I can manage. This case is very complicated, and most people don’t understand the implications of what the FBI is asking. The FBI is asking for 3 primary things: (a) disabling the feature that wipes the phone after 10 failed tries, (b) disabling the exponential back-off between tries, and (c) enabling them to send PIN attempts over a wired or wireless interface instead of typing them. This would allow them to run a program that automatically tries all possible PINs. They are also asking for a 4th thing indirectly: (d) the ability to restrict these features to only this particular iPhone. Feature (a) is fine, but features (b) and (c) are extremely unwise, while (d) is impossible.

    Here’s how the system works. Every iPhone has a unique cryptographic key–a very large number that is impractical to guess–burned into the hardware. Your PIN gets combined with this key in a routine that takes 80ms to run. This 80ms delay is completely unavoidable and Apple can’t do anything about that. But the result is the generation of a new key that no one knows. The only way to get this key is to run this routine with that PIN on that device. This new key is the one that is then used to encrypt and decrypt data stored on the phone. The phone confirms your identity by trying to decrypt a small piece of data with this generated key. If the decryption fails, you entered the wrong PIN and the phone tells you this.
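
    (For the curious, here is a minimal Python sketch of that design. The names, the PBKDF2 construction, and the iteration count are illustrative assumptions on my part; Apple’s real routine is a proprietary key-tangling algorithm that runs inside the phone’s hardware.)

        import hashlib
        import os

        # Stand-in for the unique key burned into each iPhone's hardware.
        # (Illustrative only; the real key never leaves the silicon.)
        DEVICE_UID = os.urandom(32)

        def derive_key(pin: str) -> bytes:
            # Entangle the PIN with the device-unique key. The work factor
            # (here, the iteration count) is what makes each attempt cost a
            # fixed amount of time, ~80ms on the real hardware.
            return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)

        def pin_is_correct(pin: str, check_value: bytes) -> bool:
            # The phone confirms a PIN by decrypting a small known blob with
            # the derived key; comparing a stored check value is the same
            # idea in miniature.
            return derive_key(pin) == check_value

        check_value = derive_key("1234")            # set when the PIN is enrolled
        print(pin_is_correct("1234", check_value))  # True
        print(pin_is_correct("0000", check_value))  # False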

    What happens next is the heart of the issue. You can fail up to 4 times with no delay. After the 5th attempt, you’re locked out for 1 minute before you can try again. After the 6th, you’re locked out for 5 minutes, then 15 minutes after the 7th and 8th, then an hour after the 9th. This is called exponential back-off; the amount of lock-out time increases exponentially. The reason for this is simple: if you can try all possible PINs as quickly as the hardware allows, you can break in. Apple’s documentation says it would take about 5.5 years to try all 6-character lowercase alphanumeric passcodes. However, most people still use a 4-digit PIN of just numbers. Trying all of those at 80ms apiece would take a MAXIMUM of about 13 minutes. End result: if I steal your iPhone and apply the patch that disables exponential back-off, I can easily break the encryption and get your private data in well under an hour. This includes your credit card information used by Apple Pay and other such services. It’s NOT just about the privacy of your Instagram pictures or other stuff. It’s ALSO about money and keeping users protected from identity theft, and exponential back-off is critical to protecting it.
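
    (Those figures are straightforward arithmetic on the 80ms number; a quick Python sketch, with the lock-out schedule written out for reference:)

        TRY_COST = 0.080  # seconds per attempt, enforced by the 80ms derivation

        # Lock-out schedule described above: failed-attempt number -> delay in minutes.
        LOCKOUT_MINUTES = {5: 1, 6: 5, 7: 15, 8: 15, 9: 60}

        def worst_case_seconds(alphabet_size: int, length: int) -> float:
            # Time to try every possible passcode with back-off disabled.
            return (alphabet_size ** length) * TRY_COST

        # 4-digit numeric PIN: 10**4 attempts at 80ms each.
        print(worst_case_seconds(10, 4) / 60)             # ~13.3 minutes
        # 6-character lowercase alphanumeric passcode: 36**6 attempts.
        print(worst_case_seconds(36, 6) / (86400 * 365))  # ~5.5 years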

    To make matters worse, the FBI wants to be able to do this over Bluetooth…which means I don’t even need to steal your phone anymore! Depending on which version of Bluetooth is used, all I would need to do is leave a device within 100 meters (less in practice because of wireless interference) of your iPhone for the few minutes a full sweep of 4-digit PINs would take. Perhaps I leave it in the bushes by your front porch. The combination of features (b) and (c) completely destroys the iPhone’s security for financial data. It is the very definition of a backdoor, and the FBI and the judge are flat-out lying (or frighteningly deluded) when they suggest it’s not.

    To allay concerns about this security threat, they throw in feature (d): the backdoor only runs on this one device. Hogwash. This is impossible to guarantee. The software, music, and movie industries have been trying to do exactly this for a couple of decades now; it’s called DRM (digital rights management). Every proposed system has failed, because it depends on trust assumptions that cannot be made. For instance, you can make it depend on the MAC address, but MAC addresses can be changed. You can make it depend on the radio’s ID number, but those can be forged. You can make it depend on a hardwired cryptographic key, but if you put probes in the right place on the device, you can eavesdrop on the bits in the wires. It is literally impossible to guarantee that this backdoor will only run on that one particular device.
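
    (To see why, here is a toy Python version of such a device check. All names and values are hypothetical; the point is that everything the check compares is reported by hardware the attacker physically controls, which is exactly the trust assumption that fails.)

        # Identifiers the backdoor is supposedly locked to (hypothetical values).
        TARGET = {"mac": "a4:d1:8c:00:11:22", "radio_id": "XK-1138"}

        def authorized(device: dict) -> bool:
            # Naive device binding: trust whatever the device reports about
            # itself. Each field can be changed, forged, or replayed by
            # whoever controls the hardware, so the check proves nothing.
            return all(device.get(k) == v for k, v in TARGET.items())

        spoofed = dict(TARGET)      # an attacker copies the identifiers wholesale
        print(authorized(spoofed))  # True: the check cannot tell devices apart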

    This is not just a civil-liberties question of security vs. privacy vs. law enforcement. What the FBI is requesting puts all iPhone users at additional risk if their phone is ever lost or stolen. In fact, it puts them at risk of identity theft even if their phone is NOT stolen. It puts all iPhone users at definite, unnecessary risk for the mere possibility of recovering some data about a potential threat that may happen at some point in the future.

    Friday, February 26, 2016 at 3:01 pm | Permalink
  6. Michael wrote:

    Also, it’s worth pointing out that this entire debacle is possibly a deliberate escalation on the part of the FBI. With the phone’s original iCloud credentials still in place, the owner of the phone (San Bernardino County) could have let it make a fresh backup to the iCloud service, which would have included most of the data in question. But during the early stages of the investigation, the FBI told San Bernardino County to reset the iCloud password immediately. No explanation was provided for why this was necessary, and there still is none. In fact, it goes entirely against standard digital forensics practice: you simply don’t change anything. But the FBI told them to. The key question is why. I see two explanations that are equally unpalatable: (a) deliberate escalation to use this case as a way to attack Apple in court, or (b) incompetence.

    Friday, February 26, 2016 at 3:05 pm | Permalink
  7. Joe Blow wrote:

    Hey Michael,

    Would it be possible for Apple to:
    a) Give the FBI what they want and
    b) Then change the iPhone so that the solution for step a) is a one-off?

    Friday, February 26, 2016 at 3:17 pm | Permalink
  8. Iron Knee wrote:

    Good summary, Michael. It is also worth pointing out that weakening encryption can also endanger lives. So in exchange for maybe getting information from this one phone, we are exposing information on many phones. Governments could use this to find out the names of dissidents, or even American intelligence workers, thus endangering their lives.

    Similar arguments were used for torture: if some terrorist knows where a nuclear device is hidden, it should be justifiable to torture them to find out. But in reality we got little (if any) actionable information from torture. Instead of reserving it for extreme cases like that, the government used it indiscriminately and seemingly just for sick fun. And we are still living with the blowback.

    Friday, February 26, 2016 at 4:22 pm | Permalink
  9. PatriotST wrote:

    So at the end of the day one can make several assumptions.
    1. Apple can, but is refusing to follow a lawful court order.
    2. Apple could comply, but they are worried that the solution will be stolen from or sold by their employees.
    3. So your data is safe, because no engineer at Apple would do that?
    4. It’s much easier to hack into your home computer or simply break into your house to steal your bank accounts, or to hack directly into your bank, so why would a criminal go through the hassle of attacking your phone?
    5. If I want to easily steal the secrets on your iPhone, it is simple. All I need to do is wait for you to be using the phone, hit you over the head, and keep the phone live until I can find the info I’m looking for.

    Great technical description, Michael, but at the end of the day, they are taking a stand. If we were talking about the mass collection and data mining of phone records of persons not involved in criminal activity, I would fully support Apple’s position. But we are talking about mass murderers and their link to terrorist entities. Two completely different subjects.

    Friday, February 26, 2016 at 4:24 pm | Permalink
  10. Iron Knee wrote:

    PSgt, some of your statements are just wrong. Like:
    2) The cryptographic signing key required to create the hack is known only to an extremely small number of Apple engineers (I would guess three or four). But once the hack is developed, it would be much easier to duplicate.
    3) It wouldn’t have to be an Apple employee who would steal or sell the hack, it could easily be someone in the FBI.
    5) Not true. Getting access to a live phone does not give you access to encrypted information on that phone (for that, you would have to torture the phone’s owner).

    Friday, February 26, 2016 at 5:37 pm | Permalink
  11. Iron Knee wrote:

    And now, the police chief of the city where the shooting happened says that there is a reasonably good chance that there is nothing of value on the phone. See http://www.npr.org/2016/02/26/468216198/san-bernardino-police-chief-weighs-in-on-whether-apple-should-unlock-shooter-s-p

    He says they mainly want the contacts from the phone, but that doesn’t make sense to me. We know the government already collects metadata from cell phone calls, so they should already have the phone numbers for any calls. Not to mention that they could have gotten the contacts from a cloud backup, except that the FBI had the county reset the iCloud password. Weird.

    Friday, February 26, 2016 at 10:55 pm | Permalink
  12. jonah wrote:

    Thanks for the detailed description of what the FBI is trying to do, Michael!

    Some thoughts on points 3 and 4.

    3 – Typical Bluetooth connections have to be enabled from both sides, i.e. the devices have to be paired, and in most cases a pairing code has to be manually entered (unless it works differently on the iPhone). So a user theoretically cannot be attacked through this backdoor unknowingly. Even if the phone is lost or stolen, the PIN has to be entered before pairing with a Bluetooth device, and there is no way to brute-force the PIN without erasing everything. I wonder how the FBI will circumvent this on the suspect’s iPhone?

    4 – To change the MAC address the attacker would need to get into the phone, so it’s not easy to do, but I understand the premise that it’s difficult to restrict the usage to just one phone. I understand the suspect was using an older iPhone. Perhaps there may be a way to restrict usage by limiting it to the particular processor used in the older iPhone vs. the newer ones. Typically processors are ASICs with hardcoded serial numbers, model numbers, etc. that cannot be changed. While not a perfect solution, that would ensure the program cannot be used on future phones and is hopefully limited to the older iPhone the suspect was using.

    I think I’m still missing something about what the FBI is trying to do. I presume they will first need to install the software that implements (a) through (c) on the iPhone, and the only way to do this would be via the automatic iPhone update? Or can it be done via a USB connection?

    If via the iPhone update, then presumably a future attack could only be done through Apple.

    Saturday, February 27, 2016 at 5:32 am | Permalink
  13. Peter wrote:

    For some reason, the debate about this sort of thing reminds me of an old joke:

    A man walks up to a woman and says, “Would you sleep with me for a million dollars?” The woman thinks for a moment and says, “Yes, I would.” The man then says, “Would you sleep with me for one dollar?” The woman is shocked and says, “What kind of woman do you think I am?!” The man replies, “We’ve already determined what kind of woman you are; now we are only haggling over the price.”

    Back during the torture debate, people would bring up the “24 Scenario.” If you remember the TV show 24, our hero Jack Bauer had 24 hours to figure out who the bad guys were and save countless millions from some terrorist plot. So the scenario would usually go, “There’s an A-bomb in a major city. We have the man who knows where the bomb is in custody, but he won’t tell us where it is. Is it okay to use torture?”

    At what point do you throw out your values in order to save your life?

    The Apple thing is an interesting commentary. There is no reason to believe that there is anything incriminating on that phone, other than the fact that it was used by a criminal. There could be a complete list of ISIS sympathizers on that phone. Or there could be nothing at all.

    So this isn’t a “24 Scenario.” There is no lit fuse or anything like that.

    Apple is in the right on this.

    Saturday, February 27, 2016 at 2:36 pm | Permalink
  14. Joe Blow wrote:

    Hey Peter,

    What if it wasn’t a phone, though? What if it was a safe? Let’s say Lockwood has developed a tamper-proof safe that will destroy any documents held in it if somebody attempts to force it open.

    And that type of safe is found in the apartment of a terrorist / terror suspect.

    There might be nothing in that safe, or there might be a list of ISIS agents in the USA with their agendas and targets. Who knows?

    What would the reaction be like then if the FBI went to the courts and asked Lockwood to provide a master key / combination to open said safe?

    I don’t think it would be a problem.

    Saturday, February 27, 2016 at 5:40 pm | Permalink
  15. Ralph wrote:

    Thanks Michael, best technical description of the issue I’ve seen yet.
    Joe – I think your question is answered by Michael under section (d).

    One other question arises that I haven’t heard mentioned yet: what is the status of Android phones’ security features? Are they as robust as Apple’s? Or planned to be? I have an iPhone, but the wife and kids have Androids. Just curious how this case may be reverberating through the rest of the industry.

    This is predicted to wind up in the Supreme Court and boil down to a Constitutional issue around the 4th Amendment. Just because the FBI/gov’t wants something doesn’t mean it has the Constitutional authority, or should be given the indiscriminate ability, to conduct surveillance without reasonable cause. Since providing a back door gives them exactly that indiscriminate ability and compromises all such devices (item (d) from Michael’s post), you’d think the gov’t loses that case. But I’m not a lawyer and I’ve seen SCOTUS make some doozies over the years, so how it ultimately plays out is anyone’s guess.

    Saturday, February 27, 2016 at 6:35 pm | Permalink
  16. Joe Blow wrote:

    Hey Ralph,

    Sort of. After all, in my hypothetical, if Lockwood does have some kind of master key / combination for its safes, it would apply to all Lockwood safes of that type.

    ANY such safe could be compromised. But I don’t think anybody would particularly care. And I don’t think it’s entirely down to the difference between a physical and a digital key.

    In the UK (and some other jurisdictions) they have laws which compel the user to hand passwords over to the authorities or face jail.

    Sunday, February 28, 2016 at 12:51 am | Permalink