Apple vs. The FBI Demonstrates Convenience Versus Security

While there are definitely bigger issues at play in the case of Apple vs. the FBI, issues that have been discussed ad infinitum, there’s actually something a bit more innocuous to look at, which was discussed on my favorite podcast, No Agenda, hosted by Adam Curry and John C. Dvorak.

Recall that the actual court order asks Apple to make it possible, on a specific device, to brute-force the PIN code in an automated fashion without introducing unnecessary delays and without wiping the device after 10 missed guesses. This matters because the PIN code unlocks the encryption key for the device, which in turn would give the FBI access to its data.

To explain how this works, I’ll quote from Apple’s iOS Security Guide for iOS 9:

The passcode is entangled with the device’s UID, so brute-force attempts must be performed on the device under attack. A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds. This means it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers.

The stronger the user passcode is, the stronger the encryption key becomes. Touch ID can be used to enhance this equation by enabling the user to establish a much stronger passcode than would otherwise be practical. This increases the effective amount of entropy protecting the encryption keys used for Data Protection, without adversely affecting the user experience of unlocking an iOS device multiple times throughout the day.

To further discourage brute-force passcode attacks, there are escalating time delays after the entry of an invalid passcode at the Lock screen. If Settings > Touch ID & Passcode > Erase Data is turned on, the device will automatically wipe after 10 consecutive incorrect attempts to enter the passcode. This setting is also available as an administrative policy through mobile device management (MDM) and Exchange ActiveSync, and can be set to a lower threshold.

It’s tricky to provide very high security for devices while at the same time maintaining usability. This is a classic case of balancing convenience and security, and Apple should be lauded for its efforts here. With all of these limitations Apple puts in place, including the end user enabling the “Erase Data after 10 failed passcode attempts” option, even a 4-digit PIN can provide a reasonable balance between usability and security.
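The arithmetic behind that balance is straightforward: with the erase option on, an attacker gets at most 10 guesses against a keyspace of 10,000 possible PINs. A minimal sketch, assuming the PIN is chosen uniformly at random (which, as we’ll see, real users rarely do):

```python
# Odds of guessing a randomly chosen 4-digit PIN before the
# "Erase Data" option wipes the device after 10 failed attempts.
keyspace = 10 ** 4        # 10,000 possible 4-digit PINs
attempts_allowed = 10     # device wipes after 10 misses
p_guess = attempts_allowed / keyspace
print(f"{p_guess:.1%}")   # 0.1% chance of success
```

A 1-in-1,000 chance of success, at the cost of destroying the very data you were after, is why even a short PIN plus the erase option is a defensible trade-off.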

That said, I generally agree with Adam Curry’s assessment that a 4-digit PIN code isn’t really that secure. End users generally pick PIN codes that are easy to guess if you know even a little bit about the person in question. Given the passwords people choose, this should be no surprise.

Even with a 4-digit PIN and the “Erase Data after 10 failed passcode attempts” option turned off, the ever-escalating passcode lockout provides some level of protection, as does limiting passcode entry to the touch screen. Without those limitations in place, it would be trivial to brute-force a 4-digit PIN code (about 15 minutes) or even a 6-digit PIN code (about a day).

Which is, of course, why the FBI wants the ability to remove those protections. And why, if you care even a little bit about the security of the data on your mobile device, you should use a longer, more complex passcode. Given that, even without those restrictions, it takes roughly 80 milliseconds to check each PIN/passcode attempt, it doesn’t even have to be that long or that complex to keep the passcode from being guessed in your lifetime.
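Those timing claims are easy to sanity-check. A quick sketch using Apple’s stated ~80 milliseconds per attempt, assuming the escalating delays and erase-after-10 protections have been removed:

```python
# Worst-case exhaustive-search times at ~80 ms per passcode attempt,
# with the lockout and auto-erase protections out of the way.
ATTEMPT_SECONDS = 0.080

pin4_minutes = 10 ** 4 * ATTEMPT_SECONDS / 60              # ~13.3 minutes
pin6_hours   = 10 ** 6 * ATTEMPT_SECONDS / 3600            # ~22.2 hours
alnum6_years = 36 ** 6 * ATTEMPT_SECONDS / 86400 / 365.25  # ~5.5 years

print(f"4-digit PIN: {pin4_minutes:.1f} min, "
      f"6-digit PIN: {pin6_hours:.1f} h, "
      f"6-char [a-z0-9]: {alnum6_years:.1f} years")
```

The last figure lines up with the “more than 5½ years” Apple quotes for a six-character lowercase alphanumeric passcode, and every additional character multiplies it by another 36.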

Disclaimer: My employer Check Point Software Technologies might have differing views on this topic. These thoughts are my own.

Apple, FBI, and The Case Against Mobile Device Management

I will admit, I’ve never been a huge fan of Mobile Device Management (MDM). Given the way this whole kerfuffle between the FBI and Apple is playing out, anyone who cares about their personal digital privacy should think twice before subjecting their personally-purchased devices to MDM.

One of many things an MDM solution can do is control the PIN code on the device: it can ensure one exists, enforce a specific length, and even reset it. This fact entered the public discourse around the San Bernardino shooter’s iPhone, which the FBI is trying to get Apple to assist in unlocking.

If San Bernardino County (which owned the phone the shooter used) had installed MDM on the target device, the whole public debate around this would not be happening. I don’t know of any company that would deny a request to reset or disable the PIN, particularly one backed by a court order. Unlike what the FBI is asking Apple to do now, it would not be burdensome to carry out, either.

This places an extra burden on employers who manage employee devices through MDM. Specifically, do you have a process to handle law enforcement requests like this? Are your employees aware of this policy and have they consented? Also, MDM doesn’t do a whole lot to protect corporate data or detect the presence of malicious software on mobile devices.

As an individual, this doesn’t make me feel all that safe about trusting my phone to MDM, at least not without understanding precisely what features and functionality will be under MDM control.

Disclaimer: My employer Check Point Software Technologies might have differing views on this topic. These thoughts are my own.

FireEye: Indemnification That's Basically Worthless

From FireEye’s CEO and the meaning of ‘basically’:

In an interview on CNBC’s “Mad Money” with Jim Cramer, FireEye CEO Dave DeWalt said a certification granted by the Department of Homeland Security under a law known as the SAFETY Act “allows companies who use our product to basically be indemnified against legal costs relative to being breached.”

Which, if you unpack this statement, turns out to be basically meaningless.

From the FAQ on the Safety Act maintained by the Department of Homeland Security, emphasis added:

[The] Act creates certain liability limitations for “claims arising out of, relating to, or resulting from an Act of Terrorism” where Qualified Anti-Terrorism Technologies have been deployed. The Act does not limit liability for harms caused by anti-terrorism technologies when no Act of Terrorism has occurred.

What is an Act of Terrorism? The FAQ about the SAFETY Act continues:

A: Pursuant to the SAFETY Act, an Act of Terrorism is: ACT OF TERRORISM- (A) The term “act of terrorism” means any act that the Secretary determines meets the requirements under subparagraph (b) of the Act, as such requirements are further defined and specified by the Secretary. REQUIREMENTS- (B) An act meets the requirements of this subparagraph if the act- (i) is unlawful; (ii) causes harm to a person, property, or entity, in the United States, or in the case of a domestic United States air carrier or a United States-flag vessel (or a vessel based principally in the United States on which the United States income tax is paid and whose insurance coverage is subject to regulation in the United States), in or outside the United States; and (iii) uses or attempts to use instrumentalities, weapons or other methods designed or intended to cause mass destruction, injury or other loss to citizens or institutions of the United States.

That’s actually a pretty broad definition of terrorism, one I should probably explore in another forum. Suffice it to say, most breaches that affect most companies are not recognized “Acts of Terrorism” under the SAFETY Act, which means there is likely no legal indemnification if and when a breach happens.

Even on the off chance legal indemnification applies, there are still plenty of other costs the SAFETY Act won’t cover. I’m sure FireEye will happily sell you the consulting necessary to clean up after such a breach, and I’m pretty sure it won’t be free, either.

Personally, I’d rather prevent the breach from happening than rely on promises of indemnification if and when one does. But that’s just me.

Disclaimer: My employer Check Point Software Technologies competes with FireEye in the market. These thoughts are my own.

My Podcasts on Apple and the FBI Backdoor Requests

For those of you who don’t know, I produce a short but regular podcast called PhoneBoy Speaks. It is not exclusively focused on information security, though I do maintain an RSS feed for infosec-related episodes if that’s all you’re interested in.

On the last couple of episodes, I discussed the FBI’s requests that Apple assist in breaking into a specific iPhone “because terrorism,” which I’ve already covered in written form. I don’t think I say anything different in these podcasts, but if you prefer audio to reading, here you go:

Disclaimer: I haven’t asked what my employer Check Point Software Technologies thinks about all this. These thoughts are my own.

Can Apple Actually Comply With The FBI Request To Allow Brute-Forcing PIN Codes?

For a moment, anyway, ignore the politics of whether or not Apple should comply with the FBI’s court-ordered request to assist in unlocking a particular iPhone relevant to a highly publicized terrorism case. Let’s talk about whether it’s actually possible, based on the information available and what I know to be possible.

To summarize, the request is to provide “reasonable technical assistance” to achieve the following three things:

  1. Disable the auto-erase function (which, if enabled, wipes the device after 10 failed passcode attempts)
  2. Allow submission of passcodes via something other than the touchscreen
  3. Disable the ever-escalating delays that Apple introduces when you enter incorrect passcodes

This would allow the FBI to effectively brute-force the PIN code on the device, assuming the owner of said device used a simple 4-digit passcode (very likely). A 6-digit passcode, which is the new default in iOS 9.2, might take a little longer. If it’s an actual strong password, the FBI might be waiting a long time to actually unlock the phone.
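Once those three protections are gone, the “attack” is nothing more sophisticated than a counting loop. A toy illustration of the idea (the `try_pin` function and the secret PIN here are made up; on a real device, each attempt would be submitted electronically):

```python
import itertools

SECRET = "7345"  # hypothetical PIN "on the device"

def try_pin(candidate: str) -> bool:
    """Stand-in for submitting one passcode attempt to the device."""
    return candidate == SECRET

def brute_force(length: int = 4) -> tuple[str, int]:
    """Try every PIN of the given length in order; return (pin, attempts)."""
    for n, digits in enumerate(itertools.product("0123456789", repeat=length), 1):
        pin = "".join(digits)
        if try_pin(pin):
            return pin, n
    raise RuntimeError("keyspace exhausted")

pin, attempts = brute_force()
print(pin, attempts)  # "7345" found on attempt 7,346
```

The hard part isn’t this loop; it’s everything Apple would have to do to let this loop run against a real device in the first place.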

I don’t think there is any debate that Apple can write firmware that accomplishes the above tasks. The trick, of course, is getting the changed firmware onto the device. The only way I know to do that on a device where the passcode is not known is via the Device Firmware Upgrade (DFU) mode.

Just for kicks, I put one of my iPhones into DFU mode and plugged it into one of my computers. I got the following warning:

Loading new firmware on the device will erase it, which kind of defeats the purpose the FBI is trying to achieve. That is, of course, unless Apple has designed a way to prevent that from happening in certain cases. Dan Guido seems to think this is the case, and if you read Apple’s Legal Process Guidelines, it seems to suggest this is possible, at least in versions of iOS prior to 8.0:

For iOS devices running iOS versions earlier than iOS 8.0, upon receipt of a valid search warrant issued upon a showing of probable cause, Apple can extract certain categories of active data from passcode locked iOS devices. Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (“user generated active files”), can be extracted and provided to law enforcement on external media. Apple can perform this data extraction process on iOS devices running iOS 4 through iOS 7. Please note the only categories of user generated active files that can be provided to law enforcement, pursuant to a valid search warrant, are: SMS, iMessage, MMS, photos, videos, contacts, audio recording, and call history. Apple cannot provide: email, calendar entries, or any third-party app data.

It’s possible Apple didn’t need new firmware to obtain access to the data outlined above; it could simply interrogate the device in DFU mode and the device would give the information up. Even if a new firmware load is required to accomplish this task, it’s possible that, as part of improving device security by encrypting all the data, Apple also closed the loophole that allowed them to put a new version of iOS on a device in DFU mode without erasing it. As the device in question is reportedly running iOS 9.0, the entire argument over whether this is technically possible hinges on the answer to this question.

The only other method available to Apple: hacking the actual device in much the same way jailbreakers do. The very same thing John McAfee is offering to do for free using his army of hackers with 24-inch purple mohawks, 10-gauge ear piercings, and tattooed faces, people not likely to be under the employ of the FBI.

Edit 19 February 2016: I asked Dan Guido via a comment whether or not you could load new firmware via DFU mode. He said it could be loaded as a ramdisk, which would leave the user data alone. Whether this could actually accomplish what the FBI is asking for is still unknown, but the fact that you can do this certainly makes it seem a lot more plausible.

Regardless of whether or not this is technically possible, Apple is right to challenge this court order. The implications go well beyond this one iPhone and will impact our digital rights globally for decades to come.

Disclaimer: I haven’t asked what my employer Check Point Software Technologies thinks about all this. These thoughts are my own.