Technology experts at the NATO Cooperative Cyber Defence Centre of Excellence look at the recent case of the FBI asking Apple to disable the feature that wipes the data on an encrypted iPhone. Doing so would require overriding the phone's brute-force protection and creating a back door, and the effects could not be limited to one device or a single case: such a back door would inevitably decrease the overall level of security and would be misused. As a general rule, no back doors should be created to any secure system.
For everyday users of such devices, longer passcodes provide additional security – brute-forcing the encryption is only feasible because people commonly use 4- or 6-digit passcodes.
NATO CCD COE Technology Researcher Teemu Väisänen: Users Should Not Rely on Short Passcodes
“In a brute-force attack, the attacker generates and tests every possible password until the right one is found. As the password’s length increases, the time needed to find the correct password grows exponentially. A 4-digit passcode requires at most 10,000 attempts; brute-forcing sufficiently long passcodes is not feasible, as the search could take decades.
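The arithmetic above can be sketched with a few lines of Python. This is a back-of-envelope estimate, not a model of the real device: the 80 ms per attempt is an assumption, roughly in line with Apple's published key-derivation timing, and the real guess rate depends on hardware.

```python
# Back-of-envelope brute-force search-space estimates.
# GUESS_TIME_S is an assumed per-attempt cost, not a measured value.

GUESS_TIME_S = 0.080  # assumed time per passcode attempt, in seconds

def search_space(alphabet_size: int, length: int) -> int:
    """Number of possible passcodes of a given length."""
    return alphabet_size ** length

def worst_case_years(alphabet_size: int, length: int) -> float:
    """Time to exhaust the whole space at the assumed guess rate."""
    seconds = search_space(alphabet_size, length) * GUESS_TIME_S
    return seconds / (3600 * 24 * 365)

print(search_space(10, 4))   # 4-digit numeric passcode: 10000 combinations
print(search_space(10, 6))   # 6-digit numeric passcode: 1000000 combinations
# A 10-character lowercase-plus-digit passcode takes millions of years
# to exhaust even at this optimistic guess rate:
print(worst_case_years(36, 10) > 1_000_000)
```

The contrast is the whole argument: four digits fall in seconds, while a modest alphanumeric passcode pushes the worst case far beyond any investigation's lifetime.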
Apple iOS uses several security features to prevent brute-force attacks. The current default passcode length is 6 digits (meaning roughly one million attempts are needed to exhaust the space). The Secure Enclave (or, in models without it, such as the iPhone 5C used in the San Bernardino case, software) holds the keys, performs the encryption and decryption, adds a delay between passcode checks, and can wipe the encryption key after a certain number of failed attempts.
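A minimal sketch of the delay-and-wipe policy just described might look as follows. The delay schedule and the 10-attempt wipe limit are illustrative assumptions loosely modeled on documented iOS behavior, not Apple's actual implementation; the in-memory string comparison stands in for the real hardware-backed check.

```python
# Illustrative sketch of an escalating-delay plus wipe-after-N-failures
# policy. Schedule values are assumptions, not Apple's real parameters.

WIPE_AFTER = 10  # assumed failed-attempt limit before the key is wiped

def delay_before_attempt(failures: int) -> int:
    """Escalating delay in seconds, keyed on prior failed attempts."""
    schedule = {5: 60, 6: 300, 7: 900, 8: 900}
    if failures < 5:
        return 0
    return schedule.get(failures, 3600)

class PasscodeLock:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False  # True once the encryption key is destroyed

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            return False  # key is gone: even the right passcode is useless
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= WIPE_AFTER:
            self.wiped = True  # destroy the encryption key
        return False

lock = PasscodeLock("4178")  # hypothetical target passcode
for guess in ("0000", "0001", "0002"):
    lock.try_passcode(guess)
print(lock.failures, delay_before_attempt(lock.failures))  # 3 0
```

Note how the two mechanisms compound: the delays slow an attacker to a handful of guesses per hour, and the wipe limit caps the total number of guesses at ten, which together make even a 4-digit space unsearchable.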
Tim Cook described it in a letter to customers: “The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand. This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.”
Cook says that the FBI wants Apple to create a new version of iOS, circumventing several security features, and install it on an iPhone recovered during the investigation. The apparent court order (the authenticity of which has not been confirmed) gives more details: the FBI wants Apple to bypass or disable the auto-erase function, to enable the FBI to submit passcodes to the device electronically, and to ensure that there is no additional delay between passcode attempts. The request describes one possibility: coding a piece of software that can be loaded and run (only) on the subject device.
The FBI seems to be talking about a laboratory environment for the investigation and a specific software tool that can be used for brute-forcing the passcode. Based on this, it might be unclear why Apple insists that the law enforcement agency means a specific iOS version. That is perhaps because bypassing the security features requires new firmware for both iOS and the Secure Enclave.
From a technical point of view, there are no obstacles to creating such software, even though Apple claims that it cannot unlock the device for anyone. If the software is ever created, it has to be designed, built, tested, used, shared, and managed properly; otherwise (and most likely even then) it would decrease the level of security and be misused. It is worth noting that there are other ways to get data from various devices that do not require breaking the encryption.
As researchers at the NATO CCD COE, we generally do not recommend creating any back doors to any secure systems. However, it is interesting to ask whether Apple’s brute-force-prevention techniques merely provide additional security. Would the tool be a back door, or is the back door actually the mechanism that allows such software to be installed without user acceptance? Would creating such software merely mean weakening or disabling some of the existing security functionality?
If brute-forcing the encryption is possible in feasible time, then either there are serious problems in the encryption algorithm itself (which is not the case with iOS) or the encryption key is too short. The latter applies here, as people commonly use just 4 or 6 digits as passcodes. Brute-forcing those would be possible (and fast) without Apple’s additional security features such as delays and key destruction. Because of this, the new brute-force-prevention-disabling software would not help in cases where the phone’s user has chosen a long enough passcode (arbitrary-length alphanumeric passcodes are supported).
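To make the point concrete, here is a minimal demonstration of why a bare 4-digit passcode is trivial to brute-force once the delays and the wipe limit are out of the way. The target code and the plain string comparison are stand-ins for the real hardware-backed check, which this sketch does not model.

```python
# Exhaustive search over all numeric passcodes of a given length.
# Without per-attempt delays, the full 4-digit space (10,000 codes)
# is exhausted in well under a second on any modern machine.
from itertools import product

def brute_force_numeric(check, length: int = 4):
    """Try every numeric code of the given length; return the match, or None."""
    for digits in product("0123456789", repeat=length):
        candidate = "".join(digits)
        if check(candidate):
            return candidate
    return None

secret = "7294"  # hypothetical target passcode
found = brute_force_numeric(lambda c: c == secret)
print(found)  # 7294
```

Replace the 4-digit space with an arbitrary-length alphanumeric one and the same loop becomes useless, which is exactly why the passcode length, not the disabling tool, decides the outcome.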
The general guideline, regardless of how this particular case turns out, is to use longer passcodes that include letters and not to settle for 4 or 6 digits. You wouldn’t use only 4 or 6 digits as the password for encrypting your PC’s hard disk, right?”
NATO CCD COE Technology Researcher Tarmo Randel: Compliance Might Be Easier As the iPhone in Question Has Less Advanced Security
“It seems that a decision was made during the evidence-handling and forensics process to reset the suspect’s Apple ID password. As a result, the device could no longer sync the latest information to iCloud, where the FBI could have retrieved it more easily. Technical analysis of the publicly available court order suggests that Apple can probably indeed provide the FBI with what the agency is asking for, namely custom firmware tailored specifically to disable one feature. Compliance may be easier because the iPhone obtained from the suspect is a model with a less advanced security implementation (it lacks the Secure Enclave co-processor).
Apple has focused on creating a device where encryption, combined with operational measures, provides as secure an environment as possible while maintaining usability (hence the short default passcode). There are security certifications and programs in which Apple participates or wants its iOS to participate (FIPS 140-2, ISO 15408, CSfC). While Apple has the technical capability to give law enforcement agencies a means of evading some security measures to ease access to the device, doing so raises legal and policy concerns in today’s globalized world.”