FACTBOX: What Apple and the Government Are Fighting Over

(Reuters) —
Pieces of an iPhone are seen in a repair store in New York, February 17, 2016. A court order demanding that Apple Inc help the U.S. government unlock the encrypted iPhone of one of the San Bernardino shooters opens a new chapter in the legal, political and technological fight pitting law enforcement against civil liberties advocates and major tech companies. (Reuters/Eduardo Munoz)

Apple Inc is resisting a federal court order that it help the U.S. government break into the iPhone 5c of Rizwan Farook, who along with his wife killed 14 people in a December shooting in San Bernardino, California, which the government has described as a terror attack.

The following is an explanation of the technology and the data privacy questions at issue.

Q. Why does the U.S. government need Apple’s help?

A. The government wants Apple to provide technical assistance to help it break into Farook’s phone. Apple’s mobile operating system encrypts virtually all of its data so that forensics experts cannot access e-mail, text messages, photos or other information unless they enter a password.

The phone requires two digital “keys” to unscramble the data: a passcode entered by the user when they want to use the device and a unique 256-bit AES key that is coded into the hardware during manufacture. The hardware key cannot be removed from the device, which prevents hackers from copying the contents of its hard drive and then cracking the passcode with the help of powerful computers.
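The point of combining the passcode with an unremovable hardware key can be sketched in a few lines. This is an illustrative simplification, not Apple's actual key-derivation scheme; the function name, iteration count, and sample values are invented for the example.

```python
import hashlib

def derive_unlock_key(passcode: str, hardware_uid: bytes) -> bytes:
    """Illustrative only: mixes the user's passcode with a device-unique
    hardware key, so the data-unscrambling key can be computed only on
    the device itself. (Not Apple's real algorithm.)"""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), hardware_uid, 100_000)

# Hypothetical 256-bit hardware key burned into the chip at manufacture.
device_uid = b"\x13" * 32

key = derive_unlock_key("1234", device_uid)
print(len(key))  # 32 bytes = 256 bits
```

Because the derived key depends on `hardware_uid`, an attacker who copies the encrypted storage off the phone still cannot reproduce the key on an outside computer, no matter how powerful.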

Apple’s iOS mobile operating system offers an auto-erase function that wipes the device after 10 failed attempts to unlock it. The government says it does not know whether Farook enabled that function, and it has not attempted to unlock the phone because it does not want to risk losing the data.

Q. What exactly does the government want Apple to do?

A. The government has asked Apple to create a new version of iOS that disables the auto-erase function. It also requested that the new software circumvent a feature that imposes delays of up to one hour once nine wrong passcodes have been entered – making it possible to break into the phone by the “brute force” method of trying millions of different passcodes. The government says it is possible for Apple to create software that will work only on the device used by Farook.
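Rough arithmetic shows why the delay feature, not the encryption itself, is the obstacle. The sketch below assumes a 4-digit passcode (10,000 combinations) and a hardware-enforced cost of roughly 80 milliseconds per attempt; the one-hour figure for escalated delays comes from the feature described above, and the exact numbers are illustrative.

```python
# Back-of-the-envelope brute-force timing under two scenarios.
combos = 10_000        # possible 4-digit passcodes
per_try_s = 0.08       # assumed hardware-enforced time per attempt (~80 ms)

# With delays disabled (what the government is asking for):
no_delay_hours = combos * per_try_s / 3600
print(f"without delays: ~{no_delay_hours:.2f} hours")

# With iOS's escalating delays, pessimistically one hour per attempt:
with_delay_years = combos * 3600 / (3600 * 24 * 365)
print(f"with one-hour delays: ~{with_delay_years:.1f} years")
```

Disabling the delay and auto-erase features turns an infeasible multi-year attack into one that finishes in well under a day, which is why the government's request centers on those two features.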

Q. What are Apple’s objections?

A. Apple says that such a tool would essentially create a “back door” that could be used by the FBI or others to break into any iPhone. Apple CEO Tim Cook, in a letter to customers, cited the possibility of the specially created software falling into the “wrong hands” and rejected the notion that it would only be used in this single case.

Cook also said that the move would establish a dangerous precedent. “The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge,” he said.

Q. Is Apple right?

A. It is not clear why Apple would worry about the specially created software being stolen or misused, since the work would take place in Apple’s labs and would presumably be no more subject to theft than any other Apple software. Apple is known for its strong security and there are no known incidents of its source code or cryptographic keys being stolen.

Further, the same technique would not work on devices launched after the 5c because they are equipped with a chip known as “Secure Enclave,” which helps encrypt data using both the password and a unique user ID that is provisioned during manufacturing and not known to Apple.

The bigger concern is the precedent. If Apple complied, it would mark the first time a software company created a tool to break into its own products in response to an order from law enforcement. Technology companies and privacy advocates fear an endless stream of similar requests – not just from the U.S. government, but also from foreign governments and even litigants in civil cases. Technologists are horrified by the very idea of deliberately creating software that undermines security.

Q. Why is practical compromise impossible here?

A. Apple is drawing a line in the sand. In its view, once it builds such a tool for one case it cannot credibly refuse the next request, so it is contesting the precedent itself rather than negotiating the particulars of this case.

Q. What information is the government seeking?

A. Prosecutors say they believe data on the phone could help determine who Farook and his wife Tashfeen Malik communicated with as they planned the shootings, where they traveled before and after the attack, and other details about the plot.

Q. Will all the data the government wants be on the device?

A. Not necessarily. Even if the government is right in its assumption that the phone was used to plan the attack, Farook may have used encrypted apps that wipe all evidence of communications. For example, Islamic State uses a certain mobile messaging service for propaganda and recruitment. The service allows the group to broadcast messages to large numbers of followers, then move to private, one-to-one encrypted messaging that likely cannot be retrieved by forensics experts.

Q. Are there similar issues with Android devices?

A. Smartphones powered by Google’s Android operating system offer a variety of encryption options, depending on the manufacturer and model. Forensic technicians can “bypass” passcodes on some of the devices, according to a November report by Manhattan’s district attorney. Google can remotely reset passcodes when served with a search warrant and an order instructing it to assist law enforcement in extracting data, allowing authorities to view the contents of a device.
