March 8, 2016

Three Critical Cybersecurity Lessons for Your Business from the Apple-DOJ Fight

Posted in Cybersecurity, Employment Law by Gene Killian

On December 2, 2015, Syed Rizwan Farook and Tashfeen Malik, a married couple, committed a horrific terrorist attack at the Inland Regional Center in San Bernardino, CA. They were both later killed in a shootout with police. Acting on a search warrant, the police discovered Farook’s iPhone in his car. Naturally, government investigators wanted to access the iPhone’s contents, to see whether information existed that would assist in a broader criminal investigation. But iPhones protect their password-protected contents with both hardware-based and software-based encryption. These protections essentially “lock” the device to outsiders, preventing everyone (including Apple) from accessing the encrypted data without the passcode designated by the phone’s user.
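To make that “lock” concrete, the sketch below shows the general idea of passcode-derived encryption keys, using only Python’s standard library. This is a simplified illustration, not Apple’s actual design: real iPhones also entangle the passcode with a key baked into the device’s hardware, so the derivation can only happen on the phone itself, and the names here are my own.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Derive a 256-bit encryption key from a user passcode.

    A deliberately slow key-derivation function (PBKDF2 with a high
    iteration count) makes brute-forcing short passcodes expensive.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000_000)

salt = os.urandom(16)           # stored on the device; random but not secret
key = derive_key("1234", salt)  # only the user knows the passcode

# Data encrypted under `key` is unreadable without re-deriving the key,
# and the key cannot be re-derived without the original passcode; that
# is why even Apple cannot simply hand over the phone's contents.
```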

The FBI wants to force Apple to design and implement a “back door” to Farook’s iPhone, to get past the password protection. A Federal Magistrate has ordered Apple to design the necessary software at the government’s expense, but Apple has appealed that decision to the District Court Judge. The case largely involves the interpretation of the arcane and somewhat ambiguous “All Writs Act,” codified at 28 U.S.C. § 1651, which authorizes federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” But, in its papers filed with the Court, Apple defines the broader cybersecurity issues as follows:

“[I]f Apple can be forced to write code in this case to bypass security features and create new accessibility, what is to stop the government from demanding that Apple write code to turn on the microphone in aid of government surveillance, activate the video camera, surreptitiously record conversations, or turn on location services to track the phone’s user? Nothing.…Any back door enabling government officials to obtain encrypted data would also create a vulnerability that could be exploited by criminals and foreign agents, weakening critical security protections in creating new and unforeseen access to private information.”

(Meanwhile, a Federal Magistrate in Brooklyn, faced with a similar request from the government in a drug trafficking case, has now held that Apple does not have an obligation to design a “back door,” writing that none of the relevant factors “justifies imposing on Apple the obligation to assist the government’s investigation against its will.”)  

Here are three observations about the Apple case in California that I think are critical for businesses to consider:

  1. In the movie Apollo 13, Ed Harris’s character, faced with the fact that certain necessary cartridges on the spacecraft wouldn’t fit in the necessary ports, sarcastically remarked: “Tell me this isn’t a government operation.” In the Apple case, the FBI appears to have stupidly created its own problem by inexplicably changing the iCloud password relating to the attackers’ account, preventing the phone from initiating an automatic iCloud backup of the data – which would have obviated the need for any “back door.” The FBI didn’t consult with Apple before taking this step, which FBI Director Comey now admits was a mistake. The analogy here is that it’s remarkably easy for employees to do dumb things in Cyber-world, such as mindlessly clicking on an e-mail link or falling victim to a phishing scheme. (The sketch after this list shows a few of the telltale signs a phishing link gives away.) The costs to your business can be disastrous. Do your employees know how to recognize and avoid so-called social engineering scams? Click here for a useful memo from Microsoft on the subject.
  2. The idea that this supposedly locked-down, secure iPhone would have backed itself up to iCloud – but for the FBI monkeying around with the codes – leads to a second point: Very few things in Cyber-world are totally secure. (We once had a client who refused to use e-mail because, he said, the “e” stands for “evidence.”) Click here for an interesting article on improving operational security. Here’s a simple, low-tech tenet that one of my former law partners was very fond of preaching: If you wouldn’t want to see it on the front page of the New York Times, be very careful about putting it in writing.
  3. The iPhone involved in this case was not owned by the terrorist Farook but by his employer, the San Bernardino County Department of Public Health. The SBCDPH gave its consent to a search of the iPhone – but the security code had been created by the employee, Farook, not by the SBCDPH, and was never disclosed to the SBCDPH. That should NEVER happen on an employer-provided device. Your employees should be advised, in writing, that they have no expectation of privacy in any employer-provided device, and that all such devices can be searched at any time. They should also be advised, in writing, that all security codes relating to employer-provided electronic devices MUST be provided to the employer, or discipline will result. And managers should periodically check those devices to ensure compliance.
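As promised in point 1, here is a deliberately naive sketch of the kind of telltale signs a phishing link gives away. The TRUSTED_DOMAINS list and the heuristics are invented for illustration; real protection comes from employee training plus maintained filtering tools, not a dozen lines of Python.

```python
from urllib.parse import urlparse

# Hypothetical allow-list for illustration only; a real deployment would
# use your organization's actual domains and a maintained threat feed.
TRUSTED_DOMAINS = {"example.com", "microsoft.com"}

def looks_suspicious(url: str) -> bool:
    """Apply rough phishing heuristics to a link. Illustrative only."""
    host = (urlparse(url).hostname or "").lower()

    # Credentials embedded in the URL ("user@host") and punycode
    # lookalike domains ("xn--") are classic phishing tells.
    if "@" in url or "xn--" in host:
        return True

    # So is a bare numeric IP address in place of a domain name.
    if host.replace(".", "").isdigit():
        return True

    # A host that merely *contains* a trusted name is not that name:
    # "microsoft.com.account-verify.net" is not "microsoft.com".
    return not any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS)

print(looks_suspicious("http://microsoft.com.account-verify.net/login"))  # True
print(looks_suspicious("https://www.microsoft.com/security"))             # False
```

Note the last heuristic in particular: scammers count on people reading left to right and stopping at the familiar brand name.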