
Liability: The Legal Pitfall of Apple’s Unbeatable Encryption?

The last decade has been an eventful one for the internet privacy movement. Around the world, and certainly in the US, there has been a moral tug of war between citizens asserting their right to privacy and governments invoking concerns like national security, political stability, and societal standards.

As the world becomes more connected, the boundaries of privacy grow more contested and complex. Governments increasingly argue that nothing should be private when the protection of their citizens or values is at stake, while internet users see that position as a threat to their personal information and their right to privacy.

Is this the mark of an overreaching government, or a necessary evil? Is sacrificing our privacy really worth the perceived benefits of national security, or does the individual's right to privacy trump intrusion by the state? That debate is a long, dark corridor, and it's not one we'll walk down in this post.

An interesting question circulating the web right now is whether Apple's formidable encryption is setting the company up for legal trouble down the road. A law blog explored the topic in a post, now going viral, on just where a tech company stands in terms of liability in cases of cybercrime and terrorism.

Apple’s devices are among the best secured in the world. The company has taken a striking stance on its users’ privacy, and its software has been remarkably consistent in reflecting that commitment.

For starters, all iMessage and FaceTime traffic is encrypted end to end. The only way through that encryption is to unlock the device yourself with your own passcode. Apple retains neither the data nor the keys to access it, so the company “wouldn’t be able to comply with a wiretap order even if we wanted to,” according to its website.
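To make “end to end” concrete, here is a minimal sketch of the idea in Swift using Apple’s CryptoKit framework. It is an illustration of the general technique only, not Apple’s actual iMessage protocol (which is considerably more elaborate): each device holds a private key that never leaves it, both parties independently derive the same symmetric key, and whatever server sits in the middle only ever sees ciphertext.

```swift
import Foundation
import CryptoKit

do {
    // Each device generates its own key pair; private keys never leave the device.
    let alicePrivate = Curve25519.KeyAgreement.PrivateKey()
    let bobPrivate = Curve25519.KeyAgreement.PrivateKey()

    // Only public keys travel through the relay in the middle.
    let aliceShared = try alicePrivate.sharedSecretFromKeyAgreement(with: bobPrivate.publicKey)
    let bobShared = try bobPrivate.sharedSecretFromKeyAgreement(with: alicePrivate.publicKey)

    // Both sides derive the same symmetric message key from the shared secret.
    // (Demo salt only; a real protocol would manage salts and key rotation.)
    let salt = Data("demo-salt".utf8)
    let aliceKey = aliceShared.hkdfDerivedSymmetricKey(
        using: SHA256.self, salt: salt, sharedInfo: Data(), outputByteCount: 32)
    let bobKey = bobShared.hkdfDerivedSymmetricKey(
        using: SHA256.self, salt: salt, sharedInfo: Data(), outputByteCount: 32)

    // Alice seals the message; a relay (or a subpoenaed server) holds only this ciphertext.
    let sealed = try AES.GCM.seal(Data("Meet at noon".utf8), using: aliceKey)

    // Bob opens it with his independently derived copy of the key.
    let plaintext = try AES.GCM.open(sealed, using: bobKey)
    print(String(decoding: plaintext, as: UTF8.self)) // "Meet at noon"
} catch {
    print("Crypto error: \(error)")
}
```

The point of the sketch is structural: the party in the middle never holds anything it could surrender to a court, which is exactly the property Apple’s statement describes.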

The one exception, and a possible pitfall, is that Apple automatically backs up your conversations to iCloud so you can recover them if needed. That backup, however, is a setting you control completely; you can turn it off at any time.

Apple extends these privacy protections to its cloud services as well, using encryption and security keys. In transit and in storage, your data and files are encrypted. And when Apple uses a third party to store your information, its stated policy is to encrypt the data with a key it never hands over, so the storage provider holds only ciphertext it cannot read.
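As a rough illustration of that “encrypt before it leaves, keep the key” policy, the sketch below encrypts a file locally and hands only the sealed bytes to a storage provider. The BlobStore interface and names here are hypothetical, invented for the example; this is not Apple’s actual iCloud implementation.

```swift
import Foundation
import CryptoKit

// Hypothetical third-party storage interface: it receives opaque bytes, nothing more.
protocol BlobStore {
    func put(_ blob: Data, at path: String)
    func get(_ path: String) -> Data?
}

final class InMemoryStore: BlobStore {
    private var blobs: [String: Data] = [:]
    func put(_ blob: Data, at path: String) { blobs[path] = blob }
    func get(_ path: String) -> Data? { blobs[path] }
}

// The key is generated and kept on the client side; the store never sees it.
let key = SymmetricKey(size: .bits256)
let store: BlobStore = InMemoryStore()

do {
    // Encrypt locally, then upload only the sealed ciphertext.
    let document = Data("Quarterly numbers".utf8)
    let sealed = try AES.GCM.seal(document, using: key)
    store.put(sealed.combined!, at: "backups/doc1") // combined is non-nil with the default nonce

    // Later: fetch the ciphertext back and decrypt with the locally held key.
    if let blob = store.get("backups/doc1") {
        let box = try AES.GCM.SealedBox(combined: blob)
        let recovered = try AES.GCM.open(box, using: key)
        print(String(decoding: recovered, as: UTF8.self)) // "Quarterly numbers"
    }
} catch {
    print("Crypto error: \(error)")
}
```

Withholding the key is the design choice that matters: the provider can lose, leak, or be compelled to produce the blob, and in every case the data stays unreadable.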

Virtually every facet of the Apple user experience is encrypted, including the browser, webmail, and other app features. Users can even erase their ‘relationship’ with Siri and start from scratch, since the assistant stores voice recordings to learn what its user sounds like.

All of these features underpin the reputation Apple has developed for protecting its users’ privacy, even in the face of court orders and wiretap demands. It’s brilliant, it’s legendary, but could it land the company in hot water down the road?

Bloggers Zoe Bedell and Benjamin Wittes at the Lawfare blog tackled this question in their two-part series, ‘Civil Liability for End-to-End Encryption: Threat or Fantasy?’ The question on everyone’s mind: what if a terrorist attack succeeds because Apple’s encryption kept government surveillance out? Could the company be held liable for the damages suffered by victims of such an attack?

It’s unpleasant to imagine the online community’s champion of personal privacy falling prey to a lawsuit over that very attribute, but is it possible? The question goes far beyond what a Terms of Service contract can address; it becomes one of civic duty if lives are the cost of impenetrable encryption. Lawfare’s authors point out that the terrorist organization ISIS is known to have used social media services for recruitment. So what if that recruiting happened over an iPhone and resulted in an attack?

This hypothetical scenario appears to center on the Communications Assistance for Law Enforcement Act (CALEA), which says, in essence, that while a company must comply with court orders for records and logs on its users, it is not obligated to ensure that the information is decrypted unless it already possesses the means to do so.

It would seem that Apple’s hands-off encryption design could save it in future cases, since by its own account the company couldn’t decrypt the information if it tried. Analogous cases, however, have produced strikingly different verdicts. Lawfare compared the situation to lawsuits over crimes committed with legally sold guns, in which retailers have sometimes been found negligent and sometimes cleared.

Since it would be wholly unrealistic for Apple to screen every customer for criminal intent, a plaintiff would be hard-pressed to find a judge willing to hold the tech giant liable. And though the risk is now well known and documented, the fraction of Apple’s users who would turn its encryption to such ends is so small that a court is unlikely to find the company negligent for these practices.

The law appears to be on Apple’s side for now, and while we can hope these theories are never put to the test, there’s no telling how the litigation landscape will shift. With whistleblowers like Edward Snowden drawing attention to the reach of government surveillance programs, this is an issue we almost certainly haven’t seen the last of.
