Apple vs. the FBI: Is Your Privacy Really Worth It?

Is Privacy That Important?

Should we really be concerned about the privacy of our selfies when national security is at risk?


Never before have humans had a house, box, car, locker, or thing that couldn’t be searched somehow with the right Constitutional or un-Constitutional backing.

Until now.

Apple and other smartphone makers have finally built a product that can’t be opened even for a warranted search. The company that created it refuses to unlock it, and its encryption impedes the authorities further: after too many attempts to guess the passcode, the OS and memory data are scrambled and possibly destroyed beyond readability.
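That failed-attempt lockout can be sketched in a few lines. This is a simplified, hypothetical model: the real iOS behavior is enforced in hardware by the Secure Enclave, and every name and number below is illustrative, not Apple's actual implementation.

```python
# Sketch of an escalating-delay lockout with a wipe-after-10 policy,
# similar in spirit to the iOS "Erase Data" option. All names and
# numbers here are illustrative assumptions.

# Delay (in seconds) imposed after the Nth consecutive failed attempt.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}
WIPE_AFTER = 10  # with erase-on-failure enabled, key material is destroyed


def handle_attempt(state, passcode, correct):
    """Process one passcode attempt; return (unlocked, wiped, delay_seconds)."""
    if passcode == correct:
        state["failures"] = 0
        return True, False, 0
    state["failures"] += 1
    if state["failures"] >= WIPE_AFTER:
        return False, True, 0  # data rendered unreadable
    return False, False, DELAYS.get(state["failures"], 0)


# A few wrong guesses rack up failures; the tenth would trigger the wipe.
state = {"failures": 0}
for guess in ["0000", "1111", "1234"]:
    unlocked, wiped, delay = handle_attempt(state, guess, "4821")
```

The point of the design is that the delays and the wipe live below the operating system's reach, which is exactly why the FBI's request requires Apple to ship modified software rather than simply "try harder."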

After the San Bernardino attack and the FBI’s pleading for Apple to build “back door” programming into its operating system, Apple is defending the right of that product to remain locked and the data therein destroyed after too many wrong entry codes. Period.

Even if the product is returned to Apple to unlock the mechanism, the privacy issues remain, and there is no skeleton key the company can give the F.B.I. to simply punch in a password and read possibly pertinent data.

So what’s the issue? Does “back door” programming really threaten the security of customers? And if so, who cares?

Your school, church, bank, phone, computer, internet, and utility records can all be searched at any time by the right subpoena or hacker. Every public thing about you is researchable, and with the right private investigator and IT set of know-how (legal or illegal), nothing is hidden. Almost.

Yet still we have unsolved crimes, terrorist attacks, and cheating hearts out there, mocking modern technology’s reach to simply unlock every box.

So the question for me remains: what’s the big deal if Apple or Android or some kid builds an operating system that is friendlier to the “good guys” fighting crime and saving lives?

In America, in a capitalist democracy, and in our modern version of a respectable republic, it seems that we just don’t do that. We don’t sell out our neighbors just because. We don’t give away private information willy-nilly. We don’t allow companies and governments to be all powerful and omniscient.

Except that we do.

Your data trail is enormous. There’s more proof that you exist and were wherever you were yesterday than there is to prove that Jesus, Homer, or Shakespeare existed. You and I willingly give Facebook, Twitter, and the government—or some app that some kid sold for a billion dollars last quarter—more private information in a day than is necessary for a lifetime of any human in any generation since the dawn of time.

So why is Apple protecting the right of terrorists to have their phones kept locked? Are our selfies that important? Are my emails that sacred? Is your latest Vine that sacrosanct?

It’s a fine line, and regardless of your adamant answer, there’s some kid half your age who can write code to defeat the failed-attempt lockout enough times to spin the wheel on possible passcodes and then unlock your phone in the time it takes you to figure out how to set your D.V.R. to tape that Salinger documentary or Mr. Robot reruns.
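For a sense of why that lockout matters, the brute-force arithmetic is simple once the delays are out of the way. A quick sketch, where the per-attempt timing is an illustrative assumption, not a measured figure:

```python
# Back-of-the-envelope brute-force time for numeric passcodes, assuming the
# failed-attempt delays and wipe have been bypassed. The 80 ms per try is
# an illustrative guess at per-attempt cost, not a measured number.

SECONDS_PER_TRY = 0.08


def worst_case_hours(digits):
    """Hours to exhaust every possible passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_TRY / 3600


# 4 digits: well under an hour; 6 digits: about a day; 10 digits: decades.
for d in (4, 6, 10):
    print(d, round(worst_case_hours(d), 2))
```

Which is the whole argument in miniature: without the lockout, a short numeric passcode is no barrier at all, and with it, even a weak passcode holds.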

There is also a story not being told in the media that, in time, will be uncovered. Among the tech companies there are wheels within wheels that intersect the same wheels and platforms for the NSA, FBI, and CIA. We’ve all seen this movie a few times, with different plot twists and endings.

So here’s to liberty! And here’s to all the strange, unbreakable things we’ll build as a society that we can’t even dream of in the present moment.

Here is Tim Cook’s letter, in whole:

February 16, 2016

A Message to Our Customers

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook


-Photo: Televisione Streaming/Flickr

About Jeremy McKeen

Bergamot Ink Columnist Jeremy McKeen is a high school English teacher, coach, musician, and father of three. In addition to his writings on The Good Men Project, he is also a Lead Editor. He has been featured on and written for Sammiches & Psych Meds, Ravishly, YourTango, Scary Mommy, BLUNTMoms, Yahoo! Parenting, HuffPo Parents, The Motherish, MockMom, The Gloucester Clam, Take Magazine, and maintains his own site, Nerdy Dad Shirt. You can find him on Facebook and Twitter.


  1. Richard Aubrey says:

    J. G.
    You’re imagining the future. Thing is, the future is now. See the IRS. A woman in Texas started a group called True The Vote. She was audited by the IRS and visited by the DEA, ATF, OSHA, and FBI. Democrats call this a coincidence.

  2. “Never hand a man a gun unless you know which way he’s going to point it.”

    And the Federal government is an awfully big gun. You might trust the man holding it now, but that will change. Who knows what leaders the future will bring, or what they will do with that power?

    Actually, we don’t have to go back all that far in history to get some ideas. Nixon’s Plumbers got all the way to break-ins at the Watergate Hotel to spy on others. Imagine the fun an unscrupulous executive could have with intimate details of their targets’ lives easily downloadable in electronic form!

  3. My privacy is more important than your whim.

  4. It comes down to two things:
    1) They (and SV at large) are uncomfortable announcing that, actually, it’s surprisingly easy based on password recovery and heuristics of what people actually use for passwords.
    2) SV is covered with people who are gleaming with the sweat of their own hubris, in many cases much to their own detriment. I can’t count the number of times I spoke with various companies about consulting on projects, and nearly all of them had a glee of “we do everything ourselves.”

    In Apple’s case, it comes down to its Steve legacy and its sanctioned mantra: We’re Apple. WE make the Mac / iPhone, you don’t, so shut up because you don’t know what the @!#$ you’re talking about. And subsequently: we’re here to make A LOT of money. And we’re not sharing, and that doesn’t include two weeks’ worth of a 5-man engineering team and 1 PM.

    This actually leads to the conclusion that security isn’t really “secure”, we are simply led to believe it is.

    Apple: 12M records in 2012

    “Freedom” or “Liberty” of anything you freely give to them isn’t about either one of those.
    It is strictly about what Pink Floyd lamented the evilness of.

  5. Accessing a secure device unfortunately has some similarities to getting inside a home. Doors don’t discriminate between good guys and bad guys. They open for anyone who has a key, regardless of whether it actually belongs to them or if it was found under the doormat.

    I do wish people would stop claiming that Apple is “defending the right to privacy of two terrorists”, because it is much more accurate to say that Apple is defending the right to privacy of all of its mobile/tablet customers, two of whom committed acts of terrorism.

    My biggest question for the FBI proponents is this: What’s to stop another Edward Snowden from waltzing out the building with this backdoored-iOS on a thumb drive?
