Andrew Ladd talks mobile tech and privacy in his review of Robert Vamosi’s When Gadgets Betray Us.
Last month I gave in to temptation and bought an iPad 2. No big surprise; after a year of coveting the first iPad, nothing was going to stop me from crawling into bed with its vastly slimmer new sibling. The flame of my consumerist lust, though, was snuffed out before I even opened the box: when I got home from Best Buy I found an unsolicited copy of Robert Vamosi’s new book, When Gadgets Betray Us (Basic Books, $26.99), waiting on my doorstep. Talk about bad omens!
The book’s main premise, if you can’t guess from the title, is that our relationship with gadgets is a risky one. Also no big surprise; Vamosi is a security consultant and tech journalist by day, and Gadgets essentially rehashes his years of magazine work documenting failures of technology into half a dozen vaguely thematic chapters. Occasionally these meander, and it’s a little too easy to get lost in his pages of acronyms and bizarre technobabble (“piconet”? “yagi antenna”? “bluesnarfing”?!), but I don’t want to sell him short: he’s clearly done a lot of research, and his message about it all is important, too. Sometimes our gadgets really can betray us.
Take the current news stories about the iPhone. If you haven’t heard, programmers have recently discovered that the devices not only track their users’ locations but log the information indefinitely and back it up, unencrypted, whenever synced with iTunes. That means anybody with access to your phone or computer can generate a map of everywhere you’ve been since activating your cell service, with very little effort.
And that seems quaint compared to the 24-like stories Vamosi relates. Apparently it’s possible for hackers to access your car’s onboard computer remotely and disable its engine while you’re driving. (Yes, really!) Or they can access your home wireless network, and all the devices on it, through a Web-enabled DVR. Or they can surreptitiously download all the contacts, text messages, and call records on your phone through a simple Bluetooth loophole.
Vamosi’s main criticism, however, is not really that our increasingly connected gadgets, such as the Web-enabled DVR, are more vulnerable to such attacks, or even that they collect a lot of information about us—in one chapter he practically drools over the potential applications of so much data. His complaint, rather, is that the user has no idea any of these things might be going on. If you know your iPhone is logging your every move, you can take steps to stop it, or at least to adequately protect the relevant files. It’s dangerous only because it’s hidden.
The problem is, gadget-makers are trying so hard to create an out-of-the-box, user-friendly experience—and consumers are so vocally demanding it—that the trend in technology is toward hiding more functions, not fewer. And reduced security, Vamosi says, is the price we pay for that ease of use. Can you imagine if all your iPhone’s settings and data were unlocked for you? Maybe people wouldn’t be able to track your every move, but you’d get a headache just looking at the thing. (I get a headache trying to change the settings on mine already.)
Besides, allowing greater user control doesn’t necessarily make gadgets safer: the more settings are customizable, the harder it is for developers to test all possible configurations—which quickly spiral into the millions—and the more potential there is for inadvertently opening up security holes. That’s why Microsoft and Apple are always releasing patches.
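The arithmetic behind that "millions" claim is easy to check: independent settings multiply, so even a modest number of options explodes combinatorially. A quick back-of-the-envelope sketch (the numbers here are hypothetical, not from Vamosi's book):

```python
# Back-of-the-envelope: how quickly configurable settings multiply.
# Hypothetical gadget: 21 on/off toggles plus three multi-valued options.

binary_toggles = 21          # each toggle can be on or off
multi_options = [3, 4, 5]    # settings with 3, 4, and 5 possible values

configs = 2 ** binary_toggles
for choices in multi_options:
    configs *= choices

print(configs)  # 2**21 * 3 * 4 * 5 = 125,829,120 distinct configurations
```

Just two dozen settings yield over a hundred million combinations, which is why exhaustively testing every configuration is out of reach for developers.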
So where does this leave us? Should we just give up on our gadgets altogether? Hardly, says Vamosi. His solution is to make security the starting point for any electronic system, rather than a frill that gets tacked on as an afterthought. Don’t build the computer and then work out how to attach the firewall; build the firewall and then work out how to attach the computer.
Because if all gadgets were built primarily to secure a clearly defined set of data, says Vamosi, and only secondarily to make calls or connect to the Internet or whatever, there would be very few opportunities for valuable information to be inadvertently lost or stolen. Meanwhile, whatever data we did collect could be made available to authorized researchers and organizations, which would give us greater insight into how people use technology, and thus let us improve gadgets even further.
That’s a technical writer’s conclusion, though, and one that ultimately leaves me a little disappointed. For a start it ignores the fact that authorized researchers in authorized organizations can harbor criminal intent just as easily as criminals—and that even secured data is only as secure as the people using it. If you forget to log out of your Facebook one day, even the most elaborate password won’t stop your friends from posting embarrassing status updates. Better security just shifts the locus of danger elsewhere.
In fact, there might be some value to securing less of our personal data. Privacy, at least the way we think about it today, is a novel idea; even a few hundred years ago nobody would have been that surprised to hear that their every move was being monitored. If you live in a small town today, you probably still wouldn’t be surprised, because in limited settings with small populations it’s easy and common to keep tabs on people even with no technology whatsoever. (Think of retirees who spend their whole day by the window and can tell you exactly how their neighbors spend their time.)
So sites like Facebook, and phones that track their users’ every move, bring even those of us who live in big, anonymous cities closer to the old-fashioned paradigm of a community where someone is always watching. And as any sociologist will tell you, for most people even the knowledge that they might be seen doing something unsavory is enough to stop them from trying. That’s why, generally speaking, small communities have less crime than large ones—and why a world of open data might well be one where we all act a little better toward each other.
There would be potential problems, too, of course, and some data should still be secured no matter what—nuclear launch codes, say. But building better electronic boxes only makes it easier for us to retreat blithely into our clouds without addressing the consequences of doing so. Frankly, I’m not so comfortable with that future. But that doesn’t mean I’m giving up my iPad.