In Part 1 of his essay, Liam Day traced the history of the data revolution from the factory floor to baseball’s front offices and beyond. In Part 2, he examines the fear of it.
Data’s era coincides with the computer’s. The relationship isn’t one of correlation but of causation. One can’t execute thousands of daily trades based on quantitative algorithms without high-speed machines. Nor is it likely that sabermetricians would have the wherewithal to capture and parse all the data they do without computers.
The fear of data, then, a fear I referred to in the first part of this essay, is compounded by a fear of technology, itself a subset of that larger fear to which I also referred last week, and to which all of us, old and young, are susceptible: fear of the new, of new ways of doing business, of new ways of playing and watching the games our fathers loved.
Scientific and technological advances have always met resistance. Galileo was persecuted by the Catholic Church for daring to challenge the notion of geocentrism. The mechanization of the workplace as a result of industrialization was fought by the Luddites, who would sabotage factories by wrecking the machinery in them.
Heck, Darwin’s theory of evolution remains controversial some 150 years after the publication of On the Origin of Species. Just last year 46 percent of respondents in a Gallup poll said they believe in Creationism.
What lies at the heart of this resistance is something else I alluded to last week—the loss of individual identity—for with every scientific discovery human beings were pushed further and further to the margins of a creation of which we had once believed we were the crown jewel. Yahweh created Adam for companionship. Adam even played a role in the process of creation by naming all the plants and animals in the Garden of Eden.
But if Earth is just the third planet revolving around a nondescript star, one of billions of other stars comprising a galaxy that’s merely one among billions of galaxies, honestly how high could we rate in God’s estimation of the universe He had created? To top it off, it turns out that He didn’t create us from clay, but that we evolved, as all life on this planet has, from a different species. It turns out we aren’t so special after all.
Similarly, with industrial progress, artisans were marginalized, reduced to the status of laborers, essentially unskilled and interchangeable. Today, workers aren’t so much reduced to a lower status as replaced altogether. Achieving six sigma, which statistically represents a 99.99966% rate of defect-free production, isn’t possible when human hands are making the products. Such a rate of success is achievable, as six sigma adherents will tell you, only by eliminating process variation. Human beings represent nothing if not process variation.
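For readers curious where that 99.99966% figure comes from: the six sigma convention assumes a process whose mean drifts 1.5 standard deviations over the long term, so a six-sigma specification limit leaves 4.5 sigma of headroom, which works out to about 3.4 defects per million. A quick sketch, my own illustration rather than anything from the six sigma literature:

```python
import math

def normal_sf(x: float) -> float:
    """Survival function of the standard normal: P(Z > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Six sigma convention: a 6-sigma spec limit, minus an assumed
# 1.5-sigma long-term drift of the process mean, leaves 4.5 sigma.
defects_per_million = normal_sf(6.0 - 1.5) * 1_000_000
yield_pct = 100.0 * (1.0 - defects_per_million / 1_000_000)

print(f"{defects_per_million:.1f} defects per million")  # ~3.4
print(f"{yield_pct:.5f}% defect-free")                   # ~99.99966%
```

A strict six standard deviations with no drift would give roughly one defect per billion; the familiar 3.4-per-million figure depends entirely on that assumed 1.5-sigma shift.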
Even at the upper end of the economic food chain, the impact of technology and data has been felt. As I also mentioned in the first part of this essay, Tom Wolfe’s Masters of the Universe, who were once the princes of Wall Street, have been replaced by quants who execute thousands of trades daily on slim margins with the help of high-speed computers, which can do the work of hundreds of traders.
When the data revolution swept through Major League Baseball’s front offices, the battle line, at least as Aaron Sorkin drew it in his screenplay for Moneyball, ran between the new sabermetricians and the older scouts, whose subjective judgments, on which general managers had heretofore relied, would no longer be needed to evaluate players.
This taps directly into one of the fears that has historically defined masculinity: obsolescence, the fear of no longer being able to provide for one’s family. I can hardly imagine what it feels like to have spent 30 years of a professional life exercising a single skill, even, perhaps, refining that skill to near perfection, only to have it slowly dawn on me that my skill is no longer needed in the workplace. No one wants to be Willy Loman.
But this is about more than a paycheck. It’s also about control. What’s most frightening during times of economic uncertainty, which we’ve experienced going on half a decade now, is the feeling that so much of our fate lies outside of our control. Economic forces, and the statistics economists, politicians, and news commentators use to describe them, seem so abstract as to be almost otherworldly.
In Cinderella Man, the biopic of Depression-era boxer James J. Braddock, Braddock, whose career is on its last legs due to injuries and a string of losses, decides to get back in the ring one more time because he can’t afford to keep the heat and electricity on in the apartment his family occupies. The intermittent shifts he’s picking up on the docks as a longshoreman don’t bring in enough income. His wife, who doesn’t want to watch her husband get hurt or, worse, killed in the ring, is opposed to the idea of him fighting again. As he tells her, though, “At least [in the ring] I know who’s hittin’ me.”
The lack of control goes beyond the fear of creative destruction, though. It extends to the work itself. As I’ve written before on The Good Men Project, Studs Terkel, in the introduction to his oral history, Working, says the phrase that pervades the interviews in his book is “more or less”, which, he says, reflects the ambiguity most people have toward their jobs. “Something more than Orwellian acceptance, something less than Luddite sabotage.”
The people in Working who speak to Terkel about their jobs in the most positive terms are those who work with their hands—the stonemason in Indiana, the piano tuner in Chicago. Not only do those workers have control of the task at hand, but they can see it through to completion. Yes, the stonemason may only be building part of a wall and not all of it, but as he said to Terkel, “I can’t imagine a job where you go home and maybe go by a year later and you don’t know what you’ve done. My work, I can see what I did the first day I started. All my work is set right out there in the open and I can look at it as I go by.”
Work is, then, more than just a paycheck, as important as a paycheck may be for the male psyche. It is a search for meaning. The other refrain that, in his introduction, Terkel says pervades the conversations he had with blue- and white-collar workers alike is, “I’m a robot.” No one wants to do the same thing over and over again day after day, even if it’s statistically true that that one thing is the efficient thing to do. Those three little words perfectly encapsulate the mix of fear of data and fear of technology that has driven the backlash against the data revolution as it has swept through each new field.
Ironically, the engine driving the data revolution—Silicon Valley—was and remains a refuge of libertarian thought. Fairchild Semiconductor, founded in 1957, wasn’t just a new company, it was a new culture. The eight men who comprised the original company eschewed ties, titles, offices, and assigned parking spaces. They would spark not just a revolution in technology, but a revolution against the era’s prevailing corporate culture, which was best described in William Whyte’s The Organization Man.
Fairchild Semiconductor was no less a part of the social and political upheaval of the 1960s than were Abbie Hoffman and the Chicago Seven; the decade was as much a revolution against the homogenization of life as against the political elites who governed that homogeneous world. The companies Fairchild spawned, most notably Intel, spread its libertarian business philosophy throughout the tech world, and it filters down to us today in the forms of Google and Facebook, two companies simultaneously famous for their casual work environments and for their collection and use of vast amounts of data.
In some senses, then, it could be argued we’ve come full circle. William Whyte wrote his diagnosis of America’s corporate culture during the 1950s precisely because he feared the shift from individual initiative to a belief in society’s perfectibility through collective means. Fairchild Semiconductor was founded for many of the same reasons.
Now here we stand, and data, the aggregation of millions of individual choices used, in turn, to shape those very choices, reigns supreme. The company man may wear flip-flops and jeans instead of wing-tips and three-piece suits, but he’s still a company man.
This, I suspect, is how those who resist the data revolution see it. But are they right? Does the use of data to improve performance strip us of our individuality? Does its application in the workplace remove the artistry from our daily tasks, reducing our lives from fiction’s astonishment to non-fiction’s plodding analysis?
What are the next fields data will conquer? Healthcare and education are clearly in its crosshairs, though these worlds are hardly strangers to data as it is. Surgical quality improvement has been practiced for at least two decades, and the mandatory testing implemented as part of President George W. Bush’s No Child Left Behind legislation has produced a decade’s worth of data for states and local districts to use to evaluate schools. The fights are now over how, or even whether, to use that data to determine payments to doctors or to evaluate teachers. And, unsurprisingly, the opposition to new uses of data in these fields derives from the very same fears that have driven opposition to data at every other step along the way.
Death panels. Two words used by everyone’s favorite former governor of Alaska to describe the Independent Payment Advisory Board (IPAB), created as part of the Patient Protection and Affordable Care Act, otherwise known as Obamacare. They summed up the opposition to the use of data to determine payments to doctors during the debate that raged over healthcare reform.
But just as with Joe Morgan and Joe Scarborough, whose respective fears of sabermetrics and Nate Silver, as I pointed out last time, led them to say pretty stupid things, and just as with the three words—”I’m a robot.”—that recur throughout Studs Terkel’s conversations with the subjects of his book, Sarah Palin’s use of “death panels” to describe the IPAB comes from a genuine fear. It is, I hope, obvious by now that this is a near-universal fear of data, particularly of data wedded to technology and used to make decisions about our lives that seem to be beyond our control.
That’s not to say there wasn’t a large dollop of opportunism mixed in with the fear Governor Palin both evoked and exploited. It’s merely to state that the fear was genuine.
Likewise, much of the opposition to the use of data in public education derives from the same fear. No one wants to be thought of, or have their son or daughter thought of, as a cog in an educational machine. For the past half-century a number of educational reformers have attempted to throw off the industrial model of education that emerged, alongside Henry Ford’s assembly line, from the one-room schoolhouses of quaint historical memory. The Montessori method, which is almost as old as America’s industrial model of education but which didn’t take root here until the 1960s, is probably the most notable attempt.
For reformers, the advent of universal testing is a huge blow to the dream of an educational system that can be tailored to meet the individual needs of students. It has also been seen by many as an assault on the prerogative of teachers, who once ruled their individual classrooms as feudal lords ruled their fiefdoms.
In no other field, perhaps, does the debate between science and art rage more fiercely than it does in education, for if teaching is, truly, an art, then the teacher remains unassailable, because how does one measure whether one artist is better than another? Who’s to say whether Homer or Virgil is the better epic poet, Michelangelo or Da Vinci the better painter, Kubrick or Spielberg the better director?
Gilbert Highet wrote The Art of Teaching in 1950 and it remains required reading today at many teachers colleges. The title itself is attractive to authors of pedagogical tomes—The Art of Teaching Reading, The Art of Teaching Writing, The Art of Teaching Art, The Art of Teaching Piano—all of which you can find on Amazon.
But is teaching really an art? Even if it is, does that mean there aren’t basic techniques that form the art’s basis? Figuration and perspective may have gone out with Abstract Expressionism, but that doesn’t mean Jackson Pollock and Willem de Kooning didn’t know how to draw.
In his book, Teach Like a Champion, Doug Lemov addresses the concept of technique, as opposed to strategy, as it relates to the art of teaching. “I’ve tried to write this book to help artisans be artists, not because I think the work of teaching can be mechanized or made formulaic,” he writes. It is in the proper application of the techniques, particularly in the application of the right technique in the right situation, that the teacher’s art lies.
In other words, art lies in developing the right strategies. At the point of implementation, process takes over, and refining that process to six sigma should never be considered a bad thing. As Lemov goes on to say, “Great teaching is no less great because the teacher mastered specific skills systematically than is David a lesser reflection of Michelangelo’s genius because Michelangelo mastered the grammar of the chisel before he created the statue.”
In the athletic realm, even the most mechanistic of sports, such as golf, can’t be reduced to mere technique.
Yes, Tiger Woods will take hundreds of swings every day, in what might be called the golfer’s version of eliminating process variation, but decisions about whether to fade or draw the ball, where to land it, and how much backspin to put on it have nothing to do with technique and everything to do with the intangible that defines so many of the athletes we call great—feel.
In the end, data shouldn’t detract from the games we play and watch. Data should only enhance our enjoyment of them, and, more importantly, the arguments we have with fathers and friends at bars and tailgates. After all, data has always been the language we use to talk about sports, by which we compare athletes across generations.
Who was better? Willie Mays or Babe Ruth? Roger Clemens or Walter Johnson? Peyton Manning or Dan Marino? Tom Brady or Joe Montana? Michael Jordan or Oscar Robertson? To settle these debates, we go to the numbers. What the sabermetricians have shown us is only that we’ve been using the wrong numbers. Nothing more.
Photo: AP/Nam Y. Huh