Historian Oliver Lee Bateman examines our obsession with lists, facts, and clickbait headlines.
“The world,” wrote Ludwig Wittgenstein, “is the totality of facts, not of things.” The year in review, by contrast, is not even the totality of facts–it is just a bunch of them, selected more or less at random, that purport to provide the reader with a panoramic view of the previous 365 days. And I suppose it serves a function: easy-to-write filler copy at a time of year when most journalists are taking it easy, battling with their significant others over the holidays as well as with the various illnesses that accompany the onset of winter.
Mind you, I’ve participated in two of these myself. They weren’t bad, all things considered, as I struggled to make manifest my criticisms of the form by working with comedienne Susie Meister to parody it. But such an effort does little to undermine the tyranny of the year in review, the bête noire of the professional historian. My training, such as it is, enables me to familiarize students with interpretations and to equip them to make arguments. In this respect, the fact that “selfie” was Oxford Dictionaries’ 2013 Word of the Year, or that Amazon.com was BusinessWeek’s 2005 Merchant of the Year, amounts to nothing more than unhelpful intellectual clutter. When the great narrative of the 21st century is written–provided, of course, that narratives and writing survive into the 22nd–will its authors single out the prevalence of “selfies” in 2013 and Amazon.com’s towering retail reputation as the cynosures for anyone wishing to understand this period? Or, to quote the rapper Paul Barman, “Will Amazon dot com be around when grandma’s gone, mom?”
In an age when almost everything is clickbait and few concepts are held in the mind longer than however many parsecs it takes to get enraged about them, the year in review is a necessary evil. Will the leading story of 2014 be that some ignorant and possibly well-meaning fool wrote a short blog post about the racial dynamics at her yoga studio? Will it be the even more absurd fact that over five hundred thousand words (a conservative estimate) were spilled in response?
In the absence of a year in review, is anyone likely to remember that Girls was once pilloried in the outrage blogosphere for racial insensitivity? I mean, that contretemps occurred so long ago (2012) that even I had forgotten I had written something about it (which, apparently, I did). Not that I’d know that or you’d know that, because, hell, we’re both probably in the process of forgetting this essay, which is unlikely to capture either of our limited attention spans for more than a handful of sentences, given that there’s no link to a catchy video or a tone-deaf, matter-of-fact claim about some current crisis appearing no more than two lines under the header (which itself is hopefully tone-deaf and clickbait-y).
The students, bless their innocent little hearts, show up in my freshman U.S. History from 1865 to the Present class having spent twelve years responding to questions such as this one:
The Cree Indians subsisted primarily on a diet of:
c) Chicory root
The answer to that multiple-choice query, says whatever state board of education happens to be administering this all-important End of Grade (“EOG”) examination, provides compelling proof of whether “our children is learning.” If they marked choice C, they’re prepared to enter the world, and we wise stewards of the future can take solace in their ability to bubble in one answer or another vis-à-vis some terms that appeared months earlier on a study guide. This is the year in review as assessment of learning; clearly, a person who can’t distinguish between the Battle of Midway and the Battle of Guadalcanal isn’t fit to work at the fast-paced call centers and Wal-Marts that are representative of the rewarding white-collar positions available to everyone in this 21st-century “Knowledge Economy.”
But maybe I’m wrong in my condemnation of this sort of thing, or at least wrong to equate years in review with EOG exams. Heck, maybe both provide the valuable service of compressing our complicated epistêmê into a few choice and easily regurgitated sound bites. Maybe this really is the future. After all, even I recently leveled a mighty blow against obsessive devotion to a single activity, explaining why I had turned my back on the massive time-suck that is DOTA 2.
Perhaps these bits and pieces–blog entries skimmed hastily or not at all, online education courses consisting of hundreds of tedious but minimally taxing assignments, year-in-review entries that capture almost nothing about the lived past–are what we are hard-wired to seek out. As of this sentence, I’m at 805 words and counting…and what intrepid reader could be expected to progress this far when there are so many YouTube videos to half-watch while simultaneously skimming 140-character “@” tweets from a friend?
In a short piece in The New Yorker, Maria Konnikova profiled author Jonah Berger, who has written extensively about why certain posts “go viral” on the Internet.
We share what we’re thinking about—and we think about the things we can remember. This facet of sharing helps explain the appeal of list-type stories (which I wrote about in detail last month), as well as stories that stick in your mind because they are bizarre. Lists also get shared because of another feature that Berger often finds successful: the promise of practical value. “We see top-ten lists on Buzzfeed and the like all the time,” he notes. “It allows people to feel like there’s a nice packet of useful information that they can share with others.”
“Useful information,” eh? It’s all useful information in some sense, I guess. I struggle as a history professor because I take so much for granted (or “for granite,” as my students, or at least the auto-correct features on their devices, occasionally write). How can anyone benefit from a lecture on the Gilded Age without having some hard core of factual information about the notables/quotables/potables of that period? I mean, if you make reference to the Grant Administration and your listeners have no clue what that is, you have to step back and offer some kind of in-a-nutshell explanation. But in the course of giving this explanation, you might say something about Grant that is itself unknown to the audience, whereupon you’ve got to summarize that as well, and so forth and so on until fifty minutes are squandered and class is over (really forty-five, because with five minutes to go, they need to start packing up their books and preparing to bust a move). It’s turtles all the way down, brother.
Alas and alack, such complaints amount to little more than beating against the current. In point of fact, innumerable jeremiads have been written about the declining intellectual quality and attention spans of each rising generation, only to see life continue much as it had before (“things ain’t what they used to be and they never were,” goes a quote attributed to various wits, most notably Will Rogers). It’s regrettable that one out of four Americans doesn’t know that the Earth revolves around the sun–but how many ever knew this? There are still plenty of intellectually courageous young people who are grasping and clambering toward a deeper understanding of life, the universe and everything, and perhaps they’ll get further than I ever did. Hastily prepared articles about 2013 being the year of the selfie, however jejune and outright foolish, will do little to deter their progress.