Four stars, read in November 2016.
This is by far one of the most interesting books I’ve read this year. It’s a fascinating perspective, a way of looking at the world and trying to figure out what will last beyond our own lifetimes. Klosterman makes some predictions, and if the logic behind them weren’t so practical, they would sound totally insane, because that’s the point: the things that seem relevant to future societies aren’t the things that seem relevant to the ones that create them. The first chapter discusses books, trying to foresee which contemporary authors might become known as the ones who exemplify this period in history. The second chapter does the same with music, and one toward the middle of the book examines television shows. If nothing else, these predictions are a lot of fun to make.
It wasn’t until the second half of the book, though, that I had to get out my favorite blades-of-grass page markers and start keeping track of things. The predictions are fun, and there’s a lot of thoughtful content in the book, but this is where I started seeing implications.
First, there was a chapter on the unreliability of history (and this isn’t even touching the political angle written about by Howard Zinn and James Loewen):
A man had attacked a female police officer with a hammer and was shot by the policewoman’s partner . . . Now, one assumes seeing a maniac swinging a hammer at a cop’s skull before being shot in broad daylight would be the kind of moment that sticks in a person’s mind. Yet [The New York] Times story explained how at least two of the eyewitness accounts of this event ended up being wrong . . . The witnesses were describing something that had happened that same day, and they had no incentive to lie. But video surveillance proved their depictions of reality were inaccurate.
This is a level of scrutiny that can’t be applied to the distant past, for purely practical reasons. Most of history has not been videotaped. But what’s interesting is our communal willingness to assume most old stories may as well be true, based on the logic that (a) the story is already ancient, and (b) there isn’t any way to confirm an alternative version, despite the fact that we can’t categorically confirm the original version, either.
Klosterman discusses Seymour Hersh’s story, “The Killing of Osama bin Laden,” which essentially claimed that the official story we’ve all heard was made up by the Obama administration, then provided an alternative version—obviously, the version he claims is true. Hersh is “the finest investigative reporter of the past half century,” but as Klosterman points out, his story didn’t create much skepticism regarding the original version among the majority of Americans.
This acceptance [of the Zero Dark Thirty version] is noteworthy for at least two reasons. The first is that—had this kind of alternative story emerged from a country like Russia, and if the man orchestrating the alleged conspiracy was Vladimir Putin—nobody in America would question it at all. It would immediately be accepted as plausible, and perhaps even probable. The second is a discomfiting example of how “multiple truths” don’t really mesh with the machinations of human nature: Because we were incessantly told one version of a story before hearing the second version, it’s become impossible to overturn the original template. It was unconsciously assumed that Hersh’s alternative story had to both prove itself and disprove the primary story, which automatically galvanizes the primary version as factual. It took only four years [since bin Laden’s death] for that thinking to congeal. Extrapolate that phenomenon to forty years, or to four hundred years, or to four thousand years: How much of history is classified as true simply because it can’t be sufficiently proven false?
The unreliability of people reporting even on current events is only part of the problem (although honestly, it’s enough for me to doubt pretty much everything . . . ever. Essentially, unless there is video proof, even the most reliable source can’t be any more reliable than the human brain allows them to be, and the answer to that is not very). “There’s also the human compulsion to lie—and not just for bad reasons, but for good reasons, and sometimes for no reasons, beyond a desire to seem interesting.”
Which I think we all know is something that happens. So the takeaway from this is pretty significant to me. I already have an almost militant attitude toward history (as far as non-professional historians go): you consult primary sources instead of textbooks that just summarize, you check to see who compiled the sources you’re using, you read carefully and look for bias. But now it occurs to me that really, you should also maintain a certain level of skepticism even for the most reliable sources—because like those eyewitness accounts, even primary sources might be flawed. This actually fits right in with my worldview, because I believe in skepticism as a fundamental principle of life—maybe even a virtue.
The next section that caught my attention was a chapter called “The Case Against Freedom,” about the Constitution, the Declaration of Independence, and the flaws inherent in our system of government (because there’s no such thing as a system, or a governing document, without flaws).
When the Constitution is criticized, the disapproval is more often with how the courts have interpreted its language. But if you doggedly ask a person who has studied the Constitution about its flaws, that person will usually concede that the greatest strength of any document is inherently tied to its flaws. Take someone like Jay D. Wexler, for example. Wexler is a law professor at Boston University who wrote a book titled The Odd Clauses, an examination of the Constitution through ten of its most bizarre provisions . . . He’s fascinated by ideas like the separation of powers, inserted by the founders as a barrier against their ultimate fear, tyranny. He will directly exclaim, “I love the separation of powers!” which is a weird thing to exclaim. But he also realizes this trifurcation comes with a cost.
“One can imagine how the sluggishness and potential for gridlock that such a system creates might actually be our undoing—perhaps because of some single major incident that the government cannot respond to adequately. But more likely because it slowly, quietly, in ways that may be hard to identify, weakens our society and culture and economy, rendering the nation unable to sustain itself and rise to the challenges of the future.”
The First Amendment is one of the most important, one of the most American ideas in the Constitution, and I would say it’s the most significant concept that defines our national identity. But, as Klosterman suggests, “its function is highly specific.” Because we have a capitalistic society, it actually doesn’t do very much. In fact, in practice, it creates an environment that is the opposite of its intent:
If someone publishes an essay or tells a joke or performs a play that forwards a problematic idea, the US government generally wouldn’t try to stop that person from doing so, even if they could. If the expression doesn’t involve national security, the government generally doesn’t give a shit. But if enough vocal consumers are personally offended, they can silence that artist just as effectively . . . As Americans, we tend to look down on European countries that impose legal limitations on speech—yet as long as speakers in those countries stay within the specified boundaries, discourse is allowed relatively unfettered (even when it’s unpopular). In the US, there are absolutely no speech boundaries imposed by the government, so the citizenry creates its own limitations, based on the arbitrary values of whichever activist group is most successful at inflicting its worldview upon an economically fragile public sphere. As a consequence, the United States is a safe place for those who want to criticize the government but a dangerous place for those who want to advance unpopular thoughts about any other subject that could be deemed insulting or discomfiting.
Some would argue that this trade-off is worth it. Time may prove otherwise.
The idea of other Western countries having less of that type of freedom has always bothered me. But I wonder if it actually feels like less freedom in practice. Without committing myself to anything, I wonder if that trade-off might be preferable, because a government at least has a responsibility to its citizens. It has a potentially scarier kind of power, but one that is regulated; and honestly, if I had to choose between being at the mercy of a stricter government or the American public, I might feel a lot safer with the government. Because the American public, as has been comprehensively demonstrated, is a terrifying entity.
Which brings me to another idea I thought was pretty fascinating: that people have not always seen democracy as the objective good we now believe it to be.
The men and women who forged this nation were straight-up maniacs about freedom . . . They were of the opinion that a man cannot be happy if he is not wholly free from tyranny, a sentiment that is still believed by almost every American citizen.
But how, exactly, do we know this?
It wasn’t always this way. For a long time, many smart people—Plato, most famously in The Republic—did not automatically think like this . . .
“When we talk about one-man rule—some kind of dictatorship or empire or whatever—look at the examples recent history has given us. They’re not exactly shining examples of how it might work out well, whether it’s a Hitler or a Stalin or whoever, so we don’t have any good examples [of how this could successfully operate]. But in the ancient world, they often had bad examples of democracy. Some of those guys looked at democracies the way we look at failed dictatorships.”
(The quote within the quote comes from Dan Carlin, host of the podcast Hardcore History.)
Who was it that said the thing about democracy being the worst form of government, except for all the others? Or was that quote about capitalism? I think it was democracy, but they’re essentially interchangeable in the United States, anyway. This idea has always annoyed me a little bit, mostly because of how smug the person quoting it inevitably is. You would think—and those people probably do think—that it’s a self-evident truth. But it’s not. It hasn’t always been “true,” and if our modern history hadn’t included such notoriously bad examples of dictatorship, it might not be “true” to us, either.
I had page markers on at least two other sections, but in trying to write about them here, I realize that they require far too much context to be discussed on their own. This would be such an excellent book for a group to read together, because the opportunities for discussion are almost limitless, and everyone needs to have the same background knowledge for many of those conversations to thrive. I’ve had Chuck Klosterman vaguely on my radar for a while, but who knows when I would have gotten around to him if not for this book. I really enjoyed having so many intriguing concepts to think about.