This insight led me to write The Better Angels of Our Nature: Why Violence Has Declined. But it was not the end of my encounters with pessimism. After writing a book on war, genocide, rape, torture, and sadism, I thought I would take on some truly controversial issues — namely, split infinitives, dangling participles, prepositions at the end of sentences, and other issues of style and usage in writing. There, too, I found widespread pessimism. When I told people that I was writing a book on why writing is so bad and how we might improve it, the universal reaction was that writing is getting worse and that the language is degenerating.
There are a number of popular explanations for this alleged fact: “Google is making us stoopid” (as a famous Atlantic cover story put it). Twitter is forcing us to write and think in 140 characters. The digital age has produced “the dumbest generation.” When people offer these explanations to me, I ask them to stop and think. If this is really true, it implies that it must have been better before the digital age. And of course those of you who are old enough to remember the 1980s will recall that it was an age when teenagers spoke in articulate paragraphs, bureaucrats wrote in plain English, and every academic article was a masterpiece in the art of the essay. (Or was it the 1970s?)
The fact is that if you go back to the history of commentary on the state of language, you find that people were pessimistic in every era. In 1961: “Recent graduates, including those with university degrees, seem to have no mastery of the language at all.”
Well, perhaps we need to go back to the era before radio and television. In 1917: “From every college in the country goes up the cry, ‘Our freshmen can’t spell, can’t punctuate.’ Every high school is in disrepair because its pupils are so ignorant of the merest rudiments.”
Well, maybe you have to go back to the age of the European Enlightenment. In 1785: “Our language is degenerating very fast … I begin to fear that it will be impossible to check it.”
Above and beyond the psychology of violence and the psychology of language, these findings point toward an interesting question for a psychologist such as myself. Why are people always convinced that the world is going downhill? What is the psychology of pessimism? I’m going to suggest that it’s a combination of several elements of human psychology interacting with the nature of news. Let’s start with the psychology.
There are a number of emotional biases toward pessimism that have been well documented by psychologists and have been summarized by the slogan “Bad is stronger than good.” This is the title of a review article by the psychologist Roy Baumeister in which he surveyed a wide variety of evidence that people are more sensitive to bad things than to good things. If you lose $10, that makes you feel a lot worse than the amount by which you feel better if you gain $10. That is, losses are felt more keenly than gains — as Jimmy Connors once put it, “I hate to lose more than I like to win.” Bad events leave longer traces in mood and memory than good ones. Criticism hurts more than praise encourages. Bad information is processed more attentively than good information. This is the tip of an iceberg of laboratory phenomena showing that the bad outweighs the good.
But why is bad stronger than good? I suspect that there is a profound reason, ultimately related to the second law of thermodynamics, namely that entropy, or disorder, never decreases. By definition, there are more ways in which the state of the world can be disordered than ordered — or, in the more vernacular version, “Shit happens.”
Here’s a question once posed to me by my late colleague Amos Tversky, a cognitive psychologist at Stanford University. As you leave this conference, how many really good things could happen to you today? Let your imagination run wild. And now: How many really bad things could happen to you today? Imagine the terrible things that could happen and I think you’ll agree that the second list is longer than the first. As another thought experiment, think about how much better you could feel than you’re feeling right now. Now consider how much worse you could feel. You don’t even have to do the experiment. Not surprisingly, this has probably left a mark on the psychology of risk perception.
There’s also an asymmetry of payoffs in the responses to the possibility of good and bad things. What is the average cost of overreacting to a threat? Well, it’s not zero, and we can all document cases where we have paid in forgone opportunities for reacting to a threat that never materialized. But what’s the cost of underreacting to a threat? It’s a plausible hypothesis that for most of human evolutionary history, the fitness cost of underreaction was much greater than the fitness cost of overreaction. In other words, the typical threat in the environment in which our brains evolved was probably much greater than it is today, now that we have exerted technological mastery over so much of our environment. The implication is that our current psychology is tuned to a world that was more dangerous than the world we’re in today, and that therefore our sense of risk and fear and anxiety is not optimally tuned to the objective risks that we face.
The bad‐dominates‐good phenomenon is multiplied by a second source of bias, sometimes called the illusion of the good old days.
People always pine for a golden age. They’re nostalgic about an era in which life was simpler and more predictable. The psychologist Richard Eibach has argued that this is because people confuse changes in themselves with changes in the times. As we get older, certain things inevitably happen to us. We take on more responsibilities, so we have a greater cognitive burden. We become more vigilant about threats, especially as we become parents.
We also become more sensitive to more kinds of errors and lapses. This is clear enough in language: as you become more literate, you become more sensitive to the fine points of punctuation and spelling and grammar that went unnoticed when you had a shorter history of attending to the printed word. At the same time, we see our own capacities decline. As we get older, we become stupider in terms of the sheer ability to process and retain information.
There’s a strong tendency to misattribute these changes in ourselves to changes in the world. A number of experimental manipulations bear this out. If you have people try to make some change in their lives — say, to eat less fat — often they become convinced that there are more and more advertisements for fatty foods.
Now, it would be hypocritical for me to say that more and more people today pine for the good old days, compared to the good old days in which perception of the times was more accurate. In fact, pessimism is not a recent phenomenon: people always were nostalgic for the good old days. In 1777 David Hume noted that “the humour of blaming the present, and admiring the past, is strongly rooted in human nature.” This may be explained by an insight from Thomas Hobbes offered a century before. “Competition of praise inclineth to a reverence of antiquity,” he wrote pithily, “for men contend with the living, not with the dead.” In other words, criticizing the present is a way of criticizing your rivals.
This ties into a third emotional bias, the psychology of moralization. People compete for moral authority — for who gets to be considered more noble — and critics are seen as more morally engaged than those who are apathetic. This is particularly true of contested ideas in a local community. People identify with moral tribes: what you think is worthy of moralization identifies which group you affiliate with. So the question at hand today — is the world getting better or worse? — has become a referendum on modernity, on the erosion over the centuries of family, tribe, tradition, and religion as they give way to individualism, cosmopolitanism, reason, and science. Simply put: Your factual belief on whether the world is getting better or worse advertises your moral beliefs on what kinds of institutions and ideas make us better or worse off.
Those are three emotional biases toward pessimism. We also have cognitive biases that incline us that way, foremost among them being the “availability heuristic.” This is a feature of the psychology of probability also documented by Tversky, in collaboration with the Nobel Prize–winning economist Daniel Kahneman. Forty years ago, Kahneman and Tversky argued that one of the ways the human brain estimates probability is by using a simple rule of thumb: the more easily you can recall an example of something, the more likely you estimate it to be. The result is that anything that makes an incident more memorable will also make it seem more probable. The quirks of the brain’s ability to retain information will bleed into our estimates of a risk’s likelihood.
Events that are more recent, or easier to imagine, or easier to retrieve — anything that forms a picture in the mind’s eye — will be judged to come from more probable categories of events.
Kahneman and Tversky offer a simple example: Which are more common, words that begin with the letter r or words that have r in the third position? People say that there are more words that begin with r, even though it’s the other way around. The reason for this error is that we retrieve words by their onsets, not their third letter. You can ask this of almost any letter in the alphabet and you’ll get the same result, because we can’t call words to mind by any position other than the first. We see the availability heuristic in action all the time. People are more fearful of plane crashes, shark attacks, and terrorist bombings — especially if one just happened recently — than of accidental electrocutions, falls, and drownings. The latter are objectively much riskier, but they tend not to make headlines.
I believe that each of these psychological biases interacts with the nature of news to lead to an aura of pessimism. What is news? News is, by definition, things that happen.
It’s not things that don’t happen. If a high school gets shot up, that’s news. If there’s another high school that doesn’t, you don’t see a reporter standing in front of it with a camera and a news truck saying, “There hasn’t been a rampage shooting in this high school today” — or in the other thousands of high schools at which shootings have not taken place. The news is inherently biased toward violent events because of the simple fact that they are events.
This bias is then multiplied by the programming policy “If it bleeds, it leads.” Consuming stories of violence is pleasurable. We pay a substantial amount of our disposable income to watch Shakespearean tragedies, Westerns, mafia flicks, James Bond thrillers, shoot‐em‐ups, splatter films, pulp fiction, and other narratives in which people get shot, cut, or blown up. It’s not surprising that when it comes to attracting eyeballs to news sites, the same kind of mayhem that we pay to see fictionalized also draws us when it happens in reality. This is multiplied by the fact that the world now has 1.75 billion smartphones, which means the world now has 1.75 billion news reporters. Gory events that as recently as a decade ago would have been trees falling in the forest with no one to hear them can now be filmed in real time and instantly broadcast on the Internet. All of these features of the news media stoke the availability heuristic. They give us vivid, memorable, recent events, exactly the kind of material that tilts our probability estimates.
Let me conclude by noting that these phenomena give rise to a perverse violence-news codependency, in which people commit acts of violence precisely because they anticipate news coverage. There are at least two categories of violence which are dubious gifts of the news media. One is terrorism, which is a technology for extracting the maximum amount of publicity from the smallest amount of violence. By any measure, terrorism accounts for a trivial proportion of the world’s deaths by violence, to say nothing of deaths from all causes put together. The most damaging terrorist attack in history, on September 11, 2001, killed fewer than 3,000 people. While undeniably tragic, this is in the noise when compared to statistics on homicide or civil wars.
The second category is rampage killings, which probably would not occur, at least not nearly as often, if it weren’t for wall‐to‐wall news coverage. In his book The Myth of Martyrdom, the criminal justice scholar Adam Lankford proposes a thought experiment.
Suppose you want to become famous. You are determined to attain worldwide fame over the next year, or month, or even week. What could you do that would guarantee this? Well, it would be nice to come up with the cure for a disease, but how many of us can do that? You could try to circulate an internet meme, but thousands of people upload cat videos and few of them go viral. Lankford notes that there is one guaranteed way in which any person could become famous: kill a lot of innocent people.
Because of that feature of modern life, a market has been created for those who view notoriety as more important than anything else, including life. And that feeds a category of violence that would barely exist if it weren’t for the nature of news.
In sum, there are many reasons to think that people tend to be more pessimistic about the world than the evidence warrants.
I have suggested that this can be attributed to three emotional biases that are baked into our psychology: bad dominates good, the illusion of the good old days, and moralistic competition. These feed into a single cognitive bias — the availability heuristic — which in turn interacts with the nature of news, thereby generating an inclination toward pessimism.