Humans Are Bad at Predicting Futures That Don’t Benefit Them

Unrealistic optimism makes people think bad things are less likely to happen to them than to others, and it hampers their decision-making.

Between 1956 and 1962, the University of Cape Town psychologist Kurt Danziger asked 436 South African high-school and college students to imagine they were future historians. Write an essay predicting how the rest of the 20th century would unfold, he told them. “This is not a test of imagination—just describe what you really expect to happen,” the instructions read.

Of course, everyone wrote about apartheid. Roughly two-thirds of black Africans and 80 percent of students of Indian descent predicted social and political changes amounting to the end of apartheid. Only 4 percent of white Afrikaners, on the other hand, thought the same. How did the white students get it so wrong?

Students’ predictions were more like fantasies. “Those who were the beneficiaries of the existing state of affairs were extremely reluctant to predict its end,” Danziger explains, “while those who felt oppressed by the same situation found it all too easy to foresee its collapse.”

Psychology research indeed suggests that the more desirable a future event is, the more likely people think it is. When the sociologists Edward Brent and Donald Granberg studied wish fulfillment in U.S. presidential elections between 1952 and 1980, they found that roughly 80 percent of each major candidate’s supporters expected their own candidate to win, a ratio of about four to one. “People distort their perception of an election’s closeness in ways that are consistent with their preferences,” a later paper concluded. Likewise, after the 2008 election, researchers analyzed survey predictions from 19,000 Americans and found that Democrats tended to think Barack Obama was more likely to win, while Republicans assumed John McCain would.

Conversely, the more someone dreads or fears a potential outcome, the less likely they think it is to happen. In November 2007, economists in the Philadelphia Federal Reserve’s Survey of Professional Forecasters predicted just a 20 percent chance of “negative growth”—read: decline—in the U.S. economy any time in 2008, despite visible signals of an impending recession. There is, the economist Sergey Smirnov wrote in a review of economists’ botched forecasts of the 2008 recession, “some deep inherent unwillingness to predict undesirable things.”

But people’s thinking isn’t as simplistic as “I wish it to happen, so it will” or, “I don’t want it to happen, so it won’t.” Self-interest influences our predictions in subtler ways.

* * *

The Rutgers University psychologist Neil Weinstein discovered unrealistic optimism by accident. It was the late 1970s, and for no particular reason he had asked study subjects to rate their likelihood of experiencing certain negative future events, like getting divorced, fired, or mugged, from “below average risk” to “above average risk.” This was back when data were manually entered on yellow punch cards. So he was sitting at his punch-card machine punching in people’s responses when he realized that “all the responses were on the below-average side of the scale.”

Unrealistic optimism is thinking that good things are more likely to happen to you than to other people, whereas bad things are less likely. It’s not outright denial of risk, says Weinstein. “People don’t say, ‘It can’t happen to me.’ It’s more like, ‘It could happen to me, but it’s not as likely [for me] as for other people around me.’” People predict that they’re less likely than others to experience illness, injury, divorce, death, and other adverse events—even when they’re exposed to the same risk factors. For instance, someone might think she’s less prone to diabetes than others, even if she weighs the same and has the same diet, family history, and lifestyle as the people she’s comparing herself to.

Take these Pew Research Center findings: In 2015, 65 percent of a representative sample of American workers predicted that, within 50 years, automation would take over much of the work currently done by humans. But 80 percent of workers believed that their own jobs would remain intact. In essence, they admitted that automation posed a threat to workers in general but assumed they were less susceptible than average to its effects.

Anxiety affects people’s predictions subliminally. For example, anxious people may unwittingly gather and synthesize only those facts that support the outcome they want. This process may even be biologically ingrained: Neuroscience research suggests that facts supporting a desired conclusion are more readily available in people’s memories than other equally relevant but less appealing information. Our predictions are often less imaginative than we think.

They’re also more self-absorbed. Unrealistic optimism occurs in part because people fail to consider others’ experiences, especially when they think a future outcome is controllable. Imagine you’re trying to find a job. You do everything you can to make yourself an appealing candidate: get relevant experience, fix up your résumé and cover letter, network. You might conclude that these measures make you more likely than other job seekers to get a job, not taking into account that others are probably doing the same things to boost their chances, too. People may think their odds are better than average because they don’t know what the average really is.

Weinstein tried to curb this bias in his lab, with limited success. He told students to list the factors that influenced their chances of experiencing certain events, like a heart attack, and then to read other students’ responses to the same prompt. When students realized that their risk factors matched everyone else’s, their unrealistic optimism was reduced but, oddly, not eliminated. They still thought they were less likely than the average student with the same risk factors to experience negative events, even without any objective justification. Other research indicates that people resist revising their estimates of personal risk even when confronted with relevant averages that explicitly contradict their initial predictions.

Despite his attempts to remedy it, Weinstein insists that unrealistic optimism isn’t all bad. Thinking things are going to turn out well may actually be adaptive, a way to soothe our fears about the future. “It keeps you from falling apart.”

* * *

Before Ray Kurzweil became a high-profile inventor, author, and futurist, he was a kid who lost his father. Kurzweil has kept every physical memento of his dad—records, notes, pictures, electric bills, 50 boxes in total—in a storage facility in Massachusetts in preparation for his prediction: By the mid-2030s, “we will be able to create avatars of people who have passed away from all of the information they have left behind.”

Kurzweil himself plans on immortality. He takes 90 supplements per day, gets regular blood tests and infusions, and has been working with the famed longevity doctor Terry Grossman for the last two decades to arrest his aging. But Kurzweil thinks that someday he’ll transcend the need for these antiaging measures. He predicts that in the 2030s our brains will connect directly to the cloud to augment our existing intelligence, and that our biological bodies will be replaced with machines part by part.

Research suggests that far-off events, like death, are particularly vulnerable to overly optimistic predictions. Moreover, optimism appears to distort predictions most when “the event in question is of vital personal importance to the predictor.”

The author, speaker, and global-trends expert Mark Stevenson says that people who predict the future fall victim to their own prejudices, wish lists, and life experiences, which are often reflected in their predictions. When I asked Stevenson for an example, he told me to consider at what point in time any futurist approaching 50 predicts life extension will be normal—“quite soon!”

But Kurzweil tells me that he makes his predictions based on what he calls the “law of accelerating returns,” which holds that technology follows “a predictable and exponential trajectory.” He says that he controls for wishful thinking by using that trend to project out a future curve. It’s math. In 2045, the year Kurzweil predicts humans will become effectively immortal after merging with machines, he will be 97.

* * *

In 2010, the Pew Research Center found that 41 percent of Americans—70 percent of whom identify as Christian—believed that Jesus would probably or definitely return to Earth by 2050. These Americans might be right this time, but the oft-prophesied Second Coming has failed to come before. Jesus predicted his own return within a single generation in Matthew 24:34; 500 years ago, Martin Luther predicted that Judgment Day would occur within the next 300 years.

The University of Pennsylvania psychologist Philip Tetlock, who studies the art and science of forecasting, says that strongly held beliefs are a big reason people make bad predictions.

For example, the Western faith in progress—the idea that, as the sociologist Robert Nisbet put it, “mankind has advanced in the past, is now advancing, and may be expected to continue advancing in the future”—biased early predictions about the internet. In 2005, only 32 percent of the hundreds of internet experts surveyed by Pew agreed that by 2014 most people “will use the internet in a way that filters out information that challenges their viewpoints on political and social issues.” The minority view proved closer to the mark: within a decade, filter bubbles had become a standard worry about online life.

Sometimes the self-interest influencing people’s predictions is simply their desire to be right. People predict outcomes that will affirm their beliefs about the world: that democracy is winning, that death is a needless tragedy, that there is or isn’t a God. “Once you commit to anything you then have a vested interest in the outcome,” Kurzweil says. Strongly held beliefs become self-interested beliefs.

People aren’t so naïve as to think that just because something is important to them, it will happen. Rather, they tend to think most other people share their beliefs, and thus the future they endorse is likely. What researchers call “projection bias” explains why individuals so often bungle election predictions. Because they assume that others have political opinions similar to their own, people think their chosen candidate is more popular than she or he actually is. Liberals’ underestimates of the true scale of Trump support during the 2016 election may have at least partially resulted from this bias.

People who are abnormally good at predicting the future—“superforecasters”—skillfully ban their prejudices from their probability equations. As Tetlock told Stephen Dubner on the Freakonomics Radio podcast, superforecasters “try not to have too many ideological sacred cows.”

* * *

Sometimes self-interested predictions pan out. When Cold War historians predicted the imminent Westernization of the world amid the threat of nuclear war with the Soviet Union, they were praying they wouldn’t witness the alternative. They ended up being right. And of course, the black and Indian South Africans who predicted—and hoped—that apartheid would end were right, too.

Still, on the whole, people would make better predictions with more objectivity and awareness. And good predictions matter. If you think you won’t get an STI, you may not practice safe sex or get tested regularly, thereby increasing your risk. On a larger scale, people may settle in regions prone to natural disasters because they assume they won’t be affected. Or, as Weinstein observed, people may think Trump’s proposed tax reform will help them more than it actually will, or at least that it will help them more than it will help other people. These beliefs could sway public support and congressional votes and, as a result, the future of American taxes.

In short, how we predict the future is important because it affects what we do in the present. So how do you forfeit your fantasies?

Faith Popcorn, the CEO of “a future-focused strategic consultancy,” advises shaking up your perspective: “Learn how the other side thinks,” she says. Chat up interesting people; go to readings, talks, and fairs to “expand your horizons.”

Tetlock says forecasting tournaments can teach people how to outsmart their shortcomings and instead play “a pure accuracy game.” (His wife runs one such tournament, called the Foresight Project.) He says that many people “have a little voice in the back of their heads saying, ‘Watch out! You might be distorting things a bit here,’ and forecasting tournaments encourage people to get in touch with those little inner voices.”

But Weinstein has been around long enough to know that extinguishing unrealistic optimism isn’t so simple. “It’s hard because it has all these different roots,” he says. And human nature is so obstinate.

In one study, Weinstein and his collaborators asked people to estimate the probability that their Texas town would be hit by a tornado. Everyone thought that their own town was less at risk than other towns. Even when a town was actually hit, its inhabitants continued to believe that their town was less likely to get hit than average. But then, by chance, one town got hit twice during the study. Finally, these particular townspeople realized that their odds were the same as all the other towns. They woke up, says Weinstein. “So you might say it takes two tornadoes.”

Caroline Beaton is a writer based in Denver. Her work has appeared in Vice and Psychology Today.