The sense of relief that Americans used to feel after a contentious election seems a distant memory. After what seemed like a slam dunk for Clinton, many NeXters are incensed about Trump’s victory. How did the predictors get it SO wrong?

See! I was right.

Confirmation bias is our tendency to favor evidence that confirms our opinions, whether they’re true or not. It afflicts how we gather information (selectively), interpret it (prejudicially), and recall it (unreliably). The term was coined by psychologist Peter Wason, whose experiments in the 1960s found people are indeed biased towards confirming their preexisting beliefs.
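Wason’s best-known demonstration is the “2-4-6” task: subjects are told that the triple 2-4-6 satisfies a hidden rule, and they must propose triples of their own to uncover it. Most test only triples that fit their favored guess (say, “numbers increasing by 2”) and so never learn that the actual rule is simply “any increasing sequence.” Here is a minimal sketch of that logic in Python (the probe triples are invented examples of a typical subject’s guesses):

```python
def hidden_rule(a: int, b: int, c: int) -> bool:
    """Wason's actual rule: any strictly increasing sequence."""
    return a < b < c

# A confirmation-biased subject probes only triples that FIT her guess
# ("numbers increasing by 2"): every probe passes, so the wrong guess
# is never challenged.
confirming = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
print([hidden_rule(*t) for t in confirming])     # [True, True, True]

# A falsification strategy also probes triples that SHOULD fail if the
# guess were right. They pass too, exposing the guess as too narrow.
disconfirming = [(1, 2, 3), (5, 10, 20), (0, 7, 100)]
print([hidden_rule(*t) for t in disconfirming])  # [True, True, True]
```

Only probes designed to fail can reveal the error; confirming probes never can.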

Confronting others with an opposing view may actually push them further away, even when your evidence is gathered and presented in a neutral manner. Not only does your brilliant marshalling of data fail to persuade them; it can actually strengthen their belief. Under the so-called “backfire effect” (2010), people cling to their opinions even after the original evidence for them has been thoroughly debunked. The effect is particularly strong for emotionally charged issues, such as politics.

Bombs away…

Many pundits and people in the mainstream media never imagined Mr. Trump as the President of the United States. That made it very hard for them to see him winning the nomination—until he did—or winning the election—until he did. Consider the predictions from these modern-day Nostradamuses (see: I’ve been framed!):

  • Just before election night, Fox News analyst Frank Luntz tweeted, “In case I wasn’t clear… Hillary Clinton will be the next President of the United States.”
  • Election forecaster Nate Silver of FiveThirtyEight.com gave Mr. Trump less than a 1-in-6 chance of winning before a single vote was counted.
  • NBC and Reuters news services predicted Mrs. Clinton would win by 5 percentage points nationwide. The ABC/Washington Post poll put it at 3 points.
  • The Washington Post didn’t just predict a victory for her; it said flat-out that it was mathematically impossible for Mr. Trump to gain the needed 270 electoral votes.
  • The New York Times gave Mrs. Clinton an 80% chance of winning as the polls were closing.

On election night, it soon became clear that the media predictions were totally bogus. The Hillary juggernaut evaporated (beam ’em up, Scotty) as, one by one, Mr. Trump picked off the states he needed to win the presidency. Like deer in the headlights, the talking heads were completely stunned; nothing had remotely prepared them for the Trump big rig.

Yet one small polling firm in Atlanta turned out to be right. Statistician Matt Briggs (2016) concluded there was a “shy” Trump effect hiding in plain view that the major pollsters had left out of their models: these voters didn’t want to admit their support to pollsters, or simply didn’t answer the phone. Although dismissed by the mainstream press, the Trafalgar Group bet that this “hidden” Trump vote existed and factored it into its predictions.
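To see how much a hidden vote can matter, consider a back-of-the-envelope simulation (the response rates below are invented for illustration; they are not Trafalgar’s actual figures). If supporters of one candidate are even modestly less willing to answer pollsters, the raw numbers will understate that candidate’s true support:

```python
import random

# Minimal sketch of "shy voter" (differential nonresponse) bias.
# All rates here are hypothetical, not any pollster's real model.
random.seed(42)

N = 100_000                 # simulated voters
true_trump_share = 0.50     # actual support in the population
response_rate = {"trump": 0.30, "clinton": 0.40}  # shy voters answer less

responses = []
for _ in range(N):
    is_trump = random.random() < true_trump_share
    rate = response_rate["trump" if is_trump else "clinton"]
    if random.random() < rate:          # did this voter take the poll?
        responses.append(is_trump)

polled = sum(responses) / len(responses)
print(f"True Trump share:   {true_trump_share:.1%}")  # 50.0%
print(f"Polled Trump share: {polled:.1%}")            # about 43%
```

A phantom seven-point deficit, produced by nothing but who picks up the phone, is more than enough to account for the 3-to-5-point misses listed above.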

When facts collide with our wish list.

A two-decade study of political pundits by Philip Tetlock (2005) found that, on the whole, their predictions were not much better than chance. He blamed their failure on confirmation bias: the inability to make use of new information that contradicts one’s beliefs. In forecasting a Clinton victory, prognosticators suffered from tunnel vision (framing trap*) and from a selective collection and weighting of the evidence known as “wishcasting.” But that’s not all.

The probability of adopting a belief increases with the proportion of people who have already done so. With everybody (except Trafalgar) heralding a Clinton landslide, it was difficult for pollsters, pundits, and media elites to discount the opinion of their colleagues (inertia trap*). Known as an argumentum ad populum, this is a red herring that alleges, “If many believe it is so, then it is so” (see: Jumping on the bandwagon).
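A toy simulation makes that herding dynamic concrete (a generic Pólya-urn sketch, not a model of any real newsroom): each new pundit adopts the consensus view with probability equal to the share of earlier voices who already hold it.

```python
import random

# Toy bandwagon (Polya urn) sketch, purely illustrative: each new pundit
# adopts the "Clinton landslide" view with probability equal to the share
# of earlier voices who already hold it; herding, not fresh evidence.
random.seed(2016)

def run_bandwagon(n_pundits: int, early_yes: int, early_no: int) -> float:
    yes, total = early_yes, early_yes + early_no
    for _ in range(n_pundits):
        if random.random() < yes / total:   # follow the current majority
            yes += 1
        total += 1
    return yes / total

# The eventual "consensus" is set largely by the first few voices.
for early_yes, early_no in [(3, 1), (1, 1), (1, 3)]:
    share = run_bandwagon(10_000, early_yes, early_no)
    print(f"early split {early_yes}:{early_no} -> final adoption {share:.0%}")
```

In a model like this the crowd does converge on a firm opinion, but where it lands is dictated almost entirely by the earliest voices; later “independent” judgments mostly echo them.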

That spark set everyone’s hair on fire. And changed history.

Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.

~ Michael Shermer, founder, The Skeptics Society

* Questionable beliefs can “trap” our better judgment, leading to poor decisions and unintended consequences. In the framing trap, we often neglect to consider other possible contingencies. In the inertia trap, we tend to be overly influenced by our peers. Learn more about these and other interesting topics in the Young Person’s Guide to Wisdom, Power, and Life Success.

Image credit: “surprised and shocked teenage girl” by lanak, licensed from 123rf.com (2016).