Why People Are Likely To Believe Political Lies

Have you ever wondered why people are persuaded by outright lies during political campaigns? And why lies tend to “stick” even after they’re debunked by facts? Some new research sheds light on why this happens, at least in terms of people’s thought processes, if not their underlying emotional drives.

It’s a major phenomenon: Prior to the 2012 election campaign, the most glaring lies in the political arena were that Obama is a Muslim and that global warming is a big hoax. For example, a Pew Research poll found that 30 percent of all Republicans described the president as Muslim. Others, such as Sen. James Inhofe, have regularly called climate change “the greatest hoax” of all. And recently, Rep. Paul Broun, who sits (ironically) on the House Science Committee, argued that evolution and the big bang are “lies from hell.”

Now, with the presidential campaign in high gear since Labor Day, both sides regularly accuse each other of outright lies and extreme exaggeration about their positions and “facts,” while insisting on the truthfulness of their own. Media outlets such as the Washington Post, the New York Times and NPR have been providing fact-checking analyses of statements by President Obama and Gov. Romney as a means of restoring some degree of truth.

Lies tend to stick in people’s minds, and they can sway the outcome of elections, as well as public opinion in many arenas. So what happens within our minds and emotions that makes us receptive to lies, and then resistant to information that exposes the truth? A study led by Stephan Lewandowsky of the University of Western Australia explains part of what may happen. The researchers found that “Weighing the plausibility and the source of a message is cognitively more difficult than simply accepting that the message is true — it requires additional motivational and cognitive resources.”

Rejecting false information, the researchers point out, requires more cognitive effort than just taking it in. That is, weighing how plausible a message is, or assessing the reliability of its source, is more difficult, cognitively, than simply accepting that the message is true. In short, it takes more mental work. And if the topic isn’t very important to you, or you have other things on your mind, the misinformation is more likely to take hold.

Moreover, when you do take the time to evaluate a claim or allegation, you’re likely to pay attention to only a limited number of features, the study found. For example: Does the information fit with other things you already believe? Does it make a coherent story with what you already know? Does it come from a credible source? And do others believe it?

And to the extent those questions affect believability, your response to them may also reflect the impact of what Eli Pariser calls the “filter bubble”: your information milieu may reinforce what you already “know” or are selectively exposed to. Similarly, as the researchers concluded, lies and misinformation may become deeply rooted when they conform to preexisting political, religious, or other views.

Even worse, attempts to correct misinformation can backfire and actually strengthen the false belief. A good recent example is the report that unemployment dropped below 8 percent during September. The GOP had emphasized its conviction that unemployment would remain above 8 percent, which would benefit Romney’s campaign. When the report showed unemployment falling to 7.8 percent in September, several Republican spokesmen immediately claimed that the figures had been falsified. And despite factual corroboration that the numbers were accurately determined, some doubled down on the allegation that a conspiracy to cook the numbers must have occurred.

“This persistence of misinformation has fairly alarming implications in a democracy because people may base decisions on information that, at some level, they know to be false,” Lewandowsky states. “At an individual level, misinformation about health issues — for example, unwarranted fears regarding vaccinations or unwarranted trust in alternative medicine — can do a lot of damage. At a societal level, persistent misinformation about political issues can create considerable harm. On a global scale, misinformation about climate change is currently delaying mitigative action.”

The researchers offer some guidelines that may help people focus on information that is true and accurate: providing people with a narrative that fills the gap left by false information; emphasizing the facts you want to highlight; keeping the information you want people to take away simple and brief; and strengthening your message through repetition.

These are useful strategies: they can help penetrate falsehoods that people have absorbed and offer clear, truthful information in their place. But I think they only go so far, because the hidden factor is more emotional and attitudinal. It’s the person’s internal drivers: the fears, needs and prejudices that are largely unconscious, and that strongly resist information that challenges or threatens them.

So it’s not just a matter of cognitive factors that make one receptive to lies or resistant to acknowledging the truth. It’s one’s entire psychology. That is, many emotional needs and conflicts may fuel one’s conscious beliefs and attitudes, and the latter may only tighten and become more deeply entrenched when challenged. Those are much harder to address. What may penetrate is empathic, supportive communication that recognizes deeply held positions. This may build a bridge to messages that enable people to examine their own beliefs and become more receptive to the truth.