Why Lies Often Stick Better Than Truth

There is no good reason to believe vaccines cause autism. A 1998 paper in The Lancet that championed the link was immediately pilloried and later retracted as fraudulent. Its author, the British physician Andrew J. Wakefield, was found guilty of dishonesty and abuse of developmentally disabled children by the British General Medical Council, and he has been stripped of his medical license. No other researcher has been able to replicate his work, and journals have retracted his other papers. The Centers for Disease Control and Prevention, the National Academy of Sciences, and many other groups have found no evidence of a link.

Yet surveys in 2002 found that as many as 53 percent of the public believed there was good evidence on both sides, as did a good number of health professionals. Politicians also bought in. In 2008 the presidential candidate Barack Obama said, “We’ve seen just a skyrocketing autism rate. Some people are suspicious that it’s connected to the vaccines … The science right now is inconclusive.” His rival, John McCain, said that “there’s strong evidence that indicates it’s got to do with a preservative in vaccines.” And in 2011 Web sites were still reporting that vaccine injury cases showed evidence of autism.

There are also many people who, even after seeing President Obama’s birth certificate, believe he was not born in the United States. And many doubt there is global warming, despite an overwhelming scientific consensus that things are heating up. Why do we like our slanted information and outright lies so much?

Because rejecting them is hard work, say psychologists in a new article in Psychological Science in the Public Interest. Making a cognitive shift means rethinking already-held beliefs. It’s much easier to slot evidence into ideas we already hold, says Stephan Lewandowsky, a professor of psychology at the University of Western Australia and an author of the report.

That’s not a new discovery. More interesting, however, are the strategies the psychologists recommend for breaking through the fog of disbelief. You need to find an alternate explanation that fits the same basic facts, says another report author, Colleen M. Seifert, a professor of psychology at the University of Michigan at Ann Arbor. Misinformation persists when “you don’t have an alternative account that works as well as does the wrong one,” she explains by e-mail.

She examined reaction to a report of food poisoning. “This is a true example,” she writes. “Suppose you hear of a family of four who died after eating at Golden Gate Chinese Restaurant. The authorities investigate, and release the information that food poisoning was not the cause. Do you go out for Chinese tonight?

“You know it is not true,” she continues, “but it’s such a good explanation for what happened that you fall back on it even while knowing it is in error. So you say, ‘It was not food poisoning, but let’s have Italian tonight.’”

In her studies, the only way to get people to let go of such an idea was to give them a plausible alternative. So in fact, “the family was found to have suffered carbon-monoxide poisoning,” she notes. “Now I have an account that is wholly satisfactory and explains the circumstances, and now I am happy to eat at Golden Gate.”

One of the worst ways to offer alternatives, though, is to repeat the bad information while doing so. “There is a risk that repeating an association can make it stronger in memory,” says Ullrich K.H. Ecker, another author, in an e-mail. “Saying that ‘it’s incorrect that the flu vaccine has major side effects’ repeats and hence potentially strengthens the link between ‘vaccine’ and ‘side effects’ even though it negates it,” notes Ecker, an assistant professor of psychology at the University of Western Australia. His research and that of others have demonstrated this. Much smarter, he says, is to stick to the alternative, talking about the safety of the vaccine.

Seifert adds that the backfire effect is very common. “In later studies, we tried overemphasizing the fact that the information is wrong,” she writes. She tried saying to her skeptical Chinese-restaurant patrons that “‘it was definitely not food poisoning.’ But that made people more suspicious of the truth of the information, and didn’t make them any less likely to use it.”

Lewandowsky and another co-author, a researcher at the University of Queensland named John Cook, have collected those strategies and cautions in a publication for academics and science communicators called The Debunking Handbook.

None of this is easy, cautions Edward Maibach, director of the Center for Climate Change Communication at George Mason University. If it were, misinformation would diminish and facts would win in the marketplace of ideas. But, he wrote in a commentary in the same issue of the journal, the report from Lewandowsky and his colleagues does a great service by showing why some distortions are so “sticky” in our minds.

[Photo credit: Elmer Martinez, AFP, Getty Images]