by John Weckerle
Throughout our history at New Mexico Central we have, at times, attempted to shine a light on what is now widely termed “fake news” when we see it (and when time permits) – and we will continue to do so as often as we can. In concert with this, we have long been fascinated by the factors that lead some people to go on believing things even when presented with factual evidence that disproves those beliefs. This phenomenon has never been more obviously present or widespread than in the year or so leading up to the 2016 U.S. presidential election, and in the months since.
In a Scientific American article titled How to Convince Someone When Facts Fail: Why worldview threats undermine evidence (originally published under the title “When Facts Backfire”), Michael Shermer, author of the magazine’s Skeptic column (and also the founding publisher of Skeptic magazine and author of The Believing Brain), discusses cognitive dissonance, which he describes as “the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.” In the article, he quotes a study by Brendan Nyhan (Dartmouth College) and Jason Reifler (University of Exeter) in which subjects were provided first with fake newspaper articles and then with an article correcting the misinformation in the first. After reading the correction, the subjects believed the initial article even more strongly. The researchers termed this “the backfire effect,” in which corrections actually increase misperceptions among the group in question. The reason: “Because it threatens their worldview or self-concept.”
Dr. Shermer provides some fascinating information on this phenomenon in The Believing Brain, and many of the relevant concepts are discussed in his TED Talk The pattern behind self-deception, which we highly recommend (along with his other TED Talk, Why people believe weird things). In the former talk, as in the book, Dr. Shermer explains “patternicity” – the “tendency to find meaningful patterns in both meaningful and meaningless noise” – and identifies two types of error: Type I, seeing a pattern where there is none, and Type II, not perceiving a pattern that is real. These are false positives and false negatives. Later in the talk, he introduces the concept of “agenticity,” the “tendency to infuse patterns with meaning, intention and agency, often invisible beings from the top down.” Agents may include a number of concepts; some examples provided include ghosts, gods, aliens, intelligent designers, and government conspirators. He then addresses conspiracy theories, observing that many are believed even after they are shown to be false – while noting, of course, that some conspiracy theories are actually true.
In the Scientific American article, Dr. Shermer provides a strategy for potentially changing at least some minds caught up in believing falsehoods – a strategy very much like the one New Mexico Central has followed, albeit admittedly sometimes less than perfectly. We have focused specifically, in many cases, on re-posted/recycled falsehoods that fall squarely into the category of fake news. What we have found, at least in a couple of cases, is that directly addressing the re-posts sometimes results in their appearing less frequently, and in one case may have led to a cessation. Nobody likes to be shown to be purveying falsehoods, and stopping fake news anywhere in the chain can only help, even if just a little. With two years until the mid-term elections, we have a lot of work to do in the hope that voters will have better information on which to base their decisions than they did last year.
To that end, we have expanded into the “Twitterverse” and will be moving into Facebook, so we can find, follow, and potentially correct misinformation and disinformation as it is forwarded/re-posted. We hope that our readers, when presented with the inevitable e-mail forwards, re-posted articles, and similar communications containing fake news or misleading information, will consider sending us a link or forwarding them to email@example.com, providing us with the source so we can follow up.