truth and lies about equally. The upshot is that promoting truth may
depend more on social and psychological factors than on reiterating the evidence alone – or blaming the technology.
Cognitive & Social Factors in the Transmission of Information
If one is aware of how lies spread, one can begin to defuse their influence.
In an earlier column, I described five elements commonly used by science con-artists (Allchin, 2012). They generate trust and belief through
(1) style, (2) disguise, (3) exploiting social emotions, (4) conjuring
doubt, and (5) flooding the media. The proliferation of fake news underscores the urgency of teaching about those tactics in broader contexts.
First, information con-artists flood the Internet and airwaves with
their message. Often, they pay to have their bogus claims broadcast
more widely or prominently. For example, advertising on Google is
often tied to keywords in the user’s search. That’s one way climate-change naysayers have gained traction, by appearing higher on the list
of search results. The prevalence or availability of a message matters.
People assess the reliability of information in part by gauging the
beliefs of others. We tend to trust – and thus are susceptible to –
the voice of the herd. So, people can now buy influence through a
company that generates fake extra views for their YouTube channel.
Or writes fake positive reviews on Amazon or Yelp. Or creates fake
followers on Twitter. More mendacity. A majority of individuals –
perhaps as many as 95% – tend to follow what others do. Of course,
there is a fine line between following the “wisdom of the crowd” and
unhealthy mob behavior. Sadly, merely flooding the media with disinformation increases the chances that it will be believed.
A second factor in spreading lies is social emotions: the common
desire to feel part of a group. Pursuit of a sense of belonging can bend
beliefs. People tend to align their ideas and values to “fit in” and to show
solidarity with those around them (Kahan, 2013). For example, persistence of belief in creationism is strongly influenced by the dynamics of
social cohesion (Sacred Bovines, February 2013). Similar behavior is
found among those who reject the human causes of climate change.
Put crudely, group membership or loyalty to family can easily trump evidence. Sociologist of science Bruno Latour provocatively called that
form of thinking “sociologics”: reasoning based on the strength of social
connections rather than on traditional deduction. Thus, the need for
social identity and acceptance may help perpetuate conspiracy theories
as much as any misunderstanding of how science checks facts. Similarly, social networks that bring like-minded people together foster
mutual accord, intolerance of dissent, and large-scale attitudinal polarization. Social dynamics seem to contribute significantly to the uncritical
acceptance of fake news – and to the dismissal of relevant evidence.
These two factors (flooding the media and exploiting social emo-
tions) are tactics exploited by both science con-artists and purveyors
of fake news. Another factor that contributes significantly to the
reception of misinformation is prior beliefs and values. New informa-
tion is subject to mental filters. Evidence that concurs with one’s cur-
rent beliefs is readily accepted (without critical review), while
evidence that challenges those ideas receives extra scrutiny or is dis-
counted: what is known as confirmation bias. All sorts of websites
exist for fact-checking, but their value is limited if no one feels the
need to consult them. Likewise, a plausible explanation combined
with a few examples can often convince someone of an “agreeable”
but ultimately unfounded claim. Suggestiveness can substitute for
documented facts. The “truth will out” only with more complete
or systematic evidence – and the motivation to find it.
Even worse, perhaps, values, ideology, and feelings can also
influence how one views facts, a more potent process that psycholo-
gists call “motivated reasoning” (Kunda, 1990). That is, a willingness
to believe can strongly influence actual belief. One can even dismiss
contradictory evidence outright. Motivated reasoning underlies some
of the most strident anti-science sentiments (Kraft et al., 2015). Cog-
nitive studies have documented how political beliefs lead individuals
to reject scientific consensus and even the relevance of evidence on
issues from climate change to the safety of nuclear waste disposal
to the social consequences of carrying concealed weapons (Kahan
et al., 2011). Other investigations have shown how local economic
conditions prime the thinking of residents on the reality and causes
of climate change. As noted above, ideological interpretations of
evidence can further intersect with mental processes that preserve a
sense of loyalty to and affinity with important groups (Kahan,
2013). In short, ideology can make any individual vulnerable to
accepting a claim that has no factual merit or rejecting one that does.
Immunization by Inoculation
What is to be done? First, exposing lies promptly is important. It’s
easier to prevent errors than to fix them and the harm done in the
interim. That means being prepared to identify misinformation,
disable it, and curb its spread.
One could, of course, instruct students to debunk each false
claim on its own, with rules of evidence and a healthy library of
fact-checking websites. But that approach demands endless effort. Instead, one can “inoculate” individuals against fake news. Recent research confirms the adage that “forewarned is forearmed.” When individuals
become aware of the strategies used to spread misinformation, they
are better able to recognize it and neutralize its effect (Cook et al.,
2017). So, teaching about those tactics seems critical to disarming
“alternative facts.” Again, the con-artist’s ploys include
• style, aimed to evoke trust
• disguise, or the falsified appearance of expertise
• exploiting social emotions
• conjuring doubt
• flooding the media (Allchin, 2012)
Case stories and examples from history can help convey the basic
lesson. An excellent documentary that vividly unmasks the disinformation “playbook” is Merchants of Doubt. For an extended student
inquiry activity on trust and credibility, see Zemplén (2009).
Equally important, perhaps, is teaching about our cognitive vulnerabilities. We should understand how our minds, left unchecked, can mislead us. With practice, we can be alert to, and outsmart, our unfruitful habits and intuitions. Of course, our tendencies may be easier to appreciate by first seeing how they affect other people – all unconsciously. Confirmation bias and motivated reasoning can
cripple our own efforts to get to the facts. Just understanding this
allows us to monitor our reasoning for possible lapses.
Ideally, self-analysis needs to be coupled with an underlying
respect for truth. Commitment to the value of facts over the other
values that can distort thinking is essential. “A necessary condition