It can seem like everyone feels entitled to their own opinions and even their own facts. But skeptic Michael Shermer argues that reports of truth’s death are greatly exaggerated.
In his forthcoming book, Truth: What It Is, How to Find It, and Why It Still Matters, he shows that while our species is susceptible to self-deception, we’re not condemned to it.
Imagine you’re at work, Shermer writes, and you get a call from your neighbor warning that suspicious-looking people seem to be casing your house. You call the police. They say they don’t see anything out of the ordinary. Then your neighbor calls again to say there’s a moving truck in your driveway. Again, the police assure you that they don’t see anything. Then your neighbor calls a third time, this time frantic because he sees people inside your home. What do you do?
Most people, Shermer notes, would rush home. Even if your neighbor was lying or mistaken, acting on the belief that your neighbor was telling the truth “would be a rational response to an apparently real threat.”
That same logic, Shermer says, can explain how thousands ended up storming the U.S. Capitol on January 6, 2021. When people are convinced that their democracy is being stolen, the impulse to do something overrides doubt. People act on what they believe is true, which is why seeking the actual truth remains essential.
Psychologically, this is the crux of the “post-truth” problem. Especially in times of uncertainty, our reasoning can become a servant of belonging. Psychologists call this myside bias. Shermer cites Keith Stanovich’s research showing that highly intelligent people are even better than less intelligent people “at rationalizing beliefs that they hold for non-smart reasons.”
Why We Believe Our Own Side
“There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.”
Donald Rumsfeld, U.S. Secretary of Defense
While, by definition, Shermer can’t address unknown unknowns, he organizes the book around known knowns (things like evidence, causation, and how to think about highly unlikely events), known unknowns (things we know we don’t know), and a section on known unknowables (like consciousness, free will, and God).
The section on known knowns is a guided tour through the architecture of motivated reasoning. Drawing on studies from cognitive psychology, behavioral economics, and neuroscience, it demonstrates that belief formation is not primarily a search for truth but a negotiation between evidence and identity.
The examples are familiar to anyone who has watched science become tribal: climate change skeptics distrust government; anti-vaxxers distrust pharmaceutical companies; fundamentalists distrust Darwin. “In many cases, it isn’t the truth about the facts that is under dispute,” Shermer observes. Instead, each group’s bias serves a psychological function: it protects a primary human need, belonging.
The implication is sobering. Humans are not gullible because we’re stupid; we’re gullible because we’re social. As the cognitive scientist Hugo Mercier has shown, belief functions as a signal of trust within groups. We’re wired less for independent verification than for alliance.
As social psychologist Jonathan Haidt noted in The Righteous Mind, “When it comes to moral judgments, we think we are scientists discovering truth, but we are actually lawyers arguing for positions we arrived at by other means.” This helps explain why arguments often don’t change minds—unless they also change identities.
How to Reason
Yet Shermer’s outlook is not cynical. The same mental machinery that enables bias also enables reason—when properly trained. Quoting British warrior-scholar Alfred Mander, Shermer reminds us that thinking is “skilled work.” It can be practiced. It can even be rehabilitated.
Shermer defends the Enlightenment project not as a historical artifact but as a cognitive technology: a set of tools—science, skepticism, free speech—that extend the reach of reason beyond the tribal brain. He contrasts it with postmodernism’s claim that all truth is socially constructed.
To illustrate the difference between rhetoric and reality, he recounts physicist Richard Feynman’s demonstration during the Challenger disaster investigation: dunking a sample of the shuttle’s O-ring material in ice water to show how it lost elasticity at 32°F (0°C). “Reality must take precedence over public relations,” Feynman concluded. Shermer turns that into a credo for the age of misinformation.
The point isn’t that scientists are free of bias; it’s that science, done properly, is a bias-correcting enterprise. As historian Naomi Oreskes argues, objectivity doesn’t depend on perfect people but on diverse, open communities that allow criticism to do its work. That social-epistemic insight—that truth is a community function—is why universities need a diversity of thinkers with differing biases.
A recent doctoral dissertation at Ohio State by HyeKyung Park illustrates that the wisdom of crowds improves group performance by canceling out noise, but not bias. If everyone’s guesses are consistently skewed in the same direction, averaging doesn’t improve accuracy.
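To see why averaging fixes one kind of error but not the other, consider a minimal simulation (an illustrative sketch, not code from Park’s dissertation; the quantities and numbers are invented): independent noise points in random directions and averages out as the crowd grows, while a shared bias shifts every guess the same way and survives the average.

```python
import random

random.seed(42)  # reproducible run

TRUE_VALUE = 100.0   # the quantity the crowd is estimating (hypothetical)
NOISE_SD = 30.0      # random scatter in individual guesses
SHARED_BIAS = 15.0   # systematic error pushing every guess the same way
N = 1000             # crowd size

# Noisy but unbiased crowd: individual errors point in random directions.
unbiased = [random.gauss(TRUE_VALUE, NOISE_SD) for _ in range(N)]

# Same noise, but every guess is shifted in the same direction.
biased = [random.gauss(TRUE_VALUE + SHARED_BIAS, NOISE_SD) for _ in range(N)]

def mean(xs):
    return sum(xs) / len(xs)

print(f"true value:             {TRUE_VALUE:.1f}")
print(f"unbiased crowd average: {mean(unbiased):.1f}")  # ~100: noise cancels
print(f"biased crowd average:   {mean(biased):.1f}")    # ~115: bias survives
```

The averaging step shrinks random scatter roughly with the square root of the crowd size, but it leaves the shared offset untouched, which is the statistical version of Shermer’s point about communities needing thinkers with differing biases.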
Pluralistic Ignorance and the Spiral of Silence
In a particularly psychologically astute chapter, Shermer explores pluralistic ignorance, the phenomenon in which people privately reject a belief yet assume most others hold it, and therefore stay silent. Drawing on Aleksandr Solzhenitsyn’s The Gulag Archipelago, he recounts the infamous 11-minute standing ovation for Stalin, sustained because no one dared stop clapping first.
The mechanism not only sustained authoritarian movements in the past; it suppresses dissent today. When the social costs of speaking honestly outweigh the benefits, the result is self-censorship and even preference falsification: professing opinions one doesn’t actually hold out of fear of repercussions. “To break a spiral of ignorance,” Shermer writes, “two elements are necessary: knowledge and communication.”
The lesson is psychological as much as political: truth requires courage.
The “Post-Truth” Era
Are we really in a post-truth era? Shermer’s answer is an emphatic no. “If the statement ‘We are living in a post-truth era’ were true,” he declares, “then it wouldn’t be.” The paradox reveals what psychologists have long known: Even those who deny objectivity can’t help but rely on it. To claim that nothing is objective is an objective statement.
For Shermer, the crisis isn’t epistemological—it’s emotional. The human brain evolved to value solidarity over accuracy. The challenge is to cultivate institutions and habits that make truth-seeking socially rewarding rather than socially costly.
That insight connects his work to broader psychological themes: cognitive dissonance (the discomfort of holding conflicting beliefs), the need for cognitive closure (our intolerance of ambiguity), and identity-protective cognition (our instinct to defend the tribe). Shermer reframes these not as pathologies but as design flaws that can be mitigated.
Skepticism as Civic Virtue
Quoting the late Stephen Jay Gould, Shermer writes that moral decency alone isn’t enough; reason must join it. “Skepticism,” Gould argued, “is the agent of reason against organized irrationalism—and is therefore one of the keys to human social and civic decency.”
The antidote to misinformation isn’t scolding people for being irrational; it’s teaching the psychology of belief itself. Shermer’s approach mirrors therapeutic techniques in cognitive-behavioral therapy (CBT): Identify distortions, examine evidence, replace automatic thoughts with reality-based ones.
In this sense, Truth is a kind of therapy for civilization.

