When a rival lies or cheats, we demand justice. But when a friend does, we offer excuses. Likewise, we believe our team plays by the rules while others bend them.
Our perception of honesty, it turns out, depends on the messenger.
When someone from our in-group bends the truth, we call it strategic, but when the out-group does it, we call it deceit.
In a modern era of algorithmic bubbles, deep fakes, and partisan feeds, the cost of this bias grows. When we assume the “other side” lies more, we stop fact-checking ourselves. This fuels misinformation and distrust.
In a recent study, over 5,000 participants rated out-group members as more likely to lie than members of their own group, despite having no hard evidence. The more we cling to our group identity, the more distorted our ethical radar tends to become.
That distortion matters. It can fuel whisper campaigns, discrimination, unfair penalization, and disproportionate punishment, all of which deepen division as polarization intensifies.
The Honesty Illusion
In the study, participants were offered the opportunity to engage in dishonest behavior that either benefited an in-group or an out-group member.
Observers predicted the out-group would lie more than the in-group. In fact, there was no difference in actual lying.
This illusion that our own group is the honest broker emerges from well-documented mechanisms.
Consider social identity theory. We instinctively categorize others as “us” or “them.” Whether in sports, politics, or religion, we trust the in-group and distrust the out-group.
After all, moral superiority feels safer than self-scrutiny.
Another mechanism is attribution error. When we behave questionably, we blame external, situational forces: “we were under pressure or stressed.” When others behave the same way, we blame character flaws: “they’re dishonest or greedy.”
Similarly, a leader notices a cost overrun and assumes the subcontractor must be concealing something. The mistake is seen as deliberate deception. Yet if a departmental colleague misses a deadline or sales target and hides it, the same leader may label it bad timing or unfortunate circumstances.
The assumption that outsiders deceive us more becomes a way of defending our identity. I write about this ethical misjudgment trap in my book TUNE IN. Psychologists call it moral typecasting: seeing “them” as moral violators and “us” as moral victims. It was on display when the Houston Astros baseball team was caught in a sign-stealing scandal and fined $5 million. Rival fans condemned it as cheating; supporters framed it as “gamesmanship.”
It’s Pervasive
From boardrooms to living rooms, loyalty trumps integrity more often than we would like to admit. Business offers clear examples: when Volkswagen manipulated emissions tests, internal teams rationalized the deception as “protecting jobs.”
During the Boeing safety crisis, insiders described issues as “communication failures,” while regulators called them “cover-ups.” This revealed deep tribal bias at an organizational level.
It’s also dominant in politics. Supporters of one party assume the other side cheats, exaggerates, or distorts facts. U.S. Democrats and Republicans may well overestimate the other’s likelihood of lying—despite fact-checking records. Meanwhile, internal party scandals may be downplayed or dismissed.
The same pattern appears in sports, religion, and national identity.
Have you ever considered how misjudging dishonesty can skew perceptions and policies? Law enforcement, corporate oversight, and judicial systems risk focusing on the wrong targets—chasing out-groups while overlooking internal wrongdoing.
From a decision-making perspective, this rush to judgment carries several material costs.
- It creates false positives. Assigning dishonesty to outsiders when none exists fosters mistrust and can lead to violence. It also undermines team and community cooperation. Once suspicion sets in, it’s hard to reverse.
- It creates false negatives. Overlooking our own group’s misconduct because we assume moral purity limits learning and accountability. I see this from sports fans defending fouls to managers covering up for underperforming colleagues.
- It distorts risk. If professional auditors focus only on external vendors and ignore client dynamics, they risk missing larger financial or operational problems. Often unconsciously, doctors dismiss out-group patients’ concerns more easily, and recruiters rate candidates from other disciplines or countries slightly lower.
These moral asymmetries waste time, resources, and goodwill. Worse, they erode integrity.
Your Truth Radar
The good news is that perception can be recalibrated. Several science-based remedies can reduce truth gaps.
- Perspective shifting: Before judging another group, imagine being them. What pressures, incentives, or norms would you face? This exercise replaces identity-based attributions with situational empathy.
- Audit symmetry: In your social circles or organizations, scrutinize everyone equally. Ensure your conclusions aren’t skewed by group membership, whether political party, religious group, race, or nationality.
- Signal checking: When you suspect dishonesty, ask yourself: “Would I assume this if it were my own group?” This simple flip test highlights double standards.
When our radar is tuned only to others, we blind ourselves to threats from within. Blind spots and deaf spots let ethics fade. Over time, deliberately re-examining evidence replaces inbuilt prejudice and builds a more accurate sense of risk.
We are wired to overestimate out-group dishonesty and favor in-groups, and in today’s interconnected, noisy world that wiring spreads misinformation. The challenge is to make re-evaluation a habit.
Smarter judgment starts with not simply believing you’re right but decoding why you think so.

