Why Are We Convinced We Know What We Don’t?

by Denis Storey
October 18, 2024 at 6:48 AM UTC

The illusion of information adequacy fuels misunderstandings and entrenched opinions in relationships and political debates.

Clinical relevance: The illusion of information adequacy leads us to mistakenly believe we have enough knowledge for decision-making.

  • New research suggests that participants lacking full information felt just as competent as those with comprehensive data, reinforcing their opinions and beliefs.
  • This cognitive bias, combined with naive realism, hinders understanding of differing perspectives.
  • Encouraging humility can improve communication and foster better decision-making in personal and societal contexts.

You don’t need (another) tight presidential election to see how profoundly the way we handle differing perspectives influences our relationships. This applies in more intimate settings, such as a marriage, and in larger public forums, like politics. How we navigate these differences usually relies on our underlying assumptions. Now, new data clears the air around the cognitive bias that can cloud these interactions.

The Illusion of Information Adequacy

The study explored how we typically fail to recognize our own knowledge gaps when making decisions. In short, the researchers wanted to know why so many of us (incorrectly) assume we have enough information to make an informed decision. These assumptions influence our decision-making processes in multiple social contexts, from interpersonal relationships to policy debates.

The team – composed of researchers from Johns Hopkins, Stanford, and Ohio State universities – divided participants into two groups:

  • A control group, which received comprehensive information about a given issue.
  • A treatment group, which received only half of that information.

Despite possessing less data, the treatment group members believed they had just as much information as the control group. As a result, they felt equally competent in making decisions. And they remained convinced that most people would agree with their conclusions.

Naïve Realism

The results underscore the role of “naïve realism,” a psychological bias in which individuals insist that their perception of the world is objective and reasonable. This bias drives them to assume that anyone who disagrees either lacks sufficient information or is simply being irrational.

Pair that with the illusion of information adequacy, and people fail to consider what they might be missing. The study’s findings ripple into countless everyday situations.

For example, the authors proposed the scenario of a driver honking at the car ahead for not moving fast enough when the light turns green. The honking driver assumes, they write, that they have all the information they need to understand the situation properly. They’re unaware that the car ahead is waiting for a pedestrian to cross the street.

This hypothetical, derived from the late George Carlin, illustrates how the illusion of information adequacy can feed frustration and foster misunderstanding.

“Carlin’s observation — that, relative to our driving, slower drivers are idiots and faster drivers are maniacs — is often invoked in discussions of naïve realism,” the authors wrote. “Perhaps the mere act of wondering, ‘Do I have enough information?’ could infuse humility and a willingness to allow more benefit of the doubt. Was the driver who just zoomed past en route to the hospital with a family emergency? Is the painfully slow driver transporting a fragile family heirloom? These types of questions might dissolve road rage (George Carlin’s or our own) before it begins.”

In their experiment, the researchers tested how this bias might influence participants’ decisions on whether to merge two schools in response to waning water resources. The control group received a balanced set of arguments for both merging and remaining separate. The treatment group, by contrast, received a one-sided set of arguments supporting one option or the other. Despite having only part of the picture, treatment participants felt they had enough information to make the best decision. They also assumed others would agree with them.

Notably, some treatment group participants clung to their original beliefs even after being exposed to the full set of arguments. The researchers inferred that once people settle on a position – even one built on incomplete data – presenting the rest of the evidence later does little to change their minds.

Breaking the Illusion

The researchers insist that the illusion of information adequacy looms as a crucial obstacle in the decision-making process. Most of us don’t realize we’re operating without all of the facts, which can fuel overconfidence. And this cognitive bias, paired with naïve realism, makes it nearly impossible for us to understand one another’s perspectives. Common ground eludes us, even when we’re starting from the same set of facts.

The study also offers insights into broader societal issues, such as political polarization. In today’s always-on information ecosystem – where people receive fragmented or biased news multiple times a day – the illusion of information adequacy can fuel misunderstandings and deepen divides. People typically remain steadfast in their opinions, even when those opinions rest on a faulty foundation.

Moving Forward With Humility

Addressing this bias might involve encouraging individuals to pause and reflect on what they don’t know. The authors argue that by recognizing their understanding could be limited, people might approach decision-making with more humility and openness to alternative perspectives. This shift in mindset could foster better communication and understanding, whether in personal relationships or global political discussions.

The illusion of information adequacy is a pervasive cognitive bias that shapes how we make decisions and interact with others. By failing to recognize the limits of our knowledge, we often misjudge situations. And we fail to find common ground with those who believe differently – even when we start from the same place. Recognizing this bias – and pushing back against it – could foster better decision-making and healthier interpersonal and societal relationships.

“Although people may not know what they do not know, perhaps there is wisdom in assuming that some relevant information is missing,” the authors conclude. “In a world of prodigious polarization and dubious information, this humility — and corresponding curiosity about what information is lacking — may help us better take the perspective of others before we pass judgment on them.”
