The Traps of Shallow Understanding: When Models and Observations Are Not Enough

jorzel

"The most dangerous thing is not ignorance, but the illusion of knowledge."

Daniel J. Boorstin

We live in an age obsessed with data, models, and explanations. More information equals better decisions - or so we believe. However, this premise contains a dangerous oversight: theoretical models and empirical observations can profoundly mislead us when they capture only partial aspects of reality.

This partial understanding often proves more dangerous than acknowledged ignorance. It creates an illusion of comprehension while leaving us blind to critical factors outside our awareness.

The False Conviction of Understanding

According to the Dunning-Kruger effect, our confidence in our understanding usually grows faster than our actual knowledge. We reach premature "eureka" moments, believing we've grasped the whole when we've only glimpsed a fraction. Sometimes having no model at all may be safer than having a partial one that creates false security.

Applying technically correct information in the wrong context leads to disaster. This is precisely the danger of partial understanding - our knowledge isn't incorrect, but incomplete in ways we don't recognize.

Entire disciplines can develop blind spots when their models achieve sufficient success to create institutional overconfidence. The history of science shows repeatedly how dominant paradigms can become intellectual prisons, limiting our ability to see evidence that doesn't fit established models—what Thomas Kuhn termed "normal science" in his landmark work on scientific revolutions.

Different Approaches to Knowledge

Black Box vs. Glass Box Approaches

Traditional wisdom typically takes a "black box" approach, focusing on observed outcomes rather than underlying mechanisms. It doesn't claim to understand how something works, only that it works under specific conditions.

Modern science pursues a "glass box" path, attempting to understand internal mechanisms and processes. This offers tremendous benefits but creates overconfidence when models oversimplify complex realities.

The Time Dimension of Traditional Wisdom

A crucial factor in the success of traditional wisdom is its foundation in long-term outcome observation, sometimes spanning several decades or even generations. This extended timeframe allows patterns to emerge that short-term studies might miss entirely.

However, traditional wisdom suffers from poor precision. It typically provides a high-level picture without explaining the specific causes of observed effects. The knowledge comes in the form of "this works" rather than "this works because of X mechanism"—a statement validated primarily by survival and continuity: "We have done it this way and survived."

The Structured Lens of Science

Science and empirical approaches are also evidence and outcome-based, but with a crucial difference: they require defining upfront what we will observe, how we'll measure it, and for how long. This structured approach enables precision but creates a narrower field of vision.

In traditional wisdom, we assess holistic effects without predefining detailed points of concern, allowing unexpected patterns to reveal themselves over time.


Consider relationship dynamics: Scientific research shows that oxytocin levels increase during female orgasm, creating bonding, while testosterone (which men have in higher amounts) blocks oxytocin's effects. Research indicates that vasopressin increases during male sexual arousal but drops after orgasm. Meanwhile, men in stable relationships show lower testosterone levels than single men.

These findings suggest that early sexual intimacy might impact men and women differently in forming relationships. Interestingly, this scientific understanding confirms traditional wisdom - the old-fashioned advice that women should be cautious about physical intimacy early in relationships (see Dawn Maslar's talk "How Your Brain Falls in Love").

For generations, this guidance was transmitted without neurochemical explanations. It represented a "black box" understanding, focusing on outcomes rather than mechanisms. Yet it captured something meaningful about human relationship patterns that science is only now explaining.

Where systems are highly complex and dynamic, black box approaches sometimes prove more reliable than glass box theories. They prioritize "what works" over "why it works" - an approach that acknowledges the limitations of our understanding.

System Complexity and Dynamism

To understand why our models fail, we need to recognize that systems vary along two critical dimensions:

  1. Complexity (simple to complex)

  2. Dynamism (stable to rapidly changing)

Understanding where systems fall on the dimensions of complexity and dynamism doesn't mean abandoning scientific approaches for complex, dynamic systems. Rather, it suggests adapting our methods and expectations appropriately.

For simple, stable systems, scientific models typically work exceptionally well. A pendulum's motion or basic chemical reactions can be modeled with high precision. Here, glass box approaches excel.

As complexity and dynamism increase, we shouldn't abandon scientific inquiry but supplement it with other knowledge sources. These systems require more caution, humility, and methodological diversity. Scientific understanding remains valuable but should be recognized as partial.

The greatest danger emerges when we misclassify systems—when we treat highly complex, dynamic systems as if they were simple and stable. This misconception leads to overconfidence in limited models.

When uncertainty about a system's complexity is high, a conservative approach makes sense. This means utilizing multiple knowledge sources, including traditional wisdom, observational patterns, and diverse perspectives, alongside scientific models.

The key insight isn't that science fails for complex systems, but that scientific understanding of such systems should be held with appropriate humility. We should be especially cautious about discarding traditional practices that have demonstrated effectiveness over time, even when we don't fully understand their mechanisms.
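The two-dimensional framing above can be sketched as a toy classifier. Everything below - the 0.0-1.0 scores, the 0.5 threshold, and the `System`/`recommended_approach` names - is an illustrative assumption of mine, not part of the original argument; it only makes the quadrant logic concrete.

```python
from dataclasses import dataclass


@dataclass
class System:
    """A system scored on the two dimensions discussed above (0.0 to 1.0)."""
    name: str
    complexity: float  # simple (0.0) to complex (1.0)
    dynamism: float    # stable (0.0) to rapidly changing (1.0)


def recommended_approach(system: System, threshold: float = 0.5) -> str:
    """Map a system's position in the two-dimensional space to an approach."""
    if system.complexity < threshold and system.dynamism < threshold:
        # Simple, stable systems: precise mechanistic models excel.
        return "glass box: model the mechanism directly"
    if system.complexity >= threshold and system.dynamism >= threshold:
        # Complex, dynamic systems: favor outcome-based knowledge.
        return "black box: rely on long-observed outcomes"
    # Mixed cases: keep the models, but supplement them.
    return "hybrid: scientific models held with humility, plus other sources"


pendulum = System("pendulum", complexity=0.1, dynamism=0.1)
ecosystem = System("ecosystem", complexity=0.9, dynamism=0.8)
print(recommended_approach(pendulum))   # glass box ...
print(recommended_approach(ecosystem))  # black box ...
```

The point of the sketch is the misclassification danger from the text: feed a complex, dynamic system into the first branch and you get confident, wrong guidance.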

The Narrowness Problem: Real-World Examples

The information isn't wrong - it's dangerously incomplete. And often, we aren't even aware of what we're missing.

Example 1: The Depression Paradox

The modern approach to depression reveals multiple dimensions of model narrowness.

First, many therapeutic approaches emphasize emotional awareness and introspection, inadvertently encouraging excessive self-focus that can trap individuals in cycles of negative thinking.

Second, and perhaps more fundamentally, our model is predominantly reactive rather than preventative. We mobilize resources to treat depression once it appears instead of designing lives that naturally foster psychological well-being. This reactive stance reflects a medical model that seeks to "fix" mental health problems rather than a wisdom tradition that would ask how to structure daily life, communities, and meaning systems to prevent such problems from emerging.

Traditional approaches across cultures often embedded practices - from communal rituals to meaningful work structures and intergenerational connections - that naturally supported psychological health without explicitly targeting it.

The depression paradox thus reveals a double narrowness: overfocusing on emotions rather than transcendent purpose, while simultaneously prioritizing treatment over prevention. This perfectly illustrates how technically accurate but incomplete models can lead us astray in complex systems like mental health.

Example 2: Dietary Recommendations

Nutritional science in the 1970s blamed dietary fat for heart disease.

This led to decades of low-fat diet recommendations. The model wasn't entirely wrong. Some fats do correlate with heart disease. But the model missed important details. It didn't distinguish between healthy and unhealthy fats. It also missed how people often replaced fat with sugar and refined carbs. These substitutions may have made health worse, not better.

The problem wasn't incorrect information but incomplete understanding. Only by seeing the full complexity of nutrition could better guidance emerge. This shows how partial models can create an illusion of knowledge while missing crucial parts of reality.

Example 3: Digital Learning

Scandinavian schools embraced digital learning tools in the early 2010s.

The reasons seemed solid. Digital tools offered better access to information. They could adapt to different learning styles. They prepared students for tech careers.

But by 2018-2019, new data revealed surprising findings. Students who took notes by hand understood concepts better; handwriting activated brain circuits and built neural pathways that typing could not match. Physical books also improved reading comprehension more than digital texts.

The initial model missed how writing physically affects brain development. Scandinavian schools then brought back traditional writing alongside digital tools. This shows how even evidence-based changes can cause problems when based on partial understanding. Traditional practices sometimes contain wisdom that only becomes clear after observing long-term results.

Progressive vs. Conservative Decision-Making

Given that all understanding is inherently partial, how should we approach decisions? The answer depends largely on where a system falls along the dimensions of complexity and dynamism:

When to Use Progressive Approaches (Model-Based Understanding)

  • Simple, stable systems where models can be accurate

  • Moderately complex but stable systems where patterns remain consistent

  • Situations requiring quick adaptation to novel challenges

When to Use Conservative Approaches (Outcome-Based Traditional Wisdom)

  • Complex, dynamic systems where models quickly become outdated

  • Situations where unexpected side effects or emergent properties are likely

  • Systems with long historical precedent and gradual change

The choice between progressive (model-based) and conservative (tradition-based) approaches isn't arbitrary but should be guided by careful system assessment:

System classification: Where does the system fall on complexity and dynamism scales?

  • Simple, stable systems (basic physics, mechanical processes) → Progressive approaches excel

  • Complex, dynamic systems (social relationships, ecosystems) → Conservative approaches provide important safeguards

Consequence asymmetry: What are the potential costs of error?

  • When false positives and false negatives carry similar costs → Progressive approaches may be justified

  • When errors could cause irreversible harm → Conservative approaches offer important protection

Knowledge state: How much do we genuinely understand?

  • When mechanisms are well-established and tested → Progressive approaches build on solid ground

  • When mechanisms remain partially understood → Conservative approaches acknowledge these limitations

Integration potential: Can both approaches be combined?

  • Often, the wisest path incorporates elements of both: using traditional wisdom to identify patterns and scientific approaches to understand mechanisms

This framework doesn't privilege either approach absolutely but recognizes that their value depends on context. The truly sophisticated decision-maker knows when to innovate and when to rely on established wisdom - a judgment that requires epistemic humility.
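The four-part assessment above can be condensed into a minimal sketch. The function name, the boolean inputs, and the simple "vote counting" are my assumptions for illustration - the article proposes qualitative judgment, not a literal scoring rule.

```python
def choose_approach(
    complex_dynamic: bool,            # system classification
    irreversible_harm_possible: bool, # consequence asymmetry
    mechanisms_well_understood: bool, # knowledge state
) -> str:
    """Toy aggregation of the three assessment questions in the framework.

    Each factor the text associates with caution pushes the recommendation
    toward the conservative, tradition-based side; mixed signals point to
    the integration option.
    """
    caution_votes = sum([
        complex_dynamic,
        irreversible_harm_possible,
        not mechanisms_well_understood,
    ])
    if caution_votes == 0:
        return "progressive"
    if caution_votes == 3:
        return "conservative"
    # Integration potential: combine traditional patterns with models.
    return "integrate both"


# A simple, stable, well-understood system with symmetric error costs:
print(choose_approach(False, False, True))  # progressive
# A complex ecosystem where errors may be irreversible:
print(choose_approach(True, True, False))   # conservative
```

Treating the criteria as votes rather than a single veto mirrors the article's point that most real decisions land in the middle, where both approaches contribute.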

Toward Epistemic Humility

"6.371 The whole modern conception of the world is founded on the illusion that the so-called laws of nature are the explanations of natural phenomena."

"6.372 Thus people today stop at the laws of nature, treating them as something inviolable, just as God and Fate were treated in past ages. And in fact both are right and both wrong: though the view of the ancients is clearer in so far as they have a clear and acknowledged terminus, while the modern system tries to make it look as if everything were explained."

Tractatus Logico-Philosophicus, L. Wittgenstein

Epistemic humility means recognizing the inherent limitations of our understanding. It isn't merely a philosophical position but a practical decision-making tool, guiding us in selecting appropriate approaches for different contexts:

  • It encourages us to accurately assess complexity and dynamism before choosing an approach

  • It reminds us to maintain awareness of blind spots in our chosen perspective

  • It motivates us to seek diverse viewpoints that might capture different aspects of reality

  • It keeps us open to adjusting our approach as new information emerges

Far from undermining decision confidence, epistemic humility strengthens it by acknowledging complexity rather than pretending it doesn't exist. The most dangerous decisions often come not from recognized uncertainty but from unrecognized certainty - the illusion that our partial understanding represents the complete picture.

This is where traditional wisdom and scientific approaches can complement each other. The former often embodies humility through its focus on "what works" rather than "why it works," while the latter provides precision and adaptability. Together, they offer a more complete approach to navigating complex reality than either could alone.


Written by jorzel

Backend developer with a special interest in software design, architecture, and system modelling. Trying to stay in a continuous learning mindset. Enjoys refactoring, clean code, DDD philosophy, and the TDD approach.