Think and Save the World

The Practice of Sitting with Not Knowing


The Neuroscience of Uncertainty Aversion

The brain's drive toward certainty is not a character weakness — it's a design feature. The predictive processing framework in neuroscience describes the brain as essentially a prediction machine: it constantly generates models of what's about to happen, compares those predictions against incoming sensory data, and updates when predictions fail. The goal of this system is to minimize what's called "prediction error" — the gap between what was expected and what actually happened.
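
The update loop the framework describes can be caricatured in a few lines of Python. This is purely illustrative — a toy delta-rule estimator, not a model from the research — but it shows the basic shape: predict, measure the error, nudge the prediction toward the data.

```python
# Toy sketch of predictive processing: a running prediction is nudged
# toward each new observation in proportion to the prediction error.
# The learning_rate value is arbitrary, chosen for illustration.
def update(prediction, observation, learning_rate=0.3):
    error = observation - prediction           # prediction error
    return prediction + learning_rate * error  # move toward the data

belief = 0.0
for observation in [1.0, 1.0, 1.0, 1.0]:
    belief = update(belief, observation)
# belief drifts toward 1.0 as repeated observations confirm the pattern
```

Uncertainty, in these terms, is a stream of observations too noisy or too sparse for the loop to converge: the error term never settles, and the system keeps signaling that something is unresolved.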

Uncertainty is prediction error that can't be resolved. The prediction machine is running without enough information to make a good prediction, and it knows it. This activates a sustained low-level threat response. Research by Archy de Berker and colleagues at University College London showed that uncertainty about whether you'll receive an electric shock is more stressful than the certainty that you will. The certainty of bad news is easier to bear than the uncertainty of possible bad news.

This explains a lot of human behavior that looks irrational from the outside. Why people stay in relationships they know are wrong for them. Why people find comfort in conspiratorial thinking. Why political extremism feels like relief to people who are psychologically overwhelmed. The false certainty of "I know exactly who to blame" and "I know exactly what to do" is neurologically soothing, even if the content is catastrophically wrong. The brain would rather have a map — even a wrong one — than no map at all.

Understanding this mechanism doesn't dissolve it. But it reframes it. The pull toward premature closure isn't weakness. It's biology. The work of building uncertainty tolerance is working against a biological current, which means it requires deliberate practice, not just better intentions.

Apophatic Thinking

One of the oldest approaches to sitting with not-knowing comes from apophatic theology — the via negativa tradition in mystical philosophy. The idea: some things cannot be known directly or positively. You can only circle around them by saying what they are not. God, in this tradition, is not this, not that, not this either. You approach the unknown asymptotically, through negation, without ever arriving at a final description.

This sounds like philosophy and it is, but it has practical applications well outside theology. The apophatic move is useful whenever you're dealing with something genuinely complex — which is most things that matter. You don't know what the best career decision is, but you can get clearer on what it isn't. You don't know what's wrong in a relationship, but you can articulate what's not working. You can navigate by negation toward the shape of what you're looking for without claiming to have arrived.

The practical version: when you're in genuine uncertainty, try making a list of what you can confidently rule out. What is this definitely not? What would you definitely not accept? What paths are genuinely closed? The shape of the remaining space — what's left after you've eliminated what you can — often gives you more traction than trying to directly identify the answer.
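
The procedure above is essentially elimination over an option space, and it can be sketched in a few lines of Python. The options and rule-outs here are invented examples, not a real decision method:

```python
# Illustrative sketch of navigating by negation: start from a space of
# options and remove what you can confidently rule out. The options and
# rules below are hypothetical examples.
options = ["stay put", "same-field move", "career switch", "freelance"]

# Each rule encodes one answer to "what is this definitely not?"
rule_outs = [
    lambda o: o == "stay put",   # definitely not the status quo
    lambda o: "switch" in o,     # not a full field change right now
]

remaining = [o for o in options if not any(rule(o) for rule in rule_outs)]
# remaining narrows to ["same-field move", "freelance"]
```

The point is not the code but the shape of the move: each negation shrinks the space, and the shape of what survives often tells you more than a direct attempt at the answer would.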

Keats and Negative Capability

In December 1817, John Keats wrote a letter to his brothers George and Thomas. In it, almost as an aside, he described something he'd been thinking about — the quality that made Shakespeare great:

"I mean Negative Capability, that is, when a man is capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason."

Keats was writing about creative genius, but he identified something that goes far beyond creativity. The "irritable reaching after fact and reason" he describes is exactly the brain's prediction machinery in overdrive — the compulsive need to resolve uncertainty even when the situation isn't yet resolved, even when premature resolution would close off something important.

The greatest mistake in high-uncertainty situations is to reach for resolution before the situation has shown you what it is. This is true in creative work — the writer who decides too early what a story is about will miss what it's trying to become. But it's equally true in business, in relationships, in leadership. The leader who declares certainty in an ambiguous situation doesn't reassure — they lock in a trajectory that may be wrong. The person in a new relationship who decides in month three exactly what kind of relationship this is may foreclose what it could have become by month twelve.

Negative capability is not passivity. Keats was an intensely active poet — he wrote constantly, revised constantly, engaged constantly. The negative capability was in his willingness to hold open questions open while working with them, rather than forcing premature resolution.

The Zen Beginner's Mind

Shunryu Suzuki, the Soto Zen teacher who helped bring Zen Buddhism to the West, had a famous line: "In the beginner's mind there are many possibilities, but in the expert's mind there are few."

Beginner's mind (shoshin in Japanese) is the practice of approaching each situation as if you don't already know what's there. This is genuinely hard to do as you accumulate experience, because expertise is largely the accumulation of pattern recognition — shortcuts that let you recognize situations quickly and respond appropriately. This is enormously valuable. It's also a source of systematic error, because patterns make you prone to seeing what you expect to see rather than what's actually there.

The expert who has seen a thousand presentations can grade yours in the first thirty seconds. Sometimes that fast pattern-recognition is right. Sometimes they've stopped seeing your actual presentation and are seeing the projection of their pattern. The beginner's mind is the corrective — it keeps the expert genuinely looking, genuinely uncertain, genuinely open to being surprised.

This is not anti-expertise. It's the meta-skill that keeps expertise honest. The best practitioners in any field hold both: the knowledge that experience has given them, and the openness to what the specific situation in front of them might be showing them that doesn't fit the pattern.

Epistemic Humility vs. Epistemic Paralysis

There's an important distinction between healthy uncertainty tolerance and what might be called epistemic paralysis — the inability to act because you can't achieve complete certainty first.

Epistemic paralysis looks like humility but is often a kind of fear. "I don't know enough yet" can be a genuine observation or a strategy for avoiding the vulnerability of commitment. The person who never launches because they're still researching, who never commits because they're not sure yet, who never takes a position because any position might be wrong — these aren't people with good uncertainty tolerance. They're people who are using uncertainty as a shield.

Genuine uncertainty tolerance is different. It means: I don't know the full picture, and I'm going to act anyway based on what I do know, while remaining genuinely open to updating as new information comes in. It's the capacity to hold a position lightly — committed enough to act on it, loose enough to change it. The scientist who has a hypothesis, runs the experiment committed to finding out whether it's true, and updates when the data contradicts it is modeling this. The person who refuses to form a hypothesis because they might be wrong is modeling paralysis.
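
The scientist's stance — committed enough to act, loose enough to update — is what Bayesian updating formalizes. Here is a minimal sketch with invented numbers, just to show that holding a position lightly is a mechanical operation, not a mood:

```python
# Illustrative Bayesian update: commit to a hypothesis with a prior,
# then let evidence move the posterior. All numbers are invented.
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

belief = 0.7  # confident enough to act on the hypothesis

# A surprising result arrives: this outcome is far more likely
# if the hypothesis is false (0.6) than if it is true (0.1).
belief = bayes_update(belief, likelihood_if_true=0.1,
                      likelihood_if_false=0.6)
# belief drops sharply — the position was held lightly enough to update
```

Epistemic paralysis, in these terms, is refusing to set a prior at all; false certainty is setting the prior to 1 and ignoring the likelihoods.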

The practical version: the question is not "do I know enough to act?" The question is "do I know enough to take the next step?" The next step is usually much smaller than the final answer. You rarely need complete certainty to take the next step. You just need enough.

Tolerance as a Trainable Capacity

Uncertainty tolerance is not a fixed trait. It's a skill, and it responds to training like other skills.

Research by Jeremy Carpendale and others on epistemic development suggests that tolerance for ambiguity increases with both age and deliberate exposure. You build uncertainty tolerance the same way you build any other tolerance: gradual exposure, without the safety behavior (the premature resolution, the compulsive checking, the manufactured certainty).

In clinical terms, the treatment for intolerance of uncertainty looks very similar to exposure therapy for anxiety: you deliberately expose yourself to small doses of not-knowing, resist the urge to resolve it prematurely, and let yourself discover that you can survive the ambiguity. Over time, the threat signal weakens. The brain begins to trust that uncertainty is not equivalent to danger.

You can practice this at small scale in daily life: when you don't know what someone meant by a message, wait before asking. When you don't know how something will turn out, resist the urge to "just see if you can find out." Let the ambiguity sit for a day. Notice what your nervous system does with it. Then, when you resolve it, notice how often the reality was better than what you'd manufactured, or at least different from the worst-case your brain defaulted to.

Practical Exercises

The "I don't know" practice — For one day, catch every time you speak with false certainty about something you actually don't know. Replace "it'll probably be fine" with "I don't know, we'll see." Replace "she was definitely annoyed at me" with "I'm not sure what she was feeling." This is not about becoming wishy-washy. It's about bringing your speech into alignment with your actual epistemic state. It's remarkably uncomfortable at first.

Sitting with an open question — Choose one significant open question in your life. Write it down. Then, for thirty days, do not try to resolve it. Let it sit. Bring curiosity to it rather than urgency. Notice what the question opens up rather than trying to close it. At the end of thirty days, see what's emerged.

Pre-mortem for false certainty — When you find yourself feeling certain about something in a high-stakes situation, run a deliberate pre-mortem: assume your certainty is wrong. How would you find out? What evidence would change your mind? What are you not looking at because you've already decided? This is not self-doubt for its own sake — it's using negative capability as a quality check on your reasoning.

Deliberate exposure to irresolution — Read fiction that doesn't resolve cleanly. Spend time in art that asks questions rather than answers them. Sit with music that doesn't resolve to a tonic chord. Seek out conversations with people who hold views you can't quickly dismiss but can't fully reconcile with your own. Practice staying in the unresolved rather than reaching for the exit.

The World Stakes

The great crises of the modern world are crises of false certainty. Leaders who were sure they knew what was right and stopped looking at evidence. Populations seduced by simple explanations for genuinely complex problems. Ideologies that promise resolution of all ambiguity if you just commit hard enough.

A person who can sit with not knowing is a person who is less susceptible to demagogues, to extremism, to the psychological relief of easy answers. They are a person who can stay curious when the pressure is to close down, who can say "I'm not sure" in rooms that punish that honesty, who can hold open questions long enough for the actual answers to emerge.

This is not a small thing. In a world that is offering false certainty at every turn, the capacity to remain genuinely open is a form of resistance. And it starts with practicing it in your own mind, on the questions that are most personal to you, right now.
