Why You Should Argue Positions You Disagree With Regularly
The straw man fallacy is well-known. Less discussed is its inverse: the obligation to argue positions you oppose at their strongest. This second practice — steel-manning — is intellectually harder, less socially rewarded, and far more valuable.
The ideological Turing test
Bryan Caplan (economist, George Mason) proposed the ideological Turing test in 2011. The original Turing test asks whether a machine can produce outputs indistinguishable from a human. The ideological version asks: can a person with a given political or moral view produce outputs indistinguishable from someone with the opposing view?
Most people cannot pass this test for their own views. Caplan ran informal experiments asking people to characterize liberal and conservative positions. The attempts were routinely recognizable as hostile characterizations: dismissive, blind to the genuine motivations of the other side, omitting the strongest arguments.
This is diagnostic. If you can't pass the ideological Turing test, it means you don't understand the opposing position well enough to be confident that your position is better. You may be right. But you don't know it — you just haven't heard the actual case.
The test has high standards. The criterion isn't "does your characterization sound plausible to a neutral observer?" It's "would a sophisticated person who holds this view recognize it as an accurate and fair characterization?" These are different standards, and the second is much harder to meet.
The historical practice
The practice of arguing the opposite position has deep roots. In academic philosophy, a common training exercise is to write a steel-man defense of a position you personally oppose. Law schools use moot court and appellate advocacy to force students to argue assigned positions regardless of personal belief. Jesuit education, particularly in the Ratio Studiorum, emphasized disputation — formal argument of assigned positions, not chosen ones — as a method of developing rigorous thinking.
John Stuart Mill's argument in On Liberty (1859) is the philosophical foundation: "He who knows only his own side of the case knows little of that." Mill argues that even true positions need to be contested regularly, because without genuine opposition they become dead dogma — believed on authority rather than understanding. Live knowledge requires the intellectual friction of having been genuinely tested.
The adversarial collaboration approach — practiced by some academic researchers — takes this seriously: two researchers who disagree design and run a study together, agreeing in advance to accept the result. The collaborative design process forces each party to genuinely engage with the other's objections. The results tend to be more credible and more surprising than solo research.
Motivated reasoning and why you need this
The main obstacle to genuinely engaging with opposing views is motivated reasoning, the well-documented tendency to evaluate information in ways that confirm existing beliefs. Kunda (1990) established the basic finding: people reason toward desired conclusions using the same cognitive processes as ordinary reasoning; they simply apply different scrutiny to confirming and disconfirming evidence.
The mechanism runs roughly as follows: you encounter information that opposes your view; you immediately begin generating objections to it; you find objections more or less readily; and you update based on the objections you found rather than the information you encountered. The asymmetry in scrutiny — disconfirming information gets examined hard while confirming information gets a pass — produces the illusion of reasoning while actually delivering motivated conclusions.
Steel-manning disrupts this by forcing you to stop generating objections and start generating the strongest possible case. It temporarily flips the motivated reasoning direction: now you're motivated to find the best arguments on the other side. This doesn't make you objective — nothing makes you objective — but it creates a countervailing force against the usual bias.
What the practice actually produces
If done consistently over time, arguing positions you disagree with produces several cognitive changes:
Better calibration about your own certainty. When you've genuinely engaged with the strongest case against your view, you tend to hold your own position at a more accurate confidence level. Not necessarily lower — sometimes engaging with the opposition makes you more confident, because you discover their best arguments are answerable. But the confidence is earned rather than assumed.
Access to the actual premises of disagreement. Most disagreements that look like factual disagreements are actually value disagreements (or are factual disagreements resting on value premises). Steel-manning an opposing view tends to reveal which layer the disagreement actually lives at. Is this person wrong about the facts? About how to weigh competing values? About what counts as evidence? Locating the actual source of disagreement is necessary for genuine engagement with it.
Reduced false consensus. People systematically overestimate the degree to which others share their views (the false consensus effect, Ross et al. 1977). Seriously engaging with the opposing view breaks this down. You start to understand how a thoughtful person ends up in a different place — not because they're stupid or evil, but because they're weighing things differently, or starting from different priors, or have had different experiences that made certain arguments more salient.
Better arguments for your own position. This is the instrumental case for steel-manning, and it's real. Having mapped the opposing position thoroughly, you know exactly where the pressure points are and which objections you need to address. Your arguments become sharper because they've been tested against the actual opposition, not the caricature.
The devil's advocate mechanism in groups
Individuals can practice steel-manning in private reasoning. Groups need institutional mechanisms.
Alfred Sloan, running General Motors in the mid-twentieth century, is said to have concluded any meeting where everyone agreed by saying "then let's table this and meet again when we have some disagreement." The reasoning: agreement in a meeting doesn't mean everyone genuinely agrees — it means no one spoke up. Real agreement needs to survive genuine challenge.
Irving Janis's research on groupthink (1972) documented how cohesive groups suppress dissent and thereby make systematically worse decisions. His classic cases — the Bay of Pigs invasion, the failure to anticipate Pearl Harbor, the Vietnam escalation — all share a pattern of suppressed doubt and premature consensus. Janis recommended institutionalizing devil's advocacy: assign someone to argue against the emerging consensus, with explicit protection from social penalties for doing so.
The Red Team approach (used in military planning, intelligence analysis, and some corporate strategy) is a formalized version. A Red Team is explicitly tasked with defeating the group's preferred plan — finding weaknesses, generating opposing scenarios, arguing the enemy's (or competitor's) position. Good Red Teams are not just critical; they generate genuinely strong alternative framings.
The failure mode of devil's advocacy is performativity: going through the motions of challenging a decision without generating the strongest possible challenge. Groups often learn to tolerate a ritual devil's advocate while ignoring everything that person says. The standard to hold is: did the devil's advocate produce arguments strong enough that someone has to actually answer them rather than merely dismiss them?
Limits of the practice
Steel-manning is not the same as moral relativism or epistemic cowardice. The practice of arguing opposing positions well is in service of finding out what's actually true — it's not a commitment to treating all positions as equally valid.
Some positions don't have strong steel-man versions. Young earth creationism does not have a steel-man case that stands up to geological evidence. Holocaust denial does not have a steel-man version that survives historical scrutiny. The obligation is to engage with the strongest form of a view — not to invent strength where none exists.
The relevant distinction: positions held by thoughtful, well-informed people who've engaged seriously with the evidence deserve steel-man treatment. Positions maintained only through information suppression or motivated reasoning do not require equivalent engagement — they require identifying the psychological mechanism, not the intellectual case.
But it's striking how often people apply the "that's not worth engaging with" dismissal to positions that are, in fact, held by thoughtful and well-informed people. The dismissal is often itself a form of motivated reasoning.
The practice, concretely
Every week or so: pick a position you hold strongly and write the best possible case against it. Not the easy case — the hard one. Find the primary sources. Read the strongest advocates. Write a version that would be recognizable to someone who holds that position. Then evaluate your original position in light of it.
For group decisions: assign devil's advocate roles explicitly, rotate them, and protect them from social penalty. Give the devil's advocate the last word in any discussion before a decision. Require the group to explicitly answer the objections raised, not just note that objections were raised.
Over years: you develop a reputation as someone who understands all sides of issues they care about, which makes you more persuasive, more trusted, and — most importantly — more likely to actually be right.
The goal isn't to become a blank slate. It's to be someone whose positions have survived genuine pressure. Most people's views haven't been tested that way. Yours can be.