Think and Save the World

What Happens To Innovation When Failure Is Celebrated At National Scale


The Shame Tax on Innovation

Every system that treats failure primarily as shameful pays a tax.

The tax is not always visible. It gets paid in the form of ideas that were never tried, companies that were never started, experiments that were never run, policy approaches that were never piloted because the political cost of a visible failure exceeded the potential benefit of a successful one. The tax is also paid in the form of failures that did happen but were not acknowledged as failures — that were reframed as incomplete successes, or attributed to external factors, or quietly buried so as not to generate the political or social consequences that honest accounting would produce.

This is not a metaphor. It is measurable.

Research on organizational culture and innovation consistently finds that organizations with high psychological safety — defined by Amy Edmondson at Harvard Business School as the shared belief that interpersonal risk-taking will not be punished — are substantially more innovative than organizations without it. The mechanism is direct: when people fear that admitting error or uncertainty will be used against them, they do not admit error or uncertainty. They present certainty they do not have. They pursue projects they know are likely to fail rather than kill them early, because killing them early means acknowledging the failure. They optimize for the appearance of performance rather than actual performance.

This is well-documented at the organizational level. The scaling of these dynamics to national and civilizational levels is less studied but follows the same logic.

Japan and Nokia: Two Failure Models

Japan in the 1990s offers one of the most instructive case studies in what happens when a culture with enormous innovative capacity becomes structured around the impossibility of acknowledged failure.

The Japanese economic miracle of the postwar period was genuinely extraordinary — a devastated economy rebuilt into a global industrial power within a generation, through a combination of state industrial policy, high domestic savings rates, and extraordinary organizational capacity in the manufacturing sector. The Toyota Production System, which became the template for lean manufacturing globally, was in significant part a formal system for institutionalizing the acknowledgment of error. The concept of "jidoka" — stopping the production line when a defect is detected — encoded, at the manufacturing level, the principle that identifying failure early is more valuable than maintaining the appearance of smooth operation.

That organizational principle, which made Toyota an extraordinary manufacturing operation, was not generalized into the broader culture. The Japanese financial system of the 1980s became, instead, a demonstration of what happens when the social cost of acknowledging failure is so high that institutions create elaborate structures specifically to avoid that acknowledgment.

The asset bubble of the late 1980s — in real estate and equities — was not, at the level of individual banks and corporations, unrecognized. People within these institutions understood, or had access to evidence that should have made them understand, that the asset valuations underpinning enormous amounts of lending were not sustainable. The acknowledgment of that understanding, however, would have required admitting that previous decisions had been wrong. The social and professional consequences of that admission, within a corporate culture organized around face-saving and hierarchical loyalty, were severe enough that the acknowledgment didn't happen — or happened too late, too privately, too partially.

The bubble collapsed. The Japanese economy entered what became known as the "Lost Decade" — actually closer to two lost decades — of stagnation. The recovery was slowed, among other things, by a banking system that spent years carrying what became known as "zombie loans": lending to insolvent companies that everyone knew were insolvent, in order to avoid writing down the loans and taking the accounting hit that would reveal the true state of the banks' balance sheets.

The zombie loan phenomenon is a direct expression of the shame tax. The cost of acknowledging failure — of writing down the loan, of letting the company go bankrupt, of accepting that the original lending decision was wrong — was experienced as so high that the system preferred to maintain the fiction of viability, even as the fiction consumed enormous resources and prevented the reallocation of capital to more productive uses.

Nokia, in the mid-2000s, provides a complementary European case study. Nokia was, at its peak, the world's dominant mobile phone manufacturer — not just by market share but by the sophistication of its technology and the breadth of its global reach. By 2007, Nokia engineers had prototype touchscreen smartphones. By 2008, internal research had identified the trajectory of software-driven mobile computing as the likely dominant direction of the market.

The research did not translate into organizational action at the pace and scale the evidence warranted, for a specific reason that has been documented by researchers who studied Nokia's decline: the internal culture had become one in which middle managers were afraid to deliver bad news to senior leadership, because senior leadership had a documented pattern of shooting the messenger. The result was that information about the threat posed by Apple's iPhone — information that existed within Nokia — was not surfaced in forms and at speeds that would have allowed the organization to respond.

The fear of delivering information that would be experienced as a failure — of the current strategy, of previous investment decisions — created an information blockage that functionally blinded the organization to a known threat. Nokia didn't lose to Apple because it lacked the technical capacity to compete. It lost, in significant part, because the shame structure of its organization prevented information from flowing.

The Countries That Built Failure Into the System

Some of the most instructive examples of intentional failure culture come from places that built the acknowledgment of failure into their institutional architecture deliberately, often in response to having experienced the costs of not having it.

Finland is the most frequently cited European example. The Finnish approach to education, which became widely studied after Finnish students began producing unusually strong results on PISA assessments in the early 2000s, is in significant part a story about failure culture. Finnish schools do not rank students against each other through competitive grading systems in the early years. Standardized testing is minimal until late in secondary school. The explicit pedagogical philosophy is that the purpose of error is learning, not evaluation — that a classroom in which students are afraid to give wrong answers is a classroom in which students are not actually learning how to think.

This is not just philosophy. It has structural consequences. Finnish students, on measures of willingness to attempt difficult problems and willingness to revise their approaches when initial attempts fail, perform differently from students in high-stakes testing environments. The willingness to fail, built into the educational culture, transfers into the broader innovation culture.

Finland also has one of the world's most robust corporate bankruptcy and restart frameworks. The legal infrastructure for failing — for winding down an insolvent business, clearing the debts, and starting again — is designed to minimize the permanent stigma of failure and maximize the speed with which failed entrepreneurs can attempt something new. The time from bankruptcy filing to discharge has been substantially shortened. The social conversation about entrepreneurship explicitly includes failure as an expected part of the process.

Israel's startup ecosystem — which is disproportionately productive relative to the country's size and resources — has been analyzed in detail by various researchers, and one consistent finding is the cultural attitude toward failure during military service. Israeli military culture, which is deeply integrated into the national identity given the country's security situation, has developed a relatively high tolerance for commanders acknowledging when operations did not go as planned and drawing explicit lessons from that failure. The after-action review — a structured process for honest assessment of what did and didn't work, conducted without the defensive maneuvering that tends to characterize failure analysis in more shame-prone cultures — is practiced intensively in the Israeli military, with routine debriefs after operations and training exercises alike. The practice of honest failure analysis, normalized through military service, appears to transfer into entrepreneurial culture.

South Korea presents a more complex and cautionary case. The Korean innovation system has produced genuinely world-class outcomes in specific sectors — semiconductors, electronics, shipbuilding, more recently entertainment and design. But it has done so through a model that is, at the organizational level, still significantly organized around shame avoidance. The chaebol system — large family-controlled conglomerates that dominate the Korean economy — has historically been characterized by the same kind of internal information suppression that produced Nokia's decline and Japan's zombie loan problem. The Samsung Galaxy Note 7 battery fires of 2016, which resulted in a global product recall and enormous reputational damage, were preceded by internal warning signs that the batteries were not safe. The organizational culture that suppressed those warning signs, prioritizing the appearance of being on schedule over the honest assessment of a safety problem, is recognizable as a shame structure.

Korea's innovation is real and impressive. Korea's relationship with acknowledged failure within organizations remains, on the evidence, a constraint on what that innovation could be.

The Failure of Public Policy and What It Costs

The domain where the shame tax on failure is most consequential, and most invisible, is public policy.

Political systems in most democracies punish acknowledged failure with extreme severity. A politician who says "that program I championed didn't work, so we should change course" is, in most political environments, providing ammunition to opponents who will use the acknowledgment to characterize the politician as incompetent, unreliable, or ideologically exposed. The rational political response is to not acknowledge failure — to defend the program, to argue that it simply needs more time or more resources, to find ways to attribute poor outcomes to external factors.

The consequence is that policy programs that are not working continue. They continue because the people with the most information about whether they are working — the people who designed and championed them — have the strongest incentives to not surface that information honestly. The people with incentives to surface that information honestly — the political opponents — have their information discounted as partisan.

The result is a system that is structurally bad at learning. Not because the information isn't available. Because the information cannot be honestly processed within the shame structure of adversarial political competition.

This is catastrophic at the scale of genuinely difficult social problems. The war on drugs is one of the longest-running examples in American policy history. By essentially every measure — drug use rates, incarceration rates, public health outcomes, community stability in high-enforcement areas — the dominant enforcement-focused approach has failed to produce its stated goals over fifty-plus years. The evidence for this failure has been available, in clear form, for decades. The political acknowledgment of the failure has been limited, partial, and very slow, because the political cost of acknowledging that the war on drugs has been a failure is experienced as the political cost of appearing soft on crime.

The result: the policy continues. People die. Communities are destroyed. And the political conversation about what should replace the failed approach is constrained, from the beginning, by the need to avoid appearing to concede that the failure happened at all.

Healthcare policy in the United States follows the same pattern. Agricultural policy in most wealthy countries follows the same pattern. Infrastructure maintenance policy follows the same pattern. The shame of acknowledging that a previous approach didn't work — that resources were spent on something that isn't producing what was promised — is so high that the acknowledgment either doesn't happen or happens too slowly and too partially to allow genuine course correction.

This is not a problem unique to the United States. It is a structural feature of political systems organized around adversarial competition for power in which failure is the primary ammunition used by opponents. The design of the system creates the shame structure. The shame structure prevents learning. The failure of learning produces more failure, which produces more shame, which produces more concealment.

What Would It Mean to Design for Failure?

The obvious question is: what does a civilization look like if it actually designs its institutions to make failure safe to acknowledge?

This is not entirely hypothetical. There are institutional designs that have moved meaningfully in this direction.

Aviation safety is the most-cited example. The aviation industry has, over the past sixty years, built one of the most sophisticated failure-learning systems in any industry. The core mechanism is the near-miss and incident reporting system — a structure in which pilots, air traffic controllers, and other aviation personnel are actively encouraged to report situations that could have become accidents but didn't, with strong protections against those reports being used for disciplinary action. The data from near-miss reports is analyzed systematically to identify patterns in the system that create risk, and those patterns are addressed at the system level.

The result is that commercial aviation is now extraordinarily safe — dramatically safer than it was fifty years ago, despite an enormous increase in the volume of flights. The improvement is not primarily a function of better technology, though technology has improved. It is primarily a function of better learning — of having built a system that can actually ingest information about failure and near-failure and use it to prevent future failure, rather than suppressing it to avoid accountability.

The specific design features that make this work: immunity from punishment for honest reporting; confidential or anonymous reporting channels that reduce social risk; systematic analysis by entities separate from the reporters; public communication of findings in forms that allow the entire industry to learn; and a cultural norm — not just a policy — that treats reporting failure as professional responsibility rather than personal exposure.

This design has been replicated in parts of the medical system (particularly in surgical safety), in nuclear power operation, and in some financial risk management contexts. It has not been generalized because generalization would require accepting that honest accounting of failure is more valuable than the political and social protection of the people who created the failure.

The question of whether a national government could be designed on similar principles is genuinely interesting. The closest existing models are independent audit institutions — offices of inspector general, national audit offices, independent evaluators — that are structurally protected from political pressure to report honestly on whether programs are working. These institutions exist in many countries. They are rarely given the power or the political standing to produce the kind of systematic learning that would actually change policy.

The Relationship Between Failure Culture and Democratic Resilience

There's a connection between a civilization's relationship with acknowledged failure and its capacity to maintain democratic governance over time.

Democratic systems are, in theory, learning systems. They are designed to allow the replacement of failed leadership and failed approaches through peaceful competitive processes. The feedback loop — bad governance leads to electoral defeat leads to new governance leads to correction — is the mechanism by which democracies are supposed to update.

The mechanism breaks down, in various ways, when the shame structure prevents honest accounting of what has failed and why. When voters cannot reliably distinguish between genuine failure and adversarial characterization of normal outcomes as failure, the feedback loop loses fidelity. When politicians cannot acknowledge failure without political catastrophe, the feedback loop loses honesty. When media systems reward the performance of certainty and punish the expression of uncertainty or the acknowledgment of error, the feedback loop loses information.

The result is a democracy that is increasingly unable to use its designed learning mechanism — not because the mechanism was wrong in design, but because the shame structure has colonized the information environment that the mechanism requires to function.

A democratic culture that has genuinely internalized the value of acknowledged failure — that has built this value into its educational system, its political norms, its media culture, its institutional design — is a democratic culture that is substantially more resilient than one that hasn't. Not because failure becomes less painful. Because the pain is allowed to be information rather than being driven underground.

The Individual Practice That Scales

The civilizational transformation of the relationship to failure begins, as most civilizational transformations begin, with individuals who have done that work in their own lives.

A person who has made genuine peace with their own failures — not the performance of philosophical acceptance, but the actual embodied experience of having failed, having felt the shame, having sat with it rather than converting it into something else, and having found that the sitting produced something valuable — that person becomes, in their professional and civic life, a specific kind of resource.

They become the person who can say, in the meeting, "that didn't work, here's what I think we should learn from it" — and have the credibility and the self-possession to make that acknowledgment land as information rather than as weakness. They become the manager who can tell their team that the project is being killed, not because everyone failed, but because the hypothesis it was based on turned out to be wrong, and that's what hypothesis testing is for. They become the politician who can say, "I supported that approach, it produced these outcomes, my updated view is different" — and not collapse when opponents use the acknowledgment as an attack.

These individuals are not common. They're not common because the work of making genuine peace with failure is not common — because most of the socialization that people receive pushes in the opposite direction. But they exist. And they tend to be disproportionately present in the institutions and cultures that are genuinely innovative, genuinely adaptive, genuinely capable of doing something other than repeating their own history.

The civilizational version of this is a culture that intentionally produces these people, at scale, by deciding to treat failure honestly — in schools, in political culture, in media, in the stories told about what it means to be a member of this society.

Not a culture that tells stories about people who fail heroically on their way to triumph. A culture that honors the people who fail, stay in it, figure out what they can learn, and try something different — without needing the triumph to justify the attempt.

That culture is possible. It requires that the people who build it have dealt with their own shame about failure first.

The work is personal. The stakes are civilizational.

---

Exercise: Name three things you or your organization tried in the last year that didn't work. Not things that were partly successful — things that failed. Write down what you actually learned from each one, in specific terms. Not the lesson you'd put in a LinkedIn post. The actual specific thing you now know that you didn't know before.

Then ask: is there any person, institution, or structure in your life that would make it safe to share that learning publicly? If yes, share it. If no, ask what it would take to build that structure — and whether you're willing to do that work, knowing that the first step is probably acknowledging a failure you haven't acknowledged yet.

The innovation that comes out of the other side of that is not the inspirational poster version. It is more specific, more honest, more grounded in what actually happened — and therefore more likely to produce something that actually works.
