“We can become so accustomed to problems that we stop seeing them as problems.” — Dan Heath
In 1986, the Space Shuttle Challenger broke apart 73 seconds after launch, killing all seven crew members. The investigation that followed uncovered a disturbing finding: NASA engineers had known for years before the accident that the O-ring seals on the solid rocket boosters were a problem. They had seen evidence that the seals performed poorly in cold temperatures. They had worried. They had raised concerns.
But they had also watched launch after launch succeed despite the known O-ring issue. Each success provided a data point: maybe it’s not so bad. Maybe the risk is acceptable. Over time, what had started as a genuine safety concern became normalized, absorbed into the system’s accepted risk profile.
The sociologist Diane Vaughan coined the term “normalization of deviance” for this phenomenon. It describes how organizations — and people — gradually come to accept behaviors or conditions that deviate from safety standards, simply because those deviations have not yet caused catastrophic failure.
Normalization of deviance is not limited to aerospace engineering. It happens in healthcare (nurses normalizing patient care shortcuts), in finance (traders normalizing increasingly risky positions), in education (teachers normalizing student disengagement), and in everyday life (couples normalizing communication patterns that are slowly damaging their relationship).
The pattern is always the same: a problem exists, is recognized, causes concern — and then, because catastrophe doesn’t immediately follow, is gradually accepted as the way things are. The problem becomes background noise, and the background noise becomes the new silence.
Heath introduces a vivid metaphor for a related phenomenon: the smoke detector effect. We only notice the smoke detector when it beeps. When conditions are stable, with no smoke and no alarm, we don’t think about the detector at all. We don’t think about the potential fire it exists to warn us about.
Our attention is drawn to signals, not to the absence of signals. The absence of an emergency is not a message we process as meaningful. This is why upstream problems — problems that are building slowly, that haven’t yet triggered the alarm — are so easy to miss.
When a crime prevention program works perfectly, what do you observe? Nothing. No crime was committed. There is no visible evidence of success — only the absence of a problem that might have existed.
Compare this to the downstream scenario: a crime is committed, police are called, an investigation opens, an arrest is made. The downstream work is vivid, visible, and trackable. The upstream success is invisible, uncredited, and often unfunded.
This asymmetry of visibility creates a systematic bias against upstream investment. We can see what we’re spending money on downstream. We cannot easily see what we’re preventing upstream — so we undervalue it.
The most dangerous form of normalization is when a problem becomes so familiar that it ceases to register as a problem at all. This is not denial — it is something more subtle and more pervasive.
Think of the school with a 40% dropout rate. For the teachers who have worked there for twenty years, that number is not an emergency. It is the school’s baseline. It is the way things are. The emergency feeling has faded, leaving behind resignation and adaptation.
Heath offers strategies for breaking normalization, and they share a single goal: to restore the original feeling of alarm, to see the accumulated problem freshly, as someone encountering it for the first time would.
What problem in your organization or life has been normalized — accepted as inevitable when it is actually preventable? What would a newcomer notice that you’ve stopped seeing?