Every industry that takes safety seriously got there the same way. Not through gradual, proactive improvement. Through catastrophe. Through a failure so visible, so costly, or so lethal that the old way of operating became politically and culturally impossible. The new safety culture did not emerge from enlightenment. It emerged from wreckage.
Aviation had Tenerife in 1977, where two 747s collided on a runway and killed 583 people, triggering a complete redesign of cockpit communication norms, crew resource management training, and authority gradients. Nuclear power had Three Mile Island in 1979 and Chernobyl in 1986, triggering independent safety reviews, redundant safety systems, and an entirely new regulatory posture. Medicine had the Institute of Medicine's "To Err Is Human" report in 1999, which estimated up to 98,000 preventable deaths annually in U.S. hospitals and triggered a decade of systemic reform.
Each industry resisted change before its reckoning. Each had advocates who argued that existing practices were sufficient, that costs of reform were prohibitive, that catastrophic failure was unlikely. Each was wrong. And each, after the reckoning, built safety infrastructure that made the pre-reckoning era look like a different civilization.
Software is still in the pre-reckoning era. The question is not whether a reckoning is coming. It is when, and whether the industry will act before it arrives or after.
What a Safety Culture Actually Requires
A safety culture is not a set of policies. It is a set of structural commitments that change the default behavior of everyone in the system, from the engineer writing code to the executive making release decisions. The industries that have built it share a common architecture.
How Other Industries Built Safety Culture, and What Forced Them To
The pattern in each case is identical. Before the reckoning: safety is someone's job among many, schedule pressure routinely overrides safety concerns, individuals who raise safety issues face professional consequences, and the system is optimized for throughput rather than reliability. After the reckoning: safety becomes a structural property of the system, not a personal responsibility of individuals within it. The change is not cultural in the soft sense. It is architectural.
Where Software Stands
Software has had incidents. It has had breaches, outages, and failures with real human consequences. Patients have been harmed by software failures in medical devices. Financial systems have failed and taken customer assets with them. Critical infrastructure has been compromised. Aircraft flight management software has contributed to fatal crashes.
What software has not had is a single, visible, unambiguous failure large enough and attributable enough to force structural reform across the industry. The failures that have occurred have been absorbed (financially, reputationally, legally) without producing the systemic response that aviation's or medicine's reckoning produced. The industry has been fortunate in this. The fortune may not continue.
Why Vibe Coding Accelerates the Timeline
The emergence of AI-assisted development, what the industry has taken to calling vibe coding, changes the risk calculus in ways the industry has not yet absorbed. Not because AI is inherently unsafe, but because it dramatically increases the volume of code entering production pipelines while the verification infrastructure underneath those pipelines has not changed.
The Four Ways AI-Assisted Development Amplifies Software Safety Risk
"The aviation industry's pre-Tenerife culture had a phrase: 'captain's authority.' The captain's judgment was not to be questioned by crew. Multiple crashes were attributable, in retrospect, to crew members who saw something wrong and did not say it โ because the culture made deference to authority the safe personal choice. Software has its own version of this: 'ship it.' The culture makes deference to schedule the safe personal choice. The mechanism is different. The outcome is the same."
What Software Safety Culture Would Actually Require
The industries that built safety cultures did not do it by asking individuals to be more careful. They built infrastructure that made unsafe behavior structurally difficult, changing the system so that the safest path and the easiest path were the same path. Software safety culture, when it arrives, will require the same approach.
The Structural Properties of a Software Safety Culture
None of these properties require a regulatory mandate. They require a decision by engineering leadership to build the infrastructure before the reckoning rather than after it. Aviation did not wait for regulators to invent Crew Resource Management; the industry built it because the alternative was continued catastrophe. The software industry can make the same choice.
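One of these properties, objective release criteria, can be made concrete in code. The sketch below is hypothetical: the metric names and thresholds are illustrative assumptions, not an industry standard. The point it demonstrates is structural: when the release decision is computed from measured evidence, a schedule argument cannot silently override it.

```python
# Hypothetical sketch of an objective release gate.
# Metrics and thresholds are illustrative assumptions, not a standard.
from dataclasses import dataclass


@dataclass
class ReleaseEvidence:
    test_pass_rate: float    # fraction of tests passing, 0.0-1.0
    coverage: float          # coverage of changed code, 0.0-1.0
    open_critical_bugs: int  # known unresolved critical defects


def release_allowed(evidence: ReleaseEvidence) -> tuple[bool, list[str]]:
    """Return (allowed, blocking reasons).

    The criteria are explicit and machine-checkable, so the decision
    is a property of the evidence, not of whoever argues loudest.
    """
    blockers = []
    if evidence.test_pass_rate < 1.0:
        blockers.append(f"test pass rate {evidence.test_pass_rate:.0%} < 100%")
    if evidence.coverage < 0.80:
        blockers.append(f"changed-code coverage {evidence.coverage:.0%} < 80%")
    if evidence.open_critical_bugs > 0:
        blockers.append(f"{evidence.open_critical_bugs} open critical bug(s)")
    return (not blockers, blockers)


# Usage: a clean build passes the gate; a build with failures does not.
ok, blockers = release_allowed(
    ReleaseEvidence(test_pass_rate=1.0, coverage=0.85, open_critical_bugs=0)
)
# ok is True, blockers is []
```

In practice a gate like this would run in CI and fail the pipeline when `release_allowed` returns `False`, making the safe path and the default path the same path.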
The Bottom Line
The software industry is not uniquely reckless. It is where every high-velocity, high-consequence industry was before its safety culture moment โ optimizing for throughput, absorbing failures as individual events rather than systemic signals, and deferring the structural reform that would prevent future failures until those future failures become impossible to absorb.
The difference is that software's failures are increasingly consequential. The systems it builds are increasingly critical. The pace of AI-assisted development is increasing the volume and velocity of code entering production. And the verification infrastructure underneath all of it has not changed in proportion to any of these trends.
The reckoning may be a single visible catastrophe. It may be the cumulative weight of a thousand smaller failures. It may be regulatory action that arrives faster than the industry expects. Whatever form it takes, the organizations that navigate it best will not be the ones that respond to it after the fact; they will be the ones that built the infrastructure before it arrived.
Aviation did not need Tenerife to tell it that cockpit communication was important. It needed Tenerife to act on what it already knew. Software already knows. The question is whether it will act.
Build the Infrastructure Before the Reckoning.
MCX Services helps engineering organizations build the quality infrastructure that a genuine software safety culture requires: continuous verification, objective release criteria, and evidence-based decision making. The conversation starts with where your organization stands today.