Invisible Crack

Microscopic defects propagate silently until catastrophic failure.

Structures

In this way a tiny unseen crack may start from any hole or notch or irregularity in a stressed metal and may spread across the material, which is not, as a whole, changed in any obvious way. Sooner or later, such a ‘fatigue crack’ will reach the critical length for an ordinary common or garden crack. When this happens, the crack will immediately speed up and run right across the material, often with very serious consequences.
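The dynamic Gordon describes — slow fatigue growth, then sudden fast fracture once a critical length is reached — is conventionally modeled with the Paris law for crack growth and a fracture-toughness criterion for instability. The sketch below uses those standard relations with illustrative material numbers (none of them from the book) to show how almost all of a crack's life is spent imperceptibly small:

```python
import math

# Illustrative values for a steel plate under cyclic load (NOT from the book):
K_IC = 50.0        # fracture toughness, MPa*sqrt(m)
d_sigma = 100.0    # cyclic stress range, MPa
C, m = 1e-11, 3.0  # Paris-law constants, typical order of magnitude for steel

# Fast fracture occurs when the stress intensity K = sigma*sqrt(pi*a)
# reaches the fracture toughness, giving a critical crack length:
a_crit = (K_IC / d_sigma) ** 2 / math.pi

a = 0.001  # initial flaw: a 1 mm notch or hole-edge defect
cycles = 0
while a < a_crit:
    dK = d_sigma * math.sqrt(math.pi * a)  # stress-intensity range this cycle
    a += C * dK ** m                       # Paris law: da/dN = C * (dK)^m
    cycles += 1

print(f"critical length ~{a_crit * 1000:.0f} mm after ~{cycles:,} cycles")
```

With these numbers the crack needs on the order of a million load cycles to creep from 1 mm to its critical length of roughly 80 mm, and the growth per cycle accelerates as the crack lengthens — which is why the final run to failure appears to come from nowhere.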
To Engineer Is Human

“Brittle fracture,” in which a large crack runs spontaneously through a structure near the speed of sound, severing steel with a report that signals the breakup of ships, the bursting of pressure vessels, or the collapse of bridges, has been a chronic problem for centuries. There is almost always a period of “gestation”—the slow lengthening and sharpening of cracks through the process of fatigue that precedes the catastrophe.
What Technology Wants

In each case the minor error triggers, or combines with, other unforeseen consequences in the system, also minor. But because of the tight interdependence of parts, minor glitches in the right improbable sequence cascade until the trouble becomes an unstoppable wave and reaches catastrophic proportions. Sociologist Charles Perrow calls these “normal accidents” because they “naturally” emerge from the dynamics of large systems. The system is to blame, not the operators. Perrow did an exhaustive minute-by-minute study of 50 large-scale technological accidents (such as Three Mile Island, the Bhopal disaster, Apollo 13, Exxon Valdez, Y2K, etc.) and concluded, “We have produced designs so complicated that we cannot anticipate all the possible interactions of the inevitable failures; we add safety devices that are deceived or avoided or defeated by hidden paths in the systems.”
The Perfectionists

Yet, for reasons that have much to do with what is euphemistically called the “culture” of that particular facility within Rolls-Royce’s immense engineering establishment, the stub pipe passed all its inspections. A potentially weakened engine component made its way all along the supply chain until it was placed into the engine, and there to await its inevitable breakage—and the equally inevitable destruction of the entire engine. It should have failed inspection, but it didn’t. It just failed in real life.
The Origin of Wealth

Complexity catastrophes help explain why bureaucracy seems to grow with the tenacity of weeds. Many companies go through bureaucracy-clearing exercises only to find it has sprung back a few years later. No one ever sits down to deliberately design a bureaucratic muddle. Instead, bureaucracy springs up as people just try to optimize their local patch of the network: finance is just trying to ensure that the numbers add up, legal wants to keep us out of jail, and marketing is trying to promote the brand. The problem isn’t dumb people or evil intentions. Rather, network growth creates interdependencies, interdependencies create conflicting constraints, and conflicting constraints create slow decision making and, ultimately, bureaucratic gridlock. ... How could one of the world’s greatest corporations, a company with billions of dollars in assets, with hundreds of thousands of talented employees around the world, and with Nobel Prize–winning technical research, lose out to a teenager with pocket money from his stamp collection? We can speculate that as Dell began eating into IBM’s PC market share in the early 1990s, some smart person in IBM must have said, “Customers seem to like buying computers through the mail and Dell is growing fast—why don’t we sell computers through the mail?” Selling computers through the mail was certainly not beyond the capabilities of IBM; it could buy boxes and bubble wrap and put things in the mail as well as anyone else. So why did the company wait until several years after Dell had passed it in market share to begin selling computers directly to its customers? The reason is that IBM fell prey to a complexity catastrophe.
Collapse

Politicians use the term “creeping normalcy” to refer to such slow trends concealed within noisy fluctuations. If the economy, schools, traffic congestion, or anything else is deteriorating only slowly, it’s difficult to recognize that each successive year is on the average slightly worse than the year before, so one’s baseline standard for what constitutes “normalcy” shifts gradually and imperceptibly. It may take a few decades of a long sequence of such slight year-to-year changes before people realize, with a jolt, that conditions used to be much better several decades ago, and that what is accepted as normalcy has crept downwards.
Thinking in Systems

Some systems not only resist policy and stay in a normal bad state, they keep getting worse. One name for this archetype is “drift to low performance.” Examples include falling market share in a business, eroding quality of service at a hospital, continuously dirtier rivers or air, increased fat in spite of periodic diets, the state of America’s public schools—or my onetime jogging program, which somehow just faded away. ... The balancing feedback loop that should keep the system state at an acceptable level is overwhelmed by a reinforcing feedback loop heading downhill. The lower the perceived system state, the lower the desired state. The lower the desired state, the less discrepancy, and the less corrective action is taken. The less corrective action, the lower the system state. If this loop is allowed to run unchecked, it can lead to a continuous degradation in the system’s performance. Another name for this system trap is “eroding goals.” It is also called the “boiled frog syndrome,” from the old story (I don’t know whether it is true) that a frog put suddenly in hot water will jump right out, but if it is put into cold water that is gradually heated up, the frog will stay there happily until it boils.
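The two loops Meadows contrasts — a balancing loop anchored to an absolute goal versus one whose goal erodes toward the perceived state — can be sketched as a toy simulation. All coefficients below are illustrative assumptions, not figures from the book:

```python
# "Eroding goals" toy model: a system state under constant downward pressure,
# with a corrective (balancing) loop that pushes it toward a goal.
def run(goal_erodes: bool, steps: int = 50) -> float:
    state, goal = 100.0, 100.0
    for _ in range(steps):
        if goal_erodes:
            # Reinforcing loop: the desired state drifts toward the
            # perceived state, shrinking the gap that drives correction.
            goal += 0.2 * (state - goal)
        correction = 0.1 * (goal - state)  # balancing loop: close the gap
        state += correction - 2.0          # constant decay / downward pressure
    return state

print(run(goal_erodes=False))  # absolute goal: settles at a steady level
print(run(goal_erodes=True))   # eroding goal: keeps drifting downward
```

With a fixed goal the state stabilizes (here near 80, where correction balances decay); with an eroding goal the gap stays small, correction stays weak, and the state declines steadily — the boiled-frog trajectory in miniature.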
Behave

The first is the persuasive power of the incremental. ... We rarely have a rational explanation for an intuitive sense that a line has been crossed on a continuum. What incrementalism does is put the potential resister on the defensive, making the savagery seem like an issue of rationality rather than of morality. This represents an ironic inversion of our tendency to think in categories, to irrationally inflate the importance of an arbitrary boundary. The descent into savagery can be so incremental as to come with nothing but arbitrary boundaries, and our descent becomes like the proverbial frog cooked alive without noticing.
The Elephant in the Brain

When a group’s fundamental tenets are at stake, those who demonstrate the most steadfast commitment—who continue to chant the loudest or clench their eyes the tightest in the face of conflicting evidence—earn the most trust from their fellow group members. The employee who drinks the company Kool-Aid, however epistemically noxious, will tend to win favor from colleagues, especially in management, and move faster up the chain. In fact, we often measure loyalty in our relationships by the degree to which a belief is irrational or unwarranted by the evidence.
The True Believer

The surprising thing is that this pathological mistrust within the ranks leads not to dissension but to strict conformity. Knowing themselves continually watched, the faithful strive to escape suspicion by adhering zealously to prescribed behavior and opinion. Strict orthodoxy is as much the result of mutual suspicion as of ardent faith.