How Complex Systems Fail – a very concise yet complete paper on how complex systems fail. It’s neither system- nor industry-specific. Here are just its numbered points:
1. Complex systems are intrinsically hazardous systems.
2. Complex systems are heavily and successfully defended against failure.
3. Catastrophe requires multiple failures – single-point failures are not enough… (see the sketch below)
4. Complex systems contain changing mixtures of failures latent within them.
5. Complex systems run in degraded mode.
6. Catastrophe is always just around the corner.
7. Post-accident attribution to a ‘root cause’ is fundamentally wrong.
8. Hindsight biases post-accident assessments of human performance.
9. Human operators have dual roles: as producers & as defenders against failure.
10. All practitioner actions are gambles.
11. Actions at the sharp end resolve all ambiguity.
12. Human practitioners are the adaptable element of complex systems.
13. Human expertise in complex systems is constantly changing.
14. Change introduces new forms of failure.
15. Views of ‘cause’ limit the effectiveness of defenses against future events.
16. Safety is a characteristic of systems and not of their components.
17. People continuously create safety.
18. Failure-free operations require experience with failure.
Points 8, 9, 12, 13, and 17 are usually the causes ;-)
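The defence-in-depth and multiple-failures points (2–4) are the easiest to make concrete. Here is a minimal, hypothetical Python sketch (not from the paper) that treats the defences as a few independent layers and counts how often a run sees any layer fail versus how often every layer fails at once; the layer count and per-layer failure probability are invented purely for illustration.

```python
import random

random.seed(0)  # deterministic runs for the illustration

LAYER_FAILURE_PROB = 0.05   # invented per-layer failure probability per run
N_LAYERS = 3                # invented number of overlapping defences
N_RUNS = 100_000

def layer_failures() -> list[bool]:
    """One run of the system: which defence layers failed, independently."""
    return [random.random() < LAYER_FAILURE_PROB for _ in range(N_LAYERS)]

runs = [layer_failures() for _ in range(N_RUNS)]
partial = sum(any(r) for r in runs)   # at least one layer failed: roughly 14% of runs
total = sum(all(r) for r in runs)     # every layer failed at once: roughly a dozen runs

print(f"runs with at least one failed defence: {partial}")
print(f"runs where every defence failed:       {total}")
```

Partial failures show up constantly while catastrophes stay rare, which is the everyday texture points 4–6 describe: latent failures accumulate, the system runs degraded, and disaster stays just around the corner until enough defences give way together.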