HRO3 A Fire Aboard Ship Can Ruin Your Day ...

There are few facts available about the fire aboard the USS BONHOMME RICHARD (LHD 6), but plenty of assumptions and opinions. This post provides an HRO introduction to the fire. It starts with the basic facts of Navy ship fire safety, continues with some observations on the causes of serious shipboard fires, and concludes with the need to treat shipboard fire safety as a system.

HRO2 Introduction to HRO

This post provides an overview of High Reliability Organizing (HRO) and the research on it. I distinguish high reliability organizing, the principles and practices that yield superb performance, from high reliability organizations, the collections of people working to create high reliability. I focus on the principles and practices of organizing to achieve high reliability (High Reliability Organizing): what the people in these organizations DO and WHY, rather than on the organizations themselves. This blog is about the organizational design practices of active management to reduce failure and increase the reliability of important outcomes.

HRO1 My HRO Blog

This post is an introduction to my writings on High Reliability Organizing (HRO). I have a different perspective than others because I am both an organizational scholar and an HRO practitioner with decades of experience. The blog is a way for me to explore those different ideas and share them with others.

Summary: This is part three in my series of posts on human error. This post is again based heavily on Chapter 3 of “To Err is Human” by the Institute of Medicine. This post reviews some ideas on the nature of safety, observes that some types of complex systems are more prone to error/failure than others, and introduces the term “naturalistic decision making.”
This is part two in what I believe will be a five-part series of posts on human error. This post is based heavily on Chapter 3 of “To Err is Human” by the Institute of Medicine. The book reviews the current understanding of why medical mistakes happen, and its approach is applicable to other high-hazard industries as well. A key theme is that legitimate liability and accountability concerns discourage reporting and deeper analysis of errors, which raises the question, "How can we learn from our mistakes?" This post covers why errors happen and distinguishes between active and latent errors.
To paraphrase Mark Twain, it is not what people know about accident investigations, causality, and corrective actions that gets them into trouble (or leads to weak corrective action and thus the same problems over and over again), it is what they know that just is not so. This post is based on Sidney Dekker’s “The Field Guide to Understanding Human Error.” Only read further if you dare to have your worldview of critiques and corrective action challenged. You may conclude that everything you think you know about human error investigations (also known as critiques and fact findings) is wrong or in need of serious revision. Part 1 focuses on some basics of human error that more people should know, and part 2 will recommend things you can do to get better at managing human error (strange as that may sound). This post is a little longer than normal, so make sure you have some extra time to read it.