The Coronavirus & Big Disaster Avoidance

Many dedicated and talented people are working furiously to contain and cure the Coronavirus, but are we missing something when it comes to disasters such as these: the Flint water crisis, the Boeing 737 Max, the Fukushima nuclear reactor, the California PG&E fires, and other disasters with big impacts but less press attention? For many of these, a word or two is enough to evoke public anger, and what many have in common is that What Went Wrong might have been identified and rectified long before disaster struck.

Pie in the sky, impractical thinking? Not if companies and governmental agencies required a What Could Go Wrong review that examines programs and activities affecting public health and safety with a critical eye, drawing on the expertise of independent experts whose only mission is to identify vulnerabilities. The goal is not to hinder progress or innovation but to make sure big risks are identified as early as possible.

Many will insist we are already doing this, but my experience and deep-dive looks at these disasters reveal that no-holds-barred What Could Go Wrong assessments are rarely done. Studying these big disasters, what surprised me is that they didn't occur because technical mistakes were made, work was shoddy, or contractors were incompetent, and none of them fell into the category of virtually unpredictable "asteroid strikes". These programs and activities went forward based on assumptions that turned out to be flawed, or without considering a broad enough range of risks.

Why? The prevailing culture rewards short-term results, conventional thinking, and political considerations (corporate and public), and relies too heavily on legal cautions rather than a disciplined process to ferret out faulty assumptions and blind spots. Don't companies and governmental agencies have quality programs and checklists to make sure things are done as they ought to be? Quality programs identify what should be done, whereas a What Could Go Wrong review asks what we are missing. This means people with deep knowledge of the subject matter and people with big-picture perspectives are brought in for no-holds-barred risk identification, even risk imagination. I know Michiganders who could have prevented the Flint water disaster, and all the misery that ensued, in a one-day What Could Go Wrong meeting.

Lastly, consider an awful death on a white-sand beach at a deluxe resort, where "trap and release" was the checklist measure to control alligator access. What Could Go Wrong? Without an unbreachable (meaning environmentally "unfriendly") barrier to keep these predators out of resort waterways, the possibility that alligators would migrate into these food-rich waters and that some would avoid capture was likely, if not inevitable. An expert reptile scientist, prompted to imagine what we are missing and with no incentive to tell the company what it wanted to hear, could have identified this risk in short order.

A What Could Go Wrong review, properly planned and facilitated, can be done quickly and inexpensively. Disasters can't all be eliminated, but many can be prevented or mitigated by having the right people ask What Could Go Wrong.

Thomas M. Doran has managed hundreds of projects for companies, communities, and states over 40 years. He is a Fellow of The Engineering Society of Detroit, and was president of an engineering company and an adjunct engineering professor at Lawrence Technological University.