Preparing for the unexpected – A case for inherently safer design




Process Safety


When Benjamin Franklin said "an ounce of prevention is worth a pound of cure," he was actually giving firefighting advice, an early example of process safety and loss prevention. Franklin's point was that it is much better to avoid problems in the first place than to fix them after they have occurred. Franklin was a pioneer in many areas, including fire protection. In 1736 he organized the Union Fire Company, the first in the city of Philadelphia.[1] In 1752, he was involved in the founding of the Philadelphia Contribution for Insurance Against Loss by Fire, an early mutual insurance company which continues in business today as The Philadelphia Contributionship.[2]

As I write this in early May, Franklin's words come to mind frequently as I listen to the news. My column in the March 2011 issue of this journal, which discussed the reasons why organizations fail to adequately prepare for disaster, seems particularly relevant, given recent events:

- The earthquake and tsunami in Japan
- The resulting Fukushima nuclear power plant accident
- Severe tornadoes throughout the American South
- Severe floods along the Mississippi River

We also have recent examples of human and mechanical failures which could fall into the "whoever would have thought that could happen?" category:

- Air traffic controllers sleeping on the job, in some cases resulting in pilots having to land airplanes without assistance from the control tower
- Fatigue failures in the fuselage of commercial airliners

When Ben Franklin gave his firefighting advice, he was an early advocate of inherently safer design, long before Trevor Kletz coined the term in the late 1970s.

[1] http://www.ushistory.org/franklin/info/index.htm, accessed May 5, 2011.
[2] http://www.contributionship.com/history/index.html, accessed May 5, 2011.

© Division of Chemical Health and Safety of the American Chemical Society. Elsevier Inc. All rights reserved.

In Franklin's day the process industries were in their infancy, and he was more concerned about general city fire hazards. But the point remains valid: we should always question the existence of hazards in our installations and challenge ourselves to eliminate them whenever possible. We should also be more open to the possible occurrence of things that are "highly unlikely" or "can't happen here."

I once took a short course on statistical design of experiments, and the instructor mentioned that one good rule in interpreting statistical data is that "rare events happen, but not to me." That is probably a good general rule for interpreting experiments. If I flip a coin 20 times and get 20 heads, the better interpretation is probably that the coin is not fair, rather than that I have run the one experiment in 2^20 which gives that result by chance. But when thinking about rare events in process safety, the consequences may be so catastrophic or costly that we must prepare for them even though they are extremely unlikely. In many cases, they may not actually be as unlikely as we think anyway.

Our performance expectations for process safety are extremely high, with published risk criteria for catastrophic events from governments and others often being in the range of one chance in hundreds of thousands or even millions of years. We may not be able to effectively estimate the likelihood of these kinds of events, and we certainly cannot do it qualitatively from personal experience. Sometimes these risks must be managed, and we must use the best risk analysis and management tools that we have, but it is always preferable to eliminate the hazard whenever this is feasible.

I sometimes wonder if we have begun to have too much faith in our technological risk management systems. Are we more willing than we were in the past to trust these systems rather than avoid hazards?
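The coin-flip arithmetic above is easy to check for yourself; a minimal sketch (not from the original column) confirming that 20 heads in a row from a fair coin is a roughly one-in-a-million outcome:

```python
# Probability of 20 heads in 20 independent flips of a fair coin: (1/2)**20
p_all_heads = 0.5 ** 20

# Expressed as "one chance in N" trials
one_in = round(1 / p_all_heads)

print(p_all_heads)  # 9.5367431640625e-07
print(one_in)       # 1048576 -- about one chance in a million
```

This is the scale at which the column's point bites: seeing such an outcome once is far better evidence of a biased coin than of having drawn the 1-in-2^20 run by chance.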
In June 1972, my home town of Harrisburg, Pennsylvania was struck by severe flooding from Hurricane Agnes. The new governor's mansion, built a few years before Agnes, was flooded, and the governor of Pennsylvania had to be evacuated by boat. Interestingly, most of nearby downtown Harrisburg, on high ground beside the Susquehanna River, was not flooded, although it was somewhat isolated by high water. When John Harris selected the site for his town in 1785, he recognized that rivers do flood occasionally, and selected a place which was unlikely to flood. He knew about inherent safety and hazard avoidance, but today the city has expanded onto ground which is prone to flooding. This is certainly not the only city where this has occurred; much of the old part of New Orleans was not flooded by Hurricane Katrina. Often today we rely instead on technology such as dikes, levees, flood control dams, and other systems. Sometimes these fail, or they are overwhelmed by an event of greater magnitude than was considered "credible" by the designers.

The message for the process industries: remember what Ben Franklin had to say, and emphasize hazard elimination when we design processes and plants. This may not always be possible, but I guarantee that you will never eliminate a hazard if you do not challenge yourself to come up with a design which does eliminate it. Don't give up before you try, and make sure that you try hard!

1871-5532/$36.00 doi:10.1016/j.jchas.2011.07.004

Journal of Chemical Health & Safety, September/October 2011
