Book Review: To Err Is Human


July 1, 2004  By Pulp & Paper Canada


Recurring but potentially avoidable human errors cause serious accidents and disasters. According to R. B. (Barry) Whittingham, author of The Blame Machine: Why Human Error Causes Accidents, these errors are preventable because they result from defective systems within a company.

Whittingham, who worked as an engineer in the chemical, nuclear, oil and gas, railway and aviation industries, coined the term ‘the blame culture.’ It refers to the tendency to focus on individual human error while ignoring the system failures that cause it. Whittingham explains in his book how this blame culture damages an organization: it hampers proper investigation and taints the enquiry process by sidetracking the real issues, so everybody loses. Whittingham, who went on to build a career as a safety consultant specializing in the human-factors aspects of accident causation, proposes an ‘open culture’ instead.

In his book, Whittingham stresses that for an action to count as a human error, it must be accompanied by an intention to achieve a desired result or outcome. This excludes spontaneous and involuntary actions, which involve no prior conscious thought or intent, as well as purely random errors. Instead of putting the blame on people, Whittingham suggests concentrating on the systems first, before scrutinizing the men and women who run them.


Using real incidents to illustrate common causes of human error and the typical system deficiencies that led to the problems, Whittingham shows readers where the errors occurred so they can recognize similar conditions and, better yet, prevent errors before they happen.

The book also argues for understanding human error and its systemic causes. This is a more difficult and daunting approach than attributing blame to individuals, but a far more satisfactory one. The book explains how to do it and provides several case studies that illustrate the method.

One such case study is the disastrous Chernobyl accident in the former USSR (now Ukraine), arguably one of the world’s worst industrial accidents and certainly the worst nuclear disaster. Its effects are still felt and seen in the radioactive contamination of land and the incidence of cancers among neighboring populations. The accident resulted from a test carried out on April 26, 1986, in Unit 4 of the power station, conducted in direct violation of standing procedures.

Whittingham writes, “the root of the accident was a design deficiency of the reactor which allowed a test procedure to take reactor conditions beyond the safe operating envelope. Although the test procedure was planned in advance, it involved violation of a number of normal operating rules specifically designed to protect the reactor against such an event. These violations were therefore the direct cause of the accident. There were a number of opportunities during the procedure where the reactor could and should have been shut down safely had the existing safety rules been observed. However, the completion of the test procedure took precedence over the normal rules which were comprehensively ignored.”

Instead of putting the blame solely on the operators, Whittingham supports his analysis with facts about the reactor’s design faults as well as the violations that occurred.

He writes that the accident was caused primarily by a combination of design deficiencies and rule violations.

