The “human error” myth
For the most part, modern food production facilities are very safe. Managers use several layers of precautions to keep them that way. Training, rules, laws, checks, safety devices, organizational norms and formal processes are layered over one another to make sure that no single failure can lead to a serious injury. The result is that serious injuries are very rare.
There’s a sneaky downside to this. With injuries happening so infrequently, people get complacent: they forget the risks inherent in their jobs and start taking shortcuts. They gradually whittle away safety margins, and in the absence of proper checks and controls, that narrowing goes unnoticed until an accident happens.
Managers’ instinct is then to tell themselves that, with so many layers of precautions at work, the best explanation for the accident must be a freak occurrence of human error. They conclude, “Bob got injured because he put his hand in the grinder in spite of all the precautions in place to prevent him from doing that.” If that’s a factual assessment of what happened, it’s hard to argue with. But it can’t be the end of your investigation.
You have to ask, “Why did Bob do that?” Possible contributing factors include:
- The grinder had jammed, and Bob is accountable for the throughput of his station.
- Bob remembers that the last person who stopped the production line got a humiliating dressing down in front of his coworkers.
- Bob’s boss sees Bob put his hand in the grinder every day but has never said anything about it, giving his tacit approval to Bob’s habitual bending of the rules to get the job done.
- The plant is so noisy and the lighting so poor that Bob did not realize the grinder was still spinning down when he put his hand in it. The spring-loaded guard grate didn’t protect him; someone had intentionally disabled it ages ago because it was inconvenient.
- Some of Bob’s co-workers had been laid off recently. For fear of being next, Bob was trying to keep a low profile and unjam the grinder himself.
Any of these is a plausible real-life scenario.
Recognizing system failure
What the safest food companies recognize in a situation like this is that Bob’s injury isn’t a random case of user error. It’s a result of a failure of the organization’s systems to protect him. And that failure happened because the organization’s culture didn’t properly prioritize safety.
Culture, not user error, is the ultimate cause of the most serious workplace injuries. Safety experts in some of the riskiest fields — including aviation, medicine and nuclear power — are coming around to the realization that, when an industry is generally ultra-safe, disasters can only happen when the system fails.
Error management expert James Reason originated the ‘Swiss cheese model’ to help explain this phenomenon. The idea is that safety measures are like individual slices of Swiss cheese. Each one is imperfect and penetrable, but several layers stacked on top of one another form a solid barrier. If too many holes open up in the layers, or some layers are removed, the holes can line up and allow an accident to pass through.
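The arithmetic behind the model can be sketched in a few lines: if each layer independently misses a hazard some fraction of the time, an accident only happens when every layer misses at once. The layer counts and failure rates below are made-up illustrations, not real safety statistics:

```python
# Swiss cheese model, back-of-the-envelope: each safety layer
# independently fails to stop a hazard with some probability.
# An accident occurs only when every layer fails at once --
# i.e. when all the holes line up.

def accident_probability(layer_failure_rates):
    """Probability that a hazard passes through every layer."""
    p = 1.0
    for rate in layer_failure_rates:
        p *= rate
    return p

# Five layers, each missing a hazard 10% of the time (illustrative):
# roughly 1 hazard in 100,000 slips through all of them.
healthy = [0.10] * 5
print(accident_probability(healthy))

# Erosion: one layer (say, a disabled guard grate) removed entirely
# (failure rate 1.0), and shortcuts have doubled the failure rate of
# two others. The same hazard now gets through about 40x as often.
eroded = [0.20, 0.20, 0.10, 0.10, 1.0]
print(accident_probability(eroded))
```

The point of the sketch is that no single change looks catastrophic on its own, yet together a handful of small erosions multiply into a dramatically thinner barrier.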
So in the case above, Bob’s accident wasn’t a freak occurrence at all. It was inevitable that someone would be hurt eventually because the safety culture was broken. The multiple layers of protection broke down and left Bob with too narrow a safety margin. The holes lined up and he got hurt.
Prevent system failure ahead of time
When your safety culture is broken, a serious accident is usually the first sign. Don’t wait until that happens. Take steps now to make sure everyone in your organization knows:
- how to keep themselves and each other safe;
- that the company prioritizes safety above all else;
- that nobody at your company ever gets in trouble for reporting unsafe conditions or refusing unsafe work.
Remember that it’s culture that keeps people safe, not rules and regulations. Broken culture is what leads to the erosion of safety margins until a serious accident becomes statistically probable. Be like the best organizations in the riskiest industries and make sure you create and foster the kind of culture where Bob would not have been hurt.