Working Toward a Safer Workplace
A longtime airline captain and chief safety officer at American Airlines, Scott Griffith gained national prominence in 1994 after he created what would become a major airline safety initiative.
Under the Aviation Safety Action Program (ASAP), pilots and other airline staff are encouraged to voluntarily self-report errors and safety lapses, which are then addressed collaboratively to make improvements and prevent accidents. ASAP was later embraced by the Federal Aviation Administration, unions and the airline industry, and it remains one of the cornerstones of U.S. airline safety.
In 2000, Griffith was asked to join a think tank and consulting firm that had developed a strategy for reducing errors by better managing how people behave within the systems they work in. The company, based in Dallas, is now known as Outcome Engenuity.
Griffith spoke with Best Practices about his company’s approach to reducing risk and what EMS can learn about safety from the airline industry’s experience. The following is an excerpt; the interview can be found in its entirety at emergencybestpractices.com.
What was the main obstacle to improving airline safety when you created ASAP?
For many years in aviation, there was a disconnect between what was really taking place in the cockpit—the mistakes pilots would make and the system errors they would see—and what the regulator would see. The regulator had very little ability to see risk inside the airline system unless something was reported externally, such as a plane crash. All of the precursors to that accident, all of the systems and behaviors we know can lead to an accident, went unnoticed by the regulator. There was no mechanism for reporting them, and pilots held the regulator at arm’s length—who in their right mind would turn in a report on themselves when the result could be losing their license? The only tool the regulator had was a hammer.
The role I played was to help foster a better, more just environment. If a pilot came forward in good faith, raised his or her hand and acknowledged a behavior or an event, a team would come together to work collaboratively to resolve the hazard or the risk. The program brought about an amazing transformation. We started to see pilots coming forward in large numbers to identify risk in the system, including their own behaviors.
What is "Just Culture"?
Just Culture is a safety-supportive, or value-supportive, system that shifts attention away from errors and outcomes and onto the systems and behaviors that produce better outcomes. It’s knowing how quirky human beings think and act, and how sophisticated technical systems can be designed to work better with the human being.
Humans inside systems will do the unexpected, thinking it’s the right thing to do. Or someone will cut a corner. What we know about people is that we make these choices virtually every day. You’re not going to prevent humans from making mistakes. What you can manage is the choices we make ahead of those mistakes.
How can employers best work toward providing a safer work environment for their staff and patients?
They can start by having a more honest conversation with their employees, who at times will be caught between competing values. The problem is that most organizations measure and reward outcomes rather than behaviors. If you’re getting good outcomes, you may be rewarding a very risky behavior.
We see outcome bias again and again in organizations. But it’s a dangerous approach, and an accident waiting to happen. What helps is peer-to-peer coaching, so that people become aware of where they are cutting corners and come to understand the risks. More often than not, people cut corners and get a positive outcome, which reinforces the risk-taking behavior. We break that paradigm. It’s the systems and behaviors that matter most; get those right and the outcomes will improve.
What would you say to someone who points out that EMS responders don’t work in a controlled environment—they’re doing their best to be safe, but it’s just the nature of the business that there’s risk?
That’s exactly right. It’s very difficult to write procedures or rules that work in all circumstances. There’s a rule in aviation that in an emergency a pilot can deviate from any rule to the extent required to deal with the emergency. There’s a recognition that managing risk may require you to deviate from a rule.
We developed an algorithm that guides organizations in how to respond to events and behaviors, such as when an EMT or paramedic didn’t follow procedure. You hold them accountable for doing the right thing. The rules are subordinate to your values. We will stand in judgment of whether you did the right thing. Sometimes that means following procedures; sometimes it means going beyond the procedures in certain emergency situations.
But that’s not where organizations typically struggle. In extreme situations, organizations, supervisors and managers usually have a good feel for whether the employee did the right thing. Where we struggle is with the day-to-day cutting of corners, because it wasn’t risky enough to cause an accident at the time. If a paramedic is texting while driving at 30 mph and there is no crash, how do you manage that behavior? You could try to build a better system around the human to deter the behavior. You could put a device in the ambulance that keeps the cell phone from operating, but that might not be practical. So you have to raise people’s awareness, because humans can circumvent systems, and they do so quite often.
Jenifer Goodwin is associate editor of the monthly newsletter Best Practices in Emergency Services.