By Joy Finnegan, Editor | September 1, 2010
A while back I was talking with an A&P mechanic about human factors, working environments and the pressure they often feel in pushing their personal limits. He said to me, “I can’t wait for the day that I report an error to my manager and am rewarded with a bonus or a trip to Hawaii because I have potentially saved lives.” It took me a while to digest what he had said, but he was way ahead of me in understanding one thing.
During the Rotor & Wing Safety & Training Summit in June, one of our expert panel members, Jerry Allen of Baines Simmons Americas, an airworthiness and aviation safety consulting and training firm, mentioned the term “just culture” during his presentation on human factors. After the presentation, someone asked him to elaborate on what just culture is. It got me thinking that there are some folks who might never have heard of this philosophical movement in the safety and human factors area.
James Reason, who appears to have been the first to coin the term, describes a just culture in his book “Managing the Risks of Organizational Accidents” as an atmosphere of trust in which people are encouraged (even rewarded) for providing essential safety-related information, but in which they are also clear about where the line must be drawn between acceptable and unacceptable behavior.
Just culture is being embraced in the medical community and in some aviation communities as a safety system that facilitates open communication, accountability, and safe behavioral choices within an organization. Having a just culture in your working environment encourages workers in any field to come forward and report the mistakes they have made.
People seem to have strong feelings about this philosophy. Some say it allows people to go ahead and make mistakes, just as long as they report them. Others think it is the wave of the future and prophesy that it is the only way we will make progress in safety. Contrary to what the naysayers claim, just culture is not a “blame-free” approach to safety. Obviously, a blame-free approach that would allow someone to willfully and recklessly make unsafe choices can’t work in aviation.
But if you have worked in an environment where fear drives the decisions workers make, then you understand how safety can suffer. That kind of culture is known as a punitive culture: negative outcomes are met with disciplinary action or with someone losing a job. This has the opposite of the intended effect. It simply encourages people to be less than honest for fear of embarrassment, retribution or, worse, the loss of their livelihood.
Some of the main ideas that are established in a just culture include the benefits of having a learning culture versus a blaming culture and establishing where the border between “acceptable” and “unacceptable” behavior should be. Having a learning culture includes taking the reports of mistakes and analyzing them to see why they happened. Was the procedure wrong or easily misunderstood? Was the person on duty too long? Was the manual incorrect?
This is the idea behind NASA’s Aviation Safety Reporting System (ASRS). Most of you know this program, but if you want to learn more, I wrote about it in the January issue this year. One major flaw, however, is that the information gathered doesn’t always get disseminated to the right people. I flew with a 30-year captain just before his retirement. He told me that over the course of his 30 years of flying, he had submitted numerous reports (more than 100). Most were for simple things like an altitude bust. But he had never seen any data based on those reports (although it is accessible online). He also said that he had never gotten a follow-up phone call about any of them.
This conversation was prompted by my telling him that after filing an ASRS report, I had quickly gotten a follow-up phone call requesting more information. To me, that follow-up is the missing part of ASRS, and it is key to just culture. Open communication and discussion about why things went wrong are crucial.
According to David Marx, founder of the Just Culture Community (www.justculture.org), open communication about system errors, risks and mistakes is key. In his new book, “Whack-a-Mole,” Marx says, “We all take risks in life. We all make choices weighing one risk against another.” We do these types of analyses every day. But it is when we do the analysis, see the risk as “substantial or unjustifiable,” and still choose to take it, putting ourselves or others in harm’s way, that it becomes a problem.
Marx also emphasizes that a well-established system of accountability is needed to ensure the success of such a culture. In the inaugural issue of the Just Culture Community’s newsletter, he wrote: “It is an organization that recognizes that we as humans are fallible; however, it recognizes that in most circumstances, we do have control of our behavioral choices—whether we are an executive, a manager, or a staff member. It is an organization that understands shared accountability—that both good system design and good behavioral choices of staff together produce good results. It has to be both.” I encourage you to learn more about just culture and make your own judgment.