Fire Fighting in Canada

Editors’ pick 2015: From the Floor

Knowing what you don’t know could be the key to solving all of your problems. But what if we don’t actually know what we should know? Or worse, what if we think we know but are misinformed?

September 28, 2015 
By Jay Shaw



Many organizations operate with what I call a brain trap – great people working under a system of beliefs and structures that are obsolete and possibly dangerous (it goes way deeper than this, but in fewer than 800 words I have to keep it short). These antiquated beliefs become learned behaviours over time. No matter how you look at it, society progresses; unfortunately, we sometimes have to move at the pace of our slowest walker. In some cases, a critical event such as an injury, loss of life, or business failure shakes an organization to its core and sparks the realization that it is time to change; but many inside the organization may not be ready.

Our day-to-day routines, what we see, analyze and make decisions about every second, are created from our collective experiences; these experiences, in turn, form the perceptions that dictate how, why, what and when we decide to do things. If you check your fire truck every day and the same equipment is in the same compartment every single time, you start to perceive that it will always be there. In reality, there are many times when the equipment is not there – when it has been used, maintained, sent out for service or replaced.

When we refuse to accept new inputs or information that may change our behaviours, we start to accept what is wrong as a reasonable action.

If a firefighter is able to escape a low-risk event with little or no consequence by circumventing a rule, then, over time, that unsafe action becomes routine and acceptable. Now imagine a firefighter escaping a high-risk event while ignoring the proper procedures. If we accept this type of behaviour, we could be setting ourselves and our organizations up for a critical failure.


What if your body of knowledge is based on experiences that formed perceptions that are actually wrong? If your department has never really embraced professionalism and is still beholden to a system of beliefs that was once considered acceptable, you are working under an outdated belief system. Do you never clean your gear, or still think a dirty helmet is cool? Is hazing the rookie acceptable? Does your department still have a fire-hall beer fridge? What do these actions, or lack of actions, say about your department’s ability to operate clearly? Is your perception of these examples positive or negative?

A new term has emerged in the fire world that was recently brought to my attention: normalization of deviance, which is just another way of saying brain trap. Normalization of deviance is loosely defined as the process of letting your perceptions normalize incorrect actions, behaviours and processes until they seem perfectly acceptable.

Normalization of deviance is what took down two space shuttles and collapsed the United States space program, twice. Normalization of deviance allowed NASA to accept a faulty O-ring on a solid-rocket booster that was deemed a critical component for launch success. After several missions during which nothing happened, NASA started to normalize the flaw, or deviate from the safety standard, and slowly, over time, the faulty O-ring became acceptable. In 1986, the space shuttle Challenger broke apart about a minute into its launch when an O-ring failed and super-heated gases leaked past the seal, igniting the external fuel tank, destroying the shuttle and killing seven astronauts. The sick part is that some NASA engineers’ perceptions were not normalized – they did not have brain trap. In fact, those engineers pleaded up the chain of command that a failure was predictable based on the data and the known facts. But because nothing bad had happened before with a faulty O-ring, NASA chose to ignore the warning and perceived the situation to be safe.

After the tragedy, you would think lessons would have been learned; however, in 2003 the space shuttle Columbia disintegrated on re-entry, killing seven more astronauts. Damage to the heat shielding designed to protect the shuttle had occurred on previous flights, but had been perceived to be acceptable.

So ask yourself how you perceive things in your department. Are you normalizing deviations in behaviours, processes or procedures because “that’s just the way it is done around here”? I can think of many things in my personal life and at work that I should probably stop accepting as normal risks; the problem is that sometimes those risks might affect someone else, and that is not only a brain trap, but also just not good enough anymore!


Jay Shaw is a firefighter and primary-care paramedic with the City of Winnipeg, and an independent consultant focused on leadership, management, emergency preparedness and communication skills. jayshaw@mts.net  @firecollege

