The purpose of SMS:
“Is SMS enough to make our organisations safer?”

It was my great pleasure to speak at the excellent safety event organised by Líder Aviação in Copacabana, Rio de Janeiro on 20-10-2015.

In this presentation I asked whether we need to do more than just build the bureaucracy of the SMS to make us safer. Of course, an SMS is a great tool that can tell us a lot about our organisation, but if the safety data does not lead to actions that reduce risk, what is it good for?

Bill Voss of the Flight Safety Foundation expressed the purpose of SMS very well in a blog article;
he contends that the SMS’s function is to help allocate resources to reduce risk:

4 simple questions about Safety Risk Management

It is one of the reasons why I started SRMcoach.eu: I want to challenge the assumption that collecting safety data automatically means that action will be taken to reduce risk.

I was very glad to see that the other speakers (their presentations here) were also looking for new approaches to safety management. I think this reflects a growing awareness that we need to look beyond the SMS implementation and start using it effectively to improve our organisations.

There is a long way to go for the safety professional to convert safety data into something useful that influences the management team to take action.
In the presentation [Pdf], I talked about three major problem areas:

Understanding Risk

You can find a great resource about systems thinking on the Skybrary.aero page.
(This is a little-known but excellent safety-concept resource maintained by Eurocontrol, so the information and editors are validated.)

On Skybrary.aero you will find the Systems Thinking toolkit, a set of 10 principles that will help you to implement systems thinking in your SMS.

For me, one of the most important concepts is Local Rationality:

“People do things that make sense to them given their goals, understanding of the situation and focus of attention at that time. Work needs to be understood from the local perspectives of those doing the work.”


This concept helps us understand why many safety violations are not necessarily made because people are behaving “badly”; very often they are just trying to make things work with limited information, incomplete understanding and conflicting goals.

This is why the concept of Just Culture is so important! We need to ensure we have effective interventions.
Punishing people will not change the local rationality, and unless we change the environment, we might leave an error trap in our organisation that can also catch out the next guy!

Understanding this concept also enables you to identify how you can change behaviour by changing the elements that influence the local rationality.

Note that this is also applicable to your management team: they too have a personal, limited understanding (due to complexity), different personal goals and a different focus, so they make safety-related decisions based on their own local rationality. As a safety professional you need to explore these different local rationalities if you want to have a chance at changing anything.

Psychological concepts mentioned

(The following text contains excerpts from Wikipedia.)

A great book on the psychology of understanding risk and probability was written by Daniel Kahneman, who is notable for his work on the psychology of judgment and decision-making, as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences.

The book is called Thinking, Fast and Slow (since it is a bit large, you might want to try the Audible audiobook version).

Heuristics


The second section [of Thinking, Fast and Slow] offers explanations for why humans struggle to think statistically. It begins by documenting a variety of situations in which we either arrive at binary decisions or fail to precisely associate reasonable probabilities to outcomes. Kahneman explains this phenomenon using the theory of heuristics. Kahneman and Tversky originally covered this topic in their landmark article from 1974 titled Judgment under Uncertainty: Heuristics and Biases.[6]

Kahneman uses heuristics to assert that System 1 thinking involves associating new information with existing patterns, or thoughts, rather than creating new patterns for each new experience.
For example, a child who has only seen shapes with straight edges would experience an octagon rather than a triangle when first viewing a circle. In a legal metaphor, a judge limited to heuristic thinking would only be able to think of similar historical cases when presented with a new dispute, rather than seeing the unique aspects of that case. In addition to offering an explanation for the statistical problem, the theory also offers an explanation for human biases.

Anchoring

The “anchoring effect” names our tendency to be influenced by irrelevant numbers: experimental subjects who were shown higher numbers gave higher responses, and those shown lower numbers gave lower responses.[2]

Availability

The availability heuristic is a mental shortcut that occurs when people make judgments about the probability of events based on how easy it is to think of examples.[7] The availability heuristic operates on the notion that, “if you can think of it, it must be important.” The availability of consequences associated with an action is positively related to perceptions of the magnitude of the consequences of that action. In other words, the easier it is to recall the consequences of something, the greater we perceive these consequences to be. Sometimes this heuristic is beneficial, but the frequencies with which events come to mind are usually not accurate reflections of the probabilities of such events in real life.[8]

Substitution

System 1 is prone to substituting a difficult question with a simpler one. In what Kahneman calls their “best-known and most controversial” experiment, “the Linda problem”, subjects were told about an imaginary Linda: young, single, outspoken and very bright, who, as a student, was deeply concerned with discrimination and social justice. They were asked whether it was more probable that Linda is a bank teller or that she is a bank teller and an active feminist. The overwhelming response was that “feminist bank teller” was more likely than “bank teller”, violating the laws of probability. (Every feminist bank teller is a bank teller.) In this case System 1 substituted the easier question, “Is Linda a feminist?”, dropping the occupation qualifier. An alternative view is that the subjects added an unstated cultural implicature to the effect that the other answer implied an exclusive or (xor), that Linda was not a feminist.[2]
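To make the probability point explicit, here is a minimal sketch of the conjunction rule that the popular answer violates. The numbers are hypothetical, chosen purely for illustration and not taken from Kahneman and Tversky’s study: whatever values you pick, the conjunction “bank teller and feminist” can never be more probable than “bank teller” alone.

```python
# Minimal sketch of the conjunction rule behind the Linda problem.
# The probabilities below are hypothetical and purely illustrative.

p_teller = 0.05                  # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.95   # assumed P(feminist | bank teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_teller:.3f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.3f}")

# The conjunction can never exceed either of its parts, no matter how
# "representative" the feminist description feels for Linda.
assert p_teller_and_feminist <= p_teller
```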

Optimism and loss aversion

Kahneman writes of a “pervasive optimistic bias”, which “may well be the most significant of the cognitive biases.” This bias generates the illusion of control: the feeling that we have substantial control over our lives. The bias may be usefully adaptive: optimists are more psychologically resilient and have stronger immune systems than their more reality-based opposites.[citation needed] Optimists are also wrongly thought to have longer lives on average, a common belief that was disproved in The Longevity Project. Optimism protects from loss aversion: people’s tendency to fear losses more than they value gains.[2]

A natural experiment reveals the prevalence of one kind of unwarranted optimism. The planning fallacy is the tendency to overestimate benefits and underestimate costs, impelling people to take on risky projects. In 2002, American kitchen remodeling was expected on average to cost $18,658, but actually cost $38,769.[2]
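To make the size of that gap concrete, here is a quick back-of-the-envelope check using only the two figures quoted above; the average remodel came in roughly 108% over the estimate, more than double the expected cost.

```python
# Back-of-the-envelope check of the kitchen-remodeling figures quoted above.
expected_cost = 18_658   # average expected cost (USD, 2002)
actual_cost = 38_769     # average actual cost (USD, 2002)

overrun = actual_cost - expected_cost
overrun_pct = overrun / expected_cost * 100

print(f"Overrun: ${overrun:,} ({overrun_pct:.0f}% above the estimate)")
# Overrun: $20,111 (108% above the estimate)
```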

To explain overconfidence, Kahneman introduces the concept he labels What You See Is All There Is (WYSIATI). This theory states that when the mind makes decisions, it deals primarily with Known Knowns, phenomena it has already observed. It rarely considers Known Unknowns, phenomena that it knows to be relevant but about which it has no information. Finally it appears oblivious to the possibility of Unknown Unknowns, unknown phenomena of unknown relevance.

He explains that humans fail to take into account complexity and that their understanding of the world consists of a small and necessarily un-representative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will mirror a past event.


Understanding operational reality

Practical drift

The slow, gradual uncoupling of everyday practice from written procedures; the drift builds up unnoticed until the accumulated gap becomes a problem.

Just Culture

An atmosphere of trust in which people are encouraged to provide essential safety-related information, but in which it is also clear where the line is drawn between acceptable and unacceptable behaviour.

Taking (Effective) Action

Some of the links below will help you understand the factors that affect taking effective action.


First of all, I talked about how important it is to understand the difference between extrinsic (i.e. carrot-and-stick) and intrinsic motivation, which is based on Self-Determination Theory but is explained a lot better in Drive by Daniel Pink.

Having an up-to-date understanding of what motivates people is very important. Science has shown that both punishment and incentives are, each in their own way, poor motivators for anything but the most simplistic mechanical tasks.
From the moment that what you ask people to do requires even minimal cognitive effort, both punishment and incentives can backfire on you, leading to perverse incentives whereby people find it easier to manipulate the numbers than to actually comply.

If you want to make sure people cooperate or comply, you need to tap into a more powerful intrinsic motivation.
In order to do this you need three things:

Autonomy: it helps enormously, at any level, if the safety recommendations are based on input from the people who need to execute them. Instead of telling people what they should do, a coaching approach can be to explore the available options together, let the people decide what action they want to take and then hold them accountable for their results. The feeling of autonomy will increase a lot if people feel that their suggestions are the basis of the recommendation, and the chance that they will comply will be much higher.

Mastery: of course, people need to feel that they are actually able to do what they are asked to do. This is related to competence, which can depend on skill, knowledge or attitude.

Purpose: people need to understand why it is important that they do something. There is a natural tendency in most people to want to belong to something bigger than themselves; if you explain how their actions contribute to the success of the group, it will increase intrinsic motivation and the probability of compliance.

Safety manager or safety coach?
Safety coaching 

A few articles from SRMcoach.eu explain in a bit more detail how safety professionals might improve their relationship with management by shifting to a coaching role.

Non-technical skills for the safety professional

This article explains how safety professionals can benefit from building a few non-technical skills to improve their impact in the organisation (and their careers!)

Setting SMART goals: by applying these criteria, action plans become more concrete and easier to follow up.
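As a small illustration of how SMART criteria make an action plan concrete and trackable, the sketch below captures a safety action in a structured form. The field names, the example action and its details are hypothetical, invented for this example rather than taken from the presentation.

```python
# Hypothetical sketch: recording a safety action against SMART criteria
# so it can be tracked and followed up. All field names and example
# content are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class SmartSafetyAction:
    specific: str      # what exactly will be done, and by whom
    measurable: str    # how completion/success will be measured
    achievable: str    # why the owner can realistically do it
    relevant: str      # which risk or finding it addresses
    time_bound: date   # deadline for follow-up

action = SmartSafetyAction(
    specific="Ramp supervisor briefs all handlers on the revised pushback procedure",
    measurable="100% of handlers have signed the briefing record",
    achievable="Briefing material already exists; takes 15 minutes per shift",
    relevant="Addresses the ground-collision risk raised in a recent hazard report",
    time_bound=date(2015, 12, 1),
)

print(action)
```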

 Please feel free to ask in the comments for more background info on anything else!

About the Author
Jan Peeters


Jan is an experienced Safety practitioner who is always on the lookout to improve SMS and the management of safety. He coaches organisations and individuals in Safety Management.
