
Feb 24

What are near misses?

The disputed territory between lagging and leading indicators.

It seems that everyone wants to claim near misses for their cause. That should be good news – near misses present valuable opportunities to learn and improve. Unfortunately, sensible discussion of near misses all too often degenerates into a debate about whether they are lagging or leading indicators of safety. Turn this around, though, and maybe that is a clue to debunking some of the fallacious arguments involving lagging and leading indicators. Let’s explore.

Near What?

As a starting point: are we talking about a near MISS or a near HIT?

  • It was a miss, but it was a close escape – a near miss, with “near” describing the miss.
  • It nearly hit – a near hit, with “near” meaning almost.

The choice is yours. “Near miss” is conventional and well understood, which is why I choose to use it. However, having read Daniel Kahneman’s book Thinking, Fast and Slow, it occurs to me that, psychologically (considering cognitive ease), “near miss” may sound too good, too comfortable, too safe. “Near hit” is a less common phrase, with “hit” more likely to trigger a mental reaction than “miss”. If so, it may draw greater attention and reaction to the same event.

Luckily nothing happened – this time

Some say that in a near miss nothing actually happened. They argue that a near miss provides a glimpse into the future – a suggestion of something more serious that might happen on another occasion. The message is that, correctly understood, a near miss is an opportunity to learn. Apply that knowledge to take action to prevent possibly more serious consequences another time. Using this argument, near misses are taken as leading indicators that can be used to help create safety.

But it was an incident

“Near misses describe incidents where no property was damaged and no personal injury sustained, but where, given a slight shift in time or position, damage and/or injury easily could have occurred” (U.S. OSHA definition). The clear message is that, despite no physical harm, something undesirable happened. On this basis a near miss is a lagging indicator.

Is a near miss an unsafe condition?

We can make a distinction between “near miss” and “unsafe condition”. An unsafe condition can exist even when there is no incident – making it a leading indicator. Examples could be corrosion of steel walkways, uninspected pressure vessels, defective brakes, PPE not worn, or poor electrical grounding.

Too late?

Classing near misses as a lagging indicator does not necessarily mean too late. True, you cannot go back and prevent that particular incident. But, as with all incidents up to and including fatalities, it is still possible – if not an obligation – to investigate, learn from the experience and take remedial action to prevent a recurrence. In a sense, the lagging indicator generated by incidents becomes a leading indicator for prevention.

Neither and both

Near misses are, quite simply, indicators. They straddle the descriptions of lagging and leading. They represent something that was unsafe (but you were lucky); they are weak signals that provide evidence in advance of the possibility for injury or damage. What matters is that near misses can be a relatively plentiful and rich source of data for learning and improvement.

We should stop worrying about the terms lagging and leading and use wisely whatever data we can to predict and improve. We miss opportunities for improvement if we get too dogmatic and say that lagging = too late, or leading = too subjective.

PDCA / PDSA

Continual improvement in safety can be achieved by using PDCA or, better, PDSA:

PDCA (Plan-Do-Check-Act), also known as the Deming Cycle, is used extensively for continual improvement and has been adopted as the basis for management system standards such as OHSAS 18001.

PDSA (Plan-Do-Study-Act), also known as the Shewhart Cycle, is the version Deming taught (for more details of PDSA see Out of the Crisis and The New Economics). He advocated “Study” because it implies understanding why something improved, or did not, whereas “Check” suggests more an answer to whether or not all is going to plan.
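As a purely illustrative sketch (the structure and names below are my own framing, not anything from Deming’s books), the distinction can be put in code: “Check” would only compare the outcome against the plan, whereas “Study” tries to explain the gap before acting:

    def pdsa_iteration(state, plan, do, study, act):
        """One turn of the PDSA cycle; plan, do, study and act are
        callables supplied by the user (illustrative structure only)."""
        method, prediction = plan(state)       # Plan: choose a method and predict the outcome
        outcome = do(state, method)            # Do: carry out the plan
        learning = study(prediction, outcome)  # Study: understand WHY prediction and outcome
                                               # differ (Check would stop at whether they match)
        return act(state, method, learning)    # Act: adopt, adapt or abandon the method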

PDSA can be used for continual improvement:

  • to help create what we predict will be a safe work environment (safety precursors);
  • as the basis for an operational definition of safety.

Creating safety precursors

Take time to think what is required for safe working. Decide how you will achieve it – consistently. Implement those plans. In a previous article I discussed a practical approach for effective improvement to the control of safety risks. Using PDSA can build on that.

Monitor what you do. How well are you doing on what you say you will do? Reassurance could come, for example, through observations from inspections, timely completion of actions and reporting of near misses. Call these leading indicators if you like. Given the importance of learning from near misses, action taken to encourage people to report near misses is also an important leading indicator of safety. Note that an increase may represent greater transparency and trust, not a worsening situation. Near misses should be prevented but, whatever the incident was, people should be congratulated for REPORTING them.

Complete a PDSA cycle by taking action to rectify any shortcomings in your methods of achieving safety (see Figure 1). Keep repeating the process. The message is that leading indicators help you predict and improve safety.

Figure 1: PDSA on the work process with examples of leading indicators

How effective are the safety efforts?

Wisely chosen indicators will allow you to improve safety performance. But recognise that leading indicators are a heuristic – they substitute the answer to the difficult question “are we safe?” with the answer to a simpler question “are we doing what we think are the right things?”

Even if your indicators are well chosen and showing excellent performance, you may have missed one crucial failure path that could lead to an accident. This is where near misses, as well as injuries and damage, provide valuable information about weaknesses in the safety system.

So, to judge success in achieving safety in any period of time, it is not sufficient to rely solely on leading indicators. However good you may feel about the effort you have put into creating a safe workplace, the acid test is this: did anyone get injured? The final part of the jigsaw is to obtain data on injuries and occupational illnesses, for example TRIR (Total Recordable Incident Rate), or the rates for LTI (Lost Time Injuries) and DART (Days Away, Restricted, or Transferred).
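For concreteness, TRIR is conventionally normalised to 200,000 hours worked, which corresponds roughly to 100 full-time employees working for a year. A minimal sketch in Python, with purely illustrative figures:

    def trir(recordable_incidents, hours_worked):
        """Total Recordable Incident Rate, normalised to 200,000 hours
        (about 100 full-time employees working for a year)."""
        return recordable_incidents * 200_000 / hours_worked

    # Illustrative figures only: 5 recordable incidents over 1,000,000 hours
    print(trir(5, 1_000_000))  # 1.0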

Figure 2 shows how a TRIR might be presented graphically, with 3-sigma process behaviour limits calculated for performance that appears to have stabilised at about 1.0, with annual variation predicted to lie between 0.5 and 1.5.

Figure 2: Example of TRIR to report safety performance
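For readers who want to reproduce limits like those in Figure 2: a common way to compute process behaviour limits – and the one I assume here – is Wheeler’s XmR (individuals) chart, which places the limits at the mean plus or minus 2.66 times the average moving range. The annual TRIR values below are invented purely so that the output resembles the 0.5 to 1.5 band described above:

    def xmr_limits(values):
        """Natural process limits for an XmR (individuals) chart:
        mean +/- 2.66 * average moving range."""
        mean = sum(values) / len(values)
        moving_ranges = [abs(a - b) for a, b in zip(values, values[1:])]
        avg_mr = sum(moving_ranges) / len(moving_ranges)
        return mean - 2.66 * avg_mr, mean, mean + 2.66 * avg_mr

    # Hypothetical annual TRIR values (not real data)
    annual_trir = [0.9, 1.1, 0.8, 1.0, 1.2, 1.0, 1.0]
    lower, centre, upper = xmr_limits(annual_trir)
    print(lower, centre, upper)  # about 0.51, 1.0, 1.49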

Call these lagging indicators if you like, but do not stop there. Crucially, you must study the data to understand what it means. As Nassim Nicholas Taleb says in The Black Swan, “You can be very confident about what is wrong, not about what you believe is right”.

The data may be necessary for corporate reporting or other reasons, but comparing results against last year or some industry standard provides little or no knowledge. If you want to learn and improve you must study the details – specifically, your details.

Was it just luck?

If you have had no injuries, or a low rate, why was this? If you have had injuries you need to know the specifics, such as the type and severity of injuries, the type of work, the conditions and location where the injury occurred.

How much do you trust the reporting / recording process? Are thorough incident investigations identifying the cause of incidents (not who to blame)?

Is there good reporting on near misses to give a wider perspective of what is happening? Is that distorted by a disproportionate volume of trivial near miss reports to achieve a target or reward? Does fear restrict reporting on serious near misses?

The important point is that knowledge gained from studying lagging performance indicators must be fed back to improve your leading indicators and ensure that you are controlling actual safety risks. For example, if your incidents involve many soft tissue injuries you may need more focus on ergonomics or material handling. Safety checklists should be amended if necessary and perhaps new leading indicators established to track performance in these areas.

An operational definition of safety

The requirements for an operational definition of safety are:

  • What is the organization’s aim with respect to safety – the ideal?
  • What will the organization do to create improvements in safety – the methodology?
  • How will you know if you have achieved improved safety – the judgement?

The argument above suggests how leading and lagging indicators fit together in an integrated way as part of an operational definition of safety – a balanced approach, if you will.

The aim of continual improvement is implemented through preventive action, tracked by leading indicators that optimise performance of the safety system. Accidents provide data from which judgements can be made about the effectiveness of the preventive action. The knowledge gained from studying accidents is the basis for action to improve the preventive measures and the leading indicators. The cycle is shown diagrammatically in Figure 3.

Figure 3: PDSA as an operational definition of safety

Preoccupation with Failure

Fortunately, there appears to be widespread agreement about how learning from near misses can help with safety improvement. However, we should not let our enthusiasm about the benefits of near miss reporting and investigations overshadow the point that each near miss represents A FAILURE.

In their book, Managing the Unexpected, Karl Weick and Kathleen Sutcliffe identify a preoccupation with failure as a feature of so-called High Reliability Organizations. They stress that a near miss should be interpreted as a sign that your system’s safeguards are vulnerable. Any lapse is a symptom that something may be wrong with the system: something that could have severe consequences. Their view can be compared to the common cause hypothesis that, for an incident in any one organisation, the same causal path may lead to no injury, a minor injury or a major injury. We should, as Weick & Sutcliffe suggest, “Interpret a near miss as danger in the guise of safety rather than safety in the guise of danger”.

Beware of people who talk about rear view mirrors

Some people love analogies. An analogy can be a useful way to explain a new concept, but remember that analogies can be fallacies. Just because there is one similarity does not mean that there are similarities in all respects. Less excusably, analogies are sometimes used to amuse an audience and divert attention from rational assessment.

“Using lagging indicators is like driving a car by looking in the rear view mirror”. Ha, ha!

What is meant of course, but conveniently missed, is that it refers to ONLY looking in the rear view mirror. Driving a car and looking from time to time in the rear view mirror warns you about cars approaching fast and about to overtake you. That improves safety.

Summary:  “Don’t throw the baby out with the bathwater”

I hope this article has shown that we learn something from both the PROCESS of creating safety (via leading indicators) and the safety RESULT (via lagging indicators).

Leading indicators can provide valuable day to day information on how PRECISE we are in doing what we say we should do. However, we should also continually revise our judgement of the effectiveness of our safety efforts by the successes and failures we experience. Lagging indicators can provide essential feedback on how ACCURATE we have been in identifying the necessary safety measures required. You just need to make sure you understand your data and not torture it to make it confess to suit some other motive.

Finally, we should learn from near misses and take action to improve. They are a performance indicator; talk of leading or lagging is a distraction from the fact that each near miss represents a failure… and an opportunity.

References

Deming, W.E., 1986, Out of the Crisis. Cambridge, MA: MIT

Deming, W.E., 1993, The New Economics. Cambridge, MA: MIT

Kahneman, D., 2011, Thinking, Fast and Slow. London: Penguin Books

Taleb, N.N., 2007, The Black Swan: The Impact of the Highly Improbable. London: Penguin Books

Weick, K.E. & Sutcliffe, K.M., 2007, Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2nd Ed. San Francisco, CA: Jossey-Bass.

Biography

Nick Gardener spent many years working in the chemical, nuclear and automotive industries. He is now a global risk and HSE consultant working for Risk International Services Inc. He can be contacted at: ngardener@riskinternational.com.

21 comments

  1. Cary Usrey says:

    Nick,

    Great article – thanks for sharing.

    It is very interesting when discussing leading and lagging indicators:

You can’t have a ‘safe’ worksite if you have high injury rates. However, you can have low injury rates and still work unsafely!

The balanced approach is a very good methodology.

  2. Nick Gardener says:

Thanks Cary. That is the point really, isn’t it? You need a feedback loop through PDSA. Decide what you should do. Then use leading indicators to check that you are doing what you plan to do. Lagging indicators help you assess how you did; then use what you learn from that as leading input to improve through the next iteration.

    Nick

  3. Sean Cuffe says:

    Great article Nick, a well presented argument.

The near miss for me is a critical lag indicator, particularly if you rate them in terms of the risk presented or the potential for harm. If there was serious potential for harm, such as when a structural element collapses and no one is injured, it may be referred to as a potential class 1 event, class 1 being the potential for death or life-changing damage. If the end of a hammer flies off and sails past someone, you may be talking PC3, or potential for minor injury.

    I would always have these uncontrolled releases in the lag stats.

The other issues you mentioned in the paragraph “Is a near miss an unsafe condition?” are hazards; they exist without their potential being realised. No event has occurred, so the hazard can be rectified before it places someone at serious risk.

    Regards Sean

    1. Nick Gardener says:

      Hi Sean

      Thanks for your feedback and thoughts.

I agree with your point about hazards, and would be interested to hear more about your PC1, 2, 3 classifications. I understand the idea and your examples make sense. Do you have definitions you can share?

      I would also be interested to hear if you investigate PC1, 2 and 3 in different ways or to different degrees. I guess that you do, but in what ways?

      Nick


  4. mike fuller says:

An excellent presentation and some brilliant responses – all very thought-provoking.
In the old days we talked about health & safety; we then bolted on environment and quality. Now we also have behaviour and culture to add to our toolbox!

Near misses/hits are most times caused by human error and often due to a lack of ‘the usuals’ – you all know what the list looks like!

    I believe what is important is that people work safely and actually own up and report so that a more serious event is prevented. All that’s down to the culture in the workplace – and the response from the supervisor/manager – we all have a list of these as well !

    1. Nick Gardener says:

      Thanks for your comments Mike

      Your third paragraph highlights the crucial point. We rely on those involved in a near miss to report the near miss. The best way to foster that is to make it clear why this is important; make it easy and safe to report; and then act on the report to improve safety so that it becomes a reinforcing cycle. Everyone takes responsibility for improvement and sees the benefits.

      Nick

  5. Nick Gardener says:

    When considering how to implement an effective near miss reporting culture I have found that there are 3 groups to consider.

Management, supported by H&S professionals, want near misses to be reported – it is all upside for them. But what about everyone else?

Workers at the sharp end, who are most likely to be involved in or close to the near misses. They are naturally likely to be cautious about admitting to an error. An idea I heard from Jim Rowe was to refer to near misses as a “Good Catch”. That sounds a great way to reassure people. But they will be looking to see if management really means it.

The crucial third group is the supervisors. Their attitude is what will influence whether near misses are reported or not. They have reputations at stake too, and workers will look to them, not senior management, to decide whether or not to report. There are ways to achieve this. But too often an enthusiastic “initiative” from the top is directed at the shopfloor and misses bringing the supervisors on board first. Wittingly or unwittingly, they can wreck any chance of success if they are not convinced.

    Nick

  6. Nick Gardener says:

    A further observation prompted by comments elsewhere:

    Leading indicators are a DIAGNOSTIC metric (measurement) within the operating system;

    lagging indicators are a PERFORMANCE metric of the system.

They must not be considered as either/or options, nor judged as one being better than the other. You need both for continual improvement in a PDSA cycle (as shown in Figure 3 in the article).

    Nick

  7. Steven Lamb says:

An aside comment, really, to the article (which I enjoyed, by the way!).
Mike Fuller made an interesting comment when he stated:

    “Near misses/hits are most times caused by human error and often due to lack of the usuals’ – you all know what the list looks like!”

    I am working to the premise that any near miss/hit is an actual incident. My question is this.

Is human error really a cause of incidents? I acknowledge it can certainly be a contributing factor. Could we not argue that human error is really only a symptom of a failed ‘safe system of work’, e.g. fatigue management, training etc.?

    1. Nick Gardener says:

      Steve – thank you for your comment.

      Causation is a hotly debated subject, but, in the context of what we are discussing here on near misses, let me offer the following thoughts.

Apart from the effects of natural phenomena, near misses occur because, somewhere and sometime before that, a person either did something unsafe or missed making something safe (the so-called acts of commission and omission). NOTE this has nothing to do with blame. For example, it may have been a completely innocent mistake, or a good decision with an unpredicted outcome.

We tend to focus on unsafe acts at the time of the incident and, arguably, are often mainly concerned with putting in measures to prevent in future what was done in error – an unsafe act that precipitated the incident. Encouraging people to do what was NOT done to prevent the incident somehow seems to be more difficult.

We should not forget that people create, or allow to exist, unsafe conditions or practices when there is no incident. But then one day a near miss (or worse) happens that might have been prevented. NOTE this is not saying that all accidents are preventable. It is merely that we can and should do better. Workplace inspections are one excellent way to spot and correct unsafe conditions and failures to follow safe working practices.

      Nick

      1. Nick Gardener says:

        I should add that there are several other relevant articles worth checking here in SafetyCary that can help identify unsafe conditions and unsafe acts:

        By Cary on Safety Inspection Strategy and Frequency

        By James Loud on Management Walkarounds:
        What Are They?
        How Do You Do Them?
        Do and Don’t Tips

        Nick

  8. Ricardo Montero says:

    Thanks for the article.
What is the evidence that a near miss reporting system has some impact in reducing accidents? Are there data supporting it, or are there only opinions? What I can read suggests there is no evidence that using near misses has an impact on accident reduction… perhaps it is only one more motivational activity, or a learning tool, but does it have impact beyond our logical opinion?

    1. Cary Usrey says:

      Ricardo,

      I will let the author, Nicholas Gardener, expand further.

      However, there are numerous studies out there that provide evidence and support the collection of near misses.

      Here is a link to one good article: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1117768/

      If you look at the Cost Benefit Analysis section at the bottom of the article, there are multiple cited examples and references.

    2. Nick Gardener says:

      Ricardo – thank you for reading the article and your question asking about evidence.

      We cannot show that by taking action following a near miss we prevented a more serious accident. What we can show is that, for an incident in any one organisation, the same causal path may lead to no injury, a minor injury or a major injury – the common cause hypothesis.

There are data supporting the common cause hypothesis. A good example is the paper by Linda Wright & Tjerk van der Schaaf, “Accident versus near miss causation: a critical review of the literature, an empirical test in the UK railway domain, and their implications for other sectors” (Journal of Hazardous Materials 111 (2004), 105–110).

      The problem comes when people try to use the common cause hypothesis to suggest that there is some ratio between near misses and more serious accidents. There is no inevitability.

      We cannot say that following a near miss there WILL be a more serious accident resulting from the same cause, let alone that after a certain number of near misses we should EXPECT a more serious accident. All we can say is that we should learn from near misses, which represent a failure of the system, because the same cause COULD on another occasion lead to a more serious accident.

      Empirically, it is often easy to see that, but for a small difference in position or time, a particular near miss could have had much more serious consequences. This is supported by evidence from investigations into major incidents, which sometimes show that there were warning signs in the past from near misses that were either ignored, or, in hindsight, inadequately addressed.

      Nick

  9. Ricardo Montero says:

    Dear Cary,
    Thanks for the comments and the paper link

    Dear Nick,
    Thanks for answering.
Agreed – my previous opinion came close to misinterpreting the common cause hypothesis. As you can read, I agree about the potential of near misses for learning.
But using them as an indicator? I think that is wrong; it is only a misinterpretation of the common cause hypothesis. If not, someone must support it with data, or it remains, like mine, an opinion.

By the way, there is an interesting discussion about “human error” as a cause of incidents and accidents. I always recommend specifying the definition being used. For example: is it about the operator? Does it include the supervisor? The designer? The maintenance people? Of course 99.9% of the causes of accidents and incidents are due to human error if the definition is not limited to the direct operator. Even when the immediate cause is the operator, probably the root cause is related to human errors of people other than the operator (sometimes confused when people refer to them as “technical, organizational, etc. causes”, as if those did not originate in the error of some specific person, perhaps a long time before the event).

Sorry for my bad English; it is not my first language.

    1. Nick Gardener says:

      Ricardo – I understand your argument about whether or not near misses are an indicator.

      Firstly, I think we agree that the common cause hypothesis only suggests the POSSIBILITY that the same causal path may lead to no injury, a minor injury or a major injury. It says nothing about the PROBABILITY of any of those happening.

My argument is that, whatever the consequence, a near miss represents a failure. Tracked, with full reporting, and grouped appropriately into similar causes, they are an indicator of the vulnerability of your system to the relevant cause.

      I’m sure you will agree though that, as with all indicators, you need to understand what an indicator is measuring and interpret it with care. An indicator only suggests in a context; it proves nothing.

Full reporting is a problem, of course. I shall be discussing near miss reporting in more detail in a further article, which I hope you will read and comment on.

      As well as tracking near misses as failures, you can also track the number of near misses being reported, perhaps grouped by some criteria. Used intelligently, this indicator can tell you about the activity going on. How engaged are people with reporting near misses? You have to decide whether an increase means more near misses are occurring, or whether it means that more near misses are becoming visible. But, unless the workplace has changed significantly, an increase in near miss reports will more likely be a sign of better reporting.

      Thank you for exploring the meaning with me. I hope this reply with my views is helpful. It is always useful for me to hear from the people who read my articles, and especially from those who have questions or challenges.

      Nick

  10. John Moffat says:

    Great article.
To extract myself from the near miss or hit play on words, I now tend to use the term contained in the OHSAS 18001 definitions, “close call”, as I consider that more closely represents what we all know is meant by near miss… I regularly get “if it nearly missed it must have hit!” Close call doesn’t have the same confusion possibility.

    Just a thought.

    1. Nick Gardener says:

      John – thanks. Glad you liked the article.

      Interesting isn’t it how the words seem to cause so much discussion.

      I do like “close call” as a way to avoid the hit / miss argument.

      I suspect the different options will be more effective in different circumstances, though I have no psychological research to back this up.

      – “Near miss”, while arguable linguistically, has the benefit of being very well known.

      – “Near hit” does give (in my opinion) extra emphasis on how lucky we were, but can sound a bit pedantic.

– “Close call”, as you say, avoids the hit/miss distraction to focus on what actually happened.

– “Good catch” says very little, but can be advantageous to emphasise the value of reporting to those who might be suspicious or cautious about reporting.

      Thanks again for your comment. It is an important aspect of safety and, the more we can do to understand the subject, the better chance we have of improving safety.

      Nick

  11. Nick Gardener says:

To all who visit here: you may also wish to read the follow-on article posted on this site in June 2013:

    Near Miss Reporting – Practical Advice

    Click on the Archive link above right, or go to http://www.predictivesolutions.com/safetycary/near-miss-reporting-practical-advice/

NOTE: I am still very interested in comments on the article here too. It’s been a great discussion so far. Thank you all, and Cary for hosting on his blog.

    Nick

  12. Eben Tanwani says:

    Thanks for sharing, it is a great article indeed.

  13. suresh varghese says:

It is very informative and interesting.
