Jun 13

Myth busting: All Accidents Are Preventable?

Recently my friend Alan Quilley had a fine article at this very spot dealing with some Safety Myths.  I thought it would be a good idea to expand a bit on this and explore some myths further, starting with a very persistent one that has found its way into many a Safety Policy and also several textbooks:

All Accidents Are Preventable.

The question of course is: Are they indeed?  Why then don’t we see this happening in our everyday observations?  Just check the news or your company’s incident statistics.  Why are we still having accidents after almost a century of more or less serious safety management efforts, scientific and technical progress and increased societal demands through better standards and regulations?

If the presumption that all accidents are preventable were true, are we simply not trying hard enough?  Or can’t we after all?  Or is this (as some cynical folk think) just a phrase that is used to justify so-called ‘Zero Harm’ goals?  The latter thought isn’t so absurd, by the way, because ‘zero’ is only achievable if indeed all accidents can be prevented.

Some may argue that some things cannot be prevented because if someone’s out to get you, they surely will.  Safety professionals have thought long about this and that’s probably one reason that ‘Acts of God’ usually are excluded from accidents and that most ‘safety definitions’ of accidents (check a popular one on Wikipedia) describe them as “unintended” events thus excluding terrorism, sabotage and the like and sending these events over to the realm of security.

After some thought I’ve come to the conclusion that the often-heard mantra of “All Accidents Are Preventable” is true only if we add a couple of words.  Let’s discuss some good candidates:

All Accidents Are Preventable…

…In Theory

What theory that would be, I’m actually quite unsure about.  But some safety academics seem to think so.  By the way, the label “academics” here is meant in the sense of “safety professionals living in ivory towers with little or no relation to reality”, not in the ordinary dictionary meaning of people involved in higher education or safety research.

Actually, the term ‘theory’ is not used here as “a contemplative and rational type of abstract or generalizing thinking, or the results of such thinking” (the Wikipedia definition), nor as a “generalized explanation of how nature works”.  Rather, we use ‘theory’ as the opposite of ‘practice’, and one might even see it as a synonym for ‘dream’, ‘vision’ or even ‘delusion’.

…In Hindsight

In real life and at the sharp end, decisions are usually made under difficult circumstances: a lot of uncertainty, limited knowledge and time pressure.  Mostly we manage very well, but sometimes the outcome of our decisions isn’t quite what we expected or hoped for.  We did our best, but things turned out otherwise because we did the wrong thing for circumstances we didn’t fully grasp at the time.  As a result of all of this, an accident happens.

These difficulties and limitations are conspicuously absent after the fact.  Then one suddenly has a full overview of the circumstances, there is plenty of time to reflect and contemplate, gather additional information (preferably to confirm a hypothesis) and, best of all, the outcomes of the decisions made are known, so no uncertainty at all!

We do have blind spots in real life, and so do organizations.  In hindsight we seemingly don’t suffer from this.  Of course there are still blind spots but at least we now see the things that went wrong, which are the things that we should have seen before, according to everyone pointing their fingers afterwards.

…Given unlimited knowledge, resources, perfect prediction (and quite some luck)

This is the best of the contextual candidates.  If we didn’t have those annoying limitations discussed before; if we just knew everything with enormous certainty and precision, including the results of our actions and decisions; if we had unlimited resources to remove all hazards: then truly no accident would ever happen.

But how realistic is that scenario?  People have limited knowledge and limited resources (time, money, etc.), yet must face exactly these problems.  Anyone who has experienced otherwise should really share that experience with us mere mortals.  It must have been a really boring experience, by the way.  So where does that leave us who are living out there in the real world?

Let’s just face it: we cannot prevent everything.  Let’s be very realistic about that.  We don’t even want to prevent absolutely everything – some things we can just live with (the proverbial finger cut when loading paper into the printer being just one example).  This is clearly one reason that the ‘reasonably’ criterion is found in so much safety and OHS legislation.

Mind you, this is not an argument out of fatalism! We cannot prevent everything, but that doesn’t take away the responsibility to try as hard as we can within reasonable boundaries.

Allow me to quote Prof. James Reason from the conclusion of his fine 2008 book “The Human Contribution”:

Safety is a guerrilla war that you will probably lose (since entropy gets us all in the end), but you can still do the best you can.

Let’s take these wise words to heart and get on with it.  Maybe we cannot prevent all accidents, but we can prevent a substantial share of them if we are willing to work systematically and structurally.  Hopefully we’ll succeed in preventing the most important ones. Good luck!

 

Biography

Carsten Busch studied Mechanical Engineering and, after that, Safety. He also spent some time at Law School. He has over 20 years of HSEQ experience from various railway and oil & gas related companies in The Netherlands, the United Kingdom and Norway. These days he works as Senior Advisor Safety and Quality for Jernbaneverket’s infrastructure division and is owner/founder of www.mindtherisk.com.

 

 

May 18

The Blind Spots of Behavioral Observation Programs

Behavioral observation programs are a mainstay in many safety systems that are looking to move beyond compliance and get employees involved. The idea is pretty straightforward – have employees observe other employees doing job tasks. The observers then judge whether the behavior is “safe” or “unsafe” and provide immediate feedback to the employees who did the tasks. You seem to accomplish a lot with a program such as this, including:

  • Immediate and specific feedback to employees for “unsafe” behaviors, which enhances learning;
  • Employees get involved in the process and take ownership of safety at the site; and,
  • You get another feedback loop that you can use to identify exposures and risks at the site (you can also use it as a handy metric).

This sounds like a panacea for all your safety performance needs. So what’s the problem?

Well, the problem with most behavioral observation programs is that they don’t account for some blind spots that the programs tend to have, both practical and foundational.

Let’s start with the practical. When it comes to identifying “safe” and “unsafe” behaviors, your employees are far more likely to identify obvious “unsafe” behaviors that lead to smaller accidents than they are to identify the less obvious behaviors that are more of a grey area and, coincidentally, are more associated with serious injuries and disasters. So, for example, behavioral observation programs are very good at identifying whether or not employees are using the required PPE for a given task. However, these programs are not very good at identifying whether technical procedures that are only indirectly related to safety are being followed, or even whether those procedures are adequate for the reality the employees are facing. In cases where deviance from procedures is normalized you might have employees note a given task as “safe” because that’s the way the job is normally done, without realizing the risks involved. So the program provides an unreliable data source, causing you to think that your system is “safe” when, in reality, you are drifting toward danger.

The bottom line from a practical perspective – behavior observation works for obvious behaviors. If “safe” and “unsafe” behaviors are not as obvious though then the behavior observation program may be a false indicator.

This leads to the foundational blind spot of behavior observation programs – the programs tend to assume that behavior is either “safe” or “unsafe.” This is categorically false. Behavior is inherently tied to context: almost any behavior you can think of is safe in one context and unsafe in another. Even the proverbial safety “no-no,” running with scissors, is sometimes the right thing to do (medical professionals run with scissors all the time in emergency situations).

Now it may be possible to identify a behavior that is always unsafe (using some definition of “safe” and “unsafe”), no matter what the context. But that’s not the point. If we really have to think hard to find something that’s always an unsafe behavior, is the idea that behavior is either “safe” or “unsafe” a really useful concept?

What if instead of a behavioral observation program we just had a performance observation program? Instead of judging whether the employee is doing things right or wrong, we just observe and try to understand how employees are doing work. Then we ask questions (not just about the things we think they did wrong!), listen to stories, and try to find the best way to do the job in the context in which the job is to be done. With a rich understanding of the reality the employees at the sharp end face, instead of telling them that what they are doing is wrong, we give them the tools (equipment, knowledge, time, etc.) they need to learn to adapt their behaviors to the contexts they face. We move past the obvious things and get to the real story of how work is performed in the organization. We move from a place of judgment to a place of cooperation. Then we not only get the basic advantages of traditional behavior observation programs noted above, we also eliminate the blind spots and build a foundation of trust between ourselves and the real source of safety in our organizations – our workers.

Biography

Ron Gantt is Vice President of Safety Compliance Management. He has over a decade of experience in safety and health management. Ron is a Certified Safety Professional, an Associate in Risk Management, and a Certified Environment, Safety and Health Trainer. He has a Master of Engineering degree in Advanced Safety Engineering and Management, as well as undergraduate degrees in Psychology and Occupational Safety and Health. Ron specializes in safety leadership, system safety, safety management systems, and human and organizational performance improvement.

May 02

The Myths of Safety – React If You Will

Sometimes when you critically examine and expose well-established myths you run the risk of having folks who believe the myths attack “other” issues around the revealed “truths.” Some of it even gets “personal.” In the following article (and many other articles I’ve written) I understand this “danger” and I’m more than willing to take that risk. I’m hardly claiming absolute correctness and knowledge of these issues, but I do believe we should, as a profession, examine what we believe and why we believe it.

Penn & Teller’s Showtime TV series is a perfect example. If you haven’t seen the series I highly recommend it. Not because I agree with everything they say but because what they do is challenge what they believe are myths and in some cases, lies. They encourage their viewers to think critically. The series is certainly not meant for the faint of heart. Their approach isn’t for everyone and it’s certainly an adult conversation with graphic language and at times has sexual content. This approach is used not to titillate but to be outrageous to get the viewer’s attention. I believe they accomplish these goals…get people’s attention and encourage viewers to critically think about what they believe is true. I’m not alone in enjoying their approach…they ran from 2003 – 2010. Check it out and keep your minds open; some of it is uncomfortable to watch.

Critical thinking is essential if we are to successfully help our fellow humans work and play safely. Our agreement is not. In fact we may learn more if we don’t agree. So let’s examine some of what I believe are the most prevalent and dangerous myths in the world of Safety Management. You may agree, you may disagree, and perhaps the real positive is that we’re at least examining what we believe to be true.

The Myths of Safety

1) Safety is #1

Some companies and professionals have adopted this “chant” as the ultimate statement of commitment to creating safety at their companies. The real issue is that your corporation is NOT created to be safe. The owners and shareholders have invested their money to make a profit, provide a service and/or to create a product. This is the reason for a corporation’s existence. How we accomplish this is indeed important. Doing it safely while being environmentally friendly, a good corporate citizen, ethical and legal is the real measurement of success. We need not number the priorities. They need to happen ALL AT THE SAME TIME! Safe Production is and should be the goal.

2) Counting Injuries is a Measurement of Safety

We’ve all done unsafe things and not felt the consequences of our unsafe behaviours. Standing on a chair, using a grinder without safety glasses, and using a knife as a screwdriver are all examples of common unsafe behaviours we have done. That being the reality, unsafe/safe and injury/uninjured are NOT linked. AT ALL. If no injuries meant we had been safe, we would have a great deal of evidence available to us to support that statement, right? Then consider how often you have done something unsafe and yet avoided injury. In those cases, no injury was the outcome, but it was not achieved by being safe.

3) Zero Harm/Injuries is a Commitment We MUST Believe In and Commit To

Thinking that something can’t be accomplished without believing in it is simply NOT true. The opposite is also NOT true. If this “faith” in something were the secret to accomplishment, then believing in unicorns would have made them appear. Companies that have mistakenly linked Zero to some measure of safety are in error. No injuries can, and in many cases does, mean you were lucky. These types of goals also motivate some very wrong behaviours, like hiding injuries through reclassification and accommodation.

4) Passing a Safety Audit Means You’ll Be Safe

There are many well-intentioned standards and audits available in safety management. Many of them have impressive names with very long numbers attached to them. Some are international and have been created with “world-wide” input, making them sound even more impressive. The reality is that most, if not all, are “opinion-based” documents with little or no REAL evidence that they reveal any “secrets to success.” Groups of well-intentioned experts get together in a room and GUESS what they believe will work. Some of it is indeed highly intuitive and very much linked to good management practices. The problem comes with combining these “statements of intention” into something that, if you PASS, puts you on the road to success. As stated, there is little or no independent evidence that any of these work. In fact there is much evidence that they don’t. Passing an audit does nothing but state that you’ve “passed the audit.”

Most, if not all, of the popular audit instruments were created by well-meaning groups of people and are not based on any scientific evidence. Now, most of the questions in these audits are likely to be positives to your company outcomes but let’s examine a typical example question.

“Does your company have a signed Health & Safety Policy?” Arguably a good way to communicate your company’s intentions regarding the management of H&S. The problem is the score. What is it worth? What are other questions in the audit worth toward your passing mark? Have they been tested using control-group companies, comparing outcome measurements with inputs? If the scientific method has not been used to validate the audit, we have to admit that we are just guessing. Some very unsafe companies can and do pass audits. If that is true, then the audit process is flawed. I’m not suggesting you abandon your audits. I am suggesting you read the results with a clear view of what the audit score may not be telling you about your safety management system.

5) All Incidents Are Preventable

What a beautiful idea… that ALL pain and suffering can be eliminated. The problem is that to prevent every incident, we would need absolute power over all things and absolute insight into the future consequences of our decisions. Absolutes in human experience don’t exist, so the use of the word ALL makes the statement wrong without even considering what it takes to prevent incidents. See above for the opposite end of the impossible scale (read: ZERO).

6) Safety Can Be Done TO People

As in most human experiences, “the few controlling the many” has a predictably poor outcome in the short and/or long term. The idea that safety can be delivered like a pizza to passive workers who will just take our orders and comply is overly optimistic and, frankly, just the wildest of fantasies. So re-examine the orientation DVD you’ve created and realize that blasting passive workers with tons of information in a 20-minute DVD is not likely to have much of a long-term impact on their safety. “Too much, too soon” comes to mind. It seems an efficient way to orient new employees, yet all too often we fail to really measure their retention and their resulting behaviours as an outcome of this “training.”

7) If We Make Non-Safety Illegal We’ll Reduce Unsafe Behaviour

At no time in human history have we eliminated undesired human behaviour through a “crime and punishment” approach. It hasn’t worked on our roadways with speeding, and the newest illegal act, distracted driving, will hardly be eliminated by making rules and randomly (at best) pouncing on violations. Human behaviour does work within a system of Activators, Behaviours and Consequences. This is indeed a complex area and has been, and continues to be, studied continually. What we do know with some certainty is that random consequences are not very effective in changing behaviour. Being “caught” by some authority and then feeling the negative consequence of a fine can motivate some. It is the full range of consequences that provides very real motivators to support safe behaviours and helps take away the motivation for unsafe ones.

What we really know about consequences is that positive consequences are much more influential and effective than negative ones. Focusing on the positive makes people WANT to get caught doing the safe behaviour. A focus on negative enforcement only makes people want to avoid being caught doing the prohibited. Is that really what we want… people avoiding punishment? A current example is how many “texters” are now trying to hide their “illegal” behaviour by texting while they drive with their phones in their lap, out of sight of the enforcers. A win? Hardly. In all likelihood we’ve actually made it worse.

 

Well, there you have it: some of the most popular “myths” in safety management. You certainly don’t need to agree with what I’ve presented here, but you do need to examine (as we all do) what we believe and why we believe it. As always, I’m open to new ideas and views on these subjects… it’s what true professionals do.

Biography

Alan D. Quilley is the author of The Emperor Has No Hard Hat – Achieving REAL Safety Results and Creating & Maintaining a Practical Based Safety Culture©. He is president of Safety Results Ltd., a Sherwood Park, Alberta OH&S consulting company (http://www.safetyresults.ca/). You can reach him at aquilley@safetyresults.ca.

Apr 17

Safety Observer Training Done Right

All employees are openly encouraged to report hazards when they are discovered.  In many organizations this is a basic tenet, often included in the duties and responsibilities of each employee.  Some organizations take this one step further and use worksite safety observations as an activity that provides meaningful employee involvement in the safety program.  This is commendable and even encouraged, but certain requirements must be met in order to provide a beneficial experience for both the company and the employees.  First, the employees must have proper safety observer training.  Second, the employees must have a way to report their findings efficiently and effectively, preferably also having stop-work authority to engage with the observed party and work toward a safe outcome.

According to OSHA, a “competent person” is defined as “one who is capable of identifying existing and predictable hazards in the surroundings or working conditions which are unsanitary, hazardous, or dangerous to employees, and who has authorization to take prompt corrective measures to eliminate them”.  I would say that each person tasked with performing worksite safety observations must meet this definition to some extent.  To become a competent person, an employee can obtain the capability through a combination of training and experience.  The authorization must be established between the company and the employee and should also involve tools and techniques to positively intercede.

This leads me to a recent conversation I had with a safety director on this very subject. Here is the gist of the exchange (Safety Director = SD; Myself = CU). (See Safety Competency - Image 1.)

Upon looking at the data, we saw that the PPE category comprised over 35% of the observations. We also found that about 50% of all inspections documented no hazards. (See Safety Competency - Image 2a.)

We went back to the data and saw that there were very few observations in these areas.  In addition, the checklists used for these critical areas were insufficient and didn’t really incorporate lessons learned and contributing factors discovered during the injury assessment process. (See Safety Competency - Image 3.)

We both agreed that the current way of doing things could definitely improve.  There was a broad assumption that the training provided was sufficient and met the safety observer training objective, yet the data did not support this.  I then began to explain how we did this in the military.  When I served in the United States Navy, we used a Personnel Qualification System (PQS) program.  The Naval Education and Training Command describes the program as follows:

The PQS is a qualification system for everyone where certification of a minimum level of competency is required prior to qualifying to perform specific duties. A PQS is a compilation of the minimum knowledge and skills that an individual must demonstrate in order to qualify to stand watches or perform other specific routine duties necessary for the safety, security or proper operation of a ship, aircraft or support system. The objective of PQS is to standardize and facilitate these qualifications.

Although the wording was a bit odd and we weren’t on a ship, we did feel the concept was sound and could be used.  We felt there were two primary components to this process. The first was knowledge (e.g., how to ride a bike).  The second was demonstration of that knowledge (e.g., actually riding the bike).  Here is the basic structure we used:

  1. We reviewed the basic structure of the system and felt the best way to roll this out would be to break this up by safety categories, such as PPE, Housekeeping, Hand & Power Tools, and Fall Protection.  There were several reasons for doing this.  First, the categories matched the observation checklists used.  Second, each category was focused enough so that training could be done in a relatively brief time.  Third, the knowledge was specific enough to the hazard, as opposed to a basic overview, such as from the OSHA training.
  2. Each category had both a knowledge and demonstration component identified.  This involved developing training methodology that would be used to impart the knowledge, such as a training session.  In addition, an activity was designed to demonstrate the knowledge, such as conducting a walkthrough with a safety professional or conducting a hands-on evaluation with tools and equipment.
  3. Training would also include how to approach and coach, to ensure observers positively interceded when they saw hazards and at-risk behavior rather than just documenting them.
  4. Each employee would have a ‘qual card’ developed to show their progress, category by category.
  5. Each employee was limited to conducting observations in the categories signed off on their ‘qual card’ (a minimal sketch of how such a card might be represented appears below).
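To make the ‘qual card’ idea concrete, here is a minimal sketch of how the category-by-category sign-off might be represented in code. Only the category names and the knowledge/demonstration split come from the description above; the class names, fields and example data are illustrative assumptions, not part of the original program.

    from dataclasses import dataclass, field

    @dataclass
    class CategoryQualification:
        """One entry on a qual card: a safety category with separate
        sign-offs for knowledge and for demonstration of that knowledge."""
        category: str                            # e.g. "PPE", "Fall Protection"
        knowledge_signed_off: bool = False       # e.g. completed the training session
        demonstration_signed_off: bool = False   # e.g. completed the walkthrough or hands-on evaluation

        @property
        def qualified(self) -> bool:
            # Both components must be signed off before the employee
            # may conduct observations in this category.
            return self.knowledge_signed_off and self.demonstration_signed_off

    @dataclass
    class QualCard:
        employee: str
        categories: dict = field(default_factory=dict)

        def add_category(self, name: str) -> None:
            self.categories[name] = CategoryQualification(category=name)

        def may_observe(self, category: str) -> bool:
            qual = self.categories.get(category)
            return qual is not None and qual.qualified

    # Example: an employee signed off for PPE but not yet for Fall Protection.
    card = QualCard(employee="J. Smith")
    for name in ("PPE", "Housekeeping", "Hand & Power Tools", "Fall Protection"):
        card.add_category(name)
    card.categories["PPE"].knowledge_signed_off = True
    card.categories["PPE"].demonstration_signed_off = True
    print(card.may_observe("PPE"))              # True
    print(card.may_observe("Fall Protection"))  # False

The point of the structure is simply that qualification is tracked per category and requires both components, mirroring the roll-out described above.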

After rolling out this concept and implementing it, there were a few hurdles, such as finding the time for individual attention to each employee.  This was made more manageable by targeting the employees with the greatest need based on quality indicators, such as a high frequency of 100% safe inspections, a high frequency of PPE observations, and low participation overall.

The process was done in such a manner as to make it manageable (aka eating the elephant one bite at a time).   The benefits turned out to be quite numerous such as defining clear expectations, confidence to participate and intercede, and increased communication.

Employee involvement is a vital part of any safety management system.  For the involvement to be useful, it must be meaningful and mutually beneficial for the employee and the company.  Structuring a program that defines the purpose, communicates it respectfully, and provides the tools necessary to fulfill the obligation is what is needed to achieve this benefit.


Apr 04

Capturing and Using Leading Safety Metrics

Introduction

As safety professionals we collect data.  It doesn’t make a difference if your focus is general safety, occupational hygiene or a combination of the two.  We perform safety observations, collect air samples and perform some analysis on the data to make inferences on potential hazards.  Wouldn’t it be nice if we could use data to predict the future?

Data Collection can predict what?

Collecting the right data points can help in forecasting potential exposures that can then be prioritized and an effective elimination or control mechanism developed.  The first thing we have to determine is what data we need to obtain.

There are two categories of data that we hear discussed in the business world today: leading and lagging indicators (or metrics).  Leading indicators tend to be direct or indirect precursors of an incident, such as workplace conditions and behaviors.  Collecting this information provides the opportunity to implement preventive actions before an unwanted incident or injury.  Much like a coach, you manage your team as the game unfolds, trying to score.

Lagging indicators are historic.  OSHA recordable rates, DART rates and experience modification factors are like an opponent’s points on the scoreboard.  At the end of the game, it’s too late to change them.

Let’s say that you are the Safety Director for a large construction firm of 1,000 employees.  The safety committee has decided that using worksite observations as a leading indicator will allow you to predict exposures and develop control options.  Leading indicators such as the quality of worksite safety observations, the diversity of observers, and the actions taken on collected safety data have been shown to predict injuries.  For more insight and detail, view the white paper from Predictive Solutions.

Data Collection

Determine how the data will be collected before it is collected.  If supervisors or frontline workers are to be collecting information, then a simplified mechanism that allows for data collection and storage with minimal disruption in their work day will aid in obtaining their support.  This also provides for the collection of reliable and useable data.

Pre-determine how large a sample pool will be effective.  Collecting safety observations on a small construction crew of fewer than five people, three times a day, may not be advantageous.  Develop a strategy to ensure that contractors and crews with the most manpower, coupled with higher-risk activity, are afforded more observations than smaller crews with less at-risk activity.
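As a rough illustration of such a strategy, here is a minimal sketch that allocates a weekly observation budget in proportion to crew size multiplied by a simple risk weighting. The crew names, sizes, weights and budget are made-up assumptions for the example, not figures from the article.

    # Allocate a weekly budget of safety observations across crews in
    # proportion to crew size x a simple risk weight (illustrative numbers).
    crews = [
        {"name": "Steel erection", "size": 25, "risk_weight": 3.0},
        {"name": "Excavation",     "size": 12, "risk_weight": 2.5},
        {"name": "Finishing",      "size": 40, "risk_weight": 1.0},
        {"name": "Site office",    "size": 4,  "risk_weight": 0.5},
    ]

    weekly_budget = 60  # total observations the team can realistically perform

    total_weight = sum(c["size"] * c["risk_weight"] for c in crews)
    for c in crews:
        share = (c["size"] * c["risk_weight"]) / total_weight
        # Round, but never drop a crew to zero observations entirely.
        c["planned_observations"] = max(1, round(share * weekly_budget))
        print(f'{c["name"]:15}: {c["planned_observations"]} observations/week')

Any weighting scheme would do; the point is that the allocation is decided before collection starts, so larger crews doing higher-risk work get proportionally more attention.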

How should the data be captured and analyzed?

As leading indicators are collected, there must be a plan in place to use them beyond the immediate activity.  Simply observing and correcting is known as the ‘whack-a-mole’ approach and doesn’t promote safety, because the same things can pop up over and over without effective resolution.  For example, a police officer pulls a car over for speeding and gives a warning.  Does this stop the at-risk behavior – in this case, speeding?  How do you know?  Can the observations be tracked – both positive and negative – to establish a tendency or trend?  Following an action plan to address a trend, can the observations support a positive shift? A single instance of finding and correcting an at-risk behavior or condition is but the first step.  The goal is to establish overall tendencies against the expected outcome over time; prioritizing the undesired trends is then the next logical step.
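As a minimal sketch of what tracking such a tendency could look like, the snippet below computes the at-risk share of observations per week and compares recent weeks against an earlier baseline. The weekly counts and the 20% thresholds are made-up assumptions for illustration.

    from statistics import mean

    # Weekly observation data: (total observations, at-risk findings) - illustrative numbers.
    weeks = [(120, 18), (110, 15), (130, 22), (125, 24), (115, 26), (118, 30)]

    rates = [at_risk / total for total, at_risk in weeks]

    baseline = mean(rates[:3])   # earlier weeks
    recent = mean(rates[-3:])    # most recent weeks

    print(f"Baseline at-risk rate: {baseline:.1%}")
    print(f"Recent at-risk rate:   {recent:.1%}")

    if recent > baseline * 1.2:
        print("At-risk rate is trending up - prioritize this category and act.")
    elif recent < baseline * 0.8:
        print("At-risk rate is trending down - the action plan appears to be working.")
    else:
        print("No clear shift yet - keep observing and re-check next period.")

The same comparison can be run per category (PPE, housekeeping, fall protection, and so on) to decide which undesired trend to prioritize first.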

In addition to acting on the data, the findings and the resolutions must be communicated.  Providing feedback to observers on action items resulting from the observations is crucial.  This way they understand their efforts are being heard and acted upon.  Coaching observers on good quality observations is also vital so that management is confident enough to act on the data obtained.

Conclusion

Unlike lagging indicators, which measure a process purely on failures, leading safety metrics and indicators can measure a process on accomplishments.  Developing a sound strategy for what is being done to achieve safety is much more effective than hoping and praying that no injuries occur.

 

Biography

Paul Watson has over 27 years of experience as an occupational health and safety consultant.  He has worked in environmental contracting, the nuclear industry and the private sector.  As a Senior Industrial Hygienist with the Center for Toxicology and Environmental Health (CTEH), Mr. Watson participates as a member of the Industrial Hygiene (IH) group and manages CTEH’s Northeast IH Operations. He oversees projects and performs industrial hygiene activities including qualitative and quantitative IH/safety surveys, air sampling/monitoring, and indoor air quality surveys. He leads a team of IHs in performing air sampling, noise surveys, review of SDSs and preparation and/or review of site-specific health and safety plans.  He is primarily responsible for providing data management and evaluation support to CTEH project managers in the areas of industrial hygiene, toxicology, litigation support, risk assessment, and emergency response.

Mar 11

Management Support Is Essential for Safety – But What Is It?

Management support: we all say we want it, need it, and can’t do our jobs without it.  Saying that management support is essential for safety “success” has in fact become something of a safety profession mantra.  A majority of safety professionals, 51.2% according to a 2002 ASSE survey (Kendrick and Pater), nevertheless don’t believe they receive that support. But what do we mean by management support?  What specifically should we want our managers to do in support of safety?  Is asking for support even the right question?  As a staff/support function, shouldn’t safety professionals really be asking what they can do to support management?  If we really want support for safety, not just for ourselves, we must also know what to ask for.

The “Wrong” Support

Over the years I’ve come to believe that one of the principal reasons management fails to support safety is that we (the safety profession) far too often ask them to support ineffective, and sometimes counterproductive, practices.  Some safety professionals believe, for example, that management must rigidly enforce the safety rules and procedures with punitive methods that kill employee trust and cooperation.   Others cite the importance of management in financing and paying lip service to their flavor-of-the-month, off-the-shelf quick fixes and “silver bullets.”   Perhaps worst of all, some safety professionals seem to view management support as firm backing for their attempts to run the entire organizational safety effort with management and the workers as idle bystanders.  Support that enables the abdication of management from the safety effort is not what you should want.

The “Right” Support

Rather than attempting to manage safety for them, we should want and expect our management to be good managers – of safety!  It’s not enough merely writing memos and speeches for managers to deliver.  Safety professionals need to help management actively drive the safety effort like other important organizational objectives (e.g., production, quality, schedule, costs).  Most managers got to their positions because they were successful at getting things done.  Safety professionals should encourage managers to use the same skills that got them recognized as effective leaders for the safety effort.   Why manage safety differently than other important organizational objectives?

So What Specifically Should We Ask Our Management to Do?

After 40 years of observing and assessing both successful and unsuccessful safety efforts, I’ve concluded that we need only three things from our management to attain and sustain safety excellence.  Here’s the support for safety I want from management and what I tell managers anytime I get a chance.

1.  Own safety.  Line management safety ownership cannot be delegated and must be demonstrated.  Don’t attempt to farm it out to safety specialists, consultants or employee committees.  Only you (line management) can make safety an organizational value and part of the culture.  Maximize your resources, including your safety staff, the management team and workers, to help you succeed but stay actively and visibly involved.  Recognize that just as you own production you also own how that production is achieved.  Production, quality, cost and safety, are all your responsibilities. Safety problems are your problems. Just telling employees to work safely is not enough.  Get out of the office and see what your workers are doing.  Use these work observations to partner with your employees to identify ways you can work together to help perform work more safely.

For greater detail on the concept of Safety Management by Walking Around, see these articles:

Many high safety performance companies believe these on-the-floor, face-to-face employee interactions are the single most important action managers can take to promote safety (Thomen, 1991). Nothing you do will pay a bigger dividend than your visible good example and commitment.  It’s simply not realistic to expect employees to take safety more seriously than you show them you do.   Finally, be very skeptical of any quick fix safety solution, especially if it takes safety management out of your hands or requires you to handle safety differently than your other top business priorities.

2.  Manage safety like it’s important.  Make sure you have integrated safety into every aspect of your business from design and procurement to facility shutdowns.  If you don’t build safety into your business functions, you’ll later find safety in competition with them.  Like quality, safety is merely part of the work process that is your ultimate responsibility.  Don’t let it get separated.

Ensure that you and your management team meet routinely to hold yourselves accountable and to personally discuss (and not just listen to the safety manager) safety issues and progress – like you do for other important business objectives.  Ensure timely and appropriate corrective actions are taken – and that they are effective.  Your employees expect and deserve prompt attention to their concerns and suggestions for improvement.  In short, expect and lead continuous safety improvement.

If you and your other line managers aren’t already leading the safety effort with active participation, improvement is not going to happen overnight.  The point is to get started and don’t stop.  First you’ll need an effective PDCA approach to safety (ANSI Z10-2012 is a useful guide) that is committed to continuous improvement, and the will to make it happen.  You may well find that you just need to work smarter rather than harder.

3.  Get Your Employees Involved.  Although safety is ultimately your responsibility, you can’t manage it by yourself.  I have not seen top safety performance in any organization that did not have active and widespread engagement of the workforce in the effort.  Top safety performers recognized years ago that employees aren’t the safety problem; they are an important part of the safety solution.  These companies engage their workforce in a variety of meaningful safety activities.  They expect, and get, the large majority of their safety input (i.e., opportunities for improvement) from their workers.  They actively solicit and respond promptly to this input because they know it gets results.  Employees are given genuine opportunities to influence their own safety by helping design their work environment, policies and work procedures.  This adult-to-adult engagement clearly demonstrates to the workforce that they are respected and taken seriously.  As a result they are much more likely to work safely – and more productively.

Conclusion

Every employee wants and deserves the support of his or her management.  Safety professionals are no different.  We all want respect, decent remuneration and adequate resources to accomplish our assigned tasks.  It is also true that not all managers are created equal and we don’t always get the managers we would want.  Certainly not everyone in a management position is an effective manager of anything, including safety.  If your management believes that safety is your responsibility – not theirs – you’ve got your work cut out for you.  Regardless of your management, however, the main role of the safety function should be to provide the best possible guidance (i.e., support) to line managers who alone possess the responsibility and capability to achieve high performance in safety.  Safety professionals need to stop trying to do all things safety and instead use their talents, expertise and good judgment to support management in doing the right thing.

References

ANSI Z10-2012, American National Standard – Occupational Health and Safety Management Systems.

Kendrick, James and Pater, Robert. 2006. “The Future of Safety Leadership.” Presented at the 2006 ASSE Professional Development Conference, Orlando, FL.

Thomen, J. R. (1991). Leadership in Safety Management. New York: Wiley.

Biography 

Mr. Loud’s (jjl7280@aol.com) over 40 years of safety experience includes 15 years with the Tennessee Valley Authority (TVA), where he served as the supervisor of Safety and Loss Control for a large commercial nuclear facility and later as manager of the corporate nuclear safety oversight body for all three of TVA’s nuclear sites.  At Los Alamos National Laboratory he headed the independent assessment organization responsible for safety, health, environmental protection, and security oversight of all Laboratory operations.  Mr. Loud is a regular presenter at national and international safety conferences.  He is the author of numerous papers and articles.  Mr. Loud is a Certified Safety Professional (CSP) and a retired Certified Hazardous Materials Manager (CHMM).  He holds a BBA from the University of Memphis, an MS in Environmental Science from the University of Oklahoma and an MPH in Occupational Health and Safety from the University of Tennessee.

Feb 24

Construction Safety: Stopping Killing Conditions

Safety professionals are often perceived as alarmists. When you stop a project because a crane is just “a bit off level” or isolate a work area because the rebar is not capped, you can be seen as “just a bit too careful.” The common retort from those overseeing the work you stopped is, “Really? What are the odds of that happening?”

The following is a discussion of the need to reset how inspectors look at a hazard, based on fact: if the condition has killed in the past, it’s a “killing condition” allowed by a system that is “creeping” from what’s right to what’s allowed. More on that later. For many reasons, we are reluctant to admit this tendency and, in turn, people get injured or killed by the killing conditions we allow. Why are we reluctant to step up? I suggest several reasons:

  1. Those in charge of operations allow hazards to exist. Their goal is production. Whether in a manufacturing plant or building a high-rise, the responsibility of the operations team is to make widgets to sell or spaces to rent. Anything that slows down that process, such as quality or safety control, is an obstacle. Those in charge do not see the possible outcome; safety inspectors must recognize that, when they find a hazard, no accident has yet occurred, which is why many in charge can’t understand the risk.
  2. Another reason is personal acceptance of hazards we all possess, shaped by years of encountering the same condition without harm. This is the fellow who climbs his roof each fall to clean out his chimney with nothing but experience between him and the driveway.
  3. We may have had a “close call” or been injured from a similar killing condition, but we were not killed. Think repeat drunk drivers, skiing without a helmet, and that taped cord in the garage missing its ground.
  4. Hazards are accepted and often embraced by professionals. Consider the ironworker who refuses to tie off at 28 feet above a killing surface because “it takes away my manhood.” This remains the classic example.
  5. Another reason is that rules allow a killing condition. For example, refer to the archaic Occupational Safety and Health Administration regulation allowing ironworkers to work 28 feet above a killing surface. (See above.)
  6. The need to keep your job is yet another reason. When your boss tells you to jump into an 8-foot excavation, you are reluctant to say, “Fat chance, boss man” if you need to feed your kids.

System Creep

When looking at contributors that allow killing conditions on your project, you don’t need to look far. Over the years, all safety systems will creep from what is right to what is allowed. This was recognized as a contributor to the space shuttle Columbia disaster. Although there were incidents (foam routinely striking the orbiter), the launches went okay until they didn’t. That anomaly was accepted—foam strikes on the shuttle continued, and that became the new normal. System creep.

Recently, I was traveling down the New York State Thruway, and a crew was clearing trees and brush from along the edge of the highway. Some of the trees were only about 20 feet from the travel lanes, while the original fence line was 40 feet from the edge of the road. Over the years, a tree likely grew on the traveling side of the fence, so they mowed around that one, then another grew next to it. They mowed around that one, and the forest crept closer to the lanes. Before long, the trees were too hazardous and too close to the cars. That system needed to be reset by cutting the trees back. That is a good example of system creep. In construction, that is easily recognized as a messy site.

A second contributor is a hazard that is unrecognized or does not cause an incident until late in the game. At that point, it is discounted, or, when the hazard kills, it is considered a rare occurrence. Consider this dated but great example from California. The bottom line exemplifies our tendency to study rather than accept the obvious.

SAN FRANCISCO, December 22, 1997 – A 2-year-old girl stumbled while walking on the Golden Gate Bridge with her family yesterday, plunging through a narrow gap 167 feet to her death on the ground below.

The girl, identified as Gauri Govil of Fremont, fell through a 9 1/2-inch space between the sidewalk and the traffic lanes. The gap runs along a metal barrier that separates the sidewalk from the roadway, and is barely visible to pedestrians.

Although more than 1,200 people have jumped to their deaths from the world-famous span since it opened in 1937, bridge officials last night could not recall a similar accident. “This has not been viewed as a risk for children to fall through,” said Mervin Giacomini, chief engineer for the Golden Gate Bridge Highway and Transportation District.

“We will certainly be looking at that space in a new perspective. If there is a potential for accident, we will take whatever action is necessary.”

“Geez, TJ, if you drink too much water, that will kill you. Where do you draw the line?” Safety professionals often hear such chatter, for we are often viewed as an obstacle and barrier to the good people who build our buildings. Our role is typically unseen. In most cases, our success is measured when nothing happens. In the book, The Black Swan: The Impact of the Highly Improbable, Nassim Nicholas Taleb helps explain the frustrations of why workers can get killed on a safe site (randomness) and provides a compelling observation on those who avoid wars and hazards—and why they get the short end of the stick.

It is the same logic we saw earlier with the value of what we do not know; everybody knows that you need more prevention than treatment, but few reward acts of prevention. We glorify those who left their names in history books at the expense of those contributors about whom our books are silent. We humans are not just a superficial race (this may be curable to some extent); we are a very unfair one.

The tendency to overlook or soften our views of a hazard during inspections is ours (the inspectors’) alone. Everyone sets his or her limit of hazard perception and tolerance based on what he or she knows, has experienced, and fears. I do the same, and so do you.

Be Honest and Nice during Inspections

When having a correcting conversation as you inspect, try using real examples, be blunt, and be honest. Speak as you would to your son or daughter. Consider this scenario. You are approaching a crew rigging precast panels and need to tell the foreman to stop the pick, suspecting a poor strap. That requires some finesse—but do not let them make that pick. Here is how you might do this:

Just checking on rigging today: How many panels do you need to set? Well, no need to shut this down, guys, but let’s take a quick look … make sure you can keep this going smoothly.

You have identified both the hazard and the value of the inspection (keep production going), and you included the entire team in the learning (“let’s”). Plus, you never asked them to stop. But they will, for you have answered their question of “What’s in this for me?”

Highlight Killing Conditions

As you tour a work site, you will see conditions that have killed in the past—from cords missing a ground to scaffolds without rails. Finding these conditions is relatively easy, but getting the user to understand the threat is the challenge. Another example: You are inspecting a work site and see some scaffold similar to that shown in Figure 1.

Figure 1: Shoddy Scaffolding.

Can these conditions kill? Certainly. The material is good but the erection shoddy. One can see the chance to fall from the scaffold (no rails) but also the potential for someone to step on the Styrofoam on the top levels thinking there are planks underneath.

First, keep in mind that, although many competent persons are appointed, they may not be as competent as needed. Just the right person in the right place at the right time can result in disaster. The disaster is not his or her fault but is evidence of system creep. You can focus on what you see, but the goal is to avoid those killing conditions.

So, take the time to tie in some real-life examples as you discuss the need for some additional training. Let the workers on the planks know that you just read of a student slipping off some scaffolding, falling head first into an open barrel, where he died. Focus on “What are the odds?” This confirms to the user that this is indeed a killing condition. By doing so, the workers will understand how their work could kill someone. Bringing the news clipping along proves you care.

Seeing Is Believing

During a recent inspection, I noted an ironworker whose leg straps on his harness were extremely loose. As we watched him set a piece of steel and come back down in the scissor lift, I asked the crew who had gathered, “Did you guys ever see what happens to a guy when he falls, and his leg straps are loose?” No one knew the damage that habit could do to men only. I later brought back a photo of what had actually happened to someone else and left it with them. When we went by later in the day, each and every leg strap was tight. Teach by example.

Encountering a Killing Condition

When you spot a killing condition, your first role as an inspector is to immediately control it. That may require standing in the same spot for a few hours until the threat is gone, but never leave a killing condition without addressing it. If the power cord is bad, find the owner and take it out of service. If the scaffold is unsafe, get the people off and find the person in charge. Regardless of the pushback later, if what you see could kill someone, and you continue on—shame on you!

In the photo shown in Figure 2, some borings had been augered for goal posts. When these were discovered unprotected at the ends of the playing field, there was no immediate threat—except to me. But the field was also in a nice neighborhood surrounded by nice kids, and these holes had been left open for 2 days. So, I called the foreman over and stood by this particular boring until he gathered a crew and a machine and brought over some nice pipes that looked a lot like the goal posts. They were rigged and loosely placed in the hole, thus eliminating the killing condition. That threat had existed for 48 hours and was corrected in 30 minutes. The safety program clearly stated that no holes could be advanced until covers were staged and ready. Not done. System creep.

Figure 2: Open Boring.

As the corrections were being made, the foreman started the “What are the odds?” conversation, so I recounted what happened to Joe in Southern California, detailing how the late Joe could only take one breath as he fell into the boring and probably died holding his breath, for he could not exhale. That paints a picture. From that point on, the foreman and I had an understanding that lasted.

It is critical that safety professionals recognize and react, but it’s just as important that they move from telling people to providing a lesson and simple examples of similar conditions that actually killed someone.

Tirelessly Praise Elimination

The safety professional recognizes it is important to praise any progress or achievement no matter how small. Should you recognize that a crew has taken on the elimination of a hazard or brought one to your attention, make a big deal of that. Spotting a killing condition that has not killed—that is everyone’s gift.

Take the time to capture a photo of the worker who spotted a hazard. Write up a lesson learned or the best practice that resulted and post it for everyone to see. Recognition of what is done right and done well trumps discipline every time.

A Resource

My friends describe me as a storyteller, and that is accurate, for I have the ability to remember incidents and details that bolster most of my arguments with the “what are the odds” folks. When I read the article detailing how they found Joe Alamillo stuck in a hole, I filed that away. Joe was a father who did not just pass away—he was killed. That was not an accident but the result of system creep.

One of the best resources for finding examples of killing conditions is a summary of incidents compiled by Bryan Haywood. Bryan is one of the top safety professionals in the country and my “go-to” guy for questions on confined spaces, as that’s his expertise. Bryan publishes a roster of recent industrial, construction, and other accidents you can access at www.safteng.net. These are quick summaries and links to real cases where killing conditions were encountered. Though not everyone died from the condition, that was just luck. Tie one of these examples to what you see in the field and share these as near misses. Find a crew in a 6-foot trench “just for a few minutes”? Here are some examples from that website, which are provided at no cost every week or so:

CONSTRUCTION

  • EXCAVATOR ON BARGE FATALITY – Delta barge worker dies in accident (a construction worker, 49, was killed after a piece of equipment he was working on fell into the delta – he was operating an excavator from a barge around 1 p.m. when the machinery fell into the water – he was trapped inside the submerged excavator and was pronounced dead at the scene)
  • WORKZONE FATALITY – Construction worker killed on Selmon Expressway (a construction worker died after he was hit by a dump truck – he was working in a construction area of the expressway at about 4:44 a.m. when he was struck – he was taken to a hospital, where he later died)
  • TRUCK FATALITY – Construction Worker Killed In Nevada (a man was killed at a construction site – he was run over by a water truck and pronounced dead at the scene)
  • SCAFFOLDING COLLAPSE – 3 workers injured in BGC construction site accident (a structure that was supposed to ensure safety at a construction site instead sent three workers to the hospital – a heavy meshwork of steel and wire designed to protect pedestrians from falling debris gave way at the hotel worksite around 1:30 p.m. – three men suffered injuries mostly on their arms and legs and were brought to a nearby Medical Center – four other workers were then under the structure but were able to run away as it gradually fell to the ground)
  • TRENCH FATALITY – Man killed when trench collapses (workers were working in a six-foot trench and installing an electrical conduit when a wall of the trench collapsed and buried a worker, 35 – by the time he was uncovered by emergency personnel, he had passed away due to the injuries sustained in the collapse)
  • SCAFFOLDING COLLAPSE – Two men taken to hospital after scaffolding accident in Northampton (two men were taken to hospital after scaffolding collapsed – the men were both at the top of the scaffolding and fell from a height of up to 15 metres – one patient, a 55-year-old man, sustained serious chest injuries as well as a potential head injury – the second patient, 42, sustained injuries to his lower back)

Conclusion

As safety professionals, we have the ability and opportunity to help reset system creep and guide those allowing that creep. Often, we can help those in charge recognize the creep. Like the trees growing too close to the highway, it can be corrected. Work with the crews so they understand that the same hazard you are looking at has killed someone before, and then provide what happened as a lesson. The goal is to make someone so uncomfortable with an obvious hazard that he or she is forever compelled to correct it.

A safety professional’s success will be measured when those you have taught come back and show off what they have done. That is a system being reset.

Biography

Thomas Lyons, with his son, Andy

Thomas (TJ) Lyons, CSP, CRIS – Mr. Lyons is a safety professional living in Warwick, New York, working for Innovative Technical Solution Inc. of Walnut Creek, California. With a strong background in construction safety and industrial hygiene, he focuses on reducing or eliminating risk through proper planning and the implementation of best practices and lessons learned at the project level, with a focus on driving from risk management to risk elimination. Board certified as an Occupational Health and Safety Technologist and Certified Safety Professional, he is proud to have taken some of these skills to his local community. A past assistant chief, NY State adjutant fire instructor (hazardous materials) and EMT, he sees the need to drive safety from the field to the home as often as possible.

In 2001, Mr. Lyons was awarded the IRMI Gary E. Bird Horizon Award for his efforts in implementing the OSHA Voluntary Protection Program on the first construction project in the state of New York. He has presented at the IRMI Construction Risk Conference and is often found heading up a table at the Construction Café.

Mr. Lyons is a past chapter writer for the American Society for Testing and Materials (ASTM) and for the American Society of Safety Engineers (ASSE) Construction Safety Management and Engineering, Volume 1, and has penned a chapter on preplanning for the second edition currently in the works.

 

 

Feb
07

Gone Fish’n: Safety Measurements don’t have to be an Illusion

The measurement revolution began in my prime.  I was working with Industrial Engineers, a nice bunch, with the world-view that everything can be designed to run smoothly.  We were a part of the Quality Revolution.  I had just gone to see W. Edwards Deming for the third time, eager to hear his thoughts on measurement.

Deming was a statistician who, like me, saw the world in terms of variance.  We don’t look at absolutes, we look at differences.  Therefore, when viewing injury data I look for change over time, variation across work units, comparisons to the industry top quartile, and the like.  Within those groups of numbers lie quite interesting and actionable sources of variation.

When someone gets injured, many look at the investigation for answers as to what to change.  All well & good.  But what about the dozens of stories, the dozens of variations in human and machine behavior that, if measured, would have revealed a basis for action before the injury?  A good measurement system can save a world of hurt…literally.

But if you don’t understand variance, your view of the world blinds you to risk.  Deming said that one of the greatest threats to organizational quality (and safety, from my perspective) is “single data-point management” too often practiced by managers who don’t understand the concept of variability.  Unfortunately, in the safety world I see too many single data-point managers.
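To make the contrast with single data-point management concrete, here is a minimal sketch (the monthly counts and the c-chart limits are my illustration, not anything from the article) of looking at injury counts as a series with common-cause variation rather than reacting to any single month:

```python
# A minimal, illustrative sketch (monthly counts are made up, not from the
# article): treat injury counts as a series with common-cause variation and
# simple c-chart control limits, instead of reacting to any single month.
import math

monthly_injuries = [3, 1, 4, 2, 0, 5, 2, 3, 1, 9, 4, 2]  # hypothetical counts

c_bar = sum(monthly_injuries) / len(monthly_injuries)   # average monthly count
ucl = c_bar + 3 * math.sqrt(c_bar)                       # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))             # lower control limit

for month, count in enumerate(monthly_injuries, start=1):
    verdict = "SIGNAL - investigate" if (count > ucl or count < lcl) else "common-cause variation"
    print(f"Month {month:2d}: {count} injuries -> {verdict}")
```

Run on these made-up numbers, only the month that breaks out above the upper limit registers as a signal; every other month is ordinary variation, not a reason to react.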

I recall a story of a man in South Africa who was caught without his safety gloves by his supervisor, who immediately fired him based on that single data point.  The safety manager refused to accept the firing because the man had three young children.  Upon interviewing the man, he found out that the man had removed his gloves temporarily to clean his safety glasses for better visibility.

FISHING BADLY

I feel bad for my son because he likes fishing and I’m not very competent in the sport.  We go to the lake, he throws out his line and waits to hook that one fish of the unseen thousands.

Single data-point management resembles that type of fishing.  The manager, the supervisor, or you yourself have a list of measures consisting of outcome data (such as injuries) or instrumentation data (from machines or via other forms of technology), much like the kinds of catches collected on fishing trips.  Walk-arounds, too, are like fishing trips where you’re fishing for violations of rules.  You’ve got your line out and are trolling for rule-breaking behavior.

Measurement is used that way much too often.  You don’t pay attention to thousands of safe behaviors; you just hunt for violations.  As long as the measures come back clean, you chalk it up to an absence of bites.  So you continue looking for that instance in which the data gives you what you think is a clear danger signal, and you yank on the line to hook the violation.

Once, my sons and I went fishing with a guide on a chilly day in Florida.  The fish were not biting because it was so cold, so the guide threw some “chum” into the water (something like candy for fish), to draw them in, so my young sons could score some catches.  And, fortunately for our vacation, we did catch a couple of small hungry fish.

Sometimes when managers see injuries occurring but lack valid or accurate measurement systems capable of revealing root causes, they may begin throwing safety’s version of “chum.”  They walk around frequently, trolling and observing the smallest fish; that is, catching the smallest violations.

BUT… these apparent safety successes actually are an ILLUSION.

You are made to think you are doing something productive, when in fact you’re unintentionally promoting injuries and diminishing your managerial skills in the process.

THE ILLUSION

Think about it.  Early in your career as a lead, supervisor and/or manager, you probably didn’t start your new job thinking: “I’m going to be pissed off all the time and scold employees.”  Rather, you probably approached your assignment thinking: “I’m going to be the calm, understanding boss.”  The measurement illusion you picked up by chumming the waters during your fishing trips turned you into a scolding and ranting supervisor.

The reality is that each of your employees has good moments and bad moments.  All have moments of good safety performance, making an effort to keep themselves and others safe. At other times they may have moments when they take risks, intentionally or not.  Most of the time employees vary around an average set of behaviors that, for the most part, keep them safe and productive.

Now, as a manager who fails to pay attention to variance, you tend to overlook this average set of behaviors generally responsible for keeping employees safe.  What you noticed early in your leadership career on your “fishing expeditions” were the outliers in performance.

You noticed when a particular employee promoted safety in some exceptional way.  Perhaps, tired of waiting for an engineered fix, the employee built a guard on a piece of machinery that had worn down and presented a hazard.  You noted that behavior and made a clear effort to praise the employee, perhaps even recognizing that effort in front of the work team.


But you didn’t understand variation; you praised a single data point.  This exceptional performance was an outlier; not what that employee always did.  So the employee’s safety performance regressed back to the employee’s average set of behaviors.  You saw the employee’s safety performance drop and didn’t recognize this as a natural occurrence.  Perhaps the next day you see the employee take a shortcut by walking over some pipes.  You think to yourself “I just praised him and now he’s doing something risky.”  You become less likely to use praise in the future because you’ve just been punished for being positive.  But your interpretation is an illusion, because it was not your praise that caused the drop.  It happened naturally.

Now consider what happens when you go trolling for violations.  You search far and wide for risk, ignoring the average safe practices going on all around you; or worse, you think that if you praise safety, you’ll promote injuries.  Then you catch an employee who does something obviously risky.  Perhaps he had his safety glasses around his neck while engaging a machine.  He had taken them off to clean them and check some paperwork and forgot to put them back on.  So you yanked the line and caught him.  You were upset and disappointed in him and let him know it by reprimanding him.

Well, in the past, this employee probably had always worn his Personal Protective Equipment (PPE).  This lapse was a rarity.  In fact, after this incident, this employee wore his PPE consistently like he always had.  You may have perceived “I just scolded him and his safety performance increased”.  You become more likely to scold in the future because your use of scolding and/or discipline has just been reinforced.  But that is an illusion because your scolding did not produce the improvement; it happened naturally.


Your punitive management style is an insidious superstition because, slowly at first, it shaped your increasing use of negative consequences like scolding and discipline. Eventually your management behavior becomes like a fishing expedition, looking for opportunities to use unpleasant consequences, under the illusion that they work.  Sometimes that can shape you into quite the grumpy dude.  Even worse, you’re not stopping injuries.
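Both illusions are textbook regression to the mean. A small simulation (entirely hypothetical numbers, offered as my illustration rather than anything from Dr. Ludwig’s article) shows how a “drop” after praise and an “improvement” after scolding appear even when nothing the supervisor did had any effect:

```python
# Minimal sketch (purely hypothetical numbers): one employee's daily "percent
# safe behavior" is drawn from the same distribution every day (no praise, no
# scolding), yet the day after an extreme day looks like a change in behavior.
import random

random.seed(1)
mean, sd, days = 90.0, 5.0, 10_000
scores = [random.gauss(mean, sd) for _ in range(days)]

# what the next day looks like after an unusually good or unusually bad day
after_good = [scores[i + 1] for i in range(days - 1) if scores[i] > mean + sd]
after_bad = [scores[i + 1] for i in range(days - 1) if scores[i] < mean - sd]

print(f"Overall average:           {sum(scores) / days:.1f}")
print(f"Average after a great day: {sum(after_good) / len(after_good):.1f}  (looks like a 'drop' after praise)")
print(f"Average after a bad day:   {sum(after_bad) / len(after_bad):.1f}  (looks like an 'improvement' after scolding)")
```

The next-day averages simply slide back toward the overall mean, which is exactly what the praised and the scolded employee did in the stories above.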

THE DAMAGE

Fishing expeditions, and the measurement system you use for them, frighten employees.  This damages a safety culture.  Think about it from the employees’ perspective.  When they see you coming, they anticipate punishment.  When they see a measurement taken, they worry it will be used against them.  Nothing positive there!

Folks get nervous and distracted.  They hide their behaviors. Peers teach one another how to avoid getting caught. (I’d do that if I were a fish who could talk.)  The culture deteriorates.  Injuries increase.

Don’t be fooled by this illusion.  Instead, look for variation, discover its source, and manage it.  Better fisher folk than I use sonar in their boats.  Sonar bounces sound waves down toward the lake bottom.  Schools of fish reflect these waves back to the sensors and show up on the sonar screen, giving you a good idea of where fishing may be productive.

A good measurement system scans your operations objectively, looking for pockets of variation that may reflect hazards and risks.  When you find this variance, use your problem solving tools to cast your net over the potential problem and solve it… before harm results.

But you can only look at variation and its sources if employees are participating in safety measurements.  Outcome data such as injury rates are not sufficiently sensitive.  Additionally, your own walk-arounds probably reflect your personal biases and are confounded by your presence.  Instead, turn to your employees, who should be able to provide insight into the risky behaviors and conditions that set the stage for injuries.  They know the sources of variance; they can collect those data in a no-name/no-blame system of the type found to be successful in Behavioral Safety systems, and they can play a part in reducing that variance.
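As a rough illustration of scanning for pockets of variation in employee-collected data, here is a short sketch; the crew names, rates and the flagging rule are all invented for the example, not a prescribed method:

```python
# Rough illustration (crew names, rates and the flagging rule are made up):
# scan employee-collected, no-name/no-blame observation data for pockets of
# variation by work unit, then investigate the outliers before harm results.
from statistics import median

# at-risk observations per 100 observations, by work unit (hypothetical)
at_risk_rate = {
    "Crew A": 4.1, "Crew B": 3.8, "Crew C": 14.2,
    "Crew D": 5.0, "Crew E": 4.4, "Crew F": 3.5,
}

typical = median(at_risk_rate.values())  # a simple, robust baseline

for unit, rate in sorted(at_risk_rate.items(), key=lambda kv: -kv[1]):
    flag = "  <-- pocket of variation: look for the source" if rate > 2 * typical else ""
    print(f"{unit}: {rate:4.1f} at-risk per 100 observations{flag}")
```

The point is not the particular threshold; it is that the data, not your presence on a walk-around, points you to where the variance lives.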

After all, isn’t having the fish just jump in the boat a fisherman’s dream?

Biography

Timothy Ludwig’s website is Safety-Doc.com where you can read more safety culture stories and contribute your own.  Dr. Ludwig serves as a commissioner for Behavioral Safety Accreditation at the non-profit Cambridge Center for Behavioral Studies (CCBS: behavior.org) and teaches behavioral psychology at Appalachian State University, in Boone, NC.  If you want Tim to share his stories at your next safety event you can contact him at TimLudwig@Safety-Doc.com.

Jan
24

Accidents Cannot Be Prevented if Leading Indicators For Safety Are Ignored

We’ve all seen and read the articles about how data can help us improve production, quality, safety and environmental performance within the work environment.  With that said, how many of us have actually been close to a real life scenario where an incident happened and data was available, but wasn’t used?  For me, this story is unsettling and my fear is that it’s happening more often than we think.  My hope is that this article brings to light the value of using your data preventatively, in order to identify common risks which may be present every day.

From time to time, the Process and Technology Team at Predictive Solutions makes random contacts to our client base to discuss the status of their data and determine how they are tracking against their goals.  From an agenda perspective, this is a discussion which is focused on feedback and a review of the most common reports in the system.

This is a no frills, real life case where the data could have been used to prevent a serious injury.  Let’s review it and discover how we might avoid taking the same path.

Overview

This session began with a standard review of the overall process and a discussion about how the team was progressing to plan.  The feedback was good, although mention was made of an incident in which a subcontractor fell nine (9) feet.  During the fall, the worker sustained a head injury along with a fractured wrist.  When I hear something like this, I start to think about the data and what might have led us down a path to identify the leading indicators which, if recognized, might have helped to prevent this.  It’s important to note that this was a situation where they were not reviewing the data.  In talking with the client, I asked what they believed led to this, and they thought it was a planning issue.  This is a key point, as it will come up again within the context of this article.  Where would you start if you were looking for the key leading indicators?

The Data

The focus of this review was on the 90 days prior to the date of the incident, and the first report we reviewed was the Summary Report.  The Summary Report shows categorical information and exactly what was (and was not) observed over a period of time.  A sample Summary Report is shown below.

[Sample Summary Report]

Upon review, there were 34 at-risks identified in the category of fall protection and 32 in the category of PPE.  Sometimes we see an uptick in the at-risks collected for PPE and believe it’s related to an attempt to collect low-hanging fruit; while in many cases it is, you will see that this case was different.

The next report we reviewed was the Detail Report, which shows virtually all aspects of the observation and provides the most information available.  We saw that the non-mandatory “root cause” field, which is available for selection at the observation level, wasn’t always filled out, but of the ones that were, over 30% were chosen as “Not Following Plan.”

The third report we used to look into this further was the Summary Drill Down.  This is a function within the standard Summary Report which allows the user to look at categorical, project and contractor information along with specific details.  What we found when viewing by project was that the one with the highest number of unsafes for fall protection, by a factor of almost three, was the same project where this fall incident occurred.
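Once the observation records are exported, the same roll-up these reports provided can be approximated in a few lines. The sketch below is illustrative only; the field names and records are invented and are not the vendor’s actual report schema:

```python
# Illustrative only: the field names and records below are invented, not the
# vendor's actual report schema. The same 90-day roll-up the reports gave us
# (at-risks by category, the share of root causes marked "Not Following Plan",
# and fall-protection at-risks by project) takes a few lines once exported.
from collections import Counter

observations = [  # a handful of made-up records; a real extract would have thousands
    {"project": "Tower B", "category": "Fall Protection", "status": "at-risk", "root_cause": "Not Following Plan"},
    {"project": "Tower B", "category": "PPE",             "status": "at-risk", "root_cause": None},
    {"project": "Garage",  "category": "PPE",             "status": "safe",    "root_cause": None},
    {"project": "Tower B", "category": "Fall Protection", "status": "at-risk", "root_cause": "Not Following Plan"},
    {"project": "Garage",  "category": "Fall Protection", "status": "at-risk", "root_cause": "Training"},
]

at_risk = [o for o in observations if o["status"] == "at-risk"]

by_category = Counter(o["category"] for o in at_risk)
fall_by_project = Counter(o["project"] for o in at_risk if o["category"] == "Fall Protection")
causes = [o["root_cause"] for o in at_risk if o["root_cause"]]
share_not_following = causes.count("Not Following Plan") / len(causes) if causes else 0.0

print("At-risks by category:", dict(by_category))
print("Fall-protection at-risks by project:", dict(fall_by_project))
print(f"Root causes marked 'Not Following Plan': {share_not_following:.0%}")
```

The value is not in the code itself but in making this kind of review a routine habit before, rather than after, an incident.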

Additional Incident Findings

While our investigation of the data took only about an hour, a much larger investigation occurred within the organization.  It was identified that the injured party wasn’t trained or qualified to perform the task.  The fall protection he was wearing was rigged from floor level rather than above the employee, and the harness he wore had no leg straps.  While the foremen were trained, they didn’t understand the basic principles of use and essentially provided the PPE without knowledge or documentation of the employee’s competency.

This subcontractor was used without supervision and without regard for a policy that mandates that the safety department be notified before they work on the project.  It was also reported that the project where this incident occurred was responsible for 50% of the incidents since the beginning of the year.

Lessons Learned

The data told us that the observers believed PPE was a primary issue.  In many cases, elevated PPE counts occur in the absence of a well-understood inspection strategy; they are often simply cultural proxies for a larger issue.  In these types of cases, we should not only look at the data collected but discuss it with the teams as well.  This type of communication encourages thinking about what was identified, as well as other relevant items based on the ongoing tasks.  Since there were a substantial number of fall protection concerns, these represented a considerable opportunity, and a clear priority, for prevention.

When asked, the client believed that a lack of adequate planning contributed to the outcome of this incident.  The data told us that the observers, while performing their observations of at-risk conditions, believed the same.  A third of the root causes assigned were chosen as “Not Following Plan.”  The lesson learned here is that we should be listening to what our observers are saying as they are the eyes on the front line.

The highest numbers of unsafes were found at the location where the incident occurred.  When we look at the data and see an at-risk trend, that’s the time to take action and investigate the circumstances that are driving the number up. Consider this a pre-incident investigation that can lead to the prevention of a potential injury.

Summary

When you think about the mechanism of injury and then compare it to what the data was indicating prior to the incident, the benefits are clear.  Taking it one step further, combining the current data with the historical incident data is a great way to refocus your inspection strategy and ensure that the things that matter most are being looked at.

You’ve taken the time to train your employees on your observation process and are working towards improving all phases of the continuous improvement loop.  This is a critical advance towards long-term sustainability. The next logical step is to act on information that is likely already there for your use.

Finally, we do what we do because we’re trying to support the continuous improvement process and reduce the injuries that are occurring on our work sites.  Part of this is building the culture through employee involvement.  When your observers are conducting observations, that’s participation!  There is no better way to say thank you than taking the time to review their data and improving the process based on what they identified in the field.

 

Biography

Scott J. Falkowitz, OHST, CHST is a Process Improvement Leader with Predictive Solutions Corporation.  Scott has been a safety professional for over 10 years with a specific emphasis in heavy industries including Scrap Metal Recycling, Over Water Operations, Wire Rope Manufacturing and Construction.  He holds an A.A.S. degree in Fire Science Technology and is actively pursuing his B.S. in Occupational Health and Safety.  Scott previously served the local community as a firefighter and has been a registered EMT since 1996.

 

Dec
30

“Practical Drift”: Why people don’t always follow procedure and can Relationship Based Safety help?

Each year many people are injured or killed in incidents where following a procedure or using available safety equipment would have saved their lives.  Both managers and safety professionals have asked, “Why do people take short cuts and put themselves at risk?”  This article will explore this question from a variety of perspectives, ranging from the traditional to more recent thinking related to complexity and relationship-based safety.

A vigorous discussion on LinkedIn (Nov 2013) asking why people take short cuts attracted responses from 66 safety and health professionals from several countries. If you are a member of LinkedIn you may view the discussion at this link. By way of an informal study, I analyzed the comments and attempted to put them into categories to see if we might get a glimpse into the underlying assumptions that guide EHS professionals in understanding and addressing why people take risks that appear to be avoidable. The perceptions fell into four broad categories:

  1. Human nature (26 responses / 46%)
  2. Leadership & culture (10 responses / 17.5%)
  3. Production/financial pressures (11 responses / 19%)
  4. Operational/management systems (10 responses / 17.5%)
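A quick arithmetic note (my inference, not stated in the original discussion): the four category counts sum to 57 classified responses, and the percentages above match that denominator rather than the full 66 participants:

```python
# Quick check of the category percentages (counts are from the article; using
# the 57 classified responses as the denominator is an inference, not stated).
counts = {
    "Human nature": 26,
    "Leadership & culture": 10,
    "Production/financial pressures": 11,
    "Operational/management systems": 10,
}
total = sum(counts.values())  # 57 classified responses out of 66 participants
print(f"Total classified responses: {total}")
for category, n in counts.items():
    print(f"{category}: {n} ({n / total:.1%})")  # 45.6%, 17.5%, 19.3%, 17.5%
```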

Defining the Problem

Human nature includes ways of thinking, feeling and acting that humans tend to have universally, which can be regarded as a source of both good and destructive tendencies.  Ascertaining the characteristics of human nature and what causes them are some of the oldest questions in human civilization.  This provides a hint as to the enormous complexity of defining the problem. Almost half (46 percent) of the explanations of why people take short cuts fell into this category.  Figure 1 is a summary of the perceptions on how human nature plays a part in an individual’s decision to take short cuts and risks.

 

Poor leadership and culture (figure 2) and production/financial pressures (figure 3) had 10 and 11 responses respectively.  Arguably you could say that these perceptions are also related to human nature; however, we separate them because they do not place sole responsibility on the person performing the work.

[Figure 2: Perceptions related to poor leadership and culture]
[Figure 3: Perceptions related to production/financial pressures]

 

Management systems (figure 4), which include rewards and consequences, seem to weigh in fairly equally with production pressures and culture/leadership issues.

[Figure 4: Perceptions related to operational/management systems]

This analysis is not scientific, but it provides a snapshot of a mindset in the safety and health profession on this very important issue.  It is encouraging that 57% of the respondents articulated multiple influencing factors. However, the indication that a negative assumption about human nature is frequently used to explain why people take short cuts is cause for concern. Are these assumptions correct?  Unless we are making decisions on how to address human performance based on facts, we are likely to take the wrong or insufficient actions.  The way we define the problem informs our strategies to correct it.

A New Perspective

Prominent researchers have called into question the belief that human error is at the root of most accidents (Reason 1988, Hollnagel & Leveson 2006, Dekker 2006). The focus has shifted from the individual to the influence of social relationships and operational systems. Our previous articles have shown how research points to relationships and social interaction as much stronger influences on how people choose to act than previously thought.

In this article we want to examine how beliefs and assumptions about human error as the most frequent cause of accidents might be wrong. These beliefs seem to stem from Heinrich’s triangle theory, which made the elimination of unsafe acts a primary objective. His theory made a lot of sense and was supported by many respected safety professionals (Bird 1969, Heinrich, Petersen & Roos 1980). Now this assumption is being questioned by recent research indicating that Heinrich’s research was flawed (Manuele 2011).

Not only has Manuele’s research uncovered defects in Heinrich’s original statistics, but other research also shows that “human variability is what preserves imperfect systems in an uncertain and dynamic world” (Reason, 1988: 239). According to Reason, managers attribute unreliability to unwanted variability. They believe that increasing consistency in behavior will increase system performance. This is why standard operating procedures and automation are so prevalent.  James Reason provides many examples of famous events where human variability, in the form of just-in-time actions and adjustments, is what saved the day, not following procedure. He suggests that attempting to constrain human variability by prescribing a limited set of “safe behaviors” undermines one of the most valuable assets we have.

We do know that many an individual has been hurt by not following a basic safety procedure, so we have to deal with a reality where the solution is not black and white. While efforts to control accidents through policy, procedure and behavior interventions have proven insufficient, fatalities and disasters such as the BP oil spill are not acceptable in companies with values of protecting people and the environment. So, how can we define and address the problem?

Practical Drift

Scott Snook’s (2000) “practical drift” theory provides direction for actions that could address the unpredictable nature of organizational behavior that produces disasters such as the accidental shootdown of two Black Hawk helicopters over northern Iraq.

He defines practical drift as “the slow uncoupling of practice from procedure” (p. 24).  He concludes that the typical response of tightening procedures and increasing penalties for failure to comply would inevitably lead to the same pathology because in time, the new procedures would also be ignored. Instead, he urges professionals and managers to realize that the important question is not how to fix pilot error, crew inaction or even practical drift. The more fundamental question is, what can be done given this reality of human behavior? How can practical drift be addressed if not with increased and tighter rules?

A beginning would be to accept that drift will occur, that it is a positive aspect of human nature and more rules are not the answer. It would be beneficial to explore how people come to believe that not following the procedure makes more sense, and engage people in an inquiry that could lead to a more profound sense of mindfulness in their work.  By increasing safety awareness through conversation and dialogue held with respected individuals we might prevent future tragedies more effectively than by increasing rules and procedures.

Given the level of uncertainty and a changing environment, drift is the only certainty. Engaging workers in identifying it and knowing when to bring it up for discussion is a challenge.  Getting buy-in and consistency will require routines and structures that are embedded into the work. Workers will have to see it as beneficial to getting their work done right and efficiently.

In this regard the Relational Coordination (RC) dimensions of effective communication can be quite useful. The RC survey’s seven dimensions evaluate the health of the working relationships (trust and communication levels) needed to accomplish specific work processes.  In this case, the dimensions of frequency, accuracy, timeliness, problem solving, shared goals, shared knowledge and mutual respect would be measured in relation to communications about procedural changes (Carrillo 2011, 2012). For more detail see our blog explaining the RC theory and dimensions.
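As a rough illustration of what measuring those dimensions could look like in practice, here is a short sketch using hypothetical 1-5 ratings and a simple average; it is not the actual Relational Coordination survey instrument or its scoring method:

```python
# Rough illustration only: hypothetical 1-5 ratings and a plain average, not
# the actual Relational Coordination survey instrument or its scoring method.
from statistics import mean

dimensions = ["frequency", "accuracy", "timeliness", "problem solving",
              "shared goals", "shared knowledge", "mutual respect"]

# each inner list is one respondent's ratings, in the order above (made up)
responses = [
    [4, 3, 2, 3, 4, 3, 4],
    [3, 3, 2, 3, 4, 2, 4],
    [4, 4, 3, 3, 5, 3, 5],
]

scores = {dim: mean(resp[i] for resp in responses) for i, dim in enumerate(dimensions)}

# the lowest-scoring dimensions point to where conversations about drift from
# procedure are most likely breaking down
for dim, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{dim:>16}: {score:.2f}")
```

The numbers themselves matter less than the conversation they start about where communication around procedural change is weakest.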

Relationship Based Safety (RBS)

RBS differs from behavior-based safety because it changes the focus from changing individual behavior to building collaborative relationships. We are being asked to consider moving towards approaches that support relationship building as a way to influence behavior, to lessen dependence on the types of controls we’ve been discussing, and broaden our understanding of how to work with human nature.  There are basic safety management systems that need to be in place. It is the role of the safety professional to evaluate those systems and advise.  However, the evidence indicates that we need to recognize that the quality of relationships in the workplace plays an equally important role.

It isn’t easy to champion investments for relationship skill building. We live in a time that puts a premium on the measurement of outcomes, on the ability to predict them, and on the need to be absolutely clear about what we want to accomplish. To aspire for less is to court the loss of professional credibility.  That is why we need to continue to search for reliable ways to measure relationship effectiveness. The Relational Coordination survey is one such instrument.

From a social perspective it is understandable why tight controls, accountability, and standards should have such power. Especially when the public is involved, the response is to tighten up, to mandate, to measure, and to produce action plans. The manager’s ability to exercise professional discretion is likely to be constrained when the public has lost confidence in management’s ability to control the risks.

This insistence on efficiency and control reflects dominant values today. Under these circumstances the aim of leadership development is typically in the arena of finance, strategy and planning.  People are considered capital or expense. Little is provided in the area of relationship building, living with ambiguity, responsiveness to the unexpected and engagement of the imagination.

Regulators try to control drift through regulations and enforcement. It appears that human behavior is not responsive to that approach, since (especially when you factor in unpredictable social interactions) it is impossible to completely control the effect of messages and the quality of information sharing.  It is difficult to go against this grain.  There is very little room for exploration and learning from small mistakes. Leaders are constantly measured by dashboards and 360 evaluations, and taught a series of “leadership competencies” that promote the fallacy that there is one correct answer.

In leadership, judgment replaces rules.  What constitutes the right qualitative relationships for any particular work is unique to that work.  Who should be at a meeting or huddle? It isn’t just about titles. Who brings a necessary perspective? One person may be known as pessimistic and must be balanced. Another may have the trust and confidence of the people needed for collaboration. Part of leading is knowing who to bring into the conversation that will surface the hidden issues.

To the extent that a leader can make it okay to learn from mistakes, people will talk about drift from procedure and learn from it. There is no way to manage in this way sitting behind a desk or through a computer. Managers must get comfortable with social interaction.

It might be hard to accept but workers and managers are always making up their world as they go, and fitting their actions under the umbrella of the way they see the world or the truth in that moment. Rules and procedures are an attempt to hold the world still and create a uniform decision making model. It works for a while, until the human mind shifts and creates a new understanding of what’s going on. Safety management is not exempt from that dynamic no matter how important the goal of eliminating harm and failure may be.

Biography

Rosa Carrillo, President of Carrillo & Associates, is a thought leader in transformational leadership for environment, safety and health. She brings 20 years of industry experience with all levels of the organization. Her results and many publications create instant credibility with leadership and the workforce. She is fluent in English and Spanish and is at ease working across many cultures. For more information, please visit her website: http://carrilloconsultants.com

 

References

Bird, F.E. & Germain, G.L. (1969). Practical loss control leadership.

BP. (2010, Sept. 8). Deepwater Horizon accident investigation report. Houston, TX: Author. Retrieved Nov. 12, 2013, from http://www.bp.com/content/dam/bp/pdf/gulfofmexico/Deepwater_Horizon_Accident_Investigation_Report.pdf

Carrillo, R.A. (2010, May). Positive safety culture: How to create, lead and maintain. Professional Safety, 47-54.

Dekker, S. (2006). Resilience engineering: Chronicling the emergence of confused consensus. In E. Hollnagel, D. Woods & N. Leveson (Eds.), Resilience engineering: Concepts and precepts (pp. 77-92). London, U.K.: Ashgate Publishing.

EHS Professionals LinkedIn Group. Comments viewed Nov. 15, 2013, from http://www.linkedin.com/groups?home=&gid=4278711&trk=anet_ug_hm

Eisner, E.W. (2002). The arts and creation of mind. VA: Integrated Publishing Solutions.

Goodman, N. (1978). Ways of worldmaking. Indiana: Hackett Publishing.

Heinrich, H.W., Petersen, D. & Roos, N. (1980). Industrial accident prevention. New York: McGraw-Hill.

Hollnagel, E., Woods, D.D. & Leveson, N.G. (Eds.). (2006). Resilience engineering: Concepts and precepts. London, U.K.: Ashgate Publishing.

Manuele, F.A. (2011, Oct.). Reviewing Heinrich: Dislodging two myths from the practice of safety. Professional Safety, 52-61.

Reason, J.T. (1988). The human contribution: Unsafe acts, accidents and heroic recoveries. VT: Ashgate Publishing.

Snook, S.A. (2000). Friendly fire. Princeton, NJ: Princeton University Press.
