Leading Indicators Increasingly Accepted As Means to Anticipate, Mitigate Hazards

By Bruce Rolfsen

Jan. 22 — When Jack Toellner, senior safety consultant for ExxonMobil, explains safety and health “leading indicators” to conference audiences, he often shows a drawing of a man slipping on a floor.

Measurements of the actions that could have prevented the slip are leading indicators, according to Toellner, whereas the slip's aftermath produces the trailing indicators—the number of missed work days, the injury being recorded in the company's OSHA log and a workers' compensation claim.

Tom Daniel, director of environment, health and safety for Owens Corning, defines leading indicators as measuring a company's “risk temperature.” At Honeywell Process Solutions, Cary Gherman, the director of health, safety and environment, likens leading indicators to using a patient's family medical history to determine the health problems he or she may face.

Search for Clarity, Consensus

Over the past five years, acceptance has grown for the use of leading indicators, John Downy, acting director of the National Safety Council's Campbell Institute, told Bloomberg BNA.

More employers are adding leading indicators to their safety programs, accepting the idea that by paying attention to seemingly unrelated actions—such as the frequency of unscheduled maintenance or hours spent in training—a company can identify possible risks and take action to reduce on-the-job hazards. Mitigating the risks leads to safer worksites and should lower the injury and illness rates reported to the Occupational Safety and Health Administration.

“At first, we didn't see a great deal of consensus or clarity,” Downy said about which factors succeeded as leading indicators.

In October 2013, the Campbell Institute published the results of a survey of member companies, which are mostly large employers with thousands of workers such as UPS, Dow Chemical Co., Norfolk Southern Corp. and DuPont Co. The survey reported that members face “several common barriers to leading indicator implementation.”

The barriers included difficulty in developing consistently actionable leading indicators, the lack of a reliable relationship between leading and lagging indicators, and the absence of standardized benchmarking of leading indicator results among employers.

A year later, the Campbell Institute published a follow-up report looking at the best practices of its members—“Practical Guide to Leading Indicators: Metrics, Case Studies & Strategies”—saying that companies, after internal debates, had settled on their leading indicators.

The report, based on workgroup meetings of Campbell Institute members, identified more than 80 leading indicators that could be tracked.

By publicizing through workshops and publications the best practices and experiences of its members, Downy said, the institute has provided a foundation for employers to build on.

Limitations of OSHA Metrics

Most companies gauge the success of their safety and health programs by lagging indicators, such as the number of injuries and illnesses that must be recorded in OSHA logs after they have taken place.

The fewer the recorded injuries and illnesses, the more successful a company may consider its safety program. Other trailing indicators include lost time, damage to equipment, and hazardous material releases and spills.

Proponents of leading indicators often point to research concluding that low injury and illness rates don't always equate to lower risks for catastrophic accidents and don't provide insights into how to avoid incidents.

“I'd be the first to tell you that OSHA data has serious limitations as a global metric,” Steve Newell, executive vice president of the safety consulting firm ORCHSE Strategies LLC and a former director of OSHA's Office of Statistics, told an audience during a recent Campbell Institute forum.

“Companies that have low OSHA rates—if that is the only thing they are looking at—don't know what they are doing,” Newell said. “They don't know if that is a result of being good, whether they are lucky, whether their people don't know the recording criteria.”

One challenge to implementing leading indicators is convincing company leaders to invest time and money in a new program, he said.

Safety managers, Newell said, must make the case that improving safety will add to the company's overall value and that leading indicators will become the management levers that leadership can push and pull. Using leading indicators will improve the production process, not just reduce workers' comp rates, Newell said.

A 2014 report published by the National Safety Council's Campbell Institute identified more than 80 possible leading indicators. This list is a sample of leading indicators mentioned by the group's members. The report, “Practical Guide to Leading Indicators: Metrics, Case Studies and Strategies,” is available at http://op.bna.com/env.nsf/r?Open=rdae-9srqyc.
• How many assessments of compliance with standard operating procedures were conducted? 
• How many of the assessments were validated by a safety/health manager?
• On average, how many days did it take to close a corrective action request? 
• What percentage of requests were closed on time? 
• What is the percentage of corrective actions by the type of hazard (for example, confined space and fall protection)? 
• How many effective corrective actions were verified by managers? 
• How many near miss or unsafe observation reports were filed? 
• How many of the reported hazards were previously unknown? 
• What is the ratio of safe to unsafe observations?
• How many workers were trained in hazard identification? 
• What was the number of certified trainers? 
• How much money was spent annually for training? 
• How many new employees completed orientation? 
• On average, what is the number of training hours for each employee at a worksite? 
• How many incidents had a root cause that included a lack of training?
• How many employee suggestions were implemented? 
• How many managers participated in critical design reviews? 
• What is the percentage of positive ratings of managers from employees?
• How much job turnover was there? 
• How often did workers lead safety meetings? 
• How many hazard observations are submitted by employees? 
• What was the number of grievances submitted?
• What is the ratio of positive to negative observations? 
• What is the ratio of peer-to-peer observations compared to supervisory observations? 
• How many observations were made? 
• What is the ratio of high-risk observations to low-risk observations?
Pioneering Indicators

ExxonMobil was among the first companies to implement a leading indicators program.

Toellner, who has spent almost half of his 46-year career at ExxonMobil focused on workplace safety, said he first used leading indicators on a broad scale in 1998. At that time the company was constructing two offshore oil and gas platforms, weighing a combined 32 million pounds. At the project's peak, more than 1,200 people were building the platforms.

Toellner credited project managers for their openness to using leading indicators and for emphasizing to supervisors and employees that safety was a priority.

The project's leading indicators included grading the thoroughness and professionalism of the toolbox talks at the start of each shift, as well as evaluating the worksite's housekeeping, the precautions taken to prevent components and tools from falling to lower levels and the participation of foremen and senior managers in safety walkthroughs, Toellner said.

Following the introduction of leading indicators, the total recordable injury and illness rate for the project declined about 50 percent, to approximately two cases for every 100 workers.
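Rates like the one cited here follow OSHA's standard formula, which normalizes recordable cases to 200,000 exposure hours (100 full-time workers at 40 hours a week, 50 weeks a year). A minimal sketch of that arithmetic, using hypothetical case and hour counts rather than ExxonMobil's actual project data:

```python
def trir(recordable_cases, hours_worked):
    """OSHA total recordable incident rate: cases per 100 full-time
    workers, normalized to 200,000 hours (100 x 40 x 50)."""
    return recordable_cases * 200_000 / hours_worked

# Hypothetical figures for a project-sized workforce: 1,200 workers,
# roughly 2,000 hours each over a year.
hours = 1_200 * 2_000
print(trir(24, hours))  # 24 recordable cases -> rate of 2.0
```

A rate of 2.0 corresponds to the article's “approximately two cases for every 100 workers.”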

Toellner didn't give leading indicators all the credit for the reduced number of injuries, but he said he is certain the program led to higher morale and an attitude that safety was everyone's concern.

Today at ExxonMobil, all new projects are expected to incorporate leading indicators into their site-specific plans, Toellner said.

Entire Company Involvement 

Company safety directors and consultants tend to agree that when starting a leading indicators effort, there is no magic metric. Determining which factors to measure requires the involvement of the entire company.

“Think about the questions you want your measures to answer,” Newell advised. “Come up with measures that answer those questions.”

Toellner cautioned that leading indicator proponents can't let themselves delay a program by prolonging debates on which indicators to use.

“People ask me what the best leading indicator is,” Toellner said. “It's the one you're committed to following through on.”

One often-suggested leading indicator is collecting near-miss reports, which detail accidents that almost happened. However, just collecting near-miss reports isn't enough.

“Counting near misses, and then doing nothing to prevent more, is a lagging indicator,” Toellner said. The value of near-miss reports is the insight they provide into the circumstances leading up to the near miss.

One misconception, several safety managers told Bloomberg BNA, is that starting a leading indicators program will require setting up new reporting systems for each indicator. Often the data are already being collected—such as the percentage of new hires at a worksite—but no one is looking at the information from a workplace safety perspective.

Common leading indicators include:

• the frequency with which senior managers and supervisors participated in safety walkthroughs,

• the frequency of unscheduled repairs and whether a hazard review was conducted before the repair,

• the percentage of worksite employees who recently started,

• the number of hazard and near-miss reports,

• the number of days abatement of a reported hazard took,

• the number of hours each worker spent in training,

• compliance with company rules and

• the frequency of overtime.
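Several of these indicators can be derived from records a company already keeps, as the safety managers quoted below note. A minimal sketch of two of them, the percentage of recently started employees and the average days to abate a reported hazard, using hypothetical HR and hazard-log records (all field names and dates are invented for illustration):

```python
from datetime import date

# Hypothetical records of the kind HR and safety logs already hold.
employees = [
    {"name": "A", "start": date(2015, 1, 5)},
    {"name": "B", "start": date(2013, 6, 1)},
    {"name": "C", "start": date(2014, 12, 15)},
    {"name": "D", "start": date(2012, 3, 9)},
]
hazard_reports = [
    {"reported": date(2015, 1, 2), "abated": date(2015, 1, 6)},
    {"reported": date(2015, 1, 10), "abated": date(2015, 1, 12)},
]

def pct_recent_hires(staff, as_of, days=90):
    """Percentage of the workforce that started within the last `days`."""
    recent = sum(1 for e in staff if (as_of - e["start"]).days <= days)
    return 100.0 * recent / len(staff)

def avg_days_to_abate(reports):
    """Average number of days a reported hazard stayed open."""
    spans = [(r["abated"] - r["reported"]).days for r in reports]
    return sum(spans) / len(spans)

as_of = date(2015, 2, 1)
print(pct_recent_hires(employees, as_of))  # 50.0 (two of four are recent)
print(avg_days_to_abate(hazard_reports))   # 3.0
```

The point of the sketch is that no new reporting system is required; the indicator is just a safety-focused view of data already collected.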

The correlation between some leading indicators and lower OSHA rates is clear: Leading indicators identify hazards that, if quickly abated, improve safety, managers said.

Some leading indicators are warning signals to supervisors and workers that risks have increased and actions should be taken to reduce those risks—for example, the opening of a new production line requires heightened attention.

Once a program is in place, the indicators should be widely available and analyzed monthly or quarterly to identify trends, safety officials said. Additionally, restricting information to just the safety department or plant managers prevents all employees from participating.

The indicators should be reviewed annually to determine whether they are effective and whether new goals should be established.

For example, a company may find in its first year of using leading indicators that there was a correlation between increasing safety training hours and lowering the injury rate. And adding more safety training in the program's second year may have had the same result. However, there will be a point where adding safety training no longer lowers the rate, and it is time to move on to a new indicator that will lower risks.

Toellner recalled that one of the primary changes his first leading indicators efforts produced 14 years ago was a factor that wasn't measured or anticipated.

Managers participating in safety walkthroughs demonstrated that safety was a “moral value,” an attitude that surprised many of the employees, Toellner said.

“They had never worked at a place where safety was a core value and managers really care about them,” Toellner said. “We didn't quite understand how much we changed the culture.”

To contact the reporter on this story: Bruce Rolfsen in Washington at brolfsen@bna.com

To contact the editor responsible for this story: Jim Stimson at jstimson@bna.com

Campbell Institute studies on leading indicators, including the “Practical Guide to Leading Indicators: Metrics, Case Studies & Strategies,” are available for free at http://www.thecampbellinstitute.org/research.