5 reasons why lagging indicators are problematic

Are your safety metrics deceiving you?

Lagging indicators, such as the number of incidents, injuries and days away from work, are widely used in occupational health and safety, but they bring a host of problems with them, delegates heard at the Canadian Society of Safety Engineering conference in September.

The first issue is that lagging indicators are negative and are premised upon the understanding that safety is the absence of bad, explained Tanya Hewitt, human and organizational performance specialist with the Canadian Nuclear Safety Commission in Ottawa, speaking at the conference in Niagara Falls, Ont.

“Safety being the absence of bad makes you hunt for bad. It makes you look for bad things. It makes you look for deficiencies. It makes you look for non-compliances. And this is going to get people guarded, you know, when all you’re doing is looking for something to get them in trouble,” she said.

Furthermore, this mindset forces safety professionals to focus too narrowly.

“If you put your blinders on and only look for what you are looking for, you will not see the big picture stuff, you just won’t,” said Hewitt.

Another problem with lagging indicators is that they are purely quantitative. The metrics are rolled up and put into a dashboard without the safety manager having any idea how those numbers came about in the first place, Hewitt said. This is especially concerning because decisions are often made on the basis of those aggregate numbers.

“This is bordering on unethical. This is not right. If you don’t know how that data came to you, you shouldn’t be making decisions on it. Period. You should have the full story,” Hewitt said.

Plus, these metrics don’t capture the “invisible numbers”: problems that workers quietly fix without ever reporting them.

“It could be happening all the time but you have these people at some part of the process fixing the problem. If it’s not reported into some sort of system, you wouldn’t even know that it happened,” said Hewitt.

Yet another problem with lagging indicators is that they are, by definition, historical: they rest on the “safety myth” that past performance predicts the future.

“The future is super, super hard to predict and looking in the rearview mirror doesn’t do it. Past performance does not predict the future,” Hewitt said.

The fourth major problem with lagging indicators is that they can be deceptive. For example, BP had a fantastic safety record at the time of the 2010 Gulf of Mexico oil spill, having won a safety award just weeks before. But the investigation report said, “BP focused too much on the little details of personal worker safety instead of the big systemic hazards.”

“Reducing lost time for workers and making sure they wear the right kind of boot is important, nobody’s denying that, but it isn’t enough,” Hewitt said. “You can think you’re doing well. You can lull yourself into a false sense of security if you’re looking at the wrong stuff.”

Finally, lagging indicators are problematic because they are highly gameable. When there is an external motivator, such as a financial bonus at the end of a successful project, the focus shifts to securing the reward.

“You will hide stuff that shouldn’t be hidden because the reward trumps the work,” said Hewitt.

Instead of lagging indicators, Hewitt recommends focusing on leading indicators, such as the number of safety inspections, the time taken to implement corrective actions, the content of employee complaints, the diversity of voices involved in decision-making and personal safety stories.

This article originally appeared in the December 2018/January 2019 issue of COS.