Hacking the TRIF

Commonly used measurement lacking in 7 key areas

Rob McLean

Hacking is breaking a system down into its smallest parts and analyzing the strengths and weaknesses of those parts, then making changes to achieve a certain goal. Although the term has been negatively branded as a criminal activity, such as when information databases are compromised, it is not the act of hacking itself that is evil; it's the goal and the methods you use that determine the ethical nature of your hacking. With good intentions in mind, let's hack the TRIF.

TRIF stands for total recordable injury frequency, sometimes termed total recordable injury rate or simply total recordable rate. It’s a ratio of injuries of a certain severity (termed recordable injuries) to hours worked.

This is how the TRIF is calculated:

(number of recordable injuries × 200,000) ÷ (number of hours worked)

Why 200,000? Because 100 workers each working a 40-hour work week for a year adds up to roughly 200,000 hours. The equation therefore tells you how many recordable injuries occurred for every 100 workers' worth of full-time hours. So a TRIF of 5 means that if you theoretically had exactly 100 workers on a site working a 40-hour work week, you would have seen five recordable injuries in the year. If you wanted a quarterly TRIF, as some work sites calculate, you would take the injuries in any given quarter, replace 200,000 with 50,000, and divide by the approximate number of hours worked in that quarter.
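
For readers who want to see the arithmetic, here is a minimal sketch of the calculation in Python, following the definitions above (the function name and the example figures are my own, chosen purely for illustration):

def trif(recordable_injuries, hours_worked, base_hours=200_000):
    """Total recordable injury frequency.

    base_hours defaults to 200,000 (roughly 100 full-time workers for a year);
    use 50,000 for a quarterly TRIF, as described above.
    """
    return recordable_injuries * base_hours / hours_worked

# Annual example: 5 recordable injuries over 200,000 hours worked -> TRIF of 5.0
print(trif(5, 200_000))

# Quarterly example: 1 recordable injury over 40,000 hours in the quarter -> 1.25
print(trif(1, 40_000, base_hours=50_000))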

History of the TRIF

The seeds of the TRIF were planted at the beginning of the 20th century with the United States Bureau of Labor's first full-scale survey of industrial accidents and hygiene. But the bureau didn't have the clout to standardize the collection of nationwide injury data until 1954, when the American Standard Method of Measuring and Recording Work Injury Experience was accepted. It was not enforced by legislation, however, and as a result few companies voluntarily reported their injury statistics, and no one reported on what we term today as "first aids" and "medical aids"; they reported only on lost times and fatalities.

In 1970, the United States' Occupational Safety and Health Act (the legislation behind OSHA) was passed with the mandate of decreasing the economic cost of workplace injuries and illnesses. As part of this legislative move, workplace injury and occupational disease statistics were to be tracked. Not safety statistics; loss statistics. And here we find a significant turning point in why the TRIF is so widespread today (back then referred to as the OSHA 300 log). This was the most important statistic of the most powerful safety force in North America. And people began to treat the TRIF as a measure of safety; they began to regard it as appropriate for use as an internal managerial safety metric, as appropriate to be benchmarked from company to company, business division to business division. This was never the intention of the statisticians who chose the TRIF equation: it was a tool to benchmark different industries, to assign priority to the industries that were hurting the country the most and were the highest priority for investigations. Using man-hours in the billions, they assigned their primary targets. And as American industry limped along in following this legislation, it was taught a habit it didn't fully understand.

What does the TRIF do well?

•It's simple. You only need two pieces of data: the number of recordable injuries and the man-hours worked. Both are comparatively easy to gather: a company's HR department normally has total man-hours readily available, and the number of injuries is a very common piece of data for the safety department to keep.

•It's standardized. It's the current standard used by OSHA. It's a common metric in many vendor management databases and, although it depends on the client, a business's TRIF is often taken into consideration before doing business with them, especially in construction for contractors and subcontractors. It is the dominant safety metric of choice in the U.S. as well as Canada.

What doesn’t the TRIF do very well?

•Hours are all weighted equally in terms of risk. Some safety programs focus fewer of their resources on actual safety and more on tracking down every possible source of man-hours on a site in an effort to "water down" the effect of the recordable injuries they sustain.

•Possible conflict of interest. In some businesses, the person recording the TRIF is often the one whose performance is measured on that key performance indicator (KPI). That leaves the data vulnerable to temptation: treatments can be re-interpreted, incident reporting can be discouraged and man-hours can be unethically acquired (for example, hiring a bunch of "consultants" to flood your TRIF with low-risk man-hours from people doing sedentary work of dubious value).

•Benchmarking. It is an inaccurate benchmarking tool unless you have identical sites working on identical projects with identical cultures of transparency and a comparable volume of hours, ideally around 1 million.

•It's not prescriptive. High TRIF? Bad. Low TRIF? Better, but the TRIF will never tell you how to get better. Only where you once were and where you were before that. Not where you're going and how to get there.

•It’s a lagging indicator. You only find out your TRIF once things have happened, once hazardous energy has found its way through your controls and inflicted significant harm. It is a measure of yesterday’s safety performance. How many of us go to the weatherman for yesterday’s weather?

•Doesn't scale well. Small employers don't produce enough man-hours for their TRIF to be statistically stable. Without enough man-hours, all it takes is a single recordable injury to send your TRIF sky-high, which will have you appearing culpably unsafe when what you really are is a small employer that values transparency (a quick sketch after this list makes the effect concrete).

•It is indirect. The TRIF measures injuries directly and a correlation is then drawn between injuries and safety to determine how safe a workplace is. But what if there are no injuries? What if there are injuries but they don’t get reported? What if some are reported and some are not? How accurately can we measure safety in this uncertainty? What if I injected $1 million into a site’s safety infrastructure overnight? What would our TRIF be tomorrow morning? It would be the same.
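
To put a number on the scaling problem mentioned above, here is a small illustrative sketch in Python (the hours figures are hypothetical, chosen only to show the effect) of how the same single recordable injury produces very different TRIFs depending on how many hours an employer generates:

def trif(recordable_injuries, hours_worked, base_hours=200_000):
    """Total recordable injury frequency, as defined earlier in the article."""
    return recordable_injuries * base_hours / hours_worked

# One recordable injury at employers of very different sizes
# (hours are hypothetical, chosen only to illustrate the scaling issue)
for hours in (20_000, 200_000, 2_000_000):
    print(f"{hours:,} hours, 1 recordable injury -> TRIF = {trif(1, hours):.2f}")

# Prints roughly: 10.00 at 20,000 hours, 1.00 at 200,000 hours, 0.10 at 2,000,000 hours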

I don’t mean to vilify the TRIF; everything comes with its own package of pros and cons. It is still a metric that measures safety. Kind of. But to what degree of efficacy?

It’s important to understand what the TRIF is and what the TRIF is not. The TRIF is the performance of the past. It’s not the performance of today. The performance of today is up to you.