[Illustration: a line of figures falling off a cliff while chasing a pie chart]

When data misleads us

Mark Murrell

"When a measure becomes a target, it ceases to be a good measure."
Goodhart's Law (as paraphrased by Marilyn Strathern)

I read an excellent article recently that highlighted the various ways that data and statistics betray us. Because the world is complex and full of subtle nuances, it turns out that when we try to boil that down to a finite set of numbers that are managed and controlled, we often set ourselves up to be deceived. The quote above is just one of the reasons why that happens - once we start trying to improve something that we're measuring, those measurements become less and less reliable.

The reason behind that is fairly simple - people respond to incentives. It may not be obvious, or even conscious, but it happens. In 2005, Freakonomics explained in great detail the different ways that phenomenon plays out, and it continues to happen even when we're aware of it.

Personally, I saw this play out in the first job I had in the tech industry. I was working in Apple's tech support call center, and we were measured on call volume and abandonment rate (the percentage of callers who hung up before reaching an agent). The idea was to compensate based on the number of calls answered, and to incent people to answer those calls in a timely fashion. The unintended consequence was that agents got very good at quickly prescribing a set of potential resolution steps, then sending customers off to try them with the instruction to call back if they didn't work. Lots of calls answered, low abandonment - but longer time to resolution, and not a great experience for the caller.

As we get more and more data points to measure things that are happening in the world, and as we attempt to use those data points to improve performance, it's important to understand the different ways that data can mislead us, and do what we can to prevent it.

Avoiding Data Proxies

One way that data misleads us is when it's wrapped up in a proxy - an abstracted measurement used in place of the real data. Sometimes it's too complicated to get the real numbers, or they're simply not available, so proxies get created to make sense of what's going on and produce metrics that allow for comparisons.

The safety world is full of these. Most safety metrics that people talk about regularly - DOT reportable incidents, CSA scores, insurance loss ratios - are actually proxies that summarize or simplify more complex numbers in such a way that a third party (e.g. enforcement agencies, insurers) can rank and prioritize people. Those numbers, on their own, are pretty much meaningless.

They can be helpful in understanding relative performance, but they only tell a tiny portion of the story. If two fleets have the same DOT reportable number or insurance loss ratio, are they equally safe? Not necessarily.

They're also all lagging indicators, so they summarize what happened in the past, not what's happening now. If those two fleets have matching numbers today because of what happened 3, 6, or even 12 months ago, what does that tell us about how safe they are today? Pretty much nothing.

In order to get meaningful data, and be able to act on it appropriately, it's important to bypass the proxies and measure the actual numbers. In the safety world, that means adding up the actual costs (both direct and indirect) in total dollars and as a percent of revenue. That provides something more useful, helping to identify the specific places where costs are happening (rather than having them summarized by the proxy). It also spells out pretty clearly the benefit of fixing the problems.
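As a rough sketch of that bypass-the-proxy arithmetic, the calculation is just totaling direct and indirect costs and dividing by revenue. The cost categories and dollar figures below are invented for illustration, not taken from any real fleet:

```python
# Hypothetical illustration: replacing a proxy score with actual safety costs.
# All categories and dollar amounts are made up for the example.
direct_costs = {
    "vehicle_repairs": 84_000,
    "cargo_claims": 27_500,
    "insurance_deductibles": 45_000,
}
indirect_costs = {
    "downtime": 31_000,
    "replacement_drivers": 18_000,
    "administrative_time": 9_500,
}
annual_revenue = 12_000_000

# Total dollars, and the same figure expressed as a percent of revenue -
# the form that makes the benefit of fixing specific problems visible.
total_cost = sum(direct_costs.values()) + sum(indirect_costs.values())
pct_of_revenue = total_cost / annual_revenue * 100

print(f"Total safety cost: ${total_cost:,}")
print(f"As a percent of revenue: {pct_of_revenue:.2f}%")
```

Unlike a proxy score, the itemized dictionaries keep the individual cost buckets visible, so the specific places where money is leaking stay identifiable.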

Measuring All the Parts

The quote that opened this piece highlights the challenges that arise when you start trying to improve the numbers that you're measuring, and the false sense of security that can be created when complex issues are reduced to simple numbers. Our experience in the Best Fleets program has shown that the way to avoid that is to consider a broader set of metrics that identify and address the potential for unintended consequences.

As an example, in my last article I talked about how fleets measure manager performance and noted how many were focused solely on driver productivity. That opens up a range of potential problems for the fleet since there's no incentive to ensure drivers perform safely, show up with undamaged cargo, take care of the equipment, or have any satisfaction in the job. Plenty of people will say that those are table stakes, but if one thing is measured and others aren't, that one thing becomes the priority.

Similarly, the most common benchmarking we see for driver performance is average MPG. On the surface, benchmarking fuel performance looks like a fantastic idea, since good fuel performance requires speed management and smooth driving, which help with safety as well. Used on its own, however, it's a great example of data that creates complacency. Time is a zero-sum game, so if a driver is incented to spend more time in one area, they'll necessarily spend less somewhere else. If they're focused on driving more slowly and smoothly, what aren't they focusing on? What are they sacrificing in order to have more time to get where they need to go? Is it trip planning? Vehicle inspection? Customer service? This is a particular issue when drivers are also bonused for on-time performance: they're incented to arrive on time and to drive slowly getting there, and those dual pressures can lead to maintenance issues and higher turnover.

To be clear, I'm not saying that benchmarking fuel performance is a bad idea, and I'm certainly not saying that we shouldn't incent drivers to slow down and drive more smoothly. But we do need to recognize that those benchmarking and incentive programs can't operate alone or in isolation - they need to be part of a larger package that balances all the places where performance can improve.

Creating an effective safety management program for drivers means making sure that ALL the things that define an exceptional driver are measured, incented, and developed. That means not only tracking speed, hard brakes, lane changes, and other on-road performance metrics, but also measuring trip planning, inspections, cargo securement, general workplace safety, sharing of best practices, and continuous learning as well. We've seen some fleets go so far as tracking whether drivers keep the cab clean and sweep out the trailer when they're done.
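One way to picture that "measure all the parts" idea is a weighted composite scorecard, where no single metric can dominate the incentive. This is a minimal sketch, not any fleet's actual program - the metric names, weights, and 0-100 scores are all invented for illustration:

```python
# Hypothetical sketch: a balanced driver scorecard instead of a single metric.
# Metrics, weights, and scores are invented for illustration.
WEIGHTS = {
    "fuel_economy": 0.20,
    "on_road_events": 0.20,   # speed, hard brakes, lane changes
    "trip_planning": 0.15,
    "inspections": 0.15,
    "cargo_securement": 0.10,
    "workplace_safety": 0.10,
    "continuous_learning": 0.10,
}

def scorecard(scores: dict) -> float:
    """Weighted composite so no single metric dominates the incentive."""
    # Weights must sum to 1 so the composite stays on the same 0-100 scale.
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[m] * scores.get(m, 0) for m in WEIGHTS)

# A driver who excels at MPG but neglects planning and learning
# still sees that weakness reflected in the composite.
driver = {
    "fuel_economy": 92, "on_road_events": 88, "trip_planning": 60,
    "inspections": 75, "cargo_securement": 80, "workplace_safety": 85,
    "continuous_learning": 50,
}
print(f"Composite score: {scorecard(driver):.1f}")
```

The design point is the weighting itself: gaming one metric (say, MPG) can only move a fifth of the total, so the incentive to sacrifice everything else for it mostly disappears.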

The Complex, Messy World of Data

Putting all those pieces together: there are lots of data points available now, and more coming all the time. Much that was previously unknowable is now tracked, reported, and benchmarked. All of that is fantastic, and it's the foundation for dramatic improvements in safety, efficiency, and general quality of life for drivers - as long as we remember that the world is complex, data is messy, and it's easy to slip into complacency by relying on incomplete or oversimplified metrics.