Delivery · September 2024 · 6 min read

The Delivery Metrics That Actually Tell You Something

Not all metrics are created equal. Here are the delivery metrics I track in every program — and the ones I've stopped paying attention to.


Every team tracks something. The question is whether what you're tracking tells you anything useful about whether you'll deliver on time, or whether you're just generating numbers that look good in a report.

After 14+ years managing delivery, here are the metrics I actually care about.

Cycle time

How long does it take from "work starts" to "work is in production"? This is arguably the most useful single metric in delivery. Short cycle time means fast feedback, fewer integration problems, and less work in flight at any one time. If your cycle time is measured in weeks rather than days, that's a systemic problem worth solving.
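As a rough illustration of how simple this metric is to compute, here is a minimal sketch. The work items and dates are hypothetical, and "started"/"deployed" timestamps are assumed to come from whatever tracker the team uses:

```python
from datetime import datetime
from statistics import median

def cycle_time_days(started: str, deployed: str) -> float:
    """Days from work starting to reaching production (ISO dates)."""
    delta = datetime.fromisoformat(deployed) - datetime.fromisoformat(started)
    return delta.total_seconds() / 86400

# Hypothetical work items: (started, deployed to production)
items = [
    ("2024-08-01", "2024-08-05"),
    ("2024-08-02", "2024-08-12"),
    ("2024-08-06", "2024-08-09"),
]
times = [cycle_time_days(s, d) for s, d in items]
print(f"median cycle time: {median(times):.1f} days")
```

The median is usually a better summary than the mean here, because one stuck ticket can drag the average up without telling you anything about typical flow.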

Planned vs. actual velocity

Not velocity on its own — that's just a number. The useful signal is the gap between what you planned and what you delivered, tracked over time. A team consistently delivering 80% of planned velocity isn't failing — they're probably estimating too optimistically. Acknowledging the pattern lets you plan more accurately.
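Tracking that gap is a one-liner per sprint. A minimal sketch, using made-up sprint figures to show how a historical delivered/planned ratio can sanity-check the next plan:

```python
# Hypothetical sprint history: (planned points, delivered points)
sprints = [(40, 31), (42, 34), (38, 32), (45, 35)]

ratios = [delivered / planned for planned, delivered in sprints]
avg_ratio = sum(ratios) / len(ratios)
print(f"average delivered/planned: {avg_ratio:.0%}")

# Use the historical ratio to sanity-check the next sprint's plan
next_plan = 44
print(f"realistic expectation for a {next_plan}-point plan: "
      f"~{next_plan * avg_ratio:.0f} points")
```

The point isn't precision — it's that the team stops being surprised by the same 20% shortfall every sprint.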

Escaped defects

How many bugs are being found in production versus caught in testing? A high escaped defect rate is a signal about test coverage, code review quality, or deployment practices. It's also a lagging indicator — by the time users find bugs, the cost of fixing them is much higher.
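The rate itself is just a ratio of where defects were found. A sketch with invented per-release counts — the data would really come from bug tickets tagged by discovery environment:

```python
def escaped_defect_rate(escaped: int, caught_in_test: int) -> float:
    """Share of all known defects that reached production."""
    total = escaped + caught_in_test
    return escaped / total if total else 0.0

# Hypothetical releases: version -> (found in production, caught in testing)
releases = {"1.4": (3, 27), "1.5": (8, 22), "1.6": (2, 31)}
for version, (escaped, caught) in releases.items():
    print(f"{version}: {escaped_defect_rate(escaped, caught):.0%} escaped")
```

A single release's rate is noisy; the trend across releases is the signal worth acting on.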

Dependency lead time

How long does it take from raising a dependency with another team to getting it resolved? In large organisations, inter-team dependencies are often the biggest delivery risk. Measuring them makes them visible and creates pressure to resolve them faster.
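Measuring it only requires two timestamps per dependency: when it was raised and when it was resolved. A minimal sketch with hypothetical cross-team requests:

```python
from datetime import date

# Hypothetical dependency log: (raised, resolved) per cross-team request
dependencies = [
    (date(2024, 7, 1), date(2024, 7, 18)),
    (date(2024, 7, 3), date(2024, 7, 10)),
    (date(2024, 7, 8), date(2024, 8, 2)),
]

lead_times = [(resolved - raised).days for raised, resolved in dependencies]
print(f"worst dependency lead time: {max(lead_times)} days")
print(f"mean: {sum(lead_times) / len(lead_times):.1f} days")
```

The worst case matters as much as the mean: one 25-day dependency on the critical path can sink a delivery date that the average would never predict.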

Metrics I've stopped caring about

Story points as a performance measure. They're a planning tool, not a performance metric. Comparing story points across teams or individuals is usually counterproductive and sometimes actively harmful to team culture.

Attendance at standups. If your standup is valuable, people will show up. If you're tracking attendance to enforce participation, fix the standup.

The rule I try to apply

Every metric you track should change how you make a decision. If you can't name a specific decision that would be different based on what the metric tells you, it's probably just noise. Keep your dashboard small and meaningful.
