Last week I was speaking with my friend April Mills. She was talking about how velocity doesn’t really accelerate value. In her words, it doesn’t improve one’s “value-waste ratio.” This made me sit up straight. It reminded me of Tufte’s famous “data-ink ratio.” It’s a great concept that might help us arrive at an interesting definition of waste.
Tufte’s idea was that in a visual display of information there is some “non-erasable core” of ink needed to effectively communicate the data represented. Much of the ink beyond this core is unneeded decoration, what he dubbed “chartjunk” (Tufte, 1983). The lower the data-ink ratio, the more visual “noise” there is in a chart. (It’s basically a signal-to-noise ratio.)
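Written out, Tufte’s definition (restated inline later in this post) is:

```latex
\text{data-ink ratio} = \frac{\text{data-ink}}{\text{total ink used to print the graphic}}
```

A ratio of 1 would mean every drop of ink carries data; everything below that is chartjunk.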
April’s comment applies similar reasoning to products. In product work, we often end up taking requests from people who believe they know what the right things to build are. Or we rely on Product Owners to make the right prioritization calls for us. We focus on how fast we can ship and not on the effects of having shipped what we did.
And yet this isn’t hardware: With software teams, if builders don’t engage in discovery, often no one will so long as there is “demand.” This is a mistake. As Seiden (2019) notes, the result has been a world full of features that work as specified but do not create any value. Whether we know it or not, we’re building whatever we build to create some desired value — and the value doesn’t happen without the right behavior changes, demand or no demand. (Image adapted from Adzic, 2012.)
Typically, we build without a “North Star.” There is no product vision or concrete strategy. There are no target outcomes. There are no value metrics tied to the vision. No one is focused on the change in the world that the building is meant to create. We’re focused only on building. It’s like we’re driving but scared to turn the headlights on: we can’t see where we’re going. The result is that we act as if users should create value for us just because we built something, but that’s not how it works.
We need to get better at tying it all together, at getting the signaling in place to help us make actual evidence-based decisions. We can’t learn our way forward when we’re just taking requests and meeting demand. We need a cohesive product strategy, with target outcomes feeding into a concrete product vision, and with both tied to metrics.
These metrics are our “key results”: not output, not velocity. As Andy Grove would say, when we deploy, there should be zero ambiguity about whether the key result was obtained, whether the target metric moved in the right direction. If it didn’t, we should try something else. Without this unambiguous pivot signaling in place, we call things “done” whether they achieve outcomes or not. We’re driving blind.
Even with a product strategy and concrete goals, stakeholders will still think they know the right things to build. Fine. If a stakeholder comes and says she knows what outcome she wants achieved and has features she wants added to the roadmap then, as John Cutler would advise, reframe the situation for her: The features she’s requesting are bets. The target outcome metric is how she’ll know if her bets pay off.
If she has features but no outcome, then ask her: What outcomes would building these features achieve? How will she know if the bets pay off? What would building this allow people to do that they can’t currently do? If she can’t say, then why prioritize these features over others actually tied to the product strategy?
OK, above we saw that data-ink ÷ total ink = the data-ink ratio. We kicked this post off with April Mills saying that velocity doesn’t improve one’s value-waste ratio. And, interestingly, Ben Foster (2019) just argued that ROI should be thought of as outcomes achieved ÷ revenue. He’s almost giving us a bridge between Mills and Tufte. We could restate this as:
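One way to write that restatement, by analogy with Tufte’s ratio (the exact framing below is my paraphrase of the bridge, not a formula from Mills or Foster):

```latex
\text{value ratio} = \frac{\text{outcomes achieved}}{\text{total output shipped}}
```

Velocity grows the denominator. Only the right bets, validated against outcome metrics, grow the numerator.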
Value metrics should be created before we start solutioning, before we start making changes. Unfortunately, Agile tends to perpetuate the fallacy that speed of learning can be reduced to speed of building. We often then start with building, with no headlights. There are two problems here. First, there’s a lot we can learn (and outcomes we can achieve) without building anything at all. Second, Agile teams build tons of stuff without learning anything because they don’t have the right structures in place to support said learning. This is why I’ve stopped talking about “iterating” and instead talk about speed of learning.
To close, another way of looking at Tufte’s equation is that 1 – data-ink ratio = the proportion of a chart’s ink that should be erased (Few, 2006). By the same reasoning, if we’re still using solution-based roadmaps (as most Agile teams are), then we must at least insist they’re tied to target outcome metrics. Planning out years of a solution-based roadmap isn’t strategic. How much of that roadmap helps us achieve what we need achieved? How much of that Gantt chart is value-adding?
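Applying Few’s subtraction to the product side (again, my extension of the analogy, not a formula from Few):

```latex
1 - \text{value ratio} = \text{the proportion of shipped output that is waste}
```

That remainder is the roadmap’s chartjunk: the items we could erase without losing any outcomes.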
When we aren’t focused on our value-waste ratio (or outcome-roadmap ratio), we don’t know how much waste we’re producing. Building without concrete outcome or value metrics is like doing research without research questions. It’s sort of like a “throwing spaghetti at the wall” strategy, except we’re not even focusing on whether the spaghetti sticks!
So it’s really just a “throwing spaghetti” strategy.
This is, after all, what waste is — building faster than we can learn.
Waste is spaghetti hurling, sans the wall.
Adzic, G. (2012). Impact mapping: Making a big impact with software products and projects. UK: Provoking Thoughts Limited.
Few, S. (2006). Information dashboard design: The effective visual communication of data. Sebastopol, CA: O’Reilly Media, Inc.
Foster, B. (2019). Vision-led product management. Presented at the NY Product Conference, New York.
Seiden, J. (2019). Outcomes over outputs: Why customer behavior is the key metric for business success. USA: Sense & Respond Press.
Tufte, E. R. (1983). The visual display of quantitative information. Cheshire, CT: Graphics Press LLC.