The inspiration for this post comes from my good friend Nikola Ilic, whose tagline is “I make music from the data”. That tagline has always made me think of a piece of musical legend from the internet - not because Nikola can’t make music from data (quite the opposite), but because the performers in this legend make music in a way that very few others can.
This melodic masterpiece has been ascribed to everything from a Swedish children’s orchestra to an ensemble whose members borrow each other’s instruments for a fun annual event. It’s actually neither: it turns out to be the Portsmouth Sinfonia, an orchestra open to anyone, regardless of whether they could play their chosen instrument. Professional musicians playing instruments they’d never touched before, non-musicians enthusiastically sawing away at cellos and blowing into trumpets. The result is… well, let’s call it “memorable.”
Here’s the thing: they had all the instruments. Beautiful, expensive, professional instruments. What they lacked was the ability to make music with them.
This feels a lot like business intelligence today. Are we playing the instruments or are we making music? We seem to be so extremely focused on the tool - the instrument - that we forget we’re supposed to DO something with it. As Simon Sinek puts it: you don’t buy a car to put gas in it, you buy it to go somewhere.
I’ve spent the last few weeks diving deep into something that’s been bothering me for years. Everyone talks about being “data-driven,” but when you actually look at what that means in practice, something doesn’t add up. Companies are knee-deep in data, wading in dashboards, drowning in reports, and yet… nothing changes.
So I went looking for examples. Real examples. Not “we implemented analytics and it was amazing” marketing fluff, but concrete cases where data actually improved outcomes. What I found was fascinating, and not at all what the analytics vendors want you to hear.
The Most Famous Data Mining Discovery Never Happened #
There’s a famous story in the analytics world: back in 1992, a retailer discovered that men buying diapers between 17:00 and 19:00 were also likely to buy beer. They placed beer next to diapers, sales exploded, and data analytics proved its worth.
Great story. Except it’s not true.
Here’s what actually happened: In 1992, Teradata analyzed 1.2 million market baskets from Osco Drug and found a correlation between beer and diaper purchases during that evening window. But they weren’t fishing blindly through data, hoping to find something useful. They were specifically investigating baby product correlations - they had a hypothesis about new parents’ shopping patterns and went looking for evidence.
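To make “hypothesis-driven” concrete: the arithmetic behind such a targeted check is almost embarrassingly simple. Here’s a minimal Python sketch - made-up baskets, emphatically not Teradata’s actual analysis - that asks about one pre-chosen pair of items instead of mining every combination:

```python
# Hypothesis-driven market basket check: we arrive with a specific
# question ("do diaper buyers also buy beer?") instead of scanning
# every possible item pair. Illustrative data, not the Osco analysis.

baskets = [
    {"diapers", "beer", "wipes"},
    {"beer", "chips"},
    {"diapers", "beer", "formula"},
    {"milk", "bread"},
    {"diapers", "wipes"},
]

def pair_stats(baskets, a, b):
    n = len(baskets)
    n_a = sum(1 for t in baskets if a in t)
    n_b = sum(1 for t in baskets if b in t)
    n_ab = sum(1 for t in baskets if a in t and b in t)
    support = n_ab / n            # how common is the combination overall?
    confidence = n_ab / n_a       # P(buys b | buys a)
    lift = confidence / (n_b / n) # > 1: the pair co-occurs more than chance
    return support, confidence, lift

support, confidence, lift = pair_stats(baskets, "diapers", "beer")
print(f"support={support:.2f} confidence={confidence:.2f} lift={lift:.2f}")
```

The point isn’t the math; it’s that “diapers” and “beer” were named before the query ran.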
And critically: the store never acted on the finding. No beer moved to the diaper aisle. No promotional campaign. Nothing. The entire legend is built on a hypothesis-driven analysis that led precisely nowhere.
The story persists because it fits a seductive narrative: collect enough data, apply sophisticated analytics, and valuable insights will emerge spontaneously. But that’s not what happened at Osco Drug, and it’s not what happens in successful data initiatives.
There’s nothing wrong with exploration. Exploration can be incredibly valuable - for generating hypotheses. But exploration becomes harmful when organizations mistake it for a plan.
Exploratory analysis answers the question:
“What might be interesting here?”
Hypothesis-driven analysis answers the question:
“What matters enough to test, measure, and act on?”
Exploration is wandering around on the instrument, looking for possible melodies. Hypothesis-driven work is choosing one melody and composing a piece around it. Hypothesis-first doesn’t mean limiting creativity; it means giving curiosity a direction.
Both matter, but they are not interchangeable.
The biggest mistake organizations make is confusing the two:
They treat exploration as the strategy (“let’s see what the data says”),
and treat hypotheses as optional (“we’ll figure out the action later”).
This is backwards. You don’t start by exploring a violin to see what noises it makes. You start by deciding what piece you want to play and then explore variations within that structure.
A simple way to put it:
Exploration without hypotheses → noise
Hypotheses without exploration → dogma
Exploration informed by hypotheses → insight → (possible) action
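The “noise” arm of that equation is easy to demonstrate. The sketch below generates pure random data - no signal whatsoever - and then does what “let’s see what the data says” amounts to in practice: scanning every metric for correlations. It reliably finds some:

```python
# Multiple-comparisons demo: scan enough unrelated metrics and some will
# look "significant" by chance alone. Everything below is pure noise.
# (statistics.correlation requires Python 3.10+.)
import random
import statistics

random.seed(42)
n_rows, n_metrics = 100, 200

outcome = [random.gauss(0, 1) for _ in range(n_rows)]
metrics = {f"metric_{i}": [random.gauss(0, 1) for _ in range(n_rows)]
           for i in range(n_metrics)}

# |r| > 0.2 on 100 points is roughly the p < 0.05 threshold - the usual
# "ooh, interesting, put it on a dashboard" bar.
hits = [name for name, values in metrics.items()
        if abs(statistics.correlation(outcome, values)) > 0.2]

print(f"{len(hits)} of {n_metrics} pure-noise metrics 'correlate' with the outcome")
```

Roughly 5% of pure-noise metrics clear the usual significance bar by chance - a handful of confident-looking “insights” conjured from literally nothing. A pre-registered test of one hypothesis against the same data keeps the false-positive rate at the 5% you actually signed up for.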
What Actually Works #
Let’s look at cases where data analytics actually drove business outcomes:
Netflix and House of Cards (2013) Netflix didn’t just collect viewing data and hope to stumble upon insights. They had a specific question: “What makes a successful original series?” They analyzed 30 million daily plays to test their hypothesis that viewers who liked David Fincher films and British political dramas would love a Fincher-directed remake of the British series House of Cards. They were right. The result: a $100 million commitment to a show where 86% of the subscribers who watched it said they were less likely to cancel their subscriptions.
UPS and ORION (2013-2016) UPS didn’t collect truck telemetry and wait for patterns to emerge. They started with a clear hypothesis: “If we optimize delivery routes and minimize left turns - which waste fuel idling at intersections - we can dramatically reduce costs.” They built ORION (On-Road Integrated Optimization and Navigation) specifically to test this. The result: $300-400 million in annual savings and 10 million gallons of fuel saved. (Not to mention the environmental impact.)
Harrah’s Entertainment (1998) Gary Loveman, a Harvard Business School professor turned Harrah’s COO, had a hypothesis that contradicted industry wisdom: frequent players were more valuable than high rollers. He implemented a Total Rewards program to test this, tracking customer behavior across properties. He was right. By 2001, Harrah’s captured 43% of customer gambling dollars, up from 36%. The company went from seventh to first in its market.
Capital One (1988-1994) Richard Fairbank and Nigel Morris hypothesized that customized credit card offers would outperform the industry’s one-size-fits-all approach. They convinced Signet Bank to let them test thousands of variations in interest rates, fees, and terms against different customer segments. They were right. By 1994, this “test and learn” strategy powered Capital One’s IPO at a $1.1 billion valuation.
Notice the pattern? Each started with:
- Deep understanding of their domain (entertainment, logistics, gaming, financial services)
- A specific hypothesis about what would drive outcomes
- Data collection designed to test that hypothesis
- Tools built to act on what they learned
They weren’t exploring. They were testing.
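And “testing” is often less glamorous than it sounds. Stripped to its core, it can be a two-proportion comparison: state the hypothesis, pick the metric, run the numbers, act. A minimal sketch with hypothetical retention figures (not Netflix’s actual method or data):

```python
# Hypothesis: subscribers who watched the show churn less than those who
# didn't. Cohort sizes and churn counts below are invented for illustration.
from math import erf, sqrt

def two_proportion_z(churned_a, n_a, churned_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = churned_a / n_a, churned_b / n_b
    pooled = (churned_a + churned_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Watchers: 180 of 10,000 cancelled (1.8%); non-watchers: 260 of 10,000 (2.6%).
z, p = two_proportion_z(180, 10_000, 260, 10_000)
print(f"z={z:.2f}, p={p:.4f}")  # small p: the retention gap is unlikely to be chance
```

In practice you’d also worry about self-selection - viewers of a show aren’t a random sample - which is exactly the kind of objection a written-down hypothesis forces you to confront.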
The Business Intelligence Problem #
“But wait,” you might say. “We have dashboards. We’re data-driven!”
Here’s a test: Look at your dashboards. For each metric, ask: “What decision does this inform? What action does someone take because of this number?”
If you can’t answer clearly, you’re not data-driven. You’re data-decorated.
There’s an entire industry built on creating dashboards that no one acts on. HashiCorp discovered they had over 500 dashboards across their organization. After auditing them, they found that most were either never viewed or never used to inform a decision. They reduced the number by 60% with no loss in operational effectiveness. Sometimes less really is more.
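An audit like that doesn’t require fancy tooling. Assuming you can export per-dashboard view counts from your BI platform and annotate each dashboard with the decision it informs (the data shape and field names below are hypothetical), the core logic fits in a dozen lines:

```python
# Dashboard audit sketch: flag dashboards that are never viewed or that
# nobody can tie to a decision. The records are hypothetical; most BI
# platforms expose per-dashboard view counts through their admin APIs.

dashboards = [
    {"name": "Exec KPI Overview",       "views_90d": 412, "decision": "set weekly staffing levels"},
    {"name": "Regional Sales v3 (old)", "views_90d": 0,   "decision": None},
    {"name": "Marketing Funnel",        "views_90d": 37,  "decision": None},
    {"name": "Ops Incident Board",      "views_90d": 980, "decision": "page on-call / roll back release"},
]

def audit(dashboards, min_views=10):
    """Keep a dashboard only if it is both viewed and tied to an action."""
    keep = [d["name"] for d in dashboards
            if d["views_90d"] >= min_views and d["decision"]]
    retire = [d["name"] for d in dashboards if d["name"] not in keep]
    return keep, retire

keep, retire = audit(dashboards)
print("keep:", keep)
print("retire or merge:", retire)
```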
“But dashboards work in some places!” Yes, they do. And those cases prove the point.
Healthcare dashboards often work brilliantly - not because healthcare is special, but because they embody hypotheses built from decades of clinical research. An ICU dashboard showing early warning scores operationalizes what we already know: specific vital sign combinations predict deterioration. When the score hits a threshold, the rapid response team mobilizes. A systematic review of 70 studies found that hospital dashboards reduce length of stay and improve patient satisfaction, but look closer: they’re built around validated clinical protocols, not exploratory data fishing.
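To see what “a dashboard that embodies a hypothesis” means mechanically, here’s a deliberately simplified early-warning-score sketch. The scoring bands are illustrative inventions, loosely shaped like published early warning scores but absolutely not a clinical tool:

```python
# Simplified early-warning score: each vital sign maps to points via
# bands derived from clinical research; a threshold triggers a defined
# action. The bands below are illustrative only - NOT a real protocol.

def band_score(value, bands):
    """bands: list of (upper_bound, points); first matching band wins."""
    for upper, points in bands:
        if value <= upper:
            return points

SCORING = {
    "resp_rate":   [(8, 3), (11, 1), (20, 0), (24, 2), (float("inf"), 3)],
    "heart_rate":  [(40, 3), (50, 1), (90, 0), (110, 1), (130, 2), (float("inf"), 3)],
    "systolic_bp": [(90, 3), (100, 2), (110, 1), (219, 0), (float("inf"), 3)],
}

def early_warning_score(vitals):
    return sum(band_score(vitals[k], bands) for k, bands in SCORING.items())

patient = {"resp_rate": 23, "heart_rate": 118, "systolic_bp": 96}
score = early_warning_score(patient)
print(f"score={score}")
if score >= 5:  # the threshold and the response are part of the hypothesis
    print("trigger rapid response review")
```

Every number in that table encodes a hypothesis someone validated before the dashboard existed; the dashboard merely operationalizes it.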
This hypothesis-first approach isn’t limited to healthcare. At Volvo Powertrain, researchers investigating dimensional variations explicitly required domain experts to define hypotheses before collecting data. They used structured workshops to identify which parameters were most likely causing quality issues, then collected targeted data to test those hypotheses. As they discovered, “without proper domain understanding, there is a risk of conducting complex analyses that only lead to the discovery of obvious or previously known patterns.”
The problem isn’t dashboards. The problem is dashboards built without domain expertise, without hypotheses, without action pathways. We’re playing notes without making music.
The “Go Fish” Problem: When Tools Become the Strategy #
Most data initiatives follow this sequence:
- Collect massive amounts of data
- Apply the latest technology (AI! Machine Learning! Neural Networks!)
- Hope to find something useful
- Figure out what to do with it
When you start with the tool, you’re optimizing for the wrong thing. You’re asking “what can this technology do?” instead of “what do we need to achieve?”
It’s like buying a really expensive hammer and then wandering around looking for things to hit. Sure, you might find some nails eventually. But you’ll also waste a lot of time hitting things that aren’t nails, and you’ll completely miss the problems that require a screwdriver.
The sequence that actually works:
- Understand your process deeply
- Form hypotheses about what drives outcomes
- Collect data designed to test those hypotheses
- Use tools to analyze that data
- Take action based on what you learn
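One way to keep that sequence honest is to make the hypothesis a first-class artifact: written down, with its metric and both decision branches, before anyone pulls data. The structure below is my own illustration, not an established framework:

```python
# Make the hypothesis a first-class artifact: metric, expected effect,
# and the action each outcome triggers - agreed before data is collected.
# The structure and the example values are illustrative.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    statement: str        # what we believe, stated falsifiably
    metric: str           # the one number that would confirm or refute it
    expected_effect: str  # direction and rough size we expect
    if_true: str          # the action we commit to if confirmed
    if_false: str         # the action we commit to if refuted

route_hypothesis = Hypothesis(
    statement="Minimizing left turns reduces per-route fuel consumption",
    metric="gallons of fuel per 100 delivery stops",
    expected_effect="meaningful reduction on optimized routes",
    if_true="roll out the routing rules to the full fleet",
    if_false="drop the turn rules; investigate idling and load factors next",
)

print(route_hypothesis.statement)
```

The if_false branch is the tell: if you can’t name an action for a negative result, you aren’t testing a hypothesis - you’re decorating one.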
Netflix didn’t say “let’s collect all this viewing data and see what machine learning finds.” They said, “We want to create original content that people will love - what data would tell us what to make?”
UPS didn’t say, “Let’s apply AI to our logistics and see what happens.” They said, “We’re spending too much on fuel - how can we optimize routes to reduce that?”
This tool-first thinking is why so many “AI initiatives” and “data transformation programs” fail to deliver value. Not because the technology doesn’t work, but because the organization never articulated what success would look like in business terms.
What Comes Next #
So we’ve diagnosed the problem: organizations are obsessed with instruments instead of music. They’re collecting data, hoping insights will magically appear. They’re building dashboards that show what happened without driving what happens next. They’re deploying AI because it’s cool, not because it solves a specific problem.
The successful initiatives we examined—Netflix, UPS, Harrah’s, Capital One—all shared something fundamental: they combined a deep understanding of their domain with imagination to form clear hypotheses, then had the courage to act on what they learned. Understanding, imagination, and courage—that’s what separates making music from making noise.
But knowing what’s wrong is only half the battle. The harder question is: how do you actually build a data practice that embodies these principles? How do you create an organization where understanding informs imagination, and imagination drives courageous action?
That’s what we’ll tackle in Part 2: the practical steps, the cultural shifts, and the mechanisms you can put in place to ensure your data initiatives actually drive business outcomes.
Join the Conversation #
What’s your experience with data-driven initiatives? Have you seen reports that actually led to change? I’d love to hear your thoughts. Reach out to me or comment on LinkedIn or BlueSky!
References #
On the beer and diapers myth:
- The Register: The parable of the beer and diapers - Daniel Power’s investigation tracing the story back to 1992 Teradata work with Osco Drug
- Power, D.J. (2002). “Decision Support Systems: Frequently Asked Questions”, DSSResources.com
Netflix and House of Cards:
- Carr, D. (2013). “Giving Viewers What They Want”, The New York Times
- Madrigal, A. (2014). “How Netflix Reverse Engineered Hollywood”, The Atlantic
UPS ORION:
- UPS. “On-Road Integrated Optimization and Navigation (ORION)”
- Bensinger, G. (2014). “The Logistics of E-Commerce”, Wall Street Journal
Harrah’s Total Rewards:
- Loveman, G. (2003). “Diamonds in the Data Mine”, Harvard Business Review
- Davenport, T.H. & Harris, J.G. (2007). Competing on Analytics: The New Science of Winning, Harvard Business Press
Capital One’s information-based strategy:
- Stewart, T.A. (1996). “Managing in a Wired Company”, Fortune
- Clemons, E.K. & Thatcher, M. (1998). “Capital One: Exploiting an Information-Based Strategy”, Wharton School Case Study
HashiCorp dashboard reduction:
- Sigma Computing: How HashiCorp eliminated dashboard sprawl - Case study on reducing 500+ dashboards by 60%
On the strategy-execution gap:
- Brightline Initiative: Closing the Gap - Research on why strategic initiatives fail during execution
- Project Management Institute (2018). “Success Rates Rise: Transforming the high cost of low performance”
Hospital dashboards and clinical decision support:
- Valero-Elizondo, J. et al. (2025). “Association of Clinical Dashboards and Operational Metrics”, JAMIA Open - Systematic review of 70 studies showing dashboard effectiveness in healthcare
Manufacturing data analytics:
- Lundén, N. (2022). “Implementing data analytics for improved quality in manufacturing: a case study”, Chalmers University of Technology Master’s Thesis - Adapting CRISP-DM methodology for Volvo Powertrain
- Vilarinho, S. et al. (2018). “Developing dashboards for SMEs to improve performance of productive equipment and processes”, Journal of Industrial Information Integration
On data contracts and data mesh:
- Dehghani, Z. (2022). Data Mesh: Delivering Data-Driven Value at Scale, O’Reilly Media
- ThoughtWorks: Implementing Data Contracts - Practical guidance on data contracts for distributed data architectures
Image by Franz P. Sauerteig from Pixabay