The Amateur Orchestra, Part 2: How to Make Music Instead of Noise

In Part 1, we explored why most data initiatives fail: organizations are obsessed with instruments (tools, dashboards, AI) instead of music (business outcomes). We saw that every successful data initiative - from Netflix’s House of Cards to UPS’s route optimization - followed the same pattern:

  1. Deep understanding of the business process
  2. Imagination to form a hypothesis
  3. Courage to test and act on results

The pattern was clear: they started with a problem, formed a hypothesis about how to solve it, collected specific data to test that hypothesis, and then actually changed what they were doing based on what they learned.

But knowing what’s broken is only half the battle. The harder question is: how do you actually build this into your organization? How do you create a culture that makes music instead of noise?

What Actually Matters: Imagination Applied to Business Knowledge
#

1. Understanding Your Domain

You can’t form useful hypotheses without understanding the process you’re trying to improve. This means talking to people who actually do the work—not just reading reports about it.

When Gary Loveman transformed Harrah’s, he didn’t start with data. He started by understanding how casinos actually make money, which customers drive value, and what behaviors matter. The data came later, designed to test his hypotheses.

The Volvo researchers didn’t just collect all available machine data and hope for patterns. They brought together production engineers who understood the manufacturing process, operators who knew the machines intimately, and maintenance specialists who understood failure modes. Together, they formed hypotheses about which parameters (temperature? spindle load? tool wear?) most likely caused dimensional variations. Only then did they collect data.

This is like learning an instrument before joining an orchestra. You need to know what notes are possible before you can play music.

This isn’t just for giants with unlimited resources. A 2018 study of small and medium manufacturers found that effective dashboards share a common characteristic: they’re built on continuous improvement methodologies like kaizen and Total Productive Maintenance (TPM). The researchers noted that dashboard development must consider “the level of quality maturity of the company”—in other words, you can’t just copy someone else’s dashboard. You need to understand your own processes first, then build the dashboard to support those processes. The dashboard’s purpose, they emphasized, is “to turn information into knowledge, plans, and actions which promote effective shop floor activity.” Knowledge → Plans → Actions. Not data → dashboards → hope.

2. Imagination to Form Hypotheses

Understanding alone isn’t enough. You need imagination to form hypotheses about what might work.

Richard Fairbank’s insight at Capital One wasn’t obvious. The entire industry treated credit cards as a commodity—same rates, same terms for everyone. His hypothesis that customization would win required imagination. It contradicted conventional wisdom. But he had enough understanding of consumer behavior and enough imagination to see a different possibility.

This is like composing music: you need to know which notes work together (understanding) and envision something that doesn’t exist yet (imagination).

3. Courage to Act

The hardest part isn’t forming hypotheses. It’s acting on them.

Netflix committed $100 million to House of Cards based on their analysis—before shooting a single scene. That takes courage. UPS invested in ORION, knowing it would face resistance from drivers who’d been doing routes their own way for decades. Harrah’s rebuilt its entire business model around Loveman’s hypothesis.

Data doesn’t make decisions. People do. And most people find reasons not to act: the data isn’t perfect, we need more analysis, it’s too risky, let’s wait.

The Portsmouth Sinfonia had all the instruments but couldn’t make music because they lacked the fundamental skills. Most organizations have all the data but can’t drive outcomes because they lack understanding, imagination, or courage.

Usually all three.

From Insight to Action: Reports as Persuasion
#

But even with understanding, imagination, and courage, data initiatives still fail. Why?

Most people think about reports incorrectly. They treat them as neutral information delivery mechanisms—“here’s what the data says, now you decide.”

But that’s not how good analysis works.

Every report makes an argument, whether you intend it to or not. The question is whether you’re making that argument consciously and well, or unconsciously and poorly.

Think back to Netflix’s House of Cards analysis. They weren’t just “presenting data.” They were making a high-stakes argument: “Based on viewing patterns, here’s why a $100 million bet on David Fincher directing Kevin Spacey in a political drama will work.” The data was evidence supporting a recommendation that required courage to act on.

This is where understanding and imagination meet courage. You’ve understood your domain deeply enough to form a hypothesis. You’ve used imagination to envision what success looks like. Now you need to persuade someone to act. That’s what good reports do.

Think of them like musical compositions:

  • Lead with the conclusion - the melody people will remember
  • Support it with evidence - the harmony that makes the melody richer
  • Anticipate objections - the rhythm that propels the piece forward
  • Make the next action clear - a satisfying resolution that tells the orchestra exactly what to play

Bad reports dump all the notes on the page and hope something coherent emerges. Good reports are structured to create impact and drive decisions.

When you treat reports as persuasion, you think differently about what to include. You’re not showcasing your analytical prowess. You’re building a case for action that acknowledges competing priorities, addresses reasonable concerns, and makes it easy for someone to say yes.

The Implementation Gap
#

Even brilliant analysis often goes nowhere. Why?

The Brightline Initiative found that most strategies fail during execution, not conception. Similarly, most data insights fail not because the analysis was wrong, but because nothing happened afterward.

Four common reasons:

1. Nobody Owned the Outcome

Your report identifies a problem. Great. Who’s responsible for fixing it? If the answer is unclear, nothing will happen.

In an orchestra, the conductor is responsible for the music. In data initiatives, someone needs to be responsible for the outcome—not just for “doing the analysis.”

2. The Recommendation Required Too Much Change

Your analysis says: “We should completely restructure how we operate.” Even if you’re right, that’s not actionable. It’s like telling the Portsmouth Sinfonia: “You should all become virtuosos.” True, but not helpful.

Better: identify the smallest change that would have a meaningful impact. Then build from there.

3. No Clear Next Action

“We should improve customer satisfaction” isn’t actionable. “We should respond to support tickets within 4 hours instead of 24” is actionable.

Sheet music doesn’t say “play beautifully.” It says “F sharp, quarter note, forte.”

4. Analysis Without Context

You present data showing a problem. But you haven’t shown why it matters more than the fifty other problems everyone’s dealing with.

An orchestra score doesn’t include every possible note. It includes only the notes that create the specific piece of music you’re trying to play. Similarly, your analysis needs to show not just what you found, but why it matters more than everything else competing for attention.

Putting It Together: Hypothesis-First at Volvo
#

This all sounds good in theory, but how does it work in practice? Let’s look at a real example that demonstrates all these principles working together.

When Volvo Powertrain tackled dimensional variations in machined holes, they faced a classic data problem: too many potential causes, too much possible data to collect, and pressure to “just start analyzing something.”

Instead, they took the hypothesis-first approach. They formed a cross-functional team—production engineers, machine operators, maintenance specialists—who together understood the domain deeply. Through structured workshops using Ishikawa diagrams, they systematically formed hypotheses about what might cause variations: temperature, spindle load, and tool wear. Only after forming clear hypotheses did they collect data tied to those specific parameters.

They modified their standard data science methodology to require domain expertise upfront rather than treating it as something to “consult” during analysis. As they discovered, “without proper domain understanding, there is a risk of conducting complex analyses that only lead to the discovery of obvious or previously known patterns.”

The result: instead of drowning in data or building dashboards that showed everything but revealed nothing, they systematically tested specific hypotheses. They could act on what they learned because they knew exactly what they were testing and why it mattered.

This is what it looks like to make music instead of noise.

Practical Advice
#

So what should you actually do?

1. Start with one specific decision that matters.

Before collecting anything, ask: “What decision are we trying to inform? What would we do differently based on what we learn?”

If you can’t answer clearly, you’re not ready to collect data. You’re ready to have more conversations about what actually matters. Don’t try to become “data-driven” across your entire organization. Pick one decision. Understand it deeply. Form a hypothesis. Test it. Netflix didn’t start with House of Cards—they started with understanding streaming behavior. UPS piloted ORION. Capital One started small and learned.

2. Involve people who understand the work.

Data analysts can tell you what the data says. But people who do the work can tell you what it means and whether it’s worth acting on.

The best analysis happens at the intersection of analytical skills and domain expertise. If you keep those separate, you get either rigorous analysis of irrelevant questions or relevant questions with sloppy analysis.

3. Make your hypothesis explicit.

Write it down. “We believe that if we do X, Y will happen because Z.”

This does two things: First, it forces clarity. Second, it makes it possible to learn. If you’re not explicit about your hypothesis, you can’t tell whether you were right or wrong. You’ll just move on to the next thing without learning.
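One lightweight way to force that clarity is to capture each hypothesis as a structured record rather than a sentence buried in a slide deck. Here's a minimal sketch in Python; the field names and example values are illustrative, not any kind of standard:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """An explicit, testable hypothesis: 'If we do X, Y will happen because Z.'"""
    action: str            # X: the change we will make
    expected_outcome: str  # Y: the measurable result we predict
    rationale: str         # Z: why we believe the causal link exists

    def as_statement(self) -> str:
        return (f"We believe that if we {self.action}, "
                f"{self.expected_outcome}, because {self.rationale}.")

# Example using the support-ticket scenario from earlier in this post
# (the outcome and rationale here are invented for illustration)
h = Hypothesis(
    action="respond to support tickets within 4 hours instead of 24",
    expected_outcome="repeat tickets will drop",
    rationale="follow-up tickets cluster around slow first responses",
)
print(h.as_statement())
```

Writing it as a record instead of prose means you can't skip a field. A hypothesis with an empty `rationale` is a guess, and now everyone can see that.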

4. Build feedback loops.

How will you know if your hypothesis was right? How will you know if the action worked?

Capital One’s entire competitive advantage was its “test and learn” approach—they built tight feedback loops to see what worked. Netflix constantly monitors viewing patterns to refine its content strategy. UPS measures fuel savings religiously.

The Portsmouth Sinfonia probably never got much better because they had no feedback mechanism beyond audience reaction. Build in ways to measure whether you’re actually making music.
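A feedback loop can be as simple as comparing the metric you measured after the change against a success threshold you agreed on before the test ran. A sketch, with made-up numbers:

```python
# A minimal feedback loop: the verdict is computed against a threshold
# that was pre-registered before the test, so nobody can move the goalposts.

def evaluate_hypothesis(baseline: float, measured: float, min_improvement: float) -> str:
    """Return an explicit verdict instead of letting the result quietly disappear."""
    improvement = (measured - baseline) / baseline
    if improvement >= min_improvement:
        return f"confirmed: {improvement:.1%} improvement (target {min_improvement:.1%})"
    return f"rejected: {improvement:.1%} improvement (target {min_improvement:.1%})"

# Pre-registered expectation: at least a 5% lift in the target metric
print(evaluate_hypothesis(baseline=0.40, measured=0.46, min_improvement=0.05))
# → confirmed: 15.0% improvement (target 5.0%)
```

The point isn't the arithmetic. It's that the loop ends in an explicit "confirmed" or "rejected," which is what turns a test into learning.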

5. Use data contracts to enforce discipline.

Here’s a technique from software engineering that works brilliantly for data initiatives: before collecting data or building a dashboard, create a “data contract” that specifies:

  • Business question: What decision does this inform?
  • Hypothesis: What do we believe will be true?
  • Owner: Who’s accountable for acting on this data?
  • SLA: How fresh, accurate, and complete does it need to be?
  • Success criteria: How will we know if our hypothesis was right?

Think of this as sheet music for data work. It tells every musician (analyst, engineer, business owner) exactly what to play and when. Without it, everyone’s improvising, and the result is noise.

If stakeholders can’t fill out a data contract, they’re not ready for the data. This isn’t bureaucracy—it’s discipline. It’s the difference between playing notes and making music.
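As a sketch, the contract above can be encoded as a plain data structure with a readiness check built in. The field names and example values below are illustrative; they aren't part of any formal data-contract specification:

```python
from dataclasses import dataclass, fields

@dataclass
class DataContract:
    """The five fields from the checklist above."""
    business_question: str  # What decision does this inform?
    hypothesis: str         # What do we believe will be true?
    owner: str              # Who is accountable for acting on this data?
    sla: str                # Freshness / accuracy / completeness requirements
    success_criteria: str   # How will we know the hypothesis was right?

    def missing_fields(self) -> list[str]:
        """Stakeholders who can't fill these in aren't ready for the data."""
        return [f.name for f in fields(self) if not getattr(self, f.name).strip()]

contract = DataContract(
    business_question="Should we prioritize faster support response times?",
    hypothesis="Responding within 4 hours instead of 24 reduces repeat tickets",
    owner="",  # nobody has claimed this outcome yet
    sla="Ticket data refreshed daily, complete within 24 hours",
    success_criteria="Repeat-ticket rate drops within one quarter",
)
print(contract.missing_fields())  # → ['owner']: this contract isn't ready
```

An empty `owner` field surfacing before any data is collected is exactly the discipline the contract exists to enforce: the first failure mode above ("nobody owned the outcome") gets caught on paper instead of six months into a dashboard project.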

The Meta-Lesson
#

The deepest lesson from the Portsmouth Sinfonia isn’t about music. It’s about confusing tools with outcomes.

They thought having instruments made them an orchestra. Many organizations think having data makes them data-driven. But instruments don’t make music any more than data makes decisions.

What makes music? Musicians who understand their instruments, can read sheet music, have practiced their parts, and play together toward a shared goal.

What makes good decisions? People who understand their domain, form clear hypotheses, collect relevant data, use appropriate tools, and have the courage to act.

The tools matter. But they’re not what matters most. Your data initiative will succeed not because you have sophisticated tools, but because you have understanding, imagination, and courage.

Start there. The tools come later.

And remember: before you can make music, you need to know what music you’re trying to make.


Join the Conversation
#

What’s your experience with data-driven initiatives? Have you seen reports that actually led to change? I’d love to hear your thoughts. Reach out to me or comment on LinkedIn or BlueSky!

And if you missed it, check out Part 1 where we diagnosed the problem and explored why most data initiatives fail.


References
#

Netflix and House of Cards:

UPS ORION:

Harrah’s Total Rewards:

Capital One’s information-based strategy:

On the strategy-execution gap:

On data contracts and data mesh:

  • Dehghani, Z. (2022). Data Mesh: Delivering Data-Driven Value at Scale, O’Reilly Media - Foundation for data contracts and ownership models
  • ThoughtWorks: Implementing Data Contracts - Practical guidance on data contracts for distributed data architectures

Manufacturing data analytics:

Image by HeungSoon from Pixabay