The Story: A Perfect Problem That Wasn’t
A few years ago, a client asked my colleague and me to review their problem-solving process.
One team presented what looked like an airtight case about a critical safety issue: a steel shaft had flown out of a lathe and nearly struck an operator.
Their presentation was confident and polished.
The 5 Why chart was complete, the logic linear.
A chuck jaw had fractured, so the answer seemed clear — install a sturdier chuck with larger jaws.
Consensus reached.
Case closed.
Until we went to gemba.
There were four similar lathes in that production area, yet the incident had occurred only on one — and it happened twice on that same machine.
That inconsistency bothered us.
Talking with the operator revealed something stranger: the failure occurred during the finishing pass, when cutting forces are lowest.
If weak jaws were the cause, they should have failed during the heavy rough-cut stage, not the light finishing cut.
A quick conversation with a mechanical engineer confirmed our suspicion — cutting loads on the larger-diameter shaft were unchanged from previous runs.
The math didn’t support the theory.
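A rough back-of-the-envelope check makes the mismatch concrete. The sketch below uses purely illustrative values (not the shop's actual data) and the standard approximation that tangential cutting force scales with specific cutting force, depth of cut, and feed; the point is only that a finishing pass loads the jaws far less than a roughing pass.

```python
# Back-of-the-envelope check with illustrative values (not the shop's data):
# tangential cutting force is roughly k_c * a_p * f
# (specific cutting force x depth of cut x feed per revolution).

def cutting_force_n(k_c_n_per_mm2, depth_of_cut_mm, feed_mm_per_rev):
    """Approximate tangential cutting force in newtons."""
    return k_c_n_per_mm2 * depth_of_cut_mm * feed_mm_per_rev

k_c = 2500.0  # N/mm^2, a typical order of magnitude for steel (assumed)

rough = cutting_force_n(k_c, depth_of_cut_mm=3.0, feed_mm_per_rev=0.30)    # ~2250 N
finish = cutting_force_n(k_c, depth_of_cut_mm=0.3, feed_mm_per_rev=0.08)   # ~60 N

print(f"rough pass ≈ {rough:.0f} N, finish pass ≈ {finish:.0f} N")
# If jaw strength were the limit, the far heavier rough cut should have broken them first.
```

Whatever the exact numbers on that machine, the ratio points the same way: the lightest cut of the cycle is the least likely moment for a strength failure.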
Then a clue surfaced.
The operator casually mentioned that this particular machine leaked hydraulic oil — a lot.
The maintenance lead sighed, “That one’s a guzzler.”
That word changed everything.
This was an older lathe design that used a rotary hydraulic distributor to supply oil pressure to the chuck while the spindle rotated.
At higher spindle speeds, centrifugal force acts outward on the jaws, reducing their effective clamping force.
To maintain grip, the system must sustain stable hydraulic pressure throughout the cycle.
But a small cut in one hydraulic hose allowed oil to leak, and the circuit design lacked any feedback or mechanical lock.
During the high-RPM finishing pass, oil loss dropped pressure below the safe limit.
The jaws relaxed just enough for the part to slip.
At that rotational speed, the shaft released violently, breaking a jaw on its way out.
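A simplified model shows why the finishing pass, not the rough cut, was the danger zone. All values below are assumed for illustration only (jaw mass, radius, and force-per-bar are not from the incident): grip rises with hydraulic pressure but falls with the square of spindle speed, so a pressure loss that is harmless at roughing RPM can erase the grip margin at finishing RPM.

```python
import math

# Simplified illustration of a hydraulic power chuck (all values assumed):
# effective grip = pressure-driven clamping force - centrifugal loss of the jaws.

def effective_grip_kn(pressure_bar, rpm):
    jaw_mass_kg = 1.5       # mass of one jaw assembly (assumed)
    jaw_radius_m = 0.10     # radius of the jaw's centre of mass (assumed)
    jaws = 3
    force_per_bar_kn = 0.9  # clamping force gained per bar of pressure (assumed)

    omega = rpm * 2 * math.pi / 60.0  # spindle speed in rad/s
    centrifugal_kn = jaws * jaw_mass_kg * jaw_radius_m * omega**2 / 1000.0
    return force_per_bar_kn * pressure_bar - centrifugal_kn

for label, pressure, rpm in [("rough cut, full pressure", 40, 800),
                             ("finish pass, full pressure", 40, 2000),
                             ("finish pass, leaked-down", 25, 2000)]:
    print(f"{label:>28}: ≈ {effective_grip_kn(pressure, rpm):5.1f} kN grip")
```

The exact figures do not matter; the shape of the relationship does. Centrifugal loss grows with the square of spindle speed, so the high-RPM finishing pass is exactly where a leaking hydraulic circuit bites hardest.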
The real problem was loss of clamping pressure — a design weakness compounded by leakage — not insufficient jaw strength.
The correct countermeasures were to repair the leak, redesign the hydraulic circuit with better valves, and add a Jidoka interlock to automatically stop the cycle when pressure fell below standard.
It was an obvious solution in hindsight but far from obvious in the meeting room.
The other lathes worked because they didn’t leak — not because the design was sound.
Modern lathes have largely eliminated this risk through mechanical-locking chucks, pressure sensors, and automatic interlocks that halt the cycle if clamping pressure drops below standard.
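As a rough illustration of what such an interlock does, the sketch below is a simplified stand-in, not any machine builder's actual control logic: it simply compares measured clamping pressure against the standard on every step and stops the cycle the moment the standard is not met. The threshold and readings are hypothetical.

```python
# Simplified sketch of a clamping-pressure interlock (illustrative only;
# real machines implement this in the PLC/CNC safety logic, not application code).

MIN_CLAMP_PRESSURE_BAR = 35.0  # standard from the process sheet (assumed value)

def clamping_pressure_ok(measured_bar):
    """True only while measured clamping pressure meets the standard."""
    return measured_bar >= MIN_CLAMP_PRESSURE_BAR

def run_cycle(pressure_readings):
    """Abort as soon as any reading drops below standard (Jidoka: stop and surface the problem)."""
    for step, pressure in enumerate(pressure_readings):
        if not clamping_pressure_ok(pressure):
            raise RuntimeError(
                f"Clamp pressure {pressure:.1f} bar below standard at step {step}: cycle stopped")
        # ... perform the next machining step here ...

# Example: pressure decays as oil leaks during the cycle.
try:
    run_cycle([40.0, 39.2, 36.5, 33.8])
except RuntimeError as stop:
    print(stop)  # the cycle halts and alerts the operator instead of ejecting the part
```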
Safety and problem-solving maturity often evolve together.
It was a humbling experience.
Everything looked right in the conference room — charts, consensus, and logic — but none of it matched reality.
That single case illustrated nearly every pitfall we see in problem definition: jumping to solutions before seeing, thinking fast instead of verifying, filling out tools without facts, mistaking agreement for evidence, and skipping the AQD discipline that prevents them all.
Pitfall #1 — Premature Solutions: Solving Before Seeing
The team had already ordered upgraded chucks before verifying the cause.
A month later, the real fix cost a few dollars in seals and tubing.
This is the most common pattern across industries:
action before understanding.
Dashboards demand progress; leaders reward quick closure.
But countermeasures built on weak definitions always crumble.
Coaching cue: Contain the symptom if you must, but never confuse containment with correction.
Pitfall #2 — Fast Thinking: The System 1 Trap
The team’s first explanation felt right.
That’s System 1 — our brain’s fast, intuitive mode — taking over.
System 2 thinking — slow, analytical, and effortful — never got a chance.
They leapt from symptom to solution in minutes, confident they’d seen this before.
Had we not gone to gemba, the plant would have spent thousands on stronger chucks that solved nothing.
Coaching cue: Speed of action is not speed of learning. Slow down to verify.
Pitfall #3 — Tool Use Without Seeing
The team’s original 5 Why analysis looked solid on paper:
- Why did the part fly out? → Because the chuck jaw broke during machining.
- Why did the chuck jaw break? → Because the jaw material was too weak for the new, larger shaft.
- Why was the jaw material too weak? → Because the chuck design wasn’t updated when the new part size was introduced.
- Why wasn’t the chuck design updated? → Because engineering assumed existing chucks were sufficient for all part sizes.
- Why did engineering assume that? → Because there was no formal design review process for safety-critical tooling changes.
The conclusion seemed perfectly logical:
Purchase and install a more robust chuck and alter the design review process going forward.
The team’s analysis was neat, convincing, and unfortunately wrong.
They had built a paper trail of logic around a mechanical assumption, not a physical fact.
The jaws didn’t fail because they were weak; they failed because hydraulic pressure dropped.
It looked plausible because it fit a familiar story: mechanical failure traced back to an administrative process gap.
But the analysis was done in the meeting room, not at the machine.
The reasoning chain was tidy — and detached from the actual physics.
A perfect 5 Why on paper is still wrong if it never touches reality.
Coaching cue: Tools don’t solve problems; they reveal how clearly you’ve seen them.
Each “why” must survive contact with evidence.
Pitfall #4 — Meeting-Room Consensus Instead of Gemba Facts
Everyone in the room agreed — and that was the problem.
Consensus created confidence that replaced curiosity.
No one asked, “Why only one machine?” or “When in the cycle did it happen?”
Agreement became a substitute for data.
Only when we walked to the floor, spoke with the operator, and looked under the machine did the truth appear — a thin oil film glistening under fluorescent light.
Lesson: Gemba beats groupthink every time.
You can’t define reality from across the table.
Pitfall #5 — Poor Definition: Failing the AQD Test
The team’s initial statement — “Jaw failure on new shaft – need stronger chuck” — failed every element of AQD:
- Analyze: They never asked why it happened on one machine, not all four.
- Quantify: No data on hydraulic pressure or cutting force.
- Detail: No direct observation of the event or point of cause.
A better problem statement would have been:
“Loss of clamping pressure on lathe #4 during finishing pass causes part ejection twice in 30 days; hydraulic oil pressure below standard at time of failure.”
Coaching cue: If others can’t go see the same condition and measure the same gap, it isn’t defined yet.
A Note on Problem Definition vs. Root Cause
Some readers might argue that describing the issue as “loss of clamping pressure” steps into root cause territory. The two can appear to overlap at times.
In this case, it represents the last layer of problem definition — a clear, measurable condition that defines the gap from standard.
The question of why that pressure was lost comes next.
This distinction matters because vague definitions (“Safety incident – shaft ejected from machine”) stop the analysis too soon.
Clear, factual definitions like “loss of clamping pressure during finishing pass” lead naturally to a truer 5 Why.
Good definitions don’t replace root cause analysis — they make it possible.
The Pattern Behind the Pitfalls
Each misstep shares the same root: certainty outpacing clarity.
Humans crave closure; we’d rather have a fast answer than an accurate one.
That’s why the discipline of Analyzing, Quantifying, and Detailing (AQD) is so valuable.
It forces us into System 2 thinking — to pause, verify, and let reality speak.
When we finally looked beneath that lathe and saw hydraulic oil dripping onto the floor, the real problem defined itself.
Solutions no longer needed debate.
Summary: Recognizing the Pattern
| Pitfall | Root Cause | Coaching Countermeasure |
|---|---|---|
| Premature Solutions | Pressure for speed | Define before fixing; clarity first. |
| Fast Thinking | System 1 dominance | Slow the conversation; verify before agreeing. |
| Tool Use Without Seeing | Treating forms as thinking | Test each “why” at gemba. |
| Consensus Culture | Agreement over evidence | Go and see; confirm physically. |
| Poor Definition (AQD) | Lack of structured analysis | Ask for analysis, measurement, and point of cause. |
Coaching Point
Every organization has its “flying-shaft” story — a moment when logic, consensus, and tools aligned perfectly but reality disagreed.
The cure isn’t smarter people or fancier templates.
It’s the discipline to stop, go see, and let the facts define the problem before the solution does.