The Problem With the Problem: A Deep Dive Into Wicked Problems

In 1973, two planners named the thing that makes hard problems impossible to solve with conventional tools. But they stopped one layer too shallow. The real challenge is not just that wicked problems resist solutions — it is that they resist definition. That is a different, deeper kind of trap.

Horst Rittel and Melvin Webber's 1973 paper is one that most people in policy and systems thinking have heard of but far fewer have truly sat with. "Dilemmas in a General Theory of Planning" introduced the concept of wicked problems — a class of problem fundamentally different from the tame, well-defined problems that engineering and science are built to solve.

The core insight was precise: wicked problems have no definitive formulation, no stopping rule, no right or wrong answer (only better or worse), no test for a solution, and no second chance. Every intervention changes the system. Every solution is a one-shot operation with consequences that ripple in ways you cannot fully anticipate.

Healthcare, urban poverty, climate change, education — Rittel and Webber were pointing at these. They were right. But, fifty years on, it is clear they stopped one layer too shallow.

Because the deeper problem is not just that wicked problems are hard to solve. It is that they are hard to find.

When Complexity Is Felt But Cannot Be Named

There is a particular kind of discomfort that practitioners in complex systems learn to recognise. It is not the discomfort of knowing something is wrong and being unable to fix it. It is the prior discomfort: knowing something is wrong but being unable to say what.

The data looks reasonable. The processes appear sound. The people are competent. And yet — something is off. A seasoned clinician walks into a ward and feels it before they see it. A policy designer reads a proposal and knows, without being able to articulate why, that it will fail. A CEO sits with a strategy that checks every box and still cannot shake the sense that they are solving the wrong thing.

This is not intuition in the mystical sense. It is pattern recognition operating faster than language. The brain assembles signal from a thousand sources — tone, flow, absence, proportion — and produces a verdict before the conscious mind has caught up. The problem is that when someone asks you to justify the feeling, you cannot. Not yet. And in organisations built on evidence and explicit reasoning, not yet is often treated as not real.

This is the first trap. Complexity that cannot be articulated is complexity that gets dismissed.

Examples From the Mess

Take NHS waiting lists. The surface problem is visible: too many patients, not enough capacity, waiting times too long. So the solution seems obvious — increase throughput, add beds, hire staff, set targets.

But practitioners inside the system have been saying for years that something else is going on. The patients are getting more complex. Comorbidities are higher. The pathways designed for a different patient population are creating friction everywhere. Discharge is blocked not by beds but by the absence of social care infrastructure. The system is flowing badly not because it lacks capacity in the narrow sense, but because it was designed for a population that no longer exists.

This was felt before it was named. People working inside it sensed the category error — that the problem being solved was not the problem causing the pain. But the articulation was hard, because the data did not surface it cleanly. The data told you about waiting times, not about systemic misalignment.

Or take rural health infrastructure in India — a context we work in closely. The default diagnosis is resource scarcity: not enough doctors, not enough facilities, not enough money. The interventions follow from that diagnosis. Build more clinics. Train more community health workers. Deploy telemedicine.

And yet, villages in Rajasthan and Bihar that receive these interventions often do not see the outcomes the models predict. The resources arrive and something still does not move. Why? Because the actual problem is a layered thing: trust deficits built over decades of state failures, local power structures that route resources away from the people who need them, a design philosophy that treats communities as recipients rather than agents.

None of that shows up in the resource scarcity data. Someone had to feel that the interventions were not landing before they could begin to explain why. The diagnosis of scarcity was real. It just was not the whole problem — and in some cases it was obscuring the deeper one.

Or consider mental health systems in the UK, where the official problem is access — not enough therapists, waits too long, too few crisis beds. Invest in access, the logic goes, and outcomes improve. But clinicians working in community mental health have been troubled for years by something they struggled to name: that many of the people cycling through the system were not getting better even when they reached it. That the model of care — episodic, time-limited, deficit-focused — was not matched to the nature of the conditions it was treating. That access had become the proxy metric for a system that had lost sight of the actual question.

In each case, the person closest to the problem felt the wrongness before the system could describe it. And in each case, the gap between the felt sense and the articulable diagnosis was where the real intellectual work lived — and where most organisations stopped.

The Kafkaesque Bind

Kafka wrote characters trapped in systems they cannot understand, governed by rules they cannot see, charged with offences they cannot name. The horror is not violence. It is opacity. The protagonist reaches for clarity and finds more fog.

This is precisely the experience of trying to diagnose a wicked problem from inside a complex system.

You raise a concern and are asked for evidence. The evidence you need to gather requires first knowing what to look for — which requires having already named the problem. You propose a solution and are told it needs a business case. The business case requires a clear problem statement. The problem statement requires data. The data was collected for a different question. Round and round.

Meanwhile, the system keeps running. The wrong interventions accumulate. Each one makes the system slightly more brittle, slightly more optimised for a problem that is not the real one. And the felt sense — that something is fundamentally off — gets quieter, not because it was wrong, but because people stop saying it out loud when no one knows what to do with it.

This is not a failure of intelligence. It is a structural feature of complex adaptive systems. The problem and the problem-framing apparatus evolved together. You cannot easily see the frame from inside it.

Finding the Problem From the Mess

Rittel and Webber were right that wicked problems resist solution. But the work that precedes solution — the work of problem-finding — has its own disciplines, and they are largely absent from how organisations are built.

The first discipline is treating felt discomfort as data. Not as evidence, not as proof — but as signal worth pursuing. When practitioners with deep system experience say something is wrong, the appropriate response is not to ask for a clean articulation. It is to create conditions where the inchoate can be explored without having to justify itself prematurely.

The second is what we might call problem archaeology — working backwards from interventions that have not worked to ask what problem they were implicitly solving, and whether that problem was real. Systems accumulate the sediment of past framings. The waiting list target was solving a political problem. The triage protocol was solving a liability problem. The reporting framework was solving a funding problem. Each made sense at the time. Layered on top of each other, they now constitute the environment in which the actual health challenge is trying to be addressed.

The third is holding multiple competing problem framings simultaneously — not to resolve them too quickly, but to let them illuminate each other. The NHS capacity problem and the NHS system-design problem are both real. The tension between them is where the more interesting diagnosis lives.

None of this is comfortable. It requires organisations to tolerate uncertainty for longer than they want to. It requires leaders to say we do not yet know what we are dealing with — which is existentially difficult when you are accountable for results and measured on outputs.

What This Means for Systems Change

Rittel and Webber's original contribution was to say: stop treating these problems as tame. The engineering mindset — define, analyse, solve — is not just insufficient here. It is actively harmful, because it produces confident interventions that make the system harder to diagnose over time.

We would go one step further. The problem before the problem is legibility. Before you can say a problem is wicked, you have to be able to say it is a problem. And the most consequential initiatives in health systems, in development programmes, in organisational strategy do not fail at the solution stage. They fail at the stage where someone felt something was off — and the system had no way to honour that.

The Kafkaesque bind is not just an analogy. It is the actual operational experience of people trying to do serious work in complex environments. The door to the law is open, but there is always another door behind it. The charge is real, but it cannot be read aloud.

The way out is not better analysis, though analysis matters. It is building cultures and systems and practices that take the inarticulate seriously — that treat the gap between felt complexity and named complexity as the site of the most important work, not as a problem of communication to be tidied up.

Healthcare will not be fixed by solving the problems we have already named. It will be changed by people with the courage and the method to find the ones we have not.