Arnold Kling points to a paper by Horst W. J. Rittel and Melvin M. Webber, “Dilemmas in a General Theory of Planning” (Policy Sciences, June 1973). As Kling says, the paper is “notable for the way in which it describes — in 1973 — the fallibility of experts relative to technocratic expectations”.
Among the authors’ many insights are these about government planning:
The kinds of problems that planners deal with — societal problems — are inherently different from the problems that scientists and perhaps some classes of engineers deal with. Planning problems are inherently wicked.
As distinguished from problems in the natural sciences, which are definable and separable and may have solutions that are findable, the problems of governmental planning — and especially those of social or policy planning — are ill-defined; and they rely upon elusive political judgment for resolution. (Not “solution.” Social problems are never solved. At best they are only re-solved — over and over again.) Permit us to draw a cartoon that will help clarify the distinction we intend.
The problems that scientists and engineers have usually focused upon are mostly “tame” or “benign” ones. As an example, consider a problem of mathematics, such as solving an equation; or the task of an organic chemist in analyzing the structure of some unknown compound; or that of the chessplayer attempting to accomplish checkmate in five moves. For each the mission is clear. It is clear, in turn, whether or not the problems have been solved.
Wicked problems, in contrast, have neither of these clarifying traits; and they include nearly all public policy issues — whether the question concerns the location of a freeway, the adjustment of a tax rate, the modification of school curricula, or the confrontation of crime….
In the sciences and in fields like mathematics, chess, puzzle-solving or mechanical engineering design, the problem-solver can try various runs without penalty. Whatever his outcome on these individual experimental runs, it doesn’t matter much to the subject-system or to the course of societal affairs. A lost chess game is seldom consequential for other chess games or for non-chess-players.
With wicked planning problems, however, every implemented solution is consequential. It leaves “traces” that cannot be undone. One cannot build a freeway to see how it works, and then easily correct it after unsatisfactory performance. Large public-works are effectively irreversible, and the consequences they generate have long half-lives. Many people’s lives will have been irreversibly influenced, and large amounts of money will have been spent — another irreversible act. The same happens with most other large-scale public works and with virtually all public-service programs. The effects of an experimental curriculum will follow the pupils into their adult lives.
Rittel and Webber address a subject about which I know a lot, from first-hand experience — systems analysis. This is a loose discipline in which mathematical tools are applied to broad and seemingly intractable problems in an effort to arrive at “optimal” solutions to those problems. In fact, as Rittel and Webber say:
With arrogant confidence, the early systems analysts pronounced themselves ready to take on anyone’s perceived problem, diagnostically to discover its hidden character, and then, having exposed its true nature, skillfully to excise its root causes. Two decades of experience have worn the self-assurances thin. These analysts are coming to realize how valid their model really is, for they themselves have been caught by the very same diagnostic difficulties that troubled their clients.
Remember, that was written in 1973, a scant five years after Robert Strange McNamara — that supreme rationalist — left the Pentagon, having discovered that the Vietnam War wasn’t amenable to systems analysis. McNamara’s demise as secretary of defense also marked the demise of the power that had been wielded by his Systems Analysis Office (though it lives on under a different name, having long since been pushed down the departmental hierarchy).
My own disillusionment with systems analysis came to a head at about the same time as Rittel and Webber published their paper. A paper that I wrote in 1981 (much to the consternation of my colleagues in the defense-analysis business) was an outgrowth of a memorandum that I had written in 1975 to the head of the defense think-tank where I worked. Here is the crux of the 1981 paper:
Aside from a natural urge for certainty, faith in quantitative models of warfare springs from the experience of World War II, when they seemed to lead to more effective tactics and equipment. But the foundation of this success was not the quantitative methods themselves. Rather, it was the fact that the methods were applied in wartime. Morse and Kimball put it well [in Methods of Operations Research (1946)]:
Operations research done separately from an administrator in charge of operations becomes an empty exercise. To be valuable it must be toughened by the repeated impact of hard operational facts and pressing day-by-day demands, and its scale of values must be repeatedly tested in the acid of use. Otherwise it may be philosophy, but it is hardly science. [p. 10]
Contrast this attitude with the attempts of analysts for the past twenty years to evaluate weapons, forces, and strategies with abstract models of combat. However elegant and internally consistent the models, they have remained as untested and untestable as the postulates of theology.
There is, of course, no valid test to apply to a warfare model. In peacetime, there is no enemy; in wartime, the enemy’s actions cannot be controlled….
Lacking pertinent data, an analyst is likely to resort to models of great complexity. Thus, if useful estimates of detection probabilities are unavailable, the detection process is modeled; if estimates of the outcomes of dogfights are unavailable, aerial combat is reduced to minutiae. Spurious accuracy replaces obvious inaccuracy; untestable hypotheses and unchecked calibrations multiply apace. Yet the analyst claims relative if not absolute accuracy, certifying that he has identified, measured, and properly linked, a priori, the parameters that differentiate weapons, forces, and strategies.
In the end, “reasonableness” is the only defense of warfare models of any stripe.
It is ironic that analysts must fall back upon the appeal to intuition that has been denied to military men — whose intuition at least flows from a life-or-death incentive to make good guesses when choosing weapons, forces, or strategies.
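To make the “spurious accuracy” point concrete, here is a minimal sketch of my own devising (it is not drawn from the 1981 paper or from any real combat model, and every number in it is an assumption). A toy detection-and-kill calculation ranks two hypothetical weapon options; the preferred option flips when a single unmeasurable detection probability is nudged within its plausible range, which is just the sort of unchecked calibration described above.

```python
# Toy illustration only: a made-up detection-and-kill model with guessed,
# untestable parameters, showing how the "optimal" choice between two
# hypothetical weapon options hinges on a detection probability that no
# peacetime experiment can pin down.
import random

def expected_kills(p_detect, p_kill_given_detect, shots, trials=20_000):
    """Monte Carlo estimate of kills per engagement under assumed chances."""
    kills = 0
    for _ in range(trials):
        for _ in range(shots):
            if random.random() < p_detect and random.random() < p_kill_given_detect:
                kills += 1
    return kills / trials

random.seed(1)
# Option A: better (assumed) sensor, fewer shots. Option B: worse sensor, more shots.
for p_detect_a in (0.55, 0.70):          # a plausible range; no one can measure it
    a = expected_kills(p_detect_a, 0.6, shots=2)
    b = expected_kills(0.30, 0.6, shots=4)
    print(f"assumed P(detect | A) = {p_detect_a:.2f}:  "
          f"A = {a:.2f}, B = {b:.2f}  ->  prefer option {'A' if a > b else 'B'}")
```

The arithmetic is deliberately trivial; the point is that the ranking itself, not merely the score, turns on a parameter that can never be tested in the “acid of use.”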
The critique in that 1981 paper generalizes to government planning of almost every kind, at every level, and certainly to the perpetually recurring — and badly mistaken — belief that an entire economy can be planned and its produce “equitably” distributed according to needs rather than abilities.
(For much more in this vein, see the posts listed at “Modeling, Science, and ‘Reason’”. See also “Why I Am Bunkered in My Half-Acre of Austin”.)