In ping-pong, there are only two possible outcomes: you can either win or lose. If you are losing, it makes sense to change your strategy, since the situation can only get better. In real life, however, things can either get better or they can get worse.
The failure to appreciate the difference between ping-pong and real life has frequently been the cause of disaster. Prior to the Israeli withdrawal from Lebanon, for instance, an average of 23 Jewish soldiers a year were killed in various Hizbullah attacks. The response to that tragic situation was to withdraw all IDF forces from Lebanon. That would clearly have been the right strategy in ping-pong.
As it turned out, however, Israel’s hasty withdrawal from Lebanon taught the Palestinians the lesson that the Israeli public cannot tolerate any level of casualties and would therefore respond to terror with ever greater concessions. That perception encouraged the Palestinians to embark on the terror war that has cost over a thousand Jews their lives in the last three years. The cure, in short, has proven even more tragic than the original ailment.
The lesson for would-be social reformers is: remember that things can always get worse, a lot worse.
A second important lesson for would-be social reformers: be aware that the unintended consequences of fixing one problem often bring in their wake much worse problems. Modern man is particularly susceptible to the belief in his ability to improve highly complex and interrelated ecological and social systems with a little intelligent tinkering. We imagine that we can "cure" problems in isolation, without considering the manifold ripple effects of that cure.
In his book The Logic of Failure, Dietrich Dorner, a German psychologist, shows why such efforts at ecological and social engineering are so prone to failure. Using computer simulations, he presented subjects with a series of real-life problems – e.g., poverty, poor medical care, insufficient water, excessive hunting and fishing – along with a number of alternative policy options. He found that even the most highly trained and intelligent subjects tended to focus too narrowly on the specific problem at hand and failed to anticipate the system-wide effects of their cures. The result was a series of "virtual" catastrophes.
One could multiply endlessly examples of unintended and harmful consequences resulting from human interventions in highly complex ecosystems. The African "killer" bee, introduced into Brazil in order to increase pollination levels, is today a major menace as far north as California. Cane toads introduced into Australia to eat the cane beetle turned out to eat everything besides the cane beetle, and have developed into a plague of near Biblical proportions.
A whole field of scientific forestry developed in an effort to increase timber output. The means of doing so was artificial, "rationally designed" forests with same-age trees arranged linearly. Unfortunately, the scientific proponents of such artificial forests failed to take into account such factors as the processes of soil replenishment and the symbiotic relations between insects, fungi, flora, and mammals in natural forests.
In the short run, the scientifically planned forests did produce more timber. But the soil in such forests quickly became depleted, and by the second generation of planting, production had dropped and the forests proved highly vulnerable to destruction by storms.
Human social systems have often proven even less amenable to rational planning than ecosystems. Brasilia, a capital city built from scratch according to the principles of the Swiss-born architect Le Corbusier, must have looked wonderful on paper. The only problem was that once it was built no one wanted to live in it – geometric designs and wide public spaces notwithstanding.
None of these examples of human hubris prove that science and planning can never improve our lives. Examples of bad science – science that fails to take into account relevant factors – do not discredit all science.
Without doubt science has contributed much to improving life. Between 1900 and 2000, for instance, average life expectancy in America rose by roughly thirty years. Most of us take for granted creature comforts that would have been unimaginable even to kings and queens 250 years ago.
Vaccines against polio and smallpox have largely wiped out these once-dreaded diseases. The explosion in understanding of the brain in recent decades has been responsible for rescuing countless children who would once have been consigned to lives of academic failure.
Nor do all efforts at social planning make things worse. In Dorner’s computer simulations, there were policy options that resulted in significant amelioration with a minimum of negative side effects. The only problem is that few would-be planners found these options.
But if these examples do not establish that nothing can ever get better, they ought at least to give pause to social planners of all sorts, including some within our own world. Yes, many social problems can be reduced or alleviated, and life improved in many ways. But many of the schemes for doing so will only result in catastrophes of various sorts. Things can get worse, or be made worse, by hasty and ill-considered interventions.
It is easy enough to point to many unhealthy phenomena within our community, and to imagine a society freed of them. And indeed many of our most talented and idealistic members are working on these problems, often with the explicit encouragement of the gedolim.
Yet because the gedolim, with their broad historical perspective, instinctively know the rules that we have been discussing, they are not quick to adopt every scheme for social improvement. They are well aware of the hard-won victories that produced our present society and of all that is extraordinary about that society. Their first concern is not to jeopardize any of those victories through poorly conceived tinkering.
Frequently they serve as a brake – not because they do not see the problems or are not affected by the painful cases that reach them constantly, but because they are acutely aware of how fragile the balance holding a particular society together can be. Their first rule is to do no harm: shev ve’al ta’aseh adif (it is preferable to sit and not act).
James Scott, in his book Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, offers two useful rules for preventing social engineering from wreaking havoc: keep the initiatives small and experimental.
Small initiatives allow for testing of unforeseen consequences and prevent those consequences from generating catastrophic results. When planners see their efforts as nothing more than tentative experiments, they do not become married to them, and it is easier to reverse course if disaster lurks.
In this vein, the fifty states of the United States have been described as social laboratories in which different solutions to common problems can be tested. The nationwide welfare reform of the Clinton years, for instance, was largely based on the experience of similar programs at the statewide level in Wisconsin and Michigan.
Keeping initiatives small and experimental would be a good rule for efforts to improve our own society as well.