DDT once promised unparalleled benefits to humanity and especially to inhabitants of the developing world. Its insecticidal properties had the potential to eliminate the scourge of insect-borne disease and, by boosting agricultural production, to alleviate poverty and starvation. Sadly, unable to foresee the limitations of this seemingly miraculous compound, we used DDT in ways that prevented us from garnering all that it offered.

Many critics have recently blamed Rachel Carson’s 1962 best-seller, “Silent Spring,” which warned of hazards from the over-use of DDT, for our failure to take full advantage of its potential for malaria control. In fact, the usefulness of DDT was already in decline a decade before the publication of “Silent Spring.” Its effectiveness was partially compromised by nature’s unanticipated ability to adapt to even our most miraculous control measures. It was also squandered by the uncoordinated, cross-purposed efforts of those trying to increase crop yields through modern chemistry (the Green Revolution) and those trying to eradicate malaria.

The public’s attention to and enthusiasm for malaria eradication have waxed and waned since the discovery, more than 100 years ago, that the mosquito was the disease’s vector. Our best chance to accomplish eradication came shortly after World War II. In 1950, 12 years before “Silent Spring” and 22 years before the insecticide was banned in the United States, the World Health Organization (WHO) convened the world’s leading malaria experts in Kampala, Uganda, to formulate a plan for combating the disease in Africa. DDT was the new “wonder chemical,” and by the end of the meeting the majority decision could be summed up in three words: “Let us spray.”

The Kampala meeting’s minority opinion was regrettably never published. Dissenters, among them P.C.C. Garnham, arguably the greatest malariologist of the time, maintained that an assault on malaria in a politically unstable Africa, often lacking even the rudiments of a health-care infrastructure, would be both prohibitively expensive and, at best, a long-term gamble. He believed that DDT could not eradicate the disease completely in such a socio-economic situation; furthermore, anything less than total success would allow the disease to eventually re-emerge in a population that had lost the partial immunity that constant exposure to the disease had previously provided.

Following the Kampala meeting, well-planned and amply funded studies were carried out in each distinct environmental region of Africa where malaria occurred. These pilot projects culminated in the Garki project in Nigeria, where every available tool was used to stop malaria transmission in the stubborn lowland savannah. The Garki studies showed that malaria transmission could not be prevented in this environment even with a combination of indoor residual spraying, effective anti-malaria drugs, and a well-staffed clinical infrastructure. The WHO concluded that malaria eradication in Africa was impossible even with DDT and turned instead to control strategies; the global malaria eradication program, launched in 1955, went forward largely omitting Sub-Saharan Africa.

During this period, however, the agricultural use of DDT grew exponentially in Sub-Saharan Africa. Inexpensively eliminating agricultural pests like boll weevils, cutworms and the insects that spread crop viruses, DDT made arable vast tracts of previously unproductive land. This provided hope that the poverty and famine plaguing the continent could at last be ended. The incidence of famine did in fact fall, at least temporarily, but that success carried a high price for disease control.

Spraying crops for agricultural pests had the collateral effect of killing many malaria-carrying mosquitoes, but the large agricultural projects also created ideal conditions for DDT resistance to arise and for malaria to rebound. Standing water from the irrigation of rice, cotton and tobacco fields provided congenial new environments in which spray-surviving mosquitoes could breed. People seeking jobs flocked to the agricultural sites, increasing mosquito-human contact. These workers also brought along their local diseases, including a brew of genetically distinct malaria parasites to which workers from other areas had less immunity and were therefore highly susceptible. As long as the DDT sprayed on crops was effective in keeping the mosquito population down, this susceptibility did not present a problem. Furthermore, if someone did contract the disease, a new drug, chloroquine, introduced to Africa in the 1960s, was highly effective in treating it.

Just as the discovery of antibiotics led western nations to declare a premature victory over infectious diseases, the use of DDT and chloroquine led to a complacency in the developed world about the threat of malaria. During those years it would have been nearly impossible to predict that natural resistance to these scientific wonders loomed just over the horizon, for there were few precedents to assess the adaptability of nature to broad-scale use of man-made chemicals.

By the time the U.S. ban on DDT took effect in 1972, Africa had already become a patchwork of regions where DDT remained an effective insecticide and other areas where resistance was established and expanding. At this point, commercial interests surpassed all other forces in undermining DDT’s usefulness for malaria control.

After 1972, DDT was still widely available in Africa thanks to unrestricted importation from India and China. In the 1970s, however, the U.S. and Europe banned the import of DDT-contaminated goods. Agricultural businesses in Africa, often foreign-owned, reacted to these restrictions by forcing local prohibitions on the compound’s use, even though in many areas it was still effective against malaria. At the same time, western chemical companies recognized that the spreading resistance to DDT was opening a sizeable and financially viable market for new, more expensive insecticides, and aggressively promoted their use.

In the years that followed, the situation grew increasingly grim. A rising birth rate and the rapid urbanization of a still-impoverished Africa put even greater numbers of Africans at risk. Resistance to chloroquine surfaced and quickly spread throughout the continent, and the long-awaited boost in the economic status of the average African that DDT was supposed to have provided never materialized.

The recent assault on Rachel Carson, one of America’s most revered environmentalists, has brought much-needed attention to the fact that Africans are dying of malaria in numbers as high as or higher than ever. It has also stemmed negative reaction to the large-scale use of DDT, which many Americans still consider a toxic compound. The current invective’s most pernicious effect on long-term malaria control is that it has convinced many that DDT (and bed nets) can be simple answers to the very complex malaria problem. There is no single, all-encompassing way to control malaria in Africa; no single approach will lead to a lasting solution. Malaria can be effectively controlled on a regional basis, but the commitment has to be long-term, and the measures taken must be tailored to the social, geographical and environmental situations unique to Africa. This is the lesson of the Garki project, learned more than 50 years ago, and it still represents the clearest assessment of the malaria problem.

“Silent Spring” powerfully portrayed the effect of human activity on the environment. Scientific literature had expressed concern over the use of DDT as early as 1942, but it was Rachel Carson’s book that brought that concern to the public’s attention. During the last 50 years, some of Carson’s fears about the long-term effects of this insecticide have proven unfounded, but her view of a fragile environment affected by the activities of man has stood the test of time. Perhaps in response to her book, many have developed a new sensitivity to these issues, leading to the phasing out of leaded gasoline, restrictions on PCB use, and a focus on global warming. We owe a great debt to Carson and others like her who bring such pressing issues into the public forum.

We must also acknowledge that Carson was not opposed to the use of DDT in disease control, but only to its reckless over-use in agriculture. If Carson’s vociferous critics actually read her book, they overlooked the repeated and very clear statements she made on this point, e.g.:

“All this is not to say that there is no insect problem and no need of control. I am saying, rather, that control must be geared to realities, not mythical situations and that the methods employed must be such that they do not destroy us along with insects…. Disease-carrying insects become important where human beings are crowded together, especially under conditions where sanitation is poor, as in time of natural disaster or war or in situations of extreme poverty and deprivation. Then control of some sort is necessary…. It is not my contention that chemical insecticides must never be used.”