By Jordan Cuevas, Mackenzie Moore, and Grant Rosensteel
Malaria is one of the most widespread infectious diseases, affecting millions of people around the world every year and resulting in hundreds of thousands of deaths. Despite its ubiquity, some countries have been able to successfully control, and even eliminate, malaria. This piece takes a look at how the United States tackled the challenge of malaria elimination, identifying some of the key features that allowed for its success.
Malaria Disease Ecology Poses Unique Challenges
Malaria is one of the most complex pathogens to afflict humans. Unlike many other human pathogens, such as polio or measles, there is no singular “malaria.” Five species of Plasmodium parasites are important for human health; they differ in virulence and life cycle and are carried by roughly 30-40 species of mosquitoes, which themselves differ in geographic range, breeding habitat, preferred blood meals, and feeding times. These complexities make the disease ecology of malaria uniquely challenging to account for when seeking elimination.
Because malaria transmission is shaped by the particular Plasmodium species, its mosquito vector, and the sheer geographic diversity of this pairing, localized solutions that account for the disease ecology of the malaria in question are essential. A failure to fully consider these biological complexities will almost certainly lead to unsuccessful or underperforming control campaigns. Approaches such as focusing on indoor prevention are misguided if they are not tailored to the disease ecology of the target area. Malaria elimination cannot be achieved by a “one-size-fits-all” approach; it must take into account the underlying biology and ecology of the parasite and mosquito in the specific area being targeted.
Historical Dynamics of Malaria in the United States
Genetic and archeological evidence suggests that Plasmodium vivax malaria has existed in the Americas for thousands of years, although the exact historical distribution of the disease is unknown. Plasmodium falciparum was almost certainly introduced after Columbus’ arrival in the late 15th century and became established in local vectors. In the territory that now encompasses the United States, the spread of malaria was intrinsically tied to settler colonialism and human movement. During the push into the West, malaria accompanied settlers and found homes in farm irrigation ditches as well as hillside trenches dug by gold miners. With increasing urbanization, accompanied by the establishment of centralized water sources and improved hygiene and sanitation, malaria transmission declined, especially in the more northern and western areas where the climate was less favorable to vector persistence. In the southern states, hot and humid summers provided suitable conditions for malaria transmission virtually year-round, and poor housing and sanitation, often especially dire for enslaved populations, contributed to the high disease risk.
The identification of the mosquito vector of malaria by Sir Ronald Ross in the final years of the 19th century, together with the US military’s success in controlling yellow fever in Cuba, prompted renewed interest in top-down, large-scale disease control, and the 20th century is when the United States began dedicated malaria elimination efforts. Starting in the 1920s, the Public Health Service and the Rockefeller Foundation ran widespread education campaigns in schools that lasted for 30 years. State and county health departments created dedicated malaria control units whose services helped push malaria out of larger urban areas. Homes were fitted with window and door screens, and nets were placed over beds. Improvements in transportation and road networks enabled migration to “malaria-free” communities. Wind-blown Paris green, a highly toxic yet effective insecticide, along with Flit and other home insecticides, proved effective in reducing vector populations in many rural communities in the South. Laborers on relief during the Depression of the 1930s dug more than 30,000 miles of ditches to provide consistent drainage of low-lying areas and spread thousands of tons of Paris green, further advancing control efforts. These public works programs are believed to have helped drastically reduce new infection rates in the 1940s.
The final push towards elimination came during and immediately after World War II. In 1942, the Office of Malaria Control in War Areas was set up to mitigate the impact of vector-borne diseases, including malaria, around Army bases in the South; despite its federal mandate, it was established in Atlanta rather than Washington, DC, given that the greatest need for malaria control remained in the southern states. It was later renamed the Communicable Disease Center, and over the years its mandate broadened to include non-communicable diseases as well as prevention, becoming the CDC that we recognize today.
In 1945, the Extended Malaria Control Program, a cooperative undertaking between the Communicable Disease Center of the US Public Health Service and state and county health agencies, was carried out in 315 counties in the 13 states still affected by malaria transmission, focusing on spraying DDT on the inside walls of millions of houses. Two years later, the Malaria Eradication Program was launched, with operational offices established in each of the 13 target states. This delegation of oversight to the state level, with the federal government providing technical, financial, and materiel support, allowed for more tailored and targeted interventions based on the specific conditions in each state. While insecticide application, and specifically the widespread use of DDT, remained the mainstay of the Program, it was complemented by active case finding, diagnostic refresher training, and other forms of epidemiological surveillance. In some areas, physicians were even offered financial rewards for finding new confirmed cases! These surveillance efforts proved important in identifying cases of malaria in returning servicemen throughout the 1940s and into the 1950s, and malaria was declared eliminated from the United States in 1951.
Today, while thousands of cases of malaria are reported in the United States every year, nearly all occur in travelers who acquired the infection overseas. Every so often, one of these cases may lead to locally infected mosquitoes and a small cluster of locally transmitted cases. But thanks to strong surveillance systems, in part a continuing legacy of the Malaria Eradication Program, these cases are quickly identified and treated, and the affected mosquito populations are targeted with insecticides to prevent further spread.
Lessons Learned & Moving Forward
Malaria elimination in the United States was closely tied to investment in infrastructure and to the general improvement of sanitation and housing conditions, especially for the emerging middle class and in the northern states, at the end of the 19th and beginning of the 20th centuries. This is perhaps the strongest takeaway from the elimination effort: simultaneous investment in improving livelihoods bolsters the success of top-down, vertical disease control programs. Disease and illness do not exist in a vacuum; their prevalence and burden are driven by a number of determinants, both biological and sociological. Efforts that target only one element and fail to address the others are unlikely to succeed.
The United States’ malaria eradication efforts also exemplified the possible role of the military in disease control work. The knowledge brought back by military physicians working in Central America and Cuba was instrumental in the design and successful implementation of control measures used throughout the South, including widespread insecticide spraying, drainage projects to disrupt mosquito breeding sites, and window screening. The need to protect armed forces personnel was also a major incentive for increased research into malaria medication, as well as domestic malaria control, during the years the United States fought in World War II. However, the final successful push towards elimination relied primarily on civilian institutions, and particularly state-level health departments, highlighting the importance of local ownership and of adapting programs to local conditions for sustainable elimination success.
A key feature of the United States’ successful elimination program was the central role of vector control, and particularly the use of agents such as Paris green and, later, DDT. With increasing recognition of the environmental damage caused by these chemicals, DDT has since been banned in the United States and heavily restricted internationally, limiting its availability to endemic countries elsewhere in the world. With mosquitoes increasingly evolving resistance to modern insecticides, some have argued that countries in Africa and Asia should be allowed to use DDT, as the United States once did, to control malaria. The temperate climate across much of the United States, combined with demographic changes and urbanization that naturally reduced malaria incidence across the northern and western states, allowed the national effort to concentrate on the southern states and thus on a more targeted area for interventions. Keeping these factors in mind, the overall theme of the United States’ experience of eliminating malaria is instructive in its multi-pronged targeting of both biological and sociological elements of disease control, combining mass treatment, prevention efforts, education programs, and infrastructure investment.