The history of public health in the United States covers the roles of the medical and nursing professions; scientific research; municipal sanitation; the agencies of local, state, and federal governments; and private philanthropy. It examines pandemics and epidemics and the responses to them, with special attention to age, gender, and race. It covers developments from the colonial era to the late 20th century in the United States, its main overseas possessions, and its overseas military operations.
At critical points in American history the public health movement focused on different priorities. When epidemics or pandemics struck, the movement concentrated on minimizing the disaster, as well as sponsoring long-term statistical and scientific research into curing or preventing such dangerous diseases as smallpox, malaria, cholera, typhoid fever, hookworm, Spanish flu, polio, HIV/AIDS, and COVID-19. The acceptance of the germ theory of disease in the late 19th century caused a shift in perspective. Instead of attributing disease to personal failings or God's will, reformers focused on removing threats from the environment. Special emphasis was given to expensive sanitation programs to remove the masses of dirt, dung, and outhouse waste from the fast-growing cities or (after 1900) mosquitoes in rural areas. Since the mid-19th century there has been an emphasis on laboratory science, on training professional medical and nursing personnel to handle public health roles, and on setting up city, state, and federal agencies. The 20th century saw efforts to reach out widely to convince citizens to support public health initiatives and abandon old folk remedies. In the late 20th century popular environmentalism lent urgency to removing pollutants such as DDT and harmful chemicals in the water and the air. In the 21st century there is a concern for diversity, equity, and inclusion, with the goal of removing handicaps historically imposed on minorities. [1] [2] [3] [4]
The healthcare system began in the colonial era. Localistic, community-oriented care was typical, with families and neighbors assisting the sick using traditional remedies and herbs. New immigrants to the colonies had high death rates from exposure to a new disease environment. However, by the second generation death rates were lower than in England because there was much more food and less crowding. Becoming a regular doctor was difficult. Finally, in 1765 the first medical school opened at the College of Philadelphia. That city opened a hospital in 1751; the second one opened in New York City in 1791. By 1775 the 13 colonies had 3,500 to 4,000 regular doctors. About one in ten was formally trained, usually in England or Scotland. They had a clientele among the wealthier classes, but the popular image was one of distrust. [5] [6] [7]
Smallpox epidemics were widespread, but inoculation was introduced in the 1750s. During the Revolution, General George Washington insisted his soldiers be inoculated, lest his forces be decimated or the British use smallpox as a weapon. [8]
The Chesapeake region (Maryland and Virginia) experienced high mortality rates, particularly among new arrivals from England. The high death rate significantly disrupted family structures and slowed population growth. Far more men immigrated than women, so there was a persistent shortage of females in the Chesapeake colonies, which further stifled natural population increase and lowered marriage rates for men. Given the high mortality and unbalanced sex ratio, traditional family structures were difficult to maintain; many families consisted of step-parents, step-children, and half-siblings, creating complex family networks. By contrast, New England had lower death rates and much more family stability, which enabled patriarchal New Englanders to make long-term plans for acquiring enough land to provide farms for the next generation. [9] [10] [11]
Lemuel Shattuck (1793-1859) of Boston promoted legislation requiring a better statewide system for the local registration of vital information on births and deaths. He specified the need for precise details on age, sex, race, and occupation, as well as standard terminology for diseases and causes of death. The law passed in 1842 and was soon copied by most other states. [12] His proposals greatly expanded the questionnaires used in the Massachusetts state census of 1845. He was a key consultant for the 1850 United States census: he helped convince Congress to fund a much more complex census, and he designed most of the interview forms used by door-to-door canvassers. His 1850 Report on the Sanitary Condition of Massachusetts, based on a sanitary survey of the state, was farsighted. [13] It explained how to remove the giant mounds of dirt, horse dung, and outhouse waste that were overwhelming the neighborhoods of fast-growing cities, and it inspired reforms in many cities facing the same public health crisis. [14]
The urban-rural dichotomy has a medical dimension. Two major diseases, malaria and hookworm, were historically rural phenomena of the warm areas of the South. They were stamped out by large-scale efforts to clean up the environment. Malaria is spread by the bite of particular species of mosquito and was eradicated by systematically draining pools of stagnant water. [15] [16]
The Rockefeller Sanitary Commission discovered in 1910 that nearly half the farm people, white and Black, in the poorest parts of the South were infected with hookworms. In the typical victim, hundreds of the worms live hooked to the inner wall of the small intestine, consume the best nutrients, and leave the victim weak and listless. Hookworm was called the "germ of laziness." People were infected by walking barefoot in grassy areas where people defecated. In the long run outhouses and shoes solved the problem, but the Commission developed a quick cure: the volunteer drank a special medicine that loosened the worms' grip, then took a strong laxative that expelled them. When most residents did so, the local hookworm infestation was eliminated. The Commission, headed by Wickliffe Rose, helped state health departments set up eradication crusades that treated 440,000 people in 578 counties in all 11 Southern states and ended the epidemic. [17] [18] [19]
In the Southern states from the 1890s to the 1930s, Jim Crow virtually dictated inferior medical care for the large, very poor African American minority. There was neglect and racism on the part of white physicians. Black physicians were too few and too poorly trained at their small schools. Likewise, nursing standards were subpar, and there were very few all-Black hospitals. The Southern progressive movement did initiate reforms that helped somewhat, as did Northern philanthropies, but whites benefited more. [20] [21] [22]
The most infamous American episode of bad public health ethics was the Tuskegee syphilis study, conducted between 1932 and 1972 by two federal agencies, the United States Public Health Service (PHS) and the Centers for Disease Control and Prevention (CDC), on a group of 399 African American men with syphilis. The men were not asked for consent and were not told their medical condition, and when penicillin became available in the mid-1940s it was deliberately withheld so the researchers could observe the course of untreated syphilis. As a result, the lives of 100 of the 399 men were cut short; they died of syphilis. [23]
The study took place in Tuskegee, Alabama, in partnership with the Tuskegee Institute, the famous school founded by the late Booker T. Washington. [24] The study began in 1932, when syphilis was a widespread problem and there was no safe and effective treatment. [25] The study was designed to measure the progression of untreated syphilis. By 1947, the new drug penicillin had been shown to be an effective cure for early syphilis and was becoming widely used to treat the disease. [24] Its efficacy in late-stage syphilis, however, was still unclear. [25] Study directors continued the study and did not offer the participants penicillin. [24] This point is debated, however; some researchers have found that penicillin was in fact given to many of the subjects. [25]
In 1972, PHS whistleblower Peter Buxtun, who had raised objections internally since the mid-1960s, exposed the experiment to the mainstream press, causing a nationwide public outcry. As a result, the program was terminated, a lawsuit won the affected men a $9 million settlement, and Congress created a commission empowered to write regulations to deter such abuses in the future. [24] In 1997, survivors of the study were invited to the White House to be present when President Bill Clinton apologized on behalf of the United States government, denouncing the study as "shameful" and "clearly racist." [26]
In retrospect, the Tuskegee experiment caused deep distrust on the part of the African American community and apparently reduced Black reliance on public health agencies. One research study in 2018 estimated that the negative response caused the average life expectancy at age 45 for all Black men to fall by up to 1.5 years. [27]
Public health nursing after 1900 offered professional nurses a new career in addition to private duty work. The role of public health nurse began in Los Angeles in 1898, and by 1924 there were 12,000 public health nurses, half of them in America's 100 largest cities. The average annual salary of public health nurses in larger cities was $1,390. In addition, thousands of nurses were employed by private agencies handling similar work. Public health nurses supervised health issues in public and parochial schools, provided prenatal and infant care, and handled communicable diseases such as tuberculosis and venereal disease. [28] [29]
Historian Nancy Bristow has argued that the great 1918 flu pandemic contributed to the success of women in the field of nursing. This was due in part to the failure of medical doctors, who were nearly all men, to contain and prevent the illness. Nursing staff, who were nearly all women, celebrated the success of their patients and were less inclined to identify the spread of the disease with their own work. [30]
During the Great Depression in the 1930s, federal relief agencies funded many large-scale public health programs in every state, some of which became permanent. The programs expanded job opportunities for nurses, especially the private duty RNs who suffered high unemployment rates. [31] [32]
In the United States, a representative public health worker was Dr. Sara Josephine Baker, who established many programs to help the poor in New York City keep their infants healthy, leading teams of nurses into the crowded neighborhoods of Hell's Kitchen and teaching mothers how to dress, feed, and bathe their babies.
The federal Office of Indian Affairs (OIA) operated a large-scale field nursing program. Field nurses targeted Native women for health education, emphasizing personal hygiene, infant care, and nutrition. [33]
Life expectancy in the United States has shown a remarkable increase over the past century, with a few small fluctuations. In 1900, life expectancy at birth was approximately 47 years. This figure rose steadily, reaching about 69 years by 1950, 72 in 1975, and 77 in 2000. In 2023 it reached 78.4 years: 75.8 years for males and 81.1 years for females. [34]
Multiple factors influenced life expectancy at birth: [35]