The history of public health in the United States covers the roles of the medical and nursing professions; scientific research; municipal sanitation; the agencies of local, state and federal governments; and private philanthropy. It examines pandemics and epidemics and the responses to them, with special attention to age, gender and race. It covers the main developments from the colonial era to the early 21st century.
At critical points in American history the public health movement focused on different priorities. When epidemics or pandemics took place, the movement focused on minimizing the disaster, as well as sponsoring long-term statistical and scientific research into finding ways to cure or prevent such dangerous diseases as smallpox, malaria, cholera, typhoid fever, hookworm, Spanish flu, polio, HIV/AIDS, and COVID-19. The acceptance of the germ theory of disease in the late 19th century caused a shift in perspective. Instead of attributing disease to personal failings or God's will, reformers focused on removing threats in the environment. Special emphasis was given to expensive sanitation programs to remove the masses of dirt, dung and outhouse waste from the fast-growing cities or (after 1900) mosquitoes in rural areas. Public health reformers before 1900 took the lead in expanding the scope, powers and financing of local governments, with New York City and Boston providing the models.
Since the 1880s there has been an emphasis on laboratory science, on training professional medical and nursing personnel to handle public health roles, and on setting up city, state and federal agencies. The 20th century saw efforts to reach out widely to convince citizens to support public health initiatives and replace old folk remedies. Starting in the 1960s, popular environmentalism lent urgency to removing pollutants such as DDT and other harmful chemicals from the water, the air, and cigarettes. [1] [2] [3] [4] A high priority for social reformers was to obtain federal health insurance despite the strong opposition of the American Medical Association and the insurance industry. After 1970 public health causes were no longer deeply rooted in liberal political movements. Leadership came more from scientists than from social reformers. Activists now focused less on the government and less on infectious disease. They concentrated on chronic illness and the need for individuals to reform their personal behavior, especially to stop smoking and watch their diet, in order to avoid cancer and heart problems. [5] [6] [7]
The healthcare system began in the colonial era. Localized, community-oriented care was typical, with families and neighbors providing assistance to the sick using traditional remedies and herbs. New immigrants to the colonies had high death rates from their exposure to a new disease environment. However, by the second generation death rates were lower than in England because there was much more food and less crowding. Becoming a regular doctor was difficult. Finally, in 1765 the first medical school opened at the College of Philadelphia. That city opened a hospital in 1751; the second one opened in New York City in 1791. By 1775 the 13 colonies had 3,500 to 4,000 regular doctors. About one in ten was formally trained, usually in England or Scotland. They had a clientele among the wealthier classes, but the popular image was one of distrust. [8] [9] [10]
The Chesapeake region (Maryland and Virginia) experienced high mortality rates, particularly among new arrivals from England. This high death rate significantly impacted family structures and slowed population growth. Men immigrated far more than women, so there was a persistent shortage of females in the Chesapeake colonies, which further stifled natural population increase and lowered rates of marriage for men. Due to the high mortality rates and unbalanced sex ratio, traditional family structures were difficult to maintain. Many families consisted of step-parents, step-children, and half-siblings, creating complex family networks. By contrast, New England had lower death rates and much more family stability, which enabled the patriarchal New Englanders to better make long-term plans for acquiring enough land to provide farms for the next generation. [11] [12] [13]
Smallpox recurred in epidemics, and inoculation against it came into use in the colonies during the 18th century. In the North American smallpox epidemic of 1775–1782, data based on remnant settlements indicate that at least 130,000 people died. [14] [15]
During the Revolution, General George Washington insisted that his soldiers be inoculated, lest his forces be decimated or the British use smallpox as a weapon. [16] [17]
Lemuel Shattuck (1793–1859) of Boston promoted legislation that required a better statewide system for the local registration of vital information on births and deaths. He specified the need for precise details on age, sex, race, and occupation, as well as standard terminology for diseases and causes of death. This law was passed in 1842 and was soon copied by most other states. [18] His proposals greatly expanded the questionnaires used in the Massachusetts state census of 1845. He was a key consultant for the 1850 United States census. He helped convince Congress to fund a much more complex census, and he designed most of the interview forms used by door-to-door canvassers. His 1850 Report on the Sanitary Condition of Massachusetts, based on a sanitary survey of the state, was farsighted. [19] It explained how to remove the giant mounds of dirt, horse dung, and outhouse waste that were overwhelming the neighborhoods of fast-growing cities. [20] It inspired reforms in many cities that faced the same public health crisis. [21]
The Metropolitan Board of Health was established in 1866 by the Radical Republicans who controlled the New York state legislature. It became a model for many major cities due to its innovative approach and effectiveness in addressing public health issues. The state government gave the city's Board extensive powers to create, execute, and judge ordinances related to public health. This comprehensive authority allowed for swift and effective action in addressing health crises. The Board leadership consisted of four police commissioners, the health officer of the Port of New York, and four commissioners appointed by the governor, three of whom were required to be physicians. This diverse makeup ensured a balance of expertise and perspectives. Within weeks of its formation, the Board secured agreements with city butchers to clean up and relocate slaughterhouses, imposed health standards on the milk industry, improved the water supply, and began cleaning city streets. When the cholera epidemic broke out in the spring of 1866, the Board successfully fought it with a stringent health code, house-to-house inspections, disinfectants, and quarantines. This resulted in a significantly lower death toll in New York City compared to other major cities. The Board's formation was preceded by a comprehensive sanitary inspection of New York City, which revealed widespread poor living conditions in the slum districts. This data-driven approach to identifying and addressing public health issues was modeled after Shattuck's statewide work in Massachusetts. It became a standard practice in other cities. Furthermore, the Board recognized the connection between housing, politics, morals, and health, setting a precedent for addressing the social determinants of health. [22] [23]
The success of New York City's Metropolitan Board of Health in improving public health conditions and managing disease outbreaks demonstrated the effectiveness of a centralized, empowered health authority. This model was subsequently adopted by other cities and states, shaping the future of public health administration in America. [24] [25] [26]
Many of the early medical schools in the United States were founded by alumni of the University of Edinburgh Medical School in Scotland. The nation's first medical school was opened in 1765 at the College of Philadelphia by John Morgan and William Shippen Jr. It evolved into the University of Pennsylvania's Perelman School of Medicine. In New York City in 1767, Dr. Samuel Bard opened a medical school. In 1814 it became Columbia University's Vagelos College of Physicians and Surgeons. Harvard Medical School opened in 1782; Dartmouth in 1797; Yale in 1810. [27]
According to Kenneth Ludmerer and William G. Rothstein, American medical schools before 1880 had far more weaknesses than strengths. There were no entrance requirements: any young man could sign up, and many schools did not even require a high school diploma. The curriculum was narrow, consisting of only seven courses, and instruction consisted entirely of didactic lectures, with little to no practical experience, no laboratories, and no work with patients. Physical facilities were meager, often just a single amphitheater or the second floor of a building. Most schools were proprietary, operated for profit by their faculty, who gave most of their attention to their private practice. The standard course consisted of only two four-month terms of lectures. Graduation requirements were minimal, with brief and superficial examinations. The strengths were that the many proprietary schools made a professional career more widely available than the colonial apprenticeship system they replaced, and that the lectures provided more systematic teaching than the apprenticeship model. After 1880, German medical influences modernized the system, with leading schools such as Johns Hopkins, Harvard, Pennsylvania, Columbia and Michigan extending their courses, adding new scientific subjects, and hiring full-time medical scientists with laboratories. [28] [29] [30]
Hospitals in the 19th century were largely charity institutions designed for poor people in the larger cities; they had no paying patients. Very small proprietary hospitals were operated by practicing physicians and surgeons to take care of their own paying patients in better facilities than the charity hospitals offered. By the 1840s, the major religious denominations, especially the Catholics and Methodists, began opening hospitals in major cities. The South had small hospitals in its few cities. In the rich plantation areas, slave owners hired physicians to keep their slaves in working shape. In the poor white areas there were few doctors and very few hospitals. [31]
In the 1840s–1880s era, Catholics in Philadelphia founded two hospitals for the Irish and German Catholic immigrants. They depended on revenues from the paying sick, and became important health and welfare institutions in the Catholic community. [32] By 1900 the Catholics had set up hospitals in most major cities. In New York the Dominicans, Franciscans, Sisters of Charity, and other orders set up hospitals to care primarily for their own ethnic group. By the 1920s they were serving everyone in the neighborhood. [33] In smaller cities too the Catholics set up hospitals, such as St. Patrick Hospital in Missoula, Montana. The Sisters of Providence opened it in 1873. It was funded in part by a county contract to care for the poor, and it also operated a day school and a boarding school. The nuns provided nursing care, especially for infectious diseases and traumatic injuries. They also proselytized the patients to attract converts and restore lapsed Catholics to the Church. They built a larger hospital in 1890. [34] Catholic hospitals were largely owned and staffed by orders of nuns (who took vows of poverty), as well as unpaid nursing students. When the population of nuns dropped sharply after the 1960s, the hospitals were sold. The Catholic Hospital Association formed in 1915. [35] [36]
The Methodists made medical services a priority from the 1850s. They began opening charitable institutions such as orphanages and old people's homes. In the 1880s, Methodists began opening hospitals which served people of all religious beliefs. By 1895, 13 hospitals were in operation in major cities. [37]
Compared to the North and West, the South had a warmer climate that fostered disease. It had far fewer cities, and they lagged behind the North in innovation. [38] After the Civil War it was a much more sickly region, lacking in doctors, hospitals, medicine, and all aspects of public health. When a threat of yellow fever appeared, Southern cities imposed temporary quarantines to stop travel from infected areas. The rest of the time there was inaction, and a reluctance to spend on sanitation. [39] Most Southerners were too poor to buy the patent medicines that were so popular elsewhere. Instead there was a heavy reliance on cheap herbal and folk remedies, especially among African Americans and Appalachians. [40] [41] [42]
The urban-rural dichotomy has a medical dimension. Two major diseases, malaria and hookworm, historically were rural phenomena in warm areas of the South. They were stamped out by large-scale efforts to clean up the environment. Malaria is spread by the bite of particular species of mosquito, and was eradicated by systematically draining pools of stagnant water or spraying with DDT. [43] [44]
The Rockefeller Sanitary Commission in 1910 discovered that nearly half the farm people, white and Black, in the poorest parts of the South were infected with hookworms. In the typical victim, hundreds of the worms live hooked to the wall of the small intestine, eat the best food, and leave the victim weak and listless. The disease was called the "germ of laziness." Victims were infected by walking barefoot in grassy areas where people defecated. In the long run, outhouses and shoes solved the problem, but the Commission developed a quick cure. The volunteer drank a special medicine that loosened the worms' grip, then drank a strong laxative. Once most residents had done so, the hookworms were gone. The Commission, headed by Wickliffe Rose, helped state health departments set up eradication crusades that treated 440,000 people in 578 counties in all 11 Southern states, and ended the epidemic. [45] [46] [47]
In the Southern states from the 1890s to the 1930s, Jim Crow virtually dictated inferior medical care for the large, very poor African American minority. There was neglect and racism on the part of white physicians. Black physicians were too few and too poorly trained at their two small schools, Howard University and Meharry Medical College. Likewise nursing standards were subpar, and there were very few all-Black hospitals. The Southern progressive movement did initiate reforms that helped somewhat, as did Northern philanthropies, but whites benefited more. [48] [49] [50] [51]
The most infamous American episode of bad medical ethics was the Tuskegee syphilis study. It was conducted between 1932 and 1972 by two federal agencies, the United States Public Health Service (PHS) and the Centers for Disease Control and Prevention (CDC), on a group of 399 African American men with syphilis. They were not asked to give permission, were not told their medical condition, and when penicillin became available in the mid-1940s it was deliberately withheld so the researchers could discover what happens to untreated men. As a result, the lives of 100 of the 399 men were cut short; they died of syphilis. [52] [53]
In retrospect the Tuskegee experiment caused deep distrust on the part of the African American community, and apparently reduced Black reliance on public health agencies. [54] [55] [56] One research study in 2018 estimated that the angry negative response caused the average life expectancy at age 45 for all Black men to fall by up to 1.5 years. [57]
In the U.S., the number of hospitals reached 4,400 in 1910, when they provided 420,000 beds. [58] These were operated by city, state and federal agencies, by churches, by stand-alone non-profits, and by for-profit enterprises. All the major denominations built hospitals; the 541 Catholic ones (in 1915) were staffed primarily by unpaid nuns. The others sometimes had a small cadre of deaconesses as staff. [59] Non-profit hospitals were supplemented by large public hospitals in major cities and research hospitals often affiliated with a medical school. The largest public hospital system in America is the New York City Health and Hospitals Corporation, which includes Bellevue Hospital, the oldest U.S. hospital, affiliated with New York University Medical School. [60] [61]
According to the Centers for Disease Control and Prevention: [62]
Before the measles vaccination program started in 1963, an estimated 3 to 4 million people got measles each year in the United States, of which 500,000 were reported. Among reported cases, 400 to 500 died, 48,000 were hospitalized, and 1,000 developed encephalitis (brain swelling) from measles.
Measles affected approximately 3,000 Americans per million until the 1960s. The first effective vaccine appeared in 1963, and was quickly adopted with little controversy. The rate plunged to 13 cases per million by the 1980s, and to about 1 case per million by 2000. [63] In the 21st century occasional measles outbreaks occur locally, usually caused by a person returning from a foreign visit. The disease is highly contagious, but with a community vaccination rate of 95% or higher, a local outbreak will quickly end. With lower rates of vaccination, however, measles can continue to spread. There are low vaccination rates in some traditionalistic religious groups, such as some Orthodox Jewish, Amish, Mennonite and Jehovah’s Witnesses communities. [64]
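The 95% figure is consistent with the standard herd-immunity threshold relationship; as a rough illustration only, the basic reproduction number for measles is commonly estimated at about 12 to 18, a range that is not taken from the sources cited above.

\[
q_c \;=\; 1 - \frac{1}{R_0}, \qquad R_0 \approx 12 \text{ to } 18 \;\Longrightarrow\; q_c \approx 0.92 \text{ to } 0.94
\]

Vaccinating roughly 95% of a community, which leaves a margin for imperfect vaccine effectiveness, keeps the effective reproduction number below 1, so local chains of transmission die out quickly.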
According to a March 2021 poll conducted by The Associated Press/NORC, vaccine skepticism and vaccine hesitancy are more widespread among white evangelicals than among most other blocs of Americans. Among white evangelical Protestants, 40% said they were not likely to get vaccinated against COVID-19. That compares with 25% of all Americans, 28% of white mainline Protestants and 27% of nonwhite Protestants. [65]
When the U.S. Army began drafting 4 million soldiers in 1917–1918, 95,000 men who had never been exposed to measles before caught the disease. Of these, 23,000 were hospitalized and 3,206 died. Most of the victims came from rural areas where measles was uncommon. There was simultaneously a parallel epidemic of primary streptococcal pneumonia in soldiers without measles. [66]
The worldwide Spanish flu pandemic of 1918 probably originated in the United States, and had a major impact on all parts of the country, as well as on the U.S. Army's American Expeditionary Force in France. [67]
In the U.S., about 20 million out of a population of 105 million became infected in the 1918–1919 season, and an estimated 500,000 to 850,000 died (0.5 to 0.8 percent of the U.S. population). [68] [69] [70] Native American tribes were particularly hard hit. In the Four Corners area, there were 3,293 registered deaths among Native Americans. [71] Entire Inuit and Alaskan Native village communities died in Alaska. [72]
The Flexner Report of 1910 made for a radical change in medical education. It emphasized the importance of high-quality, university-based, research-oriented medical education. It had the result of closing down most of the small proprietary local schools that had produced doctors for rural America. In 1938, rural counties without a city of 2,500 people had 69 doctors per 100,000 population, while urban counties with cities of 50,000 or more population had 174. [73] The result was a growing shortage of physicians in rural areas, especially in the South. [74] [75]
Public health nursing after 1900 offered a new career for professional nurses in addition to private duty work. The role of public health nurse began in Los Angeles in 1898, and by 1924 there were 12,000 public health nurses, half of them in America's 100 largest cities. The average annual salary of public health nurses in larger cities was $1,390. In addition, there were thousands of nurses employed by private agencies handling similar work. Public health nurses supervised health issues in the public and parochial schools, attended to prenatal and infant care, handled communicable diseases such as tuberculosis, and dealt with venereal diseases. [76] [77]
Historian Nancy Bristow has argued that the great 1918 flu pandemic contributed to the success of women in the field of nursing. This was due in part to the failure of medical doctors, who were nearly all men, to contain and prevent the illness. Nursing staff, who were nearly all women, celebrated the success of their patients and were less inclined to identify the spread of the disease with their own work. [78]
During the Great Depression in the 1930s, federal relief agencies funded many large-scale public health programs in every state, some of which became permanent. The programs expanded job opportunities for nurses, especially the private duty RNs who had suffered high unemployment rates. [79] [80]
A leader was Dr. Sara Josephine Baker, who established many programs to help the poor in New York City keep their infants healthy, leading teams of nurses into the crowded neighborhoods of Hell's Kitchen and teaching mothers how to dress, feed, and bathe their babies. [81]
The federal Office of Indian Affairs (OIA) operated a large-scale field nursing program. Field nurses targeted Native women for health education, emphasizing personal hygiene, infant care, and nutrition. [82]
In the United States there was a dramatic reduction in what had been the greatest killer, tuberculosis (often called "consumption"). [83] Starting in the 1900s, public health campaigns were launched to educate people about the contagion. [84] In later decades, posters, pamphlets and newspapers continued to inform people about the risk of contagion and methods to avoid it, including increasing public awareness about the importance of good hygiene and the avoidance of spitting in public. [85] Improved awareness of good hygiene practices reduced the number of cases, especially in middle-class neighborhoods. Public clinics were set up to improve awareness and provide screenings. This resulted in sharp declines through the 1920s and 1940s. Thanks to the public health campaigns, as well as the antibiotic streptomycin, a powerful cure available from 1947, tuberculosis was downgraded to a minor disease in the U.S. by 1960. [86]
Public health programs have significantly improved children's health over the past century through various initiatives and interventions. These programs have addressed key issues such as infant mortality, disease prevention, and access to local healthcare for mothers and their babies. By 1915 child health had become a priority. Progressive Era reformers worked state by state to rescue children under age 10 or 12 from low-wage employment in factories (see Child labor in the United States). [87]
At the national level, the United States Children's Bureau, founded in 1912, played a crucial role in improving children's health. Congress originally gave it a very broad mandate: [88]
The said bureau shall investigate and report ...upon all matters pertaining to the welfare of children and child life among all classes of our people, and shall especially investigate the questions of infant mortality, the birth-rate, orphanage, juvenile courts, desertion, dangerous occupations, accidents and diseases of children, employment, legislation affecting children in the several states and territories.
Its actual work was much more limited. Its major initiatives included the Campaign for Better Babies (1915), which aimed to educate mothers, reduce infant mortality, and identify threats to children's health. The Children's Year (1918–1919) promoted child health and welfare, focusing on reducing infant mortality, improving nutrition, and promoting safe recreation. [89]
The Sheppard–Towner Act of 1921 had a significant influence on children's health policies, marking a turning point in public health initiatives for mothers and infants. It set the stage for future federal involvement in maternal and child health care. It set up 3,000 child and maternal health care centers, many in rural areas. It funded millions of home visits by nurses to mothers and their infants. One result was that the infant mortality rate dropped from 76 deaths per 1,000 live births to 68 in 1929. [90]
Title V of the Social Security Act (1935) established a federal-state partnership for maternal and child health services, providing funding for state health departments to implement children's health programs. [91] The 1950s and 1960s saw major efforts to vaccinate children against various diseases, especially polio. [92] In 1971 the measles vaccine (approved in 1963) was combined with new vaccines against mumps (1967) and rubella (1969) into the single MMR vaccine by Dr. Maurice Hilleman. [93]
March of Dimes is a nonprofit organization that works to improve the health of mothers and babies. [94] It reaches a mass audience of contributors to fund health care for victims of polio and other diseases, and is a major source of medical research funding. [95] It was founded in 1938 by businessman Basil O'Connor and wheelchair-bound polio victim President Franklin D. Roosevelt, as the National Foundation for Infantile Paralysis, to combat polio. [96] In the 1940s there were 40,000 new cases every year, and summer programs for children were restricted, especially swimming pools. From 1938 through the approval of the Salk vaccine in 1955, the foundation spent $233 million on polio patient care, with more than 80 percent of U.S. polio patients receiving significant foundation aid. [97] After Salk's polio vaccine virtually ended the polio epidemic by 1959, the organization needed a new mission for its 3,100 chapters nationwide and 80,000 volunteers who had collected billions of dimes. It expanded its focus under Virginia Apgar to the prevention of birth defects and infant mortality. [98] In 2005, as preterm birth emerged as the leading cause of death for children worldwide, [99] research and prevention of premature birth became the organization's primary focus. [100]
In the 1940s penicillin, streptomycin and other powerful antibiotics became available. They were quick, cheap cures for many of the most common and deadly bacterial infections, including tuberculosis and pneumonia. They extended the average human lifespan by 23 years and marked a "Golden Age" of public health. [101] The 1950s and 1960s saw the advent of other powerful drugs: medicines to prevent inflammation in the joints and kidneys; to dilate arteries in the battle against high blood pressure or constrict blood vessels to combat shock; to regulate heartbeats; and to thin the blood. Professional medicine was now for the first time armed with drugs to cure major diseases. Diseases caused by viruses, however, were still not curable, and a new problem emerged: variants of bacteria that resisted the new drugs. [102] [103]
The Committee on the Costs of Medical Care designed new government insurance programs in its 1932 report. It was strongly opposed by the American Medical Association, which blocked all such notions by presidents Franklin D. Roosevelt and Harry S. Truman. [104] [105] However, New Deal legislation, especially the Wagner Act of 1935, greatly strengthened labor unions. Membership grew in the late 1930s and soared during World War II. One of the high priorities for unions was to negotiate health insurance for workers and their families, and take credit for it. [106] [107]
In July 1965, under the leadership of President Lyndon Johnson, Congress enacted Medicare under Title XVIII of the Social Security Act to provide government health insurance to people age 65 and older, regardless of income or medical history. Before Medicare was created, approximately 60% of people over the age of 65 had health insurance (as opposed to about 70% of the population younger than that), with coverage often unavailable or unaffordable to many others, because older adults paid more than three times as much for health insurance as younger people. [108]
In 1997 a compromise was reached with private insurance companies, which were given a major role through Medicare Advantage plans within the Medicare program for retired people. By 2024, 54% of Medicare recipients were enrolled in Medicare Advantage. [109] [110]
In 2010, under the Obama administration, Congress passed the Affordable Care Act, a program to enable wider health insurance coverage for lower-income families. There was a partisan dimension, with Republicans generally opposed, even though their constituencies were increasingly composed of lower-income voters. [111] [112] [113]
The worldwide COVID-19 pandemic of 2020–2022 led to 1.2 million deaths among the 103 million people in the U.S. who became infected. There was massive economic damage as people stayed home from school, work and entertainment venues. [114] [115]
Life expectancy in the United States has shown a remarkable increase over the past century, with a few small fluctuations. In 1900, life expectancy at birth was approximately 47 years. This figure rose steadily, reaching about 69 years by 1950, 72 in 1975, and 77 in 2000. In 2023 it reached 78.4 years: 75.8 years for males and 81.1 years for females. [116]
Multiple factors influenced life expectancy at birth: [117]