The history of medicine in the United States encompasses a variety of approaches to health care, spanning from colonial days to the present. These approaches range from early folk remedies, rooted in a variety of competing medical systems, to the increasingly standardized and professionalized managed care of modern biomedicine.
At the time the first settlers arrived in the American colonies, the predominant medical system was humoral theory, the idea that diseases are caused by an imbalance of bodily fluids. [1] Settlers initially believed that they should use only medicines that fit this medical system and were made of "such things only as grown in England, they being most fit for English Bodies," as advised in The English Physitian Enlarged, a medical handbook commonly owned by early settlers. [2] However, as settlers faced new diseases and a scarcity of the plants and herbs typically used to make therapies in England, they increasingly turned to local flora and Native American remedies as alternatives to European medicine. The Native American medical system typically tied the administration of herbal treatments to rituals and prayers. [3] This inclusion of a different spiritual system was denounced by Europeans, particularly in the Spanish colonies, as part of the religious fervor associated with the Inquisition. Any Native American medical information that did not agree with humoral theory was deemed heretical by Spanish authorities, and tribal healers were condemned as witches. [4] In the English colonies, on the other hand, it was more common for settlers to seek medical help from Native American healers. [3]
Mortality was very high for new arrivals and for children in the colonial era. [5] [6] The disease environment was hostile to European settlers, especially in the Southern colonies, where malaria was endemic and deadly to many newcomers. Children born in the New World gained some immunity; they experienced mild recurrent forms of malaria but survived. Among newly arrived able-bodied young men, for example, over one-fourth of the Anglican missionaries died within five years of their arrival in the Carolinas. [7] Mortality was high for infants and small children, especially from diphtheria, yellow fever, and malaria. Most sick people turned to local healers and used folk remedies. Others relied upon minister-physicians, barber-surgeons, apothecaries, midwives, and ministers; a few used colonial physicians trained either in Britain or through an apprenticeship in the colonies. There was little government control, regulation of medical care, or attention to public health. By the 18th century, colonial physicians, following the models in England and Scotland, introduced modern medicine to the cities, allowing some advances in vaccination, pathology, anatomy, and pharmacology. [8]
There was a fundamental difference between the human infectious diseases present among the indigenous peoples and those carried by sailors and explorers from Europe and Africa. Some viruses, such as smallpox, have only human hosts and appear never to have occurred on the North American continent before 1492. The indigenous people lacked genetic resistance to such new infections and suffered overwhelming mortality when exposed to smallpox, measles, malaria, tuberculosis, and other diseases. Depopulation often occurred years before European settlers arrived in the vicinity, resulting from contact with trappers. [9] [10]
The French colonial city of New Orleans, Louisiana, opened two hospitals in the early 1700s. The first was the Royal Hospital, which opened in 1722 as a small military infirmary but grew in importance when the Ursuline Sisters took over its management in 1727 and made it a major hospital for the public, with a new and larger building constructed in 1734. The other was Charity Hospital, established in 1736 and staffed by many of the same people, as a supplement to the Royal Hospital so that the poorer classes (who usually could not afford treatment at the Royal Hospital) had somewhere to go. [11]
In the British colonies, medicine was rudimentary for the first few generations, as few upper-class British physicians emigrated to the colonies. The first medical society was organized in Boston in 1735. In the 18th century, 117 Americans from wealthy families graduated in medicine at Edinburgh, Scotland, but most physicians learned as apprentices in the colonies. [12] In Philadelphia, the Medical College of Philadelphia was founded in 1765 and became affiliated with the University of Pennsylvania in 1791. In New York, the medical department of King's College was established in 1767 and, in 1770, awarded the first American M.D. degree. [13]
Smallpox inoculation was introduced between 1716 and 1766, well before it was accepted in Europe. The first medical schools were established in Philadelphia in 1765 and New York in 1768. The first American medical textbook appeared in 1775, though physicians had easy access to British textbooks. The first pharmacopoeia appeared in 1778. [14] [15] European populations had historic exposure and partial immunity to smallpox, but Native American populations did not, and their death rates were high enough that a single epidemic could virtually destroy a small tribe. [16]
Physicians in port cities realized the need to quarantine sick sailors and passengers as soon as they arrived. Pest houses for them were established in Boston (1717), Philadelphia (1742), Charleston (1752), and New York (1757). The first general hospital was established in Philadelphia in 1752. [17] [18]
In the 19th century the nation was flooded with medical sects promoting a wide range of alternative treatments for all ailments. The medical societies tried to impose licensing requirements in state law, but the sectarians undid their efforts. The most famous of them was Samuel Thomson (1769–1843), a self-educated New England farm boy who developed a wildly popular herbal medical system. [19] He founded the Friendly Botanic Societies in 1813 and wrote a manual detailing his new methods. [20] He promised that even the worst ailments could be cured without harsh treatments: no surgery, no deliberate bleeding, no powerful drugs. Disease, in his view, was a matter of maladjustment of the body's internal heat, and could be cured by applying certain herbs and medicinal plants, coupled with vomiting, enemas, and steam baths. Thomson's approach resonated with workers and farmers who distrusted the bloody hands of traditional physicians. [21] President Andrew Jackson endorsed the new fad, and Brigham Young promoted it to the new Mormon movement. [22]
Thomson was a master promoter. He patented his system and sold licenses to hundreds of field agents who gained patients during the cholera outbreaks of 1832 and 1834. By 1839, the movement claimed three million followers and was strongest in New England. [23]
In the 1840s, however, it all fell apart. Thomson died in 1843, many patients grew worse after treatment, and a bitter schism emerged among the Thomsonian agents. The result was the sect's sharp decline by 1860. Nevertheless, Thomson's influence can still be seen among people suspicious of modern medicine. Many herbs he popularized, such as cayenne pepper, lobelia, and goldenseal, remain widely used in herbal healing routines to this day. The Thomsonians had been briefly successful in blocking state laws that limited medical practice to licensed physicians. After the collapse, the MDs made a comeback and reimposed strict licensing laws on the practice of medicine. [24]
In the American Civil War (1861–65), as was typical of the 19th century, far more soldiers died of disease than in battle, and even larger numbers were temporarily incapacitated by wounds, disease, and accidents. [25] Conditions were worse in the Confederacy, where doctors, hospitals, and medical supplies were in short supply. [26] The war had a dramatic long-term impact on American medicine, from new surgical techniques to the creation of many hospitals, expanded nursing, and modern research facilities.
Medicine in the 1860s knew nothing of germs, and bad hygiene was tolerated. The risk was highest at the beginning of the war, when men who had seldom been far from home were brought together for training alongside thousands of strangers carrying unfamiliar germs. Men from rural areas were twice as likely to die from infectious diseases as soldiers from urban areas. [31] Recruits first encountered epidemics of the childhood diseases of chicken pox, mumps, whooping cough, and, especially, measles. Later the fatal disease environment included diarrhea, dysentery, typhoid fever, and malaria. Disease vectors were often unknown. Bullet wounds often led to gangrene, usually necessitating amputation before it became fatal. Surgeons used chloroform if available, whiskey otherwise. [32] Harsh weather, bad water, inadequate shelter in winter quarters, poor sanitation within the camps, and dirty camp hospitals took their toll. [33] This was a common scenario in wars from time immemorial, and conditions faced by the Confederate army were even worse, since the Union blockade sharply reduced medical supplies; adequate food, shoes, and warm clothing were in very short supply.
The Union had money and responded by building 204 army hospitals with 137,000 beds, staffed with doctors, nurses, and support personnel as needed, as well as hospital ships and trains located close to the battlefields. Mortality in these hospitals was only 8 percent. [34] What was critical for the Union was the emergence of skilled, well-funded medical organizers who took proactive action, especially in the much-enlarged United States Army Medical Department [35] and the United States Sanitary Commission, a new private agency. [36] Numerous other new agencies also targeted the medical and morale needs of soldiers, including the United States Christian Commission as well as smaller private agencies such as the Women's Central Association of Relief for Sick and Wounded in the Army (WCAR), founded in 1861 by Henry Whitney Bellows and Dorothea Dix. Systematic funding appeals raised public consciousness as well as millions of dollars. Many thousands of volunteers worked in the hospitals and rest homes, most famously the poet Walt Whitman. Frederick Law Olmsted, the famous landscape architect, served as the highly efficient executive director of the Sanitary Commission. [37]
States could use their own tax money to support their troops, as Ohio did. Following the unexpected carnage at the Battle of Shiloh in April 1862, the Ohio state government sent three steamboats to the scene as floating hospitals with doctors, nurses, and medical supplies. The state fleet expanded to eleven hospital ships. The state also set up 12 local offices in the main transportation nodes to help Ohio soldiers moving back and forth. [38] The U.S. Army learned many lessons, and in 1886 it established the Hospital Corps. The Sanitary Commission collected enormous amounts of statistical data and opened up the problems of storing information for fast access and mechanically searching for data patterns. The pioneer was John Shaw Billings (1838–1913). A senior surgeon during the war, Billings built the Library of the Surgeon General's Office (now the National Library of Medicine), the centerpiece of modern medical information systems. Billings figured out how to mechanically analyze medical and demographic data by turning the information into numbers and punching it onto cardboard cards, as developed by his assistant Herman Hollerith. This was the origin of the computer punch card system that dominated computing and statistical data manipulation until the 1970s. [39]
After 1870 the Nightingale model of professional training of nurses was widely copied. Linda Richards (1841–1930) studied in London and became the first professionally trained American nurse. She established nursing training programs in the United States and Japan, and created the first system for keeping individual medical records for hospitalized patients. [40]
After the American Revolution, the United States was slow to adopt advances in European medicine, but it adopted germ theory and science-based practices in the late 1800s as the medical education system changed. [41] Historian Elaine G. Breslaw describes the earlier post-colonial American medical schools as "diploma mills" and credits the large 1889 endowment of Johns Hopkins Hospital for giving it the ability to lead the transition to science-based medicine. [42] Johns Hopkins originated several modern organizational practices, including residency and rounds. In 1910 the Flexner Report was published, standardizing many aspects of medical education. The Flexner Report was a book-length study of medical education that called for stricter standards based on the scientific approach used at universities, including Johns Hopkins. [43]
As Campbell (1984) shows, the nursing profession was transformed by World War II. Army and Navy nursing was highly attractive, and a larger proportion of nurses volunteered for service than of any other occupation in American society. [44] [45]
The public image of nurses was highly favorable during the war, as exemplified by Hollywood films such as Cry "Havoc", which made the selfless nurses heroes under enemy fire. Some nurses were captured by the Japanese, [46] but in practice they were kept out of harm's way, with the great majority stationed on the home front. The medical services were large operations, with over 600,000 soldiers and ten enlisted men for every nurse. Nearly all the doctors were men; women doctors were allowed only to examine patients from the Women's Army Corps. [44]
In the colonial era, women played a major role in healthcare, especially as midwives and in childbirth. Local healers used herbal and folk remedies to treat friends and neighbors. Published housekeeping guides included instructions for medical care and the preparation of common remedies. Nursing was considered a female role. [47] Babies were delivered at home without the services of a physician well into the 20th century, making the midwife a central figure in healthcare. [48] [49]
The professionalization of medicine, starting slowly in the early 19th century, included systematic efforts to minimize the role of untrained, uncertified women and to keep them out of new institutions such as hospitals and medical schools. [50]
In 1849 Elizabeth Blackwell (1821–1910), an immigrant from England, graduated from Geneva Medical College in New York at the head of her class and thus became the first female doctor in America. In 1857, she and her sister Emily, along with their colleague Marie Zakrzewska, founded the New York Infirmary for Women and Children, the first American hospital run by women and the first dedicated to serving women and children. [51] Blackwell viewed medicine as a means for social and moral reform, while a younger pioneer, Mary Putnam Jacobi (1842–1906), focused on curing disease. At a deeper level of disagreement, Blackwell felt that women would succeed in medicine because of their humane female values, but Jacobi believed that women should participate as the equals of men in all medical specialties. [52] In 1982, nephrologist Leah Lowenstein became the first woman dean of a co-educational medical school upon her appointment at Jefferson Medical College. [53]
Nursing became professionalized in the late 19th century, opening a new middle-class career for talented young women of all social backgrounds. The School of Nursing at Detroit's Harper Hospital, begun in 1884, was a national leader. Its graduates worked at the hospital, in other institutions and public health services, and as private-duty nurses, and they volunteered for duty at military hospitals during the Spanish–American War and the two world wars. [54]
The major religious denominations were active in establishing hospitals in many cities. Several Catholic orders of nuns specialized in nursing roles. While most lay nurses married and stopped working, or became private-duty nurses in the homes and private hospital rooms of the wealthy, the Catholic sisters had lifetime careers in the hospitals. This sustained hospitals like St. Vincent's Hospital in New York, where nurses from the Sisters of Charity began their work in 1849; patients of all backgrounds were welcome, but most came from the low-income Catholic population. [55]