The technological and industrial history of the United States describes the emergence of the United States as one of the most technologically advanced nations in the world in the 19th and 20th centuries. The availability of land and literate labor, the absence of a landed aristocracy, the prestige of entrepreneurship, the diversity of climate, and large, easily accessible, and literate markets all contributed to America's rapid industrialization.
The availability of capital, the development of navigable rivers and coastal waterways by the free market, and an abundance of natural resources that facilitated the cheap extraction of energy also contributed. Fast transport by the first transcontinental railroad, built in the mid-19th century, and the Interstate Highway System, built in the late 20th century, enlarged markets and reduced shipping and production costs. The legal system facilitated business operations and guaranteed contracts. Cut off from Europe by the embargo of 1807 and the British blockade during the War of 1812, entrepreneurs opened factories in the Northeastern United States that set the stage for rapid industrialization modeled on British innovations.
From its emergence as an independent nation, the United States has encouraged science and innovation. As a result, the United States has been the birthplace of 161 of Encyclopædia Britannica's 321 Greatest Inventions, including items such as the airplane, internet, microchip, laser, cellphone, refrigerator, email, microwave, personal computer, liquid-crystal display and light-emitting diode technology, air conditioning, assembly line, supermarket, bar code, and automated teller machine. [1]
The early technological and industrial development of the United States was facilitated by a unique confluence of geographical, social, and economic factors. The relative scarcity of workers kept U.S. wages generally higher than those in Europe and provided an incentive to mechanize some tasks. The United States population also had some distinct advantages: they were former British subjects with high English literacy rates for the period (over 80% in New England); they had stable institutions of courts, laws, voting rights, and protection of property rights, with some minor American modifications; and in many cases they had personal contacts with the British innovators of the Industrial Revolution. They had a good basic structure to build on.
Another major advantage enjoyed by the United States was the absence of an aristocracy or gentry. The eastern seaboard, with its great number of rivers and streams, provided many potential sites for constructing the textile mills necessary for early industrialization. The technology and knowledge of how to build a textile industry were largely provided by Samuel Slater (1768–1835), who had studied and worked in British textile mills for a number of years and emigrated to the United States in 1789, despite restrictions against doing so, to try his luck with U.S. manufacturers who were attempting to set up a textile industry. He was offered a full partnership if he could succeed, and he did. A vast supply of natural resources, the technological knowledge of how to build and power the necessary machines, and a labor supply of mobile workers, often unmarried women, all aided early industrialization. The broad knowledge that European migrants carried of the European Industrial Revolution and Scientific Revolution helped facilitate the construction and invention of new manufacturing businesses and technologies, and a limited government that would allow them to succeed or fail on their own merit helped as well.
After the end of the American Revolutionary War in 1783, the new government continued the strong property rights established under British rule and established a rule of law necessary to protect those property rights. The idea of issuing patents was incorporated into Article I, Section 8 of the Constitution, authorizing Congress "to promote the progress of science and useful arts by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries." The invention of the cotton gin by American inventor Eli Whitney, combined with the widespread prevalence of slavery in the United States and U.S. settler expansion, made cotton a potentially cheap and readily available resource for use in the new textile industry.
A major impetus for the United States' entry into the Industrial Revolution was the passage of the Embargo Act of 1807, the War of 1812 (1812–15), and the Napoleonic Wars (1803–15), which cut off supplies of new and cheaper Industrial Revolution products from Britain. The lack of access to these goods provided a strong incentive to develop domestic industries and make such goods at home rather than simply buying them from Britain.
Modern productivity researchers have shown that the period in which the greatest economic and technological progress occurred was between the last half of the 19th century and the first half of the 20th. [2] [3] [4] During this period the nation was transformed from an agricultural economy into the foremost industrial power in the world, with more than a third of global industrial output. This can be illustrated by the index of total industrial production, which increased from 4.29 in 1790 to 1,975.00 in 1913, an increase of 460 times (base year 1850 = 100). [5]
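As a back-of-the-envelope check on these index values (a calculation from the figures above, not from the cited sources): the ratio is $1{,}975.00 / 4.29 \approx 460$, and over the 123 years from 1790 to 1913 this corresponds to a compound annual growth rate of roughly

$$\left(\frac{1975.00}{4.29}\right)^{1/123} - 1 \approx 0.051,$$

or about 5.1% per year, sustained for well over a century.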
The American colonies gained independence in 1783, just as profound changes in industrial production and coordination were beginning to shift production from artisans to factories. Growth of the nation's transportation infrastructure through internal improvements and a confluence of technological innovations before the Civil War facilitated an expansion in the organization, coordination, and scale of industrial production. Around the turn of the 20th century, American industry surpassed its European counterparts economically, and the nation began to assert its military power. Although the Great Depression challenged its technological momentum, America emerged from it and World War II as one of two global superpowers. In the second half of the 20th century, as the United States was drawn into competition with the Soviet Union for political, economic, and military primacy, the government invested heavily in scientific research and technological development, which spawned advances in spaceflight, computing, and biotechnology.
Science, technology, and industry have not only profoundly shaped America's economic success, but have also contributed to its distinct political institutions, social structure, educational system, and cultural identity.
North America has been inhabited continuously since at least 10,000 BC. The earliest inhabitants were nomadic big-game hunter-gatherers who crossed the Bering land bridge. These first Native Americans relied upon chipped-stone spearheads, rudimentary harpoons, and boats clad in animal hides for hunting in the Arctic. As they dispersed within the continent, they encountered the varied temperate climates of the Pacific Northwest, central plains, Appalachian woodlands, and arid Southwest, where they began to make permanent settlements. The peoples of the Pacific Northwest built wooden houses, used nets and weirs to catch fish, and practiced food preservation to ensure the longevity of their food sources, since substantial agriculture was not developed. [6] Peoples living on the plains remained largely nomadic; some practiced agriculture for parts of the year and became adept leather workers as they hunted buffalo, while people living in the arid Southwest built adobe buildings, fired pottery, domesticated cotton, and wove cloth.
Tribes in the eastern woodlands and the Mississippi Valley developed extensive trade networks, built pyramid-like mounds, and practiced substantial agriculture, while the peoples living in the Appalachian Mountains and along the coastal Atlantic practiced highly sustainable forest agriculture and were expert woodworkers. However, the populations of these peoples were small and their rate of technological change was very low. [7] Indigenous peoples did not domesticate animals for draft or husbandry, develop writing systems, or create bronze or iron tools like their European and Asian counterparts.
In the 17th century, Pilgrims, Puritans, and Quakers fleeing religious persecution in Europe brought with them plowshares, guns, and domesticated animals like cows and pigs. These immigrants and other European colonists initially farmed subsistence crops like corn, wheat, rye, and oats, and rendered potash and maple syrup for trade. [8] Due to the more temperate climate, large-scale plantations in the American South grew labor-intensive cash crops like sugarcane, rice, cotton, and tobacco, which required the importation of thousands of enslaved Africans to maintain. Early American farmers were not self-sufficient; they relied upon other farmers, specialized craftsmen, and merchants to provide tools, process their harvests, and bring them to market. [9]
Colonial artisanship emerged slowly, as the market for advanced craftsmanship was small. American artisans developed a more relaxed (less regulated) version of the Old World apprenticeship system for educating and employing the next generation. Although the mercantilist, export-heavy economy impaired the emergence of a robust self-sustaining economy, craftsmen and merchants grew increasingly interdependent in their trades. [10] In the mid-18th century, attempts by the British to subdue or control the colonies by means of taxation sowed discontent among these artisans, many of whom joined the Patriot cause.
Colonial Virginia provided a potential market of rich plantations. At least 19 silversmiths worked in Williamsburg between 1699 and 1775. The best-known were James Eddy (1731–1809) and his brother-in-law William Wadill, also an engraver. Most planters, however, purchased English-made silver. [11]
In Boston, goldsmiths and silversmiths were stratified. The most prosperous were merchant-artisans, with a business outlook and high status. Most craftsmen were laboring artisans who either operated small shops or, more often, did piecework for the merchant artisans. The small market meant there was no steady or well-paid employment; many lived in constant debt. [12]
Colonial silver working was pre-industrial in many ways: many pieces were "bespoke," or uniquely made for each customer, and emphasized artistry as well as functionality. Silver (and other metal) mines were scarcer in North America than in Europe, and colonial craftsmen had no consistent source of materials with which to work. [13] For each piece of silver they crafted, raw materials had to be collected, often reused from disparate sources, most commonly Spanish coins. The purity of these sources was not regulated, nor was there an organized supply chain through which to obtain silver. [14] As silver objects were sold by weight, manufacturers who could produce silver objects cheaply in quantity had an advantage. [15] Many of these unique, individual aspects of silver working kept artisan practices in place through the late 18th century.
As demand for silver increased and large-scale manufacturing techniques emerged, silver products became much more standardized. For special-order objects that would likely be made only once, silversmiths generally used lost-wax casting, in which an object was sculpted in wax, an investment mold was formed around it, and the wax was melted away. Molds produced in this manner could be used only once, which made them inconvenient for standard objects like handles and buckles. Permanent mold casting, an industrial casting technique focused on high-volume production, allowed smiths to reuse molds to make exact replicas of the most commonly used items they sold. In creating these molds and developing standardized manufacturing processes, silversmiths could begin delegating some work to apprentices and journeymen.
After 1780, Paul Revere's sons took on more significant roles in his shop, [16] and his silver pieces often included wooden handles made by carpenters more experienced with woodwork. [17] Even for some of the most successful artisans like Revere, [18] artisanship was not a profitable enterprise compared to mass production using iron or bronze casting. [17] Creating products that could be replicated for multiple customers, adopting new business practices and labor policies, and acquiring new equipment ultimately made manufacturing more efficient. These changes, in tandem with new techniques and requirements defined by changing social standards, led to the introduction of new manufacturing techniques in Colonial America that preceded and anticipated the industrial revolution.
Late in the colonial era, a few silversmiths expanded operations with new manufacturing techniques and changing business practices. They hired assistants, subcontracted out piecework, and standardized output. [19] One individual in the vanguard of America's shift toward more industrial methods was Paul Revere, who emphasized the production of increasingly standardized items later in his career through the use of a silver flatting mill, increased numbers of salaried employees, and other advances. [20] Still, traditional artisanal methods remained, and smiths performed a great deal of work by hand. The coexistence of craft and industrial production styles prior to the industrial revolution is an example of proto-industrialization.
In the mid-1780s, Oliver Evans invented an automated flour mill that included a grain elevator and hopper boy. Evans' design eventually displaced the traditional gristmills. By the turn of the century, Evans also developed one of the first high-pressure steam engines and began establishing a network of machine workshops to manufacture and repair these popular inventions. In 1789, the widow of Nathanael Greene recruited Eli Whitney to develop a machine to separate the seeds of short fibered cotton from the fibers. The resulting cotton gin could be made with basic carpentry skills but reduced the necessary labor by a factor of 50 and generated huge profits for cotton growers in the South, leading to a rapid expansion of slavery in the United States. [21] While Whitney did not realize financial success from his invention, he moved on to manufacturing rifles and other armaments under a government contract that could be made with "expedition, uniformity, and exactness"—the foundational ideas for interchangeable parts. [22] However, Whitney's vision of interchangeable parts would not be achieved for over two decades with firearms and even longer for other devices.
Between 1800 and 1820, new industrial tools that rapidly increased the quality and efficiency of manufacturing emerged. Simeon North suggested using division of labor to increase the speed with which a complete pistol could be manufactured, which led to the development of a milling machine in 1798. In 1819, Thomas Blanchard created a lathe that could reliably cut irregular shapes, like those needed for arms manufacture. By 1822, Captain John H. Hall had developed a system using machine tools, division of labor, and an unskilled workforce to produce a breech-loading rifle—a process that came to be known as "Armory practice" in the U.S. and the American system of manufacturing in England. [23] [24]
The textile industry, which had previously relied upon labor-intensive production methods, was also rife with potential for mechanization. In the late 18th century, the English textile industry had adopted the spinning jenny, water frame, and spinning mule, which greatly improved the efficiency and quality of textile manufacture but were closely guarded by the British government, which forbade their export or the emigration of those who were familiar with the technology. The 1787 Beverly Cotton Manufactory was the first cotton mill in the United States, but it relied on horse power. Samuel Slater, an apprentice in one of the largest textile factories in England, immigrated to the United States in 1789 upon learning that American states were paying bounties to British expatriates with a knowledge of textile machinery. [25] With the help of Moses Brown of Providence, Slater established America's oldest surviving cotton-spinning mill, with a fully mechanized water-power system, at Slater Mill in Pawtucket, Rhode Island, in 1793.
Hoping to harness the ample power of the Merrimack River, another group of investors began building the Middlesex Canal, which ran up the Mystic River and past both Mystic Lakes, generally following stream valleys (near today's MA 38), and reached the Merrimack in Chelmsford, 35 miles (56 km) from Boston Harbor. The canal established limited operations by 1808 and, by mid-1814, a system of navigations and canals reaching past Manchester, spawning commercial activity, especially new textile mills, throughout the region. At nearly the same time as the canal was completed, Francis Cabot Lowell and a consortium of businessmen set up textile mills in Waltham, Massachusetts, making use of water power from the Charles River, with the concept of housing the complete production process under one roof, so that raw materials entered and dyed fabrics or clothing left. For a few decades, it seemed that every lock along the canal had mills and water wheels. In 1821, the Boston Manufacturing Company built a major expansion in East Chelmsford, which was soon incorporated as Lowell, Massachusetts, and came to dominate cloth production and the clothing industry for decades.
Slater's Mill was established in the Blackstone Valley, which extended into neighboring Massachusetts (Daniel Day's Woolen Mill, 1809, at Uxbridge) and became one of the earliest industrialized regions in the United States, second only to the North Shore of Massachusetts. Slater's business model of independent mills and mill villages (the "Rhode Island System") began to be replaced by the 1820s by a more efficient system (the "Waltham System") based upon Francis Cabot Lowell's replications of British power looms. Slater went on to build several more cotton and wool mills throughout New England, but when faced with a labor shortage, he resorted to building housing, shops, and churches for the workers and their families adjacent to his factories.
The first power looms for woolens were installed in 1820 at Uxbridge, Massachusetts, by John Capron of Cumberland, Rhode Island. These brought automated weaving under the same roof, a step which Slater's system had outsourced to local farms. Lowell's looms were managed by specialized employees, many of them unmarried young women ("Lowell mill girls"), and owned by a corporation. [26] Unlike the previous forms of labor (apprenticeship, family labor, slavery, and indenture), the Lowell system popularized the concept of the wage laborer who sells his labor to an employer under contract, a socio-economic system which persists in many modern countries and industries. The corporation also looked out for the health and well-being of the young women, including their spiritual health, and the hundreds of women it employed culturally established the pattern of a young woman going off to work for a few years to save money before returning home to school and marriage. It created an independent breed of women uncommon in most of the world.
As the country grew larger with the admission of Kentucky, Tennessee, and Ohio by 1803, the only means of transportation between these landlocked western states and their coastal neighbors was by foot, pack animal, or ship. Recognizing the success of Roman roads in unifying that empire, political and business leaders in the United States began to construct roads and canals to connect the disparate parts of the nation. [27]
Early toll roads were constructed and owned by joint-stock companies that sold stock to raise construction capital, like Pennsylvania's 1795 Lancaster Turnpike Company. In 1808, Secretary of the Treasury Albert Gallatin's Report on the Subject of Public Roads and Canals suggested that the federal government should fund the construction of interstate turnpikes and canals. While many Anti-Federalists opposed the federal government assuming such a role, the British blockade in the War of 1812 demonstrated the United States' reliance upon these overland roads for military operations as well as for general commerce. [28] Construction on the National Road began in 1815 in Cumberland, Maryland, and reached Wheeling, Virginia in 1818, but political strife thereafter ultimately prevented its western advance to the Mississippi River. Nevertheless, the road became a primary overland conduit through the Appalachian Mountains and was the gateway for thousands of antebellum westward-bound settlers.
Numerous canal companies had also been chartered; but of all the canals projected, only three had been completed when the War of 1812 began: the Dismal Swamp Canal in Virginia, the Santee Canal in South Carolina, and the Middlesex Canal in Massachusetts. It remained for New York to usher in a new era in internal communication by authorizing in 1817 the construction of the Erie Canal. This bold bid for Western trade alarmed the merchants of Philadelphia, particularly as the completion of the national road threatened to divert much of their traffic to Baltimore. In 1825, the Pennsylvania General Assembly grappled with the problem by projecting a series of canals which were to connect its great seaport with Pittsburgh on the west and with Lake Erie and the upper Susquehanna on the north. [29] The Blackstone Canal (1823–1828) in Rhode Island and Massachusetts and the Morris Canal (1824–1924) across northern New Jersey soon followed, along with the Illinois and Michigan Canal (1824–1848) from Chicago to the Illinois River.
Like the turnpikes, the early canals were constructed, owned, and operated by private joint-stock companies but later gave way to larger projects funded by the states. The Erie Canal, proposed by Governor of New York De Witt Clinton, was the first canal project undertaken as a public good, financed at the public risk through the issuance of bonds. [30] When the project was completed in 1825, the canal linked Lake Erie with the Hudson River through 83 separate locks and over a distance of 363 miles (584 km). The success of the Erie Canal spawned a boom of canal-building around the country: over 3,326 miles (5,353 km) of artificial waterways were constructed between 1816 and 1840. [31] Smaller cities that lay along major canal routes, such as Syracuse, Buffalo, and Cleveland, boomed into major industrial and trade centers, while canal-building pushed some states like Pennsylvania, Ohio, and Indiana to the brink of bankruptcy. [31]
The magnitude of the transportation problem was such, however, that neither individual states nor private corporations seemed able to meet the demands of an expanding internal trade. As early as 1807, Albert Gallatin had advocated the construction of a great system of internal waterways to connect East and West, at an estimated cost of $20,000,000 ($416,181,818 in 2023 consumer dollars). But the only contribution of the national government to internal improvements during the Jeffersonian era was an appropriation in 1806 of two percent of the net proceeds of the sales of public lands in Ohio for the construction of a national road, with the consent of the states through which it should pass. By 1818 the road was open to traffic from Cumberland, Maryland, to Wheeling, West Virginia. [32]
In 1816, with the experiences of the war before him, no well-informed statesman could shut his eyes to the national aspects of the problem. Even President James Madison invited the attention of Congress to the need of establishing "a comprehensive system of roads and canals". Soon after Congress met, it took under consideration a bill drafted by John C. Calhoun which proposed an appropriation of $1,500,000 ($26,929,412 in 2023 consumer dollars) for internal improvements. Because this appropriation was to be met by the moneys paid to the government by the National Bank, the bill was commonly referred to as the "Bonus Bill". But on the day before he left office, President Madison vetoed the bill because he considered it unconstitutional. The policy of internal improvements by federal aid was thus wrecked on the constitutional scruples of the last of the Virginia dynasty. Having less regard for consistency, the United States House of Representatives recorded its conviction, by close votes, that Congress could appropriate money to construct roads and canals, but had not the power to construct them. As yet the only direct aid of the national government to internal improvements consisted of various appropriations, amounting to about $1,500,000, for the Cumberland Road. [33]
As the country recovered from financial depression following the Panic of 1819, the question of internal improvements again forged to the front. In 1822, a bill to authorize the collection of tolls on the Cumberland Road had been vetoed by President James Monroe. In an elaborate essay, Monroe set forth his views on the constitutional aspects of a policy of internal improvements. Congress might appropriate money, he admitted, but it might not undertake the actual construction of national works nor assume jurisdiction over them. For the moment, the drift toward a larger participation of the national government in internal improvements was stayed.
Two years later, Congress authorized the U.S. president to institute surveys for such roads and canals as he believed to be needed for commerce and military defense. No one pleaded more eloquently for a larger conception of the functions of the national government than Henry Clay. He called the attention of his hearers to provisions made for coast surveys and lighthouses on the Atlantic seaboard and deplored the neglect of the interior of the country. Among the other presidential candidates, Andrew Jackson voted in the United States Senate for the general survey bill; and John Quincy Adams left no doubt in the public mind that he did not reflect the narrow views of his section on this issue. William H. Crawford felt the constitutional scruples which were everywhere being voiced in the South, and followed the old expedient of advocating a constitutional amendment to sanction national internal improvements. [34]
In President Adams' first message to Congress, he advocated not only the construction of roads and canals but also the establishment of observatories and a national university. President Thomas Jefferson had recommended many of these measures in 1806, asking Congress to consider the constitutional amendments necessary to authorize them. Adams seemed oblivious to the limitations of the Constitution. In much alarm, Jefferson suggested to Madison the desirability of having Virginia adopt a new set of resolutions, bottomed on those of 1798, and directed against the acts for internal improvements. In March 1826, the Virginia General Assembly declared that all the principles of the earlier resolutions applied "with full force against the powers assumed by Congress" in passing acts to protect manufacturers and to further internal improvements. That the administration would meet with opposition in Congress was a foregone conclusion. [35]
Despite the new efficiencies introduced by the turnpikes and canals, travel along these routes was still time-consuming and expensive. The idea of integrating a steam boiler and propulsion system can first be attributed to John Fitch and James Rumsey, who both filed for patents or state monopolies on steamboats in the late 1780s. However, these first steamboats were complicated, heavy, and expensive. It would be almost 20 years until Robert R. Livingston contracted a civil engineer named Robert Fulton to develop an economical steamboat. Fulton's paddle steamer, the North River Steamboat (erroneously referred to as the Clermont), made its first trip from New York City north on the Hudson River to Albany on August 17, 1807. By 1820, steamboat services had been established on all the Atlantic tidal rivers and Chesapeake Bay. The shallow-bottomed boats were also ideally suited for navigating the Mississippi and Ohio Rivers, and the number of boats on these rivers increased from 17 to 727 between 1817 and 1855. [36] The speed of the steamboats cut travel times between coastal ports and upstream cities by weeks and reduced the cost of transporting goods along these rivers by as much as 90%. [37]
Steamboats profoundly altered the relationships between the federal government, state governments, and private property owners. Livingston and Fulton had obtained monopoly rights to operate a steamboat service within the state of New York, but Thomas Gibbons, who operated a competing New Jersey ferry service, was enjoined from entering New York waters under the terms of the monopoly. In 1824, the Supreme Court ruled in Gibbons v. Ogden that Congress could regulate commerce and transportation under the Commerce Clause which compelled the state of New York to allow steamboat services from other states.
Because the physics and metallurgy of boilers were poorly understood, steamboats were prone to boiler explosions that killed hundreds of people between the 1810s and 1840s. [38] In 1838, legislation was enacted that mandated boiler inspections by federal agents under the threat of revocation of the operator's navigation licenses and lowered the threshold for liability in suits arising from such accidents. While Americans long resisted any government's power to regulate private property, these new rules demonstrated that many Americans believed that property rights did not override civil rights and set the precedent for future federal safety regulations. [39]
Many minerals have been mined across the United States. Coal was one of the earliest, mined mainly in the eastern states, and when combined with iron ore it led to the growth of the steel industry. Gold and silver mining was a significant contributor to the rapid expansion of the western states in the 19th century. Copper mining began in the 1840s, and the United States is the fourth largest copper producer in the world.
The period after the American Civil War was marked by increasingly intense and pervasive industrialization and by successive technological advances like the railroad, the telegraph and telephone, and the internal combustion engine. These advances facilitated America's westward expansion and economic development by connecting the frontier with the industrial, financial, and political centers of the East. Americans increasingly relied upon technological infrastructures like the railroad, electric, and telecommunications systems for economic and social activities.
The Overman Wheel Company, founded in 1882, was the first manufacturer of safety bicycles in the United States, at its factory complex in Chicopee, Massachusetts. [40] Following the safety bicycle's creation in England, Overman rushed one to production before the end of 1887. [41] Overman was known for making all-steel bicycles with no cast metal parts. [40] The Overman Victor bicycle was said to be of higher quality and lower weight than other bicycles of its time. [40] By 1893, the Overman factory made the complete bicycle, including tires, saddles, rims, etc. [42]
Between 1820 and 1830, many inventors and entrepreneurs began to apply emerging steamboat technology to engines that could travel on land. The earliest proposal came in 1813 from Oliver Evans' idea of a railway to connect New York City and Philadelphia with "carriages drawn by steam engines." [43] Many individuals and companies can claim to have operated the first railroad in the United States, but by the mid-1830s several companies were using steam-powered locomotives to move train cars on rail tracks.
Between 1840 and 1860, the total length of railroad trackage increased from 3,326 to 30,600 miles (5,350 to 49,250 km). [44] The efficiency of railroads in moving large, bulky loads enabled further drops in the cost of transporting goods to market, but in doing so undermined the profitability of the earlier turnpikes and canals, which began to fold and fall into disrepair. However, the early railroads were poorly integrated; there were hundreds of competing companies using different gauges for their track, requiring cargo to be trans-shipped, rather than traveling directly, between cities.
The completion of the first transcontinental railroad in 1869, and its attendant profit and efficiency, had the effect of stimulating a period of intense consolidation and technological standardization that would last another 50 years. It was during this time that railroad magnates such as Jay Gould and Cornelius Vanderbilt amassed great power and fortunes from the consolidation of smaller rail lines into national corporations. By 1920, 254,000 miles (408,800 km) of standard-gauge railroad track had been laid in the United States, all of it owned or controlled by seven organizations. [45] The need to synchronize train schedules and the inefficiencies introduced by every city keeping its own local time also led to the introduction of standard time by railway managers in 1883. Railroads began using diesel locomotives in the 1930s, and these completely replaced steam locomotives by the 1950s, reducing costs and improving reliability. [46]
During the Post–World War II economic expansion many railroads were driven out of business due to competition from airlines and Interstate highways. The rise of the automobile led to the end of passenger train service on most railroads. Trucking businesses had become major competitors by the 1930s with the advent of improved paved roads, and after the war they expanded their operations as the interstate highway network grew, and acquired increased market share of freight business. [47]
In 1970 the Penn Central railroad declared bankruptcy, the largest bankruptcy in the US at that time. In response Congress created a government corporation, Amtrak, to take over operation of the passenger lines of Penn Central and other railroads, under the Rail Passenger Service Act. [48] [49] Amtrak began inter-city rail operations in 1971. [50] In 1980 Congress enacted the Staggers Rail Act to revive freight traffic, by removing restrictive regulations and enabling railroads to be more competitive with the trucking industry. [51]
More railroad companies merged and consolidated their lines in order to remain successful. These changes led to the current system of fewer, but profitable, Class I railroads covering larger regions of the United States. [52] To replace the loss of commuter passenger rail service, state and local government agencies established their own commuter rail systems in several metropolitan areas, generally by leasing rail lines from Amtrak or freight railroads. [53] The first rapid transit systems began operation in Chicago (1892), Boston (1897) and New York City (1904).
Because iron occurs in nature commonly as an oxide, it must be smelted to drive off the oxygen to obtain the metallic form. Bloomery forges were prevalent in the colonies and could produce small batches of iron to be smithed for local needs (horseshoes, axeblades, plowshares) but were unable to scale production for exporting or larger-scale industry, including gunmaking, shipbuilding, and wheelmaking. [54] Blast furnaces creating cast iron and pig iron emerged on large self-sufficient plantations in the mid-17th century to meet these demands, but production was expensive and labor-intensive: forges, furnaces, and waterwheels had to be constructed, huge swaths of forest had to be cleared and the wood rendered into charcoal, and iron ore and limestone had to be mined and transported.
By the end of the 18th century, the threat of deforestation forced the English to use coke, a fuel derived from coal, to fire their furnaces. This shift precipitated a drop in iron prices since the process no longer required charcoal, the production of which was labor-intensive. This was a practice that was later adopted in the US as well. [55]
Although steel is an alloy of iron and a small amount of carbon, historically steel-making and iron-making were intended for different products, given the high cost of steel relative to wrought iron. The main difficulty in making steel is that its melting point is higher than that of pig or cast iron, and it was not easily achievable in large-scale production until methods were developed that introduced air or oxygen to oxidize the carbon in molten pig iron, allowing the direct conversion of molten pig iron to molten steel.
Throughout the 18th and early 19th centuries, English steelmakers produced blister and crucible steel, which required specialized equipment. In the 19th century, innovations like steamboats, railroads, and guns increased demand for wrought iron and steel. The Mount Savage Iron Works in Maryland was the largest in the United States in the late 1840s, and the first in the nation to produce heavy rails for the construction of railroads. [56]
In the 1850s, American William Kelly and Englishman Henry Bessemer independently discovered that air blown through molten iron increases its temperature by oxidizing the carbon and separating additional impurities into the slag. The Kelly–Bessemer process, because it reduced the amount of coke needed for blasting and increased the quality of the finished metal, revolutionized the mass production of high-quality steel and facilitated a drastic drop in steel prices and an expansion of its availability.
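In outline, the chemistry behind this observation can be sketched as follows (a simplified summary of the standard Bessemer-converter reactions, not drawn from the cited sources). The air blast oxidizes the impurities dissolved in the pig iron, and these reactions are strongly exothermic:

$$\mathrm{Si} + \mathrm{O_2} \rightarrow \mathrm{SiO_2}, \qquad 2\,\mathrm{C} + \mathrm{O_2} \rightarrow 2\,\mathrm{CO}, \qquad 2\,\mathrm{Mn} + \mathrm{O_2} \rightarrow 2\,\mathrm{MnO}$$

The heat released keeps the charge molten and raises it above the melting point of the resulting low-carbon steel, which is why no external fuel is needed during the blow.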
In 1868, Andrew Carnegie saw an opportunity to integrate new coke-making methods with the recently developed Kelly-Bessemer process to supply steel for railroads. In 1872, he built a steel plant in Braddock, Pennsylvania at the junction of several major railroad lines. Carnegie earned enormous profits by pioneering vertical integration; he owned the iron ore mines in Minnesota, the transport steamboats on the Great Lakes, the coal mines and coke ovens, and the rail lines delivering the coke and ore to his Pennsylvania mills. By 1900, the Carnegie Steel Company was producing more steel than all of Britain and in 1901 Carnegie sold his business to J.P. Morgan's U.S. Steel earning Carnegie $480 million personally.
The ability to quickly transmit information over long distances would prove to have an enormous impact on many diverse fields like journalism, banking, and diplomacy. Between 1837 and 1844, Samuel F.B. Morse and Alfred Vail developed a transmitter that could send "short" or "long" electric currents which would move an electromagnetic receiver to record the signal as dots and dashes. Morse established the first telegraph line (between Baltimore and Washington D.C.) in 1844 and by 1849 almost every state east of the Mississippi had telegraph service. [57] Between 1850 and 1865, the telegraph business became progressively more consolidated and the 1866 incorporation of Western Union emerged with a near-monopoly over 22,000 telegraph offices and 827,000 miles (1,330,900 km) of cable throughout the country. [57] The telegraph was used to dispatch news from the fronts of the Mexican–American War, coordinate Union troop movements during the Civil War, relay stock and commodity prices and orders between markets on ticker tape, and conduct diplomatic negotiations after the Transatlantic telegraph cable was laid in 1866.
Alexander Graham Bell obtained a patent in 1876 for a device that could transmit and reproduce the sound of a voice over electrical cables. Bell realized the enormous potential of his telephone and formed the Bell Telephone Company, which would control the whole system, from manufacturing the telephones to leasing the equipment to customers and operators. Between 1877 and 1893 (the term of Bell's patent coverage) the number of phones leased by Bell's company increased from 3,000 to 260,000, although these were largely limited to businesses and government offices that could afford the relatively high rates. [58] After the Bell patents expired, thousands of independent operators became incorporated, and their competition for service to middle- and lower-income households as well as rural farmers drove prices down significantly. By 1920, there were 13 million phones in the United States providing service to 39 percent of all farm households and 34 percent of non-farm households. [59]
The 1859 discovery of crude oil in western Pennsylvania set off an "oil rush" reminiscent of the 1849 California Gold Rush and would prove to be a valuable resource on the eve of the Civil War. Because crude oil needs to be distilled to extract usable fuel oils, oil refining quickly became a major industry in the area. The rural and mountainous terrain of these Pennsylvania oilfields allowed neither economical in-situ refining nor efficient railroad transportation of extracted oil. Beginning in 1865, the construction of oil pipelines to connect the oilfields with railroads or oil refineries alleviated this geographical bottleneck but also put thousands of coopers and teamsters (who made the barrels and drove the wagons to transport oil) out of business. [60] As the network of oil pipelines expanded, they became more integrated with both the railway and telegraph systems which enabled even greater coordination in production, scheduling, and pricing.
John D. Rockefeller was a forceful driver of consolidation in the American oil industry. Beginning in 1865, he bought refineries, railroads, pipelines, and oilfields and ruthlessly eliminated competition to his Standard Oil. By 1879, he controlled 90% of oil refined in the US. [60] Standard Oil used pipelines to directly connect the Pennsylvanian oilfields with the refineries in New Jersey, Cleveland, Philadelphia, and Baltimore, rather than loading and unloading railroad tank cars, which enabled huge gains in efficiency and profitability. Given the unprecedented scale of Standard Oil's network, the company developed novel methods for managing, financing, and organizing its businesses. Because laws governing corporations limited their ability to do business across state lines, Standard Oil pioneered the use of a central trust that owned and controlled the constituent companies in each state. [60] The use of trusts by other industries to stifle competition and extract monopoly prices led to the 1890 passage of the Sherman Antitrust Act.
In the 1911 case of Standard Oil Co. of New Jersey v. United States, the U.S. Supreme Court ordered that the Standard Oil Trust be disbanded into competing companies, which would become Exxon (Standard Oil of New Jersey), Mobil (Standard Oil of New York), and Chevron (Standard Oil of California).
The demand for petroleum products increased rapidly after the turn of the century as families relied upon kerosene to heat and light their houses, industries relied upon lubricants for machinery, and the ever-more prevalent internal combustion engine demanded gasoline. Between 1880 and 1920, the amount of oil refined annually jumped from 26 million to 442 million barrels. [60] The discovery of large oil fields in Texas, Oklahoma, Louisiana, and California in the early 20th century touched off "oil crazes" and contributed to these states' rapid industrialization. Because these previously agrarian western states lay outside Standard Oil's production and refining networks, cities like Long Beach, California, Dallas, Texas, and Houston, Texas emerged as major centers for refining and managing the new fields under companies like Sunoco, Texaco, and Gulf Oil.
Benjamin Franklin pioneered the study of electricity, being the first to describe positive and negative charges [61] as well as advancing the principle of conservation of charge. [62] Franklin is best known for the apocryphal feat of flying a kite in a thunderstorm to prove that lightning is a form of electricity, which, in turn, led to the invention of the lightning rod to protect buildings.
Electricity remained a novelty through the early to mid-19th century, but advances in battery storage, generation, and lighting turned it into a domestic business. By the late 1870s and early 1880s, central generating plants supplying power to arc lamps, first in Europe and then in the US, began spreading rapidly, replacing oil and gas for outdoor lighting; these systems ran on very high voltage (3,000–6,000 volts) direct or alternating current.
In 1880, Thomas Edison developed and patented a system for indoor lighting that competed with gas lighting, based on a long-lasting, high-resistance incandescent light bulb that ran on relatively low-voltage (110 volt) direct current. Commercializing this venture was a task far beyond what Edison's small laboratory could handle, requiring the setup of a large, investor-backed utility involving companies that would manufacture the whole technological system upon which the "light bulb" would depend: generators (Edison Machine Company), cables (Edison Electric Tube Company), generating plants and electric service (Edison Electric Light Company), sockets, and bulbs. [63]
In addition to lighting, electric motors (analogous to generators operating in reverse, using a current to spin a magnet to perform work) became extremely important to industry. Speed control of early DC motors limited their use. Frank J. Sprague developed the first successful DC motor (c. 1886) by solving the problem of maintaining a constant speed under varying load. Within a few years DC motors were used in electric street railways. [64] In 1888, a Serbian immigrant, Nikola Tesla, a former employee of Edison's, patented an AC induction motor and licensed it to the Westinghouse Corporation. Electric motors eventually replaced steam engines in factories around the nation, as they required neither complex mechanical transmissions from a central engine nor water sources for steam boilers in order to operate. [65]
Edison's direct current generation dominated the initial years of indoor commercial and residential electric lighting and electric power distribution. However, DC transmission was hampered by the difficulty of changing voltages between industrial generation and residential and commercial consumption, and the low voltages used suffered from poor transmission efficiency. The mid-1880s saw the introduction of the transformer, allowing alternating current to be transmitted at high voltage over long distances with greater efficiency and then "stepped down" to supply commercial and domestic indoor lighting. As a result, AC went from being the outdoor "arc lighting current" to taking over the domestic lighting market that Edison's DC system had been designed to supply. The rapid spread of AC and the haphazard installation of power lines, especially in New York City, led to a series of deaths attributed to high-voltage AC and an eventual media backlash against the current. [66] [67]
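The efficiency advantage of high-voltage transmission can be stated with a standard formula (included here for illustration; it is textbook physics rather than material from the cited sources). For a line of resistance $R$ delivering power $P$ at voltage $V$, the current is $I = P/V$, so the resistive loss is

$$P_{\text{loss}} = I^2 R = \left(\frac{P}{V}\right)^{2} R,$$

meaning that transmitting at ten times the voltage cuts line losses a hundredfold. Transformers made such voltage changes cheap and reliable for AC, while no comparable device existed for DC at the time.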
In 1888, the Edison company played up the dangers of AC power in their literature and assisted self-appointed anti-AC crusader Harold P. Brown in a parallel goal to limit, to the point of ineffectiveness, the voltages in AC power systems, a market then dominated by Westinghouse Electric. This series of events came to be known as the war of the currents. Brown and Edison's lobbying in state legislatures went nowhere and the Edison company continued to lose market share and profitability to the AC based companies. In 1892 the "war" ended with Thomas Edison losing any remaining control of his own company when it was merged with Westinghouse's chief AC rival, the Thomson-Houston Electric Company, to form General Electric, creating a company that controlled three quarters of the US electrical business. [68] [69] Westinghouse's lead in AC development would allow them to win a contract in 1893 to build an AC based power station at the Niagara Falls but the transmission contract was awarded to General Electric, who would come to dominate the US electrical business for many years afterwards. [68]
As in other industries of the era, these companies achieved greater efficiencies by eventually merging into conglomerates, with over a dozen electric companies of the 1880s merging down to just two, General Electric and Westinghouse. Lighting was immensely popular: between 1882 and 1920, the number of generating plants in the US increased from one in Downtown Manhattan to nearly 4,000. [63] While the earliest generating plants were constructed in the immediate vicinity of consumers, plants generating electricity for long-distance transmission were in place by 1900. To help finance this great expansion, the utility industry exploited a financial innovation known as the "holding company"; a favorite holding company investment was the Electric Bond and Share Company, later known as Ebasco, created by General Electric in 1905. The abuse of holding companies, like that of trusts before them, led to the Public Utility Holding Company Act of 1935, [70] but by 1920, electricity had surpassed the petroleum-based lighting sources that had dominated the previous century.
The technology for creating an automobile emerged in Germany in the 1870s and 1880s: Nicolaus Otto created a four-stroke internal combustion engine, Gottlieb Daimler and Wilhelm Maybach modified the Otto engine to run at higher speeds, and Karl Benz pioneered electric ignition. The Duryea brothers and Hiram Percy Maxim were among the first to construct a "horseless carriage" in the US in the mid-1890s, but these early cars proved heavy and expensive.
Henry Ford revolutionized the automobile manufacturing process by employing interchangeable parts on assembly lines, the beginning of industrial mass production. In 1908, the Ford Motor Company released the Ford Model T, which could generate 20 horsepower, was lightweight, and was easy to repair. Demand for the car was so great that he had to relocate his assembly plant to Highland Park, Michigan in 1912. The new plant was a model of industrial efficiency for the time: it was well lit and ventilated, employed conveyors to move parts along an assembly line, and workers' stations were arranged in an orderly fashion along the line. The efficiency of the assembly line allowed Ford to realize great gains in economy and productivity: in 1912, Ford sold 6,000 cars for approximately $900 each, and by 1916 approximately 577,000 Model T automobiles were sold at $360 each. [71] Ford was able to scale production rapidly because assembly-line workers were unskilled laborers performing repetitive tasks. Ford hired European immigrants, African-Americans, ex-convicts, and the disabled and paid comparatively high wages, but was quick to dismiss anyone involved in labor unions or radical political associations. [72]
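Taken together (a rough calculation from the figures above, not from the cited source), the Model T's price fell by $(900 - 360)/900 = 60\%$ between 1912 and 1916 while unit sales grew from 6,000 to roughly 577,000, an almost hundredfold increase, illustrating how assembly-line economies of scale translated into both lower prices and vastly higher volume.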
With growth of American automobile usage, urban and rural roads were gradually upgraded for the new traffic. Local automobile clubs formed the American Automobile Association to lobby city, state, and federal governments to widen and pave existing roads and build limited-access highways. Some federal road aid was passed in the 1910s and 1920s, resulting in major national highways like U.S. Route 1 and U.S. Route 66. The coverage and quality of many roads would greatly improve following Depression-era Works Progress Administration investment in road infrastructure. [73]
New automobile sales slowed temporarily during World War II, when wartime rationing and military production lines limited the number of automobiles that could be manufactured; the largest companies, like Ford, GM, and Chrysler, survived those lean years. After the war, rising family sizes, increasing affluence, and government-subsidized mortgages for veterans fueled a boom in single-family homes, [74] and many of these new households owned automobiles. In 1956, Congress passed the Federal-Aid Highway Act of 1956, which provided funding for the construction of 41,000 miles (66,000 km) of toll-free expressways throughout the country, laying the legislative and infrastructural foundations for the modern American highway system.
Radio communication, originally known as "wireless telegraphy", was first developed in the 1890s. The first wireless transmissions were achieved by Guglielmo Marconi in Europe and they were first replicated in the United States in April 1899 by Professor Jerome Green at the University of Notre Dame. [75] [76] [77]
The spark-gap transmitters initially employed could only transmit the dots-and-dashes of Morse code. Despite this limitation, in 1905 a small number of U.S. Navy stations inaugurated daily time signal broadcasts. [78] In 1913 the high-powered station NAA in Arlington, Virginia began broadcasting daily time signals and weather reports in Morse code which covered much of the eastern United States. [79]
The leading early proponent of radio broadcasting in the United States was Lee de Forest, who employed versions of an arc transmitter developed by Valdemar Poulsen to make a series of demonstrations beginning in 1907. [80] From the outset de Forest noted the potential for regular entertainment broadcasts, envisioning "the distribution of music from a central station" and that "by using four different forms of wave as many classes of music can be sent out as desired by the different subscribers". [81]
In the mid-1910s, the development of vacuum tube transmitters provided a significant improvement in the quality and reliability of audio transmissions. Lee de Forest established experimental station 2XG in New York City, which began broadcasting in 1916. [82] Station KDKA in Pittsburgh was the first to offer regularly scheduled broadcasts in 1920. [83]
Early radio receivers were developed in the 1890s and used a coherer, a primitive radio wave detector. Vacuum tube technology led to major improvements in receiving sets, both in detecting the radio signals via amplification and better audio quality. The Audion (triode) vacuum tube, invented by de Forest in 1906, was the first practical amplifying device and revolutionized radio. [84] The advent of radio broadcasting increased the market for radio receivers greatly, and transformed them into a consumer product. By the 1930s, the broadcast receiver had become a common household item, with standardized controls that anyone could use. [85]
The invention of the transistor in 1947 again revolutionized radio technology, making truly portable receivers possible, beginning with transistor radios in the late 1950s. [86]
In the 1840s, as more and more western states joined the Union, many poor and middle-class Americans increasingly agitated for free land in these large, undeveloped areas. Early efforts to pass a Homestead Act by George Henry Evans and Horace Greeley were stymied by Southern states who feared that free land would threaten the plantation system. The Homestead Act was passed in 1862 after the opposing Southern states had seceded. The Homestead Act granted 160 acres (65 hectares) to farmers who lived on the land for 5 years or allowed the farmer to purchase the land after 6 months for $1.25 per acre ($3.1/ha).
Even as America's westward expansion allowed over 400 million acres (1,600,000 km2) of new land to be put under cultivation, between 1870 and 1910 the number of Americans involved in farming or farm labor dropped by a third. [87] New farming techniques and agricultural mechanization facilitated both processes. Cyrus McCormick's reaper (invented in 1834) allowed farmers to quadruple their harvesting efficiency by replacing hand labor with a mechanical device. John Deere's steel plow, invented in 1837, kept the rich prairie soil of the Midwest from sticking to the blade, making it far easier to farm. The harvester, self-binder, and combine allowed even greater efficiencies: wheat farmers in 1866 achieved an average yield of 9.9 bushels per acre, but by 1898 yields had increased to 15.3 bushels per acre even as the total area under cultivation tripled. [88]
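Combining the two figures above (a rough calculation, not from the cited source): per-acre yields grew by a factor of $15.3/9.9 \approx 1.55$, which together with a tripling of acreage implies that total wheat output grew roughly $1.55 \times 3 \approx 4.6$ times between 1866 and 1898.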
Railroads allowed harvests to reach markets more quickly and Gustavus Franklin Swift's refrigerated railroad car allowed fresh meat and fish to reach distant markets. Food distribution also became more mechanized as companies like Heinz and Campbell distributed previously perishable foods by canning and evaporation. Commercial bakeries, breweries, and meatpackers replaced locally owned operators and drove demand for raw agricultural goods. Despite increasing demand, rising production caused a drop in prices, creating substantial discontent among farmers. Organizations like The Grange and Farmers Alliance emerged to demand monetary policy that allowed for money supply expansion (as most farmers carried significant debt from planting time to harvest time), railroad regulations, and protective tariffs.
The period between 1865 and 1920 was marked by the increasing concentration of people, political power, and economic activity in urban areas. In 1860, there were nine cities with populations over 100,000; by 1910 there were fifty. [87] These new large cities were not coastal ports (like New York, Boston, and Philadelphia) but lay inland along new transportation routes, including Chicago, Cleveland, and Denver. The first twelve presidents of the United States had all been born into farming communities, but between 1865 and 1912 the presidency was filled by men whose backgrounds lay in representing businesses and cities.
Industrialization and urbanization reinforced each other and urban areas became increasingly congested. As a result of unsanitary living conditions, diseases like cholera, dysentery, and typhoid fever struck urban areas with increasing frequency. [89] Cities responded by paving streets, digging sewers, [90] sanitizing water, [91] constructing housing, and creating public transportation systems.
The first large-scale electric trolley line in the world, the Richmond Union Passenger Railway, opened on February 2, 1888. [92] The first interurban line was the Newark and Granville Street Railway in Ohio, which opened in 1889. [93] The first non-experimental trolleybus system was installed near Nantasket Beach in 1904; [94] the first year-round commercial line was built in 1910 to open a hilly property to development just outside Los Angeles. [95] In 1912, articulated trams were invented and first used by the Boston Elevated Railway.
As the nation deepened its technological base, old-fashioned artisans and craftsmen were "deskilled" and replaced by specialized workers and engineers who used machines to replicate in minutes or hours work that would have taken a journeyman hours or days to complete. Frederick W. Taylor, recognizing the inefficiencies of many production lines, proposed that studying the motions and processes needed to manufacture each component of a product, reorganizing the factory and its manufacturing processes around the work, and paying workers piece rates would yield great gains in efficiency. Scientific management, or "Taylorism" as it came to be known, was soon being applied by progressive city governments to make their urban areas more efficient and by suffragettes to home economics. [97]
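The piece-rate mechanism Taylor advocated is easy to illustrate; the sketch below shows a differential piece rate, where meeting the time-study standard earns the higher per-piece rate on all pieces. The rates and the 50-piece standard are hypothetical, invented purely for illustration:

```python
def differential_piece_rate(pieces: int, standard: int,
                            low_rate: float, high_rate: float) -> float:
    """Taylor-style differential piece rate: a worker who meets the
    time-study standard earns the higher per-piece rate on all pieces;
    one who falls short earns the lower rate on all pieces."""
    rate = high_rate if pieces >= standard else low_rate
    return pieces * rate

# Hypothetical rates and standard, for illustration only: a worker who
# just meets a 50-piece standard out-earns one making 49 pieces by far
# more than a single piece's value.
print(differential_piece_rate(49, 50, 0.25, 0.35))  # 12.25
print(differential_piece_rate(50, 50, 0.25, 0.35))  # 17.5
```

The sharp jump at the standard was the point: it rewarded workers for matching the pace the time studies said was achievable.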
Increasing industrialization outpaced the supply of laborers able or willing to take dangerous, low-paying, dead-end jobs. The resulting demand for low-skilled and unskilled labor drove wages up and attracted waves of Irish, Italian, Polish, Russian, and Jewish immigrants who could earn more in America than in their homelands.
The earliest unions emerged before the Civil War as trade guilds composed of journeyman carpenters, masons, and other artisans who would engage in strikes to demand better hours and pay from their masters. All branches of government generally sought to stop labor from organizing into unions or from organizing strikes.
To finance the larger-scale enterprises required during this era, the stockholder corporation emerged as the dominant form of business organization. Corporations expanded by combining into trusts and by merging competing firms into single companies, in some cases creating monopolies.
The Progressive movement and the Progressive Era that emerged from it was in part a reaction to excesses of the new industrial age. "Muckraking" journalists reported on a wide array of social issues, and the reaction of the public lent urgency to reforms that led to increased government regulation, such as the Interstate Commerce Act of 1887 (pertaining to railroads), and the Meat Inspection Act and Pure Food and Drug Act (1906).
In the 20th century, the pace of technological development became increasingly tied to a complex set of interactions among Congress, industrial manufacturers, university research, and the military establishment. This set of relations, popularly known as the "military-industrial complex," emerged because the military's unique technological demands, concentration of funding, large-scale application, and highly centralized control played a dominant role in driving technological innovation. Fundamental advances in medicine, physics, chemistry, computing, aviation, materials science, naval architecture, meteorology, and other fields can be traced back to basic and applied research for military applications.
"Smokestack America" became a nickname for the traditional manufacturing core of U.S. industry, used to represent particular industries, [98] regions, or towns. [99] [100] [101] [102] [103] [104] [105]
The first universities in the United States were modeled on the liberal curricula of the great English universities and were meant to educate clergymen and lawyers rather than teach vocational skills or conduct scientific research. The U.S. Military Academy, established in 1802, broke the mold of traditional universities and military academies alike by including practical engineering-related subjects in its earliest curricula.
By the middle of the 19th century, polytechnic institutes were being founded in increasing numbers to train students in the scientific and technical skills needed to design, build, and operate increasingly complex machines. In 1824, Stephen Van Rensselaer established the first American school to grant a bachelor's degree in technical subjects, and in the 1850s several Ivy League schools began to offer courses of study in scientific fields.
Congressional legislators, recognizing the increasing importance and prevalence of these eastern polytechnic schools, passed the Morrill Land-Grant Colleges Act of 1862, granting each state 30,000 acres (120 km²) of federal land, either within or contiguous to its boundaries, for each member of Congress the state had as of the census of 1860 (a minimum of 90,000 acres (360 km²)). [106] The land was to be used toward establishing and funding educational institutions that would teach courses on military tactics, engineering, and agriculture.
Many of the United States' noted public research universities can trace their origins back to land grant colleges. Between 1900 and 1939, enrollment in post-secondary institutions increased from 238,000 to 1,494,000, [107] and higher education became so available and affordable that a college degree was increasingly required for scientific, engineering, and government jobs that had previously required only vocational or secondary education. [108]
After World War II, the GI Bill caused university enrollments to explode as millions of veterans earned college degrees.
The introduction of the airplane to the battlefield was one of the most radical changes in the history of warfare. The history of flight spans hundreds of years, and the distinction of building the first flying machine is contested, but in December 1903 the Wright brothers achieved sustained, piloted, controlled heavier-than-air flight. The Wright brothers had difficulty raising funding from the government and the military, but after World War I began in 1914, airplanes quickly assumed great tactical importance for both sides (see Aviation in World War I); the U.S. government appropriated $640 million in 1917 to procure 20,000 airplanes for aerial reconnaissance, dogfighting, and aerial bombing. [109]
After the close of the war in 1918, the U.S. government continued to fund peacetime aeronautical activities such as airmail and the National Advisory Committee for Aeronautics. Throughout the 1920s and 1930s, industrial, university, and military research continued to improve the power, maneuverability, and reliability of airplanes: Charles Lindbergh completed a solo non-stop transatlantic flight in 1927, Wiley Post flew around the world in under nine days in 1931, and Howard Hughes shattered airspeed records throughout the decade. In the 1930s, passenger airlines boomed as a result of the Kelly Act, state and local governments began constructing airports to attract airlines, and the federal government began to regulate air traffic control and investigate aviation accidents and incidents.
The American physicist Robert Goddard was one of the first scientists to experiment with rocket propulsion systems. In his small laboratory in Worcester, Massachusetts, Goddard worked with liquid oxygen and gasoline to propel rockets into the atmosphere, and in 1926 he successfully fired the world's first liquid-fueled rocket, which reached a height of 12.5 meters. Over the next ten years, Goddard's rockets achieved modest altitudes of nearly two kilometers, and interest in rocketry increased in the United States, Britain, Germany, and the Soviet Union.
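The appeal of liquid propellants can be seen in the ideal rocket equation (standard rocket theory, not stated in the source): the achievable change in velocity depends only on the exhaust velocity $v_e$ and the ratio of initial mass $m_0$ to final (empty) mass $m_f$,

$$\Delta v = v_e \ln\frac{m_0}{m_f},$$

and liquid combinations like Goddard's oxygen and gasoline deliver a substantially higher $v_e$ than the black-powder rockets that preceded them.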
At the end of World War II, both American and Soviet forces recruited or smuggled top German scientists like Wernher von Braun back to their respective countries to continue defense-related work. Expendable rockets provided the means for launching artificial satellites as well as crewed spacecraft. In 1957 the Soviet Union launched the first artificial satellite, Sputnik I, and the United States followed with Explorer I in 1958. The first crewed space flights were made in early 1961, first by Soviet cosmonaut Yuri Gagarin and then by American astronaut Alan Shepard.
From those first tentative steps, through the Apollo Moon landing in 1969, to the reusable Space Shuttle, the American space program has produced an impressive range of applied science. Communications satellites transmit computer data, telephone calls, and radio and television broadcasts. Weather satellites furnish the data necessary to provide early warnings of severe storms.
As in physics and chemistry, Americans have dominated the Nobel Prize for physiology or medicine since World War II. The private sector has been the focal point for biomedical research in the United States, and has played a key role in this achievement. As of 2000, for-profit industry funded 57%, non-profit private organizations funded 7%, and the tax-funded National Institutes of Health funded 36% of medical research in the U.S. [110] Funding by private industry increased 102% from 1994 to 2003. [111]
The National Institutes of Health consists of 24 separate institutes supporting the prevention, detection, diagnosis, and treatment of diseases and disabilities. At any given time, grants from the NIH support the research of about 35,000 principal investigators, working in every U.S. state and several foreign countries. Between 1971 and 1991, mortality from heart disease dropped 41 percent and mortality from strokes decreased by 59 percent, and today more than 70 percent of children who get cancer are cured.
Molecular genetics and genomics research have revolutionized the biomedical sciences. In the 1980s and 1990s, researchers performed the first trials of gene therapy in humans, and they are now able to locate, identify, and describe the function of many genes in the human genome.
Research conducted by universities, hospitals, and corporations also contributes to improvements in the diagnosis and treatment of disease. The NIH funded the basic research on HIV/AIDS, for example, while many of the drugs used to treat the disease emerged from the laboratories of the American pharmaceutical industry.
American researchers made fundamental advances in telecommunications and information technology. For example, AT&T's Bell Laboratories spearheaded the American technological revolution with a series of inventions including the laser, the transistor, the C programming language, and the UNIX computer operating system. SRI International and Xerox PARC in Silicon Valley helped give birth to the personal computer industry, while ARPA and NASA funded the development of the ARPANET and the Internet. Companies like IBM and Apple Computer developed personal computers, while Microsoft created operating systems and office productivity software to run on them.
With the growth of information on the World Wide Web, search companies like Yahoo! and Google developed technologies to sort and rank web pages by relevance. The web has also become a venue for computer-mediated social interaction, and services like MySpace, Facebook, and Twitter are used by millions to communicate. Moore's Law and the increasing pervasiveness and speed of wireless networks have led to widespread adoption of mobile phones and increasingly powerful smartphones based on software platforms like Apple's iOS and Google's Android.
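Moore's Law is commonly stated as a doubling of transistor counts roughly every two years; a minimal sketch of the implied exponential growth follows. The 1971 baseline of about 2,300 transistors corresponds to the Intel 4004 and is used here only as an illustrative anchor; real scaling has varied:

```python
def moore_transistors(year: int, base_year: int = 1971,
                      base_count: float = 2300.0,
                      doubling_period_years: float = 2.0) -> float:
    """Project transistor counts under Moore's Law: a doubling roughly
    every two years from an illustrative 1971 baseline."""
    return base_count * 2 ** ((year - base_year) / doubling_period_years)

for year in (1971, 1991, 2011):
    print(year, f"{moore_transistors(year):,.0f}")
# 1971 -> 2,300; 1991 -> ~2.4 million; 2011 -> ~2.4 billion
```

Twenty years corresponds to ten doublings, a factor of about a thousand, which is why each generation of devices so thoroughly outclasses its predecessors.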
The Industrial Revolution, sometimes divided into the First Industrial Revolution and Second Industrial Revolution, was a period of global transition of the human economy towards more widespread, efficient and stable manufacturing processes that succeeded the Agricultural Revolution. Beginning in Great Britain, the Industrial Revolution spread to continental Europe and the United States, from around 1760 to about 1820–1840. This transition included going from hand production methods to machines; new chemical manufacturing and iron production processes; the increasing use of water power and steam power; the development of machine tools; and the rise of the mechanised factory system. Output greatly increased, and the result was an unprecedented rise in population and the rate of population growth. The textile industry was the first to use modern production methods, and textiles became the dominant industry in terms of employment, value of output, and capital invested.
Manufacturing is the creation or production of goods with the help of equipment, labor, machines, tools, and chemical or biological processing or formulation. It is the essence of the secondary sector of the economy. The term may refer to a range of human activity, from handicraft to high-tech, but it is most commonly applied to industrial design, in which raw materials from the primary sector are transformed into finished goods on a large scale. Such goods may be sold to other manufacturers for the production of other more complex products, or distributed via the tertiary industry to end users and consumers.
The Bessemer process was the first inexpensive industrial process for the mass production of steel from molten pig iron before the development of the open hearth furnace. The key principle is removal of impurities from the iron by oxidation with air being blown through the molten iron. The oxidation also raises the temperature of the iron mass and keeps it molten.
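The self-heating that keeps the charge molten comes from the oxidation reactions themselves; in simplified form (a sketch of the principal reactions, omitting phosphorus and slag chemistry):

$$\mathrm{Si} + \mathrm{O_2} \rightarrow \mathrm{SiO_2}, \qquad 2\,\mathrm{C} + \mathrm{O_2} \rightarrow 2\,\mathrm{CO}, \qquad 2\,\mathrm{Mn} + \mathrm{O_2} \rightarrow 2\,\mathrm{MnO}.$$

All of these are strongly exothermic, which is why blowing cold air through the converter raises rather than lowers the temperature of the melt.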
Mass production, also known as flow production, series production, series manufacture, or continuous production, is the production of substantial amounts of standardized products in a constant flow, including and especially on assembly lines. Together with job production and batch production, it is one of the three main production methods.
The history of rail transport began before the common era. It can be divided into several discrete periods defined by the principal means of track material and motive power used.
The Westinghouse Electric Corporation was an American manufacturing company founded in 1886 by George Westinghouse and headquartered in Pittsburgh, Pennsylvania. It was originally named "Westinghouse Electric & Manufacturing Company" and was renamed "Westinghouse Electric Corporation" in 1945. Through the early and mid-20th century, Westinghouse Electric was a powerhouse in heavy industry, electrical production and distribution, consumer electronics, home appliances, and a wide variety of other products. It was a major supplier of generators and steam turbines for most of its history and was also a major player in the field of nuclear power, starting with the Westinghouse Atom Smasher in 1937.
The Second Industrial Revolution, also known as the Technological Revolution, was a phase of rapid scientific discovery, standardisation, mass production, and industrialisation from the late 19th century into the early 20th century. The First Industrial Revolution, which ended in the middle of the 19th century, was followed by a slowdown in important inventions before the Second Industrial Revolution began around 1870. Though a number of its events can be traced to earlier innovations in manufacturing, such as the establishment of a machine tool industry, the development of methods for manufacturing interchangeable parts, and the invention of the Bessemer process and open hearth furnace to produce steel, later developments heralded the Second Industrial Revolution, which is generally dated between 1870 and 1914.
Railroads played a large role in the development of the United States, from the Industrial Revolution in the Northeast (1820s–1850s) to the settlement of the West (1850s–1890s). The American railroad mania began with the founding of the country's first passenger and freight line, the Baltimore and Ohio Railroad, in 1827, followed by the "Laying of the First Stone" ceremonies and the start of its long construction westward over the eastern chain of the Appalachian Mountains the next year. Railway building flourished continuously for the next 45 years, until the financial Panic of 1873 and the major economic depression that followed bankrupted many companies and temporarily halted growth.
The history of technology is the history of the invention of tools and techniques by humans. Technology includes methods ranging from simple stone tools to the complex genetic engineering and information technology that has emerged since the 1980s. The term technology comes from the Greek word techne, meaning art and craft, and the word logos, meaning word and speech. It was first used to describe applied arts, but it is now used to describe advancements and changes that affect the environment around us.
The Rust Belt, formerly the Steel Belt, is an area of industrial decline in the United States. From the late 19th century to late 20th century, the region formed the industrial heartland of the country, with its economies largely based on automobile and steel production, coal mining, and processing of raw materials. The term "Rust Belt" is a dysphemism to describe an industry that has "rusted out", referring to the impact of deindustrialization, economic decline, population loss, and urban decay which is attributable to an area's shrinking industrial sector. The term gained popularity in the U.S. beginning in the 1980s when it was commonly contrasted with the Sun Belt, which was then surging. Common definitions of the region stretch from Upstate New York and western Pennsylvania to southeastern Wisconsin and northern Illinois, including large parts of Ohio, Indiana, and Michigan. Some definitions of the Rust Belt also include parts of Iowa, Kentucky, Maryland, Minnesota, Missouri, New Jersey and West Virginia. Much of the Rust Belt is synonymous with the Great Lakes region of the United States.
Improvements to the steam engine were some of the most important technologies of the Industrial Revolution, although steam did not replace water power in importance in Britain until after the Industrial Revolution. From Englishman Thomas Newcomen's atmospheric engine of 1712, through major developments by Scottish inventor and mechanical engineer James Watt, the steam engine began to be used in many industrial settings, not just in mining, where the first engines had been used to pump water from deep workings. Early mills had run successfully with water power, but a steam engine allowed a factory to be located anywhere, not just close to a water source; water power also varied with the seasons and was not always available.
The economic history of the United States spans the colonial era through the 21st century. The initial settlements depended on agriculture and hunting/trapping, later adding international trade, manufacturing, and finally, services, to the point where agriculture represented less than 2% of GDP. Until the end of the Civil War, slavery was a significant factor in the agricultural economy of the South. The US was the world's largest economy for decades, possibly surpassed in the 21st century.
Economic stagnation is a prolonged period of slow economic growth, usually accompanied by high unemployment. Under some definitions, slow means significantly slower than potential growth as estimated by macroeconomists, even though the growth rate may be nominally higher than in other countries not experiencing economic stagnation.
An industrial slave is a type of slave who typically worked in an industrial setting. Such work was often more dangerous than that of agricultural slaves. Besides agriculture and domestic service, slaves were a main labor force in mining, in shipping as galley slaves, and in many kinds of manufacturing.
Before 1800, the iron and steel industry was located where raw materials, a power supply, and running water were easily available. After 1950, the industry began to be located on large areas of flat land near sea ports. The history of the modern steel industry began in the late 1850s; since then, steel has become a staple of the world's industrial economy. The bulk production of steel began as a result of Henry Bessemer's development of the Bessemer converter in 1857. Previously, steel was very expensive to produce and was used only in small, expensive items such as knives, swords, and armor.
At the time of its founding, the People's Republic of China was one of the poorest countries in the world. In the early 1950s, its industry developed rapidly through a state-led process heavily influenced by the Soviet experience. Aiming to close the gap between its political ambitions and its phase of development, China began the Great Leap Forward, which sought to even more rapidly industrialize the country. The effort largely failed, and its policies contributed to famine.
The United Kingdom, where the Industrial Revolution began in the late 18th century, has a long history of manufacturing, which contributed to Britain's early economic growth. During the second half of the 20th century, there was a steady decline in the importance of manufacturing and the economy of the United Kingdom shifted toward services. Manufacturing, however, remains important for overseas trade and accounted for 44% of goods exports in 2014. In June 2010, manufacturing in the United Kingdom accounted for 8.2% of the workforce and 12% of the country's national output. The East Midlands and West Midlands were the regions with the highest proportion of employees in manufacturing. London had the lowest at 2.8%.
The productivity-improving technologies are the technological innovations that have historically increased productivity.
The Edison Ore-Milling Company was a venture by Thomas Edison that began in 1881. Edison introduced some significant technological developments to the iron ore milling industry but the company ultimately proved to be unprofitable. Towards the end of the company's life, Edison realized the potential application of his technologies to the cement industry and formed the Edison Portland Cement Company in 1899.
In the United States, from the late 18th century through the 19th century, the Industrial Revolution transformed the U.S. economy from one based on manual labor, farm labor, and handicraft work to one with a greater degree of industrialization based on wage labor. Many improvements in technology and manufacturing fundamentals greatly increased overall production and economic growth in the U.S.