Electrical engineering
This article details the history of electrical engineering.
Long before any knowledge of electricity existed, people were aware of shocks from electric fish. Ancient Egyptian texts dating from 2750 BCE referred to these fish as the "Thunderer of the Nile", and described them as the "protectors" of all other fish. Electric fish were again reported millennia later by ancient Greek, Roman and Arabic naturalists and physicians. [1] Several ancient writers, such as Pliny the Elder and Scribonius Largus, attested to the numbing effect of electric shocks delivered by electric catfish and electric rays, and knew that such shocks could travel along conducting objects. [2] Patients with ailments such as gout or headache were directed to touch electric fish in the hope that the powerful jolt might cure them. [3] Possibly the earliest and closest approach to identifying lightning with electricity from any other source is attributable to the Arabs, who before the 15th century applied the Arabic word for lightning, ra‘ad (رعد), to the electric ray. [4]
Ancient cultures around the Mediterranean knew that certain objects, such as rods of amber, could be rubbed with cat's fur to attract light objects like feathers. Thales of Miletus, an ancient Greek philosopher, writing at around 600 BCE, described a form of static electricity, noting that rubbing fur on various substances, such as amber, would cause a particular attraction between the two. He noted that the rubbed amber could attract light objects such as hair, and that if it was rubbed for long enough a spark would even jump.
At around 450 BCE Democritus, a later Greek philosopher, developed an atomic theory similar to modern atomic theory. His mentor, Leucippus, is credited with the same theory. Their hypothesis held that everything is composed of atoms ("atomos"), which are indivisible and indestructible. Democritus presciently stated that between atoms lies empty space and that atoms are constantly in motion. He was incorrect only in holding that atoms come in different sizes and shapes, with each object having its own distinctively shaped and sized atom. [5] [6]
An object found in Iraq in 1938, dated to about 250 BCE and called the Baghdad Battery, resembles a galvanic cell and is claimed by some to have been used for electroplating in Mesopotamia, although there is no evidence for this.
Electricity would remain little more than an intellectual curiosity for millennia. In 1600, the English scientist William Gilbert extended the study of Cardano on electricity and magnetism, distinguishing the lodestone effect from static electricity produced by rubbing amber. [7] He coined the Neo-Latin word electricus ("of amber" or "like amber", from ήλεκτρον [elektron], the Greek word for "amber") to refer to the property of attracting small objects after being rubbed. [8] This association gave rise to the English words "electric" and "electricity", which made their first appearance in print in Thomas Browne's Pseudodoxia Epidemica of 1646. [9]
Further work was conducted by Otto von Guericke, who demonstrated electrostatic repulsion. Robert Boyle also published work on electricity. [10]
Though electrical phenomena had been known for centuries, in the 18th century, the systematic study of electricity became known as "the youngest of the sciences", and the public became electrified by the newest discoveries in the field. [11]
By 1705, Francis Hauksbee had discovered that if he placed a small amount of mercury in the glass of his modified version of Otto von Guericke's generator, evacuated the air from it to create a mild vacuum and rubbed the ball to build up a charge, a glow was visible if he placed his hand on the outside of the ball. This glow was bright enough to read by. It seemed to be similar to St. Elmo's Fire. This effect later became the basis of the gas-discharge lamp, which led to neon lighting and mercury vapor lamps. In 1706 he produced an 'Influence machine' to generate this effect. [12] He was elected a Fellow of the Royal Society the same year. [13]
Hauksbee continued to experiment with electricity, making numerous observations and developing machines to generate and demonstrate various electrical phenomena. In 1709 he published Physico-Mechanical Experiments on Various Subjects which summarized much of his scientific work.
Stephen Gray discovered the importance of insulators and conductors. C. F. du Fay, seeing his work, developed a "two-fluid" theory of electricity. [10]
In the 18th century, Benjamin Franklin conducted extensive research in electricity, selling his possessions to fund his work. In June 1752 he is reputed to have attached a metal key to the bottom of a dampened kite string and flown the kite in a storm-threatened sky. [14] A succession of sparks jumping from the key to the back of his hand showed that lightning was indeed electrical in nature. [15] He also explained the apparently paradoxical behavior of the Leyden jar as a device for storing large amounts of electrical charge, by coming up with the single fluid, two states theory of electricity.
In 1791, Italian Luigi Galvani published his discovery of bioelectricity, demonstrating that electricity was the medium by which nerve cells passed signals to the muscles. [10] [16] [17] Alessandro Volta's battery, or voltaic pile, of 1800, made from alternating layers of zinc and copper, provided scientists with a more reliable source of electrical energy than the electrostatic machines previously used. [16] [17]
The first practical application of electricity was electromagnetism. [18] William Sturgeon invented the electromagnet in 1825. [19] Electromagnets were then used in the first practical engineering application of electricity by William Fothergill Cooke and Charles Wheatstone, who co-developed a telegraph system that used a number of needles on a board, moved to point to letters of the alphabet. A five-needle system was used initially but was given up as too expensive. In 1838 an improvement reduced the number of needles to two, and a patent for this version was taken out by Cooke and Wheatstone. [20] Cooke tested the invention with the London & Blackwall Railway, the London & Birmingham Railway, and the Great Western Railway, which successively allowed the use of their lines for the experiment. Subsequently, railways developed systems with signal boxes along the line communicating with their neighbouring boxes by telegraphic sounding of single-stroke bells and three-position needle telegraph instruments. Such block signalling systems remained in use on rural lines well into the 21st century. [21]
Electrical engineering became a profession in the late 19th century. Practitioners had created a global electric telegraph network, and the first electrical engineering institutions to support the new discipline were founded in the UK and US. Although it is impossible to precisely pinpoint a first electrical engineer, Francis Ronalds stands ahead of the field; he created a working electric telegraph system in 1816 and documented his vision of how the world could be transformed by electricity. [22] [23] Over 50 years later, he joined the new Society of Telegraph Engineers (soon to be renamed the Institution of Electrical Engineers), where he was regarded by other members as the first of their cohort. [24] The donation of his extensive electrical library was a considerable boon for the fledgling Society.
Development of the scientific basis for electrical engineering, using research techniques, intensified during the 19th century. Notable developments early in the century include the work of Georg Ohm, who in 1827 quantified the relationship between the electric current and potential difference in a conductor, and of Michael Faraday, who discovered electromagnetic induction in 1831. [26] In the 1830s, Georg Ohm also constructed an early electrostatic machine. The homopolar generator was first developed by Michael Faraday during his memorable experiments in 1831. It was the beginning of modern dynamos – that is, electrical generators which operate using a magnetic field. The invention of the industrial generator in 1866 by Werner von Siemens – which did not need external magnetic power – made a large series of other inventions possible.
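Ohm's 1827 result is now taught as Ohm's law, V = I·R. A minimal sketch of the relationship (the function name and the example values are illustrative, not from the article):

```python
def voltage(current_amps: float, resistance_ohms: float) -> float:
    """Ohm's law: the potential difference across a conductor is V = I * R."""
    return current_amps * resistance_ohms

# A 2 A current through a 5-ohm conductor drops 10 V across it.
print(voltage(2.0, 5.0))  # 10.0
```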
In 1873, James Clerk Maxwell published a unified treatment of electricity and magnetism in A Treatise on Electricity and Magnetism which stimulated several theorists to think in terms of fields described by Maxwell's equations. In 1878, the British inventor James Wimshurst developed an apparatus that had two glass disks mounted on two shafts. It was not until 1883 that the Wimshurst machine was more fully reported to the scientific community.
During the latter part of the 1800s, the study of electricity was largely considered to be a subfield of physics. It was not until the late 19th century that universities started to offer degrees in electrical engineering. In 1882, Darmstadt University of Technology founded the first chair and the first faculty of electrical engineering worldwide. In the same year, under Professor Charles Cross, the Massachusetts Institute of Technology began offering the first option of Electrical Engineering within a physics department. [27] In 1883, Darmstadt University of Technology and Cornell University introduced the world's first courses of study in electrical engineering and in 1885 the University College London founded the first chair of electrical engineering in the United Kingdom. The University of Missouri subsequently established the first department of electrical engineering in the United States in 1886. [28]
During this period commercial use of electricity increased dramatically. Starting in the late 1870s, cities began installing large-scale electric street lighting systems based on arc lamps. [29] After the development of a practical incandescent lamp for indoor lighting, Thomas Edison switched on the world's first public electric supply utility in 1882, using what was considered a relatively safe 110-volt direct current system to supply customers. Engineering advances in the 1880s, including the invention of the transformer, led electric utilities to adopt alternating current, until then used primarily in arc lighting systems, as a distribution standard for outdoor and indoor lighting (eventually replacing direct current for such purposes). In the US, the rivalry, primarily between the Westinghouse AC system and the Edison DC system, became known as the "war of the currents". [30]
"By the mid-1890s the four "Maxwell equations" were recognized as the foundation of one of the strongest and most successful theories in all of physics; they had taken their place as companions, even rivals, to Newton's laws of mechanics. The equations were by then also being put to practical use, most dramatically in the emerging new technology of radio communications, but also in the telegraph, telephone, and electric power industries." [31] By the end of the 19th century, figures in the progress of electrical engineering were beginning to emerge. [32]
Charles Proteus Steinmetz helped foster the development of alternating current that made possible the expansion of the electric power industry in the United States, formulating mathematical theories for engineers.
During the development of radio, many scientists and inventors contributed to radio technology and electronics. In his classic UHF experiments of 1888, Heinrich Hertz demonstrated the existence of electromagnetic waves (radio waves) leading many inventors and scientists to try to adapt them to commercial applications, such as Guglielmo Marconi (1895) and Alexander Popov (1896).
Millimetre wave communication was first investigated by Jagadish Chandra Bose during 1894–1896, when he reached an extremely high frequency of up to 60 GHz in his experiments. [33] He also introduced the use of semiconductor junctions to detect radio waves, [34] when he patented the radio crystal detector in 1901. [35] [36]
John Fleming invented the first radio tube, the diode, in 1904.
Reginald Fessenden recognized that a continuous wave needed to be generated to make speech transmission possible, and by the end of 1906 he sent the first radio broadcast of voice. Also in 1906, Robert von Lieben and Lee De Forest independently developed the amplifier tube, called the triode. [37] Edwin Howard Armstrong developed enabling technology for electronic television in 1931. [38]
In the early 1920s, there was a growing interest in the development of domestic applications for electricity. [39] Public interest led to exhibitions featuring "homes of the future", and in the UK the Electrical Association for Women was established in 1924, with Caroline Haslett as its director, to encourage women to become involved in electrical engineering. [40]
The Second World War saw tremendous advances in the field of electronics, especially in radar, with the invention of the magnetron by Randall and Boot at the University of Birmingham in 1940. Radio location, radio communication and radio guidance of aircraft were all developed at this time. An early electronic computing device, Colossus, was built by Tommy Flowers of the GPO to decipher the coded messages of the German Lorenz cipher machine. Also developed at this time were advanced clandestine radio transmitters and receivers for use by secret agents.
An American invention at the time was a device to scramble the telephone calls between Winston Churchill and Franklin D. Roosevelt. This was called the Green Hornet system and worked by inserting noise into the signal. The noise was then extracted at the receiving end. This system was never broken by the Germans.
A great amount of work was undertaken in the United States as part of the War Training Program in the areas of radio direction finding, pulsed linear networks, frequency modulation, vacuum tube circuits, transmission line theory and fundamentals of electromagnetic engineering. These studies were published shortly after the war in what became known as the 'Radio Communication Series' published by McGraw-Hill in 1946.
In 1941 Konrad Zuse presented the Z3, the world's first fully functional and programmable computer. [41]
Prior to the Second World War, the subject was commonly known as 'radio engineering' and was primarily restricted to aspects of communications and radar, commercial radio and early television. At this time, the study of radio engineering at universities could only be undertaken as part of a physics degree.
Later, in the post-war years, as consumer devices began to be developed, the field broadened to include modern television, audio systems, hi-fi and, latterly, computers and microprocessors. In 1946 the ENIAC (Electronic Numerical Integrator and Computer) of John Presper Eckert and John Mauchly followed, beginning the computing era. The arithmetic performance of these machines allowed engineers to develop completely new technologies and achieve new objectives, including the Apollo program and the NASA Moon landings. [42]
In the mid-to-late 1950s, the term radio engineering gradually gave way to the name electronics engineering, which then became a stand-alone university degree subject, usually taught alongside electrical engineering with which it had become associated due to some similarities.
The first working transistor was a point-contact transistor invented by John Bardeen and Walter Houser Brattain while working under William Shockley at the Bell Telephone Laboratories (BTL) in 1947. [43] They then invented the bipolar junction transistor in 1948. [44] While early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, [45] they opened the door for more compact devices. [46]
The first integrated circuits were the hybrid integrated circuit invented by Jack Kilby at Texas Instruments in 1958 and the monolithic integrated circuit chip invented by Robert Noyce at Fairchild Semiconductor in 1959. [47]
In 1955, Carl Frosch and Lincoln Derick accidentally grew a layer of silicon dioxide over a silicon wafer and observed surface passivation effects. [48] By 1957 Frosch and Derick, using masking and predeposition, had published their manufactured silicon dioxide planar transistors, the first field-effect transistors in which the drain and source were adjacent at the same surface. [49] They showed that silicon dioxide insulated and protected silicon wafers and prevented dopants from diffusing into the wafer. [48] [50]
Following this research, Mohamed Atalla and Dawon Kahng proposed a silicon MOS transistor in 1959 [51] and successfully demonstrated a working MOS device with their Bell Labs team in 1960. [52] [53] Their team included E. E. LaBate and E. I. Povilonis who fabricated the device; M. O. Thurston, L. A. D’Asaro, and J. R. Ligenza who developed the diffusion processes, and H. K. Gummel and R. Lindner who characterized the device. [54] [55] This was a culmination of decades of field-effect research that began with Lilienfeld.
The MOSFET was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses. [45] It revolutionized the electronics industry, [56] [57] becoming the most widely used electronic device in the world. [58] [59] [60]
The MOSFET made it possible to build high-density integrated circuit chips. [58] The earliest experimental MOS IC chip to be fabricated was built by Fred Heiman and Steven Hofstein at RCA Laboratories in 1962. [61] MOS technology enabled Moore's law, the doubling of transistors on an IC chip every two years, predicted by Gordon Moore in 1965. [62] Silicon-gate MOS technology was developed by Federico Faggin at Fairchild in 1968. [63] Since then, the MOSFET has been the basic building block of modern electronics. [64] [65] [66] The mass-production of silicon MOSFETs and MOS integrated circuit chips, along with continuous MOSFET scaling miniaturization at an exponential pace (as predicted by Moore's law), has since led to revolutionary changes in technology, economy, culture and thinking. [67]
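The exponential scaling described by Moore's law can be sketched as a quick calculation. The starting figure of 2,300 transistors (the Intel 4004's count) and the function name are illustrative assumptions:

```python
def transistor_count(initial: int, years: float, doubling_period: float = 2.0) -> int:
    """Moore's law: the count doubles every `doubling_period` years."""
    return int(initial * 2 ** (years / doubling_period))

# Starting from 2,300 transistors, ten years of doubling every
# two years gives a 32x increase.
print(transistor_count(2300, 10))  # 73600
```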
The Apollo program which culminated in landing astronauts on the Moon with Apollo 11 in 1969 was enabled by NASA's adoption of advances in semiconductor electronic technology, including MOSFETs in the Interplanetary Monitoring Platform (IMP) [68] [69] and silicon integrated circuit chips in the Apollo Guidance Computer (AGC). [70]
The development of MOS integrated circuit technology in the 1960s led to the invention of the microprocessor in the early 1970s. [71] [72] The first single-chip microprocessor was the Intel 4004, released in 1971. [71] [73] The Intel 4004 was designed and realized by Federico Faggin at Intel with his silicon-gate MOS technology, [71] along with Intel's Marcian Hoff and Stanley Mazor and Busicom's Masatoshi Shima. [74] This ignited the development of the personal computer. The 4004, a 4-bit processor, was followed in 1973 by the Intel 8080, an 8-bit processor, which made possible the building of the first personal computer, the Altair 8800. [75]
Electrical engineering is an engineering discipline concerned with the study, design, and application of equipment, devices, and systems which use electricity, electronics, and electromagnetism. It emerged as an identifiable occupation in the latter half of the 19th century after the commercialization of the electric telegraph, the telephone, and electrical power generation, distribution, and use.
Electronics is a scientific and engineering discipline that studies and applies the principles of physics to design, create, and operate devices that manipulate electrons and other electrically charged particles. Electronics is a subfield of physics and electrical engineering which uses active devices such as transistors, diodes, and integrated circuits to control and amplify the flow of electric current and to convert it from one form to another, such as from alternating current (AC) to direct current (DC) or from analog signals to digital signals.
An integrated circuit (IC), also known as a microchip, computer chip, or simply chip, is a small electronic device made up of multiple interconnected electronic components such as transistors, resistors, and capacitors. These components are etched onto a small piece of semiconductor material, usually silicon. Integrated circuits are used in a wide range of electronic devices, including computers, smartphones, and televisions, to perform various functions such as processing and storing information. They have greatly impacted the field of electronics by enabling device miniaturization and enhanced functionality.
A transistor is a semiconductor device used to amplify or switch electrical signals and power. It is one of the basic building blocks of modern electronics. It is composed of semiconductor material, usually with at least three terminals for connection to an electronic circuit. A voltage or current applied to one pair of the transistor's terminals controls the current through another pair of terminals. Because the controlled (output) power can be higher than the controlling (input) power, a transistor can amplify a signal. Some transistors are packaged individually, but many more in miniature form are found embedded in integrated circuits. Because transistors are the key active components in practically all modern electronics, many people consider them one of the 20th century's greatest inventions.
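The way a small input controls a larger output can be sketched numerically for a bipolar transistor; the current gain beta = 100 is an assumed, typical value, not a figure from the article:

```python
def collector_current_ua(base_current_ua: float, beta: float = 100.0) -> float:
    """Bipolar transistor in the active region: I_C ≈ beta * I_B (both in µA).
    beta = 100 is an assumed, typical current gain for illustration."""
    return beta * base_current_ua

# A 50 µA base current controls a 5000 µA (5 mA) collector current,
# so the small input signal is amplified a hundredfold.
print(collector_current_ua(50.0))  # 5000.0
```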
Very-large-scale integration (VLSI) is the process of creating an integrated circuit (IC) by combining millions or billions of MOS transistors onto a single chip. VLSI began in the 1970s when MOS integrated circuit chips were developed and then widely adopted, enabling complex semiconductor and telecommunications technologies. The microprocessor and memory chips are VLSI devices.
Digital electronics is a field of electronics involving the study of digital signals and the engineering of devices that use or produce them. This is in contrast to analog electronics, which works primarily with analog signals. Despite the name, digital electronics design includes important analog design considerations.
In electronics, the metal–oxide–semiconductor field-effect transistor is a type of field-effect transistor (FET), most commonly fabricated by the controlled oxidation of silicon. It has an insulated gate, the voltage of which determines the conductivity of the device. This ability to change conductivity with the amount of applied voltage can be used for amplifying or switching electronic signals. The term metal–insulator–semiconductor field-effect transistor (MISFET) is almost synonymous with MOSFET. Another near-synonym is insulated-gate field-effect transistor (IGFET).
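How the gate voltage determines conductivity can be sketched with the textbook long-channel square-law model; the threshold voltage and the transconductance parameter k below are illustrative assumptions, not device data from the article:

```python
def drain_current(v_gs: float, v_th: float, k: float = 1e-3) -> float:
    """Ideal long-channel MOSFET in saturation:
    I_D = 0 below threshold, else I_D = (k/2) * (V_GS - V_th)^2.
    k (in A/V^2) lumps carrier mobility, oxide capacitance and geometry."""
    if v_gs <= v_th:
        return 0.0  # gate below threshold: channel off (the switching regime)
    return 0.5 * k * (v_gs - v_th) ** 2

# Below the assumed 1 V threshold the device conducts nothing;
# above it, the gate voltage sets the current.
print(drain_current(0.5, 1.0))  # 0.0
print(drain_current(3.0, 1.0))  # 0.002
```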
Complementary metal–oxide–semiconductor is a type of metal–oxide–semiconductor field-effect transistor (MOSFET) fabrication process that uses complementary and symmetrical pairs of p-type and n-type MOSFETs for logic functions. CMOS technology is used for constructing integrated circuit (IC) chips, including microprocessors, microcontrollers, memory chips, and other digital logic circuits. CMOS technology is also used for analog circuits such as image sensors, data converters, RF circuits, and highly integrated transceivers for many types of communication.
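The complementary pairing can be sketched as logic: in each static CMOS gate exactly one of the p-type pull-up network and the n-type pull-down network conducts, which is why the gate (ideally) draws no steady-state current. The function names are illustrative:

```python
def cmos_inverter(a: int) -> int:
    """CMOS inverter: the pMOS pulls the output high when the input is 0,
    the nMOS pulls it low when the input is 1."""
    pmos_on = (a == 0)  # p-channel conducts on a low gate voltage
    nmos_on = (a == 1)  # n-channel conducts on a high gate voltage
    assert pmos_on != nmos_on  # complementary: exactly one device is on
    return 1 if pmos_on else 0

def cmos_nand(a: int, b: int) -> int:
    """Two-input NAND: parallel pMOS pull-up, series nMOS pull-down."""
    pull_up = (a == 0) or (b == 0)     # any low input pulls the output high
    pull_down = (a == 1) and (b == 1)  # both inputs high pull the output low
    assert pull_up != pull_down        # never both networks on at once
    return 1 if pull_up else 0

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, cmos_nand(a, b))  # NAND truth table: only (1, 1) gives 0
```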
Federico Faggin is an Italian-American physicist, engineer, inventor and entrepreneur. He is best known for designing the first commercial microprocessor, the Intel 4004. He led the 4004 (MCS-4) project and the design group during the first five years of Intel's microprocessor effort. Faggin also created, while working at Fairchild Semiconductor in 1968, the self-aligned MOS (metal-oxide-semiconductor) silicon-gate technology (SGT), which made possible MOS semiconductor memory chips, CCD image sensors, and the microprocessor. After the 4004, he led development of the Intel 8008 and 8080, using his SGT methodology for random logic chip design, which was essential to the creation of early Intel microprocessors. He was co-founder and CEO of Zilog, the first company solely dedicated to microprocessors, and led the development of the Zilog Z80 and Z8 processors. He was later the co-founder and CEO of Cygnet Technologies, and then Synaptics.
A mixed-signal integrated circuit is any integrated circuit that has both analog circuits and digital circuits on a single semiconductor die. Their usage has grown dramatically with the increased use of cell phones, telecommunications, portable electronics, and automobiles with electronics and digital sensors.
Semiconductor memory is a digital electronic semiconductor device used for digital data storage, such as computer memory. It typically refers to devices in which data is stored within metal–oxide–semiconductor (MOS) memory cells on a silicon integrated circuit memory chip. There are numerous different types using different semiconductor technologies. The two main types of random-access memory (RAM) are static RAM (SRAM), which uses several transistors per memory cell, and dynamic RAM (DRAM), which uses a transistor and a MOS capacitor per cell. Non-volatile memory uses floating-gate memory cells, which consist of a single floating-gate transistor per cell.
In semiconductor electronics fabrication technology, a self-aligned gate is a transistor manufacturing approach whereby the gate electrode of a MOSFET is used as a mask for the doping of the source and drain regions. This technique ensures that the gate is naturally and precisely aligned to the edges of the source and drain.
PMOS or pMOS logic is a family of digital circuits based on p-channel, enhancement mode metal–oxide–semiconductor field-effect transistors (MOSFETs). In the late 1960s and early 1970s, PMOS logic was the dominant semiconductor technology for large-scale integrated circuits before being superseded by NMOS and CMOS devices.
A transistor is a semiconductor device with at least three terminals for connection to an electric circuit. In the common case, the third terminal controls the flow of current between the other two terminals. This can be used for amplification, as in the case of a radio receiver, or for rapid switching, as in the case of digital circuits. The transistor replaced the vacuum-tube triode, also called a (thermionic) valve, which was much larger in size and used significantly more power to operate. The first transistor was successfully demonstrated on December 23, 1947, at Bell Laboratories in Murray Hill, New Jersey. Bell Labs was the research arm of American Telephone and Telegraph (AT&T). The three individuals credited with the invention of the transistor were William Shockley, John Bardeen and Walter Brattain. The introduction of the transistor is often considered one of the most important inventions in history.
Automotive electronics are electronic systems used in vehicles, including engine management, ignition, radio, carputers, telematics, in-car entertainment systems, and others. Ignition, engine and transmission electronics are also found in trucks, motorcycles, off-road vehicles, and other internal combustion powered machinery such as forklifts, tractors and excavators. Related elements for control of relevant electrical systems are also found on hybrid vehicles and electric cars.
Electromechanics combines processes and procedures drawn from electrical engineering and mechanical engineering. Electromechanics focuses on the interaction of electrical and mechanical systems as a whole and how the two systems interact with each other. This process is especially prominent in systems such as those of DC or AC rotating electrical machines which can be designed and operated to generate power from a mechanical process (generator) or used to power a mechanical effect (motor). Electrical engineering in this context also encompasses electronics engineering.
This article details the history of electronics engineering. Chambers Twentieth Century Dictionary (1972) defines electronics as "The science and technology of the conduction of electricity in a vacuum, a gas, or a semiconductor, and devices based thereon".
Dawon Kahng was a Korean-American electrical engineer and inventor, known for his work in solid-state electronics. He is best known for inventing the MOSFET, along with his colleague Mohamed Atalla, in 1959. Kahng and Atalla developed both the PMOS and NMOS processes for MOSFET semiconductor device fabrication. The MOSFET is the most widely used type of transistor, and the basic element in most modern electronic equipment.
Mohamed M. Atalla was an Egyptian-American engineer, physicist, cryptographer, inventor and entrepreneur. He was a semiconductor pioneer who made important contributions to modern electronics. He is best known for inventing, along with his colleague Dawon Kahng, the MOSFET in 1959, which along with Atalla's earlier surface passivation processes, had a significant impact on the development of the electronics industry. He is also known as the founder of the data security company Atalla Corporation, founded in 1972. He received the Stuart Ballantine Medal and was inducted into the National Inventors Hall of Fame for his important contributions to semiconductor technology as well as data security.
The metal–oxide–semiconductor field-effect transistor (MOSFET) is the most commonly used active device in the very large-scale integration of digital integrated circuits (VLSI). During the 1970s these components revolutionized electronic signal processing, control systems and computers.