Information Age

Rings of time on a tree log marked to show some important dates of the Information Age (Digital Revolution) from 1968 to 2017

The Information Age (also known as the Computer Age, Digital Age, or New Media Age) is a historical period that began in the mid-20th century, characterized by a rapid epochal shift from the traditional industry established by the Industrial Revolution to an economy primarily based upon information technology. [1] [2] [3] [4] The onset of the Information Age can be associated with the development of transistor technology, [4] particularly the MOSFET (metal-oxide-semiconductor field-effect transistor), [5] [6] which became the fundamental building block of digital electronics [5] [6] and revolutionized modern technology. [4] [7]


According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on computer microminiaturization advances, [8] which, upon broader usage within society, would lead to modernized information and communication processes becoming the driving force of social evolution. [2]

Overview of early developments

Library expansion and Moore's law

In 1945, Fremont Rider calculated that library capacity would double every 16 years, were sufficient space made available. [9] He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons and other institutions.

Rider did not foresee, however, the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media, whereby vast increases in the rapidity of information growth would be made possible through automated, potentially lossless digital technologies. Accordingly, Moore's law, formulated around 1965, observed that the number of transistors in a dense integrated circuit doubles approximately every two years. [10] [11]
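
Moore's law is, in effect, a statement of exponential growth with a fixed doubling period. A minimal sketch of that arithmetic follows; the starting count of 2,000 transistors in 1970 and the exact two-year doubling period are illustrative round numbers, not data from the cited sources.

```python
# Illustrative sketch of Moore's law: a quantity that doubles every two years.
# The starting count and years are round illustrative numbers, not historical data.

def projected_transistors(start_count: float, start_year: int, year: int,
                          doubling_years: float = 2.0) -> float:
    """Project a transistor count forward assuming a fixed doubling period."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

for y in (1970, 1980, 1990, 2000):
    print(y, f"{projected_transistors(2_000, 1970, y):,.0f}")
# A two-year doubling period implies a 2**15 (about 32,768-fold) increase over 30 years.
```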

By the early 1980s, along with improvements in computing power, the proliferation of smaller and less expensive personal computers gave increasing numbers of workers immediate access to information and the ability to share and store it. Connectivity between computers within organizations enabled employees at different levels to access greater amounts of information.

Information storage and Kryder's law

The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes (EB) in 1986 to 15.8 EB in 1993; over 54.5 EB in 2000; and to 295 (optimally compressed) EB in 2007. [12] [13] This is the informational equivalent of less than one 730-megabyte (MB) CD-ROM per person in 1986 (539 MB per person); roughly four CD-ROMs per person in 1993; twelve CD-ROMs per person in the year 2000; and almost sixty-one CD-ROMs per person in 2007. [14] It is estimated that the world's capacity to store information reached 5 zettabytes in 2014, [15] the informational equivalent of 4,500 stacks of printed books from the Earth to the Sun.
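
The CD-ROM comparison above is straightforward per-capita arithmetic. A minimal sketch of it follows, assuming approximate world-population figures (roughly 4.9, 5.5, 6.1, and 6.6 billion people in the respective years); these populations are rough estimates and not values taken from the cited study.

```python
# Per-capita storage arithmetic behind the CD-ROM comparison above.
# World-population figures are rough assumptions, not values from the cited study.

CD_ROM_BYTES = 730e6  # one 730-megabyte CD-ROM

capacity_eb = {1986: 2.6, 1993: 15.8, 2000: 54.5, 2007: 295}        # optimally compressed EB
population = {1986: 4.9e9, 1993: 5.5e9, 2000: 6.1e9, 2007: 6.6e9}   # assumed world population

for year, eb in capacity_eb.items():
    per_person = eb * 1e18 / population[year]                        # bytes per person
    print(f"{year}: {per_person / 1e6:,.0f} MB ≈ {per_person / CD_ROM_BYTES:.1f} CD-ROMs per person")
```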

The amount of digital data stored appears to be growing approximately exponentially, reminiscent of Moore's law. Kryder's law likewise holds that the amount of available storage space grows approximately exponentially. [16] [17] [18] [11]

Information transmission

The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986; 715 (optimally compressed) exabytes in 1993; 1.2 (optimally compressed) zettabytes in 2000; and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per person per day. [14]

The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986; 471 petabytes in 1993; 2.2 (optimally compressed) exabytes in 2000; and 65 (optimally compressed) exabytes in 2007, the information equivalent of 6 newspapers per person per day. [14] In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. Technology was developing so quickly that a computer costing $3000 in 1997 would cost $2000 two years later and $1000 the following year.
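
The quoted price decline corresponds to a compound annual fall of roughly 30 percent over the three years. A minimal sketch of that calculation, restating only the dollar figures given above; the cagr helper is illustrative, not from any cited source.

```python
# Compound annual rate of decline implied by the computer prices quoted above:
# $3000 in 1997, $2000 two years later (1999), and $1000 the following year (2000).

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end_value / start_value) ** (1 / years) - 1

print(f"1997-2000: {cagr(3000, 1000, 3):.1%} per year")   # about -30.7% per year
print(f"1999-2000: {cagr(2000, 1000, 1):.1%} per year")   # -50.0% in the final year
```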

Computation

The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993, to 2.9 × 10^11 MIPS in 2000, and to 6.4 × 10^12 MIPS in 2007. [14] An article featured in the journal Trends in Ecology and Evolution reports that, by now: [15]

[Digital technology] has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5x10^21 bytes per 7.2x10^9 people).
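
The comparison in the quoted passage reduces to a few order-of-magnitude ratios. A minimal sketch of that arithmetic, using only the figures stated in the quote:

```python
# Order-of-magnitude comparison from the quoted passage: humanity's combined
# general-purpose computing and storage versus rough estimates for a single human brain.

brain_ops_low, brain_ops_high = 1e15, 1e17     # synaptic operations per second (estimated range)
world_ops_2007 = 1e18                          # instructions per second, all computers, 2007
brain_storage_bytes = 1e12                     # estimated storage of one brain
world_storage_bytes, world_population = 5e21, 7.2e9

print(f"Computing: {world_ops_2007 / brain_ops_high:.0f}x to "
      f"{world_ops_2007 / brain_ops_low:.0f}x one brain's estimated operation rate")
print(f"Storage per person: {world_storage_bytes / world_population:.1e} bytes "
      f"(single-brain estimate: {brain_storage_bytes:.0e} bytes)")
```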

Three-stage concept

The Information Age can be divided into the Primary, Secondary, and Tertiary Information Ages. Information in the Primary Information Age was handled by newspapers, radio, and television. The Secondary Information Age was developed by the Internet, satellite television, and mobile phones. The Tertiary Information Age emerged when media of the Primary Information Age became interconnected with media of the Secondary Information Age; this is what we experience today. [19]

Three stages of the Information Age

Economics

Eventually, information and communication technology (ICT)—i.e. computers, computerized machinery, fiber optics, communication satellites, the Internet, and other ICT tools—became a significant part of the world economy, as the development of microcomputers greatly changed many businesses and industries. [20] [21] Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital, in which he discusses the similarities and differences between products made of atoms and products made of bits. [22] In essence, a copy of a product made of bits can be made cheaply and quickly, then expediently shipped across the country or the world at very low cost.

Jobs and income distribution

The Information Age has affected the workforce in several ways, such as compelling workers to compete in a global job market. One of the most evident concerns is the replacement of human labor by computers that can do their jobs faster and more effectively, thus creating a situation in which individuals who perform tasks that can easily be automated are forced to find employment where their labor is not as disposable. [23] This especially creates issues for those in industrial cities, where solutions typically involve lowering working time, which is often highly resisted. Thus, individuals who lose their jobs may be pressed to join the ranks of "mind workers" (e.g. engineers, doctors, lawyers, teachers, professors, scientists, executives, journalists, consultants), who are able to compete successfully in the world market and receive (relatively) high wages. [24]

Along with automation, jobs traditionally associated with the middle class (e.g. assembly line, data processing, management, and supervision) have also begun to disappear as a result of outsourcing. [25] Unable to compete with those in developing countries, production and service workers in post-industrial (i.e. developed) societies either lose their jobs through outsourcing, accept wage cuts, or settle for low-skill, low-wage service jobs. [25] In the past, the economic fate of individuals was tied to that of their nation. For example, workers in the United States were once well paid in comparison to those in other countries. With the advent of the Information Age and improvements in communication, this is no longer the case, as workers must now compete in a global job market, whereby wages are less dependent on the success or failure of individual economies. [25]

In creating a globalized workforce, the internet has also allowed for increased opportunity in developing countries, making it possible for workers in such places to provide in-person services and therefore compete directly with their counterparts in other nations. This competitive advantage translates into increased opportunities and higher wages. [26]

Automation, productivity, and job gain

The Information Age has affected the workforce in that automation and computerisation have resulted in higher productivity coupled with net job loss in manufacturing. In the United States, for example, from January 1972 to August 2010, the number of people employed in manufacturing jobs fell from 17,500,000 to 11,500,000 while manufacturing value rose 270%. [27]
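
These figures imply roughly a one-third drop in manufacturing employment alongside a much larger rise in output per worker. A minimal sketch of that arithmetic follows, under the assumption that "manufacturing value rose 270%" means output grew to 3.7 times its 1972 level; that reading of the wording is an assumption, not a figure from the cited source.

```python
# Arithmetic implied by the figures above, assuming "rose 270%" means output grew
# to 3.7 times its 1972 level (an interpretation of the wording, not cited data).

employment_1972, employment_2010 = 17_500_000, 11_500_000
output_factor = 1 + 2.70                     # output relative to 1972

employment_ratio = employment_2010 / employment_1972
print(f"Employment change: {employment_ratio - 1:.1%}")                 # about -34.3%
print(f"Output per worker: about {output_factor / employment_ratio:.1f}x the 1972 level")
```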

Although it initially appeared that job loss in the industrial sector might be partially offset by the rapid growth of jobs in information technology, the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the sector. This pattern of decrease in jobs would continue until 2003, [28] and data has shown that, overall, technology creates more jobs than it destroys even in the short run. [29]

Information-intensive industry

Industry has become more information-intensive and less labor- and capital-intensive. This has had important implications for the workforce, as workers have become increasingly productive even as the value of their labor decreases. For the system of capitalism itself, not only is the value of labor decreased, but the value of capital is also diminished.

In the classical model, investments in human and financial capital are important predictors of the performance of a new venture. [30] However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale. [31]

Innovations

A visualization of the various routes through a portion of the Internet.

The Information Age was enabled by technology developed in the Digital Revolution, which itself built on the developments of the Technological Revolution.

Transistors

The onset of the Information Age can be associated with the development of transistor technology. [4] The concept of a field-effect transistor was first theorized by Julius Edgar Lilienfeld in 1925. [32] The first practical transistor was the point-contact transistor, invented by the engineers Walter Houser Brattain and John Bardeen at Bell Labs in 1947. This was a breakthrough that laid the foundations for modern technology. [4] William Shockley's research team at Bell Labs also invented the bipolar junction transistor in 1952. [33] [32] However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications. [34]

The beginning of the Information Age, along with the Silicon Age, has been dated back to the invention of the metal–oxide–semiconductor field-effect transistor (MOSFET; or MOS transistor), [35] which was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959. [6] [33] [36] The MOSFET was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses. [34] With its high scalability, [37] and much lower power consumption and higher density than bipolar junction transistors, [38] the MOSFET made it possible to build high-density integrated circuits (ICs), [33] allowing the integration of more than 10,000 transistors in a small IC, [39] and later billions of transistors in a single device. [40]

The widespread adoption of MOSFETs revolutionized the electronics industry, [41] transforming fields such as control systems and computers from the 1970s onward. [42] The MOSFET revolutionized the world during the Information Age, with its high density enabling a computer to exist on a few small IC chips rather than filling a room, [7] and later making possible digital communications technology, such as smartphones. [40] As of 2013, billions of MOS transistors are manufactured every day. [33] The MOS transistor has been the fundamental building block of digital electronics since the late 20th century, paving the way for the digital age. [6] The MOS transistor is credited with transforming society around the world, [40] [6] and has been described as the "workhorse" of the Information Age, [5] as the basis for every microprocessor, memory chip, and telecommunication circuit in use as of 2016. [43]

Computers

Before the advent of electronics, mechanical computers, like the Analytical Engine designed in 1837, were intended to provide routine mathematical calculation and simple decision-making capabilities. Military needs during World War II drove development of the first programmable digital computers, including the relay-based Z3 and the vacuum-tube-based Atanasoff–Berry Computer, Colossus, and ENIAC.

The invention of the transistor enabled the era of mainframe computers (1950s–1970s), typified by the IBM 360. These large, room-sized computers provided data calculation and manipulation that was much faster than humanly possible, but were expensive to buy and maintain, so were initially limited to a few scientific institutions, large corporations, and government agencies.

The germanium integrated circuit (IC) was invented by Jack Kilby at Texas Instruments in 1958. [44] The silicon integrated circuit was then invented in 1959 by Robert Noyce at Fairchild Semiconductor, using the planar process developed by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface passivation method developed at Bell Labs in 1957. [45] [46] Following the invention of the MOS transistor by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959, [36] the MOS integrated circuit was developed by Fred Heiman and Steven Hofstein at RCA in 1962. [47] The silicon-gate MOS IC was later developed by Federico Faggin at Fairchild Semiconductor in 1968. [48] With the advent of the MOS transistor and the MOS IC, transistor technology rapidly improved, and the ratio of computing power to size increased dramatically, giving direct access to computers to ever smaller groups of people.

The MOS integrated circuit led to the invention of the microprocessor. The first commercial single-chip microprocessor, the Intel 4004, was launched in 1971; it was developed by Federico Faggin, using his silicon-gate MOS IC technology, together with Marcian Hoff, Masatoshi Shima, and Stan Mazor. [49] [50]

Along with electronic arcade machines and home video game consoles in the 1970s, the development of personal computers like the Commodore PET and Apple II (both in 1977) gave individuals access to the computer. But data sharing between individual computers was either non-existent or largely manual, at first using punched cards and magnetic tape, and later floppy disks.

Data

The first developments for storing data were based on photographs, starting with microphotography in 1851 and then microform in the 1920s, with the ability to store documents on film making them much more compact. Early information theory and Hamming codes were developed around 1950 but awaited technical innovations in data transmission and storage to be put to full use.
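
Hamming codes add parity bits to a block of data so that a single flipped bit can be located (its position is given by the parity-check syndrome) and corrected. A minimal sketch of the classic Hamming(7,4) construction follows; the function names and the example word are illustrative, not taken from any particular historical implementation.

```python
# Minimal Hamming(7,4) sketch: 4 data bits plus 3 parity bits, so any single-bit
# error can be located (the syndrome equals its 1-based position) and corrected.

def encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                      # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4                      # parity over codeword positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4                      # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]    # codeword, positions 1..7

def decode(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]         # re-check positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]         # re-check positions 2, 3, 6, 7
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]         # re-check positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s4        # 0 means no error, otherwise the error position
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1               # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]        # the original data bits

data = [1, 0, 1, 1]
sent = encode(data)
sent[4] ^= 1                               # corrupt one bit in transit
assert decode(sent) == data                # the single-bit error is corrected
```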

Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947 and An Wang at Harvard University in 1949. [51] The first commercial hard disk drive shipped in 1956. With the advent of the MOS transistor, MOS semiconductor memory was developed by John Schmidt at Fairchild Semiconductor in 1964. [52] [53] In 1967, Dawon Kahng and Simon Sze at Bell Labs developed the floating-gate MOSFET (FGMOS), which they proposed could be used for erasable programmable read-only memory (EPROM), [54] providing the basis for non-volatile memory (NVM) technologies such as flash memory. [55] Following the invention of flash memory by Fujio Masuoka at Toshiba in 1980, [56] [57] Toshiba commercialized NAND flash memory in 1987. [58] [59]

While cables transmitting digital data connected computer terminals and peripherals to mainframes were common, and special message-sharing systems leading to email were first developed in the 1960s, independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (coined in 1974), and then the World Wide Web in 1989.

Public digital data transmission first utilized existing phone lines using dial-up, starting in the 1950s, and this was the mainstay of the Internet until broadband in the 2000s. The wireless revolution, the introduction and proliferation of wireless networking, began in the 1990s and was enabled by the wide adoption of MOSFET-based RF power amplifiers (power MOSFET and LDMOS) and RF circuits (RF CMOS). [60] [61] [62] Wireless networks, combined with the proliferation of communications satellites in the 2000s, allowed for public digital transmission without the need for cables. This technology led to digital television, GPS, satellite radio, wireless Internet, and mobile phones through the 1990s to 2000s.

MOSFET scaling, the rapid miniaturization of MOSFETs at a rate predicted by Moore's law, [63] led to computers becoming smaller and more powerful, to the point where they could be carried. During the 1980s and 1990s, laptops were developed as a form of portable computer, and personal digital assistants (PDAs) could be used while standing or walking. Pagers, widely used by the 1980s, were largely replaced by mobile phones beginning in the late 1990s, providing mobile networking features to some computers. Now commonplace, this technology is extended to digital cameras and other wearable devices. Starting in the late 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing.

Discrete cosine transform (DCT) coding, a data compression technique first proposed by Nasir Ahmed in 1972, [64] enabled practical digital media transmission, [65] [66] [67] with image compression formats such as JPEG (1992), video coding formats such as H.26x (1988 onwards) and MPEG (1993 onwards), [68] audio coding standards such as Dolby Digital (1991) [69] [70] and MP3 (1994), [68] and digital TV standards such as video-on-demand (VOD) [65] and high-definition television (HDTV). [71] Internet video was popularized by YouTube, an online video platform founded by Chad Hurley, Jawed Karim and Steve Chen in 2005, which enabled the video streaming of MPEG-4 AVC (H.264) user-generated content from anywhere on the World Wide Web. [72]
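
The discrete cosine transform concentrates a typical image or audio block's energy into a few low-frequency coefficients, which is what makes the later quantization and entropy-coding stages of these formats effective. Below is a minimal NumPy sketch of the unnormalized 1-D DCT-II; real codecs such as JPEG apply a scaled 2-D version to 8×8 blocks, and the example signal here is arbitrary.

```python
import numpy as np

# Minimal sketch of the (unnormalized) DCT-II, the transform underlying JPEG, MPEG,
# and related codecs. Real codecs apply a 2-D version to 8x8 blocks and then quantize.

def dct_ii(x: np.ndarray) -> np.ndarray:
    n = np.arange(len(x))
    k = n.reshape(-1, 1)
    # X[k] = sum_n x[n] * cos(pi/N * (n + 1/2) * k)
    return np.cos(np.pi / len(x) * (n + 0.5) * k) @ x

# A smooth 8-sample block: nearly all of its energy lands in the lowest-frequency
# coefficients, so the remaining coefficients can be quantized coarsely or discarded.
block = np.cos(np.linspace(0.0, np.pi, 8))
print(np.round(dct_ii(block), 3))
```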

Electronic paper, which has origins in the 1970s, allows digital information to appear as paper documents.

Optics

Optical communication has played an important role in communication networks. [73] Optical communication provided the hardware basis for Internet technology, laying the foundations for the Digital Revolution and Information Age. [74]

In 1953, Bram van Heel demonstrated image transmission through bundles of optical fibers with a transparent cladding. The same year, Harold Hopkins and Narinder Singh Kapany at Imperial College succeeded in making image-transmitting bundles with over 10,000 optical fibers, and subsequently achieved image transmission through a 75 cm long bundle which combined several thousand fibers. [75]

While working at Tohoku University, Japanese engineer Jun-ichi Nishizawa proposed fiber-optic communication, the use of optical fibers for optical communication, in 1963. [76] Nishizawa invented other technologies that contributed to the development of optical fiber communications, such as the graded-index optical fiber as a channel for transmitting light from semiconductor lasers. [77] [78] He patented the graded-index optical fiber in 1964. [74] The solid-state optical fiber was invented by Nishizawa in 1964. [79]

The three essential elements of optical communication were invented by Jun-ichi Nishizawa: the semiconductor laser (1957) as the light source, the graded-index optical fiber (1964) as the transmission line, and the PIN photodiode (1950) as the optical receiver. [74] Izuo Hayashi's invention of the continuous wave semiconductor laser in 1970 led directly to the light sources in fiber-optic communication, laser printers, barcode readers, and optical disc drives, which were commercialized by Japanese entrepreneurs [80] and opened up the field of optical communications. [73]

Metal–oxide–semiconductor (MOS) image sensors, which first began appearing in the late 1960s, led to the transition from analog to digital imaging, and from analog to digital cameras, during the 1980s and 1990s. The most common image sensors are the charge-coupled device (CCD) sensor and the CMOS (complementary MOS) active-pixel sensor (CMOS sensor). [81] [82]


References

  1. Zimmerman, Kathy Ann (September 7, 2017). "History of Computers: A Brief Timeline". livescience.com.
  2. 1 2 "The History of Computers". thought.co.
  3. "The 4 industrial revolutions". sentryo.net. February 23, 2017.
  4. Castells, Manuel (1996). The Information Age: Economy, Society and Culture. Oxford: Blackwell. ISBN   978-0631215943. OCLC   43092627.
  5. Raymer, Michael G. (2009). The Silicon Web: Physics for the Internet Age. CRC Press. p. 365. ISBN   9781439803127.
  6. "Triumph of the MOS Transistor". YouTube . Computer History Museum. 6 August 2010. Retrieved 21 July 2019.
  7. Cressler, John D.; Mantooth, H. Alan (2017). Extreme Environment Electronics. CRC Press. p. 959. ISBN   978-1-351-83280-9. While the bipolar junction transistor was the first transistor device to take hold in the integrated circuit world, there is no question that the advent of MOSFETs, an acronym for metal-oxide-semiconductor field-effect transistor, is what truly revolutionized the world in the so-called information age. The density with which these devices can be made has allowed entire computers to exist on a few small chips rather than filling a room.
  8. Kluver, Randy. "Globalization, Informatization, and Intercultural Communication". un.org. Retrieved 18 April 2013.
  9. Rider, Fremont (1944). The Scholar and the Future of the Research Library. New York City: Hadham Press.
  10. "Moore's Law to roll on for another decade" . Retrieved 2011-11-27. Moore also affirmed he never said transistor count would double every 18 months, as is commonly said. Initially, he said transistors on a chip would double every year. He then recalibrated it to every two years in 1975. David House, an Intel executive at the time, noted that the changes would cause computer performance to double every 18 months.
  11. Roser, Max, and Hannah Ritchie. 2013. "Technological Progress." Our World in Data. Retrieved on 9 June 2020.
  12. Hilbert, M.; Lopez, P. (2011-02-10). "The World's Technological Capacity to Store, Communicate, and Compute Information". Science. 332 (6025): 60–65. doi:10.1126/science.1200970. ISSN   0036-8075. PMID   21310967. S2CID   206531385.
  13. Hilbert, Martin R. (2011). Supporting online material for the world's technological capacity to store, communicate, and compute information. Science/AAAS. OCLC   755633889.
  14. Hilbert, Martin; López, Priscila (2011). "The World's Technological Capacity to Store, Communicate, and Compute Information". Science. 332 (6025): 60–65. Bibcode:2011Sci...332...60H. doi:10.1126/science.1200970. ISSN   0036-8075. PMID   21310967. S2CID   206531385.
  15. Gillings, Michael R.; Hilbert, Martin; Kemp, Darrell J. (2016). "Information in the Biosphere: Biological and Digital Worlds". Trends in Ecology & Evolution . 31 (3): 180–189. doi:10.1016/j.tree.2015.12.013. PMID   26777788.
  16. Gantz, John, and David Reinsel. 2012. "The Digital Universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East." IDC iView. S2CID : 112313325.
  17. Rizzatti, Lauro. 14 September 2016. "Digital Data Storage is Undergoing Mind-Boggling Growth." EE Times . Archived from the original on 16 September 2016.
  18. "The historical growth of data: Why we need a faster transfer solution for large data sets." Signiant. 2020. Retrieved 9 June 2020.
  19. Iranga, Suroshana (2016). Social Media Culture. Colombo: S. Godage and Brothers. ISBN   978-9553067432.
  20. "Information Age Education Newsletter". Information Age Education. August 2008. Retrieved 4 December 2019.
  21. Moursund, David. "Information Age". IAE-Pedia. Retrieved 4 December 2019.
  22. "Negroponte's articles". Archives.obs-us.com. 1996-12-30. Retrieved 2012-06-11.
  23. Porter, Michael. "How Information Gives You Competitive Advantage". Harvard Business Review. Retrieved 9 September 2015.
  24. Geiger, Christophe (2011), "Copyright and Digital Libraries", E-Publishing and Digital Libraries, IGI Global, pp. 257–272, doi:10.4018/978-1-60960-031-0.ch013, ISBN   978-1-60960-031-0
  25. McGowan, Robert. 1991. "The Work of Nations by Robert Reich" (book review). Human Resource Management 30(4):535–38. doi : 10.1002/hrm.3930300407. ISSN   1099-050X.
  26. Bhagwati, Jagdish N. (2005). In defense of Globalization. New York: Oxford University Press.
  27. Smith, Fran. 5 Oct 2010. "Job Losses and Productivity Gains." Competitive Enterprise Institute .
  28. Cooke, Sandra D. 2003. "Information Technology Workers in the Digital Economy." In Digital Economy. Economics and Statistics Administration, Department of Commerce.
  29. Chang, Yongsung, and Jay H. Hong (2013). "Does Technology Create Jobs?". SERI Quarterly. 6 (3): 44–53. Retrieved 29 April 2014.
  30. Cooper, Arnold C.; Gimeno-Gascon, F. Javier; Woo, Carolyn Y. (1994). "Initial human and financial capital as predictors of new venture performance". Journal of Business Venturing. 9 (5): 371–395. doi:10.1016/0883-9026(94)90013-2.
  31. Carr, David (2010-10-03). "Film Version of Zuckerberg Divides the Generations". The New York Times. ISSN   0362-4331 . Retrieved 2016-12-20.
  32. Lee, Thomas H. (2003). "A Review of MOS Device Physics" (PDF). The Design of CMOS Radio-Frequency Integrated Circuits. Cambridge University Press. ISBN   9781139643771.
  33. "Who Invented the Transistor?". Computer History Museum . 4 December 2013. Retrieved 20 July 2019.
  34. Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. p. 168. ISBN   9780470508923.
  35. "100 incredible years of physics – materials science". Institute of Physics . December 2019. Retrieved 10 December 2019.
  36. 1 2 "1960 - Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine. Computer History Museum.
  37. Motoyoshi, M. (2009). "Through-Silicon Via (TSV)" (PDF). Proceedings of the IEEE. 97 (1): 43–48. doi:10.1109/JPROC.2008.2007462. ISSN   0018-9219. S2CID   29105721.
  38. "Transistors Keep Moore's Law Alive". EETimes . 12 December 2018. Retrieved 18 July 2019.
  39. Hittinger, William C. (1973). "Metal-Oxide-Semiconductor Technology". Scientific American. 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN   0036-8733. JSTOR   24923169.
  40. 1 2 3 "Remarks by Director Iancu at the 2019 International Intellectual Property Conference". United States Patent and Trademark Office . June 10, 2019. Archived from the original on 17 December 2019. Retrieved 20 July 2019.
  41. Chan, Yi-Jen (1992). Studies of InAIAs/InGaAs and GaInP/GaAs heterostructure FET's for high speed applications. University of Michigan. p. 1. The Si MOSFET has revolutionized the electronics industry and as a result impacts our daily lives in almost every conceivable way.
  42. Grant, Duncan Andrew; Gowar, John (1989). Power MOSFETS: theory and applications. Wiley. p. 1. ISBN   9780471828679. The metal-oxide-semiconductor field-effect transistor (MOSFET) is the most commonly used active device in the very large-scale integration of digital integrated circuits (VLSI). During the 1970s these components revolutionized electronic signal processing, control systems and computers.
  43. Colinge, Jean-Pierre; Greer, James C. (2016). Nanowire Transistors: Physics of Devices and Materials in One Dimension. Cambridge University Press. p. 2. ISBN   9781107052406.
  44. Kilby, Jack (2000), Nobel lecture (PDF), Stockholm: Nobel Foundation, retrieved 15 May 2008
  45. Lojek, Bo (2007). History of Semiconductor Engineering. Springer Science & Business Media. p. 120. ISBN   9783540342588.
  46. Bassett, Ross Knox (2007). To the Digital Age: Research Labs, Start-up Companies, and the Rise of MOS Technology. Johns Hopkins University Press. p. 46. ISBN   9780801886393.
  47. "Tortoise of Transistors Wins the Race - CHM Revolution". Computer History Museum . Retrieved 22 July 2019.
  48. "1968: Silicon Gate Technology Developed for ICs". Computer History Museum . Retrieved 22 July 2019.
  49. "1971: Microprocessor Integrates CPU Function onto a Single Chip". Computer History Museum . Retrieved 22 July 2019.
  50. Colinge, Jean-Pierre; Greer, James C.; Greer, Jim (2016). Nanowire Transistors: Physics of Devices and Materials in One Dimension. Cambridge University Press. p. 2. ISBN   9781107052406.
  51. "1953: Whirlwind computer debuts core memory". Computer History Museum . Retrieved 31 July 2019.
  52. "1970: MOS Dynamic RAM Competes with Magnetic Core Memory on Price". Computer History Museum . Retrieved 29 July 2019.
  53. Solid State Design - Vol. 6. Horizon House. 1965.
  54. "1971: Reusable semiconductor ROM introduced". Computer History Museum . Retrieved 19 June 2019.
  55. Bez, R.; Pirovano, A. (2019). Advances in Non-Volatile Memory and Storage Technology. Woodhead Publishing. ISBN   9780081025857.
  56. Fulford, Benjamin (24 June 2002). "Unsung hero". Forbes. Archived from the original on 3 March 2008. Retrieved 18 March 2008.
  57. US 4531203 Fujio Masuoka
  58. "1987: Toshiba Launches NAND Flash". eWeek . April 11, 2012. Retrieved 20 June 2019.
  59. "1971: Reusable semiconductor ROM introduced". Computer History Museum . Retrieved 19 June 2019.
  60. Golio, Mike; Golio, Janet (2018). RF and Microwave Passive and Active Technologies. CRC Press. pp. ix, I-1, 18–2. ISBN   9781420006728.
  61. Rappaport, T. S. (November 1991). "The wireless revolution". IEEE Communications Magazine. 29 (11): 52–71. doi:10.1109/35.109666. S2CID   46573735.
  62. "The wireless revolution". The Economist . January 21, 1999. Retrieved 12 September 2019.
  63. Sahay, Shubham; Kumar, Mamidala Jagadesh (2019). Junctionless Field-Effect Transistors: Design, Modeling, and Simulation. John Wiley & Sons. ISBN   9781119523536.
  64. Ahmed, Nasir (January 1991). "How I Came Up With the Discrete Cosine Transform". Digital Signal Processing . 1 (1): 4–5. doi:10.1016/1051-2004(91)90086-Z.
  65. Lea, William (1994). Video on demand: Research Paper 94/68. House of Commons Library, 9 May 1994. Archived from the original on 20 September 2019. Retrieved 20 September 2019.
  66. Frolov, Artem; Primechaev, S. (2006). "Compressed Domain Image Retrievals Based On DCT-Processing". Semantic Scholar . S2CID   4553 . Retrieved 18 October 2019.
  67. Lee, Ruby Bei-Loh; Beck, John P.; Lamb, Joel; Severson, Kenneth E. (April 1995). "Real-time software MPEG video decoder on multimedia-enhanced PA 7100LC processors" (PDF). Hewlett-Packard Journal . 46 (2). ISSN   0018-1153.
  68. Stanković, Radomir S.; Astola, Jaakko T. (2012). "Reminiscences of the Early Work in DCT: Interview with K.R. Rao" (PDF). Reprints from the Early Days of Information Sciences. 60. Retrieved 13 October 2019.
  69. Luo, Fa-Long (2008). Mobile Multimedia Broadcasting Standards: Technology and Practice. Springer Science & Business Media. p. 590. ISBN   9780387782638.
  70. Britanak, V. (2011). "On Properties, Relations, and Simplified Implementation of Filter Banks in the Dolby Digital (Plus) AC-3 Audio Coding Standards". IEEE Transactions on Audio, Speech, and Language Processing. 19 (5): 1231–1241. doi:10.1109/TASL.2010.2087755. S2CID   897622.
  71. Shishikui, Yoshiaki; Nakanishi, Hiroshi; Imaizumi, Hiroyuki (October 26–28, 1993). "An HDTV Coding Scheme using Adaptive-Dimension DCT". Signal Processing of HDTV: Proceedings of the International Workshop on HDTV '93, Ottawa, Canada. Elsevier: 611–618. doi:10.1016/B978-0-444-81844-7.50072-3. ISBN   9781483298511.
  72. Matthew, Crick (2016). Power, Surveillance, and Culture in YouTube™'s Digital Sphere. IGI Global. pp. 36–7. ISBN   9781466698567.
  73. S. Millman (1983), A History of Engineering and Science in the Bell System, page 10. Archived 2017-10-26 at the Wayback Machine. AT&T Bell Laboratories.
  74. The Third Industrial Revolution Occurred in Sendai, Soh-VEHE International Patent Office, Japan Patent Attorneys Association
  75. Hecht, Jeff (2004). City of Light: The Story of Fiber Optics (revised ed.). Oxford University. pp. 55–70. ISBN   9780195162554.
  76. Nishizawa, Jun-ichi & Suto, Ken (2004). "Terahertz wave generation and light amplification using Raman effect". In Bhat, K. N. & DasGupta, Amitava (eds.). Physics of semiconductor devices. New Delhi, India: Narosa Publishing House. p. 27. ISBN   978-81-7319-567-9.
  77. "Optical Fiber". Sendai New. Archived from the original on September 29, 2009. Retrieved April 5, 2009.
  78. "New Medal Honors Japanese Microelectrics Industry Leader". Institute of Electrical and Electronics Engineers.
  79. Semiconductor Technologies, page 338, Ohmsha, 1982
  80. Johnstone, Bob (2000). We were burning : Japanese entrepreneurs and the forging of the electronic age. New York: BasicBooks. p. 252. ISBN   9780465091188.
  81. Williams, J. B. (2017). The Electronics Revolution: Inventing the Future. Springer. pp. 245–8. ISBN   9783319490885.
  82. Fossum, Eric R. (12 July 1993). Blouke, Morley M. (ed.). "Active pixel sensors: are CCDs dinosaurs?". SPIE Proceedings Vol. 1900: Charge-Coupled Devices and Solid State Optical Sensors III. International Society for Optics and Photonics. 1900: 2–14. Bibcode:1993SPIE.1900....2F. CiteSeerX   10.1.1.408.6558 . doi:10.1117/12.148585. S2CID   10556755.
  83. "Newspapers News and News Archive Resources: Computer and Technology Sources". Temple University. Retrieved 9 September 2015.

Further reading