Information technology (IT) is a set of related fields that encompass computer systems, software, programming languages, and the processing and storage of data and information. [1] IT forms part of information and communications technology (ICT). [2] An information technology system (IT system) is generally an information system, a communications system, or, more specifically, a computer system (including all hardware, software, and peripheral equipment) operated by a limited group of IT users, and an IT project usually refers to the commissioning and implementation of an IT system. [3] IT systems play a vital role in facilitating efficient data management, enhancing communication networks, and supporting organizational processes across various industries. Successful IT projects require meticulous planning, seamless integration, and ongoing maintenance to ensure optimal functionality and alignment with organizational objectives. [4]
Although humans have been storing, retrieving, manipulating, and communicating information since the earliest writing systems were developed, [5] the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler commented that "the new technology does not yet have a single established name. We shall call it information technology (IT)." [6] Their definition consists of three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs. [6]
The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several products or services within an economy are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, and e-commerce. [7] [a]
Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical (3000 BC to 1450 AD), mechanical (1450 to 1840), electromechanical (1840 to 1940), and electronic (1940 to present). [5]
Information technology is a branch of computer science, defined as the study of procedures, structures, and the processing of various types of data. As this field continues to evolve globally, its priority and importance have grown, leading to the introduction of computer science-related courses in K-12 education.
Ideas of computer science were first discussed before the 1950s at the Massachusetts Institute of Technology (MIT) and Harvard University, where researchers began thinking about computer circuits and numerical calculations. As time went on, the fields of information technology and computer science grew more complex and became able to handle the processing of more data. Scholarly articles on the subject began to be published by different organizations. [9]
In early computing, Alan Turing, J. Presper Eckert, and John Mauchly were considered some of the major pioneers of computer technology in the mid-1900s. Most of their efforts were focused on designing the first digital computer. Alongside this work, topics such as artificial intelligence began to be raised, as Turing started to question what the technology of the period might be capable of. [10]
Devices have been used to aid computation for thousands of years, probably initially in the form of a tally stick. [11] The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered the earliest known mechanical analog computer, and the earliest known geared mechanism. [12] Comparable geared devices did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed. [13]
Electronic computers, using either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world's first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. During the Second World War, Colossus, the first electronic digital computer, was developed to decrypt German messages. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring. [14] The first recognizably modern electronic digital stored-program computer was the Manchester Baby, which ran its first program on 21 June 1948. [15]
The development of transistors in the late 1940s at Bell Laboratories allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and had a power consumption of 25 kilowatts. By comparison, the first transistorized computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version. [16]
Several other breakthroughs in semiconductor technology include silicon dioxide surface passivation by Carl Frosch and Lincoln Derick in 1955, [17] the first planar silicon dioxide transistors by Frosch and Derick in 1957, [18] the integrated circuit (IC) invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor in 1959, the MOSFET demonstration by a Bell Labs team, [19] [20] [21] [22] the planar process by Jean Hoerni in 1959, [23] [24] [25] and the microprocessor invented by Ted Hoff, Federico Faggin, Masatoshi Shima, and Stanley Mazor at Intel in 1971. These important inventions led to the development of the personal computer (PC) in the 1970s, and the emergence of information and communications technology (ICT). [26]
By 1984, according to the National Westminster Bank Quarterly Review, the term information technology had been redefined: "The development of cable television was made possible by the convergence of telecommunications and computing technology (…generally known in Britain as information technology)." The term then began to appear in documents of the International Organization for Standardization (ISO) in 1990. [27]
By the twenty-first century, innovations in technology had already revolutionized the world as people gained access to a range of online services. This changed the workforce drastically: thirty percent of U.S. workers were already employed in careers in this profession, and 136.9 million people, equivalent to 51 million households, were personally connected to the Internet. [28] Along with the Internet, new types of technology were being introduced across the globe, improving efficiency and making tasks easier.
As technology revolutionized society, millions of processes could be completed in seconds. Innovations in communication were also crucial as people began to rely on the computer to communicate through telephone lines and cable. The introduction of email was considered revolutionary, as "companies in one part of the world could communicate by e-mail with suppliers and buyers in another part of the world..." [29]
Beyond personal use, computers and technology have also revolutionized the marketing industry, resulting in more buyers of products. In 2002, Americans spent more than $28 billion on goods over the Internet alone, while e-commerce a decade later resulted in $289 billion in sales. [29] As computers rapidly become more sophisticated by the day, they are used more widely, and people have grown increasingly reliant on them during the twenty-first century.
Early electronic computers such as Colossus made use of punched tape, a long strip of paper on which data was represented by a series of holes, a technology now obsolete. [30] Electronic data storage, which is used in modern computers, dates from World War II, when a form of delay-line memory was developed to remove the clutter from radar signals, the first practical application of which was the mercury delay line. [31] The first random-access digital storage device was the Williams tube, which was based on a standard cathode ray tube. [32] However, the information stored in it and in delay-line memory was volatile in that it had to be continuously refreshed, and so was lost once power was removed. The earliest form of non-volatile computer storage was the magnetic drum, invented in 1932 [33] and used in the Ferranti Mark 1, the world's first commercially available general-purpose electronic computer. [34]
IBM introduced the first hard disk drive in 1956, as a component of their 305 RAMAC computer system. [35] : 6 Most digital data today is still stored magnetically on hard disks, or optically on media such as CD-ROMs. [36] : 4–5 Until 2002 most information was stored on analog devices, but that year digital storage capacity exceeded analog for the first time. As of 2007, almost 94% of the data stored worldwide was held digitally: [37] 52% on hard disks, 28% on optical devices, and 11% on digital magnetic tape. It has been estimated that the worldwide capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295 exabytes in 2007, [38] doubling roughly every 3 years. [39]
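The doubling period implied by those figures can be checked with simple arithmetic; the following minimal sketch (in Python) uses only the numbers cited above.

```python
import math

# Worldwide electronic storage capacity figures cited above.
capacity_1986_eb = 3     # exabytes in 1986 (stated as "less than 3")
capacity_2007_eb = 295   # exabytes in 2007
years = 2007 - 1986      # 21 years

# Number of doublings needed to grow from 3 EB to 295 EB.
doublings = math.log2(capacity_2007_eb / capacity_1986_eb)  # about 6.6

# Implied doubling period, consistent with "roughly every 3 years".
doubling_period_years = years / doublings                   # about 3.2
print(f"{doublings:.1f} doublings, one roughly every {doubling_period_years:.1f} years")
```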
Database Management Systems (DMS) emerged in the 1960s to address the problem of storing and retrieving large amounts of data accurately and quickly. An early such system was IBM's Information Management System (IMS), [40] which is still widely deployed more than 50 years later. [41] IMS stores data hierarchically, [40] but in the 1970s Ted Codd proposed an alternative relational storage model based on set theory and predicate logic and the familiar concepts of tables, rows, and columns. In 1981, the first commercially available relational database management system (RDBMS) was released by Oracle. [42]
All DMS consist of components; they allow the data they store to be accessed simultaneously by many users while maintaining its integrity. [43] All databases have one point in common: the structure of the data they contain is defined and stored separately from the data itself, in a database schema. [40]
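A minimal sketch of these ideas, using Python's built-in sqlite3 module, shows a schema (tables and columns) being declared separately from the rows later stored in it; the table name and data below are illustrative only, not drawn from the cited sources.

```python
import sqlite3

# The schema is defined separately from the data it will hold.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employee (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        dept TEXT NOT NULL
    )
""")

# Data is then inserted as rows that must conform to that schema.
conn.executemany(
    "INSERT INTO employee (name, dept) VALUES (?, ?)",
    [("Ada", "Engineering"), ("Grace", "Research")],
)

# Queries are expressed in terms of tables, rows, and columns.
for (name,) in conn.execute("SELECT name FROM employee WHERE dept = ?", ("Engineering",)):
    print(name)  # -> Ada
```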
In recent years, the extensible markup language (XML) has become a popular format for data representation. Although XML data can be stored in normal file systems, it is commonly held in relational databases to take advantage of their "robust implementation verified by years of both theoretical and practical effort." [44] As an evolution of the Standard Generalized Markup Language (SGML), XML's text-based structure offers the advantage of being both machine- and human-readable. [45]
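The dual readability can be illustrated with a short sketch using Python's standard xml.etree.ElementTree module: the same text that a person can read is parsed back into a tree by a program. The element names and values are invented for the example.

```python
import xml.etree.ElementTree as ET

# A human-readable, text-based XML document (contents are illustrative).
document = """
<order id="1042">
    <customer>Example Ltd</customer>
    <item sku="A-17" quantity="3"/>
</order>
"""

# The same text is machine-readable: a parser recovers its tree structure.
root = ET.fromstring(document)
print(root.tag, root.attrib["id"])           # -> order 1042
print(root.find("customer").text)            # -> Example Ltd
print(root.find("item").attrib["quantity"])  # -> 3
```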
Data transmission has three aspects: transmission, propagation, and reception. [46] It can be broadly categorized as broadcasting, in which information is transmitted unidirectionally downstream, or telecommunications, with bidirectional upstream and downstream channels. [38]
XML has been increasingly employed as a means of data interchange since the early 2000s, [47] particularly for machine-oriented interactions such as those involved in web-oriented protocols such as SOAP, [45] describing "data-in-transit rather than... data-at-rest". [47]
Hilbert and Lopez identify the exponential pace of technological change (a kind of Moore's law): machines' application-specific capacity to compute information per capita roughly doubled every 14 months between 1986 and 2007; the per capita capacity of the world's general-purpose computers doubled every 18 months during the same two decades; the global telecommunication capacity per capita doubled every 34 months; the world's storage capacity per capita required roughly 40 months to double (every 3 years); and per capita broadcast information has doubled every 12.3 years. [38]
Massive amounts of data are stored worldwide every day, but unless it can be analyzed and presented effectively it essentially resides in what have been called data tombs: "data archives that are seldom visited". [48] To address that issue, the field of data mining — "the process of discovering interesting patterns and knowledge from large amounts of data" [49] — emerged in the late 1980s. [50]
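In its simplest form, such pattern discovery amounts to counting combinations that recur across many records. The toy sketch below counts item pairs that appear together in purchase transactions; the data and support threshold are invented for illustration and do not reproduce any particular mining algorithm.

```python
from collections import Counter
from itertools import combinations

# Toy transaction log (illustrative data only).
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

# Count how often each pair of items occurs together across transactions.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Keep only "interesting" patterns: pairs occurring at least twice.
min_support = 2
frequent = {pair: n for pair, n in pair_counts.items() if n >= min_support}
print(frequent)  # all three pairs occur exactly twice in this toy data
```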
Email is the technology, and the services it provides, for sending and receiving electronic messages (called "letters" or "electronic letters") over a distributed (including global) computer network. In its composition of elements and principle of operation, electronic mail practically replicates the system of regular (paper) mail, borrowing both its terms (mail, letter, envelope, attachment, box, delivery, and others) and its characteristic features: ease of use, message transmission delays, and sufficient reliability combined with no guarantee of delivery. The advantages of e-mail are: addresses of the form user_name@domain_name (for example, somebody@example.com) that are easily perceived and remembered by a person; the ability to transfer both plain and formatted text, as well as arbitrary files; the independence of servers (in the general case, they address each other directly); sufficiently high reliability of message delivery; and ease of use by humans and programs.
The disadvantages of e-mail include: spam (mass advertising and viral mailings); the theoretical impossibility of guaranteed delivery of a particular letter; possible delays in message delivery (up to several days); and limits on the size of a single message and on the total size of messages in the mailbox (set individually for each user).
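A minimal sketch of composing and handing off such a message, using Python's standard email and smtplib modules; the addresses, attachment bytes, and the "localhost" SMTP server are placeholders and are not taken from the sources above.

```python
import smtplib
from email.message import EmailMessage

# Compose a message with a plain-text body (addresses are placeholders).
msg = EmailMessage()
msg["From"] = "somebody@example.com"
msg["To"] = "recipient@example.org"
msg["Subject"] = "Quarterly report"
msg.set_content("Plain-text body of the letter.")

# Arbitrary files can travel as attachments alongside the text.
msg.add_attachment(b"%PDF-1.4 ...", maintype="application",
                   subtype="pdf", filename="report.pdf")

# Hand the message to an SMTP server for delivery; as noted above,
# there is no guarantee of delivery. "localhost" assumes a local mail server.
with smtplib.SMTP("localhost") as server:
    server.send_message(msg)
```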
A search engine is a software and hardware complex with a web interface that provides the ability to search for information on the Internet. The term usually refers to a site that hosts the interface (front end) of the system. The software part is the search engine proper: a set of programs that provides the functionality of the system and is usually a trade secret of the company that develops it. Most search engines look for information on World Wide Web sites, but there are also systems that can search for files on FTP servers, items in online stores, and information on Usenet newsgroups. Improving search is one of the priorities of the modern Internet (see the Deep Web article on the main problems in the operation of search engines).
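The indexing idea at the heart of such a back end can be sketched in a few lines: an inverted index maps each word to the documents containing it, and a query is answered by intersecting those document sets. The documents and query below are invented for the example; real search engines are far more elaborate.

```python
# Toy document collection (illustrative text only).
documents = {
    1: "information technology and computer networks",
    2: "search engines index the world wide web",
    3: "the web is searched through an inverted index",
}

# Build the inverted index: word -> set of documents containing it.
index = {}
for doc_id, text in documents.items():
    for word in text.lower().split():
        index.setdefault(word, set()).add(doc_id)

def search(query):
    """Return the IDs of documents containing every word of the query."""
    word_sets = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*word_sets) if word_sets else set()

print(search("web index"))  # -> {2, 3}
```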
Companies in the information technology field are often discussed as a group as the "tech sector" or the "tech industry." [51] [52] [53] These titles can be misleading at times and should not be confused with "tech companies", which are generally large-scale, for-profit corporations that sell consumer technology and software. From a business perspective, information technology departments are a "cost center" the majority of the time. A cost center is a department or staff which incurs expenses, or "costs", within a company rather than generating profits or revenue streams. Modern businesses rely heavily on technology for their day-to-day operations, so the expenses delegated to cover technology that facilitates business in a more efficient manner are usually seen as "just the cost of doing business." IT departments are allocated funds by senior leadership and must attempt to achieve the desired deliverables while staying within that budget. Government and the private sector might have different funding mechanisms, but the principles are more or less the same. This is an often overlooked reason for the rapid interest in automation and artificial intelligence: the constant pressure to do more with less is opening the door for automation to take control of at least some minor operations in large companies.
Many companies now have IT departments for managing the computers, networks, and other technical areas of their businesses. Companies have also sought to integrate IT with business outcomes and decision-making through a BizOps or business operations department. [54]
In a business context, the Information Technology Association of America has defined information technology as "the study, design, development, application, implementation, support, or management of computer-based information systems". [55] The responsibilities of those working in the field include network administration, software development and installation, and the planning and management of an organization's technology life cycle, by which hardware and software are maintained, upgraded, and replaced.
Information services is a term somewhat loosely applied to a variety of IT-related services offered by commercial companies, [56] [57] [58] as well as data brokers.
The field of information ethics was established by mathematician Norbert Wiener in the 1940s. [60] : 9 Some of the ethical issues associated with the use of information technology include: [61] : 20–21
Research suggests that IT projects in business and public administration can easily become significant in scale. Work conducted by McKinsey in collaboration with the University of Oxford suggested that half of all large-scale IT projects (those with initial cost estimates of $15 million or more) failed to keep costs within their initial budgets or to complete on time. [62]
Computing is any goal-oriented activity requiring, benefiting from, or creating computing machinery. It includes the study and experimentation of algorithmic processes, and the development of both hardware and software. Computing has scientific, engineering, mathematical, technological, and social aspects. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology, and software engineering.
Computer memory stores information, such as data and programs, for immediate use in the computer. The term memory is often synonymous with the terms RAM, main memory, or primary storage. Archaic synonyms for main memory include core and store.
Digital electronics is a field of electronics involving the study of digital signals and the engineering of devices that use or produce them. This is in contrast to analog electronics, which works primarily with analog signals. Despite the name, digital electronics design includes important analog design considerations.
Computer engineering is a branch of electrical engineering that integrates several fields of electrical engineering, electronics engineering, and computer science required to develop computer hardware and software. At some universities it is referred to as electrical and computer engineering or computer science and engineering.
The Information Age is a historical period that began in the mid-20th century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology. The onset of the Information Age has been linked to the development of the transistor in 1947 and the optical amplifier in 1957. These technological advances have had a significant impact on the way information is processed and transmitted.
A stored-program computer is a computer that stores program instructions in electronically, electromagnetically, or optically accessible memory. This contrasts with systems that stored the program instructions with plugboards or similar mechanisms.
An information system (IS) is a formal, sociotechnical, organizational system designed to collect, process, store, and distribute information. From a sociotechnical perspective, information systems comprise four components: task, people, structure, and technology. Information systems can also be defined as an integration of components for the collection, storage, and processing of data, comprising digital products that process data to facilitate decision-making, with the data used to provide information and contribute to knowledge.
The history of computing hardware starting in 1960 is marked by the conversion from vacuum tubes to solid-state devices such as transistors and then integrated circuit (IC) chips. Around 1953 to 1959, discrete transistors started being considered sufficiently reliable and economical that they made further vacuum tube computers uncompetitive. Metal–oxide–semiconductor (MOS) large-scale integration (LSI) technology subsequently led to the development of semiconductor memory in the mid-to-late 1960s and then the microprocessor in the early 1970s. This led to primary computer memory moving away from magnetic-core memory devices to solid-state static and dynamic semiconductor memory, which greatly reduced the cost, size, and power consumption of computers. These advances led to the miniaturized personal computer (PC) in the 1970s, starting with home computers and desktop computers, followed by laptops and then mobile computers over the next several decades.
Electronic data processing (EDP) or business information processing can refer to the use of automated methods to process commercial data. Typically, this uses relatively simple, repetitive activities to process large volumes of similar information. For example: stock updates applied to an inventory, banking transactions applied to account and customer master files, booking and ticketing transactions to an airline's reservation system, billing for utility services. The modifier "electronic" or "automatic" was used with "data processing" (DP), especially c. 1960, to distinguish human clerical data processing from that done by computer.
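The repetitive, high-volume character of such processing can be illustrated with a small sketch in which a day's banking transactions are applied to account master records; the account numbers and amounts are invented for the example.

```python
# Account master records: account number -> balance (illustrative data).
accounts = {"10001": 250.00, "10002": 1200.50}

# A batch of the day's transactions (positive = deposit, negative = withdrawal).
transactions = [
    ("10001", +75.00),
    ("10002", -300.00),
    ("10001", -20.00),
]

# The same simple update is applied repetitively to every record in the batch.
for account, amount in transactions:
    accounts[account] = round(accounts[account] + amount, 2)

print(accounts)  # -> {'10001': 305.0, '10002': 900.5}
```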
Enterprise information integration (EII) is the ability to support a unified view of data and information for an entire organization. In a data virtualization application of EII, information integration uses data abstraction to provide a unified interface for viewing all the data within an organization, together with a single set of structures and naming conventions to represent this data; the goal of EII is to make a large set of heterogeneous data sources appear to a user or system as a single, homogeneous data source.
Semiconductor memory is a digital electronic semiconductor device used for digital data storage, such as computer memory. It typically refers to devices in which data is stored within metal–oxide–semiconductor (MOS) memory cells on a silicon integrated circuit memory chip. There are numerous different types using different semiconductor technologies. The two main types of random-access memory (RAM) are static RAM (SRAM), which uses several transistors per memory cell, and dynamic RAM (DRAM), which uses a transistor and a MOS capacitor per cell. Non-volatile memory uses floating-gate memory cells, which consist of a single floating-gate transistor per cell.
Enterprise content management (ECM) extends the concept of content management by adding a timeline for each content item and, possibly, enforcing processes for its creation, approval, and distribution. Systems using ECM generally provide a secure repository for managed items, analog or digital. They also include one or more methods for importing content to manage new items, and several presentation methods to make items available for use. Although ECM content may be protected by digital rights management (DRM), it is not required. ECM is distinguished from general content management by its cognizance of the processes and procedures of the enterprise for which it is created.
A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs. These programs enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation; or to a group of computers that are linked and function together, such as a computer network or computer cluster.
A personal computer, often referred to as a PC, is a computer designed for individual use. It is typically used for tasks such as word processing, internet browsing, email, multimedia playback, and gaming. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large, costly minicomputers and mainframes, personal computers are not time-shared by many people at the same time. The term home computer has also been used, primarily in the late 1970s and 1980s. The advent of personal computers and the concurrent Digital Revolution have significantly affected the lives of people.
Computer hardware includes the physical parts of a computer, such as the central processing unit (CPU), random access memory (RAM), motherboard, computer data storage, graphics card, sound card, and computer case. It includes external devices such as a monitor, mouse, keyboard, and speakers.
This article details the history of electronics engineering. Chambers Twentieth Century Dictionary (1972) defines electronics as "The science and technology of the conduction of electricity in a vacuum, a gas, or a semiconductor, and devices based thereon".
This glossary of computer science is a list of definitions of terms and concepts used in computer science, its sub-disciplines, and related fields, including terms relevant to software, data science, and computer programming.
Commonly a synonym for computers and computer networks, but more broadly designating any technology that is used to generate, store, process, and/or distribute information electronically, including television and telephone.