Computer science

Computer science is the study of computation, information, and automation. [1] [2] [3] Computer science ranges from theoretical disciplines (such as algorithms, theory of computation, and information theory) to applied disciplines (including the design and implementation of hardware and software). [4] [5] [6]

Algorithms and data structures are central to computer science. [7] The theory of computation concerns abstract models of computation and general classes of problems that can be solved using them. The fields of cryptography and computer security involve studying the means for secure communication and preventing security vulnerabilities. Computer graphics and computational geometry address the generation of images. Programming language theory considers different ways to describe computational processes, and database theory concerns the management of repositories of data. Human–computer interaction investigates the interfaces through which humans and computers interact, and software engineering focuses on the design and principles behind developing software. Areas such as operating systems, networks and embedded systems investigate the principles and design behind complex systems. Computer architecture describes the construction of computer components and computer-operated equipment. Artificial intelligence and machine learning aim to synthesize goal-orientated processes such as problem-solving, decision-making, environmental adaptation, planning and learning found in humans and animals. Within artificial intelligence, computer vision aims to understand and process image and video data, while natural language processing aims to understand and process textual and linguistic data.

The fundamental concern of computer science is determining what can and cannot be automated. [2] [8] [3] [9] [10] The Turing Award is generally recognized as the highest distinction in computer science. [11] [12]

History

Gottfried Wilhelm Leibniz (1646–1716) developed logic in a binary number system and has been called the "founder of computer science".
Charles Babbage is sometimes referred to as the "father of computing". Babbage40.png
Charles Babbage is sometimes referred to as the "father of computing".
Ada Lovelace published the first algorithm intended for processing on a computer.

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. [16]

Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. [17] In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner. [18] Leibniz may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry [note 1] when he invented his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine. [19] He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer". [20] "A crucial step was the adoption of a punched card system derived from the Jacquard loom", [20] making it infinitely programmable. [note 2] In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first published algorithm ever specifically tailored for implementation on a computer. [21]

Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company became part of IBM. Following Babbage, although unaware of his earlier work, Percy Ludgate in 1909 published [22] the second of the only two designs for mechanical analytical engines in history. In 1914, the Spanish engineer Leonardo Torres Quevedo published his Essays on Automatics, [23] and designed, inspired by Babbage, a theoretical electromechanical calculating machine which was to be controlled by a read-only program. The paper also introduced the idea of floating-point arithmetic. [24] [25] In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, a prototype that demonstrated the feasibility of an electromechanical analytical engine, [26] on which commands could be typed and the results printed automatically. [27] In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business, [28] to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true". [29]

During the 1940s, with the development of new and more powerful computing machines such as the Atanasoff–Berry computer and ENIAC, the term computer came to refer to the machines rather than their human predecessors. [30] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City. The renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world. [31] Ultimately, the close relationship between IBM and Columbia University was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. [32]

Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s. [33] [34] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science department in the United States was formed at Purdue University in 1962. [35] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.

Etymology and scope

Although first proposed in 1956, [36] the term "computer science" appears in a 1959 article in Communications of the ACM, [37] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the creation of Harvard Business School in 1921. [38] Fein justifies the name by arguing that, like management science, the subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline. [37] His efforts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to create such departments, starting with Purdue in 1962. [39] Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed. [40] Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy, [41] to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. An alternative term, also proposed by Naur, is data science; this is now used for a multi-disciplinary field of data analysis, including statistics and databases.

In the early days of computing, a number of terms for the practitioners of the field of computing were suggested in the Communications of the ACM: turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist. [42] Three months later in the same journal, comptologist was suggested, followed next year by hypologist. [43] The term computics has also been suggested. [44] In Europe, terms derived from contracted translations of the expression "automatic information" (e.g. "informazione automatica" in Italian) or "information and mathematics" are often used, e.g. informatique (French), Informatik (German), informatica (Italian, Dutch), informática (Spanish, Portuguese), informatika (Slavic languages and Hungarian) or pliroforiki (πληροφορική, which means informatics) in Greek. Similar words have also been adopted in the UK (as in the School of Informatics, University of Edinburgh). [45] "In the U.S., however, informatics is linked with applied computing, or computing in the context of another domain." [46]

A folkloric quotation, often attributed to—but almost certainly not first formulated by—Edsger Dijkstra, states that "computer science is no more about computers than astronomy is about telescopes." [note 3] The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been exchange of ideas between the various computer-related disciplines. Computer science research also often intersects other disciplines, such as cognitive science, linguistics, mathematics, physics, biology, Earth science, statistics, philosophy, and logic.

Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science. [33] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel, Alan Turing, John von Neumann, Rózsa Péter and Alonzo Church and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra. [36]

The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined. [47] David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines. [48]

The academic, political, and funding aspects of computer science tend to depend on whether a department is formed with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics emphasis and with a numerical orientation consider alignment with computational science. Both types of departments tend to make efforts to bridge the field educationally if not across all research.

Philosophy

Epistemology of computer science

Despite the word science in its name, there is debate over whether or not computer science is a discipline of science, [49] mathematics, [50] or engineering. [51] Allen Newell and Herbert A. Simon argued in 1975,

Computer science is an empirical discipline. We would have called it an experimental science, but like astronomy, economics, and geology, some of its unique forms of observation and experience do not fit a narrow stereotype of the experimental method. Nonetheless, they are experiments. Each new machine that is built is an experiment. Actually constructing the machine poses a question to nature; and we listen for the answer by observing the machine in operation and analyzing it by all analytical and measurement means available. [51]

It has since been argued that computer science can be classified as an empirical science since it makes use of empirical testing to evaluate the correctness of programs, but a problem remains in defining the laws and theorems of computer science (if any exist) and defining the nature of experiments in computer science. [51] Proponents of classifying computer science as an engineering discipline argue that the reliability of computational systems is investigated in the same way as bridges in civil engineering and airplanes in aerospace engineering. [51] They also argue that while empirical sciences observe what presently exists, computer science observes what is possible to exist, and that while scientists discover laws from observation, no proper laws have been found in computer science, which is instead concerned with creating phenomena. [51]

Proponents of classifying computer science as a mathematical discipline argue that computer programs are physical realizations of mathematical entities and that programs can be reasoned about deductively through mathematical formal methods. [51] Computer scientists Edsger W. Dijkstra and Tony Hoare regard instructions for computer programs as mathematical sentences and interpret formal semantics for programming languages as mathematical axiomatic systems. [51]

Paradigms of computer science

A number of computer scientists have argued for the distinction of three separate paradigms in computer science. Peter Wegner argued that those paradigms are science, technology, and mathematics. [52] Peter Denning's working group argued that they are theory, abstraction (modeling), and design. [33] Amnon H. Eden described them as the "rationalist paradigm" (which treats computer science as a branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be found in engineering approaches, most prominently in software engineering), and the "scientific paradigm" (which approaches computer-related artifacts from the empirical perspective of natural sciences, [53] identifiable in some branches of artificial intelligence). [54] Computer science focuses on methods involved in design, specification, programming, verification, implementation and testing of human-made computing systems. [55]

Fields

As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software. [56] [57] CSAB, formerly called Computing Sciences Accreditation Board—which is made up of representatives of the Association for Computing Machinery (ACM), and the IEEE Computer Society (IEEE CS) [58] —identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, human–computer interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas of computer science. [56]

Theoretical computer science

Theoretical computer science is mathematical and abstract in spirit, but it derives its motivation from practical and everyday computation. It aims to understand the nature of computation and, as a consequence of this understanding, provide more efficient methodologies.

Theory of computation

According to Peter Denning, the fundamental question underlying computer science is, "What can be automated?" [3] Theory of computation is focused on answering fundamental questions about what can be computed and what amount of resources are required to perform those computations. In an effort to answer the first question, computability theory examines which computational problems are solvable on various theoretical models of computation. The second question is addressed by computational complexity theory, which studies the time and space costs associated with different approaches to solving a multitude of computational problems.

The famous P = NP? problem, one of the Millennium Prize Problems, [59] is an open problem in the theory of computation.
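To make the notion of a model of computation concrete, the sketch below simulates a deterministic finite automaton in Python; the machine, its states, and the example strings are illustrative assumptions rather than anything drawn from the cited sources.

    # A minimal deterministic finite automaton (DFA) simulator.
    # This example machine accepts binary strings containing an even number of 1s.
    DFA = {
        "start": "even",
        "accept": {"even"},
        # transition function: (state, input symbol) -> next state
        "delta": {
            ("even", "0"): "even", ("even", "1"): "odd",
            ("odd", "0"): "odd",   ("odd", "1"): "even",
        },
    }

    def accepts(dfa, word):
        """Run the DFA over `word` and report whether it halts in an accepting state."""
        state = dfa["start"]
        for symbol in word:
            state = dfa["delta"][(state, symbol)]
        return state in dfa["accept"]

    print(accepts(DFA, "1011"))  # False: three 1s
    print(accepts(DFA, "1001"))  # True: two 1s

Problems decidable by such finite automata form one of the simplest classes studied in computability theory; richer models, such as Turing machines, decide strictly larger classes.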

Related topics: automata theory, formal languages, computability theory, computational complexity theory, models of computation, quantum computing theory, logic circuit theory, and cellular automata.

Information and coding theory

Information theory, closely related to probability and statistics, concerns the quantification of information. It was developed by Claude Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. [60] Coding theory is the study of the properties of codes (systems for converting information from one form to another) and their fitness for a specific application. Codes are used for data compression, cryptography, error detection and correction, and more recently also for network coding. Codes are studied for the purpose of designing efficient and reliable data transmission methods. [61]
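As a small illustration of Shannon's quantification of information (the function name and example distributions are hypothetical, not taken from the sources above), the following Python sketch computes the entropy of a discrete distribution, the quantity that bounds how far its outcomes can be losslessly compressed on average.

    import math

    def shannon_entropy(probabilities):
        """Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin carries one bit per toss; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))  # 1.0
    print(shannon_entropy([0.9, 0.1]))  # approximately 0.469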

Related topics: coding theory, channel capacity, algorithmic information theory, signal detection theory, and Kolmogorov complexity.

Data structures and algorithms

Data structures and algorithms are the studies of commonly used computational methods and their computational efficiency.
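A standard example of how the choice of data organization affects efficiency is binary search on a sorted array, which needs O(log n) comparisons where a linear scan needs O(n); the Python sketch below is a generic illustration, not code from any cited source.

    def binary_search(sorted_items, target):
        """Return the index of `target` in `sorted_items`, or -1 if it is absent.
        Each iteration halves the remaining interval, giving O(log n) comparisons."""
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            if sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    print(binary_search([2, 3, 5, 7, 11, 13], 11))  # 4
    print(binary_search([2, 3, 5, 7, 11, 13], 4))   # -1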

Related topics: analysis of algorithms, algorithm design, data structures, combinatorial optimization, computational geometry, and randomized algorithms.

Programming language theory and formal methods

Programming language theory is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It falls within the discipline of computer science, both depending on and affecting mathematics, software engineering, and linguistics. It is an active research area, with numerous dedicated academic journals.
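The concerns of programming language theory can be made concrete with a toy language: the sketch below (a hypothetical example with invented names, written in Python) represents the abstract syntax of arithmetic expressions as nested tuples and gives them meaning with a recursive evaluator.

    # Toy abstract syntax: a literal number, or a tuple (operator, left, right).
    def evaluate(expr):
        """Map an expression tree to the number it denotes."""
        if isinstance(expr, (int, float)):   # literal
            return expr
        op, left, right = expr               # compound expression
        a, b = evaluate(left), evaluate(right)
        if op == "+":
            return a + b
        if op == "*":
            return a * b
        raise ValueError(f"unknown operator: {op}")

    # (1 + 2) * 4  evaluates to  12
    print(evaluate(("*", ("+", 1, 2), 4)))

Real language definitions separate the same ingredients, syntax and semantics, far more rigorously, which is where type systems, formal semantics, and compiler design enter.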

Formal methods are a particular kind of mathematically based technique for the specification, development and verification of software and hardware systems. [62] The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design. They form an important theoretical underpinning for software engineering, especially where safety or security is involved. Formal methods are a useful adjunct to software testing since they help avoid errors and can also give a framework for testing. For industrial use, tool support is required. However, the high cost of using formal methods means that they are usually only used in the development of high-integrity and life-critical systems, where safety or security is of utmost importance. Formal methods are best described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types to problems in software and hardware specification and verification.
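Full formal verification relies on machine-checked proofs, but the flavor of specifying a program can be hinted at with executable pre- and postconditions; the Python sketch below is only an illustration of that idea, not an industrial formal-methods tool.

    def integer_sqrt(n):
        """Return the largest r with r*r <= n, stated as a pre- and postcondition."""
        assert n >= 0, "precondition: n must be non-negative"
        r = 0
        # loop invariant: r * r <= n holds on every iteration
        while (r + 1) * (r + 1) <= n:
            r += 1
        assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
        return r

    print(integer_sqrt(10))  # 3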

Related topics: formal semantics, type theory, compiler design, programming languages, formal verification, and automated theorem proving.

Applied computer science

Computer graphics and visualization

Computer graphics is the study of digital visual contents and involves the synthesis and manipulation of image data. The study is connected to many other fields in computer science, including computer vision, image processing, and computational geometry, and is heavily applied in the fields of special effects and video games.
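One of the basic operations behind image synthesis is applying a geometric transformation to points; the short Python sketch below (an illustrative example only) rotates a 2D point about the origin using the standard rotation formula.

    import math

    def rotate(point, angle_degrees):
        """Rotate a 2D point counter-clockwise about the origin."""
        x, y = point
        a = math.radians(angle_degrees)
        return (x * math.cos(a) - y * math.sin(a),
                x * math.sin(a) + y * math.cos(a))

    # Rotating (1, 0) by 90 degrees gives approximately (0, 1).
    print(rotate((1.0, 0.0), 90))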

Related topics: 2D computer graphics, computer animation, rendering, mixed reality, virtual reality, and solid modeling.

Image and sound processing

Information can take the form of images, sound, video, or other multimedia. Bits of information can be streamed via signals. The processing of such information is the central notion of informatics, the European view on computing, which studies information-processing algorithms independently of the type of information carrier, whether electrical, mechanical, or biological. This field plays an important role in information theory, telecommunications, and information engineering, and has applications in medical image computing and speech synthesis, among others. The question of the lower bound on the complexity of fast Fourier transform algorithms remains one of the unsolved problems in theoretical computer science.
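The discrete Fourier transform at the heart of that open question can be written directly from its definition; the Python sketch below (illustrative only) is the naive O(n^2) form whose cost fast Fourier transform algorithms reduce to O(n log n).

    import cmath

    def dft(samples):
        """Naive discrete Fourier transform: O(n^2) complex multiplications."""
        n = len(samples)
        return [sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
                for k in range(n)]

    # A constant signal has all of its energy in the zero-frequency bin.
    print([round(abs(x), 6) for x in dft([1, 1, 1, 1])])  # [4.0, 0.0, 0.0, 0.0]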

Related topics: FFT algorithms, image processing, speech recognition, data compression, medical image computing, and speech synthesis.

Computational science, finance and engineering

Scientific computing (or computational science) is the field of study concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems. A major usage of scientific computing is simulation of various processes, including computational fluid dynamics, physical, electrical, and electronic systems and circuits, as well as societies and social situations (notably war games) along with their habitats, among many others. Modern computers enable optimization of such designs as complete aircraft. Notable in electrical and electronic circuit design are SPICE [63] and software for the physical realization of new (or modified) designs. The latter includes essential design software for integrated circuits. [64]
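A minimal example of the simulation workflow in scientific computing is the numerical integration of a differential equation; the Python sketch below (an illustrative assumption, not a production solver) applies the forward Euler method to exponential decay, dy/dt = -k*y.

    def euler_decay(y0, k, dt, steps):
        """Integrate dy/dt = -k * y with the forward Euler method."""
        y = y0
        trajectory = [y]
        for _ in range(steps):
            y += dt * (-k * y)   # one explicit Euler step
            trajectory.append(y)
        return trajectory

    # With y0 = 1 and k = 1, the exact value at t = 1 is exp(-1), about 0.3679.
    print(euler_decay(1.0, 1.0, 0.1, 10)[-1])  # about 0.3487; the gap is discretization error

Shrinking the step size dt reduces that error at the cost of more computation, which is exactly the accuracy-versus-cost trade-off that numerical analysis studies.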

Related topics: numerical analysis, computational physics, computational chemistry, bioinformatics, neuroinformatics, psychoinformatics, medical informatics, computational engineering, and computational musicology.

Human–computer interaction

Human–computer interaction (HCI) is the field of study and research concerned with the design and use of computer systems, mainly based on the analysis of the interaction between humans and computer interfaces. HCI has several subfields that focus on the relationship between emotions, social behavior and brain activity with computers.

Related topics: affective computing, brain–computer interfaces, human-centered design, physical computing, and social computing.

Software engineering

Software engineering is the study of designing, implementing, and modifying software in order to ensure it is of high quality, affordable, maintainable, and fast to build. It is a systematic approach to software design, involving the application of engineering practices to software. Software engineering deals with the organizing and analyzing of software; it concerns not just the creation or manufacture of new software, but also its internal arrangement and maintenance. Examples of its concerns include software testing, systems engineering, technical debt, and software development processes.
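Software testing, one of the concerns named above, is commonly automated with unit tests; the Python sketch below is a generic illustration using the standard unittest module, with a made-up function rather than code from any particular project.

    import unittest

    def slugify(title):
        """Convert a title to a lowercase, hyphen-separated URL slug."""
        return "-".join(title.lower().split())

    class SlugifyTests(unittest.TestCase):
        def test_basic_title(self):
            self.assertEqual(slugify("Computer Science"), "computer-science")

        def test_extra_whitespace(self):
            self.assertEqual(slugify("  Data   Structures "), "data-structures")

    if __name__ == "__main__":
        unittest.main()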

Artificial intelligence

Artificial intelligence (AI) aims to synthesize goal-orientated processes such as problem-solving, decision-making, environmental adaptation, learning, and communication found in humans and animals. From its origins in cybernetics and in the Dartmouth Conference (1956), artificial intelligence research has been necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics, electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular mind with robotic development, but its main field of practical application has been as an embedded component in areas of software development that require computational understanding. The starting point in the late 1940s was Alan Turing's question "Can computers think?", and the question remains effectively unanswered, although the Turing test is still used to assess computer output on the scale of human intelligence. The automation of evaluative and predictive tasks has nevertheless been increasingly successful as a substitute for human monitoring and intervention in domains of computer application involving complex real-world data.

Related topics: computational learning theory, computer vision, neural networks, planning and scheduling, natural language processing, computational game theory, evolutionary computation, autonomic computing, representation and reasoning, pattern recognition, robotics, and swarm intelligence.

Computer systems

Computer architecture and microarchitecture

Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system. It focuses largely on the way by which the central processing unit performs internally and accesses addresses in memory. [65] Computer engineers study computational logic and the design of computer hardware, from individual processor components and microcontrollers to personal computers, supercomputers, and embedded systems. The term "architecture" in computer literature can be traced to the work of Lyle R. Johnson and Frederick P. Brooks Jr., members of the Machine Organization department in IBM's main research center in 1959.

Related topics: processing units, microarchitecture, multiprocessing, processor design, ubiquitous computing, systems architecture, operating systems, input/output, embedded systems, real-time computing, dependability, and interpreters.

Concurrent, parallel and distributed computing

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other. [66] A number of mathematical models have been developed for general concurrent computation including Petri nets, process calculi and the parallel random access machine model. [67] When multiple computers are connected in a network while using concurrency, this is known as a distributed system. Computers within that distributed system have their own private memory, and information can be exchanged to achieve common goals. [68]
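A small illustration of concurrent computation (a hypothetical example, not drawn from the cited models) is several Python threads updating a shared counter, where a lock serializes updates that would otherwise interleave unpredictably.

    import threading

    counter = 0
    lock = threading.Lock()

    def worker(increments):
        """Add to the shared counter; the lock makes each update atomic."""
        global counter
        for _ in range(increments):
            with lock:
                counter += 1

    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(counter)  # 400000; without the lock, updates could be lost

Process calculi and Petri nets give mathematical accounts of exactly this kind of interaction, independent of any particular programming language.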

Computer networks

This branch of computer science concerns the design, operation, and management of the networks that connect computers worldwide.

Computer security and cryptography

Computer security is a branch of computer technology with the objective of protecting information from unauthorized access, disruption, or modification while maintaining the accessibility and usability of the system for its intended users.

Historical cryptography is the art of writing and deciphering secret messages. Modern cryptography is the scientific study of problems relating to distributed computations that can be attacked. [69] Technologies studied in modern cryptography include symmetric and asymmetric encryption, digital signatures, cryptographic hash functions, key-agreement protocols, blockchain, zero-knowledge proofs, and garbled circuits.
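Two of the primitives listed above can be sketched briefly in Python: a cryptographic hash via the standard hashlib library, and a toy XOR cipher that illustrates the idea of a shared symmetric key but is emphatically not a secure algorithm.

    import hashlib

    # Cryptographic hash: a fixed-size digest that is computationally infeasible to invert.
    digest = hashlib.sha256(b"computer science").hexdigest()
    print(digest)

    # Toy symmetric cipher: XOR with a repeating key (illustration only;
    # real systems use vetted algorithms such as AES).
    def xor_cipher(data: bytes, key: bytes) -> bytes:
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    ciphertext = xor_cipher(b"attack at dawn", b"secret")
    print(xor_cipher(ciphertext, b"secret"))  # b'attack at dawn'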

Databases and data mining

A database is intended to organize, store, and retrieve large amounts of data easily. Digital databases are managed using database management systems to store, create, maintain, and search data, through database models and query languages. Data mining is a process of discovering patterns in large data sets.
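The store-and-query cycle described above can be shown with the relational model in a few lines of Python; the sketch below uses the standard library's sqlite3 module, and the table and rows are invented for illustration.

    import sqlite3

    # An in-memory relational database managed by a DBMS (SQLite).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE papers (title TEXT, year INTEGER)")
    conn.executemany(
        "INSERT INTO papers VALUES (?, ?)",
        [("Computing as a discipline", 1989), ("Three Paradigms of Computer Science", 2007)],
    )

    # Declarative SQL query: the DBMS decides how to retrieve the matching rows.
    for title, year in conn.execute("SELECT title, year FROM papers WHERE year > 2000"):
        print(year, title)
    conn.close()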

Discoveries

The philosopher of computing Bill Rapaport noted three Great Insights of Computer Science: [70]

All the information about any computable problem can be represented using only 0 and 1 (or any other bistable pair that can flip-flop between two easily distinguishable states, such as "on/off", "magnetized/de-magnetized", "high-voltage/low-voltage", etc.).
Every algorithm can be expressed in a language for a computer consisting of only five basic instructions: [71]
  • move left one location;
  • move right one location;
  • read symbol at current location;
  • print 0 at current location;
  • print 1 at current location.
Only three rules are needed to combine any set of basic instructions into more complex ones (illustrated in the sketch after this list):
  • sequence: first do this, then do that;
  • selection: IF such-and-such is the case, THEN do this, ELSE do that;
  • repetition: WHILE such-and-such is the case, DO this.
The three rules of Böhm and Jacopini's insight can be further simplified with the use of goto (which means it is more elementary than structured programming).
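As a hypothetical Python sketch (the names and the example problem are invented), the function below uses only the three combining rules, sequence, selection, and repetition, to express a complete computation without any goto.

    def collatz_steps(n):
        """Count the steps needed to reach 1 in the Collatz iteration."""
        steps = 0                  # sequence: one statement after another
        while n != 1:              # repetition: WHILE the condition holds, DO the body
            if n % 2 == 0:         # selection: IF ... THEN ... ELSE
                n //= 2
            else:
                n = 3 * n + 1
            steps += 1
        return steps

    print(collatz_steps(6))  # 8 steps: 6, 3, 10, 5, 16, 8, 4, 2, 1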

Programming paradigms

Programming languages can be used to accomplish different tasks in different ways. Common programming paradigms include imperative, functional, object-oriented, and declarative programming.

Many languages offer support for multiple paradigms, making the distinction more a matter of style than of technical capabilities. [76]
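As an illustrative sketch (the function names are invented), the same computation can be written in an imperative style, as steps that mutate state, and in a functional style, as an expression without mutation; many languages, Python among them, support both.

    # Imperative style: describe the steps that build up the result.
    def squares_imperative(numbers):
        result = []
        for n in numbers:
            result.append(n * n)
        return result

    # Functional style: describe the result as an expression, without mutation.
    def squares_functional(numbers):
        return [n * n for n in numbers]

    print(squares_imperative([1, 2, 3]))  # [1, 4, 9]
    print(squares_functional([1, 2, 3]))  # [1, 4, 9]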

Research

Conferences are important events for computer science research. During these conferences, researchers from the public and private sectors present their recent work and meet. Unlike in most other academic fields, in computer science the prestige of conference papers is greater than that of journal publications. [77] [78] One proposed explanation for this is that the quick development of this relatively new field requires rapid review and distribution of results, a task better handled by conferences than by journals. [79]

See also

Notes

  1. In 1851
  2. "The introduction of punched cards into the new engine was important not only as a more convenient form of control than the drums, or because programs could now be of unlimited extent, and could be stored and repeated without the danger of introducing errors in setting the machine by hand; it was important also because it served to crystallize Babbage's feeling that he had invented something really new, something much more than a sophisticated calculating machine." Bruce Collier, 1970
  3. See the entry "Computer science" on Wikiquote for the history of this quotation.
  4. The word "anything" is written in quotation marks because there are things that computers cannot do. One example is: to answer the question if an arbitrary given computer program will eventually finish or run forever (the Halting problem).

Related Research Articles

<span class="mw-page-title-main">Algorithm</span> Sequence of operations for a task

In mathematics and computer science, an algorithm is a finite sequence of mathematically rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes and deduce valid inferences.

<span class="mw-page-title-main">Computing</span> Activity involving calculations or computing machinery

Computing is any goal-oriented activity requiring, benefiting from, or creating computing machinery. It includes the study and experimentation of algorithmic processes, and the development of both hardware and software. Computing has scientific, engineering, mathematical, technological, and social aspects. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology, and software engineering.

<span class="mw-page-title-main">Edsger W. Dijkstra</span> Dutch computer scientist (1930–2002)

Edsger Wybe Dijkstra was a Dutch computer scientist, programmer, software engineer, mathematician, and science essayist.

In computability theory, a system of data-manipulation rules is said to be Turing-complete or computationally universal if it can be used to simulate any Turing machine. This means that this system is able to recognize or decode other data-manipulation rule sets. Turing completeness is used as a way to express the power of such a data-manipulation rule set. Virtually all programming languages today are Turing-complete.

<span class="mw-page-title-main">John Backus</span> American computer scientist

John Warner Backus was an American computer scientist. He led the team that invented and implemented FORTRAN, the first widely used high-level programming language, and was the inventor of the Backus–Naur form (BNF), a widely used notation to define syntaxes of formal languages. He later did research into the function-level programming paradigm, presenting his findings in his influential 1977 Turing Award lecture "Can Programming Be Liberated from the von Neumann Style?"

Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.

<span class="mw-page-title-main">Theoretical computer science</span> Subfield of computer science and mathematics

Theoretical computer science is a subfield of computer science and mathematics that focuses on the abstract and mathematical foundations of computation.

A computer scientist is a scientist who specializes in the academic study of computer science.

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

<span class="mw-page-title-main">History of computing</span>

The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables.

<span class="mw-page-title-main">Computer science and engineering</span> University academic program

Computer science and engineering (CSE) or Computer Science (CS) also integrated as Electrical engineering and Computer Science (EECS) in some universities, is an academic program at many universities which comprises approaches of computer science and computer engineering. There is no clear division in computing between science and engineering, just like in the field of materials science and engineering. However, some classes are historically more related to computer science, and other to computer engineering. CSE is also a term often used in Europe to translate the name of technical or engineering informatics academic programs. It is offered in both undergraduate as well postgraduate with specializations.

<span class="mw-page-title-main">Jack Dongarra</span> American computer scientist (born 1950)

Jack Joseph Dongarra is an American computer scientist and mathematician. He is a University Distinguished Professor Emeritus of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee. He holds the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory, Turing Fellowship in the School of Mathematics at the University of Manchester, and is an adjunct professor and teacher in the Computer Science Department at Rice University. He served as a faculty fellow at the Texas A&M University Institute for Advanced Study (2014–2018). Dongarra is the founding director of the Innovative Computing Laboratory at the University of Tennessee. He was the recipient of the Turing Award in 2021.

<span class="mw-page-title-main">History of computer science</span>

The history of computer science began long before the modern discipline of computer science, usually appearing in forms like mathematics or physics. Developments in previous centuries alluded to the discipline that we now know as computer science. This progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to the development of a major academic field, massive technological advancement across the Western world, and the basis of a massive worldwide trade and culture.

<span class="mw-page-title-main">Programming language theory</span> Branch of computer science

Programming language theory (PLT) is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of formal languages known as programming languages. Programming language theory is closely related to other fields including mathematics, software engineering, and linguistics. There are a number of academic conferences and journals in the area.

<span class="mw-page-title-main">Frances Allen</span> American computer scientist (1932–2020)

Frances Elizabeth Allen was an American computer scientist and pioneer in the field of optimizing compilers. Allen was the first woman to become an IBM Fellow, and in 2006 became the first woman to win the Turing Award. Her achievements include seminal work in compilers, program optimization, and parallelization. She worked for IBM from 1957 to 2002 and subsequently was a Fellow Emerita.

Informatics is the study of computational systems. According to the ACM Europe Council and Informatics Europe, informatics is synonymous with computer science and computing as a profession, in which the central notion is transformation of information. In some cases, the term "informatics" may also be used with different meanings, e.g. in the context of social computing, or in context of library science.

<span class="mw-page-title-main">History of software</span> Description of the evolution and development of software throughout history

Software is a set of programmed instructions stored in the memory of stored-program digital computers for execution by the processor. Software is a recent development in human history and is fundamental to the Information Age.

This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence (AI), its subdisciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.

This glossary of computer science is a list of definitions of terms and concepts used in computer science, its sub-disciplines, and related fields, including terms relevant to software, data science, and computer programming.

References

  1. "What is Computer Science?". Department of Computer Science, University of York. Archived from the original on June 11, 2020. Retrieved June 11, 2020.
  2. What Can Be Automated? Computer Science and Engineering Research Study. Computer Science Series. MIT Press. 1980. ISBN 978-0262010603. Archived from the original on January 9, 2021.
  3. Denning, P.J.; Comer, D.E.; Gries, D.; Mulder, M.C.; Tucker, A.; Turner, A.J.; Young, P.R. (February 1989). "Computing as a discipline". Computer. 22 (2): 63–70. doi:10.1109/2.19833. ISSN 1558-0814. Archived from the original on March 3, 2022. Retrieved March 3, 2022. The discipline of computing is the systematic study of algorithmic processes that describe and transform information, their theory, analysis, design, efficiency, implementation, and application. The fundamental question underlying all of computing is, 'What can be (efficiently) automated?'
  4. "WordNet Search—3.1". WordNet Search. Wordnetweb.princeton.edu. Archived from the original on October 18, 2017. Retrieved May 14, 2012.
  5. "Definition of computer science | Dictionary.com". www.dictionary.com. Archived from the original on June 11, 2020. Retrieved June 11, 2020.
  6. "What is Computer Science? | Undergraduate Computer Science at UMD". undergrad.cs.umd.edu. Archived from the original on November 27, 2020. Retrieved July 15, 2022.
  7. Harel, David (2014). Algorithmics: The Spirit of Computing. Springer Berlin. ISBN 978-3-642-44135-6. OCLC 876384882. Archived from the original on June 17, 2020. Retrieved June 17, 2020.
  8. Patton, Richard D.; Patton, Peter C. (2009), Nof, Shimon Y. (ed.), "What Can be Automated? What Cannot be Automated?", Springer Handbook of Automation, Springer Handbooks, Berlin, Heidelberg: Springer, pp. 305–313, doi:10.1007/978-3-540-78831-7_18, ISBN   978-3-540-78831-7, archived from the original on January 11, 2023, retrieved March 3, 2022
  9. Forsythe, George (August 5–10, 1969). "Computer Science and Education". Proceedings of IFIP Congress 1968. The question 'What can be automated?' is one of the most inspiring philosophical and practical questions of contemporary civilization.
  10. Knuth, Donald E. (August 1, 1972). "George Forsythe and the development of computer science". Communications of the ACM. 15 (8): 721–726. doi: 10.1145/361532.361538 . ISSN   0001-0782. S2CID   12512057.
  11. Hanson, Vicki L. (January 23, 2017). "Celebrating 50 years of the Turing award". Communications of the ACM. 60 (2): 5. doi: 10.1145/3033604 . ISSN   0001-0782. S2CID   29984960.
  12. Scott, Eric; Martins, Marcella Scoczynski Ribeiro; Yafrani, Mohamed El; Volz, Vanessa; Wilson, Dennis G (June 5, 2018). "ACM marks 50 years of the ACM A.M. turing award and computing's greatest achievements". ACM SIGEVOlution. 10 (3): 9–11. doi:10.1145/3231560.3231563. ISSN   1931-8499. S2CID   47021559.
  13. "2021: 375th birthday of Leibniz, father of computer science". people.idsia.ch. Archived from the original on September 21, 2022. Retrieved February 4, 2023.
  14. "Charles Babbage Institute: Who Was Charles Babbage?". cbi.umn.edu. Archived from the original on January 9, 2007. Retrieved December 28, 2016.
  15. "Ada Lovelace | Babbage Engine | Computer History Museum". www.computerhistory.org. Archived from the original on December 25, 2018. Retrieved December 28, 2016.
  16. "History of Computer Science". cs.uwaterloo.ca. Archived from the original on July 29, 2017. Retrieved July 15, 2022.
  17. "Wilhelm Schickard – Ein Computerpionier" (PDF) (in German). Archived from the original (PDF) on September 19, 2020. Retrieved December 4, 2016.
  18. Keates, Fiona (June 25, 2012). "A Brief History of Computing". The Repository. The Royal Society. Archived from the original on June 29, 2012. Retrieved January 19, 2014.
  19. "Science Museum, Babbage's Analytical Engine, 1834–1871 (Trial model)". Archived from the original on August 30, 2019. Retrieved May 11, 2020.
  20. Hyman, Anthony (1982). Charles Babbage: Pioneer of the Computer. Oxford University Press. ISBN 978-0691083032.
  21. "A Selection and Adaptation From Ada's Notes found in Ada, The Enchantress of Numbers," by Betty Alexandra Toole Ed.D. Strawberry Press, Mill Valley, CA". Archived from the original on February 10, 2006. Retrieved May 4, 2006.
  22. "The John Gabriel Byrne Computer Science Collection" (PDF). Archived from the original on April 16, 2019. Retrieved August 8, 2019.
  23. Torres Quevedo, L. (1914). "Ensayos sobre Automática – Su definicion. Extension teórica de sus aplicaciones". Revista de la Academia de Ciencias Exacta, 12, pp. 391–418.
  24. Torres Quevedo, Leonardo. Automática: Complemento de la Teoría de las Máquinas, (pdf), pp. 575-583, Revista de Obras Públicas, 19 November 1914.
  25. Ronald T. Kneusel. Numbers and Computers, Springer, pp. 84-85, 2017. ISBN   978-3319505084
  26. Randell, Brian. Digital Computers, History of Origins, (pdf), p. 545, Digital Computers: Origins, Encyclopedia of Computer Science, January 2003.
  27. Randell 1982, p. 6, 11–13.
  28. "In this sense Aiken needed IBM, whose technology included the use of punched cards, the accumulation of numerical data, and the transfer of numerical data from one register to another", Bernard Cohen, p.44 (2000)
  29. Brian Randell, p. 187, 1975
  30. The Association for Computing Machinery (ACM) was founded in 1947.
  31. "IBM Archives: 1945". Ibm.com. January 23, 2003. Archived from the original on January 5, 2019. Retrieved March 19, 2019.
  32. "IBM100 – The Origins of Computer Science". Ibm.com. September 15, 1995. Archived from the original on January 5, 2019. Retrieved March 19, 2019.
  33. Denning, P.J.; Comer, D.E.; Gries, D.; Mulder, M.C.; Tucker, A.; Turner, A.J.; Young, P.R. (February 1989). "Computing as a discipline". Computer. 22 (2): 63–70. doi:10.1109/2.19833. ISSN 1558-0814. Archived from the original on March 3, 2022. Retrieved March 3, 2022.
  34. "Some EDSAC statistics". University of Cambridge. Archived from the original on September 3, 2007. Retrieved November 19, 2011.
  35. "Computer science pioneer Samuel D. Conte dies at 85". Purdue Computer Science. July 1, 2002. Archived from the original on October 6, 2014. Retrieved December 12, 2014.
  36. Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Taylor and Francis / CRC Press.
  37. Louis Fein (1959). "The Role of the University in Computers, Data Processing, and Related Fields". Communications of the ACM. 2 (9): 7–14. doi:10.1145/368424.368427. S2CID 6740821.
  38. "Stanford University Oral History". Stanford University. Archived from the original on April 4, 2017. Retrieved May 30, 2013.
  39. Donald Knuth (1972). "George Forsythe and the Development of Computer Science" . Comms. ACM. Archived October 20, 2013, at the Wayback Machine
  40. Matti Tedre (2006). "The Development of Computer Science: A Sociocultural Perspective" (PDF). p. 260. Archived (PDF) from the original on October 9, 2022. Retrieved December 12, 2014.
  41. Peter Naur (1966). "The science of datalogy". Communications of the ACM. 9 (7): 485. doi: 10.1145/365719.366510 . S2CID   47558402.
  42. Weiss, E.A.; Corley, Henry P.T. "Letters to the editor". Communications of the ACM. 1 (4): 6. doi: 10.1145/368796.368802 . S2CID   5379449.
  43. Communications of the ACM 2(1):p.4
  44. IEEE Computer 28(12): p.136
  45. P. Mounier-Kuhn, L'Informatique en France, de la seconde guerre mondiale au Plan Calcul. L'émergence d'une science, Paris, PUPS, 2010, ch. 3 & 4.
  46. Groth, Dennis P. (February 2010). "Why an Informatics Degree?". Communications of the ACM. Cacm.acm.org. Archived from the original on January 11, 2023. Retrieved June 14, 2016.
  47. Tedre, M. (2011). "Computing as a Science: A Survey of Competing Viewpoints". Minds and Machines. 21 (3): 361–387. doi:10.1007/s11023-011-9240-4. S2CID   14263916.
  48. Parnas, D.L. (1998). "Software engineering programmes are not computer science programmes". Annals of Software Engineering. 6: 19–37. doi:10.1023/A:1018949113292. S2CID   35786237., p. 19: "Rather than treat software engineering as a subfield of computer science, I treat it as an element of the set, Civil Engineering, Mechanical Engineering, Chemical Engineering, Electrical Engineering, [...]"
  49. Luk, R.W.P. (2020). "Insight in how computer science can be a science". Science & Philosophy. 8 (2): 17–47. doi:10.23756/sp.v8i2.531.
  50. Knuth, D.E. (1974). "Computer science and its relation to mathematics". The American Mathematical Monthly. 81 (4): 323–343. doi:10.2307/2318994. JSTOR   2318994.
  51. "The Philosophy of Computer Science". The Philosophy of Computer Science (Stanford Encyclopedia of Philosophy). Metaphysics Research Lab, Stanford University. 2021. Archived from the original on September 16, 2021. Retrieved September 16, 2021.
  52. Wegner, P. (October 13–15, 1976). Research paradigms in computer science—Proceedings of the 2nd international Conference on Software Engineering. San Francisco, California, United States: IEEE Computer Society Press, Los Alamitos, CA.
  53. Denning, Peter J. (2007). "Computing is a natural science". Communications of the ACM. 50 (7): 13–18. doi:10.1145/1272516.1272529. S2CID   20045303.
  54. Eden, A.H. (2007). "Three Paradigms of Computer Science" (PDF). Minds and Machines . 17 (2): 135–167. CiteSeerX   10.1.1.304.7763 . doi:10.1007/s11023-007-9060-8. S2CID   3023076. Archived from the original (PDF) on February 15, 2016.
  55. Turner, Raymond; Angius, Nicola (2019). "The Philosophy of Computer Science". In Zalta, Edward N. (ed.). The Stanford Encyclopedia of Philosophy. Archived from the original on October 14, 2019. Retrieved October 14, 2019.
  56. 1 2 "Computer Science as a Profession". Computing Sciences Accreditation Board. May 28, 1997. Archived from the original on June 17, 2008. Retrieved May 23, 2010.
  57. Committee on the Fundamentals of Computer Science: Challenges and Opportunities, National Research Council (2004). Computer Science: Reflections on the Field, Reflections from the Field. National Academies Press. ISBN   978-0-309-09301-9. Archived from the original on February 18, 2011. Retrieved August 31, 2008.
  58. "CSAB Leading Computer Education". CSAB. August 3, 2011. Archived from the original on January 20, 2019. Retrieved November 19, 2011.
  59. Clay Mathematics Institute P = NP Archived October 14, 2013, at the Wayback Machine
  60. P. Collins, Graham (October 14, 2002). "Claude E. Shannon: Founder of Information Theory". Scientific American. Archived from the original on January 16, 2014. Retrieved December 12, 2014.
  61. Van-Nam Huynh; Vladik Kreinovich; Songsak Sriboonchitta; 2012. Uncertainty Analysis in Econometrics with Applications. Springer Science & Business Media. p. 63. ISBN   978-3-642-35443-4.
  62. Phillip A. Laplante, (2010). Encyclopedia of Software Engineering Three-Volume Set (Print). CRC Press. p. 309. ISBN   978-1-351-24926-3.
  63. Muhammad H. Rashid, (2016). SPICE for Power Electronics and Electric Power. CRC Press. p. 6. ISBN   978-1-4398-6047-2.
  64. "What is an integrated circuit (IC)? A vital component of modern electronics". WhatIs.com. Archived from the original on November 15, 2021. Retrieved November 15, 2021.
  65. A. Thisted, Ronald (April 7, 1997). "Computer Architecture" (PDF). The University of Chicago. Archived (PDF) from the original on October 9, 2022.
  66. Jiacun Wang, (2017). Real-Time Embedded Systems. Wiley. p. 12. ISBN   978-1-119-42070-5.
  67. Gordana Dodig-Crnkovic; Raffaela Giovagnoli, (2013). Computing Nature: Turing Centenary Perspective. Springer Science & Business Media. p. 247. ISBN   978-3-642-37225-4.
  68. Simon Elias Bibri (2018). Smart Sustainable Cities of the Future: The Untapped Potential of Big Data Analytics and Context-Aware Computing for Advancing Sustainability. Springer. p. 74. ISBN   978-3-319-73981-6.
  69. Katz, Jonathan (2008). Introduction to modern cryptography. Yehuda Lindell. Boca Raton: Chapman & Hall/CRC. ISBN   978-1-58488-551-1. OCLC   137325053. Archived from the original on May 6, 2022. Retrieved November 17, 2021.
  70. Rapaport, William J. (September 20, 2013). "What Is Computation?". State University of New York at Buffalo. Archived from the original on February 14, 2001. Retrieved August 31, 2013.
  71. B. Jack Copeland, (2012). Alan Turing's Electronic Brain: The Struggle to Build the ACE, the World's Fastest Computer. OUP Oxford. p. 107. ISBN   978-0-19-960915-4.
  72. Charles W. Herbert, (2010). An Introduction to Programming Using Alice 2.2. Cengage Learning. p. 122. ISBN   0-538-47866-7.
  73. Md. Rezaul Karim; Sridhar Alla, (2017). Scala and Spark for Big Data Analytics: Explore the concepts of functional programming, data streaming, and machine learning. Packt Publishing Ltd. p. 87. ISBN   978-1-78355-050-0.
  74. Lex Sheehan, (2017). Learning Functional Programming in Go: Change the way you approach your applications using functional programming in Go. Packt Publishing Ltd. p. 16. ISBN   978-1-78728-604-7.
  75. Evelio Padilla, (2015). Substation Automation Systems: Design and Implementation. Wiley. p. 245. ISBN   978-1-118-98730-8.
  76. "Multi-Paradigm Programming Language". MDN Web Docs. Mozilla Foundation. Archived from the original on August 21, 2013.
  77. Meyer, Bertrand (April 2009). "Viewpoint: Research evaluation for computer science". Communications of the ACM. 52 (4): 31–34. doi:10.1145/1498765.1498780. S2CID 8625066.
  78. Patterson, David (August 1999). "Evaluating Computer Scientists and Engineers For Promotion and Tenure". Computing Research Association. Archived from the original on July 22, 2015. Retrieved July 19, 2015.
  79. Fortnow, Lance (August 2009). "Viewpoint: Time for Computer Science to Grow Up". Communications of the ACM. 52 (8): 33–35. doi: 10.1145/1536616.1536631 .

Further reading