Ethnocomputing is the study of the interactions between computing and culture. It is carried out through theoretical analysis, empirical investigation, and design implementation. It includes research on the impact of computing on society, as well as the reverse: how cultural, historical, personal, and societal origins and surroundings cause and affect the innovation, development, diffusion, maintenance, and appropriation of computational artifacts or ideas. From the ethnocomputing perspective, no computational technology is culturally "neutral," and no cultural practice is a computational void. [1] Rather than treating culture as a hindrance to software engineering, ethnocomputing regards it as a resource for innovation and design.
Social categories for ethnocomputing include:
Technical categories in ethnocomputing include:
Ethnocomputing has its origins in ethnomathematics. There are a large number of studies in ethnomathematics that could be considered ethnocomputing as well (e.g., Eglash (1999) and Ascher & Ascher (1981)). The idea of a separate field was introduced in 1992 by Anthony Petrillo in his Ph.D. dissertation, Responsive Evaluation of Mathematics Education in a Community of Jos, Nigeria (State University of New York at Buffalo), and elaborated in March 1994 in Ethnocomputers in Nigerian Computer Education, a paper presented at the 31st Annual Conference of the Mathematical Association of Nigeria. Just as computer science is nowadays considered a field of research distinct from mathematics, ethnocomputing is considered a research topic distinct from ethnomathematics. Some aspects of ethnocomputing that have their roots in ethnomathematics are listed below:
Computer science is the study of algorithmic processes, computational machines and computation itself. As a discipline, computer science spans a range of topics from theoretical studies of algorithms, computation and information to the practical issues of implementing computational systems in hardware and software.
Computation is any type of calculation that includes both arithmetical and non-arithmetical steps and follows a well-defined model.
A Turing machine is a mathematical model of computation that defines an abstract machine that manipulates symbols on a strip of tape according to a table of rules. Despite the model's simplicity, given any computer algorithm, a Turing machine capable of simulating that algorithm's logic can be constructed.
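Such a table of rules can be made concrete with a short Python sketch (the `run_tm` simulator and the bit-flipping machine below are hypothetical illustrations, not taken from the text):

```python
# Minimal Turing machine simulator. The rule table maps
# (state, symbol) -> (symbol to write, head move, next state).

def run_tm(rules, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: flip every bit, halt at the first blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(flip, "1011"))  # -> 0100
```

Any algorithm can in principle be encoded as such a rule table, although practical machines require far larger tables than this example.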
In computability theory, a system of data-manipulation rules is said to be Turing-complete or computationally universal if it can be used to simulate any Turing machine. This means that this system is able to recognize or decide other data-manipulation rule sets. Turing completeness is used as a way to express the power of such a data-manipulation rule set. Virtually all programming languages today are Turing-complete. The concept is named after English mathematician and computer scientist Alan Turing.
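How small a Turing-complete rule set can be is illustrated by Brainfuck, a well-known esoteric language with only eight commands. The interpreter below is a sketch, not a definitive implementation; given an unbounded tape, the language it interprets can simulate any Turing machine:

```python
def bf(code, max_steps=100000):
    """Interpret Brainfuck, a famously tiny Turing-complete language."""
    tape, ptr, pc, out = [0] * 30000, 0, 0, []
    jumps, stack = {}, []
    for i, c in enumerate(code):       # pre-match the bracket pairs
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(code) and max_steps:
        c = code[pc]
        max_steps -= 1
        if c == ">":   ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == "[" and tape[ptr] == 0: pc = jumps[pc]
        elif c == "]" and tape[ptr] != 0: pc = jumps[pc]
        pc += 1
    return "".join(out)

# Compute 8 * 8 + 1 = 65 and print it as a character.
print(bf("++++++++[>++++++++<-]>+."))  # prints A
```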
A pattern is a regularity in the world, in human-made design, or in abstract ideas. As such, the elements of a pattern repeat in a predictable manner. A geometric pattern is a kind of pattern formed of geometric shapes and typically repeated like a wallpaper design.
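The predictable repetition of a wallpaper-style geometric pattern can be sketched in a few lines of Python (the motif chosen here is arbitrary):

```python
# A geometric pattern as repetition of a simple two-row motif.
motif = ["/\\", "\\/"]
for row in motif * 2:      # repeat the motif vertically
    print(row * 4)         # repeat each row horizontally
```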
In computer science, a universal Turing machine (UTM) is a Turing machine that simulates an arbitrary Turing machine on arbitrary input. The universal machine essentially achieves this by reading both the description of the machine to be simulated as well as the input to that machine from its own tape. Alan Turing introduced the idea of such a machine in 1936–1937. This principle is considered to be the origin of the idea of a stored-program computer used by John von Neumann in 1946 for the "Electronic Computing Instrument" that now bears von Neumann's name: the von Neumann architecture.
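The stored-program idea that grew out of the universal machine can be sketched as a toy fetch-decode-execute loop in which instructions and data share one memory (the instruction set below is a hypothetical illustration, not the EDVAC design):

```python
# Toy stored-program machine: the program and its data live in the
# same memory, the defining trait of a stored-program computer.

def run(memory):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, arg = memory[pc]       # fetch from the same memory as data
        pc += 1
        if op == "LOAD":    acc = memory[arg]
        elif op == "ADD":   acc += memory[arg]
        elif op == "STORE": memory[arg] = acc
        elif op == "HALT":  return memory

mem = [
    ("LOAD", 4),   # 0: acc <- mem[4]
    ("ADD", 5),    # 1: acc <- acc + mem[5]
    ("STORE", 6),  # 2: mem[6] <- acc
    ("HALT", 0),   # 3: stop
    2, 3, 0,       # 4, 5, 6: data cells
]
print(run(mem)[6])  # -> 5
```

Because code and data share one memory, a program here could in principle rewrite its own instructions, just as a universal Turing machine treats a machine description as ordinary tape content.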
A New Kind of Science is a book by Stephen Wolfram, published by his company Wolfram Research under the imprint Wolfram Media in 2002. It contains an empirical and systematic study of computational systems such as cellular automata. Wolfram calls these systems simple programs and argues that the scientific philosophy and methods appropriate for the study of simple programs are relevant to other fields of science.
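An elementary cellular automaton, the kind of "simple program" studied in the book, can be simulated in a few lines of Python (the choice of Rule 30 and a 15-cell ring is an arbitrary illustration):

```python
# One step of an elementary cellular automaton on a circular row of
# cells. Each new cell is the rule's output bit for the 3-cell
# neighborhood (left*4 + center*2 + right) above it.

def step(cells, rule=30):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 15
row[7] = 1  # single live cell in the middle
for _ in range(8):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Running this prints the characteristic irregular triangle that Rule 30 grows from a single cell, despite the rule table containing only eight entries.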
Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.
An academic discipline or field of study is a branch of knowledge, taught and researched as part of higher education. A scholar's discipline is commonly defined by the university faculties and learned societies to which they belong and the academic journals in which they publish research.
In mathematics education, ethnomathematics is the study of the relationship between mathematics and culture. Often associated with "cultures without written expression", it may also be defined as "the mathematics which is practised among identifiable cultural groups". It refers to a broad cluster of ideas ranging from distinct numerical and mathematical systems to multicultural mathematics education. The goal of ethnomathematics is to contribute both to the understanding of culture and the understanding of mathematics, and mainly to lead to an appreciation of the connections between the two.
Computability is the ability to solve a problem in an effective manner. It is a key topic of the field of computability theory within mathematical logic and the theory of computation within computer science. The computability of a problem is closely linked to the existence of an algorithm to solve the problem.
The von Neumann architecture—also known as the von Neumann model or Princeton architecture—is a computer architecture based on a 1945 description by John von Neumann and others in the First Draft of a Report on the EDVAC. That document describes a design architecture for an electronic digital computer with these components:
Shafrira Goldwasser is an Israeli-American computer scientist and winner of the Turing Award in 2012. She is the RSA Professor of Electrical Engineering and Computer Science at MIT, a professor of mathematical sciences at the Weizmann Institute of Science, Israel, co-founder and chief scientist of Duality Technologies and the director of the Simons Institute for the Theory of Computing in Berkeley, CA. She was on the Mathematical Sciences jury for the Infosys Prize in 2020.
The history of computer science began long before our modern discipline of computer science, usually appearing in forms like mathematics or physics. Developments in previous centuries alluded to the discipline that we now know as computer science. This progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to the development of a major academic field, massive technological advancement across the Western world, and the basis of a massive worldwide trade and culture.
John Vivian Tucker is a British computer scientist and expert on computability theory, also known as recursion theory. Computability theory is about what can and cannot be computed by people and machines. His work has focused on generalising the classical theory to deal with all forms of discrete/digital and continuous/analogue data; on using the generalisations as formal methods for system design; and on the interface between algorithms and physical equipment.
Ron Eglash is an American cyberneticist, a professor in the School of Information at the University of Michigan with a secondary appointment in the School of Design, and an author widely known for his work in the field of ethnomathematics, which studies the diverse relationships between mathematics and culture.
In education, computational thinking (CT) is a set of problem-solving methods that involve expressing problems and their solutions in ways that a computer could also execute. It involves the mental skills and practices for designing computations that get computers to do jobs for people, and explaining and interpreting the world as a complex of information processes. Those ideas range from basic CT for beginners to advanced CT for experts, and CT includes both CT-in-the-small and CT-in-the-large.
Lateral computing is a lateral thinking approach to solving computing problems. Lateral thinking has been made popular by Edward de Bono. This thinking technique is applied to generate creative ideas and solve problems. Similarly, by applying lateral-computing techniques to a problem, it can become much easier to arrive at a computationally inexpensive, easy to implement, efficient, innovative or unconventional solution.
The philosophy of computer science is concerned with the philosophical questions that arise within the study of computer science. There is still no common understanding of the content, aim, focus, or topic of the philosophy of computer science, despite some attempts to develop a philosophy of computer science like the philosophy of physics or the philosophy of mathematics. Due to the abstract nature of computer programs and the technological ambitions of computer science, many of the conceptual questions of the philosophy of computer science are also comparable to the philosophy of science, and the philosophy of technology.
The following outline is provided as an overview of and topical guide to formal science: