Metacomputing


Metacomputing is all computing and computing-oriented activity that applies computing knowledge (science and technology) to the research, development and application of different types of computing. It may also deal with numerous types of computing applications, such as industry, business, management and human-related management. Emerging fields of metacomputing focus on the methodological and technological aspects of developing large computer networks and grids, such as the Internet, intranets and other territorially distributed computer networks for special purposes.[1]


Uses

In computer science

Metacomputing, as the computing of computing, includes the organization of large computer networks, the choice of design criteria (for example, a peer-to-peer or a centralized solution) and the development of metacomputing software (middleware, metaprogramming). In specific domains, the term metacomputing describes software meta-layers: networked platforms for the development of user-oriented calculations, for example in computational physics and bioinformatics.

Here, serious scientific problems of system and network complexity emerge, related not only to domain-dependent complexity but also to the systemic meta-complexity of computer network infrastructures.
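A minimal sketch of such a software meta-layer is given below. All names (MetaLayer, register_backend, submit, run_on_grid) are hypothetical and stand in for real middleware: user code only states what calculation to run, and the meta-layer decides whether it runs locally or on a (simulated) networked backend.

```python
# Hypothetical sketch of a metacomputing "meta-layer": a thin software layer
# that lets user-oriented calculations be submitted without the user knowing
# which networked resource (local machine, cluster, grid) will run them.
from typing import Any, Callable, Dict


class MetaLayer:
    """Routes named calculations to whichever compute backend is registered."""

    def __init__(self) -> None:
        self._backends: Dict[str, Callable[[Callable, tuple], Any]] = {}

    def register_backend(self, name: str, runner: Callable[[Callable, tuple], Any]) -> None:
        self._backends[name] = runner

    def submit(self, task: Callable, args: tuple, backend: str = "local") -> Any:
        # The meta-layer decides *where* the task runs; the user only says *what* to run.
        return self._backends[backend](task, args)


def run_locally(task: Callable, args: tuple) -> Any:
    return task(*args)


def run_on_grid(task: Callable, args: tuple) -> Any:
    # Placeholder: a real grid backend would serialize the task and hand it to
    # remote middleware (e.g. a job scheduler); here the dispatch is simulated.
    print(f"[grid] dispatching {task.__name__}{args} to a remote node (simulated)")
    return task(*args)


if __name__ == "__main__":
    layer = MetaLayer()
    layer.register_backend("local", run_locally)
    layer.register_backend("grid", run_on_grid)

    # A user-oriented calculation, e.g. from computational physics.
    def kinetic_energy(mass: float, velocity: float) -> float:
        return 0.5 * mass * velocity ** 2

    print(layer.submit(kinetic_energy, (2.0, 3.0)))                  # runs locally
    print(layer.submit(kinetic_energy, (2.0, 3.0), backend="grid"))  # "runs" on the grid
```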

Metacomputing is also a useful descriptor for self-referential programming systems. These systems often function as fifth-generation computer languages and require an underlying metaprocessor software operating system in order to operate. Metacomputing typically occurs in an interpreted or runtime-compiled system, since the changing nature of information during processing can produce an unpredictable compute state throughout the existence of the metacomputer (the information state operated upon by the metacomputing platform).
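The toy example below illustrates this interpreted, runtime-compiled style in general terms; the generated scale function and the use of Python's built-in compile and exec are purely illustrative and do not follow any particular metacomputing platform.

```python
# Toy illustration of a self-referential, runtime-compiled style: the program
# writes new source code at run time, compiles it, and executes it, so its
# "compute state" is not fixed in advance.
def make_scaler(factor: float) -> str:
    # Generate source for a new function; the factor is only known at run time.
    return f"def scale(x):\n    return x * {factor}\n"


state: dict = {}
for factor in (2.0, 3.5):
    source = make_scaler(factor)
    code = compile(source, filename="<generated>", mode="exec")  # runtime compilation
    exec(code, state)                                            # (re)defines state["scale"]
    print(state["scale"](10))  # 20.0, then 35.0: behaviour changes as the program rewrites itself
```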

In socio-cognitive engineering

From the human and social perspectives, metacomputing focuses especially on human-computer software, cognitive interrelations and interfaces, the possibilities of developing intelligent computer grids for cooperation between human organizations, and ubiquitous computing technologies. In particular, it relates to the development of software infrastructures for the computational modeling and simulation of cognitive architectures for various decision support systems.

In systemics and from philosophical perspective

Metacomputing refers to the general problem of the computability of human knowledge: the limits of transforming human knowledge and individual thinking into the form of computer programs. These and similar questions are also of interest to mathematical psychology.


Related Research Articles

<span class="mw-page-title-main">Computing</span> Activity involving calculations or computing machinery

Computing is any goal-oriented activity requiring, benefiting from, or creating computing machinery. It includes the study of and experimentation with algorithmic processes, and the development of both hardware and software. Computing has scientific, engineering, mathematical, technological and social aspects. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology and software engineering.

Grid computing is the use of widely distributed computer resources to reach a common goal. A computing grid can be thought of as a distributed system with non-interactive workloads that involve many files. Grid computing is distinguished from conventional high-performance computing systems such as cluster computing in that grid computers have each node set to perform a different task/application. Grid computers also tend to be more heterogeneous and geographically dispersed than cluster computers. Although a single grid can be dedicated to a particular application, commonly a grid is used for a variety of purposes. Grids are often constructed with general-purpose grid middleware software libraries. Grid sizes can be quite large.
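As an illustration of this node-per-task style of scheduling, the sketch below assigns a different non-interactive job to each of several heterogeneous, geographically dispersed nodes. The Node class, the job functions and the scheduler are invented for the example and do not correspond to any real grid middleware.

```python
# Schematic picture of the grid idea: heterogeneous, dispersed nodes, each
# assigned its own non-interactive task by lightweight "middleware".
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Node:
    name: str
    location: str   # grids are geographically dispersed
    cores: int      # ...and heterogeneous


def render_frames() -> str:
    return "rendered 1000 frames"


def fold_proteins() -> str:
    return "simulated a protein-folding batch"


def index_files() -> str:
    return "indexed an archive of input files"


def schedule(nodes: List[Node], tasks: List[Callable[[], str]]) -> None:
    # Unlike a cluster running one parallel job, each grid node gets its own task.
    for node, task in zip(nodes, tasks):
        print(f"{node.name} ({node.location}, {node.cores} cores): {task()}")


schedule(
    [Node("alpha", "Chicago", 64), Node("beta", "Kraków", 16), Node("gamma", "Tokyo", 8)],
    [render_frames, fold_proteins, index_files],
)
```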

Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.

<span class="mw-page-title-main">Theoretical computer science</span> Subfield of computer science and mathematics

Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on mathematical aspects of computer science such as the theory of computation, lambda calculus, and type theory.

A computer scientist is a scholar who specializes in the academic study of computer science.

Autonomic computing (AC) refers to distributed computing resources with self-managing characteristics, adapting to unpredictable changes while hiding intrinsic complexity from operators and users. Initiated by IBM in 2001, this initiative ultimately aimed to develop computer systems capable of self-management, to overcome the rapidly growing complexity of computing systems management, and to reduce the barrier that complexity poses to further growth.
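Self-managing systems of this kind are often described as a monitor-analyze-plan-execute loop. The toy resource pool below, with an invented load signal and thresholds, sketches that idea in general terms rather than following any specific autonomic framework.

```python
# Toy monitor-analyze-plan-execute loop: the pool watches its own load and
# scales itself without operator intervention. Thresholds are illustrative.
import random


class AutonomicPool:
    def __init__(self, workers: int = 2) -> None:
        self.workers = workers

    def monitor(self) -> float:
        # Stand-in for sampling the real per-worker load.
        return random.uniform(0.0, 1.5) / self.workers

    def analyze_and_plan(self, load: float) -> int:
        if load > 0.8:
            return +1            # plan: scale out
        if load < 0.2 and self.workers > 1:
            return -1            # plan: scale in
        return 0                 # plan: do nothing

    def execute(self, delta: int) -> None:
        self.workers += delta    # self-management hides this from the operator


pool = AutonomicPool()
for step in range(5):
    load = pool.monitor()
    pool.execute(pool.analyze_and_plan(load))
    print(f"step {step}: load={load:.2f}, workers={pool.workers}")
```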

Social computing is an area of computer science that is concerned with the intersection of social behavior and computational systems. It is based on creating or recreating social conventions and social contexts through the use of software and technology. Thus, blogs, email, instant messaging, social network services, wikis, social bookmarking and other instances of what is often called social software illustrate ideas from social computing.

Human-centered computing (HCC) studies the design, development, and deployment of mixed-initiative human-computer systems. It emerged from the convergence of multiple disciplines concerned both with understanding human beings and with the design of computational artifacts. Human-centered computing is closely related to human-computer interaction and information science. Human-centered computing is usually concerned with systems and practices of technology use, while human-computer interaction is more focused on ergonomics and the usability of computing artifacts, and information science is focused on practices surrounding the collection, manipulation, and use of information.

<span class="mw-page-title-main">David P. Anderson</span> American research scientist (born 1955)

David Pope Anderson is an American research scientist at the Space Sciences Laboratory, at the University of California, Berkeley, and an adjunct professor of computer science at the University of Houston. Anderson leads the SETI@home, BOINC, Bossa and Bolt software projects.

Model-driven engineering (MDE) is a software development methodology that focuses on creating and exploiting domain models, which are conceptual models of all the topics related to a specific problem. Hence, it highlights and aims at abstract representations of the knowledge and activities that govern a particular application domain, rather than at computing concepts.
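The sketch below illustrates the basic model-driven idea under simple assumptions: a plain dictionary stands in for the domain model, and a small generator derives executable code from it. Both the model format and the generator are invented for illustration, not part of any MDE toolchain.

```python
# Minimal model-to-code sketch: describe the domain abstractly, then derive
# concrete source code (here, dataclasses) from that description.
domain_model = {
    "Customer": ["name", "email"],
    "Order": ["order_id", "total"],
}


def generate_class(entity: str, fields: list) -> str:
    # Transform one entity of the abstract domain model into concrete code.
    lines = ["from dataclasses import dataclass", "", "@dataclass", f"class {entity}:"]
    lines += [f"    {field}: str" for field in fields]
    return "\n".join(lines) + "\n"


for entity, fields in domain_model.items():
    print(generate_class(entity, fields))
```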

<span class="mw-page-title-main">Charlie Catlett</span> American computer scientist

Charlie Catlett is a senior computer scientist at Argonne National Laboratory and a visiting senior fellow at the Mansueto Institute for Urban Innovation at the University of Chicago. From 2020 to 2022 he was a senior research scientist at the University of Illinois Discovery Partners Institute. He was previously a senior computer scientist at Argonne National Laboratory and a senior fellow in the Computation Institute, a joint institute of Argonne National Laboratory and The University of Chicago, and a senior fellow at the University of Chicago's Harris School of Public Policy.

The following outline is provided as an overview of and topical guide to computer programming:

<span class="mw-page-title-main">Volunteer computing</span> System where users donate computer resources to contribute to research

Volunteer computing is a type of distributed computing in which people donate their computers' unused resources to a research-oriented project, sometimes in exchange for credit points. The fundamental idea behind it is that a modern desktop computer is powerful enough to perform billions of operations a second, but for most users only 10–15% of that capacity is used. Typical uses like basic word processing or web browsing leave the computer mostly idle.
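The sketch below shows the general shape of a volunteer-computing client under invented assumptions: a stubbed idle check, a locally generated work unit and a simulated report step. Real clients such as BOINC are far more elaborate.

```python
# Bare-bones volunteer-computing client: fetch a work unit, compute only while
# the machine looks idle, then report the result. Everything here is simulated.
import random
import time


def machine_is_idle() -> bool:
    # Stand-in for a real CPU-usage check; most desktops are idle most of the time.
    return random.random() < 0.8


def fetch_work_unit() -> range:
    # A real client would download this from the project's server.
    return range(1, 50_000)


def compute(work: range) -> int:
    # The donated computation: a simple sum standing in for real science.
    return sum(n * n for n in work)


def report(result: int) -> None:
    print(f"reporting result {result} back to the project server (simulated)")


work = fetch_work_unit()
while True:
    if machine_is_idle():
        report(compute(work))
        break
    time.sleep(1)  # back off while the owner is using the machine
```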

<span class="mw-page-title-main">Róbert Lovas</span> Hungarian computer scientist

Róbert Lovas is a Hungarian computer scientist at SZTAKI, Budapest, Hungary.

The Polish Grid Infrastructure (PL-Grid) is a nationwide computing infrastructure built in 2009–2011 under the scientific project PL-Grid: Polish Infrastructure for Supporting Computational Science in the European Research Space. Its purpose was to enable scientific research based on advanced computer simulations and large-scale computations using computer clusters, and to provide convenient access to computer resources for research teams, including those outside the communities in which high-performance computing centers operate.

Computation offloading is the transfer of resource-intensive computational tasks to a separate processor, such as a hardware accelerator, or to an external platform, such as a cluster, grid, or cloud. Offloading to a coprocessor can be used to accelerate applications such as image rendering and mathematical calculations. Offloading computing to an external platform over a network can provide computing power and overcome the hardware limitations of a device, such as limited computational power, storage, and energy.
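A minimal sketch of network offloading with a local fallback is given below. The endpoint URL and JSON payload format are placeholders rather than any real offloading service; the point is only the shape of the decision: try the external platform, fall back to the device if it is unreachable.

```python
# Simple offloading sketch: send a heavy task to an external platform if
# possible, otherwise compute it locally on the (constrained) device.
import json
import urllib.request


def heavy_task(n: int) -> int:
    # Resource-intensive work a constrained device would rather not do itself.
    return sum(i * i for i in range(n))


def offload(n: int, endpoint: str = "http://compute.example.org/square-sum") -> int:
    payload = json.dumps({"n": n}).encode()
    try:
        req = urllib.request.Request(
            endpoint, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req, timeout=2) as resp:
            return json.loads(resp.read())["result"]
    except OSError:
        # Network or platform unavailable: fall back to local computation.
        return heavy_task(n)


print(offload(100_000))
```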

References

  1. Smarr, Larry; Catlett, Charles E. (1992). "Metacomputing". Communications of the ACM. 35 (6): 44. doi:10.1145/129888.129890.
