Parallel Processing Letters

Scope

Parallel Processing Letters publishes short papers in the field of parallel processing. The journal has a wide scope and covers a broad range of topics within the field. [1]

Abstracting and indexing

The journal is abstracted and indexed in several bibliographic databases. [2]

Related Research Articles

A distributed system is a system whose components are located on different networked computers, which communicate and coordinate their actions by passing messages to one another. Distributed computing is a field of computer science that studies distributed systems.
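
To make the message-passing idea concrete (this illustration is not part of the article, and all names in it are hypothetical), a minimal Python sketch of two components that coordinate purely by exchanging messages, with no shared state, might look as follows:

    # Minimal sketch: two processes coordinate only by passing messages
    # over queues; there is no shared state between them.
    from multiprocessing import Process, Queue

    def worker(inbox: Queue, outbox: Queue) -> None:
        msg = inbox.get()              # block until a message arrives
        outbox.put(f"ack: {msg}")      # coordinate by replying

    if __name__ == "__main__":
        to_worker, from_worker = Queue(), Queue()
        p = Process(target=worker, args=(to_worker, from_worker))
        p.start()
        to_worker.put("request")       # send a message...
        print(from_worker.get())       # ...and wait for the reply
        p.join()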

Parallel computing is a type of computation in which many calculations or processes are carried out simultaneously. Large problems can often be divided into smaller ones, which can then be solved at the same time. There are several different forms of parallel computing: bit-level, instruction-level, data, and task parallelism. Parallelism has long been employed in high-performance computing, but has gained broader interest due to the physical constraints preventing frequency scaling. As power consumption by computers has become a concern in recent years, parallel computing has become the dominant paradigm in computer architecture, mainly in the form of multi-core processors.
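
As an illustration of data parallelism (not taken from the article; the problem and all names are invented for the example), a large computation can be split into chunks that separate worker processes evaluate simultaneously:

    # Minimal sketch of data parallelism: one large problem (summing
    # squares up to n) is divided into chunks solved at the same time.
    from multiprocessing import Pool

    def sum_of_squares(chunk: range) -> int:
        return sum(i * i for i in chunk)

    if __name__ == "__main__":
        n, workers = 1_000_000, 4
        chunks = [range(k, n, workers) for k in range(workers)]
        with Pool(workers) as pool:
            partials = pool.map(sum_of_squares, chunks)  # run in parallel
        print(sum(partials))  # combine the partial results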

Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on mathematical aspects of computer science such as the theory of computation, lambda calculus, and type theory.

Reversible computing is any model of computation where the computational process, to some extent, is time-reversible. In a model of computation that uses deterministic transitions from one state of the abstract machine to another, a necessary condition for reversibility is that the mapping from states to their successors must be one-to-one. Reversible computing is a form of unconventional computing.
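
The one-to-one condition can be checked directly on a toy model (an illustration, not from the article): the CNOT-style update (a, b) → (a, a XOR b) sends distinct two-bit states to distinct successors and is its own inverse:

    # Toy check of reversibility: the update (a, b) -> (a, a XOR b) is a
    # one-to-one mapping on two-bit states and undoes itself when applied twice.
    from itertools import product

    def step(state):
        a, b = state
        return (a, a ^ b)

    states = list(product([0, 1], repeat=2))
    successors = [step(s) for s in states]
    assert len(set(successors)) == len(states)      # mapping is one-to-one
    assert all(step(step(s)) == s for s in states)  # step is self-inverse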

Hardware acceleration is the use of computer hardware designed to perform specific functions more efficiently when compared to software running on a general-purpose central processing unit (CPU). Any transformation of data that can be calculated in software running on a generic CPU can also be calculated in custom-made hardware, or in some mix of both.

Computational engineering is an emerging discipline that deals with the development and application of computational models for engineering. At this time, various approaches are summarized under the term, including the use of computational geometry and virtual design for engineering tasks, often coupled with a simulation-driven approach. In computational engineering, algorithms solve mathematical and logical models that describe engineering challenges, sometimes coupled with some aspect of AI, specifically reinforcement learning.

Selim G. Akl is a professor at Queen's University in the Queen's School of Computing, where he leads the Parallel and Unconventional Computation Group. His research interests are primarily in the area of algorithm design and analysis, in particular for problems in parallel computing and unconventional computing.

Lateral computing is a lateral thinking approach to solving computing problems. Lateral thinking, popularized by Edward de Bono, is a technique for generating creative ideas and solving problems. Similarly, applying lateral-computing techniques to a problem can make it much easier to arrive at a computationally inexpensive, easy-to-implement, efficient, innovative, or unconventional solution.

Algorithmica is a monthly peer-reviewed scientific journal focusing on research on and the application of computer science algorithms. The journal was established in 1986 and is published by Springer Science+Business Media. The editor-in-chief is Mohammad Hajiaghayi. Subject coverage includes sorting, searching, data structures, computational geometry, linear programming, VLSI, distributed computing, parallel processing, computer-aided design, robotics, graphics, database design, and software tools.

Informatics is the study of computational systems. According to the ACM Europe Council and Informatics Europe, informatics is synonymous with computer science and computing as a profession, in which the central notion is transformation of information. In other countries, the term "informatics" is used with a different meaning in the context of library science, in which case it is synonymous with data storage and retrieval.

In computing, algorithmic skeletons, or parallelism patterns, are a high-level parallel programming model for parallel and distributed computing.
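
As an illustrative sketch (not from the article; map_skeleton is a hypothetical name), a "map" skeleton can be written as a higher-order function that hides the parallel plumbing, so the programmer supplies only sequential code:

    # Sketch of a "map" algorithmic skeleton: the caller provides a
    # sequential function; the skeleton hides how work is distributed.
    from concurrent.futures import ProcessPoolExecutor

    def map_skeleton(f, items):
        with ProcessPoolExecutor() as pool:
            return list(pool.map(f, items))  # parallelism is internal

    def square(x):
        return x * x

    if __name__ == "__main__":
        print(map_skeleton(square, range(8)))  # [0, 1, 4, ..., 49]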

Information Processing Letters is a peer-reviewed scientific journal in the field of computer science, published by Elsevier. The aim of the journal is to enable fast dissemination of results in the field of information processing in the form of short papers. Submissions are limited to nine double-spaced pages.

Natural computing, also called natural computation, is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.

Data-intensive computing is a class of parallel computing applications which use a data-parallel approach to process large volumes of data, typically terabytes or petabytes in size, commonly referred to as big data. Computing applications which devote most of their execution time to computational requirements are deemed compute-intensive, whereas computing applications which require large volumes of data and devote most of their processing time to I/O and manipulation of data are deemed data-intensive.
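
As a rough sketch of the data-parallel, I/O-dominated style (illustrative only; the file name big_data.log is hypothetical), a large file can be partitioned by byte range so that each worker spends its time reading and scanning its own slice:

    # Sketch of data-intensive processing: partition a large file by byte
    # range and count newlines in each slice; runtime is dominated by I/O.
    import os
    from multiprocessing import Pool

    PATH = "big_data.log"  # hypothetical large input file

    def count_newlines(part):
        start, length = part
        with open(PATH, "rb") as f:
            f.seek(start)
            return f.read(length).count(b"\n")

    if __name__ == "__main__":
        size, nparts = os.path.getsize(PATH), 4
        step = size // nparts + 1
        parts = [(i * step, step) for i in range(nparts)]
        with Pool(nparts) as pool:
            print(sum(pool.map(count_newlines, parts)))  # total line count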

The following outline is provided as an overview of and topical guide to formal science.

This glossary of computer science is a list of definitions of terms and concepts used in computer science, its sub-disciplines, and related fields, including terms relevant to software, data science, and computer programming.

References

  1. "Scope" . Retrieved 2022-05-15.
  2. "Abstracted & Indexed in" . Retrieved 2022-05-15.