Simplexity


Simplexity is a neologism which proposes a possible complementary relationship between complexity and simplicity.


One of the first formally published instances of the word appeared in the journal Childhood Education (1924), where it was used to discuss issues related to education and psychology. [1]

Simplexity was defined by computer scientists Andrei Broder and Jorge Stolfi as follows: "The simplexity of a problem P is the maximum inefficiency among the reluctant algorithms that solve P. An algorithm is said to be pessimal for a problem P if the best-case inefficiency of A is asymptotically equal to the simplexity of P." [2]
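
Broder and Stolfi illustrate the idea with "reluctant" procedures such as slowsort, which sorts by a "multiply and surrender" strategy: it recursively sorts both halves, moves the maximum to the end, and then sorts everything except that maximum all over again. The following is a minimal Python sketch of that idea, not code from the paper:

    def slowsort(a, i=0, j=None):
        """Deliberately inefficient ("reluctant") sorting of a[i..j] in place."""
        if j is None:
            j = len(a) - 1
        if i >= j:
            return
        m = (i + j) // 2
        slowsort(a, i, m)        # sort the first half
        slowsort(a, m + 1, j)    # sort the second half
        if a[m] > a[j]:          # move the largest element to position j
            a[m], a[j] = a[j], a[m]
        slowsort(a, i, j - 1)    # sort everything except that maximum again

    data = [5, 1, 4, 2, 3]
    slowsort(data)
    print(data)  # [1, 2, 3, 4, 5]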

In 1974 Rustum Roy and Olaf Müller noted simplexity in the structure of ternary compounds: "By dealing with approximately ten ternary structural groupings we can cover the most important structures of science and technology specific to the non-metallics world. It is a remarkable instance of nature's 'simplexity'". [3]

In 2003 Philippe Compain, in an article on the future of synthetic chemistry, stated: "Simplexity may be defined as the combination of simplicity and complexity within the context of a dynamic relationship between means and ends." [4] [5]

Simplexity: Why Simple Things Become Complex (and How Complex Things Can Be Made Simple) by Jeffrey Kluger details ways in which simplexity theory can be applied to multiple disciplines. Kluger offers a look at simplexity at work in economics, sports, linguistics, technology, medicine and human behavior.

Simplexity has been used by Jens Nordvig to describe the particular aim of his analytics firm Exante Data: "A research product that draws on a very complex analytical foundation, but is presented in a very simple and easy to digest manner". [6]

Related Research Articles

Algebraic geometry

Algebraic geometry is a branch of mathematics which uses abstract algebraic techniques, mainly from commutative algebra, to solve geometrical problems. Classically, it studies zeros of multivariate polynomials; the modern approach generalizes this in a few different aspects.

Complexity characterises the behaviour of a system or model whose components interact in multiple ways and follow local rules, leading to non-linearity, randomness, collective dynamics, hierarchy, and emergence.

Sorting algorithm

In computer science, a sorting algorithm is an algorithm that puts elements of a list into an order. The most frequently used orders are numerical order and lexicographical order, and either ascending or descending. Efficient sorting is important for optimizing the efficiency of other algorithms that require input data to be in sorted lists. Sorting is also often useful for canonicalizing data and for producing human-readable output.
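
As a brief illustration of these orders, a few lines of Python (with made-up data):

    numbers = [3, 11, 2]
    words = ["pear", "apple", "banana"]

    print(sorted(numbers))                # numerical, ascending: [2, 3, 11]
    print(sorted(numbers, reverse=True))  # numerical, descending: [11, 3, 2]
    print(sorted(words))                  # lexicographical: ['apple', 'banana', 'pear']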

In philosophy, Occam's razor is the problem-solving principle that recommends searching for explanations constructed with the smallest possible set of elements. It is also known as the principle of parsimony or the law of parsimony. Attributed to William of Ockham, a 14th-century English philosopher and theologian, it is frequently cited as Entia non sunt multiplicanda praeter necessitatem, which translates as "Entities must not be multiplied beyond necessity", although Occam never used these exact words. Popularly, the principle is sometimes inaccurately paraphrased as "The simplest explanation is usually the best one."

Robert Sedgewick (computer scientist)

Robert Sedgewick is an American computer scientist. He is the founding chair and the William O. Baker Professor in Computer Science at Princeton University and was a member of the board of directors of Adobe Systems (1990–2016). He previously served on the faculty at Brown University and has held visiting research positions at Xerox PARC, Institute for Defense Analyses, and INRIA. His research expertise is in algorithm science, data structures, and analytic combinatorics. He is also active in developing the college curriculum in computer science and in harnessing technology to make that curriculum available to anyone seeking the opportunity to learn from it.

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

The point location problem is a fundamental topic of computational geometry. It finds applications in areas that deal with processing geometrical data: computer graphics, geographic information systems (GIS), motion planning, and computer aided design (CAD).

The outline of human–computer interaction is an overview of and topical guide to human–computer interaction.

In inorganic chemistry and materials chemistry, a ternary compound or ternary phase is a chemical compound containing three different elements.

Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated), such as strings or any other data structure. In other words, it is shown within algorithmic information theory that computational incompressibility "mimics" (except for a constant that only depends on the chosen universal programming language) the relations or inequalities found in information theory. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."
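
A rough intuition for incompressibility can be had with an off-the-shelf compressor: a highly regular string admits a much shorter description than typical random data. A small Python sketch using zlib (compressed length is only a crude, illustrative stand-in for algorithmic complexity):

    import os
    import zlib

    regular = b"ab" * 5000       # highly structured, 10,000 bytes
    random_ = os.urandom(10000)  # typical random data, 10,000 bytes

    print(len(zlib.compress(regular)))  # small: the structure is compressible
    print(len(zlib.compress(random_)))  # close to 10,000: essentially incompressible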

Unconventional computing is computing by any of a wide range of new or unusual methods. It is also known as alternative computing.

Information-based complexity (IBC) studies optimal algorithms and computational complexity for the continuous problems that arise in physical science, economics, engineering, and mathematical finance. IBC has studied such continuous problems as path integration, partial differential equations, systems of ordinary differential equations, nonlinear equations, integral equations, fixed points, and very-high-dimensional integration. All these problems involve functions of a real or complex variable. Since one can never obtain a closed-form solution to the problems of interest one has to settle for a numerical solution. Since a function of a real or complex variable cannot be entered into a digital computer, the solution of continuous problems involves partial information. To give a simple illustration, in the numerical approximation of an integral, only samples of the integrand at a finite number of points are available. In the numerical solution of partial differential equations the functions specifying the boundary conditions and the coefficients of the differential operator can only be sampled. Furthermore, this partial information can be expensive to obtain. Finally the information is often contaminated by noise.
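
The integration example can be made concrete: with only a finite number of samples of the integrand, a rule such as the composite trapezoidal rule produces an approximation whose accuracy depends on how much the samples reveal about the function. A minimal Python sketch (the integrand and sample count are arbitrary choices for illustration):

    import math

    def trapezoid(f, a, b, n):
        """Approximate the integral of f over [a, b] from n+1 equally spaced samples."""
        h = (b - a) / n
        total = 0.5 * (f(a) + f(b))
        for k in range(1, n):
            total += f(a + k * h)
        return h * total

    # Only point samples of sin are available to the method, not a closed form.
    approx = trapezoid(math.sin, 0.0, math.pi, 16)
    print(approx)             # close to the exact value 2.0
    print(abs(approx - 2.0))  # the error shrinks as more samples are taken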

Rustum Roy was a physicist, born in India, who became a professor at Pennsylvania State University and was a leader in materials research. As an advocate for interdisciplinarity, he initiated a movement of materials research societies and, outside of his multiple areas of scientific and engineering expertise, wrote impassioned pleas about the need for a fusion of religion and science and humanistic causes.

Alexey Ivakhnenko

Alexey Grigoryevich Ivakhnenko was a Soviet and Ukrainian mathematician most famous for developing the group method of data handling (GMDH), a method of inductive statistical learning, for which he is sometimes referred to as the "Father of deep learning".

In computational complexity theory, the average-case complexity of an algorithm is the amount of some computational resource used by the algorithm, averaged over all possible inputs. It is frequently contrasted with worst-case complexity which considers the maximal complexity of the algorithm over all possible inputs.
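
For instance, linear search over a list of length n inspects about n/2 elements on average for a uniformly random target that is present, but n elements in the worst case. A short Python sketch counting comparisons (the list size and trial count are arbitrary):

    import random

    def comparisons_linear_search(data, target):
        """Return the number of comparisons linear search makes before stopping."""
        count = 0
        for value in data:
            count += 1
            if value == target:
                break
        return count

    n = 1000
    data = list(range(n))

    worst = comparisons_linear_search(data, n)  # target absent: n comparisons
    average = sum(comparisons_linear_search(data, random.choice(data))
                  for _ in range(10000)) / 10000

    print(worst)    # 1000
    print(average)  # roughly n/2, about 500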

Human–computer interaction

Human–computer interaction (HCI) is research in the design and the use of computer technology, which focuses on the interfaces between people (users) and computers. HCI researchers observe the ways humans interact with computers and design technologies that allow humans to interact with computers in novel ways. A device that allows interaction between a human being and a computer is known as a "human–computer interface".

The eXtensible Host Controller Interface (xHCI) is a technical specification that provides a detailed framework for the functioning of a computer's host controller for Universal Serial Bus (USB). Known alternately as the USB 3.0 host controller specification, xHCI is designed to be backward compatible, supporting a wide range of USB devices from older USB 1.x to the more recent USB 3.x versions.

Symbolic regression

Symbolic regression (SR) is a type of regression analysis that searches the space of mathematical expressions to find the model that best fits a given dataset, both in terms of accuracy and simplicity.
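
A toy version of the idea is to enumerate a small set of candidate expressions and score each by prediction error plus a complexity penalty, keeping the best trade-off. The Python sketch below is deliberately simplified (the candidate set, penalty weight, and data are invented for illustration; real symbolic regression systems search far larger expression spaces, typically with genetic programming):

    # Candidate models: (description, complexity, function)
    candidates = [
        ("x",       1, lambda x: x),
        ("x + 1",   2, lambda x: x + 1),
        ("2*x",     2, lambda x: 2 * x),
        ("x*x",     2, lambda x: x * x),
        ("x*x + 1", 3, lambda x: x * x + 1),
        ("2*x + 1", 3, lambda x: 2 * x + 1),
    ]

    # Data generated by the unknown target x**2 + 1.
    xs = [0, 1, 2, 3, 4]
    ys = [1, 2, 5, 10, 17]

    def score(complexity, f, penalty=0.1):
        error = sum((f(x) - y) ** 2 for x, y in zip(xs, ys))
        return error + penalty * complexity  # balance accuracy and simplicity

    best = min(candidates, key=lambda c: score(c[1], c[2]))
    print(best[0])  # "x*x + 1": the most accurate model that is still simple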

References

  1. Childhood Education (1924). Association for Childhood Education International.
  2. Broder, Andrei; Stolfi, Jorge. "Pessimal Algorithms and Simplexity Analysis."
  3. Roy, Rustum; Müller, Olaf (1974). The Major Ternary Structural Families, pp. 3–4. Springer-Verlag. ISBN 9780387064307.
  4. Compain, Philippe. "The challenge of simplexity. The simple and the complex in organic synthesis." Act. Chim., 2003, 263–264, pp. 129–134.
  5. Compain, Philippe; et al. "Looking forward: a glance into the future of organic chemistry." New J. Chem., 2006, 30, pp. 823–831.
  6. "Letter: Simplexity in Research". Exante Data. Retrieved 2018-05-03.

Further reading

Books

Dan Geesin used the term 'simplexity' in his 2002 essay 'The Melancholy of the Set Square' to describe how technology creates distance by placing complex interfaces in front of a simple task, such as withdrawing money from a cash machine. He describes how each additional interface in the chain leaves more room for error: the more interfaces, the more potential problems.
