Information-based complexity (IBC) studies optimal algorithms and computational complexity for the continuous problems that arise in physical science, economics, engineering, and mathematical finance.
Extensive bibliographies may be found in the monographs N (1988), TW (1980), TWW (1988) and TW (1998). The IBC website has a searchable database of some 730 items.
Algebraic geometry is a branch of mathematics which uses abstract algebraic techniques, mainly from commutative algebra, to solve geometrical problems. Classically, it studies zeros of multivariate polynomials; the modern approach generalizes this in a few different aspects.
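As a concrete illustration of "zeros of multivariate polynomials", the following minimal SymPy sketch (the system here, a circle intersected with a line, is an assumed toy example) computes the common zeros of two polynomials:

```python
# A minimal sketch: the common zeros of two polynomials in Q[x, y]
# (the intersection of the unit circle with the line y = x).
from sympy import symbols, solve

x, y = symbols("x y")
zeros = solve([x**2 + y**2 - 1, x - y], [x, y])
print(zeros)  # [(-sqrt(2)/2, -sqrt(2)/2), (sqrt(2)/2, sqrt(2)/2)]
```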
Statistical inference is the process of using data analysis to infer properties of an underlying probability distribution. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population.
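A minimal sketch of such inference, using an assumed synthetic sample and a one-sample t-test from SciPy (testing the hypothesis that the population mean is zero):

```python
# A minimal sketch of inferring a population property from a sample:
# a one-sample t-test of the hypothesis "the population mean is 0".
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=0.5, scale=1.0, size=100)  # assumed synthetic data

t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p suggests mean != 0
```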
Linear programming (LP), also called linear optimization, is a method to achieve the best outcome in a mathematical model whose requirements and objective are represented by linear relationships. Linear programming is a special case of mathematical programming.
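A minimal sketch of a linear program in practice, using SciPy's `linprog` on an assumed toy model (maximize x + 2y subject to linear constraints):

```python
# A minimal sketch of solving an LP with SciPy: maximize x + 2y
# subject to x + y <= 4, x <= 3, x >= 0, y >= 0 (an assumed toy model).
from scipy.optimize import linprog

# linprog minimizes, so negate the objective coefficients to maximize.
res = linprog(c=[-1, -2],
              A_ub=[[1, 1], [1, 0]],
              b_ub=[4, 3],
              bounds=[(0, None), (0, None)])
print(res.x)  # optimal point, here [0, 4]
```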
Geostatistics is a branch of statistics focusing on spatial or spatiotemporal datasets. Developed originally to predict probability distributions of ore grades for mining operations, it is currently applied in diverse disciplines including petroleum geology, hydrogeology, hydrology, meteorology, oceanography, geochemistry, geometallurgy, geography, forestry, environmental control, landscape ecology, soil science, and agriculture. Geostatistics is applied in varied branches of geography, particularly those involving the spread of diseases (epidemiology), the practice of commerce and military planning (logistics), and the development of efficient spatial networks. Geostatistical algorithms are incorporated in many places, including geographic information systems (GIS).
Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criterion, from some set of available alternatives. It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.
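For a concrete example of continuous optimization, a minimal SciPy sketch minimizing the Rosenbrock test function (an assumed example; its unique minimum is at (1, 1)):

```python
# A minimal sketch of continuous optimization: minimize the Rosenbrock
# function, whose unique minimum is at (1, 1).
from scipy.optimize import minimize, rosen

res = minimize(rosen, x0=[-1.0, 2.0], method="Nelder-Mead")
print(res.x)  # approximately [1., 1.]
```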
Theoretical computer science is a subfield of computer science and mathematics that focuses on the abstract and mathematical foundations of computation.
In mathematics, and more specifically in computer algebra, computational algebraic geometry, and computational commutative algebra, a Gröbner basis is a particular kind of generating set of an ideal in a polynomial ring K[x1, ..., xn] over a field K. A Gröbner basis allows many important properties of the ideal and the associated algebraic variety to be deduced easily, such as the dimension and the number of zeros when it is finite. Gröbner basis computation is one of the main practical tools for solving systems of polynomial equations and computing the images of algebraic varieties under projections or rational maps.
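As a hedged illustration, the following SymPy sketch computes a lexicographic Gröbner basis for an assumed two-equation system; the lex order "triangularizes" the system, so the last basis element involves y alone and the zeros can be found by back-substitution:

```python
# A minimal sketch of a Groebner basis computation with SymPy.
from sympy import groebner, symbols

x, y = symbols("x y")
G = groebner([x**2 + y**2 - 1, x*y - 1], x, y, order="lex")
print(G.exprs)  # expected: [x + y**3 - y, y**4 - y**2 + 1]
```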
In 3D computer graphics, solid objects are usually modeled by polyhedra. A face of a polyhedron is a planar polygon bounded by straight line segments, called edges. Curved surfaces are usually approximated by a polygon mesh. Computer programs for line drawings of opaque objects must be able to decide which edges or which parts of the edges are hidden by an object itself or by other objects, so that those edges can be clipped during rendering. This problem is known as hidden-line removal.
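The sketch below illustrates only one ingredient of the problem: back-face culling for a convex polyhedron, where an edge is hidden by the object itself exactly when both adjacent faces point away from the viewer. Occlusion by other objects requires further machinery; the sign convention and helper functions here are assumptions for illustration:

```python
# A minimal sketch of the self-occlusion test used in hidden-line removal.
import numpy as np

def face_normal(face):
    """Outward normal of a planar face given as vertices in counterclockwise
    order (viewed from outside); only the first three vertices are needed."""
    v0, v1, v2 = (np.asarray(p, float) for p in face[:3])
    return np.cross(v1 - v0, v2 - v0)

def edge_self_occluded(face_a, face_b, view_dir):
    """True if the edge shared by face_a and face_b is hidden by the
    (convex) object itself, i.e. both faces point away from the viewer.
    view_dir points from the eye into the scene."""
    view_dir = np.asarray(view_dir, float)
    return (face_normal(face_a) @ view_dir > 0 and
            face_normal(face_b) @ view_dir > 0)

# Example: two faces of the unit cube sharing the edge x = 1, z = 1.
top   = [(0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]   # normal +z
right = [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 1)]   # normal +x
print(edge_self_occluded(top, right, view_dir=(0, 0, -1)))  # False: top faces viewer
```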
Richard Peirce Brent is an Australian mathematician and computer scientist. He is an emeritus professor at the Australian National University. From March 2005 to March 2010 he was a Federation Fellow at the Australian National University. His research interests include number theory, random number generators, computer architecture, and analysis of algorithms.
Hsiang-Tsung Kung is a Taiwanese-American computer scientist. He is the William H. Gates Professor of Computer Science at Harvard University. His early research in parallel computing produced the systolic array in 1979, which has since become a core computational component of hardware accelerators for artificial intelligence, including Google's Tensor Processing Unit (TPU). Similarly, he proposed optimistic concurrency control in 1981, now a key principle in memory and database transaction systems, including MySQL, Apache CouchDB, Google's App Engine, and Ruby on Rails.
In statistics, M-estimators are a broad class of extremum estimators for which the objective function is a sample average. Both non-linear least squares and maximum likelihood estimation are special cases of M-estimators. The definition of M-estimators was motivated by robust statistics, which contributed new types of M-estimators. However, M-estimators are not inherently robust, as is clear from the fact that they include maximum likelihood estimators, which are in general not robust. The statistical procedure of evaluating an M-estimator on a data set is called M-estimation. The "M" initial stands for "maximum likelihood-type".
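In symbols, for a chosen loss function $\rho$, an M-estimator $\hat\theta$ minimizes a sample-average objective:

$$\hat\theta = \arg\min_\theta \; \frac{1}{n}\sum_{i=1}^{n} \rho(x_i, \theta).$$

Taking $\rho(x,\theta) = -\log f(x;\theta)$ recovers maximum likelihood estimation, and taking $\rho$ to be a squared residual recovers (non-linear) least squares.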
Joseph Frederick Traub was an American computer scientist. He was the Edwin Howard Armstrong Professor of Computer Science at Columbia University and External Professor at the Santa Fe Institute. He held positions at Bell Laboratories, University of Washington, Carnegie Mellon, and Columbia, as well as sabbatical positions at Stanford, Berkeley, Princeton, California Institute of Technology, and Technical University, Munich.
High-dimensional integrals in hundreds or thousands of variables occur commonly in finance. These integrals have to be computed numerically to within a threshold $\varepsilon$. If the integral is of dimension $d$, then in the worst case, where one has a guarantee of error at most $\varepsilon$, the computational complexity is typically of order $\varepsilon^{-d}$. That is, the problem suffers the curse of dimensionality. In 1977 P. Boyle, University of Waterloo, proposed using Monte Carlo (MC) to evaluate options. Starting in early 1992, J. F. Traub, Columbia University, and a graduate student at the time, S. Paskov, used quasi-Monte Carlo (QMC) to price a collateralized mortgage obligation with parameters specified by Goldman Sachs. Even though it was believed by the world's leading experts that QMC should not be used for high-dimensional integration, Paskov and Traub found that QMC beat MC by one to three orders of magnitude and also enjoyed other desirable attributes. Their results were first published in 1995. Today QMC is widely used in the financial sector to value financial derivatives; see list of books below.
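As a hedged illustration of the MC/QMC comparison (not the Goldman Sachs computation, whose integrand is not public), the following sketch integrates an assumed separable test function over a 32-dimensional unit cube, using SciPy's scrambled Sobol sequence for the QMC estimate:

```python
# A minimal sketch comparing plain Monte Carlo with quasi-Monte Carlo
# (a scrambled Sobol sequence) on an assumed toy integrand over the
# unit cube; the true value of the integral is exactly 1.
import numpy as np
from scipy.stats import qmc

d, n = 32, 2**12
f = lambda u: np.prod(1.0 + 0.1 * (u - 0.5), axis=1)  # assumed test integrand

rng = np.random.default_rng(0)
mc_est = f(rng.random((n, d))).mean()

sobol = qmc.Sobol(d=d, scramble=True, seed=0)
qmc_est = f(sobol.random(n)).mean()

print(abs(mc_est - 1.0), abs(qmc_est - 1.0))  # QMC error is typically smaller
```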
In mathematics and computer science, computer algebra, also called symbolic computation or algebraic computation, is a scientific area that refers to the study and development of algorithms and software for manipulating mathematical expressions and other mathematical objects. Although computer algebra could be considered a subfield of scientific computing, they are generally considered as distinct fields because scientific computing is usually based on numerical computation with approximate floating point numbers, while symbolic computation emphasizes exact computation with expressions containing variables that have no given value and are manipulated as symbols.
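A minimal sketch of the symbolic/numeric distinction, assuming SymPy as the computer algebra system:

```python
# Symbolic computation keeps sqrt(8) exact; floating point rounds it.
import math
from sympy import sqrt, simplify, symbols

print(sqrt(8))       # 2*sqrt(2)  (exact)
print(math.sqrt(8))  # 2.8284271247461903  (approximate)

x = symbols("x")
print(simplify((x**2 - 1) / (x - 1)))  # x + 1, manipulated as a symbol
```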
Eldon Robert Hansen is an American mathematician and author who has published in global optimization theory and interval arithmetic.
Michael Ira Shub is an American mathematician who has done research into dynamical systems and the complexity of real number algorithms.
Continuous-variable (CV) quantum information is the area of quantum information science that makes use of physical observables, like the strength of an electromagnetic field, whose numerical values belong to continuous intervals. One primary application is quantum computing. In a sense, continuous-variable quantum computation is "analog", while quantum computation using qubits is "digital." In more technical terms, the former makes use of Hilbert spaces that are infinite-dimensional, while the Hilbert spaces for systems comprising collections of qubits are finite-dimensional. One motivation for studying continuous-variable quantum computation is to understand what resources are necessary to make quantum computers more powerful than classical ones.
Probabilistic numerics is an active field of study at the intersection of applied mathematics, statistics, and machine learning centering on the concept of uncertainty in computation. In probabilistic numerics, tasks in numerical analysis, such as finding numerical solutions for integration, linear algebra, optimization, and the simulation of differential equations, are seen as problems of statistical, probabilistic, or Bayesian inference.
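A drastically simplified sketch of this point of view: report a numerical integral not as a single number but with an uncertainty. Here the Monte Carlo sample mean and its standard error stand in for a crude, assumed-Gaussian posterior; genuine Bayesian quadrature would instead place a Gaussian-process prior on the integrand:

```python
# A drastically simplified sketch of the probabilistic-numerics view:
# a numerical result reported together with its uncertainty.
import numpy as np

rng = np.random.default_rng(0)
f = lambda u: np.exp(-u**2)   # assumed integrand on [0, 1]
samples = f(rng.random(10_000))

mean = samples.mean()                                  # point estimate
stderr = samples.std(ddof=1) / np.sqrt(samples.size)   # uncertainty
print(f"integral = {mean:.4f} +/- {2 * stderr:.4f}")
```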
Fixed-point computation refers to the process of computing an exact or approximate fixed point of a given function. In its most common form, the given function $f$ satisfies the conditions of the Brouwer fixed-point theorem: that is, $f$ is continuous and maps the unit $d$-cube to itself. The Brouwer fixed-point theorem guarantees that $f$ has a fixed point, but the proof is not constructive. Various algorithms have been devised for computing an approximate fixed point. Such algorithms are used in economics for computing a market equilibrium, in game theory for computing a Nash equilibrium, and in dynamic system analysis.
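A minimal sketch of one such algorithm, fixed-point iteration; this simple scheme is only guaranteed to converge for contraction maps (by Banach's theorem), while arbitrary Brouwer functions require other methods, such as simplicial (Scarf-type) algorithms:

```python
# A minimal sketch of approximating a fixed point by iteration.
import numpy as np

def fixed_point(f, x0, tol=1e-10, max_iter=1000):
    """Iterate x <- f(x) until successive iterates agree to within tol."""
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        x_next = f(x)
        if np.max(np.abs(x_next - x)) < tol:
            return x_next
        x = x_next
    raise RuntimeError("no convergence within max_iter")

# Example: f(x) = cos(x) maps [0, 1] to itself and is a contraction there.
print(fixed_point(np.cos, 0.5))  # ~0.7390851332 (the Dottie number)
```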
Edward Wesler Packel is an American mathematician, game theorist, theoretical computer scientist, and expert on the use of Wolfram Mathematica in teaching mathematics. His 1981 book The Mathematics of Games and Gambling won the 1986 Beckenbach Book Prize.