Kim-Chuan Toh


Kim-Chuan Toh is a Singaporean mathematician and the Leo Tan Professor in Science at the National University of Singapore (NUS). He is known for his contributions to the theory, practice, and application of convex optimization, especially semidefinite programming and conic programming.


Education

Toh received his BSc (Hons) in 1990 and his MSc in 1992 from NUS, and his PhD in 1996 from Cornell University.


Awards and honours

Toh received the 2017 INFORMS Optimization Society Farkas Prize[1] and the 2019 President's Science Award (Singapore).[2] He is a fellow of the Society for Industrial and Applied Mathematics (Class of 2018).[3]

Related Research Articles

<span class="mw-page-title-main">George Dantzig</span> American mathematician (1914-2005)

George Bernard Dantzig was an American mathematical scientist who made contributions to industrial engineering, operations research, computer science, economics, and statistics.

Quadratic programming (QP) is the process of solving certain mathematical optimization problems involving quadratic functions. Specifically, one seeks to optimize a multivariate quadratic function subject to linear constraints on the variables. Quadratic programming is a type of nonlinear programming.
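
For concreteness, one common standard form (the notation here is illustrative, not taken from this article) is

\[
\begin{aligned}
\underset{x \in \mathbb{R}^n}{\text{minimize}} \quad & \tfrac{1}{2}\, x^\top Q x + c^\top x \\
\text{subject to} \quad & A x \le b,
\end{aligned}
\]

where Q is a symmetric n × n matrix; the problem is convex exactly when Q is positive semidefinite.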

IBM ILOG CPLEX Optimization Studio is an optimization software package. In 2004, the work on CPLEX earned the first INFORMS Impact Prize.

<span class="mw-page-title-main">Optimal design</span> Experimental design that is optimal with respect to some statistical criterion

In the design of experiments, optimal designs are a class of experimental designs that are optimal with respect to some statistical criterion. The creation of this field of statistics has been credited to Danish statistician Kirstine Smith.

Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets. Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization is in general NP-hard.
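
In one common standard form (notation illustrative, not taken from this article), a convex optimization problem reads

\[
\begin{aligned}
\text{minimize} \quad & f(x) \\
\text{subject to} \quad & g_i(x) \le 0, \quad i = 1, \dots, m, \\
& h_j(x) = 0, \quad j = 1, \dots, p,
\end{aligned}
\]

where f and each g_i are convex functions and each h_j is affine.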

<span class="mw-page-title-main">Convex cone</span> Mathematical set closed under positive linear combinations

In linear algebra, a cone (sometimes called a linear cone to distinguish it from other sorts of cones) is a subset of a vector space that is closed under positive scalar multiplication; that is, C is a cone if x ∈ C implies sx ∈ C for every positive scalar s.

Semidefinite programming (SDP) is a subfield of convex optimization concerned with the optimization of a linear objective function over the intersection of the cone of positive semidefinite matrices with an affine space, i.e., a spectrahedron.
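
In one common primal standard form (notation illustrative, not taken from this article), an SDP reads

\[
\begin{aligned}
\text{minimize} \quad & \langle C, X \rangle \\
\text{subject to} \quad & \langle A_i, X \rangle = b_i, \quad i = 1, \dots, m, \\
& X \succeq 0,
\end{aligned}
\]

where \langle C, X \rangle = \operatorname{trace}(C X) and X \succeq 0 means that the symmetric matrix X is positive semidefinite.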

In mathematical optimization and related fields, relaxation is a modeling strategy. A relaxation is an approximation of a difficult problem by a nearby problem that is easier to solve. A solution of the relaxed problem provides information about the original problem.
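
A standard illustrative example (not specific to this article): the linear programming relaxation of a 0–1 integer program replaces each constraint x_j ∈ {0, 1} by 0 ≤ x_j ≤ 1, so that

\[
\min \{\, c^\top x : A x \ge b,\ x \in \{0,1\}^n \,\} \;\ge\; \min \{\, c^\top x : A x \ge b,\ 0 \le x \le 1 \,\},
\]

and the optimal value of the relaxed problem gives a lower bound on the integer optimum for a minimization problem.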

In mathematical optimization, a quadratically constrained quadratic program (QCQP) is an optimization problem in which both the objective function and the constraints are quadratic functions. It has the form

\[
\begin{aligned}
\text{minimize} \quad & \tfrac{1}{2}\, x^\top P_0\, x + q_0^\top x \\
\text{subject to} \quad & \tfrac{1}{2}\, x^\top P_i\, x + q_i^\top x + r_i \le 0, \quad i = 1, \dots, m, \\
& A x = b,
\end{aligned}
\]

where P_0, ..., P_m are n-by-n matrices and x ∈ R^n is the optimization variable.

A finite element limit analysis (FELA) uses optimisation techniques to directly compute the upper or lower bound plastic collapse load for a mechanical system rather than time stepping to a collapse load, as might be undertaken with conventional non-linear finite element techniques. The problem may be formulated in either a kinematic or equilibrium form.

<span class="mw-page-title-main">R. Tyrrell Rockafellar</span> American mathematician

Ralph Tyrrell Rockafellar is an American mathematician and one of the leading scholars in optimization theory and related fields of analysis and combinatorics. He is the author of four major books including the landmark text "Convex Analysis" (1970), which has been cited more than 27,000 times according to Google Scholar and remains the standard reference on the subject, and "Variational Analysis" for which the authors received the Frederick W. Lanchester Prize from the Institute for Operations Research and the Management Sciences (INFORMS).

Nimrod Megiddo is a mathematician and computer scientist. He is a research scientist at the IBM Almaden Research Center and Stanford University. His interests include combinatorial optimization, algorithm design and analysis, game theory, and machine learning. He was one of the first people to propose a solution to the bounding sphere and smallest-circle problem.

Robert J. Vanderbei is an American mathematician and Professor in the Department of Operations Research and Financial Engineering at Princeton University.

Arkadi Nemirovski is a professor at the H. Milton Stewart School of Industrial and Systems Engineering at the Georgia Institute of Technology. He has been a leader in continuous optimization and is best known for his work on the ellipsoid method, modern interior-point methods and robust optimization.

<span class="mw-page-title-main">Michel Goemans</span> Belgian-American mathematician

Michel Xavier Goemans is a Belgian-American professor of applied mathematics and the RSA Professor of Mathematics at MIT, working in discrete mathematics and combinatorial optimization at CSAIL and the MIT Operations Research Center.

<span class="mw-page-title-main">George Nemhauser</span> American operations researcher (born 1937)

George Lann Nemhauser is an American operations researcher, the A. Russell Chandler III Chair and Institute Professor of Industrial and Systems Engineering at the Georgia Institute of Technology and the former president of the Operations Research Society of America.

Truncated Newton methods, also known as Hessian-free optimization, are a family of optimization algorithms designed for optimizing non-linear functions with large numbers of independent variables. The method originated in a paper by Ron Dembo and Trond Steihaug, first published in Mathematical Programming; convergence results for the algorithm can be found in Dembo, Ron S., Stanley C. Eisenstat, and Trond Steihaug, "Inexact Newton methods", SIAM Journal on Numerical Analysis 19.2 (1982): 400-408. A truncated Newton method consists of repeated application of an iterative optimization algorithm to approximately solve Newton's equations, to determine an update to the function's parameters. The inner solver is truncated, i.e., run for only a limited number of iterations. It follows that, for truncated Newton methods to work, the inner solver needs to produce a good approximation in a finite number of iterations; conjugate gradient has been suggested and evaluated as a candidate inner loop. Another prerequisite is good preconditioning for the inner algorithm.
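
A minimal illustrative sketch of this scheme in Python (not taken from the cited papers; the helper names and parameter choices are assumptions for the example): the outer Newton iteration calls a conjugate-gradient inner loop that is truncated after max_inner iterations and needs only Hessian-vector products.

import numpy as np
from scipy.optimize import rosen, rosen_der, rosen_hess_prod

def truncated_newton(f, grad, hess_vec, x0, max_outer=100, max_inner=20, tol=1e-6):
    """Sketch of a truncated Newton iteration: at each outer step the Newton
    system H d = -g is solved only approximately by a capped number of
    conjugate-gradient iterations (the 'truncation')."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Inner loop: truncated conjugate gradient on H d = -g,
        # using only Hessian-vector products (hence "Hessian-free").
        d = np.zeros_like(x)
        r = -g.copy()            # residual of the Newton system
        p = r.copy()             # CG search direction
        for _ in range(max_inner):
            Hp = hess_vec(x, p)
            pHp = p @ Hp
            if pHp <= 0:         # negative curvature: stop the inner solve
                break
            alpha = (r @ r) / pHp
            d = d + alpha * p
            r_new = r - alpha * Hp
            if np.linalg.norm(r_new) < 1e-10:
                break
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        if not np.any(d):        # CG made no progress: fall back to steepest descent
            d = -g
        # Simple backtracking (Armijo) line search along d.
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

# Example: minimize the Rosenbrock function from a standard test point.
x_min = truncated_newton(rosen, rosen_der, rosen_hess_prod, [-1.2, 1.0])
print(x_min)   # should approach the minimizer (1, 1)

Production implementations of the same idea are available, for example, in SciPy as minimize(..., method='Newton-CG') and method='TNC'.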

<span class="mw-page-title-main">Affine scaling</span> Algorithm for solving linear programming problems

In mathematical optimization, affine scaling is an algorithm for solving linear programming problems. Specifically, it is an interior point method, discovered by Soviet mathematician I. I. Dikin in 1967 and reinvented in the U.S. in the mid-1980s.

Katya Scheinberg is a Russian-American applied mathematician known for her research in continuous optimization and particularly in derivative-free optimization. She works at Cornell University and is a professor in Cornell's School of Operations Research and Information Engineering.

<span class="mw-page-title-main">Tamás Terlaky</span> Hungarian mathematician (born 1955)

Tamás Terlaky is a Hungarian-Canadian-American professor of Industrial and Systems Engineering at Lehigh University. He is especially well known for his work on criss-cross algorithms, interior-point methods, Klee-Minty examples for path following algorithms, and optimization.

References

  1. "2017 INFORMS Optimization Society Farkas Prize".
  2. "Singapore honors top researchers".
  3. "Class of 2018 - SIAM".