| Regina S. Burachik | |
| --- | --- |
| Nationality | Argentine |
| Academic background | |
| Alma mater | Instituto Nacional de Matemática Pura e Aplicada |
| Thesis | Generalized Proximal Point Method for the Variational Inequality Problem (1995) |
| Doctoral advisor | Alfredo Noel Iusem |
| Academic work | |
| Discipline | Mathematics |
| Sub-discipline | Mathematical optimization, Mathematical analysis |
| Institutions | University of South Australia |
Regina Sandra Burachik is an Argentine [1] mathematician who works on optimization and analysis, particularly convex analysis, functional analysis, and non-smooth analysis. She is currently a professor at the University of South Australia. [2]
She earned her Ph.D. from IMPA in 1995 under the supervision of Alfredo Noel Iusem, with the thesis Generalized Proximal Point Method for the Variational Inequality Problem. [3] In her thesis, she "introduced and analyzed solution methods for variational inequalities, the latter being a generalization of the convex constrained optimization problem." [4]
In mathematics, a contraction mapping, or contraction or contractor, on a metric space (M, d) is a function f from M to itself, with the property that there is some real number 0 ≤ k < 1 such that for all x and y in M, d(f(x), f(y)) ≤ k·d(x, y).
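As a concrete illustration (not drawn from the source), here is a minimal Python sketch of the fixed-point iteration whose convergence the Banach fixed-point theorem guarantees for contractions; the function names and the example map are illustrative assumptions.

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{n+1} = f(x_n); converges when f is a contraction."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# f(x) = cos(x) is a contraction on [0, 1] (|f'(x)| = |sin x| <= sin 1 < 1 there),
# so the iteration converges to the unique fixed point x = cos(x) ~ 0.739085.
print(fixed_point(math.cos, 0.5))
```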
Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criterion, from some set of available alternatives. It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.
In mathematics, gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient of the function at the current point, because this is the direction of steepest descent. Conversely, stepping in the direction of the gradient will lead to a local maximum of that function; the procedure is then known as gradient ascent.
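A minimal Python sketch of the idea (illustrative only; the objective f(x, y) = x² + 10y², the step size, and all names are assumptions, not from the source):

```python
def gradient_descent(grad, x0, step=0.05, iters=500):
    """Repeatedly step opposite to the gradient (steepest descent)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Gradient of f(x, y) = x^2 + 10*y^2, whose unique minimum is the origin.
grad_f = lambda x: [2 * x[0], 20 * x[1]]
print(gradient_descent(grad_f, [3.0, 1.0]))  # ~ [0.0, 0.0]
```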
Pierre-Louis Lions is a French mathematician. He is known for a number of contributions to the fields of partial differential equations and the calculus of variations. He was a recipient of the 1994 Fields Medal and the 1991 Prize of the Philip Morris tobacco and cigarette company.
Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets. Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization is in general NP-hard.
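For reference, the standard form of a convex optimization problem can be written as follows (standard notation, not taken from the source):

```latex
\begin{aligned}
\min_{x \in \mathbb{R}^{n}} \quad & f(x) \\
\text{subject to} \quad & g_{i}(x) \le 0, \quad i = 1, \dots, m,
\end{aligned}
```

where f and each g_i are convex; the feasible set {x : g_i(x) ≤ 0 for all i} is then itself convex.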
Convex analysis is the branch of mathematics devoted to the study of properties of convex functions and convex sets, often with applications in convex minimization, a subdomain of optimization theory.
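The defining inequality of a convex function f, in standard notation (supplied here for reference, not from the source):

```latex
f(\theta x + (1 - \theta) y) \le \theta f(x) + (1 - \theta) f(y)
\qquad \text{for all } x, y \text{ and } \theta \in [0, 1].
```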
In mathematical optimization theory, duality or the duality principle is the principle that optimization problems may be viewed from either of two perspectives, the primal problem or the dual problem. If the primal is a minimization problem then the dual is a maximization problem. Any feasible solution to the primal (minimization) problem is at least as large as any feasible solution to the dual (maximization) problem. Therefore, the solution to the primal is an upper bound to the solution of the dual, and the solution of the dual is a lower bound to the solution of the primal. This fact is called weak duality.
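A classical concrete instance (standard notation, not from the source) is the linear programming pair:

```latex
\begin{aligned}
\text{primal:} \quad & \min_{x} \ c^{\top} x \quad \text{s.t. } A x \ge b,\ x \ge 0, \\
\text{dual:}   \quad & \max_{y} \ b^{\top} y \quad \text{s.t. } A^{\top} y \le c,\ y \ge 0.
\end{aligned}
```

Weak duality is then the statement that b᳠ᵀy ≤ cᵀx for any feasible pair (x, y), since bᵀy ≤ (Ax)ᵀy = xᵀ(Aᵀy) ≤ xᵀc.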
In mathematics, a quasiconvex function is a real-valued function defined on an interval or on a convex subset of a real vector space such that the inverse image of any set of the form (−∞, a) is a convex set. For a function of a single variable, along any stretch of the curve the highest point is one of the endpoints. The negative of a quasiconvex function is said to be quasiconcave.
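Equivalently, in standard notation (not from the source), f is quasiconvex when

```latex
f(\lambda x + (1 - \lambda) y) \le \max\{f(x), f(y)\}
\qquad \text{for all } x, y \text{ and } \lambda \in [0, 1].
```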
A set-valued function is a mathematical function that maps elements from one set, the domain of the function, to subsets of another set. For example, F(x) = {y ∈ ℝ : y² = x} maps each nonnegative real number to its set of square roots.
Ralph Tyrrell Rockafellar is an American mathematician and one of the leading scholars in optimization theory and related fields of analysis and combinatorics. He is the author of four major books, including the landmark text "Convex Analysis" (1970), which has been cited more than 27,000 times according to Google Scholar and remains the standard reference on the subject, and "Variational Analysis", for which the authors received the Frederick W. Lanchester Prize from the Institute for Operations Research and the Management Sciences (INFORMS).
Augmented Lagrangian methods are a certain class of algorithms for solving constrained optimization problems. They have similarities to penalty methods in that they replace a constrained optimization problem by a series of unconstrained problems and add a penalty term to the objective; the difference is that the augmented Lagrangian method adds yet another term, designed to mimic a Lagrange multiplier. The augmented Lagrangian is related to, but not identical with, the method of Lagrange multipliers.
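For an equality-constrained problem min f(x) subject to c(x) = 0, the augmented Lagrangian takes the standard form below (standard notation, not from the source); the multiplier estimate λ is typically updated by λ ← λ + μ c(x) after each unconstrained minimization:

```latex
\mathcal{L}_{A}(x, \lambda; \mu)
  = f(x) + \lambda^{\top} c(x) + \frac{\mu}{2} \, \lVert c(x) \rVert^{2}.
```

The quadratic term penalizes constraint violation, while the linear λ-term is the piece that mimics the Lagrange multiplier and allows convergence without driving μ to infinity.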
Robert Ralph Phelps was an American mathematician who was known for his contributions to analysis, particularly to functional analysis and measure theory. He was a professor of mathematics at the University of Washington from 1962 until his death.
In mathematics, a submodular set function is a set function whose value, informally, has the property that the difference in the incremental value of the function that a single element makes when added to an input set decreases as the size of the input set increases. Submodular functions have a natural diminishing-returns property which makes them suitable for many applications, including approximation algorithms, game theory, and electrical networks. Recently, submodular functions have also found immense utility in several real-world problems in machine learning and artificial intelligence, including automatic summarization, multi-document summarization, feature selection, active learning, sensor placement, image collection summarization, and many other domains.
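In standard notation (supplied for reference, not from the source), the diminishing-returns characterization of a submodular function f on a ground set Ω reads:

```latex
f(A \cup \{x\}) - f(A) \;\ge\; f(B \cup \{x\}) - f(B)
\qquad \text{for all } A \subseteq B \subseteq \Omega \text{ and } x \in \Omega \setminus B.
```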
In optimization problems in applied mathematics, the duality gap is the difference between the primal and dual solutions. If d* is the optimal dual value and p* is the optimal primal value, then the duality gap is equal to p* − d*. This value is always greater than or equal to 0. The duality gap is zero if and only if strong duality holds. Otherwise the gap is strictly positive and weak duality holds.
Proximal gradient methods are a generalized form of projection used to solve non-differentiable convex optimization problems.
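As a hedged illustration of the idea (the lasso objective, function names, step size, and data below are assumptions, not from the source), here is a minimal Python sketch of the proximal gradient method ISTA, which alternates a gradient step on the smooth part with a proximal step on the non-differentiable ℓ1 term:

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=500):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const. of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.0])
print(ista(A, A @ x_true, lam=0.1))          # ~ recovers the sparse x_true
```

The step size 1/L guarantees descent on the smooth part, while the proximal step handles the ℓ1 term exactly rather than through its (nonexistent) gradient.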
In mathematics, the term variational analysis usually denotes the combination and extension of methods from convex optimization and the classical calculus of variations to a more general theory. This includes the more general problems of optimization theory, including topics in set-valued analysis, e.g. generalized derivatives.
Alfredo Noel Iusem is an Argentine-born Brazilian mathematician working on mathematical optimization.
In mathematical optimization, the proximal operator is an operator associated with a proper, lower semi-continuous convex function f from a Hilbert space X to (−∞, +∞], and is defined by prox_f(v) = argmin_{x ∈ X} ( f(x) + ½ ‖x − v‖² ).
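A standard worked example (not specific to the source): for f = λ‖·‖₁ the proximal operator has the closed form

```latex
\operatorname{prox}_{\lambda \lVert \cdot \rVert_{1}}(v)_{i}
  = \operatorname{sign}(v_{i}) \, \max\{\lvert v_{i} \rvert - \lambda,\ 0\},
```

i.e. componentwise soft-thresholding, which is exactly the prox step used in the proximal gradient sketch above.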