Anatoly Zhigljavsky

Anatoly Aleksandrovich Zhigljavsky (born 19 November 1953) is a professor of statistics in the School of Mathematics at Cardiff University. He has authored 12 monographs and over 150 papers in refereed journals. [1] His research interests include stochastic and high-dimensional global optimisation, time series analysis, multivariate data analysis, statistical modelling in market research, and probabilistic methods in search and number theory. [2]

He is the Director of the Centre for Optimisation and its Applications, an interdisciplinary centre which encourages joint research and applied projects among members of the Schools of Mathematics, Computer Science and Business, and the Manufacturing Engineering Centre at Cardiff University. It also promotes awareness of the rapidly growing field of optimisation through publications, conferences, joint research and student exchange. [3]

His books include Theory of Global Random Search, [4] Stochastic Global Optimization, [5] Analysis of Time Series Structure: SSA and Related Techniques, [6] Dynamical Search: Applications of Dynamical Systems in Search and Optimization, [7] and Singular Spectrum Analysis for Time Series. [8] His books have received positive reviews. [9] [5]

Zhigljavsky received an MSc in 1976, a PhD in 1981, and a Habilitation in 1987 from St. Petersburg State University, Russia. [2]

In 2019, Anatoly Zhigljavsky received the Constantin Carathéodory Prize for his outstanding work and significant contributions in the field of global optimisation. [10] [11]

Related Research Articles

<span class="mw-page-title-main">Mathematical optimization</span> Study of mathematical algorithms for optimization problems

Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criteria, from some set of available alternatives. It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.

<span class="mw-page-title-main">Evolutionary algorithm</span> Subset of evolutionary computation

An evolutionary algorithm (EA) in computational intelligence is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions. Evolution of the population then takes place after the repeated application of the above operators.
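
A minimal sketch of these mechanisms in Python (the sphere objective and all parameter values are illustrative choices, not taken from any particular work): candidate solutions are real vectors, the better half of the population is selected as parents, and offspring are produced by Gaussian mutation.

    import random

    def sphere(x):
        # illustrative fitness function: lower values are better
        return sum(xi * xi for xi in x)

    def evolutionary_algorithm(dim=5, pop_size=20, generations=100, sigma=0.3, seed=0):
        rng = random.Random(seed)
        # initial population of candidate solutions ("individuals")
        population = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
        for _ in range(generations):
            # selection: keep the better half of the population as parents
            population.sort(key=sphere)
            parents = population[: pop_size // 2]
            # reproduction with mutation: each parent yields two perturbed offspring
            population = [
                [xi + rng.gauss(0.0, sigma) for xi in parent]
                for parent in parents
                for _ in range(2)
            ]
        return min(population, key=sphere)

    best = evolutionary_algorithm()
    print(best, sphere(best))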

<span class="mw-page-title-main">Evolutionary computation</span> Trial and error problem solvers with a metaheuristic or stochastic optimization character

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing that studies these algorithms. In technical terms, they are a family of population-based trial-and-error problem solvers with a metaheuristic or stochastic optimization character.

<span class="mw-page-title-main">Particle swarm optimization</span> Iterative simulation method

In computational science, particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. It solves a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search-space according to simple mathematical formulae over the particle's position and velocity. Each particle's movement is influenced by its local best known position, but is also guided toward the best known positions in the search-space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.
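
The velocity-and-position update described above can be sketched in Python as follows; the inertia and attraction coefficients (w, c1, c2) and the sphere objective are assumed here only for demonstration.

    import random

    def objective(x):
        # illustrative objective to minimize
        return sum(xi * xi for xi in x)

    def pso(dim=2, n_particles=30, iterations=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = random.Random(seed)
        pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]            # each particle's best known position
        gbest = min(pbest, key=objective)[:]   # best known position of the swarm
        for _ in range(iterations):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    # velocity update: inertia plus pulls toward personal and swarm bests
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                if objective(pos[i]) < objective(pbest[i]):
                    pbest[i] = pos[i][:]
                    if objective(pbest[i]) < objective(gbest):
                        gbest = pbest[i][:]
        return gbest

    best = pso()
    print(best, objective(best))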

Global optimization is a branch of applied mathematics and numerical analysis that attempts to find the global minima or maxima of a function, or of a set of functions, on a given set. It is usually described as a minimization problem, because the maximization of a real-valued function f(x) is equivalent to the minimization of the function −f(x).
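
Written out for a real-valued function f on a feasible set A (notation used here only for illustration):

    \max_{x \in A} f(x) = -\min_{x \in A} \bigl(-f(x)\bigr),
    \qquad
    \operatorname*{arg\,max}_{x \in A} f(x) = \operatorname*{arg\,min}_{x \in A} \bigl(-f(x)\bigr).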

<span class="mw-page-title-main">Swarm intelligence</span> Collective behavior of decentralized, self-organized systems

Swarm intelligence (SI) is the collective behavior of decentralized, self-organized systems, natural or artificial. The concept is employed in work on artificial intelligence. The expression was introduced by Gerardo Beni and Jing Wang in 1989, in the context of cellular robotic systems.

In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, tune, or select a heuristic that may provide a sufficiently good solution to an optimization problem or a machine learning problem, especially with incomplete or imperfect information or limited computation capacity. Metaheuristics sample a subset of solutions which would otherwise be too large to be completely enumerated or explored. They may make relatively few assumptions about the optimization problem being solved and so may be usable for a variety of problems. Metaheuristics are of particular interest when exact or other approximate methods are not available or not expedient, either because the calculation time is too long or because the solution they provide is too imprecise.

Peter Whittle was a mathematician and statistician from New Zealand, working in the fields of stochastic nets, optimal control, time series analysis, stochastic optimisation and stochastic dynamics. From 1967 to 1994, he was the Churchill Professor of Mathematics for Operational Research at the University of Cambridge.

Stochastic optimization (SO) refers to optimization methods that generate and use random variables. In stochastic optimization problems, the objective functions or constraints are themselves random. Stochastic optimization also includes methods with random iterates. Some hybrid methods use random iterates to solve stochastic problems, combining both meanings of the term. Stochastic optimization methods generalize deterministic methods for deterministic problems.
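
As a minimal illustration of a method with random iterates, the following Python sketch performs pure random search over a box; the objective function, bounds and sample budget are assumed purely for demonstration.

    import random

    def objective(x):
        # illustrative objective to minimize; the minimizer is (1, -2)
        return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

    def pure_random_search(f, bounds, n_samples=10000, seed=0):
        rng = random.Random(seed)
        best_x, best_val = None, float("inf")
        for _ in range(n_samples):
            # random iterate: a point drawn uniformly from the search box
            x = [rng.uniform(lo, hi) for lo, hi in bounds]
            val = f(x)
            if val < best_val:
                best_x, best_val = x, val
        return best_x, best_val

    print(pure_random_search(objective, [(-10.0, 10.0), (-10.0, 10.0)]))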

<span class="mw-page-title-main">Yu-Chi Ho</span> American control theorist

Yu-Chi "Larry" Ho is a Chinese-American mathematician, control theorist, and a professor at the School of Engineering and Applied Sciences, Harvard University.

<span class="mw-page-title-main">Suresh P. Sethi</span> American mathematician

Suresh P. Sethi is an American mathematician who is the Eugene McDermott Chair of Operations Management and Director of the Center for Intelligent Supply Networks at the University of Texas at Dallas.

Roger Jean-Baptiste Robert Wets is a "pioneer" in stochastic programming and a leader in variational analysis who publishes as Roger J-B Wets. His research, expositions, graduate students, and his collaboration with R. Tyrrell Rockafellar have had a profound influence on optimization theory, computations, and applications. Since 2009, Wets has been a distinguished research professor at the mathematics department of the University of California, Davis.

<span class="mw-page-title-main">R. Tyrrell Rockafellar</span> American mathematician

Ralph Tyrrell Rockafellar is an American mathematician and one of the leading scholars in optimization theory and related fields of analysis and combinatorics. He is the author of four major books including the landmark text "Convex Analysis" (1970), which has been cited more than 27,000 times according to Google Scholar and remains the standard reference on the subject, and "Variational Analysis" for which the authors received the Frederick W. Lanchester Prize from the Institute for Operations Research and the Management Sciences (INFORMS).

In economics, non-convexity refers to violations of the convexity assumptions of elementary economics. Basic economics textbooks concentrate on consumers with convex preferences and convex budget sets and on producers with convex production sets; for convex models, the predicted economic behavior is well understood. When convexity assumptions are violated, then many of the good properties of competitive markets need not hold: Thus, non-convexity is associated with market failures, where supply and demand differ or where market equilibria can be inefficient. Non-convex economies are studied with nonsmooth analysis, which is a generalization of convex analysis.

Mark Herbert Ainsworth Davis was Professor of Mathematics at Imperial College London. He made fundamental contributions to the theory of stochastic processes, stochastic control and mathematical finance.

<span class="mw-page-title-main">Michael Ghil</span>

Michael Ghil is an American and European mathematician and physicist, focusing on the climate sciences and their interdisciplinary aspects. He is a founder of theoretical climate dynamics, as well as of advanced data assimilation methodology. He has systematically applied dynamical systems theory to planetary-scale flows, both atmospheric and oceanic. Ghil has used these methods to proceed from simple flows with high temporal regularity and spatial symmetry to the observed flows, with their complex behavior in space and time. His studies of climate variability on many time scales have used a full hierarchy of models, from the simplest ‘toy’ models all the way to atmospheric, oceanic and coupled general circulation models. Recently, Ghil has also worked on modeling and data analysis in population dynamics, macroeconomics, and the climate–economy–biosphere system.

Maria Grazia Speranza is an Italian applied mathematician and operations researcher. Her research involves the application of mathematical optimization to problems including portfolio optimization and the combination of inventory management with vehicle routing.

Adrian Stephen Lewis is a British-Canadian mathematician, specializing in variational analysis and nonsmooth optimization.

<span class="mw-page-title-main">Werner Römisch</span> German mathematician

Werner Römisch is a German mathematician, professor emeritus at the Humboldt University of Berlin, most known for his pioneer work in the field of stochastic programming.

<span class="mw-page-title-main">Eugene A. Feinberg</span> American applied mathematician

Eugene A. Feinberg is an American mathematician and distinguished professor of applied mathematics and statistics at Stony Brook University. He is noted for his work in probability theory, real analysis, and Markov decision processes.

References

  1. Google Scholar
  2. Cardiff University
  3. Centre for Optimisation and its Applications
  4. Zhigljavsky, Anatoly A. (1991). Pintér, J. (ed.). Theory of Global Random Search. doi:10.1007/978-94-011-3436-1. ISBN 978-94-010-5519-2.
  5. Stochastic Global Optimization. Springer Optimization and Its Applications. Vol. 9. 2008. doi:10.1007/978-0-387-74740-8. ISBN 978-0-387-74022-5.
  6. Analysis of Time Series Structure: SSA and Related Techniques. Chapman & Hall/CRC, 2001.
  7. Dynamical Search: Applications of Dynamical Systems in Search and Optimization. Chapman & Hall/CRC, 1999.
  8. Singular Spectrum Analysis for Time Series. SpringerBriefs in Statistics. 2013. doi:10.1007/978-3-642-34913-3. ISBN 978-3-642-34912-6.
  9. Elsner, James B. (2002). "Analysis of Time Series Structure: SSA and Related Techniques". Journal of the American Statistical Association. 97 (460): 1207–1208. doi:10.1198/jasa.2002.s239.
  10. Society for Global Optimization
  11. AMS Notices