Yevgeniy Yegorov

Medal record
Men's canoeing
Representing Kazakhstan
Asian Games
Gold: 1998 Bangkok, K-1 500 m
Gold: 1998 Bangkok, K-2 1000 m
Gold: 2002 Busan, K-4 500 m
Silver: 2002 Busan, K-4 1000 m
Silver: 2010 Guangzhou, K-4 1000 m
Bronze: 1994 Hiroshima, K-1 500 m
Bronze: 1994 Hiroshima, K-1 1000 m

Yevgeniy Yegorov (born 14 February 1976) is a Kazakhstani sprint canoeist who has competed since the mid-1990s. At the 1996 Summer Olympics, he was eliminated in the repechages of both the K-1 500 m and the K-2 1000 m events.


Related Research Articles

Binomial distribution: Probability distribution

In probability theory and statistics, the binomial distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent experiments, each asking a yes–no question, and each with its own Boolean-valued outcome: success or failure. A single success/failure experiment is also called a Bernoulli trial or Bernoulli experiment, and a sequence of outcomes is called a Bernoulli process; for a single trial, i.e., n = 1, the binomial distribution is a Bernoulli distribution. The binomial distribution is the basis for the popular binomial test of statistical significance.
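
For illustration, the probability of exactly k successes in n independent Bernoulli trials with success probability p can be computed directly from the binomial probability mass function; the following Python sketch uses the standard library only, and the example values are arbitrary.

from math import comb

def binomial_pmf(k, n, p):
    # P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probability of exactly 3 heads in 10 tosses of a fair coin
print(binomial_pmf(3, 10, 0.5))  # ~0.117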

Binomial coefficient: Number of subsets of a given size

In mathematics, the binomial coefficients are the positive integers that occur as coefficients in the binomial theorem. Commonly, a binomial coefficient is indexed by a pair of integers n ≥ k ≥ 0 and is written C(n, k), read "n choose k". It is the coefficient of the x^k term in the polynomial expansion of the binomial power (1 + x)^n; this coefficient can be computed by the multiplicative formula C(n, k) = n(n − 1)⋯(n − k + 1) / (k(k − 1)⋯1).
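
As a small sketch of the multiplicative formula above, the following Python function builds C(n, k) incrementally without computing full factorials; the function name is illustrative.

def binomial_coefficient(n, k):
    # Multiply by (n - i + 1) and divide by i at each step; every intermediate value is an integer
    result = 1
    for i in range(1, k + 1):
        result = result * (n - i + 1) // i
    return result

print(binomial_coefficient(5, 2))  # 10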

Fibonacci sequence: Numbers obtained by adding the two previous ones

In mathematics, the Fibonacci sequence is a sequence in which each number is the sum of the two preceding ones. Numbers that are part of the Fibonacci sequence are known as Fibonacci numbers, commonly denoted Fn. The sequence commonly starts from 0 and 1, although some authors start the sequence from 1 and 1 or sometimes from 1 and 2. Starting from 0 and 1, the sequence begins 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...
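
A minimal Python sketch of the recurrence, starting from 0 and 1 as described above; the helper name is illustrative.

def fibonacci(n):
    # Return the first n Fibonacci numbers
    sequence = [0, 1]
    while len(sequence) < n:
        sequence.append(sequence[-1] + sequence[-2])
    return sequence[:n]

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]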

Kinetic energy: Energy of a moving physical body

In physics, the kinetic energy of an object is the form of energy that it possesses due to its motion.
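
For a non-rotating rigid body of mass m moving at speed v, the translational kinetic energy is (1/2)mv². A minimal Python sketch, with illustrative values:

def kinetic_energy(mass_kg, speed_m_s):
    # Translational kinetic energy: E_k = 1/2 * m * v^2, in joules
    return 0.5 * mass_kg * speed_m_s ** 2

print(kinetic_energy(2.0, 3.0))  # 9.0 J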

Modular arithmetic: Computation modulo a fixed integer

In mathematics, modular arithmetic is a system of arithmetic for integers, where numbers "wrap around" when reaching a certain value, called the modulus. The modern approach to modular arithmetic was developed by Carl Friedrich Gauss in his book Disquisitiones Arithmeticae, published in 1801.
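
A short Python illustration of the "wrap-around" behaviour, using the familiar 12-hour clock as the modulus; the values are arbitrary.

# Clock arithmetic: hours wrap around modulo 12
print((9 + 5) % 12)  # 2, i.e. 5 hours after 9 o'clock is 2 o'clock

# Congruence check: a ≡ b (mod m) exactly when (a - b) is divisible by m
a, b, m = 38, 14, 12
print((a - b) % m == 0)  # True: 38 ≡ 14 (mod 12)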

Oscillation: Repetitive variation of some measure about a central value

Oscillation is the repetitive or periodic variation, typically in time, of some measure about a central value or between two or more different states. Familiar examples of oscillation include a swinging pendulum and alternating current. Oscillations can be used in physics to approximate complex interactions, such as those between atoms.
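
A minimal Python sketch of the simplest periodic case, simple harmonic motion about a central value of zero; the parameter values are illustrative.

import math

def oscillation(t, amplitude=1.0, angular_frequency=2 * math.pi, phase=0.0):
    # x(t) = A * cos(omega * t + phi)
    return amplitude * math.cos(angular_frequency * t + phase)

# With angular frequency 2*pi the period is 1 second
print(oscillation(0.0), oscillation(0.25), oscillation(0.5))  # 1.0, ~0.0, -1.0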

Ideal gas law: Equation of the state of a hypothetical ideal gas

The ideal gas law, also called the general gas equation, is the equation of state of a hypothetical ideal gas. It is a good approximation of the behavior of many gases under many conditions, although it has several limitations. It was first stated by Benoît Paul Émile Clapeyron in 1834 as a combination of the empirical Boyle's law, Charles's law, Avogadro's law, and Gay-Lussac's law. The ideal gas law is often written in an empirical form: PV = nRT, where P, V and T are the pressure, volume and absolute temperature, n is the amount of substance, and R is the ideal gas constant.
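
A minimal Python sketch of solving PV = nRT for the pressure; the function name and example values are illustrative.

R = 8.314  # ideal gas constant, J/(mol*K)

def ideal_gas_pressure(n_mol, temperature_k, volume_m3):
    # PV = nRT  =>  P = nRT / V, in pascals
    return n_mol * R * temperature_k / volume_m3

# 1 mol at 273.15 K in 0.0224 m^3 gives roughly 101 kPa
print(ideal_gas_pressure(1.0, 273.15, 0.0224))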

Newton (unit): Unit of force in physics

The newton (symbol: N) is the unit of force in the International System of Units (SI). It is defined as 1 kg⋅m/s², the force which gives a mass of 1 kilogram an acceleration of 1 metre per second squared. It is named after Isaac Newton in recognition of his work on classical mechanics, specifically his second law of motion.
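
The defining case can be checked with Newton's second law, F = ma; a trivial Python sketch with illustrative names:

def force_newtons(mass_kg, acceleration_m_s2):
    # F = m * a, with the result in newtons (kg*m/s^2)
    return mass_kg * acceleration_m_s2

print(force_newtons(1.0, 1.0))  # 1.0 N, the defining case of the unit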

Matrix multiplication: Mathematical operation in linear algebra

In mathematics, particularly in linear algebra, matrix multiplication is a binary operation that produces a matrix from two matrices. For matrix multiplication, the number of columns in the first matrix must be equal to the number of rows in the second matrix. The resulting matrix, known as the matrix product, has the number of rows of the first and the number of columns of the second matrix. The product of matrices A and B is denoted as AB.
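
A short Python sketch using NumPy, multiplying a 2 × 3 matrix by a 3 × 2 matrix to give a 2 × 2 product; the entries are arbitrary.

import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])     # 3 x 2

# Columns of A (3) match rows of B (3), so the product AB is defined and is 2 x 2
print(A @ B)
# [[ 58  64]
#  [139 154]]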

Speed of sound: Speed of sound wave through elastic medium

The speed of sound is the distance travelled per unit of time by a sound wave as it propagates through an elastic medium. At 20 °C (68 °F), the speed of sound in air is about 343 m/s, or one km in 2.91 s or one mile in 4.69 s. It depends strongly on temperature as well as the medium through which a sound wave is propagating. At 0 °C (32 °F), the speed of sound in air is about 331 m/s. More simply, the speed of sound is how fast vibrations travel.
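
A commonly used approximation for the temperature dependence in dry air is c ≈ 331.3 m/s × √(T / 273.15 K), with T the absolute temperature; the Python sketch below assumes that formula.

import math

def speed_of_sound_air(temperature_c):
    # Approximate speed of sound in dry air, in m/s
    return 331.3 * math.sqrt((temperature_c + 273.15) / 273.15)

print(round(speed_of_sound_air(0), 1))   # ~331.3 m/s
print(round(speed_of_sound_air(20), 1))  # ~343.2 m/s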

Moment of inertia: Scalar measure of the rotational inertia with respect to a fixed axis of rotation

The moment of inertia, otherwise known as the mass moment of inertia, angular mass, second moment of mass, or most accurately, rotational inertia, of a rigid body is a quantity that determines the torque needed for a desired angular acceleration about a rotational axis, akin to how mass determines the force needed for a desired acceleration. It depends on the body's mass distribution and the axis chosen, with larger moments requiring more torque to change the body's rate of rotation by a given amount.
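
For a collection of point masses the moment of inertia about an axis is the sum of m·r² over the masses, with r the distance from the axis; a minimal Python sketch with illustrative values:

def moment_of_inertia(point_masses):
    # Sum of m * r^2 over (mass, distance-to-axis) pairs, in kg*m^2
    return sum(m * r ** 2 for m, r in point_masses)

# Two 1 kg masses, each 0.5 m from the axis
print(moment_of_inertia([(1.0, 0.5), (1.0, 0.5)]))  # 0.5 kg*m^2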

Logistic regression: Statistical model for a binary dependent variable

In statistics, the logistic model (or logit model) is a statistical model that models the log-odds of an event as a linear combination of one or more independent variables. In regression analysis, logistic regression (or logit regression) estimates the parameters of a logistic model. Formally, in binary logistic regression there is a single binary dependent variable, coded by an indicator variable, where the two values are labeled "0" and "1", while the independent variables can each be a binary variable or a continuous variable. The corresponding probability of the value labeled "1" can vary between 0 and 1, hence the labeling; the function that converts log-odds to probability is the logistic function, hence the name. The unit of measurement for the log-odds scale is called a logit, from logistic unit, hence the alternative names.
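
A minimal Python sketch of the logistic function converting log-odds to a probability; the coefficients below stand in for a hypothetical fitted model and are illustrative only.

import math

def logistic(log_odds):
    # Maps log-odds on (-inf, inf) to a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical model with one continuous predictor x: log-odds = beta0 + beta1 * x
beta0, beta1 = -1.5, 0.8
x = 2.0
print(logistic(beta0 + beta1 * x))  # probability that the outcome is labeled "1"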

Summation: Addition of a sequence of numbers

In mathematics, summation is the addition of a sequence of numbers, called addends or summands; the result is their sum or total. Besides numbers, other types of values can be summed as well: functions, vectors, matrices, polynomials and, in general, elements of any type of mathematical objects on which an operation denoted "+" is defined.
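
A short Python illustration: numbers are summed directly, and other objects with a "+" operation (here NumPy vectors) can be summed in the same way.

import numpy as np

numbers = [1, 2, 3, 4, 5]
print(sum(numbers))  # 15

vectors = [np.array([1, 0]), np.array([0, 1]), np.array([2, 2])]
print(sum(vectors))  # [3 3]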

Michaelis–Menten kinetics: Model of enzyme kinetics

In biochemistry, Michaelis–Menten kinetics, named after Leonor Michaelis and Maud Menten, is the simplest case of enzyme kinetics, applied to enzyme-catalysed reactions of one substrate and one product. It takes the form of a differential equation describing the reaction rate v in terms of [S], the concentration of the substrate. Its formula is given by the Michaelis–Menten equation: v = V_max[S] / (K_M + [S]), where V_max is the maximum reaction rate and K_M is the Michaelis constant.
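
A minimal Python sketch of evaluating the Michaelis–Menten equation; the parameter values are illustrative.

def michaelis_menten_rate(substrate_conc, v_max, k_m):
    # v = V_max * [S] / (K_M + [S])
    return v_max * substrate_conc / (k_m + substrate_conc)

# When [S] equals K_M, the rate is half of V_max
print(michaelis_menten_rate(substrate_conc=2.0, v_max=10.0, k_m=2.0))  # 5.0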

This gallery of sovereign state flags shows the national or state flags of sovereign states that appear on the list of sovereign states. For flags of other entities, please see gallery of flags of dependent territories. Each flag is depicted as if the flagpole is positioned on the left of the flag, except for those of Iran, Iraq and Saudi Arabia which are depicted with the hoist to the right.

Lineweaver–Burk plot: Graph of enzyme kinetics

In biochemistry, the Lineweaver–Burk plot is a graphical representation of the Michaelis–Menten equation of enzyme kinetics, described by Hans Lineweaver and Dean Burk in 1934.
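
The plot uses the double-reciprocal form of the Michaelis–Menten equation, 1/v = (K_M/V_max)·(1/[S]) + 1/V_max, so the data fall on a straight line with slope K_M/V_max and intercept 1/V_max. A minimal Python sketch with illustrative values:

def lineweaver_burk_point(substrate_conc, v_max, k_m):
    # Returns (1/[S], 1/v) for one measurement
    inverse_s = 1.0 / substrate_conc
    inverse_v = (k_m / v_max) * inverse_s + 1.0 / v_max
    return inverse_s, inverse_v

print(lineweaver_burk_point(2.0, v_max=10.0, k_m=2.0))  # (0.5, 0.2)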

Temperature: Physical quantity of hot and cold

Temperature is a physical quantity that quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.
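
As one quantitative illustration of that link, the average translational kinetic energy per particle of an ideal gas is (3/2)·k_B·T; the Python sketch below assumes that relation.

BOLTZMANN_CONSTANT = 1.380649e-23  # J/K

def mean_translational_kinetic_energy(temperature_k):
    # (3/2) * k_B * T per ideal-gas particle, in joules
    return 1.5 * BOLTZMANN_CONSTANT * temperature_k

print(mean_translational_kinetic_energy(300))  # ~6.2e-21 J at room temperature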

Poisson distribution: Discrete probability distribution

In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate and independently of the time since the last event. It can also be used for the number of events in other types of intervals than time, and in dimension greater than 1.
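
For illustration, the probability of observing exactly k events when the mean rate is lambda follows the Poisson probability mass function; the Python sketch below uses only the standard library, with arbitrary example values.

from math import exp, factorial

def poisson_pmf(k, mean_rate):
    # P(X = k) = lambda^k * e^(-lambda) / k!
    return mean_rate ** k * exp(-mean_rate) / factorial(k)

# Probability of exactly 2 events in an interval where 3 are expected on average
print(poisson_pmf(2, 3.0))  # ~0.224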

Transformer (deep learning architecture): Machine learning algorithm used for natural-language processing

A transformer is a deep learning architecture based on the multi-head attention mechanism, proposed in the 2017 paper "Attention Is All You Need". It has no recurrent units and therefore requires less training time than earlier recurrent architectures such as long short-term memory (LSTM), and its later variants have been widely adopted for training large language models (LLMs) on large language datasets such as the Wikipedia corpus and Common Crawl. Text is converted into numerical representations called tokens, and each token is mapped to a vector by lookup in a word-embedding table. At each layer, every token is contextualized within the scope of the context window against the other (unmasked) tokens via a parallel multi-head attention mechanism, which amplifies the signal from key tokens and diminishes less important ones. The 2017 transformer paper builds on the softmax-based attention mechanism proposed by Bahdanau et al. in 2014 for machine translation, and on the Fast Weight Controller of 1992, which is similar in spirit to a transformer.
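
A minimal sketch of the scaled dot-product attention at the heart of multi-head attention, using toy NumPy matrices; real transformers add learned query/key/value projections, multiple heads, masking and positional information, so this is an illustration rather than a full implementation.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))  # numerically stable softmax
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Three tokens with 4-dimensional embeddings (random toy data)
rng = np.random.default_rng(0)
tokens = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(tokens, tokens, tokens))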

COVID-19: Contagious disease caused by SARS-CoV-2

Coronavirus disease 2019 (COVID-19) is a contagious disease caused by the virus SARS-CoV-2. The first known case was identified in Wuhan, China, in December 2019. The disease quickly spread worldwide, resulting in the COVID-19 pandemic.