Glicko rating system

The Glicko rating system and Glicko-2 rating system are methods of assessing a player's strength in zero-sum two-player games. The Glicko rating system was invented by Mark Glickman in 1995 as an improvement on the Elo rating system and was initially intended primarily as a chess rating system. Glickman's principal contribution to measurement is "ratings reliability", called RD, for ratings deviation.

Overview

Mark Glickman created the Glicko rating system in 1995 as an improvement on the Elo rating system.[1]

Both the Glicko and Glicko-2 rating systems are in the public domain and have been implemented on online game servers such as Pokémon Showdown, Pokémon Go,[2] Lichess, Free Internet Chess Server, Chess.com, Online Go Server (OGS),[3] Counter-Strike: Global Offensive, Quake Live, Team Fortress 2,[4] Dota 2,[5] Dota Underlords, Guild Wars 2,[6] Splatoon 2 and 3,[7] Dominion Online, and TETR.IO, as well as in competitive programming competitions.

The ratings deviation (RD) measures the accuracy of a player's rating, where the RD is equal to one standard deviation. For example, a player with a rating of 1500 and an RD of 50 has a real strength between 1400 and 1600 (two standard deviations from 1500) with 95% confidence. Twice (more precisely, 1.96 times) the RD is added to and subtracted from the rating to calculate this range. After a game, the amount the rating changes depends on the RD: the change is smaller when the player's RD is low (since their rating is already considered accurate) and when their opponent's RD is high (since the opponent's true rating is not well known, so little information is gained). The RD itself decreases after playing a game, but it increases slowly over periods of inactivity.
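
A minimal Python sketch of this interval arithmetic, using the numbers from the paragraph above (the function name is illustrative, not part of the system):

    def rating_interval(rating, rd, z=1.96):
        # 95% confidence range: rating plus or minus 1.96 standard deviations (RD)
        return rating - z * rd, rating + z * rd

    print(rating_interval(1500, 50))  # (1402.0, 1598.0)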

The Glicko-2 rating system improves upon the Glicko rating system and further introduces the rating volatility σ.[8] A very slightly modified version of the Glicko-2 rating system is implemented by the Australian Chess Federation.[9]

The algorithm of Glicko

Step 1: Determine ratings deviation

The new Ratings Deviation (RD) is found using the old Ratings Deviation (RD_0):

    RD = \min\left(\sqrt{RD_0^2 + c^2 t},\ 350\right)

where t is the amount of time (rating periods) since the last competition and 350 is assumed to be the RD of an unrated player. If several games have occurred within one rating period, the method treats them as having happened simultaneously. The rating period may be as long as several months or as short as a few minutes, according to how frequently games are arranged. The constant c is based on the uncertainty of a player's skill over a certain amount of time. It can be derived from thorough data analysis, or estimated by considering the length of time that would have to pass before a player's rating deviation would grow to that of an unrated player. If it is assumed that it would take 100 rating periods for a player's rating deviation to return to an initial uncertainty of 350, and a typical player has a rating deviation of 50, then the constant can be found by solving 350 = \sqrt{50^2 + 100 c^2} for c.[10]

Or

    c = \sqrt{(350^2 - 50^2)/100} \approx 34.6
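
A short Python sketch of this step, using the 100-period / RD-50 example above (function and variable names are illustrative):

    import math

    def pre_period_rd(old_rd, t, c):
        # Inflate RD for t rating periods of inactivity, capped at 350 (the RD of an unrated player).
        return min(math.sqrt(old_rd**2 + c**2 * t), 350)

    c = math.sqrt((350**2 - 50**2) / 100)  # about 34.64
    print(pre_period_rd(50, 1, c))          # RD after one inactive rating period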

Step 2: Determine new rating

The new rating r', after a series of m games, is determined by the following equation:

    r' = r + \frac{q}{\frac{1}{RD^2} + \frac{1}{d^2}} \sum_{i=1}^{m} g(RD_i)\left(s_i - E(s \mid r, r_i, RD_i)\right)

where r is the player's pre-period rating, RD is the ratings deviation obtained in Step 1, and:

    g(RD_i) = \frac{1}{\sqrt{1 + \frac{3 q^2 (RD_i)^2}{\pi^2}}}

    E(s \mid r, r_i, RD_i) = \frac{1}{1 + 10^{-g(RD_i)(r - r_i)/400}}

    q = \frac{\ln 10}{400} \approx 0.00575646

    d^2 = \left(q^2 \sum_{i=1}^{m} g(RD_i)^2\, E(s \mid r, r_i, RD_i)\left(1 - E(s \mid r, r_i, RD_i)\right)\right)^{-1}

r_1, ..., r_m represent the ratings of the individual opponents.

RD_1, ..., RD_m represent the rating deviations of the individual opponents.

s_1, ..., s_m represent the outcomes of the individual games. A win is 1, a draw is 1/2, and a loss is 0.
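
A self-contained Python sketch of this update, following the formulas above (the names are illustrative; opponents are given as (rating, RD, score) tuples):

    import math

    Q = math.log(10) / 400  # the constant q

    def g(rd):
        return 1 / math.sqrt(1 + 3 * Q**2 * rd**2 / math.pi**2)

    def expected(r, r_i, rd_i):
        return 1 / (1 + 10 ** (-g(rd_i) * (r - r_i) / 400))

    def glicko_new_rating(r, rd, opponents):
        # opponents: list of (r_i, rd_i, s_i) with s_i in {1, 0.5, 0}
        d_sq = 1 / (Q**2 * sum(g(rd_i)**2 * expected(r, r_i, rd_i) * (1 - expected(r, r_i, rd_i))
                               for r_i, rd_i, _ in opponents))
        total = sum(g(rd_i) * (s_i - expected(r, r_i, rd_i)) for r_i, rd_i, s_i in opponents)
        new_r = r + Q / (1 / rd**2 + 1 / d_sq) * total
        return new_r, d_sq

With the worked example used in Glickman's papers, glicko_new_rating(1500, 200, [(1400, 30, 1), (1550, 100, 0), (1700, 300, 0)]) yields a new rating of roughly 1464.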

Step 3: Determine new ratings deviation

The function of the prior RD calculation was to increase the RD appropriately to account for the increasing uncertainty in a player's skill level during a period of non-observation by the model. Now, the RD is updated (decreased) after the series of games:

    RD' = \sqrt{\left(\frac{1}{RD^2} + \frac{1}{d^2}\right)^{-1}}
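
Continuing the sketch from Step 2 (again, the names are illustrative), the post-period RD follows directly:

    import math

    def glicko_new_rd(rd, d_sq):
        # rd is the Step 1 ratings deviation, d_sq the quantity d^2 from Step 2.
        return math.sqrt(1 / (1 / rd**2 + 1 / d_sq))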

Glicko-2 algorithm

Glicko-2 works in a similar way to the original Glicko algorithm, with the addition of a rating volatility which measures the degree of expected fluctuation in a player’s rating, based on how erratic the player's performances are. For instance, a player's rating volatility would be low when they performed at a consistent level, and would increase if they had exceptionally strong results after that period of consistency. A simplified explanation of the Glicko-2 algorithm is presented below: [8]

Step 1: Compute ancillary quantities

Across one rating period, a player with a current rating μ and ratings deviation φ plays against m opponents, with ratings μ_1, ..., μ_m and RDs φ_1, ..., φ_m, resulting in scores s_1, ..., s_m. We first need to compute the ancillary quantities v and Δ:

    v = \left(\sum_{j=1}^{m} g(\phi_j)^2\, E(\mu, \mu_j, \phi_j)\left(1 - E(\mu, \mu_j, \phi_j)\right)\right)^{-1}

    \Delta = v \sum_{j=1}^{m} g(\phi_j)\left(s_j - E(\mu, \mu_j, \phi_j)\right)

where

    g(\phi_j) = \frac{1}{\sqrt{1 + 3\phi_j^2/\pi^2}}

    E(\mu, \mu_j, \phi_j) = \frac{1}{1 + \exp\left(-g(\phi_j)(\mu - \mu_j)\right)}
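
A Python sketch of these ancillary quantities, assuming μ and φ values already on the Glicko-2 scale (the helper names are illustrative):

    import math

    def g2(phi):
        return 1 / math.sqrt(1 + 3 * phi**2 / math.pi**2)

    def e2(mu, mu_j, phi_j):
        return 1 / (1 + math.exp(-g2(phi_j) * (mu - mu_j)))

    def ancillary(mu, opponents):
        # opponents: list of (mu_j, phi_j, s_j) tuples
        v = 1 / sum(g2(p)**2 * e2(mu, m, p) * (1 - e2(mu, m, p)) for m, p, _ in opponents)
        delta = v * sum(g2(p) * (s - e2(mu, m, p)) for m, p, s in opponents)
        return v, delta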

Step 2: Determine new rating volatility

We then need to choose a small constant τ which constrains the volatility over time (smaller values of τ prevent dramatic rating changes after upset results). Then, for

    f(x) = \frac{1}{2} \cdot \frac{e^x\left(\Delta^2 - \phi^2 - v - e^x\right)}{\left(\phi^2 + v + e^x\right)^2} - \frac{x - \ln(\sigma^2)}{\tau^2}

we need to find the value A which satisfies f(A) = 0. An efficient way of solving this would be to use the Illinois algorithm, a modified version of the regula falsi procedure (see Regula falsi § The Illinois algorithm for details on how this would be done). Once this iterative procedure is complete, we set the new rating volatility σ' as

    \sigma' = \exp(A/2)
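
A Python sketch of this root-finding step, following the Illinois-style iteration laid out in Glickman's Glicko-2 example document[8] (parameter names are mine; eps is an assumed convergence tolerance):

    import math

    def new_volatility(sigma, phi, v, delta, tau, eps=1e-6):
        a = math.log(sigma**2)

        def f(x):
            ex = math.exp(x)
            return (ex * (delta**2 - phi**2 - v - ex)) / (2 * (phi**2 + v + ex)**2) - (x - a) / tau**2

        # Initial bracketing interval [A, B]
        A = a
        if delta**2 > phi**2 + v:
            B = math.log(delta**2 - phi**2 - v)
        else:
            k = 1
            while f(a - k * tau) < 0:
                k += 1
            B = a - k * tau
        fA, fB = f(A), f(B)
        # Illinois (modified regula falsi) iteration
        while abs(B - A) > eps:
            C = A + (A - B) * fA / (fB - fA)
            fC = f(C)
            if fC * fB <= 0:
                A, fA = B, fB
            else:
                fA /= 2
            B, fB = C, fC
        return math.exp(A / 2)  # the new volatility sigma'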

Step 3: Determine new ratings deviation and rating

We then get the new RD

    \phi' = \frac{1}{\sqrt{\dfrac{1}{\phi^2 + \sigma'^2} + \dfrac{1}{v}}}

and new rating

    \mu' = \mu + \phi'^2 \sum_{j=1}^{m} g(\phi_j)\left(s_j - E(\mu, \mu_j, \phi_j)\right)

These ratings and RDs are on a different scale than in the original Glicko algorithm, and would need to be converted to properly compare the two. [8]
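
A final Python sketch for this step, including the conversion back to the original Glicko scale described in Glickman's example document[8] (function names are illustrative; g2 and e2 repeat the helpers from the Step 1 sketch):

    import math

    def g2(phi):
        return 1 / math.sqrt(1 + 3 * phi**2 / math.pi**2)

    def e2(mu, mu_j, phi_j):
        return 1 / (1 + math.exp(-g2(phi_j) * (mu - mu_j)))

    def glicko2_update(mu, phi, sigma_new, v, opponents):
        # opponents: list of (mu_j, phi_j, s_j) on the Glicko-2 scale
        phi_new = 1 / math.sqrt(1 / (phi**2 + sigma_new**2) + 1 / v)
        mu_new = mu + phi_new**2 * sum(g2(p) * (s - e2(mu, m, p)) for m, p, s in opponents)
        return mu_new, phi_new

    def to_glicko_scale(mu, phi):
        # Convert a Glicko-2 scale rating and RD back to the familiar 1500-centred scale.
        return 173.7178 * mu + 1500, 173.7178 * phi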


References

  1. Glickman, Mark. "The Glicko System" (PDF). Retrieved October 13, 2022.
  2. "Farming Volatility: How a major flaw in a well-known rating system takes over the GBL leaderboard". 23 July 2020. Retrieved 12 December 2022.
  3. "OGS has a new Glicko-2 based rating system!". 7 August 2017. Retrieved 19 April 2020.
  4. Valve. "Team Fortress 2 Update Released". Retrieved 29 June 2021.
  5. "The New Frontiers Update - Gameplay Update 7.33". Retrieved 20 April 2023.
  6. O'Dell, Justin. "Finding the perfect match". Retrieved 16 January 2015.
  7. OatmealDome. "An In-Depth Look at the Splatoon 2 Ranking System". oatmealdome.me. Retrieved 16 June 2021.
  8. Glickman, Mark E. (November 30, 2013). "Example of the Glicko-2 system" (PDF). Glicko.net. Retrieved January 27, 2020.
  9. "Australian Chess Federation Ratings By-Law" (PDF). Retrieved 17 January 2019.
  10. "Welcome to Glicko ratings".