Proportional-fair scheduling

Proportional-fair scheduling is a compromise-based scheduling algorithm. It maintains a balance between two competing interests: maximizing the total throughput of the network (wired or wireless) while allowing all users at least a minimal level of service. This is done by assigning each data flow a data rate or a scheduling priority (depending on the implementation) that is inversely proportional to its anticipated resource consumption. [1] [2]

Weighted fair queuing

Proportionally fair scheduling can be achieved by means of weighted fair queuing (WFQ), by setting the scheduling weight of data flow i to w_i = 1/c_i, where the cost c_i is the amount of consumed resources per data bit. In a wireless network with link adaptation, for instance, the cost may be the time required to transmit a certain number of bits; in a CDMA network it may be the required energy per bit.
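
As a minimal illustration (not drawn from the cited references), the following Python sketch derives normalized WFQ capacity shares from per-bit costs; the flow names and cost values are hypothetical:

    # Hypothetical per-bit costs c_i (resources consumed per data bit) for three flows.
    costs = {"flow_a": 0.5, "flow_b": 2.0, "flow_c": 4.0}

    # Proportional fairness via WFQ: weight each flow inversely to its cost, w_i = 1 / c_i.
    weights = {flow: 1.0 / c for flow, c in costs.items()}

    # Normalize the weights into capacity shares that sum to 1.
    total = sum(weights.values())
    shares = {flow: w / total for flow, w in weights.items()}

    print(shares)  # the cheapest flow (flow_a) receives the largest share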

User prioritization

Another way to schedule data transfer that leads to similar results is through the use of prioritization coefficients. [3] Here we schedule the channel for the station that has the maximum of the priority function

    P = T^α / R^β

where T denotes the data rate potentially achievable for the station in the current time slot, R is the historical average data rate of this station, and the exponents α and β tune the fairness of the scheduler.

By adjusting α and β in the formula above, we are able to adjust the balance between serving the best mobiles (the ones in the best channel conditions) more often and serving the costly mobiles often enough that they have an acceptable level of performance.

In the extreme case (α = 0 and β = 1) the scheduler serves the mobiles one after the other in a "packet" round-robin fashion (but not equally often in time), with no regard for resource consumption, so that each user gets the same amount of data. The (α = 0 and β = 1) scheduler could be called a "maximum fairness" scheduler (used, for example, to provide equal throughput to voice users). If α = 1 and β = 0, the scheduler always serves the mobile with the best channel conditions. This maximizes the throughput of the channel, while stations with low achievable rates are not served at all. The (α = 1 and β = 0) scheduler could be called a "max rate" scheduler. [2] Using α ≈ 1 and β ≈ 1 yields the proportional-fair scheduling algorithm used in 3G networks. [3] The (α = 1 and β = 1) scheduler can be implemented by providing the same amount of time and spectrum to each user, irrespective of the desired packet size, channel quality and data rate (MCS) used. The proportional fair (α = 1 and β = 1) scheduler could therefore be called an "equal effort" or "time/spectrum round-robin" scheduler.
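
As an illustrative sketch (not taken from the cited references), the following Python function selects the station to serve according to P = T^α / R^β for a few settings of α and β; the station names and rate values are hypothetical:

    # Hypothetical instantaneous achievable rates T and historical average rates R (Mbit/s).
    T = {"station_1": 1.2, "station_2": 5.0, "station_3": 0.4}
    R = {"station_1": 0.8, "station_2": 4.5, "station_3": 0.3}

    def pick_station(T, R, alpha, beta):
        # Serve the station maximizing the priority P = T^alpha / R^beta.
        return max(T, key=lambda s: T[s] ** alpha / R[s] ** beta)

    print(pick_station(T, R, alpha=1, beta=0))  # "max rate": best instantaneous channel
    print(pick_station(T, R, alpha=0, beta=1))  # "maximum fairness": lowest average rate so far
    print(pick_station(T, R, alpha=1, beta=1))  # proportional fair: best rate relative to own average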

This technique can be further parametrized by a "memory constant" that determines the period of time over which the station data rate R used in the priority function is averaged. A larger constant generally improves overall throughput at the expense of reduced short-term fairness.
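
One common way to realize such a memory constant is an exponential moving average of each station's served rate; the following sketch assumes that approach (it is not prescribed by the references), with the time constant t_c expressed in scheduling slots:

    def update_average_rate(R, served_rate, t_c=1000):
        # Exponential moving average with memory constant t_c (in time slots).
        # Stations not served in this slot contribute served_rate = 0.
        return (1.0 - 1.0 / t_c) * R + (1.0 / t_c) * served_rate

    R = 0.0
    for slot_rate in [5.0, 0.0, 0.0, 3.0]:   # rates actually delivered in successive slots
        R = update_average_rate(R, slot_rate, t_c=4)

With a large t_c, R changes slowly, so recently well-served stations keep a high R and the scheduler leans toward total throughput; a small t_c reacts quickly and enforces fairness over shorter windows.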

See also

Weighted fair queueing
Max-min fairness
Maximum throughput scheduling
Fairness measure
Radio resource management
Network congestion

References

  1. Kushner, H. J.; Whiting, P. A. (July 2004), "Convergence of proportional-fair sharing algorithms under general conditions", IEEE Transactions on Wireless Communications, 3 (4): 1250–1259, CiteSeerX 10.1.1.8.6408, doi:10.1109/TWC.2004.830826, S2CID 6780351.
  2. Guowang Miao, Jens Zander, Ki Won Sung, and Ben Slimane, Fundamentals of Mobile Data Networks, Cambridge University Press, ISBN 1107143217, 2016.
  3. Ji Yang; Zhang Yifan; Wang Ying; Zhang Ping (2004), "Average rate updating mechanism in proportional fair scheduler for HDR", IEEE Global Telecommunications Conference, 2004. GLOBECOM '04, vol. 6, pp. 3464–3466, doi:10.1109/GLOCOM.2004.1379010, ISBN 0-7803-8794-5.