Truthful job scheduling

Truthful job scheduling is a mechanism design variant of the job shop scheduling problem from operations research.

We have a project composed of several "jobs" (tasks). There are several workers. Each worker can do any job, but for each worker it takes a different amount of time to complete each job. Our goal is to allocate jobs to workers such that the makespan of the project is minimized. In the standard job shop scheduling problem, the timings of all workers are known, so we have a standard optimization problem. In contrast, in the truthful job scheduling problem, the timings of the workers are not known. We ask each worker how much time he needs to do each job, but the workers might lie to us. Therefore, we have to give the workers an incentive to tell us their true timings by paying them a certain amount of money. The challenge is to design a payment mechanism which is incentive compatible.

The truthful job scheduling problem was introduced by Nisan and Ronen in their 1999 paper on algorithmic mechanism design. [1]

Definitions

There are $n$ jobs and $m$ workers ("m" stands for "machine", since the problem comes from scheduling jobs to computers). Worker $i$ can do job $j$ in time $t_{i,j}$. If worker $i$ is assigned a set of jobs $J_i$, then he can execute them in time:

$$\sum_{j \in J_i} t_{i,j}$$

Given an allocation $J_1, \dots, J_m$ of jobs to workers, the makespan of the project is:

$$\max_{i} \sum_{j \in J_i} t_{i,j}$$

An optimal allocation is an allocation of jobs to workers in which the makespan is minimized. The minimum makespan is denoted by $OPT$.
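As a minimal illustration of these definitions, the following sketch computes the makespan of a given allocation; the timing matrix and the allocation are hypothetical.

```python
# times[i][j] is the time worker i needs for job j (hypothetical values).
times = [
    [2, 3, 5],   # worker 0
    [4, 1, 6],   # worker 1
]
allocation = [{0, 2}, {1}]   # worker 0 gets jobs 0 and 2; worker 1 gets job 1

def makespan(times, allocation):
    """The makespan is the maximum, over all workers, of the total time
    a worker spends on the jobs assigned to them."""
    return max(sum(times[i][j] for j in jobs)
               for i, jobs in enumerate(allocation))

print(makespan(times, allocation))   # max(2 + 5, 1) -> 7
```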

A mechanism is a function that takes as input the matrix $t = (t_{i,j})$ (the time each worker needs to do each job) and returns as output: an allocation $J_1, \dots, J_m$ of jobs to workers, and a payment $p_i$ to each worker $i$.

The utility of worker $i$, under such a mechanism, is:

$$u_i = p_i - \sum_{j \in J_i} t_{i,j}$$

I.e., the agent gains the payment, but loses the time that he spends in executing the tasks. Note that payment and time are measured in the same units (e.g., we can assume that the payments are in dollars and that each time-unit costs the worker one dollar).

A mechanism is called truthful (or incentive compatible) if every worker can attain a maximum utility by reporting his true timing vector (i.e., no worker has an incentive to lie about his timings).

The approximation factor of a mechanism is the largest ratio, over all input matrices, between the makespan of the allocation returned by the mechanism and $OPT$ (smaller is better; an approximation factor of 1 means that the mechanism always returns an optimal allocation).

The research on truthful job scheduling aims to find upper (positive) and lower (negative) bounds on approximation factors of truthful mechanisms.

Positive bound – m – VCG mechanism

The first solution that comes to mind is the VCG mechanism, which is a generic truthful mechanism. A VCG mechanism can be used to minimize the sum of costs. Here, we can use VCG to find an allocation which minimizes the "make-total", defined as:

$$\sum_{i=1}^{m} \sum_{j \in J_i} t_{i,j}$$

Here, minimizing the sum can be done by simply allocating each job to the worker who needs the shortest time for that job. To keep the mechanism truthful, each worker that accepts a job is paid the second-shortest time for that job (like in a Vickrey auction).
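The allocation and Vickrey-style payments just described can be sketched as follows; the timing matrix is hypothetical.

```python
# Sketch of the VCG mechanism for the make-total objective: each job goes
# to the worker with the shortest time for it, and that worker is paid the
# second-shortest time for the job (as in a Vickrey auction).
def vcg_schedule(times):
    m = len(times)       # number of workers
    n = len(times[0])    # number of jobs
    allocation = [set() for _ in range(m)]
    payments = [0.0] * m
    for j in range(n):
        # Workers sorted by their time for job j, fastest first.
        order = sorted(range(m), key=lambda i: times[i][j])
        winner, runner_up = order[0], order[1]
        allocation[winner].add(j)
        payments[winner] += times[runner_up][j]   # Vickrey payment
    return allocation, payments

times = [[2, 3],
         [4, 1]]
alloc, pay = vcg_schedule(times)
# Job 0 -> worker 0 (paid 4); job 1 -> worker 1 (paid 3).
```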

Let $OPT$ be an allocation which minimizes the makespan. Then:

$$\text{makespan}(VCG) \leq \text{maketotal}(VCG) \leq \text{maketotal}(OPT) \leq m \cdot \text{makespan}(OPT)$$

(the first inequality holds since the makespan is the largest of the workers' times while the make-total is their sum; the second holds since VCG minimizes the make-total; the last follows from the pigeonhole principle, since the make-total is a sum of $m$ workers' times, each of which is at most the makespan). Hence, the approximation factor of the VCG solution is at most $m$ – the number of workers.

The following example shows that the approximation factor of the VCG solution can indeed be exactly $m$. Suppose there are $m$ jobs, worker 1 needs time 1 for every job, and each of the other $m-1$ workers needs time $1+\epsilon$ for every job, for some small $\epsilon > 0$.

Then, the VCG mechanism allocates all tasks to worker 1. Both the "make-total" and the makespan are $m$. But, when each job is assigned to a different worker, the makespan is $1+\epsilon$, so the ratio approaches $m$ as $\epsilon \to 0$.
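A small numerical check of this example, using the hypothetical values $m = 4$ and $\epsilon = 0.01$:

```python
# Worker 1 (row 0) needs time 1 for every job; every other worker needs
# time 1 + eps. Both m and eps are hypothetical.
m = 4
eps = 0.01
times = [[1.0] * m] + [[1.0 + eps] * m for _ in range(m - 1)]

# VCG minimizes the make-total, so every job goes to worker 1 (row 0):
vcg_makespan = sum(times[0])                  # m jobs of time 1 each -> 4.0

# Assigning each job to a distinct worker instead:
balanced_makespan = max(times[i][i] for i in range(m))   # 1 + eps = 1.01

print(vcg_makespan / balanced_makespan)       # close to m
```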

An approximation factor of $m$ is not very good, and many researchers have tried to improve it in the years since.

On the other hand, there are some impossibility results that prove that the approximation factor cannot be too small.

Negative bound – 2

The approximation factor of every truthful deterministic mechanism is at least 2.[1]: 177–

The proof is typical of lower bounds in mechanism design. We check specific scenarios (in our case, specific timings of the workers). By truthfulness, when a single worker changes his declaration, he must not be able to gain from it. This induces some constraints on the allocations returned by the mechanism in the different scenarios.

In the following proof sketch, for simplicity we assume that there are 2 workers and that the number of jobs, $n$, is even. We consider the following scenarios:

  1. The timings of both workers for all jobs are 1. Since the mechanism is deterministic, it must return a unique allocation $(J_1, J_2)$. Suppose, without loss of generality, that $|J_1| \leq |J_2|$ (worker 1 is assigned at most as many jobs as worker 2).
  2. The timings of worker 1 for the jobs in $J_1$ are $\epsilon$ (a very small positive constant); the timings of worker 1 for the jobs in $J_2$ are 1; and the timings of worker 2 for all jobs are still 1. Worker 1 knows that, if he lies and says that his timings for all jobs are 1, the (deterministic) mechanism will allocate him the jobs in $J_1$ and his cost will be very near 0. In order to remain truthful, the mechanism must do the same here, so that worker 1 does not gain from lying. However, the makespan can then be made roughly half as large by dividing the jobs in $J_2$ equally between the agents.

Hence, the approximation factor of the mechanism must be at least 2.
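The two scenarios in the proof sketch can be checked numerically. The values below are hypothetical: 4 jobs, $\epsilon = 0.001$, and scenario 1 assumed to return $J_1 = \{0, 1\}$, $J_2 = \{2, 3\}$.

```python
eps = 0.001
J1, J2 = {0, 1}, {2, 3}

# Scenario 2: worker 1 needs eps for jobs in J1 and 1 for jobs in J2;
# worker 2 still needs 1 for every job.
t1 = {j: (eps if j in J1 else 1.0) for j in J1 | J2}

# A truthful mechanism keeps the allocation (J1, J2):
makespan_truthful = max(sum(t1[j] for j in J1), 1.0 * len(J2))  # 2.0

# Splitting J2 equally between the two workers roughly halves it:
half = len(J2) // 2
makespan_split = max(sum(t1[j] for j in J1) + 1.0 * half,       # worker 1
                     1.0 * (len(J2) - half))                    # worker 2

print(makespan_truthful / makespan_split)   # approaches 2 as eps -> 0
```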

Monotonicity and truthfulness

Consider the special case of uniform-machines scheduling, in which the workers are single-parametric: each worker has a speed, and the time it takes the worker to do a job is the job's length divided by the worker's speed. The speed is the worker's private information, and we want to incentivize workers to reveal their true speeds. Archer and Tardos [2] prove that a scheduling algorithm in this setting is truthful if and only if it is monotone. This means that, if a worker reports a higher speed, and all other inputs remain the same, then the total processing time allocated to that worker weakly increases. For this problem, several deterministic monotone (and therefore truthful) approximation algorithms have been developed, [3] [4] [5] including a fast monotone 3-approximation algorithm by Kovács. [6]
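The monotonicity condition can be illustrated numerically. The sketch below only verifies that, for these particular (hypothetical) job lengths and speeds, the work assigned to a machine by a brute-force min-makespan schedule with a fixed tie-breaking rule weakly increases with the machine's reported speed; it is an illustration, not a general proof.

```python
from itertools import product

jobs = [3.0, 2.0, 2.0, 1.0]   # job lengths (hypothetical)

def work_on_machine_0(speeds):
    """Brute-force a min-makespan schedule and return the total length of
    the jobs assigned to machine 0. Ties in makespan are broken by
    preferring more work on machine 0 (a fixed, consistent rule)."""
    best_key, best_work = None, None
    for assignment in product(range(len(speeds)), repeat=len(jobs)):
        work = [0.0] * len(speeds)
        for length, machine in zip(jobs, assignment):
            work[machine] += length
        makespan = max(w / s for w, s in zip(work, speeds))
        key = (makespan, -work[0])
        if best_key is None or key < best_key:
            best_key, best_work = key, work[0]
    return best_work

# As machine 0 reports a higher speed, its allocated work never decreases:
loads = [work_on_machine_0([s, 1.0]) for s in (0.5, 1.0, 2.0, 4.0)]
print(loads)   # [3.0, 4.0, 6.0, 7.0]
assert all(a <= b for a, b in zip(loads, loads[1:]))
```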

Related Research Articles

<span class="mw-page-title-main">Mechanism design</span> Field in game theory

Mechanism design is a field in economics and game theory that takes an objectives-first approach to designing economic mechanisms or incentives, toward desired objectives, in strategic settings, where players act rationally. Because it starts at the end of the game, then goes backwards, it is also called reverse game theory. It has broad applications, from economics and politics in fields such as market design, auction theory and social choice theory to networked-systems.

In Mechanism design, a strategyproof (SP) mechanism is a game in which each player has a weakly-dominant strategy, so that no player can gain by "spying" over the other players to know what they are going to play. When the players have private information, and the strategy space of each player consists of the possible information values, a truthful mechanism is a game in which revealing the true information is a weakly-dominant strategy for each player. i.e. given no information about what the others do, you fare best or at least not worse by being truthful. An SP mechanism is also called dominant-strategy-incentive-compatible (DSIC), to distinguish it from other kinds of incentive compatibility.

List scheduling is a greedy algorithm for Identical-machines scheduling. The input to this algorithm is a list of jobs that should be executed on a set of m machines. The list is ordered in a fixed order, which can be determined e.g. by the priority of executing the jobs, or by their order of arrival. The algorithm repeatedly executes the following steps until a valid schedule is obtained:

Uniform machine scheduling is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. We are given n jobs J1, J2, ..., Jn of varying processing times, which need to be scheduled on m different machines. The goal is to minimize the makespan - the total time required to execute the schedule. The time that machine i needs in order to process job j is denoted by pi,j. In the general case, the times pi,j are unrelated, and any matrix of positive processing times is possible. In the specific variant called uniform machine scheduling, some machines are uniformly faster than others. This means that, for each machine i, there is a speed factor si, and the run-time of job j on machine i is pi,j = pj / si.

Single-machine scheduling or single-resource scheduling is an optimization problem in computer science and operations research. We are given n jobs J1, J2, ..., Jn of varying processing times, which need to be scheduled on a single machine, in a way that optimizes a certain objective, such as the throughput.

Job-shop scheduling, the job-shop problem (JSP) or job-shop scheduling problem (JSSP) is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. In a general job scheduling problem, we are given n jobs J1J2, ..., Jn of varying processing times, which need to be scheduled on m machines with varying processing power, while trying to minimize the makespan – the total length of the schedule. In the specific variant known as job-shop scheduling, each job consists of a set of operationsO1O2, ..., On which need to be processed in a specific order. Each operation has a specific machine that it needs to be processed on and only one operation in a job can be processed at a given time. A common relaxation is the flexible job shop, where each operation can be processed on any machine of a given set.

<span class="mw-page-title-main">David Shmoys</span> American mathematician

David Bernard Shmoys is a Professor in the School of Operations Research and Information Engineering and the Department of Computer Science at Cornell University. He obtained his Ph.D. from the University of California, Berkeley in 1984. His major focus has been in the design and analysis of algorithms for discrete optimization problems.

In mechanism design, a Vickrey–Clarke–Groves (VCG) mechanism is a generic truthful mechanism for achieving a socially-optimal solution. It is a generalization of a Vickrey–Clarke–Groves auction. A VCG auction performs a specific task: dividing items among people. A VCG mechanism is more general: it can be used to select any outcome out of a set of possible outcomes.

A random-sampling mechanism (RSM) is a truthful mechanism that uses sampling in order to achieve approximately-optimal gain in prior-free mechanisms and prior-independent mechanisms.

A Prior-independent mechanism (PIM) is a mechanism in which the designer knows that the agents' valuations are drawn from some probability distribution, but does not know the distribution.

Truthful cake-cutting is the study of algorithms for fair cake-cutting that are also truthful mechanisms, i.e., they incentivize the participants to reveal their true valuations to the various parts of the cake.

Maximin share (MMS) is a criterion of fair item allocation. Given a set of items with different values, the 1-out-of-n maximin-share is the maximum value that can be gained by partitioning the items into parts and taking the part with the minimum value. An allocation of items among agents with different valuations is called MMS-fair if each agent gets a bundle that is at least as good as his/her 1-out-of-n maximin-share. MMS fairness is a relaxation of the criterion of proportionality - each agent gets a bundle that is at least as good as the equal split ( of every resource). Proportionality can be guaranteed when the items are divisible, but not when they are indivisible, even if all agents have identical valuations. In contrast, MMS fairness can always be guaranteed to identical agents, so it is a natural alternative to proportionality even when the agents are different.

Parallel task scheduling is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. In a general job scheduling problem, we are given n jobs J1J2, ..., Jn of varying processing times, which need to be scheduled on m machines while trying to minimize the makespan - the total length of the schedule. In the specific variant known as parallel-task scheduling, all machines are identical. Each job j has a length parameter pj and a size parameter qj, and it must run for exactly pj time-steps on exactly qj machines in parallel.

In computer science, multiway number partitioning is the problem of partitioning a multiset of numbers into a fixed number of subsets, such that the sums of the subsets are as similar as possible. It was first presented by Ronald Graham in 1969 in the context of the identical-machines scheduling problem. The problem is parametrized by a positive integer k, and called k-way number partitioning. The input to the problem is a multiset S of numbers, whose sum is k*T.

Egalitarian item allocation, also called max-min item allocation is a fair item allocation problem, in which the fairness criterion follows the egalitarian rule. The goal is to maximize the minimum value of an agent. That is, among all possible allocations, the goal is to find an allocation in which the smallest value of an agent is as large as possible. In case there are two or more allocations with the same smallest value, then the goal is to select, from among these allocations, the one in which the second-smallest value is as large as possible, and so on. Therefore, an egalitarian item allocation is sometimes called a leximin item allocation.

Identical-machines scheduling is an optimization problem in computer science and operations research. We are given n jobs J1, J2, ..., Jn of varying processing times, which need to be scheduled on m identical machines, such that a certain objective function is optimized, for example, the makespan is minimized.

Unrelated-machines scheduling is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. We need to schedule n jobs J1, J2, ..., Jn on m different machines, such that a certain objective function is optimized. The time that machine i needs in order to process job j is denoted by pi,j. The term unrelated emphasizes that there is no relation between values of pi,j for different i and j. This is in contrast to two special cases of this problem: uniform-machines scheduling - in which pi,j = pi / sj, and identical-machines scheduling - in which pi,j = pi.

The welfare maximization problem is an optimization problem studied in economics and computer science. Its goal is to partition a set of items among agents with different utility functions, such that the welfare – defined as the sum of the agents' utilities – is as high as possible. In other words, the goal is to find an item allocation satisfying the utilitarian rule.

Fractional job scheduling is a variant of optimal job scheduling in which it is allowed to break jobs into parts and process each part separately on the same or a different machine. Breaking jobs into parts may allow for improving the overall performance, for example, decreasing the makespan. Moreover, the computational problem of finding an optimal schedule may become easier, as some of the optimization variables become continuous. On the other hand, breaking jobs apart might be costly.

<span class="mw-page-title-main">Knapsack auction</span>

A knapsack auction is an auction in which several identical items are sold, and there are several bidders with different valuations interested in different amounts of items. The goal is to choose a subset of the bidders with a total demand, at most, the number of items and, subject to that, a maximum total value. Finding this set of bidders requires solving an instance of the knapsack problem, which explains the term "knapsack auction".

References

  1. Nisan, Noam; Ronen, Amir (2001). "Algorithmic Mechanism Design". Games and Economic Behavior. 35 (1–2): 166–196. CiteSeerX 10.1.1.16.7473. doi:10.1006/game.1999.0790.
  2. Archer, A.; Tardos, E. (2001). "Truthful mechanisms for one-parameter agents". Proceedings 42nd IEEE Symposium on Foundations of Computer Science. pp. 482–491. doi:10.1109/SFCS.2001.959924. ISBN 0-7695-1390-5. S2CID 11377808.
  3. Auletta, Vincenzo; De Prisco, Roberto; Penna, Paolo; Persiano, Giuseppe (2004). "Deterministic Truthful Approximation Mechanisms for Scheduling Related Machines". In Diekert, Volker; Habib, Michel (eds.). STACS 2004. Lecture Notes in Computer Science. Vol. 2996. Berlin, Heidelberg: Springer. pp. 608–619. doi:10.1007/978-3-540-24749-4_53. ISBN 978-3-540-24749-4.
  4. Ambrosio, Pasquale; Auletta, Vincenzo (2005). "Deterministic Monotone Algorithms for Scheduling on Related Machines". In Persiano, Giuseppe; Solis-Oba, Roberto (eds.). Approximation and Online Algorithms. Lecture Notes in Computer Science. Vol. 3351. Berlin, Heidelberg: Springer. pp. 267–280. doi:10.1007/978-3-540-31833-0_22. ISBN 978-3-540-31833-0.
  5. Andelman, Nir; Azar, Yossi; Sorani, Motti (2005). "Truthful Approximation Mechanisms for Scheduling Selfish Related Machines". In Diekert, Volker; Durand, Bruno (eds.). STACS 2005. Lecture Notes in Computer Science. Vol. 3404. Berlin, Heidelberg: Springer. pp. 69–82. doi:10.1007/978-3-540-31856-9_6. ISBN 978-3-540-31856-9.
  6. Kovács, Annamária (2005). "Fast Monotone 3-Approximation Algorithm for Scheduling Related Machines". In Brodal, Gerth Stølting; Leonardi, Stefano (eds.). Algorithms – ESA 2005. Lecture Notes in Computer Science. Vol. 3669. Berlin, Heidelberg: Springer. pp. 616–627. doi:10.1007/11561071_55. ISBN 978-3-540-31951-1.