Bitonic tour

A bitonic tour

In computational geometry, a bitonic tour of a set of point sites in the Euclidean plane is a closed polygonal chain that has each site as one of its vertices, such that any vertical line crosses the chain at most twice.

Computational geometry is a branch of computer science devoted to the study of algorithms which can be stated in terms of geometry. Some purely geometrical problems arise out of the study of computational geometric algorithms, and such problems are also considered to be part of computational geometry. While modern computational geometry is a recent development, it is one of the oldest fields of computing with history stretching back to antiquity.

Point (geometry): fundamental object of geometry; a locus within which we can distinguish no other locus than itself

In modern mathematics, a point usually refers to an element of some set called a space.

Line (geometry): straight object with negligible width and depth

The notion of line or straight line was introduced by ancient mathematicians to represent straight objects with negligible width and depth. Lines are an idealization of such objects. Until the 17th century, lines were defined as the "[…] first species of quantity, which has only one dimension, namely length, without any width nor depth, and is nothing else than the flow or run of the point which […] will leave from its imaginary moving some vestige in length, exempt of any width. […] The straight line is that which is equally extended between its points."


Optimal bitonic tours

The optimal bitonic tour is a bitonic tour of minimum total length. It is a standard exercise in dynamic programming to devise a polynomial-time algorithm that constructs the optimal bitonic tour. [1] [2]
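To make the exercise concrete, the following is a minimal sketch of the standard quadratic-time dynamic program, written in Python with illustrative names (it is not drawn from the cited texts). It assumes the sites are given as (x, y) pairs with pairwise distinct x-coordinates and returns only the length of the optimal bitonic tour, not the tour itself.

```python
import math

def optimal_bitonic_tour_length(points):
    """Length of the shortest bitonic tour of `points` (a list of (x, y) pairs).

    Sketch of the standard O(n^2) dynamic program: after sorting by x,
    dp[i][j] (for i < j) is the minimum total length of two vertex-disjoint
    x-monotone paths that start at point 0, end at points i and j, and
    together cover points 0..j.  Assumes no two points share an x-coordinate.
    """
    pts = sorted(points)
    n = len(pts)
    if n < 2:
        return 0.0

    def dist(a, b):
        return math.hypot(pts[a][0] - pts[b][0], pts[a][1] - pts[b][1])

    INF = float("inf")
    dp = [[INF] * n for _ in range(n)]
    dp[0][1] = dist(0, 1)
    for j in range(2, n):
        # For i < j - 1, point j - 1 must immediately precede j on the path
        # that ends at j, because the other path already stops at i < j - 1.
        for i in range(j - 1):
            dp[i][j] = dp[i][j - 1] + dist(j - 1, j)
        # When the two path ends are adjacent (i = j - 1), point j may be
        # attached to any earlier path end k.
        dp[j - 1][j] = min(dp[k][j - 1] + dist(k, j) for k in range(j - 1))

    # Close the tour by joining the two remaining path ends, n - 2 and n - 1.
    return dp[n - 2][n - 1] + dist(n - 2, n - 1)
```

The same table, augmented with predecessor pointers, yields the tour itself; the running time is dominated by filling the quadratic-size table.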

Dynamic programming: method for solving a complex problem by breaking it down into a collection of simpler subproblems

Dynamic programming is both a mathematical optimization method and a computer programming method. The method was developed by Richard Bellman in the 1950s and has found applications in numerous fields, from aerospace engineering to economics. In both contexts it refers to simplifying a complicated problem by breaking it down into simpler sub-problems in a recursive manner. While some decision problems cannot be taken apart this way, decisions that span several points in time do often break apart recursively. Likewise, in computer science, if a problem can be solved optimally by breaking it into sub-problems and then recursively finding the optimal solutions to the sub-problems, then it is said to have optimal substructure.

The problem of constructing optimal bitonic tours is often credited to Jon L. Bentley, who published in 1990 an experimental comparison of many heuristics for the traveling salesman problem; [3] however, Bentley's experiments do not include bitonic tours. The first publication that describes the bitonic tour problem appears to be a different 1990 publication, the first edition of the textbook Introduction to Algorithms by Thomas H. Cormen, Charles E. Leiserson, and Ron Rivest, which lists Bentley as the originator of the problem.

Introduction to Algorithms: algorithms textbook

Introduction to Algorithms is a book by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. The book has been widely used as the textbook for algorithms courses at many universities and is commonly cited as a reference for algorithms in published papers, with over 10,000 citations documented on CiteSeerX. The book sold half a million copies during its first 20 years. Its fame has led to the common use of the abbreviation "CLRS", or, in the first edition, "CLR".

Thomas H. Cormen is the co-author of Introduction to Algorithms, along with Charles Leiserson, Ron Rivest, and Cliff Stein. In 2013, he published a new book titled Algorithms Unlocked. He is a professor of computer science at Dartmouth College and former Chairman of the Dartmouth College Department of Computer Science. Between 2004 and 2008 he directed the Dartmouth College Writing Program. His research interests are algorithm engineering, parallel computing, and speeding up computations with high latency.

Charles E. Leiserson: American computer scientist

Charles Eric Leiserson is a computer scientist specializing in the theory of parallel computing and distributed computing, and particularly their practical applications. As part of this effort, he developed the Cilk multithreaded language, whose scheduler uses a provably good work-stealing algorithm. He invented the fat-tree interconnection network, a hardware-universal interconnection network used in many supercomputers, including the Connection Machine CM-5, for which he was network architect. He helped pioneer the development of VLSI theory, including the retiming method of digital optimization with James B. Saxe and systolic arrays with H. T. Kung. He conceived of the notion of cache-oblivious algorithms, which have no tuning parameters for cache size or cache-line length but nevertheless use cache near-optimally. Leiserson coauthored the standard algorithms textbook Introduction to Algorithms together with Thomas H. Cormen, Ronald L. Rivest, and Clifford Stein.

Properties

The optimal bitonic tour has no self-crossings, because any two edges that cross can be replaced by an uncrossed pair of edges with shorter total length due to the triangle inequality.
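To spell out the triangle-inequality step (the labels p, q, r, s for the four endpoints and x for the crossing point are illustrative), suppose tour edges pq and rs cross at an interior point x. Replacing them by an uncrossed pair, say pr and qs (whichever reconnection keeps a single closed tour), can only shorten the tour, because x lies on both original edges:

```latex
\[
  |pr| + |qs| \;\le\; (|px| + |xr|) + (|qx| + |xs|)
             \;=\; (|px| + |xq|) + (|rx| + |xs|)
             \;=\; |pq| + |rs| ,
\]
```

with strict inequality unless the crossing is degenerate (three of the points collinear).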

Among all tours of the given sites, including tours that are not bitonic, the optimal bitonic tour is the one that minimizes the total amount of horizontal motion, with ties broken by total Euclidean length. [4]

Other optimization criteria

The same dynamic programming algorithm that finds the optimal bitonic tour may be used to solve other variants of the traveling salesman problem that minimize lexicographic combinations of motion in a fixed number of coordinate directions. [4]

In mathematics, the lexicographic or lexicographical order is a generalization of the way words are alphabetically ordered based on the alphabetical order of their component letters. This generalization consists primarily in defining a total order over the sequences of elements of a finite totally ordered set, often called an alphabet.

At the 5th International Olympiad in Informatics, held in Mendoza, Argentina, in 1993, one of the contest problems involved bitonic tours: the contestants were asked to devise an algorithm that, given a set of sites and a collection of allowed edges between sites, constructs a bitonic tour using those edges that includes as many sites as possible. As with the optimal bitonic tour, this problem may be solved by dynamic programming. [5] [6]
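The following is a minimal Python sketch of one such dynamic program (all names are illustrative and not taken from the contest materials). It assumes the sites are already indexed 0, ..., n-1 in left-to-right order, that allowed is a set of index pairs (i, j) with i < j, and, as one natural reading of the contest problem, that the tour must include the leftmost and rightmost sites.

```python
def max_sites_bitonic_tour(n, allowed):
    """Maximum number of sites on a bitonic tour of sites 0..n-1 (already
    sorted left to right) that includes sites 0 and n-1 and uses only the
    edges in `allowed`, a set of pairs (i, j) with i < j.

    dp[i][j] (for i < j) is the maximum number of sites on two left-to-right
    paths that start at site 0, end at sites i and j, and use no site to the
    right of j; unlike the minimum-length tour, sites may be skipped.
    Returns 0 if no such tour exists.
    """
    NEG = float("-inf")
    dp = [[NEG] * n for _ in range(n)]
    for j in range(1, n):
        if (0, j) in allowed:
            dp[0][j] = 2          # one path is just {0}, the other is 0 -> j
    for j in range(1, n):
        for i in range(j):
            if dp[i][j] == NEG:
                continue
            for k in range(j + 1, n):   # k becomes the new rightmost site used
                if (j, k) in allowed:   # extend the path that ends at j
                    dp[i][k] = max(dp[i][k], dp[i][j] + 1)
                if (i, k) in allowed:   # extend the path that ends at i
                    dp[j][k] = max(dp[j][k], dp[i][j] + 1)
    # Close the tour with the edge from the other path end to site n-1.
    best = max((dp[i][n - 1] for i in range(n - 1) if (i, n - 1) in allowed),
               default=NEG)
    return best if best != NEG else 0
```

This straightforward version runs in cubic time, which keeps the state and transitions explicit; the innermost loop can be restricted to the allowed edges incident to i and j to reduce the running time.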

International Olympiad in Informatics

The International Olympiad in Informatics (IOI) is an annual competitive programming competition for secondary school students. It is the second largest olympiad, after the International Mathematical Olympiad, in terms of the number of participating countries. The first IOI was held in 1989 in Pravetz, Bulgaria.

Mendoza, Argentina: capital city of Mendoza Province, Argentina

Ciudad de Mendoza is the capital of the province of Mendoza in Argentina. It is located in the northern-central part of the province, in a region of foothills and high plains, on the eastern side of the Andes. As of the 2010 INDEC census, Mendoza had a population of 115,041, with a metropolitan population of 1,055,679, making Greater Mendoza the fourth largest census metropolitan area in the country.

Related Research Articles

Search algorithm: any algorithm which solves the search problem

In computer science, a search algorithm is any algorithm which solves the search problem, namely, to retrieve information stored within some data structure, or calculated in the search space of a problem domain, with either discrete or continuous values.

Travelling salesman problem: algorithmic problem

The travelling salesman problem (TSP) asks the following question: "Given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city and returns to the origin city?" It is an NP-hard problem in combinatorial optimization, important in operations research and theoretical computer science.

Minimum spanning tree

A minimum spanning tree (MST) or minimum weight spanning tree is a subset of the edges of a connected, edge-weighted undirected graph that connects all the vertices together, without any cycles and with the minimum possible total edge weight. That is, it is a spanning tree whose sum of edge weights is as small as possible. More generally, any edge-weighted undirected graph has a minimum spanning forest, which is a union of the minimum spanning trees for its connected components.

The nearest neighbour algorithm was one of the first algorithms used to solve the travelling salesman problem. In it, the salesman starts at a random city and repeatedly visits the nearest city until all have been visited. It quickly yields a short tour, but usually not the optimal one.

Shortest path problem: computational problem

In graph theory, the shortest path problem is the problem of finding a path between two vertices in a graph such that the sum of the weights of its constituent edges is minimized.

Kruskal's algorithm is a minimum-spanning-tree algorithm which finds an edge of the least possible weight that connects any two trees in the forest. It is a greedy algorithm in graph theory, as it finds a minimum spanning tree for a connected weighted graph by adding increasing-cost arcs at each step. This means it finds a subset of the edges that forms a tree that includes every vertex, where the total weight of all the edges in the tree is minimized. If the graph is not connected, then it finds a minimum spanning forest.

Greedy algorithm: algorithm that makes locally optimal choices in a sequence of steps with the goal of reaching a global optimum

A greedy algorithm is an algorithmic paradigm that follows the problem solving heuristic of making the locally optimal choice at each stage with the intent of finding a global optimum. In many problems, a greedy strategy does not usually produce an optimal solution, but nonetheless a greedy heuristic may yield locally optimal solutions that approximate a globally optimal solution in a reasonable amount of time.

Depth-first search: search algorithm

Depth-first search (DFS) is an algorithm for traversing or searching tree or graph data structures. The algorithm starts at the root node and explores as far as possible along each branch before backtracking.

Optimal substructure

In computer science, a problem is said to have optimal substructure if an optimal solution can be constructed from optimal solutions of its subproblems. This property is used to determine the usefulness of dynamic programming and greedy algorithms for a problem.

In computer science, local search is a heuristic method for solving computationally hard optimization problems. Local search can be used on problems that can be formulated as finding a solution maximizing a criterion among a number of candidate solutions. Local search algorithms move from solution to solution in the space of candidate solutions by applying local changes, until a solution deemed optimal is found or a time bound is elapsed.

The bottleneck traveling salesman problem is a problem in discrete or combinatorial optimization. The problem is to find a Hamiltonian cycle in a weighted graph that minimizes the weight of its heaviest edge. It was first formulated by Gilmore & Gomory (1964) with some additional constraints, and in its full generality by Garfinkel & Gilbert (1978).

In computer science and operations research, approximation algorithms are efficient algorithms that find approximate solutions to NP-hard optimization problems with provable guarantees on the distance of the returned solution to the optimal one. Approximation algorithms naturally arise in the field of theoretical computer science as a consequence of the widely believed P ≠ NP conjecture. Under this conjecture, a wide class of optimization problems cannot be solved exactly in polynomial time. The field of approximation algorithms, therefore, tries to understand how closely it is possible to approximate optimal solutions to such problems in polynomial time. In an overwhelming majority of the cases, the guarantee of such algorithms is a multiplicative one expressed as an approximation ratio or approximation factor i.e., the optimal solution is always guaranteed to be within a (predetermined) multiplicative factor of the returned solution. However, there are also many approximation algorithms that provide an additive guarantee on the quality of the returned solution. A notable example of an approximation algorithm that provides both is the classic approximation algorithm of Lenstra, Shmoys and Tardos for Scheduling on Unrelated Parallel Machines.

In computer science, a problem is said to have overlapping subproblems if it can be broken down into subproblems which are reused several times, or if a recursive algorithm for the problem solves the same subproblem over and over rather than always generating new subproblems.

In computational geometry, the line segment intersection problem supplies a list of line segments in the Euclidean plane and asks whether any two of them intersect (cross).

Vehicle routing problem

The vehicle routing problem (VRP) is a combinatorial optimization and integer programming problem which asks "What is the optimal set of routes for a fleet of vehicles to traverse in order to deliver to a given set of customers?". It generalises the well-known travelling salesman problem (TSP). It first appeared in a paper by George Dantzig and John Ramser in 1959, in which the first algorithmic approach was proposed and applied to petrol deliveries. Often, the context is that of delivering goods located at a central depot to customers who have placed orders for such goods. The objective of the VRP is to minimize the total route cost. In 1964, Clarke and Wright improved on Dantzig and Ramser's approach using an effective greedy approach called the savings algorithm.

Closest pair of points problem: the problem of finding the two points with minimum distance in a finite set of points

The closest pair of points problem or closest pair problem is a problem of computational geometry: given n points in a metric space, find a pair of points with the smallest distance between them. The closest pair problem for points in the Euclidean plane was among the first geometric problems treated at the origins of the systematic study of the computational complexity of geometric algorithms.

In computer science, artificial intelligence, and mathematical optimization, a heuristic is a technique designed for solving a problem more quickly when classic methods are too slow, or for finding an approximate solution when classic methods fail to find any exact solution. This is achieved by trading optimality, completeness, accuracy, or precision for speed. In a way, it can be considered a shortcut.

The Held–Karp algorithm, also called the Bellman–Held–Karp algorithm, is a dynamic programming algorithm proposed in 1962 independently by Bellman and by Held and Karp to solve the traveling salesman problem (TSP). TSP is an extension of the Hamiltonian circuit problem. The problem can be described as: find a tour of N cities in a country, where the tour should (a) visit every city just once, (b) return to the starting point, and (c) be of minimum distance. Broadly, the TSP is classified as the symmetric travelling salesman problem (sTSP), the asymmetric travelling salesman problem (aTSP), and the multi-travelling salesman problem (mTSP). The mTSP is generally treated as a relaxed vehicle routing problem.

References

  1. Cormen, T. H.; Leiserson, C. E.; Rivest, R. L.; Stein, C. (2009), Introduction to Algorithms, 3rd ed., MIT Press. Problem 15-3, p. 405.
  2. Bird, Richard S.; De Moor, Oege (1997), The Algebra of Programming, Prentice Hall, p. 213, ISBN 9780135072455.
  3. Bentley, Jon L. (1990), "Experiments on traveling salesman heuristics", Proc. 1st ACM-SIAM Symp. Discrete Algorithms (SODA), pp. 91–99.
  4. Sourd, Francis (2010), "Lexicographically minimizing axial motions for the Euclidean TSP", Journal of Combinatorial Optimization, 19 (1): 1–15, doi:10.1007/s10878-008-9154-0, MR 2579501.
  5. IOI'93 contest problems and report.
  6. Guerreiro, Pedro (December 2003), The Canadian Airline Problem and the Bitonic Tour: Is This Dynamic Programming? (PDF), Departamento de Informática, Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa.