In optimization, 2-opt is a simple local search algorithm for solving the traveling salesman problem. The 2-opt algorithm was first proposed by Croes in 1958, [1] although the basic move had already been suggested by Flood. [2] The main idea behind it is to take a route that crosses over itself and reorder it so that it does not. A complete 2-opt local search will compare every possible valid combination of the swapping mechanism. This technique can be applied to the traveling salesman problem as well as many related problems. These include the vehicle routing problem (VRP) as well as the capacitated VRP, which require minor modification of the algorithm.
Visually, one swap looks like:
- A   B -             - A - B -
     ×        ==>
- C   D -             - C - D -
In pseudocode, the mechanism by which the 2-opt swap manipulates a given route is as follows. Here v1 and v2 are the first vertices of the edges that are to be swapped when traversing through the route:
procedure 2optSwap(route, v1, v2) {
    1. take route[start] to route[v1] and add them in order to new_route
    2. take route[v1+1] to route[v2] and add them in reverse order to new_route
    3. take route[v2+1] to route[end] and add them in order to new_route
    return new_route;
}
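As a concrete sketch in C++, assuming the route is stored as a std::vector<int> of node indices (an assumption not made in the pseudocode above), the copy-based swap might look like this:

#include <vector>

// Copy-based 2-opt swap: keeps route[0..v1], reverses route[v1+1..v2],
// and keeps route[v2+1..end]. Node indices are an assumption of this sketch.
std::vector<int> twoOptSwap(const std::vector<int> &route, int v1, int v2) {
    std::vector<int> new_route;
    new_route.reserve(route.size());
    // 1. take route[0] to route[v1] and add them in order
    for (int k = 0; k <= v1; k++)
        new_route.push_back(route[k]);
    // 2. take route[v1+1] to route[v2] and add them in reverse order
    for (int k = v2; k >= v1 + 1; k--)
        new_route.push_back(route[k]);
    // 3. take route[v2+1] to route[end] and add them in order
    for (int k = v2 + 1; k < (int)route.size(); k++)
        new_route.push_back(route[k]);
    return new_route;
}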
Here is an example of the above with arbitrary input:
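For instance (an illustrative input chosen here for demonstration): with route = A → B → E → D → C → F → G → A, v1 = 1 and v2 = 4 (0-indexed), step 1 copies A → B into new_route, step 2 appends route[2..4] in reverse order to give A → B → C → D → E, and step 3 appends the remainder, yielding A → B → C → D → E → F → G → A. The segment E, D, C has been reversed to C, D, E while the rest of the tour is unchanged.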
This is the complete 2-opt swap making use of the above mechanism:
repeat until no improvement is made {
    best_distance = calculateTotalDistance(existing_route)
    start_again:
    for (i = 0; i <= number of nodes eligible to be swapped - 1; i++) {
        for (j = i + 1; j <= number of nodes eligible to be swapped; j++) {
            new_route = 2optSwap(existing_route, i, j)
            new_distance = calculateTotalDistance(new_route)
            if (new_distance < best_distance) {
                existing_route = new_route
                best_distance = new_distance
                goto start_again
            }
        }
    }
}
The particular nodes or depots at the start and at the end of the path should be excluded from the search as eligible candidates for swapping, as reversing a segment that contains them would produce an invalid path (see the loop-bound sketch after the example below).
For example, with depot at A:
A → B → C → D → A
Swapping using node[0] and node[2] would yield
C → B → A → D → A
which is not valid (does not leave from A, the depot).
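One way to enforce this, as a sketch (assuming the path is stored with the depot duplicated at route[0] and route[n-1], as in the example above, and that a swap reverses the segment route[i..j] inclusive), is to bound the loop indices so that a reversal never touches either endpoint:

int n = route.size();                      // route[0] and route[n-1] are the depot
for (int i = 1; i <= n - 3; i++) {         // segment start: never index 0
    for (int j = i + 1; j <= n - 2; j++) { // segment end: never index n-1
        // evaluate reversing route[i..j]
    }
}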
Building the new route and calculating its distance can be a very expensive operation, usually O(n) where n is the number of vertices in the route. In a symmetric case (where the distance between A and B is the same as between B and A), this can be skipped by performing an O(1) operation. Since a 2-opt operation involves removing 2 edges and adding 2 different edges, we can subtract and add the distances of only those edges.
lengthDelta = - dist(route[v1], route[v1+1]) - dist(route[v2], route[v2+1]) + dist(route[v1+1], route[v2+1]) + dist(route[v1], route[v2])
If lengthDelta is negative, the route after the swap would be shorter than the current one. We therefore perform a 2-opt swap only once lengthDelta is known to be negative, which saves a lot of computation.
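The following example C++ program performs a complete 2-opt local search on a tour of 100 random points using this delta evaluation, printing the tour and its length before and after optimization: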
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <string>
#include <vector>

using namespace std;

class Point {
public:
    float x, y;

    Point(float x, float y) {
        this->x = x;
        this->y = y;
    }

    Point() {
        this->x = 0.0;
        this->y = 0.0;
    }

    // Distance between two points
    inline float dist(const Point &other) const {
        float diffx = x - other.x;
        float diffy = y - other.y;
        return sqrt(diffx * diffx + diffy * diffy);
    }
};

// Calculate the distance of the whole circuit
float pathLength(vector<Point> &path) {
    int n = path.size();
    float length = path[n - 1].dist(path[0]);
    for (int i = 0; i < n - 1; i++) {
        length += path[i].dist(path[i + 1]);
    }
    return length;
}

// Replace edges path[i]->path[i+1] and path[j]->path[j+1]
// with path[i]->path[j] and path[i+1]->path[j+1]
// by reversing the segment path[i+1..j]
void swap_edges(vector<Point> &path, int i, int j) {
    i += 1;
    while (i < j) {
        Point temp = path[i];
        path[i] = path[j];
        path[j] = temp;
        i++;
        j--;
    }
}

// Print the path.
void printPath(string pathName, vector<Point> &path) {
    printf("%s = [", pathName.c_str());
    for (int i = 0; i < (int)path.size(); i++) {
        if (i % 10 == 0) {
            printf("\n ");
        }
        if (i < (int)path.size() - 1) {
            printf("[%.1f, %.1f], ", path[i].x, path[i].y);
        } else {
            printf("[%.1f, %.1f]", path[i].x, path[i].y);
        }
    }
    printf("\n];\n");
}

// Create a path of length n with random points between 0 and 1000
vector<Point> createRandomPath(int n) {
    vector<Point> path;
    for (int i = 0; i < n; i++) {
        float x = (float)rand() / (float)(RAND_MAX / 1000);
        float y = (float)rand() / (float)(RAND_MAX / 1000);
        path.push_back(Point(x, y));
    }
    return path;
}

int main() {
    vector<Point> path = createRandomPath(100);
    printPath("path1", path);
    float curLength = pathLength(path);
    printf("path1len = %.1f;\n\n", curLength);

    int n = path.size();
    bool foundImprovement = true;
    while (foundImprovement) {
        foundImprovement = false;
        for (int i = 0; i < n - 1; i++) {
            for (int j = i + 2; j < n; j++) {
                float lengthDelta = -path[i].dist(path[i + 1]) - path[j].dist(path[(j + 1) % n])
                                    + path[i].dist(path[j]) + path[i + 1].dist(path[(j + 1) % n]);

                // If the length of the path is reduced, do a 2-opt swap
                if (lengthDelta < 0) {
                    swap_edges(path, i, j);
                    curLength += lengthDelta;
                    foundImprovement = true;
                }
            }
        }
    }

    printPath("path2", path);
    printf("path2len = %.1f;\n", curLength);
    return 0;
}
path1 = [ [0.0, 131.5], [755.6, 458.7], [532.8, 219.0], [47.0, 678.9], [679.3, 934.7], [383.5, 519.4], [831.0, 34.6], [53.5, 529.7], [671.1, 7.7], [383.4, 66.8], [417.5, 686.8], [589.0, 930.4], [846.2, 526.9], [92.0, 653.9], [416.0, 701.2], [910.3, 762.2], [262.5, 47.5], [736.1, 328.2], [632.6, 756.4], [991.0, 365.3], [247.0, 982.6], [722.7, 753.4], [651.5, 72.7], [631.6, 884.7], [272.7, 436.4], [766.5, 477.7], [237.8, 274.9], [359.3, 166.5], [486.5, 897.7], [909.2, 60.6], [904.7, 504.5], [516.3, 319.0], [986.6, 494.0], [266.1, 90.7], [947.8, 73.7], [500.7, 384.1], [277.1, 913.8], [529.7, 464.4], [941.0, 50.1], [761.5, 770.2], [827.8, 125.4], [15.9, 688.5], [868.2, 629.5], [736.2, 725.4], [999.5, 888.6], [233.2, 306.3], [351.0, 513.3], [591.1, 846.0], [412.1, 841.5], [269.3, 415.4], [537.3, 467.9], [287.2, 178.3], [153.7, 571.7], [802.4, 33.1], [534.4, 498.5], [955.4, 748.3], [554.6, 890.7], [624.8, 842.0], [159.8, 212.8], [714.7, 130.4], [91.0, 274.6], [3.0, 414.3], [26.9, 709.8], [937.9, 239.9], [180.9, 317.5], [887.0, 652.1], [150.3, 681.3], [385.8, 387.7], [499.7, 147.5], [587.2, 845.6], [590.1, 955.4], [556.1, 148.2], [983.3, 408.8], [141.8, 564.9], [252.1, 488.5], [464.0, 961.1], [126.0, 199.8], [319.2, 629.3], [126.7, 651.3], [621.6, 803.1], [247.8, 476.4], [389.3, 203.3], [28.4, 901.7], [426.5, 142.0], [947.5, 410.3], [131.2, 885.6], [92.2, 162.2], [71.1, 365.3], [253.1, 135.1], [783.2, 455.3], [349.5, 452.3], [808.9, 931.7], [651.6, 215.2], [679.6, 908.9], [250.1, 860.9], [471.3, 506.0], [600.4, 817.6], [755.8, 462.2], [951.4, 632.7], [439.3, 824.7]];path1len = 55723.0;path2 = [ [0.0, 131.5], [91.0, 274.6], [71.1, 365.3], [3.0, 414.3], [53.5, 529.7], [92.0, 653.9], [47.0, 678.9], [15.9, 688.5], [26.9, 709.8], [28.4, 901.7], [131.2, 885.6], [247.0, 982.6], [277.1, 913.8], [464.0, 961.1], [486.5, 897.7], [439.3, 824.7], [412.1, 841.5], [250.1, 860.9], [150.3, 681.3], [126.7, 651.3], [141.8, 564.9], [153.7, 571.7], [247.8, 476.4], [252.1, 488.5], [319.2, 629.3], [416.0, 701.2], [417.5, 686.8], [534.4, 498.5], [537.3, 467.9], [529.7, 464.4], [516.3, 319.0], [500.7, 384.1], [471.3, 506.0], [383.5, 519.4], [351.0, 513.3], [349.5, 452.3], [385.8, 387.7], [272.7, 436.4], [269.3, 415.4], [180.9, 317.5], [233.2, 306.3], [237.8, 274.9], [287.2, 178.3], [389.3, 203.3], [532.8, 219.0], [736.1, 328.2], [783.2, 455.3], [755.6, 458.7], [755.8, 462.2], [766.5, 477.7], [846.2, 526.9], [904.7, 504.5], [868.2, 629.5], [736.2, 725.4], [761.5, 770.2], [722.7, 753.4], [632.6, 756.4], [621.6, 803.1], [600.4, 817.6], [624.8, 842.0], [631.6, 884.7], [591.1, 846.0], [587.2, 845.6], [554.6, 890.7], [589.0, 930.4], [590.1, 955.4], [679.3, 934.7], [679.6, 908.9], [808.9, 931.7], [999.5, 888.6], [955.4, 748.3], [910.3, 762.2], [887.0, 652.1], [951.4, 632.7], [986.6, 494.0], [947.5, 410.3], [983.3, 408.8], [991.0, 365.3], [937.9, 239.9], [827.8, 125.4], [947.8, 73.7], [941.0, 50.1], [909.2, 60.6], [831.0, 34.6], [802.4, 33.1], [671.1, 7.7], [651.5, 72.7], [714.7, 130.4], [651.6, 215.2], [556.1, 148.2], [499.7, 147.5], [426.5, 142.0], [359.3, 166.5], [383.4, 66.8], [262.5, 47.5], [266.1, 90.7], [253.1, 135.1], [159.8, 212.8], [126.0, 199.8], [92.2, 162.2]];path2len = 8586.2;