The dual problem is a reformulation of a constraint satisfaction problem that expresses each constraint of the original problem as a variable. Dual problems contain only binary constraints, and are therefore solvable by algorithms tailored for such problems. The join graphs and join trees of a constraint satisfaction problem are graphs representing its dual problem or a problem obtained from the dual problem by removing some redundant constraints.
The dual problem of a constraint satisfaction problem contains a variable for each constraint of the original problem. Its domains and constraints are built so as to enforce a form of equivalence with the original problem. In particular, the domain of a variable of the dual problem contains one element for each tuple satisfying the corresponding original constraint. This way, a dual variable can take a value if and only if the corresponding original constraint is satisfied by the corresponding tuple.
The constraints of the dual problem forbid two dual variables from taking values that correspond to two incompatible tuples. Without these constraints, when two original constraints share a variable, one dual variable may take the value corresponding to a tuple while the other takes the value corresponding to a tuple that assigns a different value to the shared variable.
More generally, the constraints of the dual problem enforce the same values for all variables shared by two constraints. If two dual variables correspond to constraints sharing some variables, the dual problem contains a constraint between them, enforcing equality of all shared variables.
In the dual problem, all constraints are binary. Each of them forces two values, which are tuples, to agree on one or more original variables.
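To make the construction concrete, the following Python sketch builds the domains of the dual variables and the binary dual constraints just described. The representation is an assumption made here for illustration: each original constraint is a pair (scope, allowed), where scope is a tuple of original variable names and allowed is a set of value tuples over that scope.

    from itertools import combinations

    def dual_problem(constraints):
        # One dual variable per original constraint; its domain is the set
        # of tuples satisfying that constraint.
        domains = {i: set(allowed) for i, (scope, allowed) in enumerate(constraints)}

        # For every pair of constraints sharing original variables, add a
        # binary dual constraint forcing the two chosen tuples to agree on
        # the shared variables.
        dual_constraints = []
        for i, j in combinations(range(len(constraints)), 2):
            scope_i, scope_j = constraints[i][0], constraints[j][0]
            shared = [v for v in scope_i if v in scope_j]
            if not shared:
                continue
            allowed_pairs = {
                (t_i, t_j)
                for t_i in domains[i]
                for t_j in domains[j]
                if all(t_i[scope_i.index(v)] == t_j[scope_j.index(v)] for v in shared)
            }
            dual_constraints.append(((i, j), allowed_pairs))
        return domains, dual_constraints

For example, with constraints over the scopes (x, y) and (y, z), the dual constraint between the two corresponding dual variables keeps only those pairs of tuples that assign the same value to y.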
The dual graph is a representation of how variables are constrained in the dual problem. More precisely, the dual graph contains a node for each dual variable and an edge for every constraint between two dual variables. The edge between two dual variables is labeled by the original variables whose equality that constraint enforces.
The dual graph can be built directly from the original problem: it contains a vertex for each constraint, and an edge between every two constraints sharing variables; such an edge is labeled by these shared variables.
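A minimal sketch of this direct construction, under the same assumed (scope, allowed) representation: one vertex per constraint and, for every pair of constraints whose scopes intersect, an edge labeled with the shared variables.

    def dual_graph(constraints):
        # Vertices are constraint indices; each edge is labeled by the
        # original variables shared by the two constraints it joins.
        vertices = list(range(len(constraints)))
        edges = {}
        for i in vertices:
            for j in vertices:
                if i < j:
                    shared = set(constraints[i][0]) & set(constraints[j][0])
                    if shared:
                        edges[(i, j)] = shared
        return vertices, edges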
In the dual graph, some constraints may be unnecessary. Indeed, dual constraints enforce equality of original variables, and some of these constraints may be redundant because of the transitivity of equality. For example, if the dual variables corresponding to constraints C1 and C2 are joined by an edge whose label contains the original variable x, and so are those of C2 and C3, equality of x across all three dual variables is guaranteed. As a result, a dual constraint between C1 and C3 enforcing equality of x is not necessary, and could be removed if present.
A graph obtained from the dual graph by removing some redundant edges is called a join graph. If it is a tree, it is called a join tree. The dual problem can be solved from a join graph since all removed edges are redundant. In turn, the problem can be solved efficiently if that join graph is a tree, using algorithms tailored for acyclic constraint satisfaction problems.
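This redundancy condition can be checked mechanically: an edge may be dropped if, for every original variable in its label, its two endpoints remain connected by other edges whose labels also contain that variable. The sketch below, written against the labeled edge dictionary built above, is only one illustrative way of performing the check.

    def edge_is_redundant(edges, edge):
        # edges maps pairs (i, j) of dual variables to their label, i.e. the
        # set of shared original variables; edge is one such pair.
        a, b = edge
        others = {e: lab for e, lab in edges.items() if e != edge}
        for var in edges[edge]:
            # Edges (other than the candidate) whose label contains var.
            relevant = [e for e, lab in others.items() if var in lab]
            # Equality of var is still enforced if a and b stay connected
            # using only those edges.
            reachable, frontier = {a}, [a]
            while frontier:
                node = frontier.pop()
                for i, j in relevant:
                    if node in (i, j):
                        nxt = j if node == i else i
                        if nxt not in reachable:
                            reachable.add(nxt)
                            frontier.append(nxt)
            if b not in reachable:
                return False
        return True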
Finding a join tree, if any, can be done by exploiting the following property: if a dual graph has a join tree, then its maximal-weight spanning trees are all join trees, provided edges are weighted by the number of variables the corresponding constraints enforce to be equal. An algorithm for finding a join tree, if any, proceeds as follows. In the first step, edges are assigned weights: if two nodes represent constraints that share n variables, the edge joining them is assigned weight n. In the second step, a maximal-weight spanning tree is searched for. Once one is found, it is checked whether it enforces the required equality of variables. If this is the case, this spanning tree is a join tree.
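The spanning-tree step can be sketched as follows over the labeled edges assumed above: each edge is weighted by the number of shared variables and Kruskal's algorithm is run with edges taken in order of decreasing weight. The final verification that the resulting tree enforces all required equalities (for instance by applying the redundancy check above to each discarded edge) is left out.

    def max_weight_spanning_tree(n_vertices, edges):
        # edges maps (i, j) to the set of shared variables; the weight of an
        # edge is the number of variables it enforces to be equal.
        parent = list(range(n_vertices))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        tree = []
        for (i, j), shared in sorted(edges.items(), key=lambda e: -len(e[1])):
            ri, rj = find(i), find(j)
            if ri != rj:            # the edge joins two different components
                parent[ri] = rj
                tree.append((i, j))
        return tree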
Another method for finding out whether a constraint satisfaction problem has a join tree uses the primal graph of the problem, rather than the dual graph. The primal graph of a constraint satisfaction problem is a graph whose nodes are problem variables and whose edges represent the presence of two variables in the same constraint. A join tree for the problem exists if the primal graph is chordal and the variables of every maximal clique of the primal graph are the scope of a constraint (that is, the problem is conformal).
In turn, chordality can be checked using a max-cardinality ordering of the variables. Such an ordering can also be used, if the two conditions above are met, for finding a join tree of the problem. Ordering constraints by their highest variable according to the ordering, an algorithm for producing a join tree proceeds from the last to the first constraint; at each step, a constraint is connected to the constraint that shares a maximal number of variables with it among the constraints that precede it in the ordering.
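Assuming the constraint scopes have already been sorted as described (by their highest variable in a max-cardinality ordering), the construction can be sketched as follows; names are illustrative.

    def join_tree_from_ordering(ordered_scopes):
        # ordered_scopes: list of constraint scopes (sets of original
        # variables), sorted by their highest variable in the ordering.
        tree_edges = []
        for k in range(len(ordered_scopes) - 1, 0, -1):
            # Connect constraint k to the preceding constraint sharing the
            # largest number of variables with it.
            best = max(range(k), key=lambda i: len(ordered_scopes[k] & ordered_scopes[i]))
            tree_edges.append((k, best))
        return tree_edges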
Not all constraint satisfaction problems have a join tree. However, problems can be modified to acquire one. Join-tree clustering is a specific method of modifying problems in such a way that they acquire a join tree. This is done by merging constraints, which typically increases the size of the problem; however, solving the resulting problem is easy, as it is for all problems that have a join tree.
Decomposition methods generalize join-tree clustering by grouping variables in such a way that the resulting problem has a join tree. Decomposition methods directly associate a tree with a problem; the nodes of this tree are associated with variables and/or constraints of the original problem. By merging constraints based on this tree, one can produce a problem that has a join tree, and this join tree can be easily derived from the decomposition tree. Alternatively, one can build a binary acyclic problem directly from the decomposition tree.
Linear programming (LP), also called linear optimization, is a method to achieve the best outcome in a mathematical model whose requirements are represented by linear relationships. Linear programming is a special case of mathematical programming.
In computer science and optimization theory, the max-flow min-cut theorem states that in a flow network, the maximum amount of flow passing from the source to the sink is equal to the total weight of the edges in a minimum cut, i.e., the smallest total weight of the edges which if removed would disconnect the source from the sink.
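The equality can be observed on a small example; the sketch below uses the networkx library, which is an assumption of this illustration rather than part of the theorem.

    import networkx as nx

    # A small flow network: the maximum s-t flow value equals the capacity
    # of a minimum s-t cut (both are 5 here).
    G = nx.DiGraph()
    G.add_edge('s', 'a', capacity=3)
    G.add_edge('s', 'b', capacity=2)
    G.add_edge('a', 'b', capacity=1)
    G.add_edge('a', 't', capacity=2)
    G.add_edge('b', 't', capacity=3)

    flow_value, _ = nx.maximum_flow(G, 's', 't')
    cut_value, _ = nx.minimum_cut(G, 's', 't')
    assert flow_value == cut_value == 5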
In graph theory, a tree decomposition is a mapping of a graph into a tree that can be used to define the treewidth of the graph and speed up solving certain computational problems on the graph.
In optimization theory, maximum flow problems involve finding a feasible flow through a flow network that obtains the maximum possible flow rate.
In graph theory, a cograph, or complement-reducible graph, or P4-free graph, is a graph that can be generated from the single-vertex graph K1 by complementation and disjoint union. That is, the family of cographs is the smallest class of graphs that includes K1 and is closed under complementation and disjoint union.
The hidden transformation reformulates a constraint satisfaction problem in such a way that all constraints have at most two variables. The new problem is satisfiable if and only if the original problem is, and solutions can be converted easily from one problem to the other.
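The reformulation can be sketched as follows, reusing the (scope, allowed) representation assumed earlier: each constraint is replaced by a fresh hidden variable whose domain is the set of tuples satisfying it, and a binary constraint links the hidden variable to each original variable of the scope.

    def hidden_transformation(variables, constraints):
        # variables: {name: domain}; constraints: list of (scope, allowed).
        new_domains = dict(variables)
        binary_constraints = []
        for c, (scope, allowed) in enumerate(constraints):
            hidden = ('h', c)            # fresh hidden variable for constraint c
            new_domains[hidden] = set(allowed)
            for pos, v in enumerate(scope):
                # The tuple chosen for the hidden variable must agree with
                # the value chosen for the original variable v.
                pairs = {(t, t[pos]) for t in allowed}
                binary_constraints.append(((hidden, v), pairs))
        return new_domains, binary_constraints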
In constraint satisfaction, local consistency conditions are properties of constraint satisfaction problems related to the consistency of subsets of variables or constraints. They can be used to reduce the search space and make the problem easier to solve. Various kinds of local consistency conditions are leveraged, including node consistency, arc consistency, and path consistency.
In constraint satisfaction, local search is an incomplete method for finding a solution to a problem. It is based on iteratively improving an assignment of the variables until all constraints are satisfied. In particular, local search algorithms typically modify the value of a variable in an assignment at each step. The new assignment is close to the previous one in the space of assignments, hence the name local search.
In mathematical optimization, constrained optimization is the process of optimizing an objective function with respect to some variables in the presence of constraints on those variables. The objective function is either a cost function or energy function, which is to be minimized, or a reward function or utility function, which is to be maximized. Constraints can be either hard constraints, which set conditions for the variables that are required to be satisfied, or soft constraints, which have some variable values that are penalized in the objective function if, and based on the extent that, the conditions on the variables are not satisfied.
Within artificial intelligence and operations research for constraint satisfaction, a hybrid algorithm solves a constraint satisfaction problem by combining two different methods, for example variable conditioning and constraint inference.
Constraint logic programming is a form of constraint programming, in which logic programming is extended to include concepts from constraint satisfaction. A constraint logic program is a logic program that contains constraints in the body of clauses. An example of a clause including a constraint is A(X,Y) :- X+Y>0, B(X), C(Y). In this clause, X+Y>0 is a constraint; A(X,Y), B(X), and C(Y) are literals as in regular logic programming. This clause states one condition under which the statement A(X,Y) holds: X+Y is greater than zero and both B(X) and C(Y) are true.
The complexity of constraint satisfaction is the application of computational complexity theory to constraint satisfaction. It has mainly been studied for discriminating between tractable and intractable classes of constraint satisfaction problems on finite domains.
In constraint satisfaction, a decomposition method translates a constraint satisfaction problem into another constraint satisfaction problem that is binary and acyclic. Decomposition methods work by grouping variables into sets, and solving a subproblem for each set. These translations are done because solving binary acyclic problems is a tractable problem.
Semidefinite programming (SDP) is a subfield of convex optimization concerned with the optimization of a linear objective function over the intersection of the cone of positive semidefinite matrices with an affine space, i.e., a spectrahedron.
In database theory, a conjunctive query is a restricted form of first-order queries using the logical conjunction operator. Many first-order queries can be written as conjunctive queries. In particular, a large part of queries issued on relational databases can be expressed in this way. Conjunctive queries also have a number of desirable theoretical properties that larger classes of queries do not share.
The dual of a given linear program (LP) is another LP that is derived from the original (primal) LP in the following schematic way: each variable of the primal LP becomes a constraint of the dual LP, each constraint of the primal LP becomes a variable of the dual LP, and the direction of optimization is reversed (a maximization in the primal becomes a minimization in the dual, and vice versa).
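For instance, in the standard symmetric form, the primal program "maximize c^T x subject to Ax ≤ b and x ≥ 0" has as its dual "minimize b^T y subject to A^T y ≥ c and y ≥ 0".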
In constraint satisfaction research in artificial intelligence and operations research, constraint graphs and hypergraphs are used to represent relations among constraints in a constraint satisfaction problem. A constraint graph is a special case of a factor graph, which allows for the existence of free variables.
A capacitated minimum spanning tree is a minimal-cost spanning tree of a graph that has a designated root node and satisfies a capacity constraint: every subtree incident on the root node contains no more than a fixed number c of nodes. If the tree nodes have weights, the capacity constraint may instead be interpreted as requiring the sum of the weights in any such subtree to be no greater than c. The edges connecting the subtrees to the root node are called gates. Finding the optimal solution is NP-hard.
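Checking whether a candidate spanning tree satisfies the capacity constraint is straightforward; the sketch below counts the nodes of every subtree hanging off the root (the node-count version, with an assumed adjacency-list representation and illustrative names).

    def satisfies_capacity(tree_adj, root, c):
        # tree_adj: adjacency lists of a spanning tree; every subtree reached
        # through a single gate edge at the root may contain at most c nodes.
        for gate in tree_adj[root]:
            count, frontier, seen = 0, [gate], {root, gate}
            while frontier:
                node = frontier.pop()
                count += 1
                for nxt in tree_adj[node]:
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            if count > c:
                return False
        return True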
In artificial intelligence and operations research, a Weighted Constraint Satisfaction Problem (WCSP) is a generalization of a constraint satisfaction problem (CSP) where some of the constraints can be violated and in which preferences among solutions can be expressed. This generalization makes it possible to represent more real-world problems, in particular those that are over-constrained, or those where we want to find a minimal-cost solution among multiple possible solutions.
In computer science, an interchangeability algorithm is a technique used to more efficiently solve constraint satisfaction problems (CSP). A CSP is a mathematical problem in which objects, represented by variables, are subject to constraints on the values of those variables; the goal in a CSP is to assign values to the variables that are consistent with the constraints. If two variables A and B in a CSP may be swapped for each other without changing the nature of the problem or its solutions, then A and B are interchangeable variables. Interchangeable variables represent a symmetry of the CSP, and by exploiting that symmetry, the search space for solutions to the CSP may be reduced. For example, if solutions with A=1 and B=2 have been tried, then by interchange symmetry, solutions with B=1 and A=2 need not be investigated.