Developer(s) | Tomlab Optimization Inc.
---|---
Stable release | 7.8 / December 16, 2011
Operating system | TOMLAB - OS Support
Type | Technical computing
License | Proprietary
Website | TomSym product page

The **TomSym**^{ [1] } MATLAB symbolic modeling engine is a platform for modeling applied optimization and optimal control problems.

TomSym is a complete modeling environment in MATLAB, with support for most of MATLAB's built-in mathematical operators. It combines modeling, compilation, and an interface to the TOMLAB solvers. The matrix derivative of a matrix function is a fourth-rank tensor - that is, a matrix each of whose entries is a matrix. Rather than using four-dimensional matrices to represent this, TomSym continues to work in two dimensions. This makes it possible to take advantage of MATLAB's very efficient handling of sparse matrices, which is not available for higher-dimensional arrays.
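The idea behind the two-dimensional representation can be sketched as follows. This is a Python/SciPy illustration of the general concept, not of TomSym's internals: for the matrix function F(X) = A*X, the derivative of vec(F) with respect to vec(X) is the Kronecker product kron(I, A), which is an ordinary, and typically sparse, 2-D Jacobian.

```python
import numpy as np
from scipy import sparse

# Conceptually, d(A*X)/dX is a fourth-rank tensor. Vectorizing both the
# function value and the variable flattens it into a 2-D Jacobian:
#     vec(A*X) = kron(I_n, A) @ vec(X)
# which can be stored as a sparse 2-D matrix.
rng = np.random.default_rng(0)
p, m, n = 3, 4, 5
A = rng.standard_normal((p, m))   # constant matrix
X = rng.standard_normal((m, n))   # "variable" matrix

# Sparse 2-D Jacobian of vec(A*X) with respect to vec(X) (column-major vec)
J = sparse.kron(sparse.eye(n), sparse.csr_matrix(A))

# Check: multiplying the Jacobian by vec(X) reproduces vec(A*X)
vecX = X.flatten(order="F")
assert np.allclose(J @ vecX, (A @ X).flatten(order="F"))
```

Because the Jacobian stays two-dimensional, standard sparse linear algebra applies directly; higher-dimensional arrays would not enjoy that support.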

TomSym has a variety of functions, among them:

- Ability to transform expressions and generate analytical first and second order derivatives, including sparsity patterns.
- Interfaced and compatible with MAD, i.e. MAD can be used when symbolic modeling is not suitable.
- Numerical differentiation can be applied to parts of the model.
- Functionality for plotting and computing a variety of information for the solution to the problem.
- Support for if, then, else statements.
- Ability to analyze p-coded Matlab files.
- Automated code simplification for generated models, for example:
  - Multiplication by 1 or by the identity matrix is eliminated: `1*A = A`
  - Addition/subtraction of 0 is eliminated: `0+A = A`
  - All-same matrices are reduced to scalars: `[3;3;3]+x = 3+x`
  - Scalars are moved to the left in addition/subtraction: `A-y = -y+A`
  - Inverse operations cancel: `sqrt(x)^2 = x`
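Rules of this kind can be pictured as a bottom-up term-rewriting pass over an expression tree. The following is a toy Python sketch of that general idea, not TomSym's actual implementation, with expressions encoded as nested tuples:

```python
def simplify(expr):
    """Bottom-up application of a few of the rewrite rules listed above.

    Expressions are nested tuples such as ('*', 1, 'A') or ('sqrt', 'x');
    anything that is not a tuple (numbers, variable names) is a leaf.
    """
    if not isinstance(expr, tuple):
        return expr
    op, *args = expr
    args = [simplify(a) for a in args]  # simplify children first
    if op == '*':
        if args[0] == 1:                # 1*A = A
            return args[1]
        if args[1] == 1:                # A*1 = A
            return args[0]
    if op == '+':
        if args[0] == 0:                # 0+A = A
            return args[1]
        if args[1] == 0:                # A+0 = A
            return args[0]
    if (op == '^' and args[1] == 2
            and isinstance(args[0], tuple) and args[0][0] == 'sqrt'):
        return args[0][1]               # sqrt(x)^2 = x
    return (op, *args)

# 0 + 1*sqrt(x)^2 simplifies all the way down to x
print(simplify(('+', 0, ('*', 1, ('^', ('sqrt', 'x'), 2)))))  # → x
```

Because children are simplified first, cancellations cascade: once the inner `sqrt(x)^2` collapses to `x`, the surrounding `1*…` and `0+…` rules fire in turn.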

The TomSym symbolic source transformation makes it possible to define any set of decision variables (both continuous and integer) and any type of constraint, as well as scalars and constant parameters.

An example linear programming problem would look like this:

```matlab
c = [-7; -5];
A = [1 2; 4 1];
b_U = [6; 12];
x_L = [0; 0];
toms 2x1 x
solution = ezsolve(c'*x, {A*x <= b_U, x_L <= x});
```

A MINLP problem is defined just like a linear programming problem. This example also shows how to convert the model into a general TOMLAB problem.

```matlab
Name = 'minlp1Demo - Kocis/Grossman.';
toms 2x1 x
toms 3x1 integer y

objective = [2 3 1.5 2 -0.5]*[x;y];

constraints = { ...
    x(1) >= 0, ...
    x(2) >= 1e-8, ...
    x <= 1e8, ...
    0 <= y <= 1, ...
    [1 0 1 0 0]*[x;y] <= 1.6, ...
    1.333*x(2) + y(2) <= 3, ...
    [-1 -1 1]*y <= 0, ...
    x(1)^2 + y(1) == 1.25, ...
    sqrt(x(2)^3) + 1.5*y(2) == 3, ...
    };

guess = struct('x', ones(size(x)), 'y', ones(size(y)));
options = struct;
options.name = Name;
Prob = sym2prob('minlp', objective, constraints, guess, options);
Prob.DUNDEE.optPar(20) = 1;
Result = tomRun('minlpBB', Prob, 2);
```

tomSym makes it possible to build models with two or more variable indices in MATLAB.^{ [2] } The following example creates a variable 'flow' with four indices. The variable is then used to create a constraint over two of the indices and to sum, over all indices, its element-wise product with a two-dimensional distance matrix.

```matlab
% Create the indices used in the model
i = tomArrayIdx('i', 1:6);
j = tomArrayIdx('j', 1:6);
k = tomArrayIdx('k', 1:6);
l = tomArrayIdx('l', 1:6);

% Create an integer variable of full length
flow = tom('flow', 6^4, 1, 'int');

% Convert the variable to a matrix with four indices
flow = tomArray(flow, [6, 6, 6, 6]);

% Create a constraint valid for all i and j
cons = {sum(sum(flow(i,j,k,l), k), l) == 1};

% Create a scalar based on multi-index multiplications
distance = tomArray([ ...
    0    945  605  4667 4749 4394; ...
    945  0    866  3726 3806 3448; ...
    605  866  0    4471 4541 4152; ...
    4667 3726 4471 0    109  415; ...
    4749 3806 4541 109  0    431; ...
    4394 3448 4152 415  431  0]);
sumtotal = sum(vec((distance(i,k) + distance(l,j) + ...
    distance(k,l)*.8).*flow(i,j,k,l)));
```

For functions that cannot be interpreted by tomSym, it is possible to use either automatic differentiation or numerical differentiation. In the following example a simple problem is solved using both methods.

```matlab
toms x1 x2
alpha = 100;

%% USE MAD (AUTOMATIC DIFFERENTIATION) FOR ONE FUNCTION
%
% Create a wrapper function. In this case we use sin, but it could be any
% MAD supported function.
y = wrap(struct('fun','sin','n',1,'sz1',1,'sz2',1,'JFuns','MAD'), x1/x2);
f = alpha*(x2-x1^2)^2 + (1-x1)^2 + y;

% Setup and solve the problem
c = -x1^2 - x2;
con = {-1000 <= c <= 0
    -10 <= x1 <= 2
    -10 <= x2 <= 2};
x0 = {x1 == -1.2
    x2 == 1};
solution1 = ezsolve(f, con, x0);

%% USE NUMERICAL DIFFERENTIATION FOR ONE FUNCTION
%
% Create a new wrapper function. In this case we use sin, but it could be
% any function since we use numerical derivatives.
y = wrap(struct('fun','sin','n',1,'sz1',1,'sz2',1,'JFuns','FDJac'), x1/x2);
f = alpha*(x2-x1^2)^2 + (1-x1)^2 + y;
solution2 = ezsolve(f, con, x0);
```


- ↑ Rutquist, Per; Edvall, M. M. (Nov 2008). *User's Manual for TOMLAB* (PDF). Pullman, WA: Tomlab Optimization Inc.
- ↑ "Airline Hub Location", *TOMSYM Home Page*, April 2009.

This page is based on this Wikipedia article

Text is available under the CC BY-SA 4.0 license; additional terms may apply.

Images, videos and audio are available under their respective licenses.
