JAMA (numerical linear algebra library)

JAMA
Original author(s): NIST
Initial release: 1998
Stable release: 1.0.3 / November 9, 2012
Operating system: Cross-platform
Type: Library
License: Public domain software
Website: math.nist.gov/javanumerics/jama/

JAMA is a software library for performing numerical linear algebra tasks, created at the National Institute of Standards and Technology (NIST) in 1998. It is similar in functionality to LAPACK.

Functionality

The main capabilities provided by JAMA are:

  - Cholesky decomposition
  - LU decomposition
  - QR decomposition
  - Eigenvalue decomposition
  - Singular value decomposition

Versions exist for both C++ and the Java programming language. The C++ version uses the Template Numerical Toolkit for lower-level operations. The Java version provides the lower-level operations itself.
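
Beyond the decompositions themselves, JAMA's Matrix class offers convenience methods built on top of them. The following is a minimal sketch of solving a linear system (the class name and matrix values are illustrative, not from the source):

    import Jama.Matrix;

    public class JamaSolveExample {
        public static void main(String[] args) {
            // Illustrative 2x2 system A x = b
            Matrix A = new Matrix(new double[][] { { 4.0, 1.0 }, { 1.0, 3.0 } });
            Matrix b = new Matrix(new double[][] { { 1.0 }, { 2.0 } });

            // solve() uses LU decomposition for square systems and returns
            // a least-squares solution (via QR decomposition) otherwise.
            Matrix x = A.solve(b);

            x.print(10, 4); // column width 10, 4 digits after the decimal point
        }
    }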

History

As a work of a US governmental organization, the algorithms and source code were released into the public domain around 1998. [1] JAMA has seen little development since 2000, [2] with only occasional bug fixes being released. The project's web page states: "(JAMA) is no longer actively developed to keep track of evolving usage patterns in the Java language, nor to further improve the API. We will, however, fix outright errors in the code." [3] The last bug fix was released in November 2012; the previous one was released in 2005.

Usage examples

Example of Singular Value Decomposition (SVD):

    SingularValueDecomposition s = matA.svd();
    Matrix U = s.getU();
    Matrix S = s.getS();
    Matrix V = s.getV();
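
The factors satisfy A = U S V^T up to floating-point rounding, which can be checked with JAMA's own matrix operations (continuing the example above; the variable names reconstructed and error are illustrative):

    // Rebuild matA from its SVD factors.
    Matrix reconstructed = U.times(S).times(V.transpose());

    // Frobenius norm of the difference; should be close to zero.
    double error = reconstructed.minus(matA).normF();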

Example of matrix multiplication:

    Matrix result = A.times(B);
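
Both examples assume the operand matrices already exist. Matrices are constructed from 2D arrays of doubles; a short sketch with illustrative values:

    // Construct matrices from 2D double arrays; times() then
    // multiplies them as in the example above.
    Matrix A = new Matrix(new double[][] { { 1.0, 2.0 }, { 3.0, 4.0 } });
    Matrix B = Matrix.identity(2, 2); // 2x2 identity matrix
    Matrix product = A.times(B);      // equals A, since B is the identity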

See also

  - Template Numerical Toolkit
  - LAPACK
  - Comparison of linear algebra software libraries

References

  1. "JAMA: A Java Matrix Package". NIST. math.nist.gov.
  2. "JAMA Change Log". JAMA. NIST. November 8, 2012. Retrieved November 30, 2012.
  3. "JAMA Project Page". JAMA. NIST. Retrieved November 30, 2012.