Multilinear principal component analysis (MPCA) is a multilinear extension of principal component analysis (PCA) that is used to analyze M-way arrays, also informally referred to as "data tensors". M-way arrays may be modeled by linear tensor models, such as CANDECOMP/PARAFAC, or by multilinear tensor models, such as multilinear principal component analysis (MPCA) or multilinear independent component analysis (MICA).
Tensor rank decomposition was introduced by Frank Lauren Hitchcock in 1927; [1] it was expanded upon by the Tucker decomposition [2] and by Kroonenberg's "3-mode PCA". [3] Kroonenberg's algorithm is an iterative algorithm that employs gradient descent. In 2000, De Lathauwer et al. restated Tucker's and Kroonenberg's work in clear terms in their SIAM paper "A Multilinear Singular Value Decomposition", [4] and provided an iterative algorithm that employed the power method in their paper "On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors". [5]
Vasilescu and Terzopoulos, in their paper "Multilinear Analysis of Image Ensembles: TensorFaces", [6] introduced the M-mode SVD algorithm, a simple and elegant algorithm suitable for parallel computation. This algorithm is often misidentified in the literature as the HOSVD or the Tucker decomposition, which are sequential iterative algorithms that employ gradient descent. Vasilescu and Terzopoulos framed data analysis, recognition, and synthesis problems as multilinear tensor problems: data is viewed as the compositional consequence of several causal factors, and is therefore well suited for multi-modal tensor factor analysis. The power of the tensor framework was showcased by analyzing human motion joint angles, facial images, and textures in the following papers: human motion signatures [7] (CVPR 2001, ICPR 2002), face recognition – TensorFaces [6] [8] (ECCV 2002, CVPR 2003, etc.), and computer graphics – TensorTextures [9] (SIGGRAPH 2004).
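The core idea can be illustrated with a short NumPy sketch (the function names and the reconstruction check below are illustrative, not from the original papers): the left singular vectors of each mode unfolding are computed independently, which is why the per-mode computations can run in parallel, and the core tensor is then obtained by projecting the data tensor onto all of the mode matrices.

```python
import numpy as np

def mode_unfold(T, m):
    """Mode-m unfolding: move axis m to the front and flatten the rest."""
    return np.moveaxis(T, m, 0).reshape(T.shape[m], -1)

def mode_multiply(T, U, m):
    """Mode-m product T x_m U: contract U against axis m of T."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(T, m, 0), axes=1), 0, m)

def m_mode_svd(T):
    """Illustrative M-mode SVD sketch: each mode matrix U_m holds the
    left singular vectors of the mode-m unfolding; the core tensor Z is
    T multiplied by U_m^T in every mode."""
    Us = [np.linalg.svd(mode_unfold(T, m), full_matrices=False)[0]
          for m in range(T.ndim)]
    Z = T
    for m, U in enumerate(Us):
        Z = mode_multiply(Z, U.T, m)
    return Us, Z
```

Because each mode matrix is computed from a single (non-iterative) SVD of the corresponding unfolding, multiplying the core tensor back by every mode matrix recovers the original tensor exactly when no singular vectors are truncated.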
In 2005, Vasilescu and Terzopoulos introduced the Multilinear PCA [10] terminology to better differentiate between linear and multilinear tensor decompositions, and to distinguish the work [7] [6] [8] [9] that employed second-order statistics associated with each data tensor mode (axis) from subsequent work on Multilinear Independent Component Analysis [10] that employed higher-order statistics associated with each tensor mode/axis.
Multilinear PCA may be applied to compute the causal factors of data formation, or as a signal processing tool on data tensors whose individual observations have either been vectorized [7] [6] [8] [9] or treated as a collection of column/row observations, an "observation as a matrix", and concatenated into a data tensor. The main disadvantage of the latter approach is that MPCA computes a set of orthonormal matrices associated with the row and column spaces that are unrelated to the causal factors of data formation.
The MPCA solution follows the alternating least squares (ALS) approach and is therefore iterative in nature. As in PCA, MPCA works on centered data; centering is somewhat more complicated for tensors, and it is problem dependent.
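The ALS scheme can be sketched as follows (a minimal NumPy sketch under simplifying assumptions: observations are stacked along the first axis, centering is done by subtracting the mean observation, and the function names, ranks, and iteration count are illustrative). Each pass fixes all but one mode's projection matrix, projects the centered data onto the other modes' subspaces, and refreshes the free mode's matrix from the leading left singular vectors of the resulting unfolding.

```python
import numpy as np

def unfold(T, m):
    """Mode-m unfolding: move axis m to the front and flatten the rest."""
    return np.moveaxis(T, m, 0).reshape(T.shape[m], -1)

def mode_multiply(T, U, m):
    """Mode-m product T x_m U."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(T, m, 0), axes=1), 0, m)

def mpca_als(X, ranks, n_iter=10):
    """ALS sketch of MPCA. X has shape (samples, d1, ..., dM); ranks
    gives the target subspace dimension for each non-sample mode.
    Returns one orthonormal projection matrix per non-sample mode."""
    Xc = X - X.mean(axis=0)                   # center over observations
    M = Xc.ndim - 1
    # initialize each mode matrix from the SVD of its mode unfolding
    Us = [np.linalg.svd(unfold(Xc, m + 1), full_matrices=False)[0][:, :ranks[m]]
          for m in range(M)]
    for _ in range(n_iter):
        for m in range(M):
            # project onto the other modes' current subspaces
            Y = Xc
            for k in range(M):
                if k != m:
                    Y = mode_multiply(Y, Us[k].T, k + 1)
            # update mode m from the leading left singular vectors
            Us[m] = np.linalg.svd(unfold(Y, m + 1),
                                  full_matrices=False)[0][:, :ranks[m]]
    return Us
```

In practice the loop would terminate on a convergence criterion (e.g. the change in captured variation falling below a tolerance) rather than a fixed iteration count.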
MPCA features: supervised MPCA is employed in causal factor analysis that facilitates object recognition, [11] while semi-supervised MPCA feature selection is employed in visualization tasks. [12]
Various extensions of MPCA: