Multilinear principal component analysis

Multilinear principal component analysis (MPCA) is a multilinear extension of principal component analysis (PCA). MPCA is employed in the analysis of n-way arrays, i.e., cubes or hyper-cubes of numbers, also informally referred to as "data tensors" (a short sketch of such an array follows the list below). N-way arrays may be decomposed, analyzed, or modeled by

  • linear tensor models such as CANDECOMP/Parafac, or
  • multilinear tensor models, such as multilinear principal component analysis (MPCA), or multilinear independent component analysis (MICA), etc.
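
As a simple illustration, the following sketch constructs a hypothetical 3-way data tensor in NumPy. The mode names (people, viewpoints, pixels) and the sizes are illustrative assumptions that loosely echo the image-ensemble setting discussed below; they are not taken from any of the cited works.

    # A hypothetical 3-way data tensor: vectorized facial images organized by
    # person x viewpoint x pixel (mode names and sizes are assumptions).
    import numpy as np

    n_people, n_views, n_pixels = 10, 5, 32 * 32
    rng = np.random.default_rng(0)

    # Each fiber data_tensor[i, j, :] holds one vectorized image of person i
    # under viewpoint j (random numbers stand in for real pixel data here).
    data_tensor = rng.standard_normal((n_people, n_views, n_pixels))
    print(data_tensor.shape)  # (10, 5, 1024) -- a "cube" of numbers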

The origin of MPCA can be traced back to the Tucker decomposition[1] and to Peter Kroonenberg's "M-mode PCA/3-mode PCA" work.[2] In 2000, De Lathauwer et al. restated Tucker's and Kroonenberg's work in clear and concise numerical terms in their SIAM paper "A Multilinear Singular Value Decomposition"[3] (HOSVD) and in their paper "On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors".[4]

Circa 2001, Vasilescu reframed the data analysis, recognition, and synthesis problems as multilinear tensor problems, based on the insight that most observed data are the compositional consequence of several causal factors of data formation and are therefore well suited to multi-modal data tensor analysis. The power of the tensor framework was showcased by analyzing human motion joint angles, facial images, and textures in terms of their causal factors of data formation in the following works: Human Motion Signatures[5] (CVPR 2001, ICPR 2002), face recognition – TensorFaces[6][7] (ECCV 2002, CVPR 2003, etc.), and computer graphics – TensorTextures[8] (Siggraph 2004).

Historically, MPCA has been referred to as "M-mode PCA", a term coined by Peter Kroonenberg in 1980.[2] In 2005, Vasilescu and Terzopoulos introduced the Multilinear PCA[9] terminology to better differentiate between linear and multilinear tensor decompositions, as well as to distinguish the work[5][6][7][8] that computed second-order statistics associated with each data tensor mode (axis) from subsequent work on Multilinear Independent Component Analysis,[9] which computed higher-order statistics associated with each tensor mode (axis).

Multilinear PCA may be applied to compute the causal factors of data formation, or as a signal processing tool on data tensors whose individual observations have either been vectorized[5][6][7][8] or treated as matrices[10] and concatenated into a data tensor.

MPCA computes a set of orthonormal matrices associated with each mode of the data tensor, analogous to the orthonormal row and column spaces of a matrix computed by the matrix SVD. This transformation aims to capture as much of the variability as possible in the data associated with each data tensor mode (axis).
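
The analogy with the matrix SVD can be made concrete with a short NumPy sketch (a toy illustration under assumed sizes, not an implementation from any of the cited papers): for a 2-way array the per-mode orthonormal bases coincide with the left and right singular vectors, and for an N-way data tensor one orthonormal basis is obtained from each mode unfolding.

    import numpy as np

    def unfold(X, mode):
        """Mode-n unfolding: matricize tensor X along the given mode."""
        return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

    rng = np.random.default_rng(0)

    # For an ordinary matrix (a 2-way array), the per-mode bases are exactly
    # the left and right singular vectors from the matrix SVD.
    A = rng.standard_normal((6, 4))
    U0 = np.linalg.svd(unfold(A, 0), full_matrices=False)[0]  # column space of A
    U1 = np.linalg.svd(unfold(A, 1), full_matrices=False)[0]  # row space of A
    Uc, s, Vh = np.linalg.svd(A, full_matrices=False)
    print(np.allclose(np.abs(U0), np.abs(Uc)))    # True (up to column signs)
    print(np.allclose(np.abs(U1), np.abs(Vh.T)))  # True (up to column signs)

    # For an N-way data tensor, one orthonormal basis is obtained per mode.
    X = rng.standard_normal((6, 4, 3))
    mode_bases = [np.linalg.svd(unfold(X, n), full_matrices=False)[0]
                  for n in range(X.ndim)]
    print([B.shape for B in mode_bases])          # [(6, 6), (4, 4), (3, 3)]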

The algorithm

The MPCA solution follows the alternating least squares (ALS) approach[2] and is iterative in nature. As in PCA, MPCA works on centered data; centering is somewhat more complicated for tensors, and it is problem dependent.
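
The sketch below gives a minimal ALS-style iteration in NumPy, in the Tucker/ALS spirit of the cited approach:[2] each mode matrix is updated in turn from the SVD of the data tensor's mode unfolding after projection along all other modes. The centering choice (subtracting the mean over an assumed sample mode), the fixed iteration count, and the per-mode ranks are illustrative assumptions; published MPCA formulations differ in such details.

    import numpy as np

    def unfold(X, mode):
        """Mode-n unfolding: matricize tensor X along the given mode."""
        return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

    def mode_product(X, U, mode):
        """Mode-n product: multiply tensor X by matrix U along the given mode."""
        Y = U @ unfold(X, mode)
        shape = (U.shape[0],) + tuple(s for i, s in enumerate(X.shape) if i != mode)
        return np.moveaxis(Y.reshape(shape), 0, mode)

    def mpca_als(X, ranks, sample_mode=0, n_iter=10):
        """ALS sketch: one orthonormal projection matrix per tensor mode."""
        # One possible centering choice: subtract the mean over the sample mode.
        X = X - X.mean(axis=sample_mode, keepdims=True)

        # Initialize each mode matrix from the SVD of the corresponding unfolding.
        U = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
             for n, r in enumerate(ranks)]

        for _ in range(n_iter):          # fixed iteration count (an assumption)
            for n in range(X.ndim):
                # Fix all other mode matrices, project along them, then update
                # mode n's basis from the leading left singular vectors.
                Y = X
                for m in range(X.ndim):
                    if m != n:
                        Y = mode_product(Y, U[m].T, m)
                U[n] = np.linalg.svd(unfold(Y, n),
                                     full_matrices=False)[0][:, :ranks[n]]

        # Core tensor of multilinear features.
        core = X
        for n in range(X.ndim):
            core = mode_product(core, U[n].T, n)
        return U, core

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20, 20))   # e.g. 100 samples of 20x20 observations
    U, core = mpca_als(X, ranks=(10, 5, 5))
    print([u.shape for u in U], core.shape)  # [(100, 10), (20, 5), (20, 5)] (10, 5, 5)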

Feature selection

MPCA features: Supervised MPCA feature selection has been used in object recognition,[11] while unsupervised MPCA feature selection has been employed in visualization tasks.[12]

Extensions

Various extensions of MPCA have been developed:[13]

  • Uncorrelated MPCA (UMPCA), which generates uncorrelated multilinear features[14]
  • Boosting+MPCA[15]
  • Non-negative MPCA (NMPCA)[16]
  • Robust MPCA (RMPCA)[17]
  • Multi-tensor factorization (MTF), which also determines the number of components automatically[18]

References

  1. Tucker, Ledyard R (September 1966). "Some mathematical notes on three-mode factor analysis". Psychometrika. 31 (3): 279–311. doi:10.1007/BF02289464. PMID 5221127.
  2. P. M. Kroonenberg and J. de Leeuw, Principal component analysis of three-mode data by means of alternating least squares algorithms, Psychometrika, 45 (1980), pp. 69–97.
  3. Lathauwer, L.D.; Moor, B.D.; Vandewalle, J. (2000). "A multilinear singular value decomposition". SIAM Journal on Matrix Analysis and Applications. 21 (4): 1253–1278. doi:10.1137/s0895479896305696.
  4. Lathauwer, L. D.; Moor, B. D.; Vandewalle, J. (2000). "On the best rank-1 and rank-(R1, R2, ..., RN ) approximation of higher-order tensors". SIAM Journal on Matrix Analysis and Applications. 21 (4): 1324–1342. doi:10.1137/s0895479898346995.
  5. M.A.O. Vasilescu (2002) "Human Motion Signatures: Analysis, Synthesis, Recognition," Proceedings of International Conference on Pattern Recognition (ICPR 2002), Vol. 3, Quebec City, Canada, Aug, 2002, 456–460.
  6. M.A.O. Vasilescu, D. Terzopoulos (2002) "Multilinear Analysis of Image Ensembles: TensorFaces," Proc. 7th European Conference on Computer Vision (ECCV'02), Copenhagen, Denmark, May, 2002, in Computer Vision – ECCV 2002, Lecture Notes in Computer Science, Vol. 2350, A. Heyden et al. (Eds.), Springer-Verlag, Berlin, 2002, 447–460.
  7. M.A.O. Vasilescu, D. Terzopoulos (2003) "Multilinear Subspace Analysis for Image Ensembles," Proc. Computer Vision and Pattern Recognition Conf. (CVPR '03), Vol. 2, Madison, WI, June 2003, 93–99.
  8. M.A.O. Vasilescu, D. Terzopoulos (2004) "TensorTextures: Multilinear Image-Based Rendering," Proc. ACM SIGGRAPH 2004 Conference, Los Angeles, CA, August 2004, in Computer Graphics Proceedings, Annual Conference Series, 2004, 336–342.
  9. M.A.O. Vasilescu, D. Terzopoulos (2005) "Multilinear Independent Component Analysis," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR'05), San Diego, CA, June 2005, Vol. 1, 547–553.
  10. Lu, H.; Plataniotis, K. N.; Venetsanopoulos, A. N. (2008). "MPCA: Multilinear principal component analysis of tensor objects" (PDF). IEEE Trans. Neural Netw. 19 (1): 18–39. CiteSeerX 10.1.1.331.5543. doi:10.1109/tnn.2007.901277. PMID 18269936.
  11. M.A.O. Vasilescu, D. Terzopoulos (2003) "Multilinear Subspace Analysis of Image Ensembles," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR'03), Madison, WI, June 2003.
  12. H. Lu, H.-L. Eng, M. Thida, and K.N. Plataniotis, "Visualization and Clustering of Crowd Video Content in MPCA Subspace," in Proceedings of the 19th ACM Conference on Information and Knowledge Management (CIKM 2010), Toronto, ON, Canada, October, 2010.
  13. Lu, Haiping; Plataniotis, K.N.; Venetsanopoulos, A.N. (2011). "A Survey of Multilinear Subspace Learning for Tensor Data" (PDF). Pattern Recognition. 44 (7): 1540–1551. doi:10.1016/j.patcog.2011.01.004.
  14. H. Lu, K. N. Plataniotis, and A. N. Venetsanopoulos, "Uncorrelated multilinear principal component analysis for unsupervised multilinear subspace learning," IEEE Trans. Neural Netw., vol. 20, no. 11, pp. 1820–1836, Nov. 2009.
  15. H. Lu, K. N. Plataniotis and A. N. Venetsanopoulos, "Boosting Discriminant Learners for Gait Recognition using MPCA Features," EURASIP Journal on Image and Video Processing, Volume 2009, Article ID 713183, 11 pages, 2009. doi:10.1155/2009/713183.
  16. Y. Panagakis, C. Kotropoulos, G. R. Arce, "Non-negative multilinear principal component analysis of auditory temporal modulations for music genre classification", IEEE Trans. on Audio, Speech, and Language Processing, vol. 18, no. 3, pp. 576–588, 2010.
  17. K. Inoue, K. Hara, K. Urahama, "Robust multilinear principal component analysis", Proc. IEEE Conference on Computer Vision, 2009, pp. 591–597.
  18. Khan, Suleiman A.; Leppäaho, Eemeli; Kaski, Samuel (2016-06-10). "Bayesian multi-tensor factorization". Machine Learning. 105 (2): 233–253. arXiv:1412.4679. doi:10.1007/s10994-016-5563-y. ISSN 0885-6125.