Tensor Decomposition for Machine Learning

Xinyu Chen, Dingyi Zhuang, Jinhua Zhao (2024)

An overview of the development of tensor decomposition models and algorithms, with tutorials covering matrix and tensor computations and the use of tensor decomposition techniques across a wide range of scientific areas and applications.

Introduction

This article summarizes the development of tensor decomposition models and algorithms in the literature, offering comprehensive reviews and tutorials on topics ranging from matrix and tensor computations to tensor decomposition techniques across a wide range of scientific areas and applications. Since tensor decomposition is often formulated as an optimization problem, this article also provides a preliminary introduction to some classical methods for solving convex and nonconvex optimization problems. This work aims to offer valuable insights to the machine learning and data science communities by drawing strong connections between tensor decomposition and the key concepts of these fields. To ensure reproducibility and sustainability, we provide resources such as datasets and Python implementations, built primarily on Python's numpy library. The content includes:

  • Introduction
  • What are tensors?
  • Foundation of tensor computations
  • Foundation of optimization
  • CP decomposition
  • Tucker decomposition
  • Tensor-train decomposition
  • Bayesian tensor factorization
  • Non-negative tensor factorization
  • Multi-relational tensor factorization
  • Robust tensor factorization
  • Multilinear tensor regression
  • Low-rank tensor completion
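
As a first taste of the numpy-based material mentioned above, the sketch below builds a small third-order tensor and computes its mode-n unfoldings. The `unfold` helper and the Kolda–Bader column ordering used here are illustrative choices for this sketch, not necessarily the conventions adopted in the text.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: arrange the mode-n fibers as columns of a matrix."""
    return np.reshape(np.moveaxis(tensor, mode, 0), (tensor.shape[mode], -1), order='F')

# A third-order tensor of size 3 x 4 x 5, stored as a numpy array.
X = np.arange(60, dtype=float).reshape(3, 4, 5)

print(X.ndim, X.shape)     # order 3, dimensions (3, 4, 5)
print(unfold(X, 0).shape)  # (3, 20)
print(unfold(X, 1).shape)  # (4, 15)
print(unfold(X, 2).shape)  # (5, 12)
```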

For more details on the table of contents, please see the outline below.

Foundation of Optimization

  • Gradient descent methods [Gradient descent | Steepest gradient descent | Conjugate gradient descent | Proximal gradient descent | LASSO]
  • Alternating minimization [Alternating least squares | Subproblem approximation for generalized Sylvester equations]
  • Alternating direction method of multipliers [Problem formulation | Augmented Lagrangian method | LASSO]
  • Greedy methods [Orthogonal matching pursuit | Subspace pursuit]
  • Bayesian optimization [Conjugate priors | Bayesian inference (e.g., MCMC) | Bayesian linear regression]
  • Power iteration [Eigenvalue decomposition | Randomized singular value decomposition] (see the sketch after this list)
  • Procrustes problems [Orthogonal Procrustes problem]
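
As a concrete illustration of the power iteration entry above, here is a minimal numpy sketch that approximates the dominant eigenpair of a symmetric matrix using a Rayleigh-quotient stopping rule. The function name, tolerance, and iteration cap are arbitrary choices for this example, not values taken from the text.

```python
import numpy as np

def power_iteration(A, num_iters=100, tol=1e-10):
    """Approximate the dominant eigenvalue and eigenvector of a square matrix A."""
    rng = np.random.default_rng(0)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    eigval = x @ A @ x  # Rayleigh quotient of the initial guess
    for _ in range(num_iters):
        y = A @ x
        x = y / np.linalg.norm(y)       # normalize the iterate
        eigval_new = x @ A @ x          # updated Rayleigh quotient
        if abs(eigval_new - eigval) < tol:
            return eigval_new, x
        eigval = eigval_new
    return eigval, x

# Quick check against numpy's dense eigensolver on a random symmetric matrix.
rng = np.random.default_rng(1)
B = rng.standard_normal((50, 50))
A = B @ B.T  # symmetric positive semidefinite test matrix
lam, v = power_iteration(A)
print(lam, np.linalg.eigvalsh(A)[-1])  # the two values should agree closely
```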

CP Decomposition

  • Randomized CP decomposition
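
To make this chapter's starting point concrete, below is a minimal CP decomposition of a third-order tensor via alternating least squares, written only with numpy. The helper names (`unfold`, `khatri_rao`, `cp_als`), the rank, and the fixed iteration count are placeholders of this sketch rather than code from the accompanying repository; the randomized variant listed above adds a sketching step that is not shown here.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding (Kolda-Bader convention)."""
    return np.reshape(np.moveaxis(tensor, mode, 0), (tensor.shape[mode], -1), order='F')

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of A (I x R) and B (J x R)."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(X, rank, num_iters=50):
    """Rank-R CP decomposition of a third-order tensor via alternating least squares."""
    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((n, rank)) for n in X.shape)
    for _ in range(num_iters):
        A = unfold(X, 0) @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = unfold(X, 1) @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = unfold(X, 2) @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

# Recover a synthetic rank-3 tensor and check the relative error.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.standard_normal((10, 3)), rng.standard_normal((12, 3)), rng.standard_normal((14, 3))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=3)
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))  # typically very small for this exact rank-3 example
```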

Tucker Decomposition

  • Higher-order singular value decomposition
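
For reference, the higher-order SVD listed above can be sketched in a few lines of numpy: take the leading left singular vectors of each mode unfolding as factor matrices, then project the tensor onto them to obtain the core. The multilinear ranks and the `hosvd` helper below are illustrative assumptions of this sketch.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding (Kolda-Bader convention)."""
    return np.reshape(np.moveaxis(tensor, mode, 0), (tensor.shape[mode], -1), order='F')

def hosvd(X, ranks):
    """Truncated higher-order SVD of a third-order tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(X, mode), full_matrices=False)
        factors.append(U[:, :r])  # leading left singular vectors of the mode unfolding
    # Core tensor: project X onto the column spaces of the factor matrices.
    core = np.einsum('ijk,ia,jb,kc->abc', X, *factors)
    return core, factors

X = np.random.randn(10, 12, 14)
core, (U1, U2, U3) = hosvd(X, ranks=(5, 5, 5))
X_hat = np.einsum('abc,ia,jb,kc->ijk', core, U1, U2, U3)
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))  # approximation error of the truncated HOSVD
```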

Bayesian Tensor Factorization

Non-Negative Tensor Factorization

  • Non-negative matrix factorization
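
As a warm-up for this chapter, the sketch below implements the classical multiplicative-update rules of Lee and Seung for non-negative matrix factorization under the Frobenius-norm loss. The matrix sizes, rank, iteration count, and the small constant added to avoid division by zero are illustrative choices of this sketch.

```python
import numpy as np

def nmf(X, rank, num_iters=200, eps=1e-10):
    """Non-negative matrix factorization X ~ W @ H via multiplicative updates."""
    rng = np.random.default_rng(0)
    m, n = X.shape
    W = rng.random((m, rank))   # non-negative initialization
    H = rng.random((rank, n))
    for _ in range(num_iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)  # update H with W fixed
        W *= (X @ H.T) / (W @ H @ H.T + eps)  # update W with H fixed
    return W, H

# Factorize a non-negative matrix and report the relative reconstruction error.
rng = np.random.default_rng(1)
X = rng.random((60, 40))
W, H = nmf(X, rank=10)
print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))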

Robust Tensor Factorization

Low-Rank Tensor Completion

Tensor-Train Decomposition
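
As a point of reference for this chapter, a tensor-train decomposition factorizes a tensor into a chain of third-order cores. The plain TT-SVD below computes such cores by sequential truncated SVDs in numpy; the `tt_svd` helper, the chosen TT ranks, and the truncation rule are illustrative assumptions of this sketch rather than the algorithmic choices made in the text.

```python
import numpy as np

def tt_svd(X, ranks):
    """Tensor-train (TT) decomposition by sequential truncated SVDs.
    `ranks` are the internal TT ranks (r_1, ..., r_{d-1}); boundary ranks are 1."""
    dims, d = X.shape, X.ndim
    ranks = [1] + list(ranks) + [1]
    cores, C = [], X.reshape(dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = ranks[k + 1]
        cores.append(U[:, :r].reshape(ranks[k], dims[k], r))     # k-th TT core
        C = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)  # carry the remainder forward
    cores.append(C.reshape(ranks[-2], dims[-1], 1))              # last TT core
    return cores

# Decompose a random 4th-order tensor with maximal TT ranks, so the result is exact up to rounding.
X = np.random.randn(6, 7, 8, 9)
cores = tt_svd(X, ranks=(6, 42, 9))
X_hat = cores[0]
for G in cores[1:]:
    X_hat = np.tensordot(X_hat, G, axes=([-1], [0]))  # chain the cores over the TT ranks
print(np.linalg.norm(X - X_hat[0, ..., 0]) / np.linalg.norm(X))  # close to zero
```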

Tensor Regression

Regularization Techniques

Open Science on GitHub

Feel free to reach out to us via the GitHub repository or email with any suggestions and feedback. We appreciate contributions in any form to advance the development of open science.