WORKSHOP on TENSOR COMPUTATIONS

A mini-workshop on tensor computations will take place on Tuesday 16 October, 9.15–12.00.
The speakers and titles are:

Mariya Ishteva, Catholic University of Leuven, Belgium:
Higher-order tensors: best rank-$(R_1,R_2,\ldots,R_N)$ approximation algorithms

Giorgio Tomasi, Copenhagen University:
Recent developments in fast algorithms for fitting the PARAFAC model

Berkant Savas, Linköping University:
A Quasi-Newton Algorithm for Best Multi-linear Rank Approximation of Tensors

The workshop will take place in Kompakta rummet.

Welcome!

Lars Eldén

ABSTRACTS

**Higher-order tensors: best rank-$(R_1,R_2,\ldots,R_N)$ approximation algorithms**

Mariya Ishteva, Catholic University of Leuven, Belgium

Higher-order tensors are generalizations of matrices in the same
way as matrices are generalizations of vectors and vectors are
generalizations of scalars. They have various application areas, such
as biomedical engineering, image processing, scientific computing,
system identification, control, and signal processing. For example,
dimensionality reduction of a multi-way problem can be achieved by
the best rank-$(R_1,R_2,\ldots,R_N)$ approximation of tensors.
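The standard starting point for such a dimensionality reduction is the truncated higher-order SVD, which projects each mode onto the span of the dominant left singular vectors of the corresponding unfolding. A minimal NumPy sketch (function names `unfold`, `mode_mult`, and `truncated_hosvd` are our own, not from the talk):

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def mode_mult(X, U, mode):
    """Mode-n product X x_n U: multiply the n-th mode of X by the matrix U."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(X, mode, 0), axes=1), 0, mode)

def truncated_hosvd(X, ranks):
    """Rank-(R1,...,RN) approximation via truncated HOSVD.
    Quasi-optimal, not the *best* approximation the abstract refers to."""
    Us = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
          for n, r in enumerate(ranks)]
    core = X
    for n, U in enumerate(Us):
        core = mode_mult(core, U.T, n)  # project each mode onto its subspace
    return core, Us
```

For a tensor that is exactly of multilinear rank $(2,2,2)$, this projection recovers the tensor exactly; for general tensors it only gives a good (quasi-optimal) approximation, which is why iterative refinement is needed.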

Contrary to the matrix case, where the best low-rank approximation of
a matrix can be obtained directly from the truncated singular value
decomposition, the best rank-$(R_1,R_2,\ldots,R_N)$ approximation of
tensors cannot be computed in a straightforward way. In this talk, we
present the higher-order orthogonal iteration (HOOI) algorithm and
derive two new algorithms that solve this problem, based on the
trust-region and conjugate gradient methods on manifolds. We also
touch on some of the applications.
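The higher-order orthogonal iteration mentioned above can be sketched in a few lines of NumPy: cycle over the modes and, for each mode, take the dominant left singular vectors of the tensor projected onto the current subspaces of all other modes. This is our own illustrative sketch (helper names `unfold`, `mode_mult`, `hooi` are assumptions), not code from the talk:

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def mode_mult(X, U, mode):
    """Mode-n product X x_n U."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(X, mode, 0), axes=1), 0, mode)

def hooi(X, ranks, iters=25):
    """Higher-order orthogonal iteration for the best
    rank-(R1,...,RN) approximation (a basic sketch, fixed iteration
    count instead of a convergence test)."""
    # Initialize with the truncated-HOSVD factor matrices.
    Us = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
          for n, r in enumerate(ranks)]
    for _ in range(iters):
        for n in range(len(ranks)):
            Y = X
            for m, U in enumerate(Us):
                if m != n:
                    Y = mode_mult(Y, U.T, m)  # project all other modes
            # Update mode n from the dominant left singular vectors.
            Us[n] = np.linalg.svd(unfold(Y, n), full_matrices=False)[0][:, :ranks[n]]
    core = X
    for n, U in enumerate(Us):
        core = mode_mult(core, U.T, n)
    return core, Us
```

Each mode update maximizes the core norm given the other factors, so the fit improves monotonically; the trust-region and conjugate gradient methods of the talk address the same objective with faster local convergence.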

Key words: multilinear algebra, higher-order
tensor, rank reduction, singular value decomposition, trust-region,
conjugate gradients, Grassmann manifold

**Recent developments in fast algorithms for fitting the
PARAFAC model**

Giorgio Tomasi, Copenhagen University

The PARAFAC model is commonly used in chemometrics because of its essential uniqueness properties and because it corresponds to the physical model of several types of data (e.g. spectroscopic or chromatographic data). This model is considerably more difficult and expensive to fit than the HO-SVD and low-rank Tucker models, and several algorithms have been tested with varying degrees of success. In particular, none of the procedures implemented thus far appears to be the best for all problems. For example, the Alternating Least Squares (ALS) algorithm is slow in cases of high collinearity, whereas the Levenberg-Marquardt algorithm may be too expensive for large problems.

The study of the properties of the Khatri-Rao product, which is a selection of columns of the more common Kronecker product, has allowed for great improvements in the efficiency of ALS and all other methods. A QR-decomposition-based Levenberg-Marquardt algorithm exploiting these properties will be outlined that reduces the computational complexity to O((I + J + K)F) for a three-way array and is compatible with nonnegativity constraints.
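To make the ALS baseline concrete, here is a minimal sketch of the Khatri-Rao product and plain ALS for a rank-F PARAFAC model of a three-way array. This is our own illustration of the standard algorithm (names `khatri_rao`, `parafac_als` are assumptions), not the accelerated QR-based scheme of the talk:

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product: (I*J) x F."""
    I, F = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, F)

def parafac_als(X, F, iters=200, seed=0):
    """Plain ALS for X[i,j,k] ~ sum_f A[i,f] B[j,f] C[k,f]."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, F))
    B = rng.standard_normal((J, F))
    C = rng.standard_normal((K, F))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(iters):
        # Each update is an exact least-squares solve via the normal
        # equations; the F x F Gram matrix is a cheap Hadamard product.
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

Note that the Khatri-Rao product is never expanded to the full IJ x F^2 Kronecker matrix, which is one source of the efficiency gains the abstract mentions.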

**Quasi-Newton algorithms based on BFGS and LM-BFGS for best multilinear rank approximation of tensors**

Berkant Savas, Linköping University

In this talk we introduce two methods for solving the best multilinear rank approximation problem. The first employs BFGS updates and the second limited-memory BFGS (LM-BFGS) updates for approximating the Hessian of the objective function. Both algorithms exploit the fact that the problem may be viewed as an optimization problem over a product of Grassmann manifolds. Tensor approximation problems occur in various applications involving multidimensional data. The performance of the quasi-Newton algorithms is compared with the Newton-Grassmann and Higher Order Orthogonal Iteration algorithms for general and symmetric 3-tensors. We also present convergence plots for the LM-BFGS algorithm.
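The Grassmann-manifold viewpoint can be checked numerically: the objective f(U1,U2,U3) = ||X x1 U1' x2 U2' x3 U3'||_F depends only on the column subspaces of the factor matrices, not on the particular orthonormal bases, which is exactly why the problem lives on a product of Grassmann manifolds rather than Stiefel manifolds. A small sketch illustrating the invariance (names `mode_mult`, `objective` are our own):

```python
import numpy as np

def mode_mult(X, U, mode):
    """Mode-n product X x_n U."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(X, mode, 0), axes=1), 0, mode)

def objective(X, Us):
    """f(U1,U2,U3) = ||X x1 U1' x2 U2' x3 U3'||_F. Maximizing f over
    orthonormal factors is equivalent to minimizing the error of the
    best multilinear rank approximation."""
    G = X
    for n, U in enumerate(Us):
        G = mode_mult(G, U.T, n)
    return np.linalg.norm(G)

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 6, 7))
Us = [np.linalg.qr(rng.standard_normal((n, 2)))[0] for n in (5, 6, 7)]
Q = np.linalg.qr(rng.standard_normal((2, 2)))[0]  # orthogonal 2x2 basis change
# f is unchanged when any factor is rotated within its column space:
f1 = objective(X, Us)
f2 = objective(X, [Us[0] @ Q, Us[1], Us[2]])
```

Here `f1` and `f2` agree to machine precision, confirming that only the subspaces matter.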