20–21 Jan 2025
Aula Magna "Fratelli Pontecorvo", Building E, Polo Fibonacci, Pisa
Europe/Rome timezone

Tensor-Oriented LSQR for Tensor Least Squares Problems

20 Jan 2025, 14:40
20m
Aula Magna "Fratelli Pontecorvo", Building E, Polo Fibonacci, Pisa

Largo Bruno Pontecorvo 3, 56127 Pisa (Building E)

Speaker

Lorenzo Piccinini (Università di Bologna)

Description

We are interested in the numerical solution of the multiterm tensor least squares problem
$$ \min_{\mathcal{X}} \Big\| \mathcal{F} - \sum_{i=1}^{\ell} \mathcal{X} \times_1 A_1^{(i)} \times_2 A_2^{(i)} \cdots \times_d A_d^{(i)} \Big\|_F, $$ where $\mathcal{X}\in\mathbb{R}^{m_1 \times m_2 \times \cdots \times m_d}$ and $\mathcal{F}\in\mathbb{R}^{n_1\times n_2 \times \cdots \times n_d}$ are tensors with $d$ dimensions (or modes), while $A_j^{(i)}\in\mathbb{R}^{n_j \times m_j}$ for every $i=1,\ldots,\ell$ and $j=1,\ldots,d$. The symbol $\times_j$, with $j=1,\ldots,d$, denotes the $j$-mode product of a tensor with a matrix. We are interested in the Tucker and Tensor-Train formats [4,5]. Tensor (and matrix) least squares formulations have emerged in the recent literature from a variety of applications, including the numerical solution of PDEs, data science problems such as dictionary learning [1,2], and control systems. The problem is challenging due to the absence of direct methods that can efficiently handle multiple addends in tensorial form. In particular, the memory requirements become prohibitive even for modest mode sizes when $\ell$ is large.
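
For concreteness, the sketch below (a minimal NumPy illustration of our own, not the authors' code) evaluates the $j$-mode product and the multiterm residual norm from the formulation above; the mode sizes, the number of terms, and the random data are arbitrary assumptions chosen for the example.

```python
# Minimal sketch of the j-mode product and the multiterm residual norm.
# Note: modes are 0-based here, while the abstract indexes them 1,...,d.
import numpy as np

def mode_product(X, A, j):
    """j-mode product X x_j A: multiply the mode-j fibers of X by A.

    X has shape (m_1, ..., m_d); A has shape (n_j, m_j); the result keeps
    every mode of X except mode j, whose size becomes n_j.
    """
    Xj = np.moveaxis(X, j, 0)              # bring mode j to the front
    Yj = np.tensordot(A, Xj, axes=(1, 0))  # contract (n_j, m_j) with (m_j, ...)
    return np.moveaxis(Yj, 0, j)           # put mode j back in place

def residual_norm(F, X, factors):
    """|| F - sum_i X x_1 A_1^{(i)} ... x_d A_d^{(i)} ||_F.

    `factors` is a list of length ell; factors[i] holds the d matrices
    A_1^{(i)}, ..., A_d^{(i)} of the i-th term.
    """
    R = F.copy()
    for Ai in factors:
        T = X
        for j, Aij in enumerate(Ai):
            T = mode_product(T, Aij, j)
        R = R - T
    return np.linalg.norm(R)

# Toy example with d = 3 modes and ell = 2 terms (sizes are arbitrary).
rng = np.random.default_rng(0)
m, n, ell = (4, 5, 6), (7, 8, 9), 2
X = rng.standard_normal(m)
F = rng.standard_normal(n)
factors = [[rng.standard_normal((n[j], m[j])) for j in range(3)]
           for _ in range(ell)]
print(residual_norm(F, X, factors))
```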

In our presentation we propose an implementation of truncated tensor-oriented LSQR, first introduced in vector form in [3] and studied in its matrix version in [6], and we illustrate the potential of these new approaches on problems stemming from the discretization of multidimensional PDEs and from tensor dictionary learning.
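
For context on why purely vector approaches fail to scale, note that column-major vectorization turns the problem into an ordinary least squares system with coefficient matrix $\sum_{i=1}^{\ell} A_d^{(i)} \otimes \cdots \otimes A_1^{(i)}$, which can be handed to the classical vector LSQR of [3] only for tiny sizes. The sketch below (our own illustrative baseline using SciPy, not the method of the talk) makes this explicit; all sizes are arbitrary assumptions.

```python
# Hedged baseline: solve the vectorized (Kronecker) form with classical LSQR.
# Feasible only for tiny sizes, since the explicit operator has
# prod(n) x prod(m) entries; this blow-up motivates tensor-oriented methods.
import numpy as np
from functools import reduce
from scipy.sparse.linalg import lsqr

def kron_operator(Ai):
    # vec(X x_1 A_1 ... x_d A_d) = (A_d kron ... kron A_1) vec(X)
    # for column-major (Fortran-order) vectorization.
    return reduce(np.kron, reversed(Ai))

rng = np.random.default_rng(0)
m, n, ell = (3, 4), (5, 6), 2                 # tiny d = 2 example
factors = [[rng.standard_normal((n[j], m[j])) for j in range(2)]
           for _ in range(ell)]
F = rng.standard_normal(n)

M = sum(kron_operator(Ai) for Ai in factors)  # explicit multiterm operator
x = lsqr(M, F.flatten(order="F"))[0]          # classical vector LSQR [3]
X = x.reshape(m, order="F")                   # fold the solution back
print(np.linalg.norm(M @ x - F.flatten(order="F")))  # residual norm
```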

  1. C. F. Dantas, J. E. Cohen, and R. Gribonval. "Learning Tensor-structured Dictionaries with Application to Hyperspectral Image Denoising". In: 27th European Signal Processing Conference, EUSIPCO 2019, A Coruña, Spain, September 2-6, 2019. IEEE, 2019, pp. 1-5.

  2. C. F. Dantas, M. N. da Costa, and R. da Rocha Lopes. "Learning dictionaries as a sum of Kronecker products". In: IEEE Signal Processing Letters 24.5 (2017), pp. 559-563.

  3. C. C. Paige and M. A. Saunders. "LSQR: An algorithm for sparse linear equations and sparse least squares". In: ACM Transactions on Mathematical Software (TOMS) 8.1 (1982), pp. 43-71.

  4. I. Oseledets. "Tensor-Train Decomposition". In: SIAM Journal on Scientific Computing 33.5 (2011), pp. 2295-2317.

  5. H. Al Daas et al. "Randomized Algorithms for Rounding in the Tensor-Train Format". In: SIAM Journal on Scientific Computing 45.1 (2023), pp. A74-A95.

  6. V. Simoncini and L. Piccinini. "Truncated LSQR for matrix least squares problems". February 2024, pp. 1-22. To appear in Computational Optimization and Applications (COAP).

Primary authors

Lorenzo Piccinini (Università di Bologna), Valeria Simoncini (Università di Bologna)

Presentation materials

There are no materials yet.