August 31, 2025 to September 5, 2025
Palazzone di Cortona
Europe/Rome timezone

Sampling and Randomized Sketching for Efficient Matrix and Tensor Decompositions

Sep 1, 2025, 4:10 PM
1h 10m
Palazzone di Cortona

52044 Le Contesse, Province of Arezzo

Speaker

Sascha Portaro (University of Bologna)

Description

The randomized singular value decomposition (SVD) proposed in [1] has become one of the most well-established randomization-based algorithms in numerical linear algebra. Its core idea is the computation of a subspace that approximates the column space of a target matrix $\mathbf{A}$ with high probabilistic confidence. In our work [2], we introduce a modification of the standard randomized SVD that, at comparable computational cost, generally yields better approximations of $\text{Range}(\mathbf{A})$. This is achieved by explicitly incorporating information from the row space of $\mathbf{A}$. Moreover, we observe that only a limited amount of information from $\text{Range}(\mathbf{A}^T)$ is needed. Exploiting this, we show, both theoretically and through numerical experiments, that a variant of the algorithm equipped with a subsampling step can significantly improve efficiency while retaining competitive accuracy.
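For reference, the baseline method of [1] that this work modifies can be sketched in a few lines of NumPy. This is the standard Gaussian rangefinder plus a small SVD; it does not reproduce the row-aware or subsampled variants of [2], whose details are in the paper.

```python
import numpy as np

def randomized_svd(A, k, p=10, rng=None):
    """Basic randomized SVD in the style of Halko-Martinsson-Tropp [1].

    A Gaussian test matrix sketches Range(A); a QR factorization gives an
    orthonormal basis Q; the SVD of the small projected matrix Q^T A is
    then lifted back. k is the target rank, p an oversampling parameter.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + p))   # Gaussian sketch matrix
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis for Range(A @ Omega)
    B = Q.T @ A                               # small (k+p) x n projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k, :]  # rank-k approximate SVD factors
```

The row-aware idea of [2] additionally draws information from $\text{Range}(\mathbf{A}^T)$ when building the test matrix; the sketch above uses only the column-space side.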

Building on these matrix-based randomized techniques, we extend our method to the high-dimensional setting of tensor decompositions. The Tucker decomposition is a popular technique for compressing and analyzing multidimensional data while preserving its intrinsic multiway structure, with applications ranging from deep learning to signal processing. Classical Tucker decomposition algorithms rely on deterministic SVD-based factorizations of tensor matricizations, which are often computationally costly. Motivated by our recent developments for the matrix case, we propose a novel Tucker decomposition framework that combines randomized sketching with tensor fiber sampling to overcome these drawbacks. This strategy avoids the costly explicit formation of tensor matricizations and improves the approximation of their column spaces, reducing both computational time and memory requirements. Under a low-rank assumption, the approximation errors remain comparable to those of existing randomized Tucker methods.
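To fix ideas, a generic randomized Tucker (HOSVD-style) baseline can be sketched as follows. This is not the authors' fiber-sampling algorithm: it forms the mode-$n$ unfoldings explicitly, which is precisely the cost the proposed framework avoids, and the function name and structure are illustrative only.

```python
import numpy as np

def randomized_tucker(X, ranks, p=5, rng=None):
    """HOSVD-style randomized Tucker baseline (illustrative sketch).

    Each mode-n factor is obtained by applying a Gaussian rangefinder
    to the mode-n unfolding of X; the core is X multiplied by U_n^T
    along every mode. ranks gives the target multilinear rank, p the
    oversampling. Truncation to the leading columns of Q is a
    simplification adequate for exactly low-rank input.
    """
    rng = np.random.default_rng(rng)
    factors = []
    for n, r in enumerate(ranks):
        Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)   # mode-n unfolding
        Omega = rng.standard_normal((Xn.shape[1], r + p))   # Gaussian sketch
        Q, _ = np.linalg.qr(Xn @ Omega)                     # basis for mode-n column space
        factors.append(Q[:, :r])
    core = X
    for n, U in enumerate(factors):                         # core = X x_n U_n^T, all modes
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, n, 0), axes=1), 0, n)
    return core, factors
```

The fiber-sampling framework of the talk replaces the explicit unfoldings above with sampled tensor fibers, so the large matrices `Xn` are never formed.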

References

  1. N. Halko, P. G. Martinsson, and J. A. Tropp. Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions. SIAM Review 53.2, 217–288 (2011). DOI: 10.1137/090771806

  2. D. Palitta and S. Portaro. Row-aware Randomized SVD with Applications. arXiv: 2408.04503 [math.NA] (2024). URL: https://arxiv.org/abs/2408.04503

Primary author

Sascha Portaro (University of Bologna)

Co-author

Davide Palitta (Alma Mater Studiorum, Università di Bologna)

Presentation materials

There are no materials yet.