20–21 Jan 2025
Aula Magna "Fratelli Pontecorvo", Building E, Polo Fibonacci. Pisa
Europe/Rome timezone

Provably convergent, extrapolated algorithms for Nonlinear Matrix Decomposition problems

21 Jan 2025, 14:20
20m
Building E, Aula Magna "Fratelli Pontecorvo", Polo Fibonacci

Largo Bruno Pontecorvo 3, 56127 Pisa

Speaker

Giovanni Seraghiti (Unifi/Umons)

Description

In this contribution I propose a new problem in low-rank matrix factorization, namely Nonlinear Matrix Decomposition (NMD): given a sparse nonnegative matrix $X$, find a low-rank matrix $\Theta$ such that $X \approx f(\Theta)$, where $f$ is applied element-wise. I will focus on the so-called ReLU-NMD, where $f(\cdot) = \max(0, \cdot)$ is the rectified linear unit (ReLU) nonlinear activation.
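As a minimal illustration of the model, the sketch below evaluates the relative ReLU-NMD residual $\|X - \max(0, \Theta)\|_F / \|X\|_F$ for a rank-$r$ factorization $\Theta = WH$; the function and variable names are illustrative, not taken from the talk.

```python
import numpy as np

def relu_nmd_residual(X, W, H):
    """Relative error of the ReLU-NMD model X ~ max(0, W @ H).

    W (m x r) and H (r x n) give the rank-r matrix Theta = W @ H;
    f is the element-wise ReLU. Names here are illustrative only.
    """
    Theta = W @ H
    return np.linalg.norm(X - np.maximum(0, Theta), "fro") / np.linalg.norm(X, "fro")

# Toy example: a sparse nonnegative X generated from a planted rank-2 Theta,
# so it is represented exactly by the model.
rng = np.random.default_rng(0)
W_true, H_true = rng.standard_normal((20, 2)), rng.standard_normal((2, 30))
X = np.maximum(0, W_true @ H_true)
print(relu_nmd_residual(X, W_true, H_true))  # -> 0.0
```

Note that although $X$ is sparse and nonnegative, the low-rank factor $\Theta$ itself may be dense and have negative entries: the ReLU sets those entries to zero, which is exactly how the nonlinearity enables low-rank structure that a plain linear factorization of $X$ would miss.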

First, I will give a brief overview of the motivations for the model, its possible interpretations, and its connection with neural networks, explaining the idea behind ReLU-NMD and how the nonlinearity can be exploited to obtain low-rank approximations of given data. Second, I will discuss standard approaches to formulating ReLU-NMD as an optimization problem.

Then, I will introduce the Block-Coordinate-Descent (BCD) method along with some convergence guarantees. Moreover, I will show how the BCD scheme applies to ReLU-NMD (BCD-NMD) and how it can be accelerated via extrapolation while maintaining its convergence properties in the nonconvex setting (eBCD-NMD).
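A minimal sketch of the kind of scheme described above, assuming the latent-variable formulation commonly used for ReLU-NMD (minimize $\|Z - \Theta\|_F^2$ over low-rank $\Theta$ and a latent $Z$ constrained to satisfy $\max(0, Z) = X$), with a simple momentum-style extrapolation step. The exact eBCD-NMD update rules and safeguards are not given in the abstract, so this is an illustrative variant, not the proposed algorithm.

```python
import numpy as np

def bcd_nmd(X, r, iters=200, beta=0.3):
    """Sketch of a block-coordinate-descent scheme for ReLU-NMD.

    Alternates two block updates on the latent-variable formulation
        min_{Z, Theta} ||Z - Theta||_F^2
        s.t. rank(Theta) <= r  and  max(0, Z) = X:
    a Theta-update via truncated SVD and a closed-form Z-update,
    with momentum-style extrapolation on Z (beta = 0 recovers plain BCD).
    Illustrative only; the actual eBCD-NMD update may differ.
    """
    mask = X > 0          # support of X: Z must equal X there
    Z = X.copy()
    Z_old = Z.copy()
    Theta = Z
    for _ in range(iters):
        # Theta-update: best rank-r approximation of the extrapolated Z.
        Z_ex = Z + beta * (Z - Z_old)
        U, s, Vt = np.linalg.svd(Z_ex, full_matrices=False)
        Theta = (U[:, :r] * s[:r]) @ Vt[:r]
        # Z-update: keep X on its support; off the support, Z is free
        # in (-inf, 0], so the closest point to Theta is min(0, Theta).
        Z_old = Z
        Z = np.where(mask, X, np.minimum(0, Theta))
    return Theta
```

The extrapolation step reuses the direction of the previous Z-update, in the spirit of heavy-ball/Nesterov acceleration; in the nonconvex setting this typically requires a safeguard (e.g. adaptive restarts of $\beta$) to retain convergence guarantees, which is what eBCD-NMD is designed to provide.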

Finally, I will illustrate the effectiveness of the proposed algorithms on synthetic and real-world data sets and discuss some possible applications.

Primary authors

Giovanni Seraghiti (Unifi/Umons), Prof. Margherita Porcelli (Unifi), Prof. Nicolas Gillis (Umons)
