Description
In this contribution I propose a new problem in low-rank matrix factorization, namely Nonlinear Matrix Decomposition (NMD): given a sparse nonnegative matrix~$X$, find a low-rank matrix $\Theta$ such that $X \approx f(\Theta)$, where $f$ is a nonlinear function applied element-wise. I will focus on the so-called ReLU-NMD, where $f(\cdot) = \max(0, \cdot)$ is the rectified linear unit (ReLU) activation.
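As a small numerical illustration of the model (a sketch with assumed sizes, not data from the talk): a matrix of the form $X = \max(0, \Theta)$ is sparse and nonnegative, yet generally far from low-rank, even though the underlying $\Theta$ has rank $r$.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 50, 40, 3          # illustrative sizes (assumed)

# Theta is low-rank and has entries of both signs
Theta = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
# X = f(Theta) with f = ReLU: nonnegative, roughly half zeros (sparse),
# but X itself is generally far from low-rank
X = np.maximum(0, Theta)

rank_Theta = np.linalg.matrix_rank(Theta)   # 3
rank_X = np.linalg.matrix_rank(X)           # typically much larger than 3
sparsity = (X == 0).mean()
```

This is what makes NMD interesting: the rank-$r$ structure is hidden behind the element-wise nonlinearity.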
First, I will give a brief overview of the motivations, possible interpretations of the model, and its connection with neural networks, explaining the idea behind ReLU-NMD and how the nonlinearity can be exploited to obtain low-rank approximations of given data. Second, I will discuss standard approaches to formulating ReLU-NMD as an optimization problem.
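One formulation that has appeared in the NMD literature (a sketch; the talk's exact formulation may differ) introduces a latent matrix $Z$ that agrees with $X$ on its support:
\[
\min_{Z,\,\Theta}\ \|Z-\Theta\|_F^2 \quad \text{s.t.}\quad \operatorname{rank}(\Theta)\le r,\quad \max(0,Z)=X,
\]
so that a feasible pair with zero objective satisfies $X = \max(0,\Theta)$ exactly.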
Then, I will introduce the Block-Coordinate Descent (BCD) method along with some convergence guarantees. Moreover, I will show how the BCD scheme applies to ReLU-NMD (BCD-NMD) and how it can be accelerated via extrapolation while maintaining its convergence properties in the nonconvex setting (eBCD-NMD).
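To make the block structure concrete, here is a minimal sketch of one plausible two-block coordinate-descent scheme for a latent-variable formulation of ReLU-NMD (an assumed variant written for illustration, not necessarily the algorithm presented in the talk):

```python
import numpy as np

def bcd_nmd(X, r, iters=300):
    """Sketch of a two-block coordinate-descent scheme for ReLU-NMD,
    for the (assumed) latent-variable formulation
        min_{Z, Theta} ||Z - Theta||_F^2
        s.t. rank(Theta) <= r  and  max(0, Z) = X.
    The constraint max(0, Z) = X means Z = X where X > 0,
    and Z <= 0 where X = 0.
    """
    pos = X > 0
    Z = np.where(pos, X, 0.0)
    for _ in range(iters):
        # Theta-block: best rank-r approximation of Z via truncated SVD
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        Theta = (U[:, :r] * s[:r]) @ Vt[:r]
        # Z-block: closed-form projection onto the feasible set
        Z = np.where(pos, X, np.minimum(0.0, Theta))
    return Theta

# Usage on synthetic data with a planted solution (hypothetical sizes)
rng = np.random.default_rng(1)
Theta_true = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))
X = np.maximum(0, Theta_true)
Theta = bcd_nmd(X, r=3)
rel_err = np.linalg.norm(np.maximum(0, Theta) - X) / np.linalg.norm(X)
```

Each block is solved exactly in closed form, which is what makes BCD attractive here; extrapolated variants add a momentum-style step on top of this same structure.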
Finally, I will illustrate the effectiveness of the proposed algorithms on synthetic and real-world data sets and discuss some possible applications.