Description
The Hadamard factorization is a powerful technique for data analysis and matrix compression, which decomposes a given matrix $A$ into the element-wise product of two low-rank matrices $W$ and $H$ such that $A \approx W \circ H$. Unlike the well-known SVD, this decomposition makes it possible to represent matrices of higher rank with the same number of variables. We present new theoretical results that describe the properties of this factorization and the conditions under which it is possible. Based on these results, we derive new initial guesses for the two factors of the problem. We then develop a new algorithm for computing $W$ and $H$ that takes into account a special manifold structure of these matrices: we implement a block gradient descent on the two manifolds to which $W$ and $H$ belong and integrate the associated gradient system. We compare our results with an existing approach from the literature [Wertz et al. 2025] and show that the new initializations can also improve that algorithm.
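
To make the setup concrete, below is a minimal NumPy sketch of the optimization problem behind the factorization, with the low-rank factors parametrized as $W = W_1 W_2^\top$ and $H = H_1 H_2^\top$ and a plain Euclidean block gradient descent on the Frobenius error. It deliberately omits the manifold structure, the integration of the gradient system, and the proposed initializations (random starting points are used instead); the function name and all parameters are illustrative, not the method of the talk.

```python
import numpy as np

def hadamard_factorization(A, rank=2, steps=20000, lr=1e-3, seed=0):
    """Fit A ≈ (W1 @ W2.T) * (H1 @ H2.T) by alternating (block) gradient
    descent on the squared Frobenius error.  Plain Euclidean sketch: it
    uses neither the manifold structure nor the gradient-system integration
    of the talk, and the factors are initialized at random instead of with
    the proposed initial guesses."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    W1, W2 = rng.standard_normal((m, rank)), rng.standard_normal((n, rank))
    H1, H2 = rng.standard_normal((m, rank)), rng.standard_normal((n, rank))
    for _ in range(steps):
        # Block 1: update the factors of W = W1 @ W2.T with H fixed.
        H = H1 @ H2.T
        R = (W1 @ W2.T) * H - A        # residual of the Hadamard model
        G = R * H                      # gradient of 0.5*||R||_F^2 w.r.t. W
        W1, W2 = W1 - lr * (G @ W2), W2 - lr * (G.T @ W1)
        # Block 2: update the factors of H = H1 @ H2.T with W fixed.
        W = W1 @ W2.T
        R = W * (H1 @ H2.T) - A
        G = R * W                      # gradient w.r.t. H
        H1, H2 = H1 - lr * (G @ H2), H2 - lr * (G.T @ H1)
    return W1 @ W2.T, H1 @ H2.T

# Toy check: a target that is exactly the Hadamard product of two rank-2
# matrices (so its rank can be up to 4), fitted with two rank-2 factors.
rng = np.random.default_rng(1)
A = (rng.standard_normal((6, 2)) @ rng.standard_normal((2, 6))) \
    * (rng.standard_normal((6, 2)) @ rng.standard_normal((2, 6)))
W, H = hadamard_factorization(A, rank=2)
print("relative error:", np.linalg.norm(A - W * H) / np.linalg.norm(A))
```

The alternating structure mirrors the block character of the descent described above: the factors of $W$ are updated with $H$ frozen, and then vice versa, using the gradient $\partial f / \partial W = (W \circ H - A) \circ H$ of the objective $f = \tfrac{1}{2}\|W \circ H - A\|_F^2$ (and symmetrically for $H$).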