May 18 – 22, 2026
Virginia Tech
America/New_York timezone

Manifold-based Algorithms for the Hadamard Decomposition

May 18, 2026, 4:35 PM
25m
Torgersen Hall 1060

Minisymposium Talk: Low-rank Matrix and Tensor Decompositions: Theory, Algorithms and Applications

Speaker

Stefano Sicilia (University of Mons)

Description

Given a matrix $X$ and two ranks $r_1$ and $r_2$, the Hadamard decomposition (HD) consists of finding two low-rank matrices, $X_1$ of rank $r_1$ and $X_2$ of rank $r_2$, both of the same size as $X$, such that $X\approx X_1\circ X_2$, where $\circ$ is the Hadamard (element-wise) product. HD is more expressive than standard low-rank approximations, such as the truncated singular value decomposition (TSVD), as it can represent higher-rank matrices with the same number of parameters; this is because the rank of $X_1 \circ X_2$ is generically equal to $r_1 r_2$. In this work, we first present some new theoretical results for HD. These allow us to develop two new manifold-based gradient descent algorithms for computing HDs: the first one uses the representation $X\approx X_1\circ X_2$, while the second one uses $X\approx WH^\top$, where $W$ and $H$ have $r_1 r_2$ columns. We also propose new initializations that allow us to obtain better HDs. We compare our algorithms and initialization strategies with the TSVD and with the state of the art. Numerical results show that the new methods are efficient and competitive on both synthetic and real data.
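To make the model concrete, the Hadamard decomposition can be fitted in a basic form by plain gradient descent on the parametrization $X_1 = AB^\top$, $X_2 = CD^\top$, minimizing $\tfrac12\|X - X_1\circ X_2\|_F^2$. The sketch below is purely illustrative: the function name, initialization, and step size are assumptions, and it is not the manifold-based algorithm of the talk.

```python
import numpy as np

def hadamard_decomposition(X, r1, r2, lr=0.01, iters=3000, seed=0):
    """Fit X ≈ (A @ B.T) * (C @ D.T) by plain gradient descent on
    0.5 * ||X - X1 ∘ X2||_F^2, with X1 = A B^T (rank <= r1) and
    X2 = C D^T (rank <= r2).  Illustrative sketch only."""
    m, n = X.shape
    rng = np.random.default_rng(seed)
    # Scale the random init so X1 ∘ X2 starts near the magnitude of X.
    s = (np.abs(X).mean() + 1e-12) ** 0.25
    A = s * rng.standard_normal((m, r1))
    B = s * rng.standard_normal((n, r1))
    C = s * rng.standard_normal((m, r2))
    D = s * rng.standard_normal((n, r2))
    for _ in range(iters):
        X1, X2 = A @ B.T, C @ D.T
        R = X1 * X2 - X            # residual of the Hadamard model
        G1, G2 = R * X2, R * X1    # gradients w.r.t. X1 and X2
        gA, gB = G1 @ B, G1.T @ A  # chain rule through X1 = A B^T
        gC, gD = G2 @ D, G2.T @ C  # chain rule through X2 = C D^T
        A -= lr * gA
        B -= lr * gB
        C -= lr * gC
        D -= lr * gD
    return A @ B.T, C @ D.T
```

Since $(ab^\top)\circ(cd^\top) = (a\circ c)(b\circ d)^\top$, a Hadamard product of two rank-1 matrices is exactly rank 1, so such a target can be recovered exactly; for generic $X_1$, $X_2$ the product has rank $r_1 r_2$, which is what makes HD more expressive than a TSVD with the same parameter count.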

Author

Stefano Sicilia (University of Mons)

Co-authors

Arnaud Vandaele (University of Mons), Nicolas Gillis (University of Mons), Subhayan Saha (University of Mons)
