In this talk I will present an extension of the Majorization-Minimization Generalized Krylov Subspace (MM-GKS) method for solving $\ell^p-\ell^q$ minimization problems proposed in [1], introducing a right preconditioner that accelerates convergence without compromising the quality of the computed solution. The original MM-GKS approach relies on iterative reweighting and projection onto subspaces of increasing dimension, enabling the efficient solution of large-scale minimization problems. Our enhanced method employs a carefully designed regularizing preconditioner, inspired by Iterated Tikhonov regularization, to address the inherent ill-conditioning of the problem. We demonstrate that the preconditioned method preserves the stability and accuracy of the original MM-GKS method; numerical results for image deblurring validate these properties and show significant reductions in CPU time.
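To illustrate the iterative-reweighting idea underlying MM-GKS, the sketch below implements a plain majorization-minimization (IRLS) loop for the $\ell^p-\ell^q$ functional, in which the nonsmooth terms are majorized by weighted quadratics at each step. This is a minimal illustration only: it solves the weighted normal equations directly and omits the generalized Krylov subspace projection and the preconditioner that are the subject of the talk; the function name, smoothing parameter `eps`, and iteration count are illustrative choices, not part of the authors' method.

```python
import numpy as np

def mm_irls(A, b, L, lam, p=1.0, q=0.5, eps=1e-4, iters=50):
    """Sketch of majorization-minimization for
    (1/p)||Ax-b||_p^p + (lam/q)||Lx||_q^q via iterative reweighting.

    At each step the l^p and l^q terms are majorized by weighted
    quadratics with weights (t^2 + eps^2)^((p-2)/2), and the resulting
    weighted least-squares problem is solved exactly.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # initial guess
    for _ in range(iters):
        r = A @ x - b          # fidelity residual
        z = L @ x              # regularization term
        wf = (r**2 + eps**2) ** ((p - 2) / 2)  # fidelity weights
        wr = (z**2 + eps**2) ** ((q - 2) / 2)  # regularization weights
        # Normal equations of the weighted quadratic majorant:
        # (A^T Wf A + lam L^T Wr L) x = A^T Wf b
        M = A.T @ (wf[:, None] * A) + lam * (L.T @ (wr[:, None] * L))
        x = np.linalg.solve(M, A.T @ (wf * b))
    return x
```

In the actual MM-GKS method, each such weighted quadratic subproblem is projected onto a generalized Krylov subspace of growing dimension instead of being solved at full size, which is what makes the approach practical for image-deblurring problems.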
This work is a collaboration with A. Buccini (University of Cagliari), M. Donatelli (University of Insubria), and L. Reichel (Kent State University).
Reference:
[1] A. Lanza, S. Morigi, L. Reichel, and F. Sgallari, A generalized Krylov subspace method for $\ell_p-\ell_q$ minimization, SIAM Journal on Scientific Computing, 37(5) (2015), S30-S50.