Speaker
Description
In this talk, we present an extension of the Majorization-Minimization Generalized Krylov Subspace (MM-GKS) method for solving $\ell_p$-$\ell_q$ minimization problems, as proposed in [1], by introducing a right preconditioner aimed at accelerating convergence without compromising the quality of the computed solution. The original MM-GKS approach relies on iterative reweighting and projection onto subspaces of increasing dimension, enabling the efficient solution of the minimization problem. Our enhanced method employs a carefully designed regularizing preconditioner, inspired by Iterated Tikhonov regularization, to address the inherent ill-conditioning of the problem. We demonstrate that the preconditioned MM-GKS method preserves the stability and accuracy of the original MM-GKS method, as validated by numerical experiments in image deblurring, which show significant reductions in CPU time.
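For context, and using the standard notation of [1] (with $A$ the forward operator, $b$ the observed data, $L$ a regularization operator, and $\mu > 0$ a regularization parameter), the $\ell_p$-$\ell_q$ problem addressed by MM-GKS can be written as
\[
  \min_{x} \; \frac{1}{p}\,\|Ax - b\|_p^p \;+\; \frac{\mu}{q}\,\|Lx\|_q^q, \qquad \text{typically with } 0 < p, q \le 2.
\]
At each MM step, the possibly nonsmooth (and, for $p$ or $q$ below one, nonconvex) terms are majorized by weighted quadratics, and the resulting reweighted least-squares problem is projected onto a generalized Krylov subspace whose dimension grows with the iterations.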
Reference:
[1] A. Lanza, S. Morigi, L. Reichel, F. Sgallari, A generalized Krylov subspace method for $\ell_p$-$\ell_q$ minimization, SIAM Journal on Scientific Computing, 37(5) (2015), S30-S50.