Description
Chair: Valeria Simoncini
In this work we present a low-memory method for the approximation of the action of a symmetric matrix function $f(A) \in \mathbb{R}^{n \times n}$ on a vector $\mathbf b \in \mathbb{R}^n$, where the matrix $A$ is large and sparse. A popular approach for approximating $f(A) \mathbf b$ is the Lanczos algorithm. Given an orthonormal basis $Q_M \in \mathbb{R}^{n \times M}$ of the Krylov subspace...
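The classical Lanczos approximation sketched above, $f(A)\mathbf b \approx \|\mathbf b\|\, Q_M f(T_M) \mathbf e_1$ with $T_M = Q_M^T A Q_M$ tridiagonal, can be illustrated as follows. This is a generic textbook sketch (with full reorthogonalization for numerical robustness), not the low-memory variant presented in the talk; all function and variable names are placeholders.

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

def lanczos_f_times_b(A, b, f, M):
    """Approximate f(A) @ b for symmetric A via M Lanczos steps:
    f(A) b ~ ||b|| * Q_M @ f(T_M) @ e_1."""
    n = b.size
    Q = np.zeros((n, M))
    alpha = np.zeros(M)
    beta = np.zeros(max(M - 1, 0))
    nrm_b = np.linalg.norm(b)
    Q[:, 0] = b / nrm_b
    k = M
    for j in range(M):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        # full reorthogonalization against all previous basis vectors
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j < M - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] < 1e-14 * nrm_b:  # invariant subspace found
                k = j + 1
                break
            Q[:, j + 1] = w / beta[j]
    # evaluate f(T_M) e_1 via the spectral decomposition of the
    # symmetric tridiagonal matrix T_M = tridiag(beta, alpha, beta)
    theta, S = eigh_tridiagonal(alpha[:k], beta[: k - 1])
    return nrm_b * Q[:, :k] @ (S @ (f(theta) * S[0, :]))
```

For $M = n$ the approximation is exact in exact arithmetic; in practice one takes $M \ll n$, and the memory cost of storing $Q_M$ is what low-memory variants aim to avoid.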
We propose and analyze an algorithm for identifying spectral gaps of a real symmetric matrix $A$ by simultaneously approximating the traces of spectral projectors associated with multiple spectral slices. Our method combines Hutchinson's stochastic trace estimator with the Lanczos algorithm to approximate quadratic forms involving spectral projectors. Instead of focusing on...
Multiple orthogonal polynomials (MOPs) arise in various applications, including approximation theory, random matrix theory, and numerical integration. To define MOPs, one needs multiple inner products. In this talk, we restrict our attention to the case of two inner products. These MOPs satisfy recurrence relations, and we focus specifically on the stepline recurrence relation.
We derive an...
Preconditioners are essential tools for efficiently solving linear systems arising from the discretization of PDEs. Traditional single-level approaches like Jacobi, Incomplete LU factorization (ILU), and Factorized Sparse Approximate Inverse (FSAI)$^{1}$ are effective in reducing high-frequency error components but struggle with low-frequency components. The basic idea of multigrid and...
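The smoothing behaviour alluded to above can be seen in a minimal weighted-Jacobi experiment on the 1D Poisson model problem (a standard textbook illustration, not the preconditioner discussed in the talk): after a few sweeps the oscillatory error mode is nearly annihilated while the smooth mode barely changes.

```python
import numpy as np

# 1D Poisson matrix with Dirichlet boundary conditions, n interior points
n = 63
h = 1.0 / (n + 1)
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
x = np.arange(1, n + 1) * h
e_smooth = np.sin(np.pi * x)      # low-frequency error mode (k = 1)
e_rough = np.sin(n * np.pi * x)   # high-frequency error mode (k = n)

def weighted_jacobi_error(e, steps=10, omega=2.0 / 3.0):
    """Propagate an error vector through weighted-Jacobi sweeps:
    e <- (I - omega * D^{-1} A) e, with D = diag(A)."""
    d_inv = 1.0 / np.diag(A)
    for _ in range(steps):
        e = e - omega * d_inv * (A @ e)
    return e
```

With $\omega = 2/3$, the high-frequency mode is damped by a factor of roughly $1/3$ per sweep, while the smooth mode is reduced by less than $0.1\%$ per sweep; this is precisely why single-level smoothers need the coarse-grid correction that multigrid provides.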
In this talk I will present an extension of the Majorization-Minimization Generalized Krylov Subspace (MM-GKS) method for solving $\ell^p-\ell^q$ minimization problems, as proposed in [1], by introducing a right preconditioner aimed at accelerating convergence without compromising the quality of the computed solution. The original MM-GKS approach relies on iterative reweighting and projection...
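The iterative-reweighting idea underlying MM-GKS can be sketched in its simplest full-space form for $p = 2$, $q = 1$: each majorization step replaces the smoothed $\ell^1$ term by a weighted quadratic and solves the resulting least-squares problem. This is an illustration of the reweighting principle only, not the MM-GKS algorithm itself (which projects onto a generalized Krylov subspace); function name, regularization operator, and parameters are illustrative.

```python
import numpy as np

def irls_l2_l1(A, b, L, lam, eps=1e-8, n_iter=50):
    """Minimize 0.5 * ||A x - b||^2 + lam * sum_i sqrt((L x)_i^2 + eps)
    by majorization-minimization: the concave square root is majorized
    by a quadratic tangent, giving a reweighted least-squares step."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        u = L @ x
        # weights from the current iterate (the "reweighting")
        w = 1.0 / (2.0 * np.sqrt(u**2 + eps))
        # normal equations of the weighted quadratic majorant:
        # (A^T A + 2 lam L^T W L) x = A^T b
        M = A.T @ A + 2.0 * lam * (L.T * w) @ L
        x = np.linalg.solve(M, A.T @ b)
    return x
```

Each MM step decreases the smoothed objective monotonically; MM-GKS gains efficiency by solving these reweighted problems in a low-dimensional generalized Krylov subspace instead of the full space, and the preconditioner discussed in the talk acts on that inner solve.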