Preconditioners are essential tools for efficiently solving linear systems arising from the discretization of PDEs. Traditional single-level approaches such as Jacobi, Incomplete LU factorization (ILU), and Factorized Sparse Approximate Inverse (FSAI)$^{1}$ are effective at reducing high-frequency error components but struggle with low-frequency ones. The basic idea of multigrid and two-level domain decomposition methods is to achieve efficiency by balancing coarse-grid correction with smoothing.
In this work, we propose a novel method for approximating the inverse of discrete operators by leveraging DeepONet, a supervised learning framework for nonlinear operators$^{2}$. This deep-learning-based method is well suited to capturing low-frequency features$^{3}$. Therefore, instead of employing a conventional coarse grid in the multigrid sense, we construct an analogous structure through a DeepONet trained on vectors representing low frequencies.
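To make the ingredients concrete, a minimal unstacked DeepONet could be set up as sketched below; this is an illustrative sketch, not the authors' implementation, and the layer widths, activation, sensor count `m`, and latent dimension `p` are assumptions.

```python
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    """Minimal unstacked DeepONet: the branch net encodes an input vector
    (e.g. a residual sampled at m sensor points), the trunk net encodes
    the evaluation coordinate, and the output is their dot product."""
    def __init__(self, m, p=64, width=64):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(m, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.trunk = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )

    def forward(self, u, y):
        # u: (batch, m) sensor values; y: (n, 1) query coordinates
        b = self.branch(u)   # (batch, p) coefficients
        t = self.trunk(y)    # (n, p) basis evaluated at the queries
        return b @ t.T       # (batch, n) operator output on the grid

# Example: inputs sampled at m=32 sensors, queried at 64 grid points
net = DeepONet(m=32)
u = torch.randn(8, 32)
y = torch.linspace(0.0, 1.0, 64).unsqueeze(1)
out = net(u, y)              # shape (8, 64)
```

Trained on smooth, low-frequency input vectors with the corresponding exact solves as targets, such a network plays the role that a coarse-grid solve plays in a two-level method.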
Alternated with classical single-level preconditioners, the DeepONet approximation compensates for their weak action on the low-frequency part of the eigenspectrum, thus accelerating convergence. Preliminary test cases are investigated and presented in order to analyze the potential of the proposed approach, in view of its possible application in matrix-free multi-level methods.
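The alternation itself fits in a few lines. The following sketch shows a damped-Jacobi iteration in which every few sweeps the smoother is replaced by a learned correction approximating the action of the inverse on low frequencies, in the spirit of the hybrid solver of Ref. 3; the callable name `coarse_correct`, the schedule `deeponet_every`, and the damping factor are hypothetical choices for illustration, not the authors' actual scheme.

```python
import numpy as np

def hybrid_iteration(A, b, coarse_correct, n_iter=200, deeponet_every=5,
                     omega=2.0 / 3.0):
    """Damped-Jacobi sweeps with a periodic learned correction.

    Every `deeponet_every` steps the smoother is replaced by
    coarse_correct(r), a callable approximating A^{-1} r on the
    low-frequency part of the spectrum (e.g. a trained DeepONet)."""
    x = np.zeros_like(b)
    Dinv = 1.0 / A.diagonal()
    for k in range(n_iter):
        r = b - A @ x
        if (k + 1) % deeponet_every == 0:
            x += coarse_correct(r)   # learned low-frequency correction
        else:
            x += omega * Dinv * r    # Jacobi smoothing of high frequencies
    return x

# Toy usage: 1D Laplacian; a direct solve stands in for the trained net
n = 63
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.random.rand(n)
x = hybrid_iteration(A, b, coarse_correct=lambda r: np.linalg.solve(A, r))
```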
1. C. Janna, M. Ferronato, F. Sartoretto, G. Gambolati. FSAIPACK: A Software Package for High-Performance Factored Sparse Approximate Inverse Preconditioning. ACM Trans. Math. Softw. 41(2), 1–26 (2015).
2. L. Lu, P. Jin, G. Pang, Z. Zhang, G. E. Karniadakis. Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators. Nature Machine Intelligence 3(3), 218–229 (2021).
3. E. Zhang, A. Kahana, A. Kopaničáková, E. Turkel et al. Blending neural operators and relaxation methods in PDE numerical solvers. Nature Machine Intelligence 6, 1303–1313 (2024).