Description
Preconditioners are essential tools for efficiently solving linear systems arising from the discretization of PDEs. Traditional single-level approaches like Jacobi, Incomplete LU factorization (ILU), and Factorized Sparse Approximate Inverse (FSAI) effectively damp the high-frequency components of the error, but act only weakly on the low-frequency part of the eigenspectrum, which limits their performance as stand-alone preconditioners.
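To make this frequency-selective behavior concrete, here is a minimal NumPy sketch (not taken from the talk; all names and parameters are illustrative) that applies weighted Jacobi to the error equation of a 1D Poisson discretization: a smooth error mode is barely reduced after many sweeps, while an oscillatory one decays rapidly.

```python
# Illustrative sketch (not from the talk): weighted Jacobi on a 1D Poisson
# matrix damps a high-frequency error mode far faster than a low-frequency one.
import numpy as np

n = 63                                   # interior grid points
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
x = np.arange(1, n + 1) * h

omega = 2.0 / 3.0                        # damped-Jacobi weight
Dinv = 1.0 / np.diag(A)

def jacobi_decay(k, sweeps=20):
    """Relative error norm after `sweeps` of weighted Jacobi on sin(k*pi*x)."""
    e0 = np.sin(k * np.pi * x)           # eigenvector of A: a pure Fourier mode
    e = e0.copy()
    for _ in range(sweeps):
        e = e - omega * Dinv * (A @ e)   # iterate on the error equation A e = 0
    return np.linalg.norm(e) / np.linalg.norm(e0)

print("low-frequency mode  k=1  :", jacobi_decay(1))   # close to 1: barely reduced
print("high-frequency mode k=48 :", jacobi_decay(48))  # << 1: strongly damped
```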
In this work, we propose a novel method for approximating the inverse of discrete operators by leveraging DeepONet, a supervised learning framework for nonlinear operators.
Alternated with classical single-level preconditioners, the DeepONet approximation compensates for their weak action on the low-frequency part of the eigenspectrum, thus accelerating convergence. Preliminary test cases are investigated and presented to assess the potential of the proposed approach, in view of its possible application in matrix-free multi-level methods. A code sketch of this alternation follows.
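The alternation can be sketched as below. Since the trained network is not available here, a hypothetical `surrogate_low_freq_inverse` (an exact solve restricted to the lowest eigenmodes of a small dense demo matrix) stands in for the learned low-frequency approximate inverse, and `hybrid_solve` is an illustrative name; this is a sketch of the general idea, not the authors' implementation.

```python
# Hedged sketch of alternating a classical smoother with a low-frequency
# correction. A coarse spectral solve stands in for the trained DeepONet,
# which would map a residual r to an approximation of the smooth part of A^{-1} r.
import numpy as np

def surrogate_low_freq_inverse(A, r, m=16):
    """Stand-in for the DeepONet: apply A^{-1} restricted to the m lowest modes."""
    w, V = np.linalg.eigh(A)             # small dense demo only; a real solver would not do this
    Vm = V[:, :m]
    return Vm @ ((Vm.T @ r) / w[:m])

def hybrid_solve(A, b, n_jacobi=5, omega=2/3, tol=1e-8, max_cycles=50):
    """Alternate weighted-Jacobi sweeps with the low-frequency correction."""
    x = np.zeros_like(b)
    Dinv = 1.0 / np.diag(A)
    for cycle in range(max_cycles):
        for _ in range(n_jacobi):        # smoother: damps high-frequency error
            x = x + omega * Dinv * (b - A @ x)
        r = b - A @ x
        x = x + surrogate_low_freq_inverse(A, r)   # neural-operator slot
        if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
            return x, cycle + 1
    return x, max_cycles

n = 63; h = 1.0 / (n + 1)
A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
b = np.ones(n)
x, cycles = hybrid_solve(A, b)
print(f"converged in {cycles} cycles, relative residual "
      f"{np.linalg.norm(b - A @ x) / np.linalg.norm(b):.2e}")
```

The division of labor mirrors a two-level method: the smoother handles the oscillatory error components, while the neural-operator slot supplies the smooth correction that the smoother alone cannot deliver.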
- C. Janna, M. Ferronato, F. Sartoretto, G. Gambolati. FSAIPACK: A software package for high-performance factored sparse approximate inverse preconditioning. ACM Transactions on Mathematical Software 41(2), 1-26 (2015).
- L. Lu, P. Jin, G. Pang, Z. Zhang, G. E. Karniadakis. Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators. Nature Machine Intelligence 3(3), 218-229 (2021).
- E. Zhang, A. Kahana, A. Kopaničáková, E. Turkel et al. Blending neural operators and relaxation methods in PDE numerical solvers. Nature Machine Intelligence 6, 1303-1313 (2024).