The dominant cost of many lattice QFT calculations lies in solving the lattice Dirac equation with iterative solvers. The convergence of those solvers can be accelerated by preconditioners that reduce the condition number of the system. Designing good preconditioners, however, requires domain-specific knowledge and fine-tuning of preconditioner parameters.
In this talk, I will demonstrate how to parameterize preconditioners with neural networks for solving the Wilson-Dirac equation in the two-dimensional lattice Schwinger model. In addition, I will show that the cost of training neural-network preconditioners can be largely amortized by training only on coarse lattices: the trained networks can then be used to construct preconditioners for finer lattices without loss of effectiveness.
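The basic workflow, wrapping an approximate inverse as a preconditioner for a Krylov solver, can be sketched in a few lines. This is an illustrative example only: it substitutes a shifted 2D lattice Laplacian for the actual Wilson-Dirac operator and an incomplete-LU factorization for the trained neural network, both of which are assumptions for the sketch rather than details taken from the talk.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy stand-in for the Wilson-Dirac operator: a mass-shifted 2D lattice
# Laplacian (hypothetical; not the actual Schwinger-model operator).
L = 16                       # lattice extent per dimension
n = L * L
t = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], (L, L))
lap = sp.kron(sp.eye(L), t) + sp.kron(t, sp.eye(L))
m = 0.1                      # "mass" shift keeps the system well conditioned
D = (lap + m * sp.eye(n)).tocsr()

rng = np.random.default_rng(0)
b = rng.standard_normal(n)

def solve(M=None):
    """Run GMRES, counting inner iterations via the residual callback."""
    its = []
    x, info = spla.gmres(D, b, M=M,
                         callback=lambda rk: its.append(rk),
                         callback_type="pr_norm")
    return x, info, len(its)

# Unpreconditioned solve.
x0, info0, it0 = solve()

# A preconditioner is any cheap linear map approximating D^{-1}. Here an
# incomplete-LU solve plays the role the trained network would fill.
ilu = spla.spilu(D.tocsc(), drop_tol=1e-3)
Minv = spla.LinearOperator((n, n), matvec=ilu.solve)
x1, info1, it1 = solve(M=Minv)

print("iterations without / with preconditioner:", it0, "/", it1)
```

The learned preconditioner enters the solver only through its matrix-vector product, so swapping the ILU stand-in for a neural network changes nothing else in the solve loop.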