Ab-initio Potential Energy Surfaces with Graph Neural Networks

This page provides additional information on our works:

Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions
by Nicholas Gao and Stephan Günnemann
Published at the Tenth International Conference on Learning Representations (ICLR), 2022 (Spotlight)

and

Sampling-free Inference of Ab-initio Potential Energy Surface Networks
by Nicholas Gao and Stephan Günnemann
Published at the Eleventh International Conference on Learning Representations (ICLR), 2023

Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions

by Nicholas Gao and Stephan Günnemann
Published at the Tenth International Conference on Learning Representations (ICLR), 2022 (Spotlight)

Abstract

Solving the Schrödinger equation is key to many quantum mechanical properties. However, an analytical solution is only tractable for single-electron systems. Recently, neural networks succeeded at modeling wave functions of many-electron systems. Together with the variational Monte Carlo (VMC) framework, this led to solutions on par with the best known classical methods. Still, these neural methods require tremendous amounts of computational resources as one has to train a separate model for each molecular geometry. In this work, we combine a Graph Neural Network (GNN) with a neural wave function to simultaneously solve the Schrödinger equation for multiple geometries via VMC. This enables us to model continuous subsets of the potential energy surface with a single training pass. Compared to existing state-of-the-art networks, our Potential Energy Surface Network (PESNet) speeds up training for multiple geometries by up to 40 times while matching or surpassing their accuracy. This may open the path to accurate and orders of magnitude cheaper quantum mechanical calculations.
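To make the VMC framework in the abstract concrete, here is a minimal JAX sketch of the local energy E_L(x) = (Hψ)(x)/ψ(x) that is averaged over samples from |ψ|² during training. The names log_psi and potential are hypothetical placeholders, not the PESNet API:

import jax
import jax.numpy as jnp

def make_local_energy(log_psi, potential):
    # Builds E_L(x) = -1/2 (Δψ/ψ)(x) + V(x) for a wave function given as
    # log|ψ|, using the identity Δψ/ψ = Δ log|ψ| + |∇ log|ψ||².
    def local_energy(params, x):
        f = lambda y: log_psi(params, y)      # log|ψ| over flattened electron coords
        grad = jax.grad(f)(x)                 # ∇ log|ψ|
        lap = jnp.trace(jax.hessian(f)(x))    # Δ log|ψ|; a full Hessian is fine for a
                                              # sketch, real codes use cheaper Laplacians
        kinetic = -0.5 * (lap + jnp.sum(grad ** 2))
        return kinetic + potential(x)
    return local_energy

# The VMC energy is the Monte Carlo average of E_L over walkers x ~ |ψ|²:
# e_loc = jax.vmap(make_local_energy(log_psi, potential), (None, 0))(params, walkers)
# energy = jnp.mean(e_loc)

PESNet amortizes the expensive per-geometry training implied above: a GNN reads the nuclear geometry and emits the parameters of the wave function, so a single training pass covers a continuous set of geometries.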

Cite

Please cite our paper if you use the model, experimental results, or our code in your own work:

@inproceedings{gao_pesnet_2022,
    title = {Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions},
    author = {Gao, Nicholas and G{\"u}nnemann, Stephan},
    booktitle = {International Conference on Learning Representations (ICLR)},
    year = {2022}
}

Links

[Paper | Video | GitHub]

Sampling-free Inference of Ab-initio Potential Energy Surface Networks

by Nicholas Gao and Stephan Günnemann
Published at the Eleventh International Conference on Learning Representations (ICLR), 2023

Abstract

Recently, it has been shown that neural networks not only approximate the ground-state wave functions of a single molecular system well but can also generalize to multiple geometries. While such generalization significantly speeds up training, each energy evaluation still requires Monte Carlo integration, which limits the evaluation to a few geometries. In this work, we address the inference shortcomings by proposing the Potential learning from ab-initio Networks (PlaNet) framework, in which we simultaneously train a surrogate model in addition to the neural wave function. At inference time, the surrogate avoids expensive Monte Carlo integration by directly estimating the energy, accelerating the process from hours to milliseconds. In this way, we can accurately model high-resolution multi-dimensional energy surfaces for larger systems that previously were unobtainable via neural wave functions. Finally, we explore an additional inductive bias by introducing physically-motivated restricted neural wave function models. We implement such a function with several additional improvements in the new PESNet++ model. In our experimental evaluation, PlaNet accelerates inference by 7 orders of magnitude for larger molecules like ethanol while preserving accuracy. Compared to previous energy surface networks, PESNet++ reduces energy errors by up to 74%.
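The PlaNet idea in the abstract can be sketched in a few lines of JAX: while the wave function is trained with VMC, a surrogate network is regressed onto the collected Monte Carlo energy estimates, so inference becomes a single forward pass. Here surrogate_energy is a hypothetical placeholder for any network mapping a nuclear geometry to a scalar energy; the actual training objective in the paper is more involved:

import jax
import jax.numpy as jnp

def surrogate_loss(phi, surrogate_energy, geometries, mc_energies):
    # Fit the surrogate to energy estimates collected during wave function training.
    preds = jax.vmap(lambda g: surrogate_energy(phi, g))(geometries)
    return jnp.mean((preds - mc_energies) ** 2)

# Trained jointly with the wave function, e.g. one gradient step per VMC step:
# phi_grads = jax.grad(surrogate_loss)(phi, surrogate_energy, geoms, energies)

# Inference: no walkers, no Laplacian -- a single forward pass per geometry.
# energy = surrogate_energy(phi, new_geometry)

Because evaluating the surrogate needs neither sampling nor Laplacians, dense scans over multi-dimensional energy surfaces become cheap, which is what turns hours of Monte Carlo integration into milliseconds.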

Cite

Please cite our paper if you use the model, experimental results, or our code in your own work:

@inproceedings{gao_planet_2023,
    title = {Sampling-free Inference of Ab-initio Potential Energy Surface Networks},
    author = {Gao, Nicholas and G{\"u}nnemann, Stephan},
    booktitle = {International Conference on Learning Representations (ICLR)},
    year = {2023}
}

Links

[Paper | GitHub]