Abstract

Inductive biases play a critical role in enabling Graph Networks (GNs) to learn particle- and mesh-based physics simulations. In this paper, we propose two generalizable inductive biases that minimize rollout error and energy accumulation. GNs conditioned on the input states and trained with the Mean Squared Error (MSE) loss implicitly assume Gaussian-distributed output errors. Consequently, GNs may either assign probability density to infeasible regions of the state space of the deterministic physics problem or fail to assign density to feasible regions. Instead, we advocate maximizing the likelihood of the actual target distribution, challenging the underlying assumptions of MSE-based regression models with our proposed conditional normalizing flow (cNF) decoder. We find that this inductive bias enables GNs to significantly improve their next-state prediction accuracy. Existing sequential GNs encode temporal dependencies by autoregressively processing the latent representations of the input data. In our work, we find that inducing the Arrow-of-Time inductive bias through an autoregressive encoding step, before autoregressively processing the resulting latent vectors, enables GNs to better minimize rollout error. We critically analyze the impact of existing inductive biases on rollout error and energy accumulation and discover that the choice of biases encoded in a GN, rather than their number, has a substantial impact on forward-simulation accuracy.
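The claim that MSE regression implicitly assumes Gaussian output errors follows from a standard identity; the notation below is a sketch (not taken from the paper):

```latex
% Minimizing the squared error is equivalent to maximizing a Gaussian
% likelihood with a fixed isotropic variance \sigma^2 in d dimensions:
-\log \mathcal{N}\!\left(y \mid \hat{y}, \sigma^2 I\right)
  = \frac{1}{2\sigma^2}\,\lVert y - \hat{y} \rVert_2^2
  + \frac{d}{2}\log\!\left(2\pi\sigma^2\right)
```

Since the second term is constant in $\hat{y}$, minimizing the MSE and maximizing the Gaussian likelihood select the same prediction. A unimodal Gaussian, however, cannot represent multimodal or sharply bounded target distributions, which is the mismatch the cNF decoder is designed to avoid.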
