Abstract
Neural Operators are a recent class of data-driven models for learning solutions to Partial Differential Equations (PDEs). Traditionally, these models are trained autoregressively on data collected at discrete time points in the evolution of the PDE. This setup gives rise to two problems: (i) poor temporal generalization due to error accumulation and (ii) poor zero-shot super-resolution capabilities. To address these issues, we propose Vectorized Conditional Neural Fields (VCNeF), a general framework that uses transformers and implicit neural representations to efficiently solve time-dependent PDEs with varying coefficients. A comprehensive evaluation of VCNeF on challenging 1D and 2D PDEs from PDEBench demonstrates the superiority of our model over four state-of-the-art baselines. Furthermore, our model achieves faster inference and generalizes better to unseen PDE parameters than the compared models.