Mixture-of-Experts-Ensemble Meta-Learning for Physics-Informed Neural Networks

Date

September 2022

Authors

Rafael Bischof, Michael A. Kraus

Conference / Journal

Proceedings of the 33rd Forum Bauinformatik

Partial Differential Equations (PDEs) arise in the natural and engineering sciences to model reality and allow for numerical assessment of the modelled phenomena. Classical numerical solutions of PDEs rely on discretisations such as, e.g., the finite or spectral element method (FEM/SEM). Physics-Informed Neural Networks (PINNs) leverage physical laws by including the PDE, together with a respective set of boundary and initial conditions (BC/IC), as penalty terms in their loss function during training, without the need for discretisation. However, as the computational domain or the nonlinearity of the PDE becomes more complex, finding PDE solutions via FEM/SEM as well as PINNs becomes hard. Mixture of Experts (MoE) is a probabilistic model consisting of local experts weighted by a gating network. As in ensemble methods, MoE decomposes the predictive modelling task into sub-tasks, i.e. it trains an expert model on partitions of the input space and devises a gating model that learns to weight the experts' predictions conditioned on the input. In this work, we combine the two methods to form Physics-Informed Mixture of Neural Network Experts (PIMNNE)/Mixture of Experts Physics-Informed Neural Networks (MoE-PINNs) and investigate their learning and predictive capabilities using the example of a 2-dimensional Poisson PDE over an L-shaped domain with homogeneous Dirichlet BC. Our simulation studies show that all MoE-PINN setups approximate the SEM reference solution to a satisfactorily high degree. A hyperparameter study revealed that an increase in the number of PINN experts reduces the approximation error significantly even if regularised …
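
To illustrate the idea described in the abstract, the following is a minimal sketch of a MoE-PINN for the 2-dimensional Poisson problem with homogeneous Dirichlet BC, assuming a PyTorch implementation; the network sizes, the gating architecture, and the source term f are illustrative choices and not the authors' exact setup.

import torch
import torch.nn as nn


def mlp(out_dim=1, width=32, depth=3):
    """Small fully connected network mapping (x, y) -> out_dim values."""
    layers, d_in = [], 2
    for _ in range(depth):
        layers += [nn.Linear(d_in, width), nn.Tanh()]
        d_in = width
    layers.append(nn.Linear(d_in, out_dim))
    return nn.Sequential(*layers)


class MoEPINN(nn.Module):
    """Mixture of PINN experts: a gating network weights the experts' predictions."""

    def __init__(self, n_experts=4):
        super().__init__()
        self.experts = nn.ModuleList(mlp() for _ in range(n_experts))
        self.gate = mlp(out_dim=n_experts)

    def forward(self, xy):
        w = torch.softmax(self.gate(xy), dim=-1)              # gating weights, sum to 1
        u = torch.cat([e(xy) for e in self.experts], dim=-1)  # per-expert predictions
        return (w * u).sum(dim=-1, keepdim=True)              # weighted mixture u(x, y)


def pinn_loss(model, xy_int, xy_bc, f):
    """Residual of -Laplace(u) = f on interior points plus homogeneous Dirichlet penalty."""
    xy_int = xy_int.requires_grad_(True)
    u = model(xy_int)
    du = torch.autograd.grad(u, xy_int, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(du[:, :1], xy_int, torch.ones_like(du[:, :1]),
                               create_graph=True)[0][:, :1]
    u_yy = torch.autograd.grad(du[:, 1:], xy_int, torch.ones_like(du[:, 1:]),
                               create_graph=True)[0][:, 1:]
    residual = -(u_xx + u_yy) - f(xy_int)  # Poisson PDE residual at collocation points
    u_bc = model(xy_bc)                    # boundary prediction, target u = 0
    return (residual ** 2).mean() + (u_bc ** 2).mean()

The gating weights softly partition the L-shaped domain among the experts, while the composite loss combines the PDE residual on interior collocation points with the BC penalty on boundary points, in the spirit of the PINN training described above.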