PDE-constrained kernel regression methods

Iain Henderson (Université de Toulouse)

In this talk, I will describe the application of certain kernel methods to the numerical solution of PDE-driven forward and inverse problems. These kernel methods are general function approximation methods, used on a daily basis in machine learning and uncertainty quantification. They have in fact already found their way into numerical methods for solving PDEs, notably through the RBF collocation method. Such kernel methods can be introduced as orthogonal projections in a suitable Hilbert space, but also as statistical (Bayesian) estimators stemming from an underlying Gaussian random field model. This Bayesian point of view is especially useful when dealing with ill-posed inverse problems, as well as for uncertainty quantification purposes. The use of kernel methods for solving PDEs is currently an active research topic, in the general context of ``physics-informed'' machine learning. Starting from the random field perspective, I will provide simple theoretical results describing how to choose the kernel in relation to a given (linear) PDE. I will then describe an application to the scalar three-dimensional wave equation (in view of applications to photoacoustic tomography), as well as explicit links with well-known finite difference schemes for scalar hyperbolic PDEs.
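To make the Bayesian interpretation mentioned above concrete, here is a minimal, self-contained sketch of kernel (Gaussian process) regression in one dimension. It is not taken from the talk; the squared-exponential kernel, noise level, and toy target function are illustrative assumptions only, and no PDE constraint is imposed here.

```python
import numpy as np

def sq_exp_kernel(x, y, lengthscale=0.2):
    # Squared-exponential covariance kernel k(x, y) of the underlying random field.
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / lengthscale ** 2)

# Toy data: noisy observations of an unknown function on [0, 1] (illustrative choice).
rng = np.random.default_rng(0)
x_obs = np.linspace(0.0, 1.0, 8)
y_obs = np.sin(2 * np.pi * x_obs) + 0.05 * rng.standard_normal(x_obs.size)

# Kernel regression as a Bayesian estimator: condition the Gaussian random field
# on the observations to obtain a posterior mean (point estimate) and posterior
# covariance (uncertainty quantification).
x_new = np.linspace(0.0, 1.0, 200)
K_oo = sq_exp_kernel(x_obs, x_obs) + 0.05 ** 2 * np.eye(x_obs.size)  # add noise variance
K_no = sq_exp_kernel(x_new, x_obs)
K_nn = sq_exp_kernel(x_new, x_new)

alpha = np.linalg.solve(K_oo, y_obs)
post_mean = K_no @ alpha                                   # posterior mean at new points
post_cov = K_nn - K_no @ np.linalg.solve(K_oo, K_no.T)     # posterior covariance
```

The posterior mean coincides with the orthogonal-projection (RKHS) solution of the interpolation problem, which is the dual viewpoint referred to in the abstract; the PDE-constrained setting discussed in the talk amounts to choosing the kernel so that the random field is compatible with the given linear PDE.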