CAM Seminar: VarMiON: A variationally mimetic operator network
Author: Lona
Speaker: Deep Ray, University of Maryland, College Park
Abstract: Operator networks have emerged as promising deep learning tools for building fast surrogates of PDE solvers. These networks map input functions that describe material properties, forcing functions and boundary data to the solution of a PDE, i.e., they learn the solution operator of the PDE. In this talk, we consider a new type of operator network called VarMiON, which mimics the variational or weak formulation of PDEs. A precise error analysis of the VarMiON solution reveals that the approximation error contains contributions from the error in the training data, the training error, the quadrature error in sampling the input and output functions, and a “covering error” that measures how well the training dataset covers the space of input functions. Numerical experiments are presented for a canonical elliptic PDE to demonstrate the efficacy and robustness of the VarMiON compared to a standard Deep Operator Network (DeepONet).
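A minimal sketch of the kind of error decomposition described in the abstract, using illustrative notation that is not taken from the talk (here \(\mathcal{S}\) denotes the exact solution operator, \(\widehat{\mathcal{S}}_\theta\) the trained VarMiON, and \(f\) an input function):

\[
\big\| \mathcal{S}(f) - \widehat{\mathcal{S}}_\theta(f) \big\| \;\lesssim\; \epsilon_{\mathrm{data}} \;+\; \epsilon_{\mathrm{train}} \;+\; \epsilon_{\mathrm{quad}} \;+\; \epsilon_{\mathrm{cov}},
\]

where \(\epsilon_{\mathrm{data}}\) is the error in the training data, \(\epsilon_{\mathrm{train}}\) the training (optimization) error, \(\epsilon_{\mathrm{quad}}\) the quadrature error from sampling the input and output functions, and \(\epsilon_{\mathrm{cov}}\) the covering error measuring how well the training dataset covers the space of input functions.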
Short Bio: Deep Ray is an Assistant Professor of Mathematics at the University of Maryland, College Park. He obtained his PhD in Mathematics from the Tata Institute of Fundamental Research (Bangalore, India), followed by postdoctoral positions at EPFL (Switzerland), Rice University, and the University of Southern California. His research interests lie at the interface of conventional numerical analysis and machine learning. He has worked on the judicious integration of deep learning tools to overcome computational bottlenecks in areas such as fluid flow simulations, reduced order modeling, PDE-constrained optimization, and Bayesian inference.