
Mark Ainsworth
Brown University
Abstract
Title: Galerkin Neural Network Approximation of Variational Problems with Error Control
Recent years have seen an unprecedented surge of interest in applying neural networks to a wide range of areas, including artificial intelligence and many non-scientific applications. In principle, neural networks also offer benefits for scientific applications, including the approximate solution of differential equations. However, if such methods are to be accepted, it is important that they rest on a firm theoretical foundation and, ideally, come with quantitative measures of the reliability of the results obtained.
We present an overview of some of our recent work in this direction on what we call Extended Galerkin Neural Networks (xGNN), where we aim to provide a variational framework for approximating general boundary value problems (BVPs) with error control. The main contributions of this work are (1) a rigorous theory guiding the construction of weighted least squares variational formulations suitable for use in neural network approximation of general BVPs, and (2) an “extended” feedforward network architecture which incorporates, and is even capable of learning, singular solution structures, thus offering the potential to greatly improve efficiency for problems with singular solutions.
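To fix ideas, the following is a minimal sketch of a weighted least squares loss for a simple 1D BVP approximated by a feedforward network, written in PyTorch. The problem, the network, and the boundary weight w_bc are illustrative assumptions; this is not the xGNN method or its error control, only the basic least-squares variational idea.

# Minimal sketch (not the xGNN method): a weighted least squares loss
# for the 1D Poisson problem -u'' = f on (0,1), u(0) = u(1) = 0,
# approximated by a small feedforward network. The boundary weight
# w_bc is an illustrative hyperparameter, not a value from the talk.
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def f(x):
    # Manufactured right-hand side so that u(x) = sin(pi x) is exact.
    return (torch.pi ** 2) * torch.sin(torch.pi * x)

def loss(w_bc=10.0):
    x = torch.rand(256, 1, requires_grad=True)      # interior samples
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    interior = ((-d2u - f(x)) ** 2).mean()          # PDE residual
    xb = torch.tensor([[0.0], [1.0]])
    boundary = (net(xb) ** 2).mean()                # boundary residual
    return interior + w_bc * boundary               # weighted least squares

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    L = loss()
    L.backward()
    opt.step()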
This is joint work with Justin Dong, Lawrence Livermore National Laboratory, USA.
Bio
Professor Ainsworth obtained his PhD from the University of Durham, United Kingdom, in 1989. He is the Francis Wayland Professor of Applied Mathematics at Brown University and the current Editor-in-Chief of the SIAM Journal on Numerical Analysis.

Yingda Cheng
Virginia Tech
Abstract
Title: Low Rank Anderson Acceleration
In this talk, we present low rank Anderson Acceleration (lrAA), a numerical method that directly computes low rank solutions to nonlinear equations. In many applications (e.g. nonlinear diffusion), the approximate solution, when represented as a matrix, is approximately low rank. It is challenging to design a numerical scheme for nonlinear equations that works directly with low rank representations. A principal challenge is that if nonlinearities are evaluated element-wise for all matrix elements, then the computational savings, from quadratic to linear in the number of grid points per dimension, are lost.
We propose lrAA, which is based on Anderson Acceleration (AA), a well-known technique for accelerating Picard iteration for fixed point problems. We couple AA with low rank truncation and cross approximation. We develop a new method for matrix cross approximation, Cross-DEIM, which uses index selection based on the discrete empirical interpolation method (DEIM) with adaptive error control to achieve effective cross approximations throughout the iterations. We show that lrAA works well for benchmark problems such as the Bratu problem and the Allen-Cahn equation. This is joint work with Daniel Appelo (VT).
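For readers unfamiliar with the building blocks, here is a minimal sketch of Anderson Acceleration combined with low rank truncation, in Python. The SVD-based truncation stands in for the cross approximation / Cross-DEIM machinery of lrAA, and the function names and test map are illustrative assumptions, not code from the talk.

# Minimal sketch (not lrAA itself): Anderson Acceleration for a matrix
# fixed point X = G(X), with the iterate re-truncated to low rank after
# every step. A full SVD is used for clarity; lrAA avoids forming full
# matrices by using cross approximation (Cross-DEIM) instead.
import numpy as np

def truncate(X, rank):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]

def anderson(G, X0, rank=5, m=3, iters=50):
    X = truncate(X0, rank)
    gs, rs = [], []                          # histories of G(X) and residuals
    for _ in range(iters):
        GX = G(X)
        R = GX - X                           # fixed point residual
        gs.append(GX.ravel()); rs.append(R.ravel())
        gs, rs = gs[-m:], rs[-m:]            # keep a window of m iterates
        if len(rs) > 1:
            # Least squares mixing: minimize the norm of an affine
            # combination of the stored residuals.
            F = np.column_stack([r - rs[-1] for r in rs[:-1]])
            gamma, *_ = np.linalg.lstsq(F, -rs[-1], rcond=None)
            mix = gs[-1] + sum(g * (x - gs[-1]) for g, x in zip(gamma, gs[:-1]))
            X = truncate(mix.reshape(X.shape), rank)
        else:
            X = truncate(GX, rank)
    return X

# Tiny smoke test: a linear contraction whose fixed point is rank one.
n = 50
u = np.linspace(0, 1, n)[:, None]
B = u @ u.T                                  # rank-1 target
G = lambda X: 0.5 * X + 0.5 * B              # fixed point is B
print(np.linalg.norm(anderson(G, np.zeros((n, n))) - B))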
Bio
I am interested in high order accurate and structure preserving numerical methods for differential equations, particularly discontinuous Galerkin schemes. In recent years, my main research focus has been high dimensional scientific computing, where non-conventional numerical methods are developed to handle the curse of dimensionality, bridging data science and numerical analysis. Application domains of my work include fusion energy, semiconductor device modeling, nonlinear optics, and quantum computing.

Robert Gramacy
Virginia Tech
Abstract
Title: A surrogate modeling journey through Gaussian process modeling for computer simulation experiments
This talk begins with an overview of Gaussian process (GP) surrogate modeling, and my favorite application: active learning for the (Bayesian) optimization of a blackbox function. I shall then survey some important, recent methodological developments targeting specific situations that increasingly arise in practice: large simulation campaigns, noisy observations/stochastic simulation, nonstationary modeling, and the calibration of computer models to field data. The presentation concludes with an in-depth description of a recent application: contour location for reliability in an airfoil simulation experiment using deep GPs. Throughout, there will be reproducible visuals and demos supported by code, both run live and embedded in the slides. These are biased toward my own work, in part because I understand that code best. But along the way I shall also endeavour to provide an otherwise balanced discussion of myriad alternatives that can be found elsewhere in this fast-moving literature.
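As a concrete illustration of the opening application, here is a minimal sketch of GP-based Bayesian optimization with expected improvement, in Python. The toy blackbox, the kernel, and all settings are illustrative assumptions, not code from the talk.

# Minimal sketch: active learning / Bayesian optimization of a toy 1D
# blackbox via a GP surrogate and the expected improvement criterion.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):                                    # toy blackbox to minimize
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X = rng.uniform(0, 4, size=(4, 1))           # small initial design
y = f(X).ravel()
grid = np.linspace(0, 4, 400)[:, None]       # candidate set

for _ in range(10):                          # sequential design loop
    gp = GaussianProcessRegressor(RBF(), normalize_y=True).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    xnew = grid[np.argmax(ei)]               # acquire the best candidate
    X = np.vstack([X, xnew])
    y = np.append(y, f(xnew)[0])

print("best x, f(x):", X[np.argmin(y)].item(), y.min())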
Bio
I am a Professor of Statistics in the College of Science at Virginia Polytechnic Institute and State University (Virginia Tech/VT) and affiliate faculty in VT’s Computational Modeling and Data Analytics program. Previously I was an Associate Professor of Econometrics and Statistics at the Booth School of Business, and a fellow of the Computation Institute at The University of Chicago. My research interests include Bayesian modeling methodology, statistical computing, Monte Carlo inference, nonparametric regression, sequential design, and optimization under uncertainty.

Christine Heitsch
Georgia Tech
Abstract
Title: How can discrete mathematics improve RNA folding predictions?
Understanding the folding of RNA sequences into three-dimensional structures, such as a viral genome inside its protein capsid, is a fundamental scientific challenge. Branching is a critical characteristic of RNA folding, yet it is too often poorly predicted under the standard thermodynamic objective function. By formulating this discrete optimization problem as a linear program, methods from geometric combinatorics (convex polytopes and their normal fans) can fully characterize the optimal branching configurations. This parametric analysis illuminates the geometry of the optimization. The insights gained significantly improve prediction accuracy while also revealing why the general problem is so difficult.
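To make the parametric setup concrete, the following is a hedged LaTeX sketch of the kind of linear objective involved; the exact branching features and parameters used in the talk may differ.

% Sketch of a parametric branching objective. Each secondary structure S
% has an integer signature (x, y, z, w): counts of branching features
% plus the remaining free energy.
\[
  \min_{S} \; f_{a,b,c}(S) \;=\; a\,x(S) \;+\; b\,y(S) \;+\; c\,z(S) \;+\; w(S),
\]
% where (a, b, c) are the branching parameters. As (a, b, c) vary, the
% optimal signatures are vertices of a convex polytope of signatures, and
% the normal fan of that polytope partitions parameter space into regions
% sharing a common optimal branching configuration.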
Bio
Heitsch is Professor of Mathematics at Georgia Tech, with courtesy appointments in Biological Sciences and Computational Science & Engineering as well as an affiliation with the Petit Institute for Bioengineering & Bioscience.
She is also Director of the new Southeast Center for Mathematics and Biology (SCMB), an NSF-Simons MathBioSys Research Center, and is finishing her tenure directing the GT Interdisciplinary Mathematics Preparation and Career Training (IMPACT) Postdoctoral Program.
Heitsch’s research interests lie at the interface between discrete mathematics and molecular biology, specifically combinatorial problems “as motivated by” and “with applications to” fundamental biomedical questions like RNA folding.

Guowei Wei
Michigan State
Abstract
Title: Topological deep learning on graphs, manifolds, and curves
In the past few years, topological deep learning (TDL), a term we coined in 2017, has become an emerging paradigm in artificial intelligence (AI) and data science. TDL is built on persistent homology (PH), a vital tool in topological data analysis (TDA) that bridges the gap between complex geometry and abstract topology through multiscale analysis. While TDA has made huge strides in a wide variety of scientific and engineering disciplines, it has many limitations. I will discuss our recent efforts to extend the scope of TDA from graphs to manifolds and curves, through new formulations from algebraic topology, geometric topology, and differential topology. I will also discuss how TDL achieved victories in worldwide annual competitions in computer-aided drug design, discovered the evolutionary mechanisms of SARS-CoV-2, and accurately predicted emerging dominant viral variants.
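To illustrate the persistent homology computation that underpins TDL, here is a minimal sketch in Python assuming the ripser package, an illustrative choice rather than the speaker's own tooling: points sampled noisily from a circle should yield one long-lived loop in the H1 diagram.

# Minimal sketch: a Vietoris-Rips persistence diagram for noisy points
# on a circle. The long H1 bar reflects the circle's single loop.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 100)
pts = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((100, 2))

dgms = ripser(pts)['dgms']          # dgms[0]: H0 pairs, dgms[1]: H1 pairs
h1 = dgms[1]
# Report the most persistent H1 feature as a (birth, death) pair.
print(h1[np.argmax(h1[:, 1] - h1[:, 0])])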
Bio
Guowei Wei received his Ph.D. degree from the University of British Columbia and is currently an MSU Research Foundation Professor at Michigan State University. His research focuses on the mathematical foundation of biosciences and artificial intelligence (AI). Dr. Wei pioneered mathematical AI paradigms, such as topological deep learning (TDL), that integrate profound mathematical structures with AI to tackle biological challenges. His math AI has led to victories in D3R Grand Challenges, a worldwide annual competition series in computer-aided drug design. Using TDL, genotyping, and computational biophysics, the Wei team unveiled the mechanisms of SARS-CoV-2 evolution and successfully forecast emerging dominant SARS-CoV-2 variants.