Ameya Jagtap: Enhancing Scientific Computing Through Physics-Informed Neural Networks
The Alan Turing Institute

Published on Feb 28, 2024

Traditional approaches to scientific computation have made significant strides. However, they still face challenges, as they operate within strict constraints, requiring precise knowledge of the underlying physical laws, boundary conditions, and/or initial conditions. Additionally, these approaches often involve time-consuming workflows, including mesh generation and lengthy simulations. Addressing high-dimensional problems governed by parameterized partial differential equations (PDEs) remains a formidable task, and efficiently incorporating noisy data remains a challenge when solving inverse problems. Physics-informed deep learning (PIDL) has emerged as a promising alternative to address these issues.

In this presentation, we will delve into a specific type of PIDL method known as physics-informed neural networks (PINNs). We will provide an overview of the current capabilities and limitations of PINNs, highlighting diverse applications where PINNs have proven to be highly effective compared to traditional approaches. Additionally, we will explore extensions of the PINN method, such as conservative PINNs (cPINNs) and eXtended PINNs (XPINNs), specifically designed for handling big data and/or large models. Throughout the discussion, we will also explore various adaptive activation functions that can enhance the convergence of deep and physics-informed neural networks.
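To make the PINN idea concrete, here is a minimal sketch (an illustration of the general technique, not the speaker's code): a small network u(t) is trained so that the ODE residual u'(t) + u(t) and the initial-condition mismatch u(0) - 1 are both driven to zero, with the derivative obtained by automatic differentiation rather than a mesh. The architecture, learning rate, and collocation points are all illustrative choices.

```python
import torch

torch.manual_seed(0)

# Surrogate solution u_theta(t) for the toy problem u'(t) = -u(t), u(0) = 1.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 16), torch.nn.Tanh(),
    torch.nn.Linear(16, 16), torch.nn.Tanh(),
    torch.nn.Linear(16, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

# Collocation points in [0, 1]; no mesh generation is needed.
t = torch.linspace(0, 1, 32).reshape(-1, 1).requires_grad_(True)
t0 = torch.zeros(1, 1)  # point where the initial condition is enforced

def pinn_loss():
    u = net(t)
    # du/dt via automatic differentiation through the network
    du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    residual = du + u          # physics residual: u' + u = 0
    ic = net(t0) - 1.0         # initial condition: u(0) = 1
    return (residual ** 2).mean() + (ic ** 2).mean()

losses = []
for _ in range(500):
    opt.zero_grad()
    loss = pinn_loss()
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

The same structure carries over to PDEs: the residual term becomes the PDE operator applied to the network output, and noisy observations for inverse problems enter as an extra data-misfit term in the loss.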

