

Friday May 16, 2025 TBA
Design discovery often relies on computer simulation (e.g., physics-based models) to evaluate candidate designs. These simulations can take a long time to run, limiting the number of designs that can be evaluated within a given time frame. Surrogate machine learning models are being used as a faster alternative to these simulations. However, a surrogate's input-output relationships are constrained to the training data distribution, and it often extrapolates poorly, hindering the exploration of interesting designs. Additionally, the amount of data required to train an accurate surrogate can be immense; because that training data must itself be generated by the physics simulations, this cost reduces the surrogate's benefit. Operator neural network architectures such as Deep Operator Networks (DeepONets) are being developed to improve extrapolation by modeling the solution of differential equations as a mapping from an input function to an output function (i.e., an operator). Physics-Informed Neural Networks (PINNs) impose additional physics-based constraints on the training loss function, potentially reducing the amount of training data required. In this talk, the ideas behind DeepONets and PINNs are introduced by applying them to demagnetization fields (the Poisson equation).
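To make the operator idea concrete, the following is a structural sketch of a vanilla (unstacked) DeepONet forward pass: G(u)(y) ≈ Σₖ branchₖ(u(x₁),…,u(xₘ)) · trunkₖ(y), where the branch net encodes the input function sampled at m fixed sensor locations and the trunk net encodes the query point. This is not the speaker's implementation; the layer sizes are arbitrary and the weights are random and untrained, so only the architecture (not the accuracy) is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)
m, p = 20, 8          # number of input-function sensors, latent basis size

def mlp(widths):
    """A small dense network with tanh activations and random, untrained weights."""
    Ws = [rng.normal(size=(a, b)) / np.sqrt(a) for a, b in zip(widths, widths[1:])]
    def forward(x):
        for W in Ws[:-1]:
            x = np.tanh(x @ W)
        return x @ Ws[-1]
    return forward

branch = mlp([m, 32, p])   # u sampled at m sensors -> p coefficients
trunk = mlp([1, 32, p])    # query point y -> p basis values

sensors = np.linspace(0, 1, m)
u = np.sin(np.pi * sensors)            # one input function, sampled at the sensors
ys = np.linspace(0, 1, 50)[:, None]    # 50 query points

b = branch(u[None, :])                 # shape (1, p)
t = trunk(ys)                          # shape (50, p)
G_u = t @ b.T                          # shape (50, 1): G(u) evaluated at each y
print(G_u.shape)
```

Training would fit the branch and trunk weights so that G(u)(y) matches simulation outputs across many input functions u; the key design choice is that the same trained operator can then be queried for new input functions without rerunning the simulator.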
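The physics-informed loss idea can be illustrated on a 1-D Poisson problem, u''(x) = f(x) with u(0) = u(1) = 0. As a minimal sketch (an assumption, not the talk's method), a polynomial ansatz u(x) = Σₖ cₖ xᵏ stands in for the neural network; because u'' is linear in the coefficients, the PDE-residual loss plus boundary penalties can be minimized by ordinary least squares, with no labeled solution data at all. Choosing f(x) = -π² sin(πx) gives the known exact solution u(x) = sin(πx) to check against.

```python
import numpy as np

deg = 10                       # polynomial degree of the ansatz
xs = np.linspace(0, 1, 50)     # interior collocation points
f = -np.pi**2 * np.sin(np.pi * xs)   # source term; exact solution is sin(pi x)

# PDE residual rows: second derivative of each basis monomial x^k at each point.
A_pde = np.zeros((len(xs), deg + 1))
for k in range(2, deg + 1):
    A_pde[:, k] = k * (k - 1) * xs**(k - 2)

# Boundary rows enforce u(0) = 0 and u(1) = 0 (strongly weighted penalty).
w = 100.0
A_bc = w * np.vander([0.0, 1.0], deg + 1, increasing=True)
A = np.vstack([A_pde, A_bc])
b = np.concatenate([f, [0.0, 0.0]])

# "Training" = minimizing the physics loss, here a linear least-squares solve.
c, *_ = np.linalg.lstsq(A, b, rcond=None)

# Compare the fitted solution against the exact one.
xt = np.linspace(0, 1, 200)
u_fit = np.vander(xt, deg + 1, increasing=True) @ c
err = np.max(np.abs(u_fit - np.sin(np.pi * xt)))
print(f"max error vs exact solution: {err:.2e}")
```

A true PINN replaces the polynomial with a neural network and the least-squares solve with gradient descent, computing u'' by automatic differentiation, but the loss being minimized has exactly this structure: PDE residual at collocation points plus boundary-condition terms.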
Speakers

Nicholas Propes, PhD

Senior Staff Data Scientist, Seagate
Nicholas Propes is a Data Scientist at Seagate Technology, currently researching machine learning methods for design discovery. He received his Ph.D. in Electrical Engineering from the Georgia Institute of Technology.