Optimization in the Presence of Noise with Applications in Machine Learning

Dr. Jorge Nocedal
Northwestern University

In many engineering applications, one wishes to optimize the performance of a system simulated by software whose internals are not accessible to us. Objective function values are available -- and are typically noisy -- but derivatives are unknown. We discuss how to solve problems of this kind in practice. Examples of adversarial attacks on neural networks and of engineering design illustrate the challenges to be overcome, particularly when the number of unknowns is large and the model includes constraints that must be respected.
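As a concrete illustration of this problem setting (not the method discussed in the talk), the minimal sketch below treats a simulator as a black box that returns noisy objective values only, and applies a standard derivative-free solver from SciPy. The quadratic objective and noise level are assumptions chosen purely for the example.

    # Illustrative sketch only: derivative-free optimization of a noisy black-box
    # objective using SciPy's Nelder-Mead simplex method (no derivatives needed).
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    def noisy_objective(x):
        # Black-box "simulator": a smooth quadratic observed with additive noise.
        # Only function values are returned; gradients are unavailable.
        true_value = np.sum((x - 1.0) ** 2)
        return true_value + 0.01 * rng.standard_normal()

    x0 = np.zeros(5)
    result = minimize(noisy_objective, x0, method="Nelder-Mead",
                      options={"maxiter": 2000, "xatol": 1e-3, "fatol": 1e-3})
    print("Approximate minimizer:", result.x)

Even this toy example hints at the difficulties raised in the abstract: noise limits the achievable accuracy, and simplex-type methods scale poorly as the number of unknowns grows or when constraints must be respected.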


SHORT BIO

Dr. Jorge Nocedal is a Professor at Northwestern University. He obtained his B.S. degree from UNAM, Mexico, and a Ph.D. from Rice University. His research is in optimization, both deterministic and stochastic, with emphasis on large-scale problems arising in machine learning. He served as editor-in-chief of the SIAM Journal on Optimization, is a SIAM Fellow, and was awarded the 2012 George B. Dantzig Prize as well as the 2017 John von Neumann Theory Prize for contributions to the theory and algorithms of nonlinear optimization. He is a member of the US National Academy of Engineering.