Viola Ward Brinning and Elbert Calhoun Brinning Professor of Mathematics, on leave for the 2022-23 academic year

Professor Levy's research interests include optimization, variational analysis, and numerical methods. He enjoys teaching these subjects as well as multivariable calculus and differential equations.

"Optimization pulls in people who like math, but also people who like to solve real problems they can recognize from their daily lives. These can involve very complicated questions and being able to solve them is a magical thing."

SpringerBriefs in Optimization. Springer Nature Switzerland AG, Cham, 2018

Numerical minimization of an objective function is analyzed in this book to understand solution algorithms for optimization problems. Multiset-mappings are introduced to engineer numerical minimization as a repeated application of an iteration mapping. Ideas from numerical variational analysis are extended to define and explore notions of continuity and differentiability of multiset-mappings, and to prove a fixed-point theorem for iteration mappings. Concepts from dynamical systems are utilized to develop notions of basin size and basin entropy. Simulations to estimate basins of attraction, to measure and classify basin size, and to compute basin entropy are included to shed new light on convergence behavior in numerical minimization. Graduate students, researchers, and practitioners in optimization and mathematics who work theoretically to develop solution algorithms will find this book a useful resource.
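The fixed-point view described above can be illustrated with a minimal sketch (my own example, not drawn from the book): a minimization algorithm is an iteration mapping T applied repeatedly, and a minimizer of the objective is a fixed point of T, i.e. T(x*) = x*.

```python
def make_iteration_mapping(grad, step=0.1):
    """Gradient-descent iteration mapping T(x) = x - step * grad(x)."""
    return lambda x: x - step * grad(x)

def iterate_to_fixed_point(T, x0, tol=1e-10, max_iter=10_000):
    """Apply T repeatedly until successive iterates stop moving."""
    x = x0
    for _ in range(max_iter):
        x_next = T(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# f(x) = (x - 3)^2 has gradient 2*(x - 3); its minimizer x* = 3
# is exactly the fixed point of T.
T = make_iteration_mapping(lambda x: 2 * (x - 3), step=0.1)
x_star = iterate_to_fixed_point(T, x0=0.0)
```

Here T is a contraction (each step shrinks the distance to 3 by a factor of 0.8), which is the kind of structure a fixed-point theorem for iteration mappings exploits.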

SpringerBriefs in Optimization. Springer, New York, 2012

This monograph presents and analyzes a unifying framework for a wide variety of numerical minimization methods. Our “reduce-or-retreat” framework is a conceptual method outline that covers essentially any method whose iterations choose between the two options of reducing the objective in some way at a trial point, or (if reduction is not possible) retreating to a closer set of trial points. Included in this category are many derivative-based methods (which depend on the objective gradient to generate trial points), as well as many derivative-free and direct methods that generate trial points either from some model of the objective or from some designated pattern in the variable space.
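The two options in the framework can be sketched in a few lines (my own illustration of the general pattern, not the monograph's formal development): at each iteration, either accept a trial point that reduces the objective, or retreat by shrinking the pattern of trial points around the current point.

```python
def reduce_or_retreat(f, x0, step=1.0, min_step=1e-8):
    """Toy one-dimensional reduce-or-retreat iteration."""
    x = x0
    while step > min_step:
        # Trial points: a simple pattern to the left and right of x.
        trials = [x + step, x - step]
        better = [t for t in trials if f(t) < f(x)]
        if better:
            x = min(better, key=f)   # reduce: accept the best improving trial
        else:
            step *= 0.5              # retreat: bring trial points closer to x
    return x

# Example: minimize f(x) = (x - 2)^2 starting from x = 10.
x_min = reduce_or_retreat(lambda x: (x - 2) ** 2, x0=10.0)
```

A direct pattern-search method fits this outline with trial points from a fixed pattern, while derivative-based methods fit it with trial points generated from the gradient; the outline itself is agnostic about how the trials are produced.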

Proto-derivatives and the geometry of solution mappings in nonlinear programming (with R. T. Rockafellar), in Nonlinear Optimization and Applications, pages 343-365. Plenum, 1996.

Sensitivity of solutions in nonlinear programming problems with nonunique multipliers (with R. T. Rockafellar), in D.-Z. Du, L. Qi, and R. S. Womersley, editors, Recent Advances in Nonsmooth Optimization, pages 215-223. World Scientific, 1995.

Partial extensions of Attouch's theorem with application to proto-differentiability of subgradient mappings (with R. Poliquin and L. Thibault), in Transactions of the American Mathematical Society, 347 (1995), 1269-1294.

Sensitivity analysis of solutions to generalized equations (with R. T. Rockafellar), in Transactions of the American Mathematical Society, 345 (1994), 661-671.

Society for Industrial and Applied Mathematics (2022)

Optimization is presented in most multivariable calculus courses as an application of the gradient, and while this treatment makes sense for a calculus course, there is much more to the theory of optimization. Optimization problems are generated constantly, and the theory of optimization has grown and developed in response to the challenges presented by these problems.

This textbook

aims to show readers how optimization is done in practice and help them to develop an appreciation for the richness of the theory behind the practice;

incorporates exercises, problems (including modeling and computational problems), and implementations throughout the text to help students learn by doing; and

inserts Python notes strategically to help readers complete computational problems and implementations.
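A snippet in the spirit of the book's computational problems (my own example, not taken from the text) shows the calculus-course view of optimization in Python: descending against the gradient of a two-variable function.

```python
import numpy as np

def grad_f(p):
    """Gradient of f(x, y) = (x - 1)^2 + 2*(y + 2)^2."""
    x, y = p
    return np.array([2 * (x - 1), 4 * (y + 2)])

p = np.array([0.0, 0.0])
for _ in range(500):
    p = p - 0.1 * grad_f(p)    # step against the gradient

# p approaches the minimizer (1, -2), where the gradient vanishes
```

The stopping criterion, step size, and convergence rate are exactly the kinds of questions that take a reader beyond the calculus-course treatment and into the theory the book develops.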