The Class of ’27 Lecture Series is a special lecture held each year. It was established in 1960 to honor Professor Edwin Allen, the first chair of the Math Sciences Department. The three members of the class of 1927 who established the series are Isaac Arnold, Alexander Hassan, and Isadore Fixman.
Class of ’27 Lecture II
Tamás Terlaky
from Lehigh University
Class of ’27 Lecture I
Tamás Terlaky
from Lehigh University
Class of '27 Lecture II: "Convergence Analysis of Stochastic Optimization Methods via Martingales"
Katya Scheinberg
from Lehigh University
Abstract: We will present a very general framework for unconstrained stochastic optimization which encompasses standard frameworks such as line search and trust region using random models. In particular, this framework retains desirable practical features such as the step acceptance criterion, trust-region adjustment, and the ability to utilize second-order models. The framework is based on bounding the expected stopping time of a stochastic process that satisfies certain assumptions...
Class of '27 Lecture I: "Gradient Descent Without Gradients"
Katya Scheinberg
from Lehigh University
Abstract: The core of continuous optimization lies in using first- and second-order derivative information to produce steps that improve the objective function value. Classical methods such as gradient descent and Newton's method rely on this information. The recently popular method in machine learning, Stochastic Gradient Descent, does not require the gradient itself, but still requires an unbiased estimate of it. However, in many applications neither derivatives nor their unbiased estimates are available.
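A standard derivative-free device in this setting (shown here as an illustrative sketch, not necessarily the construction from the talk) is to estimate each partial derivative from function values alone via finite differences, and then run ordinary gradient descent on the estimates:

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference estimate of the gradient of f at x,
    built entirely from function evaluations (no derivatives)."""
    g = np.zeros_like(x, dtype=float)
    fx = f(x)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - fx) / h   # slope along coordinate i
    return g

def gradient_descent_df(f, x0, step=0.25, n_iters=100):
    """Gradient descent driven only by function values."""
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        x -= step * fd_gradient(f, x)
    return x

# Minimize f(x) = ||x - (1, -2)||^2 without ever forming its gradient.
target = np.array([1.0, -2.0])
f = lambda x: float(np.sum((x - target) ** 2))
x = gradient_descent_df(f, np.zeros(2))
```

The forward-difference estimate carries an O(h) bias, which is one reason the abstract distinguishes this situation from Stochastic Gradient Descent, where the estimates are assumed unbiased.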
Class of '27 Lecture: "Signal Fragmentation for Low Frequency Radio Transmission"
Russel Caflisch
from NYU Courant Institute
Class of '27 Lecture: "From Differential Equations to Data Science and Back"
Russel Caflisch
from NYU Courant Institute
Class of ’27 Lecture: Coordinate Descent Methods
Stephen J. Wright
from University of Wisconsin
Coordinate descent is an approach for minimizing functions in which only a subset of the variables is allowed to change at each iteration, while the remaining variables are held fixed. This approach has been popular in applications since the earliest days of optimization, because it is intuitive and because the low-dimensional searches that take place at each iteration are inexpensive in many applications. In recent years, the popularity of coordinate descent methods has grown further because of their usefulness in data analysis.
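In its simplest cyclic form, the idea above can be sketched as follows (a minimal illustration, assuming a user-supplied partial-derivative oracle `grad_i`; step-size and iteration choices are ad hoc, not from the lecture):

```python
import numpy as np

def coordinate_descent(grad_i, x0, step=0.4, n_iters=60):
    """Cyclic coordinate descent: at each iteration, update a single
    coordinate while holding all others fixed. grad_i(x, i) returns
    the i-th partial derivative of the objective at x."""
    x = x0.astype(float).copy()
    n = len(x)
    for k in range(n_iters):
        i = k % n                     # cycle through the coordinates
        x[i] -= step * grad_i(x, i)   # one-dimensional step along coordinate i
    return x

# Minimize f(x) = sum_i (x_i - c_i)^2; each coordinate subproblem is cheap.
c = np.array([1.0, 2.0, 3.0])
x = coordinate_descent(lambda x, i: 2.0 * (x[i] - c[i]), np.zeros(3))
```

Each update touches one variable, which is exactly the inexpensive low-dimensional search the paragraph describes; practical variants differ in how the coordinate is chosen (cyclic, random, or greedy).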
Class of ’27 Lecture: Fundamental Optimization Methods in Data Analysis
Stephen Wright
from University of Wisconsin
Optimization formulations and algorithms are vital tools for solving problems in data analysis. There has been particular interest in some fundamental, elementary, optimization algorithms that were previously thought to have only niche appeal. Stochastic gradient, coordinate descent, and accelerated first-order methods are three examples. We outline applications in which these approaches are useful, discuss the basic properties of these methods, and survey some recent developments in the analysis of their convergence behavior.
Flows in complex networks: theory, algorithms, and application to Lennard-Jones cluster rearrangement
Eric Vanden-Eijnden
from Courant Institute of Mathematical Sciences
Modeling metastability in complex systems
Eric Vanden-Eijnden
from Courant Institute of Mathematical Sciences
Droplet Splashing
Michael P. Brenner
from Harvard University
Linear Algebra and the Shape of Bird Beaks
Michael P. Brenner
from Harvard University
Exact phase retrieval via convex optimization
Emmanuel Candes
from Stanford University
Robust principal component analysis? Some theory and some applications
Emmanuel Candes
from Stanford University
New Perspective of Wave Turbulence
David Cai
from New York University
Mathematical Analysis of Neuronal Network Dynamics
David Cai
from New York University
Efficient Algorithms for the Analysis and Design of Nucleic Acid Base-Pairing
Niles A. Pierce
from California Institute of Technology
Biomolecular Choreography
Niles A. Pierce
from California Institute of Technology
Evolutionary Dynamics in Set Structured Populations
Corina Tarnita
from Harvard University
Evolution of Cooperation
Martin Nowak
from Harvard University
Domain Decomposition Methods for Partial Differential Equations
David E. Keyes
from Columbia University
A Nonlinearly Implicit Manifesto
David E. Keyes
from Columbia University
The Nonuniform FFT and Magnetic Resonance Imaging
Leslie Greengard
from Courant Institute
Fast Multipole Methods and their Applications
Leslie Greengard
from Courant Institute
Multiscale Modeling of Complex Fluids
Weinan E
from Princeton University
What is Multiscale Modeling?
Weinan E
from Princeton University
The Level Set Method: What's in it for you?
Stanley Osher
from University of California, Los Angeles
Equilibrium Statistical Mechanics of Vortices in 2 and 3-D and Crude Closure for Geophysical Flows
Andrew J. Majda
from Courant Institute of Mathematical Sciences, New York University
Clouds, Climate and Modern Applied Mathematics
Andrew J. Majda
from Courant Institute of Mathematical Sciences, New York University
Hydrodynamic Limits
Marshall Slemrod
from University of Wisconsin
Coagulation-Fragmentation Equations
Marshall Slemrod
from University of Wisconsin