Numerous tasks in applied math and data science lead to minimizing a composition of a finite convex function with a smooth nonlinear map. I will discuss various aspects of this problem class, including the efficiency of first-order methods, stochastic algorithms for large finite sums, fast local rates of convergence, and termination criteria. In the second part of the talk, I will specialize the aforementioned techniques to the phase retrieval problem. I will explain how the composite framework allows one to identify regions that, with high probability, are devoid of stationary points; these are the regions where the landscape of the objective function is benign. Building on the recent work of Duchi and Ruan '17, I will then explain how one can harness the rapid convergence guarantees of proximal and subgradient-type methods to devise globally efficient algorithms for the phase retrieval problem.
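For concreteness, the composite problem class and its phase retrieval instance can be sketched as follows (an illustrative formulation with notation of my choosing, not taken verbatim from the talk):

```latex
% Composite optimization: h convex and finite-valued, c a smooth map.
\min_{x \in \mathbb{R}^n} \; f(x) := h\bigl(c(x)\bigr),
\qquad h \colon \mathbb{R}^m \to \mathbb{R} \ \text{convex},
\quad c \colon \mathbb{R}^n \to \mathbb{R}^m \ \text{smooth}.

% Prototypical prox-linear step: linearize c, keep h, add a quadratic penalty.
x_{k+1} = \operatorname*{argmin}_{y} \;
  h\bigl(c(x_k) + \nabla c(x_k)(y - x_k)\bigr)
  + \tfrac{1}{2t}\,\|y - x_k\|^2.

% Robust phase retrieval as a special case: measurements b_i \approx (a_i^T \bar{x})^2,
% with h(z) = \frac{1}{m}\|z\|_1 and c(x) = \bigl((a_i^T x)^2 - b_i\bigr)_{i=1}^m.
\min_{x \in \mathbb{R}^n} \; \frac{1}{m} \sum_{i=1}^{m}
  \bigl| (a_i^T x)^2 - b_i \bigr|.
```

The phase retrieval objective above is nonsmooth and nonconvex, yet it fits the composite template exactly, which is what makes proximal and subgradient-type methods applicable.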
Joint work with Damek Davis (Cornell), Alexander D. Ioffe (Technion), Adrian S. Lewis (Cornell), and Courtney Paquette (Lehigh).