Algorithms for Smooth Nonconvex Optimization with Worst-case Guarantees

Author: Michael John O'Neill
Release: 2020
ISBN-10: OCLC:1245496225

Book Synopsis: Algorithms for Smooth Nonconvex Optimization with Worst-case Guarantees, by Michael John O'Neill

Written by Michael John O'Neill and released in 2020. Book excerpt: The nature of global convergence guarantees for nonconvex optimization algorithms has changed significantly in recent years. New results characterize the maximum computational cost required for an algorithm to satisfy approximate optimality conditions, rather than focusing on the limiting behavior of its iterates. In many contexts, such as those arising from machine learning, convergence to approximate second-order points is desired, and algorithms designed for these problems must avoid saddle points efficiently to achieve optimal worst-case guarantees.

In this dissertation, we develop and analyze a number of nonconvex optimization algorithms. First, we focus on accelerated gradient algorithms and provide results on their avoidance of "strict saddle points"; in addition, we prove the rate at which these accelerated gradient algorithms diverge when in a neighborhood of a strict saddle point. Subsequently, we propose three new algorithms for smooth, nonconvex optimization with worst-case complexity guarantees. The first algorithm is developed for unconstrained optimization and is based on the classical Newton conjugate gradient method. This approach is then extended to bound-constrained optimization by modifying the primal log-barrier method. Finally, we present a method for a special class of "strict saddle functions" that does not require knowledge of the parameters defining the optimization landscape. These algorithms converge to approximate second-order points with the best known computational complexity for their respective problem classes.
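The Newton conjugate gradient idea described in the excerpt can be illustrated with a minimal sketch. This is a toy, not the dissertation's algorithm: a standard CG inner loop bails out when it detects a direction of negative curvature, and the outer loop follows that direction away from the saddle. The test function f(x, y) = x^2 - y^2 + y^4 (a strict saddle at the origin), the backtracking rule, and all tolerances are assumptions chosen for the demo.

```python
import numpy as np

def cg_inner(H, g, tol=1e-10, max_iter=200):
    """Approximately solve H d = -g by conjugate gradient, but abort and
    return the current search direction if negative curvature
    (p^T H p <= 0) is detected -- the key device that lets Newton-CG-type
    methods certify approximate second-order points.  Illustrative only."""
    d = np.zeros_like(g)
    r = g.astype(float)          # residual of H d + g = 0
    p = -r
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 0:
            # direction of negative curvature: hand it to the outer loop
            # so it can step away from the saddle region
            return p / np.linalg.norm(p), True
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) < tol:
            break
        p = -r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return d, False

def newton_cg(f, grad, hess, x, g_tol=1e-8, max_outer=500):
    """Toy outer loop: take the CG step (or the negative-curvature step)
    with crude backtracking.  The negative-curvature steps are what let
    the iterates leave strict saddles they pass near."""
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) < g_tol:
            break
        d, neg_curv = cg_inner(hess(x), g)
        if neg_curv and d @ g > 0:
            d = -d                       # keep it a descent direction
        t = 1.0
        while f(x + t * d) > f(x) - 1e-4 * t * (d @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

# Strict-saddle test function (an assumption for this demo): saddle at
# the origin, minimizers at (0, +/- 1/sqrt(2)) with value -1/4.
f    = lambda x: x[0]**2 - x[1]**2 + x[1]**4
grad = lambda x: np.array([2 * x[0], -2 * x[1] + 4 * x[1]**3])
hess = lambda x: np.diag([2.0, -2.0 + 12 * x[1]**2])

x_star = newton_cg(f, grad, hess, np.array([1.0, 0.01]))
```

Started near the saddle's stable manifold, plain gradient descent would crawl toward the origin, while the negative-curvature step pushes the iterate off it and the method settles at a minimizer where the Hessian is positive definite.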


Algorithms for Smooth Nonconvex Optimization with Worst-case Guarantees Related Books

Structure and Complexity in Non-convex and Non-smooth Optimization
Language: en
Pages: 167
Authors: Courtney Paquette
Type: BOOK - Published: 2017

Complexity theory drives much of modern optimization, allowing a fair comparison between competing numerical methods.
Evaluation Complexity of Algorithms for Nonconvex Optimization
Language: en
Pages: 549
Authors: Coralia Cartis
Categories: Mathematics
Type: BOOK - Published: 2022-07-06 - Publisher: SIAM

A popular way to assess the “effort” needed to solve a problem is to count how many evaluations of the problem functions (and their derivatives) are required.
Tight Worst-case Guarantees and Approximation Algorithms for Several Classes of Geometric Optimization Problems
Language: en
Authors: Phillip-Raphael Keldenich
Type: BOOK - Published: 2020

Trust Region Methods
Language: en
Pages: 960
Authors: A. R. Conn
Categories: Mathematics
Type: BOOK - Published: 2000-01-01 - Publisher: SIAM

Mathematics of Computing -- General.