Enhancing Local Derivative-Free Optimization with Curvature Information and Inspection Strategies
UCLA Electronic Theses and Dissertations

Abstract

Derivative-Free Optimization (DFO) problems arise naturally in many domains, from engineering design to hyperparameter optimization and beyond. This thesis introduces two methodologies for improving local DFO methods. Curvature-Aware Random Search (CARS) uses approximate second-order information about the objective function to compute the step size, enhancing the speed of local convergence. We prove linear convergence of CARS for strongly convex objective functions and propose two variants: CARS with Cubic Regularization (CARS-CR), which attains an $\mathcal{O}(k^{-1})$ convergence rate without the strong convexity assumption, and CARS with Numerical Quadrature (CARS-NQ), which offers robustness against oscillatory noise. We also introduce a novel randomized matrix inversion method to leverage additional curvature information. The second methodology, Inspect as You Run (IR), can be integrated into any iterative DFO method; it aids in escaping spurious local optima while preserving local convergence properties. We further prove that IR assists in verifying $R$-local optimality, a notion intermediate between local and global optimality. Extensive numerical results demonstrate the efficacy of these methods.
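
To make the CARS idea concrete, the following Python sketch shows how finite-difference estimates of the directional derivative and curvature along a random direction can yield a Newton-like step size. It is a minimal illustration only: the function name cars_step, the sampling radius r, and the simple accept-if-improved rule are assumptions for this sketch, not the thesis's exact algorithm.

```python
import numpy as np

def cars_step(f, x, r=1e-4, rng=None):
    """One Newton-like step along a random direction, in the spirit of CARS.

    Estimates the directional first and second derivatives of f at x along a
    random unit direction u via central finite differences, then takes the
    one-dimensional Newton step -(d1/d2) * u when the curvature is positive.
    Keeps x unchanged if the trial point does not decrease f.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)

    f0, fp, fm = f(x), f(x + r * u), f(x - r * u)
    d1 = (fp - fm) / (2 * r)         # directional derivative estimate
    d2 = (fp - 2 * f0 + fm) / r**2   # directional curvature estimate

    if d2 <= 0:                      # no usable curvature along u
        return x, f0
    x_trial = x - (d1 / d2) * u      # Newton step restricted to the line x + t*u
    f_trial = f(x_trial)
    return (x_trial, f_trial) if f_trial < f0 else (x, f0)

# Usage: minimize a strongly convex quadratic with random-direction steps.
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])
    f = lambda x: 0.5 * x @ A @ x
    x = np.array([5.0, 5.0])
    for _ in range(200):
        x, fx = cars_step(f, x)
    print(fx)  # should be close to 0
```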
