EDP Sciences, 2023. — 166 p. — ASIN B0CLL33PRZ.
With the rapid development of big data and artificial intelligence, a natural question is: how do we analyze data more efficiently? One efficient way is to use optimization. What is optimization? Optimization exists everywhere. People optimize. As long as you have choices, you are doing optimization. Optimization is the key to operations research. This book introduces the basic definitions and theory of numerical optimization, including optimality conditions for unconstrained and constrained optimization, as well as algorithms for unconstrained and constrained problems. It also covers the semismooth Newton’s method, which plays an important role in large-scale numerical optimization. Finally, based on the author’s research experience, several recent applications of optimization are introduced, including optimization algorithms for hypergraph matching and support vector machines, and a bilevel optimization approach for hyperparameter selection in machine learning. With these optimization tools, one can deal with data more efficiently.
This book is based on the author’s lecture “Modern Optimization Methods”, given to graduate students at the Beijing Institute of Technology since 2015. It aims to present a complete and systematic theory of numerical optimization together with its latest applications in different areas, especially machine learning, statistics, and computer science.
True PDF
Preface
Introduction
  About Optimization
  Classification of Optimization
  Preliminaries in Convex Analysis
  Exercises
Fundamentals of Optimization
  Unconstrained Optimization Problem
  What is a Solution?
  Overview of Algorithms
  Convergence
  Scaling
  Exercises
Line Search Methods
  Step Length
  Convergence of Line Search Methods
  Rate of Convergence
  Exercises
Trust Region Methods
  Outline of the Trust Region Approach
  Algorithms Based on the Cauchy Point
  Global Convergence
  Local Convergence
  Other Enhancements
  Exercises
Conjugate Gradient Methods
  Linear Conjugate Gradient Method
  Nonlinear Conjugate Gradient Methods
  Exercises
Semismooth Newton’s Method
  Semismoothness
  Nonsmooth Version of Newton’s Method
  Support Vector Machine
  Semismooth Newton’s Method for SVM
  Exercises
Theory of Constrained Optimization
  Local and Global Solutions
  Examples
  Tangent Cone and Constraint Qualifications
  First-Order Optimality Conditions
  Second-Order Conditions
  Duality
  KKT Condition
  Dual Problem
  Exercises
Penalty and Augmented Lagrangian Methods
  The Quadratic Penalty Method
  Exact Penalty Method
  Augmented Lagrangian Method
  Quadratic Penalty Method for Hypergraph Matching
  Augmented Lagrangian Method for SVM
  Exercises
Bilevel Optimization and Its Applications
  Introduction
  Bilevel Model for a Case of Hyperparameter Selection in SVC
  The Global Relaxation Method (GRM)
  MPEC-MFCQ: A Hidden Property
  Numerical Results
Bibliography