EDP Sciences

Modern Optimization Methods

by Qingna LI (author)
November 2023
eBook [PDF]
157 pages
Download after purchase
65,99 €

Presentation

With the rapid development of big data and artificial intelligence, a natural question arises: how can we analyze data more efficiently? One efficient approach is optimization. What is optimization? Optimization is everywhere: people optimize, and whenever you have choices to make, you are doing optimization. Optimization is a key tool of operations research. This book introduces the basic definitions and theory of numerical optimization, including optimality conditions for unconstrained and constrained optimization, as well as algorithms for unconstrained and constrained problems. It also covers the nonsmooth Newton’s method, which plays an important role in large-scale numerical optimization. Finally, drawing on the author’s research experience, several recent applications of optimization are introduced, including optimization algorithms for hypergraph matching, support vector machines, and a bilevel optimization approach to hyperparameter selection in machine learning. With these optimization tools, one can handle data more efficiently.

Contents

Preface..................................................... III

CHAPTER 1

Introduction................................................. 1

1.1 About Optimization ...................................... 1

1.2 Classification of Optimization ............................... 4

1.3 Preliminaries in Convex Analysis ............................ 10

1.4 Exercises ............................................... 15

CHAPTER 2

Fundamentals of Optimization ................................... 17

2.1 Unconstrained Optimization Problem ......................... 17

2.2 What is a Solution? ...................................... 18

2.2.1 Definitions of Different Solutions ....................... 18

2.2.2 Recognizing a Local Minimum ......................... 20

2.2.3 Nonsmooth Problems ................................ 23

2.3 Overview of Algorithms ................................... 25

2.3.1 Line Search Strategy ................................ 26

2.3.2 Trust Region Strategy ............................... 30

2.4 Convergence ............................................ 31

2.5 Scaling ................................................ 32

2.6 Exercises ............................................... 33

CHAPTER 3

Line Search Methods .......................................... 35

3.1 Step Length ............................................ 35

3.1.1 The Wolfe Conditions ............................... 37

3.1.2 The Goldstein Conditions ............................ 40

3.1.3 Sufficient Decrease and Backtracking .................... 41

3.2 Convergence of Line Search Methods ......................... 42

3.3 Rate of Convergence ...................................... 44

3.3.1 Steepest Descent Method ............................. 44

3.3.2 Newton’s Method ................................... 46

3.3.3 Quasi-Newton Methods .............................. 48

3.4 Exercises ............................................... 50

CHAPTER 4

Trust Region Methods ......................................... 51

4.1 Outline of the Trust Region Approach ........................ 52

4.2 Algorithms Based on the Cauchy Point ........................ 54

4.2.1 The Cauchy Point .................................. 54

4.2.2 The Dogleg Method ................................. 56

4.2.3 Two-Dimensional Subspace Minimization ................. 58

4.3 Global Convergence ...................................... 59

4.3.1 Reduction Obtained by the Cauchy Point ................ 59

4.3.2 Convergence to Stationary Points ....................... 61

4.4 Local Convergence ....................................... 65

4.5 Other Enhancements ...................................... 65

4.6 Exercises ............................................... 68

CHAPTER 5

Conjugate Gradient Methods.................................... 69

5.1 Linear Conjugate Gradient Method........................... 69

5.1.1 Conjugate Direction Method .......................... 69

5.1.2 Conjugate Gradient Method ........................... 72

5.1.3 A Practical Form of the Conjugate Gradient Method ........ 75

5.1.4 Rate of Convergence ................................ 76

5.1.5 Preconditioning .................................... 77

5.2 Nonlinear Conjugate Gradient Methods ....................... 78

5.2.1 The Polak-Ribière Method and Variants .................. 80

5.2.2 Global Convergence ................................. 81

5.3 Exercises ............................................... 83

CHAPTER 6

Semismooth Newton’s Method ................................... 85

6.1 Semismoothness ......................................... 85

6.2 Nonsmooth Version of Newton’s Method ....................... 87

6.3 Support Vector Machine ................................... 89

6.4 Semismooth Newton’s Method for SVM ....................... 91

6.5 Exercises ............................................... 96

CHAPTER 7

Theory of Constrained Optimization ............................... 97

7.1 Local and Global Solutions ................................. 97

7.1.1 Smoothness ....................................... 98

7.2 Examples .............................................. 99


7.3 Tangent Cone and Constraint Qualifications .................... 103

7.4 First-Order Optimality Conditions ........................... 105

7.5 Second-Order Conditions .................................. 106

7.6 Duality ................................................ 109

7.7 KKT Condition ......................................... 112

7.8 Dual Problem ........................................... 114

7.9 Exercises ............................................... 118

CHAPTER 8

Penalty and Augmented Lagrangian Methods ........................ 119

8.1 The Quadratic Penalty Method.............................. 119

8.2 Exact Penalty Method .................................... 122

8.3 Augmented Lagrangian Method ............................. 123

8.4 Quadratic Penalty Method for Hypergraph Matching ............. 125

8.4.1 Hypergraph Matching ............................... 126

8.4.2 Mathematical Formulation ............................ 126

8.4.3 Relaxation Problem ................................. 128

8.4.4 Quadratic Penalty Method for (8.21) .................... 129

8.4.5 Numerical Results .................................. 130

8.5 Augmented Lagrangian Method for SVM ...................... 132

8.5.1 Support Vector Machine ............................. 132

8.5.2 Mathematical Formulation ............................ 133

8.5.3 Augmented Lagrangian Method (ALM) .................. 133

8.5.4 Semismooth Newton’s Method for the Subproblem ......... 136

8.5.5 Reducing the Computational Cost ...................... 137

8.5.6 Convergence Result of ALM ........................... 138

8.5.7 Numerical Results on LIBLINEAR ...................... 139

8.6 Exercises ............................................... 141

CHAPTER 9

Bilevel Optimization and Its Applications ........................... 143

9.1 Introduction ............................................ 143

9.2 Bilevel Model for a Case of Hyperparameter Selection in SVC ....... 145

9.2.1 An MPEC Formulation .............................. 147

9.3 The Global Relaxation Method (GRM) ........................ 148

9.4 MPEC-MFCQ: A Hidden Property ........................... 149

9.5 Numerical Results........................................ 150

Bibliography................................................. 153


Additional Information

Characteristics

Language(s): English

Audience(s): Research, Students, Professionals

Publisher: EDP Sciences & Science Press

Collection: Current Natural Sciences

Published: 13 November 2023

EAN13 (hardcopy): 9782759831746

Reference eBook [PDF]: L31753

EAN13 eBook [PDF]: 9782759831753

Interior: Colour

Pages count eBook [PDF]: 157

Size: 6.3 MB (PDF)
