S. J. Wright Numerical Optimization (second edition).pdf

Uploaded 2017-10-19 20:35:24 · 4.2 MB · PDF

This is a book for people interested in solving optimization problems. Because of the wide (and growing) use of optimization in science, engineering, economics, and industry, it is essential for students and practitioners alike to develop an understanding of optimization algorithms. Knowledge of the …
To Sue, Isabel and Martin
To Mum and Dad

Contents

Preface
Preface to the Second Edition

1 Introduction
  Mathematical Formulation
  Example: A Transportation Problem
  Continuous versus Discrete Optimization
  Constrained and Unconstrained Optimization
  Global and Local Optimization
  Stochastic and Deterministic Optimization
  Convexity
  Optimization Algorithms
  Notes and References

2 Fundamentals of Unconstrained Optimization
  2.1 What Is a Solution?
    Recognizing a Local Minimum
    Nonsmooth Problems
  2.2 Overview of Algorithms
    Two Strategies: Line Search and Trust Region
    Search Directions for Line Search Methods
    Models for Trust-Region Methods
    Scaling
  Exercises

3 Line Search Methods
  3.1 Step Length
    The Wolfe Conditions
    The Goldstein Conditions
    Sufficient Decrease and Backtracking
  3.2 Convergence of Line Search Methods
  3.3 Rate of Convergence
    Convergence Rate of Steepest Descent
    Newton's Method
    Quasi-Newton Methods
  3.4 Newton's Method with Hessian Modification
    Eigenvalue Modification
    Adding a Multiple of the Identity
    Modified Cholesky Factorization
    Modified Symmetric Indefinite Factorization
  3.5 Step-Length Selection Algorithms
    Interpolation
    Initial Step Length
    A Line Search Algorithm for the Wolfe Conditions
  Notes and References
  Exercises

4 Trust-Region Methods
  Outline of the Trust-Region Approach
  4.1 Algorithms Based on the Cauchy Point
    The Cauchy Point
    Improving on the Cauchy Point
    The Dogleg Method
    Two-Dimensional Subspace Minimization
  4.2 Global Convergence
    Reduction Obtained by the Cauchy Point
    Convergence to Stationary Points
  4.3 Iterative Solution of the Subproblem
    The Hard Case
    Proof of Theorem 4.1
    Convergence of Algorithms Based on Nearly Exact Solutions
  4.4 Local Convergence of Trust-Region Newton Methods
  4.5 Other Enhancements
    Scaling
    Trust Regions in Other Norms
  Notes and References
  Exercises

5 Conjugate Gradient Methods
  5.1 The Linear Conjugate Gradient Method
    Conjugate Direction Methods
    Basic Properties of the Conjugate Gradient Method
    A Practical Form of the Conjugate Gradient Method
    Rate of Convergence
    Preconditioning
    Practical Preconditioners
  5.2 Nonlinear Conjugate Gradient Methods
    The Fletcher-Reeves Method
    The Polak-Ribière Method and Variants
    Quadratic Termination and Restarts
    Behavior of the Fletcher-Reeves Method
    Global Convergence
    Numerical Performance
  Notes and References
  Exercises

6 Quasi-Newton Methods
  6.1 The BFGS Method
    Properties of the BFGS Method
    Implementation
  6.2 The SR1 Method
    Properties of SR1 Updating
  6.3 The Broyden Class
  6.4 Convergence Analysis
    Global Convergence of the BFGS Method
    Superlinear Convergence of the BFGS Method
    Convergence Analysis of the SR1 Method
  Notes and References
  Exercises

7 Large-Scale Unconstrained Optimization
  7.1 Inexact Newton Methods
    Local Convergence of Inexact Newton Methods
    Line Search Newton-CG Method
    Trust-Region Newton-CG Method
    Preconditioning the Trust-Region Newton-CG Method
    Trust-Region Newton-Lanczos Method
  7.2 Limited-Memory Quasi-Newton Methods
    Limited-Memory BFGS
    Relationship with Conjugate Gradient Methods
    General Limited-Memory Updating
    Compact Representation of BFGS Updating
    Unrolling the Update
  7.3 Sparse Quasi-Newton Updates
  7.4 Algorithms for Partially Separable Functions
  7.5 Perspectives and Software
  Notes and References
  Exercises

8 Calculating Derivatives
  8.1 Finite-Difference Derivative Approximations
    Approximating the Gradient
    Approximating a Sparse Jacobian
    Approximating the Hessian
    Approximating a Sparse Hessian
  8.2 Automatic Differentiation
    An Example
    The Forward Mode
    The Reverse Mode
    Vector Functions and Partial Separability
    Calculating Jacobians of Vector Functions
    Calculating Hessians: Forward Mode
    Calculating Hessians: Reverse Mode
    Current Limitations
  Notes and References
  Exercises

9 Derivative-Free Optimization
  9.1 Finite Differences and Noise
  9.2 Model-Based Methods
    Interpolation and Polynomial Bases
    Updating the Interpolation Set
    A Method Based on Minimum-Change Updating
  9.3 Coordinate and Pattern-Search Methods
    Coordinate Search Method
    Pattern-Search Methods
  9.4 A Conjugate-Direction Method
  9.5 Nelder-Mead Method
  9.6 Implicit Filtering
  Notes and References
  Exercises

10 Least-Squares Problems
  10.1 Background
  10.2 Linear Least-Squares Problems
  10.3 Algorithms for Nonlinear Least-Squares Problems
    The Gauss-Newton Method
    Convergence of the Gauss-Newton Method
    The Levenberg-Marquardt Method
    Implementation of the Levenberg-Marquardt Method
    Convergence of the Levenberg-Marquardt Method
    Methods for Large-Residual Problems
  10.4 Orthogonal Distance Regression
  Notes and References
  Exercises

11 Nonlinear Equations
  11.1 Local Algorithms
    Newton's Method for Nonlinear Equations
    Inexact Newton Methods
    Broyden's Method
    Tensor Methods
  11.2 Practical Methods
    Merit Functions
    Line Search Methods
    Trust-Region Methods
  11.3 Continuation/Homotopy Methods
    Motivation
    Practical Continuation Methods
  Notes and References
  Exercises

12 Theory of Constrained Optimization
  Local and Global Solutions
  Smoothness
  12.1 Examples
    A Single Equality Constraint
    A Single Inequality Constraint
    Two Inequality Constraints
  12.2 Tangent Cone and Constraint Qualifications
  12.3 First-Order Optimality Conditions
  12.4 First-Order Optimality Conditions: Proof
    Relating the Tangent Cone and the First-Order Feasible Direction Set
    A Fundamental Necessary Condition
    Farkas' Lemma
    Proof of Theorem 12.1
  12.5 Second-Order Conditions
    Second-Order Conditions and Projected Hessians
  12.6 Other Constraint Qualifications
  12.7 A Geometric Viewpoint
  12.8 Lagrange Multipliers and Sensitivity
  12.9 Duality
  Notes and References
  Exercises

13 Linear Programming: The Simplex Method
  Linear Programming
  13.1 Optimality and Duality
    Optimality Conditions
    The Dual Problem
  13.2 Geometry of the Feasible Set
    Bases and Basic Feasible Points
    Vertices of the Feasible Polytope
  13.3 The Simplex Method
    Outline
    A Single Step of the Method
  13.4 Linear Algebra in the Simplex Method
  13.5 Other Important Details
    Pricing and Selection of the Entering Index
    Starting the Simplex Method
    Degenerate Steps and Cycling
  13.6 The Dual Simplex Method
  13.7 Presolving
  13.8 Where Does the Simplex Method Fit?
  Notes and References
  Exercises

14 Linear Programming: Interior-Point Methods
  14.1 Primal-Dual Methods
    Outline
    The Central Path
    Central Path Neighborhoods and Path-Following Methods
  14.2 Practical Primal-Dual Algorithms
    Corrector and Centering Steps
    Step Lengths
    Starting Point
    A Practical Algorithm
    Solving the Linear Systems
  14.3 Other Primal-Dual Algorithms and Extensions
    Other Path-Following Methods
    Potential-Reduction Methods
    Extensions
  14.4 Perspectives and Software
  Notes and References
  Exercises

15 Fundamentals of Algorithms for Nonlinear Constrained Optimization
  15.1 Categorizing Optimization Algorithms
  15.2 The Combinatorial Difficulty of Inequality-Constrained Problems
  15.3 Elimination of Variables
    Simple Elimination using Linear Constraints
    General Reduction Strategies for Linear Constraints
    Effect of Inequality Constraints
  15.4 Merit Functions and Filters
    Merit Functions
    Filters
  15.5 The Maratos Effect
  15.6 Second-Order Correction and Nonmonotone Techniques
    Nonmonotone (Watchdog) Strategy
  Notes and References
  Exercises

16 Quadratic Programming
  16.1 Equality-Constrained Quadratic Programs
    Properties of Equality-Constrained QPs
  16.2 Direct Solution of the KKT System
    Factoring the Full KKT System
    Schur-Complement Method
    Null-Space Method
  …
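As a taste of the algorithmic material covered in the contents above, here is a minimal Python sketch of the backtracking line search from Chapter 3 ("Sufficient Decrease and Backtracking"), in the spirit of the book's Algorithm 3.1. The parameter names and the Rosenbrock test problem are illustrative choices by this listing, not reproduced from the book.

    import numpy as np

    def backtracking_line_search(f, grad_f, x, p, alpha_init=1.0, rho=0.5, c=1e-4):
        # Shrink alpha until the Armijo sufficient-decrease condition holds:
        #   f(x + alpha p) <= f(x) + c * alpha * grad_f(x)^T p
        fx = f(x)
        slope = np.dot(grad_f(x), p)   # directional derivative; negative for a descent direction
        alpha = alpha_init
        while f(x + alpha * p) > fx + c * alpha * slope:
            alpha *= rho               # backtrack: contract the trial step
        return alpha

    # Illustrative use: one steepest-descent step on the Rosenbrock function.
    def rosenbrock(x):
        return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

    def rosenbrock_grad(x):
        return np.array([
            -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0] ** 2),
        ])

    x = np.array([-1.2, 1.0])          # the classical starting point
    p = -rosenbrock_grad(x)            # steepest-descent direction
    alpha = backtracking_line_search(rosenbrock, rosenbrock_grad, x, p)
    x_new = x + alpha * p              # satisfies sufficient decrease: f(x_new) < f(x)

The loop always terminates for a descent direction, since by Taylor's theorem the sufficient-decrease condition holds for all sufficiently small steps; typical parameter choices are c = 1e-4 and a contraction factor rho in (0, 1).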
