Algorithms for Optimization

Publisher: The MIT Press
Publication date: 2019-03
ISBN: 9780262039420
Binding: Other
Pages: 520
  • This book offers a comprehensive introduction to optimization with a focus on practical algorithms. The book approaches optimization from an engineering perspective, where the objective is to design a system that optimizes a set of metrics subject to constraints. Readers will learn about computational approaches for a range of challenges, including searching high-dimensional spaces, handling problems where there are multiple competing objectives, and accommodating uncertainty in the metrics. Figures, examples, and exercises convey the intuition behind the mathematical approaches. The text provides concrete implementations in the Julia programming language (an illustrative sketch in a similar style follows the table of contents below).
    Topics covered include derivatives and their generalization to multiple dimensions; local descent and first- and second-order methods that inform local descent; stochastic methods, which introduce randomness into the optimization process; linear constrained optimization, when both the objective function and the constraints are linear; surrogate models, probabilistic surrogate models, and using probabilistic surrogate models to guide optimization; optimization under uncertainty; uncertainty propagation; expression optimization; and multidisciplinary design optimization. Appendixes offer an introduction to the Julia language, test functions for evaluating algorithm performance, and mathematical concepts used in the derivation and analysis of the optimization methods discussed in the text. The book can be used by advanced undergraduates and graduate students in mathematics, statistics, computer science, any engineering field (including electrical engineering and aerospace engineering), and operations research, and as a reference for professionals.
    Contents
    Preface
    Acknowledgments
    1 Introduction
    1.1 A History
    1.2 Optimization Process
    1.3 Basic Optimization Problem
    1.4 Constraints
    1.5 Critical Points
    1.6 Conditions for Local Minima
    1.7 Contour Plots
    1.8 Overview
    1.9 Summary
    1.10 Exercises
    2 Derivatives and Gradients
    2.1 Derivatives
    2.2 Derivatives in Multiple Dimensions
    2.3 Numerical Differentiation
    2.4 Automatic Differentiation
    2.5 Summary
    2.6 Exercises
    3 Bracketing
    3.1 Unimodality
    3.2 Finding an Initial Bracket
    3.3 Fibonacci Search
    3.4 Golden Section Search
    3.5 Quadratic Fit Search
    3.6 Shubert-Piyavskii Method
    3.7 Bisection Method
    3.8 Summary
    3.9 Exercises
    4 Local Descent
    4.1 Descent Direction Iteration
    4.2 Line Search
    4.3 Approximate Line Search
    4.4 Trust Region Methods
    4.5 Termination Conditions
    4.6 Summary
    4.7 Exercises
    5 First-Order Methods
    5.1 Gradient Descent
    5.2 Conjugate Gradient
    5.3 Momentum
    5.4 Nesterov Momentum
    5.5 Adagrad
    5.6 RMSProp
    5.7 Adadelta
    5.8 Adam
    5.9 Hypergradient Descent
    5.10 Summary
    5.11 Exercises
    6 Second-Order Methods
    6.1 Newton’s Method
    6.2 Secant Method
    6.3 Quasi-Newton Methods
    6.4 Summary
    6.5 Exercises
    7 Direct Methods
    7.1 Cyclic Coordinate Search
    7.2 Powell’s Method
    7.3 Hooke-Jeeves
    7.4 Generalized Pattern Search
    7.5 Nelder-Mead Simplex Method
    7.6 Divided Rectangles
    7.7 Summary
    7.8 Exercises
    8 Stochastic Methods
    8.1 Noisy Descent
    8.2 Mesh Adaptive Direct Search
    8.3 Simulated Annealing
    8.4 Cross-Entropy Method
    8.5 Natural Evolution Strategies
    8.6 Covariance Matrix Adaptation
    8.7 Summary
    8.8 Exercises
    9 Population Methods
    9.1 Initialization
    9.2 Genetic Algorithms
    9.3 Differential Evolution
    9.4 Particle Swarm Optimization
    9.5 Firefly Algorithm
    9.6 Cuckoo Search
    9.7 Hybrid Methods
    9.8 Summary
    9.9 Exercises
    10 Constraints
    10.1 Constrained Optimization
    10.2 Constraint Types
    10.3 Transformations to Remove Constraints
    10.4 Lagrange Multipliers
    10.5 Inequality Constraints
    10.6 Duality
    10.7 Penalty Methods
    10.8 Augmented Lagrange Method
    10.9 Interior Point Methods
    10.10 Summary
    10.11 Exercises
    11 Linear Constrained Optimization
    11.1 Problem Formulation
    11.2 Simplex Algorithm
    11.3 Dual Certificates
    11.4 Summary
    11.5 Exercises
    12 Multiobjective Optimization
    12.1 Pareto Optimality
    12.2 Constraint Methods
    12.3 Weight Methods
    12.4 Multiobjective Population Methods
    12.5 Preference Elicitation
    12.6 Summary
    12.7 Exercises
    13 Sampling Plans
    13.1 Full Factorial
    13.2 Random Sampling
    13.3 Uniform Projection Plans
    13.4 Stratified Sampling
    13.5 Space-Filling Metrics
    13.6 Space-Filling Subsets
    13.7 Quasi-Random Sequences
    13.8 Summary
    13.9 Exercises
    14 Surrogate Models
    14.1 Fitting Surrogate Models
    14.2 Linear Models
    14.3 Basis Functions
    14.4 Fitting Noisy Objective Functions
    14.5 Model Selection
    14.6 Summary
    14.7 Exercises
    15 Probabilistic Surrogate Models
    15.1 Gaussian Distribution
    15.2 Gaussian Processes
    15.3 Prediction
    15.4 Gradient Measurements
    15.5 Noisy Measurements
    15.6 Fitting Gaussian Processes
    15.7 Summary
    15.8 Exercises
    16 Surrogate Optimization
    16.1 Prediction-Based Exploration
    16.2 Error-Based Exploration
    16.3 Lower Confidence Bound Exploration
    16.4 Probability of Improvement Exploration
    16.5 Expected Improvement Exploration
    16.6 Safe Optimization
    16.7 Summary
    16.8 Exercises
    17 Optimization under Uncertainty
    17.1 Uncertainty
    17.2 Set-Based Uncertainty
    17.3 Probabilistic Uncertainty
    17.4 Summary
    17.5 Exercises
    18 Uncertainty Propagation
    18.1 Sampling Methods
    18.2 Taylor Approximation
    18.3 Polynomial Chaos
    18.4 Bayesian Monte Carlo
    18.5 Summary
    18.6 Exercises
    19 Discrete Optimization
    19.1 Integer Programs
    19.2 Rounding
    19.3 Cutting Planes
    19.4 Branch and Bound
    19.5 Dynamic Programming
    19.6 Ant Colony Optimization
    19.7 Summary
    19.8 Exercises
    20 Expression Optimization
    20.1 Grammars
    20.2 Genetic Programming
    20.3 Grammatical Evolution
    20.4 Probabilistic Grammars
    20.5 Probabilistic Prototype Trees
    20.6 Summary
    20.7 Exercises
    21 Multidisciplinary Optimization
    21.1 Disciplinary Analyses
    21.2 Interdisciplinary Compatibility
    21.3 Architectures
    21.4 Multidisciplinary Design Feasible
    21.5 Sequential Optimization
    21.6 Individual Discipline Feasible
    21.7 Collaborative Optimization
    21.8 Simultaneous Analysis and Design
    21.9 Summary
    21.10 Exercises
    A Julia
    A.1 Types
    A.2 Functions
    A.3 Control Flow
    A.4 Packages
    B Test Functions
    B.1 Ackley’s Function
    B.2 Booth’s Function
    B.3 Branin Function
    B.4 Flower Function
    B.5 Michalewicz Function
    B.6 Rosenbrock’s Banana Function
    B.7 Wheeler’s Ridge
    B.8 Circle Function
    C Mathematical Concepts
    C.1 Asymptotic Notation
    C.2 Taylor Expansion
    C.3 Convexity
    C.4 Norms
    C.5 Matrix Calculus
    C.6 Positive Definiteness
    C.7 Gaussian Distribution
    C.8 Gaussian Quadrature
    D Solutions
    Bibliography
    Index
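
As noted in the description above, the text pairs its mathematical development with concrete Julia implementations. As a rough illustration of that style, here is a minimal sketch written for this listing (not code taken from the book): fixed-step gradient descent in the spirit of Section 5.1, applied to Booth's function from Appendix B.2. The function name `gradient_descent`, the argument names, and the step size are assumptions chosen for the example.

```julia
# Minimal illustrative sketch, not from the book: fixed-step gradient descent.
# ∇f is the gradient of the objective, x the starting point, α the step size.
function gradient_descent(∇f, x, α; iterations=100)
    for _ in 1:iterations
        x = x .- α .* ∇f(x)   # step opposite the gradient
    end
    return x
end

# Booth's function (Appendix B.2): a convex quadratic with its minimum at (1, 3).
booth(x)  = (x[1] + 2*x[2] - 7)^2 + (2*x[1] + x[2] - 5)^2
∇booth(x) = [2*(x[1] + 2*x[2] - 7) + 4*(2*x[1] + x[2] - 5),
             4*(x[1] + 2*x[2] - 7) + 2*(2*x[1] + x[2] - 5)]

x_min = gradient_descent(∇booth, [0.0, 0.0], 0.05, iterations=1_000)
# x_min ≈ [1.0, 3.0]
```

Because Booth's function is a convex quadratic, a small fixed step size suffices here; the book's chapters on line search (Chapter 4) and adaptive first-order methods (Chapter 5) address objectives where a fixed step size is not enough.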
Related books
Algorithms in C++, Parts 1-4: Fundamentals, Data Structure, Sorting, Searching (3rd Edition), by Robert Sedgewick
Algorithms in C++ Part 5: Graph Algorithms, by Robert Sedgewick
Algorithms, by Jeff Erickson
Algorithms in C, Parts 1-4: Fundamentals, Data Structures, Sorting, Searching, by Robert Sedgewick
Algorithms, by Robert Sedgewick and Kevin Wayne
Algorithms for Minimization Without Derivatives, by Richard P. Brent
Algorithms and Data Structures: The Basic Toolbox, by Kurt Mehlhorn and Peter Sanders
Algorithms for Reinforcement Learning, by Csaba Szepesvari
Algorithms and Programming: Problems and Solutions, by Alexander Shen
Algorithms and Parallel Computing, by Fayez Gebali
Algorithms Unlocked, by Thomas H. Cormen
Algorithms and Data Structures in Action, by Marcello La Rocca