Monday, August 12, 2024

Well-known optimization methods

Exact Optimization Methods:

  1. Linear Programming (LP): Optimizes a linear objective function subject to linear equality and inequality constraints.
  2. Integer Programming (IP): A type of linear programming where some or all of the decision variables are required to be integers.
  3. Quadratic Programming (QP): Optimizes a quadratic objective function subject to linear constraints.
  4. Dynamic Programming (DP): Solves problems by breaking them down into simpler subproblems and solving each subproblem just once, storing the results.
  5. Branch and Bound: An algorithm for solving integer programming problems by systematically exploring and pruning the solution space.
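As one concrete illustration of the exact methods above, dynamic programming can be sketched on the classic 0/1 knapsack problem. The item values, weights, and capacity below are illustrative, not taken from the text.

```python
# Minimal dynamic-programming sketch: 0/1 knapsack.
# Each subproblem (best value at a given capacity) is solved once and stored.

def knapsack(values, weights, capacity):
    """Return the maximum total value achievable within the weight capacity."""
    # dp[w] = best value achievable with capacity w using the items seen so far.
    dp = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Iterate capacities downward so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            dp[w] = max(dp[w], dp[w - weight] + value)
    return dp[capacity]

# Example: items (value, weight) = (60, 1), (100, 2), (120, 3), capacity 5.
print(knapsack([60, 100, 120], [1, 2, 3], 5))  # -> 220
```

Storing intermediate results in `dp` is what distinguishes this from naive recursion: each capacity/item combination is evaluated exactly once.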

Gradient-Based Methods:

  1. Gradient Descent: Iteratively moves towards the minimum of a function by following the negative gradient.
  2. Newton’s Method: Uses second-order derivative information (the Hessian matrix) to find roots or minima of a function, typically converging faster than first-order methods near a solution.
  3. Conjugate Gradient Method: An iterative method for solving large systems of linear equations and optimization problems, especially useful for quadratic functions.
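To make the first of these concrete, here is a minimal gradient descent sketch on f(x) = (x - 3)^2, whose gradient is 2(x - 3). The step size and iteration count are illustrative choices, not prescriptions.

```python
# Minimal gradient descent: repeatedly step against the gradient.

def gradient_descent(grad, x0, step=0.1, iterations=100):
    """Follow the negative gradient from x0 for a fixed number of steps."""
    x = x0
    for _ in range(iterations):
        x -= step * grad(x)
    return x

# Minimize f(x) = (x - 3)^2 starting from x = 0.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # converges to approximately 3.0
```

With this objective the distance to the minimum shrinks by a constant factor (here 0.8) on every iteration, so convergence is geometric.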

Heuristic and Metaheuristic Methods:

  1. Genetic Algorithms (GA): Uses natural selection principles to evolve solutions over generations.
  2. Simulated Annealing (SA): Uses a probabilistic approach to avoid local optima by mimicking the annealing process in metallurgy.
  3. Particle Swarm Optimization (PSO): Models the social behavior of swarms to find optimal solutions.
  4. Ant Colony Optimization (ACO): Simulates the foraging behavior of ants to find optimal paths or solutions.
  5. Tabu Search: Uses memory structures to guide the search and avoid local optima.
  6. Differential Evolution (DE): Uses mutation and recombination to evolve solutions toward the optimum.
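As a sketch of the metaheuristic idea, here is simulated annealing minimizing f(x) = x^2 in one dimension. The cooling schedule, proposal width, and iteration count are illustrative assumptions.

```python
import math
import random

# Minimal simulated annealing: occasionally accept worse moves to escape
# local optima, with the acceptance probability shrinking as the
# "temperature" cools.

def simulated_annealing(f, x0, t0=10.0, cooling=0.995, iterations=2000, seed=0):
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    x, temp = x0, t0
    best = x
    for _ in range(iterations):
        candidate = x + rng.uniform(-1.0, 1.0)  # random neighbor
        delta = f(candidate) - f(x)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / temp).
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
        if f(x) < f(best):
            best = x
        temp *= cooling  # gradually lower the temperature
    return best

best = simulated_annealing(lambda x: x * x, x0=10.0)
```

Early on, the high temperature lets the search wander uphill; as the temperature decays the process behaves more and more like pure local descent.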

Approximation Algorithms:

  1. Greedy Algorithms: Make a series of locally optimal choices in the hope that they add up to a globally optimal solution.
  2. Local Search: Iteratively explores the neighborhood of a solution to find better solutions, as in Hill Climbing.
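The greedy idea can be sketched with coin change over a canonical coin system (US-style denominations), where repeatedly taking the largest coin that fits happens to be globally optimal; for arbitrary coin systems the greedy choice can fail.

```python
# Minimal greedy sketch: always take the largest coin that still fits.

def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Return the coins chosen greedily; coins must be sorted descending."""
    result = []
    for coin in coins:
        while amount >= coin:
            result.append(coin)
            amount -= coin
    return result

print(greedy_change(63))  # -> [25, 25, 10, 1, 1, 1]
```

Each choice is locally optimal (the biggest single coin) and, for this particular denomination set, the sequence of local choices is also globally optimal.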

Stochastic and Hybrid Methods:

  1. Bayesian Optimization: Uses probabilistic models to guide the search for optimal solutions, often used for expensive-to-evaluate functions.
  2. Memetic Algorithms: Combines genetic algorithms with local search methods to refine solutions.
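The memetic idea can be sketched as a tiny genetic algorithm whose offspring are each refined by a short hill-climbing local search. The objective, population size, and operator settings below are illustrative assumptions.

```python
import random

# Minimal memetic-algorithm sketch: genetic search (selection + mutation)
# combined with a local refinement step on each offspring.

def local_search(f, x, step=0.1, iterations=20):
    """Greedy refinement: move to a neighboring point whenever it improves f."""
    for _ in range(iterations):
        for candidate in (x - step, x + step):
            if f(candidate) < f(x):
                x = candidate
    return x

def memetic_minimize(f, generations=30, pop_size=10, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    population = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=f)                 # rank by fitness (lower is better)
        parents = population[: pop_size // 2]  # selection: keep the best half
        offspring = []
        for parent in parents:
            child = parent + rng.gauss(0.0, 0.5)      # mutation
            offspring.append(local_search(f, child))  # memetic refinement
        population = parents + offspring
    return min(population, key=f)

best = memetic_minimize(lambda x: (x - 2.0) ** 2)
```

The genetic layer explores the space broadly while the embedded local search exploits each candidate's neighborhood, which is exactly the combination the definition above describes.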