Simulated Annealing #

Name #

Simulated Annealing, SA

Taxonomy #

Simulated Annealing is a stochastic optimization algorithm inspired by the physical process of annealing in metallurgy. It is closely related to the Metropolis-Hastings algorithm and can be considered a metaheuristic or a probabilistic local search algorithm.

  • Optimization
    • Stochastic Optimization
      • Metaheuristics
        • Simulated Annealing

Strategy #

Simulated Annealing is modeled after the physical process of annealing, where a material is heated to a high temperature and then slowly cooled to remove defects and reach a low-energy crystalline state. In the context of optimization, the objective function to be minimized is analogous to the energy of the physical system, and the optimal solution corresponds to the lowest energy state.

The algorithm starts with an initial solution and iteratively explores the solution space by generating neighboring solutions. At each iteration, a new solution is generated by applying a small perturbation to the current solution. If the new solution improves the objective function, it is accepted as the current solution. However, even if the new solution is worse, it may still be accepted with a certain probability that depends on the current “temperature” and the magnitude of the objective function difference between the current and new solutions.

The acceptance probability is higher at the beginning of the process when the temperature is high, allowing the algorithm to escape local optima. As the temperature decreases according to a predefined cooling schedule, the acceptance probability for worse solutions decreases, and the algorithm gradually converges to a near-optimal solution.
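
As a concrete illustration of this rule, below is a minimal sketch in Python of the Metropolis-style acceptance test for a minimization problem; the function name and signature are illustrative, not part of any library.

import math
import random

def metropolis_accept(current_cost, new_cost, temperature):
    """Accept improvements always; accept worse moves with probability exp(-delta / T)."""
    delta = new_cost - current_cost
    if delta <= 0:
        # The candidate is at least as good: always accept (minimization)
        return True
    # The candidate is worse: accept with a probability that falls as T falls
    return random.random() < math.exp(-delta / temperature)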

Exploration and Exploitation #

Simulated Annealing balances exploration and exploitation through the temperature parameter. At high temperatures, the algorithm is more likely to accept worse solutions, enabling exploration of the solution space. As the temperature decreases, the algorithm becomes more selective, favoring better solutions and exploiting the promising regions of the solution space.
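
To see this balance numerically, consider a candidate that worsens the objective by 1.0 under the exp(-delta / T) rule sketched above: at a high temperature it is almost always accepted, at a low temperature almost never.

import math

delta = 1.0  # how much worse the candidate solution is
print(math.exp(-delta / 10.0))  # T = 10.0 -> ~0.905: usually accepted (exploration)
print(math.exp(-delta / 0.1))   # T = 0.1  -> ~4.5e-5: almost never accepted (exploitation)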

Convergence #

The convergence of Simulated Annealing is controlled by the cooling schedule, which determines how the temperature decreases over time. Common cooling schedules include exponential, logarithmic, and linear functions. The choice of the cooling schedule affects the trade-off between the quality of the final solution and the computational time required to reach it.

Procedure #

  1. Initialize:
    1. Define the objective function to be minimized
    2. Generate an initial solution
    3. Set the initial temperature and cooling schedule parameters
  2. While stopping criteria are not met:
    1. Generate a new solution by perturbing the current solution
    2. Calculate the objective function difference between the current and new solutions
    3. If the new solution is better (lower objective function value):
      1. Accept the new solution as the current solution
    4. Else:
      1. Calculate the acceptance probability based on the current temperature and objective function difference
      2. Generate a random number between 0 and 1
      3. If the random number is less than the acceptance probability:
        1. Accept the new solution as the current solution
    5. Update the temperature according to the cooling schedule
  3. Return the best solution found
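
The steps above map directly onto a short, generic Python skeleton; the perturb and schedule callables here are placeholders for problem-specific choices, and a full worked example follows in the Code section below.

import math
import random

def anneal(objective, initial_solution, perturb, schedule, t0, max_iters):
    """Generic simulated annealing loop following the steps above (minimization)."""
    current, temperature = initial_solution, t0
    current_cost = objective(current)
    best, best_cost = current, current_cost
    for k in range(max_iters):
        candidate = perturb(current)
        candidate_cost = objective(candidate)
        delta = candidate_cost - current_cost
        # Accept improvements, or worse moves with Boltzmann probability
        # (the temperature > 0 guard avoids division by zero at a cold schedule)
        if delta <= 0 or (temperature > 0 and random.random() < math.exp(-delta / temperature)):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        temperature = schedule(temperature, k)  # cool according to the schedule
    return best, best_cost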

Data Structures #

  • Current Solution: Represents the current state of the optimization problem
  • New Solution: Generated by perturbing the current solution
  • Best Solution: Stores the best solution found during the optimization process
  • Temperature: Controls the acceptance probability of worse solutions
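
Where it helps readability, these pieces of state can be grouped in a small container; the dataclass below is purely illustrative (the algorithm itself imposes no particular structure).

from dataclasses import dataclass

@dataclass
class AnnealingState:
    current_solution: float  # current point in the search space
    current_cost: float      # objective value of the current solution
    best_solution: float     # best point found so far
    best_cost: float         # objective value of the best solution
    temperature: float       # controls the acceptance probability of worse moves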

Parameters #

  • Initial Temperature: The starting temperature of the annealing process
  • Cooling Schedule: Determines how the temperature decreases over time (e.g., exponential, logarithmic, linear)
  • Perturbation Function: Defines how new solutions are generated from the current solution
  • Stopping Criteria: Conditions for terminating the algorithm (e.g., maximum number of iterations, minimum temperature, no improvement for a certain number of iterations)

Considerations #

Advantages #

  • Ability to escape local optima and explore the solution space effectively
  • Applicable to a wide range of optimization problems, including combinatorial and continuous optimization
  • Relatively simple to implement and adapt to different problem domains
  • Can handle noisy or stochastic objective functions

Disadvantages #

  • Performance depends heavily on the choice of cooling schedule and perturbation function
  • May require careful tuning of parameters to achieve good results
  • Can be computationally expensive, especially for large-scale problems
  • No guaranteed optimality of the final solution

Heuristics #

Cooling Schedule #

  • Start with a high initial temperature to allow sufficient exploration of the solution space
  • Decrease the temperature gradually to balance exploration and exploitation
  • Exponential cooling schedules (e.g., T = T0 * α^k, where α is the cooling rate and k is the iteration number) are commonly used
  • Logarithmic cooling schedules (e.g., T = T0 / log(1 + k)) can provide theoretical guarantees of convergence but may be slower in practice
  • Linear cooling schedules (e.g., T = T0 - β * k, where β is the cooling step) are simple but may not explore the solution space effectively
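
Each of these schedules is a one-liner in Python; T0, α (alpha), and β (beta) follow the formulas in the list above, and any concrete values are illustrative.

import math

def exponential_schedule(t0, alpha, k):
    """T = T0 * alpha^k (e.g. alpha = 0.95)."""
    return t0 * alpha ** k

def logarithmic_schedule(t0, k):
    """T = T0 / log(1 + k); guarded at k = 0 where log(1) = 0."""
    return t0 if k == 0 else t0 / math.log(1 + k)

def linear_schedule(t0, beta, k):
    """T = T0 - beta * k, floored at zero so the temperature never goes negative."""
    return max(t0 - beta * k, 0.0)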

Perturbation Function #

  • The perturbation function should be tailored to the specific problem domain
  • For continuous optimization problems, small random perturbations (e.g., adding Gaussian noise) can be effective
  • For combinatorial optimization problems, problem-specific operators (e.g., swap, insert, remove) can be used to generate neighboring solutions
  • The magnitude of the perturbation should be balanced to allow both local and global exploration
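
Both styles of perturbation are short in Python; the Gaussian step size and the list-based tour representation below are illustrative assumptions.

import random

def gaussian_perturbation(x, sigma=0.1):
    """Continuous case: add zero-mean Gaussian noise to each coordinate."""
    return [xi + random.gauss(0.0, sigma) for xi in x]

def swap_perturbation(tour):
    """Combinatorial case (e.g. a TSP tour): swap two randomly chosen positions."""
    neighbor = tour[:]
    i, j = random.sample(range(len(neighbor)), 2)
    neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
    return neighbor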

Stopping Criteria #

  • Set a maximum number of iterations or a computational time limit to prevent excessive runtime
  • Monitor the improvement in the best solution found and stop if no significant improvement is observed for a certain number of iterations
  • Define a minimum temperature at which the algorithm terminates, as the acceptance probability of worse solutions becomes negligible at low temperatures
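
These three conditions combine naturally into a single predicate; the default thresholds below are illustrative and should be tuned per problem.

def should_stop(iteration, temperature, iterations_without_improvement,
                max_iterations=10000, min_temperature=1e-6, patience=500):
    """Stop on any of: iteration budget, temperature floor, or stagnation."""
    return (iteration >= max_iterations
            or temperature <= min_temperature
            or iterations_without_improvement >= patience)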

Objective Function #

  • Ensure that the objective function accurately captures the goals of the optimization problem
  • Consider normalizing or scaling the objective function to avoid numerical issues and improve the performance of the algorithm
  • If the problem has multiple objectives, consider using a weighted sum or a Pareto-based approach to handle multi-objective optimization
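
For the multi-objective case, the weighted-sum approach mentioned above is the simplest scalarization; the weights are illustrative and encode the relative importance of each objective.

def weighted_sum_objective(x, objectives, weights):
    """Scalarize several objectives into a single value to minimize."""
    return sum(w * f(x) for f, w in zip(objectives, weights))

# Illustrative usage: combine two objective callables with a 70/30 weighting
# combined_cost = weighted_sum_objective(x, [cost, delay], [0.7, 0.3])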

Initial Solution #

  • The choice of the initial solution can impact the performance of Simulated Annealing
  • If problem-specific knowledge is available, use heuristics or greedy algorithms to generate a good initial solution
  • Otherwise, generate a random initial solution or use multiple random restarts to explore different regions of the solution space
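
Random restarts are a thin wrapper around the main loop; the sketch below assumes the simulated_annealing function defined in the Code section and simply keeps the best result across runs.

def annealing_with_restarts(n_restarts, *args, **kwargs):
    """Run simulated annealing several times and keep the lowest-cost result."""
    best_x, best_cost = None, float("inf")
    for _ in range(n_restarts):
        x, cost = simulated_annealing(*args, **kwargs)
        if cost < best_cost:
            best_x, best_cost = x, cost
    return best_x, best_cost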

Parameter Tuning #

  • Experiment with different values for the initial temperature, cooling schedule, and perturbation function to find the best configuration for the problem at hand
  • Use a subset of the problem instances or a smaller problem size for parameter tuning to reduce computational overhead
  • Consider using automated parameter tuning techniques, such as meta-optimization or adaptive cooling schedules, to improve the performance of the algorithm
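
As a minimal example of such an experiment, the sweep below reuses the simulated_annealing function and parameters from the Code section on a reduced iteration budget; the candidate cooling rates are illustrative.

for rate in (0.90, 0.95, 0.97, 0.99):
    x, cost = simulated_annealing(objective_function, bounds, 1000, 1e-3, rate, 200, 1.0)
    print(f"cooling_rate={rate}: best cost = {cost:.4f}")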

Code #

Python Simulated Annealing #

If you are stuck, below is a candidate implementation of simulated annealing in pure Python.

# AlgorithmAfternoon.com
import math
import random

def objective_function(x):
    """Objective function f(x) = sin(x) + sin(10 * x / 3)."""
    return math.sin(x) + math.sin(10 * x / 3)

def acceptance_criteria(cost, new_cost, temperature):
    """Determine if the new solution should be accepted based on the current temperature."""
    if new_cost < cost:
        # Always accept an improvement (lower cost, since we are minimizing)
        return True
    # Otherwise accept with Metropolis probability exp(-(new_cost - cost) / T)
    return math.exp((cost - new_cost) / temperature) > random.random()

def exponential_cooling(current_temperature, cooling_rate):
    """Cool down the temperature exponentially."""
    return current_temperature * cooling_rate

def uniform_neighbor_generation(current_x, neighborhood_size):
    """Generate a new solution within the neighborhood of the current solution."""
    return current_x + random.uniform(-neighborhood_size, neighborhood_size)

def simulated_annealing(objective_function, bounds, initial_temperature, final_temperature, cooling_rate, max_iterations, neighborhood_size):
    """Run the simulated annealing algorithm."""
    # Start from a random position within bounds
    current_x = random.uniform(bounds[0], bounds[1])
    current_cost = objective_function(current_x)
    best_x = current_x
    best_cost = current_cost

    temperature = initial_temperature

    iteration = 0
    while temperature > final_temperature and iteration < max_iterations:
        # Generate a neighbor solution
        candidate_x = uniform_neighbor_generation(current_x, neighborhood_size)
        # Clamp the candidate to the search bounds (a bare `continue` here would
        # skip the cooling step and the iteration counter, risking a stall)
        candidate_x = max(bounds[0], min(bounds[1], candidate_x))
        candidate_cost = objective_function(candidate_x)

        # Acceptance check
        if acceptance_criteria(current_cost, candidate_cost, temperature):
            current_x, current_cost = candidate_x, candidate_cost
            # Track the best (lowest-cost) solution found so far
            if candidate_cost < best_cost:
                best_x, best_cost = candidate_x, candidate_cost

        # Cooling step
        temperature = exponential_cooling(temperature, cooling_rate)

        # Report progress
        print(f"Iteration {iteration+1}: Best Cost = {best_cost}")

        iteration += 1

    return best_x, best_cost

# Algorithm parameters
initial_temperature = 1000
final_temperature = 1e-3
cooling_rate = 0.97
max_iterations = 500
neighborhood_size = 1.0
bounds = (-5, 5)

# Run the simulated annealing algorithm
best_x, best_cost = simulated_annealing(objective_function, bounds, initial_temperature, final_temperature, cooling_rate, max_iterations, neighborhood_size)
print(f"Best solution: x = {best_x}, Cost = {best_cost}")

Mini-Book #

Update: If you need more help, you might be interested in the new Simulated Annealing Mini-Book