Harmony Search #

Name #

Harmony Search (HS)

Taxonomy #

Harmony Search is a metaheuristic optimization algorithm inspired by the improvisation process of musicians. It belongs to the family of evolutionary algorithms and is closely related to other population-based metaheuristics such as Genetic Algorithms and Particle Swarm Optimization.

  • Computational Intelligence
    • Biologically Inspired Computation
      • Evolutionary Algorithms
        • Genetic Algorithms
        • Differential Evolution
        • Harmony Search
    • Swarm Intelligence
      • Particle Swarm Optimization
      • Ant Colony Optimization
  • Stochastic Optimization
    • Metaheuristics
      • Single-Solution Based Metaheuristics
      • Population-Based Metaheuristics
        • Evolutionary Algorithms
          • Harmony Search

Strategy #

Harmony Search mimics the improvisation process of musicians in a jazz band. Each musician corresponds to a decision variable, and their improvisation represents the search for better solutions in the problem space. The algorithm maintains a memory called the Harmony Memory (HM), which stores a set of good solutions. New solutions are generated by applying three rules: memory consideration, pitch adjustment, and random selection.

In the memory consideration step, the algorithm selects a value for each decision variable from the Harmony Memory with a probability called the Harmony Memory Considering Rate (HMCR). If a value is not selected from the memory, it is randomly chosen from the feasible range of the variable.

The pitch adjustment step is applied to the values selected from the memory. With a probability called the Pitch Adjusting Rate (PAR), the selected value is adjusted by a small amount, allowing the algorithm to fine-tune the solutions.

The random selection step introduces diversity by assigning a random value to a decision variable with a probability of (1 - HMCR).

The newly generated solution is evaluated using the objective function. If it is better than the worst solution in the Harmony Memory, it replaces the worst solution. This process is repeated until a termination criterion, such as a maximum number of iterations or a satisfactory solution, is met.
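The three improvisation rules can be sketched for a single decision variable as follows (a minimal illustration; the function name and default parameter values are assumptions, not taken from the text above):

```python
import random

def improvise_value(memory_column, lower, upper, hmcr=0.9, par=0.3, bw=0.05):
    """Generate one decision-variable value using the three HS rules.
    memory_column: the values this variable takes across the Harmony Memory."""
    if random.random() < hmcr:
        # Memory consideration: reuse a value stored in the Harmony Memory.
        value = random.choice(memory_column)
        if random.random() < par:
            # Pitch adjustment: perturb the chosen value by at most +/- bw.
            value += random.uniform(-bw, bw)
            value = min(max(value, lower), upper)  # keep within bounds
    else:
        # Random selection: draw uniformly from the feasible range.
        value = random.uniform(lower, upper)
    return value
```

Repeating this for every decision variable yields one new candidate harmony.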

Procedure #

Data Structures #

  • Harmony Memory (HM): A matrix storing a set of good solutions, where each row represents a solution and each column corresponds to a decision variable.
  • Harmony Memory Considering Rate (HMCR): The probability of selecting a value from the Harmony Memory.
  • Pitch Adjusting Rate (PAR): The probability of adjusting the value of a decision variable selected from the Harmony Memory.
  • Bandwidth (bw): The maximum amount by which a decision variable can be adjusted during the pitch adjustment step.

Parameters #

  • Harmony Memory Size (HMS): The number of solutions stored in the Harmony Memory.
  • Maximum Iterations (MaxIter): The maximum number of iterations or improvisations.
  • Harmony Memory Considering Rate (HMCR)
  • Pitch Adjusting Rate (PAR)
  • Bandwidth (bw)

Steps #

  1. Initialize the Harmony Memory (HM) with random solutions.
  2. Evaluate the fitness of each solution in the HM using the objective function.
  3. Improvise a new solution:
    1. For each decision variable:
      1. With probability HMCR, select a value from the corresponding column of the HM.
      2. If a value is not selected from the HM, generate a random value within the feasible range.
      3. If a value is selected from the HM, with probability PAR, adjust the value by adding or subtracting a small random amount (bounded by bw).
    2. Assemble the new solution from the values generated for each decision variable.
  4. Evaluate the fitness of the new solution using the objective function.
  5. If the new solution is better than the worst solution in the HM, replace the worst solution with the new solution.
  6. Repeat steps 3-5 until the maximum number of iterations (MaxIter) is reached or a satisfactory solution is found.
  7. Return the best solution found in the Harmony Memory.
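The steps above can be condensed into a short Python sketch (illustrative only; the function signature and parameter names are assumptions, matching the parameters listed earlier):

```python
import random

def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3,
                   bw=0.05, max_iter=10000):
    """Minimize `objective` over the box `bounds` with basic Harmony Search.
    bounds: list of (lower, upper) pairs, one per decision variable."""
    n = len(bounds)
    # Steps 1-2: initialize and evaluate the Harmony Memory.
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    fitness = [objective(h) for h in memory]
    for _ in range(max_iter):
        # Step 3: improvise a new harmony, variable by variable.
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:
                value = memory[random.randrange(hms)][j]   # memory consideration
                if random.random() < par:                  # pitch adjustment
                    value = min(max(value + random.uniform(-bw, bw), lo), hi)
            else:
                value = random.uniform(lo, hi)             # random selection
            new.append(value)
        # Steps 4-5: replace the worst harmony if the new one is better.
        new_fit = objective(new)
        worst = max(range(hms), key=fitness.__getitem__)
        if new_fit < fitness[worst]:
            memory[worst], fitness[worst] = new, new_fit
    # Step 7: return the best harmony found.
    best = min(range(hms), key=fitness.__getitem__)
    return memory[best], fitness[best]
```

For example, `harmony_search(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)` minimizes a simple 3-dimensional sphere function.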

Considerations #

Advantages #

  • Simple and easy to implement
  • Uses relatively few parameters compared to many other evolutionary algorithms
  • Able to handle both discrete and continuous optimization problems
  • Provides a good balance between exploration and exploitation of the search space

Disadvantages #

  • May converge prematurely or get stuck in local optima if not properly tuned
  • Performance depends on the appropriate selection of parameters (HMCR, PAR, bw)
  • May require a large number of function evaluations to find high-quality solutions

Heuristics #

Parameter Selection #

  • Harmony Memory Considering Rate (HMCR):
    • Typically set between 0.7 and 0.95
    • Higher values prioritize exploitation of the Harmony Memory
    • Lower values encourage exploration of the search space
  • Pitch Adjusting Rate (PAR):
    • Usually set between 0.1 and 0.5
    • Higher values promote fine-tuning of solutions
    • Lower values maintain the diversity of solutions
  • Bandwidth (bw):
    • Depends on the problem and the range of decision variables
    • Smaller values are suitable for fine-tuning solutions
    • Larger values enable exploration of distant regions in the search space
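Since a suitable bandwidth depends on the scale of each variable, one common rule of thumb (an assumption here, not prescribed by the guidance above) is to set bw to a small fraction of the variable's feasible range:

```python
def bandwidth(lower, upper, fraction=0.01):
    # Heuristic (assumption): set bw to a small fraction of the feasible
    # range, so pitch adjustments stay proportionate to the variable's scale.
    return fraction * (upper - lower)
```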

Harmony Memory Size #

  • A larger Harmony Memory Size (HMS) can store more diverse solutions but may slow down convergence
  • A smaller HMS may lead to premature convergence but can be computationally efficient
  • Typical values range from 10 to 50, depending on the problem complexity

Termination Criteria #

  • Set a maximum number of iterations (MaxIter) based on the problem complexity and available computational resources
  • Monitor the improvement of the best solution found over a certain number of iterations and terminate if no significant improvement is observed
  • Use a predefined threshold for the objective function value as a termination criterion, if known
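The improvement-monitoring criterion above can be implemented as a small helper (a sketch; the function name, window size, and tolerance are assumptions), given a history of best-so-far objective values for a minimization problem:

```python
def stagnated(history, window=200, tol=1e-8):
    """Return True when the best objective value (minimization) has not
    improved by more than `tol` over the last `window` recorded iterations."""
    if len(history) <= window:
        return False  # not enough history to judge stagnation yet
    return history[-window - 1] - history[-1] <= tol
```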

Hybridization #

  • Harmony Search can be hybridized with other optimization techniques, such as local search methods, to improve its performance
  • Incorporating problem-specific knowledge or heuristics can enhance the search process and solution quality