
An Improved Population Migration Algorithm Introducing the Local Search Mechanism of the Leap-Frog Algorithm and Crossover Operator

Abstract

The population migration algorithm (PMA) is an intelligent algorithm that simulates the migration of a population. Given the premature convergence and low precision of the PMA, this paper introduces the local search mechanism of the frog-leaping algorithm and a crossover operator to improve the PMA's search speed and global convergence properties. The improved algorithm is verified on a set of typical test functions. Compared with other improved population migration algorithms and other intelligent algorithms, the results show that the convergence rate of the improved PMA is very high, and its convergence is proved.

Introduction

The population migration algorithm (PMA) was proposed by Zhou et al in 2003 [1][2]. The PMA is a global optimization algorithm that simulates population migration theory. It models the mechanisms by which population moves with the transfer of economic centers and diffuses under population pressure. In other words, people relocate to a preferential region that has a high level of economic development and many employment opportunities. When the preferential region becomes relatively overpopulated, the population pressure constantly increases; when the pressure exceeds a certain limit, people move to a more suitable preferential region. Population pressure drives the PMA to search better regions, and, to a certain extent, population proliferation prevents the PMA from falling into a local optimal solution. The whole algorithm alternates between localized search and scatter search during the search process.

In 2004, Xu gave an improved PMA (IPMA), which can be summarized into three mechanisms: population flow, population migration, and population proliferation [3]. Population flow means that the movement of the population within a residential region is spontaneous and the overall direction of flow is uncertain. Population migration is a selective movement over a wide range, in which people flow to richer places. Population proliferation, which occurs when the population pressure is very high, is a selective movement from a preferential area to a non-preferential area; it reflects the people's pioneering spirit.

Recently, various kinds of intelligent algorithms have been improved by many researchers. Wang et al improved the PMA with a steepest descent operator and Gauss mutation [4]. Ouyang et al introduced the simplex method into the PMA to improve its search performance [5]. Guo et al increased the performance of population migration by improving the efficiency of the migrations [6]. Karaboga improved the bee swarm algorithm [7]. Li et al introduced extremal optimization to improve the performance of the shuffled frog leaping algorithm (SFLA) [8]. However, the results of all these studies show low accuracy or a tendency to fall into local optimal solutions. To further improve the local search ability of the PMA and speed up its convergence, we introduce the local search mechanism of frog leaping and a crossover operator. Simulations on a number of classic test functions show that the improved algorithm is effective and feasible.

Population Migration Algorithm

Consider the following unconstrained single-objective optimization problem:

min f(X), X ∈ S (1)

In problem (1), f is a real-valued mapping defined on the search space S. We assume that problem (1) has a solution; namely, a global optimal solution exists and the set H of global optimal solutions is not empty. Any extremum-seeking problem can be transformed into the form of problem (1). In the algorithm, each individual stands for a region of the population and carries various kinds of information: Xi is the ith individual in the solution space, its components together with the components of the ith region's radius determine the local search region, and N stands for the population size. The target function f(X) corresponds to the attraction of a region, and the optimal solution (a local optimal solution) of problem (1) corresponds to the most attractive (preferential) region. The ability of the PMA to escape from a local optimal solution corresponds to the population moving out of the preferential area as population pressure increases, while the ascent (hill-climbing) of the algorithm corresponds to the population moving into the preferential area. The local random search of the algorithm corresponds to population flow; the way the algorithm selects approximate solutions corresponds to population migration; and the strategy for escaping from local optima corresponds to population proliferation.

Frog Leaping Algorithm of Local Search

The SFLA was proposed by Eusuff and Lansey in 2003 [9]. The SFLA is based on the heuristic, cooperative search of a population. The implementation of the algorithm simulates the memetic evolutionary behavior found in nature.

The implementation process of the SFLA simulates the feeding behavior of a group of frogs in a wetland [10]. In the feeding process, every frog can be seen as a carrier of ideas and information. Frogs communicate through the exchange of information, which improves the knowledge of the others. Every frog represents a solution to the optimization problem (1), and the wetland represents the solution space. In the implementation stage of the algorithm, F frogs are divided into N groups, each with l frogs. Different groups carry different ideas and information. Within a group, frogs follow the memetic evolution strategy for local depth search and internal communication in the solution space:

S = rand() × (Pbi − Pwi) (2)

P1wi = Pwi + S (3)

In the above formulas, rand() is a random function that generates a value in the range [0,1], S is the step of the frog leap, Pwi stands for the worst frog of the ith group, Pbi stands for the best frog of the ith group, and P1wi stands for the updated position of the worst frog of the ith group. If the adaptive value of the updated worst frog is better than that of the original worst frog in the same group, the updated worst frog replaces the original one. Otherwise, a frog is randomly generated in the group to replace the original worst frog. The update process is repeated until the predetermined number of local searches LS is reached. When all groups have completed the local depth search, all the frogs are again mixed and re-divided into N groups. We continue the local search within the groups until the termination rule holds, and then we stop.
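The worst-frog update described above (Eqs. (2) and (3), with the replacement rule) can be sketched as follows. This is a minimal illustration assuming minimization; the function name and the `bounds` parameter used for the random-replacement case are assumptions, not the paper's notation.

```python
import random

def sfla_local_step(group, fitness, bounds):
    """One worst-frog update per Eqs. (2)-(3):
    S = rand() * (Pbi - Pwi);  P1wi = Pwi + S  (minimization assumed)."""
    group = sorted(group, key=fitness)                 # best frog first
    pb, pw = group[0], group[-1]                       # Pbi and Pwi
    r = random.random()                                # rand() in [0, 1]
    p1w = [w + r * (b - w) for b, w in zip(pb, pw)]    # Eqs. (2) and (3)
    if fitness(p1w) < fitness(pw):
        group[-1] = p1w                                # improved: accept P1wi
    else:
        # No improvement: replace the worst frog with a randomly generated frog.
        group[-1] = [random.uniform(lo, hi) for lo, hi in bounds]
    return group
```

In an SFLA run this step would be repeated LS times per group before the groups are re-mixed.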

Crossover Operators

By taking two linear combinations of two individuals, we can obtain two new individuals. The arithmetic crossover operation is often used for individuals expressed with floating-point coding. Suppose the two parents are X1 = (x11, x12, …, x1n) and X2 = (x21, x22, …, x2n).

Then, n random numbers α1, α2, …, αn are generated, in which αj ∈ [0,1], j = 1, 2, …, n. After the crossover is applied, two new progenies are generated:

x′1j = αj x1j + (1 − αj) x2j, x′2j = (1 − αj) x1j + αj x2j, j = 1, 2, …, n.
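The arithmetic crossover described above can be sketched in a few lines. This is a generic illustration of gene-wise arithmetic crossover (the function name is an assumption); each pair of child genes is a pair of convex combinations of the parent genes.

```python
import random

def arithmetic_crossover(x1, x2):
    """Gene-wise arithmetic crossover on real-coded parents: for each gene j,
    draw alpha_j in [0, 1] and form two complementary convex combinations."""
    alphas = [random.random() for _ in x1]
    c1 = [a * u + (1 - a) * v for a, u, v in zip(alphas, x1, x2)]
    c2 = [(1 - a) * u + a * v for a, u, v in zip(alphas, x1, x2)]
    return c1, c2
```

Note that, gene by gene, the two children conserve the parents' sum, so the offspring always lie inside the box spanned by the parents.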

Methods

In the PMA, population flow includes extensive information exchange. In practice, the exchange of population information decides the direction of migration under certain conditions, so we should give full attention to population flow. Under pressure, population migration reduces to random population flow within the range set by the population pressure, which does not play a significant role in further improving the algorithm.

The local search mechanism of the frog-leaping algorithm and the crossover operator are introduced into the PMA during population flow, which substantially improves the original search results. The IPMA organically combines frog leaping with the crossover operator inside the PMA. The process of the algorithm is as follows:

Initialization

Input the population size N, the initial radius δ(0), the vigilance parameter of population pressure α(0), the number of population flows l, the number of local searches LS, and the maximum number of iterations T. We randomly generate N individuals X1, X2, …, XN in the search space S. The ith regional center determines the upper and lower bounds of the ith region. The extraction method makes the radii equal, so we drop the region superscript in the following steps. We obtain an initial search space, which stands for the initial residential area; each region is taken as the sphere with center X and radius r, and variables appearing in this way below have the same meaning. We set the iteration counter t = 0.

Evolutionary

  1. Preparation:
  2. Population flow: In every region, we randomly generate l individuals, so we obtain the group Y(t), which has Nl individuals.
  3. Population migration
    We next introduce the local search mechanism of the frog-leaping algorithm and the crossover operator. At this point, the individuals of the population are treated as frogs: the search area is seen as the frogs' wetland, the N local areas formed are seen as N groups, and each individual of a local area is seen as a frog.
    (a) We choose the N best frogs in Y(t), which form the intermediate frog group.
    (b) Forming N regions.
    (c) In every group, we produce frog body flow; in other words, li individuals are randomly generated in the region, where li is proportional to the adaptive value of the region.
    (d) We set the group counter iN = 1, which is compared with the total number of groups N, and the local evolution counter im = 1, which is compared with the number of local searches LS.
    (e) In the iNth group, we calculate the adaptive value of every frog and arrange the frogs in descending order. We obtain the best solution Pbi and the worst solution Pwi.
    (f) Setting P1wi = Pwi, we update and improve the position of the worst frog in the group according to the following four formulas:
      (4) (5) (6) (7)
    (g) If step (f) improved the position of the worst frog, that is, if the adaptive value of P1wi is better than that of Pwi, the position of Pwi is replaced by the position of P1wi. Otherwise, we randomly generate a frog position to replace the worst frog in the group.
    (h) If im < LS, then im = im + 1 and we turn to step (f). Otherwise, we report the best frog and turn to step (i).
    (i) If iN < N, then iN = iN + 1 and we turn to step (e). Otherwise, we turn to step (j).
    (j) Contraction of the preferential region.
    (k) If the population pressure does not exceed the population pressure alert, we continue the frog flow and turn to step (c). Otherwise, we turn to step (l).
    (l) We report the N best frogs.
  4. Population proliferation:
    (a) We keep the best individual of the current population, which is recorded as the best.
    (b) We randomly sample N−1 new individuals to replace the remaining N−1 individuals.
    (c) We define a new generation of population X(t+1).

Termination Check

If the new generation of population X(t+1) contains an eligible solution, we stop and output the best solution in X(t+1). Otherwise, we narrow the radius appropriately and set t := t + 1. If the maximum number of iterations has not been reached, we turn to the second step.
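The initialization, evolution, and termination steps above can be sketched as one loop. The following is a simplified illustration, not the paper's exact procedure: the parameter names, the contraction scheme, and the way the crossover child is inserted are assumptions made for the sketch (minimization assumed).

```python
import random

def ipma(f, bounds, N=10, flow=5, LS=5, T=100, shrink=0.9, seed=0):
    """Simplified sketch of the improved PMA loop: population flow ->
    migration with frog-leap local search (Eqs. (2)-(3)) and arithmetic
    crossover -> population proliferation. Parameters are illustrative."""
    rng = random.Random(seed)
    rand_point = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    X = [rand_point() for _ in range(N)]
    radius = [(hi - lo) / 2 for lo, hi in bounds]      # region radii
    best = min(X, key=f)
    for t in range(T):
        # Population flow: sample `flow` neighbours around each individual.
        Y = []
        for x in X:
            Y.append(x)
            for _ in range(flow):
                Y.append([min(max(xi + rng.uniform(-r, r), lo), hi)
                          for xi, r, (lo, hi) in zip(x, radius, bounds)])
        # Migration: keep the N best, then frog-leap local search.
        Y.sort(key=f)
        Z = Y[:N]
        for _ in range(LS):
            pb, pw = Z[0], Z[-1]
            r = rng.random()
            cand = [w + r * (b - w) for b, w in zip(pb, pw)]
            if f(cand) < f(Z[-1]):
                Z[-1] = cand                           # accept improved frog
            Z.sort(key=f)
        # Arithmetic crossover between the two best individuals.
        a = rng.random()
        child = [a * u + (1 - a) * v for u, v in zip(Z[0], Z[1])]
        if f(child) < f(Z[-1]):
            Z[-1] = child
        # Proliferation: keep the best, rescatter the rest; shrink radii.
        best = min(best, Z[0], key=f)
        X = [best] + [rand_point() for _ in range(N - 1)]
        radius = [r * shrink for r in radius]
    return best
```

On a smooth test function such as the sphere model, this sketch converges toward the origin as the region radii contract.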

Convergence Analysis

This paper uses the axiomatic model to analyze and prove the convergence of the IPMA.

Definition 1 An optimization variable X has a character string of limited length of the form A = a1a2…an. n is called the code string length of X, A is called the code of X, and X is called the decoding of A. Each ai is considered a gene, and all possible values of ai are called its alleles. A is a chromosome made up of n genes [11].

Definition 2 (Individual Space) Given an allele set and a code string length n, the set of all possible codes is called the individual space; its cardinality is a natural number and is the order of the population space [11].

Definition 3 (Satisfaction Set) B(X) is called the satisfaction set of X, and the number of individuals in B(X) is denoted accordingly. For any population, the satisfaction set induced by its best individual is called the satisfaction set of the population. The intersection of all satisfaction sets forms the set of global optimal solutions [11].

Definition 4 (Selection Operator) The selection operator is a random mapping that meets the following two conditions: (1) every selected individual belongs to the population being selected from; (2) the number of optimal individuals does not decrease under selection. The first condition states that selected individuals should come from the selected population. The second condition states that the selection operator may increase the number of optimal individuals in the population, apart from variations in the numbers of individuals between the selected population and the original population [11].

Definition 5 (Reproduction Operator) The reproduction operator is a random mapping. For any population that meets the stated condition, the new population obtained under the action of the operator may contain more satisfied individuals than the original population [11].

Definition 6 (Mutation Operator) The mutation operator is a random mapping. The mutation operator is a reproduction operator with a much stronger property: if the population contains satisfied individuals, then after mutation the population should still contain satisfied individuals; if the population does not contain satisfied individuals, then after mutation the population may contain satisfied individuals [11].

In the improved population migration algorithm, the extended operator meets the expressions above.

Therefore, we can use an operator to stand for the process of population flow. The selection operator S1 stands for step (a) of population migration, where we select the N best individuals from population Y(t). The reproduction operator R1 stands for the progress from step (b) to the final output Zbest(t) in step (l). The reproduction operator R2 stands for the process of population proliferation, which finally generates X(t+1). Therefore, the IPMA can be expressed as:(8)(9)

Applying the same operator function to both sides of (8), we find that the IPMA has this composite form. A general simulated evolutionary algorithm can be composed of a series of selection operators and reproduction operators and represented abstractly as such a composition.
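As a toy illustration of Definition 4, a "preferred" selection operator can be written so that both conditions are visibly satisfied: the selected individuals all come from the input population, and the count of best individuals does not decrease. The function name and minimization convention are assumptions for this sketch.

```python
def preferred_selection(pop, fitness, N):
    """Toy preferred selection operator in the sense of Definition 4:
    returns N individuals drawn from pop (condition 1), keeping the best
    individuals first so their count never decreases (condition 2)."""
    # Stable sort by fitness (minimization), then truncate to N survivors.
    return sorted(pop, key=fitness)[:N]
```

Composing such selection operators with reproduction operators, as in Eqs. (8) and (9), yields the abstract simulated evolutionary algorithm analyzed in the lemmas below.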

Lemma 1 In the IPMA, S is a selection operator and R is a reproduction operator. The proof can be found in reference [11].

Theorem 1 The IPMA is a simulated evolutionary algorithm.

Lemma 2 In the IPMA, the characteristic numbers of the preferred selection operator S are: selection pressure PS = 1 and selection intensity αS = 1. The IPMA uses only the preferred selection operator. Following reference [3], the selection pressure and the selection intensity are defined in order to calculate the characteristic numbers of the operator S. The proof can be found in reference [11].

Lemma 3 In the IPMA, the absorption and scattering rates of the reproduction operator meet the stated estimates, which involve the radius of the set of optimal solutions (relative to the radius of the search space) and a conversion coefficient. If the set of optimal solutions contains a division region, the rate equals 1; otherwise, it is the ratio of the volume of the intersection of the set with a division region. The proof can be found in reference [11].

Lemma 4 If the abstract simulated evolutionary algorithm (SEA) satisfies the following conditions: (1) the pressure of the selection operator is uniformly bounded; (2) the absorption rate of the reproduction operator meets the stated condition; and (3) the selection strength, absorption rate, and scattering rate satisfy the stated relation, then the abstract, general SEA weakly approaches the optimal solution set in probability. The proof can be found in reference [11].

Theorem 2 (Convergence of the IPMA) Suppose the IPMA adopts the selection operator family comprising all the preferred selection operators, together with the corresponding reproduction operator family, and that the following two conditions are met: (1); (2).

Then the IPMA weakly approaches the optimal solution set in probability. The proof proceeds as follows.

Proof.

From Lemma 2 we know the selection pressure and selection intensity of the operator S, and from Lemma 3 we know the absorption and scattering rates of the operator R. By condition (1) and condition (2) of Theorem 2, the corresponding bounds are obtained. Combining these bounds with Lemma 3, and using the divergence of the harmonic series, the scattering condition is satisfied. The remaining estimates follow from Lemma 3 and Lemma 2. The three conditions of Lemma 4 are thus met, and therefore the IPMA weakly approaches the global optimal solution in probability.

Results and Discussion

Simulation Examples and Experimental Parameter Setting

To verify the performance of the IPMA, this paper performs experiments comparing the IPMA with the basic population migration algorithm and other improved algorithms. In the simulation experiments, we chose 10 classic test functions. According to their landscapes, the functions fall into two categories: those with a single minimum (unimodal) and those with a number of local minima (multimodal). The following lists the definitions, variable ranges, theoretical global optimal values, and theoretical optimal solutions of the 10 functions. Function f1 is a unimodal function used to examine the accuracy of the algorithm. Function f2, which contains infinitely many local optimal solutions and infinitely many local maxima near the suboptimal solutions, is a strongly oscillating multimodal function; general optimization algorithms easily fall into its local optima, so it tests an algorithm's ability to avoid being trapped. Function f5 is a multimodal function with about 10·D local optimal solutions (D is the dimension). Function f6 is a multimodal function with deep local minima and many local optimal solutions. Functions f5 and f7 are complex nonlinear functions used to test the global search performance of algorithms. Function f8 is a complex, classical function used to test the optimization efficiency of algorithms. Within a fixed number of iterations, this paper estimates the performance of the IPMA on the test functions by the best value, the worst value, the average value, the variance, the average running time, and the convergence rate.

1. Sphere Model function: f1(X) = Σ_{i=1}^{n} xi², −100 ≤ xi ≤ 100. The optimal solution is X* = (0, 0, …, 0) and the optimal minimum value is 0, when n = 30.

2. Schaffer function: f2(X) = 0.5 − (sin²√(x1² + x2²) − 0.5) / (1 + 0.001(x1² + x2²))², −100 ≤ xi ≤ 100, i = 1, 2. The optimal solution is (X1, X2) = (0, 0) and the optimal maximum value is 1.

3. Step function: f3(X) = Σ_{i=1}^{5} ⌊xi⌋, −5.12 ≤ xi ≤ 5.12. The function has a global minimum value of −30 on the section (−5.12, −5).

4. , , . The optimal solution is and the optimal minimum value is −6.

5. Generalized Rastrigin function: f5(X) = Σ_{i=1}^{n} (xi² − 10cos(2πxi) + 10), −5.12 ≤ xi ≤ 5.12. The function has many local minimum solutions but only one global minimum solution X* = (0, 0, …, 0), and the global minimum value is 0, when n = 2, n = 5, and n = 10.

6. Quartic function: f6(X) = Σ_{i=1}^{n} i·xi⁴, −1.28 ≤ xi ≤ 1.28. The optimal solution is X* = (0, 0, …, 0) and the optimal minimum value is 0, when n = 20.

7. Ackley’s function: f7(X) = −20 exp(−0.2 √((1/n) Σ_{i=1}^{n} xi²)) − exp((1/n) Σ_{i=1}^{n} cos(2πxi)) + 20 + e, −32 ≤ xi ≤ 32. The optimal solution is X* = (0, 0, …, 0) and the optimal minimum value is 0, when n = 20.

8. Generalized Rosenbrock’s function: f8(X) = Σ_{i=1}^{n−1} [100(x_{i+1} − xi²)² + (xi − 1)²], −30 ≤ xi ≤ 30. The optimal solution is X* = (1, 1, …, 1) and the optimal minimum value is 0, when n = 3.

9. Griewank function: f9(X) = (1/4000) Σ_{i=1}^{n} xi² − Π_{i=1}^{n} cos(xi/√i) + 1, −600 ≤ xi ≤ 600. The optimal minimum value is 0 at X* = (0, 0, …, 0), when n = 30.

10. Schwefel function: f10(X) = −Σ_{i=1}^{n} xi sin(√|xi|), −500 ≤ xi ≤ 500. The optimal minimum value is −2094.9 when n = 5 and −20949 when n = 50.
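Several of these benchmarks can be written directly in code. The definitions below are the commonly used forms of the named functions, given here as a reference sketch (the paper's own stripped formulas cannot be read, so these are the standard definitions, not verbatim copies).

```python
import math

def sphere(x):       # f1: sum of squares, global minimum 0 at the origin
    return sum(v * v for v in x)

def rastrigin(x):    # f5: sum of (x_i^2 - 10 cos(2 pi x_i) + 10)
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def ackley(x):       # f7: standard Ackley function, minimum 0 at the origin
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

def griewank(x):     # f9: (1/4000) sum x_i^2 - prod cos(x_i / sqrt(i)) + 1
    s = sum(v * v for v in x) / 4000
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return s - p + 1

def schwefel(x):     # f10: -sum x_i sin(sqrt(|x_i|)), minimum ~ -418.98 n
    return -sum(v * math.sin(math.sqrt(abs(v))) for v in x)
```

Evaluating each function at its known optimum (the origin, or roughly 420.97 per coordinate for Schwefel) is a quick sanity check before running any optimizer against them.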

To verify the performance of the IPMA, we used a computer with an AMD CPU running Windows 7; the tests were run in MATLAB 7.0. Each function was run 50 times. The parameters used in solving the functions are presented in Table 1. To demonstrate the performance of the IPMA, we compare it with other algorithms on the following indicators over the 50 runs: best and worst solutions, average solution, variance, average running time, and average convergence rate.

Table 1. Parameter setting of the functions for the IPMA.

https://doi.org/10.1371/journal.pone.0056652.t001

This table shows the parameter settings of the ten functions: population size N, number of population flows l, radius δ, coefficient of contraction Δ, population pressure parameter α, number of local searches LS, and largest number of iterations K.

Experimental Result and Analysis

In Table 2, the data for function f1 show that the improved population migration algorithm is better than the frog-leaping algorithm in terms of precision. Other algorithms find it very difficult to reach the optimal solution of function f2, but the IPMA does so easily, which indicates that the algorithm's ability to escape from local optimal solutions is very good. Function f8 tests whether an algorithm's global optimization ability is good and is used to measure its efficiency; from the data for function f8 we see that the efficiency of the IPMA is very high compared with the other algorithms. The data for functions f3, f4, f5, f7, f9, and f10 likewise reflect, from various angles, that the IPMA is the better algorithm. Considering the best value, the worst value, the average value, the variance, and the convergence rate together, the IPMA is found to be better than the other algorithms in both stability and precision.

Table 2. Comparison performance of the IPMA in the ten functions.

https://doi.org/10.1371/journal.pone.0056652.t002

Conclusions

In this paper, based on the local search mechanism of the frog-leaping algorithm and a crossover operator, we put forward an IPMA. The parameters of the IPMA are simple, and the algorithm is easy to implement. By testing the 10 functions, we find that the IPMA is better than the basic PMA and superior to many other intelligent algorithms. The PMA is good at global search, but it does not perform well in local search and its precision is low. We introduce a local search within the population and a crossover operator to remedy these deficits. As a result, the IPMA's capacity to avoid falling into local optima is increased, and its calculation speed and precision are also improved.

Author Contributions

Conceived and designed the experiments: XYL. Performed the experiments: YQZ. Analyzed the data: YQZ XYL. Contributed reagents/materials/analysis tools: YQZ. Wrote the paper: YQZ.

References

  1. Zhou YH, Mao ZY (2003) A new search algorithm for global optimization: population migration algorithm (I). Journal of South China University of Technology (Nature Science) 31: 1–5 (in Chinese).
  2. Zhou YH, Mao ZY (2003) A new search algorithm for global optimization: population migration algorithm (II). Journal of South China University of Technology (Nature Science) 31: 41–43 (in Chinese).
  3. Xu ZB (2004) Computational intelligence (Volume 1) - Simulated evolutionary computation. Beijing: High Education Press. 152p. (in Chinese).
  4. Wang XH, Liu XY, Bai XH (2009) Population migration algorithm with Gauss mutation and steepest descent operator. Computer Engineering and Application 45: 57–60 (in Chinese).
  5. Ouyang YJ, Zhang WW, Zhou YQ (2010) A hybrid global optimization algorithm based on the simplex method and population migration. Computing Engineering and Application 46: 29–35 (in Chinese).
  6. Guo YN, Cheng J, Cao YY (2011) A novel multi-population cultural algorithm adopting knowledge migration. Soft Computing 15: 897–905.
  7. Karaboga D (2005) An idea based on honey bee swarm for numerical optimization. Technical Report-TR06, Erciyes University, Engineering Faculty, Computer Engineering Department. 10p.
  8. Li X, Luo JP, Chen MR, Wang N (2012) An improved shuffled frog-leaping algorithm with extremal optimization for continuous optimization. Information Science 192: 143–151.
  9. Eusuff MM, Lansey KE (2003) Optimization of water distribution network design using the shuffled frog leaping algorithm. Journal of Water Resource Planning and Management 129: 210–225.
  10. Han Y, Cai JH (2010) Advances in Shuffled Frog Leaping algorithm. Computer Science 37: 16–19 (in Chinese).
  11. Wu Y (2005) Convergence analysis on population migration algorithm. Xi’an: Xi’an University of Science and Technology. 76p. (in Chinese).
  12. He B, Che LX, Liu CS (2011) Novel hybrid shuffled frog leaping and differential evolution algorithm. Computer Engineering and Application 47: 4–8 (in Chinese).