
An improved crayfish optimization algorithm for solving engineering optimization problems

Abstract

By emulating crayfish behaviors such as social foraging, rapid retreat from threats, and adaptive sensing, the Crayfish Optimization Algorithm (COA) achieves a dynamic balance between global search and local exploration, improving optimization efficiency. However, COA suffers from diversity degradation, insufficient exploration capability, low search accuracy, and a tendency to fall into local minima. To address these problems, an improved crayfish optimization algorithm (ICOA) is proposed. First, the population is initialized using Sobol sequence mapping, which enhances diversity within the population. Second, a Lévy flight strategy is introduced in the foraging phase, which prevents the algorithm from falling into local optima and enhances each individual's capacity for extensive exploration of the solution space. Third, during the competition phase, a Euclidean distance-fitness balanced competition strategy improves exploitation and exploration simultaneously. To evaluate ICOA's performance, experiments were conducted on the IEEE CEC2019 and CEC2020 benchmark functions in different dimensions, followed by sensitivity analysis, quantitative analysis, and nonparametric statistical analysis. Furthermore, the method's effectiveness is validated on five engineering optimization problems, in which ICOA improved by 0.28%, 17.86%, 0.01%, 88.8%, and 0.1%, respectively, compared to COA. ICOA exhibits enhanced optimization capability for tackling complex search spaces and practical challenges, and incorporating multiple strategies markedly improves its efficacy. These findings have significant implications for the field of engineering optimization.

1 Introduction

With the increasing prominence of AI technology, many scholars have shown great interest in intelligent optimization [1,2] and in establishing precise mathematical models to address complex optimization problems. Metaheuristic algorithms inspired by biological habits, physical phenomena, mathematical methods, and other sources [3,4] are widely used to solve engineering optimization problems [35] such as hydrological modeling and prediction [6] and medical imaging [7]. Real-world applications involve multiple design constraints that are often nonlinear, complex, and non-differentiable, making them difficult to resolve. Facing these challenges, the prevailing solutions can be categorized into two types. The first category is deterministic algorithms. These algorithms leverage analytical characteristics and produce a deterministic sequence of points converging to the globally optimal solution; they rely on precise mathematical formulations and follow a predictable trajectory toward the optimum. Representative deterministic methods include gradient descent, Newton's method, and the conjugate gradient method, which are particularly effective for convex and differentiable functions. This type of algorithm is stable and predictable and is often used in scientific computation, engineering design, and other fields that require precise control over deterministic problems. However, it cannot be applied to large-scale data and complex problems. The other category is metaheuristic algorithms [8-11], which improve on heuristic algorithms by combining the characteristics of local search algorithms and stochastic algorithms [12,13]. Standard metaheuristic algorithms include evolution-based, physics-based, human-based, and swarm-based algorithms. These algorithms are inspired by natural processes and are widely used for solving complex, nonlinear, and multimodal optimization problems.
An overview of various metaheuristic algorithms and their categories is presented in Table 1 and summarized as follows.

The crayfish optimization algorithm [34] simulates the foraging, competition, and heat-avoidance behaviors of crayfish in their natural environment. It has a rapid convergence rate, which allows a thorough search of the solution space and increases the likelihood of finding the global optimum. However, it is sensitive to parameter selection, and different problems may require different parameter settings; its local search ability may be insufficient, which affects search accuracy. On some complex optimization problems, the algorithm may converge prematurely and fail to find all optimal solutions. Moreover, simulating behaviors and position updates requires a larger population, and addressing large-scale problems may demand more computational resources.

Research on crayfish optimization at home and abroad focuses mainly on three aspects. The first is algorithmic improvement: researchers are committed to improving the performance of the crayfish optimization algorithm, including increasing convergence speed, improving global search ability, avoiding premature convergence, and so on. Chao [55] proposed an advanced artificial-intelligence-based method that uses a parallel Crayfish Optimization and Arithmetic Optimization Algorithm (PSCOAAOA) to optimize the parameters of a Support Vector Machine (SVM), enabling prediction of the friction force of the Ti-6Al-4V alloy under different lubrication conditions. Wang [56] introduced strategies such as cave candidacy, food covariance learning, and non-monopoly perturbation to address the shortcomings of the original algorithm. The second is theoretical research: some scholars focus on the theoretical foundations of the crayfish algorithm, including convergence analysis and complexity analysis, to gain a deeper understanding of the algorithm's characteristics and range of application. Shikoun [57] enhances the original COA with refracted opposition-based learning and a crisscross strategy to improve search ability and convergence accuracy. Zhang [58] proposed an enhanced crayfish optimization algorithm to improve exploration capability and avoid being trapped in local optima. Third, COA can be applied to practical problems such as industrial production scheduling, image processing, path planning, and medicine. Chaib [59] uses fractional-order chaos maps in COA, then adaptively adjusts COA settings based on energy requirements. Jia [60] combines a learning strategy with COA, strengthening the algorithm's ability to explore the solution space and escape local minima.
Patel [61] proposes an intelligent kidney tumor segmentation and classification model for the early identification of benign and malignant tumors based on a deep learning model and MCOA; this method significantly improves the accuracy and efficiency of tumor detection. Xiao [62] combines specular reflection learning, the expanded exploration strategy of the Aquila Optimizer (AO), the local exploitation characteristics of Lévy flight, and a vertical crossover operator to enhance optimization capability.

COA suffers from diversity degradation, insufficient exploration capability, low search accuracy, and a tendency to fall into local minima. To address these limitations, this paper proposes an ICOA. First, the Sobol sequence is used to initialize the population, which effectively makes its distribution uniform. Second, to balance global and local exploration in the competition phase, a Euclidean distance-fitness balanced competition strategy is proposed. Finally, the position of the crayfish is randomly updated using the Lévy flight strategy, with additional update conditions incorporated to accelerate the convergence rate. The optimization performance of ICOA is evaluated on CEC2019 and CEC2020. The main contributions of this study are summarized as follows.

  • The Sobol sequence is used to initialize the population, which effectively makes its distribution uniform in the initial stage.
  • To balance exploration and exploitation during the competition phase, a Euclidean distance-fitness balanced competition strategy is proposed.
  • To enhance global search ability, the position of the crayfish is randomly updated using the long- and short-jump characteristics of Lévy flight.
  • ICOA is evaluated on 10 functions in CEC2019 and CEC2020, which utilize various dimensions.
  • ICOA is evaluated in five engineering applications.

The overall research framework of this paper is organized as follows:

Sect 2 reviews the classical COA and discusses the motivations for improving its search mechanisms. Sect 3 presents improvements from three perspectives: initialization improvements, exploration strategies, and hybrid COA variants. Sect 4 conducts a performance analysis on the CEC2019 and CEC2020 benchmark functions in different dimensions, describes the evaluation metrics, and reports and analyzes the comparative results, followed by sensitivity analysis, statistical tests, and ablation studies. Sect 5 validates the effectiveness of the proposed method through five engineering optimization problems. Finally, Sect 6 concludes the paper by discussing the advantages, limitations, and future research directions of the proposed algorithm.

2 Crayfish optimization algorithm

COA simulates key behavioral stages of crayfish, including escaping summer heat, engaging in competition, and searching for food. The summer heat avoidance stage is the exploration stage, while the competition and foraging stages constitute the exploitation stage. Because temperature balances the algorithm's exploration and exploitation capabilities, the overall optimization proceeds more organically, and the best fitness value can be found more quickly.

2.1 Initialization stage

In COA, each individual crayfish is represented as a row of a matrix over a set of decision variables, and a set of candidate solutions X is created randomly. Additionally, each variable must remain within its respective boundary limits.

X = [X_{i,j}]_{N×dim}, i = 1, …, N, j = 1, …, dim  (1)

X_{i,j} = Lb_j + rand × (Ub_j − Lb_j)  (2)

where N represents the total number of crayfish in the population, dim represents the problem dimension, X is the initial population, X_{i,j} represents the coordinate of the i-th crayfish along the j-th axis, Ub_j is the upper boundary in the j-th dimension, Lb_j is the lower boundary in the j-th dimension, and rand denotes a random value in the range [0,1].
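As a concrete sketch, the random initialization of Eqs 1 and 2 can be written as follows (illustrative Python; the function name and NumPy usage are our own, not taken from the paper):

```python
import numpy as np

def initialize_population(n, dim, lb, ub, rng=None):
    """Random COA initialization (Eqs 1-2): X[i, j] = Lb_j + rand * (Ub_j - Lb_j)."""
    rng = np.random.default_rng() if rng is None else rng
    lb = np.broadcast_to(np.asarray(lb, dtype=float), (dim,))
    ub = np.broadcast_to(np.asarray(ub, dtype=float), (dim,))
    # One uniform random draw per coordinate, scaled into [Lb_j, Ub_j]
    return lb + rng.random((n, dim)) * (ub - lb)
```

Each row of the returned N × dim matrix is one candidate solution, guaranteed to lie within the per-dimension bounds.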

2.2 Summer heat avoidance stage (exploratory phase)

Temperature changes affect the behavior of the crayfish. When the temperature exceeds a certain value, the crayfish seek cooler areas to escape the heat. The optimal feeding temperature of crayfish lies in the range [20, 30]°C. The temperature is given by Eq 3:

temp = rand × 15 + 20  (3)

In the summer heat avoidance stage (exploratory phase), if the temperature exceeds 30°C, the crayfish move into a cave to escape the heat, selecting a relatively cooler location to avoid exposure to high temperatures. The mathematical model for this heat avoidance behavior is presented in Eq 4:

X_shade = (X_G + X_L) / 2  (4)

where X_G refers to the best position achieved over multiple iterations, and X_L refers to the optimal position obtained after the update of the current iteration. Competition among crayfish for the cave occurs randomly. In the summer heat avoidance stage (exploratory phase), when rand < 0.5, there are no competing crayfish, and a crayfish enters the cave directly to escape the heat. The updated position is given in Eq 5:

X_new = X_{i,j} + C2 × rand × (X_shade − X_{i,j})  (5)

Here, Xnew is the post-update position used for the subsequent generation, and C2 follows a decreasing trend from 2 to 0. C2 is calculated in the following Eq 6:

C2 = 2 − t / T  (6)
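A minimal sketch of this cave-retreat update, assuming the standard COA formulation of Eqs 4-6 (X_shade as the midpoint of the global and local best, and C2 decreasing linearly from 2 to 0):

```python
import numpy as np

def heat_avoidance_update(x_i, x_g, x_l, t, t_max, rng=None):
    """Exploratory-phase update (Eqs 4-6), assuming the standard COA form:
    X_shade = (X_G + X_L) / 2 and X_new = X_i + C2 * rand * (X_shade - X_i)."""
    rng = np.random.default_rng() if rng is None else rng
    x_shade = (np.asarray(x_g) + np.asarray(x_l)) / 2.0       # Eq 4
    c2 = 2.0 - t / t_max                                       # Eq 6: decays 2 -> 0
    return x_i + c2 * rng.random(x_i.shape) * (x_shade - x_i)  # Eq 5
```

Early in the run (large C2), the crayfish takes large steps toward the shaded midpoint; as t approaches T, the step shrinks and the update becomes a local refinement.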

2.3 Competition stage (exploitation stage)

The crayfish begin to compete for the cave once the temperature exceeds 30°C and rand ≥ 0.5. In this situation, crayfish X_i updates its position based on another randomly selected crayfish X_z. The updated position of the crayfish is calculated using Eq 7:

X_new = X_{i,j} − X_{z,j} + X_shade  (7)

Here, Xz,j is the position of a random crayfish, and the formula for calculating the random crayfish is presented in Eq 8.

z = round(rand × (N − 1)) + 1  (8)

2.4 Foraging stage (exploitation stage)

The crayfish come out of their caves to search for food once the temperature is 30°C or below. In addition, temperature plays a role in determining how much food the crayfish eat. Crayfish forage most strongly when the temperature is between 20°C and 30°C, and when the temperature equals 25°C, they find the most food. The food intake of crayfish thus depends on the temperature and follows a normal distribution. Food intake is calculated with Eq 9:

p = C1 × (1 / (√(2π) × σ)) × exp(−(temp − μ)² / (2σ²))  (9)

The position of the food is calculated according to the Eq 10:

X_food = X_G  (10)

where temp is the environmental temperature, μ is the best temperature for crayfish to eat, and σ and C1 are factors that adjust food intake based on temperature variations.

When eating, crayfish decide whether to split the food apart depending on their size. When the food is the right size, they will eat it immediately. Nevertheless, if the food is too large, they will split it into smaller pieces before eating it. The size of the food is evaluated using the following Eq 11.

Q = C3 × rand × (fitness_i / fitness_food)  (11)

Here, C3 is a food-related parameter associated with the maximum food size, fitness_food represents the fitness of the food's location, and fitness_i is the fitness value of the i-th crayfish.

If Q > (C3 + 1)/2, the crayfish cannot eat the food directly due to its size. When the food is too large, the crayfish first breaks it into pieces with its claws, then uses its second and third pairs of legs in turn to feed. The formula for shredding the food is presented in Eq 12.

X_food = exp(−1/Q) × X_food  (12)

The new feeding position is computed using the formula shown in Eq 13:

X_new = X_{i,j} + X_food × p × (cos(2π × rand) − sin(2π × rand))  (13)

If Q<(C3 + 1)/2, since the food is of a suitable size, the crayfish immediately goes to it and begins eating without the need for prior processing. The new feeding position is given in Eq 14:

X_new = (X_{i,j} − X_food) × p + p × rand × X_{i,j}  (14)
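The foraging-phase logic (Eqs 9-14) can be sketched as follows, assuming the commonly published COA formulation; the default values used here for μ, σ, C1, and C3 are assumptions for illustration, not taken from this paper:

```python
import numpy as np

def foraging_update(x_i, x_food, fit_i, fit_food, temp,
                    mu=25.0, sigma=3.0, c1=0.2, c3=3.0, rng=None):
    """Foraging-phase update (Eqs 9-14), assuming the standard COA formulation."""
    rng = np.random.default_rng() if rng is None else rng
    # Eq 9: temperature-dependent food intake, Gaussian around the optimum mu
    p = c1 * (1.0 / (np.sqrt(2.0 * np.pi) * sigma)) \
        * np.exp(-(temp - mu) ** 2 / (2.0 * sigma ** 2))
    # Eq 11: relative food size seen by the i-th crayfish
    q = c3 * rng.random() * (fit_i / fit_food)
    if q > (c3 + 1.0) / 2.0:
        # Eq 12: shred oversized food, then Eq 13: feed with alternating claws/legs
        x_food = np.exp(-1.0 / q) * x_food
        return x_i + x_food * p * (np.cos(2.0 * np.pi * rng.random())
                                   - np.sin(2.0 * np.pi * rng.random()))
    # Eq 14: food is small enough to eat directly
    return (x_i - x_food) * p + p * rng.random() * x_i
```

The branch on Q > (C3 + 1)/2 mirrors the shredding decision described above; both branches return a new position for the next generation.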

2.5 Pseudo-code of the COA

The COA pseudo-code is detailed in Algorithm 1 and Fig 1.

Algorithm 1 Pseudo-code of the COA.

Initialization: Population Np, Iteration T, Dimension Dim; Initialize the population, evaluate the fitness to identify XGbest, XLbest.

while t<T do

   Calculate the temperature using Eq 3

   if temperature > 30 then

    Compute Xshade via Eq 4

    if rand < 0.5 then

     Adjust the location by using Eq 5

    else

     Adjust the location by using Eq 7

    end if

   else

    Evaluate the relationship between food intake proportion and food size using Eqs 9 and 11

    if Q > (C3 + 1)/2 then

     Adjust the location by using Eq 13

    else

     Adjust the location by using Eq 14

    end if

   end if

   for 1:N do

    Adjust fitness values, XGbest, XLbest

   end for

   t = t + 1

end while

3 Improved crayfish optimization algorithm

3.1 Initialization improvements

Crayfish are highly territorial. In the natural environment, crayfish are evenly distributed at the bottom of the pool and cannot tolerate the presence of same-sex crayfish within their burrows. This territoriality manifests not only in their intolerance of conspecifics within their territories, but also in that the size and location of territories are adjusted as time and the ecological environment change. In meta-heuristic optimization, how initial solutions are positioned within the solution space has a significant impact on both search accuracy and convergence speed. Uniformly initializing the population improves the algorithm's search efficiency. COA employs random generation for solution initialization, which leads to low traversability, slow convergence, and uneven population distribution. This paper employs the Sobol sequence for population initialization, using uniform points to fill the multidimensional hypercube as completely as possible, which yields higher computational efficiency and broader coverage of sampling points.

The Sobol sequence is an efficient random-number generation method with low variance that produces well-spread search directions. In each dimension, it applies a radical inversion with base 2, and each dimension possesses its own distinct direction matrix, thereby generating non-repetitive and uniform points. Fig 2 compares scatterplots of random generation and Sobol-sequence generation for a population of size 100 distributed in the interval [0,1] in 2D space. Compared with random generation, the Sobol sequence produces a more uniform population, and the coverage of the solution space is more complete.
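A minimal initialization sketch using SciPy's quasi-Monte Carlo Sobol generator; the paper does not specify an implementation, so the library choice and the use of scrambling here are assumptions:

```python
import numpy as np
from scipy.stats import qmc

def sobol_init(n, dim, lb, ub, seed=None):
    """Sobol-sequence population initialization: low-discrepancy points in
    [0, 1)^dim, scaled onto the per-dimension bounds [Lb_j, Ub_j]."""
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    sample = sampler.random(n)         # low-discrepancy points in the unit hypercube
    return qmc.scale(sample, lb, ub)   # map onto the search bounds
```

Using a power-of-two population size (e.g. 64 or 128) preserves the balance properties of the Sobol sequence.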

Fig 2. Randomly generated and Sobol sequence generation scatterplot comparison graph.

https://doi.org/10.1371/journal.pone.0340464.g002

3.2 Exploration strategies

In the foraging phase, when Q < (C3 + 1)/2, the crayfish experience slow convergence during direct food acquisition. COA may converge prematurely when search agents cluster too quickly, which motivates the incorporation of Lévy flights to allow long-distance jumps and escape from local optima. A modified conditional update is therefore introduced that incorporates the Lévy flight strategy, utilizing the heavy-tailed distribution of Lévy flight to randomly adjust the position of the crayfish. This significantly enhances global search capability by combining frequent local steps with occasional long-distance jumps, allowing the algorithm to escape local optima and accelerate convergence. The Lévy step size is defined in Eq 15:

Levy(β) = μ / |v|^(1/β)  (15)

where:

  • μ follows a normal distribution with zero mean and variance σ_μ².
  • v follows a normal distribution with zero mean and variance σ_v² = 1.

The scale parameter σ_μ is calculated using Eq 16:

σ_μ = [ Γ(1 + β) × sin(πβ/2) / ( Γ((1 + β)/2) × β × 2^((β−1)/2) ) ]^(1/β)  (16)

In Eq 16, Γ denotes the standard gamma function, and β takes values in the range (0, 2]; here, β is set to 0.8.

When Q < (C3 + 1)/2, the Lévy perturbation is incorporated into the position update performed when the crayfish eats the food directly (Eq 14), thus allowing the algorithm to escape from local optima more efficiently. The new crayfish updating formula is presented in Eq 17:

(17)

where δ is the step scaling factor, set to 1, and ⊗ denotes the tensor (element-wise) product.
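The Mantegna-style step of Eqs 15-16 can be generated as follows (illustrative Python; the helper name is ours):

```python
import math
import numpy as np

def levy_step(dim, beta=0.8, rng=None):
    """Mantegna-style Levy step (Eqs 15-16): s = u / |v|^(1/beta),
    with u ~ N(0, sigma_u^2) and v ~ N(0, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    # Eq 16: scale parameter of the numerator distribution
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1.0 / beta)   # Eq 15, applied elementwise
```

Most draws are small local steps, but the heavy tail occasionally produces a very large jump, which is exactly the escape mechanism Eq 17 exploits when scaling by δ.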

3.3 Hybrid COA variants

During the competition stage, the update strategy is designed primarily to promote deeper exploration of the solution space. However, the original formulation favors random movement and reduces the effectiveness of exploitation, and its distance-update mechanism lacks adaptive sensitivity to the spatial relationships among individuals, motivating a Euclidean-distance-based strategy to refine the position update dynamics. This research presents a balanced competition strategy that incorporates both Euclidean distance and fitness metrics for agent selection. To ensure adequate exploitation throughout the competition phase while ranking agents according to their fitness, we introduce a beta parameter that balances Euclidean distance against fitness; an agent from the top 50% is then selected based on this balance. This approach coordinates exploration and exploitation more effectively. Eqs 18 to 24 show the modified competition-phase formulas:

(18)(19)(20)(21)(22)(23)(24)

where distance measures how far the current individual is from the other individuals in Euclidean terms, fitness_score is the combined score matrix, and X_Edf is the index of the selected individual.
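Since Eqs 18-24 are not reproduced in the extracted text, the following is only a hedged sketch of the selection logic described above: score each agent by a beta-weighted mix of its Euclidean distance to the current individual and its fitness, then pick randomly from the better-scoring half. The function name, the min-max normalization, and the direction of the weighting are all assumptions:

```python
import numpy as np

def select_competitor(X, fitness, i, beta=0.5, rng=None):
    """Sketch of the Euclidean distance-fitness balanced selection: combine
    normalized distance-to-agent-i with normalized fitness (minimization assumed),
    then choose a random agent from the top 50% by that score."""
    rng = np.random.default_rng() if rng is None else rng
    dist = np.linalg.norm(X - X[i], axis=1)      # Euclidean distances to agent i

    def minmax(a):
        span = a.max() - a.min()
        return (a - a.min()) / span if span > 0 else np.zeros_like(a)

    # Lower combined score = closer and fitter; beta trades distance vs fitness
    score = beta * minmax(dist) + (1.0 - beta) * minmax(fitness)
    order = np.argsort(score)
    top_half = order[: max(1, len(order) // 2)]
    return int(rng.choice(top_half))             # index of the selected agent
```

In the modified competition update (Eq 24), this selected index would replace the uniformly random crayfish z of Eq 8.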

3.4 Pseudo-code of the ICOA

The ICOA pseudo-code is detailed in Algorithm 2 and Fig 3:

Algorithm 2 Pseudo-code of ICOA.

Initialization: Population Np, Iteration T, Dimension Dim; Initialize the population, evaluate the fitness to identify XGbest, XLbest.

while t<T do

   Calculate the temperature using Eq 3

   if temperature > 30 then

    Compute Xshade via Eq 4

    if rand < 0.5 then

     Adjust the location by using Eq 5

    else

     Adjust the location by using Eq 24

    end if

   else

    Evaluate the relationship between food intake proportion and food size using Eqs 9 and 11

    if Q > (C3 + 1)/2 then

     Adjust the location by using Eq 13

    else

     Generate improved Lévy Flight random numbers via Eqs 15 and 16

     Adjust the location by using Eq 17

    end if

   end if

   for 1:N do

    Adjust fitness values, XGbest, XLbest

   end for

   t = t + 1

end while

3.5 Computational time complexity

To better evaluate the performance of each algorithm, this paper counts each evaluation of the fitness function as one iteration. Some algorithms perform numerous position evaluations per iteration, so this approach allows a more accurate comparison of performance differences. In the subsequent test sets, the iteration count is determined indirectly by the problem's dimensionality: T denotes the problem's dimensionality multiplied by a predetermined factor. Different algorithms evaluate the fitness function a varying number of times per iteration, which could otherwise preclude direct comparison. For COA, each iteration performs only one fitness evaluation, yielding an overall time complexity of O(D × T). The improved strategy does not invoke the fitness function when computing Euclidean distances during iterations; whilst this introduces additional runtime overhead, it does not affect the overall time complexity. Similarly, the Lévy flight strategy imposes no time overhead related to D, N, or T, thus also preserving the overall time complexity. The CPU time statistics for multiple dimensions on CEC2020 reveal that, as the number of dimensions increases, the impact of these additional overheads on runtime diminishes progressively. This finding corroborates the preceding theoretical analysis.

In terms of overall computational time, comparative experiments were conducted on the CEC-2020 benchmark under three dimensional settings (20D, 30D, and 50D). All algorithms were evaluated with identical parameter configurations and executed in the same computational environment to ensure a fair comparison. The results show that, as dimensionality increases, the runtime of ICOA remains within a reasonable range, demonstrating good scalability and computational stability. Furthermore, compared to the other benchmark algorithms, ICOA displays slower growth in computational cost at higher dimensions, indicating superior computational efficiency. The overall runtime comparison of ICOA across different dimensions of the CEC2020 benchmark functions is presented in Table 1.

The comparison of time complexity between COA and ICOA is presented in Table 2.

Table 2. Comparison of computational complexity: COA and ICOA.

https://doi.org/10.1371/journal.pone.0340464.t002

4 Experimental results and discussion

This section presents simulation experiments to evaluate the optimization performance and effectiveness of the ICOA.

This study assesses the ICOA's ability to optimize diverse objective functions using the CEC-2019 benchmark suite and the CEC-2020 test set. The results are benchmarked against established algorithms, including COA, SCA, WOA, HHO, WDO, PSO, APO, DBO, and BKA, to thoroughly evaluate ICOA's solution accuracy and optimization efficacy. Four key metrics—mean value (Mean), standard deviation (Std), best-found value (Best), and rank value (Rank)—are utilized for a comprehensive performance analysis.

4.1 Parameter setting and environment

The hardware and software configurations for all experiments are detailed in Table 3. The experimental parameters include the problem dimension (D = 20, 30, 50), with a population size of N = 100 and a fixed evaluation budget (MaxFEs = 10000). For each algorithm, 30 independent runs are performed, and statistical metrics (Mean, Std, Best, and Rank) are collected. The parameter details for each algorithm are presented in Table 4.

4.2 Sensitivity analysis

To evaluate the stability and robustness of the proposed Improved Crayfish Optimization Algorithm (ICOA) under varying parameter configurations, a systematic parameter sensitivity analysis is conducted in this section. Internal parameter settings significantly influence algorithmic performance; appropriate tuning not only enhances global search capability and convergence speed but also strengthens generalization across complex optimization problems. Given that ICOA incorporates stochastic factors and the Lévy flight mechanism in population updating and step-size control, two core parameter categories are selected for analysis: one is the random weight parameter rand, which modulates perturbation intensity in individual search; the other is the Lévy flight distribution parameter β, which governs the heavy-tailed step-length distribution and exploration range.

Experiments are performed on the CEC-2020 benchmark suite at 30 dimensions with systematic combinations of rand and β; the results are reported in Table 5. To ensure statistical reliability and reproducibility, a population size of 100 is adopted, and each parameter configuration is independently executed 10000 times. Performance is evaluated using mean fitness (Mean), standard deviation (Std), and best-found value (Best), providing a comprehensive assessment of convergence precision, stability, and global search ability. The results reveal moderate sensitivity to rand, in contrast to relative insensitivity to β. Notably, one configuration of rand and β achieves an optimal trade-off between convergence precision and stability. These findings provide a data-driven foundation for parameter selection and facilitate broader deployment of ICOA in real-world optimization scenarios.

Table 5. The influence of parameter rand and β on test results (CEC-2020).

https://doi.org/10.1371/journal.pone.0340464.t005

4.3 CEC2019 benchmark test function experiment

4.3.1 Numerical experiment and analysis.

This part aims to comparatively assess the performance of ICOA and other algorithms (including COA, SCA, WOA, HHO, WDO, PSO, APO, DBO and BKA), to evaluate their optimization capabilities.

In CEC2019, apart from one function, all functions share the same dimension and search interval. Here, 'Mean' denotes the average value of the objective function, reflecting the algorithm's computational accuracy; 'Std' represents the standard deviation of the best objective function values, indicating the algorithm's stability; 'Best' refers to the optimal value obtained; and 'Rank' gives the ranking based on the best results.

Table 6 shows that ICOA achieves significantly better precision in obtaining optimal values for the functions. Specifically, the 'Best' and 'Mean' of ICOA are significantly better than those of the other algorithms, and the 'Std' of ICOA is slightly smaller than its 'Best'. This shows that ICOA has good optimization ability. In contrast to the initialization and updating methods used in the traditional COA, ICOA promotes greater information exchange among individuals, which strengthens the search and optimization capabilities of the algorithm.

Table 6. Experimental outcomes for each algorithm IEEE2019 (optimal data are highlighted in bold).

https://doi.org/10.1371/journal.pone.0340464.t006

4.3.2 Iterative curve analysis.

To examine the convergence of ICOA on CEC2019 more effectively, the convergence curves for IEEE CEC2019 are presented in Fig 4. The x-axis shows the number of iterations, and the y-axis shows the average result over 30 independent runs.

Fig 4. Convergence curve of ICOA in IEEE CEC2019 (DIM=10).

https://doi.org/10.1371/journal.pone.0340464.g004

First, during the early phase of iteration, the convergence curve of ICOA decreases rapidly, suggesting that initializing with the Sobol sequence effectively accelerates convergence. Furthermore, ICOA shows strong global search ability and seldom falls into local optima throughout the iterations. On one function, although early convergence is slightly slower, the other algorithms generally converge even more slowly as the iterations progress, while the present algorithm continues the optimality-seeking process; incorporating the Lévy flight mode improves its ability to identify potentially optimal solutions. In the vertical comparison, ICOA achieves more effective performance outcomes, particularly rapid convergence. This improvement in optimization accuracy is mainly due to the Euclidean distance-fitness balanced competition strategy, which takes the differences between individual positions and fitness values into account and enhances the exchange of information between individuals within the population.

In Fig 5, detailed box-and-line plots present the performance, clearly indicating that ICOA exhibits superior performance. The solution distribution of ICOA is more concentrated compared to that of all other algorithms. These results showcase ICOA’s strong global search and local refinement abilities, confirming its reliability and precision.

4.4 CEC2020(DIM=20) benchmark test function experiment

Table 7 shows that the ICOA algorithm performs exceptionally well overall in terms of average fitness (Mean) on the CEC2020 test function set (dimension D=20). It can be observed that ICOA achieves lower mean fitness values in most test functions compared with COA, SCA, WOA, HHO, WDO, PSO, APO, DBO, and BKA, demonstrating its superior global search capability and convergence stability. Specifically, for unimodal functions (e.g., F1 and F2), ICOA achieves significantly lower mean fitness values, indicating faster and more accurate convergence. On multimodal and hybrid functions (F6–F9), ICOA still maintains competitive performance with lower mean values, suggesting it has a stronger ability to avoid local optima. On certain high-dimensional or discontinuous functions (e.g., F10), ICOA performs comparably to the best competitors and shows good robustness. The overall average ranking also indicates that ICOA ranks among the top algorithms across all functions, confirming its overall optimization capability. In summary, the ICOA algorithm achieves a good balance between global exploration and local exploitation, with fast convergence speed and high stability in fitness, demonstrating strong generalization ability across different types of target functions.

Table 7. Experimental outcomes for each algorithm in IEEE2020 D=20 (optimal data are highlighted in bold).

https://doi.org/10.1371/journal.pone.0340464.t007

As illustrated in Fig 6, ICOA exhibits superior convergence accuracy and rapid convergence on all functions except F2, suggesting that it outperforms the comparison algorithms. In particular, ICOA demonstrates the highest convergence rate. Furthermore, compared with the other algorithms, ICOA shows more pronounced improvements in optimization capability and avoidance of local optima, leading to superior practical applicability. As shown in Fig 7, detailed box-and-line plots illustrate the performance characteristics and clearly demonstrate that ICOA achieves better performance.

Fig 6. Convergence curve of ICOA in IEEE CEC2020 (DIM=20).

https://doi.org/10.1371/journal.pone.0340464.g006

4.5 CEC2020(DIM=30) benchmark test function experiment

As shown in Table 8, the results presented in the table demonstrate that, on the CEC2020 benchmark suite (dimension D = 30), the ICOA algorithm achieved the lowest mean fitness values across the majority of test functions. This indicates that ICOA consistently obtains high-quality solutions over multiple independent runs, reflecting its superior global optimization capability and convergence efficiency. Specifically, for most unimodal functions (e.g., F1 and F2), ICOA significantly outperformed the compared algorithms in 30 dimensions, underscoring its clear advantages in overall performance and convergence stability. For complex multimodal and hybrid functions (e.g., F6–F9), ICOA maintained remarkably low mean fitness values even under high-dimensional conditions. Compared with classical algorithms such as PSO, WOA, and HHO, ICOA exhibited stronger global search ability and more effectively escaped local optima. On a few highly discontinuous functions (e.g., F10), although the performance gap narrowed, ICOA still delivered competitive results, demonstrating its robust adaptability.

Table 8. Experimental outcomes for each algorithm in IEEE2020 D=30 (optimal data are highlighted in bold).

https://doi.org/10.1371/journal.pone.0340464.t008

Overall, the average ranking across all 30-dimensional tests places ICOA at the forefront among the evaluated algorithms, further confirming its superiority in high-dimensional global optimization problems. The comprehensive experimental results reveal that ICOA effectively balances global exploration and local exploitation in high-dimensional scenarios, achieves rapid convergence with stable outcomes, and exhibits strong adaptability across diverse objective function types.

As shown in Fig 8, ICOA exhibits excellent convergence precision and fast convergence on all functions. Furthermore, comparing convergence between 20 and 30 dimensions shows that ICOA consistently produces excellent optimization results, indicating that the strategies developed in this work remain effective as the dimensionality increases.

Fig 8. Convergence curve of ICOA in IEEE CEC2020 (DIM=30).

https://doi.org/10.1371/journal.pone.0340464.g008

The box plots in Fig 9 illustrate that ICOA generates solutions with a more concentrated distribution and superior quality. This shows its strong performance in balancing global search and local search spaces, consistently yielding positive results.

4.6 CEC2020(DIM=50) benchmark test function experiment

As shown in Table 9, when the dimensionality increases from 20 to 50, the overall optimization difficulty rises significantly, with the mean fitness values (Mean) and best fitness values (Best) of all algorithms generally increasing. This indicates a substantial increase in both search complexity and the number of local optima in high-dimensional spaces. Under these conditions, the ICOA algorithm still demonstrates strong stability and global search capability. Compared to the 20-dimensional results, although ICOA experiences a slight increase in fitness values in the 50-dimensional tests, its performance degradation is smaller than that of most comparative algorithms. Particularly in complex multimodal functions, it maintains lower average errors and higher solution convergence precision, reflecting excellent high-dimensional robustness. From the perspective of the standard deviation (Std) metric, ICOA exhibits relatively stable fluctuation amplitudes in high-dimensional scenarios, indicating that the algorithm can still maintain consistent search results across multiple independent runs without significant instability. Overall, ICOA demonstrates good scalability in cross-dimensional tests from 20 to 50. When facing the increased complexity introduced by expanding the search space, it maintains a good balance between exploration and exploitation, exhibiting superior global optimization capability and convergence stability compared to other algorithms.

Table 9. Experimental outcomes for each algorithm in IEEE2020 D=50 (optimal data are highlighted in bold).

https://doi.org/10.1371/journal.pone.0340464.t009

As shown in Figs 10 and 11, ICOA achieves faster convergence with higher accuracy on the majority of functions. Moreover, the solutions obtained by ICOA exhibit a more concentrated distribution with fewer poor values, highlighting its improved robustness.

Fig 10. Convergence curve of ICOA in IEEE CEC2020 (DIM=50).

https://doi.org/10.1371/journal.pone.0340464.g010

Fig 11. Comparison of box diagrams of the 10 algorithms using CEC2020 (Dim=50).

https://doi.org/10.1371/journal.pone.0340464.g011

4.7 Nonparametric statistical analysis

4.7.1 Effectiveness analysis of the ICOA.

To verify the effectiveness of the core strategies in the improved crayfish optimization algorithm, ablation experiments were conducted on the CEC2020 test set in the 30-dimensional scenario. Five comparative schemes were designed: 1) the complete ICOA algorithm (integrating Sobol sequence initialization, Lévy flight, and the Euclidean distance-fitness balanced competition strategy); 2) the original COA algorithm; 3) ICOAX (removing Sobol sequence initialization and adopting random initialization); 4) ICOAY (removing Lévy flight and retaining the basic search mechanism); and 5) ICOAZ (removing the Euclidean distance-fitness balanced competition strategy and adopting traditional competition selection rules). The optimization accuracy (mean, standard deviation, and optimal value) of the five algorithms on the CEC2020 test functions was used as the core evaluation metric. The experimental results show that the complete ICOA algorithm exhibits comprehensive advantages across the 10 test functions.
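As a concrete illustration of the first ablation component, the Sobol-based initialization can be sketched as follows. This is a minimal Python sketch using SciPy's quasi-Monte Carlo module; the paper's reference implementation is in MATLAB, and the bounds and population size below are illustrative, not the paper's exact settings.

```python
import numpy as np
from scipy.stats import qmc

def sobol_init(pop_size, dim, lb, ub, seed=0):
    """Initialize the population with a scrambled Sobol sequence
    mapped to the search bounds (a sketch of ICOA's first strategy)."""
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    unit = sampler.random(pop_size)      # low-discrepancy points in [0, 1)
    return lb + (ub - lb) * unit         # scale to [lb, ub]

pop = sobol_init(pop_size=100, dim=30, lb=-100.0, ub=100.0)
print(pop.shape)  # (100, 30)
```

Compared with uniform random initialization, the low-discrepancy Sobol points cover the search space more evenly, which is the diversity benefit the ablation isolates in ICOAX.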

Table 11 presents the ablation study based on the CEC2020 test set. The parameter settings of all algorithms follow those of the COA algorithm, with a population size of 100 and a maximum of 10,000 iterations. As shown in Table 11, for the unimodal function F1, each strategy yields a noticeable improvement in optimization performance compared with the original COA algorithm, while the combined strategies achieve the best overall enhancement. For the basic functions (F2–F4), ICOA exhibits a substantial improvement on F2 and is able to reach the optimal value; a moderate improvement is observed on F3, and similar optimization results are obtained on F4. For the hybrid functions (F5–F7) and composition functions (F8–F10), all strategies contribute to better optimization performance and improved stability. Overall, the ablation study demonstrates that each strategy enhances the search capability of the original COA, and the integrated ICOA achieves superior performance, confirming the effectiveness and feasibility of the proposed improvements.

Table 10. The overall runtime comparison of ICOA across different dimensions of the CEC2020 benchmark functions.

https://doi.org/10.1371/journal.pone.0340464.t010

4.7.2 Friedman test analysis.

According to Table 12, the Friedman test results (p < 0.001) indicate a significant difference in performance among the 10 algorithms tested on the 10 benchmark functions. Among them, ICOA ranks first with an average rank of 2.28, significantly better than the other algorithms, making it the best-performing algorithm in this experiment with a statistically significant advantage.
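The Friedman test used here can be reproduced with standard tooling. The sketch below uses SciPy on illustrative placeholder scores (not the paper's data) to show how the test statistic, p-value, and average ranks are obtained:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Rows: benchmark functions (blocks); columns: algorithms (treatments).
# The values are random placeholders, not the paper's measured fitnesses.
rng = np.random.default_rng(1)
scores = rng.random((10, 10))  # 10 functions x 10 algorithms

# friedmanchisquare takes one measurement vector per algorithm.
stat, p = friedmanchisquare(*scores.T)

# Average rank of each algorithm across the 10 functions (1 = best).
mean_ranks = scores.argsort(axis=1).argsort(axis=1).mean(axis=0) + 1
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
print("average ranks:", np.round(mean_ranks, 2))
```

With the paper's data, ICOA's average rank of 2.28 would appear as the smallest entry of `mean_ranks`.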

5 Results and analysis of engineering optimization experiments

To verify the optimization performance of the improved algorithm, a variety of typical engineering design problems were selected for solution and analysis. To demonstrate the feasibility and practicality of the ICOA algorithm in engineering optimization more comprehensively, the experiments employed recent evaluation indicators and statistical analysis methods. First, the constraint-violation value v, defined in Eq (25), was introduced in each engineering case to determine whether the obtained solution satisfies the imposed constraints; only after feasibility is ensured is the fitness value used to assess the quality of the solution. The experimental results were statistically analyzed and compared across the different algorithms.

(25)

Specifically, each algorithm was independently executed 30 times on every engineering problem, and the following performance indicators were recorded: the best fitness value (Best), the mean fitness value (Mean), the standard deviation of fitness values (Std), the mean constraint violation (MV), and the feasibility rate (FR). Among them, MV represents the average constraint-violation value over the 30 independent runs, while FR denotes the proportion of runs in which the algorithm successfully produced feasible solutions.

The performance of algorithms on engineering constrained optimization problems was evaluated mainly using two indicators: the feasibility rate (FR) and the mean constraint violation (MV). FR measures how often an algorithm produces feasible solutions over multiple independent runs, with higher values indicating more consistent satisfaction of constraints. MV represents the average degree to which the solutions violate constraints, with lower values indicating better handling of constraints and fewer infeasible solutions. When algorithms have the same FR and MV, their performance is further compared based on the mean fitness value (Mean), where smaller values indicate better solutions. Overall, algorithms with higher FR, lower MV, and smaller Mean are considered more effective for solving engineering constrained optimization problems.
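The indicators above can be computed from a batch of independent runs as sketched below. This is a hypothetical helper, not the paper's code: the feasibility tolerance and the choice to take Best over feasible runs only are assumptions.

```python
import numpy as np

def summarize_runs(fitness, violation, tol=0.0):
    """Summarize independent runs with the indicators used in the paper:
    Best/Mean/Std of fitness, mean violation (MV), feasibility rate (FR).
    A run is counted as feasible when its violation is <= tol (assumed)."""
    fitness = np.asarray(fitness, dtype=float)
    violation = np.asarray(violation, dtype=float)
    feasible = violation <= tol
    return {
        "Best": fitness[feasible].min() if feasible.any() else np.nan,
        "Mean": fitness.mean(),
        "Std": fitness.std(ddof=1),
        "MV": violation.mean(),          # mean constraint violation
        "FR": feasible.mean(),           # fraction of feasible runs
    }

# Toy example: 3 runs, one of which violates a constraint.
stats = summarize_runs(fitness=[1.2, 1.1, 1.3], violation=[0.0, 0.0, 0.5])
print(stats["FR"])  # 2 of 3 runs feasible
```

In the paper's protocol these statistics are taken over 30 independent runs per problem, and algorithms are compared first by FR and MV, then by Mean.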

5.1 Tension/compression spring design problem

Designing tension/compression springs is an engineering optimization problem focused on reducing the weight of the spring. There are three design variables to optimize and four constraints to take into account, as shown in Fig 12. The optimization minimizes the weight of the spring by adjusting the values of D (mean coil diameter), d (wire diameter), and N (number of active coils). Using ICOA, the spring reaches an optimal weight of 0.0099657. Table 13 shows that ICOA achieves the best result, demonstrating its superior effectiveness.

Fig 12. Schematic design of the tension/compression spring problem.

https://doi.org/10.1371/journal.pone.0340464.g012

Table 13. Experimental results for the tension/compression spring design problem.

https://doi.org/10.1371/journal.pone.0340464.t013

Consider:

(26)

Minimize:

(27)

Variable range:

(28)

Subject to:

As shown in Table 13, ICOA obtains N = 8.7022 (the remaining design variables are listed in the table), with a minimum weight of 0.0099657, which is 0.28% better than that produced by COA. Compared with the other algorithms, ICOA achieves the best performance by producing the lowest spring weight, indicating its strong optimization capability.
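For reference, the spring problem's objective and constraints can be evaluated as below in their common literature form; since the paper's Eqs (26)-(28) are rendered as images, this formulation is an assumption, and the sample point is a well-known near-optimal design from the literature rather than ICOA's solution.

```python
import numpy as np

def spring(x):
    """Weight and constraints of the tension/compression spring problem
    (common literature formulation; the paper's Eqs 26-28 may differ).
    x = (d, D, N): wire diameter, mean coil diameter, active coils."""
    d, D, N = x
    weight = (N + 2) * D * d**2
    g = np.array([
        1 - D**3 * N / (71785 * d**4),                              # deflection
        (4*D**2 - d*D) / (12566 * (D*d**3 - d**4))
            + 1 / (5108 * d**2) - 1,                                # shear stress
        1 - 140.45 * d / (D**2 * N),                                # surge frequency
        (D + d) / 1.5 - 1,                                          # outer diameter
    ])
    return weight, g  # feasible when all g <= 0

w, g = spring((0.051690, 0.356750, 11.287126))  # well-known literature design
print(round(w, 6))  # ~0.012665
```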

5.2 Pressure vessel design problem

This problem involves designing a cylindrical pressure vessel with hemispherical heads that can withstand a given internal pressure while minimizing the overall cost or weight. There are four key parameters to be optimized: shell thickness (Ts), head thickness (Th), inner radius (R), and cylindrical length (L), as shown in Fig 13.

Consider:

(29)

Minimize:

(30)

Variable range:

(31)

Subject to:

As shown in Table 14, ICOA obtains Ts = 0.8998 and L = 197.1797 (the remaining design variables are listed in the table). The result shows an average weight of 7148.163262, which is 17.86% better than that produced by COA. Among the compared algorithms, ICOA's average weight remains the smallest. Consequently, it can be inferred that ICOA demonstrates excellent optimization capability.
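A sketch of the widely used formulation of this problem (assumed here, since Eqs 29-31 appear as images), evaluated at a well-known literature design rather than ICOA's reported solution:

```python
import math

def vessel(Ts, Th, R, L):
    """Cost objective and constraints of the classic pressure vessel
    problem (common literature formulation; the paper's Eqs 29-31
    may differ in detail)."""
    cost = (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)
    g = [
        -Ts + 0.0193 * R,                                        # shell thickness
        -Th + 0.00954 * R,                                       # head thickness
        -math.pi * R**2 * L - (4/3) * math.pi * R**3 + 1296000,  # volume
        L - 240,                                                 # length limit
    ]
    return cost, g  # feasible when all g <= 0

cost, g = vessel(0.8125, 0.4375, 42.0984, 176.6366)  # well-known design
print(round(cost, 2))  # ~6059.7
```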

Table 14. Experimental results for the pressure vessel design problem.

https://doi.org/10.1371/journal.pone.0340464.t014

5.3 Three-bar truss design problem

This design is a fundamental structural configuration composed of three members arranged in a triangular form, which is widely used in bridges, roofs, and mechanical structures. As shown in Fig 14, the parameters include the dimensions, geometry and connection techniques of the rods to ensure that the structure meets the performance and cost goals within the given constraints.

Fig 14. Schematic design for the three-bar truss problem.

https://doi.org/10.1371/journal.pone.0340464.g014

Consider:

(32)

Minimize:

(33)

Variable range:

Parameter:

As shown in Table 15, ICOA obtains k1 = 0.7880 and k2 = 0.4101, giving a minimum weight of 263.8971695. Compared with the other algorithms, ICOA achieves the lowest minimum weight, showing its strong optimization ability.
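Evaluating the standard three-bar truss objective at ICOA's reported design reproduces the table's weight up to the rounding of the printed variables. The formulation below is the common literature one, assumed to match Eqs (32)-(33), which appear as images in the source.

```python
import math

def truss(x1, x2, l=100.0, P=2.0, sigma=2.0):
    """Weight and stress constraints of the classic three-bar truss
    problem (standard literature formulation, assumed here)."""
    weight = (2 * math.sqrt(2) * x1 + x2) * l
    den = math.sqrt(2) * x1**2 + 2 * x1 * x2
    g = [
        (math.sqrt(2) * x1 + x2) / den * P - sigma,   # bar 1 stress
        x2 / den * P - sigma,                         # bar 2 stress
        1 / (x1 + math.sqrt(2) * x2) * P - sigma,     # bar 3 stress
    ]
    return weight, g  # feasible when all g <= 0

w, g = truss(0.7880, 0.4101)  # ICOA's reported cross-sections (Table 15)
print(round(w, 2))  # ~263.89
```

The small gap to the reported 263.8971695 comes from the rounding of k1 and k2 as printed in the table.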

Table 15. Experimental results for the triple bar truss design problem.

https://doi.org/10.1371/journal.pone.0340464.t015

5.4 Gear train design problem

This problem involves selecting and arranging gears to achieve a desired speed ratio or torque transmission in mechanical systems. Fig 15 shows a gearing schematic with four gears: A, B, D, and F. The goal is to design a gear train that achieves a specified gear ratio while optimizing performance criteria such as size, weight, cost, and durability; smaller or lighter gear trains (tooth counts Ta, Tb, Td, and Tf) reduce material costs and improve the efficiency of the system.

Consider:

(34)

Minimize:

(35)

Variable range:

(36)

As shown in Table 16, ICOA obtains Ta = 49, Tb = 13, Td = 31, and Tf = 57. The minimum objective value is 9.93988E-11, which is 88.8% better than that of COA. Compared with the minimum values obtained by the other algorithms, ICOA yields the smallest, demonstrating its effective optimization performance.
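Evaluating the standard gear train objective at the reported tooth counts reproduces the table's value. The formulation is assumed from the literature, since Eqs (34)-(36) appear as images in the source.

```python
def gear_error(Ta, Tb, Td, Tf):
    """Gear train objective: squared error between the achieved ratio
    Tb*Td/(Ta*Tf) and the target 1/6.931 (standard formulation)."""
    return (1 / 6.931 - (Tb * Td) / (Ta * Tf)) ** 2

f = gear_error(49, 13, 31, 57)  # ICOA's reported teeth counts (Table 16)
print(f)  # ~9.94e-11
```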

Table 16. Experimental results for the gear train design problem.

https://doi.org/10.1371/journal.pone.0340464.t016

5.5 Tubular column design

The design problem involves a uniform tubular column subjected to compressive loads, with the aim of reducing the cost. As shown in Fig 16, it involves two variables: the column's mean diameter (d) and the tube's wall thickness (t). The column consists of a material with known yield strength and elasticity parameters.

Consider:

(37)

Minimize:

(38)

Variable range:

(39)

Subject to:

As shown in Table 17, ICOA obtains d = 5.4519 and t = 0.2922 on the tubular column design problem. The result shows a minimum weight of 26.51741673, which is 0.1% better than that produced by COA. Although ICOA does not surpass DBO and WDO in overall performance on this problem, it still demonstrates a notable advantage in local search and on specific subproblems, indicating effective optimization performance.

Table 17. Experimental results for the tubular column design problem.

https://doi.org/10.1371/journal.pone.0340464.t017

6 Discussion

The proposed Improved Crayfish Optimization Algorithm (ICOA) demonstrates significant advantages in solving engineering optimization problems. By incorporating Sobol sequence initialization, a Lévy flight strategy, and a Euclidean distance–fitness balanced competition mechanism, ICOA effectively addresses the imbalance between exploration and exploitation, thus enhancing global search capability and convergence efficiency. In the CEC2019 and CEC2020 benchmark function tests, ICOA consistently achieved faster convergence and higher solution accuracy across different dimensions, significantly outperforming COA and several other metaheuristic algorithms. Moreover, ICOA also exhibited superior performance in multiple classical engineering optimization problems: in the tension/compression spring, pressure vessel, three-bar truss, gear train, and tubular column design problems, it outperformed COA with improvements of up to 88.8%, confirming its effectiveness and reliability in practical engineering applications.

Despite these promising results, the iterative update mechanism of ICOA still has certain limitations. First, while the combination of the Sobol sequence and Lévy flight enhances population diversity and the ability to escape local optima, it may also cause individuals to become overly dispersed in the search space, leading to instability in early-stage convergence. Second, although the Euclidean distance–fitness balance competition mechanism promotes the coordination between global and local search, its equilibrium is difficult to maintain in high-dimensional complex problems, which can result in either excessive exploration or insufficient exploitation. In addition, ICOA relies on interactive updates throughout the population, which substantially increases computational cost in large-scale problems and may slow down convergence in later iterations.
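The O(N²) cost attributed to the competition mechanism comes from the pairwise distance matrix computed each iteration. A heavily simplified sketch follows; the weighting scheme and score function are illustrative assumptions, not ICOA's actual rule.

```python
import numpy as np
from scipy.spatial.distance import cdist

def balanced_competitor(pop, fitness, w=0.5):
    """Choose one competitor per individual via a score that balances
    normalized Euclidean distance (diversity) against normalized fitness
    (quality). Purely illustrative: the exact score used by ICOA's
    competition phase is not reproduced here."""
    D = cdist(pop, pop)                         # O(N^2) distance matrix
    Dn = D / (D.max() + 1e-12)
    fn = (fitness - fitness.min()) / (np.ptp(fitness) + 1e-12)
    score = w * Dn + (1 - w) * fn[None, :]      # lower score = preferred
    np.fill_diagonal(score, np.inf)             # exclude self-competition
    return score.argmin(axis=1)

rng = np.random.default_rng(0)
pop, fit = rng.random((6, 2)), rng.random(6)
idx = balanced_competitor(pop, fit)
print(idx)  # competitor index chosen for each of the 6 individuals
```

The `cdist` call is the quadratic-cost step discussed above; cluster-based or density-based alternatives would replace this full matrix with a cheaper approximation.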

In future studies, efforts will focus on refining the ICOA update mechanism to improve its scalability and stability in large-scale problems, while also introducing adaptive parameter control to minimize dependence on manual tuning. Additionally, extending the algorithm to address multi-objective, dynamic, and uncertain optimization scenarios will further expand its applicability in complex engineering domains. Moreover, integration with deep learning and large models represents a promising direction, where data-driven learning of update rules and predictive guidance of search processes could significantly enhance the intelligence and cross-domain adaptability of ICOA.

7 Conclusion

To address the drawbacks of existing crayfish optimization algorithms, including slow convergence, susceptibility to local optima, and limited accuracy, an improved Crayfish Optimization Algorithm (ICOA) is proposed. ICOA refines the original COA through three enhanced strategies. In the initialization stage, Sobol sequences are used to promote a more diverse population. In the foraging phase, the algorithm uses the Lévy flight strategy to avoid local optima and encourage broader exploration of the solution space. In the competition phase, the Euclidean distance-fitness balanced competition strategy guides the foraging behavior and position updates of crayfish individuals by exploiting the location and fitness differences between individuals. Experimental results on CEC2019 and CEC2020 confirm the improved efficiency of the algorithm, demonstrating its performance across various test functions. Through rank tests and statistical analyses, we demonstrate the substantial superiority of ICOA over its competitors on both CEC2019 and CEC2020 (with dimensions of 20, 30, and 50). Additionally, the practical significance of ICOA in engineering applications is further highlighted through the analysis of five engineering case studies, whose results clearly confirm ICOA's efficiency and strong performance.

The proposed algorithm still has certain limitations. Although the Sobol sequence is adopted to replace random initialization for improving population uniformity and diversity, its advantage may be less significant in some discontinuous or highly multimodal optimization problems. Future research could consider combining the Sobol sequence with random perturbations or chaotic mappings to further enhance population diversity. In the Lévy flight strategy, parameter settings have a considerable impact on algorithm performance. Therefore, introducing an adaptive Lévy step-size mechanism that dynamically adjusts the step length according to fitness variation or iteration progress could improve stability and convergence efficiency. In addition, while the Euclidean Distance–Fitness Balanced Competition Strategy contributes to maintaining population diversity, it also increases computational complexity, since the distance matrix among all individuals must be computed in each iteration. Future work may explore cluster-based or density estimation-based competition mechanisms to reduce computational cost and improve overall algorithmic stability.
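The Lévy step generation whose parameter sensitivity is noted above is commonly implemented with Mantegna's algorithm, where the stability index β controls tail heaviness. The sketch below uses that standard construction; the paper's exact parameterization is an assumption.

```python
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    """Mantegna's algorithm for Levy-stable step lengths, a common way
    to implement the Levy flight strategy. beta (0 < beta <= 2) is the
    performance-critical tail parameter discussed in the text."""
    rng = rng if rng is not None else np.random.default_rng()
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)   # heavy-tailed numerator
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

steps = levy_step(dim=30, beta=1.5, rng=np.random.default_rng(42))
print(steps.shape)  # (30,)
```

An adaptive variant, as suggested in the text, would make β or a step-scaling factor a function of iteration progress or fitness improvement.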

Future research will aim to extend the applications of ICOA to various fields, including function optimization, image processing, and vehicle scheduling, as well as to develop a multi-objective variant of ICOA, demonstrating its effectiveness in solving multi-objective tasks and providing diverse optimization results.

Supporting information

S1 Dataset. Minimum runnable.

https://doi.org/10.1371/journal.pone.0340464.s001

This folder contains the minimal runnable dataset required to reproduce the core analyses and results presented in this study. The data and scripts are designed to be executed in MATLAB.

S2 Dataset. Raw data.

https://doi.org/10.1371/journal.pone.0340464.s002

This folder contains the original, unprocessed data collected for the experiments. These raw files serve as the basis for all data preprocessing and analysis steps described in the manuscript.

Acknowledgments

The authors would like to thank the support of Fujian Province University Key Laboratory for the Analysis and Application of Industry Big Data, and also the anonymous reviewers and the editor for their careful reviews and constructive suggestions to help us improve the quality of this paper.

References

1. Abualigah L, Yousri D, Abd Elaziz M, Ewees AA, Al-qaness MAA, Gandomi AH. Aquila optimizer: A novel meta-heuristic optimization algorithm. Comput Ind Eng. 2021;157:107250.
2. Sasmal B, Hussien AG, Das A, Dhal KG. A comprehensive survey on aquila optimizer. Arch Comput Methods Eng. 2023;:1–28. pmid:37359742
3. Abualigah L, Elaziz MA, Sumari P, Geem ZW, Gandomi AH. Reptile search algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst Applic. 2022;191:116158.
4. Abualigah L, Habash M, Hanandeh ES, Hussein AM, Shinwan MA, Zitar RA, et al. Improved reptile search algorithm by salp swarm algorithm for medical image segmentation. J Bionic Eng. 2023;:1–25. pmid:36777369
5. Almotairi KH, Abualigah L. Hybrid reptile search algorithm and remora optimization algorithm for optimization tasks and data clustering. Symmetry. 2022;14(3):458.
6. Sasmal B, Hussien AG, Das A, Dhal KG. A comprehensive survey on aquila optimizer. Arch Comput Methods Eng. 2023;:1–28. pmid:37359742
7. Zhong R, Wang Z, Zhang Y, Lian JJ, Yu J, Chen H. Integrating competitive framework into differential evolution: Comprehensive performance analysis and application in brain tumor detection. Appl Soft Comput. 2025;174:112995.
8. Jia H, Zhang W, Zheng R, Wang S, Leng X, Cao N. Ensemble mutation slime mould algorithm with restart mechanism for feature selection. Int J Intell Syst. 2021;37(3):2335–70.
9. Zheng R, Jia H, Abualigah L, Liu Q, Wang S. Deep ensemble of slime mold algorithm and arithmetic optimization algorithm for global optimization. Processes. 2021;9(10):1774.
10. Nouri-Moghaddam B, Ghazanfari M, Fathian M. A novel multi-objective forest optimization algorithm for wrapper feature selection. Expert Syst Applic. 2021;175:114737.
11. Hammami M, Bechikh S, Hung C-C, Ben Said L. A multi-objective hybrid filter-wrapper evolutionary approach for feature selection. Memetic Comp. 2018;11(2):193–208.
12. Lacomme P, Prins C, Sevaux M. A genetic algorithm for a bi-objective capacitated arc routing problem. Comput Oper Res. 2006;33(12):3473–93.
13. Keshavarz Ghorabaee M, Amiri M, Azimi P. Genetic algorithm for solving bi-objective redundancy allocation problem with k-out-of-n subsystems. Appl Math Modell. 2015;39(20):6396–409.
14. Abualigah L, Yousri D, Abd Elaziz M, Ewees AA, Al-qaness MAA, Gandomi AH. Aquila optimizer: A novel meta-heuristic optimization algorithm. Comput Ind Eng. 2021;157:107250.
15. Beyer H-G, Schwefel H-P. Evolution strategies – A comprehensive introduction. Nat Comput. 2002;1(1):3–52.
16. Fogel DB, Fogel LJ. An introduction to evolutionary programming. Lecture Notes in Computer Science. Springer Berlin Heidelberg; 1996. p. 21–33. https://doi.org/10.1007/3-540-61108-8_28
17. Fogel DB. Applying evolutionary programming to selected traveling salesman problems. Cybern Syst. 1993;24(1):27–36.
18. Mirjalili S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl-Based Syst. 2016;96:120–33.
19. Mirjalili S, Mirjalili SM, Hatamlou A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput Applic. 2015;27(2):495–513.
20. Yang XS. Nature-inspired metaheuristic algorithms. Luniver Press; 2010.
21. Rashedi E, Nezamabadi-pour H, Saryazdi S. GSA: A gravitational search algorithm. Inform Sci. 2009;179(13):2232–48.
22. Kirkpatrick S, Gelatt CD Jr, Vecchi MP. Optimization by simulated annealing. Science. 1983;220(4598):671–80. pmid:17813860
23. Deng L, Liu S. Snow ablation optimizer: A novel metaheuristic technique for numerical optimization and engineering design. Expert Syst Applic. 2023;225:120069.
24. Bayzidi H, Talatahari S, Saraee M, Lamarche C-P. Social network search for solving engineering optimization problems. Comput Intell Neurosci. 2021;2021:8548639. pmid:34630556
25. Ma B, Hu Y, Lu P, Liu Y. Running city game optimizer: A game-based metaheuristic optimization algorithm for global optimization. J Comput Design Eng. 2022;10(1):65–107.
26. Kumar M, Kulkarni AJ, Satapathy SC. Socio evolution & learning optimization algorithm: A socio-inspired optimization methodology. Future Generat Comput Syst. 2018;81:252–72.
27. Zhang Y, Jin Z. Group teaching optimization algorithm: A novel metaheuristic method for solving global optimization problems. Expert Syst Applic. 2020;148:113246.
28. Bayraktar Z, Komurcu M, Werner DH. Wind Driven Optimization (WDO): A novel nature-inspired optimization algorithm and its application to electromagnetics. In: 2010 IEEE Antennas and Propagation Society International Symposium; 2010. p. 1–4. https://doi.org/10.1109/aps.2010.5562213
29. Agushaka JO, Ezugwu AE, Abualigah L. Dwarf mongoose optimization algorithm. Comput Methods Appl Mech Eng. 2022;391:114570.
30. Mirjalili S, Lewis A. The whale optimization algorithm. Adv Eng Softw. 2016;95:51–67.
31. Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H. Harris hawks optimization: Algorithm and applications. Future Generat Comput Syst. 2019;97:849–72.
32. Kennedy J, Eberhart R. Particle swarm optimization. In: Proceedings of ICNN'95 – International Conference on Neural Networks; 1995. p. 1942–8. https://doi.org/10.1109/icnn.1995.488968
33. Seyyedabbasi A, Kiani F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng Comput. 2022;39(4):2627–51.
34. Jia H, Rao H, Wen C, Mirjalili S. Crayfish optimization algorithm. Artif Intell Rev. 2023;56(S2):1919–79.
35. Sasmal B, Hussien AG, Das A, Dhal KG, Saha R. Reptile search algorithm: Theory, variants, applications, and performance evaluation. Arch Computat Methods Eng. 2023;31(1):521–49.
36. Peraza-Vázquez H, Peña-Delgado A, Merino-Treviño M, Morales-Cepeda AB, Sinha N. A novel metaheuristic inspired by horned lizard defense tactics. Artif Intell Rev. 2024;57(3).
37. Xiao Y, Cui H, Khurma RA, Castillo PA. Artificial lemming algorithm: A novel bionic meta-heuristic technique for solving real-world engineering optimization problems. Artif Intell Rev. 2025;58(3).
38. Harifi S, Mohammadzadeh J, Khalilian M, Ebrahimnejad S. Giza pyramids construction: An ancient-inspired metaheuristic algorithm for optimization. Evol Intel. 2020;14(4):1743–61.
39. El-Kenawy ESM, Abdelhamid AA, Ibrahim A, Mirjalili S, Khodadad N, Alduailij MA. Al-Biruni earth radius (BER) metaheuristic search optimization algorithm. Comput Syst Sci Eng. 2023;45(2):1917–34.
40. Irizarry R. LARES: An artificial chemical process approach for optimization. Evol Comput. 2004;12(4):435–59. pmid:15768524
41. Talatahari S, Azizi M, Tolouei M, Talatahari B, Sareh P. Crystal structure algorithm (CryStAl): A metaheuristic optimization method. IEEE Access. 2021;9:71244–61.
42. Abdelhamid AA, Towfek SK, Khodadadi N, Alhussan AA, Khafaga DS, Eid MM, et al. Waterwheel plant algorithm: A novel metaheuristic optimization method. Processes. 2023;11(5):1502.
43. Pan J-S, Zhang S-Q, Chu S-C, Yang H-M, Yan B. Willow Catkin optimization algorithm applied in the TDOA-FDOA joint location problem. Entropy (Basel). 2023;25(1):171. pmid:36673312
44. Kim JH. Harmony search algorithm: A unique music-inspired algorithm. Proced Eng. 2016;154:1401–5.
45. Kaveh A, Talatahari S, Khodadadi N. Stochastic paint optimizer: Theory and application in civil engineering. Eng Comput. 2020;38(3):1921–52.
46. Ma B, Hu Y, Lu P, Liu Y. Running city game optimizer: A game-based metaheuristic optimization algorithm for global optimization. J Comput Design Eng. 2022;10(1):65–107.
47. Yuan Y, Ren J, Wang S, Wang Z, Mu X, Zhao W. Alpine skiing optimization: A new bio-inspired optimization algorithm. Adv Eng Softw. 2022;170:103158.
48. Salimi H. Stochastic fractal search: A powerful metaheuristic algorithm. Knowl-Based Syst. 2015;75:1–18.
49. Abualigah L, Diabat A, Mirjalili S, Abd Elaziz M, Gandomi AH. The arithmetic optimization algorithm. Comput Meth Appl Mech Eng. 2021;376:113609.
50. Yu C, Lahrichi N, Matta A. Optimal budget allocation policy for tabu search in stochastic simulation optimization. Comput Oper Res. 2023;150:106046.
51. Mladenović N, Hansen P. Variable neighborhood search. Comput Oper Res. 1997;24(11):1097–100.
52. Sedak M, Rosić B. Multi-objective optimization of planetary gearbox with adaptive hybrid particle swarm differential evolution algorithm. Appl Sci. 2021;11(3):1107.
53. Singhal S. A hybrid approach combining ant colony optimization and simulated annealing for cloud resource scheduling. J Comput Syst Appl. 2025;2(2):17–32.
54. Ouyang C, Liu X, Zhu D, Li Y, Mao J, Zhou C, et al. Hierarchical adaptive cuckoo search algorithm for global optimization. Cluster Comput. 2025;28(5).
55. Chauhan S, Vashishtha G, Gupta MK, Korkmaz ME, Demirsöz R, Noman K, et al. Parallel structure of crayfish optimization with arithmetic optimization for classifying the friction behaviour of Ti-6Al-4V alloy for complex machinery applications. Knowl-Based Syst. 2024;286:111389.
56. Wang R, Zhang S, Zou G. An improved multi-strategy crayfish optimization algorithm for solving numerical optimization problems. Biomimetics (Basel). 2024;9(6):361. pmid:38921241
57. Shikoun NH, Al-Eraqi AS, Fathi IS. BinCOA: An efficient binary crayfish optimization algorithm for feature selection. IEEE Access. 2024;12:28621–35.
58. Zhang Y, Liu P, Li Y. Implementation of an enhanced crayfish optimization algorithm. Biomimetics (Basel). 2024;9(6):341. pmid:38921221
59. Chaib L, Tadj M, Choucha A, Khemili FZ, EL-Fergany A. Improved crayfish optimization algorithm for parameters estimation of photovoltaic models. Energy Convers Manag. 2024;313:118627.
60. Jia H, Zhou X, Zhang J, Abualigah L, Yildiz AR, Hussien AG. Modified crayfish optimization algorithm for solving multiple engineering application problems. Artif Intell Rev. 2024;57(5).
61. Vasantbhai Patel V, Yadav AR, Jain P, Cenkeramaddi LR. A systematic kidney tumour segmentation and classification framework using adaptive and attentive-based deep learning networks with improved crayfish optimization algorithm. IEEE Access. 2024;12:85635–60.
62. Xiao Y, Cui H, Khurma RA, Hussien AG, Castillo PA. MCOA: A multistrategy collaborative enhanced crayfish optimization algorithm for engineering design and UAV path planning. Int J Intell Syst. 2025;2025(1).
63. Wang W, Tian W, Xu D, Zang H. Arctic puffin optimization: A bio-inspired metaheuristic algorithm for solving engineering design optimization. Adv Eng Softw. 2024;195:103694.
64. Xue J, Shen B. Dung beetle optimizer: A new meta-heuristic algorithm for global optimization. J Supercomput. 2022;79(7):7305–36.
65. Wang J, Wang W, Hu X, Qiu L, Zang H. Black-winged kite algorithm: A nature-inspired meta-heuristic for solving benchmark functions and engineering problems. Artif Intell Rev. 2024;57(4).