
MOSRS: An engineering multi-objective optimization through Einsteinian concept

Abstract

Multi-objective optimization stands at the intersection of mathematics, engineering, and decision-making, and metaheuristics offer a promising avenue for tackling such challenges. The literature shows that metaheuristics are particularly well suited to these problems, and there is room for new algorithms that deliver Pareto Fronts (PFs) with better convergence and coverage at lower computational cost. This paper presents the Multi-objective Special Relativity Search (MOSRS) for the first time. It relies on principles inspired by the theory of special relativity, iteratively refining solutions toward optimality and self-adapting its parameters using these laws. Unlike most algorithms in the literature today, the user sets only the number of iterations and particles (or population size). To test its performance, MOSRS is applied to the most challenging set of test functions (CEC 2009) and 21 constrained real-world problems, and compared with a total of eleven metaheuristics: NSGA-II, NSGA-III, MOEA/D, MOPSO, MOGWO, ARMOEA, TiGE2, CCMO, ToP, MOSFO, and AnD. Inverted Generational Distance, Spacing, Maximum Spread, and Hypervolume are used to identify the best algorithm. MOSRS was robust in finding the best PF in most of the studied problems. The source code of the MOSRS algorithm is publicly available at https://nimakhodadadi.com/algorithms-%2B-codes.

1. Introduction

Structural optimization is crucial in various fields, such as engineering, architecture, and design, as it aims to find the best configuration of materials and shapes to achieve desired performance criteria while minimizing costs or other constraints. Metaheuristic algorithms play a significant role in this optimization process because they can efficiently explore large solution spaces and find near-optimal solutions in complex and often nonlinear problems. Particle Swarm Optimization (PSO) is one of the most popular algorithms developed to solve engineering problems, and it is inspired by the social behavior of bird flocking or fish schooling. It iteratively updates the position and velocity of particles in a multidimensional search space, guided by their own best-known position and the best-known position found by the entire swarm, aiming to efficiently explore and exploit the search space for optimal solutions [1].

Goodarzimehr et al. [2,3] presented a novel optimization technique inspired by the principles of Einstein’s theory of special relativity. The Special Relativity Search (SRS) algorithm leverages the concept of relativistic velocity addition to update particle positions and velocities in the search space. By incorporating relativistic effects, such as time dilation and length contraction, the SRS algorithm enhances the exploration-exploitation trade-off, enabling efficient and effective search for optimal solutions in complex optimization problems. Pereira et al. [4] introduced the Lichtenberg Algorithm (LA), a computational method inspired by the intricate branching patterns in electrical discharges. It simulates the growth of these patterns by iteratively applying probabilistic rules to determine the trajectory of discharges, yielding visually striking and complex fractal-like structures.

Scientists have developed various paradigms, especially for engineering problems, that leverage the rich advantages of metaheuristic methods, including the artificial hummingbird algorithm [5], stochastic paint optimizer [6], starling murmuration optimizer [7], advanced swarm algorithm [8], mountain gazelle optimizer [9], improved chaos game optimization algorithm [10], Lichtenberg optimization algorithm [11], enhanced special relativity search algorithm [12], improved harmony search algorithm [13], modified adolescent identity search algorithm [14], greylag goose optimization [15], hybrid symmetry–PSO approach [16], electric eel foraging optimization [17], hippopotamus optimization algorithm [18], and puma optimizer [19], among others. While all these methods are typically designed for optimizing single-objective problems, this study delves into investigating more challenging problems.

At its core, multi-objective optimization involves the exploration of a solution space to identify a set of solutions that are not dominated by any other solution, according to defined criteria. These solutions form what is known as the Pareto frontier or Pareto front, representing the trade-offs between conflicting objectives [20]. Each point on this frontier represents a feasible solution where improvement in one objective comes at the expense of another. The development of a metaheuristic algorithm for multi-objective optimization typically involves several key stages:

  • Problem Understanding: The first step is thoroughly understanding the multi-objective optimization problem, including its objectives, constraints, and specific characteristics that may influence the solution space.
  • Algorithm Design: Designing metaheuristic algorithms entails defining the search strategy, exploration, and exploitation mechanisms, as well as determining how solutions are represented and evaluated. This phase often draws inspiration from nature, such as evolutionary processes, swarm behavior, or physical phenomena.
  • Implementation: Once the algorithm design is in place, the next step is implementing it in code. This involves translating the conceptual framework into a functional program capable of iteratively generating and evaluating solutions.
  • Parameter Tuning: Metaheuristic algorithms typically have parameters influencing their behavior and performance. Fine-tuning these parameters through experimentation and optimization is crucial for achieving good results across different problem instances.
  • Validation and Testing: Once implemented, the algorithm must undergo rigorous testing and validation to ensure its effectiveness and robustness. This involves benchmarking against known problem instances, comparing against existing algorithms, and assessing performance on real-world applications.
  • Optimization and Enhancement: Continuous optimization and enhancement are essential for algorithm development. This may involve refining the search strategy, incorporating problem-specific knowledge, or adapting the algorithm to handle different problem types or constraints.

Building upon the abovementioned stages, scientists have developed various multi-objective methods to address engineering problems: Baril et al. [21] explored integrating Six Sigma methodologies with collaborative multi-objective optimization techniques to enhance product design processes. They highlight the synergies between these approaches and their potential to achieve simultaneous optimization of multiple conflicting objectives in complex engineering systems. Branke et al. [22] introduced a novel method that delves into strategies for providing direction and assistance to evolutionary algorithms when navigating complex multi-objective optimization problems. It explores various techniques to enhance convergence and diversity in the search process, ultimately aiding decision-makers in efficiently discussing and understanding the trade-offs among conflicting objectives. Coello et al. [23,24] developed different paradigms of multi-objective approaches for addressing engineering problems.

Cohon et al. [25] meticulously scrutinize diverse methodologies used in multi-objective programming, offering insights into their efficacy and practicality. This comprehensive analysis is a valuable resource for researchers and practitioners seeking to navigate the complexities of multi-objective optimization. Emmerich et al. [26] explore the foundational principles and evolutionary techniques employed in multi-objective optimization. This comprehensive guide equips readers with essential knowledge and practical insights for tackling complex optimization problems with conflicting objectives.

Fan et al. [27] investigate the optimal selection of heat pump technologies within micro-CHP systems, balancing multiple criteria for efficient energy utilization. This study offers valuable insights into enhancing the performance and sustainability of fuel cell-based micro-CHP systems through rigorous multi-objective optimization techniques. Fonseca et al. [28] comprehensively examine genetic algorithms tailored for multi-objective optimization problems. This paper delves into their formulation, discussion, and generalization, offering valuable insights into their applicability and efficiency across diverse optimization tasks. Gomes et al. have extensively investigated the application of metaheuristic algorithms for solving various complex multi-objective optimization problems [29,30].

Francisco et al. [31] explore the optimization of CRP Isogrid tubes through a multi-objective approach, leveraging the SunFlower optimization algorithm and metamodel-based techniques. This study offers valuable insights into enhancing composite materials’ design efficiency and performance in aerospace applications through multi-objective optimization methodologies. Jian et al. [32] introduce a novel evolutionary algorithm for optimizing many objectives, employing a reference-point-based nondominated sorting approach. This paper contributes to advancing multi-objective optimization techniques, offering a promising method for efficiently handling complex optimization problems with multiple conflicting objectives. Kumar et al. [33] present a comprehensive collection of real-world constrained multi-objective optimization problems and baseline performance results. This benchmark suite is a valuable resource for evaluating and comparing the efficacy of optimization algorithms in tackling complex real-world problems with multiple constraints and objectives.

Several other applications of multi-objective metaheuristics to structural problems can be found in the literature, consistently showing that the most common and modern preference is to apply the metaheuristic first to find all possible non-dominated solutions, so that the PF achieves high convergence and coverage at the same time; the decision maker then analyzes these solutions and chooses the best one according to their preferences, although, in general terms, all the solutions in the PF initially have equal importance [34–43].

This study will use the SRS algorithm, which has demonstrated excellent performance in its single-objective version, to construct a new multi-objective metaheuristic. It is inspired by the principles of special relativity and employs relativistic velocity addition to update particle positions and velocities in the search space efficiently. It also self-adapts its parameters using these laws. The user sets only the number of iterations and particles (or population size) as stopping criteria, making it one of the few algorithms in the literature free of specific hyperparameters. The first step of the Multi-objective Special Relativity Search (MOSRS) algorithm is to define the objectives and constraints of the multi-objective optimization problem. These objectives may represent different performance metrics or criteria to be optimized simultaneously, and at least two of them must conflict. Next, a population of solutions, or particles, is initialized within the search space. Then, during each iteration of the MOSRS algorithm, the positions and velocities of particles are updated based on their own best-known positions and the best-known positions found by the entire cluster (or population). These best positions are represented in the objective space as all current non-dominated solutions.

The relativistic velocity addition mechanism allows particles to explore the search space dynamically, considering the relative importance of objectives and constraints. Additionally, the MOSRS algorithm offers a unique advantage in handling dynamic environments or changing objectives. By continuously adapting particle velocities based on relativistic principles, the algorithm can dynamically adjust to changes in the optimization landscape, ensuring robust performance.

To validate the performance of the new MOSRS algorithm, the CEC 2009 [44] and CEC 2021 [33] test suites will be used. The former comprises the most challenging and most widely used test functions in the literature [34,45,46], and on it MOSRS will be compared with the most used algorithms: NSGA-II, MOEA/D, MOPSO, and MOGWO. Inverted Generational Distance (IGD), Maximum Spread (MS), and Spacing (SP) are used to assess performance because the True Pareto fronts (TPF) are available. Furthermore, 21 complex structural optimization problems from CEC 2021 are investigated. The proposed method is compared against seven recent methods: ARMOEA [47], TiGE2 [48], NSGA-III [32], CCMO [49], ToP [50], MOSFO [40], and AnD [51], using the Hypervolume (HV) metric due to the lack of available TPFs.

1.1. Related works

In recent structural optimization research, various bio-inspired and nature-inspired multi-objective algorithms have been introduced to address the complexity of truss design problems. The thermal exchange optimization was employed for multi-objective optimization, demonstrating improved convergence and solution diversity in truss structures [52]. Tejani et al. [53] applied a 2-archive multi-objective cuckoo search algorithm, effectively maintaining Pareto diversity while optimizing structural performance metrics. Similarly, the MOBBO algorithm, modeled after the behavior of brown bears, provided competitive results in constrained environments, indicating its potential as a viable alternative to classical methods.

Further enhancements have been explored through hybrid and archive-based mechanisms. The two-archive boosted MOHO algorithm was successfully applied to truss optimization, offering improved search capability and robustness compared to baseline models [54]. In parallel, the application of MOHO from a multi-objective perspective has also shown efficiency in generating well-distributed solutions. Multi-objective truss optimization has not been studied as extensively as its single-objective counterpart. To address this gap, the novel Multi-objective Lichtenberg Algorithm with Two Archives (MOLA-2arc) was developed by Panagant et al. Its performance has been benchmarked against eight other multi-objective optimization algorithms, demonstrating its potential in handling complex structural design problems. A comprehensive analysis of many-objective metaheuristic methods emphasized the effectiveness of decomposition-based strategies and diverse search operators in high-dimensional objective spaces. The improved heat transfer search and its decomposition-based variant have also shown promise in handling constrained structural problems with conflicting objectives [55]. The Multi-objective Generalized Normal Distribution Optimization (MOGNDO) algorithm was introduced in [56], an extension of the GNDO originally designed for single-objective problems. MOGNDO enhances the original approach by incorporating an archive to store non-dominated Pareto-optimal solutions and a new leader selection mechanism to guide the search. These improvements enable the algorithm to effectively balance convergence and diversity in solving multi-objective optimization tasks. These studies [52]–[56] form a strong foundation for advancing intelligent optimization techniques in structural engineering applications.

Complementing the above developments, several recent studies have focused on advancing many-objective optimization techniques tailored for complex engineering problems. Kalita et al. introduced the Many-Objective Multi-Verse Optimizer (MaOMVO), which integrates gravitational memory mechanisms to effectively handle convergence and diversity trade-offs in high-dimensional objective spaces [57]. Further, the MORKO algorithm, inspired by the Runge–Kutta numerical method, was proposed to tackle multi-domain optimization tasks and demonstrated strong generalization across varied benchmark functions [58]. Additional efforts include the development of the MaOGOA and the Many-Objective Whale Optimization Algorithm, both of which enhance solution distribution and convergence speed in large-scale, many-objective design problems [59] and [60]. Collectively, these approaches underscore the growing emphasis on scalable, general-purpose optimization frameworks capable of addressing the rising complexity in engineering design applications.

In parallel with these algorithmic advancements, efforts have also been directed toward enhancing the formulation and refinement of multi-objective optimization frameworks for structural and civil engineering applications. Abdel-Basset et al. proposed a balanced multi-objective optimization strategy based on improvement-guided reference points, effectively addressing the convergence-stagnation issues commonly observed in multi-criteria problems [61]. More recently, MORIME, a robust multi-objective RIME framework, has been introduced specifically for efficient truss structure optimization, showing promising results in balancing structural efficiency and computational cost [62]. In domain-specific implementations, Mashru et al. [63] have applied various metaheuristics, ranging from many-objective approaches on complex spatial truss domes to thermal exchange optimization in structural systems, demonstrating their adaptability and effectiveness. In [64], the Multi-objective Stochastic Paint Optimizer (MOSPO) extends the original single-objective SPO by incorporating key features such as a fixed-sized external archive and a leader selection mechanism for effective multi-objective optimization. The proposed MOSPO was evaluated on ten benchmark functions (CEC-09) and eight engineering design problems, demonstrating superior accuracy and solution uniformity compared to other established methods.

Furthermore, application-oriented innovations such as the improved multi-objective particle swarm optimization algorithm for geotechnical projects highlight the growing relevance of these techniques in practical civil infrastructure scenarios [65]. These contributions collectively reinforce the trend toward more versatile and context-aware optimization models in contemporary engineering practice.

The paper is organized as follows: Section 2 presents the MOSRS. Section 3 describes the studied complex multi-objective problems, Section 4 presents the results and discussion, and Section 5 concludes with the main highlights.

2. Multi-objective Special Relativity Search

MOSRS explores the search space with the population acting as charged particles, and to understand how it works, we need to dive into physics. When moving through a magnetic field, a charged particle, such as an electron or a proton, experiences a force perpendicular to both its velocity vector and the magnetic field lines. The Lorentz force law describes this force. For a positively charged particle, the direction of the force follows the right-hand rule; for a negatively charged particle, the direction of the force is reversed [2].

The Lorentz force describes the force experienced by a charged particle moving through an electromagnetic field. It is given by:

F = QE + Q(v × B)  (1)

B = (μ / 4π) Q (v × r̂) / r²  (2)

where F is the Lorentz force vector, Q is the charge of the particle, E is the electric field vector, v is the velocity vector of the charged particle, r is the separation distance between two particles, r̂ is the corresponding unit vector, μ is the magnetic constant, and B is the magnetic field vector.

This equation breaks down into two components:

  1. a) The first component, QE, represents the force experienced by the charged particle due to the electric field. This force is directly proportional to the particle’s charge and the electric field’s strength.
  2. b) The second component, Q(v × B), represents the force experienced by the charged particle due to the magnetic field. It is proportional to both the charge of the particle and its velocity, as well as the strength and direction of the magnetic field.

In the magnetic field, the particles are moving at velocity v, but in the electric field, the particles don’t move, and their velocity is zero. Therefore, we neglect the electric term and rewrite Eq. (1) as follows:

F = Q(v × B)  (3)

The direction of the magnetic force can be determined using the right-hand rule (Fig 1). If you point your thumb in the direction of the velocity of the charged particle and your fingers in the direction of the magnetic field, then the direction in which your palm faces gives the direction of the force.

As a result of this force, charged particles tend to move in circular or helical paths (spirals) if their initial motion has a component perpendicular to the magnetic field lines (Fig 2). The radius of this circular motion depends on the mass, charge, velocity, and strength of the magnetic field. The radius is defined using Eq. (4) and depicted in Fig 2.

r = mv / (QB)  (4)

The magnitude of the force experienced by the charged particle is proportional to its velocity and the strength of the magnetic field. Mathematically, this relationship is given by Eq. (3). This interaction between charged particles and magnetic fields is fundamental to various phenomena, such as the operation of electric motors, the behavior of particles in particle accelerators like cyclotrons, and the formation of auroras in Earth’s atmosphere.
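As a concrete illustration of the magnetic force and the resulting circular motion (Eqs. (3) and (4)), the sketch below evaluates both with NumPy; the function names and numerical values are ours and purely illustrative.

```python
import numpy as np

def magnetic_force(Q, v, B):
    """Eq. (3): F = Q (v x B), the magnetic part of the Lorentz force."""
    return Q * np.cross(v, B)

def gyration_radius(m, speed, Q, B_mag):
    """Eq. (4): radius of the circular motion, r = m v / (Q B)."""
    return m * speed / (Q * B_mag)

# A positive charge moving along +x in a field along +z is pushed along -y
# (right-hand rule).
F = magnetic_force(Q=1.0, v=np.array([2.0, 0.0, 0.0]), B=np.array([0.0, 0.0, 3.0]))
print(F)                                     # [ 0. -6.  0.]
print(gyration_radius(2.0, 3.0, 1.0, 3.0))   # 2.0
```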

The particles in a magnetic field have high speed and negligible mass. For this reason, Newtonian physics cannot mathematically represent the behavior of charged particles in a magnetic field, and special relativity is a good fit for simulating this kind of behavior. In relativity, the speed of particles is measured against the speed of light, and by applying two significant phenomena, time dilation and length contraction, the accuracy of the equations is improved [66].

In this sense, in the 19th century, following the realization that the classical theory of electromagnetism was insufficient, there emerged a demand for a more comprehensive theory of relativity in physics. During this era, experimental findings regarding light propagation, particularly concerning the impact of an observer’s motion relative to the medium through which light traveled, challenged the prevailing beliefs of the time. To develop a description of light’s behavior consistent with these observations, the scientific community had to embrace the law proposed by Lorentz. This law established a connection between the coordinates of axes when in uniform relative motion, leading to what is now known as the Lorentz transformations. In essence, it can be stated that two fundamental principles govern the accurate transformation of coordinates in inertial frames according to relativity principles.

  1. In every inertial frame, light moves uniformly in all directions at a consistent speed, denoted as ‘c’.
  2. All reference frames, regardless of their initial state, are equally capable of accurately describing physical phenomena.

Applying the principles mentioned above allows for the initial derivation of the Lorentz transformations. Thus, considering two orthogonal coordinate frames, E1 and E2, moving with a constant relative speed U along their X axis, the transformation between the coordinates of an event in the first frame, represented as x1, y1, z1, t1, and the coordinates of the same event in the second frame, denoted as x2, y2, z2, and t2, can be expressed through the Lorentz transformations. These transformations describe the relation between the coordinates of the event when transitioning from one frame to another and are outlined as follows.

y₂ = y₁  (5)

z₂ = z₁  (6)

x₂ = γ (x₁ − U t₁)  (7)

t₂ = γ (t₁ − U x₁ / c²)  (8)

where γ = 1/√(1 − U²/c²) is the Lorentz factor.

Eq. (7) represents a pivotal step in the MOSRS algorithm. It incorporates both the initial and destination positions, velocity, and direction. It’s evident that the equation takes on a distinct form, mainly due to coordinates being measured relative to the speed of light.
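A minimal numeric check of the x–t pair of the Lorentz transformations outlined above, with the Lorentz factor γ = 1/√(1 − U²/c²); the function and symbol names are ours.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_transform(x1, t1, U, c=C):
    """Transform an event (x1, t1) from frame E1 to frame E2 moving at speed U along x."""
    gamma = 1.0 / math.sqrt(1.0 - (U / c) ** 2)  # Lorentz factor
    x2 = gamma * (x1 - U * t1)
    t2 = gamma * (t1 - U * x1 / c ** 2)
    return x2, t2

# At U = 0 the frames coincide and the transformation is the identity.
print(lorentz_transform(5.0, 2.0, 0.0))          # (5.0, 2.0)
# At U = 0.6c, gamma = 1.25: one second in E1 is dilated to 1.25 s in E2.
print(lorentz_transform(0.0, 1.0, 0.6 * C)[1])   # 1.25
```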

2.1. Mathematical simulation of the MOSRS algorithm

Expanding the single-objective framework of the Special Relativity Search (SRS) algorithm to tackle multiple conflicting objectives results in a multi-objective version. This is done by integrating ideas of multi-objective optimization like Pareto dominance and preservation of diversity. In multi-objective optimization, multiple objectives are optimized concurrently rather than using a single fitness function. The main ideas to present are:

  • Pareto dominance: A solution y1 dominates another solution y2 when y1 is equal to or better than y2 in all objectives and superior in at least one.
  • Pareto front: A set of non-dominated solutions that are considered optimal.
  • Diversity maintenance: Ensures that there are diverse solutions along the Pareto front.
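The first two concepts above can be written directly in code. A minimal sketch for minimization problems (function names are ours):

```python
import numpy as np

def dominates(y1, y2):
    """y1 dominates y2 (minimization): no worse in all objectives, better in at least one."""
    return np.all(y1 <= y2) and np.any(y1 < y2)

def non_dominated(F):
    """Return the indices of the non-dominated rows of an (n_points, n_obj) array."""
    keep = []
    for i, yi in enumerate(F):
        if not any(dominates(F[j], yi) for j in range(len(F)) if j != i):
            keep.append(i)
    return keep

F = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(non_dominated(F))  # [0, 1, 3] -- point (3, 3) is dominated by (2, 2)
```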

Initially, the multi-objective process commences with a randomly created population of particles, each symbolizing a possible solution. The particles will be evaluated based on multiple objective functions. The particles are organized by Pareto dominance to detect fronts that are not dominated. Every particle is ranked according to its present location, and the vector of the best points (ideal point) and the worst (nadir point) is used to calculate the particle’s charge.

The velocity and position of particles need to be updated, considering time dilation and length contraction. The updates must be guided by both the current position and the non-dominated solutions. Particles are selected for the next generation based on rank and diversity measures. Selection and replacement are performed as follows:

  • Combine the current population X and the new population X′ (generated from the updated positions) into a combined population Xc = X ∪ X′.
  • Perform non-dominated sorting on Xc and select the top N particles based on Pareto fronts and diversity measures to form the next generation.

Repeat the evaluation, sorting, updating, and selection until a stopping criterion is met. The final set of non-dominated solutions forms the Pareto front, representing the optimal trade-offs among the objectives.
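The combine-sort-select loop described above can be sketched as follows, assuming minimization; the truncation of the last front is simplified here (MOSRS would use a diversity measure instead), and all names are ours.

```python
import numpy as np

def fast_nondominated_sort(F):
    """Split objective vectors into successive non-dominated fronts (minimization)."""
    n = len(F)
    dominated_count = np.zeros(n, dtype=int)   # how many points dominate point i
    dominates_list = [[] for _ in range(n)]    # points that point i dominates
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                dominates_list[i].append(j)
            elif np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                dominated_count[i] += 1
    fronts, current = [], [i for i in range(n) if dominated_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominates_list[i]:
                dominated_count[j] -= 1
                if dominated_count[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

def environmental_selection(F, N):
    """Pick N survivors front by front; the last front is truncated in order here."""
    survivors = []
    for front in fast_nondominated_sort(F):
        if len(survivors) + len(front) <= N:
            survivors.extend(front)
        else:
            survivors.extend(front[: N - len(survivors)])
            break
    return survivors
```

For example, with five objective vectors of which one is dominated once and one is dominated several times, the sort yields three fronts and the selection keeps the best four points.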

A magnetic field is conceptualized as the feasible search space, where all optimal solutions within this space are deemed acceptable. The boundaries of the search space are defined as side constraints. The particles exert forces on each other, moving towards optimal solutions. The optimization process begins with a random solution. Subsequently, the search process continues until the stop criterion is met. In the MOSRS algorithm, the stop criterion is determined by two different loops: in the first loop, the stop criterion is the population size, while in the second loop, it is the maximum number of iterations. The initial population is defined using Eq. (9). The fitness of the solutions is evaluated in each iteration according to Eq. (10).

x_ij = LB_j + rand · (UB_j − LB_j)  (9)

(10)

where xij is the vector of possible optimal solutions, LB and UB are the lower and upper bounds, respectively, and rand is a stochastic number in [0,1]. The distance and charge of the particles are calculated using Eq. (11) and Eq. (12), respectively.
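Eq. (9) corresponds to the standard uniform initialization within the side constraints; in code (NumPy, function name ours):

```python
import numpy as np

def initialize_population(n_particles, LB, UB, rng=None):
    """Eq. (9): x_ij = LB_j + rand * (UB_j - LB_j), with rand ~ U[0, 1]."""
    rng = np.random.default_rng(rng)
    LB, UB = np.asarray(LB, float), np.asarray(UB, float)
    return LB + rng.random((n_particles, LB.size)) * (UB - LB)

X = initialize_population(5, LB=[-10.0, 0.0], UB=[10.0, 1.0], rng=42)
print(X.shape)  # (5, 2)
```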

(11)(12)

where fit is the objective function value for the kth objective at iteration i, and best and worst are the best and worst values of these objectives according to the current Pareto front, i.e., the ideal and nadir vector points. Eq. (13) is applied to determine the position of the particle. This equation depends on the cyclotron frequency, which is calculated using Eq. (14).
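The charge computation normalizes each objective value between the ideal (best) and nadir (worst) vectors of the current Pareto front. Since the body of Eq. (12) is not reproduced here, the sketch below shows one plausible reading of that description; the aggregation of the normalized objectives into a single scalar charge is our assumption.

```python
import numpy as np

def particle_charge(fit, best, worst):
    """Hypothetical charge: mean of the objective values normalized between the
    ideal (best) and nadir (worst) points of the current Pareto front."""
    fit, best, worst = (np.asarray(a, float) for a in (fit, best, worst))
    normalized = (fit - best) / (worst - best)   # 0 at the ideal point, 1 at the nadir
    return normalized.mean()

# A particle halfway between the ideal and nadir points in both objectives.
print(particle_charge(fit=[0.5, 0.5], best=[0.0, 0.0], worst=[1.0, 1.0]))  # 0.5
```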

(13)

ω = QB / m  (14)

Then, the main equation of step length of the MOSRS algorithm is defined using the relativistic Eq. (15).

(15)

where β represents the relativistic parameter, the ratio of the particle's speed to the speed of light (approximately 3 × 10⁸ meters per second). In the standard SRS, β is modeled using a stochastic operator; here, it is assigned exact values. Note that MOSRS self-adapts its parameters accordingly, so only the number of iterations and particles governs its routine.

Each particle in the search space generates a vector of solutions in the objective space. At each iteration, the new solutions are integrated with the current Pareto front and analyzed through the Pareto dominance relationship, which excludes the dominated solutions. Our method keeps only Ns solutions in the Pareto front to reduce computational cost; the criterion for discarding a solution is how close it is to its neighbors, measured by Euclidean distance. Also, our proposed method is equipped with a "death penalty" function, which penalizes solutions that violate equality or inequality constraints. The flowchart of the Multi-Objective Special Relativity Search (MOSRS) algorithm is presented in Fig 3 to clarify the main steps and decision mechanisms. Table 1 presents its pseudocode.
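The archive truncation described above (keep only Ns solutions, discarding those closest to one another) can be sketched as follows; the greedy removal order and function name are our assumptions.

```python
import numpy as np

def truncate_archive(F, Ns):
    """Repeatedly drop the archive member closest to another member until Ns remain."""
    F = np.asarray(F, float)
    keep = list(range(len(F)))
    while len(keep) > Ns:
        D = np.linalg.norm(F[keep][:, None, :] - F[keep][None, :, :], axis=2)
        np.fill_diagonal(D, np.inf)
        # remove one endpoint of the currently closest pair
        i, _ = np.unravel_index(np.argmin(D), D.shape)
        keep.pop(i)
    return keep

# Two nearly coincident points: the truncation removes one of them first.
F = [[0.0, 1.0], [0.05, 0.95], [0.5, 0.5], [1.0, 0.0]]
print(truncate_archive(F, 3))  # [1, 2, 3]
```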

3. Test problems

MOSRS will be tested through two CEC test functions: CEC 2009 for complex and general multi-objective optimization problems and CEC 2021 for real-world design problems.

3.1. CEC 2009 test functions

This test suite was proposed by Zhang et al. [44] and is considered by many authors to comprise the most complex test functions for multi-objective problems to date; those used here are presented in Table 2. On these functions, MOSRS will be compared against the algorithms most used in the literature today. These algorithms, together with the hyperparameters recommended for them by other studies, are listed in Table 3. Note that MOSRS does not have specific hyperparameters.

Three metrics, inverted generational distance (IGD), maximum spread (MS), and spacing (SP), will be used to evaluate the performance of these algorithms.

Inverted Generational Distance (IGD): This performance metric evaluates the quality of the approximate sets generated by multi-objective optimization algorithms. It calculates the distance between the points of the approximate set and the points of an ideal reference set. Specifically, IGD measures the average distance from each point in the reference set to the nearest point in the approximate set. A lower IGD value indicates that the approximate set is closer to the reference set, reflecting better performance of the optimization algorithm. Due to its high accuracy and reliability, IGD is very common in comparing and analyzing the performance of different multi-objective optimization algorithms. The IGD quantifies how closely the metaheuristic approaches the Pareto front, measuring its ability to converge, and it is given in Eq. (16):

IGD = √( Σ_{i=1}^{nt} (d′_i)² ) / nt  (16)

where nt is the number of true Pareto optimal solutions and d′_i indicates the Euclidean distance between the i-th true Pareto optimal solution and the closest Pareto optimal solution obtained.
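In code, the IGD described above reduces to a pairwise distance matrix and a row-wise minimum; a minimal NumPy sketch (function name ours, aggregation as in Eq. (16)):

```python
import numpy as np

def igd(reference, approx):
    """IGD: for each true Pareto point, take the distance to the closest obtained
    point, then aggregate over the nt points of the reference set."""
    reference = np.asarray(reference, float)
    approx = np.asarray(approx, float)
    # pairwise Euclidean distances, shape (len(reference), len(approx))
    d = np.linalg.norm(reference[:, None, :] - approx[None, :, :], axis=2)
    closest = d.min(axis=1)
    return np.sqrt((closest ** 2).sum()) / len(reference)

ref = [[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]]
print(igd(ref, ref))  # 0.0 -- a front that matches the reference exactly
```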

Maximum Spread (MS): The MS examines and measures the extent of coverage and dispersion of points in the objective space. Specifically, Maximum Spread calculates the distance between the farthest points in the approximate set. A higher value of this criterion indicates that the approximate set has greater diversity and dispersion in the objective space. Using Maximum Spread is particularly useful when maintaining diversity in the population of optimal points, as this diversity can help find better solutions and avoid local optimal points. This method serves as a key tool in analyzing the performance of multi-objective optimization algorithms, assisting researchers to identify and develop algorithms capable of producing diverse and well-dispersed sets.

Spacing (SP): The SP checks the distances between the points in the approximate set to determine whether the solutions are uniformly distributed across the Pareto front. Specifically, SP calculates the standard deviation of the consecutive distances between points. A lower SP value indicates a more uniform and orderly distribution of points, which means better diversity and more complete coverage of the objective space. The SP metric is particularly important in multi-objective optimization problems because the uniform distribution of points can help identify and maintain diversity in the set of optimal solutions, preventing excessive clustering of solutions in specific areas.

The SP and the MS measure the coverage, and they are given by Eqs (17) and (18), respectively:

\mathrm{SP} = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(\bar{d} - d_i\right)^2} \quad (17)

where \bar{d} is the average of all d_i, the Euclidean distances between consecutive solutions of the obtained front; n is the number of Pareto optimal solutions obtained; and

\mathrm{MS} = \sqrt{\sum_{i=1}^{o}\left[d(a_i, b_i)\right]^2} \quad (18)

where d is a function to calculate the Euclidean distance; ai is the maximum value in the i-th objective; bi is the minimum in the i-th objective; and o is the number of objectives.
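Both coverage metrics can be sketched briefly in code. The sketch below is illustrative rather than the authors' implementation; for d_i in Eq (17) it uses nearest-neighbour distances, a common implementation choice for Spacing:

```python
import numpy as np

def spacing(front):
    """SP of Eq (17): standard deviation of the distances d_i around
    their mean, with d_i taken as each point's nearest-neighbour distance."""
    f = np.asarray(front, dtype=float)
    n = len(f)
    dists = np.linalg.norm(f[:, None, :] - f[None, :, :], axis=2)
    np.fill_diagonal(dists, np.inf)     # ignore self-distances
    d = dists.min(axis=1)               # d_i: distance to the closest neighbour
    return np.sqrt(((d.mean() - d) ** 2).sum() / (n - 1))

def maximum_spread(front):
    """MS of Eq (18): length of the diagonal of the box spanned by the
    extreme values a_i (maximum) and b_i (minimum) of each objective."""
    f = np.asarray(front, dtype=float)
    return np.sqrt(((f.max(axis=0) - f.min(axis=0)) ** 2).sum())
```

For a perfectly evenly spaced front, SP is zero; MS grows as the extreme solutions of the front move further apart.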

Lower values of IGD and SP represent better results, while higher values of MS are better. All algorithms will be run 30 times to obtain the average and standard deviation (SD).

3.2. Real-world optimization problems

The proposed method will also be tested on 21 complex, mechanically constrained multi-objective design problems. Each problem's name, number of variables (Nv), number of objectives (No), number of inequality constraints (Ni), and number of equality constraints (Ne) are listed in Table 4. More information can be found in the references following each name and in Kumar et al. [33]. The Speed Reducer and Car Side Impact designs are the problems with the most design variables (7). The higher the dimensionality of the search space, the harder the problem is for metaheuristics.

Table 4. Real design multi-objective problems (Adapted from [33]).

https://doi.org/10.1371/journal.pone.0328005.t004

In terms of objective-space dimensionality, the Multi-product Batch Plant, Gear Box Design, Crash Energy for High-Speed Train, and Bulk Carrier designs are the most complex and thus pose a challenge for metaheuristics, although this assessment considers only the number of dimensions. We will apply MOSRS to these optimization problems and compare its performance with other recent algorithms: ARMOEA, TiGE 2, NSGA-III, CCMO, ToP, MOSFO, and AnD.

Our proposed method has no specific parameters other than the number of particles and iterations. The other algorithms' parameters are detailed in their respective publications; we adopt the recommendations of Kumar et al. [33] and Pereira & Gomes [40]. For a fair comparison, we use the same number of iterations and population size for all metaheuristics, following Kumar et al. [33]: for Nv values of 2, 3, 4, and 5, the population sizes are set to 80, 105, 143, and 212, respectively. If Nv is less than or equal to 10, the number of iterations is fixed at 2500; if Nv is greater than 10, it is set to 10000. As with other real-world problems, these multi-objective problems do not have a clearly defined true Pareto front. Therefore, we use the Hypervolume (HV) metric to compare the algorithms: it measures the volume of the objective space dominated by the Pareto front, bounded by the Nadir point. The Nadir points reported by Kumar et al. [33] are used for a fair comparison. Maximizing the hypervolume favors a diverse and extensive Pareto front.
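The run-budget rule above can be encoded directly; this is a small sketch of the setup as described in the text (population sizes for Nv values outside the listed cases are not stated here and would need the original reference [33]):

```python
def run_budget(n_vars):
    """Population size and iteration budget per Nv, as described in the text
    (following Kumar et al. [33]). Returns (population, iterations);
    population is None for Nv values not listed in the text."""
    populations = {2: 80, 3: 105, 4: 143, 5: 212}
    pop = populations.get(n_vars)                 # None if Nv not listed
    iters = 2500 if n_vars <= 10 else 10000       # iteration threshold at Nv = 10
    return pop, iters
```

For example, a four-variable problem receives a population of 143 and 2500 iterations, while any problem with more than ten variables receives 10000 iterations.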

This metric utilizes a reference vector, denoted r, to calculate the space between the reference vector and the non-dominated solutions constituting the Pareto front. Typically, the reference vector is the nadir point, and the objective space is normalized using this vector. A higher HV indicates greater convergence and coverage of the Pareto front simultaneously. The reference vectors for the CEC 2021 multi-objective problems are given in Kumar et al. [33]. Fig 4 shows some of the problems with Nv = 4.
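For the bi-objective minimization case, the dominated volume relative to the reference point r can be computed by a simple sweep. The following is a minimal sketch under that assumption, not the implementation used in the paper:

```python
import numpy as np

def hypervolume_2d(front, r):
    """Hypervolume dominated by a bi-objective (minimization) front
    relative to reference point r (e.g., the nadir point). Sorting by the
    first objective lets the dominated region be summed slab by slab."""
    pts = np.asarray(sorted(front), dtype=float)
    hv, prev_f2 = 0.0, r[1]
    for f1, f2 in pts:
        if f1 >= r[0] or f2 >= prev_f2:
            continue                      # dominated point or outside the box
        hv += (r[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv
```

For the front {(1, 2), (2, 1)} with r = (3, 3), the two rectangles overlap in a unit square, giving a hypervolume of 3. Exact HV computation in more than two objectives requires dedicated algorithms (e.g., WFG or HSO).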

4. Results and discussions

4.1. CEC 2009

Five multi-objective metaheuristics were compared on the CEC 2009 functions: MOSRS, MOGWO, MOPSO, NSGA-II, and MOEA/D. The computer used was a Dell Core i7 with 16 GB of RAM, an SSD, and a 1 TB HD, and all algorithms were implemented in MATLAB. Figs 5–7 show the Pareto fronts of selected runs for MOSRS, MOGWO, and MOPSO, illustrating the complexity of the test functions and the difficulties the algorithms face. Tables 5–7 show the mean results after 30 independent runs for all algorithms, for the statistical analysis of the metrics used in this study: IGD, SP, and MS, respectively.

Table 6. SP results of algorithms in the CEC 2009 test functions.

https://doi.org/10.1371/journal.pone.0328005.t006

Table 7. MS results of algorithms in the CEC 2009 test functions.

https://doi.org/10.1371/journal.pone.0328005.t007

Fig 5. CEC 2009 rounds for some algorithms, highlighting MOSRS (UF1:UF4).

https://doi.org/10.1371/journal.pone.0328005.g005

Fig 6. CEC 2009 rounds for some algorithms, highlighting MOSRS (UF5:UF8).

https://doi.org/10.1371/journal.pone.0328005.g006

Fig 7. CEC 2009 rounds for some algorithms, highlighting MOSRS (UF9-UF10).

https://doi.org/10.1371/journal.pone.0328005.g007

Regarding the IGD (Table 5), MOSRS had the best convergence mean on 5 of the 10 test functions (shown in bold in Table 5), winning more often than any other algorithm. It lost to NSGA-II on UF2, to MOGWO on UF3 and UF5, and to MOPSO on UF8 and UF9. It is also worth noting that, compared to most algorithms, MOSRS performed well on the three-objective test functions (UF8, UF9, and UF10), but MOPSO was the best there, which can be further investigated.

Regarding the mean distance between Pareto front solutions, measured by the SP metric and presented in Table 6, MOSRS had the best mean value only on UF5, showing that its solutions tend to be more widely spaced and less uniform. The best algorithm on this metric was MOPSO, winning 4 times. Connected with this, however, is MOSRS's ability to spread solutions widely across the objective space: solutions that are too close together may indicate exploratory weakness.

Regarding the Maximum Spread, a measure of the distance between the furthest solutions in the PF, MOSRS had the best results on 8 of the tested functions, losing only to MOGWO on UF3 and UF6. This is a significant achievement for our algorithm, as it suggests MOSRS can present more options to the decision-maker when analyzing the PF.

The radar plot [45] in Fig 8 is a convenient tool to summarize the findings. It standardizes the key metrics of this study: IGD, SP, and MS. The area under the radar curve for each metaheuristic is indicated in parentheses; a smaller area reflects a more favorable balance among all metrics across all test functions simultaneously. MOSRS achieved the smallest area (0.0005; see the legend in the figure), owing to its lowest IGD and SP. It is followed by MOGWO, MOPSO, NSGA-II, and MOEA/D.

Fig 8. Summarizing the IGD, SP, and MS results with a radar plot.

https://doi.org/10.1371/journal.pone.0328005.g008

To conclude the analysis of the algorithms' performance on the CEC 2009 functions, the Friedman test [46] with 95% confidence was applied to the results for all test functions; the outcomes are shown in Fig 9 for the three metrics. This test is commonly used to compare several machine learning algorithms over multiple problems. Regarding IGD, the test ranked MOSRS as the best algorithm, followed by MOGWO and NSGA-II, with MOEA/D and MOPSO the worst.
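The Friedman statistic behind such rankings can be sketched in a few lines. This is an illustrative implementation (ties left unhandled for brevity), not the paper's own code:

```python
import numpy as np

def friedman_statistic(results):
    """Friedman chi-square statistic for a (problems x algorithms) matrix
    of metric values, ranking lower values as better (rank 1). Ties are
    not averaged in this simplified sketch."""
    results = np.asarray(results, dtype=float)
    n, k = results.shape
    # within-problem ranks: double argsort yields 0-based ranks, so add 1
    ranks = results.argsort(axis=1).argsort(axis=1) + 1
    rank_sums = ranks.sum(axis=0)          # R_j for each algorithm
    return 12.0 / (n * k * (k + 1)) * (rank_sums ** 2).sum() - 3.0 * n * (k + 1)
```

When one algorithm dominates on every problem the statistic reaches its maximum; when ranks are balanced across problems it approaches zero. The statistic is compared against a chi-square distribution with k-1 degrees of freedom at the chosen confidence level.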

Regarding SP, MOSRS is the worst, while MOPSO is the best; this metric measures how equally spaced the solutions are along the Pareto front. As for MS, MOSRS is again the best and MOEA/D is last. MOSRS was thus the best in the IGD and MS metrics, which measure, respectively, convergence capacity and total coverage of the Pareto front, revealing that it is an algorithm suitable for both.

Also, Figs 10–15 analyze the performance of MOSRS, MOGWO, MOPSO, NSGA-II, and MOEA/D with respect to the IGD, SP, and MS metrics. These figures represent trends in the data series, with linear equations fitted to illustrate the relationships. The algorithms are ranked from best to worst, R1 to R5, where R denotes Rank. This ranking helps identify the most effective algorithms for the given problems.

Fig 10. Graphical assessment of competitive algorithms by IGD metric (UF1:UF6).

https://doi.org/10.1371/journal.pone.0328005.g010

Fig 11. Graphical assessment of competitive algorithms by IGD metric (UF7:UF10).

https://doi.org/10.1371/journal.pone.0328005.g011

Fig 12. Graphical assessment of competitive algorithms by SP metric (UF1:UF6).

https://doi.org/10.1371/journal.pone.0328005.g012

Fig 13. Graphical assessment of competitive algorithms by SP metric (UF7:UF10).

https://doi.org/10.1371/journal.pone.0328005.g013

Fig 14. Graphical assessment of competitive algorithms by MS metric (UF1:UF6).

https://doi.org/10.1371/journal.pone.0328005.g014

Fig 15. Graphical assessment of competitive algorithms by MS metric (UF7:UF10).

https://doi.org/10.1371/journal.pone.0328005.g015

These figures show that the proposed method demonstrates superior overall performance on these problems. The trend analysis reveals that certain algorithms consistently outperform others across metrics, highlighting their robustness and efficiency. The low standard deviation (SD) of MOSRS is another critical indicator, underscoring the reliability and stability of its results: low variability means MOSRS produces consistent outcomes, which is crucial for applications requiring dependable performance. Furthermore, the detailed analysis of these metrics provides insight into each algorithm's strengths and weaknesses, allowing a more informed selection for specific problem-solving scenarios. This comprehensive evaluation and comparison helps researchers and practitioners better understand the trade-offs involved in choosing an optimization algorithm and tailor their approach to the specific needs of their application.

4.2. Design problems

The mean results after 25 independent runs for all algorithms are presented in Table 8; the same computer and software were used. The MOSRS Pareto fronts of the bi-objective problems solved here are shown in Fig 16 for one of these runs. These problems do not have true Pareto fronts; therefore, we compare the algorithms using HV. From Table 8, MOSRS was very competitive with the compared metaheuristics, having the best HV mean across all problems with a value of 0.3923, followed by AnD (0.3939) and NSGA-III (0.3897). The worst algorithm was TiGE2. The metaheuristic that won most often was ToP, with the best HV results on eight problems. Another interesting observation concerns AnD, which won only once but was consistent enough to achieve the second-best overall result.

Fig 16. All bi-objective mechanical design optimization problems studied.

https://doi.org/10.1371/journal.pone.0328005.g016

It is also worth noting that our method had the best results on one of the problems with the most decision variables (RCM08), showing good ability to handle high-dimensional search spaces. However, when the number of objectives increases, as in RCM13 and RCM19, the performance of our proposed method decreases compared with the other methods; this will be investigated further in the future. From Fig 16, we can see MOSRS finding Pareto fronts with good coverage and convergence.

As before, the Friedman test (see Fig 17) was used to compare these eight algorithms. TiGE2 was ranked the best, with MOSRS second, followed by AnD, MOSFO, NSGA-III, CCMO, ARMOEA, and ToP.

5. Conclusion

This paper presents the creation and validation of the Multi-objective Special Relativity Search, a multi-objective metaheuristic inspired by special relativity physics. This is the first time this theory has been applied to multi-objective optimization. MOSRS was validated on ten CEC 2009 test functions, where it was compared with NSGA-II, MOPSO, MOEA/D, and MOGWO, and on 21 complex structural design problems, where it was compared with ARMOEA, TiGE 2, NSGA-III, CCMO, ToP, MOSFO, and AnD using the Hypervolume metric.

The Einsteinian approach had the best mean Inverted Generational Distance and Maximum Spread on the CEC 2009 test functions, proving to be a promising algorithm for exploring the search space and producing Pareto fronts with greater convergence and coverage. When applied to the 21 real-world design problems, it had the best HV mean across all problems with a value of 0.3923, followed by AnD (0.3939) and NSGA-III (0.3897); the worst algorithm was TiGE2. Our method had the best results on one of the problems with the most decision variables (7, in the Car Side Impact design), showing good ability to handle high-dimensional search spaces. However, when the number of objectives increases, as in the Gear Box Design (7) and Multi-Product Batch Plant (10), the performance of our proposed method decreases compared with the other techniques; this will be investigated in the future and was also noticed in the CEC 2009 test problems.

MOSRS obtained good coverage and convergence capabilities and proved very competitive with the compared metaheuristics, which were selected for being established, recent, and popular. The algorithm was shown to be very promising at higher decision-variable dimensions and was tested specifically on real-world structural design problems. While the proposed MOSRS algorithm has demonstrated strong performance across benchmark and real-world problems, future work will extend its capabilities to handle many-objective problems (more than three objectives) more effectively. Additionally, we aim to explore adaptive mechanisms to further enhance convergence and diversity, as well as to integrate hybrid approaches and apply MOSRS to emerging complex engineering applications such as smart structures and energy systems.

References

  1. 1. Kennedy J, Eberhart R. Particle swarm optimization. In: Proceedings of ICNN’95 - International Conference on Neural Networks, vol. 4. 1995. p. 1942–8.
  2. 2. Goodarzimehr V, Talatahari S, Shojaee S, Hamzehei-Javaran S. Special Relativity Search for applied mechanics and engineering. Comput Method Appl Mech Eng. 2023;403:115734.
  3. 3. Goodarzimehr V, Shojaee S, Hamzehei-Javaran S, Talatahari S. Special Relativity Search: A novel metaheuristic method based on special relativity physics. Knowledge Based Syst. 2022;257:109484.
  4. 4. Pereira JLJ, Francisco MB, Diniz CA, Antônio Oliver G, Cunha SS Jr, Gomes GF. Lichtenberg algorithm: A novel hybrid physics-based meta-heuristic for global optimization. Expert Syst Appl. 2021;170:114522.
  5. 5. Zhao W, Wang L, Mirjalili S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Comput Method Appl Mech Eng. 2022;388:114194.
  6. 6. Kaveh A, Talatahari S, Khodadadi N. Stochastic paint optimizer: theory and application in civil engineering. Eng Comput. 2022:1–32.
  7. 7. Zamani H, Nadimi-Shahraki MH, Gandomi AH. Starling murmuration optimizer: A novel bio-inspired algorithm for global and engineering optimization. Comput Method Appl Mech Eng. 2022;392:114616.
  8. 8. Goodarzimehr V, Talatahari S, Shojaee S, Gandomi AH. Computer-aided dynamic structural optimization using an advanced swarm algorithm. Eng Structures. 2024;300:117174.
  9. 9. Abdollahzadeh B, Gharehchopogh FS, Khodadadi N, Mirjalili S. Mountain Gazelle Optimizer: A new Nature-inspired Metaheuristic Algorithm for Global Optimization Problems. Adv Eng Softw. 2022;174:103282.
  10. 10. Goodarzimehr V, Topal U, Vo-Duy T, Shojaee S. Improved chaos game optimization algorithm for optimal frequency prediction of variable stiffness curvilinear composite plate. J Reinforced Plastics Compos. 2023;42(19–20):1054–66.
  11. 11. Pereira JLJ, Francisco MB, Cunha SS Jr, Gomes GF. A powerful Lichtenberg Optimization Algorithm: A damage identification case study. Eng Appl Artificial Intell. 2021;97:104055.
  12. 12. Goodarzimehr V, Salajegheh F. Optimal design of tall steel moment frames using special relativity search algorithm. Int J Optimization Civil Eng. 2024;14(1):61–81.
  13. 13. Degertekin SO. Improved harmony search algorithms for sizing optimization of truss structures. Comput Struct. 2012;92–93:229–41.
  14. 14. Dehghani AA, Hamzehei-Javaran S, Shojaee S, Goodarzimehr V. Optimal analysis and design of large-scale problems using a Modified Adolescent Identity Search Algorithm. Soft Comput. 2024;28(17–18):9405–32.
  15. 15. El-kenawy E-SM, Khodadadi N, Mirjalili S, Abdelhamid AA, Eid MM, Ibrahim A. Greylag Goose Optimization: Nature-inspired optimization algorithm. Expert Syst Appl. 2024;238:122147.
  16. 16. Chen Y, Yan J, Feng J, Sareh P. A hybrid symmetry–PSO approach to finding the self-equilibrium configurations of prestressable pin-jointed assemblies. Acta Mech. 2020;231(4):1485–501.
  17. 17. Zhao W, Wang L, Zhang Z, Fan H, Zhang J, Mirjalili S, et al. Electric eel foraging optimization: A new bio-inspired optimizer for engineering applications. Expert Syst Appl. 2024;238:122200.
  18. 18. Amiri MH, Mehrabi Hashjin N, Montazeri M, Mirjalili S, Khodadadi N. Hippopotamus optimization algorithm: a novel nature-inspired optimization algorithm. Sci Rep. 2024;14(1):5032. pmid:38424229
  19. 19. Abdollahzadeh B, Khodadadi N, Barshandeh S, Trojovský P, Gharehchopogh FS, El-kenawy E-SM, et al. Puma optimizer (PO): a novel metaheuristic optimization algorithm and its application in machine learning. Cluster Comput. 2024;27(4):5235–83.
  20. 20. Back T. Evolutionary algorithms in theory and practice: evolution strategies, evolutionary programming, genetic algorithms. Oxford: Oxford University Press; 1996.
  21. 21. Baril C, Yacout S, Clément B. Design for Six Sigma through collaborative multiobjective optimization. Comput Ind Eng. 2011;60(1):43–55.
  22. 22. Branke J, Kaußler T, Schmeck H. Guidance in evolutionary multi-objective optimization. Adv Eng Softw. 2001;32:499–507.
  23. 23. Coello C, Lamont B, Van Veldhuizen D. Evolutionary Algorithms for Solving Multi-Objective Problems. Springer US, 2007.
  24. 24. Coello CAC, Pulido GT, Lechuga MS. Handling multiple objectives with particle swarm optimization. IEEE Trans Evol Computat. 2004;8(3):256–79.
  25. 25. Cohon JL, Marks DH. A review and evaluation of multiobjective programing techniques. Water Res Res. 1975;11(2):208–20.
  26. 26. Emmerich MTM, Deutz AH. A tutorial on multiobjective optimization: fundamentals and evolutionary methods. Nat Comput. 2018;17(3):585–609. pmid:30174562
  27. 27. Fan X, Sun H, Yuan Z, Li Z, Shi R, Razmjooy N. Multi-objective optimization for the proper selection of the best heat pump technology in a fuel cell-heat pump micro-CHP system. Energy Reports. 2020;6:325–35.
  28. 28. Fonseca CM, Fleming PJ. Genetic algorithms for multi-objective optimization: formulation discussion and generalization. In: Proceedings of the International Conference on Genetic Algorithms, vol. 93. Citeseer; 1993. p. 416–23.
  29. 29. Gomes GF, da Cunha SS Jr, Ancelotti AC. A sunflower optimization (SFO) algorithm applied to damage identification on laminated composite plates. Eng Comput. 2018;35(2):619–26.
  30. 30. Gomes GF, de Almeida FA, da Silva Lopes Alexandrino P, da Cunha SS, de Sousa BS, Ancelotti AC. A multi-objective sensor placement optimization for SHM systems considering Fisher information matrix and mode shape interpolation. Eng Comput. 2018.
  31. 31. Francisco MB, Pereira JLJ, Oliver GA, Da Silva FHS, Cunha SS, Gomes GF. Multi-objective design optimization of CRP isogrid tubes using sunflower multi-objective optimization based on metamodel. Comput Struct. 2021.
  32. 32. Jain H, Deb K. An Evolutionary Many-Objective Optimization Algorithm Using Reference-Point Based Nondominated Sorting Approach, Part II: Handling Constraints and Extending to an Adaptive Approach. IEEE Trans Evol Computat. 2014;18(4):602–22.
  33. 33. Kumar A, Wu G, Ali MZ, Luo Q, Mallipeddi R, Suganthan PN, et al. A Benchmark-Suite of real-World constrained multi-objective optimization problems and some baseline results. Swarm Evolution Comput. 2021;67:100961.
  34. 34. Pereira JLJ, Oliver GA, Francisco MB, Cunha SS Jr, Gomes GF. A Review of Multi-objective Optimization: Methods and Algorithms in Mechanical Engineering Problems. Arch Computat Methods Eng. 2021;29(4):2285–308.
  35. 35. Luiz Junho Pereira J, Antônio Oliver G, Brendon Francisco M, Simões Cunha Jr S, Ferreira Gomes G. Multi-objective lichtenberg algorithm: A hybrid physics-based meta-heuristic for solving engineering problems. Expert Syst Appl. 2022;187:115939.
  36. 36. Pereira JLJ, Francisco MB, de Oliveira LA, Chaves JAS, Cunha SS Jr, Gomes GF. Multi-objective sensor placement optimization of helicopter rotor blade based on Feature Selection. Mech Syst Signal Process. 2022;180:109466.
  37. 37. Pereira JLJ, Francisco MB, Ribeiro RF, Cunha SS, Gomes GF. Deep multiobjective design optimization of CFRP isogrid tubes using lichtenberg algorithm. Soft Comput. 2022;26(15):7195–209.
  38. 38. de Souza TAZ, Pereira JLJ, Francisco MB, Sotomonte CAR, Jun Ma B, Gomes GF, et al. Multi-objective optimization for methane, glycerol, and ethanol steam reforming using lichtenberg algorithm. Int J Green Energy. 2022;20(4):390–407.
  39. 39. Francisco M, Pereira J, Oliveira L, Simões Cunha S Jr, Gomes GF. Multi-objective design optimization of reentrant auxetic model using Lichtenberg algorithm based on metamodel. EC. 2023;40(9/10):3009–35.
  40. 40. Pereira JLJ, Gomes GF. Multi‐objective sunflower optimization: A new hypercubic meta‐heuristic for constrained engineering problems. Expert Syst. 2023;40(8):e13331.
  41. 41. Pereira JLJ, Guedes FC, Francisco MB, Chiarello AG, Gomes GF. Multi-objective design optimization of a high performance disk brake using lichtenberg algorithm. Mech Based Design Struct Mach. 2023;52(6):3038–51.
  42. 42. Guedes HC, Pereira JLJ, Gomes GF. Multi-objective parametric optimization of a composite high-performance prostheses using metaheuristic algorithms. Struct Multidisc Optim. 2023;66(8):189.
  43. 43. Nogueira FS, Pereira JLJ, Simões Cunha S Jr. Multi-objective sensor placement optimization and damage identification for an aircraft wing using Lichtenberg algorithm. EC. 2024;41(2):438–67.
  44. 44. Zhang Q, Zhou A, Zhao S, Suganthan PN, Liu W, Tiwari S. Multiobjective optimization test instances for the CEC 2009 special session and competition. Special session on performance assessment of multi-objective optimization algorithms, technical report, 264. Colchester, UK: University of Essex and Singapore: Nanyang Technological University. 2008.
  45. 45. Saary MJ. Radar plots: a useful way for presenting multivariate health care data. J Clin Epidemiol. 2008;61(4):311–7. pmid:18313553
  46. 46. Demsar J. Statistical comparisons of classifiers over multiple datasets. J Machine Learning Res. 2006;7:1–30.
  47. 47. Tian Y, Cheng R, Zhang X, Cheng F, Jin Y. An Indicator-Based Multiobjective Evolutionary Algorithm With Reference Point Adaptation for Better Versatility. IEEE Trans Evol Computat. 2018;22(4):609–22.
  48. 48. Zhou Y, Zhu M, Wang J, Zhang Z, Xiang Y, Zhang J. Tri-goal evolution framework for constrained many-objective optimization. IEEE Transactions on Systems, Man, and Cybernetics: Systems. 2018.
  49. 49. Tian Y, Zhang T, Xiao J, Zhang X, Jin Y. A coevolutionary framework for constrained multi-objective optimization problems. IEEE Trans Evol Comput. 2020.
  50. 50. Liu Z-Z, Wang Y. Handling Constrained Multiobjective Optimization Problems With Constraints in Both the Decision and Objective Spaces. IEEE Trans Evol Computat. 2019;23(5):870–84.
  51. 51. Liu Z-Z, Wang Y, Huang P-Q. A many-objective evolutionary algorithm with angle-based selection and shift-based density estimation. Inf Sci (Ny). 2020;509:400–19.
  52. 52. Khodadadi N, Talatahari S, Eslamlou D. MOTE O: A novel multi-objective thermal exchange optimization algorithm for engineering problems. Soft Comput. 2022;26(14).
  53. 53. Tejani GG, Mashru N, Patel P, Sharma SK, Celik E. Application of the 2-archive multi-objective cuckoo search algorithm for structure optimization. Sci Rep. 2024;14(1):31553. pmid:39738304
  54. 54. Tejani GG, Sharma SK, Mashru N, Patel P, Jangir P. Optimization of truss structures with two archive-boosted MOHO algorithm. Alexandria Eng J. 2025;120:296–317.
  55. 55. Kumar S, Tejani GG, Pholdee N, Bureerat S. Multiobjecitve structural optimization using improved heat transfer search. Knowledge Based Syst. 2021;219:106811.
  56. 56. Khodadadi N, Khodadadi E, Abdollahzadeh B, EI-Kenawy ESM, Mardanpour P, Zhao W, et al. Multi-objective generalized normal distribution optimization: A novel algorithm for multi-objective problems. Cluster Comput. 2024;27(8):10589–631.
  57. 57. Kalita K, Jangir P, Pandya SB, Shanmugasundar G, Chohan JS, Abualigah L. Many-Objective Multi-Verse Optimizer (MaOMVO): A Novel Algorithm for Solving Complex Many-Objective Engineering Problems. J Inst Eng India Ser C. 2024;105(6):1467–502.
  58. 58. Kalita K, Jangir P, Pandya SB, Alzahrani AI, Alblehai F, Abualigah L, et al. MORKO: A Multi-objective Runge–Kutta Optimizer for Multi-domain Optimization Problems. Int J Comput Intell Syst. 2025;18(1).
  59. 59. Kalita K, Jangir P, Čep R, Pandya SB, Abualigah L. Many-Objective Grasshopper Optimization Algorithm (MaOGOA): A New Many-Objective Optimization Technique for Solving Engineering Design Problems. Int J Comput Intell Syst. 2024;17(1):214.
  60. 60. Kalita K, Ramesh JVN, Čep R, Jangir P, Pandya SB, Ghadai RK, et al. Many-Objective Whale Optimization Algorithm for Engineering Design and Large-Scale Many-Objective Optimization Problems. Int J Comput Intell Syst. 2024;17(1):171.
  61. 61. Abdel-Basset M, Mohamed R, Abouhawwash M. Balanced multi-objective optimization algorithm using improvement based reference points approach. Swarm Evolutionary Comput. 2021;60:100791.
  62. 62. Aljaidi M, Mashru N, Patel P, Adalja D, Jangir P, Arpita PSB, et al. MORIME: A multi-objective RIME optimization framework for efficient truss design. Results Eng. 2025;25:103933.
  63. 63. Mashru N, Tejani GG, Patel P. Many-Objective Optimization of a 120-Bar 3D Dome Truss Structure Using Three Metaheuristics. In: Venkata Rao R, Taler J, editors. Advanced Engineering Optimization Through Intelligent Techniques. AEOTIT 2023. Lecture Notes in Electrical Engineering, vol 1226. Singapore: Springer; 2024.
  64. 64. Khodadadi N, Abualigah L, Mirjalili S. Multi-objective Stochastic Paint Optimizer (MOSPO). Neural Comput & Applic. 2022;34(20):18035–58.
  65. 65. Shao J, Lu Y, Sun Y, Zhao L. An improved multi-objective particle swarm optimization algorithm for the design of foundation pit of rail transit upper cover project. Sci Rep. 2025;15(1):10403. pmid:40140400
  66. 66. Weidner RT, Sells RL. Elementary modern physics. Boston: Allyn and Bacon; 1973.
  67. 67. Kannan B, Kramer SN. An augmented lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. 1994.
  68. 68. Narayanan S, Azarm S. On improving multiobjective genetic algorithms for design optimization. Struct Optimiz. 1999;18(2):146.
  69. 69. Chiandussi G, Codegone M, Ferrero S, Varesio FE. Comparison of multi-objective optimization methodologies for engineering applications. Comput Math Appl. 2012;63(5):912–42.
  70. 70. Deb K, et al. Evolutionary algorithms for multi-criterion optimization in engineering design. Evolution Algorithms Eng Comput Sci. 1999;2:135–61.
  71. 71. Osyczka A, Kundu S. A genetic algorithm-based multicriteria optimization method. In: Proc. 1st World Congr. Struct. Multidisc. Optim, 1995. p. 909–14.
  72. 72. Azarm S, Tits A, Fan M. Tradeoff-driven optimization-based design of mechanical systems. In: 4th Symposium on Multidisciplinary Analysis and Optimization, 1999. p. 4758.
  73. 73. Ray T, Liew K. A swarm metaphor for multiobjective design optimization. Eng Optim. 2002;34(2):141–53.
  74. 74. Cheng FY, Li XS. Generalized center method for multiobjective engineering optimization. Eng Optimization. 1999;31(5):641–61.
  75. 75. Huang HZ, Gu YK, Du X. An interactive fuzzy multi-objective optimization method for engineering design. Eng Appl Artif Intell. 2006;19(5):451–60.
  76. 76. Osyczka A. Evolutionary algorithms for single and multicriteria design optimization. 2002.
  77. 77. Kannan B, Kramer SN. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. 1994.
  78. 78. Coello CAC, Lamont GB, Van Veldhuizen DA, et al. Evolutionary algorithms for solving multi-objective problems, volume 5. Springer; 2007.
  79. 79. Parsons MG, Scott RL. Formulation of Multicriterion Design Optimization Problems for Solution With Scalar Numerical Optimization Methods. J Ship Res. 2004;48(1):61–76.
  80. 80. Fan L, Yoshino T, Xu T, Lin Y, Liu H. A novel hybrid algorithm for solving multi-objective optimization problems with engineering applications. Math Probl Eng. 2018.
  81. 81. Dhiman G, Kumar V. Multi-objective spotted hyena optimizer: A Multi-objective optimization algorithm for engineering problems. Knowledge Based Syst. 2018;150:175–97.
  82. 82. Siddall JN. Optimal engineering design: Principles and applications. CRC Press; 1982.
  83. 83. Zhang H, Peng Y, Hou L, Tian G, Li Z. A hybrid multi-objective optimization approach for energy-absorbing structures in train collisions. Inf Sci (Ny). 2019;481:491–506.