An enhanced decision-making framework for predicting future trends of sharing economy

Abstract

This work aims to provide a reliable and intelligent prediction model for future trends in the sharing economy and to offer valuable insights for decision-making and policy development by relevant governmental bodies. The study introduces a predictive system that combines an enhanced Harris Hawks Optimization (HHO) algorithm with a K-Nearest Neighbor (KNN) forecasting framework. The method improves the original HHO with an improved simulated annealing mechanism and a Gaussian bare-bones structure; the resulting algorithm is termed SGHHO. To achieve optimal prediction performance and identify essential features, the refined simulated annealing mechanism mitigates the susceptibility of the original HHO algorithm to local optima by generating fresh solution sets with a certain probability, thereby strengthening its global search ability. The Gaussian bare-bones strategy dynamically adjusts the equilibrium between the exploration and exploitation phases. Building on both strategies, the classification model (SGHHO-KNN) is developed to mine the key features. To assess the efficacy of the SGHHO algorithm, this investigation conducted a series of comparative trials on the IEEE CEC 2014 function set. The outcomes of these experiments demonstrate that the SGHHO algorithm outperforms the original HHO algorithm on 96.7% of the functions. The algorithm reaches the optimal value on 67% of the tested functions and significantly outperforms the other competing algorithms. In addition, the key features selected by the SGHHO-KNN model in the prediction experiment, including "Form of sharing economy in your region" and "Attitudes to the sharing economy", are important for predicting the future trends of the sharing economy in this study. The prediction results demonstrate that the proposed model achieves an accuracy rate of 99.70% and a specificity rate of 99.38%. Consequently, the SGHHO-KNN model holds great potential as a reliable tool for forecasting the future trajectory of the sharing economy.

1. Introduction

In 1978, Marcus Felson, a Sociology professor at Texas State University, and Joe L. Spaeth, a professor of Sociology at the University of Illinois, first introduced the notion of the sharing economy in their scholarly publication [1,2]. It is a market-based model in which the supplier of resources provides idle resources or skills to the demander through a technology platform and charges a certain fee [3]. The effective allocation of resources is achieved by integrating these technology platforms. Specifically, the model combines business practices such as sharing, renting and swapping, which differs from traditional business models and signals the emergence of a new economic model [4]. To some extent, the sharing economy model enables the optimal allocation and use of relevant resources. Unlike other economic models, the sharing economy is an inevitable product of the internet era: consumers are disadvantaged when defending their rights against businesses, dissatisfied with the asymmetry of transaction information, and lack trust in social transaction mechanisms; combined with the full spread of internet technology and continued innovation, these conditions provide the prerequisites for the birth of the sharing economy model [5]. Although the concept of the "sharing economy" has been around for a long time, the academic community has not yet fully agreed on a definition of the term. Academics do agree that platforms are an important foundation for the existence and development of the sharing economy [6,7]. A quality platform enables effective exchange in the sharing economy, allowing consumers to obtain higher-quality products at a lower cost. Meanwhile, the good reputation of the platform is transferred to the merchant, increasing consumer trust in the merchant and effectively facilitating trading activities. Overall, the sharing economy can bring unused items back into circulation without changing ownership, enabling the efficient allocation of resources [8].

From a certain point of view, the application of the sharing economy can approach the Pareto optimum, creating benefits for both the sharing platform and the participating entities. Secondly, eliminating information asymmetries enables free market matching, so that idle resources can be utilized to the greatest extent while factor costs are effectively controlled [9]. Finally, the start-up costs of internet platforms can be effectively controlled under the influence of the sharing economy, and a lower-risk, asset-light approach can be adopted to promote stable operation. Nowadays, the sharing economy has become popular in many countries around the world. According to forecasts by relevant institutions, the global sharing economy market for major industries will grow from $14 billion in 2014 to $335 billion by 2025. In recent years, advances in technology and the growing popularity of mobile communication devices and the internet have led to the widespread emergence of the sharing economy, which has also had a huge impact on traditional consumption models [10,11]. On the one hand, it is better adapted to the consumption experience and consumption needs of the epidemic situation and networked development; that is, the sharing economy model can replace the previous intermediary model with direct producer-consumer transactions, i.e., replacing the previous employment relationship with a contractual relationship between the consumer and the platform, giving consumers a better service experience. Besides, the price of products and services in the sharing economy is significantly lower than in the traditional economy, so total social demand can be significantly increased by the sharing economy. On the other hand, because its development is still imperfect, implementing the sharing economy involves product and food-safety testing, third-party payment supervision, information security checks, logistics supervision and other measures, and its regulatory bodies span industry and commerce, telecommunications, finance, quality inspection and public security [12,13]. Looking at the current development of the sharing economy model, its activity still lacks dedicated regulatory bodies. The regulatory bodies have not yet clearly delineated powers and responsibilities, so the regulation of the sharing economy suffers from overlapping responsibilities and regulatory gaps [14,15]. In addition, there are many problems such as low-price dumping, monopoly agreements, unfair competition and big data "killing" (price discrimination against regular customers), which some users still find difficult to accept. In short, as a new trend, the acceptance of the sharing economy among the population is still unknown. Therefore, it is necessary to analyze the data to understand in detail the level of awareness and acceptance of the sharing economy. This will not only help to anticipate the future development of the sharing economy and the adjustment of relevant policies and regulations, but will also help to further guide its benign operation [2,16].

Recently, artificial intelligence technology has developed rapidly and been applied successfully to predicting the future development of the sharing economy. Klos et al. [17] specifically addressed the issue of the sharing economy: pilot work on the phenomenon of collective consumption among Polish consumers was carried out by means of a questionnaire to analyze the influence of the sharing economy on consumers. Perles et al. [18] aimed to investigate the extent to which the sharing economy affects the tourism industry; machine learning techniques were used to solve practical problems in tourism and to further analyze the potential impact of the sharing economy on the industry. Chen et al. [19] used structural equation modeling on a questionnaire survey of 90 sharing economy companies to explore the sustainability of the sharing economy. Ackermann et al. [20] examined consumer use of sharing economy platforms from a legitimacy perspective, exploring consumer attitudes and behavioral intentions towards the accommodation sector in the context of the sharing economy. Cheng et al. [21] studied the carbon footprint of Airbnb hosts in Sydney using peer-to-peer sharing on the Airbnb platform and analyzed the notion that the sharing economy facilitates the utilization of underutilized resources; this study contributes to the sustainability of peer-to-peer accommodation and the sharing economy more generally. Ferreri et al. [22] explored the interrelationship between sharing platform economy companies and governments, analyzing the trade-offs between corporate and public interests. Zemla et al. [23] explored the role of the sharing economy in urban economic planning and analyzed the planning choices made for the development and implementation of the sharing economy in specific cities.

Artificial intelligence technology has advanced significantly in the past few years, and machine learning algorithms have shown great practical value in both academia and industry. With the exponential growth and intricacy of big data, conventional machine learning algorithms designed for small-scale data have become inadequate for the multitude of challenges presented by big data applications [24,25]. Therefore, the study of machine learning algorithms in the big data environment has become a common topic of interest for both academia and industry and has wide application to the problem of predicting the future development of the sharing economy. Kim et al. [26] used Random Forest, XGBoost and LightGBM models to predict the demand for shared bicycles and integrated the predictions into the company’s business operations to better serve customers. Wang et al. [27] constructed a multi-relational network to analyze the value of cooperation in the sharing economy, using different machine learning classification models and feature sets to predict consumers’ purchase behavior on the platform. Tornberg et al. [28] used a machine learning approach to classify picture profiles provided by landlords of shared rentals to assess the income gap across race and gender and to further explore the impact of the sharing economy. Shokoohyar et al. [29] used multiple machine learning models (support vector machines, naive Bayes and neural networks) to predict the highest-return rental strategy for a specific property in Philadelphia in the context of the sharing economy, based on data from 2163 properties. Nadeem et al. [30] examined the theoretical issues of consumer participation in sharing economy platforms from a multi-dimensional perspective, using the marketing and business theory literature to analyze consumers’ value co-creation orientation and predict future trends in the sharing economy. Jiang et al. [31] conducted a study on bike-sharing in which a deep learning model was used to predict the most likely destination for each user, further satisfying consumer demand and exploring the influence of the sharing economy on people.

Since the early 1990s, many swarm intelligence algorithms have been proposed, with recent examples including hunger games search (HGS) [32], the colony predation algorithm (CPA) [33], the slime mould algorithm (SMA) [34,35], Harris hawks optimization (HHO) [36], the Runge Kutta optimizer (RUN) [37], the weighted mean of vectors (INFO) [38], and the rime optimization algorithm (RIME) [39]. They have achieved exciting results on combinatorial optimization problems with NP-hard characteristics that are difficult for traditional optimization algorithms to handle [40,41]. These methods have garnered significant attention from the academic community and have been rapidly applied in various practical environments, achieving enormous success in engineering applications in particular. For example, they have gained wide application and achieved good results in areas such as combinatorial optimization [42,43], data mining [44,45], energy scheduling [46,47], medicine [48,49], image classification [50,51], bankruptcy prediction [52], economic emission dispatch [53], feature selection [54–58], numerical optimization [59–61], scheduling optimization [62,63], multi-objective optimization [64], large-scale complex optimization [65], global optimization [66–70], feed-forward neural networks [71], and target tracking [72]. Therefore, to better explore the future development trend of the sharing economy, this study proposes an effective prediction model for the future development trend of the sharing economy. This model combines KNN with the improved Harris hawks optimization algorithm and is named the SGHHO-KNN model. In the proposed algorithm, two novel mechanisms are introduced into the original HHO to remedy its disadvantages. First, an enhanced simulated annealing mechanism is embedded to address the tendency of the original HHO to fall into local optima. This mechanism generates new solution sets with a certain probability, enhancing the chance of the basic HHO escaping from a local optimum and thus strengthening the algorithm's global search capability. Second, by incorporating the principles of the Gaussian bare-bones mechanism, the SGHHO algorithm effectively navigates the transition from the exploration to the exploitation phase through a refined adjustment process. This ensures population diversity and enables the algorithm to achieve enhanced accuracy in problem-solving. Lastly, to further enhance classification performance, the SGHHO algorithm is integrated with a KNN classifier, which facilitates the evaluation of feature subsets and ultimately leads to improved classification outcomes. Simultaneously, a comprehensive set of optimization experiments was undertaken using the well-established IEEE CEC 2014 benchmark functions. The simulation outcomes demonstrate the remarkable superiority of the proposed algorithm over the original HHO algorithm, exhibiting a significantly higher performance level on 96.7% of the functions while also showing improved stability. Moreover, to thoroughly examine the key factors influencing the future trajectory of the sharing economy, comparative experiments involving SGHHO-KNN and other state-of-the-art algorithms were undertaken. The empirical findings demonstrate that SGHHO-KNN outperforms conventional approaches across all four evaluated metrics, showcasing superior classification accuracy and enhanced stability.
Notably, the prediction outcomes reveal exceptional performance, with the proposed model achieving an impressive accuracy rate of 99.70% and a remarkable specificity rate of 99.38%. Ultimately, the primary objective of this investigation is to conduct an in-depth analysis of the influence of the sharing economy on individuals, consequently providing valuable insights into the anticipated future trajectory of this economic phenomenon.

This work has made the following contributions:

  • An improved simulated annealing mechanism (ISA) and a Gaussian bare bone strategy (GB) are introduced and applied to the HHO to enhance its optimization performance.
  • The superior performance of SGHHO is effectively validated through comprehensive benchmark function experiments.
  • An SGHHO-KNN-based model is developed to predict the future development of the sharing economy.
  • Effective forecasting of future trends in the sharing economy is realized and relevant key features are filtered out.

The remainder of this study is organized as follows. Section 2 reviews related work and the K-nearest neighbor classifier; Section 3 gives the basic principles of the HHO algorithm; Section 4 introduces the proposed SGHHO, together with the improved simulated annealing mechanism and the Gaussian bare-bones variation strategy; Section 5 introduces the SGHHO-KNN model; Section 6 evaluates the optimization performance of SGHHO on several benchmark functions; Section 7 uses the SGHHO-KNN model to predict the future development of the sharing economy. Finally, the research results of this work are summarized and directions for future work are given.

2. Related works

2.1. Literature review

The research shows that the internal parameters in machine learning models and the feature space and sample space in big data have an important influence on the performance of classifiers. Over the last decade or so, swarm intelligence algorithms have shown good results in solving such problems [73–76]. Furthermore, swarm intelligence algorithms exhibit notable characteristics of randomness and efficiency, encompassing a range of deterministic and stochastic techniques that find wide application in diverse optimization problems and practical scenarios, effectively harnessing the power of collective intelligence. Examples include optimization problems [77–80], traveling salesman problems [81], complex network problems [82,83], path planning problems [84,85], and real-time detection problems [86]. Within the scope of this investigation, a refined variant of the Harris hawks algorithm is introduced and leveraged to forecast forthcoming patterns in the development of the sharing economy. Originating from the pioneering work of Ali Asghar Heidari and Seyedali Mirjalili in 2019, the Harris Hawks Optimization (HHO) algorithm is a biomimetic intelligent optimization technique. The algorithm mimics the predation characteristics of the Harris hawk and combines Lévy flights to solve complex multidimensional problems. Like common meta-heuristic algorithms, it is inspired by the habits of animals in nature: it achieves global optimization by imitating the group hunting, raiding and siege strategy of the Harris hawk. Since its introduction, the Harris hawks algorithm has been used to solve optimization problems in a variety of fields and has achieved good results. The literature [87] introduced an HHO algorithm with chaotic search and opposition learning to perform parameter searches for PV cells and modules. The literature [88] used the HHO algorithm to analyze the stability of mass slope problems and to improve the accuracy of predictions. The literature [89] applied HHO to in-vehicle location services for intelligent transportation systems to reduce the delay in the delivery of emergency data in in-vehicle networks and enhance the location rate. The literature [90] used an improved adaptive Harris hawks algorithm to obtain the optimal queue for the 2D grey-scale gradient method and used this gradient method for image processing, yielding better results than other multi-stage minimization methods. The literature [91] applied HHO to industrial safety: using HHO to augment artificial neural networks (ANN), a hybrid ANN-HHO model was proposed for predicting the scour depth caused by dam outflow water at spillways and was compared with two other hybrid models, ANN-GA and ANN-PSO, showing the superiority and effectiveness of the hybrid model.

2.2. K-nearest neighbor classifier

The KNN algorithm is a simple and effective classification algorithm. It is based on the nearest-neighbor idea, i.e., the class of a new sample is determined by the nearest samples of known classes. The fundamental concept underlying the K-nearest neighbors (KNN) algorithm is to compute the distance between the sample to be classified and all training samples using a distance metric. After identifying the k closest neighbors of the sample in question [92], a majority vote over the categories of these nearest neighbors determines the category to which the sample belongs. Many research works have been devoted to this algorithm by scholars at home and abroad. For example, Wai Lam et al. [93] from the Chinese University of Hong Kong combined the KNN method with a linear classifier and achieved better classification results, with an accuracy of over 80% at a recall rate close to 90%. The literature [94] investigated the asymptotic properties of the K-nearest neighbor estimate of the regression function and obtained the asymptotic normality of the estimate and the consistency of its Bootstrap statistic. The literature [95] establishes an efficient search tree for the nearest-neighbor algorithm and improves the query rate. In the literature [96], an iterative nearest-neighbor method is proposed to address the poor classification of KNN algorithms when the sample pool is small: when insufficient labeled samples are available, the local thematic features of the samples to be classified are amplified by retrieval to obtain similar samples with sufficiently determined classes. The literature [97] gives parallel algorithms on reconfigurable mesh machines (RMESH) for K nearest neighbors in Euclidean space, among others. Li et al. proposed a two-layer hierarchical combination method for text classification [98]; it was demonstrated that combining support vector machines with KNN makes SVM less sensitive to noise and improves the efficiency of KNN, showing better classification performance. Shi et al. [99] proposed a semi-supervised classification algorithm based on rough set error, using rough set error theory to extract negative samples; a new classifier is constructed by combining SVM, Rocchio and Naive Bayes to improve classification accuracy.
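As an illustration of the nearest-neighbor idea described above, a minimal KNN classifier with a Manhattan distance and majority vote might look as follows. This is only a sketch under simple assumptions (NumPy arrays, binary or multi-class integer labels), not the implementation used later in the paper.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_query, k=5):
    """Classify x_query by majority vote among its k nearest training samples."""
    # Manhattan (L1) distance between the query and every training sample
    dists = np.abs(X_train - x_query).sum(axis=1)
    # Indices of the k closest training samples
    nearest = np.argsort(dists)[:k]
    # Majority vote over the neighbors' labels
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Tiny illustrative example: three 2-D training points, two classes
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0]])
y_train = np.array([0, 0, 1])
print(knn_predict(X_train, y_train, np.array([0.2, 0.1]), k=3))  # -> 0
```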

3. Structure of HHO

HHO is a heuristic algorithm put forward by Heidari et al. that was inspired by the predatory behaviors of Harris's hawks [35]. In line with other heuristic algorithms, HHO contains exploration and exploitation phases. In the two phases, Harris's hawks exhibit different predatory behaviors.

3.1. Exploration phase

Harris Hawks in the exploration phase roam many locations waiting for prey to arrive. Harris’s Hawks usually consider the position of their companions and prey when selecting their perches. Two main strategies are followed during this phase, as shown in Eq (2.1).

$$X(t+1)=\begin{cases}X_{rand}(t)-r_1\left|X_{rand}(t)-2r_2X(t)\right|, & q\ge 0.5\\[4pt] \left(X_{rabbit}(t)-X_m(t)\right)-r_3\left(LB+r_4\left(UB-LB\right)\right), & q<0.5\end{cases} \tag{2.1}$$

$$X_m(t)=\frac{1}{N}\sum_{i=1}^{N}X_i(t) \tag{2.2}$$

In Eq (2.1), X(t+1) is the position of the Harris's hawk at the next iteration. Xrand(t) refers to a random position in the current population, and Xrabbit(t) refers to the prey's location, i.e., the location of the current optimal solution. r1, r2, r3, r4 and q are random numbers in [0,1], and UB and LB are the upper and lower boundaries, respectively. In Eq (2.2), Xm(t) is the average position of the individuals in the current Harris's hawk population, where N is the population size and Xi(t) is the position of the i-th individual.
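For illustration, the exploration update of Eqs (2.1)-(2.2) can be written as the short NumPy sketch below. This mirrors the standard HHO formulas; the function name, the `rng` generator and the boundary clipping are assumptions, not the authors' MATLAB code.

```python
import numpy as np

def hho_exploration(X, X_rabbit, lb, ub, rng):
    """One exploration-phase update (Eqs 2.1-2.2) applied to every hawk in X (shape N x D)."""
    N, _ = X.shape
    X_mean = X.mean(axis=0)                     # Eq (2.2): average position of the flock
    X_new = np.empty_like(X)
    for i in range(N):
        r1, r2, r3, r4, q = rng.random(5)
        X_rand = X[rng.integers(N)]             # a randomly chosen hawk
        if q >= 0.5:                            # perch relative to a random member
            X_new[i] = X_rand - r1 * np.abs(X_rand - 2.0 * r2 * X[i])
        else:                                   # perch relative to the prey and the flock mean
            X_new[i] = (X_rabbit - X_mean) - r3 * (lb + r4 * (ub - lb))
    return np.clip(X_new, lb, ub)               # keep hawks inside [LB, UB]
```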

3.2. Transition factor

The escape energy E of the target is defined during the exploitation phase and determines the exploitation behavior of the Harris hawk, as shown in Eq (2.3):

$$E=2E_0\left(1-\frac{t}{T}\right) \tag{2.3}$$

$$E_0=2r-1 \tag{2.4}$$

where E represents the escape energy of the target prey, E0 denotes the escape energy of the target in its initial state and is a random number in (-1,1) obtained from a random number r in [0,1] (Eq (2.4)), T represents the maximum number of iterations, and t denotes the current iteration. The Harris hawk chooses different behaviors depending on the value of E. When |E| ≥ 1, the Harris hawk searches for the target in a region far from the current solution; when |E| < 1, it searches in the neighborhood of the current solution.
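A minimal sketch of the escape-energy schedule in Eqs (2.3)-(2.4); the function and variable names are illustrative assumptions.

```python
def escape_energy(E0, t, T):
    """Escape energy of the prey (Eq 2.3): decays linearly from 2*E0 to 0 over T iterations."""
    return 2.0 * E0 * (1.0 - t / T)

# E0 is redrawn for each hawk at every iteration as a random number in (-1, 1) (Eq 2.4),
# e.g. E0 = 2.0 * rng.random() - 1.0; |E| >= 1 triggers exploration, |E| < 1 exploitation.
```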

3.3. Exploitation phase

During the exploitation phase, Harris's hawks choose the most appropriate way to attack their prey, depending on the situation. Four strategies describe the hunting behaviors of the Harris hawk. The probability of the prey escaping is denoted by r: r < 0.5 represents the state in which the prey escapes capture, and r ≥ 0.5 represents the state in which the prey is captured. Regardless of the state, the Harris hawk always surrounds the prey based on the escape energy. When |E| ≥ 0.5, the Harris hawk performs a soft besiege; when |E| < 0.5, it performs a hard besiege.

3.3.1 Soft besiege.

When |E| ≥ 0.5 and r ≥ 0.5, the Harris hawk captures the prey in a soft besiege state. In a soft besiege, the Harris hawk surrounds its prey and dissipates the prey's energy. The process can be expressed as Eq (2.5):

$$X(t+1)=\Delta X(t)-E\left|J\,X_{rabbit}(t)-X(t)\right| \tag{2.5}$$

where ΔX(t) denotes the vector difference between the prey target and the current individual, as given in Eq (2.6), and J refers to the prey's escape force, as shown in Eq (2.7):

$$\Delta X(t)=X_{rabbit}(t)-X(t) \tag{2.6}$$

$$J=2\left(1-r_5\right) \tag{2.7}$$

where r5 represents a random number in [0,1].

3.3.2 Hard besiege.

When |E| < 0.5 and r ≥ 0.5, the energy of the prey target is low. At this point, the Harris hawk uses a surprise pounce to approach the prey, a process that can be expressed by Eq (2.8):

$$X(t+1)=X_{rabbit}(t)-E\left|\Delta X(t)\right| \tag{2.8}$$

3.3.3 Soft besiege with progressive rapid dives.

When |E| ≥ 0.5 and r < 0.5, the energy of the prey target is still high, so the probability of the prey avoiding capture is high. The Harris hawk therefore performs a soft besiege with progressive rapid dives, and a Lévy-flight random walk is used to simulate the prey's escape and the hawk's pursuit, a process that can be expressed by Eq (2.9). Eqs (2.10) and (2.11) determine the next updated position of the Harris hawk:

$$Y=X_{rabbit}(t)-E\left|J\,X_{rabbit}(t)-X(t)\right| \tag{2.9}$$

$$Z=Y+S\times LF(D) \tag{2.10}$$

$$LF(x)=0.01\times\frac{u\,\sigma}{\left|v\right|^{1/\beta}},\qquad \sigma=\left(\frac{\Gamma(1+\beta)\sin\left(\pi\beta/2\right)}{\Gamma\left(\frac{1+\beta}{2}\right)\beta\,2^{\frac{\beta-1}{2}}}\right)^{1/\beta} \tag{2.11}$$

where u and v are random numbers in [0,1], β is set to 1.5, D is the dimension of the problem, and S is a randomly generated 1×D vector. Greedy selection is then used to determine which position update is more favorable, as shown in Eq (2.12), where F(·) calculates the fitness value:

$$X(t+1)=\begin{cases}Y, & \text{if } F(Y)<F(X(t))\\ Z, & \text{if } F(Z)<F(X(t))\end{cases} \tag{2.12}$$

3.3.4 Hard besiege with progressive rapid dives.

When |E| < 0.5 and r < 0.5, the Harris hawk surrounds the low-energy prey by forming an envelope and approaches the prey by narrowing this envelope. The process is described by Eq (2.13)–Eq (2.15):

$$Y=X_{rabbit}(t)-E\left|J\,X_{rabbit}(t)-X_m(t)\right| \tag{2.13}$$

$$Z=Y+S\times LF(D) \tag{2.14}$$

$$X(t+1)=\begin{cases}Y, & \text{if } F(Y)<F(X(t))\\ Z, & \text{if } F(Z)<F(X(t))\end{cases} \tag{2.15}$$
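The four besiege strategies and the Lévy-flight step can be sketched as follows. The code mirrors the standard HHO formulas reconstructed above; the function names, the `fitness` callable and the random generator `rng` are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from math import gamma, pi, sin

def levy_flight(D, rng, beta=1.5):
    """Levy step LF(D) of Eq (2.11), used to model the zigzag escape of the rabbit."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = rng.normal(0.0, sigma, D), rng.normal(0.0, 1.0, D)
    return 0.01 * u / np.abs(v) ** (1 / beta)

def exploitation_step(x, X_rabbit, X_mean, E, r, J, fitness, rng):
    """Apply one of the four besiege strategies (Eqs 2.5-2.15) to a single hawk x."""
    if r >= 0.5 and abs(E) >= 0.5:                  # soft besiege, Eq (2.5)
        dX = X_rabbit - x                           # Eq (2.6)
        return dX - E * np.abs(J * X_rabbit - x)
    if r >= 0.5 and abs(E) < 0.5:                   # hard besiege, Eq (2.8)
        return X_rabbit - E * np.abs(X_rabbit - x)
    # progressive rapid dives: soft (Eq 2.9) uses x, hard (Eq 2.13) uses the population mean
    base = x if abs(E) >= 0.5 else X_mean
    Y = X_rabbit - E * np.abs(J * X_rabbit - base)
    Z = Y + rng.random(x.size) * levy_flight(x.size, rng)   # Eqs (2.10)/(2.14)
    # greedy selection of Eqs (2.12)/(2.15): keep a dive only if it improves on x
    for candidate in (Y, Z):
        if fitness(candidate) < fitness(x):
            return candidate
    return x
```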

4. Proposed SGHHO

In this section, we systematically present the algorithmic implementation of SGHHO, including the two novel mechanisms introduced therein. Compared to the standard HHO, the proposed SGHHO has better performance.

4.1. Simulated annealing based HHO

The simulated annealing (SA) algorithm is a random search algorithm first proposed by N. Metropolis et al. [100]. Metropolis perceived a strong similarity between solid annealing processes in the physical world and general engineering combinatorial optimization. SA is a general probability-based optimization algorithm in which the solid annealing process can be summarized in three stages: the heating stage, the isothermal process and the cooling process:

  1. Heating stage. This process is designed to increase the energy of the molecules of a solid so that the molecules within it have a more intense thermal movement, thus departing from the equilibrium state of the solid molecules. As the temperature reaches a sufficient value, the solid becomes a liquid, resulting in the disappearance of the non-uniform state in the solid system.
  2. Isothermal processes. The states in the system always proceed in the direction of decreasing energy, and as the energy decreases, the system eventually enters equilibrium.
  3. The cooling processes. The thermal movement of molecules is reduced, the energy of the system decreases, and a crystalline structure is formed.

The main idea of the simulated annealing algorithm in a combinatorial optimization problem is then shown as follows:

  • Initialize the temperature values and solve the optimization problem.
  • Execute the algorithm according to the Metropolis criterion.
  • Output optimal solution.

In the SA algorithm, the Metropolis criterion assumes that the position of the previous agent is X(n). After a further iterative update of the population, the agent position becomes X(n+1), and the energy of the search agent changes from E(n) to E(n+1). The probability p of accepting the move of a search agent from X(n) to X(n+1) is given by Eq (3.1):

$$p=\begin{cases}1, & E(n+1)<E(n)\\[4pt] \exp\left(-\dfrac{E(n+1)-E(n)}{T}\right), & E(n+1)\ge E(n)\end{cases} \tag{3.1}$$

where T denotes the current temperature. When moving to the next iteration of the update, if the energy has decreased, the change is accepted. If the energy has increased, the search agent has moved further away from the global optimum; in this case, the algorithm does not immediately discard the move but judges it by probability: a random number ε is generated on the interval [0,1], and if ε < p, the transfer is also accepted; otherwise the transfer is rejected and the algorithm proceeds to the next cooling step, and so on. The cooling equation for the above cooling process is:

$$T_{k+1}=q\,T_k \tag{3.2}$$

where q denotes the cooling factor, whose value is usually set to 0.99 [28], and T denotes the current temperature. When the temperature drops to the termination temperature Tend, the optimal value is output.
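As a sketch, the Metropolis acceptance rule of Eq (3.1) and the geometric cooling of Eq (3.2) can be written as follows; the function and variable names are assumptions.

```python
import math
import random

def metropolis_accept(e_old, e_new, T):
    """Metropolis criterion of Eq (3.1): always accept improvements, accept worse
    moves with probability exp(-(e_new - e_old) / T)."""
    if e_new < e_old:
        return True
    return random.random() < math.exp(-(e_new - e_old) / T)

# Geometric cooling of Eq (3.2): after each move the temperature is multiplied by the
# cooling factor q (e.g. 0.99), T = q * T, until T falls below the termination temperature.
```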

In this study, chaos mapping is introduced into the simulated annealing algorithm to further enhance the performance of SA. The chaotic (logistic) map is defined as follows:

$$\beta_{i+1}=\mu\,\beta_i\left(1-\beta_i\right),\qquad i=1,2,\dots,s \tag{3.3}$$

where μ denotes the control parameter, whose value is usually set to 4 [101]; βi represents a random value in [0,1]; and s stands for the size of the entire population. The chaotic strategy is strongly influenced by the initial conditions and can be understood as a search movement combining overall ergodicity with randomness [102,103]. By effectively circumventing premature convergence of the population, this approach accelerates the convergence rate and enhances the precision of the solutions attained by the HHO algorithm.
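A minimal sketch of the logistic chaos map in Eq (3.3); how the resulting sequence is injected into the SA perturbation is specific to the paper and not shown here.

```python
import numpy as np

def logistic_chaos(s, mu=4.0, seed=None):
    """Chaotic sequence of length s from the logistic map of Eq (3.3)."""
    rng = np.random.default_rng(seed)
    beta = np.empty(s)
    # start strictly inside (0, 1); the endpoints are fixed points of the map
    beta[0] = rng.uniform(0.01, 0.99)
    for i in range(s - 1):
        beta[i + 1] = mu * beta[i] * (1.0 - beta[i])
    return beta
```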

4.2. Gaussian bare-bones

Gaussian bare bones is widely used as a variational strategy. Wang et al. [104] introduced Gaussian bare-bones variation into DE by crossing the feasible solution obtained from a Gaussian perturbation with the solution of the original DE. In this study, an enhanced Gaussian bare-bones mechanism is designed and applied to HHO, and the new update procedure is shown in Eqs (3.4)–(3.6):

$$X_i^j(t+1)=\begin{cases}N\left(\mu_j,\sigma_j\right), & \text{if } rand<CR\\[4pt] X_{i_1}^j(t)+rand\times\left(X_{i_2}^j(t)-X_{i_3}^j(t)\right), & \text{otherwise}\end{cases} \tag{3.4}$$

$$\mu_j=\frac{X_i^j(t)+X_{rabbit}^j(t)}{2} \tag{3.5}$$

$$\sigma_j=\left|X_i^j(t)-X_{rabbit}^j(t)\right| \tag{3.6}$$

where N(μ, σ) is a Gaussian distribution with mean μ and variance σ; CR is the crossover factor, whose value is usually set to 0.3 [104–106]; and X_{i1}^j, X_{i2}^j and X_{i3}^j are three random individuals in dimension j, with i ≠ i1 ≠ i2 ≠ i3.

In the modified Gaussian bare bones, the new position is generated from a Gaussian distribution. The method takes the midpoint between the current solution and the optimal solution as the mean, and uses the difference between the current solution and the optimal solution as a measure of the variance. During the initial stages of evolution, when the disparity between the current solution and the global optimum is substantial, the individuals of the population cover the entire search space; this stage primarily emphasizes global exploration. As the number of offspring increases, the deviation decreases, and the use of a linearly decreasing factor b ensures that individuals in the later population cover a smaller search area. At this point the algorithm can move more quickly into local exploitation, ensuring convergence in the later stage.
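The mechanism described above can be sketched dimension-wise as follows. This is only an illustration of the idea (Gaussian sampling around the midpoint with probability CR, otherwise a DE-style recombination of three random individuals); it omits the paper's linear decreasing factor b, so the exact form of Eq (3.4) may differ.

```python
import numpy as np

def gaussian_bare_bones(X, X_rabbit, CR=0.3, rng=None):
    """Dimension-wise bare-bones mutation: with probability CR sample a Gaussian centred on
    the midpoint of the current hawk and the best solution (spread = their gap), otherwise
    recombine three distinct random individuals in a DE-like fashion."""
    rng = rng if rng is not None else np.random.default_rng()
    N, D = X.shape
    X_new = X.copy()
    for i in range(N):
        for j in range(D):
            if rng.random() < CR:
                mu = 0.5 * (X[i, j] + X_rabbit[j])      # midpoint of current and best
                sigma = abs(X[i, j] - X_rabbit[j])      # gap controls the sampling spread
                X_new[i, j] = rng.normal(mu, sigma)
            else:
                # three distinct indices different from i (requires N >= 4)
                i1, i2, i3 = rng.choice(np.delete(np.arange(N), i), 3, replace=False)
                X_new[i, j] = X[i1, j] + rng.random() * (X[i2, j] - X[i3, j])
    return X_new
```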

4.3. Framework of SGHHO

The entire framework of SGHHO combines both of these strategies, and the flow graph of the overall framework is detailed in Fig 1. First, the population is initialized and the best position is selected as the rabbit's position in the algorithm. After the SGHHO algorithm is updated using the core formulation of HHO, the simulated annealing strategy is applied. Guided by the SA strategy, the iterative process of SGHHO can accept points with larger energy values, i.e., poorer solutions. As a result, the strategy possesses a better vertical search capability with a larger search range, thus effectively improving the probability of HHO finding the global optimum. Next, the Gaussian bare-bones strategy is implemented, and the fine-tuned Gaussian sampling allows SGHHO to ensure diverse exploration in the first stage while focusing on exploitation in the next phase. The switch in search behavior from exploration to exploitation allows SGHHO to converge to a higher level of accuracy.

Eventually, the optimal solution in the entire population is updated until it reaches the maximum allowed number of iterations. Additionally, Algorithm 1 showcases the pseudo-code for the SGHHO method employed in this research.

Algorithm 1. The pseudocode of SGHHO.

Initialize the population size N and maximum number of evaluation MaxFES;

Randomly initialize the location information of population Xi(i = 1,2,…,N);

While (termination condition is not satisfied) do

  Calculate the fitness value for each individual;

  Set Xrabbit to the current optimal solution;

  for (Each individual) do

    Updating the initial escape energy E0 and escape force J;

    Update E by Eq (2.3);

  if (|E| ≥ 1) then

       if (q < 0.5) then

           Update the position according to Eq (2.1);

       elseif (q ≥ 0.5) then

           Update the position according to the random selection in Eq (2.1);

       end if

  elseif (|E| < 1) then

       if (r ≥ 0.5 and |E| ≥ 0.5) then

           Use Eq (2.5) to update the position;

       elseif (r ≥ 0.5 and |E| < 0.5) then

           Use Eq (2.8) to update the position;

       elseif (r < 0.5 and |E| ≥ 0.5) then

           Use Eq (2.12) to update the position;

       elseif (r < 0.5 and |E| < 0.5) then

           Use Eq (2.15) to update the position;

       end if

  end if

  end for

  Calculate the fitness value for all individuals;

  Calculate the mutation positions of the modified Gaussian bare-bones strategy by Eq (3.4);

  Update the search agent locations by means of the modified simulated annealing strategy;

t = t + 1;

end while

return Xrabbit;

4.4. Computational complexity

The evaluation of algorithmic models involves considering the time complexity of an optimizer, which is an important metric. In this research, the time complexity of SGHHO consists of several steps: population initialization, updating fitness values, modifying the positions of individuals in the population, executing the simulated annealing strategy, and implementing the Gaussian bare-bones strategy. In the standard SGHHO algorithm, the time complexity of the first three components is O(population initialization) = O(N), O(fitness value update) = O(T*N) over T update iterations, and O(population individual position update) = O(T*N*D) over T iterations, where N denotes the population size, T denotes the maximum number of iterative updates during the experiment, and D denotes the problem dimension. The SA and GB strategies are also included in the SGHHO algorithm to improve its performance; the time complexity of the SA strategy is O(SA strategy) = O(N), and that of the GB strategy is O(GB strategy) = O(N*D). Thus, the overall time complexity of the SGHHO algorithm is O(SGHHO) = O(N*(T + T*D + 2 + D)).

5. Proposed SGHHO-KNN

This section aims to enable more representative attributes to be found in the dataset and to help researchers mine more effective information. To this end, the HHO improved with the simulated annealing mechanism and the Gaussian bare-bones variation strategy is applied to feature selection. Various evaluation criteria exist for different feature selection methods. In this research, a wrapper feature selection method is employed, which relies primarily on the learning algorithm: the classification performance obtained with the selected features serves as the evaluation metric. To support the evaluation, this study incorporates a KNN classifier. KNN is a non-parametric classification technique known for its ability to achieve high accuracy when dealing with unknown and non-normally distributed data. Moreover, KNN offers several advantages, including conceptual clarity and ease of implementation. Furthermore, the KNN approach depends primarily on a small set of neighboring samples in its classification process, rather than relying on class-domain discrimination methods. This characteristic makes the KNN method particularly well suited to datasets with overlapping or intersecting class domains. It is worth noting that the classification error rate of the KNN classifier is closely tied to the notion of distance. In general, KNN algorithms use distance metrics such as the Euclidean and Manhattan distances; in this study, the Manhattan distance is used for the sample calculations.

In overview, the SGHHO-KNN model is developed in this paper. The model involves three main components: first, a refined version of the HHO algorithm is introduced, which incorporates a simulated annealing mechanism and a Gaussian bare-bones variation strategy. Subsequently, the SGHHO algorithm is combined with the K-Nearest Neighbor (KNN) model, giving rise to a novel classification model referred to as SGHHO-KNN. This model utilizes a hybrid feature selection approach: within this framework, the SGHHO algorithm conducts a thorough exploration of the feature space, enabling the identification and selection of the most suitable feature subset, while the K-Nearest Neighbor classifier is mainly used to evaluate feature subsets and to compare the classification performance after feature selection with the comparison algorithms. That is, with ten-fold cross-validation, the SGHHO algorithm finds the best feature subset on the training set by internal five-fold cross-validation, and the KNN model uses this best feature subset to perform the classification task on the test set. In addition, for the experiments in this study, each classification experiment was run 10 times independently to ensure that the evaluation was fair and unbiased. Therefore, the performance of the algorithm was evaluated according to the average classification results and the corresponding standard deviation of the 10 independent runs. The flow chart of the SGHHO-KNN hybrid model proposed in this study is shown in Fig 2.
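As a sketch, a wrapper fitness of the kind such a model could use to score a candidate binary feature mask is shown below, assuming scikit-learn for the KNN classifier and cross-validation. The weighting factor alpha, k = 5 and the exact error/subset-size trade-off are illustrative assumptions, not the paper's objective function.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def feature_subset_fitness(mask, X, y, alpha=0.99):
    """Wrapper fitness of a binary feature mask (smaller is better): a weighted mix of
    the KNN cross-validated error and the fraction of features kept."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:                      # an empty subset is invalid
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=5, metric="manhattan")
    acc = cross_val_score(knn, X[:, selected], y, cv=5).mean()   # internal 5-fold CV
    return alpha * (1.0 - acc) + (1.0 - alpha) * selected.size / X.shape[1]
```

SGHHO would then search over candidate masks, with this score guiding the selection of the final feature subset evaluated by KNN on the held-out test folds.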

6. Experimental design and analysis of results

To analyze the SGHHO's performance in various aspects, comprehensive experiments are designed. In the optimization experiments, comprehensive tests were conducted on the IEEE CEC 2014 function set. The experimental procedure is as follows: first, experiments are designed to test the scalability of the SGHHO algorithm with respect to dimensional changes; second, the effects of the two mechanisms on the SGHHO algorithm are investigated; third, comparison experiments are carried out between the SGHHO algorithm and 10 well-known optimization algorithms. For the metaheuristic setup described above, the population size is 30 and the maximum number of evaluations is 300,000. Meanwhile, all competing methods were independently repeated 30 times during testing to avoid chance results. Finally, all the above tests were conducted on a Windows 10 host with 16GB RAM and a 3.6GHz CPU and coded in MATLAB R2016b.

6.1. CEC 2014 function validation

In this section, the IEEE CEC 2014 function set was chosen to assess each algorithm's optimization precision and convergence speed. The functions are divided into four main types: unimodal functions (F1–F3), multimodal functions (F4–F8), hybrid functions (F9–F20) and composition functions (F21–F30). Table 1 shows the specific data for these 30 functions; the last two columns show the type of each function and the corresponding global optimum value. Furthermore, to provide a more comprehensive assessment of each algorithm's performance, two metrics are utilized, namely the average value (AVG) and the standard deviation (STD). The experimental tables highlight the optimal values for each problem in bold. Additionally, non-parametric statistical tests, namely the Wilcoxon signed-rank test [107] and the Friedman test [108], are employed to quantitatively evaluate the performance of SGHHO.

Table 1. Description of the 30 IEEE CEC 2014 benchmark functions.

https://doi.org/10.1371/journal.pone.0291626.t001
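For reference, both non-parametric tests mentioned above are available in SciPy; the arrays below are synthetic placeholders standing in for per-run results of different optimizers, not data from this study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-ins for the best fitness values of three optimizers over 30 runs
alg_a = rng.normal(100, 5, 30)
alg_b = rng.normal(103, 5, 30)
alg_c = rng.normal(101, 5, 30)

w_stat, w_p = stats.wilcoxon(alg_a, alg_b)                  # pairwise Wilcoxon signed-rank test
f_stat, f_p = stats.friedmanchisquare(alg_a, alg_b, alg_c)  # Friedman test across algorithms
print(f"Wilcoxon p = {w_p:.3g}, Friedman p = {f_p:.3g}")    # p < 0.05 -> significant at the 5% level
```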

6.2. Scalability analysis of SGHHO

The problem’s dimensionality corresponds to the number of factors to be optimized within the given optimization problem. Consequently, optimizing high-dimensional data allows for a more comprehensive evaluation of the algorithm’s overall performance and stability. Thus, the performance of the SGHHO algorithm can be thoroughly assessed. We use the original HHO and Farmland Fertility Algorithm (FFA) [109], which have better performance in the variable dimensionality case, as the reference objects for the experiments in this subsection. Within the experimental framework, scalability experiments are conducted using a test function set consisting of 30 functions from IEEE CEC 2014. The dimensions are varied, specifically set to 100, 500, and 1000, respectively. All experimental parameters remain constant, with the exception of the dimension setting (D). The metrics utilized to analyze the experimental results include AVG (average) and STD (standard deviation). A comprehensive overview of the results can be found in Table 2.

Table 2. Experimental results of scalability tests on SGHHO.

https://doi.org/10.1371/journal.pone.0291626.t002

The table clearly shows that the SGHHO algorithm consistently outperforms the original HHO algorithm on the unimodal functions (F1-F3), irrespective of the dimensional complexity. The AVG values achieved by the SGHHO algorithm exhibit a significant improvement over those of the original HHO algorithm, in both low- and high-dimensional settings. SGHHO also outperforms the FFA algorithm on the F1 and F3 functions. As the complexity of the functions increases, the SGHHO algorithm shows better optimization performance than the original HHO algorithm on the multimodal functions (F4-F9). The implementation of the GB strategy successfully directs the HHO algorithm towards exploring regions that contain more optimal solutions, thereby enhancing the algorithm's solution precision. The FFA, with its strong local search capability, slightly outperforms SGHHO on F4, F7 and F8. Further, on the hybrid functions (F9-F20), the SGHHO algorithm still maintains an obvious superiority over FFA and HHO. Despite the variations in dimensional settings, the SGHHO algorithm demonstrates consistent performance in attaining optimal solutions. Among the composition functions (F21-F30), although there is a slight difference in the AVG value between the SGHHO algorithm and the original algorithm on the F29 function, it remains small; for all other functions, the SGHHO algorithm outperforms the HHO algorithm by a significant margin. The scalability experiments indicate that the SGHHO algorithm achieves notable enhancements in optimization performance and stability on 93.3% of the functions in the IEEE CEC 2014 function set under variable dimensionality. This reflects the considerable improvements achieved by integrating the SA and GB strategies.

6.3. The impact of two mechanisms

This subsection investigates the effects of the introduced simulated annealing strategy and the Gaussian bare-bones strategy on HHO. To verify the role of each added mechanism, this experiment enumerates all combinations of the SA and GB strategies, thus isolating the interactions between the mechanisms. The details are displayed in Table 3, where "S" and "G" denote "simulated annealing" and "Gaussian bare bones", respectively, and 1 and 0 indicate that the strategy is selected or unselected. The four algorithms were compared on the 30 test functions. Table 4 details the AVG and STD values obtained by the algorithms during the experiments. Similarly, the optimal values obtained for each function are marked in bold. Analysis of the table shows that SGHHO has the best solution accuracy among the variants, ranking first on 19 of the 30 functions. In addition, SGHHO is more stable than SHHO and GHHO in terms of STD values. This means that the SGHHO algorithm maximizes the performance benefits of both strategies.

Table 4. Optimization results of each HHO variant on the IEEE CEC 2014 functions.

https://doi.org/10.1371/journal.pone.0291626.t004

The p-values of the Wilcoxon signed-rank test at the 5% significance level are presented in Table 5. The symbols "+/-/=" signify the outcomes of the comparison between SGHHO and the other methods. From the table, SGHHO outperforms HHO on 24 test functions, outperforms the GHHO algorithm on 13 functions, and outperforms the SHHO algorithm on 10 functions. The integration of the two strategies demonstrates a substantial enhancement in the performance of HHO, attaining its maximum potential in the SGHHO algorithm. Additionally, the Friedman test was employed to quantitatively assess the algorithm's performance. According to the average ranking of the algorithms, the SGHHO algorithm secures the top position among all combinations, signifying a significant advancement in achieving optimal performance. Not only is there a significant improvement in global search, but there is also a significant breakthrough in local exploitation. Therefore, the experimental results in this section serve as evidence for the efficacy of the SA and GB strategies in enhancing the performance of the original HHO algorithm.

Table 5. The p-values of Wilcoxon test for HHO variants on the benchmark function.

https://doi.org/10.1371/journal.pone.0291626.t005

6.4. Comparison with other reported well-known optimizers

To verify the optimization performance of the SGHHO algorithm, 10 high-quality algorithms, including BMWOA [110], CBA [111], EM [112], OBSCA [113], IGWO [114], MFO [115], ALPSO [116], CGPSO [117], SCADE [118] and HGWO [119], were selected for comparison. The IEEE CEC 2014 test set was utilized in the experiments to comprehensively evaluate the algorithms' exploration and exploitation capabilities. To provide further insight into the performance comparisons, statistical tests, namely the Wilcoxon rank sum test and the Friedman test, were employed to assess the significant differences between SGHHO and the other algorithms. To ensure experimental fairness, all algorithms were compared under the same settings, and each algorithm's parameters are shown in Table 6.

Table 7 shows the AVG and STD values obtained by all competitors on the set of tested functions. The optimal values obtained on each function are bolded to allow a more visual analysis of the gaps between the algorithms. The results in Table 7 demonstrate that SGHHO can reach the optimal solution on 67% of the tested functions. In particular, on the unimodal functions, SGHHO is not optimal only on the F2 function, ranking just behind the ALPSO and CBA algorithms. Compared to algorithms such as BMWOA, OBSCA, CGPSO and SCADE, SGHHO shows strong competitiveness on the unimodal functions. In addition, although the AVG values of SGHHO on the multimodal functions are not fully optimal, there is no significant difference compared to the optimal solution for each function, and the average AVG ranking is better. On the hybrid functions, SGHHO can find optimal solutions for most functions compared to the other advanced algorithms; this can be clearly seen on the F10-F20 functions. In addition, using the Friedman test, this study compares the combined strength of SGHHO and these competing algorithms.

Table 7. Comparative results for SGHHO and other reported methods on the benchmark function.

https://doi.org/10.1371/journal.pone.0291626.t007

As the data at the end of Table 8 indicate, SGHHO achieves the first overall average ranking on the CEC 2014 function set, followed by algorithms including ALCPSO, IGWO and CGPSO. Besides, according to the Wilcoxon test, the p-value of SGHHO is mostly less than 0.05 compared to the other methods, which demonstrates the significant difference between SGHHO and the other competitors. This validates that the SGHHO algorithm, which incorporates the Gaussian bare-bones strategy and the simulated annealing strategy, is a promising algorithm for different types of optimization problems, and that SGHHO significantly improves HHO's optimization power.

Table 8. P-values for SGHHO and other reported algorithms by the Wilcoxon test on the benchmark functions.

https://doi.org/10.1371/journal.pone.0291626.t008

Fig 3 illustrates the convergence curves of SGHHO in comparison to other state-of-the-art algorithms on the CEC 2014 test suite. The convergence curves provide a more intuitive picture of the difference in convergence quality and optimization capability between SGHHO and the competitors. The figure clearly demonstrates that SGHHO exhibits faster convergence rates for the majority of functions, for example F1, F10, F16, F29 and F30. This is because SGHHO utilizes the Gaussian bare-bones strategy for individual variation and can expand the global search early in the evolutionary process. Furthermore, the results for F3, F5, F18 and F21 indicate that even though SGHHO may not exhibit the fastest early convergence, it manages to sustain its progress towards the optimal value after ALPSO, CBA and CGPSO have reached convergence. This signifies the ability of SGHHO to effectively avoid being trapped in local optima. Also, SGHHO converges rapidly in the later evolutionary stage through the SA mechanism and eventually achieves higher precision and better solutions in the search process than the other comparative algorithms. In contrast, algorithms such as OBSCA, ALCPSO, BMWOA and SCADE fall into premature convergence, while SGHHO maintains its convergence trend toward the global optimum. SGHHO demonstrates a remarkable capability to maintain a balance between exploration and exploitation throughout the search process, thereby rejuvenating the evolution of the population. This ability proves instrumental in mitigating the issue of premature convergence that is commonly encountered in HHO.

Fig 3. Convergence curves for SGHHO and advanced algorithms on 9 selected benchmark functions.

https://doi.org/10.1371/journal.pone.0291626.g003

A comparison of the mean, standard deviation values, Wilcoxon test and convergence figures leads to the conclusion that the proposed SGHHO significantly outperforms other competing algorithms. It can consistently maintain a leading position in function optimization problems, and the advantage of SGHHO becomes more obvious as the dimensionality of the problem increases.

Hence, in subsequent investigations, this approach can be extended to a broader range of scenarios, such as optimization of machine learning models [120], computer-aided medical diagnosis [121,122], pathology image segmentation [123–125], image denoising [126,127], fine-grained alignment [128], cancer diagnosis [129–131], medical signals [132,133], and structured sparsity optimization [134].

7. Predicting the future of the sharing economy

7.1. Methods & data collection

The sharing economy dataset collected for this study mainly contains data from members of the general public who have participated in the sharing economy. From these actual participants, 671 were selected for the study. The questionnaire was developed with reference to numerous scholars' questionnaires on the sharing economy and is expected to have good reliability and validity. The questionnaire covers 43 attributes of the respondents, examining their gender, education level, political status, age, profession, location, income level, consumption level, the various forms of sharing economy they engage in, and their perception of the sharing economy (see Table 9). This study examines the basic situation of the public's participation in the sharing economy, explores the importance of these attributes and their inherent linkages, and builds a predictive model of future trends in the sharing economy on this basis.

Since the above-mentioned questionnaires did not involve ethical issues, the review committee/ethics committee of Wenzhou University granted an exemption from ethical review. All participants in the questionnaires signed a consent form.

7.2. Condition configuration

The experiments involved in this work were completed on a host computer with 16GB of RAM, a 3.6GHz CPU and Windows 10, and coded in MATLAB R2016b. In addition, the experimental parameters are extremely important to the simulation, and subtle changes in the parameters can affect the final experimental outcome. To ensure the fairness of the study, the proposed models were quantitatively evaluated using statistical techniques, including the calculation of average values (AVG) and standard deviations (STD). The AVG represents the average predictive performance of each model, while the STD indicates the degree of variation across 10 independent runs. To further assess the effectiveness of the proposed models, four commonly used classification metrics were employed, namely Accuracy (ACC), Sensitivity, Specificity, and the Matthews correlation coefficient (MCC).
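These four metrics can be computed from the binary confusion matrix as in the following sketch; the function and variable names are assumptions.

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """ACC, Sensitivity, Specificity and MCC from a binary (0/1) confusion matrix."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = float(np.sum((y_true == 1) & (y_pred == 1)))
    tn = float(np.sum((y_true == 0) & (y_pred == 0)))
    fp = float(np.sum((y_true == 0) & (y_pred == 1)))
    fn = float(np.sum((y_true == 1) & (y_pred == 0)))
    acc = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)                 # true positive rate
    specificity = tn / (tn + fp)                 # true negative rate
    mcc = (tp * tn - fp * fn) / np.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return acc, sensitivity, specificity, mcc
```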

7.3. Experimental results and analysis of SGHHO-KNN

To investigate the key factors influencing the future development of the sharing economy, this experiment uses a variety of machine learning models to make predictions on the real dataset collected, including the BSGHHO-KNN, BSHHO-KNN, BGHHO-KNN, BP, RF and ELM classification models. The classification results of each model are statistically analyzed using four indicators, namely ACC, Sensitivity, Specificity and MCC. The comprehensive experimental findings are reported in Table 10, providing detailed insight into the results obtained. It can be intuitively seen that the BSGHHO-KNN model obtained the highest accuracy of 99.70% in predicting the future development trend of the sharing economy, while the other five models achieved 98.66%, 93.45%, 51.56%, 99.11% and 65.73%, respectively. While the BSHHO-KNN and RF models demonstrated satisfactory accuracy, the BSGHHO-KNN model exhibited superior performance in terms of the Specificity and MCC metrics, surpassing both and attaining the highest classification outcomes. Based on the above four evaluation metrics, it can be concluded that the BSGHHO-KNN model is feasible for the problem of predicting future trends in the sharing economy.

Table 10. Average results of BSGHHO-KNN and other models on four indicators.

https://doi.org/10.1371/journal.pone.0291626.t010

Table 11 shows the standard deviation values derived from the BSGHHO-KNN model and the other five models over 10 independent experiments. The optimal values in the table are bolded for a more visual analysis of the experimental data. The results show that the BSGHHO-KNN model demonstrates exceptional stability in terms of ACC, Sensitivity and MCC, surpassing the other models in these aspects. In terms of the Specificity indicator, the RF model attains the best value.

Table 11. Standard deviation values for BSGHHO-KNN and the other five models on the four indicators.

https://doi.org/10.1371/journal.pone.0291626.t011

To better compare the performance gaps between the models, Fig 4 presents histograms of the AVG and STD values for the models mentioned above. Examination of Fig 4 reveals that the BSGHHO-KNN model exhibits superior performance across all four indicators, namely ACC, Sensitivity, Specificity and MCC, resulting in the most effective classification outcome, followed closely by the RF model and the BSHHO-KNN model. The figure also shows that BSHHO-KNN and BGHHO-KNN are both effective compared with the other classifiers. The experiments show that the HHO algorithm has excellent adaptability in combination with the KNN classifier. In addition, the BP model performed the worst on the sharing economy dataset, with an ACC value of only 51.56% and a Sensitivity of 16.29%; this suggests that the BP model needs to be tuned with appropriate parameters for the specific problem. Moreover, the BSHHO-KNN and BGHHO-KNN models do not perform best on the dataset, which means that the BSGHHO algorithm enables the KNN classifier to maximize its classification performance and thus achieve better classification results. In summary, the BSGHHO-KNN model outperforms other similar methods and can be used to investigate the key factors affecting future trends in the sharing economy.

Fig 4. Comparison of SGHHO_KNN with well-known classifiers.

https://doi.org/10.1371/journal.pone.0291626.g004

In this experimental study, the proposed model successfully selects the optimal subset of features throughout the entire process. Fig 5 shows, as a line graph, the number of times each feature was selected in each experiment. As the figure shows, "Form of sharing economy in the region" (F7), "Attitude towards the sharing economy" (F30), "Products in the sharing domain that have been used" (F22), "Factors of greatest concern when using shared product services" (F23), "Main aspects of distress caused by the sharing economy" (F28), "Negative effects of the sharing economy on society" (F39), "Main aspects of concerns about the sharing economy" (F35), and "Any suggestions for the development of the sharing economy" (F43) were selected most frequently. These eight most common attributes appeared 10, 10, 5, 5, 5, 4, 4, and 4 times, respectively. Therefore, this study concludes that these attributes may make a valuable contribution to predicting future trends in the sharing economy.

Fig 5. The times each feature was selected by SGHHO_KNN during the 10-fold CV process.

https://doi.org/10.1371/journal.pone.0291626.g005
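As a small illustration of how the counts plotted in Fig 5 can be tallied, the snippet below assumes that the optimizer returns one binary feature mask per fold of the 10-fold CV; the variable `masks` and the F-prefixed labels are illustrative assumptions, not artifacts of the original code.

```python
import numpy as np

def selection_frequency(masks):
    """Count how often each feature is selected across CV folds.

    masks : list of 0/1 vectors (one per fold) returned by the binary optimizer
    """
    counts = np.sum(np.asarray(masks), axis=0)
    order = np.argsort(counts)[::-1]            # most frequently selected first
    return [(f"F{i + 1}", int(counts[i])) for i in order]

# Features selected in every fold (e.g. F7 and F30 in this study) would appear
# at the top of the returned list with a count of 10.
```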

7.4. Discussion

The emergence of the sharing economy as a novel consumption model has introduced certain challenges, yet the public's expectations for its future development remain high. The analysis of the questionnaire results makes it evident that numerous factors influence the future trajectory of the sharing economy. Among the 43 attributes considered, F7, F22, F23, F28, F30, F35, F39, and F43 stand out as the most significant in shaping the sharing economy's future development trend. Because the sharing economy is a model intended to serve the entire population, selecting the type of sharing economy that effectively serves the public is crucial and plays a pivotal role in its overall advancement.

Attributes F22 and F23 pertain to consumer feedback on their experience of the sharing economy after use. In this context, the success of a particular type of sharing economy hinges on its capacity to satisfy diverse consumer needs at a reduced cost while also earning positive experiential feedback, and this plays a pivotal role in driving the development of the sharing economy. When consumers have a greater choice of products and services, providers lose some of their pricing power and consumer surplus increases significantly. Conversely, when the price of products and services is lower than usual, market demand rises as the price falls, so applying the sharing economy model helps to increase market demand and thus create greater value for money.

Attributes F28, F35, and F39 concern the problems caused by the sharing economy. Under this new model, existing regulations and systems do not yet fit its development, so problems such as taxation, labor security, and information security arise one after another. Meanwhile, some current regulations are not effective in governing the implementation of the sharing economy model, and in some cases the model remains bound by the legal framework designed for traditional business models, so its value is undermined by the lack of appropriate laws and regulations. Therefore, once the development mechanism is clarified, resolving the mismatch between the sharing economy model and its regulatory environment is a key factor in developing the sharing economy.

According to attributes F30 and F43, participants play a crucial role in regulating the activities of the sharing economy. However, the current process of gathering participant feedback and opinions is inadequate, and little attention has been paid to establishing targeted channels for collecting and incorporating their input. In addition, sharing platforms have yet to be effectively improved and perfected, leaving participants without comprehensive, real-time regulation. Therefore, encouraging public participation in the regulation of the sharing sector is a vital measure for promoting the development of the sharing economy.

In summary, the sharing economy exhibits positive momentum, with a wider scope of development and an increasing variety of sharing economy products. The future growth of the sharing economy relies on the support of national policies and the continuous enhancement of the management system, which serves as a political guarantee for its stable progress. Additionally, the improvement of social infrastructure is a prerequisite for establishing and maintaining the sharing economy, while the enhancement of citizens’ quality and their active and respectful involvement in the sharing economy are essential for its sustainable long-term development. Despite current challenges in the sharing economy’s development, the outlook for the future remains promising.

8. Conclusions and future directions

In this work, we develop an effective SGHHO-KNN hybrid model to predict the future development of the sharing economy. A novel variant of HHO, called SGHHO, is proposed. Unlike other existing variants, the main contribution and innovation of the algorithm is the effective incorporation of an improved Gaussian bare-bones strategy and a simulated annealing mechanism. When the original HHO algorithm falls into a local optimum, the Gaussian bare-bones strategy generates a mutation factor that guides individuals away from the current optimum while giving promising solutions a higher chance of survival. The enhanced simulated annealing mechanism strengthens the exploration capability of the algorithm, enabling a more comprehensive global search across the entire feature space. To assess the optimization capability of SGHHO, rigorous comparative tests were conducted on the IEEE CEC 2014 test suite. The results show that the SGHHO algorithm is substantially superior to the original HHO algorithm on 96.7% of the functions, demonstrating the significant improvements in multivariate optimization achieved by integrating the simulated annealing (SA) and Gaussian bare-bones (GB) strategies. Compared with other high-performing algorithms, SGHHO also achieves the optimal solution on 67% of the tested functions, highlighting its strong competitiveness. Furthermore, combining the SGHHO algorithm with KNN yields a better feature subset than previous methods. Compared with conventional machine learning approaches, the SGHHO-KNN model achieved the best results on the evaluation metrics of ACC, sensitivity, specificity, and MCC, attaining values of 99.70%, 100.00%, 99.38%, and 99.42%, respectively. These four evaluation indicators fully demonstrate the feasibility of applying the BSGHHO-KNN model to the problem of predicting future trends in the sharing economy. In conclusion, the SGHHO-KNN approach proposed in this study effectively selects an optimal subset of features from the available data, enabling accurate predictions of the development trend of the sharing economy.
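To make the two mechanisms concrete, the following Python sketch shows generic, textbook forms of a Gaussian bare-bones move and a Metropolis-style simulated annealing acceptance test. It is an illustration under those assumptions, not the paper's exact update equations or cooling schedule.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_bare_bones(x_i, x_best):
    """Generic Gaussian bare-bones move: sample around the midpoint of the
    current individual and the best individual, with a spread equal to their
    distance (textbook form, not necessarily the paper's exact variant)."""
    mu = (x_i + x_best) / 2.0
    sigma = np.abs(x_i - x_best)
    return rng.normal(mu, sigma + 1e-12)

def sa_accept(f_new, f_old, temperature):
    """Metropolis acceptance rule used in simulated annealing: always accept
    improvements, accept worse solutions with probability exp(-delta / T)."""
    if f_new <= f_old:                          # minimization
        return True
    return rng.random() < np.exp(-(f_new - f_old) / max(temperature, 1e-12))

# Typical use inside iteration t of a hybrid optimizer (sketch only):
# candidate = gaussian_bare_bones(population[i], best)
# if sa_accept(objective(candidate), objective(population[i]), T0 * 0.97 ** t):
#     population[i] = candidate
```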

Although the SGHHO algorithm has shown excellent performance in the above applications, there are still shortcomings and aspects that deserve further research. For instance, SGHHO does not offer a universal solution to all intricate optimization problems, and its parameters require careful, problem-specific analysis to achieve optimal performance. Furthermore, SGHHO is a stochastic optimizer and is randomized by nature, which means it may still fall into a local optimum in other complex applications. Therefore, in future research the SGHHO method could be combined with the latest optimization algorithms, such as the farmland fertility algorithm and hunger games search, and applied in areas such as financial risk prediction and medical data diagnosis [135]. Nowadays, as data in various fields grows in size, large-scale datasets also generate a large amount of redundant, useless, and noisy data, which seriously degrades the performance of learning algorithms for data analysis. Feature selection therefore plays an important role in this process: it can drastically reduce the size of the data while preserving the information expressed in the original feature set, thereby avoiding the combinatorial explosion problem. This is particularly true for corporate bankruptcy prediction and intelligent medical diagnosis. The SGHHO method can subsequently be applied to these fields, in combination with machine learning methods, to assist asset owners and medical staff in making intelligent decisions.

References

  1. 1. Grybaite V., et al., COMPARISON OF THE ENVIRONMENT OF EU COUNTRIES FOR SHARING ECONOMY STATE BY MODERN MULTIPLE CRITERIA METHODS. AMFITEATRU ECONOMIC, 2022. 24(59): p. 194–213.
  2. 2. Laukkanen M. and Tura N., The potential of sharing economy business models for sustainable value creation. JOURNAL OF CLEANER PRODUCTION, 2020. 253.
  3. 3. Zhu X. and Liu K., A systematic review and future directions of the sharing economy: business models, operational insights and environment-based utilities. JOURNAL OF CLEANER PRODUCTION, 2021. 290.
  4. 4. Baumber A., Scerri M., and Schweinsberg S., A social licence for the sharing economy. TECHNOLOGICAL FORECASTING AND SOCIAL CHANGE, 2019. 146: p. 12–23.
  5. 5. Oberg C., Towards a typology of sharing economy business model transformation. TECHNOVATION, 2023. 123.
  6. 6. Cheng X., Mou J., and Yan X., Sharing economy enabled digital platforms for development. INFORMATION TECHNOLOGY FOR DEVELOPMENT, 2021. 27(4): p. 635–644.
  7. 7. Acquier A., Daudigeos T., and Pinkse J., Promises and paradoxes of the sharing economy: An organizing framework. TECHNOLOGICAL FORECASTING AND SOCIAL CHANGE, 2017. 125: p. 1–10.
  8. 8. Cui L., et al., Text mining to explore the influencing factors of sharing economy driven digital platforms to promote social and economic development. INFORMATION TECHNOLOGY FOR DEVELOPMENT, 2021. 27(4): p. 779–801.
  9. 9. Ritter M. and Schanz H., The sharing economy: A comprehensive business model framework. JOURNAL OF CLEANER PRODUCTION, 2019. 213: p. 320–331.
  10. 10. Curtis S.K. and Mont O., Sharing economy business models for sustainability. JOURNAL OF CLEANER PRODUCTION, 2020. 266. pmid:32884181
  11. 11. Rojanakit P., de Oliveira R.T., and Dulleck U., The sharing economy: A critical review and research agenda. JOURNAL OF BUSINESS RESEARCH, 2022. 139: p. 1317–1334.
  12. 12. Carrigan M., et al., Fostering sustainability through technology-mediated interactions Conviviality and reciprocity in the sharing economy. INFORMATION TECHNOLOGY & PEOPLE, 2020. 33(3): p. 919–943.
  13. 13. Ciulli F. and Kolk A., Incumbents and business model innovation for the sharing economy: Implications for sustainability. JOURNAL OF CLEANER PRODUCTION, 2019. 214: p. 995–1010.
  14. 14. Tham W.K., Lim W.M., and Vieceli J., Foundations of consumption and production in the sharing economy. ELECTRONIC COMMERCE RESEARCH.
  15. 15. Thebault-Spieker J., Terveen L., and Hecht B., Toward a Geographic Understanding of the Sharing Economy: Systemic Biases in UberX and TaskRabbit. ACM TRANSACTIONS ON COMPUTER-HUMAN INTERACTION, 2017. 24(3).
  16. 16. Veretennikova A. and Kozinskaya K., Assessment of the Sharing Economy in the Context of Smart Cities: Social Performance. SUSTAINABILITY, 2022. 14(19).
  17. 17. Klos L., THE SHARING ECONOMY IN THE OPINION OF POLISH CONSUMERS. EKONOMIA I SRODOWISKO-ECONOMICS AND ENVIRONMENT, 2021. 2(77): p. 112–125.
  18. 18. Francisco Perles-Ribes J., et al., Competitiveness and overtourism: a proposal for an early warning system in Spanish urban destinations. EUROPEAN JOURNAL OF TOURISM RESEARCH, 2021. 27.
  19. 19. Chen D., You N., and Lv F., Study on Sharing Characteristics and Sustainable Development Performance: Mediating Role of the Ecosystem Strategy. SUSTAINABILITY, 2019. 11(23).
  20. 20. Ackermann C.-L., Matson-Barkat S., and Truong Y., A legitimacy perspective on sharing economy consumption in the accommodation sector. CURRENT ISSUES IN TOURISM, 2022. 25(12): p. 1947–1967.
  21. 21. Cheng M., et al., The sharing economy and sustainability—assessing Airbnb’s direct, indirect and induced carbon footprint in Sydney. JOURNAL OF SUSTAINABLE TOURISM, 2020. 28(8): p. 1083–1099.
  22. 22. Ferreri M. and Sanyal R., Platform economies and urban planning: Airbnb and regulated deregulation in London. URBAN STUDIES, 2018. 55(15): p. 3353–3368.
  23. 23. Zemla M., Jaremen D.E., and Nawrocka E., Consequences of development of the sharing economy in tourism for cities—theory and examples. PRACE KOMISJI GEOGRAFII PRZEMYSLU POLSKIEGO TOWARZYSTWA GEOGRAFICZNEGO-STUDIES OF THE INDUSTRIAL GEOGRAPHY COMMISSION OF THE POLISH GEOGRAPHICAL SOCIETY, 2021. 35(1): p. 109–122.
  24. 24. Zhao S., et al., Multilevel threshold image segmentation with diffusion association slime mould algorithm and Renyi’s entropy for chronic obstructive pulmonary disease. Computers in Biology and Medicine, 2021. 134: p. 104427. pmid:34020128
  25. 25. Liu Y., et al., Boosting Slime Mould Algorithm for Parameter Identification of Photovoltaic Models. Energy, 2021: p. 121164.
  26. 26. Kim T.Y., et al., Prediction of Bike Share Demand by Machine Learning: Role of Vehicle Accident as the New Feature. INTERNATIONAL JOURNAL OF BUSINESS ANALYTICS, 2022. 9(1).
  27. 27. Wang X., et al., E-book adoption behaviors through an online sharing platform A multi-relational network perspective. INFORMATION TECHNOLOGY & PEOPLE, 2020. 33(3): p. 1011–1035.
  28. 28. Tornberg P., How sharing is the "sharing economy"? Evidence from 97 Airbnb markets. PLOS ONE, 2022. 17(4). pmid:35446887
  29. 29. Shokoohyar S., Sobhani A., and Sobhani A., Determinants of rental strategy: short-term vs long-term rental strategy. INTERNATIONAL JOURNAL OF CONTEMPORARY HOSPITALITY MANAGEMENT, 2020. 13(12): p. 3873–3894.
  30. 30. Nadeem W., et al., The Role of Ethical Perceptions in Consumers’ Participation and Value Co-creation on Sharing Economy Platforms. JOURNAL OF BUSINESS ETHICS, 2021. 169(3): p. 421–441.
  31. 31. Jiang J., et al., A Destination Prediction Network Based on Spatiotemporal Data for Bike-Sharing. COMPLEXITY, 2019.
  32. 32. Yang Y., et al., Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts. Expert Systems with Applications, 2021. 177: p. 114864.
  33. 33. Tu J., et al., The Colony Predation Algorithm. Journal of Bionic Engineering, 2021. 18(3): p. 674–710.
  34. 34. Chen H., et al., Slime mould algorithm: a comprehensive review of recent variants and applications. International Journal of Systems Science, 2022: p. 1–32.
  35. 35. Li S., et al., Slime mould algorithm: A new method for stochastic optimization. Future Generation Computer Systems, 2020. 111: p. 300–323.
  36. 36. Heidari A.A., et al., Harris hawks optimization: Algorithm and applications. Future Generation Computer Systems-the International Journal of Escience, 2019. 97: p. 849–872.
  37. 37. Ahmadianfar I., et al., RUN Beyond the Metaphor: An Efficient Optimization Algorithm Based on Runge Kutta Method. Expert Systems with Applications, 2021: p. 115079.
  38. 38. Ahmadianfar I., et al., INFO: An Efficient Optimization Algorithm based on Weighted Mean of Vectors. Expert Systems with Applications, 2022: p. 116516.
  39. 39. Su H., et al., RIME: A physics-based optimization. Neurocomputing, 2023.
  40. 40. Wang M., et al., Evaluation of constraint in photovoltaic models by exploiting an enhanced ant lion optimizer. Solar Energy, 2020. 211: p. 503–521.
  41. 41. Hu J., et al., Chaotic diffusion‐limited aggregation enhanced grey wolf optimizer: insights, analysis, binarization, and feature selection. International Journal of Intelligent Systems, 2022. 37(8): p. 4864–4927.
  42. 42. Dlaska C., et al., Quantum Optimization via Four-Body Rydberg Gates. PHYSICAL REVIEW LETTERS, 2022. 128(12). pmid:35394305
  43. 43. Kaya E., et al., A review on the studies employing artificial bee colony algorithm to solve combinatorial optimization problems. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2022. 115.
  44. 44. Hu J., et al., Dispersed foraging slime mould algorithm: Continuous and binary variants for global optimization and wrapper-based feature selection. Knowledge-Based Systems, 2022. 237: p. 107761.
  45. 45. Zhang X., et al., Gaussian mutational chaotic fruit fly-built optimization and feature selection. Expert Systems with Applications, 2020. 141: p. 112976.
  46. 46. AkbaiZadeh M., Niknam T., and Kavousi-Fard A., Adaptive robust optimization for the energy management of the grid-connected energy hubs based on hybrid meta-heuristic algorithm. ENERGY, 2021. 235.
  47. 47. Khan N.H., et al., A Novel Modified Lightning Attachment Procedure Optimization Technique for Optimal Allocation of the FACTS Devices in Power Systems. IEEE Access, 2021. 9: p. 47976–47997.
  48. 48. Hu J., et al., Detection of COVID-19 severity using blood gas analysis parameters and Harris hawks optimized extreme learning machine. Computers in Biology and Medicine, 2022. 142: p. 105166. pmid:35077935
  49. 49. Yang X., et al., Multi-level threshold segmentation framework for breast cancer images using enhanced differential evolution. Biomedical Signal Processing and Control, 2023. 80: p. 104373.
  50. 50. Zhao S., et al., Performance optimization of salp swarm algorithm for multi-threshold image segmentation: Comprehensive study of breast cancer microscopy. Computers in biology and medicine, 2021. 139: p. 105015. pmid:34800808
  51. 51. Hao S., et al., Performance optimization of water cycle algorithm for multilevel lupus nephritis image segmentation. Biomedical Signal Processing and Control, 2023. 80: p. 104139.
  52. 52. Zhang Y., et al., Towards augmented kernel extreme learning models for bankruptcy prediction: Algorithmic behavior and comprehensive analysis. Neurocomputing, 2021. 430: p. 185–212.
  53. 53. Dong R., et al., Boosted kernel search: Framework, analysis and case studies on the economic emission dispatch problem. Knowledge-Based Systems, 2021. 233: p. 107529.
  54. 54. Liu Y., et al., Simulated annealing-based dynamic step shuffled frog leaping algorithm: Optimal performance design and feature selection. Neurocomputing, 2022. 503: p. 325–362.
  55. 55. Xue Y., Xue B., and Zhang M., Self-adaptive particle swarm optimization for large-scale feature selection in classification. ACM Transactions on Knowledge Discovery from Data (TKDD), 2019. 13(5): p. 1–27.
  56. 56. Xue Y., Cai X., and Neri F., A multi-objective evolutionary algorithm with interval based initialization and self-adaptive crossover operator for large-scale feature selection in classification. Applied Soft Computing, 2022. 127: p. 109420.
  57. 57. Wang X., et al., Crisscross Harris Hawks Optimizer for Global Tasks and Feature Selection. Journal of Bionic Engineering, 2022: p. 1–22. pmid:36466727
  58. 58. Shan W., et al., Multi-strategies boosted mutative crow search algorithm for global tasks: Cases of continuous and discrete optimization. Journal of Bionic Engineering, 2022. 19(6): p. 1830–1849.
  59. 59. Sun G., Yang G., and Zhang G., Two-level parameter cooperation-based population regeneration framework for differential evolution. Swarm and Evolutionary Computation, 2022. 75: p. 101122.
  60. 60. Li C., et al., A population state evaluation-based improvement framework for differential evolution. Information Sciences, 2023. 629: p. 15–38.
  61. 61. Sun G., Li C., and Deng L., An adaptive regeneration framework based on search space adjustment for differential evolution. Neural Computing and Applications, 2021. 33: p. 9503–9519.
  62. 62. Wen X., et al., A two-stage solution method based on NSGA-II for Green Multi-Objective integrated process planning and scheduling in a battery packaging machinery workshop. Swarm and Evolutionary Computation, 2021. 61: p. 100820.
  63. 63. Wang G., et al., Research on Vessel Speed Heading and Collision Detection Method Based on AIS Data. Mobile Information Systems, 2022.
  64. 64. Zhao C., Zhou Y., and Lai X., An integrated framework with evolutionary algorithm for multi-scenario multi-objective optimization problems. Information Sciences, 2022. 600: p. 342–361.
  65. 65. Huang C., et al., Co-evolutionary competitive swarm optimizer with three-phase for large-scale complex optimization problem. Information Sciences, 2023. 619: p. 2–18.
  66. 66. Deng W., et al., An Enhanced MSIQDE Algorithm With Novel Multiple Strategies for Global Optimization Problems. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2022. 52(3): p. 1578–1587.
  67. 67. Sun G., et al., Hierarchical structure-based joint operations algorithm for global optimization. Swarm and Evolutionary Computation, 2023: p. 101311.
  68. 68. Hu H., et al., Dynamic Individual Selection and Crossover Boosted Forensic-based Investigation Algorithm for Global Optimization and Feature Selection. Journal of Bionic Engineering, 2023.
  69. 69. Sharma S., et al., mLBOA: A Modified Butterfly Optimization Algorithm with Lagrange Interpolation for Global Optimization. Journal of Bionic Engineering, 2022. 19(4): p. 1161–1176.
  70. 70. Sahoo S.K. and Saha A.K., A Hybrid Moth Flame Optimization Algorithm for Global Optimization. Journal of Bionic Engineering, 2022. 19(5): p. 1522–1543.
  71. 71. Xue Y., Tong Y., and Neri F., An ensemble of differential evolution and Adam for training feed-forward neural networks. Information Sciences, 2022. 608: p. 453–471.
  72. 72. Zhang X., et al., A Robust Tracking System for Low Frame Rate Video. International Journal of Computer Vision, 2015. 115(3): p. 279–304.
  73. 73. Zouache D. and Ben Abdelaziz F., A cooperative swarm intelligence algorithm based on quantum-inspired and rough sets for feature selection. COMPUTERS & INDUSTRIAL ENGINEERING, 2018. 115: p. 26–36.
  74. 74. Naseri T.S. and Gharehchopogh F.S., A Feature Selection Based on the Farmland Fertility Algorithm for Improved Intrusion Detection Systems. JOURNAL OF NETWORK AND SYSTEMS MANAGEMENT, 2022. 30(3).
  75. 75. Samadi Bonab M., et al., A wrapper-based feature selection for improving performance of intrusion detection systems. INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, 2020. 33(12).
  76. 76. Mohammadzadeh H. and Gharehchopogh F.S., A multi-agent system based for solving high-dimensional optimization problems: A case study on email spam detection. INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, 2021. 34(3).
  77. 77. Zaman H.R.R. and Gharehchopogh F.S., An improved particle swarm optimization with backtracking search optimization algorithm for solving continuous optimization problems. ENGINEERING WITH COMPUTERS, 2022. 38(SUPPL 4): p. 2797–2831.
  78. 78. Gharehchopogh F.S., et al., Advances in Sparrow Search Algorithm: A Comprehensive Survey. ARCHIVES OF COMPUTATIONAL METHODS IN ENGINEERING, 2023. 30(1): p. 427–455. pmid:36034191
  79. 79. Gharehchopogh F.S., Quantum-inspired metaheuristic algorithms: comprehensive survey and classification. ARTIFICIAL INTELLIGENCE REVIEW.
  80. 80. Gharehchopogh F.S., et al., Slime Mould Algorithm: A Comprehensive Survey of Its Variants and Applications. ARCHIVES OF COMPUTATIONAL METHODS IN ENGINEERING. pmid:36685136
  81. 81. Gharehchopogh F.S., Abdollahzadeh B., and Arasteh B., An Improved Farmland Fertility Algorithm with Hyper-Heuristic Approach for Solving Travelling Salesman Problem. CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2023. 135(3): p. 1981–2006.
  82. 82. Shishavan S.T. and Gharehchopogh F.S., An improved cuckoo search optimization algorithm with genetic algorithm for community detection in complex networks. MULTIMEDIA TOOLS AND APPLICATIONS, 2022. 81(18): p. 25205–25231.
  83. 83. Gharehchopogh F.S., An Improved Harris Hawks Optimization Algorithm with Multi-strategy for Community Detection in Social Network. JOURNAL OF BIONIC ENGINEERING, 2023. 20(3): p. 1175–1197.
  84. 84. Belge E., Altan A., and Hacioglu R., Metaheuristic Optimization-Based Path Planning and Tracking of Quadcopter for Payload Hold-Release Mission. ELECTRONICS, 2022. 11(8).
  85. 85. Altan, A. Performance of metaheuristic optimization algorithms based on swarm intelligence in attitude and altitude control of unmanned aerial vehicle for path following. in 2020 4th International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT). 2020. IEEE.
  86. 86. Yag I. and Altan A., Artificial Intelligence-Based Robust Hybrid Algorithm Design and Implementation for Real-Time Detection of Plant Diseases in Agricultural Environments. BIOLOGY-BASEL, 2022. 11(12). pmid:36552243
  87. 87. Chen H., et al., Parameters identification of photovoltaic cells and modules using diversification-enriched Harris hawks optimization with chaotic drifts. Journal of Cleaner Production, 2020. 244: p. 118778.
  88. 88. Moayedi H., et al., A novel Harris hawks’ optimization and k-fold cross-validation predicting slope stability. Engineering with Computers, 2021. 37: p. 369–379.
  89. 89. Malar C.J.A., Priya D.M., and Janakiraman S., Harris hawk optimization algorithm-based effective localization of non-line-of-sight nodes for reliable data dissemination in vehicular ad hoc networks. International Journal of Communication Systems, 2021. 34(1).
  90. 90. Wunnava A., et al., An adaptive Harris hawks optimization technique for two dimensional grey gradient based multilevel image thresholding. Applied Soft Computing, 2020. 95: p. 106526.
  91. 91. Sammen S.S., et al., Enhanced artificial neural network with Harris hawks optimization for predicting scour depth downstream of ski-jump spillway. Applied Sciences, 2020. 10(15): p. 5160.
  92. 92. Domeniconi C., Peng J., and Gunopulos D., Locally adaptive metric nearest-neighbor classification. IEEE transactions on pattern analysis and machine intelligence, 2002. 24(9): p. 1281–1285.
  93. 93. Lam W. and Ho C.Y. Using a generalized instance set for automatic text categorization. in Proceedings of the 21st annual international ACM SIGIR conference on Research and development in information retrieval. 1998.
  94. 94. Wang X., et al., Analyzing the Therapeutic Mechanism of Mongolian Medicine Zhonglun-5 in Rheumatoid Arthritis Using a Bagging Algorithm with Serum Metabonomics. EVIDENCE-BASED COMPLEMENTARY AND ALTERNATIVE MEDICINE, 2022. 2022. pmid:36532854
  95. 95. McNames J., A fast nearest-neighbor algorithm based on a principal axis search tree. IEEE Transactions on pattern analysis and machine intelligence, 2001. 23(9): p. 964–976.
  96. 96. Zhang H. and Sun G., Optimal reference subset selection for nearest neighbor classification by tabu search. Pattern Recognition, 2002. 35(7): p. 1481–1490.
  97. 97. Ghosh A.K., Chaudhuri P., and Murthy C., On visualization and aggregation of nearest neighbor classifiers. IEEE transactions on pattern analysis and machine intelligence, 2005. 27(10): p. 1592–1602. pmid:16237994
  98. 98. Li W., Miao D., and Wang W., Two-level hierarchical combination method for text classification. Expert Systems with Applications, 2011. 38(3): p. 2030–2039.
  99. 99. Shi L., et al., Rough set and ensemble learning based semi-supervised algorithm for text classification. Expert Systems with Applications, 2011. 38(5): p. 6300–6306.
  100. 100. Metropolis N., et al., Equation of state calculations by fast computing machines. The journal of chemical physics, 1953. 21(6): p. 1087–1092.
  101. 101. Chechkin A.V., et al., Introduction to the theory of Lévy flights. Anomalous transport: Foundations and applications, 2008: p. 129–162.
  102. 102. Zhao F., et al., A hybrid differential evolution and estimation of distribution algorithm based on neighbourhood search for job shop scheduling problems. INTERNATIONAL JOURNAL OF PRODUCTION RESEARCH, 2016. 54(4): p. 1039–1060.
  103. 103. Bharti K.K. and Singh P.K., Opposition chaotic fitness mutation based adaptive inertia weight BPSO for feature selection in text clustering. Applied Soft Computing, 2016. 43: p. 20–34.
  104. 104. Wang H., et al., Gaussian Bare-Bones Differential Evolution. Ieee Transactions on Cybernetics, 2013. 43(2): p. 634–647. pmid:23014758
  105. 105. Kennedy, J. Bare bones particle swarms. in Proceedings of the 2003 IEEE Swarm Intelligence Symposium. SIS’03 (Cat. No.03EX706). 2003.
  106. 106. Omran M.G.H., Engelbrecht A.P., and Salman A., Bare bones differential evolution. European Journal of Operational Research, 2009. 196(1): p. 128–139.
  107. 107. García S., et al., Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: Experimental analysis of power. Information Sciences, 2010. 180(10): p. 2044–2064.
  108. 108. Derrac J., et al., A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation, 2011. 1(1): p. 3–18.
  109. 109. Shayanfar H. and Gharehchopogh F.S., Farmland fertility: A new metaheuristic algorithm for solving continuous optimization problems. Applied Soft Computing, 2018. 71: p. 728–746.
  110. 110. Heidari A.A., et al., An enhanced associative learning-based exploratory whale optimizer for global optimization. Neural Computing and Applications, 2020. 32(9): p. 5185–5211.
  111. 111. Adarsh B.R., et al., Economic dispatch using chaotic bat algorithm. Energy, 2016. 96: p. 666–675.
  112. 112. Dempster A.P., Laird N.M., and Rubin D.B., Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society: Series B (Methodological), 1977. 39(1): p. 1–22.
  113. 113. Abd Elaziz M., Oliva D., and Xiong S., An improved Opposition-Based Sine Cosine Algorithm for global optimization. Expert Systems with Applications, 2017. 90: p. 484–500.
  114. 114. Cai Z., et al., Evolving an optimal kernel extreme learning machine by using an enhanced grey wolf optimization strategy. Expert Systems with Applications, 2019. 138: p. 112814.
  115. 115. Mirjalili S., Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowledge-Based Systems, 2015. 89: p. 228–249.
  116. 116. Chen W.-N., et al., Particle swarm optimization with an aging leader and challengers. IEEE transactions on evolutionary computation, 2012. 17(2): p. 241–258.
  117. 117. Sun T.Y., et al., Cluster Guide Particle Swarm Optimization (CGPSO) for Underdetermined Blind Source Separation With Advanced Conditions. IEEE Transactions on Evolutionary Computation, 2011. 15(6): p. 798–811.
  118. 118. Nenavath H. and Jatoth R.K., Hybridizing sine cosine algorithm with differential evolution for global optimization and object tracking. Applied Soft Computing, 2018. 62: p. 1019–1043.
  119. 119. Tawhid M.A. and Ali A.F., A Hybrid grey wolf optimizer and genetic algorithm for minimizing potential energy function. Memetic Computing, 2017. 9(4): p. 347–359.
  120. 120. Zhao C., et al., JAMSNet: A Remote Pulse Extraction Network Based on Joint Attention and Multi-Scale Fusion. IEEE Transactions on Circuits and Systems for Video Technology, 2022: p. 1–1.
  121. 121. Hržić F., et al., XAOM: A method for automatic alignment and orientation of radiographs for computer-aided medical diagnosis. Computers in Biology and Medicine, 2021. 132: p. 104300. pmid:33714842
  122. 122. Goel K., et al., The effect of machine learning explanations on user trust for automated diagnosis of COVID-19. Computers in Biology and Medicine, 2022. 146: p. 105587. pmid:35551007
  123. 123. Chen Y., et al., Accurate iris segmentation and recognition using an end-to-end unified framework based on MADNet and DSANet. Neurocomputing, 2023. 517: p. 264–278.
  124. 124. Li Y., et al., Dual Encoder-Based Dynamic-Channel Graph Convolutional Network With Edge Enhancement for Retinal Vessel Segmentation. IEEE Transactions on Medical Imaging, 2022. 41(8): p. 1975–1989. pmid:35167444
  125. 125. Chen J., et al., Renal Pathology Images Segmentation Based on Improved Cuckoo Search with Diffusion Mechanism and Adaptive Beta-Hill Climbing. Journal of Bionic Engineering, 2023. pmid:37361683
  126. 126. Zhang X., et al., Exemplar-Based Denoising: A Unified Low-Rank Recovery Framework. IEEE Transactions on Circuits and Systems for Video Technology, 2020. 30(8): p. 2538–2549.
  127. 127. Nabavi S., et al., Medical imaging and computational image analysis in COVID-19 diagnosis: A review. Computers in Biology and Medicine, 2021. 135: p. 104605. pmid:34175533
  128. 128. Wang S., et al., Class-aware sample reweighting optimal transport for multi-source domain adaptation. Neurocomputing, 2023. 523: p. 213–223.
  129. 129. Faruqui N., et al., LungNet: A hybrid deep-CNN model for lung cancer diagnosis using CT and wearable sensor-based medical IoT data. Computers in Biology and Medicine, 2021. 139: p. 104961. pmid:34741906
  130. 130. Kourou K., et al., A machine learning-based pipeline for modeling medical, socio-demographic, lifestyle and self-reported psychological traits as predictors of mental health outcomes after breast cancer diagnosis: An initial effort to define resilience effects. Computers in Biology and Medicine, 2021. 131: p. 104266. pmid:33607379
  131. 131. Painuli D., Bhardwaj S., and köse U., Recent advancement in cancer diagnosis using machine learning and deep learning techniques: A comprehensive review. Computers in Biology and Medicine, 2022. 146: p. 105580. pmid:35551012
  132. 132. Dai Y., et al., MSEva: A musculoskeletal rehabilitation evaluation system based on EMG signals. ACM Transactions on Sensor Networks, 2022. 19(1): p. 1–23.
  133. 133. Zhou J., Zhang X., and Jiang Z., Recognition of Imbalanced Epileptic EEG Signals by a Graph-Based Extreme Learning Machine. Wireless Communications and Mobile Computing, 2021. 2021: p. 5871684.
  134. 134. Zhang X., et al., Structured Sparsity Optimization With Non-Convex Surrogates of $\ell _{2,0}$ℓ2,0-Norm: A Unified Algorithmic Framework. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023. 45(5): p. 6386–6402.
  135. 135. Hu J., et al., Identification of Pulmonary Hypertension Animal Models Using a New Evolutionary Machine Learning Framework Based on Blood Routine Indicators. Journal of Bionic Engineering, 2022. pmid:36466726