
The uncertainty related to the inexactitude of prioritization based on consistent pairwise comparisons

Abstract

When the in/consistency of Pairwise Comparisons (PCs) is considered as a subarea of the Multi Attribute Decision Making (MADM) scientific field, it has repercussions across many research areas and modelling scenarios, e.g. reduction of inconsistency during PCs, derivation of consistency thresholds for inconsistent Pairwise Comparison Matrices (PCMs), completion of incomplete PCMs, aggregation of individual PCMs in relation to Group Decision Making (GDM) aspects, and the relation of PCM in/consistency to the credibility of Priority Vectors (PVs) derived from PCMs with the application of various Priorities Deriving Methods (PDMs). The examination objective in the latter area of research is the uncertainty related to the inexactitude of prioritization based on derived PVs. However, only a few research studies examine this problem from the perspective of PCM applicability for credible designation of a decision maker's (DM) priorities in a way that minimizes the prioritization uncertainty related to possible, and sometimes very probable, ranking fluctuations. This problem constitutes the primary area of interest for this research paper, as no research study has thus far been identified that examines it from the perspective of consistent PCMs. Hence a research gap was identified, and the objective of this research paper is to fill it. The research findings have serious repercussions for prioritization quality with the application of the PCs methodology, mostly in relation to the interpretation and reliability evaluation of prioritization results. Firstly, the research outcome changes the perspective on the rank reversal phenomenon, which sheds new light on many research studies presented in the subject's literature over many decades. Secondly, the results throw new light on the discussion concerning the fuzziness of the AHP's results. Last but not least, the research opens a unique opportunity to evaluate the prioritization outcome obtained within the process of consistent PCs from the well-known perspective of statistical hypothesis testing, i.e. designating the probability that accepted ranking results, considered correct due to a low probability of change, may be incorrect and hence should be rejected, and the probability that rejected ranking results, considered incorrect due to a high probability of change, may be correct and should be accepted. The paramount finding of the research is that consistent PCMs provide PVs whose elements cannot be considered established, but only approximated within certain confidence intervals estimated with a certain level of probability. As problems related to heuristics cannot be determined mathematically and can be analyzed only via a computer simulation process, the problem examined in this research paper is investigated via Monte Carlo simulations, appropriately coded and executed with the application of Wolfram's Mathematica software. It is believed that these research findings should be very important and useful for all decision makers and researchers whose examined problems relate to prioritization processes with the application of the PCs methodology.

Introduction

As acknowledged by many research studies, including Miller's seminal input [1], humans are not capable of dealing accurately with more than about seven (±2) things at a time (the human brain is limited in its short-term memory capacity, its discrimination ability, and its range of perception); therefore, numerous techniques/methods strive to make the decision-making process easier and sometimes even possible. Fundamentally, two schools of decision-making methodology exist at the present moment: the Multiple Criteria Decision Making (MCDM) school, developed by Americans, and the Multiple Criteria Decision Aiding/Analysis (MCDA) school, developed by Europeans [2–4]. For most researchers these terms have similar meaning, but for the record, the American school is applied in this research. Terms like Multi-Objective Decision Making (MODM), Multi-Attribute Decision Making (MADM), as well as Multi-Dimensions Decision-Making (MDDM) can also be found in the literature. Generally, when optimization techniques are utilized (continuous problems are examined), one deals with MODM, and when alternative selection takes place (discrete problems are considered), one deals with MADM and/or MDDM. Together, these subdisciplines make up the facet of MCDM.

Pairwise-Comparisons-Based (PCB) prioritization is a method with a long history dating back to the Middle Ages. Presumably, the first elaboration on this subject was created by Ramon Lull [5], who in his work debated comparisons of alternatives in an election process. Over time, other studies on the PCs method appeared, such as the Condorcet and Copeland methods, see e.g. [6–8]. Indisputably, due to Saaty's manuscript [9], where he defined the Analytic Hierarchy Process (AHP), comparisons of alternatives in pairs, contemporarily called pairwise comparisons, started to be considered an element of MADM.

Undeniably, the AHP is a popular MADM method that proposes its own priorities deriving method (PDM), i.e. the Principal Right Eigenvector (PREV) method; a consistency index (CI) related to PREV, which is supposed to indicate the quality of the processed data; and a hierarchical model, which is supposed to enable easier structuring of multiple-criteria problems [10–13]. Over time, scientific evidence also appeared that indicated a few flaws of the AHP, see e.g. [14–17]. Due to this criticism of the AHP methodology, many scientists have worked, and keep working, on methods which can remedy its drawbacks. Hence, many PDMs have been proposed, see e.g. [18–50], which, for the article's brevity, will not be discussed herein in detail.

Taking into account the AHP drawbacks, many indicators of PCM consistency, commonly known as consistency indices (CIs), have also been proposed thus far, see e.g. [23,24,41,51–68]. They too, for the brevity of this article, will not be scrutinized herein. However, the interested reader may become thoroughly acquainted with their ample diversity, see e.g. [69–77].

When the in/consistency in Pairwise Comparisons (PCs) is considered as a subarea of the MADM scientific field, it may be perceived as the most extensively explored topic in this research area. A variety of models have been proposed to address inconsistency issues, see e.g. [78–85]. Certainly, issues related to PCs in/consistency have repercussions in various modelling scenarios, i.e. the inconsistency reduction of reciprocal Pairwise Comparison Matrices (PCMs) with high levels of inconsistency, see e.g. [72,86–91]; deriving appropriate consistency thresholds for non/reciprocal PCMs, see e.g. [10,74,92–96]; completing incomplete reciprocal PCMs, see e.g. [97–102]; and aggregating individual reciprocal PCMs in relation to Group Decision Making (GDM) aspects, see e.g. [75,103–110]. The issue of in/consistency in PCs is also especially attractive when examined from the perspective of its relation to the trustworthiness of Priority Ratios (PRs) derived from a Pairwise Comparison Matrix (PCM), denoted as PCM(w) = [wij]n×n with elements wij = wi/wj, where wij>0 and i, j = 1,…, n, with the application of various Priorities Deriving Methods (PDMs). It is believed, for example, although some evidence from a few research papers contradicts this belief, see e.g. [76,77,111,112], that inconsistent PCMs provide less credible PRs and that inconsistency reduction is a process that leads to the betterment of priorities estimation. It is also believed that both consistent and inconsistent PCMs provide Priority Vectors (PVs), denoted as w = [w1,…, wn]T, where wi>0, i = 1,…, n, whose elements are considered established, not approximated within certain confidence intervals estimated with a certain level of probability. The relation of PCM in/consistency to the credibility of PVs derived from PCMs with the application of various PDMs constitutes the key issue in a few research studies, e.g. [51,76,77,111–117].
The examination objective in the latter area of research is the uncertainty related to the inexactitude of prioritization based on derived PVs. However, only a few research studies examine this problem from the perspective of PCM applicability for credible designation of a decision maker's (DM) priorities in a way that minimizes the prioritization uncertainty related to possible, and sometimes very probable, ranking fluctuations. This problem constitutes the primary area of interest for this research paper, as no research study has thus far been identified that examines it from the perspective of consistent PCMs; so far, this concept has been studied only from the perspective of inconsistent PCMs, see e.g. [51,76,77,111,112,114,117]. Hence a research gap was identified, and the objective of this research paper is to fill it. The research findings have serious repercussions for prioritization quality with the application of the PCs methodology, mostly in relation to the interpretation and reliability evaluation of prioritization results. Firstly, the research outcome changes the perspective on the rank reversal phenomenon, which sheds new light on many research studies presented in the subject's literature over many decades. Secondly, the results throw new light on the discussion concerning the fuzziness of the AHP's results. Last but not least, the research opens a unique opportunity to evaluate the prioritization outcome obtained within the process of consistent PCs from the well-known perspective of statistical hypothesis testing, i.e. designating the probability that accepted ranking results, considered correct due to a low probability of change, may be incorrect and hence should be rejected, and the probability that rejected ranking results, considered incorrect due to a high probability of change, may be correct and should be accepted. The paramount finding of the research is that consistent PCMs provide PVs whose elements cannot be considered established, but only approximated within certain confidence intervals estimated with a certain level of probability. As problems related to heuristics cannot be determined mathematically and can be analyzed only via a computer simulation process, the problem examined in this research paper is investigated via Monte Carlo simulations, appropriately coded and executed with the application of Wolfram's Mathematica software. It is believed that these research findings should be very important and useful for all decision makers and researchers whose examined problems relate to prioritization processes with the application of the PCs methodology.

Due to its main area of interest, this research paper is structured as follows. Firstly, in the Section Systematic review of literature, the MCDM techniques/methods are reviewed, and then some Preliminary remarks are presented. Secondly, the Section Research methodology presents the research method; this Section consists of two Subsections, i.e. Outline of the examination concept, where the example case study is analyzed, and Description of the simulation concept, where the Monte Carlo simulation algorithm applied for the research is presented. Thirdly, the Section Results and discussion is also divided into two Subsections, i.e. Analysis of results, where general simulation results are elaborated, and Discussion of results, where detailed simulation results are examined and discussed from the perspective of the Rank Reversal Phenomenon. Fourthly, final remarks and future research directions end the research paper in the Section Final remarks.

Systematic review of literature

There are many MCDM techniques/methods available in the relevant literature. As they have their own unique characteristics, there are many ways to classify them, e.g. according to the type of data they utilize, according to the number of Decision Makers (DMs) involved in the decision process, or according to the type of information and the pertinent features of that information.

Thus far, the following techniques/methods have been devised for MADM/MDDM problems: the Weighted Sum Model (WSM), see e.g. [118,119], useful for evaluating several alternatives in relation to various criteria expressed in the same unit; the Weighted Product Model (WPM), see e.g. [120], often called dimensionless analysis because its mathematical structure eliminates any units of measure (the WPM can be applied to both single- and multi-dimensional MCDM problems); the Analytic Hierarchy Process (AHP), see e.g. [17,37,42,69,121–129], which will be examined herein in more detail from the perspective of this research paper's objective; the Analytic Network Process (ANP), see e.g. [130–135], which expands the AHP concept for situations with dependence and feedback among alternatives and criteria; the fuzzy AHP (F-AHP), which implements the concepts of Zadeh's [136] fuzzy set theory, see e.g. [11,137–142]; Data Envelopment Analysis (DEA), see e.g. [143–145], which is used to estimate an efficiency frontier by considering the best performance observations (extreme points) which 'envelop' the remaining observations; Goal Programming (GP), see e.g. [20,146–150], which is used for solving multi-objective optimization problems that balance a trade-off among conflicting objectives; Grey Analysis (GA), see e.g. [119,151–153], which applies a sophisticated mathematical analysis of systems which are partly defined and partly unknown, thus recognized as 'insufficient data' and 'weak knowledge'; ELECTRE (in French: ELimination Et Choix Traduisant la REalité), see e.g. [3,154–156], implemented to select the best alternative with maximum advantage and least conflict in relation to various criteria; VIKOR (from the Serbian: Vise Kriterijumska Optimizacija I Kompromisno Resenje), see e.g. [157,158], applied for examination of alternative preferences in highly complex environments; the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), see e.g. [158–164], which finds the best solutions of MADM problems by looking for the shortest Euclidean distance from the positive-ideal solution and the longest Euclidean distance from the negative-ideal one; the Decision Making Trial and Evaluation Laboratory (DEMATEL), see e.g. [165–172], which deals with the examination of interdependent relationships among analyzed factors and the identification of the critical ones through a visual structural model; and Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH), see e.g. [173–175], which is the MADM method that evaluates options against multiple criteria.

When the AHP is examined as an MADM method, the contradiction between its ease of use and the controversial methodology it applies has been clearly perceived for a long time. Perhaps this is due to a few areas that are very intriguing when this methodology is considered, i.e. the inconsistency associated with pairwise comparisons made by humans during its methodological process, see e.g. [70–72,78,79,114,176–179]; human judgment errors connected with the expression of human preferences, see e.g. [73,103–105,179–184]; limitations related to the selected preference scale which must be applied during the pairwise comparisons process, see e.g. [27,117,185–189]; the reciprocity condition imposed on a Pairwise Comparison Matrix (PCM), denoted as PCM(w) = [wij]n×n with elements wij = wi/wj, where wij>0 and i, j = 1,…, n, which constitutes the information source for the prioritization of DM preferences, see e.g. [39,42,46,73,78,184,190]; and, last but not least, the Condition of Order Preservation (COP), which constitutes a complementary approach to the examination of PCM in/consistency, see e.g. [15,32,191–195].

As noted in the Introduction, the in/consistency in Pairwise Comparisons (PCs), as a subarea of the MADM scientific field, has repercussions across many modelling scenarios: reduction of inconsistency during PCs, derivation of consistency thresholds for inconsistent PCMs, completion of incomplete PCMs, aggregation of individual PCMs in relation to Group Decision Making (GDM) aspects, and the relation of PCM in/consistency to the credibility of Priority Vectors (PVs) derived from PCMs with the application of various Priorities Deriving Methods (PDMs). The examination objective in the latter area of research is the uncertainty related to the inexactitude of prioritization based on derived PVs.

It should be realized here that there are three significantly different notions:

  • the PCM consistency perceived from the perspective of its definition, see hereafter D[3], and expressed by the specific inconsistency index value;
  • the consistency of decision makers, i.e. their trustworthiness, reflected by the number and size of the discrepancies in their judgments; and
  • the PCM applicability for estimation of decision makers’ priorities in the way that leads to minimization of their estimation errors.

The third issue seems to be the most important problem in the contemporary arena of MADM theory concerning the AHP, and the only way to examine this phenomenon is through computer simulations. Monte Carlo simulations are commonly recognized and applied as an important and credible source of scientific information [196,197]. Their applications extend to the examination of various phenomena, e.g. the consequences of decisions made, or different processes subjected to the random impact of a particular environment [32,49,76,113,117,198–200].

Preliminary remarks

It is paramount to underline that all PDMs devised so far provide exactly the same results, i.e. the same priority ratios (PRs) within priority vectors (PVs), when consistent PCMs are processed as the source of data. In relation to the subject of this research paper, it is also paramount to emphasize at this point that for perfectly consistent PCMs, all indicators of PCM consistency, those devised thus far and those potentially not yet invented, must by definition equal zero. Hence, for the problem examined in this paper, it does not matter which priorities deriving method (PDM) and which consistency index (CI), out of all available in the literature, is utilized hereafter.

Taking into account possible distortions of priority ratios (PRs) derived from consistent PCMs, three factors, out of the few listed earlier, must be taken into consideration during these distortion examinations, i.e. human judgment errors connected with the expression of human preferences, rounding errors resulting from the selected preference scale which must be applied during the pairwise comparisons process, and the PCM reciprocity requirement commonly imposed on this source of computational data. All of them can be examined with the application of Monte Carlo simulations applied for the purpose of this research paper. However, before the introduction of particulars, a few notions must first be presented. Thus, the following definitions D[1]–D[3] are introduced:

D[1]: If the elements of a matrix W(w) satisfy the condition wij = 1/wji for all i, j = 1,…, n, then the matrix W(w) is called reciprocal.

D[2]: If the following conditions are true: (a) if for any i = 1,…, n, an element wij is not less than an element wik, then wij ≥ wik for all i = 1,…, n, and (b) if for any i = 1,…, n, an element wji is not less than an element wki, then wji ≥ wki for all i = 1,…, n, then the matrix W(w) is called ordinal transitive.

D[3]: If the elements of a matrix W(w) satisfy the condition wik·wkj = wij for all i, j, k = 1,…, n, and the matrix W(w) is reciprocal, then it is called consistent, or cardinal transitive.
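Definitions D[1] and D[3] translate directly into simple numerical checks. The sketch below is illustrative only (the function names and the floating-point tolerance are assumptions, not taken from the paper):

```python
def is_reciprocal(W, tol=1e-9):
    """D[1]: w_ij = 1 / w_ji for all i, j."""
    n = len(W)
    return all(abs(W[i][j] - 1.0 / W[j][i]) <= tol
               for i in range(n) for j in range(n))

def is_consistent(W, tol=1e-9):
    """D[3]: W is reciprocal and w_ik * w_kj = w_ij for all i, j, k."""
    n = len(W)
    return is_reciprocal(W, tol) and all(
        abs(W[i][k] * W[k][j] - W[i][j]) <= tol
        for i in range(n) for j in range(n) for k in range(n))
```

Any matrix built as wij = vi/vj from a positive vector v passes both checks, whereas a reciprocal matrix with a broken triad fails the second one.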

While the norm of the vector w can be written as: (1) ‖w‖ = eTw, where e = [1, 1,…, 1]T, the vector w can be normalized by dividing it by its norm. Thus, for uniqueness, w is referred to hereafter in its normalized form.

For every given priority vector v = [v1, v2, v3,…, vn]T, the Matrix of Priority Ratios (MPR) can be denoted as: (2) MPR(v) = [vij]n×n with vij = vi/vj, for all i, j = 1,…, n.

Reconsidering rounding errors resulting from a given preference scale which can be selected for the pairwise comparisons process, the following integer-based preference scales have been proposed thus far; their numbers are combined with linguistic variables expressing the preference intensity from 'indifferent' to 'extremely preferred' (Tables 1 and 2).

Table 1. The principal preference scale proposed by Saaty [129,201].

https://doi.org/10.1371/journal.pone.0290751.t001

Table 2. Various preference scales devised for pairwise comparisons of alternatives.

https://doi.org/10.1371/journal.pone.0290751.t002

Last but not least, reconsidering human judgment errors connected with the expression of human preferences, the following relation has been proposed to make their simulation possible: (3) wij = εij·vij, where εij acts as a perturbation factor oscillating close to unity, i.e. it is randomly drawn from an assigned interval, e.g. εij∈[0.8, 1.2], and wij and vij are the elements of the matrices PCM(w) = [wij]n×n and MPR(v) = [vij]n×n, respectively. In the statistical approach, εij reflects the realization of a random variable which is applied during a simulation process with a certain probability distribution (PD) reflecting imperfect human judgments during pairwise comparisons. In the literature, the following types of PDs are commonly taken for similar implementation purposes: truncated-normal, gamma, log-normal, and uniform [38,49]. However, in addition to the above most utilized types of PDs, one can also find applications of triangular, beta, Cauchy, and Laplace PDs [26], as well as Fisher–Snedecor PDs, which were recently applied for the first time by Kazibudzki [32]. The maximal possible spread for εij encountered during similar research studies has never been beyond the interval εij∈[0.01, 1.99].
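The perturbation relation wij = εij·vij can be sketched as follows. The uniform draw is just one of the PD choices listed above, and the function name, default interval, and seed are illustrative assumptions:

```python
import random

def perturb_mpr(v, low=0.8, high=1.2, seed=42):
    """Builds MPR(v) = [v_i/v_j] and multiplies every entry by an
    independent perturbation factor eps_ij ~ Uniform[low, high],
    yielding a (generally non-reciprocal) PCM(w) = [eps_ij * v_i/v_j]."""
    rng = random.Random(seed)
    n = len(v)
    return [[rng.uniform(low, high) * v[i] / v[j] for j in range(n)]
            for i in range(n)]
```

Swapping `rng.uniform` for draws from a truncated-normal, gamma, or log-normal distribution reproduces the other simulation variants mentioned in the text.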

Research methodology

Outline of the examination concept

Let the exemplary PV be considered as v = [0.35, 0.19, 0.16, 0.30]T, which reflects some real, not derived, priority ratios (PRs) towards four known objects whose relative characteristics are known, i.e. can be computed on the basis of their mass, volume, circumference, etc. Then, the MPR(v) = [vij]4×4 with vij = vi/vj can be constructed as follows: (4)
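The entries of the matrix in Eq (4) are simply the ratios vi/vj, so the matrix can be reproduced directly:

```python
v = [0.35, 0.19, 0.16, 0.30]

# MPR(v): entry (i, j) is v_i / v_j; the diagonal is all ones.
MPR = [[vi / vj for vj in v] for vi in v]

# For instance v1/v2 = 0.35/0.19 ~ 1.842 and v4/v3 = 0.30/0.16 = 1.875.
```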

Let the elements of the above MPR(v) be distorted by a perturbation factor of ε = 0.9. As this process reflects imperfect human judgments during pairwise comparisons, let the new matrix obtained from MPR(v) be denoted as PCM(w) = [wij]4×4 and presented as follows: (5)

Let the above PCM(w) be made reciprocal now, denoted as PCMR(w), and presented as follows: (6)

Last but not least, let the matrix PCMR(w) be scaled in relation to the linear preference scale proposed by Saaty (Table 1), then denoted as PCMSR(w), and presented as follows: (7)
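The two transformations above, reciprocalization and scaling, can be sketched as follows. The paper does not spell out which triangle of PCM(w) is retained or how entries are mapped onto the scale, so keeping the upper triangle and snapping each entry to the nearest value of Saaty's scale are illustrative assumptions:

```python
# Saaty's linear scale {1,...,9} together with its reciprocal values.
SAATY = [1.0 / k for k in range(9, 1, -1)] + [float(k) for k in range(1, 10)]

def make_reciprocal(W):
    """PCMR(w): keep w_ij for i <= j, replace the lower triangle by 1/w_ji."""
    n = len(W)
    return [[W[i][j] if i <= j else 1.0 / W[j][i] for j in range(n)]
            for i in range(n)]

def scale_to_saaty(W):
    """PCMSR(w): snap each upper-triangle entry to the closest scale value
    and mirror its reciprocal below the diagonal."""
    n = len(W)
    R = [[1.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            R[i][j] = min(SAATY, key=lambda s: abs(s - W[i][j]))
            R[j][i] = 1.0 / R[i][j]
    return R
```

Note that snapping to the nearest scale value does not guarantee consistency in general; in the worked example the resulting matrix happens to be perfectly consistent.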

As can be noticed, the matrix PCMSR(w) is perfectly consistent. Hence, any PDM applied herein will derive exactly the same PV, which can be denoted as w and presented as follows: w = [0.333(3), 0.166(6), 0.166(6), 0.333(3)]T.

In this setting, it becomes possible to compare the rankings provided by the initial real v and by w derived from PCMSR(w), which represents, in the provided example, the possible outcome of DM efforts during the process of MPR(v) approximation. It is also possible to assess the range of deviation between v and w. For this purpose, the use of the following two measures is proposed, i.e. the commonly applied Absolute Average Error (AAE) between v and w, as defined in Formula 8, and the new Maximal Absolute Deviation (MAD) from AAE, proposed herein, as defined in Formula 9.

(8) AAE(v, w) = (1/n)·Σi |vi − wi|, for i = 1,…, n
(9) MAD(v, w) = maxi |vi − wi| − AAE(v, w)

It is worth mentioning that the sum of both measures defined above, i.e. AAE and MAD, provides information about the Maximal Possible Error (MPE) that can occur between the examined PRs of the vectors v and w. For the above example, MPE = 0.033(3), with AAE = 0.02 and MAD = 0.013(3).
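Under the reading that AAE is the mean absolute difference between the two vectors and MAD is the gap between the largest absolute difference and AAE, a reading that reproduces the values quoted above, the worked example can be verified numerically:

```python
def aae(v, w):
    """Absolute Average Error: mean of |v_i - w_i|."""
    return sum(abs(a - b) for a, b in zip(v, w)) / len(v)

def mad(v, w):
    """Maximal Absolute Deviation from AAE: max |v_i - w_i| minus AAE."""
    return max(abs(a - b) for a, b in zip(v, w)) - aae(v, w)

v = [0.35, 0.19, 0.16, 0.30]      # real priorities
w = [1/3, 1/6, 1/6, 1/3]          # priorities derived from PCMSR(w)

# AAE = 0.02, MAD = 0.0133..., and MPE = AAE + MAD = 0.0333...,
# matching the figures reported in the text.
```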

Notably, in the presented example, the preference orders provided by the two analyzed PVs differ and can be defined respectively as v1 ≻ v4 ≻ v2 ≻ v3 for v, and w1 ∼ w4 ≻ w2 ∼ w3 for w. The latter result clearly indicates that the derived preferences are not as sharp as the true ones, and they are definitely more ambiguous in their interpretation. Thus, there is always a margin of error which clearly should be taken into consideration and which thus far, as indicated by all available resources on this research subject, IS NOT.

Description of the simulation concept

As noted in the Introduction, the relation of PCM in/consistency to the credibility of Priority Vectors (PVs) derived from PCMs with the application of various Priorities Deriving Methods (PDMs) constitutes the key issue in a few research studies, e.g. [51,76,77,111–117], in which the examination objective is the uncertainty related to the inexactitude of prioritization based on derived PVs. So far, however, this concept has been studied only from the perspective of inconsistent PCMs, see e.g. [51,76,77,111,112,114,117]; no research study has thus far been identified that examines it via complex simulations from the perspective of consistent PCMs, and this problem constitutes the primary area of interest for this research paper. As problems related to heuristics cannot be determined mathematically and can be analyzed only via a computer simulation process, the problem examined in this research paper is investigated via Monte Carlo simulations, appropriately coded and executed with the application of Wolfram's Mathematica software.


Hence, for the analysis of the problem revealed in the former subsections, the following Monte Carlo simulation algorithm is proposed (Chart 1) and depicted in the form of a flowchart (Fig 1). The primary version of the algorithm was devised and successfully applied for the first time by Grzybowski [77], and since then it has been adapted and successfully implemented for many similar problems, see e.g. [51,76,111,112,114–117,210]. This research study applies an adaptation of this algorithm for the examination of consistent PCMs. Three elements in this algorithm have to be defined, i.e. the kind of PDM, the CI, and the type of preference scale used during its execution. As was already stated in earlier subsections of this paper, any kind of PDM and CI suits the assumptions of this simulation algorithm. Thus, for simplicity of calculation, the Logarithmic Least Squares Method (LLSM), appreciated also for its closed form (Formula 10), known as the Geometric Mean Method (GM), was selected as the PDM [23,24]. It was also decided to use in the simulation algorithm a modified version of a recently introduced PCM consistency measure [51], i.e. the Index of Absolute Logarithm Deviations (IALD), defined herein by Formula 11.

Chart 1. Simulation algorithm applied for the research.

Steps relate to descriptions in Chart 1.

https://doi.org/10.1371/journal.pone.0290751.g001

Fig 1. Flowchart of simulation algorithm applied for the research.

https://doi.org/10.1371/journal.pone.0290751.g002

(10) wi = (Πj=1..n wij)^(1/n) / Σk=1..n (Πj=1..n wkj)^(1/n), for i = 1,…, n
(11)
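The GM closed form selected as the PDM is straightforward to compute: each priority is the geometric mean of its PCM row, normalized so that the priorities sum to one. A minimal sketch:

```python
import math

def gm_priority_vector(W):
    """Geometric Mean Method (closed-form LLSM):
    w_i proportional to (prod_j w_ij)^(1/n), normalized to sum to one."""
    n = len(W)
    g = [math.prod(row) ** (1.0 / n) for row in W]
    s = sum(g)
    return [x / s for x in g]
```

For a perfectly consistent PCM built from v = [0.35, 0.19, 0.16, 0.30], the method recovers v exactly (up to floating-point rounding), in line with the earlier remark that all PDMs coincide on consistent PCMs.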

As some micro deviations of the applied CI from zero must have appeared during the simulation research program, it was decided that, in order to confirm the reliability of the obtained results, the well-established Koczkodaj index of consistency, i.e. K(A), defined herein by Formula 12, would be applied concurrently as a point of reference.

(12) K(A) = max over triads i < k < j of min{|1 − aij/(aik·akj)|, |1 − (aik·akj)/aij|}
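Koczkodaj's index inspects every triad (i, k, j) and reports the worst relative departure from the consistency condition aik·akj = aij. A common formulation is sketched below as an illustrative implementation (not the paper's own code):

```python
def koczkodaj_index(A):
    """K(A): the worst triad inconsistency, where each triad i < k < j
    contributes min(|1 - a_ij/(a_ik*a_kj)|, |1 - (a_ik*a_kj)/a_ij|)."""
    n = len(A)
    worst = 0.0
    for i in range(n):
        for k in range(i + 1, n):
            for j in range(k + 1, n):
                t = A[i][k] * A[k][j]
                worst = max(worst,
                            min(abs(1.0 - A[i][j] / t),
                                abs(1.0 - t / A[i][j])))
    return worst
```

For a perfectly consistent PCM the index is zero, which is exactly the property exploited here as a point of reference for the micro deviations observed during the simulations.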

Last but not least, it was decided to apply three of the available preference scales, selected on the basis of their features related to pairwise comparison processes, i.e. the frequently examined Geometric Scale (GS); the linear scale with x = {1,2,…,9} proposed by Saaty for the AHP; and the recently praised Inverse Linear Scale (ILS), see e.g. [185].

Results and discussion

Analysis of results

Pursuing the objective of this research, the following general simulation results can be presented. They encompass two sets of possible values for n, i.e. n∈{3,4,5,6} and n∈{4,5,…,9}, and concern the two kinds of errors described earlier, AAE and MAD. The following two tables, Tables 3 & 4, contain results for tens of thousands of asymptotically consistent PCMs obtained during millions of iterations of variously distorted PCMs, as described in the simulation algorithm presented in Chart 1. In particular, the results presented in Table 3 were obtained from 3,600,000 iterations, i.e. 500 various PVs of 4 sizes, n∈{3,4,5,6}, each perturbed 50 times by a perturbation factor applied with 4 kinds of probability distributions, with 3 sizes of a potentially large error and 3 intervals for small errors (500×4×50×4×3×3 = 3,600,000). The results presented in Table 4 were obtained from 2,700,000 iterations, i.e. 250 various PVs of 6 sizes, n∈{4,5,…,9}, each perturbed 50 times by a perturbation factor applied with 4 kinds of probability distributions, with 3 sizes of a possible large error and 3 intervals for small errors (250×6×50×4×3×3 = 2,700,000).

Table 3. Mean values and p–Quantiles of AAE for selected scales.

https://doi.org/10.1371/journal.pone.0290751.t003

Table 4. Mean values and p–Quantiles of MAD for selected scales.

https://doi.org/10.1371/journal.pone.0290751.t004

The results presented in Tables 3 & 4 provide very significant information about possible distortions of PRs obtained via any PDM. They can be summarized in the following way: AAE & MAD differ for various PCM sizes, i.e. they are smaller for larger PCMs and, vice versa, larger for smaller PCMs; for smaller PCMs, the inverse linear scale gives smaller possible maximal errors than the other examined scales; for larger PCMs, Saaty's linear scale outperforms the other examined scales, providing smaller possible maximal errors. Due to possible PV distortions, the presented errors should be taken into account during a standard prioritization procedure, i.e. alternatives ranking, as they can indicate the probability of a possible ranking alteration. Considering the latter conclusion as crucial for effective MADM via PCs, it must be further developed in detail. Hence, the following examination results will be presented herein for the selected scales and designated sizes of PCMs, together with an example development of their importance.

Discussion of results

A very long discussion concerning the Rank Reversal Phenomenon (RRP) during PCs, particularly regarding applications of the AHP, is reported in the subject's literature, see e.g. [211,212]. It may be concluded that the most frequently reported causes of RRP include, yet are not limited to, the synthesis procedure, see e.g. [213–215]; the normalization procedure, see e.g. [216–218]; criteria weights, see e.g. [219,220]; misuse of the methods, see e.g. [202]; decisional process uncertainty, see e.g. [221]; and structural dependency among the criteria and alternatives, see e.g. [222,223]. However, it must be emphasized that, to the best knowledge acquired from the analysis of the subject's literature, no research has so far been published that considers this phenomenon from the perspective of the inexactitude of the prioritization process. Indeed, all choices made so far with the application of the AHP have been made on the basis of the AHP's final ranking, which has been taken as established, hence with disregard of possible estimation errors.

In order to grasp a new perspective on this issue, the well-known example of the RRP is reexamined [217,224]. In this example, presented for the first time by Belton & Gear [217], a hypothetical problem structured as a framework of three criteria and three alternatives is analyzed via the AHP. It is assumed that all considered criteria are of equal importance. It is also assumed that the DM’s judgments concerning the problem’s alternative solutions are perfectly consistent and based on the AHP’s standard Saaty’s linear preference scale. The tables comprising the considered PCMs and their related PVs are presented below (Figs 2–5).

Fig 2. The PCM of criteria with regard to the goal, and its related PV.

https://doi.org/10.1371/journal.pone.0290751.g003

Fig 3. The PCM of alternatives with regard to the first criterion, and its related PV.

https://doi.org/10.1371/journal.pone.0290751.g004

Fig 4. The PCM of alternatives with regard to the second criterion, and its related PV.

https://doi.org/10.1371/journal.pone.0290751.g005

Fig 5. The PCM of alternatives with regard to the third criterion, and its related PV.

https://doi.org/10.1371/journal.pone.0290751.g006

After application of the standard AHP’s synthesis procedure, the following aggregated PV is obtained: p = [0.4512, 0.4697, 0.0791]^T, which provides the following DM’s preference order: p2 ≻ p1 ≻ p3.
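The synthesis step can be reproduced with a short sketch. The local priority vectors below are not shown in the text (they sit in Figs 2–5), so they are an assumption reconstructed from the classic Belton & Gear data; they do, however, reproduce the aggregated PV reported above.

```python
# Sketch of the standard AHP (distributive) synthesis for the
# three-alternative example. The local PVs are a reconstruction of the
# Belton & Gear figures (an assumption), chosen so that the weighted
# sum matches the aggregated PV p reported in the text.

criteria_weights = [1/3, 1/3, 1/3]      # three equally important criteria

# local priority vectors of alternatives A1..A3 under each criterion
local_pvs = [
    [1/11, 9/11, 1/11],                 # criterion 1
    [9/11, 1/11, 1/11],                 # criterion 2
    [8/18, 9/18, 1/18],                 # criterion 3
]

# distributive synthesis: weighted sum of local priorities
p = [sum(w * pv[i] for w, pv in zip(criteria_weights, local_pvs))
     for i in range(3)]

print([round(x, 4) for x in p])         # [0.4512, 0.4697, 0.0791]
```

Since every PCM here is perfectly consistent, any standard PDM (eigenvector, geometric mean, column normalization) yields the same local PVs.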

Next, it is assumed that a new alternative is added to the problem, while all previous assumptions concerning the problem structure and the DM’s judgments remain the same. The accordingly modified tables comprising the extended PCMs and their related PVs are presented below (Figs 6–8).

Fig 6. The extended PCM of alternatives with regard to the first criterion, and its related PV.

https://doi.org/10.1371/journal.pone.0290751.g007

Fig 7. The extended PCM of alternatives with regard to the second criterion, and its related PV.

https://doi.org/10.1371/journal.pone.0290751.g008

Fig 8. The extended PCM of alternatives with regard to the third criterion, and its related PV.

https://doi.org/10.1371/journal.pone.0290751.g009

This time, again after application of the standard AHP’s synthesis procedure, the new aggregated PV is obtained: d = [0.3654, 0.2889, 0.0568, 0.2889]^T, which provides the new DM’s preference order d1 ≻ d2 = d4 ≻ d3, different from the previous one, i.e. p2 ≻ p1 ≻ p3. Despite there being no change in the relative preferences of A1 versus A2, the final preference order between these two elements has been reversed. However, the latter conclusion is based entirely on the intrinsic and purely heuristic assumption about the established character of the received results. In order to indicate its erroneous nature, the following tables (Tables 5 and 6) present the results of Monte Carlo simulations for a particular number of alternatives and Saaty’s linear preference scale, obtained with the application of the simulation algorithm presented earlier in this research paper (Chart 1). The results presented in Tables 5 and 6 are based on over 10,000 asymptotically consistent PCMs which were obtained out of 900,000 iterations, i.e. 500 various PVs, each perturbed 50 times by a perturbation factor applied with 4 kinds of probability distributions, 3 sizes of a possible big error, and 3 intervals for small errors (500×50×4×3×3 = 900,000).
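The extended synthesis can be sketched in the same way. As before, the local priority vectors are an assumption reconstructed from the Belton & Gear setup (the new alternative A4 is a copy of A2), chosen so that they reproduce the aggregated PV d reported above.

```python
# The same distributive synthesis after adding A4, a copy of A2.
# The local PVs are reconstructed under that assumption; they match
# the aggregated PV d reported in the text.

criteria_weights = [1/3, 1/3, 1/3]

# local priority vectors of alternatives A1..A4 under each criterion
local_pvs = [
    [1/20, 9/20, 1/20, 9/20],           # criterion 1
    [9/12, 1/12, 1/12, 1/12],           # criterion 2
    [8/27, 9/27, 1/27, 9/27],           # criterion 3
]

d = [sum(w * pv[i] for w, pv in zip(criteria_weights, local_pvs))
     for i in range(4)]

print([round(x, 4) for x in d])         # [0.3654, 0.2889, 0.0568, 0.2889]
```

Although the pairwise judgment of A1 versus A2 is unchanged under every criterion, the renormalization of the local PVs over four alternatives lifts A1 above A2 in the aggregate, which is the apparent rank reversal.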

Table 5. Mean values and p–Quantiles of AAE for Saaty’s linear scale.

https://doi.org/10.1371/journal.pone.0290751.t005

Table 6. Mean values and p–Quantiles of MAD for Saaty’s linear scale.

https://doi.org/10.1371/journal.pone.0290751.t006

Let the earlier presented PV, i.e. p = [0.4512, 0.4697, 0.0791]^T, now be reconsidered. As has already been indicated, assuming its values as fixed, it provides the DM’s preference order p2 ≻ p1 ≻ p3. However, taking into consideration the fact that the considered PV is only a possible estimate of some true PV which remains unknown, the ranking of its PRs should be preceded by an analysis of the possible errors indicated in Tables 5 and 6, as its true PRs’ values may fluctuate within the considered errors. Hence, for this particular PV, for example, the median of its MPE equals MPE_ME = 0.01384 + 0.01087 = 0.02471, the mean of its MPE equals MPE_MN = 0.05376 + 0.02998 = 0.08374, and the 0.99 quantile of its MPE equals MPE_Q0.99 = 0.35001. Thus, the true PRs’ values of this particular PV may fluctuate by plus or minus 0.02471 (if the medians of possible errors are taken into consideration), 0.08374 (if the means of possible errors are taken into consideration), or even 0.35001 (if the 0.99 quantiles of possible errors are taken into consideration). However, the difference between p1 and p2 merely equals 0.4697 − 0.4512 = 0.0185, which is even smaller than the doubled 0.01 quantile of AAE, i.e. 0.01384 × 2 = 0.02768 > 0.0185, for this particular PV. It means that there is on average a 99% probability that the true preference order provided by this PV may equally well be denoted by the sequence p1 ≻ p2 ≻ p3. Hence, in comparison with the preference order indicated in the example by the second PV, i.e. d1 ≻ d2 = d4 ≻ d3, which by the way may also fluctuate, a rank reversal is not observed.
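The stability check applied above can be sketched as a small routine. This is a hedged illustration of the idea, not the paper's exact procedure: a pair of priorities is flagged as rank-unstable whenever their gap does not exceed twice an error quantile taken from tables such as Tables 5 and 6; the function name and the doubling rule follow the worked example in the text.

```python
# Hedged sketch of the ranking-stability check: a pairwise order is
# treated as unreliable when the gap between two priorities does not
# exceed twice the assumed error quantile (per the worked example).

def rank_unstable_pairs(pv, error_quantile):
    """Return index pairs (i, j) whose order may flip within +/- error_quantile."""
    unstable = []
    n = len(pv)
    for i in range(n):
        for j in range(i + 1, n):
            if abs(pv[i] - pv[j]) <= 2 * error_quantile:
                unstable.append((i, j))
    return unstable

p = [0.4512, 0.4697, 0.0791]
aae_q = 0.01384                          # AAE quantile quoted in the text

print(rank_unstable_pairs(p, aae_q))     # [(0, 1)] -> order of p1 and p2 is uncertain
```

Only the p1/p2 pair is flagged: their gap of 0.0185 falls inside 2 × 0.01384 = 0.02768, while both gaps involving p3 are an order of magnitude larger.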

It is believed that every ranking designated by any PV obtained or derived via the pairwise comparisons process should be evaluated from the above presented perspective. That is why Tables 7–10 present the discussed errors also for other selected preference scales and various sizes of consistent PCMs.

When the in/consistency in Pairwise Comparisons (PCs) is taken into consideration as a subarea of the MADM scientific field, it may arguably be perceived as the most extensively explored topic in this research area. A variety of models have been proposed to address inconsistency issues, see e.g. [78–85]. Certainly, issues related to PCs in/consistency have many repercussions in various types of modelling scenarios, i.e. the inconsistency reduction of reciprocal Pairwise Comparison Matrices (PCMs) with high levels of inconsistency, see e.g. [72,86–91]; deriving appropriate consistency thresholds for non/reciprocal PCMs, see e.g. [10,74,92–96]; completing of incomplete reciprocal PCMs, see e.g. [97–102]; aggregating of individual reciprocal PCMs in relation to Group Decision Making (GDM) aspects, see e.g. [75,103–110]; and the relation of PCMs’ in/consistency to the credibility of Priority Vectors (PVs) derived from PCMs with the application of various Priorities Deriving Methods (PDMs). The examination objective in the latter area of research is the uncertainty related to the inexactitude of prioritization based on derived PVs.

It should be realized here that there are three significantly different notions: the PCM consistency perceived from the perspective of its definition, see hereafter D[3], and expressed by a specific inconsistency index value; the consistency of decision makers, i.e. their trustworthiness, reflected by the number and size of the discrepancies in their judgments; and the PCM applicability for the estimation of decision makers’ priorities in a way that leads to the minimization of their estimation errors.

It seems that the third issue is probably the most important problem in the contemporary arena of MADM theory concerning the AHP, and the only way to examine that phenomenon is through computer simulations.

The relation of PCMs’ in/consistency to the credibility of Priority Vectors (PVs) derived from PCMs with the application of various Priorities Deriving Methods (PDMs) constitutes the key issue in a few research studies, e.g. [51,76,77,111–117]. As previously stated, the examination objective in this area of research is the uncertainty related to the inexactitude of prioritization based on derived PVs. However, only a few research studies examine this problem from the perspective of PCM applicability for credible designation of the decision maker’s (DM) priorities in a way that leads to minimization of the prioritization uncertainty related to possible, and sometimes very probable, ranking fluctuations. This problem constitutes the primary area of interest for this research paper, as no other research study has thus far been identified that examines this problem from the perspective of consistent PCMs.

So far, this concept has been studied only from the perspective of inconsistent PCMs, see e.g. [51,76,77,111,112,114,117]. Hence, a research gap was identified, and the objective of this research paper was to fill in this scientific gap. The research findings have serious repercussions for prioritization quality with the application of the PCs methodology, mostly in relation to the interpretation and reliability evaluation of prioritization results. Firstly, the research study outcome changes the perspective on the rank reversal phenomenon, which sheds new light on many research studies that have been presented in the subject’s literature for many decades. Secondly, the research study results throw new light on the discussion concerning the fuzziness of the AHP’s results. Last but not least, the effect of the research opens a unique opportunity to evaluate the prioritization outcome obtained within the process of consistent PCs from the well-known perspective of statistical hypothesis testing, i.e. designating the probability that accepted ranking results, considered correct due to a low probability of change, may in fact be incorrect and should be rejected, and the probability that rejected ranking results, considered incorrect due to a high probability of change, may in fact be correct and should be accepted. The paramount finding of the research is the fact that consistent PCMs provide PVs whose elements cannot be considered as established, but only as approximated within certain confidence intervals estimated with a certain level of probability. As problems related to heuristics cannot be determined mathematically and can be analyzed only via a computer simulation process, the problem examined in this research paper was examined via Monte Carlo simulations, appropriately coded and executed with the application of Wolfram’s Mathematica software.
It is believed that these research findings will be very important and useful for all decision makers and researchers in their examinations of problems that relate to prioritization processes with the application of the PCs methodology.

Final remarks

The conducted research study is a milestone on the way toward the interpretation and reliability evaluation of the results of the prioritization process with the application of the pairwise comparisons technique. Firstly, the research study results change the perspective on the rank reversal phenomenon, which sheds new light on many research studies that have been presented in the subject’s literature for many decades, see e.g. [211–216,218,220,222,223,225,226].

Secondly, the research study results throw new light on the discussion concerning the fuzziness of the AHP’s results. As is known, the creator of the AHP was strictly against further fuzzification of the AHP’s results, see e.g. [141,142]. Yet, there are many fuzzy AHP applications in the literature, and they are very popular, see e.g. [138,227–229].

Last but not least, the effect of this research opens a unique opportunity to evaluate the prioritization outcome obtained within the process of consistent pairwise comparisons from the well-known perspective of statistical hypothesis testing theory, i.e. the probability of making errors of the first and second kind: accepting ranking results which are not valid as they are prone to change, and rejecting ranking results which may be valid. This has significant repercussions for many application-oriented research papers which apply the pairwise comparisons method, see e.g. [139,140,230–233].

No research study is perfect when it comes to the reality of the physical world. Hence, this research study also has its limitations. They concern mostly the way the process of human judgment is imitated, i.e. the simulation of errors in human judgments. The nature of human judgments can here be represented only as a realization of some random process in accordance with an assumed probability distribution of the perturbation factor, e.g. uniform, gamma, truncated normal, log-normal, etc., see e.g. [115,234]. This is the main limitation of this research, because this process is only a stochastic process generated by computer algorithms.
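The judgment-imitation idea described above can be sketched in a few lines. This is a hedged, minimal illustration rather than the paper's Mathematica algorithm: a known "true" PV generates a consistent PCM of ratios, each judgment is multiplied by a random factor (uniform here; the text also mentions gamma, truncated normal and log-normal), and an estimated PV is derived by the geometric-mean method, one of many possible PDMs.

```python
# Minimal sketch of the judgment-error simulation idea (an assumption,
# not the paper's exact algorithm): build the consistent PCM of a known
# "true" PV, perturb each judgment, derive an estimated PV.
import math
import random

def perturbed_pv(true_pv, eps=0.1):
    """Derive a PV from a randomly perturbed, initially consistent PCM."""
    n = len(true_pv)
    a = [[1.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            factor = random.uniform(1 - eps, 1 + eps)   # simulated judgment error
            a[i][j] = true_pv[i] / true_pv[j] * factor
            a[j][i] = 1.0 / a[i][j]                     # keep the PCM reciprocal
    # geometric-mean prioritization (one of many possible PDMs)
    gm = [math.prod(row) ** (1.0 / n) for row in a]
    total = sum(gm)
    return [g / total for g in gm]

random.seed(0)
print([round(x, 3) for x in perturbed_pv([0.5, 0.3, 0.2])])
```

Repeating this over many true PVs, perturbation distributions and error sizes, and comparing each estimated PV with its true PV, yields empirical distributions of errors such as AAE and MAD.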

Future research may challenge these limitations, as well as include other preference scales which were not examined in this research paper.

Acknowledgments

The author would like to acknowledge the support of two independent referees for their benevolence and a very smooth reviewing process, especially Dr. Bagh Ali, who decided to reveal their identity, for very constructive comments which improved the first version of the manuscript. The author also appreciates the valuable comments made by the Editor which improved both the second and the third version of the manuscript.

References

  1. 1. Miller GA. The magical number seven plus or minus two: some limits on our capacity for processing information. Psychol Rev. 1956;63: 81–97. pmid:13310704
  2. 2. Zopounidis C, Pardalos PM, editors. Handbook of Multicriteria Analysis. Berlin, Heidelberg: Springer; 2010. https://doi.org/10.1007/978-3-540-92828-7
  3. 3. Roy B. Decision-aid and decision-making. Eur J Oper Res. 1990;45: 324–331.
  4. 4. Kashid US, Kashid DU, Mehta SN. A Review of Mathematical Multi-Criteria Decision Models With a Case Study. International Conference on Efficacy of Software Tools for Mathematical Modeling (ICESTMM’19). Rochester, NY: http://ijrar.com/uploads/conference/ijrar_47.pdf; 2019. Available: https://ssrn.com/abstract=3751947.
  5. 5. Colomer JM. Ramon Llull: from ‘Ars electionis’ to social choice theory. Soc Choice Welf. 2013;40: 317–328.
  6. 6. Young HP. Condorcet’s Theory of Voting. Am Polit Sci Rev. 1988;82: 1231–1244.
  7. 7. Condorcet J-A-N de C (1743–1794; marquis de) A du texte. Essai sur l’application de l’analyse à la probabilité des décisions rendues à la pluralité des voix ([Reprod.]) / par M. le marquis de Condorcet,… 1785. Available: https://gallica.bnf.fr/ark:/12148/bpt6k417181.
  8. 8. Saari DG, Merlin VR. The Copeland method. Econ Theory. 1996;8: 51–76.
  9. 9. Saaty TL. The Analytic Hierarchy Process: Decision Making in Complex Environments. In: Avenhaus R, Huber RK, editors. Quantitative Assessment in Arms Control: Mathematical Modeling and Simulation in the Analysis of Arms Control Problems. Boston, MA: Springer US; 1984. pp. 285–308. https://doi.org/10.1007/978-1-4613-2805-6_12
  10. 10. Saaty TL. Decision making with the analytic hierarchy process. Int J Serv Sci. 2008;1: 83.
  11. 11. Zhao H, Yao L, Mei G, Liu T, Ning Y. A Fuzzy Comprehensive Evaluation Method Based on AHP and Entropy for a Landslide Susceptibility Map. Entropy. 2017;19: 396.
  12. 12. Feng G, Lei S, Guo Y, Meng B, Jiang Q. Optimization and Evaluation of Ventilation Mode in Marine Data Center Based on AHP-Entropy Weight. Entropy. 2019;21: 796. pmid:33267509
  13. 13. Hodicky J, Özkan G, Özdemir H, Stodola P, Drozd J, Buck W. Analytic Hierarchy Process (AHP)-Based Aggregation Mechanism for Resilience Measurement: NATO Aggregated Resilience Decision Support Model. Entropy. 2020;22: 1037. pmid:33286806
  14. 14. Tomashevskii IL. Eigenvector ranking method as a measuring tool: Formulas for errors. Eur J Oper Res. 2015;240: 774–780.
  15. 15. Bana e Costa CA, Vansnick J-C. A critical analysis of the eigenvalue method used to derive priorities in AHP. Eur J Oper Res. 2008;187: 1422–1428.
  16. 16. Koczkodaj WW, Mikhailov L, Redlarski G, Soltys M, Szybowski J, Tamazian G, et al. Important Facts and Observations about Pairwise Comparisons (the special issue edition). Fundam Informaticae. 2016;144: 291–307.
  17. 17. Genest C, Rivest L-P. A Statistical Look at Saaty’s Method of Estimating Pairwise Preferences Expressed on a Ratio Scale. J Math Psychol. 1994;38: 477–496.
  18. 18. Basak I. Comparison of statistical procedures in analytic hierarchy process using a ranking test. Math Comput Model. 1998;28: 105–118.
  19. 19. Bozóki S, Dezső L, Poesz A, Temesi J. Analysis of pairwise comparison matrices: an empirical research. Ann Oper Res. 2013;211: 511–528.
  20. 20. Bryson N. A Goal Programming Method for Generating Priority Vectors. J Oper Res Soc. 1995;46: 641–648.
  21. 21. Choo EU, Wedley WC. A common framework for deriving preference values from pairwise comparison matrices. Comput Oper Res. 2004;31: 893–908.
  22. 22. Cook WD, Kress M. Deriving weights from pairwise comparison ratio matrices: An axiomatic approach. Eur J Oper Res. 1988;37: 355–362.
  23. 23. Crawford G, Williams C. The Analysis of Subjective Judgment Matrices: 1985 [cited 19 Feb 2020]. Available: https://www.rand.org/pubs/reports/R2572-1.html.
  24. 24. Crawford G, Williams C. A note on the analysis of subjective judgment matrices. J Math Psychol. 1985;29: 387–405.
  25. 25. Csató L. Ranking by pairwise comparisons for Swiss-system tournaments. Cent Eur J Oper Res. 2013;21: 783–803.
  26. 26. Dijkstra TK. On the extraction of weights from pairwise comparison matrices. Cent Eur J Oper Res. 2013;21: 103–123.
  27. 27. Dong Y, Xu Y, Li H, Dai M. A comparative study of the numerical scales and the prioritization methods in AHP. Eur J Oper Res. 2008;186: 229–242.
  28. 28. Farkas A, Rózsa P. A recursive least-squares algorithm for pairwise comparison matrices. Cent Eur J Oper Res. 2013;21: 817–843.
  29. 29. Hosseinian SS, Navidi H, Hajfathaliha A. A New Linear Programming Method for Weights Generation and Group Decision Making in the Analytic Hierarchy Process. Group Decis Negot. 2012;21: 233–254.
  30. 30. Hovanov NV, Kolari JW, Sokolov MV. Deriving weights from general pairwise comparison matrices. Math Soc Sci. 2008;55: 205–220. Available: https://ideas.repec.org/a/eee/matsoc/v55y2008i2p205-220.html.
  31. 31. Ishizaka A, Lusti M. How to derive priorities in AHP: a comparative study. Cent Eur J Oper Res. 2006;14: 387–400.
  32. 32. Kazibudzki PT. The Quality of Ranking during Simulated Pairwise Judgments for Examined Approximation Procedures. Model Simul Eng. 2019;2019: e1683143.
  33. 33. Kou G, Ergu D, Chen Y, Lin C. Pairwise comparison matrix in multiple criteria decision making. Technol Econ Dev Econ. 2016;22: 738–765.
  34. 34. Kou G, Lin C. A cosine maximization method for the priority vector derivation in AHP. Eur J Oper Res. 2014;235: 225–232.
  35. 35. Kułakowski K. A heuristic rating estimation algorithm for the pairwise comparisons method. Cent Eur J Oper Res. 2015;23: 187–203.
  36. 36. Kułakowski K, Mazurek J, Strada M. On the similarity between ranking vectors in the pairwise comparison method. J Oper Res Soc. 2021;0: 1–10.
  37. 37. Lin C, Kou G. A heuristic method to rank the alternatives in the AHP synthesis. Appl Soft Comput. 2020; 106916.
  38. 38. Lin C-C. A revised framework for deriving preference values from pairwise comparison matrices. Eur J Oper Res. 2007;176: 1145–1150.
  39. 39. Linares P, Lumbreras S, Santamaría A, Veiga A. How relevant is the lack of reciprocity in pairwise comparisons? An experiment with AHP. Ann Oper Res. 2016;245: 227–244.
  40. 40. Mardani A, Jusoh A, Nor KM, Khalifah Z, Zakwan N, Valipour A. Multiple criteria decision-making techniques and their applications–a review of the literature from 2000 to 2014. Econ Res-Ekon Istraživanja. 2015;28: 516–571.
  41. 41. Mizuno T. A Link Diagram for Pairwise Comparisons. In: Czarnowski I, Howlett RJ, Jain LC, Vlacic L, editors. Intelligent Decision Technologies 2018. Cham: Springer International Publishing; 2019. pp. 181–186. https://doi.org/10.1007/978-3-319-92028-3_19
  42. 42. Nishizawa K. Non-reciprocal Pairwise Comparisons and Solution Method in AHP. In: Czarnowski I, Howlett RJ, Jain LC, Vlacic L, editors. Intelligent Decision Technologies 2018. Cham: Springer International Publishing; 2019. pp. 158–165. https://doi.org/10.1007/978-3-319-92028-3_16
  43. 43. Orbán-Mihálykó É, Mihálykó C, Koltay L. A generalization of the Thurstone method for multiple choice and incomplete paired comparisons. Cent Eur J Oper Res. 2019;27: 133–159.
  44. 44. Saaty TL, Vargas LG. Comparison of eigenvalue, logarithmic least squares and least squares methods in estimating ratios. Math Model. 1984;5: 309–324.
  45. 45. Saaty TL, Vargas LG. The possibility of group choice: pairwise comparisons and merging functions. Soc Choice Welf. 2012;38: 481–496.
  46. 46. Shiraishi S, Obata T, Daigo M. Properties of a Positive Reciprocal Matrix and Their Application to Ahp. J Oper Res Soc Jpn. 1998;41: 404–414.
  47. 47. Temesi J. Pairwise comparison matrices and the error-free property of the decision maker. Cent Eur J Oper Res. 2011;19: 239–249.
  48. 48. Wang H, Peng Y, Kou G. A two-stage ranking method to minimize ordinal violation for pairwise comparisons. Appl Soft Comput. 2021; 107287.
  49. 49. Zahedi F. A simulation study of estimation methods in the analytic hierarchy process. Socioecon Plann Sci. 1986;20: 347–354.
  50. 50. Zhu B, Xu Z, Zhang R, Hong M. Hesitant analytic hierarchy process. Eur J Oper Res. 2016;250: 602–614.
  51. 51. Kazibudzki PT. On estimation of priority vectors derived from inconsistent pairwise comparison matrices. J Appl Math Comput Mech. 2022;21: 52–59.
  52. 52. Peláez JI, Martínez EA, Vargas LG. Consistency in Positive Reciprocal Matrices: An Improvement in Measurement Methods. IEEE Access. 2018;6: 25600–25609.
  53. 53. Peláez JI, Lamata MT. A new measure of consistency for positive reciprocal matrices. Comput Math Appl. 2003;46: 1839–1845.
  54. 54. Dixit PD. Entropy production rate as a criterion for inconsistency in decision theory. J Stat Mech Theory Exp. 2018;2018: 053408.
  55. 55. Fedrizzi M, Ferrari F. A chi-square-based inconsistency index for pairwise comparison matrices. J Oper Res Soc. 2018;69: 1125–1134.
  56. 56. Fedrizzi M. Distance–Based Characterization of Inconsistency in Pairwise Comparisons. In: Greco S, Bouchon-Meunier B, Coletti G, Fedrizzi M, Matarazzo B, Yager RR, editors. Advances in Computational Intelligence. Springer Berlin Heidelberg; 2012. pp. 30–36.
  57. 57. Cavallo B, D’Apuzzo L. Investigating Properties of the ⊙-Consistency Index. In: Greco S, Bouchon-Meunier B, Coletti G, Fedrizzi M, Matarazzo B, Yager RR, editors. Advances in Computational Intelligence. Springer Berlin Heidelberg; 2012. pp. 315–327.
  58. 58. Kułakowski K, Szybowski J. The New Triad based Inconsistency Indices for Pairwise Comparisons. Procedia Comput Sci. 2014;35: 1132–1137.
  59. 59. Szybowski J. The Cycle Inconsistency Index in Pairwise Comparisons Matrices. Procedia Comput Sci. 2016;96: 879–886.
  60. 60. Fedrizzi M, Civolani N, Critch A. Inconsistency evaluation in pairwise comparison using norm-based distances. Decis Econ Finance. 2020 [cited 5 Sep 2020].
  61. 61. Wan Z, Chen M, Zhang L. New Consistency Index for Comparison Matrices and Its Properties. Int J Appl Math Stat. 2013;42: 206–218. Available: http://www.ceser.in/ceserp/index.php/ijamas/article/view/897.
  62. 62. Cavallo B, D’Apuzzo L, Squillante M. About a consistency index for pairwise comparison matrices over a divisible alo‐group. Int J Intell Syst. 2012;27: 153–175.
  63. 63. Cavallo B, D’Apuzzo L, Marcarelli G. Pairwise Comparison Matrices: Some Issue on Consistency and a New Consistency Index. In: Greco S, Marques Pereira RA, Squillante M, Yager RR, Kacprzyk J, editors. Preferences and Decisions: Models and Applications. Berlin, Heidelberg: Springer; 2010. pp. 111–122. https://doi.org/10.1007/978-3-642-15976-3_7
  64. 64. Stein WE, Mizzi PJ. The harmonic consistency index for the analytic hierarchy process. Eur J Oper Res. 2007;177: 488–497.
  65. 65. Salo AA, Hämäläinen RP. Preference programming through approximate ratio comparisons. Eur J Oper Res. 1995;82: 458–475.
  66. 66. Amenta P, Lucadamo A, Marcarelli G. Approximate thresholds for Salo-Hamalainen index. IFAC-Pap. 2018;51: 1655–1659.
  67. 67. Takeda E. A note on consistent adjustments of pairwise comparison judgments. Math Comput Model. 1993;17: 29–35.
  68. 68. Golden BL, Wang Q. An Alternate Measure of Consistency. In: Golden BL, Wasil EA, Harker PT, editors. The Analytic Hierarchy Process: Applications and Studies. Berlin, Heidelberg: Springer; 1989. pp. 68–81. https://doi.org/10.1007/978-3-642-50244-6_5
  69. 69. Lin C, Kou G, Ergu D. An improved statistical approach for consistency test in AHP. Ann Oper Res. 2013;211: 289–299.
  70. 70. Aguarón J, Escobar MT, Moreno-Jiménez JM, Turón A. The Triads Geometric Consistency Index in AHP-Pairwise Comparison Matrices. Mathematics. 2020;8: 926.
  71. 71. Barzilai J. Consistency measures for pairwise comparison matrices. J Multi-Criteria Decis Anal. 1998;7: 123–132.
  72. 72. Siraj S, Mikhailov L, Keane JA. Contribution of individual judgments toward inconsistency in pairwise comparisons. Eur J Oper Res. 2015;242: 557–567.
  73. 73. Grzybowski AZ. Note on a new optimization based approach for estimating priority weights and related consistency index. Expert Syst Appl. 2012;39: 11699–11708.
  74. 74. Koczkodaj WW. A new definition of consistency of pairwise comparisons. Math Comput Model. 1993;18: 79–84.
  75. 75. Wu Z, Xu J. A consistency and consensus based decision support model for group decision making with multiplicative preference relations. Decis Support Syst. 2012;52: 757–767.
  76. 76. Kazibudzki PT. An Examination of Ranking Quality for Simulated Pairwise Judgments in relation to Performance of the Selected Consistency Measure. Adv Oper Res. 2019;2019: e3574263.
  77. 77. Grzybowski AZ. New results on inconsistency indices and their relationship with the quality of priority vector estimation. Expert Syst Appl. 2016;43: 197–212.
  78. 78. Bortot S, Brunelli M, Fedrizzi M, Marques Pereira RA. A novel perspective on the inconsistency indices of reciprocal relations and pairwise comparison matrices. Fuzzy Sets Syst. 2022 [cited 10 May 2022].
  79. 79. Pant S, Kumar A, Ram M, Klochkov Y, Sharma HK. Consistency Indices in Analytic Hierarchy Process: A Review. Mathematics. 2022;10: 1206.
  80. 80. Brunelli M, Fedrizzi M. Axiomatic properties of inconsistency indices for pairwise comparisons. J Oper Res Soc. 2015;66: 1–15.
  81. 81. Brunelli M, Fedrizzi M. A general formulation for some inconsistency indices of pairwise comparisons. Ann Oper Res. 2019;274: 155–169.
  82. 82. Cavallo B. Computing random consistency indices and assessing priority vectors reliability. Inf Sci. 2017;420: 532–542.
  83. 83. Mazurek J, Ramík J. Some new properties of inconsistent pairwise comparisons matrices. Int J Approx Reason. 2019;113: 119–132.
  84. 84. Koczkodaj WW, Urban R. Axiomatization of inconsistency indicators for pairwise comparisons. Int J Approx Reason. 2018;94: 18–29.
  85. 85. Koczkodaj W, Szwarc R. On Axiomatization of Inconsistency Indicators for Pairwise Comparisons. Fundam Informaticae. 2014; 485–500.
  86. 86. Bozóki S, Fülöp J, Poesz A. On reducing inconsistency of pairwise comparison matrices below an acceptance threshold. Cent Eur J Oper Res. 2015;23: 849–866.
  87. 87. Aguarón J, Escobar MT, Moreno-Jiménez JM. Reducing inconsistency measured by the geometric consistency index in the analytic hierarchy process. Eur J Oper Res. 2021;288: 576–583.
  88. 88. Ergu D, Kou G, Peng Y, Shi Y. A simple method to improve the consistency ratio of the pair-wise comparison matrix in ANP. Eur J Oper Res. 2011;213: 246–259.
  89. 89. Khatwani G, Kar AK. Improving the Cosine Consistency Index for the analytic hierarchy process for solving multi-criteria decision making problems. Appl Comput Inform. 2017;13: 118–129.
  90. 90. Siraj S, Mikhailov L, Keane J. A heuristic method to rectify intransitive judgments in pairwise comparison matrices. Eur J Oper Res. 2012;216: 420–428.
  91. 91. Xu Y, Li M, Cabrerizo FJ, Chiclana F, Herrera-Viedma E. Algorithms to Detect and Rectify Multiplicative and Ordinal Inconsistencies of Fuzzy Preference Relations. IEEE Trans Syst Man Cybern Syst. 2021;51: 3498–3511.
  92. 92. Aguarón J, Moreno-Jiménez JM. The geometric consistency index: Approximated thresholds. Eur J Oper Res. 2003;147: 137–145.
  93. 93. Amenta P, Lucadamo A, Marcarelli G. On the transitivity and consistency approximated thresholds of some consistency indices for pairwise comparison matrices. Inf Sci. 2020;507: 274–287.
  94. 94. Bozóki S, Rapcsák T. On Saaty’s and Koczkodaj’s inconsistencies of pairwise comparison matrices. J Glob Optim. 2008;42: 157–175.
  95. 95. Duszak Z, Koczkodaj WW. Generalization of a new definition of consistency for pairwise comparisons. Inf Process Lett. 1994;52: 273–276.
  96. 96. Scala NM, Rajgopal J, Vargas LG, Needy KL. Group Decision Making with Dispersion in the Analytic Hierarchy Process. Group Decis Negot. 2016;25: 355–372.
  97. 97. Bozóki S, Fülöp J, Rónyai L. On optimal completion of incomplete pairwise comparison matrices. Math Comput Model. 2010;52: 318–333.
  98. 98. Bozóki S, Fülöp J, Koczkodaj WW. An LP-based inconsistency monitoring of pairwise comparison matrices. Math Comput Model. 2011;54: 789–793.
  99. 99. Fedrizzi M, Giove S. Incomplete pairwise comparison and consistency optimization. Eur J Oper Res. 2007;183: 303–313.
  100. 100. Kułakowski K. On the Geometric Mean Method for Incomplete Pairwise Comparisons. Mathematics. 2020;8: 1873.
  101. 101. Liu X, Pan Y, Xu Y, Yu S. Least square completion and inconsistency repair methods for additively consistent fuzzy preference relations. Fuzzy Sets Syst. 2012;198: 1–19.
  102. 102. Ureña R, Chiclana F, Morente-Molinera JA, Herrera-Viedma E. Managing incomplete preference relations in decision making: A review and future trends. Inf Sci. 2015;302: 14–32.
  103. 103. Aguarón J, Escobar MT, Moreno-Jiménez JM, Turón A. AHP-Group Decision Making Based on Consistency. Mathematics. 2019;7: 242.
  104. 104. Brunelli M, Fedrizzi M. Boundary properties of the inconsistency of pairwise comparisons in group decisions. Eur J Oper Res. 2015;240: 765–773.
  105. 105. Escobar MT, Aguarón J, Moreno-Jiménez JM. A note on AHP group consistency for the row geometric mean priorization procedure. Eur J Oper Res. 2004;153: 318–322.
  106. 106. Aguarón J, Escobar MT, Moreno-Jiménez JM. The precise consistency consensus matrix in a local AHP-group decision making context. Ann Oper Res. 2016;245: 245–259.
  107. 107. Csató L. Characterization of the Row Geometric Mean Ranking with a Group Consensus Axiom. Group Decis Negot. 2018;27: 1011–1027.
  108. 108. Escobar MT, Aguarón J, Moreno-Jiménez JM. Some extensions of the precise consistency consensus matrix. Decis Support Syst. 2015;74: 67–77.
  109. 109. Fedrizzi M, Fedrizzi M, Pereira RAM. On the Issue of Consistency in Dynamical Consensual Aggregation. In: Bouchon-Meunier B, Gutiérrez-Ríos J, Magdalena L, Yager RR, editors. Technologies for Constructing Intelligent Systems 1: Tasks. Heidelberg: Physica-Verlag HD; 2002. pp. 129–137. https://doi.org/10.1007/978-3-7908-1797-3_10
  110. 110. Xu Z. On consistency of the weighted geometric mean complex judgement matrix in AHP. Eur J Oper Res. 2000;126: 683–687.
  111. Kazibudzki PT. Redefinition of triad’s inconsistency and its impact on the consistency measurement of pairwise comparison matrix. J Appl Math Comput Mech. 2016;15: 71–78.
  112. Grzybowski AZ, Starczewski T. New Look at the Inconsistency Analysis in the Pairwise-Comparisons-Based Prioritization Problems. Expert Syst Appl. 2020; 113549.
  113. Kazibudzki PT. An examination of performance relations among selected consistency measures for simulated pairwise judgments. Ann Oper Res. 2016;244: 525–544.
  114. Kazibudzki PT, Křupka J. Pairwise judgments consistency impact on quality of multi-criteria group decision-making with AHP. EM Ekon Manag. 2019;22: 195–212.
  115. Kazibudzki PT. The AHP Phenomenon of Rank Reversal Demystified. 2022.
  116. Grzybowski AZ, Starczewski T. Remarks about inconsistency analysis in the pairwise comparison technique. 2017 IEEE 14th International Scientific Conference on Informatics. 2017. pp. 227–231.
  117. Grzybowski AZ, Starczewski T. Simulation Analysis of Prioritization Errors in the AHP and Their Relationship with an Adopted Judgement Scale. Proceedings. San Francisco, USA; 2018. p. 5. Available: http://www.iaeng.org/publication/WCECS2018/WCECS2018_pp547-551.pdf.
  118. Hwang C-L, Yoon K. Multiple Attribute Decision Making. Berlin, Heidelberg: Springer; 1981. https://doi.org/10.1007/978-3-642-48318-9
  119. Aruldoss M, Lakshmi TM, Venkatesan VP. A Survey on Multi Criteria Decision Making Methods and Its Applications. Am J Inf Syst. 2013;1: 31–43.
  120. Taherdoost H, Madanchian M. Multi-Criteria Decision Making (MCDM) Methods and Concepts. Encyclopedia. 2023;3: 77–87.
  121. Canco I, Kruja D, Iancu T. AHP, a Reliable Method for Quality Decision Making: A Case Study in Business. Sustainability. 2021;13: 13932.
  122. Chen C-H. A Novel Multi-Criteria Decision-Making Model for Building Material Supplier Selection Based on Entropy-AHP Weighted TOPSIS. Entropy. 2020;22: 259. pmid:33286032
  123. Emrouznejad A, Marra M. The state of the art development of AHP (1979–2017): a literature review with a social network analysis. Int J Prod Res. 2017;55: 6653–6675.
  124. Garuti C. Reflections on Common Misunderstandings When Using AHP and a Response to Criticism of Saaty’s Consistency Index. Int J Anal Hierarchy Process. 2018;10.
  125. Leal JE. AHP-express: A simplified version of the analytical hierarchy process method. MethodsX. 2020;7: 100748. pmid:32021813
  126. Mitchell KH, Wasil EA. AHP in Practice: Applications and Observations from a Management Consulting Perspective. In: Golden BL, Wasil EA, Harker PT, editors. The Analytic Hierarchy Process: Applications and Studies. Berlin, Heidelberg: Springer Berlin Heidelberg; 1989. pp. 192–212. https://doi.org/10.1007/978-3-642-50244-6_13
  127. Munier N, Hontoria E. Shortcomings of the AHP Method. In: Munier N, Hontoria E, editors. Uses and Limitations of the AHP Method: A Non-Mathematical and Rational Analysis. Cham: Springer International Publishing; 2021. pp. 41–90. https://doi.org/10.1007/978-3-030-60392-2_5
  128. Mu E, Pereyra-Rojas M. Group Decision-Making in AHP. In: Mu E, Pereyra-Rojas M, editors. Practical Decision Making using Super Decisions v3: An Introduction to the Analytic Hierarchy Process. Cham: Springer International Publishing; 2018. pp. 81–89. https://doi.org/10.1007/978-3-319-68369-0_8
  129. Saaty TL. A scaling method for priorities in hierarchical structures. J Math Psychol. 1977;15: 234–281.
  130. Kheybari S, Rezaie FM, Farazmand H. Analytic network process: An overview of applications. Appl Math Comput. 2020;367: 124780.
  131. Saaty TL. Decision making—the Analytic Hierarchy and Network Processes (AHP/ANP). J Syst Sci Syst Eng. 2004;13: 1–35.
  132. Saaty TL. The Analytic Hierarchy and Analytic Network Processes for the Measurement of Intangible Criteria and for Decision-Making. In: Figueira J, Greco S, Ehrogott M, editors. Multiple Criteria Decision Analysis: State of the Art Surveys. New York, NY: Springer; 2005. pp. 345–405. https://doi.org/10.1007/0-387-23081-5_9
  133. Saaty TL. The Analytic Network Process. In: Saaty TL, Vargas LG, editors. Decision Making with the Analytic Network Process: Economic, Political, Social and Technological Applications with Benefits, Opportunities, Costs and Risks. Boston, MA: Springer US; 2006. pp. 1–26. https://doi.org/10.1007/0-387-33987-6_1
  134. Sipahi S, Timor M. The analytic hierarchy process and analytic network process: an overview of applications. Manag Decis. 2010;48: 775–808.
  135. Whitaker R. Validation examples of the Analytic Hierarchy Process and Analytic Network Process. Math Comput Model. 2007;46: 840–859.
  136. Zadeh LA. Fuzzy sets. Inf Control. 1965;8: 338–353.
  137. Demirel T, Demirel NÇ, Kahraman C. Fuzzy Analytic Hierarchy Process and its Application. In: Kahraman C, editor. Fuzzy Multi-Criteria Decision Making: Theory and Applications with Recent Developments. Boston, MA: Springer US; 2008. pp. 53–83. https://doi.org/10.1007/978-0-387-76813-7_3
  138. Mikhailov L. A fuzzy programming method for deriving priorities in the analytic hierarchy process. J Oper Res Soc. 2000;51: 341–349.
  139. Nazam M, Ahmad J, Javed MK, Hashim M, Yao L. Risk-Oriented Assessment Model for Project Bidding Selection in Construction Industry of Pakistan Based on Fuzzy AHP and TOPSIS Methods. In: Xu J, Cruz-Machado VA, Lev B, Nickel S, editors. Proceedings of the Eighth International Conference on Management Science and Engineering Management. Berlin, Heidelberg: Springer; 2014. pp. 1165–1177. https://doi.org/10.1007/978-3-642-55122-2_101
  140. Nazam M, Hashim M, Ahmad J, Ahmad W, Tahir M. Selection of Reverse Logistics Operating Channels Through Integration of Fuzzy AHP and Fuzzy TOPSIS: A Pakistani Case. In: Xu J, Hajiyev A, Nickel S, Gen M, editors. Proceedings of the Tenth International Conference on Management Science and Engineering Management. Singapore: Springer; 2017. pp. 1117–1134. https://doi.org/10.1007/978-981-10-1837-4_92
  141. Saaty TL. There is no mathematical validity for using fuzzy number crunching in the analytic hierarchy process. J Syst Sci Syst Eng. 2006;15: 457–464.
  142. Saaty TL, Tran LT. On the invalidity of fuzzifying numerical judgments in the Analytic Hierarchy Process. Math Comput Model. 2007;46: 962–975.
  143. Charnes A, Cooper WW, Rhodes E. Measuring the efficiency of decision making units. Eur J Oper Res. 1978;2: 429–444.
  144. Grošelj P, Pezdevšek Malovrh Š, Zadnik Stirn L. Methods based on data envelopment analysis for deriving group priorities in analytic hierarchy process. Cent Eur J Oper Res. 2011;19: 267–284.
  145. Triantaphyllou E. Multi-Criteria Decision Making Methods. In: Triantaphyllou E, editor. Multi-criteria Decision Making Methods: A Comparative Study. Boston, MA: Springer US; 2000. pp. 5–21. https://doi.org/10.1007/978-1-4757-3157-6_2
  146. Ignizio JP, Romero C. Goal Programming. In: Bidgoli H, editor. Encyclopedia of Information Systems. New York: Elsevier; 2003. pp. 489–500. https://doi.org/10.1016/B0-12-227240-4/00082-4
  147. Lin C-C. An enhanced goal programming method for generating priority vectors. J Oper Res Soc. 2006;57: 1491–1496.
  148. Liu F, Zhang W-G, Wang Z-X. A goal programming model for incomplete interval multiplicative preference relations and its application in group decision-making. Eur J Oper Res. 2012;218: 747–754.
  149. Orumie UC, Ebong D. A Glorious Literature on Linear Goal Programming Algorithms. Am J Oper Res. 2014;2014.
  150. Wang Z-J. A note on “A goal programming model for incomplete interval multiplicative preference relations and its application in group decision-making.” Eur J Oper Res. 2015;247: 867–871.
  151. Gerus-Gościewska M, Gościewski D. Grey Relational Analysis (GRA) as an Effective Method of Research into Social Preferences in Urban Space Planning. Land. 2022;11: 102.
  152. Kuo Y, Yang T, Huang G-W. The use of grey relational analysis in solving multiple attribute decision-making problems. Comput Ind Eng. 2008;55: 80–93.
  153. Li L, Liu Z, Du X. Improvement of Analytic Hierarchy Process Based on Grey Correlation Model and Its Engineering Application. Asce-Asme J Risk Uncertain Eng Syst Part -Civ Eng. 2021;7: 04021007.
  154. Govindan K, Jepsen MB. ELECTRE: A comprehensive literature review on methodologies and applications. Eur J Oper Res. 2016;250: 1–29.
  155. Figueira J, Mousseau V, Roy B. Electre Methods. In: Figueira J, Greco S, Ehrogott M, editors. Multiple Criteria Decision Analysis: State of the Art Surveys. New York, NY: Springer; 2005. pp. 133–153. https://doi.org/10.1007/0-387-23081-5_4
  156. David A, Damart S. Bernard Roy et l’aide multicritère à la décision. Rev Fr Gest. 2011;214: 15–28. Available: https://www.cairn.info/revue-francaise-de-gestion-2011-5-page-15.htm.
  157. Mardani A, Zavadskas EK, Govindan K, Amat Senin A, Jusoh A. VIKOR Technique: A Systematic Review of the State of the Art Literature on Methodologies and Applications. Sustainability. 2016;8: 37.
  158. Opricovic S, Tzeng G-H. Compromise solution by MCDM methods: A comparative analysis of VIKOR and TOPSIS. Eur J Oper Res. 2004;156: 445–455.
  159. Behzadian M, Khanmohammadi Otaghsara S, Yazdani M, Ignatius J. A state-of the-art survey of TOPSIS applications. Expert Syst Appl. 2012;39: 13051–13069.
  160. Chakraborty S. TOPSIS and Modified TOPSIS: A comparative analysis. Decis Anal J. 2022;2: 100021.
  161. Karim R, Karmaker CL. Machine Selection by AHP and TOPSIS Methods. Am J Ind Eng. 2016;4: 7–13.
  162. Rianto Budiyanto D, Setyohadi Suyoto. AHP-TOPSIS on selection of new university students and the prediction of future employment. 2017 1st International Conference on Informatics and Computational Sciences (ICICoS). 2017. pp. 125–130.
  163. Zyoud SH, Fuchs-Hanusch D. A bibliometric-based survey on AHP and TOPSIS techniques. Expert Syst Appl. 2017;78: 158–181.
  164. TOPSIS method for Multiple-Criteria Decision Making (MCDM). In: GeeksforGeeks [Internet]. 15 Sep 2021 [cited 16 Mar 2023]. Available: https://www.geeksforgeeks.org/topsis-method-for-multiple-criteria-decision-making-mcdm/.
  165. Si S-L, You X-Y, Liu H-C, Zhang P. DEMATEL Technique: A Systematic Review of the State-of-the-Art Literature on Methodologies and Applications. Math Probl Eng. 2018;2018: e3696457.
  166. Hsieh Y-F, Lee Y-C, Lin S-B. Rebuilding DEMATEL threshold value: an example of a food and beverage information system. SpringerPlus. 2016;5: 1–13. pmid:27610304
  167. Gigović L, Pamučar D, Bajić Z, Milićević M. The Combination of Expert Judgment and GIS-MAIRCA Analysis for the Selection of Sites for Ammunition Depots. Sustainability. 2016;8: 372.
  168. Adegoke AS, Oladokun TT, Ayodele TO, Agbato SE, Jinadu AA. DEMATEL method of analysing the factors influencing the decision to adopt virtual reality technology by real estate firms in Lagos property market. Smart Sustain Built Environ. 2021;11: 891–917.
  169. Chen C-Y, Huang J-J. A Novel DEMATEL Approach by Considering Normalization and Invertibility. Symmetry. 2022;14: 1109.
  170. Chung-Wei L, Gwo-Hshiung T. Identification of a Threshold Value for the DEMATEL Method: Using the Maximum Mean De-Entropy Algorithm. In: Shi Y, Wang S, Peng Y, Li J, Zeng Y, editors. Cutting-Edge Research Topics on Multiple Criteria Decision Making. Berlin, Heidelberg: Springer; 2009. pp. 789–796. https://doi.org/10.1007/978-3-642-02298-2_115
  171. Kwartnik-Pruc A, Ginda G, Trembecka A. Using the DEMATEL Method to Identify Impediments to the Process of Determining Compensation for Expropriated Properties. Land. 2022;11: 693.
  172. Tsai C-J, Shyr W-J. Using the DEMATEL Method to Explore Influencing Factors for Video Communication and Visual Perceptions in Social Media. Sustainability. 2022;14: 15164.
  173. Bana e Costa CA, Vansnick J-C. The MACBETH Approach: Basic Ideas, Software, and an Application. In: Meskens N, Roubens M, editors. Advances in Decision Analysis. Dordrecht: Springer Netherlands; 1999. pp. 131–157. https://doi.org/10.1007/978-94-017-0647-6_9
  174. Bana e Costa CA, De Corte J-M, Vansnick J-C. Macbeth. Int J Inf Technol Decis Mak. 2012;11: 359–387.
  175. Bana e Costa CA, de Corte J-M, Vansnick J-C. MACBETH (Measuring Attractiveness by a Categorical Based Evaluation Technique). Wiley Encyclopedia of Operations Research and Management Science. Hoboken, NJ, USA: John Wiley & Sons, Inc.; 2011. p. eorms0970. https://doi.org/10.1002/9780470400531.eorms0970
  176. Alonso JA, Lamata MT. Consistency in the analytic hierarchy process: a new approach. Int J Uncertain Fuzziness Knowl-Based Syst. 2006;14: 445–459.
  177. Benítez J, Delgado-Galván X, Izquierdo J, Pérez-García R. Improving consistency in AHP decision-making processes. Appl Math Comput. 2012;219: 2432–2441.
  178. Osei-Bryson K-M. An action learning approach for assessing the consistency of pairwise comparison data. Eur J Oper Res. 2006;174: 234–244.
  179. Ishizaka A, Siraj S. Interactive consistency correction in the analytic hierarchy process to preserve ranks. Decis Econ Finance. 2020;43: 443–464.
  180. Aczél J, Saaty TL. Procedures for synthesizing ratio judgements. J Math Psychol. 1983;27: 93–102.
  181. Botelho M. Analyzing priority vectors: going beyond inconsistency indexes. Int J Anal Hierarchy Process. 2022;14.
  182. Cavallo B, Ishizaka A, Olivieri MG, Squillante M. Comparing inconsistency of pairwise comparison matrices depending on entries. J Oper Res Soc. 2019;70: 842–850.
  183. Floriano CM, Pereira V, Rodrigues B e S. 3MO-AHP: an inconsistency reduction approach through mono-, multi- or many-objective quality measures. Data Technol Appl. 2022;ahead-of-print.
  184. Kazibudzki PT, Grzybowski AZ. On Some Advancements within Certain Multicriteria Decision Making Support Methodology. Am J Bus Manag. 2013;2: 143–154.
  185. Cavallo B, Ishizaka A. Evaluating scales for pairwise comparisons. Ann Oper Res. 2022 [cited 12 Apr 2022].
  186. Franek J, Kresta A. Judgment Scales and Consistency Measure in AHP. Procedia Econ Finance. 2014;12: 164–173.
  187. Starczewski T. Remarks about geometric scale in the analytic hierarchy process. J Appl Math Comput Mech. 2018;17.
  188. Starczewski T. Remarks on the impact of the adopted scale on the priority estimation quality. J Appl Math Comput Mech. 2017;16: 105–116.
  189. Zhang H, Chen X, Dong Y, Xu W, Wang S. Analyzing Saaty’s consistency test in pairwise comparison method: a perspective based on linguistic and numerical scale. Soft Comput. 2018;22: 1933–1943.
  190. Bajwa G, Choo EU, Wedley WC. Effectiveness analysis of deriving priority vectors from reciprocal pairwise comparison matrices. Asia-Pac J Oper Res. 2008;25: 279–299.
  191. Kułakowski K, Mazurek J, Ramík J, Soltys M. When is the condition of order preservation met? Eur J Oper Res. 2019;277: 248–254.
  192. Mazurek J. New preference violation indices for the condition of order preservation. RAIRO—Oper Res. 2022;56: 367–380.
  193. Garuti C. Measuring in Weighted Environments (Moving from Metric to Order Topology). 2014.
  194. Garuti C. Measuring in Weighted Environments: Moving from Metric to Order Topology (Knowing When Close Really Means Close). Applications and Theory of Analytic Hierarchy Process—Decision Making for Strategic Decisions. IntechOpen; 2016.
  195. Kazibudzki PT. On the Statistical Discrepancy and Affinity of Priority Vector Heuristics in Pairwise-Comparison-Based Methods. Entropy. 2021;23: 1150. pmid:34573775
  196. Carmone FJ, Kara A, Zanakis SH. A Monte Carlo investigation of incomplete pairwise comparison matrices in AHP. Eur J Oper Res. 1997;102: 538–553.
  197. Herman MW, Koczkodaj WW. A Monte Carlo study of pairwise comparison. Inf Process Lett. 57: 25–29. Available: https://www.academia.edu/11950856/A_Monte_Carlo_study_of_pairwise_comparison.
  198. Wu H, Leung S-O. Can Likert Scales be Treated as Interval Scales?—A Simulation Study. J Soc Serv Res. 2017;43: 527–532.
  199. Zanakis SH, Solomon A, Wishart N, Dublish S. Multi-attribute decision making: A simulation comparison of select methods. Eur J Oper Res. 1998;107: 507–529.
  200. Winsberg E. Science in the Age of Computer Simulation. Chicago, IL: University of Chicago Press; 2010. Available: https://press.uchicago.edu/ucp/books/book/chicago/S/bo9003670.html.
  201. Saaty TL. The Modern Science of Multicriteria Decision Making and Its Practical Applications: The AHP/ANP Approach. Oper Res. 2013;61: 1101–1118.
  202. Harker PT, Vargas LG. The Theory of Ratio Scale Estimation: Saaty’s Analytic Hierarchy Process. Manag Sci. 1987;33: 1383–1403.
  203. Lootsma FA. Scale sensitivity in the multiplicative AHP and SMART. J Multi-Criteria Decis Anal. 1993;2: 87–110.
  204. Lootsma FA. A model for the relative importance of the criteria in the Multiplicative AHP and SMART. Eur J Oper Res. 1996;94: 467–476.
  205. Ishizaka A, Balkenborg D, Kaplan T. Influence of aggregation and measurement scale on ranking a compromise alternative in AHP. J Oper Res Soc. 2011;62: 700–710.
  206. Dodd FJ, Donegan HA. Comparison of Prioritization Techniques Using Interhierarchy Mappings. J Oper Res Soc. 1995;46: 492–498.
  207. Salo AA, Hämäläinen RP. On the measurement of preferences in the analytic hierarchy process. J Multi-Criteria Decis Anal. 1997;6: 309–319.
  208. Elliott MA. Selecting numerical scales for pairwise comparisons. Reliab Eng Syst Saf. 2010;95: 750–763.
  209. Goepel KD. Comparison of Judgment Scales of the Analytical Hierarchy Process—A New Approach. Int J Inf Technol Decis Mak. 2019;18: 445–463.
  210. Grzybowski AZ. On some recent advancements within the pairwise comparison methodology. 2017 IEEE 14th International Scientific Conference on Informatics. 2017. pp. 1–5.
  211. Aires RF de F, Ferreira L. The rank reversal problem in multi-criteria decision making: a literature review. Pesqui Oper. 2018;38: 331–362.
  212. Maleki H, Zahir S. A Comprehensive Literature Review of the Rank Reversal Phenomenon in the Analytic Hierarchy Process. J Multi-Criteria Decis Anal. 2013;20: 141–155.
  213. Zahir S. Normalisation and rank reversals in the additive analytic hierarchy process: a new analysis. Int J Oper Res. 2009;4: 446–467.
  214. Ramanathan U, Ramanathan R. An investigation into rank reversal properties of the multiplicative AHP. Int J Oper Res. 2011;11: 54–77.
  215. Majumdar A, Tiwari MK, Agarwal A, Prajapat K. A new case of rank reversal in analytic hierarchy process due to aggregation of cost and benefit criteria. Oper Res Perspect. 2021;8: 100185.
  216. Triantaphyllou E. Two new cases of rank reversals when the AHP and some of its additive variants are used that do not occur with the multiplicative AHP. J Multi-Criteria Decis Anal. 2001;10: 11–25.
  217. Belton V, Gear T. On a short-coming of Saaty’s method of analytic hierarchies. Omega. 1983;11: 228–230. Available: https://ideas.repec.org/a/eee/jomega/v11y1983i3p228-230.html.
  218. Belton V, Gear T. The legitimacy of rank reversal—A comment. Omega. 1985;13: 143–144. Available: https://ideas.repec.org/a/eee/jomega/v13y1985i3p143-144.html.
  219. Liberatore MJ, Nydick RL. Wash criteria and the analytic hierarchy process. Comput Oper Res. 2004;31: 889–892.
  220. Wijnmalen DJD, Wedley WC. Non-discriminating criteria in the AHP: removal and rank reversal. J Multi-Criteria Decis Anal. 2008;15: 143–149.
  221. Van Den Honert RC. Stochastic pairwise comparative judgements and direct ratings of alternatives in the REMBRANDT system. J Multi-Criteria Decis Anal. 1998;7: 87–97.
  222. Saaty TL. Rank Generation, Preservation, and Reversal in the Analytic Hierarchy Decision Process. Decis Sci. 1987;18: 157–177.
  223. Saaty TL, Sagir M. An essay on rank preservation and reversal. Math Comput Model. 2009;49: 1230–1243.
  224. Papathanasiou J, Ploskas N. AHP. In: Papathanasiou J, Ploskas N, editors. Multiple Criteria Decision Aid: Methods, Examples and Python Implementations. Cham: Springer International Publishing; 2018. pp. 109–129. https://doi.org/10.1007/978-3-319-91648-4_5
  225. Barzilai J, Golany B. AHP Rank Reversal, Normalization and Aggregation Rules. INFOR Inf Syst Oper Res. 1994;32: 57–64.
  226. Saaty TL, Vargas LG. The legitimacy of rank reversal. Omega. 1984;12: 513–516.
  227. Grošelj P, Stirn LZ. Evaluation of several approaches for deriving weights in fuzzy group analytic hierarchy process. J Decis Syst. 2018;27: 217–226.
  228. Grošelj P, Zadnik Stirn L. Soft consensus model for the group fuzzy AHP decision making. Croat Oper Res Rev. 2017;8: 207–220.
  229. Ramík J, Korviny P. Inconsistency of pair-wise comparison matrix with fuzzy elements based on geometric mean. Fuzzy Sets Syst. 2010;161: 1604–1613.
  230. Hou F. Market Competitiveness Evaluation of Mechanical Equipment with a Pairwise Comparisons Hierarchical Model. PLOS ONE. 2016;11: e0146862. pmid:26783751
  231. Madzík P, Falát L. State-of-the-art on analytic hierarchy process in the last 40 years: Literature review based on Latent Dirichlet Allocation topic modelling. PLOS ONE. 2022;17: e0268777. pmid:35622850
  232. Shahabi S, Pardhan S, Teymourlouy AA, Skempes D, Shahali S, Mojgani P, et al. Prioritizing solutions to incorporate Prosthetics and Orthotics services into Iranian health benefits package: Using an analytic hierarchy process. PLOS ONE. 2021;16: e0253001. pmid:34101766
  233. Hashim M, Nazam M, Abrar M, Hussain Z, Nazim M, Shabbir R. Unlocking the Sustainable Production Indicators: A Novel TESCO based Fuzzy AHP Approach. Tan AWK, editor. Cogent Bus Manag. 2021;8: 1870807.
  234. Kazibudzki PT. On the Similarity Among Priority Deriving Methods for the AHP. 2020.