A new weighting factor in combining belief function


  • Deyun Zhou, 
  • Qian Pan, 
  • Gyan Chhipi-Shrestha, 
  • Xiaoyang Li, 
  • Kun Zhang, 
  • Kasun Hewage, 
  • Rehan Sadiq


Dempster-Shafer evidence theory has been widely used in various applications. However, resolving the counter-intuitive outcomes that the classical Dempster-Shafer combination rule produces when fusing highly conflicting evidence remains an open issue. Many approaches based on discounted evidence and weighted average evidence have been investigated and have achieved significant improvements, yet all of them have inherent flaws. In this paper, a new weighting factor is proposed to address this problem. First, a modified dissimilarity measure is proposed that is characterized by both the distance and the conflict between pieces of evidence. Second, a measure of the information volume of each piece of evidence based on Deng entropy is introduced. The two kinds of weight derived from these measures are then combined into a new weighting factor, and a weighted average method based on this new weighting factor is proposed. Numerical examples illustrate the validity and effectiveness of the proposed method. Finally, the new method is applied to a real-life application of river water quality monitoring, where it effectively identifies the major land use activities contributing to river pollution.


Dempster-Shafer (D-S) evidence theory provides a reasonable and efficient way to deal with uncertain and discordant information. It has been extensively used in decision-making applications such as information fusion [1–3], uncertain reasoning [4], fault diagnosis [5], risk analysis [6–9], cognitive maps [10], and target recognition and association [11–13]. Unlike probability theory and Bayesian theory, D-S evidence theory requires few prior conditions and little prior knowledge when information is processed. For example, the Evidential Reasoning (ER) algorithm is a generalized Bayesian inference process, and the ER rule reveals that the combined degree of joint support for a proposition from two pieces of independent evidence consists of two parts in general [14–16]. When there is no prior information, the ER rule reduces to the D-S combination rule. Moreover, the D-S combination rule satisfies several desirable mathematical properties, such as commutativity and associativity. However, as pointed out by Zadeh [17], counter-intuitive results may occur in the normalization step of the classical D-S combination rule when the collected sources of evidence conflict highly with each other. This deficiency considerably reduces the effectiveness of D-S evidence theory.

Evidently, it is crucial to handle evidence with high conflict. In the last few years, many researchers have carried out comprehensive research and applied a series of modifications to the conventional evidence combination rule [18–24]. In general, the existing solutions to the problem of fusing highly conflicting evidence can be divided into two kinds: revision of the combination rule and revision of the evidence. Proponents of the first approach argue that illogical results are caused by an inappropriate distribution of the conflicting information; the modified methods based on revising the combination rule therefore mainly focus on altering the assignment of conflicting information [2, 18, 21, 25–27]. Among them, the transferable belief model (TBM) and Dezert-Smarandache theory (DSmT) are the most popular. The TBM develops a method to transfer basic belief assignments (BBAs) to probabilities, but the method can only be used in a closed world [25, 28]. DSmT extends the assignment universe of BBAs from a power set to a super power set, which is more thorough and complete, and corresponding combination models and rules have been developed as well [27]. Nonetheless, the modified combination rules have limitations in some situations. For example, most of them are neither commutative nor associative, and they are time-consuming when dealing with a large number of pieces of evidence.

The other kind of modification, revision of the evidence, preprocesses conflicting evidence before the combination process. Because these methods do not change the D-S combination rule, its favorable mathematical properties are preserved. Much related work supports this approach [22–24, 29–32]. Murphy [29] generates a new piece of evidence by averaging N pieces of evidence with equal weights and then combines it with itself N − 1 times. Based on this idea, Deng [22] proposes a weighted averaging method to obtain the new evidence. Besides the weighted averaging method, the discounting method also plays an important role in preprocessing conflicting evidence. The weighting factor of both the weighted average method and the discounting method can be identified by evidence distance, which is usually used to describe conflict or dissimilarity [33–35]. Liu [35] argues that the conflict coefficient k in evidence theory is inadequate to reflect the degree of conflict and dissimilarity between pieces of evidence, and he uses a two-dimensional pair <evidence conflict, evidence distance> to measure the dissimilarity between them. The pair is indeed more comprehensive and adequate than the single coefficient k for describing dissimilarity, but it also has intrinsic shortcomings in practice. For instance, the conflict tolerance threshold ε is largely subjective and depends on the perception of the decision maker. In addition, the pair is not integrated into the combination rule. Liu also proposes a dissimilarity measure based on the Hamacher T-conorm fusion rules, which considers not only the evidence conflict and distance but also enters the combination rule as a discount [36]. Nevertheless, this mathematical model has two limitations. First, since the conflict factor only uses the maximal subjective probability of the BBAs, it cannot handle propositions with equal belief values, as investigated in Section 3. Second, combining the evidence one piece at a time has a low convergence rate.

Besides evidence conflict and distance, the information volume of evidence is another criterion for measuring its importance [37]. If a piece of evidence carries more information, it should have a greater impact on the final aggregated result. Deng entropy [38], as a generalization of Shannon entropy, can measure the information volume of evidence within the framework of D-S evidence theory. In this paper, Deng entropy and a modified dissimilarity measure are used to form a new weighting factor. A new evidence combination procedure is then carried out based on this weighting factor, which improves versatility and has a fast convergence rate.

This paper is organized as follows. Section 2 describes some basic concepts related to D-S evidence theory and dissimilarity measures. Section 3 presents the problems of existing conflict coefficients, especially the limitations of Liu's method. Section 4 constructs the new weighting factor from the modified dissimilarity measure and Deng entropy, and presents examples and analysis to show the superiority and effectiveness of the proposed method. In Section 5, the proposed method is applied to a real-life problem of identifying water pollution sources. Finally, conclusions are drawn in Section 6.


2.1 Basics of D-S evidence theory

Definition 1. Let Θ be a nonempty finite set of mutually exclusive alternatives, called the frame of discernment. The set of all possible subsets of Θ, denoted 2^Θ, is called the power set. A mapping m: 2^Θ → [0,1] is a basic belief assignment (BBA), also known as a basic probability assignment (BPA) [39, 40]. The BBA satisfies

m(∅) = 0 (1)

Σ_{A⊆Θ} m(A) = 1 (2)

where m(A) reflects the strength of the evidence's support for the proposition A in the frame of discernment, and ∅ denotes the empty set. A is called a focal element if m(A) > 0.

Definition 2. The belief function Bel(A) and the plausibility function Pl(A) of a BBA are defined as

Bel(A) = Σ_{B⊆A} m(B) (3)

Pl(A) = Σ_{B∩A≠∅} m(B) (4)

where Bel(A) represents the amount of belief that definitely supports A, and Pl(A) can be viewed as the amount of belief that could potentially be placed in A.

Definition 3. Let m1 and m2 be two BBAs defined on the same frame Θ. The D-S combination rule is expressed as

m(A) = (1 / (1 − k)) Σ_{B∩C=A} m1(B) m2(C), A ≠ ∅ (5)

with

k = Σ_{B∩C=∅} m1(B) m2(C) (6)

where k is called the conflict coefficient and measures the degree of conflict between the two BBAs. The combination rule fails when k = 1.

Zadeh [17] presents a famous example in which the D-S combination rule produces an unexpected result. Suppose the frame is Θ = {A,B,C} and two BBAs are given as m1(A) = 0.99, m1(B) = 0.01 and m2(C) = 0.99, m2(B) = 0.01. By the D-S combination rule, the aggregated result is k = 0.9999, m(A) = m(C) = 0 and m(B) = 1, which is obviously counter-intuitive and unreasonable.
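To make the rule and Zadeh's example concrete, the following minimal Python sketch implements Eqs (5) and (6). Representing a BBA as a dictionary mapping frozensets (focal elements) to masses is our own convention for illustration, not part of the theory.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BBAs ({frozenset: mass}) by the classical D-S rule.

    Returns the combined BBA (Eq 5) and the conflict coefficient k (Eq 6).
    """
    combined = {}
    k = 0.0  # conflict coefficient: total mass of empty intersections
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            k += mb * mc
    # normalization step of Eq (5); fails when k = 1
    return {a: v / (1.0 - k) for a, v in combined.items()}, k

# Zadeh's example on the frame {A, B, C}
A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
fused, k = dempster_combine({A: 0.99, B: 0.01}, {C: 0.99, B: 0.01})
print(round(k, 4))           # 0.9999
print(round(fused[B], 4))    # 1.0 -- all belief forced onto the weak proposition B
```

The normalization step discards the 0.9999 of conflicting mass, which is exactly what produces the counter-intuitive result.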

2.2 Jousselme distance

Jousselme distance [32], which considers both the mass and the cardinality of the focal elements of each BBA, is commonly used as a measure of dissimilarity.

Definition 4. Let m1 and m2 be two BBAs on the same frame Θ containing N mutually exclusive and exhaustive propositions. The Jousselme distance between m1 and m2 is defined as

d_J(m1, m2) = sqrt( (1/2) (‖m1‖² + ‖m2‖² − 2⟨m1,m2⟩) ) (7)

where ‖m1‖² = ⟨m1,m1⟩, ‖m2‖² = ⟨m2,m2⟩, and the scalar product ⟨m1,m2⟩ is given by

⟨m1,m2⟩ = Σ_i Σ_j m1(Ai) m2(Bj) |Ai ∩ Bj| / |Ai ∪ Bj| (8)

where Ai and Bj are elements of the power set 2^Θ, and |Ai ∩ Bj| and |Ai ∪ Bj| denote the cardinalities of the intersection and union of Ai and Bj.
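A short Python sketch of Eqs (7) and (8) follows, using the same dictionary-of-frozensets convention as before (our own representation, assumed for illustration).

```python
from math import sqrt

def jaccard_inner(m1, m2):
    """Scalar product <m1, m2> of Eq (8): masses weighted by |A∩B| / |A∪B|."""
    total = 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            # focal elements are nonempty, so the union is never empty
            total += va * vb * len(a & b) / len(a | b)
    return total

def jousselme_distance(m1, m2):
    """Jousselme distance of Eq (7) between two BBAs ({frozenset: mass})."""
    return sqrt(0.5 * (jaccard_inner(m1, m1) + jaccard_inner(m2, m2)
                       - 2.0 * jaccard_inner(m1, m2)))

# Two totally conflicting categorical BBAs are at the maximal distance
print(jousselme_distance({frozenset("A"): 1.0}, {frozenset("B"): 1.0}))  # -> 1.0
```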

2.3 Probabilistic-based distance

Since the pignistic probabilistic transformation can convert a BBA over focal elements into a probability measure over distinct atomic elements, it provides a probabilistic-based distance for measuring the dissimilarity of two pieces of evidence [41].

Definition 5. Let m be a BBA on a frame Θ. The probabilistic expression of a singleton element B of Θ can be obtained by the pignistic probability function

BetP(B) = Σ_{A⊆Θ, B⊆A} m(A) / |A| (9)

where |A| is the cardinality of the proposition A. If |A| = 1, then B = A and BetP(B) = BetP(A) = m(A).

Definition 6. Let m1 and m2 be two BBAs on the same frame Θ and let BetPm1 and BetPm2 be the results of the pignistic probability transformation of m1 and m2. The probabilistic-based distance is defined as

difBetP(m1, m2) = max_{A⊆Θ} |BetPm1(A) − BetPm2(A)| (10)

and the Minkowski-type distance distP proposed by Liu [36] is defined as (11)
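The pignistic transform and difBetP can be sketched as below, assuming difBetP takes the maximal pignistic-probability gap over all subsets of Θ, as in Liu's definition; the helper names and the brute-force subset enumeration (fine for small frames) are our own choices.

```python
from itertools import chain, combinations

def betp(m):
    """Pignistic transform of Eq (9): spread each focal mass evenly
    over the singletons it contains."""
    p = {}
    for focal, mass in m.items():
        for theta in focal:
            p[theta] = p.get(theta, 0.0) + mass / len(focal)
    return p

def dif_betp(m1, m2):
    """difBetP of Eq (10): maximal pignistic-probability gap over all
    nonempty subsets of the frame (exponential in |Θ|, for small frames only)."""
    p1, p2 = betp(m1), betp(m2)
    theta = sorted(set(p1) | set(p2))
    subsets = chain.from_iterable(combinations(theta, r)
                                  for r in range(1, len(theta) + 1))
    return max(abs(sum(p1.get(t, 0.0) for t in s) -
                   sum(p2.get(t, 0.0) for t in s)) for s in subsets)

m1 = {frozenset("AB"): 0.8, frozenset("C"): 0.2}
m2 = {frozenset("C"): 1.0}
print(dif_betp(m1, m2))   # -> 0.8, the gap reached on the subset {A, B}
```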

2.4 Combinatorial dissimilarity measure

Several compound dissimilarity measures have been presented based on the conflict coefficient, evidence distance, and probabilistic-based distance.

Definition 7. Let m1 and m2 be two BBAs on the same frame Θ. A combinatorial dissimilarity measure based on the conflict coefficient and the probabilistic-based distance is defined as the pair

cf(m1, m2) = ⟨k, difBetP⟩ (12)

m1 and m2 are in conflict iff both k > ε and difBetP > ε, where ε ∈ [0,1] denotes the threshold of conflict tolerance and is identified according to the application [35].

Definition 8. Let m1 and m2 be two BBAs on the same frame Θ. A combinatorial dissimilarity measure kd [42] based on the conflict coefficient and the Jousselme distance is defined as (13)

Definition 9. Let m1 and m2 be two BBAs on the same frame Θ and let BetPm1 and BetPm2 be the results of the pignistic probability transformation of m1 and m2. A combinatorial dissimilarity measure based on the Hamacher T-conorm fusion rules [43] is defined as

DismP(m1, m2) = (distP + ConfP − 2 · distP · ConfP) / (1 − distP · ConfP) (14)

where ConfP denotes the conflict coefficient based on the pignistic probability and is defined by (15)

2.5 Deng entropy

Deng entropy, as a generalization of Shannon entropy, provides a way to measure the information volume of a BBA. Both Deng entropy and Shannon entropy correspond to a degree of uncertainty [38].

Definition 10. Let m be a BBA on the frame Θ. The Deng entropy of m is defined as

E_d(m) = − Σ_i m(Ai) log2( m(Ai) / (2^|Ai| − 1) ) (16)

where Ai is a proposition of m and |Ai| is the cardinality of Ai. Deng entropy becomes identical to Shannon entropy if |Ai| = 1 for every focal element, that is,

E_d(m) = − Σ_i m(Ai) log2 m(Ai) (17)
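A direct Python sketch of Eq (16), again with BBAs as dictionaries of frozensets (our convention):

```python
from math import log2

def deng_entropy(m):
    """Deng entropy of Eq (16) for a BBA given as {frozenset: mass}."""
    return -sum(mass * log2(mass / (2 ** len(focal) - 1))
                for focal, mass in m.items() if mass > 0)

# With singleton focal elements only, Deng entropy reduces to Shannon entropy (Eq 17)
m = {frozenset("A"): 0.5, frozenset("B"): 0.5}
print(deng_entropy(m))   # -> 1.0, the Shannon entropy of a fair coin

# A vacuous mass on a two-element focal set carries more uncertainty
print(deng_entropy({frozenset("AB"): 1.0}))   # log2(3) ≈ 1.585
```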

Limitations of existing dissimilarity measurements between BBAs

Example 1. Let m1, m2 and m3 be three BBAs on the same frame Θ with four propositions, Θ = {A1,A2,A3,A4}. Using Eq (6), the conflict coefficients between the BBAs can be computed. The result shows that the degree of conflict between m1 and m2 is larger than the degree of conflict between m1 and m3, and both are relatively high. In fact, there is intuitively no conflict between m1 and m2, because they are identical. By Eq (10) and Eq (12) we obtain difBetP(m1,m2) = 0 and cf(m1,m2) = ⟨0.75,0⟩, which illustrates that m1 and m2 are consistent and that the coefficient k cannot measure the degree of conflict between the evidence in this situation. Although the combined measurement implies that the D-S combination rule should be used, it cannot tell how much error will be introduced by using the combination rule. Therefore, the measurement is limited in providing an explicit expression and cannot be used directly in the combination rule.

Example 2. Let m1 and m2 be two BBAs on the same frame Θ = {A1,A2,…,A2n} such that they support disjoint sets of propositions. It is obvious that m1 and m2 are totally contrary to each other. The different dissimilarities between m1 and m2 are displayed in Fig 1.

From Fig 1, it is evident that the values of dJ, kd, and difBetP are all 1 when n = 1, which is intuitive. But when n > 1, the values of dJ and difBetP tend to 0 and kd tends to 0.6, indicating that m1 and m2 become closer and less conflicting as n increases, which is counter-intuitive and abnormal. Only DismP remains 1 as n increases, meaning that m1 and m2 are in total disagreement with each other. Therefore, dJ, kd, and difBetP cannot be used to measure the dissimilarity between the BBAs in this example.

Since DismP considers not only the distance but also the conflict between BBAs, the measurement based on the Hamacher T-conorm fusion rules provides a general method for measuring dissimilarity. However, it has a deficiency, as shown in Example 3.

Example 3. Let m1 and m2 be two BBAs on the same frame of discernment Θ = {A1,A2,…,A20}. For notational conciseness, 1, 2, and so forth denote A1, A2, and so forth in the frame. The two pairs of BBAs are shown as

1st Pair:

2nd Pair: where Δ is a subset of Θ. This example considers 20 cases of the subset Δ, which grows by one element per case from Δ = {1} to Δ = {1,2,…,20}. The comparison of the dissimilarity measurements between m1 and m2 for the two pairs is shown in Figs 2 and 3, respectively.

Fig 2. Comparison of dissimilarity measures of the 1st pair evidence.

Fig 3. Comparison of dissimilarity measures of the 2nd pair evidence.

From Figs 2 and 3, it can be seen that distP and DismP are very close and follow the same trend, since DismP is mainly determined by distP and ConfP. As calculated in Eq (15), the maximal pignistic probabilities of m1 and m2 always have an intersection, so ConfP stays small as Δ increases from {1} to {1,2,…,20}. However, the situation is quite distinct for the two pairs of BBAs. In the 1st pair, both BBAs distribute their major belief to the same elements in cases 1 to 6, which keeps ConfP at 0. This is reasonable, as m2 has only one focal element, m2(1,2,3,4,5) = 1, which corresponds to the classical conflict coefficient k. In the 2nd pair, m2 has two equal focal elements, m2(1,2,3,4,5) = 0.5 and m2(6,7,8,9,10) = 0.5. In cases 1 to 5 there should be a notable dissimilarity between m1 and m2, as shown by k in Fig 3. Nevertheless, the pignistic probability transformation divides the belief equally among the single propositions, so ConfP considers the dissimilarity to be 0. Therefore, the dissimilarity measures ConfP and DismP are illogical in this situation. Although the classical conflict coefficient k can depict the dissimilarity in cases 1 to 5, it cannot reflect the varying degree of divergence as the case index increases; neither can difBetP.

Combining belief function with a new weighting factor

4.1 A modified dissimilarity measure

In this section, a modified dissimilarity measure based on the Hamacher T-conorm fusion rules is proposed to describe the dissimilarity between BBAs. A dissimilarity measurement based on the Hamacher T-conorm rules satisfies two important properties: commutativity and monotonicity. Commutativity ensures that the dissimilarity matrix is symmetric and that, whatever the fusion order of two pieces of evidence, their dissimilarity is the same. Monotonicity ensures that the dissimilarity measurement has a single variation trend in a specific interval, which makes it easy to compare the dissimilarity between pieces of evidence.

Definition 11. Let m1 and m2 be two BBAs on the same frame Θ. The modified dissimilarity measure is defined as

MDismP(m1, m2) = (distP + k − 2 · distP · k) / (1 − distP · k) (18)

where k is the classical conflict coefficient of Eq (6).


The modified dissimilarity still satisfies the basic properties of commutativity and monotonicity:

(1) Commutativity: MDismP(m1, m2) = MDismP(m2, m1) (20)

(2) Monotonicity: MDismP(x, y) ≤ MDismP(x′, y′) (21) where 0 ≤ x ≤ x′ ≤ 1 and 0 ≤ y ≤ y′ ≤ 1.

The modified dissimilarity measurement consists of a distance coefficient and a conflict coefficient between two pieces of evidence. As both coefficients lie in [0,1], the modified measurement is larger than either of its components. Evidence with a large distance from, and high conflict with, the majority of the other evidence will have a larger dissimilarity measurement, and vice versa.

Obviously, the modified dissimilarity measure replaces the pignistic conflict coefficient ConfP with the classical conflict coefficient k. ConfP implies that the main conflict results from discordant propositions that are strongly supported by the two BBAs respectively. However, ConfP cannot handle the situation in which one BBA has several propositions with equal belief. In our view, the conflict coefficient should involve all conflicts that exist between BBAs, no matter how small. Furthermore, MDismP not only retains the good features of DismP but also makes up for its shortcomings.
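Assuming the standard Hamacher sum S(x, y) = (x + y − 2xy) / (1 − xy) is the T-conorm intended in Eq (18), the modified measure can be sketched as below. Because the exact form of distP (Eq 11) is not reproduced in this text, the distance coefficient is passed in as a caller-supplied function; this parameterization is our own assumption.

```python
def hamacher_sum(x, y):
    """Standard Hamacher T-conorm S(x, y) = (x + y - 2xy) / (1 - xy).
    The boundary case S(1, 1) = 1 is handled explicitly to avoid 0/0."""
    if x == 1.0 and y == 1.0:
        return 1.0
    return (x + y - 2.0 * x * y) / (1.0 - x * y)

def classical_conflict(m1, m2):
    """Classical conflict coefficient k of Eq (6): mass of empty intersections."""
    return sum(ma * mb
               for a, ma in m1.items()
               for b, mb in m2.items() if not (a & b))

def mdismp(m1, m2, dist):
    """Sketch of Eq (18): a distance coefficient in [0,1] (e.g. distP)
    fused with the classical conflict coefficient k via the Hamacher sum."""
    return hamacher_sum(dist(m1, m2), classical_conflict(m1, m2))
```

Note that S(x, 0) = x and S(0, y) = y, so the fused measure is never smaller than either component, consistent with the discussion above.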

Example 4. Considering two pairs of BBAs from Example 3 with the proposed dissimilarity measure of MDismP, the results are plotted in Figs 4 and 5.

Fig 4. Comparison of dissimilarity of the 1st pair evidence.

Fig 5. Comparison of dissimilarity of the 2nd pair evidence.

For the results of the 1st pair illustrated in Fig 4, distP, DismP and MDismP are identical. The lines show a variation from high dissimilarity at Δ = {1} to a minimum at Δ = {1,2,3,4,5}, increasing again as Δ includes more elements. This is because m2 has only one proposition, and when Δ = {1,2,3,4,5} the propositions with the maximum belief in the two BBAs coincide. In Fig 5, where m2 has two propositions with equal belief value, the results differ. MDismP is larger than distP and DismP as Δ grows from {1} to {1,2,3,4,5}. The dissimilarity value of MDismP is near 0.8 before case 6, which means the two BBAs are incompatible with each other, but the dissimilarity values of distP and DismP are less than 0.6, which seems unreasonable. From the analysis of the above examples, it can be concluded that the modified dissimilarity measure MDismP efficiently reflects the degree of dissimilarity between BBAs.

4.2 Weighting factors

In this section, we propose a novel method to determine the weighting factors among BBAs based on the modified dissimilarity measure and Deng entropy.

The weight determination is based on the principle that if a piece of evidence is supported by a greater number of other pieces of evidence, it should be more important and have a larger effect on the final combination result. Moreover, if a piece of evidence carries considerable information, it should also be weighted more [37].

Suppose N pieces of evidence {m1,m2,…,mN} are defined on the same frame of discernment Θ, and the weight of each is made up of its degree of similarity and its information volume. The similarity degree Sim_{i,j} of mi and mj is defined as

Sim_{i,j} = 1 − MDismP(mi, mj) (22)

and the mutual similarity degree matrix S_{N×N} is defined as the N × N matrix with entries Sim_{i,j} (23). S_{N×N} is symmetric, and Sim_{i,j} = Sim_{j,i} means that the similarity between two pieces of evidence fulfills commutativity; the diagonal elements equal 1, meaning that a piece of evidence is totally similar to itself. The similarity degree matrix gives insight into the agreement between pieces of evidence, and the weighting factor of dissimilarity can be obtained from it.

The weighting vector Wdis of the evidence is the eigenvector associated with the maximal positive eigenvalue λmax, that is, λmax Wdis = S_{N×N} Wdis. The evidence with the largest weight is deemed the most important, and the weight of each piece of evidence is revised accordingly (24). The information volume of each piece of evidence is measured by Deng entropy and can be calculated by Eq (16). After the Deng entropy of each piece of evidence is computed, the weights of information volume are obtained by normalization (25), and the weight of each piece of evidence under the proposed method is then defined by combining the two (26)
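Since the exact normalizations of Eqs (24)–(26) are not reproduced above, the sketch below assumes a common choice: the principal eigenvector (found here by power iteration) is scaled so the most supported evidence gets weight 1, the entropy weights are the entropies normalized to sum to 1, and the final weight is the normalized product of the two. All three normalizations are assumptions for illustration.

```python
def principal_eigvec(S, iters=200):
    """Power iteration for the eigenvector of the largest eigenvalue of a
    symmetric nonnegative matrix S (list of lists), scaled so max entry = 1."""
    n = len(S)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(S[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(w)
        v = [x / norm for x in w]
    return v

def credibility_weights(sims, entropies):
    """Sketch of Eqs (24)-(26) under the stated assumptions: eigenvector
    weights times normalized Deng-entropy weights, renormalized to sum 1."""
    w_dis = principal_eigvec(sims)                 # Eq (24), assumed scaling
    total_e = sum(entropies)
    w_iv = [e / total_e for e in entropies]        # Eq (25), assumed normalization
    prod = [a * b for a, b in zip(w_dis, w_iv)]
    s = sum(prod)
    return [p / s for p in prod]                   # Eq (26), assumed combination

# Three pieces of evidence: the first two agree closely, the third is an outlier
S = [[1.0, 0.9, 0.2],
     [0.9, 1.0, 0.2],
     [0.2, 0.2, 1.0]]
w = credibility_weights(S, [1.0, 1.0, 1.0])
print([round(x, 3) for x in w])   # the outlier receives the smallest weight
```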

The final weighting factor combines the weighting factor of the dissimilarity measurement with the weighting factor of the evidence information. The two factors describe the final weight from two aspects: the dissimilarity factor depicts the mutual degree of deviation of a piece of evidence from the others, while the information factor reflects its relative amount of information compared to the others. The final weighting factor is therefore more thorough than weighting (discounting) factors that depend on the dissimilarity measurement alone. With the weighting factor of information volume, the new weighting factor retains the capability to avoid fusion errors caused by the failure of a single information source.

4.3 Combination of evidences

The weighted average [29] and discounting [36] methods are two kinds of approaches that have been proposed for combining evidence. The discounting method distributes the remaining mass of the discounted evidence to the universal set Θ, which introduces more uncertainty. Since discounting methods combine the evidence one piece at a time, the calculation is time-consuming and converges slowly when a large number of pieces of evidence must be combined. With the weighted average method, pieces of evidence reinforce each other when they are concordant and weaken each other when they are in conflict, and the beliefs of the propositions after combination remain distinct from each other. In addition, the weighted average method is computationally simple and more reliable and rational. Hence, the weighted average method is applied in this article.

The evidence generated by the weighted average method is

EWA(m) = Σ_{i=1}^{N} w_i m_i (27)

For N original pieces of evidence, the new evidence EWA(m) is combined with itself N − 1 times. In this section, an example of target recognition [36] is presented to show the behavior of existing methods as well as the proposed method.
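The averaging-then-combining procedure can be sketched as follows; the weights would come from the new weighting factor, but here they are passed in directly, and the BBA representation as dictionaries of frozensets is our own convention.

```python
from itertools import product

def weighted_average_bba(bbas, weights):
    """Eq (27): EWA(m) = sum_i w_i * m_i over a common frame."""
    ewa = {}
    for m, w in zip(bbas, weights):
        for focal, mass in m.items():
            ewa[focal] = ewa.get(focal, 0.0) + w * mass
    return ewa

def dempster(m1, m2):
    """Classical D-S rule of Eq (5), used unchanged on the averaged evidence."""
    out, k = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + ma * mb
        else:
            k += ma * mb
    return {f: v / (1.0 - k) for f, v in out.items()}

def combine_weighted(bbas, weights):
    """Average the N pieces of evidence, then self-combine N - 1 times."""
    ewa = weighted_average_bba(bbas, weights)
    result = ewa
    for _ in range(len(bbas) - 1):
        result = dempster(result, ewa)
    return result

# Hypothetical illustration: two concordant BBAs outvote one dissenter
A, B = frozenset("A"), frozenset("B")
bbas = [{A: 0.9, B: 0.1}, {A: 0.8, B: 0.2}, {B: 1.0}]
res = combine_weighted(bbas, [0.4, 0.4, 0.2])
print(res[A] > res[B])   # A retains the dominant belief despite the dissenting m3
```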

Example 5. Let m1, m2, m3, m4 and m5 be five BBAs on the same frame of discernment Θ = {A1,A2,A3}, as shown in Table 1. The combination results obtained with the proposed methods are shown in Table 2, and the convergence is shown in Table 3, where each entry denotes the fusion of the evidence m1, m2, …, mi.

As can be seen from Table 2, the D-S combination rule (without the discounting or weighted average process) concludes that proposition A2 should be regarded as the target. This result is unreasonable and counter-intuitive, since the majority of the evidence assigns its major belief to proposition A1 and only one piece of evidence, m3, gives its major belief to A2. This unexpected behavior is resolved by using the discounting and weighted average methods to lessen the influence of the evidence that is dissimilar to the others: the larger the dissimilarity of a piece of evidence, the larger the discount it receives. Moreover, using the proposed dissimilarity measure to determine the weight of each piece of evidence generates a more specific and reliable result than the discounting factors based on dJ and DismP. In addition, the final belief m(A1) of the proposed method is larger than that obtained with dJ and DismP when m3 assigns its major belief to A2. The reason is that the proposed weighting factor weakens m3 twice, through the weighting factor of dissimilarity and through Deng entropy, whereas for dJ and DismP the discount factor weakens m3 only once, which merely overcomes the defect raised by the classical combination rule. Besides, m3 carries less information than the other evidence, so the Deng-entropy weighting factor further weakens m3 on top of the dissimilarity weighting factor. The larger m(A1) of the proposed method reflects its great capability of anti-interference. Furthermore, the proposed method converges better than the other methods: as can be seen from Table 3, the series of results of the proposed method reaches the belief level of the other methods before the fusion process is complete. For example, after three fusions the proposed method already achieves 0.8, close to the final result of DismP, 0.8254, indicating that the proposed method converges faster and is less computationally intensive. Finally, the great difference in m3, where almost all belief is given to A2, may arise because the source of m3 is especially sensitive to the typical characteristics of class A2. The results in Table 2 show that A2 has the second largest belief, much larger than the other propositions except A1; therefore A2 could also be seen as a potential identification result.

This section has provided a modified dissimilarity measure and an evidence information volume measure to determine the weighting factors of the evidence involved in the fusion process. The modified dissimilarity measure includes both the distances and all conflicts between pieces of evidence, and the information volume of each piece of evidence is measured by Deng entropy. Compared with other methods, the proposed method gives a more specific aggregated result and converges faster. In addition, it is an efficient method for some decision-making applications.

Identification of major land use activities contributing to river pollution: An application

In this section, the proposed method is applied to a case study in identifying the major land use activities contributing to river pollution and the required data are collected by water quality monitors.

The Manahara River lies in the Kathmandu Valley, Nepal, with a watershed of 256 km2. The river originates from Manichud Lekh (ridge) at an elevation of 2352 m and has a length of 30 km. It rises in a pristine forested region and flows through forest, rural, semi-urban, and urban areas. The water quality was monitored monthly by sensors at seven different sites (Sites 1–7) along the river. The data for Sites 1 to 7, respectively Salinadi, Sankhu, Brahmakhel, Bode, Sinamangal, Imadol, and Chyasal, are obtained from [44]. The land use and anthropogenic activities in the transitions between sites are given in Table 4.

Table 4. Land use and river activities in site transition.

The water quality of the river during the low-flow months of 2006 is given in Table 5. Water quality is good in the upstream regions and poor in the downstream regions, varying gradually from Site 1 to Site 7. Each monitor records the amount of each chemical element in the river as the water flows past its site, so the actual variation of a chemical element at each site should consider only the difference from the previous site. The chemical criteria in Table 5 can be divided into benefit criteria and cost criteria.

Table 5. Water quality of Manahara river in different sites.

The change in water quality from one site to the next is normalized by Eqs (28) and (29). For a benefit criterion, (28), and for a cost criterion, (29), where Xmax is the maximum value and Xmin is the minimum value of the criterion.

For example, the amounts of DO (mg/L) in February recorded at Sites 1, 6 and 7 are 9.2, 2.2 and 0, respectively. DO (mg/L) is a benefit criterion, and the normalization of the Site 6–7 transition is shown in Eq (30). All criteria except DO are cost criteria. The normalized chemical criteria for February are shown in Table 6.
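The full forms of Eqs (28)–(30) are not reproduced above. One consistent reading, assumed here purely for illustration, is that the site-to-site change is scaled by the criterion's range, with a drop indicating pollution for a benefit criterion (such as DO) and a rise indicating pollution for a cost criterion; this reading matches the DO figures quoted for Sites 6 and 7.

```python
def normalize_change(prev, curr, x_max, x_min, benefit=True):
    """Hypothetical reading of Eqs (28)-(29): normalize the site-to-site
    change by the criterion's range [x_min, x_max]. Changes in the
    non-polluting direction are clamped to 0 (our assumption)."""
    rng = x_max - x_min
    delta = (prev - curr) if benefit else (curr - prev)
    return max(delta, 0.0) / rng

# DO in February: 2.2 mg/L at Site 6 falls to 0 at Site 7; range is 0 to 9.2
print(round(normalize_change(2.2, 0.0, 9.2, 0.0, benefit=True), 4))   # -> 0.2391
```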

Table 6. Normalized chemical criteria in different sites of February.

In Table 6, each chemical criterion is treated as a piece of evidence. Therefore, ten pieces of evidence need to be fused to identify which site causes the major pollution. Applying the proposed method, the results of the river water quality monitoring and identification for February are given in Table 7.

Table 7. Fusion results of February by using the proposed method.

Table 7 shows that A6, urban settlement (dense), is the major cause of the river pollution. All the fusion results are listed in Table 8.

Table 8. Results of the application of the proposed method.

Table 8 shows that the major source of water pollution in all months is related to A6, i.e., urban settlement (dense). This result is reasonable, as the area adjacent to Sites 6 and 7, which have the worst water quality, is A6, which discharges untreated sewage directly into the river at those sites [45, 46]. These sites show very high BOD, nutrients (NO3-N and PO4-P), NH3-N, EC, and TDS with very low dissolved oxygen, and untreated sewage is characterized by exactly this profile: high BOD, nutrients, NH3-N, EC, and TDS with very low or almost zero dissolved oxygen [47]. The combination (A2, A3, & A6) in the Site 4–5 transition and the combination (A2, A3, A5, & A6) in the Site 5–6 transition also include urban settlement (A6), but there it contributes insignificantly because the urban settlement covers only a small land area, is far from the river, and sewage outfall is absent at Site 4 and minor at Site 5.

The combination (A2, A3, A5, & A6) also includes an industry, which contributes insignificantly because there is only one beverage industry, its wastewater is diluted by the river flow, and/or the sampling time may have differed from the discharge of concentrated industrial wastewater. In addition, the organic load contributed by agriculture is much lower than that of a sewage outfall [48], so the contribution of agriculture (A2) results in insignificant pollution. Furthermore, the organic load contributed by forest (A1) is low [49, 50], due to which the combination (A2 & A3) makes an insignificant contribution too.

In the four months from February to May, the results are similar. This is because the river has low flow in these months, with February being winter and March to May being the pre-monsoon season [47]. Low flow in a river results in a high concentration of pollutants due to the lack of dilution. Therefore, the results of applying the proposed method to identify water-polluting land use activities match the actual situation.


Solving the problem of fusing conflicting information is still under continuous discussion. In this paper, an evidence combination approach with a new weighting factor has been proposed, based on a modified dissimilarity measure between BBAs and Deng entropy. The new weighting factor contains two parts: a weight of similarity and a weight of entropy. The weight of similarity describes the degree of mutual support between BBAs; the larger a BBA's dissimilarity with the others, the smaller its weight of similarity. The weight of entropy describes the information volume of each BBA: if a BBA has a larger entropy value, meaning it contains more information, its weight of entropy is larger.

After analyzing the features and limitations of existing dissimilarity measures, a modified dissimilarity measure has been developed based on the Hamacher T-conorm fusion rules, mixing a probabilistic distance with the total conflict between BBAs. The weight of similarity is then determined from this dissimilarity measure. The information volume of a BBA is obtained by Deng entropy and is used to determine the weight of entropy.
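A minimal sketch of these ingredients is given below, with BBAs represented as dictionaries mapping focal elements (frozensets) to masses. The Deng entropy and classical conflict coefficient follow their standard definitions; the distance component here is a simple pignistic (betting-commitment) distance standing in for the paper's probabilistic distance, and the Hamacher T-conorm is taken with parameter gamma = 0, so these two choices are assumptions for illustration.

```python
from math import log2

def deng_entropy(m):
    """Deng entropy: Ed(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(v * log2(v / (2 ** len(A) - 1)) for A, v in m.items() if v > 0)

def conflict(m1, m2):
    """Classical conflict coefficient k: total mass on disjoint focal pairs."""
    return sum(v1 * v2 for A, v1 in m1.items()
               for B, v2 in m2.items() if not (A & B))

def betting_distance(m1, m2, frame):
    """Simple probabilistic distance between pignistic transforms
    (an illustrative stand-in for the paper's distance component)."""
    def betp(m, x):
        return sum(v / len(A) for A, v in m.items() if x in A)
    return 0.5 * sum(abs(betp(m1, x) - betp(m2, x)) for x in frame)

def hamacher_conorm(x, y):
    """Hamacher T-conorm with gamma = 0: S(x, y) = (x + y - 2xy) / (1 - xy)."""
    return (x + y - 2 * x * y) / (1 - x * y) if x * y < 1 else 1.0

def dissimilarity(m1, m2, frame):
    """Mix the distance and conflict components via the Hamacher T-conorm."""
    return hamacher_conorm(betting_distance(m1, m2, frame), conflict(m1, m2))
```

Two totally conflicting BBAs, e.g. m1({a}) = 1 and m2({b}) = 1, yield a dissimilarity of 1, so the corresponding similarity weight vanishes; a vacuous BBA over a two-element frame attains the maximal Deng entropy log2(3).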

The new weighting factor based on the modified dissimilarity and Deng entropy is presented and applied in the weighted average combining of belief functions. Analyses of several numerical examples show that the proposed method not only yields more reasonable and specific results but also has a faster convergence rate. A real application, determining the major land use activities contributing to river pollution, is implemented with the proposed method. The results show that urban settlement (dense) is the major source of water pollution. This new approach can be of great interest to decision makers in devising strategies for water pollution control and environmental management.
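The weighted average combining step can be sketched as follows: average the n BBAs using the weighting factors, then combine the averaged BBA with itself n-1 times by Dempster's rule (the Murphy-style scheme that the proposed method builds on). This is a minimal sketch under that assumption, not the paper's full algorithm.

```python
from collections import defaultdict

def dempster(m1, m2):
    """Dempster's rule of combination for two BBAs (dict: frozenset -> mass)."""
    joint = defaultdict(float)
    k = 0.0  # conflict mass assigned to the empty set
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            if C:
                joint[C] += v1 * v2
            else:
                k += v1 * v2
    if k >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1 - k) for A, v in joint.items()}  # normalize by 1 - k

def weighted_average_combine(bbas, weights):
    """Weighted-average fusion: form the weighted mean BBA, then apply
    Dempster's rule n-1 times (Murphy-style combination)."""
    total = sum(weights)
    avg = defaultdict(float)
    for m, w in zip(bbas, weights):
        for A, v in m.items():
            avg[A] += (w / total) * v
    avg = dict(avg)
    result = avg
    for _ in range(len(bbas) - 1):
        result = dempster(result, avg)
    return result
```

Because the averaging step dilutes any single conflicting BBA before Dempster's rule is applied, a highly conflicting source with a small weighting factor has little influence on the fused result.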


This work was partially supported by the National Natural Science Foundation of China (Grant No. 61401363), the Natural Sciences and Engineering Research Council of Canada (NSERC) Collaborative Research and Development Grants (Grant No. CRDPJ 446638–12), the Science and Technology on Avionics Integration Laboratory and Aeronautical Science Foundation (Grant No. 20155153034), and the Fundamental Research Funds for the Central Universities (Grant Nos. 3102016AXXX005 and 3102015BJIIJGZ009).

Author Contributions

  1. Conceptualization: QP.
  2. Data curation: GC-S.
  3. Formal analysis: QP GC-S.
  4. Funding acquisition: KZ KH.
  5. Investigation: QP GC-S.
  6. Methodology: QP.
  7. Project administration: DZ RS.
  8. Resources: DZ RS.
  9. Software: QP XL.
  10. Supervision: RS.
  11. Validation: DZ RS.
  12. Visualization: QP GC-S.
  13. Writing – original draft: QP.
  14. Writing – review & editing: GC-S RS.


  1. Jiang W, Yang Y, Luo Y, Qin X. Determining basic probability assignment based on the improved similarity measures of generalized fuzzy numbers. International Journal of Computers Communications & Control. 2015;10(3):333–47.
  2. Smets P, Kennes R. The transferable belief model. Artificial Intelligence. 1994;66(2):191–234.
  3. Rao SS, Annamdas K. A comparative study of evidence theories in the modeling, analysis, and design of engineering systems. Journal of Mechanical Design. 2013;135(6):061006.
  4. Fu C, Chin K-S. Robust evidential reasoning approach with unknown attribute weights. Knowledge-Based Systems. 2014;59:9–20.
  5. Wang Y, Dai Y, Chen Y-w, Meng F. The evidential reasoning approach to medical diagnosis using intuitionistic fuzzy Dempster-Shafer theory. International Journal of Computational Intelligence Systems. 2015;8(1):75–94.
  6. Deng Y, Mahadevan S, Zhou D. Vulnerability assessment of physical protection systems: a bio-inspired approach. International Journal of Unconventional Computing. 2015;11.
  7. Garcia ML. Vulnerability assessment of physical protection systems: Butterworth-Heinemann; 2005.
  8. Liu H-C, Liu L, Lin Q-L. Fuzzy failure mode and effects analysis using fuzzy evidential reasoning and belief rule-based methodology. IEEE Transactions on Reliability. 2013;62(1):23–36.
  9. Su X, Mahadevan S, Xu P, Deng Y. Dependence assessment in human reliability analysis using evidence theory and AHP. Risk Analysis. 2015;35(7):1296–316. pmid:25847228
  10. Kang B, Deng Y, Sadiq R, Mahadevan S. Evidential cognitive maps. Knowledge-Based Systems. 2012;35:77–86.
  11. Tianlu C, Peiwen Q. Target recognition based on modified combination rule. Journal of Systems Engineering and Electronics. 2006;17(2):279–83.
  12. Deng Y, Su X, Wang D, Li Q. Target recognition based on fuzzy Dempster data fusion method. Defence Science Journal. 2010;60(5).
  13. McKleroy VS, Galbraith JS, Cummings B, Jones P. Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Education and Prevention. 2006;18:59. pmid:16987089
  14. Yang J-B, Xu D-L. On the evidential reasoning algorithm for multiple attribute decision analysis under uncertainty. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans. 2002;32(3):289–304.
  15. Yang J-B, Xu D-L. Evidential reasoning rule for evidence combination. Artificial Intelligence. 2013;205:1–29.
  16. Zhang M-J, Wang Y-M, Li L-H, Chen S-Q. A general evidential reasoning algorithm for multi-attribute decision analysis under interval uncertainty. European Journal of Operational Research. 2017;257(3):1005–15.
  17. Zadeh LA. A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination. AI Magazine. 1986;7(2):85.
  18. Yager RR. On the Dempster-Shafer framework and new combination rules. Information Sciences. 1987;41(2):93–137.
  19. Smets P. Belief functions: the disjunctive rule of combination and the generalized Bayesian theorem. Classic Works of the Dempster-Shafer Theory of Belief Functions: Springer; 2008. p. 633–64.
  20. Smets P. The combination of evidence in the transferable belief model. IEEE Transactions on Pattern Analysis and Machine Intelligence. 1990;12(5):447–58.
  21. Dubois D, Prade H. Representation and combination of uncertainty with belief functions and possibility measures. Computational Intelligence. 1988;4(3):244–64.
  22. Yong D, WenKang S, ZhenFu Z, Qi L. Combining belief functions based on distance of evidence. Decision Support Systems. 2004;38(3):489–93.
  23. Liu Z-g, Pan Q, Cheng Y-m, Dezert J. Sequential adaptive combination of unreliable sources of evidence. Advances and Applications of DSmT for Information Fusion, Vol IV: Collected Works. 2015:23.
  24. Martin A, Jousselme A-L, Osswald C, editors. Conflict measure for the discounting operation on belief functions. Information Fusion, 2008 11th International Conference on; 2008: IEEE.
  25. Deng Y. Generalized evidence theory. Applied Intelligence. 2015;43(3):530–43.
  26. Smarandache F, Dezert J. Advances and Applications of DSmT for Information Fusion (Collected Works), second volume: Infinite Study; 2006.
  27. Dezert J, Smarandache F. DSmT: a new paradigm shift for information fusion. arXiv preprint cs/0610175. 2006.
  28. Dempster AP, Chiu WF. Dempster–Shafer models for object recognition and classification. International Journal of Intelligent Systems. 2006;21(3):283–97.
  29. Murphy CK. Combining belief functions when evidence conflicts. Decision Support Systems. 2000;29(1):1–9.
  30. Han D, Deng Y, Han C, Yang Y. Some notes on betting commitment distance in evidence theory. Science China Information Sciences. 2012;55(3):558–65.
  31. He Y, Hu L, Guan X, Deng Y, Han D. New method for measuring the degree of conflict among general basic probability assignments. Science China Information Sciences. 2012;55(2):312–21.
  32. Jousselme A-L, Grenier D, Bossé É. A new distance between two bodies of evidence. Information Fusion. 2001;2(2):91–101.
  33. Deng X, Hu Y, Deng Y, Mahadevan S. Environmental impact assessment based on D numbers. Expert Systems with Applications. 2014;41(2):635–43.
  34. Ristic B, Smets P. The TBM global distance measure for the association of uncertain combat ID declarations. Information Fusion. 2006;7(3):276–84.
  35. Liu W. Analyzing the degree of conflict among belief functions. Artificial Intelligence. 2006;170(11):909–24.
  36. Liu Z-g, Dezert J, Pan Q, Mercier G. Combination of sources of evidence with different discounting factors based on a new dissimilarity measure. Decision Support Systems. 2011;52(1):133–41.
  37. Jiang W, Wei B, Xie C, Zhou D. An evidential sensor fusion method in fault diagnosis. Advances in Mechanical Engineering. 2016;8(3):1687814016641820.
  38. Deng Y. Deng entropy: a generalized Shannon entropy to measure uncertainty. 2015.
  39. Shafer G. A mathematical theory of evidence: Princeton University Press; 1976.
  40. Dempster AP. Upper and lower probabilities induced by a multivalued mapping. The Annals of Mathematical Statistics. 1967:325–39.
  41. Smets P. Decision making in the TBM: the necessity of the pignistic transformation. International Journal of Approximate Reasoning. 2005;38(2):133–47.
  42. Jiang W, Peng J-y, Deng Y. New representation method of evidential conflict. Systems Engineering and Electronics. 2010;32(3):562–5.
  43. Hamacher H. Uber logische Aggregationen nicht-binar explizierter Entscheidungskriterien: Ein axiomatischer Beitrag zur normativen Entscheidungstheorie: Rita G Fischer Verlag; 1978.
  44. Chhipi-Shrestha GK. River water quality assessment of the Manahara river by using macroinvertebrates as biological indicators, Kathmandu Valley. MSc thesis, Central Department of Environmental Science, Tribhuvan University. 2007.
  45. Chhipi-Shrestha GK. Ecological status of the Manahara river and community initiatives in wastewater management for preservation of the river. Proceedings of the International Symposium on Community-led Management of River Environment, August 2007:163–78.
  46. Chhipi-Shrestha GKaP A.M.S. Water quality monitoring of Manahara river: an application of chemical water quality indices. Journal of SAN Stratigraphical Association of Nepal, Kathmandu. 2007;6:55–62.
  47. Kumar AY, Reddy MV. Assessment of seasonal effects of municipal sewage pollution on the water quality of an urban canal—a case study of the Buckingham canal at Kalpakkam (India): NO3, PO4, SO4, BOD, COD and DO. Environmental Monitoring and Assessment. 2009;157(1–4):223–34. pmid:18949568
  48. Zia H, Harris NR, Merrett GV, Rivers M, Coles N. The impact of agricultural activities on water quality: a case for collaborative catchment-scale management using integrated wireless sensor networks. Computers and Electronics in Agriculture. 2013;96:126–38.
  49. Ernst C, Gullick R, Nixon K. Conserving forests to protect water. Am Water W Assoc. 2004;30:1–7.
  50. Cunha DGF, Sabogal-Paz LP, Dodds WK. Land use influence on raw surface water quality and treatment costs for drinking supply in São Paulo State (Brazil). Ecological Engineering. 2016;94:516–24.