A modified belief entropy in Dempster-Shafer framework

How to quantify uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, the existing studies mainly focus on the mass function itself, while the information represented by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are less efficient than they could be. In this paper, a modified belief entropy is proposed by considering the scale of the FOD and the relative scale of a focal element with respect to the FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. Moreover, with less information loss, the new measure overcomes the shortcomings of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.

Uncertainty often comes from several types of uncertain and incomplete information, including ignorance, vagueness and so on [39]. Uncertainty and ignorance are difficult categories to deal with [40]. A mass function m is defined as a mapping from the power set $2^\Omega$ to the interval [0, 1], which satisfies the following conditions [1,2]:

$m(\emptyset) = 0, \quad \sum_{A \subseteq \Omega} m(A) = 1.$

If m(A) > 0, then A is called a focal element; the mass value m(A) represents how strongly the evidence supports the proposition A.
A body of evidence (BOE), also known as a basic probability assignment (BPA) or basic belief assignment (BBA), is represented by the focal sets and their associated mass values:

$(\Re, m), \quad \Re = \{A \subseteq \Omega : m(A) > 0\},$

where $\Re$ is a subset of the power set $2^\Omega$; each $A \in \Re$ has an associated nonzero mass value m(A).
A BPA m can also be represented by its associated belief function Bel and plausibility function Pl, defined respectively as follows:

$Bel(A) = \sum_{B \subseteq A} m(B), \quad Pl(A) = \sum_{B \cap A \neq \emptyset} m(B).$

In Dempster-Shafer evidence theory, two independent mass functions, denoted as $m_1$ and $m_2$, can be combined with Dempster's rule of combination, defined as [1,2]:

$m(A) = \frac{1}{1-k} \sum_{B \cap C = A} m_1(B) m_2(C), \quad A \neq \emptyset,$

where k is a normalization constant representing the degree of conflict between $m_1$ and $m_2$, defined as [1,2]:

$k = \sum_{B \cap C = \emptyset} m_1(B) m_2(C).$
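As a concrete sketch of the definitions above (illustrative names, not code from the paper), a mass function can be stored as a Python dict mapping frozensets (focal elements) to mass values:

```python
def bel(m, A):
    """Belief function: total mass committed to subsets of A."""
    return sum(v for B, v in m.items() if B <= A)

def pl(m, A):
    """Plausibility function: total mass of focal elements intersecting A."""
    return sum(v for B, v in m.items() if B & A)

def combine(m1, m2):
    """Dempster's rule: conjunctive combination normalized by 1 - k."""
    raw, k = {}, 0.0
    for B, v1 in m1.items():
        for C, v2 in m2.items():
            inter = B & C
            if inter:
                raw[inter] = raw.get(inter, 0.0) + v1 * v2
            else:
                k += v1 * v2  # k: degree of conflict between m1 and m2
    if k >= 1.0:
        raise ValueError("totally conflicting evidence: combination undefined")
    return {A: v / (1.0 - k) for A, v in raw.items()}
```

For instance, combining $m_1$ = {{a}: 0.6, {a, b}: 0.4} with $m_2$ = {{a}: 0.5, {b}: 0.5} gives k = 0.3 and a combined BPA of m({a}) = 0.5/0.7 ≈ 0.714, m({b}) = 0.2/0.7 ≈ 0.286.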

Shannon entropy
As an uncertainty measure of the information volume in a system or process, Shannon entropy plays a central role in information theory. Shannon entropy indicates that the information volume of each piece of information is directly connected to its degree of uncertainty. Shannon entropy, as the information entropy, is defined as follows [45]:

$H = -\sum_{i=1}^{N} p_i \log_b p_i,$

where N is the number of basic states, $p_i$ is the probability of state i, and $\sum_{i=1}^{N} p_i = 1$. If the unit of information is the bit, then b = 2.
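A minimal Python sketch of this definition (the function name is illustrative):

```python
import math

def shannon_entropy(p, b=2):
    """H = -sum_i p_i * log_b(p_i); zero-probability states contribute nothing."""
    return -sum(p_i * math.log(p_i, b) for p_i in p if p_i > 0)
```

A fair coin yields 1 bit, a uniform distribution over four states yields 2 bits, and a certain event yields 0.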

Deng entropy
Deng entropy is a generalization of Shannon entropy in the Dempster-Shafer framework [54]. If the information is modelled in the framework of probability theory, Deng entropy degenerates to Shannon entropy. Deng entropy, denoted as $E_d$, is defined as follows [54]:

$E_d(m) = -\sum_{A \subseteq X} m(A) \log_2 \frac{m(A)}{2^{|A|} - 1},$

where |A| denotes the cardinality of the proposition A and X is the FOD. If and only if all mass values are assigned to single elements, Deng entropy degenerates to Shannon entropy; in this case, the form of Deng entropy is as follows:

$E_d(m) = -\sum_{\theta \in X} m(\theta) \log_2 m(\theta).$

For more details about Deng entropy, please refer to [54].
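Deng entropy can be sketched in Python with the same dict-based BPA representation (frozenset focal elements mapped to mass values; an illustrative representation, not from the paper):

```python
import math

def deng_entropy(m):
    """E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1)) for A, v in m.items() if v > 0)
```

For a Bayesian BPA (singleton focal elements only), $2^{|A|} - 1 = 1$, so the expression reduces to Shannon entropy; for the vacuous BPA m(X) = 1 on a 5-element FOD it evaluates to $\log_2 31 \approx 4.9542$.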

Uncertainty measures in Dempster-Shafer framework
Assume that X is the FOD, A and B are focal elements of the mass function, and |A| denotes the cardinality of A. Then, the definitions of some typical uncertainty measures in Dempster-Shafer framework are briefly introduced as follows.

Hohle's confusion measure.
Hohle's confusion measure, denoted as $C_H$, is defined as follows [49]:

$C_H(m) = -\sum_{A \subseteq X} m(A) \log_2 Bel(A).$

Recall Eq (12), Dubois & Prade's weighted Hartley entropy; the uncertainty measures of $m_1$ and $m_2$ are shown as follows: The results calculated by Deng entropy and the weighted Hartley entropy are counterintuitive. The two BOEs have the same mass value assignment, but the FOD of the first BOE consists of four targets denoted as a, b, c and d, while the second BOE has only three possible targets denoted as a, b and c. Intuitively, the second BOE is expected to have less uncertainty than the first one. In other words, the uncertain degree of $m_1$ should be bigger than that of $m_2$. Both Deng entropy and the weighted Hartley entropy fail to quantify the difference in uncertain degree between these two BOEs. To address this issue, a modified belief entropy based on Deng entropy is proposed.

The new belief entropy
In the framework of Dempster-Shafer evidence theory, the new belief entropy based on Deng entropy is defined as follows:

$E_{Md}(m) = -\sum_{A \subseteq X} m(A) \log_2 \left( \frac{m(A)}{2^{|A|} - 1} \, e^{\frac{|A|-1}{|X|}} \right),$

where |A| denotes the cardinality of the focal element A and |X| denotes the cardinality of X, i.e., the number of elements in the FOD. Compared with Deng entropy, the new belief entropy addresses more information in the BOE, including the scale of the FOD, denoted as |X|, and the relative scale of a focal element with respect to the FOD, denoted as (|A| − 1)/|X|.
The exponential factor $e^{\frac{|A|-1}{|X|}}$ in the new belief entropy represents the uncertain information in a BOE that has been ignored by Deng entropy and some other uncertainty measures, such as the confusion measure, the dissonance measure, the weighted Hartley entropy, the discord measure and the strife measure. More importantly, by involving the scale of the FOD in the proposed belief entropy, the new uncertainty measure can effectively quantify the difference among BOEs even if the same mass values are assigned on different FODs. In addition, the new exponential factor does not affect the merits of Deng entropy, which will be discussed in detail in the ensuing part of this paper.
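Under the same dict-based BPA representation used above (an illustrative sketch, not code from the paper), the modified belief entropy can be written as:

```python
import math

def modified_belief_entropy(m, fod_size):
    """E_Md(m) = -sum_A m(A) * log2( m(A)/(2^|A|-1) * e^((|A|-1)/|X|) )."""
    return -sum(
        v * math.log2(v / (2 ** len(A) - 1) * math.exp((len(A) - 1) / fod_size))
        for A, v in m.items() if v > 0
    )
```

Since (|A| − 1)/|X| ≥ 0, the extra factor can only lower the entropy relative to Deng entropy, with equality exactly when every focal element is a singleton.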
With the new belief entropy, recall Example 3.1; the new belief entropy for these two BOEs is calculated as follows: The comparison results of the different uncertainty measures for Example 3.1 are shown in Table 1. It can be concluded that neither Dubois & Prade's weighted Hartley entropy nor Deng entropy can measure the difference in uncertain degree between these two BOEs, while the new belief entropy can effectively measure the different uncertain degrees by taking more of the available information in the BOE into consideration. In addition, according to Table 1, the first BOE $m_1$ has a higher uncertain degree with the new belief entropy; this is reasonable because the FOD of $m_1$ consists of four candidate targets, which means a larger information volume than the second BOE $m_2$. This capability is not available with the weighted Hartley entropy or Deng entropy.

Property of the new belief entropy
Some properties of the new belief entropy are presented in this section, including the range of the new measure and its compatibility with Shannon entropy. Property 1. Mathematically, the value range of the new belief entropy is (0, +∞).
Proof. According to Dempster-Shafer evidence theory, a focal element A consists of at least one element and the upper limit of its element number is the scale of the FOD, while a FOD Ω (the X in Eq (20)) consists of at least one element and has no upper limit; thus the ranges of |A| and |X| are the same, denoted as [1, +∞). The range of a mass function m(A) is (0, 1]. Recall Eq (20). Eq (23) is consistent with Eqs (9) and (7) when the mass function is Bayesian, because a mass function m(A) can degenerate to a Bayesian probability $p_i$ in the sense of probability consistency.
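The probability-consistency claim can be checked numerically; the sketch below (illustrative names, dict-based BPAs as above) verifies that for a Bayesian mass function the new entropy coincides with Shannon entropy:

```python
import math

def modified_belief_entropy(m, fod_size):
    """E_Md(m) = -sum_A m(A) * log2( m(A)/(2^|A|-1) * e^((|A|-1)/|X|) )."""
    return -sum(
        v * math.log2(v / (2 ** len(A) - 1) * math.exp((len(A) - 1) / fod_size))
        for A, v in m.items() if v > 0
    )

# Bayesian BPA: all mass on singletons, so 2^|A| - 1 = 1 and e^((|A|-1)/|X|) = 1
m = {frozenset({'a'}): 0.2, frozenset({'b'}): 0.3, frozenset({'c'}): 0.5}
shannon = -sum(p * math.log2(p) for p in (0.2, 0.3, 0.5))
assert abs(modified_belief_entropy(m, 3) - shannon) < 1e-12
```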

Numerical example and discussion

Compatibility with Shannon entropy
The uncertain degree of a certain event is obviously zero, so the values of Shannon entropy, Deng entropy and the new belief entropy are all zero in this case.

Superiority of the new belief entropy
In this section, the numerical examples are no longer appropriate for Shannon entropy, so the comparison is between the proposed belief entropy and some other uncertainty measures in the Dempster-Shafer framework. Deng entropy $E_d$ and the new belief entropy $E_{Md}$ are calculated as follows:

$E_d(m) = -1 \times \log_2 \frac{1}{2^5 - 1} = 4.9542,$

$E_{Md}(m) = -1 \times \log_2 \left( \frac{1}{2^5 - 1} \, e^{\frac{5-1}{5}} \right) = 3.8000.$

The results show that both Deng entropy and the new belief entropy of this vacuous mass function are bigger than those in Example 4.2. This is because the vacuous mass function in Example 4.3 means the information is totally unknown to the system, while the Bayesian mass function in Example 4.2 shows that the probability is equally distributed in the system. More information is available in Example 4.2 than with the vacuous mass function in Example 4.3, so the uncertain degree in Example 4.2 should be smaller. In addition, in Example 4.3 the uncertain degree indicated by the new belief entropy is smaller than that of Deng entropy; this is achieved by taking the scale of the FOD in the BOE into consideration. That is to say, by taking more available information into consideration, the uncertain degree measured by the new belief entropy is significantly decreased in comparison with Deng entropy. It is also safe to say that the new belief entropy is more accurate for uncertainty measurement in this case.
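These comparisons can be reproduced with the sketch below, assuming (as the text indicates) that Example 4.2 is the uniform Bayesian BPA and Example 4.3 the vacuous BPA on the same 5-element FOD; function names are illustrative:

```python
import math

def deng_entropy(m):
    """E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1)) for A, v in m.items() if v > 0)

def modified_belief_entropy(m, fod_size):
    """E_Md(m) = -sum_A m(A) * log2( m(A)/(2^|A|-1) * e^((|A|-1)/|X|) )."""
    return -sum(
        v * math.log2(v / (2 ** len(A) - 1) * math.exp((len(A) - 1) / fod_size))
        for A, v in m.items() if v > 0
    )

X = frozenset(range(5))
bayesian = {frozenset({i}): 0.2 for i in X}  # Example 4.2: equally distributed
vacuous = {X: 1.0}                           # Example 4.3: totally unknown

# The vacuous BPA is more uncertain than the uniform Bayesian one,
# and the new entropy stays below Deng entropy for the vacuous BPA.
assert modified_belief_entropy(vacuous, 5) > modified_belief_entropy(bayesian, 5)
assert modified_belief_entropy(vacuous, 5) < deng_entropy(vacuous)
```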
In order to test the capacity and superiority of the new belief entropy, recall the example in [54] as the following example. Deng entropy $E_d$ and the modified belief entropy $E_{Md}$ are calculated with a changing proposition T; the results are shown in Table 2. Table 2 and Fig 1 show that the modified belief entropy is smaller than Deng entropy. This is reasonable, because more information in the BOE is taken into consideration by the modified belief entropy. The proposed method has less information loss than Deng entropy. Fig 2 shows the comparison between the modified belief entropy and some other typical uncertainty measures in the Dempster-Shafer framework.
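The monotone behaviour in Table 2 and Fig 1 can be illustrated with a simplified BPA of the same shape (hypothetical values, not those of Table 2): on a 15-element FOD, m(T) = 0.8 and m(X) = 0.2 while T grows from 1 to 14 elements. Both entropies rise with |T|, and the modified entropy stays strictly below Deng entropy:

```python
import math

def deng_entropy(m):
    """E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1)) for A, v in m.items() if v > 0)

def modified_belief_entropy(m, fod_size):
    """E_Md(m) = -sum_A m(A) * log2( m(A)/(2^|A|-1) * e^((|A|-1)/|X|) )."""
    return -sum(
        v * math.log2(v / (2 ** len(A) - 1) * math.exp((len(A) - 1) / fod_size))
        for A, v in m.items() if v > 0
    )

X = frozenset(range(15))
rows = []
for t in range(1, 15):
    T = frozenset(range(t))
    m = {T: 0.8, X: 0.2}  # hypothetical BPA, same shape as the example in [54]
    rows.append((t, deng_entropy(m), modified_belief_entropy(m, 15)))

# E_Md increases with |T| and is always below E_d (the focal element X has |X| > 1)
assert all(b[2] > a[2] for a, b in zip(rows, rows[1:]))
assert all(e_md < e_d for _, e_d, e_md in rows)
```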
In Fig 2, the uncertain degree measured by Hohle's confusion measure never changes with the variation of the element number in proposition T; thus it cannot measure the variation of uncertain degree in this case. Similar to Hohle's confusion measure, Yager's dissonance measure has a limited capacity for uncertainty measurement in this case; neither of these two methods can capture the change in proposition T. The uncertain degree measured by Klir & Ramer's discord measure, Klir & Parviz's strife measure and George & Pal's conflict measure decreases as the element number in proposition T increases. Thus, the confusion measure, the dissonance measure, the discord measure, the strife measure and the conflict measure cannot effectively capture the rise of the uncertain degree along with the increasing element number in proposition T.

Modified belief entropy
The uncertain degree measured by Dubois & Prade's weighted Hartley entropy, Deng entropy and the modified belief entropy rises significantly as the element number in proposition T increases. However, the weighted Hartley entropy and Deng entropy cannot distinguish the different uncertain degrees among BOEs with similar BPAs on different FODs, as shown in Example 3.1. More importantly, by taking the scale of the FOD and the cardinality of each proposition into consideration simultaneously, the uncertain degree measured by the modified belief entropy is significantly decreased in comparison with Deng entropy. The proposed modified belief entropy takes advantage of more of the valuable information in the BOE, which makes it more reasonable and effective for uncertainty measurement in the Dempster-Shafer framework.

A case study
In order to show the effectiveness and the application prospects of the modified belief entropy, the case study in [69] and the fault diagnosis method in [10] are recalled in this section. While performing the fault diagnosis method in [10], this paper replaces Deng entropy with the new belief entropy.
Recall the example in [69]. Three fault types are denoted as $F_1$, $F_2$ and $F_3$, and the fault hypothesis set is Θ = {$F_1$, $F_2$, $F_3$}. Three sensors report their diagnosis results independently; the results are modelled as BOEs, denoted as $E_1$, $E_2$ and $E_3$. The BPAs of the diagnosis results are shown in Table 3.
Based on the sensor reports in Table 3, which fault is occurring now, $F_1$, $F_2$ or $F_3$? With Dempster's rule of combination in Eq (5), the combination results of the sensor reports are shown in Table 4. It is hard to judge which fault has occurred, because the combination results obtained by the conventional Dempster's rule of combination are very close.
The case study demonstrates the effectiveness of the new belief entropy. In addition, the case study shows a promising application prospect of the new uncertainty measure.

Conclusions
In information processing, each tiny piece of information is valuable. Uncertain information should be addressed cautiously, especially when the available information is limited. In this paper, a new belief entropy based on Deng entropy is proposed. The proposed method takes full advantage of the uncertain information in a BOE, including the mass function, the cardinality of the proposition and the scale of the FOD. By addressing more of the available information in BOEs, differences in uncertain degree that cannot be addressed by some other uncertainty measures can now be distinguished successfully. Numerical examples show that the new belief entropy can quantify the uncertain degree of a BOE more accurately. The case study demonstrates the effectiveness and the application prospects of the new measure. Further study of this work will focus on applications of the proposed measure. The new belief entropy provides a promising way to measure the uncertain degree in decision making, fault diagnosis, pattern recognition, risk analysis and so on.