In this article, we discuss the problem of establishing relations between information measures for network structures. Two types of entropy-based measures, namely the Shannon entropy and its generalization, the Rényi entropy, are considered in this study. Our main results establish formal relationships, by means of inequalities, between these two kinds of measures. Further, we state and prove inequalities connecting the classical partition-based graph entropies and partition-independent entropy measures. In addition, several explicit inequalities are derived for special classes of graphs.
Citation: Sivakumar L, Dehmer M (2012) Towards Information Inequalities for Generalized Graph Entropies. PLoS ONE 7(6): e38159. https://doi.org/10.1371/journal.pone.0038159
Editor: Stefano Boccaletti, Technical University of Madrid, Italy
Received: April 10, 2012; Accepted: May 3, 2012; Published: June 8, 2012
Copyright: © 2012 Sivakumar, Dehmer. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: Matthias Dehmer and Lavanya Sivakumar thank the FWF (Austrian Sciences Fund) (Project No. PN22029-N13) for supporting this work. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Complexity of a system, in general, concerns the intricate design of and the complex interrelations among the components of the system. Complexity analysis can be categorized into three types, based on the functional behavior, the topological properties, or the compositional level of a system. Over the years, all these categories have been pursued concurrently in several branches of science and social science. In this paper, we study the complexity of graphs with respect to their underlying structure. This is often referred to as topological complexity, as the measures associate high complexity with low symmetry and large diversity of the system’s components, while low complexity is related to high symmetry, uniformity and lack of diversity. The quantitative estimation (using measures/indices) of topological complexity has proven useful when characterizing networks and has spread widely into all branches of the natural sciences, mathematics, statistics, economics and sociology.
In the study of complexity, information theory has played a predominant role. That is, measures based on the Shannon entropy have been very powerful and useful for determining the structural complexity of networks. Apart from the Shannon entropy, its generalizations, such as the Rényi entropy and the Daròczy entropy, have also been identified as useful measures for characterizing network-based systems.
In this paper, we deal with a novel aspect of analyzing the complexity of network-based systems: we establish relations between information-theoretic complexity measures. Investigating relations (in the form of inequalities) among measures is useful when studying large-scale networks, where evaluating the exact value of a measure might be computationally challenging. In addition, such relations also serve as a tool for solving problems: in the field of communication theory, the study of inequalities has led to the development of a so-called algebra of information, where several rules have been established between the mutual information among events and their respective entropy measures. Young’s inequality, the Brunn-Minkowski inequality and Fisher’s information inequalities are a few examples in this context.
Inequalities involving information measures for graphs are also referred to as information inequalities. They can be classified into two types, namely implicit information inequalities and explicit information inequalities. In particular, when information measures appear on both sides of the inequality, we call it an implicit information inequality, while in an explicit information inequality, the information measure is bounded by a function of the parameters (or constants) involved. For some of the recent contributions in this direction, we refer to the literature.
Recently, we have established relations involving only Shannon entropy measures, under certain assumptions. In this article, we extend that study to analyze relations between entropy measures belonging to different concepts. In particular, the main contribution of this paper is to establish implicit information inequalities involving Shannon and Rényi entropy measures when applied to networks. Further, we present implicit inequalities between Rényi entropy measures based on two different types of probability distributions, under additional assumptions. To achieve this, we analyze and establish relations between classical partition-based graph entropies and non-partition-based (or functional-based) entropies. Finally, we apply the obtained inequalities to specific graph classes and derive simple explicit bounds for the Rényi entropy.
In this section, we state some definitions of information-theoretic complexity measures. These measures fall into two major classes, namely partition-based and partition-independent measures. Some basic inequalities for real numbers are also presented at the end of the section.
Let $G=(V,E)$ be a graph on $N$ vertices, where $V=\{v_1,\dots,v_N\}$ and $E\subseteq \binom{V}{2}$. Throughout this article, G denotes a simple undirected graph. Let X be a collection of subsets of G representing a graph object. Let $\tau$ be an equivalence relation that partitions X into k subsets $X_1,\dots,X_k$, with cardinalities $|X_i|$, for $1\le i\le k$. Let $P=(p_1,\dots,p_k)$ denote the probability distribution on X w.r.t. $\tau$, such that $p_i=|X_i|/|X|$, $1\le i\le k$, is the probability value on each of the partitions. The classical partition-based graph entropy is then the Shannon entropy of this distribution, $I(G,\tau)=-\sum_{i=1}^{k}p_i\log p_i$. (1)
Note that while the above definition is based on partitioning a graph object, another class of Shannon entropy has been defined in the literature where the probability distribution is independent of partitions. That is, probabilities are defined for every vertex of the graph using the concept of information functionals.
Suppose $f$ is an arbitrary information functional that maps the vertices of $G$ to the non-negative real numbers, and let $p^{f}(v_i)=\frac{f(v_i)}{\sum_{j=1}^{N}f(v_j)}$. (2) Then $p^{f}(v_i)$ is the probability value of the vertex $v_i$, and the corresponding partition-independent entropy is $I_f(G)=-\sum_{i=1}^{N}p^{f}(v_i)\log p^{f}(v_i)$. (3)
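To make this construction concrete, the following sketch computes the probability distribution of equation (2) and the corresponding partition-independent Shannon entropy. The functional values used here are hypothetical illustration data, not taken from the paper.

```python
import math

def vertex_probabilities(f_values):
    """p(v_i) = f(v_i) / sum_j f(v_j), as in equation (2)."""
    total = sum(f_values)
    return [f / total for f in f_values]

def shannon_entropy(probs):
    """Shannon entropy -sum_i p_i * log2(p_i) in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical functional values on a 4-vertex graph
f = [3.0, 1.0, 1.0, 1.0]
p = vertex_probabilities(f)      # [0.5, 1/6, 1/6, 1/6]
print(round(shannon_entropy(p), 4))
```

Any strictly positive functional yields a valid probability distribution, so the entropy is always well defined for connected graphs.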
As a follow-up to Shannon’s seminal work, many generalizations of the entropy measure have been proposed in the literature. These generalized entropies were recently extended to study graphs. In the following, we present one such generalization, namely the Rényi entropy for graphs.
Definition 3 The Rényi entropy $H_{\alpha}(G,\tau)$, for $\alpha>0$ and $\alpha\neq 1$, of a graph G is given by $H_{\alpha}(G,\tau)=\frac{1}{1-\alpha}\log\left(\sum_{i=1}^{k}p_i^{\alpha}\right)$. (4) Here, $\tau$ is the equivalence relation on a graph object and $p_1,\dots,p_k$ denote the probabilities defined on the partition induced by $\tau$.
It has been proved that the Rényi entropy is a generalization of the Shannon entropy: in the limiting case $\alpha\to 1$, the Rényi entropy equals the Shannon entropy.
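This limiting behavior is easy to verify numerically. The sketch below, using an arbitrary example distribution, evaluates Definition 3 for values of α approaching 1 and compares the result with the Shannon entropy (here 1.75 bits).

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy (1/(1-alpha)) * log2(sum_i p_i^alpha), alpha > 0, alpha != 1."""
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.125, 0.125]
print(round(shannon_entropy(p), 4))      # 1.75
for alpha in (0.9, 0.99, 0.999):
    # Converges to the Shannon entropy as alpha -> 1
    print(alpha, round(renyi_entropy(p, alpha), 4))
```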
Similar to expression (3), the Rényi entropy can be immediately extended  to partition-independent probability distributions defined on G.
Definition 4 Let $H_{\alpha}^{f}(G)$, for $\alpha>0$ and $\alpha\neq 1$, denote the Rényi entropy defined using an information functional f. Then $H_{\alpha}^{f}(G)=\frac{1}{1-\alpha}\log\left(\sum_{i=1}^{N}\left(p^{f}(v_i)\right)^{\alpha}\right)$. (5)
Next, we state some inequalities from the literature that are crucial for proving our main results. One of the well-known results for real numbers is stated as follows.
Lemma 1 Let $a$ and $b$ be real numbers. Then inequalities (6) and (7) hold.
A simplified form of Minkowski’s inequality has been expressed in the literature.
Lemma 2 If $p>0$, then inequality (8) holds, where the constant in (8) takes one value if $p\ge 1$ and another if $0<p<1$.
As an extension of the discrete Jensen’s inequality, the following inequality has been derived in the literature.
Lemma 3 Let $x_i\ge 0$ and weights $w_i\ge 0$, for $1\le i\le n$, be such that $\sum_{i=1}^{n}w_i=1$. Then inequality (9) holds.
Results and Discussion
In this section, we present our main results on implicit information inequalities. To begin with, we establish the bounds for Rényi entropy in terms of Shannon entropy.
Theorem 4 Let $p(v_1),\dots,p(v_N)$ be the probability values on the vertices of a graph G. Then the Rényi entropy can be bounded in terms of the Shannon entropy as follows:
Proof: It is well known that the Rényi entropy satisfies the following relation with the Shannon entropy: $H_{\alpha}(G)\ge I(G)$ for $0<\alpha<1$ (12) and $H_{\alpha}(G)\le I(G)$ for $\alpha>1$, since the Rényi entropy is non-increasing in $\alpha$.
Now we prove the theorem by considering intervals for $\alpha$.
Case 1: When .
Case 2: When .
Note that when , by applying inequality (6), as before, to the term with and by simplifying we get the same expression as above. When , by direct simplification we get a similar expression. Hence we conclude that the expression (25) holds, in general for .
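The ordering of the Rényi and Shannon entropies used above can be checked numerically. The sketch below, with an arbitrary example distribution, confirms that the Rényi entropy is non-increasing in α, so the Shannon entropy lies between the Rényi entropies for α < 1 and α > 1.

```python
import math

def renyi_entropy(probs, alpha):
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.4, 0.3, 0.2, 0.1]
h = shannon_entropy(p)
# H_alpha >= Shannon for 0 < alpha < 1, and H_alpha <= Shannon for alpha > 1
assert renyi_entropy(p, 0.5) >= h >= renyi_entropy(p, 2.0)
print(round(renyi_entropy(p, 0.5), 4), round(h, 4), round(renyi_entropy(p, 2.0), 4))
```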
Remark 6 Observe that Theorem 4, in general, holds for any arbitrary probability distribution with non-zero probability values. The following theorem illustrates this fact with the help of a probability distribution obtained by partitioning a graph object.
Proof: By proceeding similarly to Theorem 4, we get the desired result.
In the next theorem, we establish bounds between like entropy measures by considering two different probability distributions.
Now we distinguish two cases, depending on $\alpha$, as follows:
Expressions (40) and (41) are the desired inequalities.
Remark 9 A similar relation by considering and has been derived in .
We now focus our attention on the Rényi entropy measure defined using information functionals (given by equation (5)) and present various bounds when two different functionals and their probability distributions satisfy certain initial conditions. A similar study has been performed in earlier work by using Shannon’s entropy only.
Let $f_1$ and $f_2$ be two information functionals defined on $V$. Let $p^{f_1}(v_i)$ and $p^{f_2}(v_i)$ denote the probabilities of $f_1$ and $f_2$, respectively, on a vertex $v_i$. Let $H_{\alpha}^{f_1}(G)$ and $H_{\alpha}^{f_2}(G)$ denote the Rényi entropies based on the functionals $f_1$ and $f_2$, respectively.
Case 1: When . Dividing either side of the equation by yields the desired expression (42).
Case 2: When . In this case, dividing either side of the equation by yields the expression (43) as desired.
Proof: By assumption, we have . Therefore, the corollary follows by letting in the above theorem.
The next theorem can be used to study how a minor perturbation in the probability distribution of the system can affect the corresponding value of Rényi entropy measure. The amount of deviation can then be estimated as follows.
Case 1: When .
Dividing by , yields the desired expression (51).
Case 2: When .
Dividing by , yields the desired expression (52).
Here, and .
Case 1: .
Dividing by , yields the desired expression (65).
Case 2: .
Dividing by , yields the desired expression (66).
Finally by proceeding as before and by simplifying each of the terms in the above equation, we get the desired expression (83).
Upon simplification of the above equation, we get the desired expression (84).
Applications to chemical graphs
In this section, we consider various classes of chemical graphs and illustrate the results of the previous section. For this purpose, we consider a specific example of the equivalence relation on G and an information functional. In order to define concrete graph entropies, we need to specify graph invariants and information functionals to determine a probability distribution.
For the graph invariant, we use the automorphism group of a graph. We choose this invariant because of its extensive investigation in the literature. Note that there are various other invariants, such as distances, degrees and paths, that could be used. Recall that every graph possesses an automorphism group, where an automorphism is a permutation of the vertices that preserves the adjacency relation of the graph. The automorphism group partitions the vertex set into orbits, where a vertex orbit is a collection of topologically equivalent vertices.
Definition 5 Let $\tau$ be the equivalence relation induced by the automorphism group that partitions the vertex set V of G into vertex orbits. Let $V_1,\dots,V_k$ be the k orbits of V such that $\sum_{i=1}^{k}|V_i|=n$.
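For small graphs, vertex orbits can be computed by brute force over all vertex permutations. The sketch below is an illustrative implementation only (exponential in the number of vertices), not the method used in the paper.

```python
from itertools import permutations

def vertex_orbits(n, edges):
    """Brute-force vertex orbits of a small graph under its automorphism group."""
    eset = {frozenset(e) for e in edges}
    # Collect all adjacency-preserving permutations of the vertices.
    autos = [p for p in permutations(range(n))
             if {frozenset((p[u], p[v])) for u, v in edges} == eset]
    orbits, seen = [], set()
    for v in range(n):
        if v not in seen:
            orb = {p[v] for p in autos}   # images of v under all automorphisms
            orbits.append(sorted(orb))
            seen |= orb
    return orbits

# Star on 4 vertices with centre 0: two orbits, {0} and the leaves {1, 2, 3}
print(vertex_orbits(4, [(0, 1), (0, 2), (0, 3)]))
```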
Let $G=(V,E)$ be a simple, undirected graph on n vertices, let $d(u,v)$ denote the distance between two vertices u and v, and let $\rho(G)$ denote the diameter of G. Let $S_j(u)$ denote the j-sphere of a vertex u, defined as $S_j(u)=\{v\in V : d(u,v)=j\}$.
Remark 15 The setting $c_1=c_2=\dots=c_{\rho(G)}=c$ is trivial, as then $f(v)=c\,(n-1)$ for every vertex of a connected graph and the induced probability distribution is uniform. But anyway, for all combinations of the $c_j$ the resulting measures are well defined.
Note that the constants $c_j$ in the above expressions contribute to the weight of the j-spheres. If all the $c_j$ are equal, see Remark 15. When the $c_j$ are all distinct, vertices belonging to different j-spheres are weighted differently, while vertices belonging to the same j-sphere receive the same weight. Interestingly, the choice of constants has proven useful for solving problems in chemical graph analysis. With decreasing constants, the emphasis for a particular vertex is mainly placed on its nearest neighbors, and the contribution of vertices at the farthest distance is low. For more examples, we refer to the literature.
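A j-sphere-based information functional of this type, $f(v)=\sum_j c_j\,|S_j(v)|$, can be sketched as follows. The particular example graph and the decreasing weights $c_j$ are hypothetical choices for illustration.

```python
from collections import deque

def j_spheres(adj, u):
    """BFS from u; returns {j: |S_j(u)|} for distances j >= 1."""
    dist, q = {u: 0}, deque([u])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    sizes = {}
    for v, d in dist.items():
        if d >= 1:
            sizes[d] = sizes.get(d, 0) + 1
    return sizes

def functional_f(adj, c):
    """f(v) = sum_j c[j] * |S_j(v)| with weights c[1], c[2], ... (hypothetical)."""
    return {u: sum(c[j] * s for j, s in j_spheres(adj, u).items()) for u in adj}

# Path on 4 vertices 0-1-2-3; decreasing weights emphasize near neighbors
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
c = {1: 3.0, 2: 2.0, 3: 1.0}
print(functional_f(adj, c))
```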
For the rest of the article, we consider two graph classes, namely stars and path graphs, to show the application of the results from the previous section. In addition, we also present the behavior of certain information functionals for general connected graphs. A similar analysis involving only the Shannon entropy measure has been performed in earlier work.
A star $S_n$ is a tree on n vertices with exactly one vertex of degree $n-1$ and $n-1$ vertices of degree 1. The unique vertex of degree $n-1$, denoted by u, is also referred to as the central vertex. Star graphs have been of considerable interest, since they are the trees with the smallest possible diameter among all trees on n vertices. Let $\tau$ be the automorphism equivalence relation defined on $S_n$ such that $\tau$ partitions $V$ into two orbits $V_1=\{u\}$ and $V_2=V\setminus\{u\}$, where $|V_1|=1$ and $|V_2|=n-1$.
Theorem 16 Let $\tau$ be the automorphism equivalence relation, as defined above, on $S_n$. Then the following holds.
Hence, the theorem follows.
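As a numeric sanity check of the orbit distribution used here (probabilities $1/n$ for the central vertex’s orbit and $(n-1)/n$ for the leaves), the sketch below evaluates the Rényi entropy of a star for a few values of n. Since there are only two orbits, every value is at most one bit.

```python
import math

def star_orbit_probs(n):
    """Orbit probabilities for the star S_n: the centre alone, the n-1 leaves together."""
    return [1.0 / n, (n - 1.0) / n]

def renyi_entropy(probs, alpha):
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

for n in (5, 10, 100):
    p = star_orbit_probs(n)
    # Two orbits, so every Rényi entropy is at most log2(2) = 1 bit
    print(n, round(renyi_entropy(p, 2.0), 4))
```

As n grows, the distribution becomes increasingly skewed toward the leaf orbit, and the entropy tends to zero.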
Proof: Follows by using equation (96) in Theorem 8.
When , the conditions of the theorem are satisfied. That is, and , for some . Note we obtain a family of functionals (depending on and ) satisfying the conditions of the theorem. Also, we have . By substituting the value of S in expressions (101) and (102), we get the bounds for .
Remark 19 Another interesting graph class possessing the same automorphism group structure as the stars is the class of wheel graphs. A wheel $W_n$ is a graph obtained by joining a new vertex v to every vertex of an $(n-1)$-cycle. While studying the inequalities for this class of graphs, we derived expressions similar to those of Theorems 16 and 17. Hence, we conclude that Theorems 16 and 17 also hold for the wheel $W_n$.
Path graphs, denoted by $P_n$, are the only trees with maximum diameter among all trees on n vertices. This class of graphs has received considerable attention in chemistry when studying hydrogen-depleted hydrocarbon molecules. Let $\tau$ be the automorphism equivalence relation defined on $P_n$, where $\tau$ partitions the vertices of $P_n$ into $n/2$ orbits of size 2 when n is even, and into $(n-1)/2$ orbits of size 2 and one orbit of size 1 when n is odd.
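The orbit probabilities just described are easy to tabulate. The sketch below does so and evaluates the Rényi entropy; for even n the orbit distribution is uniform over $n/2$ orbits, so $H_\alpha=\log_2(n/2)$ for every α.

```python
import math

def path_orbit_probs(n):
    """Orbit probabilities of P_n under the end-to-end reflection automorphism."""
    if n % 2 == 0:
        return [2.0 / n] * (n // 2)
    return [1.0 / n] + [2.0 / n] * ((n - 1) // 2)

def renyi_entropy(probs, alpha):
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

# Even path: uniform orbit distribution, so H_alpha = log2(n/2) for every alpha
print(round(renyi_entropy(path_orbit_probs(8), 2.0), 4))   # log2(4) = 2.0
# Odd path: one singleton orbit perturbs the uniform distribution
print(round(renyi_entropy(path_orbit_probs(9), 2.0), 4))
```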
In the following theorem, we consider , when n is even.
Next, by using this value of and in Theorem 8, we get the desired expression for .
When we consider , n being odd, evaluating is not immediate. Hence we invoke Theorem 7 and obtain the following result.
Proof: Since n is odd, $\tau$ partitions $V$ into $(n-1)/2$ orbits of size 2 and one orbit of size 1. That is, $|V_1|=1$ and $|V_i|=2$ for $2\le i\le (n+1)/2$. Therefore, $p_1=1/n$ and $p_i=2/n$ for $2\le i\le (n+1)/2$.
By using this value of along with and in Theorem 7, we get the desired bounds for .
Next we evaluate the bounds for .
Upon simplification of the above expression, we get the desired bound (114).
In the case of , by proceeding similarly using expression (32) from Theorem 8, we obtain the other bound (115).
Remark 22 Observe that the computation of the Rényi entropy, even with the classical partition-based distributions, is not as immediate for odd paths as for even paths. Hence, obtaining a closed-form expression for general connected graphs is correspondingly difficult.
In this section, we consider an arbitrary connected graph G on n vertices and the functionals given by equations (91) and (92), respectively. In the next two theorems, we present explicit bounds for the Rényi entropy when choosing these two information functionals in particular.
Dividing the expression (130) by , and simplifying we get the desired expressions given by (122) and (123) depending on the value of .
Let us illustrate the above theorem by the following example. Let be the graph on 12 vertices as shown in Figure 1. The corresponding value of the information functional is also depicted in Figure 1. Here, . Also,
It is known that (see Remark 15) if . Equivalently, by using Theorem 23, we arrive at the same value. Observe that the upper and lower bounds coincide with this choice of constants.
Theorem 24 Let given by equation (92). Let and where is as defined in . Then the value of can be bounded as follows.
Dividing the expression (143) by , and simplifying we get the expressions (135) and (136) as desired.
Conclusion and Summary
In this article, we have studied the problem of establishing relations between graph entropy measures. Among the various entropy measures, we have considered the classical Shannon entropy and the Rényi entropy. In fact, there is only very little work applying Rényi’s entropy to graphs. While this research is an extension of our earlier work, the results obtained here complement the earlier ones and are of independent interest. In particular, the main contribution of this paper has been to establish implicit information inequalities involving the Shannon entropy and the Rényi entropy when applied to networks. Also, we have presented implicit inequalities between Rényi entropy measures based on two different types of probability distributions, under additional assumptions. Further, we have demonstrated the application of the derived results by using various graph classes.
As mentioned earlier, investigating relations (by means of inequalities) is crucial, as the relations between the values of these measures have not yet been investigated extensively. To demonstrate the importance of such inequalities, suppose $I_1$ and $I_2$ are graph entropy measures and $I_1(G)\le I_2(G)$ holds for some graph G. If computing $I_1$ has non-polynomial complexity and $I_2$ is computable in polynomial time, then $I_2(G)$ is an upper bound that may be feasible to evaluate in general. For measures such as Körner’s entropy, relations between graph entropies could be crucial. But note that, in view of the vast number of existing measures, this is a daunting problem. Also, the meaning of the Rényi graph entropy is not yet fully understood. Hence, we believe that such relations can be useful when designing and understanding complex graph-based systems. This might be especially applicable when applying information-theoretic network measures such as Shannon’s and Rényi’s entropy to large complex networks.
- 1. Bonchev D, Rouvray D (2003) Complexity in chemistry: Introduction and Fundamentals. Mathematical and Computational Chemistry 7. New York: CRC Press.
- 2. Bonchev D (2009) Information theoretic measures of complexity. In: Meyers R, editor, Encyclopedia of Complexity and System Science, Springer, volume 5: 4820–4838.
- 3. Anand K, Bianconi G (2009) Entropy measures for networks: Toward an information theory of complex topologies. Physics Review E 80: 045102.
- 4. Costa LdF, Rodrigues FA, Travieso G, Boas PRV (2007) Characterization of complex networks: A survey of measurements. Advances in Physics 56: 167–242.
- 5. Kim J, Wilhelm T (2008) What is a complex graph? Physica A: Statistical Mechanics and its Applications 387: 2637–2652.
- 6. Balaban A, Balaban T (1991) New vertex invariants and topological indices of chemical graphs based on information on distances. Journal of Mathematical Chemistry 8: 383–397.
- 7. Bertz SH (1983) A mathematical model of complexity. In: King R, editor, Chemical applications of topology and graph theory, Elsevier, Amsterdam. pp. 206–221.
- 8. Basak SC, Magnuson VR, Niemi GJ, Regal RR (1988) Determining structural similarity of chemicals using graph-theoretic indices. Discrete Applied Mathematics 19: 17–44.
- 9. Bonchev D, Rouvray D (2005) Complexity in chemistry, biology, and ecology. Mathematical and Computational Chemistry. New York: Springer, xx+344 pp. doi:10.1007/b136300.
- 10. Claussen JC (2007) Offdiagonal complexity: A computationally quick complexity measure for graphs and networks. Physica A: Statistical Mechanics and its Applications 375: 365–373.
- 11. Körner J (1973) Coding of an information source having ambiguous alphabet and the entropy of graphs. Trans 6th Prague Conference on Information Theory. pp. 411–425.
- 12. Butts C (2001) The complexity of social networks: Theoretical and empirical findings. Social Networks 23: 31–71.
- 13. Bonchev D (1983) Information Theoretic Indices for Characterization of Chemical Structures. Research Studies Press, Chichester.
- 14. Rényi A (1961) On measures of information and entropy. In: Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability. Berkeley, CA: University of California Press, volume 1, 547–561.
- 15. Daròczy Z, Jarai A (1979) On the measurable solutions of functional equation arising in information theory. Acta Math Acad Sci Hungar 34: 105–116.
- 16. Dehmer M, Mowshowitz A (2011) Generalized graph entropies. Complexity 17: 45–50.
- 17. Dehmer M, Sivakumar L (2012) Recent Developments in Quantitative Graph Theory: Information Inequalities for Networks. PLoS ONE 7: e31395.
- 18. Dehmer M, Mowshowitz A (2011) A history of graph entropy measures. Information Sciences 181: 57–78.
- 19. Cover TM, Thomas JA (2006) Elements of Information Theory. Wiley Series in Telecommunications and Signal Processing. Wiley & Sons.
- 20. Dembo A, Cover T, Thomas J (1991) Information theoretic inequalities. IEEE Transactions on Information Theory 37: 1501–1518.
- 21. Yeung RW (1997) A framework for linear information inequalities. IEEE Transactions on Information Theory 43: 1924–1934.
- 22. Dragomir SS, Goh CJ (1997) Some bounds on entropy measures in information theory. Applied Mathematics Letters 10: 23–28.
- 23. Dehmer M, Mowshowitz A (2010) Inequalities for entropy-based measures of network information content. Applied Mathematics and Computation 215: 4263–4271.
- 24. Bonchev D, Trinajstić N (1977) Information theory, distance matrix, and molecular branching. The Journal of Chemical Physics 67: 4517–4533.
- 25. Dehmer M, Mowshowitz A, Emmert-Streib F (2011) Connections between classical and parametric network entropies. PLoS ONE 6: e15733.
- 26. Dehmer M, Borgert S, Emmert-Streib F (2008) Entropy bounds for hierarchical molecular networks. PLoS ONE 3: e3079.
- 27. Mowshowitz A (1968) Entropy and the complexity of graphs: I. an index of the relative complexity of a graph. Bulletin of Mathematical Biophysics 30: 175–204.
- 28. Dehmer M (2008) Information-theoretic concepts for the analysis of complex networks. Applied Artificial Intelligence: An International Journal 22: 684–706.
- 29. Dehmer M (2008) Information processing in complex networks: Graph entropy and information functionals. Applied Mathematics and Computation 201: 82–94.
- 30. Skorobogatov VA, Dobrynin AA (1988) Metrical analysis of graphs. MATCH Commun Math Comp Chem 23: 105–155.
- 31. Shannon CE (1948) A mathematical theory of communication. Bell System Technical Journal 27: 379–423 and 623–656.
- 32. Hardy GH, Littlewood JE, Pólya G (1988) Inequalities. Cambridge Mathematical Library. Cambridge University Press, 2nd edition.
- 33. Shannon C, Weaver W (1997) The Mathematical Theory of Communication. University of Illinois Press, Urbana, IL, USA.
- 34. Rashevsky N (1955) Life, information theory and topology. Bulletin of Mathematical Biophysics 17: 229–235.
- 35. Arndt C (2004) Information Measures: Information and its Description in Science and Engineering (Signals and Communication Technology). Springer.
- 36. Dehmer M, Emmert-Streib F (2008) Structural information content of networks: Graph entropy based on local vertex functionals. Computational Biology and Chemistry 32: 131–138.
- 37. Dehmer M, Barbarini N, Varmuza K, Graber A (2009) A large scale analysis of information-theoretic network complexity measures using chemical structures. PLoS ONE 4: 1–13.
- 38. Harary F (1969) Graph Theory. Addison Wesley Publishing Company. Reading, MA, USA.
- 39. Delgado-Soler L, Toral R, Tomás MS, Rubio-Martinez J (2009) RED: A set of molecular descriptors based on Rényi entropy. Journal of Chemical Information and Modeling 49: 2457–2468.