Not so optimal: The evolution of mutual information in potassium voltage-gated channels

  • Alejandra Duran-Urriago,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Pioneer Academics, Philadelphia, PA, United States of America

  • Sarah Marzen

    Roles Conceptualization, Project administration, Resources, Supervision, Writing – review & editing

    smarzen@kecksci.claremont.edu

    Affiliation W. M. Keck Science Department, Pitzer, Scripps, and Claremont McKenna Colleges, Claremont, CA, United States of America

Abstract

Potassium voltage-gated (Kv) channels need to detect and respond to rapidly changing ionic concentrations in their environment. With an essential role in regulating electrical signaling, they would be expected to be optimal sensors that evolved to predict ionic concentrations. To explore these assumptions, we use statistical mechanics in conjunction with information theory to model how animal Kv channels respond to changes in potassium concentrations in their environment. By measuring mutual information in representative Kv channel types across a variety of environments, we find two things. First, under weak conditions, there is a gating charge that maximizes mutual information with the environment. Second, as Kv channels have evolved, they have moved towards decreasing mutual information with the environment. This suggests either that Kv channels do not need to act as sensors of their environment or that Kv channels have other functionalities that interfere with their role as sensors of their environment.

Introduction

Accurately detecting electrical stimuli in the environment is crucial for all living organisms. It is remarkably important for communication in the nervous system, which relies on efficiently detecting and responding to electrical signals produced as neuronal ion channels open and close [1]. Such signaling is specifically dependent on the actions of voltage-gated ion channels. These are transmembrane proteins that open depending on the voltage changes across the membrane to allow an ionic current to flow [2]. Of the different types of voltage-gated channels that exist (e.g. sodium, calcium), this work focuses on potassium (Kv) voltage-gated channels, specifically those present in Metazoans (animals).

Upon voltage activation, Kv channels undergo a conformational change that allows only potassium ions to flow through [3]. They exist in all domains of life and the biological tasks they carry out are very diverse [4–6], but a roughly unified function and structure is found in Metazoan Kv channels [7]. This is due to the presence of a nervous system that sends “messages” in the form of repetitive current spikes called action potentials [8], which requires Kv channels to be excellent sensors. Imagine a neuron as an electrical signal passes through it: each of the voltage-gated channels needs to make the best prediction possible about the charges it “feels” (which ultimately determine the voltage changes) in no more than two milliseconds [9]. In a nervous system that does not allow a wide range of ion concentrations, the slightest variation in the currents conveys important information about the signal, and it seems that Kv channels should accurately detect it.

Researchers have so far developed Kv channel models to assess the probability of being in an open conformation according to changes in membrane potential (voltage). Using statistical mechanics and thermodynamics is a common approach [4, 10, 11], where a variety of parameters in partition functions are introduced to generate models coherent with experimental results. The principal models of this type have been proposed by Sigworth [12], and Sigg and Bezanilla [13].

These models have been mostly used in conjunction with mutagenesis studies to determine the structure of the Kv voltage sensor domain (VSD) [14, 15] or the physiological effects of common mutations [16, 17]. However, the theoretical application of these models to learn how good Kv channels are at making predictions about their environment has not been explored yet. In an environment that demands accurate and incredibly fast sensing, this approach is ideal to assess how informative Kv channels need to be about their environment.

In this work, we manipulate Sigworth’s model to create one that predicts how likely the channel is to be open depending ultimately on K+ concentration. Using information theory, we quantify how well Kv channels send messages about their environment. We then explore if their evolution could reflect a tendency to maximize mutual information. The following section provides a theoretical background on the biophysical model implemented, information theory, and Kv evolutionary history. In the Methods, we derive the model. In Results, we show how the gating charge is a critical factor for sensing across different environments and explore the relationship between mutual information and Kv channel evolution. Finally, in the Discussion, we explore new evolutionary perspectives suggested by our results.

Theoretical background

Modeling a two-state voltage-gated ion channel

Several biological sensors can be considered as allosterically regulated molecules, where an indirect regulator induces a conformational change [18]. Examples range from ligand-gated ion channels to hemoglobin changing conformation upon oxygen binding. Considering these sensors as allosteric molecules allows one to formulate statistical mechanical models that link changes in conformation to their external regulators, assigning statistical weights to different conformational states [19].

As for voltage-gated ion channels, a simple statistical mechanical model that describes the influence of the membrane potential (voltage difference) on the conformation of the channel was proposed by Sigworth in 1994 [12]. Assuming an allosteric-like model where the channel can be either open or closed, the following relation is found:

$$\frac{P_o}{P_c} = e^{-\beta \Delta G} \tag{1}$$

where P_o is the probability of being in the open conformation and P_c that of being in the closed one, β = 1/(k_B T) has its usual meaning where k_B is the Boltzmann constant and T the temperature in Kelvin, and

$$\Delta G = \Delta G^{\circ} - qE \tag{2}$$

where ΔG° is the free energy change between the open and closed states at zero membrane potential, q is the gating charge (in terms of elementary charges e_o), and E is the membrane potential in mV. ΔG° and q are unique to each type of channel. ΔG has been redefined by several authors [20], but the original definition is also valid. The gating current is generated when specific charges in the voltage sensor domains of the voltage-gated channel move from a lower gate to an upper gate, which is necessary for a conformational change to occur [8]; see Fig 1.

Fig 1. Two-state toy model of a Kv channel.

The upwards movement of gating charges (in black) generates a gating current that induces the conformational change from a closed to an open state (voltage sensor domains in orange).

https://doi.org/10.1371/journal.pone.0264424.g001

Knowing that P_o + P_c = 1, the model is obtained from Eq 1 as:

$$P_o = \frac{1}{1 + e^{\beta(\Delta G^{\circ} - qE)}} \tag{3}$$

The probability of being open is affected by the gating charge and the standard free energy as constants and is dependent on the changes in membrane potential. It can also be thought of and written as a conditional probability distribution: P(open|E). This notation will be used below.

Note from Eq 3 that at very positive (depolarized) membrane potentials, ΔG becomes more negative, and the channel is virtually always open. In turn, the value of q determines the magnitude by which changes in E impact ΔG. Generally, q is regarded as an indicator of the voltage sensitivity of the channel.
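To make this concrete, a minimal numerical sketch of Eq 3 is given below. The parameter values, and the unit conversion from elementary charges times millivolts to kcal/mol, are illustrative choices of ours and not fits to any particular channel.

import numpy as np

# Minimal sketch of the two-state model (Eq 3). Illustrative parameters only.
KT = 0.593  # thermal energy k_B*T at ~298 K, expressed in kcal/mol

def p_open(E_mV, q, dG0_kcal_mol, kT=KT):
    """Probability of the open conformation as a function of membrane potential.

    E_mV: membrane potential in mV
    q: gating charge in elementary charges
    dG0_kcal_mol: free energy difference (open minus closed) at E = 0, in kcal/mol
    """
    # 1 elementary charge * 1 mV corresponds to ~0.02306 kcal/mol (Faraday/4184/1000)
    dG = dG0_kcal_mol - q * E_mV * 0.02306
    return 1.0 / (1.0 + np.exp(dG / kT))

# Example: a hypothetical channel with q = 12 e and dG0 = 5 kcal/mol
print(p_open(np.linspace(-100, 100, 5), q=12, dG0_kcal_mol=5.0))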

The model was first used by Sigworth to study the shifts in Po induced by a mutation in Kv Shaker channels and has continued to be used for similar purposes [20–22]. While its simplicity makes it unsuitable for perfect fits to experimental data, it is preferred for this work. For the same reasons, a two-state model was chosen instead of a multi-state scheme.

Mutual information

The model above presents a scenario where the open conformation of a voltage-gated channel is dependent on the membrane potential in the environment. The open conformation immediately allows an ionic current to flow and is consequently equivalent to firing an electrical signal. In a very general sense, this can be considered as an output signal (Y) depending on certain input (X). Biology has greatly focused on how a cell or an organism gets from receiving X to producing Y, studying molecular mechanisms. However, looking at how well they process and communicate these signals in terms of the information they carry is also possible. In fact, the question about how much information from a given variable X can be reliably tracked to an output Y (or vice versa, how much can Y reliably tell about X) is answered by information theory.

Both X and Y represent random variables, meaning that their state is unknown and random until we do a particular experiment and obtain a “realization” of each. Their realizations are denoted x and y, respectively. Over the course of N ≫ 1 experiments, we see X take a value x and Y take a value y with frequency ≈ Np(x, y). The quantity p(x, y) is known as the joint probability distribution, and it describes (equivalently, in a Bayesian sense) our belief that X will take value x and Y will take value y in any given experiment. Of particular interest are: the marginal distributions p(x) = ∑y p(x, y) and p(y) = ∑x p(x, y), which represent the probability of seeing a particular value x or a particular value y; and the conditional probability distributions p(x|y) and p(y|x), which represent the probability of seeing a particular value x conditioned on the fact that we have seen a particular value y, or vice versa. It is of calculational importance that there is a simple relationship between joint, marginal, and conditional probability distributions: p(x, y) = p(x)p(y|x) = p(y)p(x|y). Though we have implicitly focused on the case that x and y can only be in a finite set, one can straightforwardly extend mutual information (described below) to the case when either x or y or both are real numbers.

Proposed by Claude Shannon in 1948 [23], information theory is built upon the idea of entropy in a system. This is not to be confused with the Boltzmann-Gibbs entropy used in thermodynamics. Instead, Shannon’s entropy measures the uncertainty of the state of a variable in a system (X or Y as outlined above). It reflects the average number of “yes/no” questions needed to correctly guess the state of a random variable [24]. It is mathematically expressed as:

$$H(X) = -\sum_{i=1}^{N} p(x_i)\log_2 p(x_i) \tag{4}$$

In this expression, N is the number of possible states of X (any given variable), p(x) is the probability distribution of the states of X, and the logarithm with base 2 allows us to get an entropy value in binary units—bits. The probability distribution can be obtained using theoretical predictive models, using statistical weights for example (as done in Sigworth’s model). As well, it can be inferred from experimental measurements [24]. For convenience, values of X will represent “inputs” and Y “outputs” throughout the rest of the paper.
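As a quick illustration (ours, not part of the original analysis), Eq 4 can be evaluated directly for any discrete distribution:

import numpy as np

def entropy_bits(p):
    """Shannon entropy (Eq 4) of a discrete probability distribution, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                  # terms with p(x) = 0 contribute nothing
    return float(-np.sum(p * np.log2(p)))

print(entropy_bits([0.5, 0.5]))   # a fair coin carries exactly 1 bit of uncertainty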

To describe the amount of shared information between two systems (or two variables X and Y), Shannon introduced mutual information. It quantifies the reduction in uncertainty about X obtained from knowing Y, or vice versa. Different relationships between X and Y result in different expressions for mutual information, which have been reviewed in other works [24–26]. Here, the scenario of interest is where the state of Y is determined by X: it is conditional. The corresponding expression for its mutual information is:

$$I(X;Y) = H(Y) - H(Y|X) \tag{5}$$

In this expression, H(Y|X) is the conditional entropy, which expresses the remaining uncertainty about Y given knowledge of X. It is directly related to the conditional probability distribution p(y|x) via H(Y|X) = −∑_x p(x) ∑_y p(y|x) log_2 p(y|x), and p(y|x) is exactly what we get using Sigworth’s model; note the correspondence with Eq 3. The specifications to get I(X;Y) are discussed in the Methods. Mutual information quantifies, in bits, how strong the (nonlinear) correlation is between X and Y. It is always non-negative, with values close to zero meaning the correlation is weak, and 0 meaning X and Y are completely independent of each other [23].

In a biological context, the value of mutual information indicates the number of possible environmental conditions (X) that a biological readout (Y) allows us to distinguish, or equivalently a reduction in uncertainty about the state of X. For example, if the mutual information value is 1 bit, then the uncertainty about the state of X is halved by knowing Y. With 2 bits, 2^2 states of X can be distinguished; 3 bits represent 2^3 possible states, and so on [27]. In this sense, the higher the mutual information, the more states of X we can distinguish from knowledge of Y, and hence the less uncertain we are about X compared to when we did not know Y.

In general, information theory can be applied in two ways to biological contexts: with a source coding or a channel coding approach. The first approach assumes we can control the channel through which the information gets from X to Y, and focuses on how to compress the X signal into a Y response while losing as little information as possible. The second approach assumes we can control the environment, studying how much data we can send through a noisy channel. A third approach treats the mutual information as a useful quantification of nonlinear correlations [28]. We mostly use the third approach in this work, though we later address the second approach.

Mutual information has been repeatedly used to approach questions throughout a great variety of areas [26, 29, 30]. The copious number of studies using information-theoretic tools has led some authors to judge the efforts as too “optimistic” about the power of information theory [31]. However, in biological contexts, it can be a good indicator of how well a biological sensor tracks its input to its output. Moreover, careful uses of the theory have resulted in valuable new insights about a possible evolutionary principle to optimize mutual information given the energetic constraints that living organisms have to face [32].

The evolution of Kv channels

Kv channels are present across all domains of life [7], and their functions in organisms other than eukaryotes are just starting to be understood. Archaeal and prokaryotic Kv channels have been mainly used for structural modeling, and there are initial studies suggesting that they play a role in electrical signaling (only for bacteria) [33]. In eukaryotes, specifically Metazoans, three major families have been identified with further categories. Table 1 summarizes the classifications.

Phylogenetic trees showing the evolution of these Kv channels, such as the one shown below, have been obtained using genomic data.

The tree shows the complete evolution for all ion channels that are selective for K+, and only the ones with the “Kv” label are voltage-gated. We can see that most of the diversification including the emergence of all Kv families occurred in the basal metazoans. Surprisingly, structures are highly conserved between the time they diversified and the latest organisms in evolutionary history (for instance, humans). Although gene sequences have changed, the changes in the final expression of the protein are silent or insignificant, and major structural properties remain unchanged [36].

Such extensive diversification in the basal metazoans, followed by such high conservation, leads to several questions. What could have driven such a sudden differentiation of Kv channels? What selective pressure was present during the basal metazoan era, but not afterwards, that slowed down Kv channel evolution? Further, looking at the deeper principles in the evolution of biological sensors using concepts from information theory, and reiterating the question we explore: did Kv channels in Metazoans evolve to maximize mutual information?

Methods

To obtain a biophysical model for Kv channels we first identify its environment (X) as the extracellular potassium concentration [K+]o, and its biological readout (Y) as either being in an open or closed conformation. Although one could initially think that the environment should be the membrane potential, ultimately differences in ion concentrations generate such potentials and Kv channels must select only for potassium ions. Conversely, the opening and closing of the channels only generates changes in potassium concentrations. Hence, we can make an educated guess that these channels want to sense K+ concentrations in their outside (changing) environment, which in turn affect the voltage sensed too.

Precisely, the Goldman-Hodgkin-Katz voltage equation (or Goldman equation) relates the membrane potential (E) to the main ion concentrations in and out of the cell [37, 38]:

$$E = \frac{RT}{F}\,\ln\!\left(\frac{P_{K}[K^+]_o + P_{Na}[Na^+]_o + P_{Cl}[Cl^-]_i}{P_{K}[K^+]_i + P_{Na}[Na^+]_i + P_{Cl}[Cl^-]_o}\right) \tag{6}$$

where P is the membrane permeability value for each ion, R is the ideal gas constant, F is Faraday’s constant, T is the temperature, and the subscripts indicate if the concentration is inside (i) or outside (o) of the cell. We have implicitly assumed that the equilibrium membrane potential given the ion concentration gradients is quickly achieved, as we wish to sense pre-action potential.
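A minimal numerical sketch of Eq 6 follows; the permeability ratios and resting concentrations are textbook-style illustrative values we chose for the example, not the exact parameters used in this study.

import numpy as np

R, F = 8.314, 96485.0        # J/(mol K), C/mol
T = 310.0                    # K

def ghk_potential_mV(K_out, K_in=140.0, Na_out=145.0, Na_in=15.0,
                     Cl_out=110.0, Cl_in=10.0,
                     P_K=1.0, P_Na=0.05, P_Cl=0.45):
    """Membrane potential (mV) from the Goldman-Hodgkin-Katz equation (Eq 6).

    Concentrations in mM; permeabilities are relative to P_K. Default values
    are illustrative, roughly those of a typical neuron at rest.
    """
    num = P_K * K_out + P_Na * Na_out + P_Cl * Cl_in
    den = P_K * K_in + P_Na * Na_in + P_Cl * Cl_out
    return 1000.0 * (R * T / F) * np.log(num / den)

print(ghk_potential_mV(K_out=5.0))   # about -65 mV, a typical resting potential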

To keep [K+]_o as the only variable term, we assume all other terms as constant, depending on the respective average concentrations for each cell type. In this work, we use the concentrations for a typical neuron at rest. Then, the obtained expression for E can be replaced in Sigworth’s model (Eq 3) to get:

$$P(\text{open} \mid [K^+]_o) = \frac{1}{1 + \exp\!\left[\beta\left(\Delta G^{\circ} - q\,\frac{RT}{F}\ln\frac{[K^+]_o + C_1}{C_2}\right)\right]} \tag{7}$$

where C_1, C_2 are constants related to the aforementioned Goldman equation and ΔG° has different possible expressions. One showing high fidelity to experimental results has been proposed by Chowdhury and Chanda as the “limiting slope method” [20]. ΔG° values reported using this method were directly used. However, in most cases, experimental values only allowed us to make another estimation for ΔG° indicated by the same authors as:

$$\Delta G^{\circ} = qFV_{1/2} \tag{8}$$

where q is the gating charge, F is Faraday’s constant, and V_{1/2} is the voltage at which half of the channels are open (half-maximal activation voltage), commonly reported in the literature.
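The sketch below assembles Eq 7 from the pieces above, reusing the ghk_potential_mV helper from the previous sketch and estimating ΔG° through Eq 8. The gating charge and V1/2 values are placeholders of ours, not entries from Table 2.

import numpy as np

R, F, T = 8.314, 96485.0, 310.0   # J/(mol K), C/mol, K

def p_open_given_K(K_out_mM, q, V_half_mV, **ghk_kwargs):
    """P(open | [K+]o): the GHK potential (Eq 6) fed into the two-state model."""
    E_mV = ghk_potential_mV(K_out_mM, **ghk_kwargs)   # helper from the sketch above
    dG0 = q * F * (V_half_mV / 1000.0)                # Eq 8, in J/mol
    dG = dG0 - q * F * (E_mV / 1000.0)                # Eq 2 rewritten in molar units
    return 1.0 / (1.0 + np.exp(dG / (R * T)))

K_range = np.arange(0.5, 20.5, 0.5)                   # discretized [K+]o environment, mM
print(p_open_given_K(K_range, q=12.0, V_half_mV=-60.0))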

Once we get Eq 7 relating the readout (Y) to the identified environment (X), we proceed to find the expression for the mutual information. There are different ways to do so, but here we use the general expression indicated in Ref. [39]:

$$I(X;Y) = \sum_{x}\sum_{y} p(x)\,p(y|x)\,\log_2\frac{p(y|x)}{p(y)} \tag{9}$$

This expression is especially useful since it depends only on p(x) and p(y|x), which represent the known distribution of the environment and the conditional probability distribution found in Eq 7. Recall that the marginal probability distribution p(y) is simplified because it can be expressed as:

$$p(y) = \sum_{x} p(x)\,p(y|x) \tag{10}$$

Replacing Eq 7 in Eq 9, we get the final expression for mutual information. It will quantify how good Kv channels are as sensors in the context of the two-state model used.
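A compact sketch of Eqs 9-10 for the binary open/closed output is shown below. It assumes p(x) is a normalized distribution over the discretized [K+]o values; the function names are ours.

import numpy as np

def mutual_information_bits(p_x, p_open_x):
    """Mutual information I(X;Y) in bits for a binary readout (Eqs 9-10).

    p_x: environment distribution over discretized [K+]o values, shape (n_x,)
    p_open_x: P(open | [K+]o) for each environment value, shape (n_x,);
              the closed probability is 1 - P(open) since the output is binary.
    """
    p_x = np.asarray(p_x, dtype=float)
    p_yx = np.column_stack([p_open_x, 1.0 - np.asarray(p_open_x)])  # p(y|x)
    p_y = p_x @ p_yx                                                # Eq 10
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(p_yx > 0, np.log2(p_yx / p_y), 0.0)
    return float(np.sum(p_x[:, None] * p_yx * log_ratio))           # Eq 9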

Although it is mathematically correct for p(x) to have its domain over all real numbers, this is a condition that can never be met in real life. In this case, potassium concentrations fall within a physiological range, and exceeding it simply causes cell death. This limit varies according to cell types and tissue location. Additionally, we discretize our environment by assuming that the ion concentration can only take on a finite number of values within that physiological interval.

For the channel capacity calculation we use the Blahut-Arimoto algorithm [40]. Its implementation in Python along with that of all equations above is available at the GitHub repository referenced in the S1 Appendix.
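For reference, a minimal Blahut-Arimoto sketch (our own compact version, not the repository code) alternates the two standard updates until the input distribution converges:

import numpy as np

def blahut_arimoto(p_y_given_x, tol=1e-10, max_iter=10_000):
    """Channel capacity (bits) of a discrete memoryless channel [40].

    p_y_given_x: array of shape (n_inputs, n_outputs) with rows summing to 1;
    entries are assumed strictly positive, as for the sigmoidal P(open) model.
    Returns (capacity_bits, capacity_achieving_input_distribution).
    """
    n_x = p_y_given_x.shape[0]
    p_x = np.full(n_x, 1.0 / n_x)                      # start from a uniform input
    for _ in range(max_iter):
        q = p_x[:, None] * p_y_given_x
        q /= q.sum(axis=0, keepdims=True)              # q(x|y) = p(x)p(y|x)/p(y)
        r = np.exp(np.sum(p_y_given_x * np.log(q), axis=1))
        p_new = r / r.sum()                            # updated input distribution
        if np.max(np.abs(p_new - p_x)) < tol:
            p_x = p_new
            break
        p_x = p_new
    p_y = p_x @ p_y_given_x
    capacity = np.sum(p_x[:, None] * p_y_given_x * np.log2(p_y_given_x / p_y[None, :]))
    return float(capacity), p_x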

Results

The influence of gating charge

Recalling the model obtained in Eq 7, the parameters that are expected to differentiate the responses of distinct types of Kv channels to their environment are T, ΔG° and q. Exploring these parameters is of interest since, should they play a significant role in the statistical mechanical model, they should also be significant for mutual information. As shown in Fig 2, changes in the gating charge and in ΔG° each impact the resulting curves in different ways. Surprisingly, changes in temperature (T) did not produce any noticeable changes, regardless of how greatly the value was varied within the range 298–400 K.

Fig 2. Changes in P(open) curves depend on standard free energy ΔG° with values of −10, −100, −1000, and −10000 kcal/mol (a), and increasing gating charge q with values of 3, 6, 12, 24, and 48 e (b).

The curves in (b) intersect at coordinate (181, 0.505). The [K+] domain of [0, 500] mM matches the dynamic range of the channel so as to show its full range of behavior.

https://doi.org/10.1371/journal.pone.0264424.g002

Shifts in the shape of the curve are produced by merely doubling the gating charge values, while it is necessary to increase ΔG° by orders of magnitude. The shifts corresponding to changing the free energy are expected: the more negative ΔG° is, the more energetically favorable it is for the channel to be in the open conformation. However, in a biological context, free energy values are very rarely below −100 kcal/mol. For comparison, the combustion of pure carbon to carbon dioxide, one of the most exergonic reactions in nature, has a free energy of −94 kcal/mol. The range of possible shapes that the P(open) curve could take is thus between the blue and orange curves in Fig 2, which show no outstanding variation.

However, realistic increases in the gating charge cause the curve to be increasingly sigmoidal. With high gating charges, the curve starts resembling the common charge-voltage (QV) curves obtained in patch-clamp experiments. Interestingly, there is a point where all curves have the same P(open), regardless of the gating charge. This happens precisely when half of the channels are open, matching the condition that defines the half-maximal activation voltage V1/2. We therefore highlight that although V1/2 is used to differentiate the QV curves of voltage-gated channels [8], it may not be informative about differences in gating charge, at least for Kv channels.

Mutual information in different environments

The impact of q values on the p(y|x) functions suggests that mutual information values may also depend strongly on the gating charge. We thus explore this dependence, but as Eq 10 shows, we must now consider the distribution of values in the environment p(x), where each x represents a value of [K+]o in the model used. We consider four different possible distributions for it: uniform, normal, exponential, and bimodal. The goal here is not to precisely portray real-life environments, but rather to evaluate how significantly the model behavior changes just by switching to a very different distribution. Recall that under an information-theoretic source coding approach, we would expect an organism to modify itself (in this case the Kv channel) to compress data from changing environments into a signal. Then, perhaps one strategy these channels could use to do so is modifying or optimizing their gating charges, as shown in Fig 3. The optimal mutual information values found are, however, significantly lower than channel capacity, the upper bound on information transmission. If mutual information were instead near channel capacity, then the channel’s sensitivity would be theoretically near-optimal, but this is not the case here (Fig 3b).
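A sketch of how Fig 3-style curves can be generated is shown below, reusing p_open_given_K and mutual_information_bits from the earlier sketches. The environment parameters follow the ranges quoted in the Fig 3 caption, but the V1/2 and the sweep limits are illustrative choices of ours.

import numpy as np
from scipy import stats

K = np.arange(0.5, 20.5, 0.5)                          # [K+]o grid, 0.5 mM bins

def normalize(w):
    return w / w.sum()

environments = {
    "uniform": normalize(np.ones_like(K)),
    "normal": normalize(stats.norm.pdf(K, loc=10.0, scale=5.0)),
    "exponential": normalize(stats.expon.pdf(K, scale=10.0)),
}

q_grid = np.arange(1.0, 30.0, 1.0)                     # gating charges to sweep
for name, p_x in environments.items():
    mi = [mutual_information_bits(p_x, p_open_given_K(K, q, V_half_mV=-60.0))
          for q in q_grid]
    print(f"{name}: optimal q = {q_grid[int(np.argmax(mi))]:.0f} e, "
          f"max MI = {max(mi):.2e} bits")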

Fig 3. There is a clear optimal gating charge that generates a maximum mutual information for four distinct environment distributions (a), but they are far from channel capacity (b).

There is also a certain gating charge that maximizes channel capacity. The environment x ranges from 0 to 20 mM in 0.5 mM intervals; mean = 10 mM, second mean (bimodal only) = 3 mM, standard deviation (normal and bimodal) = 5 mM. The maximum mutual information values are 0.0052 (uniform), 0.0048 (exponential), 0.0037 (bimodal), and 0.0029 (normal) bits. The maximum achievable mutual information (channel capacity) is 0.138 bits.

https://doi.org/10.1371/journal.pone.0264424.g003

In general, mutual information benefits from low noise, a large dynamic range, and (when the input distribution is reasonably smooth) a linear input-output function. For a binary output, the noise is lowest at the extremes (p ∼ 0, 1) and highest in the middle (p ∼ 1/2). Therefore, from Fig 2b, it seems that over the domain [0, 20] mM, at low sensitivity the model suffers from high noise, low range, and high non-linearity; at high sensitivity the model has low noise but low range and high non-linearity; while at intermediate sensitivity the model benefits from low noise, high range, and high linearity. This likely accounts for the optima in Fig 3.

The maximum mutual information values at these optima (and even the channel capacity) are, however, still very low. With a uniform environment, a value of 0.005 bits tells us that if we see a Kv channel that is open, the doubts we initially have about the environment are only reduced by 1 − 2^(−0.005), or about 0.34%. It seems not to be a very suitable sensor for the task it should do, but we will return to the discussion of these values later.

It is commonly thought that the greater the gating charge, the more sensitive the Kv channel [41]. One could easily assert that higher sensitivity makes a better sensor. However, these results suggest that there is a value of gating charge beyond which Kv channels do not improve their performance as sensors. Also, while the shape of the curves in Fig 3 is largely preserved across environments, no Kv channel has the gating charge that would supposedly maximize mutual information across all of them.

We also highlight that the channel capacity is remarkably low compared to those obtained in other biological systems [24, 42], and the sample distributions yield an even lower mutual information value. Indeed, the optimal input distribution would be markedly different from the sample ones (Fig 4). It appears to be almost the opposite to the normal distribution, which is the one that most accurately models potassium ionic environments. Hence, it is unlikely that such an optimal distribution is possible in a real environment.

Fig 4. Optimal p(x) input distribution.

The optimal input would favor the K+ extracellular concentrations at the extremes of the concentration domain. This is expected, as these concentrations favor high overall entropy of output but low conditional entropy of output given input.

https://doi.org/10.1371/journal.pone.0264424.g004

The evolution of mutual information

We now evaluate the mutual information values of Kv channels from an evolutionary approach. To have a well-rounded perspective on the different types of Kv channels that have evolved, we have chosen representative and well-studied types from the Shaker and KCNQ families (refer to Fig 5). We do not consider the Eag family because it has significant allosteric regulators other than gating currents [43], and hence does not meet the model’s assumption of sole dependence on [K+]. The considered Kv channel types, with their corresponding parameters and their evolutionary order, are shown in Table 2.

Fig 5. Evolution of potassium-specific ion channels during the time of appearance of 4 groups of organisms.

Data taken from Li et al. [35]. Kv channel families are indicated. Unicellular eukaryotes refer specifically to the choanoflagellates. Basal metazoans include Ctenophora (comb jellies) and Cnidaria (jellyfish, sea anemones, and corals). Branch length is not proportional to time.

https://doi.org/10.1371/journal.pone.0264424.g005

Table 2. Representative Kv channels across different families.

https://doi.org/10.1371/journal.pone.0264424.t002

We identified two environments representative of specific evolutionary eras (a primitive and a recent one), to compare how mutual information evolves across environment distributions defined with different parameters. Although the distributions are very different, the Kv channels end up showing common behaviors.

We identified these environments considering the evolutionary relationships between Kv channels, which had, at most, two evolutionary hotspot moments. The first corresponds to the narrow period between the emergence of Ctenophora (comb jellies) and Cnidaria (jellyfish) during the evolutionary time of the basal metazoans (Fig 5). It is known that most of the diversity of Kv channels appeared during that time, including the divergence of all the baseline voltage-gated family members [35]. Although these primitive marine animals had internal nervous systems, they were very thin and lacked impermeable or insulating layers. Ions therefore diffused easily, making the ionic environment in their nervous system essentially identical to that of the open ocean [48]. Hence, these marine [K+] distributions define the first environment.

The second hotspot occurred in very late bilaterian metazoans, specifically mammals and insects. The KCNQ family greatly diverged during this time period. The environment in which they did so corresponds to the developed bilaterian nervous system with highly regulated potassium concentrations. We took the cerebrospinal fluid (CSF) as the reference to define the second environment.

Defining p(x) with these parameters allows us to use the evolution of mutual information to evaluate possible evolutionary scenarios. First and foremost, if Kv channels had evolved to maximize mutual information, we should expect to always see increasing curves in Fig 6. If mutual information between ion concentration and channel state had been positively selected for, the graph would show steeper slopes between the most evolved Kv channels in the CSF environment than in the marine one: mutual information would be even more maximized in the evolved environments where the most recent Kv channels have been selected. These expected trends are evidently not observed. However, we highlight that a change in the steepness of the curve occurs in the normal and bimodal environments, where the most basal channel types are most greatly favored in their corresponding basal environments, although differences in mutual information are not significant in the evolved environment.

Fig 6. Mutual information in the evolutionary history of Kv channels.

Mutual information in the evolutionary history of Kv channels decreased in all environment distributions. The environment is discretized in 0.1 mM intervals. (a) Real-life conditions of a primitive environment during the first evolutionary hotspot. (b) Real-life conditions of an evolved environment during the second hotspot. Note the color correspondence with Fig 5. Note that the random, exponential, and uniform distributions are not affected in shape or steepness when changing the parameters. Kv2.1(H) is of mammalian type (Human) and Kv2.1(D) is invertebrate (Drosophila melanogaster).

https://doi.org/10.1371/journal.pone.0264424.g006

Despite having different parameters and distributions, the mutual information curves in both environments show similar changes in steepness. However, it is hard to believe that the predictive ability changes by the same degree when the Kv channels perceive concentrations only up to about 3 mM as when the range goes up to 20 mM, considering that an increase of only 2.5 mM in the human (nervous) environment already causes serious alterations to synaptic transmission and the frequency of electric discharges [47].

The decreasing trend, considering the accuracy of normally distributed concentrations [47, 49, 50], strongly suggests that maximizing mutual information was not the principle behind the evolution of Kv channels. As much as their biological task in nervous communication suggests that they should be optimal sensors, tracking their environment very well in their signals, this appears not to be the case.

Recalling Eq 5, two possible scenarios may lead to decreasing mutual information values: either H(Y) has decreased or H(Y|X) has increased during evolutionary history. It is possible that both values changed simultaneously; in either case, the resulting decrease in mutual information is correlated with the lower gating charges that emerged in Kv channels (Fig 7). The correlation is reasonable considering that the gating charge is known to determine channel sensitivity and hence the ability to convey information about the environment. However, why evolution would have selected for smaller gating charges is not immediately clear. We face here a scenario where evolution appears not to care about making signals more efficient, at least under what we have considered to represent an optimal response.

Fig 7. Decrease in mutual information is closely related to decrease in gating charge in Kv channel evolution.

The mutual information in a normally distributed environment (blue) is used as an example, but the correlation with gating charge is similar in all other environments.

https://doi.org/10.1371/journal.pone.0264424.g007

Even if maximizing mutual information is not relevant, there is a change in steepness when changing the parameters. Looking specifically at the realistic normal distribution, mutual information values do not greatly change when evolving in the CSF conditions (range from 0 to 0.00005), relative to their range in the marine environment (up to 0.005). Mutual information changes, despite being minimal, are maximized in marine conditions and favor, as expected, the most basal channel types. However, there is no significant advantage in CSF conditions for the most evolved channel types.

Looking further at the normal environment, going from marine to CSF conditions also significantly decreases the maximum mutual information (maxMI), by two orders of magnitude. Although the marine maxMI is still low, it is 100 times more informative than its CSF counterpart. Evidently, for two Kv channels with the same precision (e.g. each resolving ion concentrations in 0.1 mM steps), the signal produced in an environment that ranges up to 20 mM is a more specific, or less uncertain, “message” about the actual environment than one produced in an environment with a maximum of 4 mM.

In the 20 mM maximum case, detecting any concentration value with a 0.1 mM precision means that (ideally and in the simplest case) one state is differentiated from the other 199. In the 4 mM maximum case, one state is differentiated from only 39 other possible ones. In general, H(X) increases when the [K+] limit is higher, explaining why the marine maxMI is higher than that of the CSF environment. Although this supports that Kv channel performance (and arguably that of other biological sensors) is affected by their precision relative to the biophysical limit of their environment, it does not show that their evolution tends to maximize the information it can get given these constraints.
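As a rough back-of-the-envelope illustration of this point (ours, and assuming for simplicity uniform probabilities over the discretized states, which the actual marine and CSF distributions in Table 3 are not), the input entropies are bounded by

$$H_{\text{marine}}(X) \le \log_2 200 \approx 7.6\ \text{bits}, \qquad H_{\text{CSF}}(X) \le \log_2 40 \approx 5.3\ \text{bits},$$

so the marine environment offers roughly 2.3 bits more potential input entropy, consistent with the qualitative argument above.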

Applicability of the GHK equation

In our analysis, we evaluate the membrane potential based on the variation of one ion concentration. Although ignoring Na+ concentrations may seem imprecise at first, the GHK equation has been repeatedly used in this manner [51–59]. Further, here we demonstrate that, given the differences in membrane permeabilities for Na+ and K+, our approach leads to qualitative conclusions representative of reality.

K+ leak channels are the predominant type in neuronal cells [60] and they make the greatest contribution to the resting membrane potential in comparison to Cl− and Na+ leak channels. Neuronal membranes have a permeability for K+ significantly greater than for Cl− and Na+, with a ratio of 60:13:1, respectively [61]. If the variations in chloride and sodium concentrations were considered in our calculations, it would be reasonable to expect minor deviations from the membrane potential calculated with GHK using [K+] alone. This, in turn, would generate small variations in the mutual information, as shown in Fig 8. The overall variation in mutual information is not extraordinary. Given that Na+ extracellular concentrations range from 130–170 mM across animals in all evolutionary stages [62], the maximum variation in mutual information is <14%.
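This robustness check can be sketched by propagating an assumed Na+ concentration through the GHK helper and recomputing the mutual information; the channel parameters and normal environment below are the same illustrative placeholders used in the earlier sketches.

import numpy as np
from scipy import stats

K = np.arange(0.5, 20.5, 0.5)
p_x = stats.norm.pdf(K, loc=10.0, scale=5.0)
p_x /= p_x.sum()

for na_out in (130.0, 150.0, 170.0):   # mM, the extracellular Na+ range quoted above
    p_open = p_open_given_K(K, q=12.0, V_half_mV=-60.0, Na_out=na_out)
    print(f"[Na+]o = {na_out:.0f} mM -> MI = "
          f"{mutual_information_bits(p_x, p_open):.2e} bits")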

Fig 8. Variations in the mutual information value considering changes in sodium and chloride concentrations within their biophysical range.

The baseline K+ concentration was considered for a normally-distributed environment, since it is the most realistic distribution. Changes in MI are calculated in both (a) primitive conditions and (b) CSF conditions, as indicated in Table 3.

https://doi.org/10.1371/journal.pone.0264424.g008

Table 3. Parameters of marine and CSF environment to be used in the distribution categories.

https://doi.org/10.1371/journal.pone.0264424.t003

Therefore, the small effects of ion concentrations distinct from K+ on mutual information demonstrate that not considering sodium leads to acceptable and representative estimates. Although they will include an error margin, it seems to us from this analysis and from the review of the literature that the qualitative conclusions are likely to be accurate.

Discussion

Evaluating how good Kv channels are at making accurate predictions about their environment, using an information-theoretic approach, suggests that they did not evolve to maximize mutual information. Instead, their evolutionary history shows a steady decrease in mutual information that is strongly correlated with similarly decreasing gating charges. Regardless, showing what did not lead to their peculiar evolution still allows us to identify certain parameters that are crucial for their response, and provides insight into one of those cases where biological sensors do not need to be optimal.

It is worth highlighting that an evolutionary trend that does not seem to improve the performance of Kv channels is coherent with the high conservation seen between evolved and primitive Kv channel types. It is reasonable to suggest that during the massive diversification that these channels underwent when basal metazoans emerged, they developed sufficient sensitivity and ability to predict their environment, which is surprisingly far below channel capacity. Thereafter, there would have been no further need to maximize the acquired sensitivity, and the critical structural features were highly conserved over time.

If mutual information changes reflect random evolution, then the gating charge, which we show to be highly correlated with mutual information, should also have a certain randomness associated with it. However, even minimal changes in gating currents are the reason behind several nervous signaling dysfunctions [41, 43, 63]. Likewise, there is a clear trend across environments that shows an optimal gating charge for maximum mutual information.

Considering the ordered character of this last fact, we suggest a possible evolutionary scenario: random diversifications in protein domains (and hence gating charges) first generated functional Kv channel types, which then did not need to evolve significantly, but ended up incorporated hundreds of millions of years later into such a regulated microscopic nervous “system” that random alterations to Kv channel features now have a negative impact. By this “system”, we refer to the neuron membrane complex of protein transporters and pumps actively interacting with ions, regulators, and other cells nearby.

This could explain not only why random variations are so impactful now but not when Kv channels emerged; it also suggests that some component of the system may prevent the gating charge from increasing up to its optimal value. Possibly in a trade-off relationship, reaching the optimal gating charge destabilizes a feedback mechanism or some other transduction pathway that is not yet clear. One likely scenario is that the gating charge is restricted or regulated by other external factors, which is precisely the role that voltage-sensing phosphatases have [64], or by the interactions between gating charges and the surrounding lipids considered in the latest gating models (tilt-shift model [65], tethered hydrophobic cation sensor [66, 67]).

As a whole, we have to acknowledge that with the model we have used, just as in any other case that models a biological situation from an information-theoretic standpoint, we can never be sure that we have correctly identified what the organism (or in this case the Kv channel) wants to sense. Perhaps Kv channels are simply not best thought of as sensors. There may be further dependencies for potassium concentrations, and even significant feedback loops or cooperativity among the Kv channel subunits. Another possible approach is to consider what parts of the channel are evolutionarily conserved or changed between channels, and how these evolutionary signals might inform such an analysis. As well, the two-state model itself may behave differently than a multistate one. These are left as potential considerations for continuation of this work.

Conclusion

Biological sensors do not always need to maximize mutual information, and we have shown that this is the case for potassium voltage-gated channels. Their evolution seems to have been driven by numerous random diversifications when basal animals appeared, which left no need to further optimize their performance as sensors. Still, the gating charge is most likely a determinant feature of how well Kv channels sense. We find conserved tendencies towards a possibly optimal gating charge, which no Kv channel yet has, leading to possible evolutionary scenarios that may have caused this.

These tendencies persist even when the environment distribution changes, suggesting that Kv channels maintain a level of efficacy as sensors that is just good enough for them. While they cannot control their environment, they respond according to how much surprise they can find in it, and opt for the same range of parameters to perform well given these constraints.

References

  1. Armstrong CM, Hille B. Voltage-Gated Ion Channels and Electrical Excitability. Neuron. 1998;20(3):371–380. pmid:9539115
  2. Hille B. Ionic Channels of Excitable Membranes. Oxford University Press, Incorporated; 1992. Available from: https://books.google.com.co/books?id=-ZbuAAAAMAAJ.
  3. Choe S, Kreusch A, Pfaffinger PJ. Towards the three-dimensional structure of voltage-gated potassium channels. Trends in Biochemical Sciences. 1999;24(9):345–349. pmid:10470033
  4. Schmidt D, Cross SR, MacKinnon R. A Gating Model for the Archeal Voltage-Dependent K+ Channel KvAP in DPhPC and POPE:POPG decane lipid bilayers. Journal of molecular biology. 2009;390(5):902–912. pmid:19481093
  5. Milkman R. An Escherichia coli homologue of eukaryotic potassium channel proteins. Proceedings of the National Academy of Sciences of the United States of America. 1994;91(9):3510–3514. pmid:8170937
  6. Sharma T, Dreyer I, Riedelsberger J. The role of K+ channels in uptake and redistribution of potassium in the model plant Arabidopsis thaliana. Frontiers in Plant Science. 2013;4. pmid:23818893
  7. Moran Y, Barzilai MG, Liebeskind BJ, Zakon HH. Evolution of voltage-gated ion channels at the emergence of Metazoa. Journal of Experimental Biology. 2015;218(4):515–525. pmid:25696815
  8. Hodgkin AL, Huxley AF. A quantitative description of membrane current and its application to conduction and excitation in nerve. The Journal of Physiology. 1952;117(4):500–544. pmid:12991237
  9. Lodish H, Berk A, Zipursky SL, Matsudaira P, Baltimore D, Darnell J. The Action Potential and Conduction of Electric Impulses. Molecular Cell Biology 4th edition. 2000;.
  10. Zhang XC, Yang H, Liu Z, Sun F. Thermodynamics of voltage-gated ion channels. Biophysics Reports. 2018;4(6):300–319. pmid:30596139
  11. Catacuzzeno L, Franciolini F. Simulation of Gating Currents of the Shaker K Channel Using a Brownian Model of the Voltage Sensor. Biophysical Journal. 2019;117(10):2005–2019. pmid:31653450
  12. Sigworth FJ. Voltage gating of ion channels. Quarterly Reviews of Biophysics. 1994;27(1):1–40. pmid:7520590
  13. Sigg D, Bezanilla F. Total Charge Movement per Channel. The Journal of General Physiology. 1997;109(1):27–39.
  14. Seoh SA, Sigg D, Papazian DM, Bezanilla F. Voltage-Sensing Residues in the S2 and S4 Segments of the Shaker K+ Channel. Neuron. 1996;16(6):1159–1167. pmid:8663992
  15. Kalstrup T, Blunck R. S4–S5 linker movement during activation and inactivation in voltage-gated K+ channels. Proceedings of the National Academy of Sciences. 2018;115(29):E6751–E6759. pmid:29959207
  16. Yang H, Gao Z, Li P, Yu K, Yu Y, Xu TL, et al. A Theoretical Model for Calculating Voltage Sensitivity of Ion Channels and the Application on Kv1.2 Potassium Channel. Biophysical Journal. 2012;102(8):1815–1825. pmid:22768937
  17. Kubisch C, Schroeder BC, Friedrich T, Lütjohann B, El-Amraoui A, Marlin S, et al. KCNQ4, a Novel Potassium Channel Expressed in Sensory Outer Hair Cells, Is Mutated in Dominant Deafness. Cell. 1999;96(3):437–446. pmid:10025409
  18. Monod J, Wyman J, Changeux JP. On the Nature of Allosteric Transitions: A Plausible Model. Journal of Molecular Biology. 1965;12:88–118. pmid:14343300
  19. Marzen S, Garcia HG, Phillips R. Statistical Mechanics of Monod–Wyman–Changeux (MWC) Models. Journal of Molecular Biology. 2013;425(9):1433–1460. pmid:23499654
  20. Chowdhury S, Chanda B. Estimating the voltage-dependent free energy change of ion channels using the median voltage for activation. The Journal of General Physiology. 2012;139(1):3–17. pmid:22155736
  21. Pedersen TH, Riisager A, de Paoli FV, Chen TY, Nielsen OB. Role of physiological ClC-1 Cl- ion channel regulation for the excitability and function of working skeletal muscle. The Journal of General Physiology. 2016;147(4):291–308. pmid:27022190
  22. Hou P, Shi J, White KM, Gao Y, Cui J. ML277 specifically enhances the fully activated open state of KCNQ1 by modulating VSD-pore coupling. eLife. 2019;8. pmid:31329101
  23. Shannon CE. A Mathematical Theory of Communication. Bell System Technical Journal. 1948;27(3):379–423.
  24. Adami C. Information theory in molecular biology. Physics of Life Reviews. 2004;1(1):3–22.
  25. Paninski L. Estimation of entropy and mutual information. Neural Computation. 2003;15(6):1191–1253.
  26. Cheong R, Rhee A, Wang CJ, Nemenman I, Levchenko A. Information transduction capacity of noisy biochemical signaling networks. Science (New York, NY). 2011;334(6054):354–358. pmid:21921160
  27. Levchenko A, Nemenman I. Cellular noise and information transmission. Current Opinion in Biotechnology. 2014;28:156–164.
  28. Kinney JB, Atwal GS. Equitability, mutual information, and the maximal information coefficient. Proceedings of the National Academy of Sciences. 2014;111(9):3354–3359. pmid:24550517
  29. Tkacik G, Callan CG, Bialek W. Information flow and optimization in transcriptional regulation. Proceedings of the National Academy of Sciences. 2008;105(34):12265–12270. pmid:18719112
  30. MacKay DM, McCulloch WS. The limiting information capacity of a neuronal link. The bulletin of mathematical biophysics. 1952;14(2):127–135.
  31. Shannon CE. The bandwagon. IRE Transactions on Information Theory. 1956;2(1):3.
  32. Palmer SE, Marre O, Berry MJ, Bialek W. Predictive information in a sensory population. Proceedings of the National Academy of Sciences. 2015;112(22):6908–6913.
  33. Prindle A, Liu J, Asally M, Ly S, Garcia-Ojalvo J, Süel GM. Ion channels enable electrical communication within bacterial communities. Nature. 2015;527(7576):59–63. pmid:26503040
  34. González C, Baez-Nieto D, Valencia I, Oyarzún I, Rojas P, Naranjo D, et al. K(+) channels: function-structural overview. Comprehensive Physiology. 2012;2(3):2087–2149. pmid:23723034
  35. Li X, Liu H, Luo JC, Rhodes SA, Trigg LM, v Rossum DB, et al. Major diversification of voltage-gated K+ channels occurred in ancestral parahoxozoans. Proceedings of the National Academy of Sciences. 2015;112(9):E1010–E1019. pmid:25691740
  36. Jegla TJ, Zmasek CM, Batalov S, Nayak SK. Evolution of the human ion channel set. Combinatorial Chemistry High Throughput Screening. 2009;12(1):2–23. pmid:19149488
  37. Miller C. An overview of the potassium channel family. Genome Biology. 2000;1(4):reviews0004.1–reviews0004.5. pmid:11178249
  38. Hodgkin AL, Katz B. The effect of sodium ions on the electrical activity of the giant axon of the squid. The Journal of Physiology. 1949;108(1):37–77. pmid:18128147
  39. Cover TM. Elements of information theory. John Wiley & Sons; 1999.
  40. Arimoto S. An algorithm for computing the capacity of arbitrary discrete memoryless channels. IEEE Transactions on Information Theory. 1972;18(1):14–20.
  41. Perozo E, Papazian DM, Stefani E, Bezanilla F. Gating currents in Shaker K+ channels. Implications for activation and inactivation models. Biophysical Journal. 1992;62(1):160–171. pmid:1600094
  42. Dubuis JO, Tkacik G, Wieschaus EF, Gregor T, Bialek W. Positional information, in bits. Proceedings of the National Academy of Sciences of the United States of America. 2013;110(41):16301–16308. pmid:24089448
  43. Barros F, de la Peña P, Domínguez P, Sierra LM, Pardo LA. The EAG Voltage-Dependent K+ Channel Subfamily: Similarities and Differences in Structural Organization and Gating. Frontiers in Pharmacology. 2020;11. pmid:32351384
  44. Islas LD, Sigworth FJ. Voltage Sensitivity and Gating Charge in Shaker and Shab Family Potassium Channels. The Journal of General Physiology. 1999;114(5):723–742. pmid:10539976
  45. Schoppa NE, McCormack K, Tanouye MA, Sigworth FJ. The size of gating charge in wild-type and mutant Shaker potassium channels. Science (New York, NY). 1992;255(5052):1712–1715. pmid:1553560
  46. Dougherty K, Covarrubias M. A Dipeptidyl Aminopeptidase–like Protein Remodels Gating Charge Dynamics in Kv4.2 Channels. The Journal of General Physiology. 2006;128(6):745–753. pmid:17130523
  47. Somjen GG. Ion Regulation in the Brain: Implications for Pathophysiology. The Neuroscientist. 2002;8(3):254–267. pmid:12061505
  48. Bayrami A, Abtahi B, Talebi E, Rahim Pouran S. A Perusal on Ion Regulation and Osmotic Pressure in the Mnemiopsis Leidyi Existing in Caspian Sea. Australian Journal of Basic and Applied Sciences. 2010;4:4448–4452.
  49. Worster MG, Wettlaufer JS. Natural Convection, Solute Trapping, and Channel Formation during Solidification of Saltwater. The Journal of Physical Chemistry B. 1997;101(32):6132–6136.
  50. Armstrong GAB, Rodríguez EC, Meldrum Robertson R. Cold hardening modulates K+ homeostasis in the brain of Drosophila melanogaster during chill coma. Journal of Insect Physiology. 2012;58(11):1511–1516. pmid:23017334
  51. Schewe M, Nematian-Ardestani E, Sun H, Musinszki M, Cordeiro S, Bucci G, et al. A Non-canonical Voltage-Sensing Mechanism Controls Gating in K2P K+ Channels. Cell. 2016;164(5):937–949. pmid:26919430
  52. Larsen AP, Steffensen AB, Grunnet M, Olesen SP. Extracellular Potassium Inhibits Kv7.1 Potassium Channels by Stabilizing an Inactivated State. Biophysical Journal. 2011;101(4):818–827. pmid:21843472
  53. Vandendriessche T, Abdel-Mottaleb Y, Maertens C, Cuypers E, Sudau A, Nubbemeyer U, et al. Modulation of voltage-gated Na+ and K+ channels by pumiliotoxin 251D: A “joint venture” alkaloid from arthropods and amphibians. Toxicon. 2008;51(3):334–344. pmid:18061227
  54. Manville RW, Neverisky DL, Abbott GW. SMIT1 Modifies KCNQ Channel Function and Pharmacology by Physical Interaction with the Pore. Biophysical Journal. 2017;113(3):613–626. pmid:28793216
  55. Granitzer M, Nagel W, Crabbé J. Voltage dependent membrane conductances in cultured renal distal cells. Biochimica et Biophysica Acta (BBA)—Biomembranes. 1991;1069(1):87–93. pmid:1657165
  56. Chang DC. Is the K permeability of the resting membrane controlled by the excitable K channel? Biophysical Journal. 1986;50(6):1095–1100. pmid:2432949
  57. Greeff NG, Kühn FJP. Variable Ratio of Permeability to Gating Charge of rBIIA Sodium Channels and Sodium Influx in Xenopus Oocytes. Biophysical Journal. 2000;79(5):2434–2453. pmid:11053121
  58. Sands SB, Costa AC, Patrick JW. Barium permeability of neuronal nicotinic receptor alpha 7 expressed in Xenopus oocytes. Biophysical Journal. 1993;65(6):2614–2621. pmid:8312496
  59. Wang CT, Zhang HG, Rocheleau TA, ffrench Constant RH, Jackson MB. Cation Permeability and Cation-Anion Interactions in a Mutant GABA-Gated Chloride Channel from Drosophila. Biophysical Journal. 1999;77(2):691–700. pmid:10423418
  60. Alberts B, Johnson A, Lewis J, Raff M, Roberts K, Walter P. Ion Channels and the Electrical Properties of Membranes. Molecular Biology of the Cell 4th edition. 2002;.
  61. Michaelis L. Contribution to the theory of permeability of membranes for electrolytes. The Journal of General Physiology. 1925;8(2):33–59. pmid:19872189
  62. Lenter DK. Blood—Inorganic substances. Scientific Tables. 1970;Seventh Edition:561–568.
  63. Miceli F, Vargas E, Bezanilla F, Taglialatela M. Gating Currents from Kv7 Channels Carrying Neuronal Hyperexcitability Mutations in the Voltage-Sensing Domain. Biophysical Journal. 2012;102(6):1372–1382. pmid:22455920
  64. Li Q, Wanderling S, Sompornpisut P, Perozo E. Structural basis of lipid-driven conformational transitions in the KvAP voltage-sensing domain. Nature Structural & Molecular Biology. 2014;21(2):160–166. pmid:24413055
  65. Kim DM, Nimigean CM. Voltage-Gated Potassium Channels: A Structural Examination of Selectivity and Gating. Cold Spring Harbor Perspectives in Biology. 2016;8(5):a029231. pmid:27141052
  66. Jiang Y, Lee A, Chen J, Ruta V, Cadene M, Chait BT, et al. X-ray structure of a voltage-dependent K+ channel. Nature. 2003;423(6935):33–41. pmid:12721618
  67. Ruta V, Chen J, MacKinnon R. Calibrated measurement of gating-charge arginine displacement in the KvAP voltage-dependent K+ channel. Cell. 2005;123(3):463–475. pmid:16269337