
A simple parametric representation of the Hodgkin-Huxley model

  • Alejandro Rodríguez-Collado,

    Contributed equally to this work with: Alejandro Rodríguez-Collado, Cristina Rueda

    Roles Data curation, Methodology, Writing – original draft, Writing – review & editing

    Affiliation Department of Statistics and Operations Research, Universidad de Valladolid, Valladolid, Spain

  • Cristina Rueda

    Contributed equally to this work with: Alejandro Rodríguez-Collado, Cristina Rueda

    Roles Methodology, Writing – original draft, Writing – review & editing

    Affiliation Department of Statistics and Operations Research, Universidad de Valladolid, Valladolid, Spain


The Hodgkin-Huxley model, decades after its first presentation, is still a reference model in neuroscience as it has successfully reproduced the electrophysiological activity of many organisms. The primary signal in the model represents the membrane potential of a neuron. A simple representation of this signal is presented in this paper. The new proposal is an adapted Frequency Modulated Möbius multicomponent model defined as a signal plus error model in which the signal is decomposed as a sum of waves. The main strengths of the method are the simple parametric formulation, the interpretability and flexibility of the parameters that describe and discriminate the waveforms, the estimators’ identifiability and accuracy, and the robustness against noise. The approach is validated with a broad simulation experiment of Hodgkin-Huxley signals and real data from squid giant axons. Interesting differences between simulated and real data emerge from the comparison of the parameter configurations. Furthermore, the potential of the FMM parameters to predict Hodgkin-Huxley model parameters is shown using different Machine Learning methods. Finally, promising contributions of the approach in Spike Sorting and cell-type classification are detailed.


Neuroscience is an interdisciplinary science that studies the cellular, functional, behavioral, evolutionary, computational, molecular, and medical aspects of the nervous system. Many specialists from different areas of knowledge, such as physicists, chemists, mathematicians, computer engineers, and psychologists, have contributed to the field. The mathematical approach is one of the most fruitful, particularly in the study of the electrophysiological activity between neurons. The signal that has received most of the attention is the neuron membrane potential, which is the difference in electric potential between the cell’s interior and exterior. This signal is composed of various Action Potential curves (APs). A single AP lasts a few milliseconds and consists of 3 stages: depolarization, repolarization, and hyperpolarization. APs are of special importance for researchers: they are the informational unit between neurons, and their number and shape determine the morphological, functional, and genetic profile of the cell. For more detail, see [1–5].

According to their dynamic behavior, neurons can be classified as excitable (generating an individual AP) or oscillatory (generating repetitive APs). Neurons that do not generate APs are called non-excitable. Neuronal cells of a similar type typically exhibit similar behaviors. For instance, cardiac myocytes are usually oscillatory, while cortical neurons are mostly excitable. Furthermore, oscillatory neurons have been sub-classified in this work by the observed number of APs. A signal in which several APs are observed is known as a Spike Train.

The most broadly considered mathematical model for describing AP dynamics is the Hodgkin-Huxley (HH) model, presented in [6]. More than half a century later, this model remains key in neuroscience due to its innovative concept of modeling neuronal dynamics as a system of Ordinary Differential Equations (ODE) and its accurate representation of electrophysiological neuronal activity. Specifically, the neuron’s membrane potential is stated to behave like an electrical circuit with various currents associated with three types of ions: sodium (Na+), potassium (K+), and a non-specific leak current, mainly due to the influx of chloride (Cl−). However, the HH model lacks identifiability, as various parameter configurations can lead to the same observed signal. In addition, the model is not robust, as minor changes in the values of the parameters can alter its output completely ([7, 8]).

Many models developed afterward are either simplifications or extensions of the HH model. Some emulate the HH model in being biophysically realistic, whereas others seek simpler formulations. Among the first group are the Hopfield model and extensions of the Van der Pol oscillator, such as the FitzHugh-Nagumo model. In the second group, some popular choices are the family of leaky integrate-and-fire models and the Izhikevich model. Some basic references for these models are [9–13]. All the above are mechanistic models. The counterpart of mechanistic models is the data-driven approach. Models based on data science, statistics, and Machine Learning are increasingly popular due to the rise in data availability and quality, as noted in [14]. Our proposal can be framed within this class of models.

The Frequency Modulated Möbius (FMM) approach and others, such as the Fourier method, belong to the class of amplitude modulated-frequency modulated decompositions. A general overview of these decompositions and of time-frequency signal analysis can be found in [15, 16]. In particular, the FMM decomposition assumes a constant amplitude and a frequency modeled as a Möbius transformation. The monocomponent FMM model is presented in [17], which shows how it accurately fits a wide variety of oscillatory patterns. The multicomponent FMM model is introduced in [18], where its potential in neuroscience is concisely demonstrated. Moreover, an exciting application to the automatic analysis of electrocardiograms is presented in [19].

This paper’s main goal is to show that the FMM model faithfully represents the waveforms of AP signals derived from an HH model. To that end, a new FMM model is presented, denoted FMMST, where ST stands for Spike Train. It is an FMM multicomponent model with restrictions on the parameters. Specifically, the model assumes that the Spike Train is the concatenation of a fixed number of successive spikes with the same shape, each one described by a bi-component FMM.

The approach is validated with a broad simulation experiment and real data. In the first case, a total of 5000 HH signals, corresponding to a wide variety of parameter configurations, have been generated according to a factorial design for the most relevant HH parameters. It is shown how the FMMST model accurately predicts the simulated signals across all the parameter configurations. In the second case, signals from the Squid Giant Axon Membrane Potential (SGAMP) database, originally from [20], have been analyzed. These signals have been chosen because they originally inspired the HH model definition [6]. Interesting differences between the simulated and real data emerge from the comparison of the parameter configurations.

Furthermore, the simulated data is used to illustrate the potential of the FMM parameters to predict HH model parameters using different Machine Learning methods. It is shown how specific HH parameters, with relevant physiological interpretation, can be accurately predicted from the FMM parameters, which is important in real scenarios where the underlying HH model is unknown.


HH model

The presentation in this section follows those in [21, 22]. The notation for HH variables and parameters is slightly changed from the one used in these papers to avoid confusion with other terms introduced later in the paper.

The HH model is defined, see Definition 1 below, by a nonlinear ODE system for four state variables: the membrane potential (X) and the three opening probabilities of the ion gates (m, n, h). Furthermore, X depends on the input stimulus I(t) generated by other neurons’ post-synaptic currents. In turn, the variables m, n and h are referred to as voltage-gated channels, as they depend on the membrane potential through the six ion gate transition functions denoted by aj and bj, j ∈ {m, n, h}, as explained in [12].

Definition 1. Hodgkin-Huxley (HH) model.

The first equation in Definition 1 depends on the parameters of the cell capacitance (C), the maximum conductances (gK, gNa, gL) and the equilibrium potentials (VK, VNa, VL) of the ionic currents. The six ion gate transition functions increase the number of parameters to more than twenty. Hence, the parametric space of the model can be simplified by replacing the ion gate transition functions am(X), an(X), ah(X), bm(X), bn(X), and bh(X) with constants, as done in [5, 7].
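As an illustration, the HH system can be integrated numerically with a simple forward-Euler scheme. The sketch below uses the classic squid-axon rate functions and parameter values from [6], which are an assumption here (the paper replaces the transition functions with constants); the function name and its defaults are illustrative.

```python
import numpy as np

# Classic HH ion gate transition functions (textbook forms, assumed here)
def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

def simulate_hh(I_amp=10.0, t_on=10.0, t_off=11.0, T=60.0, dt=0.01,
                C=1.0, gNa=120.0, gK=36.0, gL=0.3,
                VNa=50.0, VK=-77.0, VL=-54.4):
    """Forward-Euler integration of the 4-variable HH system.
    A square current pulse of amplitude I_amp is applied on [t_on, t_off)."""
    steps = int(round(T / dt))
    t = np.arange(steps) * dt
    X = np.empty(steps)                      # membrane potential trace
    V = -65.0
    # gates start at their steady-state values for the resting potential
    m = a_m(V) / (a_m(V) + b_m(V))
    h = a_h(V) / (a_h(V) + b_h(V))
    n = a_n(V) / (a_n(V) + b_n(V))
    for i in range(steps):
        X[i] = V
        I = I_amp if t_on <= t[i] < t_off else 0.0
        INa = gNa * m**3 * h * (V - VNa)     # sodium current
        IK = gK * n**4 * (V - VK)            # potassium current
        IL = gL * (V - VL)                   # leak current
        V += dt * (I - INa - IK - IL) / C
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    return t, X
```

A sufficiently strong 1 ms pulse elicits a spike that overshoots 0 mV, while the unstimulated trace stays near the resting potential.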

An interesting simplification of the HH parametric space is the pair that has been recently considered in [7] and is defined as follows: (1)

The aforementioned paper explains the properties of the pair. In particular, they claim that the neuron’s excitability phenomenon is essentially bidimensional, being determined by the structure of the neuron as a cell and its ionic current kinetics. While captures the neuron’s structural information, represents the kinetics of the ionic gates. Moreover, are less sensitive to slight changes in the signal than the primary HH parameters. However, such a drastic reduction in dimensions does not give a complete representation of the model.

FMM approach

Let ti, i = 1, …, n denote the vector of observed time points and X(ti) the observed data, which in this paper is the potential difference across the neuron’s membrane. It is assumed that the time points are in [0, 2π). Otherwise, consider t′ ∈ [t0, T + t0] with t0 as the initial time value and T as the period. Transform the time points by .
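In code, the rescaling onto [0, 2π) is presumably the linear map below (a one-line sketch; the explicit formula is our assumption, consistent with the stated interval):

```python
import numpy as np

def to_circular_time(t_prime, t0, T):
    """Map observed times t' in [t0, t0 + T] onto [0, 2*pi)
    (assumed linear rescaling: t = 2*pi*(t' - t0)/T)."""
    return 2.0 * np.pi * (np.asarray(t_prime, dtype=float) - t0) / T
```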

Let υ = (A, α, β, ω)′ be the four-dimensional parameter vector describing a monocomponent FMM signal, defined as the following wave: W(t, υ) = Acos(ϕ(t, α, β, ω)), where A is the amplitude and the phase ϕ is given in (2). The phase is defined in terms of α, β and ω, parameters that determine the location, skewness and kurtosis, respectively. More details about the parameters can be found in [17].
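A minimal sketch of the monocomponent wave follows. The explicit phase expression is not reproduced above, so the Möbius-type link used here, ϕ(t) = β + 2 arctan(ω tan((t − α)/2)), is taken from the FMM literature [17] and should be treated as an assumption:

```python
import numpy as np

def fmm_wave(t, A, alpha, beta, omega):
    """Monocomponent FMM wave W(t) = A*cos(phi(t)).
    Phase uses the Mobius link assumed from the FMM literature:
    phi(t) = beta + 2*arctan(omega * tan((t - alpha)/2)), omega in (0, 1]."""
    t = np.asarray(t, dtype=float)
    phi = beta + 2.0 * np.arctan(omega * np.tan((t - alpha) / 2.0))
    return A * np.cos(phi)
```

With ω = 1 the phase is linear and the wave reduces to a plain cosinusoid; smaller ω sharpens the wave into a spike-like shape.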

The FMM approach relies upon a signal plus error model, as follows: (3) where the error term is assumed to be (e(t1), …, e(tn))′ ∼ Nn(0, σ2 I) and M is an intercept parameter. Note that this parameter does not represent the signal’s baseline level but its level at t0 minus the sum of the wave values at t0.

The papers [18, 19] consider particular FMM models, show the broad range of signals that the model represents, provide properties, interpret the parameters, and detail the algorithm used to fit the models. In particular, [18] concisely shows the model’s potential in neuroscience.

Depending on the application, the waves of the model represent different physiological processes. For instance, in the ECG case, the waves of the FMM model represent the five fundamental ECG upward and downward deflections, which are universally named P, QRS complex (a wave complex), and T.

The AP signals are modeled using two (or, in some cases, three) waves, the first being much more relevant than the rest. This wave, called the dominant wave, identifies when the neuron spikes, allows the AP’s approximate reconstruction in the presence of noise, and enables the detection and identification of overlapping spikes. Moreover, theoretical properties are derived for the dominant wave. All of this is shown in [18].

Fig 1(A) shows FMM wave patterns by plotting W(t, υ) against t for different parameter configurations υ. In Fig 1(B) (left), an AP signal from an excitable neuron (grey dots) is represented along with the corresponding fitted FMM model (in red), whereas in Fig 1(B) (right), W(t, υJ), J = 1, 2, is plotted against t.

Fig 1.

(A) FMM wave patterns for different parameter configurations. (B) Typical AP from an excitable neuron and corresponding fitted FMM model (left). Waves composing the FMM model (right).

The FMMST model is a particular multicomponent FMM model, where restrictions on the parameters are imposed. The model is defined as follows:

Definition 2. FMMST Model where,

  • M ∈ ℜ; ; S = 1, …, s,
  • ; S = 1, …, s
  • and ; S = 2, …, s.
  • and ; S = 2, …, s.
  • (e(t1), …, e(tn))′ ∼ Nn(0, σ2 I),

where s is the number of spikes, which is assumed to be known and can be easily estimated with a naive method, as detailed in the estimation algorithm section.

The model also assumes that each spike is modelled with two waves with parameters υA and υB, respectively. Other parameters of interest in practice are the distance between the A and B waves for a given AP, defined as , and the distance between consecutive APs, . The inclusion of restrictions between the parameters is a specific feature of the FMM models, and their role is twofold. On the one hand, parameter identifiability is attained by including the restriction A > 0 in the monocomponent case and, in addition, the restrictions between the αs and As in the multicomponent case. Accordingly, the number of free parameters of the FMMST model is 1 + 4s + 4. On the other hand, the restrictions on the ωs and βs, which represent the equal spike shapes, provide physiologically interpretable solutions.

Furthermore, depending on the application at hand, additional restrictions may be imposed in order to use a simpler model. In particular, the following restrictions on the A parameters, (4) are suitable in signals without APs generated at the beginning and/or end of the stimulus application, as these may exhibit different amplitudes than the rest of the APs. These restrictions have been considered in the SGAMP data analysis. In this case, the number of free parameters is reduced to 1 + 2s + 6.

Finally, in cases in which the distances between consecutive spikes are assumed to be constant, for instance in controlled experiments without stimulus changes, it would be fair to assume . Moreover, if the time lapse between the AP’s spike and the end of the hyperpolarization is constant, an additional set of restrictions may be imposed: . The number of free parameters in the first case is 1 + s + 8, whereas in the second it is 1 + 1 + 8.

Estimation algorithm.

The FMM models, including the defined restrictions, are implemented in the R package FMM, first presented in [23]. It is assumed that the segments to be analyzed represent complete spikes; in particular, X(t1) ≃ X(tn). The number of spikes s is easily determined by a naive threshold-based method, as proposed in [24]. This threshold is k = 2.5σX, σX being the sample standard deviation of the observed data.
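A sketch of the naive threshold count might look as follows; how the threshold is referenced to the signal level and how crossings are counted are our assumptions:

```python
import numpy as np

def estimate_num_spikes(x, k=2.5):
    """Naive spike count: number of upward crossings of the threshold
    mean(x) + k*std(x). Referencing the threshold to the mean and
    counting upward crossings are illustrative assumptions."""
    x = np.asarray(x, dtype=float)
    thr = x.mean() + k * x.std()
    above = x > thr
    # an upward crossing is a False -> True transition
    return int(np.sum(~above[:-1] & above[1:]))
```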

Occasionally, two different parameter configurations represent a given signal equally well, yet one is physiologically more plausible. In that case, additional restrictions on the parameters are needed to guarantee that the solution is the one expected. In the case of SGAMP signals, it is assumed that a spike has a prominent dominant wave, which means that AA − AB > C, for a given threshold C.

Validation measure.

The goodness of fit of the model is measured with an R2 statistic, which is the proportion of the variance explained by a model out of the total variance, as follows: (5) where represents the fitted value at ti, i = 1, …, n. In this paper, , , refer to the R2 value for the FMMST model, the FMMST model with restrictions on the As, the FMMs model with restrictions on βs and ωs, and the FMMs* model with restrictions on the As, respectively.
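The R2 statistic in Eq (5) can be computed directly:

```python
import numpy as np

def r_squared(x, x_hat):
    """Proportion of the total variance explained by the fitted values:
    R2 = 1 - SS_residual / SS_total."""
    x, x_hat = np.asarray(x, dtype=float), np.asarray(x_hat, dtype=float)
    return 1.0 - np.sum((x - x_hat) ** 2) / np.sum((x - x.mean()) ** 2)
```

A perfect fit gives R2 = 1, while predicting the sample mean everywhere gives R2 = 0.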

Machine Learning Supervised methods.

Several supervised Machine Learning methods have been considered in the paper. At one end, simple Linear Regression (LR) serves as a benchmark. At the other extreme, the complex, “black box” Support Vector Machine with RBF kernel (SVM) has been shown to achieve excellent results in neuronal dynamics, as seen in [7, 13], among others. Random Forest (RF) and Gradient Boosting Machines (GBM), complex methods that provide interpretable results between the underfitting LR and the overfitting SVM, have also been considered. [25, 26] are essential references on these procedures. The R packages [27–30], together with the auxiliary learning package caret [31], have been used to implement them.
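For readers working in Python, the same comparison can be sketched with scikit-learn analogues of these procedures. The feature matrix and target below are synthetic placeholders, not the FMM/HH quantities of the paper:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# hypothetical design matrix of predictors and a nonlinear target
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + 0.1 * rng.normal(size=200)

models = {
    "LR": LinearRegression(),
    "RF": RandomForestRegressor(n_estimators=100, random_state=0),
    "GBM": GradientBoostingRegressor(random_state=0),
    "SVM": SVR(kernel="rbf"),
}
# mean ten-fold cross-validated R2 per procedure
scores = {name: cross_val_score(m, X, y, cv=10, scoring="r2").mean()
          for name, m in models.items()}
```

On a nonlinear target such as this one, LR underperforms the flexible learners, mirroring the pattern reported in the paper.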

Programming languages.

The simulation experiment has been developed combining the programming languages Python and R, which are probably the most used programming languages in data science. Python is used for data acquisition and transformation, while R fits the FMM models. Several solutions for coupling the two languages have been studied that could, at the same time, be computationally effective, robust, and simple. A basic outline of the matter is presented in [32]. While certain libraries provide tools for coupling the two languages, such as [33–35], these solutions are not sufficiently refined, and bash scripting was finally used.
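A minimal sketch of such a script-mediated coupling, with data exchanged through CSV files, is shown below; the R script name and file layout are hypothetical:

```python
import subprocess
import csv
import tempfile
import os

def fit_fmm_via_r(signal, r_script="fit_fmm.R"):
    """Write the signal to a CSV, invoke an R script (hypothetical name)
    that would fit the model with the FMM package, and read back the
    fitted values. Mirrors the shell-mediated coupling described above."""
    with tempfile.TemporaryDirectory() as d:
        inp = os.path.join(d, "signal.csv")
        out = os.path.join(d, "fit.csv")
        with open(inp, "w", newline="") as f:
            csv.writer(f).writerows([[v] for v in signal])
        # Rscript must be on PATH; check=True surfaces R-side failures
        subprocess.run(["Rscript", r_script, inp, out], check=True)
        with open(out) as f:
            return [float(row[0]) for row in csv.reader(f)]
```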


Simulation experiment design

In the first stage, Python simulates AP signals using a modified HH model implementation based on the one available in the Neurodynex package [21]. The original implementation offers several features, such as a detailed evolution of the voltage-gated variables m, n, h, or the application of input stimuli of different amplitudes and shapes. A modification was implemented to facilitate changes in the model parameters. The analyzed signal spans 60 ms. A short square input stimulus I, lasting just 1 ms, has been applied at the tenth ms of the simulation. See [36] to learn about the most used stimulus types. Five values for the strength of the stimulus have been selected, I = {0, 4.5, 7, 9.5, 12} μA, and 1000 experiments were conducted for each value of I, leading to a total of 5000 experiments. Different configurations of the most influential parameters have been considered according to a factorial experiment design. In every experiment, each of these parameters takes a random value from a set of preselected values. These values have been experimentally observed in nature, as described in [7] and detailed in Table 1.

Table 1. Parameters varied in the Hodgkin-Huxley simulations according to the factorial experiment design.

For the values in Table 1, have been calculated with Eq (1) to be represented in Fig 2 across stimulus intensities by neuron type. While Fig 2(C) reproduces the one in [7], the rest of the planes generated by other stimulus intensity strengths are a novelty of this work, showing how the application of a stronger stimulus generates more excitable and oscillatory experiments.

In a second stage, using R, the signals are preprocessed to ensure that complete APs are analyzed. We proceed as follows: the first AP is discarded if it occurs before the input stimulus has been applied; resting potential values are substituted for the discarded data to obtain segments of equal length for all the trials. An analogous procedure is applied if the last AP is observed in the last 6 ms of the experiment. In the exceptional case where only a pre-stimulus AP is observed, the original signal is analyzed. It is relevant to note that discarding the first and/or last AP in the experiments is not a limitation of the approach, as we are assuming equally-shaped APs.
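The first preprocessing rule can be sketched as follows; the spike-detection threshold and resting value used here are illustrative assumptions:

```python
import numpy as np

def pad_pre_stimulus_ap(x, stim_idx, rest=-65.0, thr=0.0):
    """If a spike (any sample above `thr`) occurs before the stimulus
    onset index, replace the pre-stimulus segment with the resting
    potential so all trials keep equal length. Sketch only: the
    threshold and the resting value are illustrative."""
    x = np.asarray(x, dtype=float).copy()
    if np.any(x[:stim_idx] > thr):
        x[:stim_idx] = rest
    return x
```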

In a third step, an FMMST model is fitted to the data using the FMM package.

Simulation experiment results: FMM results

Of the 5000 simulated HH experiments, the 3613 with at least one AP have been analyzed using the FMM approach. In Fig 3, representative signals with one to four spikes are plotted along with the FMMST model predictions. There is a relevant dominant wave that represents the depolarization and repolarization, and a second wave that accounts for the hyperpolarization. A summary of the main statistics and parameter estimates is given in Table 2. The values in the table show the high prediction accuracy. In particular, the global mean (standard deviation) is equal to 0.8527 (0.0579), and the is equal to 0.9877 (0.0053). These values quantify what Fig 3 shows. Table 2 also shows interesting differences in parameter configurations between signals with a different number of APs.

Table 2. Means and standard deviations for R2 values and parameter estimators.

Fig 3.

Neuronal signals simulated with the HH model and the estimated FMMST signals in red (left). Waves of the fitted FMMST model (right), which have been enumerated according to relevance and order.

In order to illustrate the differences in parameter configurations between excitable and oscillatory neurons for the HH and FMM models, two radar plots are shown in Fig 4. In the case of the HH parameters, which represent biochemical properties, oscillatory experiments mostly have higher values in gNa and smaller values in and . In terms of the FMM parameters, which represent AP waveform features, oscillatory APs have nearer waves () with smaller kurtosis and skewness (ωA, ωB, βA, βB) and a B wave with a higher amplitude ().

Fig 4. Median values of the HH parameters and FMMST parameters (defined in Table 3) by neuron type.

The represented interval for each parameter is between the 10% and 90% percentiles.

Simulation experiment results: Main HH parameters prediction

In this section, the potential of the FMM parameters to predict relevant parameters of the HH model is shown. Two different sets of predictors, τA and τA + τB (defined in Table 3), and different supervised Machine Learning procedures have been considered. Note that the LR and SVM approaches assume that the predictors are Euclidean, but β is a circular parameter. Hence, cos(βB) and sin(βB) are considered instead of βB. Furthermore, βA is treated as Euclidean, as it takes values concentrated in a small arc. Other predictors, derived from those in τA + τB, were considered in the preliminary analysis but eventually discarded by the principle of parsimony, as only the LR results improved slightly.

Table 3. Feature sets used in the prediction of the main HH parameters.

Variable selection methods have been used: a stepwise Akaike information criterion for LR, which selects the regression with the best trade-off between simplicity and goodness of fit, and caret’s recursive feature elimination [31] for the rest, which searches for the minimal subset of variables that decreases the goodness of fit by at most 1.5%. Only a single LR model reduced the variable set, discarding cos(βB).

Furthermore, ten-fold cross-validation is considered for comparative and validation purposes. The dataset is divided into ten equally sized splits. In ten iterations, nine of the subsets are used to train the procedure, while the tenth serves as a test as in [25, 37]. In addition, the Generalized Degrees of Freedom (GDFs) of the procedures have been calculated, as proposed in [38], to measure the underlying complexity.

First, the prediction of the pair is presented. Table 4 provides a summary of the results of the prediction procedures for and . It can be seen that the parameters associated with the second wave, τB, significantly increase the prediction accuracy compared with that obtained using only parameters associated with wave A. Regarding the different procedures, at one end, the relatively poor results for LR provide evidence that the relation between the predictors and () is not linear. At the other end, SVM gives the most accurate prediction, with more than 95% and 94% of explained variance for and , respectively. The RF and GBM results lie between those for LR and SVM; while RF is comparable to GBM in terms of interpretability and complexity, its attained accuracy is lower. GBM remains more complex and slightly less accurate than SVM, but its predictions are interpretable.

Table 4. R2 and GDFs for the LR, RF, SVM and GBM procedures to predict and .

On the one hand, the most relevant predictors to explain are βA and ωA, which implies that is related to the AP shape. In particular, AP signals from experiments with low have more symmetrical A waves (βA close to π). On the other hand, the most relevant predictors to explain are , M and s, which recalls the strong association between and the number of APs observed in Fig 2. Note that s is only third in importance due to its discrete nature.

Predictive analysis for other HH parameters has also been performed. The results for the sodium parameters are included in Table 5. The sodium parameters have been selected because they are more accurately predicted and because they are the engine of neuron excitability, as claimed by authors such as [8, 39]. It is interesting to note that and are directly related to the first and second FMM waves, respectively, while gNa is related to both. The attained accuracy is not as high as for (), which are more stable parameters.

Table 5. R2 values for LR, RF, SVM and GBM procedures to predict the sodium HH parameters.

Real data analysis

The SGAMP database, first used in [20] and publicly accessible at [40], contains single-unit neuronal recordings of North Atlantic squid (Loligo pealei) giant axons in response to stimulus currents. The database has been extensively used in the literature; among recent works are [41, 42]. The experiments in which the applied stimulus is a short square stimulus, as in the HH experimentation, have been selected for the analysis. Five signals have been extracted from 4 trials of 3 different axons, with a stimulus amplitude of I = 5 μA in all five cases. The length of the analyzed segment, 60 ms, also equals that of the simulated HH signals, to facilitate comparisons.

Four different FMM models have been fitted to the signals: an FMMs**, an FMMs*, an FMMST and an FMMST*. Moreover, it is assumed that AA − AB < C, where C is 0.70 times the maximum difference obtained in previous iterations of the algorithm. For comparative purposes, Fourier models with the same number of free parameters as the FMM models, denoted as FDa where a is the number of harmonics, have also been fitted.
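The FDa baseline amounts to an ordinary least-squares Fourier regression; a sketch with 1 + 2a free parameters follows (assuming the time axis is already rescaled to [0, 2π), as in the FMM setting):

```python
import numpy as np

def fit_fourier(x, n_harmonics):
    """Least-squares fit of a Fourier model with n_harmonics harmonics:
    x(t) ~ M + sum_k (a_k cos(k t) + b_k sin(k t)), t on [0, 2*pi).
    Free parameters: 1 + 2*n_harmonics. Returns the fitted signal."""
    x = np.asarray(x, dtype=float)
    t = np.linspace(0.0, 2.0 * np.pi, len(x), endpoint=False)
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * t), np.sin(k * t)]
    D = np.column_stack(cols)                 # design matrix
    coef, *_ = np.linalg.lstsq(D, x, rcond=None)
    return D @ coef
```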

Fig 5 shows the FMMST* and the corresponding Fourier predictions for a representative signal. The R2 values and the number of free parameters are given in Table 6 for the five signals and eight models. The figures in the table show that the dominant wave explains much of the variability here, much more than it does in the HH experimentation. Furthermore, the models with restrictions on the A parameters are as accurate as those without them; being simpler, the former are preferred by the parsimony principle. The FD models, by contrast, give much less accurate predictions. Moreover, Table 7 gives the parameter values of the FMMST* model for the five signals. Compared to the values in Table 2, in particular those corresponding to signals with 4 APs, some interesting differences can be highlighted. SGAMP APs have a more symmetrical shape (βA values nearer to π) and a spike with a lower maximum voltage (smaller values of AA). Furthermore, shape differences in the repolarization and hyperpolarization are evidenced by AB and βB.

Fig 5. Neuronal APs from the SGAMP database (axon 1, trial 18, second short square stimulus) along with the fitted signals using an FMMST* signal (red) and an FD7 (blue).

The waves of the fitted FMMST* model are illustrated at the bottom.

Table 6. R2 values of the different FMM and FD models for the signals extracted from the SGAMP database.

Table 7. Parameter estimators of the FMMST* models for the SGAMP signals.

Parameter configurations comparison between simulated and real data

In order to facilitate the comparison of parameter configurations between real data and HH simulations, a principal component analysis has been performed on the simulated data. Furthermore, the corresponding projections of the SGAMP parameter configurations onto this plane have been obtained. The first two principal components are plotted in Fig 6, where circular points represent the simulated signals and rhombus points represent the real data projections. The figure clearly illustrates the differences between the parameter configurations of the simulated and real data, as the rhombus points lie far from the main cloud of circular points.
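The projection scheme (PCA fitted on the simulated configurations, with the real configurations centred by the simulated means and mapped onto the same axes) can be sketched with a plain numpy SVD:

```python
import numpy as np

def pca_fit_project(sim, real, n_components=2):
    """Fit PCA on the simulated parameter matrix and project both the
    simulated rows and the real-data rows onto the leading axes.
    The real data are centred with the simulated means, so both sets
    live in the same principal plane."""
    sim = np.asarray(sim, dtype=float)
    mean = sim.mean(axis=0)
    # principal axes from the SVD of the centred simulated data
    _, _, Vt = np.linalg.svd(sim - mean, full_matrices=False)
    axes = Vt[:n_components]
    sim_proj = (sim - mean) @ axes.T
    real_proj = (np.asarray(real, dtype=float) - mean) @ axes.T
    return sim_proj, real_proj
```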

Fig 6. Plot of the first two principal components of τA + τB of the HH experiments (circular points), with experiments with 4 APs highlighted, and projections of the SGAMP experiments (rhombus points).

At first glance, these differences could be attributed to the simplifications made in the HH experimentation. However, in our opinion, this is not a plausible explanation. Specifically, fixing the intensity and shape of the stimulus does not affect the AP shape modelled by the FMM parameters, as authors like [4, 43] state. The other simplifications in the experiment design are minor. Furthermore, a simpler FMM model is adequate to analyze the real signals accurately, as Table 6 shows. Altogether, this evidence suggests that the model underlying the SGAMP signals is not an HH model but a simpler one, such as FitzHugh-Nagumo (see [9]).


In this paper, the FMMST model has been presented, and its potential to describe the waveforms of simulated and real AP signals has been demonstrated. It is shown that the squid giant axon signals exhibit simpler waveforms and are faithfully described with a simpler model than the simulated HH signals. Moreover, the excellent behavior of the FMM model in predicting HH simulated and real signals provides evidence that other neuronal dynamics models could also be represented using the new approach, in which case the differences between models would be articulated through differences in parameter configurations.

In addition to many interesting theoretical properties, the FMM approach is a flexible methodology from an applied point of view. When a single AP is analyzed, the parameters are able to describe a wide variety of spike morphologies and, in particular, to discriminate neuron types. When multiple spikes are analyzed, restrictions between the parameters can be included to provide physiologically interpretable solutions and reduce the model’s complexity.

Many open problems in electrophysiological neuroscience can benefit from the FMM methodology presented in this paper. In particular, it can significantly contribute to solving problems where the AP characterization is needed, such as Spike Sorting and cell-type classification. Spike Sorting implies grouping spikes into clusters corresponding to different neurons based on the similarity of their shapes. Cell-type classification deals with the definition of hierarchical taxonomies of cells based on different sets of morphological, genetic and/or electrophysiological features.

Spike Sorting and cell-type classification are two of the most critical data analysis problems in neuroscience and have received a lot of attention in the literature. Some interesting references about Spike Sorting are [24, 44, 45], whereas [2, 13, 46] address cell-type classification. It is also worth mentioning other, more complex related problems, such as multi-channel Spike Sorting and the identification of the function of neurons in brain circuits. These questions are tackled in [47–49].

Furthermore, the FMM approach, based on a simple parametric model, stands in contrast to the “black box” methods that are becoming popular in neuroscience and other areas, and provides simple solutions to different questions. A particular case is the problem of denoising neuronal signals, which is essential when recordings are made in vivo, because recordings in this scenario often have low temporal resolution and high noise. Neural networks are the predominant methodology for this task (see [50, 51]); however, the FMM model is a signal plus error model, so denoising is built into the estimation step.

Two different lines of work could be defined for the future. On the one hand, from a theoretical perspective, a first question to solve is the implementation of restrictions of the form ; S = 2, …, s. These are of interest for analyzing signals with equal distances between waves in the APs, since reducing the number of parameters to be estimated is important when large or noisy Spike Trains are analyzed. On the other hand, from an applied perspective, many other real AP signals must be analyzed, and the questions of spike and cell classification and clustering may be addressed. The approach’s potential is difficult to calibrate, as many aspects remain to be researched and exploited.

Finally, a limitation of our study is that the input stimulus’s timing and shape have been fixed. The influence of the stimulus type could be analyzed using the FMM approach. The α parameters are related to the firing times, and the shape parameters could be useful to detect circumstances where the shape of the APs is independent of the stimulus, as [4] suggests, and circumstances where changes happen, such as the observation of incomplete spikes. However, the question is tricky and deserves further research.


  1. Mensi S, Naud R, Pozzorini C, Avermann M, Petersen CCH, Gerstner W. Parameter extraction and classification of three cortical neuron types reveals two distinct adaptation mechanisms. Journal of Neurophysiology. 2012;107(6):1756–1775. pmid:22157113
  2. Zeng H, Sanes J. Neuronal cell-type classification: Challenges, opportunities and the path forward. Nature Reviews Neuroscience. 2017;18. pmid:28775344
  3. Trainito C, von Nicolai C, Miller EK, Siegel M. Extracellular spike waveform dissociates four functionally distinct cell classes in primate cortex. Current Biology. 2019;29(18):2973–2982. pmid:31447374
  4. Raghavan M, Fee D, Barkhaus PE. Generation and propagation of the action potential. In: Handbook of clinical neurology. vol. 160. Elsevier; 2019. p. 3–22.
  5. Ori H, Hazan H, Marder E, Marom S. Dynamic clamp constructed phase diagram for the Hodgkin and Huxley model of excitability. Proceedings of the National Academy of Sciences. 2020;117(7):3575–3582. pmid:32024761
  6. Hodgkin AL, Huxley AF. A quantitative description of membrane current and its application to conduction and excitation in nerve. The Journal of Physiology. 1952;117(4):500–544. pmid:12991237
  7. Ori H, Marder E, Marom S. Cellular function given parametric variation in the Hodgkin and Huxley model of excitability. Proceedings of the National Academy of Sciences. 2018;115(35):E8211–E8218. pmid:30111538
  8. Marom S. Emergence and maintenance of excitability: kinetics over structure. Current Opinion in Neurobiology. 2016;40:66–71. pmid:27400289
  9. Fitzhugh R. Impulses and Physiological States in Theoretical Models of Nerve Membrane. Biophysical Journal. 1961;1(6):445–466. pmid:19431309
  10. Abbott LF, Kepler TB. Model neurons: From Hodgkin-Huxley to Hopfield. In: Statistical Mechanics of Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg; 1990. p. 5–18.
  11. Izhikevich EM. Simple model of spiking neurons. IEEE Transactions on Neural Networks. 2003;14(6):1569–1572. pmid:18244602
  12. Ermentrout B, Terman D. The Mathematical Foundations of Neuroscience. 1st ed. Springer; 2010.
  13. Teeter C, Iyer R, Menon V, Gouwens N, Feng D, Berg J, et al. Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications. 2018;9(1):1–15. pmid:29459723
  14. Brunton BW, Beyeler M. Data-driven models in human neuroscience and neuroengineering. Current Opinion in Neurobiology. 2019;58:21–29. pmid:31325670
  15. Picinbono B. On instantaneous amplitude and phase of signals. IEEE Transactions on Signal Processing. 1997;45(3):552–560.
  16. Boashash B. Time-Frequency Signal Analysis and Processing: A Comprehensive Reference. 1st ed. Academic Press; 2003.
  17. Rueda C, Larriba Y, Peddada SD. Frequency Modulated Möbius Model Accurately Predicts Rhythmic Signals in Biological and Physical Sciences. Scientific Reports. 2019;9(1):1–10. pmid:31822685
  18. Rueda C, Rodríguez-Collado A, Larriba Y. A novel wave decomposition for oscillatory signals. IEEE Transactions on Signal Processing. 2021;69:960–972.
  19. Rueda C, Larriba Y, Lamela A. The hidden waves in the ECG uncovered revealing a sound automated interpretation method. Scientific Reports. 2021;11:3724. pmid:33580164
  20. Paydarfar D, Forger DB, Clay JR. Noisy Inputs and the Induction of On–Off Switching Behavior in a Neuronal Pacemaker. Journal of Neurophysiology. 2006;96(6):3338–3348. pmid:16956993
  21. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 1st ed. New York, NY, USA: Cambridge University Press; 2014.
  22. Estumano D, Orlande HR, Colaço M, Ritto T, Diaz J, Dulikravich G. Bayesian Estimation of Parameters in Hodgkin-Huxley’s Model of Biomedical Electric Signals. In: 2nd International Symposium on Uncertainty Quantification and Stochastic Modeling; 2014. p. 1–11.
  23. Fernández I, Rodríguez-Collado A, Larriba Y, Lamela A, Canedo C, Rueda C. FMM: An R package for modeling rhythmic patterns in oscillatory systems. arXiv:2105.10168 [Preprint]; 2021.
  24. Rey HG, Pedreira C, Quiroga R. Past, present and future of spike sorting techniques. Brain Research Bulletin. 2015;119:106–117. pmid:25931392
  25. Hastie T, Tibshirani R, Friedman J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. 2nd ed. Springer; 2009.
  26. Izenman AJ. Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning. 1st ed. Springer Publishing Company, Incorporated; 2008.
  27. R Core Team. Package ‘stats’ in R: A Language and Environment for Statistical Computing; 2013.
  28. Breiman L, Cutler A, Liaw A, Wiener M. Package ‘randomForest’ Manual. CRAN; 2018.
  29. Karatzoglou A, Smola A, Hornik K, Zeileis A. kernlab—An S4 Package for Kernel Methods in R. Journal of Statistical Software. 2004;11(9):1–20.
  30. Ridgeway G. Generalized Boosted Models: A Guide to the GBM Package. Compute. 2005;1:1–12.
  31. Kuhn M. Package ‘caret’ Manual. New London: CRAN; 2018.
  32. Pandey P. From ‘R vs Python’ to ‘R and Python’. Towards Data Science; 2019.
  33. Gautier L. rpy2—R in Python; 2019.
  34. Curtis V. RWinOut; 2017.
  35. Xia XQ, McClelland M, Wang Y. PypeR, A Python Package for Using R in Python. Journal of Statistical Software, Code Snippets. 2010;35(2):1–8.
  36. Allen Brain Institute. Allen Cell Types Database—Electrophysiology. Technical report; 2015.
  37. Kubat M. An Introduction to Machine Learning. 1st ed. Springer Publishing Company, Incorporated; 2015.
  38. Ye J. On Measuring and Correcting the Effects of Data Mining and Model Selection. Journal of the American Statistical Association. 1998;93:120–131.
  39. Silva J. Slow Inactivation of Na+ Channels. In: Voltage Gated Sodium Channels; 2014. p. 33–49.
  40. Goldberger AL, Amaral LAN, Glass L, Hausdorff JM, Ivanov PC, Mark RG, et al. PhysioBank, PhysioToolkit, and PhysioNet: Components of a New Research Resource for Complex Physiologic Signals. Circulation. 2000;101(23):e215–e220. pmid:10851218
  41. Forger D, Paydarfar D, Clay J. Optimal Stimulus Shapes for Neuronal Excitation. PLoS Computational Biology. 2011;7:e1002089. pmid:21760759
  42. Clay J, Forger D, Paydarfar D. Ionic Mechanism Underlying Optimal Stimuli for Neuronal Excitation: Role of Na+ Channel Inactivation. PLoS One. 2012;7:e45983. pmid:23049913
  43. Kass RE, Amari SI, Arai K, Brown EN, Diekman CO, Diesmann M, et al. Computational Neuroscience: Mathematical and Statistical Perspectives. Annual Review of Statistics and Its Application. 2018;5(1):183–214. pmid:30976604
  44. Caro-Martín CR, Delgado-García JM, Gruart A, Sánchez-Campusano R. Spike sorting based on shape, phase, and distribution features, and K-TOPS clustering with validity and error indices. Scientific Reports. 2018;8(1):1–28. pmid:30542106
  45. Souza BC, Lopes-dos Santos V, Bacelo J, Tort AB. Spike sorting with Gaussian mixture models. Scientific Reports. 2019;9(1):1–14. pmid:30842459
  46. Ghaderi P, Marateb H, Safari MS. Electrophysiological Profiling of Neocortical Neural Subtypes: A Semi-Supervised Method Applied to in vivo Whole-Cell Patch-Clamp Data. Frontiers in Neuroscience. 2018;12. pmid:30542256
  47. Mannal N, Kleiner K, Fauler M, Dougalis A, Poetschke C, Liss B. Multi-Electrode Array Analysis Identifies Complex Dopamine Responses and Glucose Sensing Properties of Substantia Nigra Neurons in Mouse Brain Slices. Frontiers in Synaptic Neuroscience. 2021;13:1. pmid:33716704
  48. Bashford J, Mills K, Shaw C. The evolving role of surface electromyography in amyotrophic lateral sclerosis: A systematic review. Clinical Neurophysiology. 2020;131(4):942–950. pmid:32044239
  49. Chen Y, Rommelfanger NJ, Mahdi AI, Wu X, Keene ST, Obaid A, et al. How is flexible electronics advancing neuroscience research? Biomaterials. 2021;268:120559. pmid:33310538
  50. Lecoq J, Oliver M, Siegle JH, Orlova N, Koch C. Removing independent noise in systems neuroscience data using DeepInterpolation. bioRxiv 2020.10.15.341602 [Preprint]; 2020.
  51. Sebastian J, Sur M, Murthy HA, Magimai-Doss M. Signal-to-signal neural networks for improved spike estimation from calcium imaging data. PLOS Computational Biology. 2021;17(3):1–19. pmid:33647015