Abstract
Examining individual components of cellular systems has been successful in uncovering molecular reactions and interactions. However, the challenge lies in integrating these components into a comprehensive system-scale map. This difficulty arises due to factors such as missing links (unknown variables), overlooked nonlinearities in high-dimensional parameter space, downplayed natural noisiness and stochasticity, and a lack of focus on causal influence and temporal dynamics. Composite static and phenomenological descriptions, while appearing complicated, lack the essence of what makes biological systems truly “complex.” The formalization of system-level problems is therefore important in constructing a meta-theory of biology. Addressing fundamental aspects of cellular regulation, adaptability, and noise management is vital for understanding the robustness and functionality of biological systems. These aspects encapsulate the challenges that cells face in maintaining stability, responding to environmental changes, and harnessing noise for functionality. This work examines these key problems that cells must solve, serving as a template for such formalization and as a step towards the axiomatization of biological investigations. Through a detailed exploration of cellular mechanisms, particularly homeostatic configuration, ion channels, and the harnessing of noise, this paper aims to illustrate complex concepts and theories in a tangible context, providing a bridge between abstract theoretical frameworks and concrete biological phenomena.
Author summary
This paper presents an approach to understanding the complex workings of cells, looking beyond single molecules and focusing on how various parts of a cell interact. We discuss the challenges in mapping out cellular functions, including gaps in knowledge, complex intracellular interactions, and the inherent noisiness of biological processes. By using a structured framework, inspired by Hilbert’s method of identifying key problems to guide scientific inquiry, we can better understand these challenges. We focus on key cellular issues like control mechanisms and noise management, providing examples that can be tested with real data.
Citation: Dehghani N (2024) Systematizing cellular complexity: A Hilbertian approach to biological problems. PLOS Complex Syst 1(3): e0000013. https://doi.org/10.1371/journal.pcsy.0000013
Editor: Vera Pancaldi, INSERM U1037, FRANCE
Published: November 5, 2024
Copyright: © 2024 Nima Dehghani. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: The author received no specific funding for this work.
Competing interests: The author has declared that no competing interests exist.
Introduction
The complexity of cellular systems
Unraveling the intricate web of processes in cellular systems presents a formidable frontier in contemporary biological research. Many significant breakthroughs in modern biology have come from studies at a molecular level, from genes to proteins. As our measurement techniques provide data at different scales with unprecedented resolution and throughput, a more detailed and nuanced picture is necessary for further advances, especially in areas where simplified models fall short of capturing the emergent properties and interactions within biological systems. Scientific journals and textbooks often feature graphical representations of cellular mechanisms, ranging from 3D depictions of subcellular structures to diagrams of signaling pathways. Advances in imaging and high-throughput gene expression methods, coupled with a deeper understanding of biochemical pathways, have resulted in an abundance of intricate yet simplified representations of intracellular structures and signaling. However, it is critical to recognize that these images are abstractions that simplify complex cellular systems.
Capturing the complexity of cellular interactions with static descriptions is challenging. Given the complexity and abundant stochasticity in intracellular events, the ability of cells and simple organisms to achieve reliability and robustness is remarkable. In biological systems, many genes are responsible for coding sensors, actuators, and the intricate regulatory networks that manage them, providing robustness to variations rather than just the basic functionality required for survival in ideal circumstances [1]. These robustness mechanisms involve sophisticated regulatory feedback and dynamic processes that prevent cascading failures, creating systems so robust they appear simple, reliable, and consistently stable, seemingly unaffected by the environment [1]. Evolution favors the development of modular architectures with complex hierarchies of protocols and layers of feedback regulation, driven by the need for robustness in uncertain environments. These characteristics stem from the intricate relationship between complexity, robustness, modularity, and feedback [2].
The robustness of biological systems, which allows them to maintain functionality in the face of internal and external perturbations, has been extensively studied and is a fundamental characteristic of living organisms [3]. Robustness in biochemical networks, such as those governing gene expression, ensures that cells can function reliably despite the inherent noise and variability in these processes [4]. This robustness is often a result of complex regulatory networks that provide multiple layers of control and feedback mechanisms, allowing organisms to adapt to changing environments and maintain homeostasis [5,6].
Understanding how cells control their inner dynamics and process environmental information, and how they respond to challenging conditions, may require more than reductionist or ateleological approaches. Referring to Lazebnik’s essay “can a biologist fix a radio?” [7], we question whether we can truly understand the inner workings of a cell through conventional methods.
Towards an axiomatic approach in biology
Many significant breakthroughs in understanding diseases at a molecular level have come from integrating insights across various scales, from genes to proteins to cellular networks. Thus, a more detailed and nuanced picture is necessary for further advances, especially in areas where simplified models fall short of capturing the emergent properties and interactions within biological systems.
In a quantitative approach to understanding the inner workings of cells, adopting methods from quantitative fields like physics or computer science can indeed be beneficial. However, we propose that a structured axiomatic framework can further enhance these efforts by providing a systematic way to address complex biological questions. This involves correctly framing questions and developing new methods to understand “evolved” biological systems. Integrating approaches from dynamical systems theory, computer science, statistical inference, and machine learning within this axiomatic framework can provide a more comprehensive understanding [8]. Instead of concentrating on isolated issues or being confined by our expertise in certain methods, it is important to identify a set of problems that a cell must address.
To address these challenges, this work proposes adopting a “Hilbertian approach,” where key problems are systematically framed to guide biological research and theoretical development (please see Box 1 for related definitions and nomenclature). This involves identifying a spectrum of problems that a cell must address and developing new methods to understand these evolved biological systems. Developing a systematic theoretical approach is not only more productive but also shapes our thinking within a well-defined paradigm. This approach mirrors the influence of Hilbert’s list of unsolved problems in mathematics on the field, guiding our thought process in a similar manner [9].
Box 1: Glossary and Definitions
- Hilbertian Approach: A method inspired by David Hilbert’s list of 23 unsolved problems in mathematics presented at the Paris conference in 1900 [9]. Hilbert’s list was not merely a set of mathematical puzzles but a strategic roadmap to guide future research. This approach involves systematically framing key problems to direct theoretical development and research efforts. It emphasizes identifying fundamental questions and establishing a structured framework for scientific inquiry. In biology, as we propose in this work, this method can help in tackling the complexity and variability inherent in living systems by focusing on key unresolved questions that drive the field forward.
- Axiomatizing Biology: Axiomatization is the process of defining a set of basic principles or axioms from which other truths can be logically derived. In biology, this means establishing foundational principles that can explain various biological phenomena. Joseph Woodger and Mario Bunge made notable attempts to axiomatize biological sciences, though they faced challenges due to the complexity and variability of biological systems [10,11].
- Pearl’s Framework: A theoretical framework developed by Judea Pearl for causal inference, based on structural models and counterfactual reasoning. This framework helps in understanding causality in complex systems, providing tools to model and analyze causal relationships in biological networks [12,13].
- Microscale, Macroscale, and Causal Levels: Micro/Macro-scale terms refer to phenomena or structures at different scales of observation. Macroscale pertains to large-scale structures/phenomena, whereas microscale involves smaller, often microscopic structures/phenomena. To understand systems with dynamics across multiple scales, we turn to multiscale modeling. This approach combines models operating at different resolutions. Macroscale models provide a broader view, while microscale models offer intricate details. The goal is to strike a balance between accuracy and efficiency (macroscale models may not be accurate enough, while the microscale models may be inefficient) [14]. Due to layered interaction across scales in biological systems [15,16], successful multiscale modeling hinges on causality. We must grasp how changes at one level influence behavior at another and develop quantitative models that can capture such “causal” interaction [15–19]. By connecting micro and macro scales causally, we can create meaningful representations of complex biological dynamics.
- Quantization Level: In non-biological contexts, quantization involves representing analog signals as finite-resolution digital signals by slicing their amplitude into discrete levels. Uniform quantization, which rounds each sample value to the nearest value from a finite set of possible quantization levels, is not always optimal [20,21]. Biological systems, much like electronic circuits, encounter the need to represent continuous analog signals using discrete levels. These analog signals undergo quantization to facilitate processing, transmission, and analysis. Quantization divides the amplitude range of an analog signal into distinct levels, and the number of quantization levels determines the precision of representation. Achieving an optimal balance between precision and resource efficiency remains a challenge in biological contexts (a short numerical sketch of this trade-off follows Box 1).
- Teleology: The study of purpose or design in natural phenomena. In biology, it refers to the explanation of biological processes in terms of their goals or functions rather than solely by their mechanistic causes. This concept, rooted in Aristotle’s philosophy, has been critical for understanding evolutionary adaptations and the functional organization of biological systems [22,23]. During the Scientific Revolution, figures like Francis Bacon and René Descartes emphasized efficient causation over final causation, leading to a decline in teleological explanations [24]. However, William Harvey and Robert Boyle argued that teleological reasoning could complement mechanistic explanations, particularly in understanding biological functions. In contemporary biology, the debate continues, with some scholars framing properties of biological systems as “functions” shaped by evolutionary pressures rather than purposeful design [25,26]. Recent critique of selected effects theories argues that biological functions should be seen as causes rather than effects, emphasizing the causal roles that functions play in biological systems [27]. It was the Cybernetics movement that reintroduced teleological ideas, emphasizing feedback and control systems to explain goal-directed behaviors in biological organisms [22,28].
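To make the quantization trade-off above concrete, the following is a minimal numerical sketch, assuming an arbitrary sinusoidal “analog” signal and illustrative level counts (it is not tied to any particular biological measurement): coarser quantization is cheaper to represent but incurs a larger reconstruction error.

```python
import numpy as np

def uniform_quantize(signal, n_levels):
    """Map a continuous signal onto n_levels equally spaced amplitude levels."""
    lo, hi = signal.min(), signal.max()
    step = (hi - lo) / (n_levels - 1)          # width of each quantization bin
    return lo + np.round((signal - lo) / step) * step

t = np.linspace(0, 1, 1000)
analog = np.sin(2 * np.pi * 3 * t)             # a stand-in "analog" signal

for n_levels in (2, 4, 16, 64):
    digital = uniform_quantize(analog, n_levels)
    mse = np.mean((analog - digital) ** 2)     # reconstruction (quantization) error
    print(f"{n_levels:3d} levels -> mean squared error {mse:.6f}")
```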
Previous attempts at axiomatizing biology, such as those by Joseph Woodger [10] and Mahner and Bunge’s metaphysically schemed approach to biology [11], are prime examples of failed attempts due to improper formalization of key problems. To avoid repeating these mistakes, we would need to properly formalize these key issues before attempting to develop an axiomatic approach. By working within a scalable paradigm and adapting it as we progress (both experimentally and theoretically), we can bridge biological intuition with mathematical development, furthering our understanding of living systems. This approach can be very useful for avoiding the pitfalls of previous attempts and for ensuring the successful axiomatization of biology.
In light of these perspectives, we use cellular adaptability, ion channels, noise, and stochasticity as prototypical examples to illustrate the Hilbertian approach. This targeted exploration serves as a tangible context for applying and testing theoretical frameworks, bridging abstract concepts with concrete biological phenomena. By concentrating on these elements, we aim to construct a framework that can decipher the cellular machinery and predict its behavior under diverse conditions. Through this approach, the integration of multidisciplinary methods can overcome traditional barriers and elucidate the complex nature of cellular systems. Ultimately, we envision that attempts such as this endeavor, adopting a similar Hilbertian framework, may pave the way for revolutionary advances in biology, offering insights that could transform our understanding of life at the molecular level through an axiomatic lens.
This paper aims to formalize key challenges in cellular biology: finding the right control switches, addressing the need for reconfiguration, and managing internal noise. By focusing on ion channels and their regulation, as well as exploring stochastic resonance, we provide an in-depth case study. Additionally, we outline other significant problems that could be part of a Hilbert-like set of questions in biology, advocating for a systematic and comprehensive approach to understand complex biological systems.
Problem 1: Where are the control switches?
Challenges of environmental adaptation
Single-cell organisms must constantly adapt to changing environments, such as fluctuations in temperature, pH, and the concentration of necessary substrates or harmful chemicals. Multicellular systems have more internal stability due to sophisticated regulatory systems, but their individual cells must also overcome fluctuations in their constrained environments [29,30]. A key question arises: How do cells, whether unicellular or multicellular, maintain their internal stability amidst these changes?
Over time, in the context of evolutionary innovation, cells have developed intricate mechanisms that allow them to efficiently manage internal and external changes. These mechanisms enable cells to identify and manipulate relevant control mechanisms—or switches—to adapt to environmental changes and preserve internal stability [29]. A key aspect of these mechanisms is the presence of network motifs, which are recurring, significant patterns of interactions among genes and their regulatory elements. Examining these motifs provides valuable insights into the regulatory structures and their roles in cellular functions [31–33].
Effective regulation in cells involves navigating 2 critical constraints: firstly, the identification of these relevant switches without causing irreversible harm or death to the cell; secondly, the necessity of prompt action due to limited resources and time. Consequently, adaptive cellular regulation represents a time-constrained optimization challenge, centered around the evolutionary discovery and manipulation of the necessary “switches” for appropriate responses to environmental stimuli.
The complexity of cellular switches
Identifying the right switch at the right time poses a complex problem due to the vast array of potential switches in a cell. These switches can range from single molecular factors initiating signaling cascades to complex regulatory motifs and physical structures. A possible approach to this challenge is the concept of trial and error, as posited by Ashby [34,35]. Karl Popper highlighted the role of “trial and error” as a fundamental process in knowledge acquisition within evolutionary epistemology [36]. Ashby illustrated the limits of trial and error with a hypothetical scenario in which an individual must find the correct configuration among 1,000 switches to illuminate a light. The time required to find the right configuration varies significantly across different approaches. For instance, a thorough sequential search would take an impractical 2^1000 seconds (assuming 2 seconds for each switch’s on and off positions). In contrast, a serial test with some knowledge of partial correctness might average around 500 seconds, while testing all switches individually but in parallel could take merely 1 second. However, while this concept might be applicable to high-dimensional problems like adaptive cellular regulation, it comes with significant risks: incorrect switch activation, deactivation of essential switches, or time constraints might lead to detrimental outcomes for the cell.
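A back-of-the-envelope calculation, using only the figures quoted above for Ashby’s thought experiment, makes the gap between the three strategies explicit (the serial and parallel figures are taken directly from the text rather than re-derived):

```python
# Rough time costs of the three search strategies in Ashby's 1,000-switch scenario.
n_switches = 1000

exhaustive_seconds = 2 ** n_switches   # every on/off configuration tried one after another
serial_seconds = 500                   # serial testing with partial-correctness feedback (average, as quoted)
parallel_seconds = 1                   # all switches probed independently in parallel (as quoted)

seconds_per_year = 60 * 60 * 24 * 365
years_digits = len(str(exhaustive_seconds // seconds_per_year))
print(f"exhaustive search: 2^{n_switches} s, i.e., on the order of 10^{years_digits - 1} years")
print(f"serial search (average): {serial_seconds} s")
print(f"parallel search: {parallel_seconds} s")
```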
In the context of cellular populations, one might speculate that some cells could successfully adapt through trial and error. However, the structure and hierarchy of cellular switches are far from random [37,38]. They exhibit a layered organization, with some operating at lower scales and others providing higher-level modulation. This hierarchical architecture implies a causal modulatory tuning of more efficient, evolutionarily refined systems for achieving the right configuration, rather than relying on mere chance. This implies an evolutionary context where regulatory systems have been refined to ensure survival and adaptability.
Emergent organization in cellular regulation
In the context of cellular regulation, network motifs and complex switches play a pivotal role. Network motifs and complex switches provide functional building blocks within complex biological systems, occur at higher frequencies than in randomized networks, and are ubiquitous across various biological networks, including metabolic networks, gene regulatory networks, and in organisms ranging from plants to worms [39–41]. For example, motifs in gene regulatory networks of worms and plants have been found to function in a highly integrated manner, showing conservation and complexity across species [39]. Similarly, the evolution of modularity in bacterial metabolic networks demonstrates how modular structures can emerge to optimize metabolic efficiency and adaptability [40]. These network motifs are not unique to cellular networks but also appear in ecological networks, indicating their fundamental role in maintaining stability and functionality in biological systems [42]. Evolutionary processes have shaped these modular and motif-based structures to enhance the robustness and adaptability of biological systems [43]. Network motifs can emerge from interconnections that favor stability, providing an evolutionary advantage in fluctuating environments [44]. Understanding these evolutionary processes that lead to the establishment of regulatory motifs can provide deeper insights into cellular regulation. Over time, cells have developed regulatory mechanisms that allow them to efficiently manage internal and external changes. Additionally, the identification of switches must be contextualized within the network of interactions in which they operate, as emphasized by the extensive work on network motifs [31–33].
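The claim that motifs occur more often than chance alone would predict can be checked, in miniature, by counting a motif in a network and in degree-preserving randomizations of it. The sketch below does this for feed-forward loops on a small, entirely hypothetical edge list (pure Python; the toy network and swap counts are illustrative):

```python
import random
from itertools import permutations

def count_ffl(edges):
    """Count feed-forward loops: x->y, y->z, and x->z for distinct x, y, z."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
    nodes = {u for e in edges for u in e}
    return sum(1 for x, y, z in permutations(nodes, 3)
               if y in adj.get(x, set()) and z in adj.get(y, set()) and z in adj.get(x, set()))

def randomize(edges, n_swaps=200, seed=0):
    """Degree-preserving randomization via repeated double edge swaps."""
    rng = random.Random(seed)
    edges = list(edges)
    for _ in range(n_swaps):
        (a, b), (c, d) = rng.sample(edges, 2)
        if a != d and c != b and (a, d) not in edges and (c, b) not in edges:
            edges.remove((a, b)); edges.remove((c, d))
            edges += [(a, d), (c, b)]
    return edges

# A toy regulatory network with two embedded feed-forward loops (hypothetical edges).
toy_edges = [("A", "B"), ("B", "C"), ("A", "C"),
             ("D", "E"), ("E", "F"), ("D", "F"),
             ("C", "D"), ("F", "A"), ("B", "E")]

observed = count_ffl(toy_edges)
null_counts = [count_ffl(randomize(toy_edges, seed=s)) for s in range(100)]
print(f"feed-forward loops in toy network: {observed}")
print(f"mean in degree-preserving randomizations: {sum(null_counts) / len(null_counts):.2f}")
```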
Control mechanisms in biological systems often operate in a heterarchical rather than strictly hierarchical manner [45]. This means that control is distributed across multiple levels and components, often with feedback loops and multiple controllers interacting without a single top-down directive. Simon’s concept of nearly decomposable systems, where components at various levels interact in complex ways, also supports this view [46]. This organized heterarchy highlights the flexible and adaptive nature of biological control mechanisms, which can operate independently yet coherently within the larger system [47]. These complex control layers, combined with the dynamic and noisy nature of cellular environments, underscore that a simple trial and error approach is not efficient. Instead, the evolutionary refinement of modular and motif-based regulatory structures is crucial. This refinement ensures robust and adaptable responses to environmental challenges, allowing cells to efficiently manage internal and external changes. The inherent complexity of switch configurations, compounded by often limited and noisy input information, makes a pure trial and error approach both computationally prohibitive and pragmatically risky [48].
Causal inference in cellular regulation
To reach a desirable problem-specific, continuously evolving solution that requires little to no knowledge of the sources of external changes, the cell would need to be able to manage control levels at both macro and micro scales. This implies some form of internal tuning within the cell. Phenomenological models can describe the system at the macro scale, but they do not provide insights into the precise molecular interactions occurring at the micro level. Therefore, to understand which switches are activated, and when, as a cell is exposed to environmental changes, it is necessary to conduct simultaneous measurements at various scales. These scales should not be examined separately, with static models of each scale then stitched together into a fixed diagram. Instead, overlapping complementary measurements at multiple scales can help construct a map of the causal structure of switches and reveal how the cell exerts internal control. This approach is more effective than creating an ever-growing chart of imaginary blueprints. A dichotomy of views is taking shape in the field: one in which all these forms of internal control mechanisms are purely of evolutionary nature, and an opposing emergent view that adds a layer of learning and agential control to single-cell-level control and decision-making [49]. Interested readers may refer to a two-part special issue (in Philosophical Transactions of the Royal Society B, 15 March 2021, Volume 376, Issue 1820) on this topic, entitled “Basal cognition: conceptual tools and the view from the single cell.”
A methodological approach for uncovering the “causal” structure should consider the search for such motifs. Moreover, the reliance of regulatory information transfer on weak links, as opposed to the dominance of strong links in energy transfer and metabolic pathways, may have played a critical role in the evolution of new activating or inhibiting signals [50]. The topological characteristics of these regulatory modules indicate varying connectivity structures, which are influenced by the degree of exposure to environmental variability and noise. Modules closely tied to environmental fluctuations tend to exhibit lower connectivity to enhance network robustness, whereas modules shielded from external noise sources lack a strong evolutionary impetus for sparse connections [51].
The complexity inherent in the multiscale nature of protein and genetic switches presents a formidable challenge in deciphering the causal mechanisms underpinning cellular control systems. In the realm of gene regulatory networks, this challenge is exacerbated by the limitations of current data types necessary for unraveling causal relationships [52]. The ongoing evolution of methodological approaches, particularly in handling sparse and noisy gene expression time series [53,54], is crucial. However, the current state of data often leads to the development of methods that are inefficient or unreliable [52]. As richer, higher-quality gene expression data become available, it is increasingly important to focus on developing causal inference methods that are compatible with the complex, multiscale nature of cellular switch networks. For instance, in transcriptional regulatory systems, genes with low demand are predominantly regulated by repressors, whereas genes with high demand are controlled by activators [55]. This mechanism, aimed at minimizing error loads, might have been an evolutionary driver for the emergence of various regulatory motifs [32,55].
Considering this intricacy of interaction, the search for causal links can be very insightful. Causal discovery methods have been applied to gene expression data to detect the presence and absence of causal relationships, even with very few samples [56]. Similarly, methods developed for causal inference from gene perturbation experiments have validated the feasibility of discerning causal structures within gene regulatory networks [57]. Furthermore, single-cell analyses demonstrate how causal discovery tools can elucidate gene regulation mechanisms at an unprecedented resolution [58]. By prioritizing causal discovery within gene regulatory networks, we can highlight the underlying regulatory motifs that orchestrate cellular responses.
An exciting frontier in this field is the exploration of higher-order interactions and motifs. Higher-order interaction analyses reveal complex connectivity patterns that go beyond simple pairwise interactions [59]. These higher-order motifs can offer deeper insights into the modular organization of biological networks [60]. Data-driven approaches have also shown promise in identifying these intricate interactions within high-dimensional complex systems [61]. Combining these advanced techniques with the Pearl causal framework could push the boundaries of current biological research, uncovering novel regulatory mechanisms and evolutionary strategies.
Biological systems have evolved to minimize errors and maintain robustness amidst environmental and signaling noise [6,29]. This evolutionary resilience necessitates methods that can effectively capture causality from a functional hierarchical standpoint. A mere knowledge of connections between components is insufficient; instead, we require a form of causal probing akin to in-circuit testing that can discern interactions at varying scales. This approach is similar to how in-circuit testing for printed circuit boards works, where the testing is not just about understanding the connections, but also about probing the functionality of the individual components and the system as a whole. This allows for a more comprehensive understanding of the system, whether it’s a circuit board or a biological cell. Theories like Pearl’s framework of causal and counterfactual inference based on structural models [12,13], alongside its recent extensions incorporating information theory [17,18], can provide valuable tools for quantitatively analyzing causal structures. Recent advances have shown how macro-scales can emerge in biological systems and affect micro-scales, offering new avenues for prediction and control [18]. These tools can illuminate macro-scale models of network interactions.
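To illustrate, in miniature, what Pearl-style causal analysis adds over static correlational maps, the sketch below simulates a toy three-variable structural causal model (hypothetical genes X, Y, Z with illustrative linear coefficients) and contrasts the observational association between Y and Z with the interventional effect obtained by setting Y directly, the analogue of an in-circuit probe:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def simulate(do_y=None):
    """Toy linear SCM: X -> Y -> Z and X -> Z, with independent Gaussian noise."""
    x = rng.normal(size=n)
    y = 0.8 * x + rng.normal(size=n) if do_y is None else np.full(n, do_y)
    z = 0.5 * y + 0.7 * x + rng.normal(size=n)
    return x, y, z

# Observational world: the Y-Z association mixes the direct effect of Y on Z
# with the confounding path through X.
_, y_obs, z_obs = simulate()
slope_obs = np.polyfit(y_obs, z_obs, 1)[0]

# Interventional world: do(Y = 1) vs do(Y = 0) isolates the causal effect (0.5 here).
_, _, z1 = simulate(do_y=1.0)
_, _, z0 = simulate(do_y=0.0)
causal_effect = z1.mean() - z0.mean()

print(f"observational regression slope of Z on Y: {slope_obs:.2f}")       # inflated by confounding
print(f"E[Z | do(Y=1)] - E[Z | do(Y=0)]: {causal_effect:.2f}")            # ~0.50
```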
However, the inherent time-delays and the complex ordering of processes resulting from genetic switch configurations pose significant challenges in unraveling the causal framework of intracellular switches. Presently, the mixed formalisms of information theory and causal modeling are not fully equipped to manage the daunting complexity of time-delays within extensive networks of genes and proteins. Concepts from concurrency theory, originally developed to address the problem of partially ordered components in distributed systems synchronization [62], offer intriguing possibilities. Some concurrency-related methodologies, such as Π-Calculus and Petri nets, have been adapted to model cellular signaling pathways [63–67]. Integrating these approaches with causal modeling and information theory could provide a more accurate representation of the nature of biological switches and their complex, intertwined multiscale dynamics. The primary focus should not be on what experts in a particular method can contribute to biology, but rather on first comprehensively understanding the biological problem and the cellular/subcellular constraints. Once these constraints are well understood, we can then proceed to construct relevant and robust methods that are tailored to address these specific challenges. This approach ensures that the methods developed are not only scientifically sound but also practically applicable in the biological context.
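As a flavor of how a concurrency formalism can encode partially ordered cellular events, here is a bare-bones Petri net interpreter applied to a hypothetical two-step cascade (ligand binding, then catalytic phosphorylation); the places, transitions, and token counts are illustrative, and interleaved firing stands in for concurrent execution:

```python
import random

# A minimal Petri net: places hold token counts; transitions consume and produce tokens.
# The two-step cascade below is hypothetical.

marking = {"ligand": 3, "receptor": 2, "receptor_active": 0,
           "kinase": 5, "kinase_p": 0}

transitions = {
    "bind":          ({"ligand": 1, "receptor": 1}, {"receptor_active": 1}),
    "phosphorylate": ({"receptor_active": 1, "kinase": 1},
                      {"receptor_active": 1, "kinase_p": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= k for p, k in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, k in pre.items():
        marking[p] -= k
    for p, k in post.items():
        marking[p] = marking.get(p, 0) + k

random.seed(1)
for _ in range(10):
    ready = [t for t in transitions if enabled(t)]
    if not ready:
        break
    fire(random.choice(ready))   # nondeterministic interleaving of concurrent events

print(marking)
```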
Problem 2: How to manage the need to reconfigure?
Homeostatic response and feedback control
In Problem 1, we explored how cells maintain internal stability through gene/protein switch configurations. Here, we examine the challenges cells face when managing significant extracellular environmental changes, necessitating not just maintenance but active reconfiguration of internal states. This problem focuses specifically on how cells reconfigure their regulatory mechanisms to maintain homeostasis under significant environmental changes, extending beyond the initial identification and manipulation of control switches discussed previously.
With minimal changes in the extracellular environment, a homeostatic response can handle internal regulation through a few adjustments. The concept of homeostasis, crucial for understanding this regulatory balance, was initially conceptualized by Claude Bernard and later formally defined by Walter Cannon [68,69]. It deals with maintaining a steady-state goal as a core mechanism in living systems [70]. This mechanism often relies on negative feedback [70,71]. The significance of negative feedback in achieving a dynamic equilibrium has been a pivotal aspect of cybernetics, highlighting how living systems autonomously counteract external changes [28]. In fact, cybernetics stretched this notion to define purposive behavior, giving the feedback regulatory mechanism a “teleologic” flavor [22].
However, challenges such as lag time in response and dampening oscillatory offshoots prompted the development of other control mechanisms, notably integrative error minimization through negative feedback [72]. Although mathematically elegant, the biological realization of integral feedback control was initially unclear. This gap has been recently addressed with the proposal of a controller topology enabling robust adaptation in noisy intracellular networks [73,74].
Beyond homeostasis: Adaptive reconfiguration in cells
Homeostatic regulation, whether through negative feedback or proportional–integral–derivative (PID) control, is key in maintaining internal regulation in response to extracellular changes. Negative feedback helps achieve dynamic equilibrium by counteracting deviations from a set point [22,28]. PID control extends this by adjusting responses based on current state, history of past states, and rate of change, thus offering a robust mechanism for maintaining stability [75]. Recent advancements have introduced a variation of this type of control, antithetic integral feedback (AIF), which ensures robust perfect adaptation in noisy biomolecular networks by using 2 regulatory species that annihilate each other [73,74]. This approach has been demonstrated to achieve robust adaptation and has potential applications in maintaining homeostasis in complex intracellular environments. AIF controllers, in particular, may provide a universal solution for robust perfect adaptation, essential for coping with intrinsic and extrinsic noise in biological systems [73,74,76].
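A minimal deterministic sketch of the antithetic motif, assuming a single regulated species and illustrative rate constants (the results in [73,74] concern the full stochastic setting), shows how the mutual annihilation of the two controller species implements integral action and drives the output to the set point μ/θ independently of the plant parameters:

```python
import numpy as np

# Antithetic integral feedback, deterministic ODE sketch with illustrative parameters.
mu, theta, eta = 2.0, 1.0, 50.0    # controller: set point is mu / theta = 2
k, gamma = 1.5, 0.8                # "plant": production gain and degradation rate

def rhs(state):
    z1, z2, x = state
    dz1 = mu - eta * z1 * z2          # reference production and annihilation
    dz2 = theta * x - eta * z1 * z2   # output sensing and annihilation
    dx = k * z1 - gamma * x           # z1 actuates the regulated species x
    return np.array([dz1, dz2, dx])

state = np.zeros(3)
dt = 1e-3
for _ in range(int(200 / dt)):        # simple forward-Euler integration
    state = state + dt * rhs(state)

print(f"steady-state output x = {state[2]:.3f} (set point mu/theta = {mu / theta:.3f})")
```

In this caricature, changing k or gamma alters the transient but not the steady state, which is the signature of integral action and of robust perfect adaptation.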
However, the need for homeostatic regulation extends beyond handling moderate extracellular changes. In cases of more severe environmental shifts, such as a sudden but long-term change in extracellular Na+, cells employ additional regulatory approaches that lead to transient expansion or contraction of the homeostatic response bounds (may be referred to as Allostasis [77,78], Heterostasis [79], or Adaptive Homeostasis [80]). These adaptive responses, akin to integral control methods where the accumulation of error over time drives corrective measures [1,72], are critical for cells facing acute or sustained environmental stresses, necessitating a reconfiguration of gene and protein pathways.
This not only pushes the cell to find the right switch configuration to maintain internal stability, but it also mandates the need for long-term or lasting reconfiguration. This leads us to conceptualize the cell’s response as a multidimensional landscape with regulatory stable states representing basins of attraction, akin to Waddington’s landscape model [81]. The Waddington landscape model offers a multidimensional perspective on cellular behavior, illustrating various potential states of cell development as attractor basins within a complex topography [81]. This model highlights the role of noise in shaping the epigenetic landscape, as fluctuations can distort the pathways that cells follow, leading to diverse cell-fate decisions [82]. Noise, therefore, is not merely a disruptive force but a crucial element that can drive variability and adaptability in biological systems. Modern interpretations of this model show how bifurcations and bistability are necessary for understanding the robustness and flexibility of cellular decisions and developmental pathways [83]. The intrinsic noisiness of gene networks is another aspect where noise plays a pivotal role. The fidelity of molecular signaling is influenced by this inherent noise, impacting the information capacity and reliability of these regulatory systems [84,85]. These perspectives collectively emphasize the importance of noise in both genetic regulation and epigenetic landscapes, illustrating how stochasticity can be harnessed for functional adaptability and complexity in cellular systems. Exposure to extreme events can push the cell to search for a new optimum basin in a new functional space with altered regulatory bounds, signifying a fundamental shift in the cell’s operational parameters.
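One way to make the attractor-basin picture concrete is a toy self-activating gene with Hill-type kinetics (all parameters illustrative): sweeping the basal production rate reveals a window in which two stable expression levels coexist, the minimal ingredient behind distinct basins that noise can push a cell between.

```python
import numpy as np

# Toy self-activating gene: dx/dt = a + b * x^n / (K^n + x^n) - gamma * x
b, K, n, gamma = 4.0, 1.0, 4, 1.0   # illustrative parameters

def stable_states(a, x_grid=np.linspace(0.0, 8.0, 4001)):
    """Locate stable fixed points as downward zero-crossings of dx/dt."""
    f = a + b * x_grid**n / (K**n + x_grid**n) - gamma * x_grid
    return [0.5 * (x_grid[i] + x_grid[i + 1])
            for i in range(len(x_grid) - 1) if f[i] > 0 >= f[i + 1]]

for a in (0.05, 0.2, 1.0):          # sweep the basal production rate
    states = ", ".join(f"{x:.2f}" for x in stable_states(a))
    print(f"basal rate a = {a:4.2f}: stable expression levels -> {states}")
```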
Upon reconfiguration, a cell transitions into a new functional space, distinct from its original state. This new space presents the cell with unique and modified regulatory bounds, altering both the response time and nature of the intracellular response. The duration and severity of external changes dictate whether this reconfiguration is short term or long term. More complex cells may possess a broader functional repertoire, enabling them to navigate more basins in the functional space. This enhanced functional repertoire is underpinned by the biological robustness seen in higher organisms [29,30], which have evolved sophisticated regulatory networks and control mechanisms to maintain homeostasis and adapt to environmental changes [3,4,86]. For example, eukaryotic cells exhibit emergent macroscales in their protein interactomes, which are associated with lower noise and greater resilience compared to prokaryotic cells [4,6]. These higher-order structures allow for more effective information transmission and robust adaptability, indicating a significant evolutionary advantage in managing complex cellular processes [87,88]. Therefore, the more advanced cells may have the capacity to quickly revisit an optimal basin in their functional repertoire upon re-exposure to a similar extreme event.
A key feature of adaptive homeostatic regulation is its predictive nature, which aids in reducing the magnitude of response error and the need for compensation [89]. In combination, homeostasis and adaptive homeostasis (allostasis) shape a functional space where phenotypic stability and plasticity define the dynamics and trajectory of a cell’s configuration in response to varying degrees of extracellular changes.
Adaptive plasticity in complex cellular systems
In more complex cells and multicellular organisms, this dynamic interplay between phenotypic stability and plasticity is further complicated by additional layers of regulatory control. These mechanisms extend beyond simple homeostatic responses, encompassing a broader range of adaptive processes that are essential for maintaining function under fluctuating conditions. For instance, the steady-state distribution of ion channels is governed by a correlated homeostatic regulation [90]. Furthermore, the macroscopic electrogenic function can be maintained despite variation in ion channel types [91]. The co-regulation of ion channels through homeostatic mechanisms illustrates how different channels can compensate for each other to maintain stability. This co-regulation is essential for managing the complex interplay between multiple properties and ensuring robust macroscopic cellular function. The emergence of correlations in ion channel expression levels can be explained by homeostatic control mechanisms that couple expression rates to cellular activity [90,92].
On the other hand, other complicated parameters can also set the functional landscape of cellular functioning. For example, in neurons, there can be differential expression of Na and K channels between soma and dendrites [93]. This suggests that the functional space and its minima are uniquely defined for soma versus dendrites. As a result, the functional space in dendrites or soma could be maintained within a pre-set range while it might be altered for the other parts of the neuron. This differential plasticity allows the neuron, as a unit of computation, to have varying overall function due to a polarized spatial balance between phenotypic stability and plasticity.
Such differential regulation underscores the dynamic nature of cellular responses and the importance of understanding these responses in the context of the entire organism. Additionally, intrinsic membrane excitability can be altered through dynamic epigenetic modification of DNA (through methylation) that controls the expression of ion channels and synaptic response [94,95].
A similar situation is observed in cardiac tissue where the balance between phenotypic stability and plasticity is controlled by a variety of mechanisms at different levels [96]. For example, repetitive pacing transforms the electrophysiological attributes of cardiomyocytes to a new semi-stable state [94,95]. This shift in the functional space occurs through a reduction in the mRNA concentration of INa and ICa channels, effectively altering their expression in cardiomyocytes [97]. The complex molecular mechanism for electrical remodeling of the cardiac cells creates a major challenge for understanding how fine-tuning ion channel expression controls cardiac excitability [98].
The phenomenon of altering electrophysiological properties in excitable membrane (of neurons or cardiomyocytes) can be studied through mixed models combining recursion and dynamics. These models provide a deeper understanding of the intricate causal chains affecting a cell’s functional space post reconfiguration. This approach underscores the need for robust, interdisciplinary methods to unravel the complexities of cellular adaptation and reconfiguration in response to environmental changes.
Molecular mechanics: The role of ion channels in cellular adaptation
Understanding how these adaptive processes operate on a molecular level, particularly through the role of ion channels, provides insight into the fundamental mechanisms of cellular adaptation. Ion channels serve as a prime example of the adaptive mechanism in cells. Essential for ion transport across cell membranes, these channels exhibit both phenotypic stability and plasticity, which are crucial for cellular adaptation. They control the membrane potential and intracellular ion concentration, both of which are dependent on the number and type of ion channels presently expressed in a cell. These channels are ubiquitous across unicellular and multicellular organisms, demonstrating the universal importance of this mechanism.
The diversity and complexity of ion channels reflect a higher functional repertoire in more complex cells, contributing to their ability to revisit optimal functional states quickly. This view can be scrutinized through an evolutionary lens. The evolution of ion channels highlights their importance in cellular responses to environmental changes. Early ion channels, such as calcium channels, evolved to include potassium and sodium ions and later fully evolved to dedicated potassium and sodium channels, which improved the precision and speed of signaling. This evolution was crucial for the development of efficient signaling mechanisms in multicellular organisms [99]. Voltage-gated ion channels play a critical role in managing ion flow across cell membranes, which is essential for various cellular processes. These channels have diversified significantly, especially in more complex organisms, enhancing their ability to process and transmit information rapidly [100]. Eukaryotes, in particular, have developed a vast array of ion channels that facilitate complex cellular behaviors. This expanded repertoire allows eukaryotic cells to manage a greater variety of functions with enhanced efficiency and adaptability, enabling them to respond more effectively to environmental changes [101].
In specialized excitable cells, such as neurons, the homeostatic regulation of ion channels is critical for maintaining excitability and compensating for perturbations such as channel deletions, mutations, or environmental disturbances [102,103]. These channels enable neurons to adjust their intrinsic properties through activity-dependent mechanisms that sense and respond to changes in physiological activity. The key point is that ion channels act as critical mediators between extracellular events and intracellular processes, coupling signals from the environment to cytoplasmic biochemical pathways. This capability underscores the advanced functional repertoire of more complex eukaryotic cells, allowing for refined and rapid responses to external stimuli [104,105].
To further illustrate these principles, let us consider the case of KcsA, a well-known bacterial potassium channel, which is a tetramer with 4-fold symmetry [106]. It operates as a pH-gated, potassium-specific channel, favoring a closed state at neutral pH. The channel exhibits voltage-gated inactivation, and the energy landscape of gating undergoes a conformational change, reaching its minimum when the channel is closed [107]. When the channel is exposed to an acidic environment, electrostatic changes in the protein’s cytosolic domain cause it to open [108,109]. The duration of the channel’s inactivity influences its reactivation, thereby modulating its response to pH changes [110].
Intracellular pH, regulated through changes in membrane permeability to ions like K+, Na+, and H+ [111], can affect many cellular mechanisms, including cell metabolism, cell growth, Ca2+ homeostasis, proliferation, and gene expression [112,113]. On the other hand, extracellular acidity impacts the cytosolic pH, altering the opening probability and activation of potassium channels such as KcsA. This leads to a departure from the energetically favored closed state, placing an energy demand on the cell. The speed at which such a change of functional space occurs is controlled by the duration of prior inactivation.
In parallel to the response of voltage-gated K channels, other K transport systems play a crucial role in regulating the cytosolic pH, and thus its effect on intracellular K+ concentration. For instance, if a cell has Kdp, a K transport system sensitive to both extracellular K+ concentration and pH, its growth can be inhibited due to a joint reduction of Kdp activity and repression of Kdp gene expression caused by moderate external K+. Additionally, at lower pH, the cell’s capacity to regulate its cytosolic pH through K transport is undermined.
This complex regulation involves intricate patterns of nested, hierarchical, and recursive control, where higher-level regulatory mechanisms modulate lower-level processes to achieve overall stability and adaptation [87]. Nested and hierarchical structures provide a layered approach to control, while recursive functions enable dynamic adjustments based on feedback loops, enhancing the robustness and flexibility of cellular responses. The combination of these regulatory mechanisms renders the multiscale interactions inherent in cellular processes a complex entity for modeling and analysis.
The extensive literature bridging control theory and biophysics on ion channel dynamics, adaptation, and responses provides valuable insights into the molecular constructs of these mechanisms. The control theoretic approach to modeling ion channel dynamics offers a useful lens on their function; interested readers can find detailed reviews of mathematical models and simulations of ion channels in [114–118]. These works explore various approaches, including traditional methods like Hodgkin–Huxley and Markov models, as well as modern techniques such as artificial neural networks (ANNs) and neural ordinary differential equations (ODEs) for modeling ion channel conductance and kinetics [119,120]. These advanced models offer improved accuracy and computational performance, providing valuable tools for studying the electrophysiological mechanisms of ion channels across different biological systems.
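To ground the Markov-model approach mentioned above, the following sketch simulates a single two-state (closed/open) channel with voltage-dependent opening and closing rates; the rate expressions and parameter values are illustrative placeholders rather than fits to any particular channel:

```python
import numpy as np

# Two-state Markov channel (C <-> O) with voltage-dependent rates; values are illustrative.
def alpha(v):   # opening rate (1/ms), grows with depolarization
    return 0.2 * np.exp(v / 25.0)

def beta(v):    # closing rate (1/ms), shrinks with depolarization
    return 0.4 * np.exp(-v / 25.0)

def open_fraction(v_mV, t_ms=500.0, dt=0.01, seed=0):
    """Fraction of time a single channel spends open at a fixed holding voltage."""
    rng = np.random.default_rng(seed)
    is_open, time_open = False, 0.0
    for _ in range(int(t_ms / dt)):
        rate = beta(v_mV) if is_open else alpha(v_mV)
        if rng.random() < rate * dt:          # transition probability in this time step
            is_open = not is_open
        time_open += dt * is_open
    return time_open / t_ms

for v in (-40, 0, 40):
    p_eq = alpha(v) / (alpha(v) + beta(v))    # analytical equilibrium open probability
    print(f"V = {v:+3d} mV: simulated open fraction {open_fraction(v):.2f}, equilibrium {p_eq:.2f}")
```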
System theory-based models, for instance, provide a novel method for characterizing ion channel kinetics, allowing for exceptional accuracy and computational efficiency compared to traditional methods [121]. The application of control theory concepts to neuronal homeostasis has clarified many underlying theoretical aspects and has suggested new experimental and computational approaches to better understand these processes [122]. While control theory and physics can guide us in untangling the dynamics at given scales, we also need formal systems that can aid in linking the models and can untangle the complex web of interactions in these forms of nested, hierarchical, and recursive structures. As such, recent work on category theory offers a formal mathematical framework to understand compositional structures, providing tools for the detailed analysis of nested and hierarchical systems in cellular processes [123]. Similarly, formal treatment of recursion in partially ordered components in distributed systems [62] can provide the needed complementary tool for deciphering the complex regulation discussed above. By adapting and fusing these approaches, we may better understand the need for cellular reconfiguration in response to environmental changes.
Problem 3: How to harness noise rather than succumb to it?
The dual nature of cellular noise
Cells face the challenge of noise at almost every scale of operation. Noise exists at the boundary with the external world as well as internally across molecular interactions and physical interfaces. Despite the abundance of noise, cells manage to function, grow, and evolve. The viewpoint that noise is pure nuisance has been gradually shifting to an understanding that noise is not necessarily detrimental to biological systems [124]. In this context, we will explore various aspects of noise, how cells handle such challenges, and what modeling frameworks can elucidate this complex relationship. While it may seem narrow to focus on specific aspects, this approach allows us to examine critical examples that illustrate broader principles. By examining specific instances of noise handling, we can uncover underlying mechanisms and strategies that are broadly applicable across different biological systems, thereby maintaining coherence with the holistic perspective advocated throughout the article.
The cell membrane acts as the boundary between the internal and external worlds of the cell. Through molecular sensing and transport, cells receive information about the external world and harvest necessary chemicals. Ligand–receptors and ion channels are key molecular systems through which a cell communicates and samples the external world. Aside from cases where a ligand irreversibly binds to its receptor, the ligand–receptor interaction is inherently stochastic and noisy [125]. This inherent noise, contrary to being a mere byproduct, plays a crucial role in the cell’s information processing capabilities. The irreversible binding may be desirable for drug testing purposes [126], but in principle, is the opposite of what a continuous information acquisition system would need. Considering the ligand–receptor binding as the means for probing the extracellular space, one can think of the “source (of the chemical), particles, and receptor” as an equivalent of “transmitter, channel, receiver.” An information-theoretic treatment can guide in separating different noise sources in a mixture model of kinetic and stochastic reactions [127].
In the context of negative feedback and genetic networks, a framework combining information theory and control theory provided insights into the fundamental limits of intracellular noise reduction [128]. By applying similar principles, we can better understand how cells manage noise at the various steps of signal transduction. However, if the subtleties of biological systems are not accounted for at each level, a blind adoption of such methodology might fail to capture many nuances. The speed of ligand–receptor binding defines the nature of the “noise” from a specific vantage point. Slow binding and unbinding can cause the response curve to gradually reach a saturation point, making it challenging for the cell to differentiate between high ligand concentrations [129,130]. Receptor noise studies have demonstrated that slow binding leads to reduced chemotactic efficiency, establishing bounds on the fidelity of such signaling processes [131,132].
Furthermore, from the perspective of the cell, measuring high concentrations of ligand introduces a different kind of noise management compared to low concentrations. At high concentrations, receptors can become saturated, making it difficult to distinguish between slightly different high ligand levels. This saturation arises not from low molecular counts but from limitations in receptor availability and binding dynamics. Cells can utilize pre-equilibrium sensing and signaling, where information from early, transient binding events is used before the system reaches equilibrium [133]. This allows cells to bypass the saturation limit, enhancing discrimination between high ligand concentrations and improving signal fidelity. Conversely, when the number of receptors in a cell is low, another form of noise is introduced due to the low signal-to-noise ratio caused by the small number of ligand molecules. In such cases, the shift of response tuning to downstream machinery can prevent the amplification of receptor-level noise, maintaining accurate signaling [133].
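A simple occupancy calculation, assuming a hypothetical pool of independent receptors with a single dissociation constant, illustrates the saturation problem described above: near saturation, a 10% change in ligand concentration shifts the mean occupancy by much less than the binomial binding noise, so nearby high concentrations become hard to tell apart.

```python
import numpy as np

N = 1000    # number of independent receptors (hypothetical)
Kd = 1.0    # dissociation constant, in arbitrary concentration units

def occupancy_stats(c):
    """Mean and standard deviation of the number of bound receptors at concentration c."""
    p = c / (c + Kd)                         # single-receptor occupancy probability
    return N * p, np.sqrt(N * p * (1 - p))   # binomial mean and binding noise

def discriminability(c):
    """Noise-scaled separation between c and a concentration 10% higher."""
    m1, s1 = occupancy_stats(c)
    m2, s2 = occupancy_stats(1.1 * c)
    return (m2 - m1) / np.sqrt(0.5 * (s1**2 + s2**2))

for c in (0.1, 1.0, 10.0, 100.0):
    print(f"c = {c:6.1f} (in units of Kd): d' for a 10% step = {discriminability(c):.2f}")
```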
Boundary noise and cellular communication
Another special treatment of noise occurs at the boundary, where interference noise due to the presence of structurally similar non-target molecules adds to the inherent stochasticity of a small number of target molecules. In such situations, ligand-induced stochastic cluster formation of receptors takes place, enhancing the receptor’s sensitivity and specificity [134].
This “noise-cancelling” receptor clustering and the ensuing allosteric interactions have evolved to enhance the signal-to-noise ratio (SNR) in cellular communication. Clustering of receptors increases sensitivity by reducing the detrimental effects of intrinsic and extrinsic noise. For example, receptor clustering has been shown to increase signal detection reliability and transduction efficiency, thereby optimizing the cellular response to environmental cues [135,136]. Additionally, ligand-induced stochastic cluster formation of receptors enhances both sensitivity and specificity through noise-induced symmetry breaking, compensating for the presence of non-target ligands in the environment [134]. In certain cases, this can be understood through the concept of stochastic resonance (stochastic resonance, where noise actually enhances the detection of weak signals, is further elaborated in the following sections). Particularly in clusters of major histocompatibility molecules and T-cell receptors, a stochastic quantizing encoding of transmembrane signaling improves information transfer in cellular processes [137].
A solution for managing noisy ligand–receptor interactions is to synchronize the excitable receptors during signal transduction [138]. These noise-induced phenomena, whether they result in spatial symmetry breaking or temporal symmetry breaking, might have separate evolutionary roots. However, they hint at a key point: cells are not merely passive receivers faltering in the face of abundant noise at the boundary. An important lesson is to recognize that the nature of the noise itself can guide us in modeling and understanding it effectively. As Monod elegantly put it, biological systems are shaped by deterministic laws and randomness [139]. While reductionist approaches provide detailed insights into the molecular aspects of noise in biological systems, they often fall short in capturing the emergent properties of complex systems. In biological contexts, the separation of scales is not straightforward. Higher-level properties and processes are not merely derivable from lower-level data and mechanisms but also act as causes of lower-level behavior, involving interactions in both directions [140]. Understanding how cells sample information from their environment and manage their responses to noise requires considering multiple levels of organization and these causally linked scales. This integrated perspective helps to elucidate how cells harness noise and adapt to changing conditions, thereby maintaining functionality and adaptability in changing environments.
However, the way that living systems harness noise, as exemplified above, shows us that a purely reductionist approach to the biophysical characteristics of ligand/receptor/membrane will leave us ill-equipped in understanding how cells manage challenging environments [140]. While reductionist approaches provide detailed insights into molecular interactions, they often miss the emergent properties and cross-scale dynamics that are crucial for understanding noise management. By integrating both reductionist details and perspectives on interscale interactions, we gain a more comprehensive understanding of cellular resilience and adaptability.
A deeper understanding of boundary noise and its management is necessary for understanding cellular efficiency in noisy environments. Developments such as patch clamp technology have revealed the conductance fluctuations of individual channels as they oscillate between open and closed states, influenced by thermal noise [141,142]. The probability of voltage-gated channels opening and closing is dependent on the membrane potential, which in turn affects channel noise. These fluctuations impact the timing and generation of action potentials, as well as subsequent changes in membrane potential. Channel noise can cause significant variations in spiking propagation [143], and it imposes a lower limit on the miniaturization of cell signaling [144,145]. This is because the noise generated by ion channels can shift the functional state of the cell, leading to spontaneous action potential generation [146]. Understanding these dynamics not only elucidates how cells maintain functionality under noisy conditions but also sets the stage for exploring how noise can be harnessed to enhance cellular processes, as discussed in the following section on stochastic resonance.
Harnessing noise: Stochastic resonance in cellular function
While some aspects of noise can be limiting or detrimental, noise itself can enhance the detection of weak signals in certain nonlinear systems, such as electronic circuits and biological sensory systems [147]. A phenomenon known as stochastic resonance (SR) has emerged as a key player in biological systems dealing with noise. SR, which enables the amplification of weak signals in nonlinear systems, illustrates that noise, when managed appropriately, can enhance cellular functionality. Initially introduced to explain the periodic recurrence of the Earth’s ice ages [148], SR involves adding noise to a nonlinear dynamical system to bring a weak signal above the threshold, enabling the system to detect sub-threshold signals [149,150]. In this setting, a periodically forced system shows a resonance peak in its response spectrum that is present in neither the forcing nor the noise alone [148].
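A minimal numerical illustration of the idea (Python, with arbitrary rather than biologically calibrated parameters) is a simple threshold detector driven by a sub-threshold sinusoid: too little noise and the threshold is never crossed, too much and the output is dominated by random crossings, while an intermediate noise level makes the output track the hidden signal best.

```python
# Minimal stochastic-resonance sketch (arbitrary parameters): a threshold
# detector fires when a sub-threshold sinusoid plus noise exceeds 1.0.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 100.0, 20_000)
signal = 0.5 * np.sin(2 * np.pi * 0.1 * t)   # weak periodic input (peak 0.5)
threshold = 1.0

for noise_sd in (0.2, 0.6, 3.0):
    crossings = (signal + rng.normal(0.0, noise_sd, t.size)) > threshold
    # correlation between the detector output and the hidden signal is low for
    # too little noise (rare crossings) and for too much noise (random firing),
    # and peaks at an intermediate level: the stochastic-resonance effect.
    corr = np.corrcoef(crossings.astype(float), signal)[0, 1]
    print(f"noise sd = {noise_sd:3.1f} -> output/signal correlation = {corr:.2f}")
```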
SR has been shown to contribute to signal detection in various sensory systems, including acoustic/electric stimulation of human hearing [151], vision [152], and noise-enhanced tactile sensation and balance control [153,154]. Large nonlinear networks are more sensitive to weak inputs when a fixed level of noise is added. In such cases, irrespective of the nature of the input signal, it can be pushed to cross the threshold [155].
However, SR is not limited to large networks of excitable units. SR has also been observed at the cellular level in crayfish mechanoreceptor cells [156] and in bistable single neuron models that show a correlated switch between states when they are driven by noise in the presence of periodic external modulation [157]. For SR to be effective at this scale, the noise must be optimized and adjusted to the nature of the signal [158].
SR also plays a significant role in ion channel signal transduction at the subcellular level. This was first demonstrated in a large parallel ensemble of an artificial ion channel, the polypeptide alamethicin incorporated into planar lipid bilayers, where added noise increased signal transduction more than a hundred-fold [159]. Ion channel conformational changes can be treated as thermally activated Poisson events: channels open and close stochastically and have no built-in “absolute” threshold. Because their activation barriers change linearly with the applied transmembrane voltage, additive voltage noise increases the instantaneous rate of the thermally activated transition, and the channels exhibit SR [160]. This phenomenon allows for the detection of small-amplitude input signals in thermally driven physico-chemical systems whose reaction rates are controlled by activation barriers, such as semiconductor p–n junctions and voltage-dependent ion channels [161].
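The barrier-modulation argument can be illustrated with a short calculation (Python, hypothetical numbers). If the opening rate follows an Arrhenius-like law whose exponent is linear in voltage, zero-mean voltage noise increases the average rate because the exponential is convex; this is the elementary mechanism exploited in the cited treatment of SR in thermally activated systems.

```python
# Hypothetical Arrhenius-like gating rate: k(V) = k0 * exp((V - V0) / s),
# i.e., the activation barrier changes linearly with the transmembrane voltage.
import numpy as np

rng = np.random.default_rng(3)
k0, V0, s = 10.0, -50.0, 10.0        # assumed rate scale, reference voltage, slope (mV)
V_mean = -60.0                       # sub-threshold mean membrane potential (mV)

def rate(V):
    return k0 * np.exp((V - V0) / s)

print(f"rate at the mean voltage       : {rate(V_mean):.2f} 1/s")
for noise_sd in (5.0, 10.0):         # zero-mean voltage noise, in mV
    V = V_mean + rng.normal(0.0, noise_sd, 100_000)
    # convexity of exp (Jensen's inequality) makes the mean rate exceed rate(V_mean)
    print(f"mean rate with {noise_sd:4.1f} mV noise : {rate(V).mean():.2f} 1/s")
```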
Although the activation barriers in ion channels change linearly with the applied voltage, the downstream effects in these driven nonlinear systems exhibit complex behavior. The presence of noise can induce resonant amplification of weak signals, a hallmark of SR, which significantly enhances signal detection and transmission. This is because SR exploits the interplay between noise and the nonlinear dynamics of the system, leading to emergent behaviors that are not predictable from the individual components alone [162]. This interplay often reveals new dynamic states and responses [163,164], highlighting the importance of considering higher-order interactions and emergent properties to fully understand noise management in biological systems.
It is worth noting, however, that the artificial channels used in these studies are less sensitive to temperature variations than natural channels [165,166]. Near physiological temperature, the higher temperature sensitivity of natural channels and the conformational transitions it induces effectively change the energy landscape and the transition rate between closed and open states at the level of single ion channels [166]. As a result, at the scale of a single ion channel, thermal-noise-induced SR enhances the rate of information gain more significantly in biological channels than in artificial ones [166,167].
Periodic, aperiodic, and nonstationary stochastic resonance occurring at physiological temperature allows cells to harness noise to enhance signal transmission and information gain with fewer channels [167,168]. This strategy is energy-efficient, as increasing the density of ion channels to combat noise is energy-consuming.
The collective properties of ion channel assemblies also shape SR and the response to noise. The number of ion channels determines the amplitude of membrane potential fluctuations [169]. In small clusters of ion channels, SR and threshold-crossing are primarily defined by individual channel kinetics and thermal-noise-induced SR [165].
As the size of the membrane patch increases (for example, in soma versus dendrites) and/or the density of ion channels increases (through modulation of gene expression or their polarized distribution, as in neurons), a system-size resonance occurs [170]. This emergent collective property of globally coupled ion channel assemblies exhibits a resonant-like temporal coherence [165]. It leads to a type of SR that occurs only in large clusters of ion channels, even in the presence of the suboptimal intrinsic noise of single channels [165,171].
Quantization and noise management in cellular systems
Unlike engineered systems, where efforts are focused on eliminating noise, cells harness boundary noise at the subcellular scale. SR plays a key role in dealing with boundary noise at both the individual ion channel and collective levels.
By examining the challenges that cells must overcome, we can gain a deeper understanding of the computational primitives that operate through cellular biophysical interfaces and biochemical pathways. The stochastic nature of ion channels and the presence of stochastic resonance can guide us to probe information gating through the lens of quantization error.
Ligand receptors and ion channels serve as quantizing sampling devices. Their stochastic kinetics and density define the collective sampling rate. We have discussed how SR can enhance the sensitivity of ion channels to subthreshold inputs. Equally important, thresholded systems exhibiting SR are a particular form of dithered quantizer [172]. Dithering, a technique commonly used in digital signal processing domains such as audio and picture coding, involves adding random noise (dither) to data before quantization, storage, or transmission. During World War II, engineers discovered that mechanical computers used for radar and trajectory calculations performed better onboard airplanes than on the ground: vibration-induced noise randomized the quantization error, thereby increasing the accuracy of the calculations [173]. Since then, it has become standard practice to add a small amount of noise to a signal either before quantization (during analog-to-digital conversion) or when reducing the bit depth of a digital signal (for example, reducing an image from 256 to 16 gray levels). Adding this noise randomizes the quantization error when a reduction in precision is necessary [174]. The dithering characteristic of SR allows cells to modulate the sampling precision and information transfer of ion channels through a quantization operation.
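The signal-processing idea itself is easy to demonstrate (Python, generic illustration rather than a model of any specific cellular mechanism): variations smaller than one quantization step are invisible to a plain quantizer, but adding dither before quantization and averaging repeated samples recovers the sub-step detail.

```python
# Generic dithering demonstration (not a model of a specific cellular process):
# a signal whose variation is smaller than one quantization step.
import numpy as np

rng = np.random.default_rng(4)
x = 0.3 * np.sin(np.linspace(0, 2 * np.pi, 200))    # sub-step variation
step = 1.0

def quantize(v):
    return np.round(v / step) * step

plain = quantize(x)                                  # collapses to a flat output
repeats = 200                                        # repeated noisy "measurements"
dither = rng.uniform(-step / 2, step / 2, (repeats, x.size))
dithered = quantize(x + dither).mean(axis=0)         # average of dithered quantizations

print("rms error, plain quantization  :", round(float(np.sqrt(np.mean((plain - x) ** 2))), 3))
print("rms error, dithered + averaged :", round(float(np.sqrt(np.mean((dithered - x) ** 2))), 3))
```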
Thresholded dynamical systems, such as ion channels and excitable membranes, can be interpreted as multistable systems. Signal transduction through these channels has a finite range, and the sampling process is discrete and stochastic. This leads to inevitable distortion and loss of signal details, which can manifest as spurious frequencies or amplitude reduction due to time discretization and amplitude quantization.
In systems with a nonlinear input–output relationship, the quantization error needs to be independent of the input signal to minimize distortion [175]. In thresholded nonlinear dynamical systems, like cells with excitable membranes, SR is more akin to a special case of a dithering effect rather than a resonant phenomenon [172,175].
Just as image dithering maintains the color-depth gradient during bit reduction (e.g., when converting grayscale to black-and-white), SR in multi-thresholded dynamical systems can increase the number of quantization levels [176]. SR affects the responsiveness of individual channels, causing them to open stochastically at different rates. Collectively, these opening dynamics create a multi-threshold system.
Instead of all channels sharing a single fixed threshold for switching to the open state, the presence of noise induces concentrated transitions to the open state in clusters of channels. The small intervals created around multiple thresholds, determined by the different sizes of these clusters, increase the number of quantization levels. This increase in quantization levels allows the cell to transduce microscopic fluctuations in the external environment in a graded, discretized manner.
As a result, by harnessing noise, cells can effectively capture a macroscopic view of their environment at the boundary. We can infer that SR enables cells to be sensitive to microscopic fluctuations while maintaining a macroscopic understanding of their environment.
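A deliberately simplified sketch of the multi-threshold picture (Python, illustrative only) compares an ensemble of threshold units that all share one threshold with an ensemble whose effective thresholds are spread out, as they would be across clusters of different sizes: the first yields an all-or-none output, the second a graded, multi-level quantization of the input.

```python
# Illustrative only: summed output of threshold units as a quantizer.
import numpy as np

inputs = np.linspace(0.0, 1.0, 500)                    # slowly ramping stimulus
n_units = 64
same_thresholds = np.full(n_units, 0.5)                # one shared threshold
spread_thresholds = np.linspace(0.05, 0.95, n_units)   # spread across "clusters"

def summed_response(thresholds):
    # each unit outputs 1 ("opens") when the input exceeds its threshold
    return (inputs[:, None] > thresholds[None, :]).sum(axis=1)

for name, th in [("single threshold", same_thresholds), ("spread thresholds", spread_thresholds)]:
    levels = len(np.unique(summed_response(th)))
    print(f"{name:17s}: {levels} distinct output levels")
```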
Modeling and multilevel implications of cellular noise
Cells’ ability to operate efficiently amidst noisy dynamics underscores the functional role of noise in biological systems [124,125]. Significantly, noise can play flexible functional roles at multiple levels: as stochastic boundary noise in a single channel, in clusters of channels within a membrane patch, in the collective properties of the polarized distribution of channels across a single cell, and in networks of cells. This highlights the importance of multilevel (multiscale) modeling.
Multiscale modeling and simulation provide powerful tools to capture the complexity of biological systems [14]. Developing a common framework for multiscale modeling, which includes defining clear scales and ensuring proper scale bridging, helps in understanding how microscopic interactions aggregate to influence macroscopic phenomena and vice versa [177]. A balanced approach that integrates reductionist and systemic perspectives is necessary for understanding complex systems [178]. While reductionism provides detailed insights into individual components, emergent properties and interactions across scales will require a different approach to understand collective behaviors that are not evident from the properties of individual components. This integrated approach ensures that modeling techniques are tailored to specific contexts, recognizing the variability in scale interactions across different biological systems [179]. Given the spatial variability and temporal dynamics of the involved components, we need to resort to the heterogeneous multiscale method in order to effectively couple macroscopic and microscopic models [180]. In cellular systems, this method can help capture how molecular-level interactions impact macroscopic behavior. For instance, deciphering the dynamics of ion channels and receptors at the microscopic level can significantly clarify our understanding of complex macroscopic phenomena like cell signaling, membrane excitability, and cell–cell communication in the presence of noise.
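A deliberately simplified sketch of this macro–micro coupling pattern is given below (Python, with hypothetical parameters; it is not a validated membrane model). At each macroscopic time step, a membrane-voltage equation queries a microscopic stochastic channel ensemble for the current open fraction, and the voltage in turn sets the channels' opening rate, in the spirit of the heterogeneous multiscale method.

```python
# Toy macro-micro coupling (hypothetical parameters, not a validated model):
# macroscopic membrane voltage ODE driven by a microscopic stochastic ensemble.
import numpy as np

rng = np.random.default_rng(5)
N = 200                                    # microscopic channels in the patch
C, g_leak, g_max = 1.0, 0.1, 1.0           # capacitance and conductances (arbitrary units)
E_leak, E_ion, I_ext = -65.0, -90.0, 1.5   # reversal potentials (mV) and external drive
dt, steps = 0.1, 2000                      # macroscopic time step (ms)

V = -65.0
open_state = np.zeros(N, dtype=bool)

for _ in range(steps):
    # micro step: stochastic gating with a voltage-dependent opening rate (assumed form)
    alpha = 0.1 / (1.0 + np.exp(-(V + 50.0) / 5.0))   # opening rate, 1/ms
    beta = 0.05                                        # closing rate, 1/ms
    u = rng.random(N)
    open_state = np.where(open_state, u >= beta * dt, u < alpha * dt)
    f_open = open_state.mean()
    # macro step: membrane voltage updated using the microscopic open fraction
    dV = (-g_leak * (V - E_leak) - g_max * f_open * (V - E_ion) + I_ext) / C
    V += dt * dV

print(f"final voltage ~ {V:.1f} mV, open fraction ~ {f_open:.2f}")
```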
The multiscale effects of stochastic resonance align well with the concept that there is no privileged level of causality in biological systems [181]. This suggests that all levels of biological organization, from molecular to cellular to systemic, play a crucial role in the overall function and behavior of the system. This perspective encourages a comprehensive approach to studying biological systems, taking into account the interplay between different levels and the role of noise in these interactions. By integrating insights from molecular dynamics, cellular interactions, and system-level behavior, we can better understand the emergent properties and complex interactions and how cells harness and manage noise to maintain functionality and adaptability.
Concluding remarks
The challenge of biological modeling
Hilbert’s list of problems was not just a set of unsolved mathematical problems, but also a framework for advancing the discipline. While some of these problems remain unsolved, the concept of such a framework has endured. Unlike mathematics, which is grounded in logic and proofs, biology has largely been an ever-expanding collection of detailed findings of limited scope. Efforts at biological modeling and theorizing have frequently struggled to synthesize these findings into unified theories. This raises the question: What is missing in our approach? Is biology fundamentally different from quantitative fields like physics? Can biology not be tamed with mathematical models? Or do we simply not yet have the appropriate mathematical tools to properly model biological systems?
In a thought-provoking essay [182], Gunawardena posits that our models, at least at the molecular scale, are mostly phenomenological and amount to little more than educated guesswork. He points out that unlike physics, biological models are not objective. Therefore, our best strategy is to begin with a sound set of assumptions that can guide us in selecting the most suitable model for the problem at hand [182]. The essay paraphrases James Black, a Nobel laureate pharmacologist, reminding us that “our models are accurate descriptions of our pathetic thinking.” This depiction of biological models highlights key issues. Biological modeling faces additional challenges due to the complexity and variability inherent in living systems. The dynamic interactions, nonlinearity, and stochasticity at multiple scales make it difficult to create comprehensive models that are both predictive and explanatory.
Embracing a problem-oriented approach in biology
One of the challenging issues with current biological models is their focus on isolated components rather than integrated systems. This approach leads to disjointed models that hinder our ability to formulate fundamental theories. While the biological sciences have seen exponential growth, the search for principles has not kept pace. Some biological textbooks even include words like “fundamentals” or “principles” in their titles, yet they primarily expand with more detailed examples and new findings without consolidating overarching principles (as an example, the 1st edition (1981) of “Principles of Neural Science” was published in 468 pages, and its 6th edition (2021) has grown to 1,646 pages). This massive expansion in volume, while reflecting incredible progress in our knowledge of the system’s details, is antithetical to extracting fundamental principles in a theory-laden science. It underscores a significant challenge: the need for models that bridge different scales and components, and for a framework that facilitates integration across such models. In this context, we should highlight the importance of understanding not only system structures but also system dynamics, control methods, and design methods as critical properties for a comprehensive model in biology [183]. Incorporating these perspectives can guide the development of integrated models that address the complexity of biological systems.
Here, we seek to address this challenge by proposing a program inspired by Hilbert’s approach. By grounding our understanding in fundamental principles and axioms, we can systematically build multiscale models that capture causal links across different scales. This problem-oriented approach keeps us aligned with empirical data and ensures the development of falsifiable models, fostering a cohesive framework for integrating diverse biological models into core principles. Adhering to constraints and details rooted in the physical nature of biological components is crucial. This adherence will give us a chance to formulate models that are true to the underlying biophysics with predictions that can be experimentally falsified [184].
Necessarily, we need to avoid being trapped by disjointed descriptions of biological systems. A computational lens can guide us to focus on systemic functions rather than isolated components. There are convergent themes between biological and computing systems that fall under the lens of “computational thinking,” such as distributed and coordinated processes, robustness to failure/attacks, and modularity [185].
The concept of separation of scales, where slow processes are treated as static constraints, dynamic processes are governed by mechanical laws, and fast processes are averaged over, is crucial in modeling many physical systems. In biological systems, however, scales are often coupled and causally linked, making this separation less straightforward. Complex systems in biology therefore require an integrated approach that considers interactions across scales. By emphasizing problems over methods to guide proper heterogeneous multiscale models, we can identify the constraints inherent in these problems and develop integrated models that reflect the true complexity of biological systems. This approach aligns with the idea that biological theory can benefit from combining parsimonious reasoning with rule-based explanations [186]. By incorporating these principles, we can better understand how to build effective models that bridge different scales and components, providing a more comprehensive understanding of biological systems.
Integrating biophysics, computational thinking, and cybernetic principles
The proposed framework, synergizing computational thinking with biophysical modeling, aims to foster the design of experiments towards uncovering fundamental biological principles. This approach also addresses the limitations of cybernetics by integrating insights on feedback and control within the broader reductionist perspective of molecular biology. While cybernetics correctly recognized the crucial importance of feedback and control in biological systems [28], its overemphasis on teleology [22] left it sidelined by the explosive success of reductionist molecular biology. What makes biology truly unique is its concern with purpose [187], but the search for purposeful behavioral descriptions untied from the underlying biophysical processes was perhaps a key factor behind cybernetics’ demise. We can bring the insightful aspects of cybernetics back to center stage while avoiding its pitfalls.
It is important to acknowledge that models whose sole aim is to outperform their competition in data fitting are non-falsifiable [182]. The increasing use of artificial neural networks (ANNs) in biology is noteworthy. While ANNs have shown success in modeling complex communication and control pathways, their success hinges on a high-parameter nonlinear fitting process. This often amounts to replacing one complex nonlinear system with another black box that remains poorly understood. However, the potential of ANNs can be harnessed more effectively within the framework of mathematical models, where they can serve as nonlinear optimizers. This approach enables the fusion of physical laws with data-driven methods, yielding a more comprehensive description and prediction of system behavior [188,189]. Importantly, it also ensures the falsifiability of our theories, a cornerstone of scientific modeling.
Advancements in machine learning present opportunities to integrate data-driven methods with physical principles, facilitating the creation of models capable of deciphering complex dynamical systems from noisy and incomplete data [190,191]. Combining machine learning with multiscale modeling can enhance our understanding of complex biological systems by leveraging the strengths of both approaches. Multiscale modeling provides a structured framework to address different temporal and spatial scales, effectively capturing the interactions between them [192]. For instance, this approach involves identifying relevant scales and developing models at each scale, which are then combined to form a comprehensive multiscale model. This method allows us to understand how microscopic interactions influence macroscopic phenomena and vice versa [180]. Integrating machine learning with multiscale modeling can address the limitations of each approach individually. Machine learning excels in handling large data sets and uncovering correlations, while multiscale modeling is adept at probing causality and mechanisms [193]. By combining these methods, we can create robust predictive models that integrate underlying physics to manage ill-posed problems and explore massive design spaces. This integration allows for a more comprehensive understanding of biological systems, capturing emergent properties and complex interactions across scales [194].
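At its simplest, the combination of mechanistic structure with data-driven estimation can be sketched as fitting the parameters of a known physical form to noisy observations and then testing the fitted model on held-out data (Python, with a hypothetical toy model and synthetic data; the cited frameworks go far beyond this, for example by embedding neural networks inside differential equations).

```python
# Toy sketch: constrain a mechanistic model, x(t) = x0 * exp(-k t), with noisy data,
# then make predictions on held-out time points that could falsify the model.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)

def model(t, x0, k):
    return x0 * np.exp(-k * t)

t_train = np.linspace(0.0, 5.0, 30)
observed = model(t_train, 2.0, 0.7) + rng.normal(0.0, 0.05, t_train.size)  # synthetic data

popt, _ = curve_fit(model, t_train, observed, p0=[1.0, 1.0])
print(f"recovered parameters: x0 ~ {popt[0]:.2f}, k ~ {popt[1]:.2f}")

t_test = np.linspace(5.0, 8.0, 5)   # held-out times: a large mismatch here rejects the model
print("held-out predictions:", model(t_test, *popt).round(3))
```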
This integrated framework signifies a substantial opportunity for advancement in biology. Given the current technological landscape, encompassing precision devices, high-throughput techniques, and advancements in computing and AI, we are well positioned for significant progress. Theoretical frameworks for cellular computation and cell–cell communication could lay the groundwork for biology to develop its first principled quantitative theories, mirroring the remarkable achievements of physics in the early 20th century. This prospect could be within our reach. What can propel us is the application of a systematic Hilbertian approach for the formalization of system-level problems in biology.
Acknowledgments
The author would like to express gratitude to the organizers and attendees of the “What is biological computation?” workshop at Santa Fe Institute. Some of the ideas presented in this work were inspired by discussions that took place during the workshop. The author also wishes to thank both reviewers (the anonymous reviewer and Iain Johnston) for their constructive comments and thoughtful suggestions.
References
- 1. Carlson JM, Doyle J. Complexity and robustness. Proc Natl Acad Sci U S A. 2002;99(suppl1):2538–2545. pmid:11875207
- 2. Csete ME, Doyle JC. Reverse Engineering of Biological Complexity. Science. 2002;295(5560):1664–1669. pmid:11872830
- 3. Kitano H. Biological robustness. Nat Rev Genet. 2004;5(11):826–837. pmid:15520792
- 4. Barkai N, Leibler S. Robustness in simple biochemical networks. Nature. 1997;387(6636):913–917. pmid:9202124
- 5. Balázsi G, van Oudenaarden A, Collins JJ. Cellular Decision Making and Biological Noise: From Microbes to Mammals. Cell. 2011;144(6):910–925. pmid:21414483
- 6. Kitano H. Towards a theory of biological robustness. Mol Syst Biol. 2007;3(1):137. pmid:17882156
- 7. Lazebnik Y. Can a biologist fix a radio?—Or, what I learned while studying apoptosis. Cancer Cell. 2002;2(3):179–182. pmid:12242150
- 8. Vittadello ST, Stumpf MPH. Open problems in mathematical biology. Math Biosci. 2022;354:108926. pmid:36377100
- 9. Hilbert D. Mathematical problems. Bull Amer Math Soc. 1902;8(10):437–479.
- 10. Woodger JH. Biology and the Axiomatic Method. Ann N Y Acad Sci. 1962;96(4):1093–1116. pmid:14008214
- 11. Mahner M, Bunge M. Foundations of biophilosophy. 1st ed. Springer Berlin; 1997.
- 12. Pearl J. Causality. 2nd ed. Cambridge: Cambridge University Press; 2009. Available from: https://doi.org/10.1017/CBO9780511803161
- 13. Pearl J. Causal inference in statistics: An overview. Stat Surv. 2009;3:96–146.
- 14. Weinan E, Jianfeng L. Multiscale modeling. Scholarpedia. 2011;6(10):11527.
- 15. Noble D. Genes and causation. Philos Trans R Soc A. 2008;366(1878):3001–3015. pmid:18559318
- 16. Noble D. A theory of biological relativity: no privileged level of causation. Interface Focus. 2012;2(1):55–64. pmid:23386960
- 17. Hoel EP, Albantakis L, Tononi G. Quantifying causal emergence shows that macro can beat micro. Proc Natl Acad Sci U S A. 2013;110(49):19790–19795. pmid:24248356
- 18. Hoel E, Levin M. Emergence of informative higher scales in biological systems: a computational toolkit for optimal prediction and control. Commun Integr Biol. 2020;13(1):108–118. pmid:33014263
- 19. Hoel EP. When the Map Is Better Than the Territory. Entropy. 2017;19(5).
- 20. Crecraft D, Gergely S. Analog Electronics: circuits, systems and signal processing. Elsevier; 2002.
- 21. Morris AS, Langari R. Measurement and instrumentation: theory and application. Academic Press; 2011.
- 22. Rosenblueth A, Wiener N, Bigelow J. Behavior, Purpose and Teleology. Philos Sci. 1943;10(1):18–24.
- 23. Ayala FJ. Teleological Explanations in Evolutionary Biology. Philos Sci. 1970;37(1):1–15.
- 24. McDonough JK. Not Dead Yet: Teleology and the “Scientific Revolution”. In: Teleology: A History. Oxford University Press; 2020. p. 150–179. Available from: https://doi.org/10.1093/oso/9780190845711.003.0009.
- 25. Wouters AG. Four notions of biological function. Stud Hist Philos Biol Biomed Sci. 2003;34(4):633–668.
- 26. Wimsatt WC. Teleology and the logical structure of function statements. Stud Hist Philos Sci A. 1972;3(1):1–80.
- 27. García-Valdecasas M, Deacon TW. Biological functions are causes, not effects: A critique of selected effects theories. Stud Hist Philos Sci. 2024;103:20–28. pmid:37984082
- 28. Wiener N. Cybernetics or Control and Communication in the Animal and the Machine. The MIT Press; 1965. Available from: https://doi.org/10.7551/mitpress/11810.001.0001.
- 29. Stelling J, Sauer U, Szallasi Z, Doyle FJ, Doyle J. Robustness of Cellular Functions. Cell. 2004;118(6):675–685. pmid:15369668
- 30. The Evolution of Genetic Robustness for Cellular Cooperation in Early Multicellular Organisms. In: ALIFE 2022: The 2022 Conference on Artificial Life. Artificial Life Conference Proceedings; 2022. Available from: https://doi.org/10.1162/isal_a_00536.
- 31. Milo R, Shen-Orr S, Itzkovitz S, Kashtan N, Chklovskii D, Alon U. Network Motifs: Simple Building Blocks of Complex Networks. Science. 2002;298(5594):824–827. pmid:12399590
- 32. Alon U. Network motifs: theory and experimental approaches. Nat Rev Genet. 2007;8(6):450–461. pmid:17510665
- 33. Alon U. An Introduction to Systems Biology: Design Principles of Biological Circuits. Chapman and Hall/CRC; 2019.
- 34. Ashby WR. The Physical Origin of Adaptation by Trial and Error. J Gen Psychol. 1945;32(1):13–25.
- 35. Ross Ashby W. Design for a brain: The origin of adaptive behaviour. Springer Science & Business Media; 2013.
- 36. Popper KR. The logic of scientific discovery. London, England: Routledge; 2005.
- 37. Yu H, Gerstein M. Genomic analysis of the hierarchical structure of regulatory networks. Proc Natl Acad Sci U S A. 2006;103(40):14724–14731. pmid:17003135
- 38. Tonner PD, Pittman AMC, Gulli JG, Sharma K, Schmid AK. A Regulatory Hierarchy Controls the Dynamic Transcriptional Response to Extreme Oxidative Stress in Archaea. PLoS Genet. 2015;11(1):1–13. pmid:25569531
- 39. Defoort J, Van de Peer Y, Vermeirssen V. Function, dynamics and evolution of network motif modules in integrated gene regulatory networks of worm and plant. Nucleic Acids Res. 2018;46(13):6480–6503. pmid:29873777
- 40. Kreimer A, Borenstein E, Gophna U, Ruppin E. The evolution of modularity in bacterial metabolic networks. Proc Natl Acad Sci U S A. 2008;105(19):6976–6981. pmid:18460604
- 41. Kashtan N, Alon U. Spontaneous evolution of modularity and network motifs. Proc Natl Acad Sci U S A. 2005;102(39):13773–13778. pmid:16174729
- 42. Stone L, Simberloff D, Artzy-Randrup Y. Network motifs and their origins. PLoS Comput Biol. 2019;15(4):1–7. pmid:30973867
- 43. Clune J, Mouret JB, Lipson H. The evolutionary origins of modularity. Proc R Soc Lond B Biol Sci. 2013;280(1755):20122863. pmid:23363632
- 44. Angulo MT, Liu YY, Slotine JJ. Network motifs emerge from interconnections that favour stability. Nat Phys. 2015;11(10):848–852.
- 45. Bechtel W, Bich L. Grounding cognition: heterarchical control mechanisms in biology. Philos Trans R Soc Lond B Biol Sci. 2021;376(1820):20190751. pmid:33487110
- 46. Simon HA. The Architecture of Complexity. Proc Am Philos Soc. 1962;106(6):467–482.
- 47. McCulloch WS. A heterarchy of values determined by the topology of nervous nets. Bull Math Biophys. 1945;7(2):89–93.
- 48. Bei X, Chen N, Zhang S. On the Complexity of Trial and Error. In: Proceedings of the Forty-Fifth Annual ACM Symposium on Theory of Computing. STOC ‘13. New York, NY, USA: Association for Computing Machinery; 2013. p. 31–40.
- 49. Lyon P, Keijzer F, Arendt D, Levin M. Reframing cognition: getting down to biological basics. Philos Trans R Soc Lond B Biol Sci. 2021;376(1820):20190750. pmid:33487107
- 50. Kirschner M, Gerhart J. Evolvability. Proc Natl Acad Sci U S A. 1998;95(15):8420–8427. pmid:9671692
- 51. Navlakha S, He X, Faloutsos C, Bar-Joseph Z. Topological properties of robust biological and computational networks. J R Soc Interface. 2014;11(96):20140283. pmid:24789562
- 52. Ahmed SS, Roy S, Kalita J. Assessing the Effectiveness of Causality Inference Methods for Gene Regulatory Networks. IEEE/ACM Trans Comput Biol Bioinform. 2020;17(1):56–70. pmid:29994618
- 53. Aalto A, Viitasaari L, Ilmonen P, Mombaerts L, Gonçalves J. Gene regulatory network inference from sparsely sampled noisy data. Nat Commun. 2020;11(1):3493. pmid:32661225
- 54. Lu J, Dumitrascu B, McDowell IC, Jo B, Barrera A, Hong LK, et al. Causal network inference from gene transcriptional time-series response to glucocorticoids. PLoS Comput Biol. 2021;17(1):1–29. pmid:33513136
- 55. Shinar G, Dekel E, Tlusty T, Alon U. Rules for biological regulation based on error minimization. Proc Natl Acad Sci U S A. 2006;103(11):3999–4004. pmid:16537475
- 56. Kang EY, Ye C, Shpitser I, Eskin E. Detecting the Presence and Absence of Causal Relationships between Expression of Yeast Genes with Very Few Samples. J Comput Biol. 2010;17(3):533–546. pmid:20377462
- 57. Meinshausen N, Hauser A, Mooij JM, Peters J, Versteeg P, Bühlmann P. Methods for causal inference from gene perturbation experiments and validation. Proc Natl Acad Sci U S A. 2016;113(27):7361–7368. pmid:27382150
- 58. Wen Y, Huang J, Guo S, Elyahu Y, Monsonego A, Zhang H, et al. Applying causal discovery to single-cell analyses using CausalCell. Elife. 2023;12:e81464. pmid:37129360
- 59. Lotito QF, Musciotto F, Montresor A, Battiston F. Higher-order motif analysis in hypergraphs. Commun Phys. 2022;5(1):79.
- 60. Battiston F, Amico E, Barrat A, Bianconi G, Ferraz de Arruda G, Franceschiello B, et al. The physics of higher-order interactions in complex systems. Nat Phys. 2021;17(10):1093–1098.
- 61. Tabar MRR, Nikakhtar F, Parkavousi L, Akhshi A, Feudel U, Lehnertz K. Revealing Higher-Order Interactions in High-Dimensional Complex Systems: A Data-Driven Approach. Phys Rev X. 2024;14:011050.
- 62. Lamport L. Time, clocks, and the ordering of events in a distributed system. Commun ACM. 1978;21(7):558–565.
- 63. Regev A, Silverman W, Shapiro E. Representation and simulation of biochemical processes using the π-calculus process algebra. In: Biocomputing 2001. World Scientific; 2000. p. 459–470. Available from: https://doi.org/10.1142/9789814447362-0045.
- 64. Priami C, Regev A, Shapiro E, Silverman W. Application of a stochastic name-passing calculus to representation and simulation of molecular processes. Inf Process Lett. 2001;80(1):25–31.
- 65. Pinney JW, Westhead DR, McConkey GA. Petri Net representations in systems biology. Biochem Soc Trans. 2003;31(6):1513–1515. pmid:14641101
- 66. Phillips A, Cardelli L. Efficient, Correct Simulation of Biological Processes in the Stochastic Pi-calculus. In: Calder M, Gilmore S, editors. Computational Methods in Systems Biology. Berlin, Heidelberg: Springer Berlin Heidelberg; 2007. p. 184–199.
- 67. Priami C. Algorithmic Systems Biology. Commun ACM. 2009;52(5):80–88.
- 68. Cooper SJ. From Claude Bernard to Walter Cannon. Emergence of the concept of homeostasis. Appetite. 2008;51(3):419–427. pmid:18634840
- 69. Cannon WB. Organization for physiological homeostasis. Physiol Rev. 1929;9(3):399–431.
- 70. Modell H, Cliff W, Michael J, McFarland J, Wenderoth MP, Wright A. A physiologist’s view of homeostasis. Adv Physiol Educ. 2015;39(4):259–266. pmid:26628646
- 71. Schneck DJ. Feedback control and the concept of homeostasis. Mathematical Modelling. 1987;9(12):889–900.
- 72. Yi TM, Huang Y, Simon MI, Doyle J. Robust perfect adaptation in bacterial chemotaxis through integral feedback control. Proc Natl Acad Sci U S A. 2000;97(9):4649–4653. pmid:10781070
- 73. Briat C, Gupta A, Khammash M. Antithetic Integral Feedback Ensures Robust Perfect Adaptation in Noisy Biomolecular Networks. Cell Syst. 2016;2(1):15–26. pmid:27136686
- 74. Aoki SK, Lillacci G, Gupta A, Baumschlager A, Schweingruber D, Khammash M. A universal biomolecular integral feedback controller for robust perfect adaptation. Nature. 2019;570(7762):533–537. pmid:31217585
- 75. Chevalier M, Gómez-Schiavon M, Ng AH, El-Samad H. Design and Analysis of a Proportional-Integral-Derivative Controller with Biological Molecules. Cell Syst. 2019;9(4):338–353.e10. pmid:31563473
- 76. Filo M, Kumar S, Khammash M. A hierarchy of biomolecular proportional-integral-derivative feedback controllers for robust perfect adaptation and dynamic performance. Nat Commun. 2022;13(1):2119. pmid:35440114
- 77. McEwen BS. Stress, Adaptation, and Disease: Allostasis and Allostatic Load. Ann N Y Acad Sci. 1998;840(1):33–44. pmid:9629234
- 78. McEwen BS. Protective and Damaging Effects of Stress Mediators. N Engl J Med. 1998;338(3):171–179. pmid:9428819
- 79. Selye H. Homeostasis and Heterostasis. Perspect Biol Med. 1973;16(3):441–445. pmid:4705073
- 80. Davies KJA. Adaptive homeostasis. Mol Aspects Med. 2016;49:1–7. pmid:27112802
- 81. Waddington CH. The Strategy of the Genes. 1st ed. Routledge; 1957. Available from: https://doi.org/10.4324/9781315765471.
- 82. Coomer MA, Ham L, Stumpf MPH. Noise distorts the epigenetic landscape and shapes cell-fate decisions. Cell Syst. 2022;13(1):83–102.e6. pmid:34626539
- 83. Ferrell JE Jr. Bistability, Bifurcations, and Waddington’s Epigenetic Landscape. Curr Biol. 2012;22(11):R458–R466. pmid:22677291
- 84. Paulsson J. Summing up the noise in gene networks. Nature. 2004;427(6973):415–418. pmid:14749823
- 85. Paulsson J. Models of stochastic gene expression. Phys Life Rev. 2005;2(2):157–175.
- 86. Aderem A. Systems Biology: Its Practice and Challenges. Cell. 2005;121(4):511–513. pmid:15907465
- 87. Klein B, Hoel E. The Emergence of Informative Higher Scales in Complex Networks. Complexity. 2020;2020(1):8932526.
- 88. Klein B, Hoel E, Swain A, Griebenow R, Levin M. Evolution and emergence: higher order information structure in protein interactomes across the tree of life. Integr Biol. 2021;13(12):283–294. pmid:34933345
- 89. Sterling P. Allostasis: A model of predictive regulation. Physiol Behav. 2012;106(1):5–15. pmid:21684297
- 90. O’Leary T, Williams AH, Caplan JS, Marder E. Correlations in ion channel expression emerge from homeostatic tuning rules. Proc Natl Acad Sci U S A. 2013;110(28):E2645–E2654. pmid:23798391
- 91. Ori H, Marder E, Marom S. Cellular function given parametric variation in the Hodgkin and Huxley model of excitability. Proc Natl Acad Sci U S A. 2018;115(35):E8211–E8218. pmid:30111538
- 92. Yang J, Shakil H, Ratté S, Prescott SA. Minimal requirements for a neuron to coregulate many properties and the implications for ion channel correlations and robustness. Elife. 2022;11:e72875. pmid:35293858
- 93. Duménieu M, Oulé M, Kreutz MR, Lopez-Rojas J. The Segregated Expression of Voltage-Gated Potassium and Sodium Channels in Neuronal Membranes: Functional Implications and Regulatory Mechanisms. Front Cell Neurosci. 2017;11:115. pmid:28484374
- 94. Meadows JP, Guzman-Karlsson MC, Phillips S, Holleman C, Posey JL, Day JJ, et al. DNA methylation regulates neuronal glutamatergic synaptic scaling. Sci Signal. 2015;8(382):ra61–ra61. pmid:26106219
- 95. Meadows JP, Guzman-Karlsson MC, Phillips S, Brown JA, Strange SK, Sweatt JD, et al. Dynamic DNA methylation regulates neuronal intrinsic membrane excitability. Sci Signal. 2016;9(442):ra83–ra83. pmid:27555660
- 96. Rosati B, McKinnon D. Regulation of Ion Channel Expression. Circ Res. 2004;94(7):874–883. pmid:15087427
- 97. Yue L, Melnyk P, Gaspo R, Wang Z, Nattel S. Molecular Mechanisms Underlying Ionic Remodeling in a Dog Model of Atrial Fibrillation. Circ Res. 1999;84(7):776–784. pmid:10205145
- 98. Balse E, Steele DF, Abriel H, Coulombe A, Fedida D, Hatem SN. Dynamic of Ion Channel Expression at the Plasma Membrane of Cardiomyocytes. Physiol Rev. 2012;92(3):1317–1358. pmid:22811429
- 99. Franciolini F, Petris A. Evolution of ionic channels of biological membranes. Mol Biol Evol. 1989;6(5):503–513. pmid:2477661
- 100. Moran Y, Barzilai MG, Liebeskind BJ, Zakon HH, Anderson PAV. Evolution of voltage-gated ion channels at the emergence of Metazoa. J Exp Biol. 2015;218(4):515–525. pmid:25696815
- 101. Wan KY, Jékely G. Origins of eukaryotic excitability. Philos Trans R Soc Lond B Biol Sci. 2021;376(1820):20190758. pmid:33487111
- 102. O’Leary T. Homeostasis, failure of homeostasis and degenerate ion channel regulation. Curr Opin Physio. 2018;2:129–138.
- 103. Williams AH, O’Leary T, Marder E. Homeostatic Regulation of Neuronal Excitability. Scholarpedia. 2013;8(1):1656.
- 104. Schauf CL. Ion channel diversity: a revolution in biology? Science Progress. 1987;71(284):459–478. pmid:2449729
- 105. Anderson PAV, Greenberg RM. Phylogeny of ion channels: clues to structure and function. Comp Biochem Physiol B Biochem Mol Biol. 2001;129(1):17–28. pmid:11337248
- 106. Doyle DA, Cabral JM, Pfuetzner RA, Kuo A, Gulbis JM, Cohen SL, et al. The Structure of the Potassium Channel: Molecular Basis of K+ Conduction and Selectivity. Science. 1998;280(5360):69–77. pmid:9525859
- 107. Linder T, de Groot BL, Stary-Weinzinger A. Probing the Energy Landscape of Activation Gating of the Bacterial Potassium Channel KcsA. PLoS Comput Biol. 2013;9(5):1–9. pmid:23658510
- 108. Hirano M, Onishi Y, Yanagida T, Ide T. Role of the KcsA Channel Cytoplasmic Domain in pH-Dependent Gating. Biophys J. 2011;101(9):2157–2162. pmid:22067153
- 109. Baker KA, Tzitzilonis C, Kwiatkowski W, Choe S, Riek R. Conformational dynamics of the KcsA potassium channel governs gating properties. Nat Struct Mol Biol. 2007;14(11):1089–1095. pmid:17922011
- 110. Gao L, Mi X, Paajanen V, Wang K, Fan Z. Activation-coupled inactivation in the bacterial potassium channel KcsA. Proc Natl Acad Sci U S A. 2005;102(49):17630–17635. pmid:16301524
- 111. Booth IR. The Regulation of Intracellular pH in Bacteria. John Wiley & Sons, Ltd; 2007. p. 19–37. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/9780470515631.ch3.
- 112. Putnam RW. Chapter 17: Intracellular pH Regulation. In: Sperelakis N, editor. Cell Physiology Source Book. 4th ed. San Diego: Academic Press; 2012. p. 303–321.
- 113. Isfort RJ, Cody DB, Asquith TN, Ridder GM, Stuard SB, Leboeuf RA. Induction of protein phosphorylation, protein synthesis, immediate-early-gene expression and cellular proliferation by intracellular pH modulation. Eur J Biochem. 1993;213(1):349–357. pmid:8477706
- 114. Roux B, Allen T, Bernèche S, Im W. Theoretical and computational models of biological ion channels. Q Rev Biophys. 2004;37(1):15–103. pmid:17390604
- 115. Southern J, Pitt-Francis J, Whiteley J, Stokeley D, Kobashi H, Nobes R, et al. Multi-scale computational modelling in biology and physiology. Prog Biophys Mol Biol. 2008;96(1):60–89. pmid:17888502
- 116. Maffeo C, Bhattacharya S, Yoo J, Wells D, Aksimentiev A. Modeling and Simulation of Ion Channels. Chem Rev. 2012;112(12):6250–6284. pmid:23035940
- 117. Guardiani C, Cecconi F, Chiodo L, Cottone G, Malgaretti P, Maragliano L, et al. Computational methods and theory for ion channel research. Adv Phys. 2022;7(1):2080587. pmid:35874965
- 118. Clerx M, Beattie KA, Gavaghan DJ, Mirams GR. Four Ways to Fit an Ion Channel Model. Biophys J. 2019;117(12):2420–2437. pmid:31493859
- 119. Jeong DU, Lim KM. Artificial neural network model for predicting changes in ion channel conductance based on cardiac action potential shapes generated via simulation. Sci Rep. 2021;11(1):7831. pmid:33837240
- 120. Lei CL, Mirams GR. Neural network differential equations for ion channel modelling. Front Physiol. 2021;12:708944. pmid:34421652
- 121. Langthaler S, Lozanović Šajić J, Rienmüller T, Weinberg SH, Baumgartner C. Ion Channel Modeling beyond State of the Art: A Comparison with a System Theory-Based Model of the Shaker-Related Voltage-Gated Potassium Channel Kv1.1. Cells. 2022;11(2). pmid:35053355
- 122. O’Leary T, Wyllie DJA. Neuronal homeostasis: time for a change? J Physiol. 2011;589(20):4811–4826. pmid:21825033
- 123. Dehghani N, Caterina G. Physical computing: a category theoretic perspective on physical computation and system compositionality. J Phys Complex. 2024;5(3):035005.
- 124. Tsimring LS. Noise in biology. Rep Prog Phys. 2014;77(2):026601. pmid:24444693
- 125. Azpeitia E, Balanzario EP, Wagner A. Signaling pathways have an inherent need for noise to acquire information. BMC Bioinformatics. 2020;21(1):462. pmid:33066727
- 126. Newman AH. Chapter 29. Irreversible Ligands for Drug Receptor Characterization. In: Bristol JA, editor. Annual Reports in Medicinal Chemistry. vol. 25. Academic Press; 1990. p. 271–280.
- 127. Pierobon M, Akyildiz IF. Noise Analysis in Ligand-Binding Reception for Molecular Communication in Nanonetworks. IEEE Trans Signal Process. 2011;59(9):4168–4182.
- 128. Lestas I, Vinnicombe G, Paulsson J. Fundamental limits on the suppression of molecular fluctuations. Nature. 2010;467(7312):174–178. pmid:20829788
- 129. Teimouri H, Kolomeisky AB. Relaxation Times of Ligand-Receptor Complex Formation Control T Cell Activation. Biophys J. 2020;119(1):182–189. pmid:32562619
- 130. Duke TAJ, Bray D. Heightened sensitivity of a lattice of membrane receptors. Proc Natl Acad Sci U S A. 1999;96(18):10104–10108. pmid:10468569
- 131. Rappel WJ, Levine H. Receptor Noise and Directional Sensing in Eukaryotic Chemotaxis. Phys Rev Lett. 2008;100:228101. pmid:18643461
- 132. Rappel WJ, Levine H. Receptor noise limitations on chemotactic sensing. Proc Natl Acad Sci U S A. 2008;105(49):19270–19275. pmid:19064934
- 133. Ventura AC, Bush A, Vasen G, Goldín MA, Burkinshaw B, Bhattacharjee N, et al. Utilization of extracellular information before ligand-receptor binding reaches equilibrium expands and shifts the input dynamic range. Proc Natl Acad Sci U S A. 2014;111(37):E3860–E3869. pmid:25172920
- 134. Kajita MK, Aihara K, Kobayashi TJ. Reliable target ligand detection by noise-induced receptor cluster formation. Chaos: An Interdisciplinary Journal of Nonlinear Science. 2020;30(1):011104. pmid:32013460
- 135. Aquino G, Clausznitzer D, Tollis S, Endres RG. Optimal receptor-cluster size determined by intrinsic and extrinsic noise. Phys Rev E. 2011;83:021914. pmid:21405870
- 136. Caré BR, Soula HA. Receptor clustering affects signal transduction at the membrane level in the reaction-limited regime. Phys Rev E. 2013;87:012720. pmid:23410372
- 137. Bene L, Bagdány M, Damjanovich L. T-cell Receptor Is a Threshold Detector: Sub- and Supra-Threshold Stochastic Resonance in TCR-MHC Clusters on the Cell Surface. Entropy. 2022;24(3). pmid:35327900
- 138. Nagano S. Noise reduction and signal enhancement by receptor synchronization. Nonlinear Theory and Its Applications, IEICE. 2020;11(4):601–609.
- 139. Monod J. On Chance and Necessity. In: Ayala FJ, Dobzhansky T, editors. Studies in the Philosophy of Biology: Reduction and Related Problems. London: Macmillan Education UK; 1974. p. 357–375.
- 140. Noble D. The role of stochasticity in biological communication processes. Prog Biophys Mol Biol. 2021;162:122–128. pmid:33010285
- 141. Hille B. Ion channels of excitable membranes. 3rd ed. Sunderland, Mass: Sinauer; 2001.
- 142. Sakmann B, Neher E. Single-channel recording. Boston, MA: Springer US; 1995.
- 143. Horikawa Y. Noise effects on spike propagation in the stochastic Hodgkin-Huxley models. Biol Cybern. 1991;66(1):19–25. pmid:1768709
- 144. White JA, Rubinstein JT, Kay AR. Channel noise in neurons. Trends Neurosci. 2000;23(3):131–137. pmid:10675918
- 145. Schneidman E, Freedman B, Segev I. Ion Channel Stochasticity May Be Critical in Determining the Reliability and Precision of Spike Timing. Neural Comput. 1998;10(7):1679–1703. pmid:9744892
- 146. Faisal AA, White JA, Laughlin SB. Ion-Channel Noise Places Limits on the Miniaturization of the Brain's Wiring. Curr Biol. 2005;15(12):1143–1149. pmid:15964281
- 147. Wiesenfeld K, Moss F. Stochastic resonance and the benefits of noise: from ice ages to crayfish and SQUIDs. Nature. 1995;373(6509):33–36. pmid:7800036
- 148. Benzi R, Sutera A, Vulpiani A. The mechanism of stochastic resonance. J Phys A Math Gen. 1981;14(11):L453–L457.
- 149. Rouvas-Nicolis C, Nicolis G. Stochastic resonance. Scholarpedia. 2007;2(11):1474.
- 150. Gammaitoni L, Hänggi P, Jung P, Marchesoni F. Stochastic resonance. Rev Mod Phys. 1998;70:223–287.
- 151. Zeng FG, Fu QJ, Morse R. Human hearing enhanced by noise. Brain Res. 2000;869(1):251–255. pmid:10865084
- 152. Simonotto E, Riani M, Seife C, Roberts M, Twitty J, Moss F. Visual Perception of Stochastic Resonance. Phys Rev Lett. 1997;78:1186–1189.
- 153. Priplata A, Niemi J, Salen M, Harry J, Lipsitz LA, Collins JJ. Noise-Enhanced Human Balance Control. Phys Rev Lett. 2002;89:238101. pmid:12485044
- 154. Collins JJ, Imhoff TT, Grigg P. Noise-enhanced tactile sensation. Nature. 1996;383(6603):770–770. pmid:8893000
- 155. Collins JJ, Chow CC, Imhoff TT. Stochastic resonance without tuning. Nature. 1995;376(6537):236–238. pmid:7617033
- 156. Douglass JK, Wilkens L, Pantazelou E, Moss F. Noise enhancement of information transfer in crayfish mechanoreceptors by stochastic resonance. Nature. 1993;365(6444):337–340. pmid:8377824
- 157. Bulsara A, Jacobs EW, Zhou T, Moss F, Kiss L. Stochastic resonance in a single neuron model: Theory and analog simulation. J Theor Biol. 1991;152(4):531–555. pmid:1758197
- 158. Wiesenfeld K, Pierson D, Pantazelou E, Dames C, Moss F. Stochastic resonance on a circle. Phys Rev Lett. 1994;72:2125–2129. pmid:10055796
- 159. Bezrukov SM, Vodyanoy I. Noise-induced enhancement of signal transduction across voltage-dependent ion channels. Nature. 1995;378(6555):362–364. pmid:7477370
- 160. Bezrukov SM, Vodyanoy I. Stochastic resonance in thermally activated reactions: Application to biological ion channels. Chaos: An Interdisciplinary Journal of Nonlinear Science. 1998;8(3):557–566. pmid:12779759
- 161. Bezrukov SM, Vodyanoy I. Stochastic resonance in non-dynamical systems without response thresholds. Nature. 1997;385(6614):319–321. pmid:9002515
- 162. Vincent UE, McClintock PVE, Khovanov IA, Rajasekar S. Vibrational and stochastic resonances in driven nonlinear systems. Philos Trans A Math Phys Eng Sci. 2021;379(2192):20200226. pmid:33455554
- 163. Sorokin V, Demidov I. On representing noise by deterministic excitations for interpreting the stochastic resonance phenomenon. Philos Trans A Math Phys Eng Sci. 2021;379(2192):20200229. pmid:33455556
- 164. Lucarini V. Stochastic resonance for nonequilibrium systems. Phys Rev E. 2019;100:062124. pmid:31962491
- 165. Schmid G, Goychuk I, Hänggi P. Stochastic resonance as a collective property of ion channel assemblies. Europhysics Letters (EPL). 2001;56(1):22–28.
- 166. Parc YW, Koh DS, Sung W. Stochastic resonance in an ion channel following the non-Arrhenius gating rate. Eur Phys J B. 2009;69(1):127–131.
- 167. Goychuk I, Hänggi P. Stochastic resonance in ion channels characterized by information theory. Phys Rev E. 2000;61:4272–4280. pmid:11088223
- 168. Adair RK. Noise and stochastic resonance in voltage-gated ion channels. Proc Natl Acad Sci U S A. 2003;100(21):12099–12104. pmid:14506291
- 169. Toral R, Mirasso CR, Gunton JD. System size coherence resonance in coupled FitzHugh-Nagumo models. Europhys Lett (EPL). 2003;61(2):162–167.
- 170. Pikovsky A, Zaikin A, de la Casa MA. System Size Resonance in Coupled Noisy Systems and in the Ising Model. Phys Rev Lett. 2002;88:050601. pmid:11863709
- 171. Schmid G, Goychuk I, Hanggi P, Zeng S, Jung P. Stochastic resonance and optimal clustering for assemblies of ion channels. Fluct Noise Lett. 2004;04(01):L33–L42.
- 172. Wannamaker RA, Lipshitz SP, Vanderkooy J. Stochastic resonance as dithering. Phys Rev E. 2000;61:233–236. pmid:11046260
- 173. Pohlmann KC. Principles of digital audio. Audio library. H.W. Sams; 1989. Available from: https://books.google.com/books?id=SRRMAAAAYAAJ.
- 174. Roberts L. Picture coding using pseudo-random noise. IEEE Trans Inform Theory. 1962;8(2):145–154.
- 175. Gammaitoni L. Stochastic resonance and the dithering effect in threshold physical systems. Phys Rev E. 1995;52:4691–4698. pmid:9963964
- 176. Gammaitoni L. Stochastic resonance in multi-threshold systems. Phys Lett A. 1995;208(4):315–322.
- 177. Hoekstra A, Chopard B, Coveney P. Multiscale modelling and simulation: a position paper. Philos Trans A Math Phys Eng Sci. 2014;372(2021):20130377. pmid:24982256
- 178. Laughlin RB, Pines D, Schmalian J, Stojković BP, Wolynes P. The middle way. Proc Natl Acad Sci U S A. 2000;97(1):32–37. pmid:10618366
- 179. Rice C. Beyond reduction and emergence: a framework for tailoring multiscale modeling techniques to specific contexts. Biol Philos. 2024;39(4):12.
- 180. Weinan E, Engquist B, Huang Z. Heterogeneous multiscale method: A general methodology for multiscale modeling. Phys Rev B. 2003;67:092101.
- 181. Noble R, Noble D. Harnessing stochasticity: How do organisms make choices? Chaos: An Interdisciplinary Journal of Nonlinear Science. 2018;28(10):106309. pmid:30384641
- 182. Gunawardena J. Models in biology: ‘accurate descriptions of our pathetic thinking’. BMC Biol. 2014;12(1):29. pmid:24886484
- 183. Kitano H. Systems Biology: A Brief Overview. Science. 2002;295(5560):1662–1664. pmid:11872829
- 184. Bialek WS. Biophysics: Searching for Principles. Princeton, NJ: Princeton University Press; 2012. Available from: https://books.google.com/books?id=5In-FKA2rmUC.
- 185. Navlakha S, Bar-Joseph Z. Algorithms in nature: the convergence of systems biology and computational thinking. Mol Syst Biol. 2011;7(1):546. pmid:22068329
- 186. Krakauer DC, Collins JP, Erwin D, Flack JC, Fontana W, Laubichler MD, et al. The challenges and scope of theoretical biology. J Theor Biol. 2011;276(1):269–276. pmid:21315730
- 187. Mayr E. What Makes Biology Unique? Considerations on the Autonomy of a Scientific Discipline. Cambridge: Cambridge University Press; 2004. Available from: https://www.cambridge.org/core/product/5B4A83616082B6A6AB7DC13C33750AA1.
- 188. Rackauckas C, Ma Y, Martensen J, Warner C, Zubov K, Supekar R, et al. Universal Differential Equations for Scientific Machine Learning. arXiv preprint; 2020.
- 189. Karniadakis GE, Kevrekidis IG, Lu L, Perdikaris P, Wang S, Yang L. Physics-informed machine learning. Nat Rev Phys. 2021;3(6):422–440.
- 190. Lu PY, Ariño Bernad J, Soljačić M. Discovering sparse interpretable dynamics from partial observations. Commun Phys. 2022;5(1):206.
- 191. Reinbold PAK, Kageorge LM, Schatz MF, Grigoriev RO. Robust learning from noisy, incomplete, high-dimensional experimental data via physically constrained symbolic regression. Nat Commun. 2021;12(1):3219. pmid:34050155
- 192. Groen D, Knap J, Neumann P, Suleimenova D, Veen L, Leiter K. Mastering the scales: a survey on the benefits of multiscale computing software. Philos Trans A Math Phys Eng Sci. 2019;377(2142):20180147. pmid:30967042
- 193. Alber M, Buganza Tepole A, Cannon WR, De S, Dura-Bernal S, Garikipati K, et al. Integrating machine learning and multiscale modeling—perspectives, challenges, and opportunities in the biological, biomedical, and behavioral sciences. NPJ Digit Med. 2019;2(1):115. pmid:31799423
- 194. Vlachas PR, Arampatzis G, Uhler C, Koumoutsakos P. Multiscale simulations of complex systems by learning their effective dynamics. Nat Mach Intell. 2022;4(4):359–366. pmid:34890204