
Conceived and designed the experiments: MZ SB. Performed the experiments: MZ. Analyzed the data: MZ SB. Contributed reagents/materials/analysis tools: FDP. Wrote the paper: MZ SB FDP.

The authors have declared that no competing interests exist.

The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges scientists face today is, then, relating that evidence to the fundamental mechanisms through which the brain computes and processes information, as well as to the onset (or progression) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies with a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states to synchronous neurons' dynamics, we show how the usual Boolean logic can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides the static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from past attempts to encode information and compute with complex systems, where computation was instead the consequence of control loops enforcing a desired state onto the specific system's dynamics. Being the result of an emergent process, the computation mechanism described here is not limited to a binary Boolean logic, but can involve a much larger number of states. As such, our results can shed light on new concepts for the understanding of the real computing processes taking place in the brain.

Synchronization is one of the most important features observed in neural systems

Here we show that computation can emerge from the synchronization of groups of adaptively coupled neurons. Such collective dynamics can encode information within different synchronization states and efficiently perform any Boolean operation, thus enabling the construction of a universal Turing machine

The basic computational unit (see

A) Schematic representation of the basic computation unit. A neuron, subject to a common Gaussian noise, receives two control signals from ports

The K channel is modeled by the following set of equations:
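The equation set referenced here is not reproduced in this excerpt. For orientation only, the sketch below integrates a textbook Hodgkin–Huxley potassium channel, dn/dt = α_n(V)(1 − n) − β_n(V)n with I_K = g_K n⁴ (V − E_K); the rate functions and constants are the classic 1952 values, which are an assumption here and not necessarily those of the manuscript.

```python
import numpy as np

# Classic Hodgkin-Huxley rate functions for the K gating variable n
# (assumption: standard textbook values, not the manuscript's constants).
def alpha_n(V):
    return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))

def beta_n(V):
    return 0.125 * np.exp(-(V + 65.0) / 80.0)

def I_K(V, n, g_K=36.0, E_K=-77.0):
    # Potassium current: I_K = g_K * n^4 * (V - E_K)
    return g_K * n**4 * (V - E_K)

# Euler integration of the gating variable at a clamped voltage
V, n, dt = -40.0, 0.3, 0.01
for _ in range(10000):
    n += dt * (alpha_n(V) * (1.0 - n) - beta_n(V) * n)

# At steady state, n relaxes to alpha / (alpha + beta)
n_inf = alpha_n(V) / (alpha_n(V) + beta_n(V))
print(abs(n - n_inf) < 1e-6)  # prints True: n has reached its fixed point
```

The gating dynamics are linear in n at fixed V, so the Euler map shares the exact fixed point n_inf with the continuous equation, which the final check confirms.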

Constants are common to all neurons, and their values are

We further assume that at least one reference spike pattern exists (henceforth called

In this equation,

Using the previously described computational unit, the simplest Boolean operation that can be constructed is the unary NOT gate: a logical gate which returns zero when the input is one, and one otherwise. Such a Boolean gate corresponds to a configuration where the input signal to be processed is fed inside port
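As a toy illustration of this encoding (deliberately much simpler than the manuscript's neuron model), one can take a single phase oscillator whose coupling to a reference oscillator is gated by the logical input: with input 0 the unit locks to the reference (state 1), with input 1 the coupling is removed and independent noise desynchronizes it (state 0). All parameters and the independent-noise simplification are assumptions of this sketch.

```python
import numpy as np

def run_unit(input_state, K=10.0, noise=1.0, dt=0.01, steps=40000, seed=0):
    """Return 1 if the unit ends up synchronized with the reference.

    Toy model: a phase oscillator driven by independent Gaussian noise,
    coupled to a reference phase only when the logical input is 0 (NOT).
    """
    rng = np.random.default_rng(seed)
    omega = 1.0                       # common natural frequency
    theta_ref = 0.0
    theta = rng.uniform(0.0, 2.0 * np.pi)
    z, kept = 0.0 + 0.0j, 0
    for step in range(steps):
        theta_ref += omega * dt
        # NOT logic: coupling to the reference is enabled only for input 0
        coupling = K if input_state == 0 else 0.0
        theta += dt * (omega + coupling * np.sin(theta_ref - theta)) \
                 + np.sqrt(dt) * noise * rng.standard_normal()
        if step >= steps // 2:        # discard the locking transient
            z += np.exp(1j * (theta - theta_ref))
            kept += 1
    r = abs(z) / kept                 # time-averaged order parameter
    return int(r > 0.5)               # 1 = synchronized, 0 = not

print([run_unit(0), run_unit(1)])
```

With the coupling on, the phase error fluctuates narrowly around zero and the order parameter stays close to one; with the coupling off, the phase diffuses freely and the time average collapses, so the unit's output is the logical negation of its input.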

A) The NOT gate is constructed by feeding a reference signal

By further embedding the basic computational unit inside a network

A) Circuit configurations corresponding to the Boolean gates NAND, XOR, and a reference translator (

It is important to remark that the NAND gate is known to be a universal Boolean gate
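Functional completeness of NAND is easy to verify directly: the standard gate constructions below, checked against the full truth tables, recover NOT, AND, OR and XOR from NAND alone.

```python
# NAND is a universal Boolean gate: every other gate can be composed
# from it.  Standard constructions, verified exhaustively below.

def NAND(a, b):
    return 1 - (a & b)

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def OR(a, b):
    return NAND(NOT(a), NOT(b))

def XOR(a, b):
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
        assert XOR(a, b) == (a ^ b)
assert NOT(0) == 1 and NOT(1) == 0
print("NAND constructions verified")
```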

This feature, that is, the possibility of improving the efficiency of the computation by encoding information with different reference signals, is used in

A) Network motif corresponding to a full Boolean adder. Same stipulations as for
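At the gate level, a full adder combines two XOR, two AND and one OR gate, and chaining n of them yields an n-bit ripple-carry adder. The sketch below is illustrative of that logic only, not of the network-motif implementation described in the manuscript.

```python
def full_adder(a, b, c_in):
    """One-bit full adder: returns (sum, carry_out)."""
    s1 = a ^ b
    return s1 ^ c_in, (a & b) | (s1 & c_in)

def ripple_add(x, y, bits=8):
    """Add two integers by chaining full adders (result modulo 2**bits)."""
    carry, out = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out

print(ripple_add(100, 55))  # prints 155
```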

Besides the standard, static binary gates, a wider class of dynamical logical operations of biological relevance can also be efficiently constructed. In

A) Network motif corresponding to a Set Reset Flip Flop. Same stipulations as for
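The set-reset behaviour named in the figure can be reproduced at the gate level by the classic pair of cross-coupled NAND gates (active-low inputs). The sketch below iterates the pair to its fixed point and checks set, hold and reset; it illustrates the logic only, not the synchronization-based motif itself.

```python
def nand(a, b):
    return 1 - (a & b)

def sr_latch(s_bar, r_bar, q=0, q_bar=1):
    """Cross-coupled NAND latch with active-low set/reset inputs.

    Iterates the two gates until the outputs settle (a few rounds
    suffice for any valid input combination).
    """
    for _ in range(4):
        q, q_bar = nand(s_bar, q_bar), nand(r_bar, q)
    return q, q_bar

q, qb = sr_latch(0, 1)            # set:   q -> 1
q, qb = sr_latch(1, 1, q, qb)     # hold:  q stays 1 (memory)
q, qb = sr_latch(1, 0, q, qb)     # reset: q -> 0
print(q, qb)                      # prints 0 1
```

The hold case (both inputs inactive) is what makes the latch a one-bit memory: the output depends on the history of the inputs, not only on their current values, which is exactly the dynamical behaviour the motif is meant to capture.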

Finally,

In conclusion, we have introduced a basic computation unit, made of a neuron that interacts with other neurons following the topology of a dynamical weighted network. Some neurons are forced to synchronize with a reference signal, with a coupling strength governed by a simple adaptive mechanism, while others are left free to follow their internal dynamics, guided by a common external Gaussian noise. The synchronization or desynchronization of neurons with the reference signal is then used to encode binary information, and we show how the topology of the network is essential for the execution of specific Boolean operations, by means of which it is possible to construct a universal Turing machine.

This approach is of interest for several reasons. First of all, there is a growing body of experimental evidence supporting the hypothesis that synchronization is the form used by the brain to represent and process information

The authors acknowledge I. Sendiña Nadal for many discussions on the model equations and parameters used in this manuscript.