Integrated Information Increases with Fitness in the Evolution of Animats

Figure 3

Hidden Markov Gate representation.

A: An HMG with three binary input and two binary output Markov variables, where one of the outputs is fed back into the HMG (a hidden variable). The entries of the state transition table are determined by genetic evolution (see Methods and Text S2). In the gate shown, bit 3 is a hidden state and can be used to implement a one-bit memory. In principle, the probabilities in the HMG transition table could also be tuned via reinforcement learning using a signal from the environment (“World Feedback”); however, this capacity is not used in the present work. B: The “dual” representation of this gate, in which the Markov variables are nodes and the gate connects them via edges. This network is obtained by drawing a directed edge between any pair of bits that are causally linked through the logic gate. Because bit 3 feeds back into itself, for example, it keeps the same identifier and there is a directed arrow from bit 3 to itself as well as from bit 3 to bit 4. See Text S2 and Fig. S1 for details on the genetic encoding and network visualization of HMGs.

doi: https://doi.org/10.1371/journal.pcbi.1002236.g003
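To make the transition-table description in the caption concrete, the following minimal Python sketch implements a 3-input, 2-output probabilistic gate of the kind shown in panel A, with bit 3 fed back as a one-bit memory, and derives the dual-network edges of panel B directly from the gate's input and output indices. The class name, method names, and the uniform placeholder table are illustrative assumptions, not taken from the paper; in the actual model the table entries are determined by genetic evolution (see Methods and Text S2).

import random

# Sketch of a Hidden Markov Gate (HMG): 3 binary inputs (bits 1-3),
# 2 binary outputs (bits 3-4), with bit 3 fed back as a hidden one-bit memory.
# The table values here are placeholders; in the paper they are evolved.

class HiddenMarkovGate:
    def __init__(self, inputs, outputs, table):
        self.inputs = inputs      # indices of input Markov variables, e.g. [1, 2, 3]
        self.outputs = outputs    # indices of output Markov variables, e.g. [3, 4]
        # table[s] is a probability distribution over the 2**len(outputs)
        # possible output states, indexed by the integer encoding s of the inputs
        self.table = table

    def fire(self, state):
        # Encode the current input bits as an integer row index.
        s = sum(state[i] << k for k, i in enumerate(self.inputs))
        probs = self.table[s]
        # Sample one output pattern from the selected row.
        r, out_code = random.random(), 0
        for out_code, p in enumerate(probs):
            r -= p
            if r <= 0:
                break
        # Write the sampled bits back into the state vector.
        for k, i in enumerate(self.outputs):
            state[i] = (out_code >> k) & 1

    def dual_edges(self):
        # Dual representation: a directed edge from every input to every output.
        return [(i, o) for i in self.inputs for o in self.outputs]

# Example: uniform placeholder table for a 3-input, 2-output gate,
# with bit 3 appearing in both inputs and outputs (the hidden/memory bit).
table = [[0.25, 0.25, 0.25, 0.25] for _ in range(2 ** 3)]
gate = HiddenMarkovGate(inputs=[1, 2, 3], outputs=[3, 4], table=table)

state = {1: 1, 2: 0, 3: 1, 4: 0}
gate.fire(state)
print(state)               # e.g. {1: 1, 2: 0, 3: 0, 4: 1}
print(gate.dual_edges())   # [(1, 3), (1, 4), (2, 3), (2, 4), (3, 3), (3, 4)]

Running the sketch updates bits 3 and 4 in place by sampling from the table row selected by the current input state, and the printed edge list (including 3 to 3 and 3 to 4) corresponds to the directed arrows of the dual network in panel B.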