
Fig 1.

Compared with a database management system, the brain's memory system lacks an equally clear and complete chain of signal processing.


Fig 2.

A sample can be stored as a subgraph in a directed graph.

(a) Random activation of 30 nodes in the network. (b) The activated nodes propagate the stimulus through the network to form a subgraph.
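The storage step in Fig 2 can be sketched in code. This is a minimal illustration under assumed details (the graph construction, the number of outgoing edges, and the one-edge-per-node propagation rule are all simplifications, not the paper's implementation): randomly activated nodes each claim one outgoing edge, and the claimed edges form the sample's subgraph.

```python
# Illustrative sketch only: store a sample as a subgraph by randomly
# activating k nodes and letting each propagate along one outgoing edge.
import random

def store_sample(adjacency, k=30, rng=random):
    active = rng.sample(list(adjacency), k)      # (a) random activation of k nodes
    subgraph = []
    for v in active:                             # (b) each active node propagates
        if adjacency[v]:                         #     the stimulus along one edge
            subgraph.append((v, rng.choice(adjacency[v])))
    return subgraph

# Toy directed graph: 100 nodes, each with 3 outgoing edges.
adjacency = {i: [(i + j) % 100 for j in (1, 2, 3)] for i in range(100)}
edges = store_sample(adjacency)
print(len(edges))  # → 30 (the stored subgraph's directed edges)
```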


Fig 3.

Recording activation traces within a single node.

(a) After storing a sample, an activation trace is added to node vb's index table. (b) When node vb receives the same or a similar input again, it reuses the previously recorded activation trace.
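The index-table behavior in Fig 3 can be sketched as follows. The class and field names are illustrative assumptions, and the sketch matches only exact repeated inputs (the paper's "similar input" case would need a similarity measure, which is omitted here).

```python
# Hypothetical sketch of a node's index table: record an activation
# trace on first storage, reuse it when the same input arrives again.
class Node:
    def __init__(self, name):
        self.name = name
        self.index_table = {}  # input pattern -> recorded activation trace

    def receive(self, pattern, downstream):
        key = frozenset(pattern)
        if key in self.index_table:
            # (b) same input again: reuse the recorded activation trace
            return self.index_table[key]
        # (a) new input: add a fresh activation trace to the index table
        trace = {"input": key, "next": downstream}
        self.index_table[key] = trace
        return trace

vb = Node("vb")
first = vb.receive({"va"}, "vc")   # stores a trace
again = vb.receive({"va"}, None)   # reuses the stored trace
print(again is first)  # → True
```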


Fig 4.

Node resource-grabbing rules.

(a) Node va propagates the stimulus to node vb. (b) After node vb is activated, it finds that other nodes have already occupied node vc. (c) Node vb returns to the resting state because it cannot continue to propagate the stimulus. (d) Node va returns to the resting state because it has not activated any downstream node.
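The four steps of Fig 4 can be condensed into a small sketch. The data structures here (an occupancy set and a state dictionary) are assumptions for illustration, not the paper's implementation; the point is only the rule itself: a node stays active only if it can occupy a free downstream node, and otherwise both it and its upstream activator rest.

```python
# Illustrative resource-grabbing sketch: src activates dst, and dst
# tries to occupy one of its unoccupied downstream nodes.
def propagate(edges, occupied, src, dst):
    # (a) src propagates the stimulus to dst; both start active
    states = {src: "active", dst: "active"}
    # (b) dst looks for a downstream node not yet occupied by others
    free = [v for v in edges.get(dst, []) if v not in occupied]
    if free:
        occupied.add(free[0])
    else:
        # (c) every downstream node is taken: dst returns to resting
        states[dst] = "resting"
        # (d) src activated no downstream node, so it rests as well
        states[src] = "resting"
    return states

edges = {"va": ["vb"], "vb": ["vc"]}
occupied = {"vc"}  # vc is already occupied by other nodes
print(propagate(edges, occupied, "va", "vb"))
# → {'va': 'resting', 'vb': 'resting'}
```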


Fig 5.

Node state transition graph.


Fig 6.

Example of a dynamic process for directed graph stimulus propagation.

(a) t0: The initial nodes {va,vb,vc,vd} are activated and propagate the stimulus according to a weighted random selection algorithm. (b) t1: {ve,vf,vg,vi,vj} become active after receiving the stimulus from the initial nodes. (c) t2: {vh,vk} become active after receiving the stimulus from active nodes. (d) t3: The downstream nodes {vb,vc} of vh are all occupied, so the stimulus cannot be propagated, and vh returns to the resting state. (e) t4: After vh returns to the resting state, vj also returns to the resting state through the avalanche effect. (f) t5: The subgraph iterates to a steady state and resource release begins; node va releases its occupancy of ve. (g) t6: After the resources are released, the dormant node vd restarts pathfinding and successfully activates vj. (h) t7: Node vh is activated after receiving the stimulus from vj and passes the stimulus on to vb. The subgraph has iterated to a stable state.
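The weighted random selection step used at t0 can be sketched as below. The edge weights and helper names are illustrative assumptions; the sketch also shows the t3 failure case, where no unoccupied downstream node remains and the selection returns nothing, so the node rests.

```python
# Minimal sketch of weighted random selection over unoccupied
# downstream nodes, with probability proportional to edge weight.
import random

def weighted_select(candidates, occupied, rng=random):
    """candidates: {node: edge_weight}; returns a chosen free node or None."""
    free = {v: w for v, w in candidates.items() if v not in occupied}
    if not free:
        return None  # like vh at t3: no free downstream node, the node rests
    nodes, weights = zip(*free.items())
    return rng.choices(nodes, weights=weights, k=1)[0]

downstream = {"ve": 0.5, "vf": 0.3, "vg": 0.2}
print(weighted_select(downstream, occupied={"vf"}))  # "ve" or "vg", never "vf"
```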


Table 1.

Retrieval performance of networks of different scales after storing 1000 samples.


Fig 7.

The relationship between the number of edges and both network capacity and subgraph structure.


Fig 8.

Storage examples under different levels of network connectivity.

(a) The sample-generated subgraph in an ER random graph with p = 0.07. (b) The sample-generated subgraph in an ER random graph with p = 0.04.


Fig 9.

Comparison of capacity between sparse and dense graphs.

(a) Results in a sparse graph. (b) Results in a dense graph.


Fig 10.

The impact of incomplete sample input on sample retrieval.

(a) Test results in a sparse graph. (b) Test results in a dense graph.


Fig 11.

The impact of noisy sample input on sample retrieval.

(a) Test results in a sparse graph. (b) Test results in a dense graph.


Fig 12.

The impact of sample input that is both incomplete and contains noise nodes on sample retrieval.

(a) Test results in a sparse graph. (b) Test results in a dense graph.


Table 2.

Restoration schemes for the index table after damage.


Fig 13.

The impact of partial node damage on sample retrieval.

(a) The change in accuracy in a sparse graph. (b) The change in completeness in a sparse graph. (c) The change in accuracy in a dense graph. (d) The change in completeness in a dense graph.


Fig 14.

The impact of partial directed edge damage on sample retrieval.

(a) The change in accuracy in a sparse graph. (b) The change in completeness in a sparse graph. (c) The change in accuracy in a dense graph. (d) The change in completeness in a dense graph.


Fig 15.

Six classic network structures.

(a) ER random graph with p = 0.1. (b) Globally coupled network. (c) Nearest-neighbor coupled network with L = 1. (d) Star coupled network. (e) Kleinberg network. (f) Price network.


Table 3.

Comparison of classic network models.


Fig 16.

Network storage examples. (a) Kleinberg network storage example. (b) Nearest-neighbor coupled network storage example.


Fig 17.

Comparison of the capacities of three models (sample node scale of 50).


Fig 18.

Comparison of the capacities of three models (sample node scale of 200).
