
The authors have declared that no competing interests exist.

Conceived and designed the experiments: DJK. Analyzed the data: DJK TMR DTY AB. Wrote the paper: DJK TMR DTY AB. Wrote programs for analysis: TMR DTY AB.

The brain is one of the most studied and most complex systems in the biological world. While much research has concentrated on studying the brain directly, our focus is the structure of the brain itself: at its core, an interconnected network of nodes (neurons). A better understanding of the structural connectivity of the brain should elucidate some of its functional properties. In this paper we analyze the connectome of the nematode C. elegans.

Fractal theory has become an increasingly prevalent topic of both debate and research in recent years, beginning with Mandelbrot's discussion of Britain's immeasurable coastline.

More recently, fractal theory has found applications in the biological realm. Kinetics of ion channels have been modeled with fractal structures.

In this paper we use a graph-theoretical approach to probe the structure of the C. elegans connectome.

The C. elegans connectome is a complete map of the synaptic connections among the nematode's neurons.

In order to index each of these connections we use the graph Laplacian matrix, L = D - A, where A is the adjacency matrix of the network and D is the diagonal matrix of vertex degrees.
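As a concrete sketch (in Python rather than the authors' MATLAB), the Laplacian L = D - A can be assembled directly from an adjacency matrix; the 4-vertex example graph below is hypothetical.

```python
import numpy as np

def graph_laplacian(A):
    """Return L = D - A, where D is the diagonal matrix of vertex degrees."""
    A = np.asarray(A, dtype=float)
    D = np.diag(A.sum(axis=1))   # degree of each vertex on the diagonal
    return D - A

# Hypothetical example: the path graph 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
L = graph_laplacian(A)   # symmetric; every row sums to zero
```

The zero row sums reflect the constant function being in the kernel of L, which is why the lowest Laplacian eigenvalue of a connected graph is always 0.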

For the C. elegans connectome we computed the full set of Laplacian eigenvalues.

The eigenvalue counting function, N(x), gives the number of eigenvalues less than or equal to x; it is a cumulative distribution function on the spectrum of a matrix, in this case the Laplacian.
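The counting function can be computed directly from a sorted list of eigenvalues; this Python sketch and its example spectrum are illustrative only.

```python
import numpy as np

def counting_function(eigenvalues, x):
    """N(x): the number of eigenvalues less than or equal to x."""
    ev = np.sort(np.asarray(eigenvalues))
    # searchsorted with side="right" counts entries <= x in the sorted array
    return int(np.searchsorted(ev, x, side="right"))

# Hypothetical spectrum with a repeated eigenvalue
spectrum = [0.0, 0.5, 0.5, 2.0, 3.0]
```

Repeated eigenvalues produce jumps of height greater than one, and spectral gaps produce the flat, slope-zero stretches discussed below.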


The plots corresponding to known fractals clearly contain step-like portions. These sections of zero slope correspond to spectral gaps, consistent with expected results. The eigenvalue counting function plot of the C. elegans connectome, by contrast, shows no such prominent gaps.

The Weyl ratio of a graph is defined as W(x) = N(x)/x^a, where N(x) is the eigenvalue counting function and the exponent a is chosen so that W remains bounded between positive constants; for self-similar fractals, W(x) is asymptotically periodic in log x.
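A minimal sketch of the Weyl ratio computation follows; the choice of exponent is an assumption on our part (for the standard Sierpinski gasket the spectral exponent is log 3 / log 5), and the example values are illustrative.

```python
import numpy as np

def weyl_ratio(eigenvalues, xs, alpha):
    """W(x) = N(x) / x**alpha for each x in xs (all x > 0)."""
    ev = np.sort(np.asarray(eigenvalues))
    N = np.searchsorted(ev, xs, side="right")       # counting function N(x)
    return N / np.asarray(xs, dtype=float) ** alpha

# For the standard Sierpinski gasket the appropriate exponent is log 3 / log 5
alpha_sg = np.log(3) / np.log(5)
```

Plotting W against log x then makes any multiplicative periodicity in the spectrum visible as ordinary periodicity.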

As expected, the Weyl ratios of known self-similar fractals show a high degree of organization; that of the Sierpinski gasket in particular is visibly periodic.


While several cases of slight periodicity could be argued for, this evidence is not definitive enough to indicate self-similarity in the C. elegans connectome.

We replicated the network visualization performed in prior work by Chklovskii and collaborators.

After embedding each vertex in either 2- or 3-dimensional space, neuronal or network connections were represented by line segments between the appropriate points. In the case of the C. elegans connectome, these segments correspond to synaptic connections between neurons.
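One standard way to produce such an embedding (not necessarily the authors' exact construction) uses eigenvectors of the symmetrically normalized Laplacian, skipping the trivial constant mode; the 4-cycle example is hypothetical.

```python
import numpy as np

def eigen_projection(A, dim=2):
    """Coordinates for each vertex from eigenvectors of the symmetrically
    normalized Laplacian, skipping the trivial lowest mode."""
    A = np.asarray(A, dtype=float)
    d = A.sum(axis=1)                     # vertex degrees (assumed > 0)
    s = 1.0 / np.sqrt(d)
    L_norm = np.eye(len(A)) - s[:, None] * A * s[None, :]
    vals, vecs = np.linalg.eigh(L_norm)   # eigenvalues in ascending order
    return vecs[:, 1:dim + 1]             # columns 1..dim as coordinates

# Hypothetical example: a 4-cycle
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
coords = eigen_projection(A)
```

Each vertex is placed at the point given by its components in the chosen eigenvectors; edges are then drawn as segments between these points.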

The eigen-projection visualizations were produced for each of the graphs considered.


The eigen-projections display some of the functional organization of the C. elegans nervous system.

We consider two functions defined on graphs: average clustering coefficient and average path length. The clustering coefficient of a vertex is the fraction of pairs of its neighbors that are themselves joined by an edge.

Small-world networks arise quite often in the natural sciences, as they allow for the efficient transfer of information while maintaining a certain level of complexity. There is a great deal of research suggesting that neural networks possess small-world properties.

Graph | Clustering Coefficient | Average Path Length |
Sierpinski Gasket, Level 5 | 0.4495 | 17.3721 |
Random(Sierpinski Gasket) | 0.0104 | 5.748 |
Sierpinski Gasket Rewire | 0.2843 | 7.3833 |
Random(SG Rewire) | 0.0104 | 5.748 |
C. elegans | 0.3371 | 2.5377 |
Random(C. elegans) | 0.0581 | 2.3458 |

This motivated our work with network rewiring, related to that done by Watts and Strogatz. In their model, a fraction of the edges of a regular lattice are rewired to random endpoints, interpolating between an ordered network and a random one.
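A minimal sketch of Watts-Strogatz-style rewiring, assuming unweighted, undirected edges; the function and parameter names are ours, and the 6-cycle example is hypothetical.

```python
import random

def rewire(edges, n, p, seed=0):
    """Watts-Strogatz-style rewiring: with probability p, move one
    endpoint of an edge to a uniformly random vertex, skipping moves
    that would create self-loops or duplicate edges."""
    rng = random.Random(seed)
    present = {frozenset(e) for e in edges}
    out = []
    for u, v in edges:
        if rng.random() < p:
            w = rng.randrange(n)
            if w != u and frozenset((u, w)) not in present:
                present.discard(frozenset((u, v)))
                present.add(frozenset((u, w)))
                out.append((u, w))
                continue
        out.append((u, v))       # edge kept unchanged
    return out

# Hypothetical example: a 6-cycle
cycle = [(i, (i + 1) % 6) for i in range(6)]
```

Small rewiring probabilities already introduce long-range shortcuts, which is what drives the sharp drop in average path length while clustering decays slowly.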

Using an eigenfunction of a graph's Laplacian, one can measure how localized that eigenfunction is by its spatial variance, computed with respect to the resistance metric on the graph.


Of the four graphs considered here, the eigenfunctions of the random network possess the largest spatial variances. As this system was designed to lack general organization, non-localized eigenfunctions were both expected and observed. The eigenfunctions of the Sierpinski Gasket possess both highly concentrated and low-valued spatial variances. Such trends correspond to a high degree of localization, as anticipated in approximations of fractal geometries. Eigenfunctions of the rewired Sierpinski Gasket demonstrate a similar concentration pattern with slightly higher spatial variance, indicating a slightly lower degree of eigenfunction localization.

Of particular interest are the spatial variances of eigenfunctions of the C. elegans connectome.


In this paper we used a variety of graph-theoretic and mathematical techniques to probe the structural framework of the C. elegans connectome.

In order to analyze only the framework of the C. elegans connectome, we treated all connections as undirected, unweighted edges.

In order to generate a Laplacian matrix representation of the random graphs, we used the following algorithm:

First, fix the number of vertices, n.

For each pair of distinct vertices, decide with probability p whether to connect them by an edge.

To produce an adjacency matrix of this graph, set A(i, j) = A(j, i) = 1 whenever vertices i and j are connected, and 0 otherwise.
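The construction above might be realized as follows; the connection probability p and the loop structure are our reading of the algorithm, and the Laplacian entries are filled in directly rather than via a separate adjacency matrix.

```python
import random

def random_graph_laplacian(n, p, seed=0):
    """Laplacian of an Erdos-Renyi-style random graph: each pair of
    distinct vertices is joined independently with probability p."""
    rng = random.Random(seed)
    L = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                L[i][j] = L[j][i] = -1   # off-diagonal entries of L = D - A
                L[i][i] += 1             # diagonal entries accumulate degrees
                L[j][j] += 1
    return L
```

With p = 1 this reduces to the complete graph, a convenient sanity check on the construction.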

The algorithm used for producing the Laplacian matrix of a random-branching tree is more involved. Again, fix the number of vertices, n.

Begin by generating a random integer giving the number of children of the root vertex.

Next move to all subsequent vertices. Because no "looping" exists in the structure of the tree, each node can only be connected to its parent vertex and its "children" vertices. We take the number of children of each vertex to be a random integer.

Now, as above, for each remaining vertex, generate a random number of children, attaching new vertices until all n have been placed in the tree.
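One concrete realization of the branching-tree construction; the upper bound on children per vertex is a hypothetical choice, not stated in the text.

```python
import random

def random_tree_laplacian(n, max_children=3, seed=0):
    """Laplacian of a random branching tree on n vertices: process
    vertices in order, giving each a random number of children until
    all n vertices have been attached."""
    rng = random.Random(seed)
    L = [[0] * n for _ in range(n)]
    next_child = 1                 # index of the next unattached vertex
    parent = 0
    while next_child < n and parent < n:
        for _ in range(rng.randint(1, max_children)):
            if next_child >= n:
                break
            c = next_child
            L[parent][c] = L[c][parent] = -1   # edge parent-child
            L[parent][parent] += 1
            L[c][c] += 1
            next_child += 1
        parent += 1
    return L
```

Because each vertex receives at least one child while vertices remain, every vertex is attached before it becomes a parent, so the result is always a connected tree with n - 1 edges.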

For a given graph Laplacian matrix, L, we write the eigenvalues in increasing order, λ_1 ≤ λ_2 ≤ ... ≤ λ_n.

We used two different forms of the graph-Laplacian matrix: the standard Laplacian and the degree-normalized Laplacian. In the case of eigen-projections, we utilize the degree-normalized matrix. We define the degree matrix, D, as the diagonal matrix whose i-th diagonal entry is the degree of vertex i; the degree-normalized Laplacian is then D^{-1}L.

We found all eigenvalues, λ, and their associated eigenvectors for each matrix numerically.

The clustering coefficient measures the probability that two neighbors of a given vertex are also connected to one another. For a graph G, the average clustering coefficient is the mean of the vertex clustering coefficients over all vertices of G.

For a graph G, the average path length is the mean, over all pairs of vertices, of the length of the shortest path joining them.
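Both measures can be computed directly from an adjacency list; this pure-Python sketch mirrors the definitions above, and the triangle-with-pendant example graph is hypothetical.

```python
from collections import deque
from itertools import combinations

def avg_clustering(adj):
    """Mean over all vertices of the fraction of neighbor pairs
    that are themselves joined by an edge."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue                        # coefficient taken to be 0
        linked = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        total += linked / (k * (k - 1) / 2)
    return total / len(adj)

def avg_path_length(adj):
    """Mean shortest-path length over all connected vertex pairs (BFS)."""
    total = pairs = 0
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

# Hypothetical example: a triangle with one pendant vertex
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
```

A small-world graph shows up in these two numbers together: clustering well above that of a random graph, path length nearly as short.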

In order to analyze our networks for small-world properties, it was useful to compare these graphs to similar networks with randomly assigned edges. Small-world networks are nearly as well-connected as random graphs, but possess a well-localized structure. We developed the following algorithm for this process:

For a graph G with n vertices and m edges, we construct a random comparison graph on the same numbers of vertices and edges.

First number each vertex in G from 1 to n; then repeatedly select two distinct vertices uniformly at random and join them by an edge, discarding duplicates, until m edges have been placed.
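A sketch of this randomization, assuming the goal is simply to match the vertex and edge counts of the original graph; the function name is ours.

```python
import random

def randomized_counterpart(n, m, seed=0):
    """Random graph matching vertex count n and edge count m:
    edges placed uniformly at random, with no self-loops or
    duplicates.  (Assumes m <= n*(n-1)/2.)"""
    rng = random.Random(seed)
    edges = set()
    while len(edges) < m:
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            edges.add(frozenset((u, v)))   # unordered pair, deduplicated
    return edges
```

Matching both counts keeps the comparison fair: differences in clustering and path length then reflect structure, not density.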

For a graph G, rewiring proceeds by selecting edges at random and reassigning one endpoint of each to a uniformly chosen vertex, as in the Watts-Strogatz procedure.

In order to discuss spatial variance, we must first define the resistance between two vertices on a graph. Let G be a graph, and let u and v be two distinct vertices of G.

Let h be the function on the vertices of G satisfying:

1. h(u) = 1 and h(v) = 0;

2. h is harmonic at every other vertex; that is, its value there equals the average of its values over the neighboring vertices.

The effective resistance between u and v is then the reciprocal of the energy of h.

Finding the harmonic function is equivalent to finding a vector h with (Lh)(w) = 0 at every vertex w other than u and v, that is, to solving a linear system involving the graph Laplacian.

Using these we can now define the spatial variance of an eigenfunction. Again, let u be an eigenfunction of the Laplacian; its spatial variance measures how widely u is spread over the graph, with distances measured in the resistance metric.
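Effective resistance can equivalently be computed from the Moore-Penrose pseudoinverse of the Laplacian, via the standard identity R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v); this sketch uses that formulation, and the triangle example is hypothetical.

```python
import numpy as np

def effective_resistance(L, u, v):
    """R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), with L^+ the
    Moore-Penrose pseudoinverse of the graph Laplacian."""
    Lp = np.linalg.pinv(np.asarray(L, dtype=float))
    e = np.zeros(len(L))
    e[u], e[v] = 1.0, -1.0
    return float(e @ Lp @ e)

# Hypothetical example: a triangle, each edge of unit resistance
L_tri = [[2, -1, -1],
         [-1, 2, -1],
         [-1, -1, 2]]
```

For the triangle, one direct edge in parallel with a two-edge path gives R = (1 * 2)/(1 + 2) = 2/3, matching the circuit-theory interpretation.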

We thank Dr. Alexander Teplyaev, Department of Mathematics, University of Connecticut, for his guidance in organizing and overseeing our project. We thank Dr. Dmitri Chklovskii, Howard Hughes Medical Institute, Janelia Farm Research Campus, for allowing us to extend his work and use his adjacency matrices. We thank Matthew Begué, Department of Mathematics, University of Maryland, for his help with our MATLAB code.