Abstract
Building upon Li & Britz (Li NZ, Britz T. 2024 On the scale-freeness of random colored substitution networks. Proc. Amer. Math. Soc. 152, 1377–1389 (doi: 10.1090/proc/16604)), this study introduces fractal geometry into graph theory and aims to establish a potential theoretical foundation for complex networks. Specifically, we employ the method of substitution to create and explore, for the first time, fractal-like graphs termed iterated graph systems (IGS). While the concept of substitution is commonplace in fractal geometry and dynamical systems, its analysis in the context of graph theory remains a nascent field. By studying the properties of these systems, including distance and diameter, we derive two primary results. First, for deterministic IGS, we establish that the Minkowski dimension and the Hausdorff dimension coincide and are given by explicit formulae. Second, for random IGS, we show that almost every Gromov–Hausdorff scaling limit has identical Minkowski and Hausdorff dimensions, expressed analytically in terms of Lyapunov exponents. The exploration of IGS has the potential to open novel directions: mathematically, these findings contribute to our understanding of the interplay between fractals and graphs; physically, they suggest promising avenues for applications to complex networks.
1. Introduction and main results
(a) Introduction
Graph theory originated in the eighteenth century and developed rapidly throughout the twentieth century, while fractal geometry was proposed and explored during the twentieth century. Both fields have since undergone rigorous mathematical development. The integration of these two fields is generally associated with the study of complex networks [1]: this is primarily due to the work of Watts on small-world networks [2], Barabási’s discovery of scale-free networks [3] and Song’s exploration of fractal networks [4]. These remarkable works demonstrate certain distinctive patterns of real-world complex networks.
However, the underlying mathematical principles of complex networks remain unclear. Few attempts have been made to develop the theory: most relevant studies are based on real-world data and finite networks rather than on mathematical objects. This gap remains to be filled, and it is of great interest to explore the behaviour of infinite graph fractals.
To establish fractals for graphs, we use the ideas of substitution and iterated function systems (IFS), both of which are standard tools in dynamical systems and fractal geometry. It is also natural to relate substitutions to dynamics, since tilings and substitutive dynamical systems are a primary source of substitutions [5–8]. Since Hutchinson [9] rigorously established IFS in a general framework, many notable works have contributed to the study of self-similar sets, for instance graph-directed systems [10,11], IFS with overlaps [12–14] and random IFS [15,16]; see also [17–21].
Building on these foundations, we rigorously define the iterated graph system (IGS) by iteratively (and possibly randomly) substituting the (coloured) arcs of a graph with certain fixed graphs. The concept of the IGS is not entirely new, however; it evolves from its predecessor, the substitution network, first introduced by Xi et al. [22], where fractality and scale-freeness were obtained. Li et al. [23,24] brought coloured graphs into this model, and Ye et al. [25,26] studied the average distance, establishing its existence under certain conditions. IGS can also be viewed both as a graph analogue of graph-directed fractals [10,27] and as a generalization of them, since the substitution process inherent to IGS includes a graph-directed methodology. We will discuss this issue in another work.
From a mathematical perspective, studying IGS is therefore intriguing in its own right: we anticipate that fractals on graphs will exhibit behaviours not seen in Euclidean spaces. This is indeed the case, as evidenced by the specially defined degree dimension [1] for graph fractals. Physically, IGS provide a solid mathematical foundation for the study of complex networks, especially those exhibiting scale-freeness and fractality. Potential applications include fractal dimensions, degree distributions, average distance, random walks, diffusion processes and percolation, among others; see also a recent interesting work by Eriksson-Bique et al. [28]. For instance, a quick corollary of this paper is that, for any network characterized by a specific fractal dimension, one can design an iterated graph system with the same dimension.
(b) Main results
A deterministic (or random) IGS roughly refers to a system that generates infinite graphs iteratively (and randomly) by independent coloured arc substitutions. See figure 1 for an example. Rigorous definitions will be given in §2b and §4a.
Figure 1. An example of a deterministic iterated graph system.
This paper contains two main conclusions, the first of which solves the problem left open in Li et al. [24]:
Theorem 1.1. Given any irreducible deterministic IGS and its Gromov–Hausdorff scaling limit
Here,
Another result is to construct random IGS and prove that the associated graph limits
Theorem 1.2. Given any irreducible random IGS, for
Here,
The concept of IGS is straightforward and simple. However, the difficulty and novelty of this paper lie in the solution of a combinatorial matrix problem (§3a), the estimation of diameters and distances (§3b) and the analysis of random substitution (§4).
2. Deterministic iterated graph systems
(a) Minkowski dimension for graphs
In this subsection, we will define the limit of a graph sequence and the Minkowski dimension, also known as the box-counting dimension, for graphs to facilitate the study of fractal-like graphs. With these foundational concepts established, we can proceed to analyse IGS.
Definition 2.1. Let
In this paper, the distance between vertices
where
Denote a simple path, omitting the direction of arcs, from vertex
Define
Let the diameter of
Definition 2.2. We now define the scaling graph
Define the metric of
Definition 2.3. Let
be the Gromov–Hausdorff distance in Gromov [30].
Let
It is evident that the Gromov–Hausdorff scaling limit
Example 2.1. An example of scaling graphs is given in figure 2, which shows the first four scaling graphs
Figure 2. An example of scaling graphs: Xi fractal in Xi et al. [22].
Also refer to figure 3 for an illustration of
Figure 3. Xi fractal as a complex network.
Definition 2.4. Let
Define the smallest number of
In this paper, we stipulate that
Definition 2.5. Let
if such a limit exists. We say that a graph
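As an informal illustration of this definition, the following sketch estimates a box-counting exponent for a finite graph by greedily covering its vertex set with metric balls of radius r and regressing log N(r) against log(1/r). The greedy covering, the choice of radii and the test graph are placeholder assumptions for illustration only, not part of the definition above.

```python
import math
import networkx as nx
import numpy as np

def ball_covering_number(G, r):
    """Greedily cover the vertex set of G by closed balls of radius r
    (shortest-path metric) and return the number of balls used."""
    lengths = dict(nx.all_pairs_shortest_path_length(G, cutoff=r))
    uncovered = set(G.nodes)
    count = 0
    while uncovered:
        centre = next(iter(uncovered))
        uncovered -= {v for v, d in lengths[centre].items() if d <= r}
        count += 1
    return count

def box_dimension_estimate(G, radii):
    """Least-squares slope of log N(r) against log(1/r)."""
    logs_inv_r = [math.log(1.0 / r) for r in radii]
    logs_N = [math.log(ball_covering_number(G, r)) for r in radii]
    slope, _ = np.polyfit(logs_inv_r, logs_N, 1)
    return slope

# usage on a placeholder graph (a path graph has dimension roughly 1)
G = nx.path_graph(200)
print(box_dimension_estimate(G, radii=[2, 4, 8, 16]))
```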
(b) Deterministic iterated graph systems
Definition 2.6. See figure 4 for an example.
Figure 4. An example of deterministic IGS.
Let the number of colours be
The colour function is described by
That is, for an arc linking from vertex
We now define a discrete time
satisfying
- for any .
We call such
Intuitively, to construct a sequence of growing graphs:
- start from the initial graph , which is usually just a single arc (for instance a blue arc in figure 4);
- we construct  by replacing all -coloured arcs in  by . To ensure the substitution is unique, we fix  of  at the starting vertex of the arc and fix  of  at the ending vertex of the arc;
- we then iteratively replace all -coloured arcs for all  in  to obtain . In this way, denote the graph after  iterations by ; accordingly, we have a sequence of graphs called substitution networks (a hypothetical code sketch of this construction follows this list). We will prove in §3d that for any iterated graph system , the Gromov–Hausdorff scaling limit always exists.
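The sketch below shows one possible way to implement the arc substitution just described. The rule encoding (each colour mapped to a list of arcs over local labels with distinguished 'in' and 'out' boundary vertices) and the concrete rules are hypothetical placeholders; they are not the system of figure 4.

```python
import itertools

# Each substitution rule: (arcs, start_label, end_label), where an arc is a
# triple (local_tail, local_head, colour). These concrete rules are placeholders.
RULES = {
    "blue": ([("in", "m", "blue"), ("m", "out", "red")], "in", "out"),
    "red":  ([("in", "m", "red"), ("m", "out", "blue"), ("in", "out", "blue")],
             "in", "out"),
}

_fresh = itertools.count()

def substitute(arcs):
    """Replace every coloured arc by a copy of its rule graph, gluing the
    rule's start/end vertices to the arc's tail/head and giving the rule's
    interior vertices fresh names."""
    new_arcs = []
    for tail, head, colour in arcs:
        rule_arcs, start, end = RULES[colour]
        local = {start: tail, end: head}
        for a, b, c in rule_arcs:
            for x in (a, b):
                local.setdefault(x, f"v{next(_fresh)}")
            new_arcs.append((local[a], local[b], c))
    return new_arcs

# G_0 is a single blue arc; iterate the substitution four times.
G = [("s", "t", "blue")]
for _ in range(4):
    G = substitute(G)
print(len(G), "arcs after four iterations")
```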
(c) Relevant definitions
Before strictly presenting the main results, several definitions are to be introduced in advance.
Notation 2.1. For any
Definition 2.7. Define the
Definition 2.8. Recall that a path in a graph is simple if it has no repeating vertices. In this paper, we assume that all paths are simple.
Denote a path between vertices
Consider the Cartesian product:
Each element (or say choice)
Define, for each
Let
where
Also, define:
If
Theorem 2.1. (main theorem). Given any deterministic irreducible IGS
Remark 2.1. For any network with a given algebraic number fractal dimension, we can always directly construct a deterministic iterated graph system that possesses the same box dimension.
In the future, we may define an approximate dimension for finite graphs to address the convergence rate of dimensions.
Example 2.2. Recall the iterated graph system of figure 4. For definition 2.7, we have
with
For definition 2.8, there are
As a result,
The smallest spectral radius of these matrices is
As a conclusion, by theorem 2.1, we obtain that for the demonstrated example
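Since the example above involves the smallest spectral radius over a family of non-negative matrices, a short numerical sketch of that computation may be helpful; the matrices below are arbitrary placeholders rather than the matrices arising from figure 4.

```python
import numpy as np

def spectral_radius(M):
    """Perron root of a non-negative matrix: largest eigenvalue modulus."""
    return max(abs(np.linalg.eigvals(np.asarray(M, dtype=float))))

# placeholder family of non-negative matrices (one per admissible choice)
family = [
    [[2, 1], [1, 1]],
    [[1, 2], [2, 0]],
    [[3, 0], [1, 1]],
]

rho_min = min(spectral_radius(M) for M in family)
print("smallest spectral radius:", rho_min)
```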
Modelling the network, we obtain the simulated values given in figure 6. It shows a simulation of the Minkowski dimension of
See figure 5 for a fractal and scale-free complex network. It illustrates the model from figure 4 in the form of a complex network, created using Python’s NetworkX. According to previous conclusions, this complex network has an approximate Minkowski dimension of
Figure 5. Iterated graph system in figure 4 as a complex network.
Figure 6. Test of fractality when
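One plausible way to produce such simulated values is to fit the slope of log |V(G_n)| against log diam(G_n) over several iterations; the snippet below sketches that fitting step on placeholder graphs (square grids, whose dimension is 2) and is not the exact procedure behind figure 6.

```python
import numpy as np
import networkx as nx

def minkowski_fit(graphs):
    """Least-squares slope of log|V(G_n)| against log diam(G_n) over a
    sequence of connected graphs."""
    logs_diam = [np.log(nx.diameter(G)) for G in graphs]
    logs_size = [np.log(G.number_of_nodes()) for G in graphs]
    slope, _ = np.polyfit(logs_diam, logs_size, 1)
    return slope

# usage on placeholder graphs: n x n grid graphs, whose dimension is 2
grids = [nx.grid_2d_graph(n, n) for n in (4, 8, 16, 32)]
print("fitted dimension:", minkowski_fit(grids))   # approximately 2
```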
We will later prove theorem 2.1 through the following two theorems.
Theorem 2.2. We will prove in §3c that:
Theorem 2.3. We will prove in §3e that:
(d) Growth rates of arcs and vertices
This section introduces several necessary tools and properties for the proof of fractality in §3. We omit the proofs of those results that may be found in [1,24].
Notation 2.2.
Lemma 2.1. Let
where
Lemma 2.2. Let
Definition 2.9. Let
where
This means
Lemma 2.3. For each
Proof. By induction.∎
Lemma 2.4.
Proof. By
Lemma 2.5.
Proof. Define:
Then,
Given the initial graph
Since the row vector only contains constants,
∎
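As an illustration of how the arc counts evolve, assume (as a placeholder reading of the substitution matrix) that entry M[i][j] counts the j-coloured arcs in the rule that replaces an i-coloured arc; the colour-count row vector of G_n is then v_0 M^n, which the snippet below computes with numpy.

```python
import numpy as np

# Placeholder substitution matrix: entry M[i, j] = number of j-coloured arcs
# appearing in the substitution rule for colour i (colours indexed 0..k-1).
M = np.array([[1, 2],
              [2, 1]])

v0 = np.array([1, 0])        # G_0: one arc of colour 0

def arc_counts(n):
    """Colour-count row vector of G_n under repeated substitution."""
    return v0 @ np.linalg.matrix_power(M, n)

for n in range(5):
    print(n, arc_counts(n))   # total arc count grows like rho(M)^n
```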
Definition 2.10. The Gromov–Hausdorff scaling limit
Theorem 2.4.
Proof.
∎
3. Proof of theorem 2.1
This section is devoted to proving theorem 2.1. Before doing so, however, a few short but technical results are required, especially theorems 3.2 and 3.3.
(a) A combinatorial matrix problem
Lemma 3.1. Suppose that, for any non-negative matrix
Proof. See lemma 8.1 in Li & Britz [1].∎
Recall the defined notation from definition 2.8.
Theorem 3.1. Suppose that all matrices in
Proof. Assume that
whereas
By lemma 3.1,
contradicting the definition of
Next is our second auxiliary result.
Theorem 3.2. For any number of matrices
Proof. For each
Then,
so
By Collatz–Wielandt formula,
∎
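For the reader's convenience, recall the Collatz–Wielandt bounds used above: for a non-negative matrix A and any positive vector x, min_i (Ax)_i/x_i ≤ ρ(A) ≤ max_i (Ax)_i/x_i, with both bounds attained at the Perron eigenvector. The snippet below is only a numerical sanity check of this standard fact on a random positive matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((4, 4)) + 0.5          # a strictly positive matrix
rho = max(abs(np.linalg.eigvals(A)))  # Perron root

# Collatz-Wielandt bounds: for any positive vector x,
#   min_i (Ax)_i / x_i  <=  rho(A)  <=  max_i (Ax)_i / x_i.
x = rng.random(4) + 0.1
ratios = (A @ x) / x
print(ratios.min() - 1e-9 <= rho <= ratios.max() + 1e-9)   # True

# The bounds are attained at the Perron eigenvector.
vals, vecs = np.linalg.eig(A)
perron = np.abs(vecs[:, np.argmax(np.abs(vals))].real)
print(np.allclose((A @ perron) / perron, rho))             # True
```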
Remark 3.1. If
with
However,
is the family satisfying theorem 3.2.
The inherent property of this matrix family fundamentally ensures the existence of the Gromov–Hausdorff scaling limit
(b) Distance growth between vertices
In Li et al. [24], the authors analysed the distances between vertices in deterministic IGS and proved that these distances grow with bounded growth rates. However, no feasible formula was given for calculating these rates; they are indeed difficult to determine.
To illustrate why these distances are difficult to estimate, consider figure 7 in which a deterministic iterated graph system is shown, with just two substitution rules: one for blue arcs and one for red arcs. In the shown graph sequence
Figure 7. An example of deterministic IGS.
In general, the shortest paths between vertices
Definition 3.1. In this paper, we without loss of generality let
Lemma 3.2. For each
Proof. For
Now, we focus on calculating the lengths of uniform paths recursively. Recall from definition 2.8 that
Note that if
define
Note that
Lemma 3.3.
Proof. Each given tuple
If we fix
As at least one shortest path
∎
Theorem 3.3.
Proof. By lemma 3.3, lemma 2.1 and theorem 3.2,
∎
(c) Proof of theorem 2.2
Definition 3.2. Let
Lemma 3.4.
Proof. Choose
Notation 3.1. Let
Theorem 3.4.
Proof. Let
Note that if
Observe that,
Let
Note that here,
By lemma 2.1,
∎
We now prove theorem 2.2.
Proof. By lemma 3.4,
Set
Recall theorem 2.4, providing that for
Then,
∎
(d) Existence of the Gromov–Hausdorff scaling limit for iterated graph systems
This section aims to prove the existence of the Gromov–Hausdorff scaling limit for IGS.
Theorem 3.5. There exists
Proof. Let
This is because
where
We now prove that
Therefore, we use a trick and consider the metric space
Given that
Finally,
Writing
A dynamical system on metric space
Corollary 3.1.
A topological space
Corollary 3.2.
(e) Proof of theorem 2.3
This subsection aims to demonstrate a result for the Hausdorff dimension of
There is a substantial amount of literature on the equivalence of box dimension and Hausdorff dimension, particularly in the context of invariant measures in various iterated systems. For instance, see [9,18–20,28].
Federer and Rogers [32,33] thoroughly investigated Hausdorff measure and dimension on metric spaces. We now introduce a definition of Hausdorff dimension for graphs.
Definition 3.3. Define Vertex Hausdorff measure for graph
Then, set
Define Edge Hausdorff measure for graphs that,
Then, set
Let the Hausdorff dimension for graphs be:
Remark 3.2. There is no inequality between
An example is exactly IGS. Choosing any
Theorem 3.6.
Proof. By theorem 2.2,
By remark 3.2, we have
It is trivial to check that
Hence, we only need to calculate the lower bound of the Hausdorff dimension.
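A standard route to such a lower bound is the mass distribution principle, restated here for convenience (our phrasing, following Falconer's textbook on fractal geometry); the estimates below bound the measure of balls in this spirit.

```latex
\textbf{Mass distribution principle.} Let $\mu$ be a Borel measure on a metric
space $F$ with $0 < \mu(F) < \infty$. If there exist constants $c > 0$ and
$s \ge 0$ such that
\[
  \mu\bigl(B(x,r)\bigr) \le c \, r^{s}
  \qquad \text{for every } x \in F \text{ and all sufficiently small } r > 0,
\]
then $\mathcal{H}^{s}(F) > 0$, and hence $\dim_{\mathrm H} F \ge s$.
```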
Fix
Accordingly, as theorem 3.3 points out the growth of
Picking any node
Then, it is clear that
As a result,
and hence
by lemma 3.3 and theorem 3.4.
Finally, because
Hence,
which completes the proof of theorem 2.1.∎
Definition 3.4. A metric space
Theorem 3.7.
Proof. Recall
Let
By theorem 3.4, we have:
Based on
because these two graphs are exactly the same for each
Also by theorem 3.3,
In light of equations (3.1)–(3.3), we obtain:
∎
Corollary 3.3. By [31], we have
4. Random iterated graph systems and proof of theorem 4.1
(a) Random iterated graph systems
Definition 4.1. See figure 8 for an example.
Figure 8. An example of random IGS.
Let
Let
where
We now define a discrete time stochastic process of graphs
with probability
Let
We call the stochastic process of graphs
By the Kolmogorov extension theorem, there exists a unique probability measure
See figure 8 for an example. For each
Intuitively, to construct a sequence of growing graphs (as sketched below):
— start from the initial graph ;
— we construct ;
— we then iteratively replace all .
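A minimal sketch of this random construction is given below: each coloured arc is independently replaced by a rule drawn according to a probability vector over candidate rule graphs. The rule sets, probabilities and encoding are hypothetical placeholders, not those of figure 8.

```python
import itertools
import random

# For each colour, a list of candidate rules and a probability vector over them.
# Each rule is (arcs, start_label, end_label); all values here are placeholders.
RANDOM_RULES = {
    "blue": {
        "rules": [([("in", "m", "blue"), ("m", "out", "red")], "in", "out"),
                  ([("in", "out", "red")], "in", "out")],
        "probs": [0.7, 0.3],
    },
    "red": {
        "rules": [([("in", "m", "red"), ("m", "out", "blue")], "in", "out")],
        "probs": [1.0],
    },
}

_fresh = itertools.count()

def random_substitute(arcs, rng=random):
    """Independently replace every coloured arc by a randomly chosen rule."""
    new_arcs = []
    for tail, head, colour in arcs:
        spec = RANDOM_RULES[colour]
        rule_arcs, start, end = rng.choices(spec["rules"], spec["probs"])[0]
        local = {start: tail, end: head}
        for a, b, c in rule_arcs:
            for x in (a, b):
                local.setdefault(x, f"v{next(_fresh)}")
            new_arcs.append((local[a], local[b], c))
    return new_arcs

# G_0 is a single blue arc; iterate the random substitution four times.
G = [("s", "t", "blue")]
for _ in range(4):
    G = random_substitute(G)
print(len(G), "arcs after four random iterations")
```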
(b) Relevant definitions
Definition 4.2. A finite set containing
Definition 4.3. Let
be with
Definition 4.4. Let
By the definition,
In this way,
Let
In other words,
If all matrices in
Notation 4.1. Let
Theorem 4.1. (main theorem). Given any random irreducible iterated graph system, for
Remark 4.1. For any network with a given algebraic number fractal dimension, we can always directly construct a random iterated graph system that possesses the same dimension.
Example 4.1.
respectively, with the probability vector
Hence, in this example,
all with the probability vector
By theorem 4.1, we have the theoretical dimension:
Similarly modelling the limit graph, we obtain the simulated values given in figure 9. It shows ten simulations of the Minkowski dimension of
Figure 9. Ten simulations of fractality when
(c) Lyapunov exponents and stochastic substitution systems
This subsection is devoted to studying the stochastic behaviour of random IGS. We formalize the process of random substitution on graph fractals as so-called stochastic substitution systems, and we prove some properties of the associated Lyapunov exponents. We therefore start by introducing Lyapunov exponents.
Lyapunov exponents for products of random matrices were first studied by Bellman [35]. Furstenberg and Kesten [36,37] then developed several famous results on the asymptotic behaviour of products of random matrices. Recently, Pollicott [38,39] presented several impressive works on effective numerical estimates of Lyapunov exponents.
We will continue to use the notations in Pollicott [38] for Lyapunov exponents. Let
Furstenberg & Kesten [36] established that if
for any
In this paper, we will discuss a slightly different type of matrix product, but we will prove that the Lyapunov exponent of such a system coincides with that of the associated products of random matrices.
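By the Furstenberg–Kesten theorem recalled above, the top Lyapunov exponent of an i.i.d. product of random matrices can be approximated by n^{-1} log ||A_n ⋯ A_1 v|| for large n. The Monte Carlo sketch below illustrates this with a placeholder ensemble of two non-negative matrices; it is a naive estimate, not the effective numerical scheme of Pollicott [38].

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder i.i.d. ensemble: two non-negative matrices with probabilities p.
mats = [np.array([[2.0, 1.0], [1.0, 1.0]]),
        np.array([[1.0, 2.0], [2.0, 1.0]])]
p = [0.5, 0.5]

def lyapunov_estimate(n=10_000):
    """Estimate the top Lyapunov exponent as (1/n) * log ||A_n ... A_1 v||,
    renormalizing the running vector to avoid overflow."""
    v = np.ones(2)
    log_norm = 0.0
    for _ in range(n):
        A = mats[rng.choice(len(mats), p=p)]
        v = A @ v
        norm = np.linalg.norm(v)
        log_norm += np.log(norm)
        v /= norm
    return log_norm / n

print("estimated Lyapunov exponent:", lyapunov_estimate())
```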
Definition 4.5. Let
Define a random operator
Here, the random operator
Let
We call such the triple
Definition 4.6. Let
where
Theorem 4.2. Let
Proof. First,
Because
Finally, by the above,
∎
Theorem 4.3. For
Proof. By formula (4.2) and theorem 4.2.∎
(d) Growths of arcs and vertices
Lemma 4.1.
Proof.
Lemma 4.2.
Proof. This proof follows that of lemma 2.5, with lemma 4.1 providing that almost surely
Theorem 4.4.
Proof. For all possible
∎
Remark 4.2. Theorem 4.4 holds even if all graphs in
(e) Proof of theorem 4.1
For notational convenience, write
Lemma 4.3.
Proof. For a certain
By lemma 3.2, there exists one
Hence, the paths between
∎
Lemma 4.4.
Proof. The proof is analogous to that of theorem 3.4, using lemma 4.3.∎
We are now able to prove theorem 4.1.
Proof. We start by verifying the Minkowski dimension. By lemma 4.4, the growth of diameter of
For the Hausdorff dimension, we again adopt the definition in §3e. With the same techniques for estimating the mass distribution, the result reduces to
Consequently, for
which completes the final proof.∎
Data accessibility
This article has no additional data.
Declaration of AI use
I have not used AI-assisted technologies in creating this article.
Authors’ contributions
Z.N.: conceptualization, data curation, formal analysis, funding acquisition, investigation, methodology, project administration, resources, software, validation, visualization, writing—original draft, writing—review and editing.
Conflict of interest declaration
I declare I have no competing interests.
Funding
This work was supported by the Additional Funding Programme for Mathematical Sciences, delivered by EPSRC (EP/V521917/1) and the Heilbronn Institute for Mathematical Research. This work was also supported by the EPSRC Centre for Doctoral Training in Mathematics of Random Systems: Analysis, Modelling and Simulation (EP/S023925/1).
Acknowledgements
The author would like to thank his mom Birong Lin for her persistent support. The author would like to thank Thomas Britz and the anonymous reviewers for their precious comments. The author would like to thank Mark Pollicott for his profound suggestions.