arXiv:2105.05968v1 [cs.FL] 12 May 2021
A new version of Toom's proof

Peter Gács*, Boston University

Abstract

There are several proofs now for the stability of Toom's example of a two-dimensional stable cellular automaton and its application to fault-tolerant computation. Simon and Berman simplified and strengthened Toom's original proof: the present report is a simplified exposition of their proof.
1 Introduction
Let us define cellular automata.
Definition 1.1 For a finite m, let Z_m be the set of integers modulo m; we will also write Z_∞ = Z for the set of integers. A set C will be called a one-dimensional set of sites, or cells, if it has the form C = Z_m for a finite or infinite m. For finite m and x ∈ C, the values x + 1, x − 1 are always understood modulo m. Similarly, C will be called a two- or three-dimensional set of sites if it has the form C = Z_{m_1} × Z_{m_2} or C = Z_{m_1} × Z_{m_2} × Z_{m_3} for finite or infinite m_i.

For a given set C of sites and a finite set S of states, we call every function ξ : C → S a configuration. Configuration ξ assigns state ξ(x) to site x. For some time interval I ⊆ [0, ∞), a function η : C × I → S will be called a space-time configuration. It assigns value η(x, t) to cell x at time t.

In a space-time vector (x, t), we will always write the space coordinate first. ⌟
*Partially supported by NSF grant CCR-9204284.
Definition 1.2 Let us be given a function Trans : S³ → S and a one-dimensional set of sites C. We say that a space-time configuration η in one dimension is a trajectory of the one-dimensional (deterministic) cellular automaton CA(Trans) if

  η(x, t) = Trans(η(x − B, t − T), η(x, t − T), η(x + B, t − T))

holds for all x, t. Deterministic cellular automata in several dimensions are defined similarly. ⌟
Since we want to analyze the effect of noise, we will be interested in random space-time configurations.
Definition 1.3 For a given set C of sites and time interval I, consider a probability distribution P over all space-time configurations η : C × I → S. Once such a distribution is given, we will talk about a random space-time configuration (having this distribution). We will say that the distribution P defines a trajectory of the ε-perturbation CA_ε(Trans) if the following holds. For all x ∈ C, t ∈ I, r_{−1}, r_0, r_1 ∈ S, let E_0 be the event that η(x + j, t − 1) = r_j (j = −1, 0, 1) and that η(x′, t′) is otherwise fixed in some arbitrary way for all t′ < t and for all x′ ≠ x, t′ = t. Then we have

  P{ η(x, t) ≠ Trans(r_{−1}, r_0, r_1) | E_0 } ≤ ε. ⌟
A simple stable two-dimensional deterministic cellular automaton given by Toom in [3] can be defined as follows.
Definition 1.4 (Toom rule) First we define the neighborhood

  H = {(0, 0), (0, 1), (1, 0)}.

The transition function is, for each cell x, a majority vote over the three values at the cells x + g_i, where g_i ∈ H. ⌟
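For concreteness, here is a minimal sketch of one synchronous step of this rule on the torus Z_m × Z_m (the function name and the grid representation are my own, and binary states are assumed for simplicity):

```python
def toom_step(grid):
    """One synchronous step of the Toom (north-east-center) majority rule
    on the torus Z_m x Z_m; grid[x][y] is in {0, 1}, wrapping modulo m."""
    m = len(grid)
    return [[1 if grid[x][y] + grid[(x + 1) % m][y] + grid[x][(y + 1) % m] >= 2
             else 0
             for y in range(m)]
            for x in range(m)]

# A single erroneous cell in a homogeneous configuration is erased in one step:
grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1
assert all(v == 0 for row in toom_step(grid) for v in row)
```

This already illustrates the stability mechanism: an isolated deviation is outvoted by its northern and eastern neighbors immediately.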
As in [2], let us be given an arbitrary one-dimensional transition function Trans and the integers m, n.
Definition 1.5 We define the three-dimensional transition function Trans′ as follows. The interaction neighborhood is H × {−1, 0, 1}, with the neighborhood H defined above. The rule Trans′ says: in order to obtain your state at time t + 1, first apply majority voting among self and the northern and eastern neighbors in each plane defined by fixing the third coordinate. Then apply the rule Trans on each line obtained by fixing the first and second coordinates.

For a finite or infinite m, let C be our 3-dimensional space that is the product of Z_m² and a 1-dimensional (finite or infinite) space A with n = |A|. For a trajectory ζ of Trans on A, we define the trajectory ζ′ of Trans′ on C by ζ′(i, j, k, t) = ζ(k, t). ⌟
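One step of the composed rule Trans′ can be sketched as follows (again my own illustration with invented names; binary states are assumed, and the one-dimensional rule `trans` is taken to read its two neighbors at distance one along the third coordinate):

```python
def maj(a, b, c):
    # majority of three 0/1 states
    return 1 if a + b + c >= 2 else 0

def trans_prime_step(state, trans):
    """One step of Trans': Toom voting in each (i, j)-plane of the torus
    Z_m x Z_m, then the one-dimensional rule `trans` along the k-axis (Z_n)."""
    m, n = len(state), len(state[0][0])
    voted = [[[maj(state[i][j][k],
                   state[(i + 1) % m][j][k],
                   state[i][(j + 1) % m][k])
               for k in range(n)] for j in range(m)] for i in range(m)]
    return [[[trans(voted[i][j][(k - 1) % n],
                    voted[i][j][k],
                    voted[i][j][(k + 1) % n])
              for k in range(n)] for j in range(m)] for i in range(m)]
```

With the trivial rule `trans = lambda a, b, c: b`, a single deviation in an otherwise homogeneous space is wiped out by the per-plane voting alone.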
Let ζ′ be a trajectory of Trans′ and η a trajectory of CA_ε(Trans′) such that η(w, 0) = ζ′(w, 0).

Theorem 1 Let d = 24, and suppose ε < 1/(32 d⁸). If m = ∞ then we have

  P{ η(w, t) ≠ ζ′(w, t) } ≤ 4dε.

If m is finite then we have

  P{ η(w, t) ≠ ζ′(w, t) } ≤ 4dε + 2d · t n m² (2d² ε^{1/12})^m.
The proof we give here is a further simplification of the simplified proof of [1].
Definition 1.6 Let Noise be the set of space-time points v where η does not obey the transition rule Trans′. Let us define a new process ξ such that ξ(w, t) = 0 if η(w, t) = ζ′(w, t), and 1 otherwise. Let

  Corr(i, j, u, t) = Maj(ξ(i, j, u, t), ξ(i + 1, j, u, t), ξ(i, j + 1, u, t)). ⌟

For all points (i, j, u, t + 1) ∉ Noise(η), we have

  ξ(i, j, u, t + 1) ≤ max(Corr(i, j, u − 1, t), Corr(i, j, u, t), Corr(i, j, u + 1, t)).

Indeed, at a point obeying Trans′ the new value can deviate from ζ′ only if a majority vote in one of the three planes u − 1, u, u + 1 is already corrupted, and a vote is corrupted only if at least two of its three cells deviate, that is, only if the corresponding Corr value is 1.
Now, Theorem 1 can be restated as follows. Suppose ε < 1/(32 d⁸). If m = ∞ then

  P{ ξ(w, t) = 1 } ≤ 4dε.

If m is finite then

  P{ ξ(w, t) = 1 } ≤ 4dε + 2d · t n m² (2d² ε^{1/12})^m.
2 Proof using small explanation trees
Definition 2.1 (Covering process) If m < ∞, let C′ = Z³ be our covering space, and V′ = C′ × Z our covering space-time. There is a projection proj(u) from C′ to C defined by

  proj(u)_i = u_i mod m (i = 1, 2),

and correspondingly modulo n in the third coordinate when n < ∞; it extends to V′ identically in the time coordinate. We define a random process ξ′ over C′ by

  ξ′(w, t) = ξ(proj(w), t).

The set Noise is extended similarly to Noise′. Now, if proj(w_1) = proj(w_2) then ξ′(w_1, t) = ξ′(w_2, t), and therefore the failures at time t in w_1 and w_2 are not independent. ⌟
Definition 2.2 (Arrows, forks) In figures, we generally draw space-time with the time direction going down. Therefore, for two neighbor points u, u′ of the space Z (where u is considered a neighbor of itself as well) and integers i, j, t, we will call arrows, or vertical edges, the following kinds of (undirected) edges:

  {(i, j, u, t), (i, j, u′, t − 1)}, {(i, j, u, t), (i + 1, j, u′, t − 1)}, {(i, j, u, t), (i, j + 1, u′, t − 1)}.

We will call forks, or horizontal edges, the following kinds of edges:

  {(i, j, u, t), (i + 1, j, u, t)}, {(i, j, u, t), (i, j + 1, u, t)}, {(i + 1, j, u, t), (i, j + 1, u, t)}.

We define the graph G by introducing all possible arrows and forks. Thus, a point is adjacent to 6 possible forks and 18 possible arrows: the degree of G is at most

  d = 24.

(If the space is (s + 2)-dimensional, then d = 12(s + 1).) We use the notation Time((w, t)) = t. ⌟
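The count d = 24 can be checked mechanically. The following script (mine, not part of the paper) enumerates all arrows and forks incident to a fixed point by scanning the points around it:

```python
from itertools import product

def arrows_from(p):
    # the three kinds of arrows from p, with u' ranging over u-1, u, u+1
    i, j, u, t = p
    for di, dj in [(0, 0), (1, 0), (0, 1)]:
        for du in (-1, 0, 1):
            yield frozenset({p, (i + di, j + dj, u + du, t - 1)})

def forks_from(p):
    # the three kinds of forks involving the cell (i, j) of p
    i, j, u, t = p
    yield frozenset({p, (i + 1, j, u, t)})
    yield frozenset({p, (i, j + 1, u, t)})
    yield frozenset({(i + 1, j, u, t), (i, j + 1, u, t)})

origin = (0, 0, 0, 0)
box = [q + (t,) for q in product(range(-2, 3), repeat=3) for t in (-1, 0, 1)]
arrows = {e for p in box for e in arrows_from(p) if origin in e}
forks = {e for p in box for e in forks_from(p) if origin in e}
assert (len(arrows), len(forks)) == (18, 6)   # degree 18 + 6 = 24
```

Nine arrows lead down from a point and nine lead down into it; four forks touch it within its own row and column, and two come from the diagonal kind.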
The following lemma is key to the proof, since it will allow us to estimate the probability of each deviation from the correct space-time configuration. It assigns to each deviation a certain tree called its "explanation". Larger explanations contain more noise and have a correspondingly smaller probability. For some constants c_1, c_2, there will be at most 2^{c_1 L} explanations of size L, and each such explanation will have probability upper bound ε^{c_2 L}.
Lemma 2.3 (Explanation Tree) Let u be a point outside the set Noise′ with ξ′(u) = 1. Then there is a tree Expl(u, ξ′), consisting of u and points v of G with Time(v) < Time(u) and connected with arrows and forks, called an explanation of u. It has the property that if k nodes of Expl belong to Noise′ then the number of edges of Expl is at most 4(k − 1).
This lemma will be proved in the next section. To use it in the proof of the main theorem, we need some easy lemmas.
Definition 2.4 A weighted tree is a tree whose nodes have weights 0 or 1, with the root having weight 0. The redundancy of such a tree is the ratio of its number of edges to its weight. The set of nodes of weight 1 of a tree T will be denoted by F(T).

A subtree of a tree is a subgraph that is a tree. ⌟
Lemma 2.5 Let T be a weighted tree of total weight w > 3 and redundancy λ. It has a subtree of total weight w_1 with w/3 < w_1 ≤ 2w/3, and redundancy ≤ λ.
Proof. Let us order T from the root r down. Let T_1 be a minimal subtree below r with weight > w/3. Then the subtrees immediately below T_1 all weigh ≤ w/3. Let us delete as many of these as possible while keeping the weight of T_1 above w/3. At this point, the weight w_1 of T_1 is > w/3 but ≤ 2w/3: otherwise we could still subtract a number ≤ w/3 from it with the weight staying > w/3 (note that since w > 3, the tree T_1 is not a single node).

Now T has been separated by a node into T_1 and T_2, with weights w_1, w_2 > w/3. Since the root of a tree has weight 0 by definition, the possible weight of the root of T_1 stays in T_2, and we have w_1 + w_2 = w. The redundancy of T is then a weighted average of the redundancies of T_1 and T_2, and we can choose the one of the two with the smaller redundancy: its redundancy is at most that of T. □
Theorem 2 (Tree Separator) Let T be a weighted tree with weight w and redundancy λ, and let k < w. Then T has a subtree with weight w′ such that k/3 < w′ ≤ k, and redundancy ≤ λ.

Proof. Let us perform the operation of Lemma 2.5 repeatedly, until we get weight ≤ k. Then the weight w′ of the resulting tree is > k/3. □
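The descent in the proof of Lemma 2.5 is constructive. Here is a sketch of it (my own illustration with invented names; it finds the weight-balanced subtree but omits the redundancy bookkeeping, which the averaging argument supplies for free):

```python
def subtree_weight(children, weight, v):
    return weight[v] + sum(subtree_weight(children, weight, c) for c in children[v])

def split(children, weight, root):
    """Find a rooted subtree of weight w1 with w/3 < w1 <= 2w/3 (Lemma 2.5),
    assuming total weight w > 3.  Returns (node, kept_children, w1)."""
    w = subtree_weight(children, weight, root)
    v = root
    # descend to a minimal subtree of weight > w/3
    while True:
        heavy = [c for c in children[v]
                 if subtree_weight(children, weight, c) > w / 3]
        if not heavy:
            break
        v = heavy[0]
    # greedily drop child subtrees (each weighing <= w/3) while weight stays > w/3
    kept = list(children[v])
    w1 = subtree_weight(children, weight, v)
    for c in list(kept):
        wc = subtree_weight(children, weight, c)
        if w1 - wc > w / 3:
            kept.remove(c)
            w1 -= wc
    return v, kept, w1

# example: root 0 (weight 0) with leaves 1, 2 and a node 3 carrying four leaves
children = {0: [1, 2, 3], 1: [], 2: [], 3: [4, 5, 6, 7],
            4: [], 5: [], 6: [], 7: []}
weight = {0: 0, 1: 1, 2: 1, 3: 0, 4: 1, 5: 1, 6: 1, 7: 1}
v, kept, w1 = split(children, weight, 0)
assert 6 / 3 < w1 <= 2 * 6 / 3
```

In the example the total weight is 6; the descent stops at node 3 (weight 4), one leaf is dropped, and the returned subtree has weight 3, inside the interval (2, 4].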
Lemma 2.6 (Tree Counting) In a graph of maximum node degree d, the number of weighted subtrees rooted at a given node and having n edges is at most 2d · (2d²)^n.
Proof. Let us number the nodes of the graph arbitrarily. Each tree of n edges can now be traversed in a breadth-first manner. At each non-root node of the tree of degree f from which we continue, we make a choice out of d for f and then a choice out of d − 1 for each of the f − 1 outgoing edges. This is at most d^f possibilities. At the root, the number of outgoing edges is equal to f, so this is d^{f+1}. The total number of possibilities is then at most d^{2n+1}, since the sum of the degrees is 2n. Each point of the tree can have weight 0 or 1, which multiplies the expression by 2^{n+1}. □
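The final count matches the bound in the lemma exactly: 2^{n+1} d^{2n+1} = 2d · (2d²)^n. A quick integer check (mine) confirms the identity:

```python
# verify 2^(n+1) * d^(2n+1) == 2d * (2 d^2)^n exactly, for several d and n
for d in (2, 3, 24):
    for n in range(6):
        assert 2 ** (n + 1) * d ** (2 * n + 1) == 2 * d * (2 * d * d) ** n
```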
Proof of Theorem 1. Let us consider each explanation tree a weighted tree in which the weight is 1 in a node exactly if the node is in Noise′. For each k, let E_k be the set of possible explanation trees Expl for u with weight |F(Expl)| = k. First we prove the theorem for m = ∞, that is, Noise′ = Noise. If we fix an explanation tree Expl then the events w ∈ Noise′ for all w ∈ F = F(Expl) are independent from each other. It follows that the probability of the event F ⊆ Noise′ is at most ε^k. Therefore we have

  P{ ξ(u) = 1 } ≤ Σ_{k=1}^∞ |E_k| ε^k.

By the Explanation Tree Lemma, each tree in E_k has at most n = 4(k − 1) edges. By the Tree Counting Lemma, we have

  |E_k| ≤ 2d · (2d²)^{4(k−1)}.

Hence

  P{ ξ(u) = 1 } ≤ 2dε Σ_{k=0}^∞ (16d⁸ε)^k = 2dε (1 − 16d⁸ε)^{−1}.

If ε is small enough to make 16d⁸ε < 1/2 then this is < 4dε.

In the case C ≠ C′, this estimate bounds only the probability of ξ′(u) = 1, |Expl(u, ξ′)| ≤ m, since otherwise the events w ∈ Noise′ are not necessarily independent for w ∈ F. Let us estimate the probability that an explanation Expl(u, ξ′) has m or more nodes. It follows from the Tree Separator Theorem that Expl has a subtree S with weight k′, where m/12 ≤ k′ ≤ m/4, and at most m nodes. Since S is connected, no two of its nodes can have the same projection. Therefore for a fixed tree of this kind, the events that its nodes of weight 1 belong to Noise′ are independent. Hence for each tree S of these sizes, the probability that S is such a subtree of Expl is at most ε^{m/12}. To get the probability that there is such a subtree, we multiply by the number of such subtrees. An upper bound on the number of places for the root is t m² n. An upper bound on the number of trees from a given root is obtained from the Tree Counting Lemma. Hence

  P{ |Expl(u, ξ′)| > m } ≤ 2d · t m² n · (2d² ε^{1/12})^m. □
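As a numeric sanity check of the last two steps of the infinite-space estimate (my own illustration, with d = 24 and a sample ε below the threshold 1/(32 d⁸); exact rationals avoid rounding):

```python
from fractions import Fraction

d = 24
eps = Fraction(1, 64 * d ** 8)   # any eps < 1/(32 d^8) works
q = 16 * d ** 8 * eps            # ratio of the geometric series
assert q < Fraction(1, 2)
bound = 2 * d * eps / (1 - q)    # 2 d eps * sum_k (16 d^8 eps)^k
assert bound < 4 * d * eps
```

Here q = 1/4, so the series sums to 4/3 and the bound is (8/3)dε, comfortably below 4dε.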
3 The existence of small explanation trees
3.1 Some geometrical facts
Let us introduce some geometrical concepts.
Definition 3.1 Three linear functionals are defined as follows for v = (x, y, z, t):

  L_1(v) = −x − t/3, L_2(v) = −y − t/3, L_3(v) = x + y + 2t/3. ⌟

Notice L_1(v) + L_2(v) + L_3(v) = 0.
Definition 3.2 For a set U, we write

  Size(U) = Σ_{i=1}^3 max_{v∈U} L_i(v). ⌟

Notice that for a point v we have Size({v}) = 0.
Definition 3.3 A set 𝒮 = {U_1, ..., U_k} of sets is connected by intersection if the graph G(𝒮), obtained by introducing an edge between U_i and U_j whenever U_i ∩ U_j ≠ ∅, is connected. ⌟
Definition 3.4 A spanned set is an object P = (U, v_1, v_2, v_3) where U is a space-time set and v_i ∈ U. The points v_i are the poles of P, and U is its base set. We define Span(P) as Σ_{i=1}^3 L_i(v_i). ⌟
Remark 3.5 As said in the introduction, this paper is an exposition of Toom's more general proof in [3], specialized to the case of the construction in [2]. Some of the terminology is taken from [1], and differs from the one in [3]. What is called a spanned set here is called a "polar" in [3], and its span is called its "extent" there. In our definition of L_i the terms depending on t play no role; they just make the exposition compatible with [3]. ⌟
Lemma 3.6 (Spanned Set Creation) If U is a set then there is a spanned set P = (U, v_1, v_2, v_3) on U with Span(P) = Size(U).

Proof. Assign v_i to a point of the set U in which L_i is maximal. □
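These notions are easy to experiment with (a sketch of mine; exact rationals avoid rounding in the thirds). Besides the two observations above, the script checks that the triangle {(i, j), (i + 1, j), (i, j + 1)} at a fixed time, and each of its two-element subsets, has size 1, a fact used in Section 3.2:

```python
from fractions import Fraction as F
from itertools import combinations

def L(i, v):
    # the three linear functionals of Definition 3.1 (z is unused)
    x, y, z, t = v
    t = F(t)
    return [-x - t / 3, -y - t / 3, x + y + 2 * t / 3][i]

def size(U):
    return sum(max(L(i, v) for v in U) for i in range(3))

def poles(U):
    # Lemma 3.6: the i-th pole is a point maximizing L_i
    return [max(U, key=lambda v: L(i, v)) for i in range(3)]

def span(U):
    return sum(L(i, p) for i, p in enumerate(poles(U)))

v = (5, -2, 7, 3)
assert sum(L(i, v) for i in range(3)) == 0   # L_1 + L_2 + L_3 = 0
assert size([v]) == 0                        # a singleton has size 0
triangle = [(0, 0, 0, -1), (1, 0, 0, -1), (0, 1, 0, -1)]
assert size(triangle) == 1
for pair in combinations(triangle, 2):
    assert size(list(pair)) == span(list(pair)) == 1
```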
The following lemma is our main tool.
Lemma 3.7 (Spanning) Let 𝓛 = (L, u_1, u_2, u_3) be a spanned set and 𝓜 be a set of subsets of L connected by intersection, whose union covers the poles of 𝓛. Then there is a set {M_1, ..., M_n} of spanned sets whose base sets U_i are elements of 𝓜, such that the following holds. Let U′_i be the set of poles of M_i.

a) Span(𝓛) = Σ_i Span(M_i).

b) The union of the sets U′_i covers the set of poles of 𝓛.

c) The system {U′_1, ..., U′_n} is a minimal system connected by intersection (that is, none of them can be deleted) that connects the poles of 𝓛.
Proof. Let M_{i_j} ∈ 𝓜 be a set containing the point u_j. Let us choose u_j as the j-th pole of M_{i_j}. Now leave only those sets of 𝓜 that are needed for a minimal tree T of the graph G(𝓜) connecting M_{i_1}, M_{i_2}, M_{i_3}. Keep deleting points from each set (except u_j from M_{i_j}) until every remaining point is necessary for a connection among the u_j. There will only be two- and three-element sets, and any two of them intersect in at most one element. Let us draw an edge between each pair of points if they belong to a common set U′_i. This turns the union

  U = ∪_i U′_i

into a graph. (Actually, this graph can have only two simple forms: a point connected via disjoint paths to the poles u_i, or a triangle connected via disjoint paths to these poles.) For each i and j, there is a shortest path between U′_i and u_j. The point of U′_i where this path leaves U′_i will be made the j-th pole u_{ij} of M_i. For j ∈ {1, 2, 3} we have u_{i_j j} = u_j by definition. This rule creates three poles in each M_i, and each point of U′_i is a pole.
Let us show Σ_i Span(M_i) = Span(𝓛). We can write

  Σ_i Span(M_i) = Σ_{v∈U} Σ_{i,j : v = u_{ij}} L_j(v).   (1)

For a point v ∈ U, let

  I(v) = { i : v ∈ U′_i }.

For i ∈ I(v), let E_i(v) be the set of those j ∈ {1, 2, 3} for which either i = i_j or v ≠ u_{ij}. Because the graph T is a tree, for each fixed v the sets E_i(v) are disjoint. Because of connectedness, they form a partition of the set {1, 2, 3}. Let q_i(j, v) = 1 if j ∈ E_i(v) and 0 otherwise; then we have Σ_i q_i(j, v) = 1 for each j.

We can now rewrite the sum (1) as

  Σ_{j=1}^3 Σ_{v∈U} L_j(v) ( q_{i_j}(j, v) + Σ_{i∈I(v)∖{i_j}} (1 − q_i(j, v)) ).

If i = i_j ∈ I(v) then by definition we have 1 − q_i(j, v) = 0, therefore we can simplify the sum as

  Σ_{j=1}^3 Σ_{v∈U} L_j(v) q_{i_j}(j, v) + Σ_{v∈U} Σ_{i∈I(v)} Σ_{j=1}^3 L_j(v) (1 − q_i(j, v)).

The first term is equal to Span(𝓛); we show that the last term is 0. Moreover, we show that

  0 = Σ_{j=1}^3 L_j(v) Σ_{i∈I(v)} (1 − q_i(j, v))

for each v. Indeed, Σ_{i∈I(v)} (1 − q_i(j, v)) is independent of j, since it is |I(v)| − Σ_i q_i(j, v) = |I(v)| − 1. On the other hand, Σ_{j=1}^3 L_j(v) = 0, as always. □
3.2 Building an explanation tree
Let us define the excuse of a space-time point.
Definition 3.8 (Excuse) Let v = (i, j, u, t + 1) with ξ′(v) = 1. If v ∉ Noise′ then there is a u′ such that ξ′(w) = 1 for at least two members w of the set

  {(i, j, u′, t), (i + 1, j, u′, t), (i, j + 1, u′, t)}.

We define the set Excuse(v) as such a pair of elements w, and as the empty set in all other cases. By Lemma 3.6, we can turn Excuse(v) into a spanned set (Excuse(v), w_1, w_2, w_3) with span 1. Denote

  Excuse_j(v) = w_j.

Since no excuse is built from a node in Noise′, let us delete all arrows leading down from nodes in Noise′: the new graph is denoted by G′. ⌟
The following lemma utilizes the fact that Toom's rule "makes triangles shrink".
Lemma 3.9 (Excuse Size) If 𝒱 = (V, v_1, v_2, v_3) is a spanned set and the v_j are not in Noise′, then we have

  Σ_{j=1}^3 L_j(Excuse_j(v_j)) = Span(𝒱) + 1.

Proof. Let U be the triangular prism

  U = (0, 0, 0, −1) + {u : L_1(u) ≤ 0, L_2(u) ≤ 0, L_3(u) ≤ 1}.

We have Size(U) = 1, and Excuse(v) ⊆ v + U. Since the chosen poles turn Excuse(v) into a spanned set of size 1, the function L_j achieves its maximum in v + U on Excuse_j(v). We have

  L_j(Excuse_j(v)) = max_{u∈v+U} L_j(u) = L_j(v) + max_{u∈U} L_j(u).

Hence we have

  Σ_j L_j(Excuse_j(v_j)) = Σ_j max_{u∈U} L_j(u) + Σ_j L_j(v_j) = Size(U) + Span(𝒱) = 1 + Span(𝒱). □
Definition 3.10 (Clusters) Let us call two nodes u, v of the above graph with Time(u) = Time(v) = t equivalent if there is a path between them in G′ made of arrows, using only points x with Time(x) ≤ t. An equivalence class will be called a cluster. For a cluster K we will denote by Time(K) the common time of its points. We will say that a fork or arrow connects two clusters if it connects some of their nodes. ⌟
By our definition of G′, if a cluster contains a point in Noise′ then it contains no other points.
Definition 3.11 (Cause graph) Within the graph G′, for a cluster K we define the cause graph G_K = (V_K, E_K) as follows. The elements of V_K are those clusters S with Time(S) = Time(K) − 1 which are reachable by an arrow from K. For S, T ∈ V_K we have {S, T} ∈ E_K iff for some v ∈ S and w ∈ T we have Time(v) = Time(w) = Time(K) − 1 and {v, w} ∈ Forks. ⌟
Lemma 3.12 The cause graph G_K is connected.
Proof. The points of K are connected via arrows using points x with Time(x) ≤ Time(K). The clusters in G_K are therefore connected with each other only through pairs of arrows going through K. The tails of each such pair of arrows at time Time(K) − 1 are connected by a fork. □
Definition 3.13 A spanned cluster is a spanned set whose base set is a cluster. ⌟
The explanation tree will be built from an intermediate object defined below. Let us fix a point u_0: from now on we will work in the subgraph of the graph G′ reachable from u_0 by arrows pointing backward in time. Clusters are defined in this graph.
Figure 1: An explanation tree. The black points are noise. The squares are other points of the explanation tree. Thin lines are arrows not in the explanation tree. Framed sets are clusters to which the refinement operation was applied. Thick solid lines are arrows, thick broken lines are forks of the explanation tree.
Definition 3.14 A partial explanation tree is an object of the form (C_0, C_1, E). Elements of C_0 are spanned clusters called unprocessed nodes; elements of C_1 are processed nodes, which are nodes of G. The set E is a set of arrows or forks between processed nodes, between poles of the spanned clusters, and between processed nodes and poles of the spanned clusters. From this structure a graph is formed if we identify each pole of a spanned cluster with the cluster itself. This graph is required to be a tree.

The span Span(T) of such a tree T will be the sum of the spans of its unprocessed clusters and the number of its forks. ⌟
The explanation tree will be built by repeatedly applying a "refinement" operation to partial explanation trees.
Definition 3.15 (Refinement) Let T be a partial explanation tree, and let the spanned cluster 𝒦 = (K, v_1, v_2, v_3) be one of its unprocessed nodes, with the v_i not in Noise′. We apply an operation whose result will be a new tree T′.

Consider the cause graph G_K = (V_K, E_K) defined above. Let 𝓜 = V_K ∪ E_K, that is, the family of all clusters in V_K (sets of points) and all edges in G_K connecting them (two-element sets). Let L be the union of these sets, and 𝓛 = (L, u_1, u_2, u_3) a spanned set where u_i = Excuse_i(v_i). Lemma 3.12 implies that the set 𝓜 is connected by intersection. Applying the Spanning Lemma 3.7 to 𝓛 and 𝓜, we find a family M_1, ..., M_n of spanned sets with

  Σ_i Span(M_i) = Span(𝓛) = Σ_i L_i(u_i).

It follows from Lemma 3.9 that the latter sum is Span(𝒦) + 1, and that the u_i are among the poles of these sets. Some of these sets are spanned clusters, others are forks connecting them, adjacent to their poles. Consider these forks again as edges and the spanned clusters as nodes. By the minimality property of Lemma 3.7, they form a tree T(𝒦) that connects the three poles of 𝓛.

The refinement operation takes the unprocessed node 𝒦 = (K, v_1, v_2, v_3) in the tree T. This node is connected to other parts of the tree by some of its poles v_j. The operation deletes the cluster K, and keeps those poles v_j that were needed to keep 𝒦 connected to other clusters and nodes in T. It turns these into processed nodes, and adds the tree T(𝒦) just built, declaring each of its spanned clusters an unprocessed node. Then it adds the arrow from each of these v_j to Excuse_j(v_j). Even if none of these poles were needed for connection, it keeps v_1 and adds the arrow from v_1 to Excuse_1(v_1). ⌟
Each refinement operation increases the span by 1 and the number of arrows by at least 1 (and at most 3).
Let us now build the explanation tree. We start with a node u_0 ∉ Noise′ with ξ′(u_0) = 1, and from now on work in the subgraph of the graph G′ of points reachable from u_0 by arrows backward in time. Then ({u_0}, u_0, u_0, u_0) is a spanned cluster, forming a one-node partial explanation tree if we declare it an unprocessed node. We apply the refinement operation to this partial explanation tree as long as we can. When it cannot be applied any longer, all nodes are either processed or are one-point spanned clusters belonging to Noise′. See the example in Figure 1.
Proof of Lemma 2.3. What is left to prove is the estimate on the number of edges of our explanation tree T. Note the following:

• The span of T is the number of its forks.

• Each point at some time t that is not in Noise′ is incident to some arrows going to time t − 1.

Let us contract each arrow (u, v) of T one-by-one into its bottom point v. The edges of the resulting tree T′ are the forks. All the processed nodes will be contracted into the remaining one-node clusters that are elements of Noise′. If k is the number of these nodes then there are k − 1 = Span(T) forks in T′.

The number of arrows in T is at most 3(k − 1). Indeed, each introduction of at most 3 arrows by the refinement operation was accompanied by an increase of the span by 1. The total number of edges of T is thus at most 4(k − 1). □
References
[1] Piotr Berman and Janos Simon, Investigations of fault-tolerant networks of computers, Proc. of the 20th Annual ACM Symposium on the Theory of Computing, 1988, pp. 66-77.

[2] Peter Gács and John Reif, A simple three-dimensional real-time reliable cellular array, Journal of Computer and System Sciences 36 (1988), no. 2, 125-147. Short version in STOC '85.

[3] Andrei L. Toom, Stable and attractive trajectories in multicomponent systems, Multicomponent Systems (R. L. Dobrushin, ed.), Advances in Probability, vol. 6, Dekker, New York, 1980, translation from Russian, pp. 549-575.