
be necessary in order to achieve average consensus. It is conceivable that a simulation algorithm that allows mistakes will achieve average consensus faster than a universal algorithm such as Algorithm 5. This is not a focus of the current chapter.

Lemma 7.2 (Symmetric Erasures). When the erasures are symmetric and i.i.d. over time and space, the convergence rate of (7.9), $\mu_{sc}$, which we define as

$$\mu_{sc} = \sup_{x_o \neq \bar{x}_0\mathbf{1}}\ \lim_{k\to\infty}\left(\frac{\mathbb{E}\|x_k - \bar{x}_0\mathbf{1}\|^2}{\|x_o - \bar{x}_0\mathbf{1}\|^2}\right)^{\frac{1}{2k}} \quad (7.10)$$

is given by

$$\mu_{sc} = \sqrt{\lambda_2(\Gamma_s)} \quad (7.11)$$

where $\Gamma_s = \mathbb{E}\left[(I - L_0)\otimes(I - L_0)\right]$ is a deterministic matrix that is a function of $\epsilon$, $p$, and $L$, and can be computed explicitly in closed form. The subscript $c$ indicates that there is no coding, and the subscript $s$ in $\Gamma_s$ indicates that the erasures are symmetric.

Proof. See Appendix 7.4.1.

In this case, note that even without coding the nodes achieve average consensus, albeit at a slower rate that depends on the erasure probability $p$.
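To make the lemma concrete, the following sketch computes $\Gamma_s$ exactly by enumerating the $2^{|E|}$ symmetric erasure patterns and evaluates $\mu_{sc} = \sqrt{\lambda_2(\Gamma_s)}$ as in (7.11). The 3-node line graph, $p = 0.3$, and step size $\epsilon = 0.3$ are illustrative assumptions, not values from the text.

```python
import itertools
import numpy as np

# Exact computation of Gamma_s = E[(I - L_0) (x) (I - L_0)] by enumerating
# all symmetric erasure patterns of a small graph. Graph, p, and eps are
# illustrative assumptions.

def laplacian(n, edges):
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1
    return L

def gamma_s(n, edges, p, eps):
    G = np.zeros((n * n, n * n))
    for pattern in itertools.product([0, 1], repeat=len(edges)):
        prob = np.prod([(1 - p) if keep else p for keep in pattern])
        alive = [e for e, keep in zip(edges, pattern) if keep]
        W = np.eye(n) - eps * laplacian(n, alive)  # I - L_0 for this pattern
        G += prob * np.kron(W, W)
    return G

n, edges, p, eps = 3, [(0, 1), (1, 2)], 0.3, 0.3
G = gamma_s(n, edges, p, eps)
eigs = np.sort(np.abs(np.linalg.eigvals(G)))[::-1]
mu_sc = np.sqrt(eigs[1])  # sqrt of the second-largest eigenvalue, cf. (7.11)
print(mu_sc)
```

The largest eigenvalue of $\Gamma_s$ is 1 (with eigenvector $\mathbf{1}\otimes\mathbf{1}$), so the rate is governed by the second one, as the lemma states.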

Now consider using repetition coding. To understand the rationale behind even considering repetition coding, recall that the recursion (7.9) can be written as $x_{k+1} = (I - L_k)x_k$. Taking expectations on both sides gives $\bar{x}_{k+1} = (I - \bar{L})\bar{x}_k$, where the bar indicates expected values¹. Since the link erasure probability is $p$, $\bar{L} = (1-p)L$. Suppose $\epsilon = \epsilon^* = 2/(\lambda_1(L) + \lambda_{N-1}(L))$ as in (7.8). Then, using Lemma 7.1, the rate of convergence of $\bar{x}_k$ to $\bar{x}_0\mathbf{1}$ can be calculated as

$$\bar{\mu} = \max\left\{1 - \epsilon\lambda_{N-1}(\bar{L}),\ \epsilon\lambda_1(\bar{L}) - 1\right\} \quad (7.12)$$
$$= \max\left\{1 - \epsilon(1-p)\lambda_{N-1}(L),\ \epsilon(1-p)\lambda_1(L) - 1\right\} \quad (7.13)$$
$$= \frac{\lambda_1(L) - \lambda_{N-1}(L)}{\lambda_1(L) + \lambda_{N-1}(L)} + \frac{2p\,\lambda_{N-1}(L)}{\lambda_1(L) + \lambda_{N-1}(L)} \quad (7.14)$$
$$= \mu + \frac{2p\,\lambda_{N-1}(L)}{\lambda_1(L) + \lambda_{N-1}(L)} \quad (7.15)$$

where $\mu$ is the rate of convergence of the consensus protocol on the unerased graph. Clearly $\bar{\mu} > \mu$. Moreover, the rate of convergence of $x_k$ to $\bar{x}_0\mathbf{1}$ is even slower (as compared to $\bar{x}_k$ converging to $\bar{x}_0\mathbf{1}$). Since the repetition code simulates consensus over the unerased graph, whose convergence rate is $\mu$, it can potentially result in faster convergence if the overhead due to repetition is not too high.

¹This is inconsistent with the notation $\bar{x}_0$, which is a deterministic scalar, but should not cause any confusion.
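The chain of equalities (7.13)–(7.15) is easy to check numerically. The sketch below does so for a hypothetical 5-node line graph with $p = 0.3$, using the step size $\epsilon^* = 2/(\lambda_1(L) + \lambda_{N-1}(L))$:

```python
import numpy as np

# Numerical check of (7.13)-(7.15): with the step size that is optimal for
# the unerased Laplacian, the mean dynamics slow down by exactly
# 2 p lam_{N-1} / (lam_1 + lam_{N-1}). The 5-node line graph and p = 0.3
# are illustrative assumptions.

def line_laplacian(n):
    L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1
    return L

L = line_laplacian(5)
lam = np.sort(np.linalg.eigvalsh(L))   # lam[0] = 0 <= ... <= lam[-1]
l1, lN1 = lam[-1], lam[1]              # largest and second-smallest eigenvalues
eps = 2 / (l1 + lN1)                   # optimal step size, cf. (7.8)
mu = (l1 - lN1) / (l1 + lN1)           # unerased convergence rate

p = 0.3
mu_bar_direct = max(1 - eps * (1 - p) * lN1, eps * (1 - p) * l1 - 1)  # (7.13)
mu_bar_closed = mu + 2 * p * lN1 / (l1 + lN1)                         # (7.15)
print(mu_bar_direct, mu_bar_closed)
```

The two expressions agree to machine precision, and both exceed $\mu$, confirming that erasures strictly slow down the mean dynamics.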

Using Theorems 6.1 and 6.2, we can determine the convergence rate, $\mu_s^c$, of consensus using the repetition code (i.e., Algorithm 4); it is given by

$$\mu_s^c \leq \min\left\{\mu^{R(p)},\ \mu^{(1-p)|E|}\right\} \quad (7.16)$$

where $\mu$ is defined in (7.7). The superscript and subscript in $\mu_s^c$ denote that it is the convergence rate with coding under symmetric erasures. So, whenever $\mu_s^c < \mu_{sc}$, coding offers an advantage. In practice, though, the computational overhead of repetition coding would probably far outweigh any benefit of reaching consensus faster. The more interesting and relevant case is when the erasures are asymmetric, in which case there is no recourse but coding.

7.3.2 Asymmetric Erasures

Since $X_k^{ij}$ and $X_k^{ji}$ are independent, they are not equal in general. Note that $(I - L_k)\mathbf{1} = \mathbf{1}$ but $\mathbf{1}^T(I - L_k) \neq \mathbf{1}^T$ in general, which violates (7.5). Furthermore, the associated graph is not balanced² either, i.e., $\sum_j a_{ij}X_k^{ij} \neq \sum_j a_{ji}X_k^{ji}$ in general. In this case, the nodes will not achieve average consensus. But under very mild conditions, it is well known that the nodes achieve agreement, i.e., $x_k \to Y\mathbf{1}$, where $Y$ is a random variable that does not necessarily concentrate around the initial average $\bar{x}_0$. Nevertheless, the nodes reach agreement, and we will characterize the rate of convergence below. Tree codes, on the other hand, allow us to simulate the original recursions, i.e., (7.1), and hence guarantee asymptotic average consensus. Here, we characterize the mean-squared error of the state from average consensus when no error correction is used.
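A one-step example illustrates why asymmetric erasures destroy the average. The sketch below uses a hypothetical 3-node line graph in which the link $0 \to 1$ is erased while $1 \to 0$ survives, with an illustrative step size $\epsilon = 0.3$: the update matrix stays row stochastic (so agreement remains possible), but its column sums differ from 1, so the state average changes.

```python
import numpy as np

# One step of (7.9) under an asymmetric erasure pattern on a hypothetical
# 3-node line graph. Convention (an assumption): X[i, j] = 1 if the packet
# from j to i arrives. Here 0->1 is erased while 1->0 survives.

eps = 0.3
n = 3
X = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 1, 0]])
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])          # adjacency of the line graph
M = A * X                          # links actually heard this round
Lk = np.diag(M.sum(axis=1)) - M    # Laplacian of the received graph
W = np.eye(n) - eps * Lk

x0 = np.array([1.0, 0.0, 0.0])
x1 = W @ x0
print(W.sum(axis=1))          # row sums: all ones, so W 1 = 1
print(x1.mean(), x0.mean())   # the average is not preserved
```

Here $x_1 = [0.7, 0, 0]$, so the average drops from $1/3$ to $0.7/3$ after a single round.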

Lemma 7.3 (Asymmetric Erasures). When the erasures are asymmetric and i.i.d. over time and space, we have

$$\mathbb{E}\|x_k - \bar{x}_0\mathbf{1}\|^2 = \left[(x_o - \bar{x}_0\mathbf{1})^T \otimes (x_o - \bar{x}_0\mathbf{1})^T\right]\Gamma_a^k\,\mathrm{vec}(I) \quad (7.17)$$

Here $I$ is an $N \times N$ identity matrix and

$$\Gamma_a = \mathbb{E}\left[(I - L_0^T) \otimes (I - L_0^T)\right] \quad (7.18)$$

where $\Gamma_a$ is a deterministic matrix that is a function of $\epsilon$, $p$, and $L$, and can be computed explicitly in closed form. Furthermore, $\rho(\Gamma_a) = 1$.

Proof. See Appendix 7.4.2.

²A graph is said to be balanced if, for every node, the in-degree equals the out-degree.
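The structure of $\Gamma_a$ in (7.18) and the claim $\rho(\Gamma_a) = 1$ can be verified numerically. The sketch below enumerates the $2^4$ directed erasure patterns of a hypothetical 3-node line graph; $p = 0.3$ and $\epsilon = 0.3$ are illustrative choices.

```python
import itertools
import numpy as np

# Exact computation of Gamma_a = E[(I - L_0^T) (x) (I - L_0^T)] from (7.18)
# for a hypothetical 3-node line graph with i.i.d. directed erasures.
# Convention (an assumption): a directed edge (i, j) means i can hear j.

def gamma_a(n, dir_edges, p, eps):
    G = np.zeros((n * n, n * n))
    for pattern in itertools.product([0, 1], repeat=len(dir_edges)):
        prob = np.prod([(1 - p) if keep else p for keep in pattern])
        M = np.zeros((n, n))
        for (i, j), keep in zip(dir_edges, pattern):
            if keep:
                M[i, j] = 1.0          # node i hears node j this round
        Lk = np.diag(M.sum(axis=1)) - M
        W = np.eye(n) - eps * Lk       # I - L_0 for this pattern
        G += prob * np.kron(W.T, W.T)
    return G

n, p, eps = 3, 0.3, 0.3
dir_edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
G = gamma_a(n, dir_edges, p, eps)
ones = np.ones(n * n)
print(np.allclose(ones @ G, ones))           # 1^T Gamma_a = 1^T
print(np.max(np.abs(np.linalg.eigvals(G))))  # spectral radius = 1
```

Since each realization of $(I - L_0^T) \otimes (I - L_0^T)$ is nonnegative with unit column sums, so is their average, which is why the spectral radius is exactly 1.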

Note that $\mathbf{1}^T\Gamma_a = \mathbf{1}^T$ but $\Gamma_a\mathbf{1} \neq \mathbf{1}$. Let $c$, $\|c\| = 1$, be the right eigenvector of $\Gamma_a$ corresponding to eigenvalue 1, i.e., $\Gamma_a c = c$. Then it is easy to see that $\lim_{k\to\infty}\Gamma_a^k = \frac{1}{N}c\mathbf{1}^T$. Using this in (7.17), we get

$$\lim_{k\to\infty} \mathbb{E}\|x_k - \bar{x}_0\mathbf{1}\|^2 = \left[(x_o - \bar{x}_0\mathbf{1})^T \otimes (x_o - \bar{x}_0\mathbf{1})^T\right] c \quad (7.19)$$

This proves that one cannot achieve average consensus without coding when link failures are asymmetric. So, a major benefit of using tree codes in such cases is the guarantee of average consensus. Note that we ignore quantization effects, which is justified by the packet sizes used in practice. We can easily compute the rate of convergence of the consensus protocol when tree codes are used. Recall that the overall rate of the simulation protocol (i.e., Algorithm 5) is at least $r\rho(r, p)$, where $r$ is the rate of the tree code, $p$ is the probability of packet erasure, and $\rho(r, p)$ is defined in (6.13) as

$$\rho(r, p) \triangleq \sup_{R_0 \geq 0}\left\{R_0 \;\middle|\; (1 - R_0)\log(1/p)/2 > H(R_0) + \log(\Delta + 1)\right\}$$

The effective rate of convergence to average consensus achieved by using tree codes is no worse than $\mu^{r\rho(r,p)}$.

7.3.3 A Simulation

We will perform a simple simulation to demonstrate the effectiveness of tree codes in achieving average consensus, and achieving it quickly. We use a graph of 20 nodes connected in a straight line, as depicted in Figure 7.1(a). The packet length is 16. When there is no coding, nodes exchange one packet each in every communication round.

For coding, we generate a random code from the Toeplitz ensemble (cf. Chapter 4) with rate 1/5, i.e., every packet is mapped to 5 packets, and a communication round now consists of exchanging these 5 packets between every pair of neighboring nodes. Each node is initialized randomly with 0 or 1. Sample trajectories of the values at every node in the graph are plotted in Figure 7.1(b). The plot clearly illustrates that the nodes do not achieve average consensus without coding, while they do with tree codes.
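A scaled-down, uncoded version of this experiment can be sketched as follows; the 10-node graph, step size $\epsilon = 0.4$, and number of rounds are illustrative assumptions (the chapter uses 20 nodes), while the 30% erasure probability matches the figure.

```python
import numpy as np

# Uncoded consensus on a line graph with i.i.d. asymmetric packet erasures.
# The nodes agree, but generally not on the initial average. n, eps, and
# the round count are illustrative assumptions.

rng = np.random.default_rng(0)
n, p, eps, rounds = 10, 0.3, 0.4, 600
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)  # line graph

x = rng.integers(0, 2, size=n).astype(float)  # random 0/1 initial values
avg0 = x.mean()
for _ in range(rounds):
    X = (rng.random((n, n)) > p).astype(float)  # per-directed-link successes
    M = A * X                                   # neighbors heard this round
    Lk = np.diag(M.sum(axis=1)) - M
    x = x - eps * (Lk @ x)

spread = x.max() - x.min()
print(spread, x.mean(), avg0)  # agreement reached; mean typically drifts
```

Each realized update matrix is row stochastic with nonnegative entries, so the states remain in $[0, 1]$ and contract to a common value, which is a random variable rather than the initial average, consistent with (7.19).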

[Figure: nodes 1, 2, …, 19, 20 connected in a line]

(a) A line graph with 20 nodes

[Plot: "Network of 20 nodes connected in a line topology, 30% erasures"; node values versus number of communication rounds (0 to 250). Without using coding, agreement is reached but not at the average of the initial values; using tree codes with rate 0.2, the nodes converge to the initial average.]

(b) Tree codes achieve average consensus. The slowdown due to coding is visible in the plot.

Figure 7.1: One needs coding to achieve average consensus when packet erasures are asymmetric