

Chapter V: Monotonicity in Cascading Failures and Tree-partitions

5.2 Monotonicity in Cascading Failures

In this section, we present our results on monotonicity in cascading failure processes. Our characterization is related to known monotonicity results and suggests a systematic way to define monotonic topological metrics over a failure event.

Our approach focuses on the Laplacian spectrum of the system. In contrast to the lack of monotonicity in the physical system, when we look at the process from this spectral perspective, there is a rich set of monotonicity properties one can explore. They are built upon the following fundamental monotonicity result:

Theorem 5.2.1. Let λ1(n) ≤ λ2(n) ≤ · · · ≤ λn(n) be the eigenvalues of L(n). Then λi(n) is a decreasing function in n for each i. Moreover, for each stage n, as long as new lines are tripped at stage n, there exists i such that the decrease is strict:

λi(n+1) < λi(n).

The Laplacian eigenvalues encode information on how well the graph is connected and how fast information can propagate in the network (see [24], for example).

Therefore, this result shows that as the cascading failure process unfolds, the network's connectivity and "mixing ability" steadily decrease. Although Theorem 5.2.1 concerns only the evolution of the network topology, we demonstrate in Corollary 5.2.6 that by applying this monotonicity properly it is possible to devise monotonic properties that are directly related to the power flow dynamics.
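As a quick sanity check (not from the thesis: the 4-bus graph, line weights, and tripped lines below are invented for illustration), the entrywise eigenvalue decrease asserted by Theorem 5.2.1 can be verified numerically:

```python
import numpy as np

def laplacian(n, edges):
    """Weighted graph Laplacian: L = sum over edges of w_e (e_i - e_j)(e_i - e_j)^T."""
    L = np.zeros((n, n))
    for (i, j), w in edges.items():
        L[i, i] += w
        L[j, j] += w
        L[i, j] -= w
        L[j, i] -= w
    return L

# Toy 4-bus network with positive line weights (illustrative values).
edges = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 0.5, (3, 0): 1.5, (0, 2): 1.0}
L0 = laplacian(4, edges)

# "Trip" two lines to form the next cascade stage.
tripped = {(0, 2), (2, 3)}
edges1 = {e: w for e, w in edges.items() if e not in tripped}
L1 = laplacian(4, edges1)

ev0 = np.sort(np.linalg.eigvalsh(L0))  # eigenvalues at stage n
ev1 = np.sort(np.linalg.eigvalsh(L1))  # eigenvalues at stage n+1

assert np.all(ev1 <= ev0 + 1e-12)           # entrywise decrease
assert np.any(ev1 < ev0 - 1e-12)            # strict decrease for at least one i
assert np.isclose(ev1.sum(), np.trace(L1))  # trace = sum of eigenvalues
```

The final assertion also previews the trace identity used in the proof of Theorem 5.2.1: the eigenvalue sum equals the sum of the surviving line weights.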

To prove Theorem 5.2.1, we first derive an eigenvalue interlacing result for generic weighted Laplacian matrices. Its special case, in which the graph is unweighted and only a single line is removed, is known in the literature [32].

Proposition 5.2.2. Let G be a weighted graph with positive line weights {we}, and let H be a subgraph of G obtained by removing exactly s edges from G. Denote by λ1 ≤ λ2 ≤ · · · ≤ λn and µ1 ≤ µ2 ≤ · · · ≤ µn the eigenvalues of LG and LH, respectively. Then for any k = 1, 2, . . . , n, we have

µk ≤ λk (5.1)

and for k = s+1, s+2, . . . , n, we have

λk−s ≤ µk. (5.2)

The proof of this proposition is presented in Section 5.7. As an immediate corollary, we can deduce the following well-known result for s = 1:

Corollary 5.2.3. With the previous notation, when H is obtained by removing a single edge from G, we have

µ1 ≤ λ1 ≤ µ2 ≤ · · · ≤ λn−1 ≤ µn ≤ λn.
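To make the interlacing pattern concrete, here is a small numerical check of Corollary 5.2.3; the weighted Laplacian below is an invented example, not from the thesis:

```python
import numpy as np

# Laplacian of a small weighted graph with edges
# (0,1) w=1.0, (0,2) w=2.0, (1,2) w=1.5, (2,3) w=1.0 (illustrative weights).
L = np.array([[ 3.0, -1.0, -2.0,  0.0],
              [-1.0,  2.5, -1.5,  0.0],
              [-2.0, -1.5,  4.5, -1.0],
              [ 0.0,  0.0, -1.0,  1.0]])

# Remove the single edge (0, 2) of weight 2.0: subtract w * c c^T.
c = np.array([1.0, 0.0, -1.0, 0.0])
LH = L - 2.0 * np.outer(c, c)

lam = np.sort(np.linalg.eigvalsh(L))   # λ1 ≤ ... ≤ λn
mu = np.sort(np.linalg.eigvalsh(LH))   # µ1 ≤ ... ≤ µn

# Interlacing: µ_k ≤ λ_k and λ_{k-1} ≤ µ_k.
assert np.all(mu <= lam + 1e-12)
assert np.all(lam[:-1] <= mu[1:] + 1e-12)
```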

We now apply Proposition 5.2.2 to the transmission network Laplacian matrices L(n). Note that in a cascading process described by the graph sequence {G(n)}n∈N, G(n+1) is obtained from G(n) by removing the tripped lines F(n) incurred during stage n; therefore, we know that the functions λi(n) as defined in Theorem 5.2.1 are monotonically decreasing. To finish the proof of Theorem 5.2.1, it thus suffices to show that for each stage n we can always find an i such that the decrease is strict.

Proof of Theorem 5.2.1. This is immediate after noting

∑i λi(n+1) = tr(L(n+1)) = ∑e∈E(n+1) Be < ∑e∈E(n) Be = tr(L(n)) = ∑i λi(n),

where the inequality is strict because there are lines tripped at stage n.

Such monotonicity of the Laplacian eigenvalues suggests that any metric measuring the system through its spectrum should be monotonic as well. The most general result we can conclude along this line is the following:

Corollary 5.2.4. Let |||·||| be a unitarily-invariant norm on the set of n×n matrices. Then |||L(n)||| is a decreasing function of n.

Proof. This is an immediate result from the bijective correspondence between unitarily invariant norms on n×n matrices and symmetric gauge functions applied to the matrix singular values [10], because symmetric gauge functions are monotone in the vector components.

Examples of unitarily-invariant norms include the spectral norm, nuclear norm, Frobenius norm, Schatten p-norms, and Ky Fan k-norms, each of which suggests a different way to measure the system's monotonicity. For example, the monotonicity in the nuclear norm recovers the fact that the sum of all link susceptances decreases in a cascading failure process.
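As a hedged numerical illustration of Corollary 5.2.4 (the network below is made up; the spectral, nuclear, and Frobenius norms are supplied by numpy), one can check that several unitarily-invariant norms of the Laplacian decrease when a line is tripped:

```python
import numpy as np

def laplacian(n, edges):
    """Weighted graph Laplacian built from a dict {(i, j): weight}."""
    L = np.zeros((n, n))
    for (i, j), w in edges.items():
        L[i, i] += w
        L[j, j] += w
        L[i, j] -= w
        L[j, i] -= w
    return L

# Illustrative 4-bus cycle; trip the line (1, 2).
edges = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 0.5, (3, 0): 1.5}
L_before = laplacian(4, edges)
L_after = laplacian(4, {e: w for e, w in edges.items() if e != (1, 2)})

# Three unitarily-invariant norms; each can only go down after a trip.
norms = {
    "spectral":  lambda A: np.linalg.norm(A, 2),      # largest singular value
    "nuclear":   lambda A: np.linalg.norm(A, "nuc"),  # sum of singular values
    "frobenius": lambda A: np.linalg.norm(A, "fro"),
}
for name, norm in norms.items():
    assert norm(L_after) <= norm(L_before) + 1e-12, name
```

The nuclear-norm case is exactly the trace identity from the proof of Theorem 5.2.1, since a Laplacian's singular values coincide with its eigenvalues.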

It is well known from the singular value decomposition that the nonzero eigenvalues of L†(n) are given by 1/λi(n), with the same corresponding eigenvectors as L(n).

Therefore, Theorem 5.2.1 implies that the nonzero eigenvalues of L†(n) are monotonically increasing. It is tempting to conclude from this fact that vTL†(n)v is monotonically increasing for a fixed v ∈ Rn, but the situation becomes tricky once we notice that the eigenvectors of L†(n) also evolve with n. Fortunately, we can still prove such monotonicity with careful algebra.

Proposition 5.2.5. For any v ∈ Rn, the function V(n) := vTL†(n)v is increasing in the stage index n.

Proof. Without loss of generality, let us assume there is only a single edge e = (i, j) tripped at stage n. The general case follows by tripping the lines one by one.

Under this assumption, by direct computation we have

L(n+1) = L(n) − BeCeCeT,

where Ce is the column of C corresponding to e. It is shown in [6] that this rank-one perturbation translates in its Moore-Penrose pseudoinverse to the equation

L†(n+1) = L†(n) + (1/(Xij − Rij)) L†(n)CeCeTL†(n), (5.3)

where Rij is the effective reactance between buses i and j defined in Section 2.4.4. Recall that by Corollary 2.4.6 we always have Xij − Rij > 0 for directly connected i and j (as long as the network is still connected after removing e); we thus see that the second term in (5.3) is positive semidefinite. The monotonicity of V(n) then follows.
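The rank-one pseudoinverse update (5.3) can be checked numerically. The sketch below uses an invented 4-bus example and numpy's pinv, computing Rij as CeᵀL†Ce (our reading of the effective reactance from Section 2.4.4); none of the numbers come from the thesis:

```python
import numpy as np

def laplacian(n, edges):
    """Weighted graph Laplacian built from a dict {(i, j): weight}."""
    L = np.zeros((n, n))
    for (i, j), w in edges.items():
        L[i, i] += w
        L[j, j] += w
        L[i, j] -= w
        L[j, i] -= w
    return L

# The network stays connected after removing e = (0, 2): the 4-cycle remains.
edges = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 0.5, (3, 0): 1.5, (0, 2): 1.0}
e, Be = (0, 2), 1.0
X_ij = 1.0 / Be                      # line reactance of the tripped line

L = laplacian(4, edges)
L_new = laplacian(4, {k: w for k, w in edges.items() if k != e})

Ld = np.linalg.pinv(L)               # L†(n)
Ce = np.zeros(4)
Ce[e[0]], Ce[e[1]] = 1.0, -1.0       # incidence column of edge e
R_ij = Ce @ Ld @ Ce                  # effective reactance between buses 0 and 2
assert X_ij - R_ij > 0               # network still connected after the trip

# Rank-one pseudoinverse update, equation (5.3):
Ld_new = Ld + np.outer(Ld @ Ce, Ce @ Ld) / (X_ij - R_ij)
assert np.allclose(Ld_new, np.linalg.pinv(L_new))

# The added term is positive semidefinite, so v^T L† v can only increase:
v = np.array([1.0, -2.0, 0.5, 0.5])
assert v @ Ld_new @ v >= v @ Ld @ v - 1e-12
```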

The network tension [41] at stage n is defined to be H(n) = f(n)TX(n)f(n), which measures the aggregate load of the network and is shown in [41] to be an increasing function of n. We now show this is a special case of our result.

Corollary 5.2.6. H(n) is an increasing function of n.

Proof. We can calculate that (for notational simplicity, we drop the stage index n)

fTX f = pTL†CBXBCTL†p = pTL†LL†p = pTL†p.

By Proposition 5.2.5 we then know that H(n) is monotonically increasing.

Equation (5.3) not only shows the monotonicity of H(n), but also implies that the increment of H(n) at each stage n is inversely proportional to the reactance reduction Xij − Rij incurred by removing (i, j) from the network at stage n.