4.5 Structured pseudospectra of structured matrix polynomials
Proof: For the spectral norm, by Theorem 4.3.1, we have η^S(λ, x, P) = η(λ, x, P) for all x. Consequently, we have η^S(λ, P) = η(λ, P), and hence the result follows.
For the Frobenius norm, the result follows from Theorem 4.3.4 when P is T-skew-symmetric. So suppose that P is T-symmetric. Then P(λ) ∈ C^{n×n} is symmetric. Consider the Takagi factorization P(λ) = UΣU^T, where U is unitary and Σ is a diagonal matrix containing the singular values of P(λ) in descending order. Set σ := Σ(n, n) and u := U(:, n). Then we have P(λ)ū = σu. Now define

ΔA_j := −(λ̄^j σ / ‖Λ_m‖₂²) uu^T

and consider the polynomial ΔP(z) = Σ_{j=0}^m z^j ΔA_j. Then ΔP is T-symmetric and P(λ)ū + ΔP(λ)ū = 0. Notice that, for both |||·||| ≡ |||·|||₂ and |||·||| ≡ |||·|||_F on C^{n×n}, we have

η^S(λ, P) ≤ |||ΔP||| = σ/‖Λ_m‖₂ = η(λ, P),

and hence σ_ε(P) = σ^S_ε(P). This completes the proof. ∎
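The Takagi-based construction in this proof can be checked numerically. The sketch below is illustrative only: the SVD-based `takagi` helper assumes distinct singular values (the generic case), and all names are our own. It builds a random T-symmetric polynomial, forms the rank-one perturbations ΔA_j, and verifies that λ becomes an exact eigenvalue of P + ΔP.

```python
import numpy as np

def takagi(A):
    """Takagi factorization A = U @ diag(s) @ U.T of a complex symmetric
    matrix, assuming distinct singular values (the generic case)."""
    W, s, Vh = np.linalg.svd(A)
    # For symmetric A, conj(V) = W @ D with D diagonal unitary;
    # absorbing D^(1/2) into W yields the Takagi factor.
    d = np.diag(W.conj().T @ Vh.T)
    return W @ np.diag(np.sqrt(d)), s

rng = np.random.default_rng(1)
n, m = 5, 2
lam = 0.8 + 0.3j                      # approximate eigenvalue (illustrative)

A = []                                # T-symmetric polynomial coefficients
for _ in range(m + 1):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A.append((M + M.T) / 2)           # complex symmetric A_j

Plam = sum(lam**j * A[j] for j in range(m + 1))
U, s = takagi(Plam)
sigma, u = s[-1], U[:, -1]            # smallest singular value, Takagi vector
assert np.allclose(Plam @ u.conj(), sigma * u)

Lam2 = sum(abs(lam) ** (2 * j) for j in range(m + 1))  # ||Lambda_m||_2^2
dA = [-(np.conj(lam) ** j) * sigma * np.outer(u, u) / Lam2
      for j in range(m + 1)]          # symmetric rank-one perturbations

# lam is now an exact eigenvalue of P + dP: (P + dP)(lam) conj(u) = 0
resid = sum(lam**j * (A[j] + dA[j]) for j in range(m + 1)) @ u.conj()
print(np.linalg.norm(resid))          # tiny (machine precision)
```

Since each ΔA_j is a rank-one symmetric matrix, its spectral and Frobenius norms coincide, which is why the same bound σ/‖Λ_m‖₂ is attained for both norms.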
When P is T-symmetric, the above proof shows how to construct a T-symmetric polynomial ΔP such that λ ∈ Λ_m(P + ΔP) and |||ΔP||| = η^S(λ, P). When P is T-skew-symmetric, using the Takagi-type factorization of the complex skew-symmetric matrix P(λ), one can construct a T-skew-symmetric polynomial ΔP such that λ ∈ Λ_m(P + ΔP) and |||ΔP||| = η^S(λ, P). Indeed, consider the factorization

P(λ) = U diag(d_1, ..., d_k) U^T, where U is unitary, d_j := [0, s_j; −s_j, 0],

s_j ∈ C is nonzero, and the |s_j| are the singular values of P(λ). Here the blocks d_j appear in descending order of magnitude of |s_j|. Note that P(λ)Ū = U diag(d_1, ..., d_k). Let u := U(:, n−1:n). Then P(λ)ū = u d_k = u d_k u^T ū. Now define

ΔA_j := −(λ̄^j / ‖Λ_m‖₂²) u d_k u^T

and consider the polynomial ΔP(z) = Σ_{j=0}^m z^j ΔA_j. Then ΔP is T-skew-symmetric and P(λ)ū + ΔP(λ)ū = 0. For the spectral norm on C^{n×n}, we have

η^S(λ, P) = |||ΔP||| = σ_min(P(λ))/‖Λ_m‖₂ = η(λ, P),

and for the Frobenius norm on C^{n×n}, we have

η^S(λ, P) = |||ΔP||| = √2 σ_min(P(λ))/‖Λ_m‖₂ = √2 η(λ, P).
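The √2 gap between the two norms can be seen at the matrix level. The sketch below uses hypothetical data: it synthesizes the block factorization directly (rather than computing it from a given polynomial) and checks that the perturbation E = −u d_k u^T is skew-symmetric, annihilates the targeted singular pair, and has spectral norm σ_min but Frobenius norm √2 σ_min.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6  # n = 2k: the skew-symmetric canonical form has k 2x2 blocks

# Synthesize P(lam) = U diag(d_1, ..., d_k) U^T with known factors
Q, _ = np.linalg.qr(rng.standard_normal((n, n))
                    + 1j * rng.standard_normal((n, n)))
s = np.array([3.0 + 1j, 2.0, 0.5j])        # |s_1| >= |s_2| >= |s_3|
blocks = [np.array([[0, sj], [-sj, 0]]) for sj in s]
D = np.zeros((n, n), dtype=complex)
for j, b in enumerate(blocks):
    D[2*j:2*j+2, 2*j:2*j+2] = b
Plam = Q @ D @ Q.T                          # complex skew-symmetric
assert np.allclose(Plam, -Plam.T)

u = Q[:, -2:]                               # last two columns of U
dk = blocks[-1]                             # block with smallest |s_j|
assert np.allclose(Plam @ u.conj(), u @ dk) # P(lam) conj(u) = u d_k

E = -u @ dk @ u.T                           # lam-independent part of dP(lam)
assert np.allclose(E, -E.T)                 # perturbation stays skew-symmetric
assert np.allclose((Plam + E) @ u.conj(), 0)

smin = abs(s[-1])                           # sigma_min(P(lam))
print(np.linalg.norm(E, 2), smin)                    # spectral norm
print(np.linalg.norm(E, 'fro'), np.sqrt(2) * smin)   # Frobenius norm
```

Because E acts through a rank-two block whose two singular values are both |s_k|, its Frobenius norm picks up the factor √2 that appears in the displayed formula.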
We denote the unit circle in C by T, that is, T := {z ∈ C : |z| = 1}. Then for T-even or T-odd polynomials we have the following result.
Theorem 4.5.2. Let S ∈ {T-even, T-odd}, let P ∈ S, and let m be odd. Let λ ∈ T. Then for the Frobenius norm on C^{n×n}, we have η^S(λ, P) = √2 η(λ, P) and σ^S_ε(P) ∩ T = σ_{ε/√2}(P) ∩ T.
Proof: Let λ ∈ T. Then by Theorems 4.3.6 and 4.3.8, we have

η^S(λ, x, P) = √2 ‖P(λ)x‖₂ / ‖Λ_m‖₂

for all x such that ‖x‖₂ = 1. Hence, taking the minimum over ‖x‖₂ = 1, we obtain the desired results. ∎
Theorem 4.5.3. Let S ∈ {H-Hermitian, H-skew-Hermitian} and P ∈ S. Let λ ∈ R. Then for the spectral and the Frobenius norms on C^{n×n}, we have η^S(λ, P) = η(λ, P) and hence σ^S_ε(P) ∩ R = σ_ε(P) ∩ R.
Proof: Note that P(λ) is either Hermitian or skew-Hermitian. Let (µ, u) be an eigenpair of the matrix P(λ) such that |µ| = σ_min(P(λ)) and u^H u = 1. Then P(λ)u = µu. Define

ΔA_j := −(λ^j µ / ‖Λ_m‖₂²) uu^H

and consider the polynomial ΔP(z) = Σ_{j=0}^m z^j ΔA_j. Then ΔP ∈ S and λ ∈ Λ_m(P + ΔP). Further, for the spectral and the Frobenius norms, we have

|||ΔP||| = σ_min(P(λ))/‖Λ_m‖₂.

Hence the result follows. ∎
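For real λ this construction is easy to verify numerically. The following sketch (illustrative names, H-Hermitian case) builds Hermitian coefficients, forms the perturbations ΔA_j, and confirms both that λ ∈ Λ_m(P + ΔP) and that |||ΔP||| = σ_min(P(λ))/‖Λ_m‖₂.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 4, 2
lam = 0.7                                  # real approximate eigenvalue

A = []                                     # Hermitian coefficients A_0..A_m
for _ in range(m + 1):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A.append((M + M.conj().T) / 2)

Plam = sum(lam**j * A[j] for j in range(m + 1))  # Hermitian, since lam is real

w, V = np.linalg.eigh(Plam)
k = np.argmin(np.abs(w))
mu, u = w[k], V[:, k]                      # eigenpair with |mu| = sigma_min

Lam2 = sum(lam ** (2 * j) for j in range(m + 1))  # ||Lambda_m||_2^2
dA = [-(lam**j) * mu * np.outer(u, u.conj()) / Lam2 for j in range(m + 1)]

# dP is again Hermitian-structured and makes lam an exact eigenvalue
assert all(np.allclose(d, d.conj().T) for d in dA)
resid = sum(lam**j * (A[j] + dA[j]) for j in range(m + 1)) @ u
assert np.allclose(resid, 0)

# |||dP||| = sigma_min(P(lam)) / ||Lambda_m||_2 for both norms,
# since each dA_j is rank one (spectral and Frobenius norms coincide)
smin = np.linalg.svd(Plam, compute_uv=False)[-1]
norm_dP = np.sqrt(sum(np.linalg.norm(d, 'fro') ** 2 for d in dA))
assert np.isclose(norm_dP, smin / np.sqrt(Lam2))
```

The same script with skew-Hermitian coefficients (µ purely imaginary) illustrates the H-skew-Hermitian case of the theorem.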
Theorem 4.5.4. Let S ∈ {H-even, H-odd} and P ∈ S. Let λ ∈ iR. Then for the spectral and the Frobenius norms on C^{n×n}, we have η^S(λ, P) = η(λ, P) and hence σ^S_ε(P) ∩ iR = σ_ε(P) ∩ iR.
Proof: Note that for λ ∈ iR, the matrix P(λ) is again either Hermitian or skew-Hermitian. Hence the result follows from the proof of Theorem 4.5.3. ∎
We mention that the above results can be easily extended to general structured polynomials whose coefficient matrices are elements of Jordan and/or Lie algebras.
Chapter 5
Backward errors and linearizations for palindromic matrix polynomials
This chapter is devoted to the backward perturbation analysis of palindromic and anti-palindromic polynomials. We derive the structured backward error of an approximate eigenpair of these polynomials and characterize the minimal structured perturbations that achieve it. Following the approach employed for structured polynomials in Chapter 4, we show that there always exists a "good" palindromic/anti-palindromic linearization of a palindromic/anti-palindromic polynomial.
5.1 Introduction
In this chapter we undertake a detailed backward perturbation analysis of palindromic and anti-palindromic polynomials. These polynomials arise mainly in the study of rail traffic noise caused by high-speed trains; see [44, 64, 68, 82]. A matrix polynomial P ∈ P_m(C^{n×n}) is called a palindromic matrix polynomial if P∗(z) = z^m P(1/z) for all z ∈ C \ {0}, and an anti-palindromic polynomial if P∗(z) = z^m P(−1/z) for all z ∈ C \ {0}, where ∗ denotes the transpose or the conjugate transpose of a matrix and

P∗(z) = Σ_{j=0}^m z^j A_j^∗ whenever P(z) = Σ_{j=0}^m z^j A_j.
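For a concrete instance of this definition, take ∗ to be the transpose and m = 2: the identity P∗(z) = z^m P(1/z) then forces the coefficient symmetry A_j^T = A_{m−j}. A minimal NumPy check (illustrative example, not from the text):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 3, 2

# T-palindromic example: coefficients must satisfy A_j^T = A_{m-j}
A0 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A1 = rng.standard_normal((n, n))
A1 = A1 + A1.T                          # middle coefficient symmetric
A = [A0, A1, A0.T]

P = lambda z: sum(z**j * A[j] for j in range(m + 1))
Pstar = lambda z: sum(z**j * A[j].T for j in range(m + 1))

z = 0.4 - 1.3j                          # any nonzero test point
assert np.allclose(Pstar(z), z**m * P(1 / z))   # the palindromic identity
```

This coefficient symmetry is what underlies the spectral symmetry of palindromic polynomials established in [64, 68].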
In this chapter we consider only regular matrix polynomials. We denote the space of regular palindromic or anti-palindromic polynomials by S. It is shown in [64, 68] that the eigenvalues of palindromic polynomials possess a certain spectral symmetry.
It is well known that a linearization is used to solve a given polynomial eigenvalue problem. As discussed in Chapter 4, an arbitrary linearization destroys the symmetry in the computed eigenvalues. Therefore, the first step towards solving a palindromic eigenvalue problem is to convert the given polynomial P ∈ S into a linear polynomial L, called a structured linearization, which has the same eigensymmetry as P. It is shown in [68] that a palindromic or anti-palindromic polynomial has infinitely many palindromic and anti-palindromic linearizations which preserve the eigensymmetry.
This gives rise to the problem of choosing a "good" linearization among them. Aiming to find a "good" structured linearization, we follow a procedure similar to that discussed in Chapter 4. Indeed, given an approximate eigenpair, we derive an explicit expression for the structured backward error under structured perturbations of P ∈ S. Note that the structured backward error is always greater than or equal to the unstructured backward error. We show that the structured backward error is bounded above by a scalar multiple of the unstructured backward error for some approximate eigenpairs. Besides, for each structure, there are certain approximate eigenpairs for which the two are equal.
Further, using the expression for the structured backward error of an approximate eigenpair of a palindromic polynomial P, we obtain the structured backward error of the corresponding approximate eigenpair of its structured linearizations. We then identify a "good" structured linearization L of P that minimizes η^S(λ, Λ_{m−1} ⊗ x, L)/η(λ, x, P).
It is evident that structure-preserving algorithms are required to reproduce the symmetry in the computed eigenvalues. Structure-preserving algorithms for computing eigenpairs of structured matrix polynomials have been proposed in the literature; see [22, 44, 66]. We mention that the computable expressions for the structured backward error obtained in this chapter have an important role to play in analyzing the backward stability of structure-preserving algorithms.
Finally, we define the structured backward error of an approximate eigenvalue of a structured matrix polynomial and apply it to establish a partial equality between the structured and unstructured pseudospectra of H-palindromic/H-anti-palindromic polynomials.
The chapter is organized as follows. In Section 5.2, we first discuss the eigensymmetry and the backward error of an approximate eigenpair of palindromic polynomials. In Section 5.3, we obtain expressions for the structured backward error of an approximate eigenpair of palindromic polynomials. In Section 5.4, we describe the palindromic linearizations of structured polynomials and determine a "good" structured linearization.