We describe the recovery of eigenvectors, minimal bases, and minimal indices of rational matrices from those of GFPRs and show that the recovery is operation-free. We also describe the recovery of eigenvectors, minimal bases, and minimal indices of rational matrices from those of strong Rosenbrock linearizations belonging to affine spaces.
Preliminaries
We further describe the recovery of eigenvectors, minimal bases, and minimal indices of rational matrices from those of FPs, GFPs, and FPRs, and show that the recovery is operation-free. We also describe the recovery of eigenvectors, minimal bases, and minimal indices of a rational matrix from those of Rosenbrock strong linearizations.
Polynomial and rational matrices
System matrix associated with a rational matrix
A realization of G(λ) is said to be a minimal realization if the size of the pencil λE − A is the smallest among all realizations of G(λ), see [40]. Unless stated otherwise, throughout the thesis we consider only the minimal realization (1.2) of the rational matrix G(λ).
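As a quick illustration (a minimal sketch, not taken from the thesis; the matrices P_coeffs, E, A, B, C below are hypothetical placeholders), the following code evaluates a rational matrix given through a realization G(λ) = P(λ) + C(λE − A)^{-1}B at a sample point:

```python
import numpy as np

def eval_realization(P_coeffs, C, E, A, B, lam):
    """Evaluate G(lam) = P(lam) + C (lam*E - A)^{-1} B for a rational matrix
    given by its polynomial part P and the realization (E, A, B, C)."""
    # Polynomial part P(lam) = sum_i lam^i * P_coeffs[i]
    P = sum((lam ** i) * Pi for i, Pi in enumerate(P_coeffs))
    # Strictly proper part C (lam*E - A)^{-1} B
    return P + C @ np.linalg.solve(lam * E - A, B)

# Hypothetical 2x2 example with a one-dimensional state space
P_coeffs = [np.eye(2), np.zeros((2, 2)), np.eye(2)]      # P(lam) = lam^2 I + I
E, A = np.eye(1), np.array([[0.5]])
B, C = np.array([[1.0, 0.0]]), np.array([[1.0], [0.0]])
print(eval_realization(P_coeffs, C, E, A, B, lam=2.0))
```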
Linearizations of matrix polynomials
Two sub-permutations σ and τ of N are said to be disjoint if σ ∩ τ = ∅, where the intersection is that of the underlying subsets of the sub-permutations. If ω := (σ, τ) is a permutation of N, then the sub-permutations σ and τ are said to form a partition of ω.
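To make the definition concrete, here is a minimal sketch (with hypothetical index tuples) that checks whether two sub-permutations of N = {0, ..., n−1} are disjoint and whether ω = (σ, τ) is a permutation of N:

```python
def are_disjoint(sigma, tau):
    """Sub-permutations are disjoint if their underlying index sets do not intersect."""
    return set(sigma).isdisjoint(set(tau))

def is_partition_of(sigma, tau, n):
    """(sigma, tau) is a permutation of {0, ..., n-1} iff the tuples are disjoint
    and together contain every index exactly once."""
    return are_disjoint(sigma, tau) and sorted(sigma + tau) == list(range(n))

sigma, tau = (2, 0), (3, 1)            # hypothetical sub-permutations of {0, 1, 2, 3}
print(are_disjoint(sigma, tau))        # True
print(is_partition_of(sigma, tau, 4))  # True: omega = (sigma, tau) is a permutation
```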
Recovery of eigenvectors and minimal bases from FPs and GFPs
Then β is said to be in column standard form if β+h is in column standard form. Then β is said to be in series standard form if β+h is in series standard form.
Recovery of eigenvectors and minimal bases
Let cL := c(α) and iL := i(α) be the total numbers of consecutions and inversions of the permutation α of {0 : m−1}. It turns out that Theorem 2.1.18 is not valid for GFPRs, which means that it cannot guarantee the recovery of eigenvectors and minimal bases of P(λ) from those of type-1 GFPRs.
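For orientation, the sketch below counts c(α) and i(α) when the permutation α of {0 : m−1} is given as the ordered tuple in which the indices appear; the convention assumed here (standard in the Fiedler-pencil literature) is that α has a consecution at k if k occurs before k+1 in α, and an inversion at k otherwise:

```python
def consecutions_inversions(alpha):
    """Count consecutions c(alpha) and inversions i(alpha) of a permutation of
    {0, ..., m-1} given as the ordered tuple in which the indices appear."""
    pos = {k: p for p, k in enumerate(alpha)}   # position of each index in alpha
    c = sum(1 for k in range(len(alpha) - 1) if pos[k] < pos[k + 1])  # consecutions
    i = sum(1 for k in range(len(alpha) - 1) if pos[k] > pos[k + 1])  # inversions
    return c, i

print(consecutions_inversions((0, 1, 2, 3)))  # (3, 0): all consecutions
print(consecutions_inversions((3, 2, 1, 0)))  # (0, 3): all inversions
```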
Eigenvalue at infinity and recovery of eigenvectors
We need the following result to derive the recovery of eigenvectors of P(λ) corresponding to an eigenvalue at ∞ from those of the GFPRs of P(λ). We are now ready to describe the recovery of eigenvectors of P(λ) corresponding to an eigenvalue at ∞ from those of the GFPRs of P(λ).
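As background, recall the standard fact (not the thesis's recovery formula itself): ∞ is an eigenvalue of a regular P(λ) = Σ_{i=0}^{m} λ^i Ai precisely when 0 is an eigenvalue of the reversal rev P(λ) := λ^m P(1/λ), and the corresponding right eigenvectors are the nonzero null vectors of the leading coefficient Am = rev P(0). A minimal numerical sketch with hypothetical coefficients:

```python
import numpy as np

# Hypothetical quadratic P(lam) = A0 + lam*A1 + lam^2*A2 with singular leading coefficient
A0, A1 = np.eye(2), np.ones((2, 2))
A2 = np.array([[1.0, 0.0], [0.0, 0.0]])   # singular, so P has an eigenvalue at infinity

# Right eigenvectors of P at infinity = nonzero null vectors of rev P(0) = A2
_, s, Vt = np.linalg.svd(A2)
rank = np.sum(s > 1e-12)
null_vectors = Vt[rank:].T                # columns span the null space of A2
print(null_vectors)
```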
Recovery of eigenvectors and minimal bases from structured linearizations
Skew-symmetric linearizations
Then, up to multiplication by −1, there exists a unique quasi-identity matrix S such that SL(λ) is skew-symmetric. We now describe the recovery of minimal bases of P(λ) from those of skew-symmetric linearizations of P(λ).
Even and odd linearizations
Palindromic linearizations
Since S and R are nonsingular, the left (respectively, right) minimal indices of SRL(λ) and L(λ) are the same. Therefore, the recovery formulas for eigenvectors, minimal bases, and minimal indices of a T-anti-palindromic P(λ) from those of T-anti-palindromic linearizations obtained from FPRs are the same as those for a T-palindromic P(λ) and T-palindromic linearizations obtained from FPRs.
Block structure of eigenvectors of GFPRs
As the example above illustrates, determining the block structure of eigenvectors of GFPRs is a difficult problem. Even the block structure of eigenvectors of type-1 FPRs obtained in [14] does not hold for type-1 GFPRs.
Algorithms for FPRs and GFPRs
Determining the block structure of GFPR eigenvectors under appropriate assumptions and choosing optimal (in terms of conditioning and backward errors) GFPRs are interesting open problems. EGFPRs share the characteristic properties of Fiedler pencils: the construction of an EGFPR is operation-free, and the recovery of eigenvectors, minimal bases, and minimal indices of P(λ) from those of an EGFPR is also operation-free. Note that for an index tuple α and a matrix assignment X of α, Mα(X) is operation-free if and only if MαP is operation-free, see Lemma 2.1.7 and Remark 2.1.8.
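To illustrate what operation-free means in the simplest matrix-polynomial case, the sketch below assembles the classical elementary Fiedler factors Mi of P(λ) = Σ λ^i Ai and a Fiedler pencil λMm − Mσ: every block of the result is ±Ai, 0, or I, so no arithmetic on the data of P(λ) is required. This is only the classical Fiedler construction; the EGFPRs studied here additionally carry matrix assignments, which are omitted from this sketch.

```python
import numpy as np

def fiedler_factor(i, A, n, m):
    """Elementary Fiedler factor M_i of P(lam) = sum_{j=0}^{m} lam^j A[j] (n x n blocks)."""
    M = np.eye(m * n)
    if i == 0:
        M[(m - 1) * n:, (m - 1) * n:] = -A[0]
    elif i == m:
        M[:n, :n] = A[m]
    else:  # 1 <= i <= m-1: the 2x2 block [[-A_i, I], [I, 0]] in block rows/cols m-i-1, m-i
        r = (m - i - 1) * n
        M[r:r + 2 * n, r:r + 2 * n] = np.block([[-A[i], np.eye(n)],
                                                [np.eye(n), np.zeros((n, n))]])
    return M

def fiedler_pencil(A, order):
    """Return (L0, L1) with L(lam) = lam*L1 + L0 = lam*M_m - M_{order[0]} ... M_{order[-1]}."""
    n, m = A[0].shape[0], len(A) - 1
    Msigma = np.eye(m * n)
    for i in order:                     # multiply the factors in the prescribed order
        Msigma = Msigma @ fiedler_factor(i, A, n, m)
    return -Msigma, fiedler_factor(m, A, n, m)

# Hypothetical quadratic example; order (1, 0) reproduces a companion-type pencil
A = [np.eye(2), 2 * np.eye(2), 3 * np.eye(2)]     # A0, A1, A2
L0, L1 = fiedler_pencil(A, order=(1, 0))
```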
If all matrix assignments
Recovery of minimal bases and minimal indices
The following result shows that, under some general nonsingularity conditions, EGFPRs are strong linearizations of P(λ). We now describe the recovery of minimal bases and minimal indices of a singular P(λ) from those of the EGFPRs of P(λ). Since Xj and Yj, j = 1, 2, are nonsingular matrix assignments, A and B are nonsingular constant matrices.
If L(λ) is strictly equivalent to T(λ), then the left (respectively, right) minimal indices of L(λ) and T(λ) are the same.
Recovery of eigenvectors
We are now ready to describe the recovery of eigenvectors of a regular P(λ) from those of the EGFPRs of P(λ). The desired result for the recovery of right eigenvectors now follows from Theorem 1.2.26 and Lemmas 3.3.4 and 3.3.5. The desired result for the recovery of left eigenvectors now follows from Theorem 1.2.26 and Lemmas 3.3.4 and 3.3.6.
Next, we consider an EGFPR that is not operation-free (since −1 and −0 simultaneously belong to τ), but for which the recovery of eigenvectors is operation-free.
Eigenvalue at infinity and recovery of eigenvectors
We now describe the recovery of eigenvectors of P(λ) corresponding to the eigenvalue at ∞ from those of the EGFPRs of P(λ). The proof of part (b) is similar. (c) Next, we prove the results for the left eigenspace of P(λ) at ∞. We now illustrate our recovery rule for eigenvectors of P(λ) corresponding to an eigenvalue at ∞ from those of the EGFPRs of P(λ) by considering an example.
Low bandwidth banded EGFPRs
Note that L(λ) is a block penta-diagonal pencil if and only if both L0 and L1 are block penta-diagonal matrices. Note that to prove that L0 is a block penta-diagonal matrix, it is enough to show that. We show that the family of EGFPRs does not have such a shortcoming and allows us to construct symmetric block tridiagonal and symmetric block penta-diagonal EGFPRs when P(λ) is symmetric.
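A minimal sketch of the block-bandwidth test used implicitly above (the block size n is a hypothetical parameter): a block matrix is block penta-diagonal when every n × n block (i, j) with |i − j| > 2 vanishes, so λL1 + L0 is block penta-diagonal precisely when both L0 and L1 pass the test.

```python
import numpy as np

def is_block_banded(M, n, bandwidth):
    """Check that all n x n blocks (i, j) of M with |i - j| > bandwidth are zero.
    bandwidth = 1 gives block tridiagonal, bandwidth = 2 gives block penta-diagonal."""
    m = M.shape[0] // n
    for i in range(m):
        for j in range(m):
            if abs(i - j) > bandwidth and np.any(M[i*n:(i+1)*n, j*n:(j+1)*n]):
                return False
    return True

# L(lam) = lam*L1 + L0 is block penta-diagonal iff both tests succeed:
# is_block_banded(L0, n, 2) and is_block_banded(L1, n, 2)
```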
We mention that no block penta-diagonal symmetric GFPR exists for symmetric P(λ) with degree m ≥ 8.
Hermitian EGFPRs preserving the sign characteristic
Recall that there does not exist any penta-diagonal Hermitian GFPR of P(λ) when P(λ) has degree m ≥ 8. This implies that there is no penta-diagonal Hermitian GFPR that preserves the sign characteristic of P(λ) when m ≥ 8. Consequently, there exists no block penta-diagonal Hermitian GFPR that preserves the sign characteristic of P(λ) when m ≥ 7.
Recall that there does not exist a block tridiagonal Hermitian GFPR of P(λ) if P(λ) has degree m ≥ 4.
Palindromic linearizations
Then SRT(λ) is a T-palindromic strong linearization of P(λ), where S is the quasi-identity matrix given by. Then SRL(λ) is a T-palindromic strong linearization of P(λ), where S is the quasi-identity matrix given by.
Then L(λ)RS is a T-palindromic strong linearization of P(λ), where S is the quasi-identity matrix given by.
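For context, recall the standard definitions assumed throughout, with rev P(λ) := λ^m P(1/λ):
\[
P(\lambda) \ \text{is $T$-palindromic} \iff \operatorname{rev} P(\lambda)^T = P(\lambda),
\qquad
P(\lambda) \ \text{is $T$-anti-palindromic} \iff \operatorname{rev} P(\lambda)^T = -P(\lambda).
\]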
Rosenbrock strong linearization
We also mention that by [4, Theorem 3.5] Rosenbrock's strong linearization corresponds to the strong linearization of rational matrices introduced in [5]. The finite and infinite pole-zero structure of L(λ) can be calculated from the Kronecker canonical form of L(λ), see [58, 57]. The pole-zero structure at infinity of a rational matrix is given by the Smith-McMillan form at infinity.
This shows that we can derive the pole-zero structure of L(λ) from G(λ) and vice versa.
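As a small worked illustration of reading off pole-zero structure from the Smith-McMillan form (a standard textbook example, not taken from the thesis), consider G(λ) = diag(1/λ, λ), which is already in Smith-McMillan form:
\[
G(\lambda) = \operatorname{diag}\!\left(\frac{\epsilon_1(\lambda)}{\psi_1(\lambda)},\, \frac{\epsilon_2(\lambda)}{\psi_2(\lambda)}\right)
= \operatorname{diag}\!\left(\frac{1}{\lambda},\, \frac{\lambda}{1}\right),
\qquad \epsilon_1 \mid \epsilon_2, \quad \psi_2 \mid \psi_1.
\]
Hence G(λ) has a single pole at λ = 0 (from ψ1 = λ) and a single zero at λ = 0 (from ε2 = λ), showing that a rational matrix may have a pole and a zero at the same point.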
Affine spaces of strong linearizations for rational matrices
Consequently, given a realization of G(λ), the pencils in L1(G) ∪ L2(G) are easily constructible from the data in G(λ). Furthermore, the affine maps A1 and A2 in Theorem 5.3.2 show that we can construct linearizations of G(λ) directly from linearizations of P(λ). We now show that almost all pencils in L1(G) and L2(G) are Rosenbrock strong linearizations of S(λ) and G(λ). For any pencil L(λ) ∈ L1(P) with right ansatz vector e1 and with full Z-rank, there exists a unimodular matrix polynomial U(λ) such that.
Finally, we conclude by (5.12) and Theorem 5.3.5 that T(λ) is a Rosenbrock strong linearization of G(λ). This completes the proof.
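For orientation, recall the defining ansatz identity behind L1(P), stated here in its standard form (the spaces L1(G) and L2(G) of this chapter extend it to the rational setting): a pencil L(λ) of size mn × mn belongs to L1(P) with right ansatz vector v if
\[
L(\lambda)\bigl(\Lambda_{m-1}(\lambda) \otimes I_n\bigr) = v \otimes P(\lambda),
\qquad
\Lambda_{m-1}(\lambda) := \begin{bmatrix} \lambda^{m-1} & \cdots & \lambda & 1 \end{bmatrix}^T,
\]
and the first companion form of P(λ) is the prototypical element, with v = e1.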
Symmetric linearization
It was shown in [25, Theorem 6.1] that none of the pencils in DL(P) is a linearization of P(λ) when P(λ) is singular and m ≥ 2. Therefore, for a double ansatz pencil in DL(G) to be a strong Rosenbrock linearization of G(λ), the polynomial P(λ) must necessarily be regular. In contrast, the Fiedler and generalized Fiedler pencils of G(λ) (regular or singular) constructed in [2, 4, 8] are shown to be Rosenbrock linearizations of S(λ). However, Fiedler and generalized Fiedler pencils cannot guarantee a structure-preserving linearization of G(λ) when G(λ) is structured.
Indeed, unlike in the case of a symmetric matrix polynomial [6, 15], a symmetric G(λ) does not have a symmetric generalized Fiedler pencil when deg(P) is even, see [4].
Recovery of minimal bases and minimal indices
The recovery of right minimal bases and right minimal indices of S(λ) from those of its linearizations in L1(G) is given in the following result. We have described the recovery of eigenvectors, minimal bases, and minimal indices of S(λ) from those of the Rosenbrock strong linearizations in L1(G) ∪ L2(G). We now describe the recovery of eigenvectors, minimal bases, and minimal indices of G(λ) from those of S(λ).
The following result describes the recovery of minimal bases and minimal indices of G(λ) from those of S(λ).
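These recovery statements rest on the standard Rosenbrock relation between G(λ) and its system matrix, recalled here under the usual sign convention (which may differ from the one used in the thesis):
\[
\mathcal{S}(\lambda) = \begin{bmatrix} \lambda E - A & B \\ -C & P(\lambda) \end{bmatrix},
\qquad
G(\lambda) = P(\lambda) + C(\lambda E - A)^{-1}B,
\]
so G(λ) is the Schur complement of λE − A in S(λ). In particular, if S(λ0)[x; y] = 0 and λ0 is not an eigenvalue of λE − A, then y ≠ 0, G(λ0)y = 0, and x = −(λ0E − A)^{-1}By, which is the mechanism behind extracting the bottom block of a null vector of S(λ0).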
Linearizations via non-monomial polynomial bases
Therefore, minimal bases and minimal indices of S(λ) and G(λ) can be recovered from the pencils in Le1(G) ∪ Le2(G) as follows. We now describe the recovery of minimal bases and minimal indices of G(λ) from the Fiedler pencils of G(λ). In view of (6.3), there are only two different types of blocks in Eσ(P). (b) For simplicity, we write Lσ(S) and Lrev(σ)(S^T) to denote the Fiedler pencils of S(λ) and S(λ)^T associated with the permutations σ and rev(σ), respectively.
We are now ready to describe the recovery of minimal bases and minimal indices of S(λ) from the Fiedler pencils of S(λ).
GFPs of rational matrices
The results for left minimal bases and left minimal indices follow from those for right minimal bases and right minimal indices in light of the following facts. We now describe the recovery of minimal bases and minimal indices of S(λ) from those of the PGF pencils of S(λ). (i) Right minimal bases and right minimal indices. Since iT = i(σ), the total number of inversions of σ, the desired results for right minimal bases and right minimal indices follow from Theorem 6.1.13.
Again, since iT = m − 1 = i(σ), the total number of inversions of σ, the desired results for right minimal bases and right minimal indices follow from Theorem 6.1.13.
FPRs of rational matrices
Structured rational matrices such as symmetric, skew-symmetric, Hamiltonian, skew-Hamiltonian, Hermitian, skew-Hermitian, para-Hermitian, and para-skew-Hermitian rational matrices appear in many applications. For structured rational matrices, it is desirable to construct structure-preserving linearizations so as to preserve the symmetry in the eigenvalues and poles of the rational matrices. The primary goal of this chapter is to construct structure-preserving strong Rosenbrock linearizations of structured (symmetric, skew-symmetric, Hamiltonian, skew-Hamiltonian, Hermitian, skew-Hermitian, para-Hermitian, and para-skew-Hermitian) rational matrices.
GFPRs of rational matrices
Fiedler-like pencils are Rosenbrock strong linearizations
First, we show that the FPs of G(λ) introduced in [2] (see Chapter 6) are Rosenbrock strong linearizations of G(λ). Recall that Lσ(λ) := λMS−m − MSσ is called a Fiedler pencil (FP) of G(λ), where σ is a permutation of {0 : m−1}. We now define the consecution-inversion structure sequence of a permutation, which we need to prove that a Fiedler pencil is a Rosenbrock strong linearization of G(λ). Recall that the block transpose of a block k × ℓ matrix H is the block ℓ × k matrix HB given by (HB)ij = Hji, see [26].
The following results will be useful to prove that the FPs of G(λ) are strong Rosenbrock linearizations of G(λ).
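A minimal sketch of the block transpose used above, assuming square n × n blocks (block_transpose is a hypothetical helper name): for a block k × ℓ matrix H, the block transpose HB is the block ℓ × k matrix with (HB)ij = Hji.

```python
import numpy as np

def block_transpose(H, n):
    """Block transpose of a block k x l matrix H with n x n blocks:
    (H^B)_{ij} = H_{ji} (whole blocks are moved, not transposed individually)."""
    k, l = H.shape[0] // n, H.shape[1] // n
    HB = np.zeros((l * n, k * n), dtype=H.dtype)
    for i in range(l):
        for j in range(k):
            HB[i*n:(i+1)*n, j*n:(j+1)*n] = H[j*n:(j+1)*n, i*n:(i+1)*n]
    return HB

# Example: for the block row H = [H11  H12], block_transpose(H, n) stacks [H11; H12]
```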
Structure-preserving strong linearizations
Symmetric GFPRs
Hamiltonian linearizations
Skew-Hamiltonian linearizations
Skew-symmetric linearizations