
CS6015: Linear Algebra and Random Processes
Course Instructor: Prashanth L.A.

Quiz - 3: Solutions

1. True or False?

(a) $T(u) = v$ for some (fixed) $v \neq 0$ is a linear transformation.

Solution: False, since $T(0) = v \neq 0$, whereas any linear transformation must satisfy $T(0) = 0$.

(b) Suppose $P_1$ and $P_2$ are projection matrices. Then

$$(P_1 - P_2)^2 + (I - P_1 - P_2)^2 = I.$$

Solution: True. Using $P_1^2 = P_1$ and $P_2^2 = P_2$, we get
$$(P_1 - P_2)^2 + (I - P_1 - P_2)^2 = (P_1^2 + P_2^2 - P_1 P_2 - P_2 P_1) + (I + P_1^2 + P_2^2 - 2P_1 - 2P_2 + P_1 P_2 + P_2 P_1)$$
$$= (P_1 + P_2) + (I + P_1 + P_2 - 2P_1 - 2P_2) = I.$$
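The identity can also be checked numerically. The following NumPy sketch (not part of the original solution; the two rank-one projections are arbitrary illustrative choices) verifies it for a pair of projection matrices in $\mathbb{R}^2$:

```python
import numpy as np

def line_projection(a):
    """Orthogonal projection matrix onto the line spanned by a: P = a a^T / (a^T a)."""
    a = np.asarray(a, dtype=float).reshape(-1, 1)
    return (a @ a.T) / float(a.T @ a)

P1 = line_projection([1.0, 0.0])   # projection onto the x-axis
P2 = line_projection([1.0, 1.0])   # projection onto the line y = x
I = np.eye(2)

lhs = (P1 - P2) @ (P1 - P2) + (I - P1 - P2) @ (I - P1 - P2)
print(np.allclose(lhs, I))         # True: (P1 - P2)^2 + (I - P1 - P2)^2 = I
```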

(c) Let $\{q_1, \ldots, q_n\}$ be an orthonormal set of vectors in $\mathbb{R}^n$ and $T$ be a linear transformation that satisfies
$$T(q_i)^T T(q_i) = q_i^T q_i, \quad i = 1, \ldots, n.$$
Then, the set $\{T(q_1), \ldots, T(q_n)\}$ is orthonormal.

Solution: False. In $\mathbb{R}^2$, notice that the standard basis $\{e_1, e_2\}$ is orthonormal as well. Define a linear transformation $T$ by $T(e_1) = e_2$ and $T(e_2) = e_2$. Then $T$ preserves the norm of each basis vector, yet $T(e_1)^T T(e_2) = e_2^T e_2 = 1 \neq 0$, and hence the set $\{T(e_1), T(e_2)\}$ is not orthonormal.

(d) If a linear transformation $T : \mathbb{R} \to \mathbb{R}$ satisfies $T(4) = 24$, then $T(x) = 6x$ for all $x \in \mathbb{R}$.

Solution: True. $T(4) = 4\,T(1) = 24$ implies $T(1) = 6$. For any $x \in \mathbb{R}$, $T(x) = T(x \cdot 1) = x\,T(1) = 6x$.

(e) If $v_1, \ldots, v_n$ are linearly independent vectors in $V$ and $T$ is a linear transformation from $V$ to $V$, then $T(v_1), \ldots, T(v_n)$ are linearly independent.

Solution: False. Consider $T(v_i) = v_i$ for $i = 1, \ldots, n-1$ and $T(v_n) = v_1 + \ldots + v_{n-1}$. As another (trivial) example, the transformation $T(v_i) = 0$ for $i = 1, \ldots, n$ works as well.

(f) If $T(v_1), \ldots, T(v_n)$ are linearly independent vectors in $V$, where $T$ is a linear transformation from $V$ to $V$, then $v_1, \ldots, v_n$ are linearly independent.

Solution: True. If $v_1, \ldots, v_n$ were linearly dependent, then $c_1 v_1 + \ldots + c_n v_n = 0$ for some scalars $c_i$, not all of them zero. Applying $T$ gives $c_1 T(v_1) + \ldots + c_n T(v_n) = 0$, contradicting the linear independence of $T(v_1), \ldots, T(v_n)$.

(g) Every orthonormal set of vectors in $\mathbb{R}^4$ must be a basis for $\mathbb{R}^4$.


Solution: False. Consider the set
$$S = \left\{ \begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix} \right\}.$$
It is orthonormal, but with only two vectors it cannot span $\mathbb{R}^4$.

2. Consider a subspace $S$ of $\mathbb{R}^4$ spanned by the following vectors:
$$u_1 = \begin{pmatrix} 1 \\ 0 \\ 1 \\ 0 \end{pmatrix}, \quad u_2 = \begin{pmatrix} 1 \\ 1 \\ 1 \\ 0 \end{pmatrix} \quad \text{and} \quad u_3 = \begin{pmatrix} 1 \\ 0 \\ 1 \\ 1 \end{pmatrix}.$$
Using the usual dot product on $\mathbb{R}^4$, do the following:

(a) Convert $\{u_1, u_2, u_3\}$ to an orthonormal basis for $S$.

Solution: Applying the Gram-Schmidt algorithm, one gets the following orthonormal basis:

$$q_1 = \begin{pmatrix} \tfrac{1}{\sqrt{2}} \\ 0 \\ \tfrac{1}{\sqrt{2}} \\ 0 \end{pmatrix}, \quad q_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix} \quad \text{and} \quad q_3 = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}.$$

(b) For $b = \begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix}$, find the "least-squares approximation of $b$" in $S$.

Solution: The least-squares approximation is the orthogonal projection of $b$ onto $S$:
$$(q_1^T b)\, q_1 + (q_2^T b)\, q_2 + (q_3^T b)\, q_3 = \begin{pmatrix} 2 \\ 2 \\ 2 \\ 4 \end{pmatrix}.$$

(c) Explain why the Gram-Schmidt algorithm fails when the input set of vectors is linearly dependent.

Solution: Let the input set be $\{v_1, \ldots, v_n\}$. Suppose that (w.l.o.g.) $v_k$ is a linear combination of $v_1, \ldots, v_{k-1}$, with the latter set linearly independent. After turning $v_1, \ldots, v_{k-1}$ into an orthonormal set $\{q_1, \ldots, q_{k-1}\}$, the Gram-Schmidt step for $v_k$ subtracts its projection onto $\mathrm{span}\{q_1, \ldots, q_{k-1}\}$. Since $v_k$ already lies in that span, this leaves the zero vector, and normalizing it (a division by zero) causes the algorithm to fail.
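The failure mode can be seen numerically (an illustrative sketch; $v_3$ below is deliberately chosen as $v_1 + v_2$):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0, 0.0])
v2 = np.array([1.0, 1.0, 1.0, 0.0])
v3 = v1 + v2                          # deliberately dependent on v1 and v2

q1 = v1 / np.linalg.norm(v1)
w2 = v2 - (q1 @ v2) * q1
q2 = w2 / np.linalg.norm(w2)

# Gram-Schmidt step for the dependent vector: the residual is (numerically) zero,
# so the normalization q3 = w3 / ||w3|| would divide by zero.
w3 = v3 - (q1 @ v3) * q1 - (q2 @ v3) * q2
print(np.linalg.norm(w3))             # ~0
```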
