(1)

3D Geometry for Computer Graphics

(2)

The plan today

Singular Value Decomposition

 Basic intuition

 Formal definition

(3)

Geometric analysis of linear transformations

 We want to know what a linear transformation A does

 We need some simple and “comprehensible” representation of the matrix of A

 Let’s look at what A does to some vectors

 Since A(αv) = αA(v), it’s enough to look at vectors v of unit length

(4)

The geometry of linear transformations

A linear (non-singular) transform A always takes hyper-spheres to hyper-ellipses.

(5)

The geometry of linear transformations

 Thus, one good way to understand what A does is to find which vectors are mapped to the “main axes” of the ellipsoid.

(6)

Geometric analysis of linear transformations

If we are lucky: A = V Λ V^T, with V orthogonal (true if A is symmetric).

The eigenvectors of A are the axes of the ellipse.

(7)

Symmetric matrix: eigen decomposition

 In this case A is just a scaling matrix. The eigen decomposition of A tells us which orthogonal axes it scales, and by how much:

$$A = \begin{bmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_n \\ | & | & & | \end{bmatrix} \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix} \begin{bmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_n \\ | & | & & | \end{bmatrix}^T$$
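A minimal NumPy check of this picture (added for illustration, not part of the original slides; the matrix values are arbitrary): a symmetric matrix is factored into orthogonal axes and per-axis scale factors.

```python
import numpy as np

# Hypothetical symmetric 2x2 matrix (values chosen arbitrarily for illustration).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's eigen solver for symmetric matrices:
# it returns real eigenvalues and orthonormal eigenvectors.
eigvals, V = np.linalg.eigh(A)
Lambda = np.diag(eigvals)

print(np.allclose(A, V @ Lambda @ V.T))   # True: A = V Lambda V^T
print(np.allclose(V.T @ V, np.eye(2)))    # True: V is orthogonal
print(eigvals)                            # scaling factors along the orthogonal axes
```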

(8)

General linear transformations: SVD

 In general A will also contain rotations, not just scales:

$$A = \begin{bmatrix} | & | & & | \\ u_1 & u_2 & \cdots & u_n \\ | & | & & | \end{bmatrix} \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_n \end{bmatrix} \begin{bmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_n \\ | & | & & | \end{bmatrix}^T$$

(9)

General linear transformations: SVD

$$A \begin{bmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_n \\ | & | & & | \end{bmatrix} = \begin{bmatrix} | & | & & | \\ u_1 & u_2 & \cdots & u_n \\ | & | & & | \end{bmatrix} \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_n \end{bmatrix}$$

$$AV = U\Sigma, \qquad \sigma_i \geq 0, \qquad A v_i = \sigma_i u_i$$

(10)

SVD more formally

 SVD exists for any matrix

 Formal definition:

 For square matrices A ∈ ℝ^{n×n}, there exist orthogonal matrices U, V ∈ ℝ^{n×n} and a diagonal matrix Σ, such that all the diagonal values σ_i of Σ are non-negative and

A = U Σ V^T

(11)

SVD more formally

 The diagonal values of Σ (σ_1, …, σ_n) are called the singular values. It is customary to sort them: σ_1 ≥ σ_2 ≥ … ≥ σ_n

 The columns of U (u_1, …, u_n) are called the left singular vectors. They are the axes of the ellipsoid.

 The columns of V (v_1, …, v_n) are called the right singular vectors. They are the preimages of the axes of the ellipsoid.

A = U Σ V^T
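A small numerical illustration (my addition, with an arbitrary example matrix), assuming NumPy: `np.linalg.svd` returns U, the sorted singular values, and V^T, and each right singular vector is mapped to σ_i u_i.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 1.0]])   # arbitrary square example

U, s, Vt = np.linalg.svd(A)        # returns U, the singular values, and V^T

print(s)                                       # already sorted: s[0] >= s[1] >= s[2] >= 0
print(np.allclose(A, U @ np.diag(s) @ Vt))     # A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(3)))         # U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))       # V is orthogonal
for i in range(3):
    # the i-th right singular vector is mapped to sigma_i * u_i
    print(np.allclose(A @ Vt[i], s[i] * U[:, i]))
```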

(12)

Reduced SVD

For rectangular matrices, we have two forms of SVD. The reduced SVD looks like this:

 The columns of U are orthonormal

 Cheaper form for computation and storage

A = U Σ V^T   [Figure: block shapes of the reduced SVD; for a tall m×n matrix A, U is m×n, Σ and V^T are n×n]
(13)

Full SVD

We can complete U to a full orthogonal matrix and pad Σ with zeros accordingly:

A = U Σ V^T   [Figure: block shapes of the full SVD; U is m×m, Σ is m×n with zero rows appended, V^T is n×n]
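The difference between the two forms can be seen directly in NumPy's `full_matrices` flag (illustrative sketch added here; the test matrix is arbitrary):

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((5, 3))  # arbitrary tall 5x3 matrix

# Reduced SVD: U has the same shape as A (5x3), Sigma is 3x3.
Ur, sr, Vtr = np.linalg.svd(A, full_matrices=False)
print(Ur.shape, sr.shape, Vtr.shape)                  # (5, 3) (3,) (3, 3)
print(np.allclose(A, Ur @ np.diag(sr) @ Vtr))

# Full SVD: U is completed to a 5x5 orthogonal matrix, Sigma is padded with zero rows.
Uf, sf, Vtf = np.linalg.svd(A, full_matrices=True)
Sigma = np.zeros((5, 3))
Sigma[:3, :3] = np.diag(sf)
print(Uf.shape, Sigma.shape, Vtf.shape)               # (5, 5) (5, 3) (3, 3)
print(np.allclose(A, Uf @ Sigma @ Vtf))
```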
(14)

Some history

SVD was discovered by the following people:

E. Beltrami (1835–1900)

M. Jordan (1838–1922)

J. Sylvester (1814–1897)

H. Weyl (1885–1955)

E. Schmidt

(15)

SVD is the “workhorse” of linear algebra

There are numerical algorithms to compute the SVD. Once you have it, you have many things:

 Matrix inverse → can solve square linear systems

 Numerical rank of a matrix

 Can solve least-squares systems

 PCA

(16)

Matrix inverse and solving linear systems

Matrix inverse:

$$A = U \Sigma V^T$$

$$A^{-1} = \left(U \Sigma V^T\right)^{-1} = \left(V^T\right)^{-1} \Sigma^{-1} U^{-1} = V \,\mathrm{diag}\!\left(\sigma_1^{-1}, \ldots, \sigma_n^{-1}\right) U^T$$

So, to solve Ax = b:

$$x = V \Sigma^{-1} U^T b$$
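A NumPy sketch of solving a system this way (added for illustration; the matrix and right-hand side are arbitrary):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # arbitrary non-singular matrix
b = np.array([1.0, 2.0, 3.0])

U, s, Vt = np.linalg.svd(A)

# x = V Sigma^{-1} U^T b: divide by the singular values instead of forming A^{-1}
x = Vt.T @ ((U.T @ b) / s)

print(np.allclose(A @ x, b))                   # True
print(np.allclose(x, np.linalg.solve(A, b)))   # matches a direct solver
```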

(17)

Matrix rank

The rank of A is the number of non-zero singular values.

A = U Σ V^T   [Figure: block diagram of an m×n matrix A whose Σ has only σ_1 and σ_2 non-zero]

(18)

Numerical rank

If there are very small singular values, then A is close to being singular. We can set a threshold t, so that

numeric_rank(A) = #{ i | σ_i > t }

If rank(A) < n then A is singular. It maps the entire space ℝ^n onto some subspace, like a plane (so A is not invertible).
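An illustrative sketch (added here, not from the slides) of a `numeric_rank` helper built exactly this way; the function name and default threshold are my own choices:

```python
import numpy as np

def numeric_rank(A, t=1e-10):
    """Number of singular values above the threshold t."""
    s = np.linalg.svd(A, compute_uv=False)
    return int(np.sum(s > t))

# Rank-deficient example: the third row is the sum of the first two,
# so the exact rank is 2 even though the matrix is 3x3.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

print(np.linalg.svd(A, compute_uv=False))  # the last singular value is ~0
print(numeric_rank(A))                     # 2
```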

(19)

PCA

 We wanted to find principal components


(20)

PCA

 Move the center of mass to the origin

$p_i' = p_i - m$

(21)

PCA

 Constructed the matrix X of the (centered) data points

 The principal axes are eigenvectors of S = XX^T

$$X = \begin{bmatrix} | & | & & | \\ p_1' & p_2' & \cdots & p_n' \\ | & | & & | \end{bmatrix}, \qquad S = XX^T = U \Lambda U^T$$

(22)

PCA

 We can compute the principal components by SVD of X:

$$X = U \Sigma V^T$$

$$XX^T = \left(U \Sigma V^T\right)\left(U \Sigma V^T\right)^T = U \Sigma V^T V \Sigma U^T = U \Sigma^2 U^T$$

 Thus, the left singular vectors of X are the principal components! We sort them by the size of the singular values of X.
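A compact NumPy sketch of PCA via SVD as described above (added for illustration; the synthetic data and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 2D points with anisotropic spread (arbitrary test data).
points = rng.standard_normal((100, 2)) @ np.array([[2.0, 1.0], [-0.5, 1.0]])

# Move the center of mass to the origin, then stack the points as columns of X.
m = points.mean(axis=0)
X = (points - m).T                       # shape (2, n)

# Left singular vectors of X = principal axes, sorted by singular value.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print(U)          # columns: principal components
print(s**2)       # proportional to the variance captured along each axis

# Same axes as the eigenvectors of the scatter matrix S = X X^T (up to sign):
evals, evecs = np.linalg.eigh(X @ X.T)           # ascending order
cosines = np.abs(np.sum(evecs[:, ::-1] * U, axis=0))
print(np.allclose(cosines, 1.0))                 # each axis matches up to sign
```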

(23)

Shape matching

We have two objects in correspondence

Want to find the rigid transformation that aligns them

(24)

Shape matching

When the objects are aligned, the lengths of the lines connecting corresponding points are small.

(25)

Shape matching – formalization

Align two point sets P = {p_1, …, p_n} and Q = {q_1, …, q_n}.

Find a translation vector t and rotation matrix R so that

$$\sum_{i=1}^{n} \left\| p_i - \left(R q_i + t\right) \right\|^2 \;\text{ is minimized}$$
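For concreteness, a small helper (my addition; the name `alignment_error` is hypothetical) that evaluates this objective for a given R and t:

```python
import numpy as np

def alignment_error(P, Q, R, t):
    """Sum over i of || p_i - (R q_i + t) ||^2.

    P, Q are (n, 3) arrays of corresponding points, R is a 3x3 rotation, t a 3-vector.
    """
    residuals = P - (Q @ R.T + t)          # row i is p_i - (R q_i + t)
    return float(np.sum(residuals ** 2))

# With R = I and t = 0 the error is just the sum of squared point-to-point distances.
P = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
Q = np.array([[0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
print(alignment_error(P, Q, np.eye(3), np.zeros(3)))   # 2.0
```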

(26)

Shape matching – solution

 It turns out we can solve for the translation and the rotation separately.

 Theorem: if (R, t) is the optimal transformation, then the points {p_i} and {Rq_i + t} have the same centers of mass:

$$\bar{p} = \frac{1}{n}\sum_{i=1}^{n} p_i, \qquad \bar{q} = \frac{1}{n}\sum_{i=1}^{n} q_i$$

$$\bar{p} = \frac{1}{n}\sum_{i=1}^{n} \left(R q_i + t\right) = R\bar{q} + t \;\;\Longrightarrow\;\; t = \bar{p} - R\bar{q}$$

(27)

Finding the rotation R

 To find the optimal R, we bring the centroids of both point sets to the origin:

$$p_i' = p_i - \bar{p}, \qquad q_i' = q_i - \bar{q}$$

 We want to find R that minimizes

$$\sum_{i=1}^{n} \left\| R q_i' - p_i' \right\|^2$$

(28)

Summary of rigid alignment:

 Translate the input points to the centroids: $p_i' = p_i - \bar{p}$, $q_i' = q_i - \bar{q}$

 Compute the “covariance matrix” $H = \sum_{i=1}^{n} q_i' p_i'^T$

 Compute the SVD of H: $H = U \Sigma V^T$

 The optimal rotation is $R = V U^T$

 The translation vector is $t = \bar{p} - R\bar{q}$
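Putting the recipe together, a NumPy sketch (added for illustration; the function name is mine, and the determinant check at the end is a common extra step that the slides do not discuss):

```python
import numpy as np

def rigid_align(P, Q):
    """Find (R, t) minimizing sum_i || p_i - (R q_i + t) ||^2, following the slide recipe.

    P, Q: (n, 3) arrays of corresponding points. Returns R (3x3) and t (3,).
    """
    p_bar = P.mean(axis=0)
    q_bar = Q.mean(axis=0)
    Pc = P - p_bar                       # p_i' = p_i - p_bar
    Qc = Q - q_bar                       # q_i' = q_i - q_bar

    H = Qc.T @ Pc                        # H = sum_i q_i' p_i'^T  (3x3)
    U, s, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T                       # R = V U^T

    # Common extra step (not in the slides): if det(R) = -1 the result is a
    # reflection; flipping the last row of V^T restores a proper rotation.
    if np.linalg.det(R) < 0:
        Vt[-1] *= -1
        R = Vt.T @ U.T

    t = p_bar - R @ q_bar                # t = p_bar - R q_bar
    return R, t

# Sanity check on synthetic data: rotate/translate Q and recover the transform.
rng = np.random.default_rng(2)
Q = rng.standard_normal((10, 3))
angle = 0.7
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
P = Q @ R_true.T + t_true

R, t = rigid_align(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))   # True True
```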

(29)

Complexity

Numerical SVD is an expensive operation

We always need to pay attention to the size of the matrices we decompose.

(30)
(31)

Finding the rotation R

$$\sum_{i=1}^{n} \left\| R q_i' - p_i' \right\|^2 = \sum_{i=1}^{n} \left( R q_i' - p_i' \right)^T \left( R q_i' - p_i' \right) = \sum_{i=1}^{n} \left( q_i'^T R^T R\, q_i' - p_i'^T R\, q_i' - q_i'^T R^T p_i' + p_i'^T p_i' \right)$$

Since $R^T R = I$ and $q_i'^T R^T p_i' = p_i'^T R\, q_i'$, this equals

$$\sum_{i=1}^{n} \left( q_i'^T q_i' - 2\, p_i'^T R\, q_i' + p_i'^T p_i' \right)$$

The terms $q_i'^T q_i'$ and $p_i'^T p_i'$ do not depend on R, so minimizing the sum amounts to maximizing $\sum_{i} p_i'^T R\, q_i'$.

(32)

Finding the rotation R

$$\min_R \sum_{i=1}^{n} \left\| R q_i' - p_i' \right\|^2 \;\;\Longleftrightarrow\;\; \max_R \sum_{i=1}^{n} p_i'^T R\, q_i'$$

(33)

Finding the rotation R

$$\sum_{i=1}^{n} p_i'^T R\, q_i' = \sum_{i=1}^{n} \operatorname{Trace}\!\left( R\, q_i' p_i'^T \right) = \operatorname{Trace}\!\left( R \sum_{i=1}^{n} q_i' p_i'^T \right) = \operatorname{Trace}(RH), \qquad H = \sum_{i=1}^{n} q_i' p_i'^T$$

(Each $q_i' p_i'^T$ is an outer product: 3×1 times 1×3 = 3×3.)

$$\operatorname{Trace}(A) = \sum_{i=1}^{n} a_{ii}$$
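This cyclic-trace step can be verified numerically (illustrative snippet added here; the random data is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
P = rng.standard_normal((10, 3))        # rows are p_i'
Q = rng.standard_normal((10, 3))        # rows are q_i'
R, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # some orthogonal matrix (the Q factor)

lhs = sum(P[i] @ R @ Q[i] for i in range(10))      # sum_i p_i'^T R q_i'
H = Q.T @ P                                        # H = sum_i q_i' p_i'^T
rhs = np.trace(R @ H)

print(np.allclose(lhs, rhs))   # True
```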

(34)

Finding the rotation R

 So, we want to find R that maximizes Trace(RH)

 Theorem: if M is symmetric positive definite (all eigenvalues of M are positive) and B is any orthogonal matrix, then Trace(M) ≥ Trace(BM)

 So, let’s find R so that RH is symmetric positive definite. Then we know for sure that Trace(RH) is maximal.

(35)

Finding the rotation R

 This is easy! Compute the SVD of H: $H = U \Sigma V^T$

 Define R: $R = V U^T$

 Check RH:

$$RH = V U^T \left( U \Sigma V^T \right) = V \Sigma V^T$$

This is a symmetric matrix; its eigenvalues are σ_i ≥ 0.
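A quick numerical confirmation of this check (added for illustration; H is an arbitrary 3×3 matrix):

```python
import numpy as np

rng = np.random.default_rng(4)
H = rng.standard_normal((3, 3))          # an arbitrary "covariance" matrix

U, s, Vt = np.linalg.svd(H)              # H = U Sigma V^T
R = Vt.T @ U.T                           # R = V U^T

RH = R @ H                               # should equal V Sigma V^T
print(np.allclose(RH, Vt.T @ np.diag(s) @ Vt))        # True
print(np.allclose(RH, RH.T))                          # symmetric
print(np.all(np.linalg.eigvalsh(RH) >= -1e-12))       # eigenvalues are the sigma_i >= 0
```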
