Sal Husain

46:50

AA^T

James Shi

47:02

was it A^T A

Alberto Checcone

47:03

S

Kaichen Liu

47:37

don't both forms work?

Jake Whinnery

48:14

Yeah, but the u vectors are eigenvectors of a different matrix than the v vectors

James Shi

49:54

mxm

Terrance Li

49:58

^

Gilbert

55:02

an m x n matrix, with the first r x r block as S, and 0's everywhere else
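One way to write out the block form being described, with S = diag(sigma_1, ..., sigma_r) in the top-left r x r corner and zero blocks filling the rest:

```latex
\Sigma =
\begin{bmatrix}
S & 0_{r \times (n-r)} \\
0_{(m-r) \times r} & 0_{(m-r) \times (n-r)}
\end{bmatrix}
\in \mathbb{R}^{m \times n},
\qquad
S = \mathrm{diag}(\sigma_1, \dots, \sigma_r).
```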

Sal Husain

55:03

S but extended to the right

Nisha Prabhakar

55:07

S in the left top

Sal Husain

55:07

bottom right?

Nisha Prabhakar

55:10

zero everywhere else

Vainavi Viswanath

55:16

Sigma

Vainavi Viswanath

55:26

with zero filling in

Nisha Prabhakar

56:52

r x r

Francisco G

56:52

rxr

Jiachen Yuan

56:54

m*r

James Shi

58:39

should say r x (n-r) in the upper right

Ashwin Rammohan

58:44

yeah

Aryaman Jhunjhunwala

58:53

yeah, isn't the top right r x (n-r)?

Ashwin Rammohan

01:00:13

we haven't proven that U^T U and V^T V equal I, right?

Nisha Prabhakar

01:00:25

i think he sort of proved it above

Francisco G

01:00:34

i thought U^T U was 0?

Alexander Dong

01:00:53

He gave the basic idea of it as just doing inner products

Jake Whinnery

01:00:54

We did earlier: since they're orthonormal, the dot product of each unit vector with itself is 1 (and with the others is 0)

Jennifer Zhou

01:01:00

It's because the vectors in U and V are orthonormal
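A minimal numpy check of this point (hypothetical example: the QR factorization of a random tall matrix gives a U with orthonormal columns):

```python
import numpy as np

# Columns of U are orthonormal, so each dot product in U^T U is 1 (a column
# with itself) or 0 (with any other column), giving the identity.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((5, 3)))
print(np.allclose(U.T @ U, np.eye(3)))   # True
print(np.allclose(U @ U.T, np.eye(5)))   # False: for a tall U, U U^T is not I
```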

Ashwin Rammohan

01:01:29

but we haven't proven that yet right

Ashwin Rammohan

01:01:36

we're just accepting it as true for now?

Aryaman Jhunjhunwala

01:01:43

yeah, have we proven why all the eigenvectors are orthonormal?

Atharv Vanarase

01:01:44

We proved it already

Atharv Vanarase

01:01:58

Not super formal but he showed us it

Jake Whinnery

01:02:07

Proven that they're orthonormal, or that if they're orthonormal then V^T V is I?

Vainavi Viswanath

01:02:13

So would the remaining vectors in U1 or V1^T be the null space of AA^T or A^TA?

Aryaman Jhunjhunwala

01:02:25

why are they all orthonormal tho

Jiachen Yuan

01:02:29

Does the order of eigenvectors in U2 or V2 matter?

Alberto Checcone

01:02:49

they're eigenvectors (so they are orthogonal to each other) and we have created them such that they are orthonormal

Ashwin Rammohan

01:03:02

eigenvectors aren't orthogonal by definition

Ashwin Rammohan

01:03:08

they are just linearly independent

Jake Whinnery

01:03:21

You're right

Alberto Checcone

01:03:41

oh yeah huh

Sal Husain

01:03:42

yeah, counterexample: [1,2] and [2,1] are eigenvectors but not orthogonal
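Making Sal's counterexample concrete (a sketch assuming hypothetical eigenvalues 1 and 2; any distinct values work), by building A = P D P^{-1} with eigenvector columns [1,2] and [2,1]:

```python
import numpy as np

# A non-symmetric A whose eigenvectors are [1,2] and [2,1] -- not orthogonal.
P = np.array([[1.0, 2.0],
              [2.0, 1.0]])                      # columns are the eigenvectors
D = np.diag([1.0, 2.0])                         # hypothetical eigenvalues
A = P @ D @ np.linalg.inv(P)

print(np.allclose(A @ P[:, 0], 1.0 * P[:, 0]))  # True: [1,2] is an eigenvector
print(np.allclose(A @ P[:, 1], 2.0 * P[:, 1]))  # True: [2,1] is an eigenvector
print(P[:, 0] @ P[:, 1])                        # 4.0 != 0: not orthogonal
```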

James Shi

01:03:49

I think that’s something we proved from construction last lecture

Jake Whinnery

01:03:57

I think it's because A^T A and AA^T are symmetric matrices actually
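A quick numpy check of Jake's point (hypothetical random A): A^T A is symmetric, and numpy's eigh returns an orthonormal eigenbasis for symmetric input.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
AtA = A.T @ A
print(np.allclose(AtA, AtA.T))            # True: A^T A is symmetric
w, V = np.linalg.eigh(AtA)                # eigendecomposition for symmetric matrices
print(np.allclose(V.T @ V, np.eye(3)))    # True: eigenvectors are orthonormal
```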

Jake Whinnery

01:04:26

But no I don’t think that’s been proven yet in lecture

Nisha Prabhakar

01:04:48

can we ask if the top right corner should be r x (n-r)

Nisha Prabhakar

01:04:53

Of sigma

Terrance Li

01:07:06

how can a wide matrix have linearly independent columns?

Regina Wang

01:07:08

how can all the columns be linearly independent?

Chirag

01:07:21

r of them

Atharv Vanarase

01:07:24

As many as possible. So all the rows are

Atharv Vanarase

01:07:45

For the number of rows *

Regina Wang

01:07:54

oh ok!

Vanshaj Singhania

01:10:01

svd is a religion

Alberto Checcone

01:10:46

cult**

Alexander Dong

01:11:12

definitely not a cult**

Vainavi Viswanath

01:11:55

why is this preferred over the other method

Vainavi Viswanath

01:12:00

Where we use S instead of Sigma

Gilbert

01:12:08

cuz everything is square

Jake Whinnery

01:12:10

Square bois

Gilbert

01:12:20

(except for sigma itself lol)

Ashwin Rammohan

01:12:27

so in which case is S the upper left quadrant

Ashwin Rammohan

01:12:34

of sigma

Gilbert

01:12:37

the general case

Gilbert

01:12:59

if m = r or n = r, then m-r = 0 or n-r = 0 respectively, giving us our two special cases he just drew out
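Spelled out, each special case collapses one pair of zero blocks of Sigma:

```latex
r = m \;\text{(full row rank, wide):}\quad
\Sigma = \begin{bmatrix} S & 0_{m \times (n-m)} \end{bmatrix}
\qquad
r = n \;\text{(full column rank, tall):}\quad
\Sigma = \begin{bmatrix} S \\ 0_{(m-n) \times n} \end{bmatrix}
```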

Alexander Dong

01:13:00

I think when it does not have independent columns or rows

Ashwin Rammohan

01:13:00

but if we already looked at m < n and m > n, then is the general case where m = n?

Ashwin Rammohan

01:13:04

oh ok

James Shi

01:14:04

and square root?

Regina Wang

01:14:50

^

Vanshaj Singhania

01:15:14

technically yeah, but that doesn't help us in this case

Ashwin Rammohan

01:15:36

eigenvalue of 1

Vanshaj Singhania

01:15:37

plus that only works for vectors I think, not matrix-vector products?

vincentwaits

01:15:38

A rotation matrix

ryanm

01:15:48

Sin cos
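Presumably the standard 2D rotation matrix being referenced; its columns are orthonormal, so it preserves lengths:

```latex
R(\theta) =
\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix},
\qquad
R(\theta)^T R(\theta) = I .
```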

Francisco G

01:17:44

lol

Alberto Checcone

01:19:46

scales right?

Alexander Dong

01:19:59

scales and changes the size of the vector?

fy

01:21:40

the sigma matrix would change the length of V^T x, right?

Edward Im

01:21:52

yea

Alberto Checcone

01:22:09

the rotation matrices (U, V) don't, but sigma does

fy

01:22:15

thank you

Steven Chen

01:23:17

Why doesn’t V transpose change the length?

Jake Whinnery

01:23:26

Does sigma change length and direction tho

Jake Whinnery

01:23:31

Like basically acting as a mapping

Edward Im

01:23:33

Cause it's orthonormal @steven

vincentwaits

01:23:35

Because it's an orthonormal matrix

Steven Chen

01:23:48

thanks

Regina Wang

01:24:43

[0 1]

Stephen Wang

01:24:43

1,0

Stephen Wang

01:26:04

length becomes corresponding sigma

Francisco G

01:26:14

scaled by rho?

Francisco G

01:26:20

or sigma, rather*

Alexander Dong

01:26:26

So are V^T and U just changes of basis?

Francisco G

01:26:34

by sigma_2

dylanbrater

01:26:54

Sorry I missed it, what are the v1 and v2 vectors?

Ashwin Rammohan

01:27:07

eigenvectors of A^TA

Regina Wang

01:27:11

the vectors in V

dylanbrater

01:27:15

Ah thanks

Alberto Checcone

01:27:40

all matrices are changes of basis, technically

Sakshi Satpathy

01:27:42

What is x here?

Alberto Checcone

01:27:50

some arbitrary input

Sakshi Satpathy

01:27:56

Ok thanks

Ryan Zhao

01:28:14

will the angle stay the same?

Ryan Zhao

01:28:32

for V^T x -> Sigma V^T x

Alberto Checcone

01:28:34

V^T rotates the vector space so that the columns of V land on the axes

Alberto Checcone

01:28:37

sigma scales

Alberto Checcone

01:28:39

and U scales back

Vainavi Viswanath

01:28:39

where would V^T x land

Alberto Checcone

01:28:42

rotates back**

Vainavi Viswanath

01:28:44

When multiplied by sigma

Regina Wang

01:28:46

@ryan don’t think so
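A minimal numpy sketch of Alberto's rotate / scale / rotate-back picture (hypothetical 2x2 example): V^T x keeps the length of x, Sigma scales each component by sigma_i, and U keeps the length again.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])                 # hypothetical example matrix
U, s, Vt = np.linalg.svd(A)
x = np.array([0.6, 0.8])                   # a unit vector

v = Vt @ x                                 # rotate: same length as x
sv = np.diag(s) @ v                        # scale component i by sigma_i
ax = U @ sv                                # rotate back: same length as sv
print(np.linalg.norm(v), np.linalg.norm(sv), np.linalg.norm(ax))
print(np.allclose(ax, A @ x))              # True: this is exactly A x
```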

Ashwin Rammohan

01:29:41

how does he know that those two are AV1 and AV2?

fy

01:30:13

sigma 1

Terrance Li

01:30:14

sigma1?

Gilbert

01:30:14

its between sigma1 and sigma2

Sakshi Satpathy

01:30:22

how do we know all the vectors are length 1 initially? Do we always set it so?

Kunaal Sundara

01:30:51

because they are orthonormal

vincentwaits

01:31:12

Isn’t sigma1 just stretching one component? How does this bound apply to the entire vector?

Sakshi Satpathy

01:31:12

Is x also orthonormal to v1 and v2?

Edward Im

01:31:19

no

Edward Im

01:31:26

x is an arbitrary length 1 vector

Francisco G

01:31:31

b/c of the way we ordered the sigmas, Vincent

Gilbert

01:31:34

x is a linear combination of v1, v2, ...

Francisco G

01:31:49

sigma 1 was the greatest value

Sakshi Satpathy

01:31:50

Oh ok thanks

vincentwaits

01:32:26

Ty

vincentwaits

01:32:41

Fran

Francisco G

01:32:44

<3

Ryan Zhao

01:34:17

are relative angles between vectors preserved when multiplying by orthonormal vectors like U and V^T?

Ryan Zhao

01:34:50

matrices*

Sakshi Satpathy

01:36:36

(1) So to clarify, the goal of this section is to get from x to Ax using SVD?
(2) How do we know the length of Ax? And how do we know it is less than sigma1 * the length of x?
(3) You showed a rotation matrix earlier. Is this the usual behavior when SVD is applied, or is this just a particular example? Thanks

James Shi

01:37:20

This just provides a geometric interpretation of what Ax does

Francisco G

01:37:55

^for us Visual-types

Sakshi Satpathy

01:38:01

Oh ok—but what is Ax?

James Shi

01:38:06

I don’t think we know the length of Ax, but we can find an upper bound given ||x|| = 1

James Shi

01:38:25

||V^T x|| = 1

Sakshi Satpathy

01:38:53

oh ok thanks

James Shi

01:38:53

and then we know that Sigma scales each component of V^T x by the corresponding small sigma_i

Ashwin Rammohan

01:38:54

I don't think we preserve angle because the angle b/w x and v1 changes after multiplying by V^T
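A quick numpy check on Ryan's question (hypothetical example): an orthonormal Q preserves inner products between pairs of transformed vectors, so relative angles survive, even though the angle to a fixed, untransformed vector can change.

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # an orthonormal 3x3 matrix
x, y = rng.standard_normal(3), rng.standard_normal(3)
print(np.isclose((Q @ x) @ (Q @ y), x @ y))  # True: relative angle preserved
print(np.isclose((Q @ x) @ y, x @ y))        # generally False: angle to an
                                             # untransformed y is not preserved
```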

Kunaal Sundara

01:39:53

so could A be any linear transformation and then U, sigma, and VT perform that same linear transformation through rotation/reflection and scaling?

Gilbert

01:39:55

@Sakshi
(1) more or less, yes;
(2) we know the length of A*x = U*Sigma*V^T*x by analyzing each individual multiplication (note V^T*anyvector and U*anyvector preserve magnitude since V^T and U are orthonormal, so the only place the length gets scaled is the multiplication by Sigma). Since sigma1 is the largest singular value, the most a vector can be scaled by is sigma1, which happens when its entire length is along that axis;
(3) we observed that the orthonormal matrices didn't change magnitude, and we know that rotation matrices have this property, so we are basically saying that orthonormal matrices rotate vectors
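A numerical companion to point (2), on a hypothetical random A: for any unit x, ||Ax|| is at most sigma_1, and the bound is hit when x lies along v1.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 5))
U, s, Vt = np.linalg.svd(A)

x = rng.standard_normal(5)
x /= np.linalg.norm(x)                          # unit-length input
print(np.linalg.norm(A @ x) <= s[0] + 1e-12)    # True: ||Ax|| <= sigma_1
print(np.isclose(np.linalg.norm(A @ Vt[0]), s[0]))  # True: equality at x = v1
```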

Aryaman Jhunjhunwala

01:40:38

why is knowing the upper bound so useful

Sakshi Satpathy

01:40:39

Oh ok that makes sense, thanks so much

Aryaman Jhunjhunwala

01:40:44

what is this used for?

ryanm

01:41:08

Sad?

ryanm

01:41:13

svd*

James Shi

01:42:22

square

Alberto Checcone

01:42:24

square?

James Shi

01:46:38

a-jb

Jennifer Zhou

01:48:04

why do we not need to take the conjugate of Q again? Is it by construction?

Joseph Fedota

01:48:19

Q is all real

Connie Shi

01:48:24

it's real so the conjugate is Q

Jennifer Zhou

01:49:00

Sorry when/why did we say Q is real? Is that in the definition of symmetric matrix?

Connie Shi

01:49:36

the claim is for symmetric real matrices

Jennifer Zhou

01:49:56

Oh wait yeah I see that highlighted now thanks!!

mjayasur

01:51:45

Wait, how are we getting this step where we go from lambda conjugate * x^T x to lambda conjugate being lambda?
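A possible reconstruction of the step in question, assuming the setup on the board is Q x = lambda x for real symmetric Q and nonzero x:

```latex
\bar{x}^T Q x = \lambda \, \bar{x}^T x
\quad\text{and}\quad
\bar{x}^T Q x = (Q\bar{x})^T x = \bar{\lambda} \, \bar{x}^T x
\;\Rightarrow\;
(\lambda - \bar{\lambda}) \, \bar{x}^T x = 0,
\quad
\bar{x}^T x = \|x\|^2 > 0
\;\Rightarrow\;
\bar{\lambda} = \lambda .
```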

Stephen Wang

01:52:30

so this is to show that A^T A and AA^T have real eigenvectors and eigenvalues so we can do SVD, right?

Bijan Fard

01:52:45

Yep

James Shi

01:52:56

technically this should’ve been proven first

Bryan Ngo

01:54:42

why the specification of "can be chosen to be orthonormal" rather than "are orthonormal"

Calvin Yan

01:56:00

Well because whether or not vectors are orthonormal depends on magnitude

Calvin Yan

01:56:16

And eigenvector magnitude is arbitrary

Kaichen Liu

01:56:43

yeah, by linearity, multiplying an eigenvector by any nonzero scalar keeps it an eigenvector
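Written out, the linearity point is one line:

```latex
A v = \lambda v,\; c \neq 0
\;\Rightarrow\;
A(cv) = c\,A v = c\lambda v = \lambda (cv),
```

so cv is still an eigenvector, and choosing c = 1/||v|| makes it unit length.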

Jake Whinnery

01:57:17

Sorry I missed it, why is x2^T Q x1 a scalar?

Edward Im

01:57:37

can look at the dimensions

Bijan Fard

01:57:40

Also, if there are repeated eigenvalues, then you might have to choose the direction too. For example, with the identity matrix, every nonzero vector is an eigenvector, but you can still choose orthogonal ones.

Stephen Wang

01:57:45

wait, why can you do that again?

Bijan Fard

01:57:46

Regarding the question above

Stephen Wang

01:58:17

x2^T x1 = 0?

fy

01:58:25

eigenvalues are 1 or 0

Alexander Yang

01:58:42

lambda1 = lambda2

Kaichen Liu

01:58:56

wait @Bijan, with the identity matrix you still always have orthogonality with the eigenbasis

Kaichen Liu

01:59:14

no matter what magnitudes you choose? or am i missing something

Aryaman Jhunjhunwala

01:59:26

why is x2^T x1 = x1^T x2

Ashwin Rammohan

01:59:32

inner product

Ashwin Rammohan

01:59:37

is same regardless of order
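Putting the pieces of this thread together (symmetric Q, eigenpairs (lambda_1, x_1) and (lambda_2, x_2) with lambda_1 != lambda_2), the scalar x_2^T Q x_1 can be evaluated two ways:

```latex
\lambda_1 \, x_2^T x_1 = x_2^T (Q x_1) = (Q x_2)^T x_1 = \lambda_2 \, x_2^T x_1
\;\Rightarrow\;
(\lambda_1 - \lambda_2)\, x_2^T x_1 = 0
\;\Rightarrow\;
x_2^T x_1 = 0 .
```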

Bijan Fard

01:59:38

Well, you could choose an eigenbasis where they aren't orthogonal as well.
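Bijan's identity-matrix example, checked in numpy:

```python
import numpy as np

# Every nonzero vector is an eigenvector of I (eigenvalue 1), so an eigenbasis
# need not be orthogonal -- but an orthonormal one can always be chosen.
I2 = np.eye(2)
b1, b2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
print(np.allclose(I2 @ b1, b1), np.allclose(I2 @ b2, b2))  # True True
print(b1 @ b2)   # 1.0: this eigenbasis is not orthogonal
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(e1 @ e2)   # 0.0: an orthonormal choice exists
```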

Aryaman Jhunjhunwala

01:59:41

oh right yeah

Alberto Checcone

02:00:50

1,-1

Ryan Zhao

02:03:26

thank you!

Ashwin Rammohan

02:03:30

Thanks!

Kunaal Sundara

02:03:31

thanks!

Calvin Yan

02:03:31

Thanks!

umakarki20

02:03:32

Thank you professors

Bijan Fard

02:03:32

Thanks!

Jake Whinnery

02:03:33

Thanks!

Shayan Islam

02:03:34

^

David Yi

02:03:35

thank you!

Jamie

02:03:36

thanks!

Jasmine Bae

02:03:36

Thank you

Vainavi Viswanath

02:03:37

thank you

Sakshi Satpathy

02:03:38

thanks

Aryaman Jhunjhunwala

02:03:38

thank you

Hannah Huang

02:03:40

Thank you!

Grace Chen

02:03:40

Thank you!

Will Panitch

02:03:42

Thanks!

Minglai Yu

02:03:43

Thanks

Ayush Sharma

02:03:44

Thank you! :)