14 - (Lecture 5, Part 3) Singular Value Decomposition

Hi guys, welcome back to the computer vision lecture series.

This is lecture five and part three.

By popular demand, we are going to talk about singular value decomposition in this lecture.

I'm going to give you a brief overview of the theoretical background, the core concepts, and a few applications, and you will see the power of computing singular value decompositions.

So let's go ahead.

I will approach singular value decomposition in terms of vector projections.

So I will start with simple basic projections.

However, if you have a background in matrix algebra and vector calculus, it will be easy to follow.

I have tried to make it as simple as possible, and I have largely followed the two blog posts shown at the bottom of the screen here.

And if you want a deeper look into the workings of SVD, its geometric interpretation, perhaps a more analytical interpretation, and its applications, I would recommend that you look at the Wikipedia page on SVD after watching this video and reading those blog posts.

So what is an SVD?

Any real matrix A can be decomposed into three matrices: U, Sigma, and V transpose, so that A = U Σ Vᵀ.

What is A?

A is an m × n matrix. It's a real matrix.

U is an m × n matrix with orthonormal columns, Sigma is an n × n diagonal matrix, and V is an n × n orthogonal matrix.

We will look into the details of what these are, but this is the basic definition of the singular value decomposition.
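To make this concrete, here is a minimal NumPy sketch, not from the lecture itself, that computes this (thin) decomposition for an arbitrary 4 × 3 matrix and verifies that the three factors multiply back to A:

```python
import numpy as np

# A small real m x n matrix (values are arbitrary for illustration)
A = np.random.rand(4, 3)  # m = 4, n = 3

# Thin SVD: U is m x n with orthonormal columns, s holds the n singular
# values, and Vt is the n x n matrix V transpose
U, s, Vt = np.linalg.svd(A, full_matrices=False)

Sigma = np.diag(s)  # n x n diagonal matrix of singular values

# The product U @ Sigma @ Vt reconstructs A up to floating-point error
print(np.allclose(A, U @ Sigma @ Vt))  # True
```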

Before I go ahead, for anyone who doesn't know what an orthogonal matrix is: it's a matrix whose transpose is its inverse.

So for an orthogonal matrix V, the equation Vᵀ V = V Vᵀ = I holds.

Any matrix whose inverse is its transpose is an orthogonal matrix. Equivalently, the rows and columns of such a matrix are orthonormal, so multiplying the matrix by its transpose results in an identity matrix.

So that's the basic idea here.
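As a small side check, not shown in the lecture: the V factor that NumPy's SVD returns is exactly such an orthogonal matrix, and we can verify it directly:

```python
import numpy as np

# V from the SVD of any real matrix is orthogonal: its transpose is its inverse
_, _, Vt = np.linalg.svd(np.random.rand(4, 3))
V = Vt.T

print(np.allclose(V.T @ V, np.eye(3)))  # True: V^T V = I
print(np.allclose(V @ V.T, np.eye(3)))  # True: V V^T = I
```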

Orthogonal matrices are a special case of unitary matrices. Unitary matrices are defined over the complex domain: instead of the transpose, one takes the complex conjugate transpose, and when you multiply a complex matrix by its conjugate transpose and get an identity matrix, that matrix is unitary.

This is just some background for anyone who wants to know a bit more about orthogonal matrices.
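For completeness, here is a tiny sketch (my own example, not from the lecture) of a unitary matrix in the complex domain:

```python
import numpy as np

# A 2 x 2 unitary matrix: its conjugate transpose is its inverse
U = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

# U^H U = I, where U^H is the conjugate transpose
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True
```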

So let's jump into the basics of it.

So the first thing that we want to look at here is vector decomposition.

Let's say you have a point A in a vector space; here we are considering these axes as the x and y axes, if you like. Any point A can be represented by its x and y components in this vector form.

Now let's say we have V1 and V2 as two different directions; these are orthogonal directions, and we want to decompose this vector A along them. When we project this point onto these directions, we get the components: the component of A in the direction of V1 is sA1, and that's the length, or the strength, of vector A along direction V1.
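As an illustration of this decomposition, here is a short sketch, with hypothetical values for A, V1, and V2, that projects a point onto two orthonormal directions and recovers it from the two components:

```python
import numpy as np

# A point a and two orthonormal directions v1, v2 (hypothetical values)
a = np.array([3.0, 2.0])
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([-1.0, 1.0]) / np.sqrt(2)

# The strength of a along each direction is the dot product with that direction
s_a1 = a @ v1  # component of a along v1
s_a2 = a @ v2  # component of a along v2

# a is the sum of its projections onto the two orthogonal directions
print(np.allclose(a, s_a1 * v1 + s_a2 * v2))  # True
```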
