16 - Pattern Recognition [PR] - PR 13 [ID:22850]

Welcome everybody to Pattern Recognition.

So today we want to look into the actual linear discriminant analysis and how to compute it

in a rank-reduced form.

So today we really go into discriminant analysis, and what we talk about today is the so-called

rank-reduced linear discriminant analysis.

So we've seen already that our problem was how to choose an L-dimensional subspace with

L equal to k minus one, where k is the number of classes, and this is supposed to be a subspace

that is good for this linear discriminant analysis.

So now the idea that we want to follow is to maximize the spread of the L-dimensional

projection of the centroids, and we already know a method that can do that, and this is

the so-called principal component analysis.

So we calculate the principal components of the covariance of the mean vectors.

So you remember, we can of course transform our mean vectors using phi, and we can

do that for all of our classes.

So the principal component analysis, as you may know from Introduction to Pattern Recognition,

is a mapping that computes a linear transform phi that results in the highest spread of

the projected features.
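This PCA-of-centroids step can be sketched in a few lines of NumPy. The class means, feature dimension, and values below are made up purely for illustration (assuming k = 3 classes, so L = k − 1 = 2); the lecture itself does not give concrete numbers:

```python
import numpy as np

# Hypothetical class centroids: 3 classes in a 3-D feature space, L = k - 1 = 2.
means = np.array([[0.0, 0.0, 0.0],
                  [4.0, 1.0, 0.5],
                  [1.0, 5.0, 0.2]])

# Covariance of the mean vectors, centred on the overall centroid.
centered = means - means.mean(axis=0)
cov = centered.T @ centered / means.shape[0]

# Principal components: eigenvectors of the covariance, sorted by eigenvalue.
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
phi = eigvecs[:, order[:2]].T              # rows of phi = top-2 principal directions

# Centroids projected into the L-dimensional subspace.
projected = means @ phi.T
```

Note that the eigenvectors returned by `eigh` already have unit norm, which is exactly the row-norm constraint on phi that the Lagrange multipliers enforce in the objective discussed next.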

So this has an objective function and here we are looking for the transform phi star

that is the maximum over this term.

Now this term may look a little complex, but you can see here this is already the version

for our classes: we take the means of our classes, subtract the mean over all classes,

and essentially take the two-norm of these transformed vectors.

Now here we write out the two-norm as the inner product of the two vectors, and then

of course we average over our classes k, and at the same time we apply a regularization.

So you see here on the right-hand side we have some additional regularization, and this

is essentially a sum over the different regularization terms, and here you see that this is

lambda i times the two-norm of phi i, where phi i is essentially the i-th row vector of

phi.

So this essentially means that we are looking for a matrix where the individual row vectors

have a norm of one.

And the method that we are using to bring in these constraints, that the row vectors

have a length of one, is given by the so-called Lagrange multipliers.
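Written out from the verbal description, the objective presumably has the following shape; the symbols here are my own notation for what the slide shows (mu-bar_k for the class means, mu-bar for the overall mean, and phi_i for the i-th row of phi):

```latex
\hat{\phi} = \operatorname*{argmax}_{\phi}\;
  \frac{1}{K} \sum_{k=1}^{K}
    \left(\phi\,\bar{\mu}_k - \phi\,\bar{\mu}\right)^{\top}
    \left(\phi\,\bar{\mu}_k - \phi\,\bar{\mu}\right)
  \;-\; \sum_{i=1}^{L} \lambda_i \left(\lVert \phi_i \rVert_2^2 - 1\right)
```

The first term is the spread of the projected centroids; the second term, with one Lagrange multiplier lambda_i per row, penalizes any row of phi whose squared norm deviates from one.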

So note that we have to introduce this kind of regularization, these constraints, because

we are doing a maximization over the space of transforms, and we could very easily maximize

the left-hand-side term of our maximization problem simply by scaling up all of the entries

of phi.

So if we let them go towards infinity, then we will also get a maximization of the entire

term.

However, this is not what we are looking for, so we need to introduce these constraints.

So in case you forgot about Lagrange multipliers, let's have a short refresher.

Generally this is a method to include constraints into the optimization and we will use this

technique quite frequently for the rest of this class.

So if you have trouble with this you may want to consult some math textbook in order to

get acquainted with the concept.

So let's look into a simple example.

Here we simply have the function x plus y, and you already see, again as in the previous

example, this is not a bounded function.

So if we want to maximize x plus y, we would simply let x or y go towards infinity, and

that would essentially give us the maximum of this function.

So generally this is not so nice: the function itself is actually quite easy to optimize, but without a constraint the maximum is not bounded.
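As a short worked example of the Lagrange-multiplier technique on this very function: the lecture's x + y has no maximum on its own, so the unit-circle constraint below is my own addition, chosen to mirror the unit-row-norm constraint on phi:

```python
import math

# f(x, y) = x + y is unbounded. Add the (assumed, illustrative) constraint
# g(x, y) = x^2 + y^2 - 1 = 0, i.e. (x, y) must lie on the unit circle.
#
# Lagrangian: L(x, y, lam) = x + y - lam * (x^2 + y^2 - 1)
# dL/dx = 1 - 2*lam*x = 0 and dL/dy = 1 - 2*lam*y = 0  =>  x = y = 1/(2*lam);
# plugging this into the constraint gives x = y = 1/sqrt(2).
x_opt = y_opt = 1.0 / math.sqrt(2.0)
f_opt = x_opt + y_opt  # the constrained maximum, sqrt(2)

# Brute-force sanity check: sample the unit circle densely and confirm
# that no sampled point exceeds the value found analytically.
f_best = max(math.cos(t) + math.sin(t)
             for t in (2.0 * math.pi * i / 100000 for i in range(100000)))
```

The same mechanism carries over to the rank-reduced LDA objective above: one multiplier lambda_i per row of phi keeps each row at unit length, so the maximization cannot cheat by scaling phi up.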

Part of a video series

Accessible via: Open Access

Duration: 00:17:08 min

Recording date: 2020-11-04

Uploaded: 2020-11-04 20:57:27

Language: en-US

In this video, we introduce a rank-reduced version of the LDA.

This video is released under CC BY 4.0. Please feel free to share and reuse.

For reminders to watch the new video follow on Twitter or LinkedIn. Also, join our network for information about talks, videos, and job offers in our Facebook and LinkedIn Groups.

Music Reference: Damiano Baldoni - Thinking of You
