11 - Pattern Recognition (PR) [ID:2516]

[MUSIC]

sum over all the feature vectors x_i times x_i transposed, if you think in terms of column vectors. And what is the eigenvector that corresponds to the largest eigenvalue? That is the principal axis, the direction in which the one-dimensional projections onto this straight line have maximum variance, so the interval that is covered by the 1-D projections is maximal. The second eigenvector, the one that belongs to the second largest eigenvalue, is the second principal axis, where we have the second highest variation, and so on.
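A minimal sketch of this step, assuming mean-free feature vectors stacked as rows of a NumPy array (the toy data and variable names below are my own, not from the lecture):

```python
import numpy as np

# Minimal sketch: principal axes of a set of feature vectors (toy data).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ np.diag([3.0, 1.0, 0.3])  # 500 samples, 3-D features
X -= X.mean(axis=0)                                        # make the vectors mean-free

# Covariance matrix: (1/N) * sum over all feature vectors x_i x_i^T
Sigma = X.T @ X / len(X)

# Eigen-decomposition; eigh returns ascending eigenvalues, so reverse the order
lam, E = np.linalg.eigh(Sigma)
lam, E = lam[::-1], E[:, ::-1]

# The eigenvector of the largest eigenvalue is the principal axis: the 1-D
# projections onto it have maximum variance, and that variance equals lam[0].
proj = X @ E[:, 0]
print(lam[0], proj.var())   # the two numbers agree up to numerical error
```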

So what you can do is basically a so-called spectral decomposition; we have written it here, a spectral decomposition of the covariance matrix. That is, you can rewrite the whole covariance matrix as a linear combination of rank-one matrices; these are the projection matrices that project onto the eigenvectors, and these rank-one matrices are weighted by their eigenvalues. This decomposition holds for positive semidefinite covariance matrices. And now you can think of the following: you have this covariance matrix, and you can characterize it by the e_i's, these matrices, and the lambda_i's. Now you can change the lambda_i's a little bit, for instance so that you get a larger or a smaller scaling in a certain principal direction, and that is how you can build different kidney models out of this type of decomposition.
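As a minimal sketch under the same toy assumptions as above (NumPy, made-up data, variable names of my own rather than the lecture's code), the spectral decomposition Sigma = sum_i lambda_i e_i e_i^T and the rescaling of one lambda_i could look like this:

```python
import numpy as np

# Minimal sketch of the spectral decomposition of a covariance matrix:
# Sigma = sum_i lambda_i * e_i e_i^T, a linear combination of rank-one
# projection matrices weighted by their eigenvalues.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ np.diag([3.0, 1.0, 0.3])
X -= X.mean(axis=0)
Sigma = X.T @ X / len(X)

lam, E = np.linalg.eigh(Sigma)      # columns of E are the eigenvectors e_i
lam, E = lam[::-1], E[:, ::-1]      # sort by eigenvalue, largest first

Sigma_rebuilt = sum(l * np.outer(e, e) for l, e in zip(lam, E.T))
assert np.allclose(Sigma, Sigma_rebuilt)

# Changing a lambda_i gives a larger or smaller scaling along that principal
# direction, i.e. a modified covariance model built from the same e_i's.
lam_mod = lam.copy()
lam_mod[0] *= 2.0
Sigma_mod = sum(l * np.outer(e, e) for l, e in zip(lam_mod, E.T))
```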

And that's actually what we did. We have used the mean vector here; if it is zero, it is gone. Then we say our vectors x are basically characterized by linear combinations of our eigenvectors, and these eigenvectors are weighted differently by the a_i's. By manipulating these a_i's, which in the previous examples were the lambda_i's, I can build different shapes. If I just change a_1, the coefficient that belongs to the largest eigenvalue, and if I plot the point vector, the high-dimensional feature vector, and the associated volume, I see basically within the kidney what the deformation with the highest variation is.
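A sketch of this shape-model idea under toy assumptions (random vectors standing in for the kidney surface points; all names below are hypothetical): shapes are written as x ≈ mean + sum_i a_i e_i, and only a_1 is varied to visualize the strongest deformation mode.

```python
import numpy as np

# Sketch of a PCA shape model: x ≈ mean + sum_i a_i e_i.
# Toy data: 50 training "shapes", each a flattened set of 100 3-D points.
rng = np.random.default_rng(0)
shapes = rng.normal(size=(50, 300))

mean = shapes.mean(axis=0)
lam, E = np.linalg.eigh(np.cov(shapes, rowvar=False))
lam, E = lam[::-1], E[:, ::-1]          # largest eigenvalue first

# Vary only a_1, the coefficient of the first eigenvector; all other a_i are 0.
# Sweeping a_1 shows the deformation with the highest variation.
for a1 in (-2.0, 0.0, 2.0):
    x = mean + a1 * np.sqrt(lam[0]) * E[:, 0]
    points = x.reshape(-1, 3)           # back to an (n_points, 3) set for plotting
```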

That is very interesting. We had, for instance, one project with the Adidas company, and what we tried to do is measure the three-dimensional surface of human feet. So we measured the human foot, then we computed this PCA decomposition of the covariance matrix, and we got the eigenvector that belongs to the largest eigenvalue. Then we rewrote this point set using this type of representation and looked at variations along the largest eigenvalue. And what do you think, what is the change of a human foot in terms of its variability over humans? The length, right, that is the length, and that is expressed here by the largest eigenvalue. That is very interesting: without using any domain knowledge, we captured 5000 feet of people and we did the PCA

Part of a video series:
Accessible via: Open Access
Duration: 00:40:13 min
Recording date: 2012-11-19
Uploaded on: 2012-11-20 17:16:50
Language: en-US
