So, good morning everybody. Today we will continue to look at different norms and different optimization problems for solving pattern recognition related problems. But before we do so, let me briefly summarize the big picture.
In the winter semester we talk about pattern recognition, and we have seen at the beginning of this lecture that we are basically studying one mapping that maps feature vectors to class numbers. So we want a decision rule that takes a feature vector of fixed dimension D and maps it to a certain class. There are different ways to set up classifiers; we have already seen different approaches, and we have learned about the so-called optimal classifier with respect to the 0-1 cost function: the Bayes classifier. The Bayesian classifier basically decides for the class with the maximum posterior probability P(y | x). In this context we have seen that there are different ways to define this posterior probability, so posterior PDF modeling was one topic.
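To make that decision rule concrete, here is a minimal Python sketch, assuming the posteriors P(y = k | x) are already available as an array; the function name and the example values are purely illustrative, not from the lecture.

```python
import numpy as np

def bayes_decide(posteriors: np.ndarray) -> int:
    """Bayes decision rule: decide for the class with the maximum
    posterior probability P(y = k | x).
    posteriors: array of shape (num_classes,)."""
    return int(np.argmax(posteriors))

# Illustrative example with three classes: the rule picks class 1.
print(bayes_decide(np.array([0.2, 0.5, 0.3])))  # -> 1
```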
Before we talk about the posteriors, one comment: the Bayesian classifier is optimal with respect to what? With respect to the 0-1 loss function, right? The 0-1 loss function. So, 0-1 loss, what does it mean? It means if we make the right decision, it is for free, and if we make the wrong decision, we have to pay: 1 euro, 1 dollar, 1... I don't know, what is your currency? Rupee? Doesn't matter, it is currency independent.
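Spelled out as a short derivation (a standard argument, filling in the step behind that claim): under the 0-1 loss, the conditional risk of deciding $\hat{y}$ given $x$ is

$$ R(\hat{y} \mid x) \;=\; \sum_{y} [\hat{y} \neq y] \, P(y \mid x) \;=\; 1 - P(\hat{y} \mid x), $$

so minimizing the risk is the same as maximizing the posterior, $\hat{y} = \arg\max_y P(y \mid x)$, which is exactly the Bayes classifier from above.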
So, posterior PDF modeling: we have seen the generic model and the discriminative model. And sorry about my poor handwriting, you can live with it: the generic model and the discriminative model.
If I talk about a Gaussian classifier, does it fall into the class of discriminative or generic models? Generic models. And what is Gaussian in a Gaussian classifier? The class-conditional PDF.
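As an illustration of that generic (generative) idea, here is a rough Python sketch of such a Gaussian classifier: each class-conditional PDF p(x | y = k) is modeled as a Gaussian fitted to the samples of class k, and Bayes' rule combines it with the class prior. The function names and the use of scipy are purely illustrative choices, not the lecture's notation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_gaussian_classifier(X, y):
    """Fit one Gaussian per class as the class-conditional PDF
    p(x | y = k), plus the class prior P(y = k).
    X: array of shape (N, D), y: array of shape (N,)."""
    params = {}
    for k in np.unique(y):
        Xk = X[y == k]
        params[k] = (Xk.mean(axis=0),           # class mean
                     np.cov(Xk, rowvar=False),  # class covariance
                     len(Xk) / len(X))          # prior P(y = k)
    return params

def classify(params, x):
    """Decide for the class maximizing p(x | y = k) * P(y = k),
    which is proportional to the posterior P(y = k | x)."""
    scores = {k: multivariate_normal.pdf(x, mean=mu, cov=cov) * prior
              for k, (mu, cov, prior) in params.items()}
    return max(scores, key=scores.get)
```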
And the discriminative modeling approach, which one have we seen? Which one was that? Gaussian? Florian, you remember that? No, that was the generic one; the discriminative one was the direct modeling of the posterior. And we have seen one way to do so: the sigmoid function, the logistic function. Right, logistic regression, the logistic function.
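A minimal sketch of that discriminative model in Python; the weights below are hypothetical placeholders, since logistic regression would estimate them from training data, for example by maximum likelihood.

```python
import numpy as np

def sigmoid(a):
    """Logistic function: maps any real activation to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a))

# Hypothetical parameters, used here only for illustration.
w = np.array([1.0, -2.0])   # weight vector
b = 0.5                     # bias

x = np.array([0.3, 0.1])
p_y1 = sigmoid(w @ x + b)   # directly modeled posterior P(y = 1 | x)
p_y0 = 1.0 - p_y1           # P(y = 0 | x)
```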
And we have also seen that with an arbitrary decision boundary F(x) = 0, in the two-dimensional case for instance, we can write down the posterior probability right away by using this F(x): it is basically P(y | x) = 1 / (1 + exp(-y * F(x))).
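A small sketch of that last formula, assuming labels y in {-1, +1}; the quadratic boundary F below is a hypothetical example, not one from the lecture.

```python
import numpy as np

def posterior(F, x, y):
    """Posterior induced by a decision boundary F(x) = 0,
    for labels y in {-1, +1}: P(y | x) = 1 / (1 + exp(-y * F(x)))."""
    return 1.0 / (1.0 + np.exp(-y * F(x)))

# Hypothetical circular boundary in 2-D: F(x) = 0 on the unit circle.
F = lambda x: x[0]**2 + x[1]**2 - 1.0

x = np.array([0.9, 0.9])            # a point with F(x) > 0
p_plus = posterior(F, x, +1)
p_minus = posterior(F, x, -1)
assert abs(p_plus + p_minus - 1.0) < 1e-12  # the two posteriors sum to 1
```

Note that with a linear boundary F(x) = w.x + b this reduces exactly to the logistic model from a moment ago.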
Yes, sir? Ah, sorry: generative! Generic, generative, we have called it generative. I am so confusing, I'm sorry. Generative, not generic. I did not write...