14 - Pattern Recognition (PR) [ID:2543]

[MUSIC]

So, welcome back everybody to Pattern Recognition. In case you weren't here yesterday: my name is Andreas Maier, and I am substituting for Professor Hornegger today. He is on a business trip, so he can't give the lecture today. We are talking about pattern recognition here, so now you have to help me, because I need a refresher in pattern recognition. What is this about? You should have seen something similar to this. And please excuse my drawing skills here on the tablet; I'm still learning, still improving.

Okay, so what is pattern recognition? What is the fundamental problem that you want to solve in pattern recognition? Sounds like a difficult one. Yes: we have a feature vector x and we want to assign it to a specific class y. That's it, sounds easy enough. Okay, so this is classification: we want to assign a class label y to a feature vector x.
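In code terms, a classifier is simply a mapping from feature vectors to class labels. A minimal sketch of that interface (the decision rule below is a made-up stand-in for illustration, not anything from the lecture):

import numpy as np

def classify(x: np.ndarray) -> int:
    # Toy stand-in for a decision rule: assign class 1 if the first
    # feature is positive, otherwise class 0 (made up for illustration).
    return 1 if x[0] > 0 else 0

print(classify(np.array([0.7, -2.1])))  # -> 1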

And in the beginning you looked at how to do that. What's the first thing you looked at? The Bayes classifier, okay. And what's special about the Bayes classifier, and what kind of classification does it yield? Okay, yes, you are talking about this here: you maximize the posterior probability p(y | x) over y.
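To make this decision rule concrete, here is a minimal sketch (my own illustration; the priors and likelihood values are made up). By Bayes' rule, p(y | x) is proportional to p(x | y) p(y), and the evidence p(x) is the same for every class, so it drops out of the maximization.

import numpy as np

# Hypothetical priors p(y) and class-conditional likelihoods p(x | y)
# for three classes, evaluated at one fixed feature vector x
# (all values made up for illustration).
prior = np.array([0.5, 0.3, 0.2])
likelihood = np.array([0.10, 0.40, 0.05])

# Bayes / MAP decision rule: pick the class with the largest
# unnormalized posterior p(x | y) * p(y).
posterior_unnormalized = likelihood * prior
y_hat = int(np.argmax(posterior_unnormalized))
print("decided class:", y_hat)  # -> 1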

Okay, anything more about the Bayes classifier? What did you learn about it? It's optimal. And when is it optimal? For the 0-1 loss function, exactly. What does the 0-1 loss function mean? Okay, can you still read this? I'm saying this stuff, so imagine these are Chinese characters and I am telling you what they mean; you just have to memorize it. I hope you can deal with that; I try to do my best. Okay, so the Bayes classifier is optimal for the 0-1 loss function. What's the unit of the one? Is it euros? It can be anything, exactly, very well.
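As a reminder of what the 0-1 loss is, here is a minimal sketch (my own illustration, not from the lecture): the loss is 0 for a correct decision and 1 for any wrong one, no matter which wrong class was picked. Under this loss, the expected loss at a given x is one minus the posterior of the chosen class, so picking the class with the largest posterior, i.e. the Bayes rule above, minimizes it.

def zero_one_loss(y_true: int, y_pred: int) -> int:
    # 0-1 loss: no cost for a correct decision, unit cost for any error,
    # regardless of which wrong class was chosen.
    return 0 if y_true == y_pred else 1

print(zero_one_loss(2, 2))  # -> 0
print(zero_one_loss(2, 0))  # -> 1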

And what else did you look at? What should I draw next? Now we have this probability distribution; how does it look? Is it always Gaussian? No. So what can we do about this? There are different ways of modeling this, yes: modeling the posterior PDF. And there are different ways to model it; actually, you got to know two different ways of modeling it. Generative, not generic: generative. Why is this called generative, and what did you model here? Can you speak up? Okay, so generative, and you were referring here to the Gaussian, right? Why is the Gaussian generative? What did you model with your Gaussian? You were using a trick with the Gaussian modeling, right? So with your Gaussian, you usually model how you can generate your feature vectors x given a class y, that is, the class-conditional p(x | y). And if you know this distribution, it describes how the feature vectors of that class are generated.
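To make the generative idea concrete, here is a minimal sketch (my own illustration; the Gaussian parameters and priors are made up, and this is not the exact model from the lecture): we model the class-conditional p(x | y) with one Gaussian per class, combine it with the prior p(y) via Bayes' rule to classify, and, because the model is generative, we can also draw new feature vectors from p(x | y).

import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical per-class Gaussians for p(x | y) in two dimensions
# (means, covariances and priors are made up for illustration).
means = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
covs = [np.eye(2), 2.0 * np.eye(2)]
prior = np.array([0.6, 0.4])

def decide(x: np.ndarray) -> int:
    # Generative classification: evaluate p(x | y) * p(y) for each class
    # and take the argmax; the evidence p(x) cancels out.
    scores = [multivariate_normal.pdf(x, mean=m, cov=c) * p
              for m, c, p in zip(means, covs, prior)]
    return int(np.argmax(scores))

print(decide(np.array([2.5, 2.0])))  # most likely class 1

# Because the model is generative, we can also sample a new feature
# vector for a given class, e.g. x ~ p(x | y = 0):
sample = multivariate_normal.rvs(mean=means[0], cov=covs[0])
print(sample)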

Part of a video series:
Access: Open access
Duration: 01:26:30 min
Recording date: 2012-11-27
Uploaded: 2012-12-04 09:11:27
Language: en-US
