The following content has been provided by the University of Erlangen-Nürnberg.
Good morning everybody, welcome to pattern recognition.
I'd like to introduce Dominik today.
He would like to advertise a master's thesis at our lab.
He's from the digital sports group, and he will explain to you what you would have to do in that master's thesis.
Thank you, Stefan. First, I want to introduce what the digital sports group deals with.
So we place some sensors on the body here; these can be motion sensors or biosignal sensors.
We transmit the data through devices like a tablet or a smartphone, and then the user wants to have some feedback.
So, for example, we place some sensors on the shoe, and the user wants to know how many steps they took in, for example, two hours.
And we develop applications in sports and in medicine, and we run through the pattern recognition pipeline.
So I think you know that. We acquire data, we do some pre-processing like filtering or segmentation.
We extract features and then we do some classification experiments.
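To make this pipeline a bit more concrete, here is a minimal sketch in Python, assuming NumPy and scikit-learn; the feature extraction and the classifier choice are illustrative placeholders, not the actual processing used in our group.

```python
# Minimal pattern recognition pipeline sketch (assumed libraries: NumPy, scikit-learn).
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler
from sklearn.svm import SVC

def extract_features(segments):
    # Toy features per segment: mean, standard deviation, peak-to-peak range.
    return np.column_stack([segments.mean(axis=1),
                            segments.std(axis=1),
                            np.ptp(segments, axis=1)])

pipeline = Pipeline([
    ("features", FunctionTransformer(extract_features)),  # feature extraction
    ("scale", StandardScaler()),                           # normalization
    ("classify", SVC(kernel="rbf")),                       # classification
])

# segments: pre-processed, segmented sensor windows (n_segments x n_samples)
segments = np.random.randn(100, 128)
labels = np.random.randint(0, 2, size=100)
pipeline.fit(segments, labels)
```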
And in this thesis, the focus lies on the lower extremities.
So we place some sensors here, for example on the shoe, and we acquire some data.
And the idea is now to extract events from such a signal; you can see them as the red circles.
You have to detect these events and then you want to classify these events.
So you want to know which movement corresponds to, for example, that maximum.
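As a rough illustration of this event detection step, the following sketch finds local maxima in a one-dimensional sensor signal with SciPy; the signal, sampling rate, and peak parameters are made up for the example and are not the values used in the thesis.

```python
# Event detection sketch (assumed library: SciPy); parameters are illustrative.
import numpy as np
from scipy.signal import find_peaks

def detect_events(signal, fs):
    # Find local maxima (the "red circles") that are sufficiently prominent
    # and at least 0.3 s apart, as one plausible definition of an event.
    peaks, _ = find_peaks(signal, prominence=0.5, distance=int(0.3 * fs))
    return peaks

fs = 200.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 5, 1 / fs)
signal = np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
events = detect_events(signal, fs)           # sample indices of candidate events
```

Each detected index could then be handed to a classifier that decides which movement the event belongs to.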
And the focus lies on embedded implementation, and this sensor system should be integrated into sportswear.
For example in shoes or in trousers, something like that.
And we want to apply that in team sports.
And we work together with Adidas, so you can get deep insight into this company and get to know the right people after the master's thesis.
And yes, if you're interested in this thesis, you can contact me by email or phone.
That was the introduction. If you're interested.
Okay. Thank you.
Okay. Where did we stop yesterday?
I introduced the AdaBoost algorithm, one of the most famous boosting algorithms.
And we had a look at the technical background.
We saw that it is equivalent to an additive model with forward stage-wise modeling.
And it applies the exponential loss function as the loss function.
And then you see that the additive model with this exponential loss function is identical to the AdaBoost algorithm.
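To connect the two views, here is a minimal sketch of AdaBoost written as forward stage-wise fitting of an additive model under the exponential loss exp(-y F(x)); using decision stumps from scikit-learn as weak learners is an assumption of this sketch, not part of the lecture.

```python
# AdaBoost as stage-wise additive modeling with exponential loss (labels in {-1, +1}).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)                     # sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err == 0 or err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)   # stage-wise coefficient
        # Re-weighting: misclassified samples get larger weight; this is what
        # minimizing the exponential loss one stage at a time boils down to.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    # Additive model F(x) = sum_m alpha_m h_m(x); the predicted class is sign(F).
    F = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(F)
```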
We also had a look at different loss functions.
We had this function here which tells us which samples or which areas are correctly classified.
Here we have the margin.
This value is called margin.
It's the output of our classifier, of our additive model, times the class label.
And if we have a positive value, that is, if the output of the model and the class label have the same sign,
so both can be positive or both negative, then we have a correct classification and zero loss.
And if the signs are different, then it's a misclassification, and we say the loss is one.
So what do we do if our model outputs a real value?
We compute the sign of F.
We compare it with the class label and then we compute the indicator function.
If this argument is true, that is, if we have this mismatch, then the output is one.
In the other case, it's zero.
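In code, the 0-1 loss as a function of the margin could look like this small sketch, with labels in {-1, +1} and F the real-valued output of the additive model:

```python
# 0-1 loss via the margin y * F(x): zero loss if the signs agree, one otherwise.
import numpy as np

def zero_one_loss(y, F):
    return (np.sign(F) != y).astype(float)

y = np.array([1, -1, 1, -1])
F = np.array([2.3, -0.4, -1.1, 0.2])
print(zero_one_loss(y, F))   # [0. 0. 1. 1.]
```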
One loss function that's widely used is the squared error.
So you compare the output with the value it should have.
And in our case, the value should be plus one for a correct classification.
And in all other cases, we compute the distance from one, so one minus y times F, and take the square of it.
So the loss is actually (1 - y F) squared.
And you see here, that's the optimal value that we want to reach, zero loss.
And then here, that's the area where it's correctly classified.
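And the squared error loss on the margin, as a small sketch under the same conventions; it is zero exactly at y F = 1 and grows on both sides of that point:

```python
# Squared error loss on the margin: (1 - y * F)^2.
import numpy as np

def squared_loss(y, F):
    return (1.0 - y * F) ** 2

y = np.array([1, 1, -1])
F = np.array([1.0, -0.5, -1.0])
print(squared_loss(y, F))    # [0.   2.25 0.  ]
```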