[MUSIC]
Good afternoon, everybody. Tomorrow I will give the big picture again; on Mondays there is usually no big picture, just the continuation of the topics we considered last week. Andreas Maier was my substitute last week, and he finished the topic of the perceptron. We talked about the perceptron, its convergence, and the observation that the convergence speed is basically independent of the dimensionality D of the feature vectors.
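To recall the statement behind that observation: the standard perceptron convergence (Novikoff) bound says that for linearly separable data the number of update steps is bounded by a quantity in which the dimension D does not appear. A minimal sketch of the bound, using the usual notation with data radius R and margin gamma, which I am assuming matches the notation used last week:

\[
  \text{number of updates} \;\le\; \left(\frac{R}{\gamma}\right)^{2},
  \qquad
  R = \max_i \|\mathbf{x}_i\|,
  \qquad
  y_i\,\mathbf{w}^{*\top}\mathbf{x}_i \ge \gamma \ \text{for all } i,\ \ \|\mathbf{w}^{*}\| = 1 .
\]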
talked a little bit about neural networks something that I do not like
so much so the excitement factor is rather rather small and you can
be rather sure that I don't ask usually any questions regarding
neural networks and also the biological motivation as Andreas told you
hopefully last week is not part part of the topic's of
the examinations we usually talk about things that we understand and that
we know and not about biology and all these things
where I only have a vague vague type of understanding
Today we have one chapter which might be boring for you if you are studying mathematics, engineering mathematics, or technical mathematics, because it will be on optimization. We thought a lot about whether to include some kind of refresher course in optimization or whether we should skip it and assume that you all have experience with optimization, but our personal observation is that students usually have not seen much optimization in their career at our university. So if it is redundant for you, just enjoy it, and hopefully you will find some weak parts which I do not explain so well, so you can support me; and if you are seeing it for the first time, try to catch the basic ideas. All these methods are usually implemented in MATLAB and in many optimization libraries, so you are not required to implement these methods on your own, but once you want to use them it is important to have a good understanding of what is actually going on in
this or that optimization approach. In general, optimization is crucial for many solutions in pattern recognition and image processing. You can usually say: what is pattern recognition? Pattern recognition basically is writing down an objective function, optimizing that objective function, and then you are done. This is true for computing the parameters of a regressor, and for classification, where we have to optimize with respect to a decision rule. So many of the problems we are considering are basically reduced to optimization problems.
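To make the "write down an objective function and optimize it" idea concrete, here is a minimal sketch, not taken from the lecture, of fitting a linear regressor by minimizing a least-squares objective with plain gradient descent; the synthetic data, step size, and iteration count are assumptions chosen purely for illustration.

# Sketch: pattern recognition as optimization (illustrative only).
# Fit a linear regressor y ~ X w by minimizing the least-squares
# objective J(w) = ||X w - y||^2 with plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features (synthetic)
w_true = np.array([1.5, -2.0, 0.5])      # "unknown" parameters to recover
y = X @ w_true + 0.1 * rng.normal(size=100)

def objective(w):
    r = X @ w - y
    return r @ r                         # J(w) = ||X w - y||^2

def gradient(w):
    return 2.0 * X.T @ (X @ w - y)       # gradient of J at w

w = np.zeros(3)                          # start at the origin
step = 1e-3                              # fixed step size (assumed)
for _ in range(2000):                    # iteration budget (assumed)
    w = w - step * gradient(w)

print("estimated parameters:", w)
print("objective value:", objective(w))

In practice you would of course call a solver from MATLAB or an optimization library rather than hand-rolling gradient descent; the point is only that the regressor's parameters come out of minimizing an explicitly written objective function.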
If you study optimization on the web or in the literature, you will see that optimization has many different faces; you can find whole books and whole lecture series on optimization. For us as pattern recognition researchers it is very important to follow up on current research in optimization, and we have to say things have changed a lot over the past 10 years: a lot of novel results on optimization methods were generated, and today we can solve optimization problems that nobody was thinking of, let's say, 20 or 25 years ago when I was a student. So optimization is important, very important. If you have a chance to attend a lecture on optimization at our university, you should take the chance, in particular if you want to do research with us later on. It is always good to have a certain expertise in this field, and it is always good if you have a better expertise in