Thank you Daniel.
Okay welcome everybody.
So this talk is an introduction to deep learning so it's really covering mainly the basics.
So three people have mainly been working on these slides: Katharina Breininger, Tobias
Würfl and myself.
So we actually made the slides for the deep learning course, the complete lecture, and
now I grabbed some things and tried to squeeze them into 90 minutes.
Let's see about that.
So I really removed a lot.
So if you want more information, I have put all the references in here, but you can also
see the complete deep learning lecture, so 12 lectures.
It's on StudOn, presented mainly by Professor Maier, and some lectures are also by me and
Katharina.
Is it on actually?
Is it okay for you?
I can make it louder.
No?
Okay better not.
Good.
Okay so yeah what is deep learning?
Deep learning can be seen as a part of machine learning, and now I have the pleasure here
to speak as a computer scientist, and yeah, what the hell is it?
We will now try to cover the basics; I mean, there are so many subfields
which I cannot cover here in 90 minutes, it's just impossible.
So the outline is: we first start with the neural network basics, then we will talk
a little bit about CNNs, so convolutional neural networks, and then regularization techniques.
And for the neural network basics, we start with a small motivation, the perceptron, and then
go over to the learning rule and optimization process.
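As a preview of the learning rule mentioned in the outline, here is a minimal sketch of the classic perceptron update (the toy AND dataset, learning rate, and epoch count are illustrative choices, not from the talk):

```python
import numpy as np

# Toy dataset: the logical AND function (hypothetical example).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)  # weights
b = 0.0          # bias
lr = 0.1         # learning rate

for epoch in range(20):
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        # Perceptron learning rule: shift weights toward the correct label
        # only when the prediction is wrong (yi - pred is then +/-1).
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```

Since AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating hyperplane after finitely many updates.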
So for motivation, maybe a nice motivation: I always wished I had bought some stocks from
Nvidia, because they rose quite a lot, and one part of that is due to deep learning.
Maybe someone guesses why we have this drop here?
Yeah?
I think it's a new architecture from AMD.
No no why did it break down here so heavily?
That was Bitcoin.
So we had the Bitcoin drop, so mining Bitcoins was not worth it anymore, and then we had
this huge drop here.
So it's not all deep learning, but a huge part of Nvidia selling these graphics cards is
of course represented here.
So when did it all happen?
So the big bang of deep learning, at least in the image community, was the big
breakthrough of Alex Krizhevsky in 2012, when they basically dropped the error rate
quite a lot by training a classifier, a deep neural network, on the ImageNet data set.
The ImageNet data set covers around 14 million images, but the challenges have fewer
categories: the typical ImageNet challenge has 1000 classes and 1000 images per class.
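The evaluation setup behind such a challenge, one label per image and a classifier score per class, can be sketched with the top-1 error rate (the scores, labels, and tiny 4-class size here are made up for illustration; the real challenge has 1000 classes):

```python
import numpy as np

# Hypothetical classifier scores: 5 images, 4 classes.
scores = np.array([
    [0.1, 0.7, 0.1, 0.1],
    [0.6, 0.2, 0.1, 0.1],
    [0.2, 0.2, 0.5, 0.1],
    [0.3, 0.3, 0.3, 0.1],
    [0.1, 0.1, 0.1, 0.7],
])
labels = np.array([1, 0, 2, 1, 3])  # one single label per image

# Top-1 error: fraction of images whose highest-scoring class is wrong.
top1_error = np.mean(np.argmax(scores, axis=1) != labels)
print(top1_error)  # 0.2 (one of five images misclassified)
```

On the real challenge, performance is usually reported the same way (often as top-5 error, where a prediction counts as correct if the true label is among the five highest-scoring classes).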
So yeah, you can basically download them.
They have one single image, one single label, and so of course they are sometimes a little...
Presenters
Accessible via
Open access
Duration
01:22:04 min
Recording date
2019-11-14
Uploaded on
2019-11-19 15:29:02
Language
de-DE