Welcome to our deep learning lecture. We are now in part four of the introduction.
Now in this fourth part we want to talk about machine learning and pattern recognition.
And first of all we have to introduce a bit of terminology and notation.
We are standing on the shoulders of giants who, in the past, simplified the problem of problem solving so much that we now have a chance to take the final step.
So throughout this entire lecture series we will use the following notation.
So matrices are bold and uppercase, so examples here are M and A.
And vectors are bold and lowercase, examples v and x. Scalars are italic and lowercase: y, w, alpha.
And for the gradient of a function we will use the gradient symbol.
For partial derivatives we will use the partial notation.
Furthermore we have some specifics about deep learning.
So the trainable weights will generally be called W.
Features and inputs are x; they're typically vectors.
Then we have the ground truth label, which is y.
And we have some estimated output, which is y hat.
And if we have some iterations going on, we typically write the iteration index as a superscript and put it into brackets.
So x with a superscript (i) denotes the value of the variable x at iteration i.
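To summarize, the conventions just described can be written compactly; this is a sketch assembled from the statements above:

```latex
% Summary of the notation introduced above
\mathbf{M}, \mathbf{A}       % matrices: bold, uppercase
\mathbf{v}, \mathbf{x}       % vectors: bold, lowercase
y, w, \alpha                 % scalars: italic, lowercase
\nabla f(\mathbf{x})         % gradient of a function f
\partial f / \partial x      % partial derivative
y, \hat{y}                   % ground truth label and estimated output
\mathbf{x}^{(i)}             % value of the variable x at iteration i
```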
Of course this is a very coarse notation and we will further refine it throughout the lecture.
The stuff that works best is really simple.
If you have attended previous lectures of our group, then you should know the classical image processing and pattern recognition pipeline.
First, we do the recording, where sampling and analog-to-digital conversion take place.
Then you have the pre-processing, feature extraction followed by classification.
And of course in the classification step you have to do the training.
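The stages of this pipeline can be sketched in a few lines of code. This is a hypothetical illustration, not an implementation from the lecture; the function names and the simple statistics used as features are made up for this sketch:

```python
# Illustrative sketch of the classical pattern recognition pipeline:
# pre-processing -> feature extraction -> classification.
import numpy as np

def preprocess(signal):
    """Normalize the digitized signal to zero mean, unit variance."""
    signal = np.asarray(signal, dtype=float)
    return (signal - signal.mean()) / (signal.std() + 1e-12)

def extract_features(signal):
    """Hand-crafted features, e.g. simple statistics of the signal."""
    return np.array([signal.min(), signal.max(), np.abs(signal).mean()])

def classify(features, weights, bias):
    """Linear classifier: sign of a weighted feature sum."""
    return 1 if features @ weights + bias > 0 else -1

# Recording, sampling, and A/D conversion would happen before this point.
x = preprocess([0.1, 0.9, 0.3, 0.7])
f = extract_features(x)
label = classify(f, weights=np.array([0.2, 0.5, -0.1]), bias=0.0)
```

In the classical pipeline only the weights of the classifier are learned during training; the pre-processing and feature extraction steps are designed by hand.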
The first part of the pattern recognition pipeline is covered in our lecture Introduction to Pattern Recognition.
The main part, classification, is covered in Pattern Recognition.
So that's stuff that's really stood the test of time.
Now what you see in this image is a classical image recognition problem.
Let's say you want to differentiate apples from pears.
Now, one idea is to fit an ellipse around each fruit and then measure the lengths of the major and minor axes.
You will recognize that apples are round while pears are elongated.
So their ellipses differ in the ratio of major to minor axis.
Now you could take those two numbers and represent them as a vector.
Then you enter a two-dimensional space, which is basically a vector space representation summing up the input from all sensors.
If you now plot these vectors, you will find that all of the apples are located on the diagonal.
Because if their diameter in one direction increases also the diameter in the other direction increases.
Your pears, however, lie off this straight line, because their major and minor axes differ.
Now you can find a line that separates the two classes, and there you have your first classification system for, say, image recognition.
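The apples-versus-pears idea can be sketched with a simple perceptron, the classical linear classifier that this lecture later reviews. The axis measurements below are invented for illustration; only the update rule (adjust the weights whenever a sample is misclassified) is the actual technique:

```python
# Toy sketch: separate "apples" from "pears" using major/minor axis
# lengths as a 2-D feature vector and a perceptron. Data is made up.
import numpy as np

# Features: (major axis, minor axis). Apples lie near the diagonal
# (roughly round), pears lie off it (elongated).
X = np.array([[7.0, 6.8], [8.1, 7.9], [6.5, 6.4],    # apples
              [9.0, 5.5], [10.2, 6.0], [8.8, 5.2]])  # pears
y = np.array([-1, -1, -1, 1, 1, 1])  # -1 = apple, +1 = pear

# Perceptron training: update only on misclassified samples.
w = np.zeros(2)
b = 0.0
for _ in range(1000):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified
            w += yi * xi
            b += yi

predictions = np.sign(X @ w + b)
```

Because the two classes are linearly separable here, the perceptron convergence theorem guarantees that the updates eventually stop and the learned line separates apples from pears.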
Now, how many people think big data processing works is shown in this small figure.
So is this your machine learning system?
Yep.
Pour the data into this big pile of linear algebra then collect the answers on the other side.
And what if the answers are wrong?
Just stir the pile until they start looking right.
If you're a development engineer or if you have the development build like I do then you can see all the debug information.
But those would just be total gibberish to most people.
So what you can see in this picture is, of course, how many people think they can approach deep learning.
You just pour the data in, stir a bit at the end, and then you get the right results.
But that's not actually how it works.
Remember, what you want to do is build a system that performs the classification.
Deep Learning - Introduction Part 4. This video introduces the topic of Deep Learning by defining the course's notation, introducing fundamental principles of pattern recognition, and giving a short review of the perceptron.
Duration: 00:14:48. Recorded: 2020-04-14. Language: en-US. Open access.
Video references: Lex Fridman's channel.
Further reading: A Gentle Introduction to Deep Learning.