7 - Pattern Recognition [PR] - Q&A - Classification vs. Regression [ID:27100]

Welcome back to Pattern Recognition Q&A. I again received a couple of mails, and today I want to reply to some of your questions. Today's topic is classification and regression: we will see where the differences between the two lie, and we will look at a small regression problem and how to solve it with linear algebra.

You sent quite a few mails and many, many questions, and I had to select only a few of them. I also want to mention some questions that I won't answer in detail here, but that do show up regularly. One particular one is: "Hey, aren't you a data scientist and machine learning expert? I have this great idea about predicting stock prices." Well, no, I'm not going to reply to such emails, and I might even block you if you send things like this. I won't do any stock market price prediction, so you can save that email; I won't answer it. Questions like this also pop up: "Can you compute P(X | D)?" The answer is yes, I can compute it. Then you ask something like "Can you send me the answer?", and the answer is no, I won't do your homework and your home exercises. That is not what these videos are intended for, in particular if the exercises are not even related to this class. Of course, I will answer questions that are related to the material I talk about in these videos. I currently teach the class Pattern Recognition, and I also teach the class Deep Learning, and I will answer questions about the material we discuss in those videos. That's perfectly fine, but I won't solve your exercise problems, so you can save those mails as well. Sorry.

Now let's move on to something where we really did receive a couple of questions, and this actually concerns our class. One question that came up on several occasions was: what is the difference between classification and regression?

In classification, as we talked about in the first couple of videos, we want to find a decision boundary, and we actually assumed that the problem of regression is more or less known to you. The problem of regression is to fit a certain mathematical construct, say some kind of curve, maybe a polynomial, maybe a spline or something similar, to the data such that it is able to explain the data and follow it. This is a regression problem.
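A common way to make "fit it to the data such that it is able to explain the data" precise, and what the linear-algebra solution later in this video also computes, is the least-squares criterion. For a model f(x; θ) with parameters θ, for instance a polynomial of degree d, this reads

\hat{\theta} = \arg\min_{\theta} \sum_{i=1}^{n} \big( y_i - f(x_i; \theta) \big)^2,
\qquad f(x; \theta) = \theta_0 + \theta_1 x + \dots + \theta_d x^d .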

What you encounter very frequently is linear regression. In linear regression we have a set of points, and we want to fit a line through them. You already see that I have an equation here, y = mx + t; this is the kind of model that we try to fit. We have the observed data x and y, and we want to fit the unknown parameters m and t. Of course, to do that I have to tell you which variable is which, and this is explained by the axes: we have the y axis and the x axis, and with every point I essentially have one observation of a pair of x and y values. What we try to find are the m and t that match these observations best. We have many such observations, so every point in the diagram gives me one equation, and if I rearrange this, we end up with a system of equations.
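Reconstructed from the description above, the stacked per-point equations look like this:

\begin{aligned}
y_1 &= m\,x_1 + t \\
    &\;\;\vdots \\
y_n &= m\,x_n + t
\end{aligned}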

You see I have y_1, which is associated with x_1, and I have y_n, where n is the number of points, which is associated with x_n. So we have n equations, and now that we have many of those equations we can reformat them; the result is a system of linear equations. The y's can be written into a vector of y's; then we can separate the product and the sum on the right-hand side and express everything by means of a matrix multiplication. If you take the first row and compute the inner product with the vector (m, t), so we start with the row containing y_1 and x_1, and if I expand the matrix by essentially a column of ones, you see that the first row of this matrix-vector product gives exactly the first equation. If I do the same for all the other rows, I get exactly the equations above that we already obtained from the points.
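In matrix notation, reconstructed from the description of the column of ones and the vector (m, t), this reads:

\begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix}
=
\begin{pmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{pmatrix}
\begin{pmatrix} m \\ t \end{pmatrix}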

So we can rewrite this a little, and you see that it can be expressed completely using matrix calculus: we now have a vector y, a matrix X, and some parameter vector β, so the system reads y = Xβ. What you see here is that we can observe all the entries of y and X, while β contains the unknown parameters we want to estimate.
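A minimal sketch of how such a linear regression can be solved with linear algebra in practice, written here in NumPy with made-up data (this illustrates the y = Xβ formulation above as a least-squares problem, not necessarily the exact derivation shown in the video):

import numpy as np

# Hypothetical observations (x_i, y_i); in the video these are the points in the diagram.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix X = [x | 1]: the column of ones carries the intercept t,
# so that X @ [m, t] equals m*x + t for every observation.
X = np.column_stack([x, np.ones_like(x)])

# Least-squares solution of y = X beta.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
m, t = beta
print(f"slope m = {m:.3f}, intercept t = {t:.3f}")

# Equivalent closed form via the normal equations: beta = (X^T X)^{-1} X^T y.
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)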

Part of a video series

Accessible via: Open Access

Duration: 00:09:26 min

Recording date: 2020-12-22

Uploaded: 2020-12-22 12:09:54

Language: en-US

In this video, we look into the difference between classification and regression and show a simple example for linear regression.

This video is released under CC BY 4.0. Please feel free to share and reuse.

For reminders about new videos, follow us on Twitter or LinkedIn. Also, join our network for information about talks, videos, and job offers in our Facebook and LinkedIn groups.

Music Reference: Damiano Baldoni - Thinking of You
