6 - Graph-Based Semi-Supervised Learning [ID:33912]

Welcome to this lecture on graph-based semi-supervised learning.

In the last lecture, we introduced graphs, and you learned how to apply them to model different tasks in mathematical imaging, for instance, and in machine learning and data science in general.

So today we will speak about a more specific application of graphs, namely semi-supervised

learning.

At the beginning of the machine learning block, you already learned about different paradigms in learning.

So you might recall that there is unsupervised learning, fully supervised learning, semi-supervised learning, self-supervised learning: all different kinds of learning tasks.

And to wrap this up a little bit, I have prepared this slide here.

So in unsupervised learning, you are just given a data set, you don't have any information about the data set itself, and you somehow want to structure it.

And this is basically what we started this lecture series with.

This is a clustering task.

You would like to somehow subdivide your data set automatically into, let's say, two different

clusters.

And the techniques to do that are, for instance, k-means clustering or expectation-maximization clustering, and also a concept called spectral clustering, which is in fact also related to graphs.

But I will not speak about this today.

Okay, so unsupervised learning means you have some data set, you don't have any information

and you want to cluster it.
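To make the clustering idea concrete, here is a minimal k-means sketch in Python with NumPy. This code is not from the lecture; the two-blob toy data set and the farthest-point initialization are my own illustrative choices:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Lloyd's algorithm: alternate between assigning each point to its
    nearest centroid and recomputing centroids as cluster means."""
    rng = np.random.default_rng(seed)
    # farthest-point initialization: start from a random point, then
    # repeatedly pick the point farthest from all chosen centroids
    centroids = [X[rng.integers(len(X))]]
    for _ in range(1, k):
        d = np.linalg.norm(X[:, None] - np.array(centroids)[None], axis=2).min(axis=1)
        centroids.append(X[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):
        # assignment step: label each point with its nearest centroid
        labels = np.linalg.norm(X[:, None] - centroids[None], axis=2).argmin(axis=1)
        # update step: move each centroid to the mean of its cluster
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# toy data set: two well-separated blobs, no labels given
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(3.0, 0.3, (50, 2))])
labels, centroids = kmeans(X, k=2)
```

Note that the algorithm never sees any labels; it recovers the two groups purely from the geometry of the data, which is exactly the unsupervised setting described above.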

Okay, so the opposite is supervised learning, or fully supervised learning.

And there you are given a data set, but in addition, you have labels for the data set.

For every single data point, you know precisely what it should be.

So in this example, you precisely know whether it should belong to the blue class or to the

red class.

You have all these labels here.

And what you would like to do in supervised learning is to compute a function which extends these labels to the whole space.

So you're not just interested in the labels of the data points, because you already have those.

No, instead, you're interested in the labels for the whole space.

So you would somehow like to subdivide your space, in this case, into two different regions, one region corresponding to the blue class and the other one corresponding to the red class.

And in the last lectures, we've spent quite some time explaining a concept for solving supervised learning tasks, namely neural networks, because a neural network is just a parameterized function on the input space: you can feed in any input you want and get, let's say, the corresponding classification.

And more simply, you could also solve this with a linear or nonlinear regression.

OK, so this is supervised learning, where you would like to transfer the information that you have on a given data set onto the whole space.

Hence, if you get new data points which are not in your data set, you can just evaluate the parameterized function that you computed, for instance the neural network, and you will get a classification result for the new data points.

Good.
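As a sketch of this "extend the labels to the whole space" idea, and assuming Python with NumPy (this example is not part of the lecture), a k-nearest-neighbor classifier is perhaps the simplest such function: any query point in the space receives the majority label of its nearest labeled training points.

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k=3):
    """Classify each query point by majority vote among its k nearest
    labeled training points -- a simple way to extend the labels on the
    training set to arbitrary points of the space."""
    dists = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=2)
    nearest = np.argsort(dists, axis=1)[:, :k]
    votes = y_train[nearest]
    # majority vote per query point
    return np.array([np.bincount(v).argmax() for v in votes])

# labeled data: blue class (0) near the origin, red class (1) near (3, 3)
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
y_train = np.array([0] * 20 + [1] * 20)

# new, unseen points anywhere in the space get a label too
X_new = np.array([[0.1, -0.2], [2.8, 3.1]])
print(knn_predict(X_train, y_train, X_new))  # → [0 1]
```

A neural network plays the same role in the lecture's setting: it is a parameterized function over the whole input space, so new points can simply be evaluated.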

Part of a video series

Presenter: Leon Bungert

Access: Open access

Duration: 00:54:54 min

Recording date: 2021-06-07

Uploaded: 2021-06-07 21:58:18

Language: en-US
