4 - Exercise 2 Preview [ID:58103]

Alright, welcome to Introduction to Machine Learning, exercise session 4.

And today I will primarily cover the contents of Exercise 2, so the things you should know for Exercise 2.

Vincent also covers everything in the lecture, but he does it with a lot of mathematical rigor, with derivations and so on.

So he gives you both a high-level view and a low-level view, while I will primarily try to give you a high-level view of everything.

Some intuition too, maybe: how I understood it, what helped me learn it, and so on.

Alright, we have lots to cover, so let's start directly.

Let's start with convolutions.

What is a convolution?

Well a convolution, probably most of you have heard it somewhere, learned about it somewhere,

but here's the definition.

To convolve a kernel with an input signal, flip the signal, move it to the desired time, and accumulate every interaction with the kernel.

Okay, what does that mean?

Well, it sounds a bit complicated, and if you look at the formula, that also looks complicated, but it's actually quite easy.

It means: okay, we integrate over the entire time axis, sounds good, and we have two functions, F and G.
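In symbols, the continuous convolution being described is the standard definition (not quoted verbatim from the slide):

```latex
(F * G)(t) = \int_{-\infty}^{\infty} F(\tau)\, G(t - \tau)\, \mathrm{d}\tau
```

The $G(t - \tau)$ term is exactly the "flip and shift": the signal is mirrored in $\tau$ and moved to time $t$ before the overlap is integrated.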

Okay, and what do we do? We write the convolution with this star-shaped sign, the asterisk: a kernel F, for example, and then another signal, so two signals in theory, signal F and signal G. Then we flip our signal or our kernel; either one is fine.

So we flip one of them, that's what we're doing here: we just turn our signal around, mirroring it at the y-axis. Then we more or less slide our input signal, our green signal G, across from left to right, and at every shift we compute the overlap between our two signals, between our kernel and our signal.

And that overlap is the value of the convolved signal at each time point.

Okay, that sounds a bit complicated; it's hard to describe based on a formula and some text, so let's have a look.
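For the discrete example that follows, the same idea reads as a sum instead of an integral (standard discrete convolution, not spelled out in the recording):

```latex
(F * G)[n] = \sum_{k=-\infty}^{\infty} F[k]\, G[n - k]
```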

This is a discrete example, and what we see is that we have our kernel and our signal in red, and we move the kernel across the signal.

The first thing you will notice is that the kernel that moves, the one in the convolution row, is flipped, or mirrored, compared to our original kernel.

And then we simply move it across our signal, and you can see (I can't pause the animation, sadly) that we always just compute the overlap between our three points and the three points at the top.

If the three points at the top are zero, then the overlap is zero, because we just multiply each entry at the top with the corresponding entry at the bottom, and if that results in zero, then the output at that point is zero. Where the overlap is non-zero, we get a non-zero value, and that is also what we can see in our result here at the bottom.
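The flip-and-slide procedure just described can be sketched in a few lines of NumPy. The signal and kernel values here are made up for illustration; they are not the numbers from the animation on the slide.

```python
import numpy as np

# A hypothetical 1D signal and a three-point kernel (illustrative values).
signal = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0])
kernel = np.array([1.0, 2.0, 3.0])

# Manual "flip and slide": mirror the kernel, then at every shift
# multiply the overlapping entries and sum them up.
flipped = kernel[::-1]
pad = len(kernel) - 1                      # allow partial overlaps at the edges
padded = np.pad(signal, (pad, pad))
out_len = len(signal) + len(kernel) - 1    # length of the "full" convolution
manual = np.array([np.dot(padded[i:i + len(kernel)], flipped)
                   for i in range(out_len)])

# NumPy's convolve performs the flip internally, so both agree.
reference = np.convolve(signal, kernel, mode="full")
print(np.allclose(manual, reference))  # True
```

Note that where the window only covers zeros of the signal, the dot product (the "overlap") is zero, exactly as in the animation.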

I tried to find a GIF for discrete convolutions, because it's usually easier to understand.

This one is quite fast, sadly, but I hope you can still see the most important things.

We flipped our kernel, as you can see here; we move it across, and then we compute the overlap with our signal. Where the overlap is non-zero, our resulting convolution is non-zero as well.

Access: Open access
Duration: 00:54:15 min
Recording date: 2025-06-12
Uploaded: 2025-06-12 16:46:03
Language: en-US
Description: We explain the core concepts of exercise 2
Tags: Introduction, Python, Computer Vision, Machine Learning