13 - Deep Learning - Activations, Convolutions, and Pooling Part 1 [ID:16881]

Welcome back to deep learning. In today's lecture we want to talk about activations and convolutional neural networks. We will split this up into several videos: the first one will be about activation functions, and later we will talk about convolutional neural networks, convolution layers, pooling, and so on.

So let's start with activation functions. You can see that activation functions go back to a biological motivation: remember that everything we have been doing so far was somehow also motivated by the biological configuration, where neurons are connected via synapses to other neurons, and this way they can communicate with each other. The axons carry a myelin sheath that electrically insulates them, and this way the neurons are able to communicate with other cells.

Now, when they are communicating, they are not simply passing on everything that they receive; they have a selective mechanism. If there are some stimuli, it does not suffice to have just any signal; the total signal must be above some threshold. What then happens is that an action potential is triggered, the cell depolarizes and repolarizes, and then returns to the resting state. Interestingly, it does not matter how strongly the cell is activated: it always produces the same action potential and then returns to its resting state.

The actual biological activation is even more complicated: you have the different axons, and they are connected to the synapses of other neurons; along the path they are covered by Schwann cells, which help deliver the action potential towards the next synapse. There are ion channels that are used to stabilize the entire electrical process and bring the whole system back into equilibrium after the activation pulse.

So what we can see is that the knowledge essentially lies in the connections between the neurons. We have both inhibitory and excitatory connections, and the synapses anatomically enforce feed-forward processing, so it is very similar to what we have seen so far. However, those connections can go in any direction, so there can also be cycles, and you have entire networks of neurons that are connected via different axons in order to form different cognitive functions.

Crucial is the sum of activations: only if the sum of activations is above the threshold will you actually end up with an activation. These activations are electric spikes with a specified intensity. To be honest, the whole system is also time-dependent, and the neurons encode information over time: it is not just a single event that passes through, but the whole process runs at a certain frequency, and this enables processing over time.
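To make the threshold idea concrete, here is a minimal sketch (my own illustration, not from the lecture) of such an all-or-nothing threshold neuron; the weights, inputs, and threshold value are made-up numbers:

```python
import numpy as np

def threshold_neuron(x, w, theta):
    # Fire only if the weighted sum of the inputs exceeds the threshold.
    # The spike always has the same fixed intensity (here 1.0), no matter
    # how far the sum lies above the threshold: all-or-nothing behavior.
    return 1.0 if np.dot(w, x) > theta else 0.0

# Made-up values: two excitatory inputs and one inhibitory input.
x = np.array([1.0, 1.0, 1.0])
w = np.array([0.6, 0.5, -0.4])            # negative weight = inhibitory connection
print(threshold_neuron(x, w, theta=0.5))  # 0.6 + 0.5 - 0.4 = 0.7 > 0.5 -> 1.0
```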

Now to activations in artificial neural networks: so far, these were nonlinear activation functions, mainly motivated by universal function approximation. If we don't have the nonlinearities, we can't get a powerful network; without the nonlinearities, we would just chain matrix multiplication after matrix multiplication.
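A quick numerical sketch of this point (my own example, not part of the video): stacking two linear layers without an activation in between is exactly equivalent to a single linear layer.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1 = rng.normal(size=(4, 3))  # first layer, no activation
W2 = rng.normal(size=(2, 4))  # second layer, no activation

# Two stacked linear layers ...
two_layers = W2 @ (W1 @ x)
# ... collapse into one matrix multiplication with W = W2 @ W1.
one_layer = (W2 @ W1) @ x

print(np.allclose(two_layers, one_layer))  # True
```

So no matter how many linear layers we stack, the network can only ever represent a single linear map.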

Compared to biology, we have the sign function, which can model the all-or-nothing response, but generally our activations have no time component; maybe this could be modeled by the activation strength. The sign function, of course, is mathematically unusable, because the derivative of the sign function is zero everywhere except at zero, where it is infinite, so it is absolutely not suited for backpropagation. So far we've been using the sigmoid function, because we can compute an analytic derivative.
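The analytic derivative in question is σ'(x) = σ(x)(1 − σ(x)). A small sketch (my own example) that checks this identity against a numerical finite difference:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # analytic derivative: sigma(x) * (1 - sigma(x))

# Compare against a central finite difference at an arbitrary point.
x, h = 0.3, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(np.isclose(sigmoid_grad(x), numeric))  # True
```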

Now the question is: can we do better? So let's look at some activation functions, and the most simple one that we can think of is a linear activation, where we just take the

Part of a video series
Accessible via: Open access
Duration: 00:09:30 min
Recording date: 2020-05-30
Uploaded on: 2020-05-30 18:06:34
Language: en-US

Deep Learning - Activations, Convolutions, and Pooling Part 1

This video presents the biological background of activation functions and the classical choices that were used for neural networks.

Further Reading:
A Gentle Introduction to Deep Learning

Tags: Perceptron, Introduction, artificial intelligence, deep learning, machine learning, pattern recognition