This audio is presented by the University of Erlangen-Nürnberg.
So let's consider a matrix A.
In its general form, our matrix will be an element of R^(m × n).
And this means that we have m rows and n columns.
Okay, so we can write our matrix as a set of scalar values with some indexing.
So we index the first element here; this is a scalar value and has the index 1,1.
Then we can increase the second index to get a12, and so on, until we end at the value a1n.
And then in the second row we can do the same: we increase the other index and get a21, and going further down this first column we end at am1.
And of course you can fill in the rest in the same way around the diagonal, so this will be a22, and in the last position you get the index m,n, so amn.
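Written out, the element layout described here looks roughly like this (a sketch of the board notation, which is not captured in the recording):

```latex
A =
\begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix},
\qquad A \in \mathbb{R}^{m \times n}
```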
So this is simply a matrix, and we can actually rewrite this entire matrix as a set of column vectors.
So we could also write this matrix a bit shorter, as a set of vectors.
Let's say this is a vector a1, then we have a vector a2, and many more vectors until we end up at the vector an.
So we could also write the matrix like this.
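A sketch of this column-vector notation (assuming aj denotes the j-th column of A):

```latex
A = \begin{pmatrix} a_1 & a_2 & \cdots & a_n \end{pmatrix},
\qquad a_j \in \mathbb{R}^{m}, \quad j = 1, \dots, n
```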
So this is fairly easy, right? Everybody should know that.
Now we can start thinking about matrix multiplication.
So let's say you have a matrix A and you multiply it with a vector x.
What will happen? Well, you can do the math and carry out the element-wise computation,
or you can actually look at the structure of the matrix.
And what will actually happen is that you get a mixing of the column vectors:
each column vector gets weighted by the corresponding entry of the vector x.
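In formulas, this mixing reads roughly as follows (a sketch, using the column vectors a1, ..., an from above and the entries x1, ..., xn of x):

```latex
A x = \sum_{j=1}^{n} x_j \, a_j = x_1 a_1 + x_2 a_2 + \dots + x_n a_n
```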
But before we look at this mixing more closely, let's write the matrix in a second way.
We can write it as column vectors, but what we actually want to do now is rewrite it as a set of row vectors.
So we write a row vector a1 and actually put a transpose here, then a2 with a transpose, a3 with a transpose, and so on, until we end up with am transpose.
Then you have the entire matrix essentially as row vectors, and note that you also need the transposes here.
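A sketch of this row-vector notation (note that ai here denotes the i-th row of A, reusing the letter that was used for the columns above):

```latex
A =
\begin{pmatrix}
a_1^T \\
a_2^T \\
\vdots \\
a_m^T
\end{pmatrix},
\qquad a_i \in \mathbb{R}^{n}, \quad i = 1, \dots, m
```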
Okay, so if we do that, we can actually multiply this matrix with a vector x.
And what you will see now is that every row vector meets the vector x:
each entry of the result is the weighted sum of x with one of the rows, so the inner product ai transpose times x.
This is why we need the transposes here, and then we need to sum up over the entries.
So if you multiply with the vector x, you get one such inner product for every row, and the result is of course a vector again, with m entries.
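Written out, this row-wise view of the product looks roughly like this (again a sketch of the board notation):

```latex
A x =
\begin{pmatrix}
a_1^T x \\
a_2^T x \\
\vdots \\
a_m^T x
\end{pmatrix}
\in \mathbb{R}^{m},
\qquad
a_i^T x = \sum_{j=1}^{n} a_{ij} \, x_j
```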
So what I actually want to do now is write the vector x out in its components x1 up to xn.
And what I want to show you is that if we take the other form, the column vectors, then what you get is exactly the mixture from before: x1 times the first column plus x2 times the second column, and so on, up to xn times the last column.
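The board work itself is not captured here, but the two views of the product can be checked numerically; a minimal sketch in Python with NumPy, using made-up values for A and x:

```python
import numpy as np

# Example matrix A (m = 3 rows, n = 2 columns) and vector x; values are arbitrary.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([0.5, -1.0])

# Reference: the usual element-wise matrix-vector product.
reference = A @ x

# Row view: every entry of Ax is the inner product of one row of A with x.
row_view = np.array([A[i, :] @ x for i in range(A.shape[0])])

# Column view: Ax is a mixture of the columns of A, weighted by the entries of x.
column_view = sum(x[j] * A[:, j] for j in range(A.shape[1]))

print(np.allclose(reference, row_view))     # True
print(np.allclose(reference, column_view))  # True
```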