2 - Diagnostic Medical Image Processing (DMIP) 2010/11 [ID:1054]

Okay, so welcome to the Monday session.

I appreciate a lot that so many of you showed up again and hopefully not all of you are

still shopping around, but you already decided to go ahead with this lecture.

Today we still have a very, I would say, mathematical, theoretical topic because this is the basis

for many, many methods that we are going to discuss in the context of applications later on.

So do not worry about this.

This is something you have to know.

We browse through it.

We talk about these things.

And from tomorrow on, we are just applying it and using it.

And if you don't understand everything, you should at least keep in mind the set of slides

about the properties of the SVD, and then you should be able to start MATLAB and type

in svd of a matrix.

That's basically all you need to know for our lecture here.
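The lecturer suggests trying this directly in MATLAB's `svd`. As a minimal sketch of the same call, here is the NumPy equivalent (the example matrix is my own choice, not from the lecture):

```python
import numpy as np

# An arbitrary 3x2 example matrix (not from the lecture).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# np.linalg.svd returns U (3x3), the singular values, and V^T (2x2).
U, s, Vt = np.linalg.svd(A, full_matrices=True)

print(U.shape, s.shape, Vt.shape)
print(s)  # singular values, sorted in descending order
```

In MATLAB the corresponding call would be `[U, S, V] = svd(A)`; note that NumPy returns the transpose of V and the singular values as a vector rather than a diagonal matrix.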

We will not explain to you how this can be computed in terms of numerical methods.

We assume that we have a library that does all this for us.

We just need to know how to use the SVD of matrices for solving particular problems

related to image processing.

Here is, basically, the definition of the singular value decomposition.

It's wrapped up in a theorem, because we can decompose any matrix into this so-called

normal form, the SVD form.

We can do this with any matrix.

We can even do it for a one by one matrix.

We can do it for a rectangular matrix.

We can do it for a singular matrix.

We can do it for a regular matrix.

What is the difference between a singular and a regular matrix?

A singular matrix cannot be inverted.

Regular matrices can be inverted, and that means, what is the rank of a regular matrix?

For a regular 10 by 10 matrix, the rank is 10.

For regular matrices, the rank equals the number of column vectors, and they are all

linearly independent of each other.
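The rank distinction above can be checked numerically. A small sketch (the matrices are my own illustrations, assuming NumPy): a singular matrix has rank smaller than its size, a regular one has full rank.

```python
import numpy as np

# Singular: the second column is 2 times the first, so the columns
# are linearly dependent and the matrix cannot be inverted.
singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])

# Regular: the 10x10 identity has 10 linearly independent columns.
regular = np.eye(10)

print(np.linalg.matrix_rank(singular))  # 1, i.e. not full rank
print(np.linalg.matrix_rank(regular))   # 10, i.e. full rank, invertible
```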

What does this decomposition look like?

We can take a matrix.

It's an m by n matrix.

So we have m rows and n columns, and this matrix can be decomposed as follows.

We can find an m by m matrix U that is orthogonal.

What does it mean?

What's an orthogonal matrix?

The column vectors are orthogonal to each other.

So the inner product of any two distinct column vectors is zero.

And the other fact that we associate with orthogonal matrices is that the column vectors

are unit vectors.

And if we come back again to the geometric interpretation we discussed last week,

the unit circle is mapped to the unit circle by an orthogonal matrix.

And we also have an orthogonal matrix, this one n by n.

It's called V. And then we have a diagonal m by n matrix where only the diagonal

elements are non-zero and all the other elements are zero.
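The properties just described can be verified numerically. A small sketch, assuming NumPy (the random test matrix is my own illustration): U is m by m orthogonal, V is n by n orthogonal, Sigma is an m by n diagonal matrix, and their product reconstructs A.

```python
import numpy as np

# A random m x n test matrix with m = 4, n = 3 (fixed seed for reproducibility).
A = np.random.default_rng(0).standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# U is 4x4 orthogonal and V is 3x3 orthogonal: Q^T Q = I.
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(Vt @ Vt.T, np.eye(3))

# Build the 4x3 "diagonal" matrix Sigma from the singular values
# and check the decomposition A = U Sigma V^T.
Sigma = np.zeros((4, 3))
Sigma[:3, :3] = np.diag(s)
assert np.allclose(U @ Sigma @ Vt, A)
print("decomposition verified")
```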

Accessible via: Open Access

Duration: 00:44:20 min

Recording date: 2010-10-25

Uploaded on: 2011-04-11 13:53:28

Language: de-DE
