Back to the serious lecture and the serious life. We are talking about
diagnostic medical image processing this semester, and by now we know mostly
everything about the different modalities and the physics behind them.
Currently we are in a chapter that I intend to finalize as soon as possible,
because we don't want to spend the whole lecture on image pre-processing. We
are in the chapter on acquisition-specific pre-processing: we talked about
X-ray imaging and the different detector technologies used there, and we know
the artifacts implied by this technology. Currently we are talking about MR
imaging and how to eliminate artifacts in MR images that are due to
inhomogeneities. In terms of algorithms,
methods, and principles, we have learned a lot. Motivated by these very
specific problems in medical imaging, we considered important algorithmic and
computer science concepts that we can use for tons of applications and
problems that show up in image processing in general. We learned about the
basic tool for doing linear algebra on a Matlab level: if we just want to find
out some properties of matrices and things like that, we use the singular
value decomposition, and that does a lot for us. We learned about least-square
estimators; about parameterization, fair parameterization, and singularities
in parameterizations; about bootstrapping; about the Fourier transform and the
convolution theorem; about deconvolution; and about methods to do
interpolation, such as bilinear interpolation. So we did
quite a lot, and currently we are discussing how to eliminate low-frequency
corruptions of the signal, the intensity inhomogeneities or distortions in MR
images, and we have discussed a bunch of methods that are useful for that. For
instance, we talked about low-pass filtering with a huge kernel and using the
difference between the low-pass filtered image and the original one. We talked
about homomorphic unsharp masking. And we talked about fitting parametric
surfaces; we all know how to do that: set up a least-squares objective
function and compute the estimate by the standard solution of a linear system
of equations. And
yesterday we discussed a statistical method that is related to the KL
divergence. The KL divergence is to PDFs what the sum of squared differences
is to numbers: where the latter compares the similarity of numbers, the KL
divergence compares PDFs. Now we can compute the similarity of density
functions. So if we have two PDFs, let's say p(x) and q(x), we can compute
their similarity by doing this here.
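As a minimal sketch of that computation in Python (my own illustration, not code from the lecture): estimate the two PDFs by normalized histograms and evaluate D_KL(P || Q) = sum_x p(x) log(p(x)/q(x)). The tiny pseudo-count added to empty bins is an assumption on my part to keep the logarithm defined.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_x p(x) * log(p(x) / q(x)).

    p and q are histogram counts (or any non-negative weights).
    A tiny pseudo-count eps keeps the logarithm defined when a
    bin is empty -- one simple workaround, not the only one."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()          # normalize counts to a discrete PDF
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Estimate the two PDFs from samples via histograms on a common grid.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 10_000)
b = rng.normal(0.5, 1.0, 10_000)
bins = np.linspace(-5.0, 5.0, 51)
p_counts, _ = np.histogram(a, bins=bins)
q_counts, _ = np.histogram(b, bins=bins)

d_ab = kl_divergence(p_counts, q_counts)  # positive: the PDFs differ
d_aa = kl_divergence(p_counts, p_counts)  # zero: identical PDFs
```

Note that the measure is zero only for identical densities and grows as the two PDFs drift apart; it is not symmetric in general, so D_KL(P || Q) and D_KL(Q || P) need not agree.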
Computing the KL divergence gives us a similarity measure for these two
density functions. But there are a few problems. Yesterday I just told you how
fascinating this is and how wonderfully it works, and I told you how to
compute histograms, but in practice this is not that easy to use. For
instance, if you have histograms with zero entries, this has to be handled
somehow. So right after Christmas we will talk about estimating histograms
that are suitable for evaluating the KL divergence. It is not as
straightforward as it appeared yesterday, when I did my magic here and just
said: use it, it's wonderful. If you use it, you will see right away that it
is not as wonderful as I described it in the lecture. Good. A problem that
currently troubles me a lot is: what is the advantage of using this? That is
not well understood in the literature, and maybe some of you will come up with a good
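Coming back to the first of the correction methods recalled above: here is a minimal sketch of homomorphic bias correction in Python, assuming a multiplicative inhomogeneity and using scipy's uniform_filter as the huge low-pass kernel. The kernel size and the synthetic bias field are my choices for illustration, not the lecture's.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def homomorphic_correction(image, kernel_size=31, eps=1e-6):
    """Estimate a smooth multiplicative bias field with a large mean
    filter in the log domain, subtract it, and exponentiate back."""
    log_img = np.log(image + eps)                         # multiplicative bias -> additive
    low_freq = uniform_filter(log_img, size=kernel_size)  # huge-kernel low-pass estimate
    return np.exp(log_img - low_freq)                     # remove the low-frequency part

# Synthetic demo: flat "tissue" corrupted by a smooth multiplicative bias.
_, x = np.mgrid[0:64, 0:64]
bias = 1.0 + 0.5 * np.sin(x / 20.0)   # slowly varying inhomogeneity
biased = 100.0 * bias                 # uniform tissue of intensity 100
corrected = homomorphic_correction(biased, kernel_size=31)
```

Subtracting in the log domain corresponds to dividing the image by the estimated bias field, which is why the corrected result comes out roughly flat around 1.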
Presenters
Accessible via
Open access
Duration
00:56:17 min
Recording date
2010-11-30
Uploaded on
2011-04-11 13:53:29
Language
de-DE