Lecture 7: Parallel Imaging III: Non-Cartesian Imaging and Iterative Reconstruction [ID:46137]

All right, awesome.

We made it.

The hardest part of this lecture is always getting this Zoom slide sharing going.

Okay, so welcome back everybody to the second-to-last lecture before the winter break.

We're going to continue with the third part of parallel imaging, but today covers one particularly important aspect, because we are going to talk about gradient-based iterative optimization methods for the first time.

And this is a very important topic because pretty much everything we do after this will be, or can be seen as, a variant of this approach.

We just make it more complex.

We're going to introduce nonlinearities, we're going to introduce machine learning methods,

but it's all going to be based on this same core.

And from the point of view of parallel imaging, which is the application that motivates all of this, what we do today consolidates what we did in the last couple of lectures.

Remember, we talked about how hard it is to do parallel imaging with non-Cartesian trajectories: with GRAPPA it doesn't really work at all, and with SENSE it didn't really work within the framework we were discussing, because we were looping through the pixels of our matrix and decoupling this unfolding procedure.

Today, we're going to learn about a general way of treating this problem where we can

solve for arbitrary sampling patterns, and that's going to be an iterative method.

Yeah, so this is our overview for today.

First, we're going to start with a little bit of a recap of parallel imaging, and of non-Cartesian parallel imaging in particular, just to motivate the problem. Then we are going to talk about two iterative algorithms, the gradient descent algorithm and the method of conjugate gradients, and those are the two that you will implement on Thursday and then run some numerical experiments with.

We will look at the convergence behavior of these two, so this is going to be a fun exercise on Thursday.
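To make that concrete, here is a minimal NumPy sketch of the two algorithms for a generic linear least-squares problem. This is my own illustration, not the official Thursday exercise code, and all the names are assumptions; note that plain gradient descent only converges if the step size alpha is small enough (below 2 divided by the largest eigenvalue of A^H A).

```python
import numpy as np

# Both routines solve the normal equations A^H A x = A^H y
# for a linear model y = A x, with complex-valued data.

def gradient_descent(A, y, alpha, n_iter):
    """Minimize 0.5 * ||A x - y||^2 by stepping along the negative gradient."""
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(n_iter):
        grad = A.conj().T @ (A @ x - y)  # gradient of 0.5 * ||A x - y||^2
        x = x - alpha * grad
    return x

def conjugate_gradients(A, y, n_iter):
    """Solve A^H A x = A^H y with the conjugate gradient method."""
    x = np.zeros(A.shape[1], dtype=complex)
    r = A.conj().T @ y            # residual b - A^H A x (x starts at zero)
    p = r.copy()                  # first search direction
    rs_old = np.vdot(r, r).real
    for _ in range(n_iter):
        Ap = A.conj().T @ (A @ p)
        step = rs_old / np.vdot(p, Ap).real
        x = x + step * p
        r = r - step * Ap
        rs_new = np.vdot(r, r).real
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x
```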

And then in particular, I'm going to show you a little bit about the method that is generally called CG-SENSE, conjugate gradient SENSE.

This is what I would say was the first modern iterative optimization-based reconstruction in MR, the one that kind of paved the way for everything that came after: compressed sensing, nonlinear methods.

Again, from Klaas Pruessmann from ETH in Zurich, the same author as the SENSE paper.

So he started with the simpler Cartesian SENSE, then found out where the limits were, and two years later he came up with this publication.

And I think this is a pretty influential one.
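For reference, and in my own shorthand rather than the paper's exact notation, the problem CG-SENSE solves can be written compactly: each coil c sees the object x through its sensitivity map S_c and the (possibly non-Cartesian) Fourier sampling F_Omega, and all coils are stacked into one encoding operator E.

```latex
y_c = F_{\Omega} S_c\, x \quad (c = 1,\dots,N_c)
\qquad\Rightarrow\qquad
\hat{x} = \arg\min_{x} \| E x - y \|_2^2
\qquad\Rightarrow\qquad
E^{H} E\, \hat{x} = E^{H} y
```

The normal equations on the right are exactly the kind of linear system the conjugate gradient sketch above solves, without ever forming E^H E explicitly, which is what makes arbitrary sampling patterns tractable.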

And then at the very end, I have a little bit of a more modern research follow-up that talks about reproducible science, because it's connected to this paper.

So this is a little bit beyond core lecture stuff, but hopefully you'll find it interesting.

Okay, so let's get started with the recap of parallel imaging.

And if you have any questions here about any parallel imaging related things that we have

been discussing, please just let me know immediately.

What we have been doing so far, and you heard it twice, with SENSE and with GRAPPA, is this idea that we sub-sample, meaning we increase the distance between our k-space lines.

That means we can measure fewer lines while still maintaining the same maximum point in k-space, which defines our resolution.

And because we are now violating Nyquist, because we increase the distance of these points in the Fourier domain, the field of view shrinks, so we get these back-folding artifacts.
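As a quick numerical illustration of that back-folding (a toy sketch of my own, not from the lecture materials; the phantom and names are made up):

```python
import numpy as np

# Undersampling k-space by keeping only every R-th line halves the
# field of view for R = 2, so the image folds onto itself.
N, R = 128, 2                      # matrix size and acceleration factor

# A simple off-center square "phantom" so the fold-over is visible.
x = np.zeros((N, N))
x[20:50, 40:90] = 1.0

k = np.fft.fftshift(np.fft.fft2(x))     # fully sampled k-space
k_sub = np.zeros_like(k)
k_sub[::R, :] = k[::R, :]               # keep every R-th phase-encode line

alias = np.abs(np.fft.ifft2(np.fft.ifftshift(k_sub)))
# 'alias' now shows the square plus a replica shifted by N/R pixels
# along the undersampled axis: the classic back-folding artifact.
```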

Accessible via: Open access
Duration: 01:35:58 min
Recording date: 2022-12-13
Uploaded on: 2022-12-13 20:06:04
Language: en-US
Tags: Medical Imaging, MRI, Inverse Problems, Numerical Optimization, Machine Learning, Deep Learning