Alright.
Thanks for the introduction.
The key content I was planning to talk about is differentiable simulations.
In the meantime, since I agreed to do the talk, I have slightly shifted the goal,
and I want to also put quite some emphasis, probably about half and half, I think,
on diffusion modeling.
I think especially the combination is actually a very interesting topic that keeps my group
quite busy at the moment.
And I think it's an exciting direction within this larger deep learning field.
Hence the coupling between the two.
Alright, let me start very broadly.
So I used to start these talks by trying to convince people that deep learning is worth
looking at at all.
I'd be happy to go into more detail here, but by now I'm seeing across many fields that
there is some consensus that it's a tool that's worth looking at, that it's not completely
off track.
And in a way, it's just another tool in the toolbox, and visually you can imagine it
as a shiny new hammer that more or less dropped out of nowhere, seemingly, and it raises
a lot of questions and interesting research directions: figuring out how exactly to
employ it and where it pays off most.
But I think by now, across many people, we have a consensus that it's at least
worth looking at as one alternative to numerical methods.
And at the same time, working with simulations, I'll be focusing on fluids a bit, as already
visible here on the right.
Many of the physical systems we study actually have uncertainties.
Here is this fluid smoke example in motion; closely related are flows with different
interfaces, typically modeled as two fluids or liquids.
Classical applications could be airfoil flows with lift and drag calculations.
Many settings here are fairly well understood, but have uncertainties, be it in terms of
the inherent randomness of the process or the potentially incomplete or ambiguous measurements
that we have of certain states, typically dubbed aleatoric uncertainty.
And this stands in contrast to the uncertainties and errors that we have in our models and
representations that we form in the computer.
That's typically the epistemic uncertainty.
And in the following, I actually want to focus on the first kind, on these aleatoric uncertainties,
so the ambiguous states and the uncertainties that we have in the physical descriptions, and
not so much on the second kind.
The latter is in a way a bit closer to the actual learning: with more data and refinements
you can reduce the uncertainties there. Instead, I'll be assuming that, for example,
we have an incomplete observation of a flow and we are then not sure how exactly this measured
state will evolve a few seconds down the road, or how exactly it would, on average,
lead to the drag that we experience on an airfoil, so cases like this.
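As a minimal sketch of this idea (not from the talk; all names and the drag functional are hypothetical placeholders), aleatoric uncertainty can be pictured as sampling a probabilistic model several times from the same incomplete observation and averaging a quantity such as drag over the samples:

```python
# Hypothetical sketch: the same partial observation can lead to many plausible
# outcomes, so averaged quantities are estimated over repeated samples.
import torch

def sample_future_state(partial_obs: torch.Tensor, noise_dim: int = 8) -> torch.Tensor:
    """Stand-in for a probabilistic surrogate: partial observation plus a
    random latent code gives one plausible future state."""
    z = torch.randn(noise_dim)              # source of the inherent randomness
    return torch.cat([partial_obs, z])      # placeholder "model"

def estimate_mean_drag(partial_obs: torch.Tensor, n_samples: int = 100) -> torch.Tensor:
    """Average a placeholder drag functional over sampled future states."""
    drags = []
    for _ in range(n_samples):
        future = sample_future_state(partial_obs)
        drags.append(future.pow(2).mean())  # placeholder drag estimate
    return torch.stack(drags).mean()

partial_obs = torch.randn(16)               # incomplete flow measurement
print(estimate_mean_drag(partial_obs))
```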
All right, in this context of uncertainties and randomness, I directly want to introduce
the basics of these diffusion models.
So let me start with the basic equations here.
We have some quantity of interest, typically y, and a function f(x) that constitutes it,
so y = f(x).
In the following, we typically assume that we approximate this with a neural network with some
weights theta, and we have some approximation error.
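To make that concrete, here is a minimal sketch (my own illustration, not code from the talk; the target function is an assumed example): a network f_theta with weights theta is trained to approximate y = f(x), and the remaining loss is the approximation error.

```python
# Minimal sketch: approximate a quantity of interest y = f(x) with a
# neural network f_theta and observe the remaining approximation error.
import torch
import torch.nn as nn

f = lambda x: torch.sin(3.0 * x)          # assumed example target function f(x)

f_theta = nn.Sequential(                   # network with weights theta
    nn.Linear(1, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

opt = torch.optim.Adam(f_theta.parameters(), lr=1e-3)
for step in range(2000):
    x = torch.rand(256, 1) * 2.0 - 1.0     # sampled inputs x
    y = f(x)                               # quantity of interest y
    loss = ((f_theta(x) - y) ** 2).mean()  # approximation error
    opt.zero_grad(); loss.backward(); opt.step()

print("remaining approximation error:", loss.item())
```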
Date: Wed. June 21, 2023
Event: FAU DCN-AvH Seminar
Organized by: FAU DCN-AvH, Chair for Dynamics, Control, Machine Learning and Numerics – Alexander von Humboldt Professorship at FAU, Friedrich-Alexander-Universität Erlangen-Nürnberg (Germany)
Title: Differentiable Physics Simulations for Deep Learning
Speaker: Prof. Dr. Nils Thürey
Affiliation: TUM, Technical University of Munich (Germany)
SEE MORE:
https://dcn.nat.fau.eu/differentiable-physics-simulations-for-deep-learning/