2 - FAU MoD Mini-workshop: AI in Mechanics [ID:58169]

We can pass to the second speaker, Hagen Holthusen, who is a postdoc at FAU at the Institute of Applied Mechanics.

Thank you.

Okay, thank you for this warm introduction. My name is Hagen Holthusen, and I want to give you a talk about AI in mechanics.

It's not such an in-depth talk; it's more of an overview of what is going on in the mechanics community, and I should say in the continuum mechanics community.

So not fluids, it's more on solids and of course this is joint work together with some of my colleagues from Aachen, from Stanford and also from Hamburg.

So this talk is structured as follows. First we start with AI and material modeling, and I guess most of you are mathematicians.

So material modeling means you have a material and it's subjected to some kind of load and it exhibits some stress behavior.

And this could be different if you think about rubber in contrast to steel. So this is the very first part of the talk.

The second is about AI and model order reduction, where we look at entire systems and we want to speed up the computation time.

And lastly it's more of an outlook: what we think about where AI will go in mechanics, and we call this AI and mechanical thinking.

So let's start with the very first part. As engineers we always start with three equations. The first is the so-called equilibrium.

If you think about forces, so Newtonian mechanics, then forces should be in equilibrium.

Second, we have the kinematics, which relates our solution variable, in our case the displacement, to some gradient of the solution. And lastly, the material law.

And this is more or less heuristic. We know some mathematical properties it should have, but it is not as strict as the first equations.

So the equilibrium and the kinematics are rigorously established; the material law you must find somehow.
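
In symbols, a standard finite-strain form of these three equations (not written out in the talk, so take this as the generic textbook version) is:

```latex
\underbrace{\operatorname{Div}\mathbf{P} + \mathbf{b} = \mathbf{0}}_{\text{equilibrium}}
\qquad
\underbrace{\mathbf{F} = \mathbf{I} + \nabla\mathbf{u}}_{\text{kinematics}}
\qquad
\underbrace{\mathbf{P} = \hat{\mathbf{P}}(\mathbf{F})}_{\text{material law (to be found)}}
```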

And one of the famous approaches in AI is the so-called physics-informed neural network.

So you have a normal neural network where you have a loss, which is for example the mean squared error between your prediction and the thing you measured.

So for example the solution field, the displacement. And what they do in physics-informed neural networks is add an additional physics loss, and this is nothing but the PDE.

So they include the PDE and so the network is informed with the underlying physics.
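
As a concrete illustration, here is a minimal PINN-style loss in PyTorch. The toy 1D bar equation EA u'' + b = 0, the network size, and all variable names are assumptions for the sketch, not the speaker's actual setup.

```python
# Minimal PINN-style loss sketch in PyTorch; the 1D bar equation and all
# names are illustrative assumptions, not the setup from the talk.
import torch

model = torch.nn.Sequential(            # u_theta(x): displacement surrogate
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def pinn_loss(x_data, u_data, x_col, EA=1.0, b=1.0):
    # data loss: mean squared error between prediction and measured displacement
    loss_data = torch.mean((model(x_data) - u_data) ** 2)

    # physics loss: residual of EA * u'' + b = 0 at collocation points x_col
    x = x_col.clone().requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    loss_pde = torch.mean((EA * d2u + b) ** 2)

    return loss_data + loss_pde         # in practice usually weighted
```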

The problem is a little bit that you must know the material law in that case.

So to somehow connect the PDE or to put this PDE into the loss, you have to know the physical law or the material law and this is not known a priori.

So what we are focusing more on is, okay, how can we find this material law?

And what we do is we look at the second law of thermodynamics, which states nothing but that if you put ice into water, then the ice will melt, and not that all the water turns into ice.

And if you work everything out, you get a Helmholtz free energy with these properties.

So it must always be greater than or equal to zero, it must be zero for no deformation, and its derivative must also be zero there.

And then you can connect the stress to this Helmholtz free energy.
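
Written out (these are the standard hyperelastic normalization conditions and stress relation; the exact form on the slides is an assumption here), the properties read:

```latex
\psi(\mathbf{F}) \ge 0, \qquad
\psi(\mathbf{I}) = 0, \qquad
\left.\frac{\partial \psi}{\partial \mathbf{F}}\right|_{\mathbf{F}=\mathbf{I}} = \mathbf{0}, \qquad
\mathbf{P} = \frac{\partial \psi}{\partial \mathbf{F}}
```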

And a scalar function is much easier to handle, at least for us, than a stress tensor.
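
This is also why a scalar energy is convenient in practice: the stress follows from a single automatic differentiation. A minimal sketch in PyTorch, where psi_net stands for any scalar-valued energy network (name and setup are illustrative):

```python
# Sketch: stress as the derivative of a scalar energy, via automatic
# differentiation in PyTorch. psi_net is any scalar-valued network of the
# deformation gradient F; names are illustrative.
import torch

def first_piola_stress(psi_net, F):
    F = F.clone().requires_grad_(True)        # F: (3, 3) deformation gradient
    psi = psi_net(F)                          # scalar Helmholtz free energy
    P, = torch.autograd.grad(psi, F, create_graph=True)  # P = d(psi)/dF
    return P
```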

And what we then do is what you can call constitutive artificial neural networks, physics-augmented neural networks, or thermodynamics-based artificial neural networks.

There are a lot of names in the literature, but they all do the very same thing.

They plug in the deformation gradient.

So this comes from kinematics, where we have the gradient of our solution field, pass this through some network, and we manipulate the architecture.

And we manipulate it in such a way that it fulfills certain requirements.

So the first three are due to objectivity.

So, for example, you and I must measure the very same energy, regardless of where we're standing or how we are rotated.

And it must be so-called polyconvex.

It must be convex with respect to all the invariants of the deformation.

And if you do so, then you at least guarantee that you a priori satisfy the physics.
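
A minimal sketch of such a constrained architecture is given below. The choice of invariants, the layer sizes, and the omission of the normalization terms (so that energy and stress vanish at F = I) are simplifications for illustration, not the exact architecture from the talk.

```python
# Sketch of an invariant-based, convexity-constrained energy network
# (CANN/PANN flavour); invariants, sizes, and missing normalization terms
# are simplifications.
import torch

class InvariantEnergy(torch.nn.Module):
    def __init__(self, width=8):
        super().__init__()
        self.w1 = torch.nn.Parameter(torch.rand(3, width))
        self.w2 = torch.nn.Parameter(torch.rand(width, 1))

    def forward(self, F):
        # objectivity: the energy sees F only through invariants of C = F^T F
        C = F.T @ F
        I1 = torch.trace(C)
        I2 = 0.5 * (I1 ** 2 - torch.trace(C @ C))
        J = torch.det(F)
        x = torch.stack([I1 - 3.0, I2 - 3.0, (J - 1.0) ** 2])
        # non-negative weights and the convex, non-decreasing softplus keep
        # the output convex in these invariant-based inputs
        h = torch.nn.functional.softplus(x @ self.w1.abs())
        return (h @ self.w2.abs()).squeeze()    # scalar energy psi(F)
```

Combined with the first_piola_stress sketch above, the stress computed from such a network inherits these structural guarantees automatically.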

And this is what we do next.

So this is a work by Kevin Linka from 2021, where they investigated rubber materials and they constructed such a neural network and made it arbitrarily large.

And we had some data of the rubber, so we either stretched the rubber or we sheared it.

And we plugged this in and trained the neural network on this.

In the first case here, it's a single-load training.

This means, okay, the neural network sees either the stretching, the shearing, or some biaxial tension.

So you stretch the material in both directions.

What you see is that if you feed the neural network with more and more data, so if you use something we call multi-load training, then you get a somehow unique solution.

So fewer terms in the neural network are active.

And this is what we consider to be a unique solution, at least from an engineering point of view.
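
To make the single-load versus multi-load distinction concrete, here is a rough sketch of how such a fit could look, reusing the energy and stress sketches from above; the data layout, optimizer, and loop are assumptions, not the original training procedure.

```python
# Rough sketch of multi-load training: one energy network is fitted against
# stress data from several load cases at once (assumed layout, not the
# original procedure).
import torch

def multi_load_training(psi_net, load_cases, epochs=1000, lr=1e-3):
    # load_cases: list of (F_batch, P_batch) pairs, e.g. uniaxial tension,
    # simple shear, and equibiaxial tension experiments
    opt = torch.optim.Adam(psi_net.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = 0.0
        for F_batch, P_batch in load_cases:
            for F, P_meas in zip(F_batch, P_batch):
                P_pred = first_piola_stress(psi_net, F)   # stress from energy
                loss = loss + torch.mean((P_pred - P_meas) ** 2)
        loss.backward()
        opt.step()
    return psi_net
```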

And you could now ask, okay, it seems to work.

Presenters

Dr.-Ing. Hagen Holthusen

Accessible via

Open access

Duration

00:37:45 min

Recording date

2025-06-24

Uploaded on

2025-06-24 17:51:47

Language

en-US

Date: Mon.-Tue. June 23 - 24, 2025
Event: FAU MoD Lecture & Workshop
Organized by: FAU MoD, the Research Center for Mathematics of Data at Friedrich-Alexander-Universität Erlangen-Nürnberg (Germany)
 
FAU MoD Lecture: Mon. June 23, 2025 at 16:00H
AI for maths and maths for AI
Speaker: Dr. François Charton, Meta | FAIR | École Nationale des Ponts et Chaussées
 
Mini-workshop: Tue. June 24, 2025 (AM/PM sessions)
FAU room: H11
 
AM session (09:45H to 11:30H)
• 10:00H The Turnpike Phenomenon for Optimal Control Problems under Uncertainty. Dr. Michael Schuster, FAU DCN-AvH Chair for Dynamics, Control, Machine Learning and Numerics – Alexander von Humboldt Professorship
• 10:30H AI in Mechanics Dr.-Ing. Hagen Holthusen, FAU MoD, Research Center for Mathematics of Data | Institute of Applied Mechanics
• 11:00H Contribution evaluation in Federated Learning Daniel Kuznetsov, Visiting Student at FAU DCN-AvH from ENS Paris-Saclay
 
PM session (14:15H to 16:00H)
• 14:15H AI for maths and maths for AI Dr. François Charton, Meta | FAIR | ENPC
• 14:30H Exact sequence prediction with transformers Giovanni Fantuzzi, FAU MoD, Research Center for Mathematics of Data | FAU DCN-AvH at Friedrich-Alexander-Universität Erlangen-Nürnberg
• 15:00H Discovering the most suitable material model for cardiac tissue with constitutive neural networks Dr. Denisa Martonová, FAU MoD, Research Center for Mathematics of Data | Institute of Applied Mechanics
• 15:30H Stability of Hyperbolic Systems with Non-Symmetric Relaxation Dr. Lorenzo Liverani, FAU MoD, Research Center for Mathematics of Data | FAU DCN-AvH at Friedrich-Alexander-Universität Erlangen-Nürnberg  
 
AUDIENCE. This is a hybrid event (On-site/online) open to: Public, Students, Postdocs, Professors, Faculty, Alumni and the scientific community all around the world.
 
WHEN
• Lecture: Mon. June 23, 2025 at 16:00H (Berlin time)
• Workshop: Tue. June 24, 2025 (AM/PM sessions) at 09:45H and 14:15H (Berlin time)
 
WHERE. On-site / Online

Tags

Mechanics, AI, Applied Mathematics, FAU MoD, FAU MoD Lecture Series, FAU MoD workshop, Maths, FAU