3 - FAU MoD Mini-workshop: Contribution evaluation in Federated Learning [ID:58170]

We pass to the third speaker, who is Daniel Kuznetsov.

Kuznetsov, yeah.

Okay.

Who is a visiting student from ENS Paris-Saclay.

The floor is yours.

Hello.

The objective of my talk is to present this topic of federated learning a little bit, along with some work done here on this topic.

I will speak a little bit more about what I am doing personally, but also mention some other work done by Ziki and Professor Zuazua.

So to understand what federated learning is, you have to start by understanding the difference between federated learning and centralized learning.

So I will start by talking about centralized learning, then I will talk about federated

learning.

After this, we'll talk a bit about the Shapley value and how I use this notion to evaluate the contribution of each client to this general process of federated learning.

And I will conclude.

So what is centralized learning?

You all know a bit about this, I imagine, because you came here.

So you have a model, you have a big data set, and this model takes inputs from this data

set and gives an output in the label space which corresponds to the most probable answer

to this input.

And you can apply it in lots of different situations.

We had some inspiring talks today about this and yesterday, so nothing new to you.

You can just choose one you prefer.

Speech recognition, prediction tasks, modeling, lots of different stuff.

Mathematically, what you're doing is that you're just finding a certain point in a big

vector space which minimizes your loss function.

To do this, you just do gradient descent, or in practice, stochastic gradient descent.
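That stochastic gradient descent step can be sketched in a few lines (my illustrative example, not code from the talk): fitting a one-parameter least-squares model, where each update follows the gradient of the loss on a single data point.

```python
# Minimal sketch of stochastic gradient descent (illustrative, not from the talk):
# fit w in y ~ w * x by minimizing the squared loss (w*x - y)^2.
import random

def sgd(data, lr=0.1, epochs=50):
    """Fit a scalar parameter w by stochastic gradient descent."""
    w = 0.0
    for _ in range(epochs):
        random.shuffle(data)          # visit samples in random order each epoch
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

data = [(x, 3.0 * x) for x in [1.0, 2.0, 3.0]]  # true parameter is 3
print(round(sgd(data), 2))  # → 3.0
```

The same idea scales to models with millions of parameters; only the gradient computation changes.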

But what if your data set is not one data set but several data sets which do not all

have the same data?

And these constraints are actually very realistic.

It happens in lots of different situations.

For example, with smartphone keyboard training, because every client has their own private data which they don't want to share with the huge corporation that trains the keyboards inside their smartphones.

Or with medical information from hospitals, which collect their patients' information but can't share it with other hospitals, once again because of privacy reasons.

But you want to train a big model which will use all this information to solve some practical problems you want to find an answer to.

And federated learning is all about this.

So federated learning, let me read this definition, is a machine learning setting where multiple clients collaborate in solving a machine learning problem under the coordination of a central server, without sharing their private data.
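One standard way to realize this definition is federated averaging (FedAvg). The following is a minimal sketch (my illustration, not the speaker's code): the server broadcasts the current model, each client trains on its own private data, and only the resulting model parameters, never the raw data, travel back to the server, which averages them weighted by dataset size.

```python
# Hedged sketch of one round of federated averaging (FedAvg) on a scalar model.
# Only model parameters leave each client; the private data stays local.

def local_step(w, data, lr=0.05):
    """One pass of gradient descent on a client's private data for y ~ w*x."""
    for x, y in data:
        w -= lr * 2 * (w * x - y) * x
    return w

def fedavg_round(w_global, client_datasets):
    """Server broadcasts w_global; clients train locally; server averages."""
    local_models = [local_step(w_global, d) for d in client_datasets]
    sizes = [len(d) for d in client_datasets]
    # Weighted average by dataset size, as in the original FedAvg scheme.
    return sum(w * n for w, n in zip(local_models, sizes)) / sum(sizes)

# Two clients whose private datasets are both consistent with w = 2.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(100):
    w = fedavg_round(w, clients)
print(round(w, 2))  # → 2.0
```

In practice the averaged object is a full weight vector and clients run several local epochs per round, but the communication pattern is the same.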

So a lot of challenges arise in this topic.

The most important ones are robustness.

It is about not getting swayed away by some noise in your data and some attacks on your

training process.

And efficiency: you want to be able to train your model in the most efficient way, as quickly and as cheaply as possible, with the least communication possible.
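The contribution-evaluation idea from the talk outline rests on the Shapley value: a client's value is its average marginal contribution over all orderings of the clients. A minimal sketch, with a toy, hypothetical coalition utility v (in practice v(S) would be, e.g., the accuracy of a model trained on coalition S's combined data):

```python
# Hedged sketch of exact Shapley value computation by enumerating all
# client orderings. The utility v here is a toy, hypothetical example.
from itertools import permutations
from math import factorial

def shapley(clients, v):
    """Average marginal contribution of each client over all orderings."""
    n = len(clients)
    phi = {c: 0.0 for c in clients}
    for order in permutations(clients):
        coalition = frozenset()
        for c in order:
            phi[c] += v(coalition | {c}) - v(coalition)  # marginal gain of c
            coalition = coalition | {c}
    return {c: total / factorial(n) for c, total in phi.items()}

# Toy additive utility: a coalition is worth the amount of data it holds.
data_size = {"A": 100, "B": 50, "C": 50}
v = lambda S: sum(data_size[c] for c in S)
print(shapley(["A", "B", "C"], v))  # → {'A': 100.0, 'B': 50.0, 'C': 50.0}
```

Enumeration costs n! evaluations of v, which is why practical schemes approximate the Shapley value by sampling orderings.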

Presenter

Daniel Kuznetsov

Accessible via

Open access

Duration

00:25:17 min

Recording date

2025-06-24

Uploaded on

2025-06-24 17:52:36

Language

en-US

Date: Mon.-Tue. June 23 - 24, 2025
Event: FAU MoD Lecture & Workshop
Organized by: FAU MoD, the Research Center for Mathematics of Data at Friedrich-Alexander-Universität Erlangen-Nürnberg (Germany)
 
FAU MoD Lecture: Mon. June 23, 2025 at 16:00H
AI for maths and maths for AI
Speaker: Dr. François Charton, Meta | FAIR | École Nationale des Ponts et Chaussées
 
Mini-workshop: Tue. June 24, 2025 (AM/PM sessions)
FAU room: H11
 
AM session (09:45H to 11:30H)
• 10:00H The Turnpike Phenomenon for Optimal Control Problems under Uncertainty. Dr. Michael Schuster, FAU DCN-AvH Chair for Dynamics, Control, Machine Learning and Numerics – Alexander von Humboldt Professorship
• 10:30H AI in Mechanics Dr.-Ing. Hagen Holthusen, FAU MoD, Research Center for Mathematics of Data | Institute of Applied Mechanics
• 11:00H Contribution evaluation in Federated Learning Daniel Kuznetsov, Visiting Student at FAU DCN-AvH from ENS Paris-Saclay
 
PM session (14:15H to 16:00H)
• 14:15H AI for maths and maths for AI Dr.-Ing. François Charton, Meta | FAIR | ENPC
• 14:30H Exact sequence prediction with transformers Giovanni Fantuzzi, FAU MoD, Research Center for Mathematics of Data | FAU DCN-AvH at Friedrich-Alexander-Universität Erlangen-Nürnberg
• 15:00H Discovering the most suitable material model for cardiac tissue with constitutive neural networks Dr. Denisa Martonová, FAU MoD, Research Center for Mathematics of Data | Institute of Applied Mechanics
• 15:30H Stability of Hyperbolic Systems with Non-Symmetric Relaxation Dr. Lorenzo Liverani, FAU MoD, Research Center for Mathematics of Data | FAU DCN-AvH at Friedrich-Alexander-Universität Erlangen-Nürnberg  
 
AUDIENCE. This is a hybrid event (On-site/online) open to: Public, Students, Postdocs, Professors, Faculty, Alumni and the scientific community all around the world.
 
WHEN
• Lecture: Mon. June 23, 2025 at 16:00H (Berlin time)
• Workshop: Tue. June 24, 2025 (AM/PM sessions) at 09:45H and 14:15H (Berlin time)
 
WHERE. On-site / Online

Tags

AI Applied Mathematics federated learning FAU MoD FAU MoD Lecture Series FAU MoD workshop Maths FAU