We pass to the third speaker, who is Daniel Kuznetsov.
Kuznetsov, yeah.
Okay.
He is a visiting student from ENS Paris-Saclay.
The floor is yours.
Hello.
The objective of my talk is to present a bit of the topic of federated learning and some
of the work we are doing here on this topic.
I will speak a little more about what I am doing personally, but also mention some
other work done by Ziki and Professor Zvazov.
So to understand what federated learning is, you have to start by understanding the difference
between centralized learning and federated learning.
So I will start by talking about centralized learning, then I will talk about federated
learning.
After this, we'll talk a bit about the Shapley value and how I use this notion to evaluate
the contribution of each client to the overall federated learning process.
And I will conclude.
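The Shapley-based contribution evaluation mentioned in the outline can be sketched as follows. This is a minimal illustration of the general Shapley formula, not the speaker's actual method: the function names are mine, and the utility here is a toy additive one (a client's data-set size standing in for the accuracy a coalition's model would reach).

```python
from itertools import combinations
from math import factorial

def shapley_values(clients, utility):
    """Exact Shapley value of each client for a set-utility function.

    `utility` maps a frozenset of clients to the value (e.g. test
    accuracy) of a model trained on that coalition's data.
    """
    n = len(clients)
    vals = {c: 0.0 for c in clients}
    for c in clients:
        rest = [x for x in clients if x != c]
        for k in range(n):
            for coal in combinations(rest, k):
                s = frozenset(coal)
                # Shapley weight for a coalition of size k.
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                # Marginal contribution of client c to coalition s.
                vals[c] += weight * (utility(s | {c}) - utility(s))
    return vals

# Toy additive utility: each client contributes its data-set size,
# so the Shapley value of each client is exactly its own size.
sizes = {"A": 10, "B": 30, "C": 60}
vals = shapley_values(list(sizes), lambda s: sum(sizes[c] for c in s))
```

The exact computation enumerates all 2^n coalitions, which is only feasible for a handful of clients; in practice one would use Monte-Carlo sampling of permutations instead.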
So what is centralized learning?
You all know a bit about this, I imagine, since you are here.
So you have a model, you have a big data set, and this model takes inputs from this data
set and gives an output in the label space which corresponds to the most probable answer
to this input.
And you can apply it in lots of different situations.
We had some inspiring talks today about this and yesterday, so nothing new to you.
You can just choose one you prefer.
Speech recognition, next-word prediction, weather modeling, lots of different stuff.
Mathematically, what you are doing is finding a point in a big vector space
that minimizes your loss function.
To do this, you run gradient descent, or in practice stochastic gradient descent.
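As a concrete illustration of that step, here is a minimal stochastic gradient descent loop in plain Python, fitting a 1-D linear model by least squares. The function names and hyperparameters are my own choices for the sketch, not anything from the talk:

```python
import random

random.seed(0)  # reproducible shuffling

def sgd(grad, w, data, lr=0.1, epochs=50):
    """Plain stochastic gradient descent: one sample per update."""
    for _ in range(epochs):
        random.shuffle(data)
        for x, y in data:
            # Move the parameter vector against the per-sample gradient.
            w = [wi - lr * gi for wi, gi in zip(w, grad(w, x, y))]
    return w

# Squared-error loss for a 1-D linear model y ≈ w0 + w1 * x.
def grad(w, x, y):
    err = w[0] + w[1] * x - y          # residual on this sample
    return [2 * err, 2 * err * x]      # d(loss)/d(w0), d(loss)/d(w1)

# Noiseless data on the line y = 1 + 2x.
data = [(x, 1.0 + 2.0 * x) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]
w = sgd(grad, [0.0, 0.0], data)        # should approach [1.0, 2.0]
```

Full-batch gradient descent would average the gradient over the whole data set before each step; SGD trades that for cheaper, noisier updates.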
But what if your data set is not one data set but several data sets which do not all
have the same data?
And these constraints are actually very realistic.
It happens in lots of different situations.
For example, with smartphone keyboard training, where every client has their own private data
which they don't want to share with the huge corporation that trains the keyboard models
inside their smartphones.
Or with medical information from hospitals, which get their patients' information but can't
share it with other hospitals, once again for privacy reasons.
But you want to train a big model which will use all this information to solve the practical
problems you want answered.
And federated learning is all about this.
So federated learning, let me read this definition, is a machine learning setting where multiple
clients collaborate in solving a machine learning problem under the coordination of a central
server, without sharing their private data.
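The talk does not name a specific algorithm here, but the canonical baseline that matches this definition is federated averaging (FedAvg): each client runs local SGD on its own data, and the server averages the resulting models, weighted by local data-set size. A minimal sketch, with all names and hyperparameters my own:

```python
def local_sgd(w, data, grad, lr=0.1, steps=20):
    """One client's local training: a few SGD passes on private data."""
    for _ in range(steps):
        for x, y in data:
            w = [wi - lr * gi for wi, gi in zip(w, grad(w, x, y))]
    return w

def fedavg_round(w, client_datasets, grad):
    """One server round: broadcast w, collect local models, average them.
    Only model parameters travel; raw data never leaves a client."""
    n_total = sum(len(d) for d in client_datasets)
    updates = [local_sgd(list(w), d, grad) for d in client_datasets]
    return [sum(len(d) / n_total * u[i]
                for u, d in zip(updates, client_datasets))
            for i in range(len(w))]

# Toy run: two clients hold disjoint slices of the line y = 1 + 2x.
def grad(w, x, y):
    err = w[0] + w[1] * x - y
    return [2 * err, 2 * err * x]

clients = [[(x, 1.0 + 2.0 * x) for x in xs]
           for xs in ([0.0, 0.5], [1.0, 1.5, 2.0])]
w = [0.0, 0.0]
for _ in range(30):
    w = fedavg_round(w, clients, grad)   # should approach [1.0, 2.0]
```

The weighting by `len(d)` means larger clients pull the average harder; other aggregation rules exist precisely because plain averaging is fragile, which is where the robustness challenge below comes in.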
So a lot of challenges arise in this topic.
The most important ones are robustness and efficiency.
Robustness is about not being swayed by noise in your data or by attacks on your
training process.
Efficiency means you want to train your model as quickly and as cheaply as possible,
with the least communication possible.
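To make the robustness challenge concrete, here is one standard defense (not necessarily the one used in the speaker's work): replacing the server's plain average with a coordinate-wise median, which tolerates a minority of corrupted or malicious client updates.

```python
import statistics

def median_aggregate(client_updates):
    """Coordinate-wise median of client model updates: a standard
    robust alternative to plain averaging. A single Byzantine client
    can drag the mean arbitrarily far, but not the median."""
    return [statistics.median(coord) for coord in zip(*client_updates)]

honest = [[1.0, 2.0], [1.1, 1.9], [0.9, 2.1]]   # clustered honest updates
attacker = [[100.0, -100.0]]                     # one malicious update
w = median_aggregate(honest + attacker)          # stays near the honest cluster
```

With a plain mean, the attacker's update would shift the first coordinate by roughly 25; the median keeps the aggregate next to the honest updates.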
Presenter: Daniel Kuznetsov
Access: open access
Duration: 00:25:17 min
Recording date: 2025-06-24
Uploaded: 2025-06-24 17:52:36
Language: en-US