1 - FAU MoD Course (1/5): Mysteries of Deep Learning [ID:57586]

Hello, good afternoon to all of you.

It's my great pleasure to introduce the third speaker of today, again from the Institute

of Natural Sciences at Shanghai Jiao Tong University.

I was lucky enough last September, I think, when we were running this FAU workshop

at Fudan in Shanghai, and then I had the opportunity to visit for a couple of days

Shanghai Jiao Tong, which is not that far from Fudan in Shanghai,

and where I have known Professor Shi Jin for many years now.

And then I was introduced to Professor Nana Liu.

She delivered a colloquium this morning, and I was also introduced to Professor Yaoyu Zhang.

Yaoyu got his master's and PhD at Shanghai Jiao Tong in 2016.

Then, as a postdoc, he spent time at New York University Abu Dhabi,

the Courant Institute of New York University in Manhattan, and also Princeton.

He's an expert in computational neuroscience and also in the

mathematical theory of neural networks, learning, and machine learning.

And, well, talking with him, I had the feeling that he knows many,

many things that would be very, very useful for us in order to get deeper

into the foundations of the mathematical theory of machine learning

and also its neuroscience inspiration.

And he was kind enough to be ready to spend here almost two weeks,

and in particular deliver this series of five lectures, right, starting today.

So we'll have two hours per day starting today.

Well, not tomorrow or Sunday, okay?

So you can still come if you want, but there will not be lectures;

we will continue next Monday.

You have to check on the web every day in which room we are lecturing.

I'm not sure it's going to be every day here.

But in principle, this series of lectures will be recorded, no?

Are they being streamed?

Yeah, okay.

So one can follow them by streaming, and many colleagues are doing that,

but also they will be recorded and hopefully will be material of use

for future researchers wanting to enter this topic.

So, well, he will now present.

Yeah, so thanks a lot for the invitation, Eric.

And it's my great pleasure to come here.

And particularly, this is also the first time I'm delivering this 10-hour lecture series,

focusing on the progress my collaborators and I have made

over the past seven years on the foundations of deep learning.

And the goal is to give you a feeling that the deep neural network is no longer a black box.

So I think, probably because all of you like math, we are used to the case where,

whatever problem we're looking at, we have some kind of framework to understand it.

And then we never say, okay, anything is a black box.

PDEs, for example, can be very, very difficult.

However, we don't say PDEs are a black box, or that any difficult PDE is that kind of black box.

However, for deep learning, it's very different.

And so we often say that it's a black box, despite the fact that its mathematical formulation

is very, very clear.

It's unlike our brain where if I ask you to write down a model that tells you what happens in the brain

or mimics our brain, that's extremely difficult.

For example, maybe some of you know the Blue Brain Project in the EU,

Accessible via: Open access

Duration: 01:55:24 min

Recording date: 2025-05-02

Uploaded: 2025-05-06 07:35:44

Language: en-US

Date: Fri. – Thu. May 2 – 8, 2025
FAU MoD Course: Towards a Mathematical Foundation of Deep Learning: From Phenomena to Theory
Session 1: Mysteries of Deep Learning
Speaker: Prof. Dr. Yaoyu Zhang
Affiliation: Institute of Natural Sciences & School of Mathematical Sciences, Shanghai Jiao Tong University
Organizer: FAU MoD, Research Center for Mathematics of Data at FAU, Friedrich-Alexander-Universität Erlangen-Nürnberg
Overall, this course serves as a gateway to the vibrant field of deep learning theory, inspiring participants to contribute fresh perspectives to its advancement and application.
Session Titles:
1. Mysteries of Deep Learning
2. Frequency Principle/Spectral Bias
3. Condensation Phenomenon
4. From Condensation to Loss Landscape Analysis
5. From Condensation to Generalization Theory
 

Tags

FAU, FAU MoD, FAU MoD Course, deep learning, artificial intelligence