Deep learning for time series. Attention-based models. 13.3 - The Transformer architecture (ClipID: 40428)

Recording date 2022-01-25

Lecturer

Dario Zanca

Via

IdM-login / Studon

Language

English

Organisational Unit

Friedrich-Alexander-Universität Erlangen-Nürnberg

Producer

Friedrich-Alexander-Universität Erlangen-Nürnberg

Discusses the fundamental components of the Transformer architecture, as presented in the paper "Attention Is All You Need" (2017).
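For orientation, the following is a minimal, illustrative NumPy sketch (not taken from the recording) of scaled dot-product attention, the core building block of the Transformer described in that paper; the function name, tensor shapes, and toy inputs are assumptions made here purely for illustration.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> output: (n_q, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities, scaled by sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax over the keys
    return weights @ V                              # attention-weighted sum of the values

# Toy usage with random queries, keys, and values (shapes chosen arbitrarily)
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 8)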

Up next in Chapter

Schloss1
Dr. Luis Ignacio Lopera Gonzalez
2022-02-01
IdM-login / Studon
