Deep learning for time series. Attention-based models. 13.2 - Attention models (ClipID: 40427)

Recording date 2022-01-25

Lecturer

Dario Zanca

Via

IdM-login / Studon

Language

English

Organisational Unit

Friedrich-Alexander-Universität Erlangen-Nürnberg

Producer

Friedrich-Alexander-Universität Erlangen-Nürnberg

Attention models in their first implementation, as described in the paper "Neural machine translation by jointly learning to align and translate" (Bahdanau et al.).
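
As a rough illustration of the mechanism the clip refers to, here is a minimal NumPy sketch of Bahdanau-style additive attention: alignment scores e_j = v^T tanh(W_s s + W_h h_j), softmax weights, and a context vector. Parameter names, shapes, and the toy dimensions are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def additive_attention(decoder_state, encoder_states, W_s, W_h, v):
    """Sketch of Bahdanau-style additive attention.

    decoder_state:  (d_dec,)      previous decoder hidden state s
    encoder_states: (T, d_enc)    encoder annotations h_1 .. h_T
    Returns attention weights (T,) and context vector (d_enc,).
    """
    # Alignment scores e_j = v^T tanh(W_s s + W_h h_j)
    scores = np.tanh(encoder_states @ W_h.T + decoder_state @ W_s.T) @ v  # (T,)
    # Softmax over source positions (numerically stabilised)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted sum of the encoder annotations
    context = weights @ encoder_states
    return weights, context

# Toy usage with random parameters (dimensions are illustrative assumptions)
rng = np.random.default_rng(0)
T, d_enc, d_dec, d_att = 5, 8, 6, 4
h = rng.standard_normal((T, d_enc))
s = rng.standard_normal(d_dec)
W_h = rng.standard_normal((d_att, d_enc))
W_s = rng.standard_normal((d_att, d_dec))
v = rng.standard_normal(d_att)
alpha, c = additive_attention(s, h, W_s, W_h, v)
```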

Up next in Chapter

Schloss1
Dr. Luis Ignacio Lopera Gonzalez
2022-02-01
IdM-login / Studon
