Attention Mechanisms History

Question 1 of 3
Attention was introduced in neural machine translation by Bahdanau et al. (2014). What problem did it solve in the sequence-to-sequence architecture?
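The problem the question alludes to is the fixed-length bottleneck: a plain encoder-decoder compresses the whole source sentence into one vector, while Bahdanau-style attention lets the decoder take a weighted sum over all encoder states at each step. A minimal NumPy sketch of that additive scoring idea (the function and parameter names `Wq`, `Wk`, `v` are illustrative, not from the original paper's code):

```python
import numpy as np

def additive_attention(decoder_state, encoder_states, Wq, Wk, v):
    # Bahdanau-style (additive) attention, simplified sketch.
    # decoder_state: (d,)  current decoder hidden state
    # encoder_states: (T, d)  one hidden state per source position
    # Wq, Wk: (h, d) projection matrices; v: (h,) scoring vector
    scores = np.tanh(encoder_states @ Wk.T + decoder_state @ Wq.T) @ v  # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()            # softmax over source positions
    context = weights @ encoder_states  # weighted sum of encoder states, (d,)
    return context, weights
```

Instead of a single summary vector, the decoder receives a fresh `context` at every output step, so long sentences no longer have to squeeze through one fixed-size representation.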