Efficient Transformers Survey
Sub-quadratic attention variants (linear attention, Linformer, Performer, Longformer, BigBird) and why FlashAttention, a hardware-aware exact method, made most of them unnecessary in practice.
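To make the contrast concrete, here is a minimal NumPy sketch (names and the feature map are illustrative, not from the survey) of the core trick behind kernelized linear attention: replacing the softmax kernel with a dot product of positive feature maps lets matrix associativity avoid ever forming the n×n score matrix, dropping cost from O(n²·d) to O(n·d²). FlashAttention, by contrast, keeps exact softmax attention and instead reorders the computation to be IO-efficient on GPU memory.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard exact attention: materializes an n x n score matrix,
    # so time and memory grow quadratically in sequence length n.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, eps=1e-6):
    # Kernelized linear attention sketch: approximate exp(q . k) by
    # phi(q) . phi(k) for a positive feature map phi, then use
    # associativity: (phi(Q) phi(K)^T) V == phi(Q) (phi(K)^T V).
    # The d x d summary phi(K)^T V is independent of n, giving O(n d^2).
    phi = lambda x: np.maximum(x, 0.0) + 1e-3  # simple positive map (assumption)
    Qp, Kp = phi(Q), phi(K)
    kv = Kp.T @ V                # d x d summary of keys and values
    z = Kp.sum(axis=0)           # d-dim normalizer replacing the softmax sum
    return (Qp @ kv) / (Qp @ z + eps)[:, None]
```

Both functions map (n, d) queries/keys/values to an (n, d) output; only the linear variant scales linearly in n. Note the linear form is an approximation, which is one reason exact-but-fast methods like FlashAttention displaced it in practice.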