
Unlock: Induction Heads

Induction heads are attention head circuits that implement a specific kind of pattern completion: given a sequence like [A][B]...[A], they predict [B]. Olsson et al. (2022) give strong causal (ablation) evidence that these heads exist in small attention-only models, and correlational co-occurrence evidence linking their formation to a sudden jump in in-context-learning ability in larger transformers. They are one mechanism among several that have been proposed for in-context learning, not the whole story.
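The pattern-completion rule these heads implement can be sketched as a plain algorithm: look backwards for the most recent earlier occurrence of the current token, and predict the token that followed it. The function below is a hypothetical illustration of that heuristic on a token list, not an implementation of an attention head (real induction heads realize this via a previous-token head composing with a matching head).

```python
def induction_predict(tokens):
    """Induction-head heuristic: given a sequence ending in some token [A],
    find the most recent earlier occurrence of [A] and predict the token
    [B] that followed it. Returns None if [A] has not appeared before."""
    current = tokens[-1]
    # scan backwards over positions strictly before the final token
    for i in range(len(tokens) - 2, -1, -1):
        if tokens[i] == current:
            return tokens[i + 1]
    return None

# [A][B] ... [A] -> predicts [B]
print(induction_predict(["A", "B", "C", "A"]))  # -> B
```

On a repeated bigram the heuristic copies forward the continuation; with no prior occurrence it has nothing to copy, which mirrors why induction heads only help once a prefix starts repeating.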

