Unlock: Mirror Descent and Frank-Wolfe
Mirror descent generalizes gradient descent via Bregman divergences, recovering multiplicative weights and exponentiated gradient as special cases. Frank-Wolfe replaces the projection step with linear minimization over the feasible set, making it projection-free.
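A minimal sketch of both ideas on a toy problem: minimizing a squared distance over the probability simplex. The objective, the target point `p`, the step sizes, and the iteration counts are all illustrative choices, not fixed by the text. With the negative-entropy mirror map, the mirror descent update becomes exponentiated gradient (multiply by an exponentiated negative gradient, then renormalize), while Frank-Wolfe calls a linear minimization oracle, which on the simplex always returns a vertex.

```python
import numpy as np

# Toy objective f(x) = ||x - p||^2 over the probability simplex.
# The target p (chosen outside the simplex) is an illustrative value.
p = np.array([0.8, 0.6, 0.1])

def f(x):
    return float(np.sum((x - p) ** 2))

def grad(x):
    return 2.0 * (x - p)

def exponentiated_gradient(x0, eta=0.2, iters=2000):
    """Mirror descent with the negative-entropy mirror map:
    multiplicative update followed by renormalization."""
    x = x0.copy()
    for _ in range(iters):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()  # the Bregman projection onto the simplex is renormalization
    return x

def frank_wolfe(x0, iters=2000):
    """Projection-free: minimize the linearized objective over the
    simplex (a vertex), then step toward that vertex."""
    x = x0.copy()
    for t in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0      # linear minimization oracle on the simplex
        gamma = 2.0 / (t + 2.0)    # standard open-loop step size
        x = x + gamma * (s - x)
    return x

x0 = np.full(3, 1.0 / 3.0)         # start at the simplex barycenter
x_eg = exponentiated_gradient(x0)
x_fw = frank_wolfe(x0)
```

Both iterates stay feasible by construction: exponentiated gradient keeps coordinates strictly positive and renormalized, while Frank-Wolfe takes convex combinations of simplex points and so never needs a projection.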
Prerequisites
Borel-Cantelli Lemmas
Characteristic Functions (Foundations)
Modes of Convergence of Random Variables (Infrastructure)
Triangular Distribution (Axioms)
Convex Duality (Core)
Convex Optimization Basics (Foundations)
Online Convex Optimization (Advanced)