Unlock: Gradient Boosting
Gradient boosting viewed as functional gradient descent: each round fits a weak learner to the pseudo-residuals (the negative gradient of the loss), sequentially reducing bias. Covers AdaBoost, shrinkage, XGBoost's second-order methods, and LightGBM's leaf-wise tree growth.
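The functional-gradient view above can be sketched in a few lines. This is a minimal illustration, not any library's implementation: it assumes squared-error loss on 1-D data, where the pseudo-residuals reduce to y − F(x), and uses hand-rolled depth-1 regression stumps as the weak learners (the names `fit_stump` and `gradient_boost` are invented for this sketch).

```python
# Sketch of gradient boosting as functional gradient descent, assuming
# squared-error loss L(y, F) = (y - F)^2 / 2 on 1-D data. The negative
# gradient (pseudo-residual) is then y - F(x), so each round fits a weak
# learner -- a depth-1 "stump" -- to the current residuals, and the
# shrinkage rate lr damps each step.

def fit_stump(xs, rs):
    """Return the threshold split on x that best fits residuals rs."""
    best = None
    for t in sorted(set(xs))[:-1]:       # max threshold leaves the right side empty
        left = [r for x, r in zip(xs, rs) if x <= t]
        right = [r for x, r in zip(xs, rs) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def gradient_boost(xs, ys, rounds=200, lr=0.1):
    """Fit an additive model F(x) = f0 + lr * sum of stumps."""
    f0 = sum(ys) / len(ys)               # initial model: the constant minimizer
    learners = []

    def predict(x):
        return f0 + lr * sum(h(x) for h in learners)

    for _ in range(rounds):
        # pseudo-residuals for squared loss: y - F(x)
        residuals = [y - predict(x) for x, y in zip(xs, ys)]
        learners.append(fit_stump(xs, residuals))
    return predict

xs = list(range(10))
ys = [x * x for x in xs]
model = gradient_boost(xs, ys)
mse = sum((y - model(x)) ** 2 for x, y in zip(xs, ys)) / len(xs)
print(f"training MSE after 200 rounds: {mse:.4f}")
```

Each round moves the function a small step (lr) in the direction of the negative loss gradient in function space; AdaBoost, XGBoost, and LightGBM differ in the loss, the step computation (XGBoost uses second-order information), and how the weak trees are grown.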
118 prerequisites: 0 mastered, 0 working, 104 gaps
Prerequisite mastery: 12%
Recommended probe
McDiarmid's Inequality is your weakest prerequisite with available questions. You haven't been assessed on this topic yet.
Gradient Boosting (TARGET): not assessed, 13 questions
Not assessed, 2 questions
Not assessed, 15 questions
Symmetrization Inequality (Advanced): not assessed, 3 questions
VC Dimension (Core): not assessed, 58 questions
Contraction Inequality (Advanced): not assessed, 1 question
Gradient Descent Variants (Foundations): not assessed, 16 questions; no quiz
AdaBoost (Core): not assessed, 3 questions
Not assessed, 3 questions; no quiz