Unlock: Feedforward Networks and Backpropagation
This unit covers feedforward neural networks as compositions of affine transforms and nonlinearities, the universal approximation theorem, and backpropagation as reverse-mode automatic differentiation on the computational graph.
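The view of a feedforward network as alternating affine maps and elementwise nonlinearities can be sketched directly. This is a minimal NumPy illustration, not the unit's own material; the layer sizes and the ReLU choice are assumptions for the example:

```python
import numpy as np

def relu(x):
    # Elementwise nonlinearity applied between affine layers.
    return np.maximum(0.0, x)

def forward(x, params):
    """Compute f(x) = W_L(...relu(W_1 x + b_1)...) + b_L, layer by layer."""
    h = x
    for i, (W, b) in enumerate(params):
        h = W @ h + b                  # affine transform
        if i < len(params) - 1:        # nonlinearity on all but the output layer
            h = relu(h)
    return h

rng = np.random.default_rng(0)
sizes = [3, 4, 2]                      # assumed toy architecture: 3 -> 4 -> 2
params = [(rng.standard_normal((m, n)), np.zeros(m))
          for n, m in zip(sizes[:-1], sizes[1:])]
y = forward(rng.standard_normal(3), params)
print(y.shape)                         # output lives in R^2: (2,)
```

Each `(W, b)` pair is one affine transform; the whole network is just their composition interleaved with the nonlinearity.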
Prerequisites

Symmetrization Inequality (Advanced)
VC Dimension (Core)
Contraction Inequality (Advanced)
Activation Functions (Foundations)
Automatic Differentiation (Foundations)
Deep Learning (Goodfellow, Bengio, Courville) (Infrastructure)
Differentiation in Rⁿ (Axioms)
Matrix Calculus (Foundations)
Perceptron (Foundations)
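Backpropagation, described above as reverse-mode automatic differentiation on the computational graph, can be written out by hand for a one-hidden-layer network with squared-error loss. This is a hedged sketch under assumed shapes and a tanh nonlinearity, with a finite-difference check that the reverse pass matches the numerical gradient:

```python
import numpy as np

def loss_and_grads(W1, b1, W2, b2, x, y):
    # Forward pass, caching the intermediate nodes of the computational graph.
    z1 = W1 @ x + b1                       # affine
    h1 = np.tanh(z1)                       # nonlinearity
    yhat = W2 @ h1 + b2                    # affine output
    loss = 0.5 * np.sum((yhat - y) ** 2)
    # Reverse pass: propagate d(loss)/d(node) backward through the graph.
    dyhat = yhat - y                       # d loss / d yhat
    dW2 = np.outer(dyhat, h1)
    db2 = dyhat
    dh1 = W2.T @ dyhat                     # chain rule through the output affine map
    dz1 = dh1 * (1.0 - np.tanh(z1) ** 2)   # chain rule through tanh
    dW1 = np.outer(dz1, x)
    db1 = dz1
    return loss, (dW1, db1, dW2, db2)

rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)
W2, b2 = rng.standard_normal((2, 4)), rng.standard_normal(2)
x, y = rng.standard_normal(3), rng.standard_normal(2)
loss, (dW1, db1, dW2, db2) = loss_and_grads(W1, b1, W2, b2, x, y)

# Finite-difference check on one weight: perturb W1[0, 0] and compare.
eps = 1e-6
Wp = W1.copy()
Wp[0, 0] += eps
num = (loss_and_grads(Wp, b1, W2, b2, x, y)[0] - loss) / eps
print(abs(num - dW1[0, 0]) < 1e-4)
```

The reverse pass visits the graph's nodes in the opposite order of the forward pass, applying the chain rule once per edge; this is exactly the reverse-mode automatic differentiation that general frameworks automate.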