Unlock: Activation Functions
This unit covers the nonlinear activation functions used in neural networks (sigmoid, tanh, ReLU, Leaky ReLU, GELU, and SiLU), along with their gradients, saturation behavior, and impact on trainability.
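As a concrete reference for the functions listed above, here is a minimal NumPy sketch of each activation with its gradient noted in comments; the GELU uses the common tanh approximation rather than the exact Gaussian-CDF form.

```python
import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^{-x}); gradient sigma(x) * (1 - sigma(x)),
    # which vanishes (saturates) for large |x|
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # gradient 1 - tanh(x)^2; zero-centered, but also saturates for large |x|
    return np.tanh(x)

def relu(x):
    # max(0, x); gradient is 1 for x > 0 and 0 for x < 0 (units can "die")
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # keeps a small slope alpha for x < 0, so the gradient never vanishes entirely
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    # tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def silu(x):
    # SiLU (a.k.a. swish): x * sigmoid(x); smooth, non-monotone near zero
    return x * sigmoid(x)
```

Note how the saturating functions (sigmoid, tanh) bound their outputs while the ReLU family is unbounded above; this difference is what drives the vanishing-gradient behavior the unit discusses.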
Target topic: Activation Functions

Prerequisites include:
- Convex Optimization Basics (Foundations)
- Differentiation in Rⁿ (Axioms)