
Unlock: Activation Functions

Nonlinear activation functions in neural networks: sigmoid, tanh, ReLU, Leaky ReLU, GELU, and SiLU. Their gradients, saturation behavior, and impact on trainability.
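The functions listed in this topic follow their standard textbook definitions. As an illustrative sketch (not code from this page), here are those activations and a few of their gradients in NumPy, with comments noting the saturation behavior the description mentions; the tanh-based GELU formula is the common approximation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # saturates: gradient -> 0 as |x| grows

def d_tanh(x):
    return 1.0 - np.tanh(x) ** 2  # also saturates at both tails

def relu(x):
    return np.maximum(0.0, x)

def d_relu(x):
    x = np.asarray(x, dtype=float)
    return (x > 0).astype(float)  # exactly 0 for x < 0 ("dead" units)

def leaky_relu(x, alpha=0.01):
    return np.where(np.asarray(x) > 0, x, alpha * np.asarray(x))

def silu(x):
    return x * sigmoid(x)  # SiLU / swish: x * sigmoid(x)

def gelu(x):
    # Common tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi)
                                    * (x + 0.044715 * x ** 3)))
```

Note how `d_sigmoid` and `d_tanh` vanish for large |x| (the saturation that slows training), while `d_relu` is a hard 0/1 step, which is what makes ReLU-family activations easier to train but prone to dead units.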

32 prerequisites · 0 mastered · 0 working · 30 gaps
Prerequisite mastery: 6%
Recommended probe

Cardinality and Countability is your weakest prerequisite with available questions. You haven't been assessed on this topic yet.

Not assessed (question counts per prerequisite topic): 16, 5, 32, and 21 questions.
