Where this topic leads
Topics that build on Cross-Entropy Loss: MLE, KL Divergence, and Classification
Once you have completed Cross-Entropy Loss: MLE, KL Divergence, and Classification, these are the topics that cite it as a prerequisite. Pick by tier and by the area you want to push into next.
Editor's suggested next (2)
Core flagship topics (1)
- Regularization in Practice · layer 2 · training-techniques