Cross-entropy is a quantity that's commonly used in Machine Learning (ML), and by extension in Deep Learning (DL), as a loss function. However, this loss measure is often introduced merely as part of the machinery of ML: use binary cross-entropy for binary classification and categorical cross-entropy for multi-class classification, without any background on why or how it applies.
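As a quick illustration of the two loss functions mentioned above, here is a minimal sketch in plain Python (function names and example values are my own, not from the article):

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average binary cross-entropy: -[y*log(p) + (1-y)*log(1-p)]."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average categorical cross-entropy over one-hot label rows: -sum_k y_k*log(p_k)."""
    total = 0.0
    for y_row, p_row in zip(y_true, y_pred):
        total += -sum(y * math.log(max(p, eps)) for y, p in zip(y_row, p_row))
    return total / len(y_true)

# Binary case: two confident correct predictions, one poor one
print(binary_cross_entropy([1.0, 0.0, 1.0], [0.9, 0.1, 0.2]))

# Categorical case: one-hot labels over 3 classes
print(categorical_cross_entropy(
    [[1, 0, 0], [0, 1, 0]],
    [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]],
))
```

Note how the confident wrong prediction (0.2 for a true label of 1) dominates the binary loss: cross-entropy penalizes confident mistakes heavily, which is part of why it works so well as a training signal.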
Check out the full article on Notion, where:
- I’ll explain the origins of cross-entropy as a loss measure.
- How it relates to important concepts in information theory such as Entropy and KL (Kullback-Leibler) Divergence.
- And its connection to Maximum Likelihood Estimation (MLE) and why it appears as a loss function there.