What is Cross-Entropy?

Cross-entropy is a loss function that measures the difference between two probability distributions. For a true distribution p and a predicted distribution q over classes x, it is H(p, q) = -Σₓ p(x) log q(x). In machine learning, it quantifies how well predicted probabilities match the actual class labels, making it the standard loss for classification tasks.
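As a minimal sketch of the definition above (the function name and the example probability vectors are illustrative, not from the lesson), cross-entropy can be computed directly from a predicted probability vector and a one-hot label:

```python
import math

def cross_entropy(predicted, actual):
    """Cross-entropy H(p, q) = -sum(p * log q) between a one-hot
    label vector `actual` (p) and predicted probabilities `predicted` (q)."""
    eps = 1e-12  # small constant to avoid log(0)
    return -sum(a * math.log(p + eps) for p, a in zip(predicted, actual))

# True class is the first one. A confident correct prediction
# gives a small loss; a confident wrong one gives a large loss.
good = cross_entropy([0.9, 0.05, 0.05], [1, 0, 0])
bad = cross_entropy([0.1, 0.8, 0.1], [1, 0, 0])
print(good, bad)
```

Because the label is one-hot, only the predicted probability of the true class contributes, so the loss reduces to the negative log-likelihood of the correct class.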
