Cross-entropy is a loss function that measures the difference between two probability distributions. In machine learning, it quantifies how well predicted probabilities match actual class labels, making it the standard loss for classification tasks.
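For two distributions p (true labels) and q (predictions) over the classes, cross-entropy is H(p, q) = -Σₓ p(x) log q(x). The sketch below illustrates this definition directly; the function name and the epsilon clamp are illustrative choices, not part of the lesson itself:

```python
import math

def cross_entropy(p_true, p_pred, eps=1e-12):
    """Cross-entropy between true class probabilities and predictions.

    p_true: actual class probabilities (one-hot for hard labels).
    p_pred: predicted probabilities (should sum to 1).
    eps clamps predictions away from zero so log(0) is never taken.
    """
    return -sum(t * math.log(max(q, eps)) for t, q in zip(p_true, p_pred))

# One-hot label [0, 1, 0] against a fairly confident, correct prediction:
loss = cross_entropy([0, 1, 0], [0.1, 0.8, 0.1])  # -ln(0.8) ≈ 0.223
```

Note that only the predicted probability assigned to the true class matters here: a confident correct prediction drives the loss toward 0, while a confident wrong one blows it up.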
Part of the Logistic Regression lesson in Mathematics Foundations