ReLU (rectified linear unit) is an activation function that outputs its input when the input is positive and zero otherwise: f(x) = max(0, x). It is the most widely used activation function in deep learning because it is cheap to compute and, since its gradient is 1 for all positive inputs, it helps mitigate vanishing gradients.
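As a minimal sketch, the definition above can be written with NumPy (the function name `relu` is our own label, not part of any library):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative entries become 0,
    # positive entries pass through unchanged.
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negative values are zeroed; 1.5 and 3.0 are unchanged
```

Because the operation is element-wise, the same function works on scalars, vectors, or entire activation matrices without modification.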