What is ReLU (Rectified Linear Unit)?

ReLU is an activation function defined as f(x) = max(0, x): it returns the input when the input is positive and zero otherwise. It is the most widely used activation function in deep learning because it is cheap to compute and its gradient does not saturate for positive inputs, which helps mitigate the vanishing-gradient problem.
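As a minimal sketch (using NumPy for elementwise operations), ReLU and its derivative can be written as:

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), applied elementwise
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 1 where x > 0, else 0 (conventionally 0 at x = 0)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Because the gradient is exactly 1 for all positive inputs, repeated backpropagation through ReLU layers does not shrink the signal the way sigmoid or tanh gradients do.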

