Rectified Linear Unit


What Does Rectified Linear Unit Mean?

The rectified linear unit (ReLU) is one of the most common activation functions in machine learning models. As a component of an artificial neuron in artificial neural networks (ANNs), the activation function processes the neuron's weighted inputs and produces its output.


Techopedia Explains Rectified Linear Unit

With ReLU as the activation function, a neuron's output equals its input when the input is positive and zero otherwise: f(x) = max(0, x).
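As a minimal sketch (our own illustration, not code from the article), ReLU can be written in a few lines of Python:

```python
def relu(x: float) -> float:
    """ReLU: return x for positive inputs, 0 otherwise, i.e. max(0, x)."""
    return max(0.0, x)

print(relu(3.5))   # 3.5 -- positive inputs pass through unchanged
print(relu(-2.0))  # 0.0 -- negative inputs are clamped to zero
```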

Because ReLU is non-linear, it lets a network capture interaction effects among its inputs and model non-linear relationships in the data.

In general, ReLU works hand in hand with techniques like gradient descent: its gradient is one for positive inputs and zero for negative inputs, which keeps training simple and efficient. Building on that, engineers refine the learning algorithms of machine learning programs and stack layers of ReLU neurons in ANNs to help models converge on the specific problems the technology tackles.
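To make the connection to gradient descent concrete, here is a small sketch assuming a single neuron with one weight and a squared-error loss; the names (relu_grad, lr, and so on) are our own illustrative choices:

```python
def relu(x: float) -> float:
    return max(0.0, x)

def relu_grad(x: float) -> float:
    """Derivative of ReLU: 1 for positive inputs, 0 otherwise."""
    return 1.0 if x > 0 else 0.0

# One gradient-descent step on a squared-error loss for y = relu(w * x).
w, x, target, lr = 0.5, 2.0, 3.0, 0.1
pre_activation = w * x
y = relu(pre_activation)

# Chain rule: dL/dw = 2 * (y - target) * relu'(w * x) * x
grad_w = 2 * (y - target) * relu_grad(pre_activation) * x
w -= lr * grad_w
print(w)  # the weight is nudged in the direction that reduces the error
```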

Alternatives to ReLU include the sigmoid and tanh functions.
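For comparison, a quick sketch (again our own) evaluates all three functions at a few points, showing how sigmoid and tanh squash their inputs into a bounded range while ReLU simply clamps negatives to zero:

```python
import math

def relu(x: float) -> float:
    return max(0.0, x)

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# Sample a negative, zero, and positive input for each activation.
for x in (-2.0, 0.0, 2.0):
    print(f"x={x:+.1f}  relu={relu(x):.3f}  "
          f"sigmoid={sigmoid(x):.3f}  tanh={math.tanh(x):.3f}")
```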



Margaret Rouse
Technology Expert

Margaret is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.