Rectified Linear Unit

What Does Rectified Linear Unit Mean?

The rectified linear unit (ReLU) is one of the most common activation functions in machine learning models. As a component of an artificial neuron in an artificial neural network (ANN), the activation function is responsible for transforming the neuron's weighted inputs into its output.


Techopedia Explains Rectified Linear Unit

With ReLU as the activation function, positive inputs are passed through unchanged and negative inputs are replaced with zero; in other words, ReLU computes f(x) = max(0, x).
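
As a minimal sketch (the function name and sample values here are illustrative, not taken from any particular library), ReLU can be written in one line of Python:

```python
def relu(x):
    # Return the input unchanged when it is positive, zero otherwise
    return max(0.0, x)

print(relu(3.5))   # 3.5
print(relu(-2.0))  # 0.0
```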

Because this thresholding is non-linear, ReLU allows a network to model interaction effects among its inputs and to capture non-linear relationships that a purely linear model cannot.

In practice, ReLU is paired with training methods such as gradient descent: its gradient is simply 1 for positive inputs and 0 for negative inputs, which makes it cheap to compute and helps networks converge. Engineers stack layers of ReLU-activated neurons in ANNs and refine them so the model converges on the specific problems the technology is meant to solve.
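
As an illustrative sketch of ReLU inside a layer of neurons (the layer sizes, weights, and variable names below are assumptions made for the example, not a specific framework's API), the snippet applies ReLU element-wise to the output of a single dense layer:

```python
import numpy as np

def relu(z):
    # Element-wise ReLU: keep positive values, zero out negatives
    return np.maximum(0.0, z)

# A toy dense layer: 4 inputs feeding 3 neurons (weights chosen arbitrarily)
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
b = np.zeros(3)
x = np.array([0.5, -1.2, 3.0, 0.1])

# Weighted inputs pass through ReLU to produce the layer's output
activation = relu(W @ x + b)
print(activation)  # any negative pre-activation shows up here as 0.0
```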

Alternatives to ReLU include the sigmoid and tanh functions.
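
For comparison, sigmoid and tanh can be sketched alongside ReLU as follows (a minimal illustration, not tied to any specific library's API):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(z))      # negatives clipped to 0, positives unchanged
print(sigmoid(z))   # values strictly between 0 and 1
print(np.tanh(z))   # values strictly between -1 and 1
```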

