Layer-Wise Relevance Propagation

What Does Layer-Wise Relevance Propagation Mean?

Layer-wise relevance propagation (LRP) is a method for explaining the predictions of deep neural networks. Starting from the network's output, it propagates the prediction backward through the layers, redistributing a relevance score at each layer until every input feature has been assigned its share of the result.
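To make the layer-by-layer redistribution concrete, below is a minimal sketch of the widely used LRP epsilon-rule for a small fully connected ReLU network, written in plain NumPy. The network size, weights and epsilon value here are illustrative assumptions, not taken from any particular library.

    import numpy as np

    def forward(x, weights, biases):
        # Run the network, keeping each layer's activations for the backward pass.
        activations = [x]
        for W, b in zip(weights, biases):
            x = np.maximum(0.0, W @ x + b)  # ReLU layer
            activations.append(x)
        return activations

    def lrp_epsilon(activations, weights, epsilon=1e-6):
        # Redistribute the output score layer by layer back to the input features.
        relevance = activations[-1]  # start from the network's output score
        for W, a in zip(reversed(weights), reversed(activations[:-1])):
            z = W @ a                                      # contribution flowing into each neuron
            z = z + epsilon * np.where(z >= 0, 1.0, -1.0)  # epsilon stabilizes the division
            s = relevance / z                              # relevance per unit of contribution
            relevance = a * (W.T @ s)                      # share relevance out to the layer below
        return relevance  # one relevance score per input feature

    # Toy two-layer network with random weights (illustration only; biases are kept
    # at zero so the redistribution conserves the output score).
    rng = np.random.default_rng(0)
    weights = [rng.normal(size=(4, 3)), rng.normal(size=(1, 4))]
    biases = [np.zeros(4), np.zeros(1)]
    x = rng.normal(size=3)

    acts = forward(x, weights, biases)
    print(lrp_epsilon(acts, weights))  # relevance of each of the 3 input features

Summed over the input features, the relevance scores approximately equal the network's output score; this conservation property is what distinguishes LRP from plain gradient saliency.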


Techniques of this kind help engineers learn more about how neural networks produce their results, and they are crucial in combating the "black box" problem in artificial intelligence, where models become so powerful and complex that it is hard for humans to understand how they arrive at their outputs.

Techopedia Explains Layer-Wise Relevance Propagation

Experts often contrast layer-wise relevance propagation with DeepLIFT, which uses a backpropagation-style pass to compare each artificial neuron's activation against a reference activation across the layers of the deep network. In fact, layer-wise relevance propagation can be described as a special case of DeepLIFT in which the reference activations of all neurons are set to the same baseline (typically zero) for analysis.
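One way to see this connection: for a ReLU network with zero biases and a zero baseline, both DeepLIFT's rescale rule and the LRP epsilon-rule (as epsilon approaches zero) reduce to multiplying each input by the gradient of the output with respect to that input. Continuing the sketch above, this illustrative check compares the two:

    # For the toy ReLU network above (zero biases, zero baseline), gradient * input
    # should approximately match the LRP epsilon-rule result as epsilon -> 0.
    mask = (acts[1] > 0).astype(float)                # which hidden units fired
    grad = weights[1] @ (mask[:, None] * weights[0])  # d(output)/d(input), shape (1, 3)
    print(grad.flatten() * x)                         # gradient * input attribution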

Techniques like layer-wise relevance propagation, DeepLIFT and LIME can also be connected to Shapley regression and sampling techniques, and to other methods that provide additional insight into how machine learning models arrive at their outputs.
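As a point of comparison, Shapley value sampling estimates each feature's contribution by averaging its marginal effect on the model's score over random orderings of the features. A minimal sketch, assuming `model` is any callable that maps an input vector to a scalar score (here it wraps the toy network from the sketch above):

    def shapley_sampling(model, x, baseline, n_samples=500, seed=0):
        # Monte Carlo estimate of Shapley values: average the marginal change in
        # the model's score as each feature is switched from baseline to x,
        # over random feature orderings.
        rng = np.random.default_rng(seed)
        phi = np.zeros(len(x))
        for _ in range(n_samples):
            current = baseline.copy()
            prev_score = model(current)
            for i in rng.permutation(len(x)):
                current[i] = x[i]             # add feature i to the coalition
                score = model(current)
                phi[i] += score - prev_score  # marginal contribution of feature i
                prev_score = score
        return phi / n_samples

    # Example: attribute the toy network's output, using a zero baseline.
    model = lambda v: forward(v, weights, biases)[-1][0]
    print(shapley_sampling(model, x, np.zeros_like(x)))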

