Layer-Wise Relevance Propagation

What Does Layer-Wise Relevance Propagation Mean?

Layer-wise relevance propagation (LRP) is a method for explaining the predictions of deep neural networks. It propagates the network's output backward, layer by layer, assigning each neuron, and ultimately each input feature, a relevance score that reflects how much it contributed to the result.

Techniques like this help engineers understand how neural networks arrive at their outputs. They are crucial in combating the "black box" problem in artificial intelligence, where systems become so powerful and complex that it is hard for humans to understand how they produce their results.

Techopedia Explains Layer-Wise Relevance Propagation

Specifically, experts often contrast layer-wise relevance propagation with DeepLIFT, a related method that uses backpropagation to examine how each artificial neuron's activation differs from a reference activation across the layers of a deep network. Some describe layer-wise relevance propagation as a special case of DeepLIFT in which the reference activations of all artificial neurons are set to the same neutral baseline of zero.
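That relationship can be seen directly in the simplest case. The sketch below, using toy numbers, compares the two attributions for a single linear layer: DeepLIFT's linear rule with an all-zero reference input gives each feature the contribution (a − 0)·w, which matches what basic LRP assigns when the output's own value is taken as the total relevance.

```python
import numpy as np

# Toy single linear layer: activations a, weights w, output z = a @ w.
a = np.array([1.0, -2.0, 0.5])
w = np.array([0.3, 0.8, -1.2])
z = a @ w

# DeepLIFT linear rule with a zero reference: contribution = (a - ref) * w.
deeplift = (a - 0.0) * w

# Basic LRP: share of z attributed to each input, scaled by relevance R = z.
lrp = (a * w / z) * z

print(np.allclose(deeplift, lrp))
```

The two vectors coincide here because dividing by z and multiplying by the relevance R = z cancel out; in deeper nonlinear networks the methods differ only in how the reference point is chosen.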

Techniques like layer-wise relevance propagation, DeepLIFT and LIME can be combined with Shapley regression, Shapley sampling and other attribution methods to provide additional insight into machine learning models.

Margaret Rouse
Technology expert
Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is to help IT and business professionals learn to speak each other's highly specialized languages.