
Layer-Wise Relevance Propagation

What does Layer-Wise Relevance Propagation mean?

Layer-wise relevance propagation (LRP) is a method for explaining the predictions of deep neural networks. Starting from the network's output, it propagates the prediction score backward through the network, layer by layer, redistributing "relevance" among the neurons of each layer until every input feature is assigned a score indicating how much it contributed to the result.
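
To make the idea concrete, below is a minimal sketch of one common propagation rule (the "epsilon" rule) in NumPy. The tiny network, its random weights and the helper name lrp_epsilon are hypothetical, chosen purely for illustration; real implementations apply such rules layer by layer through much larger models.

```python
import numpy as np

def lrp_epsilon(weights, biases, activations, relevance, eps=1e-6):
    """Propagate relevance one layer down using the LRP epsilon rule."""
    z = activations @ weights + biases         # pre-activations of this layer
    z = z + eps * np.where(z >= 0, 1.0, -1.0)  # stabilizer for tiny denominators
    s = relevance / z                          # relevance per unit of pre-activation
    return activations * (weights @ s)         # redistribute to the layer below

# Tiny two-layer ReLU network with hypothetical random weights
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 2)), np.zeros(2)

x = rng.normal(size=4)           # input features
a1 = np.maximum(0, x @ W1 + b1)  # hidden activations (ReLU)
out = a1 @ W2 + b2               # output scores

# Start at the output: all relevance goes to the winning class's score
R_out = np.zeros_like(out)
R_out[out.argmax()] = out[out.argmax()]

R_hidden = lrp_epsilon(W2, b2, a1, R_out)   # output layer -> hidden layer
R_input = lrp_epsilon(W1, b1, x, R_hidden)  # hidden layer -> input features

print(R_input)                            # per-feature relevance scores
print(R_input.sum(), out[out.argmax()])   # relevance is (approximately) conserved
```

A key design property the sketch illustrates: at each layer the total relevance is (approximately) conserved, so the input scores jointly account for the network's output.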

Techniques like this help engineers understand how neural networks arrive at their outputs. They are crucial in combating the "black box" problem in artificial intelligence, where systems become so powerful and complex that it is hard for humans to understand how they produce their results.


Techopedia explains Layer-Wise Relevance Propagation

Specifically, experts often contrast layer-wise relevance propagation with DeepLIFT, a related method that uses a form of backpropagation to compare each artificial neuron's activation against a reference activation across the layers of a deep network. On this view, layer-wise relevance propagation can be described as a special case of DeepLIFT in which the reference activations of all neurons are set to the same baseline, namely zero.
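
This relationship is easiest to see on a single linear unit. The toy numbers below are hypothetical; they simply check that DeepLIFT's linear rule with a zero reference assigns the same scores as the basic LRP redistribution rule (here the relevance arriving at the output is the output score itself):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=5)    # weights of a single linear unit
x = rng.normal(size=5)    # input to explain
y = w @ x                 # the unit's output score

x_ref = np.zeros_like(x)  # DeepLIFT reference activations set to zero
deeplift = w * (x - x_ref)  # DeepLIFT linear rule: weight * difference-from-reference
lrp = (x * w / y) * y       # basic LRP rule with output relevance R = y

print(np.allclose(deeplift, lrp))  # True: zero references recover the LRP scores
```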

Techniques like layer-wise relevance propagation, DeepLIFT and LIME can be combined with Shapley regression and sampling techniques, along with other methods, to provide additional insight into machine learning models.
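
As one illustration of the sampling side, here is a minimal, self-contained sketch of estimating Shapley values by sampling random feature orderings. The model f, the instance x and the baseline are hypothetical; for a linear model the exact Shapley values are w * (x - baseline), which the estimate should recover:

```python
import numpy as np

def shapley_sampling(f, x, baseline, n_samples=500, rng=None):
    """Monte Carlo estimate of Shapley values by permutation sampling."""
    rng = rng or np.random.default_rng(0)
    n = len(x)
    phi = np.zeros(n)
    for _ in range(n_samples):
        order = rng.permutation(n)  # random order in which features "arrive"
        z = baseline.copy()
        prev = f(z)
        for i in order:             # switch features on one at a time
            z[i] = x[i]
            cur = f(z)
            phi[i] += cur - prev    # marginal contribution of feature i
            prev = cur
    return phi / n_samples

# Hypothetical linear model: exact Shapley values are w * (x - baseline)
w = np.array([1.0, -2.0, 0.5])
f = lambda v: float(w @ v)
x, base = np.array([3.0, 1.0, -1.0]), np.zeros(3)
print(shapley_sampling(f, x, base))  # ~ [3.0, -2.0, -0.5]
```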


