Xavier Initialization

What Does Xavier Initialization Mean?

Xavier initialization (also called Glorot initialization) is a method for setting the initial weights of a neural network in order to avoid common training problems such as vanishing or exploding gradients. Rather than choosing weights arbitrarily, the method draws them from a distribution whose variance is scaled to the number of input and output units of each layer, which keeps signal magnitudes in a range that benefits training.


Techopedia Explains Xavier Initialization

Practitioners often explain that Xavier initialization helps machine learning models converge because it keeps neuron activations in a useful range — in the words of some data scientists, out of "saturated" or "dead" regions of the activation function, where gradients are near zero and learning stalls. Balancing the weights in this way facilitates better results.

Weighted inputs feed the transfer function, which feeds the activation function and the eventual output. The guiding principle of Xavier initialization is that the variance of a layer's outputs should equal the variance of its inputs, so that signals neither shrink nor blow up as they pass through the network. In the common formulation by Glorot and Bengio, this leads to weights drawn with variance 2 / (n_in + n_out), where n_in and n_out are the number of units feeding into and out of the layer. This gives a kind of stability to the learning procedure.
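As a rough sketch of the idea, the Xavier (Glorot) uniform scheme can be implemented in a few lines of NumPy. The function name and layer sizes below are illustrative, not from the original article; the scheme itself samples weights from U(-a, a) with a = sqrt(6 / (n_in + n_out)), which yields the target variance 2 / (n_in + n_out).

```python
import numpy as np

def xavier_uniform(n_in, n_out, rng=None):
    """Sample an (n_in, n_out) weight matrix from the Xavier/Glorot
    uniform distribution U(-a, a), where a = sqrt(6 / (n_in + n_out)).
    The variance of U(-a, a) is a^2 / 3 = 2 / (n_in + n_out)."""
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

# Illustrative layer sizes: 512 inputs, 256 outputs.
W = xavier_uniform(512, 256)
target_var = 2.0 / (512 + 256)
print(W.var(), target_var)  # empirical variance is close to the target
```

With enough samples the empirical variance of the weights matches the 2 / (n_in + n_out) target closely, which is exactly the property that keeps activation variance roughly constant from layer to layer.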


Margaret Rouse

Margaret Rouse is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical, business audience. Over the past twenty years her explanations have appeared on TechTarget websites, and she has been cited as an authority in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine and Discovery Magazine. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other's highly specialized languages. If you have a suggestion for a new definition or how to improve a technical explanation, please email Margaret or contact her…