Definition - What does Weight mean?

The idea of weight is a foundational concept in artificial neural networks. A set of weighted inputs allows each artificial neuron or node in the system to produce related outputs. Professionals working on machine learning and artificial intelligence projects that use artificial neural networks or similar systems often talk about weight in terms of both biological and technological systems, since the concept applies to each.

Weight is also known as synaptic weight.

Techopedia explains Weight

In an artificial neuron, a collection of weighted inputs is the vehicle through which the neuron applies an activation function and produces a decision (either firing or not firing). Typical artificial neural networks have several layers, including an input layer, one or more hidden layers and an output layer. At each layer, the individual neuron takes in its inputs and weights them accordingly. This simulates the biological activity of individual neurons, which send signals with a given synaptic weight from the axon of one neuron to the dendrites of another.
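To make this concrete, here is a minimal sketch of a single artificial neuron in Python. The weights, bias and step-function activation are illustrative choices, not taken from any particular library; real networks typically use smooth activations such as the sigmoid or ReLU.

```python
def neuron(inputs, weights, bias):
    # Weighted sum of inputs: inputs with larger weights
    # contribute more to the neuron's decision.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Step activation: "fire" (1) if the weighted sum is positive.
    return 1 if total > 0 else 0

# Two inputs; the first carries more weight than the second.
print(neuron([1.0, 1.0], [0.8, 0.2], bias=-0.5))  # 0.8 + 0.2 - 0.5 > 0, so it fires: 1
print(neuron([0.0, 1.0], [0.8, 0.2], bias=-0.5))  # 0.2 - 0.5 <= 0, so it stays silent: 0
```

With the same inputs, changing the weights changes the decision, which is exactly why learning in a neural network amounts to adjusting the weights.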

IT pros can use specific mathematical equations and visual modeling tools to show how synaptic weights are used in an artificial neural network. Through a process called backpropagation, the weights are adjusted according to the error between the network's actual output and the desired output, so the system learns to apply them correctly over time. All of this is foundational to how neural networks function in sophisticated machine learning projects.
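The following sketch shows that idea for the simplest possible case: a single sigmoid neuron trained by gradient descent on a squared error. The inputs, starting weights, learning rate and target are all assumed values chosen for illustration; a full backpropagation implementation would repeat this weight-update step layer by layer through the network.

```python
import math

def sigmoid(z):
    # Smooth activation that squashes any value into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

inputs = [0.5, -0.2]
weights = [0.1, 0.4]   # assumed starting weights, chosen arbitrarily
target = 1.0           # the output we want the neuron to learn
lr = 0.5               # learning rate: how large each weight update is

for _ in range(1000):
    out = sigmoid(sum(x * w for x, w in zip(inputs, weights)))
    # Gradient of the squared error with respect to each weight
    # (chain rule): dE/dw_i = (out - target) * out * (1 - out) * x_i
    delta = (out - target) * out * (1.0 - out)
    # Move each weight a small step against its gradient.
    weights = [w - lr * delta * x for x, w in zip(inputs, weights)]

# After training, the neuron's output has moved toward the target.
print(sigmoid(sum(x * w for x, w in zip(inputs, weights))))
```

The key point is that the weights themselves are what change during learning: the inputs stay the same, but their influence on the output is gradually tuned to reduce the error.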

This definition was written in the context of Neural Networks