Hebbian theory is a cell activation model, used in artificial neural networks, built on the idea of "synaptic plasticity": the connections (synapses) between neurons dynamically strengthen or weaken over time depending on how often and how strongly they are used.
Hebbian theory is also known as Hebbian learning, Hebb's rule or Hebb's postulate.
Hebbian theory is named after Donald Hebb, a Canadian psychologist from Nova Scotia whose 1949 book "The Organization of Behavior" laid part of the groundwork for the development of artificial neural networks.
In modern artificial neural networks, learning algorithms update the weights of the connections between neurons. Professionals sometimes invoke "Hebb's rule," often summarized as "neurons that fire together, wire together," to describe how these connections work and how they change. Part of the appeal of Hebbian theory is that by adjusting neural weights and associations, engineers can coax different results out of sophisticated artificial neural networks.
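To make the weight-update idea concrete, here is a minimal sketch of the classic Hebbian learning step, delta_w = eta * x * y, for a single linear neuron. This is an illustration of the general rule, not code from any particular library; the function name and the learning rate eta are our own choices.

```python
def hebbian_update(weights, x, eta=0.1):
    """One Hebbian step for a single linear neuron.

    weights: current synaptic weights (one per input)
    x: presynaptic activity (input values)
    eta: learning rate (illustrative value, not standard)
    """
    # Postsynaptic activity: weighted sum of the inputs
    y = sum(w * xi for w, xi in zip(weights, x))
    # Hebb's rule: strengthen each synapse in proportion to x_i * y,
    # so connections between co-active neurons grow stronger
    return [w + eta * xi * y for w, xi in zip(weights, x)]

# Repeatedly present an input where only the first neuron is active:
# its synapse strengthens, while the inactive neuron's weight is unchanged.
weights = [0.5, 0.5]
for _ in range(3):
    weights = hebbian_update(weights, [1.0, 0.0])
print(weights)  # first weight has grown; second is still 0.5
```

Note that the plain rule only ever strengthens active connections, which is why practical variants (such as Oja's rule) add a decay term to keep weights from growing without bound.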