Learning vector quantization (LVQ) is a type of artificial neural network algorithm based on neural computation; more broadly, it falls under computational intelligence. It takes a competitive, winner-takes-all approach to learning and is related to other neural network algorithms such as the perceptron and back-propagation. The LVQ algorithm lets the user choose the number of training instances (codebook vectors) to maintain and then learns what those instances should look like. LVQ was invented by Teuvo Kohonen and is closely related to the k-nearest neighbor algorithm.
In terms of information processing, the basic objective of learning vector quantization is to prepare a set of codebook vectors in the domain of the observed data samples; these vectors are then used to classify unseen vectors. A pool of vectors is first initialized at random and then exposed to the training samples. Under a winner-takes-all strategy, the codebook vector most similar to a given input pattern is selected and adjusted to move closer to that input or, when its class does not match, pushed further away; some variants also adjust the runner-up. Repeating this process distributes the codebook vectors across the input space so that they approximate the distribution of the samples underlying the data set. The algorithm is used for predictive modeling.
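The training loop described above can be sketched in Python using the common LVQ1 update rule. The function names, the per-class initialization, and the linearly decaying learning rate below are illustrative assumptions rather than a fixed specification:

```python
import numpy as np

def train_lvq1(X, y, n_codebooks=6, lr=0.3, epochs=30, seed=0):
    """Train codebook vectors with the LVQ1 rule (a minimal sketch)."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Initialize codebooks by sampling training instances from each class,
    # so every class is represented in the initial pool.
    per_class = max(1, n_codebooks // len(classes))
    idx = np.concatenate(
        [rng.choice(np.flatnonzero(y == c), size=per_class, replace=False)
         for c in classes]
    )
    codebooks, labels = X[idx].copy(), y[idx].copy()
    for epoch in range(epochs):
        rate = lr * (1.0 - epoch / epochs)  # linearly decaying learning rate
        for xi, yi in zip(X, y):
            # Winner-takes-all: pick the codebook closest to the input.
            bmu = np.argmin(np.linalg.norm(codebooks - xi, axis=1))
            if labels[bmu] == yi:
                # Same class: move the winner closer to the input.
                codebooks[bmu] += rate * (xi - codebooks[bmu])
            else:
                # Different class: push the winner further away.
                codebooks[bmu] -= rate * (xi - codebooks[bmu])
    return codebooks, labels

def predict(codebooks, labels, x):
    """Classify an unseen vector by its nearest codebook vector."""
    return labels[np.argmin(np.linalg.norm(codebooks - x, axis=1))]
```

After training, classification of an unseen vector reduces to a nearest-neighbor lookup over the small set of codebook vectors rather than over the full training set, which is the practical payoff of preparing the codebooks.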