Adaptive resonance theory (ART) is a design philosophy behind a family of unsupervised artificial neural network models. Its architecture is built to give a network the capacity for new learning while preserving the fundamental patterns it has already learned.
The design of ART networks is largely attributed to Stephen Grossberg and Gail Carpenter and their work in the 1980s. Kohonen's self-organizing networks were another influence.
Experts describe adaptive resonance theory as, in part, an effort to remain open to new learning without sacrificing knowledge of existing patterns – hence the words "adaptive" and "resonance." A key part of an ART network is a classifier that compares incoming inputs against stored patterns.
One way to look at ART is as an attempt to solve the stability-plasticity dilemma (SPD). In a nutshell, the dilemma asks how a system can remain stable in the face of irrelevant inputs while still adapting when relevant, significant events require plasticity. Understanding the stability-plasticity dilemma is a key step toward understanding ART and how to use it in neural network designs.
Experts describe this in terms of evaluating "expectations": each input vector is matched against the network's stored prototype patterns. When a match is close enough, the network resonates and refines that stored pattern; when no stored pattern matches well enough, the network creates a new category. This vector-matching process accomplishes some of this high-level cognitive work through an unsupervised neural network architecture.
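The matching-and-resonance cycle described above can be sketched in a few lines of code. This is a simplified, illustrative version of an ART1-style classifier, not a full implementation: it assumes binary input vectors, and the class name, the `vigilance` parameter, and the `present` method are naming choices made here for clarity.

```python
import numpy as np

class ART1Sketch:
    """Simplified ART1-style classifier for binary input patterns."""

    def __init__(self, vigilance=0.7):
        self.rho = vigilance   # vigilance: how close a match must be to resonate
        self.prototypes = []   # stored binary pattern templates ("expectations")

    def present(self, x):
        """Present an input; return the index of the matching or new category."""
        x = np.asarray(x, dtype=bool)
        for i, p in enumerate(self.prototypes):
            overlap = np.logical_and(x, p)
            # Vigilance (resonance) test: enough of the input matches the prototype?
            if overlap.sum() / x.sum() >= self.rho:
                # Stability with plasticity: refine the prototype toward
                # the shared features instead of overwriting it wholesale.
                self.prototypes[i] = overlap
                return i
        # No resonance: commit a new category, leaving old knowledge intact.
        self.prototypes.append(x)
        return len(self.prototypes) - 1
```

With a high vigilance value, only near-identical inputs share a category; lowering vigilance makes categories broader. This tunable threshold is how the sketch balances stability (existing prototypes survive) against plasticity (novel inputs get new categories).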