Citation: IEEE Transactions on Neural Networks, vol. 10, no. 4, pp. 757-767
Abstract: The internal competition between categories in the adaptive resonance theory (ART) neural model can be biased by replacing the original choice function by one that contains an attentional tuning parameter under external control. For the same input but different values of the attentional tuning parameter, the network can learn and recall different categories with different degrees of generality, thus permitting the coexistence of both general and specific categorizations of the same set of data. Any number of these categorizations can be learned within one and the same network by virtue of generalization and discrimination properties. A simple model in which the attentional tuning parameter and the vigilance parameter of ART are linked together is described. The self-stabilization property is shown to be preserved for an arbitrary sequence of analog inputs, and for arbitrary orderings of arbitrarily chosen vigilance levels.
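The paper's modified choice function with an external attentional tuning parameter is not reproduced in the abstract, but the mechanism it is linked to, the vigilance parameter of ART, can be illustrated with a minimal Fuzzy ART sketch. The sketch below is an assumption-laden illustration, not the paper's model: it uses the standard Fuzzy ART choice function T_j = |x ∧ w_j| / (α + |w_j|), complement coding, and fast learning, and it shows how low vigilance yields a general categorization while high vigilance yields a specific one of the same data. All class and parameter names (`FuzzyART`, `rho`, `alpha`) are hypothetical.

```python
def fuzzy_and(a, b):
    """Component-wise fuzzy AND (minimum) of two vectors."""
    return [min(u, v) for u, v in zip(a, b)]

class FuzzyART:
    """Minimal Fuzzy ART sketch; names and defaults are illustrative."""

    def __init__(self, rho, alpha=0.001):
        self.rho = rho      # vigilance: higher values force more specific categories
        self.alpha = alpha  # choice parameter of the standard choice function
        self.w = []         # one weight vector per learned category

    def train(self, I):
        """Present one analog input (components in [0, 1]); return its category index."""
        x = list(I) + [1.0 - v for v in I]  # complement coding, so |x| = len(I)
        # Rank categories by the choice function T_j = |x ^ w_j| / (alpha + |w_j|).
        order = sorted(
            range(len(self.w)),
            key=lambda j: -sum(fuzzy_and(x, self.w[j])) / (self.alpha + sum(self.w[j])),
        )
        for j in order:
            # Vigilance test: the match ratio |x ^ w_j| / |x| must reach rho.
            if sum(fuzzy_and(x, self.w[j])) / sum(x) >= self.rho:
                self.w[j] = fuzzy_and(x, self.w[j])  # fast learning (beta = 1)
                return j
        self.w.append(x)  # no category resonates: commit a new one
        return len(self.w) - 1
```

For the same input sequence, a low-vigilance network groups nearby values into a few broad categories, while a high-vigilance network keeps them apart as finer ones, mirroring the coexisting general and specific categorizations described in the abstract.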