Invariant pattern recognition and recall by an attentive ART architecture in a nonstationary world

Author(s): Carpenter, G.A. | Grossberg, S.

Year: 1987

Citation: Proceedings of the IEEE First International Conference on Neural Networks, II, 737-745.

Abstract: A neural network is described which can stably self-organize an invariant pattern recognition code in response to a sequence of analog or digital input patterns; be attentionally primed to ignore all but a designated category of input patterns; automatically shift its prime as it satisfies internal criteria in response to the occurrence of a previously primed category of input patterns; and learn to generate an arbitrary spatiotemporal output pattern in response to any input pattern exemplar of an activated recognition category. This architecture exploits properties of the ART 1 and ART 2 adaptive resonance theory architectures, which have been developed in Carpenter and Grossberg; the Boundary Contour System for boundary segmentation and the Feature Contour System for figural filling-in, which have been developed in Cohen and Grossberg, Grossberg, Grossberg and Mingolla, and Grossberg and Todorovic; theorems on associative pattern learning and associative map learning; and circuit designs to focus attention on desired goal objects by using learned feedback interactions between external sensory events and internal homeostatic events. The overall circuit design embodies, in a primitive way, an intentional learning machine in which distinct cognitive, homeostatic, and motor representations are self-organized in a coordinated fashion.
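The stable self-organization of recognition categories described above is the core ART match/reset cycle. As a rough illustration only (not the circuit in this paper), the following sketch shows how a standard ART 1 network commits and refines binary category templates under a vigilance criterion; the function name `art1_present` and parameter choices are illustrative assumptions, not from the source.

```python
import numpy as np

def art1_present(patterns, rho=0.7, beta=1e-6):
    """Illustrative ART 1 sketch: cluster binary patterns into stable categories.

    rho is the vigilance parameter; higher rho forces finer categories.
    Assumes each input pattern has at least one active (1) feature.
    """
    weights = []  # one binary template per committed category
    labels = []   # category index assigned to each input
    for I in patterns:
        I = np.asarray(I, dtype=bool)
        # Rank committed categories by the ART 1 choice function
        # T_j = |I AND w_j| / (beta + |w_j|).
        order = sorted(
            range(len(weights)),
            key=lambda j: -np.sum(I & weights[j]) / (beta + np.sum(weights[j])),
        )
        for j in order:
            # Vigilance (match) test: |I AND w_j| / |I| >= rho.
            if np.sum(I & weights[j]) / np.sum(I) >= rho:
                # Resonance: fast learning shrinks the template to the intersection.
                weights[j] = I & weights[j]
                labels.append(j)
                break
        else:
            # All categories reset by vigilance: commit a new category.
            weights.append(I.copy())
            labels.append(len(weights) - 1)
    return labels, weights
```

For example, presenting `[1,1,0,0]`, `[1,1,0,0]`, `[0,0,1,1]` with `rho=0.7` groups the first two patterns into one category and the third into a new one, since the third pattern's overlap with the first template fails the vigilance test.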

Topics: Image Analysis, Machine Learning, Models: ART 1, ART 2 / Fuzzy ART, Boundary Contour System
