ART: Self-organizing neural networks for learning and memory of cognitive recognition codes

Author(s): Carpenter, G.A. | Grossberg, S.

Year: 1990

Citation: Proceedings of the 12th Annual Conference of the Cognitive Science Society, Hillsdale, NJ: Erlbaum Associates, 1032-1034.

Abstract: Adaptive resonance theory (ART) architectures are neural networks that self-organize stable pattern recognition codes in real time in response to arbitrary sequences of analog or binary input patterns. In ART architectures, top-down learned expectation and matching mechanisms are critical in self-stabilizing the code learning process. A parallel search scheme updates itself adaptively as the learning process unfolds, and realizes a form of real-time hypothesis discovery, testing, learning, and recognition. An attentional vigilance parameter determines how fine the categories will be. If vigilance increases (decreases) due to environmental feedback, then the system automatically searches for and learns finer (coarser) recognition categories. Learned representations are encoded in bottom-up and top-down adaptive filters whose long-term memory (LTM) traces vary slowly compared to the rapid short-term memory (STM) information processing.
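The match-and-search cycle the abstract describes, and the way vigilance controls category granularity, can be illustrated with a minimal ART 1 sketch for binary inputs. This is an illustrative simplification, not the paper's full differential-equation model: `rho` (vigilance), `beta` (choice tie-breaking), and fast one-shot learning by template intersection are standard ART 1 conventions, but the function name and data layout here are hypothetical. Inputs are assumed to be nonempty binary vectors with at least one active bit.

```python
def art1_fit(patterns, rho, beta=0.5):
    """Cluster binary vectors with a minimal ART 1 sketch.

    rho: vigilance in [0, 1] -- higher rho yields finer categories.
    beta: small constant in the bottom-up choice function.
    Assumes every pattern has at least one 1 bit.
    """
    weights = []  # top-down templates (slow LTM traces)
    labels = []   # category chosen for each input
    for x in patterns:
        # Parallel search: rank committed categories by the
        # bottom-up choice function |x AND w_j| / (beta + |w_j|).
        order = sorted(
            range(len(weights)),
            key=lambda j: -sum(a & b for a, b in zip(x, weights[j]))
                          / (beta + sum(weights[j])))
        for j in order:
            overlap = [a & b for a, b in zip(x, weights[j])]
            # Vigilance test: the top-down expectation must match
            # the input closely enough, else reset and search on.
            if sum(overlap) / sum(x) >= rho:
                weights[j] = overlap          # fast learning: intersect template
                labels.append(j)
                break
        else:
            weights.append(list(x))           # no match: commit a new category
            labels.append(len(weights) - 1)
    return labels, weights
```

Running the same three patterns at high versus low vigilance shows the finer/coarser effect the abstract states: raising `rho` forces more categories, lowering it merges inputs into broader ones.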

Topics: Machine Learning; Models: ART 1

