Author(s): Carpenter, G.A. | Milenova, B.
Year: 1999
Citation: Proceedings of the International Joint Conference on Neural Networks (IJCNN 99), CD-ROM (IEEE Catalog Number: 99CH36339C): #3022. Session 5.13.
Abstract: Distributed coding at the hidden layer of a multi-layer perceptron (MLP) endows the network with memory compression and noise tolerance capabilities. However, an MLP typically requires slow off-line learning to avoid catastrophic forgetting in an open input environment. An adaptive resonance theory (ART) model is designed to guarantee stable memories even with fast on-line learning. However, ART stability typically requires winner-take-all coding, which may cause category proliferation in a noisy input environment. Distributed ARTMAP (dARTMAP) seeks to combine the computational advantages of MLP and ART systems in a real-time neural network for supervised learning. This system incorporates elements of the unsupervised dART model as well as new features, including a content-addressable memory (CAM) rule. Simulations show that dARTMAP retains fuzzy ARTMAP accuracy while significantly improving memory compression. The model's computational learning rules correspond to paradoxical cortical data.
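For intuition on the winner-take-all versus distributed coding contrast discussed in the abstract, here is a minimal Python sketch using the standard fuzzy ART choice function T_j = |x ^ w_j| / (alpha + |w_j|). The weight values, the alpha parameter, and the simple normalization used for the distributed code are illustrative assumptions; the paper's actual CAM rule is part of the dARTMAP model and is not reproduced here.

import numpy as np

def choice_values(x, W, alpha=0.01):
    # Fuzzy ART choice function: T_j = |x ^ w_j| / (alpha + |w_j|),
    # where ^ is the component-wise minimum (fuzzy AND).
    return np.array([np.minimum(x, w).sum() / (alpha + w.sum()) for w in W])

def winner_take_all(T):
    # WTA coding: the single most active category takes all activation.
    y = np.zeros_like(T)
    y[np.argmax(T)] = 1.0
    return y

def distributed_code(T):
    # Distributed coding: activation spread across categories in
    # proportion to choice values (an illustrative normalization,
    # not the dARTMAP CAM rule from the paper).
    return T / T.sum()

x = np.array([0.6, 0.4])                            # one input pattern
W = np.array([[0.5, 0.5], [0.9, 0.1], [0.2, 0.8]])  # three category weights
T = choice_values(x, W)
print(winner_take_all(T))   # [1. 0. 0.]: only the best-matching category fires
print(distributed_code(T))  # graded activation across all three categories

Compressing a code this way illustrates why distributed activation can share memory across categories, whereas WTA coding commits each input to a single category node.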
Topics: Machine Learning
Models: ARTMAP, Distributed ART