Author(s): Amis, G.P. | Carpenter, G.A.
Year: 2009
Citation: Technical Report CAS/CNS TR-2009-006
Abstract: Computational models of learning typically train on labeled input patterns (supervised learning), unlabeled input patterns (unsupervised learning), or a combination of the two (semi-supervised learning). In each case input patterns have a fixed number of features throughout training and testing. Human and machine learning contexts present additional opportunities for expanding incomplete knowledge from formal training, via self-directed learning that incorporates features not previously experienced. This article defines a new self-supervised learning paradigm to address these richer learning contexts, introducing a new neural network called self-supervised ARTMAP. Self-supervised learning integrates knowledge from a teacher (labeled patterns with some features), knowledge from the environment (unlabeled patterns with more features), and knowledge from internal model activation (self-labeled patterns). Self-supervised ARTMAP learns about novel features from unlabeled patterns without destroying partial knowledge previously acquired from labeled patterns. A category selection function bases system predictions on known features, and distributed network activation scales unlabeled learning to prediction confidence. Slow distributed learning on unlabeled patterns focuses on novel features and confident predictions, defining classification boundaries that were ambiguous in the labeled patterns. Self-supervised ARTMAP improves test accuracy on illustrative low-dimensional problems and on high-dimensional benchmarks. Model code and benchmark data are available from: http://cns.bu.edu/techlab/SSART/.
Topics: Machine Learning
Applications: Remote Sensing
Models: ARTMAP, Fuzzy ARTMAP
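
The sketch below is only an illustration of the self-supervised setup the abstract describes (labeled patterns with a subset of features, then slow, confidence-weighted learning on unlabeled patterns carrying the full feature set); it is not the published Self-Supervised ARTMAP code, which is available at the URL above. A nearest-centroid classifier stands in for the ARTMAP network, and every name, parameter value, and data set here is an assumption made for the example.

# Minimal sketch of the self-supervised paradigm: supervised stage on known
# features only, then self-labeled, confidence-scaled updates on novel features.
# Stand-in classifier (nearest centroid), not the actual ARTMAP model.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 features, 2 classes; only the first 2 features are "known"
# during the labeled (teacher) phase.
n_per_class, n_features, known = 100, 4, 2
means = np.array([[0.0, 0.0, 1.0, -1.0],
                  [2.0, 2.0, -1.0, 1.0]])
X = np.vstack([rng.normal(m, 1.0, size=(n_per_class, n_features)) for m in means])
y = np.repeat([0, 1], n_per_class)

# Labeled patterns expose only the known features; unlabeled patterns expose
# all features but no class labels.
labeled_idx = rng.choice(len(X), size=40, replace=False)
unlabeled_idx = np.setdiff1d(np.arange(len(X)), labeled_idx)

# Stage 1: supervised learning on the known features only.
centroids = np.zeros((2, n_features))
for c in (0, 1):
    centroids[c, :known] = X[labeled_idx][y[labeled_idx] == c][:, :known].mean(axis=0)

# Stage 2: self-supervised learning on unlabeled, full-featured patterns.
# Predictions are based on the known features, the learning rate is scaled by
# prediction confidence, and updates focus on the previously unseen features.
base_lr = 0.05  # slow learning rate (assumed value)
for x in X[unlabeled_idx]:
    d = np.linalg.norm(centroids[:, :known] - x[:known], axis=1)
    pred = int(np.argmin(d))
    confidence = d[1 - pred] / (d.sum() + 1e-9)  # in (0, 1); higher = more confident
    lr = base_lr * confidence
    # Move the winning centroid toward the pattern, mainly along novel features.
    centroids[pred, known:] += lr * (x[known:] - centroids[pred, known:])
    centroids[pred, :known] += 0.1 * lr * (x[:known] - centroids[pred, :known])

# Evaluate on fresh patterns using all features, including those never labeled.
test = rng.normal(means.repeat(50, axis=0), 1.0)
test_y = np.repeat([0, 1], 50)
pred = np.argmin(np.linalg.norm(test[:, None, :] - centroids[None, :, :], axis=2), axis=1)
print("toy test accuracy:", (pred == test_y).mean())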