Neural expectation: Cerebellar and retinal analogs of cells fired by learnable or unlearned pattern classes

Author(s): Grossberg, S.

Year: 1972

Citation: Kybernetik, 10, 49-57

Abstract: Neural networks are introduced which can be taught by classical or instrumental conditioning to fire in response to arbitrary learned classes of patterns. The filters of output cells are biased by presetting cells whose activation prepares the output cell to "expect" prescribed patterns. For example, an animal that learns to expect food in response to a lever press becomes frustrated if food does not follow the lever press. Its expectations are thereby modified, since frustration is negatively reinforcing. A neural analog with aspects of cerebellar circuitry is noted, including diffuse mossy fiber inputs feeding parallel fibers that end in Purkinje cell dendrites, climbing fiber inputs ending in Purkinje cell dendrites and giving off collaterals to nuclear cells, and inhibitory Purkinje cell outputs to nuclear cells. The networks are motivated by studying mechanisms of pattern discrimination that require no learning. The latter often use two successive layers of inhibition, analogous to the horizontal and amacrine cell layers in vertebrate retinas. Cells exhibiting hue (in)constancy, brightness (in)constancy, or movement detection properties are included. These results are relevant to Land's retinex theory and to the existence of opponent- and nonopponent-type cell responses in retinal cells. Some adaptation mechanisms, and arousal mechanisms for crispening the pattern weights that can fire a given cell, are noted.
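The two successive inhibitory layers mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's model: the subtractive surround kernel, the half-wave rectification, and all names (`inhibit`, `kernel`) are illustrative assumptions, chosen only to show how cascading two stages of lateral inhibition sharpens the edges of an input pattern.

```python
import numpy as np

def inhibit(signal, kernel):
    """One layer of subtractive lateral inhibition: each cell keeps its
    own input minus a weighted sum of its neighbors' inputs (the
    inhibitory surround), half-wave rectified to nonnegative activity.
    The kernel here is a stand-in, not taken from the paper."""
    surround = np.convolve(signal, kernel, mode="same")
    return np.maximum(signal - surround, 0.0)

# Illustrative surround: immediate neighbors inhibit with weight 0.25.
kernel = np.array([0.25, 0.0, 0.25])

# A flat bar of activity on a dark background.
pattern = np.array([0.0, 1.0, 1.0, 1.0, 0.0, 0.0])

# Two successive inhibitory layers, loosely analogous to the
# horizontal and amacrine cell layers named in the abstract.
stage1 = inhibit(pattern, kernel)
stage2 = inhibit(stage1, kernel)
```

After each stage the responses at the edges of the bar exceed the response at its interior, so the cascade progressively emphasizes pattern boundaries without any learning, which is the qualitative point of the two-layer arrangement.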

Topics: Biological Learning, Mathematical Foundations of Neural Networks, Applications: Other, Models: Other





Cross References


  1. Embedding fields: A theory of learning with physiological implications
    A learning theory in continuous time is derived herein from simple psychological postulates. The theory has an anatomical and neurophysiological interpretation in terms of nerve cell bodies, axons, synaptic knobs, membrane ...

  2. Pavlovian pattern learning by nonlinear neural networks
    This note describes laws for the anatomy, potentials, spiking rules, and transmitters of some networks of formal neurons that enable them to learn spatial patterns by Pavlovian conditioning. Applications to space-time pattern ...