Mathematical Foundations of Neural Networks

Mathematical methods such as differential equations are fundamental to our understanding of neural networks. The papers collected here are devoted to direct applications of these mathematical concepts.


Articles & Tech Transfers


A dynamical adaptive resonance architecture
Abstract A set of nonlinear differential equations that describe the dynamics of the ART1 model are presented, along with the motivation for their use. These equations are extensions of those developed by Carpenter and Grossberg ...

Some nonlinear networks capable of learning a spatial pattern of arbitrary complexity
Abstract Introduction: This note describes some nonlinear networks which can learn a spatial pattern, in "black and white," of arbitrary size and complexity. These networks are a special case of a collection of learning machines ...

Some physiological and biochemical consequences of psychological postulates
Abstract This note lists some psychological, physiological, and biochemical predictions that have been derived from simple psychological postulates. These psychological postulates have been used to derive a new learning theory, 1-3 ...

Global ratio limit theorems for some nonlinear functional differential equations, I
Abstract 1. Introduction. We study some systems of nonlinear functional-differential equations of the form (1) Ẋ(t) = AX(t) + B(X_t)X(t - τ) + C(t), t ≥ 0, where X = (x_1, ..., x_n) is nonnegative, B(X_t) = ||B_ij(t)|| is a matrix of nonlinear ...
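A minimal numerical sketch of how systems of this general form, Ẋ(t) = AX(t) + B(X_t)X(t - τ) + C(t), can be integrated is given below. The matrix A, the constant input C, the delay τ, and in particular the nonlinear functional B(X_t) are illustrative assumptions chosen for the sketch; they are not the functionals analyzed in the paper.

    # Forward-Euler sketch of X'(t) = A X(t) + B(X_t) X(t - tau) + C, t >= 0.
    # A, C, tau, and the functional B are hypothetical choices, not the paper's.
    import numpy as np

    n, tau, dt, T = 3, 1.0, 0.01, 20.0
    steps, delay_steps = int(T / dt), int(tau / dt)

    A = -0.5 * np.eye(n)                    # stable linear part
    C = 0.1 * np.ones(n)                    # constant nonnegative input C(t)

    def B(history):
        # Hypothetical nonlinear functional of the recent trajectory X_t:
        # a saturating matrix built from the time-averaged state over [t - tau, t].
        m = history.mean(axis=0)
        return 0.2 * np.tanh(np.outer(m, m))

    X = np.zeros((steps + 1, n))
    X[:delay_steps + 1] = 0.1               # nonnegative initial history on [-tau, 0]

    for k in range(delay_steps, steps):
        hist = X[k - delay_steps:k + 1]     # restriction X_t of the solution to [t - tau, t]
        dX = A @ X[k] + B(hist) @ X[k - delay_steps] + C
        X[k + 1] = np.maximum(X[k] + dt * dX, 0.0)   # keep the solution nonnegative

    print("X(T) ~", X[-1])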

On learning and energy-entropy dependence in recurrent and nonrecurrent signed networks
Abstract Learning of patterns by neural networks obeying general rules of sensory transduction and of converting membrane potentials to spiking frequencies is considered. Any finite number of cells A can sample a pattern playing on ...

On the variational systems of some nonlinear difference-differential equations
Abstract This paper studies the variational systems of two closely related systems of nonlinear difference-differential equations which arise in prediction- and learning-theoretical applications ([1], [2], [3]). The first system is ...

A prediction theory for some nonlinear functional-differential equations, II: Learning of patterns
Abstract This paper studies the following system of nonlinear difference-differential equations: [...] (3) where i, j, k = 1, 2, ..., n, and β ≥ 0. We will establish global limit and oscillation theorems for the nonnegative solutions of ...

A prediction theory for some nonlinear functional-differential equations, I: Learning of lists
Abstract In this paper, we study some systems of nonlinear functional-differential equations of the form
Ẋ(t) = AX(t) + B(X_t)X(t - τ) + C(t), t ≥ 0, (1) which ...

Some networks that can learn, remember, and reproduce any number of complicated space-time patterns, I.
Abstract 1. Introduction. This paper describes some networks ℳ that can learn, simultaneously remember, and individually reproduce on demand any number of spatiotemporal patterns (e.g., "motor sequences") of essentially arbitrary ...

Learning and energy-entropy dependence in some nonlinear functional-differential systems
Abstract 1. Introduction. This note describes limiting and oscillatory features of some nonlinear functional-differential systems having applications in learning and nonstationary prediction theory. The main results discuss systems ...

On learning, information, lateral inhibition, and transmitters
Abstract A mathematical model with both a psychological and neurophysiological interpretation is introduced to qualitatively explain data about serial learning of lists. Phenomena such as bowing, anchoring, chunking, backward ...

On the production and release of chemical transmitters and related topics in cellular control
Abstract This paper makes some neurophysiological and biochemical predictions concerning transmitter production and release which are suggested by psychological postulates. A main theme is the joint control of presynaptic excitatory ...

On learning of spatiotemporal patterns by networks with ordered sensory and motor components, I: Excitatory components of the cerebellum
Abstract Many of our sensory and motor organs have linearly ordered components, for example the fingers on a hand, the tonotopic organization of the auditory system, the successive joints on arms and legs, the spine, etc. This paper ...

Global ratio limit theorems for some nonlinear functional differential equations, II
Abstract Introduction: A previous note [1] introduced some systems of nonlinear functional-differential equations of the form Ẋ(t) = AX(t) + B(X_t)X(t - τ) + C(t), t ≥ 0, where X = (x_1, ..., x_n) is nonnegative, B(X_t) is a ...

On the global limits and oscillations of a system of nonlinear differential equations describing a flow of a probabilistic network
Abstract 1. INTRODUCTION: This paper considers various aspects of the global limiting and oscillatory behavior of the following system of nonlinear differential equations: ẋ_i(t) ...

Embedding fields: A theory of learning with physiological implications
Abstract A learning theory in continuous time is derived herein from simple psychological postulates. The theory has an anatomical and neurophysiological interpretation in terms of nerve cell bodies, axons, synaptic knobs, membrane ...

Some networks that can learn, remember, and reproduce any number of complicated space-time patterns, II.
Abstract 1. Introduction - This paper describes some networks ℳ that can learn, simultaneously remember, and perform individually upon demand any number of spatiotemporal patterns (e.g., "motor sequences" and "internal perceptual ...

Neural expectation: Cerebellar and retinal analogs of cells fired by learnable or unlearned pattern classes
Abstract Neural networks are introduced which can be taught by classical or instrumental conditioning to fire in response to arbitrary learned classes of patterns. The filters of output cells are biased by presetting cells whose ...

Neural dynamics of decision making under risk: Affective balance and cognitive-emotional interactions
Abstract A real-time neural network model, called affective balance theory, is developed to explain many properties of decision making under risk that heretofore have been analyzed using formal algebraic models, notably prospect ...

Spiking threshold and overarousal effects in serial learning
Abstract Possible dependencies of serial learning data on physiological parameters such as spiking thresholds, arousal level, and decay rate of potentials are considered in a rigorous learning model. Influence of these parameters on ...

Pattern formation, contrast control, and oscillations in the short-term memory of shunting on-center off-surround networks
Abstract The transformation of spatial patterns and their storage in short term memory by shunting neural networks are studied herein. Various mechanisms are described for real-time regulation of the amount of contrast with which a ...

On the dynamics of operant conditioning
Abstract Simple psychological postulates are presented which are used to derive possible anatomical and physiological substrates of operant conditioning. These substrates are compatible with much psychological data about operants. A ...

Pavlovian pattern learning by nonlinear neural networks
Abstract This note describes laws for the anatomy, potentials, spiking rules, and transmitters of some networks of formal neurons that enable them to learn spatial patterns by Pavlovian conditioning. Applications to spacetime pattern ...
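A simplified, gated learning rule in the spirit of such Pavlovian (outstar) pattern learning is sketched below: synaptic weights from a sampling cell drift toward the spatial pattern active at the sampled cells only while the sampling cell fires. The pattern theta, the rates, and the sampling schedule are illustrative assumptions rather than the paper's own laws.

    # Outstar-style Pavlovian pattern learning (illustrative parameters only).
    import numpy as np

    theta = np.array([0.5, 0.3, 0.2])   # relative spatial pattern to be learned (the UCS pattern)
    z = np.zeros(3)                     # synaptic weights from the sampling (CS) cell
    dt, rate = 0.01, 2.0

    for step in range(5000):
        t = step * dt
        cs = 1.0 if (t % 2.0) < 0.5 else 0.0   # the sampling cell fires periodically (paired trials)
        x = cs * theta                          # sampled cells display the pattern while the CS is on
        z += dt * rate * cs * (x - z)           # learning is gated by the sampling signal

    print("learned weights:  ", np.round(z, 3))
    print("relative weights: ", np.round(z / z.sum(), 3))   # approximates the relative pattern theta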

Some developmental and attentional biases in the contrast enhancement and short-term memory of recurrent neural networks
Abstract This paper studies the global dynamics of neurons, or neuron populations, in a recurrent on-center off-surround anatomy undergoing nonlinear shunting interactions. In such an anatomy, a given population excites itself and ...

Neural pattern discrimination
Abstract Some possible neural mechanisms of pattern discrimination are discussed, leading to neural networks which can discriminate any number of essentially arbitrarily complicated space-time patterns and activate cells which can ...

Decisions, patterns, and oscillations in nonlinear competitive systems with applications to Volterra-Lotka systems
Abstract This paper describes new properties of competitive systems which arise in population biology, ecology, psychophysiology, and developmental biology. These properties yield a global method for analyzing the geometric design ...
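As a concrete instance of the competitive systems treated here, the sketch below integrates a small Volterra-Lotka competition model dx_i/dt = x_i (r_i - Σ_j a_ij x_j); the growth rates and competition coefficients are arbitrary illustrative values, not ones taken from the paper.

    # Volterra-Lotka competition: dx_i/dt = x_i * (r_i - sum_j a_ij * x_j), x_i >= 0.
    # Growth rates and competition coefficients are illustrative.
    import numpy as np

    r = np.array([1.0, 0.9, 0.8])          # intrinsic growth rates
    a = np.array([[1.0, 0.6, 0.6],         # a_ii: self-limitation; a_ij: cross-competition
                  [0.7, 1.0, 0.6],
                  [0.7, 0.7, 1.0]])

    x = np.array([0.1, 0.1, 0.1])
    dt = 0.01
    for _ in range(20000):
        x = np.maximum(x + dt * x * (r - a @ x), 0.0)

    print("asymptotic populations:", np.round(x, 3))   # the "decision" reached by the competition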

Do all neural models really look alike?
Abstract Several of the formal approaches that are used to explain psychophysiological phenomena lead to different properties and principles of organization. These approaches include computer, linear, and nonlinear models. The present ...

Embedding fields: Underlying philosophy, mathematics, and applications to psychology, physiology, and anatomy
Abstract This article reviews results on a learning theory that can be derived from simple psychological postulates and given a suggestive neurophysiological, anatomical, and biochemical interpretation. The neural networks described ...

Contour enhancement, short-term memory, and constancies in reverberating neural networks
Abstract A model of the nonlinear dynamics of reverberating on-center off-surround networks of nerve cells, or of cell populations, is analysed. The on-center off-surround anatomy allows patterns to be processed across populations ...
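A minimal simulation of a reverberating on-center off-surround shunting network of this kind is sketched below, using the standard form dx_i/dt = -A x_i + (B - x_i)[I_i + f(x_i)] - x_i Σ_{k≠i} [I_k + f(x_k)]. The constants, the brief input pattern, and the faster-than-linear (then saturating) signal function f are illustrative assumptions; with this choice of f the stored pattern is strongly contrast-enhanced, approaching a single-winner choice.

    # Reverberating shunting on-center off-surround network:
    #   dx_i/dt = -A*x_i + (B - x_i)*(I_i + f(x_i)) - x_i * sum_{k != i} (I_k + f(x_k))
    # Constants, inputs, and the signal function f are illustrative assumptions.
    import numpy as np

    A, B, dt = 1.0, 1.0, 0.001
    f = lambda x: 4.0 * x**2 / (0.25 + x**2)          # faster-than-linear at low activity, then saturating

    I_pattern = np.array([0.9, 0.6, 0.4, 0.3, 0.1])   # input pattern, presented only briefly
    x = np.zeros(5)

    for step in range(20000):
        I = I_pattern if step < 2000 else np.zeros(5) # inputs are switched off after 2 time units
        s = I + f(x)
        dx = -A * x + (B - x) * s - x * (s.sum() - s)
        x = np.clip(x + dt * dx, 0.0, B)

    print("pattern stored in STM after input offset:", np.round(x, 3))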

Absolute stability of global pattern formation and parallel memory storage by competitive neural networks
Abstract The process whereby input patterns are transformed and stored by competitive cellular networks is considered. This process arises in such diverse subjects as the short-term storage of visual or language patterns by neural ...
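The networks covered by this global stability result can be written in the Cohen-Grossberg form dx_i/dt = a_i(x_i)[b_i(x_i) - Σ_j c_ij d_j(x_j)] with symmetric coefficients c_ij = c_ji and monotone signals d_j; symmetric additive networks of the Hopfield type are a special case. The sketch below simulates one such system and prints a Liapunov function of the standard Cohen-Grossberg type along the trajectory. The particular choices a_i = 1, b_i(x) = I_i - x, d_j(x) = tanh(x), and the random coefficients are illustrative assumptions.

    # Cohen-Grossberg form: dx_i/dt = a_i(x_i) * ( b_i(x_i) - sum_j c_ij * d_j(x_j) ), c_ij = c_ji.
    # Illustrative choices: a_i = 1, b_i(x) = I_i - x, d_j(x) = tanh(x); C and I are random.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 4
    M = rng.normal(size=(n, n))
    C = (M + M.T) / 2                  # symmetric interactions, as the theorem requires
    I = rng.normal(size=n)             # constant inputs

    def V(x):
        # V = -sum_i int_0^{x_i} b_i(s) d_i'(s) ds + (1/2) sum_{j,k} c_jk d_j(x_j) d_k(x_k);
        # for these choices the integral has the closed form I_i*tanh(x) - x*tanh(x) + log(cosh(x)).
        integral = I * np.tanh(x) - x * np.tanh(x) + np.log(np.cosh(x))
        return -integral.sum() + 0.5 * np.tanh(x) @ C @ np.tanh(x)

    x, dt = rng.normal(size=n), 0.001
    for step in range(50001):
        if step % 10000 == 0:
            print(f"t = {step * dt:4.0f}   V = {V(x):.6f}")   # V is nonincreasing along the trajectory
        x = x + dt * (I - x - C @ np.tanh(x))

    print("stored equilibrium pattern:", np.round(x, 3))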

Schizophrenia: Possible dependence of associational span, bowing, and primacy vs. recency on spiking threshold
Abstract The hypothesis has been advanced that certain schizophrenic patients are in a continual state of overarousal, leading to poor attention, and perhaps to schizophrenic punning (Kornetsky and Eliasson, 1969; Maher, 1968). ...

A neural theory of punishment and avoidance, I: Qualitative theory
Abstract Neural networks are derived from psychological postulates about punishment and avoidance. The classical notion that drive reduction is reinforcing is replaced by a precise physiological alternative akin to Miller's "Go" ...

A neural theory of punishment and avoidance, II: Quantitative theory
Abstract Quantitative neural networks are derived from psychological postulates about punishment and avoidance. The classical notion that drive reduction is reinforcing is replaced by a precise physiological alternative akin to ...

Cortical synchronization and perceptual framing
Abstract How does the brain group together different parts of an object into a coherent visual object representation? Different parts of an object may be processed by the brain at different rates and may thus become desynchronized. ...

Nonlinear difference-differential equations in prediction and learning theory
Abstract Introduction.-This note introduces some nonlinear difference-differential equations which can be interpreted as a learning theory or, alternatively, as a prediction theory whose goal is to discuss the prediction of ...

Adaptive pattern classification and universal recoding: I Parallel development and coding of neural feature detectors
Abstract This paper analyses a model for the parallel development and adult coding of neural feature detectors. The model was introduced in Grossberg (1976). We show how experience can retune feature detectors to respond to a ...
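The sketch below is a generic competitive-learning (instar-style) illustration of the feature-detector retuning described here: the category whose weight vector best matches the current input is chosen, and only its weights move toward that input. The normalization, learning rate, and data are assumptions made for illustration and do not reproduce the paper's own equations.

    # Competitive (instar-style) tuning of feature detectors: the winning category's
    # weight vector is nudged toward the normalized input pattern it coded.
    # Learning rate, normalization, and data are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n_features, n_categories, lr = 8, 3, 0.1

    W = rng.random((n_categories, n_features))
    W /= W.sum(axis=1, keepdims=True)         # each detector starts as a normalized random pattern

    def present(I):
        I = I / I.sum()                       # reduce the input to a relative (spatial) pattern
        j = int(np.argmax(W @ I))             # competition picks the best-matching detector
        W[j] += lr * (I - W[j])               # only the winner's weights are retuned toward the input
        return j

    # Three noisy input "environments"; repeated experience retunes the detectors.
    prototypes = rng.random((3, n_features))
    for _ in range(2000):
        p = prototypes[rng.integers(3)]
        present(np.clip(p + 0.05 * rng.normal(size=n_features), 1e-6, None))

    print(np.round(W, 2))   # each row drifts toward the normalized prototype(s) its detector wins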

Fuzzy ART choice functions
Abstract Adaptive Resonance Theory (ART) models are real-time neural networks for category learning, pattern recognition, and prediction. Unsupervised fuzzy ART and supervised fuzzy ARTMAP networks synthesize fuzzy logic and ART by ...
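The core computations of fuzzy ART category choice and matching can be sketched as below, using the standard choice function T_j = |I ∧ w_j| / (α + |w_j|), the vigilance test |I ∧ w_j| / |I| ≥ ρ, and fast learning w_j ← β(I ∧ w_j) + (1 - β) w_j with complement coding. The parameter values and the toy inputs are illustrative.

    # Minimal fuzzy ART sketch: complement coding, choice function, vigilance match, fast learning.
    # Parameter values (alpha, beta, rho) and inputs are illustrative.
    import numpy as np

    alpha, beta, rho = 0.001, 1.0, 0.75     # choice parameter, learning rate, vigilance
    weights = []                            # one weight vector per committed category

    def complement_code(a):
        return np.concatenate([a, 1.0 - a])

    def train(a):
        I = complement_code(np.asarray(a, dtype=float))
        # Search committed categories in order of the choice function T_j = |I ^ w_j| / (alpha + |w_j|).
        order = sorted(range(len(weights)),
                       key=lambda j: -np.minimum(I, weights[j]).sum() / (alpha + weights[j].sum()))
        for j in order:
            match = np.minimum(I, weights[j]).sum() / I.sum()
            if match >= rho:                # vigilance test passed: resonate and learn
                weights[j] = beta * np.minimum(I, weights[j]) + (1 - beta) * weights[j]
                return j
        weights.append(I.copy())            # otherwise commit a new category
        return len(weights) - 1

    for a in [[0.1, 0.2], [0.15, 0.25], [0.8, 0.9], [0.82, 0.88]]:
        print("input", a, "-> category", train(a))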

The ART of adaptive pattern recognition by a self organizing neural network
Abstract The adaptive resonance theory (ART) suggests a solution to the stability-plasticity dilemma facing designers of learning systems, namely how to design a learning system that will remain plastic, or adaptive, in response to ...

A massively parallel architecture for a self organizing neural pattern recognition machine
Abstract A neural network architecture for the learning of recognition categories is derived. Real-time network dynamics are completely characterized through mathematical analysis and computer simulations. The architecture ...

Computing with neural networks: The role of symmetry
Abstract Hopfield and Tank refer to "a new concept for understanding the dynamics of neural circuitry" using the equation (in a slightly different ...

Absolutely stable learning of recognition codes by a self-organizing neural network
Abstract A neural network which self-organizes and self-stabilizes its recognition codes in response to arbitrary orderings of arbitrarily many and arbitrarily complex binary input patterns is here outlined. Top-down attentional ...

Classical and instrumental learning by neural networks
Abstract This article reviews results chosen from the theory of embedding fields. Embedding field theory discusses mechanisms of pattern discrimination and learning in a psychophysiological setting. It is derived from psychological ...

Birth of a learning law

Linking mind to brain: The mathematics of biological intelligence
Abstract How our brains give rise to our minds is one of the most intriguing questions in all of science. We are now living in a particularly interesting time to consider this question. This is true because, during the last decade, ...


Software


SMART network
Description This entry contains the software, implemented in the KDE Integrated NeuroSimulation Software (KInNeSS), that simulates the Synchronous Matching Adaptive Resonance Theory (SMART) model. SMART was first described in Grossberg and Versace ...