On learning and energy-entropy dependence in recurrent and nonrecurrent signed networks

Author(s): Grossberg, S.

Year: 1969

Citation: Journal of Statistical Physics, 1, 319-350

Abstract: Learning of patterns by neural networks obeying general rules of sensory transduction and of converting membrane potentials to spiking frequencies is considered. Any finite number of cells 𝒜 can sample a pattern playing on any finite number of cells ℬ without causing irrevocable sampling bias if 𝒜 = ℬ or 𝒜 ∩ ℬ = ∅. Total energy transfer from inputs of 𝒜 to outputs of ℬ depends on the entropy of the input distribution. Pattern completion on recall trials can occur without destroying perfect memory even if 𝒜 = ℬ by choosing the signal thresholds sufficiently large. The mathematical results are global limit and oscillation theorems for a class of nonlinear functional-differential systems.
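
For orientation, a minimal sketch of the kind of nonlinear functional-differential system studied in Grossberg's work of this period is given below. The specific notation (x_i for cell potentials, z_{ji} for synaptic memory traces, \Gamma for the signal threshold, \tau for the transmission delay) is an assumption based on related papers, not quoted from this record:

    \dot{x}_i(t) = -\alpha x_i(t) + \beta \sum_{k=1}^{n} \big[ x_k(t - \tau) - \Gamma \big]^{+} z_{ki}(t) + I_i(t)

    \dot{z}_{ji}(t) = -\gamma z_{ji}(t) + \delta \big[ x_j(t - \tau) - \Gamma \big]^{+} x_i(t)

Here [w]^{+} = max(w, 0), so raising \Gamma suppresses weak signals; this is the sense in which "choosing the signal thresholds sufficiently large" can permit pattern completion on recall without corrupting stored memory.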

Topics: Mathematical Foundations of Neural Networks, Models: Other
