Modeling brain function : the world of attractor neural networks / Daniel J. Amit.

Bibliographic Details
Author / Creator: Amit, D. J., 1938-2007
Imprint: Cambridge [England] ; New York : Cambridge University Press, 1989.
Description: xvii, 504 p. : ill. ; 24 cm.
Language: English
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/1056039
ISBN: 0521361001
Notes: Includes bibliographies and index.
Table of Contents:
  • Preface
  • 1. Introduction
  • 1.1. Philosophy and Methodology
  • 1.1.1. Reduction to physics and physics modeling analogues
  • 1.1.2. Methods for mind and matter
  • 1.1.3. Some methodological questions
  • 1.2. Neurophysiological Background
  • 1.2.1. Building blocks for neural networks
  • 1.2.2. Dynamics of neurons and synapses
  • 1.2.3. More complicated building blocks
  • 1.2.4. From biology to information processing
  • 1.3. Modeling Simplified Neurophysiological Information
  • 1.3.1. Neuron as perceptron and formal neuron
  • 1.3.2. Digression on formal neurons and perceptrons
  • 1.3.3. Beyond the basic perceptron
  • 1.3.4. Building blocks for attractor neural networks (ANN)
  • 1.4. The Network and the World
  • 1.4.1. Neural states, network states and state space
  • 1.4.2. Digression on the relation between measures
  • 1.4.3. Representations on network states
  • 1.4.4. Thinking about output mechanism
  • 1.5. Spontaneous Computation vs. Cognitive Processing
  • 1.5.1. Input systems, transducers, transformers
  • 1.5.2. ANN's as computing elements - a position
  • 1.5.3. ANN's and computation of mental representations
  • Bibliography
  • 2. The Basic Attractor Neural Network
  • 2.1. Networks of Analog, Discrete, Noisy Neurons
  • 2.1.1. Analog neurons, spike rates, two-state neural models
  • 2.1.2. Binary representation of single neuron activity
  • 2.1.3. Noisy dynamics of discrete two-state neurons
  • 2.2. Dynamical Evolution of Network States
  • 2.2.1. Network dynamics of discrete neurons
  • 2.2.2. Synchronous dynamics
  • 2.2.3. Asynchronous dynamics
  • 2.2.4. Sample trajectories and lessons about dynamics
  • 2.2.5. Types of trajectories and possible interpretation - a summary
  • 2.3. On Attractors
  • 2.3.1. The landscape metaphor
  • 2.3.2. Perception, recognition and recall
  • 2.3.3. Perception errors due to spurious states - possible role of noise
  • 2.3.4. Psychiatric speculations and images
  • 2.3.5. The role of noise and simulated annealing
  • 2.3.6. Frustration and diversity of attractors
  • Bibliography
  • 3. General Ideas Concerning Dynamics
  • 3.1. The Stochastic Process, Ergodicity and Beyond
  • 3.1.1. Stochastic equation and apparent ergodicity
  • 3.1.2. Two ways of evading ergodicity
  • 3.2. Cooperativity as an Emergent Property in a Magnetic Analog
  • 3.2.1. Ising model for a magnet - spin, field and interaction
  • 3.2.2. Dynamics and equilibrium properties
  • 3.2.3. Noiseless, short range ferromagnet
  • 3.2.4. Fully connected Ising model: real non-ergodicity
  • 3.3. From Dynamics to Landscapes - The Free Energy
  • 3.3.1. Energy as Lyapunov function for noiseless dynamics
  • 3.3.2. Parametrized attractor distributions with noise
  • 3.3.3. Free-energy landscapes - a noisy Lyapunov function
  • 3.3.4. Free-energy minima, non-ergodicity, order-parameters
  • 3.4. Free-Energy of Fully Connected Ising Model
  • 3.4.1. From minimization equation to the free-energy
  • 3.4.2. The analytic way to the free-energy
  • 3.4.3. Attractors at metastable states
  • 3.5. Synaptic Symmetry and Landscapes
  • 3.5.1. Noiseless asynchronous dynamics - energy
  • 3.5.2. Detailed balance for noisy asynchronous dynamics
  • 3.5.3. Noiseless synchronous dynamics - Lyapunov function
  • 3.5.4. Detailed balance for noisy synchronous dynamics
  • 3.6. Appendix: Technical Details for Stochastic Equations
  • 3.6.1. The maximal eigen-value and the associated vector
  • 3.6.2. Differential equation for mean magnetization
  • 3.6.3. The minimization of the dynamical free-energy
  • 3.6.4. Legendre transform for the free-energy
  • Bibliography
  • 4. Symmetric Neural Networks at Low Memory Loading
  • 4.1. Motivations and List of Results
  • 4.1.1. Simplifying assumptions and specific questions
  • 4.1.2. Specific answers for low loading of random memories
  • 4.1.3. Properties of the noiseless network
  • 4.1.4. Properties of the network in the presence of fast noise
  • 4.2. Explicit Construction of Synaptic Efficacies
  • 4.2.1. Choice of memorized patterns
  • 4.2.2. Storage prescription - "Hebb's rule"
  • 4.2.3. A decorrelating (but nonlocal) storage prescription
  • 4.3. Stability Considerations at Low Storage
  • 4.3.1. Signal to noise analysis - memories, spurious states
  • 4.3.2. Basins of attraction and retrieval times
  • 4.3.3. Neurophysiological interpretation
  • 4.4. Mean Field Approach to Attractors
  • 4.4.1. Self-consistency and equations for attractors
  • 4.4.2. Self-averaging and the final equations
  • 4.4.3. Free-energy, extrema, stability
  • 4.4.4. Mean-field and free-energy - synchronous dynamics
  • 4.5. Retrieval States, Spurious States - Noiseless
  • 4.5.1. Perfect retrieval of memorized patterns
  • 4.5.2. Noiseless, symmetric spurious memories
  • 4.5.3. Non-symmetric spurious states
  • 4.5.4. Are spurious states a free lunch?
  • 4.6. Role of Noise at Low Loading
  • 4.6.1. Ergodicity at high noise levels - asynchronous
  • 4.6.2. Just below the critical noise level
  • 4.6.3. Positive role of noise and retrieval with no fixed points
  • 4.7. Appendix: Technical Details for Low Storage
  • 4.7.1. Free-energy at finite p - asynchronous
  • 4.7.2. Free-energy and solutions - synchronous dynamics
  • 4.7.3. Bound on magnitude of overlaps
  • 4.7.4. Asymmetric spurious solution
  • Bibliography
  • 5. Storage and Retrieval of Temporal Sequences
  • 5.1. Motivations: Introspective, Biological, Philosophical
  • 5.1.1. The introspective motivation
  • 5.1.2. The biological motivation
  • 5.1.3. Philosophical motivations
  • 5.2. Storing and Retrieving Temporal Sequences
  • 5.2.1. Functional asymmetry
  • 5.2.2. Early ideas for instant temporal sequences
  • 5.3. Temporal Sequences by Delayed Synapses
  • 5.3.1. A simple generalization and its motivation
  • 5.3.2. Dynamics with fast and slow synapses
  • 5.3.3. Simulation examples of sequence recall
  • 5.3.4. Adiabatically varying energy landscapes
  • 5.3.5. Bi-phasic oscillations and CPG's
  • 5.4. Tentative Steps into Abstract Computation
  • 5.4.1. The attempt to reintroduce structured operations
  • 5.4.2. ANN counting chimes
  • 5.4.3. Counting network - an exercise in connectionist programming
  • 5.4.4. The network
  • 5.4.5. Its dynamics
  • 5.4.6. Simulations
  • 5.4.7. Reflections on associated cognitive psychology
  • 5.5. Sequences Without Synaptic Delays
  • 5.5.1. Basic oscillator - origin of cognitive time scale
  • 5.5.2. Behavior in the absence of noise
  • 5.5.3. The role of noise
  • 5.5.4. Synaptic structure and underlying dynamics
  • 5.5.5. Network storing sequence with several patterns
  • 5.6. Appendix: Elaborate Temporal Sequences
  • 5.6.1. Temporal sequences by time averaged synaptic inputs
  • 5.6.2. Temporal sequences without errors
  • Bibliography
  • 6. Storage Capacity of ANN's
  • 6.1. Motivation and General Considerations
  • 6.1.1. Different measures of storage capacity
  • 6.1.2. Storage capacity of human brains
  • 6.1.3. Intrinsic interest in high storage
  • 6.1.4. List of results
  • 6.2. Statistical Estimates of Storage
  • 6.2.1. Statistical signal to noise analysis
  • 6.2.2. Absolute informational bounds on storage capacity
  • 6.2.3. Coupling (synaptic efficacies) for optimal storage
  • 6.3. Theory Near Memory Saturation
  • 6.3.1. Mean-field equations with replica symmetry
  • 6.3.2. Retrieval in the absence of fast noise
  • 6.3.3. Analysis of the T = 0 equations
  • 6.4. Memory Saturation with Noise and Fields
  • 6.4.1. A tour in the T-α phase diagram
  • 6.4.2. Effect of external fields - thresholds and PSP's
  • 6.4.3. Fields coupled to several patterns
  • 6.4.4. Some technical details related to phase diagrams
  • 6.5. Balance Sheet for Standard ANN
  • 6.5.1. Limiting framework and analytic consequences
  • 6.5.2. Finite-size effects and basins of attraction: simulations
  • 6.6. Beyond the Memory Blackout Catastrophe
  • 6.6.1. Bounded synapses and palimpsest memory
  • 6.6.2. The 7 ± 2 rule and palimpsest memories
  • 6.7. Appendix: Replica Symmetric Theory
  • 6.7.1. The replica method
  • 6.7.2. The free-energy and the mean-field equations
  • 6.7.3. Marginal storage and palimpsests
  • Bibliography
  • 7. Robustness - Getting Closer to Biology
  • 7.1. Synaptic Noise and Synaptic Dilution
  • 7.1.1. Two meanings of robustness
  • 7.1.2. Noise in synaptic efficacies
  • 7.1.3. Random symmetric dilution of synapses
  • 7.2. Non-Linear Synapses and Limited Analog Depth
  • 7.2.1. Place and role of non-linear synapses
  • 7.2.2. Properties of networks with clipped synapses
  • 7.2.3. Non-linear storage and the noisy equivalent
  • 7.2.4. Clipping at low storage level
  • 7.3. Random vs. Functional Synaptic Asymmetry
  • 7.3.1. Random asymmetry and performance quality
  • 7.3.2. Asymmetry, noise and spin-glass suppression
  • 7.3.3. Neuronal specificity of synapses - Dale's law
  • 7.3.4. Extreme asymmetric dilution
  • 7.3.5. Functional asymmetry
  • 7.4. Effective Cortical Cycle Times
  • 7.4.1. Slow bursts and relative refractory period
  • 7.4.2. Neuronal memory and expanded scenario
  • 7.4.3. Simplified scenario for relative refractory period
  • 7.5. Appendix: Technical Details
  • 7.5.1. Digression - the mean-field equations
  • 7.5.2. Dilution requirement
  • Bibliography
  • 8. Memory Data Structures
  • 8.1. Biological and Computational Motivation
  • 8.1.1. Low mean activity level and background-foreground asymmetry
  • 8.1.2. Hierarchies for biology and for computation
  • 8.2. Local Treatment of Low Activity Patterns
  • 8.2.1. Demise of naive standard model
  • 8.2.2. Modified ANN and a plague of spurious states
  • 8.2.3. Constrained dynamics - monitoring thresholds
  • 8.2.4. Properties of the constrained biased network
  • 8.2.5. Quantity of information in an ANN with low activity
  • 8.2.6. More effective storage of low activity (sparse) patterns
  • 8.3. Hierarchical Data Structures in a Single Network
  • 8.3.1. Early proposals
  • 8.3.2. Explicit construction of hierarchy in a single ANN
  • 8.3.3. Properties of hierarchy in a single network
  • 8.3.4. Prosopagnosia and learning class properties
  • 8.3.5. Multi-ancestry with many generations
  • 8.4. Hierarchies in Multi-ANN: Generalization First
  • 8.4.1. Organization of the data and the networks
  • 8.4.2. Hierarchical dynamics
  • 8.4.3. Hierarchy for image vector quantization
  • 8.5. Appendix: Technical Details for Biased Patterns
  • 8.5.1. Noise estimates for biased patterns
  • 8.5.2. Mean-field equations in noiseless biased network
  • 8.5.3. Retrieval entropy in biased network
  • 8.5.4. Mean-square noise in low activity network
  • Bibliography
  • 9. Learning
  • 9.1. The Context of Learning
  • 9.1.1. General comments and a limited scope
  • 9.1.2. Modes, time scales and other constraints
  • 9.1.3. The need for learning modes
  • 9.1.4. Results for learning in learning modes
  • 9.2. Learning in Modes
  • 9.2.1. Perceptron learning
  • 9.2.2. ANN learning by perceptron algorithm
  • 9.2.3. Local learning of the Kohonen synaptic matrix
  • 9.3. Natural Learning - Double Dynamics
  • 9.3.1. General features
  • 9.3.2. Learning in a network of physiological neurons
  • 9.3.3. Learning to form associations
  • 9.3.4. Memory generation and maintenance
  • 9.4. Technical Details in Learning Models
  • 9.4.1. Local iterative construction of projector matrix
  • 9.4.2. The free energy and the correlation function
  • Bibliography
  • 10. Hardware Implementations of Neural Networks
  • 10.1. Situating Artificial Neural Networks
  • 10.1.1. The role of hardware implementations
  • 10.1.2. Motivations for different designs
  • 10.2. The VLSI Neural Network
  • 10.2.1. High density high speed integrated chip
  • 10.2.2. Smaller, more flexible electronic ANN's
  • 10.3. The Electro-Optical ANN
  • 10.4. Shift Register (CCD) Implementation
  • Bibliography
  • Glossary
  • Index