Neural codes and distributed representations: foundations of neural computation
| Field | Value |
|---|---|
| Imprint | Cambridge, Mass. : MIT Press, c1999. |
| Description | xxiii, 345 p. : ill. ; 23 cm. |
| Language | English |
| Series | Computational neuroscience |
| Subject | |
| Format | Print Book |
| URL for this record | http://pi.lib.uchicago.edu/1001/cat/bib/3966358 |
Table of Contents:
- Introduction
- Neural Coding
- Neuronal Response Variability
- The Nature of the Neural Code
- Population Coding
- Temporal Sequences
- Sources
- References
- Deciphering the Brain's Codes
- 1. Introduction
- 2. Behavioral Analysis
- 3. Successive Stages of Signal Processing
- 3.1. The Top-Down Approach in the Owl.
- 3.2. The Bottom-Up Approach in the Electric Fish.
- 4. The Output Neurons
- 5. Stimulus Selectivities and Neural Codes
- 6. Similarities in Algorithms
- 7. Concluding Remarks
- Acknowledgments
- References
- A Neural Network for Coding of Trajectories by Time Series of Neuronal Population Vectors
- 1. Introduction
- 2. Model and Learning Procedure
- 3. Results of Simulations
- 4. Discussion
- Acknowledgments
- References
- Self-Organization of Firing Activities in Monkey's Motor Cortex: Trajectory Computation from Spike...
- 1. Introduction
- 2. Spike Signal and Feature Extraction
- 3. The Computation Model
- 4. Trajectory Computation from Motor Cortical Discharge Rates
- 4.1. Using Data from Spiral Tasks to Train the SOFM
- 4.2. Using Data from Spiral and Center Out Tasks to Train the SOFM
- 4.3. Average Testing Result Using the Leave-k-Out Method
- 4.4. Trajectory Computation by the Population Vector Algorithm
- 5. Discussion
- Acknowledgments
- References
- Theoretical Considerations for the Analysis of Population Coding in Motor Cortex
- 1. Introduction
- 2. Single Unit Tuning Curves
- 3. Population Vectors
- 4. Coordinate-Free Representations
- 5. Conclusion
- Acknowledgments
- References
- Statistically Efficient Estimation Using Population Coding
- 1. Introduction
- 2. Model of Neuronal Responses
- 3. Classical Decoding Methods
- 3.1. Maximum Likelihood (ML)
- 3.2. Optimum Linear Estimator (OLE)
- 3.3. Center of Mass (COM)
- 3.4. Complex Estimator (COMP)
- 4. Recurrent Networks
- 4.1. Linear Network
- 4.2. Nonlinear Network
- 5. Simulation Results
- 6. Analysis
- 6.1. Notation
- 6.2. Linearization
- 6.3. Characterizing the Transformation
- 6.4. Properties of the Network Estimate
- 6.5. Nonoptimal Cases
- 6.5.1. Nonequal Variance
- 6.5.2. Correlations
- 6.5.3. Large Noise
- 6.5.4. Nongaussian Distributions
- 6.5.5. Different Input and Output Functions
- 6.6. Relation to Linear ML Estimator
- 7. Discussion
- Acknowledgments
- References
- Parameter Extraction from Population Codes: A Critical Assessment
- 1. Introduction
- 2. Efficiency of CG Estimation Is Low for Sharply Tuned Sensors Perturbed by Background Noise
- 3. Efficiency of CG Estimation Is High for Poisson Noise or Broadly Tuned Sensors
- 3.1. Poisson Noise
- 3.2. Broadly Tuned Sensors
- 4. Sensor Position Irregularities: Another Noise Source for Center-of-Gravity Estimation
- 5. System Nonlinearities: Consequences for the CG Estimate
- 6. Conclusions
- Appendix: Proof of Equation 4.5
- Acknowledgments
- References
- Energy Efficient Neural Codes
- 1. Introduction
- 2. Case 1: Binary Neurons
- 2.1. Representational Capacity.
- 2.2. Energy Expenditure.
- 2.3. Maximizing
- 3. Case 2: Analog Neurons
- 3.1. Representational Capacity.
- 3.2. Energy Expenditure.
- 3.3. Maximizing
- Summary
- Appendix
- Acknowledgments
- References
- Seeing Beyond the Nyquist Limit
- 1. Introduction
- 2. The Receptor Array
- 3. Stimulus Reconstruction
- 4. Examples
- 5. What's Going On?
- 6. Is Phase Preserved in Super-Nyquist Frequencies?
- 7. Conclusions
- Acknowledgments
- References
- A Model of Spatial Map Formation in the Hippocampus of the Rat
- 1. Mathematical Results
- Acknowledgments
- References
- Probabilistic Interpretation of Population Codes
- 1. Introduction
- 2. Population Code Interpretations
- 2.1. The Encoding-Decoding Framework.
- 2.2. The Poisson Model.
- 2.3. The KDE Model.
- 3. The Extended Poisson Model
- 4. Comparing the Models
- 4.1. Uncertainty in Target Location.
- 4.2. Multiple Locations.
- 4.3. Uncertainty in Object Presence.
- 4.4. Noise Robustness.
- 5. Discussion
- Acknowledgments
- References
- Cortical Cells Should Fire Regularly, But Do Not
- Acknowledgments
- References
- Role of Temporal Integration and Fluctuation Detection in the Highly Irregular Firing of a Leaky ...
- 1. Introduction
- 2. Partial Reset and the Control of the Firing Irregularity
- 3. Equivalence Between Partial Reset and Time-Varying Threshold
- 4. Determinants of the Firing Time
- 5. What Do Reverse Correlation Graphs Tell Us?
- 6. Proving Coincidence Detection
- 7. Temporally Clustered Firing and Neuronal Gain
- 8. Summary
- Appendix A. Equivalence Between a Model with Partial Reset and a Model with Time-Dependent Threshold...
- Appendix B. Decay Time Constant for Fluctuations
- Acknowledgments
- References
- Physiological Gain Leads to High ISI Variability in a Simple Model of a Cortical Regular Spiking ...
- 1. Introduction
- 2. The High-Gain Model
- 3. Simulation Results
- 4. An Intuitive Picture
- 5. Discussion
- Acknowledgments
- References
- Coding of Time-Varying Signals in Spike Trains of Integrate-and-Fire Neurons with Random Threshold
- 1. Introduction
- 2. Linear Estimation of Time-Varying Signals from Neuronal Spike Trains
- 3. A Simplified Model of Motion Encoding in H1 Neurons
- 4. Results
- 5. Discussion
- Acknowledgments
- References
- Temporal Precision of Spike Trains in Extrastriate Cortex of the Behaving Macaque Monkey
- 1. Introduction
- 2. Methods
- 2.1. Experimental Procedures.
- 2.2. Data Analysis.
- 3. Results
- 3.1. Precision and Reliability.
- 3.2. Frequency Profile.
- 3.3. Response to Coherent Motion.
- 4. Discussion
- Acknowledgments
- References
- Conversion of Temporal Correlations Between Stimuli to Spatial Correlations Between Attractors
- 1. Introduction
- 1.1. Temporal to Spatial Correlations in Monkey Cortex.
- 1.2. Modeling Correlation Conversion.
- 2. The Model with ±1 Neurons
- 3. ANN with Discrete 0,1 Neurons
- 4. Learning
- 5. Experimental Predictions and Some Speculations
- Acknowledgments
- References
- Neural Network Model of the Cerebellum: Temporal Discrimination and the Timing of Motor Responses...
- 1. Introduction
- 2. Structure of the Model
- 2.1. Cerebellar Circuitry.
- 2.2. Classic Cerebellar Theories.
- 2.3. Hypothesis.
- 2.4. Neural Network.
- 3. Simulations
- 3.1. Timing.
- 3.2. Ability to Store Multiple Intervals.
- 3.3. Sensitivity to Noise.
- 3.4. Effects of the MF Go Connection on Timing and Sensitivity to Noise.
- 4. Discussion
- Acknowledgments
- References
- Gamma Oscillation Model Predicts Intensity Coding by Phase Rather than Frequency
- 1. Introduction
- 2. Methods
- 3. Results
- 4. Discussion
- Acknowledgments
- References
- Effects of Input Synchrony on the Firing Rate of a Three-Conductance Cortical Neuron Model
- 1. Introduction
- 2. Methods
- 3. Results
- 3.1. Steady-State Activity.
- 3.2. Time-Varying Inputs.
- 3.3. Cross-Correlations.
- 4. Discussion
- Acknowledgments
- References
- NMDA-Based Pattern Discrimination in a Modeled Cortical Neuron
- 1. Introduction
- 2. The Biophysical Model
- 3. A Basis for Nonlinear Pattern Discrimination
- 4. Conclusions
- Acknowledgments
- References
- The Impact of Parallel Fiber Background Activity on the Cable Properties of Cerebellar Purkinje Cells
- 1. Introduction
- 2. Model
- 3. Results
- 4. Discussion
- Acknowledgments
- References
- Index