Lavoisier S.A.S.
14 rue de Provigny
94236 Cachan cedex
FRANCE

Opening hours: 08:30-12:30 / 13:30-17:30
Tel.: +33 (0)1 47 40 67 00
Fax: +33 (0)1 47 40 67 02


Canonical URL: www.lavoisier.fr/livre/autre/models-of-neural-networks-i-2nd-updated-ed-1995/domany/descriptif_2309002
Short URL or permalink: www.lavoisier.fr/livre/notice.asp?ouvrage=2309002

Models of Neural Networks I (2nd ed., 1995; softcover reprint of the original 2nd ed.), Physics of Neural Networks series

Language: English

Editors: Eytan Domany, J. Leo van Hemmen, Klaus Schulten

"One of the great intellectual challenges for the next few decades is the question of brain organization. What is the basic mechanism for storage of memory? What are the processes that serve as the interphase between the basically chemical processes of the body and the very specific and nonstatistical operations in the brain? Above all, how is concept formation achieved in the human brain? I wonder whether the spirit of the physics that will be involved in these studies will not be akin to that which moved the founders of the 'rational foundation of thermodynamics'." (C. N. Yang)

The human brain is said to have roughly 10^10 neurons connected through about 10^14 synapses. Each neuron is itself a complex device which compares and integrates incoming electrical signals and relays a nonlinear response to other neurons. The brain certainly exceeds in complexity any system which physicists have studied in the past. Nevertheless, there do exist many analogies of the brain to simpler physical systems. We have witnessed during the last decade some surprising contributions of physics to the study of the brain. The most significant parallel between biological brains and many physical systems is that both are made of many tightly interacting components.
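The "formal neurons" studied throughout this book make that picture concrete. As a minimal sketch (assuming a standard binary ±1 unit with a sign-function response; the names, sizes, and random values below are illustrative, not taken from the book), a single unit that integrates weighted inputs and relays a nonlinear response can be written as:

```python
import numpy as np

# Minimal sketch of a binary "formal neuron": integrate weighted inputs,
# then relay a nonlinear (sign) response. Illustrative only; the weights
# and inputs below are random stand-ins, not anything from the book.

def neuron_update(weights, inputs, threshold=0.0):
    """Compare the integrated input against a threshold; respond with +/-1."""
    h = np.dot(weights, inputs) - threshold  # integrated ("local") field
    return 1 if h >= 0 else -1

rng = np.random.default_rng(0)
w = rng.normal(size=100)           # couplings from 100 presynaptic units
s = rng.choice([-1, 1], size=100)  # incoming binary activities
print(neuron_update(w, s))         # -> +1 or -1
```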
Contents

1. Collective Phenomena in Neural Networks
   1.1 Introduction and Overview
      1.1.1 Collective Phenomena in a Historical Perspective
      1.1.2 The Role of Dynamics
      1.1.3 Universality, Locality, and Learning
      1.1.4 Outline of this Paper
   1.2 Prerequisites
      1.2.1 Large Deviations: A Case Study
      1.2.2 Large Deviations: General Principles
      1.2.3 A Mathematical Detour
      1.2.4 Sublattice Magnetizations
      1.2.5 The Replica Method
   1.3 The Hopfield Model
      1.3.1 The Hopfield Model with Finitely Many Patterns
      1.3.2 Stability
      1.3.3 The Hopfield Model with Extensively Many (Weighted) Patterns
      1.3.4 The Phase Diagram of the Hopfield Model
      1.3.5 Discussion
      1.3.6 Parallel Dynamics (the Little Model)
      1.3.7 Continuous-Time Dynamics and Graded-Response Neurons
   1.4 Nonlinear Neural Networks
      1.4.1 Arbitrary Synaptic Kernel and Finitely Many Patterns
      1.4.2 Spectral Theory
      1.4.3 Extensively Many Patterns
   1.5 Learning, Unlearning, and Forgetting
      1.5.1 Introduction
      1.5.2 The Pseudoinverse Learning Rule
      1.5.3 The Perceptron Convergence Theorem
      1.5.4 Hebbian Learning
      1.5.5 Intermezzo
      1.5.6 Hebbian Unlearning
      1.5.7 Forgetting
   1.6 Hierarchically Structured Information
      1.6.1 Structured Information, Markov Chains, and Martingales
      1.6.2 Signal-to-Noise-Ratio Analysis
      1.6.3 Equivalence to the Hopfield Model
      1.6.4 Weighted Hierarchies
      1.6.5 Low-Activity Patterns
      1.6.6 Discussion
   1.7 Outlook
   References

2. Information from Structure: A Sketch of Neuroanatomy
   2.1 Development of the Brain
   2.2 Neuroanatomy Related to Information Handling in the Brain
   2.3 The Idea of Electronic Circuitry
   2.4 The Projection from the Compound Eye onto the First Ganglion (Lamina) of the Fly
   2.5 Statistical Wiring
   2.6 Symmetry of Neural Nets
   2.7 The Cerebellum
   2.8 Variations in Size of the Elements
   2.9 The Cerebral Cortex
   2.10 Inborn Knowledge
   References

3. Storage Capacity and Learning in Ising-Spin Neural Networks
   3.1 Introduction
      3.1.1 The Model
      3.1.2 Content-addressable Memory
      3.1.3 The Hopfield Model
      3.1.4 The Spin-glass Analogy
      3.1.5 Finite Temperature
   3.2 Content-addressability: A Dynamics Problem
      3.2.1 Numerical Tests
   3.3 Learning
      3.3.1 Learning Perfect Storage
      3.3.2 Enforcing Content-addressability
      3.3.3 Optimal Learning
      3.3.4 Training with Noise
      3.3.5 Storing Correlated Patterns
   3.4 Discussion
   References

4. Dynamics of Learning
   4.1 Introduction
   4.2 Definition of Supervised Learning
   4.3 Adaline Learning
   4.4 Perceptron Learning
   4.5 Binary Synapses
   4.6 Basins of Attraction
   4.7 Forgetting
   4.8 Outlook
   References

5. Hierarchical Organization of Memory
   5.1 Introduction
   5.2 Models: The Problem
   5.3 A Toy Problem: Patterns with Low Activity
   5.4 Models with Hierarchically Structured Information
   5.5 Extensions
   5.6 The Enhancement of Storage Capacity: Multineuron Interactions
   5.7 Conclusion
   References

6. Asymmetrically Diluted Neural Networks
   6.1 Introduction
   6.2 Solvability and Retrieval Properties
   6.3 Exact Solution with Dynamic Functionals
   6.4 Extensions and Related Work
   Appendix A
   Appendix B
   Appendix C
   References

7. Temporal Association
   7.1 Introduction
   7.2 Fast Synaptic Plasticity
      7.2.1 Synaptic Plasticity in Hopfield-Type Networks
      7.2.2 Sequence Generation by Selection
   7.3 Noise-Driven Sequences of Biased Patterns
   7.4 Stabilizing Sequences by Delays
      7.4.1 Transition Mechanism and Persistence Times
      7.4.2 Analytic Description of the Dynamics
      7.4.3 Extreme Dilution of Synapses
   7.5 Applications: Sequence Recognition, Counting, and the Generation of Complex Sequences
      7.5.1 Sequence Recognition and Counting
      7.5.2 Complex Sequences
   7.6 Hebbian Learning with Delays
   7.7 Epilogue
   References

8. Self-organizing Maps and Adaptive Filters
   8.1 Introduction
   8.2 Self-organizing Maps and Optimal Representation of Data
   8.3 Learning Dynamics in the Vicinity of a Stationary State
   8.4 Relation to Brain Modeling
   8.5 Formation of a "Somatotopic Map"
   8.6 Adaptive Orientation and Spatial Frequency Filters
   8.7 Conclusion
   References

9. Layered Neural Networks
   9.1 Introduction
   9.2 Dynamics of Feed-Forward Networks
      9.2.1 General Overview
      9.2.2 The Model: Its Definition and Solution by Gaussian Transforms
      9.2.3 Generalization for Other Couplings
   9.3 Unsupervised Learning in Layered Networks
      9.3.1 Hebbian Learning and the Development of Orientation-Sensitive Cells and Columns
      9.3.2 Information-Theoretic Principles Guiding the Development of the Perceptual System
      9.3.3 Iterated Hebbian Learning in a Layered Network
   9.4 Supervised Learning in Layered Networks
   9.5 Summary and Discussion
   References

Elizabeth Gardner - An Appreciation
This collection of articles responds to the urgent need for timely and comprehensive reviews in this field. The book opens with an extensive introduction to the ideas used in the subsequent chapters, all of which center on the theme of collective phenomena in neural networks: the dynamics and storage capacity of networks of formal neurons with symmetric or asymmetric couplings, learning algorithms, temporal association, structured data (software), and structured nets (hardware). The style and level of this book make it particularly useful for advanced students and researchers looking for an accessible survey of today's theory of neural networks.
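As a minimal illustration of two of these themes, storage capacity and content-addressable memory, the sketch below stores random patterns in a Hopfield-type network with symmetric Hebbian couplings and retrieves one of them from a corrupted cue under parallel (Little-model) zero-temperature dynamics. The network size, pattern count, and noise level are arbitrary assumptions chosen for the demonstration, not values from the book.

```python
import numpy as np

# Hebbian storage and content-addressable retrieval in a small
# Hopfield-type network, run with parallel (Little-model) updates at
# zero temperature. All sizes below are illustrative assumptions.

rng = np.random.default_rng(1)
N, P = 200, 10                             # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))      # random +/-1 patterns

J = (xi.T @ xi) / N                        # Hebbian couplings J_ij
np.fill_diagonal(J, 0.0)                   # no self-coupling

s = xi[0].astype(float)                    # start from pattern 0 ...
flip = rng.choice(N, size=N // 10, replace=False)
s[flip] *= -1                              # ... with 10% of the bits flipped

for _ in range(10):                        # parallel zero-temperature updates
    s = np.sign(J @ s)
    s[s == 0] = 1.0                        # break ties toward +1

overlap = (s @ xi[0]) / N                  # overlap m = 1 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

With 10 patterns in 200 neurons the load alpha = P/N = 0.05 is well below the classic Hopfield capacity alpha_c of about 0.138, so the corrupted cue typically flows back to the stored pattern.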

Publication date:

355 pages.

15.5x23.5 cm

Available from the publisher (supply lead time: 15 days).

Indicative price: 52.74 €
