

Canonical URL: www.lavoisier.fr/livre/autre/models-of-neural-networks-iii/domany/descriptif_2934796
Short URL or permalink: www.lavoisier.fr/livre/notice.asp?ouvrage=2934796

Models of Neural Networks III: Association, Generalization, and Representation
Softcover reprint of the original 1st ed. 1996. Physics of Neural Networks series.

Language: English

Editors: Eytan Domany, J. Leo van Hemmen, Klaus Schulten

One of the most challenging and fascinating problems of the theory of neural nets is that of asymptotic behavior: how a system behaves as time proceeds. This is of particular relevance to many practical applications. Here we focus on association, generalization, and representation. We turn to the last topic first. The introductory chapter, "Global Analysis of Recurrent Neural Networks," by Andreas Herz presents an in-depth analysis of how to construct a Lyapunov function for various types of dynamics and neural coding. It includes a review of the recent work with John Hopfield on integrate-and-fire neurons with local interactions. The chapter "Receptive Fields and Maps in the Visual Cortex: Models of Ocular Dominance and Orientation Columns," by Ken Miller, explains how the primary visual cortex may asymptotically gain its specific structure through a self-organization process based on Hebbian learning. His argument has since been shown to lend itself readily to generalization.
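To give a concrete sense of the kind of object Herz's chapter constructs, the standard energy function of the Hopfield model (cf. Sect. 1.3.1 in the contents below) can be written, in common textbook notation (the book's own conventions may differ):

\[
H(S) \;=\; -\tfrac{1}{2}\sum_{i \neq j} J_{ij}\, S_i S_j \;+\; \sum_i \theta_i S_i ,
\]

where the S_i = ±1 are binary neurons, the couplings J_{ij} = J_{ji} are symmetric with zero diagonal, and the θ_i are thresholds. Under asynchronous updates S_i ← sgn(Σ_j J_{ij} S_j − θ_i), each update lowers H or leaves it unchanged; since H is bounded below, it is a Lyapunov function and the dynamics must settle into a fixed point.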
Contents:

1. Global Analysis of Recurrent Neural Networks
   1.1 Global Analysis: Why?
   1.2 A Framework for Neural Dynamics
      1.2.1 Description of Single Neurons
      1.2.2 Discrete-Time Dynamics
      1.2.3 Continuous-Time Dynamics
      1.2.4 Hebbian Learning
   1.3 Fixed Points
      1.3.1 Sequential Dynamics: Hopfield Model
      1.3.2 Parallel Dynamics: Little Model
      1.3.3 Continuous Time: Graded-Response Neurons
      1.3.4 Iterated-Map Networks
      1.3.5 Distributed Dynamics
      1.3.6 Network Performance
      1.3.7 Intermezzo: Delayed Graded-Response Neurons
   1.4 Periodic Limit Cycles and Beyond
      1.4.1 Discrete-Time Dynamics
      1.4.2 Continuous-Time Dynamics
   1.5 Synchronization of Action Potentials
      1.5.1 Phase Locking
      1.5.2 Rapid Convergence
   1.6 Conclusions
   References

2. Receptive Fields and Maps in the Visual Cortex: Models of Ocular Dominance and Orientation Columns
   2.1 Introduction
   2.2 Correlation-Based Models
      2.2.1 The Von der Malsburg Model of V1 Development
      2.2.2 Mathematical Formulation
      2.2.3 Semilinear Models
      2.2.4 How Semilinear Models Behave
      2.2.5 Understanding Ocular Dominance and Orientation Columns with Semilinear Models
      2.2.6 Related Semilinear Models
   2.3 The Problem of Map Structure
   2.4 The Computational Significance of Correlation-Based Rules
      2.4.1 Efficient Representation of Information
      2.4.2 Self-Organizing Maps and Associative Memories
   2.5 Open Questions
   References

3. Associative Data Storage and Retrieval in Neural Networks
   3.1 Introduction and Overview
      3.1.1 Memory and Representation
      3.1.2 Retrieval from the Memory
      3.1.3 Fault Tolerance in Addressing
      3.1.4 Various Memory Tasks
      3.1.5 Retrieval Errors
   3.2 Neural Associative Memory Models
      3.2.1 Retrieval Process
      3.2.2 Storage Process
      3.2.3 Distributed Storage
   3.3 Analysis of the Retrieval Process
      3.3.1 Random Pattern Generation
      3.3.2 Site Averaging and Threshold Setting
      3.3.3 Binary Storage Procedure
      3.3.4 Incremental Storage Procedure
   3.4 Information Theory of the Memory Process
      3.4.1 Mean Information Content of Data
      3.4.2 Association Capacity
      3.4.3 Including the Addressing Process
      3.4.4 Asymptotic Memory Capacities
   3.5 Model Performance
      3.5.1 Binary Storage
      3.5.2 Incremental Storage
   3.6 Discussion
      3.6.1 Heteroassociation
      3.6.2 Autoassociation
      3.6.3 Relations to Other Approaches
      3.6.4 Summary
   Appendix 3.1
   Appendix 3.2
   References

4. Inferences Modeled with Neural Networks
   4.1 Introduction
      4.1.1 Useful Definitions
      4.1.2 Proposed Framework
      4.1.3 How Far Can We Go with the Formal-Logic Approach?
   4.2 Model for Cognitive Systems and for Experiences
      4.2.1 Cognitive Systems
      4.2.2 Experience
      4.2.3 From the Hebb Rule to the Postulate?
   4.3 Inductive Inference
      4.3.1 Optimal Inductive Inference
      4.3.2 Unique Inductive Inference
      4.3.3 Practicability of the Postulate
      4.3.4 Biological Example
      4.3.5 Limitation of Inductive Inference in Terms of Complexity
      4.3.6 Summary for Inductive Inference
   4.4 External Memory
      4.4.1 Counting
   4.5 Limited Use of External Memory
      4.5.1 Counting
      4.5.2 On Wittgenstein's Paradox
   4.6 Deductive Inference
      4.6.1 Biological Example
      4.6.2 Mathematical Examples
      4.6.3 Relevant Signal Flow
      4.6.4 Mathematical Examples Revisited
      4.6.5 Further Ansatz
      4.6.6 Proofs by Complete Induction
      4.6.7 On Sieves
   4.7 Conclusion
   References

5. Statistical Mechanics of Generalization
   5.1 Introduction
   5.2 General Results
      5.2.1 Phase Space of Neural Networks
      5.2.2 VC Dimension and Worst-Case Results
      5.2.3 Bayesian Approach and Statistical Mechanics
      5.2.4 Information-Theoretic Results
      5.2.5 Smooth Networks
   5.3 The Perceptron
      5.3.1 Some General Properties
      5.3.2 Replica Theory
      5.3.3 Results for Bayes and Gibbs Algorithms
   5.4 Geometry in Phase Space and Asymptotic Scaling
   5.5 Applications to Perceptrons
      5.5.1 Simple Learning: Hebb Rule
      5.5.2 Overfitting
      5.5.3 Maximal Stability
      5.5.4 Queries
      5.5.5 Discontinuous Learning
      5.5.6 Learning Drifting Concepts
      5.5.7 Diluted Networks
      5.5.8 Continuous Neurons
      5.5.9 Unsupervised Learning
   5.6 Summary and Outlook
   Appendix 5.1: Proof of Sauer's Lemma
   Appendix 5.2: Order Parameters for ADALINE
   References

6. Bayesian Methods for Backpropagation Networks
   6.1 Probability Theory and Occam's Razor
      6.1.1 Occam's Razor
      6.1.2 Bayesian Methods and Data Analysis
      6.1.3 The Mechanism of the Bayesian Occam's Razor: The Evidence and the Occam Factor
   6.2 Neural Networks as Probabilistic Models
      6.2.1 Regression Networks
      6.2.2 Neural Network Learning as Inference
      6.2.3 Binary Classification Networks
      6.2.4 Multiclass Classification Networks
      6.2.5 Implementation
   6.3 Setting Regularization Constants α and β
      6.3.1 Relationship to Ideal Hierarchical Bayesian Modeling
      6.3.2 Multiple Regularization Constants
   6.4 Model Comparison
      6.4.1 Multimodal Distributions
   6.5 Error Bars and Predictions
      6.5.1 Implementation
      6.5.2 Error Bars in Regression
      6.5.3 Integrating Over Models: Committees
      6.5.4 Error Bars in Classification
   6.6 Pruning
   6.7 Automatic Relevance Determination
   6.8 Implicit Priors
   6.9 Cheap and Cheerful Implementations
      6.9.1 Cheap Approximations for Optimization of α and β
      6.9.2 Cheap Generation of Predictive Distributions
   6.10 Discussion
      6.10.1 Applications
      6.10.2 Modeling Insights
      6.10.3 Relationship to Theories of Generalization
      6.10.4 Contrasts with Conventional Dogma in Learning Theory and Statistics
      6.10.5 Minimum Description Length (MDL)
      6.10.6 Ensemble Learning
   References

7. Penacée: A Neural Net System for Recognizing On-Line Handwriting
   7.1 Introduction
   7.2 Description of the Building Blocks
      7.2.1 Recognition Preprocessor
      7.2.2 Neural Feature Extractor
      7.2.3 Classifier
      7.2.4 Segmentation Preprocessor and Postprocessor
      7.2.5 Similarity Measurer
      7.2.6 Loss Calculator
      7.2.7 Global Optimization Techniques
   7.3 Applications
      7.3.1 Isolated Character Recognition
      7.3.2 Hand-Printed Word Recognition
      7.3.3 Signature Verification
   7.4 Conclusion
   References

8. Topology Representing Network in Robotics
   8.1 Introduction
   8.2 Problem Description
   8.3 Topology Representing Network Algorithm
      8.3.1 Training of First-Stage Motion
      8.3.2 Training of Final Grasping of the Cylinder - Second Stage of Movement
   8.4 Experimental Results and Discussion
      8.4.1 Robot Performance
      8.4.2 Combination of Two Networks for Grasping
      8.4.3 Discussion
   References

Publication date:

311 pages.

15.5 × 23.5 cm

Available from the publisher (restocking lead time: 15 days).

Indicative price: €105.49
