Principles Of Artificial Neural Networks.
Title:
Principles Of Artificial Neural Networks.
Author:
Graupe, Daniel.
ISBN:
9789812770578
Edition:
2nd ed.
Physical Description:
1 online resource (320 pages)
Series:
Advanced Series in Circuits & Systems, v. 6
Contents:
Contents -- Acknowledgments -- Preface to the First Edition -- Preface to the Second Edition -- Chapter 1. Introduction and Role of Artificial Neural Networks -- Chapter 2. Fundamentals of Biological Neural Networks -- Chapter 3. Basic Principles of ANNs and Their Early Structures -- 3.1. Basic Principles of ANN Design -- 3.2. Basic Network Structures -- 3.3. The Perceptron's Input-Output Principles -- 3.4. The Adaline (ALC) -- 3.4.1. LMS training of ALC -- 3.4.2. Steepest descent training of ALC -- Chapter 4. The Perceptron -- 4.1. The Basic Structure -- 4.1.1. Perceptron's activation functions -- 4.2. The Single-Layer Representation Problem -- 4.3. The Limitations of the Single-Layer Perceptron -- 4.4. Many-Layer Perceptrons -- 4.A. Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification) -- Chapter 5. The Madaline -- 5.1. Madaline Training -- 5.A. Madaline Case Study: Character Recognition -- 5.A.1. Problem statement -- 5.A.2. Design of network -- 5.A.3. Training of the network -- 5.A.4. Results -- 5.A.5. Conclusions and observations -- 5.A.6. MATLAB code for implementing MADALINE network -- Chapter 6. Back Propagation -- 6.1. The Back Propagation Learning Procedure -- 6.2. Derivation of the BP Algorithm -- 6.3. Modified BP Algorithms -- 6.3.1. Introduction of bias into NN -- 6.3.2. Incorporating momentum or smoothing to weight adjustment -- 6.3.3. Other modifications concerning convergence -- 6.A. Back Propagation Case Study: Character Recognition -- 6.A.1. Introduction -- 6.A.2. Network design -- 6.A.3. Results -- 6.A.4. Discussion and conclusions -- 6.A.5. Program Code (C++) -- 6.B. Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) -- 6.C. Back Propagation Case Study: The XOR Problem

7.2. Binary Hopfield Networks -- 7.3. Setting of Weights in Hopfield Nets - Bidirectional Associative Memory (BAM) Principle -- 7.4. Walsh Functions -- 7.5. Network Stability -- 7.6. Summary of the Procedure for Implementing the Hopfield Network -- 7.7. Continuous Hopfield Models -- 7.8. The Continuous Energy (Lyapunov) Function -- 7.A. Hopfield Network Case Study: Character Recognition -- 7.A.1. Introduction -- 7.A.2. Network design -- 7.A.3. Setting of weights -- 7.A.4. Testing -- 7.A.5. Results and conclusions -- 7.A.6. MATLAB codes -- 7.B. Hopfield Network Case Study: Traveling Salesman Problem -- 7.B.1. Introduction -- 7.B.2. Hopfield neural network design -- 7.B.3. Input selection -- 7.B.4. Implementation details -- 7.B.5. Output results -- 7.B.6. Concluding discussion -- Chapter 8. Counter Propagation -- 8.1. Introduction -- 8.2. Kohonen Self-Organizing Map (SOM) Layer -- 8.3. Grossberg Layer -- 8.4. Training of the Kohonen Layer -- 8.4.1. Preprocessing of Kohonen layer's inputs -- 8.4.2. Initializing the weights of the Kohonen layer -- 8.4.3. Interpolative mode layer -- 8.5. Training of Grossberg Layers -- 8.6. The Combined Counter Propagation Network -- 8.A. Counter Propagation Network Case Study: Character Recognition -- 8.A.1. Introduction -- 8.A.2. Network structure -- 8.A.3. Network training -- 8.A.4. Test mode -- 8.A.5. Results and conclusions -- 8.A.6. Source codes (MATLAB) -- Chapter 9. Adaptive Resonance Theory -- 9.1. Motivation -- 9.2. The ART Network Structure -- 9.3. Setting-Up of the ART Network -- 9.4. Network Operation -- 9.5. Properties of ART -- 9.6. Discussion and General Comments on ART-I and ART-II -- 9.A. ART-I Network Case Study: Character Recognition -- 9.A.1. Introduction -- 9.A.2. The data set -- 9.A.3. Network design -- 9.A.4. Performance results and conclusions -- 9.A.5. Code for ART neural network (Java).

9.B. ART-I Case Study: Speech Recognition -- 9.B.1. Input matrix set-up for spoken words -- 9.B.2. Simulation programs set-up -- 9.B.3. Computer simulation of ART program (C-language) -- 9.B.4. Simulation results -- Chapter 10. The Cognitron and the Neocognitron -- 10.1. Background of the Cognitron -- 10.2. The Basic Principles of the Cognitron -- 10.3. Network Operation -- 10.4. Cognitron's Network Training -- 10.5. The Neocognitron -- Chapter 11. Statistical Training -- 11.1. Fundamental Philosophy -- 11.2. Annealing Methods -- 11.3. Simulated Annealing by Boltzmann Training of Weights -- 11.4. Stochastic Determination of Magnitude of Weight Change -- 11.5. Temperature-Equivalent Setting -- 11.6. Cauchy Training of Neural Network -- 11.A. Statistical Training Case Study

13.2.1. Basic structural elements -- 13.2.2. Setting of storage weights and determination of winning neurons -- 13.2.3. Adjustment of resolution in SOM modules -- 13.2.4. Links between SOM modules and from SOM modules to output modules -- 13.2.5. Determination of winning decision via link weights -- 13.2.6. Nj weights (not implemented in most applications) -- 13.2.7. Initialization and local minima -- 13.3. Forgetting Feature -- 13.4. Training vs. Operational Runs -- 13.4.1. INPUT WORD for training and for information retrieval -- 13.5. Advanced Data Analysis Capabilities -- 13.5.1. Feature extraction and reduction in the LAMSTAR NN -- 13.6. Correlation, Interpolation, Extrapolation and Innovation-Detection -- 13.6.1. Correlation feature -- 13.6.2. Innovation detection in the LAMSTAR NN -- 13.7. Concluding Comments and Discussion of Applicability -- 13.A. LAMSTAR Network Case Study: Character Recognition -- 13.A.1. Introduction -- 13.A.2. Design of the network -- 13.A.3. Fundamental principles -- 13.A.4. Training algorithm -- 13.A.5. Testing procedure -- 13.A.6. Results and their analysis -- 13.A.7. Summary and concluding observations -- 13.A.8. LAMSTAR code (MATLAB) -- 13.B. Application to Medical Diagnosis Problems -- Problems -- References -- Author Index -- Subject Index.
Abstract:
The book is intended as a text for a graduate or advanced undergraduate course on neural networks in engineering and computer science departments, and as a self-study text for engineers and computer scientists in industry. Covering the major neural network approaches and architectures together with their underlying theory, it presents a detailed case study for each approach, accompanied by complete computer code and the corresponding computed results. The case studies are designed to allow easy comparison of network performance, illustrating the strengths and weaknesses of the different networks.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.