Title:
Neural Networks for Intelligent Signal Processing.
Author:
Zaknich, Anthony.
ISBN:
9789812796851
Physical Description:
1 online resource (510 pages)
Series:
Series on Innovative Intelligence ; v.4
Contents:
Contents -- Acknowledgments -- Foreword -- Preface -- 1. Introduction -- 1.1 Motivation for ANNs -- 1.2 ANN Definitions and Main Types -- 1.3 Specific ANN Models -- 1.4 ANN Black Box Model -- 1.5 ANN Implementation -- 1.6 When To Use an ANN -- 1.7 How To Use an ANN -- 1.8 General Applications -- 1.9 Pattern Recognition Examples -- 1.9.1 Sheep Eating Phase Identification from Jaw Sounds -- 1.9.2 Particle Isolation in SEM Images -- 1.9.3 Oxalate Needle Detection in Microscope Images -- 1.10 Function Mapping and Filtering Examples -- 1.10.1 Water Level from Resonant Sound Analysis -- 1.10.2 Nonlinear Signal Filtering -- 1.11 Motor Control Example -- 1.12 ANN Summary -- References -- 2. A Brief Historical Overview -- 2.1 ANN History to 1970 -- 2.1.1 Key Events prior to 1970 -- 2.2 ANN History after 1970 -- 2.2.1 Key Events after 1970 to the Mid 1980s -- 2.2.2 Developments after the Mid 1980s -- 2.2.3 Nonparametric Learning From Finite Data -- 2.3 Reasons for the Resurgence of Interest in ANNs -- 2.4 Historical Summary -- References -- 3. Basic Concepts -- 3.1 The Basic Model of the Neuron -- 3.2 Activation Functions -- 3.3 Topologies -- 3.4 Learning -- 3.4.1 A Basic Supervised Learning Algorithm -- 3.4.2 A Basic Unsupervised Learning Algorithm -- 3.5 The Basic McCulloch-Pitts and Perceptron Models -- 3.6 Vector Spaces and Matrix Models -- 3.6.1 ANN Classifiers -- 3.6.2 Vectors and Feature Spaces -- 3.6.3 Representation of Multivariate Data -- 3.7 Basic Structure of a Neural Network -- 3.8 Basic ANN Operations in terms of Matrices -- 3.9 Why Use Matrices in ANNs? -- 3.9.1 Subspace -- 3.9.2 Multiplication of Matrices and Vectors -- 3.9.3 Line Subspace Example -- 3.9.4 The XOR Problem -- References -- 4. ANN Performance Evaluation -- 4.1 Confusion Matrix.

4.2 Error Measures including the Square Error -- 4.2.1 The Significance of Second Order Statistics -- 4.3 Receiver-Operating-Characteristic (ROC) -- 4.4 Chi-Squared Goodness of Fit -- References -- 5. Basic Pattern Recognition Principles -- 5.1 Data Pre-processing -- 5.1.1 Input Scaling and Normalisation -- 5.1.2 Feature Extraction -- 5.2 Feature Measurement Types -- 5.2.1 Nominal Feature Variables -- 5.2.2 Ordinal Feature Variables -- 5.2.3 Interval Feature Variables -- 5.2.4 Ratio Feature Variables -- 5.2.5 2-D Shape Feature Example -- 5.3 Classification -- 5.3.1 K-Nearest Neighbour Classifier -- 5.3.2 Distance Measures in a Feature Space -- 5.3.3 Statistical Classifiers -- 5.3.4 ANN Classifiers -- 5.4 Decision Criteria and Output Thresholds -- 5.5 Design Procedure for an ANN Classifier -- 5.5.1 A Simple Classification Example -- 5.6 Principal Component Analysis (PCA) -- References -- 6. ADALINES, Adaptive Filters and Multi-Layer Perceptrons -- 6.1 Adaptive Linear Combiner and ADALINE -- 6.1.1 Derivation of the LMS Algorithm -- 6.1.2 The ADALINE and MADALINE -- 6.1.3 Channel Equalisation Example -- 6.1.4 ADALINE Summary -- 6.2 General MLP Networks -- 6.2.1 A Detailed Three-Layer MLP Model -- 6.2.2 Derivation of Backpropagation-of-error Learning -- 6.2.3 An Illustrative Worked Example -- 6.2.4 MLP Application Notes -- 6.2.5 Multi-Layer Perceptron Construction -- 6.2.6 MLP Summary -- References -- 7. Probabilistic Neural Network Classifier -- 7.1 PNN Theory -- 7.2 Bayes' Decision Strategy -- 7.3 PDF Estimators and Radial Basis Functions -- 7.4 PNN Architecture -- 7.4.1 Relation of the Dot Product to Radial Distance -- 7.5 Features and Application Issues -- 7.6 Applications of the PNN -- 7.7 Gong Classification Application Example -- 7.7.1 Gong Description -- 7.7.2 Data Sampling.

7.7.3 Primary Analysis and Feature Selection -- 7.7.4 PNN Training and Classification Performance -- 7.8 Particle Isolation Application Example -- 7.9 FPGA PNN Design -- 7.9.1 Proposed Design -- 7.9.2 Test Results -- 7.9.3 Test Conclusion -- References -- 8. General Regression Neural Network -- 8.1 The Bayes Theorem and Regression Theory -- 8.2 The General Regression Neural Network -- 8.3 Short Wave Signal Filtering Application Example -- References -- 9. The Modified Probabilistic Neural Network -- 9.1 MPNN Theory -- 9.1.1 MPNN Method A -- 9.1.2 MPNN Method B -- 9.1.3 Other MPNN Network Construction Methods -- 9.1.4 Automated Sample Size Reduction -- 9.2 Other MPNN Characteristics -- 9.2.1 Noise Variance in Training and Network Size -- 9.3 MPNN and GRNN Adaptation and Learning Scheme -- 9.4 Signal Processing Application Examples -- 9.5 MPNN Hardware Implementation Schemes -- 9.5.1 Optoelectronic Implementation -- 9.5.2 VLSI Implementation -- 9.5.3 A Virtual Digital VLSI Hardware Design -- 9.5.4 A Parallel VLSI Hardware Design -- 9.6 MPNN Summary -- 9.6.1 Relationship to GRNN -- References -- 10. Advanced MPNN Developments -- 10.1 A Tuneable Approximate Piecewise Linear Model -- 10.1.1 The New Model -- 10.1.2 Example Results -- 10.1.3 Discussion -- 10.2 Integrated Sensory Intelligent System (ISIS) -- 10.2.1 Integrated Sensory Intelligent System Model -- 10.2.2 Doppler Shifted Chirp Detection -- 10.2.3 Test Results and Analysis of Initial Methods -- 10.2.4 Test Results and Analysis for ISIS Method -- 10.3 Future Directions for the MPNN -- 10.3.1 Hyperspace Signal Processing -- 10.3.2 Improvements to the MPNN -- 10.3.3 Other Engineering Applications for the MPNN -- 10.3.4 The MPNN as a Building Block to a General Parallel Computer -- References.

11. Neural Networks Similar to the Common Bandwidth Spherical Basis Function Regression ANNs -- 11.1 Radial Basis Function Neural Network -- 11.1.1 Vector Quantisation and K-means Clustering -- 11.1.2 Least Squares Estimation -- 11.2 Cerebellar Model Articulation Controller -- 11.2.1 Theory of the CMAC Network -- 11.2.2 CMAC Network Architecture -- 11.2.3 CMAC Network Operation -- 11.2.4 CMAC Applications -- References -- 12. Unsupervised Learning Neural Networks -- 12.1 Kohonen's Self-Organising Map -- 12.1.1 Kohonen's Training Algorithm -- 12.1.2 Kohonen's Recognition Algorithm -- 12.1.3 A Character Clustering Example -- 12.1.4 Another Simple Example -- 12.2 Adaptive Resonance Theory -- 12.2.1 ART1 and ART2 Network Characteristics -- 12.2.2 ART1 and ART2 Network Operation -- 12.2.3 The ART Algorithm -- 12.2.4 Theory of the ART1 and ART2 Networks -- 12.2.5 ART Applications -- References -- 13. Other Neural Network Models -- 13.1 Hopfield Neural Network -- 13.1.1 Hopfield Network Characteristics -- 13.1.2 Hopfield Network Operation -- 13.1.3 Hopfield Network Equations -- 13.1.4 Theory of the Hopfield Network -- 13.1.5 Hopfield Network Applications -- 13.2 Boltzmann Machine -- 13.2.1 Simulated Annealing -- 13.2.2 Boltzmann Network Characteristics -- 13.2.3 Boltzmann Network Operation -- 13.2.4 Theory of the Boltzmann Network -- 13.2.5 Boltzmann Machine Applications -- 13.3 Bidirectional Associative Memory -- 13.3.1 BAM Network Characteristics -- 13.3.2 BAM Network Operation -- 13.3.3 BAM Network Equations -- 13.3.4 Theory of the BAM Network -- 13.3.5 BAM Network Applications -- 13.4 Neocognitron -- 13.4.1 Neocognitron Network Structure -- 13.4.2 Neocognitron Network Equations -- 13.4.3 Neocognitron Network Training -- 13.4.4 Neocognitron Applications -- References -- 14. Statistical Learning Theory.

14.1 Learning and Regularisation -- 14.1.1 Dimensionality Problems in Learning -- 14.1.2 Learning Functions and Complexity -- 14.1.3 Approaches to Complexity Control -- 14.2 Vapnik's Statistical Learning Theory -- 14.2.1 Consistency and Convergence of ERM -- 14.2.2 VC-Dimension -- 14.2.3 Structural Risk Minimisation -- 14.3 Support Vector Learning Machines -- 14.3.1 Fundamental Ideas -- 14.3.2 Hyperplane for Optimal Linear Separability -- 14.3.3 Optimal Hyperplane for Nonseparable Data -- 14.3.4 Pattern Recognition SVM Design -- 14.3.5 Summary of SVM Learning Method -- References -- 15. Application to Intelligent Signal Processing -- 15.1 Estimation or Approximation Theory -- 15.2 General Signal Processing Model -- 15.3 Static ANN Models -- 15.4 Dynamic ANN Models -- 15.5 Application and Signal Pre-processing Issues -- 15.6 Signal Processing Examples -- 15.7 MPNN Comparisons with Other Important ANNs -- References -- 16. Application to Intelligent Control -- 16.1 Utility of ANNs for Control -- 16.2 Generic Approaches for Controller Design -- 16.2.1 Hybrid Controllers -- 16.2.2 ANN Controllers -- 16.3 Neural Control Principles -- 16.4 A Fast Adaptive Neural Network System -- 16.4.1 An Illustrative Example -- 16.4.2 Discussion -- References -- 17. Discussion -- 17.1 ANNs for Intelligent Engineering Systems -- 17.2 Signal Processing -- 17.3 Possible Generic Approaches -- References -- Appendix -- The "A" data format -- GRNN MATLAB Program -- Subject Index.
Abstract:
This book provides a thorough theoretical and practical introduction to the application of neural networks to pattern recognition and intelligent signal processing. It has been tested on students unfamiliar with neural networks, who were able to pick up enough detail to complete their masters or final-year undergraduate projects successfully. The text also presents a comprehensive treatment of a class of neural networks called common bandwidth spherical basis function NNs, including the probabilistic NN, the modified probabilistic NN and the general regression NN. Contents: A Brief Historical Overview; Basic Concepts; ANN Performance Evaluation; Basic Pattern Recognition Principles; ADALINES, Adaptive Filters, and Multi-Layer Perceptrons; Probabilistic Neural Network Classifier; General Regression Neural Network; The Modified Probabilistic Neural Network; Advanced MPNN Developments; Neural Networks Similar to the Common Bandwidth Spherical Basis Function Regression ANNs; Unsupervised Learning Neural Networks; Other Neural Network Models; Statistical Learning Theory; Application to Intelligent Signal Processing; Application to Intelligent Control. Readership: Students and professionals in computer science and engineering.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.