Pattern Recognition.
Title:
Pattern Recognition.
Author:
Theodoridis, Sergios.
ISBN:
9780080513614
Personal Author:
Theodoridis, Sergios.
Edition:
3rd ed.
Physical Description:
1 online resource (854 pages)
Contents:
Front cover -- Title page -- Copyright page -- Table of contents -- PREFACE -- 1 INTRODUCTION -- 1.1 IS PATTERN RECOGNITION IMPORTANT? -- 1.2 FEATURES, FEATURE VECTORS, AND CLASSIFIERS -- 1.3 SUPERVISED VERSUS UNSUPERVISED PATTERN RECOGNITION -- 1.4 OUTLINE OF THE BOOK -- 2 CLASSIFIERS BASED ON BAYES DECISION THEORY -- 2.1 INTRODUCTION -- 2.2 BAYES DECISION THEORY -- 2.3 DISCRIMINANT FUNCTIONS AND DECISION SURFACES -- 2.4 BAYESIAN CLASSIFICATION FOR NORMAL DISTRIBUTIONS -- 2.5 ESTIMATION OF UNKNOWN PROBABILITY DENSITY FUNCTIONS -- 2.6 THE NEAREST NEIGHBOR RULE -- 2.7 BAYESIAN NETWORKS -- 3 LINEAR CLASSIFIERS -- 3.1 INTRODUCTION -- 3.2 LINEAR DISCRIMINANT FUNCTIONS AND DECISION HYPERPLANES -- 3.3 THE PERCEPTRON ALGORITHM -- 3.4 LEAST SQUARES METHODS -- 3.5 MEAN SQUARE ESTIMATION REVISITED -- 3.6 LOGISTIC DISCRIMINATION -- 3.7 SUPPORT VECTOR MACHINES -- 4 NONLINEAR CLASSIFIERS -- 4.1 INTRODUCTION -- 4.2 THE XOR PROBLEM -- 4.3 THE TWO-LAYER PERCEPTRON -- 4.4 THREE-LAYER PERCEPTRONS -- 4.5 ALGORITHMS BASED ON EXACT CLASSIFICATION OF THE TRAINING SET -- 4.6 THE BACKPROPAGATION ALGORITHM -- 4.7 VARIATIONS ON THE BACKPROPAGATION THEME -- 4.8 THE COST FUNCTION CHOICE -- 4.9 CHOICE OF THE NETWORK SIZE -- 4.10 A SIMULATION EXAMPLE -- 4.11 NETWORKS WITH WEIGHT SHARING -- 4.12 GENERALIZED LINEAR CLASSIFIERS -- 4.13 CAPACITY OF THE l-DIMENSIONAL SPACE IN LINEAR DICHOTOMIES -- 4.14 POLYNOMIAL CLASSIFIERS -- 4.15 RADIAL BASIS FUNCTION NETWORKS -- 4.16 UNIVERSAL APPROXIMATORS -- 4.17 SUPPORT VECTOR MACHINES: THE NONLINEAR CASE -- 4.18 DECISION TREES -- 4.19 COMBINING CLASSIFIERS -- 4.20 THE BOOSTING APPROACH TO COMBINE CLASSIFIERS -- 4.21 DISCUSSION -- 5 FEATURE SELECTION -- 5.1 INTRODUCTION -- 5.2 PREPROCESSING -- 5.3 FEATURE SELECTION BASED ON STATISTICAL HYPOTHESIS TESTING -- 5.4 THE RECEIVER OPERATING CHARACTERISTICS (ROC) CURVE.

5.5 CLASS SEPARABILITY MEASURES -- 5.6 FEATURE SUBSET SELECTION -- 5.7 OPTIMAL FEATURE GENERATION -- 5.8 NEURAL NETWORKS AND FEATURE GENERATION/SELECTION -- 5.9 A HINT ON GENERALIZATION THEORY -- 5.10 THE BAYESIAN INFORMATION CRITERION -- 6 FEATURE GENERATION I: LINEAR TRANSFORMS -- 6.1 INTRODUCTION -- 6.2 BASIS VECTORS AND IMAGES -- 6.3 THE KARHUNEN-LOÈVE TRANSFORM -- 6.4 THE SINGULAR VALUE DECOMPOSITION -- 6.5 INDEPENDENT COMPONENT ANALYSIS -- 6.6 THE DISCRETE FOURIER TRANSFORM (DFT) -- 6.7 THE DISCRETE COSINE AND SINE TRANSFORMS -- 6.8 THE HADAMARD TRANSFORM -- 6.9 THE HAAR TRANSFORM -- 6.10 THE HAAR EXPANSION REVISITED -- 6.11 DISCRETE TIME WAVELET TRANSFORM (DTWT) -- 6.12 THE MULTIRESOLUTION INTERPRETATION -- 6.13 WAVELET PACKETS -- 6.14 A LOOK AT TWO-DIMENSIONAL GENERALIZATIONS -- 6.15 APPLICATIONS -- 7 FEATURE GENERATION II -- 7.1 INTRODUCTION -- 7.2 REGIONAL FEATURES -- 7.3 FEATURES FOR SHAPE AND SIZE CHARACTERIZATION -- 7.4 A GLIMPSE AT FRACTALS -- 7.5 TYPICAL FEATURES FOR SPEECH AND AUDIO CLASSIFICATION -- 8 TEMPLATE MATCHING -- 8.1 INTRODUCTION -- 8.2 MEASURES BASED ON OPTIMAL PATH SEARCHING TECHNIQUES -- 8.3 MEASURES BASED ON CORRELATIONS -- 8.4 DEFORMABLE TEMPLATE MODELS -- 9 CONTEXT-DEPENDENT CLASSIFICATION -- 9.1 INTRODUCTION -- 9.2 THE BAYES CLASSIFIER -- 9.3 MARKOV CHAIN MODELS -- 9.4 THE VITERBI ALGORITHM -- 9.5 CHANNEL EQUALIZATION -- 9.6 HIDDEN MARKOV MODELS -- 9.7 HMM WITH STATE DURATION MODELING -- 9.8 TRAINING MARKOV MODELS VIA NEURAL NETWORKS -- 9.9 A DISCUSSION OF MARKOV RANDOM FIELDS -- 10 SYSTEM EVALUATION -- 10.1 INTRODUCTION -- 10.2 ERROR COUNTING APPROACH -- 10.3 EXPLOITING THE FINITE SIZE OF THE DATA SET -- 10.4 A CASE STUDY FROM MEDICAL IMAGING -- 11 CLUSTERING: BASIC CONCEPTS -- 11.1 INTRODUCTION -- 11.2 PROXIMITY MEASURES -- 12 CLUSTERING ALGORITHMS I: SEQUENTIAL ALGORITHMS -- 12.1 INTRODUCTION.

12.2 CATEGORIES OF CLUSTERING ALGORITHMS -- 12.3 SEQUENTIAL CLUSTERING ALGORITHMS -- 12.4 A MODIFICATION OF BSAS -- 12.5 A TWO-THRESHOLD SEQUENTIAL SCHEME -- 12.6 REFINEMENT STAGES -- 12.7 NEURAL NETWORK IMPLEMENTATION -- 13 CLUSTERING ALGORITHMS II: HIERARCHICAL ALGORITHMS -- 13.1 INTRODUCTION -- 13.2 AGGLOMERATIVE ALGORITHMS -- 13.3 THE COPHENETIC MATRIX -- 13.4 DIVISIVE ALGORITHMS -- 13.5 HIERARCHICAL ALGORITHMS FOR LARGE DATA SETS -- 13.6 CHOICE OF THE BEST NUMBER OF CLUSTERS -- 14 CLUSTERING ALGORITHMS III: SCHEMES BASED ON FUNCTION OPTIMIZATION -- 14.1 INTRODUCTION -- 14.2 MIXTURE DECOMPOSITION SCHEMES -- 14.3 FUZZY CLUSTERING ALGORITHMS -- 14.4 POSSIBILISTIC CLUSTERING -- 14.5 HARD CLUSTERING ALGORITHMS -- 14.6 VECTOR QUANTIZATION -- APPENDIX -- 15 CLUSTERING ALGORITHMS IV -- 15.1 INTRODUCTION -- 15.2 CLUSTERING ALGORITHMS BASED ON GRAPH THEORY -- 15.3 COMPETITIVE LEARNING ALGORITHMS -- 15.4 BINARY MORPHOLOGY CLUSTERING ALGORITHMS (BMCAs) -- 15.5 BOUNDARY DETECTION ALGORITHMS -- 15.6 VALLEY-SEEKING CLUSTERING ALGORITHMS -- 15.7 CLUSTERING VIA COST OPTIMIZATION (REVISITED) -- 15.8 KERNEL CLUSTERING METHODS -- 15.9 DENSITY-BASED ALGORITHMS FOR LARGE DATA SETS -- 15.10 CLUSTERING ALGORITHMS FOR HIGH-DIMENSIONAL DATA SETS -- 15.11 OTHER CLUSTERING ALGORITHMS -- 16 CLUSTER VALIDITY -- 16.1 INTRODUCTION -- 16.2 HYPOTHESIS TESTING REVISITED -- 16.3 HYPOTHESIS TESTING IN CLUSTER VALIDITY -- 16.4 RELATIVE CRITERIA -- 16.5 VALIDITY OF INDIVIDUAL CLUSTERS -- 16.6 CLUSTERING TENDENCY -- Appendix A HINTS FROM PROBABILITY AND STATISTICS -- A.1 TOTAL PROBABILITY AND THE BAYES RULE -- A.2 MEAN AND VARIANCE -- A.3 STATISTICAL INDEPENDENCE -- A.4 MARGINALIZATION -- A.5 CHARACTERISTIC FUNCTIONS -- A.6 MOMENTS AND CUMULANTS -- A.7 EDGEWORTH EXPANSION OF A PDF -- A.8 KULLBACK-LEIBLER DISTANCE -- A.9 MULTIVARIATE GAUSSIAN OR NORMAL PROBABILITY DENSITY FUNCTION.

A.10 THE CRAMER-RAO LOWER BOUND -- A.11 CENTRAL LIMIT THEOREM -- A.12 CHI-SQUARE DISTRIBUTION -- A.13 t-DISTRIBUTION -- A.14 BETA DISTRIBUTION -- A.15 POISSON DISTRIBUTION -- Appendix B LINEAR ALGEBRA BASICS -- B.1 POSITIVE DEFINITE AND SYMMETRIC MATRICES -- B.2 CORRELATION MATRIX DIAGONALIZATION -- Appendix C COST FUNCTION OPTIMIZATION -- C.1 GRADIENT DESCENT ALGORITHM -- C.2 NEWTON'S ALGORITHM -- C.3 CONJUGATE-GRADIENT METHOD -- C.4 OPTIMIZATION FOR CONSTRAINED PROBLEMS -- Appendix D BASIC DEFINITIONS FROM LINEAR SYSTEMS THEORY -- D.1 LINEAR TIME INVARIANT (LTI) SYSTEMS -- D.2 TRANSFER FUNCTION -- D.3 SERIAL AND PARALLEL CONNECTION -- D.4 TWO-DIMENSIONAL GENERALIZATIONS -- INDEX.
Abstract:
Pattern recognition is a fast-growing area with applications in a wide variety of fields, such as communications engineering, bioinformatics, data mining, and content-based database retrieval, to name but a few. This new edition addresses and keeps pace with the most recent advances in these and related areas. This new edition: a) covers Data Mining, which was not treated in the previous edition and is integrated with the existing material in the book; b) includes new results on Learning Theory and Support Vector Machines, which are at the forefront of today's research and attract strong interest in both academia and applications-oriented communities; c) treats audio applications alongside image applications for the first time, since the most advanced applications today handle both in a unified way; and d) covers classifier combination, currently a topic of keen interest in the pattern recognition community.
* The latest results on support vector machines, including ν-SVMs and their geometric interpretation
* Classifier combination, including the boosting approach
* State-of-the-art material on clustering algorithms tailored for large and/or high-dimensional data sets, as required by applications such as web mining and bioinformatics
* Coverage of diverse applications such as image analysis, optical character recognition, channel equalization, speech recognition, and audio classification.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.