
Title:
Dissimilarity Representation For Pattern Recognition : Foundations And Applications.
Author:
Pekalska, Elzbieta.
ISBN:
9789812703170
Physical Description:
1 online resource (634 pages)
Contents:
Contents -- Preface -- Notation and basic terminology -- Abbreviations -- 1. Introduction -- 1.1 Recognizing the pattern -- 1.2 Dissimilarities for representation -- 1.3 Learning from examples -- 1.4 Motivation of the use of dissimilarity representations -- 1.5 Relation to kernels -- 1.6 Outline of the book -- 1.7 In summary -- PART 1 Concepts and theory -- 2. Spaces -- 2.1 Preliminaries -- 2.2 A brief look at spaces -- 2.3 Generalized topological spaces -- 2.4 Generalized metric spaces -- 2.5 Vector spaces -- 2.6 Normed and inner product spaces -- 2.6.1 Reproducing kernel Hilbert spaces -- 2.7 Indefinite inner product spaces -- 2.7.1 Reproducing kernel Krein spaces -- 2.8 Discussion -- 3. Characterization of dissimilarities -- 3.1 Embeddings, tree models and transformations -- 3.1.1 Embeddings -- 3.1.2 Distorted metric embeddings -- 3.2 Tree models for dissimilarities -- 3.3 Useful transformations -- 3.3.1 Transformations in semimetric spaces -- 3.3.2 Direct product spaces -- 3.3.3 Invariance and robustness -- 3.4 Properties of dissimilarity matrices -- 3.4.1 Dissimilarity matrices -- 3.4.2 Square distances and inner products -- 3.5 Linear embeddings of dissimilarities -- 3.5.1 Euclidean embedding -- 3.5.2 Correction of non-Euclidean dissimilarities -- 3.5.3 Pseudo-Euclidean embedding -- 3.5.4 Generalized average variance -- 3.5.5 Projecting new vectors to an embedded space -- 3.5.6 Reduction of dimension -- 3.5.7 Reduction of complexity -- 3.5.8 A general embedding -- 3.5.9 Spherical embeddings -- 3.6 Spatial representation of dissimilarities -- 3.6.1 FastMap -- 3.6.2 Multidimensional scaling -- 3.6.3 Reduction of complexity -- 3.7 Summary -- 4. Learning approaches -- 4.1 Traditional learning -- 4.1.1 Data bias and model bias -- 4.1.2 Statistical learning -- 4.1.3 Inductive principles -- 4.1.3.1 Empirical risk minimization (ERM).
4.1.3.2 Principles based on Occam's razor -- Structural Risk Minimization (SRM). -- Regularization principle. -- Bayesian inference. -- 4.1.4 Why is the statistical approach not good enough for learning from objects? -- 4.2 The role of dissimilarity representations -- 4.2.1 Learned proximity representations -- 4.2.2 Dissimilarity representations: learning -- 4.3 Classification in generalized topological spaces -- 4.4 Classification in dissimilarity spaces -- 4.4.1 Characterization of dissimilarity spaces -- 4.4.2 Classifiers -- 4.5 Classification in pseudo-Euclidean spaces -- 4.6 On generalized kernels and dissimilarity spaces -- 4.6.1 Connection between dissimilarity spaces and pseudo-Euclidean spaces -- 4.7 Discussion -- 5. Dissimilarity measures -- 5.1 Measures depending on feature types -- 5.2 Measures between populations -- 5.2.1 Normal distributions -- 5.2.2 Divergence measures -- 5.2.3 Discrete probability distributions -- 5.3 Dissimilarity measures between sequences -- 5.4 Information-theoretic measures -- 5.5 Dissimilarity measures between sets -- 5.6 Dissimilarity measures in applications -- 5.6.1 Invariance and robustness -- 5.6.2 Example measures -- 5.7 Discussion and conclusions -- PART 2 Practice -- 6. Visualization -- 6.1 Multidimensional scaling -- 6.1.1 First examples -- 6.1.2 Linear and nonlinear methods: examples -- 6.1.3 Implementation -- 6.2 Other mappings -- 6.3 Examples: getting insight into the data -- 6.4 Tree models -- 6.5 Summary -- 7. Further data exploration -- 7.1 Clustering -- 7.1.1 Standard approaches -- 7.1.2 Clustering on dissimilarity representations -- 7.1.3 Clustering examples for dissimilarity representations -- 7.2 Intrinsic dimension -- 7.3 Sampling density -- 7.3.1 Proposed criteria -- 7.3.2 Experiments with the NIST digits -- 7.4 Summary -- 8. One-class classifiers -- 8.1 General issues.
8.1.1 Construction of one-class classifiers -- 8.1.2 One-class classifiers in feature spaces -- 8.2 Domain descriptors for dissimilarity representations -- 8.2.1 Neighborhood-based OCCs -- 8.2.2 Generalized mean class descriptor -- 8.2.3 Linear programming dissimilarity data description -- 8.2.4 More issues on class descriptors -- 8.3 Experiments -- 8.3.1 Experiment I: Condition monitoring -- 8.3.2 Experiment II: Diseased mucosa in the oral cavity -- 8.3.3 Experiment III: Heart disease data -- 8.4 Conclusions -- 9. Classification -- 9.1 Proof of principle -- 9.1.1 NN rule vs alternative dissimilarity-based classifiers -- 9.1.2 Experiment I: square dissimilarity representations -- 9.1.3 Experiment II: the dissimilarity space approach -- 9.1.4 Discussion -- 9.2 Selection of the representation set: the dissimilarity space approach -- 9.2.1 Prototype selection methods -- 9.2.2 Experimental setup -- 9.2.3 Results and discussion -- 9.2.4 Conclusions -- 9.3 Selection of the representation set: the embedding approach -- 9.3.1 Prototype selection methods -- 9.3.2 Experiments and results -- 9.3.3 Conclusions -- 9.4 On corrections of dissimilarity measures -- 9.4.1 Going more Euclidean -- 9.4.2 Experimental setup -- 9.4.3 Results and conclusions -- 9.5 A few remarks on a simulated missing value problem -- 9.6 Existence of zero-error dissimilarity-based classifiers -- 9.6.1 Asymptotic separability of classes -- 9.7 Final discussion -- 10. Combining -- 10.1 Combining for one-class classification -- 10.1.1 Combining strategies -- 10.1.2 Data and experimental setup -- 10.1.3 Results and discussion -- 10.1.4 Summary and conclusions -- 10.2 Combining for standard two-class classification -- 10.2.1 Combining strategies -- 10.2.2 Experiments on the handwritten digit set -- 10.2.3 Results -- 10.2.4 Conclusions -- 10.3 Classifier projection space.
10.3.1 Construction and the use of CPS -- 10.4 Summary -- 11. Representation review and recommendations -- 11.1 Representation review -- 11.1.1 Three generalization ways -- 11.1.2 Representation formation -- 11.1.3 Generalization capabilities -- 11.2 Practical considerations -- 11.2.1 Clustering -- 11.2.2 One-class classification -- 11.2.3 Classification -- 12. Conclusions and open problems -- 12.1 Summary and contributions -- 12.2 Extensions of dissimilarity representations -- 12.3 Open questions -- Appendix A On convex and concave functions -- Appendix B Linear algebra in vector spaces -- B.1 Some facts on matrices in a Euclidean space -- B.2 Some facts on matrices in a pseudo-Euclidean space -- Appendix C Measure and probability -- Appendix D Statistical sidelines -- D.1 Likelihood and parameter estimation -- D.2 Expectation-maximization (EM) algorithm -- D.3 Model selection -- D.4 PCA and probabilistic models -- D.4.1 Gaussian model -- D.4.2 A Gaussian mixture model -- D.4.3 PCA -- D.4.4 Probabilistic PCA -- D.4.5 A mixture of probabilistic PCA -- Appendix E Data sets -- E.1 Artificial data sets -- E.2 Real-world data sets -- Bibliography -- Index.
Abstract:
This book provides a fundamentally new approach to pattern recognition in which objects are characterized by relations to other objects instead of by features or models. This 'dissimilarity representation' bridges the gap between the traditionally opposing approaches of statistical and structural pattern recognition. Physical phenomena, objects and events in the world are related in various and often complex ways. Such relations are usually modeled in the form of graphs or diagrams. While this is useful for communication between experts, such representations are difficult for machine learning procedures to combine and integrate. However, if the relations are captured by sets of dissimilarities, general data analysis procedures may be applied. With their detailed description of an approach absent from traditional textbooks, the authors have crafted an essential book for every researcher and systems designer studying or developing pattern recognition systems.
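The core idea sketched in the abstract is that once objects are described by their dissimilarities to a set of prototype objects, any ordinary vector-space classifier can be applied. The following is a minimal illustration of that idea, not code from the book: the two Gaussian classes, the choice of prototypes, and the nearest-mean rule are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy classes of 2-D "objects" (any objects would do, as long as a
# pairwise dissimilarity between them can be computed).
class_a = rng.normal(loc=0.0, scale=1.0, size=(50, 2))
class_b = rng.normal(loc=4.0, scale=1.0, size=(50, 2))
X = np.vstack([class_a, class_b])
y = np.array([0] * 50 + [1] * 50)

# Representation set R: a few prototype objects (here, arbitrarily, the
# first three of each class).  Each object is then described not by its
# own features but by its dissimilarities d(x, r) to the prototypes.
R = np.vstack([class_a[:3], class_b[:3]])

def dissim(X, R):
    """Euclidean dissimilarity of every object in X to every prototype in R."""
    return np.linalg.norm(X[:, None, :] - R[None, :, :], axis=2)

D = dissim(X, R)  # shape (100, 6): the dissimilarity representation

# Any standard classifier can now be trained on D as if its columns were
# features.  Here, a simple nearest-mean classifier in dissimilarity space:
mean_a = D[y == 0].mean(axis=0)
mean_b = D[y == 1].mean(axis=0)
pred = (np.linalg.norm(D - mean_b, axis=1)
        < np.linalg.norm(D - mean_a, axis=1)).astype(int)
accuracy = (pred == y).mean()
```

Note that the dissimilarity measure need not be Euclidean, or even metric; the book's point is that the resulting dissimilarity space can be treated with general-purpose learning procedures regardless.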
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.
Added Author:
Duin, Robert P.W.