Hyperspectral Data Processing : Algorithm Design and Analysis.
Title:
Hyperspectral Data Processing : Algorithm Design and Analysis.
Author:
Chang, Chein-I.
ISBN:
9781118269756
Edition:
1st ed.
Physical Description:
1 online resource (1165 pages)
Contents:
HYPERSPECTRAL DATA PROCESSING: Algorithm Design and Analysis -- CONTENTS -- PREFACE -- 1 OVERVIEW AND INTRODUCTION -- 1.1 Overview -- 1.2 Issues of Multispectral and Hyperspectral Imageries -- 1.3 Divergence of Hyperspectral Imagery from Multispectral Imagery -- 1.3.1 Misconception: Hyperspectral Imaging is a Natural Extension of Multispectral Imaging -- 1.3.2 Pigeon-Hole Principle: Natural Interpretation of Hyperspectral Imaging -- 1.4 Scope of This Book -- 1.5 Book's Organization -- 1.5.1 Part I: Preliminaries -- 1.5.2 Part II: Endmember Extraction -- 1.5.3 Part III: Supervised Linear Hyperspectral Mixture Analysis -- 1.5.4 Part IV: Unsupervised Hyperspectral Analysis -- 1.5.5 Part V: Hyperspectral Information Compression -- 1.5.6 Part VI: Hyperspectral Signal Coding -- 1.5.7 Part VII: Hyperspectral Signal Feature Characterization -- 1.5.8 Applications -- 1.5.8.1 Chapter 30: Applications of Target Detection -- 1.5.8.2 Chapter 31: Nonlinear Dimensionality Expansion to Multispectral Imagery -- 1.5.8.3 Chapter 32: Multispectral Magnetic Resonance Imaging -- 1.6 Laboratory Data to be Used in This Book -- 1.6.1 Laboratory Data -- 1.6.2 Cuprite Data -- 1.6.3 NIST/EPA Gas-Phase Infrared Database -- 1.7 Real Hyperspectral Images to be Used in This Book -- 1.7.1 AVIRIS Data -- 1.7.1.1 Cuprite Data -- 1.7.1.2 Purdue's Indiana Indian Pine Test Site -- 1.7.2 HYDICE Data -- 1.8 Notations and Terminologies to be Used in This Book -- I: PRELIMINARIES -- 2 FUNDAMENTALS OF SUBSAMPLE AND MIXED SAMPLE ANALYSES -- 2.1 Introduction -- 2.2 Subsample Analysis -- 2.2.1 Pure-Sample Target Detection -- 2.2.2 Subsample Target Detection -- 2.2.2.1 Adaptive Matched Detector (AMD) -- 2.2.2.2 Adaptive Subspace Detector (ASD) -- 2.2.3 Subsample Target Detection: Constrained Energy Minimization (CEM) -- 2.3 Mixed Sample Analysis -- 2.3.1 Classification with Hard Decisions.

2.3.1.1 Fisher's Linear Discriminant Analysis (FLDA) -- 2.3.1.2 Support Vector Machines (SVM) -- 2.3.2 Classification with Soft Decisions -- 2.3.2.1 Orthogonal Subspace Projection (OSP) -- 2.3.2.2 Target-Constrained Interference-Minimized Filter (TCIMF) -- 2.4 Kernel-Based Classification -- 2.4.1 Kernel Trick Used in Kernel-Based Methods -- 2.4.2 Kernel-Based Fisher's Linear Discriminant Analysis (KFLDA) -- 2.4.3 Kernel Support Vector Machine (K-SVM) -- 2.5 Conclusions -- 3 THREE-DIMENSIONAL RECEIVER OPERATING CHARACTERISTICS (3D ROC) ANALYSIS -- 3.1 Introduction -- 3.2 Neyman-Pearson Detection Problem Formulation -- 3.3 ROC Analysis -- 3.4 3D ROC Analysis -- 3.5 Real Data-Based ROC Analysis -- 3.5.1 How to Generate ROC Curves from Real Data -- 3.5.2 How to Generate Gaussian-Fitted ROC Curves -- 3.5.3 How to Generate 3D ROC Curves -- 3.5.4 How to Generate 3D ROC Curves for Multiple Signal Detection and Classification -- 3.6 Examples -- 3.6.1 Hyperspectral Imaging -- 3.6.1.1 Hyperspectral Target Detection -- 3.6.1.2 Linear Hyperspectral Mixture Analysis -- 3.6.2 Magnetic Resonance (MR) Breast Imaging -- 3.6.2.1 Breast Tumor Detection -- 3.6.2.2 Brain Tissue Classification -- 3.6.3 Chemical/Biological Agent Detection -- 3.6.4 Biometric Recognition -- 3.7 Conclusions -- 4 DESIGN OF SYNTHETIC IMAGE EXPERIMENTS -- 4.1 Introduction -- 4.2 Simulation of Targets of Interest -- 4.2.1 Simulation of Synthetic Subsample Targets -- 4.2.2 Simulation of Synthetic Mixed-Sample Targets -- 4.3 Six Scenarios of Synthetic Images -- 4.3.1 Panel Simulations -- 4.3.2 Three Scenarios for Target Implantation (TI) -- 4.3.2.1 Scenario TI1 (Clean Panels Implanted into Clean Background) -- 4.3.2.2 Scenario TI2 (Clean Panels Implanted into Noisy Background) -- 4.3.2.3 Scenario TI3 (Gaussian Noise Added to Clean Panels Implanted into Clean Background).

4.3.3 Three Scenarios for Target Embeddedness (TE) -- 4.3.3.1 Scenario TE1 (Clean Panels Embedded in Clean Background) -- 4.3.3.2 Scenario TE2 (Clean Panels Embedded in Noisy Background) -- 4.3.3.3 Scenario TE3 (Gaussian Noise Added to Clean Panels Embedded in Background) -- 4.4 Applications -- 4.4.1 Endmember Extraction -- 4.4.2 Linear Spectral Mixture Analysis (LSMA) -- 4.4.2.1 Mixed Pixel Classification -- 4.4.2.2 Mixed Pixel Quantification -- 4.4.3 Target Detection -- 4.4.3.1 Subpixel Target Detection -- 4.4.3.2 Anomaly Detection -- 4.5 Conclusions -- 5 VIRTUAL DIMENSIONALITY OF HYPERSPECTRAL DATA -- 5.1 Introduction -- 5.2 Reinterpretation of VD -- 5.3 VD Determined by Data Characterization-Driven Criteria -- 5.3.1 Eigenvalue Distribution-Based Criteria -- 5.3.1.1 Thresholding Energy Percentage -- 5.3.1.2 Thresholding Difference between Normalized Correlation Eigenvalues and Normalized Covariance Eigenvalues -- 5.3.1.3 Finding First Sudden Drop in the Normalized Eigenvalue Distribution -- 5.3.2 Eigen-Based Component Analysis Criteria -- 5.3.2.1 Singular Value Decomposition (SVD) -- 5.3.2.2 Principal Components Analysis (PCA) -- 5.3.3 Factor Analysis: Malinowski's Error Theory -- 5.3.4 Information Theoretic Criteria (ITC) -- 5.3.4.1 AIC -- 5.3.4.2 MDL -- 5.3.5 Gershgorin Radius-Based Methods -- 5.3.5.1 Thresholding Gershgorin Radii -- 5.3.5.2 Thresholding Difference Gershgorin Radii between R_LxL and K_LxL -- 5.3.6 HFC Method -- 5.3.7 Discussions on Data Characterization-Driven Criteria -- 5.4 VD Determined by Data Representation-Driven Criteria -- 5.4.1 Orthogonal Subspace Projection (OSP) -- 5.4.2 Signal Subspace Estimation (SSE) -- 5.4.3 Discussions on OSP and SSE/HySime -- 5.5 Synthetic Image Experiments -- 5.5.1 Data Characterization-Driven Criteria -- 5.5.1.1 Target Implantation (TI) Scenarios -- 5.5.1.2 Target Embeddedness (TE) Scenarios.

5.5.2 Data Representation-Driven Criteria -- 5.6 VD Estimated for Real Hyperspectral Images -- 5.7 Conclusions -- 6 DATA DIMENSIONALITY REDUCTION -- 6.1 Introduction -- 6.2 Dimensionality Reduction by Second-Order Statistics-Based Component Analysis Transforms -- 6.2.1 Eigen Component Analysis Transforms -- 6.2.1.1 Principal Components Analysis -- 6.2.1.2 Standardized Principal Components Analysis -- 6.2.1.3 Singular Value Decomposition -- 6.2.2 Signal-to-Noise Ratio-Based Components Analysis Transforms -- 6.2.2.1 Maximum Noise Fraction Transform -- 6.2.2.2 Noise-Adjusted Principal Component Transform -- 6.3 Dimensionality Reduction by High-Order Statistics-Based Components Analysis Transforms -- 6.3.1 Sphering -- 6.3.2 Third-Order Statistics-Based Skewness -- 6.3.3 Fourth-Order Statistics-Based Kurtosis -- 6.3.4 High-Order Statistics -- 6.3.5 Algorithm for Finding Projection Vectors -- 6.4 Dimensionality Reduction by Infinite-Order Statistics-Based Components Analysis Transforms -- 6.4.1 Statistics-Prioritized ICA-DR (SPICA-DR) -- 6.4.2 Random ICA-DR -- 6.4.3 Initialization Driven ICA-DR -- 6.5 Dimensionality Reduction by Projection Pursuit-Based Components Analysis Transforms -- 6.5.1 Projection Index-Based Projection Pursuit -- 6.5.2 Random Projection Index-Based Projection Pursuit -- 6.5.3 Projection Index-Based Prioritized Projection Pursuit -- 6.5.4 Initialization Driven Projection Pursuit -- 6.6 Dimensionality Reduction by Feature Extraction-Based Transforms -- 6.6.1 Fisher's Linear Discriminant Analysis -- 6.6.2 Orthogonal Subspace Projection -- 6.7 Dimensionality Reduction by Band Selection -- 6.8 Constrained Band Selection -- 6.9 Conclusions -- II: ENDMEMBER EXTRACTION -- 7 SIMULTANEOUS ENDMEMBER EXTRACTION ALGORITHMS (SM-EEAs) -- 7.1 Introduction -- 7.2 Convex Geometry-Based Endmember Extraction.

7.2.1 Convex Geometry-Based Criterion: Orthogonal Projection -- 7.2.2 Convex Geometry-Based Criterion: Minimal Simplex Volume -- 7.2.2.1 Minimal-Volume Transform (MVT) -- 7.2.2.2 Convex Cone Analysis (CCA) -- 7.2.3 Convex Geometry-Based Criterion: Maximal Simplex Volume -- 7.2.3.1 Simultaneous N-FINDR (SM N-FINDR) -- 7.2.3.2 Iterative N-FINDR (IN-FINDR) -- 7.2.3.3 Various Versions of Implementing IN-FINDR -- 7.2.3.4 Discussions on Various Implementation Versions of IN-FINDR -- 7.2.3.5 Comparative Study Among Various Versions of IN-FINDR -- 7.2.3.6 Alternative SM N-FINDR -- 7.2.4 Convex Geometry-Based Criterion: Linear Spectral Mixture Analysis -- 7.3 Second-Order Statistics-Based Endmember Extraction -- 7.4 Automated Morphological Endmember Extraction (AMEE) -- 7.5 Experiments -- 7.5.1 Synthetic Image Experiments -- 7.5.1.1 Scenario TI1 (Endmembers Implanted in a Clean Background) -- 7.5.1.2 Scenario TI2 (Endmembers Implanted in a Noisy Background) -- 7.5.1.3 Scenario TI3 (Noisy Endmembers Implanted in a Noisy Background) -- 7.5.1.4 Scenario TE1 (Endmembers Embedded into a Clean Background) -- 7.5.1.5 Scenario TE2 (Endmembers Embedded into a Noisy Background) -- 7.5.1.6 Scenario TE3 (Noisy Endmembers Embedded into a Noisy Background) -- 7.5.2 Cuprite Data -- 7.5.3 HYDICE Data -- 7.6 Conclusions -- 8 SEQUENTIAL ENDMEMBER EXTRACTION ALGORITHMS (SQ-EEAs) -- 8.1 Introduction -- 8.2 Successive N-FINDR (SC N-FINDR) -- 8.3 Simplex Growing Algorithm (SGA) -- 8.4 Vertex Component Analysis (VCA) -- 8.5 Linear Spectral Mixture Analysis-Based SQ-EEAs -- 8.5.1 Automatic Target Generation Process-EEA (ATGP-EEA) -- 8.5.2 Unsupervised Nonnegativity Constrained Least-Squares-EEA (UNCLS-EEA) -- 8.5.3 Unsupervised Fully Constrained Least-Squares-EEA (UFCLS-EEA) -- 8.5.4 Iterative Error Analysis-EEA (IEA-EEA) -- 8.6 High-Order Statistics-Based SQ-EEAs.

8.6.1 Third-Order Statistics-Based SQ-EEA.
Abstract:
Hyperspectral Data Processing: Algorithm Design and Analysis is a culmination of the research conducted in the Remote Sensing Signal and Image Processing Laboratory (RSSIPL) at the University of Maryland, Baltimore County. Specifically, it treats hyperspectral image processing and hyperspectral signal processing as separate subjects in two different categories. Most materials covered in this book can be used in conjunction with the author's first book, Hyperspectral Imaging: Techniques for Spectral Detection and Classification, without much overlap. Many results in this book are either new or have not been explored, presented, or published in the public domain. These include various aspects of endmember extraction, unsupervised linear spectral mixture analysis, hyperspectral information compression, hyperspectral signal coding and characterization, as well as applications to concealed target detection, multispectral imaging, and magnetic resonance imaging. Hyperspectral Data Processing contains eight major sections:
Part I: provides fundamentals of hyperspectral data processing
Part II: offers various algorithm designs for endmember extraction
Part III: derives theory for supervised linear spectral mixture analysis
Part IV: designs unsupervised methods for hyperspectral image analysis
Part V: explores new concepts on hyperspectral information compression
Parts VI & VII: develop techniques for hyperspectral signal coding and characterization
Part VIII: presents applications in multispectral imaging and magnetic resonance imaging
Hyperspectral Data Processing compiles an algorithm compendium with MATLAB codes in an appendix to help readers implement many important algorithms developed in this book and write their own program codes without relying on software packages. Hyperspectral Data Processing is a valuable reference for those who have been involved with hyperspectral imaging and its techniques, as well as those who are new to the subject.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.