High-Level Data Fusion.
Title:
High-Level Data Fusion.
Author:
Das, Subrata.
ISBN:
9781596932821
Personal Author:
Das, Subrata.
Physical Description:
1 online resource (393 pages)
Contents:
High-Level Data Fusion -- Table of Contents -- Preface
Chapter 1: Models, Architectures, and Data -- 1.1 WHAT IS HIGH-LEVEL FUSION? -- 1.2 FUSION MODELS -- 1.2.1 JDL Model -- 1.2.2 DIKW Hierarchy and Abstraction of Knowledge -- 1.2.3 Assessment Versus Awareness -- 1.2.4 OODA Loop -- 1.2.5 Rasmussen Information Processing Hierarchy -- 1.2.6 Correspondence among Models -- 1.3 SENSORS AND INTELLIGENCE -- 1.3.1 Signals Intelligence (SIGINT) -- 1.3.2 Imagery Intelligence (IMINT) -- 1.3.3 Measurement and Signature Intelligence (MASINT) -- 1.3.4 Human Intelligence (HUMINT) -- 1.3.5 Open Source Intelligence (OSINT) -- 1.3.6 Geospatial Intelligence (GEOINT) -- 1.3.7 Intelligent Data Format -- 1.4 GENERIC FUSION ARCHITECTURE AND BOOK SCOPE -- 1.5 FURTHER READING
Chapter 2: Mathematical Preliminaries -- 2.1 USAGE OF SYMBOLS -- 2.2 GRAPHS AND TREES -- 2.3 PROBABILITY AND STATISTICS -- 2.3.1 Probability Distributions -- 2.4 MATHEMATICAL LOGIC -- 2.5 ALGORITHMIC COMPLEXITY -- 2.6 FURTHER READING
Chapter 3: Approaches to Handling Uncertainty -- 3.1 IGNORANCE TO UNCERTAINTIES -- 3.2 APPROACHES TO HANDLING UNCERTAINTIES -- 3.3 NEO-PROBABILIST APPROACH -- 3.3.1 Bayesian Belief Networks (BNs) -- 3.4 NEO-CALCULIST APPROACH -- 3.4.1 Theory of Belief Functions -- 3.4.2 Certainty Factors -- 3.5 NEO-LOGICIST APPROACH -- 3.5.1 Default Logic -- 3.5.2 Program Completion -- 3.6 NEO-POSSIBILIST APPROACHES -- 3.6.1 Fuzzy Sets -- 3.6.2 Fuzzy Logic -- 3.6.3 Possibility Theory -- 3.6.4 Possibilistic Logic -- 3.7 TRANSFORMATION BETWEEN FORMALISMS -- 3.7.1 Transferable Belief Model -- 3.7.2 Relating Probability and Possibility -- 3.8 FURTHER READING
Chapter 4: Introduction to Target Tracking -- 4.1 TARGET TRACKING CONCEPT AND ARCHITECTURE -- 4.2 TARGET TRACKING PROBLEM MODELING -- 4.2.1 State Transition and Observation Models -- 4.2.2 Estimation Problem -- 4.3 SINGLE SENSOR SINGLE TARGET TRACKING -- 4.3.1 Alpha-Beta Filter -- 4.3.2 Kalman Filter (KF) -- 4.4 GATING AND DATA ASSOCIATION -- 4.5 MULTISENSOR SINGLE TARGET TRACKING (IN CLUTTER) -- 4.5.1 Probabilistic Data Association Filter (PDAF) -- 4.6 MULTISENSOR MULTITARGET TRACKING (IN CLUTTER) -- 4.6.1 Joint Probabilistic Data Association (JPDA) -- 4.6.2 Multiple-Hypothesis Tracking (MHT) -- 4.7 INTERACTING MULTIPLE MODEL (IMM) -- 4.8 CRAMER-RAO LOWER BOUND (CRLB) -- 4.9 FURTHER READING
Chapter 5: Target Classification and Aggregation -- 5.1 TARGET CLASSIFICATION -- 5.1.1 Example Surveillance Scenario -- 5.1.2 Naïve Bayesian Classifier (NBC) for Target Classification -- 5.1.3 Rule-Based Expert Systems for Target Classification -- 5.1.4 Dempster-Shafer Theory for Target Classification -- 5.1.5 Fuzzy Logic for Target Classification -- 5.2 TARGET AGGREGATION -- 5.2.1 Spatiotemporal Clustering (STC) Concept -- 5.2.2 Manhattan Distance-Based Grid-Constrained Clustering -- 5.2.3 Directivity- and Displacement-Based Unconstrained Clustering -- 5.2.4 Orthogonality-Based Clustering -- 5.2.5 Singular Value Decomposition-Based Clustering -- 5.2.6 Preprocessing through Entropy Measure -- 5.3 FURTHER READING
Chapter 6: Model-Based Situation Assessment -- 6.1 BAYESIAN BELIEF NETWORKS -- 6.2 CONDITIONAL INDEPENDENCE IN BELIEF NETWORKS -- 6.3 EVIDENCE, BELIEF, AND LIKELIHOOD -- 6.4 PRIOR PROBABILITIES IN NETWORKS WITHOUT EVIDENCE -- 6.5 BELIEF REVISION -- 6.6 EVIDENCE PROPAGATION IN POLYTREES -- 6.6.1 Upward Propagation in a Linear Fragment -- 6.6.2 Downward Propagation in a Linear Fragment -- 6.6.3 Upward Propagation in a Tree Fragment -- 6.6.4 Downward Propagation in a Tree Fragment -- 6.6.5 Upward Propagation in a Polytree Fragment -- 6.6.6 Downward Propagation in a Polytree Fragment -- 6.6.7 Propagation Algorithm -- 6.7 EVIDENCE PROPAGATION IN DIRECTED ACYCLIC GRAPHS -- 6.7.1 Graphical Transformation -- 6.7.2 Join Tree Initialization -- 6.7.3 Propagation in Join Tree and Marginalization -- 6.7.4 Handling Evidence -- 6.8 COMPLEXITY OF INFERENCE ALGORITHMS -- 6.9 ACQUISITION OF PROBABILITIES -- 6.10 ADVANTAGES AND DISADVANTAGES OF BELIEF NETWORKS -- 6.11 THEATER MISSILE DEFENSE APPLICATION -- 6.12 BELIEF NETWORK TOOLS -- 6.13 FURTHER READING
Chapter 7: Modeling Time for Situation Assessment -- 7.1 MARKOV MODELS -- 7.2 HIDDEN MARKOV MODELS (HMM) -- 7.2.1 The Forward Algorithm -- 7.2.2 The Viterbi Algorithm -- 7.3 HIERARCHICAL HIDDEN MARKOV MODELS (HHMM) -- 7.3.1 The Forward Algorithm for HHMM -- 7.3.2 The Viterbi Algorithm for HHMM -- 7.4 MARKOV MODELS FOR TEXT ANALYSES -- 7.5 HMM WITH EXPLICIT STATE DURATION -- 7.6 DYNAMIC BAYESIAN NETWORKS (DBNs) -- 7.6.1 Inference Algorithms for DBNs -- 7.7 DBN APPLICATION FOR LIFE STATUS ESTIMATION -- 7.8 FURTHER READING
Chapter 8: Handling Nonlinear and Hybrid Models -- 8.1 EXTENDED KALMAN FILTER (EKF) -- 8.2 UNSCENTED KALMAN FILTER (UKF) -- 8.3 PARTICLE FILTER (PF) -- 8.3.1 Basic Particle Filter -- 8.3.2 Particle Filter Algorithms -- 8.3.3 Rao-Blackwellised Particle Filter (RBPF) -- 8.3.4 Multitarget Tracking and Particle Filters -- 8.3.5 Tracking a Variable Number of Targets via DBNs -- 8.3.6 Particle Filter for DBN -- 8.3.7 Example DBN Inferencing by Particle Filtering -- 8.3.8 Particle Filter Issues -- 8.4 FURTHER READING
Chapter 9: Decision Support -- 9.1 EXPECTED UTILITY THEORY AND DECISION TREES -- 9.2 INFLUENCE DIAGRAMS FOR DECISION SUPPORT -- 9.2.1 Inferencing in Influence Diagrams -- 9.2.2 Compilation of Influence Diagrams -- 9.2.3 Inferencing in Strong Junction Trees -- 9.2.4 An Example Influence Diagram for Theater Missile Defense -- 9.3 SYMBOLIC ARGUMENTATION FOR DECISION SUPPORT -- 9.3.1 Measuring Consensus -- 9.3.2 Combining Sources of Varying Confidence -- 9.4 FURTHER READING
Chapter 10: Learning of Fusion Models -- 10.1 LEARNING NAÏVE BAYESIAN CLASSIFIERS -- 10.2 RULE LEARNING FROM DECISION TREE ALGORITHMS -- 10.2.1 Algorithms for Constructing Decision Trees -- 10.2.2 Overfitting in Decision Trees -- 10.2.3 Handling Continuous Attributes -- 10.3 BAYESIAN BELIEF NETWORK LEARNING -- 10.3.1 Learning Probabilities: Brief Survey -- 10.3.2 Learning Probabilities from Fully Observable Variables -- 10.3.3 Learning Probabilities from Partially Observable Variables -- 10.3.4 Online Adjustment of Parameters -- 10.3.5 Brief Survey of Structure Learning -- 10.3.6 Learning Structure from Fully Observable Variables -- 10.3.7 Learning Structure from Partially Observable Variables -- 10.3.8 Use of Prior Knowledge from Experts -- 10.4 BAUM-WELCH ALGORITHM FOR LEARNING HMM -- 10.4.1 Generalized Baum-Welch Algorithm for HHMM -- 10.5 FURTHER READING
Chapter 11: Towards Cognitive Agents for Data Fusion -- 11.1 MOTIVATION AND SCOPE -- 11.2 ENVELOPE MODEL OF HUMAN COGNITION -- 11.3 COMPARATIVE STUDY -- 11.3.1 Classical Cognitive Architectures and Envelope -- 11.3.2 Agent Architectures and Envelope -- 11.3.3 C4I Architectures and Envelope -- 11.4 LEARNING, SYSTEMATICITY, AND LOGICAL OMNISCIENCE -- 11.5 COMPUTATIONAL REALIZATION -- 11.6 SOME DISCUSSION -- 11.7 FURTHER READING
Chapter 12: Distributed Fusion -- 12.1 CONCEPT AND APPROACH -- 12.2 DISTRIBUTED FUSION ENVIRONMENTS -- 12.3 ALGORITHM FOR DISTRIBUTED SITUATION ASSESSMENT -- 12.4 DISTRIBUTED KALMAN FILTER -- 12.5 RELEVANCE TO NETWORK CENTRIC WARFARE -- 12.6 FURTHER READING
References -- About the Author -- Index.
Abstract:
The book explores object and situation fusion processes with appropriate handling of uncertainty, applying cutting-edge artificial intelligence and emerging technologies such as particle filtering, spatiotemporal clustering, net-centricity, agent formalism, and distributed fusion, together with essential Level 1 techniques and Level 1/2 interactions.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.