Identification of Physical Systems : Applications to Condition Monitoring, Fault Diagnosis, Soft Sensor and Controller Design.
Title:
Identification of Physical Systems : Applications to Condition Monitoring, Fault Diagnosis, Soft Sensor and Controller Design.
Author:
Doraiswami, Rajamani.
ISBN:
9781118536506
Edition:
1st ed.
Physical Description:
1 online resource (538 pages)
Contents:
IDENTIFICATION OF PHYSICAL SYSTEMS -- Contents -- Preface -- Nomenclature -- 1 Modeling of Signals and Systems -- 1.1 Introduction -- 1.2 Classification of Signals -- 1.2.1 Deterministic and Random Signals -- 1.2.2 Bounded and Unbounded Signal -- 1.2.3 Energy and Power Signals -- 1.2.4 Causal, Non-causal, and Anti-causal Signals -- 1.2.5 Causal, Non-causal, and Anti-causal Systems -- 1.3 Model of Systems and Signals -- 1.3.1 Time-Domain Model -- 1.3.2 Frequency-Domain Model -- 1.4 Equivalence of Input-Output and State-Space Models -- 1.4.1 State-Space and Transfer Function Model -- 1.4.2 Time-Domain Expression for the Output Response -- 1.4.3 State-Space and the Difference Equation Model -- 1.4.4 Observer Canonical Form -- 1.4.5 Characterization of the Model -- 1.4.6 Stability of (Discrete-Time) Systems -- 1.4.7 Minimum Phase System -- 1.4.8 Pole-Zero Locations and the Output Response -- 1.5 Deterministic Signals -- 1.5.1 Transfer Function Model -- 1.5.2 Difference Equation Model -- 1.5.3 State-Space Model -- 1.5.4 Expression for an Impulse Response -- 1.5.5 Periodic Signal -- 1.5.6 Periodic Impulse Train -- 1.5.7 A Finite Duration Signal -- 1.5.8 Model of a Class of All Signals -- 1.5.9 Examples of Deterministic Signals -- 1.6 Introduction to Random Signals -- 1.6.1 Stationary Random Signal -- 1.6.2 Joint PDF and Statistics of Random Signals -- 1.6.3 Ergodic Process -- 1.7 Model of Random Signals -- 1.7.1 White Noise Process -- 1.7.2 Colored Noise -- 1.7.3 Model of a Random Waveform -- 1.7.4 Classification of the Random Waveform -- 1.7.5 Frequency Response and Pole-Zero Locations -- 1.7.6 Illustrative Examples of Filters -- 1.7.7 Illustrative Examples of Random Signals -- 1.7.8 Pseudo Random Binary Sequence (PRBS) -- 1.8 Model of a System with Disturbance and Measurement Noise -- 1.8.1 Input-Output Model of the System.

1.8.2 State-Space Model of the System -- 1.8.3 Illustrative Examples in Integrated System Model -- 1.9 Summary -- References -- Further Readings -- 2 Characterization of Signals: Correlation and Spectral Density -- 2.1 Introduction -- 2.2 Definitions of Auto- and Cross-Correlation (and Covariance) -- 2.2.1 Properties of Correlation -- 2.2.2 Normalized Correlation and Correlation Coefficient -- 2.3 Spectral Density: Correlation in the Frequency Domain -- 2.3.1 Z-transform of the Correlation Function -- 2.3.2 Expressions for Energy and Power Spectral Densities -- 2.4 Coherence Spectrum -- 2.5 Illustrative Examples in Correlation and Spectral Density -- 2.5.1 Deterministic Signals: Correlation and Spectral Density -- 2.5.2 Random Signals: Correlation and Spectral Density -- 2.6 Input-Output Correlation and Spectral Density -- 2.6.1 Generation of Random Signal from White Noise -- 2.6.2 Identification of Non-Parametric Model of a System -- 2.6.3 Identification of a Parametric Model of a Random Signal -- 2.7 Illustrative Examples: Modeling and Identification -- 2.8 Summary -- 2.9 Appendix -- References -- 3 Estimation Theory -- 3.1 Overview -- 3.2 Map Relating Measurement and the Parameter -- 3.2.1 Mathematical Model -- 3.2.2 Probabilistic Model -- 3.2.3 Likelihood Function -- 3.3 Properties of Estimators -- 3.3.1 Indirect Approach to Estimation -- 3.3.2 Unbiasedness of the Estimator -- 3.3.3 Variance of the Estimator: Scalar Case -- 3.3.4 Median of the Data Samples -- 3.3.5 Small and Large Sample Properties -- 3.3.6 Large Sample Properties -- 3.4 Cramér-Rao Inequality -- 3.4.1 Scalar Case: the Unknown Parameter is a Scalar while y is a Nx1 Vector -- 3.4.2 Vector Case: the Unknown Parameter is a Mx1 Vector -- 3.4.3 Illustrative Examples: Cramér-Rao Inequality -- 3.4.4 Fisher Information -- 3.5 Maximum Likelihood Estimation -- 3.5.1 Formulation of Maximum Likelihood Estimation.

3.5.2 Illustrative Examples: Maximum Likelihood Estimation of Mean or Median -- 3.5.3 Illustrative Examples: Maximum Likelihood Estimation of Mean and Variance -- 3.5.4 Properties of Maximum Likelihood Estimator -- 3.6 Summary -- 3.7 Appendix: Cauchy-Schwarz Inequality -- 3.8 Appendix: Cramér-Rao Lower Bound -- 3.8.1 Scalar Case -- 3.8.2 Vector Case -- 3.9 Appendix: Fisher Information: Cauchy PDF -- 3.10 Appendix: Fisher Information for i.i.d. PDF -- 3.11 Appendix: Projection Operator -- 3.12 Appendix: Fisher Information: Part Gauss-Part Laplace -- Problem -- References -- Further Readings -- 4 Estimation of Random Parameter -- 4.1 Overview -- 4.2 Minimum Mean-Squares Estimator (MMSE): Scalar Case -- 4.2.1 Conditional Mean: Optimal Estimator -- 4.3 MMSE Estimator: Vector Case -- 4.3.1 Covariance of the Estimation Error -- 4.3.2 Conditional Expectation and Its Properties -- 4.4 Expression for Conditional Mean -- 4.4.1 MMSE Estimator: Gaussian Random Variables -- 4.4.2 MMSE Estimator: Unknown is Gaussian and Measurement Non-Gaussian -- 4.4.3 The MMSE Estimator for Gaussian PDF -- 4.4.4 Illustrative Examples -- 4.5 Summary -- 4.6 Appendix: Non-Gaussian Measurement PDF -- 4.6.1 Expression for Conditional Expectation -- 4.6.2 Conditional Expectation for Gaussian x and Non-Gaussian y -- References -- Further Readings -- 5 Linear Least-Squares Estimation -- 5.1 Overview -- 5.2 Linear Least-Squares Approach -- 5.2.1 Linear Algebraic Model -- 5.2.2 Least-Squares Method -- 5.2.3 Objective Function -- 5.2.4 Optimal Least-Squares Estimate: Normal Equation -- 5.2.5 Geometric Interpretation of Least-Squares Estimate: Orthogonality Principle -- 5.3 Performance of the Least-Squares Estimator -- 5.3.1 Unbiasedness of the Least-Squares Estimate -- 5.3.2 Covariance of the Estimation Error -- 5.3.3 Properties of the Residual.

5.3.4 Model and Systemic Errors: Bias and the Variance Errors -- 5.4 Illustrative Examples -- 5.4.1 Non-Zero-Mean Measurement Noise -- 5.5 Cramér-Rao Lower Bound -- 5.6 Maximum Likelihood Estimation -- 5.6.1 Illustrative Examples -- 5.7 Least-Squares Solution of Under-Determined System -- 5.8 Singular Value Decomposition -- 5.8.1 Illustrative Example: Singular Values and Eigenvalues of Square Matrices -- 5.8.2 Computation of Least-Squares Estimate Using the SVD -- 5.9 Summary -- 5.10 Appendix: Properties of the Pseudo-Inverse and the Projection Operator -- 5.10.1 Over-Determined System -- 5.10.2 Under-Determined System -- 5.11 Appendix: Positive Definite Matrices -- 5.12 Appendix: Singular Value Decomposition of a Matrix -- 5.12.1 SVD and Eigendecompositions -- 5.12.2 Matrix Norms -- 5.12.3 Least Squares Estimate for Any Arbitrary Data Matrix H -- 5.12.4 Pseudo-Inverse of Any Arbitrary Matrix -- 5.12.5 Bounds on the Residual and the Covariance of the Estimation Error -- 5.13 Appendix: Least-Squares Solution for Under-Determined System -- 5.14 Appendix: Computation of Least-Squares Estimate Using the SVD -- References -- Further Readings -- 6 Kalman Filter -- 6.1 Overview -- 6.2 Mathematical Model of the System -- 6.2.1 Model of the Plant -- 6.2.2 Model of the Disturbance and Measurement Noise -- 6.2.3 Integrated Model of the System -- 6.2.4 Expression for the Output of the Integrated System -- 6.2.5 Linear Regression Model -- 6.2.6 Observability -- 6.3 Internal Model Principle -- 6.3.1 Controller Design Using the Internal Model Principle -- 6.3.2 Internal Model (IM) of a Signal -- 6.3.3 Controller Design -- 6.3.4 Illustrative Example: Controller Design -- 6.4 Duality Between Controller and an Estimator Design -- 6.4.1 Estimation Problem -- 6.4.2 Estimator Design -- 6.5 Observer: Estimator for the States of a System -- 6.5.1 Problem Formulation -- 6.5.2 The Internal Model of the Output.

6.5.3 Illustrative Example: Observer with Internal Model Structure -- 6.6 Kalman Filter: Estimator of the States of a Stochastic System -- 6.6.1 Objectives of the Kalman Filter -- 6.6.2 Necessary Structure of the Kalman Filter -- 6.6.3 Internal Model of a Random Process -- 6.6.4 Illustrative Example: Role of an Internal Model -- 6.6.5 Model of the Kalman Filter -- 6.6.6 Optimal Kalman Filter -- 6.6.7 Optimal Scalar Kalman Filter -- 6.6.8 Optimal Kalman Gain -- 6.6.9 Comparison of the Kalman Filters: Integrated and Plant Models -- 6.6.10 Steady-State Kalman Filter -- 6.6.11 Internal Model and Statistical Approaches -- 6.6.12 Optimal Information Fusion -- 6.6.13 Role of the Ratio of Variances -- 6.6.14 Fusion of Information from the Model and the Measurement -- 6.6.15 Illustrative Example: Fusion of Information -- 6.6.16 Orthogonal Properties of the Kalman Filter -- 6.6.17 Ensemble and Time Averages -- 6.6.18 Illustrative Example: Orthogonality Properties of the Kalman Filter -- 6.7 The Residual of the Kalman Filter with Model Mismatch and Non-Optimal Gain -- 6.7.1 State Estimation Error with Model Mismatch -- 6.7.2 Illustrative Example: Residual with Model Mismatch and Non-Optimal Gain -- 6.8 Summary -- 6.9 Appendix: Estimation Error Covariance and the Kalman Gain -- 6.10 Appendix: The Role of the Ratio of Plant and the Measurement Noise Variances -- 6.11 Appendix: Orthogonal Properties of the Kalman Filter -- 6.11.1 Span of a Matrix -- 6.11.2 Transfer Function Formulae -- 6.12 Appendix: Kalman Filter Residual with Model Mismatch -- References -- 7 System Identification -- 7.1 Overview -- 7.2 System Model -- 7.2.1 State-Space Model -- 7.2.2 Assumptions -- 7.2.3 Frequency-Domain Model -- 7.2.4 Input Signal for System Identification -- 7.3 Kalman Filter-Based Identification Model Structure.

7.3.1 Expression for the Kalman Filter Residual.
Abstract:
Identification of a physical system deals with the problem of identifying its mathematical model using measured input and output data. As the physical system is generally complex and nonlinear, and its input-output data is corrupted by noise, there are fundamental theoretical and practical issues that need to be considered. Identification of Physical Systems addresses this need, presenting a systematic, unified approach to the problem of physical system identification and its practical applications. Starting with the least-squares method, the authors develop various schemes to address the issues of accuracy, variation in the operating regimes, closed-loop operation, and interconnected subsystems. Also presented is a non-parametric, signal- or data-based identification scheme that provides a quick macroscopic picture of the system to complement the precise microscopic picture given by the parametric, model-based scheme. Finally, a sequential integration of totally different schemes, such as the non-parametric, Kalman filter, and parametric model approaches, is developed to meet the speed and accuracy requirements of mission-critical systems. Key features: provides a clear understanding of theoretical and practical issues in identification and its applications, enabling the reader to grasp the theory and apply it to practical problems; offers a self-contained guide by including the background necessary to understand this interdisciplinary subject; and includes case studies on the application of identification to physical laboratory-scale systems, as well as a number of illustrative examples throughout the book. Identification of Physical Systems is a comprehensive reference for researchers and practitioners working in this field and is also a useful source of information for graduate students in electrical, computer, biomedical, chemical, and mechanical engineering.
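Illustrative Note:
The abstract above refers to least-squares identification of a system model from measured input-output data (see Chapters 5 and 7 in the contents). The short Python sketch below is not material drawn from the book; it is only a rough, hypothetical illustration of that general idea, estimating the two parameters of an assumed first-order ARX model y[k] = a*y[k-1] + b*u[k-1] + e[k] from simulated data via the least-squares normal equation. The system values, noise level, and variable names are illustrative assumptions.

# Hypothetical sketch (not from the book): least-squares identification of a
# first-order ARX model from simulated input-output data.
import numpy as np

rng = np.random.default_rng(0)

# Simulate "measured" data from an assumed true system (a = 0.8, b = 0.5).
N = 500
u = rng.standard_normal(N)              # excitation input (white noise)
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.8 * y[k-1] + 0.5 * u[k-1] + 0.05 * rng.standard_normal()

# Build the regression (data) matrix H and solve the normal equation
# H^T H theta = H^T y for theta = [a, b]^T in the least-squares sense.
H = np.column_stack((y[:-1], u[:-1]))
theta_hat, *_ = np.linalg.lstsq(H, y[1:], rcond=None)
print("estimated [a, b]:", theta_hat)   # should be close to [0.8, 0.5]

Note that np.linalg.lstsq solves the problem through an SVD-based pseudo-inverse, which loosely corresponds to the SVD-based computation of the least-squares estimate listed in the contents (Section 5.8.2).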
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.