Title:
Entropy Theory and its Application in Environmental and Water Engineering.
Author:
Singh, Vijay P.
ISBN:
9781118428603
Personal Author:
Singh, Vijay P.
Edition:
1st ed.
Physical Description:
1 online resource (747 pages)
Contents:
Cover -- Title Page -- Copyright -- Contents -- Preface -- Acknowledgments -- Chapter 1 Introduction -- 1.1 Systems and their characteristics -- 1.1.1 Classes of systems -- 1.1.2 System states -- 1.1.3 Change of state -- 1.1.4 Thermodynamic entropy -- 1.1.5 Evolutive connotation of entropy -- 1.1.6 Statistical mechanical entropy -- 1.2 Informational entropies -- 1.2.1 Types of entropies -- 1.2.2 Shannon entropy -- 1.2.3 Information gain function -- 1.2.4 Boltzmann, Gibbs and Shannon entropies -- 1.2.5 Negentropy -- 1.2.6 Exponential entropy -- 1.2.7 Tsallis entropy -- 1.2.8 Renyi entropy -- 1.3 Entropy, information, and uncertainty -- 1.3.1 Information -- 1.3.2 Uncertainty and surprise -- 1.4 Types of uncertainty -- 1.5 Entropy and related concepts -- 1.5.1 Information content of data -- 1.5.2 Criteria for model selection -- 1.5.3 Hypothesis testing -- 1.5.4 Risk assessment -- Questions -- References -- Additional References -- Chapter 2 Entropy Theory -- 2.1 Formulation of entropy -- 2.2 Shannon entropy -- 2.3 Connotations of information and entropy -- 2.3.1 Amount of information -- 2.3.2 Measure of information -- 2.3.3 Source of information -- 2.3.4 Removal of uncertainty -- 2.3.5 Equivocation -- 2.3.6 Average amount of information -- 2.3.7 Measurement system -- 2.3.8 Information and organization -- 2.4 Discrete entropy: univariate case and marginal entropy -- 2.5 Discrete entropy: bivariate case -- 2.5.1 Joint entropy -- 2.5.2 Conditional entropy -- 2.5.3 Transinformation -- 2.6 Dimensionless entropies -- 2.7 Bayes theorem -- 2.8 Informational correlation coefficient -- 2.9 Coefficient of nontransferred information -- 2.10 Discrete entropy: multidimensional case -- 2.11 Continuous entropy -- 2.11.1 Univariate case -- 2.11.2 Differential entropy of continuous variables.

2.11.3 Variable transformation and entropy -- 2.11.4 Bivariate case -- 2.11.5 Multivariate case -- 2.12 Stochastic processes and entropy -- 2.13 Effect of proportional class interval -- 2.14 Effect of the form of probability distribution -- 2.15 Data with zero values -- 2.16 Effect of measurement units -- 2.17 Effect of averaging data -- 2.18 Effect of measurement error -- 2.19 Entropy in frequency domain -- 2.20 Principle of maximum entropy -- 2.21 Concentration theorem -- 2.22 Principle of minimum cross entropy -- 2.23 Relation between entropy and error probability -- 2.24 Various interpretations of entropy -- 2.24.1 Measure of randomness or disorder -- 2.24.2 Measure of unbiasedness or objectivity -- 2.24.3 Measure of equality -- 2.24.4 Measure of diversity -- 2.24.5 Measure of lack of concentration -- 2.24.6 Measure of flexibility -- 2.24.7 Measure of complexity -- 2.24.8 Measure of departure from uniform distribution -- 2.24.9 Measure of interdependence -- 2.24.10 Measure of dependence -- 2.24.11 Measure of interactivity -- 2.24.12 Measure of similarity -- 2.24.13 Measure of redundancy -- 2.24.14 Measure of organization -- 2.25 Relation between entropy and variance -- 2.26 Entropy power -- 2.27 Relative frequency -- 2.28 Application of entropy theory -- Questions -- References -- Additional Reading -- Chapter 3 Principle of Maximum Entropy -- 3.1 Formulation -- 3.2 POME formalism for discrete variables -- 3.3 POME formalism for continuous variables -- 3.3.1 Entropy maximization using the method of Lagrange multipliers -- 3.3.2 Direct method for entropy maximization -- 3.4 POME formalism for two variables -- 3.5 Effect of constraints on entropy -- 3.6 Invariance of total entropy -- Questions -- References -- Additional Reading -- Chapter 4 Derivation of POME-Based Distributions.

4.1 Discrete variable and discrete distributions -- 4.1.1 Constraint E[x] and the Maxwell-Boltzmann distribution -- 4.1.2 Two constraints and Bose-Einstein distribution -- 4.1.3 Two constraints and Fermi-Dirac distribution -- 4.1.4 Intermediate statistics distribution -- 4.1.5 Constraint: E[N]: Bernoulli distribution for a single trial -- 4.1.6 Binomial distribution for repeated trials -- 4.1.7 Geometric distribution: repeated trials -- 4.1.8 Negative binomial distribution: repeated trials -- 4.1.9 Constraint: E[N]=n: Poisson distribution -- 4.2 Continuous variable and continuous distributions -- 4.2.1 Finite interval [a, b], no constraint, and rectangular distribution -- 4.2.2 Finite interval [a, b], one constraint and truncated exponential distribution -- 4.2.3 Finite interval [0, 1], two constraints E[ln x] and E[ln(1-x)] and beta distribution of first kind -- 4.2.4 Semi-infinite interval (0, ∞), one constraint E[x] and exponential distribution -- 4.2.5 Semi-infinite interval, two constraints E[x] and E[ln x] and gamma distribution -- 4.2.6 Semi-infinite interval, two constraints E[ln x] and E[ln(1+x)] and beta distribution of second kind -- 4.2.7 Infinite interval, two constraints E[x] and E[x^2] and normal distribution -- 4.2.8 Semi-infinite interval, log-transformation Y = ln X, two constraints E[y] and E[y^2] and log-normal distribution -- 4.2.9 Infinite and semi-infinite intervals: constraints and distributions -- Questions -- References -- Additional Reading -- Chapter 5 Multivariate Probability Distributions -- 5.1 Multivariate normal distributions -- 5.1.1 One time lag serial dependence -- 5.1.2 Two-lag serial dependence -- 5.1.3 Multi-lag serial dependence -- 5.1.4 No serial dependence: bivariate case -- 5.1.5 Cross-correlation and serial dependence: bivariate case.

5.1.6 Multivariate case: no serial dependence -- 5.1.7 Multi-lag serial dependence -- 5.2 Multivariate exponential distributions -- 5.2.1 Bivariate exponential distribution -- 5.2.2 Trivariate exponential distribution -- 5.2.3 Extension to Weibull distribution -- 5.3 Multivariate distributions using the entropy-copula method -- 5.3.1 Families of copula -- 5.3.2 Application -- 5.4 Copula entropy -- Questions -- References -- Additional Reading -- Chapter 6 Principle of Minimum Cross-Entropy -- 6.1 Concept and formulation of POMCE -- 6.2 Properties of POMCE -- 6.3 POMCE formalism for discrete variables -- 6.4 POMCE formulation for continuous variables -- 6.5 Relation to POME -- 6.6 Relation to mutual information -- 6.7 Relation to variational distance -- 6.8 Lin's directed divergence measure -- 6.9 Upper bounds for cross-entropy -- Questions -- References -- Additional Reading -- Chapter 7 Derivation of POMCE-Based Distributions -- 7.1 Discrete variable and mean E[x] as a constraint -- 7.1.1 Uniform prior distribution -- 7.1.2 Arithmetic prior distribution -- 7.1.3 Geometric prior distribution -- 7.1.4 Binomial prior distribution -- 7.1.5 General prior distribution -- 7.2 Discrete variable taking on an infinite set of values -- 7.2.1 Improper prior probability distribution -- 7.2.2 A priori Poisson probability distribution -- 7.2.3 A priori negative binomial distribution -- 7.3 Continuous variable: general formulation -- 7.3.1 Uniform prior and mean constraint -- 7.3.2 Exponential prior and mean and mean log constraints -- Questions -- References -- Chapter 8 Parameter Estimation -- 8.1 Ordinary entropy-based parameter estimation method -- 8.1.1 Specification of constraints -- 8.1.2 Derivation of entropy-based distribution -- 8.1.3 Construction of zeroth Lagrange multiplier.

8.1.4 Determination of Lagrange multipliers -- 8.1.5 Determination of distribution parameters -- 8.2 Parameter-space expansion method -- 8.3 Contrast with method of maximum likelihood estimation (MLE) -- 8.4 Parameter estimation by numerical methods -- Questions -- References -- Additional Reading -- Chapter 9 Spatial Entropy -- 9.1 Organization of spatial data -- 9.1.1 Distribution, density, and aggregation -- 9.2 Spatial entropy statistics -- 9.2.1 Redundancy -- 9.2.2 Information gain -- 9.2.3 Disutility entropy -- 9.3 One-dimensional aggregation -- 9.4 Another approach to spatial representation -- 9.5 Two-dimensional aggregation -- 9.5.1 Probability density function and its resolution -- 9.5.2 Relation between spatial entropy and spatial disutility -- 9.6 Entropy maximization for modeling spatial phenomena -- 9.7 Cluster analysis by entropy maximization -- 9.8 Spatial visualization and mapping -- 9.9 Scale and entropy -- 9.10 Spatial probability distributions -- 9.11 Scaling: rank-size rule and Zipf's law -- 9.11.1 Exponential law -- 9.11.2 Log-normal law -- 9.11.3 Power law -- 9.11.4 Law of proportionate effect -- Questions -- References -- Further Reading -- Chapter 10 Inverse Spatial Entropy -- 10.1 Definition -- 10.2 Principle of entropy decomposition -- 10.3 Measures of information gain -- 10.3.1 Bivariate measures -- 10.3.2 Map representation -- 10.3.3 Construction of spatial measures -- 10.4 Aggregation properties -- 10.5 Spatial interpretations -- 10.6 Hierarchical decomposition -- 10.7 Comparative measures of spatial decomposition -- Questions -- References -- Chapter 11 Entropy Spectral Analyses -- 11.1 Characteristics of time series -- 11.1.1 Mean -- 11.1.2 Variance -- 11.1.3 Covariance -- 11.1.4 Correlation -- 11.1.5 Stationarity -- 11.2 Spectral analysis -- 11.2.1 Fourier representation.

11.2.2 Fourier transform.
Abstract:
Entropy Theory and its Application in Environmental and Water Engineering responds to the need for a book that presents the basic concepts of entropy theory from a hydrologic and water engineering perspective and then applies those concepts to a range of water engineering problems. The range of applications of entropy is constantly expanding, and new areas that find a use for the theory continue to emerge. Because the applications of its concepts and techniques vary across subject areas, this book relates them directly to practical problems of environmental and water engineering. The book presents and explains the Principle of Maximum Entropy (POME) and the Principle of Minimum Cross Entropy (POMCE) and their applications to different types of probability distributions. Spatial and inverse spatial entropy, which are important for urban planning, are presented with clarity. Maximum entropy spectral analysis and minimum cross entropy spectral analysis are powerful techniques for addressing a variety of problems faced by environmental and water scientists and engineers, and they are described here with illustrative examples. Giving a thorough introduction to the use of entropy to measure unpredictability in environmental and water systems, this book adds an essential statistical method to the toolkit of postgraduates, researchers, and academic hydrologists, water resource managers, environmental scientists, and engineers. It also offers a valuable resource for professionals in the same areas, governmental organizations, and private companies, as well as for students in earth sciences, civil and agricultural engineering, and agricultural and rangeland sciences. This book:
Provides a thorough introduction to entropy for beginners and more experienced users
Uses numerous examples to illustrate the applications of the theoretical principles
Allows the reader to apply entropy theory to the solution of practical problems
Assumes minimal existing mathematical knowledge
Discusses the theory and its various aspects in both univariate and bivariate cases
Covers newly expanding areas, including neural networks from an entropy perspective, and future developments
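As a brief illustration of the kind of POME calculation the book develops (a minimal sketch, not taken from the text; the support, the mean value m, and all variable names are assumptions chosen purely for illustration), the following Python snippet maximizes Shannon entropy over a discrete distribution subject to a single mean constraint. The POME solution takes the exponential form p_i proportional to exp(-lam * x_i), and the Lagrange multiplier lam is found numerically.

```python
# Minimal POME sketch (hypothetical values): find the discrete
# maximum-entropy distribution on the support {0, ..., 9} subject
# to the mean constraint E[X] = m. The solution has the form
# p_i ∝ exp(-lam * x_i); we solve for the Lagrange multiplier lam.
import numpy as np
from scipy.optimize import brentq

x = np.arange(10)   # assumed support of the discrete variable
m = 3.0             # assumed mean constraint (hypothetical value)

def mean_given_lam(lam):
    """Mean of the distribution p_i ∝ exp(-lam * x_i)."""
    w = np.exp(-lam * x)
    p = w / w.sum()
    return (p * x).sum()

# The mean decreases monotonically in lam, so a bracketing root
# solver recovers the multiplier that satisfies the constraint.
lam = brentq(lambda l: mean_given_lam(l) - m, -10.0, 10.0)

p = np.exp(-lam * x)
p /= p.sum()
H = -(p * np.log(p)).sum()   # Shannon entropy in nats

print("lambda =", lam)
print("p =", np.round(p, 4))
print("H =", H)
```

With two moment constraints (for example E[x] and E[x^2], as in Section 4.2.7 of the Contents), the same scheme generalizes to a vector of multipliers solved simultaneously.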
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.