Farewell to Entropy : Statistical Thermodynamics Based on Information.
Title:
Farewell to Entropy : Statistical Thermodynamics Based on Information.
Author:
Ben-Naim, Arieh.
ISBN:
9789812790736
Physical Description:
1 online resource (412 pages)
Contents:
List of Abbreviations
Preface
1 Introduction
  1.1 A Brief History of Temperature and Entropy
  1.2 The Association of Entropy with Disorder
  1.3 The Association of Entropy with Missing Information
2 Elements of Probability Theory
  2.1 Introduction
  2.2 The Axiomatic Approach
    2.2.1 The sample space, denoted
    2.2.2 The field of events, denoted F
    2.2.3 The probability function, denoted P
  2.3 The Classical Definition
  2.4 The Relative Frequency Definition
  2.5 Independent Events and Conditional Probability
    2.5.1 Conditional probability and subjective probability
    2.5.2 Conditional probability and cause and effect
    2.5.3 Conditional probability and probability of joint events
  2.6 Bayes' Theorem
    2.6.1 A challenging problem
    2.6.2 A more challenging problem: The three prisoners' problem
  2.7 Random Variables, Average, Variance and Correlation
  2.8 Some Specific Distributions
    2.8.1 The binomial distribution
    2.8.2 The normal distribution
    2.8.3 The Poisson distribution
  2.9 Generating Functions
  2.10 The Law of Large Numbers
3 Elements of Information Theory
  3.1 A Qualitative Introduction to Information Theory
  3.2 Definition of Shannon's Information and Its Properties
    3.2.1 Properties of the function H for the simplest case of two outcomes
    3.2.2 Properties of H for the general case of n outcomes
    3.2.3 The consistency property of the missing information (MI)
    3.2.4 The case of an infinite number of outcomes
      3.2.4.1 The uniform distribution of locations
      3.2.4.2 The normal distribution of velocities or momenta
      3.2.4.3 The Boltzmann distribution
  3.3 The Various Interpretations of the Quantity H
  3.4 The Assignment of Probabilities by the Maximum Uncertainty Principle
  3.5 The Missing Information and the Average Number of Binary Questions Needed to Acquire It
  3.6 The False Positive Problem, Revisited
  3.7 The Urn Problem, Revisited
4 Transition from the General MI to the Thermodynamic MI
  4.1 MI in Binding Systems: One Kind of Information
    4.1.1 One ligand on M sites
    4.1.2 Two different ligands on M sites
    4.1.3 Two identical ligands on M sites
    4.1.4 Generalization to N ligands on M sites
  4.2 Some Simple Processes in Binding Systems
    4.2.1 The analog of the expansion process
    4.2.2 A pure deassimilation process
    4.2.3 Mixing process in a binding system
    4.2.4 The dependence of MI on the characterization of the system
  4.3 MI in an Ideal Gas System: Two Kinds of Information. The Sackur-Tetrode Equation
    4.3.1 The locational MI
    4.3.2 The momentum MI
    4.3.3 Combining the locational and the momentum MI
  4.4 Comments
5 The Structure of the Foundations of Statistical Thermodynamics
  5.1 The Isolated System -- The Micro-Canonical Ensemble
  5.2 System in a Constant Temperature -- The Canonical Ensemble
  5.3 The Classical Analog of the Canonical Partition Function
  5.4 The Re-interpretation of the Sackur-Tetrode Expression from Informational Considerations
  5.5 Identifying the Parameter β for an Ideal Gas
  5.6 Systems at Constant Temperature and Chemical Potential -- The Grand Canonical Ensemble
  5.7 Systems at Constant Temperature and Pressure -- The Isothermal Isobaric Ensemble
  5.8 The Mutual Information due to Intermolecular Interactions
6 Some Simple Applications
  6.1 Expansion of an Ideal Gas
  6.2 Pure, Reversible Mixing -- The First Illusion
  6.3 Pure Assimilation Process -- The Second Illusion
    6.3.1 Fermi-Dirac (FD) statistics -- Fermions
    6.3.2 Bose-Einstein (BE) statistics -- Bosons
    6.3.3 Maxwell-Boltzmann (MB) statistics
  6.4 Irreversible Process of Mixing Coupled with Expansion
  6.5 Irreversible Process of Demixing Coupled with Expansion
  6.6 Reversible Assimilation Coupled with Expansion
  6.7 Reflections on the Processes of Mixing and Assimilation
  6.8 A Pure Spontaneous Deassimilation Process
  6.9 A Process Involving only Change in the Momentum Distribution
  6.10 A Process Involving Change in the Intermolecular Interaction Energy
  6.11 Some Baffling Experiments
  6.12 The Second Law of Thermodynamics
Appendices
  A Newton's binomial theorem and some useful identities involving binomial coefficients
  B The total number of states in the Fermi-Dirac and the Bose-Einstein statistics
  C Pair and triplet independence between events
  D Proof of the inequality
Abstract:
The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the "driving force" of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.

It has been 140 years since Clausius coined the term "entropy"; almost 50 years since Shannon developed the mathematical theory of "information" - subsequently renamed "entropy." In this book, the author advocates replacing "entropy" by "information," a term that has become widely used in many branches of science.

The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but as the fundamental cornerstone concept of thermodynamics, a role held until now by the term "entropy."

The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur-Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes whose "driving force" is analyzed in terms of information.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.