Information Theory Meets Power Laws: Stochastic Processes and Language Models.
Material type:
- text
- computer
- online resource
ISBN: 9781119625360
DDC classification: 410.15195
LOC classification: P98 .D436 2021
Contents:
- Cover -- Title Page -- Copyright -- Contents -- Preface -- Acknowledgments -- Basic Notations
- Chapter 1 Guiding Ideas: 1.1 The Motivating Question -- 1.2 Further Questions About Texts -- 1.3 Zipf's and Herdan's Laws -- 1.4 Markov and Finite-State Processes -- 1.5 More General Stochastic Processes -- 1.6 Two Interpretations of Probability -- 1.7 Insights from Information Theory -- 1.8 Estimation of Entropy Rate -- 1.9 Entropy of Natural Language -- 1.10 Algorithmic Information Theory -- 1.11 Descriptions of a Random World -- 1.12 Facts and Words Related -- 1.13 Repetitions and Entropies -- 1.14 Decay of Correlations -- 1.15 Recapitulation
- Chapter 2 Probabilistic Preliminaries: 2.1 Probability Measures -- 2.2 Product Measurable Spaces -- 2.3 Discrete Random Variables -- 2.4 From IID to Finite-State Processes -- Problems
- Chapter 3 Probabilistic Toolbox: 3.1 Borel σ-Fields and a Fair Coin -- 3.2 Integral and Expectation -- 3.3 Inequalities and Corollaries -- 3.4 Semidistributions -- 3.5 Conditional Probability -- 3.6 Modes of Convergence -- 3.7 Complete Spaces -- Problems
- Chapter 4 Ergodic Properties: 4.1 Plain Relative Frequency -- 4.2 Birkhoff Ergodic Theorem -- 4.3 Ergodic and Mixing Criteria -- 4.4 Ergodic Decomposition -- Problems
- Chapter 5 Entropy and Information: 5.1 Shannon Measures for Partitions -- 5.2 Block Entropy and Its Limits -- 5.3 Shannon Measures for Fields -- 5.4 Block Entropy Limits Revisited -- 5.5 Convergence of Entropy -- 5.6 Entropy as Self-Information -- Problems
- Chapter 6 Equipartition and Universality: 6.1 SMB Theorem -- 6.2 Universal Semidistributions -- 6.3 PPM Probability -- 6.4 SMB Theorem Revisited -- 6.5 PPM-based Statistics -- Problems
- Chapter 7 Coding and Computation: 7.1 Elements of Coding -- 7.2 Kolmogorov Complexity -- 7.3 Algorithmic Coding Theorems -- 7.4 Limits of Mathematics -- 7.5 Algorithmic Randomness -- Problems
- Chapter 8 Power Laws for Information: 8.1 Hilberg Exponents -- 8.2 Second Order SMB Theorem -- 8.3 Probabilistic and Algorithmic Facts -- 8.4 Theorems About Facts and Words -- Problems
- Chapter 9 Power Laws for Repetitions: 9.1 Rényi-Arimoto Entropies -- 9.2 Generalized Entropy Rates -- 9.3 Recurrence Times -- 9.4 Subword Complexity -- 9.5 Two Maximal Lengths -- 9.6 Logarithmic Power Laws -- Problems
- Chapter 10 AMS Processes: 10.1 AMS and Pseudo AMS Measures -- 10.2 Quasiperiodic Coding -- 10.3 Synchronizable Coding -- 10.4 Entropy Rate in the AMS Case -- Problems
- Chapter 11 Toy Examples: 11.1 Finite and Ultrafinite Energy -- 11.2 Santa Fe Processes and Alike -- 11.3 Encoding into a Finite Alphabet -- 11.4 Random Hierarchical Association -- 11.5 Toward Better Models -- Problems
- Future Research -- Bibliography -- Index -- EULA
Description based on publisher-supplied metadata and other sources.
Electronic reproduction. Ann Arbor, Michigan: ProQuest Ebook Central, 2024. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.