Book contents
- Frontmatter
- Contents
- Preface
- Donors
- Bayesian Methods: General Background
- Monkeys, Kangaroos, and N
- The Theory and Practice of the Maximum Entropy Formalism
- Bayesian Non-Parametric Statistics
- Generalized Entropies and the Maximum Entropy Principle
- The Probability of a Probability
- Prior Probabilities Revisited
- Band Extensions, Maximum Entropy and the Permanence Principle
- Theory of Maximum Entropy Image Reconstruction
- The Cambridge Maximum Entropy Algorithm
- Maximum Entropy and the Moments Problem: Spectroscopic Applications
- Maximum-Entropy Spectrum from a Non-Extendable Autocorrelation Function
- Multichannel Maximum Entropy Spectral Analysis Using Least Squares Modelling
- Multichannel Relative-Entropy Spectrum Analysis
- Maximum Entropy and the Earth's Density
- Entropy and Some Inverse Problems in Exploration Seismology
- Principle of Maximum Entropy and Inverse Scattering Problems
- Index
The Theory and Practice of the Maximum Entropy Formalism
Published online by Cambridge University Press: 04 May 2010
Summary
INTRODUCTION AND OVERVIEW
We consider three aspects of the maximum entropy formalism [1–3]. Our purpose is to dispel the three most common objections raised against the rationale and results of the approach. To do so we restrict the scope of the formalism: we consider only experiments that can be repeated N times (N not necessarily large).
(a) Consistent inference: The probabilities determined by the maximum entropy formalism are shown to have the interpretation of mean frequencies. Their values are independent of the number, N, of repetitions of the experiment. What does depend on N is the variance of the frequency: the larger N is, the smaller the variance, and the less likely the actual, observed frequencies are to deviate from the mean. Here (following [4]) we show that the maximum entropy formalism does have the stated consistency property. Elsewhere [5, 6] we have shown that it is the only algorithm with that property; Ref. [6] gives additional arguments, also based on the need for consistency of predictions in reproducible experiments.
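The N-dependence described above can be checked numerically. The following sketch (our illustration, not from the chapter) repeats an experiment with fixed single-trial probability many times at several values of N: the mean observed frequency stays at the single-trial probability, while its spread shrinks roughly like 1/√N.

```python
import random

# Illustrative simulation (not from the chapter): the observed frequency of an
# outcome with fixed probability p0 has an N-independent mean but an
# N-dependent spread, which decays roughly as sqrt(p0*(1-p0)/N).
def frequency_spread(p0, n, runs=1000, seed=0):
    """Mean and std. dev. of the observed frequency over many N-trial runs."""
    rng = random.Random(seed)
    freqs = []
    for _ in range(runs):
        k = sum(1 for _ in range(n) if rng.random() < p0)
        freqs.append(k / n)
    mean = sum(freqs) / len(freqs)
    var = sum((f - mean) ** 2 for f in freqs) / len(freqs)
    return mean, var ** 0.5

for n in (10, 100, 1000):
    m, s = frequency_spread(0.5, n)
    print(f"N={n:4d}  mean frequency = {m:.3f}  spread = {s:.3f}")
```

With p0 = 0.5 the predicted spread is sqrt(0.25/N), i.e. about 0.158, 0.050, and 0.016 for N = 10, 100, 1000, which the simulation reproduces.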
The maximum entropy approach dates at least as far back as Boltzmann [7], who showed that, in the N → ∞ limit, the maximum entropy formalism determines the most probable frequencies. Ever since, the approach has been plagued by the criticism that it is valid only in the N → ∞ limit. The present results [4–6] should put an end to such arguments.
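To make the formalism concrete, here is a minimal sketch (our example, not taken from the chapter) of the classic constrained-die problem: maximize the entropy −Σ p_k log p_k over faces k = 1…6 subject to a prescribed mean ⟨k⟩. The maximizing distribution is exponential, p_k ∝ exp(−λk), and the Lagrange multiplier λ can be found by bisection.

```python
import math

# Illustrative maximum entropy solver (not from the chapter): faces 1..6,
# constraint sum(k * p_k) = target_mean. Solution: p_k proportional to
# exp(-lam * k), with lam fixed by the constraint.
def maxent_die(target_mean, faces=range(1, 7), tol=1e-12):
    faces = list(faces)

    def mean_for(lam):
        w = [math.exp(-lam * k) for k in faces]
        z = sum(w)  # partition function
        return sum(k * wk for k, wk in zip(faces, w)) / z

    # mean_for is strictly decreasing in lam, so bisection applies.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid  # mean too large: increase lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * k) for k in faces]
    z = sum(w)
    return [wk / z for wk in w]

p = maxent_die(4.5)
print([round(pk, 4) for pk in p])
```

For a target mean of 4.5 (above the uniform value 3.5) the multiplier comes out negative and the probabilities increase monotonically with the face value, as expected.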
Type: Chapter
Information: Maximum Entropy and Bayesian Methods in Applied Statistics: Proceedings of the Fourth Maximum Entropy Workshop, University of Calgary, 1984, pp. 59–84. Publisher: Cambridge University Press. Print publication year: 1986.