Book contents
- Frontmatter
- Contents
- Acknowledgments
- Prologue
- Part I Pattern recognition
- Part II Pattern generation: a key to the puzzles
- Introduction
- 6 An interpretative model
- 7 Testing the interpretative model
- 8 The interpretative model and whorled patterns
- 9 Convergences among models
- Epilogue
- Part III Origins of phyllotactic patterns
- Part IV Complements
- Appendixes
- Bibliography
- Author index
- Subject index
6 - An interpretative model
Published online by Cambridge University Press: 27 April 2010
Summary
The necessity of defining entropy measures
An a-disciplinary concept
The word entropy comes from a Greek word meaning evolution. The physical meaning of the concept of entropy is much disputed and is still considered far from clear; according to Poincaré, it is a "prodigiously abstract concept." There does not exist a completely rigorous mathematical formulation of thermodynamics. Wiener (1948) indicated the need to extend the notion of physical entropy when he stated that information is negative entropy. For Brillouin (1959), information and physical entropy are of the same nature, an increase in entropy corresponding to a loss of information.
The literature – in physics; in the statistical theory of communications and in information theory; in the social and life sciences; in probability theory, graph theory, and Lebesgue's integration theory – contains many concepts and formulas for entropy and for quantity of information. Among them we find the topological and structural information content of Rashevsky (1955) and Trucco (1956a,b), defined on graphs as a measure of their complexity; the better-known Shannon-Weaver entropy of a set of probabilities, given by I = -Σ p_i log p_i; the Hartley-Nyquist formula I = -log p, where p is the probability of drawing an n-letter message from an urn containing all messages; the chromatic information content of Mowshowitz (1968); the entropy of measurable functions and the epsilon-entropy; and the absolute S-entropy and the weighted entropy.
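As a rough illustration of the two measures just cited, the sketch below computes the Shannon-Weaver entropy of a finite probability distribution and the Hartley-Nyquist information of a single draw. It is not taken from the book; the function names, the use of Python, and the choice of base-2 logarithms are assumptions made only for this example.

```python
import math

def shannon_entropy(probabilities, base=2):
    """Shannon-Weaver entropy I = -sum(p_i * log(p_i)) of a finite
    probability distribution; terms with p_i = 0 contribute nothing."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

def hartley_information(p, base=2):
    """Hartley-Nyquist measure I = -log(p) for a single event of
    probability p, e.g. drawing one particular message from an urn
    containing all possible messages."""
    return -math.log(p, base)

if __name__ == "__main__":
    # Uniform distribution over 8 outcomes: entropy is log2(8) = 3 bits.
    print(shannon_entropy([1 / 8] * 8))   # 3.0
    # A biased coin carries less than 1 bit per toss.
    print(shannon_entropy([0.9, 0.1]))    # about 0.469
    # One message among 16 equally likely ones: -log2(1/16) = 4 bits.
    print(hartley_information(1 / 16))    # 4.0
```

Both functions reduce to the same value for a uniform distribution over n outcomes (log n), which is one way of seeing why the Hartley measure is a special case of the Shannon-Weaver entropy.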
- Type: Chapter
- Information: Phyllotaxis: A Systemic Study in Plant Morphogenesis, pp. 127-144
- Publisher: Cambridge University Press
- Print publication year: 1994