Information and Its Role in Nature. J. G. Roederer. Springer, 2005. Hardcover, $59.95, 235 pages. ISBN 3-540-23075-0.

With the growth of the Internet, the concomitant movement of the market towards server-based vendors and efficient search engines, and the advent of high-throughput scientific methods promoting easy access to data, the storage, acquisition, and analysis of information have become dominant concerns of our time. Whereas previous centuries witnessed the ascendancy of machines and built upon them mechanical theories of the universe and society, our own century has looked to information as a fundamental metaphor for organizing science and civilization. It is therefore surprising that no generally accepted definition of information exists beyond that provided by Claude Shannon in 1949, whose original domain of application was largely limited to communication problems in engineering. Over the last twenty years, several important articles and books have sought to move information beyond a mere statistical measure of communication or uncertainty, towards an assay of underlying physical properties, to a position where information is treated as part of more general complexity measures related to computational work. In this view, physical mechanisms are considered in tandem with information, providing its essential substrates and constraints; only by placing matter, energy, and information on an equal footing can we hope to understand the true nature of adaptive decision-making mechanisms. In addition to promoting the informational sciences, this has proven to be a program in interdisciplinary thinking, as new bridges have had to be built between the biological, physical, and mathematical disciplines in order to fully understand their correspondences.

It is in this spirit of information as a fundamental organizing principle in physical nature, and in particular in adaptive nature, that Juan G. Roederer has written his book, Information and Its Role in Nature. This is in many ways a curious book, and it is not entirely clear who its target audience is. It is organized into six chapters, on classical information theory, quantum information theory, physical interactions, the role of information in biology, the role of information in physics (described as passive), and information and the brain. The chapter titles provide only a hint of their contents, which range from the origin of the universe to the emergence of consciousness. Information in the technical sense plays only a small role in any of the chapters; greater emphasis is placed on physical or adaptive correlations among the components of a system, which we would more typically associate with complexity measures. The chapters with a physics emphasis tend to be stronger than those on biology. As this suggests, the book does not provide enough detail on information theory to be useful as an introduction to the field or as a reference, and it omits altogether any discussion of relative entropy, mutual information, channel coding, and network information, concepts that would have served the author's interest in causality. The book spends a fair amount of time on quantum mechanics and touches on quantum information theory, but not broadly enough to be of significant use to students of the field. It also chooses not to discuss the physics of computation, as developed by Landauer, Bennett, and their colleagues.
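(For reference: Shannon's measure assigns to a discrete source X with outcome probabilities p(x) the entropy H(X) = -\sum_x p(x) \log_2 p(x), and the mutual information I(X;Y) = \sum_{x,y} p(x,y) \log_2 [p(x,y) / (p(x) p(y))], one of the concepts omitted from the book, is the standard tool for quantifying precisely the kind of correlation among system components that Roederer emphasizes. These are textbook definitions, not formulas taken from the book under review.)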
The book feels more like a personal odyssey through a variety of areas of great interest, loosely connected by qualitative information concepts, and is perhaps best suited to those who have already navigated similar territory.