

| Best Sellers Rank | 1,259,467 in Books; 1,409 in Computer Information Systems; 50,027 in Biological Sciences; 108,968 in Reference |
| Customer reviews | 4.1 out of 5 stars (14) |
| Dimensions | 15.24 x 2.95 x 22.86 cm |
| Edition | Reprint |
| ISBN-10 | 9814651672 |
| ISBN-13 | 978-9814651677 |
| Item weight | 794 g |
| Language | English |
| Print length | 492 pages |
| Publication date | 26 Mar. 2015 |
| Publisher | Wspc |
L**D
Does the Second Law Apply to Life and the Universe?
This is indeed a welcome and long-needed addition to the literature dealing with the connection between entropy and information theory. If nothing else, Ben-Naim's book serves as a cautionary label on a bottle of medicine, warning the avid reader not to swallow all that is fed to him in the pseudo-scientific popular literature that has grown up around the words entropy and information. The word entropy was coined by Rudolf Clausius a century and a half ago, and Ben-Naim, with surgical precision, separates Clausius the great physicist, whose formulations of the first and second laws still stand today, from Clausius the dramatizer, who enshrouded his laws in the clichés that the energy of the universe is constant while its entropy `strives' to a maximum. Ben-Naim concludes that it is meaningless to talk about the energy and entropy of a universe that is not defined thermodynamically. Tell me its volume, number of particles, temperature, and pressure, and I will tell you its energy and entropy. Otherwise, the problem is ill-posed. Moreover, if such a universe is isolated, as we believe it to be, why should entropy show a tendency to increase? To the list of showmen, Ben-Naim adds: -Peter Atkins, in his "Four Laws that Drive the Universe," exaggerates when he says that the second law accounts for the emergence of the intricately ordered forms of life. It certainly does not. In another unfulfilled promise, Atkins promises to show how chaos can run against Nature. Oddly enough, the journal bearing the same name judged Atkins' book as "going a long way to ease the confusion about the subject." -Sean Carroll, in his "From Eternity to Here," asserts that, according to classical relativity, there is no way to reconstruct information. 
As a matter of fact, general relativity does not touch on information at all, and Carroll is confusing it with what has become known, incorrectly, as black hole thermodynamics, which deals with the hated singularities that Einstein tried at all costs to avoid by building bridges over them. -Jacob Bekenstein constructs the second law of black holes on the pillars of Stephen Hawking's area theorem, which likens the statement that when two black holes collide the area of their event horizons is greater than the sum of their original areas to the increase in entropy when two bodies, of the same kind but at different temperatures, come into contact. Sadly, the particular case where the entropy is extensive, which occurs when the temperatures are the same, is not covered by Hawking's theorem, and neither is it by Bekenstein's formulation of the second law. In fact, equilibrium between two black holes can never be achieved, so what type of second law are we talking about? And if that is not enough, Ben-Naim queries why bother calculating the entropy of a black hole to begin with when so little is known about it. -Jacques Monod, who epitomizes the adage that an expert should remain in his field of expertise, writes in his "Chance and Necessity" that "[i]ndeed, it is legitimate to view the irreversibility of evolution as an expression of the Second Law in the biosphere." Ben-Naim faults the Nobel Laureate, saying that the statement is not only false but moreover "deepens the `mystery' and `incomprehensibility' associated with entropy and the Second Law." I totally agree. Finally, although one is intrigued by Aaron Katchalsky's statement that "[l]ife is a constant struggle against the tendency to produce entropy by irreversible processes", I totally agree with Ben-Naim that the entropy of a living system per se is undefinable, and even if it were, no one would be able to quantify how much entropy it produces. 
In short, Ben-Naim's message is to keep your sights down and seek answers to well-posed problems, while it remains true that entropy has a privileged role in having one foot in the macroscopic world and the other in the microscopic one. As such, it is not only macroscopically measurable but can be calculated microscopically from the permutations of balls in urns, as Ludwig Boltzmann so well appreciated. Ben-Naim would settle for an intermediary role where entropy determines the probability distribution governing everything from games of chance to macroscopically disordered systems; in short, any system containing a large number of identical, randomly distributed elements where the outcome of an experiment is less than certain.
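The balls-in-urns picture mentioned above can be made concrete: Boltzmann's count of arrangements W = N!/(n1!…nk!) and Shannon's measure of information agree per ball as N grows. A minimal Python sketch (the function names and the example distribution are mine, for illustration only, not from the book):

```python
import math

def log_multinomial(counts):
    """Natural log of W = N!/(n1!...nk!), the number of distinct ways
    to place N labeled balls into urns with the given occupancies."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

def shannon_entropy(probs):
    """Shannon's measure of information, H = -sum p ln p (in nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Distribute N balls over 4 urns in fixed proportions p.
p = (0.5, 0.25, 0.125, 0.125)
for n_balls in (8, 80, 8000):
    counts = [int(n_balls * pi) for pi in p]
    # (1/N) ln W approaches H(p) as N grows -- Boltzmann meets Shannon.
    print(n_balls, log_multinomial(counts) / n_balls, shannon_entropy(p))
```

For N = 8000 the per-ball value of ln W already matches H(p) to about two decimal places, which is the sense in which entropy has "one foot in the microscopic world."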
F**R
A good dose of reality for the educated layman.
At last we have an honest popular science book filled with facts, with solid, concrete ideas rather than wild speculation or the usual tomes filled with statements that overemphasise what science can do. Too many times have I seen what passes for popular science, sometimes written by respected scientists, attempt to glorify the achievements of science with highly dubious statements. It is this kind of hyperbole which has forever turned me off just about every popular science book. This book by Ben-Naim is divided into four chapters. The first introduces the idea of information, finally leading to Shannon's measure of information (SMI). This is followed by plenty of examples, from simple ideas of probability and twenty-questions games to various mixing processes. The second chapter then concentrates on thermodynamic entropy for various processes in isolated systems, as well as discussions of the arrow of time in the second law and the common interpretation of entropy as disorder. These two chapters are then used to study how both information and entropy are related to the processes of life. This includes a study of the molecular structure of DNA and of information storage in the brain, as well as some discussion of so-called neg-entropy and feeding on information. The last chapter concentrates instead on the universe and studies how many scientists have speculated on the entropy of the universe as well as its SMI. These two things are very different from each other, and it is clear that even great scientists mistake the two or equate them. This book is an attempt to clarify these issues, as they remain an ambiguous and confused mess in the popular science literature. The conclusion is that thermodynamic entropy is only ever defined for an isolated system at equilibrium, and so clearly cannot be properly defined for either living beings or the universe. 
In addition, information, as studied using Shannon's information theory, overcomes some of these weaknesses and can still be defined for a system that is neither in equilibrium nor isolated, provided that a probability distribution is defined for it. These clarifications completely demystify the speculative statements made in other popular science literature. A good dose of reality for the educated layman.
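The point that SMI requires only a probability distribution, with no assumption of equilibrium or isolation, can be shown in a few lines of Python (a hedged sketch; the helper name is mine, not from the book):

```python
import math

def smi_bits(probs):
    """Shannon's measure of information in bits: H = -sum p log2 p.
    Defined for ANY probability distribution -- no equilibrium or
    isolation assumption is needed."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Twenty-questions intuition: 8 equally likely objects need 3 yes/no questions.
print(smi_bits([1/8] * 8))                   # prints 3.0
# A biased distribution carries a smaller SMI than a uniform one.
print(smi_bits([0.5, 0.25, 0.125, 0.125]))   # prints 1.75
```

This also illustrates the book's distinction: the bit here is merely a unit of measure, like the centimeter, not "information" itself.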
O**2
I was tired of all those grandiose statements about entropy and the "everything is information" claims that have become commonplace. Ben-Naim has created a no-nonsense book covering entropy and Shannon's measure of information that is clear and, as much as possible, simple. It is a must-read!
J**A
Very satisfied. Thank you!
J**N
Arieh Ben-Naim’s latest book, in a series of five books about the perplexing topic of entropy and information, is a brave attempt to bring precision and rigor to a topic that has befuddled many readers of popular science books. In this volume he directly challenges several current popular authors who apply the concepts of entropy and information to topics like life, evolution, and the universe in ways that are confusing and, to use one of Ben-Naim’s favorite adjectives, meaningless. One hopes that some of these scientists and science writers will respond and clarify or defend their writing. This would benefit the curious public greatly. Full disclosure: I was a proofreader of an early draft of the book, because I was hoping the book would stimulate debate on several topics that I love to study. The first two chapters are a thorough and lengthy tutorial on information theory as developed by Claude Shannon of Bell Labs, and on entropy, that elusive concept developed by Clausius, Thomson, Boltzmann, and Gibbs. There is less of a rigorous mathematical focus in this book than in his earlier volumes, but there is still plenty to digest, especially if you go to the endnotes every time there is a reference. Setting the stage requires 275 out of the 406 pages. (I’m not sure that all of the math is error-free.) Why spend so much time on basics? To address a root of the problem. Many statements by well-known authors “are the result of confusing information with SMI (Shannon’s Measure of Information), then confusing SMI with entropy, and finally applying entropy to predict the fate of life and of the universe.” To bring his points home, Ben-Naim badgers the reader with repetition and alternative explanations. The terminology brings much baggage that must be stripped away. For example, Shannon developed what is called information theory, but it is really communication theory. His work deals with the transmission of error-free data over communication links. 
It does not address the content of the message, i.e., it does not concern what we colloquially call information. Shannon didn’t care if the message was written by Shakespeare or Daffy Duck. The term “bit” is a unit of information, just as the centimeter is a unit of length. Some people claim the universe is made of bits. “But the bit is a unit of information, not information, and certainly not the smallest possible chunk of information.” The last two chapters apply the concepts to living systems and the universe. I am particularly interested in the concepts of the entropy of the universe and black hole entropy, particularly Sean Carroll’s Past Hypothesis, which Carroll asserts explains the arrow of time. Ben-Naim criticizes the popular writing on these topics for lacking a justification for using the term “entropy” in these cases. The “entropy of the universe” may be totally meaningless, since the universe is not a well-defined thermodynamic system at equilibrium and we have no way to measure it. Similarly, a black hole “is far from being well-defined and well-characterized object. Trying to estimate its entropy based on its mass, energy or the area of its horizon is, at best, making an estimate of the entropy of the BH if that would be definable. It does not provide any additional information beyond that.”
R**E
This book is 90% information theory textbook and 10% discussion of entropy and its relation to life. What’s worse is that the author then goes about slamming the ideas of Erwin Schrödinger, which, I’m sure, is the reason a substantial number of potential readers buy the book, in the least eloquent, least substantiated fashion I can imagine from someone well versed in this area. Then, rather than following up with his own theories about entropy and life, he states that entropy likely doesn’t have any intrinsic involvement in life, or that if it does, we can’t currently discern how. This book serves as a Reader’s Digest of information theory at best and a clickbait-titled rant at worst.
M**T
In January of this year (2015) I began corresponding with Professor Ben-Naim. Shortly thereafter, I was invited to review his manuscript for this book. It was thrilling to see a book at this stage of its development, and I read it several times. I was amazed by the Professor’s world-class knowledge and expertise in each topic, and I came to realize and appreciate that these books are hard work. This book should be, and deserves to be, a best seller. The book is loaded with information theory, Second Law principles and concepts, and the application of probability to these subjects. The book is well organized and beautifully written. In Chapter Two, the professor thoroughly covers Maxwell’s Demon, and his is undoubtedly one of the best analyses and commentaries ever written on this paradox. In Chapter Three, the professor explains Schrödinger’s Cat in a most interesting way. I was very impressed, and I admired the author for identifying, criticizing, and correcting unsupportable statements and conclusions made by some of the well-known, established scientists involved in these subjects. Ben-Naim had the fortitude, intellect, and courage to point out these errors and expose them. With regard to evolution, Ben-Naim treats the subject with respect for all. Everyone will enjoy his vision of the future of the Universe and the future of the human race. Lastly, I would like to thank Professor Ben-Naim for the honor and privilege of being one of the reviewers, and most of all for his friendship.