Happy Atheist Forum

General => Science => Topic started by: ablprop on October 21, 2010, 01:48:42 AM

Title: entropy and inflation
Post by: ablprop on October 21, 2010, 01:48:42 AM
Hi,

I'd like to know more about the low entropy present in the first moments of the universe, and the way inflation affected (created?) that low entropy.

My starting point for this idea is Brian Greene's "Fabric of the Cosmos," and I know that Hackenslash has at least twice now mentioned Greene's incomplete treatment of entropy. I was unsatisfied, as well, because Greene didn't answer my questions:

1) Did a low-entropy situation arise by chance and then expand via inflation? And did the low-entropy situation also trigger inflation, or were the low-entropy and inflation-causing conditions just coincidental? or

2) Did inflation itself somehow create the low entropy situation by spreading out an ordinary high-entropy lump? And again, just what were the conditions that caused inflation in this scenario?
Title: Re: entropy and inflation
Post by: hackenslash on October 21, 2010, 08:02:33 AM
The second is the closest to the picture. I don't have time to treat this properly now, but I'll come back and give it a full treatment later.
Title: Re: entropy and inflation
Post by: Tank on October 21, 2010, 09:32:24 AM
:pop:
Title: Re: entropy and inflation
Post by: DropLogic on October 22, 2010, 11:56:55 PM
My kitchen is a great example of entropy.  I will post pictures with a labeled diagram later tonight.
Title: Re: entropy and inflation
Post by: hackenslash on October 23, 2010, 11:37:34 AM
The first thing to note here is just what entropy is. The problem being, of course, that there are several definitions, all of which are related, but have slightly different expressions. I'll copy a couple of my posts from other places to deal with that, not least because I have been meaning to do a thorough treatment of entropy for a while, in order to shoot down the various canards erected about entropy by cretinists.

Quote from: "url=http://www.rationalskepticism.org/general-science/alda-greene-why-communicating-science-matters-t14198.html#p523918]hackenslash[/url]"]That's one way of looking at it. The problem is that there are different definitions of entropy from different branches of science and, while they are all related, they're not actually equivalent. In thermodynamics, entropy is a measure of how much energy in a system is unavailable to do work. In statistical mechanics, it's a measure of uncertainty, specifically the uncertainty of a sytem being in a particular configuration. This can loosely be described as the number of different ways a system could be reconfigured without changing its appearance. The classic analogy employed here is the desktop strewn with bits of paper. You can move one piece of paper without appreciably altering the appearance of the desktop. To actually tidy the desktop requires the input of energy, effectively rendering energy unavailable to do work. This is, of course, the relationship between the two treatments.

Incidentally, the use of entropy in information theory is related to the second definition, because an increase in information entropy is precisely equal to an increase in uncertainty with regard to the content of a message (Shannon information). Indeed, information is defined as 'reduction in uncertainty', so it can be readily seen how these concepts all relate to each other, and thus why the term entropy actually applies to all of them, albeit in slightly different contexts.

Edited to add:

Going back to the black hole, you can see that all of these definitions apply. The number of ways you can reconfigure the system without affecting its outward appearance is high (probably higher than any other thermodynamic system), and pretty much all of the energy in the system is unavailable for performing work, so its entropy is high. On the other hand, if all the mass of the system is at the singularity, then it is also highly ordered, which demonstrates why equating entropy with disorder is misleading.

Moving on, Daniel F. Styer, 'Entropy and Evolution', American Journal of Physics 76 (11), November 2008, tells us:

Quote from: "Styer 2008"Disorder is a metaphor for entropy, not a defi nition for entropy.

Metaphors are valuable only when they are not identical in all respects to their targets. (For example, a map of Caracas is a metaphor for the surface of the Earth at Caracas, in that the map has a similar arrangement but a dissimilar scale. If the map had the same arrangement and scale as Caracas, it would be no easier to navigate using the map than it would be to navigate by going directly to Caracas and wandering the streets.) The metaphor of disorder for entropy is valuable and thus imperfect. For example, take some ice cubes out of your freezer, smash them, toss the shards into a bowl, and then allow the ice to melt. The jumble of ice shards certainly seems more disorderly than the bowl of smooth liquid water, yet the liquid water has the greater entropy

Finally, a little on entropy in information theory:

Quote from: "url=http://www.rationalskepticism.org/creationism/energy-god-t9141-240.html#p332791]hackenslash[/url]"]
Quote from: "Psalm23"I'm not lecturing you on entropy, please calm down and put your knife away.  :what:

PubMed: http://www.ncbi.nlm.nih.gov/pubmed/8196422 (http://www.ncbi.nlm.nih.gov/pubmed/8196422)

That article should speak for itself. I'm willing to bet there are others like it.

Oops! You appear to have missed the operative words in that abstract. Perhaps it would be informative if the key words were presented here. I will bold them, just so you don't miss them.

QuoteAging, from the perspective of the second law of thermodynamics, can be viewed as associated with the inevitable and natural increase in *informational entropy* of the genome.

So, what this actually amounts to is a quote mine. This is not talking about thermodynamic entropy, but information entropy. The two concepts are related, but not equivalent. Allow me to explain, for your edification and the reduction of your abject ignorance in this regard:

In information theory, there is a parameter known as Shannon entropy, which is defined as the degree of uncertainty associated with a random variable. What this means, in real terms, is that the detail of a message can only be quantitatively ascertained when entropy is low. In other words, the entropy of a message is highest when the highest number of random variables are inherent in the transmission or reception of a message.

In thermodynamics, entropy is defined as the amount of energy in a system that is not available to perform work.

The association between the two is very simple, because it comes down to the equations themselves. Shannon actually chose the word 'entropy' because his formula for determining the degree of uncertainty in an informational system is very similar to the Gibbs equation, formulated by Gibbs and Boltzmann in the late 19th century. The equations are as follows:

Shannon:
H(X) = -Σ_i p(x_i) log_b p(x_i)

Gibbs:
S = -k_B Σ_i p_i ln p_i

It is only the similarity between the two equations that led Shannon to choose the term entropy, and if he were a member of this forum, he would fuck your argument up the arse with a cheese-covered stick, possibly adorned with several pinecones, just for your cretinous misrepresentation of what he was actually talking about.*
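To make that parallel concrete, here is a minimal sketch (my own, in Python; not part of the quoted post) that evaluates both formulas for the same probability distribution. The only real difference is the constant out front and the base of the logarithm.

Code (Python):
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs, base=2):
    # H(X) = -sum_i p(x_i) log_b p(x_i), in bits when base=2
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def gibbs_entropy(probs):
    # S = -k_B sum_i p_i ln p_i, in joules per kelvin
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

dist = [0.5, 0.5]               # two equally likely microstates / symbols
print(shannon_entropy(dist))    # 1.0 bit
print(gibbs_entropy(dist))      # k_B * ln 2, about 9.57e-24 J/K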

It should be noted that, among all that up there, you may be able to see something quite dichotomous about entropy in the pre-Planck cosmos, specifically in the bit about black holes (always assuming that the BB arose from a singularity, which is not clear at this time). We have a situation in which entropy can be described as both high and low at the same time. Indeed, Roger Penrose (I'll see if I can find the lecture, but it's escaping me at the moment) showed that even given a single treatment of entropy, the one from physics as cited above, it can be said that the universe was in a state of low and high entropy at the same time, not least because, mathematically, infinity is indistinguishable from zero**.

I prefer to think of it as high entropy, though. In reality, the early universe before expansion was at maximum entropy, especially in the presence of a singularity. All the energy was tied up in the singularity, which gives us one definition. The system could have had its internal state changed in any number of ways without affecting its outward appearance, giving another definition. What is important in the context of this thread is what inflation does to the picture, because it's not very intuitive but, with a little thought, it's quite simple to visualise. What inflation does is to change the game. In other words, while entropy was maximal before inflation, the maximum attainable value for entropy increased as inflation continued, thus allowing entropy to increase. Indeed, the maximum attainable value continues to increase even now as the universe expands. This will not continue forever, of course, because there will come a point at which energy cannot travel from one place to another, since the temperature of the universe will be the same everywhere, a state known as thermal equilibrium. At that point, for entropy to increase would require the input of more energy, which is, of course, prohibited by the first law of thermodynamics.
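A toy illustration of that 'rising ceiling' point (mine, not hackenslash's; the numbers are arbitrary): if entropy is capped at k_B ln W, where W is the number of accessible microstates, then a system can sit at its maximum while the maximum itself keeps rising as expansion makes more states accessible.

Code (Python):
import math

K_B = 1.380649e-23  # J/K

def max_entropy(accessible_microstates):
    # Boltzmann bound: S_max = k_B * ln(W), reached when all W states are equally likely
    return K_B * math.log(accessible_microstates)

# A stand-in for expansion opening up ever more accessible states:
for W in (10, 10**6, 10**24):
    print(W, max_entropy(W))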

I hope that clears some of it up, and I will revisit this at some point to properly formulate a more comprehensive treatment of entropy that deals more specifically with some of the erroneous claims erected about it.


* Excuse the tone, this is from another forum in which bad ideas are treated with contempt, and intellectual dishonesty such as that displayed by the poster I was responding to even more so.

** As an audio professional, this is pretty simple for me to grasp, because 'infinity' equals 'zero level' (related to impedance, where the signal is infinitely attenuated, giving silence).
Title: Re: entropy and inflation
Post by: ablprop on October 23, 2010, 03:27:22 PM
This is excellent. Thank you. I have questions:

1) So the highly ordered state in the pre-inflationary universe is not the result of an unlikely initial state. Instead, it is the result merely of the fact that the universe was very small, meaning there just weren't that many states to occupy. Correct?

2) Inflation took this compact state (something like the compact state of a black hole, but perhaps not a singularity) and spread it out. But in spreading, inflation caused the universe to have hugely more potential entropy than it had possessed in its compact state. In the ball on a hill analogy, rather than raising the ball, inflation simply made the hill much, much larger, while keeping the ball relatively in the same position. Am I still on track?

3) So today, as hydrogen fuses into helium, plants absorb the resultant photons, we eat plants, and worms eat us (hopefully after some fun Saturday nights), we and the stars and the plants and the worms are sort of grabbing onto the energy liberated by this rolling ball as it heads back down toward maximum entropy. OK so far?

4) So now comes the big question: what caused this high-entropy, compact, black hole-like pre-inflationary universe to suddenly balloon up, and in so doing give us the potential for hydrogen, helium, plants, worms, and Saturday nights?

I look forward to responses.
Title: Re: entropy and inflation
Post by: hackenslash on October 23, 2010, 03:44:42 PM
Quote from: "ablprop"This is excellent. Thank you. I have questions:

1) So the highly ordered state in the pre-inflationary universe is not the result of an unlikely initial state. Instead, it is the result merely of the fact that the universe was very small, meaning there just weren't that many states to occupy. Correct?

That's pretty close. It should be noted that the singularity has not yet been established, and at least one of the proposed models for cosmic instantiation has no singularity. If that model is correct, then discussions about the entropic state of the pre-Planck cosmos are rendered somewhat moot, if not entirely obsolete.

Quote2) Inflation took this compact state (something like the compact state of a black hole, but perhaps not a singularity) and spread it out. But in spreading, inflation caused the universe to have hugely more potential entropy than it had possessed in its compact state. In the ball on a hill analogy, rather than raising the ball, inflation simply made the hill much, much larger, while keeping the ball relatively in the same position. Am I still on track?

Well, not quite the same relative position, because there is still an average progression toward greater entropy, but essentially yes.

Quote3) So today, as hydrogen fuses into helium, plants absorb the resultant photons, we eat plants, and worms eat us (hopefully after some fun Saturday nights), we and the stars and the plants and the worms are sort of grabbing onto the energy liberated by this rolling ball as it heads back down toward maximum entropy. OK so far?

Pretty good.

Quote4) So now comes the big question: what caused this high-entropy, compact, black hole-like pre-inflationary universe to suddenly balloon up, and in so doing give us the potential for hydrogen, helium, plants, worms, and Saturday nights?

That's the $64,000 question, and what the various models for cosmic instantiation are attempting to elucidate. Again, Brian Greene deals with one of these hypotheses quite beautifully in Fabric of the Cosmos, in his treatment of the 'inflaton field', along with a discussion of the Higgs field, which you can find from about p.281 (paperback edition, 2005).
Title: Re: entropy and inflation
Post by: ablprop on October 23, 2010, 03:55:34 PM
So I guess now we wait for the LHC to find or not find whatever it is that it will find or not find.

It's a great time to be alive. Thanks again.
Title: Re: entropy and inflation
Post by: Inevitable Droid on November 02, 2010, 09:32:42 AM
Hackenslash, I'm having trouble equating the thermodynamic, statistical, and informational dimensions of entropy.  Please help.  I comprehend the thermodynamic - the fact that greater entropy means less energy is available to do work.  But I would have thought that less energy available to do work would mean less work could be done, and thus there would be less uncertainty as to possible configurations of the system.  Picture a bunch of automobiles with empty gas tanks parked in a parking lot.  If nobody puts gas in their tanks and nobody tows them away, those automobiles tomorrow will still be in the same places they are today.  Uncertainty is low, if nobody puts gas in their tanks or tows them away.  In my head, greater entropy means less uncertainty.  Where am I going wrong?  I have no doubt I went wrong somewhere.  I just can't pinpoint where.  Thanks for helping me (and maybe others) understand!

My current level of understanding, incidentally, includes reasonable clarity as to why entropy doesn't disprove evolution, as creationists used to argue and newbie creationists still do, until they realize this argument has been refuted so utterly that long-time creationist debaters no longer even bother to bring it up.  If the Earth were a closed system, entropy might preclude an ever-escalating diversity in the kinds of work that might occur, but the Earth is an open system, with energy continually pouring in from the sun.  Energy pouring in means more energy available to do work, hence less entropy.  The only reason the universe as a whole is imagined to be progressing into steadily greater entropy over all is because the universe as a whole is imagined to be a closed system.  If the universe as a whole were somehow discovered to be an open system, implying apparently that there are other universes (or at least one other) pouring energy into this one, then presumably science would postulate at least the possibility that entropy won't progressively increase over all in our universe as a whole.  If any of that is erroneous, your correction would be appreciated.
Title: Re: entropy and inflation
Post by: DropLogic on November 02, 2010, 04:03:51 PM
Quote from: "Inevitable Droid"Hackenslash, I'm having trouble equating the thermodynamic, statistical, and informational dimensions of entropy.  Please help.  I comprehend the thermodynamic - the fact that greater entropy means less energy is available to do work.  But I would have thought that less energy available to do work would mean less work could be done, and thus there would be less uncertainty as to possible configurations of the system.  Picture a bunch of automobiles with empty gas tanks parked in a parking lot.  If nobody puts gas in their tanks and nobody tows them away, those automobiles tomorrow will still be in the same places they are today.  Uncertainty is low, if nobody puts gas in their tanks or tows them away.  In my head, greater entropy means less uncertainty.  Where am I going wrong?  I have no doubt I went wrong somewhere.  I just can't pinpoint where.  Thanks for helping me (and maybe others) understand!
Left alone long enough those vehicles will most certainly evaporate completely.
Title: Re: entropy and inflation
Post by: Inevitable Droid on November 02, 2010, 10:39:11 PM
Quote from: "DropLogic"Left alone long enough those vehicles will most certainly evaporate completely.

If so, then that's another way of saying that entropy and greater certainty go together.  But apparently it's incorrect to say that.  What I don't understand is why.  Hopefully Hackenslash or someone else of higher calibre than myself will lead me (and perhaps others) out of the wilderness of ignorance.
Title: Re: entropy and inflation
Post by: hackenslash on November 03, 2010, 10:18:48 AM
Quote from: "Inevitable Droid"Hackenslash, I'm having trouble equating the thermodynamic, statistical, and informational dimensions of entropy.  Please help.  I comprehend the thermodynamic - the fact that greater entropy means less energy is available to do work.  But I would have thought that less energy available to do work would mean less work could be done, and thus there would be less uncertainty as to possible configurations of the system.  Picture a bunch of automobiles with empty gas tanks parked in a parking lot.  If nobody puts gas in their tanks and nobody tows them away, those automobiles tomorrow will still be in the same places they are today.  Uncertainty is low, if nobody puts gas in their tanks or tows them away.  In my head, greater entropy means less uncertainty.  Where am I going wrong?  I have no doubt I went wrong somewhere.  I just can't pinpoint where.  Thanks for helping me (and maybe others) understand!

Where you're going wrong is to treat the usages as equal, when they are not. They are related but not the same. That's pretty much the point of the above, because it shows how usage is different between various branches of science. I'll break it down with a list:

1. Physics = Energy that is not available.
2. Statistical mechanics = disorder.
3. Information theory = uncertainty.

It's a mistake to treat the three concepts as the same thing; it's just useful to note the relationship between them. In the case you cite above of the automobiles, without the input of energy they would remain in the same place. Under the physical definition, this is a state of high entropy. In the case of statistical mechanics, it is a state of low entropy, and in the case of information theory, it's also low entropy.

This is the entire root of the problem, and the reason that even scientists have trouble grasping entropy, and falsely conflate entropy in energetic processes with disorder.
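To put a hedged number on the car-park example (my own toy figures, not from the post above): if the cars certainly stay put, tomorrow's configuration carries no uncertainty at all, so the information (Shannon) entropy is zero; if any of, say, eight rearrangements were equally likely, the entropy would be three bits.

Code (Python):
import math

def shannon_entropy(probs):
    # Uncertainty in bits: H = -sum p log2 p
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))        # 0.0 bits: the cars certainly stay where they are
print(shannon_entropy([1/8] * 8))    # 3.0 bits: eight equally likely rearrangements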

QuoteMy current level of understanding, incidentally, includes reasonable clarity as to why entropy doesn't disprove evolution, as creationists used to argue and newbie creationists still do, until they realize this argument has been refuted so utterly that long-time creationist debaters no longer even bother to bring it up.  If the Earth were a closed system, entropy might preclude an ever-escalating diversity in the kinds of work that might occur, but the Earth is an open system, with energy continually pouring in from the sun.  Energy pouring in means more energy available to do work, hence less entropy.  The only reason the universe as a whole is imagined to be progressing into steadily greater entropy over all is because the universe as a whole is imagined to be a closed system.  If the universe as a whole were somehow discovered to be an open system, implying apparently that there are other universes (or at least one other) pouring energy into this one, then presumably science would postulate at least the possibility that entropy won't progressively increase over all in our universe as a whole.  If any of that is erroneous, your correction would be appreciated.

Well, there is still a bit of misunderstanding here. Entropy will still increase overall on average*, although it may decrease statistically at a local level. There is a real problem that you've highlighted with the use of the word 'universe', which is one of my pet peeves, and a misunderstanding that I really wish physicists and cosmologists would attempt to address, because it causes a great deal of confusion among lay people.

There is only one universe, and there can only be one universe, because it's what the word means. That's not to say that nothing lies outside our cosmic expansion, only that our cosmic expansion is only part of the universe. The universe is literally 'that which is', and includes anything that lies outside our expansion, if anything. The history of this usage is pretty clear, but it has become muddied in the last 70 years or so due to an expansion in our understanding. It might be useful to deal with the history of the word to clarify:

The word universe has always meant 'everything that exists'. When the word was first applied, it was thought that what we could see was all there was. Hubble, of course, changed all that. He discovered that what were thought of as clouds of gas, or 'nebulae', inside our galaxy (a word which had no real meaning before Hubble) were actually clusters of stars outside our own galaxy, and were galaxies in their own right. This expanded what we thought of as the universe, but the word still meant 'that which exists'. When later physical theories were put forward to deal with how our cosmic expansion began, cosmologists came up with different terms, because they thought that people had an idea what was meant by 'universe', and expanding what was thought of as the universe would only be confusing. In reality, it was not keeping to the original definition of 'universe' that caused all the confusion. Worse, it actually allowed equivocation in terms of precisely what cosmologists were talking about. One of the earliest definitions given, and you can still find this definition in dictionaries, was 'the entirety of time and space and all matter and energy contained therein'.

Now, many people think that time began at the big bang, but this is far from having been established. In reality, time and space are facets of a single entity, spacetime, and although Hawking and Penrose once showed that relativity dictated a beginning of time at the big bang, both have since withdrawn that claim, because they realised that Einstein's work didn't take quantum effects into account, and that the mathematics couldn't deal with the cosmos once it reached the Planck epoch, when quantum effects had to be taken into account because of the scale of things. This is the reason that the unification of QM and GR is described as the biggest problem in physics. In short, we cannot assert that time began at the big bang. In that light, and since we have models on the table that have time existing before the big bang, we have an extension of what constitutes the universe. Indeed, since some of those models constitute a means of inputting energy into our cosmic expansion, it is even unclear just what kind of thermodynamic system our cosmos actually is!

Coming back to your question, it may be worth expanding a little on thermodynamic systems, as this should clear up the misunderstanding.

There are several types of thermodynamic system, broadly categorised into three groups:

1. Open.
2. Closed.
3. Isolated.

An open system is a system that allows transfer of energy/matter (hereinafter called 'stuff', since they are the same thing in different states, as per E=mc^2) in both directions across its boundary. The Earth is an open system, because it allows the transfer of stuff across its boundary. It takes in energy in the form of radiation from the sun, as well as matter in the form of space-borne debris. It also radiates energy across the boundary in the other direction, in the form of heat dissipation into space.

A closed system is a system that allows the transfer of energy in both directions across its boundary, but not matter. An example of this would be a greenhouse.

An isolated system is one that is completely isolated from its environment, and allows no stuff to cross its boundary. An example of this would be something like a gas cylinder that is completely insulated.

In practice, isolated systems are almost certainly impossible, because they are still penetrated by fields, such as gravity, which constitute the ability to do work, hence energy**.

In each of the above systems, there are subsystems that define just what kind of transfer is allowed and in which direction across the boundary, but that's enough info to deal with the topic under discussion.
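Purely as an illustrative sketch (my own naming, nothing standard), the three classes reduce to two yes/no questions about what may cross the boundary:

Code (Python):
from dataclasses import dataclass

@dataclass
class ThermoSystem:
    name: str
    energy_crosses_boundary: bool
    matter_crosses_boundary: bool

    def kind(self) -> str:
        if self.energy_crosses_boundary and self.matter_crosses_boundary:
            return "open"
        if self.energy_crosses_boundary:
            return "closed"
        return "isolated"

print(ThermoSystem("Earth", True, True).kind())                           # open
print(ThermoSystem("greenhouse", True, False).kind())                     # closed
print(ThermoSystem("perfectly insulated cylinder", False, False).kind())  # isolated (idealised)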

Our local cosmic expansion is almost certainly a closed system, as opposed to an isolated system, but not necessarily. Certainly, the input of energy at the beginning would define it as such. Whether or not transfer of energy is still allowed by the boundaries (even assuming the cosmos has boundaries; see the Hartle-Hawking 'no boundary' proposal for more on this) is again not clear at this point.


And once again, what was supposed to be a brief post dealing with minor points turns into War and Peace. :lol:

*Strictly, it doesn't necessarily increase, it only necessarily doesn't decrease overall.
** One of the biggest unsolved problems in physics is to measure the mass of gravity. If you want a futile research project that might pay huge dividends, this is a good contender.
Title: Re: entropy and inflation
Post by: Inevitable Droid on November 03, 2010, 12:45:01 PM
Quote from: "hackenslash"Where you're going wrong is to treat the usages as equal, when they are not. They are related but not the same. That's pretty much the point of the above, because it shows how usage is different between various branches of science.

First, hackenslash, I want to thank you for your comprehensive reply.  Posts like yours don't get written in a matter of mere moments.  I appreciate your investment of time and energy.

Back on topic, I see that I was experiencing semantic dissonance if I may coin the term. :)

I could go with cosmos for everything contained in our local space/time bubble and universe for the sum of all space/time bubbles.  Of course that implies the absolute universality of space/time, perhaps a false implication.

QuoteThere are several types of thermodynamic system, broadly categorised into three groups:

1. Open.
2. Closed.
3. Isolated.

Interesting.  Thanks for the clarification.  So closed only means closed to matter, whereas isolated means closed even to energy.

QuoteIn practice, isolated systems are almost certainly impossible, because they are still penetrated by fields, such a gravity, which constitute the ability to do work, hence energy**.

In that case, then, stasis won't be absolute anywhere.

QuoteOur local cosmic expansion is almost certainly a closed system, as opposed to an isolated system, but not necessarily. Certainly, the input of energy at the beginning would define it as such. Whether or not transfer of energy is still allowed by the boundaries (even assuming the cosmos has boundaries. See Hawking-Hartle 'no boundary' proposal for more on this) is again not clear at this point.

If merely closed, not isolated, then stasis will never take dominion of our space/time bubble.  Good news for our descendants a trillion years from now. :)
Title: Re: entropy and inflation
Post by: hackenslash on November 03, 2010, 01:22:28 PM
Quote from: "Inevitable Droid"First, hackenslash, I want to thank you for your comprehensive reply.  Posts like yours don't get written in a matter of mere moments.  I appreciate your investment of time and energy.

My pleasure. In reality, this is my raison d'etre in many ways. Moreover, I am in the middle of writing a book, and the topic of entropy is going to form an entire (probably quite long) chapter, so discussions like this are extremely helpful in streamlining my thinking with respect to the topic, as well as raising interesting questions that I may not have considered for inclusion.

QuoteBack on topic, I see that I was experiencing semantic dissonance if I may coin the term. :)

You'll never know just how many arguments I've had about that, including with physicists.

QuoteI could go with cosmos for everything contained in our local space/time bubble and universe for the sum of all space/time bubbles.  

I concur, although it should be noted that both words actually mean the very same thing, one from Latin and the other from Greek. It does make sense to use them thus, though, and I generally do in shorthand, but discussions like this one can, in some contexts, get quite hung up on semantics, which is why I employ the longhand 'our local cosmic expansion'. I have had many, many discussions of this stuff, and I arrived at this usage through many misunderstandings, sometimes bordering on the psychotic, as Tank will no doubt attest.

QuoteOf course that implies the absolute universality of space/time, perhaps a false implication.

And now you have begun to touch upon another interesting topic, namely the supposed absoluteness of space and time, which, taken individually, they do not have. What is absolute, though, is spacetime, the single unification of space and time. This is precisely what Einstein showed us. It has also led to the misunderstanding that there is something special about light, which is a major red herring, and another topic I intend to address in the book. I won't deal with it here, though. Suffice it to say that it is not light that is special, it is the speed at which it travels through space, and what it is, precisely, that this is telling us about the universe. I may begin another thread on this at some point in the near future, because it's another of those things that I need to knock around a bit to make sure I have made it all understandable. It's actually very simple stuff to get your head around, it's just a bit counter-intuitive. If you can understand Pythagoras' Theorem, you can understand relativity and, by corollary, why light travels at the speed it does and the implications of that.
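Since Pythagoras came up, here is the standard light-clock sketch of that claim (textbook material, not hackenslash's own derivation): a light pulse bouncing between two mirrors traces a longer diagonal path when the clock moves past you, and applying Pythagoras to that triangle gives the familiar time-dilation factor.

Code (Python):
import math

C = 299_792_458.0  # speed of light in m/s

def time_dilation_factor(v):
    # Light clock + Pythagoras: (c*t')^2 = (c*t)^2 + (v*t')^2  =>  t' = t / sqrt(1 - v^2/c^2)
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

print(time_dilation_factor(0.5 * C))   # ~1.155: the moving clock ticks about 15% slower
print(time_dilation_factor(0.99 * C))  # ~7.09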

QuoteIn that case, then, stasis won't be absolute anywhere.

That's difficult to know, in reality. It is possible that we will know in the relatively near future whether or not gravitational influences (and other fields) extend beyond the cosmos (now used here advisedly) or whether they're completely contained. Until we know more about the boundaries (with due reference to my caveat earlier) and indeed the dimensional manifold than we do now, the jury is out.

QuoteIf merely closed, not isolated, then stasis will never take dominion of our space/time bubble.  Good news for our descendants a trillion years from now. :)

That depends. We can't know yet if there can be further input after the initial input at the Big Bang. It may be that the boundaries of our extended dimensions constitute a system that, although once closed (allowing the input of energy), is now isolated (allowing no input). It would still be properly described as closed, rather than an isolated system, simply because there has been input. Certainly, though, the current consensus is that no further input is likely, and that the fate of the universe is a state of thermal equilibrium, or maximum entropy. Of course, this doesn't take quantum effects into account, so the picture is far from complete.
Title: Re: entropy and inflation
Post by: Inevitable Droid on November 03, 2010, 10:48:38 PM
Quote from: "hackenslash"
QuoteBack on topic, I see that I was experiencing semantic dissonance if I may coin the term. :crazy:

An excellent term. Mind if I steal it?

Not at all.  Please do!

QuoteIt's worth noting that information theory has its own problems of this nature, not least because there are two very distinct formulations of information theory, one being the aforementioned Shannon information, which deals with signal integrity, and Kolmogorov information, which deals with information storage.

Going purely on what I've gleaned from your post, I think a distinction between information and data might be helpful.  I don't think there's any utility in calling something information unless and until an observer perceives and interprets it.  In a universe with no observers there would be no information, but there could still be data, if machines or other artifacts (like books, for instance) are storing it.  Kolmogorov, then, would be theorizing about data, rather than information.  Noise is the opposite of information but might not be the opposite of data.  Shannon would be theorizing about information.

QuoteWell, again, the terms are related, but this is precisely why even physicists have problems with it, and this has been a major niggle of mine with popular science writers for a very long time. Even Sagan, bless him, described entropy as disorder, from the perspective of a physicist.

I would suggest that order and information are head and tail of the same coin.  What exists subjectively as information, exists objectively as order.  It would make perfect sense to me if an information theorist like Shannon were to study order and define entropy in terms of order.  It doesn't make sense to me for physicists to do that.  Physicists should be treating everything as data, or better yet, as force.  In a context where information is irrelevant, order likewise is irrelevant.  As far as I know, physicists don't study information, and if that's the case, then neither do they study order, for the two are head and tail of the same coin.  That being said, physicists might in fact want to collaborate with information theorists, since, as it seems to me, information comes into being when an observer perceives and interprets it, and the phenomenon of observation has become a topic of interest for quantum physicists.
Title: Re: entropy and inflation
Post by: hackenslash on November 03, 2010, 11:22:57 PM
Quote from: "Inevitable Droid"Going purely on what I've gleaned from your post, I think a distinction between information and data might be helpful.  I don't think there's any utility in calling something information unless and until an observer perceives and interprets it.  In a universe with no observers there would be no information, but there could still be data, if machines or other artifacts (like books, for instance) are storing it.  Kolmogorov, then, would be theorizing about data, rather than information.  Noise is the opposite of information but might not be the opposite of data.  Shannon would be theorizing about information.

The distinction is already drawn. Data and information are actually interchangeable terms under pretty much any definition you can find. Indeed, most definitions of one will cite the other in order to avoid circularity. The real distinction between the two formulations of information theory is message, which is what Shannon deals with. In reality, though, the simple distinction between transmission and storage suffices for this purpose.

You are correct, though, in that information is the observational data in a system, although that information is still there without a mind to perceive it. It isn't very useful, because it requires a mind to be informed, but the same holds true for data.

QuoteI would suggest that order and information are head and tail of the same coin.  What exists subjectively as information, exists objectively as order.

Hmmm, I think you may be overthinking it. It happens because, as I say, entropy is ill-understood, even by scientists. Information is as well. One of the fora I spend a lot of time on is populated by tenured professional scientists (I'm not one, by the way, just a jobbing music producer) many of whom have struggled to grasp these concepts.

QuoteIt would make perfect sense to me if an information theorist like Shannon were to study order and define entropy in terms of order.  It doesn't make sense to me for physicists to do that.  Physicists should be treating everything as data, or better yet, as force.  In a context where information is irrelevant, order likewise is irrelevant.

Shannon does describe it in that manner, because a reduction in order is an increase in uncertainty, which is how Shannon defines entropy. As for physicists, there are good reasons to employ disorder as an analogy for entropy, because it actually holds water in some cases, and statistical mechanics is very important in physics (as is information theory), which is one of the reasons that confusion arises.

QuoteAs far as I know, physicists don't study information, and if that's the case, then neither do they study order, for the two are head and tail of the same coin.

Quite the contrary. Information is exactly what physicists study. Information, remember, is simply the observational data available with respect to a system, and systems are precisely what physicists study. Moreover, while stating that they study order would be misleading, it certainly forms a very major part of what they do study, because order comes about through the interaction of forces, and this is precisely what physicists do study.

QuoteThat being said, physicists might in fact want to collaborate with information theorists, since, as it seems to me, information comes into being when an observer perceives and interprets it, and the phenomenon of observation has become a topic of interest for quantum physicists.

They certainly do collaborate, although not necessarily explicitly. The only real scientific input I deal with in my day-to-day work is the Shannon-Nyquist sampling theorem, but it has to be borne in mind that much of what Shannon and Nyquist dealt with has massive implications in such areas as wave mechanics, a branch of physics, and in quantum mechanics, which deals heavily in the implications of wave mechanics at the quantum level, since not only are the descriptions of all quantum events erected in terms of the wavefunction, but quantum mechanics also hinges on probability distributions, which likewise employ wave mechanics for their description.
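For a concrete taste of the Shannon-Nyquist side of this (standard sampling-theorem material, nothing specific to the post above): a signal can only be reconstructed unambiguously if it is sampled at more than twice its highest frequency, which is why 44.1 kHz comfortably covers audio up to 20 kHz, and why anything above half the sample rate folds back down as an alias.

Code (Python):
def min_sample_rate(f_max_hz):
    # Nyquist criterion: the sample rate must exceed 2 * f_max
    return 2 * f_max_hz

def aliased_frequency(signal_hz, sample_rate_hz):
    # Frequency reported by a too-slow sampler, folded into the range 0..f_s/2
    f = signal_hz % sample_rate_hz
    return min(f, sample_rate_hz - f)

print(min_sample_rate(20_000))            # 40000 Hz, so CD audio's 44100 Hz suffices
print(aliased_frequency(30_000, 44_100))  # 14100 Hz: a 30 kHz tone masquerades as 14.1 kHz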

All things are connected, and this is where the confusion arises, and why physicists make these errors, especially when presenting their work for a lay audience, who don't grasp the context in which they're being erected. When talking to other physicists, they generally know what is meant and the context is taken for granted.
Title: Re: entropy and inflation
Post by: Inevitable Droid on November 04, 2010, 10:07:36 AM
Quote from: "hackenslash"
Quote from: "Inevitable Droid"Going purely on what I've gleaned from your post, I think a distinction between information and data might be helpful.

The distinction is already drawn. Data and information are actually interchangeable terms under pretty much any definition you can find.

First, I want to thank you again for your time and effort.  I also want to say that at this point I'm stepping outside science into philosophy.  If science is your only interest as far as this thread is concerned, then I will very much understand and respect your decision to let our conversation gracefully end, if so you decide.

Philosophically, then, I would have to say that if the terms data and information are defined in such a way as to be interchangeable, then in fact they aren't distinct.  Quite the opposite of any distinction having been drawn, it has instead been erased.  What I propose is to redefine the words in such a way that we in fact draw a sharp distinction.  I want it to suddenly be false that these terms are interchangeable.  

I suggest we define data as, "(1) order transposed onto a medium and stored there in such a way that it can be retrieved later in some fashion; (2) order communicated by one subject to another via the spoken word; (3) order encountered in the environment by a subject who is predisposed to perceive and interpret that order."  I suggest we define information as, "the perception and interpretation given to data by a subject."  I suggest we define order as, "(1) an attribute of reality which, if presented to a subject, would be data to be perceived and interpreted; (2) that which information informs us of."

By the above redefinitions I have made the terms data, order, and information interdependent but not interchangeable.  I think this is a worthwhile thing to do because there really are three distinct things here and they really are interdependent, and both the distinction and the interdependence are noteworthy, because noting them has mental utility for us.

I'll stop here and see if the above is of interest to you or to anyone else.
Title: Re: entropy and inflation
Post by: hackenslash on November 04, 2010, 06:32:28 PM
Quote from: "Inevitable Droid"First, I want to thank you again for your time and effort.  I also want to say that at this point I'm stepping outside science into philosophy.  If science is your only interest as far as this thread is concerned, then I will very much understand and respect your decision to let our conversation gracefully end, if so you decide.

Well, I have little time for navel-gazing. Philosophy is a useful tool for teaching one how to think, but beyond that it has no utility whatsoever.

QuotePhilosophically, then, I would have to say that if the terms data and information are defined in such a way as to be interchangeable, then in fact they aren't distinct.  Quite the opposite of any distinction having been drawn, it has instead been erased.  What I propose is to redefine the words in such a way that we in fact draw a sharp distinction.  I want it to suddenly be false that these terms are interchangeable.  

A little misunderstanding here. I wasn't talking about a distinction between data and information, but between the two treatments. Perhaps I could have been clearer there. There is no distinction between the data and information, they are essentially the same thing. A datum is a single piece of information, while data are multiple pieces of information.

QuoteI suggest we define data as, "(1) order transposed onto a medium and stored there in such a way that it can be retrieved later in some fashion; (2) order communicated by one subject to another via the spoken word; (3) order encountered in the environment by a subject who is predisposed to perceive and interpret that order."  I suggest we define information as, "the perception and interpretation given to data by a subject."  I suggest we define order as, "(1) an attribute of reality which, if presented to a subject, would be data to be perceived and interpreted; (2) that which information informs us of."

Too much work for too little gain. I don't see enough distinction there to make it worthwhile, not least because it isn't just me you would have to convince. I suggest that this would get very confusing to very many people very quickly. I also don't really see the necessity for drawing any distinction, to be honest. As for order, it has no place in this discussion, because it isn't connected to data/information; it describes something else entirely, namely a qualitative property rather than a definitional one.

QuoteBy the above redefinitions I have made the terms data, order, and information interdependent but not interchangeable.  I think this is a worthwhile thing to do because there really are three distinct things here and they really are interdependent, and both the distinction and the interdependence are noteworthy, because noting them has mental utility for us.

No, I think that order is unconnected, while the other distinction is redundant and unnecessarily confusing. Indeed, this is the major reason I loathe most branches of philosophy. While setting out to clarify language, they actually have the opposite effect, and mangle it beyond all recognition, which I take as a personal affront, because language is something that is very important to me. Rare is the philosopher who can actually use language in a clear fashion or actually understands the true value of a semantic discussion.

I will resist this with every fibre of my being.

QuoteI'll stop here and see if the above is of interest to you or to anyone else.

I'll see what others may have to say, but I think I've made my position clear.
Title: Re: entropy and inflation
Post by: hackenslash on December 20, 2010, 10:33:41 PM
Just popped in to deliver this, which is an essay on entropy written for a kind of competition in another forum, and which is going to form the bones of the chapter on entropy for the book.

I hope you find it somewhat illuminating.

Quote from: "hackenslash"Order, Order!

Entropy in Thermodynamics, Statistical Mechanics, Evolution and Information Theory.

One of the most common misconceptions, and possibly the one most difficult to properly treat, is that evolution is a violation of the law of entropy. In this essay, I want to provide a comprehensive treatment of entropy. This will not be easy in the word limit (nor in the time I have to write this), not least because it is a concept horribly misunderstood, often because of the way it's treated in popular science books by people who really should know better. By the time I am finished, one of two things will have happened. Either I will have simplified entropy to the state where it can be easily understood, or I will have demonstrated that I don't actually understand it myself.

So what is entropy? Let's begin with what it is not, and a quote from a paper specifically dealing with entropy in evolution:

Quote from: "Styer 2008"Disorder is a metaphor for entropy, not a definition for entropy.

Metaphors are valuable only when they are not identical in all respects to their targets. (For example, a map of Caracas is a metaphor for the surface of the Earth at Caracas, in that the map has a similar arrangement but a dissimilar scale. If the map had the same arrangement and scale as Caracas, it would be no easier to navigate using the map than it would be to navigate by going directly to Caracas and wandering the streets.) The metaphor of disorder for entropy is valuable and thus imperfect. For example, take some ice cubes out of your freezer, smash them, toss the shards into a bowl, and then allow the ice to melt. The jumble of ice shards certainly seems more disorderly than the bowl of smooth liquid water, yet the liquid water has the greater entropy. [1]

The problem is that there are different definitions of entropy from different branches of science and, while they are all related, they're not actually equivalent. In thermodynamics, entropy is a measure of how much energy in a system is unavailable to do work. In statistical mechanics, it's a measure of uncertainty, specifically the uncertainty of a system being in a particular configuration. This can loosely be described as the number of different ways a system could be reconfigured without changing its appearance. This latter is also related to information entropy or Shannon entropy.

So, beginning with statistical mechanics: In statistical mechanics, entropy is a measure of uncertainty or probability. If a system is in an improbable configuration, it is said to be in a state of low entropy.

The classic analogy employed here is the desktop strewn with bits of paper. You can move one piece of paper without appreciably altering the appearance of the desktop. Statistically speaking, this configuration, or one of the many configurations it could have while still remaining untidy, is more probable than one in which the desktop is tidy. Thus, it is in a state of high entropy.

This, of course, is where the idea of entropy as disorder actually comes from, and it does not reflect the definition of entropy employed in thermodynamics, although there is a relationship, as I shall show.
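A small sketch of the microstate-counting behind that (my own toy model, with made-up numbers): call the desktop 'tidy' only when every sheet is in its one assigned slot; there is a single such arrangement, against an enormous number of untidy ones, so the untidy macrostate is overwhelmingly more probable and carries the higher entropy.

Code (Python):
import math

K_B = 1.380649e-23  # J/K

def boltzmann_entropy(multiplicity):
    # S = k_B * ln(W), with W the number of microstates making up the macrostate
    return K_B * math.log(multiplicity)

n_sheets, n_positions = 10, 20
tidy = 1                                 # one unique 'everything filed' arrangement
untidy = n_positions ** n_sheets - tidy  # every other way of strewing the sheets about

print(boltzmann_entropy(tidy))     # 0.0: a unique configuration, lowest possible entropy
print(boltzmann_entropy(untidy))   # ~4.1e-22 J/K: vastly many equivalent configurations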

Moving on, let's deal with the definition of entropy in thermodynamics. This will require the laying of a little groundwork:

Firstly, let's look at what the Law of Entropy actually states, as even this is a source of confusion.

The Second Law of Thermodynamics states that, in general,  the entropy of a system will not decrease, except by increasing the entropy of another system. [2]

This is an important point. It does not state that entropy will always increase, as many suggest, only that it will not, in general, decrease. There are no physical principles that prohibit the persistence of a thermodynamic state (except with due reference to the Uncertainty Principle, of course). Indeed, entropy in this instance can be defined as a tendency towards equilibrium, which is just such a state!  Measurement of this tendency can be quite simply stated as the amount of energy in a system that is unavailable to perform work.

It is probably apposite at this point to deal with the three main classes of thermodynamic system. [3]

1. Open system: An open thermodynamic system is the easiest to understand. It is simply a system in which both energy and matter can be exchanged in both directions across the boundary of the system. Indeed, it is not stretching the point too much to state that an open system is one with no boundary. We define a boundary only because that defines the limits of the system, but from a thermodynamic perspective, the boundary is only a convenience of definition. This is distinct from the two other main classes of thermodynamic system, in which the boundary actually plays an important role in the operation of the system.

2. Closed system: A closed thermodynamic system is one in which heat and work may cross the boundary, but matter may not. This type of system is further divided based on the properties of the boundary. A closed system with an adiabatic boundary allows the exchange of work, but not heat, while a rigid boundary prevents the exchange of work (there can be no change in volume), but may allow heat.

3. Isolated system: An isolated system is a theoretical construct that, apart from the universe itself, probably does not exist in reality. It is a system in which no heat, work or matter can be exchanged across the boundary in either direction. There are two important things to note from my statement of this. The first is that my usage of the word 'universe' is in line with my standard usage, and does not describe 'that which arose from the big bang', but 'that which is'. The second is that we know of no system from which gravity, for example, can be excluded, and since gravity can apparently cross all boundaries, there can be no such thing as an isolated system within our universe unless a barrier to gravity can be found, hence the statement that there is probably no such thing as an isolated system except the universe itself.

Now that that's out of the way, let's attempt a rigorous definition of entropy in a thermodynamic sense.

Entropy in a thermodynamic system can be defined in a number of ways, all of which are basically just implications of a single definition. Rigorously defined, it is simply a tendency toward equilibrium. This can be interpreted in a number of ways:

1. The number of configurations of a system that are equivalent.
2. A measure of the amount of energy in a system that is unavailable for performing work.
3. The tendency of all objects with a temperature above absolute zero to radiate energy.

Now, going back to our analogy of the untidy desktop, this can now be described as an open system, because heat, matter and work can be exchanged in both directions across its boundary. As stated before, this is a system in which statistical entropy is high, due to the high probability of its configuration when measured against other possible configurations (there are more configurations of the system that are untidy than there are configurations that are tidy). In  other words, and in compliance with the first of our interpretations above, it is a system which has a high number of equivalent configurations, since there are many 'untidy' configurations of the system, but only a few 'tidy' configurations. To bring the desktop to a state of lower entropy, i.e. a tidy desktop, requires the input of work. This work will increase the entropy of another system (your maid, or whoever does the tidying in your office), giving an increase in entropy overall. This, of course, ties the two definitions from different areas of science together, showing the relationship between them. They are not, of course, the same definition, but they are related. It is also the source of the idea of entropy as disorder.

In evolution, the role of the maid is played by that big shiny yellow thing in the sky. The Earth is an open system, which means that heat, work and matter can be exchanged in both directions across the boundary. The input of energy allowing a local decrease in entropy is provided in two forms, but mainly by the input of high-energy photons from the Sun. This allows photosynthesising organisms to extract carbon from the atmosphere in the form of CO2 and convert it into sugars. This is done at the expense of an increase in entropy in the Sun. Indeed, if Earth and Sun are thought of as a single system, then a local decrease in entropy via work input by one part of the system increases the entropy of the system overall.
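To put a hedged number on that (a standard back-of-envelope estimate, not something from the essay): take one joule of heat leaving the Sun's surface at roughly 5800 K and eventually re-radiated by the Earth at roughly 290 K; the Sun loses far less entropy than the Earth's surroundings gain, so the total goes up, and that margin is what life borrows against.

Code (Python):
Q = 1.0                          # one joule radiated by the Sun, later re-radiated by Earth
T_SUN, T_EARTH = 5800.0, 290.0   # approximate surface temperatures in kelvin

dS_sun = -Q / T_SUN      # entropy given up by the Sun
dS_earth = Q / T_EARTH   # entropy dumped into the cooler surroundings

print(dS_sun)             # ~ -0.00017 J/K
print(dS_earth)           # ~ +0.00345 J/K
print(dS_sun + dS_earth)  # ~ +0.0033 J/K overall: the second law is satisfied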

Now, a little word on information entropy:

In information theory, there is a parameter known as Shannon entropy, which is defined as the degree of uncertainty associated with a random variable. What this means, in real terms, is that the detail of a message can only be quantitatively ascertained when entropy is low. In other words, the entropy of a message is highest when the highest number of random variables are inherent in the transmission or reception of a message. This shows a clear relationship between Shannon entropy and the definition of entropy from statistical mechanics, where we again have the definition of entropy as uncertainty, as defined by Gibbs. A further relationship is shown when we look at the equations for entropy from statistical thermodynamics, formulated by Boltzmann and Gibbs in the 19th Century, and Shannon's treatment of entropy in information. Indeed, it was actually the similarity of the equations that led Claude Shannon to call his 'reduction in uncertainty' entropy in the first place!

Shannon:
H(X) = -Σ_i p(x_i) log_b p(x_i)

Gibbs:
S = -k_B Σ_i p_i ln p_i

The relationship here is clear, and the motivation for Shannon's labelling this parameter 'Entropy'.

Now, a bit on what is the most highly entropic entity that is currently known in the cosmos, namely the black hole. In what sense is it highly entropic? Let's look at those definitions again:

1. The number of configurations of a system that are equivalent: Check. This can be restated as the number of internal states a system can possess without affecting its outward appearance.
2. A measure of the amount of energy in a system that is unavailable for doing work: Check. All the mass/energy is at the singularity, rendering it unavailable.
3. The tendency of all objects with a temperature above absolute zero to radiate energy: Check. The black hole does this by a very specialised mechanism, of course. Energy cannot literally escape across the boundary, because to do so would require that it travelled at greater than the escape velocity for the black hole which is, as we all know, in excess of c. The mechanism by which it radiates is through virtual particle pair production. When an electron/positron pair is produced, via the uncertainty principle, at the horizon of the black hole, one of several things can occur. Firstly, as elucidated by Dirac [4], the electron can have both positive charge and negative energy as solutions. Thus, a positron falling across the boundary imparts negative energy on the black hole, reducing the mass of the singularity via E=mc^2, while the electron escapes, in a phenomenon known as Hawking radiation, thus causing the black hole to eventually evaporate. This is Hawking's solution to the 'information paradox', but that's a topic for another time. Secondly, as described by Feynman [5], we can have a situation in which the electron crosses the boundary backwards in time, scatters, and then escapes and radiates away forwards in time. Indeed, it could be argued that Feynman, through this mechanism, predicted Hawking radiation before Hawking did!

Now, the classic example of a system is a collection of gas molecules collected in the corner of a box representing a highly ordered, low entropy state. As the gas molecules disperse, entropy increases. But wait! As we have just seen, the black hole has all its mass/energy at the singularity, which means that the most highly entropic system in the cosmos is also one of the most highly ordered! How can this be? Simple: Entropy is not disorder.
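For a sense of scale (my own aside, using the standard Bekenstein-Hawking formula rather than anything stated in the essay): a black hole's entropy is proportional to the area of its horizon, S = k_B c^3 A / (4 G hbar), and for a single solar mass that already comes out around 10^54 J/K, or about 10^77 in units of k_B.

Code (Python):
import math

# Physical constants (SI units)
G, C, HBAR, K_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23
M_SUN = 1.989e30  # kg

def bekenstein_hawking_entropy(mass_kg):
    # Schwarzschild radius and horizon area, then S = k_B * c^3 * A / (4 * G * hbar)
    r_s = 2 * G * mass_kg / C**2
    area = 4 * math.pi * r_s**2
    return K_B * C**3 * area / (4 * G * HBAR)

print(bekenstein_hawking_entropy(M_SUN))  # ~1.5e54 J/K for a one-solar-mass black hole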

Finally, just a few more words from the paper by Styer:

Quote from: "Daniel F Styer"This creationist argument also rests upon the misconception that evolution acts always to produce more complex organisms. In fact evolution acts to produce more highly adapted organisms, which might or might not be more complex than their ancestors, depending upon their environment. For example, most cave organisms and parasites are qualitatively simpler
than their ancestors.

So, when you come across the canard that evolution violates the law of entropy, your first response should be to ask your opponent for a definition of entropy, as it is almost certain that a) he is using the wrong definition and/or b) that he has no understanding of what entropy actually is.



References:
[1] Daniel F. Styer, 'Entropy and Evolution', American Journal of Physics 76 (11), November 2008.
[2] http://www.upscale.utoronto.ca/PVB/Harrison/BlackHoleThermo/BlackHoleThermo.html
[3] http://en.wikipedia.org/wiki/Thermodynamic_system (http://en.wikipedia.org/wiki/Thermodynamic_system)
[4] P.A.M. Dirac, 'The Quantum Theory of the Electron', 1928.
[5] http://www.upscale.utoronto.ca/GeneralInterest/Harrison/AntiMatter/AntiMatter.html

Further reading:
Simple Nature: Benjamin Crowell (http://www.lightandmatter.com/html_books/0sn/ch05/ch05.html)
http://www.upscale.utoronto.ca/GeneralInterest/Harrison/Entropy/Entropy.html
Title: Re: entropy and inflation
Post by: ablprop on December 21, 2010, 02:22:33 AM
As usual, excellent stuff. I particularly enjoyed this bit:

Quote from: "hackenslash"In evolution, the role of the maid is played by that big shiny yellow thing in the sky. The Earth is an open system, which means that heat, work and matter can be exchanged in both directions across the boundary. The input of energy allowing a local decrease in entropy is provided in two forms, but mainly by the input of high-energy photons from the Sun. This allows photosynthesising organisms to extract carbon from the atmosphere in the form of CO2 and convert it into sugars. This is done at the expanse of an increase in entropy on the sun. Indeed, if Earth and Sun are thought of as a single system, then a local decrease in entropy via work input by one part of the system increases the entropy of the system overall.

An interesting question is this: what if that tomato plant in my yard hadn't intercepted that particular photon? What if, instead, that photon had missed the Earth entirely and kept traveling uninterrupted through space? Does entropy go up by less? If so, then is life on our planet, in its own little minuscule way, speeding the heat death of the universe?

Of course, whether there was life on our planet or not, sunlight would still hit it, and our planet would still then radiate away photons of longer wavelength and lower energy in response, so I suppose that any planet around a star speeds along the heat death of the universe (by a tiny amount).

Or has the entropy increase already occurred by the time the photon flies away from the star? To some extent it must have, because certainly the Sun doesn't care or know whether the Earth, plants or no plants, has intercepted its photons. So the act of radiating must itself cause an increase in the entropy of the Sun. But I'm not sure I see what the mechanism is here. Is it merely that, by firing a photon away, the total mass of the Sun is spread more thinly through the universe?

I found your description of how inflation changed the game extremely useful. I see entropy now as much more a process than a quantity. The universe, pushed by inflation into a state where entropy is much lower than it might have been, is in the process of moving toward that final high entropy state. Along the way, we living things can grab a little of that low entropy, transform it into high entropy, but in the process learn and love and enjoy being alive. Not a particularly cheery thought, maybe, but the mere fact that we could figure it out is more important than the final answer, anyway.