entropy and inflation

Started by ablprop, October 21, 2010, 01:48:42 AM


ablprop

Hi,

I'd like to know more about the low entropy present in the first moments of the universe, and the way inflation affected (created?) that low entropy.

My starting point for this idea is Brian Greene's "Fabric of the Cosmos," and I know that Hackenslash has at least twice now mentioned Greene's incomplete treatment of entropy. I was unsatisfied, as well, because Greene didn't answer my questions:

1) Did a low-entropy situation arise by chance and then expand via inflation? And did the low-entropy situation also trigger inflation, or were the low-entropy and inflation-causing conditions merely coincidental? or

2) Did inflation itself somehow create the low entropy situation by spreading out an ordinary high-entropy lump? And again, just what were the conditions that caused inflation in this scenario?

hackenslash

The second is the closest to the picture. I don't have time to treat this properly now, but I'll come back and give it a full treatment later.
There is no more formidable or insuperable barrier to knowledge than the certainty you already possess it.

Tank

If religions were TV channels atheism is turning the TV off.
"Religion is a culture of faith; science is a culture of doubt." ― Richard P. Feynman
'It is said that your life flashes before your eyes just before you die. That is true, it's called Life.' - Terry Pratchett
Remember, your inability to grasp science is not a valid argument against it.

DropLogic

My kitchen is a great example of entropy.  I will post pictures with a labeled diagram later tonight.

hackenslash

The first thing to note here is just what entropy is. The problem being, of course, that there are several definitions, all of which are related, but have slightly different expressions. I'll copy a couple of my posts from other places to deal with that, not least because I have been meaning to do a thorough treatment of entropy for a while, in order to shoot down the various canards erected about entropy by cretinists.

Quote from: "hackenslash" (http://www.rationalskepticism.org/general-science/alda-greene-why-communicating-science-matters-t14198.html#p523918) That's one way of looking at it. The problem is that there are different definitions of entropy from different branches of science and, while they are all related, they're not actually equivalent. In thermodynamics, entropy is a measure of how much energy in a system is unavailable to do work. In statistical mechanics, it's a measure of uncertainty, specifically the uncertainty of a system being in a particular configuration. This can loosely be described as the number of different ways a system could be reconfigured without changing its appearance. The classic analogy employed here is the desktop strewn with bits of paper. You can move one piece of paper without appreciably altering the appearance of the desktop. To actually tidy the desktop requires the input of energy, effectively rendering energy unavailable to do work. This is, of course, the relationship between the two treatments.

Incidentally, the use of entropy in information theory is related to the second definition, because an increase in information entropy is precisely equal to an increase in uncertainty with regard to the content of a message (Shannon information). Indeed, information is defined as 'reduction in uncertainty', so it can be readily seen how these concepts all relate to each other, and thus why the term entropy actually applies to all of them, albeit in slightly different contexts.

Edited to add:

Going back to the black hole, you can see that all of these definitions apply. The number of ways you can reconfigure the system without affecting its outward appearance is high (probably higher than any other thermodynamic system), and pretty much all of the energy in the system is unavailable for performing work, so its entropy is high. On the other hand, if all the mass of the system is at the singularity, then it is also highly ordered, which demonstrates why equating entropy with disorder is misleading.
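The black hole case can even be made quantitative. As a back-of-envelope sketch (mine, not part of the original post; the constants and solar mass are standard reference values), the Bekenstein-Hawking formula S = k_B c³A / (4Għ) gives the entropy of a black hole from its horizon area:

```python
import math

# Bekenstein-Hawking entropy: S = k_B * c^3 * A / (4 * G * hbar),
# with horizon area A = 16 * pi * G^2 * M^2 / c^4 for a Schwarzschild hole.
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
k_B  = 1.381e-23   # Boltzmann constant, J/K

def bh_entropy(mass_kg):
    area = 16 * math.pi * G**2 * mass_kg**2 / c**4  # horizon area, m^2
    return k_B * c**3 * area / (4 * G * hbar)       # entropy, J/K

M_sun = 1.989e30  # solar mass, kg
print(f"{bh_entropy(M_sun):.2e} J/K")  # roughly 1.5e54 J/K
```

That is many orders of magnitude more entropy than the sun itself carries, which is why a black hole is often described as the highest-entropy object for its size.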

Moving on, Daniel F. Styer, 'Entropy and evolution', American Journal of Physics 76 (11), November 2008, tells us:

Quote from: "Styer 2008" Disorder is a metaphor for entropy, not a definition for entropy.

Metaphors are valuable only when they are not identical in all respects to their targets. (For example, a map of Caracas is a metaphor for the surface of the Earth at Caracas, in that the map has a similar arrangement but a dissimilar scale. If the map had the same arrangement and scale as Caracas, it would be no easier to navigate using the map than it would be to navigate by going directly to Caracas and wandering the streets.) The metaphor of disorder for entropy is valuable and thus imperfect. For example, take some ice cubes out of your freezer, smash them, toss the shards into a bowl, and then allow the ice to melt. The jumble of ice shards certainly seems more disorderly than the bowl of smooth liquid water, yet the liquid water has the greater entropy.
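Styer's ice example can be checked with a two-line calculation (my sketch, not part of the quote; the latent heat figure is the standard handbook value). Melting at constant temperature, the entropy gained is ΔS = Q/T = m·L_f/T:

```python
# Entropy gained by ice melting at its melting point: dS = Q / T = m * L_f / T.
L_f = 3.34e5   # latent heat of fusion of water, J/kg (standard handbook value)
T   = 273.15   # melting point of ice, K

def melting_entropy(mass_kg):
    return mass_kg * L_f / T   # entropy change, J/K

print(f"{melting_entropy(0.1):.1f} J/K")  # ~122 J/K for 100 g of ice
```

The sign is positive: the smooth liquid really does have more entropy than the 'disorderly' shards, exactly as Styer says.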

Finally, a little on entropy in information theory:

Quote from: "hackenslash" (http://www.rationalskepticism.org/creationism/energy-god-t9141-240.html#p332791)
Quote from: "Psalm23" I'm not lecturing you on entropy, please calm down and put your knife away.  :what:

PubMed: http://www.ncbi.nlm.nih.gov/pubmed/8196422

That article should speak for itself. I'm willing to bet there are others like it.

Oops! You appear to have missed the operative words in that abstract. Perhaps it would be informative if the key words were presented here. I will bold them, just so you don't miss them.

QuoteAging, from the perspective of the second law of thermodynamics, can be viewed as associated with the inevitable and natural increase in *informational entropy* of the genome.

So, what this actually amounts to is a quote mine. This is not talking about thermodynamic entropy, but information entropy. The two concepts are related, but not equivalent. Allow me to explain, for your edification and the reduction of your abject ignorance in this regard:

In information theory, there is a parameter known as Shannon entropy, which is defined as the degree of uncertainty associated with a random variable. What this means, in real terms, is that the content of a message can only be quantitatively ascertained when entropy is low. In other words, the entropy of a message is highest when the uncertainty inherent in its transmission or reception is highest.

In thermodynamics, entropy is defined as the amount of energy in a system that is not available to perform work.

The association between the two is very simple, because it amounts to a formal analogy between equations. Shannon actually chose the word 'entropy' because his formula for determining the degree of uncertainty in an informational system is very similar to the Gibbs equation, formulated by Gibbs and Boltzmann in the late 19th century. The equations are as follows:

Shannon:

H = -Σ_i p_i log₂ p_i

Gibbs:

S = -k_B Σ_i p_i ln p_i

It is only the similarity between the two equations that led Shannon to choose the term entropy, and if he were a member of this forum, he would fuck your argument up the arse with a cheese-covered stick, possibly adorned with several pinecones, just for your cretinous misrepresentation of what he was actually talking about.*
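In fact the similarity is exact up to a constant: for the same probability distribution, the Gibbs entropy is just k_B·ln 2 times the Shannon entropy in bits. A minimal sketch (my illustration, not from the original posts):

```python
import math

k_B = 1.381e-23  # Boltzmann constant, J/K

def shannon_H(probs):
    """Shannon entropy: H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_S(probs):
    """Gibbs entropy: S = -k_B * sum(p_i * ln(p_i)), in J/K."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.7, 0.2, 0.1]
# Same functional form, different base and units: S = k_B * ln(2) * H.
print(abs(gibbs_S(p) - k_B * math.log(2) * shannon_H(p)) < 1e-30)  # True
```

The two formulas differ only in the logarithm base and the physical constant out front, which is precisely why Shannon's borrowing of the word was apt but not an identity.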

It should be noted that, among all that up there, you may be able to see something quite dichotomous about entropy in the pre-Planck cosmos, specifically in the bit about black holes (always assuming that the BB arose from a singularity, which is not clear at this time). We have a situation in which entropy can be described as both high and low at the same time. Indeed, Roger Penrose (I'll see if I can find the lecture, but it's escaping me at the moment) showed that even given a single treatment of entropy, the one from physics as cited above, it can be said that the universe was in a state of low and high entropy at the same time, not least because, mathematically, infinity is indistinguishable from zero**.

I prefer to think of it as high entropy, though. In reality, the early universe before expansion was at maximum entropy, especially in the presence of a singularity. All the energy was tied up in the singularity, which gives us one definition. The system could have had its internal state changed in any number of ways without affecting its outward appearance, giving another definition. What is important in the context of this thread is what inflation does to the picture, because it's not very intuitive but, with a little thought, it's quite simple to visualise. What inflation does is to change the game. In other words, while entropy was maximal before inflation, the maximum attainable value for entropy increased as inflation continued, thus allowing entropy to increase. Indeed, the maximum attainable value continues to increase even now as the universe expands. This will not continue forever, of course, because there will come a point at which energy cannot travel from one place to another, since the temperature of the universe will be the same all over, a state known as thermal equilibrium. At this point, for entropy to increase would require the input of more energy. This, of course, is prohibited by the first law of thermodynamics.
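The 'game-changing' point can be illustrated with Boltzmann's S = k ln W: the maximum entropy of a system depends on how many microstates W are accessible, and expansion increases W. A toy sketch (the numbers are purely illustrative, not cosmological data):

```python
import math

def max_entropy(num_microstates):
    """Boltzmann: S = k * ln(W), here in units of k."""
    return math.log(num_microstates)

# Purely illustrative numbers: inflation enlarges the 'box', so the
# count of accessible microstates explodes.
s_max_before = max_entropy(10)      # tiny pre-inflation state space
s_max_after  = max_entropy(10**30)  # vastly enlarged post-inflation state space

# A system sitting at the old maximum is suddenly far below the new one,
# leaving room for entropy to increase even though it never decreased.
print(s_max_before < s_max_after)  # True
```

This is the sense in which entropy could be 'maximal' before inflation and yet increase ever since: the ceiling moved, not the floor.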

I hope that clears some of it up, and I will revisit this at some point to properly formulate a more comprehensive treatment of entropy that deals more specifically with some of the erroneous claims erected about it.


* Excuse the tone, this is from another forum in which bad ideas are treated with contempt, and intellectual dishonesty such as that displayed by the poster I was responding to even more so.

** As an audio professional, this is pretty simple for me to grasp, because 'infinity' equals 'zero level' (related to impedance, where the signal is infinitely attenuated, giving silence).

ablprop

This is excellent. Thank you. I have questions:

1) So the highly ordered state in the pre-inflationary universe is not the result of an unlikely initial state. Instead, it is the result merely of the fact that the universe was very small, meaning there just weren't that many states to occupy. Correct?

2) Inflation took this compact state (something like the compact state of a black hole, but perhaps not a singularity) and spread it out. But in spreading, inflation caused the universe to have hugely more potential entropy than it had possessed in its compact state. In the ball on a hill analogy, rather than raising the ball, inflation simply made the hill much, much larger, while keeping the ball relatively in the same position. Am I still on track?

3) So today, as hydrogen fuses into helium, plants absorb the resultant photons, we eat plants, and worms eat us (hopefully after some fun Saturday nights), we and the stars and the plants and the worms are sort of grabbing onto the energy liberated by this rolling ball as it heads back down toward maximum entropy. OK so far?

4) So now comes the big question: what caused this high-entropy, compact, black hole-like pre-inflationary universe to suddenly balloon up, and in so doing give us the potential for hydrogen, helium, plants, worms, and Saturday nights?

I look forward to responses.

hackenslash

Quote from: "ablprop"This is excellent. Thank you. I have questions:

1) So the highly ordered state in the pre-inflationary universe is not the result of an unlikely initial state. Instead, it is the result merely of the fact that the universe was very small, meaning there just weren't that many states to occupy. Correct?

That's pretty close. It should be noted that the singularity has not yet been established, and at least one of the proposed models for cosmic instantiation has no singularity. If that model is correct, then discussions about the entropic state of the pre-Planck cosmos are rendered somewhat moot, if not entirely obsolete.

Quote2) Inflation took this compact state (something like the compact state of a black hole, but perhaps not a singularity) and spread it out. But in spreading, inflation caused the universe to have hugely more potential entropy than it had possessed in its compact state. In the ball on a hill analogy, rather than raising the ball, inflation simply made the hill much, much larger, while keeping the ball relatively in the same position. Am I still on track?

Well, not quite the same relative position, because there is still an average progression toward greater entropy, but essentially yes.

Quote3) So today, as hydrogen fuses into helium, plants absorb the resultant photons, we eat plants, and worms eat us (hopefully after some fun Saturday nights), we and the stars and the plants and the worms are sort of grabbing onto the energy liberated by this rolling ball as it heads back down toward maximum entropy. OK so far?

Pretty good.

Quote4) So now comes the big question: what caused this high-entropy, compact, black hole-like pre-inflationary universe to suddenly balloon up, and in so doing give us the potential for hydrogen, helium, plants, worms, and Saturday nights?

That's the $64,000 question, and what the various models for cosmic instantiation are attempting to elucidate. Again, Brian Greene deals with one of these hypotheses quite beautifully in Fabric of the Cosmos, in his treatment of the 'inflaton field', along with a discussion of the Higgs field, which you can find from about p. 281 (paperback edition, 2005).

ablprop

So I guess now we wait for the LHC to find or not find whatever it is that it will find or not find.

It's a great time to be alive. Thanks again.

Inevitable Droid

Hackenslash, I'm having trouble equating the thermodynamic, statistical, and informational dimensions of entropy.  Please help.  I comprehend the thermodynamic - the fact that greater entropy means less energy is available to do work.  But I would have thought that less energy available to do work would mean less work could be done, and thus there would be less uncertainty as to possible configurations of the system.  Picture a bunch of automobiles with empty gas tanks parked in a parking lot.  If nobody puts gas in their tanks and nobody tows them away, those automobiles tomorrow will still be in the same places they are today.  Uncertainty is low, if nobody puts gas in their tanks or tows them away.  In my head, greater entropy means less uncertainty.  Where am I going wrong?  I have no doubt I went wrong somewhere.  I just can't pinpoint where.  Thanks for helping me (and maybe others) understand!

My current level of understanding, incidentally, includes reasonable clarity as to why entropy doesn't disprove evolution, as creationists used to argue and newbie creationists still do, until they realize this argument has been refuted so utterly that long-time creationist debaters no longer even bother to bring it up.  If the Earth were a closed system, entropy might preclude an ever-escalating diversity in the kinds of work that might occur, but the Earth is an open system, with energy continually pouring in from the sun.  Energy pouring in means more energy available to do work, hence less entropy.  The only reason the universe as a whole is imagined to be progressing into steadily greater entropy over all is because the universe as a whole is imagined to be a closed system.  If the universe as a whole were somehow discovered to be an open system, implying apparently that there are other universes (or at least one other) pouring energy into this one, then presumably science would postulate at least the possibility that entropy won't progressively increase over all in our universe as a whole.  If any of that is erroneous, your correction would be appreciated.
Oppose Abraham.


In the face of mystery, do science, not theology.

DropLogic

Quote from: "Inevitable Droid"Hackenslash, I'm having trouble equating the thermodynamic, statistical, and informational dimensions of entropy.  Please help.  I comprehend the thermodynamic - the fact that greater entropy means less energy is available to do work.  But I would have thought that less energy available to do work would mean less work could be done, and thus there would be less uncertainty as to possible configurations of the system.  Picture a bunch of automobiles with empty gas tanks parked in a parking lot.  If nobody puts gas in their tanks and nobody tows them away, those automobiles tomorrow will still be in the same places they are today.  Uncertainty is low, if nobody puts gas in their tanks or tows them away.  In my head, greater entropy means less uncertainty.  Where am I going wrong?  I have no doubt I went wrong somewhere.  I just can't pinpoint where.  Thanks for helping me (and maybe others) understand!
Left alone long enough those vehicles will most certainly evaporate completely.

Inevitable Droid

Quote from: "DropLogic"Left alone long enough those vehicles will most certainly evaporate completely.

If so, then that's another way of saying that entropy and greater certainty go together.  But apparently it's incorrect to say that.  What I don't understand is why.  Hopefully Hackenslash or someone else of higher calibre than myself will lead me (and perhaps others) out of the wilderness of ignorance.

hackenslash

Quote from: "Inevitable Droid"Hackenslash, I'm having trouble equating the thermodynamic, statistical, and informational dimensions of entropy.  Please help.  I comprehend the thermodynamic - the fact that greater entropy means less energy is available to do work.  But I would have thought that less energy available to do work would mean less work could be done, and thus there would be less uncertainty as to possible configurations of the system.  Picture a bunch of automobiles with empty gas tanks parked in a parking lot.  If nobody puts gas in their tanks and nobody tows them away, those automobiles tomorrow will still be in the same places they are today.  Uncertainty is low, if nobody puts gas in their tanks or tows them away.  In my head, greater entropy means less uncertainty.  Where am I going wrong?  I have no doubt I went wrong somewhere.  I just can't pinpoint where.  Thanks for helping me (and maybe others) understand!

Where you're going wrong is to treat the usages as equal, when they are not. They are related but not the same. That's pretty much the point of the above, because it shows how usage is different between various branches of science. I'll break it down with a list:

1. Physics = Energy that is not available.
2. Statistical mechanics = disorder.
3. Information theory = uncertainty.

It's a mistake to treat the three concepts as the same thing; it's just useful to note the relationship between them. In the case you cite above of the automobiles, without the input of energy they would remain in the same place. When employing entropy in the physical definition, this is a state of high entropy. In the case of statistical mechanics, this is a state of low entropy, and in the case of information theory, it's also low entropy.

This is the entire root of the problem, and the reason that even scientists have trouble grasping entropy, and falsely conflate entropy in energetic processes with disorder.

QuoteMy current level of understanding, incidentally, includes reasonable clarity as to why entropy doesn't disprove evolution, as creationists used to argue and newbie creationists still do, until they realize this argument has been refuted so utterly that long-time creationist debaters no longer even bother to bring it up.  If the Earth were a closed system, entropy might preclude an ever-escalating diversity in the kinds of work that might occur, but the Earth is an open system, with energy continually pouring in from the sun.  Energy pouring in means more energy available to do work, hence less entropy.  The only reason the universe as a whole is imagined to be progressing into steadily greater entropy over all is because the universe as a whole is imagined to be a closed system.  If the universe as a whole were somehow discovered to be an open system, implying apparently that there are other universes (or at least one other) pouring energy into this one, then presumably science would postulate at least the possibility that entropy won't progressively increase over all in our universe as a whole.  If any of that is erroneous, your correction would be appreciated.

Well, there is still a bit of misunderstanding here. Entropy will still increase overall on average*, although it may decrease statistically at a local level. There is a real problem that you've highlighted with the use of the word 'universe', which is one of my pet peeves, and a misunderstanding that I really wish physicists and cosmologists would attempt to address, because it causes a great deal of confusion among lay people.

There is only one universe, and there can only be one universe, because it's what the word means. That's not to say that nothing lies outside our cosmic expansion, only that our cosmic expansion is only part of the universe. The universe is literally 'that which is', and includes anything that lies outside our expansion, if anything. The history of this usage is pretty clear, but it has become muddied in the last 70 years or so due to an expansion in our understanding. It might be useful to deal with the history of the word to clarify:

The word universe has always meant 'everything that exists'. When the word was first applied, it was thought that what we could see was all there was. Hubble, of course, changed all that. He discovered that what were thought of as clouds of gas, or 'nebulae', inside our galaxy (a word which had no real meaning before Hubble), were actually clusters of stars outside our own galaxy, and were galaxies in their own right. This expanded what we thought of as the universe, but the word still meant 'that which exists'. When later physical theories were put forward to deal with how our cosmic expansion began, cosmologists came up with different terms, because they thought that people had an idea what was meant by 'universe' and expanding what was thought of as the universe would only be confusing. In reality, it was the failure to keep to the original definition of 'universe' that caused all the confusion. Worse, it actually allowed equivocation in terms of precisely what cosmologists were talking about. One of the earliest definitions given, and you can still find this definition in dictionaries, was, 'the entirety of time and space and all matter and energy contained therein'.

Now, many people think that time began at the big bang, but this is far from having been established. In reality, time and space are facets of a single entity, spacetime, and although Hawking and Penrose once showed that relativity dictated a beginning of time at the big bang, both have since withdrawn that claim, because they realised that Einstein's work didn't take quantum effects into account, and that the mathematics couldn't deal with the cosmos once it got to the Planck epoch, when it was necessary to take quantum effects into account due to the scale of things. This is the reason that the unification of QM and GR is described as the biggest problem in physics. In short, we cannot assert that time began at the big bang. In that light, and since we have models on the table that have time existing before the big bang, we have an extension on what constitutes the universe. Indeed, since some of those models constitute a means of inputting energy into our cosmic expansion, it is even unclear just what kind of thermodynamic system our cosmos actually is!

Coming back to your question, it may be worth expanding a little on thermodynamic systems, as this should clear up the misunderstanding.

There are several types of thermodynamic system, broadly categorised into three groups:

1. Open.
2. Closed.
3. Isolated.

An open system is a system that allows transfer of energy/matter (hereinafter called 'stuff', since they are the same thing in different states, as per E=mc^2) in both directions across its boundary. The Earth is an open system, because it allows the transfer of stuff across its boundary. It takes in energy in the form of radiation from the sun, as well as matter in the form of space-borne debris. It also evaporates energy across the boundary in the other direction, in the form of heat dissipation into space.

A closed system is a system that allows the transfer of energy in both directions across its boundary, but not matter. An example of this would be a greenhouse.

An isolated system is one that is completely isolated from its environment, and allows no stuff to cross its boundary. An example of this would be something like a gas cylinder that is completely insulated.

In practice, isolated systems are almost certainly impossible, because they are still penetrated by fields, such as gravity, which constitute the ability to do work, hence energy**.

In each of the above systems, there are subsystems that define just what kind of transfer is allowed and in which direction across the boundary, but that's enough info to deal with the topic under discussion.
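The three categories reduce to a simple two-flag classification, sketched here (my illustration of the taxonomy above, not standard terminology from any library):

```python
# Classify a thermodynamic system by what may cross its boundary.
# Note: matter always carries energy, so matter crossing implies 'open'.
def classify_system(energy_crosses: bool, matter_crosses: bool) -> str:
    if matter_crosses:
        return "open"       # both stuff and energy cross (e.g. the Earth)
    if energy_crosses:
        return "closed"     # energy but not matter crosses (e.g. a greenhouse)
    return "isolated"       # nothing crosses (an idealisation in practice)

print(classify_system(True, True))    # open
print(classify_system(True, False))   # closed
print(classify_system(False, False))  # isolated
```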

Our local cosmic expansion is almost certainly a closed system, as opposed to an isolated system, but not necessarily. Certainly, the input of energy at the beginning would define it as such. Whether or not transfer of energy is still allowed by the boundaries (even assuming the cosmos has boundaries. See Hawking-Hartle 'no boundary' proposal for more on this) is again not clear at this point.


And once again, what was supposed to be a brief post dealing with minor points turns into War and Peace. :lol:

*Strictly, it doesn't necessarily increase, it only necessarily doesn't decrease overall.
** One of the biggest unsolved problems in physics is to measure the mass of gravity. If you want a futile research project that might pay huge dividends, this is a good contender.

Inevitable Droid

Quote from: "hackenslash"Where you're going wrong is to treat the usages as equal, when they are not. They are related but not the same. That's pretty much the point of the above, because it shows how usage is different between various branches of science.

First, hackenslash, I want to thank you for your comprehensive reply.  Posts like yours don't get written in a matter of mere moments.  I appreciate your investment of time and energy.

Back on topic, I see that I was experiencing semantic dissonance if I may coin the term. :)

I could go with cosmos for everything contained in our local space/time bubble and universe for the sum of all space/time bubbles.  Of course that implies the absolute universality of space/time, perhaps a false implication.

QuoteThere are several types of thermodynamic system, broadly categorised into three groups:

1. Open.
2. Closed.
3. Isolated.

Interesting.  Thanks for the clarification.  So closed only means closed to matter, whereas isolated means closed even to energy.

QuoteIn practice, isolated systems are almost certainly impossible, because they are still penetrated by fields, such a gravity, which constitute the ability to do work, hence energy**.

In that case, then, stasis won't be absolute anywhere.

QuoteOur local cosmic expansion is almost certainly a closed system, as opposed to an isolated system, but not necessarily. Certainly, the input of energy at the beginning would define it as such. Whether or not transfer of energy is still allowed by the boundaries (even assuming the cosmos has boundaries. See Hawking-Hartle 'no boundary' proposal for more on this) is again not clear at this point.

If merely closed, not isolated, then stasis will never take dominion of our space/time bubble.  Good news for our descendants a trillion years from now. :)

hackenslash

Quote from: "Inevitable Droid"First, hackenslash, I want to thank you for your comprehensive reply.  Posts like yours don't get written in a matter of mere moments.  I appreciate your investment of time and energy.

My pleasure. In reality, this is my raison d'etre in many ways. Moreover, I am in the middle of writing a book, and the topic of entropy is going to form an entire (probably quite long) chapter, so discussions like this are extremely helpful in streamlining my thinking with respect to the topic, as well as raising interesting questions that I may not have considered for inclusion.

QuoteBack on topic, I see that I was experiencing semantic dissonance if I may coin the term. :)

You'll never know just how many arguments I've had about that, including with physicists.

QuoteI could go with cosmos for everything contained in our local space/time bubble and universe for the sum of all space/time bubbles.  

I concur, although it should be noted that both words actually mean the very same thing, one from Latin and the other from Greek. It does make sense to use them thus, though, and I generally do in shorthand, but discussions like this one in some contexts can get quite stuck up on semantics, which is why I employ the longhand 'our local cosmic expansion'. I have had many, many discussions of this stuff, and I arrived at this usage through many misunderstandings, sometimes bordering on the psychotic, as Tank will no doubt attest.

QuoteOf course that implies the absolute universality of space/time, perhaps a false implication.

And now you have begun to touch upon another interesting topic, namely the absoluteness of space and time, which, taken separately, they are not. What is absolute, though, is spacetime, the single unification of space and time. This is precisely what Einstein showed us. It has also led to the misunderstanding that there is something special about light, which is a major red herring, and another topic I intend to address in the book. I won't deal with it here, though. Suffice it to say that it is not light that is special, it is the speed at which it travels through space, and what it is, precisely, that this is telling us about the universe. I may begin another thread on this at some point in the near future, because it's another of those things that I need to knock around a bit to make sure I have made it all understandable. It's actually very simple stuff to get your head around, it's just a bit counter-intuitive. If you can understand Pythagoras' Theorem, you can understand relativity and, by corollary, why light travels at the speed it does and the implications of that.

QuoteIn that case, then, stasis won't be absolute anywhere.

That's difficult to know, in reality. It is possible that we will know in the relatively near future whether or not gravitational influences (and other fields) extend beyond the cosmos (now used here advisedly) or whether they're completely contained. Until we know more about the boundaries (with due reference to my caveat earlier) and indeed the dimensional manifold than we do now, the jury is out.

QuoteIf merely closed, not isolated, then stasis will never take dominion of our space/time bubble.  Good news for our descendants a trillion years from now. :)

That depends. We can't know yet if there can be further input after the initial input at the Big Bang. It may be that the boundaries of our extended dimensions constitute a system that, although once closed (allowing the input of energy), is now isolated (allowing no input). It would still be properly described as closed, rather than an isolated system, simply because there has been input. Certainly, though, the current consensus is that no further input is likely, and that the fate of the universe is a state of thermal equilibrium, or maximum entropy. Of course, this doesn't take quantum effects into account, so the picture is far from complete.

Inevitable Droid

Quote from: "hackenslash"
QuoteBack on topic, I see that I was experiencing semantic dissonance if I may coin the term. :crazy:

An excellent term. Mind if I steal it?

Not at all.  Please do!

QuoteIt's worth noting that information theory has its own problems of this nature, not least because there are two very distinct formulations of information theory, one being the aforementioned Shannon information, which deals with signal integrity, and Kolmogorov information, which deals with information storage.

Going purely on what I've gleaned from your post, I think a distinction between information and data might be helpful.  I don't think there's any utility in calling something information unless and until an observer perceives and interprets it.  In a universe with no observers there would be no information, but there could still be data, if machines or other artifacts (like books, for instance) are storing it.  Kolmogorov, then, would be theorizing about data, rather than information.  Noise is the opposite of information but might not be the opposite of data.  Shannon would be theorizing about information.

QuoteWell, again, the terms are related, but this is precisely why even physicists have problems with it, and this has been a major niggle of mine at popular science writers for a very long time. Even Sagan, bless him, described entropy as disorder, from the perspective of a physicist.

I would suggest that order and information are head and tail of the same coin.  What exists subjectively as information, exists objectively as order.  It would make perfect sense to me if an information theorist like Shannon were to study order and define entropy in terms of order.  It doesn't make sense to me for physicists to do that.  Physicists should be treating everything as data, or better yet, as force.  In a context where information is irrelevant, order likewise is irrelevant.  As far as I know, physicists don't study information, and if that's the case, then neither do they study order, for the two are head and tail of the same coin.  That being said, physicists might in fact want to collaborate with information theorists, since, as it seems to me, information comes into being when an observer perceives and interprets it, and the phenomenon of observation has become a topic of interest for quantum physicists.