entropy and inflation

Started by ablprop, October 21, 2010, 01:48:42 AM


hackenslash

Quote from: "Inevitable Droid"Going purely on what I've gleaned from your post, I think a distinction between information and data might be helpful.  I don't think there's any utility in calling something information unless and until an observer perceives and interprets it.  In a universe with no observers there would be no information, but there could still be data, if machines or other artifacts (like books, for instance) are storing it.  Kolmogorov, then, would be theorizing about data, rather than information.  Noise is the opposite of information but might not be the opposite of data.  Shannon would be theorizing about information.

The distinction is already drawn. Data and information are actually interchangeable terms under pretty much any definition you can find. Indeed, most definitions of one will cite the other in order to avoid circularity. The real distinction between the two formulations of information theory is message, which is what Shannon deals with. In reality, though, the simple distinction between transmission and storage suffices for this purpose.

You are correct, though, in that information is the observational data in a system, although that information is still there without a mind to perceive it. It isn't very useful, because it requires a mind to be informed, but the same holds true for data.

QuoteI would suggest that order and information are head and tail of the same coin.  What exists subjectively as information, exists objectively as order.

Hmmm, I think you may be overthinking it. It happens because, as I say, entropy is ill-understood, even by scientists. Information is as well. One of the fora I spend a lot of time on is populated by tenured professional scientists (I'm not one, by the way, just a jobbing music producer) many of whom have struggled to grasp these concepts.

QuoteIt would make perfect sense to me if an information theorist like Shannon were to study order and define entropy in terms of order.  It doesn't make sense to me for physicists to do that.  Physicists should be treating everything as data, or better yet, as force.  In a context where information is irrelevant, order likewise is irrelevant.

Shannon does describe it in that manner, because a reduction in order is an increase in uncertainty, which is how Shannon defines entropy. As for physicists, there are good reasons to employ disorder as an analogy for entropy, because it actually holds water in some cases, and statistical mechanics is very important in physics (as is information theory), which is one of the reasons the confusion arises.

QuoteAs far as I know, physicists don't study information, and if that's the case, then neither do they study order, for the two are head and tail of the same coin.

Quite the contrary. Information is exactly what physicists study. Information, remember, is simply the observational data available with respect to a system, and systems are precisely what physicists study. Moreover, while stating that they study order would be misleading, it certainly forms a very major part of what they do study, because order comes about through the interaction of forces, and this is precisely what physicists do study.

QuoteThat being said, physicists might in fact want to collaborate with information theorists, since, as it seems to me, information comes into being when an observer perceives and interprets it, and the phenomenon of observation has become a topic of interest for quantum physicists.

They certainly do collaborate, although not necessarily explicitly. The only real scientific input I deal with in my day-to-day work is the Shannon–Nyquist sampling theorem, but it has to be borne in mind that much of what Shannon and Nyquist dealt with has massive implications in areas such as wave mechanics, a branch of physics, and in quantum mechanics, which deals heavily in the implications of wave mechanics at the quantum level: not only are the descriptions of all quantum events erected in terms of the wavefunction, but quantum mechanics also hinges on probability distributions, which are themselves described using wave mechanics.
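To make the sampling point concrete, here is a minimal sketch (the tone and sample rates are invented for illustration, not taken from any real session) of the aliasing that the Shannon–Nyquist criterion warns about: a tone can only be captured faithfully if the sample rate exceeds twice its frequency.

```python
# Minimal illustration of the Shannon-Nyquist criterion: a pure tone is only
# captured unambiguously if it is sampled at more than twice its frequency.
# All figures below are invented for illustration.
signal_hz = 6000.0         # illustrative audio tone
adequate_rate = 44100.0    # > 2 * 6000 Hz, so the tone is represented faithfully
inadequate_rate = 8000.0   # < 2 * 6000 Hz, so the tone folds back as an alias

def apparent_frequency(f, rate):
    """Frequency an ideal reconstruction would report after sampling at `rate`."""
    nyquist = rate / 2.0
    folded = f % rate
    return folded if folded <= nyquist else rate - folded

print(apparent_frequency(signal_hz, adequate_rate))    # 6000.0 Hz, as expected
print(apparent_frequency(signal_hz, inadequate_rate))  # 2000.0 Hz, an alias
```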

All things are connected, and this is where the confusion arises, and why physicists make these errors, especially when presenting their work to a lay audience who don't grasp the context in which those statements are made. When talking to other physicists, the meaning is generally understood and the context is taken for granted.
There is no more formidable or insuperable barrier to knowledge than the certainty you already possess it.

Inevitable Droid

Quote from: "hackenslash"
Quote from: "Inevitable Droid"Going purely on what I've gleaned from your post, I think a distinction between information and data might be helpful.

The distinction is already drawn. Data and information are actually interchangeable terms under pretty much any definition you can find.

First, I want to thank you again for your time and effort.  I also want to say that at this point I'm stepping outside science into philosophy.  If science is your only interest as far as this thread is concerned, then I will very much understand and respect your decision to let our conversation gracefully end, if so you decide.

Philosophically, then, I would have to say that if the terms data and information are defined in such a way as to be interchangeable, then in fact they aren't distinct.  Quite the opposite of any distinction having been drawn, it has instead been erased.  What I propose is to redefine the words in such a way that we in fact draw a sharp distinction.  I want it to suddenly be false that these terms are interchangeable.  

I suggest we define data as, "(1) order transposed onto a medium and stored there in such a way that it can be retrieved later in some fashion; (2) order communicated by one subject to another via the spoken word; (3) order encountered in the environment by a subject who is predisposed to perceive and interpret that order."  I suggest we define information as, "the perception and interpretation given to data by a subject."  I suggest we define order as, "(1) an attribute of reality which, if presented to a subject, would be data to be perceived and interpreted; (2) that which information informs us of."

By the above redefinitions I have made the terms data, order, and information interdependent but not interchangeable.  I think this is a worthwhile thing to do because there really are three distinct things here and they really are interdependent, and both the distinction and the interdependence are noteworthy, because noting them has mental utility for us.

I'll stop here and see if the above is of interest to you or to anyone else.
Oppose Abraham.


In the face of mystery, do science, not theology.

hackenslash

Quote from: "Inevitable Droid"First, I want to thank you again for your time and effort.  I also want to say that at this point I'm stepping outside science into philosophy.  If science is your only interest as far as this thread is concerned, then I will very much understand and respect your decision to let our conversation gracefully end, if so you decide.

Well, I have little time for navel-gazing. Philosophy is a useful tool for teaching one how to think, but beyond that it has no utility whatsoever.

QuotePhilosophically, then, I would have to say that if the terms data and information are defined in such a way as to be interchangeable, then in fact they aren't distinct.  Quite the opposite of any distinction having been drawn, it has instead been erased.  What I propose is to redefine the words in such a way that we in fact draw a sharp distinction.  I want it to suddenly be false that these terms are interchangeable.  

A little misunderstanding here. I wasn't talking about a distinction between data and information, but between the two treatments. Perhaps I could have been clearer there. There is no distinction between data and information; they are essentially the same thing. A datum is a single piece of information, while data are multiple pieces of information.

QuoteI suggest we define data as, "(1) order transposed onto a medium and stored there in such a way that it can be retrieved later in some fashion; (2) order communicated by one subject to another via the spoken word; (3) order encountered in the environment by a subject who is predisposed to perceive and interpret that order."  I suggest we define information as, "the perception and interpretation given to data by a subject."  I suggest we define order as, "(1) an attribute of reality which, if presented to a subject, would be data to be perceived and interpreted; (2) that which information informs us of."

Too much work for too little gain. I don't see enough distinction there to make it worthwhile, not least because it isn't just me you would have to convince. I suggest that this would get very confusing for very many people very quickly. I also don't really see the necessity for drawing any distinction, to be honest. As for order, it has no place in this discussion, because it isn't connected to data/information; it describes something else entirely, namely a qualitative property rather than a definition.

QuoteBy the above redefinitions I have made the terms data, order, and information interdependent but not interchangeable.  I think this is a worthwhile thing to do because there really are three distinct things here and they really are interdependent, and both the distinction and the interdependence are noteworthy, because noting them has mental utility for us.

No, I think that order is unconnected, while the other distinction is redundant and unnecessarily confusing. Indeed, this is the major reason I loathe most branches of philosophy. While setting out to clarify language, they actually have the opposite effect, and mangle it beyond all recognition, which I take as a personal affront, because language is something that is very important to me. Rare is the philosopher who can actually use language in a clear fashion or actually understands the true value of a semantic discussion.

I will resist this with every fibre of my being.

QuoteI'll stop here and see if the above is of interest to you or to anyone else.

I'll see what others may have to say, but I think I've made my position clear.
There is no more formidable or insuperable barrier to knowledge than the certainty you already possess it.

hackenslash

Just popped in to deliver this, which is an essay on entropy written for a kind of competition in another forum, and which is going to form the bones of the chapter on entropy for the book.

I hope you find it somewhat illuminating.

Quote from: "hackenslash"Order, Order!

Entropy in Thermodynamics, Statistical Mechanics, Evolution and Information Theory.

One of the most common misconceptions, and possibly the one most difficult to properly treat, is that evolution is a violation of the law of entropy. In this essay, I want to provide a comprehensive treatment of entropy. This will not be easy in the word limit (nor in the time I have to write this), not least because it is a concept horribly misunderstood, often because of the way it's treated in popular science books by people who really should know better. By the time I am finished, one of two things will have happened. Either I will have simplified entropy to the state where it can be easily understood, or I will have demonstrated that I don't actually understand it myself.

So what is entropy? Let's begin with what it is not, and a quote from a paper specifically dealing with entropy in evolution:

Quote from: "Styer 2008"Disorder is a metaphor for entropy, not a definition for entropy.

Metaphors are valuable only when they are not identical in all respects to their targets. (For example, a map of Caracas is a metaphor for the surface of the Earth at Caracas, in that the map has a similar arrangement but a dissimilar scale. If the map had the same arrangement and scale as Caracas, it would be no easier to navigate using the map than it would be to navigate by going directly to Caracas and wandering the streets.) The metaphor of disorder for entropy is valuable and thus imperfect. For example, take some ice cubes out of your freezer, smash them, toss the shards into a bowl, and then allow the ice to melt. The jumble of ice shards certainly seems more disorderly than the bowl of smooth liquid water, yet the liquid water has the greater entropy. [1]

The problem is that there are different definitions of entropy from different branches of science and, while they are all related, they're not actually equivalent. In thermodynamics, entropy is a measure of how much energy in a system is unavailable to do work. In statistical mechanics, it's a measure of uncertainty, specifically the uncertainty of a system being in a particular configuration. This can loosely be described as the number of different ways a system could be reconfigured without changing its appearance. This latter is also related to information entropy or Shannon entropy.

So, beginning with statistical mechanics: In statistical mechanics, entropy is a measure of uncertainty or probability. If a system is in an improbable configuration, it is said to be in a state of low entropy.

The classic analogy employed here is the desktop strewn with bits of paper. You can move one piece of paper without appreciably altering the appearance of the desktop. Statistically speaking, this configuration, or one of the many configurations it could have while still remaining untidy, is more probable than one in which the desktop is tidy. Thus, it is in a state of high entropy.
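To put rough numbers on that intuition, here is a toy sketch (the configuration counts are invented) using Boltzmann's relation S = k ln W, where W is the number of equivalent configurations:

```python
import math

# Toy version of the untidy-desktop analogy: compare the entropy of a state
# with many equivalent arrangements against one with only a few, using
# Boltzmann's S = k * ln(W).  The counts are invented purely for illustration.
k_B = 1.380649e-23  # Boltzmann constant, J/K

W_untidy = 10**6    # many arrangements look equally untidy
W_tidy = 10         # only a handful of arrangements count as tidy

S_untidy = k_B * math.log(W_untidy)
S_tidy = k_B * math.log(W_tidy)

print(S_untidy > S_tidy)  # True: the untidy state is the higher-entropy state
```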

This, of course, is where the idea of entropy as disorder actually comes from, and it does not reflect the definition of entropy employed in thermodynamics, although there is a relationship, as I shall show.

Moving on, let's deal with the definition of entropy in thermodynamics. This will require the laying of a little groundwork:

Firstly, let's look at what the Law of Entropy actually states, as even this is a source of confusion.

The Second Law of Thermodynamics states that, in general,  the entropy of a system will not decrease, except by increasing the entropy of another system. [2]

This is an important point. It does not state that entropy will always increase, as many suggest, only that it will not, in general, decrease. There are no physical principles that prohibit the persistence of a thermodynamic state (except with due reference to the Uncertainty Principle, of course). Indeed, entropy in this instance can be defined as a tendency towards equilibrium, which is just such a state! The measure of this tendency can be stated quite simply as the amount of energy in a system that is unavailable to perform work.
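As a minimal sanity check of the 'will not decrease' wording, here is a sketch of the usual two-reservoir bookkeeping, using the textbook relation ΔS = Q/T (the heat and temperatures are round numbers chosen for illustration):

```python
# Entropy bookkeeping for heat Q flowing from a hot reservoir to a cold one,
# using dS = Q/T.  The hot reservoir loses entropy, the cold one gains more,
# so the total never decreases.  Figures are illustrative round numbers.
def total_entropy_change(q_joules, t_hot_kelvin, t_cold_kelvin):
    """Total entropy change when heat q flows from the hot to the cold reservoir."""
    return -q_joules / t_hot_kelvin + q_joules / t_cold_kelvin

print(total_entropy_change(1000.0, 500.0, 300.0))  # +1.33 J/K: entropy rises
print(total_entropy_change(1000.0, 300.0, 300.0))  # 0.0 J/K: equilibrium, no change
```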

It is probably apposite at this point to deal with the three main classes of thermodynamic system. [3]

1. Open system: An open thermodynamic system is the easiest to understand. It is simply a system in which both energy and matter can be exchanged in both directions across the boundary of the system. Indeed, it is not stretching the point too much to state that an open system is one with no boundary. We define a boundary only because that defines the limits of the system, but from a thermodynamic perspective, the boundary is only a convenience of definition. This is distinct from the two other main classes of thermodynamic system, in which the boundary actually plays an important role in the operation of the system.

2. Closed system: A closed thermodynamic system is one in which heat and work may cross the boundary, but matter may not. This type of system is further divided based on the properties of the boundary. A closed system with an adiabatic boundary allows the exchange of work, but not heat, while a rigid boundary allows heat exchange, but no mechanical work.

3. Isolated system: An isolated system is a theoretical construct that, apart from the universe itself, probably does not exist in reality. It is a system in which no heat, work or matter can be exchanged across the boundary in either direction. There are two important things to note from my statement of this. The first is that my usage of the word 'universe' is in line with my standard usage, and does not describe 'that which arose from the big bang', but 'that which is'. The second is that we know of no system from which gravity, for example, can be excluded, and since gravity can apparently cross all boundaries, there can be no such thing as an isolated system within our universe unless a barrier to gravity can be found, hence the statement that there is probably no such thing as an isolated system except the universe itself.
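Summarising the three classes in code form (a hypothetical encoding, purely to tabulate which quantities may cross each kind of boundary):

```python
from dataclasses import dataclass

# Hypothetical summary of the three classes of thermodynamic system described
# above, recording what may be exchanged across each kind of boundary.
@dataclass(frozen=True)
class SystemClass:
    name: str
    exchanges_energy: bool   # heat and/or work across the boundary
    exchanges_matter: bool

OPEN = SystemClass("open", exchanges_energy=True, exchanges_matter=True)
CLOSED = SystemClass("closed", exchanges_energy=True, exchanges_matter=False)
ISOLATED = SystemClass("isolated", exchanges_energy=False, exchanges_matter=False)

for system in (OPEN, CLOSED, ISOLATED):
    print(system)
```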

Now that that's out of the way, let's attempt a rigorous definition of entropy in a thermodynamic sense.

Entropy in a thermodynamic system can be defined a number of ways, all of which are basically just implications of a single definition. Rigorously defined, it is simply a tendency toward equilibrium. This can be interpreted in a number of ways:

1. The number of configurations of a system that are equivalent.
2. A measure of the amount of energy in a system that is unavailable for performing work.
3. The tendency of all objects with a temperature above absolute zero to radiate energy.

Now, going back to our analogy of the untidy desktop, this can now be described as an open system, because heat, matter and work can be exchanged in both directions across its boundary. As stated before, this is a system in which statistical entropy is high, due to the high probability of its configuration when measured against other possible configurations (there are more configurations of the system that are untidy than there are configurations that are tidy). In  other words, and in compliance with the first of our interpretations above, it is a system which has a high number of equivalent configurations, since there are many 'untidy' configurations of the system, but only a few 'tidy' configurations. To bring the desktop to a state of lower entropy, i.e. a tidy desktop, requires the input of work. This work will increase the entropy of another system (your maid, or whoever does the tidying in your office), giving an increase in entropy overall. This, of course, ties the two definitions from different areas of science together, showing the relationship between them. They are not, of course, the same definition, but they are related. It is also the source of the idea of entropy as disorder.

In evolution, the role of the maid is played by that big shiny yellow thing in the sky. The Earth is an open system, which means that heat, work and matter can be exchanged in both directions across the boundary. The input of energy allowing a local decrease in entropy is provided in two forms, but mainly by the input of high-energy photons from the Sun. This allows photosynthesising organisms to extract carbon from the atmosphere in the form of CO2 and convert it into sugars. This is done at the expense of an increase in the entropy of the Sun. Indeed, if Earth and Sun are thought of as a single system, then a local decrease in entropy via work input by one part of the system increases the entropy of the system overall.
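A rough numerical sketch of that budget (using ΔS ≈ Q/T and round-number temperatures, so the figures are only illustrative): energy arrives as high-temperature sunlight carrying relatively little entropy and leaves as low-temperature re-radiated heat carrying much more, which is what leaves room for local ordering on Earth.

```python
# Why sunlight allows local decreases in entropy on Earth while the Earth-Sun
# system's entropy still rises overall.  Uses dS ~ Q/T with round-number
# temperatures; these are illustrative, not precise radiative-transfer values.
T_SUN_SURFACE = 5800.0   # K, approximate solar photosphere temperature
T_EARTH = 288.0          # K, approximate mean surface temperature of Earth

def entropy_budget(q_joules):
    """Entropy carried in by absorbed sunlight vs. out by re-radiated heat."""
    s_in = q_joules / T_SUN_SURFACE   # little entropy per joule at high temperature
    s_out = q_joules / T_EARTH        # the same energy re-radiated at ~288 K
    return s_in, s_out

s_in, s_out = entropy_budget(1.0e6)   # one megajoule, purely for illustration
print(s_out > s_in)   # True: far more entropy leaves than arrives
print(s_out / s_in)   # roughly 20x, the headroom that pays for local ordering
```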

Now, a little word on information entropy:

In information theory, there is a parameter known as Shannon entropy, which is defined as the degree of uncertainty associated with a random variable. What this means, in real terms, is that the detail of a message can only be quantitatively ascertained when entropy is low. In other words, the entropy of a message is highest when the largest number of random variables is involved in its transmission or reception. This shows a clear relationship between Shannon entropy and the definition of entropy from statistical mechanics, where we again have entropy defined as uncertainty, as formulated by Gibbs. A further relationship is shown when we compare the equations for entropy from statistical thermodynamics, formulated by Boltzmann and Gibbs in the 19th century, with Shannon's treatment of entropy in information. Indeed, it was the similarity of the equations that led Claude Shannon to call his measure of uncertainty 'entropy' in the first place!

Shannon:

H = -\sum_i p_i \log_2 p_i

Gibbs:

S = -k_B \sum_i p_i \ln p_i
The relationship here is clear, and the motivation for Shannon's labelling this parameter 'Entropy'.
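A small sketch makes the shared form explicit: evaluated over the same probability distribution (an invented one, purely for illustration), the two expressions differ only in the constant out front and the base of the logarithm.

```python
import math

# Shannon entropy H = -sum(p * log2(p)) and Gibbs entropy S = -k_B * sum(p * ln(p))
# evaluated over the same probability distribution, to show the shared form.
# The distribution is invented purely for illustration.
k_B = 1.380649e-23  # Boltzmann constant, J/K

probabilities = [0.5, 0.25, 0.125, 0.125]

H_shannon = -sum(p * math.log2(p) for p in probabilities)      # in bits
S_gibbs = -k_B * sum(p * math.log(p) for p in probabilities)   # in J/K

print(H_shannon)  # 1.75 bits of uncertainty
print(S_gibbs)    # the same sum, scaled by k_B and the natural logarithm
```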

Now, a bit on what is the most highly entropic entity that is currently known in the cosmos, namely the black hole. In what sense is it highly entropic? Let's look at those definitions again:

1. The number of configurations of a system that are equivalent: Check. This can be restated as the number of internal states a system can possess without affecting its outward appearance.
2. A measure of the amount of energy in a system that is unavailable for doing work: Check. All the mass/energy is at the singularity, rendering it unavailable.
3. The tendency of all objects with a temperature above absolute zero to radiate energy: Check. The black hole does this by a very specialised mechanism, of course. Energy cannot literally escape across the boundary, because to do so would require that it travelled at greater than the escape velocity for the black hole, which is, as we all know, in excess of c. The mechanism by which it radiates is virtual particle pair production. When an electron/positron pair is produced, via the uncertainty principle, at the horizon of the black hole, one of several things can occur. Firstly, as elucidated by Dirac [4], the equation for the electron admits solutions with positive charge and negative energy. Thus, a positron falling across the boundary imparts negative energy to the black hole, reducing the mass of the singularity via E=mc², while the electron escapes in a phenomenon known as Hawking radiation, eventually causing the black hole to evaporate. This is Hawking's solution to the 'information paradox', but that's a topic for another time. Secondly, as described by Feynman [5], we can have a situation in which the electron crosses the boundary backwards in time, scatters, and then escapes and radiates away forwards in time. Indeed, it could be argued that Feynman, through this mechanism, predicted Hawking radiation before Hawking did!

Now, the classic example of a system is a collection of gas molecules gathered in the corner of a box, representing a highly ordered, low-entropy state. As the gas molecules disperse, entropy increases. But wait! As we have just seen, the black hole has all its mass/energy at the singularity, which means that the most highly entropic system in the cosmos is also one of the most highly ordered! How can this be? Simple: entropy is not disorder.

Finally, just a few more words from the paper by Styer:

Quote from: "Daniel F Styer"This creationist argument also rests upon the misconception that evolution acts always to produce more complex organisms. In fact evolution acts to produce more highly adapted organisms, which might or might not be more complex than their ancestors, depending upon their environment. For example, most cave organisms and parasites are qualitatively simpler
than their ancestors.

So, when you come across the canard that evolution violates the law of entropy, your first response should be to ask your opponent for a definition of entropy, as it is almost certain that a) he is using the wrong definition and/or b) he has no understanding of what entropy actually is.



References:
[1] Daniel F. Styer, 'Entropy and Evolution', American Journal of Physics 76 (11), November 2008.
[2] http://www.upscale.utoronto.ca/PVB/Harr ... hermo.html
[3] http://en.wikipedia.org/wiki/Thermodynamic_system
[4] P.A.M. Dirac, 'The Quantum Theory of the Electron', 1928.
[5] http://www.upscale.utoronto.ca/GeneralI ... atter.html

Further reading:
Simple Nature: Benjamin Crowell
http://www.upscale.utoronto.ca/GeneralI ... tropy.html
There is no more formidable or insuperable barrier to knowledge than the certainty you already possess it.

ablprop

As usual, excellent stuff. I particularly enjoyed this bit:

Quote from: "hackenslash"In evolution, the role of the maid is played by that big shiny yellow thing in the sky. The Earth is an open system, which means that heat, work and matter can be exchanged in both directions across the boundary. The input of energy allowing a local decrease in entropy is provided in two forms, but mainly by the input of high-energy photons from the Sun. This allows photosynthesising organisms to extract carbon from the atmosphere in the form of CO2 and convert it into sugars. This is done at the expanse of an increase in entropy on the sun. Indeed, if Earth and Sun are thought of as a single system, then a local decrease in entropy via work input by one part of the system increases the entropy of the system overall.

An interesting question is this: what if that tomato plant in my yard hadn't intercepted that particular photon? What if, instead, that photon had missed the Earth entirely and kept traveling uninterrupted through space? Does entropy go up by less? If so, then is life on our planet, in its own little minuscule way, speeding the heat death of the universe?

Of course, whether there was life on our planet or not, sunlight would still hit it, and our planet would still then radiate away photons of longer wavelength and lower energy in response, so I suppose that any planet around a star speeds along the heat death of the universe (by a tiny amount).

Or has the entropy increase already occurred by the time the photon flies away from the star? To some extent it must have, because certainly the Sun doesn't care or know that the Earth, plants or no plants, has intercepted its photons. So the act of radiating must itself cause an increase in the entropy of the Sun. But I'm not sure I see what the mechanism is here. Is it merely that, by firing a photon away, the total mass of the Sun is spread thinner through the universe?

I found your description of how inflation changed the game extremely useful. I see entropy now as much more a process than a quantity. The universe, pushed by inflation into a state where entropy is much lower than it might have been, is in the process of moving toward that final high entropy state. Along the way, we living things can grab a little of that low entropy, transform it into high entropy, but in the process learn and love and enjoy being alive. Not a particularly cheery thought, maybe, but the mere fact that we could figure it out is more important than the final answer, anyway.