Entropy is a favourite subject for pseudoscience, and also for reasoning by analogy in other fields (see e.g. EntropyReduction); being statistical in nature, it is very difficult to think about correctly.
Common analogies cast entropy in the role of disorder, or chaos, or decay, or sometimes complexity and confusion.
There are several technical definitions of entropy (which agree in some cases and disagree in others) used in probability, InformationTheory, physics, and computer science. ChrisHillman? authored a decent guide to many of these definitions which is now maintained by RolandGunesch? (see http://www.math.psu.edu/gunesch/entropy.html).

In StatisticalPhysics? entropy is one of the ensemble averages used to describe the state of a thermodynamic system, and perhaps the most important one. It has dimensions of [Energy]/[Temperature]. In thermodynamics it is generally only meaningful to speak of the entropy *change* that takes place during a process.
Left to their own devices, thermodynamic ensembles tend to spontaneously maximise their entropy. This corresponds to their becoming more "disordered". It can be argued that EntropyIsComplexity?, since a highly entropic system requires more information to capture its state than does a less entropic one.
*I thought that entropy is the inverse of information. Typically, a system has only one high-entropy state, but many, many low-entropy states. A low-entropy state therefore contains more information than a high-entropy one.*
In a high-entropy state less information is required to capture the *global* state (since the system will be in one of many possible states all of which are degenerate with respect to entropy, so you don't care which), but more information would be needed to capture the *local* state, since the constituent parts of the system will be more active.
A maximally entropic signal, for instance, would be like white noise. A minimally entropic signal would be, say, a repeating stream of some constant. On the one hand, the pure noise signal can't be compressed, since it has no structure; if it can't be compressed then it has, in some sense, the maximum information content possible for its bandwidth. On the other hand, the pure noise signal has no information content, because it's noise.
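The white-noise-versus-constant contrast can be made concrete by estimating Shannon entropy from byte frequencies. A minimal Python sketch (an illustration added here, not part of the original discussion):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte stream, in bits per byte."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

noise = os.urandom(100_000)      # "white noise": uniform random bytes
constant = b"\x00" * 100_000     # repeating constant: no surprise at all

print(shannon_entropy(noise))     # close to 8.0, the maximum for bytes
print(shannon_entropy(constant))  # exactly 0.0
```

The noise stream sits near the 8 bits/byte ceiling, while the constant stream carries no information at all, matching the paragraph above.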
*In lay terms, this is almost like saying that when pure information content becomes too high (for us), we tune out and perceive it generally as noise. Or, to summarize, noise is just an interpretation of signal overload. --WaldenMathews*

An interesting way to compare two states is to compare the total number of bits required to encode them (algorithm included). E.g. you have two text files of the same length. Using the same algorithm, one compresses to 50k and the other to 40k. The first (original, I suppose) has more entropy than the second. (This is off the top of my head, don't kill me if it's not perfectly correct.) -- RobHarwood. See KolmogorovComplexity.
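Rob's comparison can be sketched in a few lines of Python. This is a hedged illustration, not from the original page; zlib stands in for "the same algorithm":

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    # Size after DEFLATE compression: a rough, computable stand-in for
    # the number of bits needed to encode the data (algorithm included).
    return len(zlib.compress(data, level=9))

structured = b"the quick brown fox " * 500   # repetitive text: low entropy
jumbled = os.urandom(len(structured))        # random bytes: high entropy

print(compressed_size(structured))  # compresses to a small fraction
print(compressed_size(jumbled))     # barely shrinks at all
```

Comparing compressed sizes this way gives only an upper bound on KolmogorovComplexity (the true complexity is uncomputable), but for equal-length inputs the one that compresses worse is, in this sense, the higher-entropy one.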

Living systems are principally concerned with opposing this spontaneous increase in entropy. Attempts to reduce entropy locally always result in increases of entropy globally, ultimately through the phenomenon of *waste heat*.
An actual conversation--
[scene: a run-down student flat in Edinburgh]
Mark: this kitchen is a disgusting mess, clean it up you slob!
Keith: hey, it's entropy man, the more you fight against it the worse it gets.

The most evident effect of entropy is: "You have to fight hard for order, but disorder comes automatically."

cf. http://www.secondlaw.com and http://www.2ndlaw.com Also see SecondLawOfThermodynamics.

I've kind of thought of entropy as the tendency for a system to fill the available probability-space of all flexible parameters. In other words, if it is possible for a system to spread out in a certain direction (e.g. temperature distribution, spatial distribution), entropy is the tendency for it to spontaneously do so. My real-world example is: "The mess tends to expand to fill the available space". Another good one: "Once you take worms out of a can, the only way to get them back in is to use a bigger can". -- AndyPierce
*How did the worm-cannery get the worms into the can you've just taken them out of?*
They don't put worms in the can. They put worm-eggs in the can and wait for them to hatch.
*I've had a redworm farm in my basement for three plus years, but I never heard of putting the little buggers in cans. Don't think it'd be good fer 'em, either. And, when one of mine gets loose, I pluck him/her up with my fingers, as common sense would suggest. No need to get a big can. What is all this? --WaldenMathews*
Ah, so *you're* the one increasing the global entropy! Next time they tell me to clean up my office, I'll just say, "It won't do any good. That worm farm is using up all my order."

Information-theoretic entropy and thermodynamic entropy are equivalent! See HenryBaker's explanation of why a GarbageCollector is a refrigerator: http://www.pipeline.com/~hbaker1/ThermoGC.html and how to build ReversibleLogic, information-conserving computers: http://www.pipeline.com/~hbaker1/ReverseGC.html

"Entropy, An International and Interdisciplinary Journal of Entropy and Information Studies" is an OpenAccess OnlineJournal? (see http://www.mdpi.org/entropy/).


EditText of this page (last edited August 22, 2010) or FindPage with title or text search