all 23 comments

[–]Corpuscle 66 points67 points (9 children)

Sorry, this has been archived and can no longer be voted on.

Here we have a cardboard box, about the size of a shoebox, filled with coins. Not a ton of them, just enough so that they all lie flat on the bottom of the box.

There are three possible states for that box to be in: all coins heads-side-up, all coins heads-side-down, and the other thing where some of the coins are heads-up and some are heads-down.

Imagine the box starts out in the all-coins-up state. We put the lid on and shake it. Without opening the lid, what state do you expect the box to be in?

The answer is obvious: Some of the coins will be heads-up and some will be heads-down. Why? Because the all-heads-up and all-heads-down states correspond to exactly one arrangement of coins each, while there are many arrangements of coins that correspond to the some-up-some-down state.

The "all-up, all-down, some-of-both" states are what we call macrostates. They're the states we care about, the ones we can easily observe. The individual position and heads-up-or-down-ness of all the coins comprises what we call a microstate. It's a state that is normally invisible to us, hidden from view, either because we just don't care about that much detail, or because that much detail is practically impossible for us to measure.

Entropy is, in a sense, how many microstates correspond to a particular macrostate. In this example, the all-heads-up and all-heads-down macrostates each correspond to just a single microstate; that's a very low-entropy condition. But the some-of-each macrostate corresponds to many microstates, making that a high-entropy condition.

When we started out, all the coins were heads-side-up, but when we put the lid on the box and shook it, the system moved from a low-entropy state to a high-entropy state.

In nature, systems always tend to move from low-entropy to high-entropy states. In the most abstract sense, this is just because of pure dumb luck: There are more combinations of coins that add up to "some of each" than either "all up" or "all down," so pure random chance dictates that we're far more likely to go from the all-up state to the some-of-each state than the other way around … and furthermore, that as we continue to shake the box, we're far more likely to stay in the some-of-each state, because the odds against getting all the coins to land heads-side-up are enormous.
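The counting argument above can be put in concrete numbers with a few lines of arithmetic (a quick sketch; the choice of 20 coins is mine, purely for illustration):

```python
from math import comb

n = 20                      # number of coins in the box (illustrative)
total = 2 ** n              # 1,048,576 equally likely microstates

all_heads = 1               # exactly one microstate has every coin heads-up
all_tails = 1               # likewise for all heads-down
mixed = total - all_heads - all_tails

print(mixed)                # 1048574
print(mixed / total)        # ~0.999998: shaking almost surely gives "some of each"

# Even the single most likely head-count (10 of 20) dwarfs the extremes:
print(comb(n, 10))          # 184756 microstates with exactly 10 heads
```

So even for just 20 coins, random shaking lands in the "some of each" macrostate over 99.999% of the time, which is the "pure dumb luck" the comment describes.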

In reality, this use of pure-dumb-luck-based statistics to describe complex systems is a mathematical approximation. After all, things like the motions of molecules in a bathtub of water aren't really random. They're actually the product of a huge number of very simple interactions … but that's the thing. When you take something that's fundamentally simple but that becomes vastly complex because of sheer scale, that thing tends to behave very much like a purely random system governed by dumb luck. So it turns out those dumb-luck-based statistical approximations are actually incredibly useful and predictive.

So basically, entropy can be thought of as a way of quantifying just how likely or unlikely it is that a complex system will evolve in a particular way. If the evolution you're imagining is from a low-entropy state to a high-entropy state, in general that's pretty likely. If it's the other way around, from a high-entropy state to a low-entropy state, then in general that's probably not going to happen. The more complex the system you're thinking about, the better statistical methods tend to be for predicting the evolution of that system over time.

[–][deleted] 11 points12 points (4 children)

Words cannot express my gratitude for refraining from using the hackneyed and misleading "order/disorder" explanation.

[–]Corpuscle 5 points6 points (3 children)

You're most welcome.

For such a conceptually simple thing, entropy sure can be made to sound confusing.

[–]dittendatt 2 points3 points (0 children)

Good explanation of thermodynamic entropy!

[–]Mx-yz-pt-lk 0 points1 point (1 child)

Wow. I always thought entropy was "heat death". I'm not sure where I read that, though.

[–]Corpuscle 1 point2 points (0 children)

"Interesting things" can only happen because there are gradients of heat energy in the universe. Heat always flows from areas of high concentration to areas of low concentration, and as it does, interesting things happen. Interesting to us, I mean: things like steam engines and digesting food and us having this conversation.

If the universe ever got into a state where heat were uniformly homogenized, spread out so there are no areas of higher or lower concentration, there would be no more interesting things happening. Ever, because there'd be no remaining heat gradient to do the work of reestablishing one.

This is sometimes called the "heat death of the universe," that irreversible state in which the entire universe is in thermodynamic equilibrium and nothing interesting can ever happen again.

Entropy and heat are both very closely related to temperature, which is why entropy and heat death are often found in close proximity.

[–]FriedGold9k 0 points1 point (0 children)

I thought entropy had something to do with the depletion of energy?

[–]MurfDurfWurf 10 points11 points (0 children)

Entropy is the disorder of a system. The universe is moving toward more disorder, or entropy.

What does "disorder" mean? Well, chemical reactions that take place without any added energy create more entropy, or chaos. For example, when you mix baking soda and vinegar, you don't have to stir the mixture, or heat it, or even shake it; it just starts fizzing up and carries out the reaction.

This reaction increases entropy. What used to be two orderly substances, the solid baking soda and the liquid vinegar, is now all over the place, broken into multiple parts of gas and liquid, and generally more chaotic. You start with two substances and end up with three, one of which is a gas, and gases are chaotic because they go everywhere, unlike liquids and solids, whose molecules are somewhat bonded to each other.

If a reaction does not increase entropy, it requires energy. So if I wanted to make a cake, I can't just throw all the ingredients in a bowl and produce a cake. I need to heat it, or apply energy, to cause the reaction. That's because the reaction decreases entropy by taking five or six separate ingredients and turning them into just one finished product. Going from six separate parts to one decreases the chaos of the system. If I threw all the ingredients into the air, it would make a really bad mess and create disorder; a finished cake thrown in the air would be much easier to clean up.

Entropy is just the disorder of a system.

[–]Hindu_Wardrobe 3 points4 points (0 children)

Disorder is easy. Order, on the other hand, requires work.

Entropy is the overall natural tendency towards chaos, towards disorder.

Say you spend about 1 hour cleaning your room. You have put work into your room, to make it orderly.

Let's wait a week.

Lo and behold, your room is even messier than before! Why? Because disorder is easier to maintain than order. Disorder does not need to be maintained. Your room will get messy on its own. Your room will not get clean on its own.

Order requires work. Disorder simply happens without intervention.

Now apply this to the universe. The universe is very stubborn, and doesn't like cleaning its room. In fact it never does. It's got a lot of space, so why should it? It's just gonna keep getting messier and messier, for an amount of time our monkey brains cannot comprehend.

/how a professor explained it to me, I thought it was a cute analogy

[–]waremi 3 points4 points (0 children)

Entropy is one of those simple, fundamental aspects of reality that you can read a dozen definitions of and still not fully comprehend. The best "explanation" I've ever read was "The Last Question" by Isaac Asimov: http://bachiller.sabuco.com/ingles/eloy/1bach/last_question.pdf

[–]rastawarfare 0 points1 point (0 children)

Professor Brian Cox explains that quite well in this video, if you have 5.33 minutes to spare: http://www.youtube.com/watch?v=uQSoaiubuA0. It's literally ELI5.

[–]JTSnidely -1 points0 points (0 children)

It just isn't what it used to be.

[–]Natanael_L -1 points0 points (0 children)

In computer science: The randomness of something. 8 bits of entropy can take 2^8 = 256 different possible values, each with roughly the same probability.
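The numbers in that definition check out directly (a minimal sketch; the uniform distribution over the 256 values is the assumption):

```python
import math

# 8 bits of entropy: 2**8 = 256 equally likely values.
values = 2 ** 8
print(values)                                   # 256

# Shannon entropy of a uniform distribution over 256 values, in bits:
h = -sum((1 / values) * math.log2(1 / values) for _ in range(values))
print(h)                                        # 8.0

# For a uniform distribution this collapses to just log2 of the count:
assert h == math.log2(values) == 8.0
```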

In HVAC and science that deals with heat: the level of disorder or unavailable energy.

I think there are more definitions too.

[–]rrohbeck -1 points0 points (0 children)

At the microscopic level, all interactions between atoms and molecules are random due to quantum mechanics.

If you have a number of interactions, statistically the more "random" the outcome, the more probable it is. Take an empty box (vacuum), put 10 molecules right in the center, and let them bounce around for a while. The chance of all of them congregating in the center again is minuscule compared to them being spread out in various ways. Take a large number of molecules (remember N_A ≈ 6×10^23) and the chance of them congregating in the center is way beyond astronomical.
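That "minuscule" can be estimated with a toy model (the model is my assumption, not the comment's: carve the box into equal cells and treat each molecule's position as independent and uniformly random):

```python
# Toy model: divide the box into n_cells equal cells; each molecule
# independently lands in a uniformly random cell. The chance that every
# molecule sits in the single central cell is then (1/n_cells)**n_molecules.
def p_all_in_center(n_molecules, n_cells=10):
    return (1 / n_cells) ** n_molecules

print(p_all_in_center(10))      # ~1e-10: already minuscule for 10 molecules

# For a mole of gas (~6e23 molecules) the probability is 10**(-6e23),
# far too small to even represent as a floating-point number.
```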

It turns out that putting them all in the center takes energy (putting them in a small volume means high pressure), and when they're allowed to relax, energy gets soaked up again (expanding gases cool). They don't like being close together, but it has little to do with electrostatic repulsion (except right when they're bumping into each other) and everything to do with statistical mechanics.

So: highly ordered state = high energy, unordered state = low energy. The amount of disorder is called entropy, a measure of the number of possible states (placements of the molecules, in the example) that "look the same."

We gain energy from entropy differences. The energy for all of life comes from the gradient between low-entropy sunlight and the high-entropy night sky. The same goes for the gradient between low-entropy high-temperature/pressure steam and the high-entropy low-temperature/pressure "sink", i.e. the environment, in a power plant.

[–]Lost_in_here -1 points0 points (0 children)

This should help. It's how I learned. http://www.youtube.com/watch?v=squ3RfbkgWI

[–]dittendatt -1 points0 points (0 children)

Since there is already a good explanation of thermodynamic entropy by Corpuscle, I will try to explain information theoretic entropy.

Information-theoretic entropy measures the uncertainty of some unknown thing. For simplicity, the entropy of a coin flip is defined as one. The entropies of independent unknowns add, so for example the entropy of flipping three different coins (head head tail is not the same as tail head head) is three. (From this one can derive a formula: Entropy = -p1*log2(p1) - p2*log2(p2) - p3*log2(p3) - ...)

Entropy is useful in information theory because it can be used to predict how much bandwidth or hard-drive space is needed, on average, for sending or storing the outcome of a certain event. It turns out that the entropy is the number of bits required, so for instance storing the result of a coin flip takes one bit.
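The formula and the coin examples above can be sketched in a few lines (the function name is mine; the probabilities are those of fair and biased coins):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) over outcomes with p > 0, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One fair coin flip: entropy 1 bit (the unit chosen in the comment).
print(shannon_entropy([0.5, 0.5]))              # 1.0

# Three independent fair coins: 8 equally likely sequences; entropies add.
print(shannon_entropy([1 / 8] * 8))             # 3.0

# A biased coin is more predictable, so it carries less than a bit:
print(shannon_entropy([0.9, 0.1]))              # ~0.469
```

The last line illustrates the storage claim too: outcomes of a biased coin can, on average, be compressed below one bit each.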

[–]Mortarius -1 points0 points (2 children)

Stuff eventually evens out, unless you meddle with it. Temperature and pressure equalize.

Low entropy means order: temperature and pressure differ from place to place. High entropy means chaos: stuff eventually mixes up, and runs out of power to maintain the low temperature of your fridge or the high temperature of your fireplace.

Earth is not a closed system, so our atmosphere has places of high pressure and low pressure; we have tides from the tidal forces of the Moon and heat from the nuclear reactions of the Sun. However, the Sun is constantly burning fuel and increasing entropy, and the Moon constantly slows the Earth's rotation to make those tides. Entropy rises.

Assuming that the Universe is a closed system, it means that eventually every star will run out of fuel to burn and, in the end, everything will have the temperature of the surrounding vacuum.

[–]Saraphite[S] -1 points0 points (1 child)

Well that's depressing.

[–]Mortarius -1 points0 points (0 children)

Here's a short story about reversing entropy to cheer you up :)

Also, if you prefer listening, here's an audio version: click.

[–]dasuberchin -1 points0 points (0 children)

Entropy is energy balancing out. For example, take a glass of ice water and leave it in your living room. After a while, the glass gets warmer and the ice begins to melt. That's because the higher concentration of energy in the room is flowing into the lower concentration of energy in the glass. By leaving the glass out, the 'entropy' of the glass and room balances out.

Entropy applies to any 'system', be it warming your house in winter, your car overheating, or the universe. Any concentration (or lack) of energy will balance with its surroundings. This will ultimately lead to the 'heat death' of the universe. As all of the stars run out of fuel and die out, all of the energy they released will spread throughout the universe. Anything the stars warmed up (like Earth) will start balancing its high concentration of energy with the low concentration in space.

Once everything is balanced out, I think the universe's ambient temperature (the "ultimate room temperature") will be only 2 or 3 degrees above absolute zero. If universal expansion continues, this will go down even more.

[–]rTrees_Ambassador -3 points-2 points (0 children)

I think you meant to say: "ELI5: [ENT]ropy"

You're welcome.