
Def. of Entropy


pgrmdave


How is entropy measured? If we know that systems tend to move towards higher entropy, then it must be a measurable and objective idea, right? It just seems to me that a lot of the examples given to show either entropy or order are somewhat subjective: crystals seem ordered just because we can see a pattern, while weather seems disordered simply because we can't see its pattern, even though one is there. So it cannot be based on human observers being able to see a pattern, and if it isn't based on human perception of pattern, then how is it measured?


Wiki has a good definition and explanation:

 

http://en.wikipedia.org/wiki/Entropy

 

The thermodynamic entropy S, often simply called the entropy in the context of thermodynamics, is a measure of the amount of energy in a physical system that cannot be used to do work. It is also a measure of the disorder present in a system. The SI unit of entropy is J·K⁻¹ (joule per kelvin), which is the same unit as heat capacity.

 

Entropy as a measure of disorder

 

We can view Ω (the number of microscopic configurations corresponding to a given macroscopic state) as a measure of the disorder in a system. This is reasonable because what we think of as "ordered" systems tend to have very few configurational possibilities, and "disordered" systems have very many. Consider, for example, a set of 10 coins, each of which is either heads up or tails up. The most "ordered" macroscopic states are 10 heads or 10 tails; in either case, there is exactly one configuration that can produce the result. In contrast, the most "disordered" state consists of 5 heads and 5 tails, and there are C(10,5) = 252 ways to produce this result (see combinatorics).
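
(Not part of the quoted article: a minimal Python sketch that counts these microstates directly with the standard library's math.comb.)

```python
# Count the microstates behind each macrostate of the 10-coin example.
from math import comb

N = 10  # number of coins

# Macrostate = number of heads; microstates = distinct arrangements giving it.
for heads in range(N + 1):
    print(f"{heads:2d} heads: {comb(N, heads):3d} microstates")

# 0 or 10 heads -> 1 microstate each (the most "ordered" macrostates)
# 5 heads       -> C(10, 5) = 252 microstates (the most "disordered")
```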

 

Under the statistical definition of entropy, the second law of thermodynamics states that the disorder in an isolated system tends to increase. This can be understood using our coin example. Suppose that we start off with 10 heads, and re-flip one coin at random every minute. If we examine the system after a long time has passed, it is possible that we will still see 10 heads, or even 10 tails, but that is not very likely; it is far more probable that we will see approximately as many heads as tails.
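
(Again outside the quote: a minimal Python sketch of that re-flipping experiment, assuming one coin is re-flipped per step.)

```python
# Start from the all-heads macrostate and re-flip one coin at random per step.
import random

random.seed(0)       # fixed seed so the run is reproducible
coins = [1] * 10     # 1 = heads; the "ordered" starting state

for _ in range(1000):
    i = random.randrange(len(coins))
    coins[i] = random.randint(0, 1)   # the re-flipped coin lands at random

print("heads after 1000 re-flips:", sum(coins))
# Typically close to 5 heads; any *particular* configuration (such as
# all heads) has probability 1/2**10 = 1/1024 once the system has
# had time to wander through its configurations.
```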

 

Since its discovery, the idea that disorder tends to increase has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the result ΔS ≥ 0 applies only to isolated systems; notably, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight. Nevertheless, it has been pointed out that the universe may be considered an isolated system, so that its total disorder should be constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source. Recent work, however, has cast extensive doubt on the heat death hypothesis and the applicability of any simple thermodynamical model to the universe in general. Although entropy does increase in an expanding universe, the maximum possible entropy rises much more rapidly and leads to an "entropy gap," thus pushing the system further away from equilibrium with each time increment. Furthermore, complicating factors such as the impact of gravity, energy density of the vacuum (and thus a hypothesized "antigravity"), and macroscopic quantum effects under unusual conditions cannot be reconciled with current thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.


Quote: How is entropy measured?

Entropy is defined statistically, and it is relevant in thermodynamics because of that underlying statistical basis.

 

Solely for the purposes of thermodynamics, one can define the variation of a body's entropy as the heat absorbed or released divided by the absolute temperature at which the exchange occurs. If the heat is not exchanged along an isotherm (that is, if the temperature changes during the exchange), the definition becomes an integral: ΔS = ∫ dQ/T. Under this definition only the variation of entropy is measurable, in an analogous fashion to potential energy.
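
A minimal Python sketch of that ΔS = ∫ dQ/T bookkeeping, assuming a body of constant heat capacity heated reversibly (the 4184 J/K figure for roughly a kilogram of water is just an illustrative choice):

```python
# For a body of constant heat capacity C heated reversibly, dQ = C dT, so
#   dS = dQ / T  integrates to  Delta S = C * ln(T2 / T1).
from math import log

C  = 4184.0   # J/K: heat capacity of ~1 kg of water (illustrative value)
T1 = 293.15   # K, initial temperature (20 degC)
T2 = 353.15   # K, final temperature (80 degC)

delta_S = C * log(T2 / T1)   # only the *variation* of S is defined this way
print(f"Delta S = {delta_S:.1f} J/K")   # ~779 J/K
```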

 

Statistics, however, defines the notion in absolute terms rather than as a variation, but that definition isn't so practical to measure for some 10^25 or 10^30 molecules or more!


Of course, the statistical definition is the "real" one and is also useful, for example, in cryptography along with transinformation. However, the question by pgrmdave seemed to be how to measure it.
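
For the information-theoretic side, here is a minimal Python sketch (not from the original post) of the Shannon entropy that transinformation (mutual information) is built from, with a few illustrative distributions:

```python
# Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([1.0]))        # 0.0 bits: no uncertainty at all
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin
```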

 

BTW, when I posted that, I didn't think to say that Q/T makes sense only when the temperature is well defined, which isn't always the case. But since entropy is a function of state, the variation doesn't need to be measured as the process actually occurs: the difference between the initial and final values can be computed along any convenient route between the same two well-defined states.
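
A minimal Python sketch of that route-independence, assuming a free expansion of one mole of ideal gas to twice its volume: no heat is exchanged during the actual process, yet ΔS follows from a reversible isothermal path between the same end states:

```python
# In a free expansion no heat is exchanged, but S is a state function, so we
# may evaluate Delta S along a *reversible isothermal* path between the same
# two states, where Delta S = n * R * ln(V2 / V1).
from math import log

R = 8.314        # J/(mol K), gas constant
n = 1.0          # mol (assumed for illustration)
V_ratio = 2.0    # final volume / initial volume

delta_S = n * R * log(V_ratio)
print(f"Delta S = {delta_S:.2f} J/K")   # ~5.76 J/K for a doubling
```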


Quote: Of course, the statistical definition is the "real" one and is also useful, for example, in cryptography along with transinformation. However, the question by pgrmdave seemed to be how to measure it.

 

Which is S = k_B ln W in the statistical-mechanics interpretation of entropy.
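
A minimal Python sketch plugging the coin example's W = 252 into that formula (the Boltzmann constant value is the exact 2019 SI one):

```python
# Evaluate S = k_B * ln(W) for the 5-heads macrostate of the 10-coin example.
from math import log

k_B = 1.380649e-23   # J/K, Boltzmann constant (exact since the 2019 SI)
W   = 252            # microstates of the most probable macrostate

S = k_B * log(W)
print(f"S = {S:.3e} J/K")   # ~7.6e-23 J/K: tiny for 10 coins,
                            # appreciable for ~10^25 molecules
```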

