Science Forums

Thermodynamic Entropy is what physical occurrence?



Under the ideal conditions of the Carnot engine, there is no net change in entropy. Under the conditions of a similar engine that passes some heat through without converting it to work, there is a net transfer of entropy from the high-temperature source to the low-temperature sink, and the total entropy increases. What specific physical occurrence does the equation defining this change in entropy describe? This question pertains to thermodynamic entropy. I researched previous threads about entropy; however, I am not asking about informational entropy. I am asking about the physical basis for the existence of thermodynamic entropy. This is a physics question.
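
For reference, the defining relation in question is Clausius's

dS = \frac{\delta Q_{\mathrm{rev}}}{T}

and for the ideal, reversible Carnot cycle the entropy given up by the hot reservoir exactly balances the entropy absorbed by the cold one:

\frac{Q_H}{T_H} = \frac{Q_C}{T_C}

An engine that passes some heat through without converting it to work rejects more heat at the low temperature, so Q_C/T_C exceeds Q_H/T_H and the total entropy increases.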

 

James


It's really all down to a man named Rudolf Clausius.

 

The law of entropy, or the Second Law of Thermodynamics, is one of the bedrocks on which modern theoretical physics is based. The Second Law states that the entropy (or disorder) of a closed system always increases. Put simply, it says that things fall apart; disorder overcomes everything, eventually.

 

Peace out

:)

 

Entropy is the normal state of consciousness - a condition that is neither useful nor enjoyable


Hi Snoopy, I know your answer is offered to be genuinely helpful. Thank you for taking the time and effort. However, Clausius' statement was: "It is impossible to construct a device that, operating in a cycle, will produce no effect other than the transference of heat from a cooler body to a hotter body." Disorder is not mentioned. Disorder is a common aftereffect of the operation of an irreversible engine cycle. The amount of heat that escapes could also be stated as a measure of disorder. But heat and entropy are not the same thing.

 

The problem is that an increase in entropy usually results in an increase in disorder, but not always. [From What is Entropy?: "Let's dispense with at least one popular myth: 'Entropy is disorder' is a common enough assertion, but commonality does not make it right. Entropy is not 'disorder', although the two can be related to one another. For a good lesson on the traps and pitfalls of trying to assert what entropy is, see 'Insight into entropy' by Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Styer uses liquid crystals to illustrate examples of increased entropy accompanying increased 'order', quite impossible in the 'entropy is disorder' worldview."]

 

In my opinion, no one knows what entropy is. Answers such as "It is a measure of disorder" or "It is a measure of energy no longer available for work by the engine that lost it" are efforts to skirt around the issue by pointing to some resultant effects. In the Carnot engine, the high heat source loses entropy, but that drop in entropy is accompanied by a loss of heat, part of which is converted entirely into work.

 

A change in entropy is equal to the heat transferred divided by the temperature at which the transfer occurs. If the energy is lost heat, the loss will almost always result in increasing disorder. However, the amount of lost heat can remain the same while the temperature varies. The calculated entropy change would then also vary, even though the amount of lost heat and the resulting increase in disorder do not change.
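
A minimal worked example of that point, with numbers chosen only for illustration: let the same Q = 100 J of heat be lost at two different temperatures. Then

\Delta S = \frac{Q}{T} = \frac{100\ \mathrm{J}}{200\ \mathrm{K}} = 0.5\ \mathrm{J/K} \qquad \text{versus} \qquad \frac{100\ \mathrm{J}}{400\ \mathrm{K}} = 0.25\ \mathrm{J/K}

The same lost heat gives different entropy changes depending on the temperature at which it is lost.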

 

The most straightforward answer is that entropy is what the equation says it is. This is an admission that we have the mathematical means to calculate entropy even though we do not know what it is. We do know some common effects resulting from changes in entropy and, as is usually the case, this makes the limited knowledge we have useful.

 

James


James,

 

You state that you are not interested in information entropy, yet you are dissatisfied with the macroscopic nature of thermodynamic "heat" entropy as a fundamental property.

 

The problem seems clear. You are asking questions that directly involve statistical thermodynamics and you shouldn't hesitate to look there for the answer.

 

I found these two threads most illuminating:

 

http://hypography.com/forums/physics-mathematics/13716-information-entropy-roulette-wheel-etc.html

http://hypography.com/forums/physics-mathematics/13734-thermodynamic-information-entropy.html

 

They were an offshoot of this thread:

http://hypography.com/forums/physics-mathematics/13678-communication-entropy.html

 

I honestly learned a lot from reading these threads which seem directly related to what you are asking.

 

-modest


I don't know for sure that I saw everything that has been posted, but I did quite a bit of searching and reading to see if I could pick up on a previous thread. From what I saw, there seemed to be satisfaction with 'disorder' and 'unusable heat'. The threads seemed to be quickly directed toward information entropy, as if thermodynamic entropy were a subset of informational entropy. I think that Clausius did discover a true physical activity that was occurring. I think that it is important to learn what that activity is. I personally would not leave the subject of thermodynamic entropy until I felt I understood it.

 

Every subject I raise here is one which I think I now understand. However, if I stated my own conclusions straightforwardly, they would, understandably, be quickly dismissed. I will not be bringing them into the forum discussion because there is no way I could justify them in a forum setting. Instead, my intent is to stress what is not known and to emphasize the importance of making it known. I think it has been harmful to theoretical physics that parts or aspects of the fundamentals have been skipped over.

 

James


The threads seemed to be quickly directed toward information entropy, as if thermodynamic entropy were a subset of informational entropy.

 

I think Erasmus makes a very good case for that in the thread linked above. However, "information entropy" is not synonymous with "statistical thermodynamic entropy".

 

I think the statistical mechanics definition is what you're looking for. It is more fundamental than the wider macroscopic thermodynamics application. And yes, the one can be derived from the other.
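
For reference, that statistical definition is Boltzmann's

S = k_B \ln \Omega

where \Omega is the number of microstates consistent with the observed macrostate; the Clausius relation dS = \delta Q_{\mathrm{rev}} / T can be recovered from it for large systems.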

 

-modest


Hmm, yes James,

I thought, as I was typing my answer to you, that it would not be satisfying enough...

But I agree, entropy is shadowy...

However, I do think you should share your ideas; who cares if some people laugh or poke you in the eye over it...

If the ideas are creative enough I wouldn't worry, and it may get some thinking going. I would like to think that in Hypography such ideas would be allowed to be aired, and I am pretty sure they would be...

 

Peace


 

Nothing is more conducive to peace of mind than not having any opinions at all

Georg Christoph Lichtenberg


I think Erasmus makes a very good case for that in the thread linked above.

 

I probably read what you are referring to, but I am not sure where specifically you are pointing me. Is it to the Leo Szilard example? Or is it the probability analysis? I'll go look back. I may be remembering incorrectly.

 

However, "information entropy" is not synonymous with "statistical thermodynamic entropy".

 

Yes. But I thought I saw a strong preference for information entropy as the best definition.

 

I think the statistical mechanics definition is what you're looking for. It is more fundamental than the wider macroscopic thermodynamics application. And yes, the one can be derived from the other.

 

-modest

 

Statistical, yes, but what I am actually looking for is an interpretation of entropy that follows from a clear physical definition of what temperature is. Temperature is the undefined property in the defining equation of thermodynamic entropy. I see it as the hurdle that must be overcome before explaining thermodynamic entropy.
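
For reference, the statistical definition on offer for temperature is

\frac{1}{T} = \left( \frac{\partial S}{\partial E} \right)_{V,N}

that is, temperature measures how the entropy changes as energy is added; whether that counts as a clear physical definition is exactly the question.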

 

James


Hi Snoopy,

 

My work has been in the public arena of the Internet for years. I have no problem with presenting it to others. However, the forum setting just doesn't work well for contesting physics theory. Hypography is the only forum I actively participate in. I am not new to it. I think it works better than the others. There are some participants that I choose to put on ignore (there is also one moderator, but I can't put him on ignore). But, overall, there is a special quality about most of the moderators, editors, and participants in general that makes the interchange of ideas seem more personal in the positive sense. It's as if we are acquaintances that have already met. I think most people here really care about being a part of the best. Even here, though, I think it is necessary to limit what you hope to accomplish if the ideas are new and different. Besides, they may be wrong. There is no reason to get in too deep until you see whether you are making progress. Anyway, this seems like a good time to compliment Tormod and give recognition to the staff and members of Hypography.

 

James


Oh well,

 

It would have been nice to read your ideas...

 

Have you read Roger Penrose and his take on entropy?

 

I would recommend doing that if you haven't...

 

All that is left for me to say to you is the best of luck...

 

I have a thing for FTL communication and time travel

and I strongly believe entropy weaves itself through those things

so I was genuinely interested in what you were about to say in a not-saying way...

 

Sorry, I watch way too much Tigger and Pooh these days...

 


 

Peace again



James, as Modest has said plenty of times, I suggest you get a book on statistical physics and study it. I never liked thermodynamics, because the definition of entropy there was never clear to me. In the statistical physics course I even eventually understood why some say it is a measure of disorder.

If I remember right, entropy is linked to the logarithm of the number of states a system can have. For example, a system of 2 electrons would have an entropy linked to ln(2), because there are two possible states: spin up-down and spin down-up...
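
In symbols, a minimal sketch of that example, with k as Boltzmann's constant and \Omega the number of accessible states:

S = k \ln \Omega, \qquad \Omega = 2 \;\Rightarrow\; S = k \ln 2 \approx 0.693\, k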


 

I am not trying to take anything away from statistical analysis of possible states. What I am avoiding doing is skipping over what it was that was discovered in the original derivation of entropy. It is an unknown and it should not be. I think it is very important to get thermodynamic entropy correct. It is evidence of something that is physically important for us to know. I think there is a possibility that earlier missteps in the development of physics theory are what is making this problem difficult to solve.

 

Well, anyway, that is what I think and how I proceed. Thank you for your advice.

 

James


I am not trying to take anything away from statistical analysis of possible states. What I am avoiding doing is skipping over what it was that was discovered in the original derivation of entropy. It is an unknown and it should not be.

 

It WAS an unknown; the new understanding demonstrates that what it really is is a measure of uncertainty/information about a system. Thermodynamics allows you to derive a great deal about entropy (a mathematical form up to constants, the fact that it's an additive quantity, etc.). HOWEVER, thermodynamics is incapable of giving a "physical meaning" to entropy; only by revisiting the subject with statistics in hand do we arrive at a physical definition. This is the beauty of statistical mechanics: it gives us a physical underpinning for parts of thermodynamics.
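
In symbols, a minimal sketch of that underpinning: the Gibbs entropy over microstate probabilities p_i,

S = -k_B \sum_i p_i \ln p_i

reduces to Boltzmann's S = k_B \ln \Omega when all \Omega microstates are equally likely (each p_i = 1/\Omega), and, up to the constant k_B and the base of the logarithm, it has the same form as Shannon's measure of uncertainty. That is the sense in which entropy is a measure of uncertainty/information about a system.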

-Will

