A New Thermodynamics

www.newthermodynamics.com

By Kent W. Mayhew

Entropy: The parameter without meaning

 

   Entropy is the traditional thermal parameter that, when multiplied by temperature, gives energy. Clausius's mid-19th-century logic sounds simple even today. Yet of all the thermodynamic parameters, entropy remains the one that lacks any real clarity.

 

By the middle of the 20th century, entropy was accepted as being the "randomness of matter in incessant motion" (1). This assertion is based upon the statistical-thermodynamics understanding, or, if you prefer, Boltzmann's entropy (see eqn 11 below). It sounds so simple; upon hearing this in high school science, I simply accepted it without pondering what relevance randomness has to energy anyhow. It was a blind acceptance resulting in a false foundation, one that prepared me for my university indoctrination.

 

  Consider a barrel of hot water. Certainly we can envision that this barrel required, and then stored, energy as it was heated. But the heating of the water resulted in no real observable change in randomness at the macroscopic level. Even at the microscopic level, although the water molecules are vibrating more vigorously than they were before being heated, one might ask: "are the molecules really any more random?" We can certainly understand the argument that the molecular vibrational energies have increased, but to then claim that this equates to randomness is subjective at best.

 

Interestingly, Arieh Ben-Naim (2) points out that the interpretation of the randomness of gaseous molecules really lies in the eye of the beholder. Arieh gives several examples, and he is right in saying that the term "randomness" lacks scientific foundation.

 

Consider Clausius’s 19th century choice of the term “Entropy” (2): “I prefer going to the ancient languages for the names of important scientific quantities, so that they mean the same in all living tongues. I propose, accordingly, to call S the entropy of a body, after the Greek word “transformation”.  I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to be helpful” (2).

  

In the same book's preface, Ben-Naim (2) goes on to write that, right after quoting Clausius's explanation of his reasons for choosing the word "entropy", Leon Cooper (1968) commented: "By doing this, rather than extracting a name from the body of current language (say: lost heat), he succeeded in coining a word that meant the same thing to everybody: nothing."

   

Ben-Naim and I both agree with Cooper's assessment of entropy. Ben-Naim further rightfully states that energy and entropy are not analogous, and that "lost work" might not be the best choice either. In the context of my book, one can plainly see that a reason for entropy's initial conceptualization was to explain the inefficiencies of the steam engine and/or why the Carnot cycle/engine was idealistic. Of course, my simple explanation of lost work, based upon the displacement of Earth's atmosphere, was not realized by anyone before, Ben-Naim included. Rather, Ben-Naim's choice was to tweak the science into a bizarre yet interesting marriage of Shannon's information with traditional statistical considerations.

  

A more recent, 21st-century definition is that entropy is "the dispersal of a system's molecular energy" (3). This was Frank Lambert's beloved statement. It can certainly be argued that Frank's interpretation is an improvement; even so, I am sorry to say that Frank's definition of entropy also lacks real clarity.

    

Certainly, I do prefer the term dispersal over randomness, because it invokes the visualization that molecules and/or energy will naturally tend to disperse. Their dispersion is limited only by the constraints imposed upon a system, so long as the system is given sufficient time to attain equilibrium.

  

Constraints can take many forms. Walls, first and foremost, tend to contain gaseous molecules, preventing them from simply spreading outwards until their next collisions with other gaseous molecules. Gravity is the mother of all constraints, as it holds galaxies and all their various components together; it is certainly what keeps Earth's atmosphere locked onto Earth. And of course pressure (gravity-induced and/or otherwise) can be considered a force constraint upon a system.

 

   Note: I apologize for the plain-text symbols used below for parameters. In the book I use a proper math editor, but here in HTML I do the best I can. In any case, the importance lies in the logical points that are made rather than in the mathematical symbols.

   

 We can ask: mathematically, what is entropy?

   

  As previously stated, entropy lacks clarity. This is not for lack of effort by Frank Lambert, and countless others, to give it real linguistic meaning. The reason entropy lacks clarity is, in part, the numerous applications to which it is blindly alleged to apply. Consider the enthalpy (H) relation:

 

                 H=E+PV                                                     1a)

 

This can also be found written as:

   

                TS=H=E+PV                                       1b)

  

Based upon 1b), entropy (S) seems to be something that, when multiplied by temperature (T), gives something whose units are energy (SI units: joules). In English, TS equates to the internal energy (E) plus the pressure (P) multiplied by the volume (V). It sounds simple, but as written it is somewhat problematic; see the differential shuffle.

 

 Note: Enthalpy is most often written as H=E+PV, thus eliminating TS, which allows one to readily forget its association with entropy. However, the association with entropy is beheld in the thermodynamic relation for isothermal entropy change, which is often written:

 

 TdS=dE+PdV                                               2)

 

 For more limitations of eqn 2), click here

 

Integrating both sides of eqn 2) lends itself to TS=E+PV, hence allowing us to write 1b). This sort of constructive logic does make one ponder whether enthalpy was designed to hide the direct association with entropy.

   

  For an N molecule ideal gas we can write the ideal gas law:

 

                  PV=NkT                                                     3)

   

 Again it all seems so simple; so why does entropy lack clarity? Well, ask what 1) represents. Be it right or wrong, in high school I envisioned 1) as equating to energy, and happily went to bed at night pondering my teenage life. Then in university I learned kinetic theory, and that the total energy (Et) of a monatomic ideal gas is:

   

           Et= 3NkT/2                                                  4)
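
For concreteness, here is a minimal numerical sketch (with hypothetical values, not taken from the original text) evaluating eqn 3) and eqn 4) side by side for a monatomic ideal gas:

```python
# Minimal sketch (hypothetical values): eqn 3), PV = NkT, versus eqn 4), Et = 3NkT/2,
# for a monatomic ideal gas.
k = 1.380649e-23      # Boltzmann's constant, J/K
N = 1e22              # assumed number of molecules
T = 300.0             # K, assumed temperature

PV = N * k * T        # eqn 3): the pressure-volume product, in joules
Et = 1.5 * N * k * T  # eqn 4): total kinetic (internal) energy, in joules
print(PV, Et)         # Et is 1.5 times PV
```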

 

I was also taught that the heat capacity (Cy) times the temperature change (dT) gives a system's energy change (dQ):

 

                  CydT =  dQ                                                 5)

 

And finally that 5) can also be written in terms of entropy (S):

   

               SdT=dQ                                                      6)

 

Comparing 6) to 5), logic seemingly dictates:

   

                 S=Cy                                                          7) 

 

7) implies that entropy is nothing more than a form of heat capacity for any system. Heaven forbid such simple constructive logic be utilized. If we embrace 7), does this mean that, in going from absolute zero to the current temperature of a gas, dT = T - 0 = T, and that we might then boldly write for that ideal gas:

   

                  ST=CyT = 3NkT/2                                         8)

  

Although we cannot fully embrace 7), and hence 8), let us now compare them to 3). Similar constructive logic might give someone cause to write:

   

                     ST=3PV/2                                                  9)

   

 Surely 7), 8) and 9) all embrace constructive logic, but if they were applied to thermodynamics then we would end up with the following conundrum: why does 9) not equate to 3) for an ideal gas? Moreover, combining 9) with 1b), constructive logic might lend itself to writing for the internal energy (E):

   

                     E = PV/2                                                      10)

   

 Certainly not! 10) is NOT acceptable at any level.
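
To make the conundrum concrete, here is a minimal numerical sketch (hypothetical values, not from the original text) of the constructive-logic chain that leads from 9), via 1b), to the unacceptable 10):

```python
# Minimal sketch (hypothetical values): the constructive-logic chain of eqns 9), 1b)
# and 10) collides with kinetic theory for a monatomic ideal gas.
k = 1.380649e-23             # Boltzmann's constant, J/K
N = 1e22                     # assumed number of molecules
T = 300.0                    # K, assumed temperature

PV = N * k * T               # eqn 3)
E_kinetic = 1.5 * N * k * T  # eqn 4): the accepted internal energy, 3NkT/2
ST = 1.5 * PV                # eqn 9): ST = 3PV/2
E_from_10 = ST - PV          # eqn 1b), TS = E + PV, rearranged: eqn 10), E = PV/2
print(E_kinetic, E_from_10)  # they differ by a factor of 3: the conundrum
```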

 

Now ask: what is entropy (S)? Based on 7) it is a form of heat capacity, but it is also somehow related to enthalpy, which does not define the true energy of a system of monatomic gas. This lends itself to asking: is entropy

 

1) Something that, when multiplied by temperature, gives a system's enthalpy,

 and/or

2) Something that, when multiplied by temperature, gives a system's energy?

 

BUT a system's enthalpy is NOT necessarily equal to a system's energy. Consider a monatomic ideal gas: its enthalpy is H = E + PV = 3NkT/2 + NkT = 5NkT/2, but its energy is 3NkT/2. So entropy can be one or the other, but it cannot remain both! Does this not imply that, from a simple mathematical perspective, entropy lacks clarity? Certainly it does! And in part this explains why entropy has no clear, concise, literal definition.
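
As a quick check (again with hypothetical values, not from the original text), the enthalpy and the internal energy of a monatomic ideal gas are indeed different quantities:

```python
# Minimal sketch (hypothetical values): enthalpy H = E + PV versus internal energy E
# for a monatomic ideal gas; T*S cannot equal both, since H and E differ.
k = 1.380649e-23     # Boltzmann's constant, J/K
N = 1e22             # assumed number of molecules
T = 300.0            # K, assumed temperature

E = 1.5 * N * k * T  # internal energy, 3NkT/2
PV = N * k * T       # ideal gas law, eqn 3)
H = E + PV           # enthalpy, 5NkT/2
print(H, E)          # H is 5/3 of E
```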

  

Okay, we can avoid the use of the above constructive logic by limiting entropy to being something that, when multiplied by a temperature change, gives a system's energy change, thus ignoring entropy's relation to enthalpy. But doing so means that we are just fooling ourselves, by blinding our thoughts in terms of dH rather than d(TS). And it also means that we must reconsider what the enthalpy relation really is.

Even so, we have just scratched the surface of why entropy is so confusing as a thermodynamic parameter. Let us now investigate its somewhat more complex mathematical identity.

 

We can bury entropy further by saying that entropy is defined in terms of the number of microstates (@). That is, we now have a third (or is it a fourth?) mathematical definition for entropy, that being Boltzmann's entropy:

   

                   S = k ln(@)                                                     11)

 

Eqn 11) is fundamental to statistical thermodynamics. If energy is added to a system then the number of accessible energy states can increase. This embraces constructive logic.
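
As an illustration of this constructive logic (an assumed toy model, not from the original text), consider an Einstein-solid-like system of N oscillators sharing q quanta of energy: adding energy (more quanta) increases the count of accessible microstates, and hence the entropy of eqn 11):

```python
# Toy illustration (assumed model, not from the original text): an Einstein-solid-like
# system of N oscillators sharing q energy quanta has Omega = C(q + N - 1, q)
# accessible microstates, and S = k*ln(Omega) as in eqn 11).
from math import comb, log

k = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(N, q):
    omega = comb(q + N - 1, q)  # number of accessible microstates
    return k * log(omega)

N = 50                       # hypothetical number of oscillators
for q in (10, 20, 40):       # increasing total energy, in quanta
    print(q, entropy(N, q))  # entropy grows as energy is added
```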

 

Ideal Monatomic Gas: Isobaric isothermal expansion

 

Consider an ideal monatomic gaseous system that experiences an input of thermal energy, resulting in the isothermal isobaric expansion of the system. Accept that such an expansion must perform work onto the surrounding atmosphere, and that this work is energy lost by the expanding system, i.e. lost work. In a nutshell: the atmosphere has mass, an expanding system must upwardly displace this mass, and the work lost by the expanding system equals the potential-energy increase of the surrounding atmosphere, that being W=PdV.

 

 Herein the system's temperature does not increase, because the energy input into the expanding system equals the work required to upwardly displace the atmosphere's mass:

   

        (Energy input) = Wout = PdV            12)

 

In such an idealized isothermal case, since the temperature within the system remains constant, the system's energy also remains constant (ignoring any additional thermal radiation residing in the free space between the gaseous molecules).
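
A minimal numerical sketch of the lost work of eqn 12), using hypothetical values not taken from the original text:

```python
# Minimal sketch (hypothetical values): eqn 12), lost work W = P*dV for an isobaric
# expansion against the surrounding atmosphere.
P_atm = 101_325.0    # Pa, standard atmospheric pressure (assumed)
dV = 0.001           # m^3, assumed volume increase of the expanding system

W_lost = P_atm * dV  # J, work done onto (potential energy gained by) the atmosphere
print(W_lost)        # roughly 101 J
```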

 

The above seems simple enough, but now ask: what happens to eqn 11)? If there is no energy change within the system, does the number of accessible energy states increase? If you accept the traditional assertion that the number of accessible energy states is a function of volume, then you would, wrongly, answer yes.

However, logic dictates that if the total energy of a system remains constant, then the total number of accessible energy states should remain constant. This is the more correct answer, at least if you accept this author's new perspective. And it disavows any correlation between volume and the number of accessible energy states, which is to say that writing entropy change in terms of volume change is simply a mistake.

 

For example, chemists and others often write the isothermal entropy change of an ideal gas in terms of volume change; letting "i" and "f" signify the initial and final states:

   

                         dS = Nk ln(Vf/Vi)                                13)

 

Part of the reason eqn 13) is wrongly embraced is that scientists realized that energy was lost by expanding systems, and then wrongly decided that there is a correlation between a system's volume and its energy. This is misguided! Specifically, they did not realize that the reason the energy was lost is that the expanding system must do work onto the surrounding atmosphere (PdV). In other words, eqn 13) wrongly frames the work done in terms of the system's internal volume change rather than the external displacement of the surrounding atmosphere.

 

The above misconception is reinforced by numerous mathematical analyses. Specifically, eqn 13) is based upon the Sackur-Tetrode equation combined with the premise that the number of particles and the internal energy of an expanding ideal gas remain constant, i.e. Q = W = PatmdV. Note that the Sackur-Tetrode equation is traditionally accepted as defining the entropy of an ideal gas, whatever entropy means.
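
To show where eqn 13) comes from, here is a minimal sketch (hypothetical values, not from the original text) of the textbook Sackur-Tetrode entropy evaluated at constant N and internal energy U for two volumes; the difference reproduces Nk ln(Vf/Vi):

```python
# Minimal sketch (hypothetical values): the textbook Sackur-Tetrode entropy of a
# monatomic ideal gas, at constant N and U, reproduces eqn 13), dS = N*k*ln(Vf/Vi).
from math import log, pi

k = 1.380649e-23    # Boltzmann's constant, J/K
h = 6.62607015e-34  # Planck's constant, J*s
m = 6.6e-27         # kg, roughly the mass of a helium atom (assumed)

def sackur_tetrode(N, V, U):
    # S = N*k*( ln[ (V/N) * (4*pi*m*U / (3*N*h**2))**1.5 ] + 5/2 )
    return N * k * (log((V / N) * (4 * pi * m * U / (3 * N * h**2))**1.5) + 2.5)

N, T = 1e22, 300.0
U = 1.5 * N * k * T          # internal energy, held constant (isothermal premise)
Vi, Vf = 0.001, 0.002        # m^3, assumed initial and final volumes

dS_ST = sackur_tetrode(N, Vf, U) - sackur_tetrode(N, Vi, U)
dS_13 = N * k * log(Vf / Vi) # eqn 13)
print(dS_ST, dS_13)          # numerically equal
```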

 

You arrive at the same empirical answer thinking in terms of the expanding system's volume increase (i.e. the expanding system's randomness) as you do thinking in terms of the work lost into the surrounding atmosphere. But your science will eventually become a complication of the simple if you think in terms of an isolated system experiencing an increase in randomness! The logical answer is that a non-isolated expanding system displaces its surrounding atmosphere, i.e. does work onto the atmosphere.

 

Hopefully you the reader are now starting to understand the fundamental differences.

 

So what can we now say, if entropy is defined in terms of the number of accessible states, i.e. by eqn 11)? We can say: the number of accessible states should be a function only of a system's energy, and not of its volume. Hence, although eqn 13) is often empirically correct, it is based upon blundering logic. And this, as much as anything, demonstrates why thermodynamics needs to be rewritten.

 

One may now ask: why then does eqn 11) seemingly explain so much? It is a case of the power of statistical thermodynamics, but not of the logic that is bestowed onto it. Specifically, Boltzmann calibrated his mathematical theory so that his constant (k) allows his formulation to explain lost work, that being PdV, here on Earth. This was circular logic!

 

Furthermore, when developing equations like the Sackur-Tetrode equation, we employ traditional concepts in statistical thermodynamics and also set the number of accessible states as a function of a prescribed volume (one that obeys Heisenberg's uncertainty); hence, for the isothermal expansion, 13) applies, thus reinforcing our traditional beliefs about entropy. All of this is circular logic reinforcing circular logic!

 

So what about entropy itself? We can conclude that the science of thermodynamics requires a rethink. And this will require either bestowing entropy with clarity or expunging it from the science! How entropy is redefined may alter its application, but ultimately the science will simplify, or perhaps entropy will follow phlogiston. No matter what, entropy cannot remain a parameter without clarity.

   

            Another Example

 

It must be emphasized that there is absolutely nothing wrong with counting the number of accessible energy states (@), as was done by Ludwig Boltzmann, i.e. eqn 11).

 

       S = k ln(@)                     11)

 

 Next consider that entropy change is also given by:

 

            dS=dE/T                      14)

 

Combining 11) and 14) leads to

 

         d{k ln(@)} = dE/T         15)

 

  We previously discussed that, for the isobaric isothermal expansion of an ideal system, the number of accessible states does not necessarily increase. Next consider that we simply add thermal energy to a system, e.g. drop a piece of hot iron into a cup of water. If the piece of iron is large enough and hot enough, then there will be a measurable temperature increase within that cup. Since there is a temperature increase, this process cannot be isothermal. Hence you CANNOT write the isothermal relation eqn 14), i.e. dS=dE/T, wherein entropy is nothing more than an illogical isothermal heat capacity.
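
A minimal sketch of that temperature rise (hypothetical masses and temperatures, not from the original text; standard specific heats assumed):

```python
# Minimal sketch (hypothetical values): a hot piece of iron dropped into a cup of
# water produces a clearly measurable temperature rise, so the process is not isothermal.
c_iron, c_water = 449.0, 4186.0  # J/(kg*K), standard specific heat capacities
m_iron, m_water = 0.10, 0.25     # kg, assumed masses
T_iron, T_water = 400.0, 293.0   # K, assumed initial temperatures

# Equilibrium temperature from a simple energy balance (no losses assumed):
T_f = (m_iron * c_iron * T_iron + m_water * c_water * T_water) / \
      (m_iron * c_iron + m_water * c_water)
print(T_f - T_water)             # about 4 K of warming: not isothermal
```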

 

Now ask: does increasing the system's thermal energy increase the number of possible energy states? If the number of accessible energy states is strictly a function of volume, then the traditional answer may, wrongly, be "no". Conversely, if the number of accessible energy states is strictly a function of the system's total energy, then the answer becomes "yes".

 

 In this author's way of thinking, the number of accessible energy states should not be a function of both volume and energy, as traditional thermodynamics seemingly makes it when it applies eqns 11), 14) and 15) to all types of thermal energy (heat) transfer. These are fundamental failures in traditional thermodynamics.

 

 Now, there are those who might argue that eqn 14) is applicable if the amount of thermal energy supplied to the cup of water were infinitesimally small. The real truth is that such an argument is bogus, being based upon scale. If the heat added is so small that your thermometer cannot read the temperature increase, then it is simply a case of your thermometer not being accurate enough! Our reality is that the temperature went up even though the increase was immeasurably small, i.e. eqn 14) still does not apply. It is a case of either finding a more accurate thermometer or a smaller cup of water.

 

   None of this diminishes Boltzmann's great mathematical skill; rather, it adds context to statistical thermodynamics. Please, if this is your field of expertise, set aside the indignity associated with your humanity and view this as an opportunity to rewrite the science based upon statistical arguments.

 

  Note: I am of the opinion that Boltzmann's constant (k) makes S = k ln(@) equate to empirical findings here on Earth. In other words, k may be a function of Earth's gravitational field rather than a universal constant.

 

Closing remarks

  

No wonder entropy lacks clarity when we try to express it in a language of mere words like English. Our reality is this: until we decide upon what entropy is mathematically, there can be no hope. Entropy can no longer remain an explanation for everything, because in so doing it becomes an explanation of nothing! As a mathematical contrivance it remains poorly understood, and poorly used, at a fundamental level!

 

Yet there will always be those who claim that entropy can only be understood through upper-level math. Perhaps. However, it makes me wonder: what do they really know? Human Indignity

 

 

 

  My paper on Entropy:                       My paper on Second law:

 

 

 Copyright

Kent W. Mayhew 

Blog: Entropy: A Parameter Lacking Clarity
Help support this site
Sommerfeld quote: "Thermodynamics is a funny subject. The first time you go through it, you don't understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don't understand it, but by that time you are so used to it, it doesn't bother you any more."
 The following quote by Arthur Eddington demonstrates the purity of human arrogance that can lend itself to the complete indoctrination of a poorly conceived science.

“The law that entropy always increases, holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.”
Very scientific: Von Neumann to Shannon
"You should call it entropy for two reasons, in the first place your uncertainty function has been used in statistical mechanics under that name so it already has a name. In the second place but more important, nobody knows what entropy is, so in a debate you will always have the advantage"