Notice: entropysite.com is now http://entropysite.oxy.edu. Please update your links and bookmarks.


Teaching Entropy Is Simple — If You Discard "Disorder"

Foreword

          The following is an expanded version of a talk written for AP teachers, and it is of course equally useful for those teaching first-year general chemistry.  My habit in a lecture manuscript is to underline liberally and to set words I especially want to emphasize in italics or boldface.  These are bad form in print, but they highlight important points in this introduction, so I have left a number of them.

Introduction

          I admit that the title may be unsettling.  Entropy isn't at all simple in advanced work.  Even quantitative problems in a beginning course can be difficult for students, and for us as we guide students through them.  However, I think the basic qualitative ideas of the nature of energy and entropy are surprisingly simple — and sharing with students what we'll talk about here will actually answer the old question, "What is entropy, really?"  That can change their whole attitude toward class work in the oft-dreaded topic of thermodynamics.

          Discarding the archaic idea of "disorder" in regard to entropy is essential. It simply doesn't make scientific sense in the 21st century, and its apparent convenience is often flat-out misleading. As of November 2005, fifteen first-year college texts have deleted “entropy is disorder”, although a few still retain references to energy “becoming disorderly”. (This latter description is meaningless, as I shall mention here and discuss in detail in "Disorder — A Cracked Crutch For Supporting Entropy".) Most high school texts are written by unknown people working for publishers rather than by the individuals listed on their covers. Thus they are slow to include changes in scientific concepts, and your HS text may still contain the obsolete “entropy is disorder”.

Steps to a simple fundamental view of entropy

           First, we'll see how all spontaneous events — from physical processes like dropping a ball to chemical reactions like the explosion of hydrogen in oxygen — all… all… are basically due to energy….spreading….out…. in the processes.  Dispersing.  Superficially, spreading out in some cases because the system increases in volume, or in other cases because the system is heated.  Fundamentally, in all cases, dispersing energy in the sense that the molecules’ energies are in more arrangements, or different arrangements, on a multitude of quantized energy levels after a process or reaction than before.

Energy of all types spontaneously disperses if it is not hindered from doing so.

          Second, entropy is the measure of energy dispersal, as a function of temperature.  In chemistry, the kind of energy that entropy measures is motional energy — of molecules that are translating (moving and colliding), rotating, and vibrating (atoms in a molecule moving as though the bonds were springs) — and phase change energy (enthalpy of fusion or vaporization).  When students realize that entropy is just a measuring device, a yardstick in a sense, then understanding it is no longer a vague 'big deal' — although it is still an extremely important deal!

What entropy measures is how much energy is spread out in a process/T, OR how spread out the initial energy of a system becomes in that system (at constant temperature).  Exactly how entropy measures “how much” energy is dispersed is mathematically simple in phase change (ΔH of fusion or vaporization/T), in standard state entropy (S0), in temperature change (∫Cp dT/T), and in many other applications, even though it is not always necessary to show your students the details of calculation.  Whenever entropy is encountered, energy becoming dispersed (involving T) is the function on which we should focus.  However, that viewpoint was silently buried in the simple equations that we were taught — and to which in the past we have introduced our students.  If we "follow the energy flow" then we and they can readily understand why and how entropy changes in all elementary thermodynamics.

          In advanced work, in the many differential equations involving dS, the relation of energy dispersal to entropy change can be so complex as to be totally obscured.  But not in first-year thermodynamics!  

          Third, we'll look at examples of what "entropy" means in

  1. “how much energy is dispersed” cases:
    1. the standard state "entropy" of any substance at 298 K from Tables (a very approximate index or guide to the amount of energy that has been spread out in a substance to heat it from 0 K to 298 K so that it can exist energetically at 298 K);

      (By putting entropy in quotes, I mean to imply that most often "entropy" actually involves an "entropy change", the measurement of a change in the amount of energy dispersal in a system (or surroundings) before a process and after a process.  Only for perfect crystals at absolute zero does entropy mean a single measurement, i.e., 0 joules/K!)

    2. the entropy change in a solid as it melts to a liquid or a liquid boils or the converse, i.e. phase change;
    3. the entropy change in any system even as simple as the iron metal in a frying pan as it is heated (and as it cools); and
  2. “how spread out in a system energy becomes” cases:
    1. the entropy change of a gas expanding into a vacuum; and
    2. the entropy change when ideal gases or ideal liquids mix, and a perfect solute (no enthalpy of dissolving) dissolves in a solvent. Both of these cases involve no change in the amount of total motional energy (plus phase change energy) before the process and after, but the initial energy spontaneously becomes more spread out in the larger (or somewhat different) volume in a process that makes it available.

          Fourth, after talking about examples of entropy change in terms of macro thermodynamics, i.e., qrev/T, we'll also look at what energy "spreading out" or dispersing means in terms of molecular behavior, how the Boltzmann entropy equation quantitatively links energy dispersal to the number of microstates in a system.

          Finally, we'll see that Gibbs' ΔG is more closely related to entropy than it is to energy. If ΔG = ΔH - TΔS is divided by T, the result is ΔG/T, an entropy function.  Thus, a better description than "free energy" for Gibbs' ΔG is "the total dispersible energy in the universe due to a chemical reaction", in parallel to realizing that the dqrev in dqrev/T is "energy that has been or could be dispersed" in a process.
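That algebraic relationship can be shown in one line (a sketch, using the standard identification of the surroundings' entropy change with -ΔH/T at constant temperature and pressure):

```latex
\Delta G = \Delta H - T\,\Delta S_{\mathrm{sys}}
\quad\Rightarrow\quad
-\frac{\Delta G}{T}
= \Delta S_{\mathrm{sys}} - \frac{\Delta H}{T}
= \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{surr}}
= \Delta S_{\mathrm{universe}}
```

So -ΔG/T is the total entropy change of system plus surroundings, which is why ΔG/T behaves as an entropy function rather than an energy.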

Energy spontaneously disperses if it is not hindered from doing so

          A ball lifted up above the floor has been given potential energy (PE), ordinarily by human action.  When not hindered by one's grip, the ball will fall; its potential energy changes to kinetic energy that spreads out to pushing aside the air, to a bit of sound, to a small amount of heat in the ball and the floor as it bounces and comes to rest.  The ball's original localized potential energy has become widely dispersed in different varieties of molecular motional energy.

          A hot metal pan spontaneously disperses its energy to the air in a cooler room.  (This seemingly trivial example we shall see as a prototype of what happens in any such spontaneous energy transfer from hotter to cooler —  a net increase in the entropy of the combination of system and surroundings.)

          If we dropped a bottle of nitroglycerin on a concrete floor, the mechanical shock could be greater than the activation energy that otherwise hinders the spontaneous decomposition of nitroglycerin.  The "nitro" would then change into other substances explosively, because some of its bond energy/enthalpy — that quantity not transferred to bonds in its reaction products — is spread out extremely rapidly and thus enormously increases the vigorous motions of the gaseous product molecules.  In summary: some of the potential energy (bond enthalpy) that was in the nitroglycerin becomes widely dispersed in the molecular movement of the products. 

          An ideal gas will spontaneously flow into an evacuated chamber and spread its original energy over the larger final volume;  the speed of the molecules is unchanged but their energy becomes dispersed more widely – throughout a larger domain. 

          Hydrogen and oxygen in a closed chamber will remain unchanged for years and probably for millennia.  Despite their larger combined bond enthalpies (higher potential energy) compared to water's (i.e., the energetic or thermodynamic reason why they should react to yield water), their spontaneous reaction to form water is hindered by an activation energy.  However, if a spark is introduced, they will react explosively to dissipate some of their combined bond energy in causing the products of the reaction — water molecules — to move extremely rapidly as motional energy that we sense as high-temperature steam.  Thereby energy is further spread out to the surroundings.

          Those illustrations lead to a profound generalization: In all everyday or exotic spontaneous physical or chemical happenings, some type of energy flows from being localized or concentrated to becoming more spread out or dispersed.  Generally, the "some type" is kinetic energy.  In chemistry "motional energy" of molecules is preferable to the phrase kinetic energy because at any instant when molecules are ceaselessly colliding, many may be motionless for an instant due to head-on collision of equally energetic molecules.  Motional energy better circumscribes the entire process of energetic movement.  (Energy that is supplied to a substance in phase change becomes potential energy that is part of the total energy of a system, unaltered by volume change or by temperature change, except at phase change temperatures.)

          Potential energy in macro objects (like a rock held up in the air, or water behind a dam) is always hindered, i.e., kept from dispersing, until it is changed to kinetic energy. The potential energy in chemistry that we have already talked about is that involved in phase change; coming from or going to the surroundings, it causes breakage or formation of intermolecular bonds and thus can free or restrict molecular motion. Another major type of potential energy is the energy in chemical bonds that holds molecules together. When chemical reaction occurs to form products whose bonds are stronger than the reactants' (as in the case of hydrogen and oxygen reacting to form water), some of the potential energy that was in the hydrogen + oxygen can spread out (in the form of molecular motional energy).

          To most beginning chemistry students, this generality of energy spreading out in chemical reactions or from hot or high-pressure systems becomes obvious — once examples have organized the concept for them.  Most have seen high pressure air whoosh from a punctured tire.  All have seen that occur from a balloon. All know that hot things cool down and that hydrogen and oxygen react violently.  Thus, it is not a great step to a formal description of those phenomena as "the dispersal of energy to a larger three-dimensional space" and extension to energy spreading out more readily among more particles rather than fewer.  Then later, to an extent and depth that you choose, you can lead them to see how, fundamentally, this spatial dispersion of energy always is due to energy being dispersed because, at any instant, it can be in one of a much larger number of microstates (the total molecular motional energy of a system in quantized states) than the energy was before the process or reaction occurred.

           In exothermic reactions some of the bond energy that is in the reactants is transferred to products that have lesser bond enthalpy, and the remainder is detected as thermal energy, i.e., greater molecular motion in the molecules of the product.  This motional energy ("heat") can then transfer energy from the system to the surroundings.  Conversely, endothermic reactions are caused by spontaneous spreading out of some energy from the more concentrated-energy surroundings (hotter) to the lesser concentrated-energy (cooler) substances in the system.

But what does all that "energy spreading out" have to do with entropy?

Entropy change is a measure of the molecular motional energy (plus any phase change energy) that has been dispersed in a system at a specific temperature.

          Always, motional energy flows in the direction of hotter to cooler because that direction of dispersal results in a greater amount of spreading out of energy than the reverse.  Entropy change, q(rev)/T, quantitatively measures energy dispersing/spreading out. Its profound importance is that it always increases in a "hotter to cooler" process so long as you consider the ‘universe’ of  objects, systems, and surroundings.

          Exactly how entropy measures energy dispersal is mathematically simple in phase change. (It's just the ΔH of the process/T.) Further, from Tables of standard state entropy, the S0 values for substances at 298 K, we can get a general idea of how the amount of energy that has been dispersed in substances differs in types of elements and compounds. Finally, we can measure exactly how much entropy increases when substances are heated, i.e., when energy is dispersed from the surroundings to them. All other areas treated in general chem courses are easily describable to beginning students — and I will do so — but the details of their calculations can be left to your text.

Entropy (change) shown in standard state tables

          The standard state entropy, S0, for an element or a compound is the actual change in entropy when a substance has been heated from 0 K to 298 K. [However, determining S0 at low temperatures is not a simple calculation or experiment. It is found either from the sum of actual measurements of q(rev)/T at many increments of temperature, or from calculations based on spectroscopy.] The final value of S0 in joules/K is related to (but not an exact figure for) how much energy has been dispersed to the material by heating it from absolute zero, where perfect crystals have an entropy of 0, to 298 K.

          Thus, in our view of entropy as a measure of the amount of energy dispersed/T, S0 is a useful rough relative number or index to compare substances in terms of the amount of energy that has been dispersed in them from 0 K. (For ice and liquid water, the S0 at both 273 K and 298 K is listed in Tables.)  Let's consider ice at its S0 of 41 J/K at 273 K. Remember, that 41 J/K is only an indicator or index of the total thermal energy that was dispersed in the mole of ice as it was warmed from 0 K. (Actually, several thousand joules of energy were added as q(rev) in many small reversible steps.)
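The summation just described can be sketched numerically. The heat-capacity function below is a hypothetical, Debye-like toy model (the real S0 of ice comes from measured calorimetric data), but it shows how many small q(rev)/T increments add up to a standard entropy:

```python
# Sketch: S0 as a sum of q_rev/T increments from near 0 K up to T_final.
# cp_model is a made-up, Debye-like heat capacity chosen only to
# illustrate the summation; it is NOT real data for ice.
def cp_model(t, cp_limit=38.0, debye_t=222.0):
    """Toy Cp(T) in J/(mol*K): ~T^3 at low T, leveling off near cp_limit."""
    x = t / debye_t
    return cp_limit * x**3 / (1.0 + x**3)

def standard_entropy(t_final, dt=0.1):
    """Sum Cp(T)*dT/T over many small increments (trapezoid rule)."""
    s, t = 0.0, dt
    while t < t_final:
        s += 0.5 * (cp_model(t) / t + cp_model(t + dt) / (t + dt)) * dt
        t += dt
    return s

print(f"Toy-model S0 at 273 K: {standard_entropy(273.0):.1f} J/(mol*K)")
```

Because Cp falls toward zero as T approaches 0 K, the low-temperature increments contribute very little even though T appears in the denominator.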

          Is that complicated?  Abstract and hard to understand? An entropy value of a substance is very approximately related to how much energy had to be dispersed in it so that it can exist and be stable at a given temperature!

         From standard state tables we can see that  liquids need more energy than solids of the same substance at the same temperature.   (Of course!  Liquids at 298 K have required additional enthalpy of fusion, i.e., phase change energy, to break intermolecular attractions or bonds present in solids so that their molecules could more freely move in the liquid phase. And substances that are gases at 298 K similarly had to have the enthalpy of vaporization supplied to them at some temperature between 0 and 298 K so their intermolecular attractions in the liquid could be broken to allow their molecules to move and rotate as freely as they do in gases.)  Heavy elements that are solids at 298 K (and in the same column of the periodic table as lighter elements) need more energy to vibrate rapidly back and forth in one place in their solid state than lighter elements. More complex molecules need more energy for their more complicated motions than do simpler molecules, as do similar ionic solids: those that are doubly charged need more energy than do singly charged to exist at 298 K.

          The causes of all entropy relationships are not obvious from looking at tables of standard entropy values, but many make a lot more sense on the basis of S0 being related to the motional energy dispersed in them (plus any phase change energy) that is necessary for a substance's existence at T.  (Organic molecules are especially good examples, but are beyond what I have time to talk about here. See "Disorder in Rubber Banks?")

The entropy change of a substance in a phase change

          We all know the Clausius definition of entropy change as dS ≥ dqrev/T and ΔS = q/T in a reversible process.  Let's examine such a process as exemplified by the fusion of ice to form liquid water at 273 K.  (Vaporization is parallel, of course.)  A very large quantity of energy from warmer surroundings, the ΔH of fusion (6 kJ), must be dispersed within the cooler solid, but the solid's temperature is unchanged until the last crystal of ice melts. Doesn’t something seem wrong here?  All that heat input and no increase in temperature?  And the reverse behavior of liquid water might seem odd to a young student: When water is placed in cooler surroundings than 273 K, the same large amount of 6 kJ of energy is transferred to the surroundings before the water all becomes ice.

          This can be rationalized from a strictly macro viewpoint by seeing the process of fusion as a change of motional kinetic energy in the surroundings to potential energy in the water that has nothing to do with temperature change in water. Then, the reverse that occurs when liquid water is placed in surroundings that are 272.9 K can be understood as merely changing the potential energy of the water system to kinetic energy in the surroundings.  (A weak analogy would be the kinetic energy of a swinging pendulum changing totally into potential energy at the end of its arc, and then that potential energy changing back to kinetic energy as the pendulum swings to its low point.)

           Of course, the description of the process in molecular thermodynamics is more detailed and far more enlightening , but our goal here is primarily to see the macro view of thermodynamic changes.     

          As we said at the start, because fusion is an equilibrium process and therefore reversible, ΔSice->water = qrev/T.  That qrev is simply the 6 kJ of the enthalpy of fusion of ice and thus ΔSice->water = 6000 J/273 K, or 22 J/K.  The entropy change for ice to become water results from the amount of energy, qrev, that has been spread out in the ice so that it could change to water — divided by T.  Is that mysterious?  Hard to comprehend?
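The arithmetic can be verified in a couple of lines (a sketch using the rounded 6 kJ/mol figure from the text):

```python
# Entropy of fusion of ice: q_rev is the enthalpy of fusion, T is 273 K.
delta_h_fusion = 6000.0   # J/mol, enthalpy of fusion of ice (rounded, as in the text)
t_fusion = 273.0          # K

delta_s_fusion = delta_h_fusion / t_fusion
print(f"dS(ice -> water) = {delta_s_fusion:.0f} J/(mol*K)")  # -> 22 J/(mol*K)
```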

          Admittedly, what gives entropy its great power of predicting the direction of energy flow — why spontaneous energy dispersal always occurs only from a hotter to a cooler system —  is hidden in the apparently simple process of dividing by T.  That tremendously important and relatively invisible predictive property can easily be proved to students as I'll show in a minute.

The entropy change of a substance when it is heated

[The standard procedure for determining the entropy change when a substance is heated involves calculus that may be beyond the background of many AP students. Qualitatively, of course, the process could be described as measuring the amount of energy dispersed from the hot surroundings to the cooler substance, divided by the temperature, at a very large number of small temperature intervals from T1 to T2 and adding all of those entropy results.]

           Qualitatively, from a macro viewpoint, it is obvious that entropy must increase in any system that is heated because entropy measures the increase (or decrease) of energy that is dispersed to a system!

          Quantitatively, determining the entropy change of a substance (such as the iron in a frying pan) as it is heated isn't as easy as in phase change. To keep the process of heating at least theoretically reversible, the substance should be heated in many small increments of energy (dqrev). That way the temperature remains ‘approximately unchanged’ in each increment and the process is almost reversible. Calculus achieves this limit.  Using the usual symbols, and integrating over the temperature range:  ΔST1->T2 = ∫dqrev/T. (1)  But how do we find the value of dqrev?

          Fortunately, the heat capacity of a substance, Cp, is really an "entropy per degree" because it is the energy that must be dispersed in the substance per one degree Kelvin, i.e., qrev/"1 K"!  Therefore, if we just multiply Cp by "the number of degrees" (in more sophisticated terms: the temperature increment, dT), that will give us dqrev — i.e., dqrev = Cp dT.

          Substituting this result in (1) above (with Cp approximately constant over the interval) gives ΔST1->T2 = ∫Cp dT/T = Cp ln(T2/T1).

          Then, Cp ΔT is the energy dispersed within a system when it is heated from T1 to T2.
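As a worked example (a sketch; 25.1 J/(mol·K) is the approximate molar heat capacity of iron, taken as constant over the interval, and the temperatures are illustrative):

```python
import math

# Entropy change when iron is heated, assuming Cp is roughly constant
# over the interval (a common first-year approximation).
cp_iron = 25.1         # J/(mol*K), approximate molar heat capacity of iron
t1, t2 = 298.0, 398.0  # K: a pan warming by 100 degrees

delta_s = cp_iron * math.log(t2 / t1)  # dS = Cp ln(T2/T1)
print(f"dS = {delta_s:.2f} J/(mol*K) per mole of iron, {t1:.0f} -> {t2:.0f} K")
```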

The decrease in entropy when a system is cooled
Energy always flows from hotter to cooler
In any spontaneous process, entropy increases

           If anything cools, it clearly has dispersed some of its energy to its cooler surroundings. Lesser energy in it means that its entropy decreases.  The cooling of a frying pan is an example that will most easily demonstrate to students how important entropy is.  With merely that (overused!) example of a hot iron frying pan we can show them why "heat cannot spontaneously pass from a colder to a warmer body", Clausius' original statement of one version of the second law of thermodynamics.  That leads directly to why the universe is always increasing in entropy, another version of the second law.

           Recapping our basic understanding of energy and entropy:  Energy spontaneously disperses from being localized to becoming spread out, if it is not hindered from doing so.  Entropy change measures that process — how widely spread out energy becomes in a system or in the surroundings — by the relationship, q(rev)/T.  (Let's symbolize a high temperature by a bold T, and a lower temperature by an ordinary T.)

          If the pan (system) is hotter than the cool room (surroundings), and q (an amount of motional energy, "heat") might flow from the pan to the room or vice versa, the entropy changes would be q/T for the pan and q/T for the surroundings.  When q is divided by a large number, i.e., by the bold T of the pan, the result is a smaller entropy change than when q is divided by a small number, i.e., the ordinary T of the surroundings; so there is a larger entropy change in the surroundings.  (For simplicity in quickly first bringing this conceptual point to a class, perhaps avoid numbers. If you feel that numbers are better, at least avoid dimensions by identifying q = 1, T = 100, and T = 1 so that 1/100 in the hot pan is obviously smaller than 1/1 in the cool surroundings!)

           Now, a larger entropy change — wherever it occurs — means that energy would be more widely spread out there. Thus, because energy spontaneously becomes more spread out, if it is not hindered, the q will move from the hotter to the cooler (from a smaller entropy state to a larger entropy state, from our hot pan to our cool room).  This is universally true, as stated by Clausius, as is our common human experience, and as quantified by the relative entropy changes in hotter and cooler parts of this "frying pan - cool room universe".  (It should be emphasized that it is true even under conditions in which the process of transferring energy is essentially reversible, i.e., when the difference between q/Tsys and q/Tsurroundings is very small.)

          Finally, the spontaneous increase in entropy in the ‘cooler room’ part of this universe is greater than the entropy decrease in the ‘hot pan’ part of this ‘room-pan’ universe.  The net result is an increase in entropy in the whole universe, the predicted result for any spontaneous process.
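Putting illustrative numbers on the pan-and-room universe (any hotter/cooler pair will do; these values are hypothetical):

```python
# The frying-pan "universe": the same q leaves the hot pan and enters
# the cooler room. Illustrative numbers only.
q = 1000.0      # J transferred
t_pan = 400.0   # K (hotter system)
t_room = 300.0  # K (cooler surroundings)

ds_pan = -q / t_pan    # pan loses energy: entropy decrease
ds_room = q / t_room   # room gains energy: larger entropy increase
ds_universe = ds_pan + ds_room

print(f"dS(pan) = {ds_pan:.2f} J/K, dS(room) = {ds_room:+.2f} J/K")
print(f"dS(universe) = {ds_universe:+.2f} J/K  (positive: spontaneous)")
```

Reversing the direction of q would flip both signs and make dS(universe) negative, which is exactly why heat never flows spontaneously from cooler to hotter.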

Entropy as "unavailable energy"

          This is a note to clarify an often quoted but confusing sentence about entropy, "Entropy is unavailable energy".  The sentence is ambiguous, either untrue or true depending on exactly what is meant by the words.  As any of us would predict, the energy q within even a faintly warm iron pan at 298 K, measured by its entropy per mole, S0, will spontaneously cause a 273 K ice cube placed in it to begin to melt.  In this sense the pan's entropy represents instantly available energy and the sentence appears untrue.

          However, if any amount of energy is transferred from the 298 K pan, the pan no longer has enough energy for its q/T value to equal the entropy needed for that amount of iron to exist at 298 K.  Thus, from this viewpoint, the sentence is true, but tricky:  We can easily transfer energy from the pan.  It's not "unavailable" at all — except that when we actually transfer the slightest amount of energy, the pan no longer is in its original energy and entropy states!  For the pan to remain in its original state, the energy is unavailable.

          "Entropy is unavailable energy" or "waste heat" is also ambiguous in regard to motional energy that is transferred to the surroundings as a result of a chemical reaction.  That energy/T is considered an entropy increase in the surroundings (because it is energy that is spread out in the surroundings and no longer available in the system).  However, it is completely available for work in the surroundings or transfer to anything there at a lower temperature;  it just is no longer available for the process that occurred in the system at the original temperature.

The change in entropy when a gas expands

[In this section concerning gas expansion, as well as the next that includes fluids mixing and a solute dissolving, I confess that I get ahead of my plan a bit.  My intent was to restrict this first presentation of entropy change as involving energy dispersal to a macro view. However, gas expansion so cries out for the obvious statement about molecules (that Clausius could not make in his time — prior to the knowledge that molecules really existed): “Here the energetic fast moving molecules have more space in which to bounce around and spread out their energy rather than keeping it localized in just one flask!”  Certainly, any student who remembers the kinetic molecular theory of 5-10 chapters back would quickly agree to that rationalization for the expansion to a larger volume.

This is a valid start for a molecular interpretation of entropy change, and perhaps all that need be told to most beginning students. However, it is really only half of the cause of entropy increase in any process.  (This idea of two factors in any entropy change is developed for you, but in far too much detail for AP students, here.

Briefly, for your information at this point, entropy change in any process is due to two factors: first, molecular motional energy, described by the kinetic molecular theory, is enabling.  However, for that energy to result in entropy increase by becoming dispersed/spread out, it must be actualized by some process that makes accessible additional microstates (vide infra, or here).  The processes that we are now going to consider include gas expansion and thus obvious volume change. Then, we will look at mixing with other molecules (which amounts to separation of like molecules from each other, and in that way a “volume” change for them). This second factor is measured by probability, and from the way it is counted in statistical mechanics is often associated with “positional” or “configurational” entropy.  It is an unfortunate name because students get the idea that there are two kinds of entropy change when, fundamentally, there is only one — a change in the number of microstates.)

          When an ideal gas expands from a glass bulb through a connecting stopcock into an evacuated bulb, there is no temperature change.  The initial motional energy of the molecules does not change  BUT that motional energy is now more widely spread over a larger volume than it was originally.  According to our basic concept of the spontaneity of energy dispersing as widely as it can, if it is not hindered, no further explanation seems to be needed.  The stopcock hindered the spontaneous expansion of the gas but when it was opened, the gas expanded into the evacuated bulb — that's just one more example of energy dispersing.  Of course, we can also see that the entropy of the gas must have increased because of our corollary concept, namely, when the motional energy of a substance becomes more dispersed, its entropy is greater.  From a macro viewpoint, nothing could be simpler than deducing that entropy increases when an ideal gas expands.

          However, quantitatively, the process of determining exactly how much the entropy has increased seems to run into a brick wall:  the energy that we normally associate with motional energy has not changed in the expansion, i.e., q has to be 0. So how can we measure entropy by q(rev)/T?  The problem is that the sudden expansion of the gas was irreversible, not reversible at each moment (as were phase change, and heating or cooling a substance).  If we want to measure entropy change by using ΔS = q(rev)/T, we must find some reversible process involving dqrev.

          The solution to that problem is to reversibly compress the gas back to its original volume.  The energy required to do this will be equivalent in magnitude, just opposite in sign, to the energy dispersed in the spontaneous expansion into the vacuum.  As shown in many texts, the work, w, done to compress a gas reversibly is -w = nRT ln(V2/V1), and since q = -w, ΔS = q/T = nR ln(V2/V1).
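For a concrete case (a sketch for 1 mol of ideal gas doubling its volume):

```python
import math

R = 8.314  # J/(mol*K), gas constant

# Entropy change for 1 mol of ideal gas expanding into a vacuum,
# doubling its volume (V2/V1 = 2); temperature is unchanged.
n = 1.0
v_ratio = 2.0

delta_s = n * R * math.log(v_ratio)  # dS = nR ln(V2/V1)
print(f"dS = {delta_s:.2f} J/K for a doubling of volume")  # about 5.76 J/K
```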

The change in entropy when gases mix and liquids mix

          When a bulb containing one gas is connected to one with a different gas and the stopcock between them is opened, the two will begin to mix slowly and will continue until each gas has thoroughly mixed with the other.  Each gas has increased its volume with no change in temperature (if they are ideal gases that do not interact) and no change in the initial motional energy (or phase change energy) of each.  The situation is exactly like a gas expanding into a vacuum. The volume of each has increased.  The initial motional energy of each gas has become more spread out in that larger volume and so each has increased in entropy.

          The mixing of two ideal liquids can be viewed similarly but it is quantitatively different because the volume resulting from two liquids mixing is usually not exactly the sum of the two initial volumes.  Nevertheless, the concept of spontaneous mixing because of energy spreading out still applies. The initial motional energy of each liquid has been dispersed in the final larger combined volume and so the entropy of each has increased. (An old idea that there is a special "entropy of mixing" is an error.)

Molecular thermodynamics. The Boltzmann entropy equation.

          The molecular dispersal of energy, molecular thermodynamics, is quantitatively treated by Boltzmann's relation of entropy to microstates and by the quantization of molecular energy in quantum mechanics.  But we are talking here today about the simple….direct…. presentation of entropy in an AP or first-year college/university class.  It must be the instructor's choice for his or her own class of beginners as to how far to go into many details of Boltzmann's development of entropy, ΔS = kB ln(WFinal/WInitial). (It will be intensively presented in the physical chemistry course.) I will describe a few specifics in this section, primarily as background information for instructors. Some generalities about a simple view of microstates are in the "sample lecture to all chemistry classes" in the next section. These can perhaps be used in most AP or college chemistry classes without exceeding the students' abilities.
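One specific that may be worth showing: Boltzmann's equation reproduces the macro result for gas expansion. Doubling the volume doubles the locations available to each of N molecules, so WFinal/WInitial = 2^N and ΔS = kB ln 2^N = N kB ln 2 — the same R ln 2 per mole that q(rev)/T gave us. A sketch (2^N is far too large to compute directly, so the logarithm is taken first):

```python
import math

k_b = 1.380649e-23   # J/K, Boltzmann constant
n_a = 6.02214076e23  # 1/mol, Avogadro's number

# Doubling the volume makes W_final/W_initial = 2^N for N molecules,
# so dS = k_B ln(2^N) = N * k_B * ln 2 (computed directly, since 2^N
# itself is far too large a number to evaluate).
delta_s_boltzmann = n_a * k_b * math.log(2.0)

# Macro result from the gas-expansion section: dS = nR ln(V2/V1) for n = 1.
delta_s_macro = 8.314 * math.log(2.0)

print(f"Boltzmann: {delta_s_boltzmann:.2f} J/K, macro: {delta_s_macro:.2f} J/K")
```

The two numbers agree because kB·NA is the gas constant R; the microstate count and q(rev)/T are two views of the same dispersal of energy.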

A sidenote:  One current general chemistry text takes about 4 pages to introduce entropy via Boltzmann.  That is absurd — just one more unnecessary burden on an already overloaded beginner.  The entire approach to entropy up to this present point plus some simple generalities about molecular thermodynamics could be presented far more understandably in 4 text pages.

          Saying that (and underscoring that most of the introduction to entropy prior to this section on molecular thermodynamics is usable in any class in which entropy has been mentioned as "disorder" in the past), I think that you should be aware of some conclusions from quantum mechanics related to entropy and the significance of the Boltzmann equation.  This is the content of the following indented paragraphs.  (Additional background is now at http://entropysite.oxy.edu/microstate/index.html.) You can best judge whether you should share them as "enrichment" with all of your students, with only a select few, or with none.  Molecular thermodynamics is essential in a modern description of entropy.  It is the fundamental basis for my describing spontaneous energy dispersal as the key to understanding entropy. (But that word “dispersal” has a more precise meaning in molecular thermodynamics than just “spreading all over three-dimensional space” as we have been using it in macro thermodynamics.)

          Electromagnetic energy is quantized — in the same sense that you taught your students about photons being the quantized units of light energy.  Similarly, all types of energy are quantized, including the energy associated with the various modes of molecules' motions. In a section to follow that discusses the expansion of a gas into a vacuum and fluids mixing from a molecular and quantum mechanical viewpoint, the explanation is properly focused only on the energy levels accessible to the energies of the molecules. This is adequate and correct.  The rationalization I suggested for medium or lower level students, namely that vigorously moving energetic molecules would be expected to move into a larger volume of three-dimensional space if they were not hindered from doing so, is a useful picture, but it is just a start. It is superficial because a molecular thermodynamic view of entropy change shows that entropy change consists of two essential factors: motional molecular energy as enabling and any process that changes the number of microstates as actualizing. (If that process leads to an increased number of microstates, it is spontaneous because it results in increased entropy.)

          The motional energy with which we are concerned in discussing entropy consists of the combined energy of translation, rotation, and vibration. (“Phase change energy” is motional energy of the surroundings that has been supplied to or released by the system due to its potential energy of intermolecular bonding. It does not change in any of the entropy-involving processes we discuss other than phase change itself.) Motional energy is present in a substance because of the transfer of heat from the surroundings as the substance is warmed above 0 K. The spacings between rotational energy levels and between vibrational energy levels of molecules can be seen in the molecular spectra of rotation and vibration.  The spacing between translational levels is too small to detect ordinarily, so translational energy is often considered to be continuous, the equivalent of more than septillions of individual levels.  Therefore, because three independent energetic variables are involved, each molecule of any substance at any temperature above 0 K can be in one of an extremely large number of different energy levels. (Of course, input of energy that results in a higher temperature — greater molecular motion — makes additional energy levels become accessible for a molecule's energy. Similarly, in phase changes such as melting and vaporization where molecules can have increased opportunities for motion, or when a solute is dissolved in a solvent, large numbers of additional energy levels become accessible.)

          A microstate can be defined as one arrangement of all the energies of all the particles (each on one of their many possible energy levels at one instant) that together have the total motional energy plus the phase change energy of a whole system.  Then considering how many molecules there are in a mole, we can sense that there are a truly unimaginable (though numerically expressible) number of microstates in the usual chemical system at ordinary temperatures. (The quantity is of the order of 10^1,000,000,000,000,000,000,000,000. To give you a sense of its magnitude: There are probably fewer than 10^100 atoms in the entire universe.) Extended descriptions of microstates are at the URL given earlier.
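(To see where a number like that comes from, Boltzmann's S = kB ln W can be inverted: W = 10^(S/(kB ln 10)). A short sketch, using the standard molar entropy of liquid water, about 70 J/K per mole, as the input:)

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def microstate_exponent(S):
    """Given an absolute entropy S (J/K), return x such that W = 10**x,
    from Boltzmann's S = k_B ln W."""
    return S / (k_B * math.log(10))

# Standard molar entropy of liquid water at 298 K is about 70 J/(mol K):
x = microstate_exponent(70.0)
print(f"W is about 10 to the power {x:.2e}")
```

The exponent comes out near 2 x 10^24, i.e. W is about 10^(a septillion), the order of magnitude quoted above.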

          A system has its total energy — the energy of each of its many molecules — arranged in some particular distribution on a gigantic number of energy levels at one instant. This would be one microstate. In the next instant the system is in a different microstate. This is because even a single collision of two molecules usually changes their energies, and therefore this one change makes the total arrangement of all the molecular energies different from what it was an instant before — a different microstate. (Considering the number of molecules in a mole, you can have a slight appreciation of how many different microstates might be possible without any change in the total energy of a system of a mole of molecules!)  An increase in the number of accessible microstates for a system of molecules results in an entropy increase because then a system's energy can be more dispersed or spread out in this very precise sense: if there are more accessible microstates for a system, there are more choices of different microstates in which the system might be at the next instant. That is energy dispersal, a greater number of possibilities of arrangements of the energy of the system.  That is, of course, the opposite of energy localization (of having fewer and fewer choices of arrangements of the energies in the next instant — with the ultimate being only one arrangement, the situation at absolute zero).  Energy dispersal in terms of microstates does NOT mean that the energy in any way is 'smeared' or spread over many microstates. That is impossible, of course, because the total energy of a system is always present each instant in a single arrangement, a single microstate.

          Illustrations in textbooks that purport to show microstates as marbles in various boxes, or exemplify them as molecules in various locations in space, are specious and misleading in that microstates are shown as devoid of energy. Microstates consist of the energies of all the molecules in a system considered to be on quantized energy levels. Even though in statistical mechanics, in order to count the numbers of molecules’ different arrangements, the molecules are placed in ‘cells’ and considered as locations in 3-D space, it is the different arrangements of their energies (the numbers of microstates) that are actually being counted combinatorially.

(For a detailed development of the nature of microstates, see "What is a microstate?".)

          In the equation attributed to Boltzmann, ΔS = kB ln(WFinal/WInitial), the W stands for the number of microstates in a system. Using the heating of a substance as just one example of the equation's pertinence, when a substance is warmed, the increased energy dispersed in it allows each molecule to move over a greater range of speeds in its many collisions. Its occasionally greater energy can access higher energy levels.  Thereby, the number of accessible energy levels on which from one to a large number of the molecules’ energies may be at one moment increases enormously.  (Of course, we can think of or draw energy levels on the board for students, with dots for the molecules themselves on these lines.) In turn this results in an enormously greater number of microstates because each microstate is but one arrangement of all the molecular energies whose total energy is that of the system.  All the energy now in the warmer substance has the potential of being in any one of many more microstates than it had been in.  Energy dispersion, in the sense of there being more choices for the system's energy to be arranged if there is an increase in the number of microstates, is the fundamental reason for an increase in entropy in any change in a system — not only from warming, but from phase change, from volume increase, from forming a solution.  The greater the number of microstates for a system after some process, the more its entropy has increased. This is why Boltzmann's equation is essential in molecular thermodynamics.
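(A numerical illustration of the equation's pertinence: warming 1 mol of liquid water by just 1 K. The heat capacity value, about 75.3 J/(mol K), is a handbook figure; the microstate ratio then follows from WFinal/WInitial = exp(ΔS/kB).)

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
Cp_water = 75.3     # J/(mol K), molar heat capacity of liquid water (assumed constant)

# Entropy change for warming 1 mol of water from 298.15 K to 299.15 K
# along a reversible path: dS = Cp ln(T2/T1).
dS = Cp_water * math.log(299.15 / 298.15)        # a fraction of a J/K

# Boltzmann: dS = k_B ln(W_final/W_initial), so in base 10 the ratio is
log10_ratio = dS / (k_B * math.log(10))
print(f"dS = {dS:.3f} J/K; W_final/W_initial is about 10^{log10_ratio:.1e}")
```

A quarter of a joule per kelvin looks tiny, yet the number of accessible microstates multiplies by roughly 10^(8 x 10^21) for that single degree of warming.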

A possible approach to microstates for all chemistry classes

          Now, returning from our survey of molecular thermodynamics with the powerful support of the conclusion in boldface type above, we can describe it simply to students in the examples I have already discussed.  In the paragraphs below, I'll try to imagine my teaching a chemistry class that could be AP or below AP level to whom entropy in macro thermodynamics has already been presented somewhat as on earlier pages.  (Although the following is informal, it needs student interruptions to come alive and be most useful to students.  You can do much better, I'm sure!  I will insert a "disclaimer to students" in double brackets that you may disdain as well as discard.)

          "The energy q that we've been talking about when we learned that entropy change was q(rev)/T is the energy of moving atoms or molecules.  What if you heat an iron pan?  That means you put the “q” of fast moving atoms or molecules from a flame into making the iron atoms in the pan speed up their fast-jittering, vibrating, moving almost in the same place. Gases like oxygen and nitrogen in the air in our room are moving at an average of about a thousand miles an hour. In a cold ice cube, the water molecules can’t move much more than iron atoms in a cold pan, yet they are rapidly vibrating. But fast or slow, hot or cold, the energy q of those moving molecules in any substance is quantized — that is, it isn't like a continuous flow of water from a faucet.  Molecules that are moving are energetic and that energy is in bunches or units or packets.  Remember when we learned that Einstein proposed that light is quantized?  He found that light could be considered to be in packets and those were named "photons". Then also, do you remember how we saw that the energy of electrons in the hydrogen atom was quantized — only on specific energy levels? The energy of moving molecules — their “translational” energy (and other kinds of movement) — is like that, on specific energy levels.

          [[Now relax — you don't have to remember any details about what I'll be saying for the next couple of minutes, but I hope you'll just get a general feel for what "microstates" of molecules are and why the number of them is important.  At least, getting an idea about what microstates are is essential, because from now on I'll be using that word "microstates" in explaining why entropy increases or doesn't from the viewpoint of molecules' energy. ]]

          Here's an impossible "thought experiment" but try it anyway!  Close your eyes and pretend you can see all the energies of the different molecules in a drop of water, moving every which way, at any speed from 0 to 2000 miles an hour, and you can see all those energies arranged on a gazillion different ladder-like energy levels! Now, quick, freeze that frame of the ultra-fast movie. That's a microstate of the water molecules at the temperature of our room — all their motions stopped with their individual energies on a literally incredible number of levels. (The total of all those molecules' energies on all those levels is the energy q in that entropy equation of q(rev)/T.)  A microstate is an exact arrangement of all the energies of the molecules of a system at one instant.

           Then let the molecules move for just an instant. Freeze everything again.  That would be another microstate — slightly different in the energies of a couple of molecules — so it would be a slightly different arrangement with the same total energy.  Don't keep going…I don't want you to get tired because even if you got all the people of the world thinking 'freeze-frames' like that every millionth of a billionth of a second for trillions of trillions of years, they wouldn't have visualized a 'zillionth' of the number of microstates in any substance at room temperature!

          A zillionth?  In the smallest drop of water you can see, there are at least 10^10,000,000,000,000,000,000 microstates. (There are probably fewer than 10^100 atoms in the entire universe!)  Heat that tiny drop of water up just a little and that makes even more — many more — microstates (that many more arrangements of the molecules’ energies, in any one of which the total energy of the tiny drop could be at one instant).

          Now, here’s the payoff:  The arrangement of the motional energies of all the molecules in any chemical (a ‘system’) at one instant is a microstate. So, if there are more microstates (additional “accessible” arrangements) made available to the molecules, there are more choices for the system to be in any one of them at any instant.

          Here's a practical illustration: Even though we would be talking pretty good science if we were in a tire shop and said, when a high-pressure tire blew out, "Hear that air energy spreading out all over", that's not the full story.  What's really happening down at the molecular level is that the motional energy of the high-pressure air is spreading out from the tire BECAUSE it then has the chance of being in any one of a whole lot more microstates in the air of the shop — and THAT'S because there are more microstates whenever a substance is given more 3D space.

          Entropy increases whenever more energy is spread out in a substance. For example, when some substance is heated, the energy has many more microstates in any one of which it might be at an instant — THAT'S fundamentally what entropy increase means.

          When that hot iron pan cools down, then there aren't as many microstates in one of which its atoms can be at one moment so we say that the energy can't spread out as much.  Therefore, the pan's entropy decreases.  But the cooling-down happened because slower air molecules hitting the hot pan were made to move a little faster.  The extra energy in those faster air molecules now has MORE microstates in any one of which the energy of the air can be at an instant, so the total energy of the air can be called more spread out.  And what does more "spread-out energy" in the air mean?  An increase in the air's entropy, of course.

          Any time entropy DECREASES in the universe — like the hot pan cooling down — the energy from that part of the universe spreads out  and increases the entropy of its nearby surroundings (and that includes any cooler thing or air near it).  Always the increase in entropy is greater than the decrease elsewhere.  You could guess that would be true because we said when we first started to talk about thermodynamics that "energy spontaneously spreads out, if it isn't hindered" and then we said that "entropy increases when energy becomes more dispersed or spread out".  Put the two together and you'll predict that entropy is always increasing …..

          Now, I leave the students in your more capable hands, and return to a discussion of entropy, but from the viewpoint of molecular thermodynamics.

The expansion of an ideal gas into a vacuum

(See the objections to “positional” or “configurational” entropy in the box at the end of the next section, “The spontaneous mixing of fluids”)

          Students who know that entropy measures how spread out is the energy in a system readily accept the qualitative molecular conclusion about entropy increase when a gas expands into an evacuated bulb.  Energetic moving molecules?  Allowed to go into a larger volume?  What else — a “spreading out of energy deal”: entropy increases because the molecules’ motional energy (with the system’s phase change energy) becomes more spread out/dispersed in the larger volume. (Of course, that should be followed up by developing why reversible restoration of the system is an essential quantitative corroboration of the entropy increase, the qrev/T for the irreversible change.)

          The better qualitative answer to this gas expansion question comes from a more detailed analysis via molecular thermodynamics, especially if one fact from quantum mechanics is added: The energy levels of a particle in a box become closer together, more dense, the larger the box.  Therefore, we can conclude that the number of accessible energy levels for molecular energies within any small energy span increases when the volume of a system increases.  So with many more energy levels for molecules' energies, there must be a greatly increased number of newly-accessible arrangements of those energies — i.e., many, many more microstates for the system in the new larger volume than in the original space.  Thus, with spontaneous gas expansion, the entropy increases because the original energy now has many more microstates in any one of which it might be at any one instant — i.e., the system's energy is dispersed, in terms of microstates.
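(The quantum-mechanical fact cited above can be made concrete with the one-dimensional particle-in-a-box formula, En = n²h²/(8mL²). The sketch below assumes an argon atom, chosen arbitrarily for illustration, and compares the spacing of the two lowest levels in a 1 cm box and a 2 cm box: doubling L cuts the spacing fourfold, so the levels are denser in the larger volume.)

```python
h = 6.62607015e-34      # Planck constant, J s
m_Ar = 6.6335e-26       # mass of an argon atom, kg

def level_energy(n, L, m=m_Ar):
    """Translational energy of a particle in a 1-D box of length L (meters):
    E_n = n^2 h^2 / (8 m L^2)."""
    return n**2 * h**2 / (8 * m * L**2)

# Spacing between the two lowest levels in a 1 cm box vs a 2 cm box:
gap_1cm = level_energy(2, 0.01) - level_energy(1, 0.01)
gap_2cm = level_energy(2, 0.02) - level_energy(1, 0.02)
print(gap_1cm, gap_2cm)   # both gaps are a vanishingly small fraction of a joule
print(gap_1cm / gap_2cm)  # spacing shrinks by a factor of 4 when L doubles
```

The absolute spacings come out around 10^-38 J, which is also why the text can treat translational levels as effectively continuous.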

The spontaneous mixing of fluids

          Two different ideal gases placed in two connected bulbs will mix spontaneously when the stopcock between the two bulbs is opened.  The initial pressure and temperature will be unchanged.  Thus, such a spontaneous isobaric isothermal process must be due to an entropy increase. The reason is not that there is an "entropy of mixing" for ideal gases.  Rather, it is simply an entropy increase due to greater dispersion of the molecular energy of each component throughout the new larger volume of the two bulbs — completely parallel to the free expansion of a gas into a vacuum. 

(The examples are far more interesting to a class than gas mixtures, I believe, except perhaps for perfume across a room! This is where a food dye in water can be demonstrated, and where we have an explanation for cream mixing in coffee, especially in ideal experiments where we can say there is no fluid movement or convection.)

In both gas and liquid mixing, the motional energy of each component of the mixture has a greater volume in which it can be dispersed more widely. Therefore, the entropy of each increases. However, the new volume of the mixture of liquids is not simply the total of the components’ volumes, as is the case for gases (initially at equal pressures). Instead, the entropy change is calculated from the relative number of moles of each type of liquid in the mixture. Even though the calculation is more complex, the fact is simple and clear — any spontaneous mixing allows the molecules of each component in the mixture to spread out their energy more widely, and each component’s entropy increases.

Calculations of entropy change in liquid mixtures are based on statistical mechanics wherein a model of the relative quantities of components in the mixture is constructed by placing those quantities of molecules (representing real and energetic molecules!) in ‘cells’ in three-dimensional space. Because a cell is considered located at a position or configuration in space, any calculation of the number of those cells — compared to the one configuration of the unmixed component — is often called “positional” or “configurational” entropy from Boltzmann’s ΔS = kB ln Ways/1. This is unfortunate in texts that so identify the results, because each cell in a given position or having a given configuration is actually a Way, a microstate, an arrangement of the total molecular energies for the mixture! But if the word “positional” is used, entropy change of mixing seems completely different from “thermal” entropy change — even though both are measured by change in the number of microstates. (See http://entropysite.oxy.edu/#calpoly for a clarification of this error.)
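(For instructors who want the number itself: ideal mixing gives ΔS = -R Σ ni ln xi, which is exactly the "relative number of moles" calculation described above. A sketch with arbitrarily chosen amounts of two ideal liquids:)

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_mixing_entropy(moles):
    """dS = -R * sum(n_i ln x_i) for ideal mixing, where x_i = n_i / n_total.
    Each term is the entropy increase of one component, as described in the text."""
    n_total = sum(moles)
    return -R * sum(n * math.log(n / n_total) for n in moles)

# e.g. 1 mol of one ideal liquid mixed with 3 mol of another:
print(ideal_mixing_entropy([1.0, 3.0]))   # in J/K, always positive
```

With equal moles of the two liquids the result reduces to 2R ln 2 for the 2 mol total, the same form obtained for two gases each doubling in volume.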

The increase in entropy when a solution is formed from a solute and a pure solvent
Osmosis and other colligative effects

          All colligative effects are due to the increased entropy of the solvent in a solution as compared to the pure solvent alone.  Probably, to most classes, this should be presented simply as a finding or a fact.  To classes with superior students, the preceding detailed analysis of the cell model in statistical mechanics could be shared or summarized.  An entropy increase occurs in a solvent even if only a small amount of ideal solid solute (no heat effects) is added to form a solution.  This is because those solute molecules are distributed throughout the solution, affecting the nature of the interaction of solvent molecules with one another. No longer does every solvent molecule have only other solvent molecules around it. Some therefore are as separated from each other as though they were in a larger volume — and that means that these energetic molecules have their energy more dispersed than in pure solvent. Their entropy has increased. Thus, the entropy of a solution is increased to an extent that is dependent on the number of moles of solute that have been added. Because the solvent molecules in a solution have a larger entropy than when in pure solvent, they tend less to "escape" from their greater-entropy state in the solution to a vapor phase or to a solid phase than from the pure solvent.

          Then colligative effects such as osmosis are easily explained. They all involve an increase in entropy of the solvent molecules if they are in a solution.  If a membrane permeable only to solvent molecules is placed between some solvent and a solution of it, the solvent will spontaneously move through the membrane to the solution side.  Why?  Because if its molecules go into that solution, the entropy of the solvent molecules will increase; their energy becomes more widely dispersed in the solution than it was in the pure solvent. Change will take place if an entropy increase can occur due to that change.

          The elevation of the boiling point of a solution that contains a non-volatile solute is also caused by an entropic effect.  Because the solvent has a higher entropy in the solution than the pure solvent at its ordinary boiling point, there are not enough solvent molecules moving from the liquid solution to the vapor at that usual boiling point temperature for the vapor pressure to equal the atmospheric pressure of 760 mm Hg.  (In some texts it is said merely that the solvent's "escaping tendency" is lowered in a solution, with little or no explanation.  However, this omits the basic cause of the phenomenon, the greater entropy of the solvent when it is in a solution.)  This lower vapor pressure (fewer molecules escaping from the solution) at the normal boiling point can only be overcome by increasing the temperature of the solution and thereby the average energy of the molecules in it, including the solvent molecules.  Then, at some temperature above the usual boiling point, enough solvent molecules will be leaving the solution so that an equilibrium at 760 mm Hg will be established between the solution and the solvent vapor.

          The depression of the freezing point of a solution containing a solute that is insoluble in the solid phase of the solvent is similarly caused by an entropic effect.  The solvent in the solution has a larger entropy value than the pure solvent.  Therefore, unlike the pure solvent and solid being at an equilibrium for crystallization to occur at the solvent's normal freezing point, the solvent in the solution has too much entropy — too little "escaping tendency" — to leave the solution and form the intermolecular bonds of crystals at that usual freezing temperature.  Accordingly, the solution must be cooled so that there is less energy dispersed within it and fewer accessible microstates in it, and its entropy is thereby decreased.  As the temperature of the surroundings is lowered and the entropy of the atmosphere and the solid ice decreases more slowly than does that of the solution, equilibrium between the liquid and solid phases is established for crystallization at some point below the normal freezing point of the pure solvent. Then, freezing can occur as energy continues to be dispersed from the solution to the cooler surroundings.
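(The sizes of these boiling-point and freezing-point shifts follow, for dilute solutions, the familiar formulas ΔTb = Kb·m and ΔTf = Kf·m, with m the molality. A sketch assuming water as the solvent, with handbook values of the constants:)

```python
# Dilute-solution approximations for water as solvent (handbook constants):
K_b = 0.512   # K kg/mol, ebullioscopic constant of water
K_f = 1.86    # K kg/mol, cryoscopic constant of water

def boiling_point_elevation(molality):
    return K_b * molality   # dT_b = K_b * m

def freezing_point_depression(molality):
    return K_f * molality   # dT_f = K_f * m

# 1 mol of a non-volatile solute such as sucrose per kg of water:
print(100.0 + boiling_point_elevation(1.0))  # boils a bit above 100 C
print(0.0 - freezing_point_depression(1.0))  # freezes a bit below 0 C
```

Both shifts trace back to the same cause described above: the higher entropy of the solvent when it is in the solution.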

Chemical Reactions — “how much” and “how widely” energy is dispersed in the Gibbs free energy equation

ΔG is "free energy"?  What does that mean?

          "Free energy", represented by ΔG in the Gibbs equation, ΔG = ΔH - TΔS, is said to be the maximum non-expansion work that can be obtained from a process (in a system at constant temperature).  However, calling it an "energy" has been vigorously criticized because all types of energy are conserved according to the First Law of Thermodynamics whereas ΔG is not conserved.  What does that mean?

          From what we have been considering here, it is easy for us to see why ΔG is not a "usual" kind of energy that is described by the First Law, just as the qrev in the qrev/T of entropy isn't a "usual" energy.  (In qrev/T, qrev is energy defined by its being involved in a specific kind of action:  It is energy that has been, or could be, dispersed in an equilibrium situation and can be directly related to energy dispersal in non-equilibrium processes.)

          The nature of ΔG can be shown by dividing the Gibbs equation by -T; it then becomes

          -ΔG/T = -ΔH/T + ΔS.  Look at that:  Every term in the equation is now an entropy expression!  Starting from the right of the equation, ΔS is the entropy change of the reaction in the system, due to Sproducts - Sreactants.  The -ΔH/T is the entropy change of the surroundings due to energy dispersed from the reaction to the surroundings.  And, finally, the -ΔG/T is the entropy change of the universe (surroundings plus system) but note!  ΔG (or, showing that it comes from the system, -ΔG) therefore is the net dispersed or dispersible energy due to the reaction occurring inside the system.
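(The term-by-term identity is easy to verify numerically. The ΔH and ΔS values below are invented solely for illustration:)

```python
# Hypothetical reaction values, just to check the term-by-term identity:
T = 298.15          # K
dH = -100_000.0     # J     (energy dispersed from the system to the surroundings)
dS = -100.0         # J/K   (entropy change of the system)

dG = dH - T * dS                       # Gibbs equation
dS_surroundings = -dH / T              # entropy change of the surroundings
dS_universe = dS_surroundings + dS     # total entropy change

# Dividing the Gibbs equation by -T gives -dG/T = -dH/T + dS:
assert abs(-dG / T - dS_universe) < 1e-9
print(f"dG = {dG:.0f} J; dS_universe = {dS_universe:.2f} J/K")
```

Note that a negative ΔG corresponds exactly to a positive entropy change of the universe, which is the point of the rearranged equation.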

          That's why ΔG isn't conserved and isn't a "usual" energy.  ΔG is energy that is being measured by how much it spreads out/T or by how spread out/T it becomes.  Unlike energy in general, but like all energy whose entropy effects we have talked about, we now can see it as "entropy energy" — energy that is being or can be dispersed at a specific temperature, T, as a result of a chemical reaction.

          Just changing terminology, to "dispersible energy" rather than "free energy", is not the point.  The important idea is that if we focus on energy flow, we can have, and can help our students to have, a better sense of what entropy means in this fundamental equation.  Entropy isn't a complex or incomprehensible word, but rather a universal working tool.

 

entropy.lambert@gmail.com
Last revised and updated: November 2005
