The content of this Web site has been selected for instructors in general and physical chemistry by Dr. Frank L. Lambert, Professor Emeritus (Chemistry) of Occidental College, Los Angeles (Wikipedia, pro biography; personal biography). It consists of copyrighted articles from the Journal of Chemical Education and the Chemical Educator that deal with a modern view of entropy change: a measure of how widely the energy in a process is dispersed or spread out in space or phase space (at T). Considerable non-published supplementary material concerning entropy and teaching it to beginners is also included.
Concisely, the second law is “Energy of all types changes from being localized to becoming more spread out, dispersed in space** if that energy is not constrained from doing so”.
In chemistry, the motional (kinetic) energy of molecules is most frequently involved (but in liquids that also includes their potential energy due to the phase change of fusion, and in gases their additional vaporization energy). Thus, molecules are ‘energy carriers’ above 0 K. (Both direct heat exchange upon contact and radiation can be the ways to disperse energy in a chemical process.)
The simplest example is the stereotypical textbook illustration of the spontaneous expansion of an ideal gas from one bulb to occupy both that bulb and an attached evacuated bulb: The initial motional/kinetic energy (and potential energy) of the molecules in the first bulb is unchanged in such an isothermal expansion process, but it becomes more widely distributed in the final larger volume. Further, this concept of energy dispersal equally applies to heating a system: a spreading of molecular energy from the volume of greater-motional energy (“warmer”) molecules in the surroundings to include the additional volume of a system that initially had “cooler” molecules. It is not obvious, but true, that this distribution of energy in greater space is implicit in the Gibbs free energy equation and thus in chemical reactions. [See pp. 12-14 in entropysite.oxy.edu/entropy_isnot_disorder.html.]
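The macro/molecular correspondence in this bulb example can be checked numerically. The sketch below (an illustration in Python, not taken from the cited articles) computes the entropy change for the isothermal doubling of volume both from the classical formula ΔS = nR ln(V2/V1) and from the Boltzmann count: each molecule has twice as many locations available, so W2/W1 = 2^N.

```python
import math

R  = 8.314462618          # gas constant, J/(mol*K)
kB = 1.380649e-23         # Boltzmann constant, J/K
NA = 6.02214076e23        # Avogadro's number, 1/mol

n = 1.0                   # mol of ideal gas
V_ratio = 2.0             # final/initial volume (second bulb opened)

# Macroscopic (classical) entropy change for isothermal expansion:
dS_macro = n * R * math.log(V_ratio)

# Statistical view: W2/W1 = 2**N for N molecules, so
# dS = kB * ln(W2/W1) = N * kB * ln 2 (computed via logs, since 2**N
# itself is astronomically large)
N = n * NA
dS_micro = N * kB * math.log(V_ratio)

print(dS_macro, dS_micro)   # both come out to about 5.76 J/K
```

The two routes agree because NA·kB = R; the "wider spreading of energy in space" and the "more microstates" descriptions are the same ΔS.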
“Entropy change is the measure of how more widely a specific quantity of molecular energy is dispersed in a process, whether isothermal gas expansion, gas or liquid mixing, reversible heating and phase change, or chemical reactions, as shown by the Gibbs free energy equation/T.”
**This literal greater spreading of molecular energy in 3-D space in an isothermal process is accompanied by occupancy of more quantum states (“closer energy levels”) within each microstate and thus more microstates for the final macrostate (i.e., W2/W1 in ΔS = kB ln W2/W1). Similarly, in any thermal process higher-energy quantum states (“higher energy levels”) can be significantly occupied – thereby increasing the number of microstates in the product macrostate as measured by the Boltzmann kB ln W. In this case, if the volume is constrained, the energy, rather than being distributed more widely in space, is spread among a larger number of energetic options within the system.
(See I. N. Levine, Physical Chemistry, 6th ed. 2009, p. 101, and ref. to entropysite.oxy.edu.)
There are two requisites for entropy change: (entropysite.oxy.edu/ConFigEntPublicat.pdf) First, it is enabled by the above-described increased distribution of molecular energy. Second, it is actualized (realized) if the process makes available a larger number of arrangements for the system’s energy, i.e., a final state that involves the most probable distribution of that energy under the new constraints. Temperature (when it is low) often creates such constraints.
Calculations that involve only probability can correctly predict a final state, but they are like the results from a hand calculator working only with pure numbers -- disconnected from anything to which the numbers apply. Thermodynamic entropy increase in chemistry must explicitly be seen and stated as involving energetic mobile molecules spreading out in space and/or in an increased number of accessible microstates.
ΔS = qrev/T = kB ln (W2/W1)
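For reversible heating at (roughly) constant heat capacity, the macroscopic side of this identity integrates to ΔS = Cp ln(T2/T1). A short illustrative calculation (the Cp value for liquid water is an approximation assumed for this example):

```python
import math

# Reversible heating of 1 mol of liquid water, Cp taken as constant
Cp = 75.3                  # J/(mol*K), approximate molar heat capacity
T1, T2 = 298.15, 323.15    # K

# dS = dq_rev/T with dq_rev = Cp dT  =>  integrating, dS_total = Cp ln(T2/T1)
dS = Cp * math.log(T2 / T1)
print(round(dS, 2))        # about 6.06 J/(mol*K)
```

The positive ΔS reflects the energy spread into higher-energy quantum states as the water warms, exactly the "broader spreading" described above.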
Dr. John M. Garland, M.D., Ph.D. (Manchester University, U.K.), has informed me that his remarkable article -- an analysis of cancer that involves consideration of energy dispersal (entropy increase) and the spread of cancer to distant sites (in fractal terms) -- has been accepted by the Elsevier journal Critical Reviews in Oncology/Hematology for publication in a couple of months. We have communicated frequently over the past two years, but of course the content and its magnitude (51 pages of ms. with 155 references) are totally his accomplishment. (The abstract is at http://dx.doi.org/10.1016/j.critrevonc.2013.04.001 )
He summarized it for me thus:
This article arose from considering that the variety of different entities thought to induce cancer is huge, but that all cancer cells energetically behave identically; they become highly motile, proliferate seemingly without control, and their internal organisation becomes increasingly degenerate. So, if this energetic behavior is universal, why is there not a universal mechanism?
Until now, cancer induction has been viewed as arising from a deranged biochemical pathway, usually by mutation, whose components are arranged linearly and end in cancer-prone targets, for example DNA replication control or cytoskeletal activity. There are problems with this, for example: on the molecular scale, how different pathways intersect; how many are needed for a given effect; how all the myriad biochemical pathways normally “fit together”; and how the universal phenotype progresses and develops. The first clue to cancer universality lies in the apparent switch from regulated structure and behavior to generalized increased dynamic activities at structural expense; as structure degenerates, dynamic activities such as motility and migration, replication, and indeed clinical poor outcomes, increase.
Fundamental to all biochemical pathways is the constant flow of energy in chemical reactions. The interaction of that flow with the environment in which the reactions take place is crucial and inter-dependent. Thus, building cell structure or undertaking dynamic activity is always dependent on energy flow; the greater the energy available, the more likely it is that a reaction will take place. However, in the process of structure-building the overall energy available for dispersal elsewhere becomes “locked in”, while in dynamic activity it is constantly being dispersed or spread out. Cells normally co-ordinate these options through numerous checks and balances – always with the possibility of random distribution of energy in or out of the cell. Thus, an unintended/unusual shift in cellular conditions could potentially favor dynamic activity that does not favor structure-building but rather random energy dispersal -- already suggesting a universal clue to cancer.
The interior of a cell is astronomically variable, and for a cell to function the environments managing all biochemical pathways must somehow be coherently organized. Nature widely uses a very simple tool, fractal geometry, to generate re-iterated “levels” of organization that can deal with huge spatial variation. Clouds and snowflakes are examples: all different, but overall the same shape regardless of size. If we now apply fractal geometry to the distribution of energy flow in cells, the flows will self-assemble into fractal “corridors” where the components/reactions are decided by the amount of energy they dissipate or by minimal energy states. This fractal “map” of energy dispersal is continuously changing according to energy demand, as in melting/freezing snowflakes, formation of clouds, or generation of river patterns. The significance of this organization is that all parts are dynamically seamless; new areas can be added at different levels, and an infinite number of components can fit into the same “space” (in the model termed Domains, which at any level are absorbed into the next tier). The video files in the on-line article illustrate the creation of fractals, which can very easily be conceived as energy flows arising from domains. However, and crucially for this model, every biochemical reaction is connected, in some way, by its pattern of energy flow to every other. Further, pathways need not be linearly ordered nor consecutive, and components self-assemble according to ‘normal’ energetic criteria (see reference 155, Jun & Wright, Nat. Rev. Microbiol. 8, 600, 2011). This provides the key to a universal framework both for normal cell operation and also for cancer; in cancer, the fractal network of energy dispersal is simply skewed towards maximizing random energy flow in dissipation rather than orderly structure and function.
The final element concerns the enormous unpredictable variability within cells; there are so many things going on at once. Chaos theory deals mathematically with unpredictability, and chaos can be linked to fractal geometry through descriptions of unpredictable behaviors which influence fractal formation. If this linkage is applied to fractal entropy, cell organisation is inherently stable overall because these variables effectively cancel each other. For cancer, however, the situation is different. Cancer-inducing agents all share the property of being permanently active (i.e. they uniformly favor the random dispersal of energy rather than its use in creating structure), whether in a suppressive or activation mode. The effect of this is to replace variability with a constant dissipative component and a permanent re-alignment of energy flow toward random dispersal. Over time the network becomes increasingly focused on dissipation, and the activities with the greatest dissipative capacity, i.e. dynamic activity, lead to the self-generating progressive disturbances recognized in cancer.
Many aspects of the model are amenable to testing, for example using mathematical modelling or exploring how enzyme systems energetically self-assemble in complex mixes.
[In the Acknowledgements section of Dr. Garland's article, he refers to "Professor F. Lambert (Occidental College, USA)" and then concludes with "This paper is dedicated to Professor Lambert for his work on the fundamental applications of entropy."]
(All but one relate entropy to the spreading or dispersal of energy in a process.)
General chemistry texts for majors
1. Bell, J. et al. Chemistry, 1st ed., W. H. Freeman, New York, NY. 2005. ISBN 9780716731269. (Omits “disorder” -- but emphasizes “positional entropy”. See http://entropysite.oxy.edu/ConFigEntPublicat.pdf for the reason that “positional/configurational entropy” is unwise for beginners.)
2. Burdge, J. Chemistry, 2nd ed., McGraw-Hill, Hightstown, NJ. 2011. ISBN 9780077354763.
3. Burdge, J.; Overby, J. Chemistry: Atoms First, 1st ed., McGraw-Hill, Hightstown, NJ. 2012. ISBN 9780073511160.
4. Chang, R. Chemistry, 10th ed., McGraw-Hill, Hightstown, NJ. 2010. ISBN 9780077274313.
5. Chang, R.; Goldsby, K. Chemistry, 11th ed., McGraw-Hill, Hightstown, NJ. 2010. ISBN 9780073402680.
6. Chang, R.; Overby, J. General Chemistry: The Essential Concepts, 6th ed., McGraw-Hill, Hightstown, NJ. 2011. ISBN 9780077354718.
7. Ebbing, D.; Gammon, S. D. General Chemistry, 9th ed., Brooks/Cole - Cengage, Belmont, CA. 2011. ISBN 9780538697527.
8. Ebbing, D.; Gammon S. D.; Ragsdale, R. O. Essentials of General Chemistry, 2nd ed., Brooks/Cole - Cengage, Belmont, CA. 2006. ISBN 9780618491759.
9. Gilbert, T. R.; Kirss, R. V.; Foster, N.; Davies, G. Chemistry: The Science in Context, 3rd ed., W. W. Norton, New York, NY. 2010. ISBN 9780393934311.
10. Hill, J. W.; Petrucci, R. H.; McCreary, T. W.; Perry, S. W. General Chemistry, 4th ed., Pearson/Prentice Hall, Lebanon, IN. 2005. ISBN 9780131402836.
11. Jespersen, N. D.; Brady, J. E.; Hyslop, A. Chemistry: Matter and Its Changes, 6th ed., John Wiley, Indianapolis, IN. 2012. ISBN 9780470577714.
12. Kotz, J. C.; Treichel, P. M.; Townsend, J.; Weaver, G. Chemistry and Chemical Reactivity, 8th ed., Brooks/Cole/Cengage, Belmont, CA. 2012. ISBN 9780840048288.
13. Moore, J. W.; Stanitski, C. L.; Jurs, P. J. Chemistry: The Molecular Science, 4th ed., Brooks Cole/Cengage, Belmont, CA. 2011. ISBN 9780495390794.
14. Moore, J. W.; Stanitski, C. L.; Jurs, P. J. Principles of Chemistry: The Molecular Science, 1st ed., 2010. ISBN 9780495390794.
15. Olmsted, J. A.; Williams, G. M. Chemistry, 4th ed., John Wiley, Indianapolis, IN. 2006. ISBN 9780471478119.
16. Olmsted, J. A.; Williams. G. M.; Burk, R. C. Chemistry, John Wiley, Toronto, Ontario M9B 6H8, Canada. 2010. ISBN 9780470155790.
17. Oxtoby, D. W.; Gillis, H. P.; Campion, P. Principles of Modern Chemistry, 7th ed., Brooks Cole/Cengage, Belmont, CA. 2012. ISBN 9780534493660.
18. Petrucci, R. H.; Harwood, W. S.; Herring, G. General Chemistry: Principles and Modern Applications, 10th ed., Pearson/Prentice Hall, Lebanon, IN. 2011. ISBN 9780136121497.
19. Silberberg, M. Chemistry: The Molecular Nature of Matter and Change, 6th ed., McGraw-Hill, Hightstown, NJ. 2012. ISBN 9780077216504.
20. Silberberg, M. Principles of General Chemistry, 2nd ed., McGraw-Hill, Hightstown, NJ. 2010. ISBN 9780077274320.
21. Tro, N. J. Chemistry: A Molecular Approach, 2nd ed., Pearson/Prentice Hall, Lebanon, IN. 2011. ISBN 9780321706157.
22. Tro, N. J. Principles of Chemistry, Pearson/Prentice Hall, Lebanon, IN. 2010. ISBN 9780321560049.
General chemistry texts for non-majors
23. Hill, J. W.; Kolb, D. K.; McCreary, T. W. Chemistry for Changing Times, 13th ed., Pearson, Upper Saddle River, NJ. 2013. ISBN 9780321750877. (The modern treatment of entropy is also present in the 11th and 12th editions.)
24. Suchocki, J. Conceptual Chemistry: Understanding Our World of Atoms and Molecules, 4th ed., Pearson/Benjamin Cummings, San Francisco, CA. 2011. ISBN 9780136054535.
25. Tro, N. J. Chemistry in Focus: A Molecular View of Our World, 5th ed., Cengage, 2013. ISBN 9781111989064.
26. Tro, N. J. Introductory Chemistry, 4th ed., Pearson/Prentice Hall, Lebanon, IN. 2012. ISBN 9780321741028.
Physical chemistry texts
27. Atkins, P.; de Paula, J. Physical Chemistry 9th ed., W. H. Freeman, New York, NY. 2010. ISBN 9781429218122.
28. Atkins, P.; de Paula, J. Physical Chemistry for the Life Sciences, 1st ed., W. H. Freeman, New York, NY. 2005. ISBN 9780716782681.
29. Levine, I. N. Physical Chemistry, 6th ed., McGraw-Hill, Hightstown, NJ. 2009. ISBN 9780072538625. (Note p. 101, with ref. to entropysite.oxy.edu.)
Organic chemistry texts
30. Vollhardt, P. C.; Schore, N. E. Organic Chemistry, 6th ed., W. H. Freeman, New York, NY. ISBN 9781429204941.
Other texts and books
31. Thermodynamics and Chemistry, 2012, a 535-page high-level text online in PDF format at http://www2.chem.umd.edu/thermobook/ by Professor Emeritus Howard DeVoe of the University of Maryland. (p. 131: "This description of entropy as 'disorder' is highly misleading..."; p. 132: "This [correct] interpretation of entropy increase has been described ...as the dispersal of energy" [ref. to http://entropysite.oxy.edu/entropy_is_simple/index.html].)
32. Haseli, Y. Thermodynamic Optimization of Power Plants, Eindhoven University Press, Eindhoven, Netherlands. 2011. ISBN 9789038625225.
33. Olsen, B. D. Understanding Biology through Evolution, 4th ed., Lulu Press, Inc., Raleigh, NC. 2009. ISBN 9780557095391.
(Unless otherwise stated, all articles are copyright © by the Division of Chemical Education, Inc., of the American Chemical Society and reprinted by permission of the copyright owner. To access an article, click on its title.)
"Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms — Examples of Entropy Increase?" from the Journal of Chemical Education, Vol. 76, pp. 1385-1387, October 1999.
Changes in the arrangement of ordinary objects do not change their entropy. Entropy depends on the dispersal of energy at a specific temperature, not on a pattern. (Information "entropy", with no inherent or integral energy factor, is therefore only related in form, and not in function, to thermodynamic entropy, which must have an enabling factor of energy.)
"Disorder — A Cracked Crutch for Supporting Entropy Discussions" from the Journal of Chemical Education, Vol. 79, pp. 187-192, February 2002.
"Entropy is disorder" is an archaic, misleading definition of entropy dating from the late 19th century before knowledge of molecular behavior, of quantum mechanics and molecular energy levels, or of the Third Law of thermodynamics. It seriously misleads beginning students, partly because "disorder" is a common word, partly because it has no scientific meaning in terms of energy or energy dispersal. Ten examples conclusively demonstrate the inadequacy of "disorder" in general chemistry.
"Entropy Is Simple, Qualitatively" originally published in the Journal of Chemical Education, Vol. 79, pp. 1241-1246, October 2002.
Note: the article as presented here has been extensively revised and expanded, most recently in August 2005.
Energy disperses from being localized to becoming spread out if it is not hindered. This is the enabling factor for all spontaneous physical and chemical events. Entropy change measures the dispersal of energy in a process: how much is spread out, or how widely spread out that energy becomes. This is discussed in terms of macro thermodynamics, q(rev)/T, and molecular thermodynamics, kB ln (Wfinal/Winitial), where W counts microstates.
"“Disorder” in Unstretched Rubber Bands?" from the Journal of Chemical Education, Vol. 80, p. 145, February 2003.
The well-known experiment of stretching a rubber band has often been used as an example of entropy increase toward greater "disorder" in the unstretched band as the cause of a stretched band's contraction. Instead, from a scientific point of view, unstretched rubber has greater entropy than the stretched form because of the increased possibilities for energy dispersal among the more freely-moving portions of rubber molecules in unstretched rubber compared to an extended rubber band. Thus, a stretched band will spontaneously change to the unstretched state.
"Entropy and Constraint of Motion" from the Journal of Chemical Education, Vol. 81, pp. 639-640, May 2004.
Professor William B. Jensen, a chemistry professor and historian at the University of Cincinnati, has independently developed an approach to teaching entropy that interprets entropy change as a change in the dispersion of energy. His additional contributions are that only kinetic energy can become dispersed and that examination of the constraints to dispersion clarifies what mode energy dispersion takes. In my response here, I call attention to the dispersal of kinetic energy to potential energy (due to bond breaking) at fusion and vaporization temperatures.
"Teaching Entropy Analysis in the First-Year High School Course and Beyond", Thomas H. Bindel, from the Journal of Chemical Education, Vol. 81, pp. 1585-1594, November 2004.
A novel and creative 16-day teaching unit is presented that develops chemical thermodynamics at the introductory high school level and beyond — exclusively from an entropy viewpoint referred to as entropy analysis. Many concepts are presented, such as: entropy, spontaneity, the second law of thermodynamics, qualitative and quantitative entropy analysis, extent of reaction, thermodynamic equilibrium, coupled equilibria, and Gibbs free energy. Entropy is presented in a nontraditional way, using energy dispersal.
"Introduction of Entropy via the Boltzmann Distribution in Undergraduate Physical Chemistry: A Molecular Approach", Evguenii I. Kozliak, from the Journal of Chemical Education, Vol. 81, pp. 1595-1598, November 2004.
Several problems that hinder optimal communication with students in the conventional introduction to thermodynamics are identified. Even though students from their first course focus on chemistry as a molecular science, most texts in physical chemistry begin with the phenomenological Clausius formulation, thereby emphasizing its macroscopic aspect; the others concentrate on so-called "positional" entropy thus decoupling it from the entropy of heat exchange. The suggested approach uses simple examples based on the Boltzmann distribution to introduce the concept of entropy consistently on a molecular basis by emphasizing energy distribution due to the number of accessible microstates but bypassing the complexities of statistics. Thereby, a connection between the increase of entropy on expansion as well as on heating can be shown. A clear illustration is provided for the basic tenet of the second law, the spontaneous transfer of thermal energy from hot to cold bodies.
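The Boltzmann-distribution route Kozliak recommends can be illustrated with a toy two-level system (a sketch under assumed values, not code from the article): heating shifts population into the upper level, and the Gibbs/Boltzmann entropy per particle, −kB Σ p ln p, rises accordingly.

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K

# Two-level system: Boltzmann populations p_i proportional to exp(-E_i/kB T)
def populations(dE, T):
    weights = [1.0, math.exp(-dE / (kB * T))]
    Z = sum(weights)                 # partition function
    return [w / Z for w in weights]

# Entropy per particle: S = -kB * sum(p ln p)
def entropy(p):
    return -kB * sum(x * math.log(x) for x in p if x > 0)

dE = 1e-21   # J, illustrative level spacing (an assumed value)
for T in (50, 300, 3000):
    p = populations(dE, T)
    print(T, [round(x, 3) for x in p], entropy(p))
# Heating spreads the population over more accessible levels, so S rises
```

At high T the populations approach 50/50 and S approaches kB ln 2, the "equiprobable microstates" limit; at low T the energy is effectively confined to the ground level and S falls toward zero.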
"The Concentration Dependence of the ΔS Term in the Gibbs Free Energy Function: Application to Reversible Reactions in Biochemistry", Ronald K. Gary, from the Journal of Chemical Education, Vol. 81, pp. 1599-1604, November 2004.
Biochemistry students must use the concept of free energy change to understand reaction reversibility and the energetics of metabolism. The theory is founded on the Gibbs free energy function: ΔG = ΔH - TΔS.
Reactant and product concentrations affect the ΔS term and therefore determine whether ΔG is positive or negative at a standard temperature. However, most biochemistry texts do little to connect the sign of ΔG in this function to the concentration variables that determine it, and instead rely exclusively on the equation to relate these parameters. This can have the undesirable effect of rendering the Gibbs equation irrelevant for these students. For the biochemistry instructor, the challenge is to clarify the role of entropy in determining reaction directionality without digressing into aspects of thermodynamic theory that would be more appropriately covered in other courses. A model to explain the concentration dependence of the ΔS term is presented in a format that is appropriate for an audience of biochemistry students, and the concepts are illustrated using an aqueous phase reaction, the anomeric conversion of glucose.
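The concentration dependence Gary emphasizes follows from ΔG = ΔG° + RT ln Q. A minimal sketch (the ~64:36 β:α equilibrium ratio for aqueous glucose is an approximate illustrative value, not taken from the article's data):

```python
import math

R = 8.314      # gas constant, J/(mol*K)
T = 298.15     # K

# Anomeric conversion: alpha-glucose <=> beta-glucose.
# Illustrative equilibrium mixture of roughly 64% beta / 36% alpha,
# so K is about 64/36 (assumed values for this example).
K = 64 / 36
dG_std = -R * T * math.log(K)     # standard free energy change, J/mol

# Concentration dependence: dG = dG_std + RT ln Q, Q = [beta]/[alpha].
# Below K the forward reaction is spontaneous (dG < 0); at Q = K,
# dG = 0 (equilibrium); above K the reverse reaction runs.
for Q in (0.1, K, 5.0):
    dG = dG_std + R * T * math.log(Q)
    print(f"Q = {Q:5.2f}: dG = {dG / 1000:+.2f} kJ/mol")
```

The sign of ΔG is thus set by the concentrations through the TΔS term, which is exactly the point the abstract says most biochemistry texts leave implicit.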
"Playing-Card Equilibrium", Frank L. Lambert, from the Journal of Chemical Education, Vol. 81, p. 1569, November 2004. Complete letter to the Editor reproduced below, with permission from the JCE.
From experience, I am hypersensitive to the misconceptions of students and instructors that can be caused when playing cards are used in teaching chemistry (1). The root of such errors lies in overlooking the non-mobile, non-energetically-interacting nature of pieces of cardboard. Only while they are being shuffled can cards serve as some sort of analogy to molecular behavior in chemistry.
Thus, I found Hanson's "Playing-Card Equilibrium" of special interest (2). To me, his otherwise excellent treatment of probability in relation to chemical equilibrium lacked emphasis on shuffling as a vital element in the analogy. However, in a personal email, Professor Hanson said that his experience with teaching teachers did not show that they overlooked the importance of constant shuffling to simulate the interacting state of molecular movement. His summary is my view also: "The shuffling illustrates the equilibration, and counting the probabilities from the card arrangements at any moment is like taking snapshots of that dynamic process."
1. Lambert, F. L. J. Chem. Educ. 1999, 76, 1385-1387.
2. Hanson, R. M. J. Chem. Educ. 2003, 80, 1271-1274.
"“Order-to-Disorder” for Entropy Change? Consider the Numbers!", Evguenii I. Kozliak and Frank L. Lambert, from The Chemical Educator (an Online Journal) 10 (2005) 1, pp. 24-25 © The Chemical Educator 2005.
Click title above to download the article in Acrobat (pdf) format. Abstract is below. The text in brackets is an addendum to the original abstract.
Defining entropy increase as a change from order to disorder is misleading at best and incorrect at worst. Although Boltzmann described it this way in 1898, he did so innocently in the sense that he had never calculated the numerical values of W using ΔS = kB ln (W/W0) (because this equation was not stated, kB was not known, and W0 was indeterminable before 1900–1912). Prior publications have demonstrated that the word “disorder” is misleading in describing entropy change. In this paper, convincing evidence is provided that no starting system above ca. 1 K can be said to be orderly so far as the distribution of its energy (the fundamental determinant of entropy) is concerned. This is supported by a simple calculation showing that any system at a “practical state of zero entropy” has an incomprehensibly large number of accessible microstates.
[The calculation is from K. S. Pitzer, Thermodynamics (3rd ed.; McGraw-Hill, 1995), p. 67, eq. (5-3), and shows that any molar system, even at a temperature as cold as 1 K, has about …]
"Chemical Kinetics: As Important As The Second Law Of Thermodynamics?" Frank L. Lambert, from the Chemical Educator (an Online Journal) 3 (1998) 2, 6 pages © The Chemical Educator, 1998.
Note: In the article above, the intended marginal summary on the first page was "Chemical kinetics firmly restrains "time's arrow" in the taut bow of thermodynamics for milliseconds or for millennia."
The second law may be “time’s arrow” but activation energies (chemical kinetics) prevent second law predictions from occurring for femtoseconds to eons. This is humanly important: Activation energies not only protect all the organic chemicals in our bodies and our oxidizable possessions from instant combustion in air, but also our breakable skis and surfboards (and legs) from disastrous fracture. Murphy’s Law is often applied to chemical and physical mishaps — things going wrong. But things do not always follow the second law and burst into flame or break! Chemical kinetics is the reason Murphy’s Law usually fails.
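The "milliseconds or millennia" point can be made quantitative with the Arrhenius equation, k = A exp(−Ea/RT). This sketch (with an assumed, typical pre-exponential factor; not a calculation from the article) shows how modest changes in activation energy swing reaction timescales from fractions of a second to geological ages:

```python
import math

R = 8.314    # gas constant, J/(mol*K)
T = 298.15   # K, room temperature
A = 1e13     # 1/s, typical pre-exponential factor (assumed for illustration)

# Arrhenius: k = A * exp(-Ea / RT). A large activation energy makes a
# thermodynamically favorable (second-law-allowed) reaction immeasurably slow.
for Ea_kJ in (50, 100, 150):
    k = A * math.exp(-Ea_kJ * 1000 / (R * T))
    half_life = math.log(2) / k        # s, assuming first-order kinetics
    print(f"Ea = {Ea_kJ} kJ/mol: k = {k:.2e} /s, half-life = {half_life:.2e} s")
```

At 150 kJ/mol the half-life runs to hundreds of thousands of years, which is why our oxidizable possessions survive in air despite the second law's prediction.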
"Entropy and the Shelf Model: A Quantum Physical Approach to a Physical Property", Arnd H. Jungermann, from the Journal of Chemical Education, Vol. 83, pp. 1686-1694, November 2006.
For a number of years Jungermann has presented standard molar entropy to his students as energy that is stored in substances — using shelves as energy levels and S/kB = ln W as an introduction to the number of particles and their 'energy' distributions on various levels. With S0/R as a dimensionless but mass- and attractive-force-related property, Jungermann shows how these 'atomic entropy' values are related to trends in elements and compounds in the periodic table. His procedures and concepts fit well with our view: "The standard molar entropy of a substance at temperature T is a measure of the quantity of energy that must be dispersed in that substance for it to exist at T; that is, it is ΔS from 0 K to T."
"Consistent Application of the Boltzmann Distribution to Residual Entropy in Crystals", Evguenii I. Kozliak, from the Journal of Chemical Education, Vol. 84, pp. 493-498, March 2007.
A resolution of the old problem of understanding "residual entropy", the entropy remaining in crystals of compounds such as CO, N2O, FClO3 and H2O even as they approach absolute zero. The entropy present in the two or more arrangements of molecules in such crystals had previously been considered only in terms of "configurational" or "positional" entropy. Kozliak shows that the counting procedures in these entropy calculations are identical to what would result from considering the different forms on different energy levels — a considerably more fundamental focus on entropy values as related to energy distributions.
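The standard counting result for a two-orientation crystal such as CO can be sketched in a few lines (an illustration of the textbook calculation, not code from the article):

```python
import math

R = 8.314462618   # gas constant, J/(mol*K)

# In a frozen CO crystal each molecule can sit as C-O or O-C: 2 orientations.
# For a mole, W = 2**N, so the residual entropy is
#   S_res = kB * ln(2**N) = N * kB * ln 2 = R * ln 2
S_res = R * math.log(2)
print(round(S_res, 2))   # about 5.76 J/(mol*K) per mole of CO
```

The experimentally observed residual entropy of CO is somewhat lower (commonly quoted near 4.6 J mol⁻¹ K⁻¹) because the orientational scrambling that freezes in is only partial, which is consistent with the energy-level view Kozliak develops.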
“A Study of Turkish Chemistry Undergraduates' Understanding of Entropy”, Mustafa Sözbilir and Judith M. Bennett, from the Journal of Chemical Education, Vol. 84, pp. 1204-1208, July 2007.
“This study explores Turkish chemistry undergraduates' understanding of entropy and identifies and classifies their misunderstandings. For this purpose, a diagnostic questionnaire and semi-structured interviews were used—before and after teaching [about entropy in the physical chemistry course – to students who had also been taught entropy in their first-year course]….[Students were] from two different chemistry education departments in two different universities in Turkey…The misunderstandings identified were categorized into these five broad headings: (i) Defining entropy as "disorder" and considering visual disorder and entropy as synonymous; (ii) Inaccurate connection of entropy to the number of inter-molecular interactions; (iii) Inaccurate connection of entropy of a system and the accompanying entropy changes in its surroundings; (iv) Entropy of the whole system decreases or does not change when a spontaneous change occurs in an isolated system; and (v) Entropy of carbon dioxide is bigger than that of propane or the same at the same temperature. The findings have implications for tertiary-level teaching, suggesting that a substantial review of teaching strategies is needed.”
Dr. Sozbilir has told me that in his future writing about entropy he is adopting our approach to entropy and eliminating all reference to macro or molecular "disorder". (If you do not know how the idea of "disorder" came to be associated with entropy, see the link to Boltzmann's first erroneous deduction about "order" in nature here.)
"Configurational Entropy Revisited", Frank L. Lambert, from the Journal of Chemical Education, Vol. 84, pp. 1548-1550, September 2007
Entropy change is categorized in some prominent general chemistry textbooks as being either positional (configurational) or thermal. In those texts, the accompanying emphasis on the dispersal of matter — independent of energy considerations and thus in discord with kinetic molecular theory — is most troubling. This article shows that the variants of entropy can be treated from a unified viewpoint and argues that to decrease students' confusion about the nature of entropy change these variants of entropy should be merged. Molecular energy dispersal in space is implicit but unfortunately tacit in the cell models of statistical mechanics that develop the concept of configurational entropy change. Two factors are necessary for entropy change in chemistry. An increase in thermodynamic entropy is enabled in a process by the motional energy of molecules (that, in chemical reactions, can arise from the energy released from a bond energy change). However, entropy increase is only actualized if the process results in a larger number of arrangements for the system's energy, that is, a final state that involves the most probable distribution for that energy under the new constraints. Positional entropy should be eliminated from general chemistry instruction and, especially benefiting "concrete minded" students, it should be replaced by emphasis on the motional energy of molecules as enabling entropy change.
“Residual Entropy, the Third Law and Latent Heat”, Evguenii I. Kozliak and Frank L. Lambert, from Entropy (an Online open access Journal), Vol. 10 (3), pp. 274-284, 2008
A thermodynamic treatment of residual entropy in crystals, involving the configurational partition function, is suggested, which is consistent with both classical and statistical thermodynamics. It relates residual entropy to the inherent latent heat which would be released upon cooling if the reversible path were available. The nature of this heat is that if the crystal possessing residual entropy freezes above its Boltzmann’s characteristic temperature of molecular alignment, the difference in energy between different molecular arrangements is overcome by the kT heat bath to form a nearly-ideal solution. However, upon cooling below this characteristic temperature, they would separate with a concomitant release of the corresponding energy, provided the reversible path were available.
"The Correlation of Standard Entropy with Enthalpy Supplied from 0 to 298.15 K”, Frank L. Lambert and Harvey S. Leff, from the Journal of Chemical Education, Vol. 86, pp. 94-98, January 2009.
As a substance is heated at constant pressure from near 0 K to 298 K, each incremental enthalpy increase, ΔH, alters entropy by ΔH/T, bringing it from approximately zero to its standard molar entropy, S°. Using heat capacity data for 32 solids and CODATA results for another 45, we found a roughly linear relationship between S° and ΔH°. The plot showing the relationship S° ≈ (constant) ΔH°, with constant = 0.0066 K⁻¹, for 77 solids can serve as an enlightening visualization of this relationship for students in general chemistry. The near-linearity can be understood qualitatively in terms of lattice vibrations and internal vibrations within polyatomic units, which are reflected by molar heat capacities and Debye temperatures. This study supports the thesis that thermodynamic entropy and stored internal energy in a solid are intimately related and that entropy can be usefully interpreted as a spreading function, as described in the text.
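The rough proportionality can be spot-checked for individual solids. The snippet below uses CODATA-style key values quoted from memory (verify against the tables before reuse); it simply forms the S°/ΔH° ratio and compares it with the paper's 0.0066 K⁻¹ constant:

```python
# Spot-check of S°(298) / [H°(298) - H°(0)] for two elemental solids.
# Values are CODATA-style key values quoted from memory (assumptions
# for illustration; verify before reuse).
solids = {
    # name: (S°298 in J/(mol*K), H°298 - H°0 in J/mol)
    "Cu": (33.15, 5004.0),
    "Al": (28.30, 4540.0),
}

for name, (S, dH) in solids.items():
    print(f"{name}: S°/ΔH° = {S / dH:.4f} 1/K")
# The ratios cluster near the paper's fitted constant, 0.0066 1/K
```

A fuller check would numerically integrate Cp/T and Cp from heat-capacity data, as the authors did for their 32 solids.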
“Overcoming Misconceptions about Configurational Entropy in Condensed Phases”, Evguenii I. Kozliak, from the Journal of Chemical Education, Vol. 86, pp. 1063-1068, September 2009.
Configurational and thermal entropy yield identical numerical values for ΔS only when the system's "dimensionless" energy gaps (Δε/kT) between the accessible quantized energy levels are minimized by temperature to nearly infinitesimal values, so that the spreading of energy among the system's microstates becomes effectively classical and equiprobable. The molecular partition function provides the effective number of both accessible states and spatial configurations per molecule for which this condition holds at a given temperature. Considering the phenomenon of mixing and standard molar entropy values leads to the conclusion that configurational entropy calculations are significant and thermodynamically valid because of their fundamental connection to random energy redistribution in a system via the available modes of molecular motion.
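The limit Kozliak describes, in which energy spreading becomes effectively classical and equiprobable as Δε/kT → 0, is easy to verify numerically. A minimal sketch (the level spacings and kT values are arbitrary illustrative numbers, not quantities from the article):

```python
import math

def boltzmann_populations(energies, kT):
    """Return the molecular partition function q and the fractional
    populations p_i = exp(-e_i/kT) / q for a set of quantized levels."""
    weights = [math.exp(-e / kT) for e in energies]
    q = sum(weights)
    return q, [w / q for w in weights]

levels = [0.0, 1.0, 2.0]  # three levels, unit gap (arbitrary energy units)

# Gaps large relative to kT: essentially only the ground state is populated.
q_cold, p_cold = boltzmann_populations(levels, kT=0.2)

# Gaps tiny relative to kT: q approaches the number of levels and the
# populations approach 1/3 each -- the classical, equiprobable limit.
q_hot, p_hot = boltzmann_populations(levels, kT=1000.0)
```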
. . . . . . . . .
Kozliak’s thorough connection of classical and quantum mechanics is not easy reading, but his conclusions about the profound differences between the “entropy” of playing cards and thermodynamic entropy, as well as other variants of over-hyped “probability”, are supported more explicitly and strongly than in any previous treatment in the literature. FLL
"Entropy Is Not 'Disorder'; It Is a Measure of the Dispersal of Energy"
A complete summary of the concept and its application to macro and molecular thermodynamics for chemistry instructors.
"What is a microstate?"
Two approaches to understanding a microstate, a description of one arrangement of a system’s energy.
"A Student's Approach to the Second Law and Entropy"
A short introduction to the second law and entropy for students. Written with the hurried student in mind.
"Teaching Entropy Is Simple — If You Discard 'Disorder'"
An introduction for AP teachers to the concept of entropy as measuring the dispersal of energy at a specific temperature.
"'Configurational' Entropy: A Measure of Energy Dispersal in Statistical Mechanical Calculations"
A subtopic in the chemistry seminar at California State Polytechnic University, Pomona on November 8, 2005.
This initial presentation was developed further to become Article 15 in the September 2007 Journal of Chemical Education, described above on page 11.
'Positional' or 'configurational' entropy change in general chemistry texts misattributes the change to a probable increase in molecular positions. Actually, those positions represent the increased number of microstates, i.e., the spreading out of the initially more localized energy of the components.
"The Second Law of Thermodynamics" and "Entropy in General Chemistry"
Written by Dr. Lambert for Wikibooks, these two articles contain material that is scattered across this site but is presented in a somewhat different format, designed to be readily readable by beginners in chemistry and accessible to students not majoring in science.
"'Disorder' in Thermodynamic Entropy"
The historical origin of the introduction of 'disorder' by Boltzmann, reproduced in response to many questioners. This brief article is also an introduction for instructors who are not familiar with my approach to understanding entropy change. It closes with a description of the clear distinction between thermodynamic entropy and Shannon information "entropy".
I thank the Journal of Chemical Education for permission to reprint these articles on this Web site. The Journal serves chemistry instructors worldwide and across the span of education from K-12 teachers to professors in graduate school. Permission by the JCE to display the logo below does not constitute any sort of endorsement of this site by the Journal or the American Chemical Society. The logo link is reproduced here only to aid the reader in learning more about the JCE and its remarkable print and software contributions to chemical education.
My thanks go to the Information Technology Services department of Occidental College for updating this site, and especially to Luu Tran for setting up the site and bringing my manuscripts to the web for the past decade.
Frank L. Lambert, Professor Emeritus (Chemistry)
Occidental College, Los Angeles CA 90041