The content of this Web site has been selected for instructors in general and physical chemistry by Dr. Frank L. Lambert, Professor Emeritus (Chemistry) of Occidental College, Los Angeles (Wikipedia biography; personal biography). It consists of copyrighted articles from the Journal of Chemical Education and The Chemical Educator that deal with a modern view of entropy change: a measure of how widely the energy in a process is dispersed or spread out in space or phase space (at T). Considerable unpublished supplementary material concerning entropy and teaching it to beginners is also included.
Concisely, the second law states: “Energy of all types changes from being localized to becoming more spread out, dispersed in space,** if that energy is not constrained from doing so.”
In chemistry, the motional (kinetic) energy of molecules is most frequently involved (but in liquids that also includes their potential energy due to the phase change of fusion, and in gases their additional vaporization energy). Thus, molecules are ‘energy carriers’ above 0 K. (Both direct heat exchange upon contact and radiation can disperse energy in a chemical process.)
The simplest example is the stereotypical textbook illustration of the spontaneous expansion of an ideal gas from one bulb to occupy both that bulb and an attached evacuated bulb: The initial motional/kinetic energy (and potential energy) of the molecules in the first bulb is unchanged in such an isothermal expansion process, but it becomes more widely distributed in the final larger volume. Further, this concept of energy dispersal equally applies to heating a system: a spreading of molecular energy from the volume of greater-motional energy (“warmer”) molecules in the surroundings to include the additional volume of a system that initially had “cooler” molecules. It is not obvious, but true, that this distribution of energy in greater space is implicit in the Gibbs free energy equation and thus in chemical reactions. [See pp. 12-14 in entropysite.oxy.edu/entropy_isnot_disorder.html.]
“Entropy change is the measure of how much more widely a specific quantity of molecular energy is dispersed in a process, whether isothermal gas expansion, gas or liquid mixing, reversible heating and phase change, or chemical reactions, as shown by the Gibbs free energy equation/T.”
**This literal greater spreading of molecular energy in 3-D space in an isothermal process is accompanied by occupancy of more quantum states (“closer energy levels”) within each microstate and thus more microstates for the final macrostate (i.e., W2/W1 in ΔS = kB ln W2/W1). Similarly, in any thermal process higher energy quantum states (“higher energy levels”) can be significantly occupied – thereby increasing the number of microstates in the product macrostate as measured by the Boltzmann kB lnW. In this case, if the volume is constrained, the energy does not spread into a larger space; instead it is spread among a larger number of energy options within the system.
(See I. N. Levine, Physical Chemistry, 6th ed. 2009, p. 101, and ref. to entropysite.oxy.edu.)
There are two requisites for entropy change: (entropysite.oxy.edu/ConFigEntPublicat.pdf) First, it is enabled by the above-described increased distribution of molecular energy. Second, it is actualized (realized) if the process makes available a larger number of arrangements for the system’s energy, i.e., a final state that involves the most probable distribution of that energy under the new constraints. Temperature (when it is low) often creates such constraints.
Calculations that involve only probability can correctly predict a final state, but they are like the results from a hand calculator working only with pure numbers, disconnected from anything to which the numbers apply. Thermodynamic entropy increase in chemistry must explicitly be seen and stated as involving energetic, mobile molecules spreading out in space and/or into an increased number of accessible microstates.
∫ dqrev/T = ΔS = kB ln (W2/W1)
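As a numerical sketch of this equivalence (not from the site itself; a minimal illustration using standard CODATA constants), consider the stereotypical example above: isothermal doubling of the volume of one mole of ideal gas. The macroscopic route, ΔS = nR ln(V2/V1), and the molecular route, ΔS = kB ln(W2/W1) with W2/W1 = 2 to the power NA (each molecule has twice as many locations for its energy), give the same number:

```python
import math

R = 8.314462618        # gas constant, J/(mol·K)
k_B = 1.380649e-23     # Boltzmann constant, J/K
N_A = 6.02214076e23    # Avogadro's number, 1/mol

# Macroscopic route: for a reversible isothermal ideal-gas expansion,
# q_rev/T = nR ln(V2/V1).  Doubling the volume of 1 mol:
dS_macro = 1.0 * R * math.log(2)

# Molecular route: every molecule has twice the volume available,
# so W2/W1 = 2**N_A and ΔS = k_B ln(W2/W1) = k_B * N_A * ln 2.
dS_micro = k_B * N_A * math.log(2)

print(dS_macro, dS_micro)   # both ≈ 5.76 J/K
```

The agreement is exact because kB·NA = R; the energy of the gas is unchanged, yet it is more widely dispersed, and the entropy increase measures exactly that.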
The 35 Science Textbooks That Have Deleted "disorder" From Their Description of the Nature of Entropy
(As advocated in the publications of Dr. Frank L. Lambert, Professor Emeritus, Chemistry, Occidental College.)
All relate entropy, S, to the spreading or dispersal of energy in a process – often as simply the motional energy of atoms or molecules in a greater space, and always as the original energy of such particles becoming more dispersed over a greater number of microstates (S = k ln W).
ISBN data have been omitted because of the large number of different formats in texts in recent years. (All are available from Amazon.com.)
To print this list, a printable version is available at http://entropysite.oxy.edu/texts.html
General chemistry texts for majors
1. (The following text is especially important re the treatment of entropy because of its ACS source.) The American Chemical Society Project by seven authors, Chemistry in Context, 7th ed., McGraw-Hill, New York, NY.
2. Brady, J. E.; Senese, F. Chemistry: Matter and Its Changes, 5th ed., John Wiley, Indianapolis, IN. 2009.
3. Burdge, J. Chemistry, 2nd ed., McGraw-Hill, Hightstown, NJ. 2011.
4. Burdge, J.; Overby, J. Chemistry: Atoms First, 1st ed., McGraw-Hill, Hightstown, NJ. 2012.
(Both Burdge texts have unique two-page illustrations of the increased number of micro states due to a change in volume, temperature, molecular complexity, molar mass, phase, and chemical reaction.)
5. Chang, R. Chemistry, 10th ed., McGraw-Hill, Hightstown, NJ. 2010.
6. Chang, R.; Goldsby, K. Chemistry, 11th ed., McGraw-Hill, Hightstown, NJ. 2010.
7. Chang, R.; Overby, J. General Chemistry: The Essential Concepts, 6th ed., McGraw-Hill, Hightstown, NJ. 2011.
8. Ebbing, D.; Gammon, S. D. General Chemistry, 9th ed., Brooks/Cole - Cengage, Belmont, CA. 2011.
9. Ebbing, D.; Gammon S. D.; Ragsdale, R. O. Essentials of General Chemistry, 2nd ed., Brooks/Cole - Cengage, Belmont, CA. 2006.
10. Gilbert, T. R.; Kirss, R. V.; Foster, N.; Davies, G. Chemistry: The Science in Context, 3rd ed., W. W. Norton, New York, NY. 2010.
11. Hill, J. W.; Petrucci, R. H.; McCreary, T. W.; Perry, S. W. General Chemistry, 4th ed., Pearson/Prentice Hall, Lebanon, IN. 2005.
12. Jesperson, N. D.; Brady, J. E.; Hyslop, A. Chemistry: Matter and Its Changes, 6th ed., John Wiley, Indianapolis, IN. 2012.
13. Kotz, J. C.; Treichel, P. M.; Townsend, J.; Weaver, G. Chemistry and Chemical Reactivity, 8th ed., Brooks/Cole/Cengage, Belmont, CA. 2012.
14. Moore, J. W.; Stanitski, C. L.; Jurs, P. J. Chemistry: The Molecular Science, 4th ed., Brooks Cole/Cengage, Belmont, CA. 2011.
15. Moore, J. W.; Stanitski, C. L.; Jurs, P. J. Principles of Chemistry: The Molecular Science, 1st ed. John Wiley, Indianapolis, IN. 2010.
16. Olmsted, J. A.; Williams, G. M. Chemistry, 4th ed., John Wiley, Indianapolis, IN. 2006.
17. Olmsted, J. A.; Williams, G. M.; Burk, R. C. Chemistry, John Wiley, Toronto, Ontario, Canada. 2010.
18. Oxtoby, D. W.; Gillis, H. P.; Campion, P. Principles of Modern Chemistry, 7th ed., Brooks Cole/Cengage, Belmont, CA. 2012.
19. Petrucci, R. H.; Harwood, W. S.; Herring, G. General Chemistry: Principles and Modern Applications, 10th ed., Pearson/Prentice Hall, Lebanon, IN. 2011.
20. Silberberg, M. Chemistry: The Molecular Nature of Matter and Change, 6th ed., McGraw-Hill, Hightstown, NJ. 2012.
21. Silberberg, M. Principles of General Chemistry, 2nd ed., McGraw-Hill, Hightstown, NJ. 2010.
22. Tro, N. J. Chemistry: A Molecular Approach, 2nd ed., Pearson/Prentice Hall, Lebanon, IN. 2011.
23. Tro, N. J. Principles of Chemistry, Pearson/Prentice Hall, Lebanon, IN. 2010.
General chemistry texts for non-majors
24. Hill, J. W.; Kolb, D. K.; McCreary, T. W. Chemistry for Changing Times, 12th ed., Pearson/Prentice Hall, Lebanon, IN. 2010.
25. Suchocki, J. Conceptual Chemistry: Understanding Our World of Atoms and Molecules, 4th ed., Pearson/Benjamin Cummings, San Francisco, CA. 2011.
26. Tro, N. J. Chemistry in Focus: A Molecular View of Our World, 5th ed., Cengage, 2013.
27. Tro, N. J. Introductory Chemistry, 4th ed., Pearson/Prentice Hall, Lebanon, IN. 2012.
Physical chemistry texts
28. Atkins, P.; de Paula, J. Physical Chemistry 9th ed., W. H. Freeman, New York, NY. 2010.
29. Atkins, P.; de Paula, J. Physical Chemistry for the Life Sciences, 1st ed., W. H. Freeman, New York, NY. 2005.
30. Levine, I. N. Physical Chemistry, 6th ed., McGraw-Hill, New York, NY. 2009. (Note p. 101, with ref. to entropysite.oxy.edu.)
Organic chemistry texts
31. Vollhardt, P. C.; Schore, N. E. Organic Chemistry, 6th ed., W. H. Freeman, New York, NY. 2011.
Other texts
32. DeVoe, H. Thermodynamics and Chemistry, 2nd ed., 2012; a 535-page high-level text online in PDF format at http://www2.chem.umd.edu/thermobook/ by Professor Emeritus Howard DeVoe of the University of Maryland. (p. 131: "This description of entropy as "disorder" is highly misleading..."; p. 132: "This [correct] interpretation of entropy increase has been described ... as the dispersal of energy" [ref. to http://entropysite.oxy.edu/entropy_is_simple/index.html].)
33. Yousef Haseli, Thermodynamic Optimization of Power Plants, Eindhoven University Press, Eindhoven, Netherlands. 2011.
Biology texts
34. Starr, C.; Taggart, T.; Evers, C.; Starr, L. Biology: The Unity & Diversity of Life, Brooks Cole/Cengage, Belmont, CA. 2013.
35. Olsen, B. D. Understanding Biology through Evolution, 4th ed., Lulu Press, Inc., Raleigh, NC. 2009.
Although hardly to be classified as textbooks for collegiate courses, there is a series of some 300 little books, 5 x 8 inches in size with only 101 pages each, that sell a total of a million copies a year under the general title of "Teach Yourself", "101 Key Ideas". The volume titled "Chemistry", written by Andrew Scott, has a simple and yet correct one-page presentation of entropy!
(Unless otherwise stated, all articles are copyright © by the Division of Chemical Education, Inc., of the American Chemical Society and reprinted by permission of the copyright owner. To access an article, click on its title.)
"Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms — Examples of Entropy Increase?" from the Journal of Chemical Education, Vol. 76, pp. 1385-1387, October 1999.
Changes in the arrangement of ordinary objects do not change their entropy. Entropy depends on the dispersal of energy at a specific temperature, not on a pattern. (Information "entropy", with no inherent or integral energy factor, is therefore related only in form, and not in function, to thermodynamic entropy, which must have an enabling factor of energy.)
"Disorder — A Cracked Crutch for Supporting Entropy Discussions" from the Journal of Chemical Education, Vol. 79, pp. 187-192, February 2002.
"Entropy is disorder" is an archaic, misleading definition of entropy dating from the late 19th century before knowledge of molecular behavior, of quantum mechanics and molecular energy levels, or of the Third Law of thermodynamics. It seriously misleads beginning students, partly because "disorder" is a common word, partly because it has no scientific meaning in terms of energy or energy dispersal. Ten examples conclusively demonstrate the inadequacy of "disorder" in general chemistry.
"Entropy Is Simple, Qualitatively" originally published in the Journal of Chemical Education, Vol. 79, pp. 1241-1246, October 2002.
Note: the article as presented here has been extensively revised and expanded, most recently in August 2005.
Energy disperses from being localized to becoming spread out if it is not hindered. This is the enabling factor for all spontaneous physical and chemical events. Entropy change measures the dispersal of energy in a process: how much is spread out, or how widely spread out that energy becomes. This is discussed in terms of macro thermodynamics, q(rev)/T, and molecular thermodynamics, kB ln (Wfinal/Winitial).
" "Disorder" in Unstretched Rubber Bands?" from the Journal of Chemical Education, Vol. 80, p. 145, February 2003.
The well-known experiment of stretching a rubber band has often been used as an example of entropy increase toward greater "disorder" in the unstretched band as the cause of a stretched band's contraction. Instead, from a scientific point of view, the unstretched rubber has greater entropy than the stretched form because of the increased possibilities for energy dispersal among the more freely moving portions of rubber molecules in unstretched rubber compared to an extended rubber band. Thus a stretched band will spontaneously change to the unstretched state.
"Entropy and Constraint of Motion" from the Journal of Chemical Education, Vol. 81, pp. 639-640, May 2004.
Professor William B. Jensen, a chemistry professor and historian at the University of Cincinnati, has independently developed an approach to teaching entropy that interprets entropy change as a change in the dispersion of energy. His additional contributions are that only kinetic energy can become dispersed, and that examining the constraints on dispersion clarifies the mode that energy dispersion takes. In my response here, I call attention to the dispersal of kinetic energy to potential energy (due to bond breaking) at fusion and vaporization temperatures.
"Teaching Entropy Analysis in the First-Year High School Course and Beyond", Thomas H. Bindel, from the Journal of Chemical Education, Vol. 81, pp. 1585-1594, November 2004.
A novel and creative 16-day teaching unit is presented that develops chemical thermodynamics at the introductory high school level and beyond — exclusively from an entropy viewpoint referred to as entropy analysis. Many concepts are presented, such as: entropy, spontaneity, the second law of thermodynamics, qualitative and quantitative entropy analysis, extent of reaction, thermodynamic equilibrium, coupled equilibria, and Gibbs free energy. Entropy is presented in a nontraditional way, using energy dispersal.
"Introduction of Entropy via the Boltzmann Distribution in Undergraduate Physical Chemistry: A Molecular Approach", Evguenii I. Kozliak, from the Journal of Chemical Education, Vol. 81, pp. 1595-1598, November 2004.
Several problems that hinder optimal communication with students in the conventional introduction to thermodynamics are identified. Even though students from their first course focus on chemistry as a molecular science, most texts in physical chemistry begin with the phenomenological Clausius formulation, thereby emphasizing its macroscopic aspect; the others concentrate on so-called "positional" entropy thus decoupling it from the entropy of heat exchange. The suggested approach uses simple examples based on the Boltzmann distribution to introduce the concept of entropy consistently on a molecular basis by emphasizing energy distribution due to the number of accessible microstates but bypassing the complexities of statistics. Thereby, a connection between the increase of entropy on expansion as well as on heating can be shown. A clear illustration is provided for the basic tenet of the second law, the spontaneous transfer of thermal energy from hot to cold bodies.
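A minimal sketch of the kind of Boltzmann-distribution example the article advocates (the evenly spaced levels and reduced units are my own illustrative assumptions, not taken from the paper): populations follow p_i proportional to exp(-ε_i/kT), and heating makes higher levels significantly occupied, increasing the entropy computed from S = -kB Σ p_i ln p_i.

```python
import math

def populations(levels, T, k_B=1.0):
    """Boltzmann populations p_i ∝ exp(-ε_i / k_B T) (reduced units)."""
    weights = [math.exp(-e / (k_B * T)) for e in levels]
    Z = sum(weights)                 # molecular partition function
    return [w / Z for w in weights]

def entropy(p, k_B=1.0):
    """Gibbs formula S = -k_B Σ p_i ln p_i, per particle."""
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

levels = [0.0, 1.0, 2.0, 3.0]        # evenly spaced levels, arbitrary units
S_cold = entropy(populations(levels, T=0.5))
S_hot = entropy(populations(levels, T=5.0))
# Heating lets higher levels be significantly occupied, so more
# microstates are accessible and the entropy is larger:
print(S_cold < S_hot)   # True
```

This bypasses the complexities of statistics, as the article suggests, while still tying entropy increase on heating directly to energy distribution over accessible states.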
"The Concentration Dependence of the ΔS Term in the Gibbs Free Energy Function: Application to Reversible Reactions in Biochemistry", Ronald K. Gary, from the Journal of Chemical Education, Vol. 81, pp. 1599-1604, November 2004.
Biochemistry students must use the concept of free energy change to understand reaction reversibility and the energetics of metabolism. The theory is founded on the Gibbs free energy function: ΔG = ΔH - TΔS.
Reactant and product concentrations affect the ΔS term and therefore determine whether ΔG is positive or negative at a standard temperature. However, most biochemistry texts do little to connect the sign of ΔG in this function to the concentration variables that determine it, and instead rely exclusively on the equation to relate these parameters. This can have the undesirable effect of rendering the Gibbs equation irrelevant for these students. For the biochemistry instructor, the challenge is to clarify the role of entropy in determining reaction directionality without digressing into aspects of thermodynamic theory that would be more appropriately covered in other courses. A model to explain the concentration dependence of the ΔS term is presented in a format that is appropriate for an audience of biochemistry students, and the concepts are illustrated using an aqueous phase reaction, the anomeric conversion of glucose.
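A generic sketch of the point being made (the ΔG° value and quotients below are hypothetical, not the glucose data from the article): concentrations enter ΔG through the reaction quotient Q in ΔG = ΔG° + RT ln Q, so the same reaction can run in either direction depending on the concentration-dependent term.

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def delta_G(delta_G0, Q, T=298.15):
    """ΔG = ΔG° + RT ln Q.  The RT ln Q term is where reactant and
    product concentrations enter the entropy part of the Gibbs function."""
    return delta_G0 + R * T * math.log(Q)

# Hypothetical reaction with ΔG° = +2.0 kJ/mol:
dG0 = 2000.0                       # J/mol (illustrative value only)
print(delta_G(dG0, Q=0.1) < 0)     # True: forward direction spontaneous
print(delta_G(dG0, Q=10.0) > 0)    # True: reverse direction favored
```

Connecting the sign of ΔG to concentrations this way, rather than treating the Gibbs equation as an abstract formula, is exactly the pedagogical gap the article addresses.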
"Playing-Card Equilibrium", Frank L. Lambert, from the Journal of Chemical Education, Vol. 81, p. 1569, November 2004. Complete letter to the Editor reproduced below, with permission from the JCE.
From experience, I am hypersensitive to the misconceptions of students and instructors that can be caused when playing cards are used in teaching chemistry (1). The root of such errors lies in overlooking the non-mobile, non-energetically-interacting nature of pieces of cardboard. Only while they are being shuffled can cards serve as some sort of analogy to molecular behavior in chemistry.
Thus, I found Hanson's "Playing-Card Equilibrium" of special interest (2). To me, his otherwise excellent treatment of probability in relation to chemical equilibrium lacked emphasis on shuffling as a vital element in the analogy. However, in a personal email, Professor Hanson said that his experience with teaching teachers did not show that they overlooked the importance of constant shuffling to simulate the interacting state of molecular movement. His summary is my view also: "The shuffling illustrates the equilibration, and counting the probabilities from the card arrangements at any moment is like taking snapshots of that dynamic process."
1. Lambert, F. L. J. Chem. Educ. 1999, 76, 1385-1387.
2. Hanson, R. M. J. Chem. Educ. 2003, 80, 1271-1274.
"'Order-to-Disorder' for Entropy Change? Consider the Numbers!", Evguenii I. Kozliak and Frank L. Lambert, from The Chemical Educator (an online journal) 10 (2005) 1, pp. 24-25 © The Chemical Educator 2005.
Click title above to download the article in Acrobat (pdf) format. Abstract is below. The text in brackets is an addendum to the original abstract.
Defining entropy increase as a change from order to disorder is misleading at best and incorrect at worst. Although Boltzmann described it this way in 1898, he did so innocently in the sense that he had never calculated the numerical values of W using ΔS = kB ln (W/W0) (because this equation was not stated, kB was not known, and W0 was indeterminable before 1900–1912). Prior publications have demonstrated that the word “disorder” is misleading in describing entropy change. In this paper, convincing evidence is provided that no starting system above ca. 1 K can be said to be orderly so far as the distribution of its energy (the fundamental determinant of entropy) is concerned. This is supported by a simple calculation showing that any system … “practical state of zero entropy” has an incomprehensibly large number …
[The calculation is from K. S. Pitzer, Thermodynamics (3rd ed.; McGraw-Hill, 1995), p. 67, eq. (5-3), and shows that any molar system even at temperatures as cold as 1 K has about …]
"Chemical Kinetics: As Important As The Second Law Of Thermodynamics?" Frank L. Lambert, from the Chemical Educator (an Online Journal) 3 (1998) 2, 6 pages © The Chemical Educator, 1998.
Note: In the article above, the intended marginal summary on the first page was "Chemical kinetics firmly restrains "time's arrow" in the taut bow of thermodynamics for milliseconds or for millennia."
The second law may be “time’s arrow” but activation energies (chemical kinetics) prevent second law predictions from occurring for femtoseconds to eons. This is humanly important: Activation energies not only protect all the organic chemicals in our bodies and our oxidizable possessions from instant combustion in air, but also our breakable skis and surfboards (and legs) from disastrous fracture. Murphy’s Law is often applied to chemical and physical mishaps — things going wrong. But things do not always follow the second law and burst into flame or break! Chemical kinetics is the reason Murphy’s Law usually fails.
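The Arrhenius equation makes the article's point quantitative (the pre-exponential factor and barrier heights below are hypothetical round numbers, not values from the article): a modest difference in activation energy changes the rate of a thermodynamically favorable reaction by many orders of magnitude at room temperature.

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def arrhenius_k(A, Ea, T):
    """Arrhenius rate constant k = A exp(-Ea / RT)."""
    return A * math.exp(-Ea / (R * T))

# Illustrative values only: a typical pre-exponential factor and two
# activation-energy barriers, compared at room temperature.
A = 1e13                                     # s^-1
k_low = arrhenius_k(A, Ea=50e3, T=298.15)    # 50 kJ/mol barrier
k_high = arrhenius_k(A, Ea=150e3, T=298.15)  # 150 kJ/mol barrier
print(k_low / k_high)   # ≈ 3e17: the difference between fast and "never"
```

This ratio of roughly seventeen orders of magnitude is why second-law-favored processes can be held up "for milliseconds or for millennia".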
"Entropy and the Shelf Model: A Quantum Physical Approach to a Physical Property", Arnd H. Jungermann, from the Journal of Chemical Education, Vol. 83, pp. 1686-1694. November 2006
For a number of years Jungermann has presented standard molar entropy to his students as energy that is stored in substances — using shelves as energy levels and S/kB = ln W as an introduction to the number of particles and their 'energy' distributions on various levels. With S0/R as a dimensionless but mass- and attractive-force-related property, Jungermann shows how these 'atomic entropy' values are related to trends in elements and compounds in the periodic table. His procedures and concepts fit well with our statement: "The standard molar entropy of a substance at temperature T is a measure of the quantity of energy that must be dispersed in that substance for it to exist at T; that is, it is ΔS from 0 K to T."
"Consistent Application of the Boltzmann Distribution to Residual Entropy in Crystals", Evguenii I. Kozliak, from the Journal of Chemical Education, Vol. 84, pp. 493-498, March 2007.
A resolution of the old problem of understanding "residual entropy", the entropy remaining in crystals of compounds such as CO, N2O, FClO3, and H2O even as they approach absolute zero. The entropy present in two or more arrangements of molecules in such crystals had previously been considered only in terms of "configurational" or "positional" entropy. Kozliak shows that the counting procedures in these entropy calculations are identical to what would result from considering the different forms on different energy levels — a considerably more fundamental focus on entropy values as related to energy distributions.
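The standard textbook counting behind residual entropy (a minimal sketch, not code from the article) fits in one line: m energetically near-equivalent orientations per molecule give a molar residual entropy of kB ln(m^NA) = R ln m.

```python
import math

R = 8.314462618  # gas constant, J/(mol·K)

def residual_entropy(m):
    """Molar residual entropy for m energetically near-equivalent
    molecular orientations frozen into the crystal: S = R ln m."""
    return R * math.log(m)

# A CO molecule can freeze into the crystal pointing either way
# (C–O or O–C), i.e. m = 2 arrangements per molecule:
S_CO = residual_entropy(2)
print(S_CO)   # ≈ 5.76 J/(mol·K); measured values are somewhat smaller
              # because the frozen-in orientations are not fully random
```

Kozliak's contribution is to show that this counting is equivalent to placing the different arrangements on different energy levels, tying the result back to energy distribution rather than mere "position".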
“A Study of Turkish Chemistry Undergraduates' Understanding of Entropy”, Mustafa Sözbilir and Judith M. Bennett, from the Journal of Chemical Education, Vol. 84, pp. 1204-1208, July 2007.
“This study explores Turkish chemistry undergraduates' understanding of entropy and identifies and classifies their misunderstandings. For this purpose, a diagnostic questionnaire and semi-structured interviews were used—before and after teaching [about entropy in the physical chemistry course – to students who had also been taught entropy in their first-year course]….[Students were] from two different chemistry education departments in two different universities in Turkey…The misunderstandings identified were categorized into these five broad headings: (i) Defining entropy as "disorder" and considering visual disorder and entropy as synonymous; (ii) Inaccurate connection of entropy to the number of inter-molecular interactions; (iii) Inaccurate connection of entropy of a system and the accompanying entropy changes in its surroundings; (iv) Entropy of the whole system decreases or does not change when a spontaneous change occurs in an isolated system; and (v) Entropy of carbon dioxide is bigger than that of propane or the same at the same temperature. The findings have implications for tertiary-level teaching, suggesting that a substantial review of teaching strategies is needed.”
Dr. Sozbilir has told me that in his future writing about entropy he is adopting our approach to entropy and eliminating all reference to macro or molecular "disorder". (If you do not know how the idea of "disorder" came to be associated with entropy, see the link to Boltzmann's first erroneous deduction about "order" in nature here.)
"Configurational Entropy Revisited", Frank L. Lambert, from the Journal of Chemical Education, Vol. 84, pp. 1548-1550, September 2007
Entropy change is categorized in some prominent general chemistry textbooks as being either positional (configurational) or thermal. In those texts, the accompanying emphasis on the dispersal of matter — independent of energy considerations and thus in discord with kinetic molecular theory — is most troubling. This article shows that the variants of entropy can be treated from a unified viewpoint and argues that to decrease students' confusion about the nature of entropy change these variants of entropy should be merged. Molecular energy dispersal in space is implicit but unfortunately tacit in the cell models of statistical mechanics that develop the concept of configurational entropy change. Two factors are necessary for entropy change in chemistry. An increase in thermodynamic entropy is enabled in a process by the motional energy of molecules (that, in chemical reactions, can arise from the energy released from a bond energy change). However, entropy increase is only actualized if the process results in a larger number of arrangements for the system's energy, that is, a final state that involves the most probable distribution for that energy under the new constraints. Positional entropy should be eliminated from general chemistry instruction and, especially benefiting "concrete minded" students, it should be replaced by emphasis on the motional energy of molecules as enabling entropy change.
“Residual Entropy, the Third Law and Latent Heat”, Evguenii I. Kozliak and Frank L. Lambert, from Entropy (an Online open access Journal), Vol. 10 (3), pp. 274-284, 2008
A thermodynamic treatment of residual entropy in crystals, involving the configurational partition function, is suggested, which is consistent with both classical and statistical thermodynamics. It relates residual entropy to the inherent latent heat which would be released upon cooling if the reversible path were available. The nature of this heat is that if the crystal possessing residual entropy freezes above its Boltzmann’s characteristic temperature of molecular alignment, the difference in energy between different molecular arrangements is overcome by the kT heat bath to form a nearly-ideal solution. However, upon cooling below this characteristic temperature, they would separate with a concomitant release of the corresponding energy, provided the reversible path were available.
"The Correlation of Standard Entropy with Enthalpy Supplied from 0 to 298.15 K", Frank L. Lambert and Harvey S. Leff, from the Journal of Chemical Education, Vol. 86, pp. 94-98, January 2009.
As a substance is heated at constant pressure from near 0 K to 298 K, each incremental enthalpy increase, ΔH, alters entropy by ΔH/T, bringing it from approximately zero to its standard molar entropy, S°. Using heat capacity data for 32 solids and CODATA results for another 45, we found a roughly linear relationship between S° and ΔH°. The plot showing the relationship S° ≈ (constant)ΔH°, with constant = 0.0066 K⁻¹, for 77 solids can serve as an enlightening visualization of this relationship for students in general chemistry. The near-linearity can be understood qualitatively in terms of lattice vibrations and internal vibrations within polyatomic units, which are reflected by molar heat capacities and Debye temperatures. This study supports the thesis that thermodynamic entropy and stored internal energy in a solid are intimately related and that entropy can be usefully interpreted as a spreading function, as described in the text.
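The reported correlation can be used as a rough back-of-the-envelope estimator (a sketch using the paper's fitted slope; the input value below is hypothetical, not a tabulated datum):

```python
# Slope of the roughly linear fit S° ≈ c·ΔH°(0 → 298.15 K)
# reported by Lambert and Leff for 77 solids:
C = 0.0066  # K^-1

def estimate_standard_entropy(dH_J_per_mol):
    """Rough estimate of S° in J/(mol·K) from the total enthalpy
    input ΔH° (J/mol) needed to heat a solid from ~0 K to 298.15 K."""
    return C * dH_J_per_mol

# A hypothetical solid requiring 5.0 kJ/mol of enthalpy from 0 to 298 K:
print(estimate_standard_entropy(5000.0))   # ≈ 33 J/(mol·K)
```

The point for students is the proportionality itself: the entropy a solid acquires on warming tracks the energy that had to be dispersed into it.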
“Overcoming Misconceptions about Configurational Entropy in Condensed Phases”, Evguenii I. Kozliak, from the Journal of Chemical Education, Vol. 86, pp. 1063-1068, September 2009.
Configurational and thermal entropy yield identical numerical values for ΔS only when the system's "dimensionless" energy gaps (Δε /kT ) between the accessible quantized energy levels are minimized by temperature to nearly infinitesimal values so that the spreading of energy among the system's microstates becomes effectively classical and equiprobable. The molecular partition function provides the numerical value for the effective number of both accessible states and spatial configurations per molecule, for which this condition is valid at a given temperature. Considering the phenomenon of mixing and standard molar entropy values leads to the conclusion that configurational entropy calculations are significant and thermodynamically valid because of their fundamental connection to the process of random energy re-distributions in a system, via the available modes of molecular motion.
. . . . . . . . .
Kozliak’s thorough connection of classical and quantum mechanics is not easy reading but his conclusions about the profound differences between the “entropy” of playing cards and thermodynamic entropy, as well as other variants of over-hyped “probability”, are more explicitly and strongly supported than any previously in the literature. FLL
"Entropy Is Not "Disorder"; It Is a Measure of the Dispersal of Energy"
A complete summary of the concept and its application to macro and molecular thermodynamics for chemistry instructors.
"What is a microstate?"
Two approaches to understanding a microstate, a description of one arrangement of a system’s energy.
"A Student's Approach to the Second Law and Entropy"
A short introduction to the second law and entropy for students. Written with the hurried student in mind.
"Teaching Entropy Is Simple — If You Discard "Disorder" "
An introduction for AP teachers to the concept of entropy as measuring the dispersal of energy at a specific temperature.
" 'Configurational' Entropy: A Measure of Energy Dispersal in Statistical Mechanical Calculations"
A subtopic in the chemistry seminar at California State Polytechnic University, Pomona on November 8, 2005.
This initial presentation was developed further to become Article 15 in the September 2007 Journal of Chemical Education, described above.
'Positional' or 'configurational' entropy change, as presented in general chemistry texts, misattributes the nature of the change to a probable increase in molecular positions. Actually, those positions represent an increased number of microstates: the spreading out of the initially more localized energy of the system's components.
"The Second Law of Thermodynamics" and "Entropy in General Chemistry"
Written by Dr. Lambert for Wikibooks, these two articles contain material that is scattered on this site but is presented in somewhat different format, designed to be quite readily readable by beginners in chemistry and accessible to students not majoring in science.
"'Disorder' in Thermodynamic Entropy"
The historical origin of the introduction of 'disorder' by Boltzmann, reproduced in response to many questioners. The brief article also serves as an introduction for instructors who are not familiar with my approach to understanding entropy change. It closes with a description of the clear distinction between thermodynamic entropy and Shannon information "entropy".
I thank the Journal of Chemical Education for permission to reprint these articles on this Web site. The Journal serves chemistry instructors worldwide and across the span of education from K-12 teachers to professors in graduate school. Permission by the JCE to display the logo below does not constitute any sort of endorsement of this site by the Journal or the American Chemical Society. The logo link is reproduced here only to aid the reader in learning more about the JCE and its remarkable print and software contributions to chemical education.
My thanks go to the Information Technology Services department of Occidental College for updating this site, and especially to Luu Tran for setting up the site and bringing my manuscripts to the web for the past decade.
Frank L. Lambert, Professor Emeritus (Chemistry)
Occidental College, Los Angeles CA 90041