By the end of this section, you will be able to:
  • Define entropy
  • Explain the relationship between entropy and the number of microstates
  • Predict the sign of the entropy change for chemical and physical processes

In 1824, at the age of 28, Nicolas Léonard Sadi Carnot ( [link] ) published the results of an extensive study regarding the efficiency of steam heat engines. In a later review of Carnot’s findings, Rudolf Clausius introduced a new thermodynamic property that relates the spontaneous heat flow accompanying a process to the temperature at which the process takes place. This new property was expressed as the ratio of the reversible heat ($q_\text{rev}$) and the kelvin temperature ($T$). The term reversible process refers to a process that takes place at such a slow rate that it is always at equilibrium and its direction can be changed (it can be “reversed”) by an infinitesimally small change in some condition. Note that the idea of a reversible process is a formalism required to support the development of various thermodynamic concepts; no real processes are truly reversible, and all real processes are therefore classified as irreversible.

(a) Nicolas Léonard Sadi Carnot’s research into steam-powered machinery and (b) Rudolf Clausius’s later study of those findings led to groundbreaking discoveries about spontaneous heat flow processes.

Similar to other thermodynamic properties, this new quantity is a state function, and so its change depends only upon the initial and final states of a system. In 1865, Clausius named this property entropy ( S )    and defined its change for any process as the following:

$\Delta S = \dfrac{q_\text{rev}}{T}$

The entropy change for a real, irreversible process is then equal to that for the theoretical reversible process that involves the same initial and final states.
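
As a brief worked example (using a commonly tabulated molar enthalpy of fusion for water, about 6.01 kJ/mol), consider the melting of one mole of ice at its normal melting point, 273.15 K. Because the solid and liquid phases are at equilibrium, the heat absorbed is the reversible heat, so

$\Delta S = \dfrac{q_\text{rev}}{T} = \dfrac{6010\ \text{J}}{273.15\ \text{K}} \approx 22.0\ \text{J/K}$

a positive value, reflecting the increase in entropy as the solid melts.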

Entropy and microstates

Following the work of Carnot and Clausius, Ludwig Boltzmann developed a molecular-scale statistical model that related the entropy of a system to the number of microstates possible for the system. A microstate is a specific configuration of the locations and energies of the atoms or molecules that comprise a system. The entropy of a system is related to the number of possible microstates ($W$) as follows:

$S = k \ln W$

Here $k$ is the Boltzmann constant and has a value of $1.38 \times 10^{-23}$ J/K.

As for other state functions, the change in entropy for a process is the difference between its final ($S_\text{f}$) and initial ($S_\text{i}$) values:

$\Delta S = S_\text{f} - S_\text{i} = k \ln W_\text{f} - k \ln W_\text{i} = k \ln \dfrac{W_\text{f}}{W_\text{i}}$

For processes involving an increase in the number of microstates, $W_\text{f} > W_\text{i}$, the entropy of the system increases, $\Delta S > 0$. Conversely, processes that reduce the number of microstates, $W_\text{f} < W_\text{i}$, yield a decrease in system entropy, $\Delta S < 0$. This molecular-scale interpretation of entropy provides a link to the probability that a process will occur, as illustrated in the next paragraphs.
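
The sign behavior follows directly from the logarithm, as the short Python sketch below makes explicit (the function name and the microstate ratios here are purely illustrative, not part of the original text):

```python
import math

k = 1.38e-23  # Boltzmann constant, J/K

def entropy_change(W_ratio):
    """Entropy change for a process whose final-to-initial microstate ratio is W_ratio."""
    return k * math.log(W_ratio)

print(entropy_change(1000))   # W_f > W_i: positive delta S (about 9.5e-23 J/K)
print(entropy_change(1))      # W_f = W_i: zero entropy change
print(entropy_change(0.001))  # W_f < W_i: negative delta S
```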

Consider the general case of a system comprised of $N$ particles distributed among $n$ boxes. The number of microstates possible for such a system is $n^N$. For example, distributing four particles among two boxes will result in $2^4 = 16$ different microstates, as illustrated in [link]. Microstates with equivalent particle arrangements (not considering individual particle identities) are grouped together and are called distributions. The probability that a system will exist with its components in a given distribution is proportional to the number of microstates within the distribution. Since entropy increases logarithmically with the number of microstates, the most probable distribution is therefore the one of greatest entropy.
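
For readers who want to verify this counting, the Python sketch below (names chosen only for illustration) enumerates all $2^4 = 16$ microstates of four distinguishable particles in two boxes and tallies how many microstates fall into each distribution; the even split accounts for six of the sixteen and is therefore the most probable distribution.

```python
from itertools import product
from collections import Counter

boxes = (1, 2)
n_particles = 4

# Each microstate assigns one of the two boxes to each of the four particles.
microstates = list(product(boxes, repeat=n_particles))
print(len(microstates))  # 16, i.e., 2**4

# A distribution records only how many particles are in each box,
# ignoring which particular particles they are.
distributions = Counter(
    (state.count(1), state.count(2)) for state in microstates
)
for dist, count in sorted(distributions.items()):
    print(dist, count)
# (0, 4) 1
# (1, 3) 4
# (2, 2) 6   <- the most probable distribution
# (3, 1) 4
# (4, 0) 1
```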





Source: OpenStax, Chemistry. OpenStax CNX. May 20, 2015. Download for free at http://legacy.cnx.org/content/col11760/1.9