Second law of thermodynamics


The second fundamental idea in thermodynamics is the total entropy balance, or the "second law" of thermodynamics. Entropy is a thermodynamic property that expresses the unidirectional nature of a process and, in some sense, is "nature's clock." For example, a cup of hot coffee left in a room cools to room temperature rather than heating up further.

Unidirectional nature of processes

Conservation of total mass and energy is insufficient to solve many phase-equilibrium problems. Processes that satisfy these conservation equations may not be physically possible; for example, a cold cup of coffee spontaneously heating up on your dinner table would satisfy the first law of thermodynamics but has a near-zero probability of occurring. Processes have a natural direction in that spontaneous processes tend to dissipate gradients in the system until equilibrium is reached, e.g.:

  • Darcy’s law for pressure gradients
  • Fick’s law for concentration gradients

A system that is not subject to forced flows of mass or energy from its surroundings will evolve to a time-invariant state that is uniform or composed of uniform subsystems—the equilibrium state. The second law of thermodynamics introduces a new thermodynamic property, entropy, and provides a mathematical statement that describes this unidirectional nature of processes.

The second law also has implications for the efficiency of processes. Heat and work are not of the same quality in that work can be efficiently converted to thermal energy (e.g., frictional heat losses), but thermal energy can be only partially converted into mechanical energy (e.g., steam power plants). Thus, work is a more valuable form of energy than heat—work has a high quality. Furthermore, energy at higher temperatures is more useful than energy at lower temperatures. For example, the ocean contains an immense amount of energy, but it is not very useful because of its low temperature. Energy is degraded when heat transfers from one system to another of lower temperature. Entropy is a measure of the energy degradation or disorder of the system.

Entropy

Entropy is a thermodynamic property, just like temperature and pressure. Entropy is a state function; for a reversible process in a closed system, the change in total entropy is given by $d(nS) = dQ_{rev}/T$. For a fixed temperature, the entropy change increases with the amount of heat added; for a fixed amount of heat, the entropy change is larger at lower temperature.
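As a quick illustration of this relation (the numbers below are chosen for illustration and are not from the original text), suppose 600 J of heat is added reversibly to a closed system held at a constant 300 K, and the same amount is added to another system held at 600 K:

$$\Delta(nS)\big|_{300\ \mathrm{K}} = \frac{Q_{rev}}{T} = \frac{600\ \mathrm{J}}{300\ \mathrm{K}} = 2\ \mathrm{J/K}, \qquad \Delta(nS)\big|_{600\ \mathrm{K}} = \frac{600\ \mathrm{J}}{600\ \mathrm{K}} = 1\ \mathrm{J/K}.$$

The same quantity of heat produces twice the entropy change at the lower temperature.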

Entropy is related to the likelihood that equilibrium will be reached. Entropy is best understood by examining a very simple example at the microscopic scale. Fig. 1 shows the initial state of a hypothetical closed system that contains four molecules. The system is initially partitioned into two halves, such that the molecules in one half cannot move into the other half. One molecule is in the left subsystem and three are in the right subsystem; thus, the pressures are not initially the same. Each subsystem has only one possible configuration (the initial state). Thus, the subsystems are well ordered, and the entropy is initially small.

When the partition is removed, however, the molecules from each subsystem are free to move into the other half of the system, and a total of 16 different configurations are possible, as shown in Fig. 1. Because each of these configurations is equally likely, the probability that the system will be found in its original configuration is only 1/16 (i.e., $2^{-4}$). The system is more likely to contain two molecules in each subsystem (probability of 6/16, or 3/8), which is the equilibrium state, the most disordered state. Entropy is related to the maximum number of possible configurations of the system, and thus the entropy after the partition is removed has increased. The configurations could also be arrangements of energy quanta, instead of the molecular arrangements used in this example.

Although the original configuration for four molecules is not improbable, real systems contain many more molecules. For example, if one mole of a gas were present in the system ($6.0 \times 10^{23}$ molecules), the probability that the system would be found in its initial configuration would be vanishingly small ($2^{-6.0 \times 10^{23}}$). However, the probability that the system would contain a similar number of molecules on each side would be near 1.0 (i.e., the pressure would be essentially equal throughout the closed system).
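The counting behind Fig. 1 can be checked with a short script. The sketch below is illustrative only (Python and the variable names are not part of the original article); it enumerates the $2^N$ equally likely configurations for N molecules and evaluates the probabilities quoted above.

  from math import comb, log10

  N = 4                  # molecules in the Fig. 1 example
  total = 2 ** N         # equally likely configurations once the partition is removed

  # Probability of each possible left/right split of the N molecules
  for k in range(N + 1):
      ways = comb(N, k)  # configurations with k molecules in the left half
      print(f"{k} molecules on the left: {ways}/{total} = {ways / total:.4f}")

  print("P(exact original configuration) =", 1 / total)           # 1/16
  print("P(two molecules on each side)   =", comb(N, 2) / total)  # 6/16 = 3/8

  # For one mole of gas, the original configuration is effectively impossible:
  N_A = 6.022e23
  print("log10 of that probability for one mole:", -N_A * log10(2))  # roughly -1.8e23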

Open system

The steps to write the entropy balance for an open system are similar to those for the first law of thermodynamics. We allow for the following:

  1. Mass flows into or out of the system at only one location on the boundary of the system. The mass flow rate into the system is positive, whereas flow rates out of the system to the surroundings are negative.
  2. Mass can carry entropy into or out of the system. The rate of entropy transfer into the system by mass flow is given by $S\dot{n}$, where $\dot{n}$ is the molar flow rate.
  3. Energy in the form of heat may enter or leave the system across the system boundaries at a specified exterior temperature, T. Heat transfer is positive when heat is exchanged from the surroundings to the system. Entropy, Q/T, is transferred from the surroundings to the system during heat transfer. The rate of entropy transfer into the system by heat transfer is therefore $\dot{Q}/T$.
  4. The temperature at the boundary or exterior of the system is equal to the temperature within the system (i.e., the heat transfer process is reversible).
  5. Unlike total energy or mass, entropy is generated within a system that is not in equilibrium. Entropy generation is related to irreversibilities in the system such as temperature gradients, pressure gradients, or concentration gradients. The second law of thermodynamics states that entropy generation is never negative; it is positive for irreversible processes and zero only for reversible processes.

With these assumptions and definitions, the entropy balance is

$$\frac{d(nS)}{dt} = S\dot{n} + \frac{\dot{Q}}{T} + \dot{S}_G \qquad (1)$$

where the term on the left is the rate of change of the total entropy within the system (S is the molar entropy), the first two terms on the right are the net rates of entropy transport into or out of the system by mass flow and heat transfer, and the last term, $\dot{S}_G$, is the rate of entropy generation within the system. In thermodynamic shorthand, the entropy balance can be written as

$$d(nS) = S\,dn + \frac{dQ}{T} + dS_G \qquad (2)$$

Closed and isolated systems

For closed systems ($dn = 0$), the entropy balance becomes $d(nS) = \dfrac{dQ}{T} + dS_G$. For isolated systems ($dn = 0$ and $dQ = 0$), the entropy balance is

$$d(nS) = dS_G \qquad (3)$$

In an isolated system at equilibrium, the total entropy cannot change with time. Thus, the generation of entropy must be zero at equilibrium ($dS_G = 0$), as stated previously. Away from equilibrium, entropy generation is positive ($dS_G > 0$), which implies that the entropy of an isolated system increases with time and reaches a maximum at equilibrium.

Why is entropy generation positive away from equilibrium? Consider an isolated system composed of two subsystems, A and B.[2] Heat is exchanged only from the high-temperature subsystem A to the low-temperature subsystem B; thus, the rate of heat transfer is $\dot{Q}_B = -\dot{Q}_A = h(T_A - T_B)$, where h is a constant heat-transfer coefficient. Each subsystem is well mixed so that its temperature is always uniform; that is, no internal gradients exist, and subsystems A and B each pass through a succession of equilibrium states (i.e., the process within each subsystem is reversible, and $\dot{S}_{G_A} = \dot{S}_{G_B} = 0$). From Eq. 1 and the expression for the rate of heat transfer, the entropy balance for subsystem A is

$$\frac{d(n_A S_A)}{dt} = \frac{\dot{Q}_A}{T_A} = -\frac{h(T_A - T_B)}{T_A},$$

and for subsystem B,

$$\frac{d(n_B S_B)}{dt} = \frac{\dot{Q}_B}{T_B} = \frac{h(T_A - T_B)}{T_B}.$$

For the isolated system not in equilibrium, the entropy balance is $\dfrac{d(nS)}{dt} = \dot{S}_G$. Furthermore, the total entropy, nS, is an extensive property, so that $nS = n_A S_A + n_B S_B$. The entropy generation term for the combined system is therefore

$$\dot{S}_G = \frac{d(n_A S_A)}{dt} + \frac{d(n_B S_B)}{dt} = h\left(T_A - T_B\right)\left(\frac{1}{T_B} - \frac{1}{T_A}\right) = \frac{h\left(T_A - T_B\right)^2}{T_A T_B}.$$

Because absolute temperatures are positive and the heat-transfer coefficient is positive, this result demonstrates that entropy generation is always positive away from equilibrium. At equilibrium, $T_A = T_B$ and the entropy generation term is zero, as stated before. Furthermore, the rate of entropy generation is proportional to the square of the gradient (the temperature difference in this case). If the temperature difference is kept infinitesimally small at all times (a succession of near-equilibrium steps), entropy generation remains near zero.
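This result can also be checked numerically. The sketch below is illustrative only; the heat capacity, heat-transfer coefficient, time step, and initial temperatures are assumed values that are not part of the original text. It steps the two subsystems toward a common temperature and confirms that the accumulated entropy generation equals the total entropy change of the isolated system and is never negative.

  # Two well-mixed subsystems exchanging heat inside an isolated system.
  # Assumed values for illustration: equal total heat capacities nC = 1 J/K,
  # heat-transfer coefficient h = 0.05 W/K, time step dt = 0.1 s.
  nC, h, dt = 1.0, 0.05, 0.1
  TA, TB = 400.0, 300.0                 # K; subsystem A starts hotter
  dS_total, dS_gen = 0.0, 0.0

  for _ in range(2000):
      Q = h * (TA - TB) * dt                          # heat moved from A to B this step
      dS_gen += h * (TA - TB) ** 2 / (TA * TB) * dt   # generation term derived above
      dS_total += -Q / TA + Q / TB                    # d(nA*SA) + d(nB*SB) for this step
      TA -= Q / nC                                    # subsystem A cools
      TB += Q / nC                                    # subsystem B warms

  print(f"Final temperatures: TA = {TA:.2f} K, TB = {TB:.2f} K")  # both approach 350 K
  print(f"Total entropy change    = {dS_total:.5f} J/K")
  print(f"Total entropy generated = {dS_gen:.5f} J/K (never negative)")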

A process is reversible if it occurs with infinitesimally small gradients (i.e., it consists of a succession of equilibrium steps). Thus, $dS_G = 0$ for a reversible process, and from Eq. 2,

$$d(nS) = S\,dn + \frac{dQ_{rev}}{T} \qquad (4)$$

For a closed system that undergoes a reversible process, Eq. 4 reduces to $dQ_{rev} = T\,d(nS)$.
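As one application of this relation (a standard textbook result; the constant total heat capacity nC introduced below is an assumption and is not part of the original text), consider reversibly heating a closed system from $T_1$ to $T_2$, so that $dQ_{rev} = nC\,dT$:

$$d(nS) = \frac{dQ_{rev}}{T} = \frac{nC\,dT}{T} \quad\Longrightarrow\quad \Delta(nS) = nC\,\ln\frac{T_2}{T_1}.$$

For example, with $nC = 10$ J/K, heating reversibly from 300 K to 360 K increases the total entropy by $10\ln(1.2) \approx 1.8$ J/K.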

Summary

The second law of thermodynamics states that the rate of entropy generation within a system must be greater than or equal to zero. A process for which the rate of generation of entropy is always zero is a reversible process. A large rate of entropy generation corresponds to greater process irreversibilities.

Nomenclature

G = molar Gibbs free energy, energy/mole, J/mol
h = heat-transfer coefficient, energy/(time-temperature), W/K
n = total moles of all components, mol
$\dot{n}$ = molar flow rate into the system, mol/s
Q = net heat transferred, energy, J
$\dot{Q}$ = rate of heat transfer into the system, J/s
S = molar entropy of fluid, entropy/mole, J/(mol·K)
$S_G$ = entropy generated within the system, J/K
T = temperature, K

References

  1. Smith, J.M., Van Ness, H.C., and Abbott, M.M. 2001. Introduction to Chemical Engineering Thermodynamics, sixth edition. New York City: McGraw-Hill.
  2. Sandler, S.I. 2000. Chemical and Engineering Thermodynamics, third edition. New York City: John Wiley & Sons.


See also

Thermodynamics and phase behavior

First law of thermodynamics

Equations of state

PEH:Thermodynamics_and_Phase_Behavior
