Friday, February 10, 2006

entropy - definition

Definition: In thermal processes, a quantity that measures the extent to which the energy of a system is available for conversion to work. If a system undergoing an infinitesimal reversible change takes in a quantity of heat dQ at absolute temperature T, its entropy is increased by dS = dQ/T. The area under the absolute temperature-entropy graph for a reversible process represents the heat transferred in the process. For an adiabatic process there is no heat transfer, so the temperature-entropy graph is a vertical straight line, the entropy remaining constant throughout the process.
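The relation dS = dQ/T can be illustrated with a short sketch. The numbers below (500 J absorbed reversibly at 300 K) are hypothetical, chosen only to show the arithmetic:

```python
def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Entropy change dS = dQ/T for a reversible heat transfer
    at constant absolute temperature T."""
    return heat_joules / temperature_kelvin

# Hypothetical example: 500 J of heat absorbed reversibly at 300 K.
delta_s = entropy_change(500.0, 300.0)
print(f"dS = {delta_s:.4f} J/K")  # 500/300, about 1.6667 J/K
```

For a finite change at constant T the same ratio applies (ΔS = Q/T); when T varies, the heat must be integrated over the path.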

When a thermodynamic system is considered on the microscopic scale, equilibrium is associated with the distribution of molecules that has the greatest probability of occurring, i.e. the state with the greatest degree of disorder. Statistical mechanics interprets the increase in entropy of a closed system to a maximum at equilibrium as the consequence of the trend from a less probable to a more probable state. Any process in which there is no change in entropy is said to be isentropic.

Entropy also appears in statistical mechanics as a function of the number of possible microscopic states (W) that a system of atoms or molecules could occupy, consistent with its macroscopic state (temperature, pressure, etc.):

    S = k_B ln(W)
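A minimal numeric sketch of the Boltzmann relation, using the SI value of the Boltzmann constant; the microstate counts are illustrative:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(num_microstates: int) -> float:
    """S = k_B ln(W) for W equally probable microstates."""
    return K_B * math.log(num_microstates)

# A system with a single accessible microstate has zero entropy,
# and doubling W adds exactly k_B ln(2) to S.
print(boltzmann_entropy(1))                           # 0.0
print(boltzmann_entropy(2) - boltzmann_entropy(1))    # k_B * ln(2)
```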

Entropy is also used in a very similar way in information theory, as a measure of the information (or lack thereof) contained in data.
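The information-theoretic version (Shannon entropy) has the same logarithmic form, with probabilities in place of microstate counts. A small sketch, measuring entropy in bits per symbol:

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """H = -sum(p_i * log2(p_i)) over symbol frequencies, in bits/symbol."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two equally likely symbols carry 1 bit each; a constant string carries none.
print(shannon_entropy("aabb"))  # 1.0
print(shannon_entropy("aaaa"))  # about 0 bits: no uncertainty
```

Maximum entropy corresponds to all symbols being equally likely, which parallels the thermodynamic picture of equilibrium as the most probable (most disordered) distribution.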

Also Known As: S, disorder