The Difference Between Entropy and Enthalpy in Thermodynamics
Entropy and enthalpy are two important properties of a thermodynamic system. Though they are different from one another, they are related. This post provides a comparison between the two and also tells you the relationship between them, with the help of examples.
Relationship between Enthalpy and Entropy of a Closed System
T∆S = ∆H
Here, T is the absolute temperature, ∆H is the change in enthalpy, and ∆S is the change in entropy. This relation holds for a reversible process at constant temperature and pressure, such as a phase change occurring at its equilibrium temperature; under these conditions, the heat absorbed (∆H) directly determines the entropy change.
In chemistry, thermodynamics is the field that deals with the heat and energy of a system and the study of how that energy changes. Enthalpy and entropy are two such thermodynamic properties.
Enthalpy vs. Entropy
Enthalpy, denoted by the symbol ‘H’, is a measure of the total heat content of a thermodynamic system under constant pressure. Enthalpy is calculated in terms of change, i.e., ∆H = ∆E + P∆V (where E is the internal energy). The SI unit of enthalpy is the joule (J).
Entropy, denoted by the symbol ‘S’, is a measure of the level of disorder in a thermodynamic system. Its SI unit is joules per kelvin (J/K). Entropy is calculated in terms of change, i.e., ∆S = ∆Q/T (where Q is the heat transferred and T is the absolute temperature).
Let us look into these two thermodynamic properties in greater detail.
What is Enthalpy?
It can be defined as the total heat content of a thermodynamic system, which includes the internal energy. For a homogeneous system, it is the sum of the internal energy E of the system and the product of the pressure (P) and volume (V) of the system.
H = E + PV, where PV refers to the mechanical work done on or by the system.
Enthalpy cannot be measured directly. Thus, a change in enthalpy that can be measured is considered. It is given by,
∆H = ∆E + P∆V
Thus, the change in enthalpy is the sum of the change in internal energy and the work done.
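As a quick numerical sketch of ∆H = ∆E + P∆V, the snippet below uses illustrative values (a gas absorbing energy while expanding against one atmosphere); the numbers are assumptions chosen for the example, not data from any specific experiment.

```python
# Sketch: computing an enthalpy change from internal energy and P-V work.
# All values are illustrative and in SI units.

def enthalpy_change(delta_E, pressure, delta_V):
    """Return dH = dE + P*dV for a constant-pressure process."""
    return delta_E + pressure * delta_V

# A gas gains 500 J of internal energy and expands by 0.002 m^3
# against a constant pressure of 101325 Pa (1 atm).
dH = enthalpy_change(500.0, 101325.0, 0.002)
print(f"dH = {dH:.2f} J")  # 500 + 202.65 = 702.65 J
```

Because the P∆V term is positive here, the enthalpy change exceeds the internal-energy change by exactly the expansion work done by the system.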
Enthalpy is a state function and it is dependent on the changes between the initial and the final state i.e. reactants and products in case of a chemical reaction. Thus, the enthalpy change is important.
There are two types of chemical reactions; namely, exothermic and endothermic.
Exothermic reactions are those in which there is a release of heat. In this case, energy is given out to the surroundings. The energy required for the reaction to occur is less than the total energy released. Furthermore, the enthalpy of the products is lower than the enthalpy of the reactants. Thus, the enthalpy change or ∆H is negative or has a negative value.
Endothermic reactions are those in which there is an absorption of heat. In this case, energy is absorbed from its surroundings in the form of heat. Here, the enthalpy of the products is higher than the enthalpy of the reactants. Thus, the enthalpy change or ‘∆H’ is positive or has a positive value.
Thus, the enthalpy of a reaction can be calculated as follows:
∆H = ∑ nH(products) − ∑ mH(reactants), where n and m are the stoichiometric coefficients of the products and reactants.
That is, according to the aforementioned equation, the enthalpy of a reaction is the sum of the enthalpies of the reactants subtracted from the sum of the enthalpies of the products.
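The products-minus-reactants rule can be sketched in code. The example below applies it to the combustion of methane (CH4 + 2 O2 → CO2 + 2 H2O(l)) using commonly tabulated standard enthalpies of formation; treat the specific numbers as textbook approximations rather than authoritative data.

```python
# Sketch: dH_rxn = sum(n * H(products)) - sum(m * H(reactants)).
# Standard enthalpies of formation in kJ/mol (approximate textbook values).
H_f = {"CH4": -74.8, "O2": 0.0, "CO2": -393.5, "H2O(l)": -285.8}

def reaction_enthalpy(products, reactants):
    """products/reactants: lists of (coefficient, species) pairs."""
    total = lambda side: sum(n * H_f[species] for n, species in side)
    return total(products) - total(reactants)

dH = reaction_enthalpy(products=[(1, "CO2"), (2, "H2O(l)")],
                       reactants=[(1, "CH4"), (2, "O2")])
print(f"dH = {dH:.1f} kJ/mol")  # negative value -> exothermic reaction
```

The result is strongly negative, consistent with combustion being an exothermic reaction as described above.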
What is Entropy?
Introduced by Rudolf Clausius, entropy is a thermodynamic property that can be defined as a measure of the number of specific ways in which a thermodynamic system can be arranged. It can be thought of as a measure of chaos or disorder in a closed system, or as the thermal energy that is no longer available to the system for doing work; it thus characterizes the randomness of the system’s particles.
According to the second law of thermodynamics, there is always an increase in the entropy of an isolated system.
‘∆S’ or the change in entropy was originally represented by,
∆S = ∫ dQrev/T, where T is the absolute temperature and dQrev is the heat transferred reversibly into the system.
This equation is for a thermodynamically reversible process. Furthermore, it can also be called the macroscopic definition of entropy.
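For a reversible process at constant temperature, the integral above reduces to the simple ratio ∆S = Qrev/T. The snippet below evaluates that special case with illustrative numbers (the heat value is an assumption for the example).

```python
# Sketch: dS = Q_rev / T for reversible heat transfer at constant temperature.
# The integral form dS = integral(dQ_rev / T) collapses to a ratio when T is fixed.
Q_rev = 6010.0   # J of heat absorbed reversibly (illustrative value)
T = 300.0        # K, held constant throughout the process
dS = Q_rev / T
print(f"dS = {dS:.2f} J/K")  # heat in at constant T -> entropy increases
```

Since heat flows into the system, the entropy change is positive, as the second law leads us to expect for the system-plus-surroundings total.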
Later, Ludwig Boltzmann described entropy in terms of the statistical behavior of the microscopic components of the system. In this view, entropy is a measure of the number of possible microscopic configurations (microstates) of the individual atoms and molecules that are consistent with the macroscopic state of the system.
S = kB ln W, where S is the entropy of an ideal gas, kB is the Boltzmann constant, and W is the number of microstates corresponding to a given macrostate.
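A small numerical sketch of Boltzmann's formula: because S depends on the logarithm of W, doubling the number of accessible microstates adds a fixed amount kB·ln 2 to the entropy, regardless of how large W already is. The microstate counts below are arbitrary illustrative values.

```python
import math

# Sketch: Boltzmann's S = k_B * ln(W).
k_B = 1.380649e-23  # J/K, the Boltzmann constant (exact in the 2019 SI)

def boltzmann_entropy(W):
    """Entropy for W equally likely microstates."""
    return k_B * math.log(W)

# Doubling the microstate count adds k_B * ln(2), independent of W itself:
dS = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
print(dS)  # equals k_B * ln(2), about 9.57e-24 J/K
```

This logarithmic dependence is why entropy is additive: combining two independent systems multiplies their microstate counts but adds their entropies.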
Solids have low entropy due to their more regular structure as compared to liquids. Liquids have an intermediate entropy, as they are more ordered than gases but less ordered than solids. Gases have the highest entropy, as they have the most disorder.
Both enthalpy and entropy can be illustrated with an example: the melting of ice. This phase change can be written as follows:
H2O(s) ——> H2O(l)
In this thermodynamic system, heat is absorbed by the ice, making ∆H positive. Because of the phase change involved, i.e., a solid turning into a liquid, the level of disorder in the system increases, making ∆S positive as well.
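The melting example also ties the two properties together numerically: at the equilibrium melting point, T∆S = ∆H, so ∆S can be computed directly from the enthalpy of fusion. The value of ∆H used below (about 6.01 kJ/mol for ice) is a standard textbook figure, used here as an assumption for illustration.

```python
# Sketch: melting of ice at its equilibrium melting point, where T*dS = dH.
dH_fus = 6010.0   # J/mol absorbed on melting (textbook value) -> dH > 0
T_melt = 273.15   # K, the melting point of ice at 1 atm
dS_fus = dH_fus / T_melt  # at equilibrium, dS = dH / T
print(f"dS_fus = {dS_fus:.1f} J/(mol K)")  # positive: disorder increases
```

Both quantities come out positive, matching the qualitative argument above: melting absorbs heat and increases disorder.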
Considering the aforementioned relationship equation again, it underlines the fact that, at the equilibrium temperature of a phase change, the two thermodynamic properties are directly proportional to each other. However, it should be noted that, by the second law, the entropy change of an isolated system can never be negative.