The concept of entropy in thermodynamics is important and, at the same time, subtle, since there are different points of view on its interpretation. In this article we describe entropy in detail and give examples of processes in which it plays a key role.
The meaning of the concept
The conversation about entropy is most logical to start with its definition. Entropy in thermodynamics is an extensive physical quantity that reflects the number of possible microstates of the macrosystem being described. In other words, entropy reflects the degree of order: the more structured and heterogeneous the system, the lower its entropy.
It is important to understand the two main properties of entropy:
- Extensiveness. That is, this quantity depends on the size of the system and the mass of substance present in it. For example, suppose we calculate the entropy of each of two vessels of hydrogen with volumes V1 and V2, in which the gas is at the same pressure P and temperature T. If the vessels are then connected together, the total entropy will equal the sum of the two individual entropies (see the numerical sketch after this list).
- Entropy in thermodynamics is a function of the state of the system. This means that the quantity is well defined only when the system is in thermodynamic equilibrium. In that case, entropy does not depend on the history of the system's evolution; that is, it does not matter how the system arrived at its current thermodynamic state.
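To make the extensiveness property concrete, here is a minimal numerical sketch. It uses the Sackur-Tetrode expression for the translational entropy of an ideal gas, a formula not discussed in this article and chosen purely for illustration; the hydrogen molecules are treated as structureless particles, so rotational contributions are ignored.

```python
import math

# Physical constants (SI units)
k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s
m = 3.35e-27          # mass of an H2 molecule, kg

def sackur_tetrode(N, V, T):
    """Translational entropy of N ideal-gas particles in volume V at
    temperature T (Sackur-Tetrode; internal structure ignored)."""
    lam = h / math.sqrt(2 * math.pi * m * k * T)   # thermal de Broglie wavelength
    return N * k * (math.log(V / (N * lam**3)) + 2.5)

T = 300.0                        # K
P = 1.0e5                        # Pa
V1, V2 = 0.010, 0.025            # m^3 (illustrative volumes)

# At the same P and T the particle numbers follow from the ideal gas law.
N1, N2 = P * V1 / (k * T), P * V2 / (k * T)

S1 = sackur_tetrode(N1, V1, T)
S2 = sackur_tetrode(N2, V2, T)
S_joined = sackur_tetrode(N1 + N2, V1 + V2, T)

print(S1 + S2, S_joined)         # the two values coincide: entropy is additive
```

Because the pressure and temperature are the same before and after joining the vessels, the entropy is strictly proportional to the number of particles, which is exactly what extensiveness means.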
What determines entropy?
In other words, why was it introduced into physics at all? To answer this question, it is enough to consider a simple experiment: everyone knows that if you take a cold metal rod and bring it into contact with an identical rod heated to some temperature, then over time the hot body will cool and the cold one will heat up. No one has ever observed the reverse process. The direction of this phenomenon is described using the concept of entropy.
Any isolated system left to itself always tends toward its most probable state. This state is characterized by a chaotic, uniform distribution of the elements that make up the system and corresponds to the maximum value of entropy.
Statistical interpretation
In the late 19th and early 20th centuries, the Austrian physicist Ludwig Boltzmann developed a new direction in physics called statistical mechanics. In this new science he introduced the concept of absolute entropy, which he expressed in the form S = k·ln(Ω), where k is the Boltzmann constant and Ω is the number of possible microstates of the system.
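To see how the formula behaves, consider a toy sketch in which the macrosystem consists of N independent two-state elements, so that the number of microstates is simply 2^N (this spin-like model is an illustrative assumption, not something taken from the article):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

# A toy macrosystem: N independent two-state elements (coin-like "spins").
# Each element doubles the number of microstates, so Omega = 2**N.
N = 100
omega = 2 ** N

S = k * math.log(omega)        # Boltzmann's formula S = k*ln(Omega)
print(S)                        # ~9.57e-22 J/K
print(N * k * math.log(2))      # the same value, since ln(2**N) = N*ln(2)
```

The second print line also shows why entropy is extensive in this picture: doubling N doubles S.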
Note that in physics the absolute value of entropy is rarely of interest, since the mathematical formulas involving this quantity contain only its change.
Reversible processes in thermodynamics
Another definition of entropy in thermodynamics relates it to energy that can in no way be converted into useful work, for example mechanical work. This energy exists in the system in the form of heat, but it is unsuitable for practical use.
For example, an internal combustion engine operates with limited efficiency (many people have probably never thought about this, but the efficiency of the engine in their car is only about 20-25%), and it will never reach 100%, no matter how advanced the technology becomes. This is because, as a result of the thermodynamic process of fuel combustion, part of the energy (most of it, in the case under consideration) is lost to heating of parts and to friction. The combustion of fuel is a prime example of a process that increases entropy.
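The article does not derive it, but the fundamental ceiling on the efficiency of any heat engine, the Carnot limit, follows from requiring that the total entropy not decrease. A rough sketch with illustrative temperatures:

```python
# Even an ideal, loss-free heat engine is bounded by the Carnot efficiency,
# a consequence of demanding that total entropy does not decrease.
T_hot = 900.0    # K, working-gas temperature (illustrative value)
T_cold = 300.0   # K, ambient temperature (illustrative value)

eta_carnot = 1 - T_cold / T_hot
print(eta_carnot)   # ~0.67: an upper bound; real engines achieve far less
```

Friction and heat losses then push real engines well below even this ideal bound.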
In the middle of the 19th century, the German scientist Rudolf Clausius, analyzing various thermodynamic processes, introduced the concept of the entropy of a system and expressed it mathematically as dS = δQ / T, where dS is the change in the entropy of the system, δQ is the heat transferred to the system during the process, and T is the absolute temperature. It follows that the SI unit of dS is J/K.
The given formula of entropy in thermodynamics is valid only for reversible processes, i.e., transitions that can occur in both the forward and reverse directions when the external conditions are changed. For example, if you compress a gas in an airtight cylinder using an external force and then remove that force, the gas will restore its original volume (state).
Thus, according to the Clausius equation, the change in entropy during a reversible process is equal to the heat exchanged between the initial and final states divided by the absolute temperature.
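As a worked example of the Clausius formula, consider melting ice at its melting point: the heat flows in at a constant temperature, so the process is reversible to a good approximation (the latent heat below is a standard reference value, used here purely for illustration):

```python
# dS = Q / T for a reversible isothermal heat transfer:
# melting 1 kg of ice at its melting point.
Q = 334e3       # latent heat of fusion of water, J per 1 kg
T = 273.15      # melting temperature, K

dS = Q / T
print(dS)       # ~1223 J/K: the entropy gained by the melting water
```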
Isothermal and adiabatic reversible processes
An isothermal process is a special case in which the final and initial states of the system have the same temperature. According to the Clausius formula, in a reversible isothermal process the change in the entropy of the system is exactly equal to the amount of heat it exchanged with the environment divided by the temperature.
An example of such a process is the expansion of an ideal gas due to heat supplied to it from outside. All of the supplied energy in this case is spent on mechanical work (expansion), while the gas temperature remains constant.
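A short sketch of this case for one mole of an ideal gas whose volume is reversibly doubled (the specific numbers are illustrative assumptions):

```python
import math

R = 8.314               # universal gas constant, J/(mol*K)
n = 1.0                 # amount of ideal gas, mol
T = 300.0               # K, constant throughout the process
V1, V2 = 0.010, 0.020   # m^3: the volume is reversibly doubled

# In a reversible isothermal expansion all supplied heat goes into work:
Q = n * R * T * math.log(V2 / V1)   # heat absorbed = work done, J

dS = Q / T                          # Clausius formula
print(Q, dS)    # Q ~ 1729 J, dS ~ 5.76 J/K, i.e. n*R*ln(2)
```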
Considering the concept of entropy, it is also interesting to recall the adiabatic process, which is understood as any transition that occurs without heat exchange with the environment, so the internal energy of the system changes only through work. If such a process is reversible, then according to the formula dS = δQ / T = 0, since δQ = 0.
Irreversible processes
The examples considered above can be regarded as reversible only in a rough approximation, since various thermal losses are always present in them. In our Universe, almost all processes are irreversible. It is for them that the second law of thermodynamics is formulated, and entropy plays the key role in it: dS ≥ δQ / T. This expression says that, according to the second law of thermodynamics, the entropy of an isolated system increases as a result of any irreversible process (hence the ">" sign in the expression).
Thus, just as energy cannot be created from nothing or disappear without a trace, the second law of thermodynamics indicates that entropy can be created but cannot be destroyed: in an isolated system it only grows.
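Returning to the two metal rods from the experiment described earlier, the growth of entropy in an irreversible process is easy to check numerically (the heat capacity and temperatures below are illustrative assumptions):

```python
import math

# Two identical metal rods are brought into contact; heat flows
# irreversibly from the hot rod to the cold one until the temperatures equalize.
C = 400.0                      # heat capacity of each rod, J/K (illustrative)
T_hot, T_cold = 400.0, 300.0   # initial temperatures, K
T_final = (T_hot + T_cold) / 2 # equal rods meet at the average temperature

# Entropy change of each rod, from integrating dS = delta_Q/T = C*dT/T:
dS_hot = C * math.log(T_final / T_hot)     # negative: the hot rod cools
dS_cold = C * math.log(T_final / T_cold)   # positive: the cold rod warms

print(dS_hot + dS_cold)   # ~ +8.3 J/K > 0: the total entropy grows
```

Whatever initial temperatures are chosen, the sum is always positive, which is precisely the content of the second law.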
A historical note
As mentioned above, physicists began to think about entropy only in the middle of the 19th century. The reason was that the first steam engines had extremely low efficiency (at the beginning of the 18th century, a typical value for these machines was about 2%). That is, entropy was originally understood as the "dissipation" of thermal energy during a thermodynamic process.
The word "entropy", introduced by Clausius, from the ancient Greek language means "evolution, transformation", thereby emphasizing its importance for describing ongoing processes.
Entropy and the Heat Death of the Universe
According to the second law of thermodynamics, the entropy of our Universe is constantly increasing. This means that it will eventually reach its maximum value, when matter is distributed uniformly and the temperature levels out throughout space. This hypothesis was put forward by the same Clausius and was called the heat death of the Universe.
Whether it will actually occur depends on the domain of applicability of thermodynamics. At the micro level, when individual molecules and atoms are considered, entropy in the thermodynamic sense is a meaningless quantity, since the laws of this branch of physics themselves cease to work. Similar limits of applicability are presumed to exist when the scale of the system becomes infinite, that is, reaches the size of the Universe.