Entropy: The Concept and Standard Entropy

Entropy is a word many have heard but few understand, and it is worth admitting that the essence of this phenomenon is genuinely difficult to grasp in full. That should not discourage us, though. Much of what surrounds us we can explain only superficially, and this is true not merely of any particular individual's perception or knowledge. No, it is true of the totality of scientific knowledge that humankind possesses.

Serious gaps exist not only in our knowledge at galactic scales, in questions about black holes and wormholes, but also in what surrounds us constantly. There is still debate, for example, about the physical nature of light. And who can neatly lay out the concept of time? There are a great many such questions, but this article will focus on entropy. Scientists have grappled with the concept of "entropy" for many years; chemistry and physics go hand in hand in the study of this mysterious phenomenon. We will try to find out what has become known in our time.


Introduction of the concept in the scientific community

The concept of entropy was first introduced to specialists by the outstanding German physicist Rudolf Julius Emanuel Clausius. In simple terms, the scientist set out to discover where energy goes. In what sense? To illustrate, we will not turn to his numerous experiments and complex conclusions, but take an example more familiar from everyday life.

You are probably well aware that when you charge, say, a mobile phone battery, the amount of energy accumulated in the battery is less than the amount actually drawn from the mains. Certain losses occur, and in everyday life we are used to them. But similar losses occur in other closed systems as well, and for physicists and mathematicians this is a serious problem. Rudolf Clausius studied this question.

As a result, he deduced a curious fact. If, again, we strip away the complicated terminology, it comes down to this: entropy is the difference between an ideal process and a real one.

Imagine you own a store and have received 100 kilograms of grapefruit for sale at a price of 10 tugriks per kilogram. Adding a margin of 2 tugriks per kilo, you will take in 1,200 tugriks from the sale, pay the supplier the 1,000 tugriks owed, and keep a profit of 200 tugriks.

So far, this describes the ideal process. But any merchant knows that by the time all the grapefruits are sold, they will have dried out by about 15 percent, and another 20 percent will rot and simply have to be written off. That is the real process.

So, the concept of entropy that Rudolf Clausius introduced into mathematics is defined through the relation between the heat a system exchanges and its absolute temperature: the increase in entropy equals the heat received divided by the absolute temperature at which it is received. In effect, it quantifies the spent (lost) energy.
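Clausius's definition can be written compactly in the standard thermodynamic notation (the formula itself is not quoted in this article, but it is the textbook form):

```latex
% Clausius: the infinitesimal entropy change of a system equals the
% reversibly exchanged heat divided by the absolute temperature T.
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \Delta S = \int_{1}^{2} \frac{\delta Q_{\mathrm{rev}}}{T}
\]
```

Because $T$ is the absolute temperature, the same amount of heat "costs" more entropy when it is delivered at a low temperature than at a high one.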

Chaos Measure

It can also be said, with some degree of conviction, that entropy is a measure of chaos. Take the room of an ordinary schoolchild as a model of a closed system: a school uniform not put away already represents some entropy, though its value in this situation is small. If, in addition, toys are scattered about, popcorn is brought from the kitchen (with a little dropped along the way, of course), and all the textbooks are left in a jumble on the table, then the entropy of the system (in this particular case, the room) increases sharply.


Complex Matter

The entropy of a substance is a very difficult process to describe, and many scientists over the past century have contributed to the study of its workings. Moreover, the concept of entropy is used not only by mathematicians and physicists; it also occupies a well-deserved place in chemistry, and some enthusiasts even use it to explain psychological processes in human relationships. Let us trace the differences in the formulations of three physicists. Each reveals entropy from a different angle, and together they will help us paint a more complete picture.

Clausius's statement

Heat cannot pass spontaneously from a body at a lower temperature to a body at a higher temperature.

It is easy to convince yourself of this postulate. You will never warm, say, a frozen little puppy with cold hands, no matter how much you want to help. You will have to tuck it inside your coat, where the temperature is higher than the puppy's is at that moment.

Thomson's statement

No process is possible whose sole result is the performance of work using heat taken from a single body.

Put very simply, this means it is physically impossible to construct a perpetual motion machine. The entropy of a closed system will not allow it.

Boltzmann's statement

Entropy cannot decrease in closed systems, that is, in systems that receive no external supply of energy.
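Boltzmann also gave entropy its famous statistical form (the standard formula, not quoted in this article, but it is what connects entropy to the "measure of chaos" idea above):

```latex
% Boltzmann: entropy is proportional to the logarithm of the number W
% of microscopic arrangements (microstates) consistent with the
% macroscopic state; k_B is the Boltzmann constant.
\[
  S = k_{\mathrm{B}} \ln W
\]
```

The more ways there are to arrange a system's parts without changing its outward appearance, the larger $W$ is, and the higher the entropy.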

This formulation shook the faith of many adherents of the theory of evolution and made them think seriously about the existence of an intelligent Creator in the Universe. Why?

Because by default, entropy in a closed system always increases, meaning that chaos grows. It can be reduced only by an external input of energy. We observe this law every day: if you do not take care of a garden, a home, a car, and so on, they simply fall into disrepair.


At the mega-scale, our Universe is also a closed system, and scientists have concluded that our very existence should indicate that an external input of energy is taking place somewhere. Therefore, it is said, it is not surprising today that some astrophysicists believe in God.

Arrow of time

Another very apt illustration of entropy is the arrow of time: entropy shows in which direction a physical process will move.

Indeed, upon learning of the gardener's dismissal, you would hardly expect the territory he was responsible for to become neater and tidier. Quite the contrary: if you do not hire another worker, after some time even the most beautiful garden will fall into desolation.

Entropy in chemistry


In chemistry, entropy is an important quantity; in some cases its value affects the course of chemical reactions.

Who has not seen scenes from feature films in which the characters carry containers of nitroglycerin with extreme care, afraid to provoke an explosion with a careless sharp movement? This was a visual illustration of how entropy operates in a chemical substance: if its value reaches a critical level, a reaction begins that results in an explosion.

Disorder and order

Most often it is said that entropy is a striving toward chaos. The word "entropy" itself means transformation or turning. We have already said that it characterizes an action. The entropy of a gas is very interesting in this context; let us try to imagine how it arises.

Take a closed system consisting of two connected containers, each holding gas. Before the containers were hermetically joined, the pressure in them differed. Imagine what happened at the molecular level when they were connected.


The crowd of molecules under higher pressure immediately rushed toward their brethren, who until then had lived quite freely, raising the pressure there. This can be compared to water sloshing in a bathtub: having surged to one side, it immediately rushes to the other. So it is with our molecules. In our system, ideally isolated from outside influences, they will jostle one another until a perfect balance is established throughout the volume. When each molecule has exactly as much space around it as its neighbor does, everything calms down. This is the state of highest entropy in chemistry: the turns and transformations stop.

Standard entropy

Scientists have not abandoned attempts to order and classify even disorder. Since the value of entropy depends on many accompanying conditions, the concept of "standard entropy" was introduced. The values of these standards are collected in special tables, making it easy to carry out calculations and solve a variety of applied problems.

By convention, standard entropy values are given at a pressure of one atmosphere and a temperature of 25 degrees Celsius. As the temperature rises, this quantity also increases.
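As an illustration of how such tables are used (the S° numbers below are commonly tabulated reference values for 25 °C and 1 atm; treat them as approximate, and the reaction is my own choice of example):

```python
# Standard molar entropies S° in J/(mol*K) at 298.15 K, 1 atm
# (approximate, commonly tabulated reference values).
S_STANDARD = {
    "H2(g)": 130.7,
    "O2(g)": 205.0,
    "H2O(l)": 69.9,
}

# Reaction: H2(g) + 1/2 O2(g) -> H2O(l)
# The standard entropy change is S°(products) minus S°(reactants).
dS = S_STANDARD["H2O(l)"] - (S_STANDARD["H2(g)"] + 0.5 * S_STANDARD["O2(g)"])
print(round(dS, 1))  # about -163.3 J/(mol*K)
```

The negative result makes intuitive sense: one and a half moles of disordered gas condense into one mole of liquid, so the system's disorder decreases.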


Codes and Ciphers

There is also informational entropy, which helps with encrypting coded messages. In information theory, entropy measures the predictability of information; put very simply, it indicates how easy it will be to crack an intercepted cipher.

How does it work? At first glance it seems impossible to understand a coded message without at least some initial data. But that is not so: this is where probability comes into play.

Imagine a page with an encrypted message. You know that Russian was used, but the characters are completely unfamiliar. Where to begin? Think: what is the likelihood that the letter "b" will appear on this page? And the chance of stumbling upon the letter "o"? You see the system: the characters that occur most often (and least often, which is also telling) are counted and compared with the frequency features of the language in which the message was written.

In addition, there are frequent, and in some languages even fixed, letter combinations. This knowledge is also used for decryption. Incidentally, this is the method used by the famous Sherlock Holmes in the story "The Dancing Men," and codes were broken in the same way on the eve of World War II.

Informational entropy is meant to increase the reliability of encoding: thanks to the derived formulas, mathematicians can analyze and improve the schemes proposed by cryptographers.

The connection with dark matter


There are a great many theories still awaiting confirmation. One of them links the phenomenon of entropy with the relatively recently discovered dark matter, claiming that the lost energy is simply converted into dark matter. Astronomers admit that only about 4 percent of our Universe consists of the matter we know; the remaining 96 percent is occupied by what is, for now, unexplored: the dark sector.

It received this name because it neither interacts with electromagnetic radiation nor emits it (unlike all objects known in the Universe before then). Therefore, at the current stage of science, studying dark matter and its properties is not yet possible.

