What do these words have in common?
Let me take you through this little journey.
What is entropy?
Entropy is a measure of the disorder, or chaos, in a system (e.g. the universe). When a system goes from a state of order to a state of chaos, we say that its entropy has increased. When we transform energy from one state to another, a portion of that energy is dissipated as heat and can no longer be used to drive further transformations. Each time energy is dissipated in this way, the entropy of the system increases. The moment when all available energy has been used and dissipated is a state of total balance in the system. Taking our universe as a model of an isolated system, this moment has been called the “heat death of the universe”.
One example that illustrates entropy is a drop of ink falling into a glass of water. Naturally the ink will spread through the water; to collect it back out of the water we would need to apply energy and force the natural course of events. Before the ink drops into the glass, the entropy is low: in other words, there is a high level of distinction and individuality between the elements of the system (water and ink). After the ink has dropped and spread through the water, the entropy is at its highest level: we have chaos, with less distinct separation and less recognisability between the ink and the water.
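The ink-drop picture can be sketched numerically. The following minimal Python simulation is not from the article: it treats the "ink" as particles doing a one-dimensional random walk and measures the Shannon entropy of their positions, binned into cells of the "glass" (the particle count, step count and bin layout are all invented for the illustration). At the start every particle sits in one cell and the entropy is zero; as they diffuse, the entropy rises.

```python
import math
import random

def spatial_entropy(positions, bins, lo, hi):
    """Shannon entropy (in bits) of particle positions binned into equal cells."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for x in positions:
        # Clamp to the edge cells so stray walkers don't index out of range.
        i = max(0, min(int((x - lo) / width), bins - 1))
        counts[i] += 1
    n = len(positions)
    return -sum(c / n * math.log2(c / n) for c in counts if c > 0)

random.seed(0)
# The "ink drop": 1000 particles all start at the centre of the glass.
particles = [0.0] * 1000
before = spatial_entropy(particles, bins=20, lo=-50, hi=50)

# Diffusion: each particle takes 200 random unit steps.
for _ in range(200):
    particles = [x + random.choice([-1, 1]) for x in particles]
after = spatial_entropy(particles, bins=20, lo=-50, hi=50)

print(before, after)  # entropy grows as the ink spreads
```

Reversing the process, i.e. herding all the particles back into one cell, would lower the entropy again, which is exactly what the article means by applying energy to force the natural course of events.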
One thing we can learn from this concept is that the universe has a natural tendency towards a state of balance between all its parts. Any tension is destined to be dispelled. In this natural drift towards an evenly arranged chaos, the spontaneous organisation of energy/matter is highly improbable, unless we resort to using energy that comes from outside the system.
The funny thing is that this state of total equilibrium, in which all the elements of the system are evenly blended, is considered the highest state of chaos. Yet if you think about a state where everything is equal, that sounds very much like total order as well. Now let’s extend this concept to other areas… such as information, the way we understand things and the way we perceive reality.
Let’s start with an example: imagine a system made of a large number of particles in motion, where each particle’s movement influences the behaviour of all the others. If we have no clue about what is going to happen next, this looks very much like randomness. Randomness and chaos are both faces of entropy, while predictability and order are features of a system with low entropy.
In this sense we can say that the entropy of the information we have about this system is very high. If we want to calculate the next position of a particle, we must describe each particle through certain parameters of our choice (e.g. direction, speed…), which means submitting the system to a code. By doing so we reduce the entropy of the system (its informational charge, or in other words its potentiality) by making order among its elements.
From this example we can deduce that entropy is proportional to the charge of potentiality held by the elements of the system. If we seek information about the real state of the system before subscribing to a code of interpretation, we get the real picture but we cannot understand it: the information is unusable. The totality of information, with its huge amount of data, gives us the complete picture of the system with all its possible variations, but at the same time it gives us no clear indication of how things are going. Absolute information, the whole picture, the transparent and total truth, communicates nothing to us until we place the scenario within a pattern that makes order among its elements, according to whatever priorities we choose!
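The idea that submitting observations to a code lowers their entropy can also be sketched numerically. This Python snippet is an invented illustration, not the article’s own example: it records particle velocities at fine precision (a stand-in for the "raw" picture) and then applies a crude code that keeps only the direction of motion. The coded description has far lower Shannon entropy, because the code collapses many distinct states into a few that we chose to care about.

```python
import math
import random
from collections import Counter

def entropy_bits(symbols):
    """Shannon entropy (in bits) of the empirical distribution of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(1)
# Raw observations: 1000 particle velocities, recorded to two decimals.
velocities = [round(random.uniform(-1, 1), 2) for _ in range(1000)]

# Fine-grained description: nearly every value is distinct -> high entropy.
fine = entropy_bits(velocities)

# A "code" that keeps only the parameter we chose: the direction of motion.
coded = entropy_bits(["left" if v < 0 else "right" for v in velocities])

print(fine, coded)  # the code collapses many states into few: entropy drops
```

The choice of code (direction rather than, say, speed bands) is arbitrary, which mirrors the article’s point: order appears only once we impose priorities of our own on the raw data.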
So it is possible now to define:
Information = entropy = potentiality
The totality of information means being aware of everything at once: of all the possibilities and all the details of the whole system. This description of total information feels, to me, like enlightenment. Thanks to the narrowness of their point of view, a human being is able to interpret the infinite potentiality of the universe, reading that potentiality through their own code. The total information is enormous and dazzling, and to grasp it in a single glimpse would mean being aware of everything in the universe, in both its potentiality and its actuality.
Every realisation of history already exists amongst the tangles of freedom.