# Entropy of a simple thermodynamic system seen as a temporal automaton

Here I am describing a simple system consisting of many atomic particles which move on their own in some space. It is an abstract system that is nice for thinking about the ideal gas formula, pV = nRT. Larger pressure and larger volume correspond to more particles in the system and to higher particle speeds (higher speed corresponds to higher temperature).
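The formula can be checked numerically; here is a minimal sketch in Python (the function name and the sample numbers are mine, not from the text):

```python
# Ideal gas law pV = nRT, solved for pressure.
# R is the universal gas constant; all names here are illustrative.
R = 8.314  # J/(mol*K)

def ideal_gas_pressure(n_mol, temp_k, volume_m3):
    """Pressure in pascals from p = nRT / V."""
    return n_mol * R * temp_k / volume_m3

# One mole at 298 K in about 24.5 litres gives roughly atmospheric pressure.
p = ideal_gas_pressure(1.0, 298.0, 0.0245)
```

Doubling the temperature at fixed volume doubles the pressure, matching the intuition that faster particles mean higher temperature.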

Now, I want to look at this system as a temporal automaton. A temporal automaton is a system that goes from state a to state b after a time t: it reaches state a, some time t passes, and then it turns into state b. It goes from state to state on its own, without anyone outside the system nudging it (unlike the usual automaton, which changes state only in response to an input).
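As a sketch, such an automaton can be modelled as a transition table pairing each state with its successor and a dwell time; the class and all names are my illustration, not anything from the text:

```python
# A minimal sketch of a "temporal automaton": it changes state after a
# dwell time, with no external input. Names are illustrative.

class TemporalAutomaton:
    def __init__(self, transitions):
        # transitions: state -> (next_state, dwell_time)
        self.transitions = transitions

    def run(self, state, total_time):
        """Follow transitions on the internal clock for total_time units."""
        elapsed = 0.0
        visited = [state]
        while True:
            nxt, dwell = self.transitions[state]
            if elapsed + dwell > total_time:
                break
            elapsed += dwell
            state = nxt
            visited.append(state)
        return visited

ta = TemporalAutomaton({"a": ("b", 1.0), "b": ("a", 1.0)})
states = ta.run("a", 3.5)  # a -> b -> a -> b within 3.5 time units
```

Unlike an input-driven automaton, `run` consumes no input: transitions fire purely because time passes.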

In this many-particle system, a state would correspond to the positions of all particles. If we move one particle a bit, it is already a different state. Particles move on their own, so this system changes state on its own, unconnected to anything outside of it. There could be a different definition of the state of this system; for example, particle velocities could be included in the state definition, but that complicates things. Maybe some other time.
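For instance, taking the state to be the tuple of all particle positions (the positions below are made up for illustration):

```python
# A state is the tuple of all particle positions; moving one particle
# even slightly produces a different state.
state_a = (1.0, 4.0, 7.5)   # positions of three particles (illustrative)
state_b = (1.0, 4.0, 7.6)   # the third particle moved a bit
different = state_a != state_b
```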

So, entropy is a measure of the messiness of a system. Here I am trying to make an exact definition of entropy for the temporal-automaton case, based on intuition and on correspondence with how entropy is defined when you look at a thermodynamic many-particle system the usual way (the Maxwell-Boltzmann way).

The definition of entropy that I came up with is the number of different new states per unit of time. Why different? Because a system which returns to the same states within a given unit of time would have smaller entropy; it would be less random. It is somewhat intuitive: the speed of changing states corresponds to more mess. The larger the speed of the particles, the higher the temperature, the larger the number of new states per unit of time, and therefore the larger the entropy.

The bigger the space where the particles can be (the bigger the volume, in the 3-dimensional case), the bigger the entropy, all other things remaining the same (mainly temperature and number of particles), because there are more new states (particle positions) where the system can go. However, I think space is not that big a factor, because when there are not many particles the system probably won't return to the same state anyway. Imagine a large space with a few particles in it: they will most probably never return to the same state even if you shrink the space a little bit, just because there are so many places they can "explore".

And of course, the number of particles: with every new particle you enlarge the state space. However, if the particles don't move (at minimum temperature), the entropy would be zero, because the system would always stay in the same state.
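This counting definition can be sketched on a toy model: non-interacting particles hopping on a one-dimensional grid, with "entropy" taken as the fraction of time steps that land in a never-before-seen state. The grid, the step rule, and all names are my assumptions for illustration:

```python
import random

def new_states_per_step(n_particles, grid_size, speed, steps, seed=0):
    """Fraction of time steps on which the system enters a state it has
    never visited before (distinct new states per unit time)."""
    rng = random.Random(seed)
    positions = [rng.randrange(grid_size) for _ in range(n_particles)]
    seen = set()
    new_states = 0
    for _ in range(steps):
        # Each particle jumps up to `speed` cells per step; higher
        # speed stands in for higher temperature.
        positions = [(p + rng.randint(-speed, speed)) % grid_size
                     for p in positions]
        state = tuple(positions)  # the state is all particle positions
        if state not in seen:
            seen.add(state)
            new_states += 1
    return new_states / steps
```

With speed 0 the system never leaves its initial state and the measure goes to zero, matching the minimum-temperature case above; with positive speed and a roomy grid, almost every step is a new state.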

Particles are seen as atomic balls: unbreakable, either colliding or not interacting (that choice would affect the definite formula, but not this very abstract theory very much).

However, I think it is not actually as simple as I stated at the beginning, because the speed of changing states may itself correspond to mess more than the number of new states does. So repeating states might not be that big of a deal, and it is not as straightforward as I stated. It is really a matter of definition, but this is close. Can we really unequivocally define entropy (so the mess, the randomness) when we observe this temporal automaton?

So, entropy is an internal property of the system, with no connection to any outside factors.

Of course, if we view the movement of particles as continuous, the number of states would be infinite: even a very small difference between two positions of a single particle would count as a new state. Therefore, discretization should be employed: consider, for example, only a change of 1 mm (it should actually be much less) to be a change of state, and anything less to be the same state.
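A sketch of that discretization, using the 1 mm cell from the text (the helper name is mine):

```python
CELL = 0.001  # cell size in metres: 1 mm (the text notes it should be much smaller)

def discretize(position_m, cell=CELL):
    """Map a continuous position to an integer cell index; positions
    in the same cell count as the same state."""
    return round(position_m / cell)
```

Two positions less than about half a cell apart can share an index, while positions further apart fall into different cells and so count as different states.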

I think that the Boltzmann definition of entropy, counting the possible microstates (so the states, in this case), also works here: if you presume that the system moves from one microstate to another roughly uniformly at random, the two measures become analogous and very strongly correlated, because the more states there are, the bigger the mess when the system changes from one state to another in a uniformly distributed way. Then this definition of counting the distinct new states in one time unit would be an extension of Boltzmann's. Of course, the presumption is that the system never stands still in the same state; it always spends this time t "charging" its transition to the next state.
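For reference, Boltzmann's formula counts microstates as S = k_B ln W; a minimal sketch (the constant is the CODATA value, the function name is mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_microstates):
    """S = k_B * ln(W) for W equally likely microstates."""
    return K_B * math.log(n_microstates)
```

A single-state system has zero entropy, and entropy grows (logarithmically) with the number of states, which is why counting distinct visited states correlates with it under the uniform-hopping presumption.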

(from December 7, 2021)
