Theory of Everything.se
Intro page 2 – Time, entropy & energy


to page 1

Time, entropy and energy in physics

Time manifests in the Universe when matter is subjected to acceleration, velocity, temperature, rotation, gravitation, or any other duration-altering influence caused by exposure to energy. Less energy causes time to slow down at matter, and rising energy speeds it up. This indicates that time is "achieved" at matter when energy is applied or changed. Time itself is not an entity that has any impact on matter, whereas energy has an absolute altering effect on it. The Universe does not notice time, perhaps only energy. The speed of light in vacuum, c, is an energy limit, not a time-related one. So the speed of light may primarily depend on energy, and time, as said, is not an entity that affects the Universe; it is revealed to be an imaginary physical reference unit.
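For reference, standard physics quantifies the duration-altering effect of velocity with the Lorentz factor. The Python sketch below is only an illustration of that textbook relation; the 0.9c speed is an arbitrary pick:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def time_dilation_factor(v: float) -> float:
    """Lorentz factor gamma: a clock moving at speed v ticks slower
    by a factor gamma relative to a stationary observer."""
    if not 0 <= v < C:
        raise ValueError("speed must satisfy 0 <= v < c")
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# A clock at 90 % of c: one of its seconds spans ~2.29 observer seconds.
gamma = time_dilation_factor(0.9 * C)
print(round(gamma, 2))  # 2.29
```

Note that the duration change appears only when a velocity, i.e. kinetic energy, is present; at v = 0 the factor is exactly 1.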

Entropy is a mathematical expression for the order in any chosen system. It is commonly explained like this: low entropy is at hand when, for example, a room has all its items in their standard places, and when any item is moved elsewhere the entropy rises. The more items that are out of their standard places, the higher the entropy becomes. In the normalized form used here, entropy takes a value between 0 and 1, where 0 is the lowest possible entropy, i.e. everything in its ordinary position. The Universe is commonly said to begin with low entropy and evolve toward a higher value. This is just a mathematical expression, an amount, for the expected development of the Universe as it evolves toward maximum entropy and finally meets the heat death.
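The 0-to-1 scale described above matches what information theory calls normalized Shannon entropy: the raw entropy divided by its maximum possible value. A minimal sketch, with illustrative three-bin distributions:

```python
import math

def normalized_entropy(probs: list[float]) -> float:
    """Shannon entropy of a probability distribution, scaled to 0..1
    by dividing by log(n); 0 = full order, 1 = maximal disorder."""
    n = len(probs)
    if n < 2:
        return 0.0
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(n)

print(normalized_entropy([1.0, 0.0, 0.0]))   # everything in one place -> 0
print(normalized_entropy([1/3, 1/3, 1/3]))   # fully scrambled -> 1
```

Any partially disordered distribution lands strictly between the two extremes.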

In quantum mechanics (QM), entropy is used as, or equated with, the information a quantum system holds. The entropy or information of a QM system depends on entities such as position, momentum (mass and speed), spin, or energy. Quantum mechanics also declares that when these entities are measured by a probe or instrument, the state of the specific QM system collapses: the collapse of the local wave function. That is, the system is forced into this altered state when the formerly "unknown" entity is measured and thereby defined. One peculiar mathematical definition is that the entropy or information is not changed, but remains the same, before as after the collapse. This is known as the observer effect of a quantum mechanical system and was stated, among other quantum mechanical definitions, in the so-called Copenhagen interpretation. The explanation is that when a QM system is measured and its unknown entity is defined, this lowered entropy simultaneously meets the observer environment, which gains a correspondingly higher entropy, more disorder. The entropy or information of the QM system together with its environment therefore stays the same before, during, and after the collapse.
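The standard way to put a number on the entropy of a quantum state is the von Neumann entropy of its density matrix, S = -Tr(ρ log₂ ρ). A brief NumPy sketch, with illustrative qubit states, showing a pure superposition at entropy 0 and a fully mixed state at 1 bit:

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues
    of the density matrix rho; measured in bits."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]          # 0 * log 0 is taken as 0
    return float(-np.sum(eigs * np.log2(eigs)))

# A pure superposition |+> = (|0> + |1>)/sqrt(2): entropy ~0.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
pure = np.outer(plus, plus)
print(von_neumann_entropy(pure))

# The fully mixed state, outcomes averaged over: 1 bit.
mixed = np.eye(2) / 2
print(von_neumann_entropy(mixed))
```

The pure state is fully "ordered" even though its measurement outcome is unknown; only the statistical mixture carries entropy.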


Wave function collapse

So, since entropy is explicitly a mathematical entity or unit, it does not affect the Universe in any physical way; perhaps only energy, including matter in its different forms, does. Still, in quantum computing, cryptography, and information theory, entropy holds a major position. Even when a QM system has entropy and holds information, i.e. has position, momentum, energy, and so forth, a quantum mechanical system can only truly affect any other (QM) system through the transfer of energy. Time is merely an imaginary physical entity, and entropy is only a mathematical value describing the amount of order in a system. One indication that time is imaginary is that negative time, -t, exists in physics; one indication that entropy is pure mathematics is that its normalized values all lie between 0 and 1, describing order this way. Does the Universe care whether it is in order or in chaos? Of course not. One can also schematically state that the present Universe has an entropy of 0.1 and meets the heat death at 0.5; these are just figures the Universe ignores. Though the Universe cannot ignore the presence of energy, which evokes imaginary time values and shows mathematical amounts of entropy.

A further example is a beam of white light, holding all possible colors, versus a laser beam of a fixed green color. Comparing the green laser with a blue and a red laser beam, one notices that the green laser photons carry less energy than the blue ones and more energy than the red. Yet the entropy of these three lasers, the internal order of these lights, is the same, very near 0: all three hold full order when having fixed colors, everything in its place. The white light beam must therefore have an entropy value near 1, since it holds the three colors mentioned, together with all other possible colors, in a scrambled manner. Observe that the three laser beams all hold different energy, though all three have the same amount of entropy. This example shows that energy and entropy are not interchangeable quantities, and that entropy is purely an estimated or calculated value of order in systems. The mixing of fixed light colors (photons) can nonetheless be useful in machines that handle mathematics and other information manipulation. This subsection is thereby a brief view of the difference between time, entropy, and energy in QM environments.
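The energy ordering of the three laser colors follows from the photon energy E = hc/λ. A quick check in Python; the wavelengths 450, 532, and 650 nm are illustrative picks for blue, green, and red:

```python
H = 6.626_070_15e-34    # Planck constant, J*s
C = 299_792_458.0       # speed of light in vacuum, m/s
EV = 1.602_176_634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy in eV of a single photon with the given wavelength."""
    return H * C / (wavelength_nm * 1e-9) / EV

for name, nm in [("blue", 450), ("green", 532), ("red", 650)]:
    print(f"{name:5s} {nm} nm -> {photon_energy_ev(nm):.2f} eV")
```

The printout confirms blue > green > red in photon energy, while nothing in the calculation involves the order, i.e. the entropy, of the beam.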

There are, though, many different manifestations of entropy, which makes this physical/mathematical expression a bit hard to grasp, whereas time is time and energy is energy. Entropy can describe how temperature, that is the kinetic energy of atoms and molecules, is transferred to the surroundings: the direction and the proportions. This physical behavior relates to the tendency of energy to spread and become more evenly distributed; energy can never be lost, only spread. Generally, an isolated system can only increase or maintain its entropy over time. Entropy can further describe, on the 0-to-1 scale used here, how well a physical system is ordered (how much chaos it holds) with respect to its internal structure, and also forecast the probability that its entropy will rise or stay intact. There are more ways in which entropy can describe physical systems and informational concepts of randomness and their possible changes. Simply spoken, entropy is a way to quantify the amount of chaos or randomness in a physical system. Observe that the total entropy of a system and its surroundings can never decrease.
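The statement that the total entropy of a system and its surroundings can never decrease can be illustrated with heat flowing from a hot body to a cold one; the temperatures and heat amount below are arbitrary picks:

```python
def entropy_change(q: float, t_hot: float, t_cold: float) -> float:
    """Total entropy change, in J/K, when heat q (joules) flows from a
    reservoir at t_hot to one at t_cold (kelvin): dS = q/t_cold - q/t_hot."""
    return q / t_cold - q / t_hot

# 100 J flowing from 400 K to 300 K: the hot body loses less entropy
# than the cold body gains, so the total can only go up.
ds = entropy_change(q=100.0, t_hot=400.0, t_cold=300.0)
print(f"{ds:.4f} J/K")
```

As long as heat flows from hot to cold (t_hot > t_cold), the result is positive, matching the direction-and-proportions description above.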

Regards
/admin

to page 3