Posts Tagged ‘Entropy’

Don’t Look Up

February 17, 2022

The title is obviously ripped off from the 2021 Oscar-nominated movie, but I thought it fit. Here in the Earth’s gravity well we are exposed to everything “out there”, even the stuff we ourselves have put out there, like satellites and space stations.

An orbiting object, the International Space Station in particular, slowly accumulates an energy debt as it circles the Earth. This debt is the result of its borrowing energy from its surroundings to keep from falling out of the sky. Sooner or later the debt gets too high for the object to carry, and it balances the books by falling back down the gravity well to Earth. The fireball produced by that plunge is the retirement of that debt.

This is kind of like the biological death that happens to all of us. Sooner or later, local entropy, as it increases, whittles away at the bonds that hold together the matter of which we are made. We break up into smaller pieces of matter until we fall below the critical mass required to stay alive and we “die”.

Here’s an interesting read on the topic.

https://theconversation.com/the-international-space-station-is-set-to-come-home-in-a-fiery-blaze-and-australia-will-likely-have-a-front-row-seat-176690?utm_medium=email&utm_campaign=Latest%20from%20The%20Conversation%20for%20February%2016%202022%20-%202202821848&utm_content=Latest%20from%20The%20Conversation%20for%20February%2016%202022%20-%202202821848+CID_8e10a0dcb5ad55d2fcc546e7c481066e&utm_source=campaign_monitor_uk&utm_term=The%20International%20Space%20Station%20is%20set%20to%20come%20home%20in%20a%20fiery%20blaze

Because we are usually much closer to the bottom of Earth’s gravity well, we break apart much more slowly than the International Space Station will. Our “death” is usually far less fiery, and it is observed and recorded by far fewer witnesses and input sensors.

Entropy and Consciousness, Part 2

September 25, 2020

This is a follow-up to a previous article on consciousness and entropy:  https://birkdalecomputing.com/2020/05/03/entropy-and-consciousness/

We have entered the age of uber-prediction. No, I don’t mean guessing when your hired ride will arrive, but an age in which nearly everything is predicted. Humans have, of course, always predicted the outcomes of activities and events; predicting the future has been called an emergent behavior of intelligence. Our ancestors needed it to tell them the most likely places an alpha predator might be hiding, as well as which route the prey they were trying to catch would most likely take.

There is a natural feedback loop between predicted outcomes and expected outcomes. If a predicted outcome falls within a certain margin of error of the expected outcome, the prediction is said to have “worked”, and that positive assessment tends to reinforce the use of predictive behavior in the future. In other words, it increases the occurrence of predictions and simultaneously increases the amount of data available for future predictions.
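To make the loop concrete, here is a toy sketch in Python (entirely my own illustration, with made-up numbers, not a model of any real predictor). Each time a prediction lands within the margin of error of the observed outcome, the outcome joins the pool of data used for the next prediction, so success both reinforces the behavior and enlarges the data that feeds it.

# Toy illustration of the prediction feedback loop described above.
observations = [10.0, 10.5, 9.8, 10.2]   # hypothetical past outcomes
margin = 1.0                             # acceptable margin of error
successes = 0

for actual in [10.1, 9.9, 10.4]:         # hypothetical new outcomes
    predicted = sum(observations) / len(observations)  # naive prediction: the running average
    if abs(predicted - actual) <= margin:
        successes += 1                   # the prediction "worked", reinforcing predictive behavior
    observations.append(actual)          # more data is now available for the next prediction

print(successes)                         # 3: every prediction landed within the margin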

In the past we did not have as much data, or as much computing power to process it, as we have today. That acted as a constraint not only on which aspects of life could be predicted (not enough data) but also on how quickly prediction worked for the aspect of life being predicted (not enough processing power). Prediction now tends to “work” better than it ever did before because there is more data to use and faster ways to use it. The success of prediction creates a virtuous cycle that reinforces the desire for more prediction.

The state of the world around us seems to be increasing in its predictability. This leads me to believe that we must be pushing back more and more against entropy, which in Claude Shannon’s information theory is at its maximum when all outcomes are equally likely and nothing can be predicted. Because the amount of ambient information is constantly increasing, you need less new information to predict an outcome. Making information work for you often requires discovering predictions already made by others, so you need to ask fewer questions to obtain a workable prediction. The less entropic a system, the less new information it contains.

Information is measured in the bit, a single unit of surprise. The more bits a system has, the more possible surprises it can hold and the more entropic it is. It follows that the more information there is in a system, the more units of surprise it potentially contains and the less likely it is to behave as predicted.
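As a rough illustration (a minimal sketch in Python of the standard Shannon formula, not anything specific to the systems discussed here): entropy in bits can be computed directly from the probabilities of the outcomes, and it is highest when every outcome is equally likely, i.e. when nothing about the next outcome can be predicted.

import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), in bits; outcomes with zero probability contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely outcomes: maximum unpredictability, 2 bits of surprise per outcome
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# One outcome nearly certain: little surprise, little new information per outcome
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))   # roughly 0.24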

Entropy and Consciousness

May 3, 2020

https://futurism.com/new-study-links-human-consciousness-law-governs-universe

A curious finding of the study referred to in the article above is that the human brain displayed higher entropy when fully conscious than when not. “Normal wakeful states are characterized by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values,” the team wrote in the study. To me this means that the higher the number of connections between brain cells at any point in time, the higher the level of entropy at that point. One could extrapolate that the more wakeful one is, the more entropy there is in the brain, and by inference, as one goes to sleep the less entropic the brain becomes.
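One back-of-the-envelope way to picture the quoted result (my own sketch, not the study’s method): if every configuration of interactions between brain networks is treated as equally likely, the entropy in bits is simply the base-2 logarithm of the number of reachable configurations, so more possible configurations means higher entropy.

import math

def configuration_entropy(num_configurations):
    # entropy in bits of a uniform distribution over N equally likely configurations: log2(N)
    return math.log2(num_configurations)

print(configuration_entropy(8))      # 3.0 bits  (few reachable configurations, e.g. deep sleep)
print(configuration_entropy(1024))   # 10.0 bits (many reachable configurations, e.g. wakefulness)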

This seems to go against the idea that learning, reasoning and awareness about the world are a type of “push back” against entropy. Instances of life are generally seen as organized, metabolizing and replicating pieces of matter that eventually are overcome by entropy and die: almost like little islands of anti-entropy in a sea of entropic chaos. How life can maintain this uphill struggle has always been a fascinating subject of study for biological scientists. Individual instances cannot keep up the struggle forever; we eventually fall below a minimal level of energy production and consumption and die. This is probably the motivator for the evolution of replication and reproduction.[i]

We think of consciousness as a characteristic of life, and thus of order, but this study seems to say the opposite: consciousness is a characteristic of chaos and disorder, and pieces of matter that display local order at various places and times tend to have less entropy and less consciousness. This seems to me to imply a type of “cosmic consciousness” associated with entropy, a concept that, at least in my experience, dates back at least to the hippie era of the 1960s and 70s, when expressions of it were quite often externally stimulated.

Can life’s tendency to continue to live, that is, to remain less entropic, be a natural reaction against the cosmic consciousness that tends to disorganize our local order? Can states like sleep, for example, be temporary reversals in the flow toward entropy, while consciousness pushes that flow forward? Is it possible that cosmic consciousness is just the sum total of all local consciousnesses, so that after one dies, consciousness in the form of entropy, in a sense, lives on?

Another article in the same vein is this:  https://futurism.com/the-byte/mathematicians-think-universe-conscious

An earlier post by me about entropy and information loss can be found at:  https://birkdalecomputing.com/2019/02/08/information-entropy/

[i] A subject for another day.

Information Entropy

February 9, 2019

Is there a maximum amount of data about any given subject above which the incremental value of any additional data begins to approach zero? Even more starkly, is there a point where more data about a given subject may begin to have a negative effect, in that it actually decreases the amount of information about the subject?

I don’t mean that newer data may prove old data about the subject inaccurate and therefore out of date; in that case the old data would still exist but simply no longer be relevant. I’m talking about a situation in which the informational content, i.e. the payload of the data, actually decreases: a situation in which we know less about a subject at some point in the future than we knew in the past, based on all the data there is on the subject. I think it would have to have something to do with context, with the passage of time, and with the associations between units of data about the subject and data about other subjects. The more connections, the more information a body of data has.

This is a tricky speculation. I mean, is it possible to actually know less at some point than was previously known? Not just for a single person, as sometimes happens as we age and simply forget the information we could previously recall about a subject; I am talking about the accumulated knowledge about a subject. It is kind of like a subject becoming simpler over time rather than more complex, which is pretty much the opposite of observed reality.

In fact, this just may be what happens as we approach absolute and universal entropy. I am neither a physicist nor an information scientist, but why would this not be the case? There would be two aspects to this: not only would universal entropy eliminate any differences between things, but the differences between parties, places and times would cease to exist as well. It’s not that there would be less information on every subject; there would be fewer and fewer subjects to have information about. Nor would there be anyone to hold, and be responsible for, information, even if it did still exist.