Entropy and Consciousness, Part 2

September 25, 2020

This is a follow-up to a previous article on consciousness and entropy: https://birkdalecomputing.com/2020/05/03/entropy-and-consciousness/

We have entered the age of uber-prediction. No, I don't mean guessing when your hired ride will arrive, but an age when nearly everything is predicted. Humans have, of course, always predicted the outcomes of activities and events; predicting the future has been called an emergent behavior of intelligence. Our ancestors needed it to tell them the most likely places an alpha predator might be hiding, as well as which route the prey they were chasing would most likely take.

There is a natural feedback loop between predicted outcomes and expected outcomes. If a predicted outcome is perceived to be within a certain margin of error of an expected outcome, the prediction is said to have "worked," and this positive assessment (i.e. that the prediction worked) tends to reinforce the use of predictive behavior in the future. In other words, it increases the occurrence of predictions and simultaneously increases the amount of data that can be used for future predictions.

In the past we did not have as much data, or as much computing power to process it, as we have today. This acted as a constraint not only on which aspects of life could be predicted (not enough data), but also on how quickly prediction worked relative to the aspect of life being predicted (not enough processing power). Prediction now tends to "work" better than ever before because there is more data to use and faster ways to use it. The success of prediction creates a virtuous cycle that reinforces the desire for more prediction.

The state of the world around us seems to be increasing in its predictability. This leads me to believe that we must be pushing back more and more against entropy, which in Claude Shannon's information theory is maximal when all outcomes are equally likely, that is, in a state of zero predictability. Because the amount of ambient information is constantly increasing, you need less new information of your own to predict an outcome. Making information work for you often requires discovering predictions already made by others; consequently you need to ask fewer questions to obtain a workable prediction. The less entropic a system, the more information it contains.
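The link between predictability and entropy can be made concrete with Shannon's formula, H = -Σ p·log₂(p). The probabilities below are illustrative, not from the article; this is just a minimal sketch showing that equally likely outcomes (zero predictability) maximize entropy, while a predictable system has lower entropy:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: both outcomes equally likely, zero predictability, maximum entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit

# A heavily biased coin: highly predictable, so much lower entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits
```

The entropy value is also the average number of yes/no questions an optimal guesser needs to pin down the outcome, which is why more ambient information means fewer questions to ask.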

Information is measured in the bit, a single unit of surprise. In Shannon's terms, each bit of information you already hold answers one yes/no question and removes that much uncertainty, so a system rich in information leaves fewer surprises, and less entropy, to resolve.
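The "unit of surprise" has a standard measure, the surprisal of a single outcome, -log₂(p): the rarer the outcome, the more bits of surprise it carries. A small sketch with made-up probabilities:

```python
import math

def surprisal_bits(p):
    """Surprise of one outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

print(surprisal_bits(0.5))     # 1.0 bit  (a fair coin flip)
print(surprisal_bits(1 / 64))  # 6.0 bits (a rare outcome is far more surprising)
```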

Entropy and Consciousness

May 3, 2020

https://futurism.com/new-study-links-human-consciousness-law-governs-universe

A curious finding in the study referred to in the article above is that the human brain displayed higher entropy when fully conscious than when not. "Normal wakeful states are characterized by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values," the team wrote in the study. To me this means that the higher the number of connections between brain cells at any point in time, the higher the level of entropy at that point. One could extrapolate that the more wakeful one is, the more entropy there is in the brain, and by inference, the brain becomes less entropic as one goes to sleep.

This seems to go against the idea that learning, reasoning and awareness of the world are a type of "push back" against entropy. Instances of life are generally seen as organized, metabolizing and replicating pieces of matter that are eventually overcome by entropy and die: little islands of anti-entropy in a sea of entropic chaos. How life maintains this uphill struggle has long been a fascinating subject of study for biologists. Individual instances cannot keep up the struggle forever; we eventually fall below a minimal level of energy production and consumption and die. This is probably the motivator for the evolution of replication and reproduction.[i]

We think of consciousness as a characteristic of life, and thus of order, but this study seems to say the opposite: consciousness is a characteristic of chaos and disorder, and pieces of matter that display local order at a given place and time tend to have less entropy and less consciousness. This seems to me to imply a type of "cosmic consciousness" associated with entropy, a concept that, in my experience, dates back at least to the hippie era of the 1960s and '70s, when expressions of it were quite often externally stimulated.

Can life's tendency to continue living, that is, to be less entropic, be a natural reaction against a cosmic consciousness that tends to disorganize our local order? Can states like sleep be temporary reversals in the flow toward entropy, while consciousness pushes that flow forward? Is it possible that cosmic consciousness is just the sum total of all local consciousnesses, so that after one dies, consciousness in the form of entropy, in a sense, lives on?

Another article in the same vein: https://futurism.com/the-byte/mathematicians-think-universe-conscious

An earlier post of mine about entropy and information loss can be found at: https://birkdalecomputing.com/2019/02/08/information-entropy/

[i] A subject for another day.