Posts Tagged ‘Entropy’

Entropy and Consciousness

May 3, 2020

https://futurism.com/new-study-links-human-consciousness-law-governs-universe

A curious finding from the study referred to in the article above is that the human brain displayed higher entropy when fully conscious than when not. “Normal wakeful states are characterized by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values,” the team wrote in the study.  To me, this means that the more connections there are between brain cells at any point in time, the higher the entropy at that point.  One could extrapolate that the more wakeful one is, the more entropy there is in the brain, and by inference, that the brain becomes less entropic as one goes to sleep.
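The study’s operational idea, that entropy tracks the number of possible configurations of interactions between brain networks, can be made concrete with a toy calculation.  Here is a minimal Python sketch (my own illustration, not the study’s actual method), treating each pair of regions as either connected or not:

```python
from math import comb

def max_network_entropy_bits(n_regions: int) -> int:
    """Entropy (in bits) when every pattern of pairwise connections among
    n_regions is equally likely: log2(2 ** C(n, 2)) = C(n, 2)."""
    n_pairs = comb(n_regions, 2)  # possible pairwise connections
    return n_pairs                # log2 of 2**n_pairs configurations

# More interacting regions -> more possible configurations -> higher entropy.
print(max_network_entropy_bits(4))   # 6 bits
print(max_network_entropy_bits(10))  # 45 bits
```

The point of the toy model is only that entropy grows with the number of possible interaction patterns, which is the sense in which a wakeful, highly connected brain scores higher.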

This seems to go against the idea that learning, reasoning and awareness about the world is a type of “push back” against entropy.  Instances of life are generally seen as organized, metabolizing and replicating pieces of matter that eventually are overcome by entropy and die.  Almost like little islands of anti-entropy in a sea of entropic chaos.  How life can maintain this uphill struggle has always been a fascinating subject of study for biological scientists. Individual instances cannot keep up the struggle forever.  We eventually fall below a minimal level of energy production and consumption and die. This is probably the motivator for the evolution of replication and reproduction.[i]

We think of consciousness as a characteristic of life, and thus of order, but this study seems to say the opposite: consciousness is a characteristic of chaos and disorder, and pieces of matter that display local order at various places and times tend to have less entropy and less consciousness.  This seems to me to imply a type of “cosmic consciousness” associated with entropy.  It is a concept that, in my experience, dates back at least to the hippie era of the 1960s and 70s, when expressions of it were quite often externally stimulated.

Can life’s tendency to continue to live, that is to be less entropic, be a natural reaction against the cosmic consciousness which tends to disorganize our local order?  Can states like sleep, for example, be temporary reversals in the flow toward entropy while consciousness pushes forward our flow toward it?  Is it possible that cosmic consciousness is just the sum total of all local consciousnesses, and after one dies consciousness in the form of entropy, in a sense, lives on?

Another article in the same vein is this:  https://futurism.com/the-byte/mathematicians-think-universe-conscious

An earlier post by me about entropy and information loss can be found at:  https://birkdalecomputing.com/2019/02/08/information-entropy/

[i] A subject for another day.

Information Entropy

February 8, 2019

Is there a maximum amount of data about any given subject above which the incremental value of any additional data approaches zero?  Even more starkly: is there a point where more data about a given subject may actually have a negative effect, in that it decreases the amount of information about the subject?

I don’t mean that newer data may prove old data about the subject inaccurate and render it out of date, in which case the old data would still exist but no longer be relevant.  I’m talking about a situation where the informational content, i.e. the payload of the data, actually decreases.  This would be a situation where we know less about a subject at some point in the future than we knew in the past, based on all the data there is on the subject.  I think it would have to have something to do with context, with the passage of time, and with the associations between units of data about the subject and data about other subjects.  The more connections, the more information a body of data has.
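One way to make “more data, but no more information” concrete is Shannon’s entropy measure: duplicating existing data adds volume but no informational content.  A minimal sketch in Python (my own illustration, not something from the posts or articles above):

```python
from collections import Counter
from math import log2

def shannon_entropy_bits(data: str) -> float:
    """Shannon entropy of the character distribution, in bits per character."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

original = "abcdabcd"
duplicated = original * 100  # 100x more data about the same subject

# Per-character entropy is unchanged: the copies carry no new information.
print(shannon_entropy_bits(original))    # 2.0
print(shannon_entropy_bits(duplicated))  # 2.0
```

So a body of data can grow without bound while the information it carries stays flat; the negative case I am speculating about would require the measure to actually fall.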

This is a tricky speculation.  I mean, is it possible to actually know less at some point than was previously known?  Not for just a single person, as sometimes happens when we age and simply forget information we could previously recall about a subject; I am talking about the accumulated knowledge about a subject.  It would be as if a subject became simpler over time rather than more complex, which is pretty much the opposite of observed reality.

In fact, this just may be what happens as we approach absolute and universal entropy.  I am not a physicist, but why would this not be the case?  There would be two aspects to this: not only would universal entropy eliminate any differences between things, but the differences between parties, places and times would cease to exist as well.  Not only would there be less information on all subjects, but there would be fewer and fewer subjects to have information about.  Nor would there be anyone to have, and be responsible for, information, even if it did still exist.
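The idea that universal entropy erases the distinctions that information is *about* has an information-theoretic analogue: once two sources look statistically identical, no observation can tell them apart.  A hedged sketch in Python (my analogy, not a physical claim), measuring the mutual information between a source’s identity and the symbols it emits:

```python
from math import log2

def entropy(p):
    """Shannon entropy (bits) of a probability distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

def mutual_information(p_a, p_b):
    """I(source; symbol) for two equally likely sources with symbol
    distributions p_a and p_b: H(mixture) - average H(source)."""
    mix = [(a + b) / 2 for a, b in zip(p_a, p_b)]
    return entropy(mix) - (entropy(p_a) + entropy(p_b)) / 2

# Distinct "subjects": each source favors different symbols.
print(mutual_information([0.9, 0.1], [0.1, 0.9]))  # ~0.53 bits

# At equilibrium both sources look the same: nothing distinguishes them.
print(mutual_information([0.5, 0.5], [0.5, 0.5]))  # 0.0 bits
```

When every source emits the same flat distribution, the mutual information drops to zero: there are, in effect, no longer separate subjects to have information about.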