
Knowns and Unknowns

November 7, 2020

Are we really better off now than we used to be? In the old days we knew what we knew, and we knew there were things we didn’t know. Today, with the growth of data, machine learning, and artificial intelligence, there are many more things we know and even more that we know we don’t know. There are even things we could know but don’t bother to know, mainly because we don’t need to: someone else, or increasingly something else, knows them for us, saving us the bother of knowing them.

We even discover from time to time that there are things we know that we didn’t know we knew. We are beginning to suspect that, beyond all the things we know and know we don’t know, there are even more things we don’t know that we don’t know. But we can’t know this for sure, because if we did (if we knew what they were), they would simply join the things we know we don’t know and would no longer be unknown unknowns.[i]

It is rather like the names of things. Sometimes multiple names are given to the same thing, and sometimes multiple things share the same name. Of the two suboptimal situations, multiple things with the same name is always the more vexing. It is usually an intra-language problem rather than an inter-language one, which makes it even more troubling. You would expect a person speaking a different language to have a different name for something, but you might expect (wrongly, it seems) that people speaking the same language would use the same name or names for the same thing. Worse, people speaking the same language often use different, polymorphic descriptors to refer to the same object.

Worse still, a single monomorphic descriptor can refer to sets of objects that either overlap (like a Venn diagram[ii]) or are totally discontinuous, often without people even knowing that they don’t know.
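For the programmers among us, both naming problems map neatly onto dictionaries. The sketch below is my own analogy, not anything from the post, and the example words are arbitrary:

```python
# Synonyms (polymorphic descriptors): many names bound to one and the same object.
cat = {"species": "Puma concolor"}
names = {"cougar": cat, "puma": cat, "mountain lion": cat}
assert names["cougar"] is names["mountain lion"]   # many names, one thing

# Homonyms (one monomorphic descriptor): the same name referring to
# entirely different objects depending on context, with disjoint referents.
contexts = {
    "finance":   {"bank": "an institution that holds money"},
    "geography": {"bank": "the land alongside a river"},
}
print(contexts["finance"]["bank"])     # same word...
print(contexts["geography"]["bank"])   # ...different object
```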


[i] Popularized by Donald Rumsfeld in February 2002: things we are neither aware of nor understand.

[ii] Conceived around 1880 by John Venn, according to Wikipedia.

Entropy and Consciousness, Part 2

September 25, 2020

This is a follow-up to a previous article on consciousness and entropy: https://birkdalecomputing.com/2020/05/03/entropy-and-consciousness/

We have entered the age of uber-prediction. No, I don’t mean guessing when your hired ride will arrive, but an age in which nearly everything is predicted. Humans have, of course, always predicted the outcomes of activities and events; predicting the future has been called an emergent behavior of intelligence. Our ancestors needed it to tell them the most likely places an alpha predator might be hiding, as well as which route the prey they were trying to catch would most likely take.

There is a natural feedback loop between predicted outcomes and expected outcomes. If a predicted outcome is perceived to be within a certain margin of error of an expected outcome, the prediction is said to have “worked,” and this positive assessment tends to reinforce the use of predictive behavior in the future. In other words, it increases the occurrence of predictions and simultaneously increases the amount of data that can be used for future predictions.
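As a toy illustration of this loop (my own sketch, with made-up numbers, not anything from a real study), imagine an agent whose propensity to predict rises each time a prediction lands within tolerance of the observed outcome:

```python
import random

random.seed(42)
propensity = 0.5          # probability the agent bothers to predict at all
margin = 1.0              # acceptable error for a prediction to have "worked"
history = []              # accumulated observations: the growing pool of data

for step in range(1000):
    outcome = random.gauss(10.0, 2.0)        # the world produces an outcome
    if random.random() < propensity:
        # Predict using the mean of past data; guess blindly with none.
        prediction = sum(history) / len(history) if history else 0.0
        if abs(prediction - outcome) <= margin:
            propensity = min(1.0, propensity + 0.01)   # success reinforces predicting
    history.append(outcome)                  # every step adds data for next time

print(f"final propensity to predict: {propensity:.2f}")
```

Successful predictions breed more predictions, and every prediction attempt enlarges the data that makes the next one better, which is exactly the reinforcement described above.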

In the past we did not have as much data, or as much computing power to process it, as we have today. This acted as a constraint not only on which aspects of life could be predicted (not enough data) but also on how quickly prediction worked for the aspect of life being predicted (not enough processing power). Prediction now tends to “work” better than it ever did before because there is more data to use and faster ways to use it. The success of prediction creates a virtuous cycle that reinforces the desire for more prediction.

The state of the world around us seems to be increasing in its predictability. This leads me to believe that we must be pushing back more and more against entropy, which Claude Shannon’s information theory treats as a measure of unpredictability: entropy is at its maximum, and predictability at zero, when all outcomes are equally likely. You need less new information to predict an outcome because the amount of ambient information is constantly increasing; making information work for you often requires discovering predictions already made by others. Consequently you need to ask fewer questions to obtain a workable prediction. The less entropic a system, the more of its information you already hold, and the less you still need to learn to predict it.

Information is measured in bits, the unit of surprise. An outcome with probability p carries a surprisal of -log2(p) bits, so rare outcomes are more surprising than common ones, and entropy is simply the average surprisal of a system’s outcomes. It follows that the more bits of information you have already extracted from a system, the less surprise it has left to offer, and the less entropic it appears.
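These relationships are easy to state in code. Here is a minimal sketch of the standard Shannon formulas; nothing in it is specific to this post beyond the illustration:

```python
import math

def surprisal(p: float) -> float:
    """Surprise of a single outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Shannon entropy: the average surprisal over all outcomes."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(entropy([0.5, 0.5]))        # 1.0

# A heavily biased coin is more predictable: lower entropy, so fewer
# yes/no questions are needed on average to pin down the outcome.
print(entropy([0.9, 0.1]))        # ~0.469

# Rare events carry more surprise than common ones.
print(surprisal(0.5), surprisal(0.01))   # 1.0 bits vs ~6.64 bits
```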

Entropy and Consciousness

May 3, 2020

https://futurism.com/new-study-links-human-consciousness-law-governs-universe

A curious finding of the study referenced in the article above is that the human brain displays higher entropy when fully conscious than when not. “Normal wakeful states are characterized by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values,” the team wrote in the study. To me this means that the greater the number of possible configurations of connections between brain cells at any point in time, the higher the entropy at that point. One could extrapolate that the more wakeful one is, the more entropy there is in the brain, and by inference, that the brain becomes less entropic as one goes to sleep.
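If every configuration were equally likely, entropy would reduce to the logarithm of the number of configurations, so more reachable configurations directly means higher entropy. A quick sketch with invented configuration counts (the study reports no such numbers) shows the direction of the effect:

```python
import math

def config_entropy(num_configs: int) -> float:
    """Entropy in bits when all configurations are equally likely: log2(N)."""
    return math.log2(num_configs)

# Hypothetical counts, purely for illustration:
awake_configs = 1_000_000     # many reachable network configurations
asleep_configs = 1_000        # far fewer reachable during sleep

print(f"awake:  {config_entropy(awake_configs):.1f} bits")   # ~19.9
print(f"asleep: {config_entropy(asleep_configs):.1f} bits")  # ~10.0
```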

This seems to go against the idea that learning, reasoning, and awareness of the world are a kind of “push back” against entropy. Instances of life are generally seen as organized, metabolizing, replicating pieces of matter that are eventually overcome by entropy and die: little islands of anti-entropy in a sea of entropic chaos. How life maintains this uphill struggle has always been a fascinating subject of study for biological scientists. Individual instances cannot keep up the struggle forever; we eventually fall below a minimal level of energy production and consumption and die. This is probably the motivator for the evolution of replication and reproduction.[i]

We think of consciousness as a characteristic of life, and thus of order, but this study seems to say the opposite: that consciousness is a characteristic of chaos and disorder, and that pieces of matter, at various locations and periods of time, tend to have less entropy and less consciousness when they display local order. This seems to me to imply a kind of “cosmic consciousness” associated with entropy, a concept that, in my experience, dates back at least to the hippie era of the 1960s and ’70s, when expressions of it were quite often externally stimulated.

Can life’s tendency to continue living, that is, to remain less entropic, be a natural reaction against a cosmic consciousness that tends to disorganize our local order? Can states like sleep be temporary reversals in the flow toward entropy, while consciousness pushes that flow forward? Is it possible that cosmic consciousness is just the sum total of all local consciousnesses, and that after one dies, consciousness, in the form of entropy, in a sense lives on?

Another article in the same vein: https://futurism.com/the-byte/mathematicians-think-universe-conscious

An earlier post of mine about entropy and information loss can be found at: https://birkdalecomputing.com/2019/02/08/information-entropy/

[i] A subject for another day.