Archive for the ‘Other’ Category

Starbucks Fantasy

July 15, 2022

She was a pretty girl.  Sweet actually.   A smile, bright eyes and a cheery acknowledgement of my presence across from her.  We made short term small talk and I asked her her name.  She told me and then, I could almost not believe it, she asked me mine.  My heart skipped a beat… she couldn’t really care what my name was, could she?

I told her… and she wrote it on the cup.

Two Interesting Interrelated Topics

March 27, 2022

I have been hearing more and more about two interesting interrelated topics recently.

First is an article from my NPR news feed from a short while back, about governments introducing virtualized versions of their “centralized” currencies. As you might imagine, China seems to be way out in front on this. It may be virtual and it may be crypto, but it is still centralized. That means it is sovereign currency, and a government controls this “e-currency”.

https://www.npr.org/2022/02/06/1072406109/digital-dollar-federal-reserve-apple-pay-venmo-cbdc?sc=18&f=1001

Second, and perhaps even more interesting, is the rise of “decentralized” finance, or DeFi. Here, at least at first, a government does not necessarily control the currency. Below is a reference to a Coursera course produced by Duke University on this topic. This might be where the real action is going forward.

https://www.coursera.org/learn/decentralized-finance-infrastructure-duke?specialization=decentralized-finance-duke

Don’t kill me and I won’t kill you

February 20, 2022

This is one of the best articles on COVID/Omicron I’ve read.  It makes a lot of sense.  In order for the virus to infect more and more people, and thus increase its probability of surviving in a world full of enemies (e.g., humans), it has to evolve to the point where it is not so severe and people just live with it.  Kind of like the flu: people tolerate it and the virus continues to propagate.  Our probiotic stomach bacteria learned this lesson millions of years ago.  Don’t kill them and they won’t kill you!

https://www.nature.com/articles/d41586-022-00210-7?utm_source=Nature+Briefing&utm_campaign=f41aeff64d-briefing-dy-20220202&utm_medium=email&utm_term=0_c9dfd39373-f41aeff64d-42706715

Don’t Look Up

February 17, 2022

The title obviously is ripped off from the 2021 Oscar nominated movie, but I thought it fit. Here in the Earth’s gravity pool we are exposed to everything “out there”, even our own stuff which we have put out there, like satellites and space stations. 

Orbiting objects, especially the International Space Station, slowly accumulate an energy debt as they circle the Earth. This is a result of borrowing energy from their surroundings to keep from falling out of the sky. Sooner or later this debt gets too high for the object to carry, and it balances the books by falling down the gravity well back to Earth. The fireball produced by that plunge is the repayment of that debt.

This is kind of like the biological death that happens to all of us. Sooner or later, local entropy, as it increases, whittles away at the bonds that hold together the matter of which we are made.  We break up into smaller pieces of matter until we fall below the critical mass required to stay alive, and we “die”.

Here’s an interesting read.

https://theconversation.com/the-international-space-station-is-set-to-come-home-in-a-fiery-blaze-and-australia-will-likely-have-a-front-row-seat-176690?utm_medium=email&utm_campaign=Latest%20from%20The%20Conversation%20for%20February%2016%202022%20-%202202821848&utm_content=Latest%20from%20The%20Conversation%20for%20February%2016%202022%20-%202202821848+CID_8e10a0dcb5ad55d2fcc546e7c481066e&utm_source=campaign_monitor_uk&utm_term=The%20International%20Space%20Station%20is%20set%20to%20come%20home%20in%20a%20fiery%20blaze

Because we are usually much closer to the bottom of the gravity well, we break apart much more slowly than the International Space Station will. Our “death” is usually far less fiery, and observed and recorded by far fewer input sensors.

Wisdomtimes – 6 Interrogatives – The Mystery of Five W’s and One H

February 14, 2022

The following URL is a link to an article with a different and amusing take on The 6 Interrogatives:

Who, What, Where, When, Why and How. Hope you enjoy it.

http://www.wisdomtimes.com/blog/6-interrogatives-the-mystery-of-five-ws-and-one-h/

Regression to the Mean

May 20, 2021

Jack Bogle’s ghost warns about 401(k)s https://a.msn.com/r/2/BB1gUrmr?m=en-us&a=0  

As the article says “It’s worth taking a moment to reflect on just how good things have been for investors for a decade.”  But, “Enjoy making 12% a year, but don’t get used to it.”  It’s not going to last forever.  
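To put that 12% figure in perspective, here is a minimal sketch (the starting amount is my own illustrative assumption) of what a decade of such returns does to a nest egg, which is exactly the kind of run that regression to the mean warns against extrapolating:

```python
# Illustrative only: a decade of 12% annual returns on $10,000.
principal = 10_000
annual_rate = 0.12
years = 10

value = principal * (1 + annual_rate) ** years
print(round(value))  # about 31,058, roughly triple the starting amount
```

Regression to the mean simply says the next decade is unlikely to look like that one.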

Knowns and Unknowns

November 7, 2020

Are we really better off now than we used to be? In the old days we knew what we knew, and we knew there were things we didn’t know.  Today with the growth of data, machine learning and artificial intelligence, there are many more things we know and even more that we know we don’t know.  There are even things we could know but don’t bother to know, mainly because we don’t need to know them.  Someone else, or increasingly something else knows them for us, thus saving us the bother of knowing them.

We even discover from time to time that there are things we know that we didn’t know that we knew.  We are beginning to suspect that, more than all the things we know and know we don’t know, there are even more things we don’t know that we don’t know.  But we don’t know for sure that we don’t know them. If we knew this for sure (if we knew them) then it would just add to the things that we know that we don’t know and would no longer be unknown unknowns.[i]

It is kind of like the names of things.  Sometimes there are multiple names given to the same thing, and sometimes multiple things have the same name.  Of the two suboptimal situations, multiple things with the same name is always the more vexing.  This is usually an intra-language problem and not an inter-language problem, which makes it even more troubling.  You would expect a person speaking a different language to have a different name for something, but you might expect (wrongly, it seems) that people speaking the same language would have the same name or names for the same thing.  Worse, people of the same language often have different polymorphic descriptors referring to the same object.

What is even worse, a monomorphic descriptor can refer to a set of objects that can either overlap (like a Venn diagram[ii]) or be totally discontinuous, often without people even knowing that they don’t know.


[i] Conceptualized by Donald Rumsfeld in February 2002.  Things we are neither aware of nor understand.

[ii] Conceived around 1880 by John Venn, according to Wikipedia.

Entropy and Consciousness, Part 2

September 25, 2020

This is a follow up to a previous article on consciousness and entropy:  https://birkdalecomputing.com/2020/05/03/entropy-and-consciousness/

We have entered the age of uber-prediction.  No, I don’t mean guessing when your hired ride will arrive, but an age when nearly everything is predicted.  Humans have, of course, always been predicting the outcomes of activities and events.  Predicting the future has been called an emergent behavior of intelligence. Our ancestors needed it to tell them the most likely places an alpha predator might be hiding, as well as which route the prey they were trying to catch would most likely take.

There is a natural feedback loop between predicted outcomes and expected outcomes. If a predicted outcome is perceived to be within a certain margin of error of an expected outcome, the prediction is said to have “worked,” and this positive assessment tends to reinforce the use of predictive behavior in the future.  In other words, it increases the occurrence of predictions and simultaneously increases the amount of data that can be used for future predictions.
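As a toy sketch of that feedback loop (the margin, step size, and data below are my own illustrative assumptions), a prediction that lands within the margin of error of the actual outcome counts as having “worked” and nudges up the propensity to predict again:

```python
def reinforce(prediction, outcome, propensity, margin=0.1, step=0.05):
    """If the prediction 'worked' (within `margin` of the outcome),
    nudge the propensity to predict upward; otherwise nudge it down."""
    if abs(prediction - outcome) <= margin * abs(outcome):
        return min(1.0, propensity + step)
    return max(0.0, propensity - step)

propensity = 0.5
history = [(100, 103), (100, 95), (100, 150)]  # (predicted, actual) pairs
for predicted, actual in history:
    propensity = reinforce(predicted, actual, propensity)

# Two predictions fell within 10% of the outcome, one did not.
print(round(propensity, 2))  # 0.55
```

Each success makes prediction more likely to be tried again, which is the virtuous cycle described below.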

In the past we did not have as much data, or as much computing power to process the data, as we have today. This acted as a constraint not only on the aspects of life that could be predicted (i.e. not enough data), but also on how quickly prediction worked with respect to the aspect of life being predicted (i.e. not enough processing power). Prediction now tends to “work” better than it ever did before because there is more data to use and faster ways to use it.  The success of prediction creates a virtuous cycle that reinforces the desire for more prediction.

The state of the world around us seems to be increasing in its predictability.  This leads me to believe that we must be pushing back more and more against entropy, which in Claude Shannon’s information theory is a measure of unpredictability, at its maximum when all outcomes are equally likely. Because the amount of ambient information is constantly increasing, you need less new information to predict an outcome. Making information work for you often requires discovering predictions that others have already made. Consequently, you need to ask fewer questions to obtain a workable prediction. The less entropic a system, the less new information it contains.

Information is measured in bits, the bit being a single unit of surprise. The more bits a system has, the more possible surprises it can hold and the more entropic it is. So it follows that the more information there is in a system, the more units of surprise it potentially has and the less likely it is to work as predicted.
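Shannon’s formula makes the bit-as-surprise idea concrete. As a minimal sketch (the probability distributions are illustrative), a fair coin is maximally unpredictable at one bit of surprise per flip, while a biased, more predictable coin carries less:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: all outcomes equally likely, so entropy is at its maximum.
fair = shannon_entropy([0.5, 0.5])      # 1.0 bit

# A biased coin is more predictable, so each flip carries less surprise.
biased = shannon_entropy([0.9, 0.1])    # about 0.47 bits

print(fair, biased)
```

The more lopsided the probabilities, the lower the entropy and the easier the prediction.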

In My Humble Opinion

June 14, 2020

I was just thinking about how so much of the present day protests remind me of the 1960’s, in spirit at least, if not strictly on issues.  Protests of the 1980’s and 2000’s era seemed, to me at least, to get watered down too easily with every well intended “people oriented” cause of the day.  IMHO, this is probably why, for the most part, they don’t appear to have been very successful (though they did keep the ball rolling, so to speak).  If the current BLM protesters can stay focused and disciplined they do have a decent chance of actually influencing the evolution of social change as much or more than their predecessors.  But they need to stay focused on a few issues and their leaders need to have a concrete, but flexible program for specific steps that can be taken to correct the social and economic inequity that has now surfaced for all to see.

Sooner or later in this process, people who are not themselves protesters will turn to the protesters, and the leaders who support them, and say “What changes do you want to see enacted?”  Not what meta-issues do you feel (or know) are unjust and need to be corrected, but what enforceable steps do you think need to, and can be, taken?  I hope there is a short list of answers to this question.  Because if there is not, then the initiative, and its spirit, will quickly dissipate and the motivation to take corrective action will simply compete with the other issues of the day.

It is also important to understand that whatever solutions percolate out of the protests they have to produce two results, both measurable in a defined time frame.  Number one is a reduction in the occurrence of undesirable events, for example, unjustified murders by those who are expected to maintain order.  And second, a skewing, or slight flattening, of the wealth distribution curve.  I say “slight” because I would hate to see a disincentive to wealth creation in general.

I wish the peaceful protesters well.

Entropy and Consciousness

May 3, 2020

https://futurism.com/new-study-links-human-consciousness-law-governs-universe

A curious finding, according to the study referred to in the article above, is that the human brain displayed higher entropy when fully conscious than when not. “Normal wakeful states are characterized by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values,” the team wrote in the study.  To me, this means that the higher the number of connections between brain cells at any point in time, the higher the level of entropy at that point.  One could extrapolate that the more wakeful one is, the more entropy there is in the brain and, by inference, that as one falls asleep the brain becomes less entropic.

This seems to go against the idea that learning, reasoning and awareness about the world is a type of “push back” against entropy.  Instances of life are generally seen as organized, metabolizing and replicating pieces of matter that eventually are overcome by entropy and die.  Almost like little islands of anti-entropy in a sea of entropic chaos.  How life can maintain this uphill struggle has always been a fascinating subject of study for biological scientists. Individual instances cannot keep up the struggle forever.  We eventually fall below a minimal level of energy production and consumption and die. This is probably the motivator for the evolution of replication and reproduction.[i]

We think of consciousness as a characteristic of life and thus of order, but this study seems to say the opposite: consciousness is a characteristic of chaos and disorder, and pieces of matter at various locations and periods of time, when they display local order, tend to have less entropy and less consciousness.  This seems to me to imply a type of “cosmic consciousness” associated with entropy, a concept that, in my experience, dates back at least to the hippie era of the 1960’s and 70’s, when expressions such as this were quite often externally stimulated.

Can life’s tendency to continue to live, that is to be less entropic, be a natural reaction against the cosmic consciousness which tends to disorganize our local order?  Can states like sleep, for example, be temporary reversals in the flow toward entropy while consciousness pushes forward our flow toward it?  Is it possible that cosmic consciousness is just the sum total of all local consciousnesses, and after one dies consciousness in the form of entropy, in a sense, lives on?

Another article in the same vein is this:  https://futurism.com/the-byte/mathematicians-think-universe-conscious

An earlier post by me about entropy and information loss can be found at:  https://birkdalecomputing.com/2019/02/08/information-entropy/

[i] A subject for another day.