Friday, September 06, 2024

The never-ending questions

The running theme of this blog, and perhaps of my entire life, has been reality, or what we think reality may be. What is reality? Is that even a valid question?

Author, literary agent, and 'thinker' John Brockman once proclaimed a lifelong obsession with asking questions, especially "the" question: "What is the last question?".

Brockman's first book on pondering questions was By The Late John Brockman, published in 1969. In his own words, it was "about the idea of interrogating reality". Most would respond by declaring this useless philosophy and thereby concluding it all meaningless. Yet each and every one of us plays a part in it, knowingly or not (as suggested in the movie "The Matrix"). Most of us just aren't aware of it.

"Reality as a whole is unmeasurable except through effect. The unity is in the methodology, in the writing, reading, in the navigation. This system cannot provide us with ultimate answers, nor does it present the ultimate questions. There are none."*

This also precipitated his vision of the 'Third culture' consisting of "those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are." 

In 1988 Brockman and others created The Edge Foundation, an association of science and technology intellectuals. It was an outgrowth of The Reality Club, a hosted gathering (historically called a 'salon') of intellectuals based in NYC. From 1981 through 1996, many well-known scientists, authors, artists, technologists, and entrepreneurs met for presentations, round-table discussions, seminars, and the like. The club retired to a virtual presence on The Edge website in 1996. (A link has been on the sidebar since I created this blog in 2005.)

Although I followed the website frequently after I first discovered The Edge (~1998), life became complicated and I lost track of it. An academic career can sometimes consume one's entire existence, especially in LAC (Life After Children). I was busy "interrogating" small subsets of reality.

It was with sadness (on my part) that Brockman ran out of questions and announced, in 2018, the finale of The Edge project with the question, "WHAT IS THE LAST QUESTION?".

"Ask 'The Last Question,' your last question, the question for which you will be remembered."

Possibly the best tribute to The Edge was an 'in memoriam' of sorts by lecturer and writer Kenan Malik in The Guardian. The excerpt below gave me some private satisfaction, in that I have always relayed something similar to students, children, colleagues, lab members, friends, relatives, and, many times, readers here on this blog: ask questions (despite the frustration of many).

"Asking questions is relatively easy. Asking good questions is surprisingly difficult. A bad question searches for an answer that confirms what we already know. A good question helps to reset our intellectual horizons. It has an answer that we can reach, yet unsettles what we already know."

One response on the webpage, which may not really qualify as an answer for some, is one of my favorites. As we often say in the sciences, the answers to a question may only be more questions.

"The final elegance: assuming, asking the question. No answers. No explanations. Why do you demand explanations? If they are given, you will once more be facing a terminus. They cannot get you any further than you are at present. The solution: not an explanation: a description and knowing how to consider it."**
Many people are unsettled by that, but it is the reality. Perhaps it is a very simple single question with no explanation. Just like reality.

________________________________________________________

* John Brockman, By The Late John Brockman, Macmillan, 1969. The Kindle edition can be accessed on Amazon.com. Hard copies are >$70.

** Ludwig Wittgenstein, Zettel, eds. G. E. M. Anscombe and G. H. von Wright, trans. G. E. M. Anscombe.

Wednesday, September 04, 2024

Genetics is like a musical score

Beethoven

Wolfgang Amadeus Mozart and Ludwig van Beethoven may be the most genetically studied musicians of all time. Why is that? Because their music is famous and loved? Because their personal histories are full of celebrity and drama? Or are they just favorites of geneticists?

The narrative in most of the published studies is to understand the interaction between musical sounds and humans. I'm still not sure this can be completely solved, because music is more than just sound. It is a musical 'language', with or without words, that interacts with the human mind and body. However, because we are a curious species, we seek to learn about those connections. And, as I suspect, the two famous composers provide a growing foundation of research on which to build.

A recent study* explores whether genetic factors can determine extraordinary musical achievement. If so, how do genes contribute to, or determine, a person's musicality? This isn't a new query; geneticists have examined similar questions by studying the two famous composers for decades. However, recent advancements in molecular genetics allow scientists to probe deeper into human DNA, sometimes restudying old questions or asking new ones, especially of long-dead people.

Then again, when studying humans, sometimes these newer studies only confirm older results. 

“An analysis of the famous composer's genetic make-up has revealed that DNA data has so far been too imprecise in capturing a person's abilities.” 

In this recent study, an international team of researchers analyzed Beethoven’s DNA to investigate if and how any differences in his genes may account for his celebrated musical exceptionalism. 

The deeper question is, how much can genes impact human traits, especially behavior? When considering a bird or a lizard, probably quite a bit. But humans are “messy.” There is no single quantitative or qualitative line that divides genetically determined and learned human behavior. This is the age-old “nature versus nurture” dilemma. The lines are fuzzy.

Ludwig van Beethoven was born in Bonn (a major city in Germany), which was at that time the capital of the Electorate of Cologne, ruled by its prince-archbishops. He moved to Vienna, Austria, in 1792 to flee a dysfunctional family and to meet other musicians.

During this time in history, the French Revolution (1789) and Napoleon's campaigns restructured France and the regions to the north, including Bonn (1794) and Vienna (1805). Beethoven supported Napoleon's reforms and dedicated his famous Third Symphony to him, titling it "Bonaparte". After Napoleon proclaimed himself Emperor (1804), Beethoven rescinded the dedication and renamed the symphony "Eroica". He even refused to play it in front of French soldiers.

Beethoven lived during a tumultuous era of wars and conflicts with rulers. It was the rise of the German Enlightenment and the transition from the Classical to the Romantic era in art and music, amid almost constant family turmoil and his progressive loss of hearing. He was a man full of emotion, conviction, and righteousness. As his music conveys, he was a man with passion. Are there genes for that?

The researchers analyzed DNA sequences available from an earlier study (2023) in which the composer's DNA was extracted from preserved strands of his hair. The authors then developed a 'polygenic score', a number that summarizes the estimated effect of many genetic variants on an individual's trait or behavior.

"Our aim was to use this polygenic score as an example of the challenges of making genetic predictions for an individual that lived over 200 years ago.”

They chose a specific component of musicality for which a score existed: 'beat synchronization ability'. Beat synchronization is the degree to which an individual can synchronize their movements in time with a musical beat. In humans it commonly falls within 120 to 140 beats/minute, a range frequently used in music composition. Interestingly, beat synchronization was long thought to be uncommon in non-human species, and the mechanisms determining the optimal tempo are unclear.
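For reference (my arithmetic, not a figure from the studies), that tempo range corresponds to a beat frequency of roughly:

```latex
% 120--140 beats per minute expressed in beats per second (Hz)
\[
\frac{120~\text{beats}}{60~\text{s}} = 2.0~\text{Hz}
\qquad\text{to}\qquad
\frac{140~\text{beats}}{60~\text{s}} \approx 2.33~\text{Hz}
\]
```

That is, one beat every 430 to 500 milliseconds.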

Although this was thought to be a uniquely human rhythm trait, a study in rats (2022) revealed head movements and neural activity within the same tempo range as in humans. This suggests that "the optimal tempo for beat synchronization is determined by the time constant of neural dynamics conserved across species".

And Beethoven?

"The study found that Beethoven had an unremarkable polygenic score for general musicality compared to population samples from the Karolinska Institute, Sweden, and Vanderbilt University, USA. However, considering the limitations of the current polygenic scores and the fact that a genetic indicator for ‘beat synchronization ability’ may not directly tap into Beethoven’s composer skills (musical creativity), this finding is not unexpected.”

The genetic architecture of this trait is highly polygenic, meaning that it is influenced by many genes across the human genome. The authors identified 69 separate locations on the genome at which different genetic alleles (every person carries two copies of each gene; the variant forms are called 'alleles') in the population account for some of the variability in how accurately people synchronize to a musical beat.

Genes associated with beat synchronization are more likely to be genes involved in central nervous system function, including genes expressed in brain tissue and genes involved in early brain development. Recent studies also found that beat synchronization shares some of its genetic architecture with other traits, including several that are involved in biological rhythms (walking, breathing, and circadian rhythm). 

A polygenic score sums the genetic effects associated with beat synchronization in each individual, but it is only a rough estimate. It can tell us only an individual's likelihood of specific levels of beat synchronization relative to the population-based model; it does not correspond directly to that person's actual beat synchronization accuracy. Thus a person's beat synchronization may be one point among many in a wide area under the curve. Beethoven's score may have been lower than expected, but did that negatively impact his compositions?
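To make the arithmetic concrete, here is a minimal sketch in Python of how such a score is assembled as a weighted sum over genetic variants. The effect sizes and genotypes below are invented for illustration; nothing here comes from the study's actual data.

```python
# Minimal illustration of a polygenic score (PGS): a weighted sum of
# allele counts. All numbers below are hypothetical, NOT study data.

effect_sizes = [0.12, -0.05, 0.08, 0.02]  # hypothetical per-variant GWAS weights (betas)
allele_counts = [2, 1, 0, 2]              # copies (0, 1, or 2) of each effect allele carried

# The PGS is the dot product of effect sizes and allele counts.
pgs = sum(beta * count for beta, count in zip(effect_sizes, allele_counts))
print(f"Illustrative polygenic score: {pgs:.2f}")  # -> 0.23
```

A real score sums over thousands of variants and is then compared against the distribution of scores in a reference population, which is why it yields a relative likelihood rather than a prediction of any one person's actual skill.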

“[That] Beethoven had a rather low genetic predisposition for beat synchronization highlights the limitations of polygenic score predictions at the individual level. While polygenic score prediction is expected to get more accurate in the future, it is important to remember that complex human traits, including musical skills, are not determined solely by genes or the environment but rather shaped by their complex interplay.”

In conclusion, the authors stated that the current study "only shows that we’ve been able to use genetics to explain a portion of the variability in beat synchronization skills (again, at the level of pooled data in a large study sample)."

When scientists talk about “heritability” they are referring to the amount of phenotypic variance explained by genetic variation. This does not mean that rhythm is only “genetic” versus only “environmental,” or that rhythm is genetic in certain people but not others.
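In symbols, this is the standard textbook definition (my addition, not a formula from the article):

```latex
% Broad-sense heritability H^2: proportion of phenotypic variance (V_P)
% explained by total genetic variance (V_G); narrow-sense heritability
% h^2 restricts the numerator to additive genetic variance (V_A).
\[
H^2 = \frac{V_G}{V_P}, \qquad h^2 = \frac{V_A}{V_P}
\]
```

A heritability of, say, 0.5 means that half of the trait's variation across a population is statistically attributable to genetic differences; it says nothing about how "genetic" the trait is in any single person.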

"Scientifically we really can’t say for sure how and why an individual reaches (or does not reach) a certain level of musicality. So it’s not “either-or” but “both-and” genes and environment, and the incredibly complex biological interrelationships that occur during human development of musicality will take many, many more years of work to unravel!"

Studies of beat synchronization in humans and other species, such as in rats, found interesting genetic correlations between beat synchronization and a cluster of interrelated traits: walking pace, musculoskeletal strength, breathing function, and cognitive processing speed. Possibly even cadence in language! Additionally, the shared genetic architecture has implications for physical and cognitive function in neurodiverse people and during aging.

* "Was Beethoven unmusical?", Max-Planck-Gesellschaft Research News, published on website April 10, 2024 and accessed 20/08/2024.

Sunday, September 01, 2024

When 'thing' was not a thing

In our daily conversation we use many words without giving a millisecond of thought to where they arose from, or, sometimes, to what they really mean. One of those words is ‘thing’. We may utter it dozens of times in a day, and it may have as many dozens of meanings. That word, regardless of intention, may be the most ubiquitous and flexible in our language. It even fills holes in a conversation when the speaker can’t think of the correct word to use.

But what does ‘thing’ really mean? Where and when did it originate? We will hitch a ride on the ‘thing’ through history and find some answers.

Proto-Indo-European Languages (late Stone Age to early Bronze Age)

The origins of the word ‘thing’ are rooted in the Indo-European family of languages. Most modern languages of Europe, North America, and northern India evolved from branches of these ancient languages. Some survive today: English, French, Albanian, Portuguese, Russian, Dutch, Italian, and Spanish, along with the broader Germanic, Celtic, and Slavic branches. Just like living organisms, languages evolve, too. Likewise, they follow people as they migrate, colonize, and trade. Eventually, some may share meanings and even sounds, as we will shortly discover.

All the languages mentioned above descended from a single prehistoric language: Proto-Indo-European. It was spoken sometime during the late Stone Age and early Bronze Age (~3300-1200 BC). Because this was before written history, its geographical area of origin is controversial.

The favored hypothesis is the region of the Pontic-Caspian steppes (modern Ukraine and southern Russia). It is also associated with the famous Yamnaya peoples, nomads credited with developing wheeled carts and domesticating horses. Bands of Yamnaya spread south, east, and west, conquering and assimilating cultures as they went. Most Europeans, and most people of European descent, carry a percentage of Yamnaya genetic haplotypes in their own genomes. Of course, the Yamnaya took their language with them.

Indo-European Languages (Mid-Bronze Age into the Middle Ages)

By the time written history appeared throughout Europe, many Indo-European languages had evolved from their spoken prehistoric prototype. This was also a period (4000-1000 BC) of great migrations throughout Eurasia and Atlantic Europe. Hittite (of Anatolia, in modern Turkey) is considered the earliest attested Indo-European language, but it is now extinct. Speakers of Indo-European languages mixed with speakers of other languages during these migrations; the languages overlapped yet retained some similarities throughout.

As agrarian (farming) cultures evolved and more people stayed in one location, especially during the Classical period (800 BC-500 AD) and the Middle Ages (500-1500 AD), branches of the Indo-European languages became more standardized within those cultures, though each still had more than one dialect. Remember that in those times only the elite could read and write; most of the population was illiterate, and information was shared orally. Consequently, each branch of the Indo-European languages had, and still has, varieties (dialects) of speech.

Each of these languages had a developmental stage, similar to Proto-Indo-European. For example, Proto-Norse was spoken in Scandinavia and is thought to have evolved from a northern dialect of Proto-Germanic between the first century BC and 100 AD. It evolved into Old Norse and its several dialects by the beginning of the Viking Age (~800 AD). These later separated and became the modern North Germanic languages. Old Norse is also believed to be the first North Germanic language spoken. Keep this in mind as we progress through time.

This thing and that thing

Before we dive into the word ‘thing’, it is important to follow the evolution and separation of the North Germanic languages. As mentioned, Old Norse was spoken in what we now know as Scandinavia. At the beginning of the Viking Age the dialects of Old Norse diverged into the modern North Germanic languages, also known as the Nordic languages: Faroese, Icelandic, Swedish, Norwegian, and Danish. And their dialects.

In early Germanic societies a ‘thing’ was a governing assembly made up of free people of the community and presided over by a ‘lawspeaker’. They provided legislative functions, as well as being social events and opportunities for trade.

The earliest trace of the word ‘thing’ is purported to be among the Germanic peoples of northwest Europe and Scandinavia during ancient history and into the early Middle Ages (~476-900 AD). The first detailed description (98 AD) of a ‘thing’ was made by the Roman historian and politician Tacitus. He suggested that ‘things’ were annual delegate-based meetings for some early Germanic tribes and served legal and military functions. ‘Thing’ was used in Norway before the country’s first Viking king, Harald Finehair, who ruled 872-930 AD. It is here that the word and its original meaning became famous.

In the Viking Age, ‘things’ were public assemblies of the free men of a place.* They functioned as both parliaments and courts at different levels of society: local, regional, and transregional. Their purpose was to solve disputes, establish laws, and make political decisions. Earlier local ‘altings’ were common assemblies in which all free farmers had the right to participate.

An ancient site of a 'thing' in Sweden

‘Thing’ sites were also often the place for public religious rites. Norway’s ‘things’ provided the institutional and legal framework for subsequent legislative and judicial bodies, even in the modern Western world, and remain today as superior regional courts.

‘Things’ took place at regular intervals, usually at prominent places that were accessible by travel. The place where a 'thing' was held was called a "thingstead" (Old English þingstede, Old Norse þingstaðr) or "thingstow" (Old English þingstōw). The Vikings and early Norse settlers often brought their culture with them to new locations abroad, including their legal systems and establishment of ‘things’. The collaborative Thing Project has discovered and documented historical locations of ‘things’ in Norway, Iceland, Sweden, the Faroe Islands, Scotland, and Isle of Man. 

Was ‘thing’ always a thing?

Well, no and yes. Depending on where and when, the original word evolved over time from one language to another, from one definition to another.

In Old Norse, Old English, and modern Icelandic, ‘thing’ was written ‘þing’, where ‘þ’ is pronounced as an unvoiced "-th". In Middle English, Old Saxon, Old Dutch, and Old Frisian, it evolved to ‘thing’; the difference between ‘þing’ and 'thing' is mostly spelling. ‘Thing’ is pronounced as ‘ding’ in German and Dutch, and ‘ting’ in modern Norwegian, Danish, and Swedish.

In Old English, the word ‘thing’ shifted to mean an entity or matter (sometime before 899 AD), and then also an act, deed, or event (after 1000 AD). The original definition of ‘meeting’ or ‘assembly’ did not survive the shift to Middle English. In modern usage, the ancient meaning of ‘thing’ in English and other languages has been displaced: it no longer denotes an assemblage of some sort but simply an object of any sort.

The sense of personal possessions, commonly in the plural, first appears in Middle English around 1300 and eventually led to the modern sense of "object". This semantic development from "assembly" to "object" is mirrored in the evolution of many words throughout history. Yet the most ubiquitous word, at least in English, remains ‘thing’.

And it can mean……. any thing.


* According to Norway's Law of the Gulathing (the historic ‘thing’ at Gula), only free men of full age could participate in the assembly. In written sources, women were present at some ‘things’ despite being left out of the decision-making bodies, such as the Icelandic Althing. Women elders, however, were often consulted in decision making.


Evolution of things

While reading the abstract of a paper* introducing a theory on the complexity of evolutionary search and the effectiveness of natural selection as a learning process, I grinned at the authors' courage in using this terminology.

"A variational formulation of natural selection is introduced, paying special attention to the nature of ‘things’ and the way that different ‘kinds’ of ’ things’ are individuated from—and influence—each other."


Context here: things = individual parts and the assembly of these parts
___________________

This may be the introduction to my recent blog post on 'thing'. What's a 'thing'?

Monday, July 01, 2024

Borders and grids are Us

With a long interest in how we as humans perceive our environment, I found this quote to be quite apt:

"....the imposition of the gridded survey defines the lands as a uniform and monotonous mathematical space turns the lands into a trackless expanse viewed from above."

In many ways we see land as two-dimensional and composed of geometric shapes. Look at any map, or look down from a plane, and you will see this. In a recent issue of Nature, NAS scientist John Holmes reviews a new book, "Liberty's Grid", by historian of science and mathematics Amir Alexander.

Alexander relates how the "rectilinear grid was imposed and how it has fed into the US consciousness."

"In Alexander’s telling, the grid is grand, ambitious and uniquely American. It is not only a blank, boundless canvas, but also a causal factor of different aspects of the United States’s trajectory and character."

An interesting book, but as Holmes rightly criticizes, Alexander often confuses cause and effect. Alexander tends to blame many events in American history on the gridded map while ignoring other elements at play in American culture.

"Fly over the United States or walk its city streets and you can’t help but notice the country’s seemingly endless patchwork of rectangular blocks of land. The origins of this ‘grid’ lie in the eighteenth and nineteenth centuries, when the early US leaders sent surveyors out to carve up vast tracts of land acquired under treaties and to expand settlement westwards to the Pacific coast."
In other words, dividing up this country was a rushed job without real thought to the landforms. Not that they mattered anyway, except as resources to exploit. Ask the Native Americans about that.

Good points here:

"“Written not on parchment but into the mountains, valleys, and plains of North America, the Great American Grid embodies an ideal of America as a land of unconstrained freedom and infinite opportunity,” Alexander writes. His central thesis is that, although dividing up much of the United States into a geometric grid might seem like a convenient solution to a difficult problem, when viewed from a historian’s perspective it becomes an expression of American exceptionalism and a means to fulfil the idea of the country as an “empire of liberty”."
It is with Holmes's final criticism that I agree, simply because Alexander either ignores or circumvents the psychology and sociology of creating borders, visible and invisible. From a basic philosophical perspective, gridding is the creating and outlining of territories, just as wild cats and canids do by urinating along the edges of theirs. And those borders are dynamic, not static. They have been a main foundation of human war since the dawn of civilization, except among traditionally nomadic peoples, where land borders are blurred or nonexistent.

We could hypothesize that borders are a human construct, but other species prove otherwise. The real mystery is how we add layers of human culture and society onto and into those human-drawn lines. Or perhaps that is our default subconscious way of looking at reality. What is the traditional shape of most structures worldwide? Walls with straight lines and 90-degree corners.



Monday, June 24, 2024

Our Demons

Everyone has demons. The worst demon of all is refusing to acknowledge your demons. Conquering and ridding yourself of your demons is not the answer; they never go away. They are part of you no matter how much you try to deny them.

But we each have an angel, too. The best angel is knowing how to live with and control the demons. Consider the famous quote by Sun Tzu, the Chinese warrior, strategist, philosopher, and teacher, from his book "The Art of War":

“If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle.”

 Or, as Irish singer/songwriter Hozier eloquently sings:

"All you have is your 'fire'
And the place you need to reach
Don't you ever tame your demons
But always keep them on a leash…"

 


Friday, March 15, 2024

AI R Us

This should not be surprising.* AI models are a mirror of human behavior. So shit goes in, and shit comes out.

AI models are trained to learn, consolidate, collate, and regurgitate. We don't like what we get, so we change it afterwards. But that doesn't change the mirror effect. It's a Band-Aid approach: Cover the abscess so we don't have to see it. Or, as Nikhil Garg, a computer scientist, eloquently puts it, “simply paper over the rot”.

This brings to mind a sci-fi series I'm watching on TV, "Beacon 23". AI, a common component of most sci-fi, is represented by two important characters in the series, who are almost complete opposites: "Bart" and "Harmony". The latter is a logical Spock attached to an individual human, created by a major company that controls almost everything.

Bart is an AI developed hundreds of years prior to the primary timeframe of the series; he serves the structure, Beacon 23, in space (a 'lighthouse' stationed near a flux of dark matter; the Beacons guide spacecraft away from detected nearby dark matter 'clusters').

Bart learns and adopts human nature and behavior from all the beacon "keepers" and those who visit. He also controls all the communication and mechanics of the Beacon station. As such, Bart is just like a human, with all the messiness, assumptions, errors, etc.: he lies, whines, plots, complains, quotes Shakespeare, and often acts like a child. Yet Bart also significantly sets the course for what occurs on the Beacon. Harmony, by contrast, is logical and attuned only to her individual human, but is also capable of controlling the Beacon, including Bart (she scolds Bart many times).

Humans created AI, and in its current state in our world it is like a young Bart. That we can't see this is human blindness. AI is not, and won't be, our savior. We can't even save ourselves.

"Even though human feedback seems to be able to effectively steer the model away from overt stereotypes, the fact that the base model was trained on Internet data that includes highly racist text means that models will continue to exhibit such patterns."

* "Chatbot AI makes racist judgements on the basis of dialect," Elizabeth Gibney. Nature, March 13, 2024.

Monday, March 11, 2024

Which Reality is Real?

What is 'reality'? Ask 10 people that question and you'll likely hear 10 different answers, including "I don't know." The question may be as old as human consciousness. However, it may not be a realized 'thing' [1] that all humans ponder. 

A quick summary in the online encyclopedia Wikipedia presents reality as...

"Reality is the sum or aggregate of all that is real or existent within the universe, as opposed to that which is only imaginary, nonexistent or nonactual. The term is also used to refer to the ontological status of things, indicating their existence. In physical terms, reality is the totality of a system, known and unknown."

Is reality a thing? Or is it a way of looking at 'things'?

Reality may be both. No one may have understood this more than physicist Erwin Schrödinger, famously known for his 1935 thought experiment in quantum mechanics, Schrödinger's cat. It was a reductio ad absurdum (a reductive argument to absurdity) intended to question the then-proposed behavior of atoms and larger manifestations as being one state or the other, as in "dead or alive," depending on the observer. Or, simply put, it suggests that reality is relative to the organism that observes or experiences it in one way or another.

Recalling an old platitude: if a tree falls in the forest and no one is there to see or hear it, did it really fall?

 Yet, Schrödinger's paradox legitimately questioned, 

"When does a quantum system stop existing as a superposition of states and become one or the other?" (More technically, when does the actual quantum state stop being a non-trivial linear combination of states, each of which resembles different classical states, and instead begin to have a unique classical description?"[2]

More simply put, 

"Our intuition says that no observer can be in more than one state simultaneously—yet the cat, it seems from the thought experiment, can be in such a condition. Is the cat required to be an observer, or does its existence in a single well-defined classical state require another external observer?" [2]

This may bring to mind Einstein, of "Theory of Relativity" fame, who engaged deeply with the interpretation of quantum mechanics. Indeed, Einstein considered each alternative absurd. He wrote to Schrödinger,

"You are the only contemporary physicist, besides Laue, who sees that one cannot get around the assumption of reality, if only one is honest. Most of them simply do not see what sort of risky game they are playing with reality—reality as something independent of what is experimentally established. Their interpretation is, however, refuted most elegantly by your system of radioactive atom + amplifier + charge of gun powder + cat in a box, in which the psi-function of the system contains both the cat alive and blown to bits. Nobody really doubts that the presence or absence of the cat is something independent of the act of observation."[3]

Many interpretations, both technical and popularized, provide explanations and answers to Schrödinger's paradox. However, the quantum world is full of counterintuitive ideas, as Schrödinger's thought experiment strongly implied. Several physicists, contemporaries of Schrödinger as well as those who came after him, proposed their own perspectives.

Of note is American physicist Hugh Everett, who proposed (in his 1957 PhD thesis) what is now known as the many-worlds interpretation of quantum mechanics. I'm sure most readers here are familiar with the popular 'multiverse' of the science fiction genre and even of modern physics. Everett's formulation of quantum mechanics does not single out observation as a special process. His many-worlds interpretation of Schrödinger's paradox explains that,

"...both alive and dead states of the cat persist after the box is opened, but are decoherent[4] from each other. In other words, when the box is opened, the observer and the possibly-dead cat split into an observer looking at a box with a dead cat and an observer looking at a box with a live cat. But since the dead and alive states are decoherent, there is no effective communication or interaction between them. 
When opening the box, the observer becomes entangled with the cat, so "observer states" corresponding to the cat's being alive and dead are formed; each observer state is entangled, or linked, with the cat so that the observation of the cat's state and the cat's state correspond with each other. Quantum decoherence ensures that the different outcomes have no interaction with each other. The same mechanism of quantum decoherence is also important for the interpretation in terms of consistent histories. Only the "dead cat" or the "live cat" can be a part of a consistent history in this interpretation. Decoherence is generally considered to prevent simultaneous observation of multiple states."[5]
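In the same notation (again my addition, a textbook-style sketch rather than Everett's own formula), opening the box entangles the observer with the cat:

```latex
% Entangled observer-cat state after the box is opened; decoherence
% suppresses interference between the branches, so each "world"
% registers only one definite outcome.
\[
|\Psi\rangle = \tfrac{1}{\sqrt{2}} \left( |\text{alive}\rangle\,|\text{sees alive}\rangle
             + |\text{dead}\rangle\,|\text{sees dead}\rangle \right)
\]
```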

Quantum mechanics is often used in contextual explanations of reality. One could possibly, and loosely, refer to Everett's hypothesis as 'alternate' realities. Is this a real 'thing'? Or just another thought experiment or interpretation of reality?

This all raises the question: is there just one reality? If so, then what is it? After all, one 'real' reality could exist for non-living things (we know they exist, but the non-living have no consciousness), and another for living things (because we have consciousness and are aware). This suggests that an infinite number of personal realities may exist. A shared reality may then be the overlap of personal realities, like many flexing four-dimensional Venn diagrams, constantly shifting temporally and spatially.

As mentioned earlier, reality is the totality of a system with known and unknown existences.

Or perhaps reality is like the smile of the Cheshire cat: it remains even when the cat becomes invisible.
_______________________________________________

[1] I'm compelled to explain my use of 'thing' in its most recent etymological context: used colloquially since about 1600 AD as a substitute for a word whose proper name the speaker cannot call to mind. However, this ubiquitous word 'thing' has an interesting history reaching back to the Vikings.
[2] "Schrödinger's cat," Wikipedia
[3] Letter to Schrödinger in 1950.
[4] In quantum physics, decoherence is the process in which a system's behavior changes from that which can be explained by quantum mechanics to that which can be explained by classical mechanics.
[5] "Hugh Everett III," Wikipedia. As an aside, both Hugh and his son, Mark Oliver Everett, are thought by many to have Asperger's (to be on the autism spectrum).