Sunday, October 20, 2024

Religion in horror and America

Came upon this essay accidentally this morning. The irony is that this (growing up in religious schools) was a topic in three conversations last week* (different people who lived it; one called it the "school of hypocrisy"). While I can't really contribute to the conversation because I didn't have that experience, I've known many who did live it. Including my father.

As author C. J. Leede comments, the roots run deep, going back perhaps a thousand years. My limited experience in Europe, though, suggests that its influence may no longer be as strong in many places there. A French colleague once mentioned that many hundreds of years of religious wars there make people weary and wary of religious repression and control (I have heard similar things about Asia). Obviously, some countries are still willing to kill and maim in the name of their gods, such as many places in the Middle East. And treat women like property and dogs. I find this tantamount to extreme hypocrisy and an excuse for patriarchal control.

Regardless, the bad vapors of religion manifest in various ways, many times permeating and fusing with capitalism and, as the US currently exemplifies, politics. Some disguise greed as religious piety. As I mentioned in our conversation on Friday, the demon in the box pulling the strings is power. Greed and power seem to be twins; I can't delineate where one begins and the other ends. They are joined at the hip. Or, perhaps, the brain.

"When I ask myself why I wrote 'American Rapture,' why I read religious horror or watch it on the screen, when I really stop and think, what do I wish more than anything I had known when I was a young girl trying to step into herself?

It’s this: Repression—religious or otherwise—is the horror. Ignorance is what we should fear. What makes us all ill-equipped for moving through life as humans in natural human bodies.

And maybe little girls aren’t born into more sin than little boys. Maybe sin is just an idea we’ve created for control. A powerful little beast that preys on all of us every day, in and outside of fiction. And maybe it’s one we just don’t need to feed anymore. - C. J. Leede"  

* The people with whom I had these conversations were unconnected to one another, nor did I initiate the topic. This suggests people may actually be thinking about such subjects. I interpret that as a good sign. Maybe even hope?


Friday, September 06, 2024

The never-ending questions

The running theme in this blog, perhaps in my entire life, has been reality, or what we think may be reality. What is reality? Is that a valid question?

As author, literary agent, and 'thinker' John Brockman once proclaimed, he has had a lifelong obsession with asking questions, especially "the" question: "What is the last question?"

Brockman's first book on pondering questions was By The Late John Brockman, published in 1969. In his own words, it was "about the idea of interrogating reality". Most would respond by declaring this useless philosophy, thereby concluding it all meaningless. Yet each and every one of us plays a part in it, whether we question it or not (as suggested in the movie "The Matrix"). Most of us just aren't aware of it.

"Reality as a whole is unmeasurable except through effect. The unity is in the methodology, in the writing, reading, in the navigation. This system cannot provide us with ultimate answers, nor does it present the ultimate questions. There are none."*

This also precipitated his vision of the 'Third culture' consisting of "those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are." 

In 1988 Brockman and others created The Edge Foundation, an association of science and technology intellectuals. It was also an outgrowth of The Reality Club, a gathering (historically called a 'salon') of intellectuals based in NYC and held by a host. From 1981 through 1996 many well-known scientists, authors, artists, technologists, and entrepreneurs met for presentations, round-table discussions, seminars, etc. It retired to a virtual presence on The Edge website in 1996. (A link has been on the sidebar since I created this blog in 2005.)

Though I followed the website frequently after I first discovered The Edge (~1998), life became complicated and I lost track of it. An academic career can sometimes consume one's entire existence, especially in LAC (Life After Children). I was busy "interrogating" small subsets of reality.

It was with sadness (on my part) that I learned Brockman had run out of questions and announced in 2018 the finale of The Edge project with the question, "WHAT IS THE LAST QUESTION?"

"Ask 'The Last Question,' your last question, the question for which you will be remembered."

Possibly the best tribute to The Edge was an 'in memoriam' of sorts by lecturer and writer Kenan Malik in The Guardian. The excerpt below gave me some private satisfaction, in that I have always given the same advice to students, children, colleagues, lab members, friends, relatives, and, many times, readers here on this blog: ask questions (despite the frustrations of many).

"Asking questions is relatively easy. Asking good questions is surprisingly difficult. A bad question searches for an answer that confirms what we already know. A good question helps to reset our intellectual horizons. It has an answer that we can reach, yet unsettles what we already know."

One response on the webpage, which may not really qualify as an answer for some, is one of my favorites. As we often say in the sciences, the answers to a question may only be more questions.

"The final elegance: assuming, asking the question. No answers. No explanations. Why do you demand explanations? If they are given, you will once more be facing a terminus. They cannot get you any further than you are at present. The solution: not an explanation: a description and knowing how to consider it."**
Many people are unsettled by that, but it is the reality. Perhaps it is a very simple single question with no explanation. Just like reality.

________________________________________________________

 * John Brockman, By The Late John Brockman, 1969, Macmillan. The Kindle format can be accessed on Amazon.com. Hard copies are >$70.

** Ludwig Wittgenstein, Zettel, eds. G. E. M. Anscombe and G. H. von Wright, trans. G. E. M. Anscombe.

Wednesday, September 04, 2024

Genetics is like a musical score

Beethoven

Wolfgang Amadeus Mozart and Ludwig van Beethoven may be the most genetically studied musicians of all time. Why is that? Because their music is famous and loved? Because their personal history is full of celebrity and drama? Or are they just favorites of geneticists?

The narrative in most of the published studies is to understand the interaction between musical sounds and humans. I'm still not sure if this can be completely solved because music is more than just sound. It is a musical 'language', with or without words, that interacts with the human mind and body. However, because we are a curious species, we seek to learn about those connections. And, as I suspect, the two famous composers offer a growing foundation of research on which to build.

A recent study* explores whether genetic factors can determine extraordinary musical achievements. If so, then how do genes contribute to or determine a person's musicality? This isn't a new query; geneticists have examined similar questions by studying the two famous composers for decades. However, recent advancements in molecular genetics allow scientists to probe deeper into human DNA, sometimes restudying old questions or asking new ones, especially of long-dead people.

Then again, when studying humans, sometimes these newer studies only confirm older results. 

“An analysis of the famous composer's genetic make-up has revealed that DNA data has so far been too imprecise in capturing a person's abilities.” 

In this recent study, an international team of researchers analyzed Beethoven’s DNA to investigate if and how any differences in his genes may account for his celebrated musical exceptionalism. 

The deeper question is, how much can genes impact human traits, especially behavior? When considering a bird or a lizard, probably quite a bit. But humans are “messy.” There is no single quantitative or qualitative line that divides genetically determined and learned human behavior. This is the age-old “nature versus nurture” dilemma. The lines are fuzzy.

Ludwig van Beethoven was born in Bonn (a major city in Germany), which was at that time the capital of the Electorate of Cologne, ruled by Roman Catholic archbishops. He moved to Vienna, Austria, in 1792, to flee a dysfunctional family and to meet other musicians.

During this time in history, in the wake of the famous French Revolution (1789), French armies restructured the regions to the north and east, occupying Bonn (1794) and, under Napoleon, Vienna (1805). Beethoven supported Napoleon's reforms and dedicated his famous third symphony to him, initially naming it "Bonaparte". After Napoleon proclaimed himself Emperor (1804), Beethoven rescinded the dedication and renamed the symphony "Eroica". He even refused to perform it in front of French soldiers.

Beethoven lived during a tumultuous era of wars and conflicts with rulers: the rise of the German Enlightenment, the transition from the Classical to the Romantic era in art and music, almost constant family turmoil, and his progressive loss of hearing. He was a man full of emotion, conviction, and righteousness. As his music conveys, he was a man with passion. Are there genes for that?

The researchers analyzed DNA sequences available from an earlier study (2023) in which DNA was extracted from strands of Beethoven's hair. The authors then developed a 'polygenic score', a number that summarizes the estimated effect of many genetic variants on an individual's trait or behavior.

"Our aim was to use this polygenic score as an example of the challenges of making genetic predictions for an individual that lived over 200 years ago.”

They chose a specific, scoreable component of musicality: 'beat synchronization ability'. Beat synchronization is the degree to which an individual can synchronize their movements in time with a musical beat. In humans, the preferred tempo commonly falls within 120 to 140 beats/minute, a range frequently used in music composition. Ironically, beat synchronization was thought to be uncommon in non-human species, and the mechanism determining the optimal tempo remains unclear.
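
Concretely, beat synchronization is usually measured by how far a person's taps fall from the metronome's onsets. Here is a minimal sketch of that measurement in Python (my own illustration of the general idea, not the protocol used in the study; the tap times are invented):

```python
# Quantifying beat synchronization as the mean absolute asynchrony
# between a person's taps and a metronome's beat onsets.

def beat_times(bpm: float, n_beats: int) -> list[float]:
    """Metronome onsets in seconds for a given tempo."""
    interval = 60.0 / bpm              # e.g., 120 bpm -> 0.5 s between beats
    return [i * interval for i in range(n_beats)]

def mean_asynchrony(taps: list[float], beats: list[float]) -> float:
    """Average absolute offset (seconds) of each tap from its nearest beat."""
    return sum(min(abs(t - b) for b in beats) for t in taps) / len(taps)

# Invented example: a listener tapping along at 120 bpm, slightly off each beat.
beats = beat_times(120, 8)             # 0.0, 0.5, 1.0, ... 3.5
taps = [0.02, 0.48, 1.05, 1.49, 2.01, 2.53, 2.97, 3.51]
print(f"mean asynchrony: {mean_asynchrony(taps, beats) * 1000:.0f} ms")
```

The smaller the mean asynchrony, the better the synchronization; studies score accuracy along these lines before relating it to genetic data.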

Although this was thought to be a uniquely human rhythm trait, a study in rats (2022) revealed beat-aligned head movements and neural activity within the same tempo range as in humans. This suggests that "the optimal tempo for beat synchronization is determined by the time constant of neural dynamics conserved across species".

And Beethoven?

"The study found that Beethoven had an unremarkable polygenic score for general musicality compared to population samples from the Karolinska Institute, Sweden, and Vanderbilt University, USA. However, considering the limitations of the current polygenic scores and the fact that a genetic indicator for ‘beat synchronization ability’ may not directly tap into Beethoven’s composer skills (musical creativity), this finding is not unexpected.”

The genetic architecture of this trait is highly polygenic, meaning that it is influenced by many genes in the human genome. The authors identified 69 separate locations on the genome at which different alleles (the variant forms of a gene; every person carries two copies of each gene) in the population account for some of the variability in how accurately people synchronize to a musical beat.

Genes associated with beat synchronization are more likely to be genes involved in central nervous system function, including genes expressed in brain tissue and genes involved in early brain development. Recent studies also found that beat synchronization shares some of its genetic architecture with other traits, including several that are involved in biological rhythms (walking, breathing, and circadian rhythm). 

A polygenic score sums the genetic effects associated with beat synchronization in each individual, but it is only a rough guess. It can tell us an individual's likelihood of reaching specific levels of beat synchronization relative to the population-based model, but it does not correspond directly to the person's actual beat synchronization accuracy. Thus a person's true ability may be one point amongst many in a wide area under the curve. And Beethoven's score may be lower than expected, but did that negatively impact his compositions?
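
To make the 'sum of genetic effects' concrete, here is a minimal sketch of how a polygenic score is computed: each variant's effect size (estimated from a population study) is multiplied by how many copies of the effect allele an individual carries, and the products are summed. The variant names and numbers below are invented for illustration only:

```python
# A polygenic score as a weighted sum of allele counts across variants.
# Real scores use many loci (the beat-synchronization work involved 69);
# these three variants and their effect sizes are made up.

variant_effects = {"rs0001": 0.12, "rs0002": -0.07, "rs0003": 0.03}

# Allele dosage: 0, 1, or 2 copies of the effect allele in this individual.
individual_genotype = {"rs0001": 2, "rs0002": 0, "rs0003": 1}

polygenic_score = sum(
    effect * individual_genotype.get(variant, 0)
    for variant, effect in variant_effects.items()
)
print(f"polygenic score: {polygenic_score:+.2f}")   # 0.24 + 0.00 + 0.03 = +0.27
```

The resulting number only places an individual somewhere on a population distribution; as the authors stress, it is a relative estimate, not a measurement of the person's actual ability.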

“[That] Beethoven had a rather low genetic predisposition for beat synchronization highlights the limitations of polygenic score predictions at the individual level. While polygenic score prediction is expected to get more accurate in the future, it is important to remember that complex human traits, including musical skills, are not determined solely by genes or the environment but rather shaped by their complex interplay.”

In conclusion, the authors stated that the current study "only shows that we’ve been able to use genetics to explain a portion of the variability in beat synchronization skills (again, at the level of pooled data in a large study sample)."

When scientists talk about “heritability” they are referring to the amount of phenotypic variance explained by genetic variation. This does not mean that rhythm is only “genetic” versus only “environmental,” or that rhythm is genetic in certain people but not others.
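
In its simplest textbook form (standard quantitative-genetics notation, not a formula taken from this particular study), heritability is just a ratio of variances:

```latex
% Broad-sense heritability: the share of phenotypic variance (V_P)
% attributable to genetic variance (V_G), with V_E the environmental
% variance (ignoring gene-environment interplay for simplicity).
H^2 = \frac{V_G}{V_P}, \qquad V_P = V_G + V_E
```

A heritability of, say, 0.3 means that 30% of the trait's variation across the population is statistically accounted for by genetic variation; it says nothing about how 'genetic' the trait is in any single person.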

"Scientifically we really can’t say for sure how and why an individual reaches (or does not reach) a certain level of musicality. So it’s not “either-or” but “both-and” genes and environment, and the incredibly complex biological interrelationships that occur during human development of musicality will take many, many more years of work to unravel!"

Studies of beat synchronization in humans and other species, such as in rats, found interesting genetic correlations between beat synchronization and a cluster of interrelated traits: walking pace, musculoskeletal strength, breathing function, and cognitive processing speed. Possibly even cadence in language! Additionally, the shared genetic architecture has implications for physical and cognitive function in neurodiverse people and during aging.

* "Was Beethoven unmusical?", Max-Planck-Gesellschaft Research News, published on website April 10, 2024 and accessed 20/08/2024.

Sunday, September 01, 2024

When 'thing' was not a thing

In our daily conversation we use many words without giving a millisecond of thought to where they arose from. And, sometimes, to what they really mean. One of those words is ‘thing’. We may utter it dozens of times in a day and it may have as many dozens of meanings. That word, regardless of intention, may be the most ubiquitous and flexible in our language. It even fills holes in a conversation when the speaker can’t think of the correct word to use.

But what does ‘thing’ really mean? Where and when did it originate? We will hitch a ride on the ‘thing’ through history and learn some answers.

Proto-Indo-European Languages (late Stone Age to early Bronze Age)

The origins of the word ‘thing’ are rooted in the Indo-European family of languages. Most modern languages of Europe, North America, and northern India evolved from branches of these ancient languages. Many survive today across the Germanic, Celtic, Slavic, Albanian, and Italic branches: English, French, Portuguese, Russian, Dutch, Italian, and Spanish, among others. Just like living organisms, languages evolve, too. Likewise, they follow people as they migrate, colonize, and trade. Eventually, some may share meanings and even sounds. As we will shortly discover.

All the languages mentioned above descended from a single prehistoric language: proto-Indo-European. It was spoken sometime during the late Stone Age and early Bronze Age (~3300-1200 BC). Because this was before written history, the geographical area of origin is controversial. 

The favored hypothesis is the region of the Pontic-Caspian steppes (modern Ukraine and southern Russia). It is also associated with the famous Yamnaya peoples, nomads who are credited with developing wheeled carts and domesticating horses. Bands of Yamnaya spread south, east, and west, conquering and assimilating cultures as they went. Most Europeans and people of European descent carry a percentage of Yamnaya haplotypes in their own genomes. Of course, the Yamnaya took their language with them.

Indo-European Languages (Mid-Bronze Age into the Middle Ages)

By the time written history appeared throughout Europe, many Indo-European languages had evolved from the spoken prehistoric prototype. This was also a period (4000-1000 BC) of great migrations throughout Eurasia and Atlantic Europe. Hittite (of Anatolia/Turkey) is considered the earliest attested Indo-European language, but it is now extinct. During these migrations, speakers of Indo-European languages mingled with speakers of other languages; the languages mixed, overlapped, and retained some similarities throughout.

As agrarian cultures (farming) evolved and more people stayed in one location, especially during the Classical (800 BC-500 AD) and Middle Ages (500 AD-1500), branches of the Indo-European languages became more standardized within those cultures. Remember that in those times only the elite could read and write; most of the population was illiterate, and information was shared orally. Consequently, each branch of the Indo-European languages had, and still has, varieties (dialects) of speech.

Each of these languages had a developmental stage, similar to proto-Indo-European. For example, proto-Norse was spoken in Scandinavia and is thought to have evolved from a northern dialect of Proto-Germanic between the first century BC and 100 AD. It evolved into Old Norse and its several dialects at the beginning of the Viking Age (~800 AD). These later separated and became the modern North Germanic languages. Old Norse is also believed to be the first North Germanic language spoken. Keep this in mind as we progress through time.

This thing and that thing

Before we dive into the word ‘thing’, it is important to follow the evolution and separation of the North Germanic languages. As mentioned, Old Norse was spoken in what we now know as Scandinavia. At the beginning of the Viking Age the dialects of Old Norse diverged into the modern North Germanic languages, also known as the Nordic languages: Faroese, Icelandic, Swedish, Norwegian, and Danish. And their dialects.

In early Germanic societies a ‘thing’ was a governing assembly made up of free people of the community and presided over by a ‘lawspeaker’. They provided legislative functions, as well as being social events and opportunities for trade.

The earliest traces of the word ‘thing’ are attributed to the Germanic peoples of northwest Europe and Scandinavia in ancient history and into the early Middle Ages (~476-900 AD). The first detailed description (98 AD) of a ‘thing’ was made by the Roman historian and politician Tacitus. He suggested that ‘things’ were annual delegate-based meetings for some early Germanic tribes and served legal and military functions. ‘Thing’ was used in Norway before the country’s first Viking king, Harald Fairhair, who ruled 872-930 AD. It is here that the word and its original meaning became famous.

In the Viking Age, ‘things’ were public assemblies of the free men of a place.* They functioned as both parliaments and courts at different levels of society: local, regional, and transregional. Their purpose was to solve disputes, establish laws, and make political decisions. Earlier local ‘altings’ were common assemblies in which all free farmers had the right to participate.

An ancient site of a 'thing' in Sweden

‘Thing’ sites were also often the place for public religious rites. Norway’s ‘things’ provided the institutional and legal framework for subsequent legislative and judicial bodies, even in the modern Western world, and remain today as superior regional courts.

‘Things’ took place at regular intervals, usually at prominent places that were accessible by travel. The place where a 'thing' was held was called a "thingstead" (Old English þingstede, Old Norse þingstaðr) or "thingstow" (Old English þingstōw). The Vikings and early Norse settlers often brought their culture with them to new locations abroad, including their legal systems and the establishment of ‘things’. The collaborative Thing Project has discovered and documented historical locations of ‘things’ in Norway, Iceland, Sweden, the Faroe Islands, Scotland, and the Isle of Man.

Was ‘thing’ always a thing?

Well, no and yes. Depending on where and when, the original word evolved over time from one language to another, from one definition to another.

In Old Norse, Old English, and modern Icelandic, ‘thing’ was written ‘þing’, where ‘þ’ is pronounced as an unvoiced "th". In Middle English, Old Saxon, Old Dutch, and Old Frisian, it evolved to ‘thing’. The difference between ‘þing’ and 'thing' is mostly spelling. ‘Thing’ is pronounced as ‘ding’ in German and Dutch, and as ‘ting’ in modern Norwegian, Danish, and Swedish.

Already in Old English, the word ‘thing’ had shifted to mean an entity or matter (sometime before 899 AD), and then also an act, deed, or event (after 1000 AD). The original definition of ‘meeting’ or ‘assembly’ did not survive into Middle English. In modern usage, the ancient meaning of ‘thing’ in English and other languages has been displaced to mean not an assembly of some sort but simply an object of any sort.

 The meaning of personal possessions, commonly in the plural, first appears in Middle English around 1300, and eventually led to the modern sense of "object". This semantic development from "assembly" to "object" is mirrored in the evolution of many words throughout history. Yet, the most ubiquitous word, at least in English, is ‘thing’. 

And it can mean……. any thing.


 * According to Norway's Law of the Gulathing (the historic ‘thing’ at Gula), only free men of full age could participate in the assembly. In written sources, women were present at some ‘things’ despite being left out of the decision-making bodies, such as the Icelandic Althing. Women elders, however, were often consulted in decision making.


Evolution of things

While reading the abstract of a paper* introducing a variational theory of natural selection as a learning process, I grinned at the courage of the authors in their use of this terminology.

"A variational formulation of natural selection is introduced, paying special attention to the nature of ‘things’ and the way that different ‘kinds’ of ’ things’ are individuated from—and influence—each other."


Context here: things = individual parts and the assembly of these parts
___________________

This may be the introduction to my recent blog post on 'thing'. What's a 'thing'?

Monday, July 01, 2024

Borders and grids are Us

 With a long interest in how we as humans perceive our environment, I found this quote to be quite apt:

"....the imposition of the gridded survey defines the lands as a uniform and monotonous mathematical space turns the lands into a trackless expanse viewed from above."

In many ways we see land as two-dimensional and existing in geometrical shapes. Look at any map, look down from a plane, and you will see this. In a recent issue of Nature, NAS scientist John Holmes reviews a new book, "Liberty's Grid", by historian of science and mathematics Amir Alexander.

Alexander relates how the "rectilinear grid was imposed and how it has fed into the US consciousness."

"In Alexander’s telling, the grid is grand, ambitious and uniquely American. It is not only a blank, boundless canvas, but also a causal factor of different aspects of the United States’s trajectory and character."

An interesting book, but as Holmes rightly criticizes, Alexander often confuses cause and effect. Alexander tends to blame many events in our American history on the gridded map and to ignore other elements at play in our American culture.

"Fly over the United States or walk its city streets and you can’t help but notice the country’s seemingly endless patchwork of rectangular blocks of land. The origins of this ‘grid’ lie in the eighteenth and nineteenth centuries, when the early US leaders sent surveyors out to carve up vast tracts of land acquired under treaties and to expand settlement westwards to the Pacific coast."

In other words, dividing up this country was a rushed job without real thought to the landforms. Not that they mattered anyway, except as resources to exploit. Ask the Native Americans about that.

Good points here:

"“Written not on parchment but into the mountains, valleys, and plains of North America, the Great American Grid embodies an ideal of America as a land of unconstrained freedom and infinite opportunity,” Alexander writes. His central thesis is that, although dividing up much of the United States into a geometric grid might seem like a convenient solution to a difficult problem, when viewed from a historian’s perspective it becomes an expression of American exceptionalism and a means to fulfil the idea of the country as an “empire of liberty”."
It is with this last criticism by Holmes that I agree, simply because Alexander either ignores or circumvents the psychology and sociology of creating borders, visible and invisible. From a basic philosophical perspective, it's about creating and outlining territories, just as wild cats and canids do by urinating on the edges of theirs. And those borders are dynamic, not static. This has been a main foundation of human war since the dawn of civilization. Except for traditionally nomadic peoples, where land borders are blurred or don't exist.

We could hypothesize that borders are a human construct, but other species prove otherwise. The real mystery is how we add layers of human culture and society onto and into those human-drawn lines. Or perhaps that is our default subconscious way of looking at reality. What is the traditional shape of most structures around the globe? Walls with straight lines and 90-degree corners.



Monday, June 24, 2024

Our Demons

Everyone has demons. The worst demon of all is refusing to acknowledge your demons. Conquering and ridding yourself of your demons is not the answer; they never go away. They are part of you no matter how much you try to deny them.

But we have an angel, too. The best angel is knowing how to live with and control the demons. Consider the famous quote by Sun Tzu, the Chinese warrior, strategist, philosopher, and teacher, from his book "The Art of War":

“If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle.”

 Or, as Irish singer/songwriter Hozier eloquently sings:

"All you have is your 'fire'
And the place you need to reach
Don't you ever tame your demons
But always keep them on a leash…"

 


Friday, March 15, 2024

AI R Us

This should not be surprising.* AI models are a mirror of human behavior. Shit goes in, shit comes out.

AI models are trained to learn, consolidate, collate, and regurgitate. We don't like what we get, so we change it afterwards. But that doesn't change the mirror effect. It's a Band-Aid approach: Cover the abscess so we don't have to see it. Or, as Nikhil Garg, a computer scientist, eloquently puts it, “simply paper over the rot”.

This brings to mind a sci-fi series I'm watching on TV, "Beacon 23". AI, a common component of most sci-fi, is represented in the series by two important characters who are almost complete opposites: "Bart" and "Harmony". The latter is a logical Spock attached to an individual human and created by a major company that controls almost everything.

Bart is an AI developed hundreds of years prior to the primary timeline of the series and serves the structure, Beacon 23, in space. It's a 'lighthouse' stationed near a flux of dark matter; the Beacons guide spacecraft away from detected nearby dark matter 'clusters'.

Bart learns and adopts human nature and behavior from all the beacon "keepers" and those who visit. He also controls all the communication and mechanics of the Beacon station. As such, Bart is just like a human, with all the messiness, assumptions, errors, etc.: he lies, whines, plots, complains, quotes Shakespeare, and often acts like a child. But Bart also significantly sets the course for what occurs on the Beacon. Harmony, by contrast, is logical and attuned only to her individual human, but she is also capable of controlling the Beacon, including Bart (she scolds Bart many times).

Humans created AI, and in its current state in our world it is like a young Bart. That we can't see that is human blindness. AI is not, and won't be, our savior. We can't even save ourselves.

"Even though human feedback seems to be able to effectively steer the model away from overt stereotypes, the fact that the base model was trained on Internet data that includes highly racist text means that models will continue to exhibit such patterns."

* "Chatbot AI makes racist judgements on the basis of dialect," Elizabeth Gibney. Nature, March 13, 2024.

Monday, March 11, 2024

Which Reality is Real?

What is 'reality'? Ask 10 people that question and you'll likely hear 10 different answers, including "I don't know." The question may be as old as human consciousness. However, it may not be a realized 'thing' [1] that all humans ponder. 

A quick summary in the online encyclopedia Wikipedia presents reality as...

"Reality is the sum or aggregate of all that is real or existent within the universe, as opposed to that which is only imaginary, nonexistent or nonactual. The term is also used to refer to the ontological status of things, indicating their existence. In physical terms, reality is the totality of a system, known and unknown."

Is reality a thing? Or is it a way of looking at 'things'?

Reality may be both. No one may have understood this more than physicist Erwin Schrödinger, famously known for his 1935 thought experiment in quantum mechanics, Schrödinger's cat. It was a reductio ad absurdum (reduction to absurdity) intended to question the then-proposed behavior of atoms and larger manifestations as being one state or the other, as in "dead or alive," depending on the observer. Or, simply put, it suggests that reality is relative to the organism that observes or experiences it in one way or another.

Recalling an old platitude: If a tree falls in the forest and no one is looking or hearing it, did it really fall?

 Yet, Schrödinger's paradox legitimately questioned, 

"When does a quantum system stop existing as a superposition of states and become one or the other?" (More technically, when does the actual quantum state stop being a non-trivial linear combination of states, each of which resembles different classical states, and instead begin to have a unique classical description?"[2]

More simply put, 

"Our intuition says that no observer can be in more than one state simultaneously—yet the cat, it seems from the thought experiment, can be in such a condition. Is the cat required to be an observer, or does its existence in a single well-defined classical state require another external observer?" [2]

This may bring to mind Einstein, of "Theory of Relativity" fame, who engaged deeply (and skeptically) with quantum mechanics. Indeed, Einstein considered each alternative absurd. He wrote to Schrödinger,

"You are the only contemporary physicist, besides Laue, who sees that one cannot get around the assumption of reality, if only one is honest. Most of them simply do not see what sort of risky game they are playing with reality—reality as something independent of what is experimentally established. Their interpretation is, however, refuted most elegantly by your system of radioactive atom + amplifier + charge of gun powder + cat in a box, in which the psi-function of the system contains both the cat alive and blown to bits. Nobody really doubts that the presence or absence of the cat is something independent of the act of observation."[3]

Many interpretations, both technical and popularized, provide explanations and answers to Schrödinger's paradox. However, the quantum world is full of counterintuitive ideas, as Schrödinger's thought experiment strongly implied. Several physicists, both contemporary with Schrödinger and after his passing, proposed their own perspectives.

Of note is American physicist Hugh Everett, who proposed (in his 1957 PhD thesis) what is now known as the many-worlds interpretation of quantum mechanics. I'm sure most readers here are familiar with the popular 'multiverse' of the science fiction genre and even of modern physics. Everett's version of quantum mechanics does not single out observation as a special process. His many-worlds interpretation of Schrödinger's paradox explains that,

"...both alive and dead states of the cat persist after the box is opened, but are decoherent[4] from each other. In other words, when the box is opened, the observer and the possibly-dead cat split into an observer looking at a box with a dead cat and an observer looking at a box with a live cat. But since the dead and alive states are decoherent, there is no effective communication or interaction between them. 
When opening the box, the observer becomes entangled with the cat, so "observer states" corresponding to the cat's being alive and dead are formed; each observer state is entangled, or linked, with the cat so that the observation of the cat's state and the cat's state correspond with each other. Quantum decoherence ensures that the different outcomes have no interaction with each other. The same mechanism of quantum decoherence is also important for the interpretation in terms of consistent histories. Only the "dead cat" or the "live cat" can be a part of a consistent history in this interpretation. Decoherence is generally considered to prevent simultaneous observation of multiple states."[5]

Quantum mechanics is often used in contextual explanations of reality. One could possibly, and loosely, refer to Everett's hypothesis as 'alternate' realities. Is this a real 'thing'? Or just another thought experiment or interpretation of reality?

This all begs the question: is there just one reality? If so, then what is it? After all, one 'real' reality could exist for non-living things (we know they exist, but the non-living have no consciousness), and another for living things (because we have consciousness and are aware). Which suggests that an infinite number of personal realities may exist. A shared reality may then be overlapping personal realities, like many flexing four-dimensional Venn diagrams, constantly shifting temporally and spatially.

As mentioned earlier, reality is the totality of a system with known and unknown existences.

Or perhaps reality is like the smile of the Cheshire cat: it remains even when the cat becomes invisible.
_______________________________________________

[1] I'm compelled to explain my use of 'thing' in its most recent etymological context: used colloquially since about 1600 AD as a substitute for something whose proper name a person cannot recall. However, this ubiquitous word 'thing' has an interesting history going back to the Vikings.
[2] "Schrödinger's cat," Wikipedia
[3] Einstein, letter to Schrödinger, 1950.
[4] In quantum physics, decoherence is the process in which a system's behavior changes from that which can be explained by quantum mechanics to that which can be explained by classical mechanics.
[5] Hugh Everett III, Wikipedia. As an aside, both Hugh and his son, Mark Oliver Everett, are thought by many to be Asperger's (on the autism spectrum).

Saturday, February 24, 2024

Life as a human chameleon

Two common words used amongst autistics (and in the autism literature) are 'masking' and 'camouflaging'.  As a late-diagnosed adult Asperger's (on the autism spectrum), I had no idea what they meant except for their literal meaning. At the time of my (unofficially official) diagnosis in 2007 it was referred to as 'coping'. That superficially described it and I left it at that, continuing on in my own uninformed default way: coping. 

In my post of 'coming out' as an older woman on the autism spectrum (Asperger's), I explained why I hid my diagnosis for 16 years. I spent five decades coping as a stranger in a strange land, constantly asking myself "Why am I so different?", and feeling very alone in my version of reality. But I had learned to cope for the most part. As I mentioned elsewhere, I felt fine most of the time. But I didn't know if that was because I had dealt with it, or because I had buried it. Now I know it was both.

I interpreted masking as intentionally 'wearing a mask' to hide my weirdness (a label I heard frequently). Camouflage, to me, was subconsciously chosen or learned behavior. As a teen and young adult I used to call the former 'games' or 'playing games'. I learned the rules (expectations) and would intentionally pretend, or not, to follow them. Only in retrospect did I realize that I had been subconsciously learning and developing strategies to interact within the neurotypical world. It was somewhat Pavlovian.

The latter was pointed out to me during my diagnosis inquiry after I mentioned two key personal accounts: my father was also Asperger's (undiagnosed; there wasn't such a thing back then), as was my closest (ever) adult friend.

I realized consciously, by observation, that my father was very different from other male adults and fathers: he had no social skills, no common sense, didn't like close contact, didn't talk much, had an exceptional memory, and was a polymath. And I was aware of the consequences of his weirdness, such as his coping through alcohol abuse and his being ostracized. It was suggested that I learned some coping skills from that awareness without consciously understanding it. My mother, on the other hand, picked up on it, confirmed by her frequent exasperated cries of "You're just like your father!!"

Decades later, a very close adult friend, 15 years younger than I, was Asperger's. We shared many personal 'secrets' about ourselves, during which I realized his behavior was similar to my father's. I recommended that he learn the same coping skills I had learned: observe and mimic neurotypical people. I was unaware that I myself had done the same.

It was pointed out to me that I was sharing the same 'coping' mechanisms (masking and camouflaging) that I had intentionally and subconsciously used for all those years. Because I was also Asperger's, which is highly heritable.

Meeting a few other autistics late in life has finally made me feel comfortable in my own skin. I've also been researching the biology, genetics, psychology, and sociology of being Asperger's.[1] One common trait in most (if not all) neurodiverse people is to 'mask' and 'camouflage'. These are common coping mechanisms for navigating the dominant reality of neurotypical people. Or, the "process[es] of changing or concealing one’s natural personality in order to 'fit in', or perhaps more specifically in order to be perceived as neurotypical".[2]

But the terms still confused me. 

Masking and Camouflaging

As a biologist, I understand camouflage in the context of other animals, such as the chameleon, the cuttlefish, and many butterfly species. Not so much in Homo sapiens, where camouflage is primarily social. Masking and camouflaging are used interchangeably in popular books and even in some of the research on autism. I wasn't happy with the ambiguous usage, so I developed the definitions mentioned above based on my own experiences. Until I found a webpage discussing more precise and technical definitions.

The CAT-Q 

Recent research on autism and other neurodivergent behavior has delineated distinct subsets of camouflage, of which masking is one. Eva Silvertant's webpage explains in detail, with comments, the subcategories of camouflaging by neurodiverse people: compensation, masking, and assimilation.

Based on conversations with family members and fellow autistics, the how and why of camouflaging differ with individual perception. Several have pointed out that nearly everyone does it to some degree and in some fashion. Yes, that is true. But the degree and the consequences are different for neurodiverse people. (I often use the colloquial "conforming to the norm." But, you may ask, 'what is normal?', which is a long topic in itself.)

As Silvertant explains, 

"Some extent of camouflaging is probably inherent to human interaction; we learn social skills to improve social interaction, we adhere to social conventions that may not make complete sense to us, and we adjust our behavior as the situation demands. For example, we tend to behave differently at work than at home. But for some people, the need to camouflage is a lot less superficial. Of course, the need to camouflage is proportional to how strange your behavior is perceived to be by your surroundings. And since autism is generally not very well understood by non-professionals—and even many professionals, honestly—it is us autistic people who find we often have a greater need to camouflage.

For other people camouflaging might mean acting, talking, and/or dressing a certain way in order to fit in with a social group of their preference. And while this probably pertains to autistic people as well, our need to camouflage tends to go deeper; because autistic people often have to camouflage their autistic behaviors, so as to minimize the visibility of one’s autism in social situations."

Silverton then discusses the how and why, and some of the consequences of camouflage for autistics. For example, when trying to explain to my family members why I can not handle hugs from people (other than my family) without suffering anxiety and emotional and physical recoiling, I realized they could not understand. What they did finally understand was why I apologetically make excuses to not attend big extended family gatherings and linger on the edges in social groups to avoid hugs. I now have decided to stop 'playing the game' and just politely tell people "I don't do hugs".

Researcher Laura Hull et al.[3] designed a questionnaire, the "Camouflaging Autistic Traits Questionnaire (CAT-Q)," that helped them develop a model of camouflaging with three categories:

  1. Compensation — Strategies used to actively compensate for difficulties in social situations.
  2. Masking — Strategies used to hide autistic characteristics or portray a non-autistic persona.
  3. Assimilation — Strategies used to try to fit in with others in social situations.
The 25 items of the social camouflaging model are diagrammatically explained on Silvertant's webpage and in the original published paper.[3]
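
For the curious, scoring such a questionnaire is straightforward arithmetic: Likert responses are summed within each subscale. Here is a minimal sketch in Python, assuming the CAT-Q's 7-point response format; the item-to-subscale assignments below are placeholders, not the published key (see Hull et al.[3] for that):

```python
# Scoring a CAT-Q-style questionnaire: sum 1-7 Likert responses per
# subscale. Item groupings here are illustrative placeholders.

SUBSCALES = {
    "compensation": [1, 4, 5, 8, 11, 14, 17, 20, 23],
    "masking":      [2, 6, 9, 12, 15, 18, 21, 24],
    "assimilation": [3, 7, 10, 13, 16, 19, 22, 25],
}

def score_catq(responses: dict[int, int]) -> dict[str, int]:
    """Map of subscale name -> summed score, given item -> rating (1-7)."""
    return {name: sum(responses[item] for item in items)
            for name, items in SUBSCALES.items()}

# Example: a respondent answering "4" (the midpoint) to all 25 items.
responses = {item: 4 for item in range(1, 26)}
scores = score_catq(responses)
print(scores)                            # each subscale at its midpoint
print("total:", sum(scores.values()))    # 100, on a 25-175 scale
```

Higher scores indicate more camouflaging; the three subscale scores let researchers separate compensation, masking, and assimilation rather than lumping them together.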

The CAT-Q questionnaire was validated by others, and the researchers purport it to be a "reliable self-report measure of adults’ social camouflaging behaviors, suitable for use in autistic and non-autistic male and female populations. It can be used in research settings to quantify camouflaging behaviors and compare between groups; in clinical settings as a potential screening tool for individuals who may be missed under current autism diagnostic criteria because they camouflage; and by autistic and non-autistic people to aid identification of beneficial or harmful behaviors they use in social situations." [3]

However important technical terms are in science communication, they can be confusing to the lay public. I can understand why using the two terms, masking and camouflaging, rather than the three categories from the CAT-Q, would be more efficient. Regardless, the CAT-Q terminology should be the standard in autism research and professional publications.

In common use, it may be helpful to define masking and camouflaging more narrowly rather than using them interchangeably. Thus far, the only book I have read that distinguishes between the two is Unmasking Autism: The Power of Embracing Our Hidden Neurodiversity, by Devon Price, PhD, an autistic social psychologist.

Learning that many of my inherent differences and quirks in social communication and interaction are common amongst other neurodiverse people has more than explained my responses and behavior in the past. I regret my unwillingness to learn this earlier, when I was in academia and where I experienced the most stigma about my neurodiversity. Now that I am retired and meeting more neurodiverse people, on the autism spectrum and others, like ADHD, I feel freer to be who and what I really am: a lot less masking and camouflaging, and more relaxed when 'light masking'.
_______________________________________________________

1.  That's part of my Asperger's: extremely focused on specific topics, mostly the living sciences. Which is why I became a scientist.

2. Silvertant, Eva, 2020/2023. Autism & camouflaging (pulled from the Internet 02/24/2024)

3. Hull, L., et al., 2018. Development and Validation of the Camouflaging Autistic Traits Questionnaire (CAT-Q), Journal of Autism and Developmental Disorders.

Tuesday, February 13, 2024

Air Pollution is Us

The dangers of air pollution seem obvious and of concern to some, but apparently only a minority. How much of the apathy is attributable to "I don't care" or "I don't believe it", or both, is unknown. Yet I see it on a daily basis.

The most striking is the number of people who leave their gas-powered vehicles running while at stores or other businesses. On my walk to and from the gym, I pass cars parked on the sides of the street and in the post-office parking lot. More often than not, the engines are on and running with no occupants inside.

Out of curiosity, I independently timed 4 empty vehicles with engines running (1 at the post office, 2 at a barber shop, 1 in a store parking lot). Vacancy time was 12-32 minutes. The other day 5 running vehicles parked at the post office were backed into the lot with rear ends butting the sidewalk. Another time I walked past an idling car with a passenger in a store parking lot. About twenty-five minutes later, when I exited the store, the car was still running with its passenger inside. I see this year-round, not just seasonally.

I see this ALL the time! Imagine how much exhaust is emitted into the air. I notice during my walks on cold and gray days that the exhaust stays closer to the ground, not dissipating into the air above, and it is worse on windless days. That means I'm breathing it the entire time I walk along the streets. I can smell and taste it.

All this in a small rural town. Now, consider the expansion of this in cities and along major highways. This is why I don't like living in urban areas (as well as my hypersensitivity to noise). And it affects more than just us humans; it impacts all life forms. 

Here is an important factoid: a scientist discovered through stringent experiments that lead in the environment was almost nil until the beginning of the Industrial Revolution and increased every year thereafter. By sampling layers of ice in Greenland, he found that levels increased significantly in the 1950s, coinciding with the widespread use of leaded gasoline. Lead in gas was (is) volatilized in engines and emitted from tailpipes. Therefore, lead was deposited along streets and highways for decades.

The dangers of lead were known decades before unleaded gas was introduced for on-road vehicles in the 1970s. Yet it took another 20 years for leaded gas to be phased out completely; lead in gasoline for on-road vehicles was prohibited in the US in 1996.

The problems are: 

1. Lead is a heavy metal and stays in living tissues for some time, depending on how much and how long an organism was exposed. High exposure (in time and amount) can lead to the metal being deposited and stored in bones and teeth (the two common sources for sampling lead contamination).

2. Bioaccumulation: lead from air pollution and other sources can accumulate in soil and plants, and in the organisms that use plants as a food source. And lead can persist for hundreds of years.

"Lead shares about 10% of total pollution produced by heavy metals. The uptake of lead by the primary producers (plants) is found to affect their metabolic functions, growth, and photosynthetic activity. The accumulation of lead in excess can cause up to a 42% reduction in the growth of the roots."

Also, lead arsenate and other lead compounds were used as pesticides on food crops, such as fruit orchards, until the 1950s. That lead is still in the soil and still being absorbed by plant roots. Sampling of plants along highways and streets demonstrates that plants absorb, and some sequester, lead from the air. (Avoid planting edible plants next to roads and highways.)

Pollution particles are not just lead, as this recent science article points out. Yet no one seems to care, considering the amount of pollution generated, knowingly and unknowingly, on a daily basis. Right under our noses.

"The damage that air pollution can do is wide-ranging and well-known: The chemicals produced by human activities can trap heat in the atmosphere, change the chemistry of the oceans and harm human health in myriad ways.

Now, a new study suggests that air pollution might also make flowers less attractive to pollinating insects. Compounds called nitrate radicals, which can be abundant in nighttime urban air, severely degrade the scent emitted by the pale evening primrose, reducing visits from pollinating hawk moths." ("Polluted Flowers Smell Less Sweet to Pollinators, Study Finds," Emily Anthes, New York Times, 2/08/2024)


 

Thursday, February 01, 2024

Speaking Out

What convinces me that our species is 'doomed' (I wish I could avoid using that term, but can no longer deny the applicability) is the global pathological denial of the train wreck. Additionally, human civilization has already dragged down other species, and continues to do so at alarming rates.

As a scientist, I intellectually understand that denial is a defense mechanism to help cope with anxiety (coupled with explaining away problems and blaming others). Denial enables ignoring or refusing to believe an unpleasant reality, protecting psychological well-being in any situation that produces anxiety or conflict, including challenges to one's standard of living or power status quo. But when you are looking into the jaws of a lion ready to bite off your head, or staring at the giant wall looming before the speeding train, denial no longer serves as a defense mechanism. Here I lose understanding, and I blame my science colleagues as much as the politicians and financiers who perpetuate the fuel of denial.

Most of the scientific community, including medical, is either in denial or running to look for technological solutions rather than evaluating the root origins of our problems and addressing 'how we got from there to here.' Our politics, economics, policy, and, many times, science are based on 'Band-Aid' cures. Put a Band-Aid on it, cover it up, and it will go away. Ignore the festering wound underneath; its origins, the process and interrelated changes of its development. Ignore the peripheral interacting relationships and far-reaching impacts. "Let's amp up production; don't pay any attention to consumption." "Give them a pill; who cares about prevention?" "Collect the species' DNA and we can ignore life extinction."

At this rate, our civilization will collapse. As history teaches us, it will also arise again like the Phoenix. But perhaps a Phoenix with a limp. My sympathies and grief are for the non-human species on this planet. Their collapse and extinction are final. And they have no concept of denial. The ubiquitous law of supply and demand is a part of population dynamics. The changing supply of resources (water, food, shelter) will trigger changes in population. I suspect that without modern industrial sustainment of large-scale food production, water collection, and long-distance distribution, human populations everywhere will decline, possibly collapsing quickly. Earlier civilizations, even other species, have experienced these cycles; we are not immune.

Sometimes I have to look beyond the doom and gloom and find specks of hope that we can change this trajectory. But my brain isn't convinced.

Monday, January 29, 2024

The Mental Health Crisis

I read an announcement this morning, and the associated media release, about a NYS senator supporting introduced legislation to help recruit more mental health professionals to state counties that lack them. Which means most of NY state and nearly all rural counties.

One of her comments reflects public attitudes and policy regarding mental health in this country (let alone the state). And it "stoked my fire" enough to type and send her an email (see below).

Writing the email was easy. What's less easy is posting it here with my identification added to it. It's like stepping out on a public street naked with a sign, "See us, help us," where the typical reaction is that most other people will turn away or pretend they didn't see. That is the stigma all with mental health issues carry. 

Mental health issues don't mean we are damaged or broken. Some of us are just different; some of us have managed to cope. Others live every day in a nightmare. Most of us also remain hidden and invisible: by our own choice, because we are ignored, or because we can't get help. It's not just an individual's problem, or New York's problem; it is the entire country's.

We really are more alike than we are different. But the stigma pushes us away and, in some cases, kills us. This country's perceptions and public policy have changed for the better, if only in minute quality and quantity. But not enough. And the population of the troubled has grown. Just as centuries ago, only the privileged have assured access to help and care.

This needs to change. It starts with each and every person.
_____________________________________________________________

Contents of my email:

"I read the announcement and media release about the effort to increase mental health professionals in New York state counties that need it. Which, judging from the map in the media, is 73% of all NY state counties. Most of the latter are rural.

Offering student loan forgiveness for MH “professionals” is a pittance against the epidemic of mental health issues in this state (and country). The offer appeals more to recent graduates than to the experienced experts and professionals who are sorely needed.

The comment in the media piece demonstrates the attitude of people in our society that others with mental health issues are only “an enormous burden on our society and economy as a whole, imposing millions of dollars in direct and indirect costs." Very little consideration of the personal pain of the afflicted, their families, and their loved ones. We are whitewashed, as we have been for centuries.

Increasing professionals will not lessen or solve the MH epidemic. Most of the afflicted cannot afford professional help, even with most insurances. And many do not have any insurance. This is why a high number of people with MH issues die (overdose or suicide) or end up incarcerated. This results from lack of support and help. Isn’t it ironic that the only time the public notices is when they are in jail or in obituaries?

A large % of people and families live paycheck to paycheck. They can’t afford professional help because they struggle to feed themselves, pay their rents, and make it to their jobs. They also fear the “system”, scared that their children will be taken from them, that they will lose input or control over their own lives, and, most of all, they fear the stigma. Which, as your comment demonstrates and perpetuates, is very real and alive.

Mental health acknowledges no social and economic boundaries. Society sets the boundaries. And the privileged can afford professional help and services. Especially in the cities, as the map demonstrates. Rural people are left to flounder through the nightmares in which they live.

Do better. Help bring the mental health crisis to the forefront of the ongoing overall health crisis. Help them by reducing the stigmas and fear. Help them by making MH care more accessible and affordable. Help the public understand that 1 in 5 people in this country experience MH issues. And that number may actually be higher because many people hide it or are undiagnosed. They are the invisible people that silently cry out for help.

I’m on the autistic spectrum, diagnosed late in life. Most of my life has been living in a “different world” that no one else knew or saw. I know others in worse situations that live moment to moment wondering how they will cope and make it to the next day. I see undiagnosed children with less awareness of their problems, and, most of all, their futures. Some may not have a future.

We are the 'I have no mouth and I must scream.' "