Year End Thoughts for 2025 — The Big Picture Stuff

 [Before we begin, a word about hyperlinks in this and any Instapunk post. They’re there to help you, not create a series of distracting digressions. Good rule of thumb: note that the link is there, take it if you can’t resist, but try to finish reading the post and then go back to any hyperlinks that still intrigue you. Videos are the reader’s now/later choice every time. Absolute linearity is the obsession of the obsolete typewriter crowd.]


Göbekli Tepe. 12,500 years old.

This is very long. I had to write down what I was thinking in some detail. I’m glad I did, but you don’t have to read it at one sitting. If you like, you can skip all the way down to the Section titled “The Secular Dead End” and get the tone and gist of my perspective, leaving the substance till later or never. Understood?

Let’s get down to it.

What’s the Big Thing that matters most, more than anything? Answer? The story we tell ourselves as human beings about where we come from, where we are going, and what it all means. Everything written down or otherwise produced by philosophers, theologians, historians, scientists, political movements, and creative writers is about the Big Thing, with varying degrees of detail and conceptual coherence. The cultural assumptions embodied by each generation of Mankind are the rough human consensus of what the overall narrative means and how it should be pushed toward the next chapter. The consensus is always filled with contradictions, conflicting hypotheses, and pragmatic responses by various populations to what they perceive as reality.

The Big Thing is too big to hold in mind all at once, which is why every discipline and individual voice simplifies the complexities it encounters by means of metaphors — the “this complicated thing is like that simpler thing,” enough at least for purposes of discussion. Which is why, in my Preface post, I reproduced this page from my Voynich Manuscript project:

One page metaphor for all written human documents of any kind.

Oversimplification? No. Literal truth, or the closest we can get to the inherently ineffable thing we call truth. Even our very deepest and most passionately believed notions of truth are working assumptions with varying degrees of legitimacy. We have one tool to use in assessing the value of working assumptions. A tool with more metaphors describing it than a hen lays eggs… the smell test, the bird’s eye view, the bottom line, or most starkly, common sense. Does it work? Meaning: has it worked, is it working now, will it work as a guiding assumption in the future?

Also in my preface I postulated that one of my key working assumptions is that everything is in everything. That is, no matter how finely we try to separate a thing from the whole to examine it more clearly in isolation, the whole is always there and can be inferred from that littlest thing if we look for it. Science has made its own real world metaphor for this phenomenon. It’s called the hologram, a graphic device whose physical property is that any fragment of a holographic image contains the whole image. Not explainable but observably verifiable. (For an exploratory post already on the record, see Poetry Across Scales, 8/23/25.) Here’s a snapshot of a post I’m still working on:

One corner of the room I’m sitting in. Basis for a post-in-progress longer than I have yet 
completed, showing that all the known universe can be inferred from what’s here in the pic.

The photograph above is a work of fiction. It’s a made thing. It is not the thing itself, but a shorthand version of it and it contains or implies so many working assumptions of the consensus narrative that its status as fiction is deducible from the simple fact that it is not the whole thing.

I’m not making a narrow theoretical point here. It’s a literal factual point. Take any piece of writing, including even legal briefs and movies, and test it for yourself. Everything contained therein is an individual version of its creator’s idea of the truth or a deliberate untruth he is seeking to make resemble truth. By ‘all writing’ I mean all writing. Setting aside post-it notes on the door of the fridge, which can also be parsed into unproven assumptions, even the most hot-headed and cold-blooded attempts at persuading a reader of anything and everything are not strictly true, no matter how sincerely intended. They are all edits of reality, compiled from selections of the available evidence, meant to achieve some purpose, employing logic, emotion (or feigned lack of same), and data (calculated in tables or made up on the fly), and intended as information or entertainment or wisdom. What is left out is usually as important as what is included, and usually discernible in outline at least despite the omissions.

Why these, my End of Year Thoughts, are also fiction. In my preface I said my agenda this year would be “Atheism, Sex, the Erosion of Consciousness, the Tyranny of Science, and the Judas Goat of Artificial Intelligence.” I’ve written about all these topics before, so why bother? Because this time they need to be looked at specifically for their relations to and among and against one another. They are all integral parts of the current narrative of who we are and where we’re going. The summary assessment here will also be, as I have defined it above, a work of fiction to be subjected to your own smell tests and common sense. I know that won’t be many of you. My purpose in writing is not conquest. It’s simpler than that. I want more people thinking about these things as interrelated phenomena. What impact that will have on your views and lives is entirely up to you. I’m content with that.

Atheism has been a consensus working assumption of the elite intellectual caste in charge of education and the remnants of what was once philosophy from the middle of the 19th century on through the 20th and 21st centuries to date. It is the tap root from which the other listed categories of crisis have grown, given that growing can also include withering as well as blooming. Almost exactly four years ago, I wrote a post about the British and Oxbridge contribution to the elite intellectual take on the Big Question. A key quote:

“…Monty Python was only a sideshow. Count back the years to the British Invasion. John Lennon didn’t go to Oxford. But he grew up in the Brit caste system, and he was pissed. At authority, at religion, at his home country, and he came HERE to express what he couldn’t get away with saying at home. What did he believe in? I can’t believe so many people still love this suicidal, nihilistic song


But here’s what he was really saying: Working Class Hero. It didn’t sell quite as well, but it was his truth.


Now go a long, long while back. Whence it finally seeped down to Lennon, who said of his band, “We’re more popular than Jesus.” As it turns out, he was probably right.

This seminal work was published in 1811, 153 years before the Beatles captured America.

More Oxford. Harvard’s had a lot of poets, Yale a few. Oxbridge has more. It took the French to kill architecture (Le Corbusier), but it took the English — and Oxford — to kill God and Western Civilization and print literature.


The Brits have been crushed by their elites. We are staring down the barrel of the same fate.” 

Where were we? The history of atheism as a dominant working assumption among the upper crust of academic and scientific elites. A history that will begin to show the tight relationships between the topic headers above. When Shelley introduced atheism into the philosophical discussion, he was reviled as a heretic. But a succession of real world events changed the landscape very dramatically. The initial repudiation by the intellectual elite demonstrated that atheism was actually a notional novelty, an intellectual invention without precedent in thousands of years of recorded writings and devotional monuments. God(s) were there at the birth of civilization itself. Their existence was not doubted, just their names and legitimacy to worshippers and followers. Wars were fought to determine which gods were the right ones to honor and placate.

Science starts the ball rolling before Shelley. Newton formalizes the scientific method to systematize inquiries into attempts at understanding the extent of (just one) God’s creation. The prime directive of the method was to count and measure what could be counted or measured and to use only those data to test and prove theories about the ways of nature. This created a conflict between science and religion that simmered quietly in philosophical texts until 1859, when Darwin’s Origin of Species started an intellectual war that lasted for nearly 60 years and ended with a one-sided armistice in 1918. It would take another 50 years for scientific and cultural atheism to complete the revolution against faith in God and rewrite the popular narrative of the Big Thing as well. Here are some key milestones of the Timeline:

1687 — Newton’s Principia, formalizing the Scientific Method.

1811 — Shelley’s “Necessity of Atheism,” planting the seed at Oxford.

1859 — Darwin’s Origin of Species, replacing the Bible’s creation story.

1882 — Nietzsche’s The Gay Science, declaring that “God is dead.”

1918 — The End of World War I, immolating religious faith among the intelligentsia.

1923 — Freud’s The Ego and the Id, formalizing the soft science of psychology.

1943 — Sartre’s Being and Nothingness, giving birth to secular existentialism.

1945 — The End of World War II, launching the nuclear age and a four-decade Cold War.

1963 — The assassination of JFK and the escalation of the Vietnam War.

1964 — The British Invasion led by the Beatles and the rise of sex, drugs, and rock’n’roll.

1968 — The birth of the New Left in a year of blood-soaked politics.

1989 — The fall of the Berlin Wall and the End of the Cold War.

1990-Present — Anomie, culminating in the COVID panic and the replacement of scientific worship with the emergent suicide cult of Artificial Intelligence.

Most cultural critics would tend to propose very different-looking timelines of key turning points. That’s because the sprawl of technology-driven mass media has resulted in a shallower, more sensational narrative that is communicated in terms of politicians and demographically oriented causes like civil rights, feminism, little-big wars, and economic hot buttons. My 1963 is different from the feminists’ 1963, which would claim the world-changing milestone of Betty Friedan’s The Feminine Mystique, still the bestselling book in the history of that cause.

But the mass media timeline would be wrong in its emphases, because the overwhelming unifying feature of mine is the long journey of western civilization into de facto atheism as the ruling philosophy of the 21st Century. 

The Cultural Function of Religion

My timeline is the one that enables us to dig into the subtle but transformational ways Christianity has reached its current status: vilified and persecuted even while remaining the nominal majority faith of populations the world over. Which is a supreme irony, and therefore one that is vital to understand if one is to understand anything about the precipice on which we now stand. Most of what we tell ourselves are the defining moral issues of our lives are mere distractions from the hidden big picture narrative.

For example, our public debate is presently consumed by the supposed conflicts between Marxist egalitarianism and Nazi/fascist nationalism. (Christianity gets dragged into the anti-nationalist polemics as a kind of stinking corpse smell to steep the modern Nazis in.) Yet the real relation between Marxism and Nazism is that they are both explicitly atheistic in practice and have murdered untold millions of the people they claimed to be liberating. By the same token, Islam has more in common with both Nazism and Marxism than it does with any of the Judeo-Christian Biblical religions, of which it is a cynical parody.

What is the political value of atheism? It authorizes men to play God. Morality is not invested in Commandments but in the dictates of the man in charge. (Atheist ideologies always end with one man in charge.) How German National Socialism turned into the power-mad totalitarianism of Hitler. How the Communist Russian Revolution against the autocratic czars turned into the power-mad totalitarianism of Lenin and Stalin. How the Communist agrarian revolution in China turned into the power-mad totalitarianism of Mao. How the ‘religion’ which subtitles itself Submission transformed into the power-mad totalitarianism of the Ottoman emperors, the Ayatollahs Khomeini and Khamenei, and Osama bin Laden. The secret no one will admit about Islam is that its first act was to replace the Hebrew God Yahweh with a warlord pedophile ‘prophet’ named Muhammad who used plagiarized scripture to fight an eternal war for world conquest. It was never a religion. It was a battle plan hidden inside a political version of heavenly aspiration. Slaughter the infidel and screw all the virgins you want in the afterlife.

The people who write the contemporary atheist scripture in the west love to equate all violence committed by religious people as proof that if there’s any truth to the concept of Original Sin, religion is itself that primal sin. The original sin is human nature, which can never be made pure but can be moderated and inspired by a beneficent God.

It took a long time from its origins as a philosophy for atheism to acquire political power over large populations. We’ve had about 100 years of atheists’ performance as leaders of men and women. Their record is abysmal, abominable. The death tolls achieved against their own civilian populations by Hitler, Mussolini, Franco, Stalin, Mao, Castro, and Pol Pot beggar anything wrought purposely against feudal serfs by Europe’s divine-right Christian kings. Not all religions are created equal, it turns out. The superiority of the Judeo-Christian model in terms of its evolution toward personal and economic Liberty has been unique in human history. Plenty of bloodshed along the way? Yes. But the faith which underpinned those experiments with government created the Enlightenment, a sea change in human consciousness which recognized human nature as a given that could be educated to reduce believers’ violence against one another and to expand their opportunities to provide better futures for their children. What actually guided the most prodigious breakthroughs in science from Galileo and Leonardo to Einstein, Pasteur, Fulton, and Henry Ford? Judeo-Christian altruism.

The Establishment of Establishment Science

Science did not set out to be a God killer. It became one by replacing its original mission with a set of operating rules that bureaucratized its processes for inquiry, credentialing, and proof. Because they could not measure a metaphor with calipers, its practitioners forgot how to detect disturbing patterns in their orthodoxies, which became increasingly promiscuous in applying, re-applying, and ultimately abusing the metaphors they acquired from Darwinian evolutionary theory. Randomness. Entropy and its by-product of mutations. Layers in which incredibly long ages are recorded. Toothbrushes and dental picks as the preemption of creative generalizations and hypotheses. Infinite parsings in place of common sense.

What did they miss? A lot. You could say, without overstating it, that they missed the whole blooming and continually renewing forest for a laboratory case filled with fossilized tree rings. They didn’t learn from new metaphors that were available from other branches of knowledge. They got lazy and sloppy, forgetting even their own rules at critical moments in their research.

They forgot Newton’s dictum that for every action, there is an equal and opposite reaction. They became myopically fixated on their narrative of natural history as a continuous falling apart of states of relative equilibrium into extinction events and barren epochs of time so uninteresting that they crushed any suspicion of order, meaning, or direction in the process over the long term. They saw the inherent state of the Big Thing as chaos punctuated by entropic catastrophes precipitating rapid adaptations and “survival of the fittest” conflicts between mutated gene sets.

They pushed their own heads so far up their asses that the most important scientific theory of all time wound up nakedly exposed as a farce by what is presently regarded as the Last Word on their field of study.

Look at it. No design. Accompanied by a perfect symbol of design. Self rebuttal in one pic.

We’ve got a Terminator’s Arm Paradox (TAP) here. Do you remember the problem?

Movies that matter are all still from the 20th Century. Terminator surfaced science’s ‘magic’ trick.

No, I’m not afraid to reference popular movies in discussing the Big Thing. Science occupies the power positions in culture now, because its practitioners have succeeded in building a wall around their unscientific resistance to interrogation by commoners. Their behavior as a pillar of establishment thinking is more like that of clerics in a religious orthodoxy than anyone seems to recognize. What is described as consensus science is riddled with inconsistencies and impossibilities which are defended from interrogation by the most suspect of all responses: arrogance, denial, and ridicule. Typically, any challenge to the orthodoxy is immediately dismissed as illegitimate/ignorant and then attacked by dragging the argument, lawyer-like, into the weeds of peer-reviewed pronouncements where laymen cannot hope to follow in the only approved syntax. But there’s no need to fight on the terrain they choose, and I won’t do it. Where movies become intriguingly relevant. Questions that can’t be asked of physics and biology professors can be posed in mass entertainments. All the critical questions are fair game on the silver screen. The contradictions embedded in the fatal logic of scientific tap-dancing about the nature of reality are brought to the fore in movies about robots, psychic phenomena, and time travel. Where the Terminator’s Arm Paradox is significant.

In Terminator 2, the killer cyborg played by the Arnold returns from the Future to save the man whose birth he was originally sent to prevent. The new villain is an even more powerful Terminator than he is, and Arnold dies in the process of destroying him. His final act is to destroy his own body in order to forestall any problems that might be created by his anomalous technology. Except he fails in this act of human (ahem) sacrifice. One of his future-tech arms remains behind after his self-immolation. The arm is preserved, studied, reverse-engineered, and becomes the basis of the doom-pregnant technology of SkyNet, the AI computer system that will seek to replace Mankind with machines.

And there it is, in the case pictured above, the perfect symbol of the magic trick performed by Richard Dawkins, the reigning Grand Inquisitor of the Neo-Darwinian Evolutionary Fiction (excuse me… Theory [trans. Fact]). 

Do you see it yet? There is no conceivable source of the technology represented by the Terminator arm. It just appears out of nowhere, a plot device to push the narrative along. In the movies the means of concealing the paradox is the audience’s willing suspension of disbelief and the pace of cleverly edited action sequences. Nothing to see here… Move along… How does every evolutionist from Darwin to Dawkins conceal the illusion they are depending on? By the opposite of movie pacing, the immense scale of elapsed time over which the miracles of change from amoebas to high-tech man have occurred. A variety of Terminator’s Arm miracles are hidden in the vastness of the time terrain.


Magical Thinking

Look again at the title and cover of Dawkins’s The Blind Watchmaker. The watch is exactly like the Terminator’s Arm. Where did his watch enter the continuum of a process driven by random change in a reality ruled by the ineluctable force of entropy? Entropy — lest we forget — is the eternal “falling apart” of everything from mountains to eco-systems to food chains of species to civilizations to, yes, gods and religions, leaving only the unassailable tower of pristine, anti-magical Science behind. 


But where does the effing watch come from? This is not a trivial point. It is the make or break point of the whole secular assumption set that has enabled mankind (specifically, the smart ones) to take the place of gods and reshape the entire reality in which 6 to 8 billion people are expected to live and function as productive cogs in a rational, man-made system.


There are many questionable aspects of the Evolution story. Parts of the story are right, of course. The population of individual species of plants and animals does change. Species go extinct. New ones emerge. That’s called macro-evolution. It has some big problems associated with it, but the observable artifacts of macro-evolution are undeniable. The changes within species that might lead to distinctly new species are called micro-evolution, and these are a trickier question. We can force micro-evolution through selective breeding, as we have done with dogs, for example, but as varied as domestic dog breeds are, they still belong to the same species; that is, they are able to breed with one another, size differences aside. When breeds are allowed to reproduce on their own through time, they quickly revert to a generic ‘dog’ we would all recognize from the strays we’ve seen on back streets. The irony of using animal husbandry to verify the concept of micro-evolution is, however, yet another example of TAP. Creating specialized dog breeds is a function of conscious design, which is the one thing that has to be completely excluded from the Darwinian Big Picture.


Design. It keeps sticking its cold wet nose into the discussion.


Individual esthetic perfections created by endless epochs of random genetic mutation?

What the orthodox scientists insist on is that there is no design. There is no design. There is no design. Yet when they are explaining the specific survival attributes of individual species, they describe them to lay audiences in terms of design. They pile up nods of assent for the value of forward-facing eyes for mammalian predators, the purpose of wildly divergent fang configurations for snakes, and the reproductive advantage of colorful plumage in male birds vs. drab coloration in females. Then they conclude such logical exercises in persuasion by removing purpose and design in one sweeping sentence. “There is no design.” The exact same way Dawkins concludes his crowning title metaphor by removing the watchmaker from his description of the development of successful species.

Things they will not deal with: the illogic in the cosmological physics orthodoxy of time being a one-dimensional vector rather than a multiplication by infinity of the three-dimensional world it rests on top of in the dimensional hierarchy. The jumps from one dimension to two and from two to three multiply the universe by infinity. But the fourth dimension reverts to the lowly line. Even though physicists have leapfrogged that particular obstacle by promoting string theory, in which up to four or five more dimensions above and beyond time are necessary to explain certain phenomena observable in real world experiments. They want us to believe that black holes, dark matter, and dark energy exist even though no one has ever seen any of them up close. At the same time they rule out the existence of ghosts, Bigfoot, and extraterrestrial abductions even though there is a greater preponderance of evidence for them (massive anecdotal reportage consistent across a witness population widely separated by geography, ethnicity, and timeframes) than for the invisible phenomena they use to fill holes in their calculations. In other words, they are kind of making shit up, pretending they know more about a bunch of stuff than they really do, while insisting that anything they can’t strap down on a gurney in a laboratory here on earth does not exist.

DNA is another hot button question mark scientists don’t talk about except in the most guarded ways. They get scornful if anyone mentions that one of the discoverers of DNA (Francis Crick) insisted that the simplest creatures on earth have more sophisticated DNA than they could possibly need, which argues that this essential ingredient of all plant and animal life is an import from outside earth, probably the output of a seeding endeavor by a highly intelligent entity or entities unknown. Design rearing its head again. Interestingly, up to 98 percent of the human genome consists of what’s been dismissed as “junk DNA” with no identifiable (i.e., not yet identified) purpose. An excerpt from articles at this link:

Blow it up if you want to start a long frustrating trip into the weeds…

I haven’t even gotten into the quantum theory mess here. I’ll try to leave it alone for now.

So what does all this scientific dissimulation and disinformation have to do with the Big Thing and the Big Picture that’s warming up a gigantic confrontation for us between prevailing world views and the underlying reality? What does all this have to do with consciousness, sex, and Artificial Intelligence?

The Rise and Fall of Dominant Metaphors

The ultimate irony is that the mandated scientific view of the Big Question is based on obsolete metaphors that distort and omit key elements of the real Big Picture the consensus narrative is leading us into.

The refusal to identify the biggest hole in the scientific narrative is a gross act of negligence by the geniuses in charge. The subconscious motive for the refusal is the narcissism of scientists who want to replace god(s) in the human equation with themselves. The reason they cannot allow design into the equation is that design affirms a highly superior intelligence at work in the real narrative. A pre-existing super-intelligence capable of design, baked into the cake. Which would put them back in the ordinary human canoe without a paddle. Their own insistence on entropy as the prime mover in the universe is an absurdity. If entropy is that omnipresent, then the fact that creative events occur at all absolutely requires an equally powerful and pervasive opposing force we’ll call Syntropy. It puts things together while Entropy is taking them apart. If the Big Bang was the original Entropic flying apart of all the stuff in the universe, Syntropy would have been needed to put together the stars, planets, moons, asteroids, and comets, not to mention coalescing the dark matter and dark energy into physical forces affecting the mathematics we still can’t decide is a product of human or universal intelligence, either of which preempts the design-free narrative their own pretensions to divinity depend on.

Without Syntropy, there is no design. There is also, vitally, no explanation for the miraculous intricacy of the human brain and the interactions between that brain and the body it uses to perceive, describe, and interact with the physical world. 

Without Syntropy, there is no creativity. Just, you know, the falling apart model. Which is, to be direct, a very powerful metaphor of its own. But molecules of flesh do not fall apart one tiny mutation at a time for thousands of years to turn accidentally into an eye that can see. Not just once but at least five separate and independent times and ways in the history of zoology. As if the concept of seeing is baked into the cake and matter must be molded to provide it. Something identifies the objective of seeing and designs the means to make it.

Without Syntropy, there is no consciousness, which is the supreme defining attribute of the human mind; that is, the coordinated interaction of brain and body to perceive, physically experience, and respond to the consequences of that experience by making decisions before, during, and after physical and emotional events.

And without consciousness there may be simple reactions to experience, but there will be no abstract thoughts or gradations of emotional responses to events and their consequences. There will be no calculus of ordering thought as a preventive against undesirable consequences.

Conscious human mental life is the most complex phenomenon yet discovered in the universe, and science with its outdated metaphor base can’t even agree on what consciousness is. Why we are very (extremely) far from being ready to sign over large chunks of human operating systems to fancy machines called computers and their incompetent models of human brain function.

About 50 years ago, a brilliant Princeton psychologist named Julian Jaynes identified another Terminator’s Arm in the history of human development. His book The Origin of Consciousness in the Breakdown of the Bicameral Mind argues that consciousness is not hard-wired into the human brain but the output of a cultural process spanning generations, one that moves from the simple act of naming things to the development of objective descriptive words that gradually acquire resonant connotations and jump-start what we now call self awareness and conscious thought. Jaynes believed that it’s possible to observe the pre-conscious mind in ancient writings, notably in “The Iliad” and in the Bible’s Book of Amos. The specifics are fascinating, but what matters here is that consciousness is, in Jaynes’s model, a variable, not a constant. It can be created and expanded, and it can also therefore be lost. It is therefore another suggestion that the machine model is obsolete.

The brain is not just a finite organic machine that goes along pretty well until it starts falling apart with disease or old age. For most animals the brain coordinates the senses and body functions without creative thinking. Jaynes stresses that human beings don’t need consciousness for most of daily life. We don’t need it for memory, task attentiveness, reactive physical exertions, reproduction, or repetitive work routines. We need it for decision making and imaginative responses to unexpected events. In the pre-conscious era, decision-making was effected by auditory hallucinations of ancestors and gods generated in the right hemisphere’s counterpart of Wernicke’s area. Then, about 1500 BC, a series of catastrophes shattered cultural stability all over the known world, the voices of the gods went away, and man developed a thinking self to adapt and survive. That region of the brain went silent, though if electrically stimulated it will still generate indecipherable voices talking in your head.

Reading the Jaynes book is a massive, revelatory thought starter. (I referenced it as an ‘unacceptable viewpoint’ in my graphic work ST99.) It was unacceptable enough to the Darwinian scientists that Jaynes died before receiving the recognition he still deserves. In his day, the behavioral psychologists had performed the old magic trick of setting consciousness aside because it could not be counted or measured, and they studied human decision-making in terms of Skinner boxes and the like. The Machine model was thus applied to conceal the unsolvable mystery of where reflective consciousness entered the picture, along with human creativity. The most important takeaway from Jaynes here is that by finding a significant TAP he has shown us a critical hole in the human narrative. His criteria demonstrate that before certain milestones of culture were achieved, what we take for granted as enlightened consciousness could not have existed at all. Evolutionary machine magic has hidden this from us.

Machine. There’s a word. And it’s especially fascinating that it has a role to play in the Darwinian description of how Evolution works. The way to see it is as a long, slow (very) process of random genetic mutations that gradually change species for the better or worse. Periodically, nature takes a hand with violent events that eradicate some species and (somehow) spin off others into new species. The process proceeds — love that phrase — through geologic and climatic epochs and ages that we can track in the fossil record through layers of earth and rock sediment. Throughout, the evolutionary Time Machine (i.e., Dawkins’s ‘Watch’) keeps ticking out tiny random mutations which are retained or not, leading us up to the present day. Implausible metaphor? Not if you look at the history of the Industrial Revolution.

The pocket watch becomes a gentleman’s personal appliance around the turn of the 19th Century, and it is immensely popular among vest wearers by the middle of that century, when Origin of Species was researched and published.

Professors wore them to be on time for class. They were a coming of age 
symbol. I inherited two gold ones, one from each of my grandfathers.

The mechanical symbolism of the pocket watch is invaluable for its association with time itself and for the incremental, one-increment-at-a-time production of units. The model of machine functioning, which gave us Henry Ford’s assembly line, rests on the linear postulate that one unit’s worth of input equals one unit’s worth of output at the other end, contributing to a sum of however much finished goods inventory you want based on the number of input units. Even the lowly watch spring had its moment to shine in Evolution’s own evolution. It stored the energy needed for some finite number of ticks. Where’s the brain here? In the little winding stem up top (omitted in the Dawkins graphic) that decides when to inject new energy into the process.

When the fossil record grew large enough, paleontologists discovered periods in which a great many new species appeared almost all at once in the grand scheme of time, the most famous being the Cambrian Explosion. Eons later, a colossal meteor hit ended the Age of Dinosaurs and, Bingo, there were all kinds of new rivals to replace them up and down the food chain. The ruling Pope of Evolution before Dawkins set about finding an explanation for the sudden nonlinear behavior of the species-creating system, which was that catastrophe, an external infusion of energy into the process, had kick-started a new burst of mutational activity.
Without too much effort we might view this phenomenon as the rewinding of a watch that had been shocked into losing its stored energy. (My metaphoric interpretation of course, but see if it doesn’t appeal in terms of logic.) Perhaps the environmental increase in ambient energy associated with climatic and geologic shocks has been absorbed into the mutation-making machine called Evolution. Thus was Neo-Darwinian Theory born. An ironclad rule of Evolutionist physics: the need for a new explanation that enables students and grant seekers to keep clinging to a clunky metaphor will always result in an explanation that will be seen as scientifically acceptable.

Sorry for the level of detail. But this linear machine model worked well enough in the 19th century and the first half of the 20th. Henry Ford’s assembly line turned Detroit into the manufacturing capital of the world. Then it nearly ran Ford, General Motors, and Chrysler out of business. An emerging science of “nonlinear dynamics,” born out of the ascendant metaphor of the computer as a model of brain function, speedily made the machine metaphor obsolete for most applications. 

I chose the word ‘speedily’ because the chief contribution of computers to the narrative of how things work was their exponentially greater capacity for speed of operation. As they were implemented in post-WW2 industry, they could perform mathematical computations and transactions much, much faster than mechanical adding machines. They generated output so quickly and in such quantity that it became possible to detect patterns in output never noticed before. It became obvious that what machine metaphors had described as a unified single process was in fact more complicated than that: a collection of multiple processes that generated output at different rates, output that had to be stored in buffers until the main process was ready to use it. There were corresponding increases in error rates (e.g., bad input data). Not that big a deal in the computer programming world, but a critical one in the manufacturing world.

Assembly lines now consisted of many more steps and many more, much faster machines, also generating errors at an increasing clip. The biggest error, the fatal one, was that all the much more complicated machines along the assembly line were still measuring their own efficiency in terms of speed (e.g., parts per minute), just as the entire assembly line was (e.g., completed units of product per hour), and were rated for cost effectiveness accordingly, using obsolete accounting measures. In this fashion they failed altogether to see that the only valid measure of efficiency was the dollar value of the total resources consumed by the entire assembly process, including the floor space, materials, and employee time wasted by building huge volumes of excess work-in-process inventory created by running machines faster than the line needed them to run. Wrong measurement targets also increased error rates for the line as a whole: more defective units of finished product caused by undetected defects in work-in-process that had been placed in the queue hours, days, or even months earlier.

In other words, the assembly line was a linear process only if you insisted on regarding it as one. In fact, during the transition from 1919 Model T’s to 1976 Cadillacs it had become a nonlinear set of sequenced interdependent processes working at different speeds with wildly varying rates of productivity. The solution, discovered by Japanese auto manufacturers, was that the ideal speed of an assembly line should be equal to the speed of the slowest machine in the line. The critical performance factors in this new model of efficiency were governed by a concept called “sensitive dependence on initial conditions.” Small changes in the operating parameters at the start of the process could and did result in massive changes in overall system performance. A new discipline called just-in-time manufacturing represented a junking of the machine metaphor as it had been defined and used uncritically since the beginning of the Industrial Revolution early in the 19th century.

Slowing the assembly line to its slowest step was only the beginning, of course. The real measure of system efficiency in the JIT world was the lead time and resources expended in the output of the whole assembly line. As with species development, manufacturing lead time had previously been merely a calculation, not a focus of study. A number of new system disciplines were devised to make the process itself the target of continuous improvement. (I’ve taught these in person to executives and factory floor employees; they are simple, they work, and they improve both quality and cost effectiveness.)
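If you want to see the arithmetic for yourself, here is a minimal Python sketch of the principle. The station speeds and shift length are invented toy numbers, not drawn from any client engagement: it simply shows that pacing releases to the slowest station leaves finished output untouched while the work-in-process pile disappears.

```python
# Toy model of a three-station line. Each station can process at most
# 'rate' parts per minute; whatever it can't handle piles up in front
# of it as work-in-process (WIP).
def simulate(minutes, release_rate, station_rates):
    buffers = [0] * len(station_rates)    # WIP queued ahead of each station
    finished = 0
    for _ in range(minutes):
        buffers[0] += release_rate        # raw units released into the line
        for i, rate in enumerate(station_rates):
            done = min(buffers[i], rate)  # a station can't beat its own speed
            buffers[i] -= done
            if i + 1 < len(buffers):
                buffers[i + 1] += done    # hand off to the next station
            else:
                finished += done          # last station completes the unit
    return finished, sum(buffers)

rates = [8, 3, 6]          # parts per minute; station 2 is the bottleneck
for release in (8, 3):     # flat-out release vs. paced-to-slowest release
    out, wip = simulate(480, release, rates)   # one 8-hour shift
    print(f"release {release}/min -> finished {out}, WIP left on the floor {wip}")
```

With these toy numbers, both runs finish 1,440 units; the flat-out line just buries itself under 2,400 units of half-built inventory while doing it. And note the sensitive dependence: change the bottleneck rate from 3 to 4 and every number downstream changes with it.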

Successes like the one in manufacturing require constant vigilance and self-correcting disciplines throughout. You learn to measure different things and change your targets accordingly. 

What is striking at this moment in time is that the computer metaphor has replaced the machine in many human analytical activities and measures, but not in the worlds of science, academe, or government. Worse, the computer metaphor itself is already obsolete when it is applied to modeling the processes of the human brain/body connection known correctly as the Mind.

I have written previously about the failures of the computer metaphor in terms of the new Shiny Thing called ‘Artificial Intelligence.’ (A couple of those posts are linked below at the end of the post for your perusal.) 

But we are not yet done with the legacy of the machine metaphor in science as a whole, a closely held and guarded academic discipline.

The Prevalence of Machine Thinking in Science and its Costs

An irony to bear in mind: Engineering, in terms of computer hardware manufacturing, has largely abandoned the machine metaphor, while the computer science of software design and implementation has, despite false starts in the past, maintained and extended the machine metaphor to its own detriment and ours. This is why the computer pressed into service as a brain metaphor is inherently defective.

[I know that those of you who are still with me are tapping your feet and waiting impatiently for the Sex part. It’s coming, and it’ll be great, but I’m saving it for last. Are you linear or nonlinear in your approach to life? I’m doing the linear thing, one word after another, to give you a path through a complicated set of related material. I have to write this in a certain order; I never said you have to consume it in that order. You’re you. Do what you like…]

What’s defective? There’s a circularity-of-design problem. The history of computer technology is by no means free of the constraints and errors contained in Evolutionary Theory. Computer software is still overwhelmingly machine-like, despite the fancy new oxymoron called Artificial Intelligence. As a brain metaphor, computers are superior to the Evolutionary machine metaphor because their applications to real world transactions do recognize the ways that information can be manipulated, intentionally altered by design, and used to create new perspectives and better decisions. But the brain that computer designs are modeled on is still based on the machine metaphor inherited from Evolutionary theory.

Evolutionary history is a series of fossil layers separated by time and catastrophe, each layer subject to specialized study in isolation and connected to the others principally by the overall patterns imposed by the analyst who is organizing, interpreting, and collating them. The tyranny of dental picks and brushes remains, the macro view subjugated by the micro in terms of research approaches. (Still measuring the wrong things, like the parts-per-hour performance of individual assembly line machines.) The only patterns they recognize are the ones that reinforce the unitary model of the assembly line. (In this case, oddly anthropomorphic for academics who keep speaking disparagingly of flaws in the human animal.) The Big Picture for Darwin, Gould, and Dawkins is to provide the narrative of how we got from the amoeba to Leonardo Da Vinci. Which is, of course, an implied purpose they would deny having, even if it leads them to illogical inferences.

For example, a battle has long been raging in Evolutionary science about whether birds were an independent emergence or an outgrowth of dinosaurs. It appears the dinosaur advocates have won the argument, at least for the moment. They have the dental-pick discovery of Archaeopteryx, which is dinosaur-looking in skeletal ways but has feathers.

All we have of the first bird.

Archaeopteryx was so exciting that it led to fanciful new images of big dinosaurs with feathers and colorful avian plumage. This from Evolution autocrats who will icily inform you that chaos theory is irrelevant as an argument against antique Evolutionary assumptions, although it is chaos theory which identifies feathers as an embedded template of deep order, like leaf shapes and seashell configurations, which recur repeatedly in our un-designed flora and fauna. Where did those templates come from? More TAPs.

No. Dinosaurs developed feathers via retained random mutations, and little dinos with feathers achieved flight by tiny random mutations that gave them much larger brains, much, much lighter bones, and the highly specialized beaks and wings they would need to survive as birds in another million years. The fossils of true dinosaurs presumed capable of flight were laid aside in this intramural skirmish, principally because pterodactyls and pterosaurs are a mystery unto themselves. The problem of how they actually managed flight with heavy dinosaur bones remains unsolved, because an unsolved mystery is less dangerous than the hint that flying might be like seeing: a functional property that pre-existed the mutations and drove them to organic fruition via multiple independent lines of descent in different species.

What carried the day, finally, was the narrative — amoeba to Da Vinci — that made assembly line sense. Reptiles, then birds, then mammals, then men… And great for sales! Birds are living dinosaurs, more attractive than crocodiles and cockroaches as winners of the survival-of-the-fittest sweepstakes. Never mind how pterodactyls flew. Never mind how bird bones got so amazingly lighter in weight, given that animals with decreasing bone density, not yet capable of flight on half-developed wings, might be at a significant survival disadvantage for several hundred thousand years. Time heals/conceals all anomalies.

Evolution has even bigger problems with microbiology and cell evolution, never satisfactorily addressed in the pocket watch world of Dawkins, but these are definitely in the weeds. What matters is the assembly line mentality that describes the human brain as a reptilian layer, a bird layer (in females… ), a mammalian layer, and a human thinking layer sitting on top of the pile. It is postulated that the brain metaphorically operates — that is, “drives” — the body from its crow’s nest perch on top of the skeleton. And the body is always described as a machine, a natural cyborg consisting of a bony structure containing food handling organs, muscle mass for movements, and sensory collection and processing appliances. Layers again. An exterior layer of skin collecting input from the atmosphere and protecting the inner layers of nerve cells, muscles, bone, and specialized devices within, which enable breathing, digestion, excretion, and reproduction. Kind of like a more elaborate watch with an outer case, a face, inner cogs and gears, and a bigger winding stem on top to keep the whole thing running.

How modern medicine has gotten itself into so much trouble. Physicians have their own dental pick problem of focusing on research targets in isolation and over-generalizing, missing holistic patterns they’re not looking for if those patterns involve intelligence below the cranial area. Result? Pharmacology that treats symptoms in isolation with little regard for broader interactions between mind and body. Surgery that is still skillful high-tech butchery, the surgeon a mechanic (okay, a jeweler) called in to fix the broken watch whose design he understands chiefly in terms of replacement parts catalogues and tiny hammers that make most problems look like tiny nails. Not his fault. It is simply that, like the scientists in the fields of Evolution, cosmology, psychology, and chemistry, the initial condition we are all sensitively dependent on is an ‘expertise’ that presumes to know a great deal more about the variables in the real world than it does in fact. When experts claim the mantle of godhood in terms of their relations with the people they claim to be serving, they make huge mistakes and cover their own insecurities with dictatorial behavior and bureaucratic walls against criticism.

Why the brain-as-computer metaphor is so dangerously flawed. It shares science’s universal secular bias against reality as a product of design. They believe reality is flawed and they are the fixers. This leads them into catastrophic instances of illogic and absurd practice. And we’re not supposed to raise our hands and invoke the smell test.

The biggest inherent advantage of the brain-as-computer metaphor is that it automatically includes the design factor. Everything in computer software is the product of code written by a programmer. That’s why it can use an incredibly primitive alphabet of two characters (bit/no bit) to perform large complicated processing tasks, store and format incredibly large volumes of information, and transmit that information to worldwide networks of compatible machines. It can do all of this at, literally, the speed of light. Beats the pocket watch and the Model-T assembly line all to hell as a way of understanding how the human brain accomplishes its own computational and information storage tasks.
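To make the two-character alphabet concrete, here is a minimal sketch, assuming nothing beyond Python’s standard library (my toy, not anyone’s production code). Any message can be flattened into 1s and 0s and recovered intact, but only because a designed code (UTF-8 plus a fixed 8-bit framing) sits at both ends:

```python
# Flatten text into the two-character alphabet and back. The "design factor"
# is the agreed-upon code standing at both ends of the wire.
def to_bits(text: str) -> str:
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits: str) -> str:
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

msg = "Let there be variation."
bits = to_bits(msg)
print(bits[:32], "...")           # the whole message is just this, continued
assert from_bits(bits) == msg     # recoverable only because the code was designed
```

Strip away the designed code and the same string of bits means nothing at all. Which is the point.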

It’s just that its other weaknesses are so profound that the computer metaphor is actually existentially dangerous if we don’t understand the incorrectability of its flaws. 

Go up and take a look at the top graphic in this post. The hunter-gatherers of Göbekli Tepe who started building their temple some 12,500 years ago knew something all the PhDs in science, mathematics, medicine, and computer science have forgotten. There is a supernatural power — that is, a power above nature — that designed and created nature, set its laws in perpetuity, and also created every living being on earth and will still own its creation long after all of us, our children, and our grandchildren are dead. That power does intervene on occasion and deliver both blessings and punishments to the lives of the beings on earth. Every part of the creation is tied to every other and part of the same whole. No man or woman or product of same can ever be a god. That will never change.

Not long ago, in revisiting the dean of statistical process control, the legendary Dr. W. Edwards Deming, I realized that one of his prime assertions was in fact a modern-day restatement of God’s first words in Genesis:



I flashed on God commanding, “Let there be variation.” Indistinguishable from the Biblical ‘Let there be Light,’ the simple edict dismissed by secularists as a poetic metaphor for what can be viewed as an obvious first step of creating a world from scratch. If the nothing that obtained before the command was darkness in any sense, then “Let there be variation” would have introduced light as a variation from darkness without replacing it. Indeed, everything we do know of follows from the fact that variation is perhaps the prime directive of physics. No two objects, however defined at any scale, are identical. That’s the inborn creativity of even an infant universe. This thing may be like that thing, but it is not that thing. Newton’s mission restated: “Let us explore and expand the potentials of that difference in the infinity of time(s) to come.”
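For the statistically curious, here is a hedged little sketch of the Shewhart/Deming idea (invented numbers, not Deming’s data): sample a stable process, and no two measurements are identical, yet nearly all of them fall inside the three-sigma control limits. Variation is the default; only the points outside the limits signal a special cause worth chasing.

```python
# Simulate 50 measurements of a stable process and compute Shewhart-style
# three-sigma control limits. Points inside the limits are the process
# "being itself" (common-cause variation); points outside flag special causes.
import random
import statistics

random.seed(1)
measurements = [random.gauss(10.0, 0.1) for _ in range(50)]

mean = statistics.mean(measurements)
sd = statistics.stdev(measurements)
lower, upper = mean - 3 * sd, mean + 3 * sd

print(f"mean={mean:.3f}  control limits=({lower:.3f}, {upper:.3f})")
print("identical measurements:", len(measurements) - len(set(measurements)))  # 0
print("special causes:", [round(x, 3) for x in measurements if not lower <= x <= upper])
```

Fifty samples, no two alike, and the limits tell you which differences deserve attention. “Let there be variation” indeed.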

The Scientific Method built the box it would ultimately imprison itself in. For understandable and pragmatic reasons. The error was that examining things in isolation that could be counted or measured led to the inference that what could not be counted or measured in isolation did not exist in any relevant way and must be ignored in the laboratory. So-called soft sciences came along later to address the glaring holes in this inference, but they came late enough that they fell into the grip of Evolution’s machine metaphor.

Psychologists described the human mind as having discrete but connected parts suspiciously akin to the physiologists’ description of the reptilian, mammalian, and human layers. Thus, Freud described the Ego, Superego, and Id. Others described the processes of the brain in terms of drives: survival, sex, the pleasure drives of the senses. Electronic technologists described a hierarchy of wave categories in brain activity: Alpha, Beta, Delta, Gamma, Theta. Carl Jung, the only one to break out of the isolationist box, described the Ego (consciousness), the Personal Unconscious (the subconscious), and the Collective Unconscious (accessible species-level memory). None offered any particular grammar for describing the interactions of their mind parts on any continuous basis. The assumption seemed to be that the parts operated independently for the most part and collided occasionally, resulting in illness, meaning neurosis or psychosis.

Like physicians, psychologists were focused on instances where the brain was not working properly and needed intervention. Thus, the question of how healthy brains go about their business has been neglected except by motivational speakers and writers of books about male-female relationships. (The sex is coming… I promise.)

The general neglect of brain function has led to incompetent intelligence testing and outrageous assertions about things that cannot be counted or measured except clumsily at a distance. Poorly designed and politically compromised IQ testing has concealed a prolonged and rapidly accelerating decline in human intelligence, including levels of consciousness. The assertion that human beings use only 10 percent of their brains on average is absurd, dead wrong, and an incitement to computer programmers who have their own convenient delusions about capacity measurement.

The long and the short of it is that people who should know better are seriously underestimating the realizable power of the human brain and what it is doing in even the least (seemingly) accomplished among us. And computer designers have completely failed to understand that their attempts to imitate brain function are using less than 1 percent of the relevant data to build their doomed models of human mentation. Digital symbols of reality do not account for variation, and because of the bit/no-bit character set they regard all data in terms of units, not individual ‘datums.’ Why, for example, they can name but never experience sensory and emotional data. Possibly the most absurd delusion in computerland is that human programmers are able to build a human-like electronic brain that can function as well as or better than a human brain after they have switched off their keyboards and gone home to watch the performance. Thus, the product of deliberate, uncoordinated design efforts can be passed off as a proto-conscious decision-making system, as if it were the natural by-product of technological evolution and therefore smarter than its makers.

Oh. One more fatal flaw in computer brain simulations. Programmers are human beings. They cannot observe and understand brain function in isolation because a thing cannot study itself and see all of it at once. That’s a built-in disqualifying bias. They also have not learned from Evolutionist mistakes. Tiny genetic mutations do not accidentally turn a dinosaur into a bird. Software “patches” are the mutations of the digital world. They almost always fail, sometimes in catastrophic ways. (Ask the honchos who are still trying to fix the deadly problems of the Boeing 737 MAX.) Variables that are not anticipated by programmers cannot be dealt with successfully 100 percent of the time. Variations are infinite, but computers deal successfully only in finite data sets. In planes, we will always need human pilots to carry human passengers. Why computers as presently conceived can never run the world. The best computer on the design boards is to a competent human being what Robby the Robot is to Leonardo Da Vinci (or Michael Jordan) in their peak years.
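Here is a toy illustration of that finite-set problem, a hypothetical example I made up (it resembles no real avionics code): the program is flawless inside the cases its programmer enumerated, and helpless one step outside them.

```python
# The programmer enumerated the flight phases she expected. Inside that
# finite set the program never errs; one step outside, it has no rule at all.
# Every patch enlarges the set; no patch makes it infinite.
def flap_setting(phase: str) -> int:
    settings = {"takeoff": 15, "climb": 5, "cruise": 0, "landing": 30}
    return settings[phase]        # fails the moment reality improvises

print(flap_setting("cruise"))     # 0 -- works inside the anticipated set
try:
    print(flap_setting("windshear escape"))   # a case nobody enumerated
except KeyError as unanticipated:
    print("no rule for:", unanticipated)
```

A human pilot improvises. The lookup table throws an exception.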

Implementation of Artificial Intelligence on a massive scale will lead to a civilization-threatening catastrophe. Even the modest implementations in place are causing the antique layers of computer infrastructure dating all the way back to the 1960s to creak, slow down, and time out with increasing frequency. Even the early implementations are power hogs weighing on the old infrastructure like solar panels installed on a roof that wasn’t designed to bear such massive weight. Ambitious patches have already made user interfaces less friendly and frequently obstructive. It has already taken me longer to write this than it would have six months ago because AutoCorrect’s new AI ‘enhancements’ interpret a two-letter typo at the beginning of a word as an invitation to insert whole replacement words it has identified as personal to me. These replacements are always wrong, sometimes hard to detect during brisk phases of writing, and time consuming to hunt down after the fact and fix. This kind of unforced computer error is replicating itself everywhere, and it can only get worse as Tek-Lords feed appetites that are bigger than their stomachs.

And a lot of us don’t want to see the perils lurking before us. We are obsessed with other things.

The Secular Dead End

This is the prospect we are facing now. We have just passed the hundredth anniversary of a civilizational crisis that has served to damage individual and therefore societal consciousness in ways that have been leading us back to 1500 BC. That’s the date, in the Jaynesian hypothesis, when the proto-conscious mind-brain organization he called the bicameral mind ceased to work and had to be replaced by a new, more future-oriented kind of self awareness. The day to day, reactive form of living was unable to deal creatively with rapid change and combined natural/economic convulsions. More powerful individuals were necessary, aware of why they must carry on and develop whatever new ideas could improve chances of survival for themselves, their families, and their descendants.

Why, in Jaynes’s definition, the real purpose of what we call consciousness is the ability to make personal decisions based on foreseeing the consequences of our behaviors. This required a mind space that could look ahead to hypothetical futures and imagine them vividly enough to devise preventive steps and persevere in taking those steps even when they prove inconvenient or onerous. In other words, the sea change in mental life that occurred just that recently was the creation in each person of a clear-eyed observer who could watch what the drive-dominated physical person was doing and judge it in big picture terms.

In 1976, Jaynes was careful to avoid the obvious religious implications and historical connections of the 1500 BC turning point he had identified. The date correspondence he spent the most time on was the archaeological speculation about the timing of the Trojan War, which probably occurred about a thousand years before Homer committed the oral legends about that conflict to the written word in ~600 BC. His discovery, instantly true when he draws our attention to it, is that the actors in Homer’s “Iliad” are way (way) different from the characters we read about in the literature we have inherited in the millennia since.

There is no real sign of interior life in the characters of the Iliad. They are usually described in terms of appearance, particularly their physical accessories — helmets, weapons, horses, chariots — and their actions. Emotional descriptions are extreme and mostly primitive: “He was wroth…” There is no dialogue to speak of, mostly declarations and proclamations, as if recited ready-made. When a decision of some kind is required, it is made by interaction with some God, not by sitting down on a rock and working it out in conversation with peers, subordinates, or spouses. Even the meter of the Iliad is a reflection of rhythms from nature. The phenomenon of “speaking in tongues,” official name glossolalia, has actually been found to coincide in most recorded instances with Iliadic dactylic hexameter. (In school, I had to learn how to read the Roman imitation of the Iliad, the Aeneid, in the correct dactylic hexameter. As you read it out loud, the meter gets easier and you can hear your own voice reading it automatically. Lovely but kind of eerie.)

Jaynes does reference the Bible’s Book of Amos as being a close cousin of the Iliad. The oldest book of the prophets, Amos is written as if it is received wisdom being spoken by the page to the reader. In other words, Amos is a step in the progression toward consciousness but not quite there. Interesting in that a commonly accepted date for the writing down of the Old Testament is 1500 BC. And the historical record verifies that the moral component of religion as a prime purpose, added to the ancient requirements of fealty, fear, obedience, and worship, also dates to this epoch in recorded history/mythology. Why the Ten Commandments are still the basis of humane societies, however their ideas are represented in other scriptures.

With many ups and downs, the moral guidance provided by the major religions required individuals to evaluate their own behavior with an eye to the potential costs of moral disobedience in an eternal afterlife. Pretty strong incentive to be good. Then the world turned, as it does, in its grand cycles, and that explains the relevance of the timeline above.

We have covered the subversive role of science, with its emphasis on random, essentially meaningless change effected by entropic chemical processes over a period of terrestrial time too long and intimidating to comprehend. Then came the shock that changed thousands of years’ worth of human development. World War I.

Call it the Coming of Age of secular atheism. In that terrible conflict, whose phosgene gas atrocities presaged the atomic devastation of Hiroshima, the ivory towers born of the Christian era’s Enlightenment realized that Mankind was truly mortal. We didn’t need God to punish us with a Great Flood. We could and probably would punish ourselves to death simply by being who we’d always been under the surface. There was no divine justice in a war that killed 22 million people in response to the assassination of a minor aristocrat in 1914. Reliance on religion was a delusion, and the doom of humankind was in the cards. Time to rework all our philosophies, arts, and politics. Time to grow up and accept the lonely despair which is the only rational state of an intelligent mind.

Revolutions duly occurred in all these realms. Religions did not go away, but they “evolved,” and the faithful of every denomination began a philosophical balancing act that could not be maintained in perpetuity. The early 20th Century version of the Internet, a storehouse of human learning called the Encyclopedia Britannica, began to concede that it was impossible to prove that Jesus Christ had ever existed, let alone preached, died on a cross, and been resurrected to save all Mankind.

There were skirmishes. The religious belatedly resisted the scientific imposition of Evolution as a replacement for Genesis. A play called “Inherit the Wind” put Clarence Darrow on Darwin’s side in the courtroom, where he made God, the defendant, look foolish under cross-examination.

The balancing act. Accepting the secular science in real-world material terms while continuing to abide by religious precepts in family and personal life. Over time, it was the religious component of this divided world view that became closeted, half-apologized for when it came up in conversations with “intellectuals,” who increasingly included clerics educated at seminary to disbelieve the historicity of the Bible and use its quaint metaphors in service to humanitarian community and political causes.

Surveys kept reporting that Americans still believed in God. But did they? There’s a whole genre of movies in which devout Christians lose a child to violence or ravaging disease and respond by blaming and denouncing God. The ‘happy ending’ is usually the striking of a separate peace, a resigned acceptance that we can’t know the purposes of God if he exists, and that we probably should respond by starting a new philanthropic organization of some kind.

The bottom line? Fewer and fewer people really believe in an afterlife or the eternal life of the human soul. It survives as a yearning, perhaps, but not a pillar of our identities.

More bottom line. The power of responsibility to the future as a fundamental driver of our personal decisions is ebbing away. Which means that the most obvious proof of impaired consciousness is the shortening of the timeframes individuals use as a basis for assessing the present and planning for the future. 

Compare. Elected President in 1932, FDR promised to fix everything and end the Great Depression. By 1936, the economy was in approximately the same shape it was back in ’32. FDR was re-elected because people thought he was trying and were willing to trust him one more time. In 1940, FDR got re-elected again because the world was in a mess (everybody could see that, and dance with the guy that brought you…), and in 1944 re-elected on his deathbed because we were at war and…?

As I said, compare. Having survived the colossal error of electing a President on his deathbed in 2020, the nation re-elected a man who had been a better President by far than the dead one, because he promised to close the border, deport 20 million illegal aliens, end runaway inflation, make the tax reductions of his first term permanent, reform the corrupt practices of an overweight government that existed solely for its own benefit, and pursue a foreign policy that put American citizens first, ahead of the freeloaders in Europe and Asia. Less than a year later, he has delivered substantially on all those promises. Result? The number one topic in the public debate is the probability that the President’s party will lose its majorities in the Senate and House next year because the people who voted for him aren’t happy about the lack of progress so far.

The record of the opposition party is similar in its shortsightedness. Since its platform consists of automatically opposing everything done or proposed by the President, this means they have been forced (i.e., forced themselves) to side with illegals against citizens, violent criminals against law enforcement employees, abortion advocates and practitioners against babies, genocidal Islamic radicals against Jews, blatantly corrupt state and city governments against taxpayers, and totalitarian globalists against the sovereign independence of the United States government, as stipulated in its 237-year-old Constitution. How can this be? They can no longer envision consequences beyond a few months or so. Why the Trump Curse takes so many of them out. Not his personal revenge, but their own crippled imaginations.

The people who still remember the selves who voted for the President just a year ago are being squeezed from the top and bottom by consciousness-impaired secularists of slightly different stripes. There is a submerged 10 to 30 percent minority of citizens who are demographically very similar to the MAGA supporters they hate so fervently. They are ill-informed or uninformed, and all they “know” for sure is that the President is evil, senile, perverted, and wrong about absolutely everything. They want him dead. I’ve visited their home base in recent months.


They’re squeezing from the bottom, toward riots and impeachments and assassinations. The squeezers from the top are not just Democrats but a constituency consisting of Dems, RINOs, MINOs (MAGA in name only), and a huge mass of people who can only be described as The Distracted. Who are also consciousness-impaired. Starting with the Lower Slobbovians, I have termed this population the LSDRM+++ bloc.

The Distracted

Ah yes. The headline here is Sex. Other subheads will be mentioned in passing, the most important of which is Race, but Sex takes precedence for reasons I will explain as we go.

The Shiny Thing is the most useful kind of tool to attract attention away from the Big Thing and hypnotize the consciousness-impaired. We have several different Shiny Things working wonders in the Sex Department.

One of them is the clickbait leader. That would be the Epstein Files. The other two are more dangerous, simmering diversions from common sense decision making. These are the Abortion Thing and the Gender Thing.

Epstein. 


The blue dress is the ultimate Clinton symbol. First with Monica Lewinsky and her stained frock, then with Bill reclining as a showpiece of the Epstein Island harem. The flap about releasing the files to the public has never been about justice. It’s about voyeurism. The practical impact of the document tranches will be more negative than positive if legal accountability is your motive for demanding to see them. Epstein is dead. His Island Inferno is shut down. His chief accomplice and consort is in prison. The victims whose cases did not result in prosecutions will not be obtaining redress in courtrooms. The perpetrators who have not been charged will not be. Some of the guilty will be humiliated by association, appearing in photos where they shouldn’t have been present but can’t be proven to have broken any laws. Some of the innocent will also be humiliated, appearing in photos with Epstein or on his plane (or what might be his plane because there are females present). Still others will be dragged in by crude and/or sophisticated Photoshop creations. It won’t matter which, because in the tabloid world, the appearance of guilt is conviction in public. The only avenue of escape is to be implicated by an AI fake, because all the ballyhoo about AI sophistication is overstated. It cannot create artificial reality convincingly (and everyone knows it).

The only reality that matters here is that everyone who seeks out Epstein photographs is a Peeping Tom. Pick any photo you like, fake or not. Nothing good will come of your having seen it. The ones who also seek the documentation looking for hints of wrongdoing, hearsay accusations, and circumstantial slanders are not only Peeping Toms but hypocrites. I can’t believe the standard spiels of the Lower Slobbovians who are deeply emotional in their self-righteous damnations of people they’ve never met based on no evidence whatever. That’s consciousness impairment at a psychotic level.

Why am I so offended by showy outrage about the Epstein Files? Because there are a few hundred thousand victims of the abduction and sexual trafficking that went on for all four years of the Biden administration with nary a squeak of protest from the Peeping Toms. Worse in my mind are the people who declare themselves MAGA dropouts and advocates of Bondi impeachment over the Epstein Files, when they have no excuse for not knowing the following:


The Abortion Thing. Still having its subterranean impact on the voting choices of single(?) females who think their anatomy entitles them to an automatic license to kill. They actually need to keep the issue alive as a national one despite the fact that the Supreme Court has made it a local issue, which gives every citizen more control of the governing laws in their communities than Roe did. The inability to imagine the 30 million missing human beings with unloved lives is prima facie proof of impaired human consciousness. All I’ll add on this subject is two graphics:

The shirt is one of many. So is the fetus from my ST99 project.

The Gender Thing. No, it’s a Sex Thing, pure and simple. Akin to the Abortion Thing in that it refuses to take into account the collateral damage to real people caused by its absurd attention-getting gambits. The narrative has it that it’s not strictly another feminist power grab because it also claims to be about advancing the rights of homosexuals, bisexuals, and transgenders of both sexes, not to mention the racial and ethnic minorities who are also not members of the white male patriarchy, which must be done in at all costs. Oops. Right. It is another feminist power grab, one that uses all the other flagged initials to isolate and sideline white males who are sexually interested in attractive women. It’s everybody against the white guys, including even furries and woke pedophiles who were never invited to Epstein Island.

Collateral casualties? Trad women (ugh) who don’t like transitional TGs in their bathrooms and locker rooms. Female jocks (grow up, girl!) who object to getting seriously injured by delusional men with silicone breasts competing against them in female sports. Minor children being lobbied by feminist school boards to learn about sexual fetishes in their elementary school libraries and even take one for the team by submitting to surgical castration and mutilation before they’re even old enough to have sex. Various children and adults targeted for mass shootings or assassination by Transgenders who still haven’t attained the Utopian ideal of thinking about nothing at all but how others regard them.

Lies, Damned Lies, and Statistics. This is all three. The Official Narrative.


The four-shooters figure is a blatant falsehood, enabled by mass media who routinely conceal the TG identities, especially in school shootings. Interestingly, if you perform the calculation needed to determine the percent-incidence of shooters among TGs vs. “Cisnormals” (eew!), here’s the number of shootings there would be once the population disparity is adjusted for.


As the number of such shootings increases, as it would under any honest reporting, that adjusted number would increase dramatically.
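For anyone who wants to run the adjustment themselves, here is the arithmetic as a minimal Python sketch. The function name is mine and the numbers are placeholders, not real data; substitute the counts from the graphics above.

```python
# Hypothetical sketch of a per-capita rate adjustment. The placeholder
# numbers below are NOT real data -- substitute the counts from the
# graphics above.

def adjusted_count(group_shooters: float, group_population: float,
                   reference_population: float) -> float:
    """Scale one group's shooter count to a reference population size,
    so two groups of very different sizes can be compared head to head."""
    per_capita_rate = group_shooters / group_population
    return per_capita_rate * reference_population

# Placeholder example: 4 shooters in a group of 1 million people,
# scaled to a reference population of 300 million.
print(adjusted_count(4, 1_000_000, 300_000_000))  # -> 1200.0
```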

As a final point on the Gender Thing, women are still complaining mightily about the patriarchy, but here’s where the smell test comes in. Have you noticed how many women are visibly in charge of so many government/academic/elite institutions compared to even a decade ago? Prosecutors and judges (including SCOTUS), mass media executives and on-camera talent, presidents of prestigious universities, blue state governors, big city mayors, chiefs of police and fire departments in major cities, cabinet secretaries (Trump admin included), and rising political media magnates in both parties. Feels like the patriarchy is already on the run, except for the handful of most hated men in the country, whom more women than men want pushing up daisies for the most fantastic imaginary crimes against, uh, women.

These are the distractions that occupy more time and attention in the public discourse than anything else I’ve mentioned in this essay. Think about that. Nothing much you can do about any of these headline grabbers, is there?

Other lesser distractions?

Europe is gone. Muslim terror states all.

Ukraine is gone. A Third World wasteland whether they continue their war or not.

Russia is what it’s been since the Soviet Union fell. A poor country with a lot of nuclear weapons. 

China will keep playing its long game. The same one they’ve been playing since the Emperor Ch’in took over a couple hundred years before Christ. A giant panda bewildering the world but mostly staying at home, munching on bamboo and still producing remarkably few offspring.

Israel will survive. Islam will stagger on till their oil runs out, then melt back into the sands with millions of casualties as their sole legacy.

All that’s left is the important stuff. The soul of Mankind. And the Big Question.

Summary

So where are we? More precisely, where am I in my appraisal of our prospects in 2026 and beyond?

I think we’re headed for a reckoning, our own counterpart of 1500 BC. The catalyst, barring unpredictable natural and enemy-created disasters, will be the misguided premature implementation of what the hoaxsters call Artificial Intelligence. (See the two posts linked below for a focused assessment of this technology.) There is likely to be an early crisis with an AI stock market bubble exacerbated by serious breakdowns in our computer infrastructure. That would actually be good news if it slows down or shuts down the far more ambitious investments in the offing.

Very large scale implementation of Artificial Intelligence will collapse the global economy and create a Depression so profound it will be called a Dark Age. No one is anticipating this because we have lost our individual and collective ability to foresee consequences beyond even a few months in the future. When only a shrinking minority are making their decisions based on a sense of responsibility to an eternal moral imperative, the tipping points of chaos will arrive with great suddenness and wreak great ruin.

I have been concerned about the dangers of declining consciousness since 1976, when I read Jaynes and added 2 + 2 beyond his own conclusions. I had occasion to speak with him on the phone shortly before he died. I ran my principal tenet past him: if consciousness is a variable, one variation among the organizations of mind possible in the human brain, then it is also possible that the kind of modern consciousness you describe can be lost. Am I wrong about that? “No,” he said. “You’re right.”

Where we are, in my opinion: moving at breakneck speed toward a tipping point engineered by scientists who are making the same kind of mistake the Democrat Party is making right now. They are so determined to cement their position as technological replacements for a God they see as outmoded and dumber than they are that they fail to see they are victims of a delusion that will destroy them and the rest of us too.

As the man said, “You can’t fix stupid.”

As to matters that sit in the far future (as our myopic, self-obsessed souls view the world), the outcome of the 2026 midterms cannot be predicted at all. How many rounds of polls will be administered to capture the “this moment, right now” feelings of voters before November of the new year? Neo-proto-conscious LSDRM+++’ers could be fat and sassy come Election Day or so pissed off by the last electric bill that they want their own congressional version of that Mamdani joker. You pick it. I won’t even go on record about Alabama vs. Indiana on New Year’s Day.

From the Matrix. I have my own coincidental connection to the movie. Here.


Happy New Year to you too.


HELPFUL LINKS:




