The mechanical symbolism of the pocketwatch is invaluable for its association with time itself and for the incremental, one-increment-at-a-time production of units. The model of machine functioning, which gave us Henry Ford's assembly line, rests on the linear postulate that one unit's worth of input equals one unit's worth of output at the other end, contributing to a sum of however much finished-goods inventory you want based on the number of input units. Even the lowly watch spring had its moment to shine in Evolution's own evolution. It stored the energy needed for some finite number of ticks. Where's the brain here? In the little winding stem up top (omitted in the Dawkins graphic) that decides when to inject new energy into the process.
When the fossil record grew large enough, paleontologists discovered a period they called the Cambrian Explosion, in which a great many new species appeared almost all at once in the grand scheme of time. Much later, a colossal meteor strike ended the Age of Dinosaurs and, bingo, there were all kinds of new arrivals to replace them up and down the food chain. The ruling Pope of Evolution before Dawkins set about explaining the sudden nonlinear behavior of the species-creating system: catastrophe, an external infusion of energy into the process, had jump-started a new burst of mutational activity.
Without too much effort we might view this phenomenon as the rewinding of a watch that had been shocked into losing its stored energy. (My metaphoric interpretation, of course, but see if it doesn't appeal in terms of logic.) Perhaps the environmental increase in ambient energy associated with climatic and geologic shocks has been absorbed into the mutation-making machine called Evolution. Thus was Neo-Darwinian theory born. An ironclad rule of Evolutionist physics: the need for a new explanation that enables students and grant seekers to keep clinging to a clunky metaphor will always result in an explanation that will be seen as scientifically acceptable.
Sorry for the level of detail. But this linear machine model worked well enough in the 19th century and the first half of the 20th. Henry Ford’s assembly line turned Detroit into the manufacturing capital of the world. Then it nearly ran Ford, General Motors, and Chrysler out of business. An emerging science of “nonlinear dynamics,” born out of the ascendant metaphor of the computer as a model of brain function, speedily made the machine metaphor obsolete for most applications.
I chose the word 'speedily' because the chief contribution of computers to the narrative of how things work was their exponentially greater capacity for speed of operation. As they were implemented in post-WW2 industry, they could perform mathematical computations and transactions much, much faster than mechanical adding machines. They generated output so quickly and in such quantity that it became possible to detect patterns in output never noticed before. It became obvious that what machine metaphors had described as a unified single process was in fact more complicated than that: it was a collection of multiple processes that generated output at different rates, output that had to be stored in buffers until the main process was ready to use it. There were corresponding increases in error rates (e.g., bad input data). Not that big a deal in the computer programming world, but a critical one in the manufacturing world.
Assembly lines now consisted of many more steps and many more, much faster machines, also generating errors at an increasing clip. The biggest error, the fatal one, was that all the much more complicated machines along the assembly line were still measuring their own efficiency in terms of speed (e.g., parts per minute), just as the entire assembly line was (e.g., completed units of product per hour), and were given a rating of cost effectiveness thereby, in accordance with obsolete accounting measures. In this fashion they failed altogether to see that the only valid measure of efficiency was the dollar value of the total resources consumed in the entire assembly process, including the floor space, materials, and employee time wasted by building huge volumes of excess work-in-process inventory created by running machines at speeds faster than the line needed them. Excess cost associated with wrong measurement targets was also increasing error rates for the line as a whole: more defective units of finished product caused by undetected defects in work-in-process placed in the buffer hours, days, or even months earlier.
In other words, the assembly line was a linear process only if you insisted on regarding it as one. In fact, during the transition from 1919 Model T's to 1976 Cadillacs it had become a nonlinear set of sequenced, interdependent processes working at different speeds with wildly varying rates of productivity. The solution, discovered by Japanese auto manufacturers, was that the ideal speed of an assembly line should be equal to the speed of the slowest machine in the line. The critical performance factors in this new model of efficiency were governed by a concept called "sensitive dependence on initial conditions." Small changes in the operating parameters at the start of the process could and did result in massive changes in overall system performance. A new discipline called just-in-time manufacturing represented a junking of the machine metaphor as it had been defined and used uncritically since the beginning of the Industrial Revolution early in the 19th century.
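Here is a minimal sketch in Python of what that discovery means, with made-up station speeds (station B is the invented bottleneck). Run every machine flat out and the line finishes no more product than the slowest machine allows; it just strands the surplus as work-in-process. Pace the whole line to the slowest machine and you get the same finished output with almost no WIP.

```python
# A minimal sketch of the bottleneck lesson. The station rates are invented
# for illustration; station B is the slowest machine in the line.

RATES = [60, 25, 40]   # parts per hour for stations A, B, C
HOURS = 8

def run_line(rates, hours, paced=False):
    """Simulate a serial line hour by hour; return (finished units, stranded WIP)."""
    speeds = [min(rates)] * len(rates) if paced else rates
    buffers = [float("inf")] + [0.0] * (len(rates) - 1)  # buffers[0] = raw stock
    finished = 0.0
    for _ in range(hours):
        produced = []
        for i, speed in enumerate(speeds):
            done = min(speed, buffers[i])     # can't process more than is waiting
            buffers[i] -= done
            produced.append(done)
        for i, done in enumerate(produced):   # hand output downstream after the hour
            if i + 1 < len(buffers):
                buffers[i + 1] += done
            else:
                finished += done
    return finished, sum(buffers[1:])         # WIP excludes the infinite raw stock

for paced in (False, True):
    finished, wip = run_line(RATES, HOURS, paced)
    label = "paced to slowest machine" if paced else "every machine flat out"
    print(f"{label:>26}: finished={finished:5.0f}  WIP stranded={wip:5.0f}")
```

Same throughput either way; the difference is the mountain of inventory. That is the whole JIT argument in a page of arithmetic.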
Slowing the assembly line to its slowest step was only the beginning, of course. The real measure of system efficiency in the JIT world was the lead time and resources expended in the output of the whole assembly line. As with species development, manufacturing lead time had been merely a calculation, not a focus of study. A number of new system disciplines were devised to make the process itself the target of continuous improvement. (I've taught these in person to executives and factory floor employees; they are simple, they work, and they improve both quality and cost effectiveness.)
Successes like the one in manufacturing require constant vigilance and self-correcting disciplines throughout. You learn to measure different things and change your targets accordingly.
What is striking at this moment in time is that the computer metaphor has replaced the machine in many human analytical activities and measures, but not in the worlds of science, academe, or government. Worse, the computer metaphor itself is already obsolete when it is applied to modeling the processes of the human brain/body connection known correctly as the Mind.
I have written previously about the failures of the computer metaphor in terms of the new Shiny Thing called 'Artificial Intelligence.' (A couple of those posts are linked at the end of this post for your perusal.)
But we are not yet done with the legacy of the machine metaphor in science as a whole, that closely held and guarded academic discipline.
The Prevalence of Machine Thinking in Science and Its Costs
An irony to bear in mind: Engineering, in terms of computer hardware manufacturing, has largely abandoned the machine metaphor, while the computer science of software design and implementation has, despite failed false starts in the past, maintained and extended the machine metaphor to its own detriment and ours. This is why the computer pressed into service as a brain metaphor is inherently defective.
[I know that those of you who are still with me are tapping your feet and waiting impatiently for the Sex part. It’s coming, it’ll be great but I’m saving it for last. Are you linear or nonlinear in your approach to life? I’m doing the linear thing, one word after another to give you a path through a complicated set of related material. I have to write this in a certain order; I never said you have to consume it in that order. You’re you. Do what you like…]
What's defective? There's a circularity-of-design problem. The history of computer technology is by no means free of the constraints and errors contained in Evolutionary Theory. Computer software is still overwhelmingly machine-like, despite the fancy new oxymoron called Artificial Intelligence. As a brain metaphor, computers are superior to the Evolutionary machine metaphor because their applications to real world transactions do recognize the ways that information can be manipulated, intentionally altered by design, and used to create new perspectives and better decisions. But the brain that computer designs are modeled on is still based on the machine metaphor inherited from Evolutionary theory.
Evolutionary history is a series of fossil layers separated by time and catastrophe, each layer subject to specialized study in isolation and connected to the others principally by the overall patterns made by the analyst who is organizing, interpreting, and collating them. The tyranny of dental picks and brushes remains, the macro subjugated by the micro view in terms of research approaches. (Still measuring the wrong things, like the parts-per-hour performance of individual assembly line machines.) The only patterns they recognize are the ones that reinforce the unitary model of the assembly line. (In this case, oddly anthropomorphic for academics who keep speaking disparagingly of flaws in the human animal.) The Big Picture for Darwin, Gould, and Dawkins is to provide the narrative of how we got from the amoeba to Leonardo Da Vinci. Which is, of course, an implied purpose they would deny having, even when it leads them to illogical inferences.
For example, a battle has long been raging in Evolutionary science about whether birds were an independent emergence or an outgrowth of dinosaurs. It appears the dinosaur advocates have won the argument, at least for the moment. They have the dental-pick discovery of Archaeopteryx, which is dinosaur-looking in skeletal ways but has feathers.
All we have of the first bird.
Archaeopteryx was so exciting that it led to fanciful new images of big dinosaurs with feathers and colorful avian plumage. This from Evolution autocrats who will icily inform you that chaos theory is irrelevant as an argument against antique Evolutionary assumptions, although it is chaos theory which identifies feathers as an embedded template of deep order, like leaf shapes and seashell configurations, which recur repeatedly in our un-designed flora and fauna. Where did those templates come from? More TAs.
No. Dinosaurs developed feathers via retained random mutations, and little dinos with feathers achieved flight by tiny random mutations that gave them much larger brains, much, much lighter bones, and the highly specialized beaks and wings they would need to survive as birds in another million years. The fossils of true dinosaurs presumed capable of flight were laid aside in this intramural skirmish, principally because pterodactyls and pterosaurs are a mystery unto themselves. The problem of how they actually managed flight with heavy dinosaur bones remains unsolved, because leaving it unsolved is less dangerous than the hint that flying might be like seeing: a functional property that pre-existed the mutations and drove them to organic fruition via multiple independent lines of descent in different species.
What carried the day, finally, was the narrative — amoeba to Da Vinci — that made assembly line sense. Reptiles, then birds, then mammals, then men… And great for sales! Birds are living dinosaurs, more attractive than crocodiles and cockroaches as survivors of the fittest sweepstakes. Never mind how pterodactyls flew. Never mind how bird bones got so amazingly light, given that animals with decreasing bone density, not yet capable of flight on half-developed wings, might be at a significant survival disadvantage for several hundred thousand years. Time heals/conceals all anomalies.
Evolution has even bigger problems with microbiology and cell evolution, never satisfactorily addressed in the pocketwatch world of Dawkins, but these are definitely in the weeds. What matters is the assembly line mentality that describes the human brain as a reptilian layer, a bird layer (in females… ), a mammalian layer, and a human thinking layer sitting on top of the pile. It is postulated that the brain metaphorically operates — that is, "drives" — the body from its crow's nest perch on top of the skeleton. And the body is always described as a machine, a natural cyborg consisting of a bony structure containing food handling organs, muscle mass for movement, and sensory collection and processing appliances. Layers again. An exterior layer of skin collecting input from the atmosphere and protecting the inner layers of nerve cells, muscles, bone, and specialized devices within, which enable breathing, digestion, excretion, and reproduction. Kind of like a more elaborate watch with an outer case, a face, inner cogs and gears, and a bigger winding stem on top to keep the whole thing running.
How modern medicine has gotten itself into so much trouble. Physicians have their own dental pick problem of focusing on research targets in isolation and over-generalizing about holistic patterns they're not looking for if they involve intelligence below the cranial area. Result? Pharmacology that treats symptoms in isolation with little regard for broader interactions between mind and body. Surgery that is still skillful high-tech butchery, the surgeon a mechanic (okay, a jeweler) called in to fix the broken watch whose design he understands chiefly in terms of replacement parts catalogues and tiny hammers that make most problems look like tiny nails. Not his fault. Simply that, like the scientists in the fields of Evolution, cosmology, psychology, and chemistry, the initial condition we are all sensitively dependent on is that their 'expertise' presumes to know a great deal more about the variables in the real world than it does in fact. When experts claim the mantle of godhood in terms of their relations with the people they claim to be serving, they make huge mistakes and cover their own insecurities with dictatorial behavior and bureaucratic walls against criticism.
Why the brain as computer metaphor is so dangerously flawed. It shares science’s universal secular bias against reality as a product of design. They believe reality is flawed and they are the fixers. This leads them into catastrophic instances of illogic and absurd practice. And we’re not supposed to raise our hands and invoke the smell test.
The biggest inherent advantage of the brain as computer metaphor is that it automatically includes the design factor. Everything in computer software is the product of code written by a programmer. That's why it can use an incredibly primitive alphabet of two characters (bit/no bit) to perform large, complicated processing tasks, store and format incredibly large volumes of information, and transmit that information to worldwide networks of compatible machines. It can do all of this at very nearly the speed of light. Beats the pocketwatch and the Model-T assembly line all to hell as a way of understanding how the human brain accomplishes its own computational and information storage tasks.
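A minimal sketch of that two-character alphabet at work, in Python. The message is arbitrary, but the round trip is exact: everything below the surface is nothing but bit/no bit.

```python
# A minimal sketch: store and recover a sentence using only two symbols.

def to_bits(text: str) -> str:
    """Encode text as a string of 0s and 1s (8 bits per UTF-8 byte)."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Recover the original text from the 0/1 string."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

message = "Beats the pocketwatch all to hell."
bits = to_bits(message)
print(bits[:64], "...")            # nothing but the two characters
assert from_bits(bits) == message  # and yet the message survives intact
```

Two symbols, endlessly recombined, and nothing is lost in transit. No gears required.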
It’s just that its other weaknesses are so profound that the computer metaphor is actually existentially dangerous if we don’t understand the incorrectability of its flaws.
Go up and take a look at the top graphic in this post. The hunter-gatherers of Göbekli Tepe who started building their temple more than 11,000 years ago knew something all the PhDs in science, mathematics, medicine, and computer science have forgotten. There is a supernatural power — that is, a power above nature — that designed and created nature, set its laws in perpetuity, and also created every living being on earth and will still own its creation long after all of us, our children, and our grandchildren are dead. That power does intervene on occasion and deliver both blessings and punishments to the lives of the beings on earth. Every part of the creation is tied to every other and part of the same whole. No man or woman or product of same can ever be a god. That will never change.
Not long ago, in revisiting the dean of statistical process control, the legendary Dr. Deming, I realized that one of his prime assertions was in fact a modern-day restatement of God’s first words in Genesis:

I flashed on God commanding, "Let there be variation." Indistinguishable from the Biblical "Let there be light," the simple edict dismissed by secularists as a poetic metaphor for what can be viewed as an obvious first step in creating a world from scratch. If the nothing that obtained before the command was darkness in any sense, then "Let there be variation" would have introduced light as a variation from darkness without replacing it. Indeed, everything we know of follows from the fact that variation is perhaps the prime directive of physics. No two objects, however defined at any scale, are identical. That's the inborn creativity of even an infant universe. This thing may be like that thing, but it is not that thing. Newton's mission restated: "Let us explore and expand the potentials of that difference in the infinity of time(s) to come."
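Since Deming is the one who put variation at the center, here is a minimal sketch of his discipline in Python. The process is invented (a machined part with a nominal dimension), but the two lessons are his: sample any process twenty-five times and no two measurements come out identical, and the Shewhart-style 3-sigma control limits then tell you which variation to leave alone and which to investigate.

```python
# A minimal sketch of Deming-style statistical process control. The process
# itself is invented: a machined part with a nominal 100.0 mm dimension.
import random

random.seed(1)
parts = [random.gauss(100.0, 0.2) for _ in range(25)]   # 25 sample measurements

mean = sum(parts) / len(parts)
sigma = (sum((x - mean) ** 2 for x in parts) / len(parts)) ** 0.5

# Shewhart/Deming 3-sigma control limits: variation inside the limits is
# common cause (leave the process alone); a point outside is special cause.
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
print(f"mean={mean:.3f} mm   limits=({lcl:.3f}, {ucl:.3f})")

# And the Genesis point: no two measurements are ever identical.
print("identical pairs:", sum(x == y for i, x in enumerate(parts)
                              for y in parts[i + 1:]))
```

Deming's point, and Genesis's: the variation is built in; wisdom is knowing which of it to act on.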
The Scientific Method built the box it would ultimately imprison itself in. For understandable and pragmatic reasons. The error was that examining things in isolation that could be counted or measured led to the inference that what could not be counted or measured in isolation did not exist in any relevant way and must be ignored in the laboratory. So-called soft sciences came along later to address the glaring holes in this inference, but they came late enough that they fell into the grip of Evolution’s machine metaphor.
Psychologists described the human mind as having discrete but connected parts suspiciously akin to the physiologists' description of the reptilian, mammalian, and human layers. Thus, Freud described the Ego, Superego, and Id. Others described the processes of the brain in terms of drives: survival, sex, the pleasure drives of the senses. Electronic technologists described a hierarchy of wave categories in brain activity: Alpha, Beta, Delta, Gamma, Theta. Carl Jung, the only one to break out of the isolationist box, described the Ego (consciousness), the Personal Unconscious (the subconscious), and the Collective Unconscious (accessible species-level memory). None offered any particular grammar for describing the interactions of their mind parts on a continuous basis. The assumption seemed to be that the parts operated independently for the most part and collided occasionally, resulting in illness, meaning neurosis or psychosis.
Like physicians, psychologists were focused on instances where the brain was not working properly and needed intervention. Thus, the question of how healthy brains go about their business has been neglected except by motivational speakers and writers of books about male-female relationships. (The sex is coming… I promise.)
The general neglect of brain function has led to incompetent intelligence testing and outrageous assertions about things that cannot be counted or measured except clumsily at a distance. Poorly designed and politically compromised IQ testing has concealed a prolonged and rapidly accelerating decline in human intelligence, including levels of consciousness. The assertion that human beings use only 10 percent of their brain on average is absurd, dead wrong, and an incitement to computer programmers who have their own convenient delusions about capacity measurement.
The long and the short of it is that people who should know better are seriously underestimating the realizable power of the human brain and what it is doing in even the least (seemingly) accomplished among us. And computer designers have completely failed to understand that their attempts to imitate brain function are using less than 1 percent of the relevant data to build their doomed models of human mentation. Digital symbols of reality do not account for variation and, because of the bit/no bit character set, regard all data in terms of units, not individual 'datums.' Why, for example, they cannot experience but only name sensory and emotional data. Possibly the most absurd delusion in computerland is that human programmers are able to build a human-like electronic brain that can function as well as or better than a human brain after they have switched off their keyboards and gone home to watch the performance. Thus, the product of deliberate, uncoordinated design efforts can be passed off as a proto-conscious decision-making system, as if it were the natural by-product of technological evolution and therefore smarter than its makers.
Oh. One more fatal flaw in computer brain simulations. Programmers are human beings. They cannot observe and understand brain function in isolation because a thing cannot study itself and see all of it at once. That's a built-in disqualifying bias. They also have not learned from Evolutionist mistakes. Tiny genetic mutations do not accidentally turn a dinosaur into a bird. Software "patches" are the mutations of the digital world. They almost always fail, sometimes in catastrophic ways. (Ask the honchos who are still trying to fix the deadly problems of the Boeing 737 MAX.) Variables that are not anticipated by programmers cannot be dealt with successfully 100 percent of the time. Variations are infinite, but computers deal successfully only in finite data sets. In planes, we will always need human pilots to carry human passengers. Why computers as presently conceived can never run the world. The best computer on the design boards is to a competent human being what Robby the Robot is to Leonardo Da Vinci (or Michael Jordan) in their peak years.
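The finite-data-set problem fits in a few lines of Python. Everything here is invented for illustration (the sensor labels, the rules), but the shape of the failure is the real thing: the handler covers every input its hypothetical programmer anticipated and has nothing to say about the first variation nobody imagined.

```python
# A minimal, invented sketch of the finite-data-set problem. The rule table
# covers every sensor reading the (hypothetical) programmer anticipated.
KNOWN_READINGS = {
    "nose_up": "pitch down",
    "nose_down": "pitch up",
    "level": "hold course",
}

def autopilot_response(reading: str) -> str:
    rule = KNOWN_READINGS.get(reading)
    if rule is None:
        # The real world just produced a variation outside the finite set.
        return "NO RULE: hand control back to the human pilot"
    return rule

print(autopilot_response("nose_up"))           # anticipated, works
print(autopilot_response("sensors_disagree"))  # unanticipated, no rule exists
```

The table can always be extended after the fact, but the next unanticipated variation is always out there, which is the whole point.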
Implementation of Artificial Intelligence on a massive scale will lead to a civilization-threatening catastrophe. Even the modest implementations in place are causing the antique layers of computer infrastructure dating all the way back to the 1960s to creak, slow down, and time out with increasing frequency. Even the early implementations are power hogs that are weighing on the old infrastructure like solar panels installed on a roof that wasn't designed to bear such massive weight. Ambitious patches have already made user interfaces less friendly and frequently obstructive. It has already taken me longer to write this than it would have six months ago because AutoCorrect's new AI 'enhancements' interpret a two-letter typo at the beginning of a word as an invitation to insert whole replacement words it has identified as personal to me. These replacements are always wrong, sometimes hard to detect during brisk phases of writing, and time consuming to hunt down after the fact and fix. This kind of unforced computer error is replicating itself everywhere, and it can only get worse as Tek-Lords feed appetites that are bigger than their stomachs.
And a lot of us don’t want to see the perils lurking before us. We are obsessed with other things.
The Secular Dead End
This is the prospect we are facing now. We have just passed the hundredth anniversary of a civilizational crisis that has served to damage individual, and therefore societal, consciousness in ways that have been leading us back to 1500 BC. That's the date, in the Jaynesian hypothesis, when the proto-conscious mind-brain organization he called the bicameral mind ceased to work and had to be replaced by a new, more future-oriented kind of self-awareness. The day-to-day, reactive form of living was unable to deal creatively with rapid change and combined natural/economic convulsions. More powerful individuals were necessary, aware of why they must carry on and develop whatever new ideas could improve chances of survival for themselves, their families, and their descendants.
This is why, in his definition, Jaynes argued that the real purpose of what we call consciousness is the ability to make personal decisions based on foreseeing the consequences of our behaviors. This required a mind space that could look ahead to hypothetical futures and imagine them vividly enough to devise preventive steps and persevere in taking those steps even when they prove inconvenient or onerous. In other words, the sea change in mental life that occurred just that recently was the creation in each person of a clear-eyed observer who could watch what the drive-dominated physical person was doing and judge it in big picture terms.
In 1976, Jaynes was careful to avoid the obvious religious implications and historical connections of the 1500 BC turning point he had identified. The date correspondence he spent the most time on was the archaeological speculation about the timing of the Trojan War, which probably occurred some six centuries before Homer's oral legends about that conflict were committed to the written word in ~600 BC. His discovery, instantly convincing once he draws our attention to it, is that the actors in Homer's "Iliad" are way (way) different from the characters we read about in the literature we have inherited in the millennia since.
There is no real sign of interior life in the characters of the Iliad. They are usually described in terms of appearance, particularly their physical accessories — helmets, weapons, horses, chariots — and their actions. Emotional descriptions are extreme and mostly primitive: "He was wroth…" There is no dialogue to speak of, mostly declarations and proclamations, as if recited ready-made. When a decision of some kind is required, it is made by interaction with some God, not by sitting down on a rock and working it out in conversation with peers, subordinates, or spouses. Even the meter of the Iliad is a reflection of rhythms from nature. The phenomenon of "speaking in tongues," official name glossolalia, has actually been found to coincide in most recorded instances with Iliadic dactylic hexameter. (In school, I had to learn how to read the Roman imitation of the Iliad, the Aeneid, in the correct dactylic hexameter. As you read it out loud, the meter gets easier and you can hear your own voice reading it automatically. Lovely but kind of eerie.)
Jaynes does reference the Bible's Book of Amos as a close cousin of the Iliad. The oldest book of the prophets, Amos is written as if it is received wisdom being spoken by the page to the reader. In other words, Amos is a step in the progression toward consciousness but not quite there. Interesting in that a commonly accepted date for the beginning of the writing down of the Old Testament is 1500 BC. And the historical record verifies that the moral component of religion, added as a prime purpose to the ancient requirements of fealty, fear, obedience, and worship, also dates to this epoch in recorded history/mythology. Why the Ten Commandments are still the basis of humane societies, however their ideas are represented in other scriptures.
With many ups and downs, the moral guidance provided by the major religions required individuals to evaluate their own behavior with an eye to the potential costs of moral disobedience in an eternal afterlife. Pretty strong incentive to be good. Then the world turned, as it does, in its grand cycles, and that explains the relevance of the timeline above.
We have covered the subversive role of science, with its emphasis on random, essentially meaningless change effected by entropic chemical processes over a period of terrestrial time too long and intimidating to comprehend. Then came the shock that changed thousands of years worth of human development. World War I.
Call it the Coming of Age of secular atheism. In that terrible conflict, whose phosgene gas atrocities presaged the atomic devastation of Hiroshima, the ivory towers born of the Christian era's Enlightenment realized that Mankind was truly mortal. We didn't need God to punish us with a Great Flood. We could and probably would punish ourselves to death simply by being who we'd always been under the surface. There was no divine justice in a war that killed 22 million people in response to the assassination of a minor aristocrat in 1914. Reliance on religion was a delusion, and the doom of humankind was in the cards. Time to rework all our philosophies, arts, and politics. Time to grow up and accept the lonely despair which is the only rational state of an intelligent mind.
Revolutions duly occurred in all these realms. Religions did not go away, but they “evolved” and the faithful of every denomination began a philosophical balancing act that could not be maintained for multiple generations in perpetuity. The early 20th Century version of the Internet, a storehouse of human learning called the Encyclopedia Britannica, began to concede that it was impossible to prove that Jesus Christ had ever existed, let alone preached, died on a cross, and been resurrected to save all Mankind.
There were skirmishes. The religious belatedly resisted the scientific imposition of Evolution as a replacement for Genesis. A play called "Inherit the Wind" put Clarence Darrow on Darwin's side in the courtroom and let him make God, the defendant, look foolish under cross-examination.
The balancing act. Accepting secular science in real-world material terms while continuing to abide by religious precepts in family and personal life. Over time, it was the religious component of this divided world view that became closeted, half-apologized for when it came up in conversations with "intellectuals," who increasingly included clerics educated at seminary to disbelieve the historicity of the Bible and use its quaint metaphors in service to humanitarian community and political causes.
Surveys kept reporting that Americans still believed in God. But did they? There's a whole genre of movies in which devout Christians lose a child to violence or ravaging disease and respond by blaming and denouncing God. The 'happy ending' is usually the striking of a separate peace, a resigned acceptance that we can't know the purposes of God if he exists, and that we probably should respond by starting a new philanthropic organization of some kind.
The bottom line? Fewer and fewer people really believe in an afterlife or the eternal life of the human soul. It survives as a yearning, perhaps, but not a pillar of our identities.
More bottom line. The power of responsibility to the future as a fundamental driver of our personal decisions is ebbing away. Which means that the most obvious proof of impaired consciousness is the shortening of the timeframes individuals use as a basis for assessing the present and planning for the future.
Compare. Elected President in 1932, FDR promised to fix everything and end the Great Depression. By 1936, the economy was in approximately the same shape it was back in ‘32. FDR was re-elected because people thought he was trying and were willing to trust him one more time. In 1940, FDR got re-elected again because the world was in a mess (everybody could see that, and dance with the guy that brought you…), and in 1944 re-elected on his deathbed because we were at war and…?
As I said, compare. Having survived the colossal error of inaugurating a President on his deathbed in 2020, the nation re-elected a man who had been a better President by far than the dead one, because he promised to close the border, deport 20 million illegal aliens, end runaway inflation, make the tax reductions of his first term permanent, reform the corrupt practices of an overweight government that existed solely for its own benefit, and pursue a foreign policy that put American citizens first, ahead of the freeloaders in Europe and Asia. Less than a year later, he has delivered substantially on all those promises. Result? The number one topic in the public debate is the probability that the President's party will lose its majorities in the Senate and House next year because the people who voted for him aren't happy about the lack of progress so far.
The record of the opposition party is similar in its shortsightedness. Since its platform consists of automatically opposing everything done or proposed by the President, this means they have been forced (i.e., forced themselves) to side with illegals against citizens, violent criminals against law enforcement employees, abortion advocates and practitioners against babies, genocidal Islamic radicals against Jews, blatantly corrupt state and city governments against taxpayers, and totalitarian globalists against the sovereign independence of the United States government, as stipulated in its 237-year-old Constitution. How can this be? They can no longer envision consequences beyond a few months or so. Why the Trump Curse takes so many of them out. Not his personal revenge, but their own crippled imaginations.
The people who still remember the selves who voted for the President just a year ago are being squeezed from the top and bottom by consciousness-impaired secularists of slightly different stripes. There is a submerged 10 to 30 percent minority of citizens who are demographically very similar to the MAGA supporters they hate so fervently. They are ill-informed or uninformed, and all they "know" for sure is that the President is evil, senile, perverted, and wrong about absolutely everything. They want him dead. I've visited their home base in recent months.
They’re squeezing from the bottom, toward riots and impeachments and assassinations. The squeezers from the top are not just Democrats but a constituency consisting of Dems, RINOs, MINOs (MAGA in name only), and a huge mass of people who can only be described as The Distracted. Who are also consciousness-impaired. Starting with the Lower Slobbovians, I have termed this population the LSDRM+++ bloc.
The Distracted
Ah yes. The headline here is Sex. Other subheads will be mentioned in passing, the most important of which is Race, but Sex takes precedence for reasons I will explain as we go.
The Shiny Thing is the most useful kind of tool to attract attention away from the Big Thing and hypnotize the consciousness-impaired. We have several different Shiny Things working wonders in the Sex Department.
One of them is the clickbait leader. That would be the Epstein Files. The other two are more dangerous, simmering diversions from common sense decision making. These are the Abortion Thing and the Gender Thing.
Epstein.

The blue dress is the ultimate Clinton symbol. First with Monica Lewinsky and her stained frock, then with Bill reclining as a showpiece of the Epstein Island harem. The flap about releasing the files to the public has never been about justice. It's about voyeurism. The practical impact of the document tranches will be more negative than positive if legal accountability is your motive for demanding to see them. Epstein is dead. His Island Inferno is shut down. His chief accomplice and consort is in prison. The victims whose cases did not result in prosecutions will not be obtaining redress in courtrooms. The perpetrators who have not been charged will not be. Some of the guilty will be humiliated by association, appearing in photos where they shouldn't have been present but can't be proven to have broken any laws. Some of the innocent will also be humiliated, appearing in photos with Epstein or on his plane (or what might be his plane because there are females present). Still others will be dragged in by crude and/or sophisticated Photoshop creations. It won't matter which, because in the tabloid world, the appearance of guilt is conviction in public. The only avenue of escape is to be implicated by an AI fake, because all the ballyhoo about AI sophistication is overstated. It cannot create artificial reality convincingly (and everyone knows it).
The only reality that matters here is that everyone who seeks out Epstein photographs is a Peeping Tom. Pick any photo you like, fake or not. Nothing good will come of your having seen it. The ones who also seek the documentation looking for hints of wrongdoing, hearsay accusations, and circumstantial slanders are not only Peeping Toms but hypocrites. I can’t believe the standard spiels of the Lower Slobbovians who are deeply emotional in their self-righteous damnations of people they’ve never met based on no evidence whatever. That’s consciousness impairment at a psychotic level.
Why am I so offended by showy outrage about the Epstein Files? Because there are a few hundred thousand victims of the abduction and sexual trafficking that went on for all four years of the Biden administration with nary a squeak of protest from the Peeping Toms. Worse in my mind are the people who declare themselves MAGA dropouts and advocates of Bondi impeachment over the Epstein Files, when they have no excuse for not knowing the following:

The Abortion Thing. Still having its subterranean impact on the voting choices of single females who think their anatomy entitles them to an automatic license to kill. They actually need to keep the issue alive as a national one despite the fact that the Supreme Court has made it a local issue, which gives every citizen more control of the governing laws in their communities than Roe did. The inability to imagine the 30 million missing human beings with unloved lives is prima facie proof of impaired human consciousness. All I'll add on this subject is two graphics:
The shirt is one of many. So is the fetus from my ST99 project.
The Gender Thing. No, it's a Sex Thing, pure and simple. Akin to the Abortion Thing in that it refuses to take into account the collateral damage to real people caused by its absurd attention-getting gambits. The narrative has it that it's not strictly another feminist power grab because it also claims to be about advancing the rights of homosexuals, bisexuals, and transgenders of both sexes, not to mention the racial and ethnic minorities who are also not members of the white male patriarchy, which must be done in at all costs. Oops. Right. It is another feminist power grab, one that uses all the other flagged initials to isolate and sideline white males who are sexually interested in attractive women. It's everybody against the white guys, including even furries and woke pedophiles who were never invited to Epstein Island.
Collateral casualties? Trad women (ugh) who don't like transitional TGs in their bathrooms and locker rooms. Female jocks (grow up, girl!) who object to getting seriously injured by delusional men with silicone breasts competing against them in female sports. Minor children being lobbied by feminist school boards to learn about sexual fetishes in their elementary school libraries and even take one for the team by submitting to surgical castration and mutilation before they're even old enough to have sex. Various children and adults targeted for mass shootings or assassination by transgenders who still haven't attained the Utopian ideal of thinking about nothing at all but how others regard you.
Lies, Damned Lies, and Statistics. This is all three. The Official Narrative.
The four-shooters figure is a blatant falsehood, enabled by mass media who routinely conceal the TG identities of shooters, especially in school shootings. Interestingly, if you perform the calculation needed to determine the percent incidence of shooters among TGs vs. "cisnormals" (eew!), here's the number of shootings there would be if the population disparities were adjusted for.
As the count of such shootings increases, as it would under any honest reporting, that adjusted number would rise dramatically.
As a final point on the Gender Thing, women are still complaining mightily about the patriarchy, but here's where the smell test comes in. Have you noticed how many women are visibly in charge of so many government/academic/elite institutions compared to even a decade ago? Prosecutors and judges (including SCOTUS), mass media executives and on-camera talent, presidents of prestigious universities, blue state governors, big city mayors, chiefs of police and fire departments in major cities, cabinet secretaries (Trump admin included), and rising political media magnates in both parties. Feels like the patriarchy is already on the run, except for the handful of most hated men in the country, whom more women than men want pushing up daisies for the most fantastic imaginary crimes against, uh, women.
These are the distractions that occupy more time and attention in the public discourse than anything else I've mentioned in this essay. Think about that. Nothing much you can do about any of these headline grabbers, is there?
Other lesser distractions?
Europe is gone. Muslim terror states all.
Ukraine is gone. A Third World wasteland whether they continue their war or not.
Russia is what it’s been since the Soviet Union fell. A poor country with a lot of nuclear weapons.
China will keep playing its long game. The same one they’ve been playing since the Emperor Ch’in took over a couple hundred years before Christ. A giant panda bewildering the world but mostly staying at home, munching on bamboo and still producing remarkably few individuals.
Israel will survive. Islam will stagger on till their oil runs out, then melt back into the sands with millions of casualties as their sole legacy.
All that’s left is the important stuff. The soul of Mankind. And the Big Question.
Summary
So where are we? More precisely, where am I in my appraisal of our prospects in 2026 and beyond?
I think we're headed for a reckoning of our own, a counterpart of 1500 BC. The catalyst, barring unpredictable natural and enemy-created disasters, will be the misguided, premature implementation of what the hoaxsters call Artificial Intelligence. (See the two posts linked below for a focused assessment of this technology.) There is likely to be an early crisis, with an AI stock market bubble exacerbated by serious breakdowns in our computer infrastructure. That would actually be good news if it slows down or shuts down the far more ambitious investments in the offing.
Very large scale implementation of Artificial Intelligence will collapse the global economy and create a Depression so profound it will be called a Dark Age. No one is anticipating this because we have lost our individual and collective ability to foresee consequences beyond even a few months in the future. When only a shrinking minority are making their decisions based on a sense of responsibility to an eternal moral imperative, the tipping points of chaos will arrive with great suddenness and wreak great ruin.
I have been concerned about the dangers of declining consciousness since 1976, when I read Jaynes and added 2 + 2 beyond his own conclusions. I had occasion to speak with him on the phone shortly before he died. I ran my principal tenet past him: if consciousness is a variable, one variation in the organizations of mind possible in the human brain, then it is also possible that the kind of modern consciousness you describe can be lost. Am I wrong about that? "No," he said. "You're right."
Where we are, in my opinion: moving at breakneck speed toward a tipping point engineered by scientists who are making the same kind of mistake the Democrat Party is making right now. They are so determined to cement their position as technological replacements for a God they see as outmoded and dumber than they are that they fail to see they are victims of a delusion that will destroy them and the rest of us too.
As the man said, “You can’t fix stupid.”
As to matters that sit in the far future (as our myopic, self-obsessed souls view the world), the outcome of the 2026 midterms cannot be predicted at all. How many rounds of polls will be administered to capture the "this moment, right now" feelings of voters before November of the new year? Neo-proto-conscious LSDRM+++'ers who could be fat and sassy come Election Day or so pissed off by the last electric bill that they want their own congressional version of that Mamdani joker. You pick it. I won't even go on record about Alabama vs. Indiana on New Year's Day.