Being the continuation of InstaPunk and InstaPunk Rules
End of Year Thoughts, 2023 — Part 3
The physical aspect of the mind at work…
The state of the nation at this point is terminal. As far as I know I’m the only one who fully understands why this is so, because I’m the only one who has devoted a writing life so continuously to the subject of human consciousness.
It’s only now that we can finally see the symptoms that demonstrate what is really wrong with the people who are responsible for leading civilization through the storms and rapids of our times. They are suffering from profound impairments of consciousness. In point of fact, almost everyone is, including the people who are emotionally and intellectually on ‘our side’. There are three distinct U.S. populations coming sharply into focus one calendar year before the make or break decisions of November 2024. All three are impaired. Mathematically, each of these can be considered a set.
There’s the set which is engaged compulsively in a civilization-level process of murder-suicide. Their actions are so destructive and yet so nakedly flaunted in public that the perpetrators perceive no irony in describing their own behaviors in very specific terms as symptoms of the evils of their enemies. When they say “they” in their shouted lists of accusations, one can easily substitute the pronoun “we” and be nearly perfectly accurate as to diagnosis. The easiest explanation of this is clinical insanity. But it is not a sufficient explanation.
Opposing them is the set which has been catalyzed into being by the murder-suicides. Outraged by a longstanding pattern of crimes so deeply embedded in once respectable institutions, they are committed to resistance but handicapped by some of the same impairments as their oppressors, particularly their failure to see accurately and understand the third crucial population, which we will call for now, incorrectly but conveniently, the complacent set.
There’s an explanation of these phenomena here. Don’t bother reading it. I’m only writing it because I can’t not write it. Just putting it on the record.
I have at least a baker’s dozen of lines of attack on this subject, which is why I intend to write about it only in specific, fairly narrow contexts. This article is an attempt to boil it down in an understandable way. I have abundant evidence for each of the following generalized assertions:
Intelligence is not what we think it is.
Time is not what we think it is.
Reality is not what we think it is.
Consciousness is not what we think it is.
I’ll address each one of these in some detail. But first, a documented Book List at least 20 years old demonstrating a small but significant part of the terrain I have explored with these topics in mind. There are many more relevant books and other resources that contributed to the ideas outlined below. I offer the list principally to show that I am not new to the ideas presented here, I’m not playing or showing off, and I’m not worried about reader attention spans. As with everything I write, entertaining you is an objective but far from the only one. When the scavengers are picking through the rubble looking for useful scraps, they might stumble across this and my efforts will be justified.
Let’s get started on the task of examining my assertions. You may think that what follows has little or no relevance to the continuously unwinding rope of lawfare abominations committed by the U.S. Government against Donald Trump and his supporters. If you think that, you’re wrong. On with the show…
Intelligence Is Not What We Think It Is
In the age of science with its emphasis on quantifying and numberfying everything, intelligence has come to be synonymous with the abbreviation ‘IQ,’ which stands for ‘Intelligence Quotient.’ Here’s a dictionary definition of that interesting word, which is rarely discussed in conversations about IQ in its normal incarnation as a number rounded to the nearest integer.
The definitions that matter here are the 2nd and 4th. The number that stands for your IQ conforms to the second definition. The standard value is “1,” meaning the average (or arithmetic mean) score in the population measured, multiplied by 100. Scores above and below that are representative of percentage increments. An 80 IQ means that the raw score was 80 percent as correct as the mean score. An IQ of 120 means that the raw score was 20 percent higher than the mean score. The full set of results is plotted on a two-dimensional graph that takes the shape of a bell curve, which is how the percentage of people with any given raw score (or above or below average) is computed. The unstated implication is that the test does define the boundaries of human intelligence from lowest to highest. An invisible inference is that the percentage differences in raw score are also definitions of the degree of difference in intelligence in the tested population; i.e., that a person with a 200 IQ is twice as smart as a person with a 100 IQ. This may not be how the test designers would represent it, but it is what people tend to believe.
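For what it’s worth, the major tests in current use no longer compute a literal quotient at all. They convert a raw score into a ‘deviation IQ’ by placing it on a normal curve fixed at mean 100 with a standard deviation of (typically) 15. A minimal sketch of that scoring step, with an invented cohort of raw scores standing in for real test data:

```python
# Deviation-IQ scoring: place each raw score on a scale with mean 100
# and standard deviation 15 (the modern convention). The cohort of
# raw scores below is invented for illustration.
from statistics import mean, pstdev

def deviation_iq(raw_scores):
    m = mean(raw_scores)
    s = pstdev(raw_scores)
    # z-score of each test taker, rescaled to the IQ convention
    return [round(100 + 15 * (r - m) / s) for r in raw_scores]

cohort = [31, 38, 42, 45, 45, 47, 49, 52, 55, 61]
print(deviation_iq(cohort))
```

Note that nothing in this arithmetic measures anything in absolute terms; it only ranks the cohort against itself and relabels the ranking, which is precisely the sleight of hand the ‘quotient’ vocabulary papers over.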
The fudge factor of the test designers and administrators is the use of the word ‘quotient.’ Where the 4th definition suddenly becomes more important than the math-looking one. The word is unusually rich in connotations because once liberated from the jail of long division it becomes synonymous with potentials not limits. Which when you read between the lines is what the testers are really saying about IQ. A high IQ does not mean you are more intelligent, only that you have the raw ability to accomplish the extraordinary feats by which we measure genius. Perhaps the one real benefit of the preponderantly phony brain-computer comparison is that it enables us to see an IQ test as a measure of hardware capacity before software ever enters the picture.
[PERTINENT INTERRUPTION: The AI mavens go on about how their technology can write a compelling legal brief without human assistance. Probably true. Lawyers suffer from a particularly damaging consciousness impairment. Howsomever… There is no computer on earth or on the drawing boards that could write what has already been written here, let alone what will follow this.]
It’s applications that make computers seem like miracle workers. Software. Which is entirely a function of intention. There are blazingly fast accounting machines which can never be confused with genius, however useful they are. There are cleverer accounting applications which can cook the books very impressively. Genius? Depends on how you assess the intelligence involved in cunning and criminal calculation.
What’s important to understand here is the fatal limitations that turn our historical attempts at intelligence testing into ludicrous junk. These actually flow from the limitations we have embedded in science itself, even the hard sciences, which are more self-critical than the social sciences, in which obsolete artifacts like the Stanford-Binet IQ test continue to thrive. Take a look:
It’s a multiple choice test. There’s a correct choice and three or four incorrect choices. Which means that the test designer is by definition more ‘intelligent’ than the test takers. He presumes to stand above the entire population of test takers; otherwise he could not bound that population quantitatively in finite, linear terms. That bounding is the whole purpose of the reliance on numbers, two-dimensional graphs, and percentile assignments, all expressly intended to disguise the fact that ‘intelligence’ is a human attribute whose scope, scale, and differentials cannot be reduced to integers and yes/no trials. The subject is immensely, in fact infinitely, beyond what can be measured by any multiple choice test. Even at this low-grade hardware level. What are the right questions? And, more opaquely, what are the right answers to the population of possible questions?
I can pretty much guarantee you that no geniuses are involved in the development of questions used in the Stanford-Binet test. Their purpose is not, and never has been, to identify the parameters of intelligence, but to flag the much more prosaic set of skills that translate to success in several categories of schoolwork. But there’s a difference between people who get good grades and geniuses. Considerable overlap in all likelihood, but the differences are legion. Harvard likes to claim Orson Welles as an alumnus, but all he ever did there was enroll. He never took a single class in college.
Here’s what the test designers are after:
Behold the Bell Curve. It goes up, it goes down. Very neat. Take out your six-inch ruler. How much smarter are you on that curve than the 100 IQ feller? How many inches or fractions of an inch? Kind of trivial when you look at it that way. Even the very limited accuracy of the raw-score measurements being taken deliberately disguises far more interesting inferences. Doing better in terms of raw score means answering harder questions ‘correctly.’ To a certainty there are gradations of ‘harder,’ from hard to harder to very hard to ‘almost no one gets this one right.’ In other words, the increments of intelligence beyond the average represent not linear but exponential increases in the abilities measured by the test. Confusing the percentile differentials with the qualitative differentials is an act of fraud by the social scientists in charge.
If fraud was not the intention, why do social scientists presume to estimate the IQs of geniuses who never took their damn test? Albert Einstein did not have a 200 IQ. He was not merely twice as smart as the 100 IQ average guy. But how do we get to where he should be placed in the realm of IQ?
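The nonlinearity is visible in the orthodox numbers themselves. On the standard curve (mean 100, standard deviation 15), each equal-looking 15-point step up does not shave a fixed slice off the population above you; it divides that population by an ever larger factor. A quick check needing nothing but the normal tail function:

```python
# Fraction of the population above a given IQ on the standard bell
# curve (mean 100, SD 15), computed via the complementary error
# function -- no statistics package required.
from math import erfc, sqrt

def fraction_above(iq, mean=100.0, sd=15.0):
    z = (iq - mean) / sd
    return 0.5 * erfc(z / sqrt(2))

for iq in (100, 115, 130, 145, 160):
    print(iq, fraction_above(iq))
```

From 100 to 115 the pool above you drops by roughly two thirds; from 145 to 160 it drops by a factor of more than forty. Equal increments on the ruler, wildly unequal increments in rarity.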
We have to recognize that all of science is damaged in its most basic assumptions regarding the tools it relies on. Indeed, every branch of science is impaired in the same way IQ tests are. (We’ll be getting to that later on, trust me.)
I wound up answering this question by accident almost exactly 30 years ago. For satirical purposes, I decided to make up the science that would prove why the smartest men were smarter than the smartest women. I decided to model intelligence in three dimensions instead of the Bell Curve’s insistent imagery of two dimensions, beginning with their assumption that key questions were necessarily collapsible to finite right/wrong propositions, whose results could also be mapped instructively on flat pieces of paper.
I stipulated that a critical difference between the sexes lay in the realm of metaphor, not as a figure of speech but as the most important of all learning tools. This is like that. What else is like this? What else can this be? Little girls tend to play with toys the way toys are meant to be played with. Little boys are promiscuous about turning whatever they lay their hands on into something else. A mop becomes a rifle. A box becomes a hat. A steam iron becomes a war ship. A football and a helmet become a complete NFL game played in the downcounting seconds of the Super Bowl. A kid’s bicycle, even with training wheels still attached, becomes a motorcycle. A motorcycle becomes a plane. And on… Everything is the raw material of mind play. Which creates mind space, the real world difference between a ‘correct’ answer and imaginative brilliance and breakthroughs.
There was a time when the average teenage boy had a working three-dimensional model of an internal combustion automobile engine in his head, not necessarily all correct, but sound in the essentials, just as even today most little boys can draw a bicycle more functionally correct than most little girls, who can’t seem to figure out on paper how the chain thing works.
You can sharpshoot the sexism all you want. IDC. But the mind space construct is so compelling that it’s likely the reason the manuscript it appeared in was not just rejected by publishers but angrily and permanently so. What matters here and now is that it enables us to see that our understanding of the IQ question is flat wrong. Einstein was not, as I pointed out, only twice as smart as, say, a cashier at the tire store but thousands of times smarter. Just as the teenage boy had a working model of a V-8 in his head, Einstein had a working model of the entire universe in his head, including the proposition that space and time were in fact an integrated entity he called space-time, an answer the writers of the Stanford-Binet test could never have given him a question about because the range of possible answers couldn’t be multiple choice. Time for a more skeptical look at the flat (because flat) prevarication of the Bell Curve.
The official IQ orthodoxy pegs the term ‘genius’ to scores of 140 and above. That’s the admissions criterion for Mensa, the club for the brainy. There’s also a club called Double Mensa, reserved for only the top one percent of test takers. Except that IQ tests are not predictive of genius-level feats in life.
What else do IQ tests and the reporting on same not tell us? When I first studied statistical probability in business school, our professor made us read a short funny book called “How to Lie with Statistics.” It turns out there’s no end to the ways you can lie with statistics. More all the time. Nearly 20 years later I even wrote a full-day training course called “False Quantification” for a Fortune 100 consulting client to help their non-mathematically minded communicators look past the charts and graphs and numbers they were routinely asked to accept as factual documentation of some truth.
One of the easiest, because most brutally simple, ways of lying with statistics is making it impossible to find significant numbers people might be interested in knowing. This can be accomplished by not counting the data that could be added up to calculate that number, by not disclosing the number if it has been tabulated somewhere, and by changing the definitions of available numbers in misleading or complicating ways that render them unusable, except at the price of being attacked by those in the know. Why, 50 years after Roe v. Wade, it’s still virtually impossible to calculate a verifiably accurate count of how many abortions have occurred in the United States since 1973. The people in charge of the counting don’t want you to know. Every number cited is no more reliable than an urban legend.
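One well-documented mechanism belongs on this list by name: aggregation itself can reverse a result (Simpson’s paradox). The kidney-stone treatment data published by Charig et al. in 1986 is the textbook case, and the arithmetic fits in a few lines:

```python
# Simpson's paradox with the classic kidney-stone figures:
# treatment A beats B within each stone-size subgroup, yet
# B looks better when the subgroups are pooled.
small = {"A": (81, 87), "B": (234, 270)}   # (successes, patients)
large = {"A": (192, 263), "B": (55, 80)}

def rate(successes, patients):
    return successes / patients

for treatment in ("A", "B"):
    s_ok, s_n = small[treatment]
    l_ok, l_n = large[treatment]
    print(treatment,
          round(rate(s_ok, s_n), 3),               # small stones
          round(rate(l_ok, l_n), 3),               # large stones
          round(rate(s_ok + l_ok, s_n + l_n), 3))  # pooled
```

Treatment A wins in both subgroups and loses in the pooled table. The pooled number isn’t false; it simply answers a different question than the subgroup numbers do, which is exactly the kind of gap a motivated reporter can drive a truck through.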
The same kinds of problems pervade the topic of IQ testing. That average 100 IQ, for example. Is it tied strictly to whatever population took the current test? Or is the 100 score in fact a constant through time which could be used to determine that people are getting smarter or dumber as the decades roll by? I’ve made casual efforts to find what the experts have to say about this question, but all I’ve found is lip service to the inclusion of historicity as a component of IQ results analysis. I said ‘casual efforts’ because any official answers or representations are worthless. The idea of any sort of standard constant defining the 100 IQ is impossible. Changing population parameters for those tested, sampling bias in weighting results, changing tests through time for perceived demographic and economic variables which might have skewed past results, and only the haziest gobbledygook describing what the tests actually test and how that correlates to any reasonable standard of human intelligence (i.e., capacity for “understanding” complexity) — all of these are factors which ensure that each year’s testing is unique in various ways and that the 100 IQ is a constantly moving target by no means independent of politically desired outcomes.
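The mechanics of a floating 100 are worth seeing bare. Because each cohort is normed against itself, the reported mean is 100 by construction, whatever the underlying performance does from year to year. A toy sketch with two invented cohorts, the second scoring 15 raw points lower across the board:

```python
# Renorming: whatever happens to raw performance year over year,
# the reported cohort mean is forced back to 100 by construction.
# The raw scores below are invented for illustration.
from statistics import mean, pstdev

def renorm(raw_scores):
    m, s = mean(raw_scores), pstdev(raw_scores)
    return [100 + 15 * (r - m) / s for r in raw_scores]

year_1990 = [40, 50, 55, 60, 70]   # stronger cohort
year_2020 = [25, 35, 40, 45, 55]   # weaker cohort, same spread

print(mean(renorm(year_1990)))   # 100.0 either way
print(mean(renorm(year_2020)))
```

Both cohorts come out of the renorming with identical reported scores, and the 15-point raw decline vanishes from the record.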
What the tests test is pretty basic. Vocabulary, arithmetic skills, lowest-common-denominator comprehension of metaphor (i.e., analogies), and elementary pattern recognition with regard to shape and sequencing questions. Both of the latter two more abstract testing instances are limited, and increasingly so, by the need to preclude economically dependent cultural knowledge or experience from the questions or multiple choice answer alternatives. What isn’t tested at all: any mental capacity associated with integrating the sense of hearing, smell, taste, and touch — let alone emotion, imagination, and humor — into the analysis of visual and strictly numerical inputs resulting in some kind of gestalt. It doesn’t enable us to identify a Leonardo da Vinci or Groucho Marx except by accident via the most partial of criteria.
No room for funny. And no room for William Blake, who couldn’t spell.
This is not meant to say that the set of things tested by IQ testers is not important in its own way, chiefly as an indicator of basic ability to perform well in school from an early age onward. Why, for many years, the Scholastic Aptitude Test was regarded as synonymous with IQ testing by the university gating institution called the Educational Testing Service, where scores of 700 and above in the Verbal and Math sections were believed to account for roughly the top two percent of the test population, itself a distinct subset of the total population. (The Brits used “O” and “A” level exams to formalize the distinction between the masses and the Quality.) Except that the SATs unhooked themselves from IQ results when their own historically computed average scores began to fall significantly in the late 20th century. Subsequent to this, the SATs were “dumbed down” by degrees to disguise the weaker performance of students in succeeding generations. Differentials were explained away as prejudicial cultural artifacts, and the “Nothing to see here, move along…” dismissals took over as usual. Most recently, prestigious colleges and universities are dispensing with standardized testing across the board. They don’t want to count what they don’t like the look of. Easier to inflate grades and reduce curriculum difficulty to make things look more or less the same as they used to be.
The important question before us is not the accelerating fakery of higher education. It’s the impact failing education has had on individual human potentials for intelligence. Not intelligence as “smarts,” but intelligence as understanding, comprehension, imagination, and the visionary creation of the new metaphors by which civilization evolves through the centuries. Even if IQ is no loftier a measure than the technical specs of a computer circuit board, that kind of hardware capacity is a limiting factor to whatever that computer can be used to accomplish. Real intelligence is the applications, the code and what it enables human minds to create. The famous first commercial computer, the UNIVAC, was bigger than some houses are even today. But your smartphone has infinitely, yes infinitely, more potential to assist human creativity than that dinosaur of technology on display in some antique museum somewhere.
What does it mean if that Average 100 IQ we’ve been taught to regard as a constant is truly declining from year to year without our being able to see what is happening inside the minds of our children? What would it look like in terms that matter?
At the lefthand side of both curves, it’s time to consider what is happening to the nearly half of all test takers who are below what we call average intelligence. In percentage terms, their loss of mind space is significantly greater than what’s happening on the other side. Is there some kind of tipping point in there at which a person is so deprived of even the basic hardware set of abilities that he loses the capacity to function as a human being in a civilized way? A point below which there is no common sense, no empathy for others, no real interior identity, no control of physical reactions to external stimuli from moment to moment, no actual existence in the continuum of time?
Well, we don’t know. Before we can figure that out, we’d first have to recognize that average human intelligence is not a constant but a significant variable whose impairment has a possibly fatal effect on the life prospects of the most impaired and those who must live with or around them. Recognize it and accept it as a matter vital to the maintenance of civilization itself. Worth allocating resources to research and corrective action.
The loss of mind space at the upper end of the IQ range is also of dire concern. Loss of mind space is higher percentage-wise among low scorers, but the quantity of mind space lost at the upper end is incomparably larger in terms of its volume. Breakthroughs in every realm of science, the arts, and philosophy have always been precipitated by the one or two percent of the best minds in human societies. This is so obviously so that like most obvious truths it cannot be proven but only seen by the ones who are smart enough to recognize the obvious and adapt accordingly.
Science is at present the pursuit that has been loudest about proclaiming its superiority to all others. Where, therefore, we should look first to see the impact of declining volumes of mind space among the critical one or two percent. In particular, physics has failed to make the one breakthrough that could have prevented much of the equally sharp decline in the mind space volumes in all other pursuits.
In order to understand this aspect of the problem we must take up our second major topic…
Time Is Not What We Think It Is
For over a century it has been within the grasp of science to discover and make sense of the fact that the linear time we experience as human beings is an illusion. Individual scientists have created lots of the tools necessary to demonstrate that this is so, but they have obstinately avoided using these tools because the job of getting from here to somewhere meaningful is too big and not terribly interesting in career terms. Indeed, they have demonstrated instead, with phenomenal dexterity, just how far afield they’re willing to go in the opposite direction. More about that later.
Again, the basic physics involved is obvious and even rudimentary. Mathematics begins its universe with a single point. It has no attribute resembling a dimension, only existence. A point is its own universe, unitary, complete, and eternal (as far as we know). It’s as simple as the binary distinction between existence and nonexistence, meaningless in every other respect. Or, more poetically, it could be the “Let there be light…” moment of creation, a single photon emergent from the void before everything else, including time. Mathematics has nothing to say, no opinion, about what put the point there. It just is. “Nothing to see here…” obtains because any question about agency invalidates the legitimacy of the binary decision point of the point. If the point did not create itself in a conceptual (well, nonexistent actually) realm independent of all the measures we employ to define reality, then the fuzzy relationship science has scrupulously postulated between mathematics and the universe becomes inconveniently focused on what might constitute appropriate questions and answers. Which are nowhere forthcoming. Who or what put that first point there? Irrelevant. Because 20th Century science made the decision long ago not to get backed into that cul de sac ever again. They can see no point in such a contradictory exercise.
From here, we can use math to turn the point into a line, in any direction we choose, which does give it dimension because the line is an infinite state of being. It cannot be counted. It just is. Forever in both directions. The line is part of our hard-wiring as human beings. (Wires themselves are, you know…) We rely on the line and fall back on it at a very deep level, although that’s only a metaphor because a line has no depth, only a sequence. It can also be segmented, which is where it begins giving us a tremendous amount of fun to play with. It’s esthetically appealing to us. Why writers and mathematicians both have a sacred regard for it. Mathematicians even have a sacred number called Pi, whose nonrepeating decimal expansion is as infinite as math itself and just beautiful when you look at as much of it as you can memorize. They also have a sacred ritual called “the equation,” which is the perfect way of translating very complex numerical relationships into a single line segment that is eternally in balance by definition. Human definition. Because human beings invented the equation, their own counterpart to the natural perfection of Pi. Writers are just as smitten with the line. They write lines of poetry. They write lines of prose, called sentences, which are line segments separated from one another by the artistic interruption of a point called a period. Their lines can be very very long, because there doesn’t ever have to be an end, and if you’re a writer the linearity of all forms of writing is the ultimate means of control over your audience. They must travel along your line of words/sentences/paragraphs in the order you have provided while you use an infinite toolbox of artifices to make them believe that they are somehow participating with you in the experience of your writing by reading along inside the line being spun out of your head. It’s called “the willing suspension of disbelief” or, more cynically, “the writer’s leash.” Ecstasy.
Scientists and engineers in most disciplines love lines too. And writing about them in sentences and their own versions of the equation. Assembly lines, timelines of various processes and changes, lines of computer code, lines of business, product lines, and lists of everything, the longer the line of listed items the better. Average people, on the other hand, aren’t as sanguine about lines, waiting lines in particular, and required signature lines, and all those lists from everywhere else about ‘musts’ and ‘don’ts’ and ‘can’ts’ and even ‘thou shalts’ and ‘thou shalt nots.’ Gets to be confining after a while. But…
Wonder of wonders, we can also add a second dimension if we multiply the line by infinity (just as the line multiplied the point), which makes it a plane, infinite in all directions and flat as can be. While writers of all types are inclined to stick with their lines of words, the numbers people are in many ways set free by the plane, kingdom of proofs and appealing depictions of fictitious ideal states. It’s a flat but incredibly beautiful world, the reason Euclid gave us “plane geometry,” where innumerable perfections not found in nature can be shown in their full glory. Right angles, perfect circles, calculations that come out even or can be forced to come out even (like with logarithms… Genius!), and two-dimensional charts galore — pie charts, Cartesian graphs, maps, economic models, messy chemistry flattened into elementary regularity, and even Venn diagrams that eschew numbers for areas of overlap made visible in color. Hard to resist the notion that the best answer is the simplest answer, expertly rendered on a sheet of paper, which is, of course, the single most important and triumphant archetype of the two-dimensional perspective. Still…
No need to stop there. We can multiply that plane by infinity, which gives us room to move in, as it were, with height, width, and depth to play in. An explosion of possibilities. The world of THINGS, including places, structures natural and artificial, the vastness of a physical universe made of suns and planets and other stuff, and closer to home, bodies of human beings, animals, plants, and even the littlest stuff like electrons and neutrons and suchlike. This completes the world we live in as human beings. Except it doesn’t. The truth is that this dimension is the still life version of the world we inhabit, like a photograph of each thing to be observed unmoving, silent, absent taste, smell, or touch. It takes still another dimension to give us life in the three-dimensional universe we live in. Why we must multiply the three-dimensional universe by infinity yet again, so that there is past, present, and future in the universe, and wait… uh, wait right there…
Time is somehow, basically, only a line in our universe. Yes, it multiplies by infinity but only in a crushingly limiting, even imprisoning, way. We commonly refer to it as the fourth dimension, but a lot of scientists are intent on separating it from the other dimensions because its infinity can only be experienced in sequence, in only one direction. In that important respect it is not even a line but a vector, starting at the precise point of right this moment, which makes it unnervingly as restrictive, unitary, and deterministic as the point that started all this unfolding of dimensions. Why there are people, including scientists, who choose to regard time as a universal full stop, a kind of proof of the limitations in the whole idea of universes. Time can’t go beyond the line or even backwards on its line because there’s nowhere for it to go. End of universal possibilities, end of discussion. Except for the trivial shit around the edges. Quarks and black holes and big bangs and expanding/collapsing universes and such that all combine to expose the fallacy of a world without end.
Linear time also serves another purpose in cosmological science. It looks kind of like an accidental feature of a place where most things proceed by predictable laws. This ‘time as a line’ business seems clunky, not quite a fit, and therefore proof that there is no real divine intelligence behind its creation, just a kind of random mutation of math and physics that enables us to recognize that the closest we can get to God or gods is scientists. The ones who can explain absolutely everything except the answers to questions they haven’t decided to ask, at least not anymore. Why official science is grouped around a consensus committed to time as a line, even though they don’t want to discuss the implications and complications of that commitment.
Proof that they are committed to the line is their acceptance that the part of the line we can’t get to does in fact exist. The past exists. Their Big Bang story depends on this absolutely. On our earth, past civilizations did exist. They emerged from some other species, evolving both physically and socially to the point of leaving behind an enormous record of their physical accomplishments we can dig out of the dirt and study. They are remembered as cultural realities by their descendants, many of whose beliefs were shaped by events and individuals we cannot witness except by means of fossil and orally transmitted records.
Inconvenient contradictions? A few. Consider the infinities involved. If time is a line and the past exists, what must be true? Everything happened. Billions and billions (Carl Sagan!) of years worth. From the Big Bang and whatever other universes came before it, right on up to today, all those millions of stars and planets and moons and asteroids and comets and meteors, and even here on earth all those people and civilizations and the species they came from and lived with, all of it, and yet the one restriction built into this line to the past is that it cannot be visited. And by the way, this is a line that looks very much like yet another infinite universe, bigger by far than the biggest one that preceded it in the sequence.
I’ve told you science has amassed in the past hundred years or so the toolkit it needs to resolve a lot of the contradictions they don’t want to deal with. One of these tools is an emerging mathematics of infinity. Pretty commonsensical when you get past the impressiveness of the insight that spawned it. I mean infinity is infinity, right? What else can you say about it? Plenty.
It is possible to perform elementary kinds of mathematical operations on infinities, and not merely by hand-waving; Cantor proved as much in the 1870s. For example, the infinite set of real numbers is demonstrably larger than the infinite set of integers. Both sets are infinite because you can always add one to however many you have, but no list of the reals, however long, can ever contain them all, while the integers are a list by definition. (The rationals, counterintuitively, can be paired off one-to-one with the integers; the strictly larger infinity begins with the continuum of the reals, infinitely many of which crowd between every pair of integers.) In the same way, each dimension we add to our set of dimensions necessarily contains all the others. The second dimension contains an infinite set of lines, the third an infinite set of planes (like a big-boy-pants version of an MRI), and the fourth (i.e., time) an infinite set of the three-dimensional universe and its constituent parts moment by moment all the way back. Which means that time is infinitely larger than all the lower dimensions put together, and yet is somehow restricted because of some universal constraint to behaving only as a vector from our human point of view.
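The standard way to make ‘one infinity is larger than another’ rigorous is Cantor’s diagonal argument: hand me any list of infinite decimal expansions and I can construct an expansion the list necessarily missed, so no list, not even an infinite one, exhausts the reals. A finite sketch of the construction, with short digit strings standing in for the infinite expansions:

```python
# Cantor's diagonal construction: given any list of decimal digit
# strings (finite here, standing in for infinite ones), build a new
# string that differs from the n-th entry in its n-th digit -- so it
# cannot appear anywhere in the list.
def diagonal_escape(listed):
    new_digits = []
    for n, digits in enumerate(listed):
        d = int(digits[n])
        new_digits.append(str((d + 1) % 10))  # any digit != d would do
    return "".join(new_digits)

listing = ["1415", "7182", "6180", "3026"]
escaped = diagonal_escape(listing)
print(escaped)
```

The same escape trick fails against the rationals, which can be paired one-to-one with the integers; the strictly larger infinity begins with the continuum.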
This puts science in something of a box. Where they don’t like to be put. If the past exists in the way scientists insist, then Time fulfills all the requirements of being a next-order dimension infinitely larger than all the others below it. Except for the human-point-of-view problem, there’s no reason to infer that Time is a line at all. In fact, it can’t be. It contains all the lines and planes and three-dimensional constructs that it requires to fabricate past, present, and future. It therefore cannot be only a line, and it cannot exist merely as a sequence, without violating what seem to be the laws of dimensional physics. Like all the other dimensions we know of, all of Time exists simultaneously: past, present, and future. It is human life that lives on a line.
The only defense science has is its assertion that Time is a thing apart. An immutable law unto itself, imbued with a power beyond that of the universe itself, to which (btw) science blithely keeps amusing itself by adding other dimensions via the expedient of string theory, parallel (alternative/probable) universes, and other self-glorifying because unprovable speculations. Except that those other dimensions and universes also create problems for the Time-as-a-thing-apart theory. They suggest, collectively and individually, that cosmological physics is much, much grander and more complicated than we suspect, while reinforcing the intuitive inference that mathematics and cosmological physics are inseparably interconnected, wherever math comes from.
They’ve also got a problem with their own primus inter pares god. The one named Einstein. Whose theories rest on his conception that the deep structure of the universe is a thing called space-time, its two aspects not divisible or separable from one another. Which means that the point, the one back there at the beginning, predates the existence of Time, which is not a thing apart but an integral part of the whole, obeying the same universal laws of physics, if only we could understand what all those laws are. But even if we can’t, there’s no sign that they are anywhere truly inconsistent in their fundamental applications.
If Time follows, flows from, that universal first point, then that first point is also equal to or greater than the infinity of everything else we have been talking about. It begins to look not like a data point but a power so far beyond any comprehension that it is not distinguishable from the concept of one God the Creator.
In that context, all other infinities share one mathematical attribute with every other variable in the math of infinity. They are less than (<) the originating point and therefore equal to one another in the most important sense, no things apart allowed. There is not one of them (or their constituent elements) so small that it does not matter. Because, as the great religions keep insisting, there is a unity, a wholeness that includes everything, and everything reciprocates this unity by containing or at least intimating that whole within itself.
Where are we? Time is not a line. It is simultaneous. The past and the future are both present around us and interacting with each other through, um, Time. Which means there is no such thing as death, as in the eternal blank we are supposed to believe follows the termination of that segment of time in which we think we are living chronologically. This raises a large and very interesting question. If the fourth dimension is being constrained in such a way that it functions as a seemingly linear vector through a much larger universe of possibilities, why is this the case? It has extremely specific consequences, so much so that this artificial linearity is best explained as a product of intention.
Forget whose intention for now. Consider instead the effects of such externally imposed tinkering with our own interactions within the Time dimension.
For now, that’s all we need to understand about time before we consider the very serious question of what reality is. Really…
Reality Is Not What We Think It Is
[Intervention: Reality and Consciousness both treated variously and integrally in a separate work, as they should be, including the quantum mechanics stuff and what consciousness is and other stuff that makes everyone important and most of us presently lost…]