A very low magnitude happiness vector

It’s Friday now, for those of you who have been drinking heavily in the run-up to the big holidays and have lost track of the days.  I’m certainly working today, but I don’t know if the office will be open tomorrow, so I don’t know if I will write a blog post tomorrow.  If you’re interested, feel free to check this site in the morning.  Or, if you like, you can subscribe, and you’ll be sent emails for new posts.  But take that suggestion like a broken barometer:  no pressure.

That’s almost all that I feel I have to say.  Ordinarily, not having anything to say doesn’t mean I won’t write a post.  I’ll just blabber and blather for nearly a thousand words, just to see myself write*.  But there won’t be anything of substance.

Probably a good fraction‒perhaps even a significant majority‒of everything you can find on this blog is pointless nonsense.  Though, of course, I might contend that everything is pointless nonsense.  But here in this blog, you will sometimes find it concentrated, distilled, freeze-dried, and vacuum sealed.

No, I don’t know what some of those things might mean here, metaphorically, any more than you do.  I was just saying words that I thought seemed good.  I have curious tastes, though, so I’ve no idea what others might think of them.

Anyway, that’s me trying to act all silly and funny and whatnot, as if I might be even slightly happy, so that other people don’t have to worry about me.  Well, don’t worry about me.  I’m not happy at all, but it doesn’t matter in the slightest, because neither do I.  Maybe that’s just the way everything is, or maybe it’s just me.  Neither would particularly surprise me.

So, anyway, yeah, I’m not happy, not in any useful sense of the term.  John Galt said that happiness is a state of noncontradictory joy, and that’s always seemed to me like a pretty useful definition of the word, though it’s not the only useful one.  But I like how it separates joy from happiness.  Even people going to the gallows can sometimes joke and laugh, if only as a defense from fear, and in those moments of laughter they may feel joy.  But it is perforce transient, and it’s unlikely that they would be willing to say that they were happy**.

So, in that usage of the word happiness, joy would be necessary but not sufficient for actual happiness.  And both might be relatively orthogonal to a state of wellbeing (which is another word that has more than one interpretation).  Still, though the dot product of happiness and wellbeing may be surprisingly small***, I don’t think it could be zero.
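For the programmers among you, here’s a tiny Python sketch of that dot-product metaphor.  The numbers are entirely invented, of course; the point is just that two vectors can each have respectable magnitudes while their dot product stays surprisingly small, if the angle between them is wide:

```python
import math

# Invented numbers: treat "happiness" and "wellbeing" as vectors in
# some abstract three-trait space.
happiness = [0.9, 0.2, 0.1]
wellbeing = [0.3, 0.8, 0.4]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def magnitude(v):
    return math.sqrt(dot(v, v))

# The dot product is small when the vectors point in rather different
# directions, even though neither vector is itself small.
d = dot(happiness, wellbeing)
cos_angle = d / (magnitude(happiness) * magnitude(wellbeing))
print(d)          # noticeably smaller than the product of the magnitudes
print(cos_angle)  # well below 1, but above 0: small dot product, not zero
```

The cosine of the angle comes out well below one but above zero, which is the quantitative version of “surprisingly small, but I don’t think it could be zero”.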

Yes, I use vector multiplication as a metaphor for such things, though honestly, it’s not even so far removed as to be merely a metaphor.  Vectors can be useful for tremendous numbers of things that may seem far afield from each other, from computers and artificial intelligence to physics to biology to economics and ecology.

They can even be of use in psychology, though I don’t know how often they are used therein.  I haven’t dived into a lot of more formal psychology recently, though I like the popular works of Daniel Kahneman and of Jonathan Haidt.  And Paul Bloom is great fun.  But popular works of psychology rarely involve measuring aspects of mental functioning as vectors in a phase space.

Though, as you might have picked up if you’ve read a lot of what I’ve written here, I think it’s useful to think of human behavior and actions as the outcome of a vector sum of all the various “pressures” in the brain/mind, which end up with a resultant that determines what one’s actions will be in that moment.

But, of course, the action itself can feed back on the input vectors, altering them in various ways (maybe their angles, maybe their magnitudes, rarely but possibly their actual sign, which admittedly would just be equivalent to an angle change of 180 degrees, or 𝜋 radians).

Likewise, the state of many of those vectors can change with time.  For instance, one could imagine a vector associated with one’s degree of alertness.  Such a vector would tend to have greater magnitude in the daytime than late at night in most humans, so it waxes and wanes inherently (though even this is likely a result of input vectors delivered by various aspects of the sensory systems).

But the actions taken as a product of previous moments’ vector additions can affect this vector, too.  If a previous resultant led to one having a strong cup of coffee, that might increase the magnitude of the alertness vector, though there would be a delay.  Alternatively, if the previous outcome had led to one drinking a significant amount of Wild Turkey 151 on an empty stomach, the alertness vector might soon start decreasing in magnitude.
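In case a concrete sketch helps, here’s that idea in toy Python form.  Everything here is made up‒the “pressure” names, the component values, the feedback factor‒it’s only meant to show the shape of the process: sum the input vectors to get a resultant, act, then let the action modify the inputs for the next moment:

```python
# Invented "pressure" vectors in a two-dimensional toy space.
pressures = {
    "alertness": [0.2, 0.1],
    "fatigue":   [-0.6, 0.0],
    "habit":     [0.3, 0.4],
}

def resultant(vectors):
    # Component-wise vector sum of all the pressures.
    return [sum(components) for components in zip(*vectors.values())]

# The resultant of this moment's pressures determines the action taken,
# say, having a strong cup of coffee.
action = resultant(pressures)

# Feedback: the action increases the magnitude of the alertness vector
# (in reality with a delay, which this sketch ignores).
pressures["alertness"] = [c * 2.0 for c in pressures["alertness"]]

# The next moment's resultant is computed from the altered inputs.
next_action = resultant(pressures)
print(action, next_action)
```

Swap the coffee for Wild Turkey 151 and the feedback factor would be less than one instead, shrinking the alertness vector rather than growing it.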

Okay, I’ve reached the point in the blog post where I’m using vectors to describe the effects of coffee versus whiskey.  I think it’s reasonable to bring things to a close now.  I hope you all have very good days, by any reasonable measure.  If I work tomorrow, I’ll write a post tomorrow.  I’ll leave figuring out what effect that will have on your own wellbeing for your consideration.


*Analogous to speaking to hear oneself talk.

**Though I can imagine possible situations in which one might be literally happy even on the way to the gallows.  It would be a very brief happiness, nonetheless.

***I doubt that it is, but I also doubt that it is the full, direct product of the magnitudes, as it would be if there were no angular difference at all.  Wellbeing, I think, is more complicated than happiness, which is itself by no means simple.

Some Halloween-style pictures among unrelated words

First of all, Happy Halloween to everyone who celebrates this day in any fashion.  Even if you don’t celebrate it, you might as well have a good day.

I don’t discriminate based on holiday celebrations.  How very admirable of me.

Once again, I mean to keep this post short by making my target 701 words to start with, because I’m very tired this morning.  It was difficult to get up at all, and I still feel as if I’m vaguely sedated.  Unfortunately, it doesn’t seem to have been one of those sedatives that’s associated with euphoria.  It would be nice if it were, right?  If they would agree, I would agree.

Unfortunately, I’m just groggy and weak and blurry.  By which I mean that the world seems slightly blurry to me.  I don’t mean that I am blurry if you look at me.  I might imagine that I could be blurry (as a property of me, that is, not just of poor vision in the observer), but I have looked in a mirror already this morning, and while I am far from easy on the eyes, I seem to be in focus.

Thinking of atypical interpretations of things people say, I was listening to one of the guys on the phones in the office yesterday, and I heard him use the expression “qualified individual”.  Now, I know what he meant, and it’s a perfectly valid term when discussing a promotion with a customer.  But it occurred to me that one could use the term to refer to someone who is an individual…but only from a certain point of view.

For instance, Norman Bates could be thought of as a “qualified” individual.  Yes, he’s a single person in the sense that he is one organism*, but there is more than one distinct personality in his head.  You could also say that the narrator in Fight Club is a “qualified” individual, as is James McAvoy’s character (should that be “characters”?) in Split.

Oops, sorry, I guess I could have given a spoiler alert for those movies.  But if I had done that, it would have ruined the surprise!

Of course, from certain points of view, even your typical unqualified** individuals are not as monolithic or monotonic or monotropic or, well, monopersonic as one might imagine.  We know that in split-brain patients, when the corpus callosum is severed to reduce the problem of, for instance, uncontrollable seizures, the two sides of the person’s brain can act and think in some ways like two separate people.  They act like two individuals, in other words, though in such circumstances that word is least applicable, since if anyone is “undivided”, it is not these people.

But they are only a special, more extreme version of that which is true of the rest of us.  Our minds are all divided into many separate modules and centers, often running largely in parallel with each other.  There is no one central, “terminal goal” region of the mind; there are separate and conflicting areas and aspects, and even they are not constant.  Many introspective practices, particularly those associated with Buddhism, recognize that the concept of an individual, homuncular “self” is nebulous at best and is never even close to being real.

It seems the term “individual” is just as incorrectly presumptuous for people as the term “atom”*** is for, well, atoms.  However, if we’re speaking with more physical literality, then it’s still pretty accurate, certainly for everything more complex than a flatworm.  If you start splitting people (and other animals) into pieces, what you get, at best, is a creature with missing bits and lots of dead former body parts.  You don’t get more than one being.  Often you get no one, because you will have killed the person with whom you started.

In such a case, one divided by two might in a sense equal zero.

Of course, even in basic mathematics, if you divide one by ever larger numbers, you get closer and closer to zero (it’s the limit as the denominator goes to infinity).

Speaking of going to infinity, the value of 1 / (701 – x), where x is the number of words I’ve written, has now crossed the singularity at x = 701 (where it shoots off to infinity) and is asymptotically approaching the x-axis from below.  While the value is still positive, starting from the beginning of a post’s first draft, it can never be smaller than 1/701, since even I cannot write a negative number of words****.  But once I’ve passed the 701-word point, the value can become a negative fraction of ever-shrinking magnitude, in principle.
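If you’d like to watch that little function behave for yourself, here’s a short Python sketch of f(x) = 1 / (701 – x):

```python
# The word-count function from the paragraph above.
def f(x):
    return 1 / (701 - x)

print(f(0))       # 1/701, the starting value of a draft
print(f(700))     # 1.0, the last whole-word value before the singularity
print(f(702))     # -1.0, just past it: negative, approaching zero from below
print(f(10_000))  # a tiny negative fraction
```

The value climbs from 1/701 toward infinity as x approaches 701 from below, then reappears as a negative number shrinking toward zero from below as x grows without bound.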

In practice, I’m practically finished here.  I hope you all have a good day.  I will probably write a post tomorrow.


*Not counting skin and intestinal flora and the like.  If we count those, then we can all, like Walt Whitman, truthfully say “I am large, I contain multitudes”.

**Again, this has nothing to do with the person’s skills or résumé or experience or innate abilities, it’s just saying that one wouldn’t normally feel the need to add any caveats when calling a person an individual.

***Which means, basically, “uncuttable”.  And what we call atoms can indeed be “cut”.

****A number of negative words, on the other hand…

For a charm of powerful trouble, like a hell-blog boil and bubble

Hello and good morning.  It’s Thursday.  It’s also “Devil’s Night” as it was called back where and when I grew up.  I don’t know if anyone still calls it that.  Nor do I know whether it’s still a night on which some people set fire to things in “celebration”.

I never did quite understand that tendency.  Well, no, actually I completely understand the urge to burn things, but I don’t understand giving oneself license to burn things that belong to other people, just because it’s the day before Halloween.

Of course, one could just call today Halloween Eve, but when you break down the etymology that doesn’t quite work.  Halloween is already “short” for “All Hallows’ Eve”, the day before what I think is called The Feast of All Saints, or just All Saints’ Day.  I guess that must be celebrated on November 1st, since Halloween is October 31st, but I have no idea how it’s traditionally celebrated by those who celebrate it.

Are there people who actually celebrate it?  There probably are such people.

I guess I get the progression:  on Halloween, the ghosts and goblins and vampires and werewolves parade around, before the ascendancy of “good” the next day in the form of all the nutbars who have been declared “saints”.

Don’t get me wrong, I’m sure there were some fine people who have been made saints, but most of the ones of whom I’ve heard were pretty clearly just people who were mentally ill.  However, their society was not prepared actually to help them in any way, so they called them holy people.  I guess it’s (usually) better than what happened to the people who were mentally ill but were seen to be possessed or to be witches or warlocks or what have you.

Mind you, they’re all dead now, and they would have been dead pretty much no matter what, so I guess it doesn’t matter to them what sorts of nonsense people have imagined about them.

Getting back to the holiday progression, I think the addition of Devil’s Night on the night before Halloween makes some sense and improves the mythology.  By that reading, on October 30th, the Devil is truly ascendant, and there is no flouncing about in silly costumes (well, there is, but not “officially”) just acts of destruction.  Then, on the 31st, regular people dress up as creatures of the night, to turn the tables on beings that live by causing fear (much as Batman is said to do!) and run them out of town—to Hell, presumably*.  And then, once the ordinary people have done the work of driving off evil, the saints can come marching in and pretend to be the source of the goodness, when it’s really just that bad things have been driven off (by ordinary people choosing not to be afraid of them).

That’s my highly editorialized take on things, anyway.  But, whatever.

This is usually my favorite time of year, and Halloween is certainly my favorite big general holiday.  I don’t really have any plans to celebrate it this year, though.  I’m not going to be giving out candy—I live in the rear room of the house, anyway—and I don’t mean to dress up or do anything celebratory otherwise tonight or tomorrow (alas, I plan to set no fires).  Like the rest of the landscape of time before me, this patch is dreary and boggy and gray and a bit smelly.  And there’s just dull mist ahead.

By the way, I think I’m going to do the same thing today that I did yesterday and set my initial goal for this post as 701 words, which I’ve almost reached already as I write this.  I will almost surely pass it, but not by too much.  I think it worked well, yesterday, though not as well as whatever I did the day before, when for unknown reasons I saw a huge spike in the number of people who came and saw my blog.  Perhaps that was because I not only invited people to like it and share it, but actually bolstered that by sharing my song Like and Share**.

What would happen if I shared my song Breaking Me Down?  Let’s see.  I’ll embed it below, and we’ll see how successfully I’ll be digested or otherwise broken down today.

In the meantime, please have a good Devil’s Day or whatever.

TTFN


*As Dave Barry pointed out, that’s in concourse D at O’Hare International Airport, which frequent travelers will know.

**Maybe it was sharing the Ricochet Racers that did it, triggering nostalgia in members of Generation X.  It’s possible.

Bing-bing-bing! Ricochet Robert.

I’m in a rather unusually bad amount of pain this morning, even for me, so please excuse me if my thoughts are somewhat incoherent or distracted or grumpy-seeming.  Though I don’t know how you would be able to tell if I’m grumpier than usual.

It’s Monday yet again, and it’s only been two days since my last post, not three, because I worked on Saturday, and on that day, I also wrote a very angry blog post.  I think some people might have found the degree of malice I expressed on Saturday disquieting or at least just not good, which I can understand.  I tend to think of such terrible things a lot more often than most people do (though I share them only infrequently); it’s one of the reasons I find my own company unpleasant.

But, of course, I’ve tried to compensate for my dark tendencies by doing as much good in the world as I’ve been able to do, such as by becoming a doctor.  I’ve never actually acted on any of my darkest impulses and dreams, except when I write horror stories, or when I write non-horror stories with horrible elements in them.

I guess maybe that’s one of the things that’s been therapeutic for me about writing fiction.  Maybe the trouble is right now that I don’t have a good outlet for my terrible thoughts.

Of course, I know that the idea of thoughts and emotions as “substances”, as if some manner of fluids, which can build up and need release is not merely incorrect, but is not even a good analogy for how emotions and other neurological states work.  This is part of why meditation is far more effective against stress and tension than is, for instance, the often counterproductive notion of catharsis.

Of course, sometimes things that work well for neurotypicals don’t work nearly as well for those on the autism spectrum*.  For instance, there is apparently some reasonable evidence that cognitive behavioral therapy, which often works quite well for neurotypicals with depression, is not as effective and can even be counterproductive for autistic people; we already tend to over-self-evaluate our cognitions, and so the tricks and workarounds of CBT often are not merely redundant but miss the issues entirely.

Along a line of possibly similar nature, I’ve written before about how meditation often serves to reduce my anxiety but at the same time worsens my depression.

And yes, in case you’re wondering, I think it’s all a matter of neurological states‒or neurohumoral states if you want to be slightly more precise.  I’ve spent nearly my whole life interested in such things; still, I have found neither evidence nor argument that has so far persuaded me that there’s any significant credence to the notion that humans are anything but temporary patterns of matter/energy, “spontaneously” self-assembled like any termite mound/colony or beehive/swarm**.

Once that pattern breaks because it can no longer sustain itself, due to injury or age or what have you, there is nothing more to it; it’s a hurricane that has passed.  There can be records and traces of its passing, and the damage it has done can linger for a long time, but there is no “afterlife” for weather patterns.

People are more complicated than hurricanes, at least in some senses, I will admit that.  But more intricate complexity doesn’t tend to make things more durable; it makes them more fragile, ceteris paribus.

Of course, all else is almost never equal.  Nevertheless, it’s often useful to consider complex matters as partial differential equations in more than one variable***; one explores the equation by holding all but one variable constant and differentiating or integrating along only one variable at a time.  As long as one thinks carefully about such things and never forgets that one is holding the other variables constant‒and by not forgetting, hopefully avoids oversimplifying one’s model of reality‒one can understand a great deal by recognizing when powerful tendencies persist even though other variables can influence matters.

For instance, the metallicity**** of stars influences the size at which they undergo certain levels of fusion, which is why it is thought that the earliest stars had different lifespans and luminosities relative to mass than later stars (like our sun).  But they still, overall, behave like stars, and the bigger ones shine brighter and last a shorter time than the less massive ones.  They are more alike than unalike, the narcissism of small differences notwithstanding.

Well…that tangent, or series of tangents, sure took me down unexpected paths!  But I guess that’s the nature of tangents; any nonlinear but smooth curve (even one as simple as a circle) has a functionally infinite number of possible tangents.

I think that’s the right mathematical metaphor; isn’t it?  I guess it doesn’t much matter.  I’m just expressing my highly stochastic thoughts (I doubt they’re truly random) as they come.  But they would probably follow different courses if I did not express them in this fashion.

I hope your own thoughts are less troublesome to you than mine are to me and that you are at least at some degree of peace with yourselves and with each other.  You might as well be, though I know that’s not enough to guarantee it.  Still, do what you can, okay?


*Which I am, as you may know; I have written at least in passing about my recent, quite late, diagnosis.

**I don’t mean “like” here as “the same as” but rather “in the same fashion as”.

***My terminology is a bit sloppy here, but I’m not trying to be mathematically rigorous, I’m just trying to get my thoughts across with some clarity and accuracy.

****To astronomers/astrophysicists, a “metal” is any other element but hydrogen and helium (this no doubt irks chemists).  The earliest stars would have been almost entirely hydrogen and helium, certainly to start off.  Mind you, even later generation stars like the sun are still by far mostly hydrogen, but seemingly small “contaminants” can have noticeable effects on big systems, as in the fact that water vapor and carbon dioxide markedly affect Earth’s atmosphere and surface temperature despite being present in tiny amounts compared to nitrogen and oxygen.

Stupidity makes me angry–especially my own

Once again, I am writing this on my smartphone.  Yesterday I didn’t even bother to take the laptop computer back to the house with me.  I was pretty much fed up with everything.  Though we had a successful day at work, there were multiple cases of people not paying attention to our guidelines and rules; but whenever I would bring them up, there was (and is) always an excuse to go around them‒sound familiar to anyone?‒and I repeatedly got overridden, leaving me to wonder why I bother.

I also hit the top of my head hard on the corner of my metal filing cabinet early yesterday, while reaching down to pick up a dropped pen.  It really hurt, and it left a cut, and I had a headache and a sore neck for pretty much the rest of the day.  Unfortunately, I don’t seem to have developed a subarachnoid hemorrhage, so I have to keep moving.  It sucks.

And, of course, there’s all the idiocy that is actively occurring in America and the rest of the world.  There might be some who would characterize certain things that happen and that people do as “sick” and/or even “insane”, but I don’t like to use such terms to describe the various moronic and submoronic things humans do that are not only detrimental but spread suffering to others.

First of all, it denigrates people who are actually sick/have mental illness and other related disorders.  Such people (of which I guess I am one) rarely do much harm to anyone but themselves‒though sometimes, some of us wish to do harm to certain carefully chosen other people.

But also, it dignifies the idiots.  After all, insanity is a legal term that indicates someone does not know right from wrong or lacks the capacity to control their own actions.  Now, at a deep level, it is almost certain that none of us has free will, at least not in anything but the vaguest, most hand-wavy, compatibilist sense.  But there is a real difference between someone who has OCD and cannot help but wash his or her hands until they bleed and a person who selfishly and arrogantly assumes that they have the right and the power and the competence to try to run other people’s lives but who then doesn’t accept responsibility for the horrific messes they make.

Stupidity can be defined as doing something in such a way that it is worse than just random action‒like trying to get to the airport by driving around one’s residential block over and over again ad infinitum*, or trying to solve a Rubik’s Cube by just spinning one side over and over (again, ad infinitum).  And this is so often the distillation of so many things that humans do, especially when they group together in significant numbers.

It reminds me of a post I saw on Threads or X or Bluesky or one of those.  The person said that people are selfish when isolated, but that such selfishness doesn’t really work, that we only survive and thrive by drawing together and supporting each other, working together, caring for each other.  This is true, as far as it goes‒humans are the most social of the social primates, and their greatest power comes from their ability to work together, to cooperate, to communicate.  This is why written language is the wellspring and lifeblood of civilization.  And yet, I am also reminded of the line from the original Men In Black, which I will only paraphrase here:  a person can be smart, but people together are stupid, reactionary, panicky, dangerous animals.

Both of these things are true, at least within certain contexts.  This probably explains at least part of the appeal of Ayn Rand’s** focus on rational self-interest‒which, in a large society, comes in the limit to be the same thing as rational altruism.  But it is strange to have those seemingly at least partly contradictory facts both be true, at least in a highly simplified outline of the social nature of naked house apes***.

It is terribly frustrating.  Even the most well-intentioned people, like the person who made that point about humans being social and needing each other (or at least many of those who agree with those sentiments) will often virulently demonize those who are on the opposite side of a given political spectrum or argument, not even trying to show compassion or empathy or understanding for those who disagree with them.

Likewise, those on the “other side” who seem to wallow in self-righteousness and yearn for authoritarianism will nevertheless seemingly believe that, for instance, they follow the teachings of a very socialistic, compassion-loving rabbi from first-century, Roman-controlled Judea.

These are some of the things that make me angry, not just the persistent headache and my other, never-ending body pains and mental divergences.  And although anger can be energizing, it is also unpleasant and, as Radiohead said, “it wears me out”.

I can endure a lot, it seems, whether out of stubbornness or willpower or just my own form of stupidity, but there’s no clear reason to keep enduring when there’s no evidence of any available relief or any joy that lasts more than a few hours at a time before leaving me alone to stew in my own, solitary, odious juices again.

I really do hate the whole universe a lot of the time, and that proportion of time appears to be growing as time goes by, like the product of some perverse Dark Energy in my own psyche.  I don’t know what to do about it in my almost entirely empty life.

I say almost entirely, because there are just enough little rays of light to keep me fooling myself that I might one day return to a satisfying, mutual daily existence with people I love, only to have those hopes draw away like a will-o’-the-wisp, keeping me eager and even desperate to follow them, but leaving me lost and stranded in the marshland of my mind instead of just escaping into oblivion.

Oh, well.  Life sucks.  No shit, Sherlock, what else is new?  Further clichés as thoughts warrant.

I hope you lot are in better mental states than I am, and that you each and all have a good day.


*To borrow an example, though I cannot right now recall from where.

**Do you think Ayn Rand might have been an undiagnosed autistic person?  Discuss.

***It reminds me of the “Riddle of Steel” as described in the movie Conan the Barbarian.  Early in the movie, Conan’s father tells him that you cannot rely on men or gods, but that you can trust steel.  But then, later, Thulsa Doom (played by James Earl Jones) reveals the punchline of the riddle:  Steel is not strong, flesh (i.e., a person) is stronger.  These contradictory truths engender and represent the vortex of seeming paradox through which people must try to navigate, to find the eye of the storm, the balance point at which effective action is possible.

Mind your vectors and terms of address

I’m writing this on my mini laptop computer again, because even though I find the extra weight of carrying it mildly annoying at the end of the day, at least sometimes the irritation of trying to write using my stupid smartphone is worse.

Although, since those two versions of me exist at different times, it’s hard to weigh their degrees of perceived irritation against each other.  In the morning, if I’m using my thumbs to try to type on a diminutive screen in a fashion that could be easily predicted to lead to some manner of repetitive stress injury, it’s all too natural for the “me” of that moment to hate the “me” of the previous evening who elected not to bring the laptop computer back with him.

But the “me” of the evening, when faced with the minor extra effort of the mini laptop, can feel very much overwhelmed and exhausted and think that the “me” of the following morning won’t find the process of writing using the smartphone particularly difficult.

The human consciousness clearly doesn’t have one, singular, constant terminal drive or goal as an imagined artificial general intelligence might.  I suppose one might think that the drive “to stay alive” would count as an ironically designated terminal goal, but that’s clearly not an accurate interpretation of the situation.

Not only are some people quite self-destructive and even actively suicidal—which you might credibly dismiss as dysfunction, not the lack of a dedicated system, though I think that would be imprecise—but there’s no good way to think that such a specific drive could evolve.  Evolution is blind to “death” as a concept or force, except as a failure, an accident, a lack, whatever you want to call it.

Before humans, as far as we can tell, no creature on Earth had a concept of “death” as the cessation of the biological processes of an individual organism.  Instead, there are proxies, such as the drive to avoid pain, and the related strong sensation of fear relating to danger and so on.

Similarly, there is no drive “to reproduce” in human (or other animal) minds.  Teens going through puberty don’t start feeling the literal desire to replicate their DNA in other bodies.  Instead, proxies for reproduction evolved, urges and drives that tended to lead to increased chances of reproduction, such as dominance hierarchy drives and displays in social primates such as humans, sexual attraction, and—of course—the pleasure of sex itself, with the reward-based drive to have it as often as feasible (with other inputs adjusting the strength of that drive and causing it to manifest differently in the two biological sexes and at different times and places).

The human brain—like probably all the other adequately complex brains on Earth—is a mélange of modules, with varying drives and processes that have evolved in parallel and sometimes independently, and also developed ways of interacting with each other.  Of course, at the root are the automatic drives that are all but undeniable—the respiratory drive, the thermoregulation drive, and so on.

There are even drives that are neurological in a broad sense, but that are so fundamental that they cannot be interdicted by the rest of the nervous system, only adjusted—I’m thinking here mainly of the heartbeat, the driver of which is in the sino-atrial* node and the Purkinje system of the heart, which is sort of a cross between muscle and nerve tissue.

The upshot is, if you ever feel that you’re “of two minds” on some particular subject, you’re probably not just speaking metaphorically, whether you know it or not.  Your final actions are produced by what I see as the final vector sum (and it can be quite small in the end or it can be huge in magnitude and surprising in direction) of all the drives or “pressures” in the brain that have any effect on decisions about behavior.  Then the action caused by the final behavior feeds back on the system**, changing the lengths and directions of some (perhaps sometimes all) of the contributing vectors, causing changes in the inputs and thus changes in the final vector sum of behavior.  Lather, rinse, repeat as needed, ad nauseam if not actually ad infinitum.

Please don’t imagine this as the sum of physical vectors in real spacetime.  The number of possible dimensions of such mental/neurological vectors is huge.  For all I know, there might even be spinors and tensors and matrices involved, but I don’t think those are necessary for my vague model.  “Simple” higher-dimensional vectors probably do the trick.
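For what it’s worth, the vague model can be caricatured in a few lines of code.  Everything here is invented for illustration (the drive names, the three dimensions, the damping factor); none of it models real neurology.  It just shows a resultant that can be small or large depending on how the contributing vectors align, plus one crude feedback step:

```python
import numpy as np

# Toy sketch of the "vector sum of drives" idea.  Drive names and
# numbers are made up; real drive-vectors would have vastly more
# dimensions than three.
drives = {
    "hunger":    np.array([ 0.8, -0.1,  0.0]),
    "fatigue":   np.array([-0.5,  0.0,  0.3]),
    "curiosity": np.array([ 0.2,  0.9, -0.4]),
}

# The "behavior" is the resultant of all contributing drive vectors.
behavior = sum(drives.values())

# Its magnitude can be small (drives nearly cancel) or large.
magnitude = np.linalg.norm(behavior)

# Crude feedback: acting damps every contributing drive a little,
# shifting the next resultant -- lather, rinse, repeat.
drives = {name: 0.9 * vec for name, vec in drives.items()}
```

With these made-up numbers, the three drives partially cancel in the first dimension and reinforce in the second, so the resultant points in a direction no single drive does, which is roughly the point.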

What a curious set of things about which to write that was!  I had originally intended to start this post with some version of The Simpsons’ “Hi, everybody!”/“Hi, Dr. Nick!” exchange, perhaps then noting that I could change “Dr. Nick” to “Dr. Robert” and thus reference both The Simpsons and the Beatles at the same time.

But then I might have noted that, although the Beatles song is so titled, “Dr. Robert” is not the way anyone has ever referred to me in actual practice.  It would be, honestly, a little weird for someone to refer to their physician as, for instance, “Dr. Joe” or “Dr. Judy” or whatever, certainly in our culture.

Mind you, there was that tendency for a while (it may still be prevalent) to have kids speak to adults such as teachers and daycare workers and people of that sort using their “title” and then their “given name”, such as “Miss Barbara” or “Mister Jimmy”.  I have always thought that was weird.  I mean, just imagine someone trying to address a certain prominent fictional character as “Dr. Hannibal”.

Alas, that all ended up being a discussion not worth having, except as an afterthought.  Though it’s debatable whether any discussion at all is actually worth having—including the discussion about whether any discussion is worth having.

You all can discuss that if you want; feel free to use the comments below, and to share this post to your social media platforms or what have you.  When you do discuss it, remember to define your terms ahead of time, and stick to them rigorously—i.e., the meaning of “discussion”, and of “worth”, and so on—so that you decrease your chances of getting involved in semantic games and misunderstandings and sophistry.

Whatever you choose to do, please try to have a good day.


*The “sino” in that term relates to its location in what’s called the sinus of the heart, and the “i” in it is a long “i”; it has nothing to do with China, though an identical prefix is sometimes used to mean “related to China”, in which case it has a sort of short “i” sound…or, really, a long “e” sound.

**And there are surely numerous other feedback loops all along the way affecting many, or perhaps all, of the vectors.

Don’t be afraid of “scare quotes”; they are–as am I–here to “help”

It’s Friday at last, the last day of a work week that has lasted at least 12 days already (subjectively speaking).  I am not working tomorrow, so there will be no blog post made again until Monday, barring‒as must always be the case‒the unforeseen.

I will try to remember to send myself the audio files for my last two audio blogs‒or perhaps it was three‒to turn into “videos” over the weekend.  I haven’t downloaded Clipchamp or whatever it is to my home computer, but it should be no more difficult to do there than it was at work.  Of course, I may not do that, so don’t make any plans that depend upon my doing it‒goodness knows what such plans might be.

I’m not sure whether anyone really likes those “video” versions of my audio blogs or is just as happy with the plain audio.  I’ve noted before that storage on YouTube is functionally limitless (unlike on WordPress), but if I’m uploading the files here first anyway, that’s a moot point at best.

You may have noticed that I tend to put quotation marks around the word “video” when I refer to the above, because though technically they are indeed video files, the visual portion is just a static image.  I’m a big fan of so-called scare quotes.  I think we should use them far more often than we do.  People often arrogate terms to themselves, or use epithets against others, as a means of manipulation, as if invoking some sequence of letters or sounds causes a thing actually to be the case, and I think it’s important to point out when one is unconvinced that the term is being used properly or accurately.

Perhaps the most prominent and pointed such ill-use might be regarding “progressives” and “conservatives”.  Both groups inherited the terms from people who came before, and who perhaps more accurately embodied the general meanings of the words, but they are now simply camouflage uniforms, at least in many cases.  You can call yourself a “freedom fighter” if you want, but using that term doesn’t mean you’re not a terrorist or that you’re actually interested in any legitimate form of freedom.

Of course, real conservatives and progressives being at hostile odds with one another doesn’t make much sense if one is considering the usual meanings of the terms rather than claiming them as team names in some tribal contest of primate dominance.  It makes sense to conserve those things in a society that are effective, that have been tested by time and found to be useful, but it’s just as reasonable that everyone should want to make actual progress whenever possible, to improve life and prosperity for everyone as much as is feasible.

The real, useful discussion would be about which things are working well and should be conserved, and which things require improvement and how to go about it.  There will be substantial disagreement on such questions, of course, and part of the discussion must always be how to decide what best to keep as it is and what is the most fruitful area in which to improve things.

People of good will‒who do not think in terms of “us” versus “them” but in terms of usefulness and effectiveness and trying to get the best outcome for as big an “us” as possible‒can work in ways that will be beneficial by whatever measures one might want to use, keeping in mind always that all conclusions are in principle provisional and all processes and people are fallible, but that all problems are in principle soluble.

I’m not sure humans are clever enough primates to achieve such matters for long.  They seem to devolve so readily into conflicting tribes.  I guess this makes sense given the ancestral environment, with groups on the order of perhaps 150 people living together.  But there’s no good excuse for not recognizing that tribal modes cannot function ideally in a setting in which 8 billion people are interacting in a massive and incredibly productive and complex economy and polity.  At higher levels of complexity, newer “rules” are going to tend to be required.

Humans aren’t necessarily all that good at adjusting to such things, though.  I often think that it will require a new and ongoing external threat, such as a supervillain or an alien invasion, to bring humanity together in total.  I’ve often been tempted to volunteer myself for the position, since humanity really can be contemptible and infuriating to me.

It’s not that humans are worse than the other life forms on Earth; I don’t think they are.  Life in general is frequently vicious and cruel and wretched, with all living things riding the knife edge of death and extinction much, perhaps most, of the time.  Nature’s equilibria are not achieved by some beautiful, fairy tale cooperation and self-restraint between forest creatures or what have you.  Equilibria are maintained by disease and death, by starvation and predation.  Agent Smith was just wrong, dead wrong, in his assessment of life’s tendency to form such natural equilibria.  He was too generous in his assessment of non-human forms of life.

Humans, however, are more competent than other animals.  They are also the only ones even capable of seriously planning ahead to strike a flexible and ever-changing balance between conservatism and progress.  It’s that they so often fail even to try to rise above their lizard-monkey minds that is so infuriating, and they themselves are among the worst of their victims.

Sometimes I think just wiping them all out would be a kindness‒not to the rest of the living world, which is certainly no more admirable or worthy of kindness than humans, but to humans themselves.  After all, if a function of time is always negative, then integrating to find the area “under” the curve will always yield a negative result, and a permanent regression to zero would be a gain.  Maybe the universe, or at least the Earth, would be kinder in aggregate if it were sterile.
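The little calculus quip can be spelled out, for anyone who wants it, with f standing for the hypothetical net “value” of existence over time:

```latex
\text{If } f(t) < 0 \text{ for all } t \in [a, b], \text{ then for any cutoff } t_0 \in (a, b):
\qquad
\int_a^b f(t)\,dt \;=\; \int_a^{t_0} f(t)\,dt + \int_{t_0}^b f(t)\,dt
\;<\; \int_a^{t_0} f(t)\,dt \;<\; 0.
```

That is, truncating an everywhere-negative function to zero at any point strictly improves the accumulated total‒which is all the quip claims.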

It’s food for thought, at least, and it is tempting.  What do you all think?  I’m not asking what you feel.  I hate feelings*.  But when you are as close to dispassionate and disinterested as you can make yourself, what do you think?  Does the human race (and by reflection, life itself) require an enemy to bring out its best?  If so, does it not then “deserve” that enemy?  And if it cannot defeat that enemy, does it not “deserve” to be destroyed?

I suspect that might be the case.


*Ha ha, that’s a little joke.