We have met the cosmic horror, and…

Well, here I go again (on my own, like the song says) writing another blog post.  As for why I am doing so, well, there is surely a set of causes‒potentially tracing all the way back to the Big Bang, or at least to the period just during and/or after inflation, assuming that happened, which seems more likely than not‒but there may not be any good reason for it.

Oh, of course, I could come up with reasons.  I could “justify” myself.  Indeed, there is reason (har) to think that justification and persuasion to bolster one’s status and identity in a tribe against others with opposed motives may have been one of the driving forces behind the development of the human reasoning capacity.  This is apart from, and perhaps almost orthogonal to, the basic power of reasoning to understand and thus best navigate the territory of reality.

Once it got started, reasoning would have accelerated thanks to biological arms races between those competing for survival and reproduction, and then it would have turned out serendipitously to have been more broadly and powerfully useful than merely for securing status and food and mates.

Imagine if the peacock’s tail had turned out not only to be ostentatious and beautiful and sexy (to peahens, anyway) but tremendously useful and broadly powerful, especially once it reached a certain level.  Imagine if the peacock’s tail had allowed peacocks to build skyscrapers and boats and trains and planes and cars, if peacocks’ tails helped peacocks build a global civilization, quite apart from their ability to secure status and attract good mates.

That’s quite possibly more or less what happened with human brains.

Of course, like the peacock’s tail, the human brain is not without its drawbacks.  I suspect that things like depression and anxiety, and perhaps even neurodivergence, are simply potential (and statistically inevitable) outcomes for a brain that has grown powerful enough to assess the world deeply and uncover the almost Lovecraftian terror of our tiny little existence when placed against the scope and scale of the cosmos.

I say “Lovecraftian”, but even with Lovecraft, though the beings in the mythos are thoroughly inhuman and incomprehensible‒unsane, as I like to say‒they are still beings.  The true cosmic horror is surely that beings of any kind are almost nonexistent; indeed, to a very good approximation, they are nonexistent.

In some senses, this can at least be morally reassuring.  If we do go and spread out through the universe‒or even just the galaxy or even just our local family of stars‒and there are indeed no other life forms, then at least we need not worry about violating implicit rights.  Uninhabited asteroids (for instance) don’t have goals or wishes and, as far as we can tell, they cannot suffer.

Of course, we may have aesthetic concerns about such things, but aesthetics are not as urgent as ethics.  And, of course, we will still have moral/ethical concerns toward each other; that almost goes without saying.

Whether or not we will exist long enough for the ethics (or lack thereof) of changing the state of uninhabited other places in the galaxy to be pertinent is quite uncertain.  I see nothing in the laws of physics that makes it impossible, so in that sense, I am optimistic.  But I see nothing in the laws of physics, nor more specifically in human nature, that makes it certain or even likely that we will survive to spread out from our native planet to any significant degree.  And I see nothing in the laws of nature that seems to imply that, if we don’t succeed and spread through the cosmos, anyone else will do so, or indeed that anyone else even exists.

Don’t get me wrong; physics clearly and undeniably allows life to exist, and it allows (human-like) intelligence and civilization to exist.  But those are two different scales of allowance.

The molecules and principles of life as we know it‒long-chain molecules capable of carrying information and of replicating themselves, leading to “competition” and “improvement” and increasing complexity and so on‒seem straightforward enough that such life may well be arising (though far from certainly) in a good many places in the universe.  The equivalents of viruses and prokaryotes may exist in many regions.  It’s even possible that there may be such life in other places in our own solar system (Europa and Enceladus being possible contenders).

But multicellular, “eukaryotic” life seems likely to be much rarer.  Basic life started on Earth, as far as we can see, very shortly after the Earth formed and cooled enough for complex molecules to endure (nearly 4 billion years ago).  Single-celled eukaryotes came along much later, and complex multicellular life didn’t really arrive until the last 600 million years or so, with the Cambrian explosion a bit over 500 million years ago.  So, roughly seven eighths of the way into the history of life on Earth, it was basically just “bacteria” (and archaea) and some viruses.

Then, for significant, interpersonal, symbolic and technological intelligence to develop took another…well, basically another 500 million years.  And as far as we can tell, it’s only happened once.

That doesn’t give us a good, clear picture of how rare or common such a thing is‒one is a difficult number of experimental subjects from which to draw too many conclusions‒but it’s possible that the existence of technologically intelligent life is so rare as to occur only once per, on average, every chunk of spacetime as large as our visible universe.  It could even be rarer than that.

In an infinite cosmos, of course, even such exceedingly rare events would happen an infinite number of times (so to speak).  But that doesn’t necessarily make things less lonesome.  If you have an infinite number of decks of cards (with no jokers), all thoroughly shuffled together, there are literally just as many Aces of Spades as there are red-suited cards in total (ℵ₀, the “smallest” infinity).  Nevertheless, if you draw cards randomly, you will only get an Ace of Spades one twenty-sixth as often as you will get a red-suited card.

Similarly, there are as many whole multiples of a trillion as there are integers in general (again, ℵ₀), but if you pick a random integer, you’re still only going to pull such a multiple one out of a trillion times (on average).
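The deck analogy is easy to check numerically.  Here is a quick Python sketch (my own illustration, not anything from the post): drawing from infinitely many thoroughly shuffled, joker-free decks is equivalent to drawing a uniformly random card type each time, so we can just sample a million draws and compare frequencies.

```python
import random

# With infinitely many shuffled decks, every draw is effectively a
# uniform pick from 52 card types: 1 Ace of Spades, 26 red-suited cards.
random.seed(0)
draws = 1_000_000
aces_of_spades = 0
red_cards = 0
for _ in range(draws):
    card = random.randrange(52)   # 0 = Ace of Spades; 26-51 = hearts and diamonds
    if card == 0:
        aces_of_spades += 1
    if card >= 26:
        red_cards += 1

print(aces_of_spades / draws)      # close to 1/52, about 0.019
print(red_cards / draws)           # close to 1/2
print(aces_of_spades / red_cards)  # close to 1/26, about 0.038
```

Equal cardinalities, in other words, but very unequal frequencies: the Ace of Spades turns up about one twenty-sixth as often as a red card, exactly as the text says.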

So, maybe the takeaway is that the real cosmic horror may be that we are the only entities haunting the abyss, and there are no (other) mad idiot gods bubbling away at the center of celestial existence.  Maybe it’s just us.  And if our lights go out, then nobody is home.

It’s worth considering, not least because it has every chance of being true, whether literally or just practically.  For if the nearest other technological life form is in another galactic cluster, for instance, then we are, for all reasonable purposes, alone in the universe.

The forecast calls for uncertainty

It’s Friday now (as I write this, anyway), and I think that I will have tomorrow off.  But, as some of you may have noticed, the specific plans about my work Saturdays are subject to rather erratic change.  It’s quite annoying; I don’t really like unexpected changes to plan.  I particularly don’t like them when I don’t agree with the reasoning behind them.

Of course, our two most consistently top salespeople at the office stipulated, when they came aboard, that they would not work on any weekends.  And, as I said, they are consistently our best.  Could there be a causal connection between those facts?  Well, correlation does not necessarily imply causation, of course, but enough correlation should at least shift your credences.

Unfortunately, humans are not naturally good at probability and statistics.  This is part of why I think the subject(s) should be taught in standard education, starting quite early.  Though the subject(s) can be somewhat counterintuitive, the mathematics is not really all that rarefied or difficult, and probability and statistics apply to so much of the world.  On the smallest scales they seem to apply fundamentally.

Anyway, I didn’t come here today to discuss probability and statistics, though obviously I enjoy the subject(s).  So, then, what have I come here today to do or to discuss?  Well, now that I think about it…there is no particular subject.  I don’t know why that should surprise any regular reader, let alone me.

It will probably not surprise you that I have not started playing on Babbel or Brilliant yet.  I do at least look at the apps frequently throughout the day, considering using them and so on.  For whatever that’s worth.

I can allow myself some excuse with Babbel, since it’s difficult to practice a language in a busy office.  But there’s no such reason not to use Brilliant.  Its teaching and exercises are set up in nice, granular ways, so you can do one problem then get called away by work, or whatever, and then go back.

I don’t even mind the rather hokey “experience point” system they use to reward you when you get an answer right.  It’s kind of fun, but it’s not too involved or taken too seriously by the app makers (or so it seems, anyway).  And I definitely have learned new things on the app in the past, and honed and renewed prior skills as well.  So it’s not a waste of time by any means.

The same cannot be as confidently said* about the various apps/sites on which I no longer have accounts.

Of course, time passes‒or whatever it is that time really does‒no matter what we do, and sometimes “wasting” it can be a fulfilling choice.  If we are metaphorical virtual particles then we can behave like them from time to time, not just heading directly to the next interaction, but maybe throwing out an electron-positron pair and then reabsorbing them before they could be detected, or going around the universe and coming at the interaction from backwards in time and behind, as it were, just to show off a bit.

Not everything has to be useful, at least not in too narrow a sense.  Usefulness, like so many things, is in the eye of the beholder.  It is certainly not a universal, general attribute of reality.  So, while it may only rarely be wise to be counterproductive from one’s own point of view, there are times when it’s good‒maybe even useful, ironically‒not to worry about whether something has any point or not.

Yeah, I’m not terribly good at doing that, either.  I don’t know how much of that is due to culture/upbringing and how much of it is genetic or at least neurodevelopmental.  I’d guess it’s not too far from 50/50, but I would not be shocked to find the full truth surprising.

Regarding whether to worry about app usefulness or lack thereof and whether to spend time on the ones that I will have wished I spent time on, well, it’s been said that wisdom, at least a form of it, is the ability to follow your own advice (i.e., the advice you would give to someone else if they were in your circumstances).  I think most people would be able to recognize that, by that particular definition, we are all quite unwise, quite often.

Okay, well, I’ll start to wrap this up.  I really should not be working tomorrow, but if I do, I will almost certainly write a post.  It’s quite unlikely‒I would call it less than 20% likely‒that I will work, but we shall see.  You can check in if you’re “in the neighborhood”.  Don’t look for my posts to be shared on Facebook or Threads anymore, but I do share them on Substack and Bluesky and TWFKAT.  And you can always find them here, directly, and comment if you wish.

Have a good weekend in any case.  That’s an order!


*Well, it can be said, but talk is cheap, mother f#cker.  Rather often, people say they are confident and act sure about situations or information that they cannot know with confidence.  I always consider this unwarranted confidence to be a “red flag”, a warning sign that the person’s judgment is unreliable.

Sometimes drunkards walk to interesting places

Well, well, as the oil tycoon said*.  It’s Saturday now and I am actually writing a blog post, as I expected I would.  It’s been three weeks since the most recent prior Saturday morning post (not counting my “non-post” from last week).  But today, this weekend, I am going to work, and so I am writing a post.

I hope you’re proud of yourself.

Okay, well, that last sentence doesn’t really make sense in this context, but I felt the curious and rather inscrutable urge to write it, and there was no real downside to doing so, so I did.  These are the sorts of things that happen in biological, nonlinear, largely subconscious brains that are communicating using language (especially written language, in my case).

A truly efficient, direct, deliberately programmed AI (not a neural net style, LLM type of AI, but one whose algorithm is precise and understood) might not produce such erratic and seemingly peculiar thoughts.  But maybe it would.  Maybe one cannot have actual intelligence, with creativity and the like, without having a system that meanders a bit into the highly tangential.

I suspect this may be so, because in order to grow and gain new knowledge, to be creative, there has to be a capacity to embrace the unknown‒not in an H. P. Lovecraft sense, but more in a sense reminiscent of Michael Moorcock’s** character that strode into chaos and by interacting with it caused it to become a locally specific order***.

The potential paths into the future which one might, in principle, explore are functionally limitless, and may actually be infinite.  It’s not possible to evaluate them comprehensively through any kind of linear logic‒not in the time span available to the universe, anyway.  So, for things to work at all well, there must be a bit of potential for “randomness”, for moving forward into a future that is one’s best guess, or into which one has narrowed down at least some of one’s choices.  Then one can find a “good enough” path or course of action, one which may produce insights and outcomes that were not, in practice, predictable by any finite mind.  (In a way this follows from the fact that, if you can precisely and specifically predict what insight you are going to have, then you have already had it.)

It’s a bit like evolution through natural selection, where the mutations are effectively random, but the survival of those “mutants” is not at all random, at least in the long run, on a large enough scale.  Still, there’s no pre-thinking involved, no teleology, merely “motion” that is constrained (by differential survival due to the facts of surrounding nature).

Even if one has a fairly specific goal, trying to plot out one’s way through the phase space of one’s potential future paths in a very specific and precise and preplanned course is unlikely to be doable.  It may not be preferable even if it were possible.

It may be analogous to trying to get from one location to another in, say, the same city, by following a direct, straight line from one spot to the other.  One probably won’t be able to make any progress at all for very long; buildings and streets and vehicles and the like are probably going to get in the way.  Heck, the very surface of the Earth could be an impediment to any truly straight path, since it is curved****, but we’ll stipulate that you can follow a geodesic (the shortest distance between points on a curved surface).

Anyway, if one precisely follows only a preset straight path, even if one can more or less achieve it, one misses out on many potentially beneficial but unpredictable paths.  Imagine one is heading to one’s usual, mediocre but tolerable, fast food restaurant for lunch, and one only goes straight there without even looking around.  One might well miss seeing all the many other available restaurants, some of which one may find preferable‒perhaps by a great margin‒to one’s “planned” place.

That’s a slightly tortured metaphor, and I apologize for that fact, but I hope you know what I mean.

It doesn’t do‒usually‒to try to make progress by a true random “drunkard’s” walk.  The number of possible outcomes isn’t a power law, in fact; it grows exponentially, multiplying by the number of available choices with each new step.  But if one keeps one’s long term goal generally in sight, and one heads in that general direction, adjusting for buildings and railroads and hills and lakes and so on, constantly assessing and, when necessary, adjusting one’s course, one can usually not only get to one’s destination rather well, but one can encounter new sights and new experiences along the way.
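To put a number on that growth (my own illustration, not from the post): on a square grid where each step has four possible moves, the number of distinct n-step walks is 4 to the power n, which a tiny Python sketch makes visible.

```python
from itertools import product

# Each step of a grid walk offers 4 choices (N, S, E, W), so there are
# 4**n distinct n-step walks: exponential growth, not a power law.
for n in (1, 2, 5, 10):
    print(n, 4 ** n)
# By n = 10 there are already 1,048,576 possible walks.

# Sanity-check by brute-force enumeration for a small n:
n = 5
walks = list(product("NSEW", repeat=n))
print(len(walks) == 4 ** n)  # True: 1024 five-step walks
```

A thousand-step drunkard’s walk thus has more possible realizations than there are atoms in the observable universe, which is why exhaustively evaluating paths is hopeless and heuristic course-correction is the only game in town.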

Some of these encounters might even make one decide to change one’s goal of travel, having found a better one (by whatever criteria) as one went along.  That’s not going to happen to someone who is dogmatically focused on only one path and only one goal.

Okay, well, that’s my rather stochastic blog post this Saturday.  I hope you are already having an excellent weekend, and that it continues to be excellent (or if it is not yet excellent, that it becomes so in short order).  Thank you for reading.


*To his son, Derrick.

**I don’t remember which character‒it’s not Elric‒or which story.  My apologies.

***Of course, as I think I’ve said before, order is not the opposite of chaos, but is rather a subset of it.

****It is.  Seriously.  There is no reasonable doubt about that fact, and it has been known to humans for at least 2200 years, since Eratosthenes calculated (correctly, to within a few percent) the circumference of the Earth using the known distance between Syene and Alexandria‒effectively a distance along a geodesic‒and the difference in the angle of the noon sun’s shadows at the two places at the same time.

You’re so vain, you probably think that nothing matters

I was going to start by saying that I had probably written all I could about Friday the 13th, and about the fact that whenever a non-leap-year February has a Friday the 13th, March gets one too‒two in a row.  A first glance might lead one to think this should happen roughly every 7 years on average*.  However, as I noted last time I discussed this, because the leap year day is in February, we will not have the two-in-a-row Fridays the 13th (February and March) as often as that.

Then, this morning, after recalling that today was Friday the 13th, I ran through the next years’ Fridays in my head in the shower, and it occurred to me that the next Friday the 13th in February‒which will be in 6 years, as I noted in the past‒will not be followed by a Friday the 13th in March!  2032 (six years from now) will be a leap year, so there will be 29 days in February, so there will be no Friday the 13th in that March.

The next paired ones, then, will be a further 5 years after that, in 2037 (not a leap year).  It would have been 6 years later, but there are two leap years in that interval, 2032 and 2036, so the next one comes a year sooner than it would otherwise.

It occurred to me that, because of the frequency of leap years, which is almost twice that of the cycles of days of the week, the frequency of those paired dates may well be once every 11 years rather than every 7.  At least those are both prime numbers.  I’m not going to work out some exact formula right now, though.  It’s not really important.
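For anyone who does want it worked out, a brute-force check (a Python sketch of my own, using the standard library’s `datetime`) settles the question of how often the paired Fridays the 13th actually occur:

```python
from datetime import date

# Years in which both February 13 and March 13 fall on a Friday.
# (weekday() == 4 means Friday.)  This can only happen in non-leap
# years, since a leap day shifts March's weekdays by one.
paired = [y for y in range(2020, 2101)
          if date(y, 2, 13).weekday() == 4 and date(y, 3, 13).weekday() == 4]
print(paired)   # starts [2026, 2037, 2043, 2054, ...]

gaps = [b - a for a, b in zip(paired, paired[1:])]
print(gaps)     # every gap is either 11 or 6 years
```

The gaps run 11, 6, 11, 11, 6, and so on: three paired years per 28-year weekday cycle (away from the skipped century leap years), or about one every 9⅓ years on average‒so neither 7 nor 11 is quite right, though both numbers appear in the spacing.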

Of course, one could say that nothing is truly important, and I am persuadable along those lines.

There is a Doctor Who Christmas Special (the one from series 5) in which the antagonist/guest protagonist (played by Michael Gambon!) describes a woman in a cryo chamber as “nobody important”, and the Doctor characteristically responds by saying, “Nobody important?  Blimey, that’s amazing.  You know, in 900 years of time and space, I’ve never met anyone who wasn’t important before.”

This is typical Doctor, of course, but it raises the objection Dash (from The Incredibles) voiced when told that everyone is special:  Saying that everyone is important can be the same thing as saying no one is.

Of course, important is in the eye of the beholder.  But then again, the beholder is not important, either, except in its own subjective estimation and perhaps that of a few other, equally unimportant, owners of such eyes.

So, yeah, one could argue relative and subjective importance from local points of view, which is valid but more or less vacuous outside its small scale as far as I can see.  On a cosmic scale, it’s all just dust and shadows.  But you could also say that about the entirety of the cosmos itself.

I guess import has always been subjective, even though people are not inclined to see it that way.  But, of course, people are the products of their “local” forces, and they are not responsible for the laws of nature, nor for the things which have happened in the past that have affected them in the present (which could come under a certain interpretation of “the laws of nature” in and of itself).  I won’t get into all that now.

Going back to the shower, but on an entirely different subject, I was also thinking about the effects of diminishing amounts of shampoo in the bottle on the center of gravity of the bottle.  At the start, when it’s full, the center of gravity is roughly in the geometric center of volume of the whole thing.  But as one uses the shampoo, the center of gravity shifts lower and lower, since the air replacing shampoo in the upper part of the bottle is much less dense than the shampoo or the bottle.

But then, as one gets to the dregs, the smaller and smaller amount of shampoo in the bottle contributes less and less to the overall mass distribution of the bottle and its contents, and the center of mass begins to head back up.  Finally, when the bottle is “empty”, the center of gravity will have returned to almost the same place it was when the bottle was full.
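That dip-and-return is easy to see in a toy model (my own numbers, purely illustrative): treat the empty bottle as a fixed mass with its center at half the bottle’s height, and the shampoo as a column whose mass and height shrink together.

```python
# Hypothetical bottle: 20 cm tall, 50 g empty, holding up to 400 g of
# shampoo.  The bottle's own center of mass sits at H/2; shampoo filled
# to height h has mass proportional to h and center of mass at h/2.
H = 20.0          # bottle height, cm
m_bottle = 50.0   # empty bottle mass, g
m_full = 400.0    # shampoo mass when full, g

def center_of_mass(fill):  # fill is the fraction remaining, 0 to 1
    h = fill * H
    m_s = fill * m_full
    return (m_bottle * H / 2 + m_s * h / 2) / (m_bottle + m_s)

for fill in (1.0, 0.75, 0.5, 0.25, 0.1, 0.0):
    print(f"{fill:4.0%} full: center of mass at {center_of_mass(fill):5.2f} cm")
```

With these numbers the center of mass starts at 10 cm (full), sinks to about 5 cm when the bottle is a quarter full, and climbs back to 10 cm when it is empty‒exactly the down-and-back-up trajectory described above.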

All that’s fairly trivial, well-known stuff, I know.  But it got me to thinking about how many problems in physics, such as those of gravitation (in its Newtonian form), are solved using concepts like the center of mass, which is really just a way of combining and averaging the effects of numerous tiny bits of gravitating material as if they were concentrated at one point.

Much of the mathematics of physics works this way, coarsely approximating the very fine details of reality in a way that provides reliable, reproducible guidelines and can produce testable predictions.

But the granularity of reality doesn’t actually ever go away, not at any level.  Even at the level of the quantum wavefunction of a single “particle”‒an electron, say‒the actual behavior of the thing as it interacts with the “larger” world is the summation of the effects of all the possible quantum states of that electron superposed upon each other and interacting with other things‒everything‒which are themselves also just collections of superpositions of quanta.  That superposition happens in a “space” that doesn’t directly coincide with the macroscopic space we experience, but whatever its dimensions are, they are real, because they have durable, reproducible effects.

Mathematics may be unreasonably effective in the physical sciences, as Eugene Wigner famously noted, but it seems not to be a refining of description but rather an averaging out, a glossing over, the inking of an underlying rough pencil drawing which nevertheless still constitutes the real, original picture.

It may be that, in a sense, all science is just various forms of statistical mechanics.  We know that, at larger scales, we definitely need the tools of probability and statistics to navigate as best we can the territory of reality.  And yet, we don’t teach this sort of stuff to most people, ever.  I wrote a post about this on Iterations of Zero, if I remember correctly.

I could go on about all this rather easily, I guess, but I am using my smartphone today, and my thumbs are getting sore.  That’s okay; yesterday’s post was probably way too long, anyway.

If I did a video of my thoughts on this I might be able to get into more detail, though it would probably be even more erratic and tangential than my writing.  Still, maybe it would be worth trying.

In the meantime, I’ll write at you again tomorrow.


*Go ahead, do a search on my blog page for Friday the 13th; I’m all but sure it will bring up the pertinent blog posts.

 

Give us this day our daily blog

It’s Tuesday now, and I’m writing this on my mini lapcom.  I don’t know if I wrote any of my posts from last week on the lapcom*, but this one will represent 50 percent of this week’s posts so far.

Admittedly, that’s not saying much, and one cannot draw many conclusions from a two-item sample in which one is one way and one is another.  To presume that they will continue to occur in a 50/50 ratio would be a major statistical/probabilistic error.  At best, one can say that there are at least two ways in which my blogs can be written, since two have so far been sampled—and that is certainly true.

Anyway, speaking of twos, it’s Tuesday.  It’s the 10th of March, of course, and the second full weekday in Daylight Saving Time, or in non-Daylight Saving Time, whichever one it officially is now.  You can tell that I really don’t see the sense in the whole thing from the fact that I cannot even recall, nor logically infer, which of the two possibilities is correct.  When I am actually interested in something, I tend to try quickly to dispel any ambiguities in my understanding if I can.  With this, I really don’t care, because it’s all silly.

In fact, it’s so silly that I think that’s all that need be said about it.  On to better things, or at least to other things.  But, of course, the question now is:  What other things should I discuss**?  I don’t know, honestly.

I don’t know dishonestly, either, come to think of it.

Isn’t it weird how much of a habit it is to say things like, “honestly”, or “to tell the truth”, or “I swear”, or other similar words and phrases to try to emphasize the authenticity of our words?  But they don’t do anything at all to confirm our truthfulness; epistemologically, they’re almost without content.  If anything, the fact that we felt unsure enough to have to say we’re being honest might raise a so-called red flag in the mind of a given listener.

Does the fact that a person says “honestly” or “I’m not gonna lie to you” or any similar phrase actually provide any information about truthfulness, except for the fact that this person recognizes that truthfulness is valued, at least by the person to whom they are speaking?  It doesn’t really demonstrate truthfulness, I think that’s clear.

Some might be inclined to think that the words actually indicate falsity, but that’s not true, either (ha ha).  It may be the case, at times, that a person who is trying to deceive another may say “honestly” to reassure their interlocutor that their lies are true and also to relieve some of their own anxiety.  But people who are telling the truth may merely want to recognize and emphasize that fact, and so use the same phrases.  They may, for instance, realize that something true they are saying could seem improbable to some hearers.

If it were always a harbinger of a lie, then such a seeming reassurance would indeed be a reliable signal, but of the opposite state from that described in the message’s content.  People would very quickly stop using it—the honest ones wouldn’t want to use it, since it always implies dishonesty, and the dishonest ones wouldn’t use it because it would be a dead giveaway.

Somehow, seemingly at least partly because it is an ambiguous signal, it stays in our discourse and is used automatically, more for emphasis and for rhetoric than for its prima facie purpose.  I’m sure Steven Pinker could give a good explanation for why this is so, or at least part of an explanation.  I know he’s come out with a recent book about mutual implicit knowledge and its nature (and its implications), but I don’t have it yet, and I haven’t read it.

I’ve read some of his other books and enjoyed them.  I seem particularly to enjoy his work as audiobooks.  I listened to The Better Angels of Our Nature in audiobook format during my then-commute, using a Bluetooth-enabled motorcycle helmet.  That book is almost 40 hours long on audio, but I was sad when it was over.  There was not one dull moment for me (of course, I was riding a very fast and non-armored conveyance at the time, so even if the book had become dull, there would have been other matters to keep me alert).

Okay, well, I’ve managed to meander about lexically—is that the proper term or not?—without any clear destination in mind, other than “at least 700 words”, and have written some vaguely coherent sentences about some distantly interrelated subjects.  I hope I have at least mildly entertained you, the reader.

I know, hopefully there is more than one of you, but only one of you can be reading this at one time in one place.  Now that’s a vaguely interesting thing to recognize:  reading is only ever a solitary process.  One can read alongside others, but one cannot share the process, even if several people are all listening to the same audiobook at once.  Reading does not add in parallel, only in series.

With that little tidbit that some of you will recognize and others will not, I’ll call this blog post to a close.  If there are no objections?  No further business?  Very well.  [Smacks the gavel on the table] This blog post is adjourned.


*I did not.

**Certainly not those round Frisbee® things they throw competitively in the Olympics.

Solitary story telling in the desert

Told you, I did.  Saturday it is.  Now…there is a blog post.

That means, of course, that I am going to work today.

That’s not because it’s Saturday, or because I’m writing a blog post, or even because I told you I would, though that last may have some causal input.  Rather, the causality is very much:  I am going to work + I write blog posts on work days generally + I told you I would ⇒ I am writing a blog post.

It’s apparently been a sticking point in the history of statistics in the twentieth century that no one felt they could definitively infer actual causality by statistical testing (such as with medicine effects and so on) but only association.  Of course, this is a root problem in epistemology, not merely in statistics:  the question of how we know what we know or if we know what we think we know.  I’ve actually been dipping in and out of a book about the science of causality, called The Book of Why by Judea Pearl.  It’s good but somewhat dry, and that’s why I’ve had to keep dipping in and out of it between other things.

That latter is just an example of a frustration I’ve experienced throughout my life:  I have a hard time not getting distracted from one interesting thing by the next interesting thing, and so I don’t accomplish things I would like to accomplish.

In fact, the period during and after my time in prison was a rare stretch in which I was able to commit to and follow through with (in this case) writing books and short stories, one at a time, finishing one before starting the next, which is the way I need to do things if I am to succeed.  And during that same time‒well, this started after prison, really‒I practiced playing guitar and ended up writing and producing/performing/recording a total of six songs, four of which are published and streamable on all major platforms.

Since then, though, I have deviated from those habits, at least partly because of the utter lack of impact those things have had.  Telling stories while lost and alone to the struggling plants and rare animals in a desert oasis is not very fun.  Even though they don’t interrupt, they almost certainly don’t actually understand anything.  And they never give any feedback.

I’ve thought to myself many times recently that I wish I could form my own personal Tyler Durden.  For those of you who haven’t read or seen Fight Club, I will try to avoid any spoilers, but I will just say that Tyler Durden is Brad Pitt’s character in the movie (and one of the two main characters in both the book and the movie).  Those of you who have seen or read it will know what I mean when I say I need or want my own equivalent of Tyler.

In any case, I need to escape somehow.  I’m enraged by almost everything nowadays.  At least I feel rage.  It’s uncertain that rage is truly caused by the things toward which I feel it.  They may merely happen to be “there” when I’m prone to that feeling.

See what I mean about the whole causality thing?  One can sympathize with the statisticians who felt they could not firmly infer causality from association.  Human emotional states give us good reason to be cautious about drawing conclusions too quickly and recklessly.  As Radiohead sang, “Just ’cause you feel it doesn’t mean it’s there.”  Or, as I like to remind people, just because you infer it doesn’t mean it was implied.

One may feel what seems to be anger toward another person or circumstance, but then it turns out that one’s blood sugar is just low, and the body is secreting all sorts of sympathetic nervous system hormones to trigger the release and creation of glucose in the body.  But those hormones also influence the brain, and are associated with fight and flight.  The brain may then do its usual associational thing and draw mistaken conclusions about the source or cause of one’s anger.

It reminds me a little bit of the brilliantly acted scene in The Fellowship of the Ring (and the equivalent scene in the book) where Bilbo gets angry and snaps at Gandalf when Gandalf is encouraging him to leave the Ring behind for Frodo.  In this case, of course, it is the Ring itself that’s causing Bilbo’s ire, but he feels, at least for a moment, that it is Gandalf’s “fault”.

What point am I making?  I don’t know that I am actually coherently making any point at all.  But then, I’m thoroughly unconvinced that there’s any true point to anything (though certainly people can find their own internal, subjective meanings).  I have more than a little sympathy with (Heath Ledger’s) Joker, who wants to show the schemers how pathetic their attempts to control things really are.

Of course, he is mistaken in one thing (well…almost certainly more than one), and that is his claim that when one upsets the established order and introduces a little anarchy, everything becomes chaos.  Everything does not become chaos; everything always has been chaos.  Chaos and order are not opposites; order is just a subset of chaos.  What we call order is just one of the things chaos does in some places, in some times, in some circumstances.

And chaos doesn’t need agents, any more than death needs incarnations or servants, or any more than gravity needs invisible angels to guide the planets in their orbits around the sun.  This shit is just the way things happen; it doesn’t require any agency.  It simply is.

As for why it is the way it is, well, that is an interesting question.  Actually, it’s probably a whole slew of interesting questions.  I don’t think any of them are answered in The Book of Why, despite its title, though.  They’re just not the sort of questions toward which it is addressed.

Wow, I’m all over the place, which is on brand at least.  I’m going to draw this post to a close now.  I hope you have a good weekend.  If you like football, the Super Bowl is on this Sunday.  Actually, it’s on even if you don’t like football.  The game is not conditional upon any one person liking football‒although it requires a certain minimum number of people to like football, or else it will stop occurring.  But what is that number?  Does it vary from moment to moment?

Agh, I need not to get started on questions like that right now.  It may be the question that drives us, Neo, but I’m getting too wordy for a Saturday blog post.  Hasta luego mis amigos and soredewa mata jikai, minasan.

“He thrusts his fists against the posts…”

Hey, everybody.  It’s Friday, and I’m not sure if I will be working tomorrow, so I guess just keep your eyes open for a blog post in case there is one.  I suspect that I will not be working, since many of the silly and tragic and chaotic and even the arguably good (but disruptive) things going on in the lives of people at the office persist, flowing and whirling through the phase space of possibilities, forming vortices and other turbulent and chaotic patterns.  Still, I may be wrong.  It would be far from the first time.  So take a peek tomorrow morning, if you’re up and up for it; if I work, I will (probably) write a post.

Anyway, I want to keep this short for today if I can.  I just feel worn out and over-stressed by the various chaotic things happening and by other things in my life.  Some of them should, on their surface, seem good, at least in some aspects, though I think anyone could imagine that they wouldn’t be exclusively good.  And there is a surprising amount of associated stress* and tension and consequent depression and worsened insomnia‒and it all doesn’t help how I feel about myself.

And then, of course, though I don’t very often talk about it, there is always my chronic pain.  Always.

In addition, despite the silliness from yesterday’s post, the holidays do stress me out.  It’s a frustrating kind of stress, because while I feel very lonely, I’m all but certain I would not be able to tolerate being part of someone’s celebration.  I’m too chronically “on my own”, so I can’t even readily imagine myself taking part in any kind of get-together unless I were on some kind of powerful anxiolytic or similar.

Maybe I’ve gone too far down the “stranded alien” rabbit hole.  I guess that’s better than going down the “stranded rabbit” alien hole, though neither one sounds inviting.  Anyway, I’ve just gotten too accustomed to being isolated and non-social and paranoid.  Not that I actually think people are out to get me**; I just don’t think people are safe.  They are not trustworthy.  This is not meant to be an aspersion on their characters.  I don’t think they are (necessarily) malicious.  I just think they’re unreliable in too many, too important ways.

So, despite whatever dreams and wishes I have‒and I do have them, though I try not to waste too much energy on them‒I expect that the state I’m in right now (I don’t mean Florida) is the state I’ll be in for the remainder of my existence.  And that is at least part of why I don’t desire my existence’s persistence.  It’s not great for me, and it seems terribly unlikely that it would be any significant good for anyone else.

One benefit of being isolated is surely that at least one’s existence or nonexistence is unlikely to be very disruptive of other people’s lives, one way or another.  And my personal ethos contains a strong aspect of trying not to cause other people trouble, and feeling horrible if I do.

It’s not even about whether those other people actually feel inconvenienced or troubled; even if they reassure me, it probably will not help.  I am the one who experiences the shame of bothering other people.  It’s not as much an empathy-related phenomenon as a sort of Categorical Imperative kind of problem.  Well, no, that’s not the right reference.  I think the term is Deontology.  It’s a rule I have to follow even if it has no impact on anyone in any way.

To be clear, though, this is not a philosophical stance on my part.  I haven’t chosen to do this based on any reasoning or logic; I’m just using those things to explain it.  It’s very much a set-point, akin to a black-box strategy devised through gradient descent in machine learning.  As such, it is something preceding and overwhelming any potential rational assessment and judgment on my part.

I don’t think I’m expressing this well.  Perhaps that’s partly because I don’t fully understand it in any kind of systematic, algorithmic fashion.  Perhaps it’s not understandable in such terms, but is rather the product of the various nonlinear processes that entail the brain functions of human beings.

Anyway, that’s enough for now.  If I work tomorrow, I’ll probably write a blog post.  If I don’t work tomorrow, I almost certainly will not write a blog post.  This leaves a little gray area in the outcome “no blog post” because it’s not completely impossible that I might work and yet not write a blog post.  So, not working almost certainly implies no blog post, but no blog post does not imply not working with as strong a tendency.  This is a fact of probabilities relating to Bayesian statistics that sometimes throws people off, but it’s important in practical matters, such as in knowing what to make of a “positive” screening test result, say for an infection or cancer.

I leave it as an exercise for you, if you’re interested (also if it’s not just obvious to you), to work out why these things are so.  And I also leave it as an exercise for you to have a good day and a good weekend.
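For anyone who wants the asymmetry made concrete, here’s a quick sketch in Python.  The numbers are invented purely for illustration‒they don’t come from any real test‒but the structure is the standard screening-test setup:  the test is very likely to come back positive when the condition is present, yet a positive result still leaves the condition unlikely, because the condition is rare to begin with.

```python
# Illustrative numbers only (not from any real screening test):
prevalence = 0.01        # 1% of people have the condition
sensitivity = 0.95       # P(positive | condition)
specificity = 0.90       # P(negative | no condition)

# P(positive) overall, by the law of total probability:
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes' theorem: P(condition | positive), the positive predictive value.
ppv = sensitivity * prevalence / p_positive

print(f"P(positive | condition) = {sensitivity:.2f}")
print(f"P(condition | positive) = {ppv:.3f}")  # under 0.09
```

Even with 95% sensitivity, the positive predictive value here lands below 9%‒the two conditional probabilities point in opposite directions and are nowhere near equal, which is exactly the sort of thing that throws people off about “positive” screening results.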


*Not to be confused with the Associated Press, though there are commonalities.

**I don’t rule it out categorically, of course, since it is a physical possibility and thus does not have a truly zero chance of happening.  But it seems unlikely.  Why would anyone be truly out to get me?  Whose priorities could be so out of whack that I would be their focus?  Still, people are stupid (present company included), so I can’t dismiss it completely, and I always have such possibilities at least in the back of my mind.

“Though this be madness, yet there is method in’t”

Well, first let me apologize for yesterday’s blog that largely concerned the weather, and in a trivial sense at that.  It was rather lamentable, I know, with emphasis on the first four letters of that adjective.

On the other hand, I don’t apologize for having had my little bit of fun with the date.  That may have been even less interesting to most of you than my jabbering on about the weather, but I like it.  I fully expect that I will do such things again.  For instance, in a similar vein, today is a bit fun because it is 11-12*, so the numbers are in appropriate ordinal sequence with no gaps.

That’s not very fun.  More fun will be had (by me, anyway) on Friday, when it will be 11-14-25.  If you don’t immediately see the fun there, it may help that a similar fun date next month will be available on 12-13-25.  This fun also works with the European date order (but in both you have to leave out the digits that denote the century).  Also, there were no equivalently fun dates in any month before October.

This is the most fun I’ve had on any kind of date in at least 16 years, I would guess.  That’s an easy call, because I haven’t been on any date at all in at least that long.  See what I did there with the multiple meanings of the word “date”?  Of course you did.  What do I think you are, a moron?

No, I do not think that.  You are reading a blog post, so you are a reader, which gives you a serious leg-up, moronia-wise.  You draw from the well of that greatest of all human inventions:  written language.  Your taste in reading material may be somewhat questionable, but I cannot legitimately complain about that.

Wow, I feel like I ought to be almost done with this post, but I’ve barely passed 300 words.

On to other things.  I’m going to try to do a better job about science reading during my downtime in the office.  It’s not that I’m completely slacking; I’m reading Shape by mathematician Jordan Ellenberg, and I just read his earlier book How Not to Be Wrong.  I’ve read both before and/or listened to the audiobooks, but they are well worth rereading.  He’s a great math professor, and has a gift for explaining potentially abstract concepts.  I think he’s slightly better at this than Steven Strogatz, the author of The Joy of X and Infinite Powers, but they’re all good.

I also just yesterday gave in to an urge I’ve had for some time:  I ordered a textbook I liked in med school but which we didn’t really get into as deeply as I would have liked:  Principles of Neural Science, by Kandel et al.  The edition I had was by Kandel and Schwartz, if memory serves, but Dr. Schwartz is no longer involved, it seems.

It’s a textbook, so it’s pricey, even in paperback, but I discovered that I could put it on a payment plan through Amazon, so that’s what I did.  It arrives today.

I’ve also resolved, at least tentatively, to try to take the heat off my reading of my science books‒including the above newcomer‒by doing something I did when reviewing/studying in med school:  I would get a text that I was reviewing, and I would pick a section to read/review by flipping a coin.

Actually, it was a series of flips, each one dividing the “remaining” part of the book in half.  In other words, for the first flip, heads would mean I would look in the front half of the book, tails would mean the back half.  Then the next flip would decide to which half of that half I would narrow things down further.

Anyone who has spent any time dealing with computers and/or binary numbers can readily recognize that, with 10 flips of the coin, one could choose a specific page in a 1024-page book.  I guess every flip would count as a kind of “half-life” for the book’s pages.  If one wanted, one could even choose one’s pages not with a coin flip, which is not truly random, but with a quantum event that has a 50-50 chance, like measuring whether a given electron’s spin is up or down.

Of course, I don’t have a Stern-Gerlach apparatus, so I would have to “farm out” that process.  But I understand that there are apps you can use whose sources are at labs where each decision is truly made by a quantum measurement.

It’s not terribly practical, nor more useful for picking book pages than is a coin flip, but if you’re a convinced advocate of the Everettian, “many worlds” version of quantum mechanics, it has the added “benefit” that each “flip” will divide the universe into two “worlds”:  one where you choose from the earlier half, another where you choose from the latter.

Coin flips do not enact such splitting, not in anything but the trivial sense that every quantum level interaction potentially does so.  The experience will be the same for you, though, except whatever glee you might derive from splitting the universe to choose a section to read.

Anyway, I’ll be trying to read my books, random section by random section.  Believe it or not, this works for me.  I don’t usually have to learn things in order, and this method keeps me from getting bored while trying to trudge through a text from front to back.

Perhaps I do have some aspects of ADHD up in there in my brain.

Well, I’ve now passed my target length for this post by some margin, so I’ll call this enough for today.  I expect to be writing another post tomorrow, but like everything else**, it is not absolutely certain.  I hope you have a very good day.


*Only in the American style Month-Day-Year format, though.  It is less fun in the European Day-Month-Year format.

**Yes, even death and taxes, in principle.

“And as the fear grows, the bad blood slows and turns to stone…”

It’s Friday, and I feel as though I’ve recently run an ultra-marathon‒except that, if I were in the habit of running ultra-marathons, I think I would be more physically fit.  I like running, actually; I used to get that famous “runner’s high” endorphin rush, and it made me feel that if I just pushed a little bit extra with my next step, I could take off and fly.

Alas, my chronic pain has made it very difficult to do regular jogging and/or running.  I still like to walk, but I have to be careful.  In any case, pain saps my energy even for walking, and for many other seemingly minor things.

I’ve had a lot of pain this week, in my usual places as well as in my more newly encroached-upon regions, like my right hand/wrist/forearm/elbow.  I wish I could sleep better, just to escape from it, but my sleep has also been even worse than usual this week.

I’m stressed by the laundry machine thing as well, of course.  I’ve had to wear old backup clothes and buy quite a few new pieces of clothing, chewing up some of my savings, such as they are, and that’s so frustrating.

I hate my life, but I’m stuck on a sort of slight local bump in the middle of a huge surrounding value-sink‒a local optimum, a kind of one-person analogue of a Nash equilibrium.  There is almost nothing in my life (my daily life, anyway) that is much good, but to change my life would nevertheless at least temporarily make everything worse, and there is no way of knowing if it would ever get better.

So, I do nothing but what you “see”, waiting here for the branch* to break, which I’m sure it will do before very long at all.  It could be today; I would not be surprised.  I barely had the energy to go back to the house after work last night, and I can barely get going to go to work this morning (though I am doing it).

I don’t know why I do it.  It’s probably more out of habit and training than anything else.  Not only do I find no lasting happiness or fulfilment, I have no even momentary peace of mind.  I just occasionally get so exhausted that I am able to become unconscious, but that lasts a very short time before I sort of start awake, as if I’ve heard enemy troops going through the jungle nearby.

I’ve never fought any wars in any jungles, of course.  But I just don’t ever feel safe**.  And I certainly have no squad, no fellowship, nor even any partner with whom to share the watch or whatever.

Lone tigers can do well, I guess, since that is their nature.  But wolves and humans and humanoids (like me) are not really at our best when alone.  That was why in the ancestral environment, ostracism was such a serious punishment.  A human alone on the Serengeti thirty thousand years ago was a human who was unlikely to survive for long, let alone to leave any offspring.

It’s appropriate for something like I am, I suppose.  If I were worth being around, there would probably be people around me.  But whatever compensations I was able to generate in the past to make my weirdness worth tolerating, I don’t have the energy or the will‒or the skill, to be thorough‒to bring those things to bear.  I’m not even sure what they are anymore.

Oh, well.  It’s not like there’s any reason to suspect that anyone else knows what they’re doing or has many true, deep insights.  There are a few people here and there in history who figure out useful things, but everyone is merely flesh and blood.  Their minds and wills and insights are markedly finite.  One can learn what one can from them, but one can expect no deep, final answers.

There may be no such deep, final answers.  The universe shows no evidence of having been built for us, after all.  We are just epiphenomena.  Don’t let anyone try to fool you with any ridiculous “fine-tuning” argument(s).  The universe is not fine-tuned for us.  There is almost nowhere in the universe where we can survive.  I made a video that more or less talked about this, if I recall correctly.  Even the Earth is largely hostile to us, and it’s by far the most livable place in the known universe.

The fine-tuning claims remind me a bit of people who say that natural immunity is adequate (or even best) and that we don’t need vaccines.  People can imagine this to be true only because they are the recipients of the world their ancestors created: a world where there are few deadly diseases that wipe people out in childhood the way they used to, because of measures like vaccines.

Or‒to think of other people who speak and act out of ignorance of what it has taken to make the world in which they find themselves‒we have those who decry capitalism as fundamentally evil all while writing on their laptops and tablets and smartphones and driving their electric cars to get overpriced coffee-like dessert beverages from international coffee chains.

Don’t even get me started on flat-earthers.  The frikking ancient Greeks and Egyptians and Phoenicians and all those ancient civilizations knew the Earth was round.  Eratosthenes even figured out how big it was, to within a few percent of our modern measurements, about 2200 years ago.
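Eratosthenes’ arithmetic is simple enough to reproduce.  The usual account has the noon-solstice shadow angle at Alexandria at about 7.2 degrees (a fiftieth of a full circle) and the Alexandria-to-Syene distance at 5,000 stadia; the exact length of a stadion is debated, so the ~157.5 m value assumed below is one common choice, not a settled fact.

```python
# Eratosthenes' reported inputs (per common accounts; the length
# of a stadion is uncertain, so stadion_m is an assumption).
shadow_angle_deg = 7.2     # sun's angle from vertical at Alexandria
distance_stadia = 5000     # Alexandria-to-Syene distance
stadion_m = 157.5          # one assumed value; others run up to ~185 m

# The shadow angle is the fraction of a full circle separating the
# two cities, so the circumference is the distance scaled up accordingly.
circumference_stadia = distance_stadia * (360 / shadow_angle_deg)
circumference_km = circumference_stadia * stadion_m / 1000

modern_km = 40_008  # modern meridional circumference of the Earth
error_pct = 100 * abs(circumference_km - modern_km) / modern_km
print(f"{circumference_km:.0f} km, off by {error_pct:.1f}%")  # 39375 km, off by 1.6%
```

With the longer ~185 m stadion the estimate overshoots by more, but either way it lands in the right neighborhood‒an astonishing feat achieved with sticks, shadows, and paced-out distances.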

No intelligent people who paid attention and thought things through (or cared) ever really thought the Earth was flat.  If the Earth were flat, on a clear day you could climb to the top of a high building and essentially see to the edge in all directions.  With a good enough telescope and no interfering mountains, you could peep through someone’s Tokyo window from Chicago.  The Earth is not flat.

I, however, am a flat person‒not in the sense of being roughly planar, but rather in the sense that all my fizz is gone; my pep and vigor are asymptotically approaching zero.

At least it’s Friday.  Maybe next week will be better.

I doubt it, though.


*Or the camel’s back, if you prefer.

**I’m actually not safe, of course.  No one ever is.  But there are gradations of safety, and probability rules ordinary reality.  When risk is low enough, one should ideally feel quite different, much more even-keeled, than when risk is high.  Unfortunately, that’s often not how things are.

“Broken branches trip me as I speak.”

Tuesday, Tuesday, Tuesday…I can’t think of any jokes or plays on words regarding this day of the week that I haven’t already done, probably ad nauseam.  That’s my habit, it seems:  perseveration, repetition, all that stuff.  That’s probably related to the ASD thing.  It’s certainly been with me all my life in one form or another, or at least as far back as I can remember.

Speaking of “as far back as I can remember”:  I think my oldest memory‒certainly one of the oldest‒is of having to be carried out of The Three Caballeros in the Main Street theater at Disney World (in the park now known as the Magic Kingdom), because they started shooting their guns.  I remember the noise being painful and terrifying, and I remember someone picking me up and taking me out of the theater.  I would have been about two years old, I believe.

I used to be unable to tolerate loud noises such as fireworks and muskets* and the like.  I also hated getting my hair cut, I remember that; but I also really hated getting it combed, especially since it was so prone to tangles.

Enough pointless recollection.  I don’t even know what I was trying to discuss there.

Ugh.  I don’t even know why I’m doing this, he said, inadvertently quoting Luke Skywalker from The Empire Strikes Back.  I mean, I get the nature of habit, but I don’t want to be a creature that blindly follows habit.  I’ve been trying to improve my own habits, to decrease or eliminate bad ones, to inculcate good new ones (or to reinitiate older habits that were good).

But even those objectives, though “good” in and of themselves from the point of view of having better strength of character or whatever, are also pointless in the end.  If I’m just robotically carrying out “good” habits without joy or friendship or love or anything along those lines, it’s just a Sisyphean task, and I’ve never been convinced by Camus on that subject.  I’ve written about this before, but I’m not sure precisely where and when.

I’ve probably written about all of this before.  Everything is repetitive and dull; it’s so irritating.  The YouTube algorithm is even failing to find me videos in which I have enough interest to distract myself for a moment.  The other social media are likewise tedious to annoying; they’re mostly just online forms of distilled human stupidity.  As if human stupidity weren’t concentrated enough already.

I’m not interested in any new science right now, or math, or computer stuff, or philosophy, or even fiction (new or old).  I have no interest in any movies or shows that are coming out; what a joke that landscape entails.  I also have no interest in listening to or writing or playing music, despite my Radiohead quote in the title of this post.

Oh, yeah, and every day, so much of the day, so much of me hurts.  That takes the bloom off many a potential rose.

I’m not even happy about the fact that it’s October and Halloween is coming.  I have no one with whom to celebrate it.  Ditto for the subsequent celebrations.  Holidays are things people celebrate with other people.  Maybe not all possible kinds of people do it that way, but on this planet it seems pretty consistent.

I thought about it recently, as if for the first time, though I don’t see how it could have been:  For the initial long stretch of my life, I was always around other people, even in my personal life.  I was the third of three children, so my parents and siblings were always about; I even shared a room with my brother until I was high school age.

I was in the same house and school system from K through 12, as they say, so I knew my fellow students and had several good friends.  Then, in college, I had a consistent roommate for all four years‒a most excellent one, I may say‒and another core group of friends.

Then, of course, I got married.  That entailed a bit of a rift with my own family‒I won’t get into that cluster fuck, because no one comes out looking good‒but I also became a welcome part of my then-wife’s family.  Unfortunately, with respect to my prior friends, when I’m away from people I have serious trouble maintaining ties‒this is apparently related to autism‒but I’ve always just felt ashamed of it while remaining incapable of doing otherwise.

Then of course I went to med school and residency and lived with my wife, and eventually we had kids, and that was wonderful‒they are wonderful‒but then my injury and chronic pain happened, and I guess my underlying ASD didn’t help me deal with that.

Then I got separated and then got divorced**.  And then I made the foolish (however well-intended they were, which they were) choices that led to me being a guest of the Florida DOC for 3 years (minus gain time).

Gradually, more and more, I have been alone by myself, and I am not good at taking care of myself***.  It’s odd; I used to be pretty good at taking care of other people, though I don’t think I have that will anymore, but I’ve never been good at taking care of myself.

And when, over time, everyone you care about goes away, consistently, then whatever your priors were, your Bayesian assessment of probabilities almost has to lead you to a high credence that you are a big part of the problem.

And by “you” I mean, of course, me.


*For instance, at the musket festival at Greenfield Village in Dearborn, Michigan…an immensely cool place, by the way.  Greenfield Village, I mean.  I don’t really know anything about the rest of Dearborn, but I expect it’s fine.

**I deliberately put this in the passive voice, because it wasn’t my idea.  I think I would never have sought a divorce‒it’s not really in my nature‒but I wasn’t going to try to coerce someone who didn’t want to be around me to stay around me, despite oaths freely given and all that.  I could never blame someone for finding my company objectionable.

***As for what “self” actually means, I’m using it here informally, just as a general reference to the person writing this blog and about whom it is being written.  There are no deeper metaphysical meanings; you can infer them if you wish, but that doesn’t mean they were implied.