In a better blog than this, I shall desire more love and knowledge of you

Hello and good morning.  It’s Thursday, and I’m writing this post on my lapcom.  I feel as though I ought to write these posts only on the computer (not that smartphones are not computers, but cut me a little slack on this, please), and I would be more inclined to do so if Microsoft would stop making Aptos the default font!!!!!

If I could go back in time and change something, that’s one of the things I would be inclined to change.  If I found that there was one person mainly responsible for this new font, well…I don’t know if I’d go all Terminator on them and kill that person’s mother before that person was born, or kill the person when that person was a child, but something needs to be done to erase the stain of this horrible font from existence.

Certainly, if I were given* absolute power over the world, from this moment forward, one of the petty things I would do (I would try to keep the petty things to a very bare minimum, trust me**) is to eliminate that font from any and all standard computer systems anywhere.  I would probably allow for individuals to select the font if they really like it, but would not let them use it on anything but internal work between people who also like the font.

Also, I would probably mark people who chose the font freely for a visit from my secret police.

I’m kidding.  I despise the very notion of thought crime, let alone aesthetic policing in private matters.  That’s so even though some people’s quality of thought sometimes feels like a crime against nature.  But, of course, there cannot actually be crimes against nature.  Nature does not punish one for disobedience to its laws.  It’s simply not possible to do anything but follow them.

That’s one reason why I truly despise headlines like “The new finding by Hubble that breaks physics!” and whatnot.  Not only are they plainly clickbait, they are stupid clickbait.  I don’t know for sure whether, in any specific instance, it’s the headline writer or the author of the attached article who makes the headline, but in either case, when I see headlines like that, I think that whoever wrote it really, clearly doesn’t understand physics very well.  Nor do they understand the nature of scientific discovery and advancement.  Because of that, I am far less likely to read the attached article (or watch the video) or even click on its link.

Nothing can break physics.  If you find something that seems to violate physics as you understand it, what you have found is not a violation of physics but rather a place where your understanding of physics is clearly incorrect.  This is far from a horrible thing.  This is how progress in physics (and in other sciences) is made:  by finding the places where our “understanding” doesn’t predict or describe what actually appears to be happening.  The world cannot be “wrong”, so our understanding of it must be, and will need to be revised.

That’s progress.

One should be hesitant to give too much “trust” to anyone who refuses to change their mind.  One of the best lines in a Doctor Who episode (not a truly great episode, maybe, but it has a wonderful speech by the Doctor) comes after the Doctor has said to the “villain” (who goes by the human name Bonnie, though she is not human) “I just want you to think.  Do you know what thinking is?  It’s just a fancy word for changing your mind.”

Bonnie responds, “I will not change my mind.”

And the Doctor says, “Then you will die stupid.”***

This is simply true.  If you never learn that you were wrong about something, if you never update your credences or think about things in a new way, you will never learn anything new or develop any better understanding of the world than you did when you formed those credences.  Or, to paraphrase Eliezer Yudkowsky, if no state of the world can change the state of your retina and how you perceive that state, that’s called being blind.

I like to refer to Yudkowsky-sensei a lot, but that’s because he has said a lot of bright and interesting things, and he has said them well.  It’s also nice to know that there are some highly intelligent and thoughtful people in the world—clearly there are, or humans would long since have gone the way of the trilobites—because the idiots and the assholes make so much noise.

The best evidence I see for the fact that most people are good or at least benign (overall) is that civilization still exists, and has done so for a long time.  It is far easier to destroy than to create or even to maintain; the second law of thermodynamics tells us that things will fall apart even if we do nothing at all to break them (it says that more or less, anyway—that’s a bit of a bastardization of the proper, mathematical law, but it is related and implied).

The fact that civilization still exists—so far, at least—seems to indicate that there must be a lot of people working to maintain and sustain and improve it, because we can easily see how hard so many people seem to be trying to make it crumble****.

Assholes tend to make a lot of noise in the world, but they’re pretty much all full of shit and “hot air”.  It’s worth it to keep this in mind, because there have always been plenty of such nether orifices out there, spewing their flatus everywhere like perverse crop-dusters.  But the evidence strongly suggests that they are not the norm; they are just the noisiest.

I suppose that’s a good moral of sorts on which to end this post:  Be willing, even eager, to change your mind when warranted, and try not to let the assholes make you think the world is no better than a camp latrine (even if you’re one of the assholes sometimes, which you are, since we all are, sometimes*****).

Though, to be fair, I am hardly the person to be giving that last piece of advice unironically.

TTFN


*If you must be given absolute power, do you actually then have absolute power?  This is similar to the old song that says “Don’t ever take away our freedom.”  If you have to beseech someone not to take away your freedom, you’re not free, and if you have to be given power, your power is clearly not absolute.

**Or don’t, if that’s not in your character.  I’ve often spoken implicitly against the concept of trust, stating that I don’t feel that I can actually, truly trust any living person.  It’s calculated risks all the way down, which is empirically true if nothing else.  So, I can hardly scold someone if they don’t “trust” me.  Go ahead, form your own conclusions.  I do exhort you, though, to be as rational as possible when you form them, with your conclusions drawn as a consequence of the evidence and argument, not with your evidence and argument being curated based on your knee-jerk or at least hasty “conclusion”.

***He then proceeds to lay out the alternatives; he’s not making a threat, he’s making a point.

****When you read that, did you immediately think of your own least favorite political or other public figure, or perhaps of the people you encounter who disagree with your politics or religion or dietary preference or what have you?  Be careful.  Us/them thinking is not usually conducive to formulating true and accurate pictures of reality (though it did inspire at least one beautiful song).

*****We’re also all deuterostomes (I’m assuming only humans are reading this).  Look it up.  It’s kind of funny.

Give us this day our daily blog

It’s Tuesday now, and I’m writing this on my mini lapcom.  I don’t know if I wrote any of my posts from last week on the lapcom*, but this will represent 50 percent of this week’s posts so far.

Admittedly, that’s not saying much, and one cannot draw many conclusions from a two-item sample in which one is one way and one is another.  To presume that they will continue to occur in a 50/50 ratio would be a major statistical/probabilistic error.  At best, one can say that there are at least two ways in which my blogs can be written, since two have so far been sampled—and that is certainly true.

Anyway, speaking of twos, it’s Tuesday.  It’s the 10th of March, of course, and the second full weekday in Daylight Savings Time, or in non-Daylight Savings Time, whichever one it officially is now.  You can tell that I really don’t see the sense in the whole thing from the fact that I can neither recall nor logically infer which of the two possibilities is correct.  When I am actually interested in something, I tend to try quickly to dispel any ambiguities in my understanding if I can.  With this, I really don’t care, because it’s all silly.

In fact, it’s so silly that I think that’s all that need be said about it.  On to better things, or at least to other things.  But, of course, the question now is:  What other things should I discuss**?  I don’t know, honestly.

I don’t know dishonestly, either, come to think of it.

Isn’t it weird how much of a habit it is to say things like, “honestly”, or “to tell the truth”, or “I swear”, or other similar words and phrases to try to emphasize the authenticity of our words?  But they don’t do anything at all to confirm our truthfulness; epistemologically, they’re almost without content.  If anything, the fact that we felt unsure enough to have to say we’re being honest might raise a so-called red flag in the mind of a given listener.

Does the fact that a person says “honestly” or “I’m not gonna lie to you” or any similar phrase actually provide any information about truthfulness, except for the fact that this person recognizes that truthfulness is valued, at least by the person to whom they are speaking?  It doesn’t really demonstrate truthfulness; I think that’s clear.

Some might be inclined to think that the words actually indicate falsity, but that’s not true, either (ha ha).  It may be the case, at times, that a person who is trying to deceive another may say “honestly” to reassure their interlocutor that their lies are true and also to relieve some of their own anxiety.  But people who are telling the truth may merely want to recognize and emphasize that fact, and so use the same phrases.  They may, for instance, realize that something true they are saying could seem improbable to some hearers.

If it were always a harbinger of a lie, then such a seeming reassurance would indeed be a reliable signal, but of the opposite state from that described in the message’s content.  People would very quickly stop using it—the honest ones wouldn’t want to use it, since it always implies dishonesty, and the dishonest ones wouldn’t use it because it would be a dead giveaway.

Somehow, seemingly at least partly because it is an ambiguous signal, it stays in our discourse and is used automatically, more for emphasis and for rhetoric than for its prima facie purpose.  I’m sure Steven Pinker could give a good explanation for why this is so, or at least part of an explanation.  I know he’s come out with a recent book about mutual implicit knowledge and its nature (and its implications), but I don’t have it yet, and I haven’t read it.

I’ve read some of his other books and enjoyed them.  I seem to particularly enjoy his work as audiobooks.  I listened to The Better Angels of Our Nature in audiobook format during my then-commute, using a Bluetooth-enabled motorcycle helmet.  That book is almost 40 hours long on audio, but I was sad when it was over.  There was not one dull moment for me (of course, I was riding a very fast and non-armored conveyance at the time, so even if the book were to have become dull, there would have been other matters to keep me alert).

Okay, well, I’ve managed to meander about lexically—is that the proper term or not?—without any clear destination in mind, other than “at least 700 words”, and have written some vaguely coherent sentences about some distantly interrelated subjects.  I hope I have at least mildly entertained you, the reader.

I know, hopefully there is more than one of you, but only one of you can be reading this at one time in one place.  Now that’s a vaguely interesting thing to recognize:  reading is only ever a solitary process.  One can read alongside others, but one cannot share the process, even if several people are all listening to the same audiobook at once.  Reading does not add in parallel, only in series.

With that little tidbit that some of you will recognize and others will not, I’ll call this blog post to a close.  If there are no objections?  No further business?  Very well.  [Smacks the gavel on the table] This blog post is adjourned.


*I did not.

**Certainly not those round Frisbee® things they throw competitively in the Olympics.

This is the blog this man’s soul tries

Well, in case some of you were starting to feel lighthearted and optimistic‒just a little more at ease with yourselves and the world after two whole days without reading my work‒here I am to write another blog post that will probably bring you down and make you inclined to wonder whether anything at all is really worth anything, or if you should just give it all up, especially the habit of reading this blog.

Congratulations.  It’s Monday again, the start of another work week.  Also, Daylight Savings Time has ended (or is it “begun”?) over this last weekend, so for a bit, a lot of people’s circadian rhythms are going to be slightly off.  That will contribute to an increased number of accidents, both minor and major.  There will also be increased rates of illness (again, both major and minor), and I believe there is even some evidence that men at least will suffer more heart attacks after the time changes.

And what are the other advantages of Daylight Savings Time?  I’m not aware of any actual other benefits.

Of course, like most of you, I’m starting my own work week today, and it’s going to be a long one; the office is scheduled to be open this Saturday.  By then, the shifted time measure will be mostly adjusted in everyone’s heads.  I’m speaking of things here in the US, of course; I honestly don’t know off the top of my head whether other cultures have adopted this weird custom.

Whence did it originate?  I’ve heard explanations and excuses at various times in my life, but they are not very convincing.  If you know‒with reasonably good credence‒please share that information in the comments below.  And like and share it if you’re so inclined, especially if you have a strong sense of irony.  Heck, like and share the song itself if you want to immerse yourself in a kind of meta-level irony, or something like that.

I don’t know what to discuss today, even more so than usual.  I’ve committed to trying not to dwell on, or at least not to share, my negative thoughts and emotions and so on, since I’m sure they do very little other than make other people feel depressed (yes, certain kinds of mental illness can be rather contagious, in a sense at least).

I won’t say I would never wish depression on anyone; that’s ridiculous.  For instance, I would feel much safer in the world if this Presidential administration, and indeed most of its equivalents around the globe, suffered from enough depression to make them second-guess themselves and doubt themselves from time to time.  It almost ought to be a requirement for office that someone be prone to dysthymia at the very least, so they would feel less confident that their shit doesn’t stink, so to speak.

And no, I am not suggesting that the people of the world ought to put me in charge for the best chance to make the world better.  I used to dream of such things, and I had a very Sauron-like wish to control events in the world for the greater good.  It might still not be too horrible a notion.

But my inclination over time has become more negative, more Melkor/Morgoth like.  So if anyone is inclined to encourage and engender acts of chaos and destruction on a hitherto unseen scale, by all means, give me immense power.  I make no warranties or guarantees or even assurances that I will use such power wisely.

I’ll try, of course.  No one can be expected (fairly) to do anything more than that, no matter what Yoda said.

Goodness knows I’ve tried a lot, in a lot of ways, all throughout my life, literally for as long as I can remember.  By which I mean, I’ve tried to do my best to do good things and to be a good person‒a good friend, a good son, a good husband, a good father, a good doctor, all that.  You can probably tell by my current state‒solitary, lonely, divorced, professionally ostracized, in bad physical health, in horrible mental health, alone*‒how well I’ve done at all those things.

I’m not exaggerating when I say I’ve tried hard.  I’m not one to big myself up very much, but I have worked hard all my life, trying to be a good son, a good friend, a good brother, a good husband, a good doctor, a good father.  Yet despite my sincere efforts and my reasonably high intelligence, here I am.

I suppose a lot of the disappointing outcome(s) is/are related to my ASD, both the heart-based one and the brain-based one, as well as my tendency (probably related to the preceding) to depression and some degree of low-grade paranoia.

By “low-grade” there, I mean that I don’t literally suspect that there are malicious forces plotting against me or trying to control me; I honestly don’t think highly enough of humans (or any other beings) to expect them to be capable of such things.  It would almost be reassuring if they were.

No, I mean I just have a general, global sense‒not just intellectually, but in my bones as it were, in my deep intuitions‒that I cannot rely upon anyone or upon anything, other than the laws of nature themselves (whatever their final version might be).  I don’t “trust” anyone or anything, including (one might even say “especially”) myself.  Everything is a calculated risk.

This is of course literally true for everyone, but I think most people hide from that fact most of the time, usually (but definitely not always) without terrible consequences.  I don’t know if that’s worse or better.  It may be more pleasant, but I suspect it’s misleading, and has been responsible for, or at least it has contributed to, many ills the human race has brought upon itself and upon others.

Whataya gonna do?  I guess you’re gonna do whatever you must, as they say, since it’s not as though you can do anything other than what you do once you’ve done it, and so it was all along what you were going to do, and so it was what you must do (or must have done).

I hope you have a good day and a good week.  I’ve tried to withhold my depression and negativity, with at least some degree of success‒trust me, I’ve withheld‒and I will continue to do so, because sharing it is pointless, and asking for help is laughable.


*Now, that phrase had some redundant notions, didn’t it?

Thoughts meander like a restless Melkor in the Outer Void…

It’s Friday, and this week I can be thankful therefor, because I do not work tomorrow.  The office will be closed (and locked) on Saturday.  Only those who have keys and the alarm code and some other reason for being there would be there (I suppose someone could break in, but there are cameras and alarms in place, and there is nothing of any significant net value, i.e., value worth risking the alarms and cameras to reach, inside).

Next Friday won’t be as good from a strictly work/not work point of view, but at least it will be Friday the 13th again, for the second month in a row.  Then we will have to wait an average* of 7 years for it to happen that way again.

***

Okay, I guess I’ve always known that I’m weird, but I just wrote a series of footnotes about Friday the 13th and year lengths and lengths of weekly cycle recurrences that dwarfed what I had yet written in the main body of this post.  I think I’m probably the only being in the universe that would write about such things and imagine that anyone else would be interested.

Yeah, definitely weird.

Still, I guess that sort of thing just happens when you talk to yourself in print and share it with any interested parties who might stumble upon it.  Also, when one is without companions or interactions, one can, like Melkor, develop thoughts and thought patterns that are unlike those of one’s brethren.

I suppose that can sometimes be a good thing, though it can also sometimes be a very bad thing (rarely as bad as in the fictional Melkor case).  Though all improvement is change, most change is not an improvement‒at least not if it’s not deliberate and directed change.  So if one develops thoughts that are significantly divergent from those of all of one’s peers, odds are that they will not be a net improvement over most of the peer-born thoughts.

I have, of course, mitigated this somewhat by reading a lot (and consuming other media that deal with science and mathematics and philosophy and such, as well as comedy panel shows).  That’s not randomly chosen reading, either; it’s carefully chosen reading.  I think this has helped improve the general content and tendencies of my thought, because I’ve influenced myself with the carefully thought-out thoughts of very bright people.

I suppose, though, that if one can read what one wants and does so, one is not really isolated from all other thoughts, so one’s own cannot be too very different, or at least are not very likely to be.  That’s good, I think.  Simply developing new thoughts without much input from others would be most likely to lead to some sort of feral state or something akin to schizophrenia.  

So, I guess it can be good to take tangents in one’s thinking, as long as they are not too many and too extreme.  But even given that, it’s clearly useful to have someone to rein one in, if one can, when one goes too far off the rails (yes, that’s a bad metaphor, since a train going off the rails at all is in huge trouble, rails representing a near-binary situation‒if one is a train and one is not on the rails completely, one has experienced a failure of locomotion).

Well, I guess that’s that for this week.  Actually, I suppose that is always that, by some principle of identity or self-reflection or something; I’m sure there’s an “official” name.  “It is what it is” as they say.  What I mean, though, is that I am drawing this post, and this week of posts, to a close now.

I hope you have a very good weekend.

After that I don’t give a shit.

(I’m kidding.)


*I know, I know, we won’t have to wait an average number of anything.  There is a specific and exact number of years before the next time February and March have Fridays the 13th, but I cannot be arsed to work it out just now**.

**Okay, well, since I am unable to keep myself from thinking about it at least a little, I think it’s going to be 6 years from now.  That’s because each regular year is 1 day longer than a multiple of a week:  365/7 is 52 with a remainder of 1, so one day longer than a whole number of weeks.  So next February should have the 13th on a Saturday, then a Sunday the following year, but then on a Tuesday the year after that because of the leap year (366/7 is 52 with remainder 2).  Then it will be Wednesday, then Thursday, then Friday.  So 6 years, if my figuring is correct***.

***If it seems counterintuitive that it’s 6 years when the average should be 7, remember that while in this case the leap year makes the next instance come faster, there will be occasional years when Thursday the 13th falls on a leap year and the following year will go straight to Saturday the 13th, the first of another six years (I think) that will be needed for the subsequent Friday the 13th in February.  In any case, 6 plus (6 x ⅙) equals 7, as does 6 + (a x 1/a) no matter what a is****.

****This doesn’t factor in those leap years in which February has a Friday the 13th, but March will not.  And 2032, six years from now, is itself such a leap year:  its February will have a Friday the 13th, but its March 13th will fall on a Saturday.  So the leap-year wrinkle changes not just the overall calculations regarding the average time between dual Fridays the 13th, but the timing of the next one as well, which will come later than the simple six-year count suggests.
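For anyone who would rather check that figuring than trust my hand-counting, here is a sketch in Python (the function name is my own invention) using the standard datetime module to find the next year in which both February and March have a Friday the 13th:

```python
from datetime import date

FRIDAY = 4  # date.weekday(): Monday is 0, so Friday is 4

def next_dual_friday_13th(after_year):
    """Return the first year after `after_year` in which both
    February 13th and March 13th fall on a Friday."""
    year = after_year + 1
    while not (date(year, 2, 13).weekday() == FRIDAY
               and date(year, 3, 13).weekday() == FRIDAY):
        year += 1
    return year

# 2026 has a Friday the 13th in both February and March
print(next_dual_friday_13th(2026))  # -> 2037
```

The reason the two months come as a pair at all:  in a common (non-leap) year, February’s 28 days are exactly four weeks, so March repeats February’s weekday pattern, and any common year that gives February a Friday the 13th gives March one automatically.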

If you can look into the seeds of time, and blog which grain will grow and which will not

Hello, and also, good morning.

What to write about, what to write about‒that is the question today.  Of course, “to be or not to be” is always the question as well, as was recognized by Camus in The Myth of Sisyphus.  If I recall, he arrives at the conclusion that the titular rock-rolling protagonist must be “happy” despite the patent and constant pointlessness and absurdity of his existence.

That goes along with the whole recognition of the absurdity of life itself that is central to the existentialism movement.  Still, it’s hard for me to “imagine Sisyphus happy”, unless he was a true Bodhisattva or had been thoroughly lobotomized by Zeus (or whoever it was that had doomed him to his…well, his doom).

It can help, I guess, to think about the vast scale of the cosmos in space and time (and any other dimensionality that might apply) and also about the incredibly minute scale of the cosmos, the fundamental quantum fields (and whatever gravity ultimately is) interacting from the Planck scale on up.  It helps keep things in perspective.

Of course, even given the scales of the cosmos*, there’s another, almost sort of Buddhist/Taoist notion that notes that each individual‒each particle even‒always exists at the nexus of two “light cones”, existing in an ever-moving now.  These are 4-dimensional cones, by the way, but it’s okay to reduce things by one dimension if you will.  It makes them easier to visualize.

Your (or anyone’s) past light cone is the outer boundary of all influences that can possibly have had any effect upon you at the present moment‒those influences that could have reached you at the speed of light or more slowly.  Similarly, one’s future light cone encompasses all those things that could possibly be influenced by things at the present location at or below the speed of light.

Any motion within the light cones‒the only motion that anything within spacetime can execute, as far as we know‒is called timelike motion.  Any motion that would require going outside a light cone is considered “spacelike” motion, and is not allowed by relativity.  This is not merely because of the speed of light; it’s because the speed of light is defined by the speed of causality.  Causes cannot travel faster or have effects beyond the speed of causality.  This is a bit tautological, I know, but it nevertheless simply must be true.
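If it helps to see the timelike/spacelike distinction in concrete terms, here is a little sketch of my own (an illustration, not anything official) that classifies the separation between two events using the Minkowski interval, in units where the speed of light is 1 and with the (−, +, +, +) sign convention:

```python
def classify_separation(dt, dx, dy=0.0, dz=0.0):
    """Classify the separation between two events via the Minkowski
    interval s^2 = -dt^2 + dx^2 + dy^2 + dz^2 (units with c = 1)."""
    s2 = -dt**2 + dx**2 + dy**2 + dz**2
    if s2 < 0:
        return "timelike"   # inside the light cone: causal influence is possible
    if s2 > 0:
        return "spacelike"  # outside the light cone: no causal connection
    return "lightlike"      # on the cone itself: connected only at light speed

print(classify_separation(dt=2.0, dx=1.0))  # -> timelike
print(classify_separation(dt=1.0, dx=2.0))  # -> spacelike
print(classify_separation(dt=1.0, dx=1.0))  # -> lightlike
```

Anything moving slower than light traces out a chain of timelike-separated events; “spacelike motion” would require a cause and its effect to be spacelike-separated, which is exactly what relativity forbids.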

So each individual’s experience, each individual process, sits at the moving balance point of a future light cone and a past light cone, crossing at the moving present, tracing out a “timelike” path in spacetime.  Of course, individual creatures are not individual particles, and so their overall spacetime path would resemble the final line produced by a sketcher going over and over a particular path to make the curve the artist desires.

If one could look at the structure of a human in spacetime, like the Tralfamadorians of Slaughterhouse-Five, but one could also trace even the spacetime paths of individual “particles”**, a human life would be a sort of higher-dimensional braid in spacetime, surrounded by a haze of incoming and outgoing quantum entities, most of which will be locally bound and interacting, and so will be moving at a net velocity lower than the speed of light.

I’m assuming you don’t eat your food or drink your water or breathe your air or (shudder) sweat or excrete at near light speed.

Imagine what the inside of a mere proton or neutron might look like if one were able to see it as a rendered, four-dimensional model in fine detail!  If you think it wouldn’t be that interesting because it’s so wee, think again.

Remember, only the tiniest fraction of the “rest mass” of a nucleon comes from the mass of the three “net” quarks in it (two up, one down or two down, one up depending on whether it’s a proton or neutron).  Almost all the rest of its mass is the energy of the interactions between these three quarks:  all the gluons exchanged, all the virtual quark/anti-quark pairs popping into existence, mediated by that famous strong force and its weird*** “asymptotic freedom”.

Bringing this back around, I guess my point was merely to note that everyone and everything is pointless from the perspective of the laws of nature and the spacetime scale of the cosmos, but when you learn about those things‒the cosmos at large and small levels‒you are at least familiarizing yourself with those vast workings, and you are in a sense taking part of them into yourself.  That’s kind of a cool thought.

But don’t take too much of it into yourself!  For, much as would happen to someone who stuffed all the information about Graham’s number into one head, if you do you will become a black hole.  Now, it may be possible to survive becoming a black hole, but I don’t recommend betting on that pony.

TTFN


*I wrote a post on Iterations of Zero about how it might be useful for people to consider the cosmic perspective as contrasting with their prosaic concerns.  I don’t remember how good it was, but here’s the link, in case you want to read it and give any feedback you like.

**I use this word for want of a better term that everyone would recognize and that would be succinct.  I think we need such a different term, because a lot of the perceived so-called weirdness and mystery of quantum mechanics comes from trying to use inaccurate terms that originated in times before we understood things as well as we now do.  Quanta are not little “particles” that sometimes act like waves, nor are they little waves that sometimes act like particles (though that’s slightly more accurate).  They are entities unto themselves, and the ways they behave are all always consistent with that nature.  They don’t sometimes act like one thing and at other times act like another.  They all, always, act like what they are.

***Except it’s not weird, really.  Those of us who are surprised by it?  We are the weird ones.  Quantum chromodynamics has always done exactly what it still does, since long before any life at all existed in this universe.  To quote Yudkowsky again, “Since the beginning not one unusual thing has ever happened.”

And his brain ate into the worms…

Ugh.  Didn’t we just leave this party?  Evidently, we did not leave it precipitously enough, because here we are‒or at least, here I am‒rejoining it in the morning.

It seems like an ill-advised notion, but then again, I’m not sure who specifically advised me, or any of you, to do it.  There probably were a few literal, formal pieces of advice that we all or each received throughout our lives‒advice about getting up early and going to work and striving to fulfill our potential, and how if we didn’t we were somehow letting ourselves and (more importantly) letting everyone else down.

“The early bird gets the worm” is a typical phrase about such ambition and dedication and hard work.  But like many of us, I’ve often thought that worms are overrated.  They’re not rated highly at all, I’ll admit, but nevertheless, I think they are rated too highly.  Evidently‒according to what I have read‒all earthworms in at least the northern part of North America were killed off in the last ice age.  Nevertheless, plants grew and flourished without verminous help in the soil before Europeans accidentally brought their own earthworms here.

Of course, the saying is metaphorical, I know that.  We’re not really advised to seek earthworms early in the day, though perhaps liver flukes and flatworms and tapeworms and roundworms are also considered as among the worms that might be caught.

No, probably not.

But anyway, even though metaphorical, that saying raises higher level questions, such as, “Is the life of a metaphorical early bird worth having?”

Consider what that life entails:  Getting up (early), pecking around on the ground for worms and probably also for various other insects and their larvae and a few arachnids as well*; trying to avoid, in that process, being caught by some predator (such as a house cat); trying to find and attract a mate when the season is right; helping build a nest, if you’re that kind of bird; guarding the eggs and maybe sitting on them yourself, until they hatch; then, feeding and protecting them until they can fly on their own; then repeating these steps until disease or starvation or one of those house cats gets you.

That’s it.  And while there are many embellishments and flourishes and complications in the typical human life cycle, overall it is much the same as that of the bird.  Why would we expect it to be otherwise?

Admittedly, humans (and humanoids) can dream up other things to do, and some of them are more interesting and fulfilling, from their own points of view at least, than the ordinary early bird pattern.  But though, in the long run, humans as a whole may become significant enough to do something truly meaningful on a cosmic scale, almost all of them have no deeper lives than those lived by the early birds.

That’s not necessarily a bad thing, of course.  Taken with the pertinent attitude, such a life can be well lived and fulfilling.  It probably won’t end happily, because it’s not in the nature of life to be happy when ending; there’s just no real evolutionary benefit to having such a tendency.

Still, before imbibing the so-called Kool-Aid™ of the motivational life-messages‒those social moralities that keep us getting up and joining the rat race (to shoehorn in another animal-related metaphor)‒it would probably behoove us to consider whether that is the life we think we want, to ponder if that overall shape and experience are okay with us as the outline of our lives.

If so, there’s nothing wrong with that.  As long as you’re not interfering with other people’s ability to live their lives as they see fit**, then do what seems best to you.

But it’s useful to think about what might be the overall shape of your life if you continue as you currently are and if that shape will be aesthetically (or otherwise) pleasing to you.  If not, what change might improve that overall shape, trying to take all reasonably plausible inputs and outputs into consideration?

I won’t say that the unexamined life is not worth living, because, if it’s unexamined, how do you know that it’s not worth living?  Huh?  Huh?  Nevertheless, I will say that the unexamined, unconsidered life could be fulfilling only by accident, whereas it may be possible, with deliberation, to steer toward a better one.

Not that I’m a good piece of evidence in favor of this.  I think and overthink to the point that I hate the noise of my own mind, but I haven’t been able to steer myself into an optimal shape***.  But at least I make a lot of “noise” about such things.  That might be worth something.

Anyway, have a good day.  Enjoy your worms or salads or whatever other life forms you kill and consume to remain alive today (I’m assuming you are not a green plant).  Watch out for the Kool-Aid™ and even more so for the cats.


*I am quite sure that, to such a bird, these things taste delicious, so I don’t mean to disparage their diet as unpalatable.  Appetites of various kinds are species specific; what’s appetizing or sexually attractive to, say, a housefly is unlikely to appeal to any psychologically healthy human.  Likewise, the most beautiful human woman ever is not going to do anything for a male tarantula.  He also probably would have no interest in having a bite of her salad.

**This is more difficult to navigate than it may seem at first, because even when one is acting on one’s own, there are always effects at some level, there are always “externalities”, and occasionally these will have an impact on other people‒a foreseeable but perhaps unforeseen impact.  And vice versa.

***Should there be a “yet” at the end of that sentence?  I don’t know; we’ll have to see what happens to me in the future.  We can be reasonably sure, though, that there shouldn’t be a yeti at the end of that sentence, or of any sentence except one that mentions such creatures.

Nihil vere refert. Quisque videre potest. Nihil vere refert. Nihil vere mihi refert.

Well, I did warn you yesterday that I would be writing a blog post today*.  Go ahead, take a look.

Yesterday’s post was another of my recent, deliberately benign blog posts, not dwelling on my mental health and chronic pain issues, because nobody gives a shit about those things, or at least they don’t want to have to hear about them, because they’re not going to (be able to) do anything about them, and that makes them feel guilty and uncomfortable, which is unpleasantly awkward.

So, anyway, it’s the last day of February in 2026.  We are, in a certain sense, one sixth of the way through the year.

I say “in a certain sense” because it’s not precisely true.  Today is the (31 + 28)th day of the year, so the 59th day of the year.  If that were literally a sixth of the way through the year, the year would only be 354 days long.

It’s somewhat interesting to note here that, because February is shorter than every other month, the first two months of the year are shorter than any subsequent, nonoverlapping** pair of months.  And, let’s see, the first three months of the year have exactly 90 days in non-leap years, whereas April through June have 91, July through September have 92, and October through December also have 92.  So, all the later groups of three months have more days than the first three‒except in leap years, when January through March is 91 days.
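
Those group totals are easy to double-check with a few lines of Python, using the standard calendar module (a throwaway sketch, run here for a non-leap year such as 2026):

```python
import calendar

# Days in each month of a non-leap year (e.g., 2026)
days = [calendar.monthrange(2026, m)[1] for m in range(1, 13)]

# Sum each nonoverlapping three-month group
quarters = [sum(days[i:i + 3]) for i in range(0, 12, 3)]
print(quarters)  # [90, 91, 92, 92]
```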

Evidently, though, the latter six months of the year always have more days than the first six.  I wonder why they did it that way.  Was there an actual reason or did it just sort of happen?

Of course, I know they can’t be equal except on a leap year, since the number of days in a year is odd.  But why couldn’t they have come up with a way that made the years alternate, with one year‒the odd years perhaps‒having the surplus in the first 6 months and the other years having it in the last 6 months?  On leap years they could be equal.

How might that work?  We need 182 days divided among six months, which means we need four months that have just 30 days and two that have 31.  We could say January and February have 30, March has 31, and then repeat the pattern with April, May, and June, and then with July, August, and September***.  I was about to suggest that on odd years we make January have 31 days and on even years we make July have 31 days, but all leap years are even years, so if we add the leap-year day to the first half as we do now, the latter half would be short-changed in exactly those years in which it was supposed to be longer.
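
For what it’s worth, here’s a toy Python sketch of that hypothetical scheme; the month lengths, the alternating rule, and the function name are just my rendering of the proposal above, with the leap rule simplified to every fourth year (ignoring century exceptions):

```python
# Hypothetical month lengths: every third month gets 31 days,
# the rest get 30 (pattern repeated four times) -- 364 days total.
months = [30, 30, 31] * 4

def year_lengths(year):
    """Return (first-half days, second-half days) for the toy calendar."""
    m = months.copy()
    if year % 2 == 1:
        m[0] += 1   # odd years: January gets the 365th day
    else:
        m[6] += 1   # even years: July gets it
    if year % 4 == 0:
        m[1] += 1   # leap day kept in February, as now
    return sum(m[:6]), sum(m[6:])

print(year_lengths(2025))  # (183, 182) - surplus in the first half
print(year_lengths(2026))  # (182, 183) - surplus in the second half
print(year_lengths(2028))  # (183, 183) - the February leap day cancels
                           # July's surplus, evening out the halves
```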

On the other hand, we could put the leap day always in the second half of the year, perhaps in November, or even more sensibly in December:  we would thereby add our extra day to the very end of the year, rather than squeezing it into the earlier part of the year like someone cutting into a line.  That would make the second half two days longer than the first, though, which is unpleasantly asymmetrical in a year with an even number of days.

Of course, really, all days are fungible.  I remember seeing on QI once that apparently some sect maintained that they added an extra day not at the end of February but in the middle; I don’t recall precisely where they thought the day was being inserted, alas, but I can imagine some alternative, anatomical suggestions I’d like to make for them.

Days of a month are fungible (dammit!).  It makes no more sense to say that you added a day into the middle of February and pushed subsequent days later than it does to say that you deposited $100 into your bank account right after the 256th dollar that was already there, pushing what had been dollars 257 through 356 to become dollars 357 through 456.  Every dollar is just “a dollar”, every cent is just “a cent”.  It’s rather reminiscent of the way every electron is interchangeable with every other electron (likewise for all other elementary “particles”).

So, on leap years, the extra day of the year is and can only be (in our current system) the 29th of February, because that’s the day-label that isn’t there in other years.

You’re allowed to imagine if you like that you’re adding a day to the middle of the month and pushing the other days back and renaming them.  You’re also free to argue about how many angels can dance on the head of a pin, or to debate, without first agreeing on word usages****, whether unattended trees that fall in forests make noises.  That doesn’t mean you’re doing anything that has any bearing on the real world.

Okay, well, that’s been much ado about nothing, hasn’t it?  Or, multum strepitus de nihilo fuit, as is apparently the way to say it in Latin, which almost always sounds fancier, though it doesn’t always sound better aesthetically (consider the above headline’s Latin versus the original English).  English is‒or can be‒quite a beautiful language if you take a step back and see it as if from outside.  It can be hard to distinguish that beauty “from within”, though, because the meanings and usages of the words involved can distract from their inherent loveliness.

Tolkien, for instance, wrote that he thought the most beautiful sounding phrase in English was “cellar door”.  I’m not sure I agree with him on this, but it’s a matter of taste, so there’s no slight, or “diss” or “shade”, involved in not both liking the same thing.

Enough nonsense for now, or at least enough nonsense here in this blog for now.  I’m sure that there is plenty of nonsense to be had elsewhere.  Do try to find some that’s enjoyable for you this weekend.


*That was unless I was lucky enough to get very sick or very injured or to die, which I have apparently not been lucky enough to do by this time.

**I say “nonoverlapping” because February and March combined contain the same number of days as January and February combined.

***I think in the final three months it should be October that always has 31 days, because Halloween really should fall on a day that’s a prime number, not a 30th or a 1st.

****Most such debates tend to devolve into discussions about the “definition” of the word “noise”, as if that were concrete and singular and fixed‒which it is not‒rather than the laws of physics and biology that constrain all the actual events of such an arboreal catastrophe.

Our wills and fates do so contrary run, that our devices still are overthrown; Our blogs are ours, their ends none of our own.

Hello and good morning.  It’s Thursday, the 26th of February in 2026, a date that’s only very slightly interesting whether you write it as 2-26-2026 or 26-2-2026.  The fact that you have repeated 2s and repeated 26s is somewhat entertaining, but the zero throws potential symmetries off, making it not nearly as much fun as it could conceivably be.  It’s a shame, really.  I suppose you could write it as 26-02-2026 and rescue a bit of symmetry, but that feels like reaching.  It’s not quite symmetrical anyway, unless one is writing in base-26 or higher.  No, wait, even that wouldn’t work.

I don’t know about what I’m going to write this morning.  That in itself, of course, is nothing unusual.  But I don’t feel that I have much to say about anything at the moment.  I don’t want to get into my depression and ASD and anxiety and chronic pain and insomnia and just general moribund state, because I’m sure no one wants to hear about it anymore, and in any case, there seems to be no way anyone can do anything about it that’s useful, which makes it all the more frustrating.  Writing about it certainly hasn’t cured or even improved my state much, if at all.

Anyway, as I said the other day, you have been put on notice.  Unless you just started reading my blog for the first time yesterday, you’ve no right to act fucking surprised no matter what happens.

Okay, that’s that out of the way.

Now, let’s see, what should I write today?  I could discuss some topics in science, especially physics, though I also have literal, legally recognized expertise in biology, and I know a lot about quite a few other branches of science as well.  This is because I have always been curious about how the world, the universe, actually and literally works on the largest and on the most fundamental scales.

I mean, yes, humans also have their rules and laws and social mores and antisocial morays and all that nonsense, but if you step back even a bit, you can see nearly all human behavior encapsulated by basic primatology.  If you know how the various monkeys and gibbons and gorillas and chimpanzees behave‒especially their commonalities‒human behavior almost always fits right in.  It’s usually not even very atypical.

That doesn’t make the specifics of behavior very easily predictable in any given case, necessarily; then again, we understand an awful lot about the weather and the climate, but the specifics of tomorrow’s weather are tough to predict precisely and accurately, let alone next week’s weather.  Nevertheless, the physics of the longer-term climate effects of certain kinds of atmospheric gases is almost trivial.

Anyway, humans are too annoying to be very interesting, except in special circumstances.  In this, they are perhaps a bit like cockroaches.  From the point of view of a scientist who studies them, they can be interesting, and from just the right angle and with the right detachment, they can even be beautiful (or some of them can).  But overall, they are merely large masses of highly redundant little skitterers, just doing their shit-eating and reproducing and infesting almost every possible location.

Which type of creature did I mean to describe just now?  See if you can figure it out.

Of course, on closer scales, cognitive neuroscience and neurodevelopment and related stuff, such as “neural” networks, “deep” learning, and other such areas are fascinating.  One thing interesting about them is how all the things that brains and computers and so on are and do are implicit in the laws of physics‒clearly they are some of the things that stuff in the universe can do‒and yet, for all we know, they have only ever happened here, just this once in all the vast and possibly infinite cosmos*.

And for all we can tell, given the human proclivity to plan about 20 Planck units ahead and then after that trust to luck, this could be the only place they occur, and their time will not continue much longer, certainly not on a cosmic scale.

I could be wrong about that…except in the sense that, since I am stating it merely as one of the possibilities, I am not actually wrong at all.  Even if humans do survive into cosmic time scales and become cosmically significant, it will still be hard to dispute that it could have happened otherwise‒that humans might have gone extinct and failed to go anywhere but Earth.

Of course, depending on the question of determinism, I suppose one could say that if humans (or their descendants) become cosmically significant then there literally was nothing else that could have happened, at least as seen from outside, at the “end”.

On the other hand, if Everettian quantum mechanics is the best description of the fundamental nature of reality, then in some sense, every quantum possibility actually happens “somewhere” in the universal quantum wave function, though those variations may not include all conceivably possible human outcomes.

Some things that seem as though they should be possible may simply never happen to occur (or occur to happen?) anywhere in the possible states of the universe.  That feels as though it should be unlikely, given how many possible states can be locally evolved in the quantum wave function, but I don’t think we know enough to be sure.

Okay, well, I vaguely hope that this has been mildly interesting and perhaps thought provoking.  It would be enjoyable to get more feedback and thoughts, but I don’t have a very large readership, and only a certain small percentage of people ever seem to interact with written material in any case, so I’m probably lucky to get the feedback that I get.

TTFN


*With the inescapable caveat that, if the universe is spatially and/or temporally infinite, and if as it seems there are only a finite number of distinguishable quantum states in any given region of spacetime (the upper limit of which is defined by the surface area of an event horizon the size of the given region), then every local thing that happens, and all possible variations thereof, “happen” an infinite number of times.  But given that all these regions are more or less absolutely physically distinct and incapable of “communicating” one with another, they can be considered isolated instances in a “multiverse” rather than parts of the same “local universe”.

Man overboard

As the real weekends go, it was better than most, to paraphrase The Wreck of the Edmund Fitzgerald.  By this, I’m referring to this last weekend, the two days before this day, of course.

I did not work on Saturday, which is good, because that would have been the third time in a row.  I also got to hang out with my youngest on Saturday, and we watched about four episodes of Doctor Who together, which was good, good fun.  I cannot complain about that in any way.

I have, though, a weird, disquieting, sinking sort of feeling that it may have been the last time I will see my youngest, or maybe anyone else that I love.  It’s not one of those reliable sorts of feelings, like those that lead one to new insights in science or mathematics or what have you.  It’s probably more a product of depression and anxiety, the feeling that anything good in my life is sure not to last, if it happens at all, because I do not and cannot possibly be worthy of anything good happening to me.

Is that irrational?  Of course it is irrational.  It cannot be expressed in any sense as the ratio of two whole numbers, no matter how many digits they may have.

Wait, wait, let me think about that.  My thought, my feeling, was expressed above finitely.  That is, of course, a shorthand for what is really happening, but even if one were to codify those processes down to the level of each molecular interaction that affects any neural/hormonal process that contributes to my feeling, we know that must be a finite description (though it could, in principle, be quite large).

Even if we’re taking the full spectrum of quantum mechanics into account when describing my mental state, we know that quantum mechanics demands a minimum resolvable distance and time (the Planck length and the Planck time) below which any differentiation is physically meaningless.

A finite amount of information can describe the events and structures and processes in any given finite region of spacetime.  In fact, the maximum amount of information in any given region of spacetime is measured by the surface area (in square Planck lengths) of an event horizon that would span exactly that region, as seen from the outside*.

Any finite amount of information can be encoded as a finite number of bits, which can of course be “translated” to any other equivalent code or number system.  So, really, though the contents of my mind are, in principle, from a certain point of view, unlimited, they are finite in their actual, instantiated content, and can therefore certainly be expressed as an integer, and thus also as a ratio (since any integer could be considered a ratio of itself over one, or twice itself over two, etc.).
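
As a toy illustration of that encoding step (the variable names are, of course, just for show), any finite text can be read off as a single integer and back again:

```python
# Any finite message is a finite string of bits, and any finite string of
# bits names a single (possibly enormous) integer.
thought = "my thoughts are not irrational"
as_int = int.from_bytes(thought.encode("utf-8"), "big")

# ...and any integer is trivially a ratio of two whole numbers:
as_ratio = (as_int, 1)

# The encoding is reversible, so no information is lost in the translation:
n_bytes = (as_int.bit_length() + 7) // 8
decoded = as_int.to_bytes(n_bytes, "big").decode("utf-8")
assert decoded == thought
```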

So, in that sense, my thoughts are not irrational.  Neener, neener, neener.

In many other senses—maybe not the literal, original sense, but in the horrified, cannot accept that not all numbers can be expressed as ratios of integers because that makes the universe too inconceivable, sense, among others—I can be quite irrational.

It’s very difficult to fight one’s irrationality from the inside, alone.  Even John Nash didn’t really beat his schizophrenia from within as shown in the movie version of A Beautiful Mind.  Also, his delusions in real life were far more extravagant and bizarre than those which appear in the sanitized version that made a good Hollywood story.

If one escapes from mental illness from within, one has to consider it largely a matter of luck, like a young child who doesn’t know anything about math getting a right answer on a graduate-level, higher-order differential equation problem.  It’s physically possible; heck, if it were a multiple-choice question, it might even be relatively common***.  But it’s not a matter of being able to choose to do it right and to know how it was done.

People with severe mental health issues are almost always going to need assistance from outside.  This is not an indictment of them or of their need for help.

Surely, someone who has been swept off the deck of a ship by a rogue wave cannot be faulted for needing help from those still on the ship if they are to survive.  It would certainly seem foolish and almost inevitably fruitless if such a person tried to claw his way up the side of the ship to get back on board when there is no ladder and no handholds.  He should certainly not be ashamed that he cannot swim hard enough to launch himself bodily from the water and back onto the surface of the vessel.

One cannot reasonably fault such a person for trying to do the superhuman.  A person might try to do practically anything rather than drown or be eaten alive by some marine predator.  But, of course, barring an astonishing concatenation of events such as the time-reverse of the splashing entry into the ocean happening and sending the person out of the sea just as it was entered, such efforts will not succeed.

And though it might be heartening or at least positive for one to receive encouragement from those still on the deck—don’t drown, keep treading water, you can do it, you’ll make people sad if you drown, you deserve to stay afloat, I’m proud of you for treading water yet another day, it’ll get better, this won’t last forever, you’ve made it this far so you know you can keep going, you don’t want the people who know you to feel sad because you drowned, etc.—in the end it might as well come from the seagulls waiting to pick at one’s floating corpse.

Mind you, certain kinds of words can be more useful than others.  Words like, “Hey, around the other side of the ship there’s a built-in ladder; if you can get over there and time things right, you might be able to grab the lowest rung when the waves lift you, and then climb up,” might be useful because they are directions for using real, tangible resources that we know can make a difference.  Also, words like, “Hang on just a bit longer, we’re throwing down a life preserver on a rope so we can haul you up” would be useful, obviously, unless they were mere “comforting” lies.

Alas, though one could reasonably expect such literal assistance if one were washed overboard—the “laws” of the sea are deeply rooted in the hearts of those who work there, and they include a general tendency to help anyone adrift to the best of one’s abilities—when it comes to mental illness, the distress and the problems are difficult for others to discern and easy to ignore.  Calls of distress are often experienced as annoyances, and even treated with contempt, since those hearing them cannot readily perceive that they themselves might be similarly washed overboard at any time.

But, of course, they might be.

I don’t know how I got on this tangent, but I guess I never really do.  I just go where my mind takes me, and my mind is not a reliable driver.  It is, though, a reliable narrator.  It doesn’t matter, anyway.  Nothing does.

Anyway, here we go again into another work week, because that was what we did last week.  I wish I could offer you better reasons, but I’m really only good at breaking things down, destroying things, not at lifting anyone or anything up.  That comes from other regions and is conveyed by other ministers.


*From within an event horizon, the volume could be much larger than the spacetime that seems to be enclosed from the outside, because spacetime inside the horizon is massively curved and stretched.  It’s conceivable (at least to me) that there could be infinite space** within, at least along the dimension(s) of maximum stretch, just as there is infinite surface area to a Gabriel’s Horn, but only finite volume.

**See, mathematically, one can stuff infinite space inside a nutshell.  Hamlet was right.  He often was.

***Perhaps this explains why certain types of mental health problems can respond well to relatively straightforward interventions, and even to more than one kind of intervention with roughly comparable success, e.g., CBT and/or basic antidepressants and such.  These relatively tractable forms of depression are the “multiple choice problem” versions of mental illness.  This does not make them any less important.

This is not an attention-grabbing headline

I’m writing this post on my smartphone, even though I brought my lapcom with me yesterday evening.  I did not use my lapcom for yesterday’s post, such as it was.  I didn’t even write that post in the morning yesterday, or at least, I didn’t write the “first draft” of it then.

By the end of the workday on Wednesday, I didn’t feel like I was going to want to write a blog post on Thursday.  So I went to the site directly and just wrote the “Hello and good morning,” and the “TTFN” and set it to publish later.

I already knew what title I was going to want to use for it.  I wanted to use Polonius’s dithering, meandering jabber about brevity being the soul of wit, as a sort of left-handed self-compliment about my own brevity in that post, and because, in the original form, it would have made the headline longer than the post, which would be ironically funny, in principle.

Then, yesterday morning, I got the urge to put my little “insert here” bracketed bit in the post, the better to convey how disgruntled and disaffected and self-disgusted I (still) felt, as well as how tired.  It did sort of spoil the joke about the headline being longer than the post, of course.  At least the older joke about Polonius still holds water.  Then again, that joke was made by Shakespeare, so we shouldn’t be too surprised if it has serious legs (though this raises the question of how serious legs could possibly hold water).

One thing worth at least assessing this week might be whether there is an aesthetic difference between this post (for instance) and the posts I wrote earlier this week, on the lapcom.  Writing on the lapcom is quite different for me in many ways.

On the lapcom, I generally have to work to stop myself before a post, or whatever, gets too long, whereas on the smartphone, that isn’t as frequent a problem.  Not that I can’t yammer on and on even with the smartphone, of course.  Some might say all I ever do is yammer on and on.  But anyway, I can’t write as “effortlessly” on the smartphone as I can on a regular keyboard*.

Sorry, I’m retreading a lot of old ground here, which I guess is better than retreading a lot of old tires. I know how to tread on the ground; indeed, I cannot recall a time when I didn’t know how to do that kind of treading.  Whereas retreading a tire sounds like something that requires special skills and equipment, both of which I lack.

I don’t know, I’ve heard of “retread” tires, but I don’t know if such things still abound, or if they ever did.  It sounds vaguely like a bad idea, like such tires might be more prone to blowouts.  But latex is a finite resource, and there aren’t very good synthetic alternatives, so maybe there’s at least some cost/benefit tradeoff (or treadoff?) there.

Ugh.  With that last joke, I probably convinced at least some of my readers that, yes, the world would be better off if I were dead.  Actually, I say that as if it were conditional, but it’s not.  It would be more in line with reality to say “the world will be better off when I am dead”.

There’s a quote by which to be remembered, eh?

I cannot say whether I will be better off when dead.  It’s probably a nonsensical question.  When I am dead, I will not be anything at all, not better, not worse, not uglier.  What happens to virtual particles after they have annihilated?  Nothing, and less than nothing, for they truly no longer exist, and in some senses they never existed.  Indeed, as physics goes, they probably never do exist; they are a shorthand description of what happens in quantum fields when perturbations in the fields have effects that do not rise to the level of actual, true particle production.

Or so I am led to understand.

From another point of view, it is possible for something to improve, at least in a sense, by ending.  I’ve mentioned this before, but if the curve of a function‒perhaps a graph of the “quality of life” or one’s “wellbeing”, to say nothing of happiness‒is persistently negative, then returning to zero is a net gain.  It can be a huge net gain, in fact.  This is related to the origin of my own version of an old saying, which I use with tongue definitively in cheek:  The one who dies with the most debt wins.

Now, of course, the integral, the area “under” that wellbeing curve would not be improved by the curve reverting to zero and stopping.  But at least that integral would not keep getting more and more negative over time.

Some might say, “well, the integral can become less negative over time, and might even become positive”.  This is, in principle, true.  And when one is young enough, it’s relatively easy to tip the curve, and its integral, into positive territory.  But as the curve goes on, having been negative for a longer and longer time, it’s going to become ever harder to bring things to a net, overall positive integral, even if one could reliably make one’s curve positive (which one often simply cannot do).

Of course, the moment to moment experience (which is all the mind really gets) of an ascending curve could be pretty darn good, and might well be worth experiencing, even if it’s not enough to bring the integral into positive territory.  We are straying into the “peak-end” rule here, which was derived from studies of (among other things) colonoscopies but applies to much else in human experience.

Speaking of peak endings, I’ll mention in passing the curious fact that, no fewer than twice in the last week, the evening train service has been disrupted by someone either getting hit by or becoming ill next to the train.

Earlier this week, right by the station where I catch the train to go back to the house, a man‒probably homeless, by the look of him‒had collapsed next to the train tracks.  I saw him taken away, finally, on a stretcher.  He didn’t look physically injured‒certainly not in the ways I would expect someone who had actually been hit by a train to look‒but he did look cachectic, which is why I thought he might be homeless.

Then, last night’s commute was interrupted by what they call a “trespasser strike”, one that did not involve the train I rode but which always slows everything down.  I’m vaguely amused by the euphemism “trespasser strike”.  A “trespasser” here is a non-passenger who doesn’t work for the train company (or whatever) who is in the area adjacent to the tracks.  The “strike” part is probably self-explanatory.

I suppose it’s literally true, at least from a legal point of view, to call the person a trespasser.  But it’s amusing that the train people have to say something derogatory about a person hit by a train‒even if the person deliberately put themselves in harm’s way‒to sort of, I don’t know, assuage the company’s conscience.

But we are all trespassers, in at least some senses.  We are also, in other senses, all owners.  We are all innocent, and we are all, in some other senses, guilty.  “Every cop is a criminal and all the sinners saints.”  Above all, we are all very much just passing through, staying only a very short time.  We are all virtual particles.  Or you might say, we are all Iterations of Zero.

Have a good weekend.  I should not be writing a post tomorrow (in more than one sense).


*I wish I could honestly say that my use of a piano-style keyboard were as effortless, but I am terribly rusty with that, though I started learning it when I was 9, roughly two years before I got my first typewriter.