You’re so vain, you probably think that nothing matters

I was going to start by saying that I had probably written all I could about Friday the 13th and the fact that there are two in a row whenever a non-leap-year February has a Friday the 13th, and that a first glance might lead one to think this should happen roughly every 7 years on average*.  However, as I noted last time I discussed this, because the leap year day falls in February, we will not get the two-in-a-row Fridays the 13th (February and March) as often as we otherwise might; it will not happen every 7 years on average.

Then, this morning, after recalling that today was Friday the 13th, I ran through the coming years’ February 13ths in my head in the shower, and it occurred to me that the next Friday the 13th in February‒which will be in 6 years, as I noted in the past‒will not be followed by a Friday the 13th in March!  2032 (six years from now) will be a leap year, so February will have 29 days, and so there will be no Friday the 13th that March.

The next paired ones, then, will be a further 5 years after that, in 2037 (not a leap year).  It would have been 6 years later, but there are two leap years in that interval, 2032 and 2036, so the next one comes a year sooner than it would otherwise.

It occurred to me that, because of the frequency of leap years, which is almost twice that of the cycles of days of the week, the frequency of those paired dates may well be once every 11 years rather than every 7.  At least those are both prime numbers.  I’m not going to work out some exact formula right now, though.  It’s not really important.
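For anyone who does want it worked out, a brute-force count is easy enough to sketch in Python‒no formula, just counting over one full 400-year Gregorian cycle, which repeats exactly (it is a whole number of weeks long):

```python
from datetime import date

def paired_friday_13th(year):
    """True when both February and March of `year` have a Friday the 13th.

    This can only happen in a common year: with no February 29th in
    between, March 13th falls on the same weekday as February 13th.
    """
    return (date(year, 2, 13).weekday() == 4   # Monday is 0, so Friday is 4
            and date(year, 3, 13).weekday() == 4)

# The Gregorian calendar repeats exactly every 400 years (146097 days,
# a whole number of weeks), so one full cycle gives the true average.
pairs = [y for y in range(2000, 2400) if paired_friday_13th(y)]
average_gap = 400 / len(pairs)
print(len(pairs), average_gap)
```

If the count is right, the true average gap comes out to a bit over 9 years‒between the naive guess of 7 and the 11 guessed above.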

Of course, one could say that nothing is truly important, and I am persuadable along those lines.

There is a Doctor Who Christmas Special (the one from series 5) in which the antagonist/guest protagonist (played by Michael Gambon!) describes a woman in a cryo chamber as “nobody important”, and the Doctor characteristically responds by saying, “Nobody important?  Blimey, that’s amazing.  You know, in 900 years of time and space, I’ve never met anyone who wasn’t important before.”

This is typical Doctor, of course, but it raises the objection Dash (from The Incredibles) voiced when told that everyone is special:  Saying that everyone is important can be the same thing as saying no one is.

Of course, important is in the eye of the beholder.  But then again, the beholder is not important, either, except in its own subjective estimation and perhaps that of a few other, equally unimportant, owners of such eyes.

So, yeah, one could argue relative and subjective importance from local points of view, which is valid but more or less vacuous outside its small scale as far as I can see.  On a cosmic scale, it’s all just dust and shadows.  But you could also say that about the entirety of the cosmos itself.

I guess import has always been subjective, even though people are not inclined to see it that way.  But, of course, people are the products of their “local” forces, and they are not responsible for the laws of nature, nor for the things which have happened in the past that have affected them in the present (which could come under a certain interpretation of “the laws of nature” in and of itself).  I won’t get into all that now.

Going back to the shower, but on an entirely different subject, I was also thinking about the effects of diminishing amounts of shampoo in the bottle on the center of gravity of the bottle.  At the start, when it’s full, the center of gravity is roughly in the geometric center of volume of the whole thing.  But as one uses the shampoo, the center of gravity shifts lower and lower, since the air replacing shampoo in the upper part of the bottle is much less dense than the shampoo or the bottle.

But then, as one gets to the dregs, the smaller and smaller amount of shampoo in the bottle contributes less and less to the overall mass distribution of the bottle and its contents, and the center of mass begins to head back up.  Finally, when the bottle is “empty”, the center of gravity will have returned to almost the same place it was when the bottle was full.
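A minimal one-dimensional sketch of this (the 50 g bottle, 450 g of shampoo, and 20 cm height below are made-up illustrative numbers, and both the bottle and the liquid column are idealized as uniform):

```python
def com_height(f, bottle_mass=50.0, liquid_mass=450.0, height=20.0):
    """Center of mass (in cm above the base) of a uniform cylindrical
    bottle plus a liquid column filling fraction f of it from the bottom.
    """
    m_liquid = f * liquid_mass           # remaining shampoo mass
    com_bottle = height / 2              # uniform bottle
    com_liquid = f * height / 2          # liquid column of height f * H
    return ((bottle_mass * com_bottle + m_liquid * com_liquid)
            / (bottle_mass + m_liquid))

levels = [i / 100 for i in range(101)]
heights = [com_height(f) for f in levels]
lowest = min(heights)
# Full and empty give the same center of mass in this idealized model;
# in between, it dips, bottoming out where the center of mass sits
# exactly at the liquid's surface.
```

In this idealized model the full and empty centers of mass coincide exactly, and the minimum turns out to sit right at the liquid’s surface‒a nice little result to check by hand.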

All that’s fairly trivial, well-known stuff, I know.  But it got me to thinking about how many problems in physics, such as those involving gravitation (in its Newtonian form), are solved using concepts like the center of mass, which is really just a way of combining and averaging the effects of numerous tiny bits of gravitating material as if they were concentrated at one point.

Much of the mathematics of physics works this way, coarsely approximating the very fine details of reality in a way that provides reliable, reproducible guidelines and can produce testable predictions.

But the granularity of reality doesn’t actually ever go away, not at any level.  Even at the level of the quantum wavefunction of a single “particle”, the actual behavior of the thing as it interacts with the “larger” world is the summation of the effects of all the possible quantum states of the particle, superposed upon each other and interacting with things‒everything‒which are also just collections of superpositions of quanta.  That superposition happens in a “space” that doesn’t directly coincide with the macroscopic space we experience, but whatever its dimensions are, they are real, because they have durable, reproducible effects.

Mathematics may be unreasonably effective in the physical sciences, as Eugene Wigner famously noted, but it seems not to be a refining of description but rather an averaging out, a glossing over, the inking of an underlying rough pencil drawing which nevertheless still constitutes the real, original picture.

It may be that, in a sense, all science is just various forms of statistical mechanics.  We know that, at larger scales, we definitely need the tools of probability and statistics to navigate as best we can the territory of reality.  And yet, we don’t teach this sort of stuff to most people, ever.  I wrote a post about this on Iterations of Zero, if I remember correctly.

I could go on about all this rather easily, I guess, but I am using my smartphone today, and my thumbs are getting sore.  That’s okay; yesterday’s post was probably way too long, anyway.

If I did a video of my thoughts on this I might be able to get into more detail, though it would probably be even more erratic and tangential than my writing.  Still, maybe it would be worth trying.

In the meantime, I’ll write at you again tomorrow.


*Go ahead, do a search on my blog page for Friday the 13th; I’m all but sure it will bring up the pertinent blog posts.

 

I had a good headline idea, but it slipped my mind

I was surprised by how much response I received to yesterday’s blog post (and that of the day before), as well as by the number of comments.  It’s very gratifying, and I appreciate it very much.  Thank you.

As for today, well, I am really not sure what to write, because yesterday’s blog was‒from my viewpoint, anyway‒about as free-form and chaotic and tangential and stochastic (not to say redundant) as anything I’ve written.  But maybe that’s just the experience I had while writing it; maybe it doesn’t actually come across that way to the reader(s).  It’s difficult for me to know, because even more than reading, writing is a solitary thing.

That’s not to say that people can’t write together.  Back when I was a teenager, I co-wrote some partial stories with one of my best friends, and we did it sitting next to each other and talking things through aloud as we typed.  That was a pretty active and interactive collaboration.

Unfortunately, I don’t think we got very far with it.  We made much more progress writing silly computer programs in BASIC on the Apple II+ my father had bought.  This was in the days before there were any ISPs, as far as I know, though we did dial in to a couple of local bulletin board services from time to time with my dad’s old modem (I think it was 600 baud*, but it may have been some even divisor or even a very small multiple of that number).

One time, I even had a conversation with a girl (!) who was helping run one of the bulletin boards.  She was (supposedly) about my age, and obviously she was much more into computers than I was at the time.  There was never (in my regretful mind) any possibility of an ongoing interaction, let alone a physical meetup or anything, however.  Even then, though I was reasonably confident within my local group of friends and teachers, I was painfully shy and awkward, and could never make conversation other than about specific topics.

Goal-directed interactions are okay, as they tend to flow naturally from the process involved.  This is why I’ve made nearly all my friends at school or at work.  Purely social interactions were never really an option for me, except with people I already knew quite well.  And having a successful romantic relationship was unfortunately not in the cards for me.

It still isn’t, as far as I can tell.  I suspect the problem is that there’s no other member of my true species on this planet.  I did come reasonably close, or so I thought for a long time, but I’ve been divorced now about five years longer than I was married, so I apparently wasn’t all that successful.

Okay, well, sorry about the weird, ancient info-dump.  It’s not nearly as cool as the data that’s coming in from the recently activated Vera C. Rubin Observatory.  That, at least, is the sort of thing that helps restore my faith in humanity.  Or, well, maybe it would be more accurate to say that it shifts my Bayesian credence slightly away from the “humans are without net redeeming value” end and toward the “humans may not be all that bad in the end” end.

The credence is still quite low, though.  By which I mean I’m closer to the first end than the second most of the time.

Things might be a little bit better if the sort of people who do things like setting up the Vera Rubin telescope, and who built and launched and now use the James Webb Space Telescope, and the members of the Human Genome Project, and the people who study cognitive neuroscience, were the sort of people working in government, writing and administering laws.  Generally speaking, though, the first type of people don’t tend to want to do the governing nonsense, probably not least because a lot of that business is not about everyone trying to do the best they can for the people they represent.

The people who want to do astronomy and mathematics and biology and geology and neuroscience and meteorology and so on are probably some of the best people to do those things‒not just from their point of view but also from the viewpoint of civilizational benefit.  Unfortunately, many of the people who want to go into government and politics tend to be some of the worst people for those jobs, from the point of view of civilization.

I can’t say they are the worst possible group for the job.  The truly disaffected and uninterested or the misanthropic and nihilistic might well do a worse job even than the lot who do it now.  This is despite the fact that most of those latter people act on shallow and immediate self-interest.  Self-interest can do the job adequately when the incentives are structured such that one’s self-interest is served by serving the interests of the people of one’s community/city/nation/species.

Those incentives are very tricky to manage, unfortunately.  It would be much better if we could find people who had real enthusiasm and curiosity and an actually somewhat scientific approach to government.  If only we could find a group as committed to seeing a truly and objectively well-run society‒in which everyone was better off than they would have been in nearly any other‒as the group who set up the Vera Rubin observatory was committed to actually getting the observatory done so they and we could learn ever more about the universe on the largest scales, things might be quite a bit better than they are.  Maybe not, but my credence leans more toward the “maybe so” end.

Alas, politics and government were not born of human curiosity and creativity‒the things almost entirely unique to the species‒but of the old, stupid primate dominance hierarchy/mating drives, which are evolutionarily understandable, but which don’t make for pretty, let alone beneficial, government.  Think about it.  Would you want a bunch of self-serving apes doing the jobs of government?

Oh, wait!  That is the group doing the jobs of the government!  Of course, it’s also the group being governed.  Uh-oh.  This could be boding better**.

Not that being recognized as an ape is an insult per se; apes are all that we’ve had available, and they’re the best that’s come along so far.  Some of them are really not so bad.  Some of them figure out ways to launch immense telescopes into space, not so very long after one of them first created the telescope.  Some of them figure out ways to cure and even prevent unnecessary disease.  Some of them figure out ways to turn simple manipulations of base-two arithmetic into information processing that can be scaled up to any kind of logic and information that can be codified.

Some of them just write blogs and sometimes write stories and songs and such***.  But hopefully, that’s not too detrimental an endeavor.


*A baud is, strictly speaking, one symbol per second sent over the phone lines, though at these slow speeds each symbol carried exactly one bit, so a baud amounted to a bit per second.  Not a meg, not a K, not even a byte, but rather a bit‒a binary digit, a one versus a zero, on or off.  If you listened to the sound of the modem, you could imagine you could almost hear the individual bits.
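To put those speeds in perspective, here is a rough sketch assuming the common serial framing of the era (8 data bits plus a start and a stop bit, so about 10 bits per character):

```python
# A rough feel for those speeds: over a simple serial link, a character
# typically cost about 10 bits (8 data bits plus start and stop bits).
baud = 600                       # one bit per second at these speeds
bits_per_char = 10
chars_per_second = baud / bits_per_char      # 60 characters per second
page_chars = 2000                            # a rough "page" of plain text
seconds_per_page = page_chars / chars_per_second
```

That is, about half a minute to transmit a single page of plain text.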

**Tip of the hat to Dave Barry’s “Mister Language Person”.

***Though I have done my very small part in advancing human scientific knowledge, in that I am a co-author and co-investigator on an actual published scientific paper.

This is the blog this man’s soul tries

Well, in case some of you were starting to feel lighthearted and optimistic‒just a little more at ease with yourselves and the world after two whole days without reading my work‒here I am to write another blog post that will probably bring you down and make you inclined to wonder whether anything at all is really worth anything, or if you should just give it all up, especially the habit of reading this blog.

Congratulations.  It’s Monday again, the start of another work week.  Also, Daylight Saving Time has begun (or is it “ended”?) over this last weekend, so for a bit, a lot of people’s circadian rhythms are going to be slightly off.  That will contribute to an increased number of accidents, both minor and major.  There will also be increased rates of illness (again, both major and minor), and I believe there is even some evidence that men, at least, suffer more heart attacks after the time changes.

And what are the other advantages of Daylight Saving Time?  I’m not aware of any other actual benefits.

Of course, like most of you, I’m starting my own work week today, and it’s going to be a long one; the office is scheduled to be open this Saturday.  By then, most everyone’s head will have adjusted to the shifted clocks.  I’m speaking of things here in the US, of course; I honestly don’t know off the top of my head which other countries have adopted this weird custom.

Whence did it come?  I’ve heard explanations and excuses at various times in my life, but they are not very convincing.  If you know‒with reasonably good credence‒please share that information in the comments below.  And like and share this post if you’re so inclined, especially if you have a strong sense of irony.  Heck, like and share the song itself if you want to immerse yourself in a kind of meta-level irony, or something like that.

I don’t know what to discuss today, even more so than usual.  I’ve committed to trying not to dwell on‒or at least not to share‒my negative thoughts and emotions and so on, since I’m sure they do very little other than make other people feel depressed (yes, certain kinds of mental illness can be rather contagious, in a sense at least).

I won’t say I would never wish depression on anyone; that’s ridiculous.  For instance, I would feel much safer in the world if this Presidential administration, and indeed most of its equivalents around the globe, suffered from enough depression to make them second-guess themselves and doubt themselves from time to time.  It almost ought to be a requirement for office that someone be prone to dysthymia at the very least, so they would feel less confident that their shit doesn’t stink, so to speak.

And no, I am not suggesting that the people of the world ought to put me in charge for the best chance to make the world better.  I used to dream of such things, and I had a very Sauron-like wish to control events in the world for the greater good.  It might still not be too horrible a notion.

But my inclination over time has become more negative, more Melkor/Morgoth like.  So if anyone is inclined to encourage and engender acts of chaos and destruction on a hitherto unseen scale, by all means, give me immense power.  I make no warranties or guarantees or even assurances that I will use such power wisely.

I’ll try, of course.  No one can be expected (fairly) to do anything more than that, no matter what Yoda said.

Goodness knows I’ve tried a lot, in a lot of ways, all throughout my life, literally for as long as I can remember.  By which I mean, I’ve tried to do my best to do good things and to be a good person‒a good friend, a good son, a good husband, a good father, a good doctor, all that.  You can probably tell by my current state‒solitary, lonely, divorced, professionally ostracized, in bad physical health, in horrible mental health, alone*‒how well I’ve done at all those things.

I’m not exaggerating when I say I’ve tried hard.  I’m not one to big myself up very much, but I have worked hard all my life, trying to be a good son, a good friend, a good brother, a good husband, a good doctor, a good father.  Yet despite my sincere efforts and my reasonably high intelligence, here I am.

I suppose a lot of the disappointing outcome(s) is/are related to my ASD, both the heart-based one and the brain-based one, as well as my tendency (probably related to the preceding) to depression and some degree of low-grade paranoia.

By “low-grade” there, I mean that I don’t literally suspect that there are malicious forces plotting against me or trying to control me; I honestly don’t think highly enough of humans (or any other beings) to expect them to be capable of such things.  It would almost be reassuring if they were.

No, I mean I just have a general, global sense‒not just intellectually, but in my bones as it were, in my deep intuitions‒that I cannot rely upon anyone or upon anything, other than the laws of nature themselves (whatever their final version might be).  I don’t “trust” anyone or anything, including (one might even say “especially”) myself.  Everything is a calculated risk.

This is of course literally true for everyone, but I think most people hide from that fact most of the time, usually (but definitely not always) without terrible consequences.  I don’t know if that’s worse or better.  It may be more pleasant, but I suspect it’s misleading, and has been responsible for, or at least it has contributed to, many ills the human race has brought upon itself and upon others.

Whataya gonna do?  I guess you’re gonna do whatever you must, as they say, since it’s not as though you can do anything other than what you do once you’ve done it, and so it was all along what you were going to do, and so it was what you must do (or must have done).

I hope you have a good day and a good week.  I’ve tried to withhold my depression and negativity, with at least some degree of success‒trust me, I’ve withheld‒and I will continue to do so, because sharing it is pointless, and asking for help is laughable.


*Now, that phrase had some redundant notions, didn’t it?

Thoughts meander like a restless Melkor in the Outer Void…

It’s Friday, and this week I can be thankful therefor, because I do not work tomorrow.  The office will be closed (and locked) on Saturday.  Only those who have keys and the alarm code and some other reason for being there would be there (I suppose someone could break in, but there are cameras and alarms in place, and there is nothing of any significant net value, i.e., value worth risking the alarms and cameras to reach, inside).

Next Friday won’t be as good from a strictly work/not work point of view, but at least it will be Friday the 13th again, for the second month in a row.  Then we will have to wait an average* of 7 years for it to happen that way again.

***

Okay, I guess I’ve always known that I’m weird, but I just wrote a series of footnotes about Friday the 13th and year lengths and weekly cycle recurrences that dwarfed what I had thus far written in the main body of this post.  I think I’m probably the only being in the universe who would write about such things and imagine that anyone else would be interested.

Yeah, definitely weird.

Still, I guess that sort of thing just happens when you talk to yourself in print and share it with any interested parties who might stumble upon it.  Also, when one is without companions or interactions, one can, like Melkor, develop thoughts and thought patterns that are unlike those of one’s brethren.

I suppose that can sometimes be a good thing, though it can also sometimes be a very bad thing (rarely as bad as in the fictional Melkor case).  Though all improvement is change, most change is not an improvement‒at least not if it’s not deliberate and directed change.  So if one develops thoughts that are significantly divergent from those of all of one’s peers, odds are that they will not be a net improvement over most of the peer-born thoughts.

I have, of course, mitigated this somewhat by reading a lot (and consuming other media that deal with science and mathematics and philosophy and such, as well as comedy panel shows).  That’s not randomly chosen reading, either; it’s carefully chosen reading.  I think this has helped improve the general content and tendencies of my thought, because I’ve influenced myself with the carefully thought-out thoughts of very bright people.

I suppose, though, that if one can read what one wants and does so, one is not really isolated from all other thoughts, so one’s own cannot be too very different, or at least are not very likely to be.  That’s good, I think.  Simply developing new thoughts without much input from others would be most likely to lead to some sort of feral state or something akin to schizophrenia.  

So, I guess it can be good to take tangents in one’s thinking, as long as they are not too many and too extreme.  But even given that, it’s clearly useful to have someone to rein one in, if one can, when one goes too far off the rails (yes, that’s a bad metaphor, since a train going off the rails at all is in huge trouble, rails representing a near-binary situation‒if one is a train and one is not on the rails completely, one has experienced a failure of locomotion).

Well, I guess that’s that for this week.  Actually, I suppose that is always that, by some principle of identity or self-reflection or something; I’m sure there’s an “official” name.  “It is what it is” as they say.  What I mean, though, is that I am drawing this post, and this week of posts, to a close now.

I hope you have a very good weekend.

After that I don’t give a shit.

(I’m kidding.)


*I know, I know, we won’t have to wait an average number of anything.  There is a specific and exact number of years before the next time February and March have Fridays the 13th, but I cannot be arsed to work it out just now**.

**Okay, well, since I am unable to keep myself from thinking about it at least a little, I think it’s going to be 6 years from now.  That’s because each regular year is 1 day longer than a whole number of weeks:  365/7 is 52 with a remainder of 1.  So next February should have the 13th on a Saturday, then a Sunday the following year, but then on a Tuesday the year after that, because of the leap year (366/7 is 52 with remainder 2).  Then it will be Wednesday, then Thursday, then Friday.  So 6 years, if my figuring is correct***.

***If it seems counterintuitive that it’s 6 years when the average should be 7, remember that, while in this case the leap year makes the next instance come sooner, there will be occasional leap years in which February 13th falls on a Thursday, so the following year skips straight to Saturday the 13th‒the first of another six years (I think) that will be needed before the subsequent Friday the 13th in February.  In any case, 6 plus (6 × ⅙) equals 7, as does 6 + (a × 1/a) no matter what a is****.

****This doesn’t factor in those leap years in which February has a Friday the 13th, but March will not.  That may change the overall calculations somewhat regarding the average time between dual Fridays the 13th, but not the calculations about when the next one will be.
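The weekday walk in these footnotes can be checked mechanically with Python’s datetime module:

```python
from datetime import date

# Walk February 13th's weekday forward from 2026, mirroring the
# footnote's mod-7 reasoning (regular year: shift by 1; when a leap
# day intervenes: shift by 2).
names = ["Monday", "Tuesday", "Wednesday", "Thursday",
         "Friday", "Saturday", "Sunday"]
walk = {year: names[date(year, 2, 13).weekday()] for year in range(2026, 2033)}
# Note how Monday gets skipped between 2028 and 2029: the leap day of
# 2028 falls between those two February 13ths.
print(walk)
```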

The painful truth – the truthful pain

Please forgive me if I behave or speak as though today were Tuesday.  I know that it is in fact Wednesday as I write this‒it’s anyone’s guess on what day of the week you might be reading it (though I suspect that, for the most part, if one doesn’t read my blog on the day it’s posted, one is unlikely ever to read it)‒but I didn’t write a post yesterday (Tuesday) so I may be a bit thrown off.

I didn’t write a post yesterday because I didn’t go to work yesterday.  And I didn’t go to work yesterday because of pain.  I had already been having a bad pain day on Monday, one in a long string of worse-than-average pain days.  Then, in the evening on Monday, while trying to reach for something in my room, I took a bad step on the tile floor and slipped and nearly fell.

I caught myself, as is implied by the “nearly” in that last sentence, but I wrenched my back significantly, and the night and morning and so on were particularly bad, and I hardly slept and I did not have the energy to go to work, or at least to do so and not spend all my time writhing and snapping at people.  So I stayed at the house.

Regarding chronic pain, I’m fond of quoting Ulrich’s description of Vermithrax from Dragonslayer:  “When a dragon gets this old it knows nothing but pain, constant pain.  It grows decrepit.  Crippled.  Pitiful.  Spiteful.”  I had to double-check and fix a few words to get the quote exactly correct, but the most important parts are always remembered correctly.  And the whole thing feels like it describes me pretty well.

I used to be much more pleasant and amiable than I have become since my chronic pain began.  Though I’ve had problems with depression since my teens and anxiety before that and ASD since I was born (in two different senses), I always tried to be polite and amiable and kind as much as I could.  I always figured that was the real position of strength:  not being in competition with other people but just trying to do your best while others do similarly.

But when one is in chronic pain, it is hard not to be grumpy (presumably even if one hasn’t lost almost everything one had worked to achieve through the first thirty plus years of one’s life, though I cannot know for sure).  I think there are people who have only known me since the time of the beginning of my back problem who would be surprised by how pleasant I was back in the day.

Though, there are those who read this blog who did know me in the past, before the aforementioned time, and maybe they would give a different report.  I can only share my own perceptions and perspective, and I could to a certain degree be mistaken about how I came across to other people.

I’ve never been all that good at knowing what other people think of me.  Because of that, I generally try just to take people at their word, and take those words to have their most straightforward meaning.  If someone hopes to hint at something and I don’t get it, that’s on them.  Hints are overrated even when given and received by people who embrace the practice and consider themselves good at it.  There are too many possible variations and points of incomplete information.

Words, though, are a remarkably compact way to carry information.  Anyone who has saved and transferred a video file and has also saved and transferred word processor files should grasp the difference, at least if they have been paying any attention.  A video only a few moments long can, despite the latest compression algorithms, have a storage size that dwarfs that of even, for instance, the Word file for the unsplit book Unanimity.

Now, Unanimity is about half a million words long.  It’s certainly the longest thing that I have written.  But a video I did on my phone last week for minor fun, which was maybe 20 seconds long, takes up more than 16 meg, while the combined file size for the Kindle versions of both Unanimity: Book 1 and Unanimity: Book 2 is about 3.5 meg*.

That’s twenty-odd seconds of stupid and pointless video, which will never be shared anywhere, versus a work that took more than a year for me to complete, edit, and publish.

At least it’s fair to say that, from a useful information point of view, my book was and is much more efficient.  Though it requires enough shared experience for readers to fill in the meanings and images of the things described, that requirement is met by nearly every human on the planet.  Perhaps videos would be better for a truly alien species hitherto unfamiliar with human civilization.
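Taking the rough numbers above at face value, the arithmetic looks like this:

```python
# Back-of-envelope comparison using the approximate figures above:
# a ~500,000-word book in ~3.5 meg versus ~20 seconds of video in ~16 meg.
book_bytes = 3.5 * 1024 * 1024
book_words = 500_000
bytes_per_word = book_bytes / book_words     # roughly 7 bytes per word

video_bytes = 16 * 1024 * 1024
video_seconds = 20
video_rate = video_bytes / video_seconds     # bytes per second of video

# How many "words' worth" of storage each second of video consumes:
words_per_video_second = video_rate / bytes_per_word
```

By that estimate, each second of the video occupies the storage of a hundred-thousand-odd words of text.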

Okay, well, that was a weird post, I guess.  I mean that in absolute terms, mainly; I don’t know if this post is much weirder or much less weird than my usual posts.  Possibly every potential interlocutor would have different things to say about that.

I guess that’s okay.  It had better be okay, if it’s true, because if it’s true, there’s nothing anyone can do about that, and they’re already living with it.  This is the Litany of Gendlin, as quoted by Eliezer Yudkowsky, of which I have a screenshot from his book Rationality:  From AI to Zombies, below.

Well, I hope you have a good day, whatever the truth is that you and all the rest of us are living.


*This is according to the AI summary of Google’s search for “Robert Elessar Unanimity file size”.  It’s almost certainly correct, because the info is part of the Amazon description of the book.  But it’s humorous to me that it’s easier to do an AI based web search to find the file size of my own novel than it is to look up the file, since I’m using my phone and don’t have direct access to the original at the moment.

And his brain ate into the worms…

Ugh.  Didn’t we just leave this party?  Evidently, we did not leave it precipitously enough, because here we are‒or at least, here I am‒rejoining it in the morning.

It seems like an ill-advised notion, but then again, I’m not sure who specifically advised me, or any of you, to do it.  There probably were a few literal, formal pieces of advice that we all or each received throughout our lives‒advice about getting up early and going to work and striving to fulfill our potential, and how if we didn’t we were somehow letting ourselves and (more importantly) letting everyone else down.

“The early bird gets the worm” is a typical phrase about such ambition and dedication and hard work.  But like many of us, I’ve often thought that worms are overrated.  They’re not rated highly at all, I’ll admit, but nevertheless, I think they are rated too highly.  Evidently‒according to what I have read‒all earthworms in at least the northern part of North America were killed off in the last ice age.  Nevertheless, plants grew and flourished without verminous help in the soil before Europeans accidentally brought their own earthworms here.

Of course, the saying is metaphorical, I know that.  We’re not really advised to seek earthworms early in the day, though perhaps liver flukes and flatworms and tapeworms and roundworms are also considered as among the worms that might be caught.

No, probably not.

But anyway, even though metaphorical, that saying raises higher-level questions, such as, “Is the life of a metaphorical early bird worth having?”

Consider what that life entails:  Getting up (early), pecking around on the ground for worms and probably also for various other insects and their larvae and a few arachnids as well*; trying to avoid, in that process, being caught by some predator (such as a house cat); trying to find and attract a mate when the season is right; helping build a nest, if you’re that kind of bird; guarding the eggs and maybe sitting on them yourself, until they hatch; then, feeding and protecting them until they can fly on their own; then repeating these steps until disease or starvation or one of those house cats gets you.

That’s it.  And while there are many embellishments and flourishes and complications in the typical human life cycle, overall it is much the same as that of the bird.  Why would we expect it to be otherwise?

Admittedly, humans (and humanoids) can dream up other things to do, and some of them are more interesting and fulfilling, from their own points of view at least, than the ordinary early bird pattern.  But though, in the long run, humans as a whole may become significant enough to do something truly meaningful on a cosmic scale, almost all of them have no deeper lives than those lived by the early birds.

That’s not necessarily a bad thing, of course.  Taken with the pertinent attitude, such a life can be well lived and fulfilling.  It probably won’t end happily, because it’s not in the nature of life to be happy when ending; there’s just no real evolutionary benefit to having such a tendency.

Still, before imbibing the so-called Kool-Aid™ of the motivational life-messages‒those social moralities that keep us getting up and joining the rat race (to shoehorn in another animal-related metaphor)‒it would probably behoove us to consider whether that is the life we think we want, to ponder if that overall shape and experience are okay with us as the outline of our lives.

If so, there’s nothing wrong with that.  As long as you’re not interfering with other people’s ability to try to live their lives as they try to see fit**, then do what seems best to you.

But it’s useful to think about what might be the overall shape of your life if you continue as you currently are and if that shape will be aesthetically (or otherwise) pleasing to you.  If not, what change might improve that overall shape, trying to take all reasonably plausible inputs and outputs into consideration?

I won’t say that the unexamined life is not worth living, because, if it’s unexamined, how do you know that it’s not worth living?  Huh?  Huh?  Nevertheless, I will say that the unexamined, unconsidered life could be fulfilling only by accident, whereas it may be possible, with deliberation, to steer toward a better one.

Not that I’m a good piece of evidence in favor of this.  I think and overthink to the point that I hate the noise of my own mind, but I haven’t been able to steer myself into an optimal shape***.  But at least I make a lot of “noise” about such things.  That might be worth something.

Anyway, have a good day.  Enjoy your worms or salads or whatever other life forms you kill and consume to remain alive today (I’m assuming you are not a green plant).  Watch out for the Kool-Aid™ and even more so for the cats.


*I am quite sure that, to such a bird, these things taste delicious, so I don’t mean to disparage their diet as unpalatable.  Appetites of various kinds are species specific; what’s appetizing or sexually attractive to, say, a housefly is unlikely to appeal to any psychologically healthy human.  Likewise, the most beautiful human woman ever is not going to do anything for a male tarantula.  He also probably would have no interest in having a bite of her salad.

**This is more difficult to navigate than it may seem at first, because even when one is acting on one’s own, there are always effects at some level, there are always “externalities”, and occasionally these will have an impact on other people‒a foreseeable but perhaps unforeseen impact.  And vice versa.

***Should there be a “yet” at the end of that sentence?  I don’t know; we’ll have to see what happens to me in the future.  We can be reasonably sure, though, that there shouldn’t be a yeti at the end of that sentence, or of any sentence except one that mentions such creatures.

Nihil vere refert. Quisque videre potest. Nihil vere refert. Nihil vere mihi refert.

Well, I did warn you yesterday that I would be writing a blog post today*.  Go ahead, take a look.

Yesterday’s post was another of my recent, deliberately benign blog posts, not dwelling on my mental health and chronic pain issues, because nobody gives a shit about those things, or at least they don’t want to have to hear about them, because they’re not going to (be able to) do anything about them, and that makes them feel guilty and uncomfortable, which is unpleasantly awkward.

So, anyway, it’s the last day of February in 2026.  We are, in a certain sense, one sixth of the way through the year.

I say “in a certain sense” because it’s not precisely true.  Today is the (31 + 28)th‒that is, the 59th‒day of the year.  If that were literally a sixth of the way through the year, the year would only be 354 days long.
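For anyone who would rather check than take my word for it, a couple of lines of Python with the standard library confirm the arithmetic:

```python
import datetime

# Day-of-year for the last day of February 2026 (not a leap year)
doy = datetime.date(2026, 2, 28).timetuple().tm_yday
print(doy, doy * 6)  # 59 354
```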

It’s somewhat interesting to note here that, because February is shorter than every other month, the first two months of the year are shorter than any subsequent, nonoverlapping** months of the year.  And, let’s see, the first three months of the year have 90 days exactly in non-leap years, whereas April through June have 91, July through September have 92, and October through December also have 92.  So, all the later groups of three months have more days than the first three‒except in leap years, when January through March is 91 days.
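If you don’t feel like counting on your fingers, Python’s calendar module will tally those three-month groups for you, for a non-leap year (2026) and a leap year (2028):

```python
import calendar

for year in (2026, 2028):  # a non-leap year and a leap year
    # Number of days in each month, then summed in groups of three
    days = [calendar.monthrange(year, m)[1] for m in range(1, 13)]
    quarters = [sum(days[i:i + 3]) for i in range(0, 12, 3)]
    print(year, quarters)
# 2026 [90, 91, 92, 92]
# 2028 [91, 91, 92, 92]
```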

Evidently, though, the latter six months of the year always have more days than the first six.  I wonder why they did it that way.  Was there an actual reason or did it just sort of happen?

Of course, I know they can’t be equal except on a leap year, since the number of days in a year is odd.  But why couldn’t they have come up with a way that made the years alternate, with one year‒the odd years perhaps‒having the surplus in the first 6 months and the other years having it in the last 6 months?  On leap years they could be equal.

How might that work?  We need 182 days divided among six months, which means we need four months that have just 30 days and two that have 31.  We could say January and February have 30, March has 31, and then repeat with April, May, and June and then July, August, and September***.  I was about to suggest that in odd years we make January have 31 days and in even years we make July have 31 days, but all leap years are even years, so the latter half would be comparatively short-changed in the very years in which it would otherwise be longer, if we add the leap-year day to the first half as we do now.

On the other hand, we could put the leap day always in the second half of the year, perhaps in November, or even more sensibly in December:  we would thereby add our extra day to the very end of the year, rather than squeezing it into the earlier part of the year like someone cutting into a line.  That would make the second half two days longer than the first, though, which is unpleasantly asymmetrical in a year with an even number of days.
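Just as a sketch‒purely hypothetical, since nobody is going to reform the calendar on my say-so‒the scheme above could be written out like this:

```python
def month_lengths(year, leap=False):
    # Hypothetical scheme: every quarter runs 30, 30, 31 (364 days total);
    # the 365th day goes to January in odd years and July in even years;
    # and the leap day, if any, is tacked onto the very end, in December.
    months = [30, 30, 31] * 4
    if year % 2 == 1:
        months[0] += 1   # January gets the extra day in odd years
    else:
        months[6] += 1   # July gets it in even years
    if leap:
        months[11] += 1  # leap day at the very end of the year
    return months
```

In an even leap year, the halves come out to 182 and 184 days, which is exactly the two-day asymmetry I was complaining about above.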

Of course, really, all days are fungible.  I remember seeing on QI once that apparently some sect maintained that they added an extra day not at the end of February but in the middle; I don’t recall precisely where they thought the day was being inserted, alas, but I can imagine some alternative, anatomical suggestions I’d like to make for them.

Days of a month are fungible (dammit!).  It makes no more sense to say that you added a day into the middle of February and pushed subsequent days later than it does to say that you deposited $100 into your bank account right after the 256th dollar that was already there, pushing what had been dollars 257 through 356 to become dollars 357 through 456.  Every dollar is just “a dollar”, every cent is just “a cent”.  It’s rather reminiscent of the way every electron is interchangeable with every other electron (likewise for all other elementary “particles”).

So, on leap years, the extra day of the year is and can only be (in our current system) the 29th of February, because that’s the day-label that isn’t there in other years.

You’re allowed to imagine if you like that you’re adding a day to the middle of the month and pushing the other days back and renaming them.  You’re also free to argue about how many angels can dance on the head of a pin, or to debate, without first agreeing on word usages****, whether unattended trees that fall in forests make noises.  That doesn’t mean you’re doing anything that has any bearing on the real world.

Okay, well, that’s been much ado about nothing, hasn’t it?  Or, multum strepitus de nihilo fuit, as is apparently the way to say it in Latin, which almost always sounds fancier, though it doesn’t always sound better aesthetically (consider the above headline’s Latin versus the original English).  English is‒or can be‒quite a beautiful language if you take a step back and see it as if from outside.  It can be hard to distinguish that beauty “from within”, though, because the meanings and usages of the words involved can distract from their inherent loveliness.

Tolkien, for instance, wrote that he thought the most beautiful sounding phrase in English was “cellar door”.  I’m not sure I agree with him on this, but it’s a matter of taste, so there’s no slight, or “diss” or “shade”, involved if we don’t both like the same thing.

Enough nonsense for now, or at least enough nonsense here in this blog for now.  I’m sure that there is plenty of nonsense to be had elsewhere.  Do try to find some that’s enjoyable for you this weekend.


*That was unless I was lucky enough to get very sick or very injured or to die, which I have apparently not been lucky enough to do by this time.

**I say “nonoverlapping” because February and March combined contain the same number of days as January and February combined.

***I think in the final three months it should be October that always has 31 days, because Halloween really should fall on a day that’s a prime number, not a 30th or a 1st.

****Most such debates tend to devolve into discussions about the “definition” of the word “noise”, as if that were concrete and singular and fixed‒which it is not‒rather than the laws of physics and biology that constrain all the actual events of such an arboreal catastrophe.

Is it mean not to know if one’s writing is above average?

It’s Friday again, but that’s not much consolation, since the office is open tomorrow and I will be working, unless I am lucky enough to get very sick or very injured or to die or something.

As usual, I have no idea what would be good to write today.  Actually, goodness—certainly in the moral sense, but possibly also in the sense of quality—probably doesn’t have much to do with my blog.  Perhaps weirdness would be a better adjective/measure to relate to my writing.

I’m probably not an objective judge of such things.  Then again, I don’t know of any fully objective judges.  Still, there is some degree of variability involved in such things, as in nearly everything else made up of smaller, more fundamental parts interacting in complicated ways to produce so-called emergent behavior.  Nevertheless, cognitive biases are reasonably well studied, as are many emotional blind spots and the like.  And it’s certainly true that I have a difficult time being objective about myself and about my work.

Oddly enough for such a self-despising person, I actually like my own writing, especially my fiction.  When I reread my stories I don’t tend to see them as horrible or wretched or whatever traditionally happens with “artists” who look at their own work.  I think at least some of that sort of thing is probably affected, since our society (perhaps semi-deliberately) looks down upon artists who think highly of their own art.

If only we did that with politicians; there’s an area where humility would be welcome and beneficial, I think.

Anyway, I tend to like my stories—I wrote them, after all, because I wanted to tell and thereby hear those tales—but I don’t necessarily think they’re great or good or decent from anyone else’s point of view.  I honestly don’t know how good or how bad they might be from nearly any others’ points of view, except my sister’s, and she’s probably almost as prone to be biased about my work as I am.

Though, again, my attitude toward my writing is not akin to that oft-noted personal bias that leads more than 90% of drivers to think that they are in the upper 50% in ability (i.e., above the median), which it is mathematically impossible* for them all to be.  I don’t think of my writing as better (or worse) than average or the median.

I don’t really compare my writing to anyone else’s.  I just tend to like it.  That’s probably a very good thing, because I have to edit it myself.  Even these daily blog posts are run through three more times after my first draft.  My fiction I tend to reread and edit seven times (that took a very long time with Unanimity).  Why seven?  Well, I had to pick a number, and once or twice is clearly too few, and thirteen would just be unworkable.

Also, with my fiction, I tend to follow advice Stephen King repeated in his book On Writing by working to reduce my final word count by at least ten percent by the time I’m done editing it.  I used to try to do that here, but I sometimes add a bit during editing, so that becomes quite difficult and hardly worth the effort.

All that being said, it would really be nice to get some feedback on my writing, especially on my stories (from people who have read them).  Of course, I would love it if someone loved my stories and told me so and told me why.  The closest I think I’ve come is a review on Amazon of Welcome to Paradox City that was written by a former high school friend (he has since died of cancer, sadly) who had actually honestly bought the book for himself when it came out.  He wrote that the three stories in that collection each made him wish they were the beginning of a whole book, basically implying that he wanted to know what happens next.

That’s a good thing about short stories—you can leave people hanging and that’s just “too bad” for them (though it can be enjoyable).  Short stories also don’t have to have “happy” endings, which is good for me, since only one of the three in the above collection ends happily in any reasonable sense.

Of course, as I’ve noted before, my short stories are rarely short enough ever to have been, for instance, published in a magazine in the old days.  The only real exception to this is Solitaire, which I don’t think any magazine would have published, because it is very, very dark indeed.

Okay, well, I guess I ended up writing something today, even if it was all just figurative omphaloskepsis.  I don’t know whether you readers consider this good or bad or ugly, or how it compares in your estimation to posts like I posted yesterday and/or the day before.  If you’re so inclined, please let me know.

And if you have actually bought and read any of my books, I do beseech you to leave me a review on Amazon (or wherever) if you get the chance.  Thanks.

I’ll write at you tomorrow, barring—as always—the unforeseen.


*It is not, on the other hand, impossible for 90% of people to be above average (i.e., above the mean).  I’m sure I’ve addressed this before, but imagine one had administered a test, and 100 people took it.  Imagine that 90 of those people got 51/100 on the test, whereas the remaining 10 people scored zero.  Then, the arithmetic mean (what people usually mean by “average”) would be (90 x 51)/100**.  That comes to 4590/100, or a mean score of 45.9.  So, 90% of those people scored above average.  That’s not saying much, but it’s true.

** Yes I know I don’t really need the parentheses there, but I’m leaving them in for clarity.
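And since this is the sort of claim that’s easy to verify, the footnote’s example works out in a few lines of Python:

```python
scores = [51] * 90 + [0] * 10           # 90 people score 51, 10 score zero
mean = sum(scores) / len(scores)        # (90 * 51) / 100
above = sum(1 for s in scores if s > mean)
print(mean, above)  # 45.9 90
```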

A notification of whatever

I expect this post to be brief today, though I’ve been known to be wrong about that sort of thing.  I had sort of “intended” to make my headline “Oh, well, whatever…” and then make the entire body of the post “…never mind.”  Thus I would be quoting the last verse-line of Smells Like Teen Spirit by Nirvana.  The subsequent words in the song are just the chorus and then a refrain of “A denial” repeated nine times (if memory serves).

I wasn’t sure I hadn’t already done this before, though.  I could have checked, but I didn’t have the mental energy.

Still, using that last line from a Kurt Cobain song carries a certain subtext which would have served my purposes well.

Or, well, actually, given past history, it probably wouldn’t have served my purposes at all.  None of this sort of thing seems to serve my purpose, no matter what I do.  As far as I can tell, only one person actually read my (admittedly somewhat long) post yesterday, but though I was borderline explicit about my meaning, I don’t think it did any good whatsoever.  That’s not unusual, of course; much if not all that I do never ends up doing me much good.

Sometimes I have to be subtle because I cannot force myself to be open about my internal states after a lifetime of fighting to appear “normal”, to the degree I can achieve that, and to avoid being too much trouble for other people, since I don’t think I have the right to trouble them, and in fact I think (or feel) that I’m fundamentally reprehensible.

I shouldn’t worry, though.  The times I am more open and obvious‒even when I am borderline explicit‒don’t appear to be any more successful than when I am at my most cryptic.  Possibly, I am just not able to communicate my feelings effectively with humans.

At the very least, my success rate must be below one percent.  It’s not quite as bad as playing the lottery, but it’s pretty pathetic.  Then again, so am I.

Whatever.  Never mind.  Ha ha.

But really, though, I don’t have much to say.  Quoting iconic songs may be the extent of my capacity to convey myself.

Ironically, I don’t feel the urge to share quotes from my own songs (or my fiction).  You would think they would be the best choice for conveying my inner thoughts.  That’s not always the case, though.

In fact, though I like my songs well enough, and Breaking Me Down is meant to be fairly explicitly about depression (at least my species thereof), none of them have enough oomph, as it were.  Or maybe it’s just that they are not well known*, so no one recognizes and identifies with the words.

I think I have some pretty good lines in Come Back Again, including what’s probably my favorite:

“Only meeting strangers

always losing friends.

Every new beginning

always ends.”

It may seem a bit bleak, but it’s also true more or less by definition.  If you’re meeting someone for the first time, they had been a stranger until that point.  And friends do become “lost”.  And the next two lines are rather obviously true.

Of course, a very good signing (singing?) off quote would be from Pink Floyd’s Time:  “The time is gone, the song is over, thought I’d something more to say.”

I’ve always been annoyed that they added the little reprise of Breathe after that and made it officially part of the song, because those other two lines constitute a perfect song ending.  I always figured they didn’t want to make the song end on too much of a downer, so they threw in the reprise as part of that song instead of as a separate one.  Maybe they were unwittingly invoking a version of the peak-end rule I mentioned the other day.

Anyway, I have a locked and loaded draft of a blog post that already applies that couplet from Time, with the headline being the first half, continuing into the post which consists only of the second half of that quote, followed by the embedded “video” of the final song on the first album of The Wall.

That, of course, is still a draft, and has been waiting there for a while, because if I use it, it’s meant to be my final blog post, and practically my final anything.  So I wasn’t going to use it today.  Not quite.  But I’m close.  The Nirvana quote isn’t quite as final, but it is a warning, especially given the fate of the guy who wrote it.

Anyway, consider yourselves on notice.  On notice of what?

Figure it out.


*That’s an understatement, eh?

Man overboard

As the real weekends go, it was better than most, to paraphrase The Wreck of the Edmund Fitzgerald.  By this, I’m referring to this last weekend, the two days before this day, of course.

I did not work on Saturday, which is good, because that would have been the third time in a row.  I also got to hang out with my youngest on Saturday, and we watched about four episodes of Doctor Who together, which was good, good fun.  I cannot complain about that in any way.

I have, though, a weird, disquieting, sinking sort of feeling that it may have been the last time I will see my youngest, or maybe anyone else that I love.  It is not one of those reliable sorts of feelings, like those that lead one to new insights in science or mathematics or what have you.  It’s probably more a product of depression and anxiety, the feeling that anything good in my life is sure not to last, if it happens at all, because I do not and cannot possibly be worthy of anything good happening to me.

Is that irrational?  Of course it is irrational.  It cannot be expressed in any sense as the ratio of two whole numbers, no matter how many digits they may have.

Wait, wait, let me think about that.  My thought, my feeling, was expressed above finitely.  That is, of course, a shorthand for what is really happening, but even if one were to codify those processes down to the level of each molecular interaction that affects any neural/hormonal process that contributes to my feeling, we know that must be a finite description (though it could, in principle, be quite large).

Even if we’re taking the full spectrum of quantum mechanics into account when describing my mental state, we know that quantum mechanics demands a minimum resolvable distance and time (the Planck length and the Planck time) below which any differentiation is physically meaningless.

A finite amount of information can describe the events and structures and processes in any given finite region of spacetime.  In fact, the maximum amount of information in any given region of spacetime is measured by the surface area (in square Planck lengths) of an event horizon that would span exactly that region, as seen from the outside*.

Any finite amount of information can be encoded as a finite number of bits, which can of course be “translated” to any other equivalent code or number system.  So, really, though the contents of my mind are, in principle, from a certain point of view, unlimited, they are finite in their actual, instantiated content, and can therefore certainly be expressed as an integer, and thus also as a ratio (since any integer could be considered a ratio of itself over one, or twice itself over two, etc.).

So, in that sense, my thoughts are not irrational.  Neener, neener, neener.

In many other senses—maybe not the literal, original sense, but, among others, in the horrified sense of one who cannot accept that not all numbers can be expressed as ratios of integers, because that makes the universe too inconceivable—I can be quite irrational.

It’s very difficult to fight one’s irrationality from the inside, alone.  Even John Nash didn’t really beat his schizophrenia from within as shown in the movie version of A Beautiful Mind.  Also, his delusions in real life were far more extravagant and bizarre than those which appear in the sanitized version that made a good Hollywood story.

If one escapes from mental illness from within, one has to consider it largely a matter of luck, like a young child who doesn’t know anything about math getting a right answer on a graduate-level, high-order differential equation problem.  It’s physically possible; heck, if it were a multiple-choice question, it might even be relatively common***.  But it’s not a matter of being able to choose to do it right and to know how it was done.

Severe mental health issues are going to need to receive assistance from outside, almost always.  This is not an indictment of them or of the need for help.

Surely, someone who has been swept off the deck of a ship by a rogue wave cannot be faulted for needing help from those still on the ship if they are to survive.  It would certainly seem foolish and almost inevitably fruitless if such a person tried to claw his way up the side of the ship to get back on board when there is no ladder and no handholds.  He should certainly not be ashamed that he cannot swim hard enough to launch himself bodily from the water and back onto the surface of the vessel.

One cannot reasonably fault such a person for trying to do the superhuman.  A person might try to do practically anything rather than drown or be eaten alive by some marine predator.  But, of course, barring an astonishing concatenation of events‒such as a time-reverse of the original splash into the ocean sending the person back out of the sea just the way he entered it‒such efforts will not succeed.

And though it might be heartening or at least positive for one to receive encouragement from those still on the deck—don’t drown, keep treading water, you can do it, you’ll make people sad if you drown, you deserve to stay afloat, I’m proud of you for treading water yet another day, it’ll get better, this won’t last forever, you’ve made it this far so you know you can keep going, you don’t want the people who know you to feel sad because you drowned, etc.—in the end it might as well come from the seagulls waiting to pick at one’s floating corpse.

Mind you, certain kinds of words can be more useful than others.  Words like, “Hey, around the other side of the ship there’s a built-in ladder; if you can get over there and time things right, you might be able to grab the lowest rung when the waves lift you, and then climb up,” might be useful because they are directions for using real, tangible resources that we know can make a difference.  Also, words like, “Hang on just a bit longer, we’re throwing down a life preserver on a rope so we can haul you up” would be useful, obviously, unless they were mere “comforting” lies.

Alas, though one could reasonably expect such literal assistance if one were washed overboard—the “laws” of the sea are deeply rooted in the hearts of those who work there, and they include a general tendency to help anyone adrift to the best of one’s abilities—when it comes to mental illness, the distress and the problems are difficult for others to discern and easy to ignore.  Calls of distress are often experienced as annoyances, and even treated with contempt, since those hearing them cannot readily perceive that they themselves might be similarly washed overboard at any time.

But, of course, they might be.

I don’t know how I got on this tangent, but I guess I never really do.  I just go where my mind takes me, and my mind is not a reliable driver.  It is, though, a reliable narrator.  It doesn’t matter, anyway.  Nothing does.

Anyway, here we go again into another work week, because that was what we did last week.  I wish I could offer you better reasons, but I’m really only good at breaking things down, destroying things, not at lifting anyone or anything up.  That comes from other regions and is conveyed by other ministers.


*From within an event horizon, the volume could be much larger than the spacetime that seems to be enclosed from the outside, because spacetime inside the horizon is massively curved and stretched.  It’s conceivable (at least to me) that there could be infinite space** within, at least along the dimension(s) of maximum stretch, just as there is infinite surface area to a Gabriel’s Horn, but only finite volume.

**See, mathematically, one can stuff infinite space inside a nutshell.  Hamlet was right.  He often was.

***Perhaps this explains why certain types of mental health problems can respond well to relatively straightforward interventions, and even to more than one kind of intervention with roughly comparable success, e.g., CBT and/or basic antidepressants and such.  These relatively tractable forms of depression are the “multiple choice problem” versions of mental illness.  This does not make them any less important.