This is the blog this man’s soul tries

Well, in case some of you were starting to feel lighthearted and optimistic‒just a little more at ease with yourselves and the world after two whole days without reading my work‒here I am to write another blog post that will probably bring you down and make you inclined to wonder whether anything at all is really worth anything, or if you should just give it all up, especially the habit of reading this blog.

Congratulations.  It’s Monday again, the start of another work week.  Also, Daylight Saving Time has ended (or is it “begun”?) over this last weekend, so for a bit, a lot of people’s circadian rhythms are going to be slightly off.  That will contribute to an increased number of accidents, both minor and major.  There will also be increased rates of illness (again, both major and minor), and I believe there is even some evidence that men, at least, suffer more heart attacks in the days after the time changes.

And what are the other advantages of Daylight Saving Time?  I’m not aware of any actual benefits at all.

Of course, like most of you, I’m starting my own work week today, and it’s going to be a long one; the office is scheduled to be open this Saturday.  By then, everyone’s heads will have mostly adjusted to the shifted time.  I’m speaking of things here in the US, of course; I honestly don’t know off the top of my head whether other cultures have adopted this weird custom.

Whence did it come?  I’ve heard explanations and excuses at various times in my life, but they are not very convincing.  If you know‒with reasonably good credence‒please share that information in the comments below.  And like and share it if you’re so inclined, especially if you have a strong sense of irony.  Heck, like and share the song itself if you want to immerse yourself in a kind of meta-level irony, or something like that:

I don’t know what to discuss today, even more so than usual.  I’ve committed to trying not to dwell on, or at least not to share, my negative thoughts and emotions and so on, since I’m sure they do very little other than make other people feel depressed (yes, certain kinds of mental illness can be rather contagious, in a sense at least).

I won’t say I would never wish depression on anyone; that’s ridiculous.  For instance, I would feel much safer in the world if this Presidential administration, and indeed most of its equivalents around the globe, suffered from enough depression to make them second-guess themselves and doubt themselves from time to time.  It almost ought to be a requirement for office that someone be prone to dysthymia at the very least, so they would feel less confident that their shit doesn’t stink, so to speak.

And no, I am not suggesting that the people of the world ought to put me in charge for the best chance to make the world better.  I used to dream of such things, and I had a very Sauron-like wish to control events in the world for the greater good.  It might still not be too horrible a notion.

But my inclination over time has become more negative, more Melkor/Morgoth like.  So if anyone is inclined to encourage and engender acts of chaos and destruction on a hitherto unseen scale, by all means, give me immense power.  I make no warranties or guarantees or even assurances that I will use such power wisely.

I’ll try, of course.  No one can fairly be expected to do more than that, no matter what Yoda said.

Goodness knows I’ve tried a lot, in a lot of ways, all throughout my life, literally for as long as I can remember.  By which I mean, I’ve tried to do my best to do good things and to be a good person‒a good friend, a good son, a good husband, a good father, a good doctor, all that.  You can probably tell by my current state‒solitary, lonely, divorced, professionally ostracized, in bad physical health, in horrible mental health, alone*‒how well I’ve done at all those things.

I’m not exaggerating when I say I’ve tried hard.  I’m not one to big myself up very much, but I have worked hard all my life, trying to be a good son, a good friend, a good brother, a good husband, a good doctor, a good father.  Yet despite my sincere efforts and my reasonably high intelligence, here I am.

I suppose a lot of the disappointing outcome(s) is/are related to my ASD, both the heart-based one and the brain-based one, as well as my tendency (probably related to the preceding) to depression and some degree of low-grade paranoia.

By “low-grade” there, I mean that I don’t literally suspect that there are malicious forces plotting against me or trying to control me; I honestly don’t think highly enough of humans (or any other beings) to expect them to be capable of such things.  It would almost be reassuring if they were.

No, I mean I just have a general, global sense‒not just intellectually, but in my bones as it were, in my deep intuitions‒that I cannot rely upon anyone or upon anything, other than the laws of nature themselves (whatever their final version might be).  I don’t “trust” anyone or anything, including (one might even say “especially”) myself.  Everything is a calculated risk.

This is of course literally true for everyone, but I think most people hide from that fact most of the time, usually (but definitely not always) without terrible consequences.  I don’t know if that’s worse or better.  It may be more pleasant, but I suspect it’s misleading, and has been responsible for, or at least it has contributed to, many ills the human race has brought upon itself and upon others.

Whataya gonna do?  I guess you’re gonna do whatever you must, as they say, since it’s not as though you can do anything other than what you do once you’ve done it, and so it was all along what you were going to do, and so it was what you must do (or must have done).

I hope you have a good day and a good week.  I’ve tried to withhold my depression and negativity, with at least some degree of success‒trust me, I’ve withheld‒and I will continue to do so, because sharing it is pointless, and asking for help is laughable.


*Now, that phrase had some redundant notions, didn’t it?

Thoughts meander like a restless Melkor in the Outer Void…

It’s Friday, and this week I can be thankful therefor, because I do not work tomorrow.  The office will be closed (and locked) on Saturday.  Only those who have keys and the alarm code and some other reason for being there would be there (I suppose someone could break in, but there are cameras and alarms in place, and there is nothing of any significant net value, i.e., value worth risking the alarms and cameras to reach, inside).

Next Friday won’t be as good from a strictly work/not work point of view, but at least it will be Friday the 13th again, for the second month in a row.  Then we will have to wait an average* of 7 years for it to happen that way again.

***

Okay, I guess I’ve always known that I’m weird, but I just wrote a series of footnotes about Friday the 13th and year lengths and weekly-cycle recurrences that dwarfed what I had written so far in the main body of this post.  I think I’m probably the only being in the universe that would write about such things and imagine that anyone else would be interested.

Yeah, definitely weird.

Still, I guess that sort of thing just happens when you talk to yourself in print and share it with any interested parties who might stumble upon it.  Also, when one is without companions or interactions one can, like Melkor, develop thoughts and thought patterns that are unlike those of one’s brethren.

I suppose that can sometimes be a good thing, though it can also sometimes be a very bad thing (rarely as bad as in the fictional Melkor case).  Though all improvement is change, most change is not an improvement‒at least not if it’s not deliberate and directed change.  So if one develops thoughts that are significantly divergent from those of all of one’s peers, odds are that they will not be a net improvement over most of the peer-born thoughts.

I have, of course, mitigated this somewhat by reading a lot (and consuming other media that deal with science and mathematics and philosophy and such, as well as comedy panel shows).  That’s not randomly chosen reading, either; it’s carefully chosen reading.  I think this has helped improve the general content and tendencies of my thought, because I’ve influenced myself with the carefully thought-out thoughts of very bright people.

I suppose, though, that if one can read what one wants and does so, one is not really isolated from all other thoughts, so one’s own cannot be too very different, or at least are not very likely to be.  That’s good, I think.  Simply developing new thoughts without much input from others would be most likely to lead to some sort of feral state or something akin to schizophrenia.  

So, I guess it can be good to take tangents in one’s thinking, as long as they are not too many and too extreme.  But even given that, it’s clearly useful to have someone to rein one in, if one can, when one goes too far off the rails (yes, that’s a bad metaphor, since a train going off the rails at all is in huge trouble, rails representing a near-binary situation‒if one is a train and one is not on the rails completely, one has experienced a failure of locomotion).

Well, I guess that’s that for this week.  Actually, I suppose that is always that, by some principle of identity or self-reflection or something; I’m sure there’s an “official” name.  “It is what it is” as they say.  What I mean, though, is that I am drawing this post, and this week of posts, to a close now.

I hope you have a very good weekend.

After that I don’t give a shit.

(I’m kidding.)


*I know, I know, we won’t have to wait an average number of anything.  There is a specific and exact number of years before the next time February and March have Fridays the 13th, but I cannot be arsed to work it out just now**.

**Okay, well, since I am unable to keep myself from thinking about it at least a little, I think it’s going to be 6 years from now.  That’s because each regular year is 1 day longer than a whole number of weeks:  365/7 is 52 with a remainder of 1.  So next February should have the 13th on a Saturday, then a Sunday the following year, but then on a Tuesday the year after that because of the leap year (366/7 is 52 with remainder 2).  Then it will be Wednesday, then Thursday, then Friday.  So 6 years, if my figuring is correct***.

***If it seems counterintuitive that it’s 6 years when the average should be 7, remember that while in this case a leap year makes the next instance come faster, there will be occasional years in which February’s Thursday the 13th falls in a leap year, so the following February skips straight from Thursday to Saturday the 13th, adding an extra year to that particular wait.  In any case, 6 plus (6 x ⅙) equals 7, as does 6 + (a x 1/a) no matter what nonzero a is****.

****This doesn’t factor in those leap years in which February has a Friday the 13th but March does not.  And actually, that changes not only the average but also the date of the next occurrence:  2032, six years from now, is a leap year, so its February will have a Friday the 13th while its March‒one weekday further along, thanks to February 29th‒will not.  The next year in which both months manage it is 2037, eleven years away.
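Calendrical figuring like this is easy to check mechanically.  As a quick sketch using Python’s standard datetime module (where weekday() numbers Monday as 0, so Friday is 4), here is a listing of the years in which February and March both have a Friday the 13th:

```python
from datetime import date

def dual_friday_13(year: int) -> bool:
    """True if both February 13 and March 13 of `year` fall on a Friday."""
    # date.weekday(): Monday is 0, so Friday is 4
    return date(year, 2, 13).weekday() == 4 and date(year, 3, 13).weekday() == 4

dual_years = [y for y in range(2026, 2060) if dual_friday_13(y)]
print(dual_years)  # → [2026, 2037, 2043, 2054]
```

Note that only common years appear: in a leap year, March 13 falls one weekday later than February 13 (February 29 intervenes), so in a leap year the two months can never share the weekday of the 13th.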

The painful truth – the truthful pain

Please forgive me if I behave or speak as though today were Tuesday.  I know that it is in fact Wednesday as I write this‒it’s anyone’s guess what day of the week you might be reading it (though I suspect that, for the most part, if one doesn’t read my blog on the day it’s posted, one is unlikely ever to read it)‒but I didn’t write a post yesterday (Tuesday) so I may be a bit thrown off.

I didn’t write a post yesterday because I didn’t go to work yesterday.  And I didn’t go to work yesterday because of pain.  I had already been having a bad pain day on Monday, one in a long string of worse-than-average pain days.  Then, in the evening on Monday, while trying to reach for something in my room, I took a bad step on the tile floor and slipped and nearly fell.

I caught myself, as is implied by the “nearly” in that last sentence, but I wrenched my back significantly, and the night and morning and so on were particularly bad, and I hardly slept and I did not have the energy to go to work, or at least to do so and not spend all my time writhing and snapping at people.  So I stayed at the house.

Regarding chronic pain, I’m fond of quoting Ulrich’s description of Vermithrax from Dragonslayer:  “When a dragon gets this old it knows nothing but pain, constant pain.  It grows decrepit.   Crippled.  Pitiful.  Spiteful.”  I had to double-check and fix a few words to get the quote exactly correct, but the most important parts are always remembered correctly.  And the whole thing feels like it describes me pretty well.

I used to be much more pleasant and amiable than I have become since my chronic pain began.  Though I’ve had problems with depression since my teens and anxiety before that and ASD since I was born (in two different senses), I always tried to be polite and amiable and kind as much as I could.  I always figured that was the real position of strength:  not being in competition with other people but just trying to do your best while others do similarly.

But when one is in chronic pain, it is hard not to be grumpy (presumably even if one hasn’t lost almost everything one had worked to achieve through the first thirty plus years of one’s life, though I cannot know for sure).  I think there are people who have only known me since the time of the beginning of my back problem who would be surprised by how pleasant I was back in the day.

Though, there are those who read this blog who did know me in the past, before the aforementioned time, and maybe they would give a different report.  I can only share my own perceptions and perspective, and I could to a certain degree be mistaken about how I came across to other people.

I’ve never been all that good at knowing what other people think of me.  Because of that, I generally try just to take people at their word, and take those words to have their most straightforward meaning.  If someone hopes to hint at something and I don’t get it, that’s on them.  Hints are overrated even when given and received by people who embrace the practice and consider themselves good at it.  There are too many possible variations and points of incomplete information.

Anyone who has saved and transferred a video file and has also saved and transferred word processor files should grasp the difference, at least if they have been paying any attention.  A video only a few moments long can, despite the latest compression algorithms, have a storage size that dwarfs that of even, for instance, the word-processor file for the unsplit book Unanimity.

Now, Unanimity is about half a million words long.  It’s certainly the longest thing that I have written.  But a video I did on my phone last week for minor fun, which was maybe 20 seconds long, takes up more than 16 meg, while the combined file size for the Kindle versions of both Unanimity: Book 1 and Unanimity: Book 2 is about 3.5 meg*.

That’s twenty-odd seconds of stupid and pointless video which will never be shared anywhere versus a work that took me more than a year to complete, edit, and publish.
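As a back-of-the-envelope sketch of that disparity‒using only the approximate figures quoted above (about 16 MB for roughly 20 seconds of video; about 3.5 MB and half a million words for the novel):

```python
# Rough information-density comparison; all figures are the approximate
# numbers quoted in the post, not exact measurements.
MB = 1024 ** 2
video_bytes = 16 * MB        # ~20-second phone video
book_bytes = 3.5 * MB        # both Kindle volumes of Unanimity combined
words = 500_000              # approximate length of the novel

bytes_per_word = book_bytes / words        # roughly 7 bytes per word
video_bytes_per_second = video_bytes / 20  # roughly 0.8 MB per second
novel_in_video_seconds = book_bytes / video_bytes_per_second

print(round(bytes_per_word, 1))          # → 7.3
print(round(novel_in_video_seconds, 1))  # → 4.4
```

In other words, by these rough numbers, the entire half-million-word novel occupies about as many bytes as four and a half seconds of the phone video.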

At least it’s fair to say that, from a useful information point of view, my book was and is much more efficient.  It does require enough shared experience for readers to fill in the meanings and images of the things described, but that requirement is met by nearly every human on the planet.  Perhaps videos would be better for a truly alien species that was hitherto unfamiliar with human civilization.

Okay, well, that was a weird post, I guess.  I mean that in absolute terms, mainly; I don’t know if this post is much weirder or much less weird than my usual posts.  Possibly every potential interlocutor would have different things to say about that.

I guess that’s okay.  It had better be okay, if it’s true, because if it’s true, there’s nothing anyone can do about that, and they’re already living with it.  This is the Litany of Gendlin, as quoted by Eliezer Yudkowsky, of which I have a screenshot from his book Rationality:  From AI to Zombies, below.

Well, I hope you have a good day, whatever the truth is that you and all the rest of us are living.


*This is according to the AI summary of Google’s search for “Robert Elessar Unanimity file size”.  It’s almost certainly correct, because the info is part of the Amazon description of the book.  But it’s humorous to me that it’s easier to do an AI-based web search to find the file size of my own novel than it is to look up the file, since I’m using my phone and don’t have direct access to the original at the moment.

And his brain ate into the worms…

Ugh.  Didn’t we just leave this party?  Evidently, we did not leave it precipitously enough, because here we are‒or at least, here I am‒rejoining it in the morning.

It seems like an ill-advised notion, but then again, I’m not sure who specifically advised me, or any of you, to do it.  There probably were a few literal, formal pieces of advice that we all or each received throughout our lives‒advice about getting up early and going to work and striving to fulfill our potential, and how if we didn’t we were somehow letting ourselves and (more importantly) letting everyone else down.

“The early bird gets the worm” is a typical phrase about such ambition and dedication and hard work.  But like many of us, I’ve often thought that worms are overrated.  They’re not rated highly at all, I’ll admit, but nevertheless, I think they are rated too highly.  Evidently‒according to what I have read‒all earthworms in at least the northern part of North America were killed off in the last ice age.  Nevertheless, plants grew and flourished without verminous help in the soil before Europeans accidentally brought their own earthworms here.

Of course, the saying is metaphorical, I know that.  We’re not really advised to seek earthworms early in the day, though perhaps liver flukes and flatworms and tapeworms and roundworms are also considered as among the worms that might be caught.

No, probably not.

But anyway, even though metaphorical, that saying raises higher level questions, such as, “Is the life of a metaphorical early bird worth having?”

Consider what that life entails:  getting up (early); pecking around on the ground for worms, and probably also for various other insects and their larvae and a few arachnids as well*; trying to avoid, in that process, being caught by some predator (such as a house cat); trying to find and attract a mate when the season is right; helping build a nest, if you’re that kind of bird; guarding the eggs and maybe sitting on them yourself, until they hatch; then feeding and protecting the chicks until they can fly on their own; then repeating these steps until disease or starvation or one of those house cats gets you.

That’s it.  And while there are many embellishments and flourishes and complications in the typical human life cycle, overall it is much the same as that of the bird.  Why would we expect it to be otherwise?

Admittedly, humans (and humanoids) can dream up other things to do, and some of them are more interesting and fulfilling, from their own points of view at least, than the ordinary early bird pattern.  But though, in the long run, humans as a whole may become significant enough to do something truly meaningful on a cosmic scale, almost all of them have no deeper lives than those lived by the early birds.

That’s not necessarily a bad thing, of course.  Taken with the pertinent attitude, such a life can be well lived and fulfilling.  It probably won’t end happily, because it’s not in the nature of life to be happy when ending; there’s just no real evolutionary benefit to having such a tendency.

Still, before imbibing the so-called Kool-Aid™ of the motivational life-messages‒those social moralities that keep us getting up and joining the rat race (to shoehorn in another animal-related metaphor)‒it would probably behoove us to consider whether that is the life we think we want, to ponder if that overall shape and experience are okay with us as the outline of our lives.

If so, there’s nothing wrong with that.  As long as you’re not interfering with other people’s ability to try to live their lives as they try to see fit**, then do what seems best to you.

But it’s useful to think about what might be the overall shape of your life if you continue as you currently are and if that shape will be aesthetically (or otherwise) pleasing to you.  If not, what change might improve that overall shape, trying to take all reasonably plausible inputs and outputs into consideration?

I won’t say that the unexamined life is not worth living, because, if it’s unexamined, how do you know that it’s not worth living?  Huh?  Huh?  Nevertheless, I will say that the unexamined, unconsidered life could be fulfilling only by accident, whereas it may be possible, with deliberation, to steer toward a better one.

Not that I’m a good piece of evidence in favor of this.  I think and overthink to the point that I hate the noise of my own mind, but I haven’t been able to steer myself into an optimal shape***.  But at least I make a lot of “noise” about such things.  That might be worth something.

Anyway, have a good day.  Enjoy your worms or salads or whatever other life forms you kill and consume to remain alive today (I’m assuming you are not a green plant).  Watch out for the Kool-Aid™ and even more so for the cats.


*I am quite sure that, to such a bird, these things taste delicious, so I don’t mean to disparage their diet as unpalatable.  Appetites of various kinds are species specific; what’s appetizing or sexually attractive to, say, a housefly is unlikely to appeal to any psychologically healthy human.  Likewise, the most beautiful human woman ever is not going to do anything for a male tarantula.  He also probably would have no interest in having a bite of her salad.

**This is more difficult to navigate than it may seem at first, because even when one is acting on one’s own, there are always effects at some level, there are always “externalities”, and occasionally these will have an impact on other people‒a foreseeable but perhaps unforeseen impact.  And vice versa.

***Should there be a “yet” at the end of that sentence?  I don’t know; we’ll have to see what happens to me in the future.  We can be reasonably sure, though, that there shouldn’t be a yeti at the end of that sentence, or of any sentence except one that mentions such creatures.

Is it mean not to know if one’s writing is above average?

It’s Friday again, but that’s not much consolation, since the office is open tomorrow and I will be working, unless I am lucky enough to get very sick or very injured or to die or something.

As usual, I have no idea what would be good to write today.  Actually, goodness—certainly in the moral sense, but possibly also in the sense of quality—probably doesn’t have much to do with my blog.  Perhaps weirdness would be a better adjective/measure to relate to my writing.

I’m probably not an objective judge of such things.  Then again, I don’t know of any fully objective judges.  Still, there is some degree of variability involved in such judgments, as in nearly everything made up of smaller, more fundamental parts interacting in complicated ways to produce so-called emergent behavior.  Nevertheless, cognitive biases are reasonably well studied, as are many emotional blind spots and the like.  And it’s certainly true that I have a difficult time being objective about myself and about my work.

Oddly enough for such a self-despising person, I actually like my own writing, especially my fiction.  When I reread my stories I don’t tend to see them as horrible or wretched or whatever traditionally happens with “artists” who look at their own work.  I think at least some of that sort of thing is probably affected, since our society (perhaps semi-deliberately) looks down upon artists who think highly of their own art.

If only we did that with politicians; there’s an area where humility would be welcome and beneficial, I think.

Anyway, I tend to like my stories—I wrote them, after all, because I wanted to tell and thereby hear those tales—but I don’t necessarily think they’re great or good or decent from anyone else’s point of view.  I honestly don’t know how good or how bad they might be from nearly any others’ points of view, except my sister’s, and she’s probably almost as prone to be biased about my work as I am.

Though, again, my attitude toward my writing is not akin to that oft-noted personal bias that leads more than 90% of drivers to think that they are in the upper 50% in ability (i.e., above the median), which it is mathematically impossible* for that many of them to be.  I don’t think of my writing as better (or worse) than average or the median.

I don’t really compare my writing to anyone else’s.  I just tend to like it.  That’s probably a very good thing, because I have to edit it myself.  Even these daily blog posts are run through three more times after my first draft.  My fiction I tend to reread and edit seven times (that took a very long time with Unanimity).  Why seven?  Well, I had to pick a number, and once or twice is clearly too few, and thirteen would just be unworkable.

Also, with my fiction, I tend to follow advice Stephen King repeated in his book On Writing by working to reduce my final word count by at least ten percent by the time I’m done editing it.  I used to try to do that here, but I sometimes add a bit during editing, so that becomes quite difficult and hardly worth the effort.

All that being said, it would really be nice to get some feedback on my writing, especially on my stories (from people who have read them).  Of course, I would love it if someone loved my stories and told me so and told me why.  The closest I think I’ve come is a review on Amazon of Welcome to Paradox City that was written by a former high school friend (he has since died of cancer, sadly) who had actually honestly bought the book for himself when it came out.  He wrote that the three stories in that collection each made him wish they were the beginning of a whole book, basically implying that he wanted to know what happens next.

That’s a good thing about short stories—you can leave people hanging and that’s just “too bad” for them (though it can be enjoyable).  Short stories also don’t have to have “happy” endings, which is good for me, since only one of the three in the above collection ends happily in any reasonable sense.

Of course, as I’ve noted before, my short stories are rarely short enough ever to have been, for instance, published in a magazine in the old days.  The only real exception to this is Solitaire, which I don’t think any magazine would have published, because it is very, very dark indeed.

Okay, well, I guess I ended up writing something today, even if it was all just figurative omphaloskepsis.  I don’t know whether you readers consider this good or bad or ugly, or how it compares in your estimation to the posts from yesterday and/or the day before.  If you’re so inclined, please let me know.

And if you have actually bought and read any of my books, I do beseech you to leave me a review on Amazon (or wherever) if you get the chance.  Thanks.

I’ll write at you tomorrow, barring—as always—the unforeseen.


*It is not, on the other hand, impossible for 90% of people to be above average (i.e., above the mean).  I’m sure I’ve addressed this before, but imagine one had administered a test, and 100 people took it.  Imagine that 90 of those people got 51/100 on the test, whereas the remaining 10 people scored zero.  Then, the arithmetic mean (what people usually mean by “average”) would be (90 x 51)/100**, since the ten zeroes add nothing to the numerator.  That works out to 4590/100, or a mean score of 45.9.  So, 90% of those people scored above average.  That’s not saying much, but it’s true.

** Yes I know I don’t really need the parentheses there, but I’m leaving them in for clarity.
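The footnote’s scenario is easy to verify with Python’s statistics module; a quick sketch also shows how differently the median behaves, which is why the driver statistic earlier in the post is about the median rather than the mean:

```python
from statistics import mean, median

# 90 people score 51/100; the remaining 10 score 0
scores = [51] * 90 + [0] * 10

avg = mean(scores)    # 45.9, as in the footnote
mid = median(scores)  # 51.0

above_mean = sum(s > avg for s in scores)    # 90 people beat the mean...
above_median = sum(s > mid for s in scores)  # ...but nobody is strictly above the median
print(above_mean, above_median)  # → 90 0
```

So a lopsided majority can indeed sit above the mean, but can never sit strictly above the median.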

Our wills and fates do so contrary run, that our devices still are overthrown; Our blogs are ours, their ends none of our own.

Hello and good morning.  It’s Thursday, the 26th of February in 2026, a date that’s only very slightly interesting whether you write it as 2-26-2026 or 26-2-2026.  The fact that you have repeated 2s and repeated 26s is somewhat entertaining, but the zero throws potential symmetries off, making it not nearly as much fun as it could conceivably be.  It’s a shame, really.  I suppose you could write it as 26-02-2026 and rescue a bit of symmetry, but that feels like reaching.  It’s not quite symmetrical anyway, unless one is writing in base-26 or higher.  No, wait, even that wouldn’t work.
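For what it’s worth, a tiny sketch confirms that none of the usual ways of writing the date is an actual digit palindrome, which is all the frustrated symmetry above amounts to:

```python
def digits_palindrome(date_string: str) -> bool:
    """True if the date's digits, ignoring separators, read the same reversed."""
    digits = "".join(ch for ch in date_string if ch.isdigit())
    return digits == digits[::-1]

for form in ("2-26-2026", "26-2-2026", "26-02-2026"):
    print(form, digits_palindrome(form))  # all three are False
```

(By contrast, a date like 6-10-2016 does pass, since its digits read 6102016 in either direction.)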

I don’t know what I’m going to write about this morning.  That in itself, of course, is nothing unusual.  But I don’t feel that I have much to say about anything at the moment.  I don’t want to get into my depression and ASD and anxiety and chronic pain and insomnia and just general moribund state, because I’m sure no one wants to hear about it anymore, and in any case, there seems to be no way anyone can do anything about it that’s useful, which makes it all the more frustrating.  Writing about it certainly hasn’t cured or even improved my state much, if at all.

Anyway, as I said the other day, you have been put on notice.  Unless you just started reading my blog for the first time yesterday, you’ve no right to act fucking surprised no matter what happens.

Okay, that’s that out of the way.

Now, let’s see, what should I write today?  I could discuss some topics in science, especially physics, though I also have literal, legally recognized expertise in biology, and I know a lot about quite a few other branches of science as well.  This is because I have always been curious about how the world, the universe, actually and literally works on the largest and on the most fundamental scales.

I mean, yes, humans also have their rules and laws and social mores and antisocial morays and all that nonsense, but if you step back even a bit, you can see nearly all human behavior encapsulated by basic primatology.  If you know how the various monkeys and gibbons and gorillas and chimpanzees behave‒especially their commonalities‒human behavior almost always fits right in.  It’s usually not even very atypical.

That doesn’t make the specifics of behavior very easily predictable in any given case, necessarily; then again, we understand an awful lot about the weather and the climate, but the specifics of tomorrow’s weather are tough to predict precisely and accurately, let alone next week’s.  Nevertheless, the physics of the longer-term climate effects of certain kinds of atmospheric gases is almost trivial.

Anyway, humans are too annoying to be very interesting, except in special circumstances.  In this, they are perhaps a bit like cockroaches.  From the point of view of a scientist who studies them, they can be interesting, and from just the right angle and with the right detachment, they can even be beautiful (or some of them can).  But overall, they are merely large masses of highly redundant little skitterers, just doing their shit-eating and reproducing and infesting almost every possible location.

Which type of creature did I mean to describe just now?  See if you can figure it out.

Of course, on closer scales, cognitive neuroscience and neurodevelopment and related stuff, such as “neural” networks, “deep” learning, and other such areas are fascinating.  One thing interesting about them is how all the things that brains and computers and so on are and do are implicit in the laws of physics‒clearly they are some of the things that stuff in the universe can do‒and yet, for all we know, they have only ever happened here, just this once in all the vast and possibly infinite cosmos*.

And for all we can tell, given the human proclivity to plan about 20 Planck units ahead and then after that trust to luck, this could be the only place they occur, and their time will not continue much longer, certainly not on a cosmic scale.

I could be wrong about that…except in the sense that, since I am stating it merely as one of the possibilities, I am not actually wrong at all.  Even if humans do survive into cosmic time scales and become cosmically significant, it will still be hard to deny that things could have gone otherwise: that humans could have gone extinct and failed to go anywhere but Earth.

Of course, depending on the question of determinism, I suppose one could say that if humans (or their descendants) become cosmically significant then there literally was nothing else that could have happened, at least as seen from outside, at the “end”.

On the other hand, if Everettian quantum mechanics is the best description of the fundamental nature of reality, then in some sense, every quantum possibility actually happens “somewhere” in the universal quantum wave function, though those variations may not include all conceivably possible human outcomes.

Some things that seem as though they should be possible may simply never happen to occur (or occur to happen?) anywhere in the possible states of the universe.  That feels as though it should be unlikely, given how many possible states can be locally evolved in the quantum wave function, but I don’t think we know enough to be sure.

Okay, well, I vaguely hope that this has been mildly interesting and perhaps thought provoking.  It would be enjoyable to get more feedback and thoughts, but I don’t have a very large readership, and only a certain small percentage of people ever seem to interact with written material in any case, so I’m probably lucky to get the feedback that I get.

TTFN


*With the inescapable caveat that, if the universe is spatially and/or temporally infinite, and if as it seems there are only a finite number of differentiable quantum states in any given region of spacetime (the upper limit of which is defined by the surface area of an event horizon the size of the given region) then every local thing that happens, and all possible variations thereof, “happen” an infinite number of times.  But given that all these regions are more or less absolutely physically distinct and incapable of “communicating” one with another, they can be considered isolated instances in a “multiverse” rather than parts of the same “local universe”.

A notification of whatever

I expect this post to be brief today, though I’ve been known to be wrong about that sort of thing.  I had sort of “intended” to make my headline “Oh, well, whatever…” and then make the entire body of the post “…never mind.”  Thus I would be quoting the last verse-line of Smells Like Teen Spirit by Nirvana.  The subsequent words in the song are just the chorus and then a refrain of “A denial” repeated nine times (if memory serves).

I wasn’t sure I hadn’t already done this before, though.  I could have checked, but I didn’t have the mental energy.

Still, using that last line from a Kurt Cobain song carries a certain subtext which would have served my purposes well.

Or, well, actually, given past history, it probably wouldn’t have served my purposes at all.  None of this sort of thing seems to serve my purpose, no matter what I do.  As far as I can tell, only one person actually read my (admittedly somewhat long) post yesterday, but though I was borderline explicit about my meaning, I don’t think it did any good whatsoever.  That’s not unusual, of course; much if not all that I do never ends up doing me much good.

Sometimes I have to be subtle because I cannot force myself to be open about my internal states after a lifetime of fighting to appear “normal”, to the degree I can achieve that, and to avoid being too much trouble for other people, since I don’t think I have the right to trouble them, and in fact I think (or feel) that I’m fundamentally reprehensible.

I shouldn’t worry, though.  The times I am more open and obvious‒even when I am borderline explicit‒don’t appear to be any more successful than when I am at my most cryptic.  Possibly, I am just not able to communicate my feelings effectively with humans.

At the very least, my success rate must be below one percent.  It’s not quite as bad as playing the lottery, but it’s pretty pathetic.  Then again, so am I.

Whatever.  Never mind.  Ha ha.

But really, though, I don’t have much to say.  Quoting iconic songs may be the extent of my capacity to convey myself.

Ironically, I don’t feel the urge to share quotes from my own songs (or my fiction).  You would think they would be the best choice for conveying my inner thoughts.  That’s not always the case, though.

In fact, though I like my songs well enough, and Breaking Me Down is meant to be fairly explicitly about depression (at least my species thereof), none of them have enough oomph, as it were.  Or maybe it’s just that they are not well known*, so no one recognizes and identifies with the words.

I think I have some pretty good lines in Come Back Again, including what’s probably my favorite:

“Only meeting strangers

always losing friends.

Every new beginning

always ends.”

It may seem a bit bleak, but it’s also true more or less by definition.  If you’re meeting someone for the first time, they had been a stranger until that point.  And friends do become “lost”.  And the next two lines are rather obviously true.

Of course, a very good signing (singing?) off quote would be from Pink Floyd’s Time:  “The time is gone, the song is over, thought I’d something more to say.”

I’ve always been annoyed that they added the little reprise of Breathe after that and made it officially part of the song, because those other two lines constitute a perfect song ending.  I always figured they didn’t want to make the song end on too much of a downer, so they threw in the reprise as part of that song instead of as a separate one.  Maybe they were unwittingly invoking a version of the peak-end rule I mentioned the other day.

Anyway, I have a locked and loaded draft of a blog post that already applies that couplet from Time, with the headline being the first half, continuing into the post which consists only of the second half of that quote, followed by the embedded “video” of the final song on the first album of The Wall.

That, of course, is still a draft, and has been waiting there for a while, because if I use it, it’s meant to be my final blog post, and practically my final anything.  So I wasn’t going to use it today.  Not quite.  But I’m close.  The Nirvana quote isn’t quite as final, but it is a warning, especially given the fate of the guy who wrote it.

Anyway, consider yourselves on notice.  On notice of what?

Figure it out.


*That’s an understatement, eh?

Man overboard

As the real weekends go, it was better than most, to paraphrase The Wreck of the Edmund Fitzgerald.  By this, I’m referring to this last weekend, the two days before this day, of course.

I did not work on Saturday, which is good, because that would have been the third time in a row.  I also got to hang out with my youngest on Saturday, and we watched about four episodes of Doctor Who together, which was good, good fun.  I cannot complain about that in any way.

I have, though, a weird, disquieting, sinking sort of feeling that it may have been the last time I will see my youngest, or maybe anyone else that I love.  It’s not one of those reliable sorts of feelings, like those that lead one to new insights in science or mathematics or what have you.  It’s probably more a product of depression and anxiety, the feeling that anything good in my life is sure not to last, if it happens at all, because I do not and cannot possibly be worthy of anything good happening to me.

Is that irrational?  Of course it is irrational.  It cannot be expressed in any sense as the ratio of two whole numbers, no matter how many digits they may have.

Wait, wait, let me think about that.  My thought, my feeling, was expressed above finitely.  That is, of course, a shorthand for what is really happening, but even if one were to codify those processes down to the level of each molecular interaction that affects any neural/hormonal process that contributes to my feeling, we know that must be a finite description (though it could, in principle, be quite large).

Even if we’re taking the full spectrum of quantum mechanics into account when describing my mental state, we know that quantum mechanics demands a minimum resolvable distance and time (the Planck length and the Planck time) below which any differentiation is physically meaningless.

A finite amount of information can describe the events and structures and processes in any given finite region of spacetime.  In fact, the maximum amount of information in any given region of spacetime is measured by the surface area (in square Planck lengths) of an event horizon that would span exactly that region, as seen from the outside*.
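For anyone curious, the bound I’m gesturing at here is (as I understand it) the Bekenstein–Hawking entropy, in which the maximum entropy of a region is set by the area of the horizon that would bound it, counted in square Planck lengths (divide by ln 2 to convert from nats to bits):

```latex
S_{\max} \;=\; \frac{k_B\, A}{4\,\ell_P^{2}},
\qquad
\ell_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}}
```

That is a famously, absurdly large bound for any everyday-sized region, but it is, crucially, finite.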

Any finite amount of information can be encoded as a finite number of bits, which can of course be “translated” to any other equivalent code or number system.  So, really, though the contents of my mind are, in principle, from a certain point of view, unlimited, they are finite in their actual, instantiated content, and can therefore certainly be expressed as an integer, and thus also as a ratio (since any integer could be considered a ratio of itself over one, or twice itself over two, etc.).
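Just to make the joke concrete, here’s a toy sketch (in Python, purely illustrative; the function names are mine, not anything official) of that chain of reasoning: any finite description can be read as one big integer, and any integer is trivially a ratio.

```python
# Toy illustration: a finite "thought", encoded as bytes, read as an
# integer, and then expressed as a ratio of two integers.
from fractions import Fraction

def thought_to_integer(text: str) -> int:
    # Interpret the UTF-8 bytes of the text as a single big integer.
    return int.from_bytes(text.encode("utf-8"), byteorder="big")

def integer_to_thought(n: int) -> str:
    # Reverse the encoding: integer back to bytes, bytes back to text.
    length = (n.bit_length() + 7) // 8
    return n.to_bytes(length, byteorder="big").decode("utf-8")

thought = "anything good in my life is sure not to last"
n = thought_to_integer(thought)
ratio = Fraction(n, 1)  # a perfectly rational number

assert integer_to_thought(n) == thought  # the encoding is lossless
assert ratio == n                        # the thought, as a ratio
```

So any finitely describable mental state really is, in this pedantic sense, rational.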

So, in that sense, my thoughts are not irrational.  Neener, neener, neener.

In many other senses—maybe not the literal, original sense, but in the horrified, cannot accept that not all numbers can be expressed as ratios of integers because that makes the universe too inconceivable, sense, among others—I can be quite irrational.

It’s very difficult to fight one’s irrationality from the inside, alone.  Even John Nash didn’t really beat his schizophrenia from within as shown in the movie version of A Beautiful Mind.  Also, his delusions in real life were far more extravagant and bizarre than those which appear in the sanitized version that made a good Hollywood story.

If one escapes from mental illness from within, one has to consider it largely a matter of luck, like a young child who doesn’t know anything about math getting a right answer on a graduate-level, higher-order differential equation problem.  It’s physically possible; heck, if it were a multiple choice question, it might even be relatively common***.  But it’s not a matter of being able to choose to do it right and to know how it was done.

People with severe mental health issues are almost always going to need assistance from outside.  This is not an indictment of them or of their need for help.

Surely, someone who has been swept off the deck of a ship by a rogue wave cannot be faulted for needing help from those still on the ship if they are to survive.  It would certainly seem foolish and almost inevitably fruitless if such a person tried to claw his way up the side of the ship to get back on board when there is no ladder and no handholds.  He should certainly not be ashamed that he cannot swim hard enough to launch himself bodily from the water and back onto the surface of the vessel.

One cannot reasonably fault such a person for trying to do the superhuman.  A person might try to do practically anything rather than drown or be eaten alive by some marine predator.  But, of course, barring an astonishing concatenation of events such as the time-reverse of the splashing entry into the ocean happening and sending the person out of the sea just as it was entered, such efforts will not succeed.

And though it might be heartening or at least positive for one to receive encouragement from those still on the deck—don’t drown, keep treading water, you can do it, you’ll make people sad if you drown, you deserve to stay afloat, I’m proud of you for treading water yet another day, it’ll get better, this won’t last forever, you’ve made it this far so you know you can keep going, you don’t want the people who know you to feel sad because you drowned, etc.—in the end it might as well come from the seagulls waiting to pick at one’s floating corpse.

Mind you, certain kinds of words can be more useful than others.  Words like, “Hey, around the other side of the ship there’s a built-in ladder; if you can get over there and time things right, you might be able to grab the lowest rung when the waves lift you, and then climb up,” might be useful because they are directions for using real, tangible resources that we know can make a difference.  Also, words like, “Hang on just a bit longer, we’re throwing down a life preserver on a rope so we can haul you up” would be useful, obviously, unless they were mere “comforting” lies.

Alas, though one could reasonably expect such literal assistance if one were washed overboard—the “laws” of the sea are deeply rooted in the hearts of those who work there, and they include a general tendency to help anyone adrift to the best of one’s abilities—when it comes to mental illness, the distress and the problems are difficult for others to discern and easy to ignore.  Calls of distress are often experienced as annoyances, and even treated with contempt, since those hearing them cannot readily perceive that they themselves might be similarly washed overboard at any time.

But, of course, they might be.

I don’t know how I got on this tangent, but I guess I never really do.  I just go where my mind takes me, and my mind is not a reliable driver.  It is, though, a reliable narrator.  It doesn’t matter, anyway.  Nothing does.

Anyway, here we go again into another work week, because that was what we did last week.  I wish I could offer you better reasons, but I’m really only good at breaking things down, destroying things, not at lifting anyone or anything up.  That comes from other regions and is conveyed by other ministers.


*From within an event horizon, the volume could be much larger than the spacetime that seems to be enclosed from the outside, because spacetime inside the horizon is massively curved and stretched.  It’s conceivable (at least to me) that there could be infinite space** within, at least along the dimension(s) of maximum stretch, just as there is infinite surface area to a Gabriel’s Horn, but only finite volume.

**See, mathematically, one can stuff infinite space inside a nutshell.  Hamlet was right.  He often was.

***Perhaps this explains why certain types of mental health problems can respond well to relatively straightforward interventions, and even to more than one kind of intervention with roughly comparable success, e.g., CBT and/or basic antidepressants and such.  These relatively tractable forms of depression are the “multiple choice problem” versions of mental illness.  This does not make them any less important.

This is not an attention-grabbing headline

I’m writing this post on my smartphone, even though I brought my lapcom with me yesterday evening.  I did not use my lapcom for yesterday’s post, such as it was.  I didn’t even write that post in the morning yesterday, or at least, I didn’t write the “first draft” of it then.

By the end of the workday on Wednesday, I didn’t feel like I was going to want to write a blog post on Thursday.  So I went to the site directly and just wrote the “Hello and good morning,” and the “TTFN” and set it to publish later.

I already knew what title I was going to want to use for it.  I wanted to use Polonius’s dithering, meandering jabber about brevity being the soul of wit, as a sort of left-handed self compliment about my own brevity in that post, and because, in the original form, it would have made the headline longer than the post, which would be ironically funny, in principle.

Then, yesterday morning, I got the urge to put my little “insert here” bracketed bit in the post, the better to convey how disgruntled and disaffected and self-disgusted I (still) felt, as well as how tired.  It did sort of spoil the joke about the headline being longer than the post, of course.  At least the older joke about Polonius still holds water.  Then again, that joke was made by Shakespeare, so we shouldn’t be too surprised if it has serious legs (though this raises the question of how serious legs could possibly hold water).

One thing worth at least assessing this week might be whether there is an aesthetic difference between this post (for instance) and the posts I wrote earlier this week, on the lapcom.  Writing on the lapcom is quite different for me in many ways.

On the lapcom, I generally have to work to stop myself before a post, or whatever, gets too long.  Whereas on the smartphone, that isn’t as frequent a problem.  Not that I can’t yammer on and on even with the smartphone, of course.  Some might say all I ever do is yammer on and on.  But anyway, I can’t write as “effortlessly” on the smartphone as I can on a regular keyboard*.

Sorry, I’m retreading a lot of old ground here, which I guess is better than retreading a lot of old tires. I know how to tread on the ground; indeed, I cannot recall a time when I didn’t know how to do that kind of treading.  Whereas retreading a tire sounds like something that requires special skills and equipment, both of which I lack.

I don’t know, I’ve heard of “retread” tires, but I don’t know if such things still abound, or if they ever did.  It sounds vaguely like a bad idea, like such tires might be more prone to blowouts.  But latex is a finite resource, and there aren’t very good synthetic alternatives, so maybe there’s at least some cost/benefit tradeoff (or treadoff?) there.

Ugh.  With that last joke, I probably convinced at least some of my readers that, yes, the world would be better off if I were dead.  Actually, I say that as if it were conditional, but it’s not.  It would be more in line with reality to say “the world will be better off when I am dead”.

There’s a quote by which to be remembered, eh?

I cannot say whether I will be better off when dead.  It’s probably a nonsensical question.  When I am dead, I will not be anything at all, not better, not worse, not uglier.  What happens to virtual particles after they have annihilated?  Nothing, and less than nothing, for they truly no longer exist, and in some senses they never existed.  Indeed, as physics goes, they probably never do exist; they are a shorthand description of what happens in quantum fields when perturbations in the fields have effects that do not rise to the level of actual, true particle production.

Or so I am led to understand.

From another point of view, it is possible for something to improve, at least in a sense, by ending.  I’ve mentioned this before, but if the curve of a function‒perhaps a graph of the “quality of life” or one’s “wellbeing”, to say nothing of happiness‒is persistently negative, then returning to zero is a net gain.  It can be a huge net gain, in fact.  This is related to the origin of my own version of an old saying, which I use with tongue definitively in cheek:  The one who dies with the most debt wins.

Now, of course, the integral, the area “under” that wellbeing curve would not be improved by the curve reverting to zero and stopping.  But at least that integral would not keep getting more and more negative over time.

Some might say, “well, the integral can become less negative over time, and might even become positive”.  This is, in principle, true.  And when one is young enough, it’s relatively easier to tip the curve, and its integral, into positive territory.  But as the curve goes on, having been negative for a longer and longer time, it’s going to become ever harder to bring things to a net, overall positive integral, even if one could reliably make one’s curve positive (which one often simply cannot do).
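To put the toy model in toy-model terms (a sketch only, with made-up numbers, not a claim about any real life): treat wellbeing as a function sampled at discrete moments and accumulate its running integral.

```python
# Toy model: a persistently negative "wellbeing curve" and its running
# integral.  The integral only grows more negative over time; a curve
# that reverts to zero at least stops accumulating further deficit.
def running_integral(curve, dt=1.0):
    total, history = 0.0, []
    for value in curve:
        total += value * dt
        history.append(total)
    return history

# Ten time steps of steady negative wellbeing...
curve = [-2.0] * 10
assert running_integral(curve)[-1] == -20.0   # accumulated deficit

# ...versus the same curve reverting to zero after five steps.
truncated = [-2.0] * 5 + [0.0] * 5
assert running_integral(truncated)[-1] == -10.0  # deficit stops growing
```

The point being only what the paragraph above says: reversion to zero doesn’t improve the accumulated integral, but it does stop it from getting worse.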

Of course, the moment to moment experience (which is all the mind really gets) of an ascending curve could be pretty darn good, and might well be worth experiencing, even if it’s not enough to bring the integral into positive territory.  We are straying into the “peak-end” rule here, which was first characterized in studies of (among other things) colonoscopies but applies to much else in human experience.

Speaking of peak endings, I’ll mention in passing the curious fact that, no less than twice in the last week, the evening train service has been disrupted by someone either getting hit by or becoming ill next to the train.

Earlier this week, right by the station where I catch the train to go back to the house, there was a man who looked like he was probably homeless and had collapsed next to the train tracks not far from the station.  I saw him taken away, finally, on a stretcher.  He didn’t look physically injured‒certainly not in the ways I would expect someone who had actually been hit by a train to look‒but he did look cachectic, which is why I thought he might be homeless.

Then, last night’s commute was interrupted by what they call a “trespasser strike”, one that did not involve the train I rode but which always slows everything down.  I’m vaguely amused by the euphemism “trespasser strike”.  A “trespasser” here is a non-passenger who doesn’t work for the train company (or whatever) who is in the area adjacent to the tracks.  The “strike” part is probably self-explanatory.

I suppose it’s literally true, at least from a legal point of view, to call the person a trespasser.  But it’s amusing that the train people have to say something derogatory about a person hit by a train‒even if the person deliberately put themselves in harm’s way‒to sort of, I don’t know, assuage the company’s conscience.

But we are all trespassers, in at least some senses.  We are also, in other senses, all owners.  We are all innocent, and we are all, in some other senses, guilty.  “Every cop is a criminal and all the sinners saints.”  Above all, we are all very much just passing through, staying only a very short time.  We are all virtual particles.  Or you might say, we are all Iterations of Zero.

Have a good weekend.  I should not be writing a post tomorrow (in more than one sense).


*I wish I could honestly say that my use of a piano-style keyboard were as effortless, but I am terribly rusty with that, though I started learning it when I was 9, roughly 2 years before I got my first typewriter.

May the slope of your pain function always be negative

I’ve been thinking about something I wrote in my blog post yesterday.  I had thrown out the thought, in passing, about how it seemed as though all the things in my life that I still do are not things I necessarily do for joy or out of desire to achieve some goal, but rather they are things which are more painful not to do than to do, and so I do them.

There isn’t really a positive motivation—not the pursuit of happiness or improvement or fulfillment or enrichment.  It’s just that the feeling of stress and tension and anxiety (or whatever) regarding the prospect of, for instance, not going to work rapidly becomes worse than the equivalent feelings about going to work.

That’s not a great state of affairs.  Don’t get me wrong; it’s entirely natural.  I’ve written about this many times, this recognition of the fact that the negative experiences—fear, pain, revulsion, disgust, and so on—are the biologically most important ones.  Creatures that don’t run from danger, that don’t avoid injury, that don’t shy away from potential infection and poison, are far less likely to survive to reproduce than creatures that do those things.

We see clinical examples of people lacking some of these faculties—such as those with congenital insensitivity to pain—and while we might envy them a life without agony, it tends to be quite a short life.  Also, they tend to become immobile and deformed due to damage they do to their joints by not shifting position to improve blood flow.

In case you didn’t know, that’s one of the reasons you can’t stand completely still for very long; it’s not good for you.

But many of us, especially in the modern world, have some things that we do for positive experience.  Some of them are dubious, but food, sex, companionship/conversation, singing, dancing, all that stuff, are positive things.  Unfortunately, positive experience cannot be allowed—by biology—to last too long.

As Yuval Harari noted, a squirrel that got truly lasting satisfaction from eating a nut would be a squirrel that lived a very short—albeit fairly happy—life, and would be unlikely to leave too many offspring.

Maybe this is what happens to some drug addicts.  Maybe they really do get satisfaction or at least pleasure from drugs—and maybe that is what ends up destroying them.  At some level, that’s not truly in question, is it?  People who are addicted to drugs forego other pleasures and other positive things, but perhaps more importantly, they fail to avoid many sources of pain and fear and injury.

The reality is probably a bit of an amalgam, I suppose.  I would not say it’s a quantum superposition, though, except in the sense that everything is a quantum superposition (or, rather, a whole bunch of them).

This is one situation in which I think I’m right and Roger Penrose is wrong—a bold claim, but I think a fair one—in that I see no reason to suspect that the nature of consciousness either requires or even allows quantum processes, other than in the trivial sense that everything* involves quantum processes.  But there’s no reason seriously to think that (for instance) neurotubules can even sustain a quantum superposition internally, let alone that such a process can somehow affect the other processes of the neuron, many of which are well understood and show no sign of input from weird states of neurotubules, which act mainly structurally in neurons.

If deep learning systems—LLMs and the like—have demonstrated anything, it’s that intuitive thought** does not require anything magical, but rather can be a product of carefully curated, pruned, and adjusted networks of individual data processing units, feeding backward and forward and sideways in specific (but not necessarily preplanned or even well understood) ways.  No quantum magic or neurological voodoo need be involved.

I think too many people, even really smart people like Penrose, really want human intelligence to be something “special”, to be something that cannot be achieved except within human heads, and maybe in the heads of similar creatures.  Surely (they seem to believe) the human mind must have some pseudo-divine spark.  Otherwise, we oh-so-clever humans are just…just creatures in the world, evolved organisms, mortal and evanescent like everyone and everything else.

Which, of course, all the evidence and reasoning seems to suggest is the case.

Maybe, deep down, there isn’t much more to life than trying to choose the path from moment to moment that steers you toward the least “painful” thing you can find.

Please note, I’m not speaking here about some metaphorical continuum, some number line that points toward pleasure in one direction and pain in the other.  That’s at best a toy model.  In the actual body, in the actual nervous system, pain and fear and pleasure and motivation are literally separate systems, though clearly they interact.  Pleasure is not merely the absence of pain, nor is pain merely the absence of pleasure.  Even peripherally, the nerves that carry painful sensations (which include itching, as I noted yesterday!) use different paths and different neurotransmitters than the ones that deal in pleasure and positive sensation.

Within the brain, the amygdala and the nucleus accumbens (for instances) are separate structures—and more importantly, they perform different functions.  There’s nothing magical about their locations in the brain or the particular neurotransmitters they use.  Those things are accidents of evolutionary past.

There’s nothing inherently stimulating about epinephrine, and there’s nothing inherently soothing about endorphins or oxytocin, and there’s nothing inherently motivating or joyful about dopamine and serotonin.  They are all just molecular keys that have been forged to open specific “locks” or activate (or inactivate) specific processes in parts of other nerve cells (and some other types of cells).  It’s the process that does the work, Neo, not the neurotransmitter.

This brings up a slight pet peeve I have about people discussing “dopamine seeking” (often when talking about ADHD).  I know, the professionals probably use this as a mere shorthand, but that can be misleading to the relatively numerous nonprofessionals in the world.  The brain is not just a chemical vat.  Depression and the like are not just “chemical imbalances” in some ongoing multi-level redox reaction or something; they are malfunctions of complicated processes.  Improving them should be at least as involved as training an AI to recognize cat faces, wouldn’t you think?

But one can do the latter without really knowing the specifics of what is going on in the system.  It’s just sometimes difficult, and the things you think you need to train toward or with often end up giving you what you didn’t really want, or at least what you didn’t expect.

Maybe this is part of why mindfulness is useful (it’s not the only part).  With mindfulness, one actually engages in internal monitoring, not so much of the mechanical processes happening—no amount of mere meditation can reveal the structure of a neuron—but of the higher-scale, “emergent” processes happening, and one can learn from them and be better aware.  This can be an end in and of itself, of course.  But it can also at least sometimes help people decrease the amount of suffering they experience in their lives.

Speaking of that, I hope that reading this post has been at least slightly less painful for you than not reading it would have been.  Writing it has been less painful than I imagine not writing it would have been.  That doesn’t help my other chronic pain, of course, which continues to act up.


*With the possible exception of gravity.

**I.e., nonlinear processing and pattern recognition, the kind many people including Penrose think cannot be explained by ordinary computation, à la Gödel’s Incompleteness Theorem, etc.