Nihil vere refert. Quisque videre potest. Nihil vere refert. Nihil vere mihi refert.

Well, I did warn you yesterday that I would be writing a blog post today*.  Go ahead, take a look.

Yesterday’s post was another of my recent, deliberately benign blog posts, not dwelling on my mental health and chronic pain issues, because nobody gives a shit about those things, or at least they don’t want to have to hear about them, because they’re not going to (be able to) do anything about them, and that makes them feel guilty and uncomfortable, which is unpleasantly awkward.

So, anyway, it’s the last day of February in 2026.  We are, in a certain sense, one sixth of the way through the year.

I say “in a certain sense” because it’s not precisely true.  Today is the (31 + 28)th day of the year, so the 59th day of the year.  If that were literally a sixth of the way through the year, the year would only be 354 days long.

It’s somewhat interesting to note here that, because February is shorter than every other month, the first two months of the year are shorter than any subsequent, nonoverlapping** months of the year.  And, let’s see, the first three months of the year have 90 days exactly on non-leap years, whereas April through June have 91, July through September have 92, and October through December also have 92.  So, all the later groups of three months have more days than the first three‒except on leap years, when January through March is 91 days.
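Those three-month totals are easy to verify with a few lines of Python, using nothing beyond the standard `calendar` module:

```python
import calendar

def quarter_lengths(year):
    """Total days in each consecutive three-month block of the given year."""
    days = [calendar.monthrange(year, m)[1] for m in range(1, 13)]
    return [sum(days[i:i + 3]) for i in range(0, 12, 3)]

print(quarter_lengths(2025))  # a non-leap year: [90, 91, 92, 92]
print(quarter_lengths(2024))  # a leap year: [91, 91, 92, 92]
```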

Evidently, though, the latter six months of the year always have more days than the first six.  I wonder why they did it that way.  Was there an actual reason or did it just sort of happen?

Of course, I know they can’t be equal except on a leap year, since the number of days in a year is odd.  But why couldn’t they have come up with a way that made the years alternate, with one year‒the odd years perhaps‒having the surplus in the first 6 months and the other years having it in the last 6 months?  On leap years they could be equal.

How might that work?  We need 182 days divided by six months, which means we need four months which have just 30 days and two that have 31.  We could say January and February have 30, March has 31, and then repeat with April, May, and June and then July, August, and September***.  I was about to suggest that on odd years we make January have 31 days and on even years we make July have 31 days, but all leap years are even years, so if we add the leap-year day to the first half as we do now, the latter half would be short-changed: it would get to be the longer half in fewer years than the first half would.
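For concreteness, here is a toy Python model of that hypothetical calendar; the function name and the choice of February for the leap day are my own illustrative assumptions, not an established scheme:

```python
def proposed_month_lengths(year, leap=False):
    """Hypothetical scheme: each half-year runs 30, 30, 31, 30, 30, 31,
    with the surplus 365th day going to January in odd years and to July
    in even years; the leap day (when leap=True) still goes to February."""
    months = [30, 30, 31, 30, 30, 31] * 2
    if year % 2 == 1:
        months[0] += 1   # odd years: a 31-day January
    else:
        months[6] += 1   # even years: a 31-day July
    if leap:
        months[1] += 1   # leap day tacked onto February, as now
    return months

m = proposed_month_lengths(2025)             # odd year
print(sum(m[:6]), sum(m[6:]))                # 183 182
m = proposed_month_lengths(2024, leap=True)  # even (leap) year
print(sum(m[:6]), sum(m[6:]))                # 183 183
```

Note that on even non-leap years the split comes out 182/183, so the surplus really does alternate between the halves, equalizing only on leap years.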

On the other hand, we could put the leap day always in the second half of the year, perhaps in November, or even more sensibly in December:  we would thereby add our extra day to the very end of the year, rather than squeezing it into the earlier part of the year like someone cutting into a line.  That would make the second half two days longer than the first, though, which is unpleasantly asymmetrical in a year with an even number of days.

Of course, really, all days are fungible.  I remember seeing on QI once that apparently some sect maintained that they added an extra day not at the end of February but in the middle; I don’t recall precisely where they thought the day was being inserted, alas, but I can imagine some alternative, anatomical suggestions I’d like to make for them.

Days of a month are fungible (dammit!).  It makes no more sense to say that you added a day into the middle of February and pushed subsequent days later than it does to say that you deposited $100 into your bank account right after the 256th dollar that was already there, pushing what had been dollars 257 through 356 to become dollars 357 through 456.  Every dollar is just “a dollar”, every cent is just “a cent”.  It’s rather reminiscent of the way every electron is interchangeable with every other electron (likewise for all other elementary “particles”).

So, on leap years, the extra day of the year is and can only be (in our current system) the 29th of February, because that’s the day-label that isn’t there in other years.

You’re allowed to imagine if you like that you’re adding a day to the middle of the month and pushing the other days back and renaming them.  You’re also free to argue about how many angels can dance on the head of a pin, or to debate, without first agreeing on word usages****, whether unattended trees that fall in forests make noises.  That doesn’t mean you’re doing anything that has any bearing on the real world.

Okay, well, that’s been much ado about nothing, hasn’t it?  Or, multum strepitus de nihilo fuit, as is apparently the way to say it in Latin, which almost always sounds fancier, though it doesn’t always sound better aesthetically (consider the above headline’s Latin versus the original English).  English is‒or can be‒quite a beautiful language if you take a step back and see it as if from outside.  It can be hard to distinguish that beauty “from within”, though, because the meanings and usages of the words involved can distract from their inherent loveliness.

Tolkien, for instance, wrote that he thought the most beautiful-sounding phrase in English was “cellar door”.  I’m not sure I agree with him on this, but it’s a matter of taste, so there’s no slight, or “diss” or “shade”, involved in not both liking the same thing.

Enough nonsense for now, or at least enough nonsense here in this blog for now.  I’m sure that there is plenty of nonsense to be had elsewhere.  Do try to find some that’s enjoyable for you this weekend.


*That was unless I was lucky enough to get very sick or very injured or to die, which I have apparently not been lucky enough to do by this time.

**I say “nonoverlapping” because February and March combined contain the same number of days as January and February combined.

***I think in the final three months it should be October that always has 31 days, because Halloween really should fall on a day that’s a prime number, not a 30th or a 1st.

****Most such debates tend to devolve into discussions about the “definition” of the word “noise”, as if that were concrete and singular and fixed‒which it is not‒rather than the laws of physics and biology that constrain all the actual events of such an arboreal catastrophe.

Is it mean not to know if one’s writing is above average?

It’s Friday again, but that’s not much consolation, since the office is open tomorrow and I will be working, unless I am lucky enough to get very sick or very injured or to die or something.

As usual, I have no idea what would be good to write today.  Actually, goodness—certainly in the moral sense, but possibly also in the sense of quality—probably doesn’t have much to do with my blog.  Perhaps weirdness would be a better adjective/measure to relate to my writing.

I’m probably not an objective judge of such things.  Then again, I don’t know of any fully objective judges.  Still, there is some degree of variability involved in such things, as in nearly everything else made up of smaller, more fundamental parts that are interacting in complicated ways producing so-called emergent behavior.  Nevertheless, cognitive biases are reasonably well studied, as are many emotional blind spots and the like.  And it’s certainly true that I have a difficult time being objective about myself and about my work.

Oddly enough for such a self-despising person, I actually like my own writing, especially my fiction.  When I reread my stories I don’t tend to see them as horrible or wretched or whatever traditionally happens with “artists” who look at their own work.  I think at least some of that sort of thing is probably affected, since our society (perhaps semi-deliberately) looks down upon artists who think highly of their own art.

If only we did that with politicians; there’s an area where humility would be welcome and beneficial, I think.

Anyway, I tend to like my stories—I wrote them, after all, because I wanted to tell and thereby hear those tales—but I don’t necessarily think they’re great or good or decent from anyone else’s point of view.  I honestly don’t know how good or how bad they might be from nearly any others’ points of view, except my sister’s, and she’s probably almost as prone to be biased about my work as I am.

Though, again, my attitude toward my writing is not akin to that oft-noted personal bias that leads more than 90% of drivers to think that they are in the upper 50% in ability (i.e., above the median), which is a mathematical impossibility*.  I don’t think of my writing as better (or worse) than average or the median.

I don’t really compare my writing to anyone else’s.  I just tend to like it.  That’s probably a very good thing, because I have to edit it myself.  Even these daily blog posts are run through three more times after my first draft.  My fiction I tend to reread and edit seven times (that took a very long time with Unanimity).  Why seven?  Well, I had to pick a number, and once or twice is clearly too few, and thirteen would just be unworkable.

Also, with my fiction, I tend to follow advice Stephen King repeated in his book On Writing by working to reduce my final word count by at least ten percent by the time I’m done editing it.  I used to try to do that here, but I sometimes add a bit during editing, so that becomes quite difficult and hardly worth the effort.

All that being said, it would really be nice to get some feedback on my writing, especially on my stories (from people who have read them).  Of course, I would love it if someone loved my stories and told me so and told me why.  The closest I think I’ve come is a review on Amazon of Welcome to Paradox City that was written by a former high school friend (he has since died of cancer, sadly) who had actually honestly bought the book for himself when it came out.  He wrote that the three stories in that collection each made him wish they were the beginning of a whole book, basically implying that he wanted to know what happens next.

That’s a good thing about short stories—you can leave people hanging and that’s just “too bad” for them (though it can be enjoyable).  Short stories also don’t have to have “happy” endings, which is good for me, since only one of the three in the above collection ends happily in any reasonable sense.

Of course, as I’ve noted before, my short stories are rarely short enough ever to have been, for instance, published in a magazine in the old days.  The only real exception to this is Solitaire, which I don’t think any magazine would have published, because it is very, very dark indeed.

Okay, well, I guess I ended up writing something today, even if it was all just figurative omphaloskepsis.  I don’t know whether you readers consider this good or bad or ugly, or how it compares in your estimation to posts like I posted yesterday and/or the day before.  If you’re so inclined, please let me know.

And if you have actually bought and read any of my books, I do beseech you to leave me a review on Amazon (or wherever) if you get the chance.  Thanks.

I’ll write at you tomorrow, barring—as always—the unforeseen.


*It is not, on the other hand, impossible for 90% of people to be above average (i.e., above the mean).  I’m sure I’ve addressed this before, but imagine one had administered a test, and 100 people took it.  Imagine that 90 of those people got 51/100 on the test, whereas the remaining 10 people scored zero.  Then, the arithmetic mean (what people usually mean by “average”) would be (90 x 51)/100**.  That goes to 4590/100, or a mean score of 45.9.  So, 90% of those people scored above average.  That’s not saying much, but it’s true.

** Yes I know I don’t really need the parentheses there, but I’m leaving them in for clarity.
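The arithmetic in that footnote can be checked in a few lines of Python:

```python
# 90 people score 51/100 and the remaining 10 score 0/100.
scores = [51] * 90 + [0] * 10

mean = sum(scores) / len(scores)
above = sum(1 for s in scores if s > mean)

print(mean)   # 45.9
print(above)  # 90 people scored above the mean
```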

Our wills and fates do so contrary run, that our devices still are overthrown; Our blogs are ours, their ends none of our own.

Hello and good morning.  It’s Thursday, the 26th of February in 2026, a date that’s only very slightly interesting whether you write it as 2-26-2026 or 26-2-2026.  The fact that you have repeated 2s and repeated 26s is somewhat entertaining, but the zero throws potential symmetries off, making it not nearly as much fun as it could conceivably be.  It’s a shame, really.  I suppose you could write it as 26-02-2026 and rescue a bit of symmetry, but that feels like reaching.  It’s not quite symmetrical anyway, unless one is writing in base-26 or higher.  No, wait, even that wouldn’t work.

I don’t know about what I’m going to write this morning.  That in itself, of course, is nothing unusual.  But I don’t feel that I have much to say about anything at the moment.  I don’t want to get into my depression and ASD and anxiety and chronic pain and insomnia and just general moribund state, because I’m sure no one wants to hear about it anymore, and in any case, there seems to be no way anyone can do anything about it that’s useful, which makes it all the more frustrating.  Writing about it certainly hasn’t cured or even improved my state much, if at all.

Anyway, as I said the other day, you have been put on notice.  Unless you just started reading my blog for the first time yesterday, you’ve no right to act fucking surprised no matter what happens.

Okay, that’s that out of the way.

Now, let’s see, what should I write today?  I could discuss some topics in science, especially physics, though I also have literal, legally recognized expertise in biology, and I know a lot about quite a few other branches of science as well.  This is because I have always been curious about how the world, the universe, actually and literally works on the largest and on the most fundamental scales.

I mean, yes, humans also have their rules and laws and social mores and antisocial morays and all that nonsense, but if you step back even a bit, you can see nearly all human behavior encapsulated by basic primatology.  If you know how the various monkeys and gibbons and gorillas and chimpanzees behave‒especially their commonalities‒human behavior almost always fits right in.  It’s usually not even very atypical.

That doesn’t make the specifics of behavior very easily predictable in any given case, necessarily; then again, we understand an awful lot about the weather and the climate, but the specifics of tomorrow’s weather are tough to predict precisely and accurately, let alone next week’s weather.  Nevertheless, the physics of longer term climate effects of certain kinds of atmospheric gases is almost trivial.

Anyway, humans are too annoying to be very interesting, except in special circumstances.  In this, they are perhaps a bit like cockroaches.  From the point of view of a scientist who studies them, they can be interesting, and from just the right angle and with the right detachment, they can even be beautiful (or some of them can).  But overall, they are merely large masses of highly redundant little skitterers, just doing their shit-eating and reproducing and infesting almost every possible location.

Which type of creature did I mean to describe just now?  See if you can figure it out.

Of course, on closer scales, cognitive neuroscience and neurodevelopment and related stuff, such as “neural” networks, “deep” learning, and other such areas are fascinating.  One thing interesting about them is how all the things that brains and computers and so on are and do are implicit in the laws of physics‒clearly they are some of the things that stuff in the universe can do‒and yet, for all we know, they have only ever happened here, just this once in all the vast and possibly infinite cosmos*.

And for all we can tell, given the human proclivity to plan about 20 Planck units ahead and then after that trust to luck, this could be the only place they occur, and their time will not continue much longer, certainly not on a cosmic scale.

I could be wrong about that…except in the sense that, since I am stating it merely as one of the possibilities, I am not actually wrong at all.  Even if humans do survive into cosmic time scales and become cosmically significant, it will still not be easily debatable that it could have happened that humans would go extinct and would fail to go anywhere but Earth.

Of course, depending on the question of determinism, I suppose one could say that if humans (or their descendants) become cosmically significant then there literally was nothing else that could have happened, at least as seen from outside, at the “end”.

On the other hand, if Everettian quantum mechanics is the best description of the fundamental nature of reality, then in some sense, every quantum possibility actually happens “somewhere” in the universal quantum wave function, though those variations may not include all conceivably possible human outcomes.

Some things that seem as though they should be possible may simply never happen to occur (or occur to happen?) anywhere in the possible states of the universe.  That feels as though it should be unlikely, given how many possible states can be locally evolved in the quantum wave function, but I don’t think we know enough to be sure.

Okay, well, I vaguely hope that this has been mildly interesting and perhaps thought provoking.  It would be enjoyable to get more feedback and thoughts, but I don’t have a very large readership, and only a certain small percentage of people ever seem to interact with written material in any case, so I’m probably lucky to get the feedback that I get.

TTFN


*With the inescapable caveat that, if the universe is spatially and/or temporally infinite, and if, as it seems, there are only a finite number of distinguishable quantum states in any given region of spacetime (the upper limit of which is defined by the surface area of an event horizon the size of the given region), then every local thing that happens, and all possible variations thereof, “happen” an infinite number of times.  But given that all these regions are more or less absolutely physically distinct and incapable of “communicating” one with another, they can be considered isolated instances in a “multiverse” rather than parts of the same “local universe”.

Are gravity and frivolity truly opposites?

It’s Wednesday morning (not quite five o’clock yet) and it is February 25th.  There are only ten more shopping months until Newtonmas*.

For those of you who don’t know (and as a reminder for those of you who do know) Isaac Newton was born on December 25th, 1642 (AD**).  Now, there is a parenthetical here:  Newton was born on December 25th by the Julian*** calendar, which was the one used in England at the time of his birth.  By the Gregorian**** calendar, Newton would have been born in early January of 1643.

This might seem to imply that December 25th nowadays shouldn’t be considered Newtonmas, but of course, it’s a closer fit than celebrating the birth of Jesus on that day; supposedly, biblical scholars have found that Jesus was probably born in the summer or something.  As with many things, “The Church” appropriated the popular holidays celebrating the winter solstice and grafted Christian religious significance onto them.

There’s nothing particularly bad about that.  All these holidays and divisions of the year are fairly arbitrary (though celebrating solstices and equinoxes is common enough in multiple cultures, which makes sense because these are objective events in any given year that can be noticed by any culture that is paying attention).

The length of a year is a concrete, empirical fact, as is the length of a day and the length of a lunar orbit around the Earth.  None of them are straightforward multiples of each other, unfortunately‒they are waves that are not harmonically associated with each other.

I don’t know how long it would take for their “waves” to come back into some primordial alignment and “start over”, but it’s probably moot, because the length of a day and of a lunar orbit and of the orbit of the Earth are changing slowly.  The moon, for instance, is moving steadily (but very slowly) away from the Earth over time, and so its orbital period is increasing (since things that orbit farther away orbit more slowly).

I think Kepler’s third law was/is that the period of a planet’s orbit around the sun is proportional to the 3/2 power of the length of the semimajor axis of its orbit.  I’m not sure if that exact power holds up on the scale of, say, the lunar orbit, but the laws of gravity are as universal as anything we know.  Indeed, there are materials that are opaque to light, but as far as we know, there are none that are opaque to gravity.  Gravity is nevertheless constrained by the geometry of spacetime, so an orbital period always increases at a faster rate than the distance from the center around which the mass orbits.
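As it happens, the 3/2 power does hold for the lunar orbit; here is a back-of-the-envelope Python sanity check, using Kepler’s third law in its Newtonian form (the physical constants are approximate):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24   # mass of the Earth, kg
a_moon = 3.844e8     # semimajor axis of the lunar orbit, m

# Kepler's third law, Newtonian form: T = 2*pi * sqrt(a^3 / (G*M))
T = 2 * math.pi * math.sqrt(a_moon**3 / (G * M_earth))
print(T / 86400)     # about 27.4 days; the sidereal month is about 27.3
```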

The inability of anything we know of to block gravity is one thing that makes me take seriously the notion that, at some level, there could be more than three spatial dimensions.  If gravity is not confined to three dimensions then nothing that is so confined could stop it; it would merely flow around any obstacle (maybe gravitational waves, for instance, can even diffract around matter and energy, though that might not imply higher dimensions).

This is related, indirectly, to the fact that it is impossible to tie a knot in a string in 4 or higher spatial dimensions.

By the way, having those extra spatial dimensions curled up tiny, as is usually presented in depictions of the notions of string theory, is not the only way for them to exist and be undetected.  If most of the forces in the world we know‒the electromagnetic, the strong force, the weak force, and the various matter-related quantum fields‒are constrained to a 3-brane because their strings are “open-ended”, then we could live in a 3-brane (in which all other forces, including matter, are confined) nested in a higher-dimensional “bulk”.  Gravity could be conveyed by a “looped” string, which could pass through the 3-brane, interacting but not being confined to it.  This could also explain the comparative weakness of the gravitational force and might even explain dark matter (and why it is so difficult to detect).

This sounds extremely promising, maybe, but there are issues and hurdles, not the least that strings and higher spatial dimensions are very difficult to detect, if they exist.  Also, it’s very hard to pin down all the implications mathematically in a useful way.

I remember one lunch break when I was still in medical practice when I tried to see if I could work out mathematically if “dark matter” could be explained by a relatively nearby, parallel brane-universe (it would probably be more than one, but one was difficult enough) whose gravity spills over into and overlaps the gravity of our brane-universe.

Here’s a sort of reproduction of some of the scribbling I did then:

Unfortunately, though I could visualize what I was considering and get an intuitive feel for what the math would be like, my precise mathematical skills were just not up to the task of sorting it out rigorously.  Also, of course, lunch was not long enough, and I had many other things on my mind.  Anyway, findings like the “bullet cluster” provide some fairly strong evidence that “dark matter” is something physical within our three dimensions of space.

Okay, that’s enough for today.  I’ve managed not to talk about my depression and stress and self-destructive urges/wishes (except just now, of course), so I hope you’re pleased to have had those things cloaked from you today.

Take care.


*Working out the exact number of days, I figure it comes to 303.  December 25th is followed by only 6 more days in the year, so it’s day number 359 in the (non-leap) year.  And today is the 25th day of the second month, and January has 31 days, so today is day 56 of the year.  And, of course, 359 – 56 = 303.
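Python’s standard library can double-check that count; it agrees that December 25th is day 359 of a non-leap year:

```python
from datetime import date

# 2026 is not a leap year (it is not divisible by 4).
newtonmas = date(2026, 12, 25).timetuple().tm_yday  # day 359
today = date(2026, 2, 25).timetuple().tm_yday       # day 56
print(newtonmas - today)  # 303
```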

**Why not my usual “AD or CE?”  Because at the time, in England, it was just “anno domini”.

***Named for Julius Caesar, though as far as I know, he had no more to do with actually formulating that calendar than he had with the invention of the 7th month.  As far as we know, he wasn’t even born by the then-existing version of Caesarean section, which was more or less always fatal to the mother, and his mother lived well beyond his birth.

****Named after Pope Gregory XIII, also known (by me) as Pope Gregory Peccary*****.  He did not formulate the newer calendar, but supposedly he at least commissioned the Vatican astronomers to create it when it had become obvious that the Julian calendar was not quite tracking the actual year but was overshooting over a long period of time.  So, the Gregorian calendar is better named than the Julian calendar, or so it seems to me.

*****The nocturnal, gregarious wild swine.

A notification of whatever

I expect this post to be brief today, though I’ve been known to be wrong about that sort of thing.  I had sort of “intended” to make my headline “Oh, well, whatever…” and then make the entire body of the post “…never mind.”  Thus I would be quoting the last verse-line of Smells Like Teen Spirit by Nirvana.  The subsequent words in the song are just the chorus and then a refrain of “A denial” repeated nine times (if memory serves).

I wasn’t sure I hadn’t already done this before, though.  I could have checked, but I didn’t have the mental energy.

Still, using that last line from a Kurt Cobain song carries a certain subtext which would have served my purposes well.

Or, well, actually, given past history, it probably wouldn’t have served my purposes at all.  None of this sort of thing seems to serve my purpose, no matter what I do.  As far as I can tell, only one person actually read my (admittedly somewhat long) post yesterday, but though I was borderline explicit about my meaning, I don’t think it did any good whatsoever.  That’s not unusual, of course; much if not all that I do never ends up doing me much good.

Sometimes I have to be subtle because I cannot force myself to be open about my internal states after a lifetime of fighting to appear “normal”, to the degree I can achieve that, and to avoid being too much trouble for other people, since I don’t think I have the right to trouble them, and in fact I think (or feel) that I’m fundamentally reprehensible.

I shouldn’t worry, though.  The times I am more open and obvious‒even when I am borderline explicit‒don’t appear to be any more successful than when I am at my most cryptic.  Possibly, I am just not able to communicate my feelings effectively with humans.

At the very least, my success rate must be below one percent.  It’s not quite as bad as playing the lottery, but it’s pretty pathetic.  Then again, so am I.

Whatever.  Never mind.  Ha ha.

But really, though, I don’t have much to say.  Quoting iconic songs may be the extent of my capacity to convey myself.

Ironically, I don’t feel the urge to share quotes from my own songs (or my fiction).  You would think they would be the best choice for conveying my inner thoughts.  That’s not always the case, though.

In fact, though I like my songs well enough, and Breaking Me Down is meant to be fairly explicitly about depression (at least my species thereof), none of them have enough oomph, as it were.  Or maybe it’s just that they are not well known*, so no one recognizes and identifies with the words.

I think I have some pretty good lines in Come Back Again, including what’s probably my favorite:

“Only meeting strangers

always losing friends.

Every new beginning

always ends.”

It may seem a bit bleak, but it’s also true more or less by definition.  If you’re meeting someone for the first time, they had been a stranger until that point.  And friends do become “lost”.  And the next two lines are rather obviously true.

Of course, a very good signing (singing?) off quote would be from Pink Floyd’s Time:  “The time is gone, the song is over, thought I’d something more to say.”

I’ve always been annoyed that they added the little reprise of Breathe after that and made it officially part of the song, because those other two lines constitute a perfect song ending.  I always figured they didn’t want to make the song end on too much of a downer, so they threw in the reprise as part of that song instead of as a separate one.  Maybe they were unwittingly invoking a version of the peak-end rule I mentioned the other day.

Anyway, I have a locked and loaded draft of a blog post that already applies that couplet from Time, with the headline being the first half, continuing into the post which consists only of the second half of that quote, followed by the embedded “video” of the final song on the first disc of The Wall.

That, of course, is still a draft, and has been waiting there for a while, because if I use it, it’s meant to be my final blog post, and practically my final anything.  So I wasn’t going to use it today.  Not quite.  But I’m close.  The Nirvana quote isn’t quite as final, but it is a warning, especially given the fate of the guy who wrote it.

Anyway, consider yourselves on notice.  On notice of what?

Figure it out.


*That’s an understatement, eh?

Man overboard

As the real weekends go, it was better than most, to paraphrase The Wreck of the Edmund Fitzgerald.  By this, I’m referring to this last weekend, the two days before this day, of course.

I did not work on Saturday, which is good, because that would have been the third time in a row.  I also got to hang out with my youngest on Saturday, and we watched about four episodes of Doctor Who together, which was good, good fun.  I cannot complain about that in any way.

I have, though, a weird, disquieting, sinking sort of feeling that it may have been the last time I will see my youngest, or maybe anyone else that I love.  It’s not one of those reliable sorts of feelings, like those that lead one to new insights in science or mathematics or what have you.  It’s probably more a product of depression and anxiety, the feeling that anything good in my life is sure not to last, if it happens at all, because I do not and cannot possibly be worthy of anything good happening to me.

Is that irrational?  Of course it is irrational.  It cannot be expressed in any sense as the ratio of two whole numbers, no matter how many digits they may have.

Wait, wait, let me think about that.  My thought, my feeling, was expressed above finitely.  That is, of course, a shorthand for what is really happening, but even if one were to codify those processes down to the level of each molecular interaction that affects any neural/hormonal process that contributes to my feeling, we know that must be a finite description (though it could, in principle, be quite large).

Even if we’re taking the full spectrum of quantum mechanics into account when describing my mental state, we know that quantum mechanics demands a minimum resolvable distance and time (the Planck length and the Planck time) below which any differentiation is physically meaningless.

A finite amount of information can describe the events and structures and processes in any given finite region of spacetime.  In fact, the maximum amount of information in any given region of spacetime is measured by the surface area (in square Planck lengths) of an event horizon that would span exactly that region, as seen from the outside*.
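To get a feel for the scale of that bound, here is a back-of-the-envelope Python calculation of the holographic limit for a one-metre sphere, treating the bound as the surface area over four square Planck lengths (which gives nats, converted here to bits); the one-metre sphere is just my illustrative choice:

```python
import math

l_p = 1.616e-35             # Planck length, m
r = 1.0                     # radius of a one-metre sphere, m

area = 4 * math.pi * r**2   # surface area, m^2
nats = area / (4 * l_p**2)  # holographic entropy bound, in nats
bits = nats / math.log(2)   # converted to bits
print(f"{bits:.1e}")        # on the order of 1.7e70 bits
```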

Any finite amount of information can be encoded as a finite number of bits, which can of course be “translated” to any other equivalent code or number system.  So, really, though the contents of my mind are, in principle, from a certain point of view, unlimited, they are finite in their actual, instantiated content, and can therefore certainly be expressed as an integer, and thus also as a ratio (since any integer could be considered a ratio of itself over one, or twice itself over two, etc.).

So, in that sense, my thoughts are not irrational.  Neener, neener, neener.

In many other senses—maybe not the literal, original sense, but in the horrified, cannot-accept-that-not-all-numbers-can-be-expressed-as-ratios-of-integers-because-that-makes-the-universe-too-inconceivable sense, among others—I can be quite irrational.

It’s very difficult to fight one’s irrationality from the inside, alone.  Even John Nash didn’t really beat his schizophrenia from within as shown in the movie version of A Beautiful Mind.  Also, his delusions in real life were far more extravagant and bizarre than those which appear in the sanitized version that made a good Hollywood story.

If one escapes from mental illness from within, one has to consider it largely a matter of luck, like a young child who doesn’t know anything about math getting a right answer on a graduate-level, high-order differential equation problem.  It’s physically possible; heck, if it were a multiple choice question, it might even be relatively common***.  But it’s not a matter of being able to choose to do it right and to know how it was done.

People with severe mental health issues are almost always going to need assistance from outside.  This is not an indictment of them or of their need for help.

Surely, someone who has been swept off the deck of a ship by a rogue wave cannot be faulted for needing help from those still on the ship if they are to survive.  It would certainly seem foolish and almost inevitably fruitless if such a person tried to claw his way up the side of the ship to get back on board when there is no ladder and no handholds.  He should certainly not be ashamed that he cannot swim hard enough to launch himself bodily from the water and back onto the surface of the vessel.

One cannot reasonably fault such a person for trying to do the superhuman.  A person might try to do practically anything rather than drown or be eaten alive by some marine predator.  But, of course, barring an astonishing concatenation of events such as the time-reverse of the splashing entry into the ocean happening and sending the person out of the sea just as it was entered, such efforts will not succeed.

And though it might be heartening or at least positive for one to receive encouragement from those still on the deck—don’t drown, keep treading water, you can do it, you’ll make people sad if you drown, you deserve to stay afloat, I’m proud of you for treading water yet another day, it’ll get better, this won’t last forever, you’ve made it this far so you know you can keep going, you don’t want the people who know you to feel sad because you drowned, etc.—in the end it might as well come from the seagulls waiting to pick at one’s floating corpse.

Mind you, certain kinds of words can be more useful than others.  Words like, “Hey, around the other side of the ship there’s a built-in ladder; if you can get over there and time things right, you might be able to grab the lowest rung when the waves lift you, and then climb up,” might be useful because they are directions for using real, tangible resources that we know can make a difference.  Also, words like, “Hang on just a bit longer, we’re throwing down a life preserver on a rope so we can haul you up” would be useful, obviously, unless they were mere “comforting” lies.

Alas, though one could reasonably expect such literal assistance if one were washed overboard—the “laws” of the sea are deeply rooted in the hearts of those who work there, and they include a general tendency to help anyone adrift to the best of one’s abilities—when it comes to mental illness, the distress and the problems are difficult for others to discern and easy to ignore.  Calls of distress are often experienced as annoyances, and even treated with contempt, since those hearing them cannot readily perceive that they themselves might be similarly washed overboard at any time.

But, of course, they might be.

I don’t know how I got on this tangent, but I guess I never really do.  I just go where my mind takes me, and my mind is not a reliable driver.  It is, though, a reliable narrator.  It doesn’t matter, anyway.  Nothing does.

Anyway, here we go again into another work week, because that was what we did last week.  I wish I could offer you better reasons, but I’m really only good at breaking things down, destroying things, not at lifting anyone or anything up.  That comes from other regions and is conveyed by other ministers.


*From within an event horizon, the volume could be much larger than the spacetime that seems to be enclosed from the outside, because spacetime inside the horizon is massively curved and stretched.  It’s conceivable (at least to me) that there could be infinite space** within, at least along the dimension(s) of maximum stretch, just as there is infinite surface area to a Gabriel’s Horn, but only finite volume.
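For the record, the Gabriel’s Horn claim checks out.  Rotating \(y = 1/x\) for \(x \ge 1\) about the \(x\)-axis gives (this worked example is my addition):

```latex
V = \pi \int_1^\infty \frac{dx}{x^2} = \pi \quad \text{(finite)},
\qquad
A = 2\pi \int_1^\infty \frac{1}{x}\sqrt{1 + \frac{1}{x^4}}\,dx
\;\ge\; 2\pi \int_1^\infty \frac{dx}{x} = \infty.
```

So the horn holds a finite volume of paint, yet no finite amount of paint could coat its inner surface.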

**See, mathematically, one can stuff infinite space inside a nutshell.  Hamlet was right.  He often was.

***Perhaps this explains why certain types of mental health problems can respond well to relatively straightforward interventions, and even to more than one kind of intervention with roughly comparable success, e.g., CBT and/or basic antidepressants and such.  These relatively tractable forms of depression are the “multiple choice problem” versions of mental illness.  This does not make them any less important.

This is not an attention-grabbing headline

I’m writing this post on my smartphone, even though I brought my lapcom with me yesterday evening.  I did not use my lapcom for yesterday’s post, such as it was.  I didn’t even write that post in the morning yesterday, or at least, I didn’t write the “first draft” of it then.

By the end of the workday on Wednesday, I didn’t feel like I was going to want to write a blog post on Thursday.  So I went to the site directly and just wrote the “Hello and good morning,” and the “TTFN” and set it to publish later.

I already knew what title I was going to want to use for it.  I wanted to use Polonius’s dithering, meandering jabber about brevity being the soul of wit, as a sort of left-handed self compliment about my own brevity in that post, and because, in the original form, it would have made the headline longer than the post, which would be ironically funny, in principle.

Then, yesterday morning, I got the urge to put my little “insert here” bracketed bit in the post, the better to convey how disgruntled and disaffected and self-disgusted I (still) felt, as well as how tired.  It did sort of spoil the joke about the headline being longer than the post, of course.  At least the older joke about Polonius still holds water.  Then again, that joke was made by Shakespeare, so we shouldn’t be too surprised if it has serious legs (though this raises the question of how serious legs could possibly hold water).

One thing worth at least assessing this week might be whether there is an aesthetic difference between this post (for instance) and the posts I wrote earlier this week, on the lapcom.  Writing on the lapcom is quite different for me in many ways.

On the lapcom, I generally have to work to stop myself before a post, or whatever, gets too long.  Whereas on the smartphone, that isn’t as frequent a problem.  Not that I can’t yammer on and on even with the smartphone, of course.  Some might say all I ever do is yammer on and on.  But anyway, I can’t write as “effortlessly” on the smartphone as I can on a regular keyboard*.

Sorry, I’m retreading a lot of old ground here, which I guess is better than retreading a lot of old tires. I know how to tread on the ground; indeed, I cannot recall a time when I didn’t know how to do that kind of treading.  Whereas retreading a tire sounds like something that requires special skills and equipment, both of which I lack.

I don’t know, I’ve heard of “retread” tires, but I don’t know if such things still abound, or if they ever did.  It sounds vaguely like a bad idea, like such tires might be more prone to blowouts.  But latex is a finite resource, and there aren’t very good synthetic alternatives, so maybe there’s at least some cost/benefit tradeoff (or treadoff?) there.

Ugh.  With that last joke, I probably convinced at least some of my readers that, yes, the world would be better off if I were dead.  Actually, I say that as if it were conditional, but it’s not.  It would be more in line with reality to say “the world will be better off when I am dead”.

There’s a quote by which to be remembered, eh?

I cannot say whether I will be better off when dead.  It’s probably a nonsensical question.  When I am dead, I will not be anything at all, not better, not worse, not uglier.  What happens to virtual particles after they have annihilated?  Nothing, and less than nothing, for they truly no longer exist, and in some senses they never existed.  Indeed, as physics goes, they probably never do exist; they are a shorthand description of what happens in quantum fields when perturbances in the fields have effects that do not rise to the level of actual, true particle production.

Or so I am led to understand.

From another point of view, it is possible for something to improve, at least in a sense, by ending.  I’ve mentioned this before, but if the curve of a function‒perhaps a graph of the “quality of life” or one’s “wellbeing”, to say nothing of happiness‒is persistently negative, then returning to zero is a net gain.  It can be a huge net gain, in fact.  This is related to the origin of my own version of an old saying, which I use with tongue definitively in cheek:  The one who dies with the most debt wins.

Now, of course, the integral, the area “under” that wellbeing curve would not be improved by the curve reverting to zero and stopping.  But at least that integral would not keep getting more and more negative over time.

Some might say, “well, the integral can become less negative over time, and might even become positive”.  This is, in principle, true.  And when one is young enough, it’s relatively easier to tip the curve, and its integral, into positive territory.  But as the curve goes on, having been negative for a longer and longer time, it’s going to become ever harder to bring things to a net, overall positive integral, even if one could reliably make one’s curve positive (which one often simply cannot do).
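A minimal numeric sketch of that integral argument (the numbers are entirely invented, just to show the arithmetic of the deficit):

```python
# If the wellbeing curve stays negative, the running integral only
# digs deeper; the longer the negative stretch, the more positive
# area is needed later merely to break even.

def running_integral(curve, dt=1.0):
    """Cumulative left-Riemann sum of a sampled curve."""
    total, out = 0.0, []
    for value in curve:
        total += value * dt
        out.append(total)
    return out

# Ten years at a wellbeing of -2, then five years of recovery at +1:
negative_years = [-2.0] * 10
recovery_years = [1.0] * 5

integral = running_integral(negative_years + recovery_years)

# After the negative stretch the running total sits at -20; five good
# years at +1 only bring it back to -15.  Breaking even would take
# twenty years at +1, twice as long as the deficit took to build.
assert integral[9] == -20.0
assert integral[-1] == -15.0
```

Reverting the curve to zero and stopping, as the post says, at least freezes the total where it is.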

Of course, the moment to moment experience (which is all the mind really gets) of an ascending curve could be pretty darn good, and might well be worth experiencing, even if it’s not enough to bring the integral into positive territory.  We are straying into the “peak-end” rule here, which was derived from studies of (among other things) colonoscopies but applies to much else in human experience.

Speaking of peak endings, I’ll mention in passing the curious fact that, no less than twice in the last week, the evening train service has been disrupted by someone either getting hit by or becoming ill next to the train.

Earlier this week, right by the station where I catch the train to go back to the house, there was a man who looked like he was probably homeless and had collapsed next to the train tracks not far from the station.  I saw him brought away, finally, on a stretcher.  He didn’t look physically injured‒certainly not in the ways I would expect someone who had actually been hit by a train to look‒but he did look cachectic, which is why I thought he might be homeless.

Then, last night’s commute was interrupted by what they call a “trespasser strike”, one that did not involve the train I rode but which always slows everything down.  I’m vaguely amused by the euphemism “trespasser strike”.  A “trespasser” here is a non-passenger who doesn’t work for the train company (or whatever) who is in the area adjacent to the tracks.  The “strike” part is probably self-explanatory.

I suppose it’s literally true, at least from a legal point of view, to call the person a trespasser.  But it’s amusing that the train people have to say something derogatory about a person hit by a train‒even if the person deliberately put themselves in harm’s way‒to sort of, I don’t know, assuage the company’s conscience.

But we are all trespassers, in at least some senses.  We are also, in other senses, all owners.  We are all innocent, and we are all, in some other senses, guilty.  “Every cop is a criminal and all the sinners saints.”  Above all, we are all very much just passing through, staying only a very short time.  We are all virtual particles.  Or you might say, we are all Iterations of Zero.

Have a good weekend.  I should not be writing a post tomorrow (in more than one sense).


*I wish I could honestly say that my use of a piano-style keyboard were as effortless, but I am terribly rusty with that, though I started learning it when I was 9, roughly two years before I got my first typewriter.

Therefore since brevity is the soul of wit, and tediousness the limbs and outward flourishes, I will be brief–your noble blog is mad.

Hello and good morning.

[Insert some random nonsense that suits your fancy here, possibly regurgitating the idiocy of previous posts, as I usually do.  Knock yourself out.  It couldn’t be worse than anything I could do.]

TTFN

“Like”, “comment”, and “share” (if you feel like it)

I’m very tired this morning.  By which I mean I’m more tired even than usual.  My head is a bit foggy—more so than usual, again—and I feel like I just belong lying down inert, perhaps in an open-topped coffin.  I’ve occasionally thought that they looked like good places to sleep.  It seems a shame to waste them on people who are already dead.

It’s Wednesday today, and I don’t think I’m going to have anything nearly as thoughtful to say as what I wrote yesterday, which was at least rather “deep” if not particularly useful or helpful or interesting to any of my readers.  I did get an interesting comment on my take on one of the reasons mindfulness is useful, and that’s always nice.  I’d love to encourage greater feedback from more of my readers, here on the site in the comments, but I don’t know what to do to encourage them [I decided just to do a little cajoling in the headline, in case that works].

Probably there is just some percentage of people who tend to comment, no matter the situation.  It’s a bit like the long-known “fact” (which may or may not be a true fact) that every advertisement, from flyers/mailings to commercials, actually elicits a response in only about two percent of the people who interact with it.  I suspect it’s probably similar with things related to blogging and social media and the like.

One sees it most readily on places like YouTube.  The number of views of a video is almost always something like an order of magnitude greater than the number of likes (and often it’s larger than the number of subscribers), and that’s still larger, though by a ratio that’s not as clear to me, than the number of people who comment (let alone share).

My own YouTube videos (and those of my published songs) are poor examples, or perhaps one might say “poor samples”, not representative of the phenomenon as a whole.  I have a number of “views” on my music videos that is generally a couple of orders of magnitude larger than the number of “likes”, but I know why that is.

Almost all of those views are from me, because I put my songs in my YouTube music playlist, and so I have listened to them often, back when I used to ride my scooter to work and back.  I had a lovely Bluetooth enabled helmet.  I like to listen to songs and sing along in a car, or similar, when I’m on my way places*.  So I’ve listened to my own songs probably orders of magnitude more times than everyone else put together has listened to any of my songs.

It’s kind of pathetic, isn’t it?  I’m also the one who has bought more copies of my books—because I gave copies to the people in the office—than everyone else put together, I’m fairly sure.

As for this blog, well, I get a higher number of likes relative to number of readers than I do with anything else, and I even recently have been getting a comment or so most days, which is very nice.  There’s at least some interaction.  It would be nice if I could reach a larger audience, but I’m not terribly good at self-promotion.  I am pretty good at self-denigration, though.  In fact, I’m one of the best there is at it!

Ha ha.

Well, like the song says, it’s all just a drop of water in an endless sea.  Or, it’s all just spit in the ocean** as more people probably say.  My spit may be more purulent than average, but it’s all still just spit.

Anyway, I don’t know what else to discuss today.  I’m very tired and worn out and I’m in ongoing pain that only responds somewhat to all the mitigating things I try to do, at least so far.  I’ve been through a quarter of a century of trying, and I have not been passive nor uncreative nor ignorant in my attempts.  As those reading might notice, I’ve thought about this matter a lot.  You probably would also if you were in chronic pain for nearly half of your life (so far).  It has a way of garnering your attention.  It’s built that way.

It’s interesting to note that shortly after I’m sixty, if I’m still alive***, I will have been in essentially constant pain for half my life.  After that it will become a majority (unless I’m cured at some point along the way, of course).

I occasionally (not often, though, because it’s too disheartening) wonder what my life would be like, what I would be like, if not for my chronic pain.

Things would almost certainly be vastly different.  I cannot be certain that they would be better—there are probably at least a few things that would be worse.  But it seems likely that my life would be much better overall, if only because I wouldn’t have a huge chunk of my will and energy stolen by being in pain all the time.  That constant pain really does make everything else harder.

But no matter the state of the rest of my life, at least one thing would be true (by “definition” in this case), and that is that I would not be in pain every fucking day of my stupid useless life.

Surely that must be worth something.  It would not be worth not having my children exist, but almost everything else would be worth trading.  I sometimes think of it as parallel to a line from Me and Bobby McGee:  “I’d give all of my tomorrows for a single yesterday, holding Bobby’s body next to mine.”  It’s nice poetry, albeit a bit weird to think about temporally.  But in my case, I think of it as basically saying I would gladly give up some significant fraction of what would otherwise have been my future if I could be out of pain.

But, of course, my future is less valuable to me now at least partly because I am in pain.  If I were not in pain, ironically, the future would be much more valuable, since it would be at least somewhat less uncomfortable.  If I could be free of depression, and the tendency thereto, that would make things better still.  That might even constitute a future worth having.

Yeah, yeah, I know, wishes, horses, manure, beggars riding, dogs and cats living together, watermelon, cantaloupe, rutabaga, yada, yada, yada.  I’m wasting my time and yours.  And I’m writing too much, because I’m using the lapcom, and I’m not saying or doing (or being) anything at all worth saying or doing or being.  This is all just stupid.

I hope you all at least have a good day.  I would not mind if this were my last one.


*I can’t do it anymore because I don’t ride or drive anywhere anymore, so I am not “alone” when commuting anymore.  I’m also not alone at the house.  It’s really quite disappointing.  I like to sing.

**This is a bit amusing:  I made a typo when I first wrote that phrase, and it was rendered as “spit is the ocean”, which seems almost like some vaguely deep thought about how oceans are lived in, swum in, excreted in, and bled in by numerous living creatures.

***Right now that seems a horrifying prospect.

May the slope of your pain function always be negative

I’ve been thinking about something I wrote in my blog post yesterday.  I had thrown out the thought, in passing, about how it seemed as though all the things in my life that I still do are not things I necessarily do for joy or out of desire to achieve some goal, but rather they are things which are more painful not to do than to do, and so I do them.

There isn’t really a positive motivation—not the pursuit of happiness or improvement or fulfillment or enrichment.  It’s just that the feeling of stress and tension and anxiety (or whatever) regarding the prospect of, for instance, not going to work rapidly becomes worse than the equivalent feelings about going to work.

That’s not a great state of affairs.  Don’t get me wrong; it’s entirely natural.  I’ve written about this many times, this recognition of the fact that the negative experiences—fear, pain, revulsion, disgust, and so on—are the biologically most important ones.  Creatures that don’t run from danger, that don’t avoid injury, that don’t shy away from potential infection and poison, are far less likely to survive to reproduce than creatures that do those things.

We see clinical examples of people lacking some of these faculties—such as those with congenital insensitivity to pain—and while we might envy them a life without agony, it tends to be quite a short life.  Also, they tend to become immobile and deformed due to damage they do to their joints by not shifting position to improve blood flow.

In case you didn’t know, that’s one of the reasons you can’t stand completely still for very long; it’s not good for you.

But many of us, especially in the modern world, have some things that we do for positive experience.  Some of them are dubious, but food, sex, companionship/conversation, singing, dancing, all that stuff, are positive things.  Unfortunately, positive experience cannot be allowed—by biology—to last too long.

As Yuval Harari noted, a squirrel that got truly lasting satisfaction from eating a nut would be a squirrel that lived a very short—albeit fairly happy—life, and would be unlikely to leave too many offspring.

Maybe this is what happens to some drug addicts.  Maybe they really do get satisfaction or at least pleasure from drugs—and maybe that is what ends up destroying them.  At some level, that’s not truly in question, is it?  People who are addicted to drugs forego other pleasures and other positive things, but perhaps more importantly, they fail to avoid many sources of pain and fear and injury.

The reality is probably a bit of an amalgam, I suppose.  I would not say it’s a quantum superposition, though, except in the sense that everything is a quantum superposition (or, rather, a whole bunch of them).

This is one situation in which I think I’m right and Roger Penrose is wrong—a bold claim, but I think a fair one—in that I see no reason to suspect that the nature of consciousness either requires or even allows quantum processes, other than in the trivial sense that everything* involves quantum processes.  But there’s no reason seriously to think that (for instance) neurotubules can even sustain a quantum superposition internally, let alone that such a process can somehow affect the other processes of the neuron, many of which are well understood and show no sign of input from weird states of neurotubules, which act mainly structurally in neurons.

If deep learning systems—LLMs and the like—have demonstrated anything, it’s that intuitive thought** does not require anything magical, but rather can be a product of carefully curated, pruned, and adjusted networks of individual data processing units, feeding backward and forward and sideways in specific (but not necessarily preplanned or even well understood) ways.  No quantum magic or neurological voodoo need be involved.

I think too many people, even really smart people like Penrose, really want human intelligence to be something “special”, to be something that cannot be achieved except within human heads, and maybe in the heads of similar creatures.  Surely (they seem to believe) the human mind must have some pseudo-divine spark.  Otherwise, we oh-so-clever humans are just…just creatures in the world, evolved organisms, mortal and evanescent like everyone and everything else.

Which, of course, all the evidence and reasoning seems to suggest is the case.

Maybe, deep down, there isn’t much more to life than trying to choose the path from moment to moment that steers you toward the least “painful” thing you can find.

Please note, I’m not speaking here about some metaphorical continuum, some number line that points toward pleasure in one direction and pain in the other.  That’s at best a toy model.  In the actual body, in the actual nervous system, pain and fear and pleasure and motivation are literally separate systems, though clearly they interact.  Pleasure is not merely the absence of pain, nor is pain merely the absence of pleasure.  Even peripherally, the nerves that carry painful sensations (which include itching, as I noted yesterday!) use different paths and different neurotransmitters than the ones that deal in pleasure and positive sensation.

Within the brain, the amygdala and the nucleus accumbens (for instances) are separate structures—and more importantly, they perform different functions.  There’s nothing magical about their locations in the brain or the particular neurotransmitters they use.  Those things are accidents of evolutionary past.

There’s nothing inherently stimulating about epinephrine, and there’s nothing inherently soothing about endorphins or oxytocin, and there’s nothing inherently motivating or joyful about dopamine and serotonin.  They are all just molecular keys that have been forged to open specific “locks” or activate (or inactivate) specific processes in parts of other nerve cells (and some other types of cells).  It’s the process that does the work, Neo, not the neurotransmitter.

This brings up a slight pet peeve I have about people discussing “dopamine seeking” (often when talking about ADHD).  I know, the professionals probably use this as a mere shorthand, but that can be misleading to the relatively numerous nonprofessionals in the world.  The brain is not just a chemical vat.  Depression and the like are not just “chemical imbalances” in some ongoing multi-level redox reaction or something, they are malfunctions of complicated processes.  Improving them should be at least as involved as training an AI to recognize cat faces, wouldn’t you think?

But one can do the latter without really knowing the specifics of what is going on in the system.  It’s just sometimes difficult, and the things you think you need to train toward or with often end up giving you what you didn’t really want, or at least what you didn’t expect.

Maybe this is part of why mindfulness is useful (it’s not the only part).  With mindfulness, one actually engages in internal monitoring, not so much of the mechanical processes happening—no amount of mere meditation can reveal the structure of a neuron—but of the higher-scale, “emergent” processes happening, and one can learn from them and be better aware.  This can be an end in and of itself, of course.  But it can also at least sometimes help people decrease the amount of suffering they experience in their lives.

Speaking of that, I hope that reading this post has been at least slightly less painful for you than not reading it would have been.  Writing it has been less painful than I imagine not writing it would have been.  That doesn’t help my other chronic pain, of course, which continues to act up.


*With the possible exception of gravity.

**I.e., nonlinear processing and pattern recognition, the kind many people including Penrose think cannot be explained by ordinary computation, a la Gödel’s Incompleteness Theorem, etc.