“We would zig-zag our way through the boredom and pain”

It’s Monday again; indeed, it is the last Monday in March in 2026 (AD/CE), for whatever that’s worth.  This Monday shall never come again.

Then again, of course, no Monday shall ever come again.  Such is the nature of time.  This is one of the facts that makes the expression “That’s a [measure of time] I’m not gonna get back” senseless.  Well, duh!  You never get any of your experienced time back.  That’s the nature of time, and of its directionality, which depends upon the second law of thermodynamics.

Even if one could rewind time, one would not “get a moment back” the way people talk about it.  If, like the events of a movie or other video story, one could rewind life, it would not be you (the self who spoke of getting the moment back) who would experience the events anew.  It would just be a return to an earlier state, in which you would again be experiencing all the same events, not merely as if for the first time, but actually for the first time.  The posterior events would be erased for you as you traveled back.

It’s not like playing a video game where you can “regenerate” at your most recent save point, but you can remember what happened to your character before it “died” so that you can learn from your mistakes.  There is no one playing your character (i.e., you) and able to learn from a repeated past.  You are not the player of the game, you are the character.  You are part of the game.  You are part of the movie, not watching it from outside.  If it resets, you reset; if it rewinds, you rewind, and all memory of any events that happened disappears along with the future.

Whether you would repeat the same events, like the characters of a movie or show, or might do something different, like a video game character, is less clear, but it doesn’t much matter.  You are still going through each moment once, effectively, and you can only learn from mistakes to affect your behavior in the future.  If your mistake kills you, you’re just dead.

Even if time were a closed loop‒if the future of the universe wraps around and becomes “the past” again, forming a closed and fixed structure, as appears to be possible in principle according to General Relativity‒you won’t get to experience it as happening again.  Each time, you will experience reality for the first time.

Just as there is no fixed self looking out from behind your mind, there is no external rememberer hovering over your reality, able to experience your experiences for the first time but as if not for the first time.  You are a phenomenon within reality, not a sojourner through reality that accumulates knowledge that could be used in reliving the past, but better.

If you could rewind yourself except for your mind, somehow retaining your memory of “the future”, that would not be truly returning to the past.  Rather, it becomes the next set of events in your future.  This demands an answer to the question of how it could be possible for you to become your earlier self and yet remember your later self, since your memories are functions of your brain.

This is what makes things like Alzheimer’s and other forms of dementia or brain damage so tragic‒they literally are injuries to what makes us ourselves.  If you lose all memories of your past, then in a very real sense, the person you were is already dead.

Of course, even in healthy states, without brain damage, your past self is still “dead” with every new moment that arrives.  Every time you sleep and then wake up, it may as well be that you have died and then been recreated in the morning, just with implanted memories from the previous person, the one who died.  If there were no difference in your brain and the rest of your body, there would be no way for you to know.  Indeed, it’s in principle possible that this actually happens with each passing moment, or even each passing Planck time.

Only the past can be remembered.  Only the future, even in principle, can be planned and affected.  And only the ever-moving present can be experienced.  There is, of course, a continuity that is required for us to have any sense of a unified personhood at all, but as Sam Harris has pointed out (more than once), your memories of your past are merely thoughts in your mind in the present moment, as are your plans for the future.

So it really can make sense to “get over yourself”, in more than one way.  It’s worth recognizing that you’re mortal and‒whatever you may believe‒as far as we know, death is the end, and all that you were will be gone after that.  But it’s also worth recognizing that, in a nontrivial sense, each day all that you were the day before is already gone.

Still, though you exist only in any given present moment, memory at least allows us to learn and, hopefully, to do better in the future than we would have without it.  That’s why memory is a trait that gets selected for and is evolutionarily stable:  because its presence makes creatures with that trait more likely to survive and reproduce than those without it, ceteris paribus.

As with most such subject-specific blog posts, I could go on and on about this.  A thousand (or, well, a lot of) other thoughts arise that could be expressed as I write what I do write.  But I have finite space and finite time (even if spacetime is infinite) in which to write this post, so I’ll stop here for the moment.

Welcome to the new week.  I hope it’s a good one for you.  Heck, I hope it’s a good one for everyone, even “bad” people (with the caveat that, “a good one” entails such people becoming better than they presently are).

“You know the day destroys the night. Night divides the day.”

It’s Friday again.  But it’s not just any Friday‒it’s the Vernal Equinox, the day when the line between the Earth and the Sun is orthogonal to the Earth’s axis, and so the day and the night will be (effectively) of equal length.  This is more fun in some ways than the solstices, because it’s the same for everyone, in the northern and southern hemispheres alike.

Of course, in the north it’s officially the Vernal Equinox, heralding the beginning of spring, whereas in the south it heralds the beginning of autumn.  I don’t know, however, if it is officially called the Autumnal Equinox in the south.  Probably it is.  After all, I’m sure they have their “official” winter solstice on what is “our” summer solstice and vice versa.  It would be a bit perverse for them to do otherwise.

It’s somewhat interesting to note, as Neil deGrasse Tyson has pointed out with some ardor, that since, for instance, winter officially begins on the “shortest”* day of the year, the days actually get longer and longer through the winter (and the opposite happens in summer), until finally, on the Vernal Equinox, day and night break even, and then daytime passes the night.

I wonder what Zeno would say about that race.

On a different topic, it’s quite rainy here this morning, and it’s a rather chilly rain, which is mildly unusual for south Florida.  It occurred to me, seeing just how sloppy it is here at the train station, that I hope it will not be so rainy at my destination.  What’s interesting about that is that it may not be rainy at all there, at work.  And yet, it could still be raining heavily down here in Hollywood.

In the modern world, weather can seem to change much more rapidly than it really does because we travel through the weather, whereas throughout all of our ancestral time we would merely have seen the weather passing over us.  It can give a somewhat misleading impression of how quickly the weather changes, even in Florida, where it can be raining on one side of a street and dry on the other**.

I recall when visiting my grandparents as a child, that there were times we would all be going somewhere in the car, and as we went along it would start to rain heavily, all of a sudden‒and then, just as suddenly, as we went along, it would stop.  And then it would suddenly start again, and then stop again, and so on.

But even in south Florida (or, well, west central Florida back then) the weather doesn’t change like that if you’re sitting still.  It changes quite rapidly compared to many other places, but not the way it seems to do when one is traveling in a modern vehicle.

For some reason, I feel as though there’s an analogy or insight available here with respect to special and possibly general relativity, but I don’t feel like trying to explore it right now.

I did bring my hardcover copy of General Relativity: The Theoretical Minimum, which is part of Leonard Susskind’s Theoretical Minimum series, with me when I left the office yesterday, thinking I might read it while on the train last night.  I did not read it.  There are too many distractions, it seems, for me to be able simply to flip my attention into focus on that, however much I really am interested in it.  It’s frustrating.

I have read part of it, mind you, as well as parts of the other Theoretical Minimum series.  I have all of them in both physical copies and on Kindle, so really, I didn’t need to bring the physical book.  But it is a lovely hardcover edition, and I hoped that might make me more likely to read it, since reading a nice hardcover is much more pleasant than reading a Kindle book on one’s phone, though that can still be fun.

I also entertain the admittedly absurd fantasy that I might be reading the hardcover copy on the train some day and some other, like-minded person (preferably an attractive woman) might notice and be interested because she is into the subject as well, and so on.

This is particularly silly as pipe dreams go, because even if such an absurd event happened, I would definitely screw the whole thing up.  I tend to be quite terse when strangers try to speak with me, even if they are beautiful women.

Looking back on my life, I’m sure that there have been several occasions in which someone was expressing interest in me, but I didn’t get it or got too anxious and froze up.  Sometimes I figured it out soon after, and sometimes it took longer.  There are probably some cases that I never noticed at all, even in hindsight.

Of course, I was married for fifteen years, during some of which I was in medical practice, and so such interactions would have had a different character.  There were sometimes more flagrant and obvious “advances” in that time, because, well…doctor.  But I never had any inclination to pursue them, even when I recognized them; I’m not the kind to want to cheat on a partner.  Hell, I’m not even the kind to seek a new partner two decades after my wife divorced me (though I briefly tried a little).

I wouldn’t mind a nice relationship, but I know that I am difficult to handle in many ways (I try not to be, but I am weird, and not in some charmingly popular manner), and in certain senses, my standards are high, or at least they are fairly strict.  For instance, someone who doesn’t read for pleasure is unlikely to be terribly interesting to me.  It’s not impossible; there are other ways for people to be interesting and smart.  But not liking to read would definitely be an entry for the “con” column, not the “pro” one.

I don’t know what I’m doing, going on about such nonsense.  I am not going to have any more romantic relationships in my life.  I am going to die alone, as is only appropriate and to be expected for something like me.  And while I won’t say “it can’t happen soon enough for my taste”***, I do really feel impatient for it.  I wouldn’t say I am “eager” for it, because that’s a positive feeling.  I am just quietly desperate for it, like someone trying to find an exit from a (slowly) burning building.

Anyway, that’s enough for today.  I hope you have a good one, and that you have a good weekend as well.  Yes, I mean you.

As for me, well, I am to be working tomorrow as far as I know, so I will be writing a blog post tomorrow, barring the unforeseen.


*Of course, this is a bit of a misleading characterization.  The day is the length that it is‒roughly 24 hours‒and does not change very quickly, for which fact we should all be grateful.  It’s just the length of time in a given day during which the sun is above the horizon (so to speak) that varies.

**This is not an exaggeration.  I have seen it myself on many occasions.  It seemed to happen more frequently in the area where my grandparents used to live (Spring Hill, north of Tampa) than it does down here‒or maybe I noticed it more because I was a kid‒but it is very real and quite impressive when it happens.

***Except to say that I won’t be saying it.

You’re so vain, you probably think that nothing matters

I was going to start by saying that I had probably written all I could about Friday the 13th, and about the fact that there are two in a row whenever a non-leap-year February has a Friday the 13th, and that a first glance might lead one to think this should happen roughly every 7 years on average*.  However, as I noted last time I discussed this, because the leap year day is in February, we will not have the two-in-a-row Fridays the 13th (February and March) as often as we might otherwise; it will not happen every 7 years on average.

Then, this morning, after recalling that today was Friday the 13th, I ran through the next years’ Fridays in my head in the shower, and it occurred to me that the next Friday the 13th in February‒which will be in 6 years, as I noted in the past‒will not be followed by a Friday the 13th in March!  2032 (six years from now) will be a leap year, so there will be 29 days in February, so there will be no Friday the 13th in that March.

The next paired ones, then, will be a further 5 years after that, in 2037 (not a leap year).  It would have been 6 years later, but there are two leap years in that interval, 2032 and 2036, so the next one comes a year sooner than it would otherwise.

It occurred to me that, because leap years come around almost twice as often as the seven-year weekday cycle, those paired dates may well come only once every 11 years on average rather than every 7.  At least those are both prime numbers.  I’m not going to work out some exact formula right now, though.  It’s not really important.
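Not important, but it is checkable.  For anyone curious, here is a throwaway sketch of my own in Python (just the standard library) that settles the question empirically rather than by formula:

from datetime import date

def double_friday_13(start, end):
    # Years in [start, end) in which both February 13 and March 13 fall on
    # a Friday.  This happens exactly when February 13 is a Friday in a
    # non-leap year, since February then has 28 days, a whole number of weeks.
    return [y for y in range(start, end)
            if date(y, 2, 13).weekday() == 4 and date(y, 3, 13).weekday() == 4]

years = double_friday_13(2026, 2126)
print(years)                    # begins 2026, 2037, 2043, ...
gaps = [b - a for a, b in zip(years, years[1:])]
print(sum(gaps) / len(gaps))    # the empirical average gap over this century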

Of course, one could say that nothing is truly important, and I am persuadable along those lines.

There is a Doctor Who Christmas Special (the one from series 5) in which the antagonist/guest protagonist (played by Michael Gambon!) describes a woman in a cryo chamber as “nobody important”, and the Doctor characteristically responds by saying, “Nobody important?  Blimey, that’s amazing.  You know, in 900 years of time and space, I’ve never met anyone who wasn’t important before.”

This is typical Doctor, of course, but it raises the objection Dash (from The Incredibles) voiced when told that everyone is special:  Saying that everyone is important can be the same thing as saying no one is.

Of course, important is in the eye of the beholder.  But then again, the beholder is not important, either, except in its own subjective estimation and perhaps that of a few other, equally unimportant, owners of such eyes.

So, yeah, one could argue relative and subjective importance from local points of view, which is valid but more or less vacuous outside its small scale as far as I can see.  On a cosmic scale, it’s all just dust and shadows.  But you could also say that about the entirety of the cosmos itself.

I guess import has always been subjective, even though people are not inclined to see it that way.  But, of course, people are the products of their “local” forces, and they are not responsible for the laws of nature, nor for the things which have happened in the past that have affected them in the present (which could come under a certain interpretation of “the laws of nature” in and of itself).  I won’t get into all that now.

Going back to the shower, but on an entirely different subject, I was also thinking about the effects of diminishing amounts of shampoo in the bottle on the center of gravity of the bottle.  At the start, when it’s full, the center of gravity is roughly in the geometric center of volume of the whole thing.  But as one uses the shampoo, the center of gravity shifts lower and lower, since the air replacing shampoo in the upper part of the bottle is much less dense than the shampoo or the bottle.

But then, as one gets to the dregs, the smaller and smaller amount of shampoo in the bottle contributes less and less to the overall mass distribution of the bottle and its contents, and the center of mass begins to head back up.  Finally, when the bottle is “empty”, the center of gravity will have returned to almost the same place it was when the bottle was full.
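Here’s a toy version of the calculation in Python, a minimal sketch with invented numbers for a cylindrical bottle, so don’t take the specific figures seriously:

def com_height(f, h=0.20, m_shell=0.05, m_liquid=0.45):
    # Center of mass height (in meters) of bottle plus contents at fill
    # fraction f.  The empty shell (m_shell kg) is taken as uniform, so its
    # center sits at h/2; the liquid (f * m_liquid kg) forms a column whose
    # center sits at f * h / 2.
    m_liq = f * m_liquid
    return (m_shell * h / 2 + m_liq * f * h / 2) / (m_shell + m_liq)

for f in [1.0, 0.75, 0.5, 0.25, 0.1, 0.0]:
    print(f"fill {f:4.2f}: center of mass at {100 * com_height(f):.1f} cm")

The printout shows exactly the sag-and-recover behavior described above:  the center of mass starts at mid-bottle, drops as the shampoo drains, then climbs back as the shell’s own mass comes to dominate.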

All that’s fairly trivial, well-known stuff, I know.  But it got me to thinking about how much of physics, such as the laws of gravitation (in Newtonian form), is worked out using concepts like the center of mass, which is really just a way of combining and averaging the effects of numerous tiny bits of gravitating material as if they were concentrated at one point.

Much of the mathematics of physics works this way, coarsely approximating the very fine details of reality in a way that provides reliable, reproducible guidelines and can produce testable predictions.

But the granularity of reality doesn’t actually ever go away, not at any level.  Even at the level of the quantum wavefunction of a single “particle”, the actual behavior of the thing as it interacts with the “larger” world is the summation of the effects of all its possible quantum states superposed upon each other and interacting with other things‒everything‒which are also just collections of superpositions of quanta.  That superposition happens in a “space” that doesn’t directly coincide with the macroscopic space we experience, but whatever its dimensions are, they are real, because they have durable, reproducible effects.

Mathematics may be unreasonably effective in the physical sciences, as Eugene Wigner famously noted, but it seems not to be a refining of description but rather an averaging out, a glossing over, the inking of an underlying rough pencil drawing which nevertheless still constitutes the real, original picture.

It may be that, in a sense, all science is just various forms of statistical mechanics.  We know that, at larger scales, we definitely need the tools of probability and statistics to navigate as best we can the territory of reality.  And yet, we don’t teach this sort of stuff to most people, ever.  I wrote a post about this on Iterations of Zero, if I remember correctly.

I could go on about all this rather easily, I guess, but I am using my smartphone today, and my thumbs are getting sore.  That’s okay; yesterday’s post was probably way too long, anyway.

If I did a video of my thoughts on this I might be able to get into more detail, though it would probably be even more erratic and tangential than my writing.  Still, maybe it would be worth trying.

In the meantime, I’ll write at you again tomorrow.


*Go ahead, do a search on my blog page for Friday the 13th; I’m all but sure it will bring up the pertinent blog posts.

 

This is the blog this man’s soul tries

Well, in case some of you were starting to feel lighthearted and optimistic‒just a little more at ease with yourselves and the world after two whole days without reading my work‒here I am to write another blog post that will probably bring you down and make you inclined to wonder whether anything at all is really worth anything, or if you should just give it all up, especially the habit of reading this blog.

Congratulations.  It’s Monday again, the start of another work week.  Also, Daylight Saving Time has begun (or is it “ended”?) over this last weekend, so for a bit, a lot of people’s circadian rhythms are going to be slightly off.  That will contribute to an increased number of accidents, both minor and major.  There will also be increased rates of illness (again, both major and minor), and I believe there is even some evidence that men at least will suffer more heart attacks after the time changes.

And what are the other advantages of Daylight Saving Time?  I’m not aware of any actual benefits.

Of course, like most of you, I’m starting my own work week today, and it’s going to be a long one; the office is scheduled to be open this Saturday.  By then, the shifted time measure will be mostly adjusted in everyone’s heads.  I’m speaking of things here in the US, of course; I honestly don’t know off the top of my head whether other cultures have adopted this weird custom.

Whence did it originate?  I’ve heard explanations and excuses at various times in my life, but they are not very convincing.  If you know‒with reasonably good credence‒please share that information in the comments below.  And like and share it if you’re so inclined, especially if you have a strong sense of irony.  Heck, like and share the song itself if you want to immerse yourself in a kind of meta-level irony, or something like that.

I don’t know what to discuss today, even more so than usual.  I’ve committed to trying not to dwell on, or at least not to share, my negative thoughts and emotions and so on, since I’m sure they do very little other than make other people feel depressed (yes, certain kinds of mental illness can be rather contagious, in a sense at least).

I won’t say I would never wish depression on anyone; that’s ridiculous.  For instance, I would feel much safer in the world if this Presidential administration, and indeed most of its equivalents around the globe, suffered from enough depression to make them second-guess themselves and doubt themselves from time to time.  It almost ought to be a requirement for office that someone be prone to dysthymia at the very least, so they would feel less confident that their shit doesn’t stink, so to speak.

And no, I am not suggesting that the people of the world ought to put me in charge for the best chance to make the world better.  I used to dream of such things, and I had a very Sauron-like wish to control events in the world for the greater good.  It might still not be too horrible a notion.

But my inclination over time has become more negative, more Melkor/Morgoth like.  So if anyone is inclined to encourage and engender acts of chaos and destruction on a hitherto unseen scale, by all means, give me immense power.  I make no warranties or guarantees or even assurances that I will use such power wisely.

I’ll try, of course.  No one can be expected (fairly) to do anything more than that, no matter what Yoda said.

Goodness knows I’ve tried a lot, in a lot of ways, all throughout my life, literally for as long as I can remember.  By which I mean, I’ve tried to do my best to do good things and to be a good person‒a good friend, a good son, a good husband, a good father, a good doctor, all that.  You can probably tell by my current state‒solitary, lonely, divorced, professionally ostracized, in bad physical health, in horrible mental health, alone*‒how well I’ve done at all those things.

I’m not exaggerating when I say I’ve tried hard.  I’m not one to big myself up very much, but I have worked hard all my life at being all those things.  Yet despite my sincere efforts and my reasonably high intelligence, here I am.

I suppose a lot of the disappointing outcome(s) is/are related to my ASD, both the heart-based one and the brain-based one, as well as my tendency (probably related to the preceding) to depression and some degree of low-grade paranoia.

By “low-grade” there, I mean that I don’t literally suspect that there are malicious forces plotting against me or trying to control me; I honestly don’t think highly enough of humans (or any other beings) to expect them to be capable of such things.  It would almost be reassuring if they were.

No, I mean I just have a general, global sense‒not just intellectually, but in my bones as it were, in my deep intuitions‒that I cannot rely upon anyone or upon anything, other than the laws of nature themselves (whatever their final version might be).  I don’t “trust” anyone or anything, including (one might even say “especially”) myself.  Everything is a calculated risk.

This is of course literally true for everyone, but I think most people hide from that fact most of the time, usually (but definitely not always) without terrible consequences.  I don’t know if that’s worse or better.  It may be more pleasant, but I suspect it’s misleading, and has been responsible for, or at least it has contributed to, many ills the human race has brought upon itself and upon others.

Whataya gonna do?  I guess you’re gonna do whatever you must, as they say, since it’s not as though you can do anything other than what you do once you’ve done it, and so it was all along what you were going to do, and so it was what you must do (or must have done).

I hope you have a good day and a good week.  I’ve tried to withhold my depression and negativity, with at least some degree of success‒trust me, I’ve withheld‒and I will continue to do so, because sharing it is pointless, and asking for help is laughable.


*Now, that phrase had some redundant notions, didn’t it?

Thoughts meander like a restless Melkor in the Outer Void…

It’s Friday, and this week I can be thankful therefor, because I do not work tomorrow.  The office will be closed (and locked) on Saturday.  Only those who have keys and the alarm code and some other reason for being there would be there (I suppose someone could break in, but there are cameras and alarms in place, and there is nothing of any significant net value, i.e., value worth risking the alarms and cameras to reach, inside).

Next Friday won’t be as good from a strictly work/not work point of view, but at least it will be Friday the 13th again, for the second month in a row.  Then we will have to wait an average* of 7 years for it to happen that way again.

***

Okay, I guess I’ve always known that I’m weird, but I just wrote a series of footnotes about Friday the 13th and year lengths and weekly cycle recurrences that dwarfed what I had written so far in the main body of this post.  I think I’m probably the only being in the universe that would write about such things and imagine that anyone else would be interested.

Yeah, definitely weird.

Still, I guess that sort of thing just happens when you talk to yourself in print and share it with any interested parties who might stumble upon it.  Also, when one is without companions or interactions one can, like Melkor, develop thoughts and thought patterns that are unlike those of one’s brethren.

I suppose that can sometimes be a good thing, though it can also sometimes be a very bad thing (rarely as bad as in the fictional Melkor case).  Though all improvement is change, most change is not an improvement‒at least not if it’s not deliberate and directed change.  So if one develops thoughts that are significantly divergent from those of all of one’s peers, odds are that they will not be a net improvement over most of the peer-born thoughts.

I have, of course, mitigated this somewhat by reading a lot (and consuming other media that deal with science and mathematics and philosophy and such, as well as comedy panel shows).  That’s not randomly chosen reading, either; it’s carefully chosen reading.  I think this has helped improve the general content and tendencies of my thought, because I’ve influenced myself with the carefully thought-out thoughts of very bright people.

I suppose, though, that if one can read what one wants and does so, one is not really isolated from all other thoughts, so one’s own cannot be too very different, or at least are not very likely to be.  That’s good, I think.  Simply developing new thoughts without much input from others would be most likely to lead to some sort of feral state or something akin to schizophrenia.  

So, I guess it can be good to take tangents in one’s thinking, as long as they are not too many and too extreme.  But even given that, it’s clearly useful to have someone to rein one in, if one can, when one goes too far off the rails (yes, that’s a bad metaphor, since a train going off the rails at all is in huge trouble, rails representing a near-binary situation‒if one is a train and one is not on the rails completely, one has experienced a failure of locomotion).

Well, I guess that’s that for this week.  Actually, I suppose that is always that, by some principle of identity or self-reflection or something; I’m sure there’s an “official” name.  “It is what it is” as they say.  What I mean, though, is that I am drawing this post, and this week of posts, to a close now.

I hope you have a very good weekend.

After that I don’t give a shit.

(I’m kidding.)


*I know, I know, we won’t have to wait an average number of anything.  There is a specific and exact number of years before the next time February and March have Fridays the 13th, but I cannot be arsed to work it out just now**.

**Okay, well, since I am unable to keep myself from thinking about it at least a little, I think it’s going to be 6 years from now.  That’s because each regular year is 1 day longer than a multiple of a week:  365/7 is 52 with a remainder of 1, so one day longer than an even number of weekdays.  So next February should have the 13th on a Saturday, then a Sunday the following year, but then on a Tuesday the year after that because of the leap year (366/7 is 52 with remainder 2).  Then it will be Wednesday, then Thursday, then Friday.  So 6 years, if my figuring is correct***.

***If it seems counterintuitive that it’s 6 years when the average should be 7, remember that while in this case the leap year makes the next instance come faster, there will be occasional years when Thursday the 13th falls on a leap year and the following year will go straight to Saturday the 13th, the first of another six years (I think) that will be needed for the subsequent Friday the 13th in February.  In any case, 6 plus (6 x ⅙) equals 7, as does 6 + (a x 1/a) no matter what a is****.

****This doesn’t factor in those leap years in which February has a Friday the 13th, but March will not.  That may change the overall calculations somewhat regarding the average time between dual Fridays the 13th, but not the calculations about when the next one will be.
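For the suspicious, the whole chain of shower arithmetic above can be double-checked with a few lines of Python (a quick sketch using just the standard library):

from datetime import date

# Weekday of February 13 for the next dozen years, to confirm the hand
# count above (leap years nudge the progression by an extra day).
for y in range(2026, 2038):
    print(y, date(y, 2, 13).strftime("%A"))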

If you can look into the seeds of time, and blog which grain will grow and which will not

Hello, and also, good morning.

What to write about, what to write about‒that is the question today.  Of course, “to be or not to be” is always the question as well, as was recognized by Camus in The Myth of Sisyphus.  If I recall, he arrives at the conclusion that the titular rock-rolling protagonist must be “happy” despite the patent and constant pointlessness and absurdity of his existence.

That goes along with the whole recognition of the absurdity of life itself that is central to the existentialism movement.  Still, it’s hard for me to “imagine Sisyphus happy”, unless he was a true Bodhisattva or had been thoroughly lobotomized by Zeus (or whoever it was that had doomed him to his…well, his doom).

It can help, I guess, to think about the vast scale of the cosmos in space and time (and any other dimensionality that might apply) and also about the incredibly minute scale of the cosmos, the fundamental quantum fields (and whatever gravity ultimately is) interacting from the Planck scale on up.  It helps keep things in perspective.

Of course, even given the scales of the cosmos*, there’s another notion, almost Buddhist/Taoist in flavor:  each individual‒each particle, even‒always exists at the nexus of two “light cones”, existing in an ever-moving now.  These are 4-dimensional cones, by the way, but it’s okay to reduce things by one dimension if you will.  It makes them easier to visualize.

Your (or anyone’s) past light cone is the outer boundary of all influences that can possibly have had any effect upon you at the present moment‒those influences that could have reached you at the speed of light or more slowly.  Similarly, one’s future light cone encompasses all those things that could possibly be influenced by things at the present location at or below the speed of light.

Any motion within the light cones‒the only motion that anything within spacetime can execute, as far as we know‒is called timelike motion.  Any motion that would require going outside a light cone is considered “spacelike” motion, and is not allowed by relativity.  This is not merely because of the speed of light; it’s because the speed of light is defined by the speed of causality.  Causes cannot travel faster or have effects beyond the speed of causality.  This is a bit tautological, I know, but it nevertheless simply must be true.
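To make the distinction concrete, here’s a minimal sketch in Python (my own toy example, flattened to one spatial dimension):  the sign of the Minkowski interval (c·Δt)² − (Δx)² tells you whether two events are timelike, lightlike, or spacelike separated.

C = 299_792_458.0  # speed of light in meters per second (exact, by definition)

def classify(dt_seconds, dx_meters):
    # The Minkowski interval s2 = (c*dt)^2 - dx^2.  Positive means the second
    # event is inside the first one's light cone (timelike); zero means on the
    # cone itself (lightlike); negative means outside it (spacelike).
    s2 = (C * dt_seconds) ** 2 - dx_meters ** 2
    if s2 > 0:
        return "timelike: causal contact is possible"
    if s2 == 0:
        return "lightlike: connected only at light speed"
    return "spacelike: no signal can link these events"

print(classify(1.0, 1.0e8))  # light covers ~3e8 m in a second, so timelike
print(classify(1.0, C))      # exactly on the cone
print(classify(1.0, 1.0e9))  # too far apart for even light: spacelike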

So each individual’s experience, each individual process, sits at the moving balance point of a future light cone and a past light cone, crossing at the moving present, tracing out a “timelike” path in spacetime.  Of course, individual creatures are not individual particles, and so their overall spacetime path would resemble the final line produced by a sketcher going over and over a particular path to make the curve the artist desires.

If one could look at the structure of a human in spacetime, like the Tralfamadorians of Slaughterhouse Five, but one could also trace even the spacetime paths of individual “particles”**, a human life would be a sort of higher-dimensional braid in spacetime, surrounded by a haze of incoming and outgoing quantum entities, most of which will be locally bound and interacting, and so will be moving at a net velocity lower than the speed of light.

I’m assuming you don’t eat your food or drink your water or breathe your air or (shudder) sweat or excrete at near light speed.

Imagine what the inside of a mere proton or neutron might look like if one were able to see it as a rendered, four-dimensional model in fine detail!  If you think it wouldn’t be that interesting because it’s so wee, think again.

Remember, only the tiniest fraction of the “rest mass” of a nucleon comes from the mass of the three “net” quarks in it (two up, one down or two down, one up depending on whether it’s a proton or neutron).  Almost all the rest of its mass is the energy of the interactions between these three quarks:  all the gluons exchanged, all the virtual quark/anti-quark pairs popping into existence, mediated by that famous strong force and its weird*** “asymptotic freedom”.
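To put rough numbers on that (ballpark figures from memory, so treat them as approximate rather than gospel):

m_up, m_down, m_proton = 2.2, 4.7, 938.3  # all in MeV/c^2 ("current quark" masses)

quark_sum = 2 * m_up + m_down  # a proton is two ups and a down, net
print(f"valence quark masses: {quark_sum:.1f} MeV,"
      f" about {quark_sum / m_proton:.1%} of the proton's mass")
# Roughly 1%; essentially all the rest is interaction energy, not "stuff".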

Bringing this back around, I guess my point was merely to note that everyone and everything is pointless from the perspective of the laws of nature and the spacetime scale of the cosmos, but when you learn about those things‒the cosmos at large and small levels‒you are at least familiarizing yourself with those vast workings, and you are in a sense taking part of them into yourself.  That’s kind of a cool thought.

But don’t take too much of it into yourself!  For, much as would happen to someone who stuffed all the information about Graham’s number into one head, if you do you will become a black hole.  Now, it may be possible to survive becoming a black hole, but I don’t recommend betting on that pony.

TTFN


*I wrote a post on Iterations of Zero about how it might be useful for people to consider the cosmic perspective as contrasting with their prosaic concerns.  I don’t remember how good it was, but here’s the link, in case you want to read it and give any feedback you like.

**I use this word for want of a better term that everyone would recognize and that would be succinct.  I think we need such a different term, because a lot of the perceived so-called weirdness and mystery of quantum mechanics comes from trying to use inaccurate terms that originated in times before we understood things as well as we now do.  Quanta are not little “particles” that sometimes act like waves, nor are they little waves that sometimes act like particles (though that’s slightly more accurate).  They are entities unto themselves, and the ways they behave are all always consistent with that nature.  They don’t sometimes act like one thing and at other times act like another.  They all, always, act like what they are.

***Except it’s not weird, really.  Those of us who are surprised by it?  We are the weird ones.  Quantum chromodynamics has always done exactly what it still does, since long before any life at all existed in this universe.  To quote Yudkowsky again, “Since the beginning not one unusual thing has ever happened.”

Nihil vere refert. Quisque videre potest. Nihil vere refert. Nihil vere mihi refert.

Well, I did warn you yesterday that I would be writing a blog post today*.  Go ahead, take a look.

Yesterday’s post was another of my recent, deliberately benign blog posts, not dwelling on my mental health and chronic pain issues, because nobody gives a shit about those things, or at least they don’t want to have to hear about them, because they’re not going to (be able to) do anything about them, and that makes them feel guilty and uncomfortable, which is unpleasantly awkward.

So, anyway, it’s the last day of February in 2026.  We are, in a certain sense, one sixth of the way through the year.

I say “in a certain sense” because it’s not precisely true.  Today is the (31 + 28)th day of the year, so the 59th day of the year.  If that were literally a sixth of the way through the year, the year would only be 354 days long.

It’s somewhat interesting to note here that, because February is shorter than every other month, the first two months of the year are shorter than any subsequent, nonoverlapping** pair of months.  And, let’s see, the first three months of the year have 90 days exactly in non-leap years, whereas April through June have 91, July through September have 92, and October through December also have 92.  So, all the later groups of three months have more days than the first three‒except in leap years, when January through March is 91 days.

Evidently, though, the latter six months of the year always have more days than the first six.  I wonder why they did it that way.  Was there an actual reason or did it just sort of happen?

Of course, I know they can’t be equal except on a leap year, since the number of days in a year is odd.  But why couldn’t they have come up with a way that made the years alternate, with one year‒the odd years perhaps‒having the surplus in the first 6 months and the other years having it in the last 6 months?  On leap years they could be equal.

How might that work?  We need 182 days in each set of six months, which means we need four months that have just 30 days and two that have 31.  We could say January and February have 30, March has 31, and then repeat with April, May, and June and then July, August, and September***.  I was about to suggest that in odd years we make January have 31 days and in even years we make July have 31 days, but all leap years are even years, so the latter half would be comparatively short-changed with respect to years in which they are longer, if we add the leap year day to the first half as we do now.

On the other hand, we could always put the leap day in the second half of the year, perhaps in November, or even more sensibly in December:  we would thereby add our extra day to the very end of the year, rather than squeezing it into the earlier part of the year like someone cutting into a line.  That would make the second half two days longer than the first in leap years, though, which is unpleasantly asymmetrical in a year with an even number of days.
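Just for fun, here is that musing rendered as a little Python sketch (entirely hypothetical, of course, with the leap day parked at the end of December as suggested):

def month_lengths(year):
    # The hypothetical 30/30/31 calendar:  four identical quarters of 91 days,
    # an extra day for January in odd years or July in even years, and a
    # Gregorian-style leap day tacked onto the end of December.
    months = [30, 30, 31] * 4  # 364 days before the adjustments
    if year % 2 == 1:
        months[0] += 1    # odd years: January gets its 31st day
    else:
        months[6] += 1    # even years: July gets it instead
    if year % 4 == 0 and (year % 100 != 0 or year % 400 == 0):
        months[11] += 1   # leap day at the very end of the year
    return months

print(sum(month_lengths(2025)), sum(month_lengths(2026)), sum(month_lengths(2028)))
# prints 365 365 366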

Of course, really, all days are fungible.  I remember seeing on QI once that apparently some sect maintained that they added an extra day not at the end of February but in the middle; I don’t recall precisely where they thought the day was being inserted, alas, but I can imagine some alternative, anatomical suggestions I’d like to make for them.

Days of a month are fungible (dammit!).  It makes no more sense to say that you added a day into the middle of February and pushed subsequent days later than it does to say that you deposited $100 into your bank account right after the 256th dollar that was already there, pushing what had been dollars 257 through 356 to become dollars 357 through 456.  Every dollar is just “a dollar”, every cent is just “a cent”.  It’s rather reminiscent of the way every electron is interchangeable with every other electron (likewise for all other elementary “particles”).

So, on leap years, the extra day of the year is and can only be (in our current system) the 29th of February, because that’s the day-label that isn’t there in other years.

You’re allowed to imagine if you like that you’re adding a day to the middle of the month and pushing the other days back and renaming them.  You’re also free to argue about how many angels can dance on the head of a pin, or to debate, without first agreeing on word usages****, whether unattended trees that fall in forests make noises.  That doesn’t mean you’re doing anything that has any bearing on the real world.

Okay, well, that’s been much ado about nothing, hasn’t it?  Or, multum strepitus de nihilo fuit, as is apparently the way to say it in Latin, which almost always sounds fancier, though it doesn’t always sound better aesthetically (consider the above headline’s Latin versus the original English).  English is‒or can be‒quite a beautiful language if you take a step back and see it as if from outside.  It can be hard to distinguish that beauty “from within”, though, because the meanings and usages of the words involved can distract from their inherent loveliness.

Tolkien, for instance, wrote that he thought the most beautiful sounding phrase in English was “cellar door”.  I’m not sure I agree with him on this, but it’s a matter of taste, so there’s no slight, or “diss” or “shade”, involved in our not both liking the same thing.

Enough nonsense for now, or at least enough nonsense here in this blog for now.  I’m sure that there is plenty of nonsense to be had elsewhere.  Do try to find some that’s enjoyable for you this weekend.


*That was unless I was lucky enough to get very sick or very injured or to die, which I have apparently not been lucky enough to do by this time.

**I say “nonoverlapping” because February and March combined contain the same number of days as January and February combined.

***I think in the final three months it should be October that always has 31 days, because Halloween really should fall on a day that’s a prime number, not a 30th or a 1st.

****Most such debates tend to devolve into discussions about the “definition” of the word “noise”, as if that were concrete and singular and fixed‒which it is not‒rather than the laws of physics and biology that constrain all the actual events of such an arboreal catastrophe.

Are gravity and frivolity truly opposites?

It’s Wednesday morning (not quite five o’clock yet) and it is February 25th.  There are only ten more shopping months until Newtonmas*.

For those of you who don’t know (and as a reminder for those of you who do know) Isaac Newton was born on December 25th, 1642 (AD**).  Now, there is a parenthetical here:  Newton was born on December 25th by the Julian*** calendar, which was the one used in England at the time of his birth.  By the Gregorian**** calendar, Newton would have been born in early January of 1643.

This might seem to imply that December 25th nowadays shouldn’t be considered Newtonmas, but of course, it’s a closer fit than celebrating the birth of Jesus on that day; supposedly, biblical scholars have found that Jesus was probably born in the summer or something.  As with many things, “The Church” appropriated the popular holidays celebrating the winter solstice and grafted Christian religious significance onto them.

There’s nothing particularly bad about that.  All these holidays and divisions of the year are fairly arbitrary (though celebrating solstices and equinoxes is common enough in multiple cultures, which makes sense because these are objective events in any given year that can be noticed by any culture that is paying attention).

The length of a year is a concrete, empirical fact, as is the length of a day and the length of a lunar orbit around the Earth.  None of them are straightforward multiples of each other, unfortunately‒they are waves that are not harmonically associated with each other.

I don’t know how long it would take for their “waves” to come back into some primordial alignment and “start over”, but it’s probably moot, because the length of a day and of a lunar orbit and of the orbit of the Earth are changing slowly.  The moon, for instance, is moving steadily (but very slowly) away from the Earth over time, and so its time of orbit is increasing (since things that orbit farther away orbit more slowly).

I think Kepler’s third law was/is that the period of a planet’s orbit around the sun is proportional to the 3/2 power of the length of the semimajor axis of its orbit.  I’m not sure if that exact power holds up on the scale of, say, the lunar orbit, but the laws of gravity are as universal as anything we know.  Indeed, there are materials that are opaque to light, but as far as we know, there are none that are opaque to gravity.  Gravity is nevertheless constrained by the geometry of spacetime, so orbital periods always lengthen faster than the distance of the orbiting mass from the center increases (that 3/2 power again).
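A quick numerical check suggests the law does hold at lunar scale, with the Earth’s gravitational parameter standing in for the sun’s.  Here’s a sketch in Python, with textbook values quoted from memory, so treat the numbers as approximate:

import math

G_M_EARTH = 3.986e14  # Earth's gravitational parameter GM, in m^3/s^2
a_moon = 3.844e8      # semimajor axis of the lunar orbit, in meters

# Kepler's third law in Newtonian form: T = 2 * pi * sqrt(a^3 / GM)
T = 2 * math.pi * math.sqrt(a_moon ** 3 / G_M_EARTH)
print(f"predicted lunar period: {T / 86400:.1f} days")
# About 27.4 days, close to the observed sidereal month of ~27.3 days.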

The inability of anything we know of to block gravity is one thing that makes me take seriously the notion that, at some level, there could be more than three spatial dimensions.  If gravity is not confined to three dimensions then nothing that is so confined could stop it; it would merely flow around any obstacle (maybe gravitational waves, for instance, can even diffract around matter and energy, though that might not imply higher dimensions).

This is related, indirectly, to the fact that it is impossible to tie a knot in a string in 4 or higher spatial dimensions.

By the way, having those extra spatial dimensions curled up tiny, as is usually presented in depictions of the notions of string theory, is not the only way for them to exist and be undetected.  If most of the forces in the world we know‒the electromagnetic, the strong force, the weak force, and the various matter-related quantum fields‒are constrained to a 3-brane because their strings are “open-ended”, then we could live in such a 3-brane, with all those forces and their associated matter confined to it, nested in a higher-dimensional “bulk”.  Gravity could be conveyed by a “looped” (closed) string, which could pass through the 3-brane, interacting with it but not being confined to it.  This could also explain the comparative weakness of the gravitational force, and might even explain dark matter (and why it is so difficult to detect).

This sounds extremely promising, maybe, but there are issues and hurdles, not the least that strings and higher spatial dimensions are very difficult to detect, if they exist.  Also, it’s very hard to pin down all the implications mathematically in a useful way.

I remember one lunch break when I was still in medical practice when I tried to see if I could work out mathematically if “dark matter” could be explained by a relatively nearby, parallel brane-universe (it would probably be more than one, but one was difficult enough) whose gravity spills over into and overlaps the gravity of our brane-universe.

Here’s a sort of reproduction of some of the scribbling I did then:

Unfortunately, though I could visualize what I was considering and get an intuitive feel for what the math would be like, my precise mathematical skills were just not up to the task of sorting it out rigorously.  Also, of course, lunch was not long enough, and I had many other things on my mind.  Anyway, findings like the “bullet cluster” provide some fairly strong evidence that “dark matter” is something physical within our three dimensions of space.

Okay, that’s enough for today.  I’ve managed not to talk about my depression and stress and self-destructive urges/wishes (except just now, of course), so I hope you’re pleased to have had those things cloaked from you today.

Take care.


*Working out the exact number of days, I figure it comes to 303.  December 25th is 7 days before New Year’s Day, so it’s day number 359 in the (non-leap) year.  And today is the 25th day of the second month, and January has 31 days, so today is day 56 of the year.  And, of course, 359 – 56 = 303.
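(A two-line sanity check with Python’s datetime module, in case my mental arithmetic slips, as it is sometimes wont to do:)

from datetime import date

# The same count done with actual dates rather than in my head:
print((date(2026, 12, 25) - date(2026, 2, 25)).days)  # prints 303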

**Why not my usual “AD or CE?”  Because at the time, in England, it was just “anno domini”.

***Named for Julius Caesar, though as far as I know, he had no more to do with actually formulating that calendar than he had with the invention of the 7th month.  As far as we know, he wasn’t even born by the then-existing version of Caesarean section, which was more or less always fatal to the mother, and his mother lived well beyond his birth.

****Named after Pope Gregory XIII, also known (by me) as Pope Gregory Peccary*****.  He did not formulate the newer calendar, but supposedly he at least commissioned the Vatican astronomers to create it when it had become obvious that the Julian calendar was not quite tracking the actual year but was overshooting over a long period of time.  So, the Gregorian calendar is better named than the Julian calendar, or so it seems to me.

*****The nocturnal, gregarious wild swine.

A notification of whatever

I expect this post to be brief today, though I’ve been known to be wrong about that sort of thing.  I had sort of “intended” to make my headline “Oh, well, whatever…” and then make the entire body of the post “…never mind.”  Thus I would be quoting the last verse-line of Smells Like Teen Spirit by Nirvana.  The subsequent words in the song are just the chorus and then a refrain of “A denial” repeated nine times (if memory serves).

I wasn’t sure I hadn’t already done this before, though.  I could have checked, but I didn’t have the mental energy.

Still, using that last line from a Kurt Cobain song carries a certain subtext which would have served my purposes well.

Or, well, actually, given past history, it probably wouldn’t have served my purposes at all.  None of this sort of thing seems to serve my purpose, no matter what I do.  As far as I can tell, only one person actually read my (admittedly somewhat long) post yesterday, but though I was borderline explicit about my meaning, I don’t think it did any good whatsoever.  That’s not unusual, of course; much if not all that I do never ends up doing me much good.

Sometimes I have to be subtle because I cannot force myself to be open about my internal states after a lifetime of fighting to appear “normal”, to the degree I can achieve that, and to avoid being too much trouble for other people, since I don’t think I have the right to trouble them, and in fact I think (or feel) that I’m fundamentally reprehensible.

I shouldn’t worry, though.  The times I am more open and obvious‒even when I am borderline explicit‒don’t appear to be any more successful than when I am at my most cryptic.  Possibly, I am just not able to communicate my feelings effectively with humans.

At the very least, my success rate must be below one percent.  It’s not quite as bad as playing the lottery, but it’s pretty pathetic.  Then again, so am I.

Whatever.  Never mind.  Ha ha.

But really, though, I don’t have much to say.  Quoting iconic songs may be the extent of my capacity to convey myself.

Ironically, I don’t feel the urge to share quotes from my own songs (or my fiction).  You would think they would be the best choice for conveying my inner thoughts.  That’s not always the case, though.

In fact, though I like my songs well enough, and Breaking Me Down is meant to be fairly explicitly about depression (at least my species thereof), none of them have enough oomph, as it were.  Or maybe it’s just that they are not well known*, so no one recognizes and identifies with the words.

I think I have some pretty good lines in Come Back Again, including what’s probably my favorite:

“Only meeting strangers

always losing friends.

Every new beginning

always ends.”

It may seem a bit bleak, but it’s also true more or less by definition.  If you’re meeting someone for the first time, they had been a stranger until that point.  And friends do become “lost”.  And the next two lines are rather obviously true.

Of course, a very good signing (singing?) off quote would be from Pink Floyd’s Time:  “The time is gone, the song is over, thought I’d something more to say.”

I’ve always been annoyed that they added the little reprise of Breathe after that and made it officially part of the song, because those other two lines constitute a perfect song ending.  I always figured they didn’t want to make the song end on too much of a downer, so they threw in the reprise as part of that song instead of as a separate one.  Maybe they were unwittingly invoking a version of the peak-end rule I mentioned the other day.

Anyway, I have a locked and loaded draft of a blog post that already applies that couplet from Time, with the headline being the first half, continuing into the post which consists only of the second half of that quote, followed by the embedded “video” of the final song on the first album of The Wall.

That, of course, is still a draft, and has been waiting there for a while, because if I use it, it’s meant to be my final blog post, and practically my final anything.  So I wasn’t going to use it today.  Not quite.  But I’m close.  The Nirvana quote isn’t quite as final, but it is a warning, especially given the fate of the guy who wrote it.

Anyway, consider yourselves on notice.  On notice of what?

Figure it out.


*That’s an understatement, eh?

This is not an attention-grabbing headline

I’m writing this post on my smartphone, even though I brought my lapcom with me yesterday evening.  I did not use my lapcom for yesterday’s post, such as it was.  I didn’t even write that post in the morning yesterday, or at least, I didn’t write the “first draft” of it then.

By the end of the workday on Wednesday, I didn’t feel like I was going to want to write a blog post on Thursday.  So I went to the site directly and just wrote the “Hello and good morning,” and the “TTFN” and set it to publish later.

I already knew what title I was going to want to use for it.  I wanted to use Polonius’s dithering, meandering jabber about brevity being the soul of wit as a sort of left-handed self-compliment about my own brevity in that post, and because, in the original form, it would have made the headline longer than the post, which would be ironically funny, in principle.

Then, yesterday morning, I got the urge to put my little “insert here” bracketed bit in the post, the better to convey how disgruntled and disaffected and self-disgusted I (still) felt, as well as how tired.  It did sort of spoil the joke about the headline being longer than the post, of course.  At least the older joke about Polonius still holds water.  Then again, that joke was made by Shakespeare, so we shouldn’t be too surprised if it has serious legs (though this raises the question of how serious legs could possibly hold water).

One thing worth at least assessing this week might be whether there is an aesthetic difference between this post (for instance) and the posts I wrote earlier this week, on the lapcom.  Writing on the lapcom is quite different for me in many ways.

On the lapcom, I generally have to work to stop myself before a post, or whatever, gets too long.  Whereas on the smartphone, that isn’t as frequent a problem.  Not that I can’t yammer on and on even with the smartphone, of course.  Some might say all I ever do is yammer on and on.  But anyway, I can’t write as “effortlessly” on the smartphone as I can on a regular keyboard*.

Sorry, I’m retreading a lot of old ground here, which I guess is better than retreading a lot of old tires. I know how to tread on the ground; indeed, I cannot recall a time when I didn’t know how to do that kind of treading.  Whereas retreading a tire sounds like something that requires special skills and equipment, both of which I lack.

I don’t know, I’ve heard of “retread” tires, but I don’t know if such things still abound, or if they ever did.  It sounds vaguely like a bad idea, like such tires might be more prone to blowouts.  But latex is a finite resource, and there aren’t very good synthetic alternatives, so maybe there’s at least some cost/benefit tradeoff (or treadoff?) there.

Ugh.  With that last joke, I probably convinced at least some of my readers that, yes, the world would be better off if I were dead.  Actually, I say that as if it were conditional, but it’s not.  It would be more in line with reality to say “the world will be better off when I am dead”.

There’s a quote by which to be remembered, eh?

I cannot say whether I will be better off when dead.  It’s probably a nonsensical question.  When I am dead, I will not be anything at all, not better, not worse, not uglier.  What happens to virtual particles after they have annihilated?  Nothing, and less than nothing, for they truly no longer exist, and in some senses they never existed.  Indeed, as physics goes, they probably never do exist; they are a shorthand description of what happens in quantum fields when perturbances in the fields have effects that do not rise to the level of actual, true particle production.

Or so I am led to understand.

From another point of view, it is possible for something to improve, at least in a sense, by ending.  I’ve mentioned this before, but if the curve of a function‒perhaps a graph of the “quality of life” or one’s “wellbeing”, to say nothing of happiness‒is persistently negative, then returning to zero is a net gain.  It can be a huge net gain, in fact.  This is related to the origin of my own version of an old saying, which I use with tongue definitively in cheek:  The one who dies with the most debt wins.

Now, of course, the integral‒the area “under” that wellbeing curve‒would not be improved by the curve reverting to zero and stopping.  But at least that integral would not keep getting more and more negative over time.

Some might say, “well, the integral can become less negative over time, and might even become positive”.  This is, in principle, true.  And when one is younger enough, it’s relatively easier to tip the curve, and its integral, into positive territory.  But as the curve goes on, having been negative for a longer and longer time, it’s going to become ever harder to bring things to a net, overall positive integral, even if one could reliably make one’s curve positive (which one often simply cannot do).

Of course, the moment to moment experience (which is all the mind really gets) of an ascending curve could be pretty darn good, and might well be worth experiencing, even if it’s not enough to bring the integral into positive territory.  We are straying into the “peak-end” rule here, which was elicited regarding (among other things) colonoscopies but applies to much else in human experience.

Speaking of peak endings, I’ll mention in passing the curious fact that, no less than twice in the last week, the evening train service has been disrupted by someone either getting hit by or becoming ill next to the train.

Earlier this week, right by the station where I catch the train to go back to the house, there was a man who looked like he was probably homeless and had collapsed next to the train tracks not far from the station.  I saw him brought away, finally, on a stretcher.  He didn’t look physically injured‒certainly not in the ways I would expect someone who had actually been hit by a train to look‒but he did look cachectic, which is why I thought he might be homeless.

Then, last night’s commute was interrupted by what they call a “trespasser strike”, one that did not involve the train I rode but which always slows everything down.  I’m vaguely amused by the euphemism “trespasser strike”.  A “trespasser” here is a non-passenger who doesn’t work for the train company (or whatever) who is in the area adjacent to the tracks.  The “strike” part is probably self-explanatory.

I suppose it’s literally true, at least from a legal point of view, to call the person a trespasser.  But it’s amusing that the train people have to say something derogatory about a person hit by a train‒even if the person deliberately put themselves in harm’s way‒to sort of, I don’t know, assuage the company’s conscience.

But we are all trespassers, in at least some senses.  We are also, in other senses, all owners.  We are all innocent, and we are all, in some other senses, guilty.  “Every cop is a criminal and all the sinners saints.”  Above all, we are all very much just passing through, staying only a very short time.  We are all virtual particles.  Or you might say, we are all Iterations of Zero.

Have a good weekend.  I should not be writing a post tomorrow (in more than one sense).


*I wish I could honestly say that my use of a piano-style keyboard were as effortless, but I am terribly rusty with that, though I started learning it when I was 9, roughly 2 years before I got my first typewriter.