Probing train and work schedule inconsistencies and galaxy-scale “natural” selection

It’s midway through the week now—or it will be sometime today—and I don’t think I have anything intellectually interesting or challenging (or whatever) to write today like I did yesterday.  That’s probably a relief to most of my readers.  I don’t think those posts go over particularly well.

The train is supposed to be arriving on the proper side of the track, according to the tracker site, but we shall see.  It was also supposed to be here at 4:44 am, and it’s now two minutes behind that time, which was already one minute behind its programmed schedule.  Supposedly, there’s going to be some overall schedule change next week.  I hope it’s not too radical; I hate the notion of having to reset the whole system in my head.

Okay, well, this morning’s train arrived on the correct side, at least, though it was a total of six minutes late.  I know that’s not too bad—it certainly won’t change my day much—but it does boggle my mind how the very first train of the day can already be running behind schedule.  I mean, they promulgate the schedule themselves, so they know it in advance.  It’s the same every Monday through Friday.

Of course, I know that unexpected things happen that engender delays, but if the unexpected happens and causes delays nearly every day, nearly every time, then it’s not the unexpected that’s to blame.  It’s the organization’s planning and preparation, which are clearly inadequate and lead to too many things being unexpected that ought to be expected.

It’s a bit like what happens at the office.  There are people who are never there by the official time for work, and they keep being late because they face no consequences, not even embarrassment, for doing so.

I would be happy to offer some suggestions for such consequences.

Likewise with ordinary office maintenance.  I’ve announced and posted notes and signs repeatedly about, for instance, turning off the coffee pot (or brewing a new pot) if one drinks the last cup—the post-it note is literally at eye level just above the coffee maker.  But still, yesterday afternoon before I left, I had to shut off the coffee maker and put the pot in the sink to soak because someone left it on with less than a cup in it, and the residue baked into a crust of black, dehydrated coffee.

There are so many maddening things about the human world.

There are plenty of horrible things about the non-human world too, of course.  Nature does have its up-side, but it is also “red in tooth and claw” as the cliché says.  Darwin wasn’t crazy when he wrote that it is because of the war of nature, of famine and death, that we have the wonderful diversity of life and its beautiful and marvelous (and terrible) forms and functions.

The Buddhists were also right that suffering* is a key hallmark of life.  In any form of evolved life that I can seriously conceive, that’s going to need to be the case, since fear and pain are essential for staying alive in any world with competition for resources influencing survival and reproduction.  Genes that create bodies that don’t have pain and fear and disgust and so on don’t tend to get replicated nearly as much as genes that do, and when there is competition for scarce resources, ultimately such genes will fade away.

It seems possible, in principle, to design a life form—however loosely you want to use that term—that would not actually be capable of any kind of suffering, and if it were a stand-alone being or machine or what have you, it could very well continue to be that way, at least until it broke down.  But if it’s any kind of self-replicating “organism”, such as a Von Neumann probe or similar, there are inevitably** going to be slight errors in reproduction in each generation.  And that sets the stage for evolution via natural selection, even if it is the evolution of self-reproducing robot probes.

If there is differential survival and reproduction of variants, the ones that reproduce and/or survive better will come to dominate, even if there’s no inherent competitiveness between the probes.  If they go out into the galaxy in opposite directions, their evolution could diverge, and when and if they later encounter each other, they might have diverged enough to be in true competition for resources and/or space or what have you.
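Just for fun, that dynamic is easy to sketch as a toy program.  This is purely my own illustration (no real probe engineering here), with arbitrary made-up numbers: a population of replicators whose copies carry small random errors, plus a resource cap that forces differential survival.

```python
import random

# Toy illustration (my own, with arbitrary numbers): self-replicators
# whose copies carry small random errors, plus a resource cap.  Each
# probe has a "reliability" in [0, 1]: the chance that each of its two
# copy attempts succeeds.  Nobody fights anybody, yet more reliable
# replicators still come to dominate.

random.seed(42)

CAPACITY = 200    # scarce resources: at most this many probes survive
ERROR_SD = 0.02   # size of the random copying error per replication

def copy_with_error(reliability):
    """Return a child whose reliability is a slightly imperfect copy."""
    child = reliability + random.gauss(0.0, ERROR_SD)
    return min(max(child, 0.0), 1.0)

population = [0.5] * 50   # every probe starts out identical

for _generation in range(200):
    offspring = []
    for probe in population:
        # two copy attempts, each succeeding with probability = reliability
        successes = sum(random.random() < probe for _ in range(2))
        offspring.extend(copy_with_error(probe) for _ in range(successes))
    random.shuffle(offspring)
    population = offspring[:CAPACITY] or population  # cap, avoid extinction

mean_reliability = sum(population) / len(population)
print(f"mean reliability after selection: {mean_reliability:.2f}")
```

Run it and the mean reliability climbs well above its starting value of 0.5, even though the only “competition” is the cap on how many copies the available resources can support.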

Eventually, especially as easily obtainable resources are used up by earlier generations of such probes, the ones that develop a certain degree of aggressiveness relative to others might have an advantage.  Ones that came to recognize other probe “species” as handy, localized sources of material that are easier to use than mining planets and asteroids and whatnot might become a sort of predatory or parasitic species of probe relative to the more autotrophic ones.

There might then follow a vast Darwinian evolution by natural selection of numerous species of what used to be Von Neumann probes, originating initially just from one source, and becoming a galaxy-scale ecosystem of self-replicating robots, just as life on Earth is a planet-scale ecosystem of self-replicating robots.  And maybe there might evolve some manner of multi-“cellular” “life”, and even a higher-scale form of intelligent, or meta-intelligent, “life”, that might begin to think about exploring other galaxies, and making new forms of probes, perhaps, to do that.

I don’t know if the universe would be “habitable” long enough for any further steps to occur.  It depends how long the steps would take.  But at all levels, some manner of drives and urges inherent to the system would exist, and deprivation and damage and danger to those urges’ ends would also engender some form of what would be fear and disgust and pain.

Always.  World without end.  Amen.


*duhkha is the official Sanskrit word, apparently translated as everything from “suffering” to “unease” to “unsatisfactoriness”.

**By which I mean, it is literally impossible to copy any complex structure or information perfectly and repetitively without infinite precision and infinite checking and awareness, which is not achievable in reality, as far as anyone can tell.

Believing in “believing in” matters of empirical reality…or not

The other day, I was scrolling through The Website Formerly Known as Twitter—which I tend to do after sharing my blog posts there, since it seems the polite thing to do—and I saw a “tweet” or an “X-udate” or “X-cretion” or whatever one calls them now, that asked, “Do you believe in global warming?”

Such questions always seem bizarre to me.  It’s similar to the old, “Do you believe in UFOs?”  Though, with the latter, one can always snarkily reply, “Why, yes, I believe in unidentified flying objects.  I think people often see things in the sky that they cannot properly identify, especially if they are not experts and conditions are not ideal.”  But really, even that sarcastic response misses the point and can be misleading, so it’s best to be avoided.

The problem is, the question entails a kind of category error.  The reality of global warming—by which I assume the questioner means some form of anthropogenic climate change—is an empirical question.  It is a statement about reality itself, and is either true or false whether or not anyone even knows about it as a possibility, let alone “believes in it”.

It’s more reasonable to ask, “Do you believe that anthropogenic climate change—AKA global warming—is happening?”  That, at least, is a sensible question, when using the form of the word “belief” that means that, based on the evidence and reasoning one has available, one has arrived at the provisional conclusion that global warming is happening (or is not).

In using this term “belief”, one would usually imply that one is reasonably convinced, but open in principle to alternative explanations and counter arguments and new evidence—as one always should be in matters of empirical fact, at least if one is committed to always trying to make one’s map describe the territory as well as possible (to borrow a phrase from Eliezer Yudkowsky).

But when people say, “Do you believe in…” something, it doesn’t come across—to me at least—like a question about facts, but rather as a question about ideologies, about team membership, about religion, in a way.  It can be at least excusable and appropriate, if still rather nonsensical in my view, to ask someone if they believe in Santa Claus, or in Communism, or in God.  It doesn’t necessarily have anything to do with external reality other than the state of certain people’s minds, but at least it’s reasonably appropriate.

The absurdity of this conflation of “believing in” something with an assessment of a thing’s actual reality is pointed out well in Terry Pratchett’s delightful Discworld novels—in either Wyrd Sisters or Witches Abroad, if memory serves.  I don’t recall how the point comes up, but it relates to belief in the gods of Discworld.  The narration says that, of course, witches knew that the gods were real, they had dealings with them, they sometimes met them.  But that didn’t mean there was any call to go believing in them.  It would be like believing in the postman.

If someone were to ask me whether I think that climate change is real, and why I think whatever I think, I might reply that the general consensus of the world’s climate scientists—people who actually specialize in the area—seems to be that it is happening, and though their most specific predictions can be highly uncertain, as can all specific predictions in science beyond the realms of simple linear dynamics, most of them conclude that it is really happening.

I read a statement once that claimed that the percentage of climate scientists who are convinced that human-caused global warming is really happening is higher than the percentage of medical scientists who are convinced that smoking tobacco increases the risk of lung cancer.  I don’t know whether that statement is true, and I don’t recall the source—it sounds more like a rhetorical point than an actual argument, which makes me suspicious.  If it is true, it’s remarkable in more than one direction.

One can look up in journals the papers and the data that are being gathered and analyzed by climate scientists.  Google Scholar works nicely for searching out real, published scientific studies on almost any amenable topic.  One can also go to pre-print servers such as arXiv, to see papers that have not yet been peer reviewed.

If one is judicious, one can even find decent science news in less technical publications—phys.org seems to be pretty good—but mainstream reporting on such things is often unreliable and inconsistent, since after all mainstream media exist primarily to sell themselves, not necessarily to promulgate the most rigorous truth they can uncover.  Even Scientific American has turned into a twisted mockery of its former self.

I understand at least some of the physics behind the “greenhouse effect”, without which the Earth would be uninhabitably cold.  Visible light passes through the atmosphere without interacting much with the gases therein—which is why air is mostly transparent, other than the modest scattering of blue light that leads to the sky’s daytime color (and inversely to the color of sunsets).  But such relatively low-entropy, high frequency light is absorbed by the ground, then reemitted as higher entropy, lower frequency light, such as infrared, which is much more readily absorbed by molecules like CO2 and H2O and methane (CH4).  The reasons for this are quantum mechanical in nature, but the fact that it happens is basic physics that’s been well known since before anyone currently alive was born, as far as I know.

And so, these atmospheric gases heat up (and in turn heat up the other atmospheric gases) until the outer surface of the atmosphere is warm enough to radiate out as much energy as comes into the Earth.  Such is the nature of so-called black body radiation.

But for the outer atmosphere to be warm enough to do this, the middle atmosphere must be warmer, and the layer below warmer still, and so on, since outer layers radiate inward as well as outward.  The final radiating layer of the atmosphere will always end up just warm enough to radiate out as much energy as the Earth receives in light from the sun; if it were not, the Earth would keep getting warmer until a new equilibrium was reached.  Adding more greenhouse gases pushes that final radiating surface higher in the atmosphere, which means that, closer to the surface, things must be warmer.

Anyone who has dressed in layers in cold weather should understand this intuitively.
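For anyone who likes numbers, the energy balance itself is a one-line calculation.  Here’s a quick back-of-envelope sketch, using standard textbook values; treat it as an estimate, not as climate science:

```python
# Back-of-envelope energy balance (standard textbook values; just an
# estimate): the temperature the Earth's final radiating layer must
# reach to radiate away exactly the sunlight the planet absorbs,
# via the Stefan-Boltzmann law.

SOLAR_CONSTANT = 1361.0   # W/m^2 arriving at Earth's distance from the sun
ALBEDO = 0.30             # fraction of sunlight reflected, never absorbed
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

# Average absorbed power per square meter: a disk intercepts the light,
# but the whole sphere radiates, hence the factor of 4.
absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0

# Equilibrium: SIGMA * T^4 = absorbed, so T = (absorbed / SIGMA)^(1/4)
t_radiating = (absorbed / SIGMA) ** 0.25

print(f"effective radiating temperature: about {t_radiating:.0f} K")
```

That comes out around 255 K, roughly minus 18 degrees Celsius, while the actual mean surface temperature is about 288 K; the difference of roughly 33 K is the greenhouse warming described above.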

[By the way, there may be some slight imprecisions in my very quick summary above of the thermodynamics of atmospheric gases, so if any experts in the matter would like to make any corrections—especially if such corrections are truly substantive—please feel free to do so in the comments.]

There are other atmospheric effects that are even easier to understand at basic chemical levels, such as the fact that increasing CO2  concentration leads to increasing acidification of the oceans.  This is fairly straightforward chemistry—carbon dioxide, when dissolved in water, partially reacts to form a weak acid—“weak” meaning just that the hydrogen ions do not completely dissociate from the molecule H2CO3*.  This can be demonstrated easily by getting some pH paper (readily available at all high street pH paper shops), testing some neutral water (to confirm its baseline neutral pH) and then blowing through a straw into the water for a few minutes.  You can then check if the pH has dropped, which—if you are a typical mammalian creature from Earth—it will have done.
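One can even estimate the size of the pH drop with first-semester chemistry.  Here’s a rough sketch of the equilibrium (my own numbers, using standard textbook constants, and ignoring bicarbonate’s second dissociation, activity effects, and water’s autoionization, so take it as order-of-magnitude only):

```python
import math

# Rough equilibrium sketch (standard textbook constants; only the first
# dissociation of carbonic acid is considered, so this is an
# order-of-magnitude estimate): how much dissolved CO2 lowers the pH
# of otherwise pure, neutral water.

K_HENRY = 3.4e-2   # mol/(L*atm), CO2 solubility in water near 25 C
KA1 = 4.45e-7      # first acid dissociation constant of CO2(aq)/H2CO3

def ph_under_co2(p_co2_atm):
    """Approximate pH of pure water equilibrated with CO2 at this partial pressure."""
    co2_aq = K_HENRY * p_co2_atm          # Henry's law: dissolved CO2
    h_plus = math.sqrt(KA1 * co2_aq)      # weak-acid approximation
    return -math.log10(h_plus)

print(f"open air (~420 ppm CO2):  pH ~ {ph_under_co2(420e-6):.1f}")
print(f"exhaled breath (~4% CO2): pH ~ {ph_under_co2(0.04):.1f}")
```

It gives roughly pH 5.6 for water under ordinary air, which matches the textbook value for unpolluted rainwater, and somewhere around pH 4.6 under breath-level CO2, which is why the straw trick works.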

I think this experiment can also be done with phenolphthalein, which is wine-red when in a basic (alkaline) solution and clear when in an acidic environment.  You can do a sort of magic trick, turning “wine” into “water” with just your breath through a straw bubbling in a glass.  Don’t drink it, though.  I don’t think phenolphthalein is particularly dangerous, but I wouldn’t want to endorse someone imbibing it.

I’m not going to tell you my conclusions about the empirical fact of whether or not “global warming” is happening and how and why and all that.  You can explore the subject as a homework assignment (but don’t hand it in to me).  But I will tell you my conclusion, which is probably obvious, about “believing in” things.  I don’t believe in “global warming” nor in the lack thereof.  I don’t believe in Santa Claus.  I don’t believe in Capitalism or Communism or Socialism or Fascism or Scientism** or Antidisestablishmentarianism.  I don’t believe in the Tooth Fairy, and I don’t believe in life after love.

And I really don’t believe it’s useful or good or anything but an irrationality to “believe in” matters that involve claims about the nature of reality itself.  Reality is that which exists whether or not anyone believes in it—indeed, whether or not anyone exists to be capable of believing in it.  That’s why it’s reality, as opposed to fictions and ideologies and other abstract concepts of various kinds.

I know*** that Amazon delivery people exist.  That doesn’t mean there’s any call to go believing in them.


* H2O + CO2 ⇌ H2CO3 ⇌ H+ + HCO3−.  Something like that, anyway.

** Though I have more sympathy for Scientism than most “isms”.

***Not to a mathematical certainty, but to such a high degree that there’s no clear point in considering other possibilities, pending new evidence and/or arguments.

I blog you give me leave to go from hence

Hello and good morning.  It’s Thursday, and for the first time in three weeks, I’m writing a Thursday style blog post.  You can all start breathing again.

Yesterday’s blog post was kind of weird, wasn’t it?  I’m not even completely sure what I wrote.  I certainly haven’t reread it since editing it before finally posting it, but I feel I said a lot of strange things, and wrote about things I don’t know if I’ve ever talked about with anyone before.  Maybe I have.  I don’t think there was anything particularly shocking except that it was weird for me to say some of them.  Also, I feel it was more erratic and bizarre even than my usual posts.

It’s now the last day of November in 2023 (AD or CE).  That’s mildly momentous, or at minimum a mediocre milestone.  There shall be no recurrence of the month of November in 2023 (AD or CE) in any of our lives again, unless the ways we “define” the terms are changed.  Even if we had a time machine to come back to this day, we would not experience a new November in 2023 (AD or CE) if we were to return to it; we would be re-experiencing the same one, albeit from some different perspective.

I don’t know if returning to the same month would initiate some new Everettian branch of the universe, as in my short story Penal Colony, or if it would instead be some manner of closed timelike loop in spacetime, which always happens exactly the same way—since it only actually exists in one instantiation—even if you were to experience it more than once.  It might be like coming to a crossroad, going through the light, looping around a “cloverleaf” in the road, and coming back to the crossroad in the perpendicular direction, then going on forward.  There’s only one route; it just happens to cross itself.

And, of course, if you did a self-Oedipus and somehow killed yourself at the crossroad, it’s not as though you would be changing your future in any sense; that would “always” simply have been the way you died.  So, 12 Monkeys would be much more like the nature of such reality than, say, Back to the Future or Time Cop or that newer time travel movie with Bruce Willis that I haven’t seen.

I don’t know quite how I got on that subject.  My mind meanders morosely (and occasionally merrily), and I don’t necessarily know where it’s going.  As I noted above, sometimes I don’t even know where it’s been.

That’s why I never eat off of it, if I can help it.

One thing I’ve tentatively concluded after my thoughts from yesterday, though, is that I really am not capable of managing life in the human world.  I don’t think I ever have been; other people have helped me out in the past, and I have no such other people available now.

I have skills and tenacity and intelligence enough to survive for a time, and to create an illusion of “getting by” that’s convincing enough for people who aren’t really part of my life—which is everyone, these days—but everything is falling apart, and I don’t know how to maintain it, nor do I have the will and the wherewithal to do so.

You might as well ask a moth to maintain a termite mound.  Or even just ask an ant—maybe that’s a better comparison.  An ant could sort of get the idea of a termite mound, and if it’s already been built, the ant could sort of help maintain it to some degree for a bit.  But really, it’s not where the ant belongs, it’s not the lifestyle to which it is adapted.

Ask a human to try to live the life of an ostrich, among ostriches.  The human might put on an interesting show for a bit, and since humans are smarter than ostriches, the human might even succeed at things the ostriches couldn’t from time to time, but if the human is committed to living and behaving like an ostrich—if there are only ostriches anywhere to be found in that human’s environment—that human is inevitably, eventually going to fail catastrophically.  It may be a slow catastrophe.  Maybe it’s nothing anyone would make into and share as a video on YouTube or Instagram or TikTok.  But it would still be a catastrophe.  It would not be pleasant to experience.

Drawing closer to home, it would be hard enough for, say, a chimpanzee to try to live with and as orangutans or vice versa.  Even chimpanzees and bonobos—as closely related as primates get one to another—would probably not be able to thrive if one were placed within the other’s society.  I would guess that a bonobo would probably be abused and die before too long in the company of chimpanzees (who are notorious assholes) but a displaced chimpanzee would probably have just as confusing and frightening a time, if more subtle, trying to blend in with bonobos.  It would have a few slight advantages in strength and size, on average, and it might even be able to learn to try to fit in and make its way.  But it would be living a lifestyle subtly but profoundly different than the one to which it is adapted.

Anyway, that’s all a bit tangential and weird.  I don’t think I’m making myself very clear, and for that I apologize.  I just realize more and more that I don’t think I’m going to survive much longer, even if I were to find the motivation and desire to do so.  It’s a slow crash and burn, perhaps, but I think I really am crashing and burning.  And I don’t think that there was ever a chance for anything otherwise to happen, with me trying to live among and adapt to the world of humans—or normal humans, or “neurotypical” humans, if you prefer those metaphors.  So, what should I do?  I don’t know.

In the meantime, though, I hope you all are having and have had and will continue to have or (if that’s the best for which I can hope) that you begin to have a very good day and week and a very good new month starting tomorrow and so on.

TTFN

Hermit or magus

Don’t say I didn’t warn you…

…because I did warn you:  it’s Saturday, and I’m writing a blog post, because I’m on my way in to the office.

I think there was a brief moment in the middle of the day yesterday when the boss considered just keeping everything closed for the weekend, but then there were at least three people besides me and the “closer” who were planning to come in.  Since they are paid on commission (so extra work is an opportunity for them) I can’t feel too bad about having to come in, too.  As I wrote yesterday, it’s not as though I have anything better to do with my time.

Actually, today of all days, that’s not quite correct.  Today is the day of the first of the 60th anniversary Doctor Who specials, which I mentioned yesterday.  But since that is supposed to be streaming on Disney Plus, I can watch it from work (things are often slow-ish on Saturdays) just as easily as at the house, and I will probably be more comfortable at the office.  My desk chair is decent, whereas at the house I basically sit on the floor.

Now that we are past the main temptation holidays, at least until Christmas/New Years, I need to go back onto a stricter diet.  I find that my physical energy is much better when I’m controlling my input.  This might seem ironic, given that I’m restricting input of the most easily “usable” calories, but the biochemistry and physiology of this fact is entirely reasonable and well understood.

It does sometimes have a detrimental effect on my mood, decreasing my emotional energy somewhat—which I guess makes my sugar cravings/sweet tooth a bit akin to the addiction of someone who uses illicit drugs to “self-treat” an underlying mood disorder.  This shouldn’t be too surprising, since sugar triggers activity in the nucleus accumbens and related centers of the nervous system that is very similar to what cocaine and amphetamines do.

I also should just avoid alcohol—not because I have a big problem with it or anything, but because it doesn’t actually make me feel good, even in the moment, though I kind of expect it to; and by the time I realize, “Hey, this isn’t even helping me relax or making me feel good while it’s on board”, I’ve already bought myself some GI and neurological discomfort later.

Sorry, I know this is all boring.  In a way, though, everyone is boring to most other people, or at least not terribly interesting.  And many people who are apparently interesting to so many other people are actually astonishingly uninteresting to me.  For instance, though I recognize her talent and skill and brilliance, I have no particular interest in Taylor Swift’s career or music—except to recognize those stated attributes—and I certainly have no interest in her love life.  Yet, since I do follow the news fairly regularly, I cannot help but become aware of these things.

To be fair to her, she’s much more interesting than most celebrities*.

I suppose it’s a small price to pay for making sure that I get my news input from a variety of different sources to try to avoid bias—or, at least, to balance the biases against each other as much as I can.  I don’t generally like to take in commentary on news, so I avoid editorials; I can decide what I think about issues for myself once I have the data and don’t need pundits to tell me what they think I ought to think.  I’m only too aware of studies that have generally shown that such pundits’ predictions on various news events are no better than, and quite often significantly worse than, chance.

In other words, if you get your news from sources that editorialize, let alone from pundits, you’re actually worsening your likelihood of getting a good take on events in the world.  Why not just get a “magic 8-ball” and save yourself the trouble, while ironically improving your odds?

Mind you, there are people with expertise from whom I might be interested in hearing (or reading) their take on particular, narrow issues within their wheelhouse.  For science and related news, for instance, I go to a few specific science-related YouTube channels like Dr. Becky, and PBS Space Time, and Sabine Hossenfelder, and Sixty Symbols and Deep Sky Videos and Periodic Videos and Numberphile and Computerphile—those last five are all channels pioneered by Brady Haran, a remarkably intelligent and curious science and math journalist who gets experts to discuss science (and mathematical and computer-related) stories.  He asks very good questions.

I find that the mainstream media does just an unacceptably sloppy job at conveying science news, on average.  To be fair to them, the standard deviation of that sloppiness is pretty big, so some good work happens now and then, but it’s well into one small tail of the curve.  Sadly enough, even Scientific American has become a severe embarrassment to itself—and it’s even more embarrassing that the editors thereof don’t seem even to realize how embarrassing they’ve become.  I used to love that magazine, but it’s dead to me now.

Anyway, enough kvetching.  My train will be here soon, and I’ll be on my way to the office.  I hope to have at least one hour of today that is quite fun—the Doctor Who special—and I certainly always get some satisfaction from writing these posts, at least when it’s clear that people are reading them.  I hope you all have a good remainder of the weekend, and I will return Monday morning, barring the unforeseen.


*Even David Tennant, for instance, is mainly interesting only as the Doctor (or as Hamlet, or in one of his other roles).  Ditto for the other Doctors, and for musicians whose work I enjoy, and for writers I like to read, and even for scientists whose work I follow.  I guess it makes sense; people are most interesting when they’re doing what it is that they do that is exceptional.

Be thankful you’re not a simulation. Or are you?

I’m writing this on my phone for the first time in quite a while, seated in the rear of an Uber, on the way to the office.  This was something of a whim‒the phone writing, I mean, not the Uber.  The Uber was a carefully considered choice, and it is relatively cheap because of the hour at which I’m taking it.  It’s not something I would do on a regular basis, at least not for long.  Maybe if I finally give up and decide to die in short order I might just burn a lot of money on Ubers.  I doubt it, though.

No, the whim is deciding to write on the phone, since I have some down time in the back seat.  I could use my laptop, but that feels slightly weirder or more uncomfortable to me, though I’m not sure why that’s the case.  I could also just wait until I got to the office to start, because I’m going to be very early.

The reason for going to the office by Uber is that I made the mistake of ordering an Amazon “Try Before You Buy” article of clothing‒a somewhat expensive one.  It did not fit right.  But then I learned that Amazon doesn’t do a pickup to return items like that; you need to drop them at a Whole Foods or a UPS store or similar.

That was not clear to me when I was using the option, or I wouldn’t have done it.  I have no straightforward way to get to any of the above locations, and even to use Uber to get to one would require going during working hours.  I had to arrange for a UPS pickup, at my expense, but I had to set it up to happen at the office, because I won’t be at the house during the day for ten more days (at least on days UPS does such pickups) and that’s past the pickup time window for the “Try Before You Buy” system.

So, here I am, bringing a cumbersome, and not too light, package to the office with me so that UPS can pick it up between 9 and 6.  I never want to do this sort of thing again.  It was foolish of me to try a rather expensive article of clothing anyway, but I guess it was sort of an attempt to cheer myself up with an indulgence.

That sure misfired, didn’t it?

Speaking of cheering oneself up with indulgence‒or with the inability to do so‒tomorrow is Thanksgiving for my fellow United Statesians.  We don’t call this evening “Thanksgiving Eve”, which feels like a shame to me, but certainly people do start celebrating the holiday, in a sense, quite early.  I think many people take the whole week off work.

I, on the other hand, am not really going to be doing anything to celebrate.  The closest I might come is walking to a gas station not too far from the house, where they tend to have pretty decent pre-made turkey sandwiches with mildly cranberry-associated topping.  It’s not very impressive, nor is it terribly satisfying.  I’d feel much better, I think, if I were able simply to go to sleep tonight and sleep through until Friday morning.  As it is, I probably won’t be able to sleep or rest any more than usual, and that’s even counting my plan to take some Benadryl tonight.

I’m almost at the office, so I’ll take a brief pause here and resume after I arrive.  You may not notice the gap.

Did you notice it?  I’m guessing you probably recognize that it happened, but only because I told you that it was happening.  Like the scenes in a movie that’s been filmed over months and months, or the paragraphs of a long novel like my forced two-parter Unanimity that was written and edited over the course of more than a year, the final product may end up relatively seamless despite a long and discontinuous origin.

I’ve occasionally imagined that it might be possible (in principle, anyway) for our reality to be a simulation in which each moment‒maybe each Planck time‒in every location in space‒perhaps each cubic Planck length‒is prepared individually, one by one, then subsequent and nearby ones are calculated based on the laws of physics, and each next place and time is then updated piece by piece, one infinitesimal space and one instant of time at a time, as it were*.

The simulator could take a trillion years to calculate even one second of the spacetime in the visible universe, probably far longer.  But it wouldn’t really matter, necessarily**, how long it took, provided there was enough memory available to keep everything stored.  From the outside, the process of one human life (and its past and future light cones) might take a googol years to calculate, but from the inside point of view, for the human being “simulated”, time would just progress normally.

It doesn’t matter to the people in a video, for instance, if their video is viewed at 2x speed or .25x speed; for them it all happens the same way no matter what.  It doesn’t matter to the characters in a Studio Ghibli movie that their individual movie cels each took hours to be painstakingly drawn and painted, or if a Pixar character took even longer to be computer generated.  Their “experience” would pass at one frame per frame, or 24 frames per experienced “second” for them (at traditional movie frame rates).

Even if each second of the person’s life took a trillion eons to simulate, it would still be experienced just as a second for that person.
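If it helps to make that decoupling concrete, here’s a toy sketch (my own invention, and nothing to do with real physics, obviously): a one-dimensional cellular automaton whose update rule stands in for the “laws of physics”.  However long the simulator dawdles between updates in its own time, the computed history comes out bit-for-bit identical.

```python
# Toy illustration: a 1D cellular automaton (rule 110) evolved cell by cell.
# However long each update takes in the simulator's time, the sequence of
# simulated states (the automaton's internal "history") is exactly the same.

def step(cells, rule=110):
    """Compute the next row of cells from the current one (wrap-around edges)."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (center << 1) | right
        nxt.append((rule >> pattern) & 1)
    return nxt

def run(cells, steps, dawdle=0.0):
    """Evolve the automaton; 'dawdle' is wasted wall-clock time per step."""
    import time
    history = [cells]
    for _ in range(steps):
        time.sleep(dawdle)   # the simulator may take as long as it likes...
        cells = step(cells)
        history.append(cells)  # ...the simulated history is unaffected
    return history

start = [0] * 16
start[8] = 1
fast = run(start, 20, dawdle=0.0)
slow = run(start, 20, dawdle=0.001)
assert fast == slow  # same "universe", regardless of how long it took to compute
```

From inside the automaton, so to speak, there are only the twenty updates; the `dawdle` parameter, which is the whole point of the illustration, leaves no trace whatsoever in the history.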

A rather weird and possibly disquieting implication of this is that, if those simulating the person stopped doing it—perhaps they got bored, or had a power cut, or suffered a natural disaster or catastrophe in their meta-level universe—the simulation would just…stop.  It’s not that the people in the simulated universe would die in any conventional sense; certainly they would not die in the usual within-the-universe meaning of dying.  Nor would their universe “die” as if some cataclysm like a phase change in the vacuum energy occurred***.  It would just stop.

There would be no next moment, no next occurrence*****.  If someone were later to restart that simulation for whatever reason, even if it was ten to the thousand to the googol years later or more, the people within the simulation would experience no more difference between the before-pause and after-pause moments than between any other two moments in their existence.

But if the simulation were stopped and never restarted, with perhaps all associated memory erased…well, again, the inhabitants would not experience it in any possible, conceivable sense, any more than a video game character experiences the moments when and after you reset the game or the power goes out.  If you are a simulated existence, and the simulation is permanently stopped, you will not so much die as cease to have any manner of existence whatsoever.

Have a happy Thanksgiving.



*It’s interesting also to think of, for instance, two “people” starting to simulate such a universe from different points in space and time, and to wonder what would happen when they came together if their simulations did not mesh perfectly, like frost on a window-pane with multiple initial points of nucleation leading to a “fractured” pattern.  But that’s a different, if related, thought process.

**From the point of view of the “simulated” universe, anyway.  It’s hard to see anyone having the commitment or desire to bother actually carrying out such a laborious simulation; that would be quite a dreary task.

***This is a possible occurrence in an ordinary, physics-related sense.  If the “dark energy” is indeed the cosmological constant (called lambda, Λ, as in the ΛCDM model of cosmology) but is not at its lowest “vacuum state”, then it could spontaneously “tunnel” down to a lower, more stable set-point.  This would wipe out every particle in the current universe in a growing sphere, with its outer shell expanding at the speed of light.  Of course, that means that you could never, in principle, have any warning that it was happening, nor could you, even in principle, experience your destruction and that of everything else that exists.  This is not the same manner of cessation as what I discuss in the main body of the post—it is very much a within-simulation event, not a meta-level one—but it would still be just an instantaneous erasure of sorts, happening too fast to be experienced even in principle****.  There are many worse ways to die.  Indeed, almost all ways humans do die are much worse than this.

**** Presumably, quantum information would be conserved even in this catastrophe, whereas in a halted and erased simulation, that principle wouldn’t apply, at least within the simulation.  Whether it would apply to the process of simulating and then ceasing to do so would depend on the nature of the meta-level universe.

*****I suppose this is analogous to what will happen to everything in the universes of my stories Outlaw’s Mind and The Dark Fairy and the Desperado if I never finish those stories.

I almost forgot to give this a title

I seriously considered walking to the train station today, but after I finally arose—I’d been awake for hours, already—I realized that I just wasn’t up to doing it, physically.  Or maybe I wasn’t up to doing it, mentally.  In any case, it’s not as though there’s any actual difference or separation between the two things, despite the wishes of dualists* of many stripes throughout the ages.

I simply am this thing that is writing this, and it’s all instantiated in this body—though I store aspects of my persona and records of various things and highlights of information in external media, as people have done for quite some time to greater and lesser degrees.

In any case, I really don’t feel very well, and I don’t mean just my usual depression/dysthymia, though it may be related to those things.  Perhaps it’s just an exacerbation.  After all, dysthymia (now officially called persistent depressive disorder or some such boring name, because that’s what really matters, making sure that things have optimal names, right?) can episodically increase into a full blown episode of major depression.

Also, it’s that time of the year for those whose symptoms are affected by the seasons—in the northern hemisphere, at least—to feel the detriment of longer nights and shorter days (so to speak).  I am at least somewhat “seasonally affected”, though I’ve always loved autumn.  This may seem superficially contradictory, but in my youth, autumn was a time that brought birthdays and holidays and the start of school and all that good stuff that I liked.  Also, probably when I was quite young, I didn’t have any real evidence of depressive disorders to come, at least as far as I recall right now.  Although, if I do have ASD, it was present then.  There is some evidence in my recollections that it was.

In addition, of course—speaking of holidays—this is a rough time of year for people who are already depressed and who are also socially isolated**.  Thanksgiving is in two days, and that is a traditional, very positive and social family holiday, which I will not be celebrating again this year.  I will have the day off work, though—all the better to drive home the fact of being alone in a single room (with attached bath) and having no one with whom one shares life at pretty much any level.  Then of course, the Hanukkah season (and Christmas season) and New Year’s and all that is coming up—extremely family-and-friends-oriented holidays.  I again am not planning on trying to spend any of them with anyone else.

The weird irony is, when I imagine actually trying to spend holidays with other people—yes, even when I merely imagine it—I feel tremendous tension.  I guess it’s what one could call significant anxiety.  It’s a strange kind of…not exactly a contradiction, but a conflict, a tension of ideas.  I am depressed and gloomy when alone, which is my usual way to be, but I feel almost terrified at the thought of being around other people socially.

I particularly wouldn’t want to have a group of people just bring me into their celebrations of holidays just so that I could have someone with whom to celebrate.  It’s not that I dislike people I don’t know.  How could I dislike them if I don’t know them?  I just don’t feel a sense of camaraderie with most people; I don’t feel like a member of the same species.

The guy Paul Micallef, from the YouTube channel “Asperger’s from the Inside” (well, now it’s “Autism from the Inside”), made a great analogy that struck home with me about that kind of thing.  He said that, if he goes to a pond and sees a lot of ducklings playing around and swimming and all that, he might really think they were great and enjoy watching them, but it would never occur to him to try to join them in their pond.  That would make no sense.  He wouldn’t know how to act, they would be terrified of his presence, and he would never be able to fit in or enjoy trying to pretend to be like them, in any case.

I think it’s a really good analogy.  One doesn’t have to hate a group of people or even think them uninteresting not to feel that one has any business trying to join the group or attempting to act as if one were like them.

I don’t know what my species is.  Even though I find people like Paul more relatable than most, I still don’t really feel like I could connect even with the people in those communities.  The closest thing to my kind of person I’ve found online is probably Dave, from Dave’s Garage (his book was also very good and extremely relatable).  But I don’t think that he would find me very interesting, partly because our backgrounds are so dissimilar.  Anyway, he’s doing his thing and putting up nice educational videos about computers and stuff, and that’s good enough for me.

Actually, I don’t know that there’s anyone I might possibly want to spend time with who would truly want to spend time with me, except for family of course.  Even more so, I would not feel comfortable imposing myself upon anyone, even if I wanted to spend time with them and they were interested.  I’m just not selfish and cruel enough to inflict myself upon people I like.

I’m very tired and just utterly pointless—in the sense that I have no particular reason to do much of anything; I just have biological drives and habits, none of which provide any purpose or sense of satisfaction.

I have been thinking about using this month’s Audible credit to get Stephen King’s On Writing in audiobook format.  It’s read by King-sensei himself and his two author sons (Owen King and Joe Hill).  I’ve read the print version before, of course—more than once—and it was certainly inspiring in its way.  Stephen King’s nonfiction is sometimes even better than his fiction.  His style and substance and personality are quite engaging.  So, maybe if I get that audiobook, I’ll listen to it, and maybe just feel inspired to start writing fiction again.

Possibly, it’s worth a try.  If it doesn’t work, well, I don’t know what will happen.  That’s not new, though.  No one knows the specifics of the future in other than trivial senses until it happens.  And then it’s no longer the future.  We’re falling through time, in that sense, facing backwards, only seeing where we’re going once we’re past it.

It seems like a weird way to run things, but of course, it’s the only way that makes sense, given that complexity and life and memory are all driven by processes that harness increasing entropy.  And being fairly close to the surface of an extremely low-entropy state in space-time (AKA “The Big Bang”) explains why things like life and mind exist at all.  You wouldn’t see stalactites and stalagmites form in a place without a local strong gravity differential providing a sensible “up” and “down”, and you wouldn’t see life or consciousness forming in a spacetime with already uniform entropy, thus leaving no local “past” or “future”.

All right, let’s stop before I go off on a tangent, even a sine or a secant.  Have a good day.


*Not to be confused with “duelists”, a group or set that could certainly overlap with dualists, but need not do so, and which is defined by quite unrelated characteristics.

**Not in the sense of avoiding spreading disease, but just in general lack of social contacts or supports.  I am very “challenged” in that area.

“And I find it kind of funny, I find it kind of sad…”

The madness continues, or begins again, as the beginning of a new work week occurs.  “What madness is that?” you ask?  I mean the madness of bothering to stay alive, the madness of continuing to do things that are absolutely pointless and irrelevant even in the moment, let alone in the long term history of the cosmos.  I mean the madness of trying to pretend to be cheerful or positive in any way, to try to be polite or engaged or interested in anything around me.

That madness, and other forms related and/or similar to it, is the sort of madness I mean.

I guess I really would have to say that the madness “continues” rather than that it begins again.  It’s not as though it has ever stopped or paused.  It simply takes a different form over the weekend, when there is less to do.  But there is no more real sanity involved in any of my activities even when I’m not commuting to the office and back.  I’m just less constrained to try to seem vaguely pseudo-normal, or at least vaguely pseudo-tolerable, when I’m by myself in my room.

I should look up a thorough etymology of the word “madness” or “mad”.  I know that it has morphed, to at least some degree, into a modern synonym of “angry”, but the older meaning of “lack of sanity” or “extreme agitation” of other types still persists at least a bit.  And it’s better than “insanity” in my opinion.

Madness has a certain poetic quality to it that “insanity”, which is really a legal term, does not have.  Insanity, whether by design or just by customary use, carries the impression of a loss of previously existing “sanity”.  I’ve introduced my term “unsane” before, but I don’t know if it’s going to catch on.  At least, though, it conveys the notion, potentially, of situations or people or beings to whom or to which the very concept of sanity doesn’t apply.

But of course, as I noted, insanity (and sanity) is a legal term that applies to assessing whether or not one can be held legally culpable for one’s actions.  As such, it can be fairly vague, and certainly it is not scientific.  There are quite a few forms of mental illness* that are truly debilitating and dangerous and can even be life-threatening, and are certainly immiserating, but which would not allow one to be found “not guilty by reason of insanity” if one committed a crime.

Mind you, all these notions, from laws to words to legal or even moral responsibilities, are simply inventions, creations, “fictions” produced by humans for various reasons—they are memes** and memeplexes that happened to survive and reproduce, so they carried on.  Often, though not always, such memes persist in the meme pool—i.e., culture—because they are useful to the organism(s) through which they propagate.  But they do not have any truly fundamental reality.  They are emergent things in a spontaneously self-assembled complex adaptive system that has no more intrinsic, inherent meaning than does a snowflake or a piece of rock candy—also, they are far less beautiful and/or tasty, though they have their charms.

Still, I’m sick of nearly all of it—mentally sick, physically sick.  I’m particularly sick of my part within it, largely because I don’t think I have much of a part within it.  Like the song says, I don’t belong here.  But, of course, the fact of not belonging in one place does not logically imply that one belongs somewhere else.  Even setting aside the fact that the term “belong” is fairly vague and protean, by any version of it but the very loosest one, it is entirely possible for an individual entity or being not truly to “belong” anywhere at all.

I certainly know that it’s possible to feel that one does not belong anywhere.

It’s vaguely reminiscent of the old Groucho Marx joke in which he said he would never join a club that would have him as a member.  It’s funny, but it’s also a good description of a dysfunctional state of mind—or at least an inefficacious frame of mind—such that a person feels that he or she is an outsider, and that any group that would welcome him or her is probably not the sort of group in which he or she could possibly feel comfortable.

It’s what happens when one looks online to find communities that purportedly have common difficulties or shared issues and which intend to provide mutual support, but one feels at least as alien and uncomfortable with the thought of these support groups as one does about any other group.

No-win situations are clearly possible in reality—the very concept of “winning” is another entirely artificial one, though it can be pertinent to the objective biological world in some circumstances—and when one is in one, it can be reasonable to try simply to accept that one cannot win, and therefore that one’s choice of how to escape the situation is arbitrary and so may as well be random, or whatever seems most attractive at the time.

Anyway, that’s enough bullshit from me for today.  I don’t know what point I’m trying to make, but that’s okay; there is no inherent point, no evident telos to the cosmos.  There is no purpose in which to lose myself, and there is no home to which I can return.  I’m certainly in no position to try to make a new home of any kind or to create some new purpose.  I wish I had just walked away a month ago today, as I’d hoped to do—it would have been a good day for it.  Or perhaps I should have done so a month before that; it would have been even better.

Oh, well.  The past cannot be changed, any more than the characters in a film can rewind their own reels and edit earlier frames to change their story.  If one were able to change past time, it would necessarily involve another level of time, some “higher” time in which a different kind of future and past existed, not constrained by the one within this world.  That’s conceivable, of course.  However, there’s no evidence that it exists.

But that’s a discussion for some other time.


*Yes, I prefer to call things “mental illness” when they impair the successful functioning of a person’s mind, to greater or lesser extent.  Referring to everything as “mental health” comes across as just weird a lot of the time.  “He struggles with mental health” is the sort of thing people sometimes seem to say, but that doesn’t make much sense.  Surely he struggles with his relative dearth of mental health.  Or is it meant that perhaps he dislikes mental health, which seems fairly pathological in and of itself, just as a person might want to sabotage that person’s own physical health?  Either tendency seems to be a case of mental illness, in the same sense that anything from an upper respiratory infection, to dysentery, to a heart attack, to vasculitis, and to cancer are all forms of “physical” illness, not physical health.

**In the original sense of the term, coined by Richard Dawkins in his brilliant work, The Selfish Gene.

Meandering thoughts early on a Saturday morning

As I noted above, it’s early Saturday morning, and here in south Florida, it’s already 80 degrees (Fahrenheit) and muggy, despite it being the 11th of November.

The trees here don’t change color, there’s always mold and mildew and stuff like that, annoying insects are pretty much always out and about throughout the year, and I’m sure there are lots of other things worth reviling about the area.  I won’t even get into the politics and the general idiocy levels and the bureaucracies, because they’re probably not significantly worse here than anywhere else; they’re just different and weird, because it’s Florida.

I do enjoy being able to see the various reptiles that abound here most of the year.  You definitely don’t get many lizards in Michigan, even in the summer; you’ll see the occasional turtle here and there, and if you go into the woods, once or twice you might encounter a snake.  But it’s mostly mammals and birds (and various Arthropoda when the weather is warm) up there, and in pretty much all but the southernmost US states.

Mind you, Hawaii had no endemic land mammals (if you don’t count humans—or the hoary bat that flew there and the monk seals that swam there) for quite a long time.  It’s the most isolated archipelago on the face of the Earth; how could mammals have reached it?  Birds, sure.  Insects—well, they can get almost anywhere*.  Amphibians—it’s more difficult, but they can hitch a ride on floating vegetation, as can many reptiles, since they don’t tend to require as much food and fresh water as mammals do.  But how would a population of terrestrial mammals from the mainland survive an accidental trip to the Hawaiian islands?  It’s not impossible, but to my knowledge, until humans brought them, no terrestrial mammals had established themselves on those islands.

Florida, on the other hand—that second most southern of the United States, and the most southern of the continental United States**—has been part of the mainland for as long as human beings have existed, as far as I know.  Plenty of mammals abound here, in addition to the various birds and reptiles and amphibians and insects and other arthropods.

It’s my understanding that, until quite recently, actual jaguars lived in Florida!  I’m not talking about the Jacksonville football team.  I’m talking about the actual, third-largest member of the cat family (and the largest in the western hemisphere).  I’m talking about that brilliant, beautiful predator that can casually fetch caimans from the waters of the Amazon to eat.  I’m talking about the member of the big cat family that, instead of going for the throat, like most big cats do, tends to jump down on the back of its prey and crush the prey’s skull in its immensely powerful jaws.

Death by jaguar would probably not be pleasant, but it would at least be stylish and cool.  And if a jaguar eats you, you become part of one of the most magnificent predators on Earth.  While it’s true that humans are better predators—they are pretty much the most powerful predators ever on the planet—there are precious few of them that could be described as magnificent and sleek and imposing.

There are no more wild jaguars in Florida, and only a small, endangered population of wild Florida panthers remains.  Instead, we have this horrible proliferation of Naked House Apes, the vast majority of whom are far from inspiring either to look at or with which to interact.  They succeed by dint of science and technology, of ideas the vast majority of them could not begin to describe or explain.

How many humans who regularly use the GPS system could explain why the system has to account for both special relativity and general relativity, or else it would be utterly useless and inaccurate?  How many of them even understand what is meant by a logic gate, even as they carry around spectacularly sophisticated computers in their pockets, which they use to take selfies*** and watch idiotic nonsense on TikTok?
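To put rough numbers on that (a back-of-envelope sketch using approximate orbital parameters, not a precision calculation): the satellites’ orbital speed makes their clocks run slow by about 7 microseconds a day, while sitting higher in Earth’s gravity well makes them run fast by about 46; the uncorrected net drift would translate into kilometers of ranging error every single day.

```python
# Back-of-envelope check of the two relativistic effects on GPS satellite
# clocks (approximate orbital parameters; not a precision calculation).

C  = 2.998e8       # speed of light, m/s
GM = 3.986e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6  # mean Earth radius, m
R_SAT   = 2.656e7  # GPS orbital radius (~20,200 km altitude), m
DAY = 86400.0      # seconds

v = (GM / R_SAT) ** 0.5  # circular orbital speed, roughly 3.9 km/s

# Special relativity: the moving clock runs SLOW relative to the ground.
sr_per_day = -(v**2 / (2 * C**2)) * DAY

# General relativity: the higher (weaker-potential) clock runs FAST.
gr_per_day = (GM / C**2) * (1 / R_EARTH - 1 / R_SAT) * DAY

net_per_day = sr_per_day + gr_per_day  # net clock drift, seconds per day
range_error = net_per_day * C          # light travels this far in that time, m

print(f"SR: {sr_per_day*1e6:+.1f} us/day, GR: {gr_per_day*1e6:+.1f} us/day")
print(f"net: {net_per_day*1e6:+.1f} us/day, "
      f"~{range_error/1e3:.0f} km/day of ranging error if uncorrected")
```

The two effects pull in opposite directions but don’t cancel, which is why the system has to account for both; ignore them and your position fix degrades by kilometers within a day.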

How many people can’t interact with an idea that requires more than 280 characters to express?

I could go on and on, of course.  And I’ll admit that all of those positive things and ideas—engines and mathematics and circuits and piping and roads and farms and houses and medicine and so on—came from people who at least appeared to be human (though one often wonders if there isn’t some deep level of difference within the species such that some minds are barely the same type as many others).  But those people, and their ideas, are exceptions to the general rule and tendency.

Even nowadays, when we see so many of the fruits of the brilliant ideas of the likes of Ada Lovelace and Emmy Noether and their sistren****, we have to realize that there is such an abundance only because those ideas are so potent—they persist, they spread, they lead to other, subsequent, consequent ideas.

The prevalence or rate of occurrence of brilliance is probably no greater than ever before, as a matter of percentages, but there are more people—thanks to the products of past genius—and the edifice on which they rest is so much vaster and more stable and powerful that newer, still achingly rare instances of genius can build on those monumental, cyclopean, Olympian structures and devise things and ideas that could, in principle, in the long run, change the face of the very universe itself.

I don’t know what point I’m making here, today.  This is almost free-association or even “automatic writing”.  I guess it’s a good way to pass the time while I’m on my way to the office, which is at least a nearly decent way to pass some of my time on the way to the grave.  But I’m impatient to reach my destination.  I don’t feel very well.  I wish I could rest.  I’m really, really tired, and yet I never seem to be able to sleep much.

Oh, well.  The universe was clearly not made for my comfort, so I have no right to feel slighted or misled by it.  Then again, rights themselves are a human invention (or, just possibly, a human discovery), as are laws and customs and social patterns and all that happy horseshit.  The universe at large does not recognize any rights at all, unless you want to count the right (as well as the absolute obligation) to follow the laws of physics, whatever their ultimate nature might be.

That’s enough of my random brain exudates***** for the time being.  I hope you all have an excellent weekend.


*There are apparently endemic midges in Antarctica!

**At latitudes that roughly match those of Egypt, apparently.

***And how many of them understand how LCD screens (or LED screens) are different from the old CRT screens of traditional TVs (or what those acronyms mean), and why some people predicted that color TVs would become “extinct” because the earlier ones relied on certain rare-earth elements, and why that prediction was incorrect because clever people figured out there were other ways to do the same thing?

****It’s horrible to realize that the reason it’s comparatively easy to list the women who have made astonishing contributions to human knowledge and understanding—these two I just mentioned having done no less than, respectively, basically inventing computer science and programming before the computers had even been built and codifying and mathematically explicating how conservation laws in physics derive from fundamental symmetries—is because women have been prevented from even exploring their potential in such areas throughout most of history in almost every culture.  Interactions with humans throughout my life have made it quite clear to me that the average human female is at least as intelligent as the average human male.  This implies that, over the course of human history, to a good first approximation, half of all potential genius has been not merely squandered but prevented.  It’s heartbreaking and soul-crushing to imagine all the possible art and poetry and science and philosophy and mathematics and music and so on and so on that might have existed already had women not been systematically prevented from developing their skills and ideas throughout most of human history.  If anyone ever wonders why I get depressed, this is one of the reasons.

*****I think the replacement for the term “tweet”, as in a posting on Twitter, should be something like an X-cretion, an X-udate, an X-trusion, or maybe even an X-foliation.

“Be resident in men like one another and not in me”

Well, I’m on the laptop (computer) again today.  I specify that it is the computer because I want to make it clear that I’m not on anyone’s actual lap top.  I don’t think there is anyone out there whose lap could tolerate me sitting on it—I suppose Santa Claus could maybe use his magic, but it’s a bit early in the year for him, even given holiday-time mission creep—and probably even fewer laps on which I would be able to tolerate sitting.  And one cannot really be on a lap around a race track or in a swimming pool, unless one is actually going around that track or swimming, either of which activity would make it very difficult to type.  I guess the top of such a lap could be thought of as its beginning, as in “taking it from the top” in music.  But that wouldn’t change the writing difficulty.

That’s a weird opening to a blog post.  Sorry.  I think I’m particularly weird in the morning, or at least I’m a particular kind of weird in the morning.  I know that, as with many people suffering from depression, my mood is often at its worst in the morning, but sometimes I’m at my least weird and my most sane—from my own point of view, anyway—in the morning relative to the middle of the day or the afternoon or the evening.  Often I feel most sane when I’m most depressed.

It’s quite frustrating when, by the end of the day, my energy level lifts a bit, because then I have a hard time relaxing and getting to sleep.  But, of course, it’s not as though I can sleep in, or sleep late to make up for staying up too late.

I will say, though, that last night I got nearly four hours of sleep (pretty uninterrupted once I got to sleep), and it felt surprisingly deep.  I had at least one dream of which I was vaguely aware, because it was interrupted when my alarm sounded.  I don’t remember anything about the dream, other than that it was a dream, and I awakened feeling quite disoriented*, thinking it must be much later than it was.  It wasn’t.  It was just as late as it was, as one might expect.

My work friend who had the stroke is apparently doing pretty well, which is good news.  It feels so ironic to me how often people around me, ones who have a lot for which to live, and who have good reasons to be healthy, and who have families and friends, are stricken with significant health problems.

I’m referring to serious, dangerous health problems here.  I have some health problems—chronic pain, stuff like that—and I certainly have mental health issues.  But I’m the person I know whose life could most easily tolerate significant health setbacks, or at least the one whose ill-health and/or death would have the least impact on those around me and the world at large.  Even so, on I go.

Yet my life, such as it is, is in fact steadily eroding.  It has already become quite a poor, puny, pathetic little remnant of a life.  I don’t do anything other than go from my one room (with attached bathroom/shower) to work and back, and I write this blog.  I don’t play guitar or write fiction or sing or any of that anymore.  I’m getting more and more tired of even non-fiction books.

I don’t watch any ongoing TV shows other than things like Loki, which is quite limited, and Doctor Who.  Unfortunately, even the latter is something that I wish I could watch with someone…and not via a cheesy-ass “watch party” thing online.  I don’t understand how those could be any fun at all.

I have a hard time even visualizing people I know when I’m not around them.  I mean, I know they exist, of course, but I can’t readily imagine what they might be doing, or that they’re doing anything in particular, if I’m not with them.  I know they exist, but I only really feel them existing when I’m in their presence.

Maybe that’s part of the whole ASD thing, I don’t know, but it’s always been very difficult for me to maintain any form of relationship over significant distances.  There have been exceptions, but you could count them on maybe half the fingers of one hand.  And those exceptions always involved nearly-continuous communication.

Still, while of course I know, intellectually, that other people are all still there when I’m not in their presence, I don’t seem intuitively to model them except when they’re nearby—and when they’re nearby, I don’t so much model them as watch them in a kind of analytic way (though I do feel the noise of their emotions).

So, when I’m alone, I often feel*** truly and completely and fundamentally alone in the universe.  I often feel that way even when other people are around, though there are some distractions and intellectual engagement that help make that a bit easier.  But there have been relatively few people in my life with whom I feel really connected, and eventually most of those people have gone far away or cut ties with me or died or whatever.

Who can blame them?

So, anyway, that’s the deal.  It’s Wednesday, and that means it’s payroll day.  And tomorrow will be my traditional Thursday post.  I sometimes entertain the notion of writing blog posts in the afternoon or evening, and seeing if the content is different in character, and if anyone would notice.  But to do that would require serious restructuring of my routines and schedules and things, and I don’t think I’m up for it.  Also, morning is when I have time to do this.

I’m awake anyway, so I might as well use that fact for something productive…if that’s how this can be described.

Please try to have a good day.


*It’s weird how the Brits tend to use “disorientated” even though the root word is disorient, not disorientate (which sounds, perhaps, like the name of Catherine Tate’s sibling or child**).  I guess even in the states we say “disorientation”, but I think that’s just because “disoriention” would not flow very well.  I’m probably biased.  One related thing I find frustrating, and found especially frustrating when I was in medical practice (and training) was how many doctors, even American ones, would refer to the state of having been dilated as “dilatation” instead of just “dilation”.  It feels like they lost control of themselves, and only just barely were able to resist saying “dilatatatatatatation”.  It makes no good sense.

**Of course, Catherine Tate is her stage name, so it would be weird for a sibling or child of hers to have the last name “Tate”, to say nothing of the first name “Disorien”.

***I don’t think I’m alone, of course.  I’ve never been tempted by the philosophical position of solipsism; it doesn’t make any sense, at least in its literal form.  But I definitely feel a sort of intuitive pseudo-solipsism in some senses and at some times.  By that I mean I am the only person I have any actual sense of persistently existing.  On the other hand, I can sometimes “feel” other people’s emotions, in a sense, when they’re around, and one on one that can be good when one is a doctor.  However, when there are a lot of other people around it can quickly be overwhelming, especially if it’s also literally noisy.  Two kinds of cacophony is too much.

Urchins shall forth at vast of night that they may blog all exercise on thee.

Hello and good morning.  It’s Thursday again, that day with which DentArthurDent always had so much trouble.  It’s the first Thursday in November, which means that (in the US) Thanksgiving will fall on the 23rd of November, since it’s celebrated on the 4th Thursday in November, which is always going to be 21 days after the 1st Thursday in November.
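That bit of calendar arithmetic can be sketched in a few lines of Python (the function name here is just my own invention for illustration):

```python
import datetime

def fourth_thursday_of_november(year):
    """Return the date of the 4th Thursday in November (US Thanksgiving)."""
    d = datetime.date(year, 11, 1)
    # Advance to the first Thursday; weekday() == 3 means Thursday.
    offset = (3 - d.weekday()) % 7
    first_thursday = d + datetime.timedelta(days=offset)
    # The 4th Thursday is always exactly 21 days after the 1st.
    return first_thursday + datetime.timedelta(days=21)

print(fourth_thursday_of_november(2023))  # 2023-11-23
```

Since the first Thursday can fall no earlier than the 1st and no later than the 7th, Thanksgiving always lands somewhere between November 22nd and 28th.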

Further bulletins as events warrant.

I’m at the train station, and I was early even for the 6:10 train today.  I’m not going to get on the 6:10 train, because I still want to cool down* and begin this blog post, and it looks like the 6:30 is running on time.  I got here early partly because I got up early this morning…but really, that was only about 5 minutes earlier than usual, and it had little relation to when I first woke up.  The main reason, I believe, for my comparative earliness is that, as I mentioned yesterday, I tried to jog a bit this morning.

After getting to the end of my block and turning, I jogged 40 paces, as I had said I was going to do.  That was so comparatively easy and bracing that, at my next 90-degree turn, I did another 40 paces (each pace being 2 steps, at least the way I define the terms).  Then again at the next 90-degree turn, then at the last one.  So, I jogged a total of 160 paces, and walked the rest, and the jogging didn’t make me feel breathless or sore (so far) because it is such a limited amount.

It’s rather curious and amusing to note that my pedometer reads as if I’ve gone slightly less far than I usually do, because of course, jogging steps are quite a bit longer than walking steps, but the pedometer still just reads them as steps.

It’s a nice feeling to have done even that very little bit of running.  It’s a good way to start a day, to have accomplished that little bit of a goal, as part of a general pattern of exercise.  It is the first time (I think) that I’ve tried jogging while wearing a backpack.  That turns out not to have been a noticeable problem.

It’s quite windy today‒which is rather pleasant‒and there was a bit of rain on and off while I walked, though it’s really been negligible.  I got my umbrella out at one point, but even if I hadn’t used it, I don’t know that I would have gotten unpleasantly wet.

I decided last night to revisit the “mantra” notion I mentioned earlier this week, but with a slight downgrade or alteration from my previous idea to make it more workable.  If you’ll recall, I had started with the plan just to say “I love myself” as a form of auto-suggestion, then expanded it to “I love the world and I love myself”.  Anyway, I found that, upon awakening the next morning, I could not even make my mind’s voice speak the words.  They simply felt too utterly at odds with my thinking.

However, only one of those phrases was really the problem.  So, starting last night, I’ve tried to repeat to myself the mantra “I love the world” when I’m not otherwise engaged.  This seems to work much better.

I have a hard time even saying that I love myself, but the world…well, I’ve always loved nearly all branches of science, and they are all about understanding and exploring the world.  And I like mathematics and philosophy, and I even like history.

It can be easy to get discouraged by the way people behave at any given moment, and certainly humans say and do some ridiculous and destructive things.  But loving something doesn’t require it to be perfect.  In most cases, the concept of “perfect” isn’t even coherent.  Indeed, loving something can entail wanting to help it get better than it already is.  If you hate something (or someone), there’s no sense in trying to improve anything.  Wanting something (or someone) to improve is a positive, beneficent emotion.

To clarify, when I say “the world” in this context, I don’t just mean “the Earth”, I mean “the Universe”, to whatever level of multiverse and/or higher dimensionality might exist‒everything, all time, all possible stuff.  And let’s be honest, when you start thinking about things like that, while they can be daunting‒since, compared to infinities, anything finite is vanishingly small‒they’re still just mind-blowingly cool.  Don’t even get me started on the uncountable infinities of the “real” numbers and “complex numbers” and functions that are discontinuous at every point**, or infinite-dimensional Hilbert spaces!

So, anyway, when I woke up this morning, I was easily able to start thinking “I love the world” to myself, and that was a pleasant surprise.  Hopefully, I can keep it up.  At the very least, it would help make other things easier to tolerate, even if it doesn’t help me like myself.

Would that be a peculiar kind of dualism?  Possibly, but it’s not a formal distinction of type or substance; it would just leave me as an exception to a general tendency.

Anyway, that’s about it for now.  My coworker who had a stroke is apparently stable, and no clot was discovered, so I’m still puzzled, but I don’t have much information.  Hopefully we’ll find out more soon.

And, hopefully, you all have a good Thursday.  Thank you for reading.

TTFN

urchins on kelp


*I keep accidentally writing “cook down” when I try to write “cool down”.  It’s not a nonsense phrase, but it probably never would apply to me.

**There’s a term for this, but I’m dipped if I can recall it‒something like “continuously discontinuous functions”*** but I don’t think that’s quite right.  I know next to nothing about the subject, but just the notion of a function that is non-differentiable at every point is astounding.

***Though I heard at least one mathematician refer to them as “infinitely kinky functions” in a tongue-in-cheek fashion.
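For what it’s worth, the classical example of such a creature is the Weierstrass function, which is continuous everywhere yet differentiable nowhere:

```latex
% Weierstrass's construction: continuous on all of R, differentiable
% at no point, provided 0 < a < 1, b is a positive odd integer,
% and ab > 1 + 3*pi/2.
W(x) = \sum_{n=0}^{\infty} a^n \cos\!\left( b^n \pi x \right)
```

Each term in the sum is perfectly smooth; it’s the accumulation of ever-faster wiggles at ever-smaller amplitudes that destroys differentiability at every single point.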