“You know the day destroys the night. Night divides the day.”

It’s Friday again.  But it’s not just any Friday‒it’s the Vernal Equinox, the day when the line between the Earth and the Sun is orthogonal to the Earth’s axis, and so day and night will be (effectively) of equal length.  This is more fun in some ways than the solstices, because it’s the same for everyone, in the northern and southern hemispheres alike.

Of course, in the north it’s officially the Vernal Equinox, heralding the beginning of spring, whereas in the south it heralds the beginning of autumn.  I don’t know, however, if it is officially called the Autumnal Equinox in the south.  Probably it is.  After all, I’m sure they have their “official” winter solstice on what is “our” summer solstice and vice versa.  It would be a bit perverse for them to do otherwise.

It’s somewhat interesting to note, as Neil deGrasse Tyson has pointed out with some ardor, that since, for instance, winter officially begins on the “shortest”* day of the year, the days actually get longer and longer throughout the winter (and the opposite happens in summer), until finally, on the Vernal Equinox, day and night break even, and then daytime overtakes the night.

I wonder what Zeno would say about that race.

On a different topic, it’s quite rainy here this morning, and it’s a rather chilly rain, which is mildly unusual for south Florida.  It occurred to me, seeing just how sloppy it is here at the train station, that I hope it will not be so rainy at my destination.  What’s interesting about that is that it may not be rainy at all there, at work.  And yet, it could still be raining heavily down here in Hollywood.

In the modern world, weather can seem to change much more rapidly than it really does because we travel through the weather, whereas throughout all of our ancestral time we would merely have seen the weather passing over us.  It can give a somewhat misleading impression of how quickly the weather changes, even in Florida, where it can be raining on one side of a street and dry on the other**.

I recall when visiting my grandparents as a child, that there were times we would all be going somewhere in the car, and as we went along it would start to rain heavily, all of a sudden‒and then, just as suddenly, as we went along, it would stop.  And then it would suddenly start again, and then stop again, and so on.

But even in south Florida (or, well, west central Florida back then) the weather doesn’t change like that if you’re sitting still.  It changes quite rapidly compared to many other places, but not the way it seems to do when one is traveling in a modern vehicle.

For some reason, I feel as though there’s an analogy or insight available here with respect to special and possibly general relativity, but I don’t feel like trying to explore it right now.

I did bring my hardcover copy of General Relativity: The Theoretical Minimum, which is part of Leonard Susskind’s Theoretical Minimum series, with me when I left the office yesterday, thinking I might read it while on the train last night.  I did not read it.  There are too many distractions, it seems, for me to be able simply to flip my attention into focus on that, however much I really am interested in it.  It’s frustrating.

I have read part of it, mind you, as well as parts of the other Theoretical Minimum series.  I have all of them in both physical copies and on Kindle, so really, I didn’t need to bring the physical book.  But it is a lovely hardcover edition, and I hoped that might make me more likely to read it, since reading a nice hardcover is much more pleasant than reading a Kindle book on one’s phone, though that can still be fun.

I also entertain the admittedly absurd fantasy that I might be reading the hardcover copy on the train some day and some other, like-minded person (preferably an attractive woman) might notice and be interested because she is into the subject as well, and so on.

This is particularly silly as pipe dreams go, because even if such an absurd event happened, I would definitely screw the whole thing up.  I tend to be quite terse when strangers try to speak with me, even if they are beautiful women.

Looking back on my life, I’m sure that there have been several occasions in which someone was expressing interest in me, but I didn’t get it or got too anxious and froze up.  Sometimes I figured it out soon after, and sometimes it took longer.  There are probably some cases that I never noticed at all, even in hindsight.

Of course, I was married for fifteen years, during some of which I was in medical practice, and so such interactions would have had a different character.  There were sometimes more flagrant and obvious “advances” in that time, because, well…doctor.  But I never had any inclination to pursue them, even when I recognized them; I’m not the kind to want to cheat on a partner.  Hell, I’m not even the kind to seek a new partner two decades after my wife divorced me (though I briefly tried a little).

I wouldn’t mind a nice relationship, but I know that I am difficult to handle in many ways (I try not to be, but I am weird, and not in some charmingly popular manner), and in certain senses, my standards are high, or at least they are fairly strict.  For instance, someone who doesn’t read for pleasure is unlikely to be terribly interesting to me.  It’s not impossible; there are other ways for people to be interesting and smart.  But not liking to read would definitely be an entry for the “con” column, not the “pro” one.

I don’t know what I’m doing, going on about such nonsense.  I am not going to have any more romantic relationships in my life.  I am going to die alone, as is only appropriate and to be expected for something like me.  And while I won’t say “it can’t happen soon enough for my taste”***, I do really feel impatient for it.  I wouldn’t say I am “eager” for it, because that’s a positive feeling.  I am just quietly desperate for it, like someone trying to find an exit from a (slowly) burning building.

Anyway, that’s enough for today.  I hope you have a good one, and that you have a good weekend as well.  Yes, I mean you.

As for me, well, I am to be working tomorrow as far as I know, so I will be writing a blog post tomorrow, barring the unforeseen.


*Of course, this is a bit of a misleading characterization.  The day is the length that it is‒roughly 24 hours‒and does not change very quickly, for which fact we should all be grateful.  It’s just the length of time in a given day during which the sun is above the horizon (so to speak) that varies.

**This is not an exaggeration.  I have seen it myself on many occasions.  It seemed to happen more frequently in the area where my grandparents used to live (Spring Hill, north of Tampa) than it does down here‒or maybe I noticed it more because I was a kid‒but it is very real and quite impressive when it happens.

***Except to say that I won’t be saying it.

Is it small talk if you discuss the weather despite being alone?

Well, I brought the lapcom back to the house with me yesterday evening, but nevertheless I am writing this post on my smartphone.  Why?  Because the lapcom is less convenient to get out of my bag and put back in, whereas the smartphone is much easier to unsheathe and restow‒it just sits in my front pocket when not in use.  I can also use it to check the temperature, which is a bit chilly this morning even here in south Florida.  I’m sure that it’s quite a bit worse for regions north of us that have been hit by the wave of unpleasant recent weather.

I don’t find it unpleasant for it to get a bit cool down here‒55 degrees Fahrenheit* with a bit of overcast and some rain feels like autumn up north where I grew up, and that was always my favorite season.  It’s usually rather easy to adjust by wearing more clothes and moving around a bit if one feels chilly.

On the other hand, there’s not much to do about the heat and humidity other than to stay inside air conditioned buildings.  That isn’t very much fun, unfortunately, especially when one lives in a state that is touted for its beautiful and interesting nature.  After a point, though, one can take off as many clothes as one likes, but one will not get cooler; one will only be at risk for sunburn in rather uncomfortable places.

The worst part, though, is the humidity.  Yes, humans developed in sub-Saharan Africa, so we’re built well for endurance in hot environments (humans have more sweat glands per square inch of skin than any other animal).  But humidity is another matter.  Humidity is almost like an electronic countermeasure against sweat’s ability to cool the body.  Sweat works by evaporative cooling; as with blowing on soup, taking away the most energetic liquid molecules lowers the average temperature of those remaining, and so on.

But evaporation depends at least partially on the differential in concentration between the liquid and the gas “above” it.  If the air is already saturated (or nearly so) with water vapor, there is going to be significantly less tendency for net evaporation to occur, and thus there will be less cooling.
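
To put a rough number on that, here’s a toy sketch in Python.  It uses the standard Tetens approximation for saturation vapor pressure; the particular temperatures and the simple “driving force” framing are my own illustrative simplification, not a proper heat-balance model:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Tetens approximation for saturation vapor pressure (kPa)."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def evaporation_driving_force(skin_temp_c, air_temp_c, relative_humidity):
    """Vapor-pressure difference (kPa) driving evaporation from wet skin.
    Larger means faster net evaporation (more cooling); near zero means
    the sweat just sits there."""
    e_skin = saturation_vapor_pressure(skin_temp_c)
    e_air = relative_humidity * saturation_vapor_pressure(air_temp_c)
    return e_skin - e_air

# Same 35-degree C skin and air temperature, dry air vs. humid air:
print(evaporation_driving_force(35, 35, 0.20))  # dry heat: large difference
print(evaporation_driving_force(35, 35, 0.95))  # humid: almost nothing
```

At 20% relative humidity the difference is more than an order of magnitude larger than at 95%, which is the “dry heat” effect in miniature.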

This is why the reassuring and somewhat comical statement, “Yeah, but it’s a dry heat” is actually pertinent and indeed positive.  If the air is dry, and if they have adequate water, humans can tolerate surprisingly intense heat.  But when it’s humid, things don’t work nearly as well.

Also, when it tends to be humid and rainy a lot, one finds fungi and algae and the like growing on almost every immobile surface, as well as on some that are mobile, such as human intertriginous areas.

Anyway, to make a long diatribe slightly longer by summarizing it, I don’t mind cooler weather, but humidity is very annoying when it’s warm.

As for other matters, well, the holiday is over from yesterday, and I did not get to eat any corned beef and cabbage.  That’s a bit disappointing.  The next major holiday (which is coming soon) is the Passover/Easter holiday.  There’s no particular food related to these holidays that I like, though, nor really much of anything else come to think of it.

I did get into the St. Patrick’s Day spirit by drawing a shamrock yesterday, then scanning it and coloring it and fiddling with it a bit between other things at work.  Here, this is how it’s turned out so far:

It’s nothing terribly impressive, but it’s at least one very tiny, mild, creative act.  Not that writing this blog isn’t creative, but it’s not as creative as writing fiction, or not creative in the same way.  And drawing a picture is closer to writing a story than to writing a blog post.  Though I have to admit, at first glance drawing and writing would seem to be somewhat too different to compare.  And yet, I think I’m not the only person who has a deep, intuitive feeling that they are part of a strong, self-similar group.

It’s quite curious.  I wonder if such seemingly odd combinations are common among intelligent life forms.  Of course, if this planet is the only place in the universe on which intelligent life exists, then it’s a universal attribute of such life, if we count only creatures that use languages and create and use artifacts.

Well, this has been a weirdly inconstant blog post, especially for a relatively short one.  It’s not just meandering around among topics, it’s ricocheting.  I would prefer to meander; ricocheting seems like it would be very bad for my chronic back and joint pain.


*If it were 55 degrees Celsius, it would not be chilly at all.  Indeed, many people would be dying around here from the heat.  If it were 55 Kelvin, then, yeah, that would definitely be chilly.  Not that anyone would feel it, because we would all be dead if it were that cold.

You’re so vain, you probably think that nothing matters

I was going to start by saying that I had probably written all I could about Friday the 13th and the fact that there are two in a row (February and March) whenever a non-leap-year February has a Friday the 13th, and that a first glance might lead one to think this should happen roughly every 7 years on average*.  However, as I noted last time I discussed this, because the leap year day is in February, we will not have the two-in-a-row Fridays the 13th as often as we might otherwise; it will not happen every 7 years on average.

Then, this morning, after recalling that today was Friday the 13th, I ran through the next years’ Fridays in my head in the shower, and it occurred to me that the next Friday the 13th in February‒which will be in 6 years, as I noted in the past‒will not be followed by a Friday the 13th in March!  2032 (six years from now) will be a leap year, so there will be 29 days in February, so there will be no Friday the 13th in that March.

The next paired ones, then, will be a further 5 years after that, in 2037 (not a leap year).  It would have been 6 years later, but there are two leap years in that interval, 2032 and 2036, so the next one comes a year sooner than it would otherwise.

It occurred to me that, because of the frequency of leap years, which is almost twice that of the cycles of days of the week, the frequency of those paired dates may well be once every 11 years rather than every 7.  At least those are both prime numbers.  I’m not going to work out some exact formula right now, though.  It’s not really important.
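
That said, no formula is actually needed; for anyone curious, the average gap is easy to check by brute force, just counting the paired years over the 400-year Gregorian cycle (a quick Python sketch using the standard library):

```python
import datetime

def paired_friday_13ths(start, end):
    """Years in [start, end) with a Friday the 13th in both February
    and March -- i.e., Feb 13 falls on a Friday in a non-leap year
    (in a leap year, Feb's 29 days push March 13 to a Saturday)."""
    years = []
    for y in range(start, end):
        feb = datetime.date(y, 2, 13)
        mar = datetime.date(y, 3, 13)
        if feb.weekday() == 4 and mar.weekday() == 4:  # 4 = Friday
            years.append(y)
    return years

# Average spacing over one full 400-year Gregorian cycle:
cycle = paired_friday_13ths(2000, 2400)
print(len(cycle), "paired years; average gap =", 400 / len(cycle), "years")
```

The pattern repeats exactly every 400 years, since the Gregorian calendar does.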

Of course, one could say that nothing is truly important, and I am persuadable along those lines.

There is a Doctor Who Christmas Special (the one from series 5) in which the antagonist/guest protagonist (played by Michael Gambon!) describes a woman in a cryo chamber as “nobody important”, and the Doctor characteristically responds by saying, “Nobody important?  Blimey, that’s amazing.  You know, in 900 years of time and space, I’ve never met anyone who wasn’t important before.”

This is typical Doctor, of course, but it raises the objection Dash (from The Incredibles) voiced when told that everyone is special:  Saying that everyone is important can be the same thing as saying no one is.

Of course, important is in the eye of the beholder.  But then again, the beholder is not important, either, except in its own subjective estimation and perhaps that of a few other, equally unimportant, owners of such eyes.

So, yeah, one could argue relative and subjective importance from local points of view, which is valid but more or less vacuous outside its small scale as far as I can see.  On a cosmic scale, it’s all just dust and shadows.  But you could also say that about the entirety of the cosmos itself.

I guess import has always been subjective, even though people are not inclined to see it that way.  But, of course, people are the products of their “local” forces, and they are not responsible for the laws of nature, nor for the things which have happened in the past that have affected them in the present (which could come under a certain interpretation of “the laws of nature” in and of itself).  I won’t get into all that now.

Going back to the shower, but on an entirely different subject, I was also thinking about the effects of diminishing amounts of shampoo in the bottle on the center of gravity of the bottle.  At the start, when it’s full, the center of gravity is roughly in the geometric center of volume of the whole thing.  But as one uses the shampoo, the center of gravity shifts lower and lower, since the air replacing shampoo in the upper part of the bottle is much less dense than the shampoo or the bottle.

But then, as one gets to the dregs, the smaller and smaller amount of shampoo in the bottle contributes less and less to the overall mass distribution of the bottle and its contents, and the center of mass begins to head back up.  Finally, when the bottle is “empty”, the center of gravity will have returned to almost the same place it was when the bottle was full.
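
A crude toy model makes that dip-and-return visible.  The masses and dimensions here are made up for illustration; the bottle is treated as a uniform shell and the shampoo as a uniform column:

```python
def center_of_mass_height(fill_fraction, bottle_mass=0.05,
                          shampoo_full_mass=0.5, height=0.2):
    """Height (m) of the combined center of mass: the empty bottle is a
    uniform shell (COM at height/2), and the shampoo is a uniform column
    filling the bottom fill_fraction of the bottle."""
    m_s = shampoo_full_mass * fill_fraction
    com_s = height * fill_fraction / 2   # shampoo column's own COM
    com_b = height / 2                   # bottle shell's own COM
    return (bottle_mass * com_b + m_s * com_s) / (bottle_mass + m_s)

# The COM sits at mid-height when full, dips as the bottle empties,
# then climbs back to mid-height when the shampoo is gone:
for f in (1.0, 0.5, 0.1, 0.0):
    print(f, round(center_of_mass_height(f), 4))
```

With these numbers the full and empty bottles both have their center of mass at mid-height, and every partially full state sits below it.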

All that’s fairly trivial, well-known stuff, I know.  But it got me to thinking about how many problems in physics, such as those involving gravitation (in its Newtonian form), are solved using concepts like the center of mass, which is really just a way of combining and averaging the effects of numerous tiny bits of gravitating material as if they were concentrated at one point.

Much of the mathematics of physics works this way, coarsely approximating the very fine details of reality in a way that provides reliable, reproducible guidelines and can produce testable predictions.

But the granularity of reality doesn’t actually ever go away, not at any level.  Even at the level of the quantum wavefunction of a single “particle”, the actual behavior of the thing as it interacts with the “larger” world is the summation of the effects of all the possible quantum states of the particle superposed upon each other and interacting with other things‒everything‒which are also just collections of superpositions of quanta.  That superposition happens in a “space” that doesn’t directly coincide with the macroscopic space we experience, but whatever its dimensions are, they are real, because they have durable, reproducible effects.

Mathematics may be unreasonably effective in the physical sciences, as Eugene Wigner famously noted, but it seems not to be a refining of description but rather an averaging out, a glossing over, the inking of an underlying rough pencil drawing which nevertheless still constitutes the real, original picture.

It may be that, in a sense, all science is just various forms of statistical mechanics.  We know that, at larger scales, we definitely need the tools of probability and statistics to navigate as best we can the territory of reality.  And yet, we don’t teach this sort of stuff to most people, ever.  I wrote a post about this on Iterations of Zero, if I remember correctly.

I could go on about all this rather easily, I guess, but I am using my smartphone today, and my thumbs are getting sore.  That’s okay; yesterday’s post was probably way too long, anyway.

If I did a video of my thoughts on this I might be able to get into more detail, though it would probably be even more erratic and tangential than my writing.  Still, maybe it would be worth trying.

In the meantime, I’ll write at you again tomorrow.


*Go ahead, do a search on my blog page for Friday the 13th; I’m all but sure it will bring up the pertinent blog posts.


In a better blog than this, I shall desire more love and knowledge of you

Hello and good morning.  It’s Thursday, and I’m writing this post on my lapcom.  I feel as though I ought to write these posts only on the computer (not that smartphones are not computers, but cut me a little slack on this, please), and I would be more inclined to do so if Microsoft would stop making Aptos the default font!!!!!

If I could go back in time and change something, that’s one of the things I would be inclined to change.  If I found that there was one person mainly responsible for this new font, well…I don’t know if I’d go all Terminator on them and kill that person’s mother before that person was born, or kill the person when that person was a child, but something needs to be done to erase the stain of this horrible font from existence.

Certainly, if I were given* absolute power over the world, from this moment forward, one of the petty things I would do (I would try to keep the petty things to a very bare minimum, trust me**) is to eliminate that font from any and all standard computer systems anywhere.  I would probably allow for individuals to select the font if they really like it, but would not let them use it on anything but internal work between people who also like the font.

Also, I would probably mark people who chose the font freely for a visit from my secret police.

I’m kidding.  I despise the very notion of thought crime, let alone aesthetic policing in private matters.  This is even though some people’s quality of thought sometimes feels like a crime against nature.  But, of course, there cannot actually be crimes against nature.  Nature does not punish one for disobedience to its laws.  It’s simply not possible to do anything but follow them.

That’s one reason why I truly despise headlines like “The new finding by Hubble that breaks physics!” and whatnot.  Not only are they plainly clickbait, they are stupid clickbait.  I don’t know for sure if it’s the headline writer or the writer of whatever the attached article might be who makes the headline in specific instances, but in either case, when I see headlines like that, I think that whoever wrote it really, clearly doesn’t understand physics very well, nor the nature of scientific discovery and advancement.  Because of that, I am far less likely to read the attached article (or watch the video) or even click on its link.

Nothing can break physics.  If you find something that seems to violate physics as you understand it, what you have found is not a violation of physics but rather a place where your understanding of physics is clearly incorrect.  This is far from a horrible thing.  This is how progress in physics (and in other sciences) is made:  by finding the places where our “understanding” doesn’t predict or describe what actually appears to be happening.  The world cannot be “wrong”, so our understanding of it must be, and will need to be revised.

That’s progress.

One should be hesitant to give too much “trust” to anyone who refuses to change their mind.  One of the best lines in a Doctor Who episode (not a truly great episode, maybe, but it has a wonderful speech by the Doctor) is after the Doctor has said to the “villain” (who goes by the human name Bonnie, though she is not human) “I just want you to think.  Do you know what thinking is?  It’s just a fancy word for changing your mind.”

Bonnie responds, “I will not change my mind.”

And the Doctor says, “Then you will die stupid.”***

This is simply true.  If you never learn that you were wrong about something, if you never update your credences or think about things in a new way, you will never learn anything new or develop any better understanding of the world than you did when you formed those credences.  Or, to paraphrase Eliezer Yudkowsky, if no state of the world can change the state of your retina and how you perceive that state, that’s called being blind.

I like to refer to Yudkowsky-sensei a lot, but that’s because he has said a lot of bright and interesting things, and he has said them well.  It’s also nice to know that there are some highly intelligent and thoughtful people in the world—clearly there are, or humans would long since have gone the way of the trilobites—because the idiots and the assholes make so much noise.

The best evidence I see for the fact that most people are good or at least benign (overall) is that civilization still exists, and has done so for a long time.  It is far easier to destroy than to create or even to maintain; the second law of thermodynamics tells us that things will fall apart even if we do nothing at all to break them (it says that more or less, anyway—that’s a bit of a bastardization of the proper, mathematical law, but it is related and implicit).

The fact that civilization still exists—so far, at least—seems to indicate that there must be a lot of people working to maintain and sustain and improve it, because we can easily see how hard so many people seem to be trying to make it crumble****.

Assholes tend to make a lot of noise in the world, but they’re pretty much all full of shit and “hot air”.  It’s worth it to keep this in mind, because there have always been plenty of such nether orifices out there, spewing their flatus everywhere like perverse crop-dusters.  But the evidence strongly suggests that they are not the norm; they are just the noisiest.

I suppose that’s a good moral of sorts on which to end this post:  Be willing, even eager, to change your mind when warranted, and try not to let the assholes make you think the world is no better than a camp latrine (even if you’re one of the assholes sometimes, which you are, since we all are, sometimes*****).

Though, to be fair, I am hardly the person to be giving that last piece of advice unironically.

TTFN


*If you must be given absolute power, do you actually then have absolute power?  This is similar to the old song that says “Don’t ever take away our freedom.”  If you have to beseech someone not to take away your freedom, you’re not free, and if you have to be given power, your power is clearly not absolute.

**Or don’t, if that’s not in your character.  I’ve often spoken implicitly against the concept of trust, stating that I don’t feel that I can actually, truly trust any living person.  It’s calculated risks all the way down, which is empirically true if nothing else.  So, I can hardly scold someone if they don’t “trust” me.  Go ahead, form your own conclusions.  I do exhort you, though, to be as rational as possible when you form them, with your conclusions drawn as a consequence of the evidence and argument, not with your evidence and argument being curated based on your knee-jerk or at least hasty “conclusion”.

***He then proceeds to lay out the alternatives; he’s not making a threat, he’s making a point.

****When you read that, did you immediately think of your own least favorite political or other public figure, or perhaps of the people you encounter who disagree with your politics or religion or dietary preference or what have you?  Be careful.  Us/them thinking is not usually conducive to formulating true and accurate pictures of reality (though it did inspire at least one beautiful song).

*****We’re also all deuterostomes (I’m assuming only humans are reading this).  Look it up.  It’s kind of funny.

If you can look into the seeds of time, and blog which grain will grow and which will not

Hello, and also, good morning.

What to write about, what to write about‒that is the question today.  Of course, “to be or not to be” is always the question as well, as was recognized by Camus in The Myth of Sisyphus.  If I recall, he arrives at the conclusion that the titular rock-rolling protagonist must be “happy” despite the patent and constant pointlessness and absurdity of his existence.

That goes along with the whole recognition of the absurdity of life itself that is central to the existentialism movement.  Still, it’s hard for me to “imagine Sisyphus happy”, unless he was a true Bodhisattva or had been thoroughly lobotomized by Zeus (or whoever it was that had doomed him to his…well, his doom).

It can help, I guess, to think about the vast scale of the cosmos in space and time (and any other dimensionality that might apply) and also about the incredibly minute scale of the cosmos, the fundamental quantum fields (and whatever gravity ultimately is) interacting from the Planck scale on up.  It helps keep things in perspective.

Of course, even given the scales of the cosmos*, there’s another, almost Buddhist/Taoist notion:  each individual‒each particle, even‒always exists at the nexus of two “light cones”, in an ever-moving now.  These are 4-dimensional cones, by the way, but it’s okay to reduce things by one dimension if you will.  It makes them easier to visualize.

Your (or anyone’s) past light cone is the outer boundary of all influences that can possibly have had any effect upon you at the present moment‒those influences that could have reached you at the speed of light or more slowly.  Similarly, one’s future light cone encompasses all those things that could possibly be influenced by things at the present location at or below the speed of light.

Any motion within the light cones‒the only motion that anything within spacetime can execute, as far as we know‒is called timelike motion.  Any motion that would require going outside a light cone is considered “spacelike” motion, and is not allowed by relativity.  This is not merely because of the speed of light, it’s because the speed of light is defined by the speed of causality.  Causes cannot travel faster or have effects beyond the speed of causality.  This is a bit tautological, I know, but it nevertheless simply must be true.
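
In the formalism, that classification comes straight from the sign of the spacetime interval between two events.  A minimal sketch (the example separations are arbitrary, and I’m using one spatial dimension for simplicity):

```python
C = 299_792_458.0  # speed of light (and of causality), m/s

def separation(dt, dx):
    """Classify the interval between two events separated by time dt
    (seconds) and spatial distance dx (meters).  Timelike intervals lie
    inside the light cone, spacelike ones outside, lightlike on it."""
    s2 = (C * dt) ** 2 - dx ** 2
    if s2 > 0:
        return "timelike"    # causal influence is possible
    if s2 < 0:
        return "spacelike"   # no causal connection, ever
    return "lightlike"

print(separation(1.0, 1000.0))  # 1 km in 1 s: well inside the cone, timelike
print(separation(1.0, 4e8))     # farther than light goes in 1 s: spacelike
```

The sign of that quantity is the same for all observers, which is why “inside the light cone” is an absolute fact rather than a matter of perspective.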

So each individual’s experience, each individual process, sits at the moving balance point of a future light cone and a past light cone, crossing at the moving present, tracing out a “timelike” path in spacetime.  Of course, individual creatures are not individual particles, and so their overall spacetime path would resemble the final line produced by a sketcher going over and over a particular path to make the curve the artist desires.

If one could look at the structure of a human in spacetime, like the Tralfamadorians of Slaughterhouse-Five, but one could also trace even the spacetime paths of individual “particles”**, a human life would be a sort of higher-dimensional braid in spacetime, surrounded by a haze of incoming and outgoing quantum entities, most of which will be locally bound and interacting, and so will be moving at a net velocity lower than the speed of light.

I’m assuming you don’t eat your food or drink your water or breathe your air or (shudder) sweat or excrete at near light speed.

Imagine what the inside of a mere proton or neutron might look like if one were able to see it as a rendered, four-dimensional model in fine detail!  If you think it wouldn’t be that interesting because it’s so wee, think again.

Remember, only the tiniest fraction of the “rest mass” of a nucleon comes from the mass of the three “net” quarks in it (two up, one down or two down, one up depending on whether it’s a proton or neutron).  Almost all the rest of its mass is the energy of the interactions between these three quarks:  all the gluons exchanged, all the virtual quark/anti-quark pairs popping into existence, mediated by that famous strong force and its weird*** “asymptotic freedom”.

Bringing this back around, I guess my point was merely to note that everyone and everything is pointless from the perspective of the laws of nature and the spacetime scale of the cosmos, but when you learn about those things‒the cosmos at large and small levels‒you are at least familiarizing yourself with those vast workings, and you are in a sense taking part of them into yourself.  That’s kind of a cool thought.

But don’t take too much of it into yourself!  For, much as would happen to someone who stuffed all the information about Graham’s number into one head, if you do you will become a black hole.  Now, it may be possible to survive becoming a black hole, but I don’t recommend betting on that pony.

TTFN


*I wrote a post on Iterations of Zero about how it might be useful for people to consider the cosmic perspective as contrasting with their prosaic concerns.  I don’t remember how good it was, but here’s the link, in case you want to read it and give any feedback you like.

**I use this word for want of a better term that everyone would recognize and that would be succinct.  I think we need such a different term, because a lot of the perceived so-called weirdness and mystery of quantum mechanics comes from trying to use inaccurate terms that originated in times before we understood things as well as we now do.  Quanta are not little “particles” that sometimes act like waves, nor are they little waves that sometimes act like particles (though that’s slightly more accurate).  They are entities unto themselves, and the ways they behave are all always consistent with that nature.  They don’t sometimes act like one thing and at other times act like another.  They all, always, act like what they are.

***Except it’s not weird, really.  Those of us who are surprised by it?  We are the weird ones.  Quantum chromodynamics has always done exactly what it still does, since long before any life at all existed in this universe.  To quote Yudkowsky again, “Since the beginning not one unusual thing has ever happened.”

And his brain ate into the worms…

Ugh.  Didn’t we just leave this party?  Evidently, we did not leave it precipitously enough, because here we are‒or at least, here I am‒rejoining it in the morning.

It seems like an ill-advised notion, but then again, I’m not sure who specifically advised me, or any of you, to do it.  There probably were a few literal, formal pieces of advice that we all or each received throughout our lives‒advice about getting up early and going to work and striving to fulfill our potential, and how if we didn’t we were somehow letting ourselves and (more importantly) letting everyone else down.

“The early bird gets the worm” is a typical phrase about such ambition and dedication and hard work.  But like many of us, I’ve often thought that worms are overrated.  They’re not rated highly at all, I’ll admit, but nevertheless, I think they are rated too highly.  Evidently‒according to what I have read‒all earthworms in at least the northern part of North America were killed off in the last ice age.  Yet plants grew and flourished without verminous help in the soil before Europeans accidentally brought their own earthworms here.

Of course, the saying is metaphorical, I know that.  We’re not really advised to seek earthworms early in the day, though perhaps liver flukes and flatworms and tapeworms and roundworms are also counted among the worms that might be caught.

No, probably not.

But anyway, even though metaphorical, that saying raises higher level questions, such as, “Is the life of a metaphorical early bird worth having?”

Consider what that life entails:  Getting up (early), pecking around on the ground for worms, and probably also for various insects and their larvae and a few arachnids as well*; trying to avoid, in that process, being caught by some predator (such as a house cat); trying to find and attract a mate when the season is right; helping build a nest, if you’re that kind of bird; guarding the eggs and maybe sitting on them yourself, until they hatch; then, feeding and protecting them until they can fly on their own; then repeating these steps until disease or starvation or one of those house cats gets you.

That’s it.  And while there are many embellishments and flourishes and complications in the typical human life cycle, overall it is much the same as that of the bird.  Why would we expect it to be otherwise?

Admittedly, humans (and humanoids) can dream up other things to do, and some of them are more interesting and fulfilling, from their own points of view at least, than the ordinary early bird pattern.  But though, in the long run, humans as a whole may become significant enough to do something truly meaningful on a cosmic scale, almost all of them have no deeper lives than those lived by the early birds.

That’s not necessarily a bad thing, of course.  Approached with the right attitude, such a life can be well lived and fulfilling.  It probably won’t end happily, because it’s not in the nature of life to be happy when ending; there’s just no real evolutionary benefit to having such a tendency.

Still, before imbibing the so-called Kool-Aid™ of the motivational life-messages‒those social moralities that keep us getting up and joining the rat race (to shoehorn in another animal-related metaphor)‒it would probably behoove us to consider whether that is the life we think we want, to ponder if that overall shape and experience are okay with us as the outline of our lives.

If so, there’s nothing wrong with that.  As long as you’re not interfering with other people’s ability to try to live their lives as they try to see fit**, then do what seems best to you.

But it’s useful to think about what might be the overall shape of your life if you continue as you currently are and if that shape will be aesthetically (or otherwise) pleasing to you.  If not, what change might improve that overall shape, trying to take all reasonably plausible inputs and outputs into consideration?

I won’t say that the unexamined life is not worth living, because, if it’s unexamined, how do you know that it’s not worth living?  Huh?  Huh?  Nevertheless, I will say that the unexamined, unconsidered life could be fulfilling only by accident, whereas it may be possible, with deliberation, to steer toward a better one.

Not that I’m a good piece of evidence in favor of this.  I think and overthink to the point that I hate the noise of my own mind, but I haven’t been able to steer myself into an optimal shape***.  But at least I make a lot of “noise” about such things.  That might be worth something.

Anyway, have a good day.  Enjoy your worms or salads or whatever other life forms you kill and consume to remain alive today (I’m assuming you are not a green plant).  Watch out for the Kool-Aid™ and even more so for the cats.


*I am quite sure that, to such a bird, these things taste delicious, so I don’t mean to disparage their diet as unpalatable.  Appetites of various kinds are species specific; what’s appetizing or sexually attractive to, say, a housefly is unlikely to appeal to any psychologically healthy human.  Likewise, the most beautiful human woman ever is not going to do anything for a male tarantula.  He also probably would have no interest in having a bite of her salad.

**This is more difficult to navigate than it may seem at first, because even when one is acting on one’s own, there are always effects at some level, there are always “externalities”, and occasionally these will have an impact on other people‒a foreseeable but perhaps unforeseen impact.  And vice versa.

***Should there be a “yet” at the end of that sentence?  I don’t know; we’ll have to see what happens to me in the future.  We can be reasonably sure, though, that there shouldn’t be a yeti at the end of that sentence, or of any sentence except one that mentions such creatures.

Our wills and fates do so contrary run, that our devices still are overthrown; Our blogs are ours, their ends none of our own.

Hello and good morning.  It’s Thursday, the 26th of February in 2026, a date that’s only very slightly interesting whether you write it as 2-26-2026 or 26-2-2026.  The fact that you have repeated 2s and repeated 26s is somewhat entertaining, but the zero throws potential symmetries off, making it not nearly as much fun as it could conceivably be.  It’s a shame, really.  I suppose you could write it as 26-02-2026 and rescue a bit of symmetry, but that feels like reaching.  It’s not quite symmetrical anyway, unless one is writing in base-26 or higher.  No, wait, even that wouldn’t work.

I don’t know about what I’m going to write this morning.  That in itself, of course, is nothing unusual.  But I don’t feel that I have much to say about anything at the moment.  I don’t want to get into my depression and ASD and anxiety and chronic pain and insomnia and just general moribund state, because I’m sure no one wants to hear about it anymore, and in any case, there seems to be no way anyone can do anything about it that’s useful, which makes it all the more frustrating.  Writing about it certainly hasn’t cured or even improved my state much, if at all.

Anyway, as I said the other day, you have been put on notice.  Unless you just started reading my blog for the first time yesterday, you’ve no right to act fucking surprised no matter what happens.

Okay, that’s that out of the way.

Now, let’s see, what should I write today?  I could discuss some topics in science, especially physics, though I also have literal, legally recognized expertise in biology, and I know a lot about quite a few other branches of science as well.  This is because I have always been curious about how the world, the universe, actually and literally works on the largest and on the most fundamental scales.

I mean, yes, humans also have their rules and laws and social mores and antisocial morays and all that nonsense, but if you step back even a bit, you can see nearly all human behavior encapsulated by basic primatology.  If you know how the various monkeys and gibbons and gorillas and chimpanzees behave‒especially their commonalities‒human behavior almost always fits right in.  It’s usually not even very atypical.

That doesn’t make the specifics of behavior very easily predictable in any given case, necessarily; then again, we understand an awful lot about the weather and the climate, but the specifics of tomorrow’s weather are tough to predict precisely and accurately, let alone next week’s weather.  Nevertheless, the physics of longer term climate effects of certain kinds of atmospheric gases is almost trivial.

Anyway, humans are too annoying to be very interesting, except in special circumstances.  In this, they are perhaps a bit like cockroaches.  From the point of view of a scientist who studies them, they can be interesting, and from just the right angle and with the right detachment, they can even be beautiful (or some of them can).  But overall, they are merely large masses of highly redundant little skitterers, just doing their shit-eating and reproducing and infesting almost every possible location.

Which type of creature did I mean to describe just now?  See if you can figure it out.

Of course, on closer scales, cognitive neuroscience and neurodevelopment and related stuff, such as “neural” networks, “deep” learning, and other such areas are fascinating.  One thing interesting about them is how all the things that brains and computers and so on are and do are implicit in the laws of physics‒clearly they are some of the things that stuff in the universe can do‒and yet, for all we know, they have only ever happened here, just this once in all the vast and possibly infinite cosmos*.

And for all we can tell, given the human proclivity to plan about 20 Planck units ahead and then after that trust to luck, this could be the only place they occur, and their time will not continue much longer, certainly not on a cosmic scale.

I could be wrong about that…except in the sense that, since I am stating it merely as one of the possibilities, I am not actually wrong at all.  Even if humans do survive into cosmic time scales and become cosmically significant, it will still be hard to deny that it could have happened otherwise‒that humans might have gone extinct and never gotten anywhere beyond Earth.

Of course, depending on the question of determinism, I suppose one could say that if humans (or their descendants) become cosmically significant then there literally was nothing else that could have happened, at least as seen from outside, at the “end”.

On the other hand, if Everettian quantum mechanics is the best description of the fundamental nature of reality, then in some sense, every quantum possibility actually happens “somewhere” in the universal quantum wave function, though those variations may not include all conceivably possible human outcomes.

Some things that seem as though they should be possible may simply never happen to occur (or occur to happen?) anywhere in the possible states of the universe.  That feels as though it should be unlikely, given how many possible states can be locally evolved in the quantum wave function, but I don’t think we know enough to be sure.

Okay, well, I vaguely hope that this has been mildly interesting and perhaps thought provoking.  It would be enjoyable to get more feedback and thoughts, but I don’t have a very large readership, and only a certain small percentage of people ever seem to interact with written material in any case, so I’m probably lucky to get the feedback that I get.

TTFN


*With the inescapable caveat that, if the universe is spatially and/or temporally infinite, and if, as it seems, there are only a finite number of distinguishable quantum states in any given region of spacetime (the upper limit of which is set by the surface area of an event horizon the size of the given region), then every local thing that happens, and all possible variations thereof, “happen” an infinite number of times.  But given that all these regions are more or less absolutely physically distinct and incapable of “communicating” with one another, they can be considered isolated instances in a “multiverse” rather than parts of the same “local universe”.
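For the record, the bound gestured at in that footnote is, I believe, the holographic (Bekenstein‒Hawking) entropy bound.  In LaTeX form, with $A$ the area of the bounding horizon and $\ell_P$ the Planck length:

```latex
S_{\max} = \frac{k_B\, A}{4\,\ell_P^{2}},
\qquad
N_{\text{states}} \sim e^{S_{\max}/k_B} = e^{A / 4\ell_P^{2}}
```

Since $A$ is finite for any finite region, the number of distinguishable states is finite, which is what lets the “everything recurs infinitely” argument go through in a spatially infinite universe.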

Are gravity and frivolity truly opposites?

It’s Wednesday morning (not quite five o’clock yet) and it is February 25th.  There are only ten more shopping months until Newtonmas*.

For those of you who don’t know (and as a reminder for those of you who do know) Isaac Newton was born on December 25th, 1642 (AD**).  Now, there is a parenthetical here:  Newton was born on December 25th by the Julian*** calendar, which was the one used in England at the time of his birth.  By the Gregorian**** calendar, Newton would have been born in early January of 1643.

This might seem to imply that December 25th nowadays shouldn’t be considered Newtonmas, but of course, it’s a closer fit than celebrating the birth of Jesus on that day; supposedly, biblical scholars have found that Jesus was probably born in the summer or something.  As with many things, “The Church” appropriated the popular holidays celebrating the winter solstice and grafted Christian religious significance onto them.

There’s nothing particularly bad about that.  All these holidays and divisions of the year are fairly arbitrary (though celebrating solstices and equinoxes is common enough in multiple cultures, which makes sense because these are objective events in any given year that can be noticed by any culture that is paying attention).

The length of a year is a concrete, empirical fact, as is the length of a day and the length of a lunar orbit around the Earth.  None of them are straightforward multiples of each other, unfortunately‒they are waves that are not harmonically associated with each other.

I don’t know how long it would take for their “waves” to come back into some primordial alignment and “start over”, but it’s probably moot, because the length of a day and of a lunar orbit and of the orbit of the Earth are changing slowly.  The moon, for instance, is moving steadily (but very slowly) away from the Earth over time, and so its time of orbit is increasing (since things that orbit farther away orbit more slowly).

I think Kepler’s third law was/is that the period of a planet’s orbit around the sun is proportional to the 3/2 power of the length of the semimajor axis of its orbit.  That same power law holds for the lunar orbit as well‒it applies to any two-body orbit, with the constant of proportionality set by the mass being orbited‒because the laws of gravity are as universal as anything we know.  Indeed, there are materials that are opaque to light, but as far as we know, there are none that are opaque to gravity.  Gravity is nevertheless constrained by the geometry of spacetime, and since the period grows as the 3/2 power of the distance, orbital periods always increase faster than the orbital distance does.
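As a quick sanity check of that 3/2 power, here is a sketch in Python using the Newtonian form of Kepler’s third law, T = 2π√(a³/GM); the constants are standard textbook values, and the small discrepancy from the true sidereal month comes from ignoring the Moon’s own mass and the Sun’s perturbations:

```python
import math

# Standard values: gravitational constant, Earth's mass, and the
# semimajor axis of the lunar orbit.
G = 6.674e-11        # m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # kg
A_MOON = 3.844e8     # m

# Kepler's third law in Newtonian form: T = 2*pi*sqrt(a^3 / (G*M)).
period_s = 2 * math.pi * math.sqrt(A_MOON**3 / (G * M_EARTH))
period_days = period_s / 86400
# ~27.4 days, close to the observed sidereal month of about 27.3 days.
```

Note that the period depends on the semimajor axis to the 3/2 power, so doubling the Moon’s distance would multiply its period by about 2.83, not 2.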

The inability of anything we know of to block gravity is one thing that makes me take seriously the notion that, at some level, there could be more than three spatial dimensions.  If gravity is not confined to three dimensions, then nothing that is so confined could stop it; it would merely flow around any obstacle (maybe gravitational waves, for instance, can even diffract around matter and energy, though that might not imply higher dimensions).

This is related, indirectly, to the fact that it is impossible to tie a knot in a string in 4 or higher spatial dimensions.

By the way, having those extra spatial dimensions curled up tiny, as is usually presented in depictions of the notions of string theory, is not the only way for them to exist and be undetected.  If most of the forces in the world we know‒the electromagnetic, the strong force, the weak force, and the various matter-related quantum fields‒are constrained to a 3-brane because their strings are “open-ended”, then we could live in such a 3-brane, nested in a higher-dimensional “bulk”.  Gravity could be conveyed by closed, “looped” strings, which could pass through the 3-brane, interacting with it but not being confined to it.  This could also explain the comparative weakness of the gravitational force, and might even explain dark matter (and why it is so difficult to detect).

This sounds extremely promising, maybe, but there are issues and hurdles, not the least that strings and higher spatial dimensions are very difficult to detect, if they exist.  Also, it’s very hard to pin down all the implications mathematically in a useful way.

I remember one lunch break when I was still in medical practice when I tried to see if I could work out mathematically if “dark matter” could be explained by a relatively nearby, parallel brane-universe (it would probably be more than one, but one was difficult enough) whose gravity spills over into and overlaps the gravity of our brane-universe.

Here’s a sort of reproduction of some of the scribbling I did then:

Unfortunately, though I could visualize what I was considering and get an intuitive feel for what the math would be like, my precise mathematical skills were just not up to the task of sorting it out rigorously.  Also, of course, lunch was not long enough, and I had many other things on my mind.  Anyway, findings like the “bullet cluster” provide some fairly strong evidence that “dark matter” is something physical within our three dimensions of space.

Okay, that’s enough for today.  I’ve managed not to talk about my depression and stress and self-destructive urges/wishes (except just now, of course), so I hope you’re pleased to have had those things cloaked from you today.

Take care.


*Working out the exact number of days, I make it 303.  December 25th is 6 days before the end of the year, so it’s day number 359 in the (non-leap) year.  And today is the 25th day of the second month, and January has 31 days, so today is day 56 of the year.  And, of course, 359 – 56 = 303.
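The day count can be verified with Python’s standard datetime module (2026 is not a leap year, so February contributes no off-by-one):

```python
from datetime import date

newtonmas = date(2026, 12, 25)
today = date(2026, 2, 25)

day_of_year = newtonmas.timetuple().tm_yday   # 359: Dec 25 is day 359 of a non-leap year
days_until = (newtonmas - today).days         # 303 days of shopping left
```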

**Why not my usual “AD or CE?”  Because at the time, in England, it was just “anno domini”.

***Named for Julius Caesar, though as far as I know, he had no more to do with actually formulating that calendar than he had with the invention of the 7th month.  As far as we know, he wasn’t even born by the then-existing version of Caesarean section, which was more or less always fatal to the mother, and his mother lived well beyond his birth.

****Named after Pope Gregory XIII, also known (by me) as Pope Gregory Peccary*****.  He did not formulate the newer calendar, but supposedly he at least commissioned the Vatican astronomers to create it when it had become obvious that the Julian calendar was not quite tracking the actual year but was overshooting over a long period of time.  So, the Gregorian calendar is better named than the Julian calendar, or so it seems to me.

*****The nocturnal, gregarious wild swine.

May the slope of your pain function always be negative

I’ve been thinking about something I wrote in my blog post yesterday.  I had thrown out the thought, in passing, about how it seemed as though all the things in my life that I still do are not things I necessarily do for joy or out of desire to achieve some goal, but rather they are things which are more painful not to do than to do, and so I do them.

There isn’t really a positive motivation—not the pursuit of happiness or improvement or fulfillment or enrichment.  It’s just that the feeling of stress and tension and anxiety (or whatever) regarding the prospect of, for instance, not going to work rapidly becomes worse than the equivalent feelings about going to work.

That’s not a great state of affairs.  Don’t get me wrong; it’s entirely natural.  I’ve written about this many times, this recognition of the fact that the negative experiences—fear, pain, revulsion, disgust, and so on—are the biologically most important ones.  Creatures that don’t run from danger, that don’t avoid injury, that don’t shy away from potential infection and poison, are far less likely to survive to reproduce than creatures that do those things.

We see clinical examples of people lacking some of these faculties—such as those with congenital insensitivity to pain—and while we might envy them a life without agony, it tends to be quite a short life.  Also, they tend to become immobile and deformed due to damage they do to their joints by not shifting position to improve blood flow.

In case you didn’t know, that’s one of the reasons you can’t stand completely still for very long; it’s not good for you.

But many of us, especially in the modern world, have some things that we do for positive experience.  Some of them are dubious, but food, sex, companionship/conversation, singing, dancing, all that stuff, are positive things.  Unfortunately, positive experience cannot be allowed—by biology—to last too long.

As Yuval Harari noted, a squirrel that got truly lasting satisfaction from eating a nut would be a squirrel that lived a very short—albeit fairly happy—life, and would be unlikely to leave too many offspring.

Maybe this is what happens to some drug addicts.  Maybe they really do get satisfaction or at least pleasure from drugs—and maybe that is what ends up destroying them.  At some level, that’s not truly in question, is it?  People who are addicted to drugs forego other pleasures and other positive things, but perhaps more importantly, they fail to avoid many sources of pain and fear and injury.

The reality is probably a bit of an amalgam, I suppose.  I would not say it’s a quantum superposition, though, except in the sense that everything is a quantum superposition (or, rather, a whole bunch of them).

This is one situation in which I think I’m right and Roger Penrose is wrong—a bold claim, but I think a fair one—in that I see no reason to suspect that the nature of consciousness either requires or even allows quantum processes, other than in the trivial sense that everything* involves quantum processes.  But there’s no reason seriously to think that (for instance) neurotubules can even sustain a quantum superposition internally, let alone that such a process can somehow affect the other processes of the neuron, many of which are well understood and show no sign of input from weird states of neurotubules, which act mainly structurally in neurons.

If deep learning systems—LLMs and the like—have demonstrated anything, it’s that intuitive thought** does not require anything magical, but rather can be a product of carefully curated, pruned, and adjusted networks of individual data processing units, feeding backward and forward and sideways in specific (but not necessarily preplanned or even well understood) ways.  No quantum magic or neurological voodoo need be involved.

I think too many people, even really smart people like Penrose, really want human intelligence to be something “special”, to be something that cannot be achieved except within human heads, and maybe in the heads of similar creatures.  Surely (they seem to believe) the human mind must have some pseudo-divine spark.  Otherwise, we oh-so-clever humans are just…just creatures in the world, evolved organisms, mortal and evanescent like everyone and everything else.

Which, of course, all the evidence and reasoning seems to suggest is the case.

Maybe, deep down, there isn’t much more to life than trying to choose the path from moment to moment that steers you toward the least “painful” thing you can find.

Please note, I’m not speaking here about some metaphorical continuum, some number line that points toward pleasure in one direction and pain in the other.  That’s at best a toy model.  In the actual body, in the actual nervous system, pain and fear and pleasure and motivation are literally separate systems, though clearly they interact.  Pleasure is not merely the absence of pain, nor is pain merely the absence of pleasure.  Even peripherally, the nerves that carry painful sensations (which include itching, as I noted yesterday!) use different paths and different neurotransmitters than the ones that deal in pleasure and positive sensation.

Within the brain, the amygdala and the nucleus accumbens (for instances) are separate structures—and more importantly, they perform different functions.  There’s nothing magical about their locations in the brain or the particular neurotransmitters they use.  Those things are accidents of evolutionary past.

There’s nothing inherently stimulating about epinephrine, and there’s nothing inherently soothing about endorphins or oxytocin, and there’s nothing inherently motivating or joyful about dopamine and serotonin.  They are all just molecular keys that have been forged to open specific “locks” or activate (or inactivate) specific processes in parts of other nerve cells (and some other types of cells).  It’s the process that does the work, Neo, not the neurotransmitter.

This brings up a slight pet peeve I have about people discussing “dopamine seeking” (often when talking about ADHD).  I know, the professionals probably use this as a mere shorthand, but that can be misleading to the relatively numerous nonprofessionals in the world.  The brain is not just a chemical vat.  Depression and the like are not just “chemical imbalances” in some ongoing multi-level redox reaction or something; they are malfunctions of complicated processes.  Improving them should be at least as involved as training an AI to recognize cat faces, wouldn’t you think?

But one can do the latter without really knowing the specifics of what is going on in the system.  It’s just sometimes difficult, and the things you think you need to train toward or with often end up giving you what you didn’t really want, or at least what you didn’t expect.

Maybe this is part of why mindfulness is useful (it’s not the only part).  With mindfulness, one actually engages in internal monitoring, not so much of the mechanical processes happening—no amount of mere meditation can reveal the structure of a neuron—but of the higher-scale, “emergent” processes happening, and one can learn from them and be better aware.  This can be an end in and of itself, of course.  But it can also at least sometimes help people decrease the amount of suffering they experience in their lives.

Speaking of that, I hope that reading this post has been at least slightly less painful for you than not reading it would have been.  Writing it has been less painful than I imagine not writing it would have been.  That doesn’t help my other chronic pain, of course, which continues to act up.


*With the possible exception of gravity.

**I.e., nonlinear processing and pattern recognition, the kind many people, including Penrose, think cannot be explained by ordinary computation, à la Gödel’s incompleteness theorems, etc.

 

That was a weird tangent dot com?

Well, it’s Friday, the 30th of January.  We’re almost done with the first month of the year (2026).  Has it been an auspicious month?  Has it been inauspicious?  I suppose the answer to such questions will vary from person to person depending upon how their personal month has gone.  And I suppose that points toward the notion that actual auspices are certainly not any kind of reliable indicator of how the future might go, at least not without great care to separate true patterns from false ones.

On the other hand, it’s not entirely mad to try to draw some potential conclusions about the near future from what’s happening in the present and what has happened in the recent past.  That’s one of the useful skills that’s available to minds that have the capacity to note patterns‒one can try to anticipate the future based on patterns one has noticed over time, and potentially, one can try thereby to avoid outcomes that are undesirable.

Of course, humans do tend to notice patterns that aren’t actually there a lot more than ones that really are there*.  This is usually‒probably‒related to the notion of the differential detriments of different types of errors:  It’s usually more useful to see potential threats that aren’t there than it is not to see potential threats that are there.

I think anyone who stops to think about such things will recognize that the first type of organism will be somewhat more likely to live long enough to reproduce than the second type, though they may be much less comfortable and content in the meantime.  Jumping at shadows can certainly be maladaptive, and too much of it can have a net negative effect on general outcomes, but not jumping at hyenas and lions (for instance) tends to be a very short-lived habit.

This goes back to my frequent talking point that fear‒the ability (and it is an ability) to become alarmed and unhappy but energized and driven to fight or flee‒is going to be present in nearly every lifeform capable of movement over time.  Variations who feel less fear, or none, will not tend to reproduce as much, because they are more likely to be killed in any given finite stretch of time, so whatever genetic makeup they have that leads them to lack a fear response, or to be prone to lack it, will not tend to propagate down the generations.

“Genetic makeup”, the term I used in that last sentence (go look, it’s there), made me think of a possible future technology in which people use some CRISPR-style techniques to achieve the effects that hitherto require the use of cosmetics.  They could insert genes into the cells of their cheeks, for instance, to lead them to have more pinkish pigment, or perhaps to make local blood vessels dilate for a nice blushing look, instead of having to use rouge (which is what I think the stuff is called that one applies to make one’s cheeks look pinker).  Or one could generate actual pigments in the cells of one’s upper eyelids, or increase the thickness of one’s eyelashes, all that sort of stuff.

Of course, doing this might entail risks.  Presumably, altering the genes of a given population of cells, even at the local level, could increase the risk of developing cancers, because one cannot perfectly control where genes will insert (at least not so far), and there will always be a chance of mucking up genes that regulate cell division rates.

Once one cell becomes more rapidly reproducing than its companion cells, it will tend to overpower them, in numbers anyway, over time***.  And with rapid and persistently higher rates of reproduction, there come more chances for new mutations to happen.  Those mutations that kill their cells obviously just go away more or less immediately.  Even the ones that revert their cells’ division rates back to “normal” will be quickly locally overwhelmed by the faster growing ones.  But a mutation that encourages even faster division/reproduction will quickly take hold as the dominant cell type, ceteris paribus.

And then, of course, this even more rapidly dividing population of cells will have that many more chances to develop mutations.  And so, down the line, given the billions of cells present in just one’s face, we find the chance for skin cancers to develop, once a cell line becomes so prone to reproduce itself that it cannot be constrained by any local hormonal or immune processes.
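To make the “faster division quickly takes hold” point concrete, here is a toy sketch in Python.  All the numbers are made up for illustration (this is not a biological simulation): a single mutant cell that doubles 30% faster than an established population of a million ordinary cells still overtakes them, because the gap compounds.

```python
# Toy model: lineage sizes under pure doubling, no death or resource limits.
def population(n0: float, doublings_per_day: float, days: int) -> float:
    """Size of a lineage starting at n0 cells, doubling at the given rate."""
    return n0 * 2.0 ** (doublings_per_day * days)

NORMAL_START = 1e6   # an established patch of ordinary cells (hypothetical)
MUTANT_START = 1.0   # one mutant cell
NORMAL_RATE = 1.0    # doublings per day (made-up)
MUTANT_RATE = 1.3    # 30% faster (made-up)

def crossover_day() -> int:
    """First whole day on which the mutant lineage outnumbers the normal one."""
    day = 0
    while population(MUTANT_START, MUTANT_RATE, day) <= population(NORMAL_START, NORMAL_RATE, day):
        day += 1
    return day

# Solving 2**(1.3*t) > 1e6 * 2**t analytically gives t > log2(1e6)/0.3 = 66.4,
# so the mutant line takes the lead on day 67 despite starting a millionfold behind.
```

The point of the sketch is only that the advantage is in the exponent: any rate edge, however small, eventually wins against any head start, however large.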

That was a weird tangent, wasn’t it?  Although, frankly, I could change the title of my blog from “robertelessar.com” to “thatwasaweirdtangent.com” and it would not be inappropriate.

I’ll finish up today with just some basic housekeeping style stuff:

I will probably not work tomorrow, so I will probably not be writing a blog post.  But if I do write one, it will show up here.  I will certainly not be sleeping in the office tonight, but I did sleep here last night.  I had a terrible day yesterday, pain-wise, and after work I went to the train station but the train was badly crowded and there were no relatively comfortable seats available, so I gave up and trudged back to the office.

I just felt worn out, and I feared that if I did go back to the house, I might not come to the office today.  And today is payday, of course, and Sunday is the first of a new month, so rent is due (Wouldn’t it be nice if rent was dew?  Maybe not if you lived in the Atacama Desert.  Though a little dew might be very strong currency there, come to think of it, relative to most of the rest of the world). 

Hopefully today will be a better day than yesterday with respect to pain.  So far, at least, it doesn’t feel any worse.  The hard office floor can help a bit sometimes with my back pain.  That makes a certain amount of sense, or at least it may do so.  After all, our ancestral environment did not include mattresses.

Anyway, that’s what I’m up to, that’s my life.  I mean that seriously.  That’s pretty much all there is to my life:  Getting up and getting to work (while writing a blog post), doing office stuff while dealing with noise and people and tinnitus, not getting long enough breaks because people seem incapable of watching the time, being the last to leave the office, commuting back to the house, trying to get at least a bit of sleep, and then repeating.  There appears to be nothing more than that coming my way until I’m dead.  Which, I think you might be able to understand, becomes more attractive and less frightening as the tedious, exhausted, and painful days go by.

I hope you all have a good weekend.  As for me, I hope at least to be able to sedate myself enough to have a longer-than-usual sleep tonight.  It’s not ideal (pharmacologically induced sleep being generally and significantly less beneficial than natural sleep), but it’s what I have to use.


*Think of the constellations**.

**Won’t someone please think of the constellations!?!?

***It’s like the difference between power functions:  a^b will grow much more rapidly**** when b is 3, for instance, than when b is 2 or 1.5 or 1.1, and so on.

****Stop looking at the negative side of the number line, dammit.  Just stipulate that a is always a positive number.  Or make the function the absolute value of a^b, in other words, |a^b|.