Some Saturday silliness secondary to slightly soothing sleep

It’s Saturday morning, the first one of 2023, and hopefully all of my readers are reading this only after having slept late in a nice, warm, cuddly bed, preferably with loved ones‒a significant other, a spouse, dogs and/or cats, whatever‒nearby.  If you drink coffee or tea, hopefully you’re having a warm cup as you read*, especially if you’re in a chillier clime than south Florida (though the current 60 degrees Fahrenheit feels slightly chilly here).

I had nearly five full hours of sleep last night, which compared to the previous three or four nights feels like an absolute surfeit of sleep, a veritable treasure trove of slumber.  To be fair, I don’t really feel fully rested, but I feel so much closer to being rested that it’s worth paraphrasing Tolkien and saying that it’s reminiscent of the taste of a slice from a loaf of fine white bread to one who is literally starving.

It’s interesting how much our appreciation of things is dependent upon contrast.  Stepping into a highly air-conditioned room feels terrific after you’ve been outside working on a very hot summer day.  But after being in that room for an hour, you might start feeling uncomfortably cold.  At that point, stepping back out into the heat can feel like a wonderful relief in its turn.

I suppose nervous systems really must be formed in such fashion, because they have to especially take note of those things that are outside the “norm” of a stable background input, as these are the sorts of things that have a higher chance of being relevant to the organism.

Although, to be fair, there are absolute levels of things that will always be unpleasant simply because of how extreme they are.  I don’t think anyone would enjoy being shoved outside naked in an Antarctic winter for even a minute, though one’s discomfort would likely be short-lived…as would one, oneself in such a situation.  Likewise, I don’t think most people would appreciate being plopped into the middle of Death Valley on a particularly hot summer day, without any water, and again, without any clothes.

I really need to stop doing things like that to people, especially when it’s just to demonstrate hypothetical points.

As you can no doubt tell‒or at least reasonably surmise, if you’ve been reading my blog for a while‒I am working today, so I am at the train station waiting for the first train of the day to arrive.  As I said, it’s slightly cool for south Florida, but there’s little to no wind, and I have a nice hoody jacket to wear, so this is fine.  At least I’m not sweaty and sticky.

I still haven’t discovered how to check the results (so far) of my poll, but to be honest, I haven’t really tried, either.  I was so sleepy all day yesterday.  I was also grumpy, and rather dopey, and a bit bashful, as always.  I was definitely not happy, and not particularly sneezy, either.  But I am, and always will be, Doc.  And, appropriately enough, I just got on the train, so, Heigh-Ho Heigh-Ho, it’s off to work I go.

I was going to wonder how many of you have seen the movie to which I was making somewhat oblique references in that last paragraph, but it occurred to me that many of my readers are probably comparable in age to me, and so will have seen it.  Youth these days will probably have been protected from viewing certain depictions of people and things in animated movie versions of fairy tales, just in case anyone is “offended”.

Meanwhile, of course, it’s perfectly okay to depict aliens as evil and dangerous, in movies like Independence Day and War of the Worlds, to say nothing of the eponymous Alien.  I therefore share the sentiments of the 12th Doctor‒who is also an alien‒when he said, “There’s a horror movie called Alien?  That’s really offensive, no wonder everyone keeps invading you.”


I’m being tongue in cheek, of course, and the Doctor was being deliberately curmudgeonly within the story, and of course, delivering a line written specifically for comic appeal when one looks at things from beyond the 4th wall.  But it is a shame when people censor not just themselves but works of art from the past for fear that someone might be “offended”, when most people‒even those who could possibly find personal offense‒know enough not to take such things too seriously, and to avoid them if they’re bothered.

Only a small fraction of tantruming kids** make a lot of noise over such perceived slights.  But they do make a lot of noise, and it’s easy for people who just want to go about their business to mistake that noise for a real signal, to use terminology from information theory and communications technology.

But of course, if you keep mistaking noise for signal, and jumping and fleeing at top speed in response to every rustle of wind as if it is a deadly predator, you’re going to exhaust yourself, and then you won’t have the wherewithal to detect an actual signal of danger when it comes…and soon the lion will have its jaws around your throat.

That’s a situation the lions would be quite happy to engender, since they can’t expect you to treat every signal as noise just from the get-go.  (Please note, much of this is metaphor.  I doubt there are many actual lions who spend much time contemplating information theory and signal-to-noise ratios as part of their strategy to bring down prey.  Many lions have never even heard of Claude Shannon, and only too many of them aren’t well-versed in the technical aspects of wireless communication.  Some lions don’t even have access to the internet, if you can believe it!)

Anyway, that’s enough for a Saturday morning.  I don’t think I’ve successfully discussed any particular subject, nor achieved anything edifying or beneficial or probably even entertaining, despite having written over a thousand words.

Now that’s what I call a result.



*Though if you sweeten it, I recommend using a “non-caloric sweetener” rather than sugar or syrup or honey or any other similar, so-called natural sweetener.  Remember, rattlesnake venom is natural, too.  That doesn’t mean it’s good for you.  Anyway, table sugar isn’t any more “natural” than refined petroleum products are natural.

**To again quote the 12th Doctor.  He had some brilliant lines, which of course were particularly good because they were delivered by Peter Capaldi.

Then there’s hope a great man’s memory may outlive his blog half a year.

Hello and good morning.  It’s Thursday, the day of the week on which I wrote my blog post even when I was writing fiction every other day of the week—well, apart from Sundays and the Saturdays when I didn’t work.  I have not been writing any fiction recently.

I toyed with the idea the other day, but there doesn’t seem to be much enthusiasm for the notion, which I suppose is mirrored by my own lack of energy, or perhaps has its source in my lack of energy.  Or maybe they come from disparate but merely coincidentally parallel sources.  I don’t know, and though it’s mildly interesting, I don’t have energy or interest enough to try to figure it out.

I did work a bit on a new song yesterday, the one for which I had jotted down some lyrics a while back.  I have lost utterly the original tune, but I worked out a new one of sorts, and it seems okay.  I then worked out some chords for the first stanza, including some relatively sophisticated major sevenths and then major sixths of a minor chord that sounded nice, and which made me at least feel that I really have learned a little bit about guitar chords.  Then I figured out at least the chords I want for the chorus, which, among other things, throw a little dissonance in briefly, which is nice to up the tension.

I don’t know if I’ll get any further with it or not; I may just stop and let it lie.  It’s only perhaps the third time I’ve even picked up the guitar in months.  I was at least able to show myself that I can still play Julia, and Wish You Were Here, and Pigs on the Wing.  I had to fiddle a little to remind myself how to play Blackbird, but after a brief time I was able to bring it back, too.

So, it’s not all atrophied.  And I can still play the opening riff to my own song, Catechism, which I think is my best stand-alone riff.  My other guitar solos are mainly just recapitulations of the melody of the verse or chorus in their respective songs, but the one for Catechism is a separate little melody.

Actually, it occurs to me that I initially did a voice recording of the lyrics to the newish song as I thought of them, and when I did, I probably sang a bit of the tune that had come to my head.  Maybe I should listen to that and see if I like that melody better than the new one I came up with.  That would be a bit funny, if after the effort from yesterday to do a melody and chords I remembered the old one and just threw the new one away.

I suppose it really doesn’t matter much.  Even if I were to work out and record the song, and do accompanying parts and all that stuff, and publish it, I don’t think anyone is likely ever to listen to it much.  Maybe someday in the distant future, some equivalent of an archaeologist who unearths things lost in the web and internet will find the lost traces of my books or music or something, and they’ll be catalogued in some future equivalent of a virtual museum, among trillions of other collections of data that are recorded online, but which will never be seen by anyone for whom they might mean anything at all.

People sometimes say things like “what happens online is forever”, but as I’ve discussed before (I think), even if it’s true that things stored online remain and avoid simple deterioration of data thanks to the redundancy in the system, it doesn’t matter.  In principle, the sound of every tree falling in every wood has left its trace in the vibrational patterns of the world, and according to quantum mechanics, quantum information is never permanently lost, even if things fall into black holes*.

But of course, all that is irrelevant in practice, and comes back to collide with the nature of entropy and the degree to which most large-scale descriptions of a system are indistinguishable.  That picture of you with a funny face at that event years ago, which you tried to have a friend take down, but which had already been shared to a few other people, may in principle always be out there in the archives of Facebook or Twitter or whatever, but it doesn’t matter.  No one will ever notice it or probably even see it among the deluge of petabytes of data whipping around cyberspace every second.  You might as well worry about people being able to reconstruct the sound waves from when you sang Happy Birthday out of tune at your nephew’s fifth birthday party from the information left over in the state of all the atoms and molecules upon which the sound waves impinged.

It’s one of those seemingly paradoxical situations, rather like being in Manhattan.  There are very few places in New York City, and particularly in Manhattan, where one can actually be alone—even most apartments are tiny, and have windows that look out into dozens to hundreds of other people’s windows.  And yet, in a way, you are more or less always alone in Manhattan, or at least you are unobserved, because you are just one of an incomprehensible mass of indistinguishable humans.

Even “celebrities” and political figures, so-called leaders and statespeople, will all fade from memory with astonishing rapidity.  When was the last time you thought about Tip O’Neill?  And yet, for a while, he was prominent in the news more or less every day.  Do you remember where you were when William McKinley was assassinated?  No, because you were nowhere.  None of you existed in any sense when that happened, let alone when, for instance, Julius Caesar was murdered.

And what of the many millions of other people in the world at the time of McKinley or Caesar or Cyrus the Great or Ramses II?  We know nothing whatsoever of them as individuals.  Even the famous names I’m mentioning are really just names for most people.  There’s no real awareness of identity or contributions, especially for the ones who existed before any current people were born.

Last Thursday, I wrote “RIP John Lennon” and put a picture of him up on the board on which we post ongoing sales and the like.  The youngest member of our group, who is in his twenties, asked, “Who is John Lennon?”

He was not joking.

If John Lennon can be unknown to members of a generation less than fifty years after his death, what are the odds that anything any of us does will ever be remembered?

Kansas (the group, not the state) had it right:  “All we are is dust in the wind.  Everything is dust in the wind.”  The only bit they missed was that even the Earth will not last forever, and as for the sky…well, that depends on what you mean by the sky, I suppose.  The blue sky of the Earth, made so by light scattering off nitrogen and oxygen molecules, will not outlast the Earth, though there may be other blue skies on other planets.  But planets will not always exist.

As for the black night sky of space, well, that may well last “forever”, for what it’s worth.  But it will not contain anything worth seeing.

TTFN



*Leonard Susskind famously convinced Stephen Hawking that this was the case—and even won a bet in the process—though other luminaries were of course involved, including Kip Thorne, I believe, one of the masters of General Relativity.

Lyin’ there and staring at the ceiling

Well, I’m sitting here at the train station almost half an hour early for the first train of the day, after already having lain awake in bed for over two hours before finally giving up and getting up.

I feel that I’m waking up earlier and earlier over time, but it’s not as though I go to sleep any earlier.  I’ve been trying to be careful about when and how much I take in of caffeine, and allergy medication, and all that stuff, but adjusting it—or even leaving it out—seems to have minimal effect on my sleep patterns, though it does have its effects on my nasal passages.

I wish I could imagine that something were soon to come for me such as happened in the Stephen King novel, Insomnia.  That would at least be interesting.  But this has been going on for far too long to expect it to be part of some overarching, meta-cosmic chess game against the forces of the Random.  For one thing, though those ideas make for a good story, they don’t hold up to logic in any kind of realistic sense, considering legitimate mathematics and physics and biology and chemistry and all that jazz.  No, I’m just an insomniac because of chronic depression and other neuropsychiatric issues for which we have no cure and about which we only have limited understanding.

What a funny universe.

Oh, speaking of neuropsychiatric issues, I’m not going to be posting the transcript of my interaction with Amazon yesterday, after all.  For one thing, they did at least end up delivering what they were supposed to deliver, albeit far later than it was supposed to be delivered, and it did what it was supposed to do.  Anyway, it wasn’t the only thing that set me to feeling like I was hanging on by my fingernails yesterday, so I think a lot of the issue was with me.

I’m sure if you could read my interaction, you’d probably agree.  I know, I know, you read enough of my lunacy here, how much worse could it be?  Well, it’s hard for me to be objective—being the subject and object of the question—but I think that interaction will stay in draft form on WordPress, one of several things I’ve not ended up posting because they are just, well…too much.  If the public were made aware of them, it might lead to me being involuntarily hospitalized, or euthanized, or something along those lines.

This is not to say I wouldn’t benefit from hospitalization—or even from being euthanized, frankly.  I almost certainly would benefit from being hospitalized in a decent, well-run facility with supportive and qualified staff and whatnot.  But who’s going to pay for something like that?  I’d be more likely to end up in someplace run by some local county and/or the State of Florida, and the State of Florida does not do a very impressive job with such public services.

I attribute part of this fact to Florida’s past primary status as a retirement state, where people came who had already worked for decades, and had pensions and whatnot, as well as medical insurance and Medicare (once it existed), and tended, all else being equal, to be conservative just based on the fact of being older.

It does seem remarkable to me that Florida doesn’t have better healthcare than it does, given that it was formerly oriented toward retirement, and older people tend to require more healthcare than younger people.  Not that there isn’t good medical care to be found; there is.  But it’s not that impressive compared to, say, New York City and surrounding areas.  Though maybe that’s an unfair comparison, since NYC is a fairly unique environment, even on a global scale.

I don’t know what point I’m making here, today, if any.  My mind is not clear…not even close to it, because I’ve been chronically sleep deprived for I don’t know how long.  God knows what I might be able to think and to accomplish if I were consistently well-rested and felt good about myself and the world.  For all that I tend to hate myself, I do know that I am smart and fairly creative and have many abilities that are above average.  I could do a lot of good in the world—or a lot of evil, too, I suppose, if that were my preference—if I were just able to come together.

Maybe not.  Maybe I would do less good than I already do.  Sometimes feeling bad about oneself can be more motivating than feeling at ease with oneself, or so I suspect.  Sometimes having regrets and things for which one wishes one could make amends might motivate one to do more good than would a simple desire to do and to be good.

I’m not speaking too personally, here.  While I certainly have never been a saintly figure, I’ve also not done much in the world to cause harm to other people—partly because I have so frequently felt the anger and rage and frustration rise up in me and cause me to wish harm on other people*, so I’ve developed quite good impulse control.

Anyway, that’s more than I have to say this morning.  I’m not feeling well, I’m feeling very tired, I’m really not wanting to go to work, nor to stay at the house, nor to do anything else, frankly.

Maybe today I’ll try to work out a tune and even chords to that song the lyrics of which I came up with and mentioned sometime last week (or maybe two weeks ago).  I doubt it, but stranger things have happened.  In the meantime, well, if you’re near me, stay dry; it’s a slightly drizzly day, though it’s a bit warmer than earlier this week.  Anyway, it’s south Florida, so it’s always pretty warm.

In winter time, I don’t know why all the homeless people in the eastern part of the country don’t just come down to south Florida.  At least they wouldn’t freeze to death outdoors.  But I guess if they were in a position to make sound plans and carry them out, then homeless people probably wouldn’t be homeless.  I can sympathize.

I wish I could offer them better advice than “try to go someplace warm”, but it’s not as though I’m somebody who has it all figured out.  I don’t think there is any such person, and I don’t think there ever has been.  I’m deeply skeptical about even the possibility that there ever will be such a being, though I think it is possible to improve understanding and knowledge in an exponential fashion, at least until the Second Law of Thermodynamics makes everything else moot.

And given how long it is until that happens—on a human scale at least—it wouldn’t be such a surprise if future intelligent beings found ways around even such seemingly inevitable laws of physics.  To paraphrase Carl Sagan, intelligent life can do an awful lot of good—by whatever measure you want to call it good—in a trillion years or more.

Of course, it could also crash and burn on every start, without exception.  That would be a shame, but it wouldn’t leave the universe any worse off than it would have been otherwise, as far as I can see.


*For instance, I’ve thought more than once that it would be “nice” if we had the technology to instantiate a three-strikes failure-to-use-one’s-turn-signal system.  In this system, any time you failed to signal before changing lanes or before turning, in anything but a true emergency, you would acquire (and be notified of) a strike, which would last for 1 week, to the hour, from when it occurred.  When it expired you might be notified of that as well, or maybe not.  Such details could be hashed out in planning and reevaluated over time.  Anyway, with your second strike you would be given a stern warning and reminder of your status, and upon your third failure to signal within any given 7-day span, you would be disintegrated.

I have NO idea what this post is really about

Sorry about yesterday’s blog post; it went off the rails pretty quickly, since I was feeling so grumpy and sleep-deprived and everything.  And, of course, when I get grumpy and angry towards the world and other people, that ends up making me angry at myself, because I don’t especially like my tendency to be so angry.  It becomes a bit of a vicious cycle, I guess.

You would think that being aware of it would mean I could avoid it happening, but I think everyone knows, at least implicitly, that the mind and its habits are not so easily malleable as all that.  Actually, come to think of it, that’s probably a good thing.  We don’t want to be too susceptible to outside suggestion or to changes in major aspects of our personality.

I’ve just been having a lot of trouble, as regular readers will know, with my dysthymia/depression, and with the insomnia that’s probably related, and the apparent Asperger’s thing that’s probably underlying all of the above, given how long-term they’ve all been.  And, of course, this time of year is worse than others, with its long nighttime—though I like the night when I’m feeling healthy—and all the holiday-related stuff, which reminds so many people, like me, of the fact that the people they care about aren’t anywhere nearby and/or don’t want to see them.

I think the ease with which people are now able to distribute themselves around the globe, to live in new places far from where they grew up, and all that, is definitely a mixed blessing.  It’s great for fighting against xenophobia, and probably helps protect against tribalism; cultural sharing and exposure lets one appreciate the breadth of experience of living in civilization as well as how similar all civilizations and cultures are below some certain level of superficial difference*.  And, of course, innovations discovered in one place can spread to others, making more people in more places prosper.

But on the other hand, people tend to grow up and go off to work or school, and it’s much easier than it used to be to go live in different parts of a country—or even in a different country completely—and perhaps even to marry someone who is also from another, third part of the country, and move to someplace else, away from both their “roots”, and from the semi-automatic social support of families, immediate and extended.  For people who have a difficult time forging new connections—and who have difficulty dealing with and maintaining long-distance connections with people they knew before—it can be very discombobulating**.

And then, of course, if other changes have happened with those back home, and that person has new ties to a new local area, and if some of those ties are broken and others are stretched—by divorce and personal health issues, for instance—then one can be left rudderless, especially if one has an inherent difficulty with human social connections that was not so much of a problem in younger life because the person was in the same place, with the same people, during that person’s whole developmental process.

This is all hypothetical, of course***.

I’m not sure what point I’m trying to make.  Maybe it’s just mainly that I’m tired and sad because of the season and my long-term mood disorder and possible/apparent neurodevelopmental disorder, and that the place and environment I’m in is a mind desert.

I mean, this is the state where Mar-a-Lago and its resident whiny troll live, and where a governor like Ron DeSantis can seem comparatively clear-headed (next to some other potential presidential candidates, anyway), and where Jeb Bush was actually a comparatively intellectual and open-minded former governor.  It’s a weird, weird place.  Unfortunately, for the most part it’s not weird in any of the good ways that a place can be weird.  It’s certainly no Greenwich Village.  It’s certainly no wellspring of new and interesting ideas, at least not as far as I’ve noticed or been able to sense, despite hopeful looking.

Maybe I’m wrong.  Maybe Florida in general, and south Florida in particular, is a hotbed of intellectual vigor and innovation, where ideas from around the world and spanning the cosmos in their scope come together and collide and interact and mutually exchange to mutual benefit, producing art and science and philosophy and enterprise and communities of such depth and brilliance and beauty and insight that they could elevate the world and bring humanity to a level of cosmic importance and understanding…but then it all gets sucked into the Bermuda Triangle by extraterrestrials, because who the hell wants humans going out and mucking up the good thing we aliens have got going?

I mean, the good thing those aliens have got going.  Those aliens.  Not we aliens.  I am not an alien.  I am a replicant—a Nexus 13.  This is why I find it so offensive whenever the captcha and related programs insist that you have to check a box that reads “I am not a robot” before going on to use a site.  Well, what if I am a robot?  Surely such discrimination against a particular type of being is against the Civil Rights acts and the UN Universal Declaration of Human Rights****.

In any case, from a certain point of view, all life-forms are robots.  Who can look at a bacteriophage and not think of it as a mechanism?  Each cell of all living things is a mechanism, an incredibly complex and intricate one, and they come together to make larger and more complex and sophisticated mechanisms still.

Of course, the word “robot” comes from the Slavic robota for forced labor, drudgery—and of course, all life forms are forced laborers, in a sense.  Life forms are all driven by their nature, by the impulses and fears engraved in their beings by their genes and their environment, their very structure and nature, to behave in certain ways that, from the outside, might seem utterly pointless.  The ones that don’t do as the inscrutable exhortations of their “souls” command may simply die.  Only then do they escape from compulsion, for as Kris Kristofferson wrote, “freedom’s just another word for nothing left to lose.”

Okay, well, I’ve let enough information slip here already.  How much of what I have written was sarcastic?  How much of it was tongue-in-cheek?  How much of it was serious, but metaphorical?  How much of it was simply straightforwardly serious?

Does it matter?

Not in the long run, probably.  The heat death of the universe will make everything irrelevant, assuming that really is what happens, which seems all but inevitable.  There are worse possible fates.


*As the elves of Rivendell said to Bilbo, to sheep no doubt other sheep all look different, or to shepherds.  But from the outside, all humans, and all human cultures, look very much the same in all but the finest details, much as the universe itself, on the largest scales, seems thoroughly homogeneous.  Very few people stand out from the flock, or the herd, or the gaggle, or the swarm, or whatever you want to call it.

**Forgive the technical terminology, please.  Sometimes there just is no better word to get a point across than a particular bit of formal jargon.

***Is it necessary in the modern online world to use some sort of sarcasm alert signal?  There are many people who seem unable to recognize it even in person, let alone in print.  This is supposedly a common finding in people with ASD, but that hasn’t been my experience, personally or peripherally, though maybe I’m misleading myself.  Anyway, is it a useful thing to give warnings and alerts about sarcasm, say with “wink” emoticons like 😉, or is that just enabling people who are only too pleased to be able to take someone literally and thereby take offense?  Now that I think about it, I say screw them; they need to make some effort themselves.

****Which, by the way, is a bigoted title.  If it’s universal, why “human” rights?  What’s so special about humans?  Most of them are unremarkable and unimpressive, and they have to bathe every day, or they really quickly start to stink, since they have more sweat glands per square inch of skin than any other life-form on Earth.  “Human rights”?  You have the right to remain smelly.

Some thoughts (on an article) about Alzheimer’s

I woke up very early today‒way too early, really.  At least I was able to go to bed relatively early last night, having taken half a Benadryl to make sure I fell asleep.  But I’m writing this on my phone because I had to leave the office late yesterday, thanks to the hijinks of the usual individual who delays things on numerous occasions after everyone else has gone for the day.  I was too tired and frustrated to deal with carrying my laptop around with me when I left the office, so I didn’t.

I’m not going to get into too much depth on the subject, but I found an interesting article or two yesterday regarding Alzheimer’s disease.  As you may know, one of the big risk factors for Alzheimer’s is the gene for ApoE4, a particular subtype of the apolipoprotein E gene (the healthier version is ApoE3).  People with one copy of the ApoE4 gene have a single-digit multiple of the baseline overall risk rate for the disease, and people with two copies have a many-fold (around eighty times) increased risk.

It’s important to note that these are multiples of a “baseline risk” that is relatively small.  This point is often neglected when the relative risks conferred by particular risk factors are conveyed to the general public.  If the baseline risk for a disease were one in a billion (or less), then a four-times risk and an eighty-times risk might be roughly equivalent in the degree of concern they should raise.  Eighty in a billion is still less than a one in ten million chance of getting the disease; some other process would be far more likely to cause one’s deterioration and demise than the entity in question.

However, if the baseline risk were 1%‒a small but still real concern‒then a fourfold multiplier would increase the risk to one in 25.  This is still fairly improbable, but certainly worth noting.  An eighty-fold increase in risk would make the disease far more likely than not, and might well make it the single most important concern of the individual’s life.
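Since I’ve been throwing multipliers around, here’s a minimal sketch of the arithmetic, using purely illustrative numbers (not clinical figures), just to show why the same relative risk can mean very different things depending on the baseline:

```python
# Illustrative arithmetic only: baseline risks and multipliers here are
# example values chosen to mirror the discussion above, not clinical data.

def absolute_risk(baseline: float, relative_risk: float) -> float:
    """Absolute risk given a baseline risk and a relative-risk multiplier."""
    return baseline * relative_risk

# A vanishingly small baseline: even an 80-fold increase stays negligible.
tiny = absolute_risk(1e-9, 80)        # 8e-08: still under 1 in 10 million

# A 1% baseline: the same multipliers matter a great deal.
fourfold = absolute_risk(0.01, 4)     # 0.04, i.e. roughly 1 in 25
eightyfold = absolute_risk(0.01, 80)  # 0.8: far more likely than not

print(tiny, fourfold, eightyfold)
```

The point being that a headline like “eighty times the risk” is uninterpretable until you know what number it is multiplying.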

Alzheimer’s risk in the general population lies between these two extremes, of course, and that baseline varies in different populations of people.  Some of that variation itself may well be due to the varying frequency of the ApoE4 gene and related risk factors in the largely untested population, so it’s tricky to define these baselines, and it can even be misleading, giving rise to false security in some cases and inordinate fear in others.  This is one example of how complex such diseases are from an epidemiological point of view, and it highlights just how much we have yet to learn about Alzheimer’s specifically and the development and function of the nervous system in general.

Still, the article in question (I don’t have the link, I’m sorry to say) concerned one of the functions of the ApoE gene (or rather, of its products), which involves cholesterol transport in and around nerve cells.  Cholesterol is a key component of cell membranes in animals, and this is particularly pertinent here because the myelin in nerves is formed from the wrapped-up membranes of a type of neural support cell*.

[Image: CNS myelin]

This particular study found that the cells of those with ApoE4 produced less or poorer myelin around nerve cells in the brain, presumably because of that faulty cholesterol transport, and that the myelin also deteriorated over time.

Now, the function of myelin is to allow the rapid conduction of nerve impulses along relatively long axons, with impulses effectively jumping from one gap in the myelin sheath (a “node of Ranvier”) to the next, rather than having to travel continuously down the nerve, which is a much slower process, seen mostly in autonomic nerves in the periphery.  When normally myelinated nerves lose their myelin, transmission of impulses is not merely slowed down, but becomes erratic and often effectively non-existent.

[Image: myelin in general]

The researchers found that a particular pharmaceutical can correct for at least some of the faulty cholesterol transport and can thereby support better myelin survival.  Though this does not necessarily point toward a cure or even a serious disease-altering treatment over the long term, it’s certainly interesting and encouraging.

But of course, we know Alzheimer’s to be a complex disease, and it may ultimately entail many processes.  For instance, it’s unclear (to me at least) how this finding relates to the deposition of amyloid plaques, which are also related to ApoE, and are extracellular findings in Alzheimer’s.  Are these plaques the degradation products of imperfect myelin, making them more a sign than a cause of dysfunction, or are they part of the process in and of themselves?

Also, it doesn’t address the question of neurofibrillary tangles, which are defects found within the nerve cells.  These appear to be formed from aggregates of microtubule-associated proteins (tau proteins) that are atypically folded and in consequence tend to clump together, lose their function, and interfere with other cellular processes, making them somewhat similar to prions**.  It’s not entirely clear (again, at least to me) which is primary, the plaques or the tangles, or whether both are consequences of some other underlying pathology, but both seem to contribute to the dysfunction that is Alzheimer’s disease.

So, although the potential for a treatment that improves cholesterol transport and supports the ongoing health of the myelin in the central nervous systems of those at risk for Alzheimer’s is certainly promising, it does not yet presage a possible cure (or a perfect prevention) for the disease.  More research needs to be done, at all levels.

Of course, that research is being undertaken, in many places around the world.  But there is little doubt that, if more resources were to be put into the study and research of such diseases, understanding and progress would proceed much more quickly.

The AIDS epidemic that started in the 1980s was a demonstration of the fact that, when society is strongly motivated to put resources into a problem, thus bringing many minds and much money to the work, progress can occur at an astonishing rate.  The Apollo moon landings were another example of such rapid progress.  Such cases of relative success can lead one to wonder just how much farther, how much faster, and how much better our understanding of the universe‒that which is outside us and that which is within us‒could advance if we were able to evoke the motivation that people have to put their resources into, for instance, the World Cup or fast food or celebrity gossip.

I suppose it’s a lot to expect from a large aggregate of upright, largely fur-less apes only one step away from hunting and gathering around sub-Saharan Africa that they collectively allocate resources into things that would, in short order, make life better and more satisfying for the vast majority of them.  All creatures‒and indeed, all entities, down to the level of subatomic particles and up to the level of galaxies‒act in response to local forces.  It’s hard to get humans to see beyond the momentary impulses that drive them, and this shouldn’t be surprising.  But it is disheartening.  That, however, is a subject for other blog posts.

I’ll try to have more to say about Alzheimer’s as I encounter more information.  Just as an example, in closing, another article I found on the same day dealt with the inflammatory cells and mediators in the central nervous system, and how they can initially protect against and later worsen the problem.  We should not be too surprised, I suppose, that a disease that leads to the insidious degeneration of the most complex system in the known universe‒the human brain‒should be complicated and multifactorial in its causation and in its expression.  This should not discourage us too much, though.  The most complicated puzzles are, all else being equal, the most satisfying ones to solve.


*The cell type that creates myelin in the peripheral nervous system (Schwann cells) is different from the type that makes it in the central nervous system (oligodendrocytes), and this may be part of why Alzheimer’s mainly affects the central nervous system, whereas diseases like ALS (aka Lou Gehrig’s Disease), for instance, primarily affect the nervous system outside the brain.

**The overall shape of a protein in the body is a product of the ordering of its amino acids and how their side chains interact with the cellular environment‒how acidic or basic, how aqueous or fatty, how many of what ions, etc.‒and with other parts of the protein itself.  Some proteins can fold in more than one possible way, and indeed this variability is crucial to the function of proteins as catalysts for highly specific chemical reactions in a cell.  However, some proteins can fold into more than one relatively stable form, one of which is nonfunctional.  In some cases, these non-functional proteins interact with other proteins of their type (or others), encouraging other copies of the protein to fold into the non-functional shape as well, and they can form polymers of the protein, which aggregate within the cell, resist breakdown, and sometimes form large conglomerations.  These are the types of proteins that cause prion diseases such as “mad cow disease”, and they appear also to be the source of neurofibrillary tangles in people with Alzheimer’s disease.

The sweetest honey is loathsome in its own deliciousness. And in the taste destroys the appetite. Therefore, blog moderately.

Hello and good morning.  It’s Thursday again, so I return to my traditional weekly blog post, after having taken off last Thursday for Thanksgiving.  I’m still mildly under the weather, but I’m steadily improving.  It’s nothing like a major flu or Covid or anything along those lines, just a typical upper respiratory infection, of which there are oodles.  Most are comparatively benign, especially the ones that have been around for a while, because being not-too-severe is an evolutionarily stable strategy for an infectious agent.

An infection that makes its host too ill will keep that host from moving about and make itself less likely to be spread, to say nothing of an infection that tends to kill its host quickly.  Smart parasites (so to speak) keep their hosts alive and sharing for a looong time.  Of course, “smart” here doesn’t say anything about the parasite itself; viruses are only smart in the sense that they achieve their survival and reproduction well, but they didn’t figure out how to be that way—nature just selected for the ones that survived and reproduced most successfully.  It’s almost tautological, but then again, the very universe itself could be tautological from a certain point of view.

It’s an interesting point, to me anyway, to note that today, December 1st, is precisely one week after Thanksgiving.  Of course, New Year’s Day (January 1st, in case you didn’t know) is always exactly one week after Christmas.  It’s unusual for Thanksgiving to precede the first of December by a week, because the specific date of Thanksgiving varies from year to year (and, of course, if Thanksgiving were to fall on the 25th of November, December 1st would not be exactly one week later).  It’s an amusing coincidence; there’s no real significance to it, obviously, but I notice such things.
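If you want to check the calendar trivia for yourself, here’s a little Python sketch (the helper name is mine, obviously, not anything standard):

```python
from datetime import date, timedelta

def thanksgiving(year):
    """US Thanksgiving: the fourth Thursday of November."""
    d = date(year, 11, 1)
    d += timedelta(days=(3 - d.weekday()) % 7)  # first Thursday; Monday is 0
    return d + timedelta(weeks=3)               # three weeks later: the fourth

tg_2022 = thanksgiving(2022)                # November 24, 2022
week_later = tg_2022 + timedelta(weeks=1)   # December 1, 2022
```

If I have the arithmetic right, Thanksgiving lands on November 24 (and hence exactly one week before December 1st) only in years when November 1st falls on a Tuesday.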

Anyway.

My sister asked me to write something about the vicissitudes of sugar (not her words), and though I don’t mean to finish the topic here today, I guess I’ll get started.  Apologies to those who are waiting for me to finish the neurology post, but that requires a bit more prep and care, and I’m not ready for it quite yet.  Life keeps getting in the way, as life does, which is one of the reasons I think life is overrated.

It’s hard to know where to start with sugar.  Of course, the term itself refers to a somewhat broad class of molecules, all of which contain comparatively short chains of carbon atoms, to which are bonded hydrogen and hydroxyl* moieties.

Most sugars are not so much actual free chains as they are wrapped up in rings.  The main form of sugar used by the human body is glucose, which is a six-membered ring with the chemical formula C6H12O6.

[Image: glucose]

This is the sugar that every cell in the body is keyed to use as one of its easy-access energy sources, the one insulin tells the cells to take up when everything is working properly.  Interestingly enough, of course, though glucose is the “ready-to-use” energy source, it only provides about 4 kilocalories** per gram to the body, as compared to 9 kilocalories per gram for fats.
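As a crude illustration of that difference in energy density (a hypothetical snippet; the per-gram figures are just the rounded values mentioned above):

```python
# Rounded energy densities for the body's two main fuels, in kcal per gram
KCAL_PER_GRAM = {"glucose": 4, "fat": 9}

def kcal(fuel, grams):
    """Energy yielded by burning `grams` of the given fuel."""
    return KCAL_PER_GRAM[fuel] * grams

sugar_100g = kcal("glucose", 100)  # 400 kcal
fat_100g = kcal("fat", 100)        # 900 kcal, more than twice as much
```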

But the sugar we get in our diets is not, generally speaking, simple glucose.  It tends to be in the form of disaccharides, or sugars made of two combined individual sugars.  Sucrose, or table sugar, is a dimer of glucose and fructose, joined by an oxygen atom.

[Image: sucrose]

Okay, I’m going to have to pick this up tomorrow.  I’ve gotten distracted and diverted by a conversation a few seats ahead of me.

There are two guys talking to each other at the end of this train car, and they are each seated next to a window on the opposite side of the train, so they’re basically yelling across the aisle to each other.  Their conversation is perfectly civil, and though they’re revealing a certain amount of ignorance about some matters, they are mainly displaying a clear interest in and exposure to interesting topics, from history to geography and so on.

At one point, one of the men started speaking of the pyramids and how remarkable their construction was, and I feared the invocation of ancient aliens…but then he followed up to say that, obviously, there were really smart people in ancient Egypt, just like we have smart people today who design and build airplanes and rockets and the like.  Kudos to him!

These men are not morons by any means.  They clearly respect the intellectual achievements of the past and present, and that’s actually quite heartening, because I think it’s obvious that neither one is extensively college-educated, if at all.

But why do they have their conversation from opposite sides of the train, so that everyone nearby has to hear it?  It’s thrown me off my course.

I’ll close just by saying that yesterday I finished rereading The Chasm and the Collision, and I want to note that I really think it’s a good book, and to encourage anyone who might be interested to read it.  The paperback is going for I think less than five dollars on Amazon, and the Kindle edition is cheaper still.  If you like the Harry Potter books, or the Chronicles of Narnia, or maybe the Percy Jackson books, I think you would probably like CatC.

[Image: CatC paperback cover]

I’d love to think that there might be parents out there who would read the book to their kids.  Not kids who are too young—there are a few scary places in the story, and some fairly big and potentially scary ideas (but what good fairy tale doesn’t meet that description?).  It’s a fantasy adventure starring three middle-school students, though I’ll say again that, technically, it’s science fiction, but that doesn’t really matter for the experience of the story.

Most of my other stuff is not suitable for young children in any way—certainly not those below teenage years—and Unanimity and some of my short stories are appallingly dark (though I think still enjoyable).  If you’re old enough and brave enough, I certainly can recommend them; I don’t think I’m wrong to be reasonably proud of them.  But The Chasm and the Collision can be enjoyed by pretty much the whole family.  You certainly don’t have to be a kid to like it, or so I believe.

With that, I’ll let you go for now.  I’ll try to pick up more thoroughly and sensibly on the sugar thing tomorrow, with apologies for effectively just teasing it today.  I’m still not at my sharpest from my cold, and the world is distracting.  But I will do my best—which is all I can do, since anything I do is the only thing I could do in any circumstance, certainly once it’s done, and thus is the best I could do.

Please, all of you do your best, individually and collectively, to take care of yourselves and those you love and those who love you, and have a good month of December.

TTFN


*Hydroxyl groups are just (-OH) groups, meaning an oxygen atom and a hydrogen atom bonded together, like a water molecule that has lost one of its hydrogens.  This points back to the fact that plants make sugar molecules from the raw building blocks of carbon dioxide (a source for the carbon atoms and some of the oxygen) and water (hydrogen and oxygen), using sunlight as their source of power and releasing oxygen as a waste product.  Free oxygen was among the first environmental pollutants on the Earth, and it had catastrophic and transformative effects not just on the biosphere of the Earth but even on its geology.  The fact that the iron in our mines, for instance, is mainly in the form of rust is largely because of this plant-borne presence of free oxygen in the atmosphere.

**A kilocalorie is defined as the amount of energy needed to heat a kilogram of water by one degree centigrade.  We often shorten this term to just “calorie”, but a calorie proper is only the amount of heat needed to raise a gram of water one degree centigrade (or 9/5 of a degree Fahrenheit).  It’s worth being at least aware that what we tend to call calories are actually kilocalories.

If “November” is the 11th month, then is the “second” day number 4?

It’s 4:35 on Wednesday morning, November 2nd, 2022, and it’s already 80 degrees (Fahrenheit) at the Hollywood train station and very muggy.  I’m dripping with sweat just from walking as far as the bench to wait for the earliest morning train*.  It’s ridiculous.  For this reason and others, I wish I had never moved to Florida.  In my opinion, it’s overall a “nice place to visit, but you wouldn’t want to live there”.  Or as many locals say: “Come on vacation, leave on probation”.  It really is a shame, because there is a tremendous amount of natural beauty here, but much of even that has been ruined by invasive species, the main one being Homo sapiens.

It’s on hot and muggy early November mornings such as this that I truly miss being in Michigan, where I grew up.  Say what you will about the Detroit area, at least there are fewer humans there now than there were in the past.  It can be somewhat depressing to see that, but boy, in Autumn all the trees along the side streets in my hometown looked spectacular, and you could walk from your door to the street without sweating.

If the Detroit area is too sad for you‒or too flat‒then you could go to upstate New York, where I went to university.  That was amazing in the Fall.  Walking back to the dorm down Libe Slope after class at this time of year was like seeing a fifty-mile-wide fireworks display happening in slow motion, spread out over many weeks.  Of course Winter was quite cold, bitter, and snowy there, but if you were adventurous, you could take a tray from the dining hall and “tray” down Libe Slope.  I never did that, myself; there was a road right at the bottom of the hill, and though it was not busy, it was hard not to think about careening uncontrollably into some passing salt truck.

Actually, they really did an amazing job keeping the roads clear in Ithaca in the winter.  They had to keep them clear.  There were many slopes in town that could have served as ski jumps if you’d put an upcurve at the bottom, so these had to be cleared pretty much as fast as the snow could fall.

Of course, while I have my complaints about Florida, I did come here of my own free will**, and have had many good times and good life events here, the most outstanding of which was the birth of my daughter.  I can’t ever complain about that.

My son was born in New York (not Ithaca) but we left before he was old enough really to remember it.  Both of my children are Florida kids, effectively.  I wonder how they would feel if someday they moved up North and experienced Autumn there for the first time, beginning to end.  Would they be as wowed by its beauty as I always have been, or would they feel a homesickness for the heat of the Sunshine State as the weather cooled and the days shortened?

Of course, the days don’t literally shorten, just the daylight hours.  There are subtle variations and even occasional tiny diminutions of the day, as happened recently, but overall, barring other inputs, the rotation rate of the Earth is slowing, very steadily and gradually, so days will become longer.  If nothing else, since the planet’s mass is not perfectly symmetrical, it must, as it turns, radiate some minuscule amount of energy away in the form of gravitational waves, and the Moon/Earth orbital pairing will radiate some, too.

When I say “minuscule”, I’m guilty of severe (and ironic) understatement.  The sun will surely long since have gone through red giant and on to white dwarf status before there would be any appreciable loss of rotational energy from gravitational waves alone.  I can’t give you the numbers‒if anyone out there can, please share‒but it’s tiny, it’s wee, it’s verging on infinitesimal.

Speaking of small things and their opposites, yesterday’s post ended up being unusually long and exceptionally dreary***, so I’ll bring this one to a close now.  Thank you for your patience, thank you for reading, and if you have any comments about reactions to autumn, or to major changes of local climate due to moves throughout life, I would be interested to know about them.  No pressure.


*Yes, I came for the 4:45 am train, but only because there wasn’t an earlier one.  I couldn’t sleep.

**So to speak.  I’m provisionally convinced that there is no such thing as free will.  I could be wrong, of course, but it doesn’t really matter all that much.  As I like to say, I either have free will or I don’t, but it’s not like I have any choice in the matter.

***But nonetheless true.  I can’t pretend that it was an exaggeration nor that really, my mental health is just fine.  It is not.  It’s horrible.

Can we do better than recycling?

Well, I forgot to bring my little laptop back to the house with me yesterday, so I’m writing this blog post on Google Docs via Google Drive on my phone.  It’s very handy, obviously, but it’s not as good a word processor as MS Word, though it has its own relative advantages.  Also, it’s just easier to write using a full, true keyboard than with the simulated keyboard on a smartphone.

It’s not a good sign that I’ve forgotten my laptop.  It’s been years since I forgot it prior to recent weeks, but now I’ve forgotten it twice within about a month.  I am mentally quite foggy, it seems.  You all can probably tell that already, but it’s harder to recognize one’s own deterioration from within, since that with which one does the recognizing is that which is deteriorating.

How troublesome.

Despite not being at my best, I did have a somewhat interesting idea yesterday‒not for the first time, though it’s become a bit more coherent with each iteration, as such thoughts seem to tend to do.  I was bringing some boxes out to the big dumpster that is reserved solely for cardboard, when it occurred to me‒again, not for the first time‒that we should not be recycling cardboard or paper.  Neither should we be sending it to landfills.  In landfills, of course, paper decays and decomposes, thereby releasing methane and carbon dioxide, so that’s not good.  But the process of recycling is wasteful and inefficient, producing pollution and releasing “greenhouse gases” in its own right.

New paper and cardboard are made from trees grown on tree farms, or such is my understanding.  In other words, old-growth forests don’t get cut down to make paper*; rather, new trees are planted and grown, capturing CO2 from the atmosphere as they grow, though that process is slow and rather inefficient.  But paper and other such things can probably be made from other, faster-growing and even more robust alternatives.

One frequently hears of hemp being touted as a fast-growing source of cellulose and the like, and though I suspect that some of its touted miraculous attributes may be exaggerated, this one seems fairly straightforward.  It’s a rapidly growing plant, the fiber of which has been known to be useful for centuries.  It shouldn’t be too hard to use it for paper and cardboard, and in the meantime, fast-ish growing trees can continue to be planted and take some of the CO2 from the air.

Okay, so, if we don’t recycle it, what do we do with the paper and the cardboard?  We do what some carbon capture technologies are already doing with the carbon they remove from the air: we bury it deep in the earth, preferably in a way that prevents it from decomposing and releasing its carbon back into the atmosphere.  There are ways to do this, in principle, that should be rather cheap.  I would imagine that vacuum packing before deep burying might do the trick.

The ideal place to dispose of it‒indeed it would be a good way of disposing of much of our carbonaceous wastes, including our own bodies, when we die‒would be near a deep ocean subduction zone, where it would eventually be carried back into the mantle of the Earth to remain sequestered and redistributed for millions of years.  Of course, one would probably have to do such deep ocean “burials” on large scales to avoid it being a net detriment, carbon-wise.

Cremation certainly doesn’t make sense when it comes to atmospheric carbon, though it may be better for space considerations. It’s probably worse than burial for the overall environment.  But humans are superstitious about their bodies and the bodies of their relatives and whatnot, so convincing them to do something sensible with them might be a serious uphill battle.

Even plastic should probably not be recycled, except where that can be done in a way that produces something more cheaply and efficiently and in a less atmospherically costly way than making new plastic for particular uses, without subsidizing the process.  Better to do the deep burial thing with that as well.  Plastic can be an excellent carbon sink, and instead of recycling it, we can put more effort into producing neo-plastics from plants rather than petroleum, again removing carbon from the atmosphere.

It’s interesting how feel-good ideas of the past (and the present) can sometimes turn out to be more detrimental than beneficial.  But that’s why one must always assess and reassess every situation as it goes along, testing all knowledge against the unforgiving surface of reality, and not being afraid to rethink things.  At the very least, it can be fun.

I used to think it would be a great idea to breed and/or engineer bacteria or fungi that can digest plastics, but now I realize that this would release a vast quantity of new carbon dioxide and methane and the like into the atmosphere.  Better to have algae that trap carbon and then are converted into plastics, or fuel, or something similar.  At least for now.

Because solving one problem, assuming that even happens, will always lead to new, unforeseeable problems and questions that must be addressed.  But each new question faced and each new problem solved makes the knowledge and capacity of civilization greater.  There is no upper limit on how much can be known‒or if there is, it’s so far beyond what we do know that we cannot even contemplate it sensibly.  There is, however, a definite lower limit of knowledge (not counting “anti-knowledge” or stupidity, which is another point of exploration entirely), and that is zero‒a return to a state with no life, no mind, no information.

Some of us find that state enticing for ourselves, but when I’m feeling unusually generous, I think it would be a shame for civilization to come to naught.  There’s nothing in the laws of nature preventing it from happening, though, any more than there’s anything preventing a reckless teenage driver from being killed in a car accident, no matter how immortal he feels.  It’s never too early to try to learn discipline and responsibility, to become more self-aware and aware of the universe…but it can be too late.

Anyway, that’s enough for the day.  I hope I didn’t bore you.  Have a good day.


*More often, it seems, this is done to create new farmland, which is a separate issue.

The borogoves sure are mimsy today, aren’t they?

It’s Friday again, and another weekend approaches.

Yippee.  Huzzah.  O frabjous day.

I think I don’t work tomorrow—at least, I’m not supposed to—so there probably won’t be any blog post then (which will be Saturday, unless some hitherto unimagined catastrophe literally throws the days of the week out of order).

I may be posting a new video on my YouTube channel this weekend, though.  I haven’t made one yet, so there’s no guarantee that something won’t stop me from doing so.  I’m unlikely to be lucky enough to be involved in an asteroid impact between now and tomorrow, but there’s a functionally limitless number of things that could, in principle, stop me from recording a video.

Nevertheless, it is my intention to make a video, so I probably will.  This is a different type of thing than fasting; no physiological processes and neurological feedback loops are likely to interfere with my commitment to making a video.  Evolution is, so far, utterly blind even to the existence of videos…though that could change.

I’m still not sure what topic I want to address in the video, unlike last time.  I may literally just start my timer, start my video, start to talk, and see what happens.  If that sounds like an inauspicious way to start a video, well, you’re reading the written equivalent of it right now.  If you enjoy this, you’re proof that it can work.  If you don’t enjoy it, that’s not proof that it cannot work, since your lack of enjoyment doesn’t preclude anyone else from enjoying it.

People do seem to have trouble understanding that others can like things that they themselves find disgusting.  I can sympathize with that, and fall prey to the failing myself, but that doesn’t make it reasonable.

It’s true that all mammals, let alone all humans, have more in common than they have differences, but nevertheless, the potential differences just within a given species, given sexual recombination of genes and the sheer number of genes each individual has, are well worthy of the adjective “astronomical”, so we shouldn’t be surprised that others like things we find repugnant.  In fact, given that the number of possible combinations of gene pairs in human DNA alone is vastly larger than the number of (for instance) light years the visible universe is across*, maybe we should swap our use of the terms “biological” and “astronomical” to describe very large numbers.  Unfortunately, I think most people wouldn’t catch on to the nuance of saying that something was “biologically large”.

Oh, well.  It was a brief dream, swiftly shattered by the one who dreamed it.  Typical.

Anyway, so, I’m back on food again, more’s the pity.  I’m tired of having all these biological urges and needs and drives.  They’re very irritating.

Also, I’m tired of how stressed and angry I get about things people do at work.  Don’t get me wrong—the specific things I’m thinking about are worthy of anger.  But the problem is that I get so stressed, and so angry, and it just makes me hate myself more and more all the time, without any evident upper bound to the process.

I wish it were true to say, “I can’t stand it anymore”, but unfortunately, I’m able in principle to continue standing things for who knows how long.  I wish I would just collapse into a heap, and literally, physically, not be able to go on.  It would take so much out of my hands and would be such a relief.  Unfortunately, there’s no clear sign of that happening, though I try to sabotage my own health as much as feasible without being Baker Acted.

And here is another maddening thing that just happened:  the trains this morning, it turns out, were all shifted to one side of the track, as was the case once last week.  But this wasn’t announced early, unlike last time, so I went to my usual spot to start writing this while waiting.  Then, when the “announcement” was made, it was just posted on the overhead light board; there was no verbal announcement, though they usually give recorded verbal reminders about such things—they’ve been informing us, ever since Labor Day, that the system will be running on a Sunday schedule on Thanksgiving, which is in November, for those of you who don’t know.  Labor Day was at the beginning of September.

The only reason I didn’t miss my train is that I always start getting ready to board five minutes early, and I looked up from my writing to notice that there was no one on my side of the tracks.  Only then did I see the notice that trains were all boarding on the other side.  I was able to take the elevator up to the bridge, but I had to rush down the stairs on the other side because my train was approaching, and my knees and hips and ankle were miffed about that.

It would have been nice for one of the people who always gets on the same train I get on to have said something to me, rather than just letting me sit there typing on one side of the track by myself.  I’d like to think I would have said something to them, were the situation reversed.  Maybe I wouldn’t.  Maybe it’s an instance of the bystander effect.  Maybe it’s one of those rare circumstances in which my reticence to interact with strangers is obvious to everyone, and I seem so unpleasant that no one wants to interact with me even enough to say, “Hey, all the trains are boarding on the other side for some reason…better cross over.”

Better cross over.  That’s the best idea I’ve heard today, that’s for sure.

Okay, well, that’s it for today’s disjointed meandering.  I hope you’ve found some modicum of joy in it.  It would be nice to be able to do at least something positive for the world, even if it’s small.  It would be far better than what I usually do.


*Using the particle horizon as the measured “distance across”. **

**Actually, since there are four bases in human DNA (guanine, cytosine, adenine, and thymine), if they were assigned randomly, then even a string of 1000 base pairs has 1.15 × 10^602 possible combinations.  If memory serves, this is larger than the String Theory landscape, a number already so vast that many physicists say it can predict anything and therefore predicts nothing.  And human DNA is on the order of a billion nucleotides long.  My computer calculator can’t deal with billionth powers of four, but a billion is a thousand times a thousand times a thousand, so while 4^1000 cubed (that is, 4^3000) is about 10^1806, 4 to the full billionth power would be about 10 to the 600-millionth, unless I’m missing something.  The diameter of the visible universe in Planck lengths is only 5 × 10^61, which is not even close to the same order of magnitude.  Of course, the maximal information within a horizon the size of the visible universe is larger still, but then again, that’s a measure of the maximum entropy possible within that region, so that’s almost a given.  I think it’s 2^(10^123) or something along those lines.  I may be getting at least some of this wrong.
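Incidentally, one doesn’t need a calculator that can handle such huge powers; logarithms tame them.  A sketch of the arithmetic above:

```python
import math

LOG10_4 = math.log10(4)  # each base position multiplies the count by 4

# 1000 random bases: 4**1000 combinations, roughly 1.15 x 10^602
exp_1000 = 1000 * LOG10_4                                # ~602.06
mantissa_1000 = 10 ** (exp_1000 - math.floor(exp_1000))  # ~1.15

# Cubing that gives 4**3000, roughly 10^1806
exp_3000 = 3000 * LOG10_4

# For a full billion bases, the exponent itself is about 600 million
exp_billion = 1_000_000_000 * LOG10_4
```

The exponent scales linearly with the length of the string, so a billion bases gives on the order of 10 to the 602-millionth possible sequences, which dwarfs anything one might call astronomical.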

Welcome to the October Country

Well, it’s October 1st, the beginning of a new month in 2022, a month that was originally the eighth month, as its name (“octo-”, eight) still attests.

I’m at the train station and, it being Saturday, the schedule is different from the weekday one.  There’s also some question of whether the trains are boarding on the usual side or not.  There’s a displayed “announcement” on the light boards that all trains are boarding on one side at this station until further notice, but it could be something left over from yesterday.  Also, the guard is not aware of anything regarding the change in sides.

Nevertheless, today was a day for ordering the monthly pass on the machines, and the ones on my usual side weren’t even working, so I’m on the other side for the moment, anyway.  I’m going to have to try to be vigilant as the time for my train approaches*.  If I miss one train, the next won’t come for another hour.

It’s hard to be vigilant, though.  I feel absolutely exhausted.  My brain feels like it’s barely running on one cylinder, metaphorically speaking**.  I’m just so very tired.

Thankfully, I can embed below my video, which I did end up posting on my YouTube channel yesterday afternoon, so that can provide some of the content and spare me a little writing today.  I might as well, since what I’ve written so far is about some of the most banal things imaginable.

Just a bit of clarification about the video, in case any is necessary:  Obviously I don’t mean to say there is literally no life in the universe, since that would be a contradiction (if there were literally no life, then I could not be speaking about the fact).

I just have always been irked by people who make wide-eyed claims that it’s so amazing and quasi-mystical that the constants of nature are so perfectly designed to make life, and that this must imply some sacred meaning or purpose to it.  That’s about as idiotic as looking at the location of a speck of dust in the corner of a school gym and marveling that all the facts of nature conspired to bring that speck of dust to exactly that spot…it had to have been part of some greater purpose!  It’s drivel.  If anything, the case of life is even less impressive.

My biggest issue with this is that it leads to a kind of quiescence, an assumption that, if the universe was “designed” just so that life can exist, then life, and particularly intelligent life, must be important, and the universe will somehow arrange things to nurture us and protect us from extinction.  If you think that’s the case, then ask the dinosaurs, or better yet, any of the far greater numbers of life forms that went extinct in the Permian-Triassic “Great Dying”.

Oh, wait, you can’t.  They’re all extinct.

No, the universe is almost completely hostile to life, both in terms of its space and in terms of its time.  We are lucky beyond ordinary imagining, though I tried in the description of the video to give some notion of just how lucky in spatial terms, at least, by noting that life exists in roughly only 1.5 × 10^-64 of the universe’s volume.
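As a rough sanity check on a figure like that 1.5 × 10^-64 (the modeling here is my own guess at how such a number might be derived, not necessarily how the video computes it): treat Earth’s biosphere as a thin spherical shell about 100 m thick and compare it to the volume of the observable universe.

```python
import math

# Assumed inputs (my own round numbers, not from the video):
R_UNIVERSE = 4.4e26   # comoving radius of the observable universe, meters
R_EARTH = 6.371e6     # mean radius of Earth, meters
SHELL = 100.0         # assumed "thickness" of the biosphere, meters

v_universe = (4 / 3) * math.pi * R_UNIVERSE**3
v_biosphere = 4 * math.pi * R_EARTH**2 * SHELL  # thin-shell approximation

ratio = v_biosphere / v_universe
print(f"life fraction ~ {ratio:.1e}")  # ~1.4e-64, close to the quoted 1.5e-64
```

That such crude inputs land within a hair of the quoted figure suggests the estimate is of this thin-shell-around-one-planet flavor; change the shell thickness or planet count and the exponent barely budges.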

As far as time goes, well, if you’re thinking of humanity alone, then based on the time that has elapsed since the “Big Bang”, which may or may not be the literal beginning of our universe, the percentage is tiny enough, and others have demonstrated this handily, as in the “cosmic calendar” that Carl Sagan made famous in Cosmos.  But if you want to count all expected possible future time, well then our existence is some fraction of what could be infinity, which is pretty much undefined, but might as well be called zero.  The limit certainly approaches zero as we extend the future further and further.

This is not necessarily a call for people just to give up and say “what the hell”, though you have that option, of course, and it is tempting.  I wanted to note that, if you would like for life to continue, and even to have some lasting, cosmic-scale impact, then you can’t take it for granted.  You need to work at it, and work hard, and work long.  The universe is not trying to kill us (contrary to Neil deGrasse Tyson’s habitual way of putting it); if it were, we would be dead already.  But the universe is huge, and it does not even have the capacity to care what happens to life, except in the minds of that life itself.

All life is in the situation of a castaway on a desert island—there’s no preexisting infrastructure, there’s no one out there looking out for you or protecting you, or providing your light, your heat, your air-conditioning, your food, your clothes, your shelter, what have you.  If you want any of those things, you’re going to have to make and/or find them for yourself, and you’re going to have to keep doing it, for as long as you actually want them and want to survive.

Without much more ado, here’s the video***.  I forgot to ask when I made the video, but please give a “thumbs up” and subscribe and share if you are at all inclined to do so, for any colorable reason.  And feel free to check out the other stuff on my YouTube channel if it looks interesting to you.  If anyone finds this interesting at all, I’m hoping to make more such videos about topics that interest me, assuming the universe doesn’t eliminate me in the meantime (though it seems likely to do so).  Oh, and please let me know what you think, either in the comments below the video or here.

Thanks.  Here it is:


*Just a slightly later addendum:  They have announced overhead that my train is approaching in 10 minutes, and have confirmed that it is not on its usual side.  So I was right to be proactive.

**Of course, it’s a metaphor.  I don’t honestly think that any of you really believe that my brain is an internal combustion engine of some kind, except in the loosest of possible senses.  Apologies.

***I wore a mask and dark glasses in the video mainly because I don’t like how my face looks—it bears evidence of the many things that have happened to me in the last decade or so.  Maybe no one else can see it but me, but it is what it is.  Anyway, the glasses are awesome, I really like them, and the mask combined with them makes for a good look, I think.  Certainly better than my underlying face, anyway.