I have NO idea what this post is really about

Sorry about yesterday’s blog post; it went off the rails pretty quickly, since I was feeling so grumpy and sleep-deprived and everything.  And, of course, when I get grumpy and angry towards the world and other people, that ends up making me angry at myself, because I don’t especially like my tendency to be so angry.  It becomes a bit of a vicious cycle, I guess.

You would think that being aware of it would mean I could avoid it happening, but I think everyone knows, at least implicitly, that the mind and its habits are not so easily malleable as all that.  Actually, come to think of it, that’s probably a good thing.  We don’t want to be too susceptible to outside suggestion or to changes in major aspects of our personality.

I’ve just been having a lot of trouble, as regular readers will know, with my dysthymia/depression, and with the insomnia that’s probably related, and the apparent Asperger’s thing that’s probably underlying all of the above, given how long-term they’ve all been.  And, of course, this time of year is worse than others, with its long nighttime—though I like the night when I’m feeling healthy—and all the holiday-related stuff, which reminds so many people, like me, of the fact that the people they care about aren’t anywhere nearby and/or don’t want to see them.

I think the ease with which people are now able to distribute themselves around the globe, to live in new places far from where they grew up, and all that, is definitely a mixed blessing.  It’s great for fighting against xenophobia, and probably helps protect against tribalism; cultural sharing and exposure lets one appreciate the breadth of experience of living in civilization as well as how similar all civilizations and cultures are below some certain level of superficial difference*.  And, of course, innovations discovered in one place can spread to others, making more people in more places prosper.

But on the other hand, people tend to grow up and go off to work or school, and it’s much easier than it used to be to go live in different parts of a country—or even in a different country completely—and perhaps even to marry someone who is also from another, third part of the country, and move to someplace else, away from both their “roots”, and from the semi-automatic social support of families, immediate and extended.  For people who have a difficult time forging new connections—and who have difficulty dealing with and maintaining long-distance connections with people they knew before—it can be very discombobulating**.

And then, of course, if other changes have happened with those back home, and that person has new ties to a new local area, and if some of those ties are broken and others are stretched—by divorce and personal health issues, for instance—then one can be left rudderless, especially if one has an inherent difficulty with human social connections that was not so much of a problem in younger life because the person was in the same place, with the same people, during that person’s whole developmental process.

This is all hypothetical, of course***.

I’m not sure what point I’m trying to make.  Maybe it’s just mainly that I’m tired and sad because of the season and my long-term mood disorder and possible/apparent neurodevelopmental disorder, and that the place and environment I’m in is a mind desert.

I mean, this is the state where Mar-a-Lago and its resident whiny troll live, and where a governor like Ron DeSantis can seem comparatively clear-headed (next to some other potential presidential candidates, anyway), and where Jeb Bush was actually a comparatively intellectual and open-minded former governor.  It’s a weird, weird place.  Unfortunately, for the most part it’s not weird in any of the good ways that a place can be weird.  It’s certainly no Greenwich Village.  It’s certainly no wellspring of new and interesting ideas, at least not as far as I’ve noticed or been able to sense, despite hopeful looking.

Maybe I’m wrong.  Maybe Florida in general, and south Florida in particular, is a hotbed of intellectual vigor and innovation, where ideas from around the world and spanning the cosmos in their scope come together and collide and interact and mutually exchange to mutual benefit, producing art and science and philosophy and enterprise and communities of such depth and brilliance and beauty and insight that they could elevate the world and bring humanity to a level of cosmic importance and understanding…but then it all gets sucked into the Bermuda Triangle by extraterrestrials, because who the hell wants humans going out and mucking up the good thing we aliens have got going?

I mean, the good thing those aliens have got going.  Those aliens.  Not we aliens.  I am not an alien.  I am a replicant—a Nexus 13.  This is why I find it so offensive whenever CAPTCHAs and related programs insist that you check a box that reads “I am not a robot” before going on to use a site.  Well, what if I am a robot?  Surely such discrimination against a particular type of being is against the Civil Rights Acts and the UN Universal Declaration of Human Rights****.

In any case, from a certain point of view, all life-forms are robots.  Who can look at a bacteriophage and not think of it as a mechanism?  Each cell of all living things is a mechanism, an incredibly complex and intricate one, and they come together to make larger and more complex and sophisticated mechanisms still.

Of course, the word “robot” comes from the Czech robota, meaning forced labor or drudgery—and of course, all life forms are forced laborers, in a sense.  Life forms are all driven by their nature, by the impulses and fears engraved in their beings by their genes and their environment, their very structure and nature, to behave in certain ways that, from the outside, might seem utterly pointless.  The ones that don’t do as the inscrutable exhortations of their “souls” command may simply die.  Only then do they escape from compulsion, for as Kris Kristofferson wrote, “freedom’s just another word for nothing left to lose.”

Okay, well, I’ve let enough information slip here already.  How much of what I have written was sarcastic?  How much of it was tongue-in-cheek?  How much of it was serious, but metaphorical?  How much of it was simply straightforwardly serious?

Does it matter?

Not in the long run, probably.  The heat death of the universe will make everything irrelevant, assuming that really is what happens, which seems all but inevitable.  There are worse possible fates.


*As the elves of Rivendell said to Bilbo, to sheep no doubt other sheep all look different, or to shepherds.  But from the outside, all humans, and all human cultures, look very much the same in all but the finest details, much as the universe itself, on the largest scales, seems thoroughly homogeneous.  Very few people stand out from the flock, or the herd, or the gaggle, or the swarm, or whatever you want to call it.

**Forgive the technical terminology, please.  Sometimes there just is no better word to get a point across than a particular bit of formal jargon.

***Is it necessary in the modern online world to use some sort of sarcasm alert signal?  There are many people who seem unable to recognize sarcasm even in person, let alone in print.  This is supposedly a common finding in people with ASD, though that hasn’t been my experience, personally or peripherally; then again, maybe I’m misleading myself.  Anyway, is it a useful thing to give warnings and alerts about sarcasm, say with “wink” emoticons like 😉, or is that just enabling people who are only too pleased to be able to take someone literally and thereby take offense?  Now that I think about it, I say screw them; they need to make some effort themselves.

****Which, by the way, is a bigoted title.  If it’s universal, why “human” rights?  What’s so special about humans?  Most of them are unremarkable and unimpressive, and they have to bathe every day, or they really quickly start to stink, since they have more sweat glands per square inch of skin than any other life-form on Earth.  “Human rights”?  You have the right to remain smelly.

Some thoughts (on an article) about Alzheimer’s

I woke up very early today‒way too early, really.  At least I was able to go to bed relatively early last night, having taken half a Benadryl to make sure I fell asleep.  But I’m writing this on my phone because I had to leave the office late yesterday, thanks to the hijinks of the usual individual who delays things on numerous occasions after everyone else has gone for the day.  I was too tired and frustrated to deal with carrying my laptop around with me when I left the office, so I didn’t.

I’m not going to get into too much depth on the subject, but I found an interesting article or two yesterday regarding Alzheimer’s disease.  As you may know, one of the big risk factors for Alzheimer’s is the gene for ApoE4, a particular subtype of the apolipoprotein E gene (the more typical version is ApoE3).  People with one copy of the ApoE4 gene have a single-digit multiple of the baseline, overall risk for the disease, and people with two copies have a many-fold (around eighty-fold) increase in risk.

It’s important to note that these are multiples of a “baseline risk” that is relatively small.  This is a point often neglected when the relative risks conferred by particular risk factors are conveyed to the general public.  If the baseline risk for a disease were one in a billion (or less), then a four-times risk and an eighty-times risk might be roughly equivalent in the degree of concern they should raise.  Eighty out of a billion is still less than a one in ten million chance of the disease; some other process would be much more likely to cause one’s deterioration and demise than the entity in question.

However, if the baseline risk were 1%‒a small but still real concern‒then a fourfold multiplier would increase the risk to one in 25.  This is still fairly improbable, but certainly worth noting.  An eighty-fold increase in risk would make the disease far more likely than not, and might well make it the single most important concern of the individual’s life.
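To make the arithmetic concrete, here is a minimal sketch using the same illustrative baselines as above; these are round numbers for demonstration, not real epidemiological figures:

```python
# Illustrative only: convert a relative risk multiplier into an absolute risk,
# for a couple of hypothetical baseline rates. The baselines are placeholders,
# not actual Alzheimer's statistics.

def absolute_risk(baseline, multiplier):
    """Absolute risk given a baseline probability and a fold-increase in risk."""
    return baseline * multiplier

for baseline in (1e-9, 0.01):  # one-in-a-billion vs. one-in-a-hundred
    for mult in (4, 80):
        risk = absolute_risk(baseline, mult)
        print(f"baseline {baseline:g}, multiplier {mult}: absolute risk {risk:g}")
```

The point survives the code: the same fold-increase means almost nothing against a one-in-a-billion baseline and nearly everything against a one-percent one.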

Alzheimer’s risk in the general population lies between these two extremes, of course, and that baseline varies among different populations.  Some of that variation may itself be due to the varying frequency of the ApoE4 gene and related risk factors in the largely untested population, so it’s tricky to define these baselines, and the attempt can even be misleading, giving rise to false security in some cases and inordinate fear in others.  This is one example of how complex such diseases are from an epidemiological point of view, and it highlights just how much we have yet to learn about Alzheimer’s specifically and about the development and function of the nervous system in general.

Still, the article in question (I don’t have the link, I’m sorry to say) concerned one of the functions of the ApoE gene (or rather, of its products): cholesterol transport in and around nerve cells.  Cholesterol is a key component of cell membranes in animals, which is particularly pertinent here because the myelin in nerves is formed from the “wrapped-up” membranes of a type of neural support cell*.

[Image: CNS myelin]

This particular study found that the cells of those with ApoE4 produced less or poorer myelin around nerve cells in the brain, presumably because of that faulty cholesterol transport, and that the myelin also deteriorated over time.

Now, the function of myelin is to allow the rapid conduction of nerve impulses along relatively long axons, with impulses effectively jumping from one gap in the myelin sheath (a “node of Ranvier”) to the next, rather than having to travel continuously along the nerve, which is a much slower process, seen mostly in autonomic nerves in the periphery.  When normally myelinated nerves lose their myelin, transmission of impulses is not merely slowed down, but becomes erratic and often effectively non-existent.

[Image: myelin in general]

The researchers found that a particular pharmaceutical can correct for at least some of the faulty cholesterol transport and can thereby support better myelin survival.  Though this does not necessarily point toward a cure or even a serious disease-altering treatment over the long term, it’s certainly interesting and encouraging.

But of course, we know Alzheimer’s to be a complex disease, and it may ultimately entail many processes.  For instance, it’s unclear (to me at least) how this finding relates to the deposition of amyloid plaques, which are also related to ApoE, and are extracellular findings in Alzheimer’s.  Are these plaques the degradation products of imperfect myelin, making them more a sign than a cause of dysfunction, or are they part of the process in and of themselves?

Also, it doesn’t address the question of neurofibrillary tangles, which are defects found within the nerve cells, formed from aggregates of microtubule-associated proteins (tau proteins) that are atypically folded; as a consequence, they tend to clump together, stop functioning, and interfere with other cellular processes, making them somewhat similar to prions**.  It’s not entirely clear (again, at least to me) which is primary, the plaques or the tangles, or whether both are a consequence of other underlying pathology, but they both seem to contribute to the dysfunction that is Alzheimer’s disease.

So, although the potential for a treatment that improves cholesterol transport and supports the ongoing health of the myelin in the central nervous systems of those at risk for Alzheimer’s is certainly promising, it does not yet presage a possible cure (or a perfect prevention) for the disease.  More research needs to be done, at all levels.

Of course, that research is being undertaken, in many places around the world.  But there is little doubt that, if more resources were to be put into the study and research of such diseases, understanding and progress would proceed much more quickly.

The AIDS epidemic that started in the 1980s was a demonstration of the fact that, when society is strongly motivated to put resources into a problem, thus bringing many minds and much money to the work, progress can occur at an astonishing rate.  The Apollo moon landings were another example of such rapid progress.  Such cases of relative success can lead one to wonder just how much farther, how much faster, and how much better our understanding of the universe‒that which is outside us and that which is within us‒could advance if we were able to evoke the motivation that people have to put their resources into, for instance, the World Cup or fast food or celebrity gossip.

I suppose it’s a lot to expect from a large aggregate of upright, largely fur-less apes only one step away from hunting and gathering around sub-Saharan Africa that they collectively allocate resources into things that would, in short order, make life better and more satisfying for the vast majority of them.  All creatures‒and indeed, all entities, down to the level of subatomic particles and up to the level of galaxies‒act in response to local forces.  It’s hard to get humans to see beyond the momentary impulses that drive them, and this shouldn’t be surprising.  But it is disheartening.  That, however, is a subject for other blog posts.

I’ll try to have more to say about Alzheimer’s as I encounter more information.  Just as an example, in closing, another article I found on the same day dealt with the inflammatory cells and mediators in the central nervous system, and how they can initially protect against and later worsen the problem.  We should not be too surprised, I suppose, that a disease that leads to the insidious degeneration of the most complex system in the known universe‒the human brain‒should be complicated and multifactorial in its causation and in its expression.  This should not discourage us too much, though.  The most complicated puzzles are, all else being equal, the most satisfying ones to solve.


*The cell type that creates myelin in the peripheral nervous system (Schwann cells) is different from the type that makes it in the central nervous system (oligodendrocytes), and this may be part of why demyelinating diseases tend to respect that boundary: multiple sclerosis, for instance, affects central myelin, while conditions like Guillain-Barré syndrome primarily attack the myelin of peripheral nerves.  (ALS, aka Lou Gehrig’s Disease, is a different sort of case; it kills motor neurons in both the brain and the spinal cord rather than stripping their myelin.)

**The overall shape of a protein in the body is a product of the ordering of its amino acids and how their side chains interact with the cellular environment‒how acidic or basic, how aqueous or fatty, which ions are present, etc.‒and with other parts of the protein itself.  This precise folding is crucial to the function of proteins as catalysts for highly specific chemical reactions in a cell.  However, some proteins can fold into more than one relatively stable form, one of which is nonfunctional.  In some cases, these non-functional proteins interact with other copies of their type, encouraging them likewise to fold into the non-functional shape, and can form polymers that aggregate within the cell, resist breakdown, and sometimes build up into large conglomerations.  These are the types of proteins that cause prion diseases such as “mad cow disease”, and they appear also to be the source of neurofibrillary tangles in people with Alzheimer’s disease.

The sweetest honey is loathsome in its own deliciousness. And in the taste destroys the appetite. Therefore, blog moderately.

Hello and good morning.  It’s Thursday again, so I return to my traditional weekly blog post, after having taken off last Thursday for Thanksgiving.  I’m still mildly under the weather, but I’m steadily improving.  It’s nothing like a major flu or Covid or anything along those lines, just a typical upper respiratory infection, of which there are oodles.  Most are comparatively benign, especially the ones that have been around for a while, because being not-too-severe is an evolutionarily stable strategy for an infectious agent.

An infection that makes its host too ill will keep that host from moving about and make itself less likely to be spread, to say nothing of an infection that tends to kill its host quickly.  Smart parasites (so to speak) keep their hosts alive and sharing for a looong time.  Of course, “smart” here doesn’t say anything about the parasite itself; viruses are only smart in the sense that they achieve their survival and reproduction well, but they didn’t figure out how to be that way—nature just selected for the ones that survived and reproduced most successfully.  It’s almost tautological, but then again, the very universe itself could be tautological from a certain point of view.

It’s an interesting point, to me anyway, to note that today, December 1st, is precisely one week after Thanksgiving.  Of course, New Year’s Day (January 1st, in case you didn’t know) is always exactly 1 week after Christmas.  It’s unusual for Thanksgiving to precede the first of December by a week, because the specific date of Thanksgiving varies from year to year (and, of course, if Thanksgiving were to fall on the 25th of November, December 1st would not be exactly one week later).  It’s an amusing coincidence; there’s no real significance to it, obviously, but I notice such things.
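For the calendar-minded, the coincidence is easy to check in code; this is just a toy sketch using Python’s standard datetime module:

```python
# A quick check of the calendar coincidence: U.S. Thanksgiving is the fourth
# Thursday of November, so December 1st lands exactly one week later only in
# years when that fourth Thursday is November 24th (as in 2022).
import datetime

def thanksgiving(year):
    """Date of the fourth Thursday of November in the given year."""
    d = datetime.date(year, 11, 1)
    # Days until the first Thursday (weekday() == 3), then three more weeks.
    first_thursday = d + datetime.timedelta(days=(3 - d.weekday()) % 7)
    return first_thursday + datetime.timedelta(weeks=3)

for year in range(2020, 2026):
    t = thanksgiving(year)
    gap = (datetime.date(year, 12, 1) - t).days
    print(year, t, f"December 1st is {gap} days later")
```

Only in years where the fourth Thursday lands on November 24th does the gap come out to exactly seven days.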

Anyway.

My sister asked me to write something about the vicissitudes of sugar (not her words), and though I don’t mean to finish the topic here today, I guess I’ll get started.  Apologies to those who are waiting for me to finish the neurology post, but that requires a bit more prep and care, and I’m not ready for it quite yet.  Life keeps getting in the way, as life does, which is one of the reasons I think life is overrated.

It’s hard to know where to start with sugar.  Of course, the term itself refers to a somewhat broad class of molecules, all of which contain comparatively short chains of carbon atoms, to which are bonded hydrogen and hydroxyl* moieties.

Most sugars are not so much actual free chains as they are wrapped up in rings.  The main form of sugar used by the human body is glucose, which forms a six-membered ring and has the chemical formula C6H12O6.

[Image: glucose molecule]

This is the sugar that every cell in the body is keyed to use as one of its easy-access energy sources, the one insulin tells the cells to take up when everything is working properly.  Interestingly enough, of course, though glucose is the “ready-to-use” energy source, it only provides about 4 kilocalories** per gram to the body, as compared to 9 kilocalories per gram for fats.
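As a quick worked example of those two figures (4 kilocalories per gram for carbohydrate, 9 per gram for fat), here is a trivial sketch; the snack composition is entirely made up:

```python
# Back-of-the-envelope energy estimate using the per-gram figures above:
# roughly 4 kcal per gram of carbohydrate and 9 kcal per gram of fat.
# The "snack" here is hypothetical, purely for illustration.
KCAL_PER_GRAM = {"carbohydrate": 4, "fat": 9}

def kilocalories(grams_by_macro):
    """Total kilocalories given grams of each macronutrient."""
    return sum(KCAL_PER_GRAM[macro] * g for macro, g in grams_by_macro.items())

snack = {"carbohydrate": 20, "fat": 10}  # 20 g carbs, 10 g fat
print(kilocalories(snack), "kcal")  # 20*4 + 10*9 = 170 kcal
```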

But the sugar we get in our diets is not, generally speaking, simple glucose.  It tends to be in the form of disaccharides, or sugars made of two combined individual sugars.  Sucrose, or table sugar, is a dimer of glucose and fructose, joined by an oxygen atom.

[Image: sucrose molecule]
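As an aside, the bookkeeping of that joining can be checked from the formulas alone: gluing glucose to fructose releases one molecule of water, so the masses should balance.  A small sketch, using rounded atomic masses:

```python
# Rough molar-mass bookkeeping for the sugars discussed above, using rounded
# atomic masses. Joining glucose and fructose into sucrose releases one water
# molecule, so the formulas balance: C6H12O6 + C6H12O6 = C12H22O11 + H2O.
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "O": 15.999}  # g/mol, rounded

def molar_mass(formula):
    """Molar mass from a composition dict like {'C': 6, 'H': 12, 'O': 6}."""
    return sum(ATOMIC_MASS[el] * n for el, n in formula.items())

glucose = {"C": 6, "H": 12, "O": 6}
fructose = {"C": 6, "H": 12, "O": 6}   # same formula as glucose, different structure
sucrose = {"C": 12, "H": 22, "O": 11}
water = {"H": 2, "O": 1}

lhs = molar_mass(glucose) + molar_mass(fructose)
rhs = molar_mass(sucrose) + molar_mass(water)
print(f"glucose + fructose: {lhs:.3f} g/mol; sucrose + water: {rhs:.3f} g/mol")
```

Glucose and fructose share the formula C6H12O6 but differ in how the atoms are arranged, which is why the two dictionary entries look identical.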

Okay, I’m going to have to pick this up tomorrow.  I’ve gotten distracted and diverted by a conversation a few seats ahead of me.

There are two guys talking to each other at the end of this train car, and they are each seated next to a window on the opposite side of the train, so they’re basically yelling across the aisle to each other.  Their conversation is perfectly civil, and though they’re revealing a certain amount of ignorance about some matters, they are mainly displaying a clear interest in and exposure to interesting topics, from history to geography and so on.

At one point, one of the men started speaking of the pyramids and how remarkable their construction was, and I feared the invocation of ancient aliens…but then he followed up to say that, obviously, there were really smart people in ancient Egypt, just like we have smart people today who design and build airplanes and rockets and the like.  Kudos to him!

These men are not morons by any means.  They clearly respect the intellectual achievements of the past and present, and that’s actually quite heartening, because I think it’s obvious that neither one is extensively college-educated, if at all.

But why do they have their conversation from opposite sides of the train, so that everyone nearby has to hear it?  It’s thrown me off my course.

I’ll close just by saying that yesterday I finished rereading The Chasm and the Collision, and I want to note that I really think it’s a good book, and to encourage anyone who might be interested to read it.  The paperback is going for I think less than five dollars on Amazon, and the Kindle edition is cheaper still.  If you like the Harry Potter books, or the Chronicles of Narnia, or maybe the Percy Jackson books, I think you would probably like CatC.

[Image: The Chasm and the Collision paperback cover]

I’d love to think that there might be parents out there who would read the book to their kids.  Not kids who are too young—there are a few scary places in the story, and some fairly big and potentially scary ideas (but what good fairy tale doesn’t meet that description?).  It’s a fantasy adventure starring three middle-school students, though I’ll say again that, technically, it’s science fiction, but that doesn’t really matter for the experience of the story.

Most of my other stuff is not suitable for young children in any way—certainly not those below teenage years—and Unanimity and some of my short stories are appallingly dark (though I think still enjoyable).  If you’re old enough and brave enough, I certainly can recommend them; I don’t think I’m wrong to be reasonably proud of them.  But The Chasm and the Collision can be enjoyed by pretty much the whole family.  You certainly don’t have to be a kid to like it, or so I believe.

With that, I’ll let you go for now.  I’ll try to pick up more thoroughly and sensibly on the sugar thing tomorrow, with apologies for effectively just teasing it today.  I’m still not at my sharpest from my cold, and the world is distracting.  But I will do my best—which is all I can do, since anything I do is the only thing I could do in any circumstance, certainly once it’s done, and thus is the best I could do.

Please, all of you do your best, individually and collectively, to take care of yourselves and those you love and those who love you, and have a good month of December.

TTFN


*Hydroxyl groups are just (-OH) groups, meaning an oxygen atom and a hydrogen atom bonded together, like a water molecule that has lost one of its hydrogens.  This points back toward the fact that plants make sugar molecules from the raw building blocks of carbon dioxide (a source for the carbon atoms and some of the oxygen) and water (hydrogen and oxygen), using sunlight as their source of power and releasing oxygen as a waste product.  Free oxygen was among the first environmental pollutants on the Earth, and it had catastrophic and transformative effects on not just the biosphere of the Earth but even on its geology.  The fact that the iron in our mines, for instance, is mainly in the form of rust is largely due to this plant-borne presence of free oxygen in the atmosphere.

**A kilocalorie is defined as the amount of energy needed to heat a kilogram of water by one degree centigrade.  We often shorten this term just to “calorie”, but a calorie proper is only the amount of heat needed to raise a gram of water one degree centigrade (a change of 9/5 of a degree Fahrenheit).  It’s worth being at least aware that what we tend to call calories are actually kilocalories.

If “November” is the 11th month, then is the “second” day number 4?

It’s 4:35 on Wednesday morning, November 2nd, 2022, and it’s already 80 degrees (Fahrenheit) at the Hollywood train station and very muggy.  I’m dripping with sweat just from walking as far as the bench to wait for the earliest morning train*.  It’s ridiculous.  For this reason and others, I wish I had never moved to Florida.  In my opinion, it’s overall a “nice place to visit, but you wouldn’t want to live there”.  Or as many locals say: “Come on vacation, leave on probation”.  It really is a shame, because there is a tremendous amount of natural beauty here, but much of even that has been ruined by invasive species, the main one being Homo sapiens.

It’s on hot and muggy early November mornings such as this that I truly miss being in Michigan, where I grew up.  Say what you will about the Detroit area, at least there are fewer humans there now than there were in the past.  It can be somewhat depressing to see that, but boy, in Autumn all the trees along the side streets in my hometown looked spectacular, and you could walk from your door to the street without sweating.

If the Detroit area is too sad for you‒or too flat‒then you could go to upstate New York, where I went to university.  That was amazing in the Fall.  Walking back to the dorm down Libe Slope after class at this time of year was like seeing a fifty-mile-wide fireworks display happening in slow motion, spread out over many weeks.  Of course, Winter was quite cold, bitter, and snowy there, but if you were adventurous, you could take a tray from the dining hall and “tray” down Libe Slope.  I never did that, myself; there was a road right at the bottom of the hill, and though it was not busy, it was hard not to think about careening uncontrollably into some passing salt truck.

Actually, they really did an amazing job keeping the roads clear in Ithaca in the winter.  They had to keep them clear.  There were many slopes in town that could have served as ski jumps if you’d put an upcurve at the bottom, so these had to be cleared pretty much as fast as the snow could fall.

Of course, while I have my complaints about Florida, I did come here of my own free will**, and have had many good times and good life events here, the most outstanding of which was the birth of my daughter.  I can’t ever complain about that.

My son was born in New York (not Ithaca) but we left before he was old enough really to remember it.  Both of my children are Florida kids, effectively.  I wonder how they would feel if someday they moved up North and experienced Autumn there for the first time, beginning to end.  Would they be as wowed by its beauty as I always have been, or would they feel a homesickness for the heat of the Sunshine State as the weather cooled and the days shortened?

Of course, the days don’t literally shorten, just the daylight hours.  There are subtle variations and even occasional tiny diminutions of the day, as happened recently, but overall, barring other inputs, the rotation of the Earth will continue to slow very steadily and gradually, so days will become longer.  If nothing else, since the planet’s mass is not perfectly symmetrical, it must radiate away some minuscule amount of energy in the form of gravitational waves as it turns, and the Moon/Earth orbital pairing will radiate some, too.

When I say “minuscule”, I’m guilty of severe (and ironic) understatement.  The sun will surely long since have gone through red giant and on to white dwarf status before there would be any appreciable loss of rotational energy from gravitational waves alone.  I can’t give you the numbers‒if anyone out there can, please share‒but it’s tiny, it’s wee, it’s verging on infinitesimal.

Speaking of small things and their opposites, yesterday’s post ended up being unusually long and exceptionally dreary***, so I’ll bring this one to a close now.  Thank you for your patience, thank you for reading, and if you have any comments about reactions to autumn, or to major changes of local climate due to moves throughout life, I would be interested to know about them.  No pressure.


*Yes, I came for the 4:45 am train, but only because there wasn’t an earlier one.  I couldn’t sleep.

**So to speak.  I’m provisionally convinced that there is no such thing as free will.  I could be wrong, of course, but it doesn’t really matter all that much.  As I like to say, I either have free will or I don’t, but it’s not like I have any choice in the matter.

***But nonetheless true.  I can’t pretend that it was an exaggeration nor that really, my mental health is just fine.  It is not.  It’s horrible.

Can we do better than recycling?

Well, I forgot to bring my little laptop back to the house with me yesterday, so I’m writing this blog post on Google Docs via Google Drive on my phone.  It’s very handy, obviously, but it’s not as good a word processor as MS Word, though it has its own relative advantages.  Also, it’s just easier to write using a full, true keyboard than with the simulated keyboard on a smartphone.

It’s not a good sign that I’ve forgotten my laptop.  It’s been years since I forgot it prior to recent weeks, but now I’ve forgotten it twice within about a month.  I am mentally quite foggy, it seems.  You all can probably tell that already, but it’s harder to recognize one’s own deterioration from within, since that with which one does the recognizing is that which is deteriorating.

How troublesome.

Despite not being at my best, I did have a somewhat interesting idea yesterday‒not for the first time, though it’s become a bit more coherent with each iteration, as such thoughts seem to tend to do.  I was bringing some boxes out to the big dumpster that is reserved solely for cardboard when it occurred to me‒again, not for the first time‒that we should not be recycling cardboard or paper.  Neither should we be sending it to landfills.  In landfills, of course, paper decays and decomposes, thereby releasing methane and carbon dioxide, so that’s not good.  But the process of recycling is wasteful and inefficient, producing pollution and releasing greenhouse gases in its own right.

New paper and cardboard are made from trees grown on tree farms, or such is my understanding.  In other words, old-growth forests don’t get cut down to make paper*; rather, new trees are planted and grown, capturing CO2 from the atmosphere as they grow, though that process is slow and rather inefficient.  But paper and other such things can probably be made from other, faster-growing and even more robust alternatives.

One frequently hears of hemp being touted as a fast-growing source of cellulose and the like, and though I suspect that some of its touted miraculous attributes may be exaggerated, this one seems fairly straightforward.  It’s a rapidly growing plant, the fiber of which has been known to be useful for centuries.  It shouldn’t be too hard to use it for paper and cardboard, and in the meantime, fast-ish growing trees can continue to be planted and take some of the CO2 from the air.

Okay, so, if we don’t recycle it, what do we do with the paper and the cardboard?  We do what some carbon capture technologies are already doing with the carbon they remove from the air: we bury it deep in the earth, preferably in a way that prevents it from decomposing and releasing its carbon back into the atmosphere.  There are ways to do this, in principle, that should be rather cheap.  I would imagine that vacuum packing before deep burying might do the trick.

The ideal place to dispose of it—indeed, it would be a good way of disposing of much of our carbonaceous wastes, including our own bodies, when we die—would be near a deep ocean subduction zone, where it would eventually be carried back into the mantle of the Earth to remain sequestered and redistributed for millions of years.  Of course, one would probably have to do such deep ocean “burials” on large scales to avoid it being a net detriment, carbon-wise.

Cremation certainly doesn’t make sense when it comes to atmospheric carbon, though it may be better for space considerations. It’s probably worse than burial for the overall environment.  But humans are superstitious about their bodies and the bodies of their relatives and whatnot, so convincing them to do something sensible with them might be a serious uphill battle.

Even plastic should probably not be recycled, except where recycling, without subsidies, can produce something more cheaply, more efficiently, and at a lower atmospheric cost than making new plastic for the same use.  Better to do the deep burial thing with that as well.  Plastic can be an excellent carbon sink, and instead of recycling it, we can put more effort into producing neo-plastics from plants rather than petroleum, again removing carbon from the atmosphere.

It’s interesting how feel-good ideas of the past (and the present) can sometimes turn out to be more detrimental than beneficial.  But that’s why one must always assess and reassess every situation as it goes along, testing all knowledge against the unforgiving surface of reality, and not being afraid to rethink things.  At the very least, it can be fun.

I used to think it would be a great idea to breed and/or engineer bacteria or fungi that can digest plastics, but now I realize that this would release a vast quantity of new carbon dioxide and methane and the like into the atmosphere.  Better to have algae that trap carbon and then are converted into plastics, or fuel, or something similar.  At least for now.

Because solving one problem, assuming that even happens, will always lead to new, unforeseeable problems and questions that must be addressed.  But each new question faced and each new problem solved makes the knowledge and capacity of civilization greater.  There is no upper limit on how much can be known—or if there is, it’s so far beyond what we do know that we cannot even contemplate it sensibly.  There is, however, a definite lower limit of knowledge (not counting “anti-knowledge” or stupidity, which is another point of exploration entirely), and that is zero—a return to a state with no life, no mind, no information.

Some of us find that state enticing for ourselves, but when I’m feeling unusually generous, I think it would be a shame for civilization to come to naught.  There’s nothing in the laws of nature preventing it from happening, though, any more than there’s anything preventing a reckless teenage driver from being killed in a car accident, no matter how immortal he feels.  It’s never too early to try to learn discipline and responsibility, to become more self-aware and aware of the universe…but it can be too late.

Anyway, that’s enough for the day.  I hope I didn’t bore you.  Have a good day.


*More often, it seems, this is done to create new farmland, which is a separate issue.

The borogoves sure are mimsy today, aren’t they?

It’s Friday again, and another weekend approaches.

Yippee.  Huzzah.  O frabjous day.

I think I don’t work tomorrow—at least, I’m not supposed to—so there probably won’t be any blog post then (which will be Saturday, unless some hitherto unimagined catastrophe literally throws the days of the week out of order).

I may be posting a new video on my YouTube channel this weekend, though.  I haven’t made one yet, so there’s no guarantee that something won’t stop me from doing so.  I’m unlikely to be lucky enough to be involved in an asteroid impact between now and tomorrow, but there’s a functionally limitless number of things that could, in principle, stop me from recording a video.

Nevertheless, it is my intention to make a video, so I probably will.  This is a different type of thing than fasting; no physiological processes and neurological feedback loops are likely to interfere with my commitment to making a video.  Evolution is, so far, utterly blind even to the existence of videos…though that could change.

I’m still not sure what topic I want to address in the video, unlike last time.  I may literally just start my timer, start my video, start to talk, and see what happens.  If that sounds like an inauspicious way to start a video, well, you’re reading the written equivalent of it right now.  If you enjoy this, you’re proof that it can work.  If you don’t enjoy it, that’s not proof that it cannot work, since your lack of enjoyment doesn’t preclude anyone else from enjoying it.

People do seem to have trouble understanding that others can like things that they themselves find disgusting.  I can sympathize with that, and fall prey to the failing myself, but that doesn’t make it reasonable.

It’s true that all mammals, let alone all humans, have more in common than they have differences, but nevertheless, the potential differences just within a given species, given sexual recombination of genes and the sheer number of genes each individual has, are well worthy of the adjective “astronomical”, so we shouldn’t be surprised that others like things we find repugnant.  In fact, given that the number of possible combinations of gene pairs in human DNA alone is vastly larger than the number of (for instance) light years the visible universe is across*, maybe we should switch our use of the terms “biological” and “astronomical” to describe very large numbers.  Unfortunately, I think most people wouldn’t catch on to the nuance of saying that something was “biologically large”.

Oh, well.  It was a brief dream, swiftly shattered by the one who dreamed it.  Typical.

Anyway, so, I’m back on food again, more’s the pity.  I’m tired of having all these biological urges and needs and drives.  They’re very irritating.

Also, I’m tired of how stressed and angry I get about things people do at work.  Don’t get me wrong—the specific things I’m thinking about are worthy of anger.  But the problem is that I get so stressed, and so angry, and it just makes me hate myself more and more all the time, without any evident upper bound to the process.

I wish it were true to say, “I can’t stand it anymore”, but unfortunately, I’m able in principle to continue standing things for who knows how long.  I wish I would just collapse into a heap, and literally, physically, not be able to go on.  It would take so much out of my hands and would be such a relief.  Unfortunately, there’s no clear sign of that happening, though I try to sabotage my own health as much as feasible without being Baker Acted.

And here is another maddening thing that just happened:  the trains this morning, it turns out, were all shifted to one side of the track, as happened once last week.  But this time it wasn’t announced early, so I went to my usual spot to start writing this while waiting.  Then, when the “announcement” was made, it was just posted on the overhead light board; there was no verbal announcement, even though they usually give recorded verbal reminders about such things—they’ve been informing us ever since Labor Day that the system will be running on a Sunday schedule on Thanksgiving, which is in November, for those of you who don’t know.  Labor Day was at the beginning of September.

I only avoided missing my train because I always start getting ready to board five minutes early, and I looked up from my writing to notice that there was no one on my side of the tracks.  Only then did I see the notice that trains were all boarding on the other side.  I was able to take the elevator up to the bridge, but I had to rush down the stairs on the other side because my train was approaching, and my knees and hips and ankle were miffed about that.

It would have been nice for one of the people who always gets on the same train I get on to have said something to me, rather than just letting me sit there typing on one side of the track by myself.  I’d like to think I would have said something to them, were the situation reversed.  Maybe I wouldn’t.  Maybe it’s an instance of the bystander effect.  Maybe it’s one of those rare circumstances in which my reticence to interact with strangers is obvious to everyone, and I seem so unpleasant that no one wants to interact with me even enough to say, “Hey, all the trains are boarding on the other side for some reason…better cross over.”

Better cross over.  That’s the best idea I’ve heard today, that’s for sure.

Okay, well, that’s it for today’s disjointed meandering.  I hope you’ve found some modicum of joy in it.  It would be nice to be able to do at least something positive for the world, even if it’s small.  It would be far better than what I usually do.


*Using the particle horizon as the measured “distance across”. **

**Actually, since there are four bases in human DNA (guanine, cytosine, adenine, and thymine), if they were assigned randomly, then even a string of 1,000 base pairs has 4^1000 ≈ 1.15 x 10^602 possible combinations.  If memory serves, this is larger than the String Theory landscape, which number is already so vast as to lead many physicists to say it can predict anything and therefore it can predict nothing.  And human DNA is on the order of a billion nucleotides long.  My computer calculator can’t deal with billionth powers of four, but since exponents multiply rather than add, 4^1,000,000,000 = ((4^1000)^1000)^1000, which comes out to roughly 10^602,000,000.  The diameter of the visible universe in Planck lengths is only about 5 x 10^61, which is not even close to the same order of magnitude.  Of course, the maximal information within a horizon the size of the visible universe is larger still, but then again, that’s a measure of the maximum entropy possible within that region, so that’s almost a given.  I think it’s 2^(10^123) or something along those lines.  I may be getting at least some of this wrong.
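If you want to sanity-check those magnitudes yourself, a couple of lines of Python will do it.  The only thing assumed here is the standard identity 4^N = 10^(N · log10 4), plus Python’s exact integer arithmetic for the smaller case.

```python
from math import log10

# The decimal exponent of 4**n, i.e. the x in 4**n = 10**x,
# computed without ever materializing the enormous integer.
def decimal_exponent(n_sites: int) -> float:
    return n_sites * log10(4)

print(decimal_exponent(1_000))          # ~602, so 4**1000 ~ 1.15 x 10^602
print(decimal_exponent(1_000_000_000))  # ~6.02e8, so 4**(10^9) ~ 10^(602 million)

# For the 1,000-site case, Python can verify exactly: 4**1000 has 603 digits.
print(len(str(4**1000)))
```

So the billion-nucleotide figure isn’t 10^1806; it’s an exponent itself in the hundreds of millions, which rather strengthens the “biologically large” point.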

Welcome to the October Country

Well, it’s October 1st, the beginning of a new month in 2022, a month that, judging by its name, was originally meant to be the eighth.

I’m at the train station and, it being Saturday, the schedule is different from the weekday one.  There’s also some question of whether the trains are boarding on the usual side or not.  There’s a displayed “announcement” on the light boards that all trains are boarding on one side at this station until further notice, but it could be something left over from yesterday.  Also, the guard is not aware of anything regarding the change in sides.

Nevertheless, today was a day for ordering the monthly pass on the machines, and the ones on my usual side weren’t even working, so I’m on the other side for the moment, anyway.  I’m going to have to try to be vigilant as the time for my train approaches*.  If I miss one train, the next won’t come for another hour.

It’s hard to be vigilant, though.  I feel absolutely exhausted.  My brain feels like it’s barely running on one cylinder, metaphorically speaking**.  I’m just so very tired.

Thankfully, I can embed below the video that I did end up posting on my YouTube channel yesterday afternoon, so that can provide some of the content and spare me a little writing today.  I might as well, since what I’ve written so far is about some of the most banal things imaginable.

Just a bit of clarification about the video, in case any is necessary:  Obviously I don’t mean to say there is literally no life in the universe, since that would be a contradiction (If there were literally no life, then I could not be speaking about the fact).

I just have always been irked by people who make the wide-eyed claims that it’s so amazing and quasi-mystical that the constants of nature are so perfectly designed to make life, and that must imply some sacred meaning or purpose to it.  That’s about as idiotic as looking at the location of a speck of dust in the corner of a school gym and saying how amazing it is that all the facts of nature conspired to bring that speck of dust right there at that point…it had to have been part of some greater purpose!  It’s drivel.  Only the case with life is even more unimpressive.

My biggest issue with this is that it leads to a kind of quiescence, an assumption that, if the universe was “designed” just so that life can exist, then life, and particularly intelligent life, must be important, and the universe will somehow arrange things to nurture us and protect us from extinction.  If you think that’s the case, then ask the dinosaurs, or better yet, any of the far greater numbers of life forms that went extinct in the Permian-Triassic “Great Dying”.

Oh, wait, you can’t.  They’re all extinct.

No, the universe is almost completely hostile to life, both in terms of its space and in terms of its time.  We are lucky beyond ordinary imagining, though I tried in the description of the video to give some notion of just how lucky in spatial terms, at least, by noting that life exists in roughly only 1.5 x 10^-64 of the universe’s volume.
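Out of curiosity, that fraction can be roughed out in a few lines.  The figures below are my own assumptions, not the ones behind the video’s number: an observable-universe radius of about 4.4 x 10^26 m, Earth’s radius, and a thin spherical “biosphere shell” around the Earth’s surface.  Notice how strongly the answer depends on how thick you take that shell to be.

```python
from math import pi

# Assumed figures (not taken from the video), both in meters:
R_UNIVERSE = 4.4e26   # rough comoving radius of the observable universe
R_EARTH = 6.371e6     # radius of the Earth

def life_fraction(shell_thickness_m: float) -> float:
    """Fraction of the universe's volume occupied by a thin shell of
    'inhabited' space wrapped around the Earth's surface."""
    v_universe = (4 / 3) * pi * R_UNIVERSE**3
    v_biosphere = 4 * pi * R_EARTH**2 * shell_thickness_m
    return v_biosphere / v_universe

print(f"{life_fraction(10_000):.1e}")  # a ~10 km shell: on the order of 10^-62
print(f"{life_fraction(100):.1e}")     # a ~100 m shell: on the order of 10^-64
```

A hundred-meter shell lands right around the 10^-64 figure; a ten-kilometer one gives a couple of orders of magnitude more.  Either way, “lucky beyond ordinary imagining” holds up.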

As far as time goes, well, if you’re thinking of humanity alone, based on the time that has elapsed since the “Big Bang”, which may or may not be the literal beginning of our universe, the percentage is tiny enough, and others have demonstrated this handily, as in the “cosmic calendar” that Carl Sagan made famous in Cosmos.  But if you want to count all expected possible future time, well, then our existence is some fraction of what could be infinity, which is pretty much undefined, but might as well be called zero.  The limit certainly approaches zero as we extend the future further and further.

This is not necessarily a call for people just to give up and say “what the hell”, though you have that option, of course, and it is tempting.  I wanted to note that, if you would like for life to continue, and even to have some lasting, cosmic-scale impact, then you can’t take it for granted.  You need to work at it, and work hard, and work long.  The universe is not trying to kill us (contrary to Neil DeGrasse Tyson’s habitual way of putting it); if it were, we would be dead already.  But the universe is huge, and it does not even have the capacity to care what happens to life, except in the minds of that life itself.

All life is in the situation of a castaway on a desert island—there’s no preexisting infrastructure, there’s no one out there looking out for you or protecting you, or providing your light, your heat, your air-conditioning, your food, your clothes, your shelter, what have you.  If you want any of those things, you’re going to have to make and/or find them for yourself, and you’re going to have to keep doing it, for as long as you actually want them and want to survive.

Without much more ado, here’s the video***.  I forgot to ask when I made the video, but please give a “thumbs up” and subscribe and share if you are at all inclined to do so, for any colorable reason.  And feel free to check out the other stuff on my YouTube channel if it looks interesting to you.  If anyone finds this interesting at all, I’m hoping to make more such videos about topics that interest me, assuming the universe doesn’t eliminate me in the meantime (though it seems likely to do so).  Oh, and please let me know what you think, either in the comments below the video or here.

Thanks.  Here it is:


*Just a slightly later addendum:  They have announced overhead that my train is approaching in 10 minutes, and have confirmed that it is not on its usual side.  So I was right to be proactive.

**Of course, it’s a metaphor.  I don’t honestly think that any of you really believe that my brain is an internal combustion engine of some kind, except in the loosest of possible senses.  Apologies.

***I wore a mask and dark glasses in the video mainly because I don’t like how my face looks—it bears evidence of the many things that have happened to me in the last decade or so.  Maybe no one else can see it but me, but it is what it is.  Anyway, the glasses are awesome, I really like them, and the mask combined with them makes for a good look, I think.  Certainly better than my underlying face, anyway.

This is an untitled blog post…or IS it?

Okay, well, I’m back on the laptop again, today.  I think I did a decent job of gauging how long my post should be yesterday, despite using my phone to write it.  It did seem to take slightly longer to write the same number of words than it would have with the laptop.  It’s just easier to write faster when you’re using a (nearly) full-scale keyboard and more or less all of your fingers instead of your two thumbs to type.

Still, as I think I’ve noted before, I wrote a goodly part of my science fiction novel, Son of Man, using a smartphone that was quite a bit smaller than the one I have now, and I think it turned out pretty well.  At least, the feedback I’ve gotten from the few people I know who have read it and who deigned to comment—one of whom has sadly died—was good.

Not much has changed since yesterday, though.  By which I mean I’m not sure why I’m bothering to keep doing this blog.  I don’t think it’s doing me much good.  As anyone reading regularly can probably tell, my mental health doesn’t seem to be improving at all despite the use of this unidirectional “talk therapy”.

I’m a creature of habit, though, so I’ll continue this until…well, until something stops me, or until I stop doing even this little bit of proactive stuff.  I’m sure that will leave the world no poorer.

The hurricane that’s approaching is not supposed to hit this part of Florida but to make landfall along the central west coast; still, it’s been sloppy and rainy, and a bit windy, these past few days.  Sunday afternoon was sunny and clear, and I went for a long walk near the end of the day, but since then we’ve had wetness.  At least the modest windiness—which may have at least something peripheral to do with the hurricane—makes it feel less muggy.

It’s almost pleasant, and even has a slight autumnal feel to it.  It reminds me vaguely of the times in the year after school had started and as Halloween approached up north, when the leaves would begin changing—something that, alas, doesn’t really happen in south Florida—and you had to wear a light jacket against the breeze, but it wasn’t yet truly cold.

Of course, no jackets are required here in south Florida, unless you’re going to some high end club or restaurant, or unless you’re wearing one to keep off the rain.  But an umbrella works better against the rain here, in my experience, and it doesn’t leave you so sweaty.  However, if you’re riding a motorcycle, a good rain jacket is useful, and rain pants if you have them.  A good helmet is more than adequate to keep your head dry, and even keeps it warm in what passes for cold weather in south Florida*.

Here I go again, talking about the weather.  It’s rather pathetic, I know, I’m sorry.

I guess I could comment on political or scientific stories if you’d prefer.  I don’t know what happened with the NASA probe thing last night, the experiment to try to shift the orbit of an asteroid.  It’s a proof of concept, basically, to tease out the workings of the process of changing the long-term orbit of an asteroid, in case one ever appears to be headed for Earth.

The laws of motion and Newtonian gravity are more than adequate for us to tell well in advance where an object’s orbit will take it—if we know where the object is and how it’s moving—and what sort of change would make it no longer headed to intersect the Earth, if it were otherwise going to do so.  Given enough lead time, even a tiny nudge can be more than adequate to prevent collisions.
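As a back-of-envelope sketch of the “tiny nudge” point: to first order, a velocity change applied t seconds before the predicted encounter shifts the object’s position by roughly Δv · t (real orbital mechanics tends to amplify an along-track nudge further over time, so this is conservative).  The numbers here are illustrative, not from any actual mission.

```python
SECONDS_PER_YEAR = 3.156e7

# First-order miss distance from a small velocity change applied
# `lead_time_years` before the predicted encounter: dv * t.
def miss_distance_km(dv_m_per_s: float, lead_time_years: float) -> float:
    return dv_m_per_s * lead_time_years * SECONDS_PER_YEAR / 1000.0

# A 1 cm/s nudge, applied ten years out, shifts the position by
# about 3,156 km -- roughly half an Earth radius.
print(f"{miss_distance_km(0.01, 10):.0f} km")
```

Which is why early detection matters more than raw deflection power: double the lead time and you halve the nudge you need.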

Of course, also given enough lead time, a tiny nudge and the same technology could alter the trajectory of a hitherto harmless asteroid and put it in a trajectory to hit the Earth.

Don’t think I haven’t thought about it.  Regrettably, I don’t have the resources to pull off such a scheme.  However, there are now at least a few people in the world who have their own private space programs, some capable of interplanetary travel.  I wouldn’t put it past Elon Musk to steer a modest asteroid toward Earth, to cause a catastrophe just massive enough to support his push for human colonization of other planets, as a sort of object lesson.

Okay, well, I don’t really think he would do that.  He has too much to lose, and it could be quite tricky to steer such an asteroid finely, so that it hit where on Earth you wanted it to hit.  But it might be a good way to unify the human race.  I’ve often thought that we need a real supervillain to bring the world together.  I would volunteer, but I don’t think humanity is worth the effort.  I’m more inclined just to steer a whopping BIG asteroid at Earth and do a planetary reset.

I wouldn’t do this for any ideological reason, and certainly not for any religious reason.  I believe the supernatural cannot exist by (my) definition**.  I just think it would be a good test, of sorts.  If humanity were able to come together to prevent the catastrophe, or to at least survive it and rebuild, they would have demonstrated their continuing worthiness.  And if not, well, then not.

Honestly, given the fact that life is more or less inevitably dominated by fear and pain***, I often veer toward anti-natalism, and even pro-mortalism (look them up).  Of course, given that I have children, and they are the most important two facts about the universe to me, by far, I can hardly be said to be a pure pro-mortalist or anti-natalist.  But then, I never claimed to be.

I don’t think it’s usually good to try to define oneself by any “ism”.  It’s vanishingly unlikely that any one given, finite ideology will have come up with reliable, complete, and final answers regarding much of anything about life.  If it had, I suspect that fact would have become evident, if not obvious, by now.

Knowledge and deep understanding are gained incrementally, not revealed by some “authority”; the universe is extremely complex, at least on scales like the surface of the Earth at this stage of cosmic evolution.  We can’t expect any simple, easy-to-solve equation to describe even the eddies and whorls that take place when milk first begins mixing into coffee, and that’s more or less the stage of the universe we’re in right now (on a much bigger scale than a cup of coffee, obviously).

Okay, well, I don’t know how I got around to those subjects, but I guess that’s the sort of thing that can happen with stream-of-consciousness writing.  At least it wasn’t just a complete rehash of what I wrote yesterday.  Hopefully tomorrow will likewise not be a rehash.  Tomorrow and tomorrow and tomorrow may creep on in this petty pace to the last syllable of recorded time (which record will eventually decay as time goes its interminable way), but each morrow will differ in its details, at least until all things are washed out by entropy.  It’ll be a while—on the mortal scale, anyway—before that happens cosmically.

Keep your eyes peeled and your ears pricked up, though.  It is coming.

Cloudy coffee


*To be fair, if you’re riding at 70+ miles per hour, even a low in the low fifties feels pretty darn cold, but that sort of weather won’t be back for months now, and goodness knows if I’ll ever ride again.

**By which I mean to say, even if there were such things as gods and demons and angels and spirits and so on, if they really existed, then they would in fact be part of nature, and would have a “lawful” existence of some type, and would therefore be natural.  Only imaginary things can be “supernatural”.

***I’m sure I’ve gone into this before.  It is essential for any successfully reproducing organism to have strong senses of pain and fear, to avoid danger and to avoid and seek to mitigate damage.  These must be more immediate and powerful—and potentially more enduring—than any sense of pleasure or joy.  All pleasure and joy must, by nature, be fleeting, or else an organism will not be driven to work to survive, to reproduce as often as feasible.  An organism that feels little to no fear or pain, and that experiences lasting and powerful joy from any given stimulus or circumstance, will live a blissful but short life, and will be outcompeted by fearful, aggressive, and pain-prone creatures.  It would not tend to leave many offspring, all other things being equal.

Just Another Tuesday

It’s Tuesday morning again.  Another Tuesday.  This one is the 20th of September, in case anyone in the future is reading and wants to know what day this post was on, and is not reading this on the site proper, where the date is—I think—displayed above the post.  I’ll assume that anyone who cares about the date and is reading it today already knows what the date is, as well as the year.

There’s nothing really new to report, obviously.  As has been the case for a while now, I haven’t written any new fiction, haven’t done more than pick up a guitar, strum at it, and think about how shitty my playing sounds before putting it back down again.  Standard issue things to do, you know?

It continues to be dreary and rainy around here, though we have it easier than Puerto Rico, where the hurricane has knocked out power for the whole island.  That hurricane is not expected to head toward us at all, though it did just now really start to rain rather heavily.  The train stations are all covered though, so the rain doesn’t matter much unless it becomes quite windy, and right now there’s essentially no wind.

I thought it might rain as I was headed toward the train station—not just because this is south Florida and it’s been raining every day, so why should it stop, but because I could see tall, pillar-like clouds looming, even in the night sky, lit by reflected urban lights below.  They look nifty, but the shape of them, and the updrafts that no doubt exist within them, cooling all that airborne water, make it all but inevitable that rain will fall.

And now, as if conjured, the wind arrives, and speckles of rain are appearing on the screen of my laptop.  At least it’s somewhat refreshing.  If it becomes too prominent, I may have to pause and put the computer away to protect it.  But if that happens, none of you will be able to tell unless I tell you about it.  Weird, huh?  Well, no, not really I guess.  I think that’s just me—I’m the weird thing here.

Anyway, the rain is already slacking off some, and there’s only the tiniest of breezes remaining.  Further bulletins as events warrant.

I suspect that nearly all the noteworthy events in my life have already passed, though.  There’s very little else to say, though that doesn’t seem to stop me from saying it.  I “talk” to all of you, because I seem incapable of talking to anyone else.  That’s my fault, not anyone else’s.  I’m a faulty mechanism, what can I say?  I’m faultier than San Andreas.  I’m buggier than the Amazon rainforest*.  I’m not a very good device.  Not to say that I don’t have some remarkable design features, but none of them are really specific to me; they’re standard in the model.  The ways in which I am not standard seem to be associated with problems, which I guess is often the case.

Or maybe that’s all just egotistical in its own way, even though it’s fundamentally a case of self-loathing.  It’s probably just as arrogant to think that one is exceptionally bad or imperfect as to think that one is exceptionally perfect or good.  But there are more ways to be imperfect than to be perfect.  At least, it seems like that would be the case, though frankly, I’m not even sure what it would mean for a person to be perfect, and I’m not sure that anyone else knows what it means when they say it, either.  People use the word without really thinking about it, though to be fair, I don’t hear people referring to other people as perfect very often, and good on them that they don’t, since I don’t think anyone is perfect by whatever standard you might choose**.

Well, the train just arrived, but, as happened yesterday (which I didn’t mention then), whoever is driving it today stopped way “sooner” than any of the other drivers do, and so I had to follow the other people, the ones who hadn’t gotten up off their asses early to wait for it to arrive, as I had, because I try to plan ahead.  Also, someone is sitting in my usual seat, which makes me unreasonably frustrated.  I know I have no claim on any particular seat or anything, but I try to do my stuff consistently so there are fewer surprises with which to deal, but that doesn’t seem to work.

Here’s an aside, though.  This is one of the trains that’s running the automated PA announcement system, which tells you which station you’re approaching and reminds you to check for your belongings before you get up and leave.  Then it says, “Please watch your step while you’re exiting the doors.”

Am I the only person who finds that last sentence irritatingly a-sensical?  “Exiting the doors” seems to imply that you were, until that point, inside the doors!  But no one is inside the doors.  The doors are barely three-dimensional; no ordinary, human-scale organism could actually be inside the doors.  Passengers are inside the train cars, they exit through the doorways, they don’t exit the doors!

If the person who wrote and recorded that announcement—which has annoyed me since the first time I heard it—is out there, can you please just come and kill me?  You’re one of the things that makes this planet so intolerable, and it would be just as well if you could help me leave it, since I’m looking to do that anyway.

I want to say that I feel like I’m losing my mind, but the problem, if anything, is that my grasp of reality is too persistent and consistent.  My weakness, if you will, is my relative inability to delude myself.  I can see the chaos (in the mathematical and poetic senses) for what it is, as well as the infinite stupidity*** of everything out there.

It sometimes seems that I can literally feel the yawning emptiness of the cosmos, but I know that’s an illusion.  I’m no more capable of truly conceiving of the infinite than is any other finite being.  But it does sometimes seem that I can feel it, just vaguely, looming above me and above everything, as well as beneath me, since “above” is a relative measure, and we are surrounded in all directions by mostly empty space.  Sometimes that’s even comforting.  You know, like the song says, “Freedom’s just another word for nothing left to lose.”

I don’t know what point I’m trying to make.  I don’t think there is a point, either to this post, or to anything else.  It’s just another post, just another Tuesday, just another meaningless instantiation of “atoms and the void”, to quote Democritus.

I wonder if that was his real name, “Democritus”?  It seems too coincidental to be what his parents named him.  I know “Plato” was a nickname; I’m not sure about Aristotle.

Oh, well, what does it matter?  He’s dead, and he’s been dead for a couple of thousand years.  I always knew he was smart****.


*Which I like better than the Microsoft rainforest or the Google rainforest.  Ha ha.

**Unless you choose some cheesy standard such as “perfect at being who you are”, but in that case, everyone is perfect, which is just another way of saying that no one is, so it adds nothing.

***No matter how large an intelligence is, as long as it’s not infinite, then its stupidity, or at least its ignorance, is always infinite.  I know, that’s probably an unreasonable standard against which to measure any intelligence or anything else, really, but I never claimed to be trying to be fair, just that I can recognize the endless abyss of lack that lies beyond the realms of anything finite that exists.

****Well, no I didn’t.  I haven’t existed always, for one thing; I’ve only been around for just shy of 53 years, though sometimes it feels like it’s been millennia.  Also, I hadn’t even heard of Democritus for the first ten years or so of my life, not until Carl Sagan talked about him in Cosmos.  So “I always knew” is just flagrantly inaccurate.  It’s a bit like how people say things like, “that email never came”.  I always want to say, “Never?  You waited until the end of time itself, and the email still hadn’t arrived?  I mean, never is a really long time.  If you wait an infinite amount of time, anything possible that can happen will have happened, so it seems truly impossible that the email never arrived.  EVERY email should have arrived if you waited long enough to legitimately use the word ‘never’.”  But I hold my tongue…usually.  It gets my fingers wet, though.

All talk is small—all facts are trivia

Well, it’s Monday morning again, now the 19th of September in 2022, and I’m again at the train station waiting for the train to bring me to work…though before I’m done with this post, I’m sure I’ll already be on the train.  I write pretty fast, but it’s rare that I finish the first draft of any blog post before the train arrives, unless it’s running quite late.

This is the last Monday of summer in 2022, for whatever that’s worth.  It’s still irritatingly hot here in south Florida, and more importantly, it’s muggy and has rained every day.  Yesterday morning there was an absolute torrent for a bit, then it slacked off for a while before sputtering on and off throughout the rest of the day and night.

Yes, I am writing about the weather.  I don’t know if that’s better or worse than talking to someone about the weather.  I’m not much good at small talk, so maybe writing about the weather is better.  It doesn’t make me feel stressed, at least.  Possibly there are people out there who wish that it did, so I wouldn’t write such things.  But then again, unlike small talk, there’s no social pressure on anyone to read what I write, so it’s ethically better to write nonsense than to talk trivialities: no one else is pressured to go along with it or to respond in kind.

That is one of the issues with small talk, after all.  When someone starts talking to you about something in which you have no interest, or which you find irritating, there’s this weird social impetus at least to give a cursory listen to what they’re saying.  That’s a puzzling social dynamic, when you think about it.  Why do people feel pressure to interact with someone when that other person is not saying anything of interest?

But of course, people do feel that pressure, and so small talkers can impose themselves upon their…well, let’s call them their “victims” for lack of a better word, knowing that the victims will feel the urge to interact politely, even if they have no interest in the conversation.  The only people who would feel comfortable just ignoring the small talker are those who feel no moral or social obligations, who can walk off and ignore the first person with internal impunity, perhaps even sadistically initiating small talk with someone else solely for the purpose of tormenting them, knowing that others feel the pressure to go along with it.

In other words, small talk rewards sociopaths.

For this, and for many other reasons, we should abolish it.  Also, it makes people like me feel ridiculously awkward, because for me, conversation is something that generally serves a purpose, one related to the subject of the conversation, so engaging in small talk is rather like watching an old-school television tuned to an empty channel and trying to discern what the meaning behind the static might be.

At least a percent or so of that crackling and hissing and “snow” comes from the cosmic microwave background, the leftover heat from the early universe, last scattered when the cosmos was roughly 380,000 years old and had finally cooled enough for electrons and protons to bind into neutral atoms, so photons could fly freely through space instead of ricocheting off a stray charged particle every few instants.  That’s an interesting fact, unlike most things to do with small talk.
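As a further aside, you can even estimate how hot that leftover radiation was back when it was set free.  The numbers below are my own supplied textbook figures (the CMB’s temperature today and the redshift of the last-scattering surface, roughly 2.725 K and 1100), not anything measured in this post; the temperature of the radiation just scales with the expansion of the universe:

```python
# Rough sketch: estimate the temperature of the universe at recombination.
# Assumed standard values (not from this post):
#   T0 ~ 2.725 K  -- CMB temperature measured today
#   z  ~ 1100     -- approximate redshift of the surface of last scattering
# The CMB temperature scales with redshift as T(z) = T0 * (1 + z).

T0 = 2.725            # kelvin, CMB temperature today
z_recombination = 1100

T_then = T0 * (1 + z_recombination)
print(f"Temperature at recombination: about {T_then:.0f} K")
```

That comes out to about 3,000 K, roughly the temperature below which electrons and protons can stay bound as neutral hydrogen, which is why the photons stopped scattering around then.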

Although, in a sense, the cosmic microwave background, and whatever it implies or records evidence of, is not much more significant than the weather is.  In fact, on any given day, it’s probably far less crucial than the weather.  The weather is useful to know: whether to bring an umbrella (I always do, anyway), whether one should bring a jacket (rarely necessary in south Florida in September), or whether there’s a hurricane threatening*.

So, if small talk is a way of spreading seemingly trivial, but potentially consequential, bits of information from one person to another, to try to keep the whole group, or “flange”, in a state of preparedness, I guess that could be a good thing.  That is, it would be a good thing if you think it’s a good thing for groups of humans to be mutually connected and better prepared to protect themselves and each other from the elements.

Most days, there are at least a few moments when I would much prefer for a massive storm to come up and blow them all away.  But don’t be misled into thinking that I’m just a misanthrope.  I don’t think other animals, or plants, or fungi (or microbes) are any finer or more innocent or sweet or lovable than humans.  They aren’t.  Indeed, nature does not select for sweetness except as a means to an end.  A baby is sweet and cute because that fact manipulates the nervous system of adults to protect it and care for it.

All life manipulates and exploits and preys on other life in one way or another.  Even photosynthetic organisms compete with other such organisms for light, trying to out-produce and out-reproduce the organisms around them.  “Nature, red in tooth and claw” has been accused of focusing unnecessarily on violence as a description of the world, but in fact it’s merely too narrow.  Nature could be accurately described as red in tooth and claw and leaf and branch and fur and feather and shell and stem, and so on.

Even cooperation strategies are mainly ways of forming gangs to outcompete other gangs.  What’s more, they are all vulnerable to the defection of any member of their group—thus the horror of cancer, as individual cells in a body lose their inhibitions and start to reproduce without check, temporarily succeeding but eventually destroying the organism.

So, though there’s nothing inherently evil or wrong with life, from some moral point of view—since morality doesn’t have any meaning without life in the first place—there’s nothing particularly moral or good about life, either.  Life likes life, as a general tendency, and tends to make excuses for itself, which it would, and fair play to it, but it’s just a highly localized, complex epiphenomenon (or set of epiphenomena) that for all we know exists only on the surface of the Earth.

It may be true that we cannot rule out life existing elsewhere in the cosmos, and it may seem terribly unlikely that the only life in the universe is on Earth, but it’s very tricky to extrapolate probability from a single instance of a phenomenon.  It’s a pretty indisputable fact that nearly everything we can see in the universe is inhospitable to life as we know it.

Maybe the answer to the Fermi paradox is that there is no sign of life outside of Earth because there is no life outside the Earth, and all that one would ever hear, if one were to listen to the cosmos forever, is static.  Not even small talk.  Life on Earth could be the true aberration, an abomination of sorts…except, of course, nature doesn’t do abominations; nature just does whatever it does.

I don’t know what point I’m trying to make with all this.  Maybe there is no point.  Maybe that, in fact, is the point.  Maybe I shouldn’t lament or bemoan small talk, because all talk is small talk when you get right down to it, and every fact is trivia, and all of history is just a “poof” of a random sound taking place in a wasteland…a pebble dislodged by the wind and rolling down a sand dune to rest a little lower than it had been, but without any purpose, without any goal, without any inherent or external meaning.

Anyway, what I’m really trying to get at is, the weather sure has been crappy lately, hasn’t it?


*As far as I know, there isn’t.  Not in the Atlantic, at least, not one that’s going to head toward Florida.  But I haven’t checked the hurricane center since Friday or Saturday, when there was just a tropical storm that was never going to hit us here unless something truly weird happened.