Udaimonic so-and-so, U.

It’s the last day of February, everyone.  It’s also Friday, the last day of the “typical” work week, and it is also the last day of my work week, since I am not working tomorrow.  It’s not as though I have anything particular to which to look forward this weekend, but I do need the rest.  I’ve been feeling exceptionally exhausted lately.

Alas, as you know, exhaustion does not translate into sleepiness for me, just weariness.  Somewhere in the neurologic centers and relays that connect such things as fatigue and sleep, I have a short circuit, or at least a circuit that doesn’t perform up to spec.

Of course, my pain continues, though as always, I have tried to adapt my activity, my posture, my exercise, my shoes…even my underclothes to try to decrease my pain.  I have put a tremendous amount of mental energy into this over the years.  If I had devoted that time/energy/effort to the study of any abstract problem‒say, the dynamics of an accelerating near-light-speed spacecraft approaching its local Schwarzschild “radius” as length contraction and “relativistic mass” take effect and bring GR into play‒then I would have made significant, possibly really important, advances.

Alas, when one’s problem is chronic pain (coupled, causally or otherwise, with insomnia), it is very difficult to focus enough mental acuity upon other things.  The very nature of pain as a neurological process in animal systems does not allow it to be easily ignored, or indeed to be ignored at all for any length of time.

Those creatures that can readily ignore pain for long, or that don’t experience pain*, don’t tend to leave as many offspring as those for whom pain is both present and urgent.

It’s a similar problem for those rare people who don’t experience fear, though clinically this seems more likely to happen as a result of damage to the brain rather than being congenital, possibly because children without fear really don’t tend to reach adulthood.

It’s interesting to note that, anecdotally at least, people who don’t feel fear tend to be quite frightening to would-be bullies and predators.  They don’t behave like others do in response to potential threats, and predators tend to rely on fear in others.  A person who looks at them with no more fear than they would at a tree or a rock can be quite disconcerting for someone who has become dependent upon the fear of others.

This is one of the reasons it can be good to have dogs present if you’re guarding something.  They don’t fear guns (generally) so one can’t exactly threaten them with firearms.  And if they attack, they don’t hold back.

That was quite a series of little tangents, wasn’t it?  I think they were interesting, but then again, I was the one who brought them up, so that shouldn’t be surprising.  Whether or not anyone else is interested is difficult to guess.  It’s rather akin to the way things are with humor‒it can be very hard to know consistently what other people will find funny, or for them to know what you find funny, so you might as well amuse yourself.  Then, at least, you can watch to see who enjoys your humor, and those people are the ones with whom you can enjoy such things in the future, at least in principle.

I am horribly tired, and I’m in a great deal of pain as I write this, though for the moment at least I don’t notice any fear that might be present.  There have been times when I’ve been so tired and depressed and in pain that I had no reaction to, and felt no fear toward, things that would normally have made me quite afraid, from minor things like wasps and bees all the way up to oncoming cars and trucks.  I don’t tend to be afraid of people much, never have been‒at least, I’m not afraid of them physically.  Socially, they can make me quite tense.  In that case, though, the tension is not the same as fear, though I guess it qualifies as anxiety.

Speaking of fear, I fear this is it for this week.  I truly hope that you all have a wonderful day and a wonderful weekend and that you are healthy and safe and eudaimonic**.


*There are people who have a genetic disorder called CIPA:  congenital insensitivity to pain with anhidrosis (i.e., they don’t sweat), and they basically don’t experience pain.  They also don’t live very long, and before they die their bodies tend to be quite damaged, often by such simple things as standing in one position for too long, since it doesn’t feel uncomfortable to do so for them.  They also don’t notice infections, and they don’t tend to get fevers.  It occurs to me, however, that though their lives are short, people with CIPA might well have significantly longer pain-free lifespans than, say, I have had.  I had pain issues starting at a pretty young age, after all.  Still, if I could be cured of all pain at this stage of my life, when I am hardly worried about my longevity anyway, I think it would be worth it.

**It’s interesting to consider the prefixes “eu” and “u” in words of Greek origin.  “Utopia”, for instance, literally means “no place”, making it clear that an imagined perfect society does not exist and may be impossible.  Whereas, if one were to write “Eutopia”, one would mean “good place”.  Thus, my middle name “Eugene” means “well born” and is etymologically related to the term “eugenics”.  Mind you, only a fool would believe that I was actually the product of some eugenics program, that I am some true-life Khan Noonien Singh***.  “Eugene” was just my paternal grandfather’s name.  On the other hand, while eudaimonia means “good spirit” and refers to a state of general emotional and mental well-being, “udaimonia” would mean “no spirit”.  That sounds more pertinent to me, don’t you think?

***Though I suppose one could speculate that I was a failure of such a program.

And I looked, and behold a pale cat

Well, I have some relatively good news, which is why I decided to write a post today instead of just leaving it:  Dorian, the light gray cat, has returned.  Well…he was back last night, at least, though this morning he was nowhere to be seen once again, which is itself somewhat unusual.

He was a bit scraggly, with some traces of dried blood in the fur on the side of his head and neck, but it didn’t look like it was his blood.  He actually looked lean and healthy, moving very much like the hard-ass stray cat that he is.

I’m guessing that he got into a pretty big fight at some point‒he seems prone to them‒and then hid away somewhere while he recovered his strength.  Then, that pale grey shadow took a new shape* and grew again.

I think stray cats, like defective and damaged people, don’t like to show any weakness to those around them.  Perhaps it’s more accurate to say that they are unable to show their weakness, even though they may crave acceptance and support.  There are good, sound biological motivations for this in stray cats and other mammals; showing weakness or injury can invite further aggression from other cats and even encourage predators.

Of course, human males (or anthropoid creatures living among humans, such as I) are no exceptions to that tendency.

It’s also been said that people on the autism spectrum are like cats, at least in some ways, and I can see the point, though it is an oversimplification.  Still, it leads me to speculate that, sometime in the relatively deep past, perhaps two separate subspecies of humans (maybe the legendary Neanderthals and Cro-Magnons) existed, one being more naturally ultrasocial, the other more constrained but with other capacities that aided their survival.  We know that Neanderthals, for instance, had bigger brains than so-called modern humans, but the structure appears to have been slightly different.

Perhaps it’s the genes from such a separate subspecies that led to some people having ASD or other versions of “neurodivergence”.  To be clear, I don’t know that there’s any good evidence that this is the case.  I did encounter at least one study that looked for correlations between markers known to be associated with the autism spectrum and the Neanderthal DNA residua present in people of European descent.  There seemed to be some correlation, but I didn’t think it was particularly impressive.  So there’s not a lot of data to support the hypothesis.

It would be nice‒in some ways‒to think of oneself as just a different kind of human, not as something alien.  But I think that’s probably a silly dream for me.  I do not belong here in any serious sense; I am an alien, a mutant, a replicant, a stranger.  And to humans, of course, a stranger is presumptively an enemy unless and until proven otherwise.

Anyway, Dorian was back last night, but gone again this morning.  We’ll see if he returns.  There are other cats who come around.  But, of course, there is no real affection from most of them.  They come to me opportunistically, because I put food out for them.  I am useful to them.  Similarly, I am often useful to humans in the world.  I have many skills and abilities, so I have frequently found that people like to have me around to help them get things done.  But eventually, the negatives of my presence outweigh the positives, and people go away (or send me away).

I don’t blame them.  I want to go away from myself, though I have never had any desire to be anyone else.  I would prefer oblivion.  Or maybe I would just prefer rest.

Speaking thereof, I slept almost four hours last night, and of course, I awakened and couldn’t go back to sleep in the wee hours of the night, and I am now at the office finishing this post.  I don’t look forward to the weekend‒there’s nothing good about it‒but at least I can collapse and try to recuperate.  I don’t know if I’ll write anything next week, or just leave everything be.

I feel perched on the borderland between life and death, and the Undiscovered Country beckons.  It must be really great there, because no one who goes ever comes back.


*To be honest, it’s pretty much exactly the same as the old shape.

We skipped the light fandango…

Well, here I am, writing a blog post again on Tuesday, Batman* only knows why.  I don’t really have anything of substance to say.  Not that I had anything of substance to say yesterday.

Actually, come to think of it, I did encounter a neat fact last week.

One morning I decided to get in a bit of reading in one of the textbooks I keep in my office‒Classical Electrodynamics by John David Jackson.  I employed a technique I’ve often used for reviewing:  I flipped a coin repeatedly to winnow down the textbook‒heads is the first half of what remains, tails is the second half, and so on‒until I had picked a random section to start reading.
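For the curious, that coin-flip winnowing is just a randomized binary search over the page range.  Here is a minimal sketch (the function name and the page numbers are my own, purely for illustration):

```python
import random

def pick_random_page(first_page, last_page, flip=random.random):
    """Winnow a page range by repeated coin flips: heads keeps the
    first half of what remains, tails keeps the second half, until
    a single starting page is left."""
    lo, hi = first_page, last_page
    while lo < hi:
        mid = (lo + hi) // 2
        if flip() < 0.5:   # "heads": keep the first half
            hi = mid
        else:              # "tails": keep the second half
            lo = mid + 1
    return lo

# e.g., for a hypothetical 800-page textbook:
print(pick_random_page(1, 800))
```

Each flip halves what remains, so about ten flips suffice for an 800-page book, and (whenever the range splits evenly) every page is equally likely.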

I knew that much of the mathematical formalism and at least some of the technical matters in the book would be unfamiliar, so I didn’t expect to understand fully what I was reading.  But I also know that the stuff I do and don’t understand will linger in my brain, and as I’m exposed to other things that go with it or explain it or link up with it, the picture will form.  I don’t read or learn especially quickly, but I do learn deeply, and in a way that connects ideas and principles together in the end.

There was much of this brief section (which was about refraction and/or absorption of light** by water) that was slightly over my head.  Nevertheless, it was interesting, and the author introduced a graph (see below) showing at the top the refraction of light by water across wavelengths, and how it tends to vary.  I assume we’re all at least implicitly aware of the fact that different wavelengths are refracted by water differently‒thus the phenomenon of rainbows.

Below this is a plot showing the absorption of electromagnetic radiation by water across frequencies.  Here there is a steep upward slope when coming in toward the center from the lowest and highest frequencies:  absorption peaks around the microwave/infrared range coming from the left and around the ultraviolet coming from the right.

Then a striking thing happens.  There is a sudden, precipitous drop in absorption down to very low levels in a fairly narrow range of frequencies in the “middle” of the graph, meaning that in this range, light passes through water with relatively little absorption.  This is the range we know as visible light.

The author took the time to point out that this fact about the nature of water‒that it is more or less transparent in this very narrow range of frequencies‒is exactly why we Earthlings tend to see only in that range.  It’s not an accident of evolution, some ancient, stochastic occurrence that is thenceforward cemented, unchangeable, into all descendants, like the DNA code and ribosomes and the chirality of biological molecules.  It is instead a fundamental fact of physics that determines where creatures will be able to see if they first developed vision while living in water and then developed eyes that, like the rest of them, were mainly made of water.

There’s no point in making retinal proteins that react to wavelengths of EM radiation that are almost entirely absorbed by water.

That simple fact‒simple in summary, at least‒is enough to explain a huge swath of the nature of our visual perception, and it doesn’t require any further explaining to understand why we see in the range of light we do.

That was just a randomly chosen section of a textbook that reputedly is extremely difficult.  I don’t disagree with that assessment of difficulty; it was a very dense bunch of material even in just 4 or 5 pages.  But to think that one can find such remarkable facts while just trying to read and learn in random order from a textbook!

So, that’s an interesting little tidbit that seems worth sharing, at least to me.  It’s far more interesting than anything going on in the human world right now.  What’s more, this is a fact that has existed as long as water itself has existed‒and implicitly, it existed even before that, lying there waiting in the fundamental laws of nature.  And it will be there long after everything but those fundamental laws is gone.

If you want to embrace eternity, and things like Hilbert’s Hotel and Cantor’s diagonal proof make you worry about your sanity (this happens to many of us, so don’t feel bad), then focus on this fact about visible light.  It’s there, it’s real, it’s quasi-eternal, and it’s concrete.

Though the absorption spectrum for concrete is…quite different.


*This harkens back to the reference from Batman Begins, when Flass says “I swear to God,” and Batman snarls “Swear to me!”  It seems fun to use Batman when one would normally say God.

**By “light” I mean all electromagnetic radiation, from radio waves to gamma rays.

Thoughts meander like a restless spore inside a humid room

In case you’re confused:  Yes, it is Monday, even though I’m writing a blog post.  I just decided that I might as well do something that gives the illusion of productivity, since, y’know, I’m awake and on my way to the office anyway.

As for topics about which to write, well, on that I don’t know.  I don’t think there are any momentous or interesting things worth discussing that are happening in the world right now (ba-dump-bump, chhh!).

Still, it is true that to a very good approximation every event that happens and that seems so earth-shattering and important in the moment is utterly forgotten by everything except quantum mechanics and Laplace’s demon.

Don’t believe me?  Do you remember that scandal in the Roman Senate when Cinna the Younger misdirected Republic funds to buy a “servant” for his household?  No?  Neither does anyone else.  Of course, that was 2000 years ago, so you may think it doesn’t count, but I’m sure almost none of you know about any of the “crucial” events of the day or the escapades of popular stars even 40 years ago.  I certainly don’t remember any, and I was fifteen at the time, and I have an unusually good memory.

Also, time‒with respect to online information, anyway‒is cycling more quickly nowadays.  Sure, information can last a very long time online*, but that doesn’t really matter in a dispositive way, because there is a constant gusher of new and distracting information coming in at all times, with signal and noise intermingling haphazardly.  It’s a bit akin to the fact that although Manhattan is crowded with millions of people in a small area‒so you might think your life would be less private‒in many ways it is more private than other places, because when there, you are one indistinguishable face among those millions.

The internet makes Manhattan look like Mayberry.

Still, it would be nice to be able to get my words and maybe my stories and maybe even my music out to more people who might find them interesting and/or entertaining.  It’s not immortality in anything like a literal sense‒nothing is‒but still, there’s at least some little internal drive to spread the memetic code of me out in the world.  I could think of myself like the fruiting body of a fungus, spewing the spores of my thoughts into the wind, seeing if they’ll be able to infest and infect any other people out there.

It might at least be interesting to give someone the psychological equivalent of a persistent, itchy rash thanks to my words.

Of course, the fungus metaphor is an ironic one for me, since I cannot stand mushrooms for eating, and even the smell of wild mushrooms after damp weather (or mildew for that matter) fills me with literal nausea.  Then again, given my own poor opinion of myself, maybe it’s right for me to think of myself and my ideas (my celium, perhaps…get it?) that way.

Fungi don’t have any qualifications or requirements to meet other than survival and reproduction.  And that’s very much the nature of online information exchange; the stuff that spreads most isn’t the “brilliant” or the “important”, it’s just the catchiest.  The whole process is stochastic.

Of course, there is an entire ecosystem of such meme-plexes, and they vary in their tendencies to spread quickly versus being more long-lasting.  Like the species in a rainforest, some spread and germinate and reproduce quickly, but then quickly die, with short, frequently repeating cycles; others spread and grow more slowly, some perhaps becoming the mighty trees that dominate the structure of the forest, but which perforce have longer, slower growth and death cycles.  And, of course, the various other plants (and animals) in the jungle create and are each other’s environment.  There are even parasitic plants, and opportunistic ones that germinate only after a fire.  It’s very complicated, and no one plant is crucial or eternal**, though they may think they are.

Am I pushing the metaphor too far?  I don’t think so.  I think it’s important for people to recognize that no one controls the internet, just as no one controls the economy, just as no one controls the ecosystem, just as no one controls the evolution of the universe itself.  Everything just happens thanks to the interactions of numerous smaller elements interacting according to local forces and pressures.

That’s enough for a Monday, I think.  I hope you all have a good one.


*But not forever, despite what anyone says.  I fear no contradiction here, because to prove me wrong, you would have to wait until forever had passed.  At which point, if you are right, I will gladly concede the issue.

**Unless you count microbes‒some of those could be considered immortal.  Of course, in a sense, since it all has one common ancestor somewhere, life as a whole could be considered just one gigantic, very long-lived (but probably not immortal) organism.

The year is dead; long live the year. Whatever.

It’s Tuesday, December 31st, 2024‒New Year’s Eve.  Of course, as I’ve written before, every day is the first day of a new year, in a trivial but nevertheless true way.  The day to mark the new year is an arbitrary choice.  We could have had the new year begin “officially” on the first of any month.  Indeed, we could have started it in the middle of a month, or perhaps on the winter solstice.  Or we could have twelve months of thirty days, leaving a five-day extra period which we could use as a long holiday and an official new year celebration, with an extra day every leap year.  We could do this around the winter solstice, or even around the summer one, like hobbits do.
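Just as a toy illustration of that twelve-times-thirty-plus-holidays scheme (the function and its return format are my own invention, not any real calendar standard):

```python
def fixed_calendar(day_of_year, leap=False):
    """Map a 1-based day of the year to a date in a calendar of
    twelve 30-day months, with days 361+ forming the year-end
    holiday period (five days, six in leap years)."""
    limit = 366 if leap else 365
    if not 1 <= day_of_year <= limit:
        raise ValueError("day_of_year out of range")
    if day_of_year <= 360:
        return ("month", (day_of_year - 1) // 30 + 1, (day_of_year - 1) % 30 + 1)
    return ("holiday", day_of_year - 360)

print(fixed_calendar(360))  # -> ('month', 12, 30)
print(fixed_calendar(365))  # -> ('holiday', 5)
```

Every ordinary date lands on the same weekday-free slot every year, which is part of the appeal of such fixed calendars.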

Oh, well.  It’s not as though people are going to collectively change, any more than they are going to go back to really celebrating holidays‒with most workplaces closed except for hospitals, police and fire stations, and the like, and with people spending the days with loved ones.  Once one business stays open on those extra days, its competitors (direct and indirect) will stay open, too, or suffer a disadvantage that may make them more likely to go out of business.

The world of commerce is red in tooth, claw, and debt, so after a while, only those who push every edge they can without hitting negative marginal returns will dominate.  And that will become the norm.

No one made it happen, no one planned it.  Everyone’s caught in the currents of chaos, but those able to use the flows to their advantage‒chaos surfing, as I call it‒will thrive, at least temporarily, even if they don’t realize why they are succeeding, which they usually don’t.

It’s similar with the workers:  once some small subgroup is willing to eschew holidays and to work longer hours, they will have advantages over other workers, at least as long as working more proves advantageous to them and their workplaces.  Soon, the marketplace of workers will skew toward people being willing to work longer hours in worse conditions, as long as it provides a relative, local advantage.  Those who cannot match this will fall by the wayside, perhaps becoming homeless, getting addicted to drugs, going to jail or prison‒self-destructing directly or indirectly.

This is not a conspiracy by employers or governments or anyone else.  No one is that clever, and they are all beset by their own local pressures and competitions.  Why else would the very wealthy do anything but sit back and eat ice cream until they die (figuratively speaking)?  They are no more happy or satisfied than most other people.

It’s analogous to the situation with trees and forests.  It takes a lot of effort and resources for trees to grow tall.  Why do they do it?  Because other trees do it, and any tree that doesn’t want the sun blocked out had better do the same*.  If all trees could agree somehow to stay short, they could all thrive and get adequate sunlight and nutrition and water and air at a fraction of their usual height and resource usage.

But once one tree grows taller, the arms race begins.  Such is the way of economies and ecologies.  They cannot be planned, they cannot even really be controlled or constrained (at least not without disastrous results).  At best, they can be “herded”.  That’s a metaphorical herding, by the way‒a careful nudging of things to keep the eddies in the phase space currents from driving the system toward deteriorating returns, along whatever axes one may use to measure such things.

None of this happens due to some malicious plot, and it is not generally evil.  This competitive jockeying and self-abnegation while seeking seemingly locally selfish ends, or at least responding to local pressures (internal and external), has led to all the many scientific and technological advances that we have, from improved farming techniques that allow the world to sustain billions, to better healthcare, better sanitation, better transportation, greater safety from the elements, greater understanding of the universe at large…and the sometimes-cesspool that is the world of electronics, computers, smartphones, tablets, and digital interactions.

No, this shit all just happens “on its own”.  Natural selection works in places other than biology, and it is a ruthless, blind, and amoral driver, here in the region of spacetime where increasing entropy is in the stage where that increase leads to local complexity rather than uniformity.

Whether or not the local manifestation of it will last long remains to be seen.  There are many ways for any particular state of a system to be obliterated, or for that system itself to decay and disintegrate.  It requires constant effort to maintain anything like homeostasis and growth, but not just any effort will do.  One must constantly reassess, course correct, look for mistakes from which to learn, adapt to all the new, varying states of the system, or perish.

I don’t know about you, but I’m very unsure that it’s worth it.  In all honesty, I did not want to see 2025, and really, I still don’t.  I want to find the courage just to check out.  There’s very little for me here, and of all the things in the world that frustrate and irritate and disgust me, I’m the worst.

I guess if I write a blog post on Thursday, you’ll know that I am still around to see 2025.  If so, please don’t congratulate me.  It is not a good thing; it’s yet another failure in my long string of them.

Anyway, I hope you all have a Happy New Year.


*I’m anthropomorphizing here, but don’t get confused.  The individual trees don’t get to choose, evolution just favors the tall in this situation, ceteris paribus.

If I could write the beauty of your blogs, and in fresh numbers number all your graces…

Hello.  Good morning.

Aaahhh, doesn’t that feel better?  Now I can use my standard Thursday blog post opening phrases, because today is, in fact, Thursday.  It’s the 21st of November, the third Thursday of the month, so in the USA you only have seven shopping days until Thanksgiving.

Speaking of Thanksgiving, since next Thursday is that holiday, I probably will not be writing a blog post then.  It is one holiday on which our office is always closed.  We will be open on so-called Black Friday, but I can’t guarantee that I’ll write a post on that day.

Of course, in principle, I cannot guarantee that I’ll write anything at all ever again after this post.  I may not even survive to post this entry*‒I am in the back seat of a Lyft on I-95, on the East Coast of the US, so goodness knows there’s a non-zero chance of a fatal accident.  I would even wish for one, but I know such a thing would involve harm and possibly death to other, more innocent, people.

Also, of course, wishes don’t actually directly affect reality‒thank goodness.  Imagine if even one percent of wishes came true as wished.  The world would be thoroughgoing chaos…and not in a good way.  I tend to say of wishes that “If wishes were horses, then we’d all be hip deep in horse shit,” but it would be even more terrifying if wishes worked.

The “if‒then” character of the wishes saying (my version or the more SFW one that involves beggars riding) often makes me think of lines of computer code in some generic programming language, like:

if wishes == horses: beggars.ride()

Or maybe 

if wishes == horses: horseshit_level = "hip deep"

I wonder what that would look like in machine language.  Or, I wonder what it would look like in straight binary.  Really, though, I know part of the answer to the latter piece of wondering:  it would look, to the naked eye, like a random string of ones and zeros, perhaps the tally of some very long record of flipping a coin and marking heads as 1 and tails as 0 (or vice versa).

Actually, of course, given a binary-based computer language, one can literally generate every possible computer program just by flipping an ever-increasing number of coins.  Or, to be honest, one can do it just by counting in binary:  0, 1, 10, 11, 100, 101, 110, 111, 1000…

This is why, if memory serves, computer science people and information theory people say that every program can literally be assigned (and described by) a number.  You could express that number in base ten if you wanted, to make it a bit more compact and familiar to the typical human.  Or, if you want to be more efficient and make conversion easier, you can use hexadecimal.  This is easier because a base-sixteen number system is more directly and easily converted to and from binary, since 16 is a power of 2 (2 to the 4th).
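A quick sketch of the counting idea in Python (nothing here is a real program encoding, of course; the point is only that counting enumerates every bit string, and that hex digits map to four bits each):

```python
# Counting in binary enumerates every possible bit string:
for n in range(9):
    print(f"{n}: {n:b}")   # 0, 1, 10, 11, 100, 101, 110, 111, 1000

# Hex <-> binary conversion is easy because 16 == 2**4,
# so each hex digit corresponds to exactly four bits:
n = 0b1101_0110
assert f"{n:x}" == "d6"
assert int("d6", 16) == 0b1101_0110
```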

Even the human genome, or any genome in fact, could be fairly readily expressed in binary.  The DNA code is a four-character language, so it wouldn’t take much work to make it binary‒two bits per nucleotide, however you chose to code it.  Then, each person’s genome would have a single, unique number.  That’s kind of interesting.

It would be a bit unwieldy as an ID number, of course.  The human genome is roughly 3 billion nucleotides long, which means it would be roughly 6 billion binary digits (AKA bits).  And since every ten bits corresponds to roughly three decimal digits (2^10 is 1024, which is very close to 10^3, aka 1000), a 6-billion-bit number should be roughly 2 billion decimal digits long (a bit less), making it much, much larger than the famously large number, a googol**.
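Here is a back-of-the-envelope version of that arithmetic, assuming two bits per nucleotide and a 3-billion-base genome (both round numbers; the particular bit encoding is my own arbitrary choice):

```python
import math

CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def genome_number(seq):
    """Treat a DNA string as one big number, two bits per base."""
    n = 0
    for base in seq:
        n = (n << 2) | CODE[base]
    return n

assert genome_number("ACGT") == 0b00011011  # == 27

bits = 3_000_000_000 * 2                    # ~6 billion bits
digits = math.ceil(bits * math.log10(2))    # ~1.8 billion decimal digits
print(bits, digits)
```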

It’s a big number.  This should give you at least some idea of just how unique each individual life form is at a fundamental level.  There are so many possible genomes that the expected time until the final heat death of the universe is unlikely to be long enough to have a randomly created duplication within the accessible cosmos.

Of course, within an infinite space‒which is the most probable truth about our universe as far as we can tell‒one will not only have every possible version that can exist, but will have infinite copies of every possible version.  Infinity makes things weird; I love it.

Of course, just as with the making of computer programs by simply counting in binary, the vast majority of genomes would not code for any lifeform in any kind of cellular environment, using any given kind of transcription code you might want (the one on Earth, found in essentially all creatures, uses three nucleotides‒a codon‒to code for a given amino acid in a protein, but that’s not all that DNA does).  Similarly, most of the counted-up programs would not run on any given computer language platform, because they would not code for any coherent and consistent set of instructions.

But even so, you would still, eventually, get every possible working program, or every possible life form in any given biological system if you could just keep counting.

On related matters, there are things like the halting problem and so on, but we won’t get into that today, interesting though it may be (and is).

It’s quite fascinating, when one is dealing with information theory (and computer science) how quickly one encounters numbers so vast that they dwarf everything within the actual universe.

Mind you, the maximum possible information‒related to the entropy‒carried within any bounded 3-D region is constrained by the surface area (in square Planck lengths) of a black hole with that size event horizon.  For our universe, roughly 93 billion light years across, I think that’s something like 10 to the 124th bits, or at least it’s that many Planck areas.  That’s quite a bit*** smaller than the number of possible genomes, though I have a sinking feeling that I’m underestimating the number.

And information, at least when instantiated, has “mass” in a sense, and the upper limit of the amount of information in a region of spacetime is delineated by the Bekenstein bound.  So there are only so many binary strings you can generate before you turn everything into a black hole.

Something like all that, anyway.

I may have been imprecise in some of what I said, but when you’re dealing with very large numbers, precision is only theoretically interesting.  For instance, we**** have found Pi to far more than the number of digits needed to calculate the circumference of the visible universe down to the Planck length.  It would take only about 40 digits of Pi to calculate that circumference to within the size of a hydrogen atom, and a hydrogen atom is only about 10^25 Planck lengths across, so we wouldn’t expect to need much more than 65 digits of Pi to get down to the Planck length.  But let’s be generous and use 100 digits.
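Those digit counts can be sanity-checked with logarithms.  The physical sizes below are order-of-magnitude assumptions on my part, which is why the results land a few digits under the round figures above:

```python
import math

universe_diameter_m = 8.8e26    # observable universe, approx.
hydrogen_diameter_m = 1.1e-10   # rough atomic scale
planck_length_m = 1.6e-35

# Digits of Pi needed to pin the circumference down to a given
# length scale is roughly log10(diameter / scale):
to_atom = math.log10(universe_diameter_m / hydrogen_diameter_m)
to_planck = math.log10(universe_diameter_m / planck_length_m)
print(round(to_atom), round(to_planck))  # about 37 and 62
```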

How many digits of Pi have actually been “discovered” by mathematicians?  Over 105 trillion digits.  Talk about angels dancing on the heads of pins!  It’s literally physically impossible, according to the laws of quantum mechanics, even to test by measurement whether that number precisely defines the ratio of any given circle’s circumference to its diameter.  One cannot, in principle, measure finely enough.

Still, it just goes to show that mathematics is vastly larger in scope than any instantiated, superficial reality.  Information is deeper than one might think…so to speak.  But, then, so are minds themselves, vastly deeper.

As Idris/the TARDIS asked in Doctor Who, Series 6, episode 4, “Are all people like this?  So much bigger on the inside?”  Yes, Idris, I suspect they are, even those people we don’t like and feel the urge to denigrate.

That’s enough for today, I think.  I’ve achieved nothing, really, other than write a Thursday blog post, but then again, that’s all I meant to do.  I hope you have among the better half of all the vast number of possible days available to you.

TTFN


*If you’re reading this, though, I clearly did survive.  I have mixed feelings about that.

**How much larger?  Soooo much larger that if you subtracted a googol of something from 10^1,800,000,000 of something, you would not change it to any extent measurable even by the most precise instruments humans have ever created.  And a googol is already something like 10 to the 20th times as large as the total estimated number of protons and neutrons in the accessible universe.

***No pun intended.

****Actually, I had nothing to do with it; it’s just the sort of “royal we”***** kind of thing everyone uses when discussing the accomplishments of humanity as a whole.

*****Not to be confused with royal wee.  That’s the sort of weird, niche thing one might find for sale in mason jars on the dark web.  Be careful if you’re into such things.  I wouldn’t buy it unless you’re sure of the source, so to speak.

My charity is outrage, life my shame, and in that shame still blog my sorrows’ rage.

Hello and good morning.

It’s Thursday, and it’s thus time for my now once again weekly blog post.  I hope you’re all pleased.

Before I go any further, does anyone out there know any way to reset the default font in Microsoft Word back to Calibri?  As I have mentioned before, I cannot stand the new Aptos font.  If I could send a terminator* back in time to kill the mother of the person who designed that font, I would be strongly tempted to do so.

But, wait, you might say.  Surely if I have access to terminator and time travel technologies, there must be other, less homicidal ways to change the basic font of a word processing program.  That may well be so, but violent matricide is all such a person deserves, I’m afraid.  Anything less would not convey the degree of my antipathy.  I’m inclined to say the entire family tree should be eliminated, but eventually the line of any living person intersects with the line of all people alive on the planet, so to wipe out the oldest ancestor would be to wipe out a common ancestor to all living humans, thus wiping out the whole human race.

Hey, wait, maybe that’s not such a bad thought.

While we’re at it, maybe we can go back over three billion years, to that warm little pond about which Darwin spoke, and spray some Lysol, thus aborting all life on this planet.  I suppose life might start randomly again somewhere else, even if one did such a thing.  After all, it happened pretty quickly once conditions became conducive, implying that it would not do just to wipe out the spot where the ancestors of all modern life began; it might instead be necessary literally to sterilize the whole planet.  But how do you do that if even the collision with Theia that is the presumed origin of the moon didn’t do it?

Still, while the origin of basic life seems to have been a strong or at least a rapid tendency, the formation of eukaryotes and then multicellular life seems to have been much harder, taking another two and a half to three billion years after the earliest life to evolve on the planet.  So maybe, if a different proto-life had formed, life would never have progressed beyond something like bacteria.

Okay, well, I think I’ve made it clear that I don’t like Aptos.  And now that I’ve finished the first draft of Extra Body, I think I may in future switch over to using Google Docs for my word processing.  I hate unnecessary change in the first place—such as all the tweaks and upgrades and nonsense that all the apps and systems are constantly enacting, and the changes in WordPress that nearly always make the platform less convenient—but when they are changes for the worse, I really cannot abide them.

What misguided notions led Microsoft to think that their weird little new font with its curlicues and malformations of letters would be an improvement?  Can entire software companies develop global degenerative neurological conditions?  Or is it just a matter of the second law of thermodynamics, ensuring that any local cleverness is an ephemeral exception?

Just look what’s happened to the United States.

Anyway, as I mentioned above, I have completed the first draft of Extra Body as of yesterday morning.  I did not write on Friday, because I really felt like crap, mentally.  I honestly suspected that my brain was crashing, experiencing a burgeoning system failure (speaking of degenerative neurological conditions).  But then, on Monday, Tuesday, and Wednesday mornings, I wrote a total of 5,599 words, bringing the final first-draft tally to 80,676 words, at 123 pages.

I don’t know if the tale is any good, but it’s certainly impressively long for something that was imagined as a short story.  I’m going to take a very brief break before I begin my intended draconian editing process, during which time I mean to transcribe what I’ve typed so far of HELIOS** into a spiral bound notebook so that when I get to the appropriate stage, I can just continue writing that first draft by hand.

Of course, this is all extremely speculative.  I don’t expect that it will come to fruition, because I know that I simply cannot survive as my life is and—more importantly—as I am.  In case you can’t tell, I’m constantly almost completely defined by tension and hostility (though I do my best never to allow them actually to be released unjustly; I may almost always wish to wipe out all life in the universe, but I almost never do it).  The world, the planet, the biosphere, what have you:  none of it seems natural to me, none of it seems good or beautiful or welcoming.

I feel like I’m already in some Lovecraftian otherverse, not just a stranger in a strange land but an alien entity in an alien universe, where there is not even an integer number of spatial or time dimensions.  I truly sympathize with Agent Smith in the original The Matrix, when he says, “I hate this place, this zoo, this prison, this reality, whatever you want to call it, I can’t stand it any longer.  It’s the smell—if there is such a thing—I feel saturated by it…”

Of course, I don’t think he was literally saying that it was solely the smell that bothered him.  This was merely the metaphor, the shorthand, the figurative focus of his antipathy.  The sense of smell is merely the most elemental, the oldest, the most direct sense, and it tends to elicit the most visceral responses.  Even bacteria can be said to “smell” the world.

Lest anyone be fooled, I want to make clear that it’s not politics and social dysfunction and the like that make me so antipathetic toward the world, though politics is pathetic and contemptible.  But politics—including dishonesty, hypocrisy, willful stupidity, delusion, political violence, and all such manifestations of primate dominance hierarchical jockeying—has always been pathetic and juvenile and worthy of sneers and nausea (as well as occasional mordant, contemptuous laughter).

Anyway, that’s about a thousand words in this post already.  I could go on and on spewing vitriol, but I don’t think it would make much difference.  I don’t know how I can possibly survive as I am, as things are.  More to the point, I don’t know why I would possibly survive as I am, as things are.

The world is disgusting, my life is almost entirely uncomfortable and frankly painful, and above all, I find myself disgusting.  I try to distract myself with writing, and with some music, and with studying physics and mathematics and languages, using various books and apps and so on.  I even pretend I have friends by watching YouTube videos of people reacting to songs and movies I like.  But nothing is fun.  And none of my chronic pain and sensory issues have improved.  And don’t even get me started on insomnia!

Oddly enough, I think I would feel less alone if I were truly the only person on the planet, or if I were a castaway on an island.  Perhaps I’m wrong, of course; that is purely speculation.  But it feels like it would be the case, and that’s not a good feeling.

Well, I hope (and suspect) that most of you are doing and feeling better than I am.  That almost has to be a good thing.  Please take care of each other and yourselves.  Despite all the people and things I feel that I might wish didn’t exist, or that could be obliterated, you are among the rare few to whom that doesn’t apply.

TTFN


*As in the movies created by James Cameron, not the line that separates night and day on an astronomical body illuminated by a star.

**A little less than 3,000 words.

Numbers of words and words of thoughts and thoughts of consciousnesses

Since I came up with the idea and mentioned it in my blog on Saturday, I could not fail to put it into practice:  counting the words in the new “block” of fiction writing I did today, while also tracking the change in the total word count, so as to compare the two.  This was especially apt, since, on rereading what I had written on Saturday, I realized that I had started a conversation between two characters rather abruptly, and so I added a more natural beginning to that interaction while I was editing.

This didn’t have as big an impact as it might have, since I also pruned things slightly while rereading.  In any case, I kept track of the net total word change and the word count in the new block of writing, and those numbers are:  1,228 words in the new block written today, but a net increase in word count of 1,264.

I don’t know how representative this is of the typical disparity, but it’s less than a 3% difference whether you use the larger or the smaller number as your denominator, so it’s not huge.  Still, I’ll probably keep this up, at least for a while.
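For anyone who wants the arithmetic spelled out, here it is as a trivial bit of Python (just the numbers from the post, nothing more):

```python
new_block = 1228   # words in the freshly written block
net_change = 1264  # net increase in the manuscript's total word count

# Words the total grew beyond the new block itself, i.e. net additions
# (minus prunings) made while rereading and editing the earlier text.
edit_delta = net_change - new_block

pct_vs_small = 100 * edit_delta / new_block    # using the smaller denominator
pct_vs_large = 100 * edit_delta / net_change   # using the larger denominator

print(edit_delta, round(pct_vs_small, 2), round(pct_vs_large, 2))  # → 36 2.93 2.85
```

Either way you slice it, the disparity is under three percent.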

After I had finished writing and gotten up to get ready to get off the train, I had a weird train (ha ha) of thought that led from me thinking about the fact that one can no longer readily stream series A through I of the British show QI in the US, to how I had needed to order the DVDs for those seasons through Amazon UK, which I did quite some time ago.  This led me to think about the shipping process, and how seamless and rapid it had been–it was not as fast as ordering something that’s sourced locally, but nevertheless it was impressively rapid.

And I thought of the various people involved, and how not one of them had been aware of the whole process from beginning to end, and indeed, possibly not one of them had thought about what was being sent and to where.  Each part of the process was more or less automated, or at least occurred “locally”, in a phase-space sense*.  And yet, the whole has become a process that takes place with remarkable efficiency, despite no member of the chain of the process really knowing too much beyond their own part of the job.

And I thought, the whole economy is like this, locally, nationally, and globally.  Indeed, all of civilization is like this; everyone simply acts in response to local forces and events and incentives and disincentives, and the process turns into a self-sustained, much larger entity that has not been created by anyone, and is certainly not run by anyone (any more than a bee hive or an ant hill is “run” by the queen insect).  Nor should it be, since no human mind is capable even of grasping very precisely and in detail anything beyond a tiny part of the thing itself–this is probably part of why “planned economies” always fail, and until there is a super-intelligent AI (and perhaps even then) they always will.  It’s like trying to put one single nerve cell in charge of the entire human brain and body.  It simply doesn’t have the capacity to do such a thing.  When one nerve cell’s activity spreads with relatively little impediment through the brain, you get what we call a seizure.

Anyway, all that led me to thinking about whether it would ever be possible for a civilization, in the aggregate, to become truly sentient and self-aware.  I don’t mean that the members are self-aware; obviously they are already (at least some of them, and to varying degrees).  I mean, could the civilization as a whole develop self-awareness, develop what the philosophers of mind call “qualia”?

Our civilization is probably far too small to instantiate such a thing, currently.  There are, after all, “only” about 8 billion humans on Earth, compared to, for instance, the roughly one hundred billion neurons in each individual human brain (mileage may vary) and tens of trillions of cells in an entire human body.  But perhaps, someday, if a civilization becomes large enough and remains interconnected enough, the lights may come on, so to speak–actually it would probably be a gradual process, rather like those European, “energy-saving” lights; it’s unlikely to be an instantaneous change.  But it could, in principle, happen.

Of course, those who espouse the so-called Hard Problem of Consciousness™ might say that it could never happen, that qualia, that true consciousness, requires some other ingredient or process.  I’ve never encountered an argument from any of them that impresses me, though.  Even Roger Penrose’s ideas about quantum mechanical processes being necessary for human consciousness–in denial of the Church-Turing Thesis and related ideas of universal computation–seem to me to be pure motivated reasoning, albeit by one of the great minds of the modern world, so his ideas are still worth exploring.  Even when he’s wrong, Penrose’s thought is more fruitful than that of the vast majority of people when they’re right, yours truly included.

I’ve arrived at no conclusions, of course.  It was just an interesting mental diversion that I thought I would share with you readers, since I have no one else with whom to share such things.  If any of you have any thoughts or ideas about them, please feel free to leave a comment below, here on my blog proper, not on other social media–I would prefer a forum in which other people who read comments on my blog could comment, too, and that’s not likely to happen on Facebook or on “the site formerly known as Twitter”.

Okay, that’s it for today.  I’m not going to edit this much before posting, so apologies if there is any persistently awkward wording or if there are any unnoticed typos.  Have a good “Not Memorial Day” day**.


*Of course, everything in the universe behaves locally–even quantum entanglement is “local” in a very specific sense.  Even gravity is local–the local gravitational “field” responds to the state of the nearby gravitational field, not literally to distant objects, which is part of why gravity can “escape” from black holes.  The larger-scale laws of nature emerge “spontaneously” from all these tiny, local interactions, or so it seems based on the best information I have.

**I mistakenly thought today was going to be Memorial Day because people at work kept talking as if it were.  However, that holiday is next Monday.  Sorry if I confused anyone, and thank you to my cousin for pointing it out to me.

And writers say, the most forward bud is eaten by the canker ere it blog

Hello and good morning.

It’s Thursday, so it’s time for my Thursday blog post.  There will be no fiction from me today, other than such ordinary, day-to-day fiction as pretending to be doing better than I really am, as well as using money to buy things*.

I’m writing this on my phone, since I didn’t bring the laptop computer back to the house yesterday.  I was wiped out, and stressed out, and I didn’t feel like carrying any more than necessary.  I did get a bit of walking in, since I had to stop at the store on the way back.  I guess that was good, though something in the way I moved caused a blister on the medial side of my right big toe.  It’s not too bad, but I’ll probably not do any serious walking today.

It’s often questionable why I bother.  Of course, I would like to lose weight and whatnot; I would rather not die the physical travesty that I currently am.  But the best way to do that would be to stop eating completely.  That would be a win-win situation, as the cliché goes.  But that is very difficult to do in ordinary, day-to-day life in the modern United States.

I got a terrible night’s sleep again last night.  It wasn’t as bad as my one-hour night earlier in the week, but it wasn’t a whole lot better.  I’ve been trying to restrict my caffeine intake to the relatively early morning, just to make sure that doesn’t interfere with my sleep, but it doesn’t seem to make much difference.

I haven’t read anything much in quite a while.  I think it’s been over a month since I read any book, fiction or nonfiction.  I have been doing some stuff on Brilliant dot org, as I’ve mentioned here, but yesterday I didn’t even feel like extending my “streak” by doing some simple work in their computer programming course.  For one thing, the constant prods to “extend one’s streak” are thoroughly irritating.

I really despise all the manipulative tactics undertaken by these companies to get people to keep using their sites.  Even Kindle does it.  I had a “streak” of something like 170 or more weeks of reading pretty much every day on my Kindle app, but that’s now been broken, and already Amazon isn’t even recommending any e-books to me.

Still, it’s not as though I ever read to maintain a “streak”.  I read because I want to read.  Except right now I don’t.  I don’t even want to read my own stuff.

I did practice a little on the guitar yesterday.  I guess that’s something.  And, as you all know, I’ve been writing fiction now for a total of over twenty days (counting only writing days).  But it feels almost disloyal to be writing without reading, though it’s only myself that I’m betraying, and I don’t like myself, anyway.  Still, reading has been a fundamental part of my identity for literally as long as I can remember, and not being able to do it makes me feel very much adrift and puzzled.

It’s getting seriously hot and muggy down here in Florida.  I’m sweating significantly and quite visibly just sitting at the train station.  I suppose, if climate change persists, Florida will at least reap what it has politically sown, since both the heat and the sea levels are likely to drive quite a lot of people out of the state, and make much of the coveted ocean-front property into literal and figurative underwater real estate.

I’m not the sort to laugh in malicious glee when people get their comeuppances; I’m much more the type to tighten my lips grimly and nod in affirmative contempt.  But that doesn’t mean it’s not ego-syntonic for me when people get fucked over because of their own arrogant stupidity.

I don’t expect to be around to see any of it happen.  And, honestly, I would not be disappointed if people actually make headway at fixing the problems and correct them in time to save even people who don’t necessarily deserve to be saved, because innocent and beneficent people will be saved along with them.

Human ingenuity is much rarer than people probably think; however, it is so incredibly powerful that it doesn’t take much of it to accomplish wonders.  I guess it’s worth it for there to be so much arrogant stupidity if that’s necessary or unavoidable in order for the occasional sparks of cleverness and even genius to be found.  But it would be nice if stupidity were more sexually unappealing than it is.  Regrettably, though, stupid people seem more likely to breed than smart ones, especially since the smart ones understand about planning and delaying reproduction, or even choosing not to reproduce at all.

Oh, well.  This is the tragic farce of life.  It can be funny if you like lowbrow slapstick in the vein of the Three Stooges.  Unfortunately, I’m not really a big fan of such things, so I don’t think I’m going to keep watching much longer.

All right.  Time to call this to an end.  My back is flaring up quite a lot, probably from yesterday’s walk, and it’s distracting me.  Please try to nurture cleverness and creativity at all levels, and please don’t feed the trolls in any sense.  They’re not worth it.

TTFN


*Yuval Harari famously pointed out that money is a “fiction”, though it is a useful and important one.  So are law and government and the very existence of rights and stuff like that.  Such things exist only in the minds and works of people.  Nature certainly recognizes no rights, unless you want to count the right to be wiped out if you don’t do what you need to survive.  Indeed, the world seems to promise only one thing:  eventually, you (as well as everything you would recognize as the universe) will die.  That’s probably a truly unalienable right.

There may be no firm fundament but is there a fun firmament?

It’s Tuesday morning, now, and I’m writing this on my laptop computer, mainly to spare my thumbs, but also because I just prefer real typing to the constrictive and error-ridden twiddling of virtual buttons on a very small phone screen.

Speaking of the day, if the Beatles song Lady Madonna is correct, then it’s still Tuesday afternoon, and has been at least since last Tuesday, since “Tuesday afternoon is never-ending”.  Of course, if Tuesday afternoon really is never-ending, then it has been Tuesday afternoon ever since the first Tuesday afternoon.  From a certain point of view, this is trivially the case.  After all, every moment after 12pm on the first Tuesday that ever happened could be considered Tuesday afternoon—or, at least, they could be considered “after Tuesday noon” if you will.

Enough of that particular nonsense.  I only wrote that because there’s nothing sensible about which to write that comes to my mind.  But, of course, in a larger sense, there is nothing “sensible” at all.

There are things that can be sensed, obviously.  I can see, hear, and touch this computer, for instance.  If I wanted, I could probably smell it, though I think its odor is likely quite subdued.  But I mean “sensible” in the more colloquial, bastardized, mutated sense—as in the word “sense” just there—which has to do with something being logical, reasonable, rational, coherent, that sort of thing.  Indeed, it has to do with things having meaning.

Deep down, though, from the telos point of view, there is no true, inherent meaning to much of anything, as far as anyone can see.  Certainly I have never encountered, at any point in my life, any meaning that anyone has demonstrated or asserted convincingly.

Of course, people have beliefs and they have convictions, and humans assign meanings to various things.  All the words I have used in writing this post so far, and all the words I will use henceforth, have “meanings”, but those are invented meanings.  There is nothing in the collection of letters—nor indeed in the shapes of the letters themselves, nor the way we put them down on paper or a screen—that means anything intrinsically.  They were all invented, like justice and morality and the whole lot of such things.

That something is invented doesn’t mean it isn’t real, of course.  Cars are an invention, and only a fool (in the modern world) would deny that cars are real.  But they are not inherent to the universe; they are not in any sense fundamental.

In a related sense, even DNA and the protein structures for which it codes are very much not fundamental; they are quasi-arbitrary.  Of course, one cannot make DNA or RNA or proteins out of substrates for which the chemistry simply will not hold together.  But the genetic code—the set of three-nucleotide-long “letters”, the codons, each of which is associated with a given amino acid (or a stop signal, or similar) as genes are translated into proteins—is arbitrary.  There’s nothing inherent in any set of three nucleotides that makes it associate with some particular amino acid.
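To make the “it’s just a keyed association” point concrete, here is a toy sketch in Python using a small fragment of the real standard genetic code.  (The helper function and its names are mine, purely illustrative; a real ribosome, of course, is not a dictionary.)

```python
# A fragment of the standard genetic code as a plain lookup table, to
# illustrate that the codon-to-amino-acid mapping is just keyed data,
# like ASCII: nothing chemical in "ATG" itself means "methionine".
CODON_TABLE = {
    "ATG": "Met",  # methionine; also the usual start signal
    "TGG": "Trp",  # tryptophan
    "GCT": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",  # redundant codons
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Read a DNA string codon by codon until a stop (or unknown) codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE.get(dna[i:i + 3], "?")
        if residue == "STOP":
            break
        protein.append(residue)
    return "-".join(protein)

print(translate("ATGGCTTGGTAA"))  # → Met-Ala-Trp
```

Swap the values in that table and the same DNA string would “mean” an entirely different protein, which is exactly the sense in which the code is arbitrary.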

This sort of thing took me quite a long time to realize as I was growing up and trying to understand biology and chemistry and such.  What, for instance, was the chemical reaction with, say, adrenaline that made things in the body speed up and go into “fight or flight” mode, as it were?  How was it that aspirin chemically interacted with bodies and nervous systems to blunt pain?  How many possible chemical reactions were there, really?  It was mind-boggling that there could be so many reactions, and that they could all produce such disparate effects on various creatures.

When finally I was shown the real nature of such things, it was definitely a scales-dropping-from-eyes moment.  There is nothing inherent in the chemistry of DNA, or of drugs or hormones, that produces their effects.  There is no inherent “soporific” quality to an anesthetic.  You could give some alien with a different biology a dose of Versed that would kill a human, and at most its effects would be those of a contaminant.

It’s all just a kind of language—indeed, it’s almost a kind of computer language, and hormones are just messengers*, which are more or less arbitrary, like the ASCII code for representing characters within computer systems.  Likewise, there’s nothing in the word “cat” that has direct connection with the animal to which it refers.  It’s just keyed to that creature in our minds, arbitrarily, as is demonstrated by the fact that, for instance, in Japan the term is “neko” (or, well, it sounds like that—the actual written term is ねこ or 猫).

Of course, there are things in the universe that, as far as we can tell, are fundamental, such as quantum fields and gravity and spacetime itself.  But even these may yet peel away and be revealed to be arbitrary or semi-arbitrary forms of some other, deeper, underlying unity, as is postulated in string theory, for instance.

The specific forms of the fundamental particles and forces in our universe may—if string theory and eternal inflationary cosmology, for instance, are correct—be just one possible version of a potential 10^500 or more** possible sets of particles and forces, determined by the particular Calabi-Yau “shape” and configuration of the curled-up extra dimensions of space that string theory hypothesizes.  So, the very fundamental forces of nature, or at least the “constants” thereof, may be arbitrary—historical accidents, as much as are the forms and specifics of the life that currently exists on Earth.

And what’s to say that strings and branes and Calabi-Yau manifolds are fundamental, either?  Perhaps reality has no fundament whatsoever.  Perhaps it is a bottomless pit of meaninglessness, in which only truly fundamental mathematics are consistent throughout…if even they are.

I’m not likely to arrive at a conclusion regarding these matters in a blog post written off-the-cuff in the morning while commuting to the office.  But I guess it all supports a would-be Stoic philosophical ideal, which urges us to let go of things that are outside our control and instead try to focus on those things over which we have some power:  our thoughts and our actions.

Of course, even these are, at some deeper level, not truly or at least not fully ours to control—we cannot affect the past that led to our present state, after all, and the future is born of that present which is born of that past over which we have no control.  But, for practical purposes, the levers that we use to control ourselves are the only levers we have to use.

We might as well keep a grip on them as well as we can, and not worry too much about things that are not in our current reach.  Though we can try to stretch out and limber up, maybe practice some mental yoga, to try to extend that reach over time, I suppose.  But that’s a subject for some other blog post, I guess; this one has already gone on long enough.


*For the most part.  Things like cholesterol and fatty acids and sugars—and certainly water and oxygen—and other fundamental building blocks do have inherent chemical properties that make them useful for the purposes to which bodies put them.  Then again, words can have tendencies that make them more useful for some things than others, too.  “No” and “yes” are short and clear and clearly different sounds, for instance; it makes sense that such words evolved to be such important, fundamentally dichotomous signals.

**That means 10 x 10 x 10 x 10… until you’ve done that multiplication 500 times.  You may know that a “googol” is a mere 10^100, and that in itself is already roughly 20 orders of magnitude (100,000,000,000,000,000,000 times!) larger than the number of protons and neutrons estimated to exist in the visible universe.  So 10^500 is a number far vaster than could ever be written out within the confines of the universe that we can ever see.  There’s not enough space, let alone enough matter, with which to write it.  It’s a googol times a googol times a googol times a googol times a googol!
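Since everything in that footnote is a power of ten, the claims can be checked exactly with integer arithmetic (a quick illustrative check of my own, using the usual rough estimate of ~10^80 nucleons):

```python
googol = 10 ** 100
landscape = 10 ** 500   # the string-landscape estimate
nucleons = 10 ** 80     # rough count of protons + neutrons in the visible universe

# A googol is 10^20 times the nucleon count: the "roughly 20 orders of magnitude".
print(len(str(googol // nucleons)) - 1)   # → 20

# And 10^500 really is a googol multiplied by itself five times over.
print(landscape == googol ** 5)           # → True
```

Python’s arbitrary-precision integers make this kind of sanity check trivial, even for numbers that could never be written out in the physical universe.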