Hello and good morning. It’s Thursday yet again, and I’m writing my more traditional blog post, but for those of you who weren’t expecting them, and so did not look, you should know that I also wrote posts on Monday, Tuesday, and Wednesday. I had no drive or desire to write any fiction; it has felt utterly pointless to do so all week. Everything pretty much feels pointless.
I did spend a bit of time yesterday sharing all my blog post links to my published works‒not counting music‒to X and LinkedIn and Facebook. I don’t know if many people saw them, though my sister did leave a comment on Facebook on the shared link for Hole for a Heart, stating that it is one of my scariest stories. Thankfully, it was intended to be scary, so that’s quite a good compliment. If I’d meant it to be a light-hearted children’s fairy tale and it was one of my scariest stories, that would have been troubling.
I noticed early on in my writing that I tend to put horror elements into a lot of my work. For instance, in Ends of the Maelstrom, my lost work from my teenage years‒which was an overlap of science fiction and fantasy‒I ended up having quite a few sequences that followed a large and powerful (and quite mad) cy-goyle named Chrayd, who was basically a horror monster, and whose actions didn’t directly push the plot forward. His portions of the book were clearly little horror stories.
Also, my son read the original second chapter of The Chasm and the Collision (which became the second half of the first chapter, “A Fruitful Day and a Frightful Night”) back when he was, I guess, about 11 or 12, and he said specifically that it was scary. Of course, it was meant to be scary for the main character‒I did call it “A Frightful Night”, after all‒but I guess I did a good job of conveying Alex’s fear and making it at least slightly contagious.
I feel that at least some portions of Outlaw’s Mind ought to be quite scary‒it’s certainly meant to be a horror story‒but that may just be because I know what’s happening, and because at least some of the events of the story were inspired by one of my two experiences of sleep paralysis (which is a truly frightening thing).
Of course, the two stories that are currently on my burners are not horror stories at all. One is sort of a whimsical, light science fiction tale (set in the “ordinary” world), and the other is a more “light-novel” science fiction adventure, possibly good for young adults, based on a comic book I had long ago envisioned. I’m sure I will throw some horror elements into the latter by accident‒it seems to be how my writing works‒but they won’t be any primary part of it.
Here I am writing as if any of those stories will be published and read by people. Isn’t it cute?
One good thing about writing horror is that there is no reason to have any “trigger warnings”. If you’re the sort of person who needs trigger warnings, you probably shouldn’t be reading horror stories. I admit, though, that a few of my works probably merit greater-than-average caution; I’m thinking most specifically of Solitaire and both parts of Unanimity. These are stories in which some quite “realistic” horrors take place‒things that could, in principle, happen in the real world.
Not that Unanimity itself could happen in the real world. It couldn’t. But many of the things done in the book that are horrific are possible and even realistic in a sense.
As for Solitaire, well…yeah, there’s nothing supernatural there at all. It’s an entirely realistic story, probably too much so. It’s short though, so a potential reader wouldn’t be troubled for long. Still, that story is probably for “grown-ups” only. Yet, as I’ve noted before, I wrote the story, all in one night, while I was in a perfectly good mood, keeping my then-future-fiancée company while she worked overnight on a project.
It’s curious to think about where these ideas originate and how they arise.
Even if we ever have a full description of the workings of a human brain, I doubt it will ever be possible to model, predictively and precisely, the specific outputs of any given one. There are hundreds of trillions to a quadrillion synapses in a typical (or even divergent) brain, and those synapses are not simple AND, OR, XOR, NOT, NAND or other basic binary logic gates. Their connections are almost continuously variable, and the reactivity and set-points can vary over time as well, in response to intracellular and extracellular conditions.
A quadrillion-bit system would never be close to big enough to model a human brain, even if we knew how to write the program. And the possible outcomes of different processes in such a system would rapidly grow to numbers so vast that, by comparison, they make the number of cubic Planck lengths in the accessible universe vanishingly close to zero.
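For scale, here’s a rough back-of-the-envelope sketch in Python. All the figures in it are order-of-magnitude assumptions (a quadrillion synapses generously treated as single bits, the usual rough estimates for the observable universe’s volume and the Planck volume), not measurements:

```python
import math

# Rough, order-of-magnitude assumptions:
SYNAPSES = 1e15           # ~a quadrillion synapses, generously treated as 1 bit each
UNIVERSE_VOLUME = 4e80    # observable universe volume, cubic meters (approx.)
PLANCK_VOLUME = 4.2e-105  # one cubic Planck length, cubic meters (approx.)

# A quadrillion-bit system has 2**(1e15) distinct states; we work in
# log10 because the number itself won't fit in any float.
log10_states = SYNAPSES * math.log10(2)

# Number of cubic Planck lengths in the observable universe:
log10_planck_volumes = math.log10(UNIVERSE_VOLUME / PLANCK_VOLUME)

print(f"possible states: ~10^{log10_states:.3g}")         # ~10^(3.01e14)
print(f"Planck volumes:  ~10^{log10_planck_volumes:.0f}")  # ~10^185
```

So even a bare-bits caricature of the brain has on the order of 10^(300,000,000,000,000) states, against a mere ~10^185 Planck volumes in the whole observable universe‒which is the sense in which the latter is “vanishingly close to zero” next to the former.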
As for “neural networks”, well, don’t let the name fool you too much. They aren’t really modeling neurons, or even acting very much like them. I mean, they are super-cool*, don’t get me wrong! But I suspect that none of them, at least not by itself, will ever be a true AGI, not without also incorporating some analog of basal ganglia, limbic systems, and brain stems‒in other words, drives and motivations (general and partly alterable utility functions).
It’s also a concern (mainly orthogonal to the above) that, as more and more of what’s out there on the anti-social webernet is produced by LLM-based chat programs, those programs will increasingly model their future responses not on things created by humans but on the output of previous GPT-style bots. They will more and more model only themselves‒a kind of solipsistic spiral that could rapidly degenerate into a huge, steaming pile of crap.
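You can see the shape of that spiral in a toy simulation‒purely a sketch, with no resemblance to any real training pipeline. Repeatedly fit a simple distribution to a filtered sample of its own output (the filter standing in, by assumption, for a model favoring its own most typical productions), and the diversity collapses:

```python
import random
import statistics

random.seed(0)

# Generation zero: "human" data, with a healthy spread.
data = [random.gauss(0.0, 1.0) for _ in range(5000)]

for generation in range(10):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    # The next "model" is trained only on the previous generation's output...
    samples = [random.gauss(mu, sigma) for _ in range(5000)]
    # ...and keeping only samples within one standard deviation of the
    # mean stands in for the model preferring its own most typical
    # outputs; the tails get shaved off every round.
    data = [x for x in samples if abs(x - mu) < sigma]

final_sigma = statistics.stdev(data)
print(f"spread after 10 generations: {final_sigma:.4f}")
```

The spread shrinks by a constant factor every generation (a normal distribution truncated at one standard deviation has a spread of roughly half the original), so after ten rounds almost nothing but the mean is left‒a steaming pile of the same thing over and over.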
Of course, the programmers are clever, and they may well find ways to circumvent such issues. I suppose we shall see what happens, unless civilization fails and falls before that comes to pass.
Wow, all that was a curious course of thought, wasn’t it? I certainly neither planned for nor predicted it. It just happened (like everything else).
As for what will happen for the rest of the week, well, I’m far from sure and can’t even give a very good guess. I may write blog posts tomorrow and Saturday, or I may write fiction, or I may do neither. I may take a long walk off a short pier, literally or metaphorically. If Hugh Everett was right, there will probably be some versions of me “somewhere” who take every possible action.
In the meantime, I sincerely hope that the only possible Everettian branches in your futures are ones in which you are happy.
TTFN
*Though at least most of them don’t need literally to be supercooled, unlike most modern quantum computing systems.
