Since yesterday was Monday, the 30th of June, it’s almost inevitable that today would be Tuesday, the 1st of July. And, in fact, that is the case, unless I am wildly mistaken.
If I were to be wildly mistaken about such a thing, it’s rather interesting to consider just how I could come to be so wildly mistaken about something so prosaic and so reliably consistent. It is from such speculations that—sometimes—ideas for stories begin.
This is not one of those times, however. I’m not thinking about any kind of story related to that notion at all, though at times I might consider it an interesting jumping-off point for some supernatural horror tale. If any of you find yourselves inspired to write a story—of any kind—based on my opening “question”, you should feel free to write that story. I, at least, will give you no trouble.
These sorts of thoughts also remind me of a post that Eliezer Yudkowsky wrote, and which also appeared as a section in his book Rationality: From AI to Zombies. I won’t try to recapitulate his entire argument, since he makes it quite well, but it was basically a response to someone who had said or written that, while they considered it reasonable to have an open mind, they couldn’t even imagine the sort of argument or situation that could convince them that 2 + 2, for instance, was not 4 but was instead, say, 3.
Yudkowsky, however, said that it was quite straightforward what sort of evidence could make him believe that 2 + 2 = 3; it would be the same kind of evidence that had convinced him that 2 + 2 = 4. In other words, if it began to be the case that, whenever he had two of a thing and added two more, the subsequent count always came to three, he might be puzzled at first. But after a while, assuming the change and all its consequences were consistent with every other form of counting, he would eventually just internalize it. He might wonder how he had been so obviously mistaken for so long with the whole “4” thing, but that would do it.
This argument makes sense, and it raises an important point related to what I said last week about dogmatic thinking. One should always, at least in principle, be open to reexamining one’s conclusions, and even one’s convictions, if new evidence and/or reasoning comes to bear.
That doesn’t mean that all ideas are equally up for grabs. As Jefferson pointed out about governments in the Declaration of Independence, things that are long established and have endured successfully shouldn’t be cast aside for light and transient causes.
So, for instance, if you’ve come to the moral conclusion that it’s not right to steal from other people, and you’re pretty comfortable with that conclusion, you don’t need to doubt yourself significantly every time someone tries to justify their own personal malfeasance. Most such justifications will be little more than excuse-making. However, if you should encounter a new argument or new data or what have you* that really seems to contradict your conclusion, it would be unreasonable not to reexamine that conclusion at least, and to try to do so rigorously and honestly.
There are certain purely logical conclusions that will be definitively true given the axioms of a particular system, such as “If A = B and B = C then A = C”, and these can be considered reasonably unassailable. But it still wouldn’t be foolish to give ear if some reasonable and intelligent and appropriately skilled person says they think they have a disproof of even that. They may be wrong, but as John Stuart Mill pointed out, listening to arguments against your beliefs is a good way to sharpen your own understanding of those beliefs.
For instance, how certain are you that the Earth is round, not flat? How well do you know why the evidence is so conclusive? Could you explain how even the ancient Greeks and their contemporaries could already tell that the Earth was round?
How sure are you that your political “opponents” are incorrect in their ideas and ideals? Have you considered their points of view in any form other than sound bites and tweets and memes shared on social media, usually by people with whom you already agree? Can you consider your opponents’ points of view not merely with an eye to puncturing them, but with an eye to understanding them?
Even if there’s no real chance that you’ll agree with them, it’s fair to recognize that almost no one comes to their personal convictions for no reason whatsoever, or purely out of perversity or malice. At the very least, compassion (which I also wrote a little bit about last week) should dictate trying to recognize and consider why other people think the way they do.
Sometimes, if for no other reason, it is through understanding how someone came to their personal beliefs that one can best see how to persuade them to change those beliefs (assuming you are not swayed by their point of view).
This is a high bar to set when it comes to public reasonableness, I know, but I think it’s worth seeking that level. Why aim to be anything less than the best we can strive to be, as individuals and as societies? We may never quite reach our ideals, but we may at least be able to approach them asymptotically. It seems worth the effort.
But I could be wrong.

*I don’t have any idea what such an argument or such evidence would be, but that’s part of the point. Presumably, if I were being intellectually honest, and someone raised such a new argument, I would recognize it for what it was.
