Archives for category: Anatomy of a Thinker

One of the things that happens to all of us is getting rooted in a static perspective. It is sometimes difficult to see this calcification from our own viewpoint, but it can stand out like a sore thumb to other people. The trick seems to be working furiously to keep our defensive reactions from browbeating other people out of telling us when we seem wrong from their perspective. That, and finding enough people who do the same to keep the input useful.

Long story short: we can all be wrong due to errors in data and thinking. I suppose the best indicator is being slightly uncomfortable at all times from a constant awareness of this. Comfort seems to presage motionlessness unless it is the comfort of temporary respite from hard work with the sure knowledge that shortly it will begin again.


More later.


I want to talk about a subject near and dear to my heart: the fallacy of equivocation. To say that someone has committed the fallacy of equivocation is a somewhat formal and precise way of saying they have made an error in the way they used a word, specifically with regard to the sense in which they treat its denotative meaning. You might think that this sort of thing is of importance only to intellectuals dwelling among the clouds in firmly shuttered ivory towers, but you would be badly mistaken in this opinion.

If we render it in terms more familiar to the large segment of the population concerned with things like getting enough money to make the rent, or whether or not that cute person is going to call you back, it is most simply expressed by saying that it is a form of lying. It can occur through apathy or through being unclear on what a word means, but most often the cause is that on some level the person who is equivocating wants either to avoid the discomfort of saying something right out, or to evade the consequences of speaking plainly–usually manifesting themselves as someone telling them to take their opinion and place it where the sun does not shine.

In other words: the fallacy of equivocation is usually a deliberate misrepresentation intended to get the equivocator what they want out of you without your objecting to it. This is, as all fallacies are, the result of either intellectual dishonesty or ignorance, either of which is to be pinned down and gotten rid of with a quickness.

The easiest way to avoid accidentally engaging in this fallacy is to be sure of your definitions, and to be sure you do not use a word in more than one way at a given time. The rule of thumb is: when in doubt, look it up. It matters not at all how smart you are or the degree to which your erudition regularly attracts the adulation and approval of your peers–if you are not sure, look it up. If you are writing anything of consequence, your work will not suffer irrecoverably from the loss of five seconds spent consulting with the Google. If nothing else, catch it when you edit.


To avoid being fooled by this sort of language, ask a question to pin down the meaning of the statement. The easiest way to do this is to state the sense of the word you have in mind and ask for verification that it is the sense intended. If the person readily provides you with a more specific meaning, it was probably a mistake–even if it takes them a while to figure out what specifically they meant to say. However, if all you get is weasel words and ambiguity, or the person tries to change the topic, you have most likely caught them with their hand in the Equivocational Cookie Jar.

This sort of thing is not just for eggheads: you should care about being clear and specific because equivocation leads to false expectations, anger, and misunderstandings that can damage or destroy trust and relationships. If you have ever been disappointed by a politician, frustrated by someone who says one thing but does another and squeaks by on a technicality, or been misunderstood because someone just assumed they knew what sense of a word you were using–this should matter to you.

As a person who is present to some extent on modern social media, I find that I am exposed daily to an unhealthy dose of bad advice from apparently well-meaning but deeply confused people who want so desperately to be right that they are willing to sacrifice truth to their hunger for the feeling of certainty. One of the more insidious forms of this offense begins with an admonition to “think for yourself.” There is nothing inherently wrong with this advice. It might actually be one of the best pieces of advice one can give, but a number of hideous flaws can creep in silently, hiding in its shadow, if we are not cautious. The most egregious and abhorrently poisonous of these wretched little gremlins is the notion that doing one’s thinking in a vacuum is the only–or perhaps worse, the best–way to go about the task of figuring things out.

The title of this piece is intended to provide an unequivocal demonstration of why this method is not only disastrously stupid, but so easily repudiated that anyone who cares to can do so inside of a minute or two. Masturbation is an intrinsically solipsistic sort of activity: you need only your brain, your hands, and whatever plumbing nature has supplied you with to conduct it. I will for the moment dismiss the exception of fetishists who require something of outside manufacture to reach a satisfactory level of excitement; it is possible, at least in theory, for those persons either to substitute sufficient imagination or to manufacture the necessary adjuncts themselves, which leaves us back at our starting point. The point to be taken away from this is that masturbation does not inherently require a second sentient being, and while it does co-opt the use of various mental circuitry related to reproduction, it does not constitute a functional replication of the reproductive process.

In other words, you are never going to have a baby no matter how much you masturbate. Barring vanishingly rare reproductive anomalies, you will never be able to become pregnant (especially if you have an XY karyotype) in the absence of another human sentient. In any case, that sort of exception is physiologically unrelated to masturbation, and so even that would not disprove the example. The long and the short of this is that if you attempted to “reproduce for yourself” in the absence of another human, you could spend as long as your heart desired at it without the effort contributing to your goal in the slightest. You may have a fantastic relationship with Rosy Palm and her five sisters, but none of them are going to be your baby daddy, sorry.

It is in precisely the same way that “thinking for yourself” in the absence of evidence will get you nothing aside from a warm, fuzzy feeling. If that is all you are after, allow me to refer you to the former example, as it will allow you to obtain that result with significantly more regularity. Merely “citing your thoughts” is mental masturbation. You may always share your thoughts, but as soon as any of them purport to be representative of anything outside your opinion, you may have begun to waggle your intellectual wang (or started “bluffin’ with your cranial muffin”) in a most embarrassing manner. Do have a care for any impressionable people who might be exposed to your intellectually indecent exposure.

To get a bit more into the nuts and bolts: when we say “think for yourself,” what we honestly mean is “examine the evidence for yourself and come to a conclusion that is not biased by another person’s assumptions.” The phrase presupposes not only that the evidence is available in full, but that the recipient is interested in perusing it and constructing his or her own theory to explain it, or at least in examining the available explanations and selecting the one from the source he or she judges most likely to be correct. Even the latter method is rife with peril if it is not accompanied by a basic understanding of reality, some fact-checking, and a firm conviction that truth is preferable to comforting sophistry.

I will be blunt: anyone who tells you to rely on your own thoughts and feelings to the exclusion of evidence, skepticism, and communication/cross-checking with other people is either a contemptible lout or a lunatic, and more than likely wants to sell you something, be it a used car, a religion, or the dubious privilege of his presence between your thighs for as long as it takes him to do his business. Thought without evidence or logic is like sperm without an egg or a womb, and it will get you just as close to producing truth as the latter will to producing a baby.

No matter how many people you get to agree with you that it is otherwise, the facts will remain the facts. So when you say “think for yourself,” you had better bloody well mean it. And the rest of you, who gobble up that vacuous piffle in the spirit in which it was intended: cut that shit out before civilization collapses beneath your vacant and incurious bulk.

That dead horse has been beaten.


There are a lot of people in the world. About 7.036 billion, actually [USCB]. If we assume we can get to know someone in a cursory 30-second chat, and that we all speak the same language, each of us would need roughly seven billion such chats: about 211 billion seconds apiece. Even with everyone paired off and talking in parallel, it would take about 6700 years for everyone to meet and greet everyone else, and that assumes rigid 30-second time slots, no time for sleep, no eating or drinking (or at least none that interferes with talking), and instant switching from one partner to the next.

Even if we allowed a mere 3 seconds for changeover between each conversation, that would still tack on another 660 years. Giving everyone eight hours of sleep would tack on another 2300 years or so. Giving everyone an hour to bathe, brush their teeth, and so forth is good for another 290 years or so. Grand total: 9950 years–and that’s still assuming that, say, robots are doing all the labor to sustain our industries and agriculture, and that the giant conveyor belt never, ever breaks down and always functions perfectly.

The result is that, even if humans lived for ten thousand years and did nothing but meet other people, the meet-and-greet would still consume 99.5% of that lifespan. This also assumes, of course, that the birthrate is zero. But let’s divide it instead by the average human lifespan of 69.6 years [World Bank]. The result is about 143–yes, that means it would take 143 average human lifetimes to hold a single, thirty-second conversation with every person *currently* alive on the planet as of this instant.
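For the skeptics who want to poke at these sums, here is a quick back-of-the-envelope sketch in Python. It follows the rough accounting above (everyone chatting in parallel pairs, with sleep, hygiene, and changeover tacked on as fractions of the base talking time); the variable names are mine, and the outputs are ballpark by design.

```python
# Back-of-the-envelope check of the numbers above. Conventions follow the
# post's own rough accounting: everyone talks in parallel pairs, and sleep,
# hygiene, and changeover are added as fractions of the base talking time.

POPULATION = 7.036e9                    # [USCB] figure quoted above
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~31.6 million seconds

# Each person needs ~one 30-second conversation per other living human.
base_years = (POPULATION * 30) / SECONDS_PER_YEAR       # ~6,700 years

changeover_years = (POPULATION * 3) / SECONDS_PER_YEAR  # ~670 years
sleep_years = base_years * (8 / 24)                     # ~2,200 years
hygiene_years = base_years * (1 / 24)                   # ~280 years

total_years = base_years + changeover_years + sleep_years + hygiene_years

print(f"talking alone: {base_years:,.0f} years")
print(f"grand total:   {total_years:,.0f} years")  # ~9,900, near the ~9,950 above
print(f"lifetimes at 69.6 years each: {total_years / 69.6:,.0f}")  # ~142
```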

Obviously, then, anyone proposing a system by which we base our actions exclusively on personal relationships would be laughed out of town by anyone who had thought about it for longer than about five minutes. Some kind of compromise has to be made, and our rather curious brains, having evolved as they did to help us maintain relationships among comparatively small social groups of primates, provide us with a surprisingly functional but somewhat problematic solution.

Stereotyping: This brilliant shortcut lets us establish correlative relationships between superficially similar things, which has obvious benefits. If you see someone eat a brightly colored frog and die, you are more than likely going to benefit from avoiding the consumption of the whole category of brightly colored frogs. If, say, you are living in a tribe among others and the possibility of violent conflict exists, you are more likely than not to benefit from associating another tribe’s phenotypical features, clothing, language, and mannerisms with danger, or at least with the unknown, of which you should be wary. So on, and so forth.

Where we start to get into trouble is that while these correspondences are easy to establish, they are difficult to break without a great deal of effort. The mechanisms themselves are quite intricate, and as I am no neuroscientist I will not attempt to explain them, but the upshot is that what is in actuality a useful approximation can be unintentionally conflated with a bit of guaranteed predictive information. This would not even be a problem if a “stereotype” were established with an arbitrary but large degree of precision and applied only to things that met the specific definitions, but that would be antithetical to the purpose of our rather fuzzy system of categorizations.

They are of value specifically because they allow us to benefit from prior knowledge even when dealing with novel situations. Learning by analogy might be the best use we have for these ‘fuzzy’ correspondences. Math, science, and art all rely on establishing correspondences to our previous experience and constructing mental tools of ever-increasing complexity.

A very young child can be said to be starting on mathematics when they establish the line of demarcation between a single discrete object and a group of them. The child continues by learning the words that correspond to specific discrete quantities and placing them in order, counting from one to two, two to three, and so on. From there it’s a hop, skip, and a jump to addition. Subtraction is only addition in reverse, multiplication only addition of group quantities. When we divide, we are splitting a quantity into groups, and the problem can be framed as a ratio with fractions. Any math problem can be framed in terms of unknowns, and algebra is just learning to break a single problem into smaller discrete parts and manipulate those parts. Once we have that, we can look at the relationships between real-world shapes and equations with geometry.
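To make the layering concrete, here is a toy sketch in Python that builds each operation out of the one before it, in the spirit of the progression above. The function names and implementations are mine, purely for illustration.

```python
# Each operation is built from the one beneath it, the way the post
# describes a child stacking up mental tools. Non-negative integers only.

def add(a: int, b: int) -> int:
    # Counting is the primitive: step up from a, one at a time, b times.
    for _ in range(b):
        a += 1
    return a

def multiply(a: int, b: int) -> int:
    # Multiplication is only addition of group quantities: b groups of a.
    total = 0
    for _ in range(b):
        total = add(total, a)
    return total

def subtract(a: int, b: int) -> int:
    # Subtraction is only addition in reverse: find x such that b + x == a.
    # Assumes a >= b (the child has not met negative numbers yet).
    x = 0
    while add(b, x) != a:
        x += 1
    return x

def divide(a: int, b: int) -> tuple[int, int]:
    # Division splits a quantity into groups of b, with a remainder.
    groups = 0
    while a >= b:
        a = subtract(a, b)
        groups += 1
    return groups, a

print(multiply(6, 7))  # 42
print(divide(45, 7))   # (6, 3): six groups of seven, three left over
```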

When we look closely at a story, these fuzzy correspondences are being made use of any time something happens in the text that we have not personally experienced. When Cervantes’ Don Quixote charges the giants, explosions of neurochemicals construct the notional realities of the errant knight’s tale–we can follow his charge against great four-armed beasts never seen on Earth because we have had the experience both of seeing something and being mistaken about it, and of having felt something we wanted to be true even as cold reality stonily folded its arms. We are Sancho Panza, observing Quixote with bemused, if phlegmatic, marvel. We have never been in these situations, but nevertheless a string of correlations forms the connective tissue that allows us to use language to grasp something of value and meaning from the story.

I submit, therefore, that the problem with stereotypes is only that they underlie so much of our accomplishment, both as individuals and as a species, that we sometimes forget to confine them to their proper category: probabilistic approximations, useful only in providing a general estimate that allows us to act without freezing, but that can be, and very often are, imprecise, inaccurate, or even outright mistaken. If we remain willing to juggle people between our categories, shift those categories around, re-write their boundaries, or even dispose of them entirely when we find too much evidence against them, we will not go too far afield in treating our fellow humans decently even if we do not know them personally.