The fundamental goal of legitimate critical thinking (as opposed to the fraudulent kind) is to ascertain what is true, about yourself and the world. So the tools that constitute critical thinking must be tools for finding the truth. And that means tools for discovering which beliefs are false. And that means continually admitting to yourself that you might be wrong about something. If that scares you, then the simple fact is: you won’t be a reliable thinker.

Of course, the only way to know a belief is false is to find out that some other explanation of the evidence for that belief is as or more likely—and thus, discovering a false belief always produces one or another true belief. But the only way to know a belief is true is to sincerely and effectively try to prove it false, and fail. So it’s always about finding the false beliefs. What’s left is true. But too many people call “critical thinking” what is instead a toolbox for defending false beliefs (or avoiding true ones) rather than discovering false beliefs (and thus finding true ones). The key moment that separates someone who abandons delusions and errors from someone who merely defends them is the moment you realize this—and change tack from an illegitimate to a legitimate critical-thinking toolbox.

All beliefs are motivated—you won’t ever hold strongly to any belief you are not in some way dependent on or emotionally invested in—so you need a set of tools for managing this, to ensure your investment and your emotional motivations don’t blind you to the truth. The only way this can be reliably done is to place your top emotional investment in the very idea of legitimate critical thinking itself. You have to be so emotionally invested in discovering false beliefs that this emotional drive overwhelms any other. That way, you can actually come to abandon beliefs you emotionally or otherwise needed or were invested in, because you are more emotionally committed to knowing what’s true. If you lack that emotional commitment, you will remain consumed by false beliefs. You will then never be a real critical thinker. You will only ever be a sham thinker (or not even a thinker at all).

For example, whenever faced with any new claim or challenge to your beliefs, and indeed even when surveying all your own beliefs that haven’t faced any external challenge yet, what you want to know is “Is it true?”, not “Can I come up with a reason to reject it?” (much less “Can I come up with a reason to accept it?”). One can always rationalize an acceptance or rejection of anything, be it true or false. So the latter approach is actually a tool for defending false beliefs (usually beliefs you don’t want to change in the face of any actual or potential challenge to their truth). But asking “Is it true?” is a tool for discovering false beliefs (“yours” or “theirs”). Then, of any claim or belief, you won’t just ask “Can I come up with a reason to accept or reject it?” You will instead ask, “Do I have a genuinely adequate reason to accept or reject it?” Which requires working out what it takes for a reason to be adequate.

So answering that latter question requires more than just coming up with reasons. Many people will claim to be critical thinkers because they are “critical” of any belief or claim they don’t like—but not similarly critical of any claim or belief they do like. This hypocrisy is extremely common even among atheists, but to see it explored among the religious, I highly recommend John Loftus’s The Outsider Test for Faith, whose analysis is convertible to any other belief-divide, not just religion. Critical thinking does not mean just being “critical” (much less only selectively critical). Just as being a skeptic does not mean just doubting everything. Doubting something there is overwhelming evidence for is adopting a false belief: a false belief in the doubtfulness of what is in fact well-evidenced. So that kind of skepticism is a false-belief generator. It is built to defend, rather than discover, false beliefs. That is anti-critical thinking.

Loftus’s book focuses on faith-based epistemology, which is usually associated with religion (and indeed, for its degrading effect on people’s ability to think well in that domain, see my series on Justin Brierley). But that’s only because religion will put the faith element forward and try to defend it as an actual epistemology (anti-critical thinking in its purest form). Faith is literally the ultimate tool for defending false beliefs (and even the most liberal of religions suffer from this: see What’s the Harm? Why Religious Belief Is Always Bad). But the nonreligious also rely on faith. They just don’t call it that, to avoid admitting the significance of what they are really doing. Any false belief you actively defend out of passion or need is a faith-based belief. Motivated reasoning is faith-based reasoning. And faith-based reasoning is motivated reasoning. They are identical.

Attacking or resisting a claim you don’t want to believe, or are otherwise emotionally invested against, is exactly the same folly. To be confident something is false when there is adequate evidence to suspect it could be true is a false belief. To be confident something is false when there is overwhelming evidence it’s true is also, obviously, a false belief. Indeed, any incorrect belief about the correlation between a claim’s truth-value and its evidence is a false belief. If you believe “There is no way that can be true,” when there are actually, demonstrably, many plausible ways it could be true, then your belief is false. It would not be false to say, perhaps, “There is inadequate evidence to be sure either way,” if that is in fact the case (and not just some claim you are making that is incongruous with the actual state of the evidence). But if that is in fact the case and yet you are denying it (by insisting it “can’t be” or “has to be” one way or the other), you are defending a false belief rather than discovering (and appropriately abandoning) one. Having true beliefs includes believing that something is ambiguous or in an appreciable zone of uncertainty when it really is. But insisting something is ambiguous or in an appreciable zone of uncertainty when it is not is just another false belief.

So a commitment to getting at the truth requires rooting out false beliefs—and all beliefs that are incongruous with the state of the evidence are false beliefs. False beliefs include beliefs held in ignorance (you literally have not run into, or been told, or been given access to the evidence that they are false). Those are still false beliefs. You should want to root them out. And that’s easy enough: just explore the evidence. For instance, while I was writing my article on Gun Control That’s Science-Based & Constitutional, I had thought that sentencing enhancements for crimes committed with guns would help deter gun violence. Sounds logical enough. Surely criminals will use guns less if they know they’ll do more time if they get caught than if they effect their crimes with less lethal means. Right? But I am a committed critical thinker. I did not assume I was right. I checked first. Because maybe I was wrong. Maybe the obvious intuitions supporting this idea are bollocks.

Well, guess what? They are. I believe in evidence-based politics, as opposed to ideology-based politics. And that means, in this case, evidence-based policing. And it turns out (as you can find with links in my article) sentencing enhancements have proven to have no deterrence value generally, and particularly not with respect to firearms (they can also end up producing a number of unexpected injustices; in fact, longer sentences can even increase crime rates, by manufacturing more hardened and desperate criminals). Criminals, just by choosing to be criminals, have already proven themselves too irrational and lacking in judgment or foresight to be affected by such abstract principles of deterrence. They are far more likely to be deterred by an increase in the likelihood of being caught (an emotionally immediate worry) than by any increase in the punishment for it.

Well, heck. The more you know. But this illustrates one tool in a genuine critical thinking toolbox: don’t assume you’re right. Check. And have a sound standard of evidence—know what it will take to convince you that you are wrong, and make sure it’s reasonable and not absurd. Because if you are setting unreasonable standards for changing your mind, that is by its very nature a tool for defending false beliefs, and thus of anti-critical thinking. Wrong toolbox. There are many examples one could give like this. For instance, did you know the Dunning-Kruger Effect actually doesn’t exist? Neither does the Backfire Effect. Many things we think are intuitively true, and that may even have had scientific research supporting them, turn out on further examination to be false. That’s why we need that further examination: both societally (all the science that followed these effects up and found them wanting) and individually (we ourselves should check the state of these things when we can, before simply repeating “what we’ve heard,” as I show for example in Dumb Vegan Propaganda: A Lesson in Critical Thinking).

Another example of such false-belief-defending tools is of course the most obvious: recourse to fallacious reasoning. Understanding logic, and thus how to detect fallacies in your own reasoning (and not just in others’ reasoning), is key to critical thinking. Resorting to fallacies, and ignoring when you are using them, is anti-critical thinking. You would only do that to defend false beliefs against challenge or doubt; it can never help you root them out. It does the opposite: it prevents you from rooting them out. Likewise when you selectively try to spy out fallacies in others’ reasoning but not your own: you only do that to get rid of threats to your existing doubts and beliefs; that is a method of defending and maintaining false beliefs. It’s even worse when you claim to find fallacies in others’ arguments that don’t even exist. That is literally delusional: you are then generating more false beliefs to protect your other false beliefs! But even when you are competently and correctly rooting out fallacies in others’ arguments but not your own, you are still engaged in anti-critical thinking.

In my monthly course on Critical Thinking for the 21st Century (one of ten classes I teach, all of which expand on it with even more critical thinking skills in specialist topics) I focus on three domains of mastery required to be “a reliable avoider and purger” of false beliefs: standard logics (fallacy detection and avoidance); cognitive science (knowing all the ways your brain routinely generates false beliefs, and thus how to catch, control, and correct for them); and probability theory (which requires understanding Bayesian epistemology—because all beliefs are probabilistic, which means that if you don’t know how probability works, you won’t reliably assign probabilities to your beliefs either: see my Advice on Probabilistic Reasoning). In each case, the difference between the skills I teach and what unreliable thinkers instead “call” critical thinking lies in whether the tools help discover false beliefs—or help protect false beliefs.
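To make the probability piece concrete, here is a minimal sketch of Bayesian updating in Python. (The function and all the numbers are my own illustrative assumptions, not material from the course or the linked article.)

```python
# A minimal sketch of Bayesian belief updating. All numbers here are
# illustrative assumptions, invented for this example.

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) via Bayes' theorem.

    prior:           P(H), your credence in the belief before evidence E
    p_e_given_h:     P(E | H), how expected the evidence is if the belief is true
    p_e_given_not_h: P(E | ~H), how expected it is if the belief is false
    """
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

# You start 90% sure of a belief, and it survives a genuine burn-test
# that a false belief would only rarely survive (10% of the time):
print(f"{bayes_update(0.90, 0.95, 0.10):.3f}")  # ~0.988: credence rises

# The same "success" is nearly worthless if the test was rigged to be easy
# (a false belief would survive it almost as often as a true one):
print(f"{bayes_update(0.90, 0.95, 0.90):.3f}")  # ~0.905: almost no change
```

The design point is that evidence only moves your credence to the degree that a false belief would have been unlikely to produce it, which is why the rigged and weak tests discussed further below accomplish nothing.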

It is thus not enough to know and learn the tools. You have to consistently employ them, all the time. If you “turn them off” sometimes, that is a blatant tool for protecting false beliefs. Because protecting false beliefs is the only reason to ever do that. You can’t just know all about logical fallacies but then only selectively look for them (for example, by not burn-testing your own reasoning to catch when you are using them). Likewise, you can’t just know all about cognitive biases and then do nothing about the hundreds of cognitive biases you know your brain is relying on every day. And you can’t scoff at probability theory, making excuses for why you don’t have to put in the effort to understand it. That is inherently a false-belief defense strategy: you are essentially thereby admitting you want to be less reliable at belief-testing. Which means you are admitting you want false beliefs; that you don’t want to discover them. And that is a tool for defending false beliefs. That’s anti-critical thinking.

I’ve provided extensive examples of this before, where I survey not just “how” someone is wrong, but what they did, methodologically, that ensured what they ended up believing or claiming was wrong. In other words, I haven’t just continued debunking false claims. Often when I do that now, I take the trouble to highlight the methods they used to end up with false conclusions, and to continue defending those false conclusions—thus defending rather than detecting false beliefs. Those methods illustrate anti-critical thinking. Which means, to do real critical thinking, often you just have to do the opposite of what those people are doing. Indeed, even the mere act of avoiding their methods of defending false beliefs can leave you with methods that detect them instead.

I’ve given many examples of this point before. For example, in:

In these I show how certain people not only ended up with false beliefs, but even elaborately defended those false beliefs. If you avoid their techniques for doing that—if you check actual peer-reviewed scholarship first instead of ignoring that step, if you check what the evidence actually is and what your sources actually say first instead of making the mistake of not doing that, if you actually fact-test and logic-test your own argument before resting your reputation on it, if you try really hard to refute yourself before trying to defend yourself, then the mere act of avoiding such methods of defending false beliefs will land you in methods of detecting them. Which is exactly where you want to land—if true beliefs are what you want to have.

I more systematically discuss these toolsets—the tools for defending false beliefs vs. the tools for detecting them—in:

For example, in Media Literacy I show how selectively choosing what sources to trust based on an invalid criterion of trustworthiness or reliability is a technique for defending rather than detecting false beliefs. To think critically for real, you have to not do that. But the only way to “not do that” is to do the opposite: to establish for yourself some reasonable and realistic criteria for what sources to trust and how much to trust them—and how to test and vet their claims to ensure that stays the case (rather than simply rejecting their claims when you don’t like what they are, or who is telling you what they are).

In Three Tactics I show how common methods used by people who are desperately committed to false beliefs—JAQing Off, Whataboutism, and Infinite Goal Posts—are all designed to avoid confronting uncomfortable facts and arguments, to try to make them go away, rather than acknowledging their implications. This is a method of defending false beliefs—by elaborately avoiding any attempt to detect them. All three techniques aim to distract from the facts or change the subject, thus avoiding any confrontation with reality. A fourth such technique is Motte and Bailey, where someone will defend a false position but, when caught doing this, will retreat to a more defensible position and claim that was their position all along; then, once the pressure is off, they go right back to defending the false position again. This technique is designed to help them avoid confronting any evidence that they are wrong.

That’s all anti-critical thinking. I give many more examples in Shaun Skills, which illustrate not only how to properly think critically about YouTube media, but in the process also show how pundits on YouTube defend false beliefs rather than detect them. Those pundits should have done what Shaun did before publishing their arguments. And this is the key lesson for today. It is as important to pay attention to how people are defending false beliefs—the methods they are using to do that (and thereby to avoid discovering or admitting their beliefs are false)—as it is to the methods used to catch their mistakes and debunk them. The latter methods, had the pundits used them on themselves before publishing claims and asserting beliefs, would have uncovered their false beliefs and thus left them with true beliefs instead. But the methods they used to defend those false beliefs are also methods you need to make sure you are avoiding. Don’t be like them. Be like Shaun. That is the summation of real critical thinking. To learn it thus requires observing examples of behavior to replicate, and behavior to shun, so you know how to purge bad thinking and generate good thinking in its place. From there you can build lists of error-protecting techniques and how to avoid them, as I did, for example, in How to Successfully Argue Jesus Existed (or Anything Else in the World).

There are a lot of good resources online for mastering critical thinking tools. For example:

You can go even deeper with:

But in every case, what you need to focus on—so as to learn the correct use, and importance, and application of every tool—is how that tool roots out rather than defends or protects false beliefs. This is different from focusing on how they might help you find and form true beliefs. They nevertheless do that, too; in fact, they are the only things that do. But you need to focus on the flip-side of that. Because if you focus on “finding the truth,” you are still in danger of missing the real lesson, which is that you need to “find what is false” to get at the truth. Just looking for what’s true can activate all sorts of cognitive biases and fallacies and probability errors—from verification bias to gullibly believing what you read because of mistaken criteria regarding who to trust, and many other common mistakes. But if you are always thinking in terms of “How does this tool help me root out any false beliefs I might have?” then you will be poised to detect rather than fall prey to error and bias.

To locate what’s true requires trying to prove each possible thing false (each claim, each belief), and failing. Because it is only that failure which guarantees a claim’s probability is high. When it is improbable that a false belief would have survived a test, it becomes more probable that the belief is true and not false after all. And because all claims and beliefs make assertions about what “the explanation is” for some body of evidence, and an explanation’s probability can only be known relative to competing explanations, it is only by proving alternatives improbable that you can ever establish anything is probable. Whereas if you fail to prove your belief false because you used weak or rigged tests (tests designed to be easy to survive), all so you can “declare” that you “tried to prove it false and failed,” then you are engaged in anti-critical thinking; you are then using a tool—disingenuous falsification testing—to defend rather than discover false beliefs. But if you apply genuine burn-tests—tests that really will discover your claim or belief is false if in fact it is—and fail, you will have verified that claim or belief is probably true. That is, in fact, the only way to soundly verify a claim or belief is true (see Advice on Probabilistic Reasoning).
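In Bayesian terms (a sketch, in notation of my own choosing), this is just the odds form of Bayes’ theorem:

$$\frac{P(H \mid E)}{P(\neg H \mid E)} = \frac{P(H)}{P(\neg H)} \times \frac{P(E \mid H)}{P(E \mid \neg H)}$$

where $H$ is your belief and $E$ is “the belief survived the test.” A genuine burn-test is one a false belief would rarely survive, so $P(E \mid \neg H)$ is small, the likelihood ratio is large, and surviving the test multiplies the odds that $H$ is true. A rigged or weak test has $P(E \mid \neg H) \approx P(E \mid H)$, a ratio near 1, so “passing” it leaves your odds essentially where they started.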

This means the questions a critical thinker must become comfortable with, and ask of their own beliefs as often as they can, are:

  • “Is that actually true?”
  • “Why do I believe that?”
  • “How would I know if I was wrong?”

We can’t of course completely vet every belief we have. We have very finite resources—even just in time, much less money and other means. But we should apportion resources by the importance of the belief. And the importance of a belief really is best measured by what it will cost if you are wrong. Not just cost “in money,” but in everything else, from time to embarrassment, to public or personal harm. The greater the risks you are taking affirming belief in something, the more you ought to take special care to make sure you are right. Whereas things it would be no big deal to be wrong about you can let slide if you must, until they become important, or pressing, or you just happen to encounter challenges to them and want to explore whether they hold up. Indeed you should aim for a strong inverse correlation between the number of false beliefs you have and the cost of being wrong about them. More of your trivial or unimportant beliefs should be false than fundamental and important beliefs. Because the latter should get the lion’s share of your critical thinking efforts to check, test, and vet. But even the trivial deserves a little attention. Even just knowing that we should always ask whether what we are being told is accurate makes a big difference, because it puts a more cautious flag on our memory of it.
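To make the apportionment concrete, here is a toy sketch in Python (every belief, probability, and cost below is invented purely for illustration): rank your beliefs by the expected cost of being wrong (the product of how likely you are to be wrong and how much it would cost), and spend your vetting effort from the top of the list down.

```python
# A toy model of apportioning vetting effort by expected cost of error.
# All beliefs, probabilities, and costs are made-up illustrative values.

beliefs = [
    # (belief, rough chance you're wrong, cost if wrong on a 0-10 scale)
    ("My retirement plan is on track",       0.20, 9),
    ("This supplement actually works",       0.50, 3),
    ("That viral statistic I want to share", 0.40, 5),
    ("Trivia about a film's release year",   0.30, 1),
]

# Expected cost of being wrong = P(wrong) x cost. Vet the top of the list first.
for belief, p_wrong, cost in sorted(beliefs, key=lambda b: b[1] * b[2], reverse=True):
    print(f"{p_wrong * cost:5.2f}  {belief}")
```

On this toy ranking, the viral statistic and the retirement plan deserve vetting first, while the film trivia can safely wait until it matters.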

What’s scary about all this is that to have the required passion to know which of your beliefs are false, so as to be in any way assured of having as few false beliefs as possible (rather than a plethora of them), you have to admit that you have false beliefs—that you are, and have been, wrong about something. You may even be harboring an enormous number of false beliefs. You could even be the very sort of person you despise: an irrational rationalizer of your belief-commitments, a motivated reasoner, an anti-critical thinker, and you have just been telling yourself the opposite, avoiding anything that would expose that belief to be false.

This can be scary. But it’s an easy fix. You can just admit this of yourself—and change. You can start abandoning false beliefs by admitting you had been incorrectly defending or protecting them rather than rooting them out; but now you are committed to rooting them out. The past is past. You will now be the sort of person you like and can count on, instead of the unreliable person you had been; and you will emotionally invest in being this sort of person above all else. Truth first. Everything else, secondary. Only then will you no longer be vulnerable to emotional investment in false beliefs. And only then will all your beliefs start to hew more reliably towards the truth. This is also why excuses (like “I need certain false beliefs”) don’t logically work: you can’t “gerrymander” a false-belief-defending strategy to defend only harmless or necessary false beliefs (even if such beliefs exist). You simply either have a reliable epistemology—or you don’t (on this point see, again, What’s the Harm?).
