Mike's Rules for Critical Thinking
A working set of rules for critical thinking organized into three domains: foundation, analysis, and engagement. Written for non-academics who want a down-to-earth framework.
I’m seeing a lot of references to “critical thinking” in my social media feeds, but often those most excited about critical thought also hold ideas that are strong evidence of its absence. “Think critically! Examine everything! The world is flat!” Those people define “critical thinking” differently than I do.
It made me think carefully about what I consider critical thinking. I’ve arrived at a working set of rules, which I’ve organized into three domains:
- Foundation Rules: Mindset and fundamental approach. The attitudes and intellectual humility required for critical thinking.
- Analytical Rules: The methods and principles we use to evaluate information, weigh evidence, and reach conclusions.
- Engagement Rules: How we test and refine our thinking through discourse with others, in the crucible of reasoned debate and collaborative exploration.
I’m aware there are academic frameworks for critical thinking (see Analytical Rule One). I’ve examined them only superficially (ICAT, Bloom, Paul-Elder, CARS), and I think there’s space for a down-to-earth set of guidelines for non-academics. Here goes.
Foundation Rules
Mindset and fundamental approach. The attitudes and intellectual humility required for critical thinking.
Foundation Rule One: Assume you could be wrong.
The basis of critical thinking is skepticism—the default doubt that questions. And the utterly essential first element of any critical thought process is to doubt yourself. Assume, as the highest priority, that your current thinking on the matter could be flawed. A closed mind is an uncritical mind. This is not a new concept. Socrates said it best: “I neither know nor think I know.”
Foundation Rule Two: Identify the circumstances under which you’d change your mind.
If you’re following Foundation Rule One, you admit you could be wrong. Foundation Rule Two mandates that you describe what could prove it to you. The evidentiary bar could be very high—as Carl Sagan said, “Extraordinary claims require extraordinary evidence”—but it must exist.
Foundation Rule Three: There is nothing wrong with changing your mind.
Your opinions are not your identity. Your beliefs are not you. You can change what you believe without changing who you are. And a challenge to those beliefs is not a challenge to you; it’s a challenge only to the beliefs. There’s nothing wrong with starting out wrong if you end up right.
Foundation Rule Four: It’s always better to be really right than mostly right.
Truth is not merely what is useful. If that were the case, we’d still use the Ptolemaic geocentric view of the universe. It was right enough to be useful for astrology and eclipse prediction. But it wasn’t really right. And the continued search for “really right” took us to the moon and will likely take us to Mars. Practical models solve today’s problems, but true understanding prepares us for tomorrow’s. If you stop at what is merely useful, you are stuck with only what you know how to use today. And future you will not be grateful that you stopped thinking.
Here are two real-world examples of why really right matters when mostly right isn’t good enough.
How can you find out how many geophysicists are in the room? Announce that the acceleration of gravity on Earth is 9.8 m/s². “At the equator, at sea level, if there are no large ore bodies nearby, sure. But everywhere else, uh, nope.” Geophysicists use the precise measurement of gravitational variations to find ore bodies. That you’re reading this on a computer is thanks, in part, to the pursuit of “really right” measures of gravity.
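To put a number on “everywhere else, uh, nope,” here’s a minimal sketch of how sea-level gravity varies with latitude alone, before altitude or ore bodies even enter the picture. It uses the international gravity formula; the constants are quoted from memory and are illustrative, not survey-grade.

```python
import math

def gravity_at_latitude(lat_deg: float) -> float:
    """Approximate sea-level gravity (m/s^2) at a given latitude,
    using the international gravity formula (constants approximate)."""
    s = math.sin(math.radians(lat_deg))
    s2 = math.sin(math.radians(2 * lat_deg))
    return 9.780327 * (1 + 0.0053024 * s**2 - 0.0000058 * s2**2)

for lat in (0, 45, 90):
    print(f"latitude {lat:2d}°: g ≈ {gravity_at_latitude(lat):.4f} m/s²")
# latitude  0°: g ≈ 9.7803 m/s²
# latitude 45°: g ≈ 9.8062 m/s²
# latitude 90°: g ≈ 9.8322 m/s²
```

That’s roughly a 0.05 m/s² spread from equator to pole, and the anomalies geophysicists chase when prospecting are orders of magnitude smaller still.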
And how can you find out how many geographers are in the room? Announce that the Earth is an oblate spheroid. “If you ignore the oceans, flatten all the mountains, and allow a margin of error wide enough to fit the Grand Canyon, sure. And even then it’s more of a lumpy pear shape with dynamic tidal deformations.” If the only use for measuring the Earth is to find your way from Spain to Cuba, mostly right is good enough. But it’s not good enough for Google Maps to show you the way home.
Foundation Rule Five: Reserve judgement.
Critical thinking begins with questions, not answers. The temptation to stereotype, overgeneralize, or dismiss entire categories (“all X are Y”) is an intellectual dead-end, often emotionally motivated, that prevents analysis. Cynicism masquerading as wisdom is particularly dangerous because it offers the illusion of critical thought while actually blocking it. When you start with “everyone is corrupt,” you’ve reached a conclusion before examining evidence—which is exactly the opposite of critical thinking. Cynical pre-judgement without careful analysis is, however, exactly what conspiracy theories require.
Foundation Rule Six: All generalizations are wrong, including this one.
Be as specific with your argumentation as possible and as general as necessary. Be careful with broad generalizations because they tend to include unintended exceptions. Most human characteristics, behaviors, and situations exist on a spectrum rather than in distinct categories, so sweeping statements usually capture cases they shouldn’t or miss cases they should include.
Foundation Rule Seven: Read more that challenges you than confirms you, and don’t believe everything you read.
Critical thinking requires universal skepticism nurtured by deliberate exposure to opposing viewpoints. The natural tendency is to seek confirming information, which validates rather than tests your thinking, so you must consciously overcorrect to achieve balance. This requires both skepticism and courage: skepticism to question everything, including sources that align with your current beliefs, and courage to regularly engage with well-reasoned challenges to your positions. Critical thinking mandates that you actively seek out thoughtful opposition to your views and make it a practice to spend more time understanding opposing arguments than reinforcing existing beliefs. Without this, all the other rules become tools to defend existing beliefs rather than pursue truth.
Analytical Rules
The methods and principles we use to evaluate information, weigh evidence, and reach conclusions.
Analytical Rule One: Remember that for effectively every human endeavour, there are probably at least 100 people with PhDs in it.
“What is the best room colour to stimulate happiness?” Probably a dozen people are pursuing PhDs on that right now. “How do we most efficiently bake sourdough bread at high altitude?” Another dozen PhD students raise their hands. The point is that there are legitimate experts—people who have spent decades studying the subject—in effectively everything. That means that on effectively every possible subject, someone knows it vastly better than you do, unless you’re the world’s expert in it. Before beginning any critical examination of a topic, ensure that you have at least a basic comprehension of it, which also means being aware that academic disciplines related to the subject exist.
Analytical Rule Two: If your opinion differs from those who know the subject matter better than you do, you’re almost certainly wrong.
This is an idea called epistemic humility. For example, if there are 1000 researchers investigating how wood decomposes across variations in humidity, temperature, and local micro-organisms and they all agree but you don’t, it’s almost certain you are wrong. If there are two opinions on the matter and the groups are evenly split, it’s reasonable for you to disagree with one side but not both. If there are 1000 opinions on the matter but 100,000 experts agree on one and 999 others each have their own opinion, it’s not reasonable to disagree with the effective consensus opinion. Assuming independent and distinct investigation and judgement, the likelihood of error in a consensus decreases exponentially as the number of experts supporting it increases. This principle applies most strongly to established fields with mature methods and evidence. In emerging fields, consensus may be more fluid, and further investigation is often needed. But it’s essential to seek the predominant view among experts, rather than selectively finding experts who happen to align with your initial opinion.
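As a toy model of that exponential claim (a sketch under a strong assumption: each expert judges independently with the same error rate, which real experts, sharing methods and blind spots, never quite do), watch how quickly the probability that an entire consensus is wrong collapses:

```python
def p_consensus_wrong(p_individual_error: float, n_experts: int) -> float:
    """Probability that n experts are all wrong, assuming each errs
    independently with the same individual error rate."""
    return p_individual_error ** n_experts

# Even granting each expert a generous 30% chance of being wrong...
for n in (1, 5, 10, 50):
    print(f"{n:3d} independent experts all wrong: {p_consensus_wrong(0.3, n):.2e}")
#   1 independent experts all wrong: 3.00e-01
#   5 independent experts all wrong: 2.43e-03
#  10 independent experts all wrong: 5.90e-06
#  50 independent experts all wrong: 7.18e-27
```

The independence assumption does a lot of the work here; correlated errors shrink the effect, which is one reason consilience across independent methods (see Analytical Rule Four) matters so much.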
This is where “doing your own research” is a bit misleading. No matter how much amateur research you do, you’re not going to surpass the collective knowledge of the experts in the field. The “research” you should be doing is finding out what the smartest, best-informed people in the field have to say on the matter. If a plurality of them agree on one position, it’s irrational for you to disagree.
I have an acquaintance who is undeniably brilliant. MIT undergrad, MBA from Harvard, big brain. He’s in finance and spends his days building proprietary financial models that make him and his firm a lot of money. He uses enterprise computing tools to run simulations at scale and has a profound understanding of the capabilities and limitations of financial modeling. But because he’s a smart guy and has deep experience with financial modeling, he makes the mistaken assumption that his expertise is applicable outside his field. He applies it to other scientific modeling and questions the claims made by a large group of scientists because he believes the models on which they base those claims are not capable of the degree of certainty the claims rely on. The question is not whether he’s right but whether it’s reasonable for him to doubt the claims of the scientists. In this case, there are roughly 20,000 scientists who make the claim he disputes. There are roughly 600 dissenting scientists in the field but almost all of the dissenters have their own unique opinion, so the consensus opinion carries enormous statistical weight.
Even with conservative estimates, my brilliant buddy would need to carry the statistical weight of over 2,000 experts by himself to create even a 10% chance of being right. While he could be correct, the odds are about as likely as winning the lottery three times in a row. It’s not rational to believe he could be right and all the experts wrong. Thinking about it another way, let’s say he’s in the top 0.1% of smart people. So of any 1000 people, he’s the smartest. Well, among 20,000 scientists, roughly 20 would be at his level, so statistically speaking he’s not even in the top 10 of that group. To imagine that people as smart as or smarter than him, with years of experience and vastly more knowledge of the subject matter, are all wrong and he’s right is, again, not rational.
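Here’s the back-of-the-envelope arithmetic behind that paragraph, as a sketch (the head counts and the 0.1% figure are the illustrative numbers from above, not measurements):

```python
field_size = 20_000      # scientists supporting the consensus claim
dissenters = 600         # dissenters, mostly with mutually incompatible views
smart_fraction = 0.001   # "top 0.1% of smart people"

# If he's one in a thousand, how many of the 20,000 match or exceed him?
peers = field_size * smart_fraction
print(f"expected peers among the consensus: {peers:.0f}")              # 20

# And the dissenters mostly disagree with each other, so no single
# rival position comes anywhere near the consensus in weight.
print(f"consensus-to-dissent ratio: {field_size / dissenters:.0f}:1")  # 33:1
```

Twenty people at least as capable, each with the domain expertise he lacks, looked at the same class of models and reached the opposite conclusion; that is the weight he is arguing against.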
Analytical Rule Three: Learn how arguments work—and fail—but focus on understanding rather than scoring points.
Familiarize yourself with how reasoning works: deductive reasoning moves from general rules to specific cases, and inductive reasoning builds general rules from specific cases. Learn to spot common flaws like assuming correlation means causation, attacking the person rather than the argument, or drawing conclusions from too little evidence. Watch for cognitive biases like confirmation bias (seeking only confirming evidence) and the availability heuristic (overweighting easily remembered information). But remember—an argument can contain flaws and still reach the right conclusion. Use this knowledge to identify poorly founded or misleading arguments and to avoid flaws and biases in your own as much as possible. But see Engagement Rule Three—not everyone who has good things to say has a good command of rhetoric and logical argumentation. The reverse also holds: many people who have studied rhetoric are actively trying to mislead you.
Analytical Rule Four: Sometimes there are not enough facts or too many opinions to reasonably find a single good answer.
In some areas, there just isn’t enough information, or there aren’t opportunities for experimentation, so theory and opinion are all you’re left with. If there are 1000 experts and 1000 vigorously debated opinions, that’s probably an indication that there aren’t a lot of facts. When there are too many differing opinions, you’ll need to start looking more closely. I have rules for this as well, but they’re more detailed: examining the methods and measures, weighting tested results higher than untested ones, assessing the consilience of the evidence (the degree to which it’s drawn from a range of independent, diverse methods and fields), and many others. This rule involves deep investigation of the research and the researchers, but it’s the most dangerous rule because it’s tempting to attempt this more often than necessary. See Analytical Rule Two—if most of the experts agree, they’ve done this already, and doing it yourself, amateur-style, has effectively no likelihood of being valuable.
You might point to cases where outsiders proved experts wrong. Elon Musk, for instance, has become the richest man in the world partly by challenging experts. But he doesn’t do it through amateur analysis; he funds scientists to apply their critical thinking to design and run experiments that create new facts. That’s valuable, but it’s distinct from critically analyzing existing evidence. It’s also an example of the value of being “really right”—Elon doesn’t quit at “useful”; he persists until he gets new facts that change the world. Note—I’m not an Elon Musk fan, but I can’t deny he’s incredibly successful.
Analytical Rule Five: Even a consensus could be wrong. But “wrong” in this case usually means “not really right” and not “totally wrong.”
As the saying goes, “science advances one funeral at a time.” Scientists are people too, so they can become attached to their own work to the point of disregarding evidence that invalidates or challenges it. But Newtonian physics is not “wrong” until you increase the scope of observations: only when you need incredible precision, or you leave the planet, do you need relativity. Science, for the most part, advances by gradually uncovering the truth. The popular analogy is of a sculptor chiseling to reveal the figure within the block of stone; each chunk of stone removed represents additional insight into what lies beneath. Very rarely does the insight represent a huge change; in almost all cases it’s a refinement or enhancement of previous thought.
Analytical Rule Six: Test what you can, but think critically about your testing. And remember not everything can or should be tested.
Testing ideas in the real world is essential—it’s how we connect theory to reality and how Elon makes his billions. But effective testing requires Elon’s scientists to think critically at every stage: What exactly should we test? What would constitute success or failure? Are our test conditions representative? What variables might we be missing? What are we assuming? The person who says “I don’t think, I just test” is actually thinking critically without realizing it—they’re using empirical testing as their framework for critical analysis. But testing without careful thought about what and how you’re testing is like trying to build a house with power tools but no blueprint: the tools are essential, but you have to think carefully about how to use them. Also, many things can’t be tested, either practically or ethically. We can’t field-test macroeconomic theories with control variables because there’s only the one global economy. We can’t test much of social science because, for one example of many, we can’t ethically raise feral children as experimental controls. And even identifying the limits of empiricism requires critical thinking. So “test it, see what happens” isn’t a bad approach, but it can’t be the only approach.
Engagement Rules
How we test and refine our thinking through discourse with others, in the crucible of reasoned debate and collaborative exploration.
Engagement Rule One: Test your opinions against the best examples of their counter-arguments.
This is sometimes called “steel-manning”—seek out the smartest, best-informed criticism of your position. If you have to change your mind as a result, see Foundation Rule Three.
Engagement Rule Two: If you’re debating, argue about ideas not words. Unless you’re arguing about words, that is.
Make sure that when you’re debating with someone, you’re both using the same language. Imprecise terms and definitions are a waste of time. And if someone refuses to define their terms, or changes their definitions mid-stream for whatever reason, find someone else to talk to. They’re not arguing in good faith.
Engagement Rule Three: Be generous. Not everyone with a valuable opinion is a skilled communicator. Apply the most charitable interpretation you can to an argument rather than the most restrictive.
It is disingenuous to misinterpret an argument just because someone misspoke or didn’t use a technical term correctly. Valuable insights are not always eloquent. Reciprocally, it’s just as important to strive to be as clear as possible when presenting an argument to avoid misunderstanding. Try to avoid requiring generous interpretations of your arguments by stating them as carefully as you can. There’s a time and place for pedantry but it’s not when you’re trying to engage and understand.
Engagement Rule Four: Being right is the objective. Winning is ending up right, not proving the other guy wrong.
The goal of critical thinking is not to “win” a debate but to gain better understanding. Critical thinking isn’t just a skill but a commitment to integrity in the pursuit of knowledge.
Engagement Rule Five: Assume good faith but pick your battles.
Ideas deserve fair consideration regardless of their source. Assume ideas emerge from honest inquiry or observation rather than dismissing them as propaganda, foolishness, or malice. Even ideas we ultimately reject may contain valuable insights or raise important questions. However, if you discover through examination that an idea or an argument is not grounded in good faith, feel free to abandon it without further consideration. Not all inquiry is honest. Not all questions are sincere. Engage only with those that are, but at least start with the benefit of the doubt and feel free to leave any conversation at any time.
Engagement Rule Six: Regularly assess how well you’re doing at critical thinking.
Evaluate your thinking against these criteria:
- Can I articulate opposing views fairly?
- Have I identified my assumptions?
- What thinking habits need improvement?
- Where did I show bias?
- What should I do differently?
So there you have it. Mike’s Rules for Critical Thinking. 3000 words. If you’ve read this far, congratulations, that probably makes two of us. I wrote this for me mainly but also to have it available online in case I want to reference it later. If you find it useful, feel free to share or re-use. No license or attribution needed.