Critical thinking is a topic I speak often about on this blog, and one that you will come across on most skeptics’ writings. I thought it would be useful to go into what critical thinking is, common characteristics and philosophies, and how to apply it. I will also be starting a series looking at one of the cornerstones of critical thinking – identifying logical fallacies.
Put simply, critical thinking can be described as “the objective analysis and evaluation of an issue in order to form a judgment.” A more detailed definition, provided by The National Council for Excellence in Critical Thinking, defines critical thinking as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.”
So, what are some of the characteristics of critical thinking?
- Critical thinking is reasonable and rational. Critical thinkers do not jump to conclusions; they collect data, weigh the facts, and think the matter through.
- Critical thinking is reflective. It involves stepping back from an issue, weighing the facts and evidence, and examining one’s own assumptions.
- Critical thinking inspires an attitude of inquiry. Be inquisitive but also skeptical.
- Critical thinking is autonomous thinking. Critical thinkers are not easily manipulated or swayed by popular opinion.
- Critical thinking includes creative thinking.
- Critical thinking is fair thinking. It is not biased or one-sided.
- Critical thinking focuses on deciding what to believe or do. Critical thinking is used to decide on a course of action; make reliable observations; draw sound conclusions; solve problems; and evaluate claims and actions.
The National League for Nursing came up with this list (you can view the full version here) and considered critical thinking so important that it was added as a mandatory criterion for accreditation of schools of nursing 20 years ago. I wanted to add to this list by pointing out what I consider to be the foundations of critical thinking and the common stumbling blocks that can get in the way.
Everyone possesses the ability to think critically, and most of us use it in our day-to-day lives. But very often, when a person holds a core belief strongly, it is easy to put on blinders and seek out only information that agrees with one’s beliefs or preconceived ideas, while dismissing any evidence that works against them. This is known as confirmation bias: the tendency to search for, interpret, favor, and recall information in a way that confirms one’s beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities. Everyone is disposed toward bias, which is why it is so important to keep an open mind, be willing to admit that you could be wrong, and have a system in place to filter information and determine its validity.
If you have ever served on a jury or watched a courtroom drama, you know how important evidence is. Evidence is defined as the available body of facts or information indicating whether a belief or proposition is true or valid. In the scientific world, evidence is that which serves to either support or counter a scientific theory or hypothesis; such evidence is expected to be empirical and interpreted in accordance with the scientific method. A few important points about evidence:
First, some things that are not considered evidence: Opinions are not evidence. Arguments are not evidence. Conspiracy Theories are not evidence. Hearsay is not evidence. “Strongly held beliefs” are not evidence. Emotions are not evidence.
Second, the judicial system has mandated that the burden of proof lies with the prosecution. In other words, the burden of proof always rests with the one making the claim. If I tell my neighbor that aliens have been visiting me at night in my back yard, it is up to me to provide him with evidence to back up this claim; it is not his responsibility to prove me wrong. You will often see people trying to deflect their responsibility to provide evidence onto the person demanding it. This is shifting the burden of proof, a form of the Appeal to Ignorance fallacy. A common example can be found when a skeptic asks for evidence for God; the response will often be, “Prove to me that there isn’t a God!”
And lastly, a phrase first said by Marcello Truzzi but made famous by Carl Sagan: extraordinary claims require extraordinary evidence. The more a claim deviates from what we consider a “normal” occurrence, the more evidence is required to validate it. If I told my neighbor that I saw a hawk in my backyard, he would most likely take my word for it, as hawks are often seen in our neighborhood and strong evidence is not needed. If I were to say that I saw a Sasquatch in my back yard, a great deal of evidence (footprints, hair samples, photographs, etc.) would be needed before he could be convinced.
The probability spectrum is closely related to Bayes’ theorem, which describes the probability of an event based on prior knowledge of conditions that might be related to it. In debates regarding far-fetched claims, a last-ditch effort is often made by appealing to the idea of possibility. Speaking of possibility gives the illusion of leaving the door open that such a claim may be true, despite the evidence pointing to its improbability. As the old saying goes, there are very few certainties in life, but we all make decisions based on the probability of what’s going to happen. I know that when I leave my home in the morning, there is the possibility that I may get in an accident, but that doesn’t stop me from going to work, because I know the probability is relatively low. The same principle applies to critical thinking: in cases where a definitive conclusion cannot be made, the most probable answer or scenario should be the one accepted. Another way to think of it is to draw a line with “very unlikely” at one end and “very likely” at the other, and postulate where a claim or explanation falls on that line. In his book The God Argument, A.C. Grayling explains the probability spectrum like this:
One line of thinking in the theory of knowledge has it that belief is not an all-or-nothing affair, but a matter of degree. The degree in question can be represented as a probability value. A virtue of this approach is said to be that it explains how people adjust the weighting they give to their various beliefs as the evidence in support of them changes when more and better information becomes available. People might not talk of probabilities unless challenged to say just how strongly they believe something, but their beliefs are nevertheless measurable in terms of how subjectively probable they appear to their holder.
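To make the connection between Bayes’ theorem and extraordinary claims concrete, here is a minimal sketch. The function and all the numbers in it are illustrative assumptions of mine, not figures from the post: I treat my neighbor’s trust in my testimony as fixed likelihoods and vary only the prior probability of the claim.

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' theorem: P(claim | evidence) =
    P(evidence | claim) * P(claim) / P(evidence)."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Ordinary claim: "I saw a hawk." Hawks are common here, so the prior
# is substantial and my word alone makes belief quite reasonable.
hawk = posterior(prior=0.5, p_evidence_if_true=0.9, p_evidence_if_false=0.3)

# Extraordinary claim: "I saw a Sasquatch." With a tiny prior, the very
# same testimony barely moves the needle.
sasquatch = posterior(prior=1e-6, p_evidence_if_true=0.9, p_evidence_if_false=0.3)

print(f"P(hawk | testimony)      = {hawk:.3f}")      # high
print(f"P(sasquatch | testimony) = {sasquatch:.6f}") # still tiny
```

The same weak evidence that comfortably supports an ordinary claim leaves an extraordinary one almost exactly where it started, which is all “extraordinary claims require extraordinary evidence” really says.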
Along the same lines is Occam’s Razor, a problem-solving principle attributed to William of Ockham (c. 1287–1347), an English Franciscan friar, scholastic philosopher, and theologian. The principle can be interpreted as, “Among competing hypotheses, the one with the fewest assumptions should be selected.” In other words, the simplest explanation is generally the right one.
Another important aspect of probability is natural vs. supernatural explanations. The supernatural is anything that goes against or beyond the natural world. When debating religious claims, the supernatural is often invoked as “evidence”. This is a cop-out of sorts, as supernatural claims, by their very nature, cannot be tested by normal means. This is why, in any discussion, a natural explanation is always preferable to a supernatural one. Put another way: the supernatural is the least likely explanation for events. Divine intervention, miracles, the paranormal, psychic powers, angels/demons, spirits, etc. are all considered supernatural and should set off warning bells whenever they are used in a debate. Supernatural explanations are unacceptable in the courtroom and in the science lab, and they should be equally unacceptable in critical thought.
Last, but certainly not least, we must talk about falsifiability. Falsifiability (or refutability) of a statement, hypothesis, or theory is the inherent possibility that it can be proven false. A statement is called falsifiable if it is possible to conceive of an observation or an argument that negates it. For a hypothesis to have credence, it must be inherently disprovable before it can be accepted as a scientific hypothesis or theory.
Falsifiability is a cornerstone of the scientific method and should be equally applied to critical thinking. For a statement to be questioned using observation, it needs to be at least theoretically possible for it to come into conflict with observation.
For example, I can make the claim that “All polar bears are white”, and it is logically possible to falsify this statement by observing just a single black polar bear. In the same way, Newton’s Theory of Gravity has been accepted as truth for centuries because objects do not randomly float away. If they were to start floating away, then scientists would need to go back to the drawing board and come up with a different hypothesis.
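The polar bear example can be sketched in a few lines of code. This is my own illustration, not part of the original argument: a universal claim is modelled as a predicate over observations, and one counterexample is enough to falsify it.

```python
def falsified(claim, observations):
    """A universal claim is refuted by a single counterexample."""
    return any(not claim(obs) for obs in observations)

# Hypothetical observation records: (species, colour).
# The claim "all polar bears are white" as a predicate:
def is_white_if_polar(obs):
    species, colour = obs
    return species != "polar bear" or colour == "white"

sightings = [("polar bear", "white"), ("black bear", "black")]
print(falsified(is_white_if_polar, sightings))  # False: the claim survives so far

sightings.append(("polar bear", "black"))
print(falsified(is_white_if_polar, sightings))  # True: one black polar bear refutes it
```

Note the asymmetry this captures: no number of white polar bears ever proves the claim, but a single black one disproves it, which is exactly why falsifiable claims are the ones worth testing.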
With all of that out of the way, we’ll next look at some of the most common logical fallacies. They are easy to spot once you learn to recognize them, and doing so will help you navigate the endless sea of nonsense that permeates our social media and literature. Hopefully this has been helpful. If you have any questions or need clarification, please leave a comment below. I’m no expert, but I’ll do my best to answer or at least point you in the direction of someone who can. Thanks for reading.