A Guide To Critical Thinking
think.maresh.info

What is Critical Thinking?

Critical thinking is the art of analyzing and evaluating thinking with a view to improving it, so that we can make informed decisions that are most likely to produce the effects we desire.

Critical thinking describes a process of uncovering and checking our assumptions and reasoning. First, we analyze to discover the assumptions that guide our decisions, actions, and choices. Next, we check the accuracy of these assumptions by exploring as many different perspectives, viewpoints, and sources as possible. Finally, we make informed decisions or judgments that are based on these researched assumptions.

Life is a series of decisions, some small, some much larger. Whom we date or choose as friends, the work or career we pursue, which political candidates we support, what we choose to eat, where we live, what consumer goods we buy, if and whom we marry, if and how we raise children—all these decisions are based on assumptions. We assume our friends will be trustworthy and won't talk about us behind our backs. We assume our career choices will be personally fulfilling or financially remunerative. We assume politicians we vote for have our, or the community's, best interests at heart. We assume that the foods we choose to eat are healthy for us, and so on.

These assumptions are sometimes correct. At other times, however, the assumptions we base our decisions on have never been examined. Sometimes we hold these assumptions because people we respect (friends, parents, teachers, religious leaders) have told us they are right. At other times we have picked these assumptions up as we travel through life but can't say exactly where they've come from. To make good decisions in life we need to be sure that these assumptions are accurate and valid – that they fit the situations and decisions we are facing. Critical thinking describes the process we use to uncover and check our assumptions. Decisions based on critical thinking are more likely to be ones we feel confident about and to have the effects we want them to have.

Your Mental Models

Mental models are the filters we use to understand the world. A mental model is a representation of how something works. Every day we encounter more information than we can store, and the phenomena we encounter are too complex to understand in every detail. Therefore, we use filtering models to simplify the complex into organizable and understandable chunks, conceptual models to file and organize new information, and reasoning models to create new ideas and make decisions.

Mental models shape what we think, how we interpret the world, what we value most, where we direct our attention, how we reason, and where we perceive opportunities. The quality of our thinking is only as good as the models in our heads and their usefulness in a given situation. The best models improve our likelihood of making the best decisions. By critically examining our assumptions, we can adjust them to be in better accord with reality, and they become more powerful mental models in the toolkit through which we understand reality.

All of us go through life with many incorrect core assumptions about reality. For example, most of us believe (1) we are perceiving reality accurately, (2) our perceptions are valid, and (3) that what is obvious to us must be obvious to others. Let that sink in for a minute: these are incorrect assumptions. It is simply not possible to perceive reality accurately, and everyone's reality is different. Our sensory nervous system sends the brain on the order of gigabytes of data per minute, but the brain has the attentional bandwidth to process only megabytes per minute. On top of that, we are always allocating some of our bandwidth to our thoughts (have you ever been lost in thought and missed an important detail?). To improve our thinking, we first have to accept that our perceptions of the moment are filtered through mental models and that our most dearly held beliefs may not correctly describe reality, and then be open to improving them.

Building your toolkit of mental models is a lifelong project. Stick with it, and you'll find that your ability to understand reality, accomplish your goals, deepen your relationships, and make the best decisions will keep improving. Critical thinking is a set of reasoning tools that we use to improve our other models of the world. These tools are the foundation upon which we can build our best mental models. In the next section, you will find an overview of the reasoning tools described in this website.

Organization of this Resource

Learn to analyze the elements of reasoning

The Critical Analysis page is dedicated to the first step in the process of developing critical thinking skills: recognizing the elements of reasoning that are present in the mind whenever we reason. I categorize six elements of reasoning: purposes, questions, points of view, information, assumptions, and reasoning. Note how these elements are related in the following paragraph.

To take command of our thinking, first we need to clearly formulate both our purpose and the question at issue. To uncover truths, we need to make logical inferences based on sound assumptions and information that is both accurate and relevant to the question we are dealing with. We need to understand our own point of view and fully consider other relevant viewpoints. We also need to recognize problems created by bugs in the human operating system and formally work around them. These bugs fall into two major categories, each of which has its own page.

Fallacies of reasoning are found in unsound arguments that may sound persuasive on the surface.

Cognitive biases are predictable, systematic patterns of deviation from rationality in judgment. Cognitive biases can lead to irrational thought through distortions of perceived reality, inaccurate judgment, or illogical interpretation. For example, confirmation bias is the tendency to interpret new evidence as confirmation of one's existing beliefs and to filter out information that does not confirm one's existing beliefs.

Learn to evaluate reasoning

The Critical Evaluation page describes the second step in the process of critical thinking, evaluating the quality of thought. We need to use concepts justifiably and follow out the implications of decisions we are considering.

Learn to avoid other common mistakes

No one is a master of every discipline; however, there are some common misconceptions that people have about other disciplines that you should learn to avoid.

Additionally, I have created a page of common writing errors that I have observed in student writing.

Before submitting your writing, please consult these resources as checklists and verify that you have done your best to avoid these mistakes.

Critical Analysis

Analysis is the act of breaking something complex down into simpler parts that you examine in detail. To critically analyze a text or idea, identify its purpose, the question at issue, the author's point of view, the kinds of information involved, the reasoning, and the conclusions.

Unless a text is simply presenting information, it will often contain arguments. An argument is a series of statements that reach a logical conclusion and is intended to reveal the degree of truth of another statement. Arguments begin with premises (kinds of information) that are related to each other using valid forms of reasoning (a process) to arrive at a logical conclusion, which is new information. A logical conclusion is a new kind of information that is true in light of the premises being true (if the premises are all facts) or that seems to be true (if the premises contain some opinions). A logical conclusion may be false if the premises are false or the reasoning is poor.

[Figure: an argument relates premises through reasoning to reach a logical conclusion]
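To make this structure concrete, here is a simple worked example (the content is invented for illustration):

Premise 1 (fact): All whales are mammals.
Premise 2 (fact): All mammals breathe air.
Reasoning (a valid form): if all A are B, and all B are C, then all A are C.
Logical conclusion (new information): All whales breathe air.

If either premise turned out to be false, the conclusion could be false even though the form of the reasoning is valid.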

1. Identify the Purposes

All texts or ideas have a purpose.

 

2. Identify the Questions at Issue

When reasoning is present, the author is attempting to figure something out, to answer some question, or to solve a problem.

 

3. Identify Points of View

All reasoning is done from some point of view. We often experience the world in such a way as to assume that we are observing things just as they are, as though we were seeing the world without the filter of a point of view. Nonetheless, we also recognize that others have points of view that lead them to conclusions we fundamentally disagree with. One of the key dispositions of critical thinking is the ongoing sense that, as humans, we always think within a perspective and virtually never experience things totally and absolutely. There is a connection, therefore, between thinking so as to be aware of our assumptions and intellectual humility. It is often helpful to open your mind and involve other people (friends, family, work colleagues) who can help you see yourself and your actions from unfamiliar perspectives. Sometimes reading books, watching videos, or having new experiences such as traveling to other cultures, going to college, or being an intern helps us become aware of our assumptions. It is equally important to recognize that every person's point of view is biased by their worldview and experiences, and therefore all points of view should be examined critically.

 

4. Distinguish Types of Information

Uncritical thinkers treat their conclusions as something given to them through experience, as something they directly observe in the world. As a result, they find it difficult to see why anyone might disagree with their conclusions. After all, they believe that the truth of their views is right there for everyone to see! Such people find it difficult to describe evidence without interpreting it through their point of view. Critical thinking requires the ability to label types of information and evaluate their quality before accepting an argument.

Information is true if it is in accord with reality. Since our knowledge of reality is always incomplete, in practice truth is measured by accord with the best information we have about reality. All information carries an associated degree of belief (a feeling about truth) or confidence (the scientific term for the statistical likelihood of truth) in its truth value. When analyzing, we are simply categorizing information rather than evaluating its quality.

All arguments are based on information. Premises are information that is used in the context of an argument. Information can be classified with four characteristics that describe the context in which it is used.

1. Evidence is information upon which conclusions are based. There are two categories of evidence: facts and opinions.

2. Assumptions are statements that we accept as true without proof or demonstration.

3. Conclusions are the results of reasoning, irrespective of their truth value.

4. Propaganda is information that is not objective and is used primarily to influence an audience and further an agenda.

4A. Identify Evidence

Evidence is information that is relevant to the question at issue. Both facts and opinions are evidence.

Facts

A fact is an accurate description of an object, event, or statement that is independently verifiable by empirical means.

There are two distinct senses of the word "factual." The word may refer to a verified fact. However, "factual" may also refer to claims that are "factual in nature" in the sense that they can be verified or disproven by observation or empirical study; such claims must still be evaluated to determine if they are true. People often confuse these two senses, even to the point of accepting as true statements which merely "seem factual," for example, "29.23% of Americans suffer from depression." Before I accept this as true, I should assess it. I should ask such questions as "How do you know? How could this be known? Did you merely ask people if they were depressed and extrapolate those results? How exactly did you arrive at this figure?"

Purported facts should be assessed for their accuracy, completeness, and relevance to the issue. Sources of purported facts should be assessed for their qualifications, track records, and impartiality. Many students have experienced an education which stressed retention and repetition of factual claims. Such an emphasis stunts students' desire and ability to assess alleged facts, leaving them open to manipulation. Likewise, activities in which students "distinguish fact from opinion" often confuse these two senses. They encourage students to accept as true statements which merely "look like" facts.

To identify facts, look for signal phrases such as: "The annual report confirms ...," "Scientists have recently discovered ...," "According to the results of the tests ...," "The investigation demonstrated ..."

Credible facts reference the observer of the information. You should accept a fact only after you have identified confirmation by many different independent observers and evaluated their credibility and potential bias. Even before this evaluation, you should reject a fact that does not have a clear source.

As an example, in the debate we watched, Nick Gillespie says, "[drugs are] not addictive for 99 percent of people." This is factual only in the sense that it may be empirically possible to measure, and you should not accept it as fact without more context, such as a source.

If you have the opportunity, ask someone, "Where did you get that information?" to give them the chance to confirm a fact. Until you actually understand the limits and source of the fact, you should regard the information as suspect and categorize it as an opinion that someone believes is true.

Opinions

An opinion is a statement that expresses either how a person feels about something or what a person thinks is true. With objective verification, opinions can become facts. If they cannot be proven or disproven, they will always be opinions.

Since we cannot examine the facts in all situations, sometimes we must rely on an opinion as evidence in an argument. Any conclusion derived from an argument that uses an opinion in place of a fact will generally be less reliable. You should acknowledge this uncertainty whenever you present such a conclusion.

 

4B. Identify Assumptions

An assumption is a statement that we accept as true without proof or demonstration. It is an unstated premise, presupposition, or opinion that is required to connect data to conclusions.

All human thought and experience is based on assumptions. Our thought must begin with something we believe to be true in a particular context. We are typically unaware of what we assume and therefore rarely question our assumptions. Much of what is wrong with human thought can be found in the uncritical or unexamined assumptions that underlie it. Identifying and evaluating the accuracy and validity of assumptions is arguably the most important application of critical thinking. Accurate and valid assumptions can become facts.

Assumptions are often very difficult to identify. Usually they are something we previously learned and do not question. They are part of our system of beliefs. We assume our beliefs to be true and use them to interpret the world about us.

This packet of exercises has many excellent examples of assumptions identified in short scenarios.

4C. Identify Conclusions

Conclusions are the results of reasoning.

In logic, conclusions can be categorized based on their truth value:

Additionally, conclusions are often categorized as either:

Conclusions also can be categorized based on their role in an argument:

It should be noted that different disciplines that study human thought (e.g., philosophy, cognitive psychology, artificial intelligence) define the distinction between a conclusion and an inference differently. To avoid confusion, I will make the following distinctions. When analyzing reasoning, a logical conclusion refers to the result of any argument. When analyzing a complex argument focused on a question at issue, an inference is a logical conclusion drawn from a single step in reasoning; it may be used as information in the premise of a successive step of reasoning. A drawn conclusion describes a logical conclusion that specifically answers the question at issue by logically relating many inferences as premises. The example in this article effectively illustrates my distinction between an inference and a drawn conclusion. (Note that other sources may define these words in exactly the opposite way!)

Conclusions are generally straightforward to identify in context. When analyzing a complex argument focused on a complex question at issue, inferences are often made implicitly in the course of reasoning. For this reason, an inference may be more difficult to identify. Critical thinkers try to monitor their inferences to keep them in line with what is actually implied by what they know. When speaking, critical thinkers try to use words that imply only what they can legitimately justify. They recognize that there are established word usages which generate established implications.

Examples:

A helpful tool is to first identify an inference (what do we infer from the situation being evaluated?) then identify an assumption that is the premise to that inference ("If the inference is true, what did I assume about the situation?"). Often an assumption you identify this way is an inference that can be further unpacked by repeating the second step to identify deeper core assumptions.

Situation: I heard a scratch at the door. I got up to let the cat in.

Inference: I inferred that the cat was at the door.

Ask: If that is true, what did I assume about the situation?

Assumptions: Only the cat makes that noise, and he makes it only when he wants to be let in.

Since different people can have different assumptions, they will make different inferences about the reality of the same situation.

Person One

Situation: A man is lying on the sidewalk.

Inference: That man is a bum.

Ask: If that is true, what did I assume about him in this situation?

Assumption: Only bums lie on sidewalks.

Person Two

Situation: A man is lying on the sidewalk.

Inference: That man is in need of help.

Ask: If that is true, what did I assume about him in this situation?

Assumption: Anyone lying on a sidewalk is in need of help.

4D. Identify Propaganda

Propaganda is a special category of information that is not objective and is used primarily to influence an audience to reach a specific conclusion. Propaganda attempts to arouse emotions and biases to short-circuit rational judgment. The author of propaganda deliberately designs an argument that does not hold up to critical thinking. Its use indicates an intent, at worst, to mislead or, at best, to persuade without the use of reasoning. Whether or not propaganda is ethical is a personal and context-dependent value judgment that is separate from critical thinking.

Students often find analysis of propaganda confusing because it is an extra feature of information rather than a type of its own. Information that is propaganda can be any non-objective type (opinion, assumption, and/or inference) if it is deliberately used to manipulate opinions using poor reasoning. Moreover, propaganda quite often utilizes poor reasoning: it frequently employs logical fallacies or takes advantage of cognitive biases to mislead.

The following is a list of common propaganda techniques:

Wikipedia has an extensive list of propaganda techniques with numerous examples.

5. Analyze Reasoning

The identification of poor reasoning invalidates an argument; the conclusion itself may or may not be true. You must formulate an alternative, valid argument to support the conclusion.

5A. Identify Logical Fallacies

Fallacies are faulty reasoning used in the construction of an argument. This topic is so vast that I have created a separate fallacies of reasoning page.

 

5B. Identify Cognitive Biases

A cognitive bias is a cognitive shortcut that leads to a loss of objectivity. Cognitive biases can lead to irrational thought through distortions of perceived reality, inaccurate judgment, or illogical interpretation. By learning about some of the most common biases, you can learn how to avoid falling victim to them.

The identification of cognitive biases at work in an argument should make you skeptical. Like fallacies, this topic is so vast that I have created a separate cognitive biases page to explain them.

Critical Evaluation

After we have cataloged the elements of reasoning, we must evaluate texts and our own reasoning for clarity, accuracy, precision, relevance, depth, breadth, significance, logic, and fairness. When making a decision with incomplete information, it is critical to recognize that truth is often a degree of belief based on our evaluation of the quality of the information and reasoning.

1. Evaluate point of view

 Critically evaluate the reliability of an author (and publisher):

Compare and contrast points of view to reveal how related material is presented by different authors and how the purposes of their writing differ. After reading two texts on the same topic, ask yourself:

 

1A. Evaluate a Scientific Author's Qualifications

 

2. Evaluate the Degree of Truth in Information

After analyzing to identify the different kinds of information, we must be explicit about the quality of each piece of information used in the text or our own thinking. Using the highest quality information in arguments increases the degree of belief in the truth of the argument. We must acknowledge when poor quality information is used in an argument and clearly state that we have low confidence in the truth of the argument.

A scientist's perspective on facts

In everyday language, most of us consider a confirmed fact to be truth. However, scientists consider all truth to be provisional; the current facts serve as a description of truth only for the time being. Scientists assume that all knowledge has the potential to be overturned if new information suggests that it should be. Scientists use uncertainty and percent confidence to describe the statistical likelihood that a fact is true.

Physicist Richard P. Feynman once said, "In physics and in human affairs... whatever is not surrounded by uncertainty, cannot be the truth." He said this in reference to a newspaper article that asserted absolute belief in a scandalous rumor regarding a colleague. He observed that a responsible reporter should have referred to an "alleged incident." With no reference to a process that had first evaluated the quality of the truth, he considered the accusation to be opinion, not fact.

Examples:

This last example highlights the fact that all scientific information is actually a statement of probability. Nothing in science is ever "proven" or "100% certain." Always avoid saying that science has proven something; this is a discipline-specific error in reasoning commonly made by non-scientists. Non-scientists sometimes misinterpret what it means when scientists attach uncertainty to every fact. If there is 95% confidence that climate change is being caused by human activity, people with a psychological bias against taking action around this crisis may focus on the 5% uncertainty in the truth value. On the other hand, people who are convinced of this fact and want to take action get frustrated that scientists refuse to say, "It has been proven; we are certain." In practice, 95% confidence is the gold standard in science for a complex phenomenon being "as good as proven," but scientists always keep open the possibility that they don't have all the data and that this fact may turn out to be more nuanced or simply wrong.
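To make "percent confidence" concrete, here is a minimal Python sketch of how a 95% confidence interval is computed for a measured quantity. The measurements are invented for illustration:

# A minimal sketch of a 95% confidence interval for a sample mean.
import statistics

measurements = [9.8, 10.1, 9.9, 10.3, 10.0, 9.7, 10.2, 10.0]  # invented data

n = len(measurements)
mean = statistics.mean(measurements)
sem = statistics.stdev(measurements) / n ** 0.5  # standard error of the mean

# For small samples the multiplier comes from the t-distribution;
# 2.36 is the two-sided 95% value for n - 1 = 7 degrees of freedom.
t_95 = 2.36
low, high = mean - t_95 * sem, mean + t_95 * sem

print(f"mean = {mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")

A scientist reporting this measurement would say the true value lies within the interval with 95% confidence, a statement of probability rather than proof.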

Comparing and Contrasting Information

By comparing and contrasting information, you can identify facts, make inferences, and draw conclusions that would not otherwise be possible. After reading two texts, ask yourself:

 

2A. Evaluate Assumptions

[Unfinished]

Contrasting Assumptions

If two sides are arguing from different assumptions, it is very effective to focus on these in critical evaluation. Controversies generally rest on different sides interpreting the same information through different assumptions.

Assumptions can be justified or unjustified, depending upon whether or not we have good reasons for them. Likewise, if two sides of a controversy share assumptions that are found faulty, both arguments become invalid.

Example:

 

3. Evaluate reasoning

When an argument doesn't "feel" right, first analyze it as follows. Write down the information that forms each premise of the argument and categorize it. Write down the conclusion and label it. Write your best general description of the reasoning that links them. The mechanics of the reasoning are usually found in a "therefore" type statement. To unmask the logic, replace the premise statements with letters that represent concepts and properties. Example: "It's raining and the sun is shining, therefore it's raining." The logical form is "X has property Q and P, therefore X has property Q." The logic is valid. [I will link some more examples later.]
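If you want to check a logical form mechanically, here is a minimal sketch using Python's sympy library; a form is valid when its negation has no satisfying assignment:

# Check the form "Q and P, therefore Q" from the example above.
from sympy import symbols, And, Not, Implies
from sympy.logic.inference import satisfiable

P, Q = symbols("P Q")  # P: "the sun is shining", Q: "it's raining"

form = Implies(And(Q, P), Q)  # the premises imply the conclusion

# Valid means no assignment of True/False to P and Q makes the form false.
print(satisfiable(Not(form)) is False)  # prints: True (the form is valid)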

3A. Logical Fallacies

Fallacies are faulty reasoning used in the construction of an argument. This topic is so vast that I have created a separate fallacies of reasoning page. The identification of fallacious reasoning invalidates an argument, and we are then forced to formulate our own arguments to uncover truth.

3B. Evaluate Propaganda

Propaganda is information that is not objective and is used primarily to influence an audience to reach a specific conclusion. Propaganda attempts to arouse emotions to short-circuit rational judgment. It is not by definition "good" or "bad." However, its use indicates possible intent, at worst, to mislead or, at best, to persuade without the use of reasoning. The techniques of propaganda are utilized in some logical fallacies, so you will find some conceptual overlap. The following is a list of common propaganda techniques:

 

3C. Evaluate Cognitive Biases

A cognitive bias is a cognitive shortcut that leads to a loss of objectivity. Cognitive biases can lead to irrational thought through distortions of perceived reality, inaccurate judgment, or illogical interpretation. By learning about some of the most common biases, you can learn how to avoid falling victim to them. The identification of cognitive biases at work in an argument should make you skeptical. Like fallacies, this topic is so vast that I have created a separate cognitive biases page to explain them.

4. Evaluate Judgments and Conclusions

After you read an article, you should be able to answer these questions:

 

5. Predict Future Implications and Consequences

Whether the reasonable future implications and consequences of a conclusion or judgment align with your values should inform your reasoning.

 

Fallacies

Fallacies are faulty reasoning used in the construction of an argument. They make an argument appear to be better than it is. Here are some major fallacies of reasoning that you should be able to recognize. Most of the fallacies that follow are known as informal fallacies because they originate in a reasoning error. In contrast, formal fallacies, also known as non sequiturs, arise from the logical form of the argument. The following article introduces the most common fallacies.

In this video example, we see rapid-fire deployment of straw man arguments, false dichotomies, and some formal fallacies against a kid who, impressively, recognizes each flaw of reasoning.

Identifying fallacies

Remember that arguments begin with premises that are related to each other using valid forms of reasoning to arrive at a logical conclusion.

Once you have analyzed the parts of an argument, evaluate:

Is the reasoning faulty?

Is/are the premise(s) faulty?

Are the premises and/or the arguments a distraction from the actual issue in question?

Are you still not able to identify the error in reasoning?

Formal Fallacies (Non Sequiturs)

An error in the argument's form. Invalid logic is applied to the premises.

Fallacy fallacy. This is the inference that an argument containing a fallacy must have a false conclusion. It is entirely possible for someone to pose a bad argument for something that is true. Try not to get so caught up in the identification of logical fallacies that you are quick to dismiss a flawed argument; instead, try to make the argument reasonable.

Syllogistic fallacies. There are many kinds of these. Syllogisms are generally three-part arguments that use two premises to derive a conclusion. The premises and conclusion all take the form of categorical propositions that somehow relate two categories. These fallacies derive from incorrect application of logic. They are often more obvious if you draw a Venn diagram of the categories and shared features, as in the sketch below.
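In the same spirit as a Venn diagram, a small Python sketch with sets can expose an invalid syllogism by exhibiting a counterexample. The categories and members below are invented for illustration:

# The "undistributed middle" fallacy:
#   All cats are mammals. All dogs are mammals. Therefore all cats are dogs.
cats = {"Felix", "Garfield"}
dogs = {"Rex", "Lassie"}
mammals = cats | dogs | {"Moby the whale"}  # sets play the role of Venn circles

print(cats <= mammals)  # premise 1, "all cats are mammals": True
print(dogs <= mammals)  # premise 2, "all dogs are mammals": True
print(cats <= dogs)     # conclusion, "all cats are dogs": False

True premises with a false conclusion demonstrate that the form itself is invalid.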

 

Informal Fallacies

The proposed conclusion is not supported by the premises.

Whereas formal fallacies can be identified by form, informal fallacies are identified by examining the argument's content. There are many subcategories.

Improper Premise Fallacies

Any form of argument in which the conclusion occurs as one of the premises.

Begging the question. Providing what is essentially the conclusion of the argument as a premise. You assume without proof the stand/position that is in question. To "beg the question" is to put forward an argument whose validity requires that its own conclusion is true. Formally, begging the question statements are not structured as an argument and are harder to detect than circular arguments. Some authors consider circular reasoning to be a special case of begging the question. In the following examples, notice that the question at issue answers itself without argument.

Circular reasoning. Formally, circular reasoning differs from begging the question by specifically referring to arguments in which the reasoner simply repeats what they already assumed beforehand in different words without actually arriving at any new conclusion. Circular reasoning is not persuasive because a listener who doubts the conclusion will also doubt the premise that leads to it. This may sound silly, but people make such statements quite often when put under pressure.

Examples:

Loaded question. Asking a question that has an assumption built into it so that it can't be answered without appearing guilty.

Weak Premise Fallacies

These reach a conclusion from weak premises. Unlike fallacies of relevance, the premises are related to the conclusions and yet only weakly support the conclusions. A faulty generalization is thus produced.

Cherry Picking / Card Stacking. The presentation of only that information or those arguments most favorable to a particular point of view.

Faulty/Weak analogy. Comparison is carried too far, or the things compared have nothing in common.

Hasty Generalization (from an Unrepresentative Sample). A judgment is made on the basis of inaccurate or insufficient evidence. Hasty generalizations are extremely common because there is often no agreement about what constitutes sufficient evidence. Generalization from one person's experience is a common example of this fallacy.

No True Scotsman. Making what could be called an appeal to purity as a way to dismiss relevant criticisms or flaws of an argument.

Questionable Cause Fallacies

The primary basis for these errors is either inappropriate deduction (or rejection) of causation or a broader failure to properly investigate the cause of an observed effect.

Correlation Without Causation / Cum Hoc. A faulty assumption that, because there is a correlation between two variables, one caused the other.

Gambler's Fallacy. The incorrect belief that separate, independent events can affect the likelihood of another random event. (A simulation sketch follows this list of questionable cause fallacies.)

False Cause / Post Hoc. Treating coincidence of one event following another as causation.

Single Cause Fallacy / Causal Oversimplification. It is assumed that there is one, simple cause of an outcome when in reality it may have been caused by a number of only jointly sufficient causes or a third cause.
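To see why the gambler's fallacy fails, here is a minimal Python simulation sketch: even immediately after a streak of five tails, the next coin flip still comes up heads about half the time.

# Simulate one million fair coin flips and record the flip that
# immediately follows each streak of five or more tails.
import random

random.seed(0)  # make the illustration reproducible

after_streak = []
tails_in_a_row = 0
for _ in range(1_000_000):
    heads = random.random() < 0.5
    if tails_in_a_row >= 5:       # we just saw at least five tails
        after_streak.append(heads)
    tails_in_a_row = 0 if heads else tails_in_a_row + 1

print(f"P(heads after 5 tails) ~ {sum(after_streak) / len(after_streak):.3f}")
# prints a value close to 0.500: past flips do not change the odds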

 

Relevance Fallacies

These are distractions from the argument, typically with some distracting sentiment that seems to be relevant but isn't really on-topic. Red herrings are a specific sub-category of relevance fallacy distinguished by an intent to mislead, often due to the lack of a real argument.

Ad Hominem Argument. Rejection of a person's view on the basis of personal characteristics, background, physical appearance, or other features irrelevant to the argument at issue. Pay close attention to words that question an opponent's character. Examples: slob, prude, moron, embarrassing, stubborn.

Ambiguity. Using double meanings or other ambiguities of language to mislead or misrepresent the truth. Meaning in language can be so slippery that there are at least a dozen sub-fallacies including ambiguous grammar, equivocation, and quoting out of context (a tactic most often encountered on the Internet).

Appeal to Authority. This fallacy happens when we misuse an authority. This misuse of authority can occur in a number of ways. We can cite only authorities — steering conveniently away from other testable and concrete evidence as if expert opinion is always correct. Or we can cite irrelevant authorities, poor authorities, or false authorities.

Appeal to Emotion. The use of non-objective words, phrases, or expressions that arouse emotion, having the effect of short-circuiting reason. Common examples include appeals to fear, flattery, outrage, pity, pride, ridicule of an opponent's argument, spite, and wishful thinking. Emotional appeals are also a powerful tool in propaganda.

Appeal to Nature. Any argument that assumes "natural" things are "good" and "unnatural" things are "bad" is flawed because concepts of the natural, good, and bad are all vague and ambiguous. The person creating the argument can define these in any way that supports their position. Appeals to Nature also employ the begging the question fallacy (above).

Argument from ignorance / burden of proof. This fallacy asserts that a proposition is true because it has not yet been proven false, or that a proposition is false because it has not yet been proven true. This type of argument asserts a truth and shifts the burden of providing counter-evidence onto someone else. Logically, we should remain skeptical and demand legitimate evidence from the person asserting the proposition.

Argument from incredulity (appeal to common sense). Saying that because one finds something difficult to understand, it is therefore not true.

Association fallacy. Inferring either guilt or honor by association. It is an irrelevant attempt to transfer the qualities of one thing to another by merely invoking them together. Sometimes fallacies of this kind may also be appeals to emotion, hasty generalizations, and/or ad hominem arguments.

Bandwagon / FOMO. The fear of being "different" or of "missing out" is used to influence behavior.

Genetic fallacy. Judging something good or bad on the basis of where it comes from, or from whom it comes.

Ignoring The Question. Digression, obfuscation, or similar techniques are used to avoid answering a question.

Missing the point / Irrelevant Conclusion. Presenting an argument that may or may not be logically valid and sound, but whose conclusion fails to address the issue in question.

Straw Man Argument. Appearing to refute an opponent's argument by instead creating an oversimplified or extreme version of the argument (a "straw man") and refuting that instead.

Texas sharpshooter. A conclusion is drawn from data with a stress on similarities while ignoring differences. An example is seeing localized patterns where none exist. The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.

Tu Quoque Fallacy. Latin for "you too," this fallacy is also called the "appeal to hypocrisy" because it distracts from the argument by pointing out hypocrisy in the opponent. This tactic doesn't prove one's point, because even hypocrites can tell the truth.

Informal Fallacies with Multiple Structural Problems

Composition / Division. The fallacy of composition infers that something is true of the whole from the fact that it is true of a part of the whole. The opposite reasoning is the fallacy of division.

False dilemma / false dichotomy / black and white. Reducing an issue to only two possible decisions.

Middle ground / false compromise / argument to moderation. Arguing that a compromise, or middle point, between two extremes is the truth.

Slippery Slope. Moving from a seemingly benign premise or starting point and working through a number of small steps to an improbable extreme when many other outcomes could have been possible. Although this form of slippery slope is a sub-type of the formal appeal to probability fallacy (it assumes something will occur based on probability and thus breaks rules of formal logic), slippery slope arguments can take on many other forms and are generally categorized as informal fallacies.

Special pleading. Moving the goalposts to create exceptions when a claim is shown to be false. Applying a double standard, generally to oneself.

 

Biases

"The first principle is that you must not fool yourself – and you are the easiest person to fool." –Richard Feynman

As we examine our assumptions and improve our mental models, we have to confront the reality that we all have inescapable, hardwired biases that we cannot change through critical thinking. Because we all have them, science can teach us a lot about our biases. Biases are an inescapable feature of being human. No training will stop you from committing them. However, learning about them can help you second-guess the validity of your judgment, think more critically, consider other points of view, and develop empathy for the biases in others.

The operating system of our brains uses biologically evolved shortcuts in our thinking. Many of these shortcuts are useful and essential. However, we have also inherited bugs in the code that make many of our judgments irrational. A cognitive bias is a cognitive shortcut that leads to a loss of objectivity. By learning about some of the most common biases, you can learn how to avoid falling victim to them. For example, many of the biases below occur because the brain tends to find patterns where none exist and uses irrational biases to reduce cognitive dissonance when stressed with contradictory ideas. To learn more, I recommend reading Thinking, Fast and Slow and You Are Not So Smart.

Common Cognitive Biases

Anchoring. The first thing you judge influences your judgment of all that follows.

Human minds are associative in nature, so the order in which we receive information helps determine the course of our judgments and perceptions. For instance, the first price offered for a used car sets an 'anchor' price which will influence how reasonable or unreasonable a counter-offer might seem. Even if we feel like an initial price is far too high, it can make a slightly less-than-reasonable offer seem entirely reasonable in contrast to the anchor price.

Be especially mindful of this bias during financial negotiations such as those over houses, cars, and salaries. The initial price offered has been shown to have a significant effect.

Availability heuristic. Your judgments are influenced by what springs most easily to mind.

How recent, emotionally powerful, or unusual your memories are can make them seem more relevant. This, in turn, can cause you to apply them too readily. For instance, when we see news reports about homicides, child abductions, and other terrible crimes it can make us believe that these events are much more common and threatening to us than is actually the case.

Try to gain different perspectives and relevant statistical information rather than relying purely on first judgments and emotive influences.

Barnum effect. You see personal specifics in vague statements by filling in the gaps (e.g. interpreting your horoscope).

Because our minds are given to making connections, it's easy for us to take nebulous statements and find ways to interpret them so that they seem specific and personal. The combination of our egos wanting validation with our strong inclination to see patterns and connections means that when someone is telling us a story about ourselves, we look to find the signal and ignore all the noise.

Psychics, astrologers and others use this bias to make it seem like they're telling you something relevant. Consider how things might be interpreted to apply to anyone, not just you.

Belief bias. You are more likely to accept an argument that supports a conclusion aligned with your values, beliefs, and prior knowledge, while rejecting counterarguments to that conclusion.

It's difficult for us to set aside our existing beliefs to consider the true merits of an argument. In practice this means that our ideas become impervious to criticism and are perpetually reinforced. Instead of thinking about our beliefs in terms of 'true or false,' it's probably better to think of them in terms of probability. For example, we might assign a 95%+ chance that thinking in terms of probability will help us think better, and a less than 1% chance that our existing beliefs have no room for any doubt. Thinking probabilistically forces us to evaluate more rationally (a minimal sketch of updating a belief this way follows below).

A useful thing to ask is 'when and how did I get this belief?' We tend to automatically defend our ideas without ever really questioning them.
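One concrete way to treat a belief as a probability is to update it with Bayes' rule when new evidence arrives. Here is a minimal Python sketch; the numbers are invented for illustration:

# Update a degree of belief in a claim given one piece of evidence.
def update_belief(prior, p_evidence_if_true, p_evidence_if_false):
    # Bayes' rule: P(claim | evidence) =
    #   P(evidence | claim) * P(claim) / P(evidence)
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

belief = 0.70  # "I am 70% sure this claim is true"
# New evidence that fits the claim poorly (20%) but fits its denial well (60%):
belief = update_belief(belief, 0.20, 0.60)
print(f"belief after the evidence: {belief:.2f}")  # about 0.44

The point is not the exact numbers but the habit: evidence should move your degree of belief, rather than being accepted or rejected based on whether you already agree.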

Belief perseverance. When some aspect of your core beliefs is challenged, it can cause you to believe even more strongly.

We can experience being wrong about some ideas as an attack upon our very selves, or our tribal identity. This can lead to motivated reasoning, which causes a reinforcement of beliefs despite disconfirming evidence. Recent research shows that this "backfire effect" certainly doesn't happen all the time. Most people will accept a correction relating to specific facts; however, the backfire effect may reinforce a related or 'parent' belief as people attempt to reconcile a new narrative in their understanding.

"It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so." —Mark Twain

 

Bystander effect. You presume someone else is going to do something in an emergency situation.

When something terrible is happening in a public setting we can experience a kind of shock and mental paralysis that distracts us from a sense of personal responsibility. The problem is that everyone can experience this sense of deindividuation in a crowd. This same sense of losing our sense of self in a crowd has been linked to violent and anti-social behaviors. Remaining self-aware requires some amount of effortful reflection in group situations.

If there's an emergency situation, presume to be the one who will help or call for help. Be the change you want to see in the world.

Confirmation bias. You favor things that confirm your existing beliefs.

We are primed to see and agree with ideas that fit our preconceptions, and to ignore and dismiss information that conflicts with them. You could say that this is the mother of all biases, as it affects so much of our thinking through motivated reasoning. To help counteract its influence we ought to presume ourselves wrong until proven right. "When you are studying any matter, or considering any philosophy, ask yourself only what are the facts and what is the truth that the facts bear out. Never let yourself be diverted either by what you wish to believe or by what you think would have beneficent social effects if it were believed." – Bertrand Russell

Think of your ideas and beliefs as software you're actively trying to find problems with rather than things to be defended.

Curse of knowledge. Once you understand something you presume it to be obvious to everyone.

Things make sense once they make sense, so it can be hard to remember why they didn't. We build complex networks of understanding and forget how intricate the path to our available knowledge really is. This bias is closely related to the hindsight bias, wherein you tend to believe that an event was predictable all along once it has occurred. We have difficulty reconstructing our own prior mental states of confusion and ignorance once we have clear knowledge.

When teaching someone something new, go slow and explain like they're ten years old (without being patronizing). Repeat key points and facilitate active practice to help embed knowledge.

Declinism. You remember the past as better than it was, and expect the future to be worse than it will likely be.

Despite living in the most peaceful and prosperous time in history, many people believe things are getting worse. The 24-hour news cycle, with its reporting of overtly negative and violent events, may account for some of this effect. We can also look to the generally optimistic view of the future in the early 20th century as having shifted to a dystopian and apocalyptic expectation after the world wars and during the cold war. The greatest tragedy of this bias may be that our collective expectation of decline may contribute to a real-world self-fulfilling prophecy. For some real data, consider the measurable metrics suggested below.

Instead of relying on nostalgic impressions of how great things used to be, use measurable metrics such as life expectancy, levels of crime and violence, and prosperity statistics.

Dunning-Kruger effect. The more you know, the less confident you're likely to be. The less you know, the more confident you are likely to be.

Because experts know just how much they don't know, they tend to underestimate their ability; but it's easy to be over-confident when you have only a simple idea of how things are. Try not to mistake the cautiousness of experts as a lack of understanding, nor to give much credence to lay-people who appear confident but have only superficial knowledge.

"The whole problem with the world is that fools and fanatics are so certain of themselves, yet wiser people so full of doubts." —Bertrand Russell

Framing effect. You allow yourself to be unduly influenced by context and delivery.

We all like to think that we think independently, but the truth is that all of us are, in fact, influenced by delivery, framing and subtle cues. This is why the ad industry is a thing, despite almost everyone believing they're not affected by advertising messages. The phrasing of how a question is posed, such as for a proposed law being voted on, has been shown to have a significant effect on the outcome.

Only when we have the intellectual humility to accept the fact that we can be manipulated can we hope to limit how much we are. Try to be mindful of how things are being put to you.

Fundamental attribution error. You judge others on their character, but yourself on the situation.

If you haven't had a good night's sleep, you know why you're being a bit slow; but if you observe someone else being slow you don't have such knowledge and so you might presume them to just be a slow person. Because of this disparity in knowledge we often overemphasize the influence of circumstance for our own failings, as well as underestimating circumstantial factors to explain other people's problems.

It's not only kind to view others' situations with charity, it's more objective too. Be mindful to also err on the side of taking personal responsibility rather than justifying and blaming.

Groupthink. You let the social dynamics of a group situation override the best outcomes.

Dissent can be uncomfortable and dangerous to one's social standing, and so often the most confident or first voice will determine group decisions. Because of the Dunning-Kruger effect, the most confident voices are also often the most ignorant.

Rather than openly contradicting others, seek to facilitate objective means of evaluation and critical thinking practices as a group activity.

In-group bias. You unfairly favor those who belong to your group.

We presume that we're fair and impartial, but the truth is that we automatically favor those who are most like us, or belong to our groups. This blind tribalism has evolved to strengthen social cohesion, however in a modern and multicultural world it can have the opposite effect.

Try to imagine yourself in the position of those in out-groups; whilst also attempting to be dispassionate when judging those who belong to your in-groups.

Just world hypothesis. Your preference for justice makes you presume it exists.

A world in which people don't always get what they deserve, hard work doesn't always pay off, and injustice happens is an uncomfortable one that threatens our preferred narrative. However, it is also the reality. This bias is often manifest in ideas such as 'what goes around comes around' or an expectation of 'karmic balance', and can also lead to blaming victims of crime and circumstance.

A more just world requires understanding rather than blame. Remember that everyone has their own life story, we're all fallible, and bad things happen to good people.

Halo effect. How much you like someone, or how attractive they are, influences your other judgments of them.

Our judgments are associative and automatic, and so if we want to be objective we need to consciously control for irrelevant influences. This is especially important in a professional setting. Things like attractiveness can unduly influence issues as important as a jury deciding someone's guilt or innocence. If someone is successful or fails in one area, this can also unfairly color our expectations of them in another area.

If you notice that you're giving consistently high or low marks across the board, it's worth considering that your judgment may be suffering from the halo effect.

Negativity bias. You allow negative things to disproportionately influence your thinking.

The pain of loss and hurt is felt more keenly and persistently than the fleeting gratification of pleasant things. We are primed for survival, and our aversion to pain can distort our judgment in a modern world. In an evolutionary context it makes sense for us to be heavily biased to avoid threats, but because this bias affects our judgments in other ways, it means we aren't giving enough weight to the positives.

Pro-and-con lists, as well as thinking in terms of probabilities, can help you evaluate things more objectively than relying on a cognitive impression.

Optimism bias. You overestimate the likelihood of positive outcomes.

There can be benefits to a positive attitude, but it's unwise to allow such an attitude to adversely affect our ability to make rational judgments (they're not mutually exclusive). Wishful thinking can be a tragic irony insofar as it can create more negative outcomes, such as in the case of problem gambling.

If you make rational, realistic judgments you'll have a lot more to feel positive about.

Pessimism bias. You overestimate the likelihood of negative outcomes.

Pessimism is often a defense mechanism against disappointment, or it can be the result of depression and anxiety disorders. Pessimists often justify their attitude by saying that they'll either be vindicated or pleasantly surprised; however, a pessimistic attitude may also limit potential positive outcomes. It should also be noted that pessimism is something very different from skepticism: the latter is a rational approach that seeks to remain impartial, while the former is an expectation of bad outcomes.

Perhaps the worst aspect of pessimism is that even if something good happens, you'll probably feel pessimistic about it anyway.

Placebo effect. If you believe you're taking medicine it can sometimes 'work' even if it's fake.

The placebo effect can work for things that our mind influences (such as pain) but not so much for things like viruses or broken bones. Factors like the size and color of pills can influence how strong the effect is and may even result in real physiological outcomes. We can also falsely attribute getting better to an inert substance simply because our immune system has fought off an infection (i.e., we would have recovered in the same amount of time anyway).

Homeopathy, acupuncture, and many other forms of natural 'medicine' have been proven to be no more effective than placebo. Keep a healthy body and bank balance by using evidence-based medicine from a qualified doctor.

Reactance. You'd rather do the opposite of what someone is trying to make you do.

When we feel our liberty is being constrained, our inclination is to resist; however, in doing so we can overcompensate. While blind conformity is far from an ideal way to approach things, neither is being a knee-jerk contrarian.

Be careful not to lose objectivity when someone is being coercive/manipulative, or trying to force you do something. Wisdom springs from reflection, folly from reaction.

Self-serving bias. You believe your failures are due to external factors, yet you're responsible for your successes.

Many of us enjoy unearned privileges, luck and advantages that others do not. It's easy to tell ourselves that we deserve these things, whilst blaming circumstance when things don't go our way. Our desire to protect and exalt our own egos is a powerful force in our psychology. Fostering humility can help countermand this tendency, whilst also making us nicer humans.

When judging others, be mindful of how this bias interacts with the just-world hypothesis, fundamental attribution error, and the in-group bias.

Spotlight effect. You overestimate how much people notice how you look and act.

Most people are much more concerned about themselves than they are about you. Absent overt prejudices, people generally want to like and get along with you, as it gives them validation too. It's healthy to remember that although we're the main character in the story of our own life, everyone else is center stage in theirs too. This bias causes many people to attribute malice to others' motives when there may have been a simple misunderstanding.

Instead of worrying about how you're being judged, consider how you make others feel. They'll remember this much more, and you'll make the world a better place.

Sunk cost fallacy. You irrationally cling to things that have already cost you something.

When we've invested our time, money, or emotion into something, it hurts us to let it go. This aversion to pain can distort our better judgment and cause us to make unwise investments. A sunk cost is one that we cannot recover, so it is rational to disregard it when making decisions going forward. For instance, if you've spent money on a meal but you only feel like eating half of it, it's irrational to continue to stuff your face just because 'you've already paid for it,' especially considering that you're wasting actual time doing so.

To regain objectivity, ask yourself: had I not already invested something, would I still do so now? What would I counsel a friend to do if they were in the same situation?

Discipline-specific misconceptions often made in arguments

Chemistry

Medicine

Neuroscience

Philosophy

Science

A longer list of misconceptions

Wikipedia has a great list of common misconceptions on many other topics.

Writing Tips

General Style Tips 

Citing and Referencing

 

Presenting Information

 

Glossary 

Why are precise definitions of concepts and ideas important?

Humans think within concepts or ideas. We can never achieve command over our thoughts unless we learn how to achieve command over our concepts or ideas. Thus we must learn how to identify the concepts or ideas we are using, contrast them with alternative concepts or ideas, and clarify what we include and exclude by means of them. For example, most people say they believe strongly in democracy, but few can clarify with examples what that word does and does not imply. Most people confuse the meaning of words with cultural associations, with the result that "democracy" means to people whatever we do in running our government—any country that is different is undemocratic. We must distinguish the concepts implicit in the English language from the psychological associations surrounding that concept in a given social group or culture. The failure to develop this ability is a major cause of uncritical thought and selfish critical thought.

Fundamental Definitions

Argument. An argument is a series of statements that reach a conclusion and is intended to reveal the degree of truth of another statement. Arguments begin with premises (kinds of information) that are related to each other using valid forms of reasoning (a process) to arrive at the logical conclusion, which is new information. A logical conclusion is a new kind of information that is true in light of the premises being true (if the premises are all facts) or that seems to be true (if the premises contain opinions).

Critical thinker. A well-cultivated critical thinker raises vital questions and problems, formulating them clearly and precisely; gathers and assesses relevant information, using abstract ideas to interpret it effectively; comes to well-reasoned conclusions and solutions, testing them against relevant criteria and standards; thinks open mindedly within alternative systems of thought, recognizing and assessing, as need be, their assumptions, implications, and practical consequences; is committed to overcoming our native confirmation bias, egocentrism, and sociocentrism; and communicates effectively with others in figuring out solutions to complex problems. ( https://www.criticalthinking.org )

Concept. A concept is a generalized idea of a thing or of a class of things; concepts make up the fundamental building blocks of thought. Concepts are your brain's representations of past experiences (Barsalou 2003 and 2008). Using concepts, your brain groups some things together and separates others. You can look at three mounds of dirt and perceive two of them as "hills" and one as a "mountain," based on your concepts. The dominant psychological/philosophical school of thought known as constructivism assumes that the world is like a sheet of pastry and your concepts are cookie cutters that carve boundaries, not because the boundaries are natural, but because they're useful or desirable. These boundaries have physical limitations, of course; you'd never perceive a mountain as a lake (Boghossian 2006).

Empirical. Relying on or derived from experiment, observation, or experience as opposed to conceptual or evaluative.

Idea. An idea is anything existing in the mind as an object of knowledge or thought based on concepts regarding particular instances of a class of things. The word specifically refers to something conceived in the mind or imagined. An idea can be specific whereas concepts are generalized.

Thought. A thought is any idea, whether or not expressed, that occurs to the mind in reasoning or contemplation.

Additional Definitions

For additional definitions of the objects of mind and parts of thinking, I suggest this glossary: https://www.criticalthinking.org/pages/glossary-of-critical-thinking-terms/4