Category: Critical Thinking

  • Logical Fallacies: The Gambler’s Fallacy

    Unfortunately, I have a very personal reason for choosing the gambler’s fallacy as my next topic. I am experiencing the effects of this fallacy with someone close to me who is in the grips of a gambling addiction. For years, she has insisted to me that winning depends upon making large bets. She believes that betting more money improves the odds of coming up lucky on her vice of choice, the slot machine. No amount of patient explanation of statistics and odds can dissuade her from this belief. She also falls prey to the classic version of the gambler’s fallacy: the belief that the odds of one event are influenced by the outcome of the preceding event. This can sometimes be true. For example, in cases where numbers are drawn and not replaced, as in bingo, the odds of the remaining numbers being drawn increase with each subsequent draw (that is, in a bag of ten numbered balls, the odds of drawing any particular number are one in ten; once the first ball is drawn, the odds of drawing any of the remaining numbers become one in nine, and so on). However, it is not true for simple odds such as coin tosses. Each and every toss of a coin is an independent event. So, even if you get nine tails in a row, the odds of getting heads on the tenth toss remain exactly the same as they were for the preceding nine tosses; that is, 50 percent. Yet, many people will believe that the tenth toss has much greater odds of being heads because the previous nine tosses were tails. This is the gambler’s fallacy.
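
    Since the coin-toss claim is just a statement about probability, it can be checked by brute force. Here is a minimal Python sketch (my own illustration, not part of the original post) that simulates a couple of million tosses of a fair coin and measures how often the toss immediately following a run of nine tails comes up heads; because every toss is independent, the result hovers right around 50 percent.

    ```python
    import random

    def prob_heads_after_tails_run(num_tosses=2_000_000, run_length=9):
        """Toss a fair coin num_tosses times and measure how often the toss
        that immediately follows run_length consecutive tails comes up heads."""
        tails_streak = 0
        follow_up_tosses = 0
        heads_after_streak = 0
        for _ in range(num_tosses):
            toss_is_heads = random.random() < 0.5
            if tails_streak >= run_length:
                # This toss comes right after nine (or more) tails in a row.
                follow_up_tosses += 1
                if toss_is_heads:
                    heads_after_streak += 1
            tails_streak = 0 if toss_is_heads else tails_streak + 1
        return heads_after_streak / follow_up_tosses if follow_up_tosses else float("nan")

    print(f"P(heads right after nine tails) ~ {prob_heads_after_tails_run():.3f}")
    # Prints a value very close to 0.5: the coin has no memory of earlier tosses.
    ```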

    It is amazing how many people fall for the gambler’s fallacy. You see it operating not only in casinos but in lotteries. When a lottery jackpot gets really big, it is because several drawings have passed with no winner. And of course, the bigger the jackpot gets, the more people buy tickets. Many people do this simply because they hope to win the huge jackpot, not because they believe their odds are any better; but I have had conversations with many people who insist that they are more likely to win when the jackpot is bigger. Their argument is a perfect example of the gambler’s fallacy: because no winner has been drawn for so many weeks, the odds of a winner must be greater for the bigger jackpots! This is true in one specific sense: the more people who buy tickets, the more combinations of numbers are covered, so the odds that someone, somewhere, will win do go up. But the big jackpot and the long time elapsed since the last winner do not change the fundamental odds of drawing, say, five numbers out of 56. No matter how big the jackpot, the odds remain exactly the same for each and every drawing. For the Mega Millions lottery, the odds of drawing five particular numbers and the Mega number are 1 in 175,711,536. Again, these odds remain the same no matter how big the jackpot and no matter how many tickets have been purchased. It seems that attaching money or some other consequence to the outcome of a random event scrambles people’s ability to rationally judge the odds.
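
    For the curious, that 1 in 175,711,536 figure falls straight out of the combinatorics of the Mega Millions format in use at the time: five numbers chosen from 56 plus one Mega number chosen from 46 (my reading of the quoted odds; treat this quick Python check as illustrative). It also makes plain why jackpot size and ticket sales never enter the calculation.

    ```python
    from math import comb

    # Five "white" numbers chosen from 56, times one Mega number chosen from 46.
    ticket_combinations = comb(56, 5) * 46
    print(f"1 in {ticket_combinations:,}")  # -> 1 in 175,711,536

    # Nothing about the jackpot or the number of tickets sold appears here,
    # which is why a single ticket's odds stay fixed for every drawing.
    ```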

  • Logical Fallacies: The Slippery Slope

    The slippery slope fallacy is another of those logical mistakes to which so many people fall prey. I’m sure you recognize it; it’s the idea that doing one thing will inevitably lead to doing something worse. It’s a common argument deployed for things like drug use, in which a drug like marijuana is labeled a “gateway drug” because it allegedly opens the door to much more dangerous and addictive substances. In other words, once you start using marijuana you’ve started down the slippery slope towards becoming a full-blown cocaine, meth, heroin, or OxyContin addict. The problem with the slippery slope fallacy, paradoxically, is that it’s true often enough that people become convinced it’s true for everything. Certainly there are drug addicts who started with marijuana and ended up with heroin; the problem is the assumption that the worst will always happen. It becomes a form of confirmation bias. People remember the cases where one decision led to another, and then another… and the next thing you know, the world as we know it has come to an end!

    The slippery slope fallacy is used all the time in public and political discourse. Recent events show that the slippery slope is a common bludgeon for swaying people’s positions or dramatizing a stance. So, gay marriage inevitably leads to multiple marriage (polygamy), then marriage between children and adults, then marriage between people and animals, and so on. It reminds me of Peter Venkman, Bill Murray’s character in Ghostbusters: “Human sacrifice! Dogs and cats, living together! Mass hysteria!” (Of course, the interesting thing about this particular argument is that polygamy is widely practiced in many cultures, and girls and boys whom Western standards would consider children are regarded as being of marriageable age as early as 13 in some parts of the world… but that doesn’t mean that this is going to be accepted in all cultures, or even that it should be. Personally, I have no problem whatsoever with polygamy, although I wouldn’t want it for myself).

    We see a similar argument occurring in the gun control debate: if gun control laws are tightened, then only criminals will have guns, and law-abiding citizens will have to live in fear for their property, their safety, and their lives. The converse is that if gun laws are relaxed, gun crimes and gun deaths in general will skyrocket as people go around randomly shooting each other. Obviously I’m exaggerating to make a point, but this is the natural consequence of the slippery slope argument.

    One place where I fall prey to the slippery slope myself is in my concerns over how digital and wireless technology is allowing more and more intrusion into our private lives. I harbor dark fears that eventually, no one will be able to do anything without it being recorded somewhere, somehow. I cringe every time I read a story about the new ways technology is intruding into our lives, in many cases without our knowledge. A company in the UK has developed public trash cans that can track smart phone signals and display ads based on the personal habits of passers-by. This sort of thing keeps me awake at night and gets me started down the slippery slope. I think my overall concerns about privacy and government/corporate overreach are well founded; but do these concerns necessarily lead to a completely totalitarian, Orwellian world? Perhaps, but simply arguing that one thing inevitably leads to another is not enough.

    As with all the things I write about, there are always grey areas. Sometimes a slippery slope concern ends up being justified, but even if it is, slippery slope arguments are still fallacious. People always sense danger when they feel their worldview is being threatened, and some people see threats everywhere. Even if we feel justified in listening to our fears about the slope, we are still obligated to do the actual work of using facts and logic to determine what the potential outcomes may be. Fear, as compelling as it may be, is not rational, and neither is the slippery slope.

  • Logical Fallacies: Appeal to Popularity

    A few days ago I was in Staples buying some supplies for a new craft project, and the cashier inquired whether I had a Staples preferred customer card. When I answered in the negative, he asked if I wanted one. I declined. He persisted: “But our customers are so happy to be in our program! Millions of people can’t be wrong!” On the contrary, I replied; they very well could be wrong. I did end up joining the program because it’s free and might save me some money – but his reference to millions of happy customers had no bearing on my decision.

    This logical fallacy is known as the appeal to popularity. It is one of the many irrelevant appeals that people use to back a particular point of view, and like the ad hominem argument from my previous post, it is very easy to explain: just because a view is held by many people does not make it correct. Yet, it is a very common argument. I often find it used in defense of religious beliefs, e.g. God must exist because most people believe he does. How could all those people be wrong? As appealing as that argument may be, it is not rational, logical, or based on facts. Over the thousands of years of human history, millions of people have shared countless incorrect beliefs. Physicians used to treat patients without washing their hands or their instruments, because what we now call the germ theory of disease hadn’t yet been formulated (thank you, Louis Pasteur and your predecessors). Instead, doctors believed that disease spread from what they called miasma, or bad air. They had no conception of microscopic organisms such as bacteria, or invisible particles of virus, or even tiny parasites. Millions of people, including the physicians responsible for treating them, believed in miasma… and they were all wrong.

    None of this is to say that the opposite of the appeal to popularity is true; that is, that the truth is only known by a select few and the rest of the world is simply misled or deluded. This kind of thinking is common amongst conspiracy theorists. Much of their certainty comes from the feeling that they have access to rare, special knowledge that others don’t know or won’t accept. They convince themselves that they have extra-sharp powers of logic and perception because they accept things others won’t, even when the facts aren’t on their side. Conspiracy beliefs make the believer feel like part of a special, rarefied group of the truly knowledgeable, and may actually invert the appeal to popularity: the more people who believe something (e.g. that fluoridated water or childhood vaccines are safe), the less likely, in the conspiracist’s view, it is to be true. This means we have to beware not just the appeal to popularity itself, but how it is deployed. Be extremely wary of any argument that goes along the lines of “But that’s what they want you to believe!”

    It is also important to remember that plenty of things that almost everybody in the world believes to be true are actually true; but they aren’t true because we all believe them to be true – they are true because they are facts. In other words, our belief in something, or lack thereof, is completely irrelevant to the truth value of what we believe in. And the number of people who believe in something – or don’t believe in something – is equally irrelevant to the truth value of a given argument.

  • Logical Fallacies: Ad Hominem

    It’s been a while since my last post, primarily because much of my attention has been focused on my other endeavor over at the Rock and Shell Club. But that doesn’t mean I haven’t been nurturing several rants, large and small. I am occupied by the usual topics of critical thinking, the over-saturation of social media in our daily lives, and the peaks and valleys of capitalist consumerism. One goal I’ve been considering for quite some time is to develop a curriculum or course description for an anthropology class that centers around the topic of critical thinking. More specifically, I’d like to teach a class that discusses how and why people think about things the way they do. Anthropology is ideally suited to such a topic, because anthropological analysis requires cultivating the ability to see things from other points of view.

    Michael Shermer wrote a book that I would require for my class: Why People Believe Weird Things. This book clarified and helped me conceptualize many of the things I was already thinking about regarding the workings of the human brain. I believe that having a good understanding of how people think is absolutely crucial to living a fully aware life. Even more important, understanding how thinking works, and how it can trip us up, helps us be more careful about our own thinking process. It’s one thing to criticize and evaluate others’ ideas; it’s something else entirely to be able to turn that process back onto your own analyses. I think the world would be a better place if more people did this. To that end, I’d like to discuss, in a series of posts, some of the basic logical fallacies. These are the fallacies that seem to be the most common in people’s thought processes, and the ones I think everybody should know and look for in their own thinking. They are also the ones I’d start with in my eventual class on critical thinking.

    A fallacy in thinking basically means coming to an irrational conclusion that is not supported by the facts. Instead of arguing from a factual or rational basis, logical fallacies tend to revolve around arguments that stem from emotion, appeals to the spiritual or supernatural, personal attacks, or errors of cause and effect. They fall into many categories and some are more common than others. One of the most common, and simplest to explain, is the ad hominem argument. An ad hominem argument is generally understood as one in which you attack the arguer rather than the argument. This is commonly perceived as a personal attack, where you berate, insult, or criticize the person with whom you are arguing. However, it is important to note that an ad hominem argument does not have to be an attack per se; it is simply an approach by which you say something about the arguer rather than the argument. So, to take a simple example, you could say “You only support gay marriage because your brother is gay; therefore, gay marriage shouldn’t be legalized.” In this argument you aren’t saying anything negative about your interlocutor; but neither are you saying anything factual, rational, or logical in support of the position that gay marriage should or shouldn’t be legal.

    The ad hominem argument does absolutely nothing to advance your case. If it is the kind of ad hominem that actually stoops to the level of a personal attack, then I feel it may actually impede your case – not in a rational sense, because the ad hominem argument does not in any way negate the logic (or lack thereof) of your position – but because it degrades and derails constructive discourse. No matter how much you may personally disagree with someone’s position, no matter how much personal animosity you may feel towards them, no matter how egregious or offensive or bigoted or immoral you may find their position to be, none of those feelings have any bearing on the logic of either your or your opponent’s position.

    I started with ad hominem not only because it’s the simplest, but also because it is extremely common and, I believe, extremely damaging. If you use it as a personal attack, you bring yourself down. If you use it to question or draw attention to the arguer rather than the argument, it does nothing to help prove your case. Believe me, I know what it feels like to get angry during a debate, and I know what it feels like to want to call your opponent names or question their character. Resist the urge. If your position truly has merit, that in itself should give you the ammunition you need in your fight.

    Speaking of which, once you’ve retired the ad hominem argument from your arsenal of false weapons, you are on your way to making more room in your quiver for logical arrows. In the next post, I’ll address some of the more complicated, but still common, logical fallacies that remain.

  • The Skeptical Method

    When I was at SDSU getting my MA, I had to write a paper for my graduate seminar in linguistic anthropology. I chose to write about the question of whether or not Neanderthals were capable of speech. In researching the topic (using the card catalog, books, and bound journal articles – yes, we had the interwebs in 1999, but real research still mostly had to be done the old-fashioned way), I discovered that there was a great deal of disagreement on the topic. Some researchers proposed that, indeed, the evidence supported the hypothesis that Neanderthals had the physical capacity to speak in much the same manner as modern humans. Others proposed that, while there was a cultural basis for proposing language skill, there was not any physical/anatomical evidence that supported the speech hypothesis. Still others proposed that Neanderthals surely were capable of symbolic communication, but not at the sophisticated levels attained by modern humans. After sorting through the major arguments, I centered my paper on a discussion of each one, analyzing the strengths and the weaknesses, and ultimately concluding that while all of the major hypotheses were possible, more work needed to be done before reaching a conclusion, and that I found it likely that bits of each hypothesis would end up being validated. (FYI – since I wrote that paper, a fossil Neanderthal hyoid bone has been discovered. This bone is part of what makes human speech possible. That, along with the presence in Neanderthal crania of a hypoglossal canal essentially identical to the one through which speech-related nerves connect to the brain in modern humans, moves the complex Neanderthal speech hypothesis closer to a true scientific theory.)

    My linguistics professor lauded me for a thoroughly researched and well-written paper, but his final comment was this: “You can’t just write about what others think. At some point, you have to pick a theory and take a stand.” This bummed me out. I hadn’t taken a stand because no single hypothesis had completely convinced me. If I had been forced to choose, I would have sided with those who argued for complex Neanderthal speech, but I wanted to leave the options open. This, in my mind, is what makes the scientific method so brilliant: it leaves the options open.

    For those of you in need of a refresher, the scientific method basically goes like this:

    1. A verifiable truth, or fact, is observed.
    2. A tentative explanation of the observed fact, or hypothesis, is proposed.
    3. The hypothesis is tested.
    4. If the hypothesis is disproved, it must be rejected as an explanation of the observed fact. If it is not disproved, then the hypothesis may be provisionally accepted.
    5. If a hypothesis survives repeated rounds of testing, and continues to not be disproved, then it gains the status of theory.

    Properly understanding this requires several important corollaries. Probably the most important is that, for a hypothesis to be scientific, it must be testable and you must be able to disprove or reject it. This is because a scientific hypothesis cannot be proved – it can only be disproved. There are tons of hypotheses out there that are not testable, or at least, not testable with current technology. For example, you observe that the universe exists. You hypothesize that a supernatural creator designed the universe. This is a perfectly acceptable hypothesis, but it cannot be tested and rejected and is therefore not a scientific hypothesis. And that brings up another important corollary: the common vs. the scientific understanding of the words theory and hypothesis. The common understanding is that calling something a theory or a hypothesis means it is no better than a random guess: “Evolution is just a theory.” To which I answer, well, yes: evolution is a scientific theory, which means it has survived so many rounds of so many different kinds of tests, and never once been disproved, that it has attained a status almost as binding as physical laws. Another important corollary is predictive power. The best theories have predictive power, which means that, with the theory as your guide, you can reasonably expect certain outcomes to follow under given conditions.

    But finally, my favorite corollary of all: no matter how many rounds of testing your theory has survived, if it should ever fail, you have no choice but to go back to the drawing board and revise your original hypothesis. This, in a nutshell, is how science works, and it is how we are explaining and doing things now that were unimaginable even a few generations ago. Science builds on its mistakes. Science learns from its failed experiments. And the best scientists are the ones who are willing to go back and look again, and revise, expand, edit, and redraft their hypotheses and theories to make them even better at being predictive. (A cautionary note: evolution, in particular, is a scientific theory that its detractors claim is still controversial within the scientific community. Nothing could be further from the truth. Evolutionary theory, to me, is truly one of the simplest and most elegant scientific explanations ever proposed. It is the grand unifying theory of biology and all life science, and it undeniably works. What is still debated within the field of evolution are many of the more specific mechanisms of evolution’s operation. A great example of this is the debate over group selection vs. individual selection, which is robustly debated by no lesser thinkers than E.O. Wilson and Richard Dawkins. But neither Wilson nor Dawkins would ever think to propose that natural selection itself is in question. Although, as brilliant scientists, if it were ever disproved they would have to reject it!).

    Ok, so now you should hopefully have a grasp of the scientific method, and I can get to what I really want to say in this post: I use the scientific method in my attempts to come to conclusions about almost every issue I come across. However, many of the more controversial issues that people find important nowadays are not amenable to the rigorous testing portion of the scientific method. For those issues, I use what I am calling the skeptical method. It works in the same way as the scientific method, especially as far as the “disproving” portion is concerned. Here’s how it works:

    1. Encounter a controversial issue (say, whether or not to support the Affordable Care Act).
    2. Propose a tentative opinion of the controversial issue.
    3. Gather data against which to test the tentative opinion.
    4. Accept, reject, or revise the opinion as needed based on the data.
    5. Be prepared to revise the opinion based on new information.

    As with the scientific method, the skeptical method has corollaries. First, controversial issues are controversial for a reason. They are often in conflict with people’s deeply held values. While I would consider these values to be subjective, not scientific, they are still very important in the formation of the tentative opinion. Second, the skeptical method requires deep and careful critical thinking to be truly effective. This often takes work. If you are not willing to do the work it takes to support your tentative opinion, then it remains tentative. If you have done critical, careful research, and feel confident in the data you are using to support your opinion, then you might stop being tentative and become more confident of your opinion. But just like in science, you have to be able to support your hypothesis, and you have to be willing to consider new data. Third, people will very often disagree with you and marshal their own body of research in opposition to your opinion. It behooves the critical thinker to consider the opposition’s arguments. Even if you still disagree in the end, you will strengthen your argument by understanding the argument of the opposition… and sometimes, you might actually change your opinion. Even though this might seem like a defeat, it’s not. The skeptical method is not about personal victory; it’s about understanding the world. Part of that understanding involves the realization that sometimes – oftentimes – good, rational, moral people can have deep and irreconcilable differences of opinion. Sometimes, thoughtful, intelligent, compassionate people can take the same data and come up with completely opposite opinions. Sometimes, you will feel the urge to fling poop at that person’s head because you are so frustrated that they don’t seem to understand your beautifully reasoned and elegant opinion! Don’t do it. The true critical thinker accepts and understands that she may feel like she couldn’t possibly be more right, and there will always be someone who thinks she couldn’t possibly be more wrong.

    I want to end by saying that I do find facts to be much more compelling than opinions. Daniel Patrick Moynihan famously observed that people are entitled to their own opinions, but not to their own facts. If your opinion is supported by fact, then I respect that. However, I am becoming more and more concerned by what appears to be the trumping of opinion over fact in our national discourse – a topic that deserves its own post.

  • The Evolution of Pink Slime

    So-called pink slime has been all over the news lately. Friends have posted links and comments about it on Facebook, I have heard stories about it on NPR, and I’ve heard people talk about how they can’t believe our government would allow the meat industry to sell the stuff as food. Pink slime, known formally as lean, finely textured beef trimmings (LFTB), is certainly not a food product that is likely to provoke anticipatory salivation. The term pink slime is itself deliberately crafted to provoke, instead, a reaction of disgust. And the associated news that the stuff is treated with ammonia to remove potentially harmful bacteria just adds insult to our collective sense of injury. But there’s a problem here: basing decisions about what to eat on a visceral reaction to a label that was uncritically adopted and deliberately designed to elicit exactly that reaction is no way to choose our food.

    I find the whole uproar rather silly, myself. First let’s tackle the linguistic angle: the name pink slime. Some might argue that it’s misleading to relabel this edible meat substance with a name that does not reveal what it really is. The name “lean, finely textured beef trimmings” does not evoke the actual cow parts that are used to make it. The stuff is made by combining fatty trimmings and ligament material from the cow and spinning it in a centrifuge to separate the fats. It is pink and looks slimy; hence the media-friendly and consumer-alarming moniker “pink slime.” One thing to note is that this stuff is not sold as-is; it is combined with regular ground beef as a bulk additive, and can be up to 30% of the final ground beef product (whether raw, bulk meat or items such as hamburger patties). So we are not unwittingly consuming unadulterated pink slime; nor is it being fed as-is to kids in school. A second thing to note is that we use all manner of euphemisms to describe the things we eat, especially when it comes to meat products. Filet mignon sounds much more appetizing than “hunk of cow flank.” Bacon cheeseburger stimulates the appetite in a way that “salted fatty pig belly cheeseburger” does not. In fact, when raw, most meat is pretty slimy, so we might as well add that adjective to all our meats. The point is that these are subjective reactions. Call things what they really are and lots of people might think twice before eating them. It reminds me of the failed “toilet to tap” initiative that was proposed in San Diego several years ago. Once the descriptor “toilet to tap” caught on in the media, there was no way the public would abide this water treatment program, even though the reclaimed water from the sewer system was just as pure and clean as regular municipal tap water. The name killed it because people could not reconcile themselves to water that came from the toilet, no matter how much scientific evidence there was that the water was clean. I find this fascinating in light of the fact that municipal tap water is held in reservoirs before treatment, in which people drive boats, fish, and probably urinate, and which is filled with all sorts of animal and plant matter, both alive and decomposing.

    My second issue with this uproar has to do with the food supply in general. There are seven billion people on this planet. They all need to be fed. In many places people subsist on foods that we here in the US would find appalling, and not merely because of cultural differences, but because some people are so poor that they will eat whatever they can. Our objection to LFTB is a beautiful example of a first-world problem. I know many people are rethinking where their food comes from and signing on to local food and slow food movements, and that’s all well and good, but within a country like the US, that is (for the most part) an upper-middle-class movement. Poor people in this country do not have the luxury of worrying about where their food comes from, much less exactly what is in it. For a poor family, knowing the kids will at least get lunch at school is a bigger concern than whether or not that lunch may contain pink slime.

    When agriculture arose 10,000 years ago, humanity began the evolutionary road towards pink slime. Agriculture allowed previously nomadic people to become sedentary. Sedentism led to expansions in technology and booms in population. Ultimately, agriculture allowed for centralized cities ruled by top-down leaders, supplanting the egalitarian cultures of hunting-gathering and small-scale agricultural groups. Technological innovations continued to abound and populations continued to boom, and to feed all those people, intensive, factory-driven, and mechanized industrial agriculture became necessary. Can we really turn back that process now, and all start growing our own gardens and raising and slaughtering our own livestock? I’m not talking a fancy herb garden, heirloom tomatoes, and hobby chickens; I’m talking feeding yourself and your entire family by the products of your own labor. We do not live in that world any more. We live in a world where a beef supplier will use every part of the cow. Our industrial food complex has grown so efficient that almost nothing goes to waste.

    I’m not blind to the fact that the beef producer is also trying to turn as much profit as possible; this is capitalism, after all. But I have no objection to seeing otherwise wasted parts of the cow get turned into an edible substance. As for the ammonia gas issue, it is simply a way to make the stuff safe. A chemical like ammonia is certain to provoke another knee-jerk reaction: it’s in glass cleaner! It’s a poison! Well, yes; but without understanding how the process works, people somehow conjure a picture of the pink slime getting dipped in a bright-blue Windex bath, which is far from the case. I can also imagine the opposite reaction if the stuff didn’t go through this process: how dare the government allow us to eat beef that has not been treated for bacterial contamination! (Which reminds me of another rant I have against what I see as a massively overreactive food safety process in this country; I think it’s ludicrous to destroy thousands or even millions of pounds of a food because a few people got food poisoning – but that’s a rant for another day.) In fact, much of our food goes through similar sanitizing processes to prevent illness. As far as I can tell, no one has ever died from eating ammonia-treated LFTB, but people have died from food poisoning caused by the very bacteria the ammonia treatment is designed to prevent.

    I can understand, to a degree, the people who argue that we have a right to know what is in our food so we can make an informed decision about whether to consume it, and I don’t object to the idea of more comprehensive food labeling. However, I still think this is a first-world and middle-class problem. How many people actually read food labels? Yes, the information should be there, but then the consumer does have some responsibility to think critically about what they see on the label if they decide to read it. I would bet that many of the people upset about pink slime have never bothered to really research what is in the other foods they buy at the store. Some people make it a point to not buy food products with lots of chemical additives and unnatural ingredients, but that is a tiny minority. Most of us are happy with the “ignorance is bliss” approach; and I would argue that if we didn’t take that approach then we might be paralyzed with largely unnecessary worry. Does anybody ever really stop to think about how many other people’s mouths have been on the fork they use at a restaurant? Wow, that’s gross, isn’t it? Of course the dishes at the restaurant are cleaned between uses, but to me, pink slime is no more dangerous than using a cleaned fork that has been in 1,000 different mouths. It’s gross if you think about it – so, don’t think about it!

    This controversy will fade as other things grab people’s attention, but what I fear is that whatever the next issue is, people will still have the same knee-jerk, uncritical reactions. Sometimes those reactions turn out to be completely justified, but that is irrelevant to the initial reaction. People need to come to conclusions that rely on more than a sound bite and an unappetizing label or picture that is designed to grab attention. Thinking critically means gathering facts and forming a provisional opinion that may be modified in light of future information. Being grossed out is not a good reason for objecting to a food product.