Tag: Logical Fallacies

  • Logical Fallacies: The Bandwagon Fallacy

    When I was attending Humboldt State University in the early to mid-90s, I noticed that I was putting on some weight – the dreaded Freshman 15. To combat this phenomenon, I started getting regular exercise, joined a gym, and started watching my diet. It was right around this time that a new dieting trend burst on the scene: a massive proliferation of low- and non-fat foods, all of which were marketed directly to the consumer’s desire to lose weight while still being able to indulge in treats like cookies, ice cream, and chips. In particular, I remember the Snackwell’s brand of cookies and snack cakes in their trademark green packaging. I remember scanning the nutrition label and seeing that I could eat an entire package of vanilla creme cookies and only ingest 4 grams of fat. Eureka! It never occurred to me to stop and think about the wisdom of this approach. Did it really work? Well, it must – otherwise, why would everyone be buying these products?

    Welcome aboard the bandwagon fallacy. The premise is simple: if an idea is becoming popular, it must be true. The low-fat fad took off because so many people wanted to believe in its simple premise that removing fat from your diet would remove fat from your gut. As the idea gained in popularity, it gained in adherents, which further increased its popularity, in a nice little feedback loop. Bandwagons can form around all sorts of premises, tested and untested, but I find the ones that form around food to be quite fascinating. These fads seem to come and go: the high-protein Atkins diet was first popularized in the 1970s then faded, only to experience a resurgence in the 2000s. Of course, as people came to realize that the diet didn’t have the lasting weight-loss effects it promised, it lost its popularity as people abandoned the bandwagon. Yet, these ideas manage to persist. The same thing happened to the lactose-intolerance fad, and I strongly suspect it will happen to the gluten-free fad.

    When a bandwagon idea holds the potential for becoming a marketing bonanza, it explodes across a universe of products. This reinforces the bandwagon. Currently, gluten-free is the top dietary fad. It’s quite amusing to see products that never had gluten in them in the first place emblazoned with the GLUTEN FREE! label. I’ve seen it on products as ridiculous as soda and fruit snacks (although as an aside, it can also be quite shocking to discover all sorts of strange ingredients in prepackaged foods, so I suppose it’s always possible that a fruit roll-up could have gluten in it). The same thing happened during the fat-free fad. Other current bandwagon labels include organic, free range, cage free, all natural, non-GMO, and rBGH-free, all catering to the health-conscious (but sometimes logic-unconscious). Gluten-free still seems to be pulling a full bandwagon, but the next wagon is rapidly filling with adherents. This is the anti-sugar bandwagon. I’ve lately been seeing a lot of ink spilled over the toxic hazards of our high-sugar modern diets, and I have absolutely no doubt that the marketing bonanza has already begun.

    Research is revealing that the causes of modern health problems are much more complex and intertwined than the simplistic healthy-food bandwagons would make it appear. I do want to stress that there is real research into some of the bandwagon fads I have mentioned. Sometimes the research supports the fad, sometimes it doesn’t, and often the results are maddeningly inconclusive. The Atkins diet has been thoroughly studied with mixed results, depending on what particular factors were the focus of the research. Lowering the amount of fat in one’s diet also can certainly lead to weight loss, but that by itself is not enough. People with celiac disease truly cannot ingest gluten without becoming severely ill, and some people may be able to handle wheat protein in their diets better than others. Organic foods have the benefit of lowering our exposure to potentially toxic pesticide and herbicide residues; however, some of the other popular adjectives for “healthy” food remain highly problematic because they are misleading. “All Natural” is a loosely regulated term that can be used by almost anybody. “Free Range” and “Cage Free” can mean only that the birds in question are released to a fenced yard for a short time each day or are crowded together in large facilities with no cages – but no natural light or ability to go outside. The research into GMOs and rBGH (recombinant bovine growth hormone) is unsettled and deserves a post of its own. Even the simple “calories in, calories out” approach is turning out to be much more complicated than we thought.

    The point of all this is that these issues are complicated and deserve critical review. The bandwagon fallacy encourages us to jump aboard because it’s easier to go with the crowd than do the hard work of researching an issue on the merits. Do your research and you may just find that the bandwagon is the right place to be – but it’s not because everybody else is there. If you choose to ride on the bandwagon, be sure it’s because you are confident in its origins and its destination – whether it’s about making food choices, social choices, or even pop-culture choices. Better yet, build and drive your own wagon!

  • Logical Fallacies: Appeal to Authority and the Tu Quoque Fallacy

    The appeal to authority is probably one of the most common logical fallacies. You hear it and see it all the time: “The Intergovernmental Panel on Climate Change says the climate is changing. All those scientists can’t be wrong, so the climate must be changing.” It’s true that the IPCC’s research has revealed a great deal of scientific evidence that the climate is changing, and that the change is most likely caused by human activity. But simply saying it’s true because a fancy-sounding panel of scientists says so is not enough. It’s the research that supports the conclusion, so if you are going to make an argument about something, cite the research, not just the source.

    Don’t get me wrong; I’m not saying that it’s a bad thing to bolster your argument by touting the credibility of your sources. I am saying that you better be prepared to cite the conclusions of your sources as well. The reason an appeal to authority by itself is not enough is because it can be extremely misleading. Just because a person, group, or study is associated or affiliated with a highly-respected institution or researcher does not mean that the conclusions themselves are sound. Linus Pauling, a Nobel laureate in chemistry, was rightfully lauded for his work in explaining complex molecular bonds (and he was also awarded a Nobel Peace Prize for his anti-war activism). Pauling is routinely listed as one of the most influential American scientists of the 20th century. However, in his later years, he did work on vitamin C, among other things, that fell short when critically analyzed by other scientists. Nevertheless, even today people will cite Pauling’s work to bolster claims about vitamin C’s efficacy in treating certain diseases, such as cancer, even though none of his claims have stood up to testing. It is simply his authority as a Nobel prize winner that seems to give credence to disproved claims.

    Something similar happens with people who have the title of “doctor” (and I should know, being one of them); somehow, the Doctor is seen as an authority simply because of her title. I claim no specialized knowledge outside of my very specific research in anthropology, but when I talk, people who know I’m a PhD tend to listen… and to ask me about things I really know nothing about! Along similar lines, Dr. Laura Schlessinger, of conservative talk-show radio fame, earned the title of Dr. by virtue of her PhD in physiology… not in the psychotherapy she dispensed on her program. Yet “Dr. Laura” was able to trade on her title to enhance her credibility with her listeners: a perfect example of the appeal to authority. This is a fallacy we all have to be very careful of, not only because we use it in our arguments with others but because we fall for it ourselves when convincing ourselves of the rightness of our views. Always remember that it is not enough that somebody is “a Harvard Medical School neurosurgeon.” That by itself does not make the research credible. It is the scientific process, the peer review, the repeated testing, that gives a particular conclusion its credence. And on the flip side, reversing the appeal to authority – e.g., “how can we trust that research when it’s only from Bob State University?” – does not mean that the research is shoddy or its conclusions can’t be trusted. If it has gone through the same rigorous scientific process as the work of the Harvard neurosurgeon, then it should have equal credibility. Final flog of the dead horse: you should definitely be aware of the credentials and background of researchers, but you should not use that as the sole criterion by which to judge their work.

    And now our bonus fallacy: the tu quoque fallacy. It has a fancy name but the concept itself is simple. This is the classic child’s whine that “so-and-so gets to stay up until 10, so I should get to stay up too!” Just because someone else gets to do it doesn’t mean you get to do it! Even more specifically, the tu quoque fallacy is used to try to justify wrongdoing. I’m sure cops hear it all the time in the form of “Why didn’t you get that other guy who was racing past me at 95?” You know as well as the cop does that just because somebody else was going 95 doesn’t make it ok for you to go 90. I love tu quoque because it’s really so childish when used in this classic sense. But it does get used in other ways as well, in more of an “apples to oranges” way. You tend to hear the tu quoque fallacy when people can’t really refute an argument with logic, but they remember an example of something similar turning out not to be true, so they cite that instead. I’ve been hearing it regularly in discussions of global climate change when people refer to a brief, media-hyped panic in the 1970s that the world was about to go through a global freeze. As it happens, while a few papers suggested that a cooling period might be coming, the majority of research at the time found more evidence for global warming. But the media got ahold of the cooling theory and ran with it. The fallacious conclusion goes like this: the scientists who proposed a potential global cooling period turned out to be wrong; therefore, the climate scientists who are now predicting global warming must also be wrong. It’s a variation of the child’s whine: “science was wrong about something in the past, so it must be wrong now.” This is absurd. Scientific research is based on revising conclusions based on new information. If scientists gave up every time something they predicted turned out to be wrong, no scientific field would ever advance. So being wrong in the past has little predictive value for being wrong in the future.

    It’s exhausting to try to keep track of all these fallacies, committed by ourselves, the people we talk with, and the sources we rely on for information. It’s also exhausting to glean what’s important from a conversation and use care to establish credibility without tipping over into an appeal to authority, or to cite examples of previous, similar situations without falling into a tu quoque, or to refrain from the ad hominem of calling somebody a blithering idiot (or much, much worse) instead of actually deconstructing their argument. Of course, a lot of people don’t really want to try because the fallacies are so much easier… but I do hope we will all keep trying.

  • Logical Fallacies: The Red Herring

    The red herring is an argument that I see deployed again and again, and I’m never entirely sure if the person deploying it is even aware that they are bringing up issues that are tangential to the debate at hand. The phrase is said to originate from the days of fox hunting, when the scent of a red herring was supposedly used to distract the hounds from the pursuit of the fox. That’s exactly what the red herring does in an argument: it distracts the participants from what is really at issue, and they find themselves talking about onions when they started out talking about apples.

    I get very frustrated when people deploy the red herring, whether they are doing it deliberately or unconsciously. Actually, it’s the unconscious deployment that gets to me the most, because it tells me that my interlocutor does not have a firm grasp on what we are really debating. I honestly think it’s a defense mechanism for most people. They bring up side issues as a way to distract from the fact that they really have no answer to whatever point their opponent is making. (As an aside, I want to clarify that my use of words like argument and opponent is not meant to say that I expect every difference of opinion to lead to anger; but when people disagree about something and they engage in conversation about it, they become like opponents in a refereed debate – only without the formality of an actual referee. Of course, sometimes the debate does devolve into an actual argument that is heated with emotion.)

    The red herring seems to come up regularly in arguments that are about sensitive subjects such as gun control or gay marriage. I generally see it used when a person is arguing from emotion rather than from logic. For example, I might say that stricter gun control laws could have saved the lives of some of the 194 children who have died from gunshot wounds in the year since the Sandy Hook massacre in December 2012. Someone deploying a red herring might say “But what about all the people who used guns to defend themselves since Sandy Hook?” There may well be many cases of people deploying guns in self-defense since then, but that is not what the argument at hand is about. Bringing up guns used in self-defense is a distraction from my hypothesis that stricter gun control may have prevented the deaths of some children. My argument says nothing about whether stricter laws might have hindered those who used guns to defend themselves. Although that may well be the case, it is not the point of this particular, specific debate.

    Another situation that I’ve encountered many times is when the red herring is used to put people on the defensive. It usually takes the form of a question, wherein your interlocutor will say, “So you’re saying that we should (such-and-such illogical leap)?” It is so easy to be distracted by this and to start defending yourself from the stinky fish being lobbed in your direction! As another example, if I say that I am opposed to the “stop and frisk” policy in New York City because I think it unfairly targets minorities, a red herring-lobbing opponent could say, “So you’re saying that a suspicious looking person should never be stopped by police?” Of course that’s not what I’m saying, but if I lose my cool and chase the herring, the chance to talk intelligently about the merits of the stop and frisk policy is lost.

    I’m using broad examples on purpose to try to illustrate the red herring. Obviously, in the course of having conversations about these issues, many different points will be made about different aspects of particular issues. And in many cases, I’ve found, no matter how hard you try to keep your debate partner on point, they will keep tossing the fish. The hardest part when you are trying to concentrate on a specific point is not allowing yourself to be distracted by the scent of the herring, and to keep your eye on the fox… or in some cases, to simply disengage from the hunt.

  • Logical Fallacies: The Gambler’s Fallacy

    Unfortunately, I have a very personal reason for choosing the gambler’s fallacy as my next topic. I am experiencing the effects of this fallacy with someone close to me who is in the grips of a gambling addiction. For years, she has insisted to me that winning depends upon making large bets. She believes that betting more money improves the odds of coming up lucky on her vice of choice, the slot machine. No amount of patient explanation of statistics and odds can dissuade her from this belief. She also falls prey to the classic version of the gambler’s fallacy, which essentially states that the odds of one event are influenced by the outcome of the preceding event. This can sometimes be true. For example, in cases where numbers are drawn and not replaced, as in bingo, the odds of the remaining numbers being drawn increase with each subsequent draw (that is, in a bag of ten numbered balls, the odds of drawing any particular number are one in ten. Once the first ball is drawn, the odds of drawing any of the remaining numbers become one in nine, and so on). However, it is not true for simple odds such as coin tosses. Each and every toss of a coin is an independent event. So, even if you get nine tails in a row, the odds of getting heads on the tenth toss remain exactly the same as they were for the preceding nine tosses; that is, 50 percent. Yet, many people will believe that the tenth toss has much greater odds of being heads because the previous nine tosses were tails. This is the gambler’s fallacy.
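
    A quick simulation makes the distinction concrete. Here is a minimal sketch in plain Python (the trial count and random seed are arbitrary choices of mine): it estimates the chance of heads given that the previous nine tosses were all tails, and contrasts that with drawing numbered balls without replacement.

    ```python
    import random
    from fractions import Fraction

    # Independent events: does a run of nine tails change the tenth toss?
    random.seed(42)
    tails_streak = 0
    opportunities = 0   # tosses immediately preceded by nine straight tails
    heads_after = 0
    for _ in range(1_000_000):
        heads = random.random() < 0.5
        if tails_streak == 9:
            opportunities += 1
            if heads:
                heads_after += 1
        tails_streak = 0 if heads else tails_streak + 1

    # The estimated frequency hovers around 0.5 -- the run of tails buys nothing.
    print(heads_after / opportunities)

    # Dependent events: drawing from ten numbered balls without replacement.
    # Here the odds of naming the next ball really do improve with each draw.
    print([Fraction(1, n) for n in (10, 9, 8)])  # 1/10, then 1/9, then 1/8
    ```

    The simulated frequency lands near one half no matter how long the preceding run of tails, while the without-replacement odds really do climb with every draw.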

    It is amazing how many people fall for the gambler’s fallacy. You see it operating not only in casinos but in lotteries. When a lottery jackpot gets really big, it is because several drawings have passed with no winner. And of course, the bigger the jackpot gets, the more people buy tickets. Many people do this simply because they hope to win the huge jackpot, not because they believe their odds are any better; but I have had conversations with many people who insist that they are more likely to win when the jackpot is bigger. Their argument is a perfect example of the gambler’s fallacy: because no winner has been drawn for so many weeks, the odds of a winner must be greater for the bigger jackpots! This is actually true in one specific sense: the more tickets people buy, the more of the possible number combinations are covered, so the chance that someone, somewhere, holds a winning ticket goes up. But the big jackpot and the long time elapsed since a winner do not change the fundamental odds of drawing, say, five numbers out of 56. No matter how big the jackpot, the odds remain exactly the same for each and every drawing. For the Mega Millions lottery, the odds of drawing five particular numbers and the Mega number are 1 in 175,711,536. Again, these odds remain the same no matter how big the jackpot and no matter how many tickets have been purchased. It seems that attaching money or some other consequence to the outcome of a random event scrambles people’s ability to rationally judge the odds.
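
    That 1-in-175,711,536 figure falls straight out of counting combinations: at the time, Mega Millions drew five white numbers from 56 plus one Mega number from 46 (the game’s rules have since changed, so treat those parameters as period-specific). A short check in Python:

    ```python
    from math import comb

    # 5 white balls chosen from 56 (order irrelevant), times 46 Mega numbers
    white_combos = comb(56, 5)        # ways to pick the five white numbers
    total = white_combos * 46         # equally likely possible tickets
    print(f"1 in {total:,}")          # 1 in 175,711,536

    # The jackpot size never appears in that arithmetic: every drawing has
    # identical odds. What a big jackpot DOES change is ticket volume, which
    # raises the chance that someone, somewhere, holds a winner (assuming,
    # for simplicity, each ticket picks its numbers independently at random).
    p = 1 / total
    for tickets_sold in (10_000_000, 100_000_000):
        p_any_winner = 1 - (1 - p) ** tickets_sold
        print(f"{tickets_sold:,} tickets -> P(at least one winner) = {p_any_winner:.3f}")
    ```

    The second half shows the one sense in which a big jackpot matters, more tickets sold means a winner somewhere is more likely, while each individual ticket’s odds stay fixed at 1 in 175,711,536.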

  • Logical Fallacies: The Slippery Slope

    The slippery slope fallacy is another of those logical mistakes to which so many people fall prey. I’m sure you recognize it; it’s the idea that doing one thing will inevitably lead to doing something worse. It’s a common argument deployed for things like drug use, in which a drug like marijuana is labeled a “gateway drug” because it allegedly opens the door to much more dangerous and addictive substances. In other words, once you start using marijuana you’ve started down the slippery slope towards becoming a full-blown cocaine/meth/heroin/oxycontin, etc. addict. The problem with the slippery slope fallacy, paradoxically, is that it’s true often enough that people start to become convinced that it’s true for everything. Certainly there are drug addicts who started with marijuana and ended up with heroin; the problem is the assumption that the worst will always happen. It becomes a form of confirmation bias. People remember the cases where one decision led to another, and then another… and the next thing you know, the world as we know it has come to an end!

    The slippery slope fallacy is used all the time in public and political discourse. Recent events show that the slippery slope is a common bludgeon, used both to attack other people’s positions and to dramatize one’s own. So, gay marriage inevitably leads to multiple marriage (polygamy), then marriage between children and adults, then marriage between people and animals, and so on. It reminds me of Peter Venkman, Bill Murray’s character in Ghostbusters: “Human sacrifice! Dogs and cats, living together! Mass hysteria!” (Of course, the interesting thing about this particular argument is that polygamy is widely practiced in many cultures, and girls and boys who might be considered children by Western standards are considered to be of marriageable age as early as 13 in some parts of the world… but that doesn’t mean that this is going to be accepted in all cultures, or even that it should be. Personally, I have no problem whatsoever with polygamy, although I wouldn’t want it for myself).

    We see a similar argument occurring in the gun control debate: if gun control laws are tightened, then only criminals will have guns, and law-abiding citizens will have to live in fear for their property, safety, and their lives. The converse is that if gun laws are relaxed, gun crimes and gun deaths in general will skyrocket as people go around randomly shooting each other. Obviously I’m exaggerating to make a point, but this is the natural consequence of the slippery slope argument.

    One place where I fall prey to the slippery slope myself is in my concerns over how digital and wireless technology is allowing more and more intrusion into our private lives. I harbor dark fears that eventually, no one will be able to do anything without it being recorded somewhere, somehow. I cringe every time I read a story about the new ways technology is intruding into our lives, in many cases without our knowledge. A company in the UK has developed public trash cans that can track smart phone signals and display ads based on the personal habits of passers-by. This sort of thing keeps me awake at night and gets me started down the slippery slope. I think my overall concerns about privacy and government/corporate overreach are well founded; but do these concerns necessarily lead to a completely totalitarian, Orwellian world? Perhaps, but simply arguing that one thing inevitably leads to another is not enough.

    Like all the things I write about, there are always grey areas. Sometimes a slippery slope concern ends up being justified, but even if it is, slippery slope arguments are still fallacious. People always sense danger when they feel their worldview is being threatened, and some people see threats everywhere. Even if we feel justified in listening to our fears about the slope, we are still obligated to do the actual work of using facts and logic to determine what the potential outcomes may be. Fear, as compelling as it may be, is not rational, and neither is the slippery slope.

  • Logical Fallacies: Appeal to Popularity

    Logical Fallacies: Appeal to Popularity

    A few days ago I was in Staples buying some supplies for a new craft project, and the cashier inquired whether I had a Staples preferred customer card. When I answered in the negative, he asked if I wanted one. I declined. He persisted: “But our customers are so happy to be in our program! Millions of people can’t be wrong!” On the contrary, I replied; they very well could be wrong. I did end up joining the program because it’s free and might save me some money – but his reference to millions of happy customers had no bearing on my decision.

    This logical fallacy is known as the appeal to popularity. It is one of the many irrelevant appeals that people use to back a particular point of view, and like the ad hominem argument from my previous post, it is very easy to explain: just because a view is held by many people does not make it correct. Yet, it is a very common argument. I often find it used in defense of religious beliefs, e.g. God must exist because most people believe he does. How could all those people be wrong? As appealing as that argument may be, it is not rational, logical, or based on facts. Over the thousands of years of human history, millions of people have shared countless incorrect beliefs. Physicians used to treat patients without washing their hands or their instruments, because what we now call the germ theory of disease hadn’t yet been formulated (thank you, Louis Pasteur and your predecessors). Instead, doctors believed that disease spread from what they called miasma, or bad air. They had no conception of microscopic organisms such as bacteria, or invisible particles of virus, or even tiny parasites. Millions of people, including the physicians responsible for treating them, believed in miasma… and they were all wrong.

    None of this is to say that the opposite of the appeal to popularity is true; that is, that the truth is only known by a select few and the rest of the world is simply misled or deluded. This kind of thinking is common amongst conspiracy theorists. Much of their certainty comes from the feeling that they have access to rare, special knowledge that others don’t know or won’t accept. They convince themselves that they have extra sharp powers of logic and perception because they accept things others won’t, even when the facts aren’t on their side. Conspiracy beliefs make the believer feel like they are part of a special, rarefied group of the truly knowledgeable, and may actually invert the appeal to popularity, holding that the more people believe something (e.g. that fluoridated water or childhood vaccines are safe), the less likely it is to actually be true. This means we have to beware not just the appeal to popularity itself, but how it is deployed. Be extremely wary of any argument that goes along the lines of “But that’s what they want you to believe!”

    It is also important to remember that plenty of things that almost everybody in the world believes to be true are actually true; but they aren’t true because we all believe them to be true – they are true because they are facts. In other words, our belief in something, or lack thereof, is completely irrelevant to the truth value of what we believe in. And the number of people who believe in something – or don’t believe in something – is equally irrelevant to the truth value of a given argument.