Category: Critical Thinking

  • Logical Fallacies: The Appeal to Antiquity

    The first definition for the word conservative from Dictionary.com reads as follows: “disposed to preserve existing conditions, institutions, etc., or to restore traditional ones, and to limit change.” With this definition in mind, it comes as no surprise to me that those who identify themselves as politically conservative often employ the appeal to antiquity, which is a common logical fallacy. When you are averse to change, it makes sense to argue that things should stay the same because “that’s the way they’ve always been.” That argument, in a nutshell, is the appeal to antiquity.

    The appeal to antiquity is part of a family of fallacies known as “irrelevant appeals.” The idea is that because something has been done or believed for a long time, it must be true. Obviously we know that this is not the case: people used to believe that the earth was flat, that all disease was caused by “bad air,” and that phrenology was an accurate science. I know that political conservatives aren’t the only people to employ this fallacy, but I do tend to associate it with common conservative arguments about a variety of hot political topics. For example, an extremely common argument against legalizing gay marriage is that marriage has always been defined as a union between one man and one woman. Aside from the fact that this is not historically accurate, it clearly employs the appeal to antiquity: that’s the way things have always been; therefore, we shouldn’t change anything.

    The opposite of the appeal to antiquity is the appeal to novelty. This fallacy holds that because something is newer it must be better, but that’s just as wrong as the appeal to antiquity. New findings are announced all the time that ultimately turn out not to be true. This is related to the bandwagon fallacy, which proposes that if enough people believe in a new idea, it must be true. The same pattern shows up in recent research into all sorts of things that may be good or bad for us, such as the potential benefits of various dietary supplements. Recent research into the benefits of taking multivitamins has concluded that they do not appear to benefit human health. This may well be true, but it is not the recency of the conclusion that makes it so, and I wouldn’t be surprised if further research disputes this claim. (This topic deserves a rant of its own discussing the rampant fallacies employed by consumers of mainstream media reporting on scientific research – both the way the information is presented and the way it is interpreted leave much to be desired.)

    Back to the appeal to antiquity. As with the appeal to novelty, the age of the position under consideration has no bearing on its veracity. Again, the position being argued may well be true, but it is not its age that makes it so. And conversely, just because something is old does not mean it should automatically be abandoned. The point is that the age of an idea has no relevance to its validity. So, to use a few more examples, just because ideas are written into the old documents that set up the governance of our nation does not mean they should still be embraced. For example, should Black Americans still be counted as three-fifths of a person? Should we still allow slavery? Should women still not be allowed to vote? Should Native Americans not be granted US citizenship? And for a very hot topic in our current culture wars… should every American citizen be allowed to own a gun just because that’s always been a part of our governing tradition? Don’t mistake me – there are valid arguments for and against gun ownership, but the appeal to antiquity is not one of them. And as an example of a governing novelty that did not work out: should alcohol still be prohibited? Prohibition did not work – its newness as an amendment to the US Constitution did not make it a success.

    So, if you are going to argue that something should remain as it is, do not make the mistake of deploying the appeal to antiquity. There are many good arguments in support of a variety of positions on our many cultural conflicts, but saying “that’s the way it’s always been” is not one of them. The more carefully you select your arguments, the more likely you are to be heard.

  • Logical Fallacies: Attribution Error

    How many times have you honked your horn in anger or raised your middle finger at some idiot while driving? Have you ever seethed inwardly as some dawdler wastes time at the checkout counter while you are waiting behind them in line? Do you assume that the person who took up two spaces in the parking lot is a complete asshole? On the other hand, how many times have people honked at you or flipped you off as you sheepishly realize that you accidentally went out of turn at a four-way stop? Have you felt the back of your neck burning with the stares of people behind you at a checkout line as you realize you entered your PIN incorrectly or forgot to give the cashier a coupon? How about having to park awkwardly between the lines because another car was partially blocking the space? But you’re not a bad person, right? Those other people, though…

    When you believe that your actions can be explained by situational factors, but other people’s actions can be explained by their personalities, you are succumbing to a particular type of attribution bias known as fundamental attribution error. Attribution bias involves the human tendency to explain our own and other people’s behavior by attributing it to causes that may actually have nothing to do with the behavior. Overall, we tend to explain our own actions, or the actions of those we know well, as being due to situations and not due to something fundamental about our personality. Conversely, when it comes to explaining the behavior of people we don’t know, we are much more likely to explain it as a function of who they are without taking contextual factors into account. Essentially, we are judging a book by its cover.

    Attribution errors occur in the public sphere all the time. If you have ever made the mistake of getting sucked into the comment page rabbit hole accompanying articles about controversial issues, you know what I am talking about. So often, we are only given a tantalizing tidbit of information in an article, but that’s all it takes to trigger a cascade of attribution error. I find this troubling. One of the classic examples of attribution error is the case of the Albuquerque woman who sued McDonald’s after she spilled hot coffee into her lap. This case took the media by storm and eventually became a cultural touchstone for describing apparently frivolous lawsuits. The vitriol that rained down on Stella Liebeck was thick and furious. She was obviously an idiot for resting the coffee cup between her legs. She shouldn’t have been driving with hot coffee in the first place, so she must be a careless person in general. She clearly was just out to get McDonald’s because they have deep pockets. How dare she sue for an incident that was clearly her own fault? She was just trying to get rich off McDonald’s! It turns out that the reality of Liebeck’s case was much, much different than the public perception of events. There is a reason the jury awarded such a huge amount of money when they heard the case – it is because they heard the facts and made their decision based on those facts. In hearing the facts, the possibility of attribution error was dramatically reduced. I strongly encourage you to read the facts in the case if you are one of those who has never heard them. There is even a documentary film about the case called Hot Coffee, which explains how Liebeck’s case got so distorted.

    I started thinking about this the other day when I was reading a Jezebel article about a woman in Ontario, Canada, who hit three teenage boys with her car, injuring two and killing one. She is suing a whole host of people in connection with the case, including the dead boy. I reacted as most people probably would when I read the article: this woman is clearly a monster. She was speeding. She may have been talking on her cell phone. She killed a kid and badly injured two others! What kind of awful scum of humanity would dare to sue the families involved in this tragedy, much less sue the dead kid? And I was not surprised when I scrolled down to the comments and saw that many posters felt as I did. But as is my general practice, after getting over my initial reaction I started to wonder about the context of the situation. How fast was she actually going? Is there any evidence supporting the allegation that she was on her phone? What about the kids? Did they ride into her path? What time of day was it? What is the context? What are the facts? It turns out that it was dark when the boys were hit. They were wearing dark clothes. They were riding side-by-side along the road. Now, I’m certainly not blaming the victims here, but it sounds like this situation was ripe for potential tragedy and that they were struck by accident. Even the most attribution-biased among us probably don’t believe that the driver hit the boys deliberately. And if you’re like me, you also start to think about your own personal context. I rarely drive the exact speed limit. I wouldn’t say I’m a speed demon, but 5-10 miles above the limit is pretty much par for the course. I’ve also been guilty of taking calls while driving… sending and receiving text messages… even checking social media. (And just FYI, nowadays when I feel the temptation to use my phone while driving I ask myself if it’s worth a life to do it. The answer is always no.) If I were to hit and injure or kill someone under those circumstances, I would be crippled with guilt and shame… but would it mean I am a monster?

    You may be saying to yourself that this is all well and good, but what in the world could ever justify this woman suing the dead boy and the families for emotional trauma? What kind of person would do such a thing? She must be a monster! I think this is the point at which we must pause and ask ourselves what we might feel if the same thing happened to us. This appears to have been a terrible accident. I don’t know about you, but if I hit and killed someone, whether I was at fault or not, I would be devastated. That devastation would probably manifest itself physically as well as emotionally. I would live it over and over and suffer terrible guilt, grief, and shame. And I’d also have to defend myself in the court of public opinion as well as the civil court. Of course, the driver in this case is being sued by the victims’ families. And she is countersuing because that’s what lawyers tell you to do in cases like this. It’s a tactic you use to protect yourself in the court system so that if the judgment comes down against you, you have some protection from financial ruin. I don’t know about Canada, but in the United States this is a fairly typical situation that happens at the behest of insurance companies who don’t want to be the ones paying out a big settlement. There may well be more to this situation, but I don’t think it’s fair to automatically paint the driver as a soulless monster without at least attempting to learn more about the context.

    Don’t get me wrong. I’m not saying that the driver in this case is blameless. But that’s not the point. The point is that our knee-jerk attribution error paints people as one-dimensional villains and allows little room for the nuances and subtleties that arise when we look at a situation in its complete context. I can say the same about people we canonize as heroes! Just as the driver in this case is probably not a demon incarnate, people who do heroic things may not be nice people overall. We are all complex, multidimensional creatures, and it would behoove us to remember that when attribution error tempts us to label people with a single dimension.

  • Logical Fallacies: The Bandwagon Fallacy

    When I was attending Humboldt State University in the early to mid-90s, I noticed that I was putting on some weight – the dreaded Freshman 15. To combat this phenomenon, I began exercising regularly, joined a gym, and started watching my diet. It was right around this time that a new dieting trend burst on the scene: a massive proliferation of low- and non-fat foods, all of which were marketed directly to the consumer’s desire to lose weight while still being able to indulge in treats like cookies, ice cream, and chips. In particular, I remember the SnackWell’s brand of cookies and snack cakes in their trademark green packaging. I remember scanning the nutrition label and seeing that I could eat an entire package of vanilla creme cookies and only ingest 4 grams of fat. Eureka! It never occurred to me to stop and think about the wisdom of this approach. Did it really work? Well, it must – otherwise, why would everyone be buying these products?

    Welcome aboard the bandwagon fallacy. The premise is simple: if an idea is becoming popular, it must be true. The low-fat fad took off because so many people wanted to believe in its simple premise that removing fat from your diet would remove fat from your gut. As the idea gained in popularity, it gained adherents, which further increased its popularity, in a nice little feedback loop. Bandwagons can form around all sorts of premises, tested and untested, but I find the ones that form around food to be quite fascinating. These fads seem to come and go: the high-protein Atkins diet was first popularized in the 1970s, then faded, only to experience a resurgence in the 2000s. Of course, as people came to realize that the diet didn’t deliver the lasting weight loss it promised, they abandoned the bandwagon and its popularity faded. Yet these ideas manage to persist. The same thing happened to the lactose-intolerance fad, and I strongly suspect it will happen to the gluten-free fad.

    When a bandwagon idea holds the potential for becoming a marketing bonanza, it explodes across a universe of products, which in turn reinforces the bandwagon. Currently, gluten-free is the top dietary fad. It’s quite amusing to see products that never had gluten in them in the first place emblazoned with the GLUTEN FREE! label. I’ve seen it on products as ridiculous as soda and fruit snacks (although as an aside, it can also be quite shocking to discover all sorts of strange ingredients in prepackaged foods, so I suppose it’s always possible that a fruit roll-up could have gluten in it). The same thing happened during the fat-free fad. Other current bandwagon labels include organic, free range, cage free, all natural, non-GMO, and RBGH-free – all catering to the health-conscious (but sometimes logic-unconscious). Gluten-free still seems to be pulling a full bandwagon, but the next wagon is rapidly filling with adherents: the anti-sugar bandwagon. I’ve lately been seeing a lot of ink spilled over the toxic hazards of our high-sugar modern diets, and I have absolutely no doubt that the marketing bonanza has already begun.

    Research is revealing that the causes of modern health problems are much more complex and intertwined than the simplistic healthy-food bandwagons would make it appear. I do want to stress that there is real research into some of the bandwagon fads I have mentioned. Sometimes the research supports the fad, sometimes it doesn’t, and often the results are maddeningly inconclusive. The Atkins diet has been thoroughly studied with mixed results, depending on what particular factors were the focus of the research. Lowering the amount of fat in one’s diet can certainly lead to weight loss, but that by itself is not enough. People with celiac disease truly cannot ingest gluten without becoming severely ill, and some people may be able to handle wheat protein in their diets better than others. Organic foods have the benefit of lowering our exposure to potentially toxic pesticide and herbicide residues; however, some of the other popular adjectives for “healthy” food remain highly problematic because they are misleading. “All Natural” is a loosely regulated term that can be used by almost anybody. “Free Range” and “Cage Free” can mean only that the birds in question are let out into a fenced yard for a short time each day, or that they are crowded together in large cageless facilities with no natural light and no way to go outside. The research into GMOs and RBGH (recombinant bovine growth hormone) is unsettled and deserves a post of its own. Even the simple “calories in, calories out” approach is turning out to be much more complicated than we thought.

    The point of all this is that these issues are complicated and deserve critical review. The bandwagon fallacy encourages us to jump aboard because it’s easier to go with the crowd than do the hard work of researching an issue on the merits. Do your research and you may just find that the bandwagon is the right place to be – but it’s not because everybody else is there. If you choose to ride on the bandwagon, be sure it’s because you are confident in its origins and its destination – whether it’s about making food choices, social choices, or even pop-culture choices. Better yet, build and drive your own wagon!

  • Additive Outrage

    Rat poison saved my life. I know how strange that sounds, but it’s true. In July 2003 I was hospitalized with a pulmonary embolism – a blood clot in my lung. The treatment is blood thinners: IV heparin while in the hospital for a week, then oral warfarin (brand name Coumadin) for six months afterwards to keep dissolving the clot and to prevent a recurrence. Warfarin is an anticoagulant, and it also happens to be a very effective rodenticide, causing fatal internal bleeding in rats that ingest it in poison baits. So what’s the takeaway? It’s really quite simple: the dose makes the poison.

    I bring this up because I have noticed that it doesn’t take much to frighten people by telling them about “disgusting” or “scary” or “poisonous” stuff that shows up in food. This absolutely, positively requires a great deal of skepticism and critical thinking. Case in point: I ran across an article in the Huffington Post that capitalizes directly on this sort of fear-mongering. Titled “9 Disgusting Things You Didn’t Know You’ve Been Eating Your Whole Life,” the article runs through a list of food additives that are apparently supposed to make us feel like the food industry is bent on poisoning its customers. Now, I’m not stupid; I’m well aware that there is all sorts of stuff in our food that is not exactly healthy, and even some stuff that could be dangerous. I am concerned about modern eating habits (my own included!) and think it’s rather frightening how removed we are from the process of providing food for millions of people. In fact, when I teach the section on subsistence in my cultural anthropology classes, I ask my students to think about what they would eat if the world as we know it came to an end. Do they have the remotest inkling of what they would eat if there were no grocery stores or restaurants? And even if they talk about hunting, I ask them, when the bullets run out, how will you kill animals? Do you know how to prepare them? How will you keep that food from spoiling? What plant foods will you eat? I have no doubt that when the shit hits the fan for humanity, those few cultural groups that still forage or practice horticulture and pastoralism will be the only survivors, with a few exceptions for those who have learned skills for living off the land in nations like the United States (although even these few won’t survive as long-term populations unless they meet other people and are able to form larger groups that can sustain population growth).

    So what does any of this have to do with the HuffPo article? My real point is that people get unreasonably frightened or disgusted by things without thinking through why they are frightened or disgusted. The first thing on the list in the article is castoreum. This is a substance produced in the castor sacs of beavers – glands located near the anus – and even I have to admit that it sounds pretty disgusting. It is used as a flavoring similar to vanilla, although according to Wikipedia the food industry in the US only uses about 300 pounds of it a year. My problem is with the automatic assumption that some parts of an animal are acceptable for food use and others are not. The way we use animal parts is culturally determined and completely arbitrary. Why is castoreum any more disgusting than drinking the liquid that shoots out of a cow teat? Some people eat tongue – why is that body part any worse than eating the ground-up flesh from a pig’s side? What about eggs, which are essentially the menstrual flow of a chicken contained in a shell? Disgust, again, is culturally determined and therefore ultimately arbitrary from an objective standpoint.

    Other things listed in the article include L-cysteine, which is one of the amino acids found in human hair; sand; coal tar; antifreeze; and a few others. The human hair bit is similar to the beaver secretions bit – we just knee-jerk find it disgusting, but it’s not as if there is actual human hair in your food! Every single living thing is made of amino acids, so by the article’s logic you could argue that any food containing an amino acid is part… I don’t know… semen? Bile? Blood? In other words, without the full background of the chemical, all you read is that human hair contains a component that is processed into a food additive, and the implication is that you are directly consuming hair. As for things like antifreeze and coal tar, refer back to the dose making the poison. Once again, it’s not like food companies are pouring Prestone into your food. The ingredient in question is propylene glycol, which shares many properties with ethylene glycol, the chemical actually used in automobile antifreeze. Propylene glycol is used not only in food but also in medications that are not soluble in water – so much like warfarin, propylene glycol in the right dose and formulation has important medical applications.

    I could go through the list one by one, but I’m hoping that these examples make my point that so much information and context is left out of articles like this. I really don’t understand the desire to frighten and disgust people by only focusing on shock value rather than useful information. Again, I want to stress that I realize there are bad things in our food, and I am firmly committed to the idea that most companies are more concerned about their bottom line than they are about the health and safety of consumers; but it’s also important to remember that if companies sicken or kill their customers they won’t be in business for long! And I know that plenty of people automatically distrust government agencies like the FDA, but again, what does the FDA gain by allowing truly dangerous chemicals to be part of the food supply? It behooves us to think very carefully about this sort of thing.

    A final point: in reading the comments at the end of the HuffPo article, I was amazed at the self-righteousness and privilege of many of the contributors. So many bragged about only eating fresh food from the farmers’ market or making their own bread or only buying organically raised meat or making baby food from scratch or blah blah blah. Have these people ever been outside their privileged little bubble and considered how the real world works for so many people? Farmers’ markets are great – if there’s one in your neighborhood and you can afford to pay the premium prices. Organic meat? Only if there is a fancy grocery store nearby and you want to pay double the price. Food made from scratch? Sure, if you have the time and the tools and the money for the often pricey ingredients. It’s terrific that a lot of people are trying to get back to basics with food prep – I myself make bread from scratch – but that attitude fails to recognize the deep inequality and lack of access to resources that so many people in the United States, and the world, have to contend with. But that’s a rant for another time.

  • Logical Fallacies: Appeal to Authority and the Tu Quoque Fallacy

    The appeal to authority is probably one of the most common logical fallacies. You hear it and see it all the time: “The Intergovernmental Panel on Climate Change says the climate is changing. All those scientists can’t be wrong, so the climate must be changing.” It’s true that the research assessed by the IPCC provides a great deal of scientific evidence that the climate is changing, and that the change is most likely caused by human activity. But simply saying it’s true because a fancy-sounding panel of scientists says so is not enough. It’s the research that supports the conclusion, so if you are going to make an argument about something, cite the research, not just the source.

    Don’t get me wrong; I’m not saying that it’s a bad thing to bolster your argument by touting the credibility of your sources. I am saying that you had better be prepared to cite the conclusions of your sources as well. The reason an appeal to authority by itself is not enough is that it can be extremely misleading. Just because a person, group, or study is associated or affiliated with a highly respected institution or researcher does not mean that the conclusions themselves are sound. Linus Pauling, a Nobel laureate in chemistry, was rightfully lauded for his work in explaining complex molecular bonds (and he was also awarded a Nobel Peace Prize for his anti-war activism). Pauling is routinely listed as one of the most influential American scientists of the 20th century. However, in his later years, he did work on vitamin C, among other things, that fell short when critically analyzed by other scientists. Nevertheless, even today people will cite Pauling’s work to bolster claims about vitamin C’s efficacy in treating certain diseases, such as cancer, even though none of his claims have stood up to testing. It is simply his authority as a Nobel prize winner that seems to give credence to disproved claims.

    Something similar happens with people who have the title of “doctor” (and I should know, being one of them); somehow, the Doctor is seen as an authority simply because of her title. I claim no specialized knowledge outside of my very specific research in anthropology, but when I talk, people who know I’m a PhD tend to listen… and to ask me about things I really know nothing about! Along similar lines, Dr. Laura Schlessinger, of conservative talk-show radio fame, earned the title of Dr. by virtue of her PhD in physiology… not in the psychotherapy she dispensed on her program. Yet “Dr. Laura” was able to trade on her title to enhance her credibility with her listeners: a perfect example of the appeal to authority. This is a fallacy we all have to be very careful of, not only because we use it in our arguments with others but because we fall for it ourselves when convincing ourselves of the rightness of our views. Always remember that it is not enough that somebody is “a Harvard Medical School neurosurgeon.” That by itself does not make the research credible. It is the scientific process, the peer review, the repeated testing, that gives a particular conclusion its credence. And on the flip side, reversing the appeal to authority – e.g., “how can we trust that research when it’s only from Bob State University?” – does not mean that the research is shoddy or its conclusions can’t be trusted. If it has gone through the same rigorous scientific process as the work of the Harvard neurosurgeon, then it should have equal credibility. Final flog of the dead horse: you should definitely be aware of the credentials and background of researchers, but you should not use that as the sole criterion by which to judge their work.

    And now our bonus fallacy: the tu quoque fallacy. It has a fancy Latin name – it means “you too” – but the concept itself is simple. This is the classic child’s whine that “so-and-so gets to stay up until 10, so I should get to stay up too!” Just because someone else gets to do it doesn’t mean you get to do it! Even more specifically, the tu quoque fallacy is used to try to justify wrongdoing. I’m sure cops hear it all the time in the form of “Why didn’t you get that other guy who was racing past me at 95?” You know as well as the cop does that just because somebody else was going 95 doesn’t make it ok for you to go 90. I love tu quoque because it’s really so childish when used in this classic sense. But it does get used in other ways as well, in more of an “apples to oranges” way. You tend to hear the tu quoque fallacy when people can’t really refute an argument with logic, but they remember an example of something similar turning out not to be true, so they cite that instead. I’ve been hearing it regularly in discussions of global climate change when people refer to a brief, media-hyped panic in the 1970s that the world was about to go through a global freeze. As it happens, while a few papers suggested that a cooling period might be coming, the majority of research at the time found more evidence for global warming. But the media got ahold of the cooling theory and ran with it. The conclusion people draw is that the climate scientists who proposed a potential global cooling period turned out to be wrong; therefore, climate scientists who are predicting global warming must also be wrong. It’s a variation of the child’s whine: “science was wrong about something in the past, so it must be wrong now.” This is absurd. Scientific research is based on revising conclusions in light of new information. If scientists gave up every time something they predicted turned out to be wrong, no scientific field would ever advance. Being wrong in the past has little predictive value for being wrong in the future.

    It’s exhausting to try to keep track of all these fallacies, committed by ourselves, the people we talk with, and the sources we rely on for information. It’s also exhausting to glean what’s important from a conversation and use care to establish credibility without tipping over into an appeal to authority, or to cite examples of previous, similar situations without falling into a tu quoque, or to refrain from the ad hominem of calling somebody a blithering idiot (or much, much worse) instead of actually deconstructing their argument. Of course, a lot of people don’t really want to try because the fallacies are so much easier… but I do hope we will all keep trying.

  • Logical Fallacies: The Red Herring

    The red herring is an argument that I see deployed again and again, and I’m never entirely sure whether the person deploying it is even aware that they are bringing up issues that are tangential to the debate at hand. The phrase is said to originate from the days of fox hunting, when the strong scent of a smoked herring was supposedly used to throw the hounds off the pursuit of the fox. That’s exactly what the red herring does in an argument: it distracts the participants from what is really at issue, and they find themselves talking about onions when they started out talking about apples.

    I get very frustrated when people deploy the red herring, whether they are doing it deliberately or unconsciously. Actually, it’s the unconscious deployment that gets to me the most, because it tells me that my interlocutor does not have a firm grasp on what we are really debating. I honestly think it’s a defense mechanism for most people. They bring up side issues as a way to distract from the fact that they really have no answer to whatever point their opponent is making. (As an aside, I want to clarify that my use of words like argument and opponent is not meant to say that I expect every difference of opinion to lead to anger; but when people disagree about something and they engage in conversation about it, they become like opponents in a refereed debate – only without the formality of an actual referee. Of course, sometimes the debate does devolve into an actual argument that is heated with emotion.)

    The red herring seems to come up regularly in arguments that are about sensitive subjects such as gun control or gay marriage. I generally see it used when a person is arguing from emotion rather than from logic. For example, I might say that stricter gun control laws could have saved the lives of some of the 194 children who have died from gunshot wounds in the year since the Sandy Hook massacre in December 2012. Someone deploying a red herring might say “But what about all the people who used guns to defend themselves since Sandy Hook?” There may well be many cases of people deploying guns in self-defense since then, but that is not what the argument at hand is about. Bringing up guns used in self-defense is a distraction from my hypothesis that stricter gun control may have prevented the deaths of some children. My argument says nothing about whether stricter laws might have hindered those who used guns to defend themselves. Although that may well be the case, it is not the point of this particular, specific debate.

    Another situation that I’ve encountered many times is when the red herring is used to put people on the defensive. It usually takes the form of a question, wherein your interlocutor will say, “So you’re saying that we should (such-and-such illogical leap)?” It is so easy to be distracted by this and to start defending yourself from the stinky fish being lobbed in your direction! As another example, if I say that I am opposed to the “stop and frisk” policy in New York City because I think it unfairly targets minorities, a red herring-lobbing opponent could say, “So you’re saying that a suspicious looking person should never be stopped by police?” Of course that’s not what I’m saying, but if I lose my cool and chase the herring, the chance to talk intelligently about the merits of the stop and frisk policy is lost.

    I’m using broad examples on purpose to try to illustrate the red herring. Obviously, in the course of having conversations about these issues, many different points will be made about different aspects of particular issues. And in many cases, I’ve found, no matter how hard you try to keep your debate partner on point, they will keep tossing the fish. The hardest part, when you are trying to concentrate on a specific point, is not allowing yourself to be distracted by the scent of the herring and keeping your eye on the fox… or, in some cases, simply disengaging from the hunt.