Blog

  • Additive Outrage

    Rat poison saved my life. I know how strange that sounds, but it’s true. In July 2003 I was hospitalized with a pulmonary embolism – a blood clot in my lung. The treatment is blood thinners: IV heparin during a week in the hospital, then oral warfarin (brand name Coumadin) for six months afterwards to keep dissolving the clot and prevent a recurrence. Warfarin is an anticoagulant, and it also happens to be a very effective rodenticide, causing fatal internal bleeding in rats that ingest it in poison baits. So what’s the takeaway? It’s really quite simple: the dose makes the poison.

    I bring this up because I have noticed that it doesn’t take much to frighten people by telling them about “disgusting” or “scary” or “poisonous” stuff that shows up in food. Claims like these absolutely, positively require a great deal of skepticism and critical thinking. Case in point: I ran across an article in the Huffington Post that capitalizes directly on this sort of fear-mongering. Titled “9 Disgusting Things You Didn’t Know You’ve Been Eating Your Whole Life,” the article runs through a list of food additives that are apparently supposed to make us feel like the food industry is bent on poisoning its customers. Now, I’m not stupid; I’m well aware that there are all sorts of things in our food that are not exactly healthy, and even some that could be dangerous. I am concerned about modern eating habits (my own included!) and think it’s rather frightening how removed we are from the process of providing food for millions of people.

    In fact, when I teach the section on subsistence in my cultural anthropology classes, I ask my students to think about what they would eat if the world as we know it came to an end. Do they have the remotest inkling of what they would eat if there were no grocery stores or restaurants? And even if they talk about hunting, I ask them: when the bullets run out, how will you kill animals? Do you know how to prepare them? How will you keep that food from spoiling? What plant foods will you eat? I have no doubt that when the shit hits the fan for humanity, those few cultural groups that still forage or practice horticulture and pastoralism will be the only survivors, with a few exceptions for those who have learned skills for living off the land in nations like the United States (although even these few won’t survive as long-term populations unless they meet other people and are able to form larger groups that can sustain population growth).

    So what does any of this have to do with the HuffPo article? My real point is that people get unreasonably frightened or disgusted by things without thinking through why they are frightened or disgusted. The first thing on the list in the article is castoreum. This is a substance produced in the castor sacs of beavers, right next to the anal glands, and even I have to admit that it sounds pretty disgusting. It is used as a flavoring similar to vanilla, although according to Wikipedia the food industry in the US only uses about 300 pounds of it a year. My problem with this is the automatic assumption that some parts of an animal are acceptable for food use and others are not. The way we use animal parts is culturally determined and completely arbitrary. Why is castoreum any more disgusting than drinking the liquid that shoots out of a cow teat? Some people eat tongue – why is that body part any worse than eating the ground-up flesh from a pig’s side? What about eggs, which are essentially a hen’s unfertilized ova packaged in a shell? Disgust, again, is culturally determined and therefore ultimately arbitrary from an objective standpoint.

    Other things listed in the article include L-cysteine, an amino acid found in (among other places) human hair; sand; coal tar; antifreeze; and a few others. The human hair bit is similar to the beaver secretions bit – we just knee-jerk find it disgusting, but it’s not as if there is actual human hair in your food! Every living thing is built from amino acids, so by the same logic you could argue that any food containing an amino acid is partly, I don’t know, semen? Bile? Blood? In other words, stripped of any chemical background, all you’re told is that human hair contains a compound that gets processed into a food additive, and the implication is that you are eating hair directly. As for things like antifreeze and coal tar, refer back to the dose making the poison. Once again, it’s not as if food companies are pouring Prestone into your food. The ingredient in question is propylene glycol, which shares many properties with ethylene glycol, the chemical actually used in automobile antifreeze. Propylene glycol is used not only in food but also as a solvent in medications that are not water-soluble – so much like warfarin, propylene glycol in the right dose and formulation has important medical applications.

    I could go through the list one by one, but I’m hoping that these examples make my point that so much information and context is left out of articles like this. I really don’t understand the desire to frighten and disgust people by only focusing on shock value rather than useful information. Again, I want to stress that I realize there are bad things in our food, and I am firmly committed to the idea that most companies are more concerned about their bottom line than they are about the health and safety of consumers; but it’s also important to remember that if companies sicken or kill their customers they won’t be in business for long! And I know that plenty of people automatically distrust government agencies like the FDA, but again, what does the FDA gain by allowing truly dangerous chemicals to be part of the food supply? It behooves us to think very carefully about this sort of thing.

    A final point: in reading the comments at the end of the HuffPo article, I was amazed at the self-righteousness and privilege of many of the contributors. So many bragged about only eating fresh food from the farmers’ market or making their own bread or only buying organically raised meat or making baby food from scratch or blah blah blah. Have these people ever been outside their privileged little bubble and considered how the real world works for so many people? Farmers’ markets are great – if there’s one in your neighborhood and you can afford to pay the premium prices. Organic meat? Only if there is a fancy grocery store nearby and you want to pay double the price. Food made from scratch? Sure, if you have the time and the tools and the money for the often pricey ingredients. It’s terrific that a lot of people are trying to get back to basics with food prep – I myself make bread from scratch – but this attitude fails to recognize the deep inequality and lack of access to resources that so many people in the United States, and the world, have to contend with – but that’s a rant for another time.

  • Technology and Its Discontents: Instant Gratification

    Over the past few years, I have been doing more and more shopping online. I have long patronized Amazon for books, especially in the used marketplace, and I have recently had occasion to order non-book items from Amazon as well. Many of the clothes and shoes in my closet have been ordered online, and the lion’s share of the supplies I need for the Rock and Shell Club have been shipped to me from all over the country (and, in one memorable, not-to-be-repeated order, China). I appreciate the convenience of finding what I need online and having it delivered directly to me, as many of the items I need are not necessarily available locally; however, I am becoming increasingly concerned about what the Amazon model is doing to us culturally, behaviorally, and economically.

    Around March 2012 I read an article in Mother Jones that brought into focus something I had already started to hear a lot about: the backbreaking labor, low wages, and job insecurity that go into making our instant-gratification economy possible. Reading about author Mac McClelland’s experience working in a warehouse subcontracted to Amazon made me seriously question the business model that allows consumers to get their goods within a few days of their order. More than that, it made me scrutinize my own behavior, and I found myself asking why I expected to take delivery of my order in just a few days. The simple answer is that the Amazon model has created that expectation – order now, have it tomorrow if you’re willing to pay the price, and in just a few days or a week even if you’re not. Once you become accustomed to things arriving quickly, any delay in shipment starts to feel like bad customer service – hence, Amazon becomes the customer-service king over your local bookstore or small online shop, which might take a few weeks to deliver the book you order.

    To do business this way, Amazon must cut corners wherever possible, which is what leads to the labor conditions in its distribution centers. But consumers, being human, are out-of-sight, out-of-mind creatures, so no thought is given to what is required to make their near-instant gratification possible. That is the nature of business competition – the nature of capitalism. But what I find dismaying is not the near-instant gratification for items you may have trouble getting anywhere except online; it’s that consumers are now ordering things they could just as easily buy at the local store. This article, in which a man explains that he orders his 40-pound bags of dog food from Amazon because he doesn’t want to be bothered with carrying them through a store, to his car, and into his house, is a case in point. The convenience of home delivery makes it worth it to him to pay for Amazon’s Prime service. For a flat annual fee, delivery is free – delivery of anything Amazon sells, no matter the size or the weight. When I read the article, I felt sadness, contempt, anger, disgust – all those knee-jerk, visceral reactions to what amounts to sheer laziness on the part of this consumer… but is it really laziness, or is it economic hegemony? After all, why not maximize your own time and convenience if it only takes a few dollars a year to have household items delivered straight to your door?

    And so we come to the crux of my rant. I think the Amazon model is bad for us. I think instant gratification is bad for us. I think Amazon, and the competition it has engendered, is destroying our ability to be patient, to be thoughtful, to be mindful of all the hidden economic exploitation that is required for us to get what we want NOW. I admit I’m not free of responsibility here, but I am doing my part to push back by finding outlets other than online retailers for the things I want and need. And, if I do make an online purchase, I try to buy directly from the source rather than from Amazon. When possible, I buy from small online businesses, and plan for the possibility that my order may take a week – or more! – to arrive. I re-read the Mother Jones article, and I hope for a time when the pendulum swings and the price of an item, including shipping, truly reflects the cost of doing business this way. And this is not just a monetary cost; it is a social and cultural cost, and it is helping to perpetuate the systematic inequality and labor exploitation that are inherent in the capitalist marketplace.

  • Logical Fallacies: Appeal to Authority and the Tu Quoque Fallacy

    The appeal to authority is probably one of the most common logical fallacies. You hear it and see it all the time: “The Intergovernmental Panel on Climate Change says the climate is changing. All those scientists can’t be wrong, so the climate must be changing.” It’s true that the research the IPCC assesses provides a great deal of scientific evidence that the climate is changing, and that the change is most likely caused by human activity. But simply saying it’s true because a fancy-sounding panel of scientists says so is not enough. It’s the research that supports the conclusion, so if you are going to make an argument about something, cite the research, not just the source.

    Don’t get me wrong; I’m not saying that it’s a bad thing to bolster your argument by touting the credibility of your sources. I am saying that you’d better be prepared to cite the conclusions of your sources as well. The reason an appeal to authority by itself is not enough is that it can be extremely misleading. Just because a person, group, or study is associated or affiliated with a highly respected institution or researcher does not mean that the conclusions themselves are sound. Linus Pauling, a Nobel laureate in chemistry, was rightfully lauded for his work in explaining complex molecular bonds (and he was also awarded a Nobel Peace Prize for his anti-war activism). Pauling is routinely listed as one of the most influential American scientists of the 20th century. However, in his later years, he did work on vitamin C, among other things, that fell short when critically analyzed by other scientists. Nevertheless, even today people will cite Pauling’s work to bolster claims about vitamin C’s efficacy in treating certain diseases, such as cancer, even though none of his claims have stood up to testing. It is simply his authority as a Nobel Prize winner that seems to give credence to disproved claims.

    Something similar happens with people who have the title of “doctor” (and I should know, being one of them); somehow, the Doctor is seen as an authority simply because of her title. I claim no specialized knowledge outside of my very specific research in anthropology, but when I talk, people who know I’m a PhD tend to listen… and to ask me about things I really know nothing about! Along similar lines, Dr. Laura Schlessinger, of conservative talk-show radio fame, earned the title of Dr. by virtue of her PhD in physiology… not in the psychotherapy she dispensed on her program. Yet “Dr. Laura” was able to trade on her title to enhance her credibility with her listeners: a perfect example of the appeal to authority. This is a fallacy we all have to be very careful of, not only because we use it in our arguments with others but because we fall for it when convincing ourselves of the rightness of our own views. Always remember that it is not enough that somebody is “a Harvard Medical School neurosurgeon.” That by itself does not make the research credible. It is the scientific process, the peer review, the repeated testing, that gives a particular conclusion its credence. And the flip side, the reverse appeal to authority – e.g., “how can we trust that research when it’s only from Bob State University?” – is just as fallacious: a less prestigious affiliation does not mean that the research is shoddy or its conclusions can’t be trusted. If it has gone through the same rigorous scientific process as the work of the Harvard neurosurgeon, then it should have equal credibility. Final flog of the dead horse: you should definitely be aware of the credentials and background of researchers, but you should not use that as the sole criterion by which to judge their work.

    And now our bonus fallacy: the tu quoque fallacy. It has a fancy name but the concept itself is simple. This is the classic child’s whine that “so-and-so gets to stay up until 10, so I should get to stay up too!” Just because someone else gets to do it doesn’t mean you get to do it! Even more specifically, the tu quoque fallacy is used to try to justify wrongdoing. I’m sure cops hear it all the time in the form of “Why didn’t you get that other guy who was racing past me at 95?” You know as well as the cop does that just because somebody else was going 95 doesn’t make it okay for you to go 90. I love tu quoque because it’s really so childish when used in this classic sense. But it does get used in other ways as well, often in more of an “apples to oranges” fashion. You tend to hear the tu quoque fallacy when people can’t really refute an argument with logic, but they remember an example of something similar turning out not to be true, so they cite that instead. I’ve been hearing it regularly in discussions of global climate change when people refer to a brief, media-hyped panic in the 1970s that the world was about to go through a global freeze. As it happens, while a few papers suggested that a cooling period might be coming, the majority of research at the time found more evidence for global warming. But the media got ahold of the cooling theory and ran with it. The fallacious conclusion people draw is that the climate scientists who proposed a potential global cooling period turned out to be wrong; therefore, the climate scientists predicting global warming must also be wrong. It’s a variation of the child’s whine: “science was wrong about something in the past, so it must be wrong now.” This is absurd. Scientific research is based on revising conclusions in light of new information. If scientists gave up every time something they predicted turned out to be wrong, no scientific field would ever advance. So being wrong in the past has little predictive value for being wrong in the future.

    It’s exhausting to try to keep track of all these fallacies, committed by ourselves, the people we talk with, and the sources we rely on for information. It’s also exhausting to glean what’s important from a conversation and use care to establish credibility without tipping over into an appeal to authority, or to cite examples of previous, similar situations without falling into a tu quoque, or to refrain from the ad hominem of calling somebody a blithering idiot (or much, much worse) instead of actually deconstructing their argument. Of course, a lot of people don’t really want to try because the fallacies are so much easier… but I do hope we will all keep trying.

  • Football: Why I Won’t Be Watching

    I was raised by parents who are baseball and football fans – not fanatics, but loyal enough to their hometown teams to be regular watchers and attendees at Padres and Chargers games. I would say we were more of a baseball family, and I count going to Padres games at San Diego Stadium (then Jack Murphy, and now, in this era of paid advertisements masquerading as sports fields, Qualcomm) as some of my fondest childhood memories. I followed the Chargers more peripherally, but you knew it was football season when you could hear the occasional shriek from my house indicating to the neighborhood that the game was on and my mom was watching. I started watching football more regularly in college and remained a Chargers fan. In the last few years I even started watching games not involving the Chargers, and I was really enjoying learning more about the strategies, the roles of the different players, and the intricacies of some of the plays. I sometimes felt a little tug of guilt on Sundays when I would schedule the day’s activities around the game, but even though the game might keep me at home I would often just keep the TV on in the background or listen on the radio while doing other things rather than glue myself to the screen for 3+ hours. But make no mistake, I enjoyed my football.

    On May 2, 2012, former Charger and frequent Pro Bowler Junior Seau committed suicide. Seau had been in and out of the news since his retirement for some minor brushes with the law, but his suicide was a blockbuster story and a heartbreak for all football fans, not just fans of the Chargers. Seau shot himself in the chest, and immediately comparisons were drawn to the February 2011 suicide of retired NFL player and four-time Pro Bowler Dave Duerson, who had also shot himself in the chest. Duerson left a note requesting that his brain be used in scientific research, which is why he had chosen to shoot himself in the chest instead of the head. Although Seau left no note, as the investigation proceeded it appeared that Seau’s wishes may have been the same. What was the link? Duerson and Seau – and, as it turned out, a number of other retired players who had committed suicide – were all found, after their deaths, to have had a condition called CTE: chronic traumatic encephalopathy.

    A few months into the 2013 season, I watched the PBS Frontline documentary “League of Denial.” The documentary explores the link between football and CTE and investigates how the NFL has dealt with the problem. The filmmakers found that the NFL consistently denied that football was dangerous to players and insisted that concussions, even multiple concussions, were not responsible for the degenerative brain disease that some former players were developing. The documentary is pretty damning in its conclusion that NFL brass actively worked to thwart research, intimidate the researchers, and keep players in the dark about the seriousness of concussive injuries to the brain and the potential for developing CTE. The implication is that the NFL was much more concerned about protecting its financial bottom line than it was about protecting the health and safety of its players. I highly recommend watching the show to learn the full extent of the issues.

    In spite of being very impressed with “League of Denial,” I was left with questions. Although the correlation between CTE and concussions in NFL players, especially those who commit suicide or otherwise die young, is very high, it’s always important to establish cause and effect before drawing conclusions. I believe that much more research needs to be done to truly understand what is going on. A well-designed study of CTE in football players needs to control for multiple factors, such as length of time playing; behavior off the field (e.g. drug and alcohol use, non-football-related injuries); individual and family medical history; non-medical background factors (e.g. money problems, relationship problems, and other stressors); and genetics. It may well be that concussions and CTE are directly causally linked in football players and that there are no other factors involved. But research like this could determine whether other factors are involved and, if so, make the game safer by using them to assess an individual player’s risk.

    Research on boxers has established that they are at risk of developing CTE from repeated blows to the head. This seems intuitive, since boxing involves head shots that are intended to render the opponent unconscious. But football, with its helmets and pads, has led to the assumption that it is intrinsically safer than a sport like boxing. Paradoxically, it may be that the more you pad a player, the safer he feels, so he ends up taking more risks than he would otherwise, resulting in more frequent and more severe injuries. Some people argue that reducing the pads and helmets, if not banning them outright, would actually make football safer. It’s an interesting idea – back in the day of leather helmets and no padding, football players tackled the body, whereas now, shots to the head are commonplace (though the NFL has made certain head-busting plays illegal). Still, if you watch “League of Denial,” you can see that the game has become more and more brutal, and that head- and body-crushing violence is glorified not just on the field and in the locker room, but by the league, the media, and the fans. Yes, beautifully executed passes and running plays are glorified too, but bone-crunching tackles are also looped endlessly on the sports shows.

    After years of denials, the NFL did finally start putting some money toward concussion research, even though it continues to deny a link between football head injuries and CTE. It also settled a lawsuit brought by former players alleging that it had actively downplayed the risks of the game, and of concussions in particular. Interestingly, the terms of the $765 million settlement state that the NFL is not admitting to any guilt; instead, NFL commissioner Roger Goodell said the settlement was the league’s way to “do the right thing for the game and the men who played it.” While $765 million is a lot of money, in reality it’s a very small sum for the NFL – it amounts to under 8% of the league’s annual revenue of around $10 billion. In fact, the league saw the settlement as a victory because it prevented a lengthy court battle and the risk of having to engage in a discovery process that could bring some very unsavory things to light. If you want to be cynical about it, you could call it a payoff – and for the former players (and their families) who are suffering, some money now is better than a long court fight during which some of them will surely die.

    So why am I writing about all this? Because I have decided that I cannot, in good conscience, support the NFL. After viewing “League of Denial,” I stopped watching or listening to games (although – full disclosure – I did attend the Chargers-Broncos game on November 10, 2013, because my sister bought the tickets for my birthday in September and I didn’t want to let her down).

    I have brought this up with several people, and every single one has said to me that the players know what they are getting into, and not only that, they are paid millions to take on the risk. I disagree. I think they are only just now starting to learn what they are getting into. I think they are not given the information they need to make an informed decision. Bear in mind, these athletes begin playing as kids. Do you think their parents were aware of CTE in the late 1980s and early 1990s, when some of today’s players were in Pop Warner or high school? If they had known then what they know now, would they have allowed their sons to play? Parents today are better armed than the parents of current players, but even now not enough research has been done. That research must happen if players and their families are to go into this game with all the information – and the NFL has to pay for it if it wants players to keep taking on the risks.

    As for the “millions of dollars” argument, it doesn’t sway me at all. How much is your future health worth? How much money does it take to sacrifice your brain? Is there really a number you can put on that? And let’s talk about league minimums. In 2013, the minimum salary for a rookie was $405,000, and a 10-year veteran made at least $955,000. A lot of money? By minimum-wage standards, you bet it is… but when you consider that the average NFL player’s career lasts about 3.5 seasons, it suddenly doesn’t seem like that much. Obviously not every player makes the minimum, but even if you manage to make $3.5 million for your 3.5 seasons of service, once you are cut from the team in your mid-20s, what comes next? Individuals are responsible for managing their own money, and some players probably are well-advised and do okay for themselves, but 3.5 years of a low-to-medium six-figure salary (especially, sadly, for men who didn’t give much thought to how they’d support themselves once their football careers were over) will not last forever, and it certainly is not enough to compensate for repeated traumatic brain injuries (not to mention the overall body injuries that can keep many players in pain for the rest of their lives). And let’s not forget that most of these players played in high school and college, suffered the same injuries, and were not compensated at all (my beef with big-money sports in college is a rant for another time). So no, I don’t buy that these players are paid well enough to justify the risk – not even the few star players with multi-million-dollar contracts.
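    To put rough numbers on that last point, here is a quick back-of-the-envelope sketch in Python. It uses only the figures cited above (the 2013 rookie minimum and the roughly 3.5-season average career) and is meant as an illustration, not an analysis of real contracts.

        # Rough career-earnings arithmetic using only the figures cited above.
        ROOKIE_MINIMUM_2013 = 405_000    # 2013 NFL minimum salary for a rookie
        AVERAGE_CAREER_SEASONS = 3.5     # approximate average length of an NFL career

        # Gross pay for a player who earns the rookie minimum for an average-length
        # career (before taxes, agent fees, and the decades of life that follow).
        career_gross = ROOKIE_MINIMUM_2013 * AVERAGE_CAREER_SEASONS
        print(f"Career gross at the rookie minimum: ${career_gross:,.0f}")
        # Prints: Career gross at the rookie minimum: $1,417,500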

    I know a lot of people who read this will disagree with me, and that’s okay. I’m not expecting anybody else to change their behavior, and I do not judge people who continue to watch and enjoy the game. The Chargers managed to squeak into the playoffs this year, and even though I’m not watching I am still happy to hear when my hometown team wins. But until the NFL admits its role in downplaying the risk of concussions and acknowledges the link to CTE, I won’t be watching. Until it puts as much money as it takes into researching the correlation between concussions and CTE, I won’t be watching. Until that research either definitively shows that there is no link, or comes up with ways to quantify the risks and applies them to improving player safety, I won’t be watching. Until the NFL fully and thoroughly educates each player on the risks of the game (and if you watch “League of Denial” you’ll see that it currently doesn’t do much), I won’t be watching. And until the NFL prioritizes the lives and health of players over the bottom line, I won’t be watching.

  • Shifting Perspective: Kiddie Couture

    On April 24, 2013, a building in Bangladesh known as Rana Plaza collapsed, killing 1,129 people and injuring 2,515. Rana Plaza housed several garment factories, in which workers – including children – were employed in manufacturing clothing for a variety of brands, including The Children’s Place, Benetton, and Walmart. The collapse triggered a wave of collective shock and outrage throughout the developed world as people were faced with the reality that working conditions in Bangladesh were poorly regulated, often dangerous, and beset with bribes, graft, and abuse.

    At the time of the collapse, the minimum wage for Bangladeshi garment workers was $38 a month. Following the collapse, international pressure and a series of worker strikes led the Bangladeshi government to raise the minimum wage to $68 a month, beginning on December 1, 2013. The real shock to many people in countries like the United States was having to face the fact that the reason we are able to buy $10 t-shirts and $19 jeans is that workers in places like Bangladesh make the equivalent of 39 cents an hour – and that’s assuming a standard 40-hour work week. In reality, Bangladeshi garment workers can labor for 12 hours a day, 7 days a week. Of course, the cost of living in Bangladesh is much lower than it is in most parts of the world – but we are fooling ourselves if we believe that this is truly a living wage.
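    If you want to check that 39-cents figure, here is the back-of-the-envelope conversion, sketched in Python. The only number not taken from the paragraph above is the roughly 4.33 weeks per month, which is simply my own 52-weeks-divided-by-12-months rounding.

        # Hourly wage implied by the new monthly minimum, assuming a 40-hour week.
        MONTHLY_MINIMUM_USD = 68      # Bangladeshi garment minimum wage as of Dec 1, 2013
        HOURS_PER_WEEK = 40           # the "standard work week" assumption above
        WEEKS_PER_MONTH = 52 / 12     # about 4.33 weeks in an average month

        hourly_wage = MONTHLY_MINIMUM_USD / (HOURS_PER_WEEK * WEEKS_PER_MONTH)
        print(f"Implied hourly wage: ${hourly_wage:.2f}")
        # Prints: Implied hourly wage: $0.39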

    I bring this up not because I have a solution for the wage slavery taking place in much of the economic periphery – I don’t. I bring it up because I think it’s important for people to have perspective. To that end, I offer the story that made me decide to rant about this topic. ABC News broadcast a story about a new trend in children’s clothing: renting clothes instead of buying them. On the face of it, I think this is a terrific idea. The company offers parents the chance to pay a fee to rent clothes for special events such as weddings instead of having to pay full price for an outfit that their child will probably wear only once and will soon outgrow in any case. Great! Sounds like a wonderful way to reduce our impact! But here’s where I got fired up: the company in question, Borrow Mini Couture, only rents high-fashion clothing. They carry brands such as Moschino, Roberto Cavalli, John Galliano, and Fendi – brands that charge hundreds of dollars for a single piece of children’s clothing. The least expensive Roberto Cavalli dress on the website retails for $352 – and it’s sized for a one-year-old girl. You can rent it for five days for $98 – $30 more than the monthly minimum wage of a Bangladeshi garment worker.

    The ABC piece makes it sound like this company is a boon to parents who want to save money. That very idea makes me want to weep. It’s not about saving money. It’s about aspirational parents being able to say they dressed their tot in couture clothing. Now, I don’t know where these couture brands manufacture their clothes, but that’s not really the point. Even if they are made by workers who are employed in safe, well-regulated factories where they earn enough to make a dignified living, what does it say about us as a society that we would even consider paying hundreds (or thousands) of dollars for a single piece of our own clothing, much less the clothes for our kids? And what does it say about us that there are people who will spend $50 to $100 just to briefly rent a status symbol for their child (or more accurately, for themselves)?

    For the shift in perspective I wish to impart in this rant, I offer this 2-minute video produced by the Toronto Star of children working in the garment industry in Bangladesh. Juxtapose this video with the ABC story and, like me, you might just want to weep – and, I hope, want to think about what this means for the world we live in.

  • Logical Fallacies: The Red Herring

    The red herring is a tactic that I see deployed again and again, and I’m never entirely sure whether the person using it is even aware that they are bringing up issues that are tangential to the debate at hand. The phrase supposedly comes from the days of fox hunting, when the scent of a pungent smoked herring was said to be used to pull the hounds off the trail of the fox. That’s exactly what the red herring does in an argument: it distracts the participants from what is really at issue, and they find themselves talking about onions when they started out talking about apples.

    I get very frustrated when people deploy the red herring, whether they are doing it deliberately or unconsciously. Actually, it’s the unconscious deployment that gets to me the most, because it tells me that my interlocutor does not have a firm grasp on what we are really debating. I honestly think it’s a defense mechanism for most people. They bring up side issues as a way to distract from the fact that they really have no answer to whatever point their opponent is making. (As an aside, I want to clarify that my use of words like argument and opponent is not meant to suggest that I expect every difference of opinion to lead to anger; but when people disagree about something and they engage in conversation about it, they become like opponents in a refereed debate – only without the formality of an actual referee. Of course, sometimes the debate does devolve into an actual argument, heated with emotion.)

    The red herring seems to come up regularly in arguments that are about sensitive subjects such as gun control or gay marriage. I generally see it used when a person is arguing from emotion rather than from logic. For example, I might say that stricter gun control laws could have saved the lives of some of the 194 children who have died from gunshot wounds in the year since the Sandy Hook massacre in December 2012. Someone deploying a red herring might say “But what about all the people who used guns to defend themselves since Sandy Hook?” There may well be many cases of people using guns in self-defense since then, but that is not what the argument at hand is about. Bringing up guns used in self-defense is a distraction from my hypothesis that stricter gun control may have prevented the deaths of some children. My argument says nothing about whether stricter laws might have hindered those who used guns to defend themselves. Although that may be true, it is not the point of this particular debate.

    Another situation that I’ve encountered many times is when the red herring is used to put people on the defensive. It usually takes the form of a question, wherein your interlocutor will say, “So you’re saying that we should (such-and-such illogical leap)?” It is so easy to be distracted by this and to start defending yourself from the stinky fish being lobbed in your direction! As another example, if I say that I am opposed to the “stop and frisk” policy in New York City because I think it unfairly targets minorities, a red-herring-lobbing opponent could say, “So you’re saying that a suspicious-looking person should never be stopped by police?” Of course that’s not what I’m saying, but if I lose my cool and chase the herring, the chance to talk intelligently about the merits of the stop-and-frisk policy is lost.

    I’m using broad examples on purpose to try to illustrate the red herring. Obviously, in the course of having conversations about these issues, many different points will be made about their different aspects. And in many cases, I’ve found, no matter how hard you try to keep your debate partner on point, they will keep tossing the fish. The hardest part, when you are trying to concentrate on a specific point, is to avoid being distracted by the scent of the herring and to keep your eye on the fox… or, in some cases, to simply disengage from the hunt.