Author: Ranthropologist

  • Logical Fallacies: The Appeal to Antiquity


    The first definition for the word conservative from Dictionary.com reads as follows: “disposed to preserve existing conditions, institutions, etc., or to restore traditional ones, and to limit change.” With this definition in mind, it comes as no surprise to me that those who identify themselves as politically conservative often employ the appeal to antiquity, which is a common logical fallacy. When you are averse to change, it makes sense to argue that things should stay the same because “that’s the way they’ve always been.” That argument, in a nutshell, is the appeal to antiquity.

    The appeal to antiquity is part of a family of fallacies known as “irrelevant appeals.” The idea is that because something has been done or believed for a long time, it must be true. Obviously we know that this is not the case: people used to believe that the earth was flat, that all disease was caused by “bad air,” and that phrenology was an accurate science. I know that political conservatives aren’t the only people to employ this fallacy, but I do tend to associate it with common conservative arguments about a variety of hot political topics. For example, an extremely common argument against legalizing gay marriage is that marriage has always been defined as a union between one man and one woman. Aside from the fact that this is not historically accurate, it clearly employs the appeal to antiquity: that’s the way things have always been; therefore, we shouldn’t change anything.

    The opposite of the appeal to antiquity is the appeal to novelty. This fallacy holds that because something is newer it must be better, but that’s just as wrong as the appeal to antiquity. New findings are announced all the time that ultimately turn out not to be true. This is related to the bandwagon fallacy, which proposes that if a bunch of people believe in a new idea it must be true. This applies to recent research into all sorts of things that may be good or bad for us, such as the potential benefits of various dietary supplements. Recent research into the benefits of taking multivitamins has concluded that they do not appear to benefit human health. This may well be true, but it is not the recency of the conclusion that makes it so; and I wouldn’t be surprised if further research disputes this claim. (This topic deserves a rant of its own discussing the rampant fallacies that are employed by consumers of mainstream media reporting on scientific research – both the way the information is presented and the way it is interpreted leaves much to be desired.)

    Back to the appeal to antiquity. Just like the appeal to novelty, the age of the topic under consideration is of no relevance to its veracity. Again, the position being argued may well be true, but it is not its age that makes it so. And conversely, just because something is old does not mean it should automatically be abandoned. The point is that the age of a topic under discussion has no relevance to its validity. So, to use a few more examples, just because something is written into the old documents that set up the governance of our nation does not mean it remains an idea that should be embraced. For example, should Black Americans still be considered 3/5ths of a person? Should we still allow slavery? Should women still not be allowed to vote? Should Native Americans not be granted US citizenship? And for a very hot topic in our current culture wars… should every American citizen be allowed to own a gun just because that’s always been a part of our governing tradition? Don’t mistake me – there are valid arguments for and against gun ownership, but the appeal to antiquity is not one of them. And as an example of a governing novelty that did not work out: should alcohol still be prohibited? Prohibition did not work – its newness as an amendment to the US Constitution did not make it a success.

    So, if you are going to argue that something should remain as it is, do not make the mistake of deploying the appeal to antiquity. There are many good arguments in support of a variety of positions on our many cultural conflicts, but saying “that’s the way it’s always been” is not one of them. The more carefully you select your arguments, the more likely you are to be heard.

  • #YesAllWomen, #StillSomeMen


    When I was 13 and in 8th grade, I went to a sleepover at a friend’s house. Her parents were not home, and she invited some boys over. While she and the other girl present went into separate bedrooms with their boyfriends, I was left alone with three other boys. I didn’t feel any fear, and I flirted innocently with the boys, thinking perhaps one of them would like me and ask me to “go” with them – our youthful term for being a couple. It never occurred to me that what I was doing might be seen as an invitation to sexual activity; although my two friends were more worldly, I was still naive, and I thought that the most that might happen would be one of the boys trying to put an arm around me or even kiss me. So I was shocked and alarmed when the boy with whom I had been doing most of the flirting suddenly lunged at me, grabbing at my nightgown and growling “I’m going to fucking rape you!” I pulled away from him and raced to the bathroom, where I managed to slam and lock the door just as the boy caught up with me. He pounded on the door, shouting at me to “fucking open” it. Then I heard him move away from the door, and with horror I realized he was heading for the balcony. The bathroom window overlooked the balcony, and I ran to make sure it was locked as well as the boy slammed into it and howled at me to let him in.

    After a little while, the house was silent again. I slumped against the bathroom door and held my breath, wondering if I could sneak down the stairs and out the door to my own house, which was just a few houses away. My heart pounded in my throat – I can still feel it – as I cautiously opened the door to peek, then scrambled down the stairs and out the front door. I made it to my own house and fell on the living room sofa, still feeling the terror, confusion, and shame coursing through me. I fell asleep crying on the couch and was awakened in the morning by my mother. I didn’t tell her exactly what had happened, only that I had decided to come home. I thought about the boy who had threatened me – he was a year younger than me, in 7th grade – and wondered what would happen when I saw him at school. Fortunately, he pretended he didn’t even know me.

    I don’t often think about that night, and I don’t feel as if it has had a lasting impact on me – but in retelling it here I feel the ghost of that night’s terrible fear. I was so young, and the boy was so young, yet neither of us was so young that we hadn’t absorbed some of the more egregious lessons in gender relations that are taught by our culture. He felt he had the right to sexual activity with me. I felt ashamed that I had somehow sent the wrong message. I should have been angry but instead I was humiliated and frightened. I lost track of the boy – he was not a member of my regular group of friends – but I wonder now if he ever tried to take what he felt was his from any other girls.

    This all comes up in response to the #YesAllWomen hashtag that has trended on Twitter in the days since a disturbed man took out his frustrations against women and men he perceived to be sexually successful in a shooting rampage in the community of Isla Vista, California. Elliot Rodger believed that he deserved the attention and sexual availability of women, and because his needs were not fulfilled he wrote a terrifying 141-page manifesto and then set about killing the objects of his rage. This tragedy is a mix of cultural hot potatoes: mental illness, gun control, victimhood, and misogyny. I do not believe for a minute that any one of these things alone is responsible for the killer’s rampage, any more than I believe that violent video games automatically turn players into killers. What I do find most interesting about this event is the light it is throwing on male privilege in our society.

    I can already sense some of you rolling your eyes and scoffing. “Male privilege? Come on, give me a break! It’s tough to be a man! Every woman automatically thinks you are an asshole who is just waiting to assault someone!” Calm down – that’s not what I mean by male privilege. What I mean is that men have the privilege of not having to deal with things that women deal with on a regular basis (and yes, men probably have to deal with things on the regular that women don’t, but that’s not the subject of this post nor is it the point – just because men deal with their own issues doesn’t make women’s issues any less valid). I mean the privilege of not feeling uneasy about walking alone at night, or being afraid to open the door when someone knocks, or lying to a man about having a boyfriend because past experience has taught you that if you just say you aren’t interested, some men will keep bothering you anyway. This is not the same thing as being spoiled, which is how some people tend to interpret privilege. In fact, I would rather call it something like “things men get to take for granted” but that’s cumbersome. So again, I’m not saying male privilege makes men spoiled or unaware – it just means there are things they can take for granted about their safety and how society will treat them that women can’t.

    This idea comes to a head with the #YesAllWomen hashtag. Some men have responded with a hashtag of their own: #NotAllMen. That is absolutely correct. Not all men harass or assault or demean or attack or condescend or otherwise make women feel unsafe and disrespected. But the point is, SOME MEN STILL DO. If you are not one of them, that is wonderful, and I understand if you feel defensive, but instead of reacting defensively, stop for a moment and think about why women feel this way. It is not meant to be a blanket indictment of all men. Instead, I read it as an indictment of a culture in which a person like Elliot Rodger can find a community of men who truly, horrifyingly believe they are owed the sexual attention of women. And it is an indictment of a culture in which YES, ALL WOMEN have stories about being harassed or bothered or afraid. We are so quick to blame the victim or say she should toughen up, pull up her big girl panties, and put a stop to the harassment. Why aren’t we asking instead that the men who still treat women in this way pull on their big boy panties and act like civilized human beings who treat others, no matter what their gender, with dignity and respect? Why aren’t we asking our culture to grow up and start teaching boys as young as the 12-year-old who attempted to assault me all those years ago that men and women are equals with autonomy, individuality, and the right to feel safe? This is not about not being a victim – this is about not being a perpetrator.

    So this is my response to the #NotAllMen hashtag: #StillSomeMen. I know many warm, wonderful, caring, loving men. I am lucky to always have had good, close friendships with men. I am incredibly blessed in my relationship with my father. I do not blame all men for the actions of some. But it is still important to fight against the men and the culture that still gives #YesAllWomen stories to tell about their experiences with misogyny and fear.

    In closing, I want to recommend two of the several articles that have commented eloquently on this phenomenon. There are many, many more, but these two particularly resonated with me.

    Your Princess Is in Another Castle: Misogyny, Entitlement, and Nerds, Arthur Chu. This article is a stunner. And I was surprised and gratified to see Chu describe two movie scenes that have always bothered me. They both feature what can only be called rape, but both scenes are played as victories for the nerdy guys who finally get to sleep with the hot girl because the girls are either tricked or too drunk to know the difference.

    I Am Not An Angry Feminist. I Am A Furious One., Madeleine Davies. This one inspired me to start using the #StillSomeMen hashtag on my own Twitter account. Davies is more eloquent than I am about why the #NotAllMen response is upsetting.

  • Logical Fallacies: Attribution Error


    How many times have you honked your horn in anger or raised your middle finger at some idiot while driving? Have you ever seethed inwardly as some dawdler wastes time at the checkout counter while you are waiting behind them in line? Do you assume that the person who took up two spaces in the parking lot is a complete asshole? On the other hand, how many times have people honked at you or flipped you off as you sheepishly realize that you accidentally went out of turn at a four-way stop? Have you felt the back of your neck burning with the stares of people behind you at a checkout line as you realize you entered your PIN incorrectly or forgot to give the cashier a coupon? How about having to park awkwardly between the lines because another car was partially blocking the space? But you’re not a bad person, right? Those other people, though…

    When you believe that your actions can be explained by situational factors, but other people’s actions can be explained by their personalities, you are succumbing to a particular type of attribution bias known as fundamental attribution error. Attribution bias involves the human tendency to explain our own and other people’s behavior by attributing it to causes that may actually have nothing to do with the behavior. Overall, we tend to explain our own actions, or the actions of those we know well, as being due to situations and not due to something fundamental about our personality. Conversely, when it comes to explaining the behavior of people we don’t know, we are much more likely to explain it as a function of who they are without taking contextual factors into account. Essentially, we are judging a book by its cover.

    Attribution errors occur in the public sphere all the time. If you have ever made the mistake of getting sucked into the comment page rabbit hole accompanying articles about controversial issues, you know what I am talking about. So often, we are only given a tantalizing tidbit of information in an article, but that’s all it takes to trigger a cascade of attribution error. I find this troubling. One of the classic examples of attribution error is the case of the Albuquerque woman who sued McDonald’s after she spilled hot coffee into her lap. This case took the media by storm and eventually became a cultural touchstone for describing apparently frivolous lawsuits. The vitriol that rained down on Stella Liebeck was thick and furious. She was obviously an idiot for resting the coffee cup between her legs. She shouldn’t have been driving with hot coffee in the first place, so she must be a careless person in general. She clearly was just out to get McDonald’s because they have deep pockets. How dare she sue for an incident that was clearly her own fault? She was just trying to get rich off McDonald’s! It turns out that the reality of Liebeck’s case was much, much different from the public perception of events. There is a reason the jury awarded such a huge amount of money when they heard the case – it is because they heard the facts and made their decision based on those facts. In hearing the facts, the possibility of attribution error was dramatically reduced. I strongly encourage you to read the facts in the case if you are one of those who have never heard them. There is even a documentary film about the case called Hot Coffee, which explains how Liebeck’s case got so distorted.

    I started thinking about this the other day when I was reading a Jezebel article about a woman in Ontario, Canada who hit three teenage boys with her car, injuring two and killing one. She is suing a whole host of people in connection with the case, including the dead boy. I reacted as most people probably would when I read the article: this woman is clearly a monster. She was speeding. She may have been talking on her cell phone. She killed a kid and badly injured two others! What kind of awful scum of humanity would dare to sue the families involved in this tragedy, much less sue the dead kid? And I was not surprised when I scrolled down to the comments and saw that many posters felt as I did. But as is my general practice, after getting over my initial reaction I started to wonder about the context of the situation. How fast was she actually going? Is there any evidence supporting the allegation that she was on her phone? What about the kids? Did they ride into her path? What time of day was it? What is the context? What are the facts? It turns out that it was dark when the boys were hit. They were wearing dark clothes. They were riding side-by-side along the road. Now, I’m certainly not blaming the victims here, but it sounds like this situation was ripe for potential tragedy and that they were struck by accident. Even the most attribution-biased among us probably don’t believe that the driver hit the boys deliberately. And if you’re like me, you also start to think about your own, personal context. I rarely drive the exact speed limit. I wouldn’t say I’m a speed demon, but 5-10 miles above the limit is pretty par for the course. I’ve also been guilty of taking calls while driving… sending and receiving text messages… even checking social media. (And just FYI, nowadays when I feel the temptation to use my phone while driving I ask myself if it’s worth a life to do it. The answer is always no.) If I were to hit and injure or kill someone under those circumstances, I would be crippled with guilt and shame… but would it mean I am a monster?

    You may be saying to yourself that this is all well and good, but what in the world could ever justify this woman suing the dead boy and the families for emotional trauma? What kind of person would do such a thing? She must be a monster! I think this is the point at which we must pause and ask ourselves what we might feel if the same thing happened to us. This appears to have been a terrible accident. I don’t know about you, but if I hit and killed someone, whether I was at fault or not, I would be devastated. That devastation would probably manifest itself physically as well as emotionally. I would live it over and over and suffer terrible guilt, grief, and shame. And I’d also have to defend myself in the court of public opinion as well as the civil court. Of course, the driver in this case is being sued by the victims’ families. And she is countersuing because that’s what lawyers tell you to do in cases like this. It’s a tactic you use to protect yourself in the court system so that if the judgment comes down against you, you have some protection from financial ruin. I don’t know about Canada, but in the United States this is a fairly typical situation that happens at the behest of insurance companies who don’t want to be the ones paying out a big settlement. There may well be more to this situation, but I don’t think it’s fair to automatically paint the driver as a soulless monster without at least attempting to learn more about the context.

    Don’t get me wrong. I’m not saying that the driver in this case is blameless. But that’s not the point. The point is that our knee-jerk attribution error paints people as one-dimensional villains and allows little room for the nuances and subtleties that arise when we look at a situation in its complete context. I can say the same about people we canonize as heroes! Just as the driver in this case is probably not a demon incarnate, people who do heroic things may also not be overall nice people. We are all complex, multidimensional creatures, and it would behoove us to remember that when attribution error tempts us to label people with a single dimension.

  • Logical Fallacies: The Bandwagon Fallacy


    When I was attending Humboldt State University in the early to mid-90s, I noticed that I was putting on some weight – the dreaded Freshman 15. To combat this phenomenon, I started getting regular exercise, joined a gym, and started watching my diet. It was right around this time that a new dieting trend burst on the scene: a massive proliferation of low- and non-fat foods, all of which were marketed directly to the consumer’s desire to lose weight while still being able to indulge in treats like cookies, ice cream, and chips. In particular, I remember the Snackwell’s brand of cookies and snack cakes in their trademark green packaging. I remember scanning the nutrition label and seeing that I could eat an entire package of vanilla creme cookies and only ingest 4 grams of fat. Eureka! It never occurred to me to stop and think about the wisdom of this approach. Did it really work? Well, it must – otherwise, why would everyone be buying these products?

    Welcome aboard the bandwagon fallacy. The premise is simple: if an idea is becoming popular, it must be true. The low-fat fad took off because so many people wanted to believe in its simple premise that removing fat from your diet would remove fat from your gut. As the idea gained in popularity, it gained in adherents, which further increased its popularity, in a nice little feedback loop. Bandwagons can form around all sorts of premises, tested and untested, but I find the ones that form around food to be quite fascinating. These fads seem to come and go: the high-protein Atkins diet was first popularized in the 1970s then faded, only to experience a resurgence in the 2000s. Of course, as people came to realize that the diet didn’t have the lasting weight-loss effects it promised, it lost its popularity as people abandoned the bandwagon. Yet, these ideas manage to persist. The same thing happened to the lactose-intolerance fad, and I strongly suspect it will happen to the gluten-free fad.

    When a bandwagon idea holds the potential for becoming a marketing bonanza, it explodes across a universe of products. This reinforces the bandwagon. Currently, gluten-free is the top dietary fad. It’s quite amusing to see products that never had gluten in them in the first place emblazoned with the GLUTEN FREE! label. I’ve seen it on products as ridiculous as soda and fruit snacks (although as an aside, it can also be quite shocking to discover all sorts of strange ingredients in prepackaged foods, so I suppose it’s always possible that a fruit roll-up could have gluten in it). The same thing happened during the fat-free fad. Other current bandwagon labels include organic, free range, cage free, all natural, non-GMO, RBGH-free, and other labels catering to the health-conscious (but sometimes logic-unconscious). Gluten-free still seems to be towing a full bandwagon, but the next wagon is rapidly filling with adherents. This is the anti-sugar bandwagon. I’ve lately been seeing a lot of ink spilled over the toxic hazards of our high-sugar modern diets, and I have absolutely no doubt that the marketing bonanza has already begun.

    Research is revealing that the causes of modern health problems are much more complex and intertwined than the simplistic healthy-food bandwagons would make it appear. I do want to stress that there is real research into some of the bandwagon fads I have mentioned. Sometimes the research supports the fad, sometimes it doesn’t, and often the results are maddeningly inconclusive. The Atkins diet has been thoroughly studied with mixed results, depending on what particular factors were the focus of the research. Lowering the amount of fat in one’s diet also can certainly lead to weight loss, but that by itself is not enough. People with celiac disease truly cannot ingest gluten without becoming severely ill, and some people may be able to handle wheat protein in their diets better than others. Organic foods have the benefit of lowering our exposure to potentially toxic pesticide and herbicide residues; however, some of the other popular adjectives for “healthy” food remain highly problematic because they are misleading. “All Natural” is a loosely regulated term that can be used by almost anybody. “Free Range” and “Cage Free” can mean only that the birds in question are released to a fenced yard for a short time each day or are crowded together in large facilities with no cages – but no natural light or ability to go outside. The research into GMOs and RBGH (recombinant bovine growth hormone) is unsettled and deserves a post of its own. Even the simple “calories in-calories out” approach is turning out to be much more complicated than we thought.

    The point of all this is that these issues are complicated and deserve critical review. The bandwagon fallacy encourages us to jump aboard because it’s easier to go with the crowd than do the hard work of researching an issue on the merits. Do your research and you may just find that the bandwagon is the right place to be – but it’s not because everybody else is there. If you choose to ride on the bandwagon, be sure it’s because you are confident in its origins and its destination – whether it’s about making food choices, social choices, or even pop-culture choices. Better yet, build and drive your own wagon!

  • Technology and Its Discontents: Getting Quizzed


    For the past several months I have noticed a proliferation of quizzes on social media and pop culture websites. There is something about a headline that reads “Find out which Disney princess you are!” that overcomes my inner curmudgeon and makes me want to participate, even though I’m not a fan of Disney and I don’t really care which princess I am. The internet quiz is possessed of an uncanny ability to draw in even the wary, because what could possibly be the harm in finding out which Walking Dead character you are or what your profession should actually be or what mythological beast is your totem animal? It turns out, more harm than I realized.

    Even as I was giving in to the siren call of the quiz I found myself questioning why its allure was so strong. I was answering quizzes dealing with topics I knew nothing about – things like which character I am on a TV show I don’t even watch. I told myself that it was ok because I mostly kept my participation hidden from others. I ran across most quizzes on Facebook on the feeds of friends who had taken them. I almost never posted my results, although I would sometimes leave them as a comment for the person who originally posted the quiz. I was slightly embarrassed by how quickly the quizzes would suck me in, and I figured if I didn’t pass them on then I was at least not contributing to their proliferation. As usual, there was an element of “I’m better than this” to my refusal to share my participation; I didn’t want to admit that I was indulging in such a petty use of my time.

    I thought my slight embarrassment was the worst consequence of taking these silly quizzes until I ran across an article that revealed information I immediately realized I should already have known: the quizzes are a back door way for marketers to track consumer data. OF COURSE THEY ARE. My literal headslap after reading the article paled in comparison to the anger I felt at myself for being so easily duped. If you are not familiar with these quizzes, they ask seemingly innocent questions in an effort to peg you as, for example, a fictional character or famous author or classic movie. Many of the questions have answers that hint at certain results, so if you are just dying to be identified as Allison from The Breakfast Club, you’ll select the picture of the sandwich made with Cap’n Crunch and Pixie Sticks as your lunch of choice. This all just seems so harmless and fun! But in reality, it is telling the purveyor of the quiz very specific details about you. What is your favorite color? Favorite animal? Favorite breakfast cereal? What bands do you like? Where is your dream vacation destination? What decade do you identify with? How do you dress? What do you read, watch, eat, do for fun? All these questions are things I have encountered on these quizzes, and I can’t believe I didn’t realize on my own that they aren’t harmless at all.

    There are plenty of people who think this sort of thing is no big deal, and I suppose to some degree that’s true, but I will not concede that it is without harm. It is manipulation, pure and simple. It’s worse than subliminal because it not only sends a message out to the consumer, it gathers a response that can be used to craft even more manipulative messaging. It absolutely depends on the notion that people don’t realize what they are giving away when they participate. If this were really just about selling us stuff, I’d still be upset by it, but ultimately I think it’s much more than that. Our willing participation in the online world means unwitting participation as harvested data. We all need to bear in mind that on the internet, nothing is truly free. The adage of marketing holds true: if you’re not paying for a product, you are the product. So the next time a quiz pops up, remember: you are giving yourself away for free.

  • Shifting Perspective: The Economics of Privilege


    Just over a year ago, I decided to start indulging my creative side by crafting objects like lamps, clocks, and even furniture out of vintage, found, and second-hand objects. I even turned my little projects into a business of sorts, and started a website to showcase and write about my creations. As I learned how easy it is to make things that I thought would be difficult – like wiring lamp sockets, cutting, sanding, and finishing wood, and drilling through glass and metal – I started experimenting with other things I realized might be easier than they seem. This led to experiments with making food from scratch. As it turns out, ice cream, fruit jams and preserves, soda, and nut butters are easy to make and generally taste better than what you can buy at the store.

    At first, I felt smug about my new-found insights into the relative ease of the DIY lifestyle. It made me wonder how consumers got so easily fooled into believing that paying full price at the store was better than making their own bread and jam and peanut butter and ice cream and soda. But then, when I was sifting flour into my bread machine one evening, I suddenly thought about my Grandma G. Grandma G. made bread for her six kids and her husband every day. She did it by hand, and though by my father’s account Grandma’s cooking wasn’t great, it was serviceable. I pondered the innovation of the bread machine that allowed me to spend five minutes measuring ingredients into a pan so the machine could spend three hours mixing, kneading, and raising the dough, which I would then transfer to the oven. I didn’t have three hours to spend mixing my own dough on a regular basis, which is why I had the bread machine. Bread making from scratch has become a luxury, and as such, it has also become a marker of status. In other words, it is a privilege. That is, if you are making your own bread, that probably means you have the luxury of time and resources – ironically, resources Grandma G., who was raising a large family on Grandpa G.’s meager salary, didn’t have. Her bread making was a necessity, not a luxury. She didn’t have a bread machine and access to hundreds of fancy bread recipes; she just had flour, yeast, salt, water, and her own efforts.

    This line of thinking shouldn’t have startled me, but it did. I had to admit that I am privileged to indulge in DIY cooking of the staples most people buy at the store. I have the resources to buy organic produce, free-range chicken, hormone-free milk from pasture-raised cows, and the myriad tools that make it easy to bake your own bread and make your own nut butters and jam. I own a fancy, high-powered food processor that whirs nuts into butter in just a few minutes. I have giant stock pots that I can use to boil fruit and sugar into jam, and tools for canning it. I have a fancy ice cream attachment for my expensive countertop mixer. I spent hundreds of dollars on bottles, caps, strainers, and funnels, and roots, herbs, and special yeast for making soda. Somehow in all that frenzy of DIY activity, I lost sight of the fact that what people used to have to do has become what most people can’t afford to do.

    How did we come to this state of affairs? Why is it now a privilege to get back to the basics that my Grandma G. practiced in her daily life? These are not rhetorical questions, but as of yet I’m not ready to dig too deep into some of the possible answers. At their core, these are questions related to the stratification that is inherent in the structure of capitalism, but they also have a lot to do with our individual pursuits of a better, faster, easier way to get things done. Our pursuit of ease in the interests of freeing up time to do more things has ironically led to us having less time than we used to. Grandma G. made bread for her family every day because she had to, but she undoubtedly would have loved to buy loaves at the store instead. Now, the daily treadmill of making ends meet, especially for those in the lower economic strata, makes buying loaves at the store the necessity, and having the time to make bread from scratch becomes the privilege.