Tag: anthropology

  • Technology and Its Discontents: Alienation

    As the industrial age took hold in the late 18th and early 19th centuries and began its saturation of the globe, a curious phenomenon took place. People who had once labored for themselves – doing what they needed to do to support themselves, their families, and their communities – began laboring for others. They left their simpler lives and moved to bigger towns, then cities, seeking and finding employment in factories, assembly lines, and sweatshops, laboring to produce things over which they had no ownership. The logical outcome of a capitalist world system began to spread and solidify, requiring that people work for others to support themselves while owning none of the fruits of their labor. Yes, these laborers were paid for their work, but unlike the people who engaged in farming, hunting, and small trades such as blacksmithing, horseshoeing, wheelwrighting, candle and soap making, and carpentry – all the simple but vital labors for which people could once get paid – the only thing this new class of laborer owned was themselves. All they could sell was their labor.

    This is the microcosm of what is called industrial alienation. It’s what happens when all people can sell is themselves, and they have no ownership of the means of production. They become a commodity, no different from the raw materials used in manufacturing the things they are paid to make. In the modern world system of capitalism, most people can only sell themselves for the money they need to support themselves and survive. To an enormous extent in the Western industrialized world, this has meant that nearly everybody has forgotten how to survive in the way our ancestors did – by actually knowing how to find and produce food and shelter. Labor has become so extraordinarily specialized in this brave new world that most people no longer have any connection with the basics of survival. Even worse, we have become alienated not just from what we do, but from our very purpose for living. Why are we here? What is the point? Do I even matter? These are not questions asked in cultures where people are still able to support themselves with the knowledge of actual, physical, animal survival. That, itself, is the point: survival. In the face of securing it for yourself and your group, there is no need, no room, for existential questions. Those questions are created by alienation.

    This is a winding road to some thoughts about technology. Humans have always sought to answer the basic question of why we are here, probably since the dawn of the species, and have found a variety of answers (often in the supernatural and religion). Now, though, I think technology is filling the hole of our alienation. Specifically, we are filling our existential emptiness with social media. Posting, Tweeting, sharing, Instagramming – they all provide a sense that we matter. They give us a way to be acknowledged (or so we think) by others. They remind us: I exist. The urge, the compulsion, is so strong that we will risk our relationships, our jobs, our educations, our safety or even our lives to fulfill it by doing all those things while driving, or walking, or cycling, or eating, or watching TV, or at the movies, at work, at school, at a football game, at a wedding, a funeral, anytime, everywhere. Yesterday I could have mowed down a woman glued to her phone, scrolling endlessly, as she walked obliviously down the center aisle of a parking lot. I have sat with friends while they pretend to be engaging with me, but they are staring, staring, staring at the phone. I have been accosted with pictures, videos, websites, texts that the other person insisted I see. And, I have done most of those things myself. I understand.

    Humans are extraordinarily social primates. It is no surprise to me at all that social media has exploded into a frenzy of self-referential attention seeking. Humans are also status-seeking animals, and the feedback we crave from our social sharing is highly addictive. It is a constant lure for us to try to elevate or affirm our status amongst our peers. But as with anything, there can be too much. Just as the buzz of alcohol can make us feel attractive, funny, and smart, so can the buzz of our relentless technological distractions make us feel noticed, important, and liked (if not loved). But the alcohol buzz wears off, and so does the brief high we receive from seeing who has responded to our online presence. Alcohol can become an addiction, and so can technology. It is not a good way to fill the hole left by our alienation.

    I am not immune to the lure of technology, but I am thinking deeply about it and making some decisions about how much I am willing to let it intrude upon my life. I understand that there are also positive aspects to our use of phones, computers, and the rest (for example, the fact that I can write and share these thoughts). But at the moment I am deeply uneasy, and I am making a conscious effort to concentrate on the world outside the screens.

  • Logical Fallacies: Ad Hominem

    It’s been a while since my last post, primarily because much of my attention has been focused on my other endeavor over at the Rock and Shell Club. But that doesn’t mean I haven’t been nurturing several rants, large and small. I am occupied by the usual topics of critical thinking, the over-saturation of social media in our daily lives, and the peaks and valleys of capitalist consumerism. One goal I’ve been considering for quite some time is to develop a curriculum or course description for an anthropology class that centers around the topic of critical thinking. More specifically, I’d like to teach a class that discusses how and why people think about things the way they do. Anthropology is ideally suited to such a topic, because anthropological analysis requires cultivating the ability to see things from other points of view.

    Michael Shermer wrote a book that I would require for my class: Why People Believe Weird Things. This book clarified and helped me conceptualize many of the things I was already thinking about the workings of the human brain. I believe that having a good understanding of how people think is absolutely crucial to living a fully aware life. Even more important, understanding how thinking works, and how it can trip us up, helps us be more careful about our own thinking process. It’s one thing to criticize and evaluate others’ ideas; it’s something else entirely to be able to turn that process back onto your own analyses. I think the world would be a better place if more people did this. To that end, I’d like to discuss, in a series of posts, some of the basic logical fallacies. These are the things that seem to be the most common in people’s thought processes, and the things I think everybody should know and look for in their own thinking. They are also the things I’d start with in my eventual class on critical thinking.

    A fallacy in thinking basically means coming to an irrational conclusion that is not supported by the facts. Instead of arguing from a factual or rational basis, logical fallacies tend to revolve around arguments that stem from emotion, appeals to the spiritual or supernatural, personal attacks, or errors of cause and effect. They fall into many categories and some are more common than others. One of the most common, and simplest to explain, is the ad hominem argument. An ad hominem argument is generally understood as one in which you attack the arguer rather than the argument. This is commonly perceived as a personal attack, where you berate, insult, or criticize the person with whom you are arguing. However, it is important to note that an ad hominem argument does not have to be an attack per se; it is simply an approach by which you say something about the arguer rather than the argument. So, to take a simple example, you could say “You only support gay marriage because your brother is gay; therefore, gay marriage shouldn’t be legalized.” In this argument you aren’t saying anything negative about your interlocutor; but neither are you saying anything factual, rational, or logical in support of the position that gay marriage should or shouldn’t be legal.

    The ad hominem argument does absolutely nothing to advance your case. If it is the kind of ad hominem that actually stoops to the level of a personal attack, then I feel it may actually impede your case – not in a rational sense, because the ad hominem argument does not in any way negate the logic (or lack thereof) of your position – but because it degrades and impedes constructive discourse. No matter how much you may personally disagree with someone’s position, no matter how much personal animosity you may feel towards them, no matter how egregious or offensive or bigoted or immoral you may find their position to be, none of those feelings have any bearing on the logic of either your or your opponent’s position.

    I started with ad hominem not only because it’s the simplest, but because it is extremely common and, I believe, extremely damaging. If you use it as a personal attack, you bring yourself down. If you use it to question or draw attention to the arguer rather than the argument, it does nothing to help prove your case. Believe me, I know what it feels like to get angry during a debate, and I know what it feels like to want to call your opponent names or question their character. Resist the urge. If your position truly has merit, that in itself should give you the ammunition you need in your fight.

    Speaking of which, once you’ve retired the ad hominem argument from your arsenal of false weapons, you are on your way to making more room in your quiver for logical arrows. In the next post, I’ll address some of the more complicated, but still common, logical fallacies that remain.

  • Sheep and Goats

    When I am teaching, my goal is to pass on the basic principles and tenets of anthropology to my students. After all, they are taking an anthropology class with me, and I am obligated to teach them the fundamentals of the subject as summarized in the course description, whether it is Cultural Anthropology or Human Origins. But what I am really doing is using anthropology to teach them something much more useful and important: how to think. I don’t flatter myself that I am the best person in the world to teach them this, or that I am the first or only person who will expose them to the strategies of critical thinking. It is, however, a charge that I take extremely seriously, because I am deeply concerned about what seems to be a basic lack of critical thinking skills in the world at large. Because I am teaching college students I can at least reasonably expect that these young thinkers are only at the beginning of a process of becoming skilled at interpreting the world around them. I am also not arrogantly assuming that college educated people are the only ones who are good critical thinkers; nor do I subscribe to a corollary thought that being formally educated automatically means a person is a good thinker. I have encountered many a person with a college education who is nonetheless not skilled at thoughtful analysis; and I have met many people whose life experiences have honed their thinking skills far more sharply than a formal education has. I guess my point is that you find a broad spectrum of thinking ability in society at large, and it doesn’t necessarily correlate with education.

    Back to the point of what I do in the classroom. I find that anthropology is an excellent vehicle for helping students discover and practice new ways of thinking about the world. It teaches you to look at situations from multiple perspectives. As I ponder ideas and information, I often visualize the issue at hand as an object sitting in the center of a room, and I imagine myself walking around and around that object, looking at it top, bottom, and middle, prodding it, testing it, moving it to see how it looks in different positions. I imagine other people entering the room and describing the object to me from their perspective. Sometimes those other people see things I didn’t see, and open my eyes to original or alternative points of view. Sometimes, I still can’t see what they see, but I welcome their description of the object nonetheless. In anthropology, being open to other points of view is absolutely critical. We all bring preconceived notions with us to the field, but we are trained to shed those ideas as best we can and let the experience itself tell us what we need to know. The most magical moments can sometimes occur when our experience in the field makes us suddenly recognize things we had taken so deeply for granted that we weren’t even aware of our own perceptions (this can also be frighteningly disconcerting). Those moments can make me almost giddy with excitement. What makes me even giddier is introducing those moments to my students, and seeing the recognition on their faces of new ideas that, once introduced, bring on the “a-ha” moment of understanding.

    I have to remind myself that I am a professional in the study of human culture and behavior. It’s easy to forget that I, too, had to be taught how to think this way. I think this is why I often feel such deep frustration at the fact that so many people seem unable or unwilling to look at issues from multiple perspectives. I am more than happy to accept that, once someone has explored an issue from several angles, they can come to a rational, logical conclusion about what they see. I am also happy to accept that I can come to an equally rational, logical conclusion about the same issue that is nonetheless very different from another person’s. What I have a hard time accepting is people refusing to consider any view other than the one they originally brought to the issue, in spite of repeated opportunities to see things from another perspective.

    Over and over, I have heard people refer to those who blindly follow along with a single point of view as sheep. A sheep follows the sheep in front of it, and the lead sheep simply follows the shepherd. Those who rail against the sheep usually have a problem with the perceived leadership of the shepherd. What I find confounding is the failure of many to recognize that they are following a leader of their own. Those who label others as sheep may very well be members of a herd of goats, blindly following the leadership of the goatherd. Humans, in many ways, do have a herd mentality. Whether you are a sheep or a goat is immaterial if you are still blindly following the leader. Maybe the sheep and goats should spend some time talking to each other and learning about each other’s herds. Perhaps the sheep should follow the goatherd for a bit, and see what it’s like to walk in a goat’s hooves. The goats should do the same with the shepherd. In fact, all of us would do well to consider each other’s perspectives. Take the anthropological view. Strive to recognize your biases. Reach for those “a-ha” moments. Learn to really talk about what you believe and why you believe it, and learn to really listen to what others believe and why they believe it. Don’t fall for the easy way out by going for the ad hominem (or would it be ad ovinem?) sheep label. That’s too simple, and too dismissive, and not worthy of those who truly wish to have others take their point of view seriously.

  • Economic Maladaptation

    Near the end of the semester in my Human Origins course, I teach about two concepts: the epidemiologic transition, and the demographic transition. Both of them have to do with overall improvements in quality and length of life in societies that have reached a certain level of knowledge and wealth. In the epidemiologic transition, knowledge and innovations regarding health and medicine combine to reduce the incidence of infectious disease, and generally increase the overall health and longevity of the population. Mortality from non-infectious diseases such as heart disease and cancer increases as life expectancy increases. So, instead of dying young of an infectious disease, you live longer and ultimately perish from a disease or condition linked to old age and/or the consequences of a Westernized lifestyle (such as poor diet and lack of exercise). Combine this with the demographic transition, which sees life expectancy increase, and a drop in death rates followed by a drop in birth rates as societies industrialize and modernize, and you have a perfect recipe for booming population growth. Not every society in the world has gone through both of these transitions, but enough have that what should have been viewed as a benefit is now becoming a detriment. Put in evolutionary terms, what was once adaptive is becoming maladaptive. I hypothesize that it is not these two transitions themselves that are to blame, but yet another transition, which I am going to call the economic transition.

    So what is the economic transition? As the world has modernized, beginning with the age of European exploration and colonization in the 16th and 17th centuries, the global economic system that we know today as capitalism has taken hold. Capitalism, and the quest for profit through the exchange of material goods, is in many ways a beneficial and adaptive system for human groups. However, link it to the natural human desire to achieve status, and then link status to the ownership of material things and the symbols of exchange that make that ownership possible (i.e., money), and add in the longer-lived and massively expanded human population we are dealing with today, and you have a recipe for maladaptive disaster. Capitalism, superficially, is extremely similar to biological evolution and natural selection – call it economic selection. Left to its own devices, the natural consequence of capitalism is to concentrate wealth in the hands of a very few people or groups. This can work in some circumstances, especially when the social group affected is reasonably sized. Competition can lead to greater resource acquisition for the overall group, which is then redistributed by the leaders who had the greatest hand in acquiring it. This is what happens in the Big Man system that used to characterize many native economies in places like Papua New Guinea. The Big Man worked hard and gained followers who worked on his behalf to grow the biggest garden and the largest herd of pigs, and as harvest or slaughter time came, he rewarded his followers for their hard work. Those who worked the most gained the most, but nobody went without basic necessities. Why? Because in small groups, the well-being of the group depends on the well-being of the individuals who comprise it.
The Big Man, for his part, was well compensated for his leadership efforts, but he did not end up with portions that were much larger than those of his workers; his gains instead had to do with status and leadership power (which from an evolutionary standpoint tends to correlate with greater reproductive success – to me, this is the underlying impetus for the development of these sorts of systems).

    The Big Man system is a sort of proto-capitalism. Anybody could aspire to be a Big Man, and with enough hard work and charisma, individuals could work their way into the top status tiers of these groups. The key difference is that the Big Man did not keep the majority of the wealth for himself, and he did not attempt what true capitalism attempts today: gather the most wealth possible while paying as little as possible to acquire it (whether for raw materials, workers, overhead, or what have you). The capitalist world system is designed to concentrate wealth. It is theoretically true that anybody can compete in this system, but with 7 BILLION competitors, success is anything but assured, and the structural obstacles to reaching that success are more numerous and complex than I can possibly attempt to explain in one post.

    I still haven’t really explained the economic transition yet, because at least a basic knowledge of economic systems is important. Nevertheless, I can describe it simply as a transition from small-group-based competitive yet redistributive systems to a system based on personal financial gain that thrives on the perpetuation of class inequality. In a survival-of-the-fittest economic system, inequality is the only possible outcome. What’s even more insidious about this transition is that even those at the bottom of the class and income scale believe that this is the way it is supposed to be, and that the only way out is through acquisitiveness and consumption. I have already written on this at some length in posts discussing hegemony. This is hegemony in a nutshell. The economic transition is maladaptive because it relies on continued resource consumption, and it is linked to the large and long-lived global population that consumes those resources. The economic transition, if it continues to its logical conclusion, leads to the ultimate biological maladaptation for the human species, to wit: extinction.

    I actually didn’t mean for this post to be a treatise on my view of our world’s economic problems, but these things just come out as they come out. This is the starting point for many more specific posts to come. What started my ruminating on this particular topic (other than the fact that I ruminate about it just about every single day) was thinking about our obsession with material things, and wondering how in the world we can save ourselves from ourselves. How can we modify the system so that status comes from the person you are rather than the things you own? How can we actually slow down the economic engine, and adopt a philosophy of economic balance, instead of constant growth? When will we realize that the values of our lives come from experiences, rather than possessions?

    Let me end this post with a question for my readers: what are your fondest memories? What makes you smile when you need a boost? Do these memories revolve around things, or people and experiences? One of my favorite memories, one I call on when I want to feel happy, is from 2001 when I surprised my mom by coming home from Albuquerque for Christmas one week early. I called her from outside her front door. As we spoke, I made it sound like I was still in Albuquerque. I knocked on her door and laughed to myself as she said “Hold on sweetheart, there’s someone at the door.” She opened the door and saw me, and I will never, ever forget the look on her face or how she dropped the phone and grabbed me into a hug of joy. This is a memory that I could never buy, yet it makes me happier than any material thing I have ever owned. I think that if we consciously remember what truly has given us joy in our lives, it may lead us out of the materialism = happiness lie that so many forces are leading us to believe.

  • Communication Pandemic

    As an anthropologist, I have been trained to ask, and attempt to answer, questions about human behavior. As a cultural anthropologist, my methods involve participant observation to gather the data needed to start formulating an analysis. This all sounds very dry, but in layman’s terms all it really boils down to is that I am a professional people-watcher. I became an anthropologist because I thought it would be bitchin’ to make a career out of finding answers to the questions that fascinate me. Even better, I get to teach others how to answer their own questions about why people do what they do, and open their eyes to new ways of trying to answer those questions.

    So, this week the musings of the skeptical anthropologist are focused on communicative strategies and how they are changing. Or, to put it in a more entertaining way, what the hell are people thinking sometimes when they open their mouths? Or, even more so these days, when they e-mail/post/Twitter/link/YouTube/Flickr? What are we really trying to accomplish with all these new ways of communicating?

    I find that there is a fascinating blend of potential motivations in what people do with the new instant-communication technologies at our fingertips. The cell phone camera/video/e-mail/upload capabilities have given us the ability to reveal ourselves and our lives in some shockingly exhibitionist ways… and some coma-inducingly boring ways, too. And, the variety of emotions we can express! I know I’m not saying anything new here, but the anonymity the internet can afford (but that actually seems less and less anonymous each day) allows people to be astonishingly raw.

    So what’s my hypothesis? I haven’t settled on anything concrete yet, but I am attempting to formulate some ideas based on status-seeking behavior, and ultimately (as always with me) evolutionarily adaptive behaviors. To wit: on a venue such as Facebook, there seems to be a great deal of “look at how clever/cool/witty/educated I am” sorts of posts. There are also many “look at how virtuous/healthy/creative/thoughtful I am” posts. These posts, I need not remind you, are not anonymous. Who wouldn’t want to take credit for “Just made a delicious risotto with truffle oil and chanterelle mushrooms”? Who, exactly, are we trying to impress? Facebook is also a simple way to keep friends up to date on your life, and most people seem to use it that way, but for how many is this an opportunity, however unconscious, to simply brag? And, how much of what we post is actually reflective of the life we are trying to project?

    On the opposite side of that coin are anonymous posters, such as those who lurk at the San Diego Union-Tribune website. The amount of vitriol that drips from some of these posts is simply boggling. And along with the vitriol is just plain ol’ racism, sexism, and one of my favorites, stupidity. But the point is that, for the most part, people won’t flame each other on Facebook, but they will easily descend to the level of playground bullies on anonymous comment boards. No surprise there, but I can’t help wondering how many of these folks are the same as the ones posting about the truffle risotto.

    Yet another area of fascination is the YouTube phenomenon. This came disgustingly home for me when my little cousin linked to a graphic video of… wait for it… a guy letting his girlfriend LANCE A BOIL on his back. There is no attempt to disguise the face of said guy, and he knows he is being filmed, and I assume he approved of having the video posted online. And yes, I watched it, in all its blood- and pus-spurting glory (which brings up another topic of fascination for me, which is our primate urge to groom each other… but I’ll save that for another post!).

    What does all this new technology, these novel (but not for long) communicative strategies, mean for us and our behavior? How do we know what to trust? How do we know what to say? What will be the limits, if any, on what we will reveal? How long will it be before all this availability, visibility, and downright exhibitionism festers into its own cultural boil and pops? Or will we simply adapt, as we have for so long?

    An even better question, for me personally: what makes me think anybody is interested in what I have to say in this blog (besides my mom – awww, thanks Ma!)? Hmmm… the musings continue.

  • Finals week

    This week is finals week at school. I have the exam ready for Tuesday’s class, and only need to make a few tweaks to Thursday’s exam before it, too, is ready to go. I have mixed feelings at the end of the semester because I am sad to see another crop of students go on their way, but I am also glad to get a break from having to stand in front of 40 people and educate and entertain (edutain?) them for three hours twice a week for sixteen weeks.

    I have now been teaching at CSUSM for three years, and I still love just about every minute of it. Now, don’t take this the wrong way, because I certainly appreciate the extra money, but I would do this for free. Part of the reason I love teaching so much is purely selfish: I like being at the front of the classroom, I like the feeling of power – and responsibility – that comes from introducing the students to new ideas, I enjoy the attention and the accolades. But the real reason I love teaching is the look I sometimes see on students’ faces when they contemplate a new idea for the first time: the look of dawning comprehension, or joyous understanding, or best of all, the emphatic nods of agreement when a student realizes that something they had already thought of, on their own, is being confirmed by what they are learning in class. Sometimes students have come to me to tell me that my class is the first time they have ever heard a teacher give voice to some of their own ideas about the world, and the people in it. This is one of the greatest things about anthropology: it is an ideal vehicle for introducing people to new ways of thinking and understanding – not just about so-called exotic cultures, but about the students’ own culture(s).

    I am fully aware of how powerful my position is in front of the classroom. My classes are generally heavily weighted towards freshman students, and I try to tread delicately as I introduce them to concepts and ways of thinking to which they may not yet have been exposed. I know that this is probably the only anthropology class they will ever take (although I have had repeat students, since I teach two different classes). I just leave each semester hoping that I have taught these students something worthwhile – something beyond just random factoids about other ways of life. I hope they remember to be skeptical, think critically, withhold judgment, be objective, and have a healthy respect for that which is different without succumbing to moral relativism, all while making rational decisions about how to navigate our rapidly changing world in an ethical way. Heh – I don’t want much from them, do I? And I certainly am not arrogant enough to think I can teach them all those things… but I do what I can, and I am grateful for the opportunity.