Singular They is Old, Singular They is New

(The Linguistic One stretches, blows the dust off this blog that has been sitting idle while we all deal with Life Things, and dives into writing.)

In the last few years, English pronouns have become a hot topic of discussion and controversy, mainly because they constitute the central linguistic battleground on which English-speakers play out debates about the nature of sex and gender. I thought I had written about this more than once on here before, but it turns out I touched on it only once (here, in relation to the claims of authority expressed by Canadian Academia’s own He-Who-Shall-Not-Be-Named). The main focus of The Great Pronoun Controversy is what is conventionally called “singular they”, although novel pronouns (like ze, xie, or others) also come up sometimes. There is now an abundance of good writing available, both in accessible blog posts or news stories and in books and academic articles — to highlight just a few, check out the work of sociocultural linguist Lal Zimman (here for academic stuff, here for some blogging), as well as the fantastic work of my new colleague Lee Airton on the blog “They is my Pronoun” or in their book “Gender: Your Guide”, discussed here.

We are hitting another round of public discussion of “singular they” right now, as the Merriam-Webster dictionary has declared it to be the “word of the year” (for the announcement, see here). This comes on the heels of the fairly significant announcement a few months ago that the American Psychological Association style guide (heavily used across several academic disciplines) will include, in its 7th edition, the instruction to use “singular they” in cases where a) the gender of the individual being discussed is unknown or not specified, or b) the gender of the individual being discussed is known to be neither male nor female. The APA decision in particular makes an important move toward changing the material manifestation of gender representation in print, since style guides constitute, in some contexts, formal rule books — in academic or journalistic writing, you may be able to argue for some wiggle room, but the default will be for copy-editors and other reviewers to “correct” your word choices.

As noted above, there are any number of experts on the use of Singular They who could tell you more about the pragmatics, psycholinguistic acquisition, and politics of this pronoun (in addition to the scholars above, see, for example, information about this conference on the topic from last summer, or the Twitter accounts of scholars @kirbyconrod, @VerbingNouns, and @lexicondk). What I want to add here is really about the discourse around singular they, in particular, around the “Word of the Year” declaration. In addition to folks who have really rigid ideas about how both gender and language work or should work, I see some mild pushback on these types of announcements from people who totally support the use of singular they, but who dismiss the idea that anything new is happening here. People who point out this oldness and commonness are profoundly well-intentioned and supportive of the rights of gender non-conforming people (they may even be trans or non-binary themselves), but I think they are missing something that does matter about this pronoun, and in doing so, are appealing to a view of language that is worth pushing against.

In a certain sense, it is accurate to say that “singular they” is old – there are attested uses of it, referring to unknown individuals or hypothetical people (e.g. “If someone comes to my door to sell me cookies, I will give them all my money”), going back hundreds of years. There are even fun examples of people arguing against it literally while they are using it (for example, “If a student submits a paper using ‘they’ as a singular pronoun, they are going to lose grades for grammatical incorrectness”), and it is hard to resist the schadenfreude involved in pointing out this apparent hypocrisy. But Merriam-Webster — and the American Dialect Society, which declared ‘they’ its Word of the Year in 2015 — are not hopelessly out of touch in recognizing this pronoun as a significant word that highlights an important social change. The use of singular they to apply to named/known, non-binary individuals is definitely new, and its rise is directly connected to an increasingly prominent understanding of gender/sex in non-binary terms. This is a point I want to emphasize for a couple of reasons – first, rooting the claim to its “correctness” in the argument that “singular they is old” opens the door to those who can object to your point by noting (accurately) that this way of using it is new. If your point is that we should be okay with the grammaticality of singular they because we have been, in a certain form, for centuries, that is one type of linguistic battle you may choose to fight; if your point is that we can and should affirm non-binary gender identities through the recognition of “they” as a personal and specific pronoun, relying on an appeal to its longstanding grammatical presence is weak. Don’t get me wrong – I am not debating or contesting the grammaticality of non-binary, specific, singular they. I’m just saying that a) it is actually a new thing, and b) that’s actually great, because it shows that our language can change to accommodate our new social understandings of fundamental things like gender. Grammatical correctness does not accumulate with age.

The second reason I want to emphasize it is that I think sometimes this “singular they is old” and “everyone uses singular they” point is somewhat dismissive of the challenge of learning to apply this pronoun. A lot of the really great scholarly work around singular they right now is looking at people’s ability to acquire the pronoun and to learn to use it appropriately. Airton’s entire blog (linked above) proceeds from the recognition that this is a thing that has to be learned for most people, and that addressing that process and effort compassionately and supportively is an important part of bringing about the necessary social change. Perhaps unsurprisingly, this change is easiest to make for people who are themselves trans, non-binary, or genderqueer, who have been thinking about gender in complex and life-altering ways for essentially their entire lives, or for people who are situated in communities in which they encounter a lot of gender non-conforming individuals, and who therefore get a lot of opportunities to use these pronouns. I do think people outside these groups – in other words, cis, straight people who don’t necessarily engage much with queer communities – need to put in the time to learn how to do this right. It matters. Using the wrong pronouns for people hurts them (see for example this discussion of the related practice of “deadnaming” trans people), and denies their gender identities. Language is a central battleground in this particular story because it is through language that we express our acceptance or denial of the reality of who a person is. These expressions are about real changes to how we, as a society, talk about gender, and that means it’s worth taking the time to learn even (or perhaps especially) if it’s hard and confusing for you. It’s one thing to criticize pedantic dinosaurs for refusing to even entertain the grammaticality of singular they in any form, but quite another, I think, to suggest that there’s nothing to see here.

The grammaticality of “singular they” doesn’t depend on its presence in dictionaries or style guides or on appeals to its age in the English language, but in this case, the dictionary is right to highlight it – trans and non-binary people are becoming much more visible, and we as a broader society are learning new ways to talk about gender as a result. This pronoun is a radical thing, and it has come to mainstream public attention and use really quite quickly. Recognizing its newness is not to dismiss it – instead, it is to highlight its importance and to push forward with making it more present.


Why I teach about Female Genital Cutting (FGC) in First-Year Anthropology

Content Warning: The following post discusses the importance of acknowledging one’s own bias and avoiding judgment of cultural practices. It also explores the importance of concepts such as cultural relativism and critical cultural relativism when discussing taboo topics, like FGC, in Canadian post-secondary classrooms. This post does not attempt to take a position on whether FGC or male circumcision is right or wrong, or to provide a comparison between the two practices. Its goal is to discuss how FGC is covered in Canadian and US mainstream media and why this discussion is an informative case study that I use to demonstrate and discuss foundational concepts in my first-year cultural anthropology course. Reader beware.

Continue reading “Why I teach about Female Genital Cutting (FGC) in First-Year Anthropology”

Language, Accommodation, and the View from Whiteness

I want to take a minute to make a quick point about the underlying implications of several stories that have circulated in the media over the last couple of weeks. These stories have all been pretty thoroughly reported on and critiqued, so apologies if you are already sick of them, but I think it’s worth gathering them in one place for comment.

The stories:

  1. Senior NBC journalist Tom Brokaw comments (and later apologizes for saying) that “the Hispanics should work harder at assimilation”, specifically noting that they should make sure “all their kids are learning to speak English”. (Ed: Wow, he really said ‘the Hispanics’, even, didn’t he? That’s…special).
  2. A professor (and now former program administrator) at Duke University wrote an email to students in a medical program blatantly stating that choosing to speak Chinese to their friends would be held against them when it came time to consider internship and employment candidates.
  3. A study of court reporters in Philadelphia found that their transcripts of testimony in African American varieties of English contained significant inaccuracies, to the point that 11% of the transcribed sentences were “gibberish”.

The first two stories contain an obvious similarity: racialized people are perceived as insufficiently willing to “assimilate” or become full members of the community, primarily as a result of their use of languages other than English. As several responses have pointed out, the claims made by Brokaw and by Duke professor Megan Neely are built on fictions – the language skills of Latinx people in the US are just fine, the assumption that English is the only language of US culture is an act of extraordinary erasure, and international students at Duke, for their part, are required to demonstrate English proficiency before even being accepted into the program. But facts like these don’t matter in shaping these perceptions — as scholars like Nelson Flores and Jonathan Rosa have observed, language and race are mutually constructed pieces of social life, such that, for example, Latinx bilinguals are interpreted as linguistically deficient, while White bilinguals are interpreted as exceptional and intelligent (my go-to example in class is this gushing headline about Princess Charlotte, which makes me sigh so hard). These raciolinguistic ideologies also come into play as people’s interpretation of whether someone has an accent or not is heavily influenced by that person’s appearance (or other non-linguistic information, like the person’s last name)*. All of that is to say: in these two stories, discrimination based on (perceived) linguistic ability is being used to stand in for discrimination based on racial identity, since the latter is considered vile (at least for people like Brokaw, about whom any number of “but he’s definitely not racist!” defenses were marshalled), but the former is apparently justifiable.

The third story, about the Philadelphia court reporters, is a bit different. In this case, what we see is how court reporters demonstrate something that should actually be an obvious disqualification for the position they hold: the inability to understand and accurately represent varieties of English spoken by a significant proportion of the people whose speech they are paid to document. This is an especially high-stakes context that demonstrates a fundamental lack of care about these speakers, and can’t be detached from the well-documented inequalities in court outcomes for African Americans (both as defendants, and as victims/witnesses**). Grammatical patterns of African American English(es) are well known (a few are even included in the linked article) and could easily be taught and learned as part of training for a position in which accurate rendering of speakers’ words is vital. But…they’re not, and people hired into these positions are allowed to continue, despite their clear linguistic limitations.

These three examples illustrate the same story from two different sides: the language of non-white people is different and difficult, and needs to be improved. Non-white people are responsible for working diligently to demonstrate, linguistically and otherwise, their membership in the group. White people do not have to bother learning how to understand or use the language of non-Whites – not even when it’s central to their job. Bilingualism and the ability to code-switch appropriately and effectively becomes a survival requirement for some people, and a complete non-issue for others. This functions not only to reinforce racist interpretations of different people’s linguistic abilities, but also to impose a cognitive burden onto those who are required to do the extra work of learning multiple codes and the social expectations about switching between them, and of constantly monitoring how their speech is being (mis)interpreted.

Whenever discussions of racism and racist comments emerge, a lot of focus goes on to whether or not the individual person who made the comments “meant” to be racist, or whether that person “is a racist”. The thing is, ultimately, Brokaw, Neely, and those individual Philadelphia court reporters are not what these stories are about. These stories are about the constant reminders that non-White people get about their “limitations”, about the work they need to do to be accepted in “mainstream” (read: White) society, about the shifting goalposts through which racial discrimination is enacted in practice, and about how the view from the perspective of Whiteness is continually rendered as the only “normal” way of being. Language is an extremely powerful force in the manifestation of racism, and these examples are pieces that make that force work.

*This research is complex, and there are a lot of theories around how and why it comes into play – a relatively recent discussion of it, for example, can be found here: https://link.springer.com/article/10.3758/s13414-017-1329-2 .

**A particularly prominent example of the impact of African American language in court can be seen in the discussion, by John Rickford and Sharese King, of testimony given in the trial of George Zimmerman.


Tightening the belt – The Anthropology of Consumerism

Did you spend too much over the holidays trying to spoil your dearest and nearest friends and family? Did you decide to travel to see loved ones? Eat out more than usual? Grab a drink with an old friend or new somebody?

Spending on travel, eating out, and gifts during the holiday season is increasingly putting Canadians into debt. According to a national cross-generational survey of 1,000 participants in early October 2018, Canadians planned to “spend an average of $1,563 (for the 2018 Christmas season), up 3.7 per cent from $1,507 in 2017” (CBC, October 3, 2018).
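(As a quick sanity check – my own arithmetic, not part of the CBC report – the quoted percentage is consistent with the two dollar figures:

```latex
\frac{1563 - 1507}{1507} = \frac{56}{1507} \approx 0.037 = 3.7\%
```

so the year-over-year increase is reported accurately.)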

Christmas Shopping in Hamburg – CC0

In the latest volume of the Annual Review of Anthropology (2018, Vol. 47), Anne Meneley defines consumerism as “a matter of concern or crisis in the contemporary neoliberal, globalized world (which can be) characterized as capitalism unbound” (emphasis my own). She describes five topics of contemporary consumerism: (a) excess, (b) waste, (c) connectivity, (d) fair-ish trade, and (e) the semiotics of self-fashioning, some of which have a particular resonance after this most recent holiday season. Her article provides some interesting insights into consumerism – especially over the holidays.

In relation to excessive spending (surely evident during Christmas), Meneley notes that consumerism is increasingly framed as a problem, and one that is often related to the under- or mis-education of the lower classes. Meneley also identifies how excessive consumerism has become medicalized as a new obsessive-compulsive disorder (hoarding), in which fetishized objects are thought to contain residues of the owner and therefore cannot be thrown away. In addition, she describes the new attention paid to the storage and organization of things, which, if disorganized, may now require professional intervention (e.g. professional organizers – check out Netflix’s Tidying Up with Marie Kondo) to realign the relationship between human being and thing.

Perhaps you’re feeling exhausted now that the holidays are over? This might be because you’ve spent more time than other members of your household preparing for them.

Using ethnographic research, Meneley describes the shopping experience as an(other) example of unpaid labour for many women. She identifies the “considerable amounts of time (spent on the shopping experience), especially when the shoppers are employed, caregivers, or on restricted budgets that require bargain shopping” (2018). Examples include how women are required to spend time purchasing meaningful gifts to fulfill their kin-keeper obligations, or to plan, purchase materials for, and serve home-cooked meals throughout the holidays that follow recent cooking trends or health guidelines. Meneley goes on to note that if the shopper can be thrifty (with time – for example, through online shopping – or money spent), this may add further significance to their purchases, but it may also take additional time.

Meneley concludes her article with a list of ways in which consumerism is encroaching into the academic world: paying for access to journals or subscription services, measuring citation indices and impact factors, and the continued trend toward relying on under-paid and under-supported adjunct faculty to staff universities. She calls for greater attention to this encroaching ‘problem of consumerism’ in academic practices, a call that already feels old and tired.

At the outset of the article, Meneley defines consumerism as “an unremarkable part of quotidian existence, as a patriotic duty at various moments, as an indicator of social class, and as a means of semiotic self-fashioning” (2018, 117); yet, in my reading, Meneley’s work also includes ‘thoughtful consumption’ as a practice, one which implicitly requires the passage of time. Although she does not address the topic of ‘time’ overtly, Meneley describes time as precarious, fleeting, and expensive (i.e. time spent finding the cheapest, most meaningful, most nutritious goods). Throughout the article, then, time becomes remarkably interconnected with the act of consumerism and is likewise involved in everyday acts of consumption as both an indicator of social class and personal branding (what she calls ‘semiotic self-fashioning’).

Perhaps for this new year, then, when we’re told to tighten our belts (to spend less), those of us who gave a lot (whether that be time or gifts, etc.) could pay more attention to our use of time and/or put effort into thoughtful consumption as a way of clawing back some of our own resources (such as time, space, and energy). This approach could provide further evidence of ‘connectivity’ in consumerism, which Meneley describes as the efforts of consumers to connect to the producers of goods (and the ways certain products make this impossible), as seen in ‘follow-the-thing ethnographies’ (and her discussion of ‘fair-ish trade’ products and cultures of circulation), or as it relates to the growing importance of ethical consumerism, which focuses less on the ‘life of things’ and instead explores participants’ experiences of a ‘life with things’.

To read more about Consumerism in the Annual Review of Anthropology by Anne Meneley follow this link: https://doi.org/10.1146/annurev-anthro-102116-041518

Reflections on the AAA, Part 1: On Speculative Anthropology

Editor’s note: This year, two of our people (the Linguistic One and the Cultural One) went to the American Anthropology Association’s Annual Meeting in San Jose, California. This is an optimistically titled “Part 1” of their reflections on the conference, since they are undoubtedly returning to their “real” lives as we speak and staring at the pile of grading that did not magically diminish while they were otherwise occupied learning new things, sharing their own work, and meeting with old and new colleagues and collaborators.

A few days before I left for San Jose, members of the AAA got what was honestly a very surprising email announcing a special guest lecture that about five hundred people would be able to attend. Tickets were free, but were scooped up within a few hours of this announcement, because while we are used to getting excited about academic rock stars, it is pretty rare for someone who is truly famous in the rest of the world to connect to such an event.

The guest was George Lucas, and the near-universal reaction to this announcement was…wait, what? What does Lucas have to do with anthropology? (An alternative reaction was recounted to me by a colleague, who had a graduate student ask her about the famous anthropologist George Lucas and what he worked on, because the only George Lucas he could think of was “the Star Wars guy”, and that didn’t make any sense.) Well, Lucas studied anthropology in college, before finding his way into filmmaking somewhat, as he tells it, by accident. The themes he explores in his films are, in some ways, rooted in what he learned in that context – most famously, the theory of mythical journeys associated with Joseph Campbell, and the imagining of the archaeologist as hero-adventurer, but also the ethnographic lens that he took in American Graffiti, which documents what he then saw as a disappearing rite-of-passage in American life. The event was billed as a way of thinking about storytelling in anthropology in discussion with a “Master Storyteller”.

A less-than-spectacular photo proving that George Lucas did indeed have a conversation with Deborah Thomas, editor of American Anthropologist, and that I watched from a balcony seat.

So how did that turn out? Well….not great, honestly. Lucas was never trying to be an anthropologist, or to be rigorous in thinking through anthropological ideas, or, of course, to stay current within anthropology. He made quite a few references to “primitive” cultures, and invoked a general view of the “universal journey” that was a) highly masculine (as Campbell is known to be) and b) …not actually universal at all. While some praised the moderator, Deborah Thomas, for navigating his problematic, highly offensive statements (seriously, there were audible winces a few times), I myself felt a bit frustrated as she continually turned the discussion back to asking him to comment on anthropology. The thing is, no one in that audience had anything to learn about anthropology from George Lucas. He was in his element talking about stories, and about his educational outreach initiatives with the new Museum of Narrative Art. For me, the most interesting recurring theme was about the twelve-year-old as the site of imaginative potential. There was a thoughtfulness about the idea of coming-of-age (though not at all based on actual knowledge of coming-of-age rituals around the world), human creative potential, and hope, that was quite beautiful. But even that was undermined by his “encouragement” of anthropology as, essentially, a really good way to learn to do market research (which, ok, it can be), and eye-rolling at “ivory tower” academics who refuse to admit that the “real world” is all about capitalist wealth accumulation. It is quite something, as a group that includes many people who try, however imperfectly, to walk with and understand a huge range of human experiences, including severe economic and political marginalization, to be lectured by a kajillionaire about being “out of touch”. Applauding when Lucas was given a lifetime membership in the AAA – when an annual one is difficult to afford for many active, passionate, graduate student and precariously employed anthropologists – left a bit of a bad taste in my mouth.

At the conference itself, I attended a panel that presented an interesting counterpoint to the talk by Lucas. After the death of author Ursula K. LeGuin earlier this year, linguistic anthropologist Bernard Perley organized a series of talks reflecting on her work and its anthropological legacy/roots. LeGuin was the daughter of Alfred Kroeber, a foundational figure in North American anthropology (which, as one of the panelists noted, is not necessarily reported as a compliment, given the colonial roots of our discipline), and she was raised in a world steeped in anthropology. The speakers on this panel were diverse, both in terms of their identities and their anthropological specialties. They talked about different aspects of LeGuin’s stories, sometimes positively, sometimes critically, but always with an eye to how she used her fiction as a kind of (what she called) speculative anthropology. How do we understand our own world through the lens of another one? How do we move as ethnographers through characters like Genly Ai in The Left Hand of Darkness, or the narrator in The Ones Who Walk Away from Omelas? How do we understand ourselves as sharing responsibility for the stories that shape our world, and what do we owe to the peoples whose stories have been colonized by others?

There were reflections, through these lenses, on both the limitations of LeGuin’s imagined worlds (rereading The Left Hand of Darkness in the context of teaching Language & Gender, Jocelyn Ahlers analyzed how gender markings actually crept in, in ways she hadn’t seen previously – as LeGuin herself also conceded a few decades after writing it, at least with respect to the “generic he” pronoun) and the limitations of anthropology itself (archaeologist Lee Bloch used the journey of The Dispossessed to ask provocative questions about the temporal paradigms on which the field relies, and the colonial logic of these frames in themselves). There was engagement with the contemporary political moment and how examination of The Ones Who Walk Away from Omelas – always a challenging story – resonates in a time and country in which children in cages are not even a little bit metaphorical.

It was interesting to me to contrast these two genres of anthropological conversation, and these two creative minds’ uses of an anthropological imaginary. At its best, anthropology allows us to see universality refracted across radical difference. At its worst, it tries to reduce that difference into a universalizing narrative of progress and improvement. I left California with a desire to imagine more, and in their own way, both these discussions are helping me to do that.

The Underlying Hope of Anthropology: Reflecting on the Work of Jane Hill

This past week, the world of linguistic anthropology – and the world in general, though that world is presumably less conscious of the loss – lost a giant with the death of Jane H. Hill, Professor of Anthropology and Linguistics at the University of Arizona. It is an odd thing in academia when a person whose ideas loom large over a field of thought passes away, much like the death of a more popularly influential artist makes some of us return to their work with a renewed sense of its meaning and impact on the world. I never met Jane personally, though my academic lineage traces back to her in a very short line (she was the PhD supervisor of my PhD supervisor). By all accounts I’ve ever heard, in addition to being a brilliant scholar, she was a wonderful human and mentor, and I can only imagine how that loss is felt by the people closest to her. At the time of this writing, her faculty page at the University of Arizona is still active, and on it, she invites students to “join [her] on the tightrope”, where, as she puts it:

I attempt a precarious balancing act among diverse commitments: to the detailed documentation of languages and cultures and specialized expertise in technical tools such as comparative linguistic analysis, to the understanding of the scope and diversity of human history that is the glory of anthropology, and to using what I learn to advance social justice and mutual respect among human beings.

In a case of social media producing something right, anthropologist Anthony K. Webster (@ethnopoetics) suggested to the American Anthropological Association on Twitter that, in light of Hill’s death, the organization could provide access to her publications for free – and they did! For six months, any of Hill’s articles from the considerable library of publications housed at AnthroSource are available to access free of charge. Anyone interested in the broad areas of language, culture, and social justice should absolutely take advantage of this opportunity.

This blog is not really the best place for me to even try to highlight the value of these contributions (the upcoming AAA meetings in San Jose are sure to include many such reflections), but I will make a few recommendations about what to read, from that list, as well as additional work.

  1. The Everyday Language of White Racism – I am starting immediately with a book, which is not, of course, made accessible through AnthroSource, but which is too significant not to lead with. This is the book I always go to whenever anyone asks for the one recommendation from my field that I think everyone should read. Hill wrote this book late in her career, based on analysis of online discourses and commentary about various racial issues manifested in language, including slurs, appropriation, and “gaffes”. The title of the book makes clear what this is about – whiteness, and the quotidian ways in which a white racist social order is maintained. Now ten years old, it is dated only in some of the technological details, and I have found that the tools she uses with reference to US contexts are equally relevant for understanding racism in Canada.
  2. “Language, Race, and White Public Space” – American Anthropologist, 1999. This article previews some of the analysis presented in the book above, and fortunately is available for free online. Here, Hill focuses on how language is used not only to construct a negative racial view of non-whites, but also “whiteness as an unmarked normative order”. The discussion of “Mock Spanish” that originates here has become a staple of linguistic anthropology courses, especially in the US, because it so powerfully demonstrates the multifaceted political and social underpinnings of what is initially easy to dismiss as an offhand, casual joke.
  3. “‘Expert Rhetorics’ in Advocacy for Endangered Languages: Who is Listening, and What Do They Hear?” Journal of Linguistic Anthropology, 2002. This is the article that I refer to most in my own work – without checking, I would put money on it being probably the only piece of writing that I have cited in literally everything I’ve ever published. Hill started her career working with speakers of Mexicano (Nahuatl), and continued her work with Indigenous language advocacy in the Southwestern US and Mexico throughout her life. In this article, she casts a critical eye on how we talk about Indigenous languages, and how, in our efforts to convince people that they should care about this sometimes difficult-to-articulate issue, we inadvertently reinforce colonial power structures and the very marginalization that we aim to counteract. This is an example of the best kind of anthropological critique, to my mind: while we can often become cynical or righteous in ‘tearing down’ the efforts of well-meaning folks around us, a call to re-examine how we do our work, from a place of love and valuing of the goals of our advocacy effort, is often needed.
  4. “The Grammar of Consciousness and the Consciousness of Grammar” American Ethnologist, 1985. This one is for those of you who are fully on board the linguistic anthropology train already, as it includes a lot of theoretical discussion of how to think in relation to both structural grammar and political economy. It is, however, definitely one that is worth engaging with in order to gain a more advanced understanding of these interrelated systems of power, and it’s a reminder for those of us who are students of language, in whatever form (linguistic, anthropological, or otherwise), that our object of study is one that is deeply intertwined with a political world.

Hill’s writing is definitely within an academic tradition, but it’s relatively accessible. I’ve used all but the last of the above articles in my undergraduate classes, and even included chapters from The Everyday Language of White Racism in a first-year course. Revisiting her work reminds me of why I do what I do, and to keep in mind the “balancing act” that she highlights, with a commitment to creating a more just world acting as the centre of gravity that orients my study of both linguistics and anthropology. The echo and imprint of her time in the world is a great one, and it gives me something to aspire to.

Should you major in Cultural Anthropology?

A first-year student in my Anthropology 101 course emailed to let me know that they found the class readings intriguing and that they loved learning about cultural values, stories, and traditions from around the world. Their email ended with a question: Can you tell me what kind of jobs there are for graduates of (cultural) anthropology?

This isn’t the first time a student has asked me this, so I thought my first post on this blog (see the editor’s post about bringing a cultural anthropologist to the group) might address the question for anyone thinking of majoring in cultural anthropology.

There are lots of great resources out there that discuss careers for anthropologists, such as the American Anthropological Association’s page on advancing one’s career, but few discuss the tangible skills gained by students graduating with an anthropology BA.

As a cultural anthropologist, I think anthropology graduates can do any job that requires someone trained in the social sciences; that is, an anthropology graduate can think critically, wade through lots of data to identify the important information, communicate clearly, solve problems, and work toward time and project deadlines. While cultural anthropologists study topics and fields similar to those of sociologists, we tend to receive more qualitative data analysis training, with a focus on ethnography, rather than quantitative training.

From my work experience in for-profit and non-profit organizations, I find anthropology graduates have the unique ability to appreciate difference (they can identify and acknowledge that there are different ways of living, leading, and learning, etc.), and they have learned how to be self-reflexive – both skills are features of ethnographic methodology.

These skills have been discussed elsewhere as facets of a ‘Tolerance of Ambiguity’ (TOA). Psychologists DeRoma, Martin, and Kessler (2003) define TOA, following Budner, as “an individual’s propensity to view ambiguous situations as either threatening or desirable” (105). Put simply, if you have a low tolerance for ambiguity, you will not be comfortable with situations or people that are different from you. Similarly, sociologist Donald Levine argues that tolerance, and intolerance, are learned, context-dependent, and something experienced ‘between people’. These theories signal the importance of being open to difference and acknowledging one’s own cultural context.

Important for our anthropology graduates, employers have identified the benefits of flexibility and adaptability in their quest to hire university graduates with transferable skills. Minocha, Hristov, and Leahy-Harland (2018) argue that acquiring such traits creates a global-ready workforce. In a recent study, Fewster and O’Connor found that “individuals who ha(d) a higher tolerance of ambiguity (would) be more productive and responsive in the volatile, uncertain and complex world of work, and experience increased job satisfaction, and overall well being” (2017: 2). In this report, the authors identify ‘cultivating curiosity’ as a trait individuals could focus on to develop their level of TOA. Cultivating curiosity is defined as:

“Cultivating curiosity in the workplace was also found to be a trait that people could focus on to develop their TOA. These behaviours centre around interacting with others and include effectively communicating and listening to co-workers; when problems arise, asking questions that encourage curiosity and if confronted with resistance from others, asking questions that lead to identifying possible solutions rather than dwelling on the past. Collaboration is also important including behaviours such as encouraging participation from others, posing questions, creating strong professional relationships and networks for diversity of thought, sharing ideas and being open to connect the ideas of different people” (Fewster and O’Connor 2017: 9).

Anthropology graduates have spent their entire undergraduate careers cultivating such curiosity in their search to understand the ways in which human beings live their lives similarly and differently around the globe. Taking a holistic and comparative perspective comes naturally for our graduates, as these skills have been honed over time.

In The Teaching of Anthropology, Cora Du Bois argues that TOA is one of the attitudes that anthropologists as teachers need to foster in their students (1963: 37). She describes this attitude as “a capacity to entertain uncertainty, to cope with paradox, to allow for the indeterminate” (Du Bois 1963: 37). There are many opportunities for anthropology instructors to facilitate and develop such skills in their students through in-class activities (e.g. discussions that invite self-reflection) and through both summative and formative assessment strategies (e.g. comparative analysis, field prep tasks, etc.) throughout students’ undergraduate careers.

So what do anthropology graduates have that other undergraduates might not? In addition to all those skills gained from a university degree, they have the unique ability to recognize and appreciate difference, to critically reflect on internal logic (the systems in place), and to adapt to situations that are different from what they or their company may be used to.

But there is one caveat.

Cultural anthropology as a sub-discipline is pretty terrible at its public relations, and it suffers from a bit of self-doubt relative to other, more established social science disciplines. Applied anthropologists (a group I also identify with) tend to run in separate circles from cultural anthropologists (although I believe there is more overlap between applied anthropology and both archaeology and biological anthropology). As a result, not all cultural anthropologists working in academia are skilled at telling their students what their transferable skills are and how they can capitalize on an anthropology degree.

Employers want their next employee to have a university degree, and our graduates will need to tell them why a degree in anthropology makes them the best candidate.

Editor’s Comment: The Cultural One, Jennifer Long will write a future reflection on her experiences as an applied anthropologist in the areas of program evaluation, market research, and as a qualitative researcher-for-hire.

If any reader wants to know more about what applied anthropologists do, they can visit these websites:
https://www.applied-anthropology.com/

https://www.sfaa.net/annual-meeting/

These links provide a brief overview into the work of various applied anthropologists.