Singular They is Old, Singular They is New

(The Linguistic One stretches, blows the dust off this blog that has been sitting idle while we all deal with Life Things, and dives into writing).

In the last few years, English pronouns have become a hot topic of discussion and controversy, mainly because they constitute the central linguistic battleground on which English-speakers play out debates about the nature of sex and gender. I thought I had written about this more than once on here before, but it turns out I touched on it only once (here, in relation to the claims of authority expressed by Canadian Academia’s own He-Who-Shall-Not-Be-Named). The main focus of The Great Pronoun Controversy is what is conventionally called “singular they”, although novel pronouns (like ze, xie, or others) also come up sometimes. There is now an abundance of good writing available, both in accessible blog posts or news stories and in books and academic articles — to highlight just a few, check out the work of sociocultural linguist Lal Zimman (here for academic stuff, here for some blogging), as well as the fantastic work of my new colleague Lee Airton on the blog “They is my Pronoun” or in their book “Gender: Your Guide”, discussed here.

We are hitting another round of public discussion of “singular they” right now, as the Merriam-Webster dictionary has declared it to be the “word of the year” (for the announcement, see here). This comes on the heels of the fairly significant announcement a few months ago that the American Psychological Association style guide (heavily used across several academic disciplines) will include, in its 7th edition, the instruction to use “singular they” in cases where a) the gender of the individual being discussed is unknown or not specified, or b) the gender of the individual being discussed is known to be neither male nor female. The APA decision in particular makes an important move toward changing the material manifestation of gender representation in print, since style guides constitute, in some contexts, formal rule books — in academic or journalistic writing, you may be able to argue for some wiggle room, but the default will be for copy-editors and other reviewers to “correct” your word choices.

As noted above, there are any number of experts on the use of Singular They who could tell you more about the pragmatics, psycholinguistic acquisition, and politics of this pronoun (in addition to the scholars above, see, for example, information about this conference on the topic from last summer, or the Twitter accounts of scholars @kirbyconrod, @VerbingNouns, and @lexicondk). What I want to add here is really about the discourse around singular they, in particular around the “Word of the Year” declaration. In addition to folks who have really rigid ideas about how both gender and language work or should work, I see some mild pushback on these types of announcements from people who totally support the use of singular they, but who dismiss the idea that anything new is happening here. People who point out this oldness and commonness are profoundly well-intentioned and supportive of the rights of gender non-conforming people (they may even be trans or non-binary themselves), but I think they are missing something that does matter about this pronoun, and in doing so, they are appealing to a view of language that is worth pushing against.

In a certain sense, it is accurate to say that “singular they” is old – there are attested uses of it, referring to unknown individuals or hypothetical people (e.g. “If someone comes to my door to sell me cookies, I will give them all my money”), going back hundreds of years. There are even fun examples of people arguing against it literally while they are using it (for example, “If a student submits a paper using ‘they’ as a singular pronoun, they are going to lose grades for grammatical incorrectness”), and it is hard to resist the schadenfreude involved in pointing out this apparent hypocrisy. But Merriam-Webster — and the American Dialect Society, which declared ‘they’ its Word of the Year in 2015 — are not hopelessly out of touch in recognizing this pronoun as a significant word that highlights an important social change. The use of singular they to apply to named/known, non-binary individuals is definitely new, and its rise is directly connected to an increasingly prominent understanding of gender/sex in non-binary terms. This is a point I want to emphasize for a couple of reasons – first, rooting the claim to its “correctness” in an argument that “singular they is old” opens the door to those who can object to your point by noting (accurately) that this way of using it is new. If your point is that we should be okay with the grammaticality of singular they because we have been, in a certain form, for centuries, that is one type of linguistic battle you may choose to fight; if your point is that we can and should affirm non-binary gender identities through the recognition of “they” as a personal and specific pronoun, relying on an appeal to its longstanding grammatical presence is weak. Don’t get me wrong – I am not debating or contesting the grammaticality of non-binary, specific, singular they. I’m just saying a) it is actually a new thing, and b) that’s actually great, because it shows that our language can change to accommodate our new social understandings of fundamental things like gender. Grammatical correctness does not accumulate with age.

The second reason I want to emphasize it is that I think sometimes this “singular they is old” and “everyone uses singular they” point is somewhat dismissive of the challenge of learning to apply this pronoun. A lot of the really great scholarly work around singular they right now is looking at people’s ability to acquire the pronoun and to learn to use it appropriately. Airton’s entire blog (linked above) proceeds from the recognition that this is a thing that has to be learned by most people, and that addressing that process and effort compassionately and supportively is an important part of bringing about the necessary social change. Perhaps unsurprisingly, this change is easiest to make for people who are themselves trans, non-binary, or genderqueer, who have been thinking about gender in complex and life-altering ways for essentially their entire lives, or for people who are situated in communities in which they encounter a lot of gender non-conforming individuals, and who therefore get a lot of opportunities to use these pronouns. I do think people outside these groups – in other words, cis, straight people who don’t necessarily engage much with queer communities – need to put in the time to learn how to do this right. It matters. Using the wrong pronouns for people hurts them (see, for example, this discussion of the related practice of “deadnaming” trans people), and denies their gender identities. Language is a central battleground in this particular story because it is through language that we express our acceptance or denial of the reality of who a person is. These expressions are about real changes to how we, as a society, talk about gender, and that means it’s worth taking the time to learn even (or perhaps especially) if it’s hard and confusing for you. It’s one thing to criticize pedantic dinosaurs for refusing to even entertain the grammaticality of singular they in any form, but quite another, I think, to suggest that there’s nothing to see here.

The grammaticality of “singular they” doesn’t depend on its presence in dictionaries or style guides or on appeals to its age in the English language, but in this case, the dictionary is right to highlight it – trans and non-binary people are becoming much more visible, and we as a broader society are learning new ways to talk about gender as a result. This pronoun is a radical thing, and it has come to mainstream public attention and use really quite quickly. Recognizing its newness is not to dismiss it – instead, it is to highlight its importance and to push forward with making it more present.

 

Why I teach about Female Genital Cutting (FGC) in First-Year Anthropology

Content Warning: The following post discusses the importance of acknowledging one’s own bias and avoiding judgment of cultural practices. It also explores the importance of concepts such as cultural relativism and critical cultural relativism when discussing taboo topics, like FGC, in Canadian post-secondary classrooms. This post does not attempt to take a position on whether FGC or male circumcision is right or wrong, or to provide a comparison between the two practices. Its goal is to discuss how FGC is covered in Canadian and US mainstream media and why this discussion is an informative case study that I use to demonstrate and discuss foundational concepts in my first-year cultural anthropology course. Reader beware.

Continue reading “Why I teach about Female Genital Cutting (FGC) in First-Year Anthropology”

Language, Accommodation, and the View from Whiteness

I want to take a minute to make a quick point about the underlying implications of several stories that have circulated in the media over the last couple of weeks. These stories have all been pretty thoroughly reported on and critiqued, so apologies if you are already sick of them, but I think it’s worth gathering them in one place for comment.

The stories:

  1. Senior NBC journalist Tom Brokaw comments (and later apologizes for saying) that “the Hispanics should work harder at assimilation”, specifically noting that they should make sure “all their kids are learning to speak English”. (Ed: Wow, he really said ‘the Hispanics’, even, didn’t he? That’s…special).
  2. A professor (and now former program administrator) at Duke University wrote an email to students in a medical program blatantly stating that choosing to speak Chinese to their friends would be held against them when it came time to consider internship and employment candidates.
  3. A study of court reporters in Philadelphia found significant inaccuracies in transcripts that included African American varieties of English, to the point that 11% of the transcribed sentences were “gibberish”.

The first two stories contain an obvious similarity: racialized people are perceived as insufficiently willing to “assimilate” or become full members of the community, primarily as a result of their use of languages other than English. As several responses have pointed out, the claims made by Brokaw and by Duke professor Megan Neely are built on fictions – the language skills of Latinx people in the US are just fine, the assumption that English is the only language of US culture is an act of extraordinary erasure, and international students at Duke, for their part, are required to demonstrate English proficiency before even being accepted into the program. But facts like these don’t matter in shaping these perceptions — as scholars like Nelson Flores and Jonathan Rosa have observed, language and race are mutually constructed pieces of social life, such that, for example, Latinx bilinguals are interpreted as linguistically deficient, while White bilinguals are interpreted as exceptional and intelligent (my go-to example in class is this gushing headline about Princess Charlotte, which makes me sigh so hard). These raciolinguistic ideologies also come into play as people’s interpretation of whether someone has an accent or not is heavily influenced by that person’s appearance (or other non-linguistic information, like the person’s last name)*. All of that is to say: in these two stories, discrimination based on (perceived) linguistic ability is being used to stand in for discrimination based on racial identity, since the latter is considered vile (at least for people like Brokaw, about whom any number of “but he’s definitely not racist!” defenses were marshalled), but the former is apparently justifiable.

The third story, about the Philadelphia court reporters, is a bit different. In this case, what we see is how court reporters demonstrate something that should actually be an obvious disqualification for the position they hold: the inability to understand and accurately represent varieties of English spoken by a significant proportion of the people whose speech they are paid to document. This is an especially high-stakes context that demonstrates a fundamental lack of care about these speakers, and can’t be detached from the well-documented inequalities in court outcomes for African Americans (both as defendants, and as victims/witnesses**). Grammatical patterns of African American English(es) are well known (a few are even included in the linked article) and could easily be taught and learned as part of training for a position in which accurate rendering of speakers’ words is vital. But…they’re not, and people hired into these positions are allowed to continue, despite their clear linguistic limitations.

These three examples illustrate the same story from two different sides: the language of non-white people is different, difficult, and needs to be improved. Non-white people are responsible for working diligently to demonstrate, linguistically and otherwise, their membership in the group. White people do not have to bother learning how to understand or use the language of non-Whites – not even when it’s central to their job. Bilingualism and the ability to code-switch appropriately and effectively become a survival requirement for some people, and a complete non-issue for others. This functions not only to reinforce racist interpretations of different people’s linguistic abilities, but also to impose a cognitive burden onto those who are required to do the extra work of learning multiple codes, as well as the social expectations about switching between them, and of constantly monitoring how their speech is being (mis)interpreted.

Whenever discussions of racism and racist comments emerge, a lot of focus goes on to whether or not the individual person who made the comments “meant” to be racist, or whether that person “is a racist”. The thing is, ultimately, Brokaw, Neely, and those individual Philadelphia court reporters are not what these stories are about. These stories are about the constant reminders that non-White people get about their “limitations”, about the work they need to do to be accepted in “mainstream” (read: White) society, about the shifting goalposts through which racial discrimination is enacted in practice, and about how the view from the perspective of Whiteness is continually rendered as the only “normal” way of being. Language is an extremely powerful force in the manifestation of racism, and these examples are pieces that make that force work.

*This research is complex, and there are a lot of theories around how and why it comes into play – a relatively recent discussion of it, for example, can be found here: https://link.springer.com/article/10.3758/s13414-017-1329-2 .

**A particularly prominent example of the impact of African American language in court can be seen in the discussion, by John Rickford and Sharese King, of testimony given in the trial of George Zimmerman.

 

Tightening the belt – The Anthropology of Consumerism

Did you spend too much over the holidays trying to spoil your dearest and nearest friends and family? Did you decide to travel to see loved ones? Eat out more than usual? Grab a drink with an old friend or new somebody?

Spending on travel, eating out, and gifts during the holiday season is increasingly putting Canadians into debt. According to a national cross-generational survey of 1,000 participants in early October 2018, Canadians planned to “spend an average of $1,563 (for the 2018 Christmas season), up 3.7 per cent from $1,507 in 2017” (CBC, October 3, 2018).

Christmas Shopping in Hamburg – CC0

In the latest volume of the Annual Review of Anthropology (2018, Vol. 47), Anne Meneley defines consumerism as “a matter of concern or crisis in the contemporary neoliberal, globalized world (which can be) characterized as capitalism unbound” (emphasis my own). She describes five topics of contemporary consumerism: (a) excess, (b) waste, (c) connectivity, (d) fair-ish trade, and (e) the semiotics of self-fashioning, some of which have a particular resonance after this most recent holiday season. Her article provides some interesting insights into consumerism – especially over the holidays.

In relation to excessive spending (surely evident during Christmas), Meneley notes that consumerism is increasingly framed as a problem, and one that is often related to the under- or mis-education of the lower classes. Meneley also identifies how excessive consumerism has become medicalized as a new obsessive-compulsive disorder (hoarding), in which fetishized objects are thought to contain residues of the owner and therefore cannot be thrown away. In addition, she describes the new attention paid to the storage and organization of things, which, if disorganized, may now require professional intervention (e.g. professional organizers – check out Netflix’s Tidying Up with Marie Kondo) to realign the relationship between human being and thing.

Perhaps you’re feeling exhausted now that the holidays are over? This might be because you’ve spent more time than other members of your household preparing for them.

Using ethnographic research, Meneley describes the shopping experience as an(other) example of unpaid labour for many women. She identifies the “considerable amounts of time (spent on the shopping experience), especially when the shoppers are employed, caregivers, or on restricted budgets that require bargain shopping” (2018). Examples include how women are required to spend time purchasing meaningful gifts to fulfill their kin-keeper obligations, or to plan, purchase materials for, and serve home-cooked meals throughout the holidays that follow recent cooking trends or health guidelines. Meneley goes on to note that if the shopper can be thrifty (with time – for example, through online shopping – or with money spent), this may add further significance to their purchases, but it may also take additional time.

Meneley concludes her article with a list of ways in which consumerism is encroaching on the academic world: paying for access to journals or subscription services, measuring citation indices and impact factors, and the continued trend toward staffing universities with underpaid and under-supported adjunct faculty. She calls for greater attention to the encroachment of this ‘problem of consumerism’ into academic practices, a call that already feels old and tired.

At the outset of the article, Meneley defines consumerism as “an unremarkable part of quotidian existence, as a patriotic duty at various moments, as an indicator of social class, and as a means of semiotic self-fashioning” (2018, 117); yet, in my reading, Meneley’s work also includes ‘thoughtful consumption’ as a practice, an act which implicitly requires the passage, and the importance, of time (spent). Although she does not address the topic of ‘time’ overtly, Meneley describes time as being precarious, fleeting, and expensive (i.e. time spent finding the cheapest, most meaningful, most nutritious goods). Throughout the article, then, time becomes remarkably interconnected with the act of consumerism and is likewise involved in everyday acts of consumption, as both an indicator of social class and a means of personal branding (what she calls ‘semiotic self-fashioning’).

Perhaps for this new year, then, when we’re told to tighten our belts (to spend less), those of us who gave a lot (whether that be time or gifts, etc.) could pay more attention to our use of time and/or put effort into thoughtful consumption as a way of clawing back some of our own resources (such as time, space, and energy). This approach could provide further evidence of ‘connectivity’ in consumerism, which Meneley describes as the efforts of consumers to connect to the producers of goods (and the ways certain products make this impossible), as seen in ‘follow-the-thing ethnographies’ (and her discussion of ‘fair-ish trade’ products and cultures of circulation), or as it relates to the growing importance of ethical consumerism, which focuses less on the ‘life of things’ and instead explores participants’ experiences of a ‘life with things’.

To read more about Consumerism in the Annual Review of Anthropology by Anne Meneley follow this link: https://doi.org/10.1146/annurev-anthro-102116-041518

Reflections on the AAA, Part 1: On Speculative Anthropology

Editor’s note: This year, two of our people (the Linguistic One and the Cultural One) went to the American Anthropological Association’s Annual Meeting in San Jose, California. This is an optimistically titled “Part 1” of their reflections on the conference, since they are undoubtedly returning to their “real” lives as we speak and staring at the pile of grading that did not magically diminish while they were otherwise occupied learning new things, sharing their own work, and meeting with old and new colleagues and collaborators.

A few days before I left for San Jose, members of the AAA got what was honestly a very surprising email announcing a special guest lecture that about five hundred people would be able to attend. Tickets were free, but were scooped up within a few hours of this announcement, because while we are used to getting excited about academic rock stars, it is pretty rare for someone who is truly famous in the rest of the world to connect to such an event.

The guest was George Lucas, and the near-universal reaction to this announcement was…wait, what? What does Lucas have to do with anthropology? (An alternative reaction was recounted to me by a colleague, who had a graduate student ask her about the famous anthropologist George Lucas and what he worked on, because the only George Lucas he could think of was “the Star Wars guy”, and that didn’t make any sense). Well, Lucas studied anthropology in college, before finding his way into filmmaking somewhat, as he tells it, by accident. The themes he explores in his films are, in some ways, rooted in what he learned in that context – most famously, the theory of mythical journeys associated with Joseph Campbell, and the imagining of the archaeologist as hero-adventurer, but also the ethnographic lens that he took in American Graffiti, which documents what he then saw as a disappearing rite of passage in American life. The event was billed as a way of thinking about storytelling in anthropology in discussion with a “Master Storyteller”.

A less-than-spectacular photo proving that George Lucas did indeed have a conversation with Deborah Thomas, editor of American Anthropologist, and that I watched from a balcony seat.

So how did that turn out? Well…not great, honestly. Lucas was never trying to be an anthropologist, or to be rigorous in thinking through anthropological ideas, or, of course, to stay current within anthropology. He made quite a few references to “primitive” cultures, and invoked a general view of the “universal journey” that was a) highly masculine (as Campbell is known to be) and b) …not actually universal at all. While some praised the moderator, Deborah Thomas, for navigating his problematic, at times highly offensive, statements (seriously, there were audible winces a few times), I myself felt a bit frustrated as she continually turned the discussion back to asking him to comment on anthropology. The thing is, no one in that audience had anything to learn about anthropology from George Lucas. He was in his element talking about stories, and about his educational outreach initiatives with the new Museum of Narrative Art. For me, the most interesting recurring theme was about the twelve-year-old as the site of imaginative potential. There was a thoughtfulness about the idea of coming-of-age (though not at all based on actual knowledge of coming-of-age rituals around the world), human creative potential, and hope, that was quite beautiful. But even that was undermined by his “encouragement” of anthropology as, essentially, a really good way to learn to do market research (which, ok, it can be), and eye-rolling at “ivory tower” academics who refuse to admit that the “real world” is all about capitalist wealth accumulation. It is quite something, as a group that includes many people who try, however imperfectly, to walk with and understand a huge range of human experiences, including severe economic and political marginalization, to be lectured by a kajillionaire about being “out of touch”. Applauding when Lucas was given a lifetime membership in the AAA – when an annual one is difficult to afford for many active, passionate, graduate student and precariously employed anthropologists – left a bit of a bad taste in my mouth.

At the conference itself, I attended a panel that presented an interesting counterpoint to the talk by Lucas. After the death of author Ursula K. LeGuin earlier this year, linguistic anthropologist Bernard Perley organized a series of talks reflecting on her work and its anthropological legacy/roots. LeGuin was the daughter of Alfred Kroeber, a foundational figure in North American anthropology (which, as one of the panelists noted, is not necessarily reported as a compliment, given the colonial roots of our discipline), and she was raised in a world steeped in anthropology. The speakers on this panel were diverse, both in terms of their identities and their anthropological specialties. They talked about different aspects of LeGuin’s stories, sometimes positively, sometimes critically, but always with an eye to how she used her fiction as a kind of (what she called) speculative anthropology. How do we understand our own world through the lens of another one? How do we move as ethnographers through characters like Genly Ai in The Left Hand of Darkness, or the narrator in The Ones Who Walk Away from Omelas? How do we understand ourselves as sharing responsibility for the stories that shape our world, and what do we owe to the peoples whose stories have been colonized by others?

There were reflections, through these lenses, on both the limitations of LeGuin’s imagined worlds (rereading The Left Hand of Darkness in the context of teaching Language & Gender, Jocelyn Ahlers analyzed how gender markings actually crept in, in ways she hadn’t seen previously – as LeGuin herself also conceded a few decades after writing it, at least with respect to the “generic he” pronoun) and the limitations of anthropology itself (archaeologist Lee Bloch used the journey of The Dispossessed to ask provocative questions about the techno-scape of the temporal paradigms on which the field relies, and the colonial logic of these frames in themselves). There was engagement with the contemporary political moment and with how examination of The Ones Who Walk Away from Omelas – always a challenging story – resonates in a time and country in which children in cages are not even a little bit metaphorical.

It was interesting to me to contrast these two different genres of anthropological conversation with two different creative minds’ uses of an anthropological imaginary. At its best, anthropology allows us to see universality refracted across radical difference. At its worst, it tries to reduce that difference into a universalizing narrative of progress and improvement. I left California with a desire to imagine more, and in their own way, both of these discussions are helping me to do that.

The Underlying Hope of Anthropology: Reflecting on the Work of Jane Hill

This past week, the world of linguistic anthropology – and the world in general, though that world is presumably less conscious of the loss – lost a giant with the death of Jane H. Hill, Professor of Anthropology and Linguistics at the University of Arizona. It is an odd thing in academia when a person whose ideas loom large over a field of thought passes away, much like the death of a more popularly influential artist makes some of us return to their work with a renewed sense of its meaning and impact on the world. I never met Jane personally, though my academic lineage traces back to her in a very short line (she was the PhD supervisor of my PhD supervisor). By all accounts I’ve ever heard, in addition to being a brilliant scholar, she was a wonderful human and mentor, and I can only imagine how that loss is felt by the people closest to her. At the time of this writing, her faculty page at the University of Arizona is still active, and on it, she invites students to “join [her] on the tightrope”, where, as she puts it:

I attempt a precarious balancing act among diverse commitments: to the detailed documentation of languages and cultures and specialized expertise in technical tools such as comparative linguistic analysis, to the understanding of the scope and diversity of human history that is the glory of anthropology, and to using what I learn to advance social justice and mutual respect among human beings.

In a case of social media producing something right, anthropologist Anthony K. Webster (@ethnopoetics) suggested to the American Anthropological Association on Twitter that, in light of Hill’s death, the organization could provide access to her publications for free – and they did! For six months, any of Hill’s articles from the considerable library of publications housed at AnthroSource are available to access free of charge. Anyone interested in the broad areas of language, culture, and social justice should absolutely take advantage of this opportunity.

This blog is not really the best place for me to even try to highlight the value of these contributions (the upcoming AAA meetings in San Jose are sure to include many such reflections), but I will make a few recommendations about what to read, from that list, as well as additional work.

  1. The Everyday Language of White Racism – I am starting immediately with a book, which is not, of course, made accessible through AnthroSource, but which is too significant not to lead with. This is the book I always go to whenever anyone asks for the one recommendation from my field that I think everyone should read. Hill wrote this book late in her career, based on analysis of online discourses and commentary about various racial issues manifested in language, including slurs, appropriation, and “gaffes”. The title of the book makes clear what this is about – whiteness, and the quotidian ways in which a white racist social order is maintained. Now ten years old, it is dated only in some of the technological details, and I have found that the tools she uses with reference to US contexts are equally relevant for understanding racism in Canada.
  2. “Language, Race, and White Public Space” – American Anthropologist, 1999. This article previews some of the analysis presented in the book above, and fortunately is available for free online. Here, Hill focuses on how language is used not only to construct a negative racial view of non-whites, but also “whiteness as an unmarked normative order”. The discussion of “Mock Spanish” that originates here has become a staple of linguistic anthropology courses, especially in the US, because it so powerfully demonstrates the multifaceted political and social underpinnings of what is initially easy to dismiss as an offhand, casual joke.
  3. “‘Expert Rhetorics’ in Advocacy for Endangered Languages: Who is Listening, and What Do They Hear?” Journal of Linguistic Anthropology, 2002. This is the article that I refer to most in my own work – without checking, I would put money on it being probably the only piece of writing that I have cited in literally everything I’ve ever published. Hill started her career working with speakers of Mexicano (Nahuatl), and continued her work with Indigenous language advocacy in the Southwestern US and Mexico throughout her life. In this article, she turns a critical eye to how we talk about Indigenous languages, and how, in our efforts to convince people that they should care about this sometimes difficult-to-articulate issue, we inadvertently reinforce colonial power structures and the very marginalization that we aim to counteract. This is an example of the best kind of anthropological critique, to my mind: while we can often become cynical or righteous in ‘tearing down’ the efforts of well meaning folks around us, a call to re-examine how we do our work, from a place of love and valuing of the goals of our advocacy effort, is often needed.
  4. “The Grammar of Consciousness and the Consciousness of Grammar” American Ethnologist, 1985. This one is for those of you who are fully on board the linguistic anthropology train already, as it includes a lot of theoretical discussion of how to think in relation to both structural grammar and political economy. It is, however, definitely one that is worth engaging with in order to gain a more advanced understanding of these interrelated systems of power, and it’s a reminder for those of us who are students of language, in whatever form (linguistic, anthropological, or otherwise), that our object of study is one that is deeply intertwined with a political world.

Hill’s writing is definitely within an academic tradition, but it’s relatively accessible. I’ve used all but the last of the above articles in my undergraduate classes, and even included chapters from The Everyday Language of White Racism in a first-year course. Revisiting her work reminds me of why I do what I do, and to keep in mind the “balancing act” that she highlights, with a commitment to creating a more just world acting as the centre of gravity that orients my study of both linguistics and anthropology. The echo and imprint of her time in the world is a great one, and it gives me something to aspire to.

Should you major in Cultural Anthropology?

A first-year student in my Anthropology 101 course emailed to let me know that they found the class readings intriguing and that they loved to learn about cultural values, stories, and traditions from around the world. Their email ended with a question: Can you tell me what kind of jobs there are for graduates of (cultural) anthropology?

This isn’t the first time I’ve had a student ask me this, and I thought my first post on this blog (see the editor’s post about bringing a cultural anthropologist to the group) might address this question for anyone thinking of majoring in cultural anthropology.

There are lots of great resources out there that discuss careers for anthropologists, such as the American Anthropological Association’s page on advancing one’s career, but few discuss the tangible skills gained by students graduating with an anthropology BA.

As a cultural anthropologist, I think anthropology graduates can do any job that requires someone trained in the social sciences; that is, an anthropology graduate can think critically, wade through lots of data to identify the important information, communicate, and problem-solve, and they have experience working toward time/project deadlines. While cultural anthropologists study similar topics and fields to sociologists, we tend to receive more qualitative data analysis training, with a focus on ethnography, rather than quantitative training.

From my work experience in for- and non-profit organizations, I find anthropology graduates have the unique ability to appreciate difference (they can identify and acknowledge that there are different ways of living, leading, and learning, etc.), and they have learned how to be self-reflexive – both skills are features of ethnographic methodology.

These skills have been discussed elsewhere as facets of a ‘Tolerance of Ambiguity’ (TOA). Psychologists DeRoma, Martin, and Kessler (2003) define TOA, following Budner, as “an individual’s propensity to view ambiguous situations as either threatening or desirable” (105). Put simply, if you have a low tolerance for ambiguity, you will not be comfortable with situations or people that are different from you. Likewise, sociologist Donald Levine argues that tolerance, and intolerance, are learned, context-dependent, and experienced ‘between people’. These theories signal the importance of being open to difference and acknowledging one’s own cultural context.

Important for our anthropology graduates, employers have identified the benefits of flexibility and adaptability in their quest to hire university graduates with transferable skills. Minocha, Hristov, and Leahy-Harland (2018) argued that acquiring such traits creates a global-ready workforce. In a recent study, Fewster and O’Connor found that “individuals who ha(d) a higher tolerance of ambiguity (would) be more productive and responsive in the volatile, uncertain and complex world of work, and experience increased job satisfaction, and overall well being” (2017: 2). In this report, the authors identify ‘cultivating curiosity’ as a trait individuals could focus on to develop their level of TOA. Cultivating curiosity is defined as:

“Cultivating curiosity in the workplace was also found to be a trait that people could focus on to develop their TOA. These behaviours centre around interacting with others and include effectively communicating and listening to co-workers; when problems arise, asking questions that encourage curiosity and if confronted with resistance from others, asking questions that lead to identifying possible solutions rather than dwelling on the past. Collaboration is also important including behaviours such as encouraging  participation from others, posing questions, creating strong professional relationships and networks for diversity of thought, sharing ideas and being open to connect the ideas of different people” (Fewster and O’Connor 2017: 9).

Anthropology graduates have spent their entire undergraduate careers cultivating such curiosity in their search to understand the ways in which human beings live their lives similarly and differently around the globe. Taking a holistic and comparative perspective comes naturally for our graduates, as these skills have been honed over time.

In The Teaching of Anthropology, Cora Du Bois argues that TOA is one of the attitudes that anthropologists as teachers need to foster in their  students (1963:37). She describes this attitude as “a capacity to entertain uncertainty, to cope with paradox, to allow for the indeterminate” (Du Bois 1963: 37).  There are many opportunities for anthropology instructors to facilitate and develop such skills in their students through in-class activities (e.g. through discussions that entice self-reflection) and through both summative and formative assessment strategies (e.g. comparative analysis, field prep tasks, etc.) throughout students’ undergraduate careers.

So what do Anthropology graduates have that other undergraduates might not? In addition to all those skills gained from a university degree, they have the unique ability to recognize and appreciate difference, to critically reflect on internal logic (the systems in place), and to adapt to situations that are different from what they or their company may be used to.

But there is one caveat.

Cultural anthropology as a sub-discipline is pretty terrible at its public relations, and it suffers from a bit of self-doubt among the other established social science disciplines. Applied anthropologists (a group that I also identify with) tend to run in separate circles from cultural anthropologists (although I believe there is more overlap between applied anthropology and the archaeology and biological anthropology subfields), and therefore not all cultural anthropologists working in academia are skilled at telling their students what their transferable skills are and how they can capitalize on an anthropology degree.

Employers want their next employee to have a university degree, and our graduates will need to tell them why a degree in anthropology has made them the better/best candidate.

Editor’s Comment: The Cultural One, Jennifer Long will write a future reflection on her experiences as an applied anthropologist in the areas of program evaluation, market research, and as a qualitative researcher-for-hire.

If any reader wants to know more about what applied anthropologists do, they could visit these websites:
https://www.applied-anthropology.com/

https://www.sfaa.net/annual-meeting/

These links provide a brief overview of the work of various applied anthropologists.

Assessing Value: Reflections on the Royal Alberta Museum

I spend a lot of time these days angry; there is so much to be rightfully angry about and I want to write about it all but don’t have enough fucks. Sometimes writing isn’t the best way to respond; however, in this case I know that the only way I might be heard is to write about it. So what has me so angry that I am actually writing about it? This “review” of the newly opened Royal Alberta Museum (a.k.a. the RAM). This opening was a much anticipated event; the original RAM closed its doors to the public after a wonderful 48-hour-long celebration in December 2015.

My attention was first drawn to the opinion piece via Facebook, when many friends who had not yet visited the museum expressed their hesitation to visit after it received such a negative review. As someone who has eagerly awaited the opening, who knows several people who work for the museum, and who has research communication, particularly that of cultural heritage, as an area of expertise, I felt compelled to comment, which I did:

[Screenshot of my Facebook comment about the museum review]

Now that I’ve visited the RAM, I have much, much more to say about the opinion piece and why I think it is unfair and misguided.

First, it is challenging to calculate and negotiate cost versus value when assessing cultural and natural resources. As I’ve been discussing with my Issues in Archaeological Methods and Interpretation (Anth395) course students, yes, archaeologists are involved in assessing the significance of cultural resources, and yes, this should include considering the economic significance of said resource, BUT economic value represents only one aspect, one criterion under which significance or value is assigned.

Yes, a museum is a building that requires raw materials, resources, time, energy, labour, and effort to build. These are costly, and sure, one could reduce them by sourcing alternative materials, cutting down on square footage, etc. BUT a museum is NOT just a building. It is a place. Yes, the RAM is a beautiful building; I love its overall design and layout. It has great curb appeal, with design features that connect it to other buildings in our downtown cultural hub (the Art Gallery of Alberta, the Winspear, the Citadel Theatre, the Shaw Conference Centre, Churchill Square, City Hall, the under-construction downtown branch of the Edmonton Public Library, Ice District, Rogers Place, MacEwan University); it just “fits” in yet is unique. But a building does not make a place – a place is made by people for people.

The RAM is all about people. I would encourage my colleagues at the museum to share their stories over the next few months and years; the public needs to hear about how decisions were made, how much time, effort, and intellectual and emotional labour went into creating a place for everyone to experience – in their own way. Note that I say experience, not “enjoy”. Some parts of the museum will not be enjoyed and are not meant to be enjoyed; the section on residential schools, for example, should make us uncomfortable (the part of the opinion piece on the residential school display is particularly problematic, but I’ll return to that in a second). How you experience the RAM is up to you, and this is a good thing. Other than having to enter or exit through the one entrance to each gallery, you can wander through each gallery however you would like. It also doesn’t matter which order you visit the galleries in. I was there with my five-year-old, and so we just took in whatever caught her eye. I asked her what she wanted to do after reading the titles of the galleries. She made the decisions, including when to leave each gallery (so we didn’t actually see everything, but that’s what multiple visits are for). Having her take the lead, which she loved, worked because we weren’t forced to go in any particular order, nor were there any narratives or descriptions that were unclear if you went through the “wrong way” or “missed” reading something else in the same space. I get that some people won’t like this – they want a “story” that has a start, middle, and end, but this approach means telling only one story. How can the curators choose a single narrative? Which one is the “correct” one? Should they choose the “epic” or “dramatic” story, as our opinion piece author implies, ensuring that “colour” is added to keep the attention of the viewer?

What the author of the opinion piece misses is that this is a museum where the visitor can connect with the objects presented in a way that is meaningful to them. Instead, the author states: “The whole point of a provincial museum is to highlight the things that make Alberta special. Instead, most of the square footage is devoted towards cataloguing quotidian aspects of Alberta life that existed pretty much everywhere.” This is only one kind of purpose for a museum. Yes, things that are special should be celebrated in a provincial museum, but as an archaeologist I also recognize that the “quotidian aspects” are ALSO very important when it comes to understanding and connecting with our past. As James Deetz (1977:161) argues:

it is terribly important that the “small things forgotten” be remembered. For in the seemingly little and insignificant things that accumulate to create a lifetime, the essence of our existence is captured. We must remember these bits and pieces, and we must use them in new and imaginative ways so that a different appreciation for what life is today, and was in the past, can be achieved.

It is in these small things, like the canning equipment, radios, and benches the author so disdains, that people connect with the past. For example, last summer when we excavated at Mill Creek Ravine, we found three GWG buttons and, having owned an excellent pair of their jeans, I was SO excited. One can then imagine my delight in seeing a WHOLE room dedicated to this made-in-Edmonton brand! When my kid asked why I wanted to “look at clothes”, I told her about the buttons we’d found (at a site she’d visited several times) and she became excited too. I can imagine other small and large things throughout the galleries that will capture the excitement of one individual and go unnoticed by another. AND THAT’S OK!

FYI: My kid loved the megafauna fossils and reconstructions, in particular the Giant Ground Sloth – after our visit she climbed the stairs in our house “like a sloth”. She also loved putting her face on rocks, playing with kinetic sand, and seeing the tiny baby bear that looked like a dog. We left when she grew tired but only agreed to leave when I promised we’d return again so she could explore more things.


My kiddo loved this Mantis Shrimp, squealing with glee when it poked its head out to check her out.

Also, the author’s suggestion that visitors “could have gazed in horror at a scene of a Cree village plunged into famine when those bison disappeared” is just gross (side note: if you want to understand the importance of bison, go visit Head-Smashed-In Buffalo Jump, a World Heritage Site). There are other ways to get at the horrors of colonialism without exploiting our indigenous peoples’ suffering and reinforcing colonial narratives at the same time. For example, my favourite part (aside from the GWG exhibit) was the respect and honour given to Manitou Asinîy and to the Elders who provided teachings on this sacred being. The intentional, consistent, and thoughtful recognition of indigenous peoples and of being on Treaty 6 territory and the traditional lands of many indigenous people, the acknowledgement of the teachings of Elders that informed the presentation of items, and the use of indigenous languages, images, and voices IS powerful and important. As I mentioned in my pre-visit Facebook post, the author also missed the point when they state:

Among the small number of artifacts chosen to represent the province’s gut-wrenching history of residential schools, one of them is literally a pile of the bricks used to build one of the schools. It’s not the only artifact on display, but it’s telling that bricks made the cut. It’s like representing a gulag by simply displaying a bunch of plates and saying “here are the plates that gulag prisoners used.”

In my opinion, residential schools should be physically reduced to a pile of bricks. This was an exceptional choice because it is not the building that is the focus but the stories of those who experienced these schools that are highlighted and emphasized. If the author was less focused on assigning dollar signs to “tacky kitsch, random antiques and straight-up garbage”, they might have instead sat and scrolled through the haunting images present on the interactive displays, or listened to the stories of resilience told by those who survived, which can be heard in that space.

Sure, maybe the names of well-known provincial figures Nellie McClung and Peter Lougheed are missing at the RAM (I didn’t see them, but honestly didn’t look for them either). But that’s okay; they are present on buildings, schools, and/or statues/monuments and are part of the curriculum across the province, so we won’t forget them anytime soon. They also are part of our colonial heritage – figures that have loomed large in a particular narrative about Alberta’s past – so maybe this new museum is shedding light on others that were previously made invisible by their absence from these kinds of spaces. This again is why the treatment of Manitou Asinîy is so profound and important – it can be visited by anyone free of charge; it is also explicitly described and presented as spiritually significant. It is an object, a being, a rock, a spirit, a story that cannot be assigned any monetary value.

Finally, this opinion piece fails to acknowledge the people – the curators, workers, consultants, teachers, Elders, designers, educators, etc. – who worked on this museum. Their labour is valuable, and I bet their wages made up an important chunk of the budget. As previously stated, I hope we hear more from them as the RAM settles into its place in our community. We also need to make sure there is funding for the RAM to grow and change, to represent different voices, and to share more of the small things that tell the stories of our province (see Dr. Shulist’s post on what is really lost when museums don’t get funded adequately and consistently).

So to those who hesitated to go to the RAM, I say go. Have your own experience – that is what it is there for – and assess its value in a way that is meaningful to you.

To the author of the opinion piece, I’ll acknowledge it as that – your opinion – but I’ve decided to not value it as worth the paper it was printed on.

Editor’s Comment: I’m glad to see the Archaeology One sufficiently fired up to provide us with a critical, engaging post after her long, but much needed, break from blogging. While I’m optimistic that we’ll hear from her again soon (maybe about her field work from this summer, hint hint), I believe that rather than waiting at my desk for her next post, red pen poised and ready to edit whatever swear-ridden rant she throws my way, I’ll take advantage of my downtime to visit the RAM and the other downtown places she mentions in her post. Well argued, Dr. Biittner.

 

 

Reference Cited: Deetz, J. 1977. In Small Things Forgotten: The Archaeology of Early American Life. New York: Anchor Press/Doubleday.

Anthropology As Announcement: We Have a New One!

It’s been a busy start to the new (academic) year, despite the lack of actual visible activity on this blog, as we’ve had some behind-the-scenes action (which means the editor actually did some entirely non-sarcastic work, check it out!). The big change is that we have a New One on board as a contributor to the Anthropology As Universe. Dr. Jennifer Long joined the MacEwan anthropology department in July, and because we knew she had a history of blogging, and was generally awesome, we basically made her join our team. Dr. Long will be “The Cultural One” around here, and will bring that perspective to her observations about, well, whatever she feels like observing about.

We’ve added an introduction to “The Cultural One” on the “About Us” page, where you can learn more about Dr. Long’s research and teaching interests. As we all do, Dr. Long has a few ideas in her head about topics to cover, and her first posts will undoubtedly magically spring from her brain onto the virtual page any minute now.

We also remain open to guest posts from anthropologists and anthropology students, so if you think this might be a good space for you to mouth off in, fire us a comment here and we’ll talk!

 

On What Was Really Lost in the Fire

As everyone almost certainly knows by now, just over a week ago the Brazilian Museu Nacional in Rio de Janeiro burned, with massive damage and the complete destruction of a huge proportion of an extensive collection of irreplaceable artifacts, fossils, documents, and artwork. No one thinks this is anything less than a tragedy, though people have varying levels of anger about it – some seem to see it as an unfortunate accident, while others (who know more about Brazil, including most Brazilians) are quick to focus rage on decades of neglect by a series of governments, who at best just didn’t care enough about maintaining this building and its contents.

I’m writing this to call attention to another level of anger, which is mainly being expressed by Indigenous people, and which I’ve briefly commented about on Twitter and elsewhere. This anger asks why we allow so much cultural knowledge and linguistic information, not to mention sacred and/or valuable artifacts, from Indigenous peoples around the world, to be housed in singular buildings run by colonial governments in the first place. Why do we accept the assumption that these organizations are inherently better at “preserving” this information than the communities themselves? Why do we uncritically act as though, despite the fact that anything in a museum is inherently removed from its context and active role in the community from which it was taken, this form of “preservation” is a priority?

Photo by Felipe Milanez, Creative Commons License (source). In addition to showing the fire, the image includes the looming figure of Dom Pedro II, the last monarch of the colonial Empire of Brazil.

At this point it’s worth stepping back to ask who the “we” is in those above questions. There is only a certain proportion of “us” who have accepted or advocated for these things, or made these assumptions. Because as I noted, many Indigenous people reacted to this with one common statement – repatriate. Return museum materials to their rightful owners. Reprioritize – instead of emphasizing access for outside, mainly European-descended, people and some kind of ideal of “global human knowledge”, consider the needs and values of living Indigenous cultures and languages. The “we” in these discourses generally refers to white academics. So much of what was lost, the stuff that can’t be recovered (like the entire linguistics section of the museum, containing the only documentation of several languages that have no remaining speakers), was Indigenous knowledge. It’s one thing to lament the loss to our (there’s that word again) knowledge of language in general, and it’s completely another to consider what this loss means to the community that spoke that language and how devastating it is to see the elimination of essentially any chance at reawakening it.

I’m angry about this. I’m angry at governments who build museums to preserve and publicize knowledge and then neglect them. I’m angry at the centuries of colonial theft that built these museums, trapped thousands of different types of hostages inside, just waiting for the spark to light them on fire. And I’m angry at my disciplines, in which we continue to treat language documentation and preservation in buildings far away from the people who can or would use the language as a substitute for supporting reclamation and revitalization. Digitization is a major step forward, and documentary work can be done in a way that is profoundly community-oriented. But it doesn’t have to be, and there is plenty of academic reward involved in perpetuating the old “salvage” model of linguistics, the one that puts this information into archives and museums. The fire is really the logical end point of anthropology as colonial enterprise, in which we take Indigenous worlds, reduce them to paper, lock them away where they can’t be actively used, allow them to burn, and then feel sorry for ourselves because we lost that source of academic insight.

My anger is superseding most everything else as I write this, but I should say that I do understand the value of museums. Public scholarship matters, and museums serve as an excellent corrective to navel-gazing research and publication circles in which we carry on an abstract theoretical debate with two or three other researchers over the course of our entire careers. Not everything in a museum, including not everything in the Museu Nacional, has been stolen from Indigenous people. I feel little guilt for deeply appreciating, for example, a museum filled with dinosaurs, and wanting to know more about the scientific discovery of knowledge about them. But at the same time, I think this conversation needs to move beyond how to create better fireproofing for a colonial museum, or how to ensure that governments care about museums in general. For some, the fire was the last stage of a loss that began a long time ago, and until it happened, too few of us in academia were engaging seriously with that loss.