The Humanities and the Academy
Author: Marcie Bianco
January 29, 2013
For those of us with a scholarly pedigree, a couple of weeks ago the city of Boston hosted the MLA Convention—that’s the “Modern Language Association” for those of you without privy access to this rarefied society.
Founded in 1883, the Modern Language Association of America provides opportunities for its members to share their scholarly findings and teaching experiences with colleagues and to discuss trends in the academy. MLA members host an annual convention and other meetings, work with related organizations, and sustain one of the finest publishing programs in the humanities. For over a hundred years, members have worked to strengthen the study and teaching of language and literature.
My Facebook news feed was inundated with all-too-familiar laments: those friends lamenting the classless bravado of certain unsavory literary academics; those lamenting long-winded panelists; those lamenting job interviews. Greek tragedy knows not this level of woe. A favorite Gaga feminist of mine even took to the Zucker-verse to release some frustration: “I am shaking off the toxicity of the MLA—bad food, bad city, bad conference, bad panels and a really toxic run in with a queer theorist who thinks he is a bad boy but is actually just an English Department’s wet dream. Pathetic.”
I relinquished my MLA membership after the 2005 conference, having both literally and figuratively fucked myself to the outskirts of “lesbian dramatics” in the very traditional field of the “English Renaissance”. But as someone who still remains—however marginally—within academia, the annual convention is always on my radar…especially because, to quote a friend of mine who texted me over the weekend from Boston: “the MLA is SO GAY!”
The MLA, as the institutional bastion of the humanities, has always felt too much like the 1998 flick The Truman Show for me: Everyone consciously believes in the illusion, in the fantasy of realness that they perform habitually, day-in and day-out, called “The Truman Show.” The illusion is, more precisely, the delusion that the knowledge being produced is actually “original” and unique to those within the humanities. There are a handful of stalwart “Truman Burbank” (Jim Carrey) characters who really believe in the hegemonic power of “The Humanities.” The fact is these Trumans need to continue to exist, especially in the United States, where intellectual endeavors are pooh-poohed by essentially everyone across the ideological spectrum who lives outside the humanities.
In a recent editorial at CNN.com, outgoing MLA president Michael Bérubé emphasized the use value of a graduate degree in the humanities, claiming that with said degree one learns:
[how] to deal with complex material that requires intense concentration—and to write a persuasive account of what it all means. And you may find that the humanities major with extensive college experience in dealing with complex material handles the challenge better—more comprehensively, more imaginatively—than the business or finance major who assumed that her degree was all she needed to earn a place in your company.
I have immense respect for Professor Bérubé, but I feel these comments—intended as a passionate defense of the humanities—are trite and uncompelling. Yes, academia does train its graduate students to think critically, but academia is not an advocate of free thinking. In this regard, the “creativity” that suggestively marks the graduate student as a capable job applicant for numerous occupational fields is sorely overdetermined.
Graduate training in the humanities prepares students for academic careers—and arguably academic careers alone. The “creativity” championed by humanities scholars such as Professor Bérubé is not really creativity in the vein of spirited free thinking. Rather, this creativity is of a critical estimation in which students are rigorously trained to think and write in line with extant academic thinking—those particular critical discourses or methodologies en vogue in the humanities. Thinking, as such, is ossified—not alive. It is not really “thinking”—dynamic, vitalized—at all. Creative thinking (manifested in one’s writing) is not recommended if one wants to advance her career; it’s shunned, marginalized, and blatantly rejected from those highly coveted ‘peer-reviewed’ journal publications that are essential for one to move her career one notch closer to tenure.
What precisely do humanities students excel at? Abstract thinking, cerebral thought—not to mention a penchant for wildly abstruse language (which graduate students must learn to master before leaving school). None of these skills translates easily into careers outside academia. To paraphrase my PhD dissertation advisor, when I asked her about the use of a PhD outside academia: graduate training prepares students solely for an academic career.
That creative thinking is very much born from the independent thinker suggests that creativity is a subjectively produced form of experience and knowledge. Knowledge from experience—as genius minds throughout the ages, from Chaucer to Audre Lorde, have told us—is therefore more valuable than purely abstract knowledge…especially when the purpose of this knowledge is to communicate in order to educate.
Scholars in the humanities, however, compulsively need to re-affirm the value of the humanities, but they seem to do so overwhelmingly in terms of difference: We produce creative, critical thinkers. We alone cultivate the most perspicacious, inquisitive minds. And this is simply wrong—not to mention elitist.
Elitism is not “wrong” or “immoral”. In fact, I am an ever-aspiring member of its church. Yet learned men are not always wise men. And it is false logic to equate the two.
My issue with the humanities is not one of “clarity” or “simplicity”. I’m not critical of the quality of writing, and I agree fully with queer scholar Michael Warner that there is a disciplinary functionality to academic writing, as he explains in his essay “Styles of Intellectual Publics”:
Style performs membership. Academics belong to a functionally segregated social sphere, and in the humanities in the United States that sphere is increasingly marginal, often jeopardized. People use style to distinguish themselves from the mass and its normalized version of clarity…. Should writing intended for academics in the humanities aspire to accessibility…? Isn’t such an expectation tantamount to a demand that there should be no such thing as intellectuals in the humanities, that the whole history of the humanistic disciplines should make no difference, and that someone starting from scratch to enter into a discussion—of, say, the theory of sexuality—should be at no disadvantage compared to someone who had read widely in previous discussions of the question?
My criticism is rather one of awareness—specifically, the lack thereof (which also finds a corollary with “social skills”). Humanities scholars are each trained to work within an extremely narrow field, with a very acute focus on a particular facet or two of a literary or political culture. Even those who claim to be scholars of, for instance, “the long 18th century” realistically have a very limited scope of knowledge. And yet, for many a scholar, there is a solipsistic equation of “humanities scholar” with, say, “public intellectual,” or even the belief that the humanities scholar is the optimal candidate for any and every job.
My thoughts here coincidentally align with the recently concluded MLA Convention but have primarily been developed through personal anecdote. My girlfriend—one of the most intelligent persons I know and, without question, the most intuitive thinker I’ve ever met (Henri Bergson would be head-over-heels for her!)—is, in her mid-forties, applying to graduate school. And for the past nine months or so I’ve seen her toil through the academic rigmarole of the application process: from studying for the GRE (which is as essential to one’s admittance as the graduate application fee: an empty requirement) to writing her twenty-page writing sample. I have never once doubted her capability or desire for graduate studies; yet I am concerned about how graduate training will circumscribe her mind with the garden variety of critical methods that she’ll need to master in order to complete her degree. Subjective epistemologies—thinking through the body, thinking with intuition—are more often than not dismissed by scholars ensconced in the academy, heavily marred by western thinking and particularly British pragmatism.
Will she play the game? Will she participate in the illusion in order to advance her career? Or will she be the one to maintain her supply of brilliant, creative thinking while complying with academic demand?